Context is Particularly Well Placed to Replace Audience Targeting for Video Advertising

Tim Cross 07 December, 2020 

Video intelligence, where software scans video content to recognise the images, events and emotions happening on screen, has the potential to play a significant role in tackling some of the key issues in video advertising.

Video intelligence tech can power contextual ad targeting, which many believe will become more important as cookie-based audience targeting meets its end. And visual recognition can also filter out brand-unsafe content which has plagued user-generated content platforms over the past few years.

But some remain doubtful of contextual targeting’s viability as an alternative to audience targeting. And several of the major brand safety scandals of recent years have actually come about as a result of automated content filters failing to recognise offensive content.

VAN spoke with Amit Phansalkar, CEO and co-founder of Boston-based video intelligence company Netra, to hear how video intelligence technology has advanced over recent years, and whether he expects the ad world to flock to context following the death of the third-party cookie.

What do you make of contextual targeting as an alternative to cookie-based audience targeting?

I think it will emerge as a very strong option. You can look at the three aspects of any successful campaign. You’ve got the creative, the audience, and where the creative gets placed, which is the context. Now the audience piece of that isn’t going away, but it will get more murky. In the walled gardens, there will still probably be good information on your audience, but not on the open web. And that’s where context becomes extremely important.

In video in particular, it’s easier to replicate audiences based on context, because sometimes the context gives you the audience. You can target people watching adventure travel videos, for example – that’s an audience target, but it’s defined by context. So those kinds of behavioural audience segments can be replicated using context.
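The idea of deriving an audience segment purely from video context can be sketched as a simple mapping from detected content labels to segment names. Everything here – the labels, the segments, and the mapping itself – is a hypothetical illustration, not how any particular vendor implements it.

```python
# Hypothetical mapping: if a video's detected labels contain all the labels
# in a context key, viewers of that video are treated as that segment.
CONTEXT_TO_SEGMENT = {
    frozenset({"hiking", "travel"}): "adventure-travel enthusiasts",
    frozenset({"cooking", "recipe"}): "home cooks",
}

def segment_from_context(detected_labels):
    """Return the first audience segment implied by the video's context."""
    labels = set(detected_labels)
    for context, segment in CONTEXT_TO_SEGMENT.items():
        if context <= labels:  # every context label was detected in the video
            return segment
    return None

print(segment_from_context(["hiking", "travel", "mountains"]))
# adventure-travel enthusiasts
```

The point of the sketch is that no user data is involved: the segment is inferred entirely from what is on screen.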

Then the other part of it is also the creative context matching, which is also very context driven. So for the same brand you can have multiple creatives with different messages and different emotions behind them. And those will each resonate better in different places – a funny ad will probably work better alongside a funny video. And I think being able to match all of that will become critical, and will be big in the coming years.

There’s been talk for years about ‘in-the-moment’ advertising, where a football player sponsored by Nike might score a goal in a live match, and then you see an ad featuring that player during the ad break. Is that sort of thing happening yet?

I think that’s going to become very important, and there’s a few different aspects to it. With live programming it’s particularly important, because for things like sports and news you have little control over the content. As an advertiser, you might know your ad is running next to the news, but you don’t know what that news is going to be about. Or in live sports, you don’t know when the action is going to happen. So being able to analyse the live video becomes very important.

And there are particularly interesting opportunities in areas like sports, where brands play a big role in the live footage. So you might have a moment where someone hits a home run in a baseball game, and around that time an airline banner is shown on screen. Audiences probably haven’t paid much conscious attention to that banner, but it will have made some kind of impression, and that presents a great opportunity for the airline to run an ad during the next commercial break to improve its brand metrics. That sort of thing is happening now.

But there are also opportunities outside of live programming. Where broadcasters aren’t selling all their inventory, being able to make TV a bit more like digital, and offer up contextually relevant spots based on what viewers have just seen within a programme, can help.

How much has visual recognition tech evolved in the four years since the peak of YouTube’s brand safety scandals?

We’ve come a long way from where we were back then, and a couple of technological advances have helped.

One change is that we’re now looking at the actual content, rather than just video metadata. Three years ago, video technology hadn’t penetrated the market enough that people were actually looking at the video content, because this stuff has historically been expensive. And the most important part of that was scale, which we’ve had to solve for. If you’re just looking at metadata, you create a lot of issues. And it’s not just a case of unsafe videos slipping through the cracks; there’s also the other side of it, where perfectly safe content gets marked as unsafe. Now, from a technology perspective, we’re getting to a place where we can handle more nuance.

The other aspect of it is that brand safety is now getting more clearly defined. Brand safety has often been a case of “I’ll recognise it when I see it” from a buyer’s or platform’s perspective, and that’s not something you can replicate from a technology perspective! But now we’re getting to a place where there is more standardisation around brand safety, so as a result of that it’s easier to create technology solutions to filter out unsafe content. One of the things we do is allow our clients to create categories on the fly, so if something new emerges which they don’t want to advertise next to, they can create a rule to filter out that content in the future.
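The "categories on the fly" idea can be sketched as a small rule engine: each category maps to a set of content labels a video-intelligence system might emit, and a new rule can be registered at any time. The class, label names, and categories below are invented for illustration; they are not Netra's actual API.

```python
class SafetyFilter:
    """Minimal sketch of on-the-fly brand-safety categories."""

    def __init__(self):
        self.rules = {}  # category name -> set of blocked content labels

    def add_category(self, name, blocked_labels):
        """Register a new filter category at any time."""
        self.rules[name] = set(blocked_labels)

    def is_safe(self, video_labels, active_categories):
        """False if the video's detected labels hit any active rule."""
        labels = set(video_labels)
        return all(labels.isdisjoint(self.rules[c]) for c in active_categories)

f = SafetyFilter()
f.add_category("weapons", ["gun", "knife"])
f.add_category("alcohol", ["beer", "wine"])
print(f.is_safe(["beach", "wine"], ["weapons"]))  # True: no weapons labels
print(f.is_safe(["beach", "wine"], ["alcohol"]))  # False: "wine" is blocked
```

Because rules operate on detected labels rather than metadata, a newly emerging topic can be filtered as soon as a rule for it exists.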

TikTok seems to have avoided major brand safety scandals so far – are you hearing concern from clients about brand safety on TikTok?

TikTok is an interesting platform. I think everyone is aware of the potential problems, but they are also very keen to leverage its popularity at the moment.

From what I’ve seen, TikTok doesn’t have a lot of brand safety tools. They have started making some progress around brand safety, but I think they are way behind some of the other platforms. But it’s such a popular platform right now, and especially if you want to reach the sorts of audiences that TikTok attracts, it’s very much the place to be. So brands essentially have to take that risk! But I hope that at some point the industry will push for higher standards from TikTok, just so everyone feels safe advertising on their platform.

Looking at the sell-side, what sort of use cases are media owners finding for video intelligence technology?

So one use case is to help pick out highlights from video content. And we do that primarily based on three things. Taking sports events as an example, the first thing we do is look at text – which could tell you, for example, when the score has changed, or when a player has been sent off. Then there’s recognising specific events visually, like when a ball has gone into the goal, or when a player has hit a home run. For a home run, for example, the system can learn that when you see shots of the baseball flying high into the stands, that’s an important event. And the third piece is reactions, which in sports would mean both the players’ reactions and the audience’s reactions.
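The three signals described above – on-screen text, recognised visual events, and reactions – can be combined into a single per-segment highlight score. The detector outputs, field names, and weights below are illustrative assumptions, not the actual model.

```python
def highlight_score(segment, weights=(0.4, 0.4, 0.2)):
    """Combine three detector confidences (each in [0, 1]) into one score.

    segment: dict with 'text' (e.g. scoreboard changed), 'event' (e.g. ball
    seen entering the goal) and 'reaction' (e.g. crowd celebration) fields.
    """
    w_text, w_event, w_reaction = weights
    return (w_text * segment["text"]
            + w_event * segment["event"]
            + w_reaction * segment["reaction"])

segments = [
    {"start": 12, "text": 0.9, "event": 0.8, "reaction": 0.7},
    {"start": 40, "text": 0.1, "event": 0.2, "reaction": 0.1},
]
ranked = sorted(segments, key=highlight_score, reverse=True)
print(ranked[0]["start"])  # 12: the high-confidence segment ranks first
```

A weighted sum is the simplest possible fusion; a real system would likely learn the combination from labelled highlights.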

And we’ve built flexibility into our API, where a sports distributor might want to see a two-minute highlight reel, or a five-minute highlight reel. And we can show them top highlights in ordered lists, and they can pick which ones they want to use.
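Fitting a requested reel length can be sketched as a greedy selection: take the top-scored clips that still fit the time budget, then play them back in chronological order. The function and field names are hypothetical, not Netra's actual API.

```python
def build_reel(clips, max_seconds):
    """Greedily pick top-scored clips that fit the time budget.

    clips: list of dicts with 'start', 'duration' (seconds) and 'score'.
    Returns the chosen clips in chronological order.
    """
    chosen, total = [], 0
    for clip in sorted(clips, key=lambda c: c["score"], reverse=True):
        if total + clip["duration"] <= max_seconds:
            chosen.append(clip)
            total += clip["duration"]
    return sorted(chosen, key=lambda c: c["start"])

clips = [
    {"start": 10, "duration": 30, "score": 0.9},
    {"start": 90, "duration": 45, "score": 0.7},
    {"start": 200, "duration": 60, "score": 0.5},
]
reel = build_reel(clips, max_seconds=80)
print([c["start"] for c in reel])  # [10, 90]: the top clips that fit 80s
```

The same function serves a two-minute or five-minute request by changing only `max_seconds`, which matches the flexibility described above.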

Another common use is for search and recommendations, which are two sides of the same coin. So as a media owner, you might want to be able to search through your library of content to find specific videos or clips for new content. Using visual recognition, you could search, for example, for all clips you have which show someone drinking coffee. And then you might want to use the same functionality for video recommendation. If a viewer has interacted or engaged with a video, you might want to search through your library for similar videos to show them next in order to keep them engaged. That’s something that almost all video platforms do now, and we do offer that, but the search functionality is what really seems to be resonating right now.
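Search and recommendation being "two sides of the same coin" can be illustrated with one shared index of detected labels: search queries the index by label, while recommendation compares a watched video's labels against the rest of the library. The video IDs and labels here are invented for illustration.

```python
from collections import defaultdict

# Hypothetical library: video ID -> labels a visual-recognition system found.
library = {
    "v1": {"coffee", "kitchen", "morning"},
    "v2": {"coffee", "cafe"},
    "v3": {"beach", "surfing"},
}

index = defaultdict(set)  # label -> video IDs containing that label
for vid, labels in library.items():
    for label in labels:
        index[label].add(vid)

def search(label):
    """All videos whose detected labels include the query term."""
    return sorted(index.get(label, set()))

def recommend(vid):
    """The video sharing the most labels with the one just watched."""
    overlap = lambda other: len(library[vid] & library[other])
    return max((v for v in library if v != vid), key=overlap)

print(search("coffee"))  # ['v1', 'v2']: e.g. all clips of someone drinking coffee
print(recommend("v1"))   # 'v2': shares the 'coffee' label
```

Label overlap is the crudest possible similarity measure; production systems would more plausibly use learned embeddings, but the shared-index structure is the same.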


About the Author:

Tim Cross is Assistant Editor at VideoWeek.