Evidence-Based Creative and the Mysterious Case of the Iridescent Birds

Tim Cross, 12 June 2019

Grant Gudgel is VP, Head of Teads Studio.

For many years now there has been a healthy tension between the creative side of the industry and those who would like to see a more evidence-based approach. This is understandable: in some ways it seems like an overreach to claim that we could ever fully measure what makes a given piece of ad creative great, since good creative is more than just copy and images. Good creative work provides shared meaning that unites people around a brand, and meaning isn’t something you can quantify.

What then do we mean by evidence-based creative? It’s the process of using campaign performance data, live and in near-real-time, to identify what’s working, design tests, validate hypotheses and iterate.

I recently worked on a campaign where we were trying to work out which elements of a piece of video creative – in practice, many different videos – were driving user action.

Looking at the data a week into the campaign, we noticed that two versions of the video were driving significantly higher click-through rates. When we went back and compared those videos scene by scene with the lower-performing versions, we found that one particular scene set them apart.

What was in the one unique scene that drove better performance? Birds. Loads of birds.

In one 2.5-second sequence in those two videos, thousands of emerald hummingbirds burst onto the screen. Those birds apparently struck a chord and drove user action well above the rest of the pack. My best guess is that something about the aesthetics of the shining birds – it really is a beautiful image – coupled with the other visuals in the ad, resonated more deeply with the audience. It was something beautiful and extraordinary. At some level the beauty of the image meant something more.

At this point it’s worth stressing that no amount of data, research or AI will get you to those birds. We still need creative minds to come up with the original idea. But we can use the tools of data science to improve already-great creative, and to make less-than-great creative work a whole lot harder, through iteration. This is what I’m calling evidence-based creative.

The concept is rooted in the realization, supported by Nielsen research, that 65 percent of sales lift in digital can be attributed to the quality of the creative. That means we can perfectly calibrate every variable in a media buy and still influence only 35 percent of campaign performance.

As much as data targeting and programmatic efficiencies empower brands, we can pull all the right levers and still not account for even half of a campaign’s sales ROI.

But what does an evidence-based creative strategy look like? Through the campaigns I’ve been involved with, I’ve identified a few key principles for success.

More is More

Creative is subjective and can only be judged in relation to the performance of other creative. The more versions or iterations you can put in front of an audience, the more you can learn about what is working and what is not. So cut two versions of that video, or layer on interactive skins, overlays, headlines, branding and so on. Switch messaging; try one version with a product image and another with just that killer line of copy.

The more variations you can try, the faster you’ll learn and the more you’ll be able to boost performance.
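To make the combinatorics concrete, here is a minimal sketch of how a variant matrix might be enumerated before trafficking. The dimensions and values are hypothetical, not taken from any real campaign:

```python
from itertools import product

# Hypothetical creative dimensions; a real campaign would pull these
# from the asset library rather than hard-coding them.
dimensions = {
    "video_cut": ["15s", "30s"],
    "overlay": ["none", "interactive_skin"],
    "end_frame": ["product_image", "killer_copy_line"],
}

# Every combination of dimension values becomes one creative version.
variants = [dict(zip(dimensions, combo))
            for combo in product(*dimensions.values())]

for i, v in enumerate(variants, start=1):
    print(f"version {i}: {v}")

# 2 cuts x 2 overlays x 2 end frames = 8 versions to put in front of the audience.
```

Even three binary dimensions already yield eight versions, which is why the automation discussed next matters.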

Automate to Iterate

We use our own proprietary platforms to build creative iterations, automate versioning, simplify approvals and run analytics. We’ve developed an entire methodology for managing campaign flighting on complex tests.

A number of similar tools are available in the market, and we’ve explored many of them. The key here is automation: identifying tools that allow you to build more versions faster, and to manage a campaign without getting lost in the complexity.

Build (One) Clear Hypothesis Per Test

It’s tempting to want to test every variable, and we can indeed conduct multivariate tests that compare multiple elements in concert. However, your hypothesis needs to be simple and allow for mutually exclusive outcomes if you want to learn anything meaningful. This is why we tend to run tests in ‘sprints’: each sprint tests only one variable to get clear results, then we test another variable in the next sprint.
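One standard way to turn a sprint’s results into a yes/no answer (not necessarily the method behind any specific campaign described here) is a two-proportion z-test on click-through rate. The impression and click counts in this sketch are hypothetical:

```python
from math import sqrt
from statistics import NormalDist

def ctr_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test: is version B's CTR really different from A's?"""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    # Pooled CTR under the null hypothesis that both versions perform equally.
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical sprint data at 50K impressions per version.
p_a, p_b, z, p_val = ctr_z_test(clicks_a=400, imps_a=50_000,
                                clicks_b=520, imps_b=50_000)
print(f"CTR A = {p_a:.2%}, CTR B = {p_b:.2%}, z = {z:.2f}, p = {p_val:.4f}")
```

A small p-value (conventionally below 0.05) suggests the lift is real rather than noise, which is exactly the kind of mutually exclusive outcome a clean hypothesis should produce.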

Scale Matters

If you want meaningful insights, you need meaningful scale. We find that 50K impressions per version is about the right minimum to drown out noise in the data. So if you’re testing ten creatives, run at least 500K impressions across the test. More is better, but the key is to ensure no single data point skews your results.
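As a rough sanity check on that 50K figure, consider the sampling noise on a measured CTR. The 1 percent baseline CTR below is an assumption for illustration only:

```python
from math import sqrt

baseline_ctr = 0.01    # assumed 1% CTR, for illustration only
impressions = 50_000   # the suggested minimum per version

# Standard error of a CTR measured over n impressions (binomial sampling noise).
se = sqrt(baseline_ctr * (1 - baseline_ctr) / impressions)

# A ~95% confidence band is roughly +/- 2 standard errors.
print(f"measured CTR: {baseline_ctr:.2%} +/- {2 * se:.2%}")
# -> about 1.00% +/- 0.09% at 50K impressions; at 5K impressions
#    the band would be roughly three times wider.
```

At that scale, a lift of a few tenths of a percentage point stands clearly outside the noise band, which is what makes per-version comparisons trustworthy.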

Bust Those Silos

With all this talk of data and analysis, we might forget that it takes a team of highly skilled humans to actually make this work. This can’t just be outsourced to your technology vendor. The entire ecosystem of creative agency, media agency, brand and technology partners needs to be on board, aligned and in constant communication.

For example, on one recent project our analysis, creative-refresh and approvals process had to fit into a 72-hour window every 10 days to keep up with the sprints. It works, but only if everyone in the chain is committed to the project, respects deadlines and talks regularly with the others.

Of course there is far more to consider here, but much of it is best learned by doing. My recommendation is to start with an initial hypothesis and run a simple test, then ramp up the complexity from there.

Evidence-based creative might sound like a lot of work, and in some ways it is. But whatever your brand, the insights gained from integrating an iterative testing initiative into your creative strategy can be the ticket to a much deeper understanding of your creative and, ultimately, to unlocking what works best.

Far from representing a menace to the creative agency world, the techniques of evidence-based creative are wonderful tools for gaining insight and improving creative performance. They help us get beyond the raw data and drill down to which elements of a piece of creative are really driving performance. Sometimes we even find surprises, like the beauty and shared meaning to be found in 2.5 seconds of brilliantly hued hummingbirds.


About the Author:

Tim Cross is Assistant Editor at VideoWeek.