Focusing on Attention

Dan Meier, 11 May 2022

A study released today revealed high levels of advertising attention for CTV. According to ShowHeroes Group, CTV ads deliver an 82 percent attention rate, compared with 69 percent for linear TV and 42 percent for social video. But what is attention and how is it measured?

Dr. Alastair Goode is a cognitive scientist at immersive research company Gorilla in the Room. The name comes from a famous experiment in which subjects are asked to watch two teams dressed in different colours and count how many times one team passes a ball between them. “In the middle of this, a gorilla comes on the screen and dances and then moves off again,” Goode explains. “And nobody notices it, or very few people notice it.” Another experiment involves asking people in the street for directions, only to swap the enquirer mid-conversation, “and about 50 percent of people don’t even notice that the person has changed.”

These tests demonstrate what psychologists call “inattentional blindness”, the failure to notice fully visible objects or changes in our field of vision when attention is engaged elsewhere. The concept is important when it comes to measuring attention to advertising, because being shown an ad – even looking directly at it – does not guarantee that we register its content. As VideoWeek recently explored, the main methodologies for measuring attention use laptop webcams and front-facing phone/tablet cameras to track (consenting) users’ eye movements.

Another technique employs AI for the purposes of head-tracking. “It takes head gaze or head position as a proxy for attention,” says Goode, “so if the head’s orientated towards the screen, it’s assumed that that is where the attention is being paid.” But as we know from the invisible gorilla experiment, “gaze and head position don’t necessarily indicate attention.” Focusing too heavily on gaze direction or dwell time can neglect specific eye movements, the kind that indicate actual perception or processing of information.
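Taken at face value, head-tracking reduces attention to a geometric check. The Python sketch below is a minimal illustration of that proxy, using hypothetical field names and angle thresholds rather than any vendor’s actual implementation: each sampled frame is flagged as screen-facing or not, and the flags are aggregated into an “attention rate”.

    # Minimal sketch of head pose as an attention proxy (hypothetical thresholds).
    from dataclasses import dataclass

    @dataclass
    class HeadPose:
        yaw_deg: float    # left/right rotation away from the screen's centre line
        pitch_deg: float  # up/down rotation away from the screen's centre line

    def facing_screen(pose: HeadPose, max_angle_deg: float = 20.0) -> bool:
        # A screen-facing head is assumed to mean attention is being paid,
        # which, as the gorilla experiment shows, is not guaranteed.
        return abs(pose.yaw_deg) <= max_angle_deg and abs(pose.pitch_deg) <= max_angle_deg

    def attention_rate(poses: list) -> float:
        # Share of sampled frames in which the viewer faced the screen.
        return sum(facing_screen(p) for p in poses) / len(poses) if poses else 0.0

    # Three sampled frames, two of them screen-facing -> 0.67
    print(round(attention_rate([HeadPose(5, -3), HeadPose(40, 0), HeadPose(-10, 8)]), 2))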

The ShowHeroes study incorporated electrodermal response sensors for a clearer picture of viewer reaction. “If you know what people are looking at through eye-tracking, that’s important, but you don’t actually know anything about how engaged they are,” Amanda Ellison, Professor of Neuroscience at Durham University, said of the study. “By using electrodermal response sensors to track electrical variations in the skin, we’re able to measure a viewer’s current engagement state, telling us how receptive they are to visual stimuli.”
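To illustrate why the two signals are combined, the hypothetical Python sketch below counts a viewer as engaged only when an eye-tracking fixation on the ad coincides with a skin-conductance rise above that viewer’s own baseline. The field names, units and thresholds are assumptions for illustration, not the ShowHeroes or Gorilla in the Room methodology.

    # Minimal sketch combining eye-tracking with electrodermal response (hypothetical values).
    from dataclasses import dataclass

    @dataclass
    class Sample:
        gaze_on_ad: bool          # eye-tracking: the fixation landed on the ad
        skin_conductance: float   # electrodermal response, in microsiemens

    def engaged(sample: Sample, baseline: float, rise_threshold: float = 0.05) -> bool:
        # Engagement requires both looking at the ad and a conductance rise above baseline.
        return sample.gaze_on_ad and (sample.skin_conductance - baseline) > rise_threshold

    def engagement_rate(samples: list, baseline: float) -> float:
        # Share of samples in which the viewer was both looking and physiologically responsive.
        return sum(engaged(s, baseline) for s in samples) / len(samples) if samples else 0.0

    # The viewer looks at the ad in all three samples, but only two show a rise -> 0.67
    samples = [Sample(True, 2.10), Sample(True, 2.02), Sample(True, 2.15)]
    print(round(engagement_rate(samples, baseline=2.0), 2))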

Resolving attention

Increasingly, our daily lives are inundated with sensory information. A single commute might expose us to billboard advertising, station announcements, a podcast in our headphones and a busker we rush past, all at the same time. Selective attention allows us to tune some of this out, raising the question of which sources we take in and how much information we disregard.

Neuroscientist Nilli Lavie proposed “perceptual load theory” to explain how attention is distributed across different senses. Anyone who has tried reading one sentence while listening to another will understand the notion of inattentional deafness, the inability to digest multiple pieces of complex information at once. Conversely, engaging in a low-attention activity, such as listening to instrumental music, is generally compatible with working.

This has interesting implications for advertising, particularly as multi-screening becomes the norm. Where the TV was once the main point of attention in a living room, ad breaks are now more likely to be spent on our phones. But advertising need not require high attention to make an impression – at least according to communications specialist Robert Heath, whose “low attention processing theory” suggests that emotional adverts are actually more effective when viewed passively.

Further research considers the impact of auditory distractions, based on the idea that auditory attention is often open and spread out, compared to the way we can choose to direct our visual attention. Goode notes that auditory attention has evolved as a kind of “early warning system”, alerting us to information outside of our visual focus. In theory then, sound could be used to tell people who are passively watching an ad when they need to pay more attention.

As we move away from watching a single screen, the ability to measure attention is potentially enhanced in virtual environments. “Tracking attention in the metaverse is actually probably easier than tracking attention within traditional media,” Goode comments. In VR, the direction the head is pointing gives a stronger indication of what someone is looking at. While this does not resolve the false equivalence between gaze and attention, it opens up exciting possibilities for brands and advertisers to build a “lived experience” for consumers to move through. For Goode, those opportunities warrant serious attention.

