We live in a very noisy world, constantly bombarded with electronic and other communications. Breaking through the clutter is hard, but market research offers various tools for measuring whether a communication (ad, packaging, website or direct mail) will stand out and be noticed. Some work better than others, writes Caroline Sharp, member of The ICG.
All our work tells us that both qualitative and quantitative research have very valid roles to play, but only eye tracking accurately measures real impact, because only eye tracking measures what people really notice. By eye tracking we mean real, in-person eye tracking, as opposed to mouse tracking, algorithms or online eye tracking, all of which have been shown to be less accurate.
Mouse clicking (‘click on what you saw’), for example, only measures conscious attention. In a recent study by PRS, mouse clicking produced wildly different results from eye tracking for the same piece of packaging – and would have led to a completely different set of recommendations.
Visual attention algorithms, where computer software ‘calculates’ the visibility of the different elements within an image, are often sold as a cheaper alternative. But the same PRS study showed that the algorithms are not sensitive enough to detect differences between designs, and are particularly poor at predicting the visibility of on-pack claims and messaging.
More recently, online eye tracking has become available, using the respondent’s own webcam as the eye tracking camera. While this method appears to offer the ability to track large samples at competitive rates, it should be approached with caution. Using the respondent’s built-in webcam results in up to a 75% failure rate in data collection, and for the remaining 25% the accuracy can be as coarse as a quarter of the screen. So, while it is a superficially attractive proposition, online eye tracking can be poor value.
When the objective is to measure accurately what consumers see (and don’t see), there is no substitute for actual in-person eye tracking. And when accuracy matters, you need to quiz your research supplier about the accuracy of the equipment they use – again, some equipment is more accurate than others.
When we want to explore the differences between two designs, we simply place them side by side on screen (rotating which appears on the left and which on the right) and measure which one people look at first. To understand that relative impact in more depth, we then run the more detailed eye tracking analysis and conduct some qualitative interviews.
We applied this technique to two different promotional packs for Kellogg’s Coco Pops, and were able to demonstrate that the pack with the ‘Free swimming’ promotion had significantly more impact, although the brand logo on the ‘Croc’ pack attracted more attention than it did on the plain pack.
We’ve also applied this approach to websites, ads, direct mail, POS and even strategic messaging (the message consumers pay most attention to is the message that interests them most). In each case we measure not just differences in overall impact, but the relative impact of branding, headlines, and other messages or images.
Article supplied by The ICG.
Caroline can be contacted via The ICG.