The emotional pull of advertisements: How emotional reactions relate to purchase intent

The Question: Why is emotion important, and how do we assess it?
Noldus Consultants: Jason Rogers, PhD and Abbe Macbeth, PhD
Case study write-up: Abbe Macbeth, PhD

Market researchers, advertisers, and consumer scientists have long been aware of the strong relationship between emotion and consumer behavior1 and of the interplay between emotion and advertising2. Advertisers make use of this knowledge by tugging on our emotions: they employ cute animals and babies, remind us of our childhoods, condition us to react to certain jingles, and so on; all of these practices are designed to create an emotional bond between the consumer and the product.

How do consumer researchers, trying to harness consumers’ emotions, actually assess emotionality? With the technological advances of the past fifty years, researchers can go beyond self-report and do more than simply ask the consumer, “Tell me how this ad makes you feel.” Instead, researchers have turned to methods such as implicit reaction time tests3 and neuromarketing4. Both methods tap unconscious drivers of decision making, yet neither looks at the one place where emotions are most visibly displayed: the face.

In the past fifteen years, technology that automatically classifies facial expressions has come to market. Noldus provides our clients with FaceReader™, an advanced software platform for automatic, objective assessment of facial emotion. Based on the “basic” emotions originally set forth by Paul Ekman5, FaceReader automatically determines the presence and intensity of Happy, Sad, Angry, Surprised, Scared, and Disgusted, as well as Neutral. FaceReader has been validated against human coders6, with agreement ranging from 70% (Disgusted) to 99% (Happy). While much of the previous work using FaceReader has focused on psychology7,8 or food science research9,10, recent work demonstrated its usefulness in the consumer research field11. Specifically, the expression of Happy predicted an advertisement’s effectiveness: positive correlations were found between Happy and respondents’ attitudes toward the advertisement (AAD) and toward the brand (AB) for ads with high and medium levels of amusement, but not for ads with low amusement11.

A drawback of using the traditional FaceReader software for consumer research is that the software must be hosted on a local computer, with respondents present in the laboratory in order to analyze their facial emotions. To address this, Noldus recently debuted FaceReader Online, a user-friendly, easily accessible portal built around proven, reliable FaceReader technology. By capturing data from respondents in their own homes, FaceReader Online gives researchers the option of recruiting respondents from around the globe.

A well-known metric of consumer behavior is purchase intent (PI), which has long been relied upon by consumer researchers as an estimate of actual buying behavior12. Here we ask: how does FaceReader’s output compare to PI? Is it as good a predictor? Could facial emotion assessments actually replace PI as a measure of purchasing desire? In the current study, FaceReader Online was used to capture data from respondents across the United States as they watched a variety of advertisements. Afterwards, a PI measure was taken and correlated with the expression of Happy, as it was hypothesized that a) Happy would predict and correlate with PI, and b) ads known to perform better would also show higher PI and greater expressions of Happy.


The Journey: Using FaceReader Online

Respondents

Respondents were recruited via Survey Monkey. Of those invited, 22% responded, with 113 people in total completing the study. Respondents ranged in age from 21 to 65 and were split across gender. The only exclusion criteria were that respondents could not wear glasses and had to have a webcam attached to, or embedded in, their computers.

Stimuli

After a few brief introduction slides requesting permission to use the webcam and verifying age and the absence of glasses, respondents were shown one of eight ads. Each ad originally aired during a Super Bowl broadcast between 2009 and 2014, and the ads varied in category (consumer packaged goods, household needs, food and beverage) as well as in known market performance13. Each respondent saw one ad, with a final n = 13-15 per video. Ads were presented to respondents via FaceReader Online, with assignment randomized across age and gender. No video recordings of respondents were made; FaceReader Online used the respondents’ webcams only to gather facial expression data and analyze it online. Immediately after the advertisement played, a Purchase Intent (PI) measure was taken: a short survey asked respondents whether, based on the advertisement they had seen, they would be likely to purchase that product within the month, using the traditional 5-point Likert scale14.

FaceReader & FaceReader Online technology

FaceReader works in three simple steps, in both the original version15 and subsequent releases16. The software first detects the face and then creates an accurate model of it based on the Active Appearance Model method17. The model describes over 500 key points on the face, and facial texture is determined by how those points interact with one another. Classification is then performed by an artificial neural network18 trained on a database of over 10,000 manually annotated images. For each frame, FaceReader provides a value from 0 (not present at all) to 1 (maximally present) for each of the seven emotions (Happy, Sad, Angry, Surprised, Scared, Disgusted, and Neutral).
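
For readers who work with this kind of frame-level output programmatically, the sketch below illustrates how per-frame emotion intensities might be aggregated into per-respondent means before statistical analysis. It is a minimal Python example; the column names and file layout are assumptions for illustration, not FaceReader’s actual export format.

```python
# Minimal sketch (assumed data layout, not FaceReader's actual export format):
# aggregate per-frame emotion intensities (0-1) into per-respondent, per-ad means.
import pandas as pd

EMOTIONS = ["Happy", "Sad", "Angry", "Surprised", "Scared", "Disgusted", "Neutral"]

def per_respondent_means(frames: pd.DataFrame) -> pd.DataFrame:
    """frames: one row per analyzed frame, with 'respondent_id', 'ad_id',
    and one 0-1 intensity column per emotion (assumed column names)."""
    return (
        frames
        .groupby(["respondent_id", "ad_id"])[EMOTIONS]
        .mean()              # average intensity over the full viewing
        .reset_index()
    )

# Example usage:
# means = per_respondent_means(pd.read_csv("facereader_export.csv"))
```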

FaceReader Online uses the same FaceReader technology, but instead of running on a local computer, the analysis is carried out in the cloud on the Microsoft Windows Azure platform, using the same face model and classification described above.

Data analysis

All data were exported from FaceReader and analyzed in SPSS (Version 22; IBM, Armonk, NY) and Microsoft Excel (Microsoft, Redmond, WA) using the Data Analysis add-in.

 

The Outcomes

 

FaceReader Online is a robust platform

Despite potential differences in lighting, web cameras, and camera placement in respondents’ homes, there were no significant differences in the number of frames analyzed across advertisements (average per person per ad: 415 ± 11). All respondents had fewer than 11% missed frames during analysis, and no ad had significantly more missed frames than any other.
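
As an illustration of this missed-frame check, the following sketch computes each respondent’s missed-frame rate from a hypothetical frame-level export in which frames with no face model fit have empty emotion values; the data layout is an assumption, not FaceReader Online’s actual format.

```python
# Sketch of a missed-frame check, assuming (hypothetically) that frames where no
# face model could be fit appear with empty (NaN) emotion values in the export.
import pandas as pd

def missed_frame_rate(frames: pd.DataFrame) -> pd.Series:
    """Fraction of frames per respondent with no usable emotion values."""
    missed = frames["Happy"].isna()                    # NaN implies no model fit (assumption)
    return missed.groupby(frames["respondent_id"]).mean()

# Example check against the threshold reported above:
# rates = missed_frame_rate(frames)
# print((rates < 0.11).all())   # True if every respondent is under 11% missed frames
```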

Ad performance predicted Purchase Intent

First, we compared the self-reported PI with each advertisement’s known performance13, which split the ads into three categories: High-, Average-, and Low-performing. High-performing ads showed significantly greater PI than Average- and Low-performing ads (Figure 1; p < 0.05); however, Average- and Low-performing ads did not significantly differ from one another.

Figure 1: Ads that performed well showed significantly higher Purchase Intent than Average- or Low-performing ads (*p < 0.05).
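
For illustration, a comparison of this kind can be run as an omnibus test across the three performance tiers followed by pairwise comparisons. The sketch below shows one such analysis in Python; the exact tests and corrections used in the study are not specified here, so this is an assumption-laden example rather than a reproduction of the reported analysis.

```python
# Illustrative group comparison of PI by ad performance tier (names are assumptions):
# an omnibus one-way ANOVA followed by uncorrected pairwise t-tests.
from scipy import stats

def compare_pi(pi_by_tier):
    """pi_by_tier: dict mapping 'High' / 'Average' / 'Low' to lists of 1-5 PI ratings."""
    f_stat, p_omnibus = stats.f_oneway(*pi_by_tier.values())
    pairs = [("High", "Average"), ("High", "Low"), ("Average", "Low")]
    pairwise = {
        (a, b): stats.ttest_ind(pi_by_tier[a], pi_by_tier[b]).pvalue
        for a, b in pairs
    }
    return p_omnibus, pairwise

# Example usage:
# p_omnibus, pairwise = compare_pi({"High": [...], "Average": [...], "Low": [...]})
```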

Ad performance predicted the amount of Happy

Based on earlier work11, it was hypothesized that High-performing ads would result in greater Happy expressions. Just as with PI, High-performing ads showed significantly greater Happy output than Average- and Low-performing ads (Figure 2; p < 0.001); Average- and Low-performing ads did not significantly differ from one another.

Figure 2: Ads that performed well showed significantly higher Happy expressions than Average- or Low-performing ads (***p < 0.001).

“Happy” correlated with Purchase Intent

Using a multiple regression analysis, we determined that Happy, and no other emotion, significantly predicted PI (β = 0.58, p < 0.001; data not shown). Furthermore, as shown in Figure 3, Happy and Purchase Intent showed the same pattern across High-, Average-, and Low-performing ads.

Figure 3: Ad performance as a function of Happy expression and Purchase Intent.
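
A minimal sketch of this type of regression is shown below: PI regressed on the mean intensity of each emotion, with variables z-scored so the coefficients are comparable to standardized betas. Column names and the exact model specification are assumptions for illustration, not the study’s actual analysis (which was run in SPSS).

```python
# Sketch of a regression of PI on mean emotion intensities. Variables are z-scored
# so coefficients resemble standardized betas; column names are assumptions.
import pandas as pd
import statsmodels.api as sm

EMOTIONS = ["Happy", "Sad", "Angry", "Surprised", "Scared", "Disgusted", "Neutral"]

def regress_pi_on_emotions(df: pd.DataFrame):
    """df: one row per respondent, with a 1-5 'PI' column and one
    mean-intensity column per emotion (assumed column names)."""
    cols = EMOTIONS + ["PI"]
    z = (df[cols] - df[cols].mean()) / df[cols].std()   # z-score each variable
    model = sm.OLS(z["PI"], sm.add_constant(z[EMOTIONS])).fit()
    return model.params, model.pvalues                  # standardized betas, p-values

# Example usage:
# betas, pvals = regress_pi_on_emotions(means_with_pi)
# print(betas["Happy"], pvals["Happy"])
```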

Taken together, these data demonstrate that the Happy expression is a valid predictor of PI and that an ad’s performance can be defined by the amount of Happy expressed during viewing.

Ad performance and general emotionality

Looking at overall emotion expression, we found that High- and Low-performing ads elicited similar levels of emotionality: just over 30% of viewing time for both types of ads (Figure 4). In contrast, Average-performing ads elicited much lower emotionality, with emotion expression dropping to approximately 20% of viewing time (Figure 4).

 

Figure 4: Overall emotional expression as a function of ad performance.

Despite eliciting similar overall levels of emotion during viewing, High- and Low-performing ads elicited different types of emotions. Viewers of High-performing ads registered more Happy, whereas viewers of Low-performing ads displayed more Sad and Angry expressions. Although these differences were not statistically significant, the data shown in Figure 4 are compelling in the types of emotions these ads elicit from viewers.
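
The write-up does not state how “emotional expression as a share of viewing time” was operationalized. One plausible reading, sketched below purely as an assumption, counts the frames whose strongest classification is anything other than Neutral.

```python
# Hypothetical operationalization of "overall emotional expression": the share of
# analyzed frames whose strongest classification is something other than Neutral.
# This is an illustrative assumption; the study may have used a different rule.
import pandas as pd

EMOTIONS = ["Happy", "Sad", "Angry", "Surprised", "Scared", "Disgusted", "Neutral"]

def emotional_share(frames: pd.DataFrame) -> float:
    """Fraction of frames where the dominant expression is non-Neutral."""
    dominant = frames[EMOTIONS].idxmax(axis=1)   # column name with the highest value per frame
    return float((dominant != "Neutral").mean())
```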

 

The Insights

Similar to what Lewinski et al.11 found previously, FaceReader Online was able to accurately predict PI. As anticipated, viewers of High-performing ads displayed the highest levels of PI (Figure 1) and Happy (Figure 2). Across the eight advertisements presented, regardless of ad performance, Happy was the only measured emotion that predicted PI in the regression analysis.

It should be noted that while Happy was a significant predictor of PI, it is not the only factor in determining an ad’s performance. For example, not every advertisement is meant to be humorous; many are meant to be taken seriously and thus would not evoke a response of “Happy”.

Furthermore, we do not know how ad exposure and market saturation may have influenced the outcomes of this experiment. Overexposure to an ad can decrease its effectiveness over time, which could influence the emotions expressed during viewing. Finally, the halo effect, wherein a consumer’s overall impression of a brand or market can influence his or her thoughts and feelings toward that brand19, can exert an effect on the effectiveness of any given advertisement that is difficult to measure.

In addition to confirming the usefulness of FaceReader in predicting PI, this study also established FaceReader Online as a valuable tool for assessing advertisement effectiveness. Even given the technical constraints of using an online platform, the data clearly show that FaceReader is well placed in the market as an automated, non-intrusive measure of engagement with an advertisement. Furthermore, the data obtained from the software can be used to accurately predict a viewer’s PI. Both the overall amount of emotion displayed and the type of emotion detected by the software can be used by researchers to predict ad effectiveness. With this tool, Noldus has provided consumer researchers with technology that rivals older measures, such as PI, in predicting advertisement effectiveness.

 

References

1Weinberg, P. & Gottwald, W. (1982). Impulsive consumer buying as a result of emotions. Journal of Business Research, 10, 43-57.

2Holbrook, M.B. & Batra, R. (1987). Assessing the role of emotions as mediators of consumer responses to advertising. Journal of Consumer Research, 14, 404-420.

3http://gemmacalvert.com/everything-you-need-to-know-about-implicit-react...

4Sebastian, V. (2014). Neuromarketing and evaluation of cognitive and emotional responses of consumers to marketing stimuli. Procedia – Social and Behavioral Sciences, 127, 753-757.

5Ekman, P. & Friesen, W.V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17, 124-129.

6Terzis, V., Moridis, C.N., & Economides, A.A. (2010). Measuring instant emotions during a self-assessment test: The use of FaceReader. Proceedings of Measuring Behavior 2010, Eindhoven, The Netherlands, 192-193.

7Chentsova-Dutton, Y.E. & Tsai, J.L. (2010). Self-focused attention and emotional reactivity: The role of culture. Journal of Personality and Social Psychology, 98, 507-519.

8Ceccarini, F. & Caudek, C. (2013). Anger superiority effect: The importance of dynamic emotional facial expressions. Visual Cognition, 21, 498-540.

9Garcia-Burgos, D. & Zamora, M.C. (2013). Facial affective reactions to bitter-tasting foods and body mass index in adults. Appetite, 71, 178-186.

10de Wijk, R.A., Kooijman, V., Verhoeven, R.H.G., Holthuysen, N.T.E., & de Graaf, C. (2012). Autonomic nervous system responses on and facial expressions to the sight, smell, and taste of liked and disliked foods. Food Quality and Preference, 26, 196-203.

11Lewinski, P., Fransen, M.L., & Tan, E.S.H. (2014). Predicting advertising effectiveness by facial expressions in response to amusing persuasive stimuli. Journal of Neuroscience, Psychology, and Economics, 7, 1-14.

12Fishbein, M. & Ajzen, I. (1975). Belief, Attitude, Intention, and Behavior. Reading, MA: Addison-Wesley Publishing Company.

13www.acemetrix.com/spotlights/events

14Risen, E. & Risen, L. (2008). The Use of Intent Scale Translations to Predict Purchase Interest. Retrieved from http://www.biotrak.com/wp-content/uploads/2011/11/Intent-Scale-White-Pap....

15den Uyl, M.J. & van Kuilenberg, H. (2005). The FaceReader: Online facial expression recognition. Proceedings of Measuring Behavior 2005, 5th International Conference on Methods and Techniques in Behavioral Research, Wageningen, The Netherlands, 589-590.

16van Kuilenberg, H., den Uyl, M.J., Israel, M.L., & Ivan, P. (2008). Advances in face and gesture analysis. Proceedings of Measuring Behavior 2008, Maastricht, The Netherlands, 371-372.

17Cootes, T. & Taylor, C. (2000). Statistical models of appearance for computer vision. Technical report, University of Manchester, Wolfson Image Analysis Unit, Imaging Science and Biomedical Engineering.

18Bishop, C.M. (1995). Neural Networks for Pattern Recognition. Oxford: Clarendon Press.

19Nisbett, R.E. & Wilson, T.D. (1977). The halo effect: Evidence for unconscious alteration of judgments. Journal of Personality and Social Psychology, 35, 250-256.
