Hear me feel me

USING EMOTION AS A SERVICE

Getting emotional

Emotions play a pivotal role in all aspects of our lives, decision making in particular, shaping the choices that dictate our health and finances. The relationship between emotional state and health is well established, with recent evidence pointing to depression as an independent risk factor in chronic heart disease (Bunker et al., 2003). Other studies link the development of hypertension to chronic mental stress in the workplace (Odagiri, 2010). Given this association between emotional state and health risk, and the central place money holds in our lives, it's little wonder there are also associations between emotion and financial wellbeing.

When it comes to emotional flux, the investment markets have long been at the mercy of fear and greed: inundated with stock sales when the market eases, then congested and overheated when markets rise and stock is in demand again. Closer to home, research has shown that hope and hopefulness strongly influence retirement savings behaviour (McInnis, Nenkov, & Morrin, 2009). If emotion holds such sway over financial decision-making, why is there a dearth of consumer-friendly savings and debt-reduction apps that leverage established evidence, models and tools? Likely because, in most instances, no behavioural systems scientists or engineers are part of the product team; shock, horror. Be that as it may, it's time to frame a practical approach to incorporating Emotion as a Service APIs into next-generation financial wellbeing app design, by understanding their fit with persona-tailored feature delivery and overarching behavioural system models.

Emotions and facial expressions

Detection and assessment of the facial expressions of emotion (anger, contempt, disgust, fear, happiness, sadness and surprise) using devices as simple as a webcam is becoming more commonplace. These emotions, universally expressed in the face, are correlated with physiological states, appraisal processes and subsequent behaviors (Matsumoto & Hwang, 2014). Such facial expressions are fleeting and volatile, but they can be detected and measured. Because these emotions are discrete, the technology can capture qualitatively distinct, high-fidelity information: each emotion has its own unique trigger, physiological and behavioral correlates, and social meaning.

Breaban and Noussair (2013) examined relationships between emotions, as measured by traders' facial expressions, and market behavior using facial recognition software. They found that positive emotion is associated with higher prices and larger bubbles, and that individuals in more positive emotional states are more likely to make purchases. Fear among traders before a market opens is predictive of low prices, while those who exhibit more emotional neutrality during a crash earn greater profits. The researchers also noted a strong correlation between fear and loss aversion. Overall, they observed a close connection between market behavior and emotion, with bidirectional causal relationships. Nguyen and Noussair (2013) likewise found that fear in facial expressions is positively correlated with risk-averse choices.

There is clearly value to be gleaned from incorporating technology that can capture, measure and analyse the facial expressions, and thus the emotions, of users interacting with a system designed to help them better manage their finances.

Emotions and savings behaviour

According to McInnis, Nenkov, & Morrin (2009), 

"people feel hope when they yearn for a good outcome that seems possible even if it might not be likely. They yearn for having enough money to retire securely even if it doesn’t seem likely that it will happen. In contrast, they feel hopeful when they believe that there is a strong likelihood that something good will happen. In a retirement saving context, they not only want to have enough money to retire securely, but they believe that it is likely."

The research team found that stronger hope is related to less rational behavior. Those with higher levels of hope display more anxiety about investing and search for more information before making a decision. They think about the consequences of their decisions to a greater extent and seem to be slightly more risk averse in general. They also tend to expect a higher return from their investments, yet avoid funds characterized by higher risk and higher return potential, trusting instead more conservative funds entailing minimal risk and lower return potential. Paradoxically, their risk-averse tendencies contrast with their expectations of a higher return.

Stronger hopefulness, on the other hand, is positively related to more rational behavior. Participants with higher levels of hopefulness are cognizant of the fact that investments might not yield high returns. They are more knowledgeable about investments, less risk averse, and more optimistic. They find the investment decision less difficult and are more satisfied with it once they have made it. In sum, these individuals seem to have more peace of mind about their decisions, yet they take more risks.

There is no clear-cut means of identifying hope as a discrete facial expression using recognition technology. It can, however, be inferred using advanced sentiment and tension analysis platforms.

Emotion as a Service APIs

There is an emerging cohort of Emotion as a Service APIs, the majority of which are based on facial recognition. Here's a reasonable list of available tools: http://bit.ly/2n8InT8

Let's look at one of the more popular platforms, Affectiva, chosen for no other reason than that it is in wide use and has a robust, well-supported SDK-API pairing.

Affectiva measures unfiltered and unbiased facial expressions of emotion using a webcam. Computer vision algorithms identify key landmarks on the face, such as the tip of the nose and the corners of the mouth, and machine learning algorithms then analyze pixels in those regions to classify facial expressions. Combinations of these facial expressions are then mapped to emotions. Affectiva measures seven emotion metrics (anger, contempt, disgust, fear, joy, sadness and surprise) and provides 20 facial expression metrics. Being a popular service, it has the advantage of an ever-growing dataset with which to continually tweak and improve its machine learning algorithms.
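To make the shape of this concrete, here is a minimal Python sketch of how an app might consume frame-level emotion scores from a facial-coding SDK. The payload shape, metric names and 0-100 score scale are illustrative assumptions, not Affectiva's actual SDK interface.

```python
# Toy sketch: reduce per-frame emotion scores (assumed 0-100 confidences,
# one dict per video frame) to a dominant-emotion tally for a session.
# The frame format below is hypothetical, not a real SDK's output.
EMOTIONS = ["anger", "contempt", "disgust", "fear", "joy", "sadness", "surprise"]

def dominant_emotion(frame_scores, threshold=20.0):
    """Return the strongest emotion in a frame, or None if all scores are weak."""
    candidates = {e: frame_scores.get(e, 0.0) for e in EMOTIONS}
    label, score = max(candidates.items(), key=lambda kv: kv[1])
    return label if score >= threshold else None

def session_summary(frames):
    """Tally dominant emotions across a session's frames, skipping neutral ones."""
    counts = {}
    for frame in frames:
        label = dominant_emotion(frame)
        if label:
            counts[label] = counts.get(label, 0) + 1
    return counts

frames = [
    {"joy": 64.2, "surprise": 11.0},
    {"fear": 35.5, "sadness": 22.1},
    {"joy": 7.0},  # below threshold: treated as neutral
]
print(session_summary(frames))  # {'joy': 1, 'fear': 1}
```

In practice a real integration would stream frames from the SDK's callbacks rather than a list, but the reduction from noisy per-frame scores to a session-level summary is the part your behavioural model actually consumes.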

Sentiment and tension analysis platforms are plentiful and rely not on facial recognition but on advanced text processing algorithms. A sentiment analysis model is typically used to analyze a text string and classify it with one of the labels you provide; for example, you could analyze a tweet to determine whether it is positive or negative, or analyze an email to determine whether it is happy, frustrated or sad. You can use the Google Prediction API for this purpose. We are experimenting with http://sentistrength.wlv.ac.uk/ as well. Of course, trawling your social media in-tray for app-related sentiment is useful, but what of your hoard of email and chat interactions with users and prospects?
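To show the basic idea behind such classifiers, here is a deliberately tiny lexicon-based sentiment scorer in Python. It is a toy sketch, not the Google Prediction API or SentiStrength; the word lists are invented stand-ins for a real sentiment lexicon.

```python
# Toy lexicon-based sentiment classifier: count positive and negative
# cue words in a text and label it accordingly. Real platforms use far
# larger lexicons plus negation handling, intensifiers and ML models.
POSITIVE = {"love", "great", "happy", "helpful", "easy"}
NEGATIVE = {"hate", "broken", "frustrated", "confusing", "slow"}

def sentiment(text):
    """Classify a text string as 'positive', 'negative' or 'neutral'."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(sentiment("Love the new savings graph, so easy to read!"))        # positive
print(sentiment("The export feature is broken and the UI is confusing"))  # negative
```

The point of the sketch is the pipeline shape: free text in, a small label set out, which is exactly what you would feed into persona or behavioural models downstream.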

Integration

Look, it's a pointless sandbox exercise to play with platforms such as these and try to retrofit them into your existing architecture without first having done a solid amount of behavioural science research and modelling. Here's the takeaway: find out who the hell you think will benefit from your app, and why they would be even remotely interested in tipping another bucket of information into their overfull daily bucket to accommodate your service. Don't guess; find the largest relative population sample you can and ask them, then analyse, classify and model their responses. Know them, and keep knowing them.

Use evidence-based models to categorise these people in order to tailor functionality and UX. Their behavioural preparedness, emotional flux and motivations vary, so they won't all need or use that projectile vomit of graphs and numbers you have spent person-years crafting. Once you have done that homework, sit and think about the role emotion and sentiment play in financial decision-making and behaviour, as framed by the behavioural model you are using. Then, and only then, identify the appropriate data sources and candidate APIs to manage their processing. How would a webcam interaction fit in with your app and its interfaces with the user? How about at the user acceptance testing stage, for a start?
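As a sketch of what persona-tailored feature delivery might look like in code, here is a small Python example. The persona names, survey fields and gating rules are entirely invented for illustration; real categories would come from the evidence-based behavioural model described above.

```python
# Hypothetical sketch: classify a user into a persona from survey answers,
# then gate which app features they see. Personas and rules are invented
# placeholders, not an established behavioural model.
def classify_persona(survey):
    """survey: dict with 'literacy' (0-10 financial literacy score) and
    'engagement' ('daily' / 'weekly' / 'rarely')."""
    if survey["literacy"] >= 7 and survey["engagement"] == "daily":
        return "analyst"        # wants the full graphs-and-numbers view
    if survey["engagement"] == "rarely":
        return "reluctant"      # needs light-touch nudges, not dashboards
    return "steady_saver"       # middle ground: goals and simple progress

FEATURES = {
    "analyst": ["full_dashboard", "projections", "risk_metrics"],
    "steady_saver": ["goal_tracker", "monthly_summary"],
    "reluctant": ["one_tap_checkin", "encouragement_nudges"],
}

persona = classify_persona({"literacy": 3, "engagement": "rarely"})
print(persona, FEATURES[persona])  # reluctant ['one_tap_checkin', 'encouragement_nudges']
```

The design point is that the persona model, not the feature list, is the source of truth: emotion and sentiment signals would feed into `classify_persona` alongside survey data, and the UX is assembled from whatever the persona warrants.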

As for how to fit these processes into your in-house agile-lean product development methodology? Ask us: @mortonandlawson

References

Barabanschikov, V. (2015). Gaze Dynamics in the Recognition of Facial Expressions of Emotion. Perception. http://dx.doi.org/10.1177/0301006615594942

Breaban, A., & Noussair, C. (2013). Emotional State and Market Behavior. SSRN. Retrieved 4 April 2017, from https://ssrn.com/abstract=2276905

Chandola, T. (2006). Chronic stress at work and the metabolic syndrome: prospective study. BMJ, 332(7540), 521-525. http://dx.doi.org/10.1136/bmj.38693.435301.80

Glozier, N., Tofler, G., Colquhoun, D., Bunker, S., Clarke, D., & Hare, D. et al. (2013). The National Heart Foundation of Australia Consensus Statement on Psychosocial Risk Factors for Coronary Heart Disease. Heart, Lung And Circulation, 22, S258. http://dx.doi.org/10.1016/j.hlc.2013.05.615

Matsumoto, D., & Hwang, H. (2014). Judgments of subtle facial expressions of emotion. Emotion, 14(2), 349-357. http://dx.doi.org/10.1037/a0035237

McInnis, D., Nenkov, G., & Morrin, M. (2009). How Do Emotions Influence Savings Behavior?. Center For Retirement Research At Boston College, 9(8), 1-10.

Odagiri, Y. (2010). Psychosocial stress at work and metabolic syndrome. Stress Science Research, 25, 19-22. http://dx.doi.org/10.5058/stresskagakukenkyu.25.19

Dr Daryl Foy

Dr Daryl Foy is a Behavioural Scientist who specialises in the design of effective health behaviour change apps based on evidence, including his own validated models for optimising persistent use. He consults to industry on how to integrate persuasive design into LEAN product development, as well as conversational UI. He can be contacted at dlfoy@mortonlawson.com