UPDATE: A Rumor of Empathy at Affectiva: Reading Faces and Facial Coding Schemes Using Computer Systems

UPDATE: Join Lou Agosta, host of A Rumor of Empathy, and his guest Dan McDuff, Principal Scientist at Affectiva, for an on-the-air conversation about the emotions on Wednesday, April 8, 2015, at noon CDT on the VoiceAmerica Empowerment Channel: Radio Empathy: Affectiva Software Delivers Emotional Recognition For the Human Face – and Empathy!? (replay available shortly thereafter).

The human face is an emotional “hot spot”. Newborn babies seem to gravitate spontaneously toward the face of the caretaker. The human face is an emotionally expressive display that is more than the sum of its parts. The face forms a total configuration that manifests a person’s humanity in a way especially engaging to another person. We humans seem to be hard-wired to interact with faces as the location for emotional expression – and for its absence. Freud famously said that “betrayal oozes at every pore.” Though Freud’s quip is not about his patient Dora’s facial expression as such, his slogan applies to the face in ways he would have appreciated and which were being explored by Darwin (1871) when Freud was still only fifteen years old. This is where the game gets interesting.

[Image: the Mona Lisa’s enigmatic micro expression]

Innovations in computing hardware, social networking, neural networks, and pattern recognition are advancing the automated understanding of the expression of human facial emotions. Enter Affectiva (www.affectiva.com), which, as the saying goes, is disrupting the disrupters. Affectiva was founded in 2009 by Rana el Kaliouby and Rosalind W. Picard, scientists at the MIT Media Lab. In a conversation with Daniel McDuff, Ph.D. (see http://alumni.media.mit.edu/~djmcduff), principal scientist at Affectiva, I had an opportunity to learn how innovations in the computer-mediated assessment of the emotions are being implemented and brought to market in a variety of applications in advertising and media measurement (Affectiva’s current chosen market), law enforcement, and engaging disorders of empathy. One innovation that Dr. McDuff brought to Affectiva from his work at the MIT Media Lab was the use of webcams to collect facial data from persons providing informed consent. This has enabled Affectiva to build a Big Data database of facial expressions to power its software algorithms. Simply stated, the output of the software process is an assessment of the individual’s emotional experience along a number of dimensions and variables.

The devil is in the details. Layers of technology go into Affectiva’s Affdex system to capture and feed information to its algorithms. The system tracks and processes the texture of the human face, performing a complicated mapping to minute facial muscle movements described by the Facial Action Coding System (FACS), resulting in inferences about the categorization and intensity of emotional engagement, valence, and related nuances of affect. As with any software system, issues of ease of use, accessibility and flexibility of the human-machine interface, scalability, maintainability, and end-to-end system integration are front and center.
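The final stage of such a pipeline – from FACS action-unit intensities to an emotion inference – can be sketched in miniature. The following is a hypothetical illustration, not Affectiva’s actual (proprietary) model: the action units, thresholds, and rules are stand-ins chosen for clarity.

```python
# Hypothetical sketch: mapping Facial Action Coding System (FACS) action-unit
# intensities, as an upstream face tracker might report them per frame, to a
# basic-emotion inference. The AU-to-emotion table is a simplified stand-in.
#
# AU6  = cheek raiser (the involuntary "smile muscle" around the eyes)
# AU12 = lip corner puller
# AU4  = brow lowerer
# AU14 = dimpler (often associated with contempt)
EMOTION_RULES = {
    "joy":      {"AU6": 0.5, "AU12": 0.5},   # eyes AND lips engaged
    "anger":    {"AU4": 0.6},
    "contempt": {"AU14": 0.5},
}

def infer_emotions(au_intensities):
    """Return emotions whose required AUs all exceed their thresholds.

    au_intensities: dict mapping AU name -> intensity in [0, 1].
    """
    detected = {}
    for emotion, required in EMOTION_RULES.items():
        if all(au_intensities.get(au, 0.0) >= t for au, t in required.items()):
            # Score the emotion by its weakest supporting action unit.
            detected[emotion] = min(au_intensities[au] for au in required)
    return detected

# A "Duchenne" smile engages AU6 and AU12; a faked smile shows AU12 alone.
genuine = infer_emotions({"AU6": 0.8, "AU12": 0.9})   # -> {"joy": 0.8}
faked   = infer_emotions({"AU12": 0.9})               # -> {}
```

The point of the sketch is the asymmetry the article describes: without the involuntary eye-region muscle (AU6), the lip movement alone does not register as joy.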
This is where Affectiva seems to have stolen a march on the competition with the use of small computer-based webcams to capture data that is then stored to a Big Data backend. Webcams are pervasive. The potential amount of data is formidable. I have been known to say: “We don’t need more data; we need expanded empathy.” However, sometimes we need both. This seems to be one of those occasions. The deep background to Affectiva’s work is to be found in Paul Ekman (1992) and his colleague Wallace Friesen, who themselves relied on the research of Charles Darwin (1871), Silvan S. Tomkins (1961, 1962, 1992, 1993), and Duchenne de Boulogne (1862). In an enormous research effort lasting some eight years, Ekman led a team that coded some 5000 detailed movements of muscles in the face that are activated, in many cases involuntarily, in the arousal of some seven basic emotions. This Facial Action Coding System becomes the basis for software automation. What’s so innovative about that? Well, anyone can try to fake a smile, pretending to be happy when one is really miserable. But what one cannot fake is activation of the “smile muscles” around one’s eyes, which are engaged only by an authentic and positive emotion that expresses one’s sincere delight and which remain uninvolved in an insincere baring of one’s teeth. Unlike one’s lips, which can be voluntarily displayed in a grimace, the muscles around the eyes are not subject to voluntary control. Hence, the opportunity exists for “betrayal to ooze out at every pore”. Moreover, the activation of such a muscle can occur and vanish in a fraction of a second. It moves across the face at a speed that lies beneath the threshold of one’s ability to see it without significantly slowing down the digital recording and playing it back. Yet the emotion occurs, however briefly. It lives. Such a detailed, minute muscle activation is called a “micro expression”.
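The “slow down the recording and play it back” step is exactly what software can automate. A minimal sketch, assuming a fixed frame rate and a per-frame action-unit intensity series: treat any activation that rises above a threshold and subsides within roughly half a second as a candidate micro expression. The parameter values here are illustrative, not Affectiva’s.

```python
# Sketch of micro-expression detection on a per-frame action-unit intensity
# series. A micro expression is modeled as a brief activation: intensity above
# a threshold for less than ~0.5 seconds. Thresholds are illustrative.

def find_micro_expressions(intensities, fps=30.0, threshold=0.3, max_seconds=0.5):
    """Return (start_frame, end_frame) spans of sufficiently brief activations."""
    spans, start = [], None
    for i, v in enumerate(intensities):
        if v >= threshold and start is None:
            start = i                          # activation onset
        elif v < threshold and start is not None:
            if (i - start) / fps <= max_seconds:
                spans.append((start, i))       # brief enough to be "micro"
            start = None                       # sustained activations are ignored
    # an activation still open at the end of the clip is not counted
    return spans

# At 30 fps: a 4-frame burst (~0.13 s) is a micro expression, while a
# sustained 60-frame activation (2 s) is an ordinary, visible expression.
clip = [0.0] * 10 + [0.6] * 4 + [0.0] * 10 + [0.6] * 60 + [0.0] * 5
spans = find_micro_expressions(clip)           # -> [(10, 14)]
```

A production system would of course work from tracked facial texture rather than a ready-made intensity series, but the temporal logic – flagging what is too fast for the unaided eye – is the same.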
Micro expressions are hypothesized to be the basis for the enigmatic smile of Leonardo da Vinci’s painting of the Mona Lisa. In our time, Ekman’s Facial Action Coding System of micro expressions becomes the basis for detecting deceit in the marketplace, politics and marriage (which, incidentally, is the subtitle of Ekman’s Telling Lies (1992)). In some cases the marketplace hype is justified, and in this case it caused me to chase the “rumor of empathy”. Ekman is on record as saying that he is skeptical about empathy, and I do not aim to change that here. In any case, Ekman identifies “duping delight” as the micro expression of happiness of the liar at having “put one over” on the teacher with the deception of being believed that the “dog really did eat the homework”. Or in law enforcement, the micro expression of contempt on the otherwise emotionless face of the would-be terrorist at striking back at the “running dogs of western imperialism”. What would the detection of such micro expressions be if not a subtle example of empathy or, more precisely, empathic receptivity? Before Ekman – and even before Darwin – the philosopher David Hume wrote of a “delicacy of sympathy” (1741), in detecting an impression of which another person was unaware. The word “empathy” had not yet been invented. Close enough.

Takeaways include:

No market, no mission: Affectiva has traction in the advertising and marketing verticals. While all the usual disclaimers apply, and I have not “test driven” the Affdex system, Affectiva seems to be well on its way to integrating its facial recognition algorithm(s) in a comprehensive end-to-end automated process that includes a user-friendly frontend (webcam) and a Big Data backend. The intellectual property is relevant, and patenting the algorithms makes sense. But absent an integrated, usable approach, it is going to be an idle wheel that does not move any other part of the business process. While a strong start is no guarantee of long-term success, Affectiva has innovative technology and a compelling message that resonates with corporate needs to spend money on advertising that delivers demonstrable bang for the buck.

An implied definition of empathy: An account of empathy exists here based on micro expressions, which is what inspired my interest. As noted by Hume, if you are empathically receptive to a micro expression of which I am unaware, then your empathy is more sensitive and accurate than mine is in this situation. Now bring in the software from Affectiva, which captures many facial actions per second to digital storage. When the software is applied by a skilled practitioner, the research analyst gains access to an unprecedented wealth of empathically relevant information.

The sun sets on focus groups (wouldn’t it be nice?): This is the fervent hope of many advertisers with budget. As the saying goes: half the advertising budget is wasted, but which half? The participant in the advertising focus group says that he loves the product, but his micro expressions also show boredom and anger. Hmmm. This implies a cautionary tale for all applications of emotion-detecting mechanisms – including focus group coordinators, police interrogators, teachers, therapists, and software systems. Much value is to be found in accurately assessing what emotion the other person is experiencing. But we still do not know the motive and background for the emotion without interacting with the other person. Is the other person experiencing anger because he was dismissive of the taste of the candy bar or because he is a survivor of some childhood trauma unrelated to the product?

We do not learn to express our emotions; we learn to inhibit them: Display rules often trump the uninhibited expression of our emotions. Instead of banging one’s cup on the table, one learns to make a request: “May I please have more apple juice?” Display rules inhibiting the expression of negative emotions are not just common in Japanese culture, which is famous for inhibiting disagreement with the community, but are pervasive in most polite social settings. This is where Big Data can make a difference in providing samples of facial expressions that may be relatively rare in our day-to-day life.

Confabulation lives too (unfortunately): Consumers are notorious for not really knowing what they like or dislike about products. This is a reflection of a key feature of human behavior – people “confabulate,” which means they make up stories, which, however, they sincerely believe (see Nisbett and Wilson 1977). Conscientious participants in focus groups are skilled at making up reasons for preferring or disliking products without being able to identify the authentic cause of their preferences. Comparing two products – brand A and brand X – would-be buyers provide elaborate and complex reasons for preferring brand A over brand X. It’s actually a psychology experiment, not market research – the products are identical! The respondents are confabulating. This is where a detailed analysis of facial expressions, including expressions that may lie at or beneath the threshold of conscious awareness, can provide not just data but information about what the individual is really experiencing in relation to the product or service. One can go back and replay the digital video and analyze it in detail – or, even better, do so using automated methods. Supposedly there is nothing new under the sun; but automated facial analysis gives marketers access to information that was previously hidden in plain view.
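The automated check described above can be sketched as a comparison between what the participant says and what the frames show. Everything here is a hypothetical illustration: the product names, the valence numbers, and the `flag_confabulation` helper are inventions for the sketch, not any vendor’s API.

```python
# Sketch: contrasting a focus-group participant's stated preference with the
# average valence recovered from per-frame emotion scores (valence in [-1, 1])
# recorded while viewing each product. All names and numbers are illustrative.

def mean_valence(frame_valences):
    """Average per-frame valence over one viewing session."""
    return sum(frame_valences) / len(frame_valences)

def flag_confabulation(stated_choice, sessions, margin=0.1):
    """Return the product the face favored when it disagrees with the words.

    sessions: dict mapping product name -> list of per-frame valence scores.
    Returns None when face and words agree (within `margin`).
    """
    scores = {product: mean_valence(v) for product, v in sessions.items()}
    face_favorite = max(scores, key=scores.get)
    if (face_favorite != stated_choice
            and scores[face_favorite] - scores[stated_choice] > margin):
        return face_favorite
    return None

# The participant says "brand A", but the frames lean toward brand X.
mismatch = flag_confabulation(
    "brand A",
    {"brand A": [0.1, 0.0, -0.2], "brand X": [0.5, 0.4, 0.6]},
)   # -> "brand X"
```

Such a flag would not prove confabulation, of course; as the focus-group takeaway above notes, only interaction with the person reveals the motive behind the mismatch.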

Applications in law enforcement: “If I tell you about this, I will have to kill you.” All bad jokes aside, large governmental organizations are definitely interested (e.g., Davis et al 2002). Here the issues relating to privacy and the integrity of one’s own personal data are complex and confronting. I recall Martha Bennett saying at my first research meeting at Giga Information Group in 1999, “Privacy, what privacy?” Deeply cynical? Matters have not gotten simpler – or more private – in the interim. It turns out that the expression on one’s face is a new level of personal data that had previously not been subject to collection or analysis. While a traveler at an airport may betray a micro expression that discloses anger or contempt, one does not know whether this is because he is “living into the future” and anticipating being in a stressed position in a middle seat on his long flight or because he has more sinister plans. Ekman is famous for debunking the polygraph or “lie detector” (1992), and it is unlikely that such a result will be changed by the provision of additional facial software. The best way to pass a “lie detector” test continues to be to believe one’s own lies. The so-called lie detector measures the physiological stresses aroused by the experience of the inconsistency between the person’s word and (intended) deed. The myth of technology is also a factor, with the induced stress of the imagined effectiveness of the “lie detecting technology” (e.g., enhanced by the examiner wearing a white coat). Ekman is also celebrated for saying he would not teach his technology to the Russian equivalent of the secret service since he was not persuaded that its use would be restricted to helping and protecting people. For law enforcement the privacy issues of responsible data use are fraught. Expect major litigation and controversy. For business applications, permissioned, approved, unobtrusive data gathering is the watchword.

Applications to disorders of empathy: This is an area of great promise to relieve human suffering, but not necessarily an area of revenue provisioning. Though the debate about the relevance of empathy continues, empathy is hypothesized to be at the basis of the human ability (1) to relate emotionally to other persons like oneself, (2) to experience other persons as intentional agents, and (3) to attribute a mind such as one’s own – as in “mindedness” – to other persons like oneself. Empathy is not reducible to emotional contagion, shared (joint) attention, or mindedness; but at least the first two are input for further empathic understanding, empathic interpretation, and empathic responsiveness that enriches a person’s relations with other human beings. Absent empathy, people cease to matter to the person lacking empathy, though people may be useful in a certain means-ends way as providers of services. In disorders of empathy, one or more of these mechanisms has misfired or is hypothesized to be missing. The individual “on the autism spectrum” seems to be unaware of the emotions, intentions or mindedness of other people. The subsequent breakdowns in human development, education, and day-to-day functioning are debilitating in the extreme and can even be life threatening. People and computer systems can produce similar results and output using profoundly different means and methods. Though it is improbable that the Affdex software arrives at its conclusions about what people are experiencing emotionally in the same way that the human brain arrives at its results, the possibilities for comparison are significant. A comparison between the steps of human brain-based empathy and the artificial empath implemented in software may reveal what can go wrong and suggest meaningful interventions.
Finally, no substitute exists for an expert clinical differential diagnosis by an informed human – though hope springs eternal in the matter of eliminating focus groups in marketing – but the use of software to detect and analyze micro expressions – or their absence – can be a significant check and balance in the diagnostic process. Empathy is still on the short list of those things where humans enjoy a decisive advantage over automated systems. Still, it is sobering the way the boundary keeps getting pushed around as humans are beaten by computing systems in chess, natural language processing in Jeopardy, and now challenged by decoding facial emotions. In any case, the combination of human judgment and a software system acting as a kind of “co-pilot” to the decision-making human is a compelling partnership. In summary, the rumor of empathy at Affectiva is confirmed. Empathy lives at Affectiva. I hasten to add that “a rumor of empathy” is my turn of phrase – my spin – and not a description employed by Affectiva, though I suggest that the distinction is an apt one.

Relevant URLs

https://angel.co/eyeris
http://www.empatica.com


Charles Darwin. (1871). The Expression of the Emotions in Man and Animals. Chicago: University of Chicago Press, 1968.

Ann Davis, Joseph Pereira, and William Bulkeley. (2002). Silent Signals: Security Concerns Bring New Focus on Body Language: FBI, Customs Officials Take ‘Science’ More Seriously; Experiment at Logan. The Wall Street Journal, August 15, 2002: p. 1,6.

Duchenne de Boulogne. (1862). The Mechanism of Human Facial Expression. A. Cuthbertson (tr. and ed.). Cambridge, UK: Cambridge University Press, 1990.

Paul Ekman. (1992). Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. New York: W. W. Norton.

David Hume. (1741). Of the Delicacy of Taste and Passion. In Charles W. Hendel (ed.). Of the Standard of Taste and Other Essays, Indianapolis: Bobbs-Merrill: 1965: 25-28.

Daniel J. McDuff, Rana el Kaliouby, and Rosalind W. Picard. (2012). Crowdsourcing Facial Responses to Online Videos. IEEE Transactions on Affective Computing, forthcoming.

Richard Nisbett and Timothy Wilson. (1977). Telling More Than We Can Know: Verbal Reports on Mental Processes. Psychological Review. Vol. 84, No. 3: 231-259.

Silvan S. Tomkins. (1961, 1962, 1992, 1993). Affect, Imagery, Consciousness. Vols. 1-4. New York: Springer Publishing.

Lou Agosta. (2015). A Rumor of Empathy: Resistance, Narrative, and Recovery. London: Routledge.

Find A Rumor of Empathy on Facebook

(c) Lou Agosta, Ph.D. and the Chicago Empathy Project

Categories: Affectivity, Ekman, Emotions, Empathy, simulation

