How Emotions Impact Cognition
Affective computing is the expanding intersection between technology and emotion. It is characterized by the detection of and response to human feelings.
The term Affective Computing originates in the 1990s with Rosalind Picard's paper and book of that title. Picard’s view of the field is broad, including several far-reaching philosophical issues and technical challenges. Naturally, there are significant ethical issues too. For example, if a system can detect your emotional state, how widely should that information be available? Does it belong to you? Some issues are similar to those with facial recognition, which is still very controversial.
Affective computing is alive and well, albeit nascent. It has promising applications in problem domains where face-to-face contact is less available. Remote learning and healthcare are two areas where responding to users’ emotional states could considerably enhance the effectiveness of human-computer and human-robot interactions. With an aging population and severe shortages of trained staff, healthcare may be a vital problem domain for the burgeoning field of applied affective computing (AAC).
As the name suggests, AAC focuses on the practical aspects of detecting and responding to human emotions. It is a complex mix of “psychology, AI, human-computer interaction (HCI), robotics, engineering, social science, and medical science.” (See Applied Affective Computing below.) Artificial intelligence, in particular, has an increasingly vital role, but effective sensors and algorithms for identifying emotions are also needed. As suggested earlier, facial expressions are critical for many emotions, and feature detection, as illustrated below, is a necessary stepping stone.
Detection of Static Geometric Facial Features
Intelligent Behaviour Understanding Group (iBUG), Department of Computing, Imperial College London
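Once a landmark detector has located points such as the mouth corners and lip midpoints, geometric feature extraction reduces to simple distance ratios, which can then feed an emotion classifier. The sketch below is purely illustrative: the landmark coordinates are invented, and the function is an assumption about one plausible feature, not part of any particular library.

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def mouth_aspect_ratio(corners, top, bottom):
    """Ratio of mouth opening height to mouth width.

    A higher value suggests an open mouth (often seen in surprise);
    a low value combined with widely spread corners can accompany a smile.
    """
    width = dist(corners[0], corners[1])
    height = dist(top, bottom)
    return height / width

# Toy landmark coordinates in pixels -- illustrative only.
corners = ((100, 200), (160, 200))   # left and right mouth corners
top = (130, 190)                     # upper-lip midpoint
bottom = (130, 230)                  # lower-lip midpoint

print(round(mouth_aspect_ratio(corners, top, bottom), 2))  # height 40 / width 60 ≈ 0.67
```

In a real system, many such static geometric features (eye openness, brow height, mouth curvature) would be combined, often with dynamic features tracked over time, before any emotion label is assigned.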
AAC promises to be an exciting field. The name itself was coined only in the late 2010s. AAC systems already perform better than humans in recognizing some emotions, especially in audio-visual emotion classification. The future of affective computing will be spell-binding and complex.
Understanding and interpreting emotions is a primary goal of affective systems. Surprisingly, the famous naturalist Charles Darwin was instrumental in the study of emotions through his 1872 book, The Expression of the Emotions in Man and Animals. Paul Ekman, among others, credits him with founding the modern scientific study of emotion. Ekman himself was responsible for the “big-6” model of basic emotion theory later in the 20th century, identifying the six basic emotions as anger, disgust, fear, happiness, sadness, and surprise.
Darwin’s universality hypothesis proposed that all humans convey these emotions through facial expressions, but recent evidence shows this is not necessarily true (see Jack et al. in the references below). Further study in this area has resulted in a cross-cultural expression model with four basic categories: happiness, sadness, fear/surprise, and anger/disgust.
On the technical side, it may be some time before we can develop fully emotion-aware systems. Psychologists divide emotions into two categories. Primary emotions are basic “animal” responses from our primitive brains (the brain stem and limbic system). These are primarily alarm responses and lead to immediate physiological reactions that can be readily detected. Secondary emotions are more considered and potentially complex. Some authors suggest that they ultimately produce physiological responses, but this is not universally accepted. In his review of Picard’s original paper, Aaron Sloman is skeptical that affective systems could detect secondary emotions. He also doubts the ability to identify mixed emotions and their component causes. For example, you might be sad that a pet has died but relieved that it is no longer suffering. (See the Sloman and Damasio references below.)
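Sloman’s concern about mixed emotions can be made concrete with a toy representation: a distribution of intensities over emotion labels. A classifier that must output a single label necessarily discards the mix, which is exactly the information the sad-but-relieved pet owner example hinges on. The labels and numbers below are invented for illustration.

```python
# A mixed emotional state represented as intensities over labels.
# Values are illustrative, not calibrated measurements.
mixed_state = {"sadness": 0.6, "relief": 0.5, "joy": 0.1}

def top_label(state):
    """Collapse a mixed state to its single strongest label.

    This is what a single-label classifier effectively does;
    the co-occurring emotions are lost in the process.
    """
    return max(state, key=state.get)

print(top_label(mixed_state))  # 'sadness' -- the simultaneous relief disappears
```

Systems that instead output the full distribution (or valence/arousal coordinates) retain more of the mixed state, though identifying the component *causes* of each emotion, as Sloman notes, remains a much harder problem.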
Take our course on Design for Thought and Emotion.
Throughout the course, the well-respected author and professor of Human-Computer Interaction, Alan Dix, will give valuable insights into the basics of thought and emotion. He will also touch on how these factors influence us as designers of interactive systems.
In the “Build Your Portfolio: Thought and Emotion Project”, you’ll find a series of practical exercises that will give you first-hand experience in applying what we’ll cover. If you want to complete these optional exercises, you’ll create a series of case studies for your portfolio which you can show your future employer or freelance customers.