12. Affective Computing

As Human-Computer Interaction (HCI) and Interaction Design moved from designing and evaluating work-oriented applications towards dealing with leisure-oriented applications, such as games, social computing, art, and tools for creativity, we have had to consider, for example, what constitutes an experience, how to deal with users’ emotions, and how to understand aesthetic practices and experiences. Here I will provide a short account of why emotion, in particular, became one such important strand of work in our field.

Author/Copyright holder: Courtesy of Rikke Friis Dam and Mads Soegaard. Copyright terms and licence: CC-Att-ND (Creative Commons Attribution-NoDerivs 3.0 Unported).

Affective Computing video 1 - Introduction to Affective Computing and Affective Interaction

Author/Copyright holder: Courtesy of Rikke Friis Dam and Mads Soegaard. Copyright terms and licence: CC-Att-ND (Creative Commons Attribution-NoDerivs 3.0 Unported).

Affective Computing video 2 - Main Guidelines and Future Directions

Author/Copyright holder: Courtesy of Rikke Friis Dam and Mads Soegaard. Copyright terms and licence: CC-Att-ND (Creative Commons Attribution-NoDerivs 3.0 Unported).


Affective Computing video 3 - Designing Affective Interaction Products Dealing With Stress

Author/Copyright holder: Courtesy of Rikke Friis Dam and Mads Soegaard. Copyright terms and licence: CC-Att-ND (Creative Commons Attribution-NoDerivs 3.0 Unported).

Affective Computing video 4 - Business value, value, and inspirations

I start by describing the wave of research in a number of different academic disciplines that resurrected emotion as a worthy topic of research. In fact, before this wave, one of the few studies of emotion and emotional expression that did not treat emotion as a problem goes back as far as Darwin’s “The Expression of the Emotions in Man and Animals” from 1872 (Darwin, 1872). After Darwin, much attention in the academic world was focused on how emotion is problematic for rational thinking.

The new wave of research on emotion spurred ideas among both AI researchers and HCI researchers. In particular, the work by Rosalind Picard and her book “Affective Computing” opened a viable research agenda for our field (Picard, 1997). But as with any movement within HCI, there will always be different theoretical perspectives on the topic. A counter-reaction to Picard’s cognitivistic models of emotion came from the work by Sengers, Gaver, Dourish and myself (Boehner et al. 2005, Boehner et al. 2007, dePaula and Dourish 2005, Gaver 2009, Höök 2008, Höök et al. 2008). Rather than drawing on a cognitivistic framework, this strand of work, Affective Interaction, draws upon phenomenology and sees emotion as constructed in interaction – between people, and between people and machines.

While the work in these two strands on designing for emotion has contributed many insights, novel applications, and better designs, both have lately arrived at a more realistic design aim in which emotion is just one of the parameters we have to consider. Instead of placing emotion at the centre of a design process, it is now seen as one component contributing to the overall design goal. In particular, it becomes a crucial consideration as we approach design for various experiences and interactions.

12.1 History: the resurrection of emotion

During the 1990s, there was a wave of new research on the role of emotion in areas as diverse as psychology (e.g. Ellsworth and Scherer, 2003), neurology (e.g. LeDoux, 1996), medicine (e.g. Damasio, 1995), and sociology (e.g. Katz, 1999). Prior to this new wave of research, emotion had, as I mentioned, been considered a low-status topic of research, and researchers had mainly focused on how emotion got in the way of our rational thinking. Results at that time focused on issues such as how pilots, when getting really scared, develop tunnel vision and stop noticing important changes in the flight’s surroundings; how being upset with a colleague and getting angry in the middle of a business meeting can sabotage the dialogue; or how becoming very nervous while giving a presentation can make you lose the thread of the argument. Overall, emotion was seen as the less valued half of the dualistic pair rational – emotional, and associated with body and female in the “mind – body” and “male – female” pairs. This dualistic conceptualisation goes back as far as the Greek philosophers. In Western thinking, the division of mind and body was taken as indisputable, and, for example, Descartes looked for the gland that would connect the thoughts (inspired by God) with the actions of the body, see Figure 12.1.

Copyright terms and licence: pd (Public Domain (information that is common property and contains no original authorship)).

Figure 12.1: René Descartes' illustration of dualism. Inputs are passed on by the sensory organs to the epiphysis in the brain and from there to the immaterial spirit.

But with this new wave of research in the 1990s, emotion was resurrected and given a new role. It became clear that emotions are the basis for behaving rationally. Without emotional processes we would not have survived. Being hunted by a predator (or enemy aircraft) requires focusing all our resources on escaping or attacking. Tunnel vision makes sense in that situation. Unless we can associate feelings of uneasiness with dangerous situations, such as food we should not be eating or people who aim to hurt us, we would make the same mistakes over and over, see Figure 12.2.

Author/Copyright holder: Courtesy of Master Sgt. Lance Cheung. Copyright terms and licence: pd (Public Domain (information that is common property and contains no original authorship)).

Figure 12.2: Focusing on enemy aircraft, getting tunnel vision

While fear and anger may seem the most important to our survival, our positive and more complex, socially oriented emotional experiences are also invaluable. If we do not understand the emotions of others in our group of primates, we cannot keep the peace, share food, or build the alliances and friendships needed to share what the group can jointly create (Dunbar, 1997). To bring up our kids to function in this complex landscape of social relationships, experiences of shame, guilt, and embarrassment are used to shape their behaviour (Lutz 1986, Lutz 1988). But positive emotions also play an important role in bringing up our kids: conveying how proud we are of them, making them feel seen and needed by the adults around them, and showing unconditional love.

The new wave of research also questioned the old Cartesian dualistic division between mind and body. Emotional experiences do not reside solely in our minds or brains. They are experienced by our whole bodies: in hormone changes in our blood streams, in nerve signals to muscles that tense or relax, in blood rushing to different parts of the body, in body postures, movements, and facial expressions (Davidson et al., 2002). Our bodily reactions in turn feed back into our minds, creating experiences that regulate our thinking, which in turn feeds back into our bodies. In fact, an emotional experience can start through body movements; for example, dancing wildly might make you happy. Neurologists have studied how the brain works and how emotion processes are a key part of cognition. Emotion processes sit in the middle of most processing, running from the frontal lobes in the brain, via the brain stem, to the body and back (LeDoux, 1996), see Figure 12.3.

Author/Copyright holder: Unknown (pending investigation). Copyright terms and licence: Unknown (pending investigation). See section "Exceptions" in the copyright terms below.

Figure 12.3: LeDoux’s model of fear when seeing a snake.

Bodily movements and emotion processes are tightly coupled. As discussed by the philosopher Maxine Sheets-Johnstone in The Corporeal Turn: An Interdisciplinary Reader, there is “a generative as well as expressive relationship between movement and emotion” (Sheets-Johnstone, 2009). Certain movements will generate emotion processes, and vice versa.

But emotional experiences do not only reside “inside” our bodies as processes going back and forth between different parts of the body; they are also, in a sense, spread over the social settings we are in (Katz, 1999, Lutz, 1986, Lutz 1988, Parkinson, 1996). Emotions are not (only) hard-wired processes in our brains, but changeable and interesting regulating processes for our social selves. As such, they are constructed in dialogue between ourselves and the culture and social settings we live in. Emotion is a social and dynamic communication mechanism. We learn how and when certain emotions are appropriate, and we learn the appropriate expressions of emotions for different cultures, contexts, and situations. The way we make sense of emotions is a combination of the experiential processes in our bodies and how emotions arise and are expressed in specific situations in the world, in interaction with others, coloured by the cultural practices that we have learnt. We are physically affected by the emotional experiences of others. Smiles are contagious.

Catherine Lutz, for example, shows how a particular form of anger, named song by the people on the south Pacific atoll Ifaluk, serves an important socializing role in their society (Lutz, 1986, Lutz 1988). Song is, according to Lutz, “justifiable anger” and is used with kids and with those who are subordinate to you, to teach them appropriate behaviour when they, for example, fail to do their fair share of the communal meal, fail to pay respect to elders, or act socially inappropriately.

Ethnographic work by Jack Katz (1999) provides us with a rich account of how people, individually and in groups, actively produce emotion as part of their social practices. He discusses, for example, how joy and laughter amongst visitors to a funny-mirror show are produced and regulated between the friends visiting together. Moving to a new mirror, tentatively chuckling at the reflection, glancing at your friend, who in turn might move closer, might in the end result in ‘real’ laughter when standing together in front of the mirror. Katz also places this production of emotion in a larger, complex social and societal setting when he discusses anger among car drivers in Los Angeles, see Figure 12.4. He shows how anger is produced as a consequence of a loss of embodiment with the car, the road, and the general experience of travelling. He connects the social situation on the road, the lack of communicative possibilities between cars and their drivers, our prejudices about other people’s driving skills related to their cultural background or ethnicity, and so on, and shows how all of this comes together to explain why and when anger is produced, for example, as we are cut off by another car. He even sees anger as a graceful way to regain a sense of embodiment.

Copyright terms and licence: pd (Public Domain (information that is common property and contains no original authorship)).

Figure 12.4: Katz places the production of emotion into a larger complex social and societal setting when he discusses anger among car drivers in Los Angeles

12.2 Emotion in Technology?

Part of the new wave of research on emotion also affected research on and innovation of new technology. In artificial intelligence, emotion had to be considered an important regulatory process, determining behaviour in autonomous systems of various kinds, e.g. robots trying to survive in a dynamically changing world (see e.g. Cañamero, 2005).

In HCI, we understood the importance of considering users’ emotions explicitly in our design and evaluation processes. Broadly, HCI research came to move in three different directions, with three very different theoretical perspectives on emotion and design.

1. The first, widely known and very influential perspective is that of Rosalind Picard and her group at MIT, later picked up by many other groups, in Europe most notably by the HUMAINE network. She named this cognitivistically inspired design approach Affective Computing in her groundbreaking book from 1997.

2. The second design approach might be seen as a counter-reaction to Affective Computing. Instead of starting from a more traditional perspective on cognition and biology, the Affective Interaction approach starts from a constructivist, culturally determined perspective on emotion. Its best-known proponents are Phoebe Sengers, Paul Dourish, Bill Gaver and, to some extent, myself (Boehner et al. 2005, Boehner et al. 2007, Gaver 2009, Sundström et al. 2007, Höök 2006, Höök 2008, Höök 2009).

3. Finally, there are those who think that singling out emotion from the overall interaction leads us astray. Instead, they see emotion as part of a larger whole of experiences we may design for – we can name the movement Technology as Experience. In a sense, this is what traditional designers and artists have always worked with (see e.g. Dewey 1934) – creating for interesting experiences where some particular emotion is a cementing and congruous force that unites the different parts of the overall system of art piece and viewer/artist. Proponents of this direction are, for example, John McCarthy, Peter Wright, Don Norman and Bill Gaver (McCarthy and Wright, 2004, Norman, 2004, Gaver, 2009).

Let us develop these three directions in some more detail. They have obvious overlaps, and in particular, the Affective Interaction and Technology as Experience movements have many concepts and design aims in common. Still, if we simplify them and describe them as separate movements, it can help us to see the differences in their theoretical underpinnings.

12.2.1 Affective Computing

The artificial intelligence (AI) field picked up the idea that human rational thinking depends on emotional processing. Rosalind Picard’s “Affective Computing” had a major effect on both the AI and HCI fields (Picard, 1997). Her idea, in short, was that it should be possible to create machines that relate to, arise from, or deliberately influence emotion or other affective phenomena. The roots of affective computing really came from neurology, medicine, and psychology. It implements a biologistic perspective on emotion processes in the brain, body, and interaction with others and with machines.

The most discussed and widespread approach in the design of affective computing applications is to construct an individual cognitive model of affect from what is often referred to as “first principles”, that is, the system generates its affective states and corresponding expressions from a set of general principles rather than from a set of hardwired signal-emotion pairs. This model is combined with a model that attempts to recognize the user’s emotional states by measuring the signs and signals related to ongoing emotional processes that we emit in our faces, bodies, voices, and skin, or in what we say. In Figure 12.5 we see, for example, how facial expressions portraying different emotions can be analysed and classified in terms of muscular movements.

Author/Copyright holder: Paul Ekman 1975. Copyright terms and licence: All Rights Reserved. Reproduced with permission. See section "Exceptions" in the copyright terms below.

Figure 12.5: Facial expressions from Ekman portraying anger, fear, disgust, surprise, happiness and sadness

Author/Copyright holder: Greg Maguire. Copyright terms and licence: All Rights Reserved. Reproduced with permission. See section "Exceptions" in the copyright terms below.

Figure 12.5B: Facial muscles moving the eyebrow and muscles around the eye when expressing different emotions
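
To make the recognition side of such a pipeline concrete, the minimal sketch below classifies a small vector of facial muscle-movement intensities into a basic emotion label by finding the nearest prototype. It is only an illustration: the three features and their prototype values are invented for the example, and real systems of the kind referenced in this section use far richer features and trained statistical models rather than hand-set prototypes.

```python
# A minimal sketch of the recognition side of an affective computing pipeline:
# classifying a vector of facial muscle-movement intensities into one of a few
# basic emotion labels. The features (brow lowering, lip-corner pull, eye
# widening) and the prototype values are illustrative assumptions, not Ekman's
# published coding scheme.

import math

PROTOTYPES = {
    # (brow_lowering, lip_corner_pull, eye_widening), each in 0.0-1.0
    "anger":     (0.9, 0.1, 0.3),
    "happiness": (0.1, 0.9, 0.2),
    "surprise":  (0.2, 0.2, 0.9),
    "sadness":   (0.6, 0.0, 0.1),
}

def classify(observation):
    """Return the emotion label whose prototype is closest in Euclidean distance."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROTOTYPES, key=lambda label: distance(PROTOTYPES[label], observation))

# Example: strong brow lowering with little smiling is read as anger.
print(classify((0.8, 0.05, 0.2)))  # -> anger
```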

Emotions, or affects, in users are seen as identifiable states, or at least identifiable processes. Based on the identified emotional state of the user, the aim is to achieve an interaction that is as life-like or human-like as possible, seamlessly adapting to the user’s emotional state and influencing it through the use of various expressions. This can be done by applying rules such as those put forward by Ortony et al. (1988), see Figure 12.6.

Author/Copyright holder: Ortony, Clore and Collins. Copyright terms and licence: From The cognitive structure of emotions (1988). Cambridge University Press. All Rights Reserved. Reproduced with permission. See section "Exceptions" in the copyright terms below.

Figure 12.6: A rule from the OCC-model (Ortony et al., 1988)

This model has its limitations, both in its requirement to simplify human emotion in order to model it, and in the difficulty of inferring end-users’ emotional states by interpreting the signs and signals we emit. That said, it still provides a very interesting way of exploring intelligence, both in machines and in people.
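
As a rough illustration of how an OCC-style rule can be operationalised, the sketch below derives an emotion label from an appraisal of an event’s desirability and prospect status. It loosely follows the structure of the prospect-based branch of the OCC model, but the rule set is heavily simplified and the function and parameter names are assumptions made for this example, not the published model.

```python
# A simplified sketch of OCC-style appraisal rules: an emotion label is
# generated from how an event is appraised relative to the agent's goals,
# rather than from hardwired signal-emotion pairs. This covers only a tiny,
# simplified slice of the Ortony, Clore and Collins (1988) model.

def appraise_event(desirable: bool, prospect: str) -> str:
    """prospect: 'anticipated' for a future event, 'confirmed'/'disconfirmed'
    for an earlier anticipation that did or did not come true, and 'actual'
    for an event that was never anticipated."""
    if prospect == "anticipated":
        return "hope" if desirable else "fear"
    if prospect == "confirmed":
        return "satisfaction" if desirable else "fears-confirmed"
    if prospect == "disconfirmed":
        return "disappointment" if desirable else "relief"
    return "joy" if desirable else "distress"

# Example: a hoped-for, desirable event has now come true.
print(appraise_event(desirable=True, prospect="confirmed"))  # -> satisfaction
```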

Examples of affective computing systems include Rosalind Picard and colleagues’ work on affective learning. It is well known that students’ results can be improved with the right encouragement and support (Kort et al., 2001). They therefore propose an emotion model, built on James A. Russell’s circumplex model of affect, relating phases of learning to emotions, see Figure 12.7. The idea is to build a learning companion that keeps track of what emotional state the student is in and from that decides what help she needs.

Author/Copyright holder: James A. Russell and American Psychological Association. Copyright terms and licence: All Rights Reserved. Reproduced with permission. See section "Exceptions" in the copyright terms below.

Figure 12.7: Russell's circumplex model of affect
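
A minimal sketch of how such a learning companion might act on a circumplex-style representation follows: the student’s state is summarised as a (valence, arousal) point and a tutoring response is chosen per quadrant. The thresholds and responses are assumptions made for illustration, not the actual model proposed by Kort and colleagues (2001).

```python
# A minimal sketch: choose a tutoring intervention from a (valence, arousal)
# point in a circumplex-style space. Values are assumed to lie in [-1, 1];
# the quadrant-to-response mapping is an illustrative assumption only.

def tutor_response(valence: float, arousal: float) -> str:
    if valence >= 0 and arousal >= 0:      # e.g. excited, curious
        return "Offer a harder challenge to sustain engagement."
    if valence >= 0 and arousal < 0:       # e.g. calm, satisfied
        return "Summarise progress and introduce the next topic."
    if valence < 0 and arousal >= 0:       # e.g. frustrated, anxious
        return "Break the task into smaller steps and give a hint."
    return "Encourage a short break and revisit an earlier success."  # bored, discouraged

# Example: a frustrated student (negative valence, high arousal) gets a hint.
print(tutor_response(valence=-0.6, arousal=0.7))
```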

But the most interesting applications from Rosalind Picard’s group deal with important issues such as how to train autistic children to recognise emotional states in others and in themselves and to act accordingly. In a recent spin-off company, named Affectiva, they have put their understanding into commercial use – both for autistic children, and for recognising interest in commercials or dealing with stress in call centres. A sensor bracelet measuring galvanic skin response (GSR) is used in several of their applications, see Figure 12.8.

Author/Copyright holder: Affectiva, Inc. Copyright terms and licence: All Rights Reserved. Reproduced with permission. See section "Exceptions" in the copyright terms below.

Figure 12.8: The bracelet, named Q Sensor, measures skin conductance which in turn is related to emotional arousal - both positive and negative
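
To give a flavour of what can be read from such a sensor, the sketch below smooths a stream of skin-conductance samples and flags moments where conductance rises sharply above its recent baseline, a crude proxy for arousal events. The window size, threshold, and example data are invented for the illustration; a product such as the Q Sensor involves considerably more careful signal processing.

```python
# A simple sketch of detecting arousal-related peaks in galvanic skin response:
# compare each sample against a moving baseline and flag samples that rise a
# fixed amount above it. All constants are illustrative assumptions.

def detect_arousal_events(samples, window=8, threshold=0.3):
    """samples: skin conductance in microsiemens, evenly spaced in time.
    Returns indices where conductance exceeds the moving baseline by `threshold`."""
    events = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if samples[i] - baseline > threshold:
            events.append(i)
    return events

# Example: a flat signal with a brief rise around indices 9-11.
gsr = [2.0, 2.1, 2.0, 2.1, 2.0, 2.1, 2.1, 2.0, 2.1, 2.6, 2.8, 2.7, 2.2, 2.1]
print(detect_arousal_events(gsr))  # -> [9, 10, 11]
```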

Other groups, such as the HUMAINE network in Europe, start from this way of seeing affective interaction.

Author/Copyright holder: Phoebe Sengers, Kirsten Boehner, Simeon Warner, and Tom Jenkin. Copyright terms and licence: All Rights Reserved. Reproduced with permission. See section "Exceptions" in the copyright terms below.

Figure 12.9: Samples of Affector Output

12.2.2 Affective Interaction: The Interactional Approach

An affective interactional view differs from the affective computing approach in that it sees emotions as constructed in interaction, and sees the role of a computer application as supporting people in understanding and experiencing their own emotions (Boehner et al. 2005, Boehner et al. 2007, Höök et al. 2008, Höök 2008). An interactional perspective on design will not aim to detect a singular account of the “right” or “true” emotion of the user and tell them about it, as a prototypical affective computing application would, but rather to make emotional experiences available for reflection. Such a system creates a representation that incorporates people’s everyday experiences and that they can reflect on. Users’ own, richer interpretations ensure a more “true” account of what they are experiencing.

According to Kirsten Boehner and colleagues (2007), the interactional approach to design:

  1. recognizes affect as a social and cultural product

  2. relies on and supports interpretive flexibility

  3. avoids trying to formalize the unformalizable

  4. supports an expanded range of communication acts

  5. focuses on people using systems to experience and understand emotions

  6. focuses on designing systems that stimulate reflection on and awareness of affect

Later, my colleagues and I added two minor modifications to this list (Höök et al., 2008):

  1. Modification of #1: The interactional approach recognizes affect as an embodied social, bodily and cultural product

  2. Modification of #3: The interactional approach is non-reductionist

The first change relates to the bodily aspects of emotional experiences. By explicitly pointing to them, we want to add some of the physical and bodily experiences that interaction with an affective interactive system might entail. We also took a slightly different stance towards design principle number three, “the interactional approach avoids trying to formalize the unformalizable”, in Boehner and colleagues’ list of principles. To avoid reductionist ways of accounting for subjective or aesthetic experiences, Boehner and colleagues aim to protect these concepts by claiming that human experience is unique, interpretative, and ineffable. Such a position risks mystifying human experience, closing it off as ineffable and thereby placing it beyond study and discussion. While I wholeheartedly support the notion of the unity of experience and the idea of letting the magic of people’s lives remain unscathed, I do believe that it is possible to find a middle ground where we can actually speak about qualities of experiences and knowledge of how to design for them without reducing them to something less than the original. This does not in any way mean that the experiential strands, or qualities, are universal and the same for everyone. Instead they are subjective and experienced in their own way by each user (McCarthy and Wright, 2004).

A range of systems has been built to illustrate this approach, such as Affector (Sengers et al., 2005), the VIO (Kaye, 2006), eMoto (Sundström et al., 2009), Affective Diary (Ståhl et al., 2009) and Affective Health (Ferreira et al., 2010) – just to mention a few.

Affector is a distorted video window connecting the neighbouring offices of two friends (and colleagues), see Figure 12.9. A camera located under the video screen captures video as well as 'filter' information such as light levels, colour, and movement. This filter information distorts the captured images of the friends, which are then projected in the window of the neighbouring office. The friends determine amongst themselves what information is used as a filter and what kinds of distortion are applied, in order to convey a sense of each other's mood.
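
The toy sketch below illustrates the general idea in miniature: a sensed 'filter' value (here, an ambient movement level) controls how strongly a captured frame is pixelated before being shown in the other office. The grayscale-frame representation and the block-averaging distortion are assumptions made for illustration, not the actual Affector implementation, in which the friends themselves configured the filters and distortions.

```python
# A toy illustration of Affector's idea: a sensed "filter" value (a movement
# level between 0 and 1) controls how strongly a captured frame is distorted
# before being shown to the friend next door. The grayscale-frame format and
# the block-averaging distortion are illustrative assumptions only.

def pixelate(frame, movement_level, max_block=4):
    """frame: 2D list of grayscale values; higher movement -> coarser blocks."""
    block = 1 + int(movement_level * (max_block - 1))
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cells = [frame[y][x]
                     for y in range(by, min(by + block, h))
                     for x in range(bx, min(bx + block, w))]
            avg = sum(cells) // len(cells)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

# Example: a small synthetic frame, heavily distorted by a high movement level.
frame = [[(x * y) % 256 for x in range(8)] for y in range(8)]
print(pixelate(frame, movement_level=0.9)[0])
```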

eMoto is an extended SMS service for the mobile phone that lets users send text messages between mobile phones, but in addition to text, the messages also have colourful and animated shapes in the background (see examples in Figure 12.11). To choose an expression, you perform a set of gestures using the stylus pen (that comes with some mobile phones), which we had extended with sensors that could pick up on pressure and shaking movements. Users are not limited to any specific set of gestures but are free to adapt their gesturing style to their personal preferences. The pressure and shaking movements can act as a basis for most emotional gestures people make, a basis that allows users to build their own gestures on top of these general characteristics, see Figure 12.10.

Author/Copyright holder: Petra Sundström, Anna Ståhl, and Kristina Höök (images to the left and right) and James A. Russell and American Psychological Association (the image in the middle). Copyright terms and licence: All Rights Reserved. Reproduced with permission. See section "Exceptions" in the copyright terms below.

Figure 12.10: Different physical movements (left) that are reminiscent of the underlying affective experiences in Russell's circumplex model of affect (middle), which are then mapped to a colourful, animated expression (right), also mapped to the circumplex model of affect

Author/Copyright holder: Unknown (pending investigation). Copyright terms and licence: Unknown (pending investigation). See section "Exceptions" in the copyright terms below.

Figure 12.11: eMoto-messages sent to boyfriends in the final study of eMoto. On the left, a high energy expression of love from study participant Agnes to her boyfriend. On the right, Mona uses her favourite green colours to express her love for her boyfriend
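
The sketch below shows one way a gesture-to-expression mapping of this kind could work: sensed pressure and shaking are combined into a point in a circumplex-like space, which then selects colour and animation parameters for the message background. The assignment of pressure and shaking to the two axes, and the specific colour and speed values, are assumptions made for illustration rather than eMoto's actual mapping.

```python
# A sketch of mapping gesture qualities to an expressive background, in the
# spirit of eMoto. How pressure and shaking are assigned to the circumplex
# axes, and the colour/animation choices, are illustrative assumptions only.

def gesture_to_expression(pressure: float, shaking: float):
    """pressure, shaking in [0, 1]. Returns (colour, animation_speed)."""
    arousal = shaking                    # assumed: more shaking = more energy
    valence = 1.0 - pressure             # assumed: harder pressure = more negative
    if arousal > 0.5:
        colour = "red" if valence < 0.5 else "orange"
    else:
        colour = "blue" if valence < 0.5 else "green"
    animation_speed = 0.2 + 0.8 * arousal    # calm expressions animate slowly
    return colour, animation_speed

# Example: a light, energetic gesture yields a warm, fast-moving expression.
print(gesture_to_expression(pressure=0.2, shaking=0.9))  # ('orange', ~0.92)
```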

Affective Diary works as follows: as a person starts her day, she puts on a body sensor armband. During the day, the system collects time-stamped sensor data picking up movement and arousal. At the same time, the system logs various activities on the mobile phone: text messages sent and received, photographs taken, and the Bluetooth presence of other devices nearby. Once the person is back at home, she can transfer the logged data into her Affective Diary. The collected sensor data are presented as somewhat abstract, ambiguously shaped, coloured characters placed along a timeline, see Figure 12.12. To help users reflect on their activities and physical reactions, the user can scribble diary notes onto the diary or manipulate the photographs and other data, see the example from one user in Figure 12.13.

Author/Copyright holder: Unknown (pending investigation). Copyright terms and licence: Unknown (pending investigation). See section "Exceptions" in the copyright terms below.

Figure 12.12: The Affective Diary system. Bio-sensor data are represented by the blobby figures at the bottom of the screen. Mobile data are inserted in the top half of the screen along the same time-line as the blobby characters

Author/Copyright holder: Unknown (pending investigation). Copyright terms and licence: Unknown (pending investigation). See section "Exceptions" in the copyright terms below.

Figure 12.13: A user says about this screendump: “[pointing at the orange character] And then I become like this, here I am kind of, I am kind of both happy and sad in some way and something like that. I like him and then it is so sad that we see each other so little. And then I cannot really show it.”
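
To illustrate the kind of data fusion the Affective Diary performs, the sketch below merges time-stamped sensor readings and phone events onto a single timeline and derives a crude descriptive label per time slot. The field names, the hourly slots, and the movement/arousal thresholds are assumptions made for illustration; the real system renders far richer, deliberately ambiguous shapes.

```python
# A sketch of the Affective Diary's data fusion: time-stamped sensor readings
# (movement, arousal) and phone events (SMS, photos) are merged onto one
# timeline. Field names, hourly slots and the label mapping are illustrative
# assumptions, not the actual system's design.

from collections import defaultdict

sensor_log = [  # (hour_of_day, movement 0-1, arousal 0-1)
    (9, 0.2, 0.3), (13, 0.7, 0.8), (20, 0.1, 0.6),
]
phone_log = [  # (hour_of_day, event)
    (13, "SMS to Anna"), (20, "photo taken"),
]

def build_timeline(sensor_log, phone_log):
    slots = defaultdict(lambda: {"events": []})
    for hour, movement, arousal in sensor_log:
        label = ("energetic" if movement > 0.5 else
                 "still but aroused" if arousal > 0.5 else "calm")
        slots[hour]["character"] = label
    for hour, event in phone_log:
        slots[hour]["events"].append(event)
    return dict(sorted(slots.items()))

# Each hour gets a character label plus any logged phone events.
for hour, slot in build_timeline(sensor_log, phone_log).items():
    print(hour, slot)
```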

As can be seen from these three examples, an interactional approach to design tries to avoid reducing human experience to a set of measurements or inferences made by the system to interpret users’ emotional states. While the interaction with the system should not be awkward, the actual experiences sought might not only be positive ones. eMoto may allow you to express negative feelings about others. Affector may communicate your negative mood. Affective Diary might make negative patterns in your own behaviour painfully visible to you. An interactional approach is interested in the full (infinite) range of human experience possible in the world.

12.2.3 Technology as Experience

While we have so far, in a sense, separated out emotion processes from other aspects of being in the world, there are those who posit that we need to take a holistic approach to understanding emotion. Emotion processes are part of our social ways of being in the world; they dye our dreams, hopes, and experiences of the world. If we aim to design for emotions, we need to place them in the larger picture of experiences, especially if we are going to address aspects of aesthetic experiences in our design processes (Gaver, 2009, McCarthy and Wright, 2004, Hassenzahl, 2008).

John Dewey, for example, distinguishes aesthetic experiences from other aspects of our life by placing them between two extremes on a scale (Dewey, 1934). At one end of that scale, we just drift and experience an unorganized flow of events in everyday life; at the other end, we experience events that do have a clear beginning and end but that are only mechanically connected with one another. Aesthetic experiences exist between those extremes. They have a beginning and an end; they can be uniquely named afterwards, e.g. “when I took those horseback riding lessons with Christian in Cambridge” (Höök, 2010), but in addition, the experience has a unity – there is a single quality that pervades the entire experience (Dewey 1934, pp. 36-57):

An experience has a unity that gives it its name, that meal, that storm, that rupture of a friendship. The existence of this unity is constituted by a single quality that pervades the entire experience in spite of the variation of its constituent parts

In Dewey’s perspective, emotion is (Dewey, 1934 p. 44):

the moving and cementing force. It selects what is congruous and dyes what is selected with its color, thereby giving qualitative unity to materials externally disparate and dissimilar. It thus provides unity in and through the varied parts of an experience

However, emotions are not static but change over time with the experience itself, just as a dramatic experience does (Dewey 1934, p. 43).

Joy, sorrow, hope, fear, anger, curiosity, are treated as if each in itself were a sort of entity that enters full-made upon the scene, an entity that may last a long time or a short time, but whose duration, whose growth and career, is irrelevant to its nature. In fact emotions are qualities, when they are significant, of a complex experience that moves and changes.

While an emotion process is not enough to create an aesthetic experience, emotions will be part of the experience and inseparable from the intellectual and bodily experiences. In such a holistic perspective, it will not make sense to talk about emotion processes as something separate from our embodied experience of being in the world.

Bill Gaver makes the same argument when discussing design for emotion (Gaver 2009). Rather than isolating emotion as if it is something that “can be canned as a tomato in a Campbell tomato soup” (as John Thackara phrased it when he criticised the work by Don Norman on the subject), we need to consider a broader view on interaction design, allowing for individual appropriation. Bill Gaver phrases it clearly when he writes:

Clearly, emotion is a crucial facet of experience. But saying that it is a ‘facet of experience’ suggests both that it is only one part of a more complex whole (the experience) and that it pertains to something beyond itself (an experience of something). It is that something—a chair, the home, the challenges of growing older—which is an appropriate object for design, and emotion is only one of many concerns that must be considered in addressing it. From this point of view, designing for emotion is like designing for blue: it makes a modifier a noun. Imagine being told to design something blue. Blue what? Whale? Sky? Suede shoes? The request seems nonsensical

If we look back at the Affector, eMoto, and Affective Diary systems, we see clearly that they are designed for something other than emotion in isolation. Affector and eMoto are designed for, and used for, communication between people, where emotion is one aspect of their overall communication. And, in fact, Affector turned out not really to be about emotion communication, but instead became a channel for a sympathetic mutual awareness of your friend in the other office.

12.3 Concluding remarks - some directions for the future

It seems obvious that we cannot ignore the importance of emotion processes when designing for experiences. On the other hand, designing as if emotion were a state that can be identified in users taken out of context will not lead to interesting applications in this area. Instead, knowledge of emotion processing needs to be incorporated into our overall design processes.

The work in all three directions of emotion design outlined above contributes, in different ways, to our knowledge of how to make emotion processes an important part of our design processes. The Affective Computing field has given us a range of tools for affective input, such as facial recognition, voice recognition, body posture recognition, and bio-sensor models, as well as tools for affective output, e.g. emotion expression for characters in the interface or the regulation of robot behaviours. The Affective Interaction strand has contributed to an understanding of the socio-cultural aspects of emotion, situating them in their context and making sure that they are not described only as bodily processes beyond our control. The Technology as Experience field has shifted our focus from emotion as an isolated phenomenon towards seeing emotion processes as one of the (important) aspects to consider when designing tools for people.

There are still many unresolved issues in all three directions. In my own view, we have not yet done enough to understand and address the everyday, physical, and bodily experiences of emotion processes (e.g. Sundström et al., 2007, Ståhl et al., 2009, Höök et al., 2008, Ferreira et al., 2008, Ferreira et al., 2010, Sundström et al., 2009, Ferreira and Höök, 2011). Already Charles Darwin made a strong coupling between emotion and bodily movement (Darwin, 1872). Since then, researchers in areas as diverse as neurology (LeDoux 1996, Davidson et al., 2003), philosophy and dance (Sheets-Johnstone, 1999, Laban and Lawrence, 1974), and theatre (Boal, 1992) have described the close coupling between readiness for action, muscular activity, and the co-occurrence of emotion.

I view our actual corporeal bodies as key to being in the world, to creating for experiences, and to learning and knowing, as Sheets-Johnstone has discussed (1999). Our bodies are not instruments or objects through which we communicate information. Communication is embodied - it involves our whole selves. In design, we have had a very limited view of what the body can do for us. Partly this was because the technology was not yet there to involve more senses, movements, and richer modalities. Now, given novel sensing and actuator materials, there are many different kinds of bodily experiences we can envision designing for - mindfulness, affective loops, excitement, slow inwards listening, flow, reflection, or immersion (see e.g. Moen, 2006, Isbister and Höök, 2009, Hummels et al., 2007).

In the recently emerging field of design for somaesthetics (Schiphorst, 2007), interesting aspects of bodily learning processes, leading to stronger body awareness, are picked up and explicitly used in design. This can be contrasted with the main bulk of, for example, commercial sports applications, such as pedometers or pulse meters, where the body is often seen as an instrument or object for the mind, passively receiving signs and signals but not actively being part of producing them. Recently, Purpura and colleagues (2011) made use of a critical design method to pinpoint some of the problems that follow from this view. By describing a fake system, Fit4Life, which measures every aspect of what you eat, they arrive at a system that may whisper into your ear "I'm sorry, Dave, you shouldn't eat that. Dave, you know I don't like it when you eat donuts" just as you are about to grab a donut. This fake system shows how easily we may cross the thin line from persuasion to coercion, creating technological control of our behaviour and bodies.

In my view, by designing applications with an explicit focus on aesthetics, somaesthetics, and empathy with ourselves and others, we can move beyond impoverished interaction modalities and treating our bodies as mere machines that can be trimmed and controlled, towards richer, more meaningful interactions based on our human ways of physically inhabiting our world.

We are just at the beginning of unravelling the many novel design possibilities that open up as we approach emotions and experiences more explicitly in our design processes. This is a rich field of study that I hope will attract many young designers, design researchers, and HCI experts.

12.4 References

Boal, Augusto (1992): Games for Actors and Non-Actors. Routledge

Boehner, Kirsten, dePaula, Rogerio, Dourish, Paul and Sengers, Phoebe (2007): How emotion is made and measured. In International Journal of Human-Computer Studies, 65 (4) pp. 275-291

Boehner, Kirsten, dePaula, Rogerio, Dourish, Paul and Sengers, Phoebe (2005): Affect: from information to interaction. In: Bertelsen, Olav W., Bouvin, Niels Olof, Krogh, Peter Gall and Kyng, Morten (eds.) Proceedings of the 4th Decennial Conference on Critical Computing 2005 August 20-24, 2005, Aarhus, Denmark. pp. 59-68

Cañamero, Lola (2005): Emotion understanding from the perspective of autonomous robots research. In Neural Networks, 18 (4) pp. 445-455

Damasio, Antonio R. (1995): Descartes' Error: Emotion, Reason, and the Human Brain. Harper Perennial

Darwin, Charles (1872): The Expression of the Emotions in Man and Animals. London, UK, John Murray

Davidson, Richard J., Scherer, Klaus R. and Goldsmith, H. Hill (2002b): Handbook of Affective Sciences. Oxford University Press, USA

Davidson, Richard J., Pizzagalli, Diego, Nitschke, Jack B. and Kalin, Ned H. (2002a): Parsing the subcomponents of emotion and disorders of emotion: perspectives from affective neuroscience. In: Davidson, Richard J., Scherer, Klaus R. and Goldsmith, H. Hill (eds.). "Handbook of Affective Sciences". Oxford University Press, USA

dePaula, Rogerio and Dourish, Paul (2005): Cognitive and Cultural Views of Emotions. In: Proceedings of the Human Computer Interaction Consortium Winter Meeting 2005, Douglas, CO, USA.

Dewey, John (1934): Art as Experience. Perigee Trade

Dunbar, Robin (1997): Grooming, Gossip, and the Evolution of Language. Harvard University Press

Ellsworth, Phoebe C. and Scherer, Klaus R. (2003): Appraisal processes in emotion. In: Davidson, Richard J., Sherer, Klaus R. and Goldsmith, H. Hill (eds.). "Handbook of Affective Sciences". Oxford University Press, USA

Ferreira, Pedro and Höök, Kristina (2011): Bodily Orientations around Mobiles: Lessons learnt in Vanuatu. In: Proceedings of the ACM CHI Conference on Human Factors in Computing Systems 7-12 May, 2011, Vancouver, Canada.

Ferreira, Pedro, Sanches, Pedro, Höök, Kristina and Jaensson, Tove (2008): License to chill!: how to empower users to cope with stress. In: Proceedings of the Fifth Nordic Conference on Human-Computer Interaction 2008. pp. 123-132

Gaver, William (2009): Designing for emotion (among other things). In Philosophical Transactions of the Royal Society, 364 (1535) pp. 3597-3604

Grosz, Elizabeth (1994): Volatile Bodies: Toward a Corporeal Feminism (Theories of Representation and Difference). Indiana University Press

Hummels, Caroline, Overbeeke, Kees and Klooster, Sietske (2007): Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction. In Personal and Ubiquitous Computing, 11 (8) pp. 677-690

Hutchins, Edwin (1995): Cognition in the wild. Cambridge, Mass, MIT Press

Höök, Kristina (2009): Affective loop experiences: designing for interactional embodiment. In Philosophical Transactions of the Royal Society, 364 p. 3585–3595

Höök, Kristina (2008): Affective Loop Experiences - What Are They?. In: Oinas-Kukkonen, Harri, Hasle, Per F. V.,Harjumaa, Marja, Segerståhl, Katarina and Øhrstrøm, Peter (eds.) PERSUASIVE 2008 - Persuasive Technology, Third International Conference June 4-6, 2008, Oulu, Finland. pp. 1-12

Höök, Kristina (2006): Designing familiar open surfaces. In: Proceedings of the Fourth Nordic Conference on Human-Computer Interaction 2006. pp. 242-251

Höök, Kristina (2010): Transferring qualities from horseback riding to design. In: Proceedings of the Sixth Nordic Conference on Human-Computer Interaction 2010. pp. 226-235

Höök, Kristina, Ståhl, Anna, Sundström, Petra and Laaksolaahti, Jarmo (2008): Interactional empowerment. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 647-656

Isbister, Katherine and Höök, Kristina (2009): On being supple: in search of rigor without rigidity in meeting new design and evaluation challenges for HCI practitioners. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 2233-2242

Katz, Jack (1999): How Emotions Work. University of Chicago Press

Kaye, Joseph Jofish (2006): I just clicked to say I love you: rich evaluations of minimal communication. In: Olson, Gary M. and Jeffries, Robin (eds.) Extended Abstracts Proceedings of the 2006 Conference on Human Factors in Computing Systems April 22-27, 2006, Montréal, Québec, Canada. pp. 363-368

Kort, Barry, Reilly, Rob and Picard, Rosalind W. (2001): An Affective Model of Interplay between Emotions and Learning: Reengineering Educational Pedagogy - Building a Learning Companion. In: ICALT 2001. pp. 43-48

Laban, Rudolf von and Lawrence, F. C. (1974): Effort: economy in body movement. Plays, inc

LeDoux, Joseph (1996): The Emotional Brain: The mysterious underpinnings of emotional life. Simon and Schuster

Longo, Giuseppe O. (2003): Body and Technology: Continuity or Discontinuity?. In: Fortunati, Leopoldina, Katz, James E. and Riccini, Raimonda (eds.). "Mediating the Human Body: Technology, Communication, and Fashion". Routledge

Lutz, Catherine (1986): Emotion, Thought, and Estrangement: Emotion as a Cultural Category. In Cultural Anthropology, 1 (3) pp. 287-309

Lutz, Catherine A. (1988): Unnatural Emotions: Everyday Sentiments on a Micronesian Atoll and Their Challenge to Western Theory. University of Chicago Press

McCarthy, John and Wright, Peter (2004): Technology as Experience. The MIT Press

Merleau-Ponty, Maurice (1958): Phenomenology of Perception. London, England, Routledge

Moen, Jin (2006). KinAesthetic Movement Interaction : Designing for the Pleasure of Motion (Doctoral Thesis). KTH

Norman, Donald A. (2004): Emotional Design: Why We Love (Or Hate) Everyday Things. Basic Books

Ortony, Andrew, Clore, Gerald L. and Collins, Allan (1988): The Cognitive Structure of Emotions. Cambridge University Press

Parkinson, B. (1996): Emotions are social. In British Journal of Psychology, 87 p. 663–683

Picard, Rosalind W. (1997): Affective Computing. MA, USA, The MIT Press

Purpura, Stephen, Schwanda, Victoria, Williams, Kaiton, Stubler, William and Sengers, Phoebe (2011): Fit4life: the design of a persuasive technology promoting healthy behavior and ideal weight. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 423-432

Russell, James A. (1980): Circumplex Model of Affect. In Journal of Personality and Social Psychology, 39 (6) pp. 1161-1178

Sanches, Pedro, Höök, Kristina, Vaara, Elsa, Weymann, Claus, Bylund, Markus, Ferreira, Pedro, Peira, Nathalie and Sjölinder, Marie (2010): Mind the body!: designing a mobile stress management application encouraging personal reflection. In: Proceedings of DIS10 Designing Interactive Systems 2010. pp. 47-56

Schiphorst, Thecla (2007): Really, really small: the palpability of the invisible. In: Proceedings of the 2007 Conference on Creativity and Cognition 2007, Washington DC, USA. pp. 7-16

Sengers, Phoebe, Boehner, Kirsten, Warner, Simeon and Jenkins, Tom (2005): Evaluating Affector: Co-Interpreting What 'Works'. In: CHI 2005 Workshop on Innovative Approaches to Evaluating Affective Systems 2005.

Sheets-Johnstone, Maxine (2009): The Corporeal Turn: An Interdisciplinary Reader. Imprint Academic

Sheets-Johnstone, M. (1999): Emotion and Movement: A beginning Empirical-Phenomenological Analysis of Their Relationship. In Journal of Consciousness Studies, 6 (11) pp. 259-277

Shusterman, Richard (2008): Body Consciousness: A Philosophy of Mindfulness and Somaesthetics. Cambridge University Press

Ståhl, Anna, Höök, Kristina, Svensson, Martin, Taylor, Alex S. and Combetto, Marco (2009): Experiencing the Affective Diary. In Personal and Ubiquitous Computing, 13 (5) pp. 365-378

Sundström, Petra, Ståhl, Anna and Höök, Kristina (2007): In situ informants exploring an emotional mobile messaging system in their everyday practice. In International Journal of Human-Computer Studies, 65 (4) pp. 388-403

Sundström, Petra, Jaensson, Tove, Höök, Kristina and Pommeranz, Alina (2009): Probing the potential of non-verbal group communication. In: GROUP09 - International Conference on Supporting Group Work 2009. pp. 351-360

12.5 Commentary by Rosalind W. Picard

This was an interesting chapter for me to try to understand and there is a banquet here for discussion, although I only have time to address one of the main dishes.

First, I want to say that I greatly appreciate the work of Kia Höök and others she cites to develop technologies for enhancing people’s awareness of affect and helping people better reflect on and understand emotions, of self and others. I also deeply appreciate the work of designers to address holistic situations and design for people, including their feelings but also never only their feelings. These goals – creating interactions and designs that enhance affective understanding and that respond to the richness of human needs – are truly significant for improving much of what it means to be human. That said, I would like to correct an important misconception. Let me start with a story.

It was 1999 and Joe LeDoux, Antonio Damasio, and I had been invited to give talks on Emotion & Knowledge for the Barcelona Museum. The talks were simultaneously translated into multiple languages, giving me time to speak carefully and slowly in English, relying on the hard work of people more talented than I to translate into Catalan, Spanish, French, and more. It was a great experience overall – meeting fascinating people and engaging deeply in topics that were new and stimulating. But there was one negative part that stands out in my memory. At the reception, a dark, trim, middle-aged man came striding in my direction, red-faced, furrowed brow, gesturing sharply, and having a hard time speaking. I’ve never seen somebody so angry in a museum. I glanced around me, thinking he was targeting somebody nearby who had tried to steal his wife; after all, I was just nibbling on a canapé. But his anger was at me. I swallowed, listened carefully, and gradually came to understand that in the language in which he was hearing my talk translated, he heard me claim something to the effect of “We have built or could now build human emotion into computers.” I was actually extremely careful to NOT say that, but in his mind, I was denying the special feelings and experience we have that accompany human emotion, and reducing the great riches of our emotional experience entirely down to something like a text editor or game app. Listening to him, I realized that my careful choice of words in English, to say precisely what we were doing and what I thought could be done, had been translated inaccurately from my engineering culture to his culture, which was social psychology.

In that reception, I learned, painfully, that what I meant by “modeling” was very different than what he heard when I said that word. I learned it was not enough to just be very careful with my words stating what we’re doing. I needed to also anticipate how people from different fields could misinterpret what I said. I needed to learn to make additional clarifying remarks of what I did not mean.  I should have said not only, “These are some of the mechanisms related to emotion that we are able to implement,” but also, “These are not all of what emotion is.” I should have said not only “by ‘mechanisms of’ I mean ‘Attempts to represent’,” but also, “Representing is not the same as reproducing.” I did not realize he would otherwise be led to the wrong conclusion.
 
Why do I bring up this story? Höök’s article refers to the Affective Computing approach as cognitivistic and reductionist, which is quite similar to the misunderstanding that happened in Barcelona in 1999.

When I speak or write of mechanisms of emotion, or models of emotion, I speak as an engineer trying to represent a complex phenomenon as best we can with tools we have: I do not confuse these representations with emotion itself. I am not a reductionist and Affective Computing is not reductionistic. I do not believe that emotion can be reduced to these representations, nor does Affective Computing claim this. I do not believe that emotion is “nothing but” the mechanisms we identify and build. The mechanisms we implement are not equivalent to the riches of human emotional experience, nor have I ever said that they will be: We have no evidence to make such claims. If people want to believe that emotions are entirely reducible to logical computation and bits, then that belief is based on faith, not science.

While people can write about any concept using information and bits, including emotion, I do not see evidence supporting the view that emotion can be fully reduced to bits and information. When I wrote Affective Computing, I knew many readers would be from AI, and would want to know how emotion might be implemented in machines, and so I described the parts of that process that I could envision. I was also very careful in my wording to not promote that such a method would be sufficient. However, I had not yet encountered the man in Barcelona, so one has to read my words carefully.

Unfortunately, if a person’s views are multi-dimensional, people will try to reduce them to one dimension, and conveniently peg them on one hook or the other. The process is rather like tidying up the foyer by hanging each jacket on whatever hook happens to be available and strong enough to hold it up. Cognitivism is a handy hook, promoting the belief that thought can be fully reduced to rules and algorithms.

Cognitivism was a strong influence for AI pioneers like my friend and colleague, Marvin Minsky, who kept telling me “Emotions are just a special kind of thought”, a sentence I disagreed with him on regularly and once got him to at least compromise on by removing the word “just”. Marvin believed bodies were irrelevant, except during infancy when people needed to be touched, else (studies showed) they withered and died. I have met other pioneers in AI who thought similarly. I am not of their camp, and my writings in Affective Computing talk about the body and about aspects of conscious experience that we haven’t a clue how to implement in information and bits. There are some researchers who work in Affective Computing who hold a cognitivist view, but Affective Computing is not cognitivist.

Yes, Affective Computing includes some models and some researchers whose work might fit on a cognitivist hook, e.g. the cognitive rule-based models like OCC’s could hang on a cognitivist hook for people who believe that approach could fully account for emotion (I don’t). The stochastic signal-representing models of affect in speech or facial expression dynamics might hang on a different hook, and there are other hooks as well. The closet can be better organized than I have taken time to write about, especially as new garments keep arriving.  But don’t confuse the hooks with the house.

For some supportive examples, see “Chapter 1: Emotions are Physical and Cognitive” in Affective Computing (1997), containing some of my earliest writings on the need for a combined body-mind view in emotion research. That clearly does not fit on the cognitivist hook. Similarly, readers might be interested in the emphasis I placed on machines continuously co-creating interactions with people, taking into account not only emotion but also context and more, which resonates with the other areas Kia Höök’s article attempts to delineate (more examples are in “Chapter 8: Affective Wearables”, see sections such as “Out of the lab and into the world”). An affective technology does not have to use a formal AI model of emotion, or use discrete emotion recognition or a pattern classifier, to fall under the area of affective computing.

But enough about organizing. I think the splitting and naming of pieces of a pie – whether it is an “affective computing pie” or some other kind of pie – is not as interesting as another question I see lurking behind the drive of some designers to separate themselves from a more objective engineering approach: Are emotions fully describable or are they ineffable?

In our work we have described emotion computationally and semantically, in numerous ways – discrete, dimensioned, numeric, semantic, as well as by quantifying creative behaviors, facial expressions, signal measurements of physiology and more. In no case do I think that we have “fully captured” human emotion with our models, methods, or descriptions. Something remains undescribed.

Affective computing often (but not always) tries to describe, objectively, more about emotion than has ever been described subjectively. Much of my work has pushed to make concrete, precise, in an engineering sense, measures of things that previously had only been addressed with words, self-report, questionnaires, whether applied to internal feelings or to outwardly observed behaviors. I am bothered by the way all subjective measurement methods are themselves influenced by emotion, and I want something more objective. Objective measures, however, do not imply reductionism any more than subjective measures imply reductionism. Both approaches “reduce” emotion to something – words, numbers, pictures, “blobby objects.” Using a representation is not being reductionistic. Reductionism is when people take an additional leap and say our emotions are “nothing but” what the computer is representing. The latter leap is one I have never promoted (except through mistranslated remarks).

When I closed my conversation with the man in Barcelona, we realized we were both deeply interested in better understanding emotion, and we realized that our perceived differences were actually not differences at all. Efforts to model do not imply a view of reductionism. Working to build representations that imitate some functions of emotions based on rules and categories does not mean cognitivism. Implementation of affective measures in bits does not mean emotion is only information. Affective computing creates tools toward greater goals – toward greater understanding of what makes us human. The man and I exchanged a hearty handshake and a smile before he departed.

I still have a lot to learn about communicating what we are trying to do with emotion – it’s a big topic, and it’s not one that just an engineering approach can conquer. I’m thrilled, as an engineer, to be sharing the journey with people from social psychology, design, neuroscience, AI, as well as many other arts and sciences.  Together we’ll figure out much more than if we set up different camps.

The original definition I gave of affective computing is broader than the one Kia paraphrases: Computing (includes machines, robots, phones, sensors, smart clothing, anything that can do computation) that relates to, arises from, or deliberately influences emotion or other affective phenomena. This was never just about AI or HCI, or about making intelligent machines, although those were the largest communities I was trying to convince to work on emotion at the time.

Perhaps I can be permitted to close, using the opening I wrote in 1997, which still rings true today:

In the course of this work I have come to appreciate all the more our own human needs for emotional development. It is my hope that this direction of research will encourage and enable us in this development – by no longer ignoring human emotions in human-computer interaction, by helping us become more aware of how we communicate, by providing testbeds for theories of emotion in learning and other functions, through animation of emotional characters and playful scenarios with which children can interact, by assisting scientists in collecting affective patterns, by helping advance research on understanding the role of emotion in preventive medicine, and more. It is my hope that affective computers, as tools to help us, will not just be more intelligent machines, but will also be companions in our endeavors to better understand how we are made, and so enhance our own humanity. (Preface to Affective Computing, 1997)
 

12.5 Commentary by Paul Hekkert

Affective computing is an exciting discipline, and Kristina Höök offers us some nice examples of what the field can bring. Wouldn't it be great if intelligent machines could somehow 'sense' what we feel when interacting with them and then adjust their actions accordingly? This is actually what the pioneers of affective computing saw as their challenge, and this is what they have been after:

1. Is it possible to recognize people's emotional responses/states from their behavior, physiological responses, or facial expressions?
2. Can we make the system (e.g. computer, product, mobile device) take this information into account in appropriate responses?

Rosalind Picard will correct me if I am wrong that these were, and still are, the main challenges of the AC discipline.

Is this reductionist? Sure it is: you can only measure a few indicators of people's emotional responses, and each and every indicator (e.g. pressure exerted, skin conductance, heart rate variability, facial muscles) only tells a small part of the story. There are many behavioral, physiological and psychological sides to an emotion, and we simply cannot tap them all. But what is the alternative? We want our measurements to be as non-invasive as possible. If we end up affecting people's behavior, or even changing their emotional states, because of the way we are measuring, the whole purpose is lost. Preferably, we measure users' emotional responses without them being aware of it. The question is what each and every indicator of an emotion actually tells us (its validity) and how accurately – and unobtrusively – we can measure it. A lot of the work in AC has been put into these questions.
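
As a small, concrete example of how much a single indicator "reduces" an emotional episode, consider heart rate variability: one standard summary statistic is RMSSD, the root mean square of successive differences between inter-beat intervals. The minimal Python sketch below uses made-up interval values; it is only meant to show how a rich affective episode collapses into one number.

```python
import numpy as np

def rmssd(ibi_ms):
    """Root mean square of successive differences of inter-beat intervals (ms).

    A widely used heart rate variability summary; higher values are commonly
    read as greater parasympathetic (calming) activity."""
    ibi = np.asarray(ibi_ms, dtype=float)
    diffs = np.diff(ibi)
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical inter-beat intervals (milliseconds) from a short recording.
sample_ibi = [812, 790, 805, 778, 820, 798, 785, 810]
print(f"RMSSD: {rmssd(sample_ibi):.1f} ms")
```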

As an alternative, Kristina Höök proposes the "Interactional Approach", where the system allows users to reflect on their emotional experiences. This, however, does not eliminate the measurement problem, as we can see, for example, in the "Affective Diary". This application registers movement and arousal as indicators of people's emotional state and translates these data into shapes and colors as a form of feedback. If you aim to give people feedback on their feelings, this is the 'appropriate response' you have decided on, and you have moved on to the second challenge of AC: how to respond? And of course, the response you aim for very much depends on the type and function of the system. When I am typing a document in Word – as I am doing now – I do not want the system to give me continuous feedback on my emotional states, nor do I want to see these reflected in the words I type. But I may want the system to recognize that I am in a hurry, or impatient, or stressed, and subtly 'make me' slow down, without me being aware of it. Miguel Bruns Alonso recently explored this idea in the design of a pen that senses implicit behaviors related to restlessness and responds by providing inherent feedback to lower the stress level (Bruns Alonso et al., 2011).

So, is there an alternative to the measurement problem? In another example from Kristina Höök's chapter, she describes eMoto, an extended SMS service that allows people to communicate their feelings in colorful and animated shapes. This type of 'measurement' – making users express their own emotions in words or images – only works when we are dealing with communication devices and is problematic for different reasons. First of all, it is obtrusive and not very advisable if communicating emotions is not the design goal. But there are also validity problems in people's verbal or non-verbal reports of their own emotions. All kinds of social rules, demand characteristics (of the device?), and response styles may interfere with a valid report of your own feelings (see e.g. Mauss and Robinson, 2009 for a review of emotion measures).

And yes, I fully agree with Kristina that the role of our body is relatively unexplored in the AC field and offers a lot of potential, both to the recognition and response challenge. Given recent advances in cognitive science (see e.g. Johnson, 2007), where bodily experiences are increasingly recognized as being at the roots of our thinking and feeling, we may expect more and more studies – like Bruns Alonso's – in which our body is the main mediator. How else can we "grasp" the affective domain?

References

  • Bruns Alonso, M., Hummels, C.C.M., Keyson, D.V., & Hekkert, P. (Conditionally accepted). Measuring and adapting behavior during product interaction to influence affect. Personal and Ubiquitous Computing.
  • Johnson, M. (2007). The meaning of the body: Aesthetics of human understanding. Chicago: The University of Chicago Press.
  • Mauss, I.B. & Robinson, M.D. (2009). Measures of emotion: A review. Cognition and Emotion, 23, 209-237.

12.6 Commentary by Egon L. van den Broek

On the bodily expressions of emotion, be aware: More than a century of research!

When thinking about this commentary, ideas popped up and emotions emerged. What to comment on? Kia Höök has delivered an excellent chapter. She mentions three angles from which to approach emotion in technology (cf. Van den Broek, 2011), namely: affective computing, affective interaction, and technology as experience. In this commentary, I will narrow the focus to affective computing solely. Furthermore, I have also chosen to take a step back and be so bold as to take a methodological perspective with a historical flavor. Why? Well, throughout the years I have discovered more and more literature that touches the core of affective computing but appears to be unknown (e.g., Arnold, 1968; Candland, 1962; Dunbar, 1954). This commentary is founded on two books from a time long before the term affective computing was coined, the 1950s and 1960s. Both books are taken from completely distinct branches of science. Knowledge of science's history can prevent us, both practitioners and scientists, from repeating mistakes. As such, this commentary touches upon the essence of science itself.

Kia Höök provides a concise overview of emotion in technology. She embraces affective interaction instead of affective computing. In contrast, in this commentary, I have taken the affective computing standpoint. Moreover, Kia Höök has taken a design perspective, where this commentary touches upon and questions the foundations of emotions in technology. Lessons had been learned but have since been forgotten (Arnold, 1968; Candland, 1962; Dunbar, 1954). Consequently, affective computing tends to reinvent the wheel, at least to some extent. Yes, this is a bold claim, a very bold claim, but I hope that after reading this commentary you as a reader may share my concerns.

In 1954, five years before her death, Flanders Dunbar delivered the fourth edition of “Emotions and bodily changes: A survey of literature on psychosomatic interrelationships 1910-1953”. With this impressive volume, she provides an exhaustive and structured review of the scientific literature on emotions and bodily changes from (roughly) the first half of the previous century. The volume's title is well chosen and reflects its content nicely, which makes the book undoubtedly valuable for the affective computing community. However, as far as I know, outside my own work (e.g., Van den Broek, 2011), not a single reference is made to this book in any affective computing article, report, or book. I can only hope that I have missed quite a few ...

Flanders Dunbar starts her book (1954) with:

Nearly half a millennium B.C., Socrates came back from army service to report to his Greek countrymen that in one respect the barbarian Thracians were in advance of Greek civilization: They knew that the body could not be cured without the mind. “This,” he continued, “is the reason why the cure of many diseases is unknown to the physicians of Hellas, because they are ignorant of the whole.” It was Hippocrates, the Father of Medicine, who said: “In order to cure the human body it is necessary to have knowledge of the whole of things.” And Paracelsus wrote: “True medicine only arises from the creative knowledge of the last and deepest powers of the whole universe; only he who grasps the innermost nature of man, can cure him in earnest.” To us today this seems rather an impossible demand (p. 3).
 

Where the work of Dunbar illustrates that the origins of affective computing can be traced back to more than a century ago, this quote illustrates that knowledge of the interaction between body and mind was already available more than 25 centuries ago! Let us now identify some core concepts mentioned in the quote from Dunbar (1954) which are crucial for affective computing.

Socrates already noted that “the body could not be cured without the mind” (cf. Kia Höök’s chapter). So, the two are indisputably related and, hence, in principle, the measurement of emotions should be feasible. This is well illustrated by the remark that “the cure of many diseases is unknown to the physicians of Hellas”, as Greek culture was devoted to the body and not to the mind. Recent work has confirmed this relation. For example, when chronic stress is experienced, similar physiological responses emerge as were present during the stressful events from which the stress originates. If such physiological responses persist, they can cause pervasive and structural chemical imbalances in people’s physiological systems, including their autonomic and central nervous systems, their neuroendocrine system, their immune system, and even their brain (Brosschot, 2010). This brings us to the need for “knowledge of the whole of things”, a holistic view, perhaps closely related to what Kia Höök denotes as Technology as Experience. Although the previous enumeration of physiological systems can give the impression that we are close to a holistic model, it should be noted that this is in sharp contrast with the current state of science. For example, a thorough understanding of (chronic) stress is still missing. This can be explained by the complexity of our physiological systems, the continuous interaction of all these systems, and their integral dynamic nature. However, Brosschot (2010) treats emotions as if they can be isolated and attributed to bodily processes only. I firmly agree with Kia Höök that dynamics beyond the body should also be taken into account. Moreover, as Kia Höök also notes, in relation to computing entities, the interaction consists of much more than emotions; the same is true, however, when no computing is involved at all.

Twenty-five centuries ago, scholars did not apply modern statistics; one century ago, however, they did: Fisher, for example, introduced the ANOVA class of statistical models in 1918. This provided the means to test and generalize findings on emotions and bodily changes, and it boosted the development of the behavioral sciences in general (Dunbar, 1954). Moreover, this work fits Rosalind W. Picard’s definition of affective computing: “… a set of ideas on what I call “affective computing,” computing that relates to, arises from, or influences emotions.” (Picard, 1995, p. 1) At least it fits when computing is taken in its traditional sense (i.e., to determine by mathematical means). However, the added value of affective computing would be its engineering component, in particular signal processing and pattern recognition (Van den Broek, 2011). This would enable machines to sense emotions, reason about them, and perhaps even develop them themselves. This would mark a new era of computing.
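
To make the statistical point concrete, a one-way ANOVA of the kind Fisher introduced can be run in a few lines. The Python sketch below tests whether mean skin-conductance levels differ across three experimental conditions; the data are simulated and purely illustrative, not taken from any study cited here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical mean skin-conductance levels (microsiemens) for three
# conditions; invented numbers for illustration only.
relaxed  = rng.normal(2.0, 0.4, size=20)
neutral  = rng.normal(2.3, 0.4, size=20)
stressed = rng.normal(3.1, 0.4, size=20)

# One-way ANOVA: do the condition means differ more than chance would allow?
f_stat, p_value = stats.f_oneway(relaxed, neutral, stressed)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```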

With the invention of computing machinery, shortly after World War II, a new type of statistics was developed: pattern recognition. In his edited volume “Methodologies of Pattern Recognition” (1969), Satosi Watanabe collected a set of papers that were presented, or meant to be presented, at the International Conference on Methodologies of Pattern Recognition in 1968. Watanabe starts his book by defining pattern recognition:

To the layman’s ear, the term pattern recognition sounds like a very narrow esoteric field of electronic computer applications. But, actually, it is a vast and explicit endeavor at mechanization of the most fundamental human function of perception and concept formation (p. vii).
 

Watanabe denotes pattern recognition by computers as the “mechanization of the most fundamental human function of (i) perception and (ii) concept formation.” To this day, human pattern recognition in general remains largely unsolved: we do not understand how we, as humans, process affective signals (Van den Broek, 2011). Moreover, the perception of signals and, subsequently, patterns is one thing; their interpretation in terms of emotions is something completely different. This issue refers to content validity; that is, (i) the agreement among experts on the domain of emotions; (ii) the degree to which a (low-level) percept adequately represents an emotion; and (iii) the degree to which (a set of) percepts adequately represents all aspects of the emotions under investigation.

The issue of concept formation relates to the process of construct validation, which aims to develop a ground truth (or an ontology or semantic network) constructed around the emotions investigated. Such a framework requires theoretically grounded, observable, operational definitions of all constructs and of the relations between them, and it aims to provide a verifiable theoretical framework. The lack of such a network is one of the most pressing problems affective computing is coping with. Kia Höök describes emotions as if we can pinpoint them. Although intuitively this is indeed the case, in practice it proves to be very hard to define emotions (Duffy, 1941; Kleinginna & Kleinginna, 1981).

Humans can, par excellence, recognize patterns in noisy environments, and the ease with which we adapt to new situations and new patterns remains striking. This is in sharp contrast with the performance of signal processing and pattern recognition algorithms: often these perform well in a controlled environment, but in the “real world” their performance deteriorates (Healey, 2008). This problem refers to the influence of the context on measurements, which is also denoted as ecological validity. Due to a lack of real-world research, the ecological validity of research on affective computing is, in general, limited, and its use often still has to be shown in “real world” practice. However, as Kia Höök illustrates, some nice exceptions to this statement have been presented throughout the last decade.
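
As a toy illustration of this lab-to-field gap (not drawn from any of the studies cited here), the Python sketch below trains a simple classifier on clean, simulated "lab" data and evaluates it on "field" data in which context shifts and blurs the signals. All numbers are invented; the point is only that a boundary learned under controlled conditions degrades when the measurement context changes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def simulate(n, context_shift=0.0, noise=0.3):
    """Simulate two features (e.g. skin conductance, heart rate) for
    low- vs. high-arousal episodes; all parameters are invented."""
    y = rng.integers(0, 2, size=n)
    x1 = y * 1.0 + context_shift + rng.normal(0, noise, size=n)
    x2 = y * 0.8 + context_shift + rng.normal(0, noise, size=n)
    return np.column_stack([x1, x2]), y

# "Lab" data: clean and well separated.
X_lab, y_lab = simulate(200, noise=0.3)
# "Real world" data: context (movement, temperature, ...) shifts and blurs the signal.
X_field, y_field = simulate(200, context_shift=0.5, noise=0.9)

clf = LogisticRegression().fit(X_lab, y_lab)
print("lab accuracy:  ", accuracy_score(y_lab, clf.predict(X_lab)))
print("field accuracy:", accuracy_score(y_field, clf.predict(X_field)))
```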

In 1941, Elizabeth Duffy published her article “An explanation of ‘emotional’ phenomena without the use of the concept ‘emotion’”, in which she starts by stating her opinion that “… ‘emotion’, as a scientific concept, is worse than useless. … ‘Emotion’ apparently did not represent a separate and distinguishable condition.” Although this statement is some 70 years old, it is still (or, again) up to date, perhaps even more than ever (cf. Kleinginna & Kleinginna, 1981). Almost fifty years later, in 1990, John T. Cacioppo and Louis G. Tassinary expressed a similar concern; however, they addressed the complexity of psychophysiological relations more generally. These “are conceptualized in terms of their specificity (e.g., one-to-one versus many-to-one) and their generality (e.g., situation or person specific versus cross-situational and pancultural)” (Cacioppo & Tassinary, 1990). They proposed a model which yields four classes of psychophysiological relations: (a) outcomes, (b) concomitants, (c) markers, and (d) invariants. Although Cacioppo and Tassinary (1990) discuss the influence of context, they do not operationalize it; hence, this discussion’s value for affective computing is limited. Nevertheless, articles such as this are food for thought. Regrettably, attempts such as this are rare in the community of affective computing; consequently, the field’s research methods are fragile and a solid theoretical framework is missing (Van den Broek, 2011).

To ensure sufficient advancement, it has been proposed to develop computing entities that respond to their users' physiological responses without interpreting them in terms of emotions or cognitive processes (Tractinsky, 2004). This approach has been shown to be feasible for several areas of application. However, it also undermines the position of affective computing itself as a field of research: it suggests that emotion research has to mature further before affective computing can be brought to practice. This would be an honest conclusion, but a crude one, for the field of affective computing. It implies that affective computing should take a few steps back before making its leap forward. A good starting point for this process would be the hot topics in emotion research that Gross (2010, p. 215) summarized in his article “The future's so bright, I gotta wear shades” (see also Van den Broek, 2011).
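
A minimal sketch of such an entity, one that reacts to raw physiology without ever assigning an emotion label, could look like the Python fragment below. The rolling-baseline rule, the threshold ratio, and the adapt_interface() hook are all hypothetical placeholders, not a description of any system cited above.

```python
from collections import deque

class PhysiologyDrivenAdapter:
    """Adapts the interface when a reading rises well above the wearer's own
    recent baseline; no emotion category is ever inferred."""

    def __init__(self, window=60, ratio=1.5):
        self.baseline = deque(maxlen=window)  # recent readings (personal baseline)
        self.ratio = ratio                    # how far above baseline triggers adaptation

    def update(self, reading):
        if len(self.baseline) == self.baseline.maxlen:
            mean = sum(self.baseline) / len(self.baseline)
            if reading > self.ratio * mean:
                self.adapt_interface()
        self.baseline.append(reading)

    def adapt_interface(self):
        # Placeholder action: e.g. simplify the screen, defer notifications.
        print("adapting interface (no emotion label involved)")

# Usage with invented skin-conductance readings:
adapter = PhysiologyDrivenAdapter(window=5)
for value in [2.0, 2.1, 1.9, 2.0, 2.2, 4.5]:
    adapter.update(value)
```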

Taken together, Kia Höök should be acknowledged for her concise overview of emotion in technology. In her chapter she takes the affective interaction standpoint; in contrast, with this commentary, I have taken an affective computing standpoint. Moreover, Kia Höök has taken a design perspective, where this commentary touches upon the foundations of emotions in technology. I posit that, if anything, affective computing has to learn more about its roots (e.g., Arnold, 1968; Candland, 1962; Dunbar, 1954); then, affective computing can and probably will have a bright future!

References

  • Arnold, M.B. (1968). The nature of emotion: Selected readings. Harmondsworth, Middlesex, England: Penguin Books Ltd.
  • Brosschot, J.F. (2010). Markers of chronic stress: Prolonged physiological activation and (un)conscious perseverative cognition. Neuroscience & Biobehavioral Reviews, 35(1), 46-50.
  • Cacioppo, J.T. and Tassinary, L.G. (1990). Inferring psychological significance from physiological signals. American Psychologist, 45(1), 16-28.
  • Candland, D.K. (1962). Emotion: Bodily change - An enduring problem in psychology, Selected readings. Princeton, NJ, USA: D. van Nostrand Company, Inc.
  • Duffy, E. (1941). An explanation of "emotional" phenomena without the use of the concept “emotion”. Journal of General Psychology, 25, 283-293.
  • Dunbar, F. (1954). Emotions and bodily changes: A survey of literature on psychosomatic interrelationships 1910—1953 (4th ed.). New York, NY, USA: Columbia University Press.
  • Fisher, R.A. (1918). The correlation between relatives on the supposition of Mendelian inheritance. Transactions of the Royal Society of Edinburgh, 52(2), 399-433.
  • Gross, J.J. (2010). The future's so bright, I gotta wear shades. Emotion Review, 2(3), 212-216.
  • Healey, J.A. (2008). Sensing affective experience. In J.H.D.M. Westerink, M. Ouwerkerk, T. Overbeek, W.F. Pasveer, and B. de Ruyter (Eds.), Probing Experience: From Academic Research to Commercial Propositions (Part II: Probing in order to feed back), Chapter 8, p. 91-100. Series: Philips Research Book Series, Vol. 8. Dordrecht, The Netherlands: Springer Science + Business Media B.V.
  • Kleinginna, P.R. and Kleinginna, A.M. (1981). A categorized list of emotion definitions, with a suggestion for a consensual definition. Motivation and Emotion, 5(4), 345-379.
  • Picard, R.W. (1995). Affective Computing. Technical Report No. 321. Perceptual Computing Section, M.I.T. Media Laboratory, Cambridge, MA, USA.
  • Tractinsky, N. (2004). Tools over solutions? Comments on Interacting with Computers special issue on affective computing. Interacting with Computers, 16(4), 751-757.
  • Van den Broek, E.L. (2011). Affective Signal Processing (ASP): Unraveling the mystery of emotions. PhD-thesis, Human Media Interaction (HMI), Faculty of Electrical Engineering, Mathematics, and Computer Science, University of Twente, Enschede, The Netherlands.
  • Watanabe, S. (1969). Methodologies of Pattern Recognition. New York, NY, USA: Academic Press, Inc.

12.7 Commentary by Joyce H. D. M. Westerink

Kristina Höök has given us an inspiring view of three directions of research targeted at the crossroads of technology and affect, namely (traditional) Affective Computing, Affective Interaction, and Technology as Experience. She emphasizes that each line of research has contributed to the development of applications for various types of users, since they are complementary in their approach. I can only underline this conclusion from my experiences in industrial research. A few aspects in particular I'd like to single out for further discussion.

Let me start with an assumption that is contained in this and many other texts and views on Affective Computing, but never stated explicitly, namely that for any viable application in this domain you need a measurement of an emotion-relevant signal. This could be a camera signal, as in Affector, movement signals as in eMoto, or physiological signals as in the Affective Diary. Much of our own effort has also been spent in the pursuit of unobtrusive measurement techniques for emotion-related signals, like our skin conductance wristband (Ouwerkerk, 2011). However, to reach the goal of 'making a machine that deliberately ... influences emotion or other affective phenomena', measurement is not strictly needed. A case in point is any TV-set or MP3 player: we use them all the time to change our mood with music, or to have a TV-show experience that propels us through a series of emotions. That this works is because people are similar in their reactions to a certain extent, and because TV-show directors and music composers are very skilled in creating emotional experiences for the general audience, or for specific target groups.

Nevertheless, in our domain of research everyone tacitly assumes that measurement of emotion-related signals is necessary, and indeed it allows for a further refinement of the affective influencing, especially for changes away from the average of the crowd, in the direction of adaptation to individuals. This means that ultimately, individual models are not only necessary in the Affective Interaction approach, as Kia Höök proposes, but also in the Affective Computing paradigm.

With the emotion-related measurements on board, we also immediately enter the domain of closed-loop applications (see Van Gerven et al., 2009; Van den Broek, 2011): the emotion-related measurements are interpreted in terms of affect, a decision is made about which actions are applicable (based on present and previous measurements), these actions are executed, and then a new measurement is taken to check the new situation, and so on (see Figure 12.1). The closed-loop model basically describes that whenever measurement data are available, they are used to try to achieve a better situation. In this way, one's (affective) state can be guided in a targeted direction. Our Affective Music Player (Janssen et al., 2011; Van der Zwaag et al., 2009), constructed in the best Affective Computing tradition, can serve as an example: it measures my personal reactions to music, and uses this information to adapt the playlist to direct me (not others) to a certain chosen target mood. All in all, I conclude that emotion-related measurements, individual models, and closed-loop applications are tightly interlinked in any research line in our domain.

Figure 12.1: Emotional Closed Loop
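
The loop in Figure 12.1 (measure, interpret, decide, act, then measure again) can be sketched in a few lines of code. The sketch below is not the Affective Music Player itself: the sensor read, the per-listener song "effects", and the song library are invented stand-ins, shown only to make the loop structure concrete.

```python
import random

def read_arousal():
    """Hypothetical sensor reading; in practice this could come from e.g. a
    skin-conductance wristband (here scaled to 0..1)."""
    return random.uniform(0.0, 1.0)

def choose_song(current, target, library):
    """Decide: pick the song whose assumed per-listener effect best moves
    arousal from the current level toward the target."""
    desired_change = target - current
    return min(library, key=lambda song: abs(song["effect"] - desired_change))

def closed_loop(target, library, steps=5):
    for _ in range(steps):
        measured = read_arousal()                      # measure
        song = choose_song(measured, target, library)  # interpret + decide
        print(f"arousal={measured:.2f} -> playing {song['title']}")  # act
        # ... in a real system we would now wait, then measure again.

# Invented per-listener song effects on arousal; not real data.
library = [
    {"title": "calm piano", "effect": -0.3},
    {"title": "upbeat pop", "effect": +0.3},
    {"title": "ambient",    "effect": -0.1},
]
closed_loop(target=0.4, library=library)
```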

The affective closed loop in Figure 12.1 reserves a substantial part for interpretation of the emotion-related signal. This interpretation can be done by a human, as Kia Höök advocates along the lines of the Affective Interaction paradigm, and this human can either be the person that is measured (e.g. Affective Diary) or someone else (e.g. Affector). In both cases, the measurement information will be used to reflect on the situation measured and, if needed, to take action to change it (making it a closed loop indeed). If the raw emotion-related signals are presented, we do not run the risk of losing information that is of value to the user, that is true. But on the other hand, this information might also be overwhelming (at least at first), and a user could benefit from help in the form of an interpretation made by an algorithm (in the Affective Computing tradition). There is no need to choose between the two alternatives; we could think of implementing both. For instance, our electronic wristband does show the raw skin conductance/arousal patterns over the course of a day or week, but we can also give the user a discreet buzz (vibration alarm) whenever an algorithm interprets that tension has risen considerably.

Of course, Kia Höök points out that it is difficult to make the correct interpretation as context varies in many applications, and this is underlined by the fact that much of the research effort in affective computing has gone into algorithms deriving affective states from emotion-related signals. Nevertheless, there are options to try to overcome this. One technological approach is to add additional sensors to monitor the context, like the accelerometer in our wristband that helps us estimate the activity level of the wearer and, with that, interpret the skin conductance signal. Another way out is to average over multiple measurements in varying circumstances to distil an overall effect. This is, for instance, done in our Affective Music Player, where the mood impact of a single song is modeled by taking the average affective effect (corrected for the Law of Initial Values) of multiple presentations, and this has proven to be good enough to select songs capable of directing one's mood to a certain state. Moreover, neither the raw emotion-related signal nor its interpretation is presented to the user of our Affective Music Player: (s)he doesn't want to be bothered and only experiences being brought into a different mood. Concluding, we find that both human and algorithmic interpretation of emotion-related signals are important ingredients of future applications, and both are capable of dealing with context to some extent.
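
As a rough illustration of these two ideas, an algorithmic interpretation that triggers a discreet buzz, and a context sensor that keeps movement-related arousal from being mistaken for tension, the following Python sketch uses invented thresholds and is not the actual wristband logic.

```python
def interpret(skin_conductance, activity_level,
              sc_threshold=4.0, activity_threshold=0.6):
    """Return True when conductance is high while the wearer is *not* very
    active, i.e. the rise is unlikely to be explained by movement alone.
    Thresholds are hypothetical."""
    return skin_conductance > sc_threshold and activity_level < activity_threshold

def maybe_buzz(sc, act):
    if interpret(sc, act):
        print("buzz: tension seems to have risen")
    else:
        print("no alert")

maybe_buzz(sc=4.5, act=0.2)   # high conductance while sitting still -> alert
maybe_buzz(sc=4.5, act=0.9)   # high conductance while exercising -> no alert
```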

Kia Höök argues that in normal life, emotions are always part of a larger experience, and that it is this larger experience that we need to support with our affective technology, in line with the Technology-as-Experience direction of research. This will certainly broaden the field of applications to include related fields in which emotions play a role. For example, emotions are important in communication and in building up relationships, and it is foreseeable that affective technologies can help here (Janssen et al., 2010). This relates to the 'decide on actions' part of the closed loop in Figure 12.1: what do we want to do with the information gained? Nevertheless, the broadness of possible goals does not preclude that there are also applications whose goal is to impact affect itself. A case in point is the Affective Music Player described before, which is exactly intended to direct affect, namely mood. We have also shown (Van der Zwaag et al., 2011) that optimal, individually selected music can indeed help to prevent the emotion (or affective state) of anger in the frustrating traffic situations Kia Höök describes. On the other hand, I am not so sure whether consumers are interested in knowing or influencing their emotions. Despite the abundance of emotion-overloaded reality shows on TV, and despite the fact that emotion as a research topic has become fashionable in recent years, the general public still maintains a 'nice for others, not for me' attitude. In my view, this is related to the emotion/female versus rationality/male distinction Kia Höök mentions: the average male continues to see emotions as a female sign of weakness, of which he does not want to be reminded, not even if our measurement technology gives it a more masculine twist. For females, it is the other way round: they do feel (more) comfortable with mood and emotions and acknowledge their impact on our everyday life, but they are less inclined to deploy masculine technology to alter them. For this reason also, I agree with Kia Höök that our affective technologies are most likely to be used in applications that target a broader experience than that of affect alone.

To wrap up, let me highlight what I think is the most important message in Kia Höök's story: that affective technologies will benefit from individual models (not only for human, but also for algorithmic interpretation of the emotion-related signals measured), and that they can be deployed in a wide range of applications extending far beyond the original domain of measuring and influencing affect. I am looking forward to seeing them appear in products and applications in the world around us.

References

  • Janssen, Bailenson, IJsselsteijn, & Westerink (2010), Intimate heartbeats: Opportunities for affective communication technology. IEEE Transactions on Affective Computing, Vol. 1, No 2, 72-80.
  • Janssen, Van den Broek, Westerink, Tune in to your emotions: A robust personalized affective music player, User Modeling and User-Adaptive Interaction, In Press.
  • Ouwerkerk (2011), Unobtrusive Emotions Sensing in Daily Life, in: Sensing Emotions. The Impact of Context on Experience Measurements (2011), Westerink, Krans, Ouwerkerk (eds.). Philips Research Book Series, volume 12, Springer, Dordrecht, The Netherlands, pages 21-39.
  • Van den Broek (2011), Affective Signal Processing, Unraveling the mystery of emotions, Ph.D. Thesis University of Twente.
  • Van der Zwaag, Westerink, Van den Broek (2009), Deploying music characteristics for an affective music player, Proceedings VOLUME I, 2009 International Conference on Affective Computing & Intelligent Interaction, ACII 2009, September 10-12, 2009, Amsterdam, The Netherlands, pages 459-465.
  • Van der Zwaag, Fairclough, Spiridon, Westerink (2011). The impact of music on affect during anger inducing drives. In S. Dmello et al. (Eds.), 4th international conference on affective computing and intelligent interaction. Part I, LNCS 6974 (pp. 407-416). Memphis, Tennessee, USA: Springer, Heidelberg.
  • Van Gerven, Farquhar, Schaefer, Vlek, Geuze, Nijholt, Ramsey, Haselager, Vuurpijl, Gielen and Desain (2009). The brain–computer interface cycle. Journal of Neural Engineering, Vol.6, No.4, 1-10.