Science arose from poetry; when times change, the two can meet again on a higher level as friends.
-- Johann Goethe
Publications by Robert Christopherson (bibliography)
Bernays, Ryan, Mone, Jeremy, Yau, Patty, Murcia, Michael, Gonzalez-Sanchez, Javier, Chavez-Echeagaray, Maria Elena, Christopherson, Robert and Atkinson, Robert (2012): Lost in the dark: emotion adaption. In: Adjunct Proceedings of the 2012 ACM Symposium on User Interface Software and Technology. pp. 79-80.
Environments that can adapt to the user have been sought in recent years, particularly in the area of human-computer interfaces. Environments able to recognize the user's emotions and react accordingly have been of interest in the area of affective computing. This work presents an adaptable 3D video game, Lost in the Dark: Emotion Adaption, which uses the user's emotions as input to alter and adjust the gaming environment. To achieve this, an interface capable of reading brain waves, facial expressions, and head motion was used: an Emotiv® EPOC headset. We fed emotions such as meditation, excitement, and engagement into the game, altering the lighting, music, gates, colors, and other elements to appeal to the user's emotional state. With this, we close the loop: emotions serve as inputs, the system adjusts accordingly, and the adjustments in turn elicit emotions.
© All rights reserved Bernays et al. and/or ACM Press
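The affect-adaptation loop described in the abstract above can be sketched as a mapping from emotion readings to environment parameters. This is an illustrative assumption, not the authors' code: the emotion names (meditation, excitement, engagement) come from the abstract, while the parameter names, value ranges, and mapping rules are invented for illustration.

```python
# Hypothetical sketch of an affect-adaptation loop: normalized emotion
# readings (0.0-1.0) drive game-environment parameters. The mappings
# below are illustrative assumptions, not the game's actual rules.

def adapt_environment(affect):
    """Map normalized affect readings to game parameters."""
    excitement = affect.get("excitement", 0.0)
    meditation = affect.get("meditation", 0.0)
    engagement = affect.get("engagement", 0.0)

    return {
        # Calmer (more meditative) players get brighter scenes.
        "light_level": round(0.3 + 0.7 * meditation, 2),
        # Music tempo scales with excitement.
        "music_tempo_bpm": int(80 + 60 * excitement),
        # Gates open only for sufficiently engaged players.
        "gate_open": engagement > 0.5,
    }

state = adapt_environment(
    {"excitement": 0.5, "meditation": 1.0, "engagement": 0.8}
)
```

In a running game, such a function would be called each time the headset delivers a new affect sample, closing the input-adjust-elicit loop the abstract describes.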
Cooper, David G., Arroyo, Ivon, Woolf, Beverly Park, Muldner, Kasia, Burleson, Winslow and Christopherson, Robert (2009): Sensors Model Student Self Concept in the Classroom. In: Proceedings of the 2009 Conference on User Modeling, Adaptation and Personalization. pp. 30-41.
In this paper we explore findings from three experiments that use minimally invasive sensors with a web-based geometry tutor to create a user model. Minimally invasive sensor technology is mature enough to equip classrooms of up to 25 students with four sensors each while they use a computer-based intelligent tutoring system. The sensors, which are on each student's chair, mouse, monitor, and wrist, provide data about posture, movement, grip tension, arousal, and facially expressed mental states. This data may provide adaptive feedback to an intelligent tutoring system based on an individual student's affective states. The experiments show that when sensor data supplements a user model based on tutor logs, the model reflects a larger percentage of the students' self-concept than a user model based on the tutor logs alone. The models are further expanded to classify four ranges of emotional self-concept, including frustration, interest, confidence, and excitement, with over 78% accuracy. The emotional predictions are a first step toward intelligent tutoring systems creating sensor-based personalized feedback for each student in a classroom environment. Bringing sensors to our children's schools addresses real problems in students' relationship to mathematics as they learn the subject.
© All rights reserved Cooper et al. and/or their publisher
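The supplementing step the abstract describes, adding sensor readings to tutor-log features before classification, can be sketched as simple feature concatenation. The specific feature and channel names below are illustrative assumptions; the paper names the four sensor placements (chair, mouse, monitor, wrist) but not this representation.

```python
# Illustrative sketch, not the authors' implementation: a tutor-log
# feature vector supplemented with one reading per sensor channel.
# Feature names are assumptions made for this example.

def combined_features(tutor_log, sensors):
    """Concatenate tutor-log features with sensor-channel readings."""
    features = [
        tutor_log.get("hints_requested", 0),
        tutor_log.get("seconds_on_problem", 0.0),
    ]
    # One reading per channel: posture (chair), grip tension (mouse),
    # facially expressed mental state (camera), arousal (wrist).
    for channel in ("posture", "grip_tension", "facial_state", "arousal"):
        features.append(sensors.get(channel, 0.0))
    return features

vec = combined_features(
    {"hints_requested": 2, "seconds_on_problem": 45.0},
    {"posture": 0.7, "grip_tension": 0.4,
     "facial_state": 0.9, "arousal": 0.6},
)
```

A classifier trained on the longer vector can then be compared against one trained on the tutor-log features alone, which is the contrast the experiments report on.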
Muldner, Kasia, Christopherson, Robert, Atkinson, Robert and Burleson, Winslow (2009): Investigating the Utility of Eye-Tracking Information on Affect and Reasoning for User Modeling. In: Proceedings of the 2009 Conference on User Modeling, Adaptation and Personalization. pp. 138-149.
We investigate the utility of an eye tracker for providing information on users' affect and reasoning. To do so, we conducted a user study, results from which show that users' pupillary responses differ significantly between positive and negative affective states. As far as reasoning is concerned, while our analysis shows that larger pupil size is associated with more constructive reasoning events, it also suggests that to disambiguate between different kinds of reasoning, additional information may be needed. Our results show that pupillary response is a promising non-invasive avenue for increasing user model bandwidth.
© All rights reserved Muldner et al. and/or their publisher
Changes to this page (author):
23 Nov 2012: Modified
05 Apr 2012: Added
05 Apr 2012: Modified
Page maintainer: The Editorial Team