Number of co-authors: 32
Publications with 3 favourite co-authors: Roope Raisamo (11), Toni Vanhala (10), Jani Lylykangas (8)
Veikko Surakka's 3 most productive colleagues by number of publications: Roope Raisamo (52), Kari-Jouko Räihä (20), Oleg Spakov (14)
Publications by Veikko Surakka (bibliography)
Gizatdinova, Yulia, Spakov, Oleg and Surakka, Veikko (2012): Comparison of video-based pointing and selection techniques for hands-free text entry. In: Proceedings of the 2012 International Conference on Advanced Visual Interfaces 2012. pp. 132-139.
Video-based human-computer interaction has received increasing interest over the years. However, earlier research has mainly focused on the technical characteristics of different methods rather than on user performance and experiences with computer vision technology. This study investigates the performance of novice users and their subjective experiences when typing text with several video-based pointing and selection techniques. In Experiment 1, eye tracking and head tracking were applied to the task of pointing at the keys of a virtual keyboard. The results showed that gaze pointing was a significantly faster but also more error-prone technique than head pointing. Self-reported subjective ratings revealed that it was generally better, faster, more pleasant, and more efficient to type using gaze pointing than head pointing. In Experiment 2, mouth-open and brows-up facial gestures were used to confirm the selection of a given character. The results showed that text entry speed was approximately the same for both selection techniques, while mouth interaction caused significantly fewer errors than brow interaction. Subjective ratings did not reveal any significant differences between the techniques. Possibilities for design improvements are discussed.
© All rights reserved Gizatdinova et al. and/or ACM Press
Rantanen, Ville, Verho, Jarmo, Lekkala, Jukka, Tuisku, Outi, Surakka, Veikko and Vanhala, Toni (2012): The effect of clicking by smiling on the accuracy of head-mounted gaze tracking. In: Proceedings of the 2012 Symposium on Eye Tracking Research & Applications 2012. pp. 345-348.
The effect of facial behaviour on gaze tracking accuracy was studied with a prototype system that integrated head-mounted, video-based gaze tracking and capacitive facial movement detection for pointing at and selecting objects, respectively, in a simple graphical user interface. Experiments were carried out to determine how voluntary smiling movements, used to indicate clicks, affect the accuracy of gaze tracking through the combination of user eye movement behaviour and the operation of the gaze tracking algorithms. The results showed no observable degradation of gaze tracking accuracy when voluntary smiling was used for object selection.
© All rights reserved Rantanen et al. and/or ACM Press
Lylykangas, Jani, Surakka, Veikko, Salminen, Katri, Raisamo, Jukka, Laitinen, Pauli, Rönning, Kasper and Raisamo, Roope (2011): Designing tactile feedback for piezo buttons. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 3281-3284.
The present aim was to study preferences for tactile feedback stimulations given by non-physical (i.e., solid) piezo-actuated buttons. Participants (n=16) ranked 16 different tactile feedback stimuli varied by 4 output delays and 4 vibration durations. The results showed that the mean ranks of the stimuli differed significantly from each other. The timing parameters of delay and duration interacted with each other: for example, the preference for a given vibration duration fluctuated in response to different output delays. Within a very short time window (i.e., 10-453 ms) combining the delay and duration parameters, the feedback could produce either a favorable or a significantly less favorable subjective experience. The results suggest that a preferred perception of tactile feedback from non-physical buttons requires careful design and control of the timing parameters.
© All rights reserved Lylykangas et al. and/or their publisher
Salminen, Katri, Surakka, Veikko, Raisamo, Jukka, Lylykangas, Jani, Pystynen, Johannes, Raisamo, Roope, Mäkelä, Kalle and Ahmaniemi, Teemu (2011): Emotional responses to thermal stimuli. In: Proceedings of the 2011 International Conference on Multimodal Interfaces 2011. pp. 193-196.
The present aim was to study whether thermal stimuli presented to the palm can affect emotional responses, as measured with emotion-related subjective rating scales and changes in skin conductance response (SCR). Two target temperatures, cold and warm, were created by either decreasing or increasing the temperature of the stimulus by 4 °C with respect to the participant's current hand temperature. Both cold and warm stimuli were presented using two presentation methods, i.e., dynamic and pre-adjusted. The results showed that both the dynamic and pre-adjusted warm stimuli elevated the ratings of arousal and dominance. In addition, the pre-adjusted warm and cold stimuli elevated the SCR. The results suggest that pre-adjusted warm stimuli in particular can be seen as effective in activating the autonomic nervous system and the arousal and dominance dimensions of the affective rating space.
© All rights reserved Salminen et al. and/or ACM Press
Heikkinen, Jani, Rantala, Jussi, Olsson, Thomas, Raisamo, Roope and Surakka, Veikko (2011): Exploring the effects of cumulative contextual cues on interpreting vibrotactile messages. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 1-10.
The sense of touch has been shown to convey emotive information and nuances in face-to-face interpersonal communication, but its applications in mobile communication technologies are still limited. One of the challenges for such a new communication medium is the interpretation of tactile messages. This paper presents a study with an early prototype of a mobile tactile device. Twenty novice participants interpreted four messages consisting of a four-channel vibrotactile stimulus, complemented with three cumulative textual cues regarding 1) the communication setting, 2) the sender, and 3) the situation. The subjective interpretations were assessed with four semantic differential scales, and the reasoning behind the interpretations was explored through interviews. The findings show that the intensity and, to some degree, the friendliness of the message could be identified from the tactile-only message. However, to correctly interpret the degree of formality or emotionality in the message, contextual cues were also needed.
© All rights reserved Heikkinen et al. and/or ACM Press
Pakkanen, Toni, Raisamo, Roope, Salminen, Katri and Surakka, Veikko (2010): Haptic numbers: three haptic representation models for numbers on a touch screen phone. In: Proceedings of the 2010 International Conference on Multimodal Interfaces 2010. p. 35.
Systematic research on haptic stimuli is needed to create a viable haptic feel for user interface elements. There has been a lot of research with haptic user interface prototypes, but much less with haptic stimulus design. In this study we compared three haptic representation models at two representation rates for the numbers used in the phone number keypad layout. Haptic representations for the numbers were derived from Arabic numerals, Roman numerals, and the location of the number button in the layout grid. Using a Nokia 5800 XpressMusic phone, participants entered phone numbers without looking at the device. Speed, error rate, and subjective experiences were recorded. The results showed that the model had no effect on the measured performance, but subjective experiences were affected. The Arabic numerals at the slower rate were preferred most. Thus, subjectively the performance was rated as better, even though objective measures showed no differences.
© All rights reserved Pakkanen et al. and/or ACM Press
Vanhala, Toni, Surakka, Veikko, Siirtola, Harri, Räihä, Kari-Jouko, Morel, Benoît and Ach, Laurent (2010): Virtual proximity and facial expressions of computer agents regulate human emotions and attention. In Journal of Visualization and Computer Animation, 21 (3) pp. 215-224.
Lylykangas, Jani, Surakka, Veikko, Rantala, Jussi, Raisamo, Jukka, Raisamo, Roope and Tuulari, Esa (2009): Vibrotactile information for intuitive speed regulation. In: Proceedings of the HCI09 Conference on People and Computers XXIII 2009. pp. 112-119.
The present aim was to investigate whether controlled vibrotactile stimulation can be used to inform users on how to regulate their behavior. 36 stimuli, varied by frequency modulation (i.e., ascending, constant, and descending), duration (i.e., 500, 1750, and 3000 ms), waveform (i.e., sine and sawtooth), and body location (i.e., wrist and chest), were presented to 12 participants. The participants evaluated, without any training, the meaning of each presented stimulus using three response options: 'accelerate your speed', 'keep your speed constant', and 'decelerate your speed'. Participants also rated how emotionally pleasant and arousing the different stimulations were. The results showed that the stimuli were predominantly perceived analogously with the vibration frequency modulation. The best stimuli represented 'accelerate your speed', 'keep your speed constant', and 'decelerate your speed' information in …
© All rights reserved Lylykangas et al. and/or their publisher
Heikkinen, Jani, Rantala, Jussi, Olsson, Thomas, Raisamo, Roope, Lylykangas, Jani, Raisamo, Jukka, Surakka, Veikko and Ahmaniemi, Teemu Tuomas (2009): Enhancing personal communication with spatial haptics: Two scenario-based experiments on gestural interaction. In J. Vis. Lang. Comput., 20 (5) pp. 287-304.
Raisamo, Jukka, Raisamo, Roope and Surakka, Veikko (2009): Evaluating the effect of temporal parameters for vibrotactile saltatory patterns. In: Proceedings of the 2009 International Conference on Multimodal Interfaces 2009. pp. 319-326.
Cutaneous saltation provides interesting possibilities for applications. An illusion of vibrotactile mediolateral movement was elicited on the left dorsal forearm to investigate emotional (i.e., pleasantness) and cognitive (i.e., continuity) experiences of vibrotactile stimulation. Twelve participants were presented with nine saltatory stimuli delivered through a linearly aligned row of three vibrotactile actuators separated by 70 mm. The stimuli were composed of three temporal parameters of 12, 24, and 48 ms for both burst duration and inter-burst interval, forming all nine possible uniform pairs. First, the stimuli were ranked by the participants using a special three-step procedure. Second, the participants rated the stimuli on two nine-point bipolar scales measuring the pleasantness and continuity of each stimulus, separately. The results showed that especially the interval between two successive bursts was a significant factor for saltation. Moreover, the temporal parameters seemed to affect the experienced continuity of the stimuli more than their pleasantness. These findings encourage further study of saltation and the effect of its parameters on subjective experience.
© All rights reserved Raisamo et al. and/or their publisher
Salminen, Katri, Surakka, Veikko, Lylykangas, Jani, Raisamo, Jukka, Saarinen, Rami, Raisamo, Roope, Rantala, Jussi and Evreinov, Grigori (2008): Emotional and behavioral responses to haptic stimulation. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 1555-1562.
A prototype of a friction-based, horizontally rotating fingertip stimulator was used to investigate emotional experiences and behavioral responses to haptic stimulation. The rotation style of 12 different stimuli was varied by burst length (i.e., 20, 50, 100 ms), continuity (i.e., continuous and discontinuous), and direction (i.e., forward and backward). Using these stimuli, 528 stimulus pairs were presented to 12 subjects, who were to distinguish whether the stimuli in each pair were the same or different. They then rated the stimuli on four scales measuring the pleasantness, arousal, approachability, and dominance qualities of the 12 stimuli. The results showed that continuous forward-backward rotating stimuli were rated as significantly more unpleasant, arousing, avoidable, and dominating than other types of stimulation (e.g., discontinuous forward rotation). Reaction times to these stimuli were significantly faster than reaction times to discontinuous forward- and backward-rotating stimuli. The results clearly suggest that even simple haptic stimulation can carry emotional information and can be utilized when applying haptics in human-technology interaction.
© All rights reserved Salminen et al. and/or ACM Press
Vanhala, Toni and Surakka, Veikko (2008): Computer-assisted regulation of emotional and social processes. In: Or, Jimmy (ed.). "Affective Computing: Focus on Emotion Expression, Synthesis, and Recognition". Vienna, Austria: I-Tech Education and Publishing. pp. 405-420.
The current aim was to design a model for computer systems that support the regulation of emotion-related processes during exposure to provoking stimuli. First, we reviewed previous studies of automated recognition of emotions and virtual stimulation as a basis for perceptual intelligence and proactive reasoning about emotional responses. Then, we designed a model for computer-assisted emotion regulation that supports the treatment of emotional disorders using exposure to provoking computer-generated stimuli. The model was shown to address the challenges of computer-assisted therapy, for example, real-time recognition of and corresponding adaptation to emotional responses. The model can be used to support emotion regulation and to facilitate the quality of human-technology interaction in general.
© All rights reserved Vanhala and Surakka and/or I-Tech Education and Publishing
Vanhala, Toni, Surakka, Veikko and Anttonen, Jenni (2008): Measuring bodily responses to virtual faces with a pressure sensitive chair. In: Proceedings of the Fifth Nordic Conference on Human-Computer Interaction 2008. pp. 555-559.
The present aim was to study emotion-related body movement responses using an unobtrusive measurement chair embedded with electromechanical film (EMFi) sensors. 30 participants viewed images of a male and a female computer agent while the magnitude and direction of body movements were measured. The facial expressions (i.e., frowning, neutral, smiling) and the size of the agents were varied. The results showed that participants leaned statistically significantly longer towards the agent when it displayed a frowning or a smiling expression as compared to a neutral expression. Also, their body movements were reduced while viewing the agents. The results suggest that the EMFi chair is a promising tool for detecting human activity related to social and emotional behaviour. In particular, the EMFi chair may support unobtrusive measurement of bodily responses in less strictly controlled contexts of human-computer interaction.
© All rights reserved Vanhala et al. and/or their publisher
Jokiniemi, Maria, Raisamo, Roope, Lylykangas, Jani and Surakka, Veikko (2008): Crossmodal Rhythm Perception. In: Pirhonen, Antti and Brewster, Stephen A. (eds.) HAID 2008 - Haptic and Audio Interaction Design - Third International Workshop September 15-16, 2008, Jyväskylä, Finland. pp. 111-119.
Siirtola, Harri, Räihä, Kari-Jouko, Surakka, Veikko and Vanhala, Toni (2008): Flexible Method for Producing Static Visualizations of Log Data. In: IV 2008 - 12th International Conference on Information Visualisation 8-11 July, 2008, London, UK. pp. 127-132.
Gizatdinova, Yulia and Surakka, Veikko (2008): Effect of Facial Expressions on Feature-Based Landmark Localization in Static Grey Scale Images. In: Ranchordas, Alpesh and Araújo, Helder (eds.) VISAPP 2008 - Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1 January 22-25, 2008, Funchal, Portugal. pp. 259-266.
Pakkanen, Toni, Lylykangas, Jani, Raisamo, Jukka, Raisamo, Roope, Salminen, Katri, Rantala, Jussi and Surakka, Veikko (2008): Perception of low-amplitude haptic stimuli when biking. In: Digalakis, Vassilios, Potamianos, Alexandros, Turk, Matthew, Pieraccini, Roberto and Ivanov, Yuri (eds.) Proceedings of the 10th International Conference on Multimodal Interfaces - ICMI 2008 October 20-22, 2008, Chania, Crete, Greece. pp. 281-284.
Haptic stimulation in motion has received little prior study. To provide guidance for designing haptic interfaces for mobile use, we carried out an initial experiment using C-2 actuators. 16 participants took part in the experiment, which examined whether there is a difference in perceiving low-amplitude vibrotactile stimuli under minimal versus moderate physical exertion. A stationary bike was used to control the exertion. Four body locations (wrist, leg, chest, and back), two stimulus durations (1000 ms and 2000 ms), and two motion conditions on the stationary bicycle (still and moderate pedaling) were applied. Cycling was found to have a significant effect on both the perception accuracy and the reaction times for selected stimuli. The stimulus amplitudes used in this experiment can help inform haptic design for mobile users.
© All rights reserved Pakkanen et al. and/or their publisher
Vanhala, Toni and Surakka, Veikko (2007): Recognizing the Effects of Voluntary Facial Activations Using Heart Rate Patterns. In: Mastorakis, Nikos E., Kartalopoulos, Stamatios, Simian, Dana, Varonides, Argyrios, Mladenov, Valeri, Bojkovic, Zoran and Antonidakis, Emmanuel (eds.) Proceedings of the 11th WSEAS International Conference on Computers July 23-28, 2007, Agios Nikolaos, Crete Island, Greece. pp. 628-632.
Continuously measured physiological signals have the potential to act as non-invasive, real-time indicators of human psychophysiological phenomena. Recently, several non-intrusive, wireless, and discrete measurement devices have been developed. For these reasons, there has been growing interest in using physiological signals to estimate emotions and other psychological processes during human-computer interaction. In the current work, we present the first steps towards constructing a person-independent online system that automatically identifies heart rate responses and estimates subjective experiences during voluntary facial activations. The preliminary results of our study showed that voluntarily produced facial expressions had an effect on subjective emotional experiences and physiological processes. Further, our results suggest that heart rate responses to facial activations can be detected and classified in order to support more accurate and efficient emotion detection.
© All rights reserved Vanhala and Surakka and/or World Scientific and Engineering Academy and Society (WSEAS)
Vanhala, Toni and Surakka, Veikko (2007): Facial Activation Control Effect (FACE). In: Paiva, Ana, Prada, Rui and Picard, Rosalind W. (eds.) Proceedings of the Second International Conference on Affective Computing and Intelligent Interaction September 12-14, 2007, Lisbon, Portugal. pp. 278-289.
The present study was the first in a series of experiments investigating the possibilities of using voluntarily produced physiological signals in computer-assisted therapy. The current aim was to find out whether computer-guided voluntary facial activations have an effect on autonomic nervous system activity. Twenty-seven participants performed a series of voluntary facial muscle activations while wireless electrocardiography and subjective experiences were recorded. Each task consisted of activating either the corrugator supercilii muscle (activated when frowning) or the zygomaticus major muscle (activated when smiling) at one of three activation intensities (i.e., low, medium, and high). Our results showed a voluntary facial activation control effect (FACE) on psychological (i.e., level of experience) and physiological activity. Different muscle activations produced both task-specific emotional experiences and significant changes in heart rate and heart rate variability. Low-intensity activations of both muscles were the most effective, easy to perform, and pleasant. We conclude that the FACE can clearly open a route for regulating involuntary physiological processes.
© All rights reserved Vanhala and Surakka and/or Springer
Anttonen, Jenni and Surakka, Veikko (2007): Music, Heart Rate, and Emotions in the Context of Stimulating Technologies. In: Paiva, Ana, Prada, Rui and Picard, Rosalind W. (eds.) ACII 2007 - Affective Computing and Intelligent Interaction, Second International Conference September 12-14, 2007, Lisbon, Portugal. pp. 290-301.
Partala, Timo, Surakka, Veikko and Vanhala, Toni (2006): Real-time estimation of emotional experiences from facial expressions. In Interacting with Computers, 18 (2) pp. 208-226.
The present aim was to develop methods that estimate emotional experiences in real time from the electromyographic activity of two facial muscles: zygomaticus major (activated when smiling) and corrugator supercilii (activated when frowning). Ten subjects were stimulated with a series of emotionally arousing pictures and videos. After each stimulus the subjects rated the valence of their emotional experience on a nine-point bipolar dimensional scale. At the same time, the computer estimated the subjects' ratings on the basis of their electrical facial activity during each stimulation, using 70 computational models. The models estimated the subjects' ratings either categorically or dimensionally with regression models. The best categorical models were able to estimate negative and positive ratings with an average accuracy of over 70% and 80% for pictures and videos, respectively. The best correlations between the human ratings and the machine estimations from the regression models were high (r > 0.9). These findings indicate that models estimating psycho-emotional experiences on the basis of facial activity can be created successfully in several ways.
© All rights reserved Partala et al. and/or Elsevier Science
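The abstract above describes estimating valence ratings from two facial EMG channels with regression models. The sketch below illustrates that general idea only: it is not the authors' implementation, the single feature (change in zygomaticus activity minus change in corrugator activity), the function names, and all training values are invented for the example.

```python
# Illustrative sketch, NOT the models from the paper: map two synthetic facial
# EMG features to a 9-point valence rating with a 1-D linear regression.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def valence_feature(zygomaticus_delta, corrugator_delta):
    # Smiling (zygomaticus) pulls the estimate positive,
    # frowning (corrugator) pulls it negative.
    return zygomaticus_delta - corrugator_delta

# Synthetic training data: (zygomaticus delta, corrugator delta) -> rating 1..9
training = [((0.8, 0.1), 8), ((0.6, 0.2), 7), ((0.1, 0.7), 2),
            ((0.2, 0.9), 1), ((0.3, 0.3), 5), ((0.5, 0.4), 6)]
xs = [valence_feature(z, c) for (z, c), _ in training]
ys = [rating for _, rating in training]
a, b = fit_linear(xs, ys)

def estimate_valence(zygomaticus_delta, corrugator_delta):
    """Estimate a valence rating from the two muscle activity changes."""
    return a * valence_feature(zygomaticus_delta, corrugator_delta) + b

# A strong smile maps to the positive end of the 9-point scale.
print(round(estimate_valence(0.9, 0.1), 1))  # → 8.9
```

The paper's dimensional models are far richer (70 computational models over registered EMG signals); this sketch only shows why opposing smile and frown activity makes a workable valence predictor.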
Anttonen, Jenni and Surakka, Veikko (2005): Emotions and heart rate while sitting on a chair. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 491-499.
New methods for unobtrusive monitoring of computer users' emotion psychophysiology are very much needed in human-computer interaction research. The present aim was to study heart rate changes during emotionally provocative stimulation. Six-second-long auditory, visual, and audiovisual emotionally negative, neutral, and positive stimuli were presented to 24 participants. Heart rate responses were measured with a regular office chair embedded with electromechanical film (the EMFi chair) and with traditional earlobe photoplethysmography (PPG). Ratings of the stimuli were also collected. The results showed that the two heart rate measurements were significantly correlated, r = 0.99. In line with other studies, the results showed that, in general, heart rate decelerated in response to emotional stimulation, and it decelerated the most in response to negative stimuli as compared with positive and neutral stimuli. In particular, emotional stimulation caused significant changes in heart rate at the 6th second from stimulus onset. We suggest that the EMFi chair could be used in human-computer interaction for unobtrusive measurement of the user's emotional reactions.
© All rights reserved Anttonen and Surakka and/or ACM Press
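The r = 0.99 reported above is a correlation between two simultaneous heart rate recordings (EMFi chair vs. earlobe PPG). As a minimal sketch of how such a coefficient is computed, here is a plain Pearson correlation; the beats-per-minute series below are synthetic, not the study's data.

```python
# Illustrative sketch: Pearson correlation between two simultaneous heart rate
# series, analogous to comparing the EMFi chair against the PPG reference.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic bpm series: the chair-derived values track the PPG reference
# closely, differing only by small sensor noise.
ppg   = [72.0, 71.5, 70.8, 69.9, 69.2, 69.0, 69.4, 70.1]
chair = [72.1, 71.4, 70.9, 70.0, 69.1, 69.2, 69.3, 70.2]
print(round(pearson_r(ppg, chair), 3))
```

With real recordings one would first align the two signals in time; a value near 1.0, as in the study, indicates the unobtrusive sensor tracks the reference well.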
Partala, Timo, Surakka, Veikko and Vanhala, Toni (2005): Person-independent estimation of emotional experiences from facial expressions. In: Riedl, John, Jameson, Anthony, Billsus, Daniel and Lau, Tessa (eds.) Proceedings of the 10th international conference on Intelligent user interfaces January 10-13, 2005, San Diego, California, USA. pp. 246-248.
The aim of this research was to develop methods for the automatic, person-independent estimation of experienced emotions from facial expressions. Ten subjects watched a series of emotionally arousing pictures and videos while the electromyographic (EMG) activity of two facial muscles, zygomaticus major (activated in smiling) and corrugator supercilii (activated in frowning), was registered. Based on the changes in the activity of these two facial muscles, it was possible to distinguish between ratings of positive and negative emotional experiences at a rate of almost 70% for pictures and over 80% for videos. Using these methods, the computer could adapt its behavior according to the user's emotions during human-computer interaction.
© All rights reserved Partala et al. and/or ACM Press
Partala, Timo, Surakka, Veikko and Lahti, Jussi (2004): Affective effects of agent proximity in conversational systems. In: Proceedings of the Third Nordic Conference on Human-Computer Interaction October 23-27, 2004, Tampere, Finland. pp. 353-356.
The aim of this study was to investigate whether the simulated proximity level of an anthropomorphic conversational agent and the affective content of the agent's speech influence the subjects' affective experiences. Eight subjects were exposed to messages given by the agent using synthetic speech. The agent character's simulated proximity level (intimate, personal, social, and public) and the affective content of the speech message (negative, neutral, and positive) were systematically varied in the experiment. The proximity levels were simulated by displaying the agent on a screen in different sizes. After each speech message, the subjects rated their affective experience on four scales: valence, arousal, dominance, and message intimacy. They also chose a preferred agent proximity level. The results showed that by manipulating the agent's simulated proximity level, experienced dominance could be significantly influenced. Further, by manipulating the affective content of the speech, experienced valence and intimacy could be significantly influenced. The subjects preferred the personal and social proximity levels.
© All rights reserved Partala et al. and/or ACM Press
Partala, Timo and Surakka, Veikko (2003): Pupil size variation as an indication of affective processing. In International Journal of Human-Computer Studies, 59 (1) pp. 185-198.
The present objective was to investigate pupil size variation during and after auditory emotional stimulation. Thirty subjects' (15 females and 15 males) pupil responses were measured while listening to 10 negative and 10 positive highly arousing sounds (e.g. a baby crying and laughing), and 10 emotionally neutral sounds (e.g. regular office noise). The subjects also rated their subjective experiences related to the stimuli. The results showed that the pupil size was significantly larger during both emotionally negative and positive stimuli than during neutral stimuli. The results for the time period of 2 s following the stimulus offset showed that pupil size was significantly larger after both negative and positive than neutral stimulation. These results suggest that the autonomic nervous system is sensitive to highly arousing emotional stimulation. The subjective ratings confirmed that the stimuli influenced the subjects' emotional experiences as expected. Further analyses showed that female subjects had significantly larger pupil responses than males only to neutral stimuli and only during the auditory stimulation. In sum, our results showed that systematically chosen stimuli significantly affected the subjects' physiological reactions and subjective experiences. It could be possible to use pupil size variation as a computer input signal, for example, in affective computing. Auditory emotion-related cues could also be utilized to modulate the user's emotional reactions.
© All rights reserved Partala and Surakka and/or Academic Press
Partala, Timo, Jokiniemi, Maria and Surakka, Veikko (2000): Pupillary responses to emotionally provocative stimuli. In: Duchowski, Andrew T. (ed.) ETRA 2000 - Proceedings of the Eye Tracking Research and Application Symposium November 6-8, 2000, Palm Beach Gardens, Florida, USA. pp. 123-129.
Page maintainer: The Editorial Team