Number of co-authors: 15
Number of publications with 3 favourite co-authors: Veikko Surakka: 5, Mika Luimula: 3, Ossi Saukko: 2
Timo Partala's 3 most productive colleagues in number of publications: Veikko Surakka: 28, Teija Vainio: 13, Toni Vanhala: 10
Publications by Timo Partala (bibliography)
Partala, Timo and Salminen, Miikka (2012): User experience of photorealistic urban pedestrian navigation. In: Proceedings of the 2012 International Conference on Advanced Visual Interfaces 2012. pp. 204-207.
With advances in satellite and street-level imaging, photorealistic mobile maps have gained widespread popularity. The aim of this research was to study the user experience of mobile navigation with three different mobile maps: a traditional graphical map representation was compared to a photorealistic satellite map and a photorealistic street-level view. Nine subjects used all three visualizations in urban pedestrian navigation and gave evaluations of navigation support, user experience (AttrakDiff), task load (NASA TLX), and overall preference using questionnaire methods. The results indicated that the photorealistic maps were more stimulating to the user than the graphical map and the photorealistic street-level view also enabled more effective identification of nearby landmarks than the other map versions. However, the photorealistic maps were perceived as less pragmatic than the graphical map and the street-level view also demanded a higher task load. The graphical map was the most often preferred visualization.
© All rights reserved Partala and Salminen and/or ACM Press
Partala, Timo, Nurminen, Antti, Vainio, Teija, Laaksonen, Jari, Laine, Miika and Väänänen, Jukka (2010): Salience of visual cues in 3D city maps. In: Proceedings of the HCI10 Conference on People and Computers XXIV 2010. pp. 428-432.
An important activity in urban three-dimensional (3D) mobile navigation is browsing the buildings in the environment and matching them to those in the 3D city map. Different factors affect the recognition process, such as changes in the appearances of buildings, weather, and illumination conditions. The aim of this study was to examine the salience of different types of visual cues in the recognition of buildings in 3D maps under suboptimal conditions. A pilot laboratory experiment was conducted in which test participants recognized buildings in a 3D city map using systematically prepared photographs as stimuli, and their cognitive processes were studied using the think-aloud protocol. The results suggested that buildings in a 3D city map can be recognized on the basis of a variety of visual cues, ranging from small details such as textual signs to the shape of the building and landmark features such as towers. The results also suggested that buildings are recognized largely on the basis of their location and of other buildings and objects in their surroundings.
© All rights reserved Partala et al. and/or BCS
Partala, Timo and Kangaskorte, Riitta (2009): The Combined Walkthrough: Measuring Behavioral, Affective, and Cognitive Information in Usability Testing. In Journal of Usability Studies, 5 (1) pp. 21-33.
This paper describes an experiment in studying users' behavior, emotions, and cognitive processes in single usability testing sessions using an experimental method called the combined walkthrough. The users' behavior was studied using task times and completion rates, and emotions were studied using bipolar scales for experienced valence and arousal. Cognition was studied after each task by revisiting detected usability problems together with the users and applying an interactive method based on cognitive walkthrough to each usability problem. An interactive media application was tested with 16 participants using these methods. The results of the experiment showed that the developed methods were efficient in identifying usability problems and measuring the different aspects of interaction, which enabled the researchers to obtain a more multifaceted view of the users' interaction with the system and the nature of the problems encountered. The main findings of this experiment were:
* Behavioral, affective, and cognitive aspects of computer system usage can be cost-effectively studied together in usability testing.
* The information obtained by the behavioral, affective, and cognitive measurements can contribute to a more multifaceted understanding of user interaction with the system.
* Variations in the users' emotional experiences (valence and arousal) related to completing a task using an interactive system can be efficiently measured using bipolar scales. Systematic measurement of emotional experiences broadens the scope of subjective measures beyond traditional satisfaction measures.
* The use of highly positive or negative media elements influences overall ratings of task-related affective experiences in interactive media applications.
* Ideas underlying the cognitive walkthrough can be useful in retrospective analysis of usability problems together with the user.
© All rights reserved Partala and Kangaskorte and/or Usability Professionals Association
Lehtimäki, Taina M., Partala, Timo, Luimula, Mika and Verronen, Pertti (2008): LocaweRoute: an advanced route history visualization for mobile devices. In: Levialdi, Stefano (ed.) AVI 2008 - Proceedings of the working conference on Advanced Visual Interfaces May 28-30, 2008, Napoli, Italy. pp. 392-395.
Partala, Timo, Surakka, Veikko and Vanhala, Toni (2006): Real-time estimation of emotional experiences from facial expressions. In Interacting with Computers, 18 (2) pp. 208-226.
The present aim was to develop methods that estimate emotional experiences in real time from the electromyographic activity of two facial muscles: zygomaticus major (activated when smiling) and corrugator supercilii (activated when frowning). Ten subjects were stimulated with a series of emotionally arousing pictures and videos. After each stimulus the subjects rated the valence of their emotional experience on a nine-point bipolar dimensional scale. At the same time the computer estimated the subjects' ratings on the basis of their electrical facial activity during each stimulation with 70 computational models. The models estimated the subjects' ratings either categorically or dimensionally with regression models. The best categorical models were able to estimate negative and positive ratings with an average accuracy of over 70 and 80% for pictures and videos, respectively. The best correlations between the human ratings and machine estimations formed with the regression models were high (r>0.9). These findings indicate that models estimating psycho-emotional experiences on the basis of facial activity can be created successfully in several ways.
© All rights reserved Partala et al. and/or Elsevier Science
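The abstract above describes regression models that map the electromyographic activity of the two facial muscles onto rated valence. As an illustration only (not the authors' actual 70 models), a minimal least-squares sketch of the idea, with synthetic data standing in for real EMG features:

```python
import numpy as np

def fit_valence_model(zygomaticus, corrugator, ratings):
    """Fit a linear model mapping mean EMG activity of the two facial
    muscles to a rated valence score (illustrative sketch only)."""
    X = np.column_stack([zygomaticus, corrugator, np.ones(len(ratings))])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(ratings), rcond=None)
    return coeffs

def estimate_valence(coeffs, zyg, corr):
    """Estimate a valence rating from new muscle-activity values."""
    return coeffs[0] * zyg + coeffs[1] * corr + coeffs[2]

# Synthetic data: smiling (zygomaticus) raises valence, frowning
# (corrugator) lowers it, on a nine-point bipolar scale (midpoint 5).
rng = np.random.default_rng(0)
zyg = rng.uniform(0, 1, 50)
corr = rng.uniform(0, 1, 50)
ratings = 5 + 3 * zyg - 3 * corr
coeffs = fit_valence_model(zyg, corr, ratings)
print(round(estimate_valence(coeffs, 0.9, 0.1), 1))  # high smiling activity -> 7.4
```

On the noiseless synthetic data the fit recovers the coefficients exactly; with real EMG signals, feature extraction and per-stimulus averaging would precede the regression step.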
Partala, Timo, Luimula, Mika and Saukko, Ossi (2006): Automatic rotation and zooming in mobile roadmaps. In: Proceedings of 8th conference on Human-computer interaction with mobile devices and services 2006. pp. 255-258.
The aim of this research was to explore the navigational effects of two common features in current mobile roadmap systems: automatic rotation based on the vehicle's direction of movement and speed-dependent automatic zooming. Twelve subjects tried four different visualization techniques for a mobile map in real traffic: no rotation/constant zooming, no rotation/automatic zooming, automatic rotation/constant zooming, and automatic rotation/automatic zooming. The subjects rated the techniques on four scales: position knowledge support, direction knowledge support, identification of real-world objects based on map objects, and an overall score. The results showed that the conditions involving automatic rotation and/or zooming received systematically more positive ratings on all scales than the conditions without those features. The implementation of automatic zooming created for this experiment was rated as very close to optimal. These results suggest that both automatic rotation and automatic zooming can enhance navigation when implemented in a mobile roadmap.
© All rights reserved Partala et al. and/or ACM Press
Partala, Timo, Surakka, Veikko and Vanhala, Toni (2005): Person-independent estimation of emotional experiences from facial expressions. In: Riedl, John, Jameson, Anthony, Billsus, Daniel and Lau, Tessa (eds.) Proceedings of the 10th international conference on Intelligent user interfaces January 10-13, 2005, San Diego, California, USA. pp. 246-248.
The aim of this research was to develop methods for the automatic person-independent estimation of experienced emotions from facial expressions. Ten subjects watched a series of emotionally arousing pictures and videos while the electromyographic (EMG) activity of two facial muscles, zygomaticus major (activated in smiling) and corrugator supercilii (activated in frowning), was registered. Based on the changes in the activity of these two facial muscles, it was possible to distinguish between ratings of positive and negative emotional experiences at a rate of almost 70% for pictures and over 80% for videos. Using these methods, the computer could adapt its behavior according to the user's emotions during human-computer interaction.
© All rights reserved Partala et al. and/or ACM Press
Partala, Timo, Surakka, Veikko and Lahti, Jussi (2004): Affective effects of agent proximity in conversational systems. In: Proceedings of the Third Nordic Conference on Human-Computer Interaction October 23-27, 2004, Tampere, Finland. pp. 353-356.
The aim of this study was to investigate whether the simulated proximity level of an anthropomorphic conversational agent and the affective contents of the agent's speech influence the subjects' affective experiences. Eight subjects were exposed to messages given by the agent using synthetic speech. The agent character's simulated proximity level (intimate, personal, social, and public) and the affective contents of the speech message (negative, neutral, and positive) were systematically varied in the experiment. The proximity levels were simulated by displaying the agent on a screen in different sizes. After each speech message, the subjects rated their affective experience on four scales: valence, arousal, dominance, and message intimacy. They also chose a preferred agent proximity level. The results showed that by manipulating the agent's simulated proximity level, experienced dominance could be significantly influenced. Further, by manipulating the affective contents of the speech, experienced valence and intimacy could be significantly influenced. The personal and social proximity levels were preferred by the subjects.
© All rights reserved Partala et al. and/or ACM Press
Partala, Timo and Surakka, Veikko (2003): Pupil size variation as an indication of affective processing. In International Journal of Human-Computer Studies, 59 (1) pp. 185-198.
The present objective was to investigate pupil size variation during and after auditory emotional stimulation. Thirty subjects' (15 females and 15 males) pupil responses were measured while listening to 10 negative and 10 positive highly arousing sounds (e.g. a baby crying and laughing), and 10 emotionally neutral sounds (e.g. regular office noise). The subjects also rated their subjective experiences related to the stimuli. The results showed that the pupil size was significantly larger during both emotionally negative and positive stimuli than during neutral stimuli. The results for the time period of 2 s following the stimulus offset showed that pupil size was significantly larger after both negative and positive than neutral stimulation. These results suggest that the autonomic nervous system is sensitive to highly arousing emotional stimulation. The subjective ratings confirmed that the stimuli influenced the subjects' emotional experiences as expected. Further analyses showed that female subjects had significantly larger pupil responses than males only to neutral stimuli and only during the auditory stimulation. In sum, our results showed that systematically chosen stimuli significantly affected the subjects' physiological reactions and subjective experiences. It could be possible to use pupil size variation as a computer input signal, for example, in affective computing. Auditory emotion-related cues could also be utilized to modulate the user's emotional reactions.
© All rights reserved Partala and Surakka and/or Academic Press
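The abstract above suggests using pupil size variation as a computer input signal. A minimal sketch of how such a signal might be read, assuming baseline-relative dilation as the feature and a hypothetical threshold (both the measurement window and the threshold value are assumptions, not from the paper):

```python
def baseline_relative_dilation(samples, baseline):
    """Mean pupil diameter change (mm) relative to a pre-stimulus baseline."""
    return sum(samples) / len(samples) - baseline

def is_arousing(samples, baseline, threshold=0.1):
    """Flag a stimulus as emotionally arousing when the mean dilation over
    the measurement window exceeds the threshold (0.1 mm is an assumed
    value for illustration, not a figure from the study)."""
    return baseline_relative_dilation(samples, baseline) > threshold

# 3.0 mm pre-stimulus baseline; diameters sampled during an arousing sound.
print(is_arousing([3.2, 3.3, 3.4], baseline=3.0))  # True
```

Note the study found dilation for both positive and negative stimuli, so a signal like this would indicate arousal only; valence would need a separate channel (e.g. the facial EMG methods above).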
Partala, Timo, Jokiniemi, Maria and Surakka, Veikko (2000): Pupillary responses to emotionally provocative stimuli. In: Duchowski, Andrew T. (ed.) ETRA 2000 - Proceedings of the Eye Tracking Research and Application Symposium November 6-8, 2000, Palm Beach Gardens, Florida, USA. pp. 123-129.