Publication statistics

Pub. period: 2006-2012
Pub. count: 13
Number of co-authors: 16



Co-authors

Number of publications with 3 favourite co-authors:

Stephen A. Brewster: 6
Roope Raisamo: 2
Laura Haverinen: 2


Productive colleagues

Eve Hoggan's 3 most productive colleagues, by number of publications:

Stephen A. Brewster: 108
Roope Raisamo: 53
Nuria Oliver: 31

Eve Hoggan


Publications by Eve Hoggan (bibliography)

2012

Stewart, Craig, Hoggan, Eve, Haverinen, Laura, Salamin, Hugues and Jacucci, Giulio (2012): An exploration of inadvertent variations in mobile pressure input. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 35-38.

This paper reports the results of an exploratory study into inadvertent grip pressure changes on mobile devices with a focus on the differences between static lab-based and mobile walking environments. The aim of this research is to inform the design of more robust pressure input techniques that can accommodate dynamic mobile usage. The results of the experiment show that there are significant differences in grip pressure in static and walking conditions with high levels of pressure variation in both. By combining the pressure data with accelerometer data, we show that grip pressure is closely related to user movement.

© All rights reserved Stewart et al. and/or ACM Press


Hoggan, Eve, Stewart, Craig, Haverinen, Laura, Jacucci, Giulio and Lantz, Vuokko (2012): Pressages: augmenting phone calls with non-verbal messages. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 555-562.

ForcePhone is a mobile synchronous haptic communication system. During phone calls, users can squeeze the side of the device and the pressure level is mapped to vibrations on the recipient's device. The pressure/vibrotactile messages supported by ForcePhone are called pressages. Using a lab-based study and a small field study, this paper addresses the following questions: how can haptic interpersonal communication be integrated into a standard mobile device? What is the most appropriate feedback design for pressages? What types of non-verbal cues can be represented by pressages? Do users make use of pressages during their conversations? The results of this research indicate that such a system has value as a communication channel in real-world settings with users expressing greetings, presence and emotions through pressages.

© All rights reserved Hoggan et al. and/or ACM Press

2011

Hoggan, Eve, Trendafilov, Dari, Ahmaniemi, Teemu and Raisamo, Roope (2011): Squeeze vs. tilt: a comparative study using continuous tactile feedback. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 1309-1314.

This paper presents an investigation into the performance of squeezing as a manipulative interaction technique in comparison to tilting with an aim to answer two questions: is squeezing an effective input technique for mobile devices and can tactile feedback improve performance? The experiment results show that both input methods are viable but squeezing is significantly faster and more sustainable than tilting (with and without tactile feedback).

© All rights reserved Hoggan et al. and/or their publisher

2010

Church, Karen, Hoggan, Eve and Oliver, Nuria (2010): A study of mobile mood awareness and communication through MobiMood. In: Proceedings of the Sixth Nordic Conference on Human-Computer Interaction 2010. pp. 128-137.

Recent research shows that there has been increased interest in investigating the role of mood and emotions in the HCI domain. Our moods, however, are complex. They are affected by many dynamic factors and can change multiple times throughout each day. Furthermore, our mood can have significant implications in terms of our experiences, our actions and most importantly on our interactions with other people. We have developed MobiMood, a proof-of-concept social mobile application that enables groups of friends to share their moods with each other. In this paper, we present the results of an exploratory field study of MobiMood, focusing on explicit mood sharing in-situ. Our results highlight that certain contextual factors had an effect on mood and the interpretation of moods. Furthermore, mood sharing and mood awareness appear to be good springboards for conversations and increased communication among users. These and other findings lead to a number of key implications in the design of mobile social awareness applications.

© All rights reserved Church et al. and/or their publisher


Hoggan, Eve and Brewster, Stephen A. (2010): Crosstrainer: testing the use of multimodal interfaces in situ. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 333-342.

We report the results of an exploratory 8-day field study of CrossTrainer: a mobile game with crossmodal audio and tactile feedback. Our research focuses on the longitudinal effects on performance with audio and tactile feedback, the impact of context such as location and situation on performance and personal modality preference. The results of this study indicate that crossmodal feedback can aid users in entering answers quickly and accurately using a variety of different widgets. Our study shows that there are times when audio is more appropriate than tactile and vice versa and for this reason devices should support both tactile and audio feedback to cover the widest range of environments, user preference, locations and tasks.

© All rights reserved Hoggan and Brewster and/or their publisher


Hoggan, Eve (2010): Crossmodal Audio and Tactile Interaction with Mobile Touchscreens. In International Journal of Mobile Human Computer Interaction, 2 (4) pp. 29-44.

This article asserts that using crossmodal auditory and tactile interaction can aid mobile touchscreen users in accessing data non-visually and, by providing a choice of modalities, can help to overcome problems that occur in different mobile situations where one modality may be less suitable than another (Hoggan, 2010). By encoding data using the crossmodal parameters of audio and vibration, users can learn mappings and translate information between both modalities. In this regard, data may be presented to the most appropriate modality given the situation and surrounding environment.

© All rights reserved Hoggan and/or her publisher

2009

Hoggan, Eve, Crossan, Andrew, Brewster, Stephen A. and Kaaresoja, Topi (2009): Audio or tactile feedback: which modality when?. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 2253-2256.

When designing interfaces for mobile devices it is important to take into account the variety of contexts of use. We present a study that examines how changing noise and disturbance in the environment affects user performance in a touchscreen typing task with the interface being presented through visual only, visual and tactile, or visual and audio feedback. The aim of the study is to show at what exact environmental levels audio or tactile feedback becomes ineffective. The results show significant decreases in performance for audio feedback at noise levels of 94 dB and above, as well as decreases in performance for tactile feedback at vibration levels of 9.18 g/s. These results suggest that at these levels, feedback should be presented by a different modality. These findings will allow designers to take advantage of sensor-enabled mobile devices to adapt the provided feedback to the user's current context.

© All rights reserved Hoggan et al. and/or ACM Press


Hoggan, Eve, Raisamo, Roope and Brewster, Stephen A. (2009): Mapping information to audio and tactile icons. In: Proceedings of the 2009 International Conference on Multimodal Interfaces 2009. pp. 327-334.

We report the results of a study focusing on the meanings that can be conveyed by audio and tactile icons. Our research considers the following question: how can audio and tactile icons be designed to optimise congruence between crossmodal feedback and the type of information this feedback is intended to convey? For example, if we have a set of system warnings, confirmations, progress updates and errors: what audio and tactile representations best match the information or type of message? Is one modality more appropriate at presenting certain types of information than the other modality? The results of this study indicate that certain parameters of the audio and tactile modalities such as rhythm, texture and tempo play an important role in the creation of congruent sets of feedback when given a specific type of information to transmit. We argue that a combination of audio or tactile parameters derived from our results allows the same type of information to be derived through touch and sound with an intuitive match to the content of the message.

© All rights reserved Hoggan et al. and/or their publisher

2008

Hoggan, Eve, Brewster, Stephen A. and Johnston, Jody (2008): Investigating the effectiveness of tactile feedback for mobile touchscreens. In: Proceeding of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems April 5-10, 2008, Florence, Italy. pp. 1573-1582.

This paper presents a study of finger-based text entry for mobile devices with touchscreens. Many devices are now coming to market that have no physical keyboards (the Apple iPhone being a very popular example). Touchscreen keyboards lack any tactile feedback and this may cause problems for entering text and phone numbers. We ran an experiment to compare devices with a physical keyboard, a standard touchscreen and a touchscreen with tactile feedback added. We tested this in both static and mobile environments. The results showed that the addition of tactile feedback to the touchscreen significantly improved finger-based text entry, bringing it close to the performance of a real physical keyboard. A second experiment showed that higher specification tactile actuators could improve performance even further. The results suggest that manufacturers should use tactile feedback in their touchscreen devices to regain some of the feeling lost when interacting on a touchscreen with a finger.

© All rights reserved Hoggan et al. and/or ACM Press



Hoggan, Eve, Kaaresoja, Topi, Laitinen, Pauli and Brewster, Stephen (2008): Crossmodal congruence: the look, feel and sound of touchscreen widgets. In: Proceedings of the 2008 International Conference on Multimodal Interfaces 2008. pp. 157-164.

Our research considers the following question: how can visual, audio and tactile feedback be combined in a congruent manner for use with touchscreen graphical widgets? For example, if a touchscreen display presents different styles of visual buttons, what should each of those buttons feel and sound like? This paper presents the results of an experiment conducted to investigate methods of congruently combining visual and combined audio/tactile feedback by manipulating the different parameters of each modality. The results indicate trends with individual visual parameters such as shape, size and height being combined congruently with audio/tactile parameters such as texture, duration and different actuator technologies. We draw further on the experiment results, using individual quality ratings to evaluate the perceived quality of our touchscreen buttons, and then reveal a correlation between perceived quality and crossmodal congruence. The results of this research will enable mobile touchscreen UI designers to create realistic, congruent buttons by selecting the most appropriate audio and tactile counterparts of visual button styles.

© All rights reserved Hoggan et al. and/or their publisher

2007

Hoggan, Eve and Brewster, Stephen (2007): Designing audio and tactile crossmodal icons for mobile devices. In: Proceedings of the 2007 International Conference on Multimodal Interfaces 2007. pp. 162-169.

This paper reports an experiment into the design of crossmodal icons which can provide an alternative form of output for mobile devices using audio and tactile modalities to communicate information. A complete set of crossmodal icons was created by encoding three dimensions of information in three crossmodal auditory/tactile parameters. Earcons were used for the audio and Tactons for the tactile crossmodal icons. The experiment investigated absolute identification of audio and tactile crossmodal icons when a user is trained in one modality and tested in the other (and given no training in the other modality) to see if knowledge could be transferred between modalities. We also compared performance when users were static and mobile to see any effects that mobility might have on recognition of the cues. The results showed that if participants were trained in sound with Earcons and then tested with the same messages presented via Tactons they could recognize 85% of messages when stationary and 76% when mobile. When trained with Tactons and tested with Earcons participants could accurately recognize 76.5% of messages when stationary and 71% of messages when mobile. These results suggest that participants can recognize and understand a message in a different modality very effectively. These results will aid designers of mobile displays in creating effective crossmodal cues which require minimal training for users and can provide alternative presentation modalities through which information may be presented if the context requires.

© All rights reserved Hoggan and Brewster and/or their publisher

2006

Hoggan, Eve and Brewster, Stephen A. (2006): Crossmodal spatial location: initial experiments. In: Proceedings of the Fourth Nordic Conference on Human-Computer Interaction 2006. pp. 469-472.

This paper describes an alternative form of interaction for mobile devices using crossmodal output. The aim of our work is to investigate the equivalence of audio and tactile displays so that the same messages can be presented in one form or another. Initial experiments show that spatial location can be perceived as equivalent in both the auditory and tactile modalities. Results show that participants are able to map presented 3D audio positions to tactile body positions on the waist most effectively when mobile, and that significantly more errors are made when using the ankle or wrist. This paper compares the results from both a static and a mobile experiment on crossmodal spatial location and outlines the most effective ways to use this crossmodal output in a mobile context.

© All rights reserved Hoggan and Brewster and/or ACM Press


Changes to this page (author)

23 Nov 2012: Modified
23 Nov 2012: Modified
05 Jul 2011: Modified
20 Apr 2011: Modified
20 Apr 2011: Modified
20 Apr 2011: Modified
16 Jan 2011: Modified
02 Nov 2010: Modified
02 Nov 2010: Modified
09 May 2009: Modified
12 May 2008: Added
12 May 2008: Added
12 May 2008: Modified
22 Jun 2007: Added

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/eve_hoggan.html
