Publication statistics

Pub. period: 2008-2011
Pub. count: 7
Number of co-authors: 13



Co-authors

Number of publications with Sean Gustafson's 3 most frequent co-authors:

Patrick Baudisch: 4
Pourang Irani: 4
Sriram Subramanian: 2

Productive colleagues

Sean Gustafson's 3 most productive colleagues, by number of publications:

Carl Gutwin: 116
Patrick Baudisch: 57
Andreas Butz: 48


Sean Gustafson

 

Publications by Sean Gustafson (bibliography)

2011

Gustafson, Sean, Holz, Christian and Baudisch, Patrick (2011): Imaginary phone: learning imaginary interfaces by transferring spatial memory from a familiar device. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 283-292. Available online

We propose a method for learning how to use an imaginary interface (i.e., a spatial non-visual interface) that we call "transfer learning". By using a physical device (e.g., an iPhone), a user inadvertently learns the interface and can then transfer that knowledge to an imaginary interface. We illustrate this concept with our Imaginary Phone prototype. With it, users interact by mimicking the use of a physical iPhone, tapping and sliding on their empty non-dominant hand without visual feedback. Pointing on the hand is tracked using a depth camera, and touch events are sent wirelessly to an actual iPhone, where they invoke the corresponding actions. Our prototype allows the user to perform everyday tasks such as answering a phone call or launching the timer app and setting an alarm. Imaginary Phone thereby serves as a shortcut that frees users from the necessity of retrieving the actual physical device. We present two user studies that validate the three assumptions underlying the transfer learning method. (1) Users build up spatial memory automatically while using a physical device: participants knew the correct location of 68% of their own iPhone home screen apps by heart. (2) Spatial memory transfers from a physical to an imaginary interface: participants recalled 61% of their home screen apps when recalling app location on the palm of their hand. (3) Palm interaction is precise enough to operate a typical mobile phone: participants could reliably acquire 0.95 cm wide iPhone targets on their palm, sufficiently large to operate any standard iPhone widget.

© All rights reserved Gustafson et al. and/or ACM Press
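
The transfer-learning idea rests on a simple spatial correspondence: a touch at some position on the palm triggers whatever occupies the same position on the phone's home screen. Below is a minimal sketch of that mapping step, assuming an upstream depth-camera tracker that reports palm touches as normalized (x, y) coordinates; the grid size, app names, and function name are hypothetical.

```python
# Hypothetical sketch: mapping a tracked palm touch to a home-screen slot.
# Assumes a tracker that reports touch positions normalized to the palm
# area as (x, y) in [0, 1] x [0, 1]; the 4x4 grid below is a stand-in for
# a user's actual home-screen layout.

APP_GRID = [
    ["Phone",    "Mail",   "Safari",  "Music"],
    ["Messages", "Clock",  "Maps",    "Photos"],
    ["Camera",   "Notes",  "Timer",   "Weather"],
    ["Settings", "Stocks", "Compass", "Calculator"],
]

def palm_touch_to_app(x: float, y: float) -> str:
    """Return the app occupying the grid cell under a normalized palm touch."""
    rows, cols = len(APP_GRID), len(APP_GRID[0])
    col = min(int(x * cols), cols - 1)  # clamp so x == 1.0 stays in range
    row = min(int(y * rows), rows - 1)
    return APP_GRID[row][col]

print(palm_touch_to_app(0.1, 0.1))   # -> "Phone" (top-left of the palm)
print(palm_touch_to_app(0.6, 0.35))  # -> "Maps"
```

Because the same grid position maps to the same app on both surfaces, any spatial memory built up on the physical screen carries over to the palm unchanged.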

2010

Boring, Sebastian, Baur, Dominikus, Butz, Andreas, Gustafson, Sean and Baudisch, Patrick (2010): Touch projector: mobile interaction through video. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2287-2296. Available online

In 1992, Tani et al. proposed remotely operating machines in a factory by manipulating a live video image on a computer screen. In this paper we revisit this metaphor and investigate its suitability for mobile use. We present Touch Projector, a system that enables users to interact with remote screens through a live video image on their mobile device. The handheld device tracks itself with respect to the surrounding displays. Touch on the video image is "projected" onto the target display in view, as if it had occurred there. This literal adaptation of Tani's idea, however, fails because handheld video does not offer enough stability and control to enable precise manipulation. We address this with a series of improvements, including zooming and freezing the video image. In a user study, participants selected targets and dragged them between displays using the literal version and the three improved versions. We found that participants achieved the highest performance with automatic zooming and temporary image freezing.

© All rights reserved Boring et al. and/or their publisher
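
The central step, once the handheld has localized itself relative to a display, is "projecting" a touch from video-frame coordinates onto that display. A minimal sketch, assuming localization already yields a 3x3 homography between the two pixel spaces; the matrix values and function name are illustrative, not taken from the paper.

```python
# Hypothetical sketch: projecting a touch from the handheld's video frame
# onto the remote display, given a known frame-to-display homography H.

import numpy as np

def project_touch(H: np.ndarray, u: float, v: float) -> tuple[float, float]:
    """Map a touch at video-frame pixel (u, v) to display coordinates."""
    x, y, w = H @ np.array([u, v, 1.0])
    return (x / w, y / w)  # perspective divide

# Toy homography: the frame sees the display shifted by (200, 100) pixels,
# so touches are translated by that offset.
H = np.array([[1.0, 0.0, 200.0],
              [0.0, 1.0, 100.0],
              [0.0, 0.0,   1.0]])
print(project_touch(H, 320.0, 240.0))  # -> (520.0, 340.0)
```

The paper's improvements act upstream of this step: zooming magnifies, and freezing stabilizes, the frame in which (u, v) is measured.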

 

Gustafson, Sean, Bierwirth, Daniel and Baudisch, Patrick (2010): Imaginary interfaces: spatial interaction with empty hands and without visual feedback. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 3-12. Available online

Screen-less wearable devices allow for the smallest form factor and thus the maximum mobility. However, current screen-less devices support only buttons and gestures. Pointing is not supported because users have nothing to point at. We challenge the notion that spatial interaction requires a screen and propose a method for bringing spatial interaction to screen-less devices. We present Imaginary Interfaces, screen-less devices that allow users to perform spatial interaction with empty hands and without visual feedback. Unlike projection-based solutions, such as Sixth Sense, all visual "feedback" takes place in the user's imagination. Users define the origin of an imaginary space by forming an L-shaped coordinate cross with their non-dominant hand. Users then point and draw with their dominant hand in the resulting space. With three user studies we investigate the question: To what extent can users interact spatially with a user interface that exists only in their imagination? Participants created simple drawings, annotated existing drawings, and pointed at locations described in imaginary space. Our findings suggest that users' visual short-term memory can, in part, replace the feedback conventionally displayed on a screen.

© All rights reserved Gustafson et al. and/or their publisher
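
Interpreting a pointing gesture in an imaginary interface amounts to a change of basis: the corner of the L-shaped hand pose is the origin, and its two strokes are the axes. A minimal sketch of that computation, assuming a tracker that reports the corner, the two stroke tips, and the dominant-hand fingertip as 2D camera coordinates (all values hypothetical).

```python
# Hypothetical sketch: expressing a tracked fingertip position in the
# coordinate space defined by the non-dominant hand's L-shaped pose.

import numpy as np

def to_imaginary_coords(origin, x_tip, y_tip, point):
    """Express `point` in the (possibly skewed) basis spanned by the L."""
    basis = np.column_stack([np.subtract(x_tip, origin),
                             np.subtract(y_tip, origin)])
    return np.linalg.solve(basis, np.subtract(point, origin))

# The user points halfway along one stroke of the L and a quarter of the
# way along the other (coordinates are toy camera pixels).
print(to_imaginary_coords(origin=(100, 300), x_tip=(300, 300),
                          y_tip=(100, 100), point=(200, 250)))
# -> [0.5  0.25]
```

Because the basis is read off the tracked hand in each frame, the interpreted coordinates stay anchored to the hand as it moves.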

2009

Rahman, Mahfuz, Gustafson, Sean, Irani, Pourang and Subramanian, Sriram (2009): Tilt techniques: investigating the dexterity of wrist-based input. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 1943-1952. Available online

Most studies on tilt-based interaction can be classified as point designs that demonstrate the utility of wrist tilt as an input medium; tilt parameters are tailored to suit the specific interaction at hand. In this paper, we systematically analyze the design space of wrist-based interactions and focus on the level of control possible with the wrist. In a first study, we investigate the various factors that can influence tilt control, separately along the three axes of wrist movement: flexion/extension, pronation/supination, and ulnar/radial deviation. Results show that users can comfortably control at least 16 levels on the pronation/supination axis and that using a quadratic mapping function for discretization of tilt space significantly improves user performance across all tilt axes. We discuss our findings in the context of several interaction techniques and identify general design recommendations.

© All rights reserved Rahman et al. and/or ACM Press
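
To make the quadratic-mapping result concrete, the sketch below discretizes one tilt axis into 16 levels with quadratically spaced bin boundaries, so resolution is finest near the neutral wrist pose. The exact function and parameter values the study used are not reproduced here; this is one plausible form, stated as an assumption.

```python
# Hypothetical sketch: quadratic discretization of a tilt axis into N
# levels. Bin boundaries grow quadratically, so levels near the neutral
# pose are narrow (fine control) and levels near the extremes are wide.

import math

def tilt_to_level(angle_deg: float, max_deg: float = 90.0, levels: int = 16) -> int:
    """Map a pronation/supination angle in [0, max_deg] to a discrete level."""
    t = min(max(angle_deg / max_deg, 0.0), 1.0)  # normalize and clamp
    return min(int(levels * math.sqrt(t)), levels - 1)

print(tilt_to_level(5.0))   # -> 3: small angles spread across several levels
print(tilt_to_level(80.0))  # -> 15: the same 5-degree change near the
print(tilt_to_level(85.0))  # -> 15: extreme does not cross a boundary
```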

2008

Gustafson, Sean, Baudisch, Patrick, Gutwin, Carl and Irani, Pourang (2008): Wedge: clutter-free visualization of off-screen locations. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 787-796. Available online

To overcome display limitations of small-screen devices, researchers have proposed techniques that point users to objects located off-screen. Arrow-based techniques such as City Lights convey only direction. Halo conveys direction and distance, but is susceptible to clutter resulting from overlapping halos. We present Wedge, a visualization technique that conveys direction and distance, yet avoids overlap and clutter. Wedge represents each off-screen location using an acute isosceles triangle: the tip coincides with the off-screen location, and the two corners are located on-screen. A wedge conveys location awareness primarily by means of its two legs pointing towards the target. Wedges avoid overlap programmatically by repelling each other, causing them to rotate until overlap is resolved. As a result, wedges can be applied to numbers and configurations of targets that would lead to clutter if visualized using halos. We report on a user study comparing Wedge and Halo for three off-screen tasks. Participants were significantly more accurate when using Wedge than when using Halo.

© All rights reserved Gustafson et al. and/or ACM Press
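
The wedge geometry itself is cheap to compute: the triangle's tip sits at the off-screen target, and the two base corners are found by walking a fixed leg length from the tip along directions rotated half the aperture to either side of the tip-to-screen heading. A minimal sketch under those assumptions; the leg length and aperture are made-up constants, and the overlap-repelling rotation pass described in the abstract is omitted.

```python
# Hypothetical sketch: computing the two on-screen base corners of a wedge
# whose tip is an off-screen target.

import math

def wedge_corners(target, toward, leg_len=120.0, aperture_deg=20.0):
    """Return the wedge's two base corners.

    target -- off-screen location (the triangle's tip)
    toward -- an on-screen reference point, e.g. the screen center,
              that sets the wedge's heading
    """
    tx, ty = target
    heading = math.atan2(toward[1] - ty, toward[0] - tx)
    half = math.radians(aperture_deg) / 2.0
    return [(tx + leg_len * math.cos(heading + s * half),
             ty + leg_len * math.sin(heading + s * half))
            for s in (-1.0, 1.0)]

# Target 80 px beyond the left edge (x = 0) of the screen: both corners
# land on-screen, and the legs point back at the tip.
print(wedge_corners(target=(-80, 300), toward=(400, 300)))
# -> approx [(38.2, 279.2), (38.2, 320.8)]
```

A full implementation would additionally scale the wedge with target distance and rotate overlapping wedges apart, as the abstract describes.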

 

Shi, Kang, Irani, Pourang, Gustafson, Sean and Subramanian, Sriram (2008): PressureFish: a method to improve control of discrete pressure-based input. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 1295-1298. Available online

Studies investigating user control of pressure input have reported ...

© All rights reserved Shi et al. and/or ACM Press

 

Hui, Bowen, Gustafson, Sean, Irani, Pourang and Boutilier, Craig (2008): The need for an interaction cost model in adaptive interfaces. In: Levialdi, Stefano (ed.) AVI 2008 - Proceedings of the working conference on Advanced Visual Interfaces May 28-30, 2008, Napoli, Italy. pp. 458-461. Available online

 

