Number of co-authors: 16
Number of publications with 3 favourite co-authors: Thad Starner (5), Kent Lyons (4), Sean White (3)
Daniel Ashbrook's 3 most productive colleagues by number of publications: Patrick Baudisch (57), Thad Starner (49), Giulio Jacucci (30)
A user will find any interface design intuitive...with enough practice.
-- Popular computer one-liner
Publications by Daniel Ashbrook (bibliography)
Lyons, Kent, Nguyen, David, Ashbrook, Daniel and White, Sean (2012): Facet: a multi-segment wrist worn system. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 123-130.
We present Facet, a multi-display wrist-worn system consisting of multiple independent touch-sensitive segments joined into a bracelet. Facet automatically determines the pose of the system as a whole and of each segment individually. It further supports multi-segment touch, yielding a rich set of touch input techniques. Our work builds on these two primitives to allow the user to control how applications use segments alone and in coordination. Applications can expand to use more segments, collapse to encompass fewer, and be swapped with other segments. We also explore how the concepts from Facet could apply to other devices in this design space.
© All rights reserved Lyons et al. and/or ACM Press
Ashbrook, Daniel, Baudisch, Patrick and White, Sean (2011): Nenya: subtle and eyes-free mobile input with a magnetically-tracked finger ring. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 2043-2046.
We present Nenya, a new input device in the shape of a finger ring. Nenya provides an input mechanism that is always available, fast to access, and allows analog input, while remaining socially acceptable by being embodied in commonly worn items. Users make selections by twisting the ring and "click" by sliding it along the finger. The ring -- the size of a regular wedding band -- is magnetic, and is tracked by a wrist-worn sensor. Nenya's tiny size, eyes-free usability, and physical form indistinguishable from a regular ring make its use subtle and socially acceptable. We present two user studies (one- and two-handed) in which we studied sighted and eyes-free use, finding that even with no visual feedback users were able to select from eight targets.
© All rights reserved Ashbrook et al. and/or their publisher
Hansen, Lone Koefoed, Rico, Julie, Jacucci, Giulio, Brewster, Stephen and Ashbrook, Daniel (2011): Performative interaction in public space. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 49-52.
Building on the assumption that every human action in public space has a performative aspect, this workshop seeks to explore issues of interactions with technology in public settings. More and more interfaces are used in public on an everyday basis. Simultaneously, metaphors from performance and theatre studies find their way into research on these interfaces, addressing how interaction with technology can be understood in a performative sense. However, the term 'performativity' is rarely addressed in ways that accentuate its nuances and its analytic power, and this is the focus of the workshop. We will examine the design of performative technologies, the evaluation of user experience, the importance of spectator and performer roles, and the social acceptability of performative actions in public spaces.
© All rights reserved Hansen et al. and/or their publisher
Lin, Felix Xiaozhu, Ashbrook, Daniel and White, Sean (2011): RhythmLink: securely pairing I/O-constrained devices by tapping. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 263-272.
We present RhythmLink, a system that improves the wireless pairing user experience. Users can link devices such as phones and headsets together by tapping a known rhythm on each device. In contrast to current solutions, RhythmLink does not require user interaction with the host device during the pairing process; and it only requires binary input on the peripheral, making it appropriate for small devices with minimal physical affordances. We describe the challenges in enabling this user experience and our solution, an algorithm that allows two devices to compare imprecisely-entered tap sequences while maintaining the secrecy of those sequences. We also discuss our prototype implementation of RhythmLink and review the results of initial user tests.
© All rights reserved Lin et al. and/or ACM Press
Ashbrook, Daniel and Starner, Thad (2010): MAGIC: a motion gesture design tool. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2159-2168.
Devices capable of gestural interaction through motion sensing are increasingly becoming available to consumers; however, motion gesture control has yet to appear outside of game consoles. Interaction designers are frequently not expert in pattern recognition, which may be one reason for this lack of availability. Another issue is how to effectively test gestures to ensure that they are not unintentionally activated by a user's normal movements during everyday usage. We present MAGIC, a gesture design tool that addresses both of these issues, and detail the results of an evaluation.
© All rights reserved Ashbrook and Starner and/or their publisher
Ashbrook, Daniel, Lyons, Kent and Starner, Thad (2008): An investigation into round touchscreen wristwatch interaction. In: Hofte, G. Henri ter, Mulder, Ingrid and Ruyter, Boris E. R. de (eds.) Proceedings of the 10th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2008 September 2-5, 2008, Amsterdam, the Netherlands. pp. 311-314.
Ashbrook, Daniel, Lyons, Kent and Clawson, James (2006): Capturing Experiences Anytime, Anywhere. In IEEE Pervasive Computing, 5 (2) pp. 8-11.
Lyons, Kent, Skeels, Christopher, Starner, Thad, Snoeck, Cornelis M., Wong, Benjamin A. and Ashbrook, Daniel (2004): Augmenting conversations using dual-purpose speech. In: Proceedings of the 2004 ACM Symposium on User Interface Software and Technology 2004. pp. 237-246.
In this paper, we explore the concept of dual-purpose speech: speech that is socially appropriate in the context of a human-to-human conversation which also provides meaningful input to a computer. We motivate the use of dual-purpose speech and explore issues of privacy and technological challenges related to mobile speech recognition. We present three applications that utilize dual-purpose speech to assist a user in conversational tasks: the Calendar Navigator Agent, DialogTabs, and Speech Courier. The Calendar Navigator Agent navigates a user's calendar based on socially appropriate speech used while scheduling appointments. DialogTabs allows a user to postpone cognitive processing of conversational material by providing short-term capture of transient information. Finally, Speech Courier allows asynchronous delivery of relevant conversational information to a third party.
© All rights reserved Lyons et al. and/or ACM Press
Ashbrook, Daniel and Starner, Thad (2003): Using GPS to learn significant locations and predict movement across multiple users. In Personal and Ubiquitous Computing, 7 (5) pp. 275-286.
Starner, Thad, Auxier, Jake, Ashbrook, Daniel and Gandy, Maribeth (2000): The gesture pendant: a self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring. In: Proceedings of the 4th IEEE International Symposium on Wearable Computers October 16-17, 2000, Atlanta, USA. pp. 87-94.
In this paper we present a wearable device for control of home automation systems via hand gestures. This solution has many advantages over traditional home automation interfaces in that it can be used by those with loss of vision, motor skills, and mobility. By combining other sources of context with the pendant we can reduce the number and complexity of gestures while maintaining functionality. As users input gestures, the system can also analyze their movements for pathological tremors. This information can then be used for medical diagnosis, therapy, and emergency services. Currently, the Gesture Pendant can recognize control gestures with an accuracy of 95% and user-defined gestures with an accuracy of 97%. It can detect tremors above 2 Hz within +/- 0.1 Hz.
© All rights reserved Starner et al. and/or their publisher
Changes to this page (author)
25 Jan 2014: Added
23 Nov 2012: Modified
05 Apr 2012: Modified
05 Jul 2011: Modified
02 Nov 2010: Modified
02 Jun 2009: Modified
31 May 2009: Modified
29 May 2009: Modified
11 Jun 2007: Added
Page maintainer: The Editorial Team