Publication statistics

Publication period: 2006-2011
Publication count: 9
Number of co-authors: 27


Number of publications with three favourite co-authors:

Rob DeLine
Meredith Ringel Morris
John Huffman



Productive colleagues

Andrew Bragdon's 3 most productive colleagues in number of publications:

Ken Hinckley: 54
Meredith Ringel Morris: 38
Yang Li: 30





Andrew Bragdon


Publications by Andrew Bragdon (bibliography)


Bragdon, Andrew and Ko, Hsu-Sheng (2011): Gesture select: acquiring remote targets on large displays without pointing. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 187-196.

When working at a large wall display, even if partially utilized, many targets are likely to be distant from the user, requiring walking, which is slow, and interrupts workflow. We propose a novel technique for selecting remote targets called Gesture Select, in which users draw an initial mark, in a target's direction; rectilinear gestures represented as icons are dynamically overlaid on targets within a region of interest; the user then continues by drawing the continuation mark corresponding to the target, to select it. Extensions to this technique to support working with remote content for an extended period, and learning gesture shortcuts are presented. A formal experiment indicates Gesture Select significantly outperformed direct selection for mid/far targets. Further analysis suggests Gesture Select performance is principally affected by the extent to which users can read the gestures, influenced by distance and perspective warping, and the gesture complexity in the ROI. The results of a second 2-D experiment with labeled targets indicate Gesture Select significantly outperformed direct selection and an existing technique.

© All rights reserved Bragdon and Ko and/or their publisher


Bragdon, Andrew, Nelson, Eugene, Li, Yang and Hinckley, Ken (2011): Experimental analysis of touch-screen gesture designs in mobile environments. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 403-412.

Direct-touch interaction on mobile phones revolves around screens that compete for visual attention with users' real-world tasks and activities. This paper investigates the impact of these situational impairments on touch-screen interaction. We probe several design factors for touch-screen gestures, under various levels of environmental demands on attention, in comparison to the status-quo approach of soft buttons. We find that in the presence of environmental distractions, gestures can offer significant performance gains and reduced attentional load, while performing as well as soft buttons when the user's attention is focused on the phone. In fact, the speed and accuracy of bezel gestures did not appear to be significantly affected by environment, and some gestures could be articulated eyes-free, with one hand. Bezel-initiated gestures offered the fastest performance, and mark-based gestures were the most accurate. Bezel-initiated marks therefore may offer a promising approach for mobile touch-screen interaction that is less demanding of the user's attention.

© All rights reserved Bragdon et al. and/or their publisher


Bragdon, Andrew, DeLine, Rob, Hinckley, Ken and Morris, Meredith Ringel (2011): Code space: touch + air gesture hybrid interactions for supporting developer meetings. In: Proceedings of the 2011 ACM International Conference on Interactive Tabletops and Surfaces 2011. pp. 212-221.

We present Code Space, a system that contributes touch + air gesture hybrid interactions to support co-located, small group developer meetings by democratizing access, control, and sharing of information across multiple personal devices and public displays. Our system uses a combination of a shared multi-touch screen, mobile touch devices, and Microsoft Kinect sensors. We describe cross-device interactions, which use a combination of in-air pointing for social disclosure of commands, targeting and mode setting, combined with touch for command execution and precise gestures. In a formative study, professional developers were positive about the interaction design, and most felt that pointing with hands or devices and forming hand postures are socially acceptable. Users also felt that the techniques adequately disclosed who was interacting and that existing social protocols would help to dictate most permissions, but also felt that our lightweight permission feature helped presenters manage incoming content.

© All rights reserved Bragdon et al. and/or ACM Press


Bragdon, Andrew, Zeleznik, Robert, Reiss, Steven P., Karumuri, Suman, Cheung, William, Kaplan, Joshua, Coleman, Christopher, Adeputra, Ferdi and LaViola, Joseph J. (2010): Code bubbles: a working set-based interface for code understanding and maintenance. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2503-2512.

Developers spend significant time reading and navigating code fragments spread across multiple locations. The file-based nature of contemporary IDEs makes it prohibitively difficult to create and maintain a simultaneous view of such fragments. We propose a novel user interface metaphor for code understanding based on collections of lightweight, editable fragments called bubbles, which form concurrently visible working sets. We present the results of a qualitative usability evaluation, and the results of a quantitative study which indicates Code Bubbles significantly improved code understanding time, while reducing navigation interactions over a widely-used IDE, for two controlled tasks.

© All rights reserved Bragdon et al. and/or their publisher


Zeleznik, Robert, Bragdon, Andrew, Adeputra, Ferdi and Ko, Hsu-Sheng (2010): Hands-on math: a page-based multi-touch and pen desktop for technical work and problem solving. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 17-26.

Students, scientists and engineers have to choose between the flexible, free-form input of pencil and paper and the computational power of Computer Algebra Systems (CAS) when solving mathematical problems. Hands-On Math is a multi-touch and pen-based system which attempts to unify these approaches by providing virtual paper that is enhanced to recognize mathematical notations as a means of providing in situ access to CAS functionality. Pages can be created and organized on a large pannable desktop, and mathematical expressions can be computed, graphed and manipulated using a set of uni- and bi-manual interactions which facilitate rapid exploration by eliminating tedious and error prone transcription tasks. Analysis of a qualitative pilot evaluation indicates the potential of our approach and highlights usability issues with the novel techniques used.

© All rights reserved Zeleznik et al. and/or their publisher


Bragdon, Andrew, Zeleznik, Robert, Williamson, Brian, Miller, Timothy and LaViola, Joseph J. (2009): GestureBar: improving the approachability of gesture-based interfaces. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 2269-2278.

GestureBar is a novel, approachable UI for learning gestural interactions that enables a walk-up-and-use experience which is in the same class as standard menu and toolbar interfaces. GestureBar leverages the familiar, clean look of a common toolbar, but in place of executing commands, richly discloses how to execute commands with gestures, through animated images, detail tips and an out-of-document practice area. GestureBar's simple design is also general enough for use with any recognition technique and for integration with standard, non-gestural UI components. We evaluate GestureBar in a formal experiment showing that users can perform complex, ecologically valid tasks in a purely gestural system without training, introduction, or prior gesture experience when using GestureBar, discovering and learning a high percentage of the gestures needed to perform the tasks optimally, and significantly outperforming a state of the art crib sheet. The relative contribution of the major design elements of GestureBar is also explored. A second experiment shows that GestureBar is preferred to a basic crib sheet and two enhanced crib sheet variations.

© All rights reserved Bragdon et al. and/or ACM Press


Zeleznik, Robert C., Bragdon, Andrew, Liu, Chu-Chi and Forsberg, Andrew S. (2008): Lineogrammer: creating diagrams by drawing. In: Cousins, Steve B. and Beaudouin-Lafon, Michel (eds.) Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology October 19-22, 2008, Monterey, CA, USA. pp. 161-170.


LaViola, Joseph J., Forsberg, Andrew S., Huffman, John and Bragdon, Andrew (2008): The Influence of Head Tracking and Stereo on User Performance with Non-Isomorphic 3D Rotation. In: Liere, Robert van and Mohler, Betty J. (eds.) Proceedings of the 16th Eurographics Symposium on Virtual Environments - EGVE 2008 2008, Eindhoven, The Netherlands. pp. 111-118.


Forsberg, Andrew S., Haley, Graff, Bragdon, Andrew, Levy, Joseph, Fassett, Caleb I., Shean, David, Head, James W., III, Milkovich, Sarah and Duchaineau, Mark A. (2006): Adviser: Immersive Field Work for Planetary Geoscientists. In IEEE Computer Graphics and Applications, 26 (4) pp. 46-54.


Page Information

Page maintainer: The Editorial Team