Number of co-authors: 41
Number of publications with 3 favourite co-authors: Dinesh Manocha (3), Nuria Oliver (2), Kate Lund (2)
Andrew Wilson's 3 most productive colleagues, by number of publications: Ming C. Lin (58), Patrick Baudisch (57), Dinesh Manocha (57)
Publications by Andrew Wilson (bibliography)
Wilson, Andrew, Benko, Hrvoje, Izadi, Shahram and Hilliges, Otmar (2012): Steerable augmented reality with the beamatron. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 413-422.
Steerable displays use a motorized platform to orient a projector to display graphics at any point in the room. Often a camera is included to recognize markers and other objects, as well as user gestures in the display volume. Such systems can be used to superimpose graphics onto the real world, and so are useful in a number of augmented reality and ubiquitous computing scenarios. We contribute the Beamatron, which advances steerable displays by drawing on recent progress in depth camera-based interactions. The Beamatron consists of a computer-controlled pan and tilt platform on which is mounted a projector and Microsoft Kinect sensor. While much previous work with steerable displays deals primarily with projecting corrected graphics onto a discrete set of static planes, we describe computational techniques that enable reasoning in 3D using live depth data. We show two example applications that are enabled by the unique capabilities of the Beamatron: an augmented reality game in which a player can drive a virtual toy car around a room, and a ubiquitous computing demo that uses speech and gesture to move projected graphics throughout the room.
© All rights reserved Wilson et al. and/or ACM Press
Holz, Christian and Wilson, Andrew (2011): Data miming: inferring spatial object descriptions from human gesture. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 811-820.
Speakers often use hand gestures when talking about or describing physical objects. Such gesture is particularly useful when the speaker is conveying distinctions of shape that are difficult to describe verbally. We present data miming -- an approach to making sense of gestures as they are used to describe concrete physical objects. We first observe participants as they use gestures to describe real-world objects to another person. From these observations, we derive the data miming approach, which is based on a voxel representation of the space traced by the speaker's hands over the duration of the gesture. In a final proof-of-concept study, we demonstrate a prototype implementation of matching the input voxel representation to select among a database of known physical objects.
© All rights reserved Holz and Wilson and/or their publisher
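The voxel matching described in the abstract above can be illustrated with a minimal sketch. The grid resolution, the use of Jaccard overlap as the similarity measure, and all function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def voxelize(points, grid=16, extent=1.0):
    """Quantize 3D points (N x 3, coordinates in [0, extent)) into a boolean occupancy grid."""
    idx = np.clip((np.asarray(points) / extent * grid).astype(int), 0, grid - 1)
    vox = np.zeros((grid, grid, grid), dtype=bool)
    vox[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return vox

def jaccard(a, b):
    """Overlap similarity between two occupancy grids: |A ∩ B| / |A ∪ B|."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

def best_match(trace_vox, database):
    """Return the database key whose voxel model best overlaps the gesture trace."""
    return max(database, key=lambda k: jaccard(trace_vox, database[k]))
```

A gesture trace voxelized this way can be compared against each stored object model, with the highest-overlap entry selected as the described object.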
Lund, Kate, Coulton, Paul and Wilson, Andrew (2011): Free All Monsters!: a context-aware location based game. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 675-678.
Free All Monsters! is a novel location-based mobile game that incorporates user-generated content in an attempt to broaden its appeal by encouraging creativity. An online portal allows participants to create content, which is then used to populate the game. The game was recently launched on the iPhone App Store and is designed as a family-oriented activity.
© All rights reserved Lund et al. and/or ACM Press
Coulton, Paul, Lund, Kate and Wilson, Andrew (2010): Harnessing player creativity to broaden the appeal of location based games. In: Proceedings of the HCI10 Conference on People and Computers XXIV 2010. pp. 143-150.
Despite being the subject of considerable research effort, location-based games have in general failed to attain the popularity and longevity of similar activities such as geocaching or orienteering. This leads us to ask: do the games designed thus far take too much design inspiration from console and PC games, producing games that are too inflexible and fail to support the types of player behaviour that have emerged in geocaching? Using a design inspired by the player engagement evident in geocaching, we present an empirical study of the design and user experience of the mobile augmented reality (MAR) game Free All Monsters!. The results highlight that enabling user creativity and accommodating the varied motivations for playing such games can be successfully incorporated into the design and operation of location-based games, and in particular can provide a fun outdoor family activity.
© All rights reserved Coulton et al. and/or BCS
Baudisch, Patrick, Sinclair, Mike and Wilson, Andrew (2006): Soap: a pointing device that works in mid-air. In: Proceedings of the ACM Symposium on User Interface Software and Technology 2006. pp. 43-46.
Soap is a pointing device based on hardware found in a mouse, yet it works in mid-air. Soap consists of an optical sensor device moving freely inside a hull made of fabric. As the user applies pressure from the outside, the optical sensor moves independently of the hull. The optical sensor perceives this relative motion and reports it as position input. Soap offers many of the benefits of optical mice, such as high-accuracy sensing. We describe the design of a soap prototype and report our experiences with four application scenarios, including a wall display, Windows Media Center, slide presentation, and interactive video games.
© All rights reserved Baudisch et al. and/or ACM Press
Wilson, Andrew and Brannon, Rebecca M. (2005): Exploring 2D Tensor Fields Using Stress Nets. In: 16th IEEE Visualization Conference VIS 2005 23-28 October, 2005, Minneapolis, MN, USA. p. 2.
Wilson, Andrew and Shafer, Steven (2003): XWand: UI for intelligent spaces. In: Cockton, Gilbert and Korhonen, Panu (eds.) Proceedings of the ACM CHI 2003 Human Factors in Computing Systems Conference April 5-10, 2003, Ft. Lauderdale, Florida, USA. pp. 545-552.
Wilson, Andrew and Pham, Hubert (2003): Pointing in Intelligent Environments with the WorldCursor. In: Proceedings of IFIP INTERACT03: Human-Computer Interaction 2003, Zurich, Switzerland. p. 495.
Wilson, Andrew and Oliver, Nuria (2003): GWindows: robust stereo vision for gesture-based control of windows. In: Oviatt, Sharon L., Darrell, Trevor, Maybury, Mark T. and Wahlster, Wolfgang (eds.) Proceedings of the 5th International Conference on Multimodal Interfaces - ICMI 2003 November 5-7, 2003, Vancouver, British Columbia, Canada. pp. 211-218.
Perceptual user interfaces promise modes of fluid computer-human interaction that complement the mouse and keyboard, and have been especially motivated in non-desktop scenarios, such as kiosks or smart rooms. Such interfaces, however, have been slow to see use for a variety of reasons, including the computational burden they impose, a lack of robustness outside the laboratory, unreasonable calibration demands, and a shortage of sufficiently compelling applications. We address these difficulties by using a fast stereo vision algorithm for recognizing hand positions and gestures. Our system uses two inexpensive video cameras to extract depth information. This depth information enhances automatic object detection and tracking robustness, and may also be used in applications. We demonstrate the algorithm in combination with speech recognition to perform several basic window management tasks, report on a user study probing the ease of using the system, and discuss the implications of such a system for future user interfaces.
© All rights reserved Wilson and Oliver and/or their publisher
Wilson, Andrew, Mayer-Patel, Ketan and Manocha, Dinesh (2001): Spatially-encoded far-field representations for interactive walkthroughs. In: ACM Multimedia 2001 2001. pp. 348-357.
Maddocks, Alan P., Sher, Willy D. and Wilson, Andrew (2000): A Web-based personal and professional development tool to promote life-long learning within the construction industry. In Educational Technology & Society, 3 (1) .
Wilson, Andrew, Lin, Ming C., Yeo, Boon-Lock, Yeung, Minerva M. and Manocha, Dinesh (2000): A video-based rendering acceleration algorithm for interactive walkthroughs. In: ACM Multimedia 2000 2000. pp. 75-83.
Johnson, Michael Patrick, Wilson, Andrew, Blumberg, Bruce, Kline, Christopher and Bobick, Aaron (1999): Sympathetic Interfaces: Using a Plush Toy to Direct Synthetic Characters. In: Altom, Mark W. and Williams, Marian G. (eds.) Proceedings of the ACM CHI 99 Human Factors in Computing Systems Conference May 15-20, 1999, Pittsburgh, Pennsylvania. pp. 152-158.
We introduce the concept of a sympathetic interface for controlling an animated synthetic character in a 3D virtual environment. A plush doll embedded with wireless sensors is used to manipulate the virtual character in an iconic and intentional manner. The interface extends from the novel physical input device through interpretation of sensor data to the behavioral "brain" of the virtual character. We discuss the design of the interface and focus on its latest instantiation in the Swamped! exhibit at SIGGRAPH '98. We also present what we learned from hundreds of casual users, who ranged from young children to adults.
© All rights reserved Johnson et al. and/or ACM Press
Aliaga, Daniel G., Cohen, Jonathan D., Wilson, Andrew, Baker, Eric, Zhang, Hansong, Erikson, Carl, Hoff, Kenneth E. III, Hudson, Thomas C., Stürzlinger, Wolfgang, Bastos, Rui, Whitton, Mary C., Brooks, Frederick P. Jr. and Manocha, Dinesh (1999): MMR: an interactive massive model rendering system using geometric and image-based acceleration. In: SI3D 1999 1999. pp. 199-206.
Horan, B., Rector, A. L., Sneath, E. L., Goble, C. A., Howkins, T. J., Kay, S., Nowlan, W. A. and Wilson, Andrew (1990): Supporting a Humanly Impossible Task: The Clinical Human Computer Environment. In: Diaper, Dan, Gilmore, David J., Cockton, Gilbert and Shackel, Brian (eds.) INTERACT 90 - 3rd IFIP International Conference on Human-Computer Interaction August 27-31, 1990, Cambridge, UK. pp. 247-252.
Medicine has proved a fruitful field for developing knowledge-based systems. Paradoxically, the General Practice medical environment has a number of characteristics which make the introduction of such systems difficult. Attempts to produce systems for other professional users -- e.g. architects, lawyers, and executives -- have had somewhat similar experiences. However, doctors work under severe time pressure in a complex social environment. The neatly confined problems most tractable to expert systems have limited relevance to doctors' decision making in practical situations. Furthermore, doctors already have a well-developed system of sharing expertise. Extensive user-centred design studies have led us to propose an alternative model for augmenting doctors' performance. Rather than an expert system, we propose an intelligent human-computer environment for maintaining medical records and 'throwing light' on the complex data of patient histories.
© All rights reserved Horan et al. and/or North-Holland