Number of co-authors: 42

Number of publications with 3 favourite co-authors:
Shahram Izadi: 9
Andreas Butz: 7
David Kim: 5

Otmar Hilliges's 3 most productive colleagues in number of publications:
Abigail Sellen: 81
Shahram Izadi: 50
Andreas Butz: 48
Publications by Otmar Hilliges (bibliography)
Freeman, Dustin, Hilliges, Otmar, Sellen, Abigail, O'Hara, Kenton, Izadi, Shahram and Wood, Kenneth (2012): The role of physical controllers in motion video gaming. In: Proceedings of DIS12 Designing Interactive Systems 2012. pp. 701-710.
Systems that detect the unaugmented human body allow players to interact without using a physical controller. But how is interaction altered by the absence of a physical input device? What is the impact on game performance, on a player's expectation of their ability to control the game, and on their game experience? In this study, we investigate these issues in the context of a table tennis video game. The results show that the impact of holding a physical controller, or indeed of the fidelity of that controller, does not appear in simple measures of performance. Rather, the difference between controllers is a function of the responsiveness of the game being controlled, as well as other factors to do with expectations, real world game experience and social context.
© All rights reserved Freeman et al. and/or ACM Press
Kim, David, Hilliges, Otmar, Izadi, Shahram, Butler, Alex D., Chen, Jiawen, Oikonomidis, Iason and Olivier, Patrick (2012): Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 167-176.
Digits is a wrist-worn sensor that recovers the full 3D pose of the user's hand. This enables a variety of freehand interactions on the move. The system targets mobile settings, and is specifically designed to be low-power and easily reproducible using only off-the-shelf hardware. The electronics are self-contained on the user's wrist, but optically image the entirety of the user's hand. This data is processed using a new pipeline that robustly samples key parts of the hand, such as the tips and lower regions of each finger. These sparse samples are fed into new kinematic models that leverage the biomechanical constraints of the hand to recover the 3D pose of the user's hand. The proposed system works without the need for full instrumentation of the hand (for example using data gloves), additional sensors in the environment, or depth cameras which are currently prohibitive for mobile scenarios due to power and form-factor considerations. We demonstrate the utility of Digits for a variety of application scenarios, including 3D spatial interaction with mobile devices, eyes-free interaction on-the-move, and gaming. We conclude with a quantitative and qualitative evaluation of our system, and discussion of strengths, limitations and future work.
© All rights reserved Kim et al. and/or ACM Press
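The kinematic recovery described in the Digits abstract can be illustrated with a toy sketch. This is not the paper's model: the function names and link lengths are invented, the finger is simplified to a plane, and the only element taken from the hand-modelling literature is the common biomechanical coupling that approximates the unobserved distal joint from the proximal interphalangeal angle.

```python
import math

# Hypothetical finger-link lengths in millimetres (illustrative only).
PROXIMAL, MIDDLE, DISTAL = 40.0, 25.0, 18.0

def fingertip_position(mcp, pip):
    """Planar forward kinematics for one finger.

    The distal joint (DIP) is not observed directly; like many hand
    models, this sketch approximates it from the PIP flexion angle
    (DIP ~= 2/3 * PIP), a widely used biomechanical constraint.
    All angles are flexion angles in radians.
    """
    dip = (2.0 / 3.0) * pip  # biomechanical coupling constraint
    x = y = 0.0
    angle = 0.0
    for length, joint in ((PROXIMAL, mcp), (MIDDLE, pip), (DISTAL, dip)):
        angle += joint          # accumulate flexion along the chain
        x += length * math.cos(angle)
        y += length * math.sin(angle)
    return x, y

# A fully straightened finger places the tip at the summed link lengths.
print(fingertip_position(0.0, 0.0))  # (83.0, 0.0)
```

Constraints like this are what let a sparse set of observed samples pin down a full pose: fewer independent joint angles need to be estimated than the hand's raw degrees of freedom suggest.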
Wilson, Andrew, Benko, Hrvoje, Izadi, Shahram and Hilliges, Otmar (2012): Steerable augmented reality with the beamatron. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 413-422.
Steerable displays use a motorized platform to orient a projector to display graphics at any point in the room. Often a camera is included to recognize markers and other objects, as well as user gestures in the display volume. Such systems can be used to superimpose graphics onto the real world, and so are useful in a number of augmented reality and ubiquitous computing scenarios. We contribute the Beamatron, which advances steerable displays by drawing on recent progress in depth camera-based interactions. The Beamatron consists of a computer-controlled pan and tilt platform on which is mounted a projector and Microsoft Kinect sensor. While much previous work with steerable displays deals primarily with projecting corrected graphics onto a discrete set of static planes, we describe computational techniques that enable reasoning in 3D using live depth data. We show two example applications that are enabled by the unique capabilities of the Beamatron: an augmented reality game in which a player can drive a virtual toy car around a room, and a ubiquitous computing demo that uses speech and gesture to move projected graphics throughout the room.
© All rights reserved Wilson et al. and/or ACM Press
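To make the steering concrete, here is a toy geometric sketch of how pan and tilt angles might be computed to aim a platform-mounted projector at a 3D point. The coordinate convention and function name are assumptions for illustration, not the Beamatron's actual control code.

```python
import math

def aim_pan_tilt(target, platform=(0.0, 0.0, 0.0)):
    """Return (pan, tilt) in radians that point a pan/tilt head
    mounted at `platform` toward `target`, using a right-handed
    frame with x right, y up, z forward (an assumed convention)."""
    dx = target[0] - platform[0]
    dy = target[1] - platform[1]
    dz = target[2] - platform[2]
    pan = math.atan2(dx, dz)                   # rotation about the vertical axis
    tilt = math.atan2(dy, math.hypot(dx, dz))  # elevation above the horizon
    return pan, tilt

# A point one metre straight ahead needs no rotation at all;
# a point one metre ahead and one metre to the right needs a 45° pan.
print(aim_pan_tilt((0.0, 0.0, 1.0)))
print(aim_pan_tilt((1.0, 0.0, 1.0)))
```

With live depth data, the same computation can target any reconstructed 3D point in the room rather than a fixed set of calibrated planes.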
Izadi, Shahram, Kim, David, Hilliges, Otmar, Molyneaux, David, Newcombe, Richard, Kohli, Pushmeet, Shotton, Jamie, Hodges, Steve, Freeman, Dustin, Davison, Andrew and Fitzgibbon, Andrew (2011): KinectFusion: real-time 3D reconstruction and interaction using a moving depth camera. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 559-568.
KinectFusion enables a user holding and moving a standard Kinect camera to rapidly create detailed 3D reconstructions of an indoor scene. Only the depth data from Kinect is used to track the 3D pose of the sensor and reconstruct geometrically precise 3D models of the physical scene in real-time. The capabilities of KinectFusion, as well as the novel GPU-based pipeline, are described in full. Uses of the core system for low-cost handheld scanning, and geometry-aware augmented reality and physics-based interactions are shown. Novel extensions to the core GPU pipeline demonstrate object segmentation and user interaction directly in front of the sensor, without degrading camera tracking or reconstruction. These extensions are used to enable real-time multi-touch interactions anywhere, allowing any planar or non-planar reconstructed physical surface to be appropriated for touch.
© All rights reserved Izadi et al. and/or ACM Press
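The real-time reconstruction described above rests on fusing per-frame depth measurements into a voxel grid of truncated signed distances. A minimal sketch of the per-voxel fusion step, assuming a weighted running average in the style of Curless and Levoy (names and the scalar interface are illustrative, not the KinectFusion GPU pipeline):

```python
# Each voxel stores a truncated signed distance to the nearest surface
# and a confidence weight; every new depth frame contributes one
# (possibly noisy) signed-distance observation per voxel.

TRUNCATION = 0.1  # metres; observations are clamped to +/- this value

def update_voxel(tsdf, weight, signed_dist, max_weight=64.0):
    """Fuse one signed-distance observation into a voxel using a
    weighted running average; capping the weight keeps the model
    responsive to scene changes."""
    d = max(-TRUNCATION, min(TRUNCATION, signed_dist))  # truncate
    new_tsdf = (tsdf * weight + d) / (weight + 1.0)
    new_weight = min(weight + 1.0, max_weight)
    return new_tsdf, new_weight

# Two observations of roughly the same surface average out noise.
v = update_voxel(0.0, 0.0, 0.05)   # -> (0.05, 1.0)
v = update_voxel(*v, 0.07)         # -> (0.06, 2.0)
print(v)
```

The surface itself is then extracted as the zero crossing of the fused distance field, which is what makes the reconstructed geometry precise enough for the physics and touch interactions the abstract mentions.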
Butler, Alex, Hilliges, Otmar, Izadi, Shahram, Hodges, Steve, Molyneaux, David, Kim, David and Kong, Danny (2011): Vermeer: direct interaction with a 360° viewable 3D display. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 569-576.
We present Vermeer, a novel interactive 360° viewable 3D display. Like prior systems in this area, Vermeer provides viewpoint-corrected, stereoscopic 3D graphics to simultaneous users, 360° around the display, without the need for eyewear or other user instrumentation. Our goal is to overcome an issue inherent in these prior systems which -- typically due to moving parts -- restrict interactions to outside the display volume. Our system leverages a known optical illusion to demonstrate, for the first time, how users can reach into and directly touch 3D objects inside the display volume. Vermeer is intended to be a new enabling technology for interaction, and we therefore describe our hardware implementation in full, focusing on the challenges of combining this optical configuration with an existing approach for creating a 360° viewable 3D display. Initially we demonstrate direct in-volume interaction by sensing user input with a Kinect camera placed above the display. However, by exploiting the properties of the optical configuration, we also demonstrate novel prototypes for fully integrated input sensing alongside simultaneous display. We conclude by discussing limitations, implications for interaction, and ideas for future work.
© All rights reserved Butler et al. and/or ACM Press
Kirk, David S., Izadi, Shahram, Sellen, Abigail, Taylor, Stuart, Banks, Richard and Hilliges, Otmar (2010): Opening up the family archive. In: Proceedings of ACM CSCW10 Conference on Computer-Supported Cooperative Work 2010. pp. 261-270.
The Family Archive device is an interactive multi-touch tabletop technology with integrated capture facility for the archiving of sentimental artefacts and memorabilia. It was developed as a technology probe to help us open up current family archiving practices and to explore family archiving in situ. We detail the deployment and study of three of these devices in family homes and discuss how deploying a new, potentially disruptive, technology can foreground the social relations and organizing systems in domestic life. This in turn facilitates critical reflection on technology design.
© All rights reserved Kirk et al. and/or their publisher
Hilliges, Otmar and Kirk, David Shelby (2009): Getting sidetracked: display design and occasioning photo-talk with the photohelix. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 1733-1736.
In this paper we discuss some of our recent research work designing tabletop interfaces for co-located photo sharing. We draw particular attention to a specific feature of an interface design, which we have observed over an extensive number of uses, as facilitating an under-reported but nonetheless intriguing aspect of the photo-sharing experience -- namely the process of 'getting sidetracked'. Through a series of vignettes of interaction during photo-sharing sessions we demonstrate how users of our tabletop photoware system used peripheral presentation of topically incoherent photos to artfully initiate new photo-talk sequences in on-going discourse. From this we draw implications for the design of tabletop photo applications, and for the experiential analysis of such devices.
© All rights reserved Hilliges and Kirk and/or ACM Press
Hancock, Mark, Hilliges, Otmar, Collins, Christopher, Baur, Dominikus and Carpendale, Sheelagh (2009): Exploring tangible and direct touch interfaces for manipulating 2D and 3D information on a digital table. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 77-84.
On traditional tables, people often manipulate a variety of physical objects, both 2D in nature (e.g., paper) and 3D in nature (e.g., books, pens, models, etc.). Current advances in hardware technology for tabletop displays introduce the possibility of mimicking these physical interactions through direct-touch or tangible user interfaces. While both promise intuitive physical interaction, they are rarely discussed in combination in the literature. In this paper, we present a study that explores the advantages and disadvantages of tangible and touch interfaces, specifically in relation to one another. We discuss our results in terms of how effective each technique was for accomplishing both a 3D object manipulation task and a 2D information visualization exploration task. Results suggest that people can more quickly move and rotate objects in 2D with our touch interaction, but more effectively navigate the visualization using tangible interaction. We discuss how our results can be used to inform future designs of tangible and touch interaction.
© All rights reserved Hancock et al. and/or their publisher
Hilliges, Otmar, Izadi, Shahram, Wilson, Andrew D., Hodges, Steve, Garcia-Mendoza, Armando and Butz, Andreas (2009): Interactions in the air: adding further depth to interactive tabletops. In: Proceedings of the ACM Symposium on User Interface Software and Technology 2009. pp. 139-148.
Although interactive surfaces have many unique and compelling qualities, the interactions they support are by their very nature bound to the display surface. In this paper we present a technique for users to seamlessly switch between interacting on the tabletop surface to above it. Our aim is to leverage the space above the surface in combination with the regular tabletop display to allow more intuitive manipulation of digital content in three dimensions. Our goal is to design a technique that closely resembles the ways we manipulate physical objects in the real-world; conceptually, allowing virtual objects to be 'picked up' off the tabletop surface in order to manipulate their three dimensional position or orientation. We chart the evolution of this technique, implemented on two rear projection-vision tabletops. Both use special projection screen materials to allow sensing at significant depths beyond the display. Existing and new computer vision techniques are used to sense hand gestures and postures above the tabletop, which can be used alongside more familiar multi-touch interactions. Interacting above the surface in this way opens up many interesting challenges. In particular it breaks the direct interaction metaphor that most tabletops afford. We present a novel shadow-based technique to help alleviate this issue. We discuss the strengths and limitations of our technique based on our own observations and initial user feedback, and provide various insights from comparing, and contrasting, our tabletop implementations.
© All rights reserved Hilliges et al. and/or their publisher
Hilliges, Otmar, Kim, David and Izadi, Shahram (2008): Creating malleable interactive surfaces using liquid displacement sensing. In: Third IEEE International Workshop on Tabletops and Interactive Surfaces Tabletop 2008 October 1-3, 2008, Amsterdam, The Netherlands. pp. 157-160.
Terrenghi, Lucia, Kirk, David, Richter, Hendrik, Krämer, Sebastian, Hilliges, Otmar and Butz, Andreas (2008): Physical handles at the interactive surface: exploring tangibility and its benefits. In: Levialdi, Stefano (ed.) AVI 2008 - Proceedings of the working conference on Advanced Visual Interfaces May 28-30, 2008, Napoli, Italy. pp. 138-145.
Wilson, Andrew D., Izadi, Shahram, Hilliges, Otmar, Garcia-Mendoza, Armando and Kirk, David (2008): Bringing physics to the surface. In: Cousins, Steve B. and Beaudouin-Lafon, Michel (eds.) Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology October 19-22, 2008, Monterey, CA, USA. pp. 67-76.
Hilliges, Otmar, Terrenghi, Lucia, Boring, Sebastian, Kim, David, Richter, Hendrik and Butz, Andreas (2007): Designing for collaborative creative problem solving. In: Proceedings of the 2007 Conference on Creativity and Cognition 2007, Washington DC, USA. pp. 137-146.
Collaborative creativity is traditionally supported by formal techniques, such as brainstorming. These techniques improve the idea-generation process by creating group synergies, but also suffer from a number of negative effects. Current electronic tools to support collaborative creativity overcome some of these problems, but introduce new ones, by either losing the benefits of face-to-face communication or the immediacy of simultaneous contribution. Using an interactive environment as a test bed, we are investigating how collaborative creativity can be supported electronically while maintaining face-to-face communication. What are the design factors influencing such a system? We have designed a brainstorming application that uses an interactive table and a large wall display, and compared the results of using it to traditional paper-based brainstorming in a user study with 30 participants. From the considerations that went into the design and the observations during the study we derive a number of design guidelines for collaborative systems in interactive environments.
© All rights reserved Hilliges et al. and/or ACM Press
Hilliges, Otmar, Baur, Dominikus and Butz, Andreas (2007): Photohelix: Browsing, Sorting and Sharing Digital Photo Collections. In: Second IEEE International Workshop on Horizontal Interactive Human-Computer Systems Tabletop 2007 October 10-12, 2007, Newport, Rhode Island, USA. pp. 87-94.
Terrenghi, Lucia, Hilliges, Otmar and Butz, Andreas (2007): Kitchen stories: sharing recipes with the Living Cookbook. In Personal and Ubiquitous Computing, 11 (5) pp. 409-414.
Hilliges, Otmar, Kunath, Peter, Pryakhin, Alexey, Butz, Andreas and Kriegel, Hans-Peter (2007): Browsing and Sorting Digital Pictures Using Automatic Image Classification and Quality Analysis. In: Jacko, Julie A. (ed.) HCI International 2007 - 12th International Conference - Part III 2007. pp. 882-891.
Boring, Sebastian, Hilliges, Otmar and Butz, Andreas (2007): A Wall-Sized Focus Plus Context Display. In: PerCom 2007 - Fifth Annual IEEE International Conference on Pervasive Computing and Communications 19-23 March, 2007, White Plains, New York, USA. pp. 161-170.
Hilliges, Otmar, Sandor, Christian and Klinker, Gudrun (2006): Interactive prototyping for ubiquitous augmented reality user interfaces. In: Proceedings of the 2006 International Conference on Intelligent User Interfaces 2006. pp. 285-287.
User interfaces for ubiquitous augmented reality incorporate a wide variety of concepts such as multi-modal, multi-user, multi-device aspects and new input/output devices. In this paper we present a twofold approach that consists of an execution engine for ubiquitous augmented reality user interfaces and a runtime development environment that enables rapid prototyping and live system adaptation for such advanced user interfaces.
© All rights reserved Hilliges et al. and/or ACM Press
Page maintainer: The Editorial Team