Publication statistics

Publication period: 1998-2010
Publication count: 9
Number of co-authors: 22


Number of publications with 3 favourite co-authors:

Andy Jacobs:
Raman Sarin:
James Scott:



Productive colleagues

Mike Sinclair's 3 most productive colleagues, ranked by number of publications:

Mary Czerwinski: 80
Eric Horvitz: 70
Patrick Baudisch: 57





Mike Sinclair


Picture of Mike Sinclair.
Personal Homepage:

Current place of employment:
Microsoft Research

Mr. Michael Sinclair's main research interests include user interface devices for the PC, PDAs, and wearables; microelectromechanical systems for interface applications; digital photography using linear sensors; and haptic interfaces. Before joining Microsoft Research, Mr. Sinclair was an Institute Fellow at Georgia Tech and the director of the Interactive Media Technology Center, where he directed multimedia projects for Atlanta's bid for the 1996 Olympics, four dance technology projects with the Atlanta Ballet, Georgia Power's Economic Development Center, and a number of interactive museum systems. Mike also helped develop a number of surgical simulators, telemedicine systems, digital cameras, haptic displays, 3-D scanners, and 3-D motion-capture systems. He was a founding engineer and manager of future products at IVEX Corporation, where he helped develop visual systems for flight simulation and ASICs for real-time image processing. Mike was also a research engineer at the Georgia Tech Research Institute and a toll terminal engineer at Western Electric Corporation.


Publications by Mike Sinclair (bibliography)


Brush, A. J. Bernheim, Karlson, Amy K., Scott, James, Sarin, Raman, Jacobs, Andy, Bond, Barry, Murillo, Oscar, Hunt, Galen, Sinclair, Mike, Hammil, Kerry and Levi, Steven (2010): User experiences with activity-based navigation on mobile devices. In: Proceedings of the 12th Conference on Human-Computer Interaction with Mobile Devices and Services 2010. pp. 73-82.

We introduce activity-based navigation, which uses human activities derived from sensor data to help people navigate, in particular to retrace a "trail" previously taken by that person or another person. Such trails may include step counts, walking up/down stairs or taking elevators, compass directions, and photos taken along a user's path, in addition to absolute positioning (GPS and maps) when available. To explore the user experience of activity-based navigation, we built Greenfield, a mobile device interface for finding a car. We conducted a ten participant user study comparing users' ability to find cars across three different presentations of activity-based information as well as verbal instructions. Our results show that activity-based navigation can be used for car finding and suggest its promise more generally for supporting navigation tasks. We present lessons for future activity-based navigation interfaces, and motivate further work in this space, particularly in the area of robust activity inference.

© All rights reserved Brush et al. and/or their publisher


Baudisch, Patrick, Sinclair, Mike and Wilson, Andrew (2006): Soap: a pointing device that works in mid-air. In: Proceedings of the ACM Symposium on User Interface Software and Technology 2006. pp. 43-46.

Soap is a pointing device based on hardware found in a mouse, yet works in mid-air. Soap consists of an optical sensor device moving freely inside a hull made of fabric. As the user applies pressure from the outside, the optical sensor moves independent from the hull. The optical sensor perceives this relative motion and reports it as position input. Soap offers many of the benefits of optical mice, such as high-accuracy sensing. We describe the design of a soap prototype and report our experiences with four application scenarios, including a wall display, Windows Media Center, slide presentation, and interactive video games.

© All rights reserved Baudisch et al. and/or ACM Press


Zhong, Lin, Sinclair, Mike and Jha, Niraj K. (2005): A personal-area network of low-power wireless interfacing devices for handhelds: system and hardware design. In: Tscheligi, Manfred, Bernhaupt, Regina and Mihalic, Kristijan (eds.) Proceedings of the 7th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2005 September 19-22, 2005, Salzburg, Austria. pp. 251-254.

Handhelds, such as smart-phones and Pocket PCs, have the potential to become the computing, storage, and connectivity hub, or Digital Hub, for pervasive computing. However, their current interfacing paradigms fall short of achieving this goal. To meet this challenge, we present the system and hardware design for a Bluetooth-based personal-area network (PAN) of low-power wireless interfacing devices. The network consists of a wrist-watch, single-hand single-tap multi-finger keypad, smart speech portal, and GPS receiver. These devices serve a handheld in a synergistic fashion, collectively providing the user with immediate and more natural access to computing power and enabling more and better services.

© All rights reserved Zhong et al. and/or ACM Press


Hinckley, Ken, Pierce, Jeffrey S., Horvitz, Eric and Sinclair, Mike (2005): Foreground and background interaction with sensor-enhanced mobile devices. In ACM Trans. Comput.-Hum. Interact., 12 (1) pp. 31-52.



Hinckley, Ken, Pierce, Jeff, Sinclair, Mike and Horvitz, Eric (2000): Sensing Techniques for Mobile Interaction. In: Ackerman, Mark S. and Edwards, Keith (eds.) Proceedings of the 13th annual ACM symposium on User interface software and technology November 06 - 08, 2000, San Diego, California, United States. pp. 91-100.



Hinckley, Ken and Sinclair, Mike (1999): Touch-Sensing Input Devices. In: Altom, Mark W. and Williams, Marian G. (eds.) Proceedings of the ACM CHI 99 Human Factors in Computing Systems Conference May 15-20, 1999, Pittsburgh, Pennsylvania. pp. 223-230.

We can touch things, and our senses tell us when our hands are touching something. But most computer input devices cannot detect when the user touches or releases the device or some portion of the device. Thus, adding touch sensors to input devices offers many possibilities for novel interaction techniques. We demonstrate the TouchTrackball and the Scrolling TouchMouse, which use unobtrusive capacitance sensors to detect contact from the user's hand without requiring pressure or mechanical actuation of a switch. We further demonstrate how the capabilities of these devices can be matched to an implicit interaction technique, the On-Demand Interface, which uses the passive information captured by touch sensors to fade in or fade out portions of a display depending on what the user is doing; a second technique uses explicit, intentional interaction with touch sensors for enhanced scrolling. We present our new devices in the context of a simple taxonomy of tactile input technologies. Finally, we discuss the properties of touch-sensing as an input channel in general.

© All rights reserved Hinckley and Sinclair and/or ACM Press


Hinckley, Ken, Sinclair, Mike, Hanson, Erik, Szeliski, Richard and Conway, Matthew (1999): The VideoMouse: A Camera-Based Multi-Degree-of-Freedom Input Device. In: Zanden, Brad Vander and Marks, Joe (eds.) Proceedings of the 12th annual ACM symposium on User interface software and technology November 07 - 10, 1999, Asheville, North Carolina, United States. pp. 103-112.

The VideoMouse is a mouse that uses a camera as its input sensor. A real-time vision algorithm determines the six degree-of-freedom mouse posture, consisting of 2D motion, tilt in the forward/back and left/right axes, rotation of the mouse about its vertical axis, and some limited height sensing. Thus, a familiar 2D device can be extended for three-dimensional manipulation, while remaining suitable for standard 2D GUI tasks. We describe techniques for mouse functionality, 3D manipulation, navigating large 2D spaces, and using the camera for lightweight scanning tasks.

© All rights reserved Hinckley et al. and/or ACM Press


Hinckley, Ken, Czerwinski, Mary and Sinclair, Mike (1998): Interaction and Modeling Techniques for Desktop Two-Handed Input. In: Mynatt, Elizabeth D. and Jacob, Robert J. K. (eds.) Proceedings of the 11th annual ACM symposium on User interface software and technology November 01 - 04, 1998, San Francisco, California, United States. pp. 49-58.

We describe input devices and two-handed interaction techniques to support map navigation tasks. We discuss several design variations and user testing of two-handed navigation techniques, including puck and stylus input on a Wacom tablet, as well as a novel design incorporating a touchpad (for the nonpreferred hand) and a mouse (for the preferred hand). To support the latter technique, we introduce a new input device, the TouchMouse, which is a standard mouse augmented with a pair of one-bit touch sensors, one for the palm and one for the index finger. Finally, we propose several enhancements to Buxton's three-state model of graphical input and extend this model to encompass two-handed input transactions as well.

© All rights reserved Hinckley et al. and/or ACM Press


Page Information

Page maintainer: The Editorial Team