Number of co-authors: 11
Number of publications with 3 favourite co-authors:
William G. Griswold: 3
Timothy Sohn: 2
Patrick Baudisch: 2
Kevin A. Li's 3 most productive colleagues in number of publications:
Patrick Baudisch: 57
Ken Hinckley: 54
James D. Hollan: 49
Computer programs emerge as the outcome of complex human processes of cognition, communication and negotiation, which serve to establish the meaningful embedding of the computer system in its intended use context.
-- Floyd, 1992, p. 24
Kevin A. Li
Publications by Kevin A. Li (bibliography)
Bales, Elizabeth, Li, Kevin A. and Griswold, William G. (2011): CoupleVIBE: mobile implicit communication to improve awareness for (long-distance) couples. In: Proceedings of ACM CSCW11 Conference on Computer-Supported Cooperative Work 2011. pp. 65-74.
Long-distance couples face considerable communication challenges in their relationships. Unlike collocated couples, long-distance couples lack awareness cues associated with physical proximity and must use technologies such as SMS or telephony to stay in sync. We posit that long-distance couples have needs that are not met by prevailing communication technologies, which require explicit action from the sender as well as the receiver. We built CoupleVIBE to explore the properties of an implicit messaging channel and observe how couples would use such a technology. CoupleVIBE is a mobile application that automatically pushes a user's location-information to her partner's mobile phone via vibrotactile cues. We present qualitative results of a four-week user study, studying how seven couples used CoupleVIBE. A key result is that CoupleVIBE's implicit communication modality operated as a foundation that helps keep couples in sync, with other modalities being brought into play when further interaction was needed.
© All rights reserved Bales et al. and/or their publisher
Cowan, Lisa G. and Li, Kevin A. (2011): ShadowPuppets: supporting collocated interaction with mobile projector phones using hand shadows. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 2707-2716.
Pico projectors attached to mobile phones allow users to view phone content using a large display. However, to provide input to projector phones, users have to look at the device, diverting their attention from the projected image. Additionally, other collocated users have no way of interacting with the device. We present ShadowPuppets, a system that supports collocated interaction with mobile projector phones. ShadowPuppets allows users to cast hand shadows as input to mobile projector phones. Most people understand how to cast hand shadows, which provide an easy input modality. Additionally, they implicitly support collocated usage, as nearby users can cast shadows as input and one user can see and understand another user's hand shadows. We describe the results of three user studies. The first study examines what hand shadows users expect will cause various effects. The second study looks at how users perceive hand shadows, examining what effects they think various hand shadows will cause. Finally, we present qualitative results from a study with our functional prototype and discuss design implications for systems using shadows as input. Our findings suggest that shadow input can provide a natural and intuitive way of interacting with projected interfaces and can support collocated collaboration.
© All rights reserved Cowan and Li and/or their publisher
Sohn, Timothy, Li, Kevin A., Griswold, William G. and Hollan, James D. (2008): A diary study of mobile information needs. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 433-442.
Being mobile influences not only the types of information people seek but also the ways they attempt to access it. Mobile contexts present challenges of changing location and social context, restricted time for information access, and the need to share attentional resources among concurrent activities. Understanding mobile information needs and associated interaction challenges is fundamental to improving designs for mobile phones and related devices. We conducted a two-week diary study to better understand mobile information needs and how they are addressed. Our study revealed that depending on the time and resources available, as well as the situational context, people use diverse and, at times, ingenious ways to obtain needed information. We summarize key findings and discuss design implications for mobile technology.
© All rights reserved Sohn et al. and/or ACM Press
Li, Kevin A., Baudisch, Patrick and Hinckley, Ken (2008): Blindsight: eyes-free access to mobile phones. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 1389-1398.
Many mobile phones integrate services such as personal calendars. Given the social nature of the stored data, however, users often need to access such information as part of a phone conversation. In typical non-headset use, this requires users to interrupt their conversations to look at the screen. We investigate a counter-intuitive solution: to avoid the need for interruption we replace the visual interface with one based on auditory feedback. Surprisingly, this can be done without interfering with the phone conversation. We present blindSight, a prototype application that replaces the traditionally visual in-call menu of a mobile phone. Users interact using the phone keypad, without looking at the screen. BlindSight responds with auditory feedback. This feedback is heard only by the user, not by the person on the other end of the line. We present the results of two user studies of our prototype. The first study verifies that useful keypress accuracy can be obtained for the phone-at-ear position. The second study compares the blindSight system against a visual baseline condition and finds a preference for blindSight.
© All rights reserved Li et al. and/or ACM Press
Li, Kevin A., Baudisch, Patrick, Griswold, William G. and Hollan, James D. (2008): Tapping and rubbing: exploring new dimensions of tactile feedback with voice coil motors. In: Cousins, Steve B. and Beaudouin-Lafon, Michel (eds.) Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology October 19-22, 2008, Monterey, CA, USA. pp. 181-190.
Sohn, Timothy, Li, Kevin A., Lee, Gunny, Smith, Ian E., Scott, James and Griswold, William G. (2005): Place-Its: A Study of Location-Based Reminders on Mobile Phones. In: Beigl, Michael, Intille, Stephen S., Rekimoto, Jun and Tokuda, Hideyuki (eds.) UbiComp 2005 Ubiquitous Computing - 7th International Conference September 11-14, 2005, Tokyo, Japan. pp. 232-250.
Changes to this page (author):
05 Jul 2011: Modified
18 Apr 2011: Modified
12 Jul 2009: Modified
30 May 2009: Modified
12 May 2008: Added
12 May 2008: Modified
Page maintainer: The Editorial Team