Publication statistics

Publication period: 2002-2008
Publication count: 5
Number of co-authors: 9



Co-authors

Number of publications with Johnny C. Lee's 3 most frequent co-authors:

Scott E. Hudson: 5
Paul H. Dietz: 3
Jodi Forlizzi: 2


Productive colleagues

Johnny C. Lee's 3 most productive colleagues, by number of publications:

Scott E. Hudson: 113
Jodi Forlizzi: 90
Ramesh Raskar: 26


Johnny C. Lee

 

Publications by Johnny C. Lee (bibliography)

2008

Lee, Johnny C., Hudson, Scott E. and Tse, Edward (2008): Foldable interactive displays. In: Cousins, Steve B. and Beaudouin-Lafon, Michel (eds.) Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, October 19-22, 2008, Monterey, CA, USA. pp. 287-290. Available online

2005

Lee, Johnny C., Hudson, Scott E., Summet, Jay W. and Dietz, Paul H. (2005): Moveable interactive projected displays using projector based tracking. In: Proceedings of the 2005 ACM Symposium on User Interface Software and Technology 2005. pp. 63-72. Available online

Video projectors have typically been used to display images on surfaces whose geometric relationship to the projector remains constant, such as walls or pre-calibrated surfaces. In this paper, we present a technique for projecting content onto moveable surfaces that adapts to the motion and location of the surface to simulate an active display. This is accomplished using a projector based location tracking technique. We use light sensors embedded into the moveable surface and project low-perceptibility Gray-coded patterns to first discover the sensor locations, and then incrementally track them at interactive rates. We describe how to reduce the perceptibility of tracking patterns, achieve interactive tracking rates, use motion modeling to improve tracking performance, and respond to sensor occlusions. A group of tracked sensors can define quadrangles for simulating moveable displays while single sensors can be used as control inputs. By unifying the tracking and display technology into a single mechanism, we can substantially reduce the cost and complexity of implementing applications that combine motion tracking and projected imagery.

© All rights reserved Lee et al. and/or ACM Press
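The tracking step described in this abstract hinges on Gray-coded binary patterns: each projected frame encodes one bit of a projector-pixel coordinate, so an embedded light sensor can recover where it sits simply by recording which frames light it up. The sketch below (Python, not the authors' code; the sensor readings are simulated) illustrates that standard encode/decode logic for a single coordinate axis.

# Minimal sketch (not the authors' implementation): how a light sensor embedded
# in a moveable surface could recover its projector-pixel coordinate from a
# sequence of Gray-coded binary patterns. The Gray-code math is standard; the
# sensor-reading interface here is a simulated stand-in.

def gray_encode(n: int) -> int:
    """Convert a binary number to its reflected Gray code."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Convert a reflected Gray code back to a binary number."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def pattern_bit(column: int, bit: int, num_bits: int) -> bool:
    """True if projector column `column` is lit in the pattern for `bit`
    (bit 0 = most significant), i.e. one frame of the Gray-code sequence."""
    return bool((gray_encode(column) >> (num_bits - 1 - bit)) & 1)

def decode_column(readings: list) -> int:
    """Recover the projector column a sensor sits under from the on/off
    values it observed across the full pattern sequence."""
    g = 0
    for lit in readings:
        g = (g << 1) | int(lit)
    return gray_decode(g)

if __name__ == "__main__":
    NUM_BITS = 10                      # 10 patterns resolve 1024 columns
    true_column = 587                  # where the (simulated) sensor really is
    readings = [pattern_bit(true_column, b, NUM_BITS) for b in range(NUM_BITS)]
    assert decode_column(readings) == true_column
    print("sensor decoded column", decode_column(readings))

A second, identical pass with row-encoding patterns would give the vertical coordinate; incremental tracking and perceptibility reduction, as the abstract notes, are handled by the paper's own techniques and are not sketched here.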

2004

Lee, Johnny C., Dietz, Paul H., Maynes-Aminzade, Dan, Raskar, Ramesh and Hudson, Scott E. (2004): Automatic projector calibration with embedded light sensors. In: Proceedings of the 2004 ACM Symposium on User Interface Software and Technology 2004. pp. 123-126. Available online

Projection technology typically places several constraints on the geometric relationship between the projector and the projection surface to obtain an undistorted, properly sized image. In this paper we describe a simple, robust, fast, and low-cost method for automatic projector calibration that eliminates many of these constraints. We embed light sensors in the target surface, project Gray-coded binary patterns to discover the sensor locations, and then prewarp the image to accurately fit the physical features of the projection surface. This technique can be expanded to automatically stitch multiple projectors, calibrate onto non-planar surfaces for object decoration, and provide a method for simple geometry acquisition.

© All rights reserved Lee et al. and/or ACM Press
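Once the embedded sensors have reported their projector-pixel coordinates, prewarping the image for a planar surface reduces to a standard homography fit. The sketch below (Python with OpenCV; the sensor coordinates, image path, and projector resolution are made-up example values, and this is not the authors' implementation) shows the general idea for a four-corner surface.

# Sketch only: prewarp content onto a surface whose four corner sensors have
# reported their projector-pixel coordinates (e.g. recovered from Gray-code
# patterns as above). All concrete values below are hypothetical examples.
import cv2
import numpy as np

PROJ_W, PROJ_H = 1024, 768            # projector framebuffer size (assumed)

# Projector-pixel positions of the surface's corner sensors, in the order
# top-left, top-right, bottom-right, bottom-left (hypothetical readings).
sensor_px = np.float32([[212, 130], [801, 158], [790, 600], [198, 575]])

# The content we want to appear on the physical surface (assumed file path).
content = cv2.imread("content.png")
if content is None:
    raise SystemExit("content.png not found")
h, w = content.shape[:2]
content_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])

# Homography mapping the content's corners onto the detected sensor positions,
# then warp the content into the projector framebuffer so it lands on the surface.
H = cv2.getPerspectiveTransform(content_corners, sensor_px)
framebuffer = cv2.warpPerspective(content, H, (PROJ_W, PROJ_H))

cv2.imwrite("prewarped.png", framebuffer)  # this frame would be sent to the projector

The multi-projector stitching and non-planar cases mentioned in the abstract go beyond this single-homography illustration.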


Lee, Johnny C., Avrahami, Daniel, Hudson, Scott E., Forlizzi, Jodi, Dietz, Paul H. and Leigh, Darren (2004): The calder toolkit: wired and wireless components for rapidly prototyping interactive devices. In: Proceedings of DIS04: Designing Interactive Systems: Processes, Practices, Methods, & Techniques 2004. pp. 167-175. Available online

Toolkits and other tools have dramatically reduced the time and technical expertise needed to design and implement graphical user interfaces (GUIs), allowing high-quality, iterative, user-centered design to become a common practice. Unfortunately, the generation of functioning prototypes for physical interactive devices has not had similar support -- it still requires substantial time and effort by individuals with highly specialized skills and tools. This creates a divide between a designer's ability to explore the form and interactivity of product designs and the ability to iterate on the basis of high-fidelity interactive experiences with a functioning prototype. To help overcome this difficulty we have developed the Calder hardware toolkit. Calder is a development environment for rapidly exploring and prototyping functional physical interactive devices. Calder provides a set of reusable small input and output components, and integration into existing interface prototyping environments. These components communicate with a computer using wired and wireless connections. Calder is a tool targeted toward product and interaction designers to aid them in their early design process. In this paper we describe the process of gaining an understanding of the needs and workflow habits of our target users to generate a collection of requirements for such a toolkit. We describe technical challenges imposed by these needs, and the specifics of design and implementation of the toolkit to meet these challenges.

© All rights reserved Lee et al. and/or ACM Press
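As an illustration of the component-plus-host pattern the abstract describes, the sketch below shows one way a host-side event hub could route messages from physical input components to designer-written callbacks. The class and identifiers are hypothetical and are not the Calder API.

# Hypothetical illustration only -- this is NOT the Calder API. It sketches the
# general pattern from the abstract: reusable physical input components
# (buttons, sliders, ...) delivering events over a wired or wireless link to
# prototype code running on a host computer.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ComponentEvent:
    component_id: str      # e.g. "button-1", "slider-3" (made-up identifiers)
    value: float           # 0/1 for buttons, 0.0-1.0 for continuous inputs

class PrototypeHub:
    """Dispatches events from physical components to designer-written callbacks."""
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[ComponentEvent], None]]] = {}

    def on(self, component_id: str, handler: Callable[[ComponentEvent], None]) -> None:
        """Register a callback for one physical component."""
        self._handlers.setdefault(component_id, []).append(handler)

    def feed(self, event: ComponentEvent) -> None:
        """Would be called by the (wired or wireless) transport layer."""
        for handler in self._handlers.get(event.component_id, []):
            handler(event)

if __name__ == "__main__":
    hub = PrototypeHub()
    hub.on("button-1", lambda e: print("play/pause pressed"))
    hub.on("slider-3", lambda e: print(f"volume set to {e.value:.0%}"))
    # Simulated hardware messages, standing in for real component traffic.
    hub.feed(ComponentEvent("button-1", 1.0))
    hub.feed(ComponentEvent("slider-3", 0.65))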

2002

Lee, Johnny C., Forlizzi, Jodi and Hudson, Scott E. (2002): The kinetic typography engine: an extensible system for animating expressive text. In: Beaudouin-Lafon, Michel (ed.) Proceedings of the 15th Annual ACM Symposium on User Interface Software and Technology, October 27-30, 2002, Paris, France. pp. 81-90. Available online

Kinetic typography -- text that uses movement or other temporal change -- has recently emerged as a new form of communication. As we hope to illustrate in this paper, kinetic typography can be seen as bringing some of the expressive power of film -- such as its ability to convey emotion, portray compelling characters, and visually direct attention -- to the strong communicative properties of text. Although kinetic typography offers substantial promise for expressive communications, it has not been widely exploited outside a few limited application areas (most notably in TV advertising). One of the reasons for this has been the lack of tools directly supporting it, and the accompanying difficulty in creating dynamic text. This paper presents a first step in remedying this situation -- an extensible and robust system for animating text in a wide variety of forms. By supporting an appropriate set of carefully factored abstractions, this engine provides a relatively small set of components that can be plugged together to create a wide range of different expressions. It provides new techniques for automating effects used in traditional cartoon animation, and provides specific support for typographic manipulations.

© All rights reserved Lee et al. and/or ACM Press
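The engine's central idea, a small set of carefully factored components that plug together into many different expressions, can be pictured as chained per-character animation operators. The sketch below is a hypothetical illustration in Python, not the authors' engine: each operator transforms a character's properties at a given time, and chaining operators composes effects.

# Illustrative sketch, not the kinetic typography engine itself: composable
# per-character operators that each adjust properties (position, opacity) as a
# function of time and character index; chaining them yields combined effects.
import math
from typing import Callable, Dict, List, Tuple

Props = Dict[str, float]
Operator = Callable[[float, int, Props], Props]   # (time, char index, props) -> props

def wave(amplitude: float, speed: float) -> Operator:
    """Bob each character vertically, phase-offset by its index."""
    def op(t: float, i: int, p: Props) -> Props:
        return {**p, "y": p["y"] + amplitude * math.sin(speed * t + i * 0.5)}
    return op

def fade_in(duration: float) -> Operator:
    """Ramp opacity from 0 to 1 over `duration` seconds."""
    def op(t: float, i: int, p: Props) -> Props:
        return {**p, "opacity": min(1.0, t / duration)}
    return op

def animate(text: str, operators: List[Operator], t: float) -> List[Tuple[str, Props]]:
    """Apply the chained operators to every character at time t."""
    frames = []
    for i, ch in enumerate(text):
        props: Props = {"x": 20.0 * i, "y": 0.0, "opacity": 1.0}
        for op in operators:
            props = op(t, i, props)
        frames.append((ch, props))
    return frames

if __name__ == "__main__":
    # Two small components plugged together produce a waving, fading-in word.
    for ch, props in animate("HELLO", [wave(5.0, 2.0), fade_in(1.5)], t=0.75):
        print(ch, props)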

