Publication statistics

Publication period: 2006-2012
Publication count: 14
Number of co-authors: 43



Co-authors

Number of publications with 3 favourite co-authors:

Ming-Sui Lee:
Jane Hsu:
Yi-Ping Hung:


Productive colleagues

Mike Y. Chen's 3 most productive colleagues in number of publications:

James A. Landay: 91
Anthony LaMarca: 37
David W. McDonald: 36


Mike Y. Chen

Publications by Mike Y. Chen (bibliography)
2012

Cheng, Lung-Pan, Hsiao, Fang-I, Liu, Yen-Ting and Chen, Mike Y. (2012): iRotate grasp: automatic screen rotation based on grasp of mobile devices. In: Adjunct Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 15-16. Available online

Automatic screen rotation improves the viewing experience and usability of mobile devices, but current gravity-based approaches do not support postures such as lying on one side, and manual rotation switches require explicit user input. iRotate Grasp automatically rotates the screens of mobile devices to match users' viewing orientations based on how users are grasping the devices. Our insight is that users' grasps are consistent within each orientation but differ significantly between orientations. Our prototype embeds a total of 32 light sensors along the four sides and the back of an iPod Touch, and uses a support vector machine (SVM) to recognize grasps at 25Hz. We collected usage data from 6 users under 54 different conditions: 1) grasping the device using the left, right, and both hands, 2) scrolling, zooming, and typing, 3) in portrait, landscape-left, and landscape-right orientations, and 4) while sitting and lying down on one side. Results show that our grasp-based approach is promising, and our iRotate Grasp prototype could correctly rotate the screen 90.5% of the time when training and testing on different users.

© All rights reserved Cheng et al. and/or ACM Press
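
The pipeline the abstract describes reduces to classifying a vector of light-sensor readings into a screen orientation. Below is a minimal sketch of that idea, assuming synthetic sensor frames and scikit-learn's SVC; the authors' actual features, kernel, and training data are not specified here.

```python
# Minimal sketch of grasp-based orientation classification: a 32-value
# light-sensor frame (as in the prototype) is mapped to an orientation
# with an SVM. The sensor data here is synthetic, for illustration only.
import numpy as np
from sklearn.svm import SVC

ORIENTATIONS = ["portrait", "landscape-left", "landscape-right"]
rng = np.random.default_rng(0)

def synthetic_grasp(orientation_idx, n=100):
    """Fake sensor frames: each orientation occludes a different
    subset of the 32 sensors, standing in for real grasp patterns."""
    frames = rng.uniform(0.7, 1.0, size=(n, 32))                 # ambient light
    lo = orientation_idx * 10
    frames[:, lo:lo + 10] = rng.uniform(0.0, 0.2, size=(n, 10))  # hand cover
    return frames

X = np.vstack([synthetic_grasp(i) for i in range(3)])
y = np.repeat(np.arange(3), 100)
clf = SVC(kernel="rbf").fit(X, y)

# Classify one new frame, e.g. polled from the sensors at 25 Hz.
frame = synthetic_grasp(1, n=1)
print(ORIENTATIONS[clf.predict(frame)[0]])  # expected: landscape-left
```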

2011

Yu, Neng-Hao, Tsai, Sung-Sheng, Chen, Mike Y. and Hung, Yi-Ping (2011): TUIC open source SDK: enabling tangible interaction on unmodified capacitive multi-touch displays. In: Proceedings of the 2011 ACM International Conference on Interactive Tabletops and Surfaces 2011. p. D2. Available online


Yu, Neng-Hao, Tsai, Sung-Sheng, Hsiao, I-Chun, Tsai, Dian-Je, Lee, Meng-Han, Chen, Mike Y. and Hung, Yi-Ping (2011): Clip-on gadgets: expanding multi-touch interaction area with unpowered tactile controls. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 367-372. Available online

Virtual keyboards and controls, commonly used on mobile multi-touch devices, occlude content of interest and do not provide tactile feedback. Clip-on Gadgets solve these issues by extending the interaction area of multi-touch devices with physical controllers. Clip-on Gadgets use only conductive materials to map user input on the controllers to touch points on the edges of screens; therefore, they are battery-free, lightweight, and low-cost. In addition, they can be used in combination with multi-touch gestures. We present several hardware designs and a software toolkit, which enable users to simply attach Clip-on Gadgets to an edge of a device and start interacting with it.

© All rights reserved Yu et al. and/or ACM Press
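
Because Clip-on Gadgets translate physical control presses into capacitive touch points along the screen edge, the software half of the toolkit reduces to mapping edge touches to logical controls. A minimal sketch, assuming a hypothetical button layout in which each control contacts a fixed x-range along the top edge:

```python
# Sketch of edge-touch-to-control mapping, assuming a hypothetical
# gadget whose buttons contact the top screen edge at fixed x-ranges.
SCREEN_W, SCREEN_H = 1024, 768
EDGE_BAND = 20  # px: touches this close to the top edge belong to the gadget

# Hypothetical button layout: (x_min, x_max) -> control name.
BUTTONS = {(0, 200): "volume_down", (200, 400): "volume_up",
           (400, 600): "play_pause"}

def classify_touch(x, y):
    """Return the gadget control hit, or None for an ordinary touch."""
    if y > EDGE_BAND:
        return None  # regular multi-touch input, not a gadget press
    for (x0, x1), name in BUTTONS.items():
        if x0 <= x < x1:
            return name
    return None

print(classify_touch(250, 5))    # -> "volume_up"
print(classify_touch(512, 384))  # -> None (regular touch)
```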

2010

Chan, Li-Wei, Kao, Hui-Shan, Chen, Mike Y., Lee, Ming-Sui, Hsu, Jane and Hung, Yi-Ping (2010): Touching the void: direct-touch interaction for intangible displays. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2625-2634. Available online

In this paper, we explore the challenges of applying direct-touch interaction to intangible displays and investigate methodologies for improving it. Direct-touch interaction simplifies object manipulation because it combines the input and display into a single integrated interface. While direct-touch technology on traditional tangible displays is commonplace, similar direct-touch interaction within an intangible display paradigm presents many challenges. Given the lack of tactile feedback, direct-touch interaction on an intangible display may show poor performance even on the simplest target acquisition tasks. To study this problem, we created a prototype intangible display. In the initial study, we collected user discrepancy data corresponding to the interpretation of the 3D location of targets shown on our intangible display. The results showed that participants performed poorly in determining the z-coordinate of the targets and were imprecise in their execution of screen touches within the system; thirty percent of positioning operations showed errors larger than 30mm from the actual surface. This finding motivated a second study, in which we quantified task time in the presence of visual and audio feedback. The pseudo-shadow visual feedback was shown to be helpful both in improving user performance and satisfaction.

© All rights reserved Chan et al. and/or their publisher
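
The pseudo-shadow feedback mentioned above amounts to projecting the fingertip onto the display plane and varying a shadow's size and opacity with the finger's distance from it, supplying the depth cue the intangible display lacks. A minimal sketch of that geometry, with made-up scaling constants rather than the paper's actual rendering parameters:

```python
# Sketch of pseudo-shadow feedback for an intangible display: project
# the fingertip onto the display plane and scale a shadow by distance.
# Constants are illustrative, not the authors' actual parameters.
from dataclasses import dataclass

@dataclass
class Shadow:
    x: float        # shadow centre on the display plane (mm)
    y: float
    radius: float   # grows as the finger moves away from the plane
    opacity: float  # fades with distance, 1.0 at contact

def pseudo_shadow(finger_mm, max_dist=100.0, base_radius=5.0):
    """finger_mm: (x, y, z) fingertip position; the display plane is z=0."""
    x, y, z = finger_mm
    d = min(abs(z), max_dist)
    return Shadow(x=x, y=y,
                  radius=base_radius + 0.3 * d,
                  opacity=1.0 - d / max_dist)

print(pseudo_shadow((120.0, 80.0, 30.0)))
# Shadow(x=120.0, y=80.0, radius=14.0, opacity=0.7)
```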


Chan, Li-Wei, Wu, Hsiang-Tao, Kao, Hui-Shan, Ko, Ju-Chun, Lin, Home-Ru, Chen, Mike Y., Hsu, Jane and Hung, Yi-Ping (2010): Enabling beyond-surface interactions for interactive surface with an invisible projection. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 263-272. Available online

This paper presents a programmable infrared (IR) technique that utilizes invisible, programmable markers to support interaction beyond the surface of a diffused-illumination (DI) multi-touch system. We combine an IR projector and a standard color projector to simultaneously project visible content and invisible markers. Mobile devices outfitted with IR cameras can compute their 3D positions based on the markers perceived. Markers are selectively turned off to support multi-touch and direct on-surface tangible input. The proposed techniques enable a collaborative multi-display multi-touch tabletop system. We also present three interactive tools: i-m-View, i-m-Lamp, and i-m-Flashlight, which consist of a mobile tablet and projectors that users can freely interact with beyond the main display surface. Early user feedback shows that these interactive devices, combined with a large interactive display, allow more intuitive navigation and are reportedly enjoyable to use.

© All rights reserved Chan et al. and/or their publisher
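
Computing a mobile device's 3D position from the invisible markers its IR camera perceives is an instance of pose estimation from point correspondences. A minimal sketch using OpenCV's solvePnP, assuming four known marker positions on the tabletop and already-calibrated camera intrinsics; the paper's own marker design and tracking pipeline are more involved.

```python
# Sketch: recover a mobile device's pose from four detected IR markers
# using OpenCV's solvePnP. Marker coordinates and the camera intrinsics
# below are illustrative placeholders, not values from the paper.
import numpy as np
import cv2

# Known 3D marker positions on the tabletop surface (mm, z = 0 plane).
object_pts = np.array([[0, 0, 0], [200, 0, 0],
                       [200, 200, 0], [0, 200, 0]], dtype=np.float64)

# Where the IR camera saw those markers in its image (pixels).
image_pts = np.array([[320, 240], [420, 245],
                      [415, 340], [315, 335]], dtype=np.float64)

# Assumed pinhole intrinsics from a prior calibration step.
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)
dist = np.zeros(4)  # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
if ok:
    # tvec places the table origin in camera coordinates; invert the
    # transform to get the camera (device) position in table coordinates.
    R, _ = cv2.Rodrigues(rvec)
    device_pos = -R.T @ tvec
    print(device_pos.ravel())
```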


Yu, Neng-Hao, Chan, Li-Wei, Cheng, Lung-Pan, Chen, Mike Y. and Hung, Yi-Ping (2010): Enabling tangible interaction on capacitive touch panels. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 457-458. Available online

We propose two approaches to sensing tangible objects on the capacitive touch screens used in off-the-shelf multi-touch devices such as the Apple iPad, iPhone, and 3M's multi-touch displays. We seek approaches that do not require modifications to the panels: spatial tags and frequency tags. A spatial tag is similar to the fiducial tags used in tangible tabletop interaction, and uses multi-point geometric patterns to encode object IDs. A frequency tag encodes object IDs in the time domain, using modulation circuits embedded inside tangible objects to simulate high-speed touches at varying frequencies. We will show several demo applications. The first combines simultaneous tangible and touch input, exploring how tangible inputs (e.g., pen, eraser, etc.) and simple gestures work together on capacitive touch panels.

© All rights reserved Yu et al. and/or their publisher
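
Decoding a spatial tag reduces to matching the geometry of the touch points an object's conductive feet produce against a registry of known patterns. A minimal sketch, assuming hypothetical 3-point tags identified by their sorted pairwise distances; the paper's actual encoding is not reproduced here.

```python
# Sketch of spatial-tag decoding on a capacitive panel: match the
# sorted pairwise distances of a 3-point footprint against known tags.
# The tag registry and tolerance are hypothetical, not from the paper.
from itertools import combinations
from math import dist

TAGS = {  # object ID -> sorted pairwise distances of its feet (mm)
    "pen":    (20.0, 30.0, 40.0),
    "eraser": (25.0, 25.0, 35.0),
}
TOL = 2.0  # mm of allowed sensing noise per distance

def decode(points):
    """points: three (x, y) touch coordinates in mm."""
    sig = tuple(sorted(dist(a, b) for a, b in combinations(points, 2)))
    for obj, ref in TAGS.items():
        if all(abs(s - r) <= TOL for s, r in zip(sig, ref)):
            return obj
    return None

print(decode([(0, 0), (20, 0), (-7.5, 29.05)]))  # -> "pen"
```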

2008

Consolvo, Sunny, McDonald, David W., Toscos, Tammy, Chen, Mike Y., Froehlich, Jon, Harrison, Beverly L., Klasnja, Predrag, LaMarca, Anthony, LeGrand, Louis, Libby, Ryan, Smith, Ian and Landay, James A. (2008): Activity sensing in the wild: a field trial of ubifit garden. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 1797-1806. Available online

Recent advances in small inexpensive sensors, low-power processing, and activity modeling have enabled applications that use on-body sensing and machine learning to infer people's activities throughout everyday life. To address the growing rate of sedentary lifestyles, we have developed a system, UbiFit Garden, which uses these technologies and a personal, mobile display to encourage physical activity. We conducted a 3-week field trial in which 12 participants used the system and report findings focusing on their experiences with the sensing and activity inference. We discuss key implications for systems that use on-body sensing and activity inference to encourage physical activity.

© All rights reserved Consolvo et al. and/or ACM Press
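
The activity-inference side of such a system comes down to turning windows of on-body sensor readings into feature vectors and classifying them into activities. A minimal sketch of that pipeline on synthetic accelerometer data, using a generic scikit-learn classifier; UbiFit Garden's actual sensing platform and models are richer than this.

```python
# Sketch of activity inference from on-body sensing: windowed
# accelerometer magnitudes -> simple features -> a classifier.
# Synthetic data and a generic model; not the UbiFit system itself.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
ACTIVITIES = ["sitting", "walking", "running"]

def synthetic_window(activity_idx, samples=64):
    """Fake 64-sample accelerometer magnitude window; more intense
    activities get higher variance around 1 g."""
    return 1.0 + rng.normal(0, 0.05 + 0.4 * activity_idx, samples)

def features(window):
    # Mean and standard deviation are classic activity features.
    return [window.mean(), window.std()]

X = [features(synthetic_window(i)) for i in range(3) for _ in range(50)]
y = [i for i in range(3) for _ in range(50)]

model = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(ACTIVITIES[model.predict([features(synthetic_window(2))])[0]])
```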


2007

Consolvo, Sunny, Harrison, Beverly L., Smith, Ian, Chen, Mike Y., Everitt, Katherine, Froehlich, Jon and Landay, James A. (2007): Conducting In Situ Evaluations for and With Ubiquitous Computing Technologies. In International Journal of Human-Computer Interaction, 22 (1) pp. 103-118. Available online

To evaluate ubiquitous computing technologies, which may be embedded in the environment, embedded in objects, worn, or carried by the user throughout everyday life, it is essential to use methods that accommodate the often unpredictable, real-world environments in which the technologies are used. This article discusses how we have adapted and applied traditional methods from psychology and human-computer interaction, such as Wizard of Oz and Experience Sampling, to be more amenable to the in situ evaluations of ubiquitous computing applications, particularly in the early stages of design. The way that ubiquitous computing technologies can facilitate the in situ collection of self-report data is also discussed. Although the focus is on ubiquitous computing applications and tools for their assessment, it is believed that the in situ evaluation tools that are proposed will be generally useful for field trials of other technology, applications, or formative studies that are concerned with collecting data in situ.

© All rights reserved Consolvo et al. and/or Lawrence Erlbaum Associates
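
Experience Sampling, one of the methods the article adapts, relies on prompting participants at quasi-random times throughout the day to collect self-reports in situ. A minimal sketch of such a prompt scheduler, with illustrative parameters; the article's actual tools are considerably more sophisticated.

```python
# Sketch of an Experience Sampling (ESM) prompt scheduler: split the
# waking day into equal blocks and draw one random prompt time from
# each, with a minimum gap. Parameters are illustrative only.
import random
from datetime import datetime, timedelta

def daily_prompts(day_start, day_end, n_prompts, min_gap_min=30):
    """Return n_prompts datetimes, one sampled per equal block of the day."""
    block = (day_end - day_start) / n_prompts
    times = []
    for i in range(n_prompts):
        lo = day_start + i * block
        if times:  # keep at least min_gap_min after the previous prompt
            lo = max(lo, times[-1] + timedelta(minutes=min_gap_min))
        hi = day_start + (i + 1) * block
        span = max((hi - lo).total_seconds(), 0)
        times.append(lo + timedelta(seconds=random.uniform(0, span)))
    return times

day = datetime(2007, 6, 1, 9, 0)  # hypothetical study day, 09:00-21:00
for t in daily_prompts(day, day + timedelta(hours=12), n_prompts=6):
    print(t.strftime("%H:%M"))
```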


Chang, Keng-hao, Chen, Mike Y. and Canny, John (2007): Tracking Free-Weight Exercises. In: Krumm, John, Abowd, Gregory D., Seneviratne, Aruna and Strang, Thomas (eds.) UbiComp 2007 Ubiquitous Computing - 9th International Conference September 16-19, 2007, Innsbruck, Austria. pp. 19-37. Available online

2006

Patel, Kayur, Chen, Mike Y., Smith, Ian and Landay, James A. (2006): Personalizing routes. In: Proceedings of the ACM Symposium on User Interface Software and Technology 2006. pp. 187-190. Available online

Navigation services (e.g., in-car navigation systems and online mapping sites) compute routes between two locations to help users navigate. However, these routes may direct users along an unfamiliar path when a familiar path exists or, conversely, may include redundant information that the user already knows. These overly complicated directions increase the user's cognitive load, which may lead to a dangerous driving environment. Since the appropriate level of detail is user specific and depends on the user's familiarity with a region, routes need to be personalized. We have developed a system, called MyRoute, that reduces route complexity by creating user-specific routes based on a priori knowledge of familiar routes and landmarks. MyRoute works by compressing well-known steps into a single contextualized step and rerouting users along familiar routes.

© All rights reserved Patel et al. and/or ACM Press
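
The compression step described above collapses runs of turn-by-turn steps the user already knows into one contextualized instruction. A minimal sketch, with a hypothetical familiarity set; the real system derives familiarity from a user's prior routes and landmarks.

```python
# Sketch of MyRoute-style route compression: collapse runs of steps
# the user already knows into one landmark-anchored instruction.
# The familiarity set here is hypothetical.
from itertools import groupby

FAMILIAR = {"Main St", "Oak Ave", "5th Ave"}  # roads the user knows well

def compress(steps):
    """steps: list of (instruction, road) turn-by-turn steps."""
    out = []
    for known, run in groupby(steps, key=lambda s: s[1] in FAMILIAR):
        run = list(run)
        if known and len(run) > 1:
            # Replace the familiar run with one contextualized step.
            out.append((f"Take your usual route to {run[-1][1]}", run[-1][1]))
        else:
            out.extend(run)
    return out

route = [("Turn left onto Main St", "Main St"),
         ("Continue onto Oak Ave", "Oak Ave"),
         ("Turn right onto 5th Ave", "5th Ave"),
         ("Turn left onto Elm St", "Elm St")]
for step, _ in compress(route):
    print(step)
```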


Froehlich, Jon, Chen, Mike Y., Smith, Ian E. and Potter, Fred (2006): Voting with Your Feet: An Investigative Study of the Relationship Between Place Visit Behavior and Preference. In: Dourish, Paul and Friday, Adrian (eds.) UbiComp 2006 Ubiquitous Computing - 8th International Conference September 17-21, 2006, Orange County, CA, USA. pp. 333-350. Available online


Chen, Mike Y., Sohn, Timothy, Chmelev, Dmitri, Hähnel, Dirk, Hightower, Jeffrey, Hughes, Jeff, LaMarca, Anthony, Potter, Fred, Smith, Ian E. and Varshavsky, Alex (2006): Practical Metropolitan-Scale Positioning for GSM Phones. In: Dourish, Paul and Friday, Adrian (eds.) UbiComp 2006 Ubiquitous Computing - 8th International Conference September 17-21, 2006, Orange County, CA, USA. pp. 225-242. Available online


Sohn, Timothy, Varshavsky, Alex, LaMarca, Anthony, Chen, Mike Y., Choudhury, Tanzeem, Smith, Ian E., Consolvo, Sunny, Hightower, Jeffrey, Griswold, William G. and Lara, Eyal de (2006): Mobility Detection Using Everyday GSM Traces. In: Dourish, Paul and Friday, Adrian (eds.) UbiComp 2006 Ubiquitous Computing - 8th International Conference September 17-21, 2006, Orange County, CA, USA. pp. 212-224. Available online



Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/mike_y__chen.html