Jane Hsu

 

Publications by Jane Hsu (bibliography)

2010
 

Chan, Li-Wei, Kao, Hui-Shan, Chen, Mike Y., Lee, Ming-Sui, Hsu, Jane and Hung, Yi-Ping (2010): Touching the void: direct-touch interaction for intangible displays. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2625-2634. Available online

In this paper, we explore the challenges of applying direct-touch interaction to intangible displays and investigate methodologies for improving it. Direct-touch interaction simplifies object manipulation because it combines the input and the display into a single integrated interface. While direct-touch technology on traditional tangible displays is commonplace, similar direct-touch interaction within an intangible display paradigm presents many challenges. Given the lack of tactile feedback, direct-touch interaction on an intangible display may perform poorly even on the simplest target acquisition tasks. To study this problem, we created a prototype intangible display. In an initial study, we collected user discrepancy data corresponding to the interpretation of the 3D locations of targets shown on our intangible display. The results showed that participants performed poorly in determining the z-coordinate of the targets and were imprecise in their execution of screen touches within the system: thirty percent of positioning operations showed errors larger than 30 mm from the actual surface. This finding motivated a second study, in which we quantified task time in the presence of visual and audio feedback. The pseudo-shadow visual feedback proved helpful in improving both user performance and satisfaction.

© All rights reserved Chan et al. and/or their publisher
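The 30 mm figure above is the offset between the reported touch position and the actual (intangible) display surface. As a rough illustration only, not code from the paper, such an error metric can be sketched as a point-to-plane distance; the Python function below and its values are hypothetical.

    import numpy as np

    def touch_error_mm(touch_point, plane_point, plane_normal):
        # Unsigned distance (mm) from a 3D touch position to the display plane,
        # i.e. the kind of deviation reported above as "larger than 30 mm".
        n = np.asarray(plane_normal, dtype=float)
        n = n / np.linalg.norm(n)                  # unit normal of the intangible surface
        d = np.asarray(touch_point, dtype=float) - np.asarray(plane_point, dtype=float)
        return abs(float(np.dot(d, n)))            # length of the projection onto the normal

    # Example: a touch registered 32 mm in front of a surface lying in the z = 0 plane.
    print(touch_error_mm([100.0, 50.0, 32.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))  # 32.0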

 

Chan, Li-Wei, Wu, Hsiang-Tao, Kao, Hui-Shan, Ko, Ju-Chun, Lin, Home-Ru, Chen, Mike Y., Hsu, Jane and Hung, Yi-Ping (2010): Enabling beyond-surface interactions for interactive surface with an invisible projection. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 263-272. Available online

This paper presents a programmable infrared (IR) technique that utilizes invisible, programmable markers to support interaction beyond the surface of a diffused-illumination (DI) multi-touch system. We combine an IR projector and a standard color projector to simultaneously project visible content and invisible markers. Mobile devices outfitted with IR cameras can compute their 3D positions based on the markers perceived. Markers are selectively turned off to support multi-touch and direct on-surface tangible input. The proposed techniques enable a collaborative multi-display multi-touch tabletop system. We also present three interactive tools: i-m-View, i-m-Lamp, and i-m-Flashlight, which consist of a mobile tablet and projectors that users can freely interact with beyond the main display surface. Early user feedback shows that these interactive devices, combined with a large interactive display, allow more intuitive navigation and are reportedly enjoyable to use.

© All rights reserved Chan et al. and/or their publisher
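The paper's tracking code is not reproduced here. As a hedged sketch of the general idea described above, a camera recovering its 3D position from known marker locations, the Python snippet below uses OpenCV's solvePnP; all marker coordinates and camera intrinsics are made-up illustrative values, not taken from the paper.

    import numpy as np
    import cv2

    # Known 3D positions (mm, tabletop frame, z = 0) of four projected IR markers,
    # and their detected 2D pixel positions in the mobile device's IR camera image.
    object_points = np.array([[0, 0, 0], [200, 0, 0], [200, 200, 0], [0, 200, 0]], dtype=np.float64)
    image_points = np.array([[320, 240], [480, 250], [470, 400], [310, 390]], dtype=np.float64)

    # IR camera intrinsics (illustrative values; obtained by calibration in practice).
    camera_matrix = np.array([[800.0, 0.0, 320.0],
                              [0.0, 800.0, 240.0],
                              [0.0,   0.0,   1.0]])
    dist_coeffs = np.zeros(5)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
    if ok:
        R, _ = cv2.Rodrigues(rvec)                # rotation from tabletop frame to camera frame
        camera_position = (-R.T @ tvec).ravel()   # camera position expressed in tabletop coordinates
        print(camera_position)

In the setup described in the abstract, the invisible markers would first be projected in IR and detected in the IR camera image; that detection step is not shown in this sketch.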

2009
 

Hsiao, Chuan-Heng, Chan, Li-Wei, Hu, Ting-Ting, Chen, Mon-Chu, Hsu, Jane and Hung, Yi-Ping (2009): To move or not to move: a comparison between steerable versus fixed focus region paradigms in multi-resolution tabletop display systems. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 153-162. Available online

Previous studies have outlined the advantages of multi-resolution large-area displays over their fixed-resolution counterparts; however, the mobility of the focus region has so far received little attention. To study this further, we developed a multi-resolution tabletop display system with a steerable high-resolution focus region and compared the performance of steerable and fixed focus region systems under different working scenarios. We classified these scenarios according to the region of interest (ROI), drawing analogies to different eye-movement types (fixed, saccadic, and pursuit ROI). Empirical data gathered during a multi-faceted user study demonstrates that the steerable focus region system significantly outperforms the fixed focus region system. The former provides enhanced display manipulation and proves especially advantageous when the user must maintain spatial awareness of the display content, for example when several regions of the display are to be visited within a single session.

© All rights reserved Hsiao et al. and/or ACM Press
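The actual steering logic of the system is not described here. As a minimal, hypothetical sketch of the underlying idea, a fixed-size high-resolution focus region kept centered on the current region of interest and clamped to the table bounds, consider the following Python fragment; the function name and dimensions are illustrative only.

    def place_focus_region(roi_center, focus_size, display_size):
        # Top-left corner of a focus_size high-resolution region centered on roi_center,
        # clamped so the region stays within the display bounds.
        fx, fy = focus_size
        dx, dy = display_size
        x = min(max(roi_center[0] - fx / 2.0, 0), dx - fx)
        y = min(max(roi_center[1] - fy / 2.0, 0), dy - fy)
        return x, y

    # Example: steer a 400x300 region toward an ROI near the right edge of a 1920x1080 table.
    print(place_focus_region((1850, 100), (400, 300), (1920, 1080)))  # (1520, 0)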

 
 
 

