Number of co-authors: 16
Number of publications with 3 favourite co-authors: Bing-Yu Chen (4), Rong-Hao Liang (3), Chao-Huai Su (2)
Kai-Yin Cheng's 3 most productive colleagues in number of publications: Hao-Hua Chu (24), Bing-Yu Chen (15), Darren Edge (13)
Publications by Kai-Yin Cheng (bibliography)
Edge, Darren, Cheng, Kai-Yin, Whitney, Michael, Qian, Yao, Yan, Zhijie and Soong, Frank (2012): Tip tap tones: mobile microtraining of mandarin sounds. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 427-430. Available online
Learning a second language is hard, especially when the learner's brain must be retrained to identify sounds not present in his or her native language. It also requires regular practice, but many learners struggle to find the time and motivation. Our solution is to break down the challenge of mastering a foreign sound system into minute-long episodes of "microtraining" delivered through mobile gaming. We present the example of Tip Tap Tones -- a mobile game with the purpose of helping learners acquire the tonal sound system of Mandarin Chinese. In a 3-week, 12-user study of this system, we found that an average of 71 minutes' gameplay significantly improved tone identification by around 25%, regardless of whether the underlying sounds had been used to train tone perception. Overall, results suggest that mobile microtraining is an efficient, effective, and enjoyable way to master the sounds of Mandarin Chinese, with applications to other languages and domains.
© All rights reserved Edge et al. and/or ACM Press
Liang, Rong-Hao, Cheng, Kai-Yin, Su, Chao-Huai, Weng, Chien-Ting, Chen, Bing-Yu and Yang, De-Nian (2012): GaussSense: attachable stylus sensing using magnetic sensor grid. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 319-326. Available online
This work presents GaussSense, a back-of-device sensing technique that enables stylus input on an arbitrary surface by exploiting magnetism. A 2mm-thick Hall-sensor grid is developed to sense magnets embedded in the stylus. The system can sense the magnetic field emitted by the stylus when it is within 2cm of any non-ferromagnetic surface. Attaching the sensor behind an arbitrary thin surface allows the stylus input to be recognized by analyzing the distribution of the applied magnetic field. Attaching the sensor grid to the back of a touchscreen device and incorporating magnets into the corresponding stylus enables the system 1) to distinguish touch events caused by a finger from those caused by the stylus, 2) to sense the tilt angle of the stylus and the pressure with which it is applied, and 3) to detect where the stylus hovers over the screen. A pilot study revealed that people were satisfied with the novel sketching experiences this system enables.
© All rights reserved Liang et al. and/or ACM Press
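One idea from the GaussSense abstract above, distinguishing finger touches from stylus touches, can be illustrated with a minimal sketch: a touch is attributed to the magnet-equipped stylus only if the Hall-sensor grid reads a field above a noise threshold near the touch point. The grid size, units, threshold, and function names here are illustrative assumptions, not the paper's actual pipeline.

```python
# Hypothetical sketch: classify a touch as "stylus" or "finger" by checking
# the magnetic-field readings around the touch location. All values and
# names are invented for illustration.
from typing import List, Tuple

def classify_touch(field_grid: List[List[float]],
                   touch: Tuple[int, int],
                   threshold: float = 0.5) -> str:
    """Return 'stylus' if the 3x3 neighborhood around the touch point
    shows a field reading above the threshold, else 'finger'."""
    rows, cols = len(field_grid), len(field_grid[0])
    r0, c0 = touch
    for r in range(max(0, r0 - 1), min(rows, r0 + 2)):
        for c in range(max(0, c0 - 1), min(cols, c0 + 2)):
            if field_grid[r][c] > threshold:
                return "stylus"
    return "finger"

# A bare finger produces no magnetic field; the magnetic stylus does.
quiet = [[0.0] * 4 for _ in range(4)]
active = [row[:] for row in quiet]
active[2][2] = 0.9
print(classify_touch(quiet, (2, 2)))   # finger
print(classify_touch(active, (2, 2)))  # stylus
```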
Lin, Shu-Yang, Su, Chao-Huai, Cheng, Kai-Yin, Liang, Rong-Hao, Kuo, Tzu-Hao and Chen, Bing-Yu (2011): Pub -- point upon body: exploring eyes-free interaction and methods on an arm. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 481-488. Available online
This paper presents a novel interaction system, PUB (Point Upon Body), which explores eyes-free interaction in a personal space by letting users tap on their own arms, receiving haptic feedback from their skin. Two user studies examine how precisely users can interact with their forearms and how they behave when operating in their arm space. The results show that, with iterative practice, typical users can divide the space between wrist and elbow into at most 6 distinct points. Experimental results also indicate that each user's division pattern is unique. Based on design principles drawn from these observations, PUB demonstrates how interaction design can benefit from these findings. Two scenarios, remote display control and mobile device control, are demonstrated via an UltraSonic device attached to the users' wrists to detect their tapped positions.
© All rights reserved Lin et al. and/or ACM Press
Cheng, Kai-Yin, Liang, Rong-Hao, Chen, Bing-Yu, Liang, Rung-Huei and Kuo, Sy-Yen (2010): iCon: utilizing everyday objects as additional, auxiliary and instant tabletop controllers. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 1155-1164. Available online
This work describes a novel approach to utilizing users' everyday objects as additional, auxiliary, and instant tabletop controllers. Based on this approach, a prototype platform, called iCon, is developed to explore possible designs. Field studies and user studies reveal that utilizing everyday objects as auxiliary input devices may be appropriate in multi-task scenarios. User studies further demonstrate that daily objects are generally suited to tasks with low precision requirements, low engagement with the selected objects, and medium-to-high frequency of use. The proposed approach allows users to interact with computers without altering their original work environments.
© All rights reserved Cheng et al. and/or their publisher
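The iCon abstract above describes binding recognized everyday objects to application commands so that manipulating an object triggers an action. A minimal sketch of that binding idea, with object names and commands invented purely for illustration (this is not iCon's actual implementation):

```python
# Hypothetical sketch: once a tabletop system recognizes an everyday
# object, it can be bound to a command, turning the object into an
# instant auxiliary controller.
class ObjectControllerMap:
    def __init__(self) -> None:
        self._bindings: dict[str, str] = {}

    def bind(self, object_id: str, command: str) -> None:
        """Register an everyday object as an instant controller."""
        self._bindings[object_id] = command

    def on_object_event(self, object_id: str) -> str:
        """Dispatch the command bound to the manipulated object;
        unbound objects are ignored."""
        return self._bindings.get(object_id, "ignored")

controls = ObjectControllerMap()
controls.bind("coffee_mug", "pause_music")
controls.bind("stapler", "next_slide")
print(controls.on_object_event("coffee_mug"))  # pause_music
print(controls.on_object_event("pen"))         # ignored
```

The point of the lookup-table design is that bindings are instant and reversible, matching the abstract's claim that users' work environments need not be altered.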
Cheng, Kai-Yin, Luo, Sheng-Jie, Chen, Bing-Yu and Chu, Hao-Hua (2009): SmartPlayer: user-centric video fast-forwarding. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 789-798. Available online
In this paper we propose a new video interaction model called adaptive fast-forwarding to help people quickly browse videos with predefined semantic rules. This model is designed around the metaphor of scenic car driving, in which the driver slows down near areas of interest and speeds through unexciting areas. Results from a preliminary user study of our video player suggest the following: (1) the player should adaptively adjust the current playback speed based on the complexity of the present scene and predefined semantic events; (2) the player should learn user preferences about predefined event types as well as a suitable playback speed; (3) the player should fast-forward the video continuously with a playback rate acceptable to the user to avoid missing any undefined events or areas of interest. Furthermore, our user study results suggest that for certain types of video, our SmartPlayer yields better user experiences in browsing and fast-forwarding videos than existing video players' interaction models.
© All rights reserved Cheng et al. and/or ACM Press
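The SmartPlayer abstract above describes adaptive fast-forwarding: playback speeds up through simple, uneventful scenes and slows near predefined semantic events, while remaining continuous so nothing is skipped. A minimal sketch of that rate-selection idea, assuming an illustrative linear mapping and rate bounds (not the paper's actual algorithm):

```python
# Hypothetical sketch of adaptive fast-forwarding: map a normalized
# scene-complexity score to a playback rate. Thresholds, bounds, and the
# linear form are assumptions for illustration only.
def playback_rate(scene_complexity: float,
                  near_event: bool,
                  max_rate: float = 8.0,
                  min_rate: float = 1.0) -> float:
    """Return a playback-speed multiplier for the current scene.

    scene_complexity is a score in [0, 1]; simple scenes play faster,
    and predefined semantic events always force normal speed.
    """
    if near_event:
        return min_rate  # slow down for areas of interest
    # Linear interpolation: complexity 0 -> max_rate, complexity 1 -> min_rate
    return max_rate - (max_rate - min_rate) * scene_complexity

# A flat, static scene fast-forwards; a semantic event plays in real time.
print(playback_rate(0.0, near_event=False))  # 8.0
print(playback_rate(0.5, near_event=True))   # 1.0
```

Because the rate is always finite and positive, the video is never skipped outright, which reflects the abstract's third design guideline.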