Publication statistics

Pub. period: 2008-2012
Pub. count: 8
Number of co-authors: 15


Number of publications with 3 favourite co-authors:

Tovi Grossman:
Shahram Izadi:
George Fitzmaurice:



Productive colleagues

Xing-Dong Yang's 3 most productive colleagues in number of publications:

Shahram Izadi: 50
Pourang Irani: 44
Tovi Grossman: 44





Xing-Dong Yang


Publications by Xing-Dong Yang (bibliography)


Hasan, Khalad, Yang, Xing-Dong, Liang, Hai-Ning and Irani, Pourang (2012): How to position the cursor?: an exploration of absolute and relative cursor positioning for back-of-device input. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 103-112.

Observational studies indicate that most people use one hand to interact with their mobile devices. Interaction on the back of devices (BoD) has been proposed to enhance one-handed input for various tasks, including selection and gesturing. However, we do not possess a good understanding of some fundamental issues related to one-handed BoD input. In this paper, we attempt to fill this gap by conducting three studies. The first study explores suitable selection techniques; the second study investigates the performance and suitability of the two main modes of cursor movement, relative and absolute; and the last study examines solutions to the problem of reaching the lower part of the device. Our results indicate that for BoD interaction, relative input is more efficient and accurate for cursor positioning and target selection than absolute input. Based on these findings, we provide guidelines for designing BoD interactions for mobile devices.

© All rights reserved Hasan et al. and/or ACM Press
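The two cursor modes compared in this study can be sketched as follows. This is an illustrative sketch only, not the authors' implementation; the function names, the control-display gain, and the clamping behaviour are assumptions.

```python
def absolute_cursor(touch_x, touch_y, sensor_w, sensor_h, screen_w, screen_h):
    """Absolute mode: the touch location on the rear sensor maps directly
    to a proportional position on the front screen."""
    return (touch_x / sensor_w * screen_w, touch_y / sensor_h * screen_h)

def relative_cursor(cursor_x, cursor_y, dx, dy, screen_w, screen_h, gain=2.0):
    """Relative mode: finger displacement moves the cursor from its last
    position, scaled by a control-display gain and clamped to the screen."""
    new_x = min(max(cursor_x + dx * gain, 0), screen_w)
    new_y = min(max(cursor_y + dy * gain, 0), screen_h)
    return (new_x, new_y)
```

In absolute mode every touch jumps the cursor to the mapped location, whereas in relative mode the cursor state persists between touches, which is what allows clutching and finer control.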


Yang, Xing-Dong, Grossman, Tovi, Wigdor, Daniel and Fitzmaurice, George (2012): Magic finger: always-available input through finger instrumentation. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 147-156.

We present Magic Finger, a small device worn on the fingertip, which supports always-available input. Magic Finger inverts the typical relationship between the finger and an interactive surface: with Magic Finger, we instrument the user's finger itself, rather than the surface it is touching. Magic Finger senses touch through an optical mouse sensor, enabling any surface to act as a touch screen. Magic Finger also senses texture through a micro RGB camera, allowing contextual actions to be carried out based on the particular surface being touched. A technical evaluation shows that Magic Finger can accurately

© All rights reserved Yang et al. and/or ACM Press


Yang, Xing-Dong, Grossman, Tovi, Irani, Pourang and Fitzmaurice, George (2011): TouchCuts and TouchZoom: enhanced target selection for touch displays using finger proximity sensing. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 2585-2594.

Although touch-screen laptops are increasing in popularity, users still do not comfortably rely on touch in these environments, as current software interfaces were not designed for finger input. In this paper, we first demonstrate the benefits of using touch as a complementary input modality along with the keyboard and mouse or touchpad in a laptop setting. To alleviate the frustration users experience with touch, we then design two techniques: TouchCuts, a single-target expansion technique, and TouchZoom, a multiple-target expansion technique. Both techniques facilitate the selection of small icons by detecting the finger's proximity above the display surface and expanding the target as the finger approaches. In a controlled evaluation, we show that our techniques improve performance in comparison to both the computer mouse and a baseline touch-based target acquisition technique. We conclude by discussing other application scenarios that our techniques support.

© All rights reserved Yang et al. and/or their publisher
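The proximity-driven expansion described above can be sketched as a scale factor that grows as the finger descends toward the display. This is a hypothetical sketch; the trigger height, the maximum scale, and the linear interpolation are assumptions, not parameters from the paper.

```python
def expansion_scale(finger_height_mm, trigger_height_mm=30.0, max_scale=3.0):
    """Scale factor for a target as the finger approaches the display.
    Above the trigger height the target stays at its normal size; below
    it, the scale grows linearly to max_scale at touch-down (height 0)."""
    if finger_height_mm >= trigger_height_mm:
        return 1.0
    t = 1.0 - finger_height_mm / trigger_height_mm  # 0 at trigger, 1 at touch
    return 1.0 + t * (max_scale - 1.0)
```

The key property is that expansion begins before contact, so by the time the finger lands, the effective target is already larger than its on-screen footprint.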


Yang, Xing-Dong, Mak, Edward, McCallum, David, Irani, Pourang, Cao, Xiang and Izadi, Shahram (2010): LensMouse: augmenting the mouse with an interactive touch display. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2431-2440.

We introduce LensMouse, a novel device that embeds a touch-screen display -- or tangible 'lens' -- onto a mouse. Users interact with the display of the mouse using direct touch, whilst also performing regular cursor-based mouse interactions. We demonstrate some of the unique capabilities of such a device, in particular for interacting with auxiliary windows, such as toolbars, palettes, pop-ups and dialog-boxes. By migrating these windows onto LensMouse, challenges such as screen real-estate use and window management can be alleviated. In a controlled experiment, we evaluate the effectiveness of LensMouse in reducing cursor movements for interacting with auxiliary windows. We also consider concerns about the view separation that results from introducing such a display-based device. Our results reveal that overall users are more effective with LensMouse than with auxiliary application windows that are managed either in single or dual-monitor setups. We conclude by presenting other application scenarios that LensMouse could support.

© All rights reserved Yang et al. and/or their publisher


Kadaba, Nivedita R., Yang, Xing-Dong and Irani, Pourang P. (2009): Facilitating multiple target tracking using semantic depth of field (SDOF). In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4375-4380.

Users of radar control systems and monitoring applications have to constantly extract essential information from dynamic scenes. In these environments a critical and elemental task consists of tracking multiple targets that are moving simultaneously. However, focusing on multiple moving targets is not trivial, as it is very easy to lose continuity, particularly when the objects are situated within a very dense or cluttered background. While focus+context displays have been developed to improve users' ability to attend to important visual information, such techniques have not been applied to the visualization of moving objects. In this paper we evaluate the effectiveness of a focus+context technique, referred to as Semantic Depth of Field (SDOF), for the task of facilitating multiple target tracking. Results of our studies show a tendency toward better performance with SDOF techniques, especially in low-contrast scenarios.

© All rights reserved Kadaba et al. and/or ACM Press
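The core idea of SDOF is that blur is driven by semantic relevance rather than simulated camera optics: tracked targets stay sharp while distractors are blurred. A minimal sketch, assuming a relevance score in [0, 1] and a linear mapping to blur radius (both are assumptions, not values from the paper):

```python
def sdof_blur_radius(is_target, relevance, max_blur_px=8.0):
    """Semantic depth of field: tracked targets are rendered sharp
    (blur 0), while other scene objects are blurred in proportion to
    how irrelevant they are. relevance is a score in [0, 1]."""
    if is_target:
        return 0.0
    return max_blur_px * (1.0 - relevance)
```

Each blur radius would then be fed to a conventional blur filter at render time, so attention is pulled toward the sharp, semantically important objects.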


Yang, Xing-Dong, Irani, Pourang, Boulanger, Pierre and Bischof, Walter (2009): One-handed behind-the-display cursor input on mobile devices. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4501-4506.

Behind-the-display interaction has gained popularity for interactions on handheld devices as researchers have demonstrated the viability of such interactions on small devices. However, most designs have investigated the use of direct input behind the screen. We demonstrate that behind-the-display interaction with cursor input is promising and can be a useful augmentation to handheld devices. We developed a prototypical system on a PDA to which we affixed a wireless mouse. The mouse is mounted on the rear of the PDA with the optical sensor facing outwards. The system is designed to be used with one hand, and it avoids both occlusion and finger-reach problems. Through several applications we illustrate the benefits associated with behind-the-display cursor interaction. A preliminary user evaluation indicates that users can benefit from such an interaction when operating a handheld using one hand.

© All rights reserved Yang et al. and/or ACM Press
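Because the rear-mounted optical sensor faces away from the user, its horizontal axis is reversed relative to the front screen. One plausible delta mapping is sketched below; the mirrored x axis and the clamping are assumptions about the prototype, not details given in the abstract.

```python
def rear_mouse_to_cursor(dx, dy, cursor_x, cursor_y, screen_w, screen_h):
    """Map movement deltas from a rear-mounted optical mouse sensor to
    front-screen cursor motion. The horizontal axis is mirrored so that
    finger motion, as seen from the front, matches cursor motion."""
    new_x = min(max(cursor_x - dx, 0), screen_w)  # mirror x for the rear sensor
    new_y = min(max(cursor_y + dy, 0), screen_h)
    return (new_x, new_y)
```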


Yang, Xing-Dong, Mak, Edward, Irani, Pourang and Bischof, Walter F. (2009): Dual-Surface input: augmenting one-handed interaction with coordinated front and behind-the-screen input. In: Proceedings of 11th Conference on Human-computer interaction with mobile devices and services 2009. p. 5.

Interaction patterns with handheld mobile devices are constantly evolving. Researchers have observed that users prefer to interact with mobile devices using one hand. However, only a few interaction techniques support this mode of operation. We show that one-handed operation can be enhanced with coordinated input on both the front and the back of a mobile device, which we term Dual-Surface interaction. We present some of the design rationale for introducing coordinated Dual-Surface interactions. We demonstrate that several tasks, including target selection, benefit from Dual-Surface input, which allows users to rapidly select small targets in locations that are less accessible when interacting with the thumb in one-handed input. Furthermore, we demonstrate the benefits of virtual enhancements that are possible with behind-the-display relative input to perform complex tasks, such as steering. Our results show that Dual-Surface interactions offer numerous benefits that are not available with input on the front or the back alone.

© All rights reserved Yang et al. and/or their publisher


Yang, Xing-Dong, Bischof, Walter F. and Boulanger, Pierre (2008): The Effects of Hand Motion on Haptic Perception of Force Direction. In: Ferre, Manuel (ed.) EuroHaptics 2008 - Haptics Perception, Devices and Scenarios - 6th International Conference June 10-13, 2008, Madrid, Spain. pp. 355-360.


Page Information

Page maintainer: The Editorial Team