Number of co-authors: 16
Number of publications with 3 favourite co-authors:
- Roel Vertegaal: 4
- Daniel Cheng: 2
- Changuk Sohn: 2
Connor Dickie's 3 most productive colleagues in number of publications:
- Roel Vertegaal: 59
- Ted Selker: 37
- A. James Stewart: 8
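As an aside, counts like the co-author tallies above can be derived mechanically from the author rosters of the publications listed on this page. The sketch below is illustrative only (the author lists are transcribed from the bibliography entries below; the counting code is not part of the original page):

```python
from collections import Counter

# Author rosters of the five papers listed in this bibliography.
papers = [
    ["Sylvia Cheng", "Connor Dickie", "Andreas Hanewich-Hollatz",
     "Roel Vertegaal", "Justin Lee"],
    ["Chia-Hsun Jackie Lee", "Chaochi Chang", "Hyemin Chung",
     "Connor Dickie", "Ted Selker"],
    ["Connor Dickie", "Jamie Hart", "Roel Vertegaal", "Alex Eiser"],
    ["Connor Dickie", "Roel Vertegaal", "Changuk Sohn", "Daniel Cheng"],
    ["Jeffrey S. Shell", "Roel Vertegaal", "Daniel Cheng",
     "Alexander W. Skaburskis", "Changuk Sohn", "A. James Stewart",
     "Omar Aoudeh", "Connor Dickie"],
]

# Count how many of these papers each co-author shares with Connor Dickie.
coauthors = Counter(
    name
    for authors in papers
    for name in authors
    if name != "Connor Dickie"
)

print(coauthors.most_common(3))
```

Running this over the five entries reproduces the favourite-co-author figures shown above (Roel Vertegaal: 4, Daniel Cheng: 2, Changuk Sohn: 2).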
Publications by Connor Dickie (bibliography)
Cheng, Sylvia, Dickie, Connor, Hanewich-Hollatz, Andreas, Vertegaal, Roel and Lee, Justin (2011): Don't touch: social appropriateness of touch sensor placement on interactive lumalive e-textile shirts. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. p. 511.
In this video, we discuss the design of an e-textile shirt with an interactive Lumalive display featuring a touch-controlled image browser. To determine where to place touch sensors, we investigated which areas of the shirt users would be comfortable touching, or being touched on, measured by how often participants opted out of touches. For both touchers and touchees, opt-outs occurred mostly on the upper chest; on the front, the upper chest and lower abdominal zones were the least comfortable, and participants were also less comfortable with touches on the lower back. We conclude that the most appropriate areas for touch sensors on a shirt are the arms, shoulders, and upper back.
© All rights reserved Cheng et al. and/or their publisher
Lee, Chia-Hsun Jackie, Chang, Chaochi, Chung, Hyemin, Dickie, Connor and Selker, Ted (2007): Emotionally reactive television. In: Proceedings of the 2007 International Conference on Intelligent User Interfaces 2007. pp. 329-332.
When is an interface simple? Is it when it is invisible, or when it is very obvious, even intrusive? Since its creation, watching TV has been considered a static activity: audiences have very limited ways to interact with the TV, such as turning it on and off, adjusting the volume, and switching channels. This paper suggests that, as technology matures, TV programs should respond socially to people, for example by affording and accepting the audience's emotional expression. It presents HiTV, an emotionally reactive TV system that uses a digitally augmented soft ball as an affect-input interface to amplify a TV program's video and audio signals. HiTV transforms the original video and audio into effects that intrigue viewers and fulfill their emotional expectations.
© All rights reserved Lee et al. and/or ACM Press
Dickie, Connor, Hart, Jamie, Vertegaal, Roel and Eiser, Alex (2006): LookPoint: an evaluation of eye input for hands-free switching of input devices between multiple computers. In: Kjeldskov, Jesper and Paay, Jane (eds.) Proceedings of OZCHI06, the CHISIG Annual Conference on Human-Computer Interaction 2006. pp. 119-126.
We present LookPoint, a system that uses eye input for switching input between multiple computing devices. LookPoint uses an eye tracker to detect which screen the user is looking at, and then automatically routes mouse and keyboard input to the computer associated with that screen. We evaluated the use of eye input for switching between three computer monitors during a typing task, comparing its performance with that of three other selection techniques: multiple keyboards, function key selection, and mouse selection. Results show that the use of eye input is 111% faster than the mouse, 75% faster than function keys, and 37% faster than the use of multiple keyboards. A user satisfaction questionnaire showed that participants also preferred eye input over the other three techniques. The implications of this work are discussed, as well as future calibration-free implementations.
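The routing idea the abstract describes — gaze selects a screen, and input events are forwarded to the computer mapped to that screen — can be reduced to a small dispatcher. This is a hypothetical sketch, not the authors' implementation; the class and mapping names are illustrative:

```python
class GazeInputRouter:
    """Forward keyboard/mouse events to whichever computer owns
    the screen the user is currently fixating (illustrative only)."""

    def __init__(self, screen_to_computer):
        # e.g. {"left": "pc1", "centre": "pc2", "right": "pc3"}
        self.screen_to_computer = screen_to_computer
        self.active_screen = None

    def on_gaze(self, screen):
        # Called by the eye tracker when a new screen is fixated.
        self.active_screen = screen

    def route(self, event):
        # Deliver the input event to the computer for the fixated screen.
        target = self.screen_to_computer.get(self.active_screen)
        return (target, event)


router = GazeInputRouter({"left": "pc1", "centre": "pc2", "right": "pc3"})
router.on_gaze("right")
print(router.route("keypress:a"))
```

The point of the design is that switching costs the user nothing beyond the glance they were already making at the target screen, which is why the paper reports it outperforming explicit selection techniques.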
© All rights reserved Dickie et al. and/or their publisher
Dickie, Connor, Vertegaal, Roel, Sohn, Changuk and Cheng, Daniel (2005): eyeLook: using attention to facilitate mobile media consumption. In: Proceedings of the 2005 ACM Symposium on User Interface Software and Technology 2005. pp. 103-106.
One of the problems with mobile media devices is that they may distract users during critical everyday tasks, such as navigating the streets of a busy city. We addressed this issue in the design of eyeLook: a platform for attention sensitive mobile computing. eyeLook appliances use embedded low cost eyeCONTACT sensors (ECS) to detect when the user looks at the display. We discuss two eyeLook applications, seeTV and seeTXT, that facilitate courteous media consumption in mobile contexts by using the ECS to respond to user attention. seeTV is an attentive mobile video player that automatically pauses content when the user is not looking. seeTXT is an attentive speed reading application that flashes words on the display, advancing text only when the user is looking. By making mobile media devices sensitive to actual user attention, eyeLook allows applications to gracefully transition users between consuming media, and managing life.
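The seeTV behaviour described above — playback continues only while an eye-contact sensor reports that the user is looking at the display — amounts to a tiny attention-gated state machine. A minimal sketch, assuming a sensor that delivers boolean look/no-look events (names are illustrative, not from the paper):

```python
class AttentiveVideoPlayer:
    """Pause media whenever the eye-contact sensor reports that the
    user has looked away; resume when eye contact returns (sketch)."""

    def __init__(self):
        self.playing = False

    def on_eye_contact(self, user_is_looking: bool):
        # Play only while the sensor reports eye contact with the display.
        self.playing = user_is_looking


player = AttentiveVideoPlayer()
player.on_eye_contact(True)   # user looks at the screen -> play
player.on_eye_contact(False)  # user looks away -> pause
```

seeTXT's behaviour is the same gate applied to advancing flashed words rather than to video playback.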
© All rights reserved Dickie et al. and/or ACM Press
Shell, Jeffrey S., Vertegaal, Roel, Cheng, Daniel, Skaburskis, Alexander W., Sohn, Changuk, Stewart, A. James, Aoudeh, Omar and Dickie, Connor (2004): ECSGlasses and EyePliances: using attention to open sociable windows of interaction. In: Duchowski, Andrew T. and Vertegaal, Roel (eds.) ETRA 2004 - Proceedings of the Eye Tracking Research and Application Symposium March 22-24, 2004, San Antonio, Texas, USA. pp. 93-100.
Changes to this page (author):
05 Jul 2011: Modified
01 Jun 2009: Modified
24 Jul 2007: Modified
11 Jun 2007: Added
Page maintainer: The Editorial Team