Publication statistics

Publication period: 2003-2006
Publication count: 7
Number of co-authors: 16


Number of publications with 3 favourite co-authors:

Mark Altosaar:
David Holman:
Aadil Mamuji:



Productive colleagues

Changuk Sohn's 3 most productive colleagues, by number of publications:

Roel Vertegaal: 59
A. James Stewart: 8
David Holman: 7


Changuk Sohn

Picture of Changuk Sohn.

Publications by Changuk Sohn (bibliography)


Altosaar, Mark, Vertegaal, Roel, Sohn, Changuk and Cheng, Daniel (2006): AuraOrb: using social awareness cues in the design of progressive notification appliances. In: Kjeldskov, Jesper and Paay, Jane (eds.) Proceedings of OZCHI06, the CHISIG Annual Conference on Human-Computer Interaction 2006. pp. 159-166.

One of the problems with notification appliances is that they can be distracting when providing information not of immediate interest to the user. In this paper, we present AuraOrb, an ambient notification appliance that deploys progressive turn-taking techniques to minimize notification disruptions. AuraOrb uses social awareness cues, such as eye contact, to detect user interest in an initially ambient light notification. Once detected, it displays a text message with a notification heading visible from 360 degrees. Touching the orb causes the associated message to be displayed on the user's computer screen. When user interest is lost, AuraOrb automatically reverts to its idle state. We performed an initial evaluation of AuraOrb's functionality using a set of heuristics tailored to ambient displays. We compared progressive notification with the use of persistent ticker tape notifications and Outlook Express system tray messages for notifying the user of incoming emails. Results of our evaluation suggest that progressive turn-taking techniques allowed AuraOrb users to access notification headings with minimal impact on their focus task.

© All rights reserved Altosaar et al. and/or their publisher


Smith, John D., Vertegaal, Roel and Sohn, Changuk (2005): ViewPointer: lightweight calibration-free eye tracking for ubiquitous handsfree deixis. In: Proceedings of the 2005 ACM Symposium on User Interface Software and Technology 2005. pp. 53-61.

We introduce ViewPointer, a wearable eye contact sensor that detects deixis towards ubiquitous computers embedded in real world objects. ViewPointer consists of a small wearable camera no more obtrusive than a common Bluetooth headset. ViewPointer allows any real-world object to be augmented with eye contact sensing capabilities, simply by embedding a small infrared (IR) tag. The headset camera detects when a user is looking at an infrared tag by determining whether the reflection of the tag on the cornea of the user's eye appears sufficiently central to the pupil. ViewPointer not only allows any object to become an eye contact sensing appliance, it also allows identification of users and transmission of data to the user through the object. We present a novel encoding scheme used to uniquely identify ViewPointer tags, as well as a method for transmitting URLs over tags. We present a number of application scenarios as well as an analysis of design principles. We conclude that eye contact sensing input is best utilized to provide context to action.

© All rights reserved Smith et al. and/or ACM Press


Dickie, Connor, Vertegaal, Roel, Sohn, Changuk and Cheng, Daniel (2005): eyeLook: using attention to facilitate mobile media consumption. In: Proceedings of the 2005 ACM Symposium on User Interface Software and Technology 2005. pp. 103-106.

One of the problems with mobile media devices is that they may distract users during critical everyday tasks, such as navigating the streets of a busy city. We addressed this issue in the design of eyeLook: a platform for attention sensitive mobile computing. eyeLook appliances use embedded low cost eyeCONTACT sensors (ECS) to detect when the user looks at the display. We discuss two eyeLook applications, seeTV and seeTXT, that facilitate courteous media consumption in mobile contexts by using the ECS to respond to user attention. seeTV is an attentive mobile video player that automatically pauses content when the user is not looking. seeTXT is an attentive speed reading application that flashes words on the display, advancing text only when the user is looking. By making mobile media devices sensitive to actual user attention, eyeLook allows applications to gracefully transition users between consuming media and managing life.

© All rights reserved Dickie et al. and/or ACM Press


Smith, David, Donald, Matthew, Chen, Daniel, Cheng, Daniel, Sohn, Changuk, Mamuji, Aadil, Holman, David and Vertegaal, Roel (2005): OverHear: augmenting attention in remote social gatherings through computer-mediated hearing. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1801-1804.

One of the problems with mediated communication systems is that they limit the user's ability to listen to informal conversations of others within a remote space. In what is known as the Cocktail Party phenomenon, participants in noisy face-to-face conversations are able to focus their attention on a single individual, typically the person they look at. Media spaces do not support the cues necessary to establish this attentive mechanism. We addressed this issue in our design of OverHear, a media space that augments the user's attention in remote social gatherings through computer-mediated hearing. OverHear uses an eye tracker embedded in the webcam display to direct the focal point of a robotic shotgun microphone mounted in the remote space. This directional microphone is automatically pointed towards the currently observed individual, allowing the user to OverHear this person's conversations.

© All rights reserved Smith et al. and/or ACM Press


Vertegaal, Roel, Mamuji, Aadil, Sohn, Changuk and Cheng, Daniel (2005): Media eyepliances: using eye tracking for remote control focus selection of appliances. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1861-1864.

This paper discusses the use of eye contact sensing for focus selection operations in remote controlled media appliances. Focus selection with remote controls tends to be cumbersome as selection buttons place the remote in a device-specific modality. We addressed this issue with the design of Media EyePliances, home theatre appliances augmented with a digital eye contact sensor. An appliance is selected as the focus of remote commands by looking at its sensor. A central server subsequently routes all commands provided by remote, keyboard or voice input to the focus EyePliance. We discuss a calibration-free digital eye contact sensing technique that allows Media EyePliances to determine the user's point of gaze.

© All rights reserved Vertegaal et al. and/or ACM Press


Shell, Jeffrey S., Vertegaal, Roel, Cheng, Daniel, Skaburskis, Alexander W., Sohn, Changuk, Stewart, A. James, Aoudeh, Omar and Dickie, Connor (2004): ECSGlasses and EyePliances: using attention to open sociable windows of interaction. In: Duchowski, Andrew T. and Vertegaal, Roel (eds.) ETRA 2004 - Proceedings of the Eye Tracking Research and Application Symposium March 22-24, 2004, San Antonio, Texas, USA. pp. 93-100.


Vertegaal, Roel, Weevers, Ivo, Sohn, Changuk and Cheung, Chris (2003): GAZE-2: conveying eye contact in group video conferencing using eye-controlled camera direction. In: Cockton, Gilbert and Korhonen, Panu (eds.) Proceedings of the ACM CHI 2003 Human Factors in Computing Systems Conference April 5-10, 2003, Ft. Lauderdale, Florida, USA. pp. 521-528.


Page Information

Page maintainer: The Editorial Team