Number of co-authors: 35
Number of publications with 3 favourite co-authors: Jinah Park, Sangwon Choi, Jaehyun Han
Geehyuk Lee's 3 most productive colleagues in number of publications: Woohun Lee (20), Hyunjung Kim (7), Jinah Park (6)
Publications by Geehyuk Lee (bibliography)
Kim, Sunjun and Lee, Geehyuk (2012): Restorable backspace. In: Adjunct Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 73-74. Available online
This paper presents Restorable Backspace, an input helper for correcting mistyping. It stores characters deleted by backspace keystrokes and restores them in the retyping phase. We developed a restoration algorithm that compares the deleted and retyped characters and makes a suggestion while the user retypes. In a pilot study we observed the algorithm working as expected in most cases, and all participants expressed satisfaction with the concept of Restorable Backspace.
© All rights reserved Kim and Lee and/or ACM Press
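The abstract above leaves the restoration algorithm unspecified; a minimal sketch of the idea, assuming a simple positional alignment between the deleted and retyped text (our own simplification, not the paper's actual comparison), might look like:

```python
def suggest_restoration(deleted, retyped):
    """Suggest the tail of the deleted text once the user has retyped
    (and presumably corrected) its beginning.

    Positional alignment is a simplifying assumption made here; the
    paper's character comparison is more elaborate than shown.
    """
    if not retyped or len(retyped) >= len(deleted):
        return None
    return deleted[len(retyped):]

# The user typed "teh quick", backspaced over all of it, and has
# retyped the corrected prefix "the"; the rest can be restored.
assert suggest_restoration("teh quick", "the") == " quick"
assert suggest_restoration("teh quick", "") is None
```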
Choi, Sangwon, Han, Jaehyun, Lee, Geehyuk, Lee, Narae and Lee, Woohun (2011): RemoteTouch: touch-screen-like interaction in the tv viewing environment. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 393-402. Available online
We explored the possibility of touch-screen-like interaction with a remote control in the TV-viewing environment. A shadow representing the user's thumb touches the screen, presses a button, flicks a cover-flow list, and draws a simple stroke, while the thumb stays and moves on and above the touchpad. In order to implement the concept we developed an optical touchpad for tracking the thumb hovering over its surface, and designed a TV application to demonstrate possible new interaction styles. Throughout two iterations of prototyping, we corrected some of our false expectations, and also verified its potential as a viable option for a TV remote control. This paper presents technical issues and requirements for the hover-tracking touchpad and a complete report of our user studies to explore touch-screen-like interaction for the TV.
© All rights reserved Choi et al. and/or their publisher
Lee, Seunghwan, Han, Jaehyun and Lee, Geehyuk (2011): MultiPress: releasing keys for multitap segmentation. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 1489-1494. Available online
While MultiTap is one of the most popular text entry methods for mobile phones, it has a fundamental weakness known as the MultiTap segmentation problem. Based on the observation that the thumb does not leave the keys between tapping actions, we designed a MultiTap segmentation method in which the release action of the thumb indicates input completion. A user study using a touch-sensing keypad prototype to explore the feasibility of the idea, and a comparison test to assess its benefit, revealed promising results supporting the effectiveness of the proposed segmentation method.
© All rights reserved Lee et al. and/or their publisher
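The release-based segmentation described above can be sketched as follows. The event model (a stream of key presses plus a thumb-lift event that commits the pending letter) is our own assumption for illustration, not the paper's implementation:

```python
KEY_LETTERS = {"2": "abc", "3": "def"}  # standard phone keypad groups

def multitap(events):
    """Simulate MultiTap where lifting the thumb off the key (rather
    than a timeout) commits the current letter -- a sketch of the
    release-based segmentation idea.

    `events` is a sequence of ("press", key) and ("lift", key) tuples;
    repeated presses cycle through the key's letters, and a lift
    commits the current selection.
    """
    text, cur_key, presses = "", None, 0
    for kind, key in events:
        if kind == "press":
            if key == cur_key:
                presses += 1            # cycle to the next letter
            else:
                cur_key, presses = key, 1
        elif kind == "lift" and key == cur_key and presses:
            letters = KEY_LETTERS[key]
            text += letters[(presses - 1) % len(letters)]
            cur_key, presses = None, 0
    return text

# "b" then "a" are on the same key -- the classic segmentation case
# that a timeout-based MultiTap handles awkwardly.
events = [("press", "2"), ("press", "2"), ("lift", "2"),
          ("press", "2"), ("lift", "2"),
          ("press", "3"), ("lift", "3")]
assert multitap(events) == "bad"
```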
Heo, Seongkook and Lee, Geehyuk (2011): Force gestures: augmented touch screen gestures using normal and tangential force. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 1909-1914. Available online
Similar sliding gestures may have different meanings when they are performed with changing intensity. Touch screens, however, fail to properly distinguish those intensities due to their inability to sense variable pressures. Enabled by distinguishing normal and tangential forces, we explore new possibilities for gestures on a touch screen. We have implemented a pressure-sensitive prototype and have designed a set of gestures that utilize alterable forces. The gestures' feasibility has been tested through a simple experiment. Finally, we discuss the new possibility of touch interactions that are sensitive to pressure.
© All rights reserved Heo and Lee and/or their publisher
Heo, Seongkook and Lee, Geehyuk (2011): ForceTap: extending the input vocabulary of mobile touch screens by adding tap gestures. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 113-122. Available online
We introduce an interaction technique that increases the touch screen input vocabulary by distinguishing a strong tap from a gentle tap without the use of additional hardware. We have designed and validated an algorithm that detects different types of screen touches by combining data from the built-in accelerometer with position data from the touch screen. The proposed technique allows a touch screen input to contain not only the position of a finger contact, but also its type, i.e., whether the contact is a 'Tap' or a 'ForceTap.' To verify the feasibility of the proposed technique we have implemented our detection algorithm in experiments that test cases of single-handed, two-handed, immersive, and on the move usage. Based on the experimental results, we investigate the advantages of using two types of touch inputs and discuss emerging issues. Finally, we suggest a design guideline for applying the proposed technique to touch screen applications, and present possible application scenarios.
© All rights reserved Heo and Lee and/or ACM Press
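The detection idea above (combining built-in accelerometer data with touch events) can be sketched as a simple peak-magnitude classifier. The threshold value and the gravity-removed input are assumptions for illustration; the paper's actual algorithm and parameters are not reproduced here:

```python
def classify_tap(accel_samples, threshold=2.5):
    """Classify a touch as a gentle 'Tap' or a strong 'ForceTap' from
    the peak device acceleration around touch-down.

    accel_samples -- accelerometer readings (m/s^2, gravity removed)
                     in a short window around the touch event.
    threshold     -- peak-magnitude cutoff; a hypothetical tuning
                     parameter, not a value from the paper.
    """
    peak = max(abs(a) for a in accel_samples)
    return "ForceTap" if peak > threshold else "Tap"

# A gentle tap barely moves the device; a strong tap makes it jerk.
assert classify_tap([0.1, 0.4, 0.3]) == "Tap"
assert classify_tap([0.2, 5.1, 1.0]) == "ForceTap"
```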
Heo, Seongkook, Han, Jaehyun, Choi, Sangwon, Lee, Seunghwan, Lee, Geehyuk, Lee, Hyong-Euk, Kim, SangHyun, Bang, Won-Chul, Kim, DoKyoon and Kim, Changyeong (2011): IrCube tracker: an optical 6-DOF tracker based on LED directivity. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 577-586. Available online
Six-degrees-of-freedom (6-DOF) trackers, which were mainly for professional computer applications, are now in demand by everyday consumer applications. With the requirements of consumer electronics in mind, we designed an optical 6-DOF tracker where a few photo-sensors can track the position and orientation of an LED cluster. The operating principle of the tracker is basically source localization by solving an inverse problem. We implemented a prototype system for a TV viewing environment, verified the feasibility of the operating principle, and evaluated the basic performance of the prototype system in terms of accuracy and speed. We also examined its application possibility to different environments, such as a tabletop computer, a tablet computer, and a mobile spatial interaction environment.
© All rights reserved Heo et al. and/or ACM Press
Heo, Seongkook and Lee, Geehyuk (2011): Force gestures: augmenting touch screen gestures with normal and tangential forces. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 621-626. Available online
Force gestures are touch screen gestures augmented by the normal and tangential forces on the screen. In order to study the feasibility of the force gestures on a mobile touch screen, we implemented a prototype touch screen device that can sense the normal and tangential forces of a touch gesture on the screen. We also designed two example applications, a web browser and an e-book reader, that utilize the force gestures for their primary actions. We conducted a user study with the prototype and the applications to study the characteristics of the force gestures and the effectiveness of their mapping to the primary actions. In the user study we could also discover interesting usability issues and collect useful user feedback about the force gestures and their mapping to GUI actions.
© All rights reserved Heo and Lee and/or ACM Press
Choi, Sangwon, Han, Jaehyun, Kim, Sunjun, Heo, Seongkook and Lee, Geehyuk (2011): ThickPad: a hover-tracking touchpad for a laptop. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 15-16. Available online
We explored the use of a hover-tracking touchpad in a laptop environment. In order to study the new experience, we implemented a prototype touchpad consisting of infrared LEDs and photo-transistors, which can track fingers up to 10 mm above the surface. We demonstrate here three major interaction techniques that become possible when a hover-tracking touchpad meets a laptop.
© All rights reserved Choi et al. and/or ACM Press
Gu, Jiseong and Lee, Geehyuk (2011): TouchString: a flexible linear multi-touch sensor for prototyping a freeform multi-touch surface. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 75-76. Available online
We propose the concept of prototyping a multi-touch surface of an arbitrary form using a flexible linear multi-touch sensor that we call TouchString. We defined the conceptual structure of a TouchString, and implemented an example prototype of a TouchString. We verified the feasibility of the concept by demonstrating a few basic application scenarios using the prototype.
© All rights reserved Gu and Lee and/or ACM Press
Lee, Seunghwan, Seo, Jungsuk and Lee, Geehyuk (2010): An adaptive speed-call list algorithm and its evaluation with ESM. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2019-2022. Available online
We designed an algorithm to build a speed-call list adaptively based on mobile phone call logs. Call logs provide the time-dependent calling patterns of mobile phone users, and therefore a speed-call list based on them will be more successful in recommending a desired number than a speed-call list based on recent calls only. This paper presents the design process of our algorithm for an adaptive speed-call list, its verification result with recorded call logs, and in-situ evaluation results of the algorithm using an Experience Sampling Method (ESM) system.
© All rights reserved Lee et al. and/or their publisher
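The time-dependent idea above (ranking numbers by calling patterns at the current hour, not just recency) can be sketched as below. The scoring weights and window are hypothetical, chosen only to illustrate the concept of an adaptive, hour-aware speed-call list:

```python
from collections import defaultdict

def speed_call_list(call_log, now_hour, n=5, window=2):
    """Rank contacts by how often they were called near the current
    hour of day -- a sketch of a time-dependent speed-call list, not
    the paper's exact algorithm.

    call_log -- list of (contact, hour_of_day) tuples from past calls.
    """
    score = defaultdict(float)
    for contact, hour in call_log:
        # Circular hour distance, so 23:00 counts as close to 01:00.
        dist = min(abs(hour - now_hour), 24 - abs(hour - now_hour))
        # Calls near the current hour weigh more; every call counts a bit.
        score[contact] += 2.0 if dist <= window else 0.5
    return sorted(score, key=lambda c: (-score[c], c))[:n]

log = [("mom", 19), ("mom", 20),
       ("office", 9), ("office", 10), ("office", 11)]
assert speed_call_list(log, now_hour=9)[0] == "office"
assert speed_call_list(log, now_hour=20)[0] == "mom"
```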
Shin, Heesook, Lee, Woohun, Lee, Geehyuk and Cho, Ilyeon (2009): Multi-point touch input method for Korean text entry. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 3871-3876. Available online
Multi-touch interfaces are becoming popular as a new input means for various applications. In this paper, we propose a new Korean text entry method using a multi-touch interface, called the MPT (Multi-Point Touch) input method. We conducted a text entry performance test comprising 4 sessions with 10 participants, and compared the result with an existing commercial SPT (Single-Point Touch) input method. The experimental results show that the entry speed of MPT was slower than that of the SPT method in the initial session. However, the entry speed of the MPT input method improved more rapidly than that of the SPT method as the sessions proceeded, and we observed a statistically significant learning effect for the MPT method. Moreover, we found no significant difference between the task loads of the SPT and MPT input methods.
© All rights reserved Shin et al. and/or ACM Press
Kim, Hyunjung, Kim, Seoktae, Lee, Boram, Pak, Jinhee, Sohn, Minjung, Lee, Geehyuk and Lee, Woohun (2008): Digital rubbing: playful and intuitive interaction technique for transferring a graphic image onto paper with pen-based computing. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 2337-2342. Available online
In this paper, we introduce digital rubbing, a playful and intuitive interaction technique for transferring a graphic image directly onto paper. We designed TransPen and MimeoPad to realize digital rubbing. With these drawing tools, children and adults can use rubbing motions to transfer a digital image directly to paper and produce a drawing with a personal touch and natural texture, just as in traditional rubbing. We expect that the digital rubbing technique will be useful in art and design as a new means of expression in the process of drawing and editing ideas. In addition, the suggested interaction devices have the potential to become new drawing toys for children.
© All rights reserved Kim et al. and/or ACM Press
Lee, Seunghwan, Lee, Geehyuk and Kim, Hojin (2008): Ubigraphy: a third-person viewpoint life log. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 3531-3536. Available online
A traditional life-log is written from the first-person viewpoint, since a user collects data using sensors worn on the body. UbiGraphy, which we introduce here, is a third-person viewpoint life-log made possible by spontaneous interaction between a wearable computer and smart objects in a ubiquitous computing environment. The wearable computer uses smart objects in the vicinity to capture the user's smiles, poses, and even songs from the third-person viewpoint, and then writes a life-log in which the user appears. This paper presents the design of a protocol that enables UbiGraphy and our first prototyping effort for experiencing UbiGraphy.
© All rights reserved Lee et al. and/or ACM Press
Yoon, Youngwoo, Ahn, Yuri, Lee, Geehyuk, Hong, Sungmoo and Kim, Minjeong (2008): Context-aware photo selection for promoting photo consumption on a mobile phone. In: Hofte, G. Henri ter, Mulder, Ingrid and Ruyter, Boris E. R. de (eds.) Proceedings of the 10th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2008 September 2-5, 2008, Amsterdam, the Netherlands. pp. 33-42. Available online
Lee, Seunghwan, Kim, Hojin, Yun, Sumi and Lee, Geehyuk (2007): A Feasibility Study of Sixth Sense Computing Scenarios in a Wearable Community. In: Jacko, Julie A. (ed.) HCI International 2007 - 12th International Conference - Part II July 22-27, 2007, Beijing, China. pp. 1155-1164. Available online
Choi, Jinhyuk, Lee, Geehyuk and Um, Yonghoon (2007): Analysis of Internet Users' Interests Based on Windows GUI Messages. In: Jacko, Julie A. (ed.) HCI International 2007 - 12th International Conference - Part IV 2007. pp. 881-888. Available online
Yoon, Youngwoo and Lee, Geehyuk (2007): BetweenKeys: Looking for Room Between Keys. In: Jacko, Julie A. (ed.) HCI International 2007 - 12th International Conference - Part II July 22-27, 2007, Beijing, China. pp. 504-512. Available online
Kim, Daeseok, Yoon, Youngwoo, Hwang, Sunyu, Lee, Geehyuk and Park, Jinah (2007): Visualizing Spray Paint Deposition in VR Training. In: Sherman, William R., Lin, Ming C. and Steed, Anthony (eds.) IEEE Virtual Reality Conference, VR 2007 10-14 March, 2007, Charlotte, NC, USA. pp. 307-308. Available online
Sohn, Byungkon and Lee, Geehyuk (2005): Circle & identify: interactivity-augmented object recognition for handheld devices. In: Proceedings of the 2005 ACM Symposium on User Interface Software and Technology 2005. pp. 107-110. Available online
The first requirement of a "spatial mouse" is the ability to identify the object that it is aiming at. Among the many possible technologies that could be employed for this purpose, the best solution would probably be object recognition by machine vision. The problem, however, is that object recognition algorithms are not yet reliable enough or lightweight enough for handheld devices. This paper demonstrates that a simple object recognition algorithm can become a practical solution when augmented by interactivity. The user draws a circle around a target using a spatial mouse, and the mouse captures a series of camera frames. The frames can easily be stitched together to give a target image separated from the background, after which only feature extraction and object classification remain. We present here results from two experiments with a few household objects.
© All rights reserved Sohn and Lee and/or ACM Press
Hwang, Sunyu, Lee, Geehyuk, Jeong, Buyong, Lee, Woohun and Cho, Ilyeon (2005): FeelTip: tactile input device for small wearable information appliances. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1475-1478. Available online
The ever-decreasing size of information devices these days does not allow even the space for small input devices such as a touchpad or a 3x4 keypad. We introduce here an input device, FeelTip, as a solution for very small information devices. The main idea is to exchange the usual roles of a finger and a surface in a touchpad: the device has a tip, and the finger now provides the surface. The result is an input device that requires minimal space but is potentially more efficient than a touchpad due to the tactile feedback of the tip on the finger. Our first prototype consists of a transparent tip and a small CMOS image sensor that tracks the movement of a finger on the tip. In a series of experiments, it outperformed a small analog joystick in free pointing tasks, and was comparable to a 3x4 keypad in text entry tasks.
© All rights reserved Hwang et al. and/or ACM Press
Hwang, Sunyu and Lee, Geehyuk (2005): Qwerty-like 3x4 keypad layouts for mobile phone. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1479-1482. Available online
Most computer users are accustomed to the QWERTY keyboard layout. This study started from the hypothesis that a user's skill with a QWERTY keyboard may transfer to a 3x4 keypad environment. In order to test the hypothesis, we designed an experiment in which users were instructed to type a series of sentences on a "blank" keypad after being informed that the underlying layout was either QWERTY-like or ABC-type (alphabetical). We observed a more localized distribution of typed characters over keys in the QWERTY-like case than in the ABC case. Encouraged by the results, we carried out a series of experiments to compare a QWERTY-like layout and an ABC-type layout, and obtained consistently better learning curves and better final typing speeds with the QWERTY-like keypad. In an effort to explain the results, we carried out an eye-gaze analysis for the two cases, and the results are presented.
© All rights reserved Hwang and Lee and/or ACM Press
Sohn, Misook and Lee, Geehyuk (2005): ISeeU: camera-based user interface for a handheld computer. In: Proceedings of 7th conference on Human-computer interaction with mobile devices and services 2005. pp. 299-302. Available online
"Tilt scrolling" and "peephole display" are popular user interface ideas for small computers, and inertial sensors were often the choice for the realization of such ideas. Inertial sensors, however, have two fundamental limitations; the frame of reference is not the user but the earth, and drifting errors are difficult to overcome. A possibly better solution, free from such limitations, is machine vision. Machine vision was a luxury for small computers but is becoming a practical solution because a camera is now a common component of small computers and the required vision algorithm is already running in optical mice. The vision algorithm of our prototype, which we called ISeeU, tracks simple features in the user's face and calculates lateral displacements and changes in distance, which in turn are used to control scrolling and zooming. An informal test with scrolling tasks indicates that its performance is comparable with a user interface using arrow keys.
© All rights reserved Sohn and Lee and/or ACM Press