Number of co-authors: 9
Number of publications with 3 favourite co-authors: Ji Soo Yi: 4, Julie A. Jacko: 4, Chris M. Law: 2
Young Sang Choi's 3 most productive colleagues in number of publications: Andrew Sears: 86, Julie A. Jacko: 78, Ji Soo Yi: 13
Young Sang Choi
Publications by Young Sang Choi (bibliography)
Lee, Hosub, Choi, Young Sang and Lee, Sunjae (2012): Mobile posture monitoring system to prevent physical health risk of smartphone users. In: Proceedings of the 2012 International Conference on Ubiquitous Computing 2012. pp. 592-593.
With the widespread use of smartphones, users tend to use their devices for long periods of time in unhealthy postures: bending the neck forward and closely watching the relatively small screen with concentration. If users keep such unhealthy postures for a long time, they are susceptible to musculoskeletal disorders and eye problems, such as cervical disc disease and myopia, respectively. To prevent these conditions, we propose a new methodology to monitor the posture of smartphone users with built-in sensors. The proposed mechanism estimates values representing user posture, such as the tilt angle of the neck, viewing distance, and gaze condition, by analyzing sensor data from a front-facing camera, 3-axis accelerometer, orientation sensor, or any combination thereof, and warns the user if the estimated values remain in an abnormal range beyond an allowed time. Via the proposed mechanism, users become aware of their unhealthy postures and can then try to correct them.
© All rights reserved Lee et al. and/or ACM Press
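The warning logic the abstract describes (estimate tilt from accelerometer data, warn if it stays abnormal too long) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the thresholds, the single-sensor tilt estimate, and the class name are all hypothetical.

```python
import math

# Hypothetical thresholds -- the paper does not publish exact values.
MAX_TILT_DEG = 30.0        # device tilt considered unhealthy beyond this
MAX_UNHEALTHY_SECS = 60.0  # sustained duration before warning the user

def tilt_angle(ax, ay, az):
    """Device tilt from vertical, in degrees, from one 3-axis
    accelerometer sample (m/s^2); angle between gravity and the z-axis."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        return 0.0
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

class PostureMonitor:
    """Tracks how long the estimated posture has been out of range."""
    def __init__(self):
        self.unhealthy_since = None

    def update(self, t, ax, ay, az):
        """Feed one sample taken at time t (seconds); True means warn."""
        if tilt_angle(ax, ay, az) > MAX_TILT_DEG:
            if self.unhealthy_since is None:
                self.unhealthy_since = t
            return t - self.unhealthy_since >= MAX_UNHEALTHY_SECS
        self.unhealthy_since = None  # posture recovered; reset the timer
        return False
```

In the paper the estimate fuses camera, accelerometer, and orientation data; the sketch uses the accelerometer alone to keep the time-over-threshold idea visible.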
Lee, Hosub and Choi, Young Sang (2011): Fit your hand: personalized user interface considering physical attributes of mobile device users. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 59-60.
We present a mobile user interface which dynamically reformulates its layout based on users' touch input patterns. By analyzing touch input, it infers users' physical characteristics, such as handedness, finger length, and usage habits, and thereby calculates the optimal touch area for each user. The user interface gradually adapts to each user by automatically rearranging graphic objects, such as application icons, to the positions that are easiest to touch. To compute the optimal touch area, we designed a software architecture and implemented an Android application that analyzes touch input and determines the touch frequency in specific screen areas as well as the handedness and hand size of users. As a proof of concept, this research prototype shows acceptable performance and accuracy. To decide which items should be placed in the optimal touch area, we plan to integrate into the proposed system our machine learning algorithm, which prioritizes applications according to the user's context.
© All rights reserved Lee and Choi and/or ACM Press
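The touch-frequency part of the approach (count touches per screen region, then favour the most-touched regions when placing icons) could be sketched like this. The grid dimensions and class names are illustrative assumptions; the paper's actual model also infers handedness and hand size, which this sketch omits.

```python
from collections import Counter

GRID_COLS, GRID_ROWS = 4, 6  # hypothetical home-screen icon grid

class TouchAreaModel:
    """Accumulates touch positions and ranks grid cells by touch frequency."""
    def __init__(self, screen_width, screen_height):
        self.width = screen_width
        self.height = screen_height
        self.counts = Counter()

    def record_touch(self, x, y):
        """Map a touch at pixel (x, y) to its grid cell and count it."""
        col = min(int(x * GRID_COLS / self.width), GRID_COLS - 1)
        row = min(int(y * GRID_ROWS / self.height), GRID_ROWS - 1)
        self.counts[(row, col)] += 1

    def easiest_cells(self, n):
        """Return the n most frequently touched cells, best first;
        candidate positions for the highest-priority icons."""
        return [cell for cell, _ in self.counts.most_common(n)]
```

A layout manager would then move the highest-priority application icons into the cells returned by `easiest_cells`.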
Choi, Young Sang, Anderson, Cressel D., Glass, Jonathan D. and Kemp, Charles C. (2008): Laser pointers and a touch screen: intuitive interfaces for autonomous mobile manipulation for the motor impaired. In: Tenth Annual ACM SIGACCESS Conference on Assistive Technologies 2008. pp. 225-232.
El-E ("Ellie") is a prototype assistive robot designed to help people with severe motor impairments manipulate everyday objects. When given a 3D location, El-E can autonomously approach the location and pick up a nearby object. Based on interviews with patients with amyotrophic lateral sclerosis (ALS), we have developed and tested three distinct interfaces that enable a user to provide a 3D location to El-E and thereby select an object to be manipulated: an ear-mounted laser pointer, a hand-held laser pointer, and a touch screen interface. Within this paper, we present the results from a user study comparing these three user interfaces with a total of 134 trials involving eight patients with varying levels of impairment recruited from the Emory ALS Clinic. During this study, participants used the three interfaces to select everyday objects to be approached, grasped, and lifted off the ground. The three interfaces enabled motor impaired users to command a robot to pick up an object with a 94.8% success rate overall after less than 10 minutes of learning to use each interface. On average, users selected objects 69% more quickly with the laser pointer interfaces than with the touch screen interface. We also found substantial variation in user preference. With respect to the Revised ALS Functional Rating Scale (ALSFRS-R), users with greater upper-limb mobility tended to prefer the hand-held laser pointer, while those with less upper-limb mobility tended to prefer the ear-mounted laser pointer. Despite the extra efficiency of the laser pointer interfaces, three patients preferred the touch screen interface, which has unique potential for manipulating remote objects out of the user's line of sight. In summary, these results indicate that robots can enhance accessibility by supporting multiple interfaces.
Furthermore, this work demonstrates that the communication of 3D locations during human-robot interaction can serve as a powerful abstraction barrier that supports distinct interfaces to assistive robots while using identical, underlying robotic functionality.
© All rights reserved Choi et al. and/or ACM Press
Choi, Young Sang, Yi, Ji Soo, Law, Chris M. and Jacko, Julie A. (2006): Are "universal design resources" designed for designers?. In: Eighth Annual ACM Conference on Assistive Technologies 2006. pp. 87-94.
Universal design (UD) is an approach that aims to make products usable by all people to the greatest extent possible. UD in information and communication technologies (ICTs) is of growing importance because standard ICTs have great potential to be usable by all people, including people with disabilities (PWDs). Currently, PWDs who need ICTs often have less access because the products have not been universally designed. We hypothesize that one reason for the slow adoption of UD is that universal design resources (UDRs) are not adequate for facilitating designers' tasks. We investigated the usability of UDRs from designers' perspectives. A heuristic evaluation of eight selected UDRs was conducted, and the opinions of contributors to the content of these resources were collected through a web-based survey study. The results of the heuristic evaluation show that most of the investigated UDRs do not provide a clear central idea and fail to support the cognitive processes of designers. The results of the survey also confirmed that the content of these resources does not systematically address the needs of designers as end-users during the development process.
© All rights reserved Choi et al. and/or ACM Press
Law, Chris M., Yi, Ji Soo, Choi, Young Sang and Jacko, Julie A. (2006): Are disability-access guidelines designed for designers?: do they need to be?. In: Kjeldskov, Jesper and Paay, Jane (eds.) Proceedings of OZCHI06, the CHISIG Annual Conference on Human-Computer Interaction 2006. pp. 357-360.
In this paper we discuss the implications of recent research studies on disability-related design guidelines. We have investigated the quality of guidelines with respect to designers as their end-users, and we have conducted field studies of the use of design resources in practice. We now look at gaps in the current knowledge regarding the conceptualized system comprising the designers of technology, the end-users of technologies, and guideline-setting committees. We examine the practice of setting up accessibility program offices in large companies as a means of tackling accessibility issues, and the implications of this practice for product designers and for people creating disability-based guidelines for technology.
© All rights reserved Law et al. and/or their publisher
Yi, Ji Soo, Choi, Young Sang, Jacko, Julie A. and Sears, Andrew (2005): Context awareness via a single device-attached accelerometer during mobile computing. In: Proceedings of 7th conference on Human-computer interaction with mobile devices and services 2005. pp. 303-306.
Interest in context-aware computing has expanded the use of sensing technologies. The accelerometer is one of the most widely used sensors for capturing context because it is small, inexpensive, lightweight, and self-operable. In efforts to obtain behavioral patterns, many studies have reported the use of multiple accelerometers attached to the human body. However, this is difficult to implement in real-life situations and may not fully address the context of user interaction. In contrast, the present study employed a single tri-axial accelerometer attached to a handheld computing device instead of to a user. The objective was to determine what contextual information could be obtained from this more feasible, albeit limited, source of acceleration data. Data analyses confirmed that changes in both mobility and lighting conditions induced statistically significant differences in the output of the accelerometer.
© All rights reserved Yi et al. and/or ACM Press
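The core idea of the study (a single device-attached accelerometer whose output differs measurably between contexts such as standing still and walking) can be illustrated with a simple variance test on the acceleration magnitude. The threshold value and function name below are hypothetical; the paper reports statistical differences rather than a deployed classifier.

```python
import math
import statistics

# Hypothetical cutoff: std dev of acceleration magnitude (m/s^2)
# separating a stationary device from one carried by a walking user.
WALKING_STD_THRESHOLD = 1.0

def is_walking(samples):
    """samples: sequence of (ax, ay, az) tri-axial accelerometer readings
    from the handheld device. High variance in the overall acceleration
    magnitude suggests the user is mobile rather than stationary."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    return statistics.pstdev(mags) > WALKING_STD_THRESHOLD
```

Using magnitude rather than individual axes makes the test insensitive to how the device is oriented in the user's hand, which matters because the sensor is attached to the device, not the body.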
Changes to this page (author):
23 Nov 2012: Added
05 Apr 2012: Added
25 Feb 2010: Modified
29 May 2009: Added
07 Apr 2009: Added
24 Jul 2007: Added
24 Jul 2007: Added
22 Jun 2007: Added
Page maintainer: The Editorial Team