Number of co-authors: 13

Number of publications with 3 favourite co-authors:
Bruce N. Walker: 6
Benjamin K. Davison: 2
Andreas Riener: 2

Myounghoon Jeon's 3 most productive colleagues in number of publications:
Bruce N. Walker: 19
Andreas Riener: 8
Miriam Reiner: 7
Publications by Myounghoon Jeon (bibliography)
Suh, Hyewon, Jeon, Myounghoon and Walker, Bruce N. (2012): Spearcons Improve Navigation Performance and Perceived Speediness in Korean Auditory Menus. In: Proceedings of the Human Factors and Ergonomics Society 2012 Annual Meeting 2012. pp. 1361-1365.
For decades, auditory menus using both speech (usually text-to-speech, TTS) and non-speech sounds have been extensively studied. Researchers have developed situation-optimized auditory menus involving such cues as auditory icons, earcons, spearcons, and spindex. Spearcons have generally outperformed other cues in terms of providing both contextual information and item-specific information. However, little research has been devoted to exploration of spearcons in languages other than English, or the use of spearcon-only auditory menus. In this study, we evaluated the use of spearcons in Korean menus, as well as the use of spearcons alone. Twenty-five native Korean speakers navigated through a two-dimensional auditory menu presented via TTS, with or without spearcon enhancements. Korean spearcons were successful. Participants also rated the spearcon-enhanced menu as seeming speedier and more fun than the TTS-only menu. After a short learning period, mean time-to-target in the auditory menu was even faster with spearcons alone, compared to traditional TTS-only menus.
© All rights reserved Suh et al. and/or Human Factors and Ergonomics Society
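Spearcons are spoken menu items sped up (with pitch preserved) until they are no longer comprehensible as words, yet remain unique per item. The sketch below is a rough illustration of that compression step, not the authors' pipeline: it assumes a prerecorded TTS clip on disk, the librosa and soundfile packages, and an illustrative compression rate.

```python
# A rough sketch of spearcon creation, not the authors' pipeline. File names
# and the compression rate are assumptions for illustration only.
import librosa
import soundfile as sf

def make_spearcon(tts_path: str, out_path: str, rate: float = 2.5) -> None:
    """Time-compress a TTS recording into a spearcon (pitch preserved)."""
    y, sr = librosa.load(tts_path, sr=None)                 # keep native rate
    spearcon = librosa.effects.time_stretch(y, rate=rate)   # rate > 1 shortens
    sf.write(out_path, spearcon, sr)

# e.g. compress a Korean TTS clip for one menu item into its spearcon
make_spearcon("tts_item.wav", "tts_item_spearcon.wav")
```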
Riener, Andreas, Reiner, Miriam, Jeon, Myounghoon and Chalfoun, Pierre (2012): Methodical approaches to prove the effects of subliminal perception in ubiquitous computing environments. In: Proceedings of the 2012 International Conference on Ubiquitous Computing 2012. pp. 1120-1121.
To cope with the rising volume of information in human-computer interfaces, explicit and attentive interaction is more and more frequently replaced by implicit means of information exchange, supported by context- and activity-aware systems and applications. The trend of excessive information is, however, still ongoing, calling for further solutions to reduce a person's cognitive load or level of attention. Subliminal interaction techniques are considered a promising approach to deliver information to a person without causing much supplementary workload. This workshop aims to discuss the potential of subliminal perception to improve the information flow in human-computer interaction, in light of the fact that, up to now, the results have been mixed. One group of researchers has provided evidence that subliminal stimulation works, but the other has found that it does not, or even cannot, work. To clarify this issue, experts from various domains attending the workshop will discuss how subliminal effects can be scientifically supported or how a certain claim could be empirically refuted.
© All rights reserved Riener et al. and/or ACM Press
Riener, Andreas and Jeon, Myounghoon (2012): The role of subliminal perception in vehicular interfaces. In: Proceedings of the 2012 International Conference on Ubiquitous Computing 2012. pp. 1122-1126.
Following laws and provisions passed at the national and international level, the most relevant goal of future traffic and vehicular interfaces is to increase road safety. To alleviate the cognitive load associated with the interaction with the variety of emerging information and assistance systems in the car, subliminal stimulation is assumed to be a promising technique. To assess the potential of subliminal cues as an interaction means in future vehicles, we organized a workshop within the frame of the automotive user interfaces conference (AutoUI 2011) to discuss this topic in a group of experts. This paper summarizes the findings from that workshop and gives researchers a starting point for their own activities in the field by indicating the grand research challenges and most critical issues. In particular, the goal of this summary article is to make this challenging research field more 'tangible' for researchers working in a range of disciplines, such as engineering, neuroscience, computer science, and psychophysiology. While currently discussed in the automotive domain only, the principles, research questions, and findings could immediately (and easily) be transferred to and adopted in other research fields. Interaction based on subliminal techniques can have an impact on society at large, making significant contributions toward a more natural, convenient, and even relaxing future style of interaction with complex systems.
© All rights reserved Riener and Jeon and/or ACM Press
Jeon, Myounghoon (2012): A systematic approach to using music for mitigating affective effects on driving performance and safety. In: Proceedings of the 2012 International Conference on Ubiquitous Computing 2012. pp. 1127-1132.
Research has shown that affective effects on driving performance and safety are as dangerous as (or even more dangerous than) the effects of secondary tasks. There has been some research on the use of speech-based systems for intervention, but little research has attempted to use music to mitigate a driver's affective states while driving. The current paper identifies various taxonomies of the effects of music and explores plausible research variables, considerations, and practical application directions.
© All rights reserved Jeon and/or ACM Press
Jeon, Myounghoon and Walker, Bruce N. (2011): What to detect?: Analyzing Factor Structures of Affect in Driving Contexts for an Emotion Detection and Regulation System. In: Proceedings of the Human Factors and Ergonomics Society 55th Annual Meeting 2011. pp. 1889-1893.
This research is a part of the IVAT (In-Vehicle Assistive Technology) project, an in-dash interface design project to help drivers who have various disabilities, including deficits in emotion regulation. While there have been several studies on emotion detection for drivers, few studies have seriously addressed what to detect and why. Those are crucial issues to consider when implementing an effective affect management system. Phase 1 of our study gathered a total of 33 different driving situations that can induce emotions and 56 plausible affective keywords to describe such emotions. Phase 2 analyzed factor structures of affect for driving contexts through user ratings and Factor Analysis, and obtained nine factors: fearful, happy, angry, depressed, curious, embarrassed, urgent, bored, and relieved. These factors accounted for 65.1% of the total variance. Results are discussed in terms of designing the IVAT emotion detection and regulation system for driving contexts.
© All rights reserved Jeon and Walker and/or HFES
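Phase 2's analysis style can be approximated with off-the-shelf factor analysis over a (participants × affective keywords) rating matrix. The sketch below is illustrative only: the ratings here are synthetic placeholder noise rather than the study's user data, and only the nine-component count matches the paper's reported result.

```python
# An illustrative approximation of Phase 2's analysis, not the authors' code.
# The rating matrix is synthetic placeholder data; the paper used real user
# ratings of 56 affective keywords and reported nine factors (65.1% variance).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
ratings = rng.integers(1, 8, size=(60, 56)).astype(float)  # 56 keywords

fa = FactorAnalysis(n_components=9, random_state=0).fit(ratings)

# Each row of components_ holds one factor's loadings over the 56 keywords;
# inspecting the top-loading keywords per factor suggests its label
# (e.g. "fearful", "happy", "angry", ...).
top = np.argsort(-np.abs(fa.components_), axis=1)[:, :5]
print(top)
```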
Jeon, Myounghoon, Gupta, Siddharth, Davison, Benjamin K. and Walker, Bruce N. (2010): Auditory menus are not just spoken visual menus: a case study of "unavailable" menu items. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 3319-3324.
Auditory menus can supplement or replace visual menus to enhance usability and accessibility. Despite the rapid increase of research on auditory displays, more is still needed to optimize the auditory-specific aspects of these implementations. In particular, there are several menu attributes and features that are often displayed visually, but that are conveyed poorly, or not at all, in the auditory version of the menu. Here, we report on two studies aimed at determining how best to render the important concept of an unavailable menu item. In Study 1, 23 undergraduates navigated a Microsoft Word-like auditory menu with a mix of available and unavailable items. For unavailable items, a whispered voice was favored over an attenuated voice or saying "unavailable". In Study 2, 26 undergraduates navigated a novel auditory menu. With practice, whispering unavailable items was more effective than skipping unavailable items. Results are discussed in terms of acoustic theory and cognitive menu selection theory.
© All rights reserved Jeon et al. and/or their publisher
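One way to model the rendering alternatives compared in these studies is a per-item availability flag plus a strategy switch. The sketch below is a hypothetical data model: the Strategy enum, clip paths, and helper function are assumptions, not the paper's implementation.

```python
# A hypothetical model of the rendering alternatives compared in the studies;
# the Strategy enum and clip paths are assumptions, not the paper's design.
from dataclasses import dataclass
from enum import Enum, auto

class Strategy(Enum):
    WHISPER = auto()    # speak unavailable items in a whispered voice (favored)
    ATTENUATE = auto()  # speak them at reduced volume
    ANNOUNCE = auto()   # say the item, then the word "unavailable"
    SKIP = auto()       # do not speak them at all

@dataclass
class MenuItem:
    label: str
    available: bool
    normal_clip: str    # e.g. "copy.wav"
    whisper_clip: str   # e.g. "copy_whisper.wav"

def clip_for(item: MenuItem, strategy: Strategy) -> str | None:
    """Pick which recording (if any) to play for a menu item."""
    if item.available:
        return item.normal_clip
    if strategy is Strategy.WHISPER:
        return item.whisper_clip
    if strategy is Strategy.SKIP:
        return None                # caller advances to the next item
    return item.normal_clip        # ATTENUATE/ANNOUNCE post-process this clip
```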
Moskovitch, Yarden, Jeon, Myounghoon and Walker, Bruce N. (2010): Enhanced Auditory Menu Cues on a Mobile Phone Improve Time-Shared Performance of a Driving-Like Dual Task. In: Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting 2010. pp. 1321-1325.
The growing trend of using mobile phones and other in-vehicle technologies (IVT) while driving has spurred research on driver distraction, its effects and alleviation (Ashley, 2001; Young & Regan, 2007). The present study used a dual task in which 21 undergraduates navigated a mobile phone contact list for a target name (secondary task) while playing a computer game representative of driving (primary task). The phone menu was enhanced with two audio navigation cues: traditional text-to-speech (TTS) and spearcons (i.e., compressed speech). These cues were tested with and without visual display of the contact list. Spearcons in conjunction with TTS enhanced performance on the primary task while having no negative effect on the secondary task. Auditory menus reduced perceived workload and increased subjective ratings. Results are discussed in terms of multiple resources theory and practical mobile phone menu design.
© All rights reserved Moskovitch et al. and/or HFES
Jeon, Myounghoon and Walker, Bruce N. (2009): "Spindex": Accelerated Initial Speech Sounds Improve Navigation Performance in Auditory Menus. In: Proceedings of the Human Factors and Ergonomics Society 53rd Annual Meeting 2009. pp. 1081-1085.
Users interact with mobile devices through menus, which can include many items. Auditory menus can supplement or even replace visual menus. Unfortunately, little research has been devoted to enhancing the usability of large auditory menus. We evaluated a novel auditory menu enhancement called a "spindex" (i.e., speech index), in which brief audio cues inform the user where she is in a long menu. In the current implementation, each item in a menu is preceded by a sound based on the item's initial letter. Twenty-five undergraduates navigated through an alphabetized contact list of 50 or 150 names. The menu was presented with text-to-speech (TTS) alone, or TTS plus spindex, and with the visual menu displayed or not. Search time was faster with the spindex-enhanced menu, especially for long lists. Subjective ratings also favored the spindex. Results are discussed in terms of theory and practical applications.
© All rights reserved Jeon and Walker and/or their publisher
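The spindex idea, a brief cue keyed to each item's initial letter played before the full TTS, can be sketched as a simple pairing step. The sketch below assumes per-letter cue clips and per-item TTS clips already exist on disk; the file naming and the dwell-to-hear-TTS behavior noted in the comments are assumptions, not the paper's implementation.

```python
# A minimal sketch of spindex-style cueing, assuming per-letter cue clips
# (e.g. "cue_a.wav") and per-item TTS clips already exist; the file naming is
# an assumption, not the paper's implementation.
from dataclasses import dataclass

@dataclass
class MenuItem:
    label: str      # e.g. "Alice Johnson"
    tts_path: str   # prerecorded TTS clip for the full label

def spindex_sequence(items: list[MenuItem]) -> list[tuple[str, str]]:
    """Pair each item with the brief cue derived from its initial letter."""
    return [(f"cue_{item.label[0].lower()}.wav", item.tts_path)
            for item in items]

# During fast scrolling only the short cues play, telling the user where they
# are alphabetically; the full TTS follows when they dwell on an item.
```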
Jeon, Myounghoon, Davison, Benjamin K., Nees, Michael A., Wilson, Jeff and Walker, Bruce N. (2009): Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies. In: Schmidt, Albrecht, Dey, Anind K., Seder, Thomas and Juhlin, Oskar (eds.) Proceedings of 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications - AutomotiveUI 2009 21-22 September, 2009, Essen, Germany. pp. 91-98.
Jeon, Myounghoon, Park, Junho, Heo, Ubeom and Yun, Jongmin (2009): Enhanced turning point displays facilitate drivers' interaction with navigation devices. In: Schmidt, Albrecht, Dey, Anind K., Seder, Thomas and Juhlin, Oskar (eds.) Proceedings of 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications - AutomotiveUI 2009 21-22 September, 2009, Essen, Germany. pp. 145-148.