Number of co-authors: 12
Publications with 3 favourite co-authors: Jaka Sodnik: 4, Saso Tomazic: 4, Mark Billinghurst: 4
Christina Dicke's 3 most productive colleagues (by number of publications): Mark Billinghurst: 92, Juha Lehikoinen: 13, Jaka Sodnik: 7
Publications by Christina Dicke (bibliography)
Dicke, Christina, Jakus, Grega, Tomazic, Saso and Sodnik, Jaka (2012): On the Evaluation of Auditory and Head-up Displays While Driving. In: Proceedings of the 2012 International Conference on Advances in Computer-Human Interactions 2012. pp. 200-203.
In this paper, we propose a low cost, laboratory based testing framework for in-vehicle interfaces. Exemplified by a comparison between an auditory interface, a Head-up display, and a combination of both we show how task completion times, driving penalty points, mental workload, and subjective user evaluations of the interfaces can be collected through different logging systems and user questionnaires. The driving simulator used in the experiment enables the simulation of varying traffic conditions as well as different driving scenarios including a highway and a busy city center. Only some preliminary results are reported in this paper.
© All rights reserved Dicke et al. and/or IEEE
Dicke, Christina, Wolf, Katrin and Tal, Yaroslav (2010): Foogue: eyes-free interaction for smartphones. In: Proceedings of 12th Conference on Human-computer interaction with mobile devices and services 2010. pp. 455-458.
Graphical user interfaces for mobile devices have several drawbacks in mobile situations. In this paper, we present Foogue, an eyes-free interface that utilizes spatial audio and gesture input. Foogue does not require visual attention and hence does not divert visual attention from the task at hand. Foogue has two modes, which are designed to fit the usage patterns of mobile users. For user input, we designed a gesture language built from a limited number of simple yet easily distinguishable gesture elements.
© All rights reserved Dicke et al. and/or their publisher
Dicke, Christina, Aaltonen, Viljakaisa, Rämö, Anssi and Vilermo, Miikka (2010): Talk to me: the influence of audio quality on the perception of social presence. In: Proceedings of the HCI10 Conference on People and Computers XXIV 2010. pp. 309-318.
In this paper, we compare the impact of monophonic, stereophonic, and binaural human speech recordings in terms of their ability to induce the feeling of presence and to influence the understanding of the emotional state the speakers were in. These factors are generally important in entertainment applications, for example when conversing with a non-player character, and in mediated synchronous human-to-human communication. Our results show a significant advantage of binaural over mono and stereo sound for inducing the sense of being present in a (virtual) environment. Furthermore, we found that listening to a stereophonic recording of a conversation leads to a significantly stronger understanding of the emotional state of the speakers than listening to a mono or binaural recording.
© All rights reserved Dicke et al. and/or BCS
Dicke, Christina, Deo, Shaleen, Billinghurst, Mark, Adams, Nathan and Lehikoinen, Juha (2008): Experiments in mobile spatial audio-conferencing: key-based and gesture-based interaction. In: Hofte, G. Henri ter, Mulder, Ingrid and Ruyter, Boris E. R. de (eds.) Proceedings of the 10th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2008 September 2-5, 2008, Amsterdam, the Netherlands. pp. 91-100.
Sodnik, Jaka, Dicke, Christina, Tomazic, Saso and Billinghurst, Mark (2008): A user study of auditory versus visual interfaces for use while driving. In International Journal of Human-Computer Studies, 20 (5) pp. 318-332.
This paper describes a user study on interaction with a mobile device installed in a driving simulator. Two new auditory interfaces were proposed, and their effectiveness and efficiency were compared to a standard visual interface. Both auditory interfaces consisted of spatialized auditory cues representing individual items in the hierarchical structure of the menu. In the first auditory interface, all items of the current level of the menu were played simultaneously. In the second auditory interface, only one item was played at a time. The visual interface was shown on a small in-vehicle LCD screen on the dashboard. In all three cases, a custom-made interaction device (a scrolling wheel and two buttons) attached to the steering wheel was used for controlling the interface. Driving performance, task completion times, perceived workload and overall user satisfaction were evaluated. The experiment showed that both auditory interfaces were effective to use in a mobile environment, but were not faster than the visual interface. In the case of shorter tasks, e.g. changing the active profile or deleting an image, the task completion times were comparable for all interfaces; however, driving performance was significantly better and perceived workload lower when using the auditory interfaces. The test subjects also reported high overall satisfaction with the auditory interfaces, which were labelled as easier to use, more satisfying and more adequate for performing the required tasks than the visual interface. These results are not surprising, as there is a stronger competition for visual attention between the visual interface and the primary task (driving the car) than there is when using an auditory interface. So although both types of interfaces proved effective, the visual interface was less efficient, as it strongly distracted the user from performing the primary task.
© All rights reserved Sodnik et al. and/or Academic Press
Sodnik, Jaka, Tomazic, Saso, Dicke, Christina and Billinghurst, Mark (2008): Spatial Auditory Interface for an Embedded Communication Device in a Car. In: Proceedings of the 2008 International Conference on Advances in Computer-Human Interactions 2008. pp. 69-76.
In this paper, we evaluate the safety of the driver when using an embedded communication device while driving. As part of our research, four different tasks were performed with the device in order to evaluate the efficiency and safety of the drivers under three different conditions: one visual and two different auditory conditions. In the visual condition, various menu items were shown on a small LCD screen attached to the dashboard. In the auditory conditions, the same menu items were presented with spatial sounds distributed on a virtual ring around the user's head. The same custom-made interaction device attached to the steering wheel was used in all three conditions, enabling simple and safe interaction with the device while driving. The auditory interface proved to be as fast as the visual one, while at the same time enabling significantly safer driving and higher user satisfaction. The measured workload also appeared to be lower when using the auditory interfaces.
© All rights reserved Sodnik et al. and/or IEEE
Dicke, Christina, Sodnik, Jaka, Billinghurst, Mark and Tomazic, Saso (2007): Spatial Auditory Interfaces Compared to Visual Interfaces for Mobile Use in a Driving Task. In: Cardoso, Jorge, Cordeiro, José and Filipe, Joaquim (eds.) ICEIS 2007 - Proceedings of the Ninth International Conference on Enterprise Information Systems Volume HCI June 12-16, 2007, Funchal, Portugal. pp. 282-285.