Number of co-authors: 12
Publications with 3 favourite co-authors: Ing-Marie Jonsson (3), Mei Yii Lim (2), Ruth Aylett (2)
Christian Martyn Jones's 3 most productive colleagues by number of publications: Ruth Aylett (35), Ing-Marie Jonsson (13), Mei Yii Lim (6)
Christian Martyn Jones
Publications by Christian Martyn Jones (bibliography)
Willis, Matthew John and Jones, Christian Martyn (2012): Emotishare: emotion sharing on mobile devices. In: Proceedings of the HCI12 Conference on People and Computers XXVI 2012. pp. 292-297.
Emotishare is a web and mobile platform for users to continuously track, share and respond to the emotional states of their friends. The system was trialled with both large and small groups to explore emotional communication. The groups were provided with two alternative interfaces to the system (web and mobile), and usage was compared in order to determine the effectiveness of each interface in supporting emotional communication. While overall usage behaviour was unaffected across both systems, the results highlighted that the mobile system was better suited to encouraging ad-hoc emotional tracking, sharing and response behaviour.
© All rights reserved Willis and Jones and/or their publisher
Jones, Christian Martyn and Pozzebon, Kay (2010): Being safety smart: social issue game for child protective behaviour training. In: Proceedings of the HCI10 Conference on People and Computers XXIV 2010. pp. 151-159.
Being Safety Smart is an online, social issue game designed to mitigate increasing child abduction rates in Australia. By teaching young children skills and strategies to help protect themselves, the game empowers children with the ability and confidence to act appropriately and decisively. This paper reports on the collaborative research and development of Being Safety Smart, bringing together global best practice in child protection and computer game design to create an educational resource targeted at children aged 6 to 8. The anti-abduction messages and strategies were developed in partnership with Australian government bodies: the Queensland Police Service, the Crime and Misconduct Commission, the Department of Communities (Child Safety Services) and Education Queensland. The gaming environment is aligned to the age- and gender-specific learning capabilities of children and is based on eight key features associated with children's acquisition and retention of protective behaviour concepts and skills. Results of a successful evaluation of the program with schools are presented. Being Safety Smart received the 2009 Queensland Police Service gold award for excellence in crime prevention and is being used in over 200 schools across Australia.
© All rights reserved Jones and Pozzebon and/or BCS
Rolfe, Ben, Jones, Christian Martyn and Wallace, Helen (2010): Designing dramatic play: story and game structure. In: Proceedings of the HCI10 Conference on People and Computers XXIV 2010. pp. 448-452.
Drama in games is created by the interplay of the narrative structure of story and the ludic structure of challenges. In this paper, we combine Csikszentmihalyi's model of engagement and flow with Freytag's pyramid, a model of narrative structure. Using this combination, we explore the dramatic structure of Halo: Combat Evolved, comparing ludic and narrative structures at each stage of the game. Based on our analysis, we recommend that game designers recognise the importance of psychological states beyond flow, and structure gameplay to lead the player on a journey through different states. In particular, we defend the idea of pushing the player out of their comfort zone early in the game to provide motivation and positive stress, and ending the game with challenges below the player's level of expertise, to allow them to relax, reflect, and experience a sense of closure.
© All rights reserved Rolfe et al. and/or BCS
Jones, Christian Martyn and Baldwin, Claudia (2009): Using emotion eliciting photographs to inspire awareness and attitudinal change: a user-centered case study. In: Proceedings of OZCHI09, the CHISIG Annual Conference on Human-Computer Interaction 2009. pp. 201-207.
Photographs can be used to elicit an emotional response in the viewer to promote attitudinal change. The paper considers the types of photographs which can elicit the strongest impact on viewers and uses a case study of the Mary River Dam. The Queensland government is proposing to dam the Mary River, whilst the Save the Mary River group has been running a campaign against the proposed dam using images of the community and landscape in its protest materials and website. This paper reports on a project to understand which types of images provided by the Save the Mary River group elicit the strongest impact on viewers to inspire support for their protest, and how and why these images can increase awareness around the issues of the proposed dam as a solution to water needs.
© All rights reserved Jones and Baldwin and/or their publisher
Jones, Christian Martyn and Willis, Matthew (2009): Edutainment in the field using mobile location based services. In: Proceedings of OZCHI09, the CHISIG Annual Conference on Human-Computer Interaction 2009. pp. 385-388.
The explorer project provides educational tours and activities to schoolchildren using existing low-cost technologies. The activities take place in environmentally sensitive and remote locations and are based on a proven curriculum developed in collaboration with Queensland schools. To undertake the activities, students are provided with smart phones pre-loaded with GPS-driven software that guides them through each task. Tasks are triggered by the student's proximity to field locations (using GPS coordinates). Students are directed to observe, collect, analyse and report data by utilising the features of the device, such as the in-built camera, location services, text, handwriting and sketch entry, and the audio and video capabilities of the device. Data collated by students is uploaded to a secure server on completion of the tasks. All data is made available to students via the server for inclusion in reports and assessment items, and for sharing and blogging on social networking sites. The project will assess changes to learning outcomes, and to student attitudes and values towards the environment, comparing the experience of students using the explorer device with traditional paper-based descriptions and reporting. Results of the explorer project will help inform the development of future location-based technologies for field-based education.
© All rights reserved Jones and Willis and/or their publisher
Jones, Christian Martyn and Deeming, Andrew (2007): Investigating emotional interaction with a robotic dog. In: Proceedings of OZCHI07, the CHISIG Annual Conference on Human-Computer Interaction November 28-30, 2007, Adelaide, Australia. pp. 183-186.
The next generation of consumer-level entertainment robots should offer more natural, engaging interaction. This paper reports on the development and evaluation of a consumer-level robotic dog with acoustic emotion recognition capabilities. The dog can recognise the emotional state of its owner from affective cues in the owner's speech and respond with appropriate actions. The evaluation study shows that users perceive the new robotic dog as emotionally intelligent and report that this makes the dog appear more 'alive'.
© All rights reserved Jones and Deeming and/or ACM Press
Jones, Christian Martyn and Troen, Tommy (2007): Biometric valence and arousal recognition. In: Proceedings of OZCHI07, the CHISIG Annual Conference on Human-Computer Interaction November 28-30, 2007, Adelaide, Australia. pp. 191-194.
A real-time, user-independent emotion detection system using physiological signals has been developed. The system has the ability to classify affective states along two dimensions, valence and arousal. Each dimension ranges from 1 to 5, giving a total of 25 possible affective regions. Physiological signals were measured using three biometric sensors for Blood Volume Pulse (BVP), Skin Conductance (SC) and Respiration (RESP). Two emotion-inducing experiments were conducted to acquire physiological data from 13 subjects. The data from 10 of these subjects were used to train the system, while the remaining 3 datasets were used to test the performance of the system. A recognition rate of 62% for valence and 67% for arousal was achieved within ±1 unit of the valence and arousal ratings.
© All rights reserved Jones and Troen and/or ACM Press
Jones, Christian Martyn and Jonsson, Ing-Marie (2007): Performance Analysis of Acoustic Emotion Recognition for In-Car Conversational Interfaces. In: Stephanidis, Constantine (ed.) Universal Access in Human-Computer Interaction. Ambient Interaction, 4th International Conference on Universal Access in Human-Computer Interaction, UAHCI 2007 Held as Part of HCI International 2007 Beijing, China, July 22-27, 2007 Proceedings, Part II July 22-27, 2007, Beijing, China. pp. 411-420.
Jones, Christian Martyn and Jonsson, Ing-Marie (2005): Automatic recognition of affective cues in the speech of car drivers to allow appropriate responses. In: Proceedings of OZCHI05, the CHISIG Annual Conference on Human-Computer Interaction 2005. pp. 1-10.
Speech interaction with in-car systems is becoming more commonplace as systems improve. New cars are often equipped with speech recognition systems to dial phone numbers and control the in-car environment, and with speech output to provide verbal directions from navigation systems. The paper explores the possibilities of richer speech interaction between driver and car, with automatic recognition of the emotional state of the driver and appropriate responses from the car. Drivers' emotions often influence driving performance, which could be improved if the car actively responds to the emotional state of the driver. This paper focuses on an in-car emotion recognition system to recognise driver emotional state.
© All rights reserved Jones and Jonsson and/or their publisher
Lim, Mei Yii, Aylett, Ruth and Jones, Christian Martyn (2005): Affective Guide with Attitude. In: Tao, Jianhua, Tan, Tieniu and Picard, Rosalind W. (eds.) ACII 2005 - Affective Computing and Intelligent Interaction, First International Conference October 22-24, 2005, Beijing, China. pp. 772-779.
Jones, Christian Martyn and Jonsson, Ing-Marie (2005): Detecting Emotions in Conversations Between Driver and In-Car Information Systems. In: Tao, Jianhua, Tan, Tieniu and Picard, Rosalind W. (eds.) ACII 2005 - Affective Computing and Intelligent Interaction, First International Conference October 22-24, 2005, Beijing, China. pp. 780-787.
Lim, Mei Yii, Aylett, Ruth and Jones, Christian Martyn (2005): Emergent Affective and Personality Model. In: Panayiotopoulos, Themis, Gratch, Jonathan, Aylett, Ruth, Ballin, Daniel, Olivier, Patrick and Rist, Thomas (eds.) IVA 2005 - Intelligent Virtual Agents - 5th International Working Conference September 12-14, 2005, Kos, Greece. pp. 371-380.
Jones, Christian Martyn and Dlay, Satnam Singh (1997): MARTI: Man-Machine Animation Real-Time Interface: The Illusion of Life. In: Smith, Michael J., Salvendy, Gavriel and Koubek, Richard J. (eds.) HCI International 1997 - Proceedings of the Seventh International Conference on Human-Computer Interaction - Volume 2 August 24-29, 1997, San Francisco, California, USA. pp. 841-844.