Publication statistics

Pub. period: 2001-2012
Pub. count: 41
Number of co-authors: 57



Co-authors

Number of publications with his 3 most frequent co-authors:

John Williamson: 14
Steven Strachan: 9
Parisa Eslambolchilar: 7

 

 

Productive colleagues

Roderick Murray-Smith's 3 most productive colleagues, by number of publications:

Albrecht Schmidt: 110
Stephen A. Brewster: 108
Matt Jones: 63
 
 
 

Roderick Murray-Smith

Picture of Roderick Murray-Smith.
Personal Homepage: http://www.dcs.gla.ac.uk/~rod/


Publications by Roderick Murray-Smith (bibliography)

2012
 

Lamont, Stuart, Bowman, Richard, Rath, Matthias, Williamson, John, Murray-Smith, Roderick and Padgett, Miles (2012): Touching the micron: tactile interactions with an optical tweezer. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 313-316.

A tablet interface for manipulating microscopic particles is augmented with vibrotactile and audio feedback. The feedback is generated using a novel real-time synthesis library based on approximations to physical processes, and is efficient enough to run on mobile devices, despite their limited computational power. The feedback design and usability testing was done with a realistic simulator on appropriate tasks, allowing users to control objects more rapidly, with fewer errors and applying more consistent forces. The feedback makes the interaction more tangible, giving the user more awareness of changes in the characteristics of the optical tweezers as the number of optical traps changes.

© All rights reserved Lamont et al. and/or ACM Press

 

Weir, Daryl, Rogers, Simon, Murray-Smith, Roderick and Löchtefeld, Markus (2012): A user-specific machine learning approach for improving touch accuracy on mobile devices. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 465-476.

We present a flexible Machine Learning approach for learning user-specific touch input models to increase touch accuracy on mobile devices. The model is based on flexible, non-parametric Gaussian Process regression and is learned using recorded touch inputs. We demonstrate that significant touch accuracy improvements can be obtained when either raw sensor data is used as an input or when the device's reported touch location is used as an input, with the latter marginally outperforming the former. We show that learned offset functions are highly nonlinear and user-specific and that user-specific models outperform models trained on data pooled from several users. Crucially, significant performance improvements can be obtained with a small (≈200) number of training examples, easily obtained for a particular user through a calibration game or from keyboard entry data.

© All rights reserved Weir et al. and/or ACM Press
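
To make the modelling idea above concrete, here is a minimal Python sketch of a user-specific touch-offset model: a hand-rolled Gaussian Process regression with an RBF kernel, trained on roughly 200 synthetic calibration touches. The kernel length-scale, noise level and simulated touch bias are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=30.0, variance=1.0):
    """Squared-exponential kernel between two sets of 2D touch locations (pixels)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

class TouchOffsetGP:
    """Toy user-specific model: reported touch (x, y) -> corrected touch location."""
    def __init__(self, noise=4.0):
        self.noise = noise

    def fit(self, reported, intended):
        self.X = reported
        K = rbf_kernel(reported, reported) + self.noise * np.eye(len(reported))
        self.alpha = np.linalg.solve(K, intended - reported)   # GP weights for the 2D offset
        return self

    def correct(self, reported):
        return reported + rbf_kernel(reported, self.X) @ self.alpha

# ~200 calibration touches, as in the abstract: intended targets plus a synthetic,
# systematic user-specific bias and some jitter give the "reported" locations.
rng = np.random.default_rng(0)
intended = rng.uniform(0, 480, size=(200, 2))
bias = np.column_stack([8 + 0.02 * intended[:, 1], -5 + 0.01 * intended[:, 0]])
reported = intended + bias + rng.normal(0, 2, (200, 2))

model = TouchOffsetGP().fit(reported, intended)
err = np.linalg.norm(model.correct(reported[:50]) - intended[:50], axis=1)
print(f"mean error after correction: {err.mean():.1f} px")
```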

2011
 

Quek, Melissa, Boland, Daniel, Williamson, John, Murray-Smith, Roderick, Tavella, Michele, Perdikis, Serafeim, Schreuder, Martijn and Tangermann, Michael (2011): Simulating the feel of brain-computer interfaces for design, development and social interaction. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 25-28.

We describe an approach to improving the design and development of Brain-Computer Interface (BCI) applications by simulating the error-prone characteristics and subjective feel of electroencephalogram (EEG), motor-imagery based BCIs. BCIs have the potential to enhance the quality of life of people who are severely disabled, but it is often time-consuming to test and develop the systems. Simulation of BCI characteristics allows developers to rapidly test design options, and gain both subjective and quantitative insight into expected behaviour without using an EEG cap. A further motivation for the use of simulation is that 'impairing' a person without motor disabilities in a game with a disabled BCI user can create a level playing field and help carers empathise with BCI users. We demonstrate a use of the simulator in controlling a game of Brain Pong.

© All rights reserved Quek et al. and/or their publisher
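
As a sketch of the simulation idea described above, the snippet below wraps a perfect two-class command stream in an error model with a configurable misclassification rate and decision delay, so a non-BCI user "feels" BCI-like control. The accuracy and delay figures, and the class names, are illustrative assumptions rather than parameters from the paper.

```python
import random
import time

class SimulatedBCI:
    """Toy simulator of a two-class motor-imagery BCI: takes the intended command
    ('left' or 'right') and returns what an error-prone classifier might output."""

    def __init__(self, accuracy=0.8, decision_delay_s=1.0, seed=None):
        self.accuracy = accuracy                  # probability the intended class is returned
        self.decision_delay_s = decision_delay_s  # mimic the slow pace of EEG decisions
        self.rng = random.Random(seed)

    def classify(self, intended):
        time.sleep(self.decision_delay_s)         # the user has to wait for each decision
        if self.rng.random() < self.accuracy:
            return intended
        return 'right' if intended == 'left' else 'left'

# Example: drive a Pong paddle with the simulated BCI instead of a real EEG cap.
bci = SimulatedBCI(accuracy=0.8, decision_delay_s=0.1, seed=1)
commands = ['left', 'left', 'right', 'left', 'right']
print([(c, bci.classify(c)) for c in commands])
```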

 

Rogers, Simon, Williamson, John, Stewart, Craig and Murray-Smith, Roderick (2011): AnglePose: robust, precise capacitive touch tracking via 3d orientation estimation. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 2575-2584.

We present a finger-tracking system for touch-based interaction which can track 3D finger angle in addition to position, using low-resolution conventional capacitive sensors, therefore compensating for the inaccuracy due to pose variation in conventional touch systems. Probabilistic inference about the pose of the finger is carried out in real-time using a particle filter; this results in an efficient and robust pose estimator which also gives appropriate uncertainty estimates. We show empirically that tracking the full pose of the finger results in greater accuracy in pointing tasks with small targets than competitive techniques. Our model can detect and cope with different finger sizes and the use of either fingers or thumbs, bringing a significant potential for improvement in one-handed interaction with touch devices. In addition to the gain in accuracy we also give examples of how this technique could open up the space of novel interactions.

© All rights reserved Rogers et al. and/or their publisher
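
The following is a minimal Python sketch of the kind of bootstrap particle filter the abstract describes, tracking a pose (x, y, tilt) from a simulated low-resolution capacitive image. The 8x8 grid, the blob-shaped sensor model and all noise parameters are assumptions made purely for illustration; the particle spread stands in for the uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(2)
GRID = np.stack(np.meshgrid(np.arange(8), np.arange(8), indexing="ij"), axis=-1)  # 8x8 pad centres

def sensor_model(pose):
    """Predicted capacitive image for a pose (x, y, tilt): the sensed blob is offset from
    the fingertip along the tilt direction, mimicking pose-dependent touch bias."""
    x, y, tilt = pose
    cx, cy = x + 1.2 * np.cos(tilt), y + 1.2 * np.sin(tilt)
    return np.exp(-((GRID[..., 0] - cx) ** 2 + (GRID[..., 1] - cy) ** 2) / 2.0)

def pf_step(particles, observed, motion_sd=(0.15, 0.15, 0.05), obs_sd=0.5):
    """One predict/update/resample cycle of a bootstrap particle filter over finger pose."""
    particles = particles + rng.normal(0.0, motion_sd, particles.shape)   # random-walk motion
    err = np.array([np.sum((sensor_model(p) - observed) ** 2) for p in particles])
    logw = -0.5 * err / obs_sd ** 2                                       # Gaussian likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return particles[rng.choice(len(particles), size=len(particles), p=w)]

# Track a slowly drifting synthetic touch; the particle spread is the uncertainty estimate.
true_pose = np.array([3.0, 3.0, 0.5])
particles = rng.uniform([0.0, 0.0, -1.0], [7.0, 7.0, 1.0], size=(500, 3))
for _ in range(20):
    true_pose = true_pose + np.array([0.05, 0.02, 0.01])
    observed = sensor_model(true_pose) + rng.normal(0.0, 0.05, (8, 8))
    particles = pf_step(particles, observed)
print("true pose:", true_pose.round(2))
print("estimate :", particles.mean(axis=0).round(2), "spread:", particles.std(axis=0).round(2))
```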

 

Norrie, Lauren and Murray-Smith, Roderick (2011): Virtual sensors: rapid prototyping of ubiquitous interaction with a mobile phone and a Kinect. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 25-28.

The Microsoft Kinect sensor can be combined with a modern mobile phone to rapidly create digitally augmented environments. This can be used either directly, as a form of ubiquitous computing environment, or indirectly, as a framework for rapidly prototyping ubicomp environments that would otherwise be implemented using conventional sensors. We describe an Android mobile application that supports rapid prototyping of spatial interaction by using 3D position data from the Kinect to simulate a proximity sensor. This allows a developer, or end user, to easily associate content or services on the device with surfaces or regions of a room. The accuracy of the hotspot marking was tested in an experiment where users selected points marked on a whiteboard using a mobile phone. The distribution of the sample points was analysed and showed that the bulk of the selections were within about 13 cm of the target, and that the distributions were characteristically skewed depending on whether the user approached the target from the left or the right. This range is sufficient for prototyping many common ubicomp scenarios based on proximity in a room. To illustrate this approach, we describe the design of a novel mobile application that associates a virtual book library with a region of a room, integrating the additional sensors and actuators of a smartphone with the position sensing of the Kinect. We highlight limitations of this approach and suggest areas for future work.

© All rights reserved Norrie and Murray-Smith and/or ACM Press
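
A minimal sketch of the "virtual proximity sensor" idea from the abstract above: a hotspot is registered as a 3D point in the Kinect's coordinate frame, and the sensor fires whenever a tracked position comes within a radius of it. The class name, the radius (chosen on the order of the ~13 cm selection spread reported) and the coordinates are illustrative assumptions.

```python
import math

class VirtualProximitySensor:
    """Toy 'virtual sensor': fires when a tracked 3D position (e.g. a Kinect skeleton
    joint or a phone held in the hand) comes within a radius of a registered hotspot."""

    def __init__(self, hotspot, radius_m=0.15):
        self.hotspot = hotspot      # (x, y, z) in metres, Kinect coordinate frame
        self.radius_m = radius_m    # on the order of the ~13 cm selection spread reported

    def update(self, position):
        d = math.dist(position, self.hotspot)
        return d <= self.radius_m, d

# Associate a "virtual book library" with a shelf region and poll tracked positions.
shelf = VirtualProximitySensor(hotspot=(1.2, 0.9, 2.4), radius_m=0.15)
for pos in [(0.4, 1.0, 2.0), (1.1, 0.95, 2.35), (1.21, 0.88, 2.41)]:
    inside, dist = shelf.update(pos)
    print(f"{pos} -> {'TRIGGER' if inside else 'idle'} ({dist:.2f} m)")
```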

 

Trendafilov, Dari, Vazquez-Alvarez, Yolanda, Lemmelä, Saija and Murray-Smith, Roderick (2011): "Can we work this out?": an evaluation of remote collaborative interaction in a mobile shared environment. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 499-502.

We describe a novel dynamic method for collaborative virtual environments designed for mobile devices and evaluated in a mobile context. Participants interacted in pairs remotely and through touch while walking in three different feedback conditions: 1) visual, 2) audio-tactile, 3) spatial audio-tactile. Results showed the visual baseline system provided higher shared awareness, efficiency and a strong learning effect. However, and although very challenging, the eyes-free systems still offered the ability to build joint awareness in remote collaborative environments, particularly the spatial audio one. These results help us better understand the potential of different feedback mechanisms in the design of future mobile collaborative environments.

© All rights reserved Trendafilov et al. and/or ACM Press

 

Brewster, Stephen, Jones, Matt, Murray-Smith, Roderick, Nanavati, A. A., Rajput, N., Schmidt, Albrecht and Turunen, M. (2011): We need to talk: rediscovering audio for universal access. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 715-716.

"In all the wonderful worlds that writing opens, the spoken word still resides and lives. Written texts all have to be related somehow, directly or indirectly, to the world of sound, the natural habitat of language, to yield their meanings." Only 22% of the human population accesses the Internet. The larger fraction of the world cannot read or write. Worldwide, 284 million people are visually impaired. And yet, there are 5.3 billion mobile subscribers, and their numbers are increasing. Much of the mobile work by HCI researchers explores a future world populated by high-end devices and relatively affluent users. This panel turns to consider the hundreds of millions of people for whom such sophistication will not be realised for many years to come. How should we design interfaces and services that are relevant and beneficial for them?

© All rights reserved Brewster et al. and/or ACM Press

2010
 

Music, Josip and Murray-Smith, Roderick (2010): Virtual hooping: teaching a phone about hula-hooping for fitness, fun and rehabilitation. In: Proceedings of 12th Conference on Human-computer interaction with mobile devices and services 2010. pp. 309-312.

The paper demonstrates the feasibility of using mobile phones for fitness and rehabilitation purposes by training them to recognise a user's hula-hooping movements. It also proposes several parameters which can be used as a measure of rhythmic movement quality. Experimental measurements were achieved with two test subjects performing two sets of steady hula-hooping. The paper compares algorithm performance with accelerometer, gyroscope and magnetometer sensor readings. Analysis of the recorded data indicated that magnetometers had some advantages over accelerometers for reliable phase extraction. Hilbert transforms were used to extract the phase information, and a Dynamic Rhythmic Primitive Model was identified for the hula-hooping movement. Together these tools allow the creation of hula-hooping performance metrics which can be used in wellness, rehabilitation or entertainment applications for mobile devices. We outline open technical challenges and possible future research directions.

© All rights reserved Music and Murray-Smith and/or their publisher
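
To illustrate the phase-extraction step mentioned above, here is a small Python sketch that applies a Hilbert transform to a synthetic magnetometer trace and derives the instantaneous hooping rate and its variability as a crude rhythm-quality measure. The sample rate, hooping frequency and noise are invented for the example; this is not the paper's pipeline.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic magnetometer trace for steady hula-hooping at ~1.7 Hz, with drift and noise.
fs = 100.0                                  # sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
trace = np.sin(2 * np.pi * 1.7 * t) + 0.2 * rng.normal(size=t.size) + 0.3 * t / t[-1]

# Remove slow drift, then take the analytic signal; its angle is the instantaneous phase.
detrended = trace - np.convolve(trace, np.ones(int(fs)) / fs, mode="same")
phase = np.unwrap(np.angle(hilbert(detrended)))

# Instantaneous frequency and its spread: simple candidates for a rhythmic-quality metric.
inst_freq = np.diff(phase) * fs / (2 * np.pi)
print(f"mean hooping rate: {inst_freq.mean():.2f} Hz, variability (sd): {inst_freq.std():.2f} Hz")
```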

 

Robinson, Simon, Jones, Matt, Eslambolchilar, Parisa, Murray-Smith, Roderick and Lindborg, Mads (2010): "I did it my way": moving away from the tyranny of turn-by-turn pedestrian navigation. In: Proceedings of 12th Conference on Human-computer interaction with mobile devices and services 2010. pp. 341-344.

In this article we describe a novel approach to pedestrian navigation using bearing-based haptic feedback. People are guided in the general direction of their destination via vibration, but additional exploratory navigation is stimulated by varying feedback based on the potential for taking alternative routes. We describe two mobile prototypes that were created to examine the possible benefits of the approach. The successful use of this exploratory navigation method is demonstrated in a realistic field trial, and we discuss the results and interesting participant behaviours that were recorded.

© All rights reserved Robinson et al. and/or their publisher

 

Vinciarelli, Alessandro, Murray-Smith, Roderick and Bourlard, Hervé (2010): Mobile social signal processing: vision and research issues. In: Proceedings of 12th Conference on Human-computer interaction with mobile devices and services 2010. pp. 513-516.

This paper introduces the First International Workshop on Mobile Social Signal Processing (SSP). The Workshop aims at bringing together the Mobile HCI and Social Signal Processing research communities. The former investigates approaches for effective interaction with mobile and wearable devices, while the latter focuses on modeling, analysis and synthesis of nonverbal behavior in human-human and human-machine interactions. While dealing with similar problems, the two domains have different goals and methodologies. However, mutual exchange of expertise is likely to raise new research questions as well as to improve approaches in both domains. After providing a brief survey of Mobile HCI and SSP, the paper introduces general aspects of the workshop (including topics, keynote speakers and dissemination means).

© All rights reserved Vinciarelli et al. and/or their publisher

 

Rogers, Simon, Williamson, John, Stewart, Craig and Murray-Smith, Roderick (2010): FingerCloud: uncertainty and autonomy handover in capacitive sensing. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 577-580.

We describe a particle filtering approach to inferring finger movements on capacitive sensing arrays. This technique allows the efficient combination of human movement models with accurate sensing models, and gives high-fidelity results with low-resolution sensor grids and tracks finger height. Our model provides uncertainty estimates, which can be linked to the interaction to provide appropriately smoothed responses as sensing performance degrades; system autonomy is increased as estimates of user behaviour become less certain. We demonstrate the particle filter approach with a map browser running with a very small sensor board, where finger position uncertainty is linked to autonomy handover.

© All rights reserved Rogers et al. and/or their publisher
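
The abstract above links estimation uncertainty to autonomy handover; the toy function below shows one way such a rule could look, blending the particle-cloud mean with a system-held fallback position as the cloud's spread grows. The thresholds and the fallback policy are assumptions for illustration only.

```python
import numpy as np

def blend_with_autonomy(particle_positions, fallback_pos, sigma_lo=0.3, sigma_hi=1.5):
    """Toy autonomy-handover rule: as the particle cloud's spread grows, weight shifts
    from the user-driven estimate (the cloud mean) to a system-held fallback position."""
    estimate = particle_positions.mean(axis=0)
    spread = particle_positions.std(axis=0).mean()                 # scalar uncertainty summary
    autonomy = float(np.clip((spread - sigma_lo) / (sigma_hi - sigma_lo), 0.0, 1.0))
    cursor = (1 - autonomy) * estimate + autonomy * fallback_pos
    return cursor, autonomy

# A confident cloud keeps the cursor on the finger; a diffuse cloud hands control back.
rng = np.random.default_rng(3)
fallback = np.array([3.5, 2.2])
for spread in (0.1, 2.0):
    cloud = rng.normal([4.0, 2.0], spread, size=(500, 2))
    cursor, autonomy = blend_with_autonomy(cloud, fallback)
    print(f"cloud sd {spread}: autonomy={autonomy:.2f}, cursor={cursor.round(2)}")
```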

 

Williamson, John, Robinson, Simon, Stewart, Craig, Murray-Smith, Roderick, Jones, Matt and Brewster, Stephen A. (2010): Social gravity: a virtual elastic tether for casual, privacy-preserving pedestrian rendezvous. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 1485-1494.

We describe a virtual "tether" for mobile devices that allows groups to have quick, simple and privacy-preserving meetups. Our design provides cues which allow dynamic coordination of rendezvous without revealing users' positions. Using accelerometers and magnetometers, combined with GPS positioning and non-visual feedback, users can probe and sense a dynamic virtual object representing the nearest meeting point. The Social Gravity system makes social bonds tangible in a virtual world which is geographically grounded, using haptic feedback to help users rendezvous. We show dynamic navigation using this physical model-based system to be efficient and robust in significant field trials, even in the presence of low-quality positioning. The use of simulators to build models of mobile geolocated systems for pre-validation purposes is discussed, and results compared with those from our trials. Our results show interesting behaviours in the social coordination task, which lead to guidelines for geosocial interaction design. The Social Gravity system proved to be very successful in allowing groups to rendezvous efficiently and simply and can be implemented using only commercially available hardware.

© All rights reserved Williamson et al. and/or their publisher
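
As a sketch of the tether idea described above, the snippet below computes a privacy-preserving meeting point (the centroid of the group's positions in a local metric frame) and turns the user's distance and heading error into simple vibration parameters. The coordinate convention, the maximum-distance scaling and the feedback mapping are assumptions, not the Social Gravity implementation.

```python
import math

def meeting_point(positions):
    """Dynamic rendezvous target: the centroid of the group's positions (local metres).
    Members only ever feel a pull toward this point, never each other's raw locations."""
    xs, ys = zip(*positions)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def tether_feedback(my_pos, my_heading_deg, target, max_vibe_dist=200.0):
    """Map the 'elastic tether' to actuator parameters: vibration strength grows with
    distance, and a signed heading error says which way to turn."""
    dx, dy = target[0] - my_pos[0], target[1] - my_pos[1]
    dist = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360          # 0 deg = local "north" (+y)
    turn = (bearing - my_heading_deg + 180) % 360 - 180       # signed error in -180..180
    strength = min(dist / max_vibe_dist, 1.0)
    return strength, turn

group = [(0.0, 0.0), (120.0, 40.0), (60.0, 180.0)]            # three walkers, local metres
target = meeting_point(group)
strength, turn = tether_feedback(my_pos=group[0], my_heading_deg=90.0, target=target)
print(f"meeting point {target}, vibration strength {strength:.2f}, turn {turn:+.0f} deg")
```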

 

Eslambolchilar, Parisa and Murray-Smith, Roderick (2010): A Model-Based Approach to Analysis and Calibration of Sensor-Based Human Interaction Loops. In International Journal of Mobile Human Computer Interaction, 2 (1) pp. 48-72.

The dynamic systems approach to the design of continuous interaction allows designers to use analytical tools such as state-space modeling and Bode diagrams to simulate and analyse the behaviour and stability of sensor-based applications, both alone and when coupled with a manual control model of user behaviour. This approach also helps designers to calibrate and tune the parameters of the sensor-based application before the actual implementation, and in response to user action. In this article the authors introduce some term definitions from manual control theory for the analysis of the continuous aspects of interaction design and human behaviour. They then provide a theoretical framework for the specification, analysis and calibration of a sensor-based zooming and scrolling application on mobile devices, including the user in the interaction loop. The approach is especially topical for guiding the design of sensor-based applications on mobile devices. The framework is tested with a tilt-controlled speed-dependent automatic zooming application on a PDA.

© All rights reserved Eslambolchilar and Murray-Smith and/or their publisher

 Cited in the following chapter:

Formal Methods: [/encyclopedia/formal_methods.html]
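
To make the state-space/Bode workflow from the article above concrete, here is a small Python sketch that models a tilt-to-scroll loop as a first-order velocity lag plus a position integrator and inspects its poles and frequency response before any implementation. The time constant and gain are illustrative placeholders, not calibrated values from the article.

```python
import numpy as np
from scipy import signal

tau = 0.25     # seconds: how quickly scroll velocity follows the tilt input (assumed)
gain = 300.0   # pixels per second per radian of tilt (assumed)

A = np.array([[-1 / tau, 0.0],    # state 1: scroll velocity, state 2: scroll position
              [1.0,      0.0]])
B = np.array([[gain / tau],
              [0.0]])
C = np.array([[0.0, 1.0]])        # we observe scroll position
D = np.array([[0.0]])
sys = signal.StateSpace(A, B, C, D)

# Stability check via the poles, and a Bode response of tilt -> position: the kind of
# pre-implementation analysis the dynamic-systems approach advocates.
print("poles:", np.linalg.eigvals(A))                  # one lag pole plus the integrator at 0
w, mag, phase = signal.bode(sys, np.logspace(-1, 2, 200))
print(f"gain at 1 rad/s: {np.interp(1.0, w, mag):.1f} dB")
```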


 
2009
 

Crossan, Andrew, McGill, Mark, Brewster, Stephen A. and Murray-Smith, Roderick (2009): Head tilting for interaction in mobile contexts. In: Proceedings of 11th Conference on Human-computer interaction with mobile devices and services 2009. p. 6.

Developing interfaces for mobile situations requires that devices are usable on the move. Here, we explore head tilting as an input technique to allow a user to interact with a mobile device 'hands free'. A Fitts' Law style evaluation is described where a user acquires targets, moving the cursor by head tilt. We explored position and velocity control cursor mechanisms in both static and mobile situations to see which provided the best level of performance. Results show that participants could successfully acquire targets using head tilting. Position control was shown to be significantly faster and more accurate in a static context, but exhibited significantly poorer accuracy and longer target acquisition times when the user was on the move. We further demonstrate how analysis of users' gait shows consistent targeting biases at different stages in the gait cycle.

© All rights reserved Crossan et al. and/or their publisher
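
For readers unfamiliar with the evaluation style mentioned above, the snippet below computes the Fitts' law index of difficulty and throughput for a few head-tilt target acquisitions. The trial numbers are entirely made up to show the calculation; they are not results from the study.

```python
import math

def fitts_id(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Throughput in bits per second for a single target acquisition."""
    return fitts_id(distance, width) / movement_time_s

# Hypothetical trials comparing position- vs velocity-control head-tilt cursors.
trials = [
    ("position, seated",  240, 40, 1.1),
    ("velocity, seated",  240, 40, 1.6),
    ("position, walking", 240, 40, 2.0),
    ("velocity, walking", 240, 40, 1.9),
]
for label, d, w, mt in trials:
    print(f"{label:18s} ID={fitts_id(d, w):.2f} bits  TP={throughput(d, w, mt):.2f} bits/s")
```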

 

Crossan, Andrew, Murray-Smith, Roderick, Brewster, Stephen A. and Musizza, Bojan (2009): Instrumented Usability Analysis for Mobile Devices. In International Journal of Mobile Human Computer Interaction, 1 (1) pp. 1-19.

Instrumented usability analysis involves the use of sensors during a usability study which provide observations from which the evaluator can infer details of the context of use, specific activities or disturbances. This is particularly useful for the evaluation of mobile and wearable devices, which are currently difficult to test realistically without constraining users in unnatural ways. To illustrate the benefits of such an approach, we present a study of touch-screen selection of on-screen targets, whilst walking and sitting, using a PocketPC instrumented with an accelerometer. From the accelerometer data the user's gait behaviour is inferred, allowing us to link performance to gait phase angle, showing there were phase regions with significantly lower error and variability. The article provides examples of how information acquired via sensors gives us quantitatively measurable information about the detailed interactions taking place when mobile, allowing designers to test and revise design decisions, based on realistic user activity.

© All rights reserved Crossan et al. and/or their publisher

 

Murray-Smith, Roderick (2009): Empowering People Rather Than Connecting Them. In International Journal of Mobile Human Computer Interaction, 1 (3) pp. 18-28.

This article discusses the consequences for the fundamentals of interaction design given the introduction of mobile devices with increased sensing capability. Location-aware systems are discussed as one example of the possibilities. The article provides eight challenges to the mobile HCI research community, and makes suggestions for how the International Journal of Mobile HCI could contribute to the field.

© All rights reserved Murray-Smith and/or his/her publisher

 

Strachan, Steven and Murray-Smith, Roderick (2009): Nonvisual, Distal Tracking of Mobile Remote Agents in Geosocial Interaction. In: Choudhury, Tanzeem, Quigley, Aaron J., Strang, Thomas and Suginuma, Koji (eds.) Location and Context Awareness - Fourth International Symposium - LoCA 2009 May 7-8, 2009, Tokyo, Japan. pp. 88-102.

2008
 

Murray-Smith, Roderick, Williamson, John, Hughes, Stephen and Quaade, Torben (2008): Stane: synthesized surfaces for tactile input. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 1299-1302.

Stane is a hand-held interaction device controlled by tactile input: scratching or rubbing textured surfaces and tapping. The system has a range of sensors, including contact microphones, capacitive sensing and inertial sensing, and provides audio and vibrotactile feedback. The surface textures vary around the device, providing perceivably different textures to the user. We demonstrate that the vibration signals generated by stroking and scratching these surfaces can be reliably classified, and can be used as a very cheaply manufacturable way to control different aspects of interaction. The system is demonstrated as a control for a music player.

© All rights reserved Murray-Smith et al. and/or ACM Press
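
A minimal sketch of the surface-classification step described above: each contact-microphone window is reduced to two cheap features (RMS energy and spectral centroid) and assigned to the nearest class centroid. The features, the synthetic "rough" and "smooth" signals and the sample rate are assumptions for illustration; the paper's own classifier is not specified here.

```python
import numpy as np

def texture_features(window, fs=4000.0):
    """Two cheap features of a contact-microphone window: RMS energy and spectral centroid."""
    spec = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    freqs = np.fft.rfftfreq(len(window), 1 / fs)
    centroid = (freqs * spec).sum() / (spec.sum() + 1e-9)
    return np.array([np.sqrt(np.mean(window ** 2)), centroid])

class NearestCentroidTexture:
    """Classify which textured surface is being rubbed by nearest centroid in feature space."""
    def fit(self, windows, labels):
        feats = np.array([texture_features(w) for w in windows])
        self.classes = sorted(set(labels))
        self.centroids = {c: feats[[l == c for l in labels]].mean(axis=0) for c in self.classes}
        return self

    def predict(self, window):
        f = texture_features(window)
        return min(self.classes, key=lambda c: np.linalg.norm(f - self.centroids[c]))

# Synthetic stand-ins for two textures: wide-band "rough" rubbing and low-passed "smooth" rubbing.
rng = np.random.default_rng(4)
rough = [rng.normal(0, 1.0, 512) for _ in range(20)]
smooth = [0.3 * np.convolve(rng.normal(0, 1.0, 512), np.ones(16) / 16, "same") for _ in range(20)]
clf = NearestCentroidTexture().fit(rough + smooth, ["rough"] * 20 + ["smooth"] * 20)
print("rough-like window  ->", clf.predict(rng.normal(0, 1.0, 512)))
print("smooth-like window ->", clf.predict(0.3 * np.convolve(rng.normal(0, 1.0, 512), np.ones(16) / 16, "same")))
```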

 

Murray-Smith, Roderick, Williamson, John, Hughes, Stephen, Quaade, Torben and Strachan, Steven (2008): Rub the Stane. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 2355-2360.

Stane is a hand-held interaction device controlled by tactile input: scratching or rubbing textured surfaces and tapping. The system has a range of sensors, including contact microphones, capacitive sensing and inertial sensing, and provides audio and vibrotactile feedback. The surface textures vary around the device, providing perceivably different textures to the user. We demonstrate that the vibration signals generated by stroking and scratching these surfaces can be reliably classified, and can be used as a very cheaply manufacturable way to control different aspects of interaction. The system is demonstrated as a control for a music player, and in a mobile spatial interaction scenario.

© All rights reserved Murray-Smith et al. and/or ACM Press

 

Serrano, Marcos, Nigay, Laurence, Lawson, Jean-Yves L., Ramsay, Andrew, Murray-Smith, Roderick and Denef, Sebastian (2008): The openinterface framework: a tool for multimodal interaction. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 3501-3506.

The area of multimodal interaction has expanded rapidly. However, the implementation of multimodal systems still remains a difficult task. Addressing this problem, we describe the OpenInterface (OI) framework, a component-based tool for rapidly developing multimodal input interfaces. The OI underlying conceptual component model includes both generic and tailored components. In addition, to enable the rapid exploration of the multimodal design space for a given system, we need to capitalize on past experiences and include a large set of multimodal interaction techniques, their specifications and documentations. In this work-in-progress report, we present the current state of the OI framework and the two exploratory test-beds developed using the OpenInterface Interaction Development Environment.

© All rights reserved Serrano et al. and/or ACM Press

 

Crossan, Andrew, Williamson, John, Brewster, Stephen A. and Murray-Smith, Roderick (2008): Wrist rotation for interaction in mobile contexts. In: Hofte, G. Henri ter, Mulder, Ingrid and Ruyter, Boris E. R. de (eds.) Proceedings of the 10th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2008 September 2-5, 2008, Amsterdam, the Netherlands. pp. 435-438.

 

Eslambolchilar, Parisa and Murray-Smith, Roderick (2008): Interact, excite, and feel. In: Schmidt, Albrecht, Gellersen, Hans-Werner, Hoven, Elise van den, Mazalek, Ali, Holleis, Paul and Villar, Nicolas (eds.) TEI 2008 - Proceedings of the 2nd International Conference on Tangible and Embedded Interaction February 18-20, 2008, Bonn, Germany. pp. 131-138.

 

Strachan, Steven and Murray-Smith, Roderick (2008): GeoPoke: rotational mechanical systems metaphor for embodied geosocial interaction. In: Proceedings of the Fifth Nordic Conference on Human-Computer Interaction 2008. pp. 543-546.

Rotational dynamic system models can be used to enrich tightly-coupled, bearing-aware embodied control of movement-sensitive mobile devices and support a more bidirectional, negotiated style of interaction. A simulated rotational spring system is used to provide natural eyes-free feedback in both the audio and haptic channels in a geosocial mobile networking context.

© All rights reserved Strachan and Murray-Smith and/or their publisher

 

Eslambolchilar, Parisa and Murray-Smith, Roderick (2008): Control centric approach in designing scrolling and zooming user interfaces. In International Journal of Human-Computer Studies, 20 (12) pp. 838-856.

The dynamic systems approach to the design of continuous interaction interfaces allows the designer to use simulations and analytical tools to analyse the behaviour and stability of the controlled system, both alone and when it is coupled with a manual control model of user behaviour. This approach also helps designers to calibrate and tune the parameters of the system before the actual implementation, and in response to user feedback. In this work we provide a dynamic systems interpretation of the coupling of internal states involved in speed-dependent automatic zooming, and test our implementation on a text browser on a Pocket PC instrumented with a tilt sensor. We illustrate simulated and experimental results of the use of the proposed coupled navigation and zooming interface using tilt and touch screen input.

© All rights reserved Eslambolchilar and Murray-Smith and/or Academic Press

2007
 

Williamson, John, Murray-Smith, Roderick and Hughes, Stephen (2007): Shoogle: excitatory multimodal interaction on mobile devices. In: Proceedings of ACM CHI 2007 Conference on Human Factors in Computing Systems 2007. pp. 121-124.

Shoogle is a novel, intuitive interface for sensing data within a mobile device, such as presence and properties of text messages or remaining resources. It is based around active exploration: devices are shaken, revealing the contents rattling around "inside". Vibrotactile display and realistic impact sonification create a compelling system. Inertial sensing is used for completely eyes-free, single-handed interaction that is entirely natural. Prototypes are described running both on a PDA and on a mobile phone with a wireless sensorpack. Scenarios of use are explored where active sensing is more appropriate than the dominant alert paradigm.

© All rights reserved Williamson et al. and/or ACM Press

 

Strachan, Steven, Williamson, John and Murray-Smith, Roderick (2007): Show me the way to Monte Carlo: density-based trajectory navigation. In: Proceedings of ACM CHI 2007 Conference on Human Factors in Computing Systems 2007. pp. 1245-1248.

We demonstrate the use of uncertain prediction in a system for pedestrian navigation via audio with a combination of Global Positioning System data, a music player, inertial sensing, magnetic bearing data and Monte Carlo sampling for a density following task, where a listener's music is modulated according to the changing predictions of user position with respect to a target density, in this case a trajectory or path. We show that this system enables eyes-free navigation around set trajectories or paths unfamiliar to the user and demonstrate that the system may be used effectively for varying trajectory width and context.

© All rights reserved Strachan et al. and/or ACM Press

 

Murray-Smith, Roderick, Ramsay, Andrew, Garrod, Simon, Jackson, Melissa and Musizza, Bojan (2007): Gait alignment in mobile phone conversations. In: Cheok, Adrian David and Chittaro, Luca (eds.) Proceedings of the 9th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2007 September 9-12, 2007, Singapore. pp. 214-221.

 

Cho, Sung-Jung, Murray-Smith, Roderick and Kim, Yeun-Bae (2007): Multi-context photo browsing on mobile devices based on tilt dynamics. In: Cheok, Adrian David and Chittaro, Luca (eds.) Proceedings of the 9th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2007 September 9-12, 2007, Singapore. pp. 190-197.

 

Blankertz, Benjamin, Krauledat, Matthias, Dornhege, Guido, Williamson, John, Murray-Smith, Roderick and Müller, Klaus-Robert (2007): A Note on Brain Actuated Spelling with the Berlin Brain-Computer Interface. In: Stephanidis, Constantine (ed.) Universal Access in Human-Computer Interaction. Ambient Interaction, 4th International Conference on Universal Access in Human-Computer Interaction, UAHCI 2007 Held as Part of HCI International 2007 Beijing, China, July 22-27, 2007 Proceedings, Part II July 22-27, 2007, Beijing, China. pp. 759-768.

2006
 

Williamson, John, Strachan, Steven and Murray-Smith, Roderick (2006): It's a long way to Monte Carlo: probabilistic display in GPS navigation. In: Proceedings of 8th conference on Human-computer interaction with mobile devices and services 2006. pp. 89-96.

We present a mobile, GPS-based multimodal navigation system, equipped with inertial control, that allows users to explore and navigate through an augmented physical space, incorporating and displaying the uncertainty resulting from inaccurate sensing and unknown user intentions. The system propagates uncertainty appropriately via Monte Carlo sampling and predicts at a user-controllable time horizon. Control of the Monte Carlo exploration is entirely tilt-based. The system output is displayed both visually and in audio. Audio is rendered via granular synthesis to accurately display the probability of the user reaching targets in the space. We also demonstrate the use of uncertain prediction in a trajectory following task, where a section of music is modulated according to the changing predictions of user position with respect to the target trajectory. We show that appropriate display of the full distribution of potential future user positions with respect to sites-of-interest can improve the quality of interaction over a simplistic interpretation of the sensed data.

© All rights reserved Williamson et al. and/or ACM Press
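
The Monte Carlo display idea above can be sketched very compactly: sample many possible futures from noisy position, heading and speed estimates, propagate them to a time horizon, and report the fraction that land inside the target region. The noise magnitudes, horizon and target geometry below are assumptions; in the paper the horizon is tilt-controlled and the probability drives granular audio synthesis rather than a print statement.

```python
import numpy as np

def arrival_probability(pos, heading_rad, speed, target, target_radius,
                        horizon_s=30.0, n_samples=1000, rng=None):
    """Fraction of sampled futures that end up inside the target region at the horizon."""
    rng = rng or np.random.default_rng()
    headings = heading_rad + rng.normal(0, np.radians(15), n_samples)      # bearing uncertainty
    speeds = np.clip(speed + rng.normal(0, 0.3, n_samples), 0, None)       # walking-speed noise
    starts = pos + rng.normal(0, 5.0, (n_samples, 2))                      # GPS position noise
    steps = np.column_stack([np.sin(headings), np.cos(headings)]) * (speeds * horizon_s)[:, None]
    inside = np.linalg.norm(starts + steps - target, axis=1) < target_radius
    return inside.mean()

# Walker heading roughly north at 1.4 m/s; target 40 m away with a 15 m radius.
p = arrival_probability(pos=np.array([0.0, 0.0]), heading_rad=np.radians(10), speed=1.4,
                        target=np.array([5.0, 40.0]), target_radius=15.0,
                        rng=np.random.default_rng(5))
print(f"probability of being at the target in 30 s: {p:.2f}")
```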

 

Williamson, John, Strachan, Steven and Murray-Smith, Roderick (2006): It's a long way to Monte Carlo: probabilistic display in GPS navigation. In: Nieminen, Marko and Röykkee, Mika (eds.) Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2006 September 12-15, 2006, Helsinki, Finland. pp. 89-96.

 

Crossan, Andrew and Murray-Smith, Roderick (2006): Rhythmic Interaction for Song Filtering on a Mobile Device. In: McGookin, David K. and Brewster, Stephen A. (eds.) HAID 2006 - Haptic and Audio Interaction Design - First International Workshop August 31 - September 1, 2006, Glasgow, UK. pp. 45-55.

2005
 

Crossan, Andrew, Murray-Smith, Roderick, Brewster, Stephen A., Kelly, James and Musizza, Bojan (2005): Gait phase effects in mobile interaction. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1312-1315.

One problem evaluating mobile and wearable devices is that they are used in mobile settings, making it hard to collect usability data. We present a study of tap-based selection of on-screen targets whilst walking and sitting, using a PocketPC instrumented with an accelerometer to collect information about user activity at the time of each tap. From these data the user's gait can be derived, and this is then used to investigate preferred tapping behaviour relative to gait phase, and associated tap accuracy. Results showed that users were more accurate sitting than walking. When walking there were phase regions with significantly increased tap likelihood, and these regions had significantly lower error rates, and lower error variability. This work represents an example of accelerometer-instrumented mobile usability analysis, and the results give a quantitative understanding of the detailed interactions taking place when on the move, allowing us to develop better mobile interfaces.

© All rights reserved Crossan et al. and/or ACM Press

 

Strachan, Steven, Eslambolchilar, Parisa, Murray-Smith, Roderick, Hughes, Stephen and O'Modhrain, Sile (2005): GpsTunes: controlling navigation via audio feedback. In: Proceedings of 7th conference on Human-computer interaction with mobile devices and services 2005. pp. 275-278.

We combine the functionality of a mobile Global Positioning System (GPS) with that of an MP3 player, implemented on a PocketPC, to produce a handheld system capable of guiding a user to their desired target location via continuously adapted music feedback. We illustrate how the approach to presentation of the audio display can benefit from insights from control theory, such as predictive 'browsing' elements to the display, and the appropriate representation of uncertainty or ambiguity in the display. The probabilistic interpretation of the navigation task can be generalised to other context-dependent mobile applications. This is the first example of a completely handheld location-aware music player. We discuss scenarios for use of such systems.

© All rights reserved Strachan et al. and/or ACM Press
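
To illustrate the core loop described above, the sketch below computes the bearing from the user's GPS fix to the target and maps the heading error to simple playback parameters (pan and volume). The coordinates and the particular mapping are hypothetical; the real system uses a richer, uncertainty-aware audio display.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def music_params(user_heading_deg, target_bearing_deg):
    """Pan toward the target and play at full volume when walking straight at it,
    fading (but never muting) as the heading error grows."""
    err = (target_bearing_deg - user_heading_deg + 180) % 360 - 180   # signed error, -180..180
    pan = max(-1.0, min(1.0, err / 90.0))
    volume = max(0.2, 1.0 - abs(err) / 180.0)
    return pan, volume

user, target = (55.8700, -4.2920), (55.8724, -4.2900)   # hypothetical walker and destination
b = bearing_deg(*user, *target)
pan, vol = music_params(user_heading_deg=45.0, target_bearing_deg=b)
print(f"target bearing {b:.0f} deg, pan {pan:+.2f}, volume {vol:.2f}")
```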

 

Strachan, Steven, Eslambolchilar, Parisa, Murray-Smith, Roderick, Hughes, Stephen and O'Modhrain, M. Sile (2005): GpsTunes: controlling navigation via audio feedback. In: Tscheligi, Manfred, Bernhaupt, Regina and Mihalic, Kristijan (eds.) Proceedings of the 7th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2005 September 19-22, 2005, Salzburg, Austria. pp. 275-278.

2004
 

Lantz, Vuokko and Murray-Smith, Roderick (2004): Rhythmic interaction with a mobile device. In: Proceedings of the Third Nordic Conference on Human-Computer Interaction October 23-27, 2004, Tampere, Finland. pp. 97-100.

We describe a rhythmic interaction mechanism for mobile devices. A PocketPC with a three-degree-of-freedom linear accelerometer is used as the experimental platform for data acquisition. Dynamic Movement Primitives are used to learn the limit cycle behavior associated with the rhythmic gestures. We outline the open technical and user experience challenges in the development of usable rhythmic interfaces.

© All rights reserved Lantz and Murray-Smith and/or ACM Press

 

Eslambolchilar, Parisa and Murray-Smith, Roderick (2004): Tilt-Based Automatic Zooming and Scaling in Mobile Devices - A State-Space Implementation. In: Brewster, Stephen A. and Dunlop, Mark D. (eds.) Mobile Human-Computer Interaction - Mobile HCI 2004 - 6th International Symposium September 13-16, 2004, Glasgow, UK. pp. 120-131.

 

Crossan, Andrew and Murray-Smith, Roderick (2004): Variability in Wrist-Tilt Accelerometer Based Gesture Interfaces. In: Brewster, Stephen A. and Dunlop, Mark D. (eds.) Mobile Human-Computer Interaction - Mobile HCI 2004 - 6th International Symposium September 13-16, 2004, Glasgow, UK. pp. 144-155.

 

Strachan, Steven, Murray-Smith, Roderick, Oakley, Ian and Ängeslevä, Jussi (2004): Dynamic Primitives for Gestural Interaction. In: Brewster, Stephen A. and Dunlop, Mark D. (eds.) Mobile Human-Computer Interaction - Mobile HCI 2004 - 6th International Symposium September 13-16, 2004, Glasgow, UK. pp. 325-330.

 

Crossan, Andrew, Williamson, John and Murray-Smith, Roderick (2004): Haptic Granular Synthesis: Targeting, Visualisation and Texturing. In: IV 2004 - 8th International Conference on Information Visualisation 14-16 July, 2004, London, UK. pp. 527-532.

2001
 

Brewster, Stephen A. and Murray-Smith, Roderick (eds.) (2001): Haptic human-computer interaction: first international workshop, Glasgow, UK, August 31 - September 1, 2000. Springer-Verlag

 


Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/roderick_murray-smith.html
