Number of co-authors: 47
Number of publications with his 3 most frequent co-authors: Yvonne Rogers (15), Eva Hornecker (6), Jochen Rick (4)
Paul Marshall's 3 most productive colleagues, by number of publications: Albrecht Schmidt (111), Yvonne Rogers (99), Panos Markopoulos (81)
Publications by Paul Marshall (bibliography)
Marshall, Paul, Rogers, Yvonne and Pantidi, Nadia (2011): Using F-formations to analyse spatial patterns of interaction in physical environments. In: Proceedings of ACM CSCW11 Conference on Computer-Supported Cooperative Work 2011. pp. 445-454. Available online
There are few conceptual tools available to analyse physical spaces in terms of their support for social interactions and their potential for technological augmentation. In this paper, we describe how we used Adam Kendon's characterisation of the F-formation system of spatial organisation as a conceptual lens to analyse the social interactions between visitors and staff in a tourist information centre. We describe how the physical structures in the space encouraged and discouraged particular kinds of interactions and discuss how F-formations might be used to think about augmenting physical spaces.
© All rights reserved Marshall et al. and/or their publisher
Döring, Tanja, Kern, Dagmar, Marshall, Paul, Pfeiffer, Max, Schöning, Johannes, Gruhn, Volker and Schmidt, Albrecht (2011): Gestural interaction on the steering wheel: reducing the visual demand. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 483-492. Available online
Cars offer an increasing number of infotainment systems as well as comfort functions that can be controlled by the driver. In our research, we investigate new interaction techniques that aim to make it easier to interact with these systems while driving. We suggest utilizing the steering wheel as an additional interaction surface. In this paper, we present two user studies conducted with a working prototype of a multi-touch steering wheel. In the first, we developed a user-defined steering wheel gesture set, and in the second, we applied the identified gestures and compared their application to conventional user interaction with infotainment systems in terms of driver distraction. The main outcome was that the driver's visual demand is reduced significantly by using gestural interaction on the multi-touch steering wheel.
© All rights reserved Döring et al. and/or their publisher
Marshall, Paul, Morris, Richard, Rogers, Yvonne, Kreitmayer, Stefan and Davies, Matt (2011): Rethinking 'multi-user': an in-the-wild study of how groups approach a walk-up-and-use tabletop interface. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 3033-3042. Available online
Multi-touch tabletops have been much heralded as an innovative technology that can facilitate new ways of group working. However, there is little evidence of these materialising outside of research lab settings. We present the findings of a 5-week in-the-wild study examining how a shared planning application -- designed to run on a walk-up-and-use tabletop -- was used when placed in a tourist information centre. We describe how groups approached, congregated and interacted with it and the social interactions that took place -- noting how they were quite different from research findings describing the ways groups work around a tabletop in lab settings. We discuss the implications of such situated group work for designing collaborative tabletop applications for use in public settings.
© All rights reserved Marshall et al. and/or their publisher
Antle, Alissa N., Marshall, Paul and Hoven, Elise van den (2011): Workshop on embodied interaction: theory and practice in HCI. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 5-8. Available online
For over ten years researchers in human-computer interaction (HCI) have explored an embodied perspective that seeks to describe and explain the fundamental role played by the physical body in how we experience, interact with and understand computation in the world we live in. Recently, such a perspective has been used to discuss human actions and interactions with a range of computational applications including tangibles, mobiles, wearables, tabletops and interactive environments. This workshop aims to enable participants to critically explore the different approaches to incorporating an embodied perspective in HCI research, and to develop a shared set of understandings and identification of differences, similarities and synergies between our research approaches.
© All rights reserved Antle et al. and/or their publisher
Olsen, Anneli, Schmidt, Albrecht, Marshall, Paul and Sundstedt, Veronica (2011): Using eye tracking for interaction. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 741-744. Available online
The development of cheaper eye trackers and open source software for eye tracking and gaze interaction brings the possibility to integrate eye tracking into everyday use devices as well as highly specialized equipment. Apart from providing means for analyzing eye movements, eye tracking also offers the possibility of a natural user interaction modality. Gaze control interfaces are already used within assistive applications for disabled users. However, this novel user interaction possibility comes with its own set of limitations and challenges. The aim of this SIG is to provide a forum for designers, researchers and usability professionals to discuss the role of eye tracking as a user interaction method in the future, as well as the technical and user interaction challenges that using eye tracking as an interaction method brings.
© All rights reserved Olsen et al. and/or their publisher
Rick, Jochen, Marshall, Paul and Yuill, Nicola (2011): Beyond one-size-fits-all: how interactive tabletops support collaborative learning. In: Proceedings of ACM IDC11 Interaction Design and Children 2011. pp. 109-117. Available online
Previous research has demonstrated the capacity of interactive tabletops to support co-located collaborative learning; however, these analyses have been at a coarse scale -- focusing on general trends across conditions. In this paper, we offer a complementary perspective by focusing on specific group dynamics. We detail three cases of dyads using the DigiTile application to work on fraction challenges. While all pairs perform well, their group dynamics are distinctive; as a consequence, the benefits of working together and the benefits of using an interactive tabletop are different for each pair. Thus, we demonstrate that one size does not fit all when characterizing how interactive tabletops support collaborative learning.
© All rights reserved Rick et al. and/or ACM Press
Hazlewood, William R., Dalton, Nick, Marshall, Paul, Rogers, Yvonne and Hertrich, Susanna (2010): Bricolage and consultation: addressing new design challenges when building large-scale installations. In: Proceedings of DIS10 Designing Interactive Systems 2010. pp. 380-389. Available online
We describe the many challenges faced when designing, implementing and embedding large-scale installations in a physical space, such as a building. A case study is presented of a distributed ambient display system intended to inform, lure and influence people when moving through the building. We outline the wide range of technical, user, aesthetic and practical aspects that need to be addressed, pointing out how many unpredictable problems can surface when going 'big', 'physical' and 'out of the PC'. We argue that a different set of 'non-user-centered' processes is required. Furthermore, we propose a new design implementation approach that includes aspects of iterative design, but with the new processes of bricolage and consultation added for progressing the design.
© All rights reserved Hazlewood et al. and/or their publisher
Rogers, Yvonne, Hazlewood, William R., Marshall, Paul, Dalton, Nick and Hertrich, Susanna (2010): Ambient influence: can twinkly lights lure and abstract representations trigger behavioral change?. In: Proceedings of the 2010 International Conference on Ubiquitous Computing 2010. pp. 261-270. Available online
Can ubiquitous technologies be designed to nudge people to change their behavior? If so, how? We describe an ambient installation that was intended to help people decide -- and to encourage them to reflect -- when confronted with a choice. In this particular case, it was whether to take the stairs or the elevator in their place of work. The rationale was to push people towards a desired behavior at the point of decision-making and to reflect upon theirs and others' aggregate behavior. We describe the ambient displays that were developed and the prototyping studies in which they were evaluated. The findings from an in-the-wild study are then presented. They reveal that even though people said they were not aware of changing their behavior, logged data of their actual behavior showed a significant change. We discuss these mixed findings in relation to whether ambient displays can influence at an unconscious or conscious level.
© All rights reserved Rogers et al. and/or their publisher
Kern, Dagmar, Marshall, Paul and Schmidt, Albrecht (2010): Gazemarks: gaze-based visual placeholders to ease attention switching. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2093-2102. Available online
Many tasks require attention switching. For example, searching for information on one sheet of paper and then entering this information onto another one. With paper we see that people use fingers or objects as placeholders. Using these simple aids, the process of switching attention between displays can be simplified and speeded up. With large or multiple visual displays we have many tasks where both attention areas are on the screen and where using a finger as a placeholder is not suitable. One way users deal with this is to use the mouse and highlight their current focus. However, this also has its limitations -- in particular in environments where there is no pointing device. Our approach is to utilize the user's gaze position to provide a visual placeholder. The last area where a user fixated on the screen (before moving their attention away) is highlighted; we call this visual reminder a Gazemark. Gazemarks ease orientation and the resumption of the interrupted task when coming back to this display. In this paper we report on a study where the effectiveness of using Gazemarks was investigated, in particular we show how they can ease attention switching. Our results show faster completion times for a resumed simple visual search task when using this technique. The paper analyzes relevant parameters for the implementation of Gazemarks and discusses some further application areas for this approach.
© All rights reserved Kern et al. and/or their publisher
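The mechanism the abstract describes -- remembering the last stable on-screen fixation before the user's attention leaves the display, then showing it as a placeholder on return -- can be sketched in a few lines. This is a hypothetical illustration only: the class, method names and fixation-radius threshold are assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the Gazemark idea: track the last fixation on a
# display, and freeze it as a visual placeholder when gaze leaves the
# display. Names and thresholds are illustrative, not from the paper.

class GazemarkTracker:
    def __init__(self, fixation_radius=40):
        self.fixation_radius = fixation_radius  # px of jitter tolerated within one fixation
        self.last_fixation = None               # (x, y) of the most recent stable fixation
        self.gazemark = None                    # placeholder shown after gaze leaves

    def on_gaze_sample(self, x, y, on_display):
        if on_display:
            self.gazemark = None  # user is looking at the display: no placeholder needed
            if self.last_fixation is None or self._dist(x, y) > self.fixation_radius:
                self.last_fixation = (x, y)  # gaze moved far enough: a new fixation begins
        else:
            # Gaze left the display: freeze the last fixation as a Gazemark.
            if self.gazemark is None and self.last_fixation is not None:
                self.gazemark = self.last_fixation

    def _dist(self, x, y):
        fx, fy = self.last_fixation
        return ((x - fx) ** 2 + (y - fy) ** 2) ** 0.5
```

A renderer would highlight `gazemark` whenever it is set, easing resumption of the interrupted visual task.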
Dalton, Sheep N., Marshall, Paul and Dalton, Ruth Conroy (2010): Measuring environments for public displays: a space syntax approach. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 3841-3846. Available online
This paper reports on an on-going project, which is investigating the role that location plays in the visibility of information presented on a public display. Spatial measures are presented, derived from the architectural theory of Space Syntax. These are shown to relate to the memorability of words and images presented on different displays. Results show a complex pattern of interactions between the size and shape of spaces in which displays are situated and the memorability of different types of representations depicted. This approach offers a new way to consider the role of space in guiding and constraining interaction in real settings: a growing concern within HCI and Ubicomp.
© All rights reserved Dalton et al. and/or their publisher
Marshall, Paul, Fleck, Rowanne, Harris, Amanda, Rick, Jochen, Hornecker, Eva, Rogers, Yvonne, Yuill, Nicola and Dalton, Nick Sheep (2009): Fighting for control: children's embodied interactions when using physical and digital representations. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 2149-2152. Available online
Tabletop and tangible interfaces are often described in terms of their support for shared access to digital resources. However, it is not always the case that collaborators want to share and help one another. In this paper we detail a video-analysis of a series of prototyping sessions with children who used both cardboard objects and an interactive tabletop surface. We show how the material qualities of the digital interface and physical objects affect the kinds of bodily strategies adopted by children to stop others from accessing them. We discuss how children fight for and maintain control of physical versus digital objects in terms of embodied interaction and what this means when designing collaborative applications for shareable interfaces.
© All rights reserved Marshall et al. and/or ACM Press
Zaman, Bieke, Abeele, Vero Vanden, Markopoulos, Panos and Marshall, Paul (2009): Tangibles for children, the challenges. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4729-4732. Available online
A significant proportion of research in the field of tangible interaction involves children. A common aspiration is to offer benefits through tangibility, related to ease of use and overall user experience, while also supporting learning and developmental processes. However, evaluation results are often equivocal, and researchers' expectations are not always verified. This workshop aims to attract researchers who approach this topic of tangibility and children from an empirical or design perspective. The purpose is to obtain a good picture of what benefits we expect tangibility to provide (including novel and future applications), establish the current empirical evidence that supports such claims (or what is missing), and motivate appropriate evaluation methodologies for children.
© All rights reserved Zaman et al. and/or ACM Press
England, David, Hornecker, Eva, Roast, Chris, Romero, Pablo, Fergus, Paul and Marshall, Paul (2009): Whole body interaction. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4815-4818. Available online
Holland, Simon, Marshall, Paul, Bird, Jon, Dalton, Nick Sheep, Morris, Richard, Pantidi, Nadia, Rogers, Yvonne and Clark, Andy (2009): Running up Blueberry Hill: prototyping whole body interaction in harmony space. In: Villar, Nicolas, Izadi, Shahram, Fraser, Mike and Benford, Steve (eds.) TEI 2009 - Proceedings of the 3rd International Conference on Tangible and Embedded Interaction February 16-18, 2009, Cambridge, UK. pp. 93-98. Available online
Rick, Jochen, Harris, Amanda, Marshall, Paul, Fleck, Rowanne, Yuill, Nicola and Rogers, Yvonne (2009): Children designing together on a multi-touch tabletop: an analysis of spatial orientation and user interactions. In: Proceedings of ACM IDC09 Interaction Design and Children 2009. pp. 106-114. Available online
Applications running on multi-touch tabletops are beginning to be developed to enable children to collaborate on a variety of activities, from photo sharing to playing games. However, little is known as to how children work together on such interactive surfaces. We present a study that investigated groups of children's use of a multi-touch tabletop for a shared-space design task, requiring reasoning and compromise. The OurSpace application was designed to allow children to arrange the desks in their classroom and allocate students to seats around those desks. A number of findings are reported, including a comparison of single versus multiple touch, equity of participation, and an analysis of how a child's tabletop position affects where he or she touches. A main finding was that children used all of the tabletop surface, but took more responsibility for the parts of the design closer to their relative position.
© All rights reserved Rick et al. and/or ACM Press
Antle, Alissa N., Fernaeus, Ylva and Marshall, Paul (2009): Children and embodied interaction: seeking common ground. In: Proceedings of ACM IDC09 Interaction Design and Children 2009. pp. 306-308. Available online
As computation plays an ever larger role as an embedded part of the environment, research that seeks to understand the embodied nature of children's interactions with computation becomes increasingly important. Embodied interaction is an approach to understanding human-computer interaction that seeks to investigate and support the complex interplay of mind, body and environment in interaction. Recently, such a perspective has been used to discuss human actions and interactions with a range of computational applications including tangibles, mobiles, robotics and gesture-based interfaces. Physically-based forms of child computer interaction including body movements, the ability to touch, feel, manipulate and build sensory awareness of the relationships in the world are crucial to children's cognitive and social development. This workshop aims to critically explore the different approaches to incorporating an embodied perspective in children's interaction design and HCI research, and to develop a shared set of understandings and identification of differences, similarities and synergies between our research approaches.
© All rights reserved Antle et al. and/or ACM Press
Kern, Dagmar, Marshall, Paul, Hornecker, Eva, Schmidt, Albrecht and Rogers, Yvonne (2009): Enhancing Navigation Information with Tactile Output Embedded into the Steering Wheel. In: Proceedings of Pervasive 2009. pp. 42-58. Available online
Fleck, Rowanne, Rogers, Yvonne, Yuill, Nicola, Marshall, Paul, Carr, Amanda, Rick, Jochen and Bonnett, Victoria (2009): Actions speak loudly with words: unpacking collaboration around the table. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 189-196. Available online
The potential of tabletops to enable groups of people to simultaneously touch and manipulate a shared tabletop interface provides new possibilities for supporting collaborative learning. However, findings from the few studies carried out to date have tended to show small or insignificant effects compared with other technologies. We present the Collaborative Learning Mechanisms framework used to examine the coupling of verbal interactions and physical actions in collaboration around the tabletop and reveal subtle mechanisms at play. Analysis in this way revealed that what might be considered undesirable or harmful interactions and intrusions in general collaborative settings, might be beneficial for collaborative learning. We discuss the implications of these findings for how tabletops may be used to support children's collaboration, and the value of considering verbal and physical aspects of interaction together in this way.
© All rights reserved Fleck et al. and/or their publisher
Marshall, Paul, Cheng, Peter C.-H. and Luckin, Rosemary (2009): Tangibles in the balance: a discovery learning task with physical or graphical materials. In: Proceedings of the 4th International Conference on Tangible and Embedded Interaction 2009. pp. 153-160. Available online
An assumption behind much work on the use of tangibles for learning is that there are individual cognitive benefits related to the physical manipulation of materials. However, previous work that has shown learning benefits in using physical materials often hasn't adequately controlled for the covariates of physicality. In this paper, we describe a study where we compared the effects on adults' discovery learning on a balance beam task of using either physical or graphical materials and with either control or no control over the design of experiments. No effects were found of either the type of learning material or the level of control over the experimental design.
© All rights reserved Marshall et al. and/or their publisher
Bird, Jon, Marshall, Paul and Rogers, Yvonne (2009): Low-fi skin vision: a case study in rapid prototyping a sensory substitution system. In: Proceedings of the HCI09 Conference on People and Computers XXIII 2009. pp. 55-64. Available online
We describe the design process we have used to develop a minimal, twenty vibration motor Tactile Vision Sensory Substitution (TVSS) system which enables blindfolded subjects to successfully track and bat a rolling ball and thereby experience 'skin vision'. We have employed a low-fi rapid prototyping approach to build this system and argue that this methodology is particularly effective for building embedded interactive systems. We support this argument in two ways. First, by drawing on theoretical insights from robotics, a discipline that also has to deal with the challenge of building complex embedded systems that interact with their environments; second, by using the development of our TVSS as a case study: describing the series of prototypes that led to our successful design and highlighting what we learnt at each stage.
© All rights reserved Bird et al. and/or their publisher
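The core mapping any twenty-motor TVSS of this kind needs -- reducing a camera frame to a small grid of vibration intensities -- can be sketched as follows. The 4x5 grid shape, function name and 0-255 input range are illustrative assumptions; the paper's actual hardware and mapping details are not reproduced here.

```python
# Hypothetical sketch of an image-to-motor mapping for a 20-motor
# tactile vision substitution system: downsample a grayscale frame to a
# 4x5 grid and treat each cell's mean brightness as a vibration level.

def frame_to_motor_levels(frame, rows=4, cols=5):
    """Average an intensity image (list of equal-length rows of 0-255
    values) into rows x cols vibration levels in the range 0.0-1.0."""
    h, w = len(frame), len(frame[0])
    levels = []
    for r in range(rows):
        row_levels = []
        for c in range(cols):
            # The block of pixels feeding this motor.
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            block = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row_levels.append(sum(block) / len(block) / 255.0)
        levels.append(row_levels)
    return levels
```

Each output cell would then drive one motor's duty cycle, so a bright ball rolling across the camera's view sweeps a band of vibration across the skin.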
Rogers, Yvonne, Lim, Youn-kyung, Hazlewood, William R. and Marshall, Paul (2009): Equal Opportunities: Do Shareable Interfaces Promote More Group Participation Than Single User Displays?. In Human-Computer Interaction, 24 (1) pp. 79-116. Available online
Computers designed for single use are often appropriated suboptimally when used by small colocated groups working together. Our research investigates whether shareable interfaces -- that are designed for more than one user to interact with -- can facilitate more equitable participation in colocated group settings compared with single user displays. We present a conceptual framework that characterizes Shared Information Spaces (SISs) in terms of how they constrain and invite participation using different entry points. An experiment was conducted that compared three different SISs: a physical-digital set-up (least constrained), a multitouch tabletop (medium), and a laptop display (most constrained). Statistical analyses showed there to be little difference in participation levels between the three conditions other than a predictable lack of equity of control over the interface in the laptop condition. However, detailed qualitative analyses revealed more equitable participation took place in the physical-digital condition in terms of verbal utterances over time. Those who spoke the least contributed most to the physical design task. The findings are discussed in relation to the conceptual framework and, more generally, in terms of how to select, design, and combine different display technologies to support collaborative activities.
© All rights reserved Rogers et al. and/or Taylor and Francis
Hornecker, Eva, Marshall, Paul, Dalton, Nick Sheep and Rogers, Yvonne (2008): Collaboration and interference: awareness with mice or touch input. In: Proceedings of ACM CSCW08 Conference on Computer-Supported Cooperative Work 2008. pp. 167-176. Available online
Multi-touch surfaces are becoming increasingly popular. An assumed benefit is that they can facilitate collaborative interactions in co-located groups. In particular, being able to see another's physical actions can enhance awareness, which in turn can support fluid interaction and coordination. However, there is a paucity of empirical evidence or measures to support these claims. We present an analysis of different aspects of awareness in an empirical study that compared two kinds of input: multi-touch and multiple mice. For our analysis, a set of awareness indices was derived from the CSCW and HCI literatures, which measures both the presence and absence of awareness in co-located settings. Our findings indicate higher levels of awareness for the multi-touch condition accompanied by significantly more actions that interfere with each other. A subsequent qualitative analysis shows that the interactions in this condition were more fluid and that interference was quickly resolved. We suggest that it is more important that resources are available to negotiate interference rather than necessarily to attempt to prevent it.
© All rights reserved Hornecker et al. and/or ACM Press
Marshall, Paul, Hornecker, Eva, Morris, Richard, Dalton, Nick Sheep and Rogers, Yvonne (2008): When the fingers do the talking: A study of group participation with varying constraints to a tabletop interface. In: Third IEEE International Workshop on Tabletops and Interactive Surfaces Tabletop 2008 October 1-3, 2008, Amsterdam, The Netherlands. pp. 33-40. Available online
Marshall, Paul (2007): Do tangible interfaces enhance learning?. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 163-170. Available online
Conceptual work on tangible interfaces has focused primarily on the production of descriptive frameworks. While this work has been successful in mapping out a space of technical possibilities and providing a terminology to ground discussion, it provides little guidance on the cognitive or social effects of using one type of interface or another. In this paper we look at the area of learning with tangible interfaces, suggesting that more empirically grounded research is needed to guide development. We provide an analytic framework of six perspectives, which describes latent trends and assumptions that might be used to motivate and guide this work, and makes links with existing research in cognitive science and education.
© All rights reserved Marshall and/or ACM Press
Hornecker, Eva, Marshall, Paul and Rogers, Yvonne (2007): From entry to access: how shareability comes about. In: Koskinen, Ilpo and Keinonen, Turkka (eds.) DPPI 2007 - Proceedings of the 2007 International Conference on Designing Pleasurable Products and Interfaces August 22-25, 2007, Helsinki, Finland. pp. 328-342. Available online
Fitzpatrick, Geraldine, Marshall, Paul and Phillips, Anthony (2006): CVS integration with notification and chat: lightweight software team collaboration. In: Proceedings of ACM CSCW06 Conference on Computer-Supported Cooperative Work 2006. pp. 49-58. Available online
Code management systems like the Concurrent Versions System (CVS) can play an important role in supporting coordination in software development, but often at some time removed from original CVS log entries or removed from the informal conversations around the code. The focus of this paper is one team's long term use of a solution where CVS is augmented with a lightweight event notification system, Elvin, and a tickertape tool where CVS messages are displayed and where developers can also chat with one another. Through a statistical analysis of CVS logs, and a qualitative analysis of tickertape logs and interview data, there is evidence of the tool transforming archival log entries into communicative acts and supporting timely interactions. Developers used the close integration of CVS with chat for growing team culture, stimulating focused discussion, supplementing log information, marking phases of work, coordinating and negotiating work, and managing availability and interruptibility. This has implications for consideration of more lightweight solutions for supporting collaborative software development, as well as managing awareness and interruptions more generally.
© All rights reserved Fitzpatrick et al. and/or ACM Press
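The integration pattern this study examines -- commit log entries broadcast over a notification bus into the same tickertape channel as chat -- can be sketched minimally. Everything below is hypothetical: a plain in-process publish/subscribe class stands in for Elvin, and the hook and message format are invented for illustration.

```python
# Hypothetical sketch of the CVS + Elvin + tickertape pattern: a
# post-commit hook publishes the log message onto a notification bus,
# where it appears alongside chat in the same tickertape stream.

class NotificationBus:
    """Minimal in-process stand-in for an Elvin-style notification service."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, message):
        for callback in self.subscribers:
            callback(message)

def post_commit_hook(bus, author, files, log_message):
    # Turn an archival log entry into a communicative act on the bus.
    bus.publish(f"COMMIT {author}: {log_message} ({', '.join(files)})")

ticker = []                             # the tickertape display's message list
bus = NotificationBus()
bus.subscribe(ticker.append)            # tickertape subscribes to everything
post_commit_hook(bus, "alice", ["ui.c"], "fix layout bug")
bus.publish("bob: nice catch!")         # chat shares the same channel
```

Because commits and chat flow through one channel, a log entry can immediately prompt the kind of focused discussion the paper describes.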
Marshall, Paul, Payandeh, Shahram and Dill, John (2006): A Study on Haptic Rendering in a Simulated Surgical Training Environment. In: HAPTICS 2006 - 14th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems 25-26 March, 2006, Arlington, VA, USA. p. 35. Available online
Marshall, Paul, Price, Sara and Rogers, Yvonne (2003): Conceptualising tangibles to support learning. In: Proceedings of ACM IDC03: Interaction Design and Children 2003. pp. 101-109. Available online
We present a new way of conceptualising tangibles for learning. This scheme adopts Heidegger's analysis of the ways a user can treat a tool: either as 'ready-to-hand' or 'present-at-hand'. It also proposes two types of activity a learner can engage in when using a tangible: either exploratory or expressive activity. Finally, two types of models that a user can explore are proposed: theoretical and practical models. Examples from the literature are described in terms of this framework and an example is given from our own work of an attempt to use this conceptualisation in design.
© All rights reserved Marshall et al. and/or ACM Press
Page maintainer: The Editorial Team