Number of co-authors: 16
Number of publications with 3 favourite co-authors:
Karon E. MacLean: 4
Kellogg S. Booth: 3
Alex Arthur: 2
Colin Swindells's 3 most productive colleagues in number of publications:
Kori Inkpen: 70
Kellogg S. Booth: 56
Karon E. MacLean: 26
To design an easy-to-use interface, pay attention to what users do, not what they say. Self-reported claims are unreliable, as are user speculations about future behavior.
-- Jakob Nielsen
Publications by Colin Swindells (bibliography)
Oviatt, Sharon, Swindells, Colin and Arthur, Alex (2008): Implicit user-adaptive system engagement in speech and pen interfaces. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 969-978.
As emphasis is placed on developing mobile, educational, and other applications that minimize cognitive load on users, it is becoming more essential to explore interfaces based on implicit engagement techniques so users can remain focused on their tasks. In this research, data were collected with 12 pairs of students who solved complex math problems using a tutorial system that they engaged over 100 times per session entirely implicitly via speech amplitude or pen pressure cues. Results revealed that users spontaneously, reliably, and substantially adapted these forms of communicative energy to designate and repair an intended interlocutor in a computer-mediated group setting. Furthermore, this behavior was harnessed to achieve system
© All rights reserved Oviatt et al. and/or ACM Press
Cohen, Philip R., Swindells, Colin, Oviatt, Sharon L. and Arthur, Alexander M. (2008): A high-performance dual-wizard infrastructure for designing speech, pen, and multimodal interfaces. In: Digalakis, Vassilios, Potamianos, Alexandros, Turk, Matthew, Pieraccini, Roberto and Ivanov, Yuri (eds.) Proceedings of the 10th International Conference on Multimodal Interfaces - ICMI 2008 October 20-22, 2008, Chania, Crete, Greece. pp. 137-140.
The present paper reports on the design and performance of a novel dual-wizard simulation infrastructure that has been used effectively to prototype next-generation adaptive and implicit multimodal interfaces for collaborative groupwork. This high-fidelity simulation infrastructure builds on past development of single-wizard simulation tools for multiparty multimodal interactions involving speech, pen, and visual input. In the new infrastructure, a dual-wizard simulation environment was developed that supports (1) real-time tracking, analysis, and system adaptivity to a user's speech and pen paralinguistic signal features (e.g., speech amplitude, pen pressure), as well as the semantic content of their input. This simulation also supports (2) transparent user training to adapt their speech and pen signal features in a manner that enhances the reliability of system functioning, i.e., the design of mutually adaptive interfaces. To accomplish these objectives, this new environment is also capable of handling (3) dynamic streaming digital pen input. We illustrate the performance of the simulation infrastructure during longitudinal empirical research in which a user-adaptive interface was designed for implicit system engagement based exclusively on users' speech amplitude and pen pressure. While using this dual-wizard simulation method, the wizards responded successfully to over 3,000 user inputs with 95-98% accuracy and a joint wizard response time of less than 1.0 second during speech interactions and 1.65 seconds during pen interactions. Furthermore, the interactions they handled involved naturalistic multiparty meeting data in which high school students were engaged in peer tutoring, and all participants believed they were interacting with a fully functional system.
This type of simulation capability enables a new level of flexibility and sophistication in multimodal interface design, including the development of implicit multimodal interfaces that place minimal cognitive load on users during mobile, educational, and other applications.
© All rights reserved Cohen et al. and/or their publisher
Swindells, Colin, MacLean, Karon E., Booth, Kellogg S. and Meitner, Michael J. (2007): Exploring affective design for physical controls. In: Proceedings of ACM CHI 2007 Conference on Human Factors in Computing Systems 2007. pp. 933-942.
Physical controls such as knobs, sliders, and buttons are experiencing a revival as many computing systems progress from personal computing architectures towards ubiquitous computing architectures. We demonstrate a process for measuring and comparing visceral emotional responses of a physical control to performance results of a target acquisition task. In our user study, participants experienced mechanical and rendered friction, inertia, and detent dynamics as they turned a haptic knob towards graphical targets of two different widths and amplitudes. Together, this process and user study provide novel affect- and performance-based design guidance to developers of physical controls for emerging ubiquitous computing environments. Our work bridges extensive human factors work in mechanical systems, which peaked in the 1960s, with contemporary trends, with the goal of integrating mechatronic controls into emerging ubiquitous computing systems.
© All rights reserved Swindells et al. and/or ACM Press
Swindells, Colin and MacLean, Karon E. (2007): Capturing the Dynamics of Mechanical Knobs. In: WHC 2007 - Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems 22-24 March, 2007, Tsukuba, Japan. pp. 194-199.
Swindells, Colin, MacLean, Karon E., Booth, Kellogg S. and Meitner, Michael (2006): A case-study of affect measurement tools for physical user interface design. In: Proceedings of the 2006 Conference on Graphics Interface 2006. pp. 243-250.
Designers of human-computer interfaces often overlook issues of affect. An example illustrating the importance of affective design is the frustration many of us feel when working with a poorly designed computing device. Redesigning such computing interfaces to induce more pleasant user emotional responses would improve the user's health and productivity. Almost no research has been conducted to explore affective responses in rendered haptic interfaces. In this paper, we describe results and analysis from two user studies as a starting point for future systematic evaluation and design of rendered physical controls. Specifically, we compare and contrast self-report and biometric measurement techniques for two common types of haptic interactions. First, we explore the tactility of real textures such as silk, putty, and acrylic. Second, we explore the kinesthetics of physical control renderings such as friction and inertia. We focus on evaluation methodology, on the premise that good affect evaluation and analysis cycles can be a useful element of the interface designer's tool palette.
© All rights reserved Swindells et al. and/or Canadian Information Processing Society
Swindells, Colin, Maksakov, Evgeny and MacLean, Karon E. (2006): The Role of Prototyping Tools for Haptic Behavior Design. In: HAPTICS 2006 - 14th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems 25-26 March, 2006, Arlington, VA, USA. p. 25.
Tory, Melanie and Swindells, Colin (2003): Comparing ExoVis Orientation Icon and In-Place 3D Visualization Techniques. In: Graphics Interface 2003 June 11-13, 2003, Halifax, Nova Scotia, Canada. pp. 57-64.
Swindells, Colin, Unden, Alex and Sang, Tao (2003): TorqueBAR: an ungrounded haptic feedback device. In: Oviatt, Sharon L., Darrell, Trevor, Maybury, Mark T. and Wahlster, Wolfgang (eds.) Proceedings of the 5th International Conference on Multimodal Interfaces - ICMI 2003 November 5-7, 2003, Vancouver, British Columbia, Canada. pp. 52-59.
Kinesthetic feedback is a key mechanism by which people perceive object properties during their daily tasks -- particularly inertial properties. For example, transporting a glass of water without spilling, or dynamically positioning a handheld tool such as a hammer, both require inertial kinesthetic feedback. We describe a prototype for a novel ungrounded haptic feedback device, the TorqueBAR, that exploits a kinesthetic awareness of dynamic inertia to simulate complex coupled motion as both a display and input device. As a user tilts the TorqueBAR to sense and control computer programmed stimuli, the TorqueBAR's centre-of-mass changes in real-time according to the user's actions. We evaluate the TorqueBAR using both quantitative and qualitative techniques, and we describe possible applications for the device such as video games and real-time robot navigation.
© All rights reserved Swindells et al. and/or their publisher
Swindells, Colin, Inkpen, Kori, Dill, John C. and Tory, Melanie (2002): That one there! Pointing to establish device identity. In: Beaudouin-Lafon, Michel (ed.) Proceedings of the 15th annual ACM symposium on User interface software and technology October 27-30, 2002, Paris, France. pp. 151-160.
Computing devices within current work and play environments are relatively static. As the number of 'networked' devices grows, and as people and their devices become more dynamic, situations will commonly arise where users will wish to use 'that device there' instead of navigating through traditional user interface widgets such as lists. This paper describes a process for identifying devices through a pointing gesture using custom tags and a custom stylus called the gesturePen. Implementation details for this system are provided along with qualitative and quantitative results from a formal user study. As ubiquitous computing environments become more pervasive, people will rapidly switch their focus between many computing devices. The results of our work demonstrate that our gesturePen method can improve the user experience in ubiquitous environments by facilitating significantly faster interactions between
© All rights reserved Swindells et al. and/or ACM Press
Swindells, Colin, Dill, John C. and Booth, Kellogg S. (2000): System Lag Tests for Augmented and Virtual Environments. In: Ackerman, Mark S. and Edwards, Keith (eds.) Proceedings of the 13th annual ACM symposium on User interface software and technology November 06 - 08, 2000, San Diego, California, United States. pp. 161-170.
Changes to this page (author):
20 Apr 2011: Modified
20 Apr 2011: Modified
12 Jun 2009: Modified
12 Jun 2009: Modified
30 May 2009: Modified
30 May 2009: Modified
12 May 2008: Modified
23 Jun 2007: Modified
19 Jun 2007: Modified
28 Apr 2003: Added
Page maintainer: The Editorial Team