Number of co-authors: 9
Number of publications with 3 favourite co-authors: Barry Arons: 3, Debby Hindus: 2, Chris Schmandt: 2
Lisa Stifelman's 3 most productive colleagues in number of publications: Elizabeth D. Mynatt: 71, Chris Schmandt: 40, Maribeth Back: 18
Publications by Lisa Stifelman (bibliography)
Stifelman, Lisa, Arons, Barry and Schmandt, Chris (2001): The Audio Notebook: Paper and Pen Interaction with Structured Speech. In: Beaudouin-Lafon, Michel and Jacob, Robert J. K. (eds.) Proceedings of the ACM CHI 2001 Human Factors in Computing Systems Conference March 31 - April 5, 2001, Seattle, Washington, USA. pp. 182-189.
This paper addresses the problem that a listener experiences when attempting to capture information presented during a lecture, meeting, or interview. Listeners must divide their attention between the talker and their notetaking activity. We propose a new device, the Audio Notebook, for taking notes and interacting with a speech recording. The Audio Notebook is a combination of a digital audio recorder and paper notebook, all in one device. Audio recordings are structured using two techniques: user structuring based on notetaking activity, and acoustic structuring based on a talker's changes in pitch, pausing, and energy. A field study showed that the interaction techniques enabled a range of usage styles, from detailed review to high speed skimming. The study motivated the addition of phrase detection and topic suggestions to improve access to the audio recordings. Through these audio interaction techniques, the Audio Notebook defines a new approach for navigation in the audio domain.
© All rights reserved Stifelman et al. and/or ACM Press
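The abstract above mentions acoustic structuring based on a talker's pausing. As a rough illustration of that idea (not the Audio Notebook's actual algorithm, whose details are in the paper), the sketch below marks the point where speech resumes after a sufficiently long low-energy run as a candidate phrase start a listener could jump to; all function and parameter names are hypothetical.

```python
# Illustrative sketch of pause-based acoustic structuring: frames whose
# energy stays below a threshold for a minimum duration are treated as
# pauses, and the sample offset where speech resumes is recorded as a
# candidate phrase boundary for audio navigation.

def phrase_starts(samples, frame=80, energy_thresh=0.01, min_pause_frames=3):
    """Return sample offsets that follow a sufficiently long low-energy run."""
    starts, quiet_run = [0], 0
    for i in range(0, len(samples) - frame, frame):
        energy = sum(s * s for s in samples[i:i + frame]) / frame
        if energy < energy_thresh:
            quiet_run += 1
        else:
            if quiet_run >= min_pause_frames:
                starts.append(i)  # speech resumes here, after a pause
            quiet_run = 0
    return starts

# Toy signal: speech, silence, speech -> one detected phrase boundary.
sig = [0.5] * 400 + [0.0] * 400 + [0.5] * 400
print(phrase_starts(sig))  # [0, 800]
```

A real system would of course work on framed microphone audio and combine this with the pitch and energy cues the abstract names; the point here is only the structuring principle.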
Singer, Andrew, Hindus, Debby, Stifelman, Lisa and White, Sean (1999): Tangible Progress: Less is More in Somewire Audio Spaces. In: Altom, Mark W. and Williams, Marian G. (eds.) Proceedings of the ACM CHI 99 Human Factors in Computing Systems Conference May 15-20, 1999, Pittsburgh, Pennsylvania. pp. 104-111.
We developed four widely different interfaces for users of Somewire, a prototype audio-only media space. We informally studied users' experiences with the two screen-based interfaces. We prototyped a non-screen-based interface as an example of a novel tangible interface for a communication system. We explored the conflict between privacy and simplicity of representation, and identified two unresolved topics: the role of audio quality and the prospects for scaling audio spaces beyond a single workgroup. Finally, we formulated a set of design guidelines for control and representation in audio spaces, as follows: GUIs are not well-suited to audio spaces, users do not require control over localization or other audio attributes, and awareness of other users' presence is desirable.
© All rights reserved Singer et al. and/or ACM Press
Hindus, Debby, Arons, Barry, Stifelman, Lisa, Gaver, William, Mynatt, Elizabeth D. and Back, Maribeth (1995): Designing Auditory Interactions for PDAs. In: Robertson, George G. (ed.) Proceedings of the 8th annual ACM symposium on User interface and software technology November 15 - 17, 1995, Pittsburgh, Pennsylvania, United States. pp. 143-146.
This panel addresses issues in designing audio-based user interactions for small, personal computing devices, or PDAs. One issue is the nature of interacting with an auditory PDA and the interplay of affordances and form factors. Another issue is how both new and traditional metaphors and interaction concepts might be applied to auditory PDAs. The utility and design of nonspeech cues are discussed, as are the aesthetic issues of persona and narrative in designing sounds. Also discussed are commercially available sound and speech components and related hardware tradeoffs. Finally, the social implications of auditory interactions are explored, including privacy, fashion and novel social interactions.
© All rights reserved Hindus et al. and/or ACM Press
Stifelman, Lisa (1995): A Tool to Support Speech and Non-Speech Audio Feedback Generation in Audio Interfaces. In: Robertson, George G. (ed.) Proceedings of the 8th annual ACM symposium on User interface and software technology November 15 - 17, 1995, Pittsburgh, Pennsylvania, United States. pp. 171-179.
Development of new auditory interfaces requires the integration of text-to-speech synthesis, digitized audio, and non-speech audio output. This paper describes a tool for specifying speech and non-speech audio feedback and its use in the development of a speech interface, Conversational VoiceNotes. Auditory feedback is specified as a context-free grammar, where the basic elements in the grammar can be either words or non-speech sounds. The feedback specification method described here provides the ability to vary the feedback based on the current state of the system, and is flexible enough to allow different feedback for different input modalities (e.g., speech, mouse, buttons). The declarative specification is easily modifiable, supporting an iterative design process.
© All rights reserved Stifelman and/or ACM Press
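The 1995 tool paper above describes auditory feedback specified as a context-free grammar whose basic elements are words or non-speech sounds, with the feedback varying by system state and input modality. A minimal sketch of that specification style follows; it is not the paper's tool, and all names (the `Sound` class, the grammar keys, the modalities) are hypothetical.

```python
# Hypothetical sketch of feedback specified as a context-free grammar,
# in the style described for Conversational VoiceNotes: terminals are
# either spoken words (str) or non-speech sounds, and alternative
# productions are selected by the current input modality.

class Sound:
    """A non-speech audio cue, identified by name (e.g. a recorded earcon)."""
    def __init__(self, name):
        self.name = name
    def __repr__(self):
        return f"<sound:{self.name}>"

# Grammar: nonterminal -> {modality: sequence of terminals/nonterminals}.
GRAMMAR = {
    "CONFIRM_DELETE": {
        "speech": ["NOTE_CUE", "deleted"],  # cue plus spoken confirmation
        "button": ["NOTE_CUE"],             # sound-only feedback for buttons
    },
    "NOTE_CUE": {
        "speech": [Sound("low_beep")],
        "button": [Sound("low_beep")],
    },
}

def expand(symbol, modality):
    """Recursively expand a grammar symbol into a flat feedback sequence."""
    if symbol not in GRAMMAR:
        return [symbol]  # terminal: a word or a Sound
    out = []
    for part in GRAMMAR[symbol][modality]:
        out.extend(expand(part, modality))
    return out

print(expand("CONFIRM_DELETE", "speech"))  # [<sound:low_beep>, 'deleted']
print(expand("CONFIRM_DELETE", "button"))  # [<sound:low_beep>]
```

Because the specification is declarative data rather than code, swapping a word for a sound or adding a modality means editing the grammar table, which matches the iterative-design motivation the abstract gives.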
Stifelman, Lisa, Arons, Barry, Schmandt, Chris and Hulteen, Eric A. (1993): VoiceNotes: A Speech Interface for a Hand-Held Voice Notetaker. In: Ashlund, Stacey, Mullet, Kevin, Henderson, Austin, Hollnagel, Erik and White, Ted (eds.) Proceedings of the ACM CHI 93 Human Factors in Computing Systems Conference April 24-29, 1993, Amsterdam, The Netherlands. pp. 179-186.
VoiceNotes is an application for a voice-controlled hand-held computer that allows the creation, management, and retrieval of user-authored voice notes: small segments of digitized speech containing thoughts, ideas, reminders, or things to do. Iterative design and user testing helped to refine the initial user interface design. VoiceNotes explores the problem of capturing and retrieving spontaneous ideas, the use of speech as data, and the use of speech input and output in the user interface for a hand-held computer without a visual display. In addition, VoiceNotes serves as a step toward new uses of voice technology and interfaces for future portable devices.
© All rights reserved Stifelman et al. and/or ACM Press
Changes to this page (author): 28 Apr 2003: Added
Page maintainer: The Editorial Team