Number of co-authors: 8
Number of publications with her 3 most frequent co-authors: Leysia Palen: 4, R. Bowen Loftin: 3, Wayne D. Gray: 2
Marilyn C. Salzman's 3 most productive colleagues by number of publications: Wayne D. Gray: 44, Leysia Palen: 32, R. Bowen Loftin: 17
Marilyn C. Salzman
Personal Homepage: salzmanconsulting.com/about.html
Current place of employment: Salzman Consulting
Marilyn C. Salzman is President of Salzman Consulting, LLC, where she delivers customer-centered and business-savvy designs for her clients. She has over 15 years of experience in user experience research and design and is highly skilled in user-centered research, design, and data analysis techniques used throughout the product and customer lifecycles. Throughout her career, Marilyn has designed interfaces for hardware and software products, including business applications, ecommerce, portals, the web, consumer products, mobile devices, interactive voice systems, medical products, educational technologies, and documentation.
Prior to forming Salzman Consulting, LLC, Marilyn managed the customer experience research and design team (20-30 employees and contractors) and a multi-million dollar web design program for Sun Microsystems' corporate web properties (including sun.com, commerce, downloads, communities, partners, and portals). Marilyn has also worked for companies such as Genomica, US WEST Advanced Technologies, and American Institutes for Research, where she designed and evaluated a range of online, consumer, and business products.
Marilyn has a Ph.D. in Applied Cognitive Psychology and Human Factors Engineering from George Mason University and a B.S. in Human Factors Engineering from Tufts University.
Publications by Marilyn C. Salzman (bibliography)
Palen, Leysia and Salzman, Marilyn C. (2002): Voice-mail diary studies for naturalistic data capture under mobile conditions. In: Churchill, Elizabeth F., McCarthy, Joe, Neuwirth, Christine and Rodden, Tom (eds.) Proceedings of the 2002 ACM conference on Computer supported cooperative work November 16 - 20, 2002, New Orleans, Louisiana, USA. pp. 87-95.
Mobile technology requires new methods for studying its use under realistic
conditions "in the field." Reflexively, mobile technology also creates new
opportunities for data collection while participants are remotely located. We
report on our experiences with a variation on the paper-based diary study
technique, which we extend by using voice-mail paired with mobile and landline
telephony to more easily collect data in natural situations. We discuss lessons
learned from experiences with voice-mail diary studies in two investigations of
different scope. We also present suggestions for tailoring the technique to
different research objectives, garnering high subject participation, and
configuring the voice-mail system for data collection.
© All rights reserved Palen and Salzman and/or ACM Press
Palen, Leysia and Salzman, Marilyn C. (2002): Beyond the handset: designing for wireless communications usability. In ACM Transactions on Computer-Human Interaction, 9 (2) pp. 125-151.
Service-based wireless devices like wireless telephones require users to interact with aspects of the technology beyond the hardware and software of the handset. By entering into contractual relationships with service-providers, and by using network-based services, users interact with a larger system -- one that has social and technological components. The operation of the wireless telephone requires the assimilation of heterogeneous sources of information from the device manufacturer, sales people, customer service representatives, marketing people, and members of the popular media, among others, which can easily confound users' understanding of this new class of technology. Opportunities for usability problems therefore scale beyond the handset, as do opportunities for better design. We report the results of a study of 19 novice wireless phone users who were closely tracked for the first 6 weeks after service acquisition. Taking a technology-as-system analytical approach, we describe the wireless telephony system as four socio-technical components: hardware, software, "netware," and "bizware." This particular organization of the system is intended for the practical application of designing for usability.
© All rights reserved Palen and Salzman and/or ACM Press
Palen, Leysia, Salzman, Marilyn C. and Youngs, Ed (2001): Discovery and Integration of Mobile Communications in Everyday Life. In Personal and Ubiquitous Computing, 5 (2) pp. 109-122.
Palen, Leysia, Salzman, Marilyn C. and Youngs, Ed (2000): Going Wireless: Behavior & Practice of New Mobile Phone Users. In: Kellogg, Wendy A. and Whittaker, Steve (eds.) Proceedings of the 2000 ACM conference on Computer supported cooperative work 2000, Philadelphia, Pennsylvania, United States. pp. 201-210.
We report on the results of a study in which 19 new mobile phone users were closely tracked for the first six weeks after service acquisition. Results show that new users tend to rapidly modify their perceptions of social appropriateness around mobile phone use, that actual nature of use frequently differs from what users initially predict, and that comprehension of service-oriented technologies can be problematic. We describe instances and features of mobile telephony practice. When in use, mobile phones occupy multiple social spaces simultaneously, spaces with norms that sometimes conflict: the physical space of the mobile phone user and the virtual space of the conversation.
© All rights reserved Palen et al. and/or ACM Press
Salzman, Marilyn C., Dede, Chris and Loftin, R. Bowen (1999): VR's Frames of Reference: A Visualization Technique for Mastering Abstract Multidimensional Information. In: Altom, Mark W. and Williams, Marian G. (eds.) Proceedings of the ACM CHI 99 Human Factors in Computing Systems Conference May 15-20, 1999, Pittsburgh, Pennsylvania. pp. 489-495.
This paper describes a research study that investigated how designers can use frames of reference (egocentric, exocentric, and a combination of the two) to support the mastery of abstract multidimensional information. The primary focus of this study was the relationship between FORs and mastery; the secondary focus was on other factors (individual characteristics and interaction experience) that were likely to influence the relationship between FORs and mastery. This study's outcomes (1) clarify how FORs work in conjunction with other factors in shaping mastery, (2) highlight strengths and weaknesses of different FORs, (3) demonstrate the benefits of providing multiple FORs, and (4) provide the basis for our recommendations to HCI researchers and designers.
© All rights reserved Salzman et al. and/or ACM Press
Salzman, Marilyn C., Dede, Christopher J., Loftin, R. Bowen and Chen, Jim X. (1999): A Model for Understanding How Virtual Reality Aids Complex Conceptual Learning. In Presence: Teleoperators and Virtual Environments, 8 (3) pp. 293-316.
Gray, Wayne D. and Salzman, Marilyn C. (1998): Damaged Merchandise? A Review of Experiments That Compare Usability Evaluation Methods. In Human-Computer Interaction, 13 (3) pp. 203-261.
An interest in the design of interfaces has been a core topic for researchers and practitioners in the field of human-computer interaction (HCI); an interest in the design of experiments has not. To the extent that reliable and valid guidance for the former depends on the results of the latter, it is necessary that researchers and practitioners understand how small features of an experimental design can cast large shadows over the results and conclusions that can be drawn. In this review we examine the design of 5 experiments that compared usability evaluation methods (UEMs). Each has had an important influence on HCI thought and practice. Unfortunately, our examination shows that small problems in the way these experiments were designed and conducted call into serious question what we thought we knew regarding the efficacy of various UEMs. If the influence of these experiments were trivial, then such small problems could be safely ignored. Unfortunately, the outcomes of these experiments have been used to justify advice to practitioners regarding their choice of UEMs. Making such choices based on misleading or erroneous claims can be detrimental -- compromising the quality and integrity of the evaluation, incurring unnecessary costs, or undermining the practitioner's credibility within the design team. The experimental method is a potent vehicle that can help inform the choice of a UEM as well as help to address other HCI issues. However, to obtain the desired outcomes, close attention must be paid to experimental design.
© All rights reserved Gray and Salzman and/or Taylor and Francis
Gray, Wayne D. and Salzman, Marilyn C. (1998): Repairing Damaged Merchandise: A Rejoinder. In Human-Computer Interaction, 13 (3) pp. 325-335.
Our goal in writing "Damaged Merchandise?" (DM) was not to have the last word on the subject but to raise an awareness within the human-computer interaction (HCI) community of issues that we felt had been too long ignored or neglected. On reading the 10 commentaries from distinguished members of the HCI community, we were pleased to see that they had joined the debate and broadened the discussion. Subsequently, we were somewhat torn by how to proceed. Our first thought was to respond point by point, commentary by commentary. However, we refrain from addressing many specific issues here, as a full discussion would involve an article at least as long as DM. Instead we focus on a few important themes that emerged throughout our article and the ensuing discussion:
* What is usability, how do we measure it, and what do we need to know about our usability evaluation methods (UEMs)?
* Why do we find ourselves where we are?
* What is the role of experiments versus other empirical studies in HCI? Are there common issues in the design of empirical studies?
* How do we judge the value of a study?
* Where do we go from here?
© All rights reserved Gray and Salzman and/or Taylor and Francis
Salzman, Marilyn C., Dede, Chris and Loftin, R. Bowen (1995): Usability and Learning in Educational Virtual Realities. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting 1995. pp. 486-490.
Designing ScienceSpace, a series of virtual realities for teaching difficult science concepts and skills, has implications for designing sensorily immersive educational virtual realities. Through the design and evaluation of the worlds in ScienceSpace we are gaining insights into the general utility of sensorial immersion, as well as virtual reality's potential and limitations for enhancing learning. This paper focuses on the learner-centered design and evaluation of NewtonWorld, one of the virtual worlds in ScienceSpace. NewtonWorld is a sensorily immersive virtual learning environment in which students can challenge their intuitions about Newton's laws and the conservation of energy and momentum through game-like inquiry activities. We discuss how usability and learning issues have shaped the design and refinement of NewtonWorld. Additionally, we discuss implications of our work for designing sensorily immersive virtual reality interfaces that are usable and facilitate learning.
© All rights reserved Salzman et al. and/or Human Factors Society
Salzman, Marilyn C. and Rivers, S. David (1994): Smoke and Mirrors: Setting the Stage for a Successful Usability Test. In Behaviour and Information Technology, 13 (1) pp. 9-16.
Setting the stage, or testing atmosphere, is an important step in preparing for a usability test. This article addresses how to create a good testing atmosphere. We liken this process to that of preparing the stage for a theatre or movie production. In a usability test production, usability professionals serve as directors and set designers, camouflaging the stage (lab equipment), creating a set (an appropriate workspace), recruiting performers (participants representative of end-users), and executing the script (running the test). Usability professionals must attend to each of these issues because they can impact participants' performance, the flow of events, and, ultimately, data quality.
© All rights reserved Salzman and Rivers and/or Taylor and Francis
Changes to this page (author)
06 Dec 2012: Modified
04 Dec 2012: Modified
30 Nov 2012: Modified
11 Feb 2010: Modified
01 Jun 2009: Added
31 May 2009: Added
27 Jun 2007: Added
28 Apr 2003: Added
Page maintainer: The Editorial Team