Number of co-authors: 11
Number of publications with 3 favourite co-authors: Jay Elkerton (7), Gene Lynch (3), Scott Lewis (2)
Susan Palmiter's 3 most productive colleagues in number of publications: Michael Mateas (17), Jay Elkerton (17), Gene Lynch (7)
Has also published under the name of:
"Susan L. Palmiter"
Publications by Susan Palmiter (bibliography)
Palmiter, Susan, Lynch, Gene, Day, Jennifer, Geist, Melinda and Rhoads, Bryan (2005): Focus on the individual: the future of web-based product support. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems. pp. 2136-2137.
Lewis, Scott, Mateas, Michael, Palmiter, Susan and Lynch, Gene (1996): Ethnographic Data for Product Development: A Collaborative Process. In Interactions, 3 (6) pp. 52-69.
Palmiter, Susan, Lynch, Gene, Lewis, Scott and Stempski, Mark (1994): Breaking Away from the Conventional 'Usability Lab': The Customer-Centered Design Group at Tektronix, Inc. In Behaviour and Information Technology, 13 (1) pp. 128-131.
The conventional usability lab is primarily responsible for testing prototypes and products to determine whether customers will accept a new design. Often this testing comes too late in the development cycle to allow major design or product changes. In the Customer-Centered Design Group at Tektronix Labs, the usability lab is a small part of our group's involvement in the entire design life cycle of a Tektronix product. We work with design groups to bring the benefits of a usability lab to all phases of design: from understanding our customers' current systems and work processes, to assessing competitors' strengths and weaknesses, to simulating and evaluating design alternatives. Our 'lab' is often on the road: meeting with customers where they work, working with design teams to simulate and prototype designs, and evaluating designs with our customers. To keep in touch with customers and to keep product development focused, we feel a usability group must break down the barriers inherent in a conventional testing suite. By breaking these barriers we can better determine what customers need and how those needs are addressed throughout the entire product life cycle.
© All rights reserved Palmiter et al. and/or Taylor and Francis
Palmiter, Susan and Elkerton, Jay (1993): Animated Demonstrations for Learning Procedural Computer-Based Tasks. In Human-Computer Interaction, 8 (3) pp. 193-216.
Animated demonstrations display the execution of interface procedures. They appear to be a natural and fast way for users to learn direct-manipulation interfaces by watching. To assess their effectiveness for users learning HyperCard, we compared carefully matched animated demonstrations, procedural textual instructions, and demonstrations combined with spoken procedural text. During training, demonstration users were faster and more accurate than text-only users. Seven days later, without the instructions, text-only users were faster than and as accurate as demonstration users in recalling and performing identical and similar tasks. Surprisingly, users of the combined demonstrations with spoken text closely mirrored the results of the demonstration-only users. The poor retention and transfer for the demonstration users appeared to be due to mimicry of the demonstrated procedures. Even with accompanying spoken text, the simplicity of using animated demonstrations may encourage superficial processing and disregard for the procedural text.
© All rights reserved Palmiter and Elkerton and/or Taylor and Francis
Palmiter, Susan (1993): The Effectiveness of Animated Demonstrations for Computer-based Tasks: a Summary, Model and Future Research. In J. Vis. Lang. Comput., 4 (1) pp. 71-89.
Palmiter, Susan and Elkerton, Jay (1991): An Evaluation of Animated Demonstrations for Learning Computer-Based Tasks. In: Robertson, Scott P., Olson, Gary M. and Olson, Judith S. (eds.) Proceedings of the ACM CHI 91 Human Factors in Computing Systems Conference April 28 - June 5, 1991, New Orleans, Louisiana. pp. 257-263.
Animated demonstrations are real-time instantiations of computer-based procedures. They appear to be a natural way of helping people learn direct manipulation interfaces, yet we know little about their efficacy. Carefully matched animated demonstrations, procedural textual instructions, and a combination of demonstrations and spoken text were compared. The demonstration groups were faster and more accurate when learning procedural tasks, but seven days later, the text group was faster and as accurate when performing identical and similar tasks. Apparently, the processing of animated demonstrations may not be sufficient for retention and transfer of interface procedures. Even with accompanying text provided, the simplicity of using demonstrations may encourage mimicry and disregard of text.
© All rights reserved Palmiter and Elkerton and/or ACM Press
Palmiter, Susan, Elkerton, Jay and Baggett, Patricia (1991): Animated Demonstrations vs Written Instructions for Learning Procedural Tasks: A Preliminary Investigation. In International Journal of Man-Machine Studies, 34 (5) pp. 687-701.
Animated demonstrations have been created due to the development of direct manipulation interfaces and the need for faster learning, so that users can learn interface procedures by watching. To compare animated demonstrations with written instructions we observed users learning and performing HyperCard authoring tasks on the Macintosh during three performance sessions. In the training session, users were asked either to watch a demonstration or read the procedures needed for the task and then to perform the task. In the later two sessions users were asked to perform tasks identical or similar to the tasks used in the training session. Results showed that demonstrations provided faster and more accurate learning during the training session. However, during the later sessions those who saw demonstrated procedures took longer to perform the tasks than did users of written instructions. Users appeared to be mimicking the training demonstrations without processing the information which would be needed later. In fact, when users had to infer procedures for tasks which were similar to those seen in the training session, the text group was much better at deducing the necessary procedures than the demonstration group. These findings indicate that animated demonstrations, as they were implemented for this study, were not robust enough to aid in later transfer.
© All rights reserved Palmiter et al. and/or Academic Press
Elkerton, Jay, Goldstein, Steven J. and Palmiter, Susan (1990): Designing a Help System Using a GOMS Model: A Preliminary Method Execution Analysis. In: Woods, D. and Roth, E. (eds.) Proceedings of the Human Factors Society 34th Annual Meeting 1990, Santa Monica, USA. pp. 259-263.
The GOMS model (Card, Moran, and Newell, 1983) was used to develop the content of a help system from the goals, operators, methods, and selection rules needed to perform HyperCard authoring tasks. Three groups of 12 novice HyperCard users performed 28 authoring tasks using either the GOMS help system, an original help system developed by Apple Computer, or no help at all (a control group). In the two help groups, users were provided the most complete help method and did not have to search for the help information. The results indicated that both help systems significantly decreased the time spent performing the authoring tasks when compared to the control group. Although a 23% decrease in execution time for GOMS users compared to original users was not significant, variance ratios confirmed that GOMS users, as a group, were more consistent when compared to original and control users. Also, GOMS users spent significantly less time per help display, translating the help methods into execution performance 78% more efficiently than original users. This result probably was due to the procedurally explicit and consistent help methods specified by the GOMS model.
© All rights reserved Elkerton et al. and/or Human Factors Society
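The GOMS-structured help content described in the two abstracts above can be pictured as a goal hierarchy whose leaf goals carry explicit step-by-step methods, with the help index derived directly from the goal tree. The sketch below is an illustrative reconstruction, not the authors' actual system; all goal names and steps are hypothetical examples.

```python
# Illustrative sketch of GOMS-style help content: goals form a hierarchy,
# each leaf goal carries an ordered method (sequence of operators), and
# the help index is rendered from the goal tree itself.
# All goal and step names below are hypothetical, not from the papers.

from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    method: list = field(default_factory=list)    # ordered operators/steps
    subgoals: list = field(default_factory=list)  # child Goal objects

    def index(self, depth=0):
        """Render the hierarchical goal tree as a help-system index."""
        lines = ["  " * depth + self.name]
        for g in self.subgoals:
            lines.extend(g.index(depth + 1))
        return lines

help_tree = Goal("Author a HyperCard stack", subgoals=[
    Goal("Create a button",
         method=["Choose the Button tool", "Drag to size the button",
                 "Double-click to open Button Info", "Type the button name"]),
    Goal("Link a button to a card",
         method=["Select the button", "Click LinkTo...",
                 "Navigate to the target card", "Click This Card"]),
])

print("\n".join(help_tree.index()))
```

Because each method is procedurally explicit and stored at a predictable place in the tree, a user retrieving help never has to search beyond the goal index — which is one plausible reading of why the GOMS help users spent less time per help display.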
Elkerton, Jay and Palmiter, Susan (1989): Designing Help Systems Using the GOMS Model: An Information Retrieval Evaluation. In: Proceedings of the Human Factors Society 33rd Annual Meeting 1989. pp. 281-285.
Using the GOMS model (Card, Moran, and Newell, 1983), a help system was developed which was complete and well structured. The content of this help system was determined from the goals, operators, methods, and selection rules needed to perform HyperCard authoring tasks. The index to these methods, which was an integrated part of the system, was determined from the hierarchical goal tree provided by the GOMS analysis. To determine the effectiveness of using GOMS as a design aid for help systems, the GOMS help system was compared to a state-of-the-art interface developed by Apple Computer which was modified slightly for experimental purposes (Original help system). Two groups of 14 users, using one of the two help systems, retrieved help information about 56 tasks separated into 4 sessions. The results indicated that the GOMS users were significantly faster than the Original users, with the largest speed difference occurring in the first session. However, no reliable differences were found for retrieval accuracy between the two groups. This is not surprising since the Original help system was found to have 85.9% of the procedural information contained in the GOMS help system. Interestingly, participants subjectively rated the GOMS help system higher than the Original help system. Overall, the results from this information retrieval study suggest that a GOMS model can aid in the development of help systems which are easy to use, easy to learn, and well liked.
© All rights reserved Elkerton and Palmiter and/or Human Factors Society
Evans, Susan M., Palmiter, Susan and Elkerton, Jay (1988): The EDGE System: Ergonomic Design Using Graphic Evaluation. In: Proceedings of the Human Factors Society 32nd Annual Meeting 1988. pp. 612-616.
EDGE is a computer-based ergonomic workspace design system which integrates several models of operator performance with a common graphic interface. In addition to serving as a practical design system, it also serves as a research tool for understanding the ergonomic design process in industry. System users include trained ergonomists and engineers responsible for manual workspace design. The design system centers around a core vocabulary of task-related terms. A common input format, modeled after the traditional "work methods table", addresses the input requirements of the varied human performance models. Output from the performance models is displayed on multiple screens in varying levels of detail. Among the measures of physical stress currently integrated into the system are models of biomechanical strength, NIOSH lifting limits, metabolic energy expenditure, and elemental time prediction.
© All rights reserved Evans et al. and/or Human Factors Society
Palmiter, Susan and Elkerton, Jay (1987): Evaluation Metrics and a Tool for Control Panel Design. In: Proceedings of the Human Factors Society 31st Annual Meeting 1987. pp. 1123-1127.
In the use of control panels for the automotive industry, consistency and usability are of major importance. General qualitative guidelines exist for the designer, but there is currently a lack of quantitative human factors data for control panel designs. A state-of-the-art design tool which provides the designer with ergonomic usability guidelines and structure is needed. As part of the current research, a computer-based tool which provides a quantitative analysis of the ergonomic quality of a control panel layout has been created. This tool is a tailored AutoCAD program for the IBM PC which provides features to encourage consistency and structure in the design of control panel layouts. Extending the work by Tullis (1983) on alphanumeric display metrics, four graphical metrics for the overall and functional design levels are included as part of the design tool. These metrics are: 1) Overall Density -- ratio of free space to occupied space, 2) Local Density -- how closely placed the design entities are to each other, 3) Layout Complexity -- position irregularity of functional areas, and 4) Display Grouping -- number of functions and number of controls and displays. In this effort, the design metrics and the design tool have been developed.
© All rights reserved Palmiter and Elkerton and/or Human Factors Society
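The density metrics above can be sketched in a few lines if a panel is modeled as a binary occupancy grid. This is a minimal illustration in the spirit of Tullis-style display metrics, not the paper's exact definitions: the neighborhood radius and the grid representation are assumptions.

```python
# Sketch of two Tullis-style layout metrics over a binary occupancy grid
# (1 = cell occupied by a control or label, 0 = free space).
# The definitions here are illustrative assumptions, not the paper's.

def overall_density(grid):
    """Proportion of occupied cells in the whole panel."""
    cells = [c for row in grid for c in row]
    return sum(cells) / len(cells)

def local_density(grid, radius=1):
    """Mean occupancy of the neighborhood around each occupied cell,
    approximating how closely design entities are packed together."""
    rows, cols = len(grid), len(grid[0])
    occupied = [(r, c) for r in range(rows) for c in range(cols) if grid[r][c]]
    if not occupied:
        return 0.0
    scores = []
    for r, c in occupied:
        neigh = [grid[rr][cc]
                 for rr in range(max(0, r - radius), min(rows, r + radius + 1))
                 for cc in range(max(0, c - radius), min(cols, c + radius + 1))
                 if (rr, cc) != (r, c)]
        scores.append(sum(neigh) / len(neigh))
    return sum(scores) / len(scores)

# Hypothetical 3x4 panel: a 2x2 cluster of controls plus one isolated control.
panel = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]
print(overall_density(panel))  # 5 occupied of 12 cells
print(local_density(panel))
```

A tightly clustered layout raises local density while leaving overall density unchanged, which is why the two measures are reported separately.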