Number of co-authors: 14
Number of publications with 3 favourite co-authors: Emilie M. Roth: 7; William F. Stubler: 2; Ray S. Perez: 1
Randall J. Mumaw's 3 most productive colleagues in number of publications: David D. Woods: 35; Kim J. Vicente: 24; Emilie M. Roth: 21
Randall J. Mumaw
Has also published under the name "R. J. Mumaw".
Publications by Randall J. Mumaw (bibliography)
Vicente, Kim J., Roth, Emilie M. and Mumaw, Randall J. (2001): How do Operators Monitor a Complex, Dynamic Work Domain? The Impact of Control Room Technology. In International Journal of Human-Computer Studies, 54 (6) pp. 831-856.
This article describes part of a research programme whose goal is to develop a better understanding of how operators monitor complex, dynamic systems under normal operations. In a previous phase, field observations were made at two older nuclear power plant control rooms (CRs) consisting primarily of analogue, hard-wired instrumentation. In this phase, additional field observations were conducted in a newer computer-based CR to determine the impact of CR technology on operator monitoring. Eleven different operators were observed in situ for a total of approximately 88 h. The findings indicate that there are many similarities in the monitoring strategies adopted by operators in the two types of CRs. However, in most cases, these same strategies are performed using different behaviours, thereby showing the shaping effect of the CR technology. A new way of conceptualizing the difference between traditional analogue CRs and modern computer-based CRs is proposed.
© All rights reserved Vicente et al. and/or Academic Press
Roth, Emilie M. and Mumaw, Randall J. (1995): Using Cognitive Task Analysis to Define Human Interface Requirements for First-of-a-Kind Systems. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting 1995. pp. 520-524.
Cognitive task analysis (CTA) methods have grown out of the need to explicitly consider cognitive processing requirements of complex tasks. A number of approaches to CTA have been developed that vary in goals, the tools they bring to bear, and their data requirements. We present a particular CTA technique that we are utilizing in the design of new person-machine interfaces for first-of-a-kind advanced process control plants. The methodology has its roots in the formal analytic goal-means decomposition method pioneered by Rasmussen (1986). It contrasts with other approaches in that it is intended: (1) for design of first-of-a-kind systems for which there are no close existing analogues precluding the use of CTA techniques that rely on empirical analysis of expert performance; (2) to define person-machine interface requirements to support operator problem-solving and decision-making in unanticipated situations; and (3) to be a pragmatic, codified, tool that can be used reliably by person-machine interface designers.
© All rights reserved Roth and Mumaw and/or Human Factors Society
Mumaw, Randall J. and Roth, Emilie M. (1995): Training Complex Tasks in a Functional Context. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting 1995. pp. 1253-1257.
We have reviewed training programs for complex skills that have strong decision-making components, such as nuclear power plant operations and air traffic control. In each case, we found that an ISD approach is routinely applied to training-program design. The ISD framework can aid training designers in designing individual modules of instruction but seems to provide insufficient guidance on designing the larger training-program structure. We found two types of problems. First, because a good understanding of skill acquisition is not used to drive training-program design, training activities can be ineffective or inefficient. Second, because it is difficult to get insights on cognitive skills with traditional task analysis, the core decision-making task is not trained explicitly. Trainees are typically on their own to discover decision-making skills. We developed an alternative framework for training-program design called the Functional Context Approach. This approach attempts to restore efficiency to skill acquisition and improve training of critical decision-making skills.
© All rights reserved Mumaw and Roth and/or Human Factors Society
Moses, Franklin L., Salas, Ed, Cannon-Bowers, Janis A., Perez, Ray S., Roth, Emilie M., Mumaw, Randall J., Mirabella, Angelo, Cohen, Marvin S. and Klein, Gary (1994): Improved Training Methods: Research to Applications. In: Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting 1994. pp. 1150-1153.
How to train people to make good decisions, solve problems, and so on depends, as does all training, on some form of practice and feedback. The question for behavioral research often is how to improve on these basic requirements. Six panelists describe and discuss their research and experience with the relationship among training and factors such as group dynamics, stress, mental models, and naturalistic requirements. This session includes interaction among the panel and the audience.
© All rights reserved Moses et al. and/or Human Factors Society
Mumaw, Randall J., Roth, Emilie M. and Schoenfeld, Isabelle (1993): Analysis of Complexity in Nuclear Power Severe Accident Management. In: Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting 1993. pp. 377-381.
A model of decision making has been developed for nuclear power plant operations and has been previously applied to the analysis of performance during emergency operations. The model was extended to identify the cognitive skills required, the types of complexity that can arise, and the potential for human error in severe accident management (SAM). Twelve SAM scenarios were developed to aid in this analysis. Potential sources of complexity and error are described and illustrated, and implications for training cognitive skills are discussed.
© All rights reserved Mumaw et al. and/or Human Factors Society
Stubler, William F., Roth, Emilie M. and Mumaw, Randall J. (1991): Evaluation Issues for Computer-Based Control Rooms. In: Proceedings of the Human Factors Society 35th Annual Meeting 1991. pp. 383-387.
The design of control centers is advancing toward totally computer-based man-machine interfaces. Computer-based interfaces offer many potential advantages over traditional hard-wired control panel interfaces, including greater flexibility in the type of data displayed and its presentation. However, achieving this potential will require the development of new interface concepts that will change the way operators interact with the plant. Extensive evaluation throughout the design process will be required to verify and validate the interface concepts. This paper describes a process for uncovering evaluation issues related to the computer-based control room concept and its relationship to the cognitive activities of plant control. Important evaluation issues are presented.
© All rights reserved Stubler et al. and/or Human Factors Society
Woods, David D., Roth, Emilie M., Stubler, William F. and Mumaw, Randall J. (1990): Navigating Through Large Display Networks in Dynamic Control Applications. In: Proceedings of the Human Factors Society 34th Annual Meeting 1990, Santa Monica, USA. pp. 396-399.
There is an increasing trend to use computer display systems as the primary "window" by which users see and interact with complex dynamic processes (e.g., air traffic control; computerized control rooms for process control). These kinds of applications pose special challenges for the design of computer-based display systems. In particular, the large scope of these applications necessitates large display structures involving thousands of displays. Further, the dynamic nature of the tasks means that users need to be able to move rapidly through the display structure to keep pace with temporally evolving situations and to be able to respond to new events as they occur. As a result, special display navigation challenges arise in computer-based display systems for monitoring and controlling dynamic processes.
© All rights reserved Woods et al. and/or Human Factors Society
Singer, Michael J., Mumaw, Randall J. and Gilligan, Elizabeth L. (1988): The Formative Evaluation of a Decision Support System for Designing Training Devices. In: Proceedings of the Human Factors Society 32nd Annual Meeting 1988. pp. 1246-1250.
Formative evaluation in the broadest sense refers to the measurement of a system in order to make direct and immediate differences in the procedures, mechanisms, and goals of that system during development. The objective of this formative evaluation is to address three areas: 1) increase our understanding of how the targeted users make decisions, 2) train the user in how the system makes decisions and presents information, and 3) develop information about interface and modeling changes needed in the system. Two issues must be understood: what is needed in the design of the decision process, and how the decisions could actually be made effectively. Both of these issues must be addressed before the user will accept and use the decision aid. The system also needs to be able to accept and use the information that the user considers necessary, as well as to present both recommendations and supporting information in acceptable formats. We have applied a structured interview within our formative evaluation process as a basis for integrating the user into the develop, revise, and deliver cycle. The structured interview was conducted on-line, demonstrating what the system does while explanations of how it works are provided. The responses have provided information about whether the user thinks the system addresses the correct issues, the user's agreement with the system's analyses, and a report of the user's decision processes. By including the user in the review of the developing system, the design of the prototype more accurately reflects the user's decision processes, as well as providing more usable output. This study will provide some insight into one method for evaluating decision aids early in the development process.
© All rights reserved Singer et al. and/or Human Factors Society
Changes to this page (author):
15 Feb 2010: Modified
27 Jun 2007: Added
27 Jun 2007: Added
26 Jun 2007: Added
26 Jun 2007: Added
26 Jun 2007: Added
26 Jun 2007: Added
25 Jun 2007: Added
28 Apr 2003: Added
Page maintainer: The Editorial Team