Number of co-authors: 29
Number of publications with 3 favourite co-authors: Jens Rasmussen (5), Beverly L. Harrison (3), Emilie M. Roth (2)
Kim J. Vicente's 3 most productive colleagues in number of publications: Hiroshi Ishii (111), Bill Buxton (78), Gordon Kurtenbach (45)
Kim J. Vicente
Personal Homepage: cel.mie.utoronto.ca/people/kjv/bio.htm
Kim J. Vicente is founding director of the Cognitive Engineering Laboratory at the University of Toronto, where he is a professor in the departments of mechanical engineering, computer science, and electrical and computer engineering. At the age of 34, he was promoted to full professor, the youngest in the 175-year history of the University of Toronto. Professor Vicente is also an adjunct professor of psychology at Miami University, Ohio, and a registered Professional Engineer in Ontario.
Vicente has conducted extensive research on cognitive engineering in applications as diverse as animation, aviation, engineering design, infectious diseases, food management, medicine, network management, nuclear power, and petrochemical processes. His research has led to technology transfer to AECL Research, Alias|Wavefront, Honeywell Technology Center, Mitsubishi, and, particularly, the Toshiba Nuclear Engineering Laboratory. Professor Vicente also helped change the way a multinational Big Pharma corporation of 70,000 employees in 120 countries and $16 billion in annual sales does its business.
In 1985, Vicente received a B.A.Sc. in industrial engineering from the University of Toronto; in 1987, an M.S. in industrial engineering and operations research from the Virginia Polytechnic Institute and State University; and in 1991, a Ph.D. in mechanical engineering from the University of Illinois at Urbana-Champaign. During 1987-1988, Professor Vicente spent one year as a visiting scientist in the Section for Informatics and Cognitive Science of the Risø National Laboratory in Roskilde, Denmark, where he worked with his mentor, Professor Jens Rasmussen. During 1991-92, he was also on the faculty of the School of Industrial and Systems Engineering at the Georgia Institute of Technology. During 2002-2003, Vicente was the thirty-first Jerome Clarke Hunsaker Distinguished Visiting Professor of Aerospace Information Engineering and Minta Martin Lecturer at the Massachusetts Institute of Technology.
Professor Vicente was presented the first round Premier's Research Excellence Award, valued at $100,000, and was the first engineer to receive the McLean Award, the University of Toronto's wealthiest and most prestigious award for basic research. Vicente also received the Natural Sciences and Engineering Research Council E.W.R. Steacie Memorial Fellowship, Canada's most prestigious prize for young academics in all areas of science and engineering; the Outstanding Professional Achievement Award from the Federation of Portuguese Canadian Business and Professionals; the COPA Award for Outstanding Vision/Leadership from the Portuguese Canadian National Congress; the Brunswik New Investigator Award from the Brunswik Society; the award for the best paper published in the Human Factors Society Bulletin in 1991; and the outstanding abstract award in the area of Clinical Application of Technology from the Society for Technology in Anesthesia. In 1999, he was chosen by TIME Magazine as one of 25 Canadians under the age of 40 who are "Leaders for the 21st Century who will shape Canada's future".
Professor Vicente was an Associate Editor of the IEEE Transactions on Systems, Man, and Cybernetics, and served on the editorial boards of Human Factors and the International Journal of Cognitive Ergonomics. He serves on the editorial board of Theoretical Issues in Ergonomics Science. Vicente worked on Capitol Hill, serving as the first Canadian researcher ever invited to sit on the Standing Committee for Human Factors of the National Academy of Sciences/National Research Council. Vicente is also a Senior Fellow and a member of the Corporation of Massey College at the University of Toronto.
In 1999, Vicente authored the first textbook in the area of cognitive engineering, Cognitive Work Analysis: Toward Safe, Productive, and Healthy Computer-based Work, published by Lawrence Erlbaum Associates. His latest book, The Human Factor: Revolutionizing the Way We Live with Technology, was published by Alfred A. Knopf Canada in Canada (2003), Routledge in the U.S. (2004), Les Éditions Logiques in Québec (2004), and Ediouro in Brazil (2005). This book received the National Business Book Award and the Science in Society General Audience Book Award, and was a finalist for the Canadian Booksellers Association Libris Award for Non-fiction Book of the Year.
Professor Vicente has provided an expert opinion to the media on issues related to people and technology. He has been quoted, or his work featured in, numerous outlets, including: CTV, Globe & Mail, Maclean's, CBC Radio, and the Toronto Star.
Vicente has been invited to lecture in 11 countries on 4 continents, including lecture tours in Europe, Japan, and Australia. He has acted as a consultant to industry and government, including the Canadian Nuclear Agency, the Australian Defence Science and Technology Organisation, Honeywell Technology Center, Microsoft Corporation, NASA Ames Research Center, Nortel Networks, the U.S. Institute of Medicine, the U.S. National Academy of Engineering, and the U.S. Nuclear Regulatory Commission.
Professor Vicente has founded two areas of research - ecological interface design and cognitive work analysis - leading to hundreds of articles authored by many researchers around the world. Vicente's research has led to 1 patent, 2 books, 3 co-edited books, 178 refereed articles, 16 book chapters, 17 invited keynote addresses, and 54 technical reports. He is also listed in Canadian Who's Who. One of his journal articles, co-authored with Jens Rasmussen, has been chosen as one of the most important papers in the 110-year history of human factors.
Publications by Kim J. Vicente (bibliography)
Duez, Pierre and Vicente, Kim J. (2005): Ecological interface design and computer network management: The effects of network size and fault frequency. In International Journal of Human-Computer Studies, 63 (6) pp. 565-586. Available online
This article describes an experiment investigating the impact of ecological interface design (EID) on human performance in computer network management. This work domain is more dynamic than those previously studied under EID because there is a constant potential for the addition and removal of devices, as well as changing configurations, making it important to study the generalizability of the framework. Two interfaces were created for the University of Toronto campus network consisting of 220 nodes: a P interface based on existing design practices which presented primarily physical information and a P+F interface based on EID which presented both physical and functional information identified by an abstraction hierarchy analysis. Participants used one of the two interfaces to detect and diagnose faults or disturbances in the simulated network in real-time. Network size and fault frequency were both manipulated as within-participants variables. The P+F interface led to faster detection times overall, as well as improved fault detection rate and more accurate fault diagnosis under higher fault loads. These results suggest that the EID framework may lead to more robust monitoring performance in computer network management compared to existing interfaces.
© All rights reserved Duez and Vicente and/or Academic Press
Vicente, Kim J., Roth, Emilie M. and Mumaw, Randall J. (2001): How do Operators Monitor a Complex, Dynamic Work Domain? The Impact of Control Room Technology. In International Journal of Human-Computer Studies, 54 (6) pp. 831-856.
This article describes part of a research programme whose goal is to develop a better understanding of how operators monitor complex, dynamic systems under normal operations. In a previous phase, field observations were made at two older nuclear power plant control rooms (CRs) consisting primarily of analogue, hard-wired instrumentation. In this phase, additional field observations were conducted in a newer computer-based CR to determine the impact of CR technology on operator monitoring. Eleven different operators were observed in situ for a total of approximately 88 h. The findings indicate that there are many similarities in the monitoring strategies adopted by operators in the two types of CRs. However, in most cases, these same strategies are performed using different behaviours, thereby showing the shaping effect of the CR technology. A new way of conceptualizing the difference between traditional analogue CRs and modern computer-based CRs is proposed.
© All rights reserved Vicente et al. and/or Academic Press
Vicente, Kim J. (2000): HCI in the Global Knowledge-Based Economy: Designing to Support Worker Adaptation. In ACM Transactions on Computer-Human Interaction, 7 (2) pp. 263-280. Available online
Increasingly, people are being required to perform open-ended intellectual tasks that require discretionary decision making. These demands require a relatively unique approach to the design of computer-based support tools. A review of the characteristics associated with the global knowledge-based economy strongly suggests that there will be an increasing need for workers, managers, and organizations to adapt to change and novelty. This is equivalent to a call for designing computer tools that foster continuous learning. There are reasons to believe that the need to support adaptation and continuous learning will only increase. Thus, in the new millennium HCI should be concerned with explicitly designing for worker adaptation. The cognitive work analysis framework is briefly described as a potential programmatic approach to this practical design challenge.
© All rights reserved Vicente and/or ACM Press
Vicente, Kim J. (1999): Cognitive Work Analysis: Toward Safe, Productive, and Healthy Computer-Based Work. Lawrence Erlbaum Associates
Vicente, Kim J. (1999): Wanted: Psychologically Relevant, Device- and Event-Independent Work Analysis Techniques. In Interacting with Computers, 11 (3) pp. 237-254.
This article offers a commentary on Richardson, Ormerod, and Shepherd (in press) while building on the previous discussion in this journal of the relative merits of task analysis and systems analysis in human-computer interface design [1,2,7]. The SGT scheme described by Richardson et al. represents a valuable contribution to the work analyst's toolkit. However, it is limited in the extent to which it can identify the information requirements associated with unanticipated events. The abstraction hierarchy is an event-independent work domain analysis technique that can be used to overcome this limitation while still satisfying the criteria of device-independence and psychological relevance. Future research should integrate the complementary advantages of SGT and the abstraction hierarchy into a single, unified framework for work analysis.
© All rights reserved Vicente and/or Elsevier Science
Janzen, Michael E. and Vicente, Kim J. (1998): Attention Allocation within the Abstraction Hierarchy. In International Journal of Human-Computer Studies, 48 (4) pp. 521-545.
Previous research has shown that Rasmussen's abstraction hierarchy, which consists of both physical and functional system models, provides a useful basis for interface design for complex human-machine systems. However, very few studies have quantitatively analysed how people allocate their attention across levels of abstraction. This experiment investigated the relationship between attention allocation strategies and performance on a thermal-hydraulic process simulation. Subjects controlled the process during both normal and fault situations for about an hour per weekday for approximately one month. All subjects used a multi-level interface consisting of four separate windows, each representing a level of the abstraction hierarchy. Subjects who made more frequent use of functional levels of information exhibited more accurate system control under normal conditions, and more accurate diagnosis performance under fault trials. Moreover, subjects who made efficient use of functional information exhibited faster fault compensation times. In contrast, subjects who made infrequent or inefficient use of functional information exhibited poorer performance on both normal and fault trials. These results provide some initial, specific evidence of the advantages of an abstraction hierarchy interface over more traditional interfaces that emphasize physical rather than functional information.
© All rights reserved Janzen and Vicente and/or Academic Press
Christoffersen, Klaus, Hunter, Christopher N. and Vicente, Kim J. (1998): A Longitudinal Study of the Effects of Ecological Interface Design on Deep Knowledge. In International Journal of Human-Computer Studies, 48 (6) pp. 729-762.
Some researchers have argued that providing operators with externalized, graphic representations can lead to a trade-off whereby deep knowledge is sacrificed for cognitive economy and performance. This article provides an initial empirical investigation of this hypothesis by presenting a longitudinal study of the effect of ecological interface design (EID), a framework for designing interfaces for complex industrial systems, on subjects' deep knowledge. The experiment continuously observed the quasi-daily performance of the subjects over a period of six months. The research was conducted in the context of DURESS II, a real-time, interactive thermal-hydraulic process control simulation that was designed to be representative of industrial systems. The performance of two interfaces was compared, an EID interface based on physical and functional (P+F) system representations and a more traditional interface based solely on a physical (P) representation. Subjects were required to perform several control tasks, including startup, tuning, shutdown and fault management. Occasionally, a set of knowledge elicitation tests was administered to assess the evolution of subjects' deep knowledge of DURESS II. The results suggest that EID can lead to a functionally organized knowledge base as well as superior performance, but only if subjects actively reflect on the feedback they get from the interface. In contrast, if subjects adopt a surface approach to learning, then EID can lead to a shallow knowledge base and poor performance, although no worse than that observed with a traditional interface.
© All rights reserved Christoffersen et al. and/or Academic Press
Howie, Dianne E. and Vicente, Kim J. (1998): Making the Most of Ecological Interface Design: The Role of Self-Explanation. In International Journal of Human-Computer Studies, 49 (5) pp. 651-674.
Ecological interface design (EID) is a candidate framework for designing interfaces for complex sociotechnical systems. Interfaces based on EID have been shown to lead to better performance than traditional interfaces, but not all participants benefit equally. Thus, it is important to identify ways of raising the performance of all participants using an EID interface. The purpose of this article is to determine whether encouraging participants to engage in self-explanations (i.e. reasoning aloud) can help them "make the most" of EID. An experiment was conducted using DURESS II, an interactive, thermal-hydraulic process control microworld with an interface designed according to the principles of EID. During this one-month study, participants controlled DURESS II under normal and fault conditions on a quasi-daily basis. Two experimental groups occasionally watched a replay of their own performance immediately after completing a trial, while the control group did not. In addition, the self-explanation (SE) group was instructed to explain aloud the reasons for their control actions while watching the replay. The replay group simply watched their trials again with no verbal explanation. The SE participants were divided into "good" and "poor" groups according to several performance criteria. An analysis of the protocols produced during self-explanation revealed that "good" SE participants showed more signs of self-explanation in their protocols than did the "poor" SE participants. There were no substantial differences between the SE, replay and control groups for normal trials. However, the SE participants did have the best overall performance on fault trials, suggesting that self-explanation can help operators make the most of EID.
© All rights reserved Howie and Vicente and/or Academic Press
Harrison, Beverly L. and Vicente, Kim J. (1996): An Experimental Evaluation of Transparent Menu Usage. In: Tauber, Michael J., Bellotti, Victoria, Jeffries, Robin, Mackinlay, Jock D. and Nielsen, Jakob (eds.) Proceedings of the ACM CHI 96 Human Factors in Computing Systems Conference April 14-18, 1996, Vancouver, Canada. pp. 391-398. Available online
This paper reports a systematic evaluation of transparent user interfaces. It reflects our progression from theoretically-based experiments in focused attention to more representative application-based experiments on selection response times and error rates. We outline how our previous research relates to both the design and the results reported here. For this study, we used a variably-transparent, text menu superimposed over different backgrounds: text pages, wire-frame images, and solid images. We compared "standard" text (Motif style, Helvetica, 14 point) and a proposed font enhancement technique ("Anti-Interference" outlining). More generally, this experimental evaluation provides information about the interaction between transparency and text legibility.
© All rights reserved Harrison and Vicente and/or ACM Press
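The "variably-transparent" overlays evaluated in this line of work rest on standard alpha compositing: each pixel of the superimposed object is a weighted mix of foreground (menu text) and background content. A minimal sketch with illustrative values, not taken from the paper:

```python
# Hedged sketch of per-pixel alpha compositing, the mechanism behind
# semi-transparent menus and palettes. RGB values and alphas are
# illustrative only.

def composite(fg, bg, alpha):
    """Blend foreground over background; alpha=1.0 is fully opaque."""
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))

black_text = (0, 0, 0)
busy_background = (200, 180, 90)

for alpha in (1.0, 0.5, 0.2):
    print(alpha, composite(black_text, busy_background, alpha))
```

At low alpha the text contributes little to the output pixel, which is exactly the legibility-versus-context trade-off the experiments measure.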
Pawlak, William S. and Vicente, Kim J. (1996): Inducing Effective Operator Control through Ecological Interface Design. In International Journal of Human-Computer Studies, 44 (5) pp. 653-688.
Ecological Interface Design (EID) is a theoretical framework for designing interfaces for complex human-machine systems. This article investigates the utility of EID in inducing effective real-time operator control performance during both normal and abnormal conditions. Two interfaces for a thermal-hydraulic process were compared, an EID interface based on physical and functional (P + F) system representations and a more traditional interface based solely on a physical (P) representation. Subjects were given 4 weeks of daily practice with one of the two interfaces before their performance on normal events and unfamiliar faults was evaluated. Under normal conditions, there was no performance difference between the P + F and P interfaces. However, dual task results indicate that the P interface loads more on verbal resources, whereas the P + F interface loads more on spatial resources during normal trials. Furthermore, a process tracing analysis of the fault trials showed that the P + F interface led to faster fault detection and more accurate fault diagnosis. Moreover, the P + F subjects exhibited a more sophisticated and effective set of fault management strategies that are similar to those observed in field studies of experienced operators in complex human-machine systems. In addition, a deficiency of the P + F interface was identified, suggesting a need for integrating historical information with emergent feature displays. Collectively, these findings have significant practical implications for the design of advanced computer interfaces for complex industrial systems.
© All rights reserved Pawlak and Vicente and/or Academic Press
Harrison, Beverly L., Ishii, Hiroshi, Vicente, Kim J. and Buxton, Bill (1995): Transparent Layered User Interfaces: An Evaluation of a Display Design to Enhance Focused and Divided Attention. In: Katz, Irvin R., Mack, Robert L., Marks, Linn, Rosson, Mary Beth and Nielsen, Jakob (eds.) Proceedings of the ACM CHI 95 Human Factors in Computing Systems Conference May 7-11, 1995, Denver, Colorado. pp. 317-324. Available online
This paper describes a new research program investigating graphical user interfaces from an attentional perspective (as opposed to a more traditional visual perception approach). The central research issue is how we can better support both focusing attention on a single interface object (without distraction from other objects) and dividing or time sharing attention between multiple objects (to preserve context or global awareness). This attentional trade-off seems to be a central but as yet comparatively ignored issue in many interface designs. To this end, this paper proposes a framework for classifying and evaluating user interfaces with semi-transparent windows, menus, dialogue boxes, screens, or other objects. Semi-transparency fits into a more general proposed display design space of "layered" interface objects. We outline the design space, task space, and attentional issues which motivated our research. Our investigation is comprised of both empirical evaluation and more realistic application usage. This paper reports on the empirical results and summarizes some of the application findings.
© All rights reserved Harrison et al. and/or ACM Press
Harrison, Beverly L., Kurtenbach, Gordon and Vicente, Kim J. (1995): An Experimental Evaluation of Transparent User Interface Tools and Information Content. In: Robertson, George G. (ed.) Proceedings of the 8th annual ACM symposium on User interface and software technology November 15 - 17, 1995, Pittsburgh, Pennsylvania, United States. pp. 81-90. Available online
The central research issue addressed by this paper is how we can design computer interfaces that better support human attention and better maintain the fluency of work. To accomplish this we propose to use semi-transparent user interface objects. This paper reports on an experimental evaluation which provides both valuable insights into design parameters and suggests a systematic evaluation methodology. For this study, we used a variably-transparent tool palette superimposed over different background content, combining text, wire-frame or line art images, and solid images. The experiment explores the issue of focused attention and interference, by varying both visual distinctiveness and levels of transparency.
© All rights reserved Harrison et al. and/or ACM Press
Burns, Catherine M. and Vicente, Kim J. (1995): A Framework for Describing and Understanding Interdisciplinary Interactions in Design. In: Proceedings of DIS95: Designing Interactive Systems: Processes, Practices, Methods, & Techniques 1995. pp. 97-103.
Today's design environments are highly constrained and projects are often worked on by designers from different domains. This paper describes a framework, based on the work of Rasmussen (1990), for examining these design processes in terms of design movements through levels of constraint and across design domains. The different design domains are defined by different disciplines. This framework was developed to assist in the analysis of a field study of the design of a nuclear power plant control room. The general structure of the framework is explained and then is used in five design scenarios to demonstrate its utility.
© All rights reserved Burns and Vicente and/or ACM Press
Christoffersen, Klaus, Hunter, Christopher N. and Vicente, Kim J. (1995): Ecological Interface Design and Fault Management Performance: Long-Term Effects. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting 1995. pp. 496-500.
This paper presents a six-month longitudinal study of the effects of ecological interface design (EID) on fault management performance. The research was conducted in the context of DURESS II, a real-time, interactive thermal-hydraulic process control simulation that was designed to be representative of industrial systems. Subjects' performance on two interfaces was compared, one based on the principles of EID and another based on a more traditional piping and instrumentation diagram (P&ID) format. Subjects were required to perform several control tasks, including startup, tuning, shutdown, and fault management on both routine and non-routine faults. At the end of the experiment, subjects used the interface that the other group had been using to control the system. The results indicate that there are substantial individual differences in performance, but that overall, the EID interface led to faster fault detection, more accurate fault diagnosis, and faster fault compensation.
© All rights reserved Christoffersen et al. and/or Human Factors Society
Vicente, Kim J., Roth, Emilie M., Klein, Gary A. and Gordon, Sallie E. (1995): Cognitive Task Analysis: What Is It? Why Do It?. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting 1995. p. 519.
Cognitive task analysis (CTA) is increasingly being used to effectively address a wide variety of human factors problems. However, different researchers are using significantly different methods. In many cases, a particular method is used solely by its originators. Therefore, there are significant issues that must be worked through before CTA becomes a widely accepted and easily transferable human factors tool. The objectives of this symposium are to: bring CTA to the attention of a wider audience; develop a better understanding of the differences and similarities between different CTA methods; and demonstrate the practical advantages of CTA.
© All rights reserved Vicente et al. and/or Human Factors Society
Vicente, Kim J. (1995): Task Analysis, Cognitive Task Analysis, Cognitive Work Analysis: What's the Difference?. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting 1995. pp. 534-537.
The term cognitive task analysis (CTA) has been appearing in the human factors literature with increasing frequency. Others have used the term cognitive work analysis (CWA). Is there a difference? Do either of these methods differ from traditional task analysis (TA)? If so, what advantages can CTA/CWA provide human factors engineers? To address these issues, the history of work analysis methods and the evolution of work are reviewed. Work method analyses of the 19th century were suited to manual labor. As job demands progressed beyond the physical, traditional TA was introduced to provide a broader perspective. CTA has since been introduced to increase the emphasis on cognitive task demands. However, CTA, like TA, is incapable of dealing with unanticipated task demands. CWA has been introduced to deal with complex systems whose demands include unanticipated events. The initial evidence available indicates that CWA can be applied to industry-scale problems, leading to innovative designs.
© All rights reserved Vicente and/or Human Factors Society
Lin, Laura, Isla, Racquel, Doniz, Karine, Harkness, Heather, Vicente, Kim J. and Doyle, D. John (1995): Analysis, Redesign, and Evaluation of a Patient-Controlled Analgesia Machine Interface. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting 1995. pp. 738-741.
The hypothesis explored in this paper is that, by adopting human factors design principles, the use of medical equipment can be made safer and more efficient. We have selected a commercially available patient-controlled analgesia (PCA) machine as a vehicle to test this hypothesis. A cognitive task analysis of PCA usage, combined with a set of human factors design principles, led to a redesigned PCA interface. An experimental evaluation was conducted, comparing this new interface with the existing interface. The results show that the new interface leads to significantly faster, less effortful, and more reliable performance. These findings have implications for improving the design of other medical equipment.
© All rights reserved Lin et al. and/or Human Factors Society
Flach, John M., Hancock, Peter A., Caird, Jeff and Vicente, Kim J. (1995): Global perspectives on the ecology of human-machine systems. Hillsdale, USA, Erlbaum
Bisantz, Ann M. and Vicente, Kim J. (1994): Making the Abstraction Hierarchy Concrete. In International Journal of Human-Computer Studies, 40 (1) pp. 83-117.
The abstraction hierarchy (AH) is a multileveled representation framework, consisting of physical and functional system models, which has been proposed as a useful framework for developing representations of complex work environments. Despite the fact that the AH is well known and widely cited in the cognitive engineering community, there are surprisingly few examples of its application. Accordingly, the intent of this paper is to provide a concrete example of how the AH can be applied as a knowledge representation framework. A formal instantiation of the AH as the basis for a computer program is presented in the context of a thermal-hydraulic process. This model of the system is complemented by a relatively simple reasoning mechanism which is independent of the information contained in the knowledge representation. This reasoning mechanism uses the AH model, along with qualitative user input about system states, to generate reasoning trajectories for different types of events and problems. Simulation outputs showing how the AH model can provide an effective basis for reasoning under different classes of situations, including challenging faults of various types, are presented. These detailed examples illustrate the various benefits of adopting the AH as a knowledge representation framework, namely: providing sufficient representations to allow reasoning about unanticipated fault and control situations, allowing the use of reasoning mechanisms that are independent of domain information, and having psychological relevance.
© All rights reserved Bisantz and Vicente and/or Academic Press
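The formal instantiation the abstract describes can be pictured as a multi-level means-ends graph plus a traversal over it. The sketch below uses hypothetical names and a toy thermal-hydraulic example; it illustrates the idea of the abstraction hierarchy as a knowledge representation, not the authors' actual program:

```python
# Minimal sketch of an abstraction hierarchy (AH): ordered levels of a
# directed means-ends graph, with a simple top-down traversal that
# traces a purpose to the physical components that realize it.
# Level names follow Rasmussen's standard five levels; everything else
# is illustrative.

LEVELS = [
    "functional purpose",
    "abstract function",
    "generalized function",
    "physical function",
    "physical form",
]

class AHNode:
    def __init__(self, name, level):
        self.name = name
        self.level = level   # index into LEVELS
        self.means = []      # nodes one level down that realize this node

    def add_means(self, node):
        assert node.level == self.level + 1, "means must sit one level below ends"
        self.means.append(node)

def trace_to_physical(node, path=None):
    """Follow means-ends links from a purpose down to leaf components."""
    path = (path or []) + [node.name]
    if not node.means:
        return [path]
    routes = []
    for m in node.means:
        routes.extend(trace_to_physical(m, path))
    return routes

# Toy thermal-hydraulic example
purpose = AHNode("deliver water at demand temperature", 0)
balance = AHNode("energy balance", 1)
heating = AHNode("heat transfer", 2)
heater = AHNode("heater H1", 3)
purpose.add_means(balance)
balance.add_means(heating)
heating.add_means(heater)

for route in trace_to_physical(purpose):
    print(" -> ".join(route))
```

Because the traversal is written against the graph structure rather than the domain content, the same reasoning mechanism works for any system modelled this way, which is the domain-independence benefit the paper emphasizes.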
Moray, Neville, Lee, John, Vicente, Kim J., Jones, Barclay G. and Rasmussen, Jens (1994): A Direct Perception Interface for Nuclear Power Plants. In: Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting 1994. pp. 481-485.
Following the suggestions of Beltracchi (1987) a direct perception interface for the thermal hydraulic systems of a pressurized water nuclear power reactor (PWR) was developed. It presents operators with an animated graphic of the Rankine heat cycle describing the functional relations of steam generation in a PWR. The ability of students of thermal and nuclear systems to recall system states, and detect and diagnose nine transients was compared to that of experienced nuclear power plant operators. The results were compared to a display representing traditional analog meters. The direct perception interface supported better diagnostic performance, but did not improve memory for quantitative information. Problems in evaluating such displays are discussed, in particular concerning choice of scenarios, and investigation of failure modes of advanced displays.
© All rights reserved Moray et al. and/or Human Factors Society
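For readers unfamiliar with the Rankine heat cycle that the direct perception display animates, the functional relation it conveys reduces to a simple energy balance: net work is turbine work minus pump work, and thermal efficiency is net work over heat added in the steam generator. The numbers below are illustrative only, not from the study:

```python
# Back-of-envelope Rankine cycle energy balance (illustrative values).
q_in = 2000.0      # kJ/kg, heat added in the steam generator
w_turbine = 800.0  # kJ/kg, work extracted by the turbine
w_pump = 20.0      # kJ/kg, work consumed by the feedwater pump

w_net = w_turbine - w_pump
efficiency = w_net / q_in
print(f"net work = {w_net} kJ/kg, efficiency = {efficiency:.2f}")
```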
Vicente, Kim J. (1990): Coherence- and Correspondence-Driven Work Domains: Implications for Systems Design. In Behaviour and Information Technology, 9 (6) pp. 493-502.
A distinction is made between coherence- and correspondence-driven work domains. This novel domain taxonomy is used to argue that the widely accepted goal of making the interface representation compatible with the user's mental model is not always appropriate. For correspondence-driven domains, it is more meaningful to constrain design from the side of the work domain rather than from that of the user. The implications of the coherence/correspondence distinction for the modelling of work domains, for interface design in computer supported co-operative work, and for the development of a multidimensional taxonomy of work domains are also briefly pointed out. The discussion suggests that the correspondence/coherence taxonomy provides a powerful conceptual tool for addressing fundamental issues in human-computer interaction.
© All rights reserved Vicente and/or Taylor and Francis
Rasmussen, Jens and Vicente, Kim J. (1990): Ecological Interfaces: A Technological Imperative in High-Tech Systems?. In International Journal of Human-Computer Interaction, 2 (2) pp. 93-110.
The topic of the present article is the design of ecological interfaces for advanced technological systems. Ecological interfaces are characterized by representing the internal functional structures and states of a system in the human-machine interface in a way that matches the immediate task and the cognitive characteristics of the user. It is argued that the present trend in technological development towards large, complex and rapidly changing socio-technical systems makes these kinds of interfaces important for system reliability and safety.
© All rights reserved Rasmussen and Vicente and/or Lawrence Erlbaum Associates
Vicente, Kim J. and Rasmussen, Jens (1990): The ecology of human-machine systems II: Mediating "direct perception" in complex work domains. In Ecological Psychology, 2 (3) pp. 207-249. Available online
Recently, a new class of artifacts has appeared in our environment: complex, high-technology work domains. An important characteristic of such systems is that their goal-relevant properties cannot be directly observed by the unaided eye. As a result, interface design is a ubiquitous problem in the design of these work environments. Nevertheless, the problem is one that has yet to be addressed in an adequate manner. An analogy to human perceptual mechanisms suggests that a smart instrument approach to interface design is needed to supplant the rote instrument (single-sensor-single-indicator) approach that has dominated to this point. Ecological interface design (EID) is a theoretical framework in the smart instrument vein that postulates a set of general, prescriptive principles for design. The goal of EID is twofold: first, to reveal the affordances of the work domain through the interface in such a way as to take advantage of the powerful capabilities of perception and action; and second, to provide the appropriate computer support for the comparatively more laborious process of problem solving. An example of the application of the EID framework is presented in the context of a thermal-hydraulic system. The various steps in the design process are illustrated, showing how the abstract principles of EID can be applied in a prescriptive manner to develop a concrete design product. An important outcome of this discussion is the novel application of Rasmussen's (1985b) means-end hierarchy to structure the affordances of an ecosystem.
© All rights reserved Vicente and Rasmussen and/or Taylor and Francis
Rasmussen, Jens and Vicente, Kim J. (1989): Coping with Human Errors through System Design: Implications for Ecological Interface Design. In International Journal of Man-Machine Studies, 31 (5) pp. 517-534.
Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimize the potential for control interference and support recovery from errors. In other words, the focus should be on control of the effects of errors rather than on the elimination of errors per se. In this paper, we propose a theoretical framework for interface design that attempts to satisfy these objectives. The goal of our framework, called ecological interface design, is to develop a meaningful representation of the process which is not just optimised for one particular level of cognitive control, but that supports all three levels simultaneously. The paper discusses the necessary requirements for a mapping between the process and the combined action/observation surface, and analyses of the resulting influence on both the interferences causing error and on the opportunity for error recovery left to the operator.
© All rights reserved Rasmussen and Vicente and/or Academic Press
Vicente, Kim J. and Williges, Robert C. (1988): Accommodating Individual Differences in Searching a Hierarchical File System. In International Journal of Man-Machine Studies, 29 (6) pp. 647-668.
Individual differences among users of a hierarchical file system were investigated. The results of a previous experiment revealed that subjects with low spatial ability were getting lost in the hierarchical file structure. Based on the concept of visual momentum, two changes to the old interface were proposed in an attempt to accommodate the individual differences in task performance. The changes consisted of a partial map of the hierarchy and an analogue indicator of current file position. This experiment compared the performance of users with high and low spatial abilities on the old verbal interface and the new graphical interface. The graphical interface resulted in changes in command usage that were consistent with the predictions of the visual momentum analysis. Although these changes in strategy resulted in a performance advantage for the graphical interface, the relative performance difference between high and low spatial groups remained constant across interfaces. However, the new interface did result in a decrease in the within-group variability in performance.
© All rights reserved Vicente and Williges and/or Academic Press
Vicente, Kim J. and Rasmussen, Jens (1988): On Applying the Skills, Rules, Knowledge Framework to Interface Design. In: Proceedings of the Human Factors Society 32nd Annual Meeting 1988. pp. 254-258.
In this paper, a theoretical framework for interface design for complex systems is proposed. The approach, called Ecological Interface Design (EID), is based on the skills, rules, knowledge framework of levels of cognitive control. The fundamental goal of EID is to develop interfaces that provide the appropriate support for all three levels, but that do not force cognitive control to a higher level than the demands of the task require. The framework, consisting of a set of prescriptive design principles, is discussed, and an example of its application is presented.
© All rights reserved Vicente and Rasmussen and/or Human Factors Society