Number of co-authors: 8
Number of publications with 3 favourite co-authors: Joseph Sharit (3), Charles D. Bowen (2), David M. Hirst (1)
Donna L. Cuomo's 3 most productive colleagues, by number of publications: Joseph Sharit (18), Laurie E. Damianos (5), Charles D. Bowen (3)
Donna L. Cuomo
Has also published under the name of:
"D. L. Cuomo"
Publications by Donna L. Cuomo (bibliography)
Damianos, Laurie E., Cuomo, Donna L., Griffith, John, Hirst, David M. and Smallwood, James (2007): Exploring the Adoption, Utility, and Social Influences of Social Bookmarking in a Corporate Environment. In: HICSS 2007 - 40th Hawaii International Conference on System Sciences, 3-6 January, 2007, Waikoloa, Big Island, HI, USA. p. 86.
Cuomo, Donna L. (1994): Understanding the Applicability of Sequential Data Analysis Techniques for Analyzing Usability Data. In Behaviour and Information Technology, 13 (1) pp. 171-182.
The applicability of sequential data analysis (SDA) techniques for analyzing usability test data is examined. SDA techniques include transition matrix analysis, lag sequential analysis, frequency of cycles, graphical summarization techniques, and pattern analysis techniques. A subset of each was used in analyzing the data from three usability studies. The encoding schemes used, the analysis routines run, software tools to support encoding and analysis (SHAPA and the Maximal Repeating Pattern analysis tool), and their interactions are discussed. The different types of usability problems which can be extracted from the data when analyzed with SDA techniques are illustrated. It is concluded that the SDA techniques will be useful once the state of the art in software support is able to provide the analyst with greater flexibility in applying the analysis routines. Without the ability to apply analysis routines to multiple data levels, too much work is involved in obtaining a complete analysis of usability problems at all levels.
© All rights reserved Cuomo and/or Taylor and Francis
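The transition matrix analysis named in the abstract above can be sketched briefly: given a sequence of encoded user events, count how often each event code is immediately followed by each other code. The event codes and log below are purely illustrative assumptions, not data from the paper.

```python
from collections import defaultdict

def transition_matrix(events):
    """Count transitions between consecutive encoded events.

    Returns a dict of dicts: counts[a][b] is the number of times
    event code `a` was immediately followed by event code `b`.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(events, events[1:]):
        counts[a][b] += 1
    return counts

# Hypothetical encoded usability log (codes are illustrative only)
log = ["menu", "error", "menu", "help", "menu", "error", "menu"]
matrix = transition_matrix(log)
print(matrix["menu"]["error"])  # menu -> error occurred 2 times
```

A high count on a transition such as menu -> error is the kind of pattern an analyst would flag as a potential usability problem.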
Cuomo, Donna L. (1994): A Method for Assessing the Usability of Graphical, Direct-Manipulation Style Interfaces. In International Journal of Human-Computer Interaction, 6 (3) pp. 275-297.
A model-based method for assessing the usability of graphical, direct-manipulation style interfaces was developed. The method involves collecting and integrating verbal protocol data, history logs, and videotapes of the system display. Then, an analyst familiar with the task, the data, and Norman's (1986) user activity model reviews the data and makes determinations on what they mean in terms of the model. An encoding scheme is next applied to the integrated data, to structure the Human Computer Interaction (HCI) process at a detailed interaction level. The structured data now support the application of quantitative methods and the identification of meaningful patterns and frequencies that highlight potential usability problems or instances of indirectness. Error encodings reflect user-system interface difficulties not only in the execution stage but also in the psychological stages. The method was used to evaluate the usability of a military airspace scheduling system; the types of usability problems identified and the advantages of the method are discussed.
© All rights reserved Cuomo and/or Lawrence Erlbaum Associates
Cuomo, Donna L. and Bowen, Charles D. (1994): Understanding Usability Issues Addressed by Three User-System Interface Evaluation Techniques. In Interacting with Computers, 6 (1) pp. 86-108.
Three structured judgment evaluation techniques were applied to a system with a graphical direct manipulation style interface, to understand the types of usability problems they address. These evaluation techniques were cognitive walkthrough, heuristic evaluation, and the Smith and Mosier (1986) guidelines. The authors wanted to learn whether the techniques identify problems: across all stages of user activity, which noticeably affect users' performance with the system, and which are important to the usability of direct manipulation-style systems. Results showed that the cognitive walkthrough method identifies issues almost exclusively within the action specification stage, while guidelines covered more stages. The walkthrough was best, however, and the guidelines worst at predicting problems that cause users noticeable difficulty (as observed during a usability study). All the techniques could be improved in assessing semantic distance and addressing all stages on the evaluation side of the HCI activity cycle. To evaluate the directness of engagements, improved or new techniques are needed.
© All rights reserved Cuomo and Bowen and/or Elsevier Science
Cuomo, Donna L. (1993): A Methodology and Encoding Scheme for Evaluating the Usability of Graphical, Direct Manipulation Style Interfaces. In: Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting 1993. pp. 1137-1141.
A model-based method for assessing the usability of graphical, direct manipulation style interfaces was developed and applied to a military airspace scheduling system. The method involves collecting and integrating verbal protocol data and mouse/keystroke files, and having an analyst familiar with the task, the data, and Norman's (1986) user activity model review the data and make determinations on what the data mean in terms of the model. A hierarchical encoding scheme based on the model is then applied to the integrated data to structure the human-computer interaction (HCI) process at a detailed interaction level. Meaningful patterns can be identified, and frequency of events per task and number of actions per intention can be calculated at various levels in the hierarchical breakdown, highlighting potential usability problems or instances of indirectness. Repetitious sequences, for example, could imply a missing high-level task domain object or an inability to group objects for application of a single action. Detailed model-based error encodings reflect user-system interface difficulties not only in the execution stage of HCI but in the psychological stages as well. The types of usability problems identified and the advantages of the method are discussed. Based on these results, we have begun developing a multi-media tool to support application of the method.
© All rights reserved Cuomo and/or Human Factors Society
Cuomo, Donna L. and Bowen, Charles D. (1992): Stages of User Activity Model as a Basis for User-System Interface Evaluations. In: Proceedings of the Human Factors Society 36th Annual Meeting 1992. pp. 1254-1258.
This paper discusses the results of the first phase of a research project concerned with developing methods and measures of user-system interface effectiveness for command and control systems with graphical, direct manipulation style interfaces. Due to the increased use of prototyping user interfaces during concept definition and demonstration/validation phases, the opportunity exists for human factors engineers to apply evaluation methodologies early enough in the life cycle to make an impact on system design. Understanding and improving user-system interface (USI) evaluation techniques is critical to this process. In 1986, Norman proposed a descriptive "stages of user activity" model of human-computer interaction. Hutchins, Hollan, and Norman (1986) proposed concepts of measures based on the model which would assess the directness of the engagements between the user and the interface at each stage of the model. This first phase of our research program involved applying three USI evaluation techniques to a single interface, and assessing which, if any, provided information on the directness of engagement at each stage of Norman's model. We also classified the problem types identified according to the Smith and Mosier (1986) functional areas. The three techniques used were cognitive walkthrough, heuristic evaluation, and guidelines. It was found that the cognitive walkthrough method applied almost exclusively to the action specification stage. The guidelines were applicable to more of the stages evaluated but all the techniques were weak in measuring semantic distance and all of the stages on the evaluation side of the HCI activity cycle. Improvements to existing or new techniques are required for evaluating the directness of engagement for graphical, direct manipulation style interfaces.
© All rights reserved Cuomo and Bowen and/or Human Factors Society
Blackwell, Janet S. and Cuomo, Donna L. (1991): Evaluation of a Proposed Space and Missile Warning Symbology Standard for Graphical Displays. In: Proceedings of the Human Factors Society 35th Annual Meeting 1991. pp. 102-106.
A discriminability evaluation was performed on a proposed Space and Missile Warning symbol set. Our analysis focused on the discriminability of the symbols and the application of the information coding techniques. Inconsistent or inappropriate use of coding techniques can affect a user's interpretation of a symbol's intended meaning. Potential problems included the similarity of individual symbols, the use of alphanumeric markers and partially shaded symbols, and the lack of guidance on the minimum size of the symbols. After a lengthy review of previous research, we felt the literature could not provide adequate solutions. A two-part discriminability study was conducted to test the overall effects of the information coding techniques on discriminability, to identify individual symbols with low discriminability, and to determine an appropriate minimum size for these symbols. Search time was used as a measure of symbol discriminability. Size, shape, markers, and shading had significant effects on search time and errors. The experiments confirmed the suspected discriminability problems, and modifications were made to the existing symbol set to create three new alternative symbol sets. Testing performed on these new symbol sets revealed that many of the problem areas from the original symbol set had been improved. Design guidelines and a new modified symbol set were proposed for review by the operational community.
© All rights reserved Blackwell and Cuomo and/or Human Factors Society
Cuomo, Donna L. and Rizzuto, Anthony P. (1990): Methodology for Determining the Human Role in the Strategic Defense System Command Center. In: Woods, D. and Roth, E. (eds.) Proceedings of the Human Factors Society 34th Annual Meeting 1990, Santa Monica, USA. pp. 1148-1152.
Cuomo, Donna L. and Sharit, Joseph (1989): A Study of Human Performance in Computer-Aided Architectural Design. In International Journal of Human-Computer Interaction, 1 (1) pp. 69-107.
This paper describes the development and application of a cognitively-based performance methodology for assessing human performance on computer-aided architectural design (CAAD) tasks. Two CAAD tasks were employed that were hypothesized to be different in terms of the underlying cognitive processes required for these tasks to be performed. Methods of manipulating task complexity within each of these tasks were then developed. Six architectural graduate students were trained on a commercially available CAAD system. Each student performed the two experimental design tasks at one of three levels of complexity. The data collected included protocols, video recordings of the computer screen, and an interactive script (a time-stamped record of every command input and the computer's textual response). Performance measures and methods of analysis were developed which reflected the cognitive processes used by the human during design (including problem-solving techniques, planning times, heuristics employed, etc.) and the role of the computer as a design aid. The analysis techniques used included graphical techniques, Markov process analysis, protocol analysis, and error classification and analysis. The results of the study indicated that some measures more directly reflected human design activity while others more directly reflected the efficiency of interaction between the computer and the human. The discussion of the results focuses primarily on the usefulness of the tasks employed, including methods for manipulating task complexity, and the effectiveness of this system, as well as CAAD systems in general, for aiding human design processes.
© All rights reserved Cuomo and Sharit and/or Lawrence Erlbaum Associates
Cuomo, Donna L. and Sharit, Joseph (1989): Human Performance in Computer-Aided Architectural Design. In: Proceedings of the Third International Conference on Human-Computer Interaction 1989. pp. 241-249.
The tremendous growth in the area of human-computer interaction has, in some cases, resulted in the implementation of technologies at a pace well ahead of the development of methods for assessing human performance on tasks employing these technologies. An example of such a technology is computer-aided design. The cognitive processes underlying human design behavior require that performance measures be developed that adequately reflect these processes. Ultimately, the development and implementation of such a performance methodology could help us establish the degree to which the computer technology supports or constrains human design activities. In this paper we discuss an approach that was taken toward meeting these objectives. In particular, the application area of architectural design will be examined.
© All rights reserved Cuomo and Sharit and/or Lawrence Erlbaum Associates
Sharit, Joseph and Cuomo, Donna L. (1988): A Cognitively Based Methodology for Evaluating Human Performance in the Computer-Aided Design Task Domain. In Behaviour and Information Technology, 7 (4) pp. 373-397.
This article describes a methodology for evaluating human performance in the computer aided design (CAD) task environment. The methodology is based primarily on cognitive theoretic frameworks that are consistent with processes presumed to underlie human design activities. The motivation for its development stems from rapid software and hardware advances in CAD systems and our relative lack of understanding of how these enhancements affect human design performance for (1) fundamentally different types of tasks and (2) different levels of complexity for a particular task. This methodology is currently being applied to computer aided architectural design, an area where artificial intelligence (AI), enhanced geometric modelling and other system features are being debated in terms of their usefulness in aiding the human's design activities.
© All rights reserved Sharit and Cuomo and/or Taylor and Francis
Changes to this page (author):
20 Feb 2010: Modified
13 Jun 2009: Added
28 Jun 2007: Added
26 Jun 2007: Added
26 Jun 2007: Added
26 Jun 2007: Added
26 Jun 2007: Added
28 Apr 2003: Added
Page maintainer: The Editorial Team