Publication statistics

Pub. period: 1994-2011
Pub. count: 7
Number of co-authors: 11



Co-authors

Number of publications with 3 favourite co-authors:

David E. Kieras: 4
David Meyer: 2
Anthony J. Hornof: 1

 

 

Productive colleagues

Scott D. Wood's 3 most productive colleagues, by number of publications:

James D. Foley: 49
Darren Gergle: 34
David E. Kieras: 25
 
 
 

Scott D. Wood

Publications by Scott D. Wood (bibliography)

2011

Wood, Scott D. and Bagian, James P. (2011): A Cognitive Analysis of Color-Coded Wristband Use in Health Care. In: Proceedings of the Human Factors and Ergonomics Society 55th Annual Meeting 2011. pp. 281-285.

In this paper we examine human factors involved in the use of color-coded wristbands by analyzing cases that resulted in adverse events. We consider such cases in terms of the stage of care at which the events occurred, the health care roles involved, and the cognitive factors seen as most likely to be causative. We discuss a common theme in this analysis, that perhaps we are expecting too much from color-coding, and propose a number of possible solutions and suggestions for improving patient safety. We conclude that relying on color alone for any health care task is both risky and ineffective.

© All rights reserved Wood and Bagian and/or HFES

2001

Brinck, Tom, Gergle, Darren and Wood, Scott D. (2001): Usability for the Web: Designing Web Sites that Work. Morgan Kaufmann Publishers

1997

Kieras, David E., Wood, Scott D. and Meyer, David (1997): Predictive Engineering Models Based on the EPIC Architecture for a Multimodal High-Performance Human-Computer Interaction Task. In ACM Transactions on Computer-Human Interaction, 4 (3) pp. 230-275.

Engineering models of human performance permit some aspects of usability of interface designs to be predicted from an analysis of the task, and thus they can replace to some extent expensive user-testing data. We successfully predicted human performance in telephone operator tasks with engineering models constructed in the EPIC (Executive Process-Interactive Control) architecture for human information processing, which is especially suited for modeling multimodal, complex tasks, and has demonstrated success in other task domains. Several models were constructed on an a priori basis to represent different hypotheses about how operators coordinate their activities to produce rapid task performance. The models predicted the total time with useful accuracy and clarified some important properties of the task. The best model was based directly on the GOMS analysis of the task and made simple assumptions about the operator's task strategy, suggesting that EPIC models are a feasible approach to predicting performance in multimodal high-performance tasks.

© All rights reserved Kieras et al. and/or ACM Press

1995

Kieras, David E., Wood, Scott D. and Meyer, David (1995): Predictive Engineering Models Using the EPIC Architecture for a High-Performance Task. In: Katz, Irvin R., Mack, Robert L., Marks, Linn, Rosson, Mary Beth and Nielsen, Jakob (eds.) Proceedings of the ACM CHI 95 Human Factors in Computing Systems Conference May 7-11, 1995, Denver, Colorado. pp. 11-18.

Engineering models of human performance permit some aspects of usability of interface designs to be predicted from an analysis of the task, and thus can replace to some extent expensive user testing data. Human performance in telephone operator tasks was successfully predicted using engineering models constructed in the EPIC (Executive Process-Interactive Control) architecture for human information-processing, which is especially suited for modeling multimodal, complex tasks. Several models were constructed on an a priori basis to represent different hypotheses about how users coordinate their activities to produce rapid task performance. All of the models predicted the total task time with useful accuracy, and clarified some important properties of the task.

© All rights reserved Kieras et al. and/or ACM Press

 

Kieras, David E., Wood, Scott D., Abotel, Kasem and Hornof, Anthony J. (1995): GLEAN: A Computer-Based Tool for Rapid GOMS Model Usability Evaluation of User Interface Designs. In: Robertson, George G. (ed.) Proceedings of the 8th annual ACM symposium on User interface and software technology November 15 - 17, 1995, Pittsburgh, Pennsylvania, United States. pp. 91-100.

Engineering models of human performance permit some aspects of usability of interface designs to be predicted from an analysis of the task, and thus can replace to some extent expensive user testing data. The best-developed such tools are GOMS models, which have been shown to be accurate and effective in predicting usability of the procedural aspects of interface designs. This paper describes a computer-based tool, GLEAN, that generates quantitative predictions from a supplied GOMS model and a set of benchmark tasks. GLEAN is demonstrated to reproduce the results of a case study of GOMS model application with considerable time savings over both manual modeling and empirical testing.

© All rights reserved Kieras et al. and/or ACM Press
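
To make the kind of prediction GLEAN automates concrete, the short Python sketch below sums standard keystroke-level-model operator times (Card, Moran and Newell) over a hypothetical benchmark task. It is a minimal illustration under those assumptions only, not GLEAN's GOMSL notation or its actual interface.

# Minimal keystroke-level (GOMS-family) execution-time estimate.
# Illustrative sketch only -- not GLEAN's GOMSL notation or output.

# Standard keystroke-level-model operator times in seconds
# (Card, Moran & Newell, 1983).
OPERATOR_TIMES = {
    "K": 0.28,  # press a key or button (typical typist)
    "P": 1.10,  # point at a target with a mouse
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation before a step
}

def predict_execution_time(operators, system_response=0.0):
    """Sum operator times for one benchmark task; system_response covers waiting on the system."""
    return sum(OPERATOR_TIMES[op] for op in operators) + system_response

# Hypothetical benchmark task: home on the mouse, point at a field,
# prepare mentally, then type a five-character code.
task = ["H", "P", "M"] + ["K"] * 5
print(f"Predicted execution time: {predict_execution_time(task):.2f} s")

Tools such as GLEAN go further by running a supplied GOMS model against a whole set of benchmark tasks automatically, which is where the reported time savings over manual modeling and empirical testing come from.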

 

Damper, R. I. and Wood, Scott D. (1995): Speech versus Keying in Command and Control Applications. In International Journal of Human-Computer Studies, 42 (3) pp. 289-305.

Experimental comparisons of speech and competitor input media such as keying have, taken overall, produced equivocal results: this has usually been attributed to "task-specific variables". Thus, it seems that there are some good, and some less good, situations for utilization of speech input. One application generally thought to be a success is small-vocabulary, isolated-word recognition for command and control. In a simulated command and control task, Poock purportedly showed a very significant superiority of speech over keying in terms of higher input speeds and lower error rates. This paper argues that the apparent superiority observed results from a methodological error -- specifically that the verbose commands chosen suit the requirements of speech input but make little or no concession to the requirements of keying. We describe experiments modelled on those of Poock, but designed to overcome this putative flaw and to effect a fair comparison of the input media by using terse, abbreviated commands for the keying condition at least. Results of these new experiments reveal that speech input is 10.6% slower (although this difference is not statistically significant) and 360.4% more error-prone than keying, supporting our hypothesis that the methodology of the earlier work was flawed. However, simple extrapolation of our data for terse commands to the situation where keyed commands are entered in full suggests that other differences between our work and Poock's could play a part. Overall, we conclude that a fair comparison of input media requires an experimental design that explicitly attempts to minimize the so-called transaction cycle -- the number of user actions necessary to elicit a system response -- for each medium.

© All rights reserved Damper and Wood and/or Academic Press

1994

Byrne, Michael D., Wood, Scott D., Sukaviriya, Piyawadee, Foley, James D. and Kieras, David E. (1994): Automating Interface Evaluation. In: Adelson, Beth, Dumais, Susan and Olson, Judith S. (eds.) Proceedings of the ACM CHI 94 Human Factors in Computing Systems Conference April 24-28, 1994, Boston, Massachusetts. pp. 232-237.

One method for user interface analysis that has proven successful is formal analysis, such as GOMS-based analysis. Such methods are often criticized for being difficult to learn, or at the very least an additional burden for the system designer. However, if the process of constructing and using formal models could be automated as part of the interface design environment, such models could be of even greater value. This paper describes an early version of such a system, called USAGE (the UIDE System for semi-Automated GOMS Evaluation). Given the application model necessary to drive the UIDE system, USAGE generates an NGOMSL model of the interface which can be "run" on a typical set of user tasks and provide execution and learning time estimates.

© All rights reserved Byrne et al. and/or ACM Press

 

Changes to this page (author)

04 Apr 2012: Modified
27 Jun 2007: Modified
28 Apr 2003: Added

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/scott_d__wood.html
