Number of co-authors: 14
Number of publications with 3 favourite co-authors:
Philip J. Smith: 3
Jane M. Fraser: 2
Sally Rudmann: 2
Jack W. Smith's 3 most productive colleagues in number of publications:
Philip J. Smith: 29
B. Chandrasekaran: 17
Thomas E. Miller: 3
Jack W. Smith
Publications by Jack W. Smith (bibliography)
Fraser, Jane M., Smith, Philip J. and Smith, Jack W. (1992): A Catalog of Errors. In International Journal of Man-Machine Studies, 37 (3) pp. 265-307.
This paper reviews various errors that have been described by comparing human behavior to the norms of probability, causal connection and logical deduction. For each error we review evidence on whether the error has been demonstrated to occur. For many errors, the occurrence of a bias has not been demonstrated; for others, a bias does occur, but arguments can be made that the bias is not always an error. Based on the conclusions of this review, we caution researchers and practitioners against referring uncritically to well-known biases and errors.
© All rights reserved Fraser et al. and/or Academic Press
Chandrasekaran, B., Johnson, Todd R. and Smith, Jack W. (1992): Task-Structure Analysis for Knowledge Modeling. In Communications of the ACM, 35 (9) pp. 124-137.
Smith, Philip J., Galdes, Deborah, Fraser, Jane M., Miller, Thomas, Smith, Jack W., Svirbely, John R., Blazina, Janice, Kennedy, Melanie, Rudmann, Sally and Thomas, Donna L. (1991): Coping with the Complexities of Multiple-Solution Problems: A Case Study. In International Journal of Man-Machine Studies, 35 (4) pp. 429-453.
A model is proposed to account for the expertise of a skilled immunohematologist in solving multiple-solution problems. These problems, which he must deal with daily, are concerned with ensuring the safe transfusion of blood into patients. This model suggests that he copes with this difficult class of problems by: (1) using patterns in the data to simplify the problem, hypothesizing the number of primitive solutions necessary to account for the test results and, when possible, decomposing the problem into a set of less complex, single-solution problems, such decompositions then enabling more powerful reasoning processes; (2) making use of a mixture of data-driven and hypothesis-driven processes in order to reduce the chances that heuristic (and therefore error-prone) methods and cognitive biases will lead away from critical data; (3) relying on a mixture of confirmatory and rule-out processes to provide converging evidence, thus reducing the chances of error; (4) uncovering his own errors through the use of "error models" that identify the conditions where one of his processes is likely to make an error (similar to the use of student models by expert tutors to diagnose mistakes made by students).
© All rights reserved Smith et al. and/or Academic Press
Guerlain, Stephanie A. E., Smith, Philip J., Miller, Thomas E., Gross, Susan M., Smith, Jack W. and Rudmann, Sally (1991): A Testbed for Teaching Problem Solving Skills in an Interactive Learning Environment. In: Proceedings of the Human Factors Society 35th Annual Meeting 1991. p. 1166.
An interactive learning environment was developed with the goal of empirically testing the effectiveness of various teaching strategies in improving problem solving performance. The domain chosen was transfusion medicine since it involves solving complex, multiple solution problems which are typically found to be difficult (Elstein, Shulman, and Sprafka, 1978) and because normal performance of this task calls for marking data sheets with intermediate conclusions, thereby improving the chances of the computer correctly inferring the student's reasoning. The testbed, called TMT (for Transfusion Medicine Tutor), monitors for errors, builds a model of what a student knows and can select teaching strategies based on human tutoring models that were developed from earlier studies. The testbed will be used to collect data of a student's performance in conditions where the degree of teaching and type of feedback are manipulated. A number of broadly applicable issues can be explored in this framework such as the difference between expert and student problem solving strategies, the effectiveness of different teaching strategies, and the importance of modeling student knowledge and providing visual feedback when developing an interactive learning environment. Preliminary results of our experiments, a demonstration of the testbed, and a discussion of how it was implemented will be presented in the demonstration session.
© All rights reserved Guerlain et al. and/or Human Factors Society
Changes to this page (author):
12 Feb 2010: Modified
17 Aug 2009: Added
26 Jun 2007: Added
28 Apr 2003: Added
Page maintainer: The Editorial Team