Number of co-authors: 22

Number of publications with 3 favourite co-authors:
Paul Weiler: 2
Jeffrey J. Hendrickson: 2
Billy W. Hensley: 2

Monty Hammontree's 3 most productive colleagues in number of publications:
Gilbert Cockton: 72
Elizabeth F. Churchill: 58
Russell Beale: 51
Has also published under the name of:
"Monty L. Hammontree"
Publications by Monty Hammontree (bibliography)
Vaughan, Misha, Courage, Catherine, Rosenbaum, Stephanie, Jain, Jhilmil, Hammontree, Monty, Beale, Russell and Welsh, Dan (2008): Longitudinal usability data collection: art versus science?. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 2261-2264.
In this proposal the authors describe an exciting panel for CHI 2008 on Longitudinal Usability Data Collection. Collecting usability data over time is increasingly becoming best practice in industry, but lacks "thought leadership" in the current literature -- very few articles or books exist addressing the topic. To inspire academic research and share best practices with practitioners, we propose a panel to debate some key questions that arose from the CHI 2007 SIG on the same topic.
© All rights reserved Vaughan et al. and/or ACM Press
Gilmore, David J., Cockton, Gilbert, Churchill, Elizabeth F., Kujala, Sari, Henderson, Austin and Hammontree, Monty (2008): Values, value and worth: their relationship to HCI?. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 3933-3936.
Hammontree, Monty, Weiler, Paul and Nayak, Nandini (1994): Remote Usability Testing. In Interactions, 1 (3) pp. 21-25.
Weiler, Paul, Cordes, Richard, Hammontree, Monty, Hoiem, Derek and Thompson, Michael (1993): Software for the Usability Lab: A Sampling of Current Tools. In: Ashlund, Stacey, Mullet, Kevin, Henderson, Austin, Hollnagel, Erik and White, Ted (eds.) Proceedings of the ACM CHI 93 Human Factors in Computing Systems Conference April 24-29, 1993, Amsterdam, The Netherlands. pp. 57-60.
This panel brings together usability professionals throughout the computer industry to demonstrate and discuss their usability lab software tools. These tools are specifically designed to improve the data collection and analysis process for usability labs. Their capabilities range from simple to complex and the panel will not only discuss the benefits of using the tools but also share the lessons learned during the design and development process.
© All rights reserved Weiler et al. and/or ACM Press
Hammontree, Monty, Hendrickson, Jeffrey J. and Hensley, Billy W. (1992): Integrated Data Capture and Analysis Tools for Research and Testing on Graphical User Interfaces. In: Bauersfeld, Penny, Bennett, John and Lynch, Gene (eds.) Proceedings of the ACM CHI 92 Human Factors in Computing Systems Conference June 3-7, 1992, Monterey, California. pp. 431-432.
Our on-line data capture and analysis tools include an event capture program, event data filtering programs, a multimedia data analyzer, and a retrospective verbal protocol recorder for use with the multimedia data analyzer. Off-line observation logging is also supported. Additional plans for development include the integration of an online time-synchronized observation logger, and time-synchronized eyetracking data recording. The tool set provides an integrated multi-source data collection, processing, and analysis system for: 1) comparing and evaluating software applications and prototypes; 2) evaluating software documentation and instructional materials; and 3) evaluating on-line training. The tools currently run on Macintosh computers and under Microsoft Windows. Plans are to port the tools to run under Presentation Manager and Motif.
© All rights reserved Hammontree et al. and/or ACM Press
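The entry above describes a pipeline of an event capture program feeding data filtering programs that aggregate raw user events into meaningful characterizations. A minimal sketch of that filtering idea, in Python, is below; the `UIEvent` structure, field names, and `filter_events` helper are hypothetical illustrations, not the authors' Macintosh/Windows tools.

```python
# Illustrative sketch of event capture/filtering as described in the abstract.
# The event format and function names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class UIEvent:
    timestamp: float  # seconds since session start
    widget: str       # interface object the event targets
    action: str       # e.g. "click", "keypress", "menu-select"

def filter_events(events, widget=None, action=None):
    """Select a meaningful subset of raw user-generated events,
    in the spirit of the paper's 'data filtering programs'."""
    return [e for e in events
            if (widget is None or e.widget == widget)
            and (action is None or e.action == action)]

# A tiny captured session, then a filtered view of it.
session = [
    UIEvent(0.5, "File>Open", "menu-select"),
    UIEvent(2.1, "OK button", "click"),
    UIEvent(3.4, "Cancel button", "click"),
]
clicks = filter_events(session, action="click")
print(len(clicks))  # prints 2
```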
Hammontree, Monty, Hensley, Billy W. and Hendrickson, Jeffrey J. (1991): Event Capture and Analysis Tools for Graphic User Interfaces. In: Proceedings of the Human Factors Society 35th Annual Meeting 1991. p. 1165.
This was a demonstration of a set of tools used to: 1) compare and evaluate software applications and prototypes; 2) evaluate documentation and instructional material; and 3) process video tape recordings of human-computer interaction (HCI). These tools include an event capture tool, which records events related to objects in graphical user interfaces, data filtering tools, which translate and aggregate user-generated events into meaningful characterizations of the interaction, and a multimedia data analyzer, which couples event logs and video recordings from HCI testing sessions.
© All rights reserved Hammontree et al. and/or Human Factors Society
Arnegard, Ruth J., Hammontree, Monty, Montgomery, Melinda J., Pearson, Gwen L. and Zwaga, Harm J. G. (1989): An Evaluation of the Signposting System in a Large Subway Station. In: Proceedings of the Human Factors Society 33rd Annual Meeting 1989. pp. 541-545.
The purpose of this study was to objectively evaluate the signposting system within the Metro Center subway station of the WMATA (Washington Metro Area Transit Authority). The approach taken was: first, to estimate the prevalence of passenger behaviors indicative of deficiencies in wayfinding; second, to evaluate the adequacy of the signposting within various decision areas of the station; and finally, to evaluate the individual components of the signage system. The first objective was addressed by observing 507 passengers selected via a pseudo-random sampling technique. From this sample an overview of traffic patterns was developed, and it was determined that roughly 7% of these passengers followed a route that did not comply with the directions provided by the station's signposting system. It was further found that an additional 2% asked for directions. These figures were combined with data provided by the WMATA to project that 6,000 to 7,000 of the passengers disembarking a train within Metro Center will evidence some form of inefficient wayfinding behavior each day. To address the second question, portions of the randomly sampled data were combined with data gathered from a selectively sampled group of passengers operationally defined as needing wayfinding assistance (n = 359). These data were analyzed to determine the relative difficulty that information-needy passengers had in finding their way through the various areas of the station. The final objective was addressed by comparing individual components of the signage system to current human factors guidelines.
© All rights reserved Arnegard et al. and/or Human Factors Society
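The projection in the abstract above combines the observed inefficiency rate (~7% non-compliant routes plus ~2% asking for directions) with WMATA ridership data. The ridership figure is not stated in the abstract, so the `DAILY_DISEMBARKING` value below is an assumption chosen only to show that a rate of ~9% reproduces the reported 6,000-7,000 range.

```python
# Illustrative reconstruction of the study's projection arithmetic.
OBSERVED = 507                # passengers in the pseudo-random sample
NON_COMPLIANT_RATE = 0.07     # followed a route not matching the signposting
ASKED_DIRECTIONS_RATE = 0.02  # additionally asked for directions

inefficient_rate = NON_COMPLIANT_RATE + ASKED_DIRECTIONS_RATE  # ~9% overall

# Hypothetical daily disembarking count: the abstract cites WMATA data but
# does not give the number; ~72,000/day yields a projection in 6,000-7,000.
DAILY_DISEMBARKING = 72_000

projected = inefficient_rate * DAILY_DISEMBARKING
print(f"Inefficient passengers in the observed sample: {inefficient_rate * OBSERVED:.0f}")
print(f"Projected inefficient passengers per day: {projected:.0f}")
```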
Changes to this page (author):
12 May 2008: Modified
26 Jun 2007: Modified
26 Jun 2007: Added
28 Apr 2003: Added
Page maintainer: The Editorial Team