Number of co-authors: 8
Number of publications with 3 favourite co-authors: Nigel Bevan: 3, M. Corbett: 1, M. Kelly: 1
Miles Macleod's 3 most productive colleagues in number of publications: Nigel Bevan: 3, Ralph Rengger: 2, M. Kelly: 2
Publications by Miles Macleod (bibliography)
Macleod, Miles, Bowden, Rosemary, Bevan, Nigel and Curson, Ian (1997): The MUSiC Performance Measurement Method. In Behaviour and Information Technology, 16 (4) pp. 279-293.
This paper reports a method for measuring usability in terms of task performance -- achievement of frequent and critical task goals by particular users in a context simulating the work environment. The terms usability and quality in use are defined in international standards as the effectiveness, efficiency and satisfaction with which goals are achieved in a specific context of use. The performance measurement method gives measures which, in combination with measures of satisfaction, operationalize these definitions. User performance is specified and assessed by measures including task effectiveness (the quantity and quality of task performance) and user efficiency (effectiveness divided by task time). Measures are obtained with users performing tasks in a context of evaluation which matches the intended context of use. This can also reveal usability problems which may not become evident if the evaluator interacts with the user. The method is supported by tools which make it practical in commercial timescales. The method has been widely applied in industry, and can be adapted for use early in design, and to evaluate non-computer products and the performance of small work groups.
© All rights reserved Macleod et al. and/or Taylor and Francis
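The abstract above defines user efficiency as effectiveness divided by task time, with effectiveness combining the quantity and quality of task performance. A minimal sketch of these measures, assuming a simple product of percentages for effectiveness (the exact combination rule is not given in the abstract and may differ from the MUSiC method):

```python
# Illustrative sketch of the MUSiC-style measures described above.
# Combining quantity and quality as a product of percentages is an
# assumption here, not necessarily the method's actual formula.

def task_effectiveness(quantity_pct: float, quality_pct: float) -> float:
    """Combine quantity and quality of task output (both 0-100%)."""
    return (quantity_pct * quality_pct) / 100.0

def user_efficiency(effectiveness: float, task_time_min: float) -> float:
    """Effectiveness achieved per unit of task time, as the abstract defines it."""
    return effectiveness / task_time_min

eff = task_effectiveness(quantity_pct=90.0, quality_pct=80.0)  # 72.0
print(user_efficiency(eff, task_time_min=12.0))                # 6.0
```

Such performance measures would be combined with satisfaction measures to operationalize the standards' definitions.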
Bevan, Nigel and Macleod, Miles (1994): Usability Measurement in Context. In Behaviour and Information Technology, 13 (1) pp. 132-145.
Different approaches to the measurement of usability are reviewed and related to definitions of usability in international standards. It is concluded that reliable measures of overall usability can only be obtained by assessing the effectiveness, efficiency and satisfaction with which representative users carry out representative tasks in representative environments. This requires a detailed understanding of the context of use of a product. The ESPRIT MUSiC project has developed tools which can be used to measure usability in the laboratory and the field. An overview is given of the methods and tools for measuring user performance, cognitive workload, and user perceived quality.
© All rights reserved Bevan and Macleod and/or Taylor and Francis
Macleod, Miles and Bevan, Nigel (1993): MUSiC Video Analysis and Context Tools for Usability Measurement. In: Ashlund, Stacey, Mullet, Kevin, Henderson, Austin, Hollnagel, Erik and White, Ted (eds.) Proceedings of the ACM CHI 93 Human Factors in Computing Systems Conference April 24-29, 1993, Amsterdam, The Netherlands. p. 55.
Analysis of interaction between users and a system, based on video-assisted observation, can provide a highly informative and effective means of evaluating usability. To obtain valid and reliable results, the people observed should be representative users performing representative work tasks in appropriate circumstances, and the analysis should be methodical. The MUSiC Performance Measurement Method (PMM) -- developed at NPL as part of the ESPRIT Project MUSiC: Metrics for Usability Standards in Computing -- provides a validated method for making and analysing such video recordings to derive performance-based usability metrics. PMM is supported by the DRUM software tool which greatly speeds up analysis of video, and helps manage evaluations.
© All rights reserved Macleod and Bevan and/or ACM Press
Macleod, Miles and Rengger, Ralph (1993): The Development of DRUM: A Software Tool for Video-Assisted Usability Evaluation. In: Alty, James L., Diaper, Dan and Guest, D. (eds.) Proceedings of the Eighth Conference of the British Computer Society Human Computer Interaction Specialist Group - People and Computers VIII August 7-10, 1993, Loughborough University, UK. pp. 293-309.
The development is reported of a practical software tool which supports video-assisted observational evaluation of usability. The Diagnostic Recorder for Usability Measurement (DRUM) helps evaluators to organise and analyse user-based evaluations, and to deliver measures and diagnostic data. This paper reports DRUM's rationale, theoretical background, requirements capture and collaborative iterative development. It outlines DRUM's functionality and manner of use. DRUM runs on Apple Macintosh, drives a range of video machines, and supports management of evaluation data, task analysis, video mark-up and logging (with find and replay of logged events), analysis of logged data and calculation of metrics.
© All rights reserved Macleod and Rengger and/or Cambridge University Press
Corbett, M., Macleod, Miles and Kelly, M. (1993): Quantitative Usability Evaluation -- The ESPRIT MUSiC Project. In: Proceedings of the Fifth International Conference on Human-Computer Interaction 1993. pp. 313-318.
This paper presents an overview of the ESPRIT Project 5429 MUSiC -- Metrics for Usability Standards in Computing. The driving force for this project was the recognition of the industry need for effective tools and techniques to assess usability. The participants in this project have successfully developed a series of methods and tools in the four areas of: analytic measures, performance measures, cognitive workload measures and user attitude measures. In addition the importance of context in usability assessment, as recognised in ISO 9241 has formed a key component in the development of all MUSiC methods.
© All rights reserved Corbett et al. and/or Elsevier Science
Kornbrot, Diana and Macleod, Miles (1990): Monitoring and Analysis of Hypermedia Navigation. In: Diaper, Dan, Gilmore, David J., Cockton, Gilbert and Shackel, Brian (eds.) INTERACT 90 - 3rd IFIP International Conference on Human-Computer Interaction August 27-31, 1990, Cambridge, UK. pp. 401-406.
The use of an interaction monitoring tool in conjunction with commercial spreadsheet and statistical packages is described. The tool was used to monitor and analyse M.Sc. students' use of a hypermedia system with multiple navigation structures to study course content. The final product of the analysis is a description of the navigation routes and methods used by individual students to acquire information from the courseware. Post hoc, students were clearly separable into those who performed relatively more, and those who performed relatively less, actions per minute. These two groups were also different in terms of their use of the available navigation structures and the content they chose to visit. The role of high level monitoring tools and associated analysis packages in evaluating hypermedia material, and in answering questions about human learning, is discussed.
© All rights reserved Kornbrot and Macleod and/or North-Holland
Macleod, Miles and Tillson, Penelope (1990): Pull-Down, HoldDown, or StayDown? A Theoretical and Empirical Comparison of Three Menu Designs. In: Diaper, Dan, Gilmore, David J., Cockton, Gilbert and Shackel, Brian (eds.) INTERACT 90 - 3rd IFIP International Conference on Human-Computer Interaction August 27-31, 1990, Cambridge, UK. pp. 429-434.
Pull-down menus can be cumbersome to use when making multiple choices, as they become hidden after each choice. They may also be criticized for paucity of feedback about choices made. This paper considers two alternative designs, which help overcome these shortcomings: a menu which can be set to stay visible until closed by the user; and a menu which can be held in view while required, by pressing a 'hold' key. The user actions required by these design alternatives are evaluated theoretically, with the help of user action notations, and predictions generated about some aspects of usability. The implementation in HyperCard of working, self-monitoring prototypes is described. An empirical comparison for usability of the implemented designs is reported, where the StayDown and HoldDown menus were found to be significantly faster than a pull-down menu for making multiple choices, and to be subjectively preferred, especially for their enhanced feedback about currently chosen attributes.
© All rights reserved Macleod and Tillson and/or North-Holland
Macleod, Miles (1989): Direct Manipulation Prototype User Interface Monitoring. In: Sutcliffe, Alistair G. and Macauley, Linda (eds.) Proceedings of the Fifth Conference of the British Computer Society Human Computer Interaction Specialist Group - People and Computers V August 5-8, 1989, University of Nottingham, UK. pp. 395-407.
A simple automated technique is described for monitoring interaction between users and computer programs with direct manipulation user interfaces, implemented using HyperCard. With its set of easily tailorable interface components, HyperCard can be used as a prototyping tool to construct direct manipulation user interfaces, for the purpose of comparing design alternatives. The work described here makes available an additional means for their evaluation. Interaction is recorded as actions on discrete interface objects (e.g., buttons, menu items, fields), rather than at the level of mouse coordinates and pixels. This grain of analysis provides a readily interpretable record, with the potential of being matched against predictions derived from formalisable interaction task models. A log of actions and times is created unobtrusively during interaction, and may be inspected or written to an external text file when desired. Two stages in the development of AutoMonitors (systems which monitor themselves) are outlined. Firstly, the construction of an AutoMonitor. Secondly, the implementation of a software device which can convert HyperCard programs into AutoMonitors, without additional programming effort. Conversion involves the automated modification of the code attached to each interface object, and the grafting on of a user interface for the monitor itself. The design, and automated installation, of a system for recording users' and experimenters' comments is also described.
© All rights reserved Macleod and/or Cambridge University Press
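The 1989 paper above records interaction as actions on discrete interface objects (buttons, menu items, fields) with times, rather than as raw mouse coordinates. A hypothetical sketch of that object-level logging idea, in Python rather than HyperCard; the names and structure are illustrative, not the AutoMonitor design:

```python
# Hypothetical sketch of object-level interaction logging as described
# above: each event names an interface object and an action, with a
# timestamp, yielding a readily interpretable record. Not the actual
# AutoMonitor implementation, which was built in HyperCard.
import time


class ActionLog:
    def __init__(self):
        self.events = []  # list of (timestamp, object_name, action)

    def record(self, object_name: str, action: str) -> None:
        """Append one interface-object event, logged unobtrusively."""
        self.events.append((time.time(), object_name, action))

    def to_text(self) -> str:
        """Render the log for inspection or export to a text file."""
        return "\n".join(
            f"{t:.3f}\t{obj}\t{act}" for t, obj, act in self.events
        )


log = ActionLog()
log.record("OK button", "click")
log.record("Name field", "edit")
print(log.to_text())
```

Logging at this grain is what lets the record be matched against predictions from task models, as the abstract notes.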
Changes to this page (author):
19 Feb 2010: Modified
28 Jun 2007: Added
28 Apr 2003: Added
Page maintainer: The Editorial Team