Ann Blandford

Professor

Personal Homepage: http://www.ucl.ac.uk/uclic/people/a_blandford/
Employer: University College London
Email: a.blandford@ucl.ac.uk

Ann Blandford is Professor of Human-Computer Interaction in the Department of Computer Science at University College London, and served as Director of UCL Interaction Centre (UCLIC) from 2004 to 2011. Her teaching includes User-Centred Evaluation Methods on the MSc in HCI with Ergonomics at UCL. She started her career in industry as a software engineer, but soon moved into academia, where she developed a focus on the use and usability of computer systems. Ann leads research projects on human error and on interacting with information, with a focus on modelling situated interactions. In particular, she leads an EPSRC Platform Grant on Interactive Systems in Healthcare and an EPSRC Programme Grant, CHI+MED, on Human-Computer Interaction for Medical Devices. She has been technical programme chair for several conferences, most recently NordiCHI 2010. See http://www.ucl.ac.uk/uclic/people/a_blandford/ for more detail.

Publication Statistics

Publication period start: 1993
Publication period end: 2014
Number of co-authors: 93

Publications

Rajkomar, Atish, Blandford, Ann, Mayer, Astrid (2013): Coping with complexity in home hemodialysis: a fresh perspective on time as a medium of Di. In Cognition, Technology & Work. http://link.springer.com/article/10.1007%2Fs10111-013-0263-x

Blandford, Ann (2013): Eliciting People's Conceptual Models of Activities and Systems. In International Journal of Conceptual Structures and Smart Applications, 1 (1). http://www.igi-global.com/article/eliciting-peoples-conceptual-models-of-activities-and-systems/80380

Makri, Stephann, Blandford, Ann (2012): Coming across information serendipitously – Part 1: A process model. In Journal of Documentation, 68 (5) pp. 684-705. http://www.emeraldinsight.com/journals.htm?articleid=17051067

Diriye, Abdigani, Tombros, Anastasios, Blandford, Ann (2012): A little interaction can go a long way: enriching the query formulation process. In: Proceedings of the 2012 BCS-IRSG European Conference on Information Retrieval , 2012, . pp. 531-534. http://dx.doi.org/10.1007/978-3-642-28997-2_57

Kamsin, Amirrudin, Blandford, Ann, Cox, Anna L. (2012): Personal task management: my tools fall apart when I'm very busy!. In: CHI12 Extended Abstracts on Human Factors in Computing Systems May 5-10, 2012, Austin, USA. pp. 1369-1374. http://dl.acm.org/citation.cfm?id=2212457

Rajkomar, Atish, Blandford, Ann (2012): Understanding infusion administration in the ICU through Distributed Cognition. In Journal of Biomedical Informatics, 45 (3) pp. 580-590. http://www.j-biomed-inform.com/article/S1532-0464(12)00024-X/abstract

Furniss, Dominic, Blandford, Ann, Curzon, Paul (2011): Confessions from a grounded theory PhD: experiences and lessons learnt. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems , 2011, . pp. 113-122. http://dx.doi.org/10.1145/1978942.1978960

Vincent, Chris, Blandford, Ann (2011): Designing for Safety and Usability: User-Centered Techniques in Medical Device Design Prac. In: Proceedings of the Human Factors and Ergonomics Society 55th Annual Meeting , 2011, . pp. 793-797. http://pro.sagepub.com/content/55/1/793

Attfield, Simon, Blandford, Ann (2011): Making Sense of Digital Footprints in Team-Based Legal Investigations: The Acquisition of . In Human Computer Interaction, 26 (1) pp. 38-71. http://www.tandfonline.com/doi/abs/10.1080/07370024.2011.556548

Furniss, Dominic, Blandford, Ann, Mayer, Astrid (2011): Unremarkable errors: low-level disturbances in infusion pump use. In: Proceedings of the 25th BCS Conference on Human-Computer Interaction BCS-HCI 11 July 4-8, 2011, Newcastle, United Kingdom. pp. 197-204. http://dl.acm.org/citation.cfm?id=2305353

Blandford, Ann, Pietro, Giuseppe De, Gallo, Luigi, Gimblett, Andy, Oladimeji, Patrick, Thimbleby, Harold (2011): Engineering interactive computer systems for medicine and healthcare (EICS4Med). In: ACM SIGCHI 2011 Symposium on Engineering Interactive Computing Systems , 2011, . pp. 341-342. http://dx.doi.org/10.1145/1996461.1996556

Sutcliffe, Alistair G., Blandford, Ann (2010): Guest Editors' Introduction. In Interacting with Computers, 22 (1) pp. 1-2. http://www.sciencedirect.com/science/article/B6V0D-4XV5NX9-1/2/d69d62dafb6b75fe62fe10331a45a83c

Hiltz, Kimberley, Back, Jonathan, Blandford, Ann (2010): The roles of conceptual device models and user goals in avoiding device initialization err. In Interacting with Computers, 22 (5) pp. 363-374. http://www.sciencedirect.com/science/article/B6V0D-4Y6S7J6-1/2/ac929da4800c223e9d815538825596fe

Makri, Stephann, Blandford, Ann, Cox, Anna L. (2010): This is what i'm doing and why: reflections on a think-aloud study of dl users' informatio. In: JCDL10 Proceedings of the 2010 Joint International Conference on Digital Libraries , 2010, . pp. 349-352. http://doi.acm.org/10.1145/1816123.1816177

Sulaiman, Suziah, Blandford, Ann, Cairns, Paul (2010): Haptic experience and the design of drawing interfaces. In Interacting with Computers, 22 (3) pp. 193-205. http://www.sciencedirect.com/science/article/B6V0D-4XVRYK8-1/2/3320470ee245f03da57242fcc601d25f

Hassard, Stephen T., Blandford, Ann, Cox, Anna L. (2009): Analogies in design decision-making. In: Proceedings of the HCI09 Conference on People and Computers XXIII , 2009, . pp. 140-148. http://doi.acm.org/10.1145/1671011.1671027

Diriye, Abdigani, Blandford, Ann, Tombros, Anastasios (2009): A polyrepresentational approach to interactive query expansion. In: JCDL09 Proceedings of the 2009 Joint International Conference on Digital Libraries , 2009, . pp. 217-220. http://doi.acm.org/10.1145/1555400.1555434

Smith, Penn, Blandford, Ann, Back, Jonathan (2009): Questioning, exploring, narrating and playing in the control room to maintain system safet. In Cognition Technology and Work, 11 (4) pp. 279-291. http://link.springer.com/article/10.1007%2Fs10111-008-0116-1

Blandford, Ann, Green, T. R. G., Furniss, Dominic, Makri, Stephann (2008): Evaluating system utility and conceptual fit using CASSM. In International Journal of Human-Computer Studies, 66 (6) pp. 393-409. http://www.sciencedirect.com/science/article/B6WGR-4R6B2TF-1/2/5f43b203d9e253553384df9ec560de6f

Blandford, Ann, Curzon, Paul, Hyde, Joanne, Papatzanis, George (2008): EMU in the Car: Evaluating Multimodal Usability of a Satellite Navigation System. In: Graham, T. C. Nicholas, Palanque, Philippe A. (eds.) DSV-IS 2008 - Interactive Systems. Design, Specification, and Verification, 15th International Workshop July 16-18, 2008, Kingston, Canada. pp. 1-14. http://dx.doi.org/10.1007/978-3-540-70569-7_1

Makri, Stephann, Blandford, Ann, Cox, Anna L. (2008): Using Information Behaviors to Evaluate the Functionality and Usability of Electronic Reso. In JASIST - Journal of the American Society for Information Science and Technology, 59 (14) pp. 2244-2267. http://eprints.ucl.ac.uk/103915/

Attfield, Simon, Blandford, Ann, Dowell, John, Cairns, Paul (2008): Uncertainty-tolerant design: Evaluating task performance and drag-and-link information gat. In International Journal of Human-Computer Studies, 20 (6) pp. 410-424. http://www.sciencedirect.com/science/article/B6WGR-4RC2S4S-1/2/198cc6a41987a6ca961dc63f3869bfba

Attfield, Simon, Fegan, Sarah, Blandford, Ann (2008): Idea Generation and Material Consolidation: Tool Use and Intermediate Artefacts in Journal. In Cognition, Technology & Work, 11 (3) pp. 227-239. http://www.uclic.ucl.ac.uk/people/s.attfield/saabsfIdeaGen.pdf

Stelmaszewska, Hanna, Fields, Bob, Blandford, Ann (2008): The Roles of Time, Place, Value and Relationships in Collocated Photo Sharing with Camera . In: Proceedings of the HCI08 Conference on People and Computers XXII , 2008, . pp. 141-150. http://www.bcs.org/server.php?show=ConWebDoc.21364

Blandford, Ann, Adams, Anne, Attfield, Simon, Buchanan, George, Gow, Jeremy, Makri, Stephann, Rimmer, Jon, Warwick, Claire (2008): The PRET A Rapporter framework: Evaluating digital libraries from the perspective of infor. In Information Processing & Management, 44 (1) pp. 4-21. http://dl.acm.org/citation.cfm?id=1315074

Back, Jonathan, Blandford, Ann, Curzon, Paul (2007): Recognising Erroneous and Exploratory Interactions. In: Baranauskas, Maria Cecília Calani, Palanque, Philippe A., Abascal, Julio, Barbosa, Simone Diniz Junqueira (eds.) DEGAS 2007 - Proceedings of the 1st International Workshop on Design and Evaluation of e-Government Applications and Services September 11th, 2007, Rio de Janeiro, Brazil. pp. 127-140. http://dx.doi.org/10.1007/978-3-540-74800-7_10

Furniss, Dominic, Blandford, Ann, Curzon, Paul (2007): Usability evaluation methods in practice: understanding the context in which they are embe. In: Brinkman, Willem-Paul, Ham, Dong-Han, Wong, B. L. William (eds.) ECCE 2007 - Proceedings of the 14th European Conference on Cognitive Ergonomics August 28-31, 2007, London, UK. pp. 253-256. http://doi.acm.org/10.1145/1362550.1362602

Faisal, Sarah, Cairns, Paul A., Blandford, Ann (2007): Building for Users not for Experts: Designing a Visualization of the Literature Domain. In: IV 2007 - 11th International Conference on Information Visualisation 2-6 July, 2007, Zürich, Switzerland. pp. 707-712. http://doi.ieeecomputersociety.org/10.1109/IV.2007.32

Back, Jonathan, Blandford, Ann, Curzon, Paul (2007): Slip errors and cue salience. In: Brinkman, Willem-Paul, Ham, Dong-Han, Wong, B. L. William (eds.) ECCE 2007 - Proceedings of the 14th European Conference on Cognitive Ergonomics August 28-31, 2007, London, UK. pp. 221-224. http://doi.acm.org/10.1145/1362550.1362595

Makri, Stephann, Blandford, Ann, Gow, Jeremy, Rimmer, Jon, Warwick, Claire, Buchanan, George (2007): A library or just another information resource? A case study of users' mental models of t. In JASIST - Journal of the American Society for Information Science and Technology, 58 (3) pp. 433-445. http://dx.doi.org/10.1002/asi.20510

Blandford, Ann, Gow, Jeremy, Buchanan, George, Warwick, Claire, Rimmer, Jon (2007): Creators, Composers and Consumers: Experiences of Designing a Digital Library. In: Baranauskas, Maria Cecília Calani, Palanque, Philippe A., Abascal, Julio, Barbosa, Simone Diniz Junqueira (eds.) DEGAS 2007 - Proceedings of the 1st International Workshop on Design and Evaluation of e-Government Applications and Services September 11th, 2007, Rio de Janeiro, Brazil. pp. 239-242. http://dx.doi.org/10.1007/978-3-540-74796-3_23

Buchanan, George, Gow, Jeremy, Blandford, Ann, Rimmer, Jon, Warwick, Claire (2007): Representing aggregate works in the digital library. In: JCDL07: Proceedings of the 7th ACM/IEEE-CS Joint Conference on Digital Libraries , 2007, . pp. 247-256. http://doi.acm.org/10.1145/1255175.1255224

Blandford, Ann, Keith, Suzette, Butterworth, Richard, Fields, Bob, Furniss, Dominic (2007): Disrupting digital library development with scenario informed design. In Interacting with Computers, 19 (1) pp. 70-82. http://dx.doi.org/10.1016/j.intcom.2006.07.003

Blandford, Ann, Benedyk, Rachel, Berthouze, Nadia, Cox, Anna Louise, Dowell, John (2007): The Challenges of Creating Connections and Raising Awareness: Experience from UCLIC. In: Baranauskas, Maria Cecília Calani, Palanque, Philippe A., Abascal, Julio, Barbosa, Simone Diniz Junqueira (eds.) DEGAS 2007 - Proceedings of the 1st International Workshop on Design and Evaluation of e-Government Applications and Services September 11th, 2007, Rio de Janeiro, Brazil. pp. 682-683. http://dx.doi.org/10.1007/978-3-540-74800-7_81

Green, T. R. G., Blandford, Ann, Church, L., Roast, Chris R., Clarke, S. (2006): Cognitive dimensions: Achievements, new directions, and open questions. In J. Vis. Lang. Comput., 17 (4) pp. 328-365. http://dx.doi.org/10.1016/j.jvlc.2006.04.004

Furniss, Dominic, Blandford, Ann (2006): Understanding Emergency Medical Dispatch in terms of Distributed Cognition: a case study. In Ergonomics, 49 (12) pp. 1174-1203. http://www.uclic.ucl.ac.uk/annb/docs/dfabErgopreprint.pdf

Blandford, Ann, Keith, Suzette, Fields, Bob (2006): Claims Analysis. In International Journal of Human-Computer Interaction, 21 (2) pp. 197-218. http://www.leaonline.com/doi/abs/10.1207/s15327590ijhc2102_5

Ruksenas, Rimvydas, Curzon, Paul, Back, Jonathan, Blandford, Ann (2006): Formal Modelling of Cognitive Interpretation. In: Doherty, Gavin, Blandford, Ann (eds.) DSV-IS 2006 - Interactive Systems. Design, Specification, and Verification, 13th International Workshop July 26-28, 2006, Dublin, Ireland. pp. 123-136. http://dx.doi.org/10.1007/978-3-540-69554-7_10

Back, J., Cheng, W. L., Dann, R., Curzon, P., Blandford, Ann (2006): Does Being Motivated to Avoid Procedural Errors Influence Their Systematicity?. In: Proceedings of the HCI06 Conference on People and Computers XX , 2006, . pp. 151-158.

Chozos, Nick, Sheridan, Jennifer G., Mehmet, Özcan, Naghsh, Amir, Lee, Kwang Chun, Blandford, Ann (2005): Supporting Values Other Than Usability and Performance Within the Design Process. In: Gilroy, Stephen W., Harrison, Michael D. (eds.) DSV-IS 2005 - Interactive Systems, Design, Specification, and Verification, 12th International Workshop July 13-15, 2005, Newcastle upon Tyne, UK. pp. 262-263. http://dx.doi.org/10.1007/11752707_23

Adams, Anne, Blandford, Ann, Lunt, Peter (2005): Social Empowerment and Exclusion: A case study on Digital Libraries. In ACM Transactions on CHI, 12 (2) pp. 174-200. http://www.uclic.ucl.ac.uk/annb/docs/aaabplToCHIpreprint.pdf

Blandford, Ann, Furniss, Dominic (2005): DiCoT: A Methodology for Applying Distributed Cognition to the Design of Teamworking Syste. In: Gilroy, Stephen W., Harrison, Michael D. (eds.) DSV-IS 2005 - Interactive Systems, Design, Specification, and Verification, 12th International Workshop July 13-15, 2005, Newcastle upon Tyne, UK. pp. 26-38. http://dx.doi.org/10.1007/11752707_3

Warwick, Claire, Rimmer, Jon, Blandford, Ann, Buchanan, George (2005): User centred interactive search in the humanities. In: JCDL05: Proceedings of the 5th ACM/IEEE-CS Joint Conference on Digital Libraries , 2005, . pp. 400. http://doi.acm.org/10.1145/1065385.1065503

Adams, Anne, Blandford, Ann (2005): Digital libraries' support for the user's 'information journey'. In: JCDL05: Proceedings of the 5th ACM/IEEE-CS Joint Conference on Digital Libraries , 2005, . pp. 160-169. http://doi.acm.org/10.1145/1065385.1065424

Blandford, Ann, Green, T. R. G., Connell, Iain (2005): Formalising an Understanding of User-System Misfits. In: Bastide, Remi, Palanque, Philippe A., Roth, Jorg (eds.) Engineering Human Computer Interaction and Interactive Systems, Joint Working Conferences EHCI-DSVIS 2004 July 11-13, 2005, Hamburg, Germany. pp. 253-270. http://dx.doi.org/10.1007/11431879_17

Adams, Anne, Blandford, Ann (2005): Bridging the gap between organizational and user perspectives of security in the clinical . In International Journal of Human-Computer Studies, 63 (1) pp. 175-202. http://dx.doi.org/10.1016/j.ijhcs.2005.04.022

Connell, Iain, Blandford, Ann, Green, T. R. G. (2004): CASSM and cognitive walkthrough: usability issues with ticket vending machines. In Behaviour and Information Technology, 23 (5) pp. 307-320. http://journalsonline.tandf.co.uk/openurl.asp?genre=article&eissn=1362-3001&volume=23&issue=5&spage=307

Blandford, Ann, Wong, B. L. William (2004): Situation awareness in emergency medical dispatch. In International Journal of Human-Computer Studies, 61 (4) pp. 421-452. http://dx.doi.org/10.1016/j.ijhcs.2003.12.012

Fields, B., Keith, S., Blandford, Ann (2004): Designing for Expert Information Finding Strategies. In: Proceedings of the HCI04 Conference on People and Computers XVIII , 2004, . pp. 89-102.

Blandford, Ann, Keith, Suzette, Connell, Iain, Edwards, Helen (2004): Analytical usability evaluation for digital libraries: a case study. In: JCDL04: Proceedings of the 4th ACM/IEEE-CS Joint Conference on Digital Libraries , 2004, . pp. 27-36. http://doi.acm.org/10.1145/996350.996360

Blandford, Ann, Butterworth, Richard, Curzon, Paul (2004): Models of interactive systems: a case study on programmable user modelling. In International Journal of Human-Computer Studies, 60 (2) pp. 149-200.

Buchanan, George, Blandford, Ann, Thimbleby, Harold, Jones, Matt (2004): Integrating information seeking and structuring: exploring the role of spatial hypertext i. In: Proceedings of the Fifteenth ACM Conference on Hypertext , 2004, . pp. 225-234. http://doi.acm.org/10.1145/1012807.1012864

Attfield, Simon, Blandford, Ann, Craft, Brock (2004): Task Embedded Visualisation: The Design for an Interactive IR Results Display for Journali. In: IV 2004 - 8th International Conference on Information Visualisation 14-16 July, 2004, London, UK. pp. 650-655. http://csdl.computer.org/comp/proceedings/iv/2004/2177/00/21770650abs.htm

Connell, I., Green, T., Blandford, Ann (2003): Ontological Sketch Models: Highlighting User-System Misfits. In: Proceedings of the HCI03 Conference on People and Computers XVII , 2003, . pp. 163-178.

Blandford, Ann, Connell, Iain (2003): Ontological Sketch Modelling (OSM): Concept-based Usability Analysis. In: Proceedings of IFIP INTERACT03: Human-Computer Interaction , 2003, Zurich, Switzerland. pp. 1021.

Thimbleby, Harold, Blandford, Ann, Cairns, P., Curzon, P., Jones, M. (2002): User Interface Design as Systems Design. In: Faulkner, Xristine, Finlay, Janet, Détienne, Françoise (eds.) Proceedings of the HCI02 Conference on People and Computers XVI September 18-20, 2002, Pisa, Italy. pp. 281-302.

Curzon, Paul, Blandford, Ann (2002): From a Formal User Model to Design Rules. In: Forbrig, Peter, Limbourg, Quentin, Urban, Bodo, Vanderdonckt, Jean M. (eds.) DSV-IS 2002 - Interactive Systems. Design, Specification, and Verification, 9th International Workshop June 12-14, 2002, Rostock, Germany. pp. 1-15. http://link.springer.de/link/service/series/0558/bibs/2545/25450001.htm

Blandford, Ann, Buchanan, George (2002): Usability for digital libraries. In: JCDL02: Proceedings of the 2nd ACM/IEEE-CS Joint Conference on Digital Libraries , 2002, . pp. 424. http://doi.acm.org/10.1145/544220.544374

Blandford, Ann, Rugg, Gordon (2002): A case study on integrating contextual information with analytical usability evaluation. In International Journal of Human-Computer Studies, 57 (1) pp. 75-99.

Wong, William B. L., Blandford, Ann (2002): Analysing Ambulance Dispatcher Decision Making: Trialling Emergent Themes Analysis. In: Proceedings of the HF2002 Human Factors Conference November 25-27, 2002, Melbourne, Australia. http://www.uclic.ucl.ac.uk/annb/docs/wwabfh2002.pdf

Blandford, Ann, Wong, William B. L., Connell, Iain, Green, Thomas (2002): Multiple Viewpoints On Computer Supported Team Work: A Case Study On Ambulance Dispatch. In: Faulkner, Xristine, Finlay, Janet, Détienne, Françoise (eds.) Proceedings of the HCI02 Conference on People and Computers XVI September 18-20, 2002, Pisa, Italy. pp. 139-156. http://www.uclic.ucl.ac.uk/annb/docs/abwwictgHCI02preprint.pdf

Curzon, Paul, Blandford, Ann, Butterworth, Richard, Bhogal, Ravinder (2002): Interaction design issues for car navigation systems. In: Sharp, Helen, Chalk, Pete, LePeuple, Jenny, Rosbottom, John (eds.) Proceeding of HCI 2002 September 2-6, 2002, London, United Kingdom. pp. 38-41. http://eprints.ucl.ac.uk/16834/1/16834.pdf

Curzon, Paul, Blandford, Ann (2001): Detecting Multiple Classes of User Errors. In: Little, Murray Reed, Nigay, Laurence (eds.) EHCI 2001 - Engineering for Human-Computer Interaction, 8th IFIP International Conference May 11-13, 2001, Toronto, Canada. pp. 57-72. http://link.springer.de/link/service/series/0558/bibs/2254/22540057.htm

Blandford, Ann, Stelmaszewska, Hanna, Bryan-Kinns, Nick (2001): Use of Multiple Digital Libraries: A Case Study. In: JCDL01: Proceedings of the 1st ACM/IEEE-CS Joint Conference on Digital Libraries , 2001, . pp. 179-188. http://www.acm.org/pubs/articles/proceedings/dl/379437/p179-blandford/p179-blandford.pdf

Blandford, Ann, Butterworth, R., Curzon, P. (2001): PUMA Footprints: Linking Theory and Craft Skill in Usability Evaluation. In: Proceedings of IFIP INTERACT01: Human-Computer Interaction , 2001, Tokyo, Japan. pp. 577-584.

Blandford, Ann, Green, T. R. G. (2001): Group and Individual Time Management Tools: What You Get is Not What You Need. In Personal and Ubiquitous Computing, 5 (4) pp. 213-230. http://springerlink.metapress.com/openurl.asp?genre=article&issn=1617-4917&volume=5&issue=4&spage=213

Butterworth, Richard, Blandford, Ann, Duke, David J. (1999): Using Formal Models to Explore Display-Based Usability Issues. In J. Vis. Lang. Comput., 10 (4) pp. 455-479.

Blandford, Ann, Shum, Simon Buckingham, Young, Richard M. (1998): Training Software Engineers in a Novel Usability Evaluation Technique. In International Journal of Human-Computer Studies, 49 (3) pp. 245-279.

Butterworth, Richard, Blandford, Ann (1998): The Role of Formal Proof in Modelling Interactive Behaviour. In: Markopoulos, Panos, Johnson, Peter (eds.) DSV-IS 1998 - Design, Specification and Verification of Interactive Systems98, Proceedings of the Fifth International Eurographics Workshop June 3-5, 1998, Abingdon, United Kingdom. pp. 87-101.

Blandford, Ann, Duke, David (1997): Integrating User and Computer System Concerns in the Design of Interactive Systems. In International Journal of Human-Computer Studies, 46 (5) pp. 653-679.

Blandford, Ann, Butterworth, Richard, Good, Jason (1997): Users as rational interacting agents: formalising assumptions about cognition and interact. In: Harrison, Michael D., Torres, Juan Carlos (eds.) DSV-IS 1997 - Design, Specification and Verification of Interactive Systems97, Proceedings of the Fourth International Eurographics Workshop June 4-6, 1997, Granada, Spain. pp. 45-60.

Bellotti, Victoria, Blandford, Ann, Duke, David, MacLean, Allan, May, Jon, Nigay, Laurence (1996): Interpersonal Access Control in Computer-Mediated Communications: A Systematic Analysis of. In Human-Computer Interaction, 11 (4) pp. 357-432.

Shum, Simon Buckingham, Blandford, Ann, Duke, David, Good, Jason, May, Jon, Paterno, Fabio, Young, Richard (1996): Multidisciplinary Modelling for User-Centred System Design: An Air-Traffic Control Case St. In: Sasse, Martina Angela, Cunningham, R. J., Winder, R. L. (eds.) Proceedings of the Eleventh Conference of the British Computer Society Human Computer Interaction Specialist Group - People and Computers XI August, 1996, London, UK. pp. 201-219.

Nigay, Laurence, Coutaz, Joëlle, Salber, Daniel, Blandford, Ann, May, Jon, Young, Richard M. (1995): Four Easy Pieces for Assessing the Usability of Multimodal Interaction: the CARE Propertie. In: Nordby, Knut (eds.) Proceedings of INTERACT 95 - IFIP TC13 Fifth International Conference on Human-Computer Interaction June 25-29, 1995, Lillehammer, Norway. pp. 115-120. http://iihm.imag.fr/publs/1995/Interact95_CARE.pdf

Blandford, Ann, Harrison, Michael, Barnard, Philip J. (1995): Using Interaction Framework to Guide the Design of Interactive Systems. In International Journal of Human-Computer Studies, 43 (1) pp. 101-130.

Harrison, Michael D., Blandford, Ann, Barnard, Philip J. (1994): Modelling Interactive Systems and Providing Task Relevant Information. In: Paterno, Fabio (eds.) DSV-IS 1994 - Design, Specification and Verification of Interactive Systems94, Proceedings of the First International Eurographics Workshop June 8-10, 1994, Bocca di Magra, Italy. pp. 267-277.

Blandford, Ann (1993): "Knowledge Negotiation," edited by R. Moyse and M. T. Elsom-Cook. In International Journal of Man-Machine Studies, 39 (6) pp. 1051-1057.

Blandford, Ann (1993): An Agent-Theoretic Approach to Computer Participation in Dialogue. In International Journal of Man-Machine Studies, 39 (6) pp. 965-998.

Blandford, Ann, Young, Richard M. (1993): Developing Runnable User Models: Separating the Problem Solving Techniques from the Domain. In: Alty, James L., Diaper, Dan, Guest, D. (eds.) Proceedings of the Eighth Conference of the British Computer Society Human Computer Interaction Specialist Group - People and Computers VIII August 7-10, 1993, Loughborough University, UK. pp. 111-121.

15.12 Commentary by Ann Blandford

Gilbert Cockton’s article on Usability Evaluation does a particularly good job of drawing out the history of “usability” and “user experience” (UX), and highlighting the limitations as well as the importance of a classical “usability” perspective. For several years, I taught a course called “Usability Evaluation Methods”, but I changed the name to “User-centred Evaluation Methods” because “usability” had somehow come to mean “the absence of bad” rather than “the presence of good”. Cockton argues that “user experience” is the more positive term, and we should clearly be aiming to deliver systems that have greater value than being “not bad”.

However, there remains an implicit assumption that evaluation is summative rather than formative. For example, he discusses the HEART measures of Happiness, Engagement, Adoption, Retention and Task success, and contrasts these with the PULSE measures. Used effectively, these can give a measure of the quality (or even the worth) of a system, alone or in the product ecologies of which it is a part. But they do not provide information for design improvement. A concern with the quantifiable, and with properties of evaluation methods such as reliability (e.g. Hertzum & Jacobsen, 2001), has limited our perspective on what is valuable about evaluation methods. Wixon (2003) argues that the most important feature of any method is its downstream utility: does the evaluation method yield insights that will improve the design? To deliver downstream utility, the method has to deliver insights not just about whether a product improves (for example) user happiness, but also why it improves happiness, and how the design could be changed to improve happiness even further (or reduce frustration, or whatever). This demands evaluation methods that can inform the design of next-generation products.
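
To make concrete why such measures are summative, the sketch below computes two HEART-style numbers (task success and seven-day retention) from a small interaction log. This is a minimal illustration only: the log format, field names and thresholds are my own assumptions, not part of the HEART framework.

```python
from datetime import datetime, timedelta

# Hypothetical interaction log: one record per task attempt.
# Field names and values are illustrative assumptions, not a real dataset.
log = [
    {"user": "u1", "success": True,  "when": datetime(2012, 5, 1)},
    {"user": "u1", "success": True,  "when": datetime(2012, 5, 9)},
    {"user": "u2", "success": False, "when": datetime(2012, 5, 2)},
    {"user": "u3", "success": True,  "when": datetime(2012, 5, 3)},
]

def task_success_rate(records):
    """Proportion of attempts that ended in success (a summative, HEART-style measure)."""
    return sum(r["success"] for r in records) / len(records)

def seven_day_retention(records):
    """Proportion of users seen again at least seven days after their first recorded use."""
    first_seen, returned = {}, set()
    for r in sorted(records, key=lambda r: r["when"]):
        user = r["user"]
        if user not in first_seen:
            first_seen[user] = r["when"]
        elif r["when"] - first_seen[user] >= timedelta(days=7):
            returned.add(user)
    return len(returned) / len(first_seen)

print(f"Task success:    {task_success_rate(log):.0%}")  # how good -- but not why
print(f"7-day retention: {seven_day_retention(log):.0%}")
```

The resulting percentages say how well the product is doing on those dimensions, but nothing in them explains why, or what to change; that is the gap that formative methods have to fill.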

Of course, no method stands alone: a method is simply a tool to be used by practitioners for a purpose. As Cockton notes, methods in practice are adopted and adapted by their users, so there is in a sense no such thing as a “method”, but a repertoire of resources that can be selected, adapted and applied, with more or less skill and insight, to yield findings that are more or less useful. To focus this selection and adaptation process, we have developed the Pret A Rapporter framework (Blandford et al, 2008a) for planning a study. The first important element of the framework is making explicit the obvious point that every study is conducted for a purpose, and that that purpose needs to be clear (whether it is formative or summative, focused or exploratory). The second important element is that every study has to work with the available resources and constraints: every evaluation study is an exercise in the art of the possible.

Every evaluation approach has a potential scope — purposes for which it is and is not well suited. For example, an interview study is not going to yield reliable findings about the details of people’s interactions with an interface (simply because people cannot generally recall such details), but might be a great way to find out people’s attitudes to a new technology; a GOMS study (John and Kieras, 1996) can reveal important points about task structure, and deliver detailed timing predictions for well structured tasks, but is not going to reveal much about user attitudes to a system; and a transaction log analysis will reveal what people did, but not why they did it.
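
As a concrete example of the GOMS point, here is a minimal Keystroke-Level-Model-style sketch (KLM being the simplest of the GOMS family discussed by John and Kieras, 1996). The operator times are illustrative values close to the classic published estimates, and the method sequence is a hypothetical form-filling task; a real analysis would use calibrated figures and a validated task decomposition.

```python
# Approximate KLM operator times in seconds (illustrative values in the region
# of the classic published estimates; a real analysis would calibrate these).
OPERATOR_TIME = {
    "K": 0.28,  # keystroke or button press
    "P": 1.10,  # point with a mouse to a target on screen
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental act of preparation
}

def predicted_time(method):
    """Sum operator times for an error-free, well-practised task sequence."""
    return sum(OPERATOR_TIME[op] for op in method)

# Hypothetical method: think, point to a field, click, type a 6-digit amount,
# then think, point to the confirm button and click it.
method = ["M", "P", "K"] + ["K"] * 6 + ["M", "P", "K"]
print(f"Predicted expert execution time: {predicted_time(method):.2f} s")
```

The prediction says how long a practised user should take on a well-structured task; it says nothing about attitudes, satisfaction or the reasons behind errors, which is exactly the scoping point above.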

Cockton draws a distinction between analytical and empirical methods, where analytical methods involve inspection of a system and empirical methods are based on usage. This is a good first approximation, but it hides some important differences between methods. Some analytical methods (such as Heuristic Evaluation or Expert Walkthrough) have no direct grounding in theory, but provide more or less support for the analyst (e.g. in the form of heuristics); others (including GOMS) have a particular theoretical basis, which typically both constrains the analyst (in terms of what issues the method can identify) and provides more support, yielding greater insight into the underlying causes of any issues identified, and hence a stronger basis for informing redesign. In a study of several different analytical methods (Blandford et al, 2008c), we found that methods with a clear theoretical underpinning yielded rich insights about a narrow range of issues (concerning system design, likely user misconceptions, how well the system fits the way users think about their activities, the quality of physical fit between user and system, or how well the system fits its context of use); methods such as Heuristic Evaluation, which do not have theoretical underpinnings, tend to yield insights across a broader range of issues, but also tend to focus more on the negative (what is wrong with a system) than the positive (what already works well, or how a system might be improved).

Cockton rightly emphasises the importance of context for assessing usability (or user experience); surprisingly little attention has been paid to developing methods that really assess how systems fit their users in their various contexts of use. In the context of e-commerce, such as his van hire example, it is widely recognised that the Total Customer Experience matters more than the UX of the website interface (e.g. Minocha et al, 2005): the website is one component of a broader system, and what matters is that the whole system works well for the customers (and also for the staff who have to work within it). The same is true in most contexts: the system has to perform well, it has to be usable and provide a positive user experience, but it also has to fit well into the context of use.

In different contexts, different criteria become prominent. For example, for a banking system, security is at least as important as usability, and having confidence in the security of the system is an important aspect of user experience. A few days ago, I was trying to set up a new standing order (i.e. regular payment from my bank account to a named payee) to pay annually at the beginning of the year ... but the online banking system would only allow me to set up a new standing order to make a payment in the next four months, even though it would permit payment to be annual. This was irritating, and a waste of time (as I tried to work out whether there was a way to force the system to accept a later date for first payment), but it did not undermine my confidence in the system, so I will continue to use it because in many other situations it provides a level of convenience that old-fashioned banking did not.

Cockton points out that there are many values that a system may offer other than usability. We have recently been conducting a study of home haemodialysis. We had expected basic usability to feature significantly in the study, but it does not: not because the systems are easy to use (they are not), but because the users have to be very well trained before they are able to dialyse at home, their lives depend on dialysis (so they are grateful to have access to such machines), and being able to dialyse at home improves their quality of life compared to having to travel to a dialysis centre several times a week. The value to users of usability is much lower than the values of quality of life and safety.

Particularly when evaluating use in context, there doesn’t have to be an either-or between analytical and empirical methods. In our experience, combining empirical studies (involving interviews and observations) with some form of theory-based analysis provides a way of generalising findings beyond the particular context that is being studied, while also grounding the evaluation in user data. If you do a situated study of (for example) a digital library in a hospital setting (Adams et al, 2005), it is difficult to assess how, or whether, the findings generalise to even a different hospital setting, never mind other contexts of use. Being able to apply a relevant theoretical lens (in this case, Communities of Practice) to the data gives at least some idea of what generalises and what doesn’t. In this case, the theory did not contribute to an understanding of usability per se, but to an understanding of how the deployment of the technology influenced its acceptance and take-up in practice. Similarly, in a study of an ambulance dispatch system (Blandford and Wong, 2004), a theory of situation awareness enabled us to reason about which aspects of the system design, and the way it was used in context, supported or hindered the situation awareness of control room staff. It was possible to apply an alternative theoretical perspective (Distributed Cognition) to the same context of use (ambulance dispatch) (Furniss and Blandford, 2006) to get a better understanding of how the technology design and workspace design contribute to the work of control room staff, including the ways that they coordinate their activity. By providing a semi-structured method (DiCoT) for conducting Distributed Cognition analyses of systems (Blandford and Furniss, 2006), we are encoding key aspects of the theory to make it easier for others to apply it (e.g. McKnight and Doherty, 2008), and we are also applying it ourselves to new contexts, such as an intensive care unit (Rajkomar and Blandford, in press). Even though particular devices are typically at the centre of these studies, they do not focus on classical usability of the device, or even on user experience as defined by Cockton, but on how the design of the device supports work in its context of use.

Another important aspect of use in context is how people think about their activities and how a device requires them to think about those activities. Green (1989) and others (Green et al, 2006) developed Cognitive Dimensions as a vocabulary for talking about the mismatch between the way that people conceptualise an activity and the way they can achieve their goals with a particular device; for example, Green proposes the term “viscosity” to capture the idea that something that is conceptually simple (e.g. inserting a new figure in a document) is practically difficult (requiring each subsequent figure to be renumbered systematically in many word processors). We went on to develop CASSM (Blandford et al, 2008b) as a method for systematically evaluating the quality of the conceptual fit between a system and its users. Where there are different classes of users of the same system, which you might regard as different personas, you are likely to find different qualities of fit (Blandford et al, 2002). CASSM contrasts with most established evaluation methods in being formative rather than summative; in focusing on concepts rather than procedures; in being a hybrid empirical-analytical approach; and in focusing on use in context rather than either usability or user experience as Cockton describes them. It is a method for evaluating how existing systems support their users in context, which is a basis for identifying future design opportunities to either improve those systems or deliver novel systems that address currently unmet needs. Evaluation should not be the end of the story: as Carroll and Rosson (1992) argue, systems and uses evolve over time, and evaluation of the current generation of products can be a basis for designing the next generation.
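
The viscosity example can be made concrete with a small sketch: the user wants to do one conceptually simple thing ("insert a figure"), but the representation forces a cascade of low-level changes (renumbering every later figure and every reference to it). The data structure below is a hypothetical illustration, not a model of any particular word processor.

```python
# A hypothetical document: figures in order, and cross-references by number.
figures = ["architecture", "results", "discussion"]           # Figures 1..3
references = {"architecture": 1, "results": 2, "discussion": 3}

def insert_figure(figures, references, position, name):
    """One conceptual action ('add a figure') that ripples through the document."""
    figures.insert(position, name)
    # Viscosity: every later figure number, and every reference to it, must change.
    for number, fig in enumerate(figures, start=1):
        references[fig] = number
    return figures, references

insert_figure(figures, references, 1, "method")  # insert as the new Figure 2
print(figures)      # ['architecture', 'method', 'results', 'discussion']
print(references)   # {'architecture': 1, 'results': 3, 'discussion': 4, 'method': 2}
```

A less viscous design would absorb the renumbering in the tool; a CASSM-style analysis is intended to surface exactly these mismatches between the user's concepts and the device's.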

This commentary has strayed some way from the classical definitions of usability as encapsulated in many of the standards, and cited by Cockton, to focus more on how to evaluate “quality in use”, or the “extent to which a product can be used by specified users to achieve specified goals” within their situated context of use. Cockton argues that “several evaluation and other methods may be needed to identify and relate a nexus of causes”. I would argue that CASSM and DiCoT are examples of formative methods that address this need, focusing on how products are used in context, and how an understanding of situated use can inform the design of future products. Neither is a silver bullet, but each contributes to the agenda Cockton outlines.

15.12.1 References

  • Adams, A., Blandford, A. & Lunt, P. (2005) Social empowerment and exclusion:  A case study on digital libraries. In ACM Transactions on CHI. 12.2. 174-200. DOI http://doi.acm.org/10.1145/1067860.1067863
  • Blandford, A., Adams, A., Attfield, S., Buchanan, G., Gow, J., Makri, S., Rimmer, J. & Warwick, C. (2008a) The PRET A Rapporter framework: Evaluating digital libraries from the perspective of information work. Information Processing & Management. 44. 4-21. DOI http://dx.doi.org/10.1016/j.ipm.2007.01.021
  • Blandford, A. & Furniss, D. (2006) DiCoT: a methodology for applying Distributed Cognition to the design of team working systems. In S. Gilroy & M. Harrison (Eds.) Proc. DSVIS 2005. Springer: LNCS 3941. 26-38. DOI http://dx.doi.org/10.1007/11752707_3
  • Blandford, A., Green, T. R. G., Furniss, D. & Makri, S. (2008b) Evaluating system utility and conceptual fit using CASSM. International Journal of Human-Computer Studies. 66. 393-409. DOI http://dx.doi.org/10.1016/j.ijhcs.2007.11.005
  • Blandford, A., Hyde, J. K., Green, T. R. G. & Connell, I. (2008c) Scoping Analytical Usability Evaluation Methods: A Case Study. Human Computer Interaction Journal. 23.3. 278-327. DOI 10.1080/07370020802278254
  • Blandford, A. & Wong, W. (2004) Situation Awareness in Emergency Medical Dispatch. International Journal of Human-Computer Studies. 61(4). 421-452. DOI http://dx.doi.org/10.1016/j.ijhcs.2003.12.012
  • Blandford, A. E., Wong, B. L. W., Connell, I. & Green, T. R. G. (2002) Multiple Viewpoints On Computer Supported Team Work: A Case Study On Ambulance Dispatch. In X. Faulkner, J. Finlay & F. Détienne (Eds.) People and Computers XVI: Proc. HCI’02. 139-156. Springer.
  • Carroll, J. M. & Rosson, M.B. (1992) Getting around the task-artifact cycle: how to make claims and design by scenario. ACM Transactions on Information Systems, 10(2), 181-212.
  • Furniss, D. & Blandford, A. (2006), Understanding Emergency Medical Dispatch in terms of Distributed Cognition: a case study. Ergonomics Journal. 49. 12/13. 1174-1203. DOI http://dx.doi.org/10.1080/00140130600612663
  • Green, T.R.G. (1989) Cognitive dimensions of notations. In R. Winder and A. Sutcliffe (Eds), People and Computers V. Cambridge University Press, 443-460.
  • Green, T.R.G., Blandford, A.E., Church, L., Roast, C.R. & Clarke, S. (2006) Cognitive dimensions: Achievements, new directions, and open questions, Journal of Visual Languages & Computing, Volume 17, Issue 4, Ten Years of Cognitive Dimensions, August 2006, Pages 328-365.
  • Hertzum, M., and Jacobsen, N.E. (2001) The Evaluator Effect: A Chilling Fact about Usability Evaluation Methods. International Journal of Human-Computer Interaction, 13(4), 421-443.
  • John, B. & Kieras, D.E. (1996) Using GOMS for user interface design and evaluation: which technique? ACM ToCHI 3.4. 287-319.
  • McKnight, J. and Doherty, G. (2008) Distributed cognition and mobile healthcare work. In Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction - Volume 2 (BCS-HCI '08), Vol. 2. British Computer Society, Swindon, UK, 35-38.
  • Minocha, S., Dawson, L., Blandford, A. & Millard, N. (2005) Providing value to customers in e-commerce environments: the customer’s perspective. In Contemporary Research in E-Marketing Vol 2. Idea Group Publishing. 119-146.
  • Rajkomar, A. & Blandford, A. (in press) Understanding Infusion Administration in the ICU through Distributed Cognition. Journal of Biomedical Informatics. To appear. DOI http://dx.doi.org/10.1016/j.jbi.2012.02.003
  • Wixon, D. (2003) Evaluating Usability Methods: Why the Current Literature Fails the Practitioner. Interactions July and August 2003. 28-34.

Doherty, Gavin, Blandford, Ann (eds.) DSV-IS 2006 - Interactive Systems. Design, Specification, and Verification, 13th International Workshop July 26-28, 2006, Dublin, Ireland.

Blandford, Ann, Gray, Philip D., Vanderdonckt, Jean (eds.) (2001): People and Computers XV - Interaction without Frontiers: Joint Proceedings of HCI 2001 and IHM 2001, Springer Verlag.

Keith, S., Blandford, Ann, Fields, Bob, Harrison, Michael (eds.) 12th International Conference on Engineering and Product Design Education EPDE10, 2010.

Blandford, Ann (2014): Semi-structured qualitative studies. In: Soegaard, Mads, Dam, Rikke Friis (eds). "The Encyclopedia of Human-Computer Interaction, 2nd Ed." The Interaction Design Foundation.