Publication statistics

Pub. period: 2003-2009
Pub. count: 12
Number of co-authors: 10



Co-authors

Number of publications with 3 favourite co-authors:

Erik Frokjaer: 5
Aran Lunzer: 2
Georgios Christou: 1

 

 

Productive colleagues

Kasper Hornbaek's 3 most productive colleagues in number of publications:

Effie Lai-Chong Law: 30
Jan Stage: 28
Yuzuru Tanaka: 14
 
 
 




Kasper Hornbaek

 

Publications by Kasper Hornbaek (bibliography)

2009
 

Christou, Georgios, Law, Effie Lai-Chong, Green, William and Hornbaek, Kasper (2009): Challenges in evaluating usability and user experience of reality-based interaction. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4811-4814.

This workshop aims to further the understanding of the challenges relating to the evaluation methods of usability and user experience that are specific to Reality-Based Interaction (RBI), and to identify effective practical responses to these challenges. The emergence of post-WIMP interfaces has led to new ways of interacting with technologies. However, there are still no integrated ways of evaluating the usability and user experience of these interfaces, and developers and designers are left to discover their own metrics and evaluation methods. This approach presents problems, in that the metrics used in each case may provide results that are neither valid nor meaningful. For this reason, the time is ripe to integrate the methods that have been developed for evaluating interfaces under the RBI umbrella. The measures and techniques will then be turned into a framework that enables designers of RBI interfaces to select appropriate existing methods and tools for systematically evaluating the usability and user experience of their prototypes and products. Reusing and adapting validated evaluation approaches not only avoids reinventing the wheel and wasting time, but also further improves and consolidates these approaches. Such a framework will also provide a basis for comparing designs of RBI interfaces in different application contexts.

© All rights reserved Christou et al. and/or ACM Press

2006
 

Jakobsen, Mikkel R. and Hornbaek, Kasper (2006): Evaluating a fisheye view of source code. In: Proceedings of ACM CHI 2006 Conference on Human Factors in Computing Systems 2006. pp. 377-386.

Navigating and understanding the source code of a program are highly challenging activities. This paper introduces a fisheye view of source code to a Java programming environment. The fisheye view aims to support a programmer's navigation and understanding by displaying those parts of the source code that have the highest degree of interest given the current focus. An experiment was conducted which compared the usability of the fisheye view with a common, linear presentation of source code. Sixteen participants performed tasks significantly faster with the fisheye view, although results varied depending on the task type. The participants generally preferred the interface with the fisheye view. We analyse participants' interaction with the fisheye view and suggest how to improve its performance. In the calculation of the degree of interest, we suggest emphasizing those parts of the source code that are semantically related to the programmer's current focus.

© All rights reserved Jakobsen and Hornbaek and/or ACM Press
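The degree-of-interest calculation the abstract refers to follows the classic fisheye idea from Furnas: an element's interest is its a priori importance minus its distance from the current focus, DOI(x | focus) = API(x) - D(x, focus). Below is a minimal Python sketch of that formula applied to source lines; the function names, weighting, and threshold are illustrative assumptions, not the implementation from the paper.

def degree_of_interest(line, focus_line, nesting_depth, weight=1.0):
    """DOI(x | focus) = API(x) - D(x, focus): a priori importance
    favours shallowly nested lines (e.g. class and method headers),
    while distance penalises lines far from the current focus."""
    api = -nesting_depth
    distance = weight * abs(line - focus_line)
    return api - distance

def visible_lines(num_lines, depths, focus_line, threshold=-10.0):
    # Keep lines whose interest exceeds a threshold; the rest are
    # elided, which is what produces the fisheye effect.
    return [i for i in range(num_lines)
            if degree_of_interest(i, focus_line, depths[i]) >= threshold]

# Example: a 200-line file with focus on line 120. In practice the
# per-line nesting depths would come from a parser; zeros are used
# here so the sketch runs on its own.
depths = [0] * 200
print(visible_lines(200, depths, focus_line=120))  # lines 110-130 survive

The paper's closing suggestion, emphasizing semantically related code, would correspond to lowering D(x, focus) for lines that reference the same identifiers as the focus.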

 

Norgaard, Mie and Hornbaek, Kasper (2006): What do usability evaluators do in practice?: an explorative study of think-aloud testing. In: Proceedings of DIS06: Designing Interactive Systems: Processes, Practices, Methods, & Techniques 2006. pp. 209-218.

Think-aloud testing is a widely employed usability evaluation method, yet its use in practice is rarely studied. We report an explorative study of 14 think-aloud sessions, the audio recordings of which were examined in detail. The study shows that immediate analysis of observations made in the think-aloud sessions is done only sporadically, if at all. When testing, evaluators seem to seek confirmation of problems that they are already aware of. During testing, evaluators often ask users about their expectations and about hypothetical situations, rather than about experienced problems. In addition, evaluators learn much about the usability of the tested system but little about its utility. The study shows how practical realities rarely discussed in the literature on usability evaluation influence sessions. We discuss implications for usability researchers and professionals, including techniques for fast-paced analysis and tools for capturing observations during sessions.

© All rights reserved Norgaard and Hornbaek and/or ACM Press

Cited in the following chapter: Semi-structured qualitative studies [/encyclopedia/semi-structured_qualitative_studies.html]


 
 

Hornbaek, Kasper (2006): Current practice in measuring usability: Challenges to usability studies and research. In International Journal of Human-Computer Studies, 64 (2) pp. 79-102.

How to measure usability is an important question in HCI research and user interface evaluation. We review current practice in measuring usability by categorizing and discussing usability measures from 180 studies published in core HCI journals and proceedings. The discussion distinguishes several problems with the measures, including whether they actually measure usability, whether they cover usability broadly, how they are reasoned about, and whether they meet recommendations on how to measure usability. In many studies, the choice of and reasoning about usability measures fall short of a valid and reliable account of usability as quality-in-use of the user interface being studied. Based on the review, we discuss challenges for studies of usability and for research into how to measure usability. The challenges are to distinguish and empirically compare subjective and objective measures of usability; to focus on developing and employing measures of learning and retention; to study long-term use and usability; to extend measures of satisfaction beyond post-use questionnaires; to validate and standardize the host of subjective satisfaction questionnaires used; to study correlations between usability measures as a means for validation; and to use both micro and macro tasks and corresponding measures of usability. In conclusion, we argue that increased attention to the problems identified and challenges discussed may strengthen studies of usability and usability research.

© All rights reserved Hornbaek and/or Academic Press
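One of the listed challenges, studying correlations between usability measures as a means for validation, can be made concrete in a few lines of Python. The sketch below correlates an objective measure (task time) with a subjective one (satisfaction); the data are invented purely for illustration.

from statistics import mean

def pearson(xs, ys):
    # Pearson's r: covariance of the two samples divided by the
    # product of their standard deviations.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

task_time = [42, 55, 38, 70, 61]   # objective measure: seconds per task
satisfaction = [6, 4, 7, 2, 3]     # subjective measure: 1-7 rating
print(round(pearson(task_time, satisfaction), 2))  # about -0.99

A strong correlation would suggest the two measures tap related aspects of usability; a weak one would support the paper's point that subjective and objective measures should be distinguished and reported separately.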

 

Lunzer, Aran and Hornbaek, Kasper (2006): RecipeSheet: creating, combining and controlling information processors. In: Proceedings of the ACM Symposium on User Interface Software and Technology 2006. pp. 145-154.

Many tasks require users to extract information from diverse sources, to edit or process this information locally, and to explore how the end results are affected by changes in the information or in its processing. We present the RecipeSheet, a general-purpose tool for assisting users in such tasks. The RecipeSheet lets users create information processors, called recipes, which may take input in a variety of forms such as text, Web pages, or XML, and produce results in a similar variety of forms. The processing carried out by a recipe may be specified using a macro or query language, of which we currently support Rexx, Smalltalk and XQuery, or by capturing the behaviour of a Web application or Web service. In the RecipeSheet's spreadsheet-inspired user interface, information appears in cells, with inter-cell dependencies defined by recipes rather than formulas. Users can also intervene manually to control which information flows through the dependency connections. Through a series of examples we illustrate how tasks that would be challenging in existing environments are supported by the RecipeSheet.

© All rights reserved Lunzer and Hornbaek and/or ACM Press
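As a rough illustration of the dataflow model the abstract describes, the Python sketch below implements cells whose values are recomputed by recipes when an input changes. The class and method names are assumptions made for this sketch, not the RecipeSheet API, and real recipes would wrap external processors such as Web services rather than a Python function.

class Cell:
    def __init__(self, value=None):
        self.value = value
        self.dependents = []  # recipes to re-run when this cell changes

    def set(self, value):
        self.value = value
        for recipe in self.dependents:
            recipe.run()

class Recipe:
    # An information processor: reads input cells, writes an output cell.
    def __init__(self, func, inputs, output):
        self.func, self.inputs, self.output = func, inputs, output
        for cell in inputs:
            cell.dependents.append(self)
        self.run()

    def run(self):
        self.output.set(self.func(*(c.value for c in self.inputs)))

# Example: results flow from one cell to another through a recipe, and
# updating the source re-runs the dependent recipe automatically.
source, result = Cell("hello"), Cell()
Recipe(str.upper, [source], result)
source.set("recipesheet")
print(result.value)  # RECIPESHEET

The abstract's point about manual intervention would then amount to letting users detach a dependency, i.e. remove a recipe from a cell's dependents list, so that stale but interesting results stay on the sheet.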

 

Hornbaek, Kasper and Stage, Jan (2006): The Interplay Between Usability Evaluation and User Interaction Design. In International Journal of Human-Computer Interaction, 21 (2) pp. 117-123.

Usability evaluations inform user interaction design in a relevant manner, and successful user interaction design can be attained through usability evaluation. These are obvious conjectures about a mature usability engineering discipline. Unfortunately, research and practice suggest that, in reality, the interplay between usability evaluation and user interaction design is significantly more complex and too often far from optimal. This article provides a simple model of the interplay between usability evaluation and user interaction design that captures their main relationships. Based on the model, the key challenges in improving the interplay between evaluation and design are outlined. The intention is to create a background against which the remainder of this special issue, containing five research articles presenting empirical data on the interplay between design and evaluation and a commentary, can be contrasted.

© All rights reserved Hornbaek and Stage and/or Lawrence Erlbaum Associates

2005
 

Hornbaek, Kasper and Frokjaer, Erik (2005): Comparing usability problems and redesign proposals as input to practical systems development. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 391-400.

Usability problems predicted by evaluation techniques are useful input to systems development; it is uncertain whether redesign proposals aimed at alleviating those problems are likewise useful. We present a study of how developers of a large web application assess usability problems and redesign proposals as input to their systems development. Problems and redesign proposals were generated by 43 evaluators using an inspection technique and think-aloud testing. Developers assessed redesign proposals to have higher utility in their work than usability problems. In interviews they explained how redesign proposals gave them new ideas for tackling well known problems. Redesign proposals were also seen as constructive and concrete input. Few usability problems were new to developers, but the problems supported prioritizing ongoing development of the application and taking design decisions. No developers, however, wanted to receive only problems or redesigns. We suggest developing and using redesign proposals as an integral part of usability evaluation.

© All rights reserved Hornbaek and Frokjaer and/or ACM Press

2004
 

Fujima, Jun, Lunzer, Aran, Hornbaek, Kasper and Tanaka, Yuzuru (2004): Clip, connect, clone: combining application elements to build custom interfaces for information access. In: Proceedings of the 2004 ACM Symposium on User Interface Software and Technology 2004. pp. 175-184.

Many applications provide a form-like interface for requesting information: the user fills in some fields, submits the form, and the application presents corresponding results. Such a procedure becomes burdensome if (1) the user must submit many different requests, for example in pursuing a trial-and-error search, (2) results from one application are to be used as inputs for another, requiring the user to transfer them by hand, or (3) the user wants to compare results, but only the results from one request can be seen at a time. We describe how users can reduce this burden by creating custom interfaces using three mechanisms: clipping of input and result elements from existing applications to form cells on a spreadsheet; connecting these cells using formulas, thus enabling result transfer between applications; and cloning cells so that multiple requests can be handled side by side. We demonstrate a prototype of these mechanisms, initially specialised for handling Web applications, and show how it lets users build new interfaces to suit their individual needs.

© All rights reserved Fujima et al. and/or ACM Press
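The cloning mechanism in particular lends itself to a small sketch: one parameterised request run over several input variations side by side, rather than re-submitted by hand. In the Python below the request function merely fabricates a label so the example is runnable; in the paper's prototype it would drive clipped elements of a real Web application, and its name is a hypothetical stand-in.

def flight_search(origin, destination):
    # Hypothetical stand-in for a clipped Web form.
    return f"results for {origin} -> {destination}"

def cloned(request, variations):
    # Run one request over many inputs, keeping the results side by
    # side for comparison: the essence of the 'clone' mechanism.
    return {inputs: request(*inputs) for inputs in variations}

# Example: compare three destinations at once instead of filling in
# and submitting the same form three times.
for inputs, result in cloned(flight_search,
                             [("CPH", "SFO"),
                              ("CPH", "NRT"),
                              ("CPH", "ZRH")]).items():
    print(inputs, "->", result)

'Connecting' would then be a matter of feeding one request's results into another request's inputs, the way cell formulas do on the sheet.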

 

Hornbaek, Kasper and Frokjaer, Erik (2004): Two psychology-based usability inspection techniques studied in a diary experiment. In: Proceedings of the Third Nordic Conference on Human-Computer Interaction October 23-27, 2004, Tampere, Finland. pp. 3-12.

Inspection techniques are widely used during systems design as a supplement to empirical evaluations of usability. Psychology-based inspection techniques could give important insights into how thinking shapes interaction, yet most inspection techniques do not explicitly consider users' thinking. We present an experiment comparing two psychology-based inspection techniques, cognitive walkthrough (CW) and metaphors of human thinking (MOT). Twenty participants evaluated web sites for e-commerce while keeping diaries of insights and problems experienced with the techniques. Using MOT, participants identified 30% more usability problems and achieved broader coverage of a reference collection of problems. Participants preferred using the metaphors, finding them broader in scope. An analysis of the diaries shows that participants find it hard to understand MOT, while CW limits the scope of their search for usability problems. Participants identified problems in many ways, not only through the techniques, reflecting large differences in individual working styles.

© All rights reserved Hornbaek and Frokjaer and/or ACM Press

 

Hornbaek, Kasper and Frokjaer, Erik (2004): Reading patterns and usability in visualizations of electronic documents. In Interactions, 11 (1) pp. 11-12.

 

Hornbaek, Kasper and Frokjaer, Erik (2004): Usability Inspection by Metaphors of Human Thinking Compared to Heuristic Evaluation. In International Journal of Human-Computer Interaction, 17 (3) pp. 357-374.

A new usability inspection technique based on metaphors of human thinking has been experimentally compared to heuristic evaluation (HE). The aim of metaphors of thinking (MOT) is to focus inspection on users' mental activity and to make inspection easily applicable to different devices and use contexts. Building on classical introspective psychology, MOT bases inspection on metaphors of habit formation, stream of thought, awareness and associations, the relation between utterances and thought, and knowing. An experiment was conducted in which 87 novices evaluated a large Web application, and its key developer assessed the problems found. Compared to HE, MOT uncovered usability problems that were assessed as more severe for users and also appeared more complex to repair. The evaluators using HE found more cosmetic problems. The time spent learning and performing an evaluation with MOT was shorter. A discussion of strengths and weaknesses of MOT and HE is provided, which shows how MOT can be an effective alternative or supplement to HE.

© All rights reserved Hornbaek and Frokjaer and/or Lawrence Erlbaum Associates

2003
 

Hornbaek, Kasper and Frokjaer, Erik (2003): Metaphors of Human Thinking: A New Tool in User Interface Design and Evaluation. In: Proceedings of IFIP INTERACT03: Human-Computer Interaction 2003, Zurich, Switzerland. p. 781.

 
 



Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/kasper_hornbaek.html
