Number of co-authors: 8
Number of publications with 3 favourite co-authors:
Janet C. Read: 3
Matthew Horton: 2
Barbara McManus: 2
Gavin Sim's 3 most productive colleagues in number of publications:
Janet C. Read: 35
Stuart MacFarlane: 11
Matthew Horton: 7
Publications by Gavin Sim (bibliography)
Sim, Gavin, Horton, Matthew and Danino, Nicky (2012): Evaluating game preference using the fun toolkit across cultures. In: Proceedings of the HCI12 Conference on People and Computers XXVI 2012. pp. 386-391.
Over the past decade many new evaluation methods have emerged for evaluating user experience with children, but the results of these studies have tended to be reported in isolation, and cultural implications have been largely ignored. This paper reports on a comparative analysis of the Fun Toolkit and the effect of culture on game preference. In total, 37 children aged between 7 and 9 participated in the study, drawn from schools in the UK and Jordan. The children played two different games on a tablet PC, and their experiences of each were captured using the Fun Toolkit. The results showed that culture did not appear to affect children's preference and that the Fun Toolkit is a valid user experience tool across cultures.
© All rights reserved Sim et al. and/or their publisher
Sim, Gavin and Read, Janet C. (2010): The Damage Index: an aggregation tool for usability problem prioritisation. In: Proceedings of the HCI10 Conference on People and Computers XXIV 2010. pp. 54-61.
The aggregation of usability problems is an integral part of a usability evaluation. Numerous problems can be revealed, and given that resources for fixing or redesigning the system are usually limited, prioritisation of the problem set is essential. This paper examines the prioritisation of usability problems from a single heuristic evaluation and multiple heuristic evaluations of Questionmark Perception, a computer-assisted assessment application widely used within educational institutions. Two different methods for prioritisation are critiqued: one based on severity ratings alone, the other on a Damage Index formula proposed by the authors. The results highlight how the ranking of problems differs depending on the approach taken. The Damage Index offers a method of systematically prioritising usability problems in a repeatable way, removing subjectivity from the process and thereby improving on reliance upon severity ratings alone.
© All rights reserved Sim and Read and/or BCS
Xu, Diana, Read, Janet C., Sim, Gavin and McManus, Barbara (2009): Experience it, draw it, rate it: capture children's experiences with their drawings. In: Proceedings of ACM IDC09 Interaction Design and Children 2009. pp. 266-270.
This paper investigates the use of drawings as a tool for the evaluation of children's interfaces. In the study, children's experiences on a variety of computer interfaces were captured in drawings. A group of four researchers participated in the coding of the drawings, before the results were aggregated and statistically analysed. The evaluation of the approach is positive: the chosen drawing method could be used easily and was effective in conveying the user experience from the drawings; a number of the drawings conveyed information pertaining to user experiences: fun (F), goal fit (GF) and tangible magic (TM); the method was found generally reliable at capturing all three elements and particularly reliable at capturing fun.
© All rights reserved Xu et al. and/or ACM Press
Xu, Diana Yifan, Read, Janet C., Sim, Gavin, McManus, Barbara and Qualter, Pam (2009): Children and 'smart' technologies: can children's experiences be interpreted and coded?. In: Proceedings of the HCI09 Conference on People and Computers XXIII 2009. pp. 224-231.
This paper focuses on young children and their emerging new technologies. It examines children's drawings as an evaluation tool for capturing their experiences of different novel interfaces. A recent evaluation study with children and two follow-up expert coding sessions were used to demonstrate how drawings could be used and coded, and how intercoder reliability could be improved. The usability and User Experience (UX) factors Fun (F), Goal Fit (GF) and Tangible Magic (TM) were included in the coding scheme, and these were the factors examined in the coding sessions. Our studies show the thoroughness and ease of use of the drawing method. The method was effective and reliable in conveying the user experience from the drawings. It also shows some of the limitations of the method: e.g. it is resource intensive and open to the evaluator's interpretation. A number of the drawings conveyed information pertaining to user experiences (F, GF and TM), and the method was particularly reliable at capturing fun. The results also revealed a correlation between GF and TM.
© All rights reserved Xu et al. and/or their publisher
MacFarlane, Stuart, Sim, Gavin and Horton, Matthew (2005): Assessing usability and fun in educational software. In: Proceedings of ACM IDC05: Interaction Design and Children 2005. pp. 103-109.
We describe an investigation into the relationship between usability and fun in educational software designed for children. Twenty-five children aged between 7 and 8 participated in the study. Several evaluation methods were used; some collected data from observers, and others collected reports from the users. Analysis showed that in both observational data, and user reports, ratings for fun and usability were correlated, but that there was no significant correlation between the observed data and the reported data. We discuss the possible reasons for these findings, and describe a method that was successful in eliciting opinions from young children about fun and usability.
© All rights reserved MacFarlane et al. and/or ACM Press
Changes to this page (author)
09 Nov 2012: Modified
03 Apr 2012: Modified
03 Nov 2010: Modified
26 Jun 2009: Modified
23 Jun 2007: Added
Page maintainer: The Editorial Team