- Eindhoven University of Technology (http://www.tue.nl/universiteit/kolom-2/faculteiten/industrial-design/)
Tilde Bekker is an associate professor in the Industrial Design department at the Eindhoven University of Technology. Her research interests include designing for playful interaction, and designing products for children and older adults. She leads and participates in research projects on playful interactions that examine how to persuade people to adopt a healthier lifestyle. She has over 75 publications in international, peer-reviewed journals and conference proceedings, and she is co-founder of the Interaction Design and Children conference series.
Barendregt, Wolmet, Bekker, Tilde (2011): Children may expect drag-and-drop instead of point-and-click. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems, 2011. pp. 1297-1302. http://dx.doi.org/10.1145/1979742.1979764
Bekker, Tilde, Antle, Alissa N. (2011): Developmentally situated design (DSD): making theoretical knowledge accessible to designers. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems, 2011. pp. 2531-2540. http://dx.doi.org/10.1145/1978942.1979312
Gilutz, Shuli, Bekker, Tilde, Fisch, Shalom, Blikstein, Paulo (2011): Teaching interaction design & children within diverse disciplinary curricula. In: Proceedings of ACM IDC11 Interaction Design and Children, 2011. pp. 257-259. http://dx.doi.org/10.1145/1999030.1999076
Fernaeus, Ylva, Holopainen, Jussi, Bekker, Tilde (2011): Please enjoy!?: 2nd workshop on playful experiences in mobile HCI. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services, 2011. pp. 745-748. http://dx.doi.org/10.1145/2037373.2037503
Romero, Natalia, Sturm, Janienke, Bekker, Tilde, Valk, Linda De, Kruitwagen, Sander (2010): Playful persuasion to support older adults' social and physical activities. In Interacting with Computers, 22 (6) pp. 485-495. http://www.sciencedirect.com/science/article/B6V0D-50YK829-1/2/0d0614876352b7bbf0a33f8c0c46b8ae
Hof, Lisa op't, Pee, Jente de, Sturm, Janienke, Bekker, Tilde, Verbeek, Jos (2010): Prolonged play with the ColorFlares: how does open-ended play behavior change over time? In: Proceedings of the 3rd International Conference Fun and Games, 2010. pp. 99-106. http://doi.acm.org/10.1145/1823818.1823829
Bekker, Tilde, Sturm, Janienke (2009): Stimulating physical and social activity through open-ended play. In: Proceedings of ACM IDC09 Interaction Design and Children, 2009. pp. 309-312. http://doi.acm.org/10.1145/1551788.1551869
Bekker, Tilde, Sturm, Janienke, Wesselink, Rik, Groenendaal, Bas, Eggen, Berry (2008): Interactive play objects and the effects of open-ended play on social interaction and fun. In: Inakage, Masa, Cheok, Adrian David (eds.) Proceedings of the International Conference on Advances in Computer Entertainment Technology - ACE 2008 December 3-5, 2008, Yokohama, Japan. pp. 389-392. http://doi.acm.org/10.1145/1501750.1501841
Sturm, Janienke, Bekker, Tilde, Groenendaal, Bas, Wesselink, Rik, Eggen, Berry (2008): Key issues for the successful design of an intelligent, interactive playground. In: Proceedings of ACM IDC08 Interaction Design and Children, 2008. pp. 258-265. http://doi.acm.org/10.1145/1463689.1463764
Thang, Binh, Sluis-Thiescheffer, Wouter, Bekker, Tilde, Eggen, Berry, Vermeeren, Arnold, Ridder, Huib de (2008): Comparing the creativity of children's design solutions based on expert assessment. In: Proceedings of ACM IDC08 Interaction Design and Children, 2008. pp. 266-273. http://doi.acm.org/10.1145/1463689.1463765
Bekker, Tilde, Eggen, Berry (2008): Designing for children's physical play. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 2871-2876. http://doi.acm.org/10.1145/1358628.1358776
Barendregt, Wolmet, Bekker, Tilde, Bouwhuis, Don, Baauw, Esther (2007): Predicting effectiveness of children participants in user testing based on personality characteristics. In Behaviour and Information Technology, 26 (2) pp. 133-147. http://www.informaworld.com/10.1080/01449290500330372
Barendregt, W., Bekker, Tilde, Bouwhuis, Don, Baauw, E. (2006): Identifying usability and fun problems in a computer game during first use and after some practice. In International Journal of Human-Computer Studies, 64 (9) pp. 830-846. http://dx.doi.org/10.1016/j.ijhcs.2006.03.004
Baauw, Ester, Bekker, Tilde, Markopoulos, Panos (2006): Assessing the applicability of the structured expert evaluation method (SEEM) for a wider age group. In: Proceedings of ACM IDC06: Interaction Design and Children, 2006. pp. 73-80. http://doi.acm.org/10.1145/1139073.1139095
Baauw, E., Bekker, Tilde, Barendregt, W. (2005): A Structured Expert Evaluation Method for the Evaluation of Children's Computer Games. In: Proceedings of IFIP INTERACT05: Human-Computer Interaction, 2005. pp. 457-469. http://www.springerlink.com/openurl.asp?genre=article&id=doi:10.1007/11555261_38
Bekker, Tilde, Barendregt, W., Crombeen, S., Biesheuvel, M. (2004): Evaluating Usability and Challenge during Initial and Extended Use of Children's Computer Games. In: Proceedings of the HCI04 Conference on People and Computers XVIII, 2004. pp. 331-346.
Bekker, Tilde, Beusmans, Julie, Keyson, David, Lloyd, Peter (2003): KidReporter: a user requirements gathering technique for designing with children. In Interacting with Computers, 15 (2) pp. 187-202.
Gilutz, Shuli, Bekker, Tilde, Druin, Allison, Fisch, Shalom, Read, Janet (2003): Children's online interfaces: is usability testing worthwhile? In: Proceedings of ACM IDC03: Interaction Design and Children, 2003. pp. 143-145. http://doi.acm.org/10.1145/953536.953557
Vermeeren, Arnold, Kesteren, Ilse van, Bekker, Tilde (2003): Managing the 'Evaluator Effect' in User Testing. In: Proceedings of IFIP INTERACT03: Human-Computer Interaction, 2003, Zurich, Switzerland. pp. 647.
Barendregt, Wolmet, Bekker, Tilde, Speerstra, Mathilde (2003): Empirical evaluation of usability and fun in computer games for children. In: Proceedings of IFIP INTERACT03: Human-Computer Interaction , 2003, Zurich, Switzerland. pp. 705.
Kesteren, Ilse E. H. van, Bekker, Tilde, Vermeeren, Arnold P. O. S., Lloyd, Peter A. (2003): Assessing usability evaluation methods on their effectiveness to elicit verbal comments from children subjects. In: Proceedings of ACM IDC03: Interaction Design and Children, 2003. pp. 41-49. http://doi.acm.org/10.1145/953536.953544
Bekker, Tilde, Long, John (2000): User Involvement in the Design of Human-Computer Interactions: Some Similarities and Differences. In: Proceedings of the HCI00 Conference on People and Computers XIV, 2000. pp. 135-148.
Wilson, Stephanie, Bekker, Tilde, Johnson, Peter, Johnson, Hilary (1997): Helping and Hindering User Involvement -- A Tale of Everyday Design. In: Pemberton, Steven (eds.) Proceedings of the ACM CHI 97 Human Factors in Computing Systems Conference March 22-27, 1997, Atlanta, Georgia. pp. 178-185. http://www.acm.org/pubs/articles/proceedings/chi/258549/p178-wilson/p178-wilson.pdf
Wilson, Stephanie, Bekker, Tilde, Johnson, Hilary, Johnson, Peter (1996): Costs and Benefits of User Involvement in Design: Practitioners' Views. In: Sasse, Martina Angela, Cunningham, R. J., Winder, R. L. (eds.) Proceedings of the Eleventh Conference of the British Computer Society Human Computer Interaction Specialist Group - People and Computers XI August, 1996, London, UK. pp. 221-240.
Bekker, Tilde, Olson, Judith S., Olson, Gary M. (1995): Analysis of Gestures in Face-to-Face Design Teams Provides Guidance for How to Use Groupware. In: Proceedings of DIS95: Designing Interactive Systems: Processes, Practices, Methods, & Techniques, 1995. pp. 157-166.
15.14 Commentary by Tilde Bekker
The area of usability evaluation is on the move, as Gilbert Cockton describes. The chapter provides a thorough description of the historical development of usability evaluation methods and provides a good starting point for considering what needs to be done next.
In my commentary I expand on one aspect of evaluation methods: eliciting information from users. I describe how, in the area of Interaction Design and Children, evaluation methods have been adapted to increase the output of child participants in evaluation sessions. Two approaches have been applied: providing different strategies for supporting verbalizations, and providing children with non-verbal ways to express their opinions.
For more than 10 years I have been teaching HCI and Industrial Design students how to apply a wide variety of evaluation approaches to various kinds of products and interfaces. Applying evaluation methods to the design of technologies for children can provide a new perspective because it forces us to re-examine some of the assumptions we make about usability evaluation methods.
15.14.1 Adapting evaluation methods to participants' skills
An interesting challenge in designing and evaluating interactive products for children is finding a good match between the skills and qualities of the participants and the properties of the design and evaluation activity. This approach, which has been widespread in the research area of Interaction Design and Children, has led to some interesting adaptations of existing usability evaluation methods and also to the development of new ones.
In the past 10 to 15 years various studies have examined whether children have the skills and qualities required for a variety of evaluation methods. We can of course argue that when a participant has trouble participating in an evaluation session, we have to train the participant. Another or complementary option is to adjust or redesign the evaluation method to make it easier and possibly more fun to participate in an evaluation session.
15.14.2 Verbalization techniques
An important skill required for many evaluation methods is the ability to verbalize one’s thoughts. Such verbalizations can be used as a basis for interpreting what usability problems are embedded in the user interface. Different techniques are applied for eliciting verbal output.
One common approach for eliciting verbal output is the think-aloud method. Participants are asked to verbalize their thoughts while they are interacting with the product. The evaluation facilitator may prompt the participant to keep talking during the session. However, can children think aloud during usability evaluation sessions? Initially it was suggested that children of 13 years and older can think aloud (Hanna et al., 1997). More recent research showed that children of 7 years and older can think aloud when the protocol for facilitating the verbalizations is adjusted to a more relaxed dialogue (Donker and Markopoulos, 2002).
Evaluation methods may also incorporate strategies other than facilitator prompting to support participants in verbalizing their thoughts. Examples of such strategies are participating in an evaluation session together with peers, tutoring another child, or being prompted by a social robot acting as a proxy for the facilitator. However, the success of these strategies may depend on children having the other skills these set-ups require, such as the ability to collaborate.
An evaluation method called co-discovery, or constructive interaction, applies a technique where two participants collaborate in performing tasks in an evaluation setting. Supporting verbalizations by talking to a peer may be a more natural setting than holding a monologue or talking to a test facilitator. However, children do need to collaborate for the evaluation sessions to be effective. Some research has shown that younger children, of 6 to 7 years old, may still lack sufficient social skills to be effective participants. They may forget to collaborate and work on a task on their own, thus producing few verbal utterances, and they may sometimes actually compete when doing a task (Markopoulos and Bekker, 2003; Van Kesteren et al., 2003). Older children (between 13 and 14) have been shown to collaborate quite well in co-discovery sessions (Als et al., 2005). Other factors that may influence the quality of the collaboration and the outcome of the session are gender and whether the pairs are friends.
Another method, called peer tutoring, is based on the idea that one child explains to another child how a product works (Höysniemi et al., 2003). First, one child tries out the product; then that child becomes the tutor of a second child and helps the second child to interact with the product. Their understanding of the product, and its usability problems, can be distilled from the dialogue between the two children. The success of this approach depends on whether the tutor is able to fulfill the tutor role effectively, and whether the tutee is open to being taught by another child. Evidence from peer tutoring indicates that when the tutor forgets to play the tutor role, the pairs of children take on roles more similar to those in co-discovery sessions. Furthermore, tutors may have trouble restricting themselves to explaining the interaction without taking over the task (Van Kesteren et al., 2003).
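Analyses of peer-tutoring sessions often rest on coding the dialogue turn by turn. As a minimal sketch of how such coded transcripts could be summarized, the snippet below computes how often the tutor "takes over" the task instead of explaining, an indicator of how well the tutor kept to the tutoring role. The role labels, turn types, and sample data are hypothetical, not from any published coding scheme.

```python
# Hypothetical coded transcript: each turn is (speaker_role, turn_type).
# Both the roles and the turn-type labels are illustrative assumptions.
turns = [
    ("tutor", "explains"),
    ("tutee", "acts"),
    ("tutor", "takes_over"),   # tutor does the task instead of explaining
    ("tutor", "explains"),
    ("tutee", "question"),
]

def tutor_takeover_ratio(turns):
    """Fraction of tutor turns in which the tutor took over the task.

    A higher value suggests the tutor drifted out of the tutoring role,
    one of the failure modes described for peer-tutoring sessions.
    """
    tutor_turns = [turn_type for role, turn_type in turns if role == "tutor"]
    if not tutor_turns:
        return 0.0
    return tutor_turns.count("takes_over") / len(tutor_turns)

print(tutor_takeover_ratio(turns))  # 1 of 3 tutor turns
```

In practice, any real coding scheme would need to be validated (e.g. with inter-rater agreement) before ratios like this are meaningful.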
A more recently developed method, in which a child is prompted by a facilitator through a robot interface, is called the robotic intervention method (Fransen and Markopoulos, 2010). Providing a context in which children can talk to a playful, toy-like robot is expected to be less inhibiting than talking to an adult. So far, no increase in the number of problems uncovered has been found with this method compared to an active intervention method, although children did seem more at ease when participating in the sessions. A slight drawback of the method was that children perceived the questions asked by the robot to be more difficult than those asked by a human facilitator.
15.14.3 Complementing verbal with non-verbal approaches
A different strategy than facilitating verbal output is to provide alternative, non-verbal ways to indicate positive and negative aspects of an interface. This was applied in the PhD work of Wolmet Barendregt, who developed the picture cards method (Barendregt et al., 2008). The method was developed to find problems in children's computer games. Children pick up a picture card and place it in a box every time they experience, while interacting with the game, a particular emotion shown on one of the cards. The categories of the cards correspond to various types of problems and fun issues. In a study with children of 5 and 6 years old, children expressed more problems explicitly with the picture cards method than in a think-aloud session.
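A session with the picture cards method yields a stream of card placements that can be tallied per category afterwards. The sketch below shows one way such a session log might be summarized; the card categories and the sample events are invented for illustration and are not the categories from Barendregt's actual card set.

```python
from collections import Counter

# Hypothetical session log: one (child_id, card_category) entry each time
# a child places a picture card in the box. Categories are assumptions.
card_events = [
    (1, "fun"), (1, "difficult"), (2, "boring"),
    (1, "difficult"), (2, "fun"), (2, "difficult"),
]

def tally_by_category(events):
    """Count how often each card category was used across the session."""
    return Counter(category for _, category in events)

counts = tally_by_category(card_events)
print(counts)  # Counter({'difficult': 3, 'fun': 2, 'boring': 1})
```

Such tallies could then be compared against the problem counts from a think-aloud session with the same game, which is essentially the comparison reported in the study with 5- and 6-year-olds.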
15.14.4 A rich usability evaluation context
I agree with Cockton that there are no generalizable evaluation methods. Learning how to conduct usability evaluations requires developing an understanding of the complete evaluation context. This context includes many factors, such as who applies the method in what type of development process. And it also includes, as I illustrated earlier, specific requirements of the user group.
Evaluation methods can be further improved by adapting them to the skills and qualities of all the stakeholders involved, by offering diverse ways to provide input, by addressing both positive and negative experiences, and possibly even by making the activity more fun and enjoyable.
Developing evaluation approaches is like developing products and systems: for every improvement we try to incorporate in an evaluation method, we run the risk of adding new challenges for the participants.
Bekker, Tilde, Robertson, Judy, Skov, Mikael B. (eds.) Proceedings of the 6th International Conference on Interaction Design and Children June 6-8, 2007, Aalborg, Denmark.