Publication statistics

Publication period: 2003-2012
Publication count: 29
Number of co-authors: 39



Co-authors

Number of publications with her 3 most frequent co-authors:

Guy Hoffman: 5
Andrew G. Brooks: 4
Jesse Gray: 3

Productive colleagues

Cynthia Breazeal's 3 most productive colleagues in number of publications:

Rosalind W. Picard: 45
Leah Buechley: 20
Hayes Raffle: 20


Our Latest Books

 
 
The Social Design of Technical Systems: Building technologies for communities. 2nd Edition
by Brian Whitworth and Adnan Ahmad
start reading
 
 
 
 
Gamification at Work: Designing Engaging Business Software
by Janaki Mythily Kumar and Mario Herger
start reading
 
 
 
 
The Social Design of Technical Systems: Building technologies for communities
by Brian Whitworth and Adnan Ahmad
start reading
 
 
 
 
The Encyclopedia of Human-Computer Interaction, 2nd Ed.
by Mads Soegaard and Rikke Friis Dam
start reading
 
 

Cynthia Breazeal

Personal Homepage:
http://web.media.mit.edu/~cynthiab/

 

Publications by Cynthia Breazeal (bibliography)

2012
 

Robert, David and Breazeal, Cynthia (2012): Blended reality characters. In: Proceedings of the 7th International Conference on Human-Robot Interaction 2012. pp. 359-366. Available online

We present the idea and formative design of a blended reality character, a new class of character able to maintain visual and kinetic continuity between the fully physical and fully virtual. The interactive character's embodiment fluidly transitions from an animated character on-screen to a small, alphabet block-shaped mobile robot designed as a platform for informal learning through play. We present the design and results of our study with thirty-four children aged three and a half to seven conducted using non-reactive, unobtrusive observational methods and a validated evaluation instrument. Our claim is that young children have accepted the idea, persistence and continuity of blended reality characters. Furthermore, we found that children are more deeply engaged with blended reality characters and are more fully immersed in blended reality play as co-protagonists in the experience, in comparison to interactions with strictly screen-based representations. As substantiated through the use of quantitative and qualitative analysis of drawings and verbal utterances, the study shows that young children produce longer, detailed and more imaginative descriptions of their experiences following blended reality play. The desire to continue engaging in blended reality play as expressed by children's verbal requests to revisit and extend their play time with the character positively affirms the potential for the development of an informal learning platform with sustained appeal to young children.

© All rights reserved Robert and Breazeal and/or their publisher

2011
 

Shah, Julie, Wiken, James, Williams, Brian and Breazeal, Cynthia (2011): Improved human-robot team performance using Chaski, a human-inspired plan execution system. In: Proceedings of the 6th International Conference on Human Robot Interaction 2011. pp. 29-36. Available online

We describe the design and evaluation of Chaski, a robot plan execution system that uses insights from human-human teaming to make human-robot teaming more natural and fluid. Chaski is a task-level executive that enables a robot to collaboratively execute a shared plan with a person. The system chooses and schedules the robot's actions, adapts to the human partner, and acts to minimize the human's idle time. We evaluate Chaski in human subject experiments in which a person works with a mobile and dexterous robot to collaboratively assemble structures using building blocks. We measure team performance outcomes for robots controlled by Chaski compared to robots that are verbally commanded step-by-step by the human partner.

© All rights reserved Shah et al. and/or their publisher

 

Alonso, Jason B., Chang, Angela, Robert, David and Breazeal, Cynthia (2011): Toward a dynamic dramaturgy: an art of presentation in interactive storytelling. In: Proceedings of the 2011 Conference on Creativity and Cognition 2011. pp. 311-312. Available online

In interactive storytelling systems, we see common challenges of artistic expression that pertain to presentation, standing apart from narrative structure. We believe this expression can be achieved computationally, which is a core challenge in using procedurally-generated worlds in interactive storytelling. This computational expression is what we call dynamic dramaturgy. We intend dynamic dramaturgy as a complement to interactive narrative systems, particularly drama management, and as a fundamentally distinct task from plot-level narrative construction, yet it remains a basic medium for artistic expression by an author. It is, in effect, an art of presentation in interactive storytelling.

© All rights reserved Alonso et al. and/or ACM Press

 

Chang, Angela and Breazeal, Cynthia (2011): TinkRBook: shared reading interfaces for storytelling. In: Proceedings of ACM IDC11 Interaction Design and Children 2011. pp. 145-148. Available online

Today, the way children learn to read is very different from the way they learn from playing with toys. Books present static images and text on the page, whereas toys allow for manipulation and interactive exploration of cause-effect relations. What if books were "tinkerable"? What if children could actively explore and modify a story, through voice and touch, to dynamically explore meaning as conveyed by the relationship of text to illustrated concept? How might this change how books are experienced, explored, and shared between parent and child? How might interactivity support and enhance existing shared reading practices? We report the development of interaction design techniques for encouraging storytelling behavior during shared book reading. The design of our storytelling platform, the TinkRBook, encourages active exploration when parents read to very young children (ages 2-5 years old). Our approach uses findings from in-situ parent-child ethnographies and advice from 24 participatory design interviews with researchers, designers and professionals from relevant domains. We believe that our approach addresses the environmental conditions in which interactive storytelling with preschoolers is most likely to be adopted, and is compatible with current shared reading practices.

© All rights reserved Chang and Breazeal and/or ACM Press

 

Wistort, Ryan and Breazeal, Cynthia (2011): TofuDraw: a mixed-reality choreography tool for authoring robot character performance. In: Proceedings of ACM IDC11 Interaction Design and Children 2011. pp. 213-216. Available online

TofuDraw combines an expressive semi-autonomous robot character (called Tofu) with a new mixed reality DigitalPaint interface whereby children can draw a "program" on the floor that governs the robot character's behavior. Initial evaluations of the TofuDraw system with children ages 3-8 suggest that children can successfully use this interface to choreograph the expressive robot's behavior. Our ultimate goal for this tool is to enable young children to engage in STEM learning experiences in new contexts such as creating interactive robot theatre performances.

© All rights reserved Wistort and Breazeal and/or ACM Press

 

Freed, Natalie, Qi, Jie, Setapen, Adam, Breazeal, Cynthia, Buechley, Leah and Raffle, Hayes (2011): Sticking together: handcrafting personalized communication interfaces. In: Proceedings of ACM IDC11 Interaction Design and Children 2011. pp. 238-241. Available online

We present I/O Stickers, adhesive sensors and actuators that children can use to handcraft personalized remote communication interfaces. By attaching I/O Stickers to special wirelessly connected greeting cards, children can invent ways to communicate with long-distance loved ones. Children decorate these cards with their choice of craft materials, creatively expressing themselves while making a functioning interface. The low-bandwidth connections -- simple actuators that change as the sensor stickers are manipulated -- leave room not only to design the look and function of the card, but also to decide how to interpret the information transmitted. We aim to empower children to implement ideas that would otherwise require advanced electronics knowledge. In addition, we hope to support creative learning about communication and to make keeping in touch playful and meaningful. In this paper, we describe the design of the I/O Stickers, analyze a variety of artifacts children have created, and explore future directions for the toolkit.

© All rights reserved Freed et al. and/or ACM Press

2010
 

Lee, Jun Ki and Breazeal, Cynthia (2010): Human social response toward humanoid robot's head and facial features. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 4237-4242. Available online

This study explores how people's social response toward a humanoid robot changes when we vary the number of active degrees of freedom in the robot's head and face area. We investigate this problem by conducting two wizard-of-oz user studies that situate an elderly person in a self-disclosure dialogue with a remotely operated robot. In our first study, we investigated the effect of expressive head gestures with a four-degree-of-freedom neck. In the second study, we focused on the face, where we investigated the effect of expressive eyebrow movement versus active gaze and eyelid movement. In the first study, we found that participants were willing to disclose more to the robot when the robot moved its neck in an expressive manner. In the second study, our data suggest a trend in which gaze and expressive eyelid movement result in more disclosure than eyebrow movement.

© All rights reserved Lee and Breazeal and/or their publisher

 

Adalgeirsson, Sigurdur O. and Breazeal, Cynthia (2010): MeBot: a robotic platform for socially embodied presence. In: Proceedings of the 5th ACM/IEEE International Conference on Human Robot Interaction 2010. pp. 15-22. Available online

Telepresence refers to a set of technologies that allow users to feel present at a distant location; telerobotics is a subfield of telepresence. This paper presents the design and evaluation of a telepresence robot which allows for social expression. Our hypothesis is that a telerobot that communicates not only audio and video but also expressive gestures, body pose, and proxemics will allow for a more engaging and enjoyable interaction. An iterative design process of the MeBot platform is described in detail, as well as the design of supporting systems and various control interfaces. We conducted a human subject study in which the effects of expressivity were measured. Our results show that a socially expressive robot is more engaging and likable than a static one. We also found that expressiveness contributes to more psychological involvement and better cooperation.

© All rights reserved Adalgeirsson and Breazeal and/or their publisher

2009
 

Wistort, Ryan and Breazeal, Cynthia (2009): TOFU: a socially expressive robot character for child interaction. In: Proceedings of ACM IDC09 Interaction Design and Children 2009. pp. 292-293. Available online

The TOFU project introduces a robotic platform for enabling new opportunities in robot based learning with emphasis on storytelling and artistic expression. This project introduces a socially expressive robot character designed to mimic the expressive abilities of animated characters. This demonstration proposal describes the expressive abilities and operator interface to the TOFU project. In this demonstration session, participants will have the opportunity to physically interact with the TOFU project and puppeteer the behavior of the robotic character through a simple joystick interface.

© All rights reserved Wistort and Breazeal and/or ACM Press

 

Stiehl, Walter Dan, Lee, Jun Ki, Breazeal, Cynthia, Nalin, Marco, Morandi, Angelica and Sanna, Alberto (2009): The huggable: a platform for research in robotic companions for pediatric care. In: Proceedings of ACM IDC09 Interaction Design and Children 2009. pp. 317-320. Available online

Robotic companions offer a unique combination of embodiment and computation which open many new interesting opportunities in the field of pediatric care. As these new technologies are developed, we must consider the central research questions of how such systems should be designed and what the appropriate applications for such systems are. In this paper we present the Huggable, a robotic companion in the form factor of a teddy bear and outline a series of studies we are planning to run using the Huggable in a pediatric care unit.

© All rights reserved Stiehl et al. and/or ACM Press

 

Breazeal, Cynthia (2009): Living better with robots. In: Proceedings of the 2009 International Conference on Multimodal Interfaces 2009. pp. 1-2. Available online

The emerging field of Human-Robot Interaction is undergoing rapid growth, motivated by important societal challenges and new applications for personal robotic technologies for the general public. In this talk, I highlight several projects from my research group to illustrate recent research trends to develop socially interactive robots that work and learn with people as partners. An important goal of this work is to use interactive robots as a scientific tool to understand human behavior, to explore the role of physical embodiment in interactive technology, and to use these insights to design robotic technologies that can enhance human performance and quality of life. Throughout the talk I will highlight synergies with HCI and connect HRI research goals to specific applications in healthcare, education, and communication.

© All rights reserved Breazeal and/or her publisher

2008
 

Lee, Chia-Hsun Jackie, Kim, Kyunghee, Breazeal, Cynthia and Picard, Rosalind W. (2008): Shybot: friend-stranger interaction for children living with autism. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 3375-3380. Available online

This paper presents Shybot, a personal mobile robot designed to both embody and elicit reflection on shyness behaviors. Shybot is being designed to detect human presence and familiarity from face detection and proximity sensing in order to categorize people as friends or strangers to interact with. Shybot also can reflect elements of the anxious state of its human companion through LEDs and a spinning propeller. We designed this simple social interaction to open up a new direction for intervention for children living with autism. We hope that from minimal social interaction, a child with autism or social anxiety disorders could reflect on and more deeply attain understanding about personal shyness behaviors, as a first step toward helping make progress in developing greater capacity for complex social interaction.

© All rights reserved Lee et al. and/or ACM Press

 

Hoffman, Guy and Breazeal, Cynthia (2008): Achieving fluency through perceptual-symbol practice in human-robot collaboration. In: Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction 2008. pp. 1-8. Available online

We have developed a cognitive architecture for robotic teammates based on the neuro-psychological principles of perceptual symbols and simulation, with the aim of attaining increased fluency in human-robot teams. An instantiation of this architecture was implemented on a robotic desk lamp, performing in a human-robot collaborative task. This paper describes initial results from a human-subject study measuring team efficiency and team fluency, in which the robot works on a joint task with untrained subjects. We find significant differences in a number of efficiency and fluency metrics, when comparing our architecture to a purely reactive robot with similar capabilities.

© All rights reserved Hoffman and Breazeal and/or ACM Press

 

Breazeal, Cynthia (2008): Living better with robots. In: Cousins, Steve B. and Beaudouin-Lafon, Michel (eds.) Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology October 19-22, 2008, Monterey, CA, USA. pp. 209-210. Available online

 

Kidd, Cory D. and Breazeal, Cynthia (2008): Robots at home: Understanding long-term human-robot interaction. In: Proceedings 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems September 22-26, 2008, Nice, France. pp. 3230-3235. Available online

Human-robot interaction (HRI) is now well enough understood to allow us to build useful systems that can function outside of the laboratory. We are studying long-term interaction in natural user environments and describe the implementation of a robot designed to help individuals effect behavior change while dieting. Our robotic weight loss coach is compared to a standalone computer and a paper log in a controlled study. We describe the software model used to create successful long-term HRI. We summarize the experimental design, analysis, and results of our study, the first where a sociable robot interacts with a user to achieve behavior change. Results show that participants track their calorie consumption and exercise for nearly twice as long when using the robot than with the other methods and develop a closer relationship with the robot. Both are indicators of longer-term success at weight loss and maintenance and show the effectiveness of sociable robots for long-term HRI.

© All rights reserved Kidd and Breazeal and/or IEEE

 

Lee, Jun Ki, Toscano, Robert Lopez, Stiehl, Walter D. and Breazeal, Cynthia (2008): The Design of a Semi-Autonomous Robot Avatar for Family Communication and Education. In: Buss, Martin and Kühnlenz, Kolja (eds.) Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) August 1-3, 2008, Munich, Germany. pp. 166-173. Available online

Robots as an embodied, multi-modal technology have great potential to be used as a new type of communication device. In this paper we outline our development of the Huggable robot as a semi-autonomous robot avatar for two specific types of remote interaction - family communication and education. Through our discussion we highlight how we have applied six important elements in our system to allow for the robot to function as a richly embodied communication channel.

© All rights reserved Lee et al. and/or IEEE

2007
 

Hoffman, Guy and Breazeal, Cynthia (2007): Effects of anticipatory action on human-robot teamwork efficiency, fluency, and perception of team. In: Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction 2007. pp. 1-8. Available online

A crucial skill for fluent action meshing in human team activity is a learned and calculated selection of anticipatory actions. We believe that the same holds for robotic teammates, if they are to perform in a similarly fluent manner with their human counterparts. In this work, we propose an adaptive action selection mechanism for a robotic teammate, making anticipatory decisions based on the confidence of their validity and their relative risk. We predict an improvement in task efficiency and fluency compared to a purely reactive process. We then present results from a study involving untrained human subjects working with a simulated version of a robot using our system. We show a significant improvement in best-case task efficiency when compared to a group of users working with a reactive agent, as well as a significant difference in the perceived commitment of the robot to the team and its contribution to the team's fluency and success. By way of explanation, we propose a number of fluency metrics that differ significantly between the two study groups.

© All rights reserved Hoffman and Breazeal and/or ACM Press

 

Breazeal, Cynthia, Wang, Andrew and Picard, Rosalind W. (2007): Experiments with a robotic computer: body, affect and cognition interactions. In: Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction 2007. pp. 153-160. Available online

We present RoCo, the first robotic computer designed with the ability to move its monitor in subtly expressive ways that respond to and encourage its user's own postural movement. We use RoCo in a novel user study to explore whether a computer's "posture" can influence its user's subsequent posture, and if the interaction of the user's body state with their affective state during a task leads to improved task measures such as persistence in problem solving. We believe this is possible in light of new theories that link physical posture and its influence on affect and cognition. Initial results with 71 subjects support the hypothesis that RoCo's posture not only manipulates the user's posture, but also is associated with hypothesized posture-affect interactions. Specifically, we found effects on increased persistence on a subsequent cognitive task, and effects on perceived level of comfort.

© All rights reserved Breazeal et al. and/or ACM Press

 

Ahn, Hyungil, Teeters, Alea, Wang, Andrew, Breazeal, Cynthia and Picard, Rosalind W. (2007): Stoop to Conquer: Posture and Affect Interact to Influence Computer Users' Persistence. In: Paiva, Ana, Prada, Rui and Picard, Rosalind W. (eds.) ACII 2007 - Affective Computing and Intelligent Interaction, Second International Conference September 12-14, 2007, Lisbon, Portugal. pp. 582-593. Available online

2006
 

Brooks, Andrew G. and Breazeal, Cynthia (2006): Working with robots and objects: revisiting deictic reference for achieving spatial common ground. In: Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction 2006. pp. 297-304. Available online

Robust joint visual attention is necessary for achieving a common frame of reference between humans and robots interacting multimodally in order to work together on real-world spatial tasks involving objects. We make a comprehensive examination of one component of this process that is often otherwise implemented in an ad hoc fashion: the ability to correctly determine the object referent from deictic reference including pointing gestures and speech. From this we describe the development of a modular spatial reasoning framework based around decomposition and resynthesis of speech and gesture into a language of pointing and object labeling. This framework supports multimodal and unimodal access in both real-world and mixed-reality workspaces, accounts for the need to discriminate and sequence identical and proximate objects, assists in overcoming inherent precision limitations in deictic gesture, and assists in the extraction of those gestures. We further discuss an implementation of the framework that has been deployed on two humanoid robot platforms to date.

© All rights reserved Brooks and Breazeal and/or ACM Press

 

Thomaz, Andrea L., Hoffman, Guy and Breazeal, Cynthia (2006): Experiments in socially guided machine learning: understanding how humans teach. In: Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction 2006. pp. 359-360. Available online

In Socially Guided Machine Learning we explore the ways in which machine learning can more fully take advantage of natural human interaction. In this work we are studying the role real-time human interaction plays in training assistive robots to perform new tasks. We describe an experimental platform, Sophie's World, and present descriptive analysis of human teaching behavior found in a user study. We report three important observations of how people administer reward and punishment to teach a simulated robot a new task through Reinforcement Learning. People adjust their behavior as they develop a model of the learner, they use the reward channel for guidance as well as feedback, and they may also use it as a motivational channel.

© All rights reserved Thomaz et al. and/or ACM Press

 

Thomaz, Andrea Lockerd and Breazeal, Cynthia (2006): Teachable Characters: User Studies, Design Principles, and Learning Performance. In: Gratch, Jonathan, Young, Michael, Aylett, Ruth, Ballin, Daniel and Olivier, Patrick (eds.) IVA 2006 - Intelligent Virtual Agents - 6th International Conference August 21-23, 2006, Marina Del Rey, CA, USA. pp. 395-406. Available online

2005
 

Breazeal, Cynthia (2005): Socially intelligent robots. In Interactions, 12 (2) pp. 19-22.

 

Stiehl, Walter Dan and Breazeal, Cynthia (2005): Affective Touch for Robotic Companions. In: Tao, Jianhua, Tan, Tieniu and Picard, Rosalind W. (eds.) ACII 2005 - Affective Computing and Intelligent Interaction, First International Conference October 22-24, 2005, Beijing, China. pp. 747-754. Available online

2004
 

Breazeal, Cynthia (2004): Interviews with Cynthia Breazeal. In Computers in Entertainment, 2 (3) p. 8. Available online

 

Brooks, Andrew G., Gray, Jesse, Hoffman, Guy, Lockerd, Andrea, Lee, Hans and Breazeal, Cynthia (2004): Robot's play: interactive games with sociable machines. In Computers in Entertainment, 2 (3) p. 10. Available online

 

Breazeal, Cynthia, Brooks, Andrew G., Gray, Jesse, Hoffman, Guy, Kidd, Cory D., Lee, Hans, Lieberman, Jeff, Lockerd, Andrea and Chilongo, David (2004): Tutelage and Collaboration for Humanoid Robots. In International Journal of Humanoid Robotics, 1 (2) pp. 315-348. Available online

This paper presents an overview of our work towards building socially intelligent, cooperative humanoid robots that can work and learn in partnership with people. People understand each other in social terms, allowing them to engage others in a variety of complex social interactions including communication, social learning, and cooperation. We present our theoretical framework that is a novel combination of Joint Intention Theory and Situated Learning Theory and demonstrate how this framework can be applied to develop our sociable humanoid robot, Leonardo. We demonstrate the robot's ability to learn quickly and effectively from natural human instruction using gesture and dialog, and then cooperate to perform a learned task jointly with a person. Such issues must be addressed to enable many new and exciting applications for robots that require them to play a long-term role in people's daily lives.

© All rights reserved Breazeal et al. and/or World Scientific Publishing

2003
 

Breazeal, Cynthia (2003): Emotion and sociable humanoid robots. In International Journal of Human-Computer Studies, 59 (1) pp. 119-155.

This paper focuses on the role of emotion and expressive behavior in regulating social interaction between humans and expressive anthropomorphic robots, either in communicative or teaching scenarios. We present the scientific basis underlying our humanoid robot's emotion models and expressive behavior, and then show how these scientific viewpoints have been adapted to the current implementation. Our robot is also able to recognize affective intent through tone of voice, the implementation of which is inspired by the scientific findings of the developmental psycholinguistics community. We first evaluate the robot's expressive displays in isolation. Next, we evaluate the robot's overall emotive behavior (i.e. the coordination of the affective recognition system, the emotion and motivation systems, and the expression system) as it socially engages naïve human subjects face-to-face.

© All rights reserved Breazeal and/or Academic Press

 

Breazeal, Cynthia, Brooks, Andrew G., Gray, Jesse, Hancher, Matthew D., McBean, John, Stiehl, Walter Dan and Strickon, Joshua (2003): Interactive robot theatre. In Communications of the ACM, 46 (7) pp. 76-85. Available online

 

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/cynthia_breazeal.html