Number of co-authors: 16
Number of publications with 3 favourite co-authors: Takeo Igarashi: 6, Masahiko Inami: 3, Ehud Sharlin: 2
Kentaro Ishii's 3 most productive colleagues in number of publications: Takeo Igarashi: 66, Masahiko Inami: 47, Ehud Sharlin: 28
Publications by Kentaro Ishii (bibliography)
Young, James, Ishii, Kentaro, Igarashi, Takeo and Sharlin, Ehud (2012): Style by demonstration: teaching interactive movement style to robots. In: Proceedings of the 2012 International Conference on Intelligent User Interfaces 2012. pp. 41-50.
The style in which a robot moves, expressed through its gait or locomotion, can convey effective messages to people. For example, a robot could move aggressively in reaction to a person's actions, or alternatively react using a set of careful, submissive movements. Designing, implementing and programming robotic interfaces that react to users' actions with properly styled movements can be a difficult, daunting, and time-consuming technical task. On the other hand, most people can easily perform such stylistic tasks and movements, for example, by acting them out. Following this observation, we propose to enable people to use their existing teaching skills to directly demonstrate to robots, via in-situ acting, a desired style of interaction. In this paper we present an initial style-by-demonstration (SBD) proof-of-concept of our approach, allowing people to teach a robot specific, interactive locomotion styles by providing a demonstration. We present a broomstick-robot interface for directly demonstrating locomotion style to a collocated robot, and a design critique evaluation by experienced programmers that compares our SBD approach to traditional programming methods.
© All rights reserved Young et al. and/or ACM Press
Mi, Haipeng, Ishii, Kentaro, Ma, Lei, Laokulrat, Natsuda, Inami, Masahiko and Igarashi, Takeo (2012): Pebbles: an interactive configuration tool for indoor robot navigation. In: Adjunct Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 11-12.
This study presents an interactive configuration tool that assists non-expert users in designing a specific navigation route for a mobile robot in an indoor environment. The user places small active markers, called pebbles, on the floor along the desired route in order to guide the robot to the destination. The active markers establish a navigation network by communicating with each other via IR beacons, and the robot follows the markers to reach the designated goal. During installation, the user receives effective feedback from LED indicators and voice prompts, so that the user can immediately tell whether the navigation route is configured as expected. With this tool, a novice user may easily customize a mobile robot for various indoor tasks.
© All rights reserved Mi et al. and/or ACM Press
Ishii, Kentaro, Ishida, Akihiko, Saul, Greg, Inami, Masahiko and Igarashi, Takeo (2010): Active navigation landmarks for a service robot in a home environment. In: Proceedings of the 5th ACM/IEEE International Conference on Human Robot Interaction 2010. pp. 99-100.
This paper proposes a physical user interface for a user to teach a robot to navigate a home environment. The user places small devices containing infrared-based communication functionality as landmarks in the environment. The robot follows these landmarks to navigate to a goal landmark. Active landmarks communicate with each other to map their spatial relationships. Our method allows the user to start using the system immediately after placing the landmarks, without installing any global position sensing system or prior mapping by the robot.
© All rights reserved Ishii et al. and/or their publisher
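As a purely illustrative aside (not the authors' implementation, whose hardware and algorithms are not detailed here): the idea of active landmarks that map their spatial relationships and then guide a robot can be sketched as an undirected graph, where an edge joins any two landmarks that can communicate, and the robot follows a landmark-to-landmark path found by breadth-first search. All landmark names below are hypothetical.

```python
from collections import deque

def landmark_path(links, start, goal):
    """Toy sketch: links is an iterable of (a, b) pairs of landmark ids
    that can communicate with each other; returns a list of landmark ids
    from start to goal, or None if no route exists."""
    graph = {}
    for a, b in links:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    queue = deque([[start]])   # BFS over paths, shortest route first
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Hypothetical landmarks placed along a hallway:
print(landmark_path([("kitchen", "hall"), ("hall", "door"), ("door", "living")],
                    "kitchen", "living"))
# -> ['kitchen', 'hall', 'door', 'living']
```

The appeal mirrored here is that the route emerges from pairwise landmark communication alone, with no global positioning or prior map.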
Young, James E., Ishii, Kentaro, Igarashi, Takeo and Sharlin, Ehud (2010): Showing robots how to follow people using a broomstick interface. In: Proceedings of the 5th ACM/IEEE International Conference on Human Robot Interaction 2010. pp. 133-134.
Robots are poised to enter our everyday environments such as our homes and offices, contexts that present unique questions such as the style of the robot's actions. Style-oriented characteristics are difficult to define programmatically, a problem that is particularly prominent for a robot's interactive behaviors, those that must react appropriately to the dynamic actions of people. In this paper, we present a technique for programming, by demonstration, the style in which a robot should follow a person, such that non-technical designers and users can directly create the style of following using their existing skill sets. We envision that simple physical interfaces like ours can be used by non-technical people to design the style of a wide range of robotic behaviors.
© All rights reserved Young et al. and/or their publisher
Mistry, Pranav, Ishii, Kentaro, Inami, Masahiko and Igarashi, Takeo (2010): Blinkbot: look at, blink and move. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 397-398.
In this paper we present BlinkBot -- a hands-free input interface to control and command a robot. BlinkBot explores the natural modalities of gaze and blinking to direct a robot to move an object from one location to another. The paper also explains the detailed hardware and software implementation of the prototype system.
© All rights reserved Mistry et al. and/or their publisher
Zhao, Shengdong, Nakamura, Koichi, Ishii, Kentaro and Igarashi, Takeo (2009): Magic cards: a paper tag interface for implicit robot control. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 173-182.
Typical Human Robot Interaction (HRI) assumes that the user explicitly interacts with robots. However, explicit control with robots can be unnecessary or even undesirable in certain cases, such as dealing with domestic services (or housework). In this paper, we propose an alternative strategy of interaction: the user implicitly controls a robot by issuing commands on corresponding real world objects and the environment. Robots then discover these commands and complete them in the background. We implemented a paper-tag-based interface to support such implicit robot control in a sensor-augmented home environment. Our initial user studies indicated that the paper-tag-based interface is particularly simple to use and provides users with flexibility in planning and controlling their housework tasks in a simulated home environment.
© All rights reserved Zhao et al. and/or ACM Press
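As another purely illustrative aside (a hypothetical sketch, not the paper's system): the implicit-control idea above can be pictured as paper tags that, once discovered by the robot, become a background queue of (object, action) tasks which the robot works through without further user involvement. The tag ids and commands below are invented for illustration.

```python
from collections import deque

# Hypothetical mapping: tag id -> (target object, requested action)
TAG_COMMANDS = {
    "tag-01": ("floor", "vacuum"),
    "tag-02": ("trash bin", "empty"),
}

def discover_tasks(scanned_tag_ids):
    """Turn tags the robot has discovered in the environment into an
    ordered task queue; unknown tags are ignored."""
    return deque(TAG_COMMANDS[t] for t in scanned_tag_ids if t in TAG_COMMANDS)

tasks = discover_tasks(["tag-02", "tag-99", "tag-01"])
while tasks:
    obj, action = tasks.popleft()
    print(f"{action} {obj}")
```

The point the sketch tries to capture is the decoupling: the user only annotates the world with tags, and task execution happens asynchronously in the background.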
Ishii, Kentaro, Yamamoto, Yukiko, Imai, Michita and Nakadai, Kazuhiro (2007): A Navigation System Using Ultrasonic Directional Speaker with Rotating Base. In: Smith, Michael J. and Salvendy, Gavriel (eds.) Symposium on Human Interface 2007 - Part II July 22-27, 2007, Beijing, China. pp. 526-535.
Page maintainer: The Editorial Team