Publication statistics

Pub. period: 2005-2012
Pub. count: 16
Number of co-authors: 33



Co-authors

Number of publications with his 3 most frequent co-authors:

Takeo Igarashi: 10
Masahiko Inami: 7
Hiroshi Ishiguro: 4

 

 

Productive colleagues

Daisuke Sakamoto's 3 most productive colleagues, by number of publications:

Takeo Igarashi: 66
Hiroshi Ishiguro: 55
Masahiko Inami: 47
 
 
 


Daisuke Sakamoto


Publications by Daisuke Sakamoto (bibliography)

2012
 

Matsumura, Kohei, Sakamoto, Daisuke, Inami, Masahiko and Igarashi, Takeo (2012): Universal earphones: earphones with automatic side and shared use detection. In: Proceedings of the 2012 International Conference on Intelligent User Interfaces. pp. 305-306.

We present universal earphones that use both a proximity sensor and a skin conductance sensor, and we demonstrate several implicit interaction techniques they enable by automatically detecting the context of use. The universal earphones have two main features. The first is detecting which ear each earphone is in, so that the correct audio channel is delivered to each ear; the second is detecting shared use, in which case mixed stereo sound is provided to both earphones. These features not only free users from having to check the left and right sides of the earphones, but also let them share stereo audio with other people.

© All rights reserved Matsumura et al. and/or ACM Press

 

Wibowo, Amy, Sakamoto, Daisuke, Mitani, Jun and Igarashi, Takeo (2012): DressUp: a 3D interface for clothing design with a physical mannequin. In: Proceedings of the 6th International Conference on Tangible and Embedded Interaction 2012. pp. 99-102.

This paper introduces DressUp, a computerized system for designing dresses with 3D input using the form of the human body as a guide. It consists of a body-sized physical mannequin, a screen, and tangible prop tools for drawing in 3D on and around the mannequin. As the user draws, he/she modifies or creates pieces of digital cloth, which are displayed on a model of the mannequin on the screen. We explore the capacity of our 3D input tools to create a variety of dresses. We also describe observations gained from users designing actual physical garments with the system.

© All rights reserved Wibowo et al. and/or ACM Press

 

Kato, Jun, Sakamoto, Daisuke and Igarashi, Takeo (2012): Phybots: a toolkit for making robotic things. In: Proceedings of DIS12 Designing Interactive Systems 2012. pp. 248-257.

There are many toolkits for physical UIs, but most physical UI applications are not locomotive. When programmers want to make things move around in the environment, they face difficulties related to robotics. Toolkits for robot programming, unfortunately, are usually not as accessible as those for building physical UIs. To address this interdisciplinary issue, we propose Phybots, a toolkit that allows researchers and interaction designers to rapidly prototype applications with locomotive robotic things. The contributions of this research are the combination of a hardware setup, a software API, its underlying architecture, and a graphical runtime debugging tool that together support the whole prototyping activity. This paper introduces the toolkit, applications, and lessons learned from three user studies.

© All rights reserved Kato et al. and/or ACM Press

 

Allen, Jeffrey, Young, James E., Sakamoto, Daisuke and Igarashi, Takeo (2012): Style by demonstration for interactive robot motion. In: Proceedings of DIS12 Designing Interactive Systems 2012. pp. 592-601.

As robots continue to enter people's everyday spaces, we argue that it will be increasingly important to consider the robots' movement style as an integral component of their interaction design. That is, aspects of the robot's movement which are not directly related to the task at hand (e.g., picking up a ball) can have a strong impact on how people perceive that action (e.g., as aggressive or hesitant). We call these elements the movement style. We believe that perceptions of this kind of style will be highly dependent on the culture, group, or individual, so people will need the ability to customize their robot. Therefore, in this work we use Style by Demonstration, a style-focused variant of the more traditional programming-by-demonstration technique, and present the Puppet Dancer system, an interface for constructing paired and interactive robotic dances. In this paper we detail the Puppet Dancer interface and interaction design, explain our new algorithms for teaching dance by demonstration, and present the results of a formal qualitative study.

© All rights reserved Allen et al. and/or ACM Press

2011
 

Liu, Kexi, Sakamoto, Daisuke, Inami, Masahiko and Igarashi, Takeo (2011): Roboshop: multi-layered sketching interface for robot housework assignment and management. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems. pp. 647-656.

As various robots come into our homes, the need for efficient robot task-management tools is growing. Current tools are designed for controlling individual robots independently, so they are not well suited to assigning coordinated actions among multiple robots. To address this problem, we developed a management tool for home robots with a graphical editing interface. The user assigns instructions by selecting a tool from a toolbox and sketching on a bird's-eye view of the environment. Layering supports the management of multiple tasks in the same room, and the layered graphical representation gives a quick overview of, and access to, rich information tied to the physical environment. This paper describes the prototype system and reports on our evaluation of the system.

© All rights reserved Liu et al. and/or their publisher

 

Sugiura, Yuta, Kakehi, Gota, Withana, Anusha, Lee, Calista, Sakamoto, Daisuke, Sugimoto, Maki, Inami, Masahiko and Igarashi, Takeo (2011): Detecting shape deformation of soft objects using directional photoreflectivity measurement. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology. pp. 509-516.

We present the FuwaFuwa sensor module, a round, hand-size, wireless device for measuring the shape deformations of soft objects such as cushions and plush toys. It can be embedded in typical soft objects in the household without complex installation procedures and without spoiling the softness of the object because it requires no physical connection. Six LEDs in the module emit IR light in six orthogonal directions, and six corresponding photosensors measure the reflected light energy. One can easily convert almost any soft object into a touch-input device that can detect both touch position and surface displacement by embedding multiple FuwaFuwa sensor modules in the object. A variety of example applications illustrate the utility of the FuwaFuwa sensor module. An evaluation of the proposed deformation measurement technique confirms its effectiveness.

© All rights reserved Sugiura et al. and/or ACM Press

2010
 

Kato, Jun, Sakamoto, Daisuke and Igarashi, Takeo (2010): Surfboard: keyboard with microphone as a low-cost interactive surface. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology. pp. 387-388.

We introduce a technique to detect simple gestures of "surfing" (moving a hand horizontally) on a standard keyboard by analyzing recorded sounds in real-time with a microphone attached close to the keyboard. This technique allows the user to maintain a focus on the screen while surfing on the keyboard. Since this technique uses a standard keyboard without any modification, the user can take full advantage of the input functionality and tactile quality of his favorite keyboard supplemented with our interface.

© All rights reserved Kato et al. and/or their publisher

 

Shirokura, Takumi, Sakamoto, Daisuke, Sugiura, Yuta, Ono, Tetsuo, Inami, Masahiko and Igarashi, Takeo (2010): RoboJockey: real-time, simultaneous, and continuous creation of robot actions for everyone. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology. pp. 399-400.

We developed RoboJockey (Robot Jockey), an interface for coordinating robot actions such as dancing, in the spirit of the disc jockey and video jockey. The system enables a user to choreograph a dance for a robot to perform using a simple visual language: humanoid robot actions are coordinated through combinations of arm and leg movements, and every action is automatically performed to the background music and its beat. RoboJockey offers end users a new form of entertainment with robots.

© All rights reserved Shirokura et al. and/or their publisher

2009
 

Sakamoto, Daisuke, Honda, Koichiro, Inami, Masahiko and Igarashi, Takeo (2009): Sketch and run: a stroke-based interface for home robots. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems. pp. 197-200.

Numerous robots have been developed, and some of them are already being used in homes, institutions, and workplaces. Despite the development of useful robot functions, the focus so far has not been on the user interfaces of robots. General users find it hard to understand what the robots are doing and what kind of work they can do. This paper presents an interface for commanding home robots using stroke gestures on a computer screen. The interface allows the user to control robots and design their behaviors by sketching the robots' behaviors and actions on a top-down view from ceiling cameras. To convey a feeling of directly controlling the robots, our interface employs the live camera view. In this study, we focused on a house-cleaning task that is typical of home robots and developed a sketch interface for designing the behaviors of vacuuming robots.

© All rights reserved Sakamoto et al. and/or ACM Press

 

Kato, Jun, Sakamoto, Daisuke, Inami, Masahiko and Igarashi, Takeo (2009): Multi-touch interface for controlling multiple mobile robots. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems. pp. 3443-3448.

Robots must be given some form of command in order to carry out a complex task; an initial instruction is required even if they perform their tasks autonomously. We therefore need interfaces for operating and teaching robots. Natural language, joysticks, and other pointing devices are currently used for this purpose. These interfaces, however, have difficulty operating multiple robots simultaneously. We developed a multi-touch interface with a top-down view from a ceiling camera for controlling multiple mobile robots. The user specifies, on this view, a vector field that all robots follow. This paper describes the user interface, its implementation, and future work on the project.

© All rights reserved Kato et al. and/or ACM Press

 

Seifried, Thomas, Haller, Michael, Scott, Stacey D., Perteneder, Florian, Rendl, Christian, Sakamoto, Daisuke and Inami, Masahiko (2009): CRISTAL: a collaborative home media and device controller based on a multi-touch display. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces. pp. 33-40.

While most homes are inherently social places, existing devices designed to control consumer electronics typically support only single-user interaction. Further, as the number of consumer electronics in modern homes increases, people are often forced to switch between many controllers to interact with these devices. To simplify interaction with these devices and to enable more collaborative forms of device control, we propose an integrated remote control system called CRISTAL (Control of Remotely Interfaced Systems using Touch-based Actions in Living spaces). CRISTAL enables people to control a wide variety of digital devices from a centralized, interactive tabletop system that provides an intuitive, gesture-based interface, allowing multiple users to control home media devices through a virtually augmented video image of the surrounding environment. A preliminary user study of the CRISTAL system is presented, along with a discussion of future research directions.

© All rights reserved Seifried et al. and/or their publisher

2008
 

Shiomi, Masahiro, Sakamoto, Daisuke, Kanda, Takayuki, Ishi, Carlos Toshinori, Ishiguro, Hiroshi and Hagita, Norihiro (2008): A semi-autonomous communication robot: a field trial at a train station. In: Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction 2008. pp. 303-310.

This paper reports an initial field trial with a prototype of a semi-autonomous communication robot at a train station. We developed an operator-requesting mechanism to achieve semi-autonomous operation of a communication robot functioning in real environments. The operator-requesting mechanism autonomously detects situations that the robot cannot handle by itself, and a human operator helps by assuming control of the robot. This approach gives semi-autonomous robots the ability to function naturally with minimal human effort. Our system consists of a humanoid robot and ubiquitous sensors. The robot has such basic communicative behaviors as greeting and route guidance. The experimental results revealed that the operator-requesting mechanism correctly requested the operator's help in 85% of the necessary situations; in semi-autonomous mode, the operator had to control the robot for only 25% of the experiment time, and the system successfully guided 68% of the passengers. At the same time, this trial provided the opportunity to gather user data for the further development of natural behaviors for such robots operating in real environments.

© All rights reserved Shiomi et al. and/or ACM Press

2007
 

Hayashi, Kotaro, Sakamoto, Daisuke, Kanda, Takayuki, Shiomi, Masahiro, Koizumi, Satoshi, Ishiguro, Hiroshi, Ogasawara, Tsukasa and Hagita, Norihiro (2007): Humanoid robots as a passive-social medium: a field experiment at a train station. In: Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction 2007. pp. 137-144.

This paper reports a method that uses humanoid robots as a communication medium. There are many interactive robots under development, but due to their limited perception, their interactivity is still far poorer than that of humans. Our approach in this paper is to limit robots' purpose to a non-interactive medium and to look for a way to attract people's interest in the information that robots convey. We propose using robots as a passive-social medium, in which multiple robots converse with each other. We conducted a field experiment at a train station for eight days to investigate the effects of a passive-social medium.

© All rights reserved Hayashi et al. and/or ACM Press

 

Sakamoto, Daisuke, Kanda, Takayuki, Ono, Tetsuo, Ishiguro, Hiroshi and Hagita, Norihiro (2007): Android as a telecommunication medium with a human-like presence. In: Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction 2007. pp. 193-200.

In this research, we realize human telepresence by developing a remote-controlled android system called Geminoid HI-1. Experimental results confirm that participants felt a stronger presence of the operator when he talked through the android than when he appeared on a video monitor in a video-conference system. In addition, participants talked with the robot naturally and evaluated its human-likeness as equal to that of a man on a video monitor. We conclude by discussing a remote-control system for telepresence that uses a human-like android robot as a new telecommunication medium.

© All rights reserved Sakamoto et al. and/or ACM Press

2006
 

Sakamoto, Daisuke and Ono, Tetsuo (2006): Sociality of robots: do robots construct or collapse human relations?. In: Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction 2006. pp. 355-356.

With developments in robotics, robots "living" with people will become part of daily life in the near future. However, there are many problems with social robots. In particular, the behavior of robots can influence human relations, and this influence has not yet been clarified. In this paper, we report on an experiment we conducted to verify the influence of robot behavior on human relations using balance theory. The results show that robots can have both a good and a bad influence on human relations: one person's impression of another can change because of a robot. In other words, robots can construct or collapse human relations.

© All rights reserved Sakamoto and Ono and/or ACM Press

2005
 

Sakamoto, Daisuke, Kanda, Takayuki, Ono, Tetsuo, Kamashima, Masayuki, Imai, Michita and Ishiguro, Hiroshi (2005): Cooperative embodied communication emerged by interactive humanoid robots. In International Journal of Human-Computer Studies, 62 (2) pp. 247-265.

Research on humanoid robots has produced various uses for their body properties in communication. In particular, mutual relationships of body movements between a robot and a human are considered to be important for smooth and natural communication, as they are in human-human communication. We have developed a semi-autonomous humanoid robot system that is capable of cooperative body movements with humans using environment-based sensors and switching communicative units. Concretely, this system realizes natural communication by using typical behaviors such as: "nodding," "eye-contact," "face-to-face," etc. It is important to note that the robot parts are NOT operated directly; only the communicative units in the robot system are switched. We conducted an experiment using the mentioned robot system and verified the importance of cooperative behaviors in a route-guidance situation where a human gives directions to the robot. The task requires a human participant (called the "speaker") to teach a route to a "hearer" that is either (1) a human, (2) the developed robot performing cooperative movements, or (3) a robot that does not move at all. The experiment is evaluated through a questionnaire and an analysis of body movements using three-dimensional data from a motion-capture system. The results indicate that the cooperative body movements greatly enhance the emotional impressions of human speakers in a route-guidance situation. We believe these results will allow us to develop interactive humanoid robots that sociably communicate with humans.

© All rights reserved Sakamoto et al. and/or Academic Press

 


Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/daisuke_sakamoto.html
