Number of co-authors: 19
Number of publications with 3 favourite co-authors: Daisuke Sakamoto: 4, Makoto Okamoto: 4, Kiyohide Ito: 4
Tetsuo Ono's 3 most productive colleagues in number of publications: Takeo Igarashi: 66, Hiroshi Ishiguro: 55, Masahiko Inami: 47
Publications by Tetsuo Ono (bibliography)
Okamoto, Makoto, Komatsu, Takanori, Ito, Kiyohide, Akita, Junichi and Ono, Tetsuo (2011): FutureBody: design of perception using the human body. In: Proceedings of the 2011 Augmented Human International Conference 2011. p. 35.
We created a new interactive design concept, "FutureBody," that generates or augments new perceptions for users. The concept consists of two elements, "active searching" and "embodiment": users search their environment actively, and the system emits indirect feedback that engages their embodiment. We believe this concept will form the basis for a new perception design methodology.
© All rights reserved Okamoto et al. and/or ACM Press
Shirokura, Takumi, Sakamoto, Daisuke, Sugiura, Yuta, Ono, Tetsuo, Inami, Masahiko and Igarashi, Takeo (2010): RoboJockey: real-time, simultaneous, and continuous creation of robot actions for everyone. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 399-400.
We developed the RoboJockey (Robot Jockey) interface for coordinating robot actions, such as dancing, in the spirit of a "disc jockey" or "video jockey". The system enables a user to choreograph a dance for a robot to perform by using a simple visual language. Users can coordinate humanoid robot actions through combinations of arm and leg movements, and every action is automatically performed to the background music and beat. RoboJockey gives end-users a new entertainment experience with robots.
© All rights reserved Shirokura et al. and/or their publisher
Ogawa, Kohei and Ono, Tetsuo (2008): ITACO: Effects to Interactions by Relationships between Humans and Artifacts. In: Prendinger, Helmut, Lester, James C. and Ishizuka, Mitsuru (eds.) IVA 2008 - Intelligent Virtual Agents - 8th International Conference September 1-3, 2008, Tokyo, Japan. pp. 296-307.
Sakamoto, Daisuke, Kanda, Takayuki, Ono, Tetsuo, Ishiguro, Hiroshi and Hagita, Norihiro (2007): Android as a telecommunication medium with a human-like presence. In: Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction 2007. pp. 193-200.
In this research, we realize human telepresence by developing a remote-controlled android system called Geminoid HI-1. Experimental results confirm that participants felt a stronger presence of the operator when he talked through the android than when he appeared on a video monitor in a video conference system. In addition, participants talked with the robot naturally and evaluated its human likeness as equal to that of a man on a video monitor. In this paper's conclusion, we discuss a remote-control system for telepresence that uses a human-like android robot as a new telecommunication medium.
© All rights reserved Sakamoto et al. and/or ACM Press
Sakamoto, Daisuke and Ono, Tetsuo (2006): Sociality of robots: do robots construct or collapse human relations?. In: Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction 2006. pp. 355-356.
With developments in robotics, robots "living" with people will become a part of daily life in the near future. However, there are many problems with social robots. In particular, the behavior of robots can influence human relations, and this influence has not yet been clarified. In this paper, we report on an experiment we conducted to verify the influence of robot behavior on human relations using "balance theory." The results show that robots can have both good and bad influences on human relations: one person's impression of another can change because of a robot. In other words, robots can construct or collapse human relations.
© All rights reserved Sakamoto and Ono and/or ACM Press
Ono, Tetsuo, Komatsu, Takanori, Akita, Junichi, Ito, Kiyohide and Okamoto, Makoto (2006): CyARM: Interactive Device for Environment Recognition and Joint Haptic Attention Using Non-visual Modality. In: Miesenberger, Klaus, Klaus, Joachim, Zagler, Wolfgang L. and Karshmer, Arthur I. (eds.) ICCHP 2006 - Computers Helping People with Special Needs, 10th International Conference July 11-13, 2006, Linz, Austria. pp. 1251-1258.
Sakamoto, Daisuke, Kanda, Takayuki, Ono, Tetsuo, Kamashima, Masayuki, Imai, Michita and Ishiguro, Hiroshi (2005): Cooperative embodied communication emerged by interactive humanoid robots. In International Journal of Human-Computer Studies, 62 (2) pp. 247-265.
Research on humanoid robots has produced various uses for their body properties in communication. In particular, mutual relationships of body movements between a robot and a human are considered to be important for smooth and natural communication, as they are in human-human communication. We have developed a semi-autonomous humanoid robot system that is capable of cooperative body movements with humans using environment-based sensors and switching communicative units. Concretely, this system realizes natural communication by using typical behaviors such as: "nodding," "eye-contact," "face-to-face," etc. It is important to note that the robot parts are NOT operated directly; only the communicative units in the robot system are switched. We conducted an experiment using the mentioned robot system and verified the importance of cooperative behaviors in a route-guidance situation where a human gives directions to the robot. The task requires a human participant (called the "speaker") to teach a route to a "hearer" that is (1) a human, (2) a developed robot that performs cooperative movements, and (3) a robot that does not move at all. This experiment is subjectively evaluated through a questionnaire and an analysis of body movements using three-dimensional data from a motion capture system. The results indicate that the cooperative body movements greatly enhance the emotional impressions of human speakers in a route-guidance situation. We believe these results will allow us to develop interactive humanoid robots that sociably communicate with humans.
© All rights reserved Sakamoto et al. and/or Academic Press
Ito, Kiyohide, Okamoto, Makoto, Akita, Junichi, Ono, Tetsuo, Gyobu, Ikuko, Takagi, Tomohito, Hoshi, Takahiro and Mishima, Yu (2005): CyARM: an alternative aid device for blind persons. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1483-1488.
With the concept of 'human-machine interface', designed especially for visually impaired persons, we have developed an electric aid device for use in guiding orientation and locomotion. The device, which we call CyARM, measures the distance between a person and an object with an ultrasonic sensor and transmits the distance information to the user's haptic sense. In this report, we will: (1) outline the concept of CyARM, (2) describe its mechanism, and (3) demonstrate three preliminary experiments that verify the usability of CyARM. We conducted the experiments in terms of detection of objects, detection of space, and tracking object movement. As a result of these experiments, we have concluded that CyARM is potentially effective for visually impaired persons. Our study will encourage the related studies of user interfaces, particularly focusing on electric aid devices that guide visually impaired persons in detecting their environment.
© All rights reserved Ito et al. and/or ACM Press
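The CyARM abstract above describes a simple sensing loop: an ultrasonic sensor measures the distance to an obstacle, and the device converts that distance into haptic feedback. As a purely illustrative sketch (the actual mapping, range bounds, and units are not given in the abstract; the function name and constants here are hypothetical), the core conversion might look like:

```python
def distance_to_tension(distance_cm: float,
                        min_cm: float = 30.0,
                        max_cm: float = 300.0) -> float:
    """Map an ultrasonic distance reading to a normalized haptic intensity.

    Closer objects yield a stronger signal (1.0 at min_cm or nearer),
    farther objects a weaker one (0.0 at max_cm or beyond). A linear
    inverse mapping is assumed here for illustration only.
    """
    clamped = max(min_cm, min(distance_cm, max_cm))
    return (max_cm - clamped) / (max_cm - min_cm)
```

For example, a reading at the near bound maps to full intensity, the far bound to zero, and the midpoint of the range to 0.5; a real device would tune this curve (and likely make it nonlinear) to match human haptic sensitivity.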
Ogawa, Kohei and Ono, Tetsuo (2005): Ubiquitous cognition: mobile environment achieved by migratable agent. In: Tscheligi, Manfred, Bernhaupt, Regina and Mihalic, Kristijan (eds.) Proceedings of the 7th Conference on Human-Computer Interaction with Mobile Devices and Services - Mobile HCI 2005 September 19-22, 2005, Salzburg, Austria. pp. 337-338.
We propose a concept of Ubiquitous Cognition and introduce an integrated agent for communication (ITACO) system based on the concept. To realize the proposed concept, the ITACO system supports the user through a migratable agent that is context-sensitive and gives continuous assistance. The key factor in this system is the construction of a relationship between the user and the agent, and the carrying over of this relationship to whatever object the agent migrates to. Psychological experiments were carried out to verify this succession of the relationship between media. The results showed that the subjects' attachment to the media, as well as the relationship itself, was carried over across media through agent migration.
© All rights reserved Ogawa and Ono and/or ACM Press
Okamoto, Makoto, Akita, Junichi, Ito, Kiyohide, Ono, Tetsuo and Takagi, Tomohito (2004): CyARM - Interactive Device for Environment Recognition Using a Non-visual Modality. In: Klaus, Joachim, Miesenberger, Klaus, Zagler, Wolfgang L. and Burger, Dominique (eds.) ICCHP 2004 - Computers Helping People with Special Needs - 9th International Conference July 7-9, 2004, Paris, France. pp. 462-467.
Changes to this page (author)
18 Apr 2011: Modified
03 Nov 2010: Modified
23 Jul 2009: Modified
23 Jul 2009: Modified
12 Jul 2009: Modified
09 Jul 2009: Modified
09 Jul 2009: Modified
29 May 2009: Modified
24 Jul 2007: Modified
29 Jun 2007: Modified
27 Jun 2007: Added
Page maintainer: The Editorial Team