Publication statistics

Pub. period: 2004-2011
Pub. count: 10
Number of co-authors: 23



Co-authors

Number of publications with his 3 most frequent co-authors:

Yoshio Matsumoto: 7
Kentaro Takemura: 2
Abdelaziz Khiat: 2

 

 

Productive colleagues

Tsukasa Ogasawara's 3 most productive colleagues by number of publications:

Hiroshi Ishiguro: 55
Norihiro Hagita: 46
Takayuki Kanda: 46
 
 
 


Tsukasa Ogasawara


Publications by Tsukasa Ogasawara (bibliography)

2011
 

Kondo, Yutaka, Kawamura, Masato, Takemura, Kentaro, Takamatsu, Jun and Ogasawara, Tsukasa (2011): Gaze motion planning for android robot. In: Proceedings of the 6th International Conference on Human Robot Interaction 2011. pp. 171-172.

Because their appearance resembles human physical features, androids are expected to show human-like behavior. We therefore propose a gaze motion planning method in which we control the convergence of the eyes and the ratio of eye angle to head angle, which leads to a more precise estimation of gaze direction. We implemented our method on the android Actroid-SIT and conducted experiments to evaluate its effects. Through these experiments, we obtained general guidelines for planning more precise gaze motion in androids.

© All rights reserved Kondo et al. and/or their publisher
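As an illustration of the eye-to-head ratio idea described in the abstract, the following Python sketch splits a desired gaze shift between the head and the eyes using a fixed ratio and sets the eye convergence from the target distance. The ratio, the inter-pupil distance, and the coordinate convention are assumptions for illustration only, not the authors' implementation.

```python
# Hypothetical sketch: split a gaze shift between head and eyes with a fixed
# ratio, and set eye vergence from the target distance. All parameter values
# are illustrative assumptions, not taken from the paper.
import numpy as np

EYE_SEPARATION = 0.06   # assumed inter-pupil distance [m]
EYE_HEAD_RATIO = 0.3    # assumed fraction of the gaze shift taken by the eyes

def plan_gaze(target, head_pos):
    """Return (head_yaw, eye_yaw, vergence) in radians for a target point."""
    d = np.asarray(target, dtype=float) - np.asarray(head_pos, dtype=float)
    gaze_yaw = np.arctan2(d[1], d[0])          # total horizontal gaze shift
    eye_yaw = EYE_HEAD_RATIO * gaze_yaw        # eyes take a fixed share...
    head_yaw = gaze_yaw - eye_yaw              # ...and the head takes the rest
    distance = np.linalg.norm(d)
    vergence = 2.0 * np.arctan2(EYE_SEPARATION / 2.0, distance)  # convergence angle
    return head_yaw, eye_yaw, vergence

print(plan_gaze(target=(1.0, 0.5, 0.0), head_pos=(0.0, 0.0, 0.0)))
```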

 

Takemura, Kentaro, Ito, Akihiro, Takamatsu, Jun and Ogasawara, Tsukasa (2011): Active bone-conducted sound sensing for wearable interfaces. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 53-54.

In this paper, we propose a wearable sensor system that uses bone-conducted sound to measure the elbow angle and the position tapped by a finger. The system consists of two microphones and a speaker attached to the forearm. The novelty of this work is the use of active sensing to measure the elbow angle: a speaker emits sound into the bone, and a microphone receives the sound reflected at the elbow. This reflection depends on the elbow angle. Since the frequencies of the bone-conducted sound produced by tapping and by the speaker are different, the two techniques can be used simultaneously. We confirmed the feasibility of the proposed system through experiments.

© All rights reserved Takemura et al. and/or ACM Press
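The frequency-separation idea from the abstract can be illustrated with a short Python sketch that band-pass filters the microphone signal into a tap band and a band around the emitted tone, treating the tone's reflected amplitude as the quantity that varies with the elbow angle. The sampling rate, band limits, tone frequency, and detection threshold are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch of separating tap sounds from the reflected tone by frequency.
# All frequencies and thresholds below are illustrative assumptions.
import numpy as np
from scipy import signal

FS = 8000          # assumed sampling rate [Hz]
TONE_HZ = 1000     # assumed frequency of the emitted sound
TAP_BAND = (50, 400)                     # assumed band of tap-induced sound
TONE_BAND = (TONE_HZ - 50, TONE_HZ + 50)

def band(x, lo, hi):
    sos = signal.butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
    return signal.sosfiltfilt(sos, x)

def analyze(mic):
    """Return (tap_detected, tone_amplitude) for one microphone frame."""
    tap_energy = np.sqrt(np.mean(band(mic, *TAP_BAND) ** 2))
    tone_amp = np.sqrt(np.mean(band(mic, *TONE_BAND) ** 2))
    return tap_energy > 0.05, tone_amp   # tone_amp would be mapped to an elbow angle

# Example: a synthetic frame containing only the reflected tone.
t = np.arange(FS // 10) / FS
print(analyze(0.2 * np.sin(2 * np.pi * TONE_HZ * t)))
```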

2009
 

Causo, Albert J., Matsuo, Mai, Ueda, Etsuko, Matsumoto, Yoshio and Ogasawara, Tsukasa (2009): Individualization of voxel-based hand model. In: Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction 2009. pp. 219-220.

Improving hand pose estimation by refining the model matching step is necessary for creating a more natural human-robot interface. Individualizing the user's 3D hand model can lead to better hand pose estimation. This paper presents a way to accomplish this individualization by estimating the lengths of the finger links (bones), which are unique to every user. The 3D hand model is built from voxel data derived from silhouette images captured by multiple cameras, and the finger link lengths are estimated by searching a set of models generated from a calibration motion of the fingers. Initial pose estimation results using the individualized model show the feasibility of the system.

© All rights reserved Causo et al. and/or ACM Press
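A hedged sketch of the model-search step described above: candidate finger-link lengths are scored against fingertip positions observed during a calibration motion, and the best-fitting candidate is kept. The planar two-link finger model, the candidate grid, and the error measure are assumptions for illustration only, not the paper's voxel-based matching.

```python
# Illustrative grid search for finger-link lengths from calibration poses.
# The two-link planar model and candidate grid are assumptions.
import itertools
import numpy as np

def fingertip(l1, l2, theta1, theta2):
    """Planar forward kinematics of a two-link finger (angles in radians)."""
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return np.array([x, y])

def fit_link_lengths(angles, observed_tips, candidates):
    """Grid-search (l1, l2) minimizing the mean fingertip error over all frames."""
    best, best_err = None, np.inf
    for l1, l2 in itertools.product(candidates, repeat=2):
        err = np.mean([np.linalg.norm(fingertip(l1, l2, *a) - p)
                       for a, p in zip(angles, observed_tips)])
        if err < best_err:
            best, best_err = (l1, l2), err
    return best

# Synthetic calibration data generated with "true" lengths 0.045 m and 0.025 m.
angles = [(0.2, 0.3), (0.6, 0.8), (1.0, 1.2)]
tips = [fingertip(0.045, 0.025, *a) for a in angles]
print(fit_link_lengths(angles, tips, candidates=np.arange(0.02, 0.06, 0.005)))
```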

2007
 

Ido, Junichi, Ueda, Etsuko, Matsumoto, Yoshio and Ogasawara, Tsukasa (2007): Robotic telecommunication system based on facial information measurement. In: Proceedings of the 2007 International Conference on Intelligent User Interfaces 2007. pp. 266-269.

This paper proposes a multi-modal telecommunication system using a facial expression robot. We developed a system that projects an operator's facial expression to a remote place through the facial expression robot "Infanoid2": the operator's facial information is measured with a stereo camera system and reproduced by the robot in order to communicate with another person at the remote location. An impression evaluation experiment was performed using this system. Based on the experimental results, the paper discusses the effectiveness of robots as a telecommunication medium.

© All rights reserved Ido et al. and/or ACM Press

 

Hayashi, Kotaro, Sakamoto, Daisuke, Kanda, Takayuki, Shiomi, Masahiro, Koizumi, Satoshi, Ishiguro, Hiroshi, Ogasawara, Tsukasa and Hagita, Norihiro (2007): Humanoid robots as a passive-social medium: a field experiment at a train station. In: Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction 2007. pp. 137-144.

This paper reports a method that uses humanoid robots as a communication medium. There are many interactive robots under development, but due to their limited perception, their interactivity is still far poorer than that of humans. Our approach in this paper is to limit robots' purpose to a non-interactive medium and to look for a way to attract people's interest in the information that robots convey. We propose using robots as a passive-social medium, in which multiple robots converse with each other. We conducted a field experiment at a train station for eight days to investigate the effects of a passive-social medium.

© All rights reserved Hayashi et al. and/or ACM Press

2006
 

Khiat, Abdelaziz, Toyota, Masataka, Matsumoto, Yoshio and Ogasawara, Tsukasa (2006): Investigating the relation between robot bodily expressions and their impression on the user. In: Proceedings of the 2006 International Conference on Intelligent User Interfaces 2006. pp. 339-341.

During interaction, people usually adapt their behavior according to their interpretation of their partner's bodily expressions. It is not known how strongly similar expressions performed by robots affect a human observer; this paper explores that issue. The study shows a correlation between the nature of the robot's bodily expressions, assessed through questionnaires, and their effect on brain activity. Unpleasant bodily expressions of the robot were shown to elicit unpleasant impressions and vice versa, with activity observed in one brain area when the expression was pleasant and in another when it was unpleasant.

© All rights reserved Khiat et al. and/or ACM Press

2005
 

Suenaga, Tsuyoshi, Matsumoto, Yoshio and Ogasawara, Tsukasa (2005): 3D display based on motion parallax using non-contact 3D measurement of head position. In: Proceedings of OZCHI05, the CHISIG Annual Conference on Human-Computer Interaction 2005. pp. 1-4.

In this paper, a novel non-contact 3D display based on motion parallax is proposed. The user's 3D viewpoint is measured by a real-time, non-contact measurement system. By moving their head and watching a CG image that corresponds to the measured viewpoint, the user can perceive 3D information on a normal flat display. Basic depth-perception experiments with one eye and with both eyes were conducted to show the feasibility of the system.

© All rights reserved Suenaga et al. and/or their publisher
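The viewpoint-dependent rendering described in the abstract is commonly realized with an off-axis ("head-coupled") view frustum. The Python sketch below shows one way to derive such a frustum from a measured head position; the screen dimensions, units, and frustum formulation are assumptions for illustration, since the paper's rendering pipeline is not detailed here.

```python
# Minimal sketch of head-coupled rendering: the tracked head position defines
# an off-axis frustum so the image matches the user's physical viewpoint.
# Screen size and conventions are illustrative assumptions.
import numpy as np

SCREEN_W, SCREEN_H = 0.40, 0.30   # assumed physical screen size [m], centred at the origin
NEAR, FAR = 0.01, 10.0

def off_axis_frustum(eye):
    """Return (left, right, bottom, top) at the near plane for an eye at (x, y, z),
    with the screen lying in the z = 0 plane and the eye at z > 0."""
    ex, ey, ez = eye
    scale = NEAR / ez                     # similar triangles: screen plane -> near plane
    left = (-SCREEN_W / 2 - ex) * scale
    right = (SCREEN_W / 2 - ex) * scale
    bottom = (-SCREEN_H / 2 - ey) * scale
    top = (SCREEN_H / 2 - ey) * scale
    return left, right, bottom, top       # feed to e.g. glFrustum together with NEAR/FAR

# As the tracked head moves right, the frustum skews left, producing motion parallax.
print(off_axis_frustum(eye=(0.05, 0.0, 0.6)))
print(off_axis_frustum(eye=(-0.05, 0.0, 0.6)))
```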

 

Kondo, Masahiro, Ueda, Jun, Matsumoto, Yoshio and Ogasawara, Tsukasa (2005): Evaluation of Manipulative Familiarization and Fatigue Based on Contact State Transition. In: WHC 2005 - World Haptics Conference 18-20 March, 2005, Pisa, Italy. pp. 102-107.

2004
 

Khiat, Abdelaziz, Matsumoto, Yoshio and Ogasawara, Tsukasa (2004): Task specific eye movements understanding for a gaze-sensitive dictionary. In: Nunes, Nuno Jardim and Rich, Charles (eds.) International Conference on Intelligent User Interfaces 2004 January 13-16, 2004, Funchal, Madeira, Portugal. pp. 265-267.

In this paper, we study the relation between a user's degree of understanding and his or her eye movements, in an effort to realize a proactive interface that monitors the user and provides contextual support. The application is a gaze-sensitive dictionary that helps the user while reading a text in a browser window. Not only the user's gaze is analyzed but also the context, and thus the degree of difficulty of the text being read. The experimental results suggest using regressions as an indicator to trigger the help process, along with a context grounding approach.

© All rights reserved Khiat et al. and/or ACM Press
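As a hedged illustration of using regressions as a trigger, the Python sketch below counts backward (leftward) jumps in a left-to-right fixation sequence and offers a dictionary look-up once a threshold is exceeded. The thresholds and the fixation format are illustrative assumptions, not the authors' criteria.

```python
# Illustrative regression detector for reading gaze data. Thresholds are
# assumptions chosen only to make the example concrete.
REGRESSION_PX = 40     # assumed minimum leftward jump to count as a regression
TRIGGER_COUNT = 2      # assumed number of regressions before help is offered

def regression_targets(fixations):
    """fixations: list of (x, y) gaze points in reading order.
    Returns the x-positions that were regressed to."""
    targets = []
    for prev, cur in zip(fixations, fixations[1:]):
        if prev[0] - cur[0] > REGRESSION_PX and abs(prev[1] - cur[1]) < 20:
            targets.append(cur[0])       # same line, jumped back: a regression
    return targets

def should_show_dictionary(fixations):
    return len(regression_targets(fixations)) >= TRIGGER_COUNT

print(should_show_dictionary([(100, 10), (180, 12), (120, 11), (200, 10), (130, 12)]))
```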

 

Koeda, Masanao, Matsumoto, Yoshio and Ogasawara, Tsukasa (2004): Annotation-Based Assistance System for Unmanned Helicopter with Wearable Augmented Reality Environment. In: 3rd IEEE and ACM International Symposium on Mixed and Augmented Reality ISMAR 2004 2-5 November, 2004, Arlington, VA, USA. pp. 288-289.

 


Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/tsukasa_ogasawara.html
