Jun Takamatsu

 

Publications by Jun Takamatsu (bibliography)

2011
 

Kondo, Yutaka, Kawamura, Masato, Takemura, Kentaro, Takamatsu, Jun and Ogasawara, Tsukasa (2011): Gaze motion planning for android robot. In: Proceedings of the 6th International Conference on Human-Robot Interaction 2011. pp. 171-172.

Androids are expected to show human-like behavior because their appearance resembles human physical features. We therefore propose a gaze motion planning method in which we control the convergence of the eyes and the ratio of eye angle to head angle, which leads to more precise estimation of the gaze direction. We implemented our method on the android Actroid-SIT and conducted experiments to evaluate its effects. Through these experiments, we obtained common guidelines for planning more precise gaze motion for androids.

© All rights reserved Kondo et al. and/or their publisher
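
The planning idea sketched in the abstract above, pointing both eyes at the target (convergence) while fixing the ratio of eye angle to head angle, can be illustrated roughly as follows. This is a hypothetical Python sketch for illustration only; the ratio, eye separation, and eye joint limit are assumed values, not parameters from the paper.

import math

# Hypothetical sketch: split a horizontal gaze shift between head and eyes
# and converge both eyes on the target. All constants are assumptions.
EYE_SEPARATION = 0.065        # distance between the eyes in metres (assumed)
EYE_TO_HEAD_RATIO = 0.3       # share of the gaze shift assigned to the eyes (assumed)
EYE_LIMIT = math.radians(35)  # mechanical eye rotation limit (assumed)

def plan_gaze(target_x, target_z):
    """Return (head_yaw, left_eye_yaw, right_eye_yaw) in radians for a target
    at lateral offset target_x and distance target_z (metres)."""
    gaze_angle = math.atan2(target_x, target_z)

    # Keep the eye/head ratio fixed so observers can read the gaze direction.
    eye_share = max(-EYE_LIMIT, min(EYE_LIMIT, EYE_TO_HEAD_RATIO * gaze_angle))
    head_yaw = gaze_angle - eye_share

    # Convergence: each eye aims at the target from its own position in the head.
    left_eye_yaw = math.atan2(target_x + EYE_SEPARATION / 2, target_z) - head_yaw
    right_eye_yaw = math.atan2(target_x - EYE_SEPARATION / 2, target_z) - head_yaw
    return head_yaw, left_eye_yaw, right_eye_yaw

print(plan_gaze(0.5, 1.0))  # e.g. a target 0.5 m to the side, 1 m ahead

In a real controller these angles would be sent to the android's head and eye joints and interpolated over time; the sketch only shows how a gaze shift could be divided.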

 

Takemura, Kentaro, Ito, Akihiro, Takamatsu, Jun and Ogasawara, Tsukasa (2011): Active bone-conducted sound sensing for wearable interfaces. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology. pp. 53-54.

In this paper, we propose a wearable sensor system that uses bone-conducted sound to measure the angle of the elbow and the position tapped by a finger. The system consists of two microphones and a speaker attached to the forearm. The novelty of this paper is the use of active sensing to measure the elbow angle: the speaker emits sound into the bone, a microphone receives the sound reflected at the elbow, and the reflection depends on the elbow angle. Since the bone-conducted sound produced by tapping and the sound emitted by the speaker differ in frequency, the two proposed techniques can be used simultaneously. We confirmed the feasibility of the proposed system through experiments.

© All rights reserved Takemura et al. and/or ACM Press
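
The frequency separation mentioned in the abstract above, telling the actively emitted tone apart from the low-frequency sound of a finger tap so that both measurements can run at once, could be realized roughly as in the following sketch. This is a hypothetical Python/NumPy illustration; the sampling rate, carrier frequency, band edges, and tap threshold are assumptions rather than values from the paper.

import numpy as np

# Hypothetical sketch: split one microphone frame into the energy around the
# emitted carrier (used to infer elbow angle from the reflection) and the
# energy in a low band (used to detect finger taps). Constants are assumed.
FS = 8000          # sampling rate in Hz (assumed)
CARRIER_HZ = 1000  # frequency of the tone emitted into the bone (assumed)

def band_energy(frame, fs, lo_hz, hi_hz):
    """Spectral energy of `frame` between lo_hz and hi_hz."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    return float(spectrum[(freqs >= lo_hz) & (freqs <= hi_hz)].sum())

def analyze_frame(frame):
    """Return (carrier_band_energy, tap_detected) for one microphone frame."""
    carrier_energy = band_energy(frame, FS, CARRIER_HZ - 50, CARRIER_HZ + 50)
    tap_energy = band_energy(frame, FS, 20, 400)  # taps assumed low-frequency
    tap_detected = tap_energy > 1e3               # threshold would need calibration
    # Mapping carrier_energy to an elbow angle would require calibration
    # against a reference sensor; only the raw features are returned here.
    return carrier_energy, tap_detected

t = np.arange(0, 0.05, 1.0 / FS)
frame = 0.5 * np.sin(2 * np.pi * CARRIER_HZ * t)  # synthetic reflected carrier
print(analyze_frame(frame))

Because the two measurements live in separate frequency bands, a single microphone stream can serve both the elbow-angle estimate and tap detection at the same time, which is the point the abstract makes.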

 
