Number of co-authors: 11

Number of publications with 3 favourite co-authors:
Jun Rekimoto: 10
Emi Tamaki: 3
Adiyan Mujibiya: 2

Takashi Miyaki's 3 most productive colleagues in number of publications:
Jun Rekimoto: 60
Michael Beigl: 23
Michimune Kohno: 8
Publications by Takashi Miyaki (bibliography)
Tamaki, Emi, Miyaki, Takashi and Rekimoto, Jun (2011): PossessedHand: techniques for controlling human hands using electrical muscles stimuli. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 543-552.
If a device can control human hands, the device can be useful for HCI and tangible applications' output. To aid the control of finger movement, we present PossessedHand, a device with a forearm belt that can inform when and which fingers should be moved. PossessedHand controls the user's fingers by applying electrical stimulus to the muscles around the forearm. Each muscle is stimulated via 28 electrode pads. Muscles at different depths in the forearm can be selected for stimulation by varying the stimulation level. PossessedHand can automatically calibrate the system for individuals. The automatic calibration system estimates relations between each electrode pad, stimulation level and muscle movement. Experiments show that PossessedHand can control the motion of 16 joints in the hand. Further, we also discuss an application based on this device to aid in playing a musical instrument.
© All rights reserved Tamaki et al. and/or their publisher
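The automatic calibration the abstract describes, estimating the relation between electrode pad, stimulation level, and resulting muscle movement, can be pictured as a small search over recorded responses. The following is a hypothetical sketch of that idea only; the function, the scoring rule, and the response data are illustrative assumptions, not details from the paper.

```python
# Hypothetical calibration sketch: for each (electrode pad, stimulation
# level) setting, record how much each joint moved, then pick the setting
# whose movement is most concentrated on the target joint.

def calibrate(responses, target_joint):
    """responses: dict mapping (pad, level) -> dict of joint -> movement.
    Returns the (pad, level) setting that best isolates target_joint."""
    best, best_score = None, float("-inf")
    for setting, joints in responses.items():
        target = joints.get(target_joint, 0.0)
        spillover = sum(v for j, v in joints.items() if j != target_joint)
        score = target - spillover  # reward isolated movement of the target
        if score > best_score:
            best, best_score = setting, score
    return best

# Toy example: pad 3 at level 2 moves the index MP joint most cleanly.
responses = {
    (3, 1): {"index_MP": 0.2, "middle_MP": 0.1},
    (3, 2): {"index_MP": 0.9, "middle_MP": 0.1},
    (7, 2): {"index_MP": 0.4, "middle_MP": 0.6},
}
print(calibrate(responses, "index_MP"))  # -> (3, 2)
```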
Miyaki, Takashi, Ding, Yong, Banitalebi, Behnam and Beigl, Michael (2011): Things that hover: interaction with tiny battery-less robots on desktop. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 531-540.
This paper presents computationally and physically augmented desktop objects -- "Things that hover" -- that are capable of moving autonomously on a desktop, and discusses the technical mechanisms, possible future interaction styles, and applications based on this architecture. A goal of the design is to create self-moving robotic modules on top of a flat surface. By integrating lightweight piezoelectric air-blow actuators with contact-less power supplied from the desktop surface, tiny robots can hover and control their direction of movement without any battery, which shows that our approach is practically feasible.
© All rights reserved Miyaki et al. and/or their publisher
Mujibiya, Adiyan, Miyaki, Takashi and Rekimoto, Jun (2010): Anywhere touchtyping: text input on arbitrary surface using depth sensing. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 443-444.
In this paper, a touch-typing-enabled virtual keyboard system using depth sensing on an arbitrary surface is proposed. Keystroke events are detected by matching against a database of 3-dimensional hand appearances, combined with sensing of the fingertip's touch on the surface. Our prototype acquires a hand-posture depth map by applying a phase-shift algorithm to Digital Light Processor (DLP) fringe projection on an arbitrary flat surface. The system robustly detects hand postures on the sensed surface without requiring the hands to be aligned to a virtual keyboard frame. Keystroke feedback is the physical touch on the surface, so no dedicated hardware must be worn. The system runs in real time at an average of 20 frames per second.
© All rights reserved Mujibiya et al. and/or their publisher
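One piece of the pipeline above, deciding a keystroke from a fingertip's position and depth, can be sketched simply: register a touch when the fingertip depth is within a tolerance of the surface plane, and assign the nearest key center. The key layout, tolerance, and coordinates below are illustrative assumptions, not values from the paper.

```python
# Hypothetical keystroke decision from depth-sensed fingertip data.
import math

# Toy virtual-keyboard layout: key -> (x, y) center in millimetres.
KEYS = {"f": (80.0, 40.0), "g": (100.0, 40.0), "h": (120.0, 40.0)}

def detect_keystroke(tip_xy, tip_depth_mm, surface_depth_mm, tol_mm=5.0):
    """Return the struck key, or None if the fingertip is not touching."""
    if abs(tip_depth_mm - surface_depth_mm) > tol_mm:
        return None  # fingertip is hovering above the surface
    return min(KEYS, key=lambda k: math.dist(KEYS[k], tip_xy))

print(detect_keystroke((102.0, 41.0), 500.0, 498.0))  # -> g
print(detect_keystroke((102.0, 41.0), 480.0, 498.0))  # -> None
```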
Tamaki, Emi, Miyaki, Takashi and Rekimoto, Jun (2010): PossessedHand: a hand gesture manipulation system using electrical stimuli. In: Proceedings of the 2010 Augmented Human International Conference 2010. p. 2.
Acquiring knowledge about the timing and speed of hand gestures is important to learn physical skills, such as playing musical instruments, performing arts, and making handicrafts. However, it is difficult to use devices that dynamically and mechanically control a user's hand for learning because such devices are very large, and hence, are unsuitable for daily use. In addition, since glove-type devices interfere with actions such as playing musical instruments, performing arts, and making handicrafts, users tend to avoid wearing these devices. To solve these problems, we propose PossessedHand, a device with a forearm belt, for controlling a user's hand by applying electrical stimuli to the muscles around the forearm of the user. The dimensions of PossessedHand are 10 x 7.0 x 8.0 cm, and the device is portable and suited for daily use. The electrical stimuli are generated by an electronic pulse generator and transmitted from 14 electrode pads. Our experiments confirmed that PossessedHand can control the motion of 16 joints in the hand. We propose an application of this device to help a beginner learn how to play musical instruments such as the piano and koto.
© All rights reserved Tamaki et al. and/or ACM Press
Ishiguro, Yoshio, Mujibiya, Adiyan, Miyaki, Takashi and Rekimoto, Jun (2010): Aided eyes: eye activity sensing for daily life. In: Proceedings of the 2010 Augmented Human International Conference 2010. p. 25.
Our eyes collect a considerable amount of information when we use them to look at objects. In particular, eye movement allows us to gaze at an object and shows our level of interest in the object. In this research, we propose a method that involves real-time measurement of eye movement for human memory enhancement; the method employs gaze-indexed images captured using a video camera that is attached to the user's glasses. We present a prototype system with an infrared-based corneal limbus tracking method. Although existing eye tracker systems track eye movement with high accuracy, they are not suitable for daily use because the mobility of these systems is incompatible with a high sampling rate. Our prototype has small phototransistors, infrared LEDs, and a video camera, which make it possible to attach the entire system to the glasses. Additionally, the accuracy of this method is compensated by combining image processing methods and contextual information, such as eye direction, for information extraction. We develop an information extraction system with real-time object recognition in the user's visual attention area by using the prototype of an eye tracker and a head-mounted camera. We apply this system to (1) fast object recognition by using a SURF descriptor that is limited to the gaze area and (2) descriptor matching against a database of past images. Face recognition using Haar-like object features and text logging using OCR technology are also implemented. The combination of a low-resolution camera and a high-resolution, wide-angle camera is studied for high daily usability. The possibility of gaze-guided computer vision is discussed in this paper, as is communication by the phototransistor in the eye tracker and the development of a sensor system with high transparency.
© All rights reserved Ishiguro et al. and/or ACM Press
Tamaki, Emi, Miyaki, Takashi and Rekimoto, Jun (2009): Brainy hand: an ear-worn hand gesture interaction device. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4255-4260.
Existing wearable hand gesture interaction devices are very bulky and cannot be worn in everyday life because of the presence of a large visual feedback device. In particular, an eyeglass-type head-mounted display is too large for constant usage. To solve this problem, we propose Brainy Hand, a simple wearable device that adopts a laser line or, more specifically, a mini-projector as its visual feedback device. Brainy Hand consists of a color camera, an earphone, and a laser line or mini-projector. This device uses the camera to detect 3D hand gestures; the earphone is used for receiving audio feedback. In this study, we introduce several user interfaces using Brainy Hand (e.g., a music player and a phone).
© All rights reserved Tamaki et al. and/or ACM Press
Iwasaki, Ken, Miyaki, Takashi and Rekimoto, Jun (2009): Expressive typing: a new way to sense typing pressure and its applications. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4369-4374.
In this paper, we propose a new way to measure key typing pressure when using off-the-shelf laptop computers. Accelerometers embedded in laptop computers to protect hard disks from sudden motion are becoming very common. This paper explores the concept of utilizing this accelerometer for sensing non-verbal aspects of key typing, such as key typing pressure. This possibility enables a wide variety of pressure-sensitive user interfaces through software alone, without requiring any additional hardware or sensors. Such software can be distributed easily to a substantial number of potential users. To confirm the feasibility of this idea, we compared typing finger velocities (obtained from high-speed camera images) with sensor data from an accelerometer embedded in a laptop computer. We then confirmed that there is a clear correlation between these two sets of data. We also investigated differences in typing pressure patterns among different users. By combining keystroke speeds and typing pressure, we found it is possible to distinguish among users. This feature can be used for security purposes, such as preventing a laptop computer from being used by non-owners. We also present possible application ideas such as rich text expression, new types of user interface elements, and authentication.
© All rights reserved Iwasaki et al. and/or ACM Press
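The core sensing idea above, reading keystroke "pressure" out of the laptop's built-in accelerometer, can be sketched as peak detection in a short window around each key-down event. The sample rate, window size, and trace below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: estimate per-keystroke pressure as the peak
# acceleration magnitude near each key-down timestamp.

def keystroke_pressure(accel, key_times, rate_hz=100, window_s=0.05):
    """accel: acceleration magnitudes sampled at rate_hz.
    key_times: key-down timestamps in seconds.
    Returns one peak magnitude per keystroke."""
    half = int(window_s * rate_hz)
    peaks = []
    for t in key_times:
        i = int(t * rate_hz)
        lo, hi = max(0, i - half), min(len(accel), i + half + 1)
        peaks.append(max(accel[lo:hi]))
    return peaks

# Toy trace: a hard strike near t=0.10 s and a soft one near t=0.30 s.
trace = [0.0] * 100
trace[10] = 1.8   # strong spike
trace[30] = 0.6   # weak spike
print(keystroke_pressure(trace, [0.10, 0.30]))  # -> [1.8, 0.6]
```

A real system would also need to separate typing spikes from other vibrations (e.g. moving the laptop), which is why the paper validates against high-speed camera footage.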
Miyaki, Takashi and Rekimoto, Jun (2009): GraspZoom: zooming and scrolling control model for single-handed mobile interaction. In: Proceedings of 11th Conference on Human-computer interaction with mobile devices and services 2009. p. 11.
A pressure-sensing-based single-handed interaction model is presented in this paper. Unlike the traditional desktop GUI model, a mobile UI model has not yet been established. For example, Apple's iPhone introduced the "pinch" operation, which uses two fingers to zoom in and out of objects. However, in today's hand-held situations, manipulation methods using two fingers are not always a good solution, because in most cases two hands are required in order to hold the device itself. We propose a single-handed UI scheme, "GraspZoom": a multi-state input model using pressure sensing. A Force Sensitive Resistor (FSR) attached to the backside of a mobile phone was employed in order to evaluate the effectiveness of the pressure-based control model. We also describe example applications that enable intuitive and continuous zooming and scrolling. By using tiny thumb gesture input along with this pressure sensing method, bi-directional operations (e.g., zoom-in and zoom-out) are also achieved.
© All rights reserved Miyaki and Rekimoto and/or their publisher
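A multi-state pressure input like the one described above can be pictured as thresholding a normalized FSR reading into discrete states, with the pressure beyond the threshold driving a continuous rate. The states, thresholds, and mapping below are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch of pressure-based multi-state input: light pressure
# scrolls, firm pressure zooms, and within each state the pressure sets
# a continuous control rate.

LIGHT, FIRM = 0.2, 0.6  # normalized pressure thresholds (assumed values)

def grasp_state(pressure):
    """pressure: normalized FSR reading in [0, 1].
    Returns (state, rate) where rate scales the scroll/zoom speed."""
    if pressure < LIGHT:
        return ("idle", 0.0)
    if pressure < FIRM:
        return ("scroll", (pressure - LIGHT) / (FIRM - LIGHT))
    return ("zoom", (pressure - FIRM) / (1.0 - FIRM))

print(grasp_state(0.1)[0])  # -> idle
print(grasp_state(0.4)[0])  # -> scroll
print(grasp_state(0.8)[0])  # -> zoom
```

Direction (zoom-in vs. zoom-out) would come from the thumb gesture mentioned in the abstract, which this sketch does not model.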
Kawauchi, Kensaku, Miyaki, Takashi and Rekimoto, Jun (2009): Directional Beaconing: A Robust WiFi Positioning Method Using Angle-of-Emission Information. In: Choudhury, Tanzeem, Quigley, Aaron J., Strang, Thomas and Suginuma, Koji (eds.) Location and Context Awareness - Fourth International Symposium - LoCA 2009 May 7-8, 2009, Tokyo, Japan. pp. 103-119.
Rekimoto, Jun, Miyaki, Takashi and Ishizawa, Takaaki (2007): LifeTag: WiFi-Based Continuous Location Logging for Life Pattern Analysis. In: Hightower, Jeffrey, Schiele, Bernt and Strang, Thomas (eds.) Location- and Context-Awareness - Third International Symposium - LoCA 2007 September 20-21, 2007, Oberpfaffenhofen, Germany. pp. 35-49.
Rekimoto, Jun, Miyaki, Takashi and Kohno, Michimune (2004): ProxNet: Secure Dynamic Wireless Connection by Proximity Sensing. In: Ferscha, Alois and Mattern, Friedemann (eds.) PERVASIVE 2004 - Pervasive Computing, Second International Conference April 21-23, 2004, Vienna, Austria. pp. 213-218.
Changes to this page (author):
25 Jul 2011: Added
25 Jul 2011: Added
05 Jul 2011: Added
05 Jul 2011: Added
18 Apr 2011: Added
18 Apr 2011: Added
03 Nov 2010: Added
02 Nov 2010: Added
10 Feb 2010: Modified
24 Aug 2009: Added
09 May 2009: Added
09 May 2009: Added
Page maintainer: The Editorial Team