Adiyan Mujibiya


Publications by Adiyan Mujibiya (bibliography)

2010

Mujibiya, Adiyan, Miyaki, Takashi and Rekimoto, Jun (2010): Anywhere touchtyping: text input on arbitrary surface using depth sensing. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology. pp. 443-444.

In this paper, a touch-typing-enabled virtual keyboard system that uses depth sensing on an arbitrary surface is proposed. Keystroke events are detected by matching the hand's 3-dimensional appearance against a database, combined with sensing of the fingertip's contact with the surface. Our prototype acquires a depth map of the hand posture by applying a phase-shift algorithm to Digital Light Processor (DLP) fringe projection on an arbitrary flat surface. The system robustly detects hand postures anywhere on the sensed surface, with no need to align the hands to a virtual keyboard frame. Keystroke feedback comes from physical contact with the surface, so no dedicated hardware needs to be worn. The system runs in real time at an average of 20 frames per second.

© All rights reserved Mujibiya et al. and/or their publisher
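The abstract above combines two cues: posture matching against a depth-map database and fingertip touch sensing. The following is a minimal illustrative sketch of that combination, not the authors' implementation; the function names, the nearest-neighbour matcher, the tolerance value, and the synthetic data are all assumptions.

# Illustrative sketch only, not the authors' code: a key press is reported
# only when database-driven posture matching and fingertip touch sensing
# agree, mirroring the combination described in the abstract above.

import numpy as np

def match_posture(depth_map, posture_db):
    # Nearest-neighbour match of the observed hand-depth map against a list
    # of (template_depth_map, key_label) pairs.
    distances = [np.linalg.norm(depth_map - template) for template, _ in posture_db]
    return posture_db[int(np.argmin(distances))][1]

def fingertip_touching(fingertip_depth, surface_depth, tol_mm=5.0):
    # Treat the fingertip as touching when its depth lies within a small
    # tolerance of the flat surface's depth at that point (tolerance assumed).
    return abs(fingertip_depth - surface_depth) <= tol_mm

def detect_keystroke(depth_map, fingertip_depth, surface_depth, posture_db):
    # Report a key only when the touch test succeeds; the posture match then
    # decides which key was struck.
    if fingertip_touching(fingertip_depth, surface_depth):
        return match_posture(depth_map, posture_db)
    return None

# Toy example: two posture templates and one synthetic 64x64 depth frame.
db = [(np.zeros((64, 64)), "F"), (np.ones((64, 64)), "J")]
frame = np.full((64, 64), 0.1)
print(detect_keystroke(frame, fingertip_depth=801.0, surface_depth=800.0, posture_db=db))

Gating the decision on physical contact is what, per the abstract, lets the system provide keystroke feedback without any hardware being worn.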

 

Ishiguro, Yoshio, Mujibiya, Adiyan, Miyaki, Takashi and Rekimoto, Jun (2010): Aided eyes: eye activity sensing for daily life. In: Proceedings of the 2010 Augmented Human International Conference. p. 25.

Our eyes collect a considerable amount of information when we use them to look at objects. In particular, eye movement allows us to gaze at an object and shows our level of interest in it. In this research, we propose a method that involves real-time measurement of eye movement for human memory enhancement; the method employs gaze-indexed images captured using a video camera attached to the user's glasses. We present a prototype system with an infrared-based corneal limbus tracking method. Although existing eye-tracker systems track eye movement with high accuracy, they are not suitable for daily use because their mobility is incompatible with a high sampling rate. Our prototype has small phototransistors, infrared LEDs, and a video camera, which make it possible to attach the entire system to the glasses. Additionally, the accuracy of this method is compensated for by combining image-processing methods with contextual information, such as eye direction, for information extraction. We develop an information-extraction system with real-time object recognition in the user's visual-attention area by using the eye-tracker prototype and a head-mounted camera. We apply this system to (1) fast object recognition by using a SURF descriptor that is limited to the gaze area and (2) descriptor matching against a database of past images. Face recognition using Haar-like object features and text logging using OCR technology are also implemented. The combination of a low-resolution camera and a high-resolution, wide-angle camera is studied for high daily usability. The possibility of gaze-guided computer vision is discussed in this paper, as are communication via the phototransistors in the eye tracker and the development of a highly transparent sensor system.

© All rights reserved Ishiguro et al. and/or ACM Press
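The abstract above restricts feature extraction and matching to the user's gaze area. Below is a minimal illustrative sketch of that idea, not the authors' system: ORB is used as a freely available stand-in for SURF (SURF sits in the non-free opencv_contrib module), and the gaze coordinates, window size, match threshold, and the layout of the past-image descriptor database are all assumptions.

# Illustrative sketch only: descriptor computation and matching are limited
# to a window around the gaze point reported by an eye tracker.

import cv2

def gaze_region(frame, gaze_xy, half_size=80):
    # Crop a square window around the gaze point, clipped to the frame bounds.
    x, y = gaze_xy
    h, w = frame.shape[:2]
    return frame[max(0, y - half_size):min(h, y + half_size),
                 max(0, x - half_size):min(w, x + half_size)]

def recognize_gazed_object(frame, gaze_xy, past_descriptors, min_matches=10):
    # past_descriptors: dict mapping an object label to the uint8 ORB
    # descriptors computed from previously captured images of that object.
    roi = gaze_region(frame, gaze_xy)
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY) if roi.ndim == 3 else roi
    _, descriptors = cv2.ORB_create().detectAndCompute(gray, None)
    if descriptors is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    best_label, best_count = None, 0
    for label, past in past_descriptors.items():
        count = len(matcher.match(descriptors, past))
        if count > best_count:
            best_label, best_count = label, count
    return best_label if best_count >= min_matches else None

Computing descriptors only inside the gaze window keeps the per-frame work small, which is the point of limiting recognition to the gaze area in item (1) of the abstract.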

 
Date created: Not available
Date last modified: Not available

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/adiyan_mujibiya.html