Number of co-authors: 16
Publications with 3 favourite co-authors: Hiroyuki Kajimoto (2), Shogo Fukushima (1), Taku Hachisu (1)
Takuya Nojima's 3 most productive colleagues, by number of publications: Masahiko Inami (47), Susumu Tachi (41), Hiroyuki Kajimoto (29)
Publications by Takuya Nojima (bibliography)
Yoshikawa, Hiromi, Hachisu, Taku, Fukushima, Shogo, Furukawa, Masahiro, Kajimoto, Hiroyuki and Nojima, Takuya (2012): Studies of vection field II: a method for generating smooth motion pattern. In: Proceedings of the 2012 International Conference on Advanced Visual Interfaces 2012. pp. 705-708.
Along public pathways, visual signs and audio cues guide pedestrians into forming smoother pedestrian flows. However, these signals are often ignored or neglected, because they demand greater attentiveness and conscious effort from pedestrians. To solve this problem, we have proposed the concept of the "vection field": a field of optical flow that cues movement according to a pedestrian's motion. Visual stimuli within this optical flow innately lead pedestrians in specific directions without requiring direct intervention. We implemented such a field by covering the ground with a lenticular lens screen; in this setup, neither a power supply nor position tracking of pedestrians is necessary. Experimental results from our previous study show that a vection field can direct pedestrians to one side. However, the quality of the optical flow, such as image clarity and smoothness of motion, was unsatisfactory and could weaken this guiding effect. In this paper, we describe in detail a new display method involving a lenticular lens screen that improves the quality of the vection field and, ultimately, of the induced pedestrian flow. Experiments showed improvements over previous attempts.
© All rights reserved Yoshikawa et al. and/or ACM Press
Miyauchi, Masato, Kimura, Takashi and Nojima, Takuya (2012): Development of a non-contact tongue-motion acquisition system. In: Adjunct Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 75-76.
We present a new tongue-detection system called SITA, which comprises only a Kinect device and a conventional laptop computer. In contrast with other tongue-based devices, the SITA system does not require the subject to wear a device. This avoids oral-hygiene issues and removes the risk of swallowing a device inserted in the mouth. In this paper, we introduce the SITA system and an application. To evaluate the system, a user test was conducted. The results indicate that the system can detect the tongue position in real time. Moreover, the system shows potential for tongue training.
© All rights reserved Miyauchi et al. and/or ACM Press
Yamakawa, Shumpei and Nojima, Takuya (2012): A proposal for a MMG-based hand gesture recognition method. In: Adjunct Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 89-90.
We propose a novel hand-gesture recognition method based on mechanomyograms (MMGs). Skeletal muscles generate sounds specific to their activity; by recording and analyzing these sounds, MMGs provide a means of evaluating that activity. Previous research revealed that specific motions produce specific sounds, enabling human motion to be classified based on MMGs. In that research, microphones and accelerometers were often used to record muscle sounds. However, environmental conditions such as noise, and human motion itself, easily overwhelm such sensors. In this paper, we propose piezoelectric-based sensing of MMGs to improve robustness against environmental conditions. A preliminary evaluation shows that this method can classify several hand gestures with high accuracy under certain conditions.
© All rights reserved Yamakawa and Nojima and/or ACM Press
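The pipeline this abstract describes (record a one-dimensional muscle-sound signal, extract features, classify the gesture) can be illustrated with a generic sketch. The features (RMS energy, zero-crossing rate), the nearest-centroid classifier, and the synthetic signals below are all illustrative assumptions, not the method of the paper:

```python
import numpy as np

def mmg_features(signal):
    """Two simple illustrative features of a 1-D sensor signal:
    RMS energy and zero-crossing rate (both generic, not the paper's)."""
    rms = np.sqrt(np.mean(signal ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(signal))) > 0)
    return np.array([rms, zcr])

def nearest_centroid(train, labels, query):
    """Classify a query feature vector by its nearest class centroid."""
    classes = sorted(set(labels))
    centroids = {c: np.mean([f for f, l in zip(train, labels) if l == c], axis=0)
                 for c in classes}
    return min(classes, key=lambda c: np.linalg.norm(query - centroids[c]))

# Synthetic example: a "strong" gesture produces higher-amplitude
# muscle sound than a "weak" one.
rng = np.random.default_rng(1)
strong = [mmg_features(2.0 * rng.standard_normal(500)) for _ in range(5)]
weak = [mmg_features(0.5 * rng.standard_normal(500)) for _ in range(5)]
train = strong + weak
labels = ["strong"] * 5 + ["weak"] * 5
query = mmg_features(2.0 * rng.standard_normal(500))
print(nearest_centroid(train, labels, query))  # → strong
```

Any real MMG classifier would need features tuned to actual piezoelectric recordings; this sketch only shows the overall record-extract-classify structure.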
Kato, Kojiro, Kitani, Kris M. and Nojima, Takuya (2011): Ego-motion analysis using average image data intensity. In: Proceedings of the 2011 Augmented Human International Conference 2011. p. 9.
In this paper, we present a new method for ego-motion analysis using intensity averaging of image data. The method estimates general motions in the pixel plane from two sequential images by calculating cross-correlations. Given distance information between the camera and objects, it can also estimate camera motion. The method is robust even for out-of-focus images, and its computational overhead is quite low because it uses a simple averaging scheme. In the future, this method could be used to measure fast motions such as human head movement or robot movement. We present a detailed description of the proposed method and experimental results demonstrating its basic capability. With these results, we verify that the proposed system can detect camera motion even from blurred images. Furthermore, we confirm that it can operate at up to 714 FPS when calculating one-dimensional translation.
© All rights reserved Kato et al. and/or ACM Press
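The core idea above, estimating translation by cross-correlating averaged intensity profiles of two sequential frames, can be sketched as follows. This is a minimal illustration of the general technique under assumed details (row-wise averaging, a synthetic stripe image), not the authors' implementation:

```python
import numpy as np

def estimate_1d_shift(frame_a, frame_b):
    """Estimate horizontal image translation (in pixels) by collapsing
    each frame into a 1-D average-intensity profile and locating the
    cross-correlation peak between the two profiles."""
    pa = frame_a.mean(axis=0)          # average over rows -> 1-D profile
    pb = frame_b.mean(axis=0)
    pa = pa - pa.mean()                # zero-mean so correlation tracks structure
    pb = pb - pb.mean()
    corr = np.correlate(pb, pa, mode="full")
    # Offset of the correlation peak from the centre is the shift.
    return int(np.argmax(corr)) - (len(pa) - 1)

# Synthetic check: a bright vertical stripe shifted 5 pixels to the right.
frame = np.zeros((48, 64))
frame[:, 20:26] = 1.0
shifted = np.roll(frame, 5, axis=1)
print(estimate_1d_shift(frame, shifted))  # → 5
```

Averaging a whole frame down to one profile is what keeps the overhead low, and it also explains the robustness to blur: defocus changes each pixel but barely changes the averaged profile's overall shape.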
Higuchi, Hideaki and Nojima, Takuya (2010): Shoe-shaped i/o interface. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 423-424.
In this research, we propose a shoe-shaped I/O interface. The benefit of a wearable device is significantly reduced if users remain aware of wearing it; wearable devices should be wearable without requiring any attention from the user. However, previous wearable systems required users to remain conscious of wearing or carrying them. To solve this problem, we propose a shoe-shaped I/O interface. By wearing the shoes throughout the day, users soon cease to be conscious of them, and electromechanical devices are comparatively easy to install in shoes. This report describes the concept of a shoe-shaped I/O interface, the development of a prototype system, and possible applications.
© All rights reserved Higuchi and Nojima and/or their publisher
Ichikawa, Takashi and Nojima, Takuya (2010): Development of the motion-controllable ball. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 425-426.
In this report, we propose a novel ball-type interactive interface device. Balls are among the most important pieces of equipment used in entertainment and sports, and their motion guides a player's response in terms of, for example, a feint or similar movement. Many kinds of breaking-ball throws have been developed for various sports (e.g. baseball), but reacting appropriately to these breaking balls is a skill that is hard to acquire and requires long-term training. Many researchers have focused on the ball itself and developed interactive balls with visual and acoustic feedback; however, these balls cannot control their own motion. In this paper, we introduce a ball-type motion-control interface device, composed of a ball and an air-pressure tank that changes the ball's velocity vector by gas ejection. We conducted an experiment measuring the ball's flight path under gas ejection, and the results showed that the prototype system had enough power to change the ball's vector in flight.
© All rights reserved Ichikawa and Nojima and/or their publisher
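The effect the experiment measures, a mid-flight thrust impulse deflecting a ball's path, can be illustrated with a simple point-mass simulation. All numbers (thrust magnitude, burn window, launch velocity) are illustrative assumptions, not the paper's measurements:

```python
import numpy as np

DT = 0.01                     # integration step [s]
G = np.array([0.0, -9.81])    # gravity [m/s^2]

def simulate(v0, thrust=np.zeros(2), t_on=(0.4, 0.6), t_end=1.0):
    """Euler-integrate a point-mass ball; `thrust` is an extra
    acceleration applied only while t_on[0] <= t < t_on[1]."""
    pos = np.zeros(2)
    vel = np.array(v0, dtype=float)
    for step in range(int(t_end / DT)):
        t = step * DT
        acc = G + (thrust if t_on[0] <= t < t_on[1] else 0.0)
        vel += acc * DT
        pos += vel * DT
    return pos

straight = simulate([8.0, 4.0])
deflected = simulate([8.0, 4.0], thrust=np.array([-20.0, 0.0]))
# Horizontal deflection produced by a 0.2 s lateral gas ejection:
print(straight[0] - deflected[0])
```

Even a brief impulse changes the velocity vector for the rest of the flight, which is why a short gas ejection is enough to produce a visible break in the trajectory.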
Nojima, Takuya and Kajimoto, Hiroyuki (2008): A study on a flight display using retro-reflective projection technology and a propeller. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 2721-2726.
The head-up display (HUD) is becoming increasingly common in the aerospace field because it has many benefits, such as enabling operations in poor visibility and improving flight safety. The HUD is a kind of augmented-reality display that enables a pilot to observe the scene outside the cockpit while simultaneously viewing an artificial image of flight information. However, HUDs are too expensive and heavy for light airplanes. In this paper, we propose a new method of combining real and artificial images using Retro-reflective Projection Technology and rotating objects, and we apply the method to an airplane with a single propeller to compose a simple HUD. We also describe the developed system and preliminary experimental results.
© All rights reserved Nojima and Kajimoto and/or ACM Press
Nojima, Takuya and Funabiki, Kohei (2005): Cockpit Display Using Tactile Sensation. In: WHC 2005 - World Haptics Conference 18-20 March, 2005, Pisa, Italy. pp. 501-502.
Nojima, Takuya, Sekiguchi, Dairoku, Inami, Masahiko and Tachi, Susumu (2002): The SmartTool: A System for Augmented Reality of Haptics. In: VR 2002 2002. pp. 67-72.
Changes to this page (author):
23 Nov 2012: Modified
09 Nov 2012: Modified
02 May 2011: Modified
18 Apr 2011: Modified
03 Nov 2010: Modified
03 Nov 2010: Modified
12 Jun 2009: Modified
12 May 2008: Added
Page maintainer: The Editorial Team