Number of co-authors: 20

Number of publications with 3 favourite co-authors:
Takeo Igarashi: 5
Masahiko Inami: 5
Daisuke Sakamoto: 2

Yuta Sugiura's 3 most productive colleagues in number of publications:
Takeo Igarashi: 66
Masahiko Inami: 47
Maki Sugimoto: 19
Publications by Yuta Sugiura (bibliography)
Ogata, Masa, Sugiura, Yuta, Osawa, Hirotaka and Imai, Michita (2012): iRing: intelligent ring using infrared reflection. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 131-136.
We present iRing, an intelligent ring-shaped input device developed for measuring finger gestures and external input. iRing recognizes rotation, finger bending, and external force via an infrared (IR) reflection sensor that leverages skin characteristics such as reflectance and softness. Furthermore, iRing supports push and stroke input, a method popular on touch displays. The ring has potential as a wearable controller because its accessory-like shape is socially acceptable, easy to put on, and safe, and it requires no extra devices. We present examples of iRing applications and discuss its validity as an inexpensive wearable interface and as a human sensing device.
© All rights reserved Ogata et al. and/or ACM Press
Sugiura, Yuta, Inami, Masahiko and Igarashi, Takeo (2012): A thin stretchable interface for tangential force measurement. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 529-536.
We have developed a simple skin-like user interface that can be easily attached to curved as well as flat surfaces and used to measure the tangential force generated by pinching and dragging interactions. The interface consists of several photoreflectors, each comprising an IR LED and a phototransistor, and an elastic fabric such as a stocking or rubber membrane. The sensing method is based on our observation that photoreflectors can measure the ratio of expansion and contraction of a stocking from the changes in transmissivity of IR light passing through it. Since a stocking is thin, stretchable, and nearly transparent, it can be easily attached to various types of objects such as mobile devices, robots, and different parts of the body, as well as to various types of conventional pressure sensors, without altering the original shape of the object. It can also present natural haptic feedback in accordance with the amount of force exerted. A system using several such sensors can determine the direction of a two-dimensional force. A variety of example applications illustrate the utility of this sensing system.
© All rights reserved Sugiura et al. and/or ACM Press
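The idea of combining several stretch sensors into a two-dimensional force estimate can be sketched as follows. This is a minimal illustration, not the paper's implementation: the four-sensor layout, the baseline-relative readings, and the weighted vector sum are all assumptions made for the example.

```python
import math

# Hypothetical layout: four photoreflectors placed around the contact
# point at 0, 90, 180, and 270 degrees. Each reading is the stretch of
# the fabric toward that sensor, relative to the unloaded baseline.
SENSOR_ANGLES_DEG = [0, 90, 180, 270]

def estimate_tangential_force(readings):
    """Combine per-sensor stretch readings into a 2D force estimate.

    The fabric stretches most toward the sensor the finger drags
    toward, so a stretch-weighted sum of the sensors' direction
    vectors points roughly along the applied tangential force.
    Returns (magnitude, direction in degrees).
    """
    fx = fy = 0.0
    for angle_deg, r in zip(SENSOR_ANGLES_DEG, readings):
        a = math.radians(angle_deg)
        fx += r * math.cos(a)
        fy += r * math.sin(a)
    return math.hypot(fx, fy), math.degrees(math.atan2(fy, fx))

# Dragging toward the 0-degree sensor stretches it the most:
mag, ang = estimate_tangential_force([0.8, 0.1, 0.0, 0.1])
# mag ≈ 0.8, ang ≈ 0.0 degrees
```

In practice the raw phototransistor values would need calibration against the unloaded fabric before feeding them into such a sum.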
Yoshizaki, Wataru, Sugiura, Yuta, Chiou, Albert C., Hashimoto, Sunao, Inami, Masahiko, Igarashi, Takeo, Akazawa, Yoshiaki, Kawachi, Katsuaki, Kagami, Satoshi and Mochimaru, Masaaki (2011): An actuated physical puppet as an input device for controlling a digital manikin. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 637-646.
We present an actuated handheld puppet system for controlling the posture of a virtual character. Physical puppet devices have been used in the past to intuitively control character posture. In our research, an actuator is added to each joint of such an input device to provide physical feedback to the user. This enhancement offers many benefits. First, the user can upload pre-defined postures to the device to save time. Second, the system is capable of dynamically adjusting joint stiffness to counteract gravity, while allowing control to be maintained with relatively little force. Third, the system supports natural human body behaviors, such as whole-body reaching and joint coupling. This paper describes the user interface and implementation of the proposed technique and reports the results of expert evaluation. We also conducted two user studies to evaluate the effectiveness of our method.
© All rights reserved Yoshizaki et al. and/or their publisher
Sugiura, Yuta, Kakehi, Gota, Withana, Anusha, Lee, Calista, Sakamoto, Daisuke, Sugimoto, Maki, Inami, Masahiko and Igarashi, Takeo (2011): Detecting shape deformation of soft objects using directional photoreflectivity measurement. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 509-516.
We present the FuwaFuwa sensor module, a round, hand-size, wireless device for measuring the shape deformations of soft objects such as cushions and plush toys. It can be embedded in typical soft objects in the household without complex installation procedures and without spoiling the softness of the object because it requires no physical connection. Six LEDs in the module emit IR light in six orthogonal directions, and six corresponding photosensors measure the reflected light energy. One can easily convert almost any soft object into a touch-input device that can detect both touch position and surface displacement by embedding multiple FuwaFuwa sensor modules in the object. A variety of example applications illustrate the utility of the FuwaFuwa sensor module. An evaluation of the proposed deformation measurement technique confirms its effectiveness.
© All rights reserved Sugiura et al. and/or ACM Press
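The six-directional sensing described above lends itself to a simple decision rule: the axis whose reflected-light reading rises most above its baseline indicates where the soft surface has been pressed. The sketch below is an assumption-laden illustration of that idea; the direction labels, threshold, and function names are hypothetical, not the FuwaFuwa firmware.

```python
# Hypothetical sketch: the module reports six reflected-IR intensities,
# one per orthogonal direction. Pressing the soft surface toward the
# module increases reflection on the facing sensor.
DIRECTIONS = ["+x", "-x", "+y", "-y", "+z", "-z"]

def detect_deformation(readings, baseline, threshold=0.2):
    """Return the direction of the strongest deformation, or None.

    readings/baseline: six reflected-light intensities, ordered as in
    DIRECTIONS. A press only counts as a touch when the rise above
    baseline exceeds the (assumed) noise threshold.
    """
    deltas = [r - b for r, b in zip(readings, baseline)]
    best = max(range(len(DIRECTIONS)), key=lambda i: deltas[i])
    if deltas[best] < threshold:
        return None  # no deformation strong enough to register
    return DIRECTIONS[best]

# A press toward the +x sensor:
touch = detect_deformation([1.5, 1.0, 1.0, 1.0, 1.0, 1.0], [1.0] * 6)
# touch == "+x"
```

With multiple modules embedded in one object, comparing which module reports the largest delta would additionally localize the touch position, as the abstract describes.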
Sugiura, Yuta, Sakamoto, Daisuke, Withana, Anusha, Inami, Masahiko and Igarashi, Takeo (2010): Cooking with robots: designing a household system working in open environments. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2427-2430.
We propose a cooking system that operates in an open environment. The system cooks a meal by pouring various ingredients into a boiling pot on an induction heating cooker and adjusts the heating strength according to the user's instructions. We then describe how the system incorporates robot- and human-specific elements in a shared workspace so as to achieve a cooperative, rudimentary cooking capability. First, we use small mobile robots instead of built-in arms to save space, improve flexibility, and increase safety. Second, we use detachable visual markers to allow the user to easily configure the real-world environment. Third, we provide a graphical user interface that displays detailed cooking instructions to the user. We hope the insights obtained from this experiment will be useful for the design of other household systems in the future.
© All rights reserved Sugiura et al. and/or their publisher
Shirokura, Takumi, Sakamoto, Daisuke, Sugiura, Yuta, Ono, Tetsuo, Inami, Masahiko and Igarashi, Takeo (2010): RoboJockey: real-time, simultaneous, and continuous creation of robot actions for everyone. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 399-400.
We developed the RoboJockey (Robot Jockey) interface for coordinating robot actions, such as dancing, in the spirit of a "disc jockey" or "video jockey". The system enables a user to choreograph a dance for a robot to perform using a simple visual language. Users can coordinate humanoid robot actions with combinations of arm and leg movements, and every action is automatically performed in time with the background music and beat. RoboJockey offers end-users a new entertainment experience with robots.
© All rights reserved Shirokura et al. and/or their publisher
Changes to this page (author):
23 Nov 2012: Modified
05 Apr 2012: Modified
05 Jul 2011: Modified
03 Nov 2010: Modified
02 Nov 2010: Added
Page maintainer: The Editorial Team