Publication statistics

Publication period: 1996-2012
Publication count: 37
Number of co-authors: 52



Co-authors

Number of publications with his three most frequent co-authors:

Mark Billinghurst: 10
Karl D. D. Willis: 4
Suzanne Weghorst: 4

 

 

Productive colleagues

Ivan Poupyrev's three most productive colleagues, by number of publications:

Scott E. Hudson: 113
Hiroshi Ishii: 111
Mark Billinghurst: 92
 
 
 




Ivan Poupyrev

Has also published under the name of:
"I. Poupyrev"

Personal Homepage:
ivanpoupyrev.com/index.php

Current place of employment:
Interaction Lab, Sony CSL

Senior Research Scientist, Walt Disney Research. I am a career researcher in interactive technologies and interface design. My job is to come up with new ideas, concepts and research directions. Some of these ideas have become real products, others serve to inspire new research directions, while many others simply satisfy my curiosity.

 

Publications by Ivan Poupyrev (bibliography)

2012
 

Poupyrev, Ivan (2012): Infusing the physical world into user interfaces. In: Proceedings of the 2012 International Conference on Multimodal Interfaces 2012. pp. 229-230.

Advances in new materials and manufacturing techniques are rapidly blending the computational and physical worlds. With every new turn in technology development -- e.g., discovering a novel "smart" material, inventing a more efficient manufacturing process or designing a faster microprocessor -- there are new and exciting ways to take user interfaces away from the screen and blend them into our living spaces and everyday objects, making them more responsive, intelligent and adaptive. As the world around us becomes increasingly infused with technology, the user interfaces and computers themselves will disappear into the background, blending into the physical world around us. Thus, the old tried-and-true paradigms for designing interaction and interfaces must be re-evaluated, re-designed and, in some cases, even discarded to take advantage of the new possibilities that these cutting-edge technologies provide. While the challenges and opportunities are distinct, the fundamental goal remains the same: to provide for the effortless and effective consumption, control and transmission of information at any time and in any place, while delivering a unique experience that is only possible with these emerging technologies. In this talk I will present work produced by myself and the research group that I have been directing at Disney Research Pittsburgh. We are addressing these exciting challenges. This talk will cover projects investigating tactile and haptics interfaces, deformable computing devices, augmented reality interfaces and novel touch sensing techniques, as well as biologically-inspired interfaces, among others. The presentation will cover both projects conducted while at Sony Corporation and more recent research efforts in the Interaction Group at Walt Disney Research, Pittsburgh.

© All rights reserved Poupyrev and/or ACM Press

 

Poupyrev, Ivan, Harrison, Chris and Sato, Munehiko (2012): Touché: touch and gesture sensing for the real world. In: Proceedings of the 2012 International Conference on Ubiquitous Computing 2012. p. 536.

Touché proposes a novel Swept Frequency Capacitive Sensing technique that can not only detect a touch event, but also recognize complex configurations of the human hands and body. Such contextual information significantly enhances touch interaction in a broad range of applications, from conventional touchscreens to unique contexts and materials. For example, in our explorations we add touch and gesture sensitivity to the human body and liquids. We demonstrate the rich capabilities of Touché with five example setups from different application domains and conduct experimental studies that show gesture classification accuracies of 99% are achievable with our technology.

© All rights reserved Poupyrev et al. and/or ACM Press
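The classification stage of swept-frequency sensing can be sketched as: sweep several excitation frequencies, record the amplitude response as a profile, and match it against stored per-gesture profiles. A minimal Python illustration under those assumptions; the paper reports a trained classifier (SVM), so the nearest-profile matching, template labels, and numeric values below are hypothetical stand-ins, not the authors' implementation:

```python
import math

def nearest_profile(sample, templates):
    """Classify a capacitive frequency-response profile by Euclidean
    distance to stored per-gesture template profiles."""
    best, best_d = None, float("inf")
    for label, ref in templates.items():
        d = math.dist(sample, ref)
        if d < best_d:
            best, best_d = label, d
    return best

# Hypothetical templates: amplitude response sampled at four swept frequencies.
templates = {
    "one_finger": [0.9, 0.7, 0.4, 0.2],
    "full_grasp": [0.5, 0.3, 0.2, 0.1],
}
print(nearest_profile([0.88, 0.68, 0.41, 0.22], templates))  # → one_finger
```

In practice each touch configuration (one finger, a pinch, a full grasp) bends the frequency response in a characteristic way, which is what makes the profile separable by a classifier.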

 

Harrison, Chris, Sato, Munehiko and Poupyrev, Ivan (2012): Capacitive fingerprinting: exploring user differentiation by sensing electrical properties of the human body. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 537-544.

At present, touchscreens can differentiate multiple points of contact, but not who is touching the device. In this work, we consider how the electrical properties of humans and their attire can be used to support user differentiation on touchscreens. We propose a novel sensing approach based on Swept Frequency Capacitive Sensing, which measures the impedance of a user to the environment (i.e., ground) across a range of AC frequencies. Different people have different bone densities and muscle mass, wear different footwear, and so on. This, in turn, yields different impedance profiles, which allows for touch events, including multitouch gestures, to be attributed to a particular user. This has many interesting implications for interactive design. We describe and evaluate our sensing approach, demonstrating that the technique has considerable promise. We also discuss limitations, how these might be overcome, and next steps.

© All rights reserved Harrison et al. and/or ACM Press

 

Willis, Karl, Brockmeyer, Eric, Hudson, Scott and Poupyrev, Ivan (2012): Printed optics: 3D printing of embedded optical elements for interactive devices. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 589-598.

We present an approach to 3D printing custom optical elements for interactive devices labelled Printed Optics. Printed Optics enable sensing, display, and illumination elements to be directly embedded in the casing or mechanical structure of an interactive device. Using these elements, unique display surfaces, novel illumination techniques, custom optical sensors, and embedded optoelectronic components can be digitally fabricated for rapid, high fidelity, highly customized interactive devices. Printed Optics is part of our long term vision for interactive devices that are 3D printed in their entirety. In this paper we explore the possibilities for this vision afforded by fabrication of custom optical elements using today's 3D printing technology.

© All rights reserved Willis et al. and/or ACM Press

2011
 

Willis, Karl D. D., Poupyrev, Ivan and Shiratori, Takaaki (2011): Motionbeam: a metaphor for character interaction with handheld projectors. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 1031-1040.

We present the MotionBeam metaphor for character interaction with handheld projectors. Our work draws from the tradition of pre-cinema handheld projectors that use direct physical manipulation to control projected imagery. With our prototype system, users interact and control projected characters by moving and gesturing with the handheld projector itself. This creates a unified interaction style where input and output are tied together within a single device. We introduce a set of interaction principles and present prototype applications that provide clear examples of the MotionBeam metaphor in use. Finally we describe observations and insights from a preliminary user study with our system.

© All rights reserved Willis et al. and/or their publisher

 

Israr, Ali and Poupyrev, Ivan (2011): Tactile brush: drawing on skin with a tactile grid display. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 2019-2028.

Tactile Brush is an algorithm that produces smooth, two-dimensional tactile moving strokes with varying frequency, intensity, velocity and direction of motion. The design of the algorithm is derived from the results of psychophysical investigations of two tactile illusions -- apparent tactile motion and phantom sensations. Combined together they allow for the design of high-density two-dimensional tactile displays using sparse vibrotactile arrays. In a series of experiments and evaluations we demonstrate that Tactile Brush is robust and can reliably generate a wide variety of moving tactile sensations for a broad range of applications.

© All rights reserved Israr and Poupyrev and/or their publisher

 

Xu, Cheng, Israr, Ali, Poupyrev, Ivan, Bau, Olivier and Harrison, Chris (2011): Tactile display for the visually impaired using TeslaTouch. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 317-322.

TeslaTouch is a technology that provides tactile sensation to moving fingers on touch screens. Based on TeslaTouch, we have developed applications for the visually impaired to interpret and create 2D tactile information. In this paper, we demonstrate these applications, present observations from the interaction, and discuss TeslaTouch's potential in supporting communication among visually impaired individuals.

© All rights reserved Xu et al. and/or their publisher

 

Willis, Karl D. D., Poupyrev, Ivan, Hudson, Scott E. and Mahler, Moshe (2011): SideBySide: ad-hoc multi-user interaction with handheld projectors. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 431-440.

We introduce SideBySide, a system designed for ad-hoc multi-user interaction with handheld projectors. SideBySide uses device-mounted cameras and hybrid visible/infrared light projectors to track multiple independent projected images in relation to one another. This is accomplished by projecting invisible fiducial markers in the near-infrared spectrum. Our system is completely self-contained and can be deployed as a handheld device without instrumentation of the environment. We present the design and implementation of our system including a hybrid handheld projector to project visible and infrared light, and techniques for tracking projected fiducial markers that move and overlap. We introduce a range of example applications that demonstrate the applicability of our system to real-world scenarios such as mobile content exchange, gaming, and education.

© All rights reserved Willis et al. and/or ACM Press

2010
 

Willis, Karl D. D. and Poupyrev, Ivan (2010): MotionBeam: designing for movement with handheld projectors. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 3253-3258.

In this paper we present a novel interaction metaphor for handheld projectors we label MotionBeam. We detail a number of interaction techniques that utilize the physical movement of a handheld projector to better express the motion and physicality of projected objects. Finally we present the first iteration of a projected character design that uses the MotionBeam metaphor for user interaction.

© All rights reserved Willis and Poupyrev and/or their publisher

 

Poupyrev, Ivan, Yeo, Zhiquan, Griffin, Joshua D. and Hudson, Scott E. (2010): Sensing human activities with resonant tuning. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 4135-4140.

Designing new interactive experiences requires effective methods for sensing human activities. In this paper, we propose a new sensor architecture based on tracking changes in the resonant frequency of objects with which users interact.

© All rights reserved Poupyrev et al. and/or their publisher

 

Israr, Ali and Poupyrev, Ivan (2010): Exploring surround haptics displays. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 4171-4176.

In this paper we present the design and evaluation of a two-dimensional haptics display intended to enhance the experience of movies and rides. The display, a haptics surface, utilizes an array of vibrators contacting the skin at discrete locations and creates static and dynamic haptic sensations derived from scenes and situations. To this end, a set of haptic morphs is introduced that can be used as building blocks to create new sensations on the skin. A novel haptic sensation, haptic blur, is also introduced that gives an illusion of continuous motion across the skin using discrete vibrating points. A pilot study investigating the reliability of haptic blur along a two-dimensional skin surface is presented, along with a conceptual discussion of future haptic sensations rendered through the haptics surface.

© All rights reserved Israr and Poupyrev and/or their publisher

 

Bau, Olivier, Poupyrev, Ivan, Israr, Ali and Harrison, Chris (2010): TeslaTouch: electrovibration for touch surfaces. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 283-292.

We present a new technology for enhancing touch interfaces with tactile feedback. The proposed technology is based on the electrovibration principle, does not use any moving parts and provides a wide range of tactile feedback sensations to fingers moving across a touch surface. When combined with an interactive display and touch input, it enables the design of a wide variety of interfaces that allow the user to feel virtual elements through touch. We present the principles of operation and an implementation of the technology. We also report the results of three controlled psychophysical experiments and a subjective user evaluation that describe and characterize users' perception of this technology. We conclude with an exploration of the design space of tactile touch screens using two comparable setups, one based on electrovibration and another on mechanical vibrotactile actuation.

© All rights reserved Bau et al. and/or their publisher

2009
 

Vertegaal, Roel and Poupyrev, Ivan (2009): Eek! a mouse! organic user interfaces: tangible, transitive materials and programmable reality. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 3313-3316.

In this panel, we explore the role emerging transitive materials, like flexible thin-film displays, multi-touch input skins, e-textiles, micro-actuators and Claytronics, might play in re-defining the human interface towards a programmable form of reality. Panelists will extrapolate historical trends from Tangibles to new developments in organic user interfaces, trying to identify a future in which interfaces will no longer be predominantly flat, but instead have any possible shape or form: from skins that are foldable, flexible and physical to three-dimensional products that are fully kinetic.

© All rights reserved Vertegaal and Poupyrev and/or ACM Press

 

Coelho, Marcelo, Poupyrev, Ivan, Sadi, Sajid, Vertegaal, Roel, Berzowska, Joanna, Buechley, Leah, Maes, Pattie and Oxman, Neri (2009): Programming reality: from transitive materials to organic user interfaces. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 4759-4762.

Over the past few years, a quiet revolution has been redefining our fundamental computing technologies. Flexible E-Ink, OLED displays, shape-changing materials, parametric design, e-textiles, sensor networks, and intelligent interfaces promise to spawn entirely new user experiences that will redefine our relationship with technology. This workshop invites researchers and practitioners to imagine and debate this future, exploring two converging themes. Transitive Materials focuses on how emerging materials and computationally-driven behaviors can operate in unison blurring the boundaries between form and function, human body and environment, structures and membranes. Organic User Interfaces (OUI) explores future interactive designs and applications as these materials become commonplace.

© All rights reserved Coelho et al. and/or ACM Press

2008
 

Poupyrev, Ivan, Oba, Haruo, Ikeda, Takuo and Iwabuchi, Eriko (2008): Designing embodied interfaces for casual sound recording devices. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 2129-2134.

In the Special Moment project we prototype and evaluate the design of interfaces for casual sound recording devices. These devices are envisioned to be used by a casual user to capture and store their everyday experiences in the form of "sound albums" -- collections of recordings related to a certain situation. We formulate a number of design principles for such recording devices, as well as implement and evaluate two working prototypes. A candle recorder allows for capturing the general atmosphere at a party, and the children's book recorder records the interactions between parents and children while reading a book together.

© All rights reserved Poupyrev et al. and/or ACM Press

 

Poupyrev, Ivan and Willis, Karl D. D. (2008): TwelvePixels: drawing & creativity on a mobile phone. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 2361-2366.

TwelvePixels is an interface for drawing pixel-based imagery using only the standard keys on the mobile phone handset. Using an essentially simple drawing method, an extensive range of imagery can be created and shared between users. This paper explores the rationale and details behind the development of the TwelvePixels interface; tracking possible applications for promoting creativity, communication, and content sharing on mobile phones.

© All rights reserved Poupyrev and Willis and/or ACM Press

 

Parkes, Amanda J., Poupyrev, Ivan and Ishii, Hiroshi (2008): Designing kinetic interactions for organic user interfaces. In Communications of the ACM, 51 (6) pp. 58-65.

 

Vertegaal, Roel and Poupyrev, Ivan (2008): Introduction. In Communications of the ACM, 51 (6) pp. 26-30.

2007
 

Poupyrev, Ivan, Nashida, Tatsushi and Okabe, Makoto (2007): Actuation and tangible user interfaces: the Vaucanson duck, robots, and shape displays. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 205-212.

In the last decade, the vision of future interfaces has shifted from virtual reality to augmented and tangible user interfaces (UI) where virtual and physical (or "bits and atoms") co-exist in harmony. Recently, a growing number of designers and researchers have been taking the next logical step: creating interfaces where physical, tangible elements are not merely dynamically coupled to the digital attributes and information, but are themselves dynamic, self-reconfigurable devices that can change their physical properties depending on the state of the interfaces, the user, or the environment. A combination of the actuation, self-configuration, and tangibility can expand and enhance the design of tangible interfaces. In this paper, we present an overview of the use of actuation in user interfaces and discuss the rationale for building actuated interfaces. We then discuss actuated interfaces in detail based on our experience designing Lumen shape displays. Work on actuated interfaces is still in its infancy, projects are few and far between, so we consider this paper an invitation to discussion and hope it can help stimulate further research in this area.

© All rights reserved Poupyrev et al. and/or ACM Press

2005
 

Bowman, Doug A., Kruijff, Ernst, LaViola, Joseph J. and Poupyrev, Ivan (2005): 3D User Interfaces: Theory and Practice. Addison-Wesley Professional

Here's what three pioneers in computer graphics and human-computer interaction have to say about this book:

“What a tour de force—everything one would want—comprehensive, encyclopedic, and authoritative.” —Jim Foley

“At last, a book on this important, emerging area. It will be an indispensable reference for the practitioner, researcher, and student interested in 3D user interfaces.” —Andy van Dam

“Finally, the book we need to bridge the dream of 3D graphics with the user-centered reality of interface design. A thoughtful and practical guide for researchers and product developers. Thorough review, great examples.” —Ben Shneiderman

As 3D technology becomes available for a wide range of applications, its successful deployment will require well-designed user interfaces (UIs). Specifically, software and hardware developers will need to understand the interaction principles and techniques peculiar to a 3D environment. This understanding, of course, builds on usability experience with 2D UIs. But it also involves new and unique challenges and opportunities. Discussing all relevant aspects of interaction, enhanced by instructive examples and guidelines, 3D User Interfaces comprises a single source for the latest theory and practice of 3D UIs. Many people already have seen 3D UIs in computer-aided design, radiation therapy, surgical simulation, data visualization, and virtual-reality entertainment. The next generation of computer games, mobile devices, and desktop applications also will feature 3D interaction. The authors of this book, each at the forefront of research and development in the young and dynamic field of 3D UIs, show how to produce usable 3D applications that deliver on their enormous promise.

Coverage includes:
The psychology and human factors of various 3D interaction tasks
Different approaches for evaluating 3D UIs
Results from empirical studies of 3D interaction techniques
Principles for choosing appropriate input and output devices for 3D systems
Details and tips on implementing common 3D interaction techniques
Guidelines for selecting the most effective interaction techniques for common 3D tasks
Case studies of 3D UIs in real-world applications

To help you keep pace with this fast-evolving field, the book's Web site, www.3dui.org, will offer information and links to the latest 3D UI research and applications.

© All rights reserved Bowman et al. and/or Addison-Wesley Professional

 Cited in the following chapter:

3D User Interfaces: [/encyclopedia/3d_user_interfaces.html]


 
2004
 

Schwesig, Carsten, Poupyrev, Ivan and Mori, Eijiro (2004): Gummi: a bendable computer. In: Dykstra-Erickson, Elizabeth and Tscheligi, Manfred (eds.) Proceedings of ACM CHI 2004 Conference on Human Factors in Computing Systems April 24-29, 2004, Vienna, Austria. pp. 263-270.

Gummi is an interaction technique and device concept based on physical deformation of a handheld device. The device consists of several layers of flexible electronic components, including sensors measuring deformation of the device. Users interact with this device by a combination of bending and 2D position control. Gummi explores physical interaction techniques and screen interfaces for such a device. Its graphical user interface facilitates a wide range of interaction tasks, focused on browsing of visual information. We implemented both hardware and software prototypes to explore and evaluate the proposed interaction techniques. Our evaluations have shown that users can grasp Gummi's key interaction principles within minutes. Gummi demonstrates promising possibilities for new interaction techniques and devices based on flexible electronic components.

© All rights reserved Schwesig et al. and/or ACM Press

 

Bowman, Doug A., Kruijff, Ernst, LaViola, Joseph J. and Poupyrev, Ivan (2004): 3D User Interfaces: Theory and Practice. Addison-Wesley Professional


© All rights reserved Bowman et al. and/or Addison-Wesley Professional

 Cited in the following chapter:

Aesthetic Computing: [/encyclopedia/aesthetic_computing.html]


 
2003
 

Poupyrev, Ivan and Maruyama, Shigeaki (2003): Tactile interfaces for small touch screens. In: Proceedings of the 16th annual ACM Symposium on User Interface Software and Technology November 2-5, 2003, Vancouver, Canada. pp. 217-220.

We present the design, implementation, and informal evaluation of tactile interfaces for small touch screens used in mobile devices. We embedded a tactile apparatus in a Sony PDA touch screen and enhanced its basic GUI elements with tactile feedback. Instead of observing the response of interface controls, users can feel it with their fingers as they press the screen. In informal evaluations, tactile feedback was greeted with enthusiasm. We believe that tactile feedback will become the next step in touch screen interface design and a standard feature of future mobile devices.

© All rights reserved Poupyrev and Maruyama and/or ACM Press

2002
 

Poupyrev, Ivan, Maruyama, Shigeaki and Rekimoto, Jun (2002): Ambient touch: designing tactile interfaces for handheld devices. In: Beaudouin-Lafon, Michel (ed.) Proceedings of the 15th annual ACM symposium on User interface software and technology October 27-30, 2002, Paris, France. pp. 51-60.

This paper investigates the sense of touch as a channel for communicating with miniature handheld devices. We embedded a PDA with a TouchEngine -- a thin, miniature low-power tactile actuator that we have designed specifically to use in mobile interfaces (Figure 1). Unlike previous tactile actuators, the TouchEngine is a universal tactile display that can produce a wide variety of tactile feelings from simple clicks to complex vibrotactile patterns. Using the TouchEngine, we began exploring the design space of interactive tactile feedback for handheld computers. Here, we investigated only a subset of this space: using touch as the ambient, background channel of interaction. We proposed a general approach to design such tactile interfaces and described several implemented prototypes. Finally, our user studies demonstrated 22% faster task completion when we enhanced handheld tilting interfaces with tactile feedback.

© All rights reserved Poupyrev et al. and/or ACM Press

 

Poupyrev, Ivan, Tan, Desney S., Billinghurst, Mark, Kato, Hirokazu, Regenbrecht, Holger and Tetsutani, Nobuji (2002): Developing a Generic Augmented-Reality Interface. In IEEE Computer, 35 (3) pp. 44-50.

 

Billinghurst, Mark, Kato, Hirokazu, Kiyokawa, Kiyoshi, Belcher, Daniel and Poupyrev, Ivan (2002): Experiments with Face-To-Face Collaborative AR Interfaces. In Virtual Reality, 6 (3) pp. 107-121.

2001
 

Billinghurst, Mark, Kato, H. and Poupyrev, Ivan (2001): Collaboration With Tangible Augmented Reality Interfaces. In: Proceedings of the Ninth International Conference on Human-Computer Interaction 2001. pp. 797-801.

 

Poupyrev, Ivan, Berry, R., Billinghurst, Mark, Kato, H., Nakao, K., Baldwin, L. and Kurumisawa, J. (2001): Augmented Reality Interface for Electronic Music Performance. In: Proceedings of the Ninth International Conference on Human-Computer Interaction 2001. pp. 805-808.

 

Poupyrev, Ivan, Tan, Desney S., Billinghurst, Mark, Kato, H., Regenbrecht, H. and Tetsutani, N. (2001): Tiles: A Mixed Reality Authoring Interface. In: Proceedings of IFIP INTERACT01: Human-Computer Interaction 2001, Tokyo, Japan. pp. 334-341.

 

Bowman, Doug A., Kruijff, Ernst, LaViola, Joseph J. and Poupyrev, Ivan (2001): An Introduction to 3D User Interface Design. In Presence: Teleoperators and Virtual Environments, 10 (1) pp. 96-108.

 

Billinghurst, Mark, Kato, Hirokazu and Poupyrev, Ivan (2001): The MagicBook—Moving Seamlessly between Reality and Virtuality. In IEEE Computer Graphics and Applications, 21 (3) pp. 6-8.

 

Billinghurst, Mark, Kato, Hirokazu and Poupyrev, Ivan (2001): The MagicBook: a transitional AR interface. In Computers & Graphics, 25 (5) pp. 745-753.

2000
 

Poupyrev, Ivan, Weghorst, Suzanne and Fels, Sidney (2000): Non-Isomorphic 3D Rotational Techniques. In: Turner, Thea, Szwillus, Gerd, Czerwinski, Mary, Paternò, Fabio and Pemberton, Steven (eds.) Proceedings of the ACM CHI 2000 Human Factors in Computing Systems Conference April 1-6, 2000, The Hague, The Netherlands. pp. 540-547.

This paper demonstrates how non-isomorphic rotational mappings and interaction techniques can be designed and used to build effective spatial 3D user interfaces. In this paper, we develop a mathematical framework allowing us to design non-isomorphic 3D rotational mappings and techniques, investigate their usability properties, and evaluate their user performance characteristics. The results suggest that non-isomorphic rotational mappings can be an effective tool in building high-quality manipulation dialogs in 3D interfaces, allowing our subjects to accomplish experimental tasks 13% faster without a statistically detectable loss in accuracy. The current paper will help interface designers to use non-isomorphic rotational mappings effectively.

© All rights reserved Poupyrev et al. and/or ACM Press
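The core of a non-isomorphic rotational mapping can be sketched as amplifying the rotation angle of the input quaternion while preserving its axis, i.e. raising a unit quaternion to a power k > 1. This is a minimal illustration of the idea only, not the paper's implementation; the amplification factor and example rotation are illustrative:

```python
import math

def amplify_rotation(q, k):
    """Non-isomorphic rotational mapping: raise a unit quaternion
    (w, x, y, z) to the power k, scaling the rotation angle by k
    while keeping the rotation axis fixed."""
    w, x, y, z = q
    theta = 2.0 * math.acos(max(-1.0, min(1.0, w)))  # rotation angle
    s = math.sin(theta / 2.0)
    if s < 1e-9:                      # identity rotation: axis undefined
        return (1.0, 0.0, 0.0, 0.0)
    axis = (x / s, y / s, z / s)
    half = k * theta / 2.0
    return (math.cos(half),
            axis[0] * math.sin(half),
            axis[1] * math.sin(half),
            axis[2] * math.sin(half))

# A 30-degree rotation about z, amplified 2x, becomes 60 degrees about z.
q30 = (math.cos(math.radians(15)), 0.0, 0.0, math.sin(math.radians(15)))
qa = amplify_rotation(q30, 2.0)
```

With k = 1 the mapping is isomorphic (one-to-one); k > 1 lets small wrist rotations produce large virtual rotations, which is the usability trade-off the paper evaluates.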

1999
 

Poupyrev, Ivan and Ichikawa, Tadao (1999): Manipulating Objects in Virtual Worlds: Categorization and Empirical Evaluation of Interaction Techniques. In J. Vis. Lang. Comput., 10 (1) pp. 19-35.

1998
 

Poupyrev, Ivan, Weghorst, Suzanne, Billinghurst, Mark and Ichikawa, Tadao (1998): Egocentric Object Manipulation in Virtual Environments: Evaluation of Interaction Techniques. In Comput. Graph. Forum, 17 (3) pp. 41-52.

1997
 

Poupyrev, Ivan, Weghorst, Suzanne, Billinghurst, Mark and Ichikawa, Tadao (1997): A framework and testbed for studying manipulation techniques for immersive VR. In: Proceedings of VRST 1997. pp. 21-28.

1996
 

Poupyrev, Ivan, Billinghurst, Mark, Weghorst, Suzanne and Ichikawa, Tadao (1996): The Go-Go Interaction Technique: Non-Linear Mapping for Direct Manipulation in VR. In: Kurlander, David, Brown, Marc and Rao, Ramana (eds.) Proceedings of the 9th annual ACM symposium on User interface software and technology November 06 - 08, 1996, Seattle, Washington, United States. pp. 79-80.

The Go-Go immersive interaction technique uses the metaphor of interactively growing the user's arm and non-linear mapping for reaching and manipulating distant objects. Unlike others, our technique allows for seamless direct manipulation of both nearby objects and those at a distance.

© All rights reserved Poupyrev et al. and/or ACM Press

 Cited in the following chapter:

3D User Interfaces: [/encyclopedia/3d_user_interfaces.html]
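The Go-Go mapping itself is compact enough to sketch: the virtual hand tracks the real hand one-to-one up to a threshold distance from the body, and beyond that threshold reach grows quadratically. The paper places the threshold at roughly two-thirds of the user's arm length with a small quadratic coefficient; the specific numbers below (in metres) are illustrative defaults, not values from the paper:

```python
def gogo_reach(r_real, d=0.45, k=1.0 / 6.0):
    """Go-Go non-linear reach mapping: 1:1 within distance d of the
    body, quadratic amplification of reach beyond it."""
    if r_real < d:
        return r_real
    return r_real + k * (r_real - d) ** 2

# Nearby motion is unchanged; distant reach is amplified.
print(gogo_reach(0.30))            # → 0.3
print(round(gogo_reach(0.75), 3))  # → 0.765
```

Because the mapping is continuous at the threshold, the user feels no seam when switching between close-range direct manipulation and amplified reach.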


 
 
 

 
 
Date created: Not available
Date last modified: Not available

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/ivan_poupyrev.html