
Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces

Time and place:
2009
Conference description:
Interactive surfaces are emerging as an exciting new research area. Display technologies, coupled with input sensors capable of enabling direct interaction, are being experimented with and embedded in tabletops, walls, and floors to support a diversity of collaborative activities.
Next conference:
16 Sep 2014 in Dresden, Germany

References from this conference (2009)

The following articles are from "Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces":


Articles

p. 1-8

Berard, Francois and Laurillau, Yann (2009): Single user multitouch on the DiamondTouch: from 2 x 1D to 2D. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 1-8. Available online

The DiamondTouch is a widely used multi-touch surface that offers high quality touch detection and user identification. However, its underlying detection mechanism relies on two 1D projections (x and y) of the 2D surface. This creates ambiguous responses when a single user exercises multiple contacts on the surface and limits the ability of the DiamondTouch to provide full support of common multi-touch interactions such as the unconstrained translation, rotation and scaling of objects with two fingers. This paper presents our solution to reduce this limitation. Our approach is based on precise modeling, using mixtures of Gaussians, of the touch responses on each array of antennas. This greatly reduces the shadowing of the touch locations when two or more fingers align with each other. We use these accurate touch detections to implement two 1D touch trackers and a global 2D tracker. The evaluation of our system shows that, in many situations, it can provide the complete 2D locations of at least two contact points from the same user.

© All rights reserved Berard and Laurillau and/or their publisher
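The core idea of the abstract above (detect touches independently on each 1D antenna array, then pair the per-axis detections into 2D candidates) can be sketched in a few lines. This is an illustrative simplification, not the authors' implementation: it uses a thresholded local-maximum search with centroid refinement in place of a full mixture-of-Gaussians fit, and all thresholds and array sizes are invented.

```python
import numpy as np

def peaks_1d(response, threshold=0.2, radius=2):
    """Find local maxima in one antenna array's 1D response and refine each
    with a weighted centroid over a small window (a crude stand-in for the
    paper's per-array mixture-of-Gaussians fit)."""
    peaks = []
    for i in range(radius, len(response) - radius):
        window = response[i - radius:i + radius + 1]
        if response[i] >= threshold and response[i] == window.max():
            idx = np.arange(i - radius, i + radius + 1)
            peaks.append(float((idx * window).sum() / window.sum()))
    return peaks

def candidate_points(x_response, y_response):
    """Pair the per-axis detections into candidate 2D touch locations.
    Two touches yield up to four candidates; the paper's 1D and 2D
    trackers then use temporal continuity to keep the real ones."""
    return [(x, y) for x in peaks_1d(x_response) for y in peaks_1d(y_response)]

# Two synthetic touches centred at antenna indices 3 and 10 on each axis.
x = np.zeros(16); x[2:5] = [0.4, 1.0, 0.4]; x[9:12] = [0.3, 0.8, 0.3]
y = x.copy()
print(candidate_points(x, y))  # four candidate x/y pairings
```

The four candidates for two touches are exactly the ambiguity the abstract describes: the two 1D projections alone cannot tell which x peak belongs to which y peak.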

p. 101-108

Dang, Chi Tai, Straub, Martin and Andre, Elisabeth (2009): Hand distinction for multi-touch tabletop interaction. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 101-108. Available online

Recent multi-touch, multi-user tabletop systems offer rich touch contact properties to applications: not only touch positions, but also finger orientations. Applications can use these properties separately for each finger or derive information by combining the given touch contact data. In this paper, we present an approach for mapping fingers to their associated hand, contributing to potential enhancements in gesture recognition and user interaction. For instance, a gesture can be composed of multiple fingers of one hand or of different hands. We present a simple heuristic for mapping fingers to hands that applies constraints to the touch position combined with the finger orientation. We tested our approach on diverse collected touch contact data and analyzed the results.

© All rights reserved Dang et al. and/or their publisher
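A toy version of such a position-plus-orientation heuristic: project from each fingertip back along its finger orientation to an approximate palm position, then cluster the palm estimates. The constants and the greedy clustering below are illustrative assumptions, not the authors' actual heuristic.

```python
import math

HAND_SPAN = 120.0   # px; assumed max spread of one hand's palm estimates
FINGER_LEN = 80.0   # px; assumed offset from fingertip back to the palm

def palm_estimate(touch):
    """Project from the fingertip back along the finger orientation to an
    approximate palm position (orientation in radians, fingertip-ward)."""
    x, y, theta = touch
    return (x - FINGER_LEN * math.cos(theta), y - FINGER_LEN * math.sin(theta))

def group_by_hand(touches):
    """Greedy clustering of palm estimates: a finger whose projected palm
    position falls within HAND_SPAN of a cluster joins that hand."""
    hands = []  # list of (palm_center, [touch indices])
    for i, t in enumerate(touches):
        px, py = palm_estimate(t)
        for center, members in hands:
            if math.hypot(px - center[0], py - center[1]) <= HAND_SPAN:
                members.append(i)
                break
        else:
            hands.append((((px, py)), [i]))
    return [members for _, members in hands]

# Three fingers fanning out from one hand near (200, 140),
# plus one finger from a second hand far to the right.
touches = [(200, 220, math.pi / 2), (160, 230, 2.0), (240, 230, 1.1),
           (600, 220, math.pi / 2)]
print(group_by_hand(touches))  # → [[0, 1, 2], [3]]
```

Note how the orientation constraint does the real work: two fingertips can be far apart yet project back to nearby palm positions, which is what lets a splayed hand still be grouped as one.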

p. 109-116

Voida, Stephen, Tobiasz, Matthew, Stromer, Julie, Isenberg, Petra and Carpendale, Sheelagh (2009): Getting practical with interactive tabletop displays: designing for dense data, "fat fingers," diverse interactions, and face-to-face collaboration. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 109-116. Available online

Tabletop displays with touch-based input provide many powerful affordances for directly manipulating and collaborating around information visualizations. However, these devices also introduce several challenges for interaction designers, including discrepancies among the resolutions of the visualization, the tabletop's display, and its sensing technologies; a need to support diverse types of interactions required by different visualization techniques; and the ability to support face-to-face collaboration. As a result, most interactive tabletop applications for working with information currently demonstrate limited functionality and do not approach the power or versatility of their desktop counterparts. We present a series of design considerations, informed by prior interaction design and focus+context visualization research, for ameliorating the challenges inherent in designing practical interaction techniques for tabletop information visualization applications. We then discuss two specific techniques, i-Loupe and iPodLoupe, which illustrate how different choices among these design considerations enable vastly different experiences in working with complex data on interactive surfaces.

© All rights reserved Voida et al. and/or their publisher

p. 117-124

Schick, Alexander, Camp, Florian van de, Ijsselmuiden, Joris and Stiefelhagen, Rainer (2009): Extending touch: towards interaction with large-scale surfaces. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 117-124. Available online

Touch is a very intuitive modality for interacting with objects displayed on arbitrary surfaces. However, when using touch on large-scale surfaces, not every point is reachable. Therefore, an extension is required that preserves the intuitiveness of touch: pointing. We present a system that allows both input modalities in a single framework. Our method is based on 3D reconstruction, using standard RGB cameras only, and allows seamless switching between touch and pointing, even while interacting. Our approach scales very well to large surfaces without modifying them. We present a technical evaluation of the system's accuracy, as well as a user study. We found that users preferred our system to a touch-only system, because they had more freedom during interaction and could solve the presented task significantly faster.

© All rights reserved Schick et al. and/or their publisher

p. 125-132

Wilson, Andrew D. (2009): Simulating grasping behavior on an imaging interactive surface. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 125-132. Available online

We present techniques and algorithms to simulate grasping behavior on an imaging interactive surface (e.g., Microsoft Surface). In particular, we describe a contour model of touch contact shape, and show how these contours may be represented in a real-time physics simulation in a way that allows more realistic grasping behavior. For example, a virtual object may be moved by "squeezing" it with multiple contacts undergoing motion. The virtual object is caused to move by simulated contact and friction forces. Previous work [14] uses many small rigid bodies ("particle proxies") to approximate touch contact shape. This paper presents a variation of the particle proxy approach which allows grasping behavior. The advantages and disadvantages of this new approach are discussed.

© All rights reserved Wilson and/or his/her publisher

p. 133-140

Hancock, Mark, Cate, Thomas ten and Carpendale, Sheelagh (2009): Sticky tools: full 6DOF force-based interaction for multi-touch tables. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 133-140. Available online

Tabletop computing techniques are using physically familiar force-based interactions to enable compelling interfaces that provide a feeling of being embodied with a virtual object. We introduce an interaction paradigm that has the benefits of force-based interaction complete with full 6DOF manipulation. Only multi-touch input, such as that provided by the Microsoft Surface and the SMART Table, is necessary to achieve this interaction freedom. This paradigm is realized through sticky tools: a combination of sticky fingers, a physically familiar technique for moving, spinning, and lifting virtual objects; opposable thumbs, a method for flipping objects over; and virtual tools, a method for propagating behaviour to other virtual objects in the scene. We show how sticky tools can introduce richer meaning to tabletop computing by drawing a parallel between sticky tools and the discussion in Urp [20] around the meaning of tangible devices in terms of nouns, verbs, reconfigurable tools, attributes, and pure objects. We then relate this discussion to other force-based interaction techniques by describing how a designer can introduce complexity in how people can control both physical and virtual objects, how physical objects can control both physical and virtual objects, and how virtual objects can control virtual objects.

© All rights reserved Hancock et al. and/or their publisher

p. 141-148

Ajaj, Rami, Vernier, Frederic and Jacquemin, Christian (2009): Navigation modes for combined table/screen 3D scene rendering. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 141-148. Available online

This paper compares two navigation techniques for settings that combine a 2D table-top view and a large 3D wall display, both rendering the same 3D virtual scene. The two navigation techniques, called Camera Based (CB) and View Based (VB), rely strongly on the spatial relationships between the two displays. In the CB technique, the 3D point of view displayed on the wall is controlled through a draggable icon on the 2D table-top view. The VB technique presents the same icon on the table-top view, but statically located at the center and oriented toward the physical wall display, while the user pans and rotates the whole scene around the icon. While CB offers a more consistent 2D view, VB reduces the mental rotations the user must perform to understand the relations between the two views. We performed a comparative user study showing a user preference for the VB technique, while performance on complex tasks was better with the CB technique. Finally, we discuss other aspects of such navigation techniques, such as the possibility of having more than one point of view, occlusion, and multiple users.

© All rights reserved Ajaj et al. and/or their publisher

p. 149-156

Frisch, Mathias, Heydekorn, Jens and Dachselt, Raimund (2009): Investigating multi-touch and pen gestures for diagram editing on interactive surfaces. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 149-156. Available online

Creating and editing large graphs and node-link diagrams are crucial activities in many application areas, and we consider multi-touch and pen input on interactive surfaces very promising for them. This fundamental work presents a user study investigating how people edit node-link diagrams on an interactive tabletop. The study covers a set of basic operations, such as creating, moving, and deleting diagram elements. Participants were asked to perform spontaneous gestures for 14 given tasks. They could interact in three different ways: using one hand, both hands, or pen and hand together. The subjects' activities were observed, recorded in various ways, analyzed, and enriched with think-aloud data. As a result, we contribute a user-elicited collection of touch and pen gestures for editing node-link diagrams. The study provides valuable insight into how people would interact on interactive surfaces, for this as well as other tabletop domains.

© All rights reserved Frisch et al. and/or their publisher

p. 157-164

Hancock, Mark, Nacenta, Miguel, Gutwin, Carl and Carpendale, Sheelagh (2009): The effects of changing projection geometry on the interpretation of 3D orientation on tabletops. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 157-164. Available online

Applications with 3D models are now becoming more common on tabletop displays. Displaying 3D objects on tables, however, presents problems in the way that the 3D virtual scene is presented on the 2D surface; different choices in the way the projection is designed can lead to distorted images and difficulty interpreting angles and orientations. To investigate these problems, we studied people's ability to judge object orientations under different projection conditions. We found that errors increased significantly as the center of projection diverged from the observer's viewpoint, showing that designers must take this divergence into consideration, particularly for multi-user tables. In addition, we found that a neutral center of projection combined with parallel projection geometry provided a reasonable compromise for multi-user situations.

© All rights reserved Hancock et al. and/or their publisher

p. 165-172

Freeman, Dustin, Benko, Hrvoje, Morris, Meredith Ringel and Wigdor, Daniel (2009): ShadowGuides: visualizations for in-situ learning of multi-touch and whole-hand gestures. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 165-172. Available online

We present ShadowGuides, a system for in-situ learning of multi-touch and whole-hand gestures on interactive surfaces. ShadowGuides provides on-demand assistance to the user by combining visualizations of the user's current hand posture as interpreted by the system (feedback) and available postures and completion paths necessary to finish the gesture (feedforward). Our experiment compared participants learning gestures with ShadowGuides to those learning with video-based instruction. We found that participants learning with ShadowGuides remembered more gestures and expressed significantly higher preference for the help system.

© All rights reserved Freeman et al. and/or their publisher

p. 17-24

Hansen, Thomas E., Hourcade, Juan Pablo, Virbel, Mathieu, Patali, Sharath and Serra, Tiago (2009): PyMT: a post-WIMP multi-touch user interface toolkit. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 17-24. Available online

Multi-touch and tabletop input paradigms open novel doors for post-WIMP (Windows, Icons, Menus, Pointer) user interfaces. Developing these novel interfaces and applications poses unique challenges for designers and programmers alike. We present PyMT (Python Multi-Touch), a toolkit aimed at addressing these challenges. We discuss PyMT's architecture and sample applications to demonstrate how it enables rapid development of prototypes and interaction techniques while being accessible to novice programmers and providing great flexibility and creative freedom to advanced users. We share experiences gathered in the open source development of PyMT to explore design and programming challenges posed by multi-touch tabletop and post-WIMP interfaces. Specifically, we discuss changes to the event model and the implementation of development and debugging tools that we found useful along the way.

© All rights reserved Hansen et al. and/or their publisher

p. 173-180

Hesselmann, Tobias, Flöring, Stefan and Schmitt, Marwin (2009): Stacked Half-Pie menus: navigating nested menus on interactive tabletops. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 173-180. Available online

Hierarchical menus can be found in many of today's software applications. However, these menus are often optimized for mouse or keyboard interaction, and their suitability for touch-screen-based interactive tabletops is questionable. On touch-based interfaces, screen occlusion by the user, menu item size, and the use of intuitive navigation paradigms are essential aspects that need to be considered. In this paper we present our approach, "Stacked Half-Pie menus", which allows visualization of an unlimited number of hierarchical menu items as well as interactive navigation and selection of these items by touch. Our evaluation shows fairly high usability for touchable half-pie menus, making them an interesting alternative to other established menu types on interactive tabletops.

© All rights reserved Hesselmann et al. and/or their publisher

p. 181-188

Tuddenham, Philip, Davies, Ian and Robinson, Peter (2009): WebSurface: an interface for co-located collaborative information gathering. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 181-188. Available online

Co-located collaborative Web browsing is a relatively common task and yet is poorly supported by conventional tools. Prior research in this area has focused on adapting conventional browsing interfaces to add collaboration support. We propose an alternative approach, drawing on ideas from tabletop interfaces. We present WebSurface, a novel tabletop interface for collaborative Web browsing. WebSurface explores two design challenges of this approach: providing sufficient resolution for legible text; and navigating through information. We report our early experiences with an exploratory user study, in which pairs of collaborators gathered information using WebSurface. The findings suggest that a tabletop approach for collaborative Web browsing can help address limitations of conventional tools, and presents beneficial affordances for information layout.

© All rights reserved Tuddenham et al. and/or their publisher

p. 189-196

Fleck, Rowanne, Rogers, Yvonne, Yuill, Nicola, Marshall, Paul, Carr, Amanda, Rick, Jochen and Bonnett, Victoria (2009): Actions speak loudly with words: unpacking collaboration around the table. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 189-196. Available online

The potential of tabletops to enable groups of people to simultaneously touch and manipulate a shared tabletop interface provides new possibilities for supporting collaborative learning. However, findings from the few studies carried out to date have tended to show small or insignificant effects compared with other technologies. We present the Collaborative Learning Mechanisms framework, used to examine the coupling of verbal interactions and physical actions in collaboration around the tabletop and to reveal the subtle mechanisms at play. This analysis revealed that interactions and intrusions that might be considered undesirable or harmful in general collaborative settings can be beneficial for collaborative learning. We discuss the implications of these findings for how tabletops may be used to support children's collaboration, and the value of considering verbal and physical aspects of interaction together in this way.

© All rights reserved Fleck et al. and/or their publisher

p. 197-204

Battocchi, A., Pianesi, F., Tomasini, D., Zancanaro, M., Esposito, G., Venuti, P., Sasson, A. Ben, Gal, E. and Weiss, P. L. (2009): Collaborative Puzzle Game: a tabletop interactive game for fostering collaboration in children with Autism Spectrum Disorders (ASD). In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 197-204. Available online

We present the design and evaluation of the Collaborative Puzzle Game (CPG), a tabletop interactive activity developed for fostering collaboration in children with Autism Spectrum Disorder (ASD). The CPG was inspired by cardboard jigsaw puzzles and runs on the MERL DiamondTouch table [7]. Digital pieces can be manipulated by direct finger touch. The CPG features a set of interaction rules called Enforced Collaboration (EC): in order to be moved, puzzle pieces must be touched and dragged simultaneously by two players. Two studies were conducted to test whether EC has the potential to serve as an interaction paradigm that would help foster collaborative skills. In Study 1, 70 boys with typical development were tested, and in Study 2, 16 boys with ASD were tested. Results show that EC has a positive effect on collaboration, although it appears to be associated with a more complex interaction. For children with ASD, EC was also related to a higher number of "negotiation" moves, which may reflect their greater need for coordination during the collaborative activity.

© All rights reserved Battocchi et al. and/or their publisher

p. 25-28

Jackson, Daniel, Bartindale, Tom and Olivier, Patrick (2009): FiberBoard: compact multi-touch display using channeled light. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 25-28. Available online

Multi-touch displays based on infrared (IR) light offer many advantages over alternative technologies. Existing IR multi-touch devices either use complex custom electronic sensor arrays, or a camera that must be placed relatively distant from the display. FiberBoard is an easily constructed, compact IR-sensing multi-touch display. Using an array of optical fibers, reflected IR light is channeled to a camera. As the fibers are flexible, the camera is free to be positioned so as to minimize the depth of the device. The resulting display is around one tenth of the depth of a conventional camera-based multi-touch display. We describe our prototype, its novel calibration process, and virtual camera software based on existing multi-touch image processing tools.

© All rights reserved Jackson et al. and/or their publisher

p. 29-32

Echtler, Florian, Dippon, Andreas, Tönnis, Marcus and Klinker, Gudrun (2009): Inverted FTIR: easy multitouch sensing for flatscreens. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 29-32. Available online

The increased attention which multitouch interfaces have received in recent years is partly due to the availability of cheap sensing hardware such as FTIR-based screens. However, this method has so far required a bulky projector-camera setup behind the screen. In this paper, we present a new approach to FTIR sensing by "inverting" the setup and placing the camera in front of the screen. This allows the use of unmodified flat screens as display, thereby dramatically shrinking the space required behind the screen and enabling the easy construction of new types of interactive surfaces.

© All rights reserved Echtler et al. and/or their publisher

p. 33-40

Seifried, Thomas, Haller, Michael, Scott, Stacey D., Perteneder, Florian, Rendl, Christian, Sakamoto, Daisuke and Inami, Masahiko (2009): CRISTAL: a collaborative home media and device controller based on a multi-touch display. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 33-40. Available online

While most homes are inherently social places, existing devices designed to control consumer electronics typically support only single-user interaction. Further, as the number of consumer electronics in modern homes increases, people are often forced to switch between many controllers to interact with these devices. To simplify interaction with these devices and to enable more collaborative forms of device control, we propose an integrated remote control system called CRISTAL (Control of Remotely Interfaced Systems using Touch-based Actions in Living spaces). CRISTAL lets multiple users control a wide variety of digital devices from a centralized, interactive tabletop system, operating home media devices through an intuitive, gesture-based interface built on a virtually augmented video image of the surrounding environment. A preliminary user study of the CRISTAL system is presented, along with a discussion of future research directions.

© All rights reserved Seifried et al. and/or their publisher

p. 41-48

Micire, Mark, Desai, Munjal, Courtemanche, Amanda, Tsui, Katherine M. and Yanco, Holly A. (2009): Analysis of natural gestures for controlling robot teams on multi-touch tabletop surfaces. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 41-48. Available online

Multi-touch technologies hold much promise for the command and control of mobile robot teams. To improve the ease of learning and usability of these interfaces, we conducted an experiment to determine the gestures that people would naturally use, rather than the gestures they would be instructed to use in a pre-designed system. A set of 26 tasks with differing control needs were presented sequentially on a DiamondTouch to 31 participants. We found that the task of controlling robots exposed unique gesture sets and considerations not previously observed, particularly in desktop-like applications. In this paper, we present the details of these findings, a taxonomy of the gesture set, and guidelines for designing gesture sets for robot control.

© All rights reserved Micire et al. and/or their publisher

p. 49-52

Helmes, John, Cao, Xiang, Lindley, Sian E. and Sellen, Abigail (2009): Developing the story: designing an interactive storytelling application. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 49-52. Available online

This paper describes the design of a tabletop storytelling application for children, called TellTable. The goal of the system was to stimulate creativity and collaboration by allowing children to develop their own story characters and scenery through photography and drawing, and record stories through direct manipulation and narration. Here we present the initial interface design and its iteration following the results of a preliminary trial. We also describe key findings from TellTable's deployment in a primary school that relate to its design, before concluding with a discussion of design implications from the process.

© All rights reserved Helmes et al. and/or their publisher

p. 53-56

Vandoren, Peter, Claesen, Luc, Laerhoven, Tom Van, Taelman, Johannes, Raymaekers, Chris, Flerackers, Eddy and Reeth, Frank Van (2009): FluidPaint: an interactive digital painting system using real wet brushes. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 53-56. Available online

This paper presents FluidPaint, a novel digital paint system using real wet brushes. A new interactive canvas has been developed that registers brush footprints and paint strokes with high precision. It is based on real-time imaging of brushes and other painting instruments, as well as real-time, co-located rendering of the painting results. This new painting user interface enhances the user experience and the artist's expressiveness. User tests demonstrate the intuitive nature of FluidPaint, which naturally integrates interface elements of traditional painting into a digital paint system.

© All rights reserved Vandoren et al. and/or their publisher

p. 57-60

Bartindale, Tom and Harrison, Chris (2009): Stacks on the surface: resolving physical order using fiducial markers with structured transparency. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 57-60. Available online

We present a method for identifying the order of stacked items on interactive surfaces. This is achieved using conventional, passive fiducial markers which, in addition to reflective regions, incorporate structured areas of transparency. This allows particular orderings to appear as unique marker patterns. We discuss how such markers are encoded and fabricated, and include the relevant mathematics. To motivate our approach, we comment on various scenarios where stacking could be especially useful. We conclude with details from our proof-of-concept implementation, built on Microsoft Surface.

© All rights reserved Bartindale and Harrison and/or their publisher
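The principle of encoding order through transparency can be illustrated with a toy scheme (invented here, not the paper's actual marker encoding): give every pair of markers one shared cell in which both members are opaque and all other markers are transparent. The camera then sees, in each cell, whichever member of the pair is higher in the stack, and the full ordering follows from the pairwise results.

```python
from itertools import combinations

def composite(stack, cells):
    """Render what the camera would see for a top-to-bottom stack: for each
    cell, the symbol of the topmost marker that is opaque there."""
    image = {}
    for cell, opaque_markers in cells.items():
        for marker in stack:  # scan from the top of the stack down
            if marker in opaque_markers:
                image[cell] = marker
                break
    return image

def decode_order(image, markers):
    """Recover the stacking order: the marker seen in cell (i, j) is above
    the other member of the pair; rank markers by their 'above' wins."""
    wins = {m: 0 for m in markers}
    for pair in combinations(sorted(markers), 2):
        wins[image[pair]] += 1
    return sorted(markers, key=lambda m: -wins[m])

markers = ["A", "B", "C"]
# One shared cell per marker pair; both members are opaque there, so the
# cell always shows whichever of the pair sits higher in the stack.
cells = {pair: set(pair) for pair in combinations(markers, 2)}

stack = ["B", "C", "A"]  # B on top, A on the bottom
print(decode_order(composite(stack, cells), markers))  # → ['B', 'C', 'A']
```

Because a physical stack induces a transitive "above" relation, the pairwise wins always yield a unique ranking; the paper's contribution is doing this with passive printed markers rather than active sensing.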

p. 61-64

Abednego, Martha, Lee, Joong-Ho, Moon, Won and Park, Ji-Hyung (2009): I-Grabber: expanding physical reach in a large-display tabletop environment through the use of a virtual grabber. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 61-64. Available online

While working on large tabletop interfaces, reaching and manipulating objects beyond the physical reach of a user can be considerably vexing. To reach such an object, a user may have to physically move to the object's location. Alternatively, a user could attempt to reach for the object from his or her current location, but may obstruct the territory of other users as a result. We propose a multi-touch interaction technique that enables users to easily select and manipulate objects beyond their physical reach. Our technique provides direct visual feedback to users, which allows them to be aware of their current active location. Using a controllable "interactive grabber" (I-Grabber) as a virtual hand extension, users can reach and manipulate any object from their current location.

© All rights reserved Abednego et al. and/or their publisher

p. 65-68

Dragicevic, Pierre and Shi, Yuanchun (2009): Visualizing and manipulating automatic document orientation methods using vector fields. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 65-68. Available online

We introduce and illustrate a design framework whereby tabletop documents are oriented according to vector fields that can be visualized and altered by end users. We explore and illustrate the design space using interactive 2D mockups and show how this approach can potentially combine the advantages of the fully manual and fully automatic document orientation methods previously proposed in the literature.

© All rights reserved Dragicevic and Shi and/or their publisher
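The approach lends itself to a compact sketch: represent orientation as a 2D vector field, let the user blend local edits into it, and sample the combined field wherever a document lands. The radial base field and the local "tweak" patch below are invented examples in the spirit of the framework, not the paper's mockups.

```python
import math

def radial_field(x, y, cx=0.5, cy=0.5):
    """Base field: each document faces outward from the table centre,
    a common automatic orientation policy for round-the-table viewing."""
    return (x - cx, y - cy)

def user_tweak(x, y, px=0.8, py=0.8, strength=0.3, radius=0.25):
    """A hypothetical end-user edit: a patch around (px, py) whose vectors
    pull document orientations toward facing one particular seat."""
    d = math.hypot(x - px, y - py)
    w = max(0.0, 1.0 - d / radius) * strength  # falls off linearly to 0
    return (0.0, -w)

def document_angle(x, y):
    """Sample the combined field at a document's position and return the
    angle (radians) the document should be rotated to."""
    bx, by = radial_field(x, y)
    tx, ty = user_tweak(x, y)
    return math.atan2(by + ty, bx + tx)

# A document at the centre-left faces the left edge; one dropped inside
# the tweaked patch is rotated toward the edited seat instead.
print(round(math.degrees(document_angle(0.1, 0.5))))  # → 180
```

Making the field itself visible and editable, as the paper proposes, amounts to exposing functions like `user_tweak` to direct manipulation instead of hiding the orientation policy inside the system.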

p. 69-76

Spindler, Martin, Stellmach, Sophie and Dachselt, Raimund (2009): PaperLens: advanced magic lens interaction above the tabletop. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 69-76. Available online

In order to improve the three-dimensional (3D) exploration of virtual spaces above a tabletop, we developed a set of navigation techniques using a handheld magic lens. These techniques allow for an intuitive interaction with two-dimensional and 3D information spaces, for which we contribute a classification into volumetric, layered, zoomable, and temporal spaces. The proposed PaperLens system uses a tracked sheet of paper to navigate these spaces with regard to the Z-dimension (height above the tabletop). A formative user study provided valuable feedback for the improvement of the PaperLens system with respect to layer interaction and navigation. In particular, the problem of keeping the focus on selected layers was addressed. We also propose additional vertical displays in order to provide further contextual clues.

© All rights reserved Spindler et al. and/or their publisher

p. 77-84

Hancock, Mark, Hilliges, Otmar, Collins, Christopher, Baur, Dominikus and Carpendale, Sheelagh (2009): Exploring tangible and direct touch interfaces for manipulating 2D and 3D information on a digital table. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 77-84. Available online

On traditional tables, people often manipulate a variety of physical objects, both 2D in nature (e.g., paper) and 3D in nature (e.g., books, pens, models, etc.). Current advances in hardware technology for tabletop displays introduce the possibility of mimicking these physical interactions through direct-touch or tangible user interfaces. While both promise intuitive physical interaction, they are rarely discussed in combination in the literature. In this paper, we present a study that explores the advantages and disadvantages of tangible and touch interfaces, specifically in relation to one another. We discuss our results in terms of how effective each technique was for accomplishing both a 3D object manipulation task and a 2D information visualization exploration task. Results suggest that people can more quickly move and rotate objects in 2D with our touch interaction, but more effectively navigate the visualization using tangible interaction. We discuss how our results can be used to inform future designs of tangible and touch interaction.

© All rights reserved Hancock et al. and/or their publisher

p. 85-92

Marquardt, Nicolai, Nacenta, Miguel A., Young, James E., Carpendale, Sheelagh, Greenberg, Saul and Sharlin, Ehud (2009): The Haptic Tabletop Puck: tactile feedback for interactive tabletops. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 85-92. Available online

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same, regardless of what it presents visually. In this paper, we explore how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device -- the Haptic Tabletop Puck -- that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the area of haptic information visualization, haptic graphical interfaces, and computer supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture and malleability of digital objects.

© All rights reserved Marquardt et al. and/or their publisher

p. 9-16

Kaltenbrunner, Martin (2009): reacTIVision and TUIO: a tangible tabletop toolkit. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 9-16. Available online

This article presents the recent updates and an evaluation of reacTIVision, a computer vision toolkit for fiducial marker tracking and multi-touch interaction. It also discusses the current and future development of the TUIO protocol and framework, which has been primarily designed as an abstraction layer for the description and transmission of pointers and tangible object states in the context of interactive tabletop surfaces. The initial protocol definition proved to be rather robust due to the simple and straightforward implementation approach, which also supported its widespread adoption within the open source community. This article also discusses the current limitations of this simplistic approach and provides an outlook towards a next generation protocol definition, which will address the need for additional descriptors and the protocol's general extensibility.

© All rights reserved Kaltenbrunner and/or his/her publisher
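As a rough illustration of the kind of state the TUIO protocol describes, the sketch below models a tracked 2D cursor loosely after the attributes of a TUIO 1.1 /tuio/2Dcur "set" message (session id, normalised position, velocity, motion acceleration), and applies an "alive"-style update that drops lifted touches. It is a simplified model, not the actual OSC wire format or the reacTIVision code.

```python
from dataclasses import dataclass

@dataclass
class Cursor2D:
    """One tracked touch point, loosely following the TUIO 1.1
    /tuio/2Dcur 'set' attributes. Illustrative model only."""
    session_id: int
    x: float          # normalised [0, 1] across the surface
    y: float
    x_vel: float = 0.0
    y_vel: float = 0.0
    accel: float = 0.0

def alive_update(tracked, alive_ids):
    """Apply a TUIO-style 'alive' message: keep only cursors whose
    session ids the tracker still lists (lifted fingers disappear)."""
    return {sid: c for sid, c in tracked.items() if sid in alive_ids}
```

The alive/set split is what makes the protocol robust to lost packets: state is re-declared each frame rather than sent as add/remove deltas, so a receiver can always reconstruct the current touch set.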

p. 93-100

Benko, Hrvoje, Saponas, T. Scott, Morris, Dan and Tan, Desney (2009): Enhancing input on and above the interactive surface with muscle sensing. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 93-100. Available online

Current interactive surfaces provide little or no information about which fingers are touching the surface, the amount of pressure exerted, or gestures that occur when not in contact with the surface. These limitations constrain the interaction vocabulary available to interactive surface systems. In our work, we extend the surface interaction space by using muscle sensing to provide complementary information about finger movement and posture. In this paper, we describe a novel system that combines muscle sensing with a multi-touch tabletop, and introduce a series of new interaction techniques enabled by this combination. We present observations from an initial system evaluation and discuss the limitations and challenges of utilizing muscle sensing for tabletop applications.

© All rights reserved Benko et al. and/or their publisher

p. D1

Zarin, Ru (2009): Trollskogen: a multitouch table top framework for enhancing communication amongst cognitively disabled children. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D1. Available online


Trollskogen is a communicative framework designed to enhance communication among people with cognitive disabilities. The forest is split into interactive modules that provide a fun and engaging learning environment while helping improve certain aspects of speech, reading/writing, and symbol-based languages. The framework has been deployed on a custom multi-touch table prototype built at the Interactive Institute Umeå, enabling children to interact with their fingers in a more natural, intuitive way than with a traditional keyboard/mouse setup.

© All rights reserved Zarin and/or his/her publisher

p. D10

Bortolaso, Christophe, Dubois, Emmanuel, Dittlo, Nicolas and Rivière, Jean-Baptiste de la (2009): 3D multitouch advanced interaction techniques and applications. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D10. Available online

The Cubtile is a 3D multitouch device composed of five multitouch surfaces. It extends classical multitouch gestures to 3D, thereby easing the manipulation of 3D scenes: it provides more direct ways to handle complex 3D operations, such as applying arbitrary rotations. The video illustrates several of the advanced gestures the Cubtile supports and demonstrates the integration of the device in an actual museum application, which lets visitors, even non-experts, efficiently manipulate a 3D environment used to teach the basics of species classification.

© All rights reserved Bortolaso et al. and/or their publisher

p. D11

Flöring, Stefan and Hesselmann, Tobias (2009): TAP: visual analytics on surface computers. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D11. Available online

In this demo we present TaP, a gesture-driven visual analytics application for exploring vast amounts of multidimensional data. Using multitouch gestures, users can intuitively interact with the system to examine data from different perspectives, modify visualisations in various ways, and ultimately unearth insights hidden inside the data. On interactive tabletops, the system also enables multiple users to collaboratively analyse data in separate charts on the screen. The application integrates stacked half-pie menus, a new approach for navigating deeply nested hierarchic data structures, designed specifically for use on interactive tabletops.

© All rights reserved Flöring and Hesselmann and/or their publisher

p. D12

David, Darren, Granas, Lee, Konig, Jules, Moody, Nathan and Santangelo, Joshua (2009): TouchTones: multi-user collaborative music composer. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D12. Available online

TouchTones lets up to four people create music collaboratively on Microsoft Surface. You don't need to know anything about music to make something that sounds beautiful: start an instrument playing by touching a colored spinner, change the arrow directions on the grid to change the melody, and that's about it. TouchTones provides an immediate and enjoyable musical experience for any small group, and can be learned with only a few seconds of exploration or by viewing its integrated help video. From there, additional features emerge through play: create tricky melody paths through the note grid, or use multiple fingers and play TouchTones like a keyboard. Tested with users from age 4 to age 60, TouchTones opens up minutes or hours of enjoyment for as few as one user or a whole family. TouchTones is a collaborative, multi-touch, multi-user, grid-based music sequencer being released as freeware for Microsoft Surface. It has four instruments distributed across four octaves, all playing to a master tempo. Sounds can be triggered by user-controlled animated "sprites" or simply by pressing a colored button and one of the icons on the grid at the same time. The patterns on the grid produce melody, and anyone can alter the melody even while it's playing. Volume and reset controls round out the simple, wholly visual user interface. While TouchTones comes with a clean, modern design and a set of pleasant sounds, it has been designed to be reskinnable: both the sounds and visuals can be completely customized to match any brand, mood, or theme.

© All rights reserved David et al. and/or their publisher
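The grid-and-arrows mechanic described above can be sketched as a tiny step sequencer: a note "sprite" advances one cell per tick in the direction stored at its current cell, wrapping at the edges, and the cell it lands on determines the note that sounds. This is a hypothetical reconstruction of the mechanic, not the released TouchTones code.

```python
# Direction vectors for the arrow stored in each grid cell.
DIRS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

def step_sprite(pos, grid, width, height):
    """Advance a note sprite one cell in the direction stored at its
    current cell, wrapping at the grid edges. grid[y][x] holds an arrow
    name; the returned cell is the note that sounds on this tick."""
    x, y = pos
    dx, dy = DIRS[grid[y][x]]
    return ((x + dx) % width, (y + dy) % height)
```

Because every cell always holds some arrow, a sprite can never stall; editing an arrow while the sequencer runs simply redirects the sprite the next time it visits that cell, which is what lets anyone alter the melody mid-playback.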

p. D13

Jackson, Daniel, Bartindale, Tom and Olivier, Patrick (2009): FiberBoard: compact multi-touch display using channeled light. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D13. Available online

Multi-touch displays based on infrared (IR) light offer many advantages over alternative technologies. Existing IR multi-touch devices either use complex custom electronic sensor arrays, or a camera that must be placed relatively far from the display. FiberBoard is an easily constructed, compact IR-sensing multi-touch display. Using an array of optical fibers, reflected IR light is channeled to a camera. Because the fibers are flexible, the camera can be positioned so as to minimize the depth of the device. The resulting display is around one tenth of the depth of a conventional camera-based multi-touch display. We present our prototype, its novel calibration process, and virtual camera software based on existing multi-touch image processing tools.

© All rights reserved Jackson et al. and/or their publisher
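The calibration described above amounts to learning which camera pixel each fiber end maps to on the display. Assuming a calibration pass that lights each fiber in turn and records (camera point, display point) pairs, touch lookup can be sketched as a nearest-neighbour query. All names here are illustrative assumptions, not the authors' software.

```python
def nearest_fiber(cam_point, fiber_map):
    """Given a bright spot detected in the camera image, return the
    display coordinate of the closest calibrated fiber end.
    fiber_map: list of ((cam_x, cam_y), (disp_x, disp_y)) pairs recorded
    during calibration. Naive linear scan; a real system might use a
    spatial index instead."""
    cx, cy = cam_point
    cam, disp = min(
        fiber_map,
        key=lambda pair: (pair[0][0] - cx) ** 2 + (pair[0][1] - cy) ** 2)
    return disp
```

Because the fibers can land on the camera sensor in arbitrary order, a per-fiber lookup table like this replaces the global homography a conventional camera-based setup would use.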

p. D14

Frieß, Marc René, Kleinhans, Martin, Echtler, Florian, Forster, Florian and Groh, Georg (2009): A multi-touch tabletop interface for applying collaborative creativity techniques. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D14. Available online

The demo video presents a collaborative multi-touch tabletop interface for a tool that supports idea generation by providing a generic architectural model for collaborative creativity techniques. The tool also provides situated support by letting users select among different user-interaction paradigms adapted to the interaction situation. For co-located settings in which communication and coordination matter, a tabletop interface is a promising form of IT support.

© All rights reserved Frieß et al. and/or their publisher

p. D15

Voss, Henning and Schneider, Georg (2009): Nori Scrum meeting table. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D15. Available online

Scrum is a process model commonly used for agile software development. It is built around several meetings in which teams of developers plan and monitor the development process together. These meetings, an integral part of Scrum, are usually held without the support of software or digital media. We developed an interactive, multitouch-enabled meeting table that teams can use throughout the development process. The multitouch application guides development teams through meetings and lets them plan and track their work directly and together. By supporting the development team with our interactive meeting table, the efficiency of Scrum-based software development can be increased.

© All rights reserved Voss and Schneider and/or their publisher

p. D2

Marquardt, Nicolai, Nacenta, Miguel A., Young, James E., Carpendale, Sheelagh, Greenberg, Saul and Sharlin, Ehud (2009): The Haptic Tabletop Puck: the video. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D2. Available online

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same, regardless of what it presents visually. In this video, we demonstrate how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device -- the Haptic Tabletop Puck -- that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the area of haptic information visualization, haptic graphical interfaces, and computer supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture and malleability of digital objects.

© All rights reserved Marquardt et al. and/or their publisher

p. D4

Chaboissier, Jonathan and Vernier, Frederic (2009): RealTimeChess: a real-time strategy and multiplayer game for tabletop displays. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D4. Available online

RealTimeChess (RTC) is a real-time, multiplayer strategy game designed for tabletop displays. It is based on the pieces of chess but enables two to four players to play simultaneously. The speed of the game can be adjusted, and no knowledge of chess strategy is needed to play RTC. Several game types are available, including a tutorial for beginners. Finally, RTC seamlessly combines direct and remote interaction techniques.

© All rights reserved Chaboissier and Vernier and/or their publisher

p. D5

Sprengart, Benjamin, Collins, Anthony and Kay, Judy (2009): Curator: a design environment for curating tabletop museum experiences. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D5. Available online

Interactive tabletops show great potential to be used in learning contexts, particularly in museums, as a way for people to collaboratively learn and explore rich sets of digital information. However, it is a real challenge for exhibition designers, or Curators, to create exhibitions for tabletop displays, as it is tedious to create these data-sets manually. Curator is a cross-platform tool that can be used by non-technical designers and museum staff to construct rich information collections for exploration on our interactive tabletop. After the data-set has been constructed using Curator on a desktop computer, this information can be tested and displayed on the tabletop immediately, providing an engaging, collaborative experience for exploration and learning.

© All rights reserved Sprengart et al. and/or their publisher

p. D6

Funato, Daisuke, Shibuya, Satoshi, Kizuka, Ayumi, Kimura, Ken-ichi and Naganuma, Rina (2009): Seamless interaction between "creation" and "appreciation": multi-touch drawing interface. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D6. Available online

This study proposes a system that seamlessly supports both "creation" and "appreciation" through a multi-touch interface. The interface lets drawings be expressed with a variety of physical operations and personal belongings, achieving a diversity of drawing acts that is difficult with a mouse or pen tablet. In addition, pictures produced by a user are immediately posted to an online gallery, creating an environment in which users can appreciate both their own works and those of others.

© All rights reserved Funato et al. and/or their publisher

p. D7

Spindler, Martin and Dachselt, Raimund (2009): PaperLens: advanced magic lens interaction above the tabletop. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D7. Available online

To address the challenge of exploring large information spaces on interactive surfaces such as tabletops, we developed an optically tracked, lightweight, passive display (magic lens) that provides elegant three-dimensional exploration of rich datasets. These can be volumetric, layered, zoomable, or temporal information spaces, which are mapped onto the physical volume above a tabletop. By moving the magic lens through the volume, the corresponding data is displayed, so the lens serves as a window into virtuality. We introduce various interaction techniques that exploit the lens' height in novel ways, e.g. for zooming or for displaying different information layers.

© All rights reserved Spindler and Dachselt and/or their publisher

p. D8

Sato, Toshiki, Mamiya, Haruko, Fukuchi, Kentaro and Koike, Hideki (2009): PAC-PAC: pinching gesture recognition for augmented tabletop video game. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D8. Available online

A novel tabletop entertainment system that allows simultaneous interactions by multiple participants was developed. The newly developed interaction technique of this system recognizes a pinching gesture performed with the thumb and forefinger. This gesture recognition technique enables rapid response and high degree-of-freedom input for the players.

© All rights reserved Sato et al. and/or their publisher

p. D9

Bichlmeier, Christoph, Heining, Sandro-Michael, Omary, Latifa, Stefan, Philipp, Ockert, Ben, Euler, Ekkehard and Navab, Nassir (2009): MeTaTop: a multi-sensory and multi-user interface for collaborative analysis. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D9. Available online

The video demonstrates the potential of integrating tabletop systems into medical environments to support the clinical workflow. Our group investigates their application to collaboratively reviewing patient data, such as medical imaging data, for diagnostics and for the preparation and planning of surgical procedures. Usually, a team of clinicians dealing with a particular case reviews available patient data on a computer monitor. Browsing through stacks of slices reconstructed from volumetric imaging data is performed by only one person using classical interfaces such as keyboard or mouse; other members of the team passively examine the presented imagery by looking over the main user's shoulder. To enhance the collaborative aspect of analyzing patient data, we suggest giving every participant the ability to contribute more actively. For this reason, we designed and developed a multi-touch tabletop display system to support team-oriented discussion and decision making upon intuitively, interactively, and effectively presented patient data.

© All rights reserved Bichlmeier et al. and/or their publisher



Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/conferences/proceedings_of_the_2009_acm_international_conference_on_interactive_tabletops_and_surfaces.html