Proceedings of the 1st International Conference on Tangible and Embedded Interaction
TEI is the first international conference dedicated to research in tangible, embedded, and embodied interaction. The conference attempts to bring together this new field, providing a meeting ground for the diverse communities of research and practice involved with tangibles -- from computing, hardware, and sensor technology, to HCI, interaction design, and CSCW, to product and industrial design and interactive arts.
The following articles are from "Proceedings of the 1st International Conference on Tangible and Embedded Interaction":
Shultz, Peter (2007): Brand consciousness as a driving design force. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 103-104. Available online
This paper describes a recent interactive project which used slide projectors (instead of video projection or computer screens) to best fulfill the brand-specific communication needs of an installation. It discusses why brand issues matter, as well as how the design of systems, both the physical systems and the content therein, can address those issues.
Minakuchi, Mitsuru and Nakamura, Satoshi (2007): Collaborative ambient systems by blow displays. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 105-108. Available online
We implemented blow displays, which provide force feelings with no contact. Although blow displays can use only wind velocities and directions to represent information, they are less intrusive and less visually polluting to other media than other displays. We propose collaborative ambient systems to utilize blow displays' characteristics of spatiality and compatibility with other media. In collaborative ambient systems, blow displays direct the user to displays that provide rich information. Blow displays can also express information auxiliary to the main content the user is attending to. In this paper, we describe some ongoing applications and discuss their benefits and issues.
Research on Tangible Interaction (TI) has been inspired by many different disciplines, including psychology, sociology, engineering and human-computer interaction (HCI). Now that the field is maturing, in the sense that basic technologies and interaction paradigms have been explored, we observe a growing potential for a more design-oriented research approach. We suggest that there are several arguments for this proposed broadening of the TI perspective: 1) the need for designing products within contexts-of-use that are much more challenging and diverse than the task-oriented desktop (or tabletop) systems that mostly inspire us today, 2) the interest in designing TI starting from existing physical activities, instead of only as add-ons to digital applications, 3) the need for iterative design and evaluation of prototypes in order to develop applications that are grounded within daily practice over prolonged periods of time, and 4) the need to extend ease-of-use to more hedonic aspects of interaction such as fun and engagement.
Etter, Richard and Röcker, Carsten (2007): A tangible user interface for multi-user awareness systems. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 11-12. Available online
In this paper a music-based awareness system called 'Social Radio' is presented. The system focuses on small intimate groups and enables multiple persons to stay in touch using smart artifacts and tangible interaction mechanisms.
Sokoler, Tomas, Lowgren, Jonas, Eriksen, Mette Agger, Linde, Per and Olofsson, Stefan (2007): Explicit interaction for surgical rehabilitation. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 117-124. Available online
We discuss the design ideal of explicit interaction, which is a way to approach the dimensions of explicitness versus ambience and explicitness versus obtrusiveness in ubiquitous computing. Explicit interaction refers to interaction techniques designed to make actions and intentions visible, understandable and accountable. We introduce three levels of analysis -- usability, materialization, and social performance -- and present the design of an explicit interaction assembly of devices for rehabilitation after hand surgery. The assembly, intended to support video recording during patient-therapist consultations, is evaluated and we find that it provides superior usability and the potential to improve rehabilitation outcomes through materialization. Moreover, we find that the design of cues to support the social practice in the rehabilitation ward needs to be improved since the assembly allowed for uses unanticipated during the design.
Regier, Hannah (2007): Giving materials a voice. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 125-126. Available online
My thesis work involves incorporating physical materials into communication design systems. As part of this work, I have been using embedded interactions to explore creating behaviors, attitudes, and personalities that convey the spirit and living origins of materials such as leather, cloth, wood and felt. With the help of custom authoring software (a Flash-based interface), interactive objects and spaces become a tool for the designer to explore the rich narrative potentials that exist between physical materials and humans.
Hurtienne, Jörn and Israel, Johann Habakuk (2007): Image schemas and their metaphorical extensions: intuitive patterns for tangible interaction. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 127-134. Available online
One of the goals of tangible interaction is to build more intuitive interfaces. This paper gives a definition of intuitiveness and presents a continuum of knowledge serving as a classification for intuitive interaction. Against the background of this continuum, recent taxonomies for tangible interaction are reviewed. A new approach for classifying tangible interaction is then presented: image schemas and their metaphorical extensions. Motivated by linguistic studies of meaning, this taxonomy is able to overcome some limitations of previous approaches. The taxonomy is illustrated with examples of using image schemas and their metaphorical extensions in potential TUI applications. A more complex example, the Tangible Memories Box, shows how our taxonomy and earlier approaches may complement each other.
Marquardt, Nicolai and Greenberg, Saul (2007): Distributed physical interfaces with shared phidgets. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 13-20. Available online
Tangible interfaces are best viewed as an interacting collection of remotely-located distributed hardware and software components. The problem is that current physical user interface toolkits do not normally offer distributed systems capabilities, leaving developers with extra burdens such as device discovery and management, low-level hardware access, and networking. Our solution is Shared Phidgets, a toolkit for rapidly prototyping distributed physical interfaces. It offers programmers three ways to access and control remotely-located hardware, and the ability to create abstract devices by transforming, aggregating and even simulating device capabilities. Network communication and low-level access to device hardware are handled transparently, regardless of device location.
Chang, Angela, Gouldstone, James, Zigelbaum, Jamie and Ishii, Hiroshi (2007): Simplicity in interaction design. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 135-138. Available online
Attaining simplicity is a key challenge in interaction design. Our approach relies on a minimalist design exercise to explore the communication capacity for interaction components. This approach results in expressive design solutions, useful perspectives of interaction design and new interaction techniques.
Jordà, Sergi, Geiger, Günter, Alonso, Marcos and Kaltenbrunner, Martin (2007): The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 139-146. Available online
In recent years we have seen a proliferation of musical tables. Believing that this is not just the result of a tabletop trend, in this paper we first discuss several reasons why live music performance and HCI in general, and musical instruments and tabletop interfaces in particular, can lead to a fertile two-way cross-pollination that can equally benefit both fields. After that, we present the reacTable, a musical instrument based on a tabletop interface that exemplifies several of these potential achievements.
Smith, Andrew C. (2007): Using magnets in physical blocks that behave as programming objects. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 147-150. Available online
In this paper we describe the implementation of GameBlocks, a novel digital manipulative system for coding simple programme sequences to control a toy robot. A contact-less, magnetic field-based mechanism for transferring information about the blocks is described. The mechanical and electronic system components are described. We position this implementation in relation to prior related work. Problems encountered are given, with suggestions for future work.
Bakker, Saskia, Vorstenbosch, Debby, Hoven, Elise van den, Hollemans, Gerard and Bergman, Tom (2007): Weathergods: tangible interaction in a digital tabletop game. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 151-152. Available online
In this paper we describe the game 'Weathergods', which is implemented on the Entertaible tabletop gaming platform. The game uses either iconic or symbolic tangible objects for interaction and combines the advantages of traditional board games with those of computer games.
Hengeveld, Bart, Voort, Riny, Balkom, Hans van, Hummels, Caroline and Moor, Jan de (2007): Designing for diversity: developing complex adaptive tangible products. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 155-158. Available online
Interactive products can help very young multi-handicapped children (1-4 years) develop their language and communication skills, under the condition that they are optimally tuned to the individual child. This has great consequences for design, since such interactive products need to adapt to the child's development, possibilities, interests and needs. There are currently hardly any guidelines for designing adaptive interactive tangible products for such a heterogeneous user group. Through LinguaBytes, a three-year research project aimed at developing an adaptive interactive toy for stimulating the language and communication skills of multi-handicapped toddlers, we want to establish a theoretical framework, including guidelines and tools, for designing complex adaptive interactive products.
Horn, Michael S. and Jacob, Robert J. K. (2007): Designing tangible programming languages for classroom use. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 159-162. Available online
This paper describes a new technique for implementing educational programming languages using tangible interface technology. It emphasizes the use of inexpensive and durable parts with no embedded electronics or power supplies. Students create programs in offline settings -- on their desks or on the floor -- and use a portable scanning station to compile their code. We argue that languages created with this approach offer an appealing and practical alternative to text-based and visual languages for classroom use. In this paper we discuss the motivations for our project and describe the design and implementation of two tangible programming languages. We also describe an initial case study with children and outline future research goals.
Marshall, Paul (2007): Do tangible interfaces enhance learning?. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 163-170. Available online
Conceptual work on tangible interfaces has focused primarily on the production of descriptive frameworks. While this work has been successful in mapping out a space of technical possibilities and providing a terminology to ground discussion, it provides little guidance on the cognitive or social effects of using one type of interface or another. In this paper we look at the area of learning with tangible interfaces, suggesting that more empirically grounded research is needed to guide development. We provide an analytic framework of six perspectives, which describes latent trends and assumptions that might be used to motivate and guide this work, and makes links with existing research in cognitive science and education.
Droumeva, Milena, Antle, Alissa and Wakkary, Ron (2007): Exploring ambient sound techniques in the design of responsive environments for children. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 171-178. Available online
This paper describes the theoretical framework, design, implementation and results from an exploratory informant workshop that examines an alternative approach to sound feedback in the design of responsive environments for children. This workshop offers preliminary directions and models for using intensity-based ambient sound display in the design of interactive learning environments for children that offer assistance in task-oriented activities. We see the value of this research in developing a more cohesive and ecological model for the use of audio feedback in the design of embedded interactions for children. The approach presented here takes the design of multi-modal feedback beyond the purely experiential to one that supports learning and problem solving.
Dünser, Andreas and Hornecker, Eva (2007): Lessons from an AR book study. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 179-182. Available online
We have observed children reading an augmented book aimed at early literacy education. We explored how children aged six to seven experience and interact with these novel instructional media. We here focus on issues arising from the tangibility of interface elements, the integration of physical and digital elements, on-screen and paper elements, and of text and interactive sequences.
Girouard, Audrey, Solovey, Erin Treacy, Hirshfield, Leanne M., Ecott, Stacey, Shaer, Orit and Jacob, Robert J. K. (2007): Smart Blocks: a tangible mathematical manipulative. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 183-186. Available online
We created Smart Blocks, an augmented mathematical manipulative that allows users to explore the concepts of volume and surface area of 3-dimensional (3D) objects. This interface supports physical manipulation for exploring spatial relationships and it provides continuous feedback for reinforcing learning. By leveraging the benefits of physicality with the advantages of digital information, this tangible interface provides an engaging environment for learning about surface area and volume of 3D objects.
Verhaegh, Janneke, Fontijn, Willem and Hoonhout, Jettie (2007): TagTiles: optimal challenge in educational electronics. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 187-190. Available online
In this paper we describe TagTiles, a tangible electronic board game for educational purposes. It was designed to be suitable for investigating the balance between challenge and control by providing fine-grained and wide-ranging difficulty levels. TagTiles can address a range of skills, including fine motor, cognitive and social skills. Evaluation of the game showed that the children appreciated it and that most of them were offered a challenge appropriate to their skill level.
Khandelwal, Madhur and Mazalek, Ali (2007): Teaching table: a tangible mentor for pre-k math education. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 191-194. Available online
In this paper, we describe the design and implementation of Teaching Table -- an interactive tabletop audio-visual device aimed at enhancing the learning experience for pre-kindergarten children by involving them in physical activities. Using electromagnetic sensing technology, the table can track tagged objects placed on its surface, accurately identifying their type and location while providing a coincident visual display and audio feedback. Teaching activities that are aimed at developing early math skills have been created for the table in alignment with standard curriculum guidelines for pre-K schools. Additionally, we include software based assessment tools for mentors/teachers to easily track an individual child's progress during the process of interacting with the table.
Antle, Alissa N. (2007): The CTI framework: informing the design of tangible systems for children. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 195-202. Available online
New forms of tangible and spatial child computer interaction and supporting technologies can be designed to leverage the way children develop intelligence in the world. The author describes a preliminary design framework which conceptualizes how the unique features of tangible and spatial interactive systems can be utilized to support the cognitive development of children under the age of twelve. The framework is applied to the analytical evaluation of an existing tangible interface.
Poupyrev, Ivan, Nashida, Tatsushi and Okabe, Makoto (2007): Actuation and tangible user interfaces: the Vaucanson duck, robots, and shape displays. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 205-212. Available online
In the last decade, the vision of future interfaces has shifted from virtual reality to augmented and tangible user interfaces (UI) where virtual and physical (or "bits and atoms") co-exist in harmony. Recently, a growing number of designers and researchers have been taking the next logical step: creating interfaces where physical, tangible elements are not merely dynamically coupled to digital attributes and information, but are themselves dynamic, self-reconfigurable devices that can change their physical properties depending on the state of the interfaces, the user, or the environment. A combination of actuation, self-configuration, and tangibility can expand and enhance the design of tangible interfaces. In this paper, we present an overview of the use of actuation in user interfaces and discuss the rationale for building actuated interfaces. We then discuss actuated interfaces in detail based on our experience designing Lumen shape displays. Work on actuated interfaces is still in its infancy, and projects are few and far between, so we consider this paper an invitation to discussion and hope it can help stimulate further research in this area.
Motamedi, Nima (2007): Keep in touch: a tactile-vision intimate interface. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 21-22. Available online
We present an overview of Keep in Touch, a networked fabric touchscreen designed to support and maintain intimacy for couples in long-distance relationships. To achieve this, a novel sensorial interface was created by combining the visual and tactile senses. Each partner is presented with a blurred digital projection of their lover. When they touch their partner's body, the image comes into focus, revealing their features. We describe how this sensory mapping creates an expressive and emotional interface allowing couples to communicate through touch, gestures, and body language.
We report on approaches for context-awareness in a kitchen environment. Two devices, an augmented cutting board and a sensor-enriched knife, enable the environment to determine the type of food handled during the preparation of meals.
Biloria, Nimish (2007): Spatializing real time interactive environments. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 215-222. Available online
This research paper reports on a series of design-research experiments specifically aimed at developing real-time interactive spatial prototypes. The experimental projects are conceived as architectural design research undertakings, demonstrating a fusion between the material, the electronic and the digital domains. This fusion is attained through a synergistic merger of the fields of ambient sensing, control systems, ubiquitous computing, architectural design, pneumatic systems and computation (real-time game design techniques). The prototypes are visualized as complex adaptive systems, continually engaged in activities of data exchange and physical adaptation of their constituent components in response to contextual variations. A strategic co-evolution of technical knowledge between industry (specifically Festo, a pneumatic engineering company) and academic research (spatial and information design) gives shape to the interactive constructs, thus developing an information bridge between the two knowledge sectors.
Sitorus, Larisa, Cao, Shan Shan and Buur, Jacob (2007): Tangible user interfaces for configuration practices. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 223-230. Available online
In this paper, we present a project in which we explored interactional possibilities for designing a tangible configuration interface as an alternative to conventional input devices in the field of industrial refrigeration maintenance. Based on ethnographic field studies and design workshops, we built three prototypes of configuration interfaces. Each interface was built and used to explore issues that are dealt with by users in their everyday work practice, namely collaborative sense-making of complex system, manipulation of interdependent and fluid digital material, and anticipating future changes through situated learning.
This paper describes the move.me interaction prototype developed in conjunction with V2_lab in Rotterdam. move.me proposes a scenario for social interaction and the notion of social intimacy. Interaction with sensory-enhanced, soft, pliable, tactile, throwable cushions affords new approaches to pleasure, movement and play. A somatics approach to touch and kinaesthesia provides an underlying design framework. The technology developed for move.me uses the surface of the cushion as an intelligent tactile interface. Making use of a movement analysis system called Laban Effort-Shape, we have developed a model that provides a high-level interpretation of varying qualities of touch and motion trajectory. We describe the notion of social intimacy, and how we model it through techniques in somatics and performance practice. We describe the underlying concepts of move.me and its motivations. We illustrate the structural layers of interaction and related technical detail. Finally, we discuss the related body of work in the context of evaluating our approach and conclude with plans for future work.
Mugellini, Elena, Rubegni, Elisa, Gerardi, Sandro and Khaled, Omar Abou (2007): Using personal objects as tangible interfaces for memory recollection and sharing. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 231-238. Available online
Tangible User Interfaces (TUIs) are emerging as a new paradigm of interaction with the digital world, aiming to facilitate traditional GUI-based interaction. Interaction with TUIs relies on users' existing skills of interaction with the real world, thereby offering the promise of interfaces that are quicker to learn and easier to use. Recently it has been demonstrated that the use of personal objects as tangible interfaces can be even more straightforward, since users already have a mental model associated with the physical objects, which facilitates the comprehension and usage modalities of those objects. However, TUIs are currently very challenging to build, and this limits their widespread diffusion and exploitation. To address this issue we propose a user-oriented framework, called Memodules Framework, which allows the easy creation and management of personal TUIs, providing end users with the ability to dynamically configure and reconfigure their TUIs. The framework is based on a model, called MemoML (Memodules Markup Language), which guarantees framework flexibility, extensibility and evolution over time.
Jensen, Mads Vedel (2007): A physical approach to tangible interaction design. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 241-244. Available online
The field of tangible interaction is growing in rich and diverse directions, calling for new forms of understanding. In this paper I will present a view on tangible interaction that has a strong focus on movement and interaction qualities. I will describe a design exercise that transfers interaction qualities, identified from user observations made in particular contexts, to the design of new interaction modalities. The exercise was completed with 16 graduate students and resulted in a set of interactive sculptures that aim to convey particular interaction experiences. I will introduce the process through which the exercise was conducted and discuss the outcomes, especially the role of movement metaphors.
Frey, Martin (2007): CabBoots: shoes with integrated guidance system. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 245-246. Available online
Conventional navigational devices normally communicate with the user through acoustic and visual channels. For pedestrian navigation, these visual and audio output channels do not always work satisfactorily, for several reasons. This paper describes the concept for an alternative interface for pedestrian guidance applications, called CabBoots. The information transmission can be perceived tactilely, is intuitively understandable, and is applied to the part of the body most directly involved in the act of walking: the foot. The communications metaphor applied is familiar to all; it's something that everyone who has ever walked along a well-trodden path is aware of.
Schuricht, Susanne, Hohl, Michael and Struppek, Mirjam (2007): Freequent Traveller: interaction versus contemplation. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 247-250. Available online
Freequent Traveller (2001) is a live interactive installation by the artist Susanne Schuricht, developed in collaboration with Tobias Schmidt. The interface consists of a hammock whose movement is tracked by a custom-made hardware interface. While relaxing in the hammock, one's motion animates text across a projection sail. The dynamics of this animation are perceived as intricately synchronized with, and connected to, one's own bodily movement together with the hammock. The projected texts are short essays and excerpts about technology, mobility, home and identity. The installation is an instrument to generate awareness through rhythmic bodily experience. Interaction is considered as a process to create contemplation and a change in outlook, going beyond playful experience with an interface.
Moen, Jin (2007): From hand-held to body-worn: embodied experiences of the design and use of a wearable movement-based interaction concept. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 251-258. Available online
This paper argues that movement-based interaction should be designed from a non-technological, people-centered point of view in order to create embodied and engaging interaction experiences. Further, it discusses social and contextual aspects that need to be taken into account when designing for movement-based interaction. The paper presents the design process and user explorations of a wearable movement-based interaction concept that was created in order to explore full-body movement as interaction modality. The starting point was taken in people's own experiences of communication and interaction through bodily movements, inspired by methods and theories used within modern dance. As design guidelines for the prototyped interaction concept we used aspects on movement that were directly derived from field studies of physical expression. The user explorations of the concept show preliminary examples of how people engage in movement-based interaction and how they are affected by the social interaction context.
Pering, Trevor, Anokwa, Yaw and Want, Roy (2007): Gesture connect: facilitating tangible interaction with a flick of the wrist. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 259-262. Available online
The Gesture Connect system streamlines the process of connecting to and/or controlling objects from a user's personal mobile device. Typically, in order to connect two devices, users must follow a two-step process: first selecting which devices should be connected, and then specifying what the devices should do once they are connected. By combining contact-based connections with gesture-based selection, the Gesture Connect system merges these two steps into a single physical action, greatly simplifying the common case. To demonstrate and test the underlying concept, a hardware extension comprising Near Field Communication (NFC) and accelerometer capability has been developed for standard commercial mobile phones, along with the associated tagging and gesture-recognition software. This system greatly reduces overall interaction time for common-case interactions, enhancing the overall user experience.
Ronkainen, Sami, Häkkilä, Jonna, Kaleva, Saana, Colley, Ashley and Linjama, Jukka (2007): Tap input as an embedded interaction method for mobile devices. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 263-270. Available online
In this paper we describe a novel interaction method for interacting with mobile devices without the need to access a keypad or a display. A tap with a hand can be reliably detected e.g. through a pocket by means of an acceleration sensor. By carefully designing the user interface, the tap can be used to activate logically similar functionalities on the device, leading to a simple but useful interaction method. We present results of user tests aimed at studying the usability of various tap input based user interface applications.
Larssen, Astrid Twenebowa, Robertson, Toni and Edwards, Jenny (2007): The feel dimension of technology interaction: exploring tangibles through movement and touch. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 271-278. Available online
This paper presents concepts to extend our understanding of the bodily aspects of technology interactions. The aim of the paper is to offer a way of looking at the role our haptic and kinaesthetic senses play in experiencing tangibles. We approach this issue by framing it around how our bodies establish relationships with things. Four themes are introduced: body-thing dialogue, potential for action, actions in space (consisting of within-reach and out-of-reach), and movement expression. We discuss the role these themes can play in our thinking about, and exploration of, tangible and non-tangible technology interactions. The idea is that these themes can help us consider not just how a design or a technology might look, but also how it might feel to use.
Boess, Stella, Saakes, Daniel and Hummels, Caroline (2007): When is role playing really experiential? Case studies. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 279-282. Available online
This paper presents and evaluates examples from our work with role playing exercises in design, both in design education and in our own design work. Rationales for role playing in design are: communication within the design process, the increase of technological complexity, the experience and empathy of designers, the tangibility of interaction, and attentiveness to social change. These rationales led us to develop role playing techniques for design ideation. Here, we reflect on the practical problems of integrating role playing exercises in design teaching and in a design process, and evaluate what hinders or aids the ability to engage with interaction experientially and empathically. Careful consideration of the actor-audience relationship, the setting, sufficient preparation for acting, and props emerge as important elements.
Brewer, Johanna, Williams, Amanda and Dourish, Paul (2007): A handle on what's going on: combining tangible interfaces and ambient displays for collaborative groups. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 3-10. Available online
While tangible interfaces open up new possibilities for input and interaction, they are also interesting because of the ways in which they occupy the physical world just as we do. We have been working at the intersection of three research areas -- tangible interfaces, ambient displays, and collaboration awareness. Our system, Nimio, uses engaging physical objects as both input devices (capturing aspects of individual activity) and output devices (expressing aspects of group activity). We present our design and experiences, focusing in particular on the tension between legibility and ambiguity and its relevance in collaborative settings.
Eriksson, Eva, Hansen, Thomas Riisgaard and Lykke-Olesen, Andreas (2007): Reclaiming public space: designing for public interaction with private devices. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 31-38. Available online
Public spaces are changing from being ungoverned places for interaction to being more formalized, controlled, less interactive, and designed places aimed at fulfilling a purpose. Simultaneously, new personal mobile technology aims at providing private individual spaces in the public domain. In this paper we explore the implications of interacting in public space and how technology can be rethought to not only act as personal devices, but be the tool to reclaim the right and possibility to interact in public spaces. We introduce information exchange, social support and regulation as three central aspects for reclaiming public space. The PhotoSwapper application is presented as an example of a system designed to integrate pervasive technology in a public setting. The system is strongly inspired by the activities at a traditional market place. Based on the design of the application we discuss four design challenges when designing for public interaction.
Richter, Jan, Thomas, Bruce H., Sugimoto, Maki and Inami, Masahiko (2007): Remote active tangible interactions. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 39-42. Available online
This paper presents a new form of remote active tangible interactions built with the Display-based Measurement and Control System. A prototype system was constructed to demonstrate the concepts of coupled remote tangible objects on rear-projected tabletop displays. A user evaluation measuring social presence for two users performing a furniture placement task was conducted, to determine the differences between this new system and a traditional mouse interface.
Zigelbaum, Jamie, Horn, Michael S., Shaer, Orit and Jacob, Robert J. K. (2007): The tangible video editor: collaborative video editing with active tokens. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 43-46. Available online
In this paper we introduce the Tangible Video Editor (TVE), a multi-user, tangible interface for sequencing digital video. We present a new approach to tabletop interaction by using multiple handheld computers embedded in plastic tokens. Drawing from the rich physical experience of traditional film editing techniques, we designed the TVE to engage multiple users in a collaborative process and encourage the exploration of narrative ideas. We used active tokens to provide a malleable interface, enabling users to organize the interface components in unspecified ways. Our implementation improves upon common projection-based tabletop interfaces in a number of ways, including a design for use beyond dedicated two-dimensional spaces and a naturally scaling screen resolution.
Villar, Nicolas and Gellersen, Hans-Werner (2007): A malleable control structure for softwired user interfaces. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 49-56. Available online
Rather than existing as a computer input device with a rigid shape, a predetermined selection of controls and a fixed layout, a malleable control structure is made up of a set of controls that can be freely arranged on control areas. The structure is physically adaptable by users during operation: control areas and controls can be introduced, organized and removed to suit interaction requirements and personal preference. We present an implementation of a malleable control structure called VoodooIO. Our design contributes a novel material -- the network substrate -- that can be used to transform everyday surfaces into control areas, and the concept of implementing basic control units (such as buttons, sliders or dials) as ad hoc network nodes. VoodooIO does not constitute an application interface in itself. Like any input device, it only becomes concrete as an interface component in the context of a particular application. We introduce the concept of softwiring as a collection of techniques and practices that allow users to benefit from malleable control interfaces in a number of concrete scenarios of use.
Signer, Beat and Norrie, Moira C. (2007): PaperPoint: A Paper-Based Presentation and Interactive Paper Prototyping Tool. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 57-64. Available online
Recent developments in digital pen and paper solutions enable, not only the digital capture of handwriting, but also paper to be used as an interactive medium that links to digital information and services. We present a tool that builds on technologies for interactive paper to enable PowerPoint presentations to be controlled from printed slide handouts. Furthermore, slides can be easily annotated during presentations by simply drawing on the printed version of the slide. As well as discussing the advantages of such a paper-based interface and initial findings on its use, we describe how we were also able to exploit it to provide a general prototyping tool for interactive paper applications.
Parkes, Amanda and Ängeslevä, Jussi (2007): Physical interventions in a location based cultural narrative: a case study of embedded media in public space installations. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 65-68. Available online
The majority of large scale media embedded into public spaces clashes with the existing architectural and cultural characteristics of an environment. This paper discusses the challenge of overcoming this phenomenon through the design of new embedded artifacts which tie interactive media elements into existing physical properties of an environment, whether material, spatial, cultural, or social, to mesh fluidly with and complement a shared public space. Design challenges and guidelines are explored through the presentation of two experimental installations developed by the authors -- the Algorithmic Topiary and the Thermochromic Laundry Line. The site-specific installations were created as physical interventions as part of the History Unwired Project, a mobile device based walking tour in the Castello neighborhood of Venice, Italy, and were intended to bring attention and dynamism to the physical and ephemeral elements of the environments which the tour describes.
Kaltenbrunner, Martin and Bencina, Ross (2007): reacTIVision: a computer-vision framework for table-based tangible interaction. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 69-74. Available online
This article provides an introductory overview to first-time users of the reacTIVision framework -- an open-source cross-platform computer-vision framework primarily designed for the construction of table-based tangible user interfaces. The central component of the framework is a standalone application for fast and robust tracking of fiducial markers in a real-time video stream. The framework also defines a transport protocol for efficient and reliable transmission of object states via a local or wide area network. In addition, the distribution includes a collection of client example projects for various programming environments that allow the rapid development of unique tangible user interfaces. This article also provides a discussion of key points relevant to the construction of the necessary table hardware and surveys some projects that have been based on this technology.
Merrill, David, Kalanithi, Jeevan and Maes, Pattie (2007): Siftables: towards sensor network user interfaces. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 75-78. Available online
This paper outlines Siftables, a novel platform that applies technology and methodology from wireless sensor networks to tangible user interfaces in order to yield new possibilities for human-computer interaction. Siftables are compact devices with sensing, graphical display, and wireless communication. They can be physically manipulated as a group to interact with digital information and media. We discuss the unique affordances that a sensor network user interface (SNUI) such as Siftables provides, as well as the resulting directness between the physical interface and the data being manipulated. We conclude with a description of some gestural language primitives that we are currently prototyping with Siftables.
Elliot, Kathryn, Neustaedter, Carman and Greenberg, Saul (2007): StickySpots: using location to embed technology in the social practices of the home. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 79-86. Available online
Ethnographic studies of domestic environments have shown the fundamental roles that locations and context play in helping people understand and manage information in their homes. Yet it is not clear how this knowledge can be applied to the design of home technologies. For this reason, we present a case study in home technology design that uses the understandings gained from previous ethnographic studies on domestic locations to motivate the design of a home messaging system. Our prototype, called StickySpots, uses locations to embed technology in the social practices of the home. We then use this case study to reflect more generally on location-based design in the home.
Döring, Tanja and Beckhaus, Steffi (2007): The card box at hand: exploring the potentials of a paper-based tangible interface for education and research in art history. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 87-90. Available online
This paper presents art historical research and education as a novel application area for tangible user interfaces. The academic discipline of art history and its subjects are currently undergoing changes that will lead to a rising importance of computers. However, the computer is generally not the art historian's tool of choice. We feel that this is due to existing GUI systems not fully meeting researchers' needs. We therefore propose a design for a tabletop tangible user interface that considers art historians' desire "to collect things as tokens" and to remain within traditional techniques. We present a case study of the usage of image cards within iconographic work. Based on our results, we derive implications for the design of a tangible interface that integrates established traditional paper-based techniques with the advantages of digital representation.
Levisohn, Aaron, Cochrane, Jayme, Gromala, Diane and Seo, Jinsil (2007): The Meatbook: tangible and visceral interaction. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 91-92. Available online
The Meatbook, an interactive art installation, explores the use of a novel tangible interface to provoke a visceral response in the viewer. The Meatbook presents the symbiosis of the mechanical and the organic as it simultaneously juxtaposes the conflicting materiality of these media. Sensors, motors and other mechanics are used to animate the meat, generating movements specifically designed to produce visceral, even cathartic responses from the user. By simultaneously generating revulsion and fascination, the Meatbook gives the user an embodied experience in which the alien and the familiar come together in the form of a book.
Lee, Hyun-Jean, Khandelwal, Madhur and Mazalek, Ali (2007): Tilting table: a movable screen. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 93-96. Available online
In screen-based experiences, the screen itself can become the physical device used for interaction. The "move-ability" of the screen affords interactivity between the screen artifact and the viewer, and between the virtual and physical spaces. We have created a movable screen interface, called the tilting table, which provides a display surface via overhead projection. This interface invites user interaction through the action of tilting the tabletop. The degree of tilt affects the displayed visuals and audio output. This simple interaction makes users feel they have a closer connection to the virtual imagery, and the screen thus blurs the boundary between them. In this paper, we introduce four applications that have been implemented for the tilting table. Two use an artistic approach to create expressive and entertaining media interactions, while two have been developed for navigating large information spaces. Each application is described and user feedback is discussed.
Spiessl, Wolfgang, Villar, Nicolas, Gellersen, Hans-Werner and Schmidt, Albrecht (2007): VoodooFlash: authoring across physical and digital form. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 97-100. Available online
Design tools that integrate hardware and software components facilitate product design work across aspects of physical form and user interaction, but at the cost of requiring designers to work with tools other than their accustomed programming environments. In this paper we introduce VoodooFlash, a tool designed to build on the widespread use of Flash while facilitating design work across physical and digital components. VoodooFlash extends the existing practice of authoring interactive applications in terms of arranging components on a virtual stage, and provides a physical stage on which controls can be arranged, linked to software components, and appropriated with other physical design materials.