Tangible Interaction has come to be the 'umbrella term' for a set of related research and design approaches that have emerged across several disciplines. It surfaced as a research topic in the late 1990s and rapidly grew into an established research area. A concern shared across these approaches is the embedding of the interface and the users' interaction in real spaces and contexts.
Tangible Interaction is a highly interdisciplinary area. It spans a variety of perspectives, such as HCI and Interaction Design, but focuses on interfaces or systems that are in some way physically embodied, be it in physical artefacts or in environments. It also has connections with product/industrial design, the arts, and architecture. Finally, new developments in ubiquitous computing, actuation, sensors, robotics, and mechanics contribute enabling technologies to the field of Tangible Interaction.
45.1 A history of Tangible Interaction: influences, perspectives, and influential prototype systems
Tangible Interaction has been influenced by work from different disciplines, in particular Computing, HCI, and Product/Industrial Design. For Computing and HCI, the notion of a 'Tangible User Interface' (as it was originally conceived in the mid/late 90s) constituted an alternative vision for computer interfaces that brings computing back 'into the real world' (Wellner, Mackay, Gold 1993; Ishii, Ullmer 1997). A general dissatisfaction with traditional screen-based interfaces and with Virtual Reality, both seen as estranging people from 'the real world', motivated the development of the first prototypes, while technological innovations (e.g. RFID) made it possible to build them. In contrast, the field of Industrial Design came to engage with Tangible Interaction out of necessity, as appliances increasingly contained electronic and digital components and became 'intelligent'. For designers, this posed new challenges as well as new opportunities (Djajadiningrat, Overbeeke, Wensveen 2000; Djajadiningrat et al 2004).
An interesting point is that the challenges and established skills of these disciplines are complementary: where considerations of physical form factors, choice of materials, and so on forced computer scientists and HCI researchers out of their comfort zone, industrial designers now had to design complex behaviour that is digitally controlled and has no inherent relationship to product form.
These practice and research fields long lacked a common discussion forum and intersected only occasionally, often through personal contacts; particular product ideas and sketches, for example, inspired the notion of a Tangible User Interface. The Marble Answering Machine, devised by Durrell Bishop while studying design at the Royal College of Art, is one such sketch: it used marbles to represent incoming messages. The marbles fall out of the machine and can be played back by placing them into a mould on the machine (Poynor 1995). Generalizing this design yielded the idea of representing data through physical objects and of manipulating the data by physically handling the objects – Ishii's Tangible Bits vision (Ishii, Ullmer 1997).
In the early years of the new century, researchers with a design background participated more frequently in HCI-related conferences, starting a dialogue. From about the same time, the number of workshops addressing Tangible User Interfaces or Tangible Interaction (a term proposed by parts of the design community) increased steadily. From this grew an interdisciplinary research community that adopted the term 'Tangible Interaction' to describe its shared focus and, since 2007, has had its own conference.
With emerging technologies coming quickly onto the market, the field has become more diverse (e.g. some systems involve actuation, some rely on complex sensor-based data collection, some are based on conductive fabrics) and also more inclusive, as it has become easier and cheaper to build working prototypes and functioning systems. Whereas in the late 90s specialized hardware and expertise were required to build a prototype with comparatively simple functionality, by 2009 this had become a standard project assignment in many industrial or interaction design courses.
The following gives an overview of the major influencing perspectives. As much of the conceptual and visionary development went hand in hand with the building of prototype systems, this is very much in the style of ‘a history through examples’.
45.2 HCI and Computing: Tangible User Interfaces
Within Computing and HCI, Tangible Interaction first became prominent with the notion of 'Tangible User Interfaces' (TUIs) proposed by Hiroshi Ishii and his group at the MIT Media Lab in 1997 (Ishii, Ullmer 1997). This built on prior work by George Fitzmaurice in collaboration with Bill Buxton and Ishii himself (Fitzmaurice, Ishii, Buxton 1995). Fitzmaurice's PhD thesis (1996) explored the use of graspable bricks as a more direct input mechanism for interacting with graphical representations. It further suggested employing multiple graspable objects distributed in space, each with strong-specific functionality, instead of the generic input device we know as the mouse, which distributes input over time. The bricks were laid on top of graphics (displayed on a horizontal screen), which then became anchored to them. Moving a brick thus moved the graphics, and moving two corners of a triangle apart with two bricks stretched the triangle correspondingly.
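The two-brick stretching idea can be viewed as a simple coordinate transform: the two bricks act as handles, and the translation, rotation, and scaling between their old and new positions is applied to every vertex of the anchored graphic. A minimal sketch of that geometry (names are hypothetical, not from Fitzmaurice's system; complex numbers stand in for 2D points):

```python
def brick_transform(old_a, old_b, new_a, new_b):
    """Return a mapping from old to new vertex positions, derived from the
    old and new positions of two bricks (2D points as complex numbers).
    The ratio of the brick-to-brick vectors encodes rotation and scale."""
    factor = (new_b - new_a) / (old_b - old_a)
    return lambda p: (p - old_a) * factor + new_a

# Two bricks sit on two corners of a triangle; pulling one corner
# from (1, 0) out to (2, 0) stretches the whole shape around the other.
move = brick_transform(0 + 0j, 1 + 0j, 0 + 0j, 2 + 0j)
print(move(0.5 + 0.5j))  # the third corner follows: (1+1j)
```

Because a single ratio of complex numbers captures both rotation and uniform scale, two handles suffice to pin down the whole similarity transform, which is exactly why two bricks give richer control than one.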
Tangible User Interfaces were envisioned as an alternative to graphical displays that would bring some of the richness of interaction we have with physical devices back into our interaction with digital content (Ishii, Ullmer 1997). It was proposed to represent digital content through tangible objects, which could then be manipulated via physical interaction with these tangibles. The core idea was to quite literally allow users to grasp data with their hands and to unify representation and control. Digital representations were thought to be closely coupled, usually through graphical projections on and around the tangible objects, which came to be referred to as 'tokens'.
One of the first examples developed by MIT's Tangible Media Group was a map that was manipulated by placing iconic representations of central buildings on it and moving these apart. Later on, the research group developed Urp, a system that supports urban planning (Underkoffler, Ishii 1999). Urp integrates a physical model with an interactive simulation of the effects of building placement on sunlight and wind flow. The tangible models of buildings cast (digital) shadows that are projected onto the surface, and simulated wind flow is projected as lines onto the surface. Several tools are available to probe, for example, the wind speed or the distance between points in space, and to change the properties of buildings (glass or stone walls) or the time of day, which makes the shadows move. Over the years a series of related systems has been built, and the notion of TUIs was taken up by many other research groups worldwide.
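In its simplest form, the kind of shadow simulation such a system projects reduces to elementary geometry: a building's shadow length follows from its height and the sun's elevation at the chosen time of day. A hedged sketch of that minimal model (not Urp's actual implementation):

```python
import math

def shadow_length(building_height, sun_altitude_deg):
    """Length of the shadow a building of the given height casts on flat
    ground, given the sun's elevation angle in degrees (simplest model:
    shadow = height / tan(altitude))."""
    return building_height / math.tan(math.radians(sun_altitude_deg))

# As the simulated time of day moves from noon towards evening,
# the sun sinks and the projected shadows grow longer.
print(shadow_length(10.0, 45.0))  # 10 m building, sun at 45 degrees: 10 m
print(shadow_length(10.0, 30.0))  # lower sun: roughly 17.3 m
```

A real system would additionally project the shadow along the sun's azimuth and clip it against neighbouring buildings, but the height/elevation relationship above is what makes the shadows visibly lengthen as the user turns the clock tool.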
45.3 Influences from other disciplines: Product/Industrial Design and the Arts
Within other disciplines, a similar merging of physical form with digital content and behaviour took place. Product design increasingly concerns complex computational behaviour, and designers need to rethink how to make IT-related appliances legible and usable. Some design researchers have come to investigate how form and digital behaviour can be more closely coupled and how users could interact in richer ways with digital products (Djajadiningrat et al 2004; Jensen, Buur, Djajadiningrat 2005). The Marble Answering Machine is an early example of this endeavour. The term 'Tangible Interaction' originated in this context.
Djajadiningrat et al (2004) describe a concept sketch for a video deck that integrates the physical controls within the mechanism of the device, making the controls physically legible. For example, the contours of the device are broken where there is interaction with the outside world: the eject button has turned into a ribbon that lies under the tape and is pulled outward. They further describe the concept design of a digital camera that attempts to replace all of the typical menu functions and identical-looking buttons with physical manipulations of the camera. Here the user, for example, slides the screen towards the memory card to save an image and slides the screen towards the lens to return to ready mode.
A further merging of digital and physical design can be seen in the emergence of 'Physical Computing' within design worldwide, through a culture of tinkering and making things (cf. Igoe and O'Sullivan 2004). Physical Computing involves fast prototyping with electronics and often reuses and scavenges existing technology (tinkering). It has been defined as the design of interactive objects that are controlled by software and that people interact with via sensors and actuators.
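At its core, the sensing-to-actuation pipeline of physical computing is a read-map-write loop, and toolkits typically provide a linear range-mapping helper for the middle step. A minimal sketch of that idiom (function name and ranges are illustrative, not taken from any particular toolkit):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a raw sensor reading from its input range onto an
    actuator's output range (the classic 'map' idiom of such toolkits)."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# e.g. a 10-bit light sensor (0..1023) dimming an 8-bit LED (0..255);
# in a running system this sits in a loop of read -> scale -> write.
led_level = scale(512, 0, 1023, 0, 255)
print(round(led_level))  # mid-range light gives a mid-range LED level: 128
```

The appeal for designers is that the interesting decisions move out of the electronics and into this mapping: which sensor range drives which behaviour, and how.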
Within the interactive arts a related development can be seen. Many installations employ 'interactive spaces' which are sensorized to track users' behavior and integrate tangible objects into the installation (see e.g. Bongers 2002). Often, whole-body movement is used to interact within these environments. Interaction designers have also developed an interest in bodily interaction, which can be pure movement (gestures, dance) or is related to physical objects (Hummels, Overbeeke, Klooster 2007).
In a sense, whole-body interaction and interactive spaces rethink Tangible Interaction at another scale: instead of interacting with small objects that we can grab and move around within arm's reach (the focus of Tangible User Interfaces and product design), we interact with large objects within a large space and therefore need to move around with our whole body.
45.4 ‘Tangible Interaction’ brought different perspectives under one umbrella
The term 'Tangible Interaction' has come to embrace all these developments. As argued by Hornecker and Buur (2006), the field prioritizes as principles of design:
- tangibility and materiality
- physical embodiment of data
- embeddedness in real spaces and contexts.
Hornecker and Buur argue that the original definition of Tangible User Interfaces excludes many interesting developments and systems from product design and the arts and therefore suggest using a more inclusive, less strictly defined term. The shift in phrasing from Tangible Interface to Tangible Interaction was intentional, similar to the distinction between Interface and Interaction Design. It places the focus on the design of the interaction instead of the visible interface. This puts the qualities of the interaction into the foreground of attention, and requires system designers to think about what people actually do with the system (see also: Djajadiningrat, Overbeeke, Wensveen 2000; Jensen, Buur, Djajadiningrat 2005). It further encourages thinking of the tangible system as part of a larger ecology and as located in a specific context. This has been described as the 'practice turn' by Fernaeus et al (2008), with newer conceptualizations of Tangible Interaction focusing on human action, control, creativity and social action instead of the representation and transmission of information.
The adoption of ‘Tangible Interaction’ as an umbrella term has supported the development of a larger interdisciplinary research community (the TEI conference series), but as a downside it creates some tension and ambivalence as to where to draw the line between Tangible Interaction and other areas (for a report on the TEI 2007 and TEI 2008 panel discussions, see Hornecker et al 2008). For example, it remains open whether a car is a tangible interface and whether gesture-based interaction can be considered tangible interaction; different people in the research community would answer these questions in different ways.
Tangible Interaction therefore overlaps at its fringes with a range of other research areas, summarized in this encyclopedia entry under ‘Related Topics’. Whether a particular paper is framed as ‘tangible’ or, say, as gesture-based interaction often depends on the conference or journal it is submitted to. The research community seems well aware of this ambivalence but has decided to embrace it: in 2010 the TEI conference changed its name from ‘Tangible and Embedded’ to ‘Tangible, Embedded, and Embodied Interaction’ in order to more explicitly invite research on whole-body or gestural interaction.
45.5 Research directions
Tangible Interaction is a growing research area. Its commercial relevance is still somewhat unclear (if we disregard standard product design for a moment). Yet companies like Philips Design and Microsoft Research increasingly invest in research in this area, and TEI 2009 was hosted by Microsoft Research in Cambridge, UK.
Furthermore, there is an increasing number of spin-off companies that market systems in this area. The system currently best known to the public and the media is probably the ReacTable (http://mtg.upf.es/reactable/ and http://www.reactable.com/, see Jordà et al 2007) from the Universitat Pompeu Fabra. This is a table-based music performance instrument combining tangible input (moving tagged objects on a flat surface) with multitouch interaction on the surface, so that users can manipulate the graphics projected around the tangible input objects with their fingers. It was used by Björk during her 2007 world tour, won a Golden Nica at the Prix Ars Electronica in 2008, and is now being marketed for museums and – soon – for musicians and DJs.
Application areas for Tangible Interaction are diverse. Many projects aim at supporting learning and education; this is where, so far, the most systems have been deployed outside of the lab. Also common are domestic appliances, interactive music installations or instruments, museum installations, and tools to support planning and decision making.
Research still needs to tease apart what exactly are the advantages of tangible interaction systems and for which contexts and application areas they are the most suitable. While there is good evidence that tangibles tend to support collaboration and social interaction (Hornecker, Buur 2006), it is, for example, less clear what kinds of tangibles are most effective in supporting learning (see Marshall 2007). Related to this question, design knowledge and guidelines are still scarce.
The availability of toolkits for physical computing has made it significantly easier to develop systems, contributing to the interdisciplinarity of the field.
An exciting new direction for evolving work lies in the use of actuation. While with Tangible User Interfaces initially only input was tangible, actuation allows for tangible system output beyond visual and auditory feedback.
45.6 Relevant conference series
TEI (Tangible, Embedded, and Embodied Interaction) is the first conference series dedicated to Tangible Interaction. It first took place in 2007 in Baton Rouge, Louisiana. TEI is an annual conference whose proceedings are published in the ACM Digital Library; as of 2010 it is organized in collaboration with ACM SIGCHI.
45.9 Suggestions for further reading
Igoe, Tom and O'Sullivan, Dan (2004): Physical Computing: Sensing and Controlling the Physical World with Computers. Course Technology
Ishii, Hiroshi (2007): Tangible User Interfaces. In: Sears, Andrew and Jacko, Julie A. (eds.): "The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications (2nd Edition)". Lawrence Erlbaum Associates, pp. 469-487
Ullmer, Brygg and Ishii, Hiroshi (2001): Emerging Frameworks for Tangible User Interfaces. In: Carroll, John M. (ed.): "Human-Computer Interaction in the New Millennium". Addison-Wesley Publishing, pp. 579-601