Publication statistics

Publication period: 2005–2012
Publication count: 19
Number of co-authors: 40


Number of publications with 3 favourite co-authors:

Alexander Wiethoff:
Anna Magdalena Blöckner:
Johannes Schöning:



Productive colleagues

Sebastian Boring's 3 most productive colleagues in number of publications:

Saul Greenberg: 140
Gregory D. Abowd: 116
Albrecht Schmidt: 107



Sebastian Boring


Publications by Sebastian Boring (bibliography)


Ledo, David, Nacenta, Miguel A., Marquardt, Nicolai, Boring, Sebastian and Greenberg, Saul (2012): The HapticTouch toolkit: enabling exploration of haptic interactions. In: Proceedings of the 6th International Conference on Tangible and Embedded Interaction 2012. pp. 115-122.

In the real world, touch-based interaction relies on haptic feedback (e.g., grasping objects, feeling textures). Unfortunately, such feedback is absent in current tabletop systems. The previously developed Haptic Tabletop Puck (HTP) aims at supporting experimentation with and development of inexpensive tabletop haptic interfaces in a do-it-yourself fashion. The problem is that programming the HTP (and haptics in general) is difficult. To address this problem, we contribute the HapticTouch toolkit, which enables developers to rapidly prototype haptic tabletop applications. Our toolkit is structured in three layers that enable programmers to: (1) directly control the device, (2) create customized combinable haptic behaviors (e.g., softness, oscillation), and (3) use visuals (e.g., shapes, images, buttons) to quickly make use of these behaviors. In our preliminary exploration we found that programmers could use our toolkit to create haptic tabletop applications in a short amount of time.

© All rights reserved Ledo et al. and/or ACM Press
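The abstract's three-layer structure (direct device control, combinable behaviors, visual widgets) can be illustrated with a minimal sketch. All class and method names below are invented for illustration; they are not the actual HapticTouch API.

```python
# Hypothetical sketch of a three-layer haptic toolkit, in the spirit of
# the abstract above. Names and the softness mapping are assumptions.

class RawDevice:
    """Layer 1: direct control of the haptic puck's rod height."""
    def __init__(self):
        self.rod_height = 0.0  # 0.0 (lowered) .. 1.0 (fully raised)

    def set_rod_height(self, height: float) -> None:
        self.rod_height = max(0.0, min(1.0, height))


class SoftnessBehavior:
    """Layer 2: a combinable haptic behavior mapping finger pressure to rod height."""
    def __init__(self, stiffness: float):
        self.stiffness = stiffness  # 0.0 = very soft, 1.0 = rigid

    def apply(self, device: RawDevice, pressure: float) -> None:
        # A softer surface yields more under the same pressure.
        device.set_rod_height(1.0 - pressure * (1.0 - self.stiffness))


class HapticButton:
    """Layer 3: a visual widget that triggers a behavior when touched."""
    def __init__(self, bounds, behavior):
        self.bounds = bounds      # (x, y, w, h) region on the tabletop
        self.behavior = behavior

    def on_touch(self, device, x, y, pressure):
        bx, by, bw, bh = self.bounds
        if bx <= x < bx + bw and by <= y < by + bh:
            self.behavior.apply(device, pressure)


device = RawDevice()
button = HapticButton((0, 0, 100, 100), SoftnessBehavior(stiffness=0.2))
button.on_touch(device, 50, 50, pressure=0.5)
print(round(device.rod_height, 2))
```

The point of the layering is that an application author only ever touches layer 3, while behavior authors compose layer 2 on top of the raw device in layer 1.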


Hausen, Doris, Boring, Sebastian, Lueling, Clara, Rodestock, Simone and Butz, Andreas (2012): StaTube: facilitating state management in instant messaging systems. In: Proceedings of the 6th International Conference on Tangible and Embedded Interaction 2012. pp. 283-290.

Instant messaging systems, such as Skype, offer text, audio and video channels for one-on-one and group conversations, both for personal and professional communication. They are commonly used at a distance, i.e., across countries and continents. To avoid disrupting other tasks, they display personal states to signal others when to contact someone and when not. This mechanism, however, heavily relies on users setting their own state correctly. In an online survey with 46 participants we found that neglecting state updates leads to unwanted messages, either because the state is incorrect or others disrespect it because they assume it to be wrong anyway. We address this situation with the StaTube, a tangible object offering (1) peripheral interaction for setting one's own state and (2) peripheral awareness of selected others' state. In an in-situ evaluation we found first indicators that (1) peripheral interaction fosters more frequent state updates and more accurate state information, and (2) that our participants felt more aware of their contacts' states due to the physical ambient representation.

© All rights reserved Hausen et al. and/or ACM Press


Chen, Xiang 'Anthony', Boring, Sebastian, Carpendale, Sheelagh and Greenberg, Saul (2012): Spalendar: visualizing a group's calendar events over a geographic space on a public display. In: Proceedings of the 2012 International Conference on Advanced Visual Interfaces 2012. pp. 689-696.

Portable paper calendars (i.e., day planners and organizers) have greatly influenced the design of group electronic calendars. Both use time units (hours/days/weeks/etc.) to organize visuals, with useful information (e.g., event types, locations, attendees) usually presented as -- perhaps abbreviated or even hidden -- text fields within those time units. The problem is that, for a group, this visual sorting of individual events into time buckets conveys only limited information about the social network of people. For example, people's whereabouts cannot be read 'at a glance' but require examining the text. Our goal is to explore an alternate visualization that can reflect and illustrate group members' calendar events. Our main idea is to display the group's calendar events as spatiotemporal activities occurring over a geographic space animated over time, all presented on a highly interactive public display. In particular, our Spalendar (Spatial Calendar) design animates people's past, present and forthcoming movements between event locations as well as their static locations. Detail of people's events, their movements and their locations is progressively revealed and controlled by the viewer's proximity to the display, their identity, and their gestural interactions with it, all of which are tracked by the public display.

© All rights reserved Chen et al. and/or ACM Press


Boring, Sebastian, Ledo, David, Chen, Xiang 'Anthony', Marquardt, Nicolai and Greenberg, Saul (2012): The fat thumb: using the thumb's contact size for single-handed mobile interaction. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 39-48.

Modern mobile devices allow a rich set of multi-finger interactions that combine modes into a single fluid act, for example, one finger for panning blending into a two-finger pinch gesture for zooming. Such gestures require the use of both hands: one holding the device while the other is interacting. While on the go, however, only one hand may be available to both hold the device and interact with it. This mostly limits interaction to a single-touch (i.e., the thumb), forcing users to switch between input modes explicitly. In this paper, we contribute the Fat Thumb interaction technique, which uses the thumb's contact size as a form of simulated pressure. This adds a degree of freedom, which can be used, for example, to integrate panning and zooming into a single interaction. Contact size determines the mode (i.e., panning with a small size, zooming with a large one), while thumb movement performs the selected mode. We discuss nuances of the Fat Thumb based on the thumb's limited operational range and motor skills when that hand holds the device. We compared Fat Thumb to three alternative techniques, where people had to precisely pan and zoom to a predefined region on a map and found that the Fat Thumb technique compared well to existing techniques.

© All rights reserved Boring et al. and/or ACM Press
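The core Fat Thumb idea from the abstract above — contact size selects the mode, thumb movement performs it — can be sketched in a few lines. The threshold value and the zoom mapping are assumptions for illustration, not values from the paper.

```python
# Illustrative sketch of contact-size mode switching: a small thumb
# contact pans, a large (flattened) contact zooms. Threshold is hypothetical.

PAN, ZOOM = "pan", "zoom"
SIZE_THRESHOLD = 12.0  # thumb contact radius in mm (assumed value)

def interpret_touch(contact_size_mm, dx, dy, view):
    """Update a map view from one thumb sample."""
    if contact_size_mm < SIZE_THRESHOLD:
        # Small contact (thumb tip): pan by the thumb's movement.
        view["x"] += dx
        view["y"] += dy
        return PAN
    # Large, flattened contact: vertical movement controls zoom.
    view["zoom"] *= 1.0 + dy * 0.01
    return ZOOM

view = {"x": 0.0, "y": 0.0, "zoom": 1.0}
mode1 = interpret_touch(8.0, dx=5.0, dy=-3.0, view=view)   # light touch
mode2 = interpret_touch(18.0, dx=0.0, dy=10.0, view=view)  # flat thumb
```

Because the mode is read continuously from contact size, panning and zooming blend into a single gesture with no explicit mode switch, which is the degree of freedom the abstract describes.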


Chen, Xiang 'Anthony', Marquardt, Nicolai, Boring, Sebastian and Greenberg, Saul (2012): Extending a mobile device's interaction space through body-centric interaction. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 151-160.

Modern mobile devices rely on the screen as a primary input modality. Yet the small screen real-estate limits interaction possibilities, motivating researchers to explore alternate input techniques. Within this arena, our goal is to develop Body-Centric Interaction with Mobile Devices: a class of input techniques that allow a person to position and orient her mobile device to navigate and manipulate digital content anchored in the space on and around the body. To achieve this goal, we explore such interaction in a bottom-up path of prototypes and implementations. From our experiences, as well as by examining related work, we discuss and present three recurring themes that characterize how these interactions can be realized. We illustrate how these themes can inform the design of Body-Centric Interactions by applying them to the design of a novel mobile browser application. Overall, we contribute a class of mobile input techniques where interactions are extended beyond the small screen, and are instead driven by a person's movement of the device on and around the body.

© All rights reserved Chen et al. and/or ACM Press


Boring, Sebastian, Gehring, Sven, Wiethoff, Alexander, Blöckner, Anna Magdalena, Schöning, Johannes and Butz, Andreas (2011): Multi-user interaction on media facades through live video on mobile devices. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 2721-2724.

The increasing number of media facades in urban spaces offers great potential for new forms of interaction, especially for collaborative multi-user scenarios. In this paper, we present a way to directly interact with them through live video on mobile devices. We extend the Touch Projector interface to accommodate multiple users by showing individual content on the mobile display that would otherwise clutter the facade's canvas or distract other users. To demonstrate our concept, we built two collaborative multi-user applications: (1) painting on the facade and (2) solving a 15-puzzle. We gathered informal feedback during the Ars Electronica Festival in Linz, Austria and found that our interaction technique is (1) considered easy to learn, but (2) may leave users unaware of the actions of others.

© All rights reserved Boring et al. and/or their publisher


Marquardt, Nicolai, Kiemer, Johannes, Ledo, David, Boring, Sebastian and Greenberg, Saul (2011): Designing user-, hand-, and handpart-aware tabletop interactions with the TouchID toolkit. In: Proceedings of the 2011 ACM International Conference on Interactive Tabletops and Surfaces 2011. pp. 21-30.

Recent work in multi-touch tabletop interaction introduced many novel techniques that let people manipulate digital content through touch. Yet most only detect touch blobs. This ignores richer interactions that would be possible if we could identify (1) which part of the hand, (2) which side of the hand, and (3) which person is actually touching the surface. Fiduciary-tagged gloves were previously introduced as a simple but reliable technique for providing this information. The problem is that its low-level programming model hinders the way developers could rapidly explore new kinds of user- and handpart-aware interactions. We contribute the TouchID toolkit to solve this problem. It allows rapid prototyping of expressive multi-touch interactions that exploit the aforementioned characteristics of touch input. TouchID provides an easy-to-use event-driven API as well as higher-level tools that facilitate development: a glove configurator to rapidly associate particular glove parts to handparts; and a posture configurator and gesture configurator for registering new hand postures and gestures for the toolkit to recognize. We illustrate TouchID's expressiveness by showing how we developed a suite of techniques that exploits knowledge of which handpart is touching the surface.

© All rights reserved Marquardt et al. and/or ACM Press


Marquardt, Nicolai, Diaz-Marino, Robert, Boring, Sebastian and Greenberg, Saul (2011): The proximity toolkit: prototyping proxemic interactions in ubiquitous computing ecologies. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 315-326.

People naturally understand and use proxemic relationships (e.g., their distance and orientation towards others) in everyday situations. However, only few ubiquitous computing (ubicomp) systems interpret such proxemic relationships to mediate interaction (proxemic interaction). A technical problem is that developers find it challenging and tedious to access proxemic information from sensors. Our Proximity Toolkit solves this problem. It simplifies the exploration of interaction techniques by supplying fine-grained proxemic information between people, portable devices, large interactive surfaces, and other non-digital objects in a room-sized environment. The toolkit offers three key features. (1) It facilitates rapid prototyping of proxemic-aware systems by supplying developers with the orientation, distance, motion, identity, and location information between entities. (2) It includes various tools, such as a visual monitoring tool, that allows developers to visually observe, record and explore proxemic relationships in 3D space. (3) Its flexible architecture separates sensing hardware from the proxemic data model derived from these sensors, which means that a variety of sensing technologies can be substituted or combined to derive proxemic information. We illustrate the versatility of the toolkit with proxemic-aware systems built by students.

© All rights reserved Marquardt et al. and/or ACM Press
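The kind of proxemic-event API the abstract describes — fine-grained distance relationships between tracked entities, delivered to application code — might look like the following sketch. All names here are invented for illustration and are not the Proximity Toolkit's real API.

```python
# Hypothetical sketch of a proxemic-event API: entities are tracked in a
# room-sized space and a callback fires when two come within a threshold.
import math

class Entity:
    """A tracked person, device, or object with a 2D position (in meters)."""
    def __init__(self, name, x, y):
        self.name, self.x, self.y = name, x, y

class ProximitySpace:
    """Holds tracked entities and fires on_enter(a, b, distance) per frame."""
    def __init__(self, threshold):
        self.threshold = threshold  # distance in meters
        self.entities = []
        self.on_enter = None        # callback(a, b, distance)

    def add(self, entity):
        self.entities.append(entity)

    def update(self):
        # Pairwise distance check, as a tracking system would run each frame.
        for i, a in enumerate(self.entities):
            for b in self.entities[i + 1:]:
                d = math.hypot(a.x - b.x, a.y - b.y)
                if d < self.threshold and self.on_enter:
                    self.on_enter(a, b, d)

space = ProximitySpace(threshold=1.5)
space.add(Entity("person", 0.0, 0.0))
space.add(Entity("display", 1.0, 0.0))
events = []
space.on_enter = lambda a, b, d: events.append((a.name, b.name, d))
space.update()
```

Separating the sensing loop (`update`) from the application callback mirrors the architectural split the abstract highlights: sensing hardware can change without touching application code.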


Wimmer, Raphael, Hennecke, Fabian, Schulz, Florian, Boring, Sebastian, Butz, Andreas and Hussmann, Heinrich (2010): Curve: revisiting the digital desk. In: Proceedings of the Sixth Nordic Conference on Human-Computer Interaction 2010. pp. 561-570.

Current desktop workspace environments consist of a vertical area (e.g., a screen with a virtual desktop) and a horizontal area (e.g., the physical desk). Daily working activities benefit from different intrinsic properties of both of these areas. However, both areas are distinct from each other, making data exchange between them cumbersome. Therefore, we present Curve, a novel interactive desktop environment, which combines advantages of vertical and horizontal working areas using a continuous curved connection. This connection offers new ways of direct multi-touch interaction and new ways of information visualization. We describe our basic design, the ergonomic adaptations we made, and discuss technical challenges we met and expect to meet while building and configuring the system.

© All rights reserved Wimmer et al. and/or their publisher


Streng, Sara, Stegmann, Karsten, Boring, Sebastian, Böhm, Sonja, Fischer, Frank and Hussmann, Heinrich (2010): Measuring effects of private and shared displays in small-group knowledge sharing processes. In: Proceedings of the Sixth Nordic Conference on Human-Computer Interaction 2010. pp. 789-792.

Knowledge sharing is important in every team or organization. Various tools are frequently used in meetings to support knowledge sharing, ranging from pen-and-paper to whiteboards and other shared workspaces. This paper reports on a user study that investigated how private and shared displays affect knowledge sharing processes in co-located meetings. Three setups were compared in a hidden-profile experiment: a distributed system providing a shared display and laptops (Note&Share), a regular whiteboard, and pen-and-paper. The results show several advantages of the distributed system. For example, the group was more confident in the solution when using Note&Share. Furthermore, the number of shared arguments was significantly closer to the correct number, which suggests that misunderstandings occurred less frequently. Finally, some interesting effects were observed, which we claim to be connected to the availability of pen-and-paper in all conditions. Therefore, we discuss the observed effects as well as general lessons learned from this experiment.

© All rights reserved Streng et al. and/or their publisher


Boring, Sebastian, Baur, Dominikus, Butz, Andreas, Gustafson, Sean and Baudisch, Patrick (2010): Touch projector: mobile interaction through video. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2287-2296.

In 1992, Tani et al. proposed remotely operating machines in a factory by manipulating a live video image on a computer screen. In this paper we revisit this metaphor and investigate its suitability for mobile use. We present Touch Projector, a system that enables users to interact with remote screens through a live video image on their mobile device. The handheld device tracks itself with respect to the surrounding displays. Touch on the video image is "projected" onto the target display in view, as if it had occurred there. This literal adaptation of Tani's idea, however, fails because handheld video does not offer enough stability and control to enable precise manipulation. We address this with a series of improvements, including zooming and freezing the video image. In a user study, participants selected targets and dragged targets between displays using the literal and three improved versions. We found that participants achieved highest performance with automatic zooming and temporary image freezing.

© All rights reserved Boring et al. and/or their publisher


Baur, Dominikus, Boring, Sebastian and Butz, Andreas (2010): Rush: repeated recommendations on mobile devices. In: Proceedings of the 2010 International Conference on Intelligent User Interfaces 2010. pp. 91-100.

We present rush as a recommendation-based interaction and visualization technique for repeated item selection from large data sets on mobile touch screen devices. Proposals and choices are intertwined in a continuous finger gesture navigating a two-dimensional canvas of recommended items. This provides users with more flexibility for the resulting selections. Our design is based on a formative user study regarding orientation and occlusion aspects. Subsequently, we implemented a version of rush for music playlist creation. In an experimental evaluation we compared different types of recommendations based on similarity, namely the top 5 most similar items, five random selections from the list of similar items and a hybrid version of the two. Participants had to create playlists using each condition. Our results show that top 5 was too restricting, while random and hybrid suggestions had comparable results.

© All rights reserved Baur et al. and/or their publisher


Wimmer, Raphael and Boring, Sebastian (2009): HandSense: discriminating different ways of grasping and holding a tangible user interface. In: Villar, Nicolas, Izadi, Shahram, Fraser, Mike and Benford, Steve (eds.) TEI 2009 - Proceedings of the 3rd International Conference on Tangible and Embedded Interaction February 16-18, 2009, Cambridge, UK. pp. 359-362.


Boring, Sebastian, Jurmu, Marko and Butz, Andreas (2009): Scroll, tilt or move it: using mobile phones to continuously control pointers on large public displays. In: Proceedings of OZCHI09, the CHISIG Annual Conference on Human-Computer Interaction 2009. pp. 161-168.

Large and public displays mostly provide little interactivity due to technical constraints, making it difficult for people to capture interesting information or to influence the screen's content. Through the combination of large-scale visual output and the mobile phone as an input device, bidirectional interaction with large public displays can be enabled. In this paper, we propose and compare three different interaction techniques (Scroll, Tilt and Move) for continuous control of a pointer located on a remote display using a mobile phone. Since each of these techniques seemed to have arguments for and against them, we conducted a comparative evaluation and discovered their specific strengths and weaknesses. We report the implementation of the techniques, their design and results of our user study. The experiment revealed that while Move and Tilt can be faster, they also introduce higher error rates for selection tasks.

© All rights reserved Boring et al. and/or their publisher


Luca, Alexander De, Frauendienst, Bernhard, Boring, Sebastian and Hussmann, Heinrich (2009): My phone is my keypad: privacy-enhanced PIN-entry on public terminals. In: Proceedings of OZCHI09, the CHISIG Annual Conference on Human-Computer Interaction 2009. pp. 401-404.

More and more services are available on public terminals. Due to their public location and permanent availability, they can easily fall victim to manipulation. These manipulations mostly aim at stealing the customers' authentication information (e.g., bank card PIN) to gain access to the victims' possessions. By relocating the input from the terminal to the users' mobile device, the system presented in this paper makes the authentication process resistant against such manipulations. In principle, this relocation makes PIN entry more complex, with a tendency to worse usability. In this paper, we present the concept as well as an evaluation that has been conducted to study the trade-off between usability and security. The results show that users apparently are willing to accept a certain increase of interaction time in exchange for improved security.

© All rights reserved Luca et al. and/or their publisher


Hilliges, Otmar, Terrenghi, Lucia, Boring, Sebastian, Kim, David, Richter, Hendrik and Butz, Andreas (2007): Designing for collaborative creative problem solving. In: Proceedings of the 2007 Conference on Creativity and Cognition 2007, Washington DC, USA. pp. 137-146.

Collaborative creativity is traditionally supported by formal techniques, such as brainstorming. These techniques improve the idea-generation process by creating group synergies, but also suffer from a number of negative effects. Current electronic tools to support collaborative creativity overcome some of these problems, but introduce new ones, by either losing the benefits of face-to-face communication or the immediacy of simultaneous contribution. Using an interactive environment as a test bed, we are investigating how collaborative creativity can be supported electronically while maintaining face-to-face communication. What are the design-factors influencing such a system? We have designed a brainstorming application that uses an interactive table and a large wall display, and compared the results of using it to traditional paper-based brainstorming in a user study with 30 participants. From the considerations that went into the design and the observations during the study we derive a number of design guidelines for collaborative systems in interactive environments.

© All rights reserved Hilliges et al. and/or ACM Press


Boring, Sebastian, Hilliges, Otmar and Butz, Andreas (2007): A Wall-Sized Focus Plus Context Display. In: PerCom 2007 - Fifth Annual IEEE International Conference on Pervasive Computing and Communications 19-23 March, 2007, White Plains, New York, USA. pp. 161-170.


Wimmer, Raphael, Kranz, Matthias, Boring, Sebastian and Schmidt, Albrecht (2007): A Capacitive Sensing Toolkit for Pervasive Activity Detection and Recognition. In: PerCom 2007 - Fifth Annual IEEE International Conference on Pervasive Computing and Communications 19-23 March, 2007, White Plains, New York, USA. pp. 171-180.


Kientz, Julie A., Boring, Sebastian, Abowd, Gregory D. and Hayes, Gillian R. (2005): Abaris: Evaluating Automated Capture Applied to Structured Autism Interventions. In: Beigl, Michael, Intille, Stephen S., Rekimoto, Jun and Tokuda, Hideyuki (eds.) UbiComp 2005 Ubiquitous Computing - 7th International Conference September 11-14, 2005, Tokyo, Japan. pp. 323-339.


Page Information

Page maintainer: The Editorial Team