Publication statistics

Pub. period: 2009-2012
Pub. count: 14
Number of co-authors: 25



Co-authors

Number of publications with 3 favourite co-authors:

Michael Rohs: 13
Georg Essl: 2
Jörg Müller: 2

 

 

Productive colleagues

Sven Kratz's 3 most productive colleagues in number of publications:

Albrecht Schmidt: 111
Antonio Krüger: 59
Michael Rohs: 46
 
 
 


Sven Kratz

 

Publications by Sven Kratz (bibliography)

2012
 

Kratz, Sven, Rohs, Michael, Guse, Dennis, Müller, Jörg, Bailly, Gilles and Nischt, Michael (2012): PalmSpace: continuous around-device gestures vs. multitouch for 3D rotation tasks on mobile devices. In: Proceedings of the 2012 International Conference on Advanced Visual Interfaces 2012. pp. 181-188. Available online

Rotating 3D objects is a difficult task on mobile devices, because the task requires 3 degrees of freedom and (multi-)touch input only allows for an indirect mapping. We propose a novel style of mobile interaction based on mid-air gestures in proximity of the device to increase the number of DOFs and alleviate the limitations of touch interaction with mobile devices. While one hand holds the device, the other hand performs mid-air gestures in proximity of the device to control 3D objects on the mobile device's screen. A flat hand pose defines a virtual surface, which we refer to as the PalmSpace, for precise and intuitive 3D rotations. We constructed several hardware prototypes to test our interface and to simulate possible future mobile devices equipped with depth cameras. We conducted a user study to compare 3D rotation tasks using the two most promising designs for the hand location during interaction -- behind and beside the device -- with the virtual trackball, which is the current state-of-the-art technique for orientation manipulation on touch-screens. Our results show that both variants of PalmSpace have significantly lower task completion times in comparison to the virtual trackball.

© All rights reserved Kratz et al. and/or ACM Press
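
The central mapping described above, from the pose of a flat hand to the rotation of a 3D object on the screen, can be sketched in a few lines. The sketch below assumes that palm roll, pitch, and yaw angles have already been estimated (for example by fitting a plane to the hand in a depth image) and simply composes them into a rotation matrix; the angle convention, the function names, and the direct angle-to-rotation mapping are illustrative assumptions, not the paper's implementation.

    import numpy as np

    def rotation_from_palm(roll, pitch, yaw):
        """Compose a rotation matrix from hypothetical palm angles (radians).
        Estimating these angles from a depth image is not shown here."""
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        return rz @ ry @ rx

    # Rotate an object's vertices by the current palm pose.
    vertices = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
    rotated = vertices @ rotation_from_palm(roll=0.1, pitch=0.4, yaw=-0.2).T
    print(rotated)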

2011
 

Shirazi, Alireza Sahami, Rohs, Michael, Schleicher, Robert, Kratz, Sven, Müller, Alexander and Schmidt, Albrecht (2011): Real-time nonverbal opinion sharing through mobile phones during sports events. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 307-310. Available online

Even with the rise of the World Wide Web, TV has remained the most pervasive entertainment medium and is nowadays often used together with other media, which allow for active participation. The idea of connecting non-collocated TV viewers via telecommunication technologies, referred to as Social TV, has recently received considerable attention. Such systems typically include set-top boxes for supporting collaboration. In this research we investigate if real-time opinion sharing about TV shows through a nonverbal (non-textual) iconic UI on mobile phones is reasonable. For this purpose we developed a mobile app, made it available to a large number of users through the Android Market, and conducted an uncontrolled user study in the wild during the soccer world cup 2010. The results of the study indicate that TV viewers who used the app had more fun and felt more connected to other viewers. We also show that by monitoring this channel it is possible to collect sentiments relevant to the broadcasted content in real-time. The collected data exemplify that the aggregated sentiments correspond to important moments, and hence can be used to generate a summary of the event.

© All rights reserved Shirazi et al. and/or their publisher

 

Kratz, Sven, Westermann, Tilo, Rohs, Michael and Essl, Georg (2011): CapWidgets: tangible widgets versus multi-touch controls on mobile devices. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 1351-1356. Available online

We present CapWidgets, passive tangible controls for capacitive touch screens. CapWidgets bring back physical controls to off-the-shelf multi-touch surfaces as found in mobile phones and tablet computers. While the user touches the widget, the surface detects the capacitive marker on the widget's underside. We study the relative performance of this tangible interaction with direct multi-touch interaction and our experimental results show that user performance and preferences are not automatically in favor of tangible widgets and careful design is necessary to validate their properties.

© All rights reserved Kratz et al. and/or their publisher

 

Kratz, Sven, Rohs, Michael, Wolf, Katrin, Müller, Jörg, Wilhelm, Mathias, Johansson, Carolina, Tholander, Jakob and Laaksolahti, Jarmo (2011): Body, movement, gesture & tactility in interaction with mobile devices. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 757-759. Available online

In the search for novel and more expressive interaction techniques for mobile devices, bodily aspects such as movement, gesture, and touch-based interfaces are prominent. For instance, touch-screen gestures have found widespread application in mobile device interfaces, while bodily gestures involving device movement are successfully applied in gaming scenarios. Research systems increasingly explore other modalities, such as pressure, free-hand, and on-body interaction in mobile settings. This has become possible through ongoing developments that have made sensing and actuating technologies cheaper and more easily integrated in mobile and handheld devices. The turn towards experiential, embodied, and enacted perspectives on cognition and action has also contributed to a shift in which aspects of interaction to focus upon in interaction design. This has led HCI researchers to explore not only how the whole human body can be taken into account in design, but also new domains of application, for instance in leisure, entertainment, and public urban environments.

© All rights reserved Kratz et al. and/or ACM Press

 

Qin, Qian, Rohs, Michael and Kratz, Sven (2011): Dynamic ambient lighting for mobile devices. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 51-52. Available online

The information a small mobile device can show via its display has always been limited by its size. In large information spaces, relevant information, such as important locations on a map, can get clipped when a user starts zooming and panning. Dynamic ambient lighting allows mobile devices to visualize off-screen objects by illuminating the background without compromising valuable display space. The lighted spots can be used to show the direction and distance of such objects by varying the spot's position and intensity. Dynamic ambient lighting also provides a new way of displaying the state of a mobile device. Illumination is provided by a prototype rear-of-device shell, which contains LEDs and requires the device to be placed on a surface, such as a table or desk.

© All rights reserved Qin et al. and/or ACM Press
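
The visualization idea in this entry, encoding the direction and distance of an off-screen point of interest as the position and intensity of a light spot around the device, can be illustrated with a small sketch. The ring of eight LEDs, the linear intensity falloff, and the coordinate conventions below are assumptions made for illustration; the prototype shell described above may be laid out differently.

    import math

    NUM_LEDS = 8  # hypothetical ring of LEDs around the device, one every 45 degrees

    def ambient_spot(obj_x, obj_y, screen_w, screen_h, max_dist):
        """Map an off-screen object (coordinates relative to the visible
        viewport) to an LED index and an intensity in the range 0..1."""
        cx, cy = screen_w / 2.0, screen_h / 2.0
        dx, dy = obj_x - cx, obj_y - cy
        angle = math.atan2(dy, dx) % (2 * math.pi)             # direction of the object
        led = int(round(angle / (2 * math.pi) * NUM_LEDS)) % NUM_LEDS
        intensity = max(0.0, 1.0 - math.hypot(dx, dy) / max_dist)  # nearer objects glow brighter
        return led, intensity

    # A point of interest far to the right of the visible map area.
    print(ambient_spot(obj_x=1500, obj_y=-200, screen_w=320, screen_h=480, max_dist=4000))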

2010
 

Kratz, Sven, Brodien, Ivo and Rohs, Michael (2010): Semi-automatic zooming for mobile map navigation. In: Proceedings of 12th Conference on Human-computer interaction with mobile devices and services 2010. pp. 63-72. Available online

In this paper we present a novel interface for mobile map navigation based on Semi-Automatic Zooming (SAZ). SAZ gives the user the ability to manually control the zoom level of a Speed-Dependent Automatic Zooming (SDAZ) interface, while retaining the automatic zooming characteristics of that interface at times when the user is not explicitly controlling the zoom level. In a user study conducted using a realistic mobile map with a wide scale space, we compare SAZ with existing map interface techniques, multi-touch and SDAZ. We extend a dynamic state-space model for SDAZ to accept 2D tilt input for scroll rate and zoom level control and implement a dynamically zoomable map view with access to high-resolution map material for use in our study. The study reveals that SAZ performs significantly better than SDAZ and that SAZ is comparable in performance and usability to a standard multi-touch map interface. Furthermore, the study shows that SAZ could serve as an alternative to multi-touch as an input technique for mobile map interfaces.

© All rights reserved Kratz et al. and/or their publisher
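
The coupling behind SDAZ and the manual override that SAZ adds can be sketched as follows. The paper itself uses a dynamic state-space model driven by 2D tilt input; the linear speed-to-zoom coupling, the constants, and the override rule below are simplified assumptions chosen only to make the idea concrete.

    def sdaz_zoom(scroll_speed, min_zoom=1.0, max_zoom=16.0, gain=0.01):
        """Speed-dependent zoom: the faster the map scrolls, the further the
        view zooms out (illustrative linear coupling, not the paper's model)."""
        return min(max_zoom, min_zoom + gain * abs(scroll_speed))

    def saz_zoom(scroll_speed, manual_zoom=None):
        """Semi-automatic zoom: honour an explicit zoom level set by the user,
        otherwise fall back to the automatic speed-dependent value."""
        return manual_zoom if manual_zoom is not None else sdaz_zoom(scroll_speed)

    print(saz_zoom(scroll_speed=800))                    # automatic zoom while panning fast
    print(saz_zoom(scroll_speed=800, manual_zoom=2.0))   # the user overrides the zoom level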

 

Stewart, Craig, Rohs, Michael, Kratz, Sven and Essl, Georg (2010): Characteristics of pressure-based input for mobile devices. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 801-810. Available online

We conducted a series of user studies to understand and clarify the fundamental characteristics of pressure in user interfaces for mobile devices. We seek to provide insight to clarify a longstanding discussion on mapping functions for pressure input. Previous literature is conflicted about the correct transfer function to optimize user performance. Our study results suggest that the discrepancy can be explained by different signal conditioning circuitry and with improved signal conditioning the user-performed precision relationship is linear. We also explore the effects of hand pose when applying pressure to a mobile device from the front, the back, or simultaneously from both sides in a pinching movement. Our results indicate that grasping type input outperforms single-sided input and is competitive with pressure input against solid surfaces. Finally we provide an initial exploration of non-visual multimodal feedback, motivated by the desire for eyes-free use of mobile devices. The findings suggest that non-visual pressure input can be executed without degradation in selection time but suffers from accuracy problems.

© All rights reserved Stewart et al. and/or their publisher
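
The discussion of transfer functions and signal conditioning in this entry can be made concrete with a small sketch: a raw pressure reading is first smoothed and then mapped linearly onto discrete pressure levels. The exponential smoothing, the 10-bit sensor range, and the number of levels are illustrative assumptions, not the conditioning circuitry or mappings evaluated in the paper.

    def condition(raw_samples, alpha=0.3):
        """Exponential smoothing as a software stand-in for signal conditioning."""
        smoothed, value = [], raw_samples[0]
        for sample in raw_samples:
            value = alpha * sample + (1 - alpha) * value
            smoothed.append(value)
        return smoothed

    def linear_transfer(sensor_value, sensor_max=1023, levels=10):
        """Linear transfer function: map a conditioned reading onto one of
        `levels` discrete pressure levels."""
        sensor_value = max(0, min(sensor_value, sensor_max))
        return int(sensor_value / sensor_max * (levels - 1))

    readings = [0, 200, 380, 410, 395, 700, 1023]
    print([linear_transfer(v) for v in condition(readings)])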

 

Kratz, Sven and Rohs, Michael (2010): A $3 gesture recognizer: simple gesture recognition for devices equipped with 3D acceleration sensors. In: Proceedings of the 2010 International Conference on Intelligent User Interfaces 2010. pp. 341-344. Available online

We present the $3 Gesture Recognizer, a simple but robust gesture recognition system for input devices featuring 3D acceleration sensors. The algorithm is designed to be implemented quickly in prototyping environments, is intended to be device-independent and does not require any special toolkits or frameworks. It relies solely on simple trigonometric and geometric calculations. A user evaluation of our system resulted in a correct gesture

© All rights reserved Kratz and Rohs and/or their publisher

 

Kratz, Sven and Rohs, Michael (2010): The $3 recognizer: simple 3D gesture recognition on mobile devices. In: Proceedings of the 2010 International Conference on Intelligent User Interfaces 2010. pp. 419-420. Available online

We present the $3 Gesture Recognizer, a simple but robust gesture recognition system for input devices featuring 3D acceleration sensors. The algorithm is designed to be implemented quickly in prototyping environments, is intended to be device-independent and does not require any special toolkits or frameworks, but relies solely on simple trigonometric and geometric calculations. Our method requires significantly less training data than other gesture recognizers and is thus suited to be deployed and to deliver results rapidly.

© All rights reserved Kratz and Rohs and/or their publisher
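
Both $3 entries above describe a recognizer that relies only on simple geometric calculations over 3D acceleration traces and needs very little training data. As a rough illustration of that style of matching, the sketch below resamples each trace to a fixed number of points and picks the template with the smallest average point-to-point distance. This is a simplified nearest-template stand-in, not the published $3 algorithm; the resampling scheme, the distance score, and the gesture names are assumptions.

    import math

    def resample(trace, n=32):
        """Uniformly resample a 3D trace (list of (x, y, z) tuples) to n points
        by linear interpolation over the sample index."""
        m = len(trace)
        out = []
        for k in range(n):
            pos = k * (m - 1) / (n - 1)
            i, frac = int(pos), pos - int(pos)
            if i + 1 < m:
                p = tuple(a + frac * (b - a) for a, b in zip(trace[i], trace[i + 1]))
            else:
                p = trace[-1]
            out.append(p)
        return out

    def score(trace, template):
        """Average point-to-point distance between two resampled traces."""
        a, b = resample(trace), resample(template)
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

    def recognize(trace, templates):
        """Return the name of the template with the smallest distance score."""
        return min(templates, key=lambda name: score(trace, templates[name]))

    # Toy example with two single-template gesture classes.
    templates = {
        "shake_x": [(t, 0.0, 0.0) for t in (0, 1, -1, 1, -1, 0)],
        "lift_z":  [(0.0, 0.0, t) for t in (0, 1, 2, 3, 4, 5)],
    }
    sample = [(0, 0.1, 0), (0.9, 0, 0), (-1.1, 0, 0), (1, 0, 0), (-1, 0, 0), (0, 0, 0)]
    print(recognize(sample, templates))   # -> shake_x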

2009
 

Kratz, Sven and Ballagas, Raphael (2009): Unravelling seams: improving mobile gesture recognition with visual feedback techniques. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 937-940. Available online

Gesture recognition is emerging as an engaging interaction technique in mobile scenarios, and high recognition rates promote user acceptance. Several factors influence recognition rates, including the nature of the gesture set and the suitability of the gesture recognition algorithm. This work explores how seamfulness in gesture stroke visualization affects recognition rates. We present the results of a user evaluation of a gesture recognition system that shows that raw (seamful) visualization of low-fidelity gesture stroke data has recognition rates comparable to no feedback. Providing filtered (seamless) stroke visualization to the user, while retaining the unfiltered input data for recognition, resulted in a 34.9% improvement in gesture recognition rate over raw stroke data. The results provide insights into the broader design space of seamful design, and identify areas where seamlessness is advantageous.

© All rights reserved Kratz and Ballagas and/or ACM Press
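
The key manipulation in this study, drawing a filtered (seamless) stroke for the user while the recognizer keeps working on the raw (seamful) data, can be sketched with a smoothing pass applied only to the displayed points. The moving-average filter and its window size are illustrative assumptions; the paper does not prescribe this particular filter.

    def smooth_for_display(raw_stroke, window=5):
        """Moving-average filter over 2D stroke points for visualization only;
        the raw points are still passed unchanged to the recognizer."""
        half = window // 2
        smoothed = []
        for i in range(len(raw_stroke)):
            lo, hi = max(0, i - half), min(len(raw_stroke), i + half + 1)
            xs = [p[0] for p in raw_stroke[lo:hi]]
            ys = [p[1] for p in raw_stroke[lo:hi]]
            smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
        return smoothed

    raw = [(0, 0), (1, 3), (2, -1), (3, 4), (4, 0), (5, 5)]
    display_stroke = smooth_for_display(raw)   # drawn on screen (seamless)
    recognizer_input = raw                     # recognition still uses raw data (seamful)
    print(display_stroke)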

 

Schöning, Johannes, Rohs, Michael, Kratz, Sven, Löchtefeld, Markus and Krüger, Antonio (2009): Map torchlight: a mobile augmented reality camera projector unit. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 3841-3846. Available online

The advantages of paper-based maps have been utilized in the field of mobile Augmented Reality (AR) in the last few years. Traditional paper-based maps provide high-resolution, large-scale information with zero power consumption. There are numerous implementations of magic lens interfaces that combine high-resolution paper maps with dynamic handheld displays. From an HCI perspective, the main challenge of magic lens interfaces is that users have to switch their attention between the magic lens and the information in the background. In this paper, we attempt to overcome this problem by using a lightweight mobile camera projector unit to augment the paper map directly with additional information. The "Map Torchlight" is tracked over a paper map and can precisely highlight points of interest, streets, and areas to give directions or other guidance for interacting with the map.

© All rights reserved Schoning et al. and/or ACM Press

 

Kratz, Sven and Rohs, Michael (2009): HoverFlow: expanding the design space of around-device interaction. In: Proceedings of 11th Conference on Human-computer interaction with mobile devices and services 2009. p. 4. Available online

In this paper we explore the design space of around-device interaction (ADI). This approach seeks to expand the interaction possibilities of mobile and wearable devices beyond the confines of the physical device itself to include the space around it. This enables rich 3D input, comprising coarse movement-based gestures, as well as static position-based gestures. ADI can help to solve occlusion problems and scales down to very small devices. We present a novel around-device interaction interface that allows mobile devices to track coarse hand gestures performed above the device's screen. Our prototype uses infrared proximity sensors to track hand and finger positions in the device's proximity. We present an algorithm for detecting hand gestures and provide a rough overview of the design space of ADI-based interfaces.

© All rights reserved Kratz and Rohs and/or their publisher

 

Kratz, Sven and Rohs, Michael (2009): HoverFlow: exploring around-device interaction with IR distance sensors. In: Proceedings of 11th Conference on Human-computer interaction with mobile devices and services 2009. p. 42. Available online

By equipping a mobile device with distance sensing capabilities, we aim to expand the interaction possibilities of mobile and wearable devices beyond the confines of the physical device itself to include the space immediately around it. Our prototype, an Apple iPhone equipped with six IR distance sensors, allows for rich 3D input, comprising coarse movement-based hand gestures, as well as static position-based gestures. A demonstration application, HoverFlow, illustrates the use of coarse hand gestures for interaction with mobile applications. This type of interaction, which we call Around-Device Interaction (ADI) has the potential to help to solve occlusion problems on small-screen mobile devices and scales well to small device sizes.

© All rights reserved Kratz and Rohs and/or their publisher
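
Both HoverFlow entries above describe detecting coarse hand gestures from IR distance sensors around the device. As a much-reduced illustration, the sketch below infers the direction of a hand sweep from the order in which two opposing sensors first register the hand; the two-sensor layout, the activation threshold, and the classification rule are assumptions (the prototype described above uses six sensors and a richer gesture set).

    def detect_sweep(left_series, right_series, threshold=0.5):
        """Classify a coarse hand sweep from two IR distance-sensor time series
        (normalized so that larger values mean the hand is closer)."""
        def first_activation(series):
            for t, value in enumerate(series):
                if value >= threshold:
                    return t
            return None

        t_left, t_right = first_activation(left_series), first_activation(right_series)
        if t_left is None or t_right is None:
            return "no sweep"
        if t_left < t_right:
            return "left-to-right"
        if t_right < t_left:
            return "right-to-left"
        return "hover"

    left =  [0.0, 0.2, 0.7, 0.9, 0.4, 0.1]
    right = [0.0, 0.0, 0.1, 0.6, 0.9, 0.3]
    print(detect_sweep(left, right))   # -> left-to-right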

 

Kray, Christian, Rohs, Michael, Hook, Jonathan and Kratz, Sven (2009): Bridging the gap between the Kodak and the Flickr generations: A novel interaction technique for collocated photo sharing. In International Journal of Human-Computer Studies, 67 (12) pp. 1060-1072. Available online

Passing around stacks of paper photographs while sitting around a table is one of the key social practices defining what is commonly referred to as the 'Kodak Generation'. Due to the way digital photographs are stored and handled, this practice does not translate well to the 'Flickr Generation', where collocated photo sharing often involves the (wireless) transmission of a photo from one mobile device to another. In order to facilitate 'cross-generation' sharing without enforcing either practice, it is desirable to bridge this gap in a way that incorporates familiar aspects of both. In this paper, we discuss a novel interaction technique that addresses some of the constraints introduced by current communication technology, and that enables photo sharing in a way that resembles the passing of stacks of paper photographs. This technique is based on dynamically generated spatial regions around mobile devices and has been evaluated through two user studies. The results we obtained indicate that our technique is easy to learn and as fast as, or faster than, current technology such as transmitting photos between devices using Bluetooth. In addition, we found evidence of different sharing techniques influencing social practice around photo sharing. The use of our technique resulted in a more inclusive and group-oriented behavior in contrast to Bluetooth photo sharing, which resulted in a more fractured setting composed of sub-groups.

© All rights reserved Kray et al. and/or Academic Press

 

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/sven_kratz.html