Publication statistics

Publication period: 2009-2012
Publication count: 14
Number of co-authors: 30



Co-authors

Number of publications with Markus Lochtefeld's three most frequent co-authors:

Antonio Krüger: 9
Sven Gehring: 6
Michael Rohs: 5

 

 

Productive colleagues

Markus Lochtefeld's three most productive colleagues, by number of publications:

Matt Jones: 63
Antonio Krüger: 59
Keith Cheverst: 50
 
 
 


Markus Lochtefeld

 

Publications by Markus Lochtefeld (bibliography)

2012
 

Gehring, Sven, Lochtefeld, Markus, Daiber, Florian, Bohmer, Matthias and Krüger, Antonio (2012): Using intelligent natural user interfaces to support sales conversations. In: Proceedings of the 2012 International Conference on Intelligent User Interfaces 2012. pp. 97-100. Available online

During sales conversations, gestures and facial expressions are highly important for communicating information about a product. One prominent example of such sales gestures is the meat and cheese counter, one of the remaining spots in supermarkets where salespersons interact with customers. Interactions at such counters normally follow a simple protocol. The customer points at an item of choice. The employee takes out the item and, in most cases, cuts the product to fit the amount the customer wants to buy. It is often ambiguous which specific product the customer and the employee are talking about. Up to now, there have been only a few efforts in HCI research to enrich communication at the point of sale. In this paper we report and analyze one scenario in which an intelligent natural user interface can support communication between customer and employee in a sales conversation. Furthermore, we report on our prototype, which is able to track pointing gestures using a depth camera and to display information about the items pointed at.

© All rights reserved Gehring et al. and/or ACM Press
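As a loose illustration of the pointing-gesture lookup described in the abstract above (not code from the paper), one could cast a ray through two depth-camera skeleton joints and intersect it with the counter plane. The item layout, joint coordinates and thresholds below are hypothetical.

```python
import numpy as np

# Hypothetical item layout on the counter plane, in metres.
ITEMS = {"gouda": (0.40, 0.20), "salami": (0.80, 0.25), "brie": (1.20, 0.30)}

def intersect_counter(elbow, hand, counter_z=0.0):
    """Cast a ray from elbow through hand and intersect it with the
    counter plane z = counter_z. Returns the (x, y) hit point or None."""
    elbow, hand = np.asarray(elbow, float), np.asarray(hand, float)
    direction = hand - elbow
    if abs(direction[2]) < 1e-6:        # ray parallel to the counter plane
        return None
    t = (counter_z - elbow[2]) / direction[2]
    if t <= 0:                          # pointing away from the counter
        return None
    return (elbow + t * direction)[:2]

def pointed_item(elbow, hand, max_dist=0.25):
    """Return the item closest to the pointing ray's hit point, if any."""
    hit = intersect_counter(elbow, hand)
    if hit is None:
        return None
    name, pos = min(ITEMS.items(), key=lambda kv: np.linalg.norm(hit - kv[1]))
    return name if np.linalg.norm(hit - np.asarray(pos)) <= max_dist else None

# Example: elbow and hand joint positions in world coordinates (metres).
print(pointed_item(elbow=(0.5, 0.3, 0.6), hand=(0.55, 0.28, 0.4)))
```

A real system would additionally smooth the tracked joints over time and handle ambiguity between neighbouring items.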

 

Cauchard, Jessica, Lochtefeld, Markus, Fraser, Mike, Krüger, Antonio and Subramanian, Sriram (2012): m+pSpaces: virtual workspaces in the spatially-aware mobile environment. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 171-180. Available online

We introduce spatially-aware virtual workspaces for the mobile environment. The notion of virtual workspaces was initially conceived to alleviate mental workload in desktop environments with limited display real-estate. Using spatial properties of mobile devices, we translate this approach and illustrate that mobile virtual workspaces greatly improve task performance for mobile devices. In a first study, we compare our spatially-aware prototype (mSpaces) to existing context switching methods for navigating amongst multiple tasks in the mobile environment. We show that users are faster, make more accurate decisions and require less mental and physical effort when using spatially-aware prototypes. We furthermore prototype pSpaces and m+pSpaces, two spatially-aware systems equipped with pico-projectors as auxiliary displays to provide dual-display capability to the handheld device. A final study reveals advantages of each of the different configurations and functionalities when comparing all three prototypes. Drawing on these findings, we identify design considerations to create, manipulate and manage spatially-aware virtual workspaces in the mobile environment.

© All rights reserved Cauchard et al. and/or ACM Press
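As a loose illustration of spatially-aware workspace switching (the paper's actual sensing and mapping are not specified here), one could discretise the device's compass heading into angular sectors so that physically turning the device changes the active virtual workspace:

```python
def workspace_for_heading(heading_deg, n_workspaces=4, origin_deg=0.0):
    """Map a compass heading (degrees) to one of n equally sized
    angular sectors, each representing one virtual workspace."""
    sector = 360.0 / n_workspaces
    return int(((heading_deg - origin_deg) % 360.0) // sector)

# Turning the device sweeps through workspaces 0..3.
for h in (10, 100, 190, 280):
    print(h, "->", workspace_for_heading(h))
```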

 

Weir, Daryl, Rogers, Simon, Murray-Smith, Roderick and Lochtefeld, Markus (2012): A user-specific machine learning approach for improving touch accuracy on mobile devices. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 465-476. Available online

We present a flexible Machine Learning approach for learning user-specific touch input models to increase touch accuracy on mobile devices. The model is based on flexible, non-parametric Gaussian Process regression and is learned using recorded touch inputs. We demonstrate that significant touch accuracy improvements can be obtained when either raw sensor data is used as an input or when the device's reported touch location is used as an input, with the latter marginally outperforming the former. We show that learned offset functions are highly nonlinear and user-specific and that user-specific models outperform models trained on data pooled from several users. Crucially, significant performance improvements can be obtained with a small (≈200) number of training examples, easily obtained for a particular user through a calibration game or from keyboard entry data.

© All rights reserved Weir et al. and/or ACM Press
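The user-specific offset model described above could be sketched with an off-the-shelf Gaussian Process regressor. The kernel choice, the synthetic calibration data and the per-axis split below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy calibration data: reported touch locations (x, y) and the offsets
# (dx, dy) to the intended targets, in normalised screen coordinates.
rng = np.random.default_rng(0)
reported = rng.uniform(0, 1, size=(200, 2))
true_offset = 0.02 * np.sin(3 * reported) + 0.005 * rng.normal(size=reported.shape)

# One GP per output dimension; the kernel here is purely illustrative.
kernel = RBF(length_scale=0.3) + WhiteKernel(noise_level=1e-4)
gp_x = GaussianProcessRegressor(kernel=kernel).fit(reported, true_offset[:, 0])
gp_y = GaussianProcessRegressor(kernel=kernel).fit(reported, true_offset[:, 1])

def corrected_touch(xy):
    """Apply the learned user-specific offset to a reported touch location."""
    xy = np.atleast_2d(xy)
    return xy + np.column_stack([gp_x.predict(xy), gp_y.predict(xy)])

print(corrected_touch([0.42, 0.77]))
```

In practice, the roughly 200 training examples per user would come from a calibration game or keyboard entry data, as the abstract notes.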

2011
 

Dachselt, Raimund, Jones, Matt, Häkkilä, Jonna, Lochtefeld, Markus, Rohs, Michael and Rukzio, Enrico (2011): Mobile and personal projection (MP2). In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 21-23. Available online

The emergence of mobile and personal projection devices promises new ways to display and interact with content while the user is mobile, and offers new opportunities and challenges for HCI. This workshop aims to formulate fundamental research questions around this emerging field and provides a discussion venue for researchers and practitioners working in this area. We will focus on new interaction techniques, applications, personal projection devices, interaction design, multi-user aspects, multi-modal user interfaces and social implications. Our aim is to foster the evolution of a mobile and personal projection community.

© All rights reserved Dachselt et al. and/or their publisher

 

Lochtefeld, Markus, Gehring, Sven, Jung, Ralf and Krüger, Antonio (2011): guitAR: supporting guitar learning through mobile projection. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 1447-1452. Available online

The guitar is one of the most widespread instruments amongst self-taught musicians, but even though a huge amount of learning material exists, the instrument is still hard to learn, especially without a teacher. In this paper we propose an Augmented Reality application called guitAR that assists guitar students in mastering their instrument using a projector phone. With the projector phone mounted at the headstock of the guitar, the fretboard and the strings lie in the phone's field of projection. By projecting instructions directly onto the strings, the user can easily see where the fingers have to be placed on the fretboard (fingering) to play a certain chord or tone sequence correctly.

© All rights reserved Lochtefeld et al. and/or their publisher
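As a rough sketch of the projection idea (not the paper's code), chord shapes could be stored as (string, fret) pairs and converted into projector-image coordinates. The chord table, the uniform fret spacing and the calibration values below are hypothetical simplifications; real fret spacing shrinks towards the body of the guitar.

```python
# Hypothetical chord shapes as (string, fret) pairs;
# strings are numbered 1 (high e) to 6 (low E).
CHORDS = {
    "C":  [(2, 1), (4, 2), (5, 3)],
    "G":  [(1, 3), (5, 2), (6, 3)],
    "Em": [(4, 2), (5, 2)],
}

def finger_positions_px(chord, fret_width_px=60, string_gap_px=12, origin=(20, 20)):
    """Convert a chord's (string, fret) positions into pixel coordinates in the
    projector image, assuming a calibrated uniform grid (frets fret_width_px
    apart, strings string_gap_px apart, nut at `origin`)."""
    ox, oy = origin
    return [(ox + (fret - 0.5) * fret_width_px,    # centre of the fret
             oy + (string - 1) * string_gap_px)
            for string, fret in CHORDS[chord]]

print(finger_positions_px("C"))   # where to draw the fingering markers
```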

 

Bohmer, Matthias, Gehring, Sven, Lochtefeld, Markus, Ostkamp, Morin and Bauer, Gernot (2011): The mighty un-touchables: creating playful engagement on media façades. In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 605-610. Available online

In this paper we investigate interaction with a media façade that is out of reach for touch-based interaction. We describe four different applications that utilize mobile devices to enable passers-by to interact with the façade. Each application was designed within the limitations set by the formal regulations of an editorial board (e.g. to prevent traffic distractions) and with the aim of catching the attention of passers-by, initiating interaction and keeping users engaged. Besides describing the design and implementation of the different applications, we report on initial user feedback from a first preliminary user test that informs further development and design.

© All rights reserved Bohmer et al. and/or ACM Press

 

Gehring, Sven, Lochtefeld, Markus, Magerkurth, Carsten, Nurmi, Petteri and Michahelles, Florian (2011): Workshop on mobile interaction in retail environments (MIRE). In: Proceedings of 13th Conference on Human-computer interaction with mobile devices and services 2011. pp. 729-731. Available online

The workshop on mobile interaction in retail environments (MIRE) brings together researchers and practitioners from academia and industry to explore how mobile phones and mobile interaction can be embedded in retail environments to create new shopping experiences and mobile-enhanced services.

© All rights reserved Gehring et al. and/or ACM Press

 

Cauchard, Jessica R., Lochtefeld, Markus, Irani, Pourang, Schoening, Johannes, Krüger, Antonio, Fraser, Mike and Subramanian, Sriram (2011): Visual separation in mobile multi-display environments. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 451-460. Available online

Projector phones, handheld game consoles and many other mobile devices increasingly include more than one display, and therefore present a new breed of mobile Multi-Display Environments (MDEs) to users. Existing studies illustrate the effects of visual separation between displays in MDEs and suggest interaction techniques that mitigate these effects. Currently, mobile devices with heterogeneous displays such as projector phones are often designed without reference to visual separation issues; therefore it is critical to establish whether concerns and opportunities raised in the existing MDE literature apply to the emerging category of Mobile MDEs (MMDEs). This paper investigates the effects of visual separation in the context of MMDEs and contrasts these with fixed MDE results, and explores design factors for Mobile MDEs. Our study uses a novel eye-tracking methodology for measuring switches in visual context between displays and identifies that MMDEs offer increased design flexibility over traditional MDEs in terms of visual separation. We discuss these results and identify several design implications.

© All rights reserved Cauchard et al. and/or ACM Press

 

Lochtefeld, Markus (2011): Advanced interaction with mobile projection interfaces. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 43-46. Available online

Through the increasing miniaturization of projection units, their integration into everyday objects is now possible. Even though these so-called pico-projectors are already being integrated into mobile devices such as phones and digital cameras, comparatively little research has been conducted to exploit these devices to their full capabilities. I outline my previous and current work towards an interface design and a privacy framework that will allow mobile projection devices to become part of people's everyday lives. In particular, my work follows two directions: on the one hand, the development of an interface for single-user scenarios, and on the other hand, a framework to cope with privacy issues. This will allow a deeper exploitation of the capabilities of mobile projection units for a variety of everyday tasks.

© All rights reserved Lochtefeld and/or ACM Press

2010
 

Lochtefeld, Markus, Gehring, Sven, Schoning, Johannes and Krüger, Antonio (2010): PINwI: pedestrian indoor navigation without infrastructure. In: Proceedings of the Sixth Nordic Conference on Human-Computer Interaction 2010. pp. 731-734. Available online

Navigation in larger unfamiliar buildings like town halls, airports, shopping malls or other public indoor locations is often difficult for humans. Due to the large amount of infrastructure needed for indoor positioning, only a few navigation services for indoor environments exist. Therefore, in many of these buildings 'YOU-ARE-HERE' (YAH) maps are provided, often located at the entrance or other key places, to facilitate orientation and navigation within the building, but they have the disadvantage of being stationary. In this paper, we try to overcome these problems by presenting PINwI (Pedestrian Indoor Navigation without Infrastructure), an application that allows the user of a mobile camera device with integrated compass and accelerometer to utilize a photo of such an indoor YAH-map to navigate through the corresponding building. Using a dead reckoning approach, we enrich stationary analog YAH-maps with basic location functionality and turn them into a digital and dynamic medium that can help decision making while taking turns or estimating distances.

© All rights reserved Lochtefeld et al. and/or their publisher
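A minimal dead-reckoning sketch in the spirit of the approach above, assuming a north-up map photo, a known pixels-per-metre calibration and a fixed stride length; step detection itself (e.g. from accelerometer peaks) is omitted.

```python
import math

class DeadReckoner:
    """Minimal pedestrian dead reckoning on a photographed YAH map:
    each detected step advances the position by one stride along the
    current compass heading, converted to map pixels via a scale factor."""

    def __init__(self, start_px, pixels_per_metre, stride_m=0.7):
        self.x, self.y = start_px
        self.scale = pixels_per_metre    # assumed known from map calibration
        self.stride = stride_m

    def on_step(self, heading_deg):
        """Call once per detected step; heading_deg is the compass
        heading with 0 = north and 90 = east."""
        d = self.stride * self.scale
        self.x += d * math.sin(math.radians(heading_deg))   # east -> right
        self.y -= d * math.cos(math.radians(heading_deg))   # north -> up (smaller y)
        return round(self.x), round(self.y)

dr = DeadReckoner(start_px=(410, 830), pixels_per_metre=3.2)
for heading in (0, 0, 0, 90, 90):     # three steps north, then two steps east
    print(dr.on_step(heading))
```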

 

Gehring, Sven, Lochtefeld, Markus, Schoning, Johannes, Gorecky, Dominic, Stephan, Peter, Krüger, Antonio and Rohs, Michael (2010): Mobile product customization. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 3463-3468. Available online

Many companies are using the web to enable customers to individually customize their products, ranging from automobiles and bicycles to CDs, cosmetics and shirts. In this paper we present a mobile application for product customization and production within a smart factory. This allows the ad hoc configuration of products at the point of sale (POS). We investigate human factors when customizing products while interacting with them. We focus on the concept of the mobile client that enables this ad hoc modification, but also present the production chain behind our product. We believe that this particular 3D interaction with a product and a mobile device helps to improve customer satisfaction, as it allows customers to customize a product in an easy and intuitive way. From a CHI perspective, an important aspect is that our mobile augmented reality interface can help to match the customer's expectations with the final modified product and allows for natural and intuitive interaction. As a use case of the system, we present the modification of a soap dispenser.

© All rights reserved Gehring et al. and/or their publisher

 

Schoning, Johannes, Lochtefeld, Markus, Rohs, Michael and Krüger, Antonio (2010): Projector Phones: A New Class of Interfaces for Augmented Reality. In International Journal of Mobile Human Computer Interaction, 2 (3) pp. 1-14. Available online

With the miniaturization of projection technology, the integration of tiny projection units into mobile devices is no longer fiction; therefore, such integrated projectors in mobile devices could make mobile projection ubiquitous. These phones will have the ability to project large-scale information onto any surface in the real world, and by doing so, the interaction space of the mobile device can be considerably expanded. In addition, physical objects in the environment can be augmented with additional information, which can support interaction concepts that are not even possible on modern desktop computers today. The authors believe that mobile camera-projector units can form a promising interface type for mobile Augmented Reality (AR) applications; thus, this paper identifies different application classes of such interfaces. In addition, different spatial setups of camera and projector units will have an effect on the possible applications and the interaction space, with a focus on the augmentation of real world objects in the environment. This paper presents two examples of applications for mobile camera-projector units and different hardware prototypes that allow augmentation of real world objects.

© All rights reserved Schoning et al. and/or their publisher

2009
 

Schoning, Johannes, Rohs, Michael, Kratz, Sven, Lochtefeld, Markus and Krüger, Antonio (2009): Map torchlight: a mobile augmented reality camera projector unit. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 3841-3846. Available online

The advantages of paper-based maps have been utilized in the field of mobile Augmented Reality (AR) in the last few years. Traditional paper-based maps provide high-resolution, large-scale information with zero power consumption. There are numerous implementations of magic lens interfaces that combine high-resolution paper maps with dynamic handheld displays. From an HCI perspective, the main challenge of magic lens interfaces is that users have to switch their attention between the magic lens and the information in the background. In this paper, we attempt to overcome this problem by using a lightweight mobile camera projector unit to augment the paper map directly with additional information. The "Map Torchlight" is tracked over a paper map and can precisely highlight points of interest, streets, and areas to give directions or other guidance for interacting with the map.

© All rights reserved Schoning et al. and/or ACM Press
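Highlighting a point of interest on the tracked paper map essentially requires mapping map coordinates into projector pixels; a homography is one common way to express that mapping. The matrix and coordinates below are made up for illustration and are not taken from the paper.

```python
import numpy as np

def apply_homography(H, points):
    """Map 2D points through a 3x3 homography using homogeneous coordinates."""
    pts = np.hstack([np.asarray(points, float), np.ones((len(points), 1))])
    proj = pts @ H.T
    return proj[:, :2] / proj[:, 2:3]

# Hypothetical homography from paper-map millimetres to projector pixels,
# e.g. estimated from tracked corner correspondences (estimation not shown).
H = np.array([[ 2.10, 0.05, 120.0],
              [-0.03, 2.00,  80.0],
              [ 0.00, 0.00,   1.0]])

pois_mm = np.array([[ 42.0, 130.5],    # a point of interest on the map
                    [200.0,  75.0]])   # a street corner
print(apply_homography(H, pois_mm))    # projector pixels to highlight
```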

 

Schoning, Johannes, Krüger, Antonio, Cheverst, Keith, Rohs, Michael, Lochtefeld, Markus and Taher, Faisal (2009): PhotoMap: using spontaneously taken images of public maps for pedestrian navigation tasks on mobile devices. In: Proceedings of 11th Conference on Human-computer interaction with mobile devices and services 2009. p. 14. Available online

In many mid- to large-sized cities public maps are ubiquitous. One can also find a great number of maps in parks or near hiking trails. Public maps help to facilitate orientation and provide special information not only to tourists but also to locals who just want to look up an unfamiliar place while on the go. These maps offer many advantages compared to mobile maps from services like Google Maps Mobile or Nokia Maps. They often show local landmarks and sights that are not shown on standard digital maps. Often these 'You are here' (YAH) maps are adapted to a special use case, e.g. a zoo map or a hiking map of a certain area. Being designed for a specific purpose, these maps are often aesthetically pleasing and therefore more pleasant to use. In this paper we present a novel technique and application called PhotoMap that uses images of 'You are here' maps taken with a GPS-enhanced mobile camera phone as background maps for on-the-fly navigation tasks. We discuss different implementations of the main challenge, namely helping the user to properly georeference the taken image with sufficient accuracy to support pedestrian navigation tasks. We present a study that discusses the suitability of various public maps for this task and we evaluate whether these georeferenced photos can be used for navigation on GPS-enabled devices.

© All rights reserved Schoning et al. and/or their publisher
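One simple way to georeference a photographed map, sketched below under strong assumptions (an unskewed, roughly to-scale, not mirrored map and two GPS fixes matched by hand to pixel positions), is a local equirectangular projection followed by a 2D similarity transform. This is only an illustration of the general idea; the paper discusses several implementations of the georeferencing step.

```python
import math

def georeference(fix_a, fix_b):
    """Build a (lat, lon) -> pixel mapping from two correspondences,
    each given as ((lat, lon), (px, py)). Uses a local equirectangular
    projection plus a 2D similarity transform (scale, rotation, translation)."""
    (lat0, lon0), _ = fix_a

    def to_local(lat, lon):
        # Local metric coordinates as a complex number; the imaginary
        # axis points south so that it matches the image's y-down pixels.
        return complex((lon - lon0) * 111320.0 * math.cos(math.radians(lat0)),
                       -(lat - lat0) * 110540.0)

    za, zb = to_local(*fix_a[0]), to_local(*fix_b[0])
    pa, pb = complex(*fix_a[1]), complex(*fix_b[1])
    s = (pb - pa) / (zb - za)      # combined scale and rotation
    t = pa - s * za                # translation

    def latlon_to_pixel(lat, lon):
        p = s * to_local(lat, lon) + t
        return p.real, p.imag

    return latlon_to_pixel

# Two GPS fixes matched to their pixel positions on the map photo.
to_px = georeference(((49.2580, 7.0450), (120, 640)),
                     ((49.2602, 7.0483), (540, 210)))
print(to_px(49.2590, 7.0466))      # current GPS position -> map pixel
```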

 

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/markus_lochtefeld.html