Publication statistics

Pub. period: 2007-2012
Pub. count: 16
Number of co-authors: 18



Co-authors

Number of publications with his three most frequent co-authors:

Saul Greenberg: 15
Sebastian Boring: 5
Miguel A. Nacenta: 4

 

 

Productive colleagues

Nicolai Marquardt's three most productive colleagues, by number of publications:

Saul Greenberg: 140
Ken Hinckley: 54
Sheelagh Carpendale: 38
 
 
 




Nicolai Marquardt

Personal Homepage:
http://www.nicolaimarquardt.com

Current place of employment:
University of Calgary

Nicolai Marquardt is a Ph.D. candidate at the Interactions Lab, University of Calgary. His dissertation focuses on Proxemic Interactions within ubicomp ecologies.

 

Publications by Nicolai Marquardt (bibliography)

2012
 

Ledo, David, Nacenta, Miguel A., Marquardt, Nicolai, Boring, Sebastian and Greenberg, Saul (2012): The HapticTouch toolkit: enabling exploration of haptic interactions. In: Proceedings of the 6th International Conference on Tangible and Embedded Interaction 2012. pp. 115-122. Available online

In the real world, touch-based interaction relies on haptic feedback (e.g., grasping objects, feeling textures). Unfortunately, such feedback is absent in current tabletop systems. The previously developed Haptic Tabletop Puck (HTP) aims at supporting experimentation with and development of inexpensive tabletop haptic interfaces in a do-it-yourself fashion. The problem is that programming the HTP (and haptics in general) is difficult. To address this problem, we contribute the HapticTouch toolkit, which enables developers to rapidly prototype haptic tabletop applications. Our toolkit is structured in three layers that enable programmers to: (1) directly control the device, (2) create customized combinable haptic behaviors (e.g., softness, oscillation), and (3) use visuals (e.g., shapes, images, buttons) to quickly make use of these behaviors. In our preliminary exploration we found that programmers could use our toolkit to create haptic tabletop applications in a short amount of time.

© All rights reserved Ledo et al. and/or ACM Press
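The toolkit's second layer, customizable and combinable haptic behaviors, can be illustrated with a short sketch. This is a hypothetical rendering of the idea only: the names (Behavior, Softness, Oscillation, Combined) and the normalized 0..1 rod height are assumptions for illustration, not the actual HapticTouch API.

```python
import math

class Behavior:
    """A haptic behavior maps (time, applied pressure) to a rod height."""
    def output(self, t, pressure):
        raise NotImplementedError

class Softness(Behavior):
    """The rod yields under pressure: the softer, the more it gives."""
    def __init__(self, compliance):
        self.compliance = compliance  # 0 (rigid) .. 1 (very soft)
    def output(self, t, pressure):
        return 1.0 - self.compliance * pressure  # normalized rod height

class Oscillation(Behavior):
    """Vibrate the rod with a sine wave."""
    def __init__(self, freq_hz, amplitude):
        self.freq_hz, self.amplitude = freq_hz, amplitude
    def output(self, t, pressure):
        return self.amplitude * math.sin(2.0 * math.pi * self.freq_hz * t)

class Combined(Behavior):
    """Combine behaviors by summing their contributions, clamped to 0..1."""
    def __init__(self, *behaviors):
        self.behaviors = behaviors
    def output(self, t, pressure):
        total = sum(b.output(t, pressure) for b in self.behaviors)
        return max(0.0, min(1.0, total))
```

Under this sketch, combining behaviors reduces to summing their contributions and clamping the result to the rod's physical range.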

 

Boring, Sebastian, Ledo, David, Chen, Xiang 'Anthony', Marquardt, Nicolai and Greenberg, Saul (2012): The fat thumb: using the thumb's contact size for single-handed mobile interaction. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 39-48. Available online

Modern mobile devices allow a rich set of multi-finger interactions that combine modes into a single fluid act, for example, one finger for panning blending into a two-finger pinch gesture for zooming. Such gestures require the use of both hands: one holding the device while the other is interacting. While on the go, however, only one hand may be available to both hold the device and interact with it. This mostly limits interaction to a single touch (i.e., the thumb), forcing users to switch between input modes explicitly. In this paper, we contribute the Fat Thumb interaction technique, which uses the thumb's contact size as a form of simulated pressure. This adds a degree of freedom, which can be used, for example, to integrate panning and zooming into a single interaction. Contact size determines the mode (i.e., panning with a small size, zooming with a large one), while thumb movement performs the selected mode. We discuss nuances of the Fat Thumb based on the thumb's limited operational range and motor skills when that hand holds the device. We compared Fat Thumb to three alternative techniques in a task where people had to precisely pan and zoom to a predefined region on a map, and found that Fat Thumb compared well to the existing techniques.

© All rights reserved Boring et al. and/or ACM Press
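The core mode-selection rule described above (a small contact pans, a large contact zooms) can be sketched as a threshold with a hysteresis band so the mode does not flicker near the boundary. The thresholds and the hysteresis are illustrative assumptions, not values from the paper.

```python
# Sketch of Fat Thumb mode switching from the thumb's contact area.
PAN, ZOOM = "pan", "zoom"

def select_mode(contact_area_mm2, current_mode=PAN,
                enter_zoom=90.0, exit_zoom=70.0):
    """Return the interaction mode for one touch sample.

    A large contact (flattened thumb) enters ZOOM; the contact must then
    shrink below a lower threshold before PAN resumes (hysteresis).
    The thresholds (in mm^2) are made-up illustrative values.
    """
    if current_mode == PAN and contact_area_mm2 >= enter_zoom:
        return ZOOM
    if current_mode == ZOOM and contact_area_mm2 <= exit_zoom:
        return PAN
    return current_mode
```

Thumb movement would then be interpreted under whichever mode this function returns for the current sample.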

 

Chen, Xiang 'Anthony', Marquardt, Nicolai, Boring, Sebastian and Greenberg, Saul (2012): Extending a mobile device's interaction space through body-centric interaction. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 151-160. Available online

Modern mobile devices rely on the screen as a primary input modality. Yet the small screen real estate limits interaction possibilities, motivating researchers to explore alternate input techniques. Within this arena, our goal is to develop Body-Centric Interaction with Mobile Devices: a class of input techniques that allow a person to position and orient her mobile device to navigate and manipulate digital content anchored in the space on and around the body. To achieve this goal, we explore such interaction in a bottom-up path of prototypes and implementations. From our experiences, as well as by examining related work, we discuss and present three recurring themes that characterize how these interactions can be realized. We illustrate how these themes can inform the design of Body-Centric Interactions by applying them to the design of a novel mobile browser application. Overall, we contribute a class of mobile input techniques where interactions are extended beyond the small screen, and are instead driven by a person's movement of the device on and around the body.

© All rights reserved Chen et al. and/or ACM Press

 

Marquardt, Nicolai, Hinckley, Ken and Greenberg, Saul (2012): Cross-device interaction via micro-mobility and f-formations. In: Proceedings of the 2012 ACM Symposium on User Interface Software and Technology 2012. pp. 13-22. Available online

GroupTogether is a system that explores cross-device interaction using two sociological constructs. First, F-formations concern the distance and relative body orientation among multiple users, which indicate when and how people position themselves as a group. Second, micro-mobility describes how people orient and tilt devices towards one another to promote fine-grained sharing during co-present collaboration. We sense these constructs using: (a) a pair of overhead Kinect depth cameras to sense small groups of people, (b) low-power 8GHz band radio modules to establish the identity, presence, and coarse-grained relative locations of devices, and (c) accelerometers to detect tilting of slate devices. The resulting system supports fluid, minimally disruptive techniques for co-located collaboration by leveraging the proxemics of people as well as the proxemics of devices.

© All rights reserved Marquardt et al. and/or ACM Press

2011
 

Marquardt, Nicolai (2011): Proxemic interactions in ubiquitous computing ecologies. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 1033-1036. Available online

An important challenge in ubiquitous computing (ubicomp) is to create techniques that allow people to seamlessly and naturally connect to and interact with the increasing number of digital devices. I propose to leverage the knowledge of people's and devices' spatial relationships -- called proxemics -- in ubicomp interaction design. I introduce my work of proxemic interactions that consider fine-grained information of proxemics to mediate people's interactions with digital devices, such as large digital surfaces or portable personal devices. This research includes the design of development tools for programmers creating proxemic-aware systems, and the design and evaluation of such interactive ubicomp systems.

© All rights reserved Marquardt and/or his/her publisher

 

Marquardt, Nicolai, Kiemer, Johannes, Ledo, David, Boring, Sebastian and Greenberg, Saul (2011): Designing user-, hand-, and handpart-aware tabletop interactions with the TouchID toolkit. In: Proceedings of the 2011 ACM International Conference on Interactive Tabletops and Surfaces 2011. pp. 21-30. Available online

Recent work in multi-touch tabletop interaction introduced many novel techniques that let people manipulate digital content through touch. Yet most only detect touch blobs. This ignores richer interactions that would be possible if we could identify (1) which part of the hand, (2) which side of the hand, and (3) which person is actually touching the surface. Fiduciary-tagged gloves were previously introduced as a simple but reliable technique for providing this information. The problem is that its low-level programming model hinders the way developers could rapidly explore new kinds of user- and handpart-aware interactions. We contribute the TouchID toolkit to solve this problem. It allows rapid prototyping of expressive multi-touch interactions that exploit the aforementioned characteristics of touch input. TouchID provides an easy-to-use event-driven API as well as higher-level tools that facilitate development: a glove configurator to rapidly associate particular glove parts to handparts; and a posture configurator and gesture configurator for registering new hand postures and gestures for the toolkit to recognize. We illustrate TouchID's expressiveness by showing how we developed a suite of techniques that exploits knowledge of which handpart is touching the surface.

© All rights reserved Marquardt et al. and/or ACM Press

 

Marquardt, Nicolai, Diaz-Marino, Robert, Boring, Sebastian and Greenberg, Saul (2011): The proximity toolkit: prototyping proxemic interactions in ubiquitous computing ecologies. In: Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011. pp. 315-326. Available online

People naturally understand and use proxemic relationships (e.g., their distance and orientation towards others) in everyday situations. However, only a few ubiquitous computing (ubicomp) systems interpret such proxemic relationships to mediate interaction (proxemic interaction). A technical problem is that developers find it challenging and tedious to access proxemic information from sensors. Our Proximity Toolkit solves this problem. It simplifies the exploration of interaction techniques by supplying fine-grained proxemic information between people, portable devices, large interactive surfaces, and other non-digital objects in a room-sized environment. The toolkit offers three key features. (1) It facilitates rapid prototyping of proxemic-aware systems by supplying developers with the orientation, distance, motion, identity, and location information between entities. (2) It includes various tools, such as a visual monitoring tool that allows developers to visually observe, record and explore proxemic relationships in 3D space. (3) Its flexible architecture separates sensing hardware from the proxemic data model derived from these sensors, which means that a variety of sensing technologies can be substituted or combined to derive proxemic information. We illustrate the versatility of the toolkit with proxemic-aware systems built by students.

© All rights reserved Marquardt et al. and/or ACM Press
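Two of the proxemic relationships the abstract mentions, distance and relative orientation between tracked entities, can be sketched in a few lines. The function names and the 2D floor-plan simplification are assumptions for illustration only, not the Proximity Toolkit's API.

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y) positions, e.g. in metres."""
    return math.dist(a, b)

def facing(a_pos, a_yaw_deg, b_pos, tolerance_deg=30.0):
    """True if entity A's forward direction points at B within a tolerance.

    a_yaw_deg is A's heading in degrees, measured like atan2 (0 = +x axis).
    """
    bearing = math.degrees(math.atan2(b_pos[1] - a_pos[1],
                                      b_pos[0] - a_pos[0]))
    delta = (bearing - a_yaw_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(delta) <= tolerance_deg
```

A proxemic-aware system would evaluate predicates like these continuously over tracker updates to decide, for example, when a person is close to and oriented towards a large display.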

2010
 

Marquardt, Nicolai, Taylor, Alex S., Villar, Nicolas and Greenberg, Saul (2010): Rethinking RFID: awareness and control for interaction with RFID systems. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 2307-2316. Available online

People now routinely carry radio frequency identification (RFID) tags -- in passports, driver's licenses, credit cards, and other identifying cards -- from which nearby RFID readers can access privacy-sensitive information. The problem is that people are often unaware of security and privacy risks associated with RFID, likely because the technology remains largely invisible and uncontrollable for the individual. To mitigate this problem, we introduce a collection of novel yet simple and inexpensive tag designs. Our tags provide reader awareness, where people get visual, audible, or tactile feedback as tags come into the range of RFID readers. Our tags also provide information control, where people can allow or disallow access to the information stored on the tag by how they touch, orient, move, press or illuminate the tag.

© All rights reserved Marquardt et al. and/or their publisher

 

Marquardt, Nicolai, Taylor, Alex S., Villar, Nicolas and Greenberg, Saul (2010): Visible and controllable RFID tags. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 3057-3062. Available online

Radio frequency identification (RFID) tags containing privacy-sensitive information are increasingly embedded into personal documents (e.g., passports and driver's licenses). The problem is that people are often unaware of the security and privacy risks associated with RFID, likely because the technology remains largely invisible and uncontrollable for the individual. To mitigate this problem, we developed a collection of novel yet simple and inexpensive alternative tag designs to make RFID visible and controllable. This video and demonstration illustrate these designs. For awareness, our tags provide visual, audible, or tactile feedback when in the range of an RFID reader. For control, people can allow or disallow access to the information on the tag by how they touch, orient, move, press, or illuminate the tag (for example, Figure 1 shows a tilt-sensitive RFID tag).

© All rights reserved Marquardt et al. and/or their publisher

 

Greenberg, Saul, Marquardt, Nicolai, Ballendat, Till, Diaz-Marino, Rob and Wang, Miaosen (2010): Proxemic interactions: the new ubicomp? In Interactions, 17 (6) pp. 42-50. Available online

2009
 

Marquardt, Nicolai, Young, James, Sharlin, Ehud and Greenberg, Saul (2009): Situated messages for asynchronous human-robot interaction. In: Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction 2009. pp. 301-302. Available online

An ongoing issue in human robot interaction (HRI) is how people and robots communicate with one another. While there is considerable work in real-time human-robot communication, fairly little has been done in the asynchronous realm. Our approach, which we call situated messages, lets humans and robots asynchronously exchange information by placing physical tokens -- each representing a simple message -- in meaningful physical locations of their shared environment. Using knowledge of the robot's routines, a person can place a message token at a location, where the location is typically relevant to redirecting the robot's behavior at that location. When the robot passes near that location, it detects the message and reacts accordingly. Similarly, robots can themselves place tokens at specific locations for people to read. Thus, situated messages leverage embodied interaction, where token placement exploits the everyday practices and routines of both people and robots. We describe our working prototype, introduce application scenarios, explore message categories and usage patterns, and suggest future directions.

© All rights reserved Marquardt et al. and/or ACM Press

 

Marquardt, Nicolai, Nacenta, Miguel A., Young, James E., Carpendale, Sheelagh, Greenberg, Saul and Sharlin, Ehud (2009): The Haptic Tabletop Puck: Tactile Feedback for Interactive Tabletops. In: Proceedings of Interactive Tabletops and Surfaces, Tabletop 2009, Banff, Canada. Available online

 

Marquardt, Nicolai, Nacenta, Miguel A., Young, James E., Carpendale, Sheelagh, Greenberg, Saul and Sharlin, Ehud (2009): The Haptic Tabletop Puck: tactile feedback for interactive tabletops. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. pp. 85-92. Available online

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same, regardless of what it presents visually. In this paper, we explore how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device -- the Haptic Tabletop Puck -- that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the area of haptic information visualization, haptic graphical interfaces, and computer supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture and malleability of digital objects.

© All rights reserved Marquardt et al. and/or their publisher

 

Marquardt, Nicolai, Nacenta, Miguel A., Young, James E., Carpendale, Sheelagh, Greenberg, Saul and Sharlin, Ehud (2009): The Haptic Tabletop Puck: the video. In: Proceedings of the 2009 ACM International Conference on Interactive Tabletops and Surfaces 2009. p. D2. Available online

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same, regardless of what it presents visually. In this video, we demonstrate how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device -- the Haptic Tabletop Puck -- that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the area of haptic information visualization, haptic graphical interfaces, and computer supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture and malleability of digital objects.

© All rights reserved Marquardt et al. and/or their publisher

 

Marquardt, Nicolai, Gross, Tom, Carpendale, Sheelagh and Greenberg, Saul (2009): Revealing the invisible: visualizing the location and event flow of distributed physical devices. In: Proceedings of the 4th International Conference on Tangible and Embedded Interaction 2009. pp. 41-48. Available online

Distributed physical user interfaces comprise networked sensors, actuators and other devices attached to a variety of computers in different locations. Developing such systems is no easy task. It is hard to track the location and status of component devices, even harder to understand, validate, test and debug how events are transmitted between devices, and hardest yet to see if the overall system behaves correctly. Our Visual Environment Explorer supports developers of these systems by visualizing the location and status of individual and/or aggregate devices. It visualizes the current event flow between devices as they are received and transmitted, as well as the event history. Events are displayable at various levels of detail. The visualization also shows the activity of applications that use these physical devices. The tool is highly interactive: developers can explore system behavior through spatial navigation, zooming, multiple simultaneous views, event filtering, details-on-demand, and time-dependent semantic zooming.

© All rights reserved Marquardt et al. and/or their publisher

2007
 

Marquardt, Nicolai and Greenberg, Saul (2007): Distributed physical interfaces with shared phidgets. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 13-20. Available online

Tangible interfaces are best viewed as an interacting collection of remotely located distributed hardware and software components. The problem is that current physical user interface toolkits do not normally offer distributed systems capabilities, leaving developers with extra burdens such as device discovery and management, low-level hardware access, and networking. Our solution is Shared Phidgets, a toolkit for rapidly prototyping distributed physical interfaces. It offers programmers three ways to access and control remotely located hardware, and the ability to create abstract devices by transforming, aggregating and even simulating device capabilities. Network communication and low-level access to device hardware are handled transparently, regardless of device location.

© All rights reserved Marquardt and Greenberg and/or ACM Press

 
 
 

 
 
 

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/nicolai_marquardt.html