Publication statistics

Publication period: 2005-2012
Publication count: 15
Number of co-authors: 12



Co-authors

Number of publications with Jaime Ruiz's 3 most frequent co-authors:

Edward Lank: 13
Yang Li: 4
Matei Negulescu: 3

 

 

Productive colleagues

Jaime Ruiz's 3 most productive colleagues, by number of publications:

Yang Li: 30
Edward Lank: 26
William B. Cowan: 19
 
 
 


Jaime Ruiz

 

Publications by Jaime Ruiz (bibliography)

2012
 

Negulescu, Matei, Ruiz, Jaime, Li, Yang and Lank, Edward (2012): Tap, swipe, or move: attentional demands for distracted smartphone input. In: Proceedings of the 2012 International Conference on Advanced Visual Interfaces 2012. pp. 173-180. Available online

Smartphones are frequently used in environments where the user is distracted by another task, for example by walking or by driving. While the typical interface for smartphones involves hardware and software buttons and surface gestures, researchers have recently posited that, for distracted environments, benefits may exist in using motion gestures to execute commands. In this paper, we examine the relative cognitive demands of motion gestures and surface taps and gestures in two specific distracted scenarios: a walking scenario, and an eyes-free seated scenario. We show, first, that there is no significant difference in reaction time for motion gestures, taps, or surface gestures on smartphones. We further show that motion gestures result in significantly less time looking at the smartphone during walking than does tapping on the screen, even with interfaces optimized for eyes-free input. Taken together, these results show that, despite somewhat lower throughput, there may be benefits to making use of motion gestures as a modality for distracted input on smartphones.

© All rights reserved Negulescu et al. and/or ACM Press

 

Azad, Alec, Ruiz, Jaime, Vogel, Daniel, Hancock, Mark and Lank, Edward (2012): Territoriality and behaviour on and around large vertical publicly-shared displays. In: Proceedings of DIS12 Designing Interactive Systems 2012. pp. 468-477. Available online

We investigate behaviours on, and around, large vertical displays during concurrent usage. Using an observational field study, we identify fundamental patterns of how people use existing public displays: their orientation, positioning, group identification, and behaviour within and between social groups just-before, during, and just-after usage. These results are then used to motivate a controlled experiment where two individuals or two pairs of individuals complete tasks concurrently on a simulated large vertical display. Results from our controlled study demonstrate that vertical surface territories are similar to those found in horizontal tabletops in function, but their definitions and social conventions are different. In addition, the nature of use-while-standing systems results in more complex and dynamic physical territories around the display. We show that the anthropological notion of personal space must be slightly refined for application to vertical displays.

© All rights reserved Azad et al. and/or ACM Press

 

Negulescu, Matei, Ruiz, Jaime and Lank, Edward (2012): A recognition safety net: bi-level threshold recognition for mobile motion gestures. In: Proceedings of the 14th Conference on Human-computer interaction with mobile devices and services 2012. pp. 147-150. Available online

Designers of motion gestures for mobile devices face the difficult challenge of building a recognizer that can separate gestural input from motion noise. A threshold value is often used to classify motion and effectively balances the rates of false positives and false negatives. We present a bi-level threshold recognition technique designed to lower the rate of recognition failures by accepting either a tightly thresholded gesture or two consecutive possible gestures recognized by a relaxed model. An evaluation demonstrates that the technique can aid in recognition for users who have trouble performing motion gestures. Lastly, we suggest the use of bi-level thresholding to scaffold the learning of gestures.

© All rights reserved Negulescu et al. and/or ACM Press
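The bi-level acceptance rule described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the threshold values, gesture labels, and recognizer interface are all hypothetical:

```python
# Illustrative sketch of bi-level threshold recognition: accept a
# gesture when its recognizer score clears a strict threshold, OR when
# two consecutive attempts at the same gesture clear a relaxed one.
# Thresholds below are hypothetical, chosen only for illustration.

TIGHT = 0.90    # strict threshold: accept immediately
RELAXED = 0.70  # relaxed threshold: accept on two consecutive matches

class BiLevelRecognizer:
    def __init__(self, tight=TIGHT, relaxed=RELAXED):
        self.tight = tight
        self.relaxed = relaxed
        self.pending = None  # gesture that cleared only the relaxed bar

    def feed(self, gesture, score):
        """Return the accepted gesture, or None if nothing is accepted yet."""
        if score >= self.tight:
            self.pending = None
            return gesture               # tight match: accept outright
        if score >= self.relaxed:
            if self.pending == gesture:
                self.pending = None
                return gesture           # two consecutive relaxed matches
            self.pending = gesture       # remember the first relaxed match
            return None
        self.pending = None              # motion noise: reset
        return None
```

A user whose gesture only ever clears the relaxed bar can still trigger the command by repeating it, which is the "safety net" the title refers to.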

2011
 

Ruiz, Jaime, Li, Yang and Lank, Edward (2011): User-defined motion gestures for mobile interaction. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 197-206. Available online

Modern smartphones contain sophisticated sensors to monitor three-dimensional movement of the device. These sensors permit devices to recognize motion gestures -- deliberate movements of the device by end-users to invoke commands. However, little is known about best-practices in motion gesture design for the mobile computing paradigm. To address this issue, we present the results of a guessability study that elicits end-user motion gestures to invoke commands on a smartphone device. We demonstrate that consensus exists among our participants on parameters of movement and on mappings of motion gestures onto commands. We use this consensus to develop a taxonomy for motion gestures and to specify an end-user inspired motion gesture set. We highlight the implications of this work to the design of smartphone applications and hardware. Finally, we argue that our results influence best practices in design for all gestural interfaces.

© All rights reserved Ruiz et al. and/or their publisher

 

Ruiz, Jaime and Li, Yang (2011): DoubleFlip: a motion gesture delimiter for mobile interaction. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 2717-2720. Available online

To make motion gestures more widely adopted on mobile devices it is important that devices be able to distinguish between motion intended for mobile interaction and everyday motion. In this paper, we present DoubleFlip, a unique motion gesture designed as an input delimiter for mobile motion-based interaction. The DoubleFlip gesture is distinct from regular motion of a mobile device. Based on a collection of 2,100 hours of motion data captured from 99 users, we found that our DoubleFlip recognizer is extremely resistant to false positive conditions, while still achieving a high recognition rate. Since DoubleFlip is easy to perform and unlikely to be accidentally invoked, it provides an always-active input event for mobile interaction.

© All rights reserved Ruiz and Li and/or their publisher

 

Negulescu, Matei, Ruiz, Jaime and Lank, Edward (2011): ZoomPointing revisited: supporting mixed-resolution gesturing on interactive surfaces. In: Proceedings of the 2011 ACM International Conference on Interactive Tabletops and Surfaces 2011. pp. 150-153. Available online

In this work, we explore the design of multi-resolution input on multi-touch devices. We devised a refined zooming technique named Offset, where the target is set at a location offset from the non-dominant hand while the dominant hand controls the direction and magnitude of the expansion. Additionally, we explored the use of non-persistent transformations of the view in our design. A think-aloud study that compared our design to a bimanual widget interaction and the classic pinch-based interaction with a freeform drawing task suggests that Offset offers benefits in terms of performance and degree of control. As well, for the drawing tasks, the transient nature of view transformations appears to impact not only performance, but workflow, focus of interaction, and subjective quality of results by providing a constant overview of the user's task.

© All rights reserved Negulescu et al. and/or ACM Press

2010
 

Ruiz, Jaime and Lank, Edward (2010): Speeding pointing in tiled widgets: understanding the effects of target expansion and misprediction. In: Proceedings of the 2010 International Conference on Intelligent User Interfaces 2010. pp. 229-238. Available online

Target expansion is a pointing facilitation technique where the user's target, typically an interface widget, is dynamically enlarged to speed pointing in interfaces. However, with densely packed (tiled) arrangements of widgets, interfaces cannot expand all potential targets; they must, instead, predict the user's desired target. As a result, mispredictions will occur which may disrupt the pointing task. In this paper, we present a model describing the cost/benefit of expanding multiple targets using the probability distribution of a given predictor. We demonstrate how the model can be used to infer the accuracy required by target prediction techniques. The results of this work are another step toward pointing facilitation techniques that allow users to outperform Fitts' Law in realistic pointing tasks.

© All rights reserved Ruiz and Lank and/or their publisher

 

Ruiz, Jaime and Li, Yang (2010): DoubleFlip: a motion gesture delimiter for interaction. In: Proceedings of the 2010 ACM Symposium on User Interface Software and Technology 2010. pp. 449-450. Available online

In order to use motion gestures with mobile devices it is imperative that the device be able to distinguish between input motion and everyday motion. In this abstract we present DoubleFlip, a unique motion gesture designed to act as an input delimiter for mobile motion gestures. We demonstrate that the DoubleFlip gesture is extremely resistant to false positive conditions, while still achieving high recognition accuracy. Since DoubleFlip is easy to perform and less likely to be accidentally invoked, it provides an always-active input event for mobile interaction.

© All rights reserved Ruiz and Li and/or their publisher

2008
 

Ruiz, Jaime, Bunt, Andrea and Lank, Edward (2008): A Model of Non-Preferred Hand Mode Switching. In: Proceedings of the 2008 Conference on Graphics Interface May 28-30, 2008, Windsor, Ontario, Canada. pp. 49-56.

Effective mode-switching techniques provide users of tablet interfaces with access to a rich set of behaviors. While many researchers have studied the relative performance of mode-switching techniques in these interfaces, these metrics tell us little about the behavior of one technique in the absence of a competitor. Differing from past comparison-based research, this paper describes a temporal model of the behavior of a common mode switching technique, non-preferred hand mode switching. Using the Hick-Hyman Law, we claim that the asymptotic cost of adding additional non-preferred hand modes to an interface is a logarithmic function of the number of modes. We validate the model experimentally, and show a strong correlation between experimental data and values predicted by the model. Implications of this research for the design of mode-based interfaces are highlighted.

© All rights reserved Ruiz et al. and/or their publisher
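The Hick-Hyman Law invoked in the abstract models choice reaction time as a linear function of the information content of the decision: RT = a + b·log2(n) for n equally likely alternatives. A minimal sketch of the logarithmic cost claim, with purely illustrative coefficients (the values of a and b below are assumptions, not the paper's fitted parameters):

```python
import math

# Hick-Hyman Law: choice reaction time grows with the information
# content of the decision, RT = a + b * log2(n) for n equally likely
# modes. Coefficients a and b here are hypothetical placeholders.

def hick_hyman_rt(n_modes, a=0.2, b=0.15):
    """Predicted choice reaction time (seconds) for n equally likely modes."""
    return a + b * math.log2(n_modes)

for n in (2, 4, 8):
    print(f"{n} modes -> {hick_hyman_rt(n):.3f} s")
```

Note that each doubling of the mode count adds the same fixed increment b to the predicted time, which is the asymptotically logarithmic cost the abstract describes.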

 

Ruiz, Jaime, Tausky, David, Bunt, Andrea, Lank, Edward and Mann, Richard (2008): Analyzing the Kinematics of Bivariate Pointing. In: Proceedings of the 2008 Conference on Graphics Interface May 28-30, 2008, Windsor, Ontario, Canada. pp. 251-258.

Despite the importance of pointing-device movement to efficiency in interfaces, little is known about how target shape impacts speed, acceleration, and other kinematic properties of motion. In this paper, we examine which kinematic characteristics of motion are impacted by amplitude and directional target constraints in Fitts-style pointing tasks. Our results show that instantaneous speed, acceleration, and jerk are most affected by target constraint. Results also show that the effects of target constraint are concentrated in the first 70% of movement distance. We demonstrate that we can discriminate between the two classes of target constraint using Machine Learning with accuracy greater than chance. Finally, we highlight future work in designing techniques that make use of target constraint to improve pointing efficiency in computer interfaces.

© All rights reserved Ruiz et al. and/or their publisher

2007
 

Lank, Edward, Cheng, Yi-Chun Nikko and Ruiz, Jaime (2007): Endpoint prediction using motion kinematics. In: Proceedings of ACM CHI 2007 Conference on Human Factors in Computing Systems 2007. pp. 637-646. Available online

Recently proposed novel interaction techniques such as cursor jumping [1] and target expansion for tiled arrangements [13] are predicated on an ability to effectively estimate the endpoint of an input gesture prior to its completion. However, current endpoint estimation techniques lack the precision to make these interaction techniques possible. To address a recognized lack of effective endpoint prediction mechanisms, we propose a new technique for endpoint prediction that applies established laws of motion kinematics in a novel way to the identification of motion endpoint. The technique derives a model of speed over distance that permits extrapolation. We verify our model experimentally using stylus targeting tasks, and demonstrate that our endpoint prediction is almost twice as accurate as the previously tested technique [13] at points more than twice as distant from motion endpoint.

© All rights reserved Lank et al. and/or ACM Press
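The idea of extrapolating a speed-over-distance model to a motion endpoint can be illustrated with a simple sketch. The quadratic speed profile and the synthetic data below are assumptions chosen for illustration; they are not the paper's actual kinematic model:

```python
import numpy as np

# Illustrative sketch of kinematic endpoint extrapolation (not the
# paper's exact model): fit a quadratic speed-vs-distance profile to
# the observed portion of a movement, then predict the endpoint as the
# distance at which the fitted speed returns to zero.

def predict_endpoint(distances, speeds):
    a, b, c = np.polyfit(distances, speeds, 2)  # speed = a*d^2 + b*d + c
    roots = np.roots([a, b, c])
    real = roots[np.isreal(roots)].real
    ahead = real[real > max(distances)]         # roots beyond observed motion
    return ahead.min() if ahead.size else None

# Synthetic bell-shaped speed profile with a true endpoint at d = 10;
# only the first 60% of the movement is observed.
d = np.linspace(0, 6, 20)
v = d * (10 - d)                     # zero speed at d = 0 and d = 10
print(predict_endpoint(d, v))        # ≈ 10.0
```

The extrapolation works here because the synthetic profile is exactly quadratic; real stylus motion is noisier, which is why prediction accuracy degrades with distance from the endpoint.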

 

Ruiz, Jaime and Lank, Edward (2007): A study on the scalability of non-preferred hand mode manipulation. In: Massaro, Dominic W., Takeda, Kazuya, Roy, Deb and Potamianos, Alexandros (eds.) Proceedings of the 9th International Conference on Multimodal Interfaces - ICMI 2007 November 12-15, 2007, Nagoya, Aichi, Japan. pp. 170-177. Available online

 
In pen-tablet input devices, modes allow overloading of the electronic stylus. In the case of two modes, switching modes with the non-preferred hand is most effective [12]. Further, allowing temporal overlap of mode switch and pen action boosts speed [11]. We examine the effect of increasing the number of interface modes accessible via non-preferred hand mode switching on task performance in pen-tablet interfaces. We demonstrate that the temporal benefit of overlapping mode-selection and pen action for the two mode case is preserved as the number of modes increases. This benefit is the result of both concurrent action of the hands, and reduced planning time for the overall task. Finally, while allowing bimanual overlap is still faster, it takes longer to switch modes as the number of modes increases. Improved understanding of the temporal costs presented assists in the design of pen-tablet interfaces with larger sets of interface modes.

© All rights reserved Ruiz and Lank and/or their publisher

2006
 

Lank, Edward, Ruiz, Jaime and Cowan, William B. (2006): Concurrent bimanual stylus interaction: a study of non-preferred hand mode manipulation. In: Proceedings of the 2006 Conference on Graphics Interface 2006. pp. 17-24. Available online

Pen/Stylus input systems are constrained by the limited input capacity of the electronic stylus. Stylus modes, which allow multiple interpretations of the same input, lift capacity limits, but confront the user with possible cognitive and motor costs associated with switching modes. This paper examines the costs of bimanual mode switching, in which the non-preferred hand performs actions that change modes while the preferred hand executes gestures that provide input. We examine three variants to control mode of a stylus gesture: pre-gesture mediation, post-gesture mediation, and mediation that occurs concurrently with stylus gesturing. The results show that concurrent mode-switching is faster than the alternatives, and, in one trial, marginally outperforms the control condition, un-moded drawing. These results demonstrate an instance in which suitably designed mode-switching offers minimal cost to the user. The implications of this result for the design of stylus input systems are highlighted.

© All rights reserved Lank et al. and/or Canadian Information Processing Society

2005
 

Roberts, John, Ruiz, Jaime and Lank, Edward (2005): Making Favorites Useful. In: Hamza, M.H. (ed.) Proceedings of the IASTED International Conference on Human-Computer Interaction November 14-16, 2005, Phoenix, USA. pp. 96-101.

In this paper, we describe our work in adding functionality to the standard favorites, or bookmarks, list typically available in modern web browsers. Our goal is to increase the rate at which those browsing the web can re-find information they have previously bookmarked for later perusal. To this end, we have developed a favorites manager, deployed in a simple web browsing application, that introduces favorites management features beyond those found in typical web browsers, including additional metadata, automated organization, and search tools. We present the details of this application, along with the results of a user trial measuring the usability of the favorites management system we developed.

© All rights reserved Roberts et al. and/or Acta Press

 

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/jaime_ruiz.html