Publication statistics

Publication period: 1982-2012
Publication count: 57
Number of co-authors: 78



Co-authors

Number of publications with his 3 most frequent co-authors:

Orit Shaer: 11
Erin Treacy Solovey: 9
Audrey Girouard: 7

 

 

Productive colleagues

Robert J. K. Jacob's 3 most productive colleagues, by number of publications:

Brad A. Myers: 154
Hiroshi Ishii: 111
Albrecht Schmidt: 110
 
 
 




Robert J. K. Jacob

Ph.D.

Has also published under the names "Rob Jacob" and "Robert Jacob".

Personal Homepage:
http://www.cs.tufts.edu/~jacob/

Current place of employment:
Tufts University

Robert Jacob is a Professor of Computer Science at Tufts University, where his research interests are new interaction modes and techniques and user interface software; his current work focuses on adaptive brain-computer interfaces. He was also a visiting professor at the Université Paris-Sud and at the MIT Media Laboratory. Before coming to Tufts, he was in the Human-Computer Interaction Lab at the Naval Research Laboratory. He received his Ph.D. from Johns Hopkins University, and he is a member of the editorial board of Human-Computer Interaction and the ACM Transactions on Computer-Human Interaction. He was Papers Co-Chair of the CHI 2001 conference, Co-Chair of UIST 2007 and TEI 2010, and Vice-President of ACM SIGCHI. He was elected to the ACM CHI Academy in 2007, an honorary group of the principal leaders of the field of HCI, whose efforts have shaped the discipline and industry, and have led research and innovation in human-computer interaction.


Publications by Robert J. K. Jacob (bibliography)

2012
 

Jacob, Robert J. K. (2012): Engineering next generation interfaces: past and future. In: ACM SIGCHI 2012 Symposium on Engineering Interactive Computing Systems 2012. pp. 1-2.

Tools, abstractions, models, and specification techniques for engineering new generations of interactive systems have tended to follow the development of such systems by about half a generation. In each case, hackers first start experimenting with new types of systems. Then the model developers and tool builders enter as requirements and paradigms solidify. And ultimately the tools and abstractions become so widely accepted and commonplace that they are no longer an open research area. This has happened with conventional graphical user interfaces, and it continues through new generations of interaction styles. It poses a continuing challenge to our community to focus ahead on the tools and techniques needed for each new emerging future interaction style. I will discuss research projects on specifying previous and current genres of "next generation" user interfaces and how each has been matched to its target domain and has followed this pattern. I will also describe a new genre of adaptive, lightweight brain-computer interfaces as an example of the kinds of next generation interfaces that I see emerging. I offer it as a challenge to our community -- to think about tools and techniques for engineering a new generation of interfaces of this sort.

© All rights reserved Jacob and/or ACM Press

2011
 

Solovey, Erin Treacy, Lalooses, Francine, Chauncey, Krysta, Weaver, Douglas, Parasi, Margarita, Scheutz, Matthias, Sassaroli, Angelo, Fantini, Sergio, Schermerhorn, Paul, Girouard, Audrey and Jacob, Robert J. K. (2011): Sensing cognitive multitasking for a brain-based adaptive user interface. In: Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011. pp. 383-392.

Multitasking has become an integral part of work environments, even though people are not well-equipped cognitively to handle numerous concurrent tasks effectively. Systems that support such multitasking may produce better performance and less frustration. However, without understanding the user's internal processes, it is difficult to determine optimal strategies for adapting interfaces, since all multitasking activity is not identical. We describe two experiments leading toward a system that detects cognitive multitasking processes and uses this information as input to an adaptive interface. Using functional near-infrared spectroscopy sensors, we differentiate four cognitive multitasking processes. These states cannot readily be distinguished using behavioral measures such as response time, accuracy, keystrokes or screen contents. We then present our human-robot system as a proof-of-concept that uses real-time cognitive state information as input and adapts in response. This prototype system serves as a platform to study interfaces that enable better task switching, interruption management, and multitasking.

© All rights reserved Solovey et al. and/or their publisher
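The adaptive loop described above, classify a window of fNIRS data and then adapt the interface in response, can be sketched roughly as follows. The state labels, the thresholding "classifier", and the autonomy levels are illustrative stand-ins, not the paper's actual machine-learning pipeline:

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical labels for the four cognitive multitasking states;
# the names are illustrative, not the paper's terminology.
STATES = ("delay", "dual_task", "branching_random", "branching_predictive")

@dataclass
class AdaptiveUI:
    """Toy adaptive loop: classify an fNIRS window, then adapt."""
    autonomy_level: int = 0  # how much work the system takes over

    def classify(self, window):
        # Stand-in for a trained classifier: bucket the mean signal
        # (assumed normalized to [0, 1]) into one of the four states.
        m = mean(window)
        return STATES[min(int(m * len(STATES)), len(STATES) - 1)]

    def adapt(self, state):
        # Offload more work to the system during demanding branching states.
        self.autonomy_level = 2 if state.startswith("branching") else 1
        return self.autonomy_level
```

A real system would replace `classify` with a trained model over preprocessed hemoglobin signals; the point is only the sense-classify-adapt structure.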

2010
 

Girouard, Audrey, Solovey, Erin Treacy, Mandryk, Regan, Tan, Desney, Nacke, Lennart and Jacob, Robert J. K. (2010): Brain, body and bytes: psychophysiological user interaction. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems 2010. pp. 4433-4436.

The human brain and body are prolific signal generators. Recent technologies and computing techniques allow us to measure, process and interpret these signals. We can now infer such things as cognitive and emotional states to create adaptive interactive systems and to gain an understanding of user experience. This workshop brings together researchers from the formerly separated communities of physiological computing (PC), and brain-computer interfaces (BCI) to discuss psychophysiological computing. We set out to identify key research challenges, potential global synergies, and emerging technological contributions.

© All rights reserved Girouard et al. and/or their publisher

2009
 

Horn, Michael S., Solovey, Erin Treacy, Crouser, R. Jordan and Jacob, Robert J. K. (2009): Comparing the use of tangible and graphical programming languages for informal science education. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 975-984.

Much of the work done in the field of tangible interaction has focused on creating tools for learning; however, in many cases, little evidence has been provided that tangible interfaces offer educational benefits compared to more conventional interaction techniques. In this paper, we present a study comparing the use of a tangible and a graphical interface as part of an interactive computer programming and robotics exhibit that we designed for the Boston Museum of Science. In this study, we have collected observations of 260 museum visitors and conducted interviews with 13 family groups. Our results show that visitors found the tangible and the graphical systems equally easy to understand. However, with the tangible interface, visitors were significantly more likely to try the exhibit and significantly more likely to actively participate in groups. In turn, we show that regardless of the condition, involving multiple active participants leads to significantly longer interaction times. Finally, we examine the role of children and adults in each condition and present evidence that children are more actively involved in the tangible condition, an effect that seems to be especially strong for girls.

© All rights reserved Horn et al. and/or ACM Press

 

Hirshfield, Leanne M., Solovey, Erin Treacy, Girouard, Audrey, Kebinger, James, Jacob, Robert J. K., Sassaroli, Angelo and Fantini, Sergio (2009): Brain measurement for usability testing and adaptive interfaces: an example of uncovering syntactic workload with functional near infrared spectroscopy. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 2185-2194.

A well designed user interface (UI) should be transparent, allowing users to focus their mental workload on the task at hand. We hypothesize that the overall mental workload required to perform a task using a computer system is composed of a portion attributable to the difficulty of the underlying task plus a portion attributable to the complexity of operating the user interface. In this regard, we follow Shneiderman's theory of syntactic and semantic components of a UI. We present an experiment protocol that can be used to measure the workload experienced by users in their various cognitive resources while working with a computer. We then describe an experiment where we used the protocol to quantify the syntactic workload of two user interfaces. We use functional near infrared spectroscopy, a new brain imaging technology that is beginning to be used in HCI. We also discuss extensions of our techniques to adaptive interfaces.

© All rights reserved Hirshfield et al. and/or ACM Press
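The additive decomposition the authors hypothesize (overall workload = a task-attributable portion plus a UI-attributable portion) suggests a simple differencing estimate. This sketch is my illustration of that idea, not the paper's experimental protocol:

```python
def syntactic_workload(measured_with_ui, task_only):
    """Estimate the UI-attributable (syntactic) share of mental workload
    by subtracting the task-only (semantic) portion from the workload
    measured while operating the interface. Units are arbitrary."""
    return measured_with_ui - task_only
```

Comparing `syntactic_workload` across two interfaces for the same underlying task would then quantify which UI imposes less operating overhead.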

 

Solovey, Erin Treacy, Girouard, Audrey, Chauncey, Krysta, Hirshfield, Leanne M., Sassaroli, Angelo, Zheng, Feng, Fantini, Sergio and Jacob, Robert J. K. (2009): Using fNIRS brain sensing in realistic HCI settings: experiments and guidelines. In: Proceedings of the ACM Symposium on User Interface Software and Technology 2009. pp. 157-166.

Because functional near-infrared spectroscopy (fNIRS) eases many of the restrictions of other brain sensors, it has potential to open up new possibilities for HCI research. From our experience using fNIRS technology for HCI, we identify several considerations and provide guidelines for using fNIRS in realistic HCI laboratory settings. We empirically examine whether typical human behavior (e.g. head and facial movement) or computer interaction (e.g. keyboard and mouse usage) interfere with brain measurement using fNIRS. Based on the results of our study, we establish which physical behaviors inherent in computer usage interfere with accurate fNIRS sensing of cognitive state information, which can be corrected in data analysis, and which are acceptable. With these findings, we hope to facilitate further adoption of fNIRS brain sensing technology in HCI research.

© All rights reserved Solovey et al. and/or their publisher

 

Shaer, Orit, Jacob, Robert J. K., Green, Mark and Luyten, Kris (2009): Introduction to the special issue on UIDL for next-generation user interfaces. In ACM Transactions on Computer-Human Interaction, 16 (4) p. 16.

 

Shaer, Orit and Jacob, Robert J. K. (2009): A specification paradigm for the design and implementation of tangible user interfaces. In ACM Transactions on Computer-Human Interaction, 16 (4) p. 20.

solving, and design. However, tangible user interfaces are currently considered challenging to design and build. Designers and developers of these interfaces encounter several conceptual, methodological, and technical difficulties. Among others, these challenges include: the lack of appropriate interaction abstractions, the shortcomings of current user interface software tools to address continuous and parallel interactions, as well as the excessive effort required to integrate novel input and output technologies. To address these challenges, we propose a specification paradigm for designing and implementing Tangible User Interfaces (TUIs), that enables TUI developers to specify the structure and behavior of a tangible user interface using high-level constructs which abstract away implementation details. An important benefit of this approach, which is based on User Interface Description Language (UIDL) research, is that these specifications could be automatically or semi-automatically converted into concrete TUI implementations. In addition, such specifications could serve as a common ground for investigating both design and implementation concerns by TUI developers from different disciplines. Thus, the primary contribution of this article is a high-level UIDL that provides developers from different disciplines means for effectively specifying, discussing, and programming a broad range of tangible user interfaces. There are three distinct elements to this contribution: a visual specification technique that is based on Statecharts and Petri nets, an XML-compliant language that extends this visual specification technique, as well as a proof-of-concept prototype of a Tangible User Interface Management System (TUIMS) that semi-automatically translates high-level specifications into a program controlling specific target technologies.

© All rights reserved Shaer and Jacob and/or ACM Press
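In the spirit of the high-level specifications described above, here is a toy sketch in which a TUI's structure (tokens and physical constraints) is declared separately from its statechart-like behavior, and a minimal interpreter steps that behavior. All names are invented for illustration; this is not the article's actual UIDL syntax:

```python
# A toy, declarative TUI spec: structure and behavior are data,
# and a tiny "TUIMS" interprets them. Names are illustrative.
spec = {
    "tokens": ["query_wheel", "result_tile"],
    "constraints": [("query_wheel", "on", "rack")],  # physical relation
    "behavior": {
        # (state, event) -> next state, mimicking a statechart transition
        ("idle", "token_added"): "querying",
        ("querying", "token_removed"): "idle",
    },
}

def step(state, event, behavior):
    """Interpret one transition of the specified behavior; events with
    no matching transition leave the state unchanged."""
    return behavior.get((state, event), state)
```

The benefit mirrored here is the one the article argues for: the spec is plain data, so it could be analyzed, discussed across disciplines, or translated to a concrete implementation.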

2008
 

Jacob, Robert J. K., Girouard, Audrey, Hirshfield, Leanne M., Horn, Michael S., Shaer, Orit, Solovey, Erin Treacy and Zigelbaum, Jamie (2008): Reality-based interaction: a framework for post-WIMP interfaces. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 201-210.

We are in the midst of an explosion of emerging human-computer interaction techniques that redefine our understanding of both computers and interaction. We propose the notion of Reality-Based Interaction (RBI) as a unifying concept that ties together a large subset of these emerging interaction styles. Based on this concept of RBI, we provide a framework that can be used to understand, compare, and relate current paths of recent HCI research as well as to analyze specific interaction designs. We believe that viewing interaction through the lens of RBI provides insights for design and uncovers gaps or opportunities for future research.

© All rights reserved Jacob et al. and/or ACM Press

 

Shaer, Orit, Jacob, Robert J. K., Green, Mark and Luyten, Kris (2008): User interface description languages for next generation user interfaces. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 3949-3952.

In recent years HCI researchers have developed a broad range of new interfaces that diverge from the "window, icon, menu, pointing device" (WIMP) paradigm, employing a variety of novel interaction techniques and devices. Developers of these next generation user interfaces face challenges that are currently not addressed by state-of-the-art user interface software tools. As part of the user interface software community's effort to address these challenges, the concept of a User Interface Description Language (UIDL) reemerged as a promising approach. To date, the UIDL research area has demonstrated extensive development, mainly targeting multi-platform and multi-modal user interfaces. However, many open questions remain regarding the usefulness and effectiveness of UIDLs in supporting the development of next generation interfaces. The aim of this workshop is to bring together both developers of next generation user interfaces and UIDL researchers in an effort to identify key challenges facing this community, to jointly develop new approaches aimed at solving these challenges, and finally to consider future spaces for UIDL research.

© All rights reserved Shaer et al. and/or ACM Press

 

Horn, Michael S., Solovey, Erin Treacy and Jacob, Robert J. K. (2008): Tangible programming and informal science learning: making TUIs work for museums. In: Proceedings of ACM IDC08 Interaction Design and Children 2008. pp. 194-201.

In this paper we describe the design and initial evaluation of a tangible computer programming exhibit for children on display at the Boston Museum of Science. We also discuss five design considerations for tangible interfaces in science museums that guided our development and evaluation. In doing so, we propose the notion of passive tangible interfaces. Passive tangibles serve as a way to address practical issues involving tangible interaction in public settings and as a design strategy to promote reflective thinking. Results from our evaluation indicate that passive tangibles can preserve many of the benefits of tangible interaction for informal science learning while remaining cost-effective and reliable.

© All rights reserved Horn et al. and/or ACM Press

 

Petersen, Marianne Graves, Hallnäs, Lars and Jacob, Robert J. K. (2008): Introduction to special issue on the aesthetics of interaction. In ACM Transactions on Computer-Human Interaction, 15 (3) p. 10.

 

Petersen, Marianne Graves, Hallnäs, Lars and Jacob, Robert J. K. (2008): Introduction to special issue on the aesthetics of interaction. In ACM Transactions on Computer-Human Interaction, 15 (4) p. 14.

 

Bean, Alex, Siddiqi, Sabina, Chowdhury, Anila, Whited, Billy, Shaer, Orit and Jacob, Robert J. K. (2008): Marble track audio manipulator (MTAM): a tangible user interface for audio composition. In: Schmidt, Albrecht, Gellersen, Hans-Werner, Hoven, Elise van den, Mazalek, Ali, Holleis, Paul and Villar, Nicolas (eds.) TEI 2008 - Proceedings of the 2nd International Conference on Tangible and Embedded Interaction February 18-20, 2008, Bonn, Germany. pp. 27-30.

 

Hornecker, Eva, Jacob, Robert J. K., Hummels, Caroline, Ullmer, Brygg, Schmidt, Albrecht, Hoven, Elise van den and Mazalek, Ali (2008): TEI goes on: Tangible and Embedded Interaction. In IEEE Pervasive Computing, 7 (2) pp. 91-96.

published as part of a larger section titled 'Advances in Tangible Interaction and Ubiquitous Virtual Reality'

© All rights reserved Hornecker et al. and/or IEEE Computer Society

Cited in the encyclopedia chapter Tangible Interaction (/encyclopedia/tangible_interaction.html).
 
 

Hong, Dongpyo, Höllerer, Tobias, Haller, Michael, Takemura, Haruo, Cheok, Adrian David, Kim, Gerard Jounghyun, Billinghurst, Mark, Woo, Woontack, Hornecker, Eva, Jacob, Robert J. K., Hummels, Caroline, Ullmer, Brygg, Schmidt, Albrecht, Hoven, Elise van den and Mazalek, Ali (2008): Advances in Tangible Interaction and Ubiquitous Virtual Reality. In IEEE Pervasive Computing, 7 (2) pp. 90-96.

2007
 

Zigelbaum, Jamie, Horn, Michael S., Shaer, Orit and Jacob, Robert J. K. (2007): The tangible video editor: collaborative video editing with active tokens. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 43-46.

In this paper we introduce the Tangible Video Editor (TVE), a multi-user, tangible interface for sequencing digital video. We present a new approach to tabletop interaction by using multiple handheld computers embedded in plastic tokens. Drawing from the rich physical experience of traditional film editing techniques, we designed the TVE to engage multiple users in a collaborative process and encourage the exploration of narrative ideas. We used active tokens to provide a malleable interface, enabling users to organize the interface components in unspecified ways. Our implementation improves upon common projection-based tabletop interfaces in a number of ways, including a design for use beyond dedicated two-dimensional spaces and a naturally scaling screen resolution.

© All rights reserved Zigelbaum et al. and/or ACM Press

 

Horn, Michael S. and Jacob, Robert J. K. (2007): Designing tangible programming languages for classroom use. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 159-162.

This paper describes a new technique for implementing educational programming languages using tangible interface technology. It emphasizes the use of inexpensive and durable parts with no embedded electronics or power supplies. Students create programs in offline settings -- on their desks or on the floor -- and use a portable scanning station to compile their code. We argue that languages created with this approach offer an appealing and practical alternative to text-based and visual languages for classroom use. In this paper we discuss the motivations for our project and describe the design and implementation of two tangible programming languages. We also describe an initial case study with children and outline future research goals.

© All rights reserved Horn and Jacob and/or ACM Press
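The scan-then-compile workflow described above can be pictured roughly as follows: recognized statement blocks arrive as a token sequence and are translated into commands for the target device. The token vocabulary and target commands here are made up for illustration, not the languages from the paper:

```python
# Toy sketch of compiling a scanned sequence of physical program
# blocks. Token names and output commands are invented.
TOKEN_TO_CMD = {
    "BEGIN": [],                 # start-of-program marker emits nothing
    "FORWARD": ["move(1)"],
    "LEFT": ["turn(-90)"],
}

def compile_tokens(tokens):
    """Translate a scanned token sequence into target commands.
    REPEAT_2 is a hypothetical control token repeating the next block."""
    out, i = [], 0
    while i < len(tokens):
        if tokens[i] == "REPEAT_2":
            out += TOKEN_TO_CMD[tokens[i + 1]] * 2
            i += 2
        else:
            out += TOKEN_TO_CMD[tokens[i]]
            i += 1
    return out
```

The design property this mirrors is the one the paper emphasizes: the blocks themselves are passive and inexpensive, and all computation happens at the scanning station.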

 

Girouard, Audrey, Solovey, Erin Treacy, Hirshfield, Leanne M., Ecott, Stacey, Shaer, Orit and Jacob, Robert J. K. (2007): Smart Blocks: a tangible mathematical manipulative. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 183-186.

We created Smart Blocks, an augmented mathematical manipulative that allows users to explore the concepts of volume and surface area of 3-dimensional (3D) objects. This interface supports physical manipulation for exploring spatial relationships and it provides continuous feedback for reinforcing learning. By leveraging the benefits of physicality with the advantages of digital information, this tangible interface provides an engaging environment for learning about surface area and volume of 3D objects.

© All rights reserved Girouard et al. and/or ACM Press

 

Jacob, Robert J. K., Girouard, Audrey, Hirshfield, Leanne M., Horn, Michael, Shaer, Orit, Solovey, Erin Treacy and Zigelbaum, Jamie (2007): CHI 2006: what is the next generation of human-computer interaction?. In Interactions, 14 (3) pp. 53-58.

2006
 

Shaer, Orit and Jacob, Robert J. K. (2006): A Visual Language for Programming Reality-Based Interaction. In: VL-HCC 2006 - IEEE Symposium on Visual Languages and Human-Centric Computing 4-8 September, 2006, Brighton, UK. pp. 244-245.

 

Christou, Georgios, Jacob, Robert J. K. and Cheng, Pericles Leng (2006): Modeling the Task - Leveraging Knowledge-in-the-Head at Design Time. In: Manolopoulos, Yannis, Filipe, Joaquim, Constantopoulos, Panos and Cordeiro, José (eds.) ICEIS 2006 - Proceedings of the Eighth International Conference on Enterprise Information Systems Databases and Information Systems Integration May 23-27, 2006, Paphos, Cyprus. pp. 131-134.

2005
 

Bahna, Eric and Jacob, Robert J. K. (2005): Augmented reading: presenting additional information without penalty. In: Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems 2005. pp. 1909-1912.

We present a new interaction technique for computer-based reading tasks. Our technique leverages users' peripheral vision as a channel for information transfer by using a video projector along with a computer monitor. In our experiment, users of our system acquired significantly more information than did users in the control group. The results indicate that our technique conveys extra information to users nearly "for free," without adversely affecting their comprehension or reading times.

© All rights reserved Bahna and Jacob and/or ACM Press

2004
 

Shiaw, Horn-yeu, Jacob, Robert J. K. and Crane, Gregory (2004): The 3D vase museum: a new approach to context in a digital library. In: JCDL04: Proceedings of the 4th ACM/IEEE-CS Joint Conference on Digital Libraries 2004. pp. 125-134.

We present a new approach to displaying and browsing a digital library collection, a set of Greek vases in the Perseus digital library. Our design takes advantage of three-dimensional graphics to preserve context even while the user focuses in on a single item. In a typical digital library user interface, a user can either get an overview for context or else see a single selected item, sacrificing the context view. In our 3D Vase Museum, the user can navigate seamlessly from a high-level scatterplot-like plan view to a perspective overview of a subset of the collection, to a view of an individual item, to retrieval of data associated with that item, all within the same virtual room and without any mode change or special command. We present this as an example of a solution to the problem of focus-plus-context in information visualization. We developed 3D models from the 2D photographs in the collection and placed them in our 3D virtual room. We evaluated our approach by comparing it to the conventional interface in Perseus using tasks drawn from archaeology courses and found a clear improvement: subjects who used our 3D Vase Museum performed the tasks 33% better and did so nearly three times faster.

© All rights reserved Shiaw et al. and/or ACM Press

 

Shaer, Orit, Leland, Nancy, Calvillo-Gamez, Eduardo H. and Jacob, Robert J. K. (2004): The TAC paradigm: specifying tangible user interfaces. In Personal and Ubiquitous Computing, 8 (5) pp. 359-369.

 

Diep, Ellen and Jacob, Robert J. K. (2004): Visualizing E-mail with a Semantically Zoomable Interface. In: InfoVis 2004 - 10th IEEE Symposium on Information Visualization 10-12 October, 2004, Austin, TX, USA. .

2003
 

Ullmer, Brygg, Ishii, Hiroshi and Jacob, Robert J. K. (2003): Tangible Query Interfaces: Physically Constrained Tokens for Manipulating Database Queries. In: Proceedings of IFIP INTERACT03: Human-Computer Interaction 2003, Zurich, Switzerland. p. 279.

 

Ullmer, Brygg, Ishii, Hiroshi and Jacob, Robert J. K. (2003): Tangible Query Interfaces: Physically Constrained Tokens for Manipulating Database Queries. In: Proceedings of IFIP INTERACT03: Human-Computer Interaction 2003, Zurich, Switzerland. p. 1004?.

 

Calvillo-Gamez, Eduardo H., Leland, Nancy, Shaer, Orit and Jacob, Robert J. K. (2003): The TAC paradigm: unified conceptual framework to represent Tangible User Interfaces. In: Proceedings of the 2003 Latin American conference on Human-computer interaction 2003. pp. 9-15.

This paper introduces a new paradigm for describing Tangible User Interfaces (TUI). The paradigm presented here encompasses existing TUI classifications and proposes a unified conceptual framework with which all TUIs can be understood. In order to show that the new paradigm holds and can be generalized we analyzed several existing TUIs using the proposed paradigm.

© All rights reserved Calvillo-Gamez et al. and/or ACM Press

 

Christou, Georgios and Jacob, Robert J. K. (2003): Evaluating and Comparing Interaction Styles. In: Jorge, Joaquim A., Nunes, Nuno Jardim and Cunha, Joao Falcao e (eds.) DSV-IS 2003 - Interactive Systems. Design, Specification, and Verification, 10th International Workshop June 11-13, 2003, Funchal, Madeira Island, Portugal. pp. 406-409.

2002
 

Jacob, Robert J. K., Ishii, Hiroshi, Pangaro, Gian and Patten, James (2002): A tangible interface for organizing information using a grid. In: Terveen, Loren (ed.) Proceedings of the ACM CHI 2002 Conference on Human Factors in Computing Systems Conference April 20-25, 2002, Minneapolis, Minnesota. pp. 339-346.

 

Chang, Angela, O'Modhrain, Sile, Jacob, Robert J. K., Gunther, Eric and Ishii, Hiroshi (2002): ComTouch: design of a vibrotactile communication device. In: Proceedings of DIS02: Designing Interactive Systems: Processes, Practices, Methods, & Techniques 2002. pp. 312-320.

We describe the design of ComTouch, a device that augments remote voice communication with touch, by converting hand pressure into vibrational intensity between users in real-time. The goal of this work is to enrich inter-personal communication by complementing voice with a tactile channel. We present preliminary user studies performed on 24 people to observe possible uses of the tactile channel when used in conjunction with audio. By recording and examining both audio and tactile data, we found strong relationships between the two communication channels. Our studies show that users developed an encoding system similar to that of Morse code, as well as three original uses: emphasis, mimicry, and turn-taking. We demonstrate the potential of the tactile channel to enhance the existing voice communication channel.

© All rights reserved Chang et al. and/or ACM Press
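The core mapping, hand pressure converted into vibrational intensity, might look like the following sketch. The linear transfer function and the 8-bit intensity range are assumptions for illustration; the abstract does not specify the device's actual mapping:

```python
def pressure_to_vibration(pressure, max_intensity=255):
    """Map a normalized hand-pressure reading in [0, 1] to a vibration
    intensity, clamping out-of-range sensor values. A linear mapping
    is assumed here; a real device might shape the curve perceptually."""
    p = min(max(pressure, 0.0), 1.0)
    return round(p * max_intensity)
```

In a real-time loop, each user's pressure sample would be sent over the voice channel's side link and rendered on the remote partner's actuator.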

 

Deligiannidis, Leonidas and Jacob, Robert J. K. (2002): DLoVe: Using Constraints to Allow Parallel Processing in Multi-User Virtual Reality. In: VR 2002 2002. pp. 49-.

2001
 

Beaudouin-Lafon, Michel and Jacob, Robert J. K. (eds.) Proceedings of the ACM CHI 2001 Human Factors in Computing Systems Conference March 31 - April 5, 2001, Seattle, Washington, USA.

 

Jacob, Robert J. K. (2001): Elements of next-generation non-WIMP user interfaces. In: Graphics Interface 2001 June 7-9, 2001, Ottawa, Ontario, Canada. p. 235.

 

Tanriverdi, Vildan and Jacob, Robert J. K. (2001): VRID: a design model and methodology for developing virtual reality interfaces. In: VRST 2001 2001. pp. 175-182.

 

Jacob, Robert J. K. (2001): Open syntax: improving access for all users. In: Proceedings of the 2001 EC/NSF Workshop on Universal Accessibility of Ubiquitous Computing 2001. pp. 84-89.

Trends in new multi-modal user interfaces and pervasive mobile computing are raising technical problems for building flexible interfaces that can adapt to different communication modes. I hope to show how some aspects of the technical solutions that will be needed for these problems will also help to solve problems of access for elderly users.

© All rights reserved Jacob and/or ACM Press

2000
 

Tanriverdi, Vildan and Jacob, Robert J. K. (2000): Interacting with Eye Movements in Virtual Environments. In: Turner, Thea, Szwillus, Gerd, Czerwinski, Mary, Peterno, Fabio and Pemberton, Steven (eds.) Proceedings of the ACM CHI 2000 Human Factors in Computing Systems Conference April 1-6, 2000, The Hague, The Netherlands. pp. 265-272.

Eye movement-based interaction offers the potential of easy, natural, and fast ways of interacting in virtual environments. However, there is little empirical evidence about the advantages or disadvantages of this approach. We developed a new interaction technique for eye movement interaction in a virtual environment and compared it to more conventional 3-D pointing. We conducted an experiment to compare the performance of the two interaction types, to assess their impact on subjects' spatial memory, and to explore subjects' satisfaction with the two types of interactions. We found that the eye movement-based interaction was faster than pointing, especially for distant objects. However, subjects' ability to recall spatial information was weaker in the eye condition than the pointing one. Subjects reported equal satisfaction with both types of interactions, despite the technology limitations of current eye tracking equipment.

© All rights reserved Tanriverdi and Jacob and/or ACM Press

 

Sibert, Linda E. and Jacob, Robert J. K. (2000): Evaluation of Eye Gaze Interaction. In: Turner, Thea, Szwillus, Gerd, Czerwinski, Mary, Peterno, Fabio and Pemberton, Steven (eds.) Proceedings of the ACM CHI 2000 Human Factors in Computing Systems Conference April 1-6, 2000, The Hague, The Netherlands. pp. 281-288.

Eye gaze interaction can provide a convenient and natural addition to user-computer dialogues. We have previously reported on our interaction techniques using eye gaze [10]. While our techniques seemed useful in demonstration, we now investigate their strengths and weaknesses in a controlled setting. In this paper, we present two experiments that compare an interaction technique we developed for object selection, based on where a person is looking, with the most commonly used selection method using a mouse. We find that our eye gaze interaction technique is faster than selection with a mouse. The results show that our algorithm, which makes use of knowledge about how the eyes behave, preserves the natural quickness of the eye. Eye gaze interaction is a reasonable addition to computer interaction and is convenient in situations where it is important to use the hands for other tasks. It is particularly beneficial for the larger screen workspaces and virtual environments of the future, and it will become increasingly practical as eye tracker technology matures.

© All rights reserved Sibert and Jacob and/or ACM Press
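
The fixation-based selection idea the abstract describes can be illustrated with a minimal dwell-time sketch. Everything below — the `Target` class, the thresholds, the function names — is invented for illustration and is not taken from Sibert and Jacob's actual algorithm, which used knowledge of saccade and fixation behavior rather than a simple dwell count.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A selectable on-screen object, modeled here as a circle."""
    name: str
    x: float
    y: float
    radius: float

    def contains(self, x, y):
        return (x - self.x) ** 2 + (y - self.y) ** 2 <= self.radius ** 2

def select_by_dwell(samples, targets, dwell_ms=150, sample_ms=10):
    """Return the first target fixated for at least dwell_ms, else None.

    samples is a sequence of (x, y) gaze points taken every sample_ms;
    a run of consecutive samples inside one target counts as a fixation.
    """
    needed = dwell_ms // sample_ms          # consecutive samples required
    run_target, run_len = None, 0
    for x, y in samples:
        hit = next((t for t in targets if t.contains(x, y)), None)
        if hit is not None and hit is run_target:
            run_len += 1
            if run_len >= needed:
                return hit
        else:
            run_target, run_len = hit, (1 if hit else 0)
    return None
```

With a 150 ms dwell threshold and 10 ms samples, 15 consecutive samples inside one target trigger selection — a crude stand-in for the fixation recognition that lets gaze selection preserve "the natural quickness of the eye."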

1999
 

Jacob, Robert J. K., Deligiannidis, Leonidas and Morrison, Stephen (1999): A Software Model and Specification Language for Non-WIMP User Interfaces. In ACM Transactions on Computer-Human Interaction, 6 (1) pp. 1-46.

We present a software model and language for describing and programming the fine-grained aspects of interaction in a non-WIMP user interface, such as a virtual environment. Our approach is based on our view that the essence of a non-WIMP dialogue is a set of continuous relationships -- most of which are temporary. The model combines a data-flow or constraint-like component for the continuous relationships with an event-based component for discrete interactions, which can enable or disable individual continuous relationships. To demonstrate our approach, we present the PMIW user interface management system for non-WIMP interactions, a set of examples running under it, a visual editor for our user interface description language, and a discussion of our implementation and our restricted use of constraints for a performance-driven interactive situation. Our goal is to provide a model and language that captures the formal structure of non-WIMP interactions in the way that various previous techniques have captured command-based, textual, and event-based styles and to suggest that using it need not compromise real-time performance.

© All rights reserved Jacob et al. and/or ACM Press
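
The two-component model the abstract describes — continuous relationships that discrete events enable and disable — can be sketched as a tiny frame-loop engine. This is an illustrative toy, not the PMIW implementation; every name in it is invented.

```python
class Dialogue:
    """Toy engine: continuous relationships gated by discrete events."""

    def __init__(self):
        self.links = {}                      # name -> [update_fn, enabled]

    def add_link(self, name, update_fn, enabled=False):
        self.links[name] = [update_fn, enabled]

    def handle_event(self, event):
        # Discrete component: an ("enable"/"disable", name) event flips a link.
        kind, name = event
        if name in self.links:
            self.links[name][1] = (kind == "enable")

    def tick(self, state):
        # Continuous component: run every enabled relationship once per frame.
        for fn, enabled in self.links.values():
            if enabled:
                fn(state)
        return state
```

For example, a "drag" link that copies hand position into object position would run on every frame only between the grab event that enables it and the release event that disables it.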

1998
 

Mynatt, Elizabeth D. and Jacob, Robert J. K. (eds.) (1998): Proceedings of the 11th Annual ACM Symposium on User Interface Software and Technology November 01 - 04, 1998, San Francisco, California, United States.

1997
 

Jacob, Robert J. K., Feiner, Steven K., Foley, James D., Mackinlay, Jock D. and Olsen Jr, Dan R. (1997): UIST'007: Where Will We Be Ten Years From Now?. In: Robertson, George G. and Schmandt, Chris (eds.) Proceedings of the 10th annual ACM symposium on User interface software and technology October 14 - 17, 1997, Banff, Alberta, Canada. pp. 115-118.

The conference this year is the tenth anniversary of UIST. The keynote talk discusses the history of UIST over the last ten years; this panel looks into the future of the field over the next ten. Each of the panelists will describe a scenario for what life will be like when we meet for UIST'07, ten years from now. They will also have a chance to challenge or question each others' scenarios and to participate in open discussion with the audience.

© All rights reserved Jacob et al. and/or ACM Press

1996
 

Jacob, Robert J. K. (1996): Human-Computer Interaction: Input Devices. In ACM Computing Surveys, 28 pp. 177-179.

 Cited in the following chapter:

Fitts's Law: [/encyclopedia/fitts_law.html]


 
 

Jacob, Robert J. K. (1996): A Visual Language for Non-WIMP User Interfaces. In: VL 1996. pp. 231-238.

1995
 

Hix, Deborah, Templeman, James N. and Jacob, Robert J. K. (1995): Pre-Screen Projection: From Concept to Testing of a New Interaction Technique. In: Katz, Irvin R., Mack, Robert L., Marks, Linn, Rosson, Mary Beth and Nielsen, Jakob (eds.) Proceedings of the ACM CHI 95 Human Factors in Computing Systems Conference May 7-11, 1995, Denver, Colorado. pp. 226-233.

Pre-screen projection is a new interaction technique that allows a user to pan and zoom integrally through a scene simply by moving his or her head relative to the screen. The underlying concept is based on real-world visual perception, namely, the fact that a person's view changes as the head moves. Pre-screen projection tracks a user's head in three dimensions and alters the display on the screen relative to head position, giving a natural perspective effect in response to a user's head movements. Specifically, projection of a virtual scene is calculated as if the scene were in front of the screen. As a result, the visible scene displayed on the physical screen expands (zooms) dramatically as a user moves nearer. This is analogous to the real world, where the nearer an object is, the more rapidly it visually expands as a person moves toward it. Further, with pre-screen projection a user can navigate (pan and zoom) around a scene integrally, as one unified activity, rather than performing panning and zooming as separate tasks. This paper describes the technique, the real-world metaphor on which it is conceptually based, issues involved in iterative development of the technique, and our approach to its empirical evaluation in a realistic application testbed.

© All rights reserved Hix et al. and/or ACM Press
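
The geometry behind pre-screen projection reduces to similar triangles: placing the virtual scene plane between the eye and the physical screen makes the on-screen image pan with lateral head motion and magnify rapidly as the head approaches. A sketch under an assumed coordinate system (screen at z = 0, eye at z = eye_z, scene plane at z = scene_z, with eye_z > scene_z > 0); the function name and parameters are invented for illustration:

```python
def pre_screen_project(eye_x, eye_z, point_x, scene_z):
    """Project a virtual scene point onto the screen plane (z = 0).

    A ray from the eye at (eye_x, eye_z) through the scene point
    (point_x, scene_z) is extended to the screen; similar triangles
    give the on-screen x coordinate.
    """
    t = eye_z / (eye_z - scene_z)       # magnification factor
    return eye_x + (point_x - eye_x) * t
```

Note that the magnification factor eye_z / (eye_z - scene_z) grows without bound as the eye approaches the scene plane, which is the dramatic zoom the abstract describes; moving the eye laterally shifts the projection, giving the integral pan.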

1994
 

Jacob, Robert J. K., Sibert, Linda E., McFarlane, Daniel C. and Mullen Jr, M. Preston (1994): Integrality and Separability of Input Devices. In ACM Transactions on Computer-Human Interaction, 1 (1) pp. 3-26.

Current input device taxonomies and other frameworks typically emphasize the mechanical structure of input devices. We suggest that selecting an appropriate input device for an interactive task requires looking beyond the physical structure of devices to the deeper perceptual structure of the task, the device, and the interrelationship between the perceptual structure of the task and the control properties of the device. We affirm that perception is key to understanding performance of multidimensional input devices on multidimensional tasks. We have therefore extended the theory of processing of perceptual structure to graphical interactive tasks and to the control structure of input devices. This allows us to predict task and device combinations that lead to better performance and hypothesize that performance is improved when the perceptual structure of the task matches the control structure of the device. We conducted an experiment in which subjects performed two tasks with different perceptual structures, using two input devices with correspondingly different control structures, a three-dimensional tracker and a mouse. We analyzed both speed and accuracy, as well as the trajectories generated by subjects as they used the unconstrained three-dimensional tracker to perform each task. The results support our hypothesis and confirm the importance of matching the perceptual structure of the task and the control structure of the input device.

© All rights reserved Jacob et al. and/or ACM Press

 

Durbin, Jim, Jacob, Robert J. K. and Hinckley, Ken (1994): Laying the Foundation for the Information Super Highway: Human-Computer Interaction Research. In ACM SIGCHI Bulletin, 26 (4) pp. 56-58.

 
 
1993
 

Jacob, Robert J. K., Leggett, John, Myers, Brad A. and Pausch, Randy (1993): Interaction Styles and Input/Output Devices. In Behaviour and Information Technology, 12 (2) pp. 69-79.

1992
 

Jacob, Robert J. K. and Sibert, Linda E. (1992): The Perceptual Structure of Multidimensional Input Device Selection. In: Bauersfeld, Penny, Bennett, John and Lynch, Gene (eds.) Proceedings of the ACM CHI 92 Human Factors in Computing Systems Conference June 3-7, 1992, Monterey, California. pp. 211-218.

Concepts such as the logical device, taxonomies, and other descriptive frameworks have improved understanding of input devices but ignored or else treated informally their pragmatic qualities, which are fundamental to selection of input devices for tasks. We seek the greater leverage of a predictive theoretical framework by basing our investigation of three-dimensional vs. two-dimensional input devices on Garner's theory of processing of perceptual structure in multidimensional space. We hypothesize that perceptual structure provides a key to understanding performance of multidimensional input devices on multidimensional tasks. Two three-dimensional tasks may seem equivalent, but if they involve different types of perceptual spaces, they should be assigned correspondingly different input devices. Our experiment supports this hypothesis and thus both indicates when to use three-dimensional input devices and gives credence to our theoretical basis for this indication.

© All rights reserved Jacob and Sibert and/or ACM Press

 Cited in the following chapter:

3D User Interfaces: [/encyclopedia/3d_user_interfaces.html]


 
1991
 

Jacob, Robert J. K. (1991): The Use of Eye Movements in Human-Computer Interaction Techniques: What You Look At is What You Get. In ACM Transactions on Information Systems, 9 (2) pp. 152-169.

In seeking hitherto-unused methods by which users and computers can communicate, we investigate the usefulness of eye movements as a fast and convenient auxiliary user-to-computer communication mode. The barrier to exploiting this medium has not been eye-tracking technology but the study of interaction techniques that incorporate eye movements into the user-computer dialogue in a natural and unobtrusive way. This paper discusses some of the human factors and technical considerations that arise in trying to use eye movements as an input medium, describes our approach and the first eye movement-based interaction techniques that we have devised and implemented in our laboratory, and reports our experiences and observations on them.

© All rights reserved Jacob and/or ACM Press

1990
 

Jacob, Robert J. K. (1990): What You Look At is What You Get: Eye Movement-Based Interaction Techniques. In: Carrasco, Jane and Whiteside, John (eds.) Proceedings of the ACM CHI 90 Human Factors in Computing Systems Conference 1990, Seattle, Washington, USA. pp. 11-18.

In seeking hitherto-unused methods by which users and computers can communicate, we investigate the usefulness of eye movements as a fast and convenient auxiliary user-to-computer communication mode. The barrier to exploiting this medium has not been eye-tracking technology but the study of interaction techniques that incorporate eye movements into the user-computer dialogue in a natural and unobtrusive way. This paper discusses some of the human factors and technical considerations that arise in trying to use eye movements as an input medium, describes our approach and the first eye movement-based interaction techniques that we have devised and implemented in our laboratory, and reports our experiences and observations on them.

© All rights reserved Jacob and/or ACM Press

 Cited in the following chapter:

Fitts's Law: [/encyclopedia/fitts_law.html]


 
1986
 

Jacob, Robert J. K. (1986): A Specification Language for Direct-Manipulation User Interfaces. In ACM Transactions on Graphics, 5 (4) pp. 283-317.

A direct-manipulation user interface presents a set of visual representations on a display and a repertoire of manipulations that can be performed on any of them. Such representations might include screen buttons, scroll bars, spreadsheet cells, or flowchart boxes. Interaction techniques of this kind were first seen in interactive graphics systems; they are now proving effective in user interfaces for applications that are not inherently graphical. Although they are often easy to learn and use, these interfaces are also typically difficult to specify and program clearly. Examination of direct-manipulation interfaces reveals that they have a coroutine-like structure and, despite their surface appearance, a peculiar, highly moded dialogue. This paper introduces a specification technique for direct-manipulation interfaces based on these observations. In it, each locus of dialogue is described as a separate object with a single-thread state diagram, which can be suspended and resumed, but retains state. The objects are then combined to define the overall user interface as a set of coroutines, rather than inappropriately as a single highly regular state transition diagram. An inheritance mechanism for the interaction objects is provided to avoid repetitiveness in the specifications. A prototype implementation of a user-interface management system based on this approach is described, and example specifications are given.

© All rights reserved Jacob and/or ACM Press
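
The coroutine structure the abstract identifies maps naturally onto generator coroutines: each interaction object suspends awaiting input events and retains its own state between resumptions. The sketch below is a loose illustration of that idea (one suspendable, state-retaining object per locus of dialogue), not Jacob's specification language; every name in it is invented.

```python
def button(name):
    """One locus of dialogue: a screen button with idle/armed states."""
    fired = 0
    while True:
        ev = yield f"{name}: idle (fired {fired}x)"
        if ev == "press":
            ev = yield f"{name}: armed"
            if ev == "release":
                fired += 1   # press followed by release triggers the button

def run(coros, events):
    """Dispatch each event to its target object; the others stay suspended."""
    status = {n: c.send(None) for n, c in coros.items()}   # prime coroutines
    for target, ev in events:
        status[target] = coros[target].send(ev)
    return status
```

Each `yield` is a suspension point: the object's local variables (here, `fired`) survive across resumptions, which is exactly the single-thread, state-retaining behavior the specification technique relies on.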

1985
 

Jacob, Robert J. K. (1985): A State Transition Diagram Language for Visual Programming. In IEEE Computer, 18 (8) pp. 51-59.

1983
 

Jacob, Robert J. K. (1983): Executable Specifications for a Human-Computer Interface. In: Smith, Raoul N., Pew, Richard W. and Janda, Ann (eds.) Proceedings of the ACM CHI 83 Human Factors in Computing Systems Conference December 12-15, 1983, Boston, Massachusetts, United States. pp. 28-34.

It is useful to be able to specify a proposed human-computer interface formally before building it, particularly if a mockup suitable for testing can be obtained directly from the specification. A specification technique for user interfaces, based on state transition diagrams, is introduced and then demonstrated for a secure message system application. An interpreter that executes the resulting specification is then described. Some problems that arise in specifying a user interface are addressed by particular features of the technique: To reduce the complexity of the developer's task, a user interface is divided into the semantic, syntactic, and lexical levels, and a separate executable specification is provided for each. A process of stepwise refinement of the syntactic specification, leading from an informal specification to an executable one, is also presented. Since the state diagram notation is based on a nondeterministic model, constraints necessary to realize the system with a deterministic interpreter are given.

© All rights reserved Jacob and/or ACM Press
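
The core of such an executable specification can be illustrated as a transition-table interpreter. The table format and function below are invented for illustration and are far simpler than the multi-level (semantic/syntactic/lexical) technique the paper presents; the rejection of unmatched tokens stands in for the determinism constraint the abstract mentions.

```python
def run_spec(spec, start, tokens):
    """Execute a user-interface specification given as a transition table.

    spec maps (state, token) -> (next_state, output). Because a dict can
    hold at most one entry per (state, token) pair, the interpreter is
    deterministic; tokens with no matching transition are rejected.
    """
    state, outputs = start, []
    for tok in tokens:
        if (state, tok) not in spec:
            raise ValueError(f"no transition from {state!r} on {tok!r}")
        state, out = spec[(state, tok)]
        outputs.append(out)
    return state, outputs
```

A two-state login dialogue, for instance, would map ("start", "name") to ("have_name", "prompt password") and ("have_name", "password") to ("done", "welcome"), and the interpreter then doubles as a testable mockup of that dialogue.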

 

Jacob, Robert J. K. (1983): Using Formal Specifications in the Design of a Human-Computer Interface. In Communications of the ACM, 26 (4) pp. 259-264.

1982
 

Jacob, Robert J. K. (1982): Using Formal Specifications in the Design of a Human-Computer Interface. In: Nichols, Jean A. and Schneider, Michael L. (eds.) Proceedings of the SIGCHI conference on Human factors in computing systems March 15-17, 1982, Gaithersburg, Maryland, United States. pp. 315-321.

 


Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/robert_j__k__jacob.html
