Publication statistics

Publication period: 2004-2010
Publication count: 9
Number of co-authors: 10


Number of publications with 3 favourite co-authors:

Jeremy Birnholtz:
Mike Wu:
Khai N. Truong:



Productive colleagues

Abhishek Ranjan's 3 most productive colleagues in number of publications:

Ravin Balakrishnan: 108
Khai N. Truong: 45
Mark Chignell: 41





Abhishek Ranjan

Picture of Abhishek Ranjan.
Personal Homepage:

Current place of employment:
University of Toronto

Abhishek Ranjan is currently a PhD student in the Dynamic Graphics Project (DGP) labs at the University of Toronto. His research interests include the design and study of systems to support video-mediated collaboration, audio browsing interfaces and the study of aural properties, interaction with large-scale displays, and computer-vision-based interfaces.

Abhishek holds an MS degree in Computer Science from the University of Toronto. He obtained his BTech (Bachelor of Technology) in Computer Science and Engineering from the Indian Institute of Technology, Bombay.


Publications by Abhishek Ranjan (bibliography)


Ranjan, Abhishek, Birnholtz, Jeremy, Balakrishnan, Ravin and Lee, Dana (2010): Automatic camera control using unobtrusive vision and audio tracking. In: Proceedings of Graphics Interface 2010. pp. 47-54.

While video can be useful for remotely attending and archiving meetings, the video itself is often dull and difficult to watch. One key reason for this is that, except in very high-end systems, little attention has been paid to the production quality of the video being captured. The video stream from a meeting often lacks detail and camera shots rarely change unless a person is tasked with operating the camera. This stands in stark contrast to live television, where a professional director creates engaging video by juggling multiple cameras to provide a variety of interesting views. In this paper, we applied lessons from television production to the problem of using automated camera control and selection to improve the production quality of meeting video. In an extensible and robust approach, our system uses off-the-shelf cameras and microphones to unobtrusively track the location and activity of meeting participants, control three cameras, and cut between these to create video with a variety of shots and views, in real-time. Evaluation by users and independent coders suggests promising initial results and directions for future work.

© All rights reserved Ranjan et al. and/or their publisher


Birnholtz, Jeremy, Ranjan, Abhishek and Balakrishnan, Ravin (2010): Providing Dynamic Visual Information for Collaborative Tasks: Experiments With Automatic Camera Control. In: Human-Computer Interaction, 25 (3), pp. 261-287.

One possibility presented by novel communication technologies is the ability for remotely located experts to provide guidance to others who are performing difficult technical tasks in the real world, such as medical procedures or engine repair. In these scenarios, video views and other visual information seem likely to be useful in the ongoing negotiation of shared understanding, or common ground, but actual results with experimental systems have been mixed. One difficulty in designing these systems is achieving a balance between close-up shots that allow for discussion of detail and wide shots that allow for orientation or establishing a mutual point of focus in a larger space. Achieving this balance can be difficult without disorienting or overloading task participants. In this article we present results from two experiments involving three automated camera control systems for remote repair tasks. Results show that a system providing both detailed and overview information was superior to systems providing only one or the other in terms of performance but that some participants preferred the detail-only system.

© All rights reserved Birnholtz et al. and/or Lawrence Erlbaum


Wu, Mike, Ranjan, Abhishek and Truong, Khai N. (2009): An exploration of social requirements for exercise group formation. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009. pp. 79-82.

Exercising is often a social activity performed with other people, yet finding compatible exercise partners is difficult in practice. To gain a better understanding of the social requirements involved with forming exercise groups, we conducted a two-phased exploratory study involving an online web questionnaire with 96 respondents and two focus groups. Our results highlight various aspects of collaborating with exercise partners, but also indicate the limited utility of currently available systems to support such collaborations. We discuss implications for collaborative technologies supporting exercise group formation.

© All rights reserved Wu et al. and/or ACM Press


Ranjan, Abhishek, Birnholtz, Jeremy and Balakrishnan, Ravin (2008): Improving meeting capture by applying television production principles with audio and motion detection. In: Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems April 5-10, 2008. pp. 227-236.

Video recordings of meetings are often monotonous and tedious to watch. In this paper, we report on the design, implementation and evaluation of an automated meeting capture system that applies television production principles to capture and present videos of small group meetings in a compelling manner. The system uses inputs from a motion capture system and microphones to drive multiple pan-tilt-zoom cameras and uses heuristics to frame shots and cut between them. An evaluation of the system indicates that its performance approaches that of a professional crew while requiring significantly fewer human resources.

© All rights reserved Ranjan et al. and/or ACM Press


Ranjan, Abhishek, Birnholtz, Jeremy P. and Balakrishnan, Ravin (2007): Dynamic shared visual spaces: experimenting with automatic camera control in a remote repair task. In: Proceedings of ACM CHI 2007 Conference on Human Factors in Computing Systems 2007. pp. 1177-1186.

We present an experimental study of automatic camera control in the performance of collaborative remote repair tasks using video-mediated communication. Twelve pairs of participants, one "helper" and one "worker," completed a series of Lego puzzle tasks using both a static camera and an automatic camera system that was guided in part by tracking the worker's hand position. Results show substantial performance benefits for the automatic system, particularly for complex tasks. The implications of these results are discussed, along with some lessons for the use of motion tracking as a driver for camera control.

© All rights reserved Ranjan et al. and/or ACM Press


Ranjan, Abhishek, Balakrishnan, Ravin and Chignell, Mark (2006): Searching in audio: the utility of transcripts, dichotic presentation, and time-compression. In: Proceedings of ACM CHI 2006 Conference on Human Factors in Computing Systems 2006. pp. 721-730.

Searching audio data can potentially be facilitated by the use of automatic speech recognition (ASR) technology to generate text transcripts which can then be easily queried. However, since current ASR technology cannot reliably generate 100% accurate transcripts, additional techniques for fluid browsing and searching of the audio itself are required. We explore the impact of transcripts of various qualities, dichotic presentation, and time-compression on an audio search task. Results show that dichotic presentation and reasonably accurate transcripts can assist in the search process, but suggest that time-compression and low accuracy transcripts should be used carefully.

© All rights reserved Ranjan et al. and/or ACM Press


Ranjan, Abhishek, Birnholtz, Jeremy P. and Balakrishnan, Ravin (2006): An exploratory analysis of partner action and camera control in a video-mediated collaborative task. In: Proceedings of the ACM CSCW 2006 Conference on Computer-Supported Cooperative Work. pp. 403-412.

This paper reports on an exploratory experimental study of the relationships between physical movement and desired visual information in the performance of video-mediated collaborative tasks in the real world by geographically distributed groups. Twenty-three pairs of participants (one "helper" and one "worker") linked only by video and audio participated in a Lego construction task in one of three experimental conditions: a fixed scene camera, a helper-controlled pan-tilt-zoom camera, and a dedicated operator-controlled camera. "Worker" motion was tracked in 3-D space for all three conditions, as were all camera movements. Results suggest performance benefits for the operator-controlled condition, and the relationships between camera position/movement and worker action are explored to generate preliminary theoretical and design implications.

© All rights reserved Ranjan et al. and/or ACM Press


Malik, Shahzad, Ranjan, Abhishek and Balakrishnan, Ravin (2005): Interacting with large displays from a distance with vision-tracked multi-finger gestural input. In: Proceedings of the 2005 ACM Symposium on User Interface Software and Technology 2005. pp. 43-52.

We explore the idea of using vision-based hand tracking over a constrained tabletop surface area to perform multi-finger and whole-hand gestural interactions with large displays from a distance. We develop bimanual techniques to support a variety of asymmetric and symmetric interactions, including fast targeting and navigation to all parts of a large display from the comfort of a desk and chair, as well as techniques that exploit the ability of the vision-based hand tracking system to provide multi-finger identification and full 2D hand segmentation. We also posit a design that allows for handling multiple concurrent users.

© All rights reserved Malik et al. and/or ACM Press


Tsang, Steve, Balakrishnan, Ravin, Singh, Karan and Ranjan, Abhishek (2004): A suggestive interface for image guided 3D sketching. In: Dykstra-Erickson, Elizabeth and Tscheligi, Manfred (eds.) Proceedings of ACM CHI 2004 Conference on Human Factors in Computing Systems April 24-29, 2004, Vienna, Austria. pp. 591-598.

We present an image-guided pen-based suggestive interface for sketching 3D wireframe models. Rather than starting from a blank canvas, existing 2D images of similar objects serve as a guide to the user. Image-based filters enable attraction, smoothing, and resampling of input curves, and allow for their selective application using pinning and gluing techniques. New input strokes also invoke suggestions of relevant geometry that can be used, reducing the need to explicitly draw all parts of the new model. All suggestions appear in-place with the model being built, in the user's focal attention space. A curve matching algorithm seamlessly augments basic suggestions with more complex ones from a database populated with previously used geometry. The interface also incorporates gestural command input, and interaction techniques for camera controls that enable smooth transitions between orthographic and perspective views.

© All rights reserved Tsang et al. and/or ACM Press



Page Information

Page maintainer: The Editorial Team