Publication statistics

Pub. period: 2004-2007
Pub. count: 4
Number of co-authors: 5


Number of publications with 3 favourite co-authors:

Zhen Wen:
James Shaw:
Shimei Pan:



Productive colleagues

Vikram Aggarwal's 3 most productive colleagues in number of publications:

Michelle X. Zhou: 24
Zhen Wen: 14
Shimei Pan: 9


Vikram Aggarwal

Technical Lead

Current place of employment:
Lehman Brothers


Publications by Vikram Aggarwal (bibliography)


Wen, Zhen, Zhou, Michelle X. and Aggarwal, Vikram (2007): Context-aware, adaptive information retrieval for investigative tasks. In: Proceedings of the 2007 International Conference on Intelligent User Interfaces 2007. pp. 122-131.

We are building an intelligent information system to aid users in their investigative tasks, such as detecting fraud. In such a task, users must progressively search and analyze relevant information before drawing a conclusion. In this paper, we address how to help users find relevant information during an investigation. Specifically, we present a novel approach that can improve information retrieval by exploiting a user's investigative context. Compared to existing retrieval systems, which are either context insensitive or leverage only limited user context, our work offers two unique contributions. First, our system works with users cooperatively to build an investigative context, which is otherwise very difficult to capture by machine or human alone. Second, we develop a context-aware method that can adaptively retrieve and evaluate information relevant to an ongoing investigation. Experiments show that our approach can improve the relevance of retrieved information significantly. As a result, users can fulfill their investigative tasks more efficiently and effectively.

© All rights reserved Wen et al. and/or ACM Press
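The core idea of the abstract above — adjusting retrieval relevance by an accumulated investigative context — can be illustrated with a minimal sketch. All names, the Jaccard-based context score, and the blending weight are illustrative assumptions, not the paper's actual method:

```python
# Illustrative sketch: re-rank retrieved documents by blending each document's
# base retrieval score with its overlap against terms gathered during the
# ongoing investigation. The scoring scheme here is a hypothetical stand-in.

def context_score(doc_terms, context_terms):
    """Jaccard overlap between a document's terms and the investigative context."""
    doc, ctx = set(doc_terms), set(context_terms)
    if not doc or not ctx:
        return 0.0
    return len(doc & ctx) / len(doc | ctx)

def rerank(results, context_terms, alpha=0.5):
    """Blend base score with context similarity; alpha weights the base score."""
    scored = [
        (alpha * base + (1 - alpha) * context_score(terms, context_terms), doc_id)
        for doc_id, base, terms in results
    ]
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]

# (doc_id, base retrieval score, document terms) — invented example data
results = [
    ("d1", 0.9, ["merger", "quarterly", "report"]),
    ("d2", 0.6, ["wire", "transfer", "offshore", "account"]),
    ("d3", 0.5, ["fraud", "wire", "transfer"]),
]
context = ["fraud", "wire", "transfer", "offshore"]
print(rerank(results, context))  # context promotes d3 and d2 over d1
```

Note how the context-insensitive ranking (d1 first on base score alone) is overturned once the fraud-related context terms are factored in.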


Zhou, Michelle X., Houck, Keith, Pan, Shimei, Shaw, James, Aggarwal, Vikram and Wen, Zhen (2006): Enabling context-sensitive information seeking. In: Proceedings of the 2006 International Conference on Intelligent User Interfaces 2006. pp. 116-123.

Information seeking is an important but often difficult task, especially when it involves large and complex data sets. We hypothesize that a context-sensitive interaction paradigm would greatly assist users in their information seeking. Such a paradigm would allow users to both express their requests and receive requested information in context. Driven by this hypothesis, we have taken rigorous steps to design, develop, and evaluate a full-fledged, context-sensitive information system. We started with a Wizard-of-OZ (WOZ) study to verify the effectiveness of our envisioned system. We then built a fully automated system based on the findings from our WOZ study. We targeted the development and integration of two sets of technologies: context-sensitive multimodal input interpretation and multimedia output generation. Finally, we formally evaluated the usability of our system in real world conditions. The results show that our system greatly improves the users' ability to perform practical information-seeking tasks. These results not only confirm our initial hypothesis, but they also indicate the practicality of our approaches.

© All rights reserved Zhou et al. and/or ACM Press


Wen, Zhen, Zhou, Michelle X. and Aggarwal, Vikram (2005): An Optimization-based Approach to Dynamic Visual Context Management. In: InfoVis 2005 - IEEE Symposium on Information Visualization 23-25 October, 2005, Minneapolis, MN, USA. p. 25.


Zhou, Michelle X. and Aggarwal, Vikram (2004): An optimization-based approach to dynamic data content selection in intelligent multimedia interfaces. In: Proceedings of the 2004 ACM Symposium on User Interface Software and Technology 2004. pp. 227-236.

We are building a multimedia conversation system to facilitate information seeking in large and complex data spaces. To provide tailored responses to diverse user queries introduced during a conversation, we automate the generation of a system response. Here we focus on the problem of determining the data content of a response. Specifically, we develop an optimization-based approach to content selection. Compared to existing rule-based or plan-based approaches, our work offers three unique contributions. First, our approach provides a general framework that effectively addresses content selection for various interaction situations by balancing a comprehensive set of constraints (e.g., content quality and quantity constraints). Second, our method is easily extensible, since it uses feature-based metrics to systematically model selection constraints. Third, our method improves selection results by incorporating content organization and media allocation effects, which otherwise are treated separately. Preliminary studies show that our method can handle most of the user situations identified in a Wizard-of-Oz study, and achieves results similar to those produced by human designers.

© All rights reserved Zhou and Aggarwal and/or ACM Press
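The optimization-based content selection described above — choosing response content by balancing quality and quantity constraints over feature-based metrics — can be sketched as a small subset search. The utility features, item data, and budget constraint here are hypothetical, not the paper's actual constraint model:

```python
from itertools import combinations

# Illustrative sketch: pick the subset of candidate data items that maximizes
# a feature-based utility (relevance plus an attribute-coverage bonus) under
# a quantity budget. Exhaustive search keeps the example simple.

def utility(subset):
    """Sum of item relevance, with a small reward for covering distinct attributes."""
    relevance = sum(item["relevance"] for item in subset)
    coverage = len({attr for item in subset for attr in item["attrs"]})
    return relevance + 0.1 * coverage

def select_content(items, budget):
    """Search all subsets of up to `budget` items and return the best one's ids."""
    best, best_score = (), float("-inf")
    for k in range(1, budget + 1):
        for subset in combinations(items, k):
            score = utility(subset)
            if score > best_score:
                best, best_score = subset, score
    return [item["id"] for item in best]

# Candidate data items for a response — invented example data
items = [
    {"id": "price", "relevance": 0.9, "attrs": {"numeric"}},
    {"id": "trend", "relevance": 0.8, "attrs": {"numeric", "temporal"}},
    {"id": "photo", "relevance": 0.3, "attrs": {"visual"}},
]
print(select_content(items, budget=2))
```

A real system would replace the exhaustive search with a proper optimizer and encode many more constraints, but the shape of the problem — maximize a feature-based objective subject to content-quantity limits — is the same.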



Page Information

Page maintainer: The Editorial Team