Publication statistics

Pub. period: 2011-2012
Pub. count: 4
Number of co-authors: 9



Co-authors

Number of publications with his 3 most frequent co-authors:

Björn Hartmann: 3
Scott Klemmer: 2
Steven Dow: 2

 

 

Productive colleagues

Anand Kulkarni's 3 most productive colleagues, by number of publications:

Björn Hartmann: 27
Steven Dow: 16
Scott Klemmer: 5
 
 
 

Anand Kulkarni


Publications by Anand Kulkarni (bibliography)

2012
 

Kulkarni, Anand, Can, Matthew and Hartmann, Björn (2012): Collaboratively crowdsourcing workflows with Turkomatic. In: Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work (CSCW 2012). pp. 1003-1012.

Preparing complex jobs for crowdsourcing marketplaces requires careful attention to workflow design, the process of decomposing jobs into multiple tasks, which are solved by multiple workers. Can the crowd help design such workflows? This paper presents Turkomatic, a tool that recruits crowd workers to aid requesters in planning and solving complex jobs. While workers decompose and solve tasks, requesters can view the status of worker-designed workflows in real time; intervene to change tasks and solutions; and request new solutions to subtasks from the crowd. These features lower the threshold for crowd employers to request complex work. During two evaluations, we found that allowing the crowd to plan without requester supervision is partially successful, but that requester intervention during workflow planning and execution improves quality substantially. We argue that Turkomatic's collaborative approach can be more successful than the conventional workflow design process and discuss implications for the design of collaborative crowd planning systems.

© All rights reserved Kulkarni et al. and/or ACM Press
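Turkomatic's crowd-driven planning follows a recursive divide-and-conquer pattern: a task that is simple enough is solved directly, otherwise it is split into subtasks whose results are merged. A minimal sketch of that control flow, with plain callables standing in for crowd workers (all names here are illustrative, not the paper's API):

```python
def solve(task, do_work, is_simple, divide, merge):
    """Recursive decomposition in the spirit of Turkomatic's
    plan-and-solve loop. In the real system each step is posted to
    crowd workers on a marketplace; here the steps are functions."""
    if is_simple(task):
        return do_work(task)            # a worker solves the task directly
    subtasks = divide(task)             # a worker splits it into subtasks
    results = [solve(t, do_work, is_simple, divide, merge) for t in subtasks]
    return merge(results)               # a worker merges the partial results

# Toy instantiation: assembling a sentence from fragments.
sentence = solve(
    ["crowds", "can", "plan", "and", "solve", "complex", "jobs"],
    do_work=lambda t: " ".join(t),
    is_simple=lambda t: len(t) <= 2,
    divide=lambda t: [t[:len(t) // 2], t[len(t) // 2:]],
    merge=lambda parts: " ".join(parts),
)
print(sentence)  # crowds can plan and solve complex jobs
```

The requester intervention the abstract describes would correspond to editing the subtask list or rejecting a partial result between recursive calls.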

 

Dow, Steven, Kulkarni, Anand, Klemmer, Scott and Hartmann, Björn (2012): Shepherding the crowd yields better work. In: Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work (CSCW 2012). pp. 1013-1022.

Micro-task platforms provide massively parallel, on-demand labor. However, it can be difficult to reliably achieve high-quality work because online workers may behave irresponsibly, misunderstand the task, or lack necessary skills. This paper investigates whether timely, task-specific feedback helps crowd workers learn, persevere, and produce better results. We investigate this question through Shepherd, a feedback system for crowdsourced work. In a between-subjects study with three conditions, crowd workers wrote consumer reviews for six products they own. Participants in the None condition received no immediate feedback, consistent with most current crowdsourcing practices. Participants in the Self-assessment condition judged their own work. Participants in the External assessment condition received expert feedback. Self-assessment alone yielded better overall work than the None condition and helped workers improve over time. External assessment also yielded these benefits. Participants who received external assessment also revised their work more. We conclude by discussing interaction and infrastructure approaches for integrating real-time assessment into online work.

© All rights reserved Dow et al. and/or ACM Press

 

Lasecki, Walter, Wesley, Rachel, Kulkarni, Anand and Bigham, Jeffrey (2012): Speaking with the crowd. In: Adjunct Proceedings of the 2012 ACM Symposium on User Interface Software and Technology (UIST 2012). pp. 25-26.

Automated systems are not yet able to engage in a robust dialogue with users due to the complexity and ambiguity of natural language. However, humans can easily converse with one another and maintain a shared history of past interactions. In this paper, we introduce Chorus, a system that enables real-time, two-way natural language conversation between an end user and a crowd acting as a single agent. Chorus is capable of maintaining a consistent, on-topic conversation with end users across multiple sessions, despite constituent individuals perpetually joining and leaving the crowd. This is enabled by using a curated shared dialogue history. Even though crowd members are constantly providing input, we present users with a stream of dialogue that appears to be from a single conversational partner. Experiments demonstrate that dialogue with Chorus displays elements of conversational memory and interaction consistency. Workers were able to answer 84.6% of user queries correctly, demonstrating that crowd-powered communication interfaces can serve as a robust means of interacting with software systems.

© All rights reserved Lasecki et al. and/or ACM Press
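The Chorus abstract describes two ingredients: curating one reply out of many worker proposals, and a shared dialogue history that keeps later workers on-topic. A minimal sketch of both, using a simple plurality vote (Chorus's actual curation and incentive mechanism is richer, and all function names here are hypothetical):

```python
from collections import Counter

def choose_reply(proposals):
    """Pick the worker-proposed reply with the most support.
    A bare plurality vote standing in for Chorus's curation step."""
    reply, _count = Counter(proposals).most_common(1)[0]
    return reply

def respond(history, user_msg, proposals):
    """Record the user's message and the chosen crowd reply in the
    shared dialogue history that subsequent workers will see."""
    history.append(("user", user_msg))
    reply = choose_reply(proposals)
    history.append(("agent", reply))
    return reply

history = []
answer = respond(history, "What's the capital of France?",
                 ["Paris", "Paris", "Lyon"])
print(answer)  # Paris
```

The key point the sketch preserves is that the user only ever sees the single curated reply, while the full history accumulates for the next wave of workers.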

2011
 

Dow, Steven, Kulkarni, Anand, Bunge, Brie, Nguyen, Truc, Klemmer, Scott and Hartmann, Björn (2011): Shepherding the crowd: managing and providing feedback to crowd workers. In: Proceedings of the ACM CHI 2011 Conference on Human Factors in Computing Systems. pp. 1669-1674.

Micro-task platforms provide a marketplace for hiring people to do short-term work for small payments. Requesters often struggle to obtain high-quality results, especially on content-creation tasks, because work cannot be easily verified and workers can move to other tasks without consequence. Such platforms provide little opportunity for workers to reflect and improve their task performance. Timely and task-specific feedback can help crowd workers learn, persist, and produce better results. We analyze the design space for crowd feedback and introduce Shepherd, a prototype system for visualizing crowd work, providing feedback, and promoting workers into shepherding roles. This paper describes our current progress and our plans for system development and evaluation.

© All rights reserved Dow et al. and/or their publisher

 
 


Changes to this page (author)

23 Nov 2012: Modified
03 Apr 2012: Modified
03 Apr 2012: Modified
05 Jul 2011: Added

Page Information

Page maintainer: The Editorial Team
URL: http://www.interaction-design.org/references/authors/anand_kulkarni.html
