Final Workshop Report

[pdf] [bib]
Matthew Lease, Emine Yilmaz. Crowdsourcing for Information Retrieval. SIGIR Forum.
December 2011, Volume 45 Number 2, pp 66-75.


[pdf] [bib] Entire Volume

[pdf] Front Matter

Accepted Papers

[pdf] [bib] Mark Smucker and Chandra Prakash Jethani
The Crowd vs. the Lab: A Comparison of Crowd-Sourced and University Laboratory Participant Behavior
Winner: Crowdsourcing Challenge Contest

[pdf] [bib] Patrick Schone and Michael Jones
Genealogical Search Analysis Using Crowd Sourcing

[pdf] [bib] Maria Stone, Kylee Kim, Suvda Myagmar and Omar Alonso
A Comparison of On-Demand Workforce with Trained Judges for Web Search Relevance Evaluation

[pdf] [bib] Qi Su, Chu-Ren Huang and Helen Kai-yun Chen
An Ensemble Framework for Predicting Best Community Answers

[pdf] [bib] Li Tai, Zhang Chuang, Xia Tao, Wu Ming and Xie Jingjing
Quality Control of Crowdsourcing through Workers Experience

[pdf] [bib] Wei Tang and Matthew Lease
Semi-Supervised Consensus Labeling for Crowdsourcing

[pdf] [bib] David Vallet
Crowdsourced Evaluation of Personalization and Diversification Techniques in Web Search

[pdf] [bib] Jeroen Vuurens, Arjen P. de Vries and Carsten Eickhoff
How Much Spam Can You Take? An Analysis of Crowdsourcing Results to Increase Accuracy

[pdf] [bib] Jun Wang and Bei Yu
Labeling Images with Queries: A Recall-based Image Retrieval Game Approach
Winner: Best Paper Award (Sponsored by Microsoft Bing)

Accepted Demos
[pdf] [bib] Carsten Eickhoff, Christopher G. Harris, Padmini Srinivasan and Arjen P. de Vries
GEAnn - Games for Engaging Annotations
