Summary and Highlights

Crowdsourcing is a strategy for outsourcing paid work that was traditionally performed by regular employees. With crowdsourcing, any individual or organization can accomplish this through an open call for participation on an Internet platform, addressed to a crowd of typically unknown participants. The platform operator has to guarantee the platform's operation and ensure a balance of interests between workers and employers. In the first two-year phase of this project, several mechanisms were developed both to reduce the time needed to process tasks and to enhance the quality of the task results. Recommendation systems and means to determine a worker's trustworthiness were designed independently of any specific scenario, while other mechanisms are tailor-made for the selected fields of ‘QoE studies’ and ‘network measurements’. The design of these mechanisms is based on several analyses conducted in the project and on models derived from them.
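As a concrete illustration of the kind of scenario-independent building block mentioned above, the following minimal Python sketch estimates a worker's trustworthiness from control ("gold standard") tasks with known answers, a common quality-assurance approach in crowdsourcing. The gold-task approach, the function name, and the prior values are illustrative assumptions and do not represent the mechanism actually developed in the project.

    # Illustrative sketch (not the project's mechanism): estimate a worker's
    # trustworthiness from control ("gold standard") tasks with known answers,
    # smoothed with a Beta(1, 1) prior so that workers with only a few
    # observations are judged cautiously.

    def trust_score(correct: int, total: int,
                    prior_correct: float = 1.0, prior_total: float = 2.0) -> float:
        """Smoothed fraction of correctly answered control tasks, in [0, 1]."""
        return (correct + prior_correct) / (total + prior_total)

    if __name__ == "__main__":
        print(round(trust_score(18, 20), 3))  # 0.864 -- many observations, high trust
        print(round(trust_score(1, 1), 3))    # 0.667 -- a single observation, still uncertain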

The goal of the second project phase remains the development and evaluation of novel mechanisms and the further optimization of crowdsourcing as a new way to organize work on the Internet. To this end, research on recommendation systems for crowdsourcing will continue in this project, as intended in the first proposal. A key research question is the examination of selection criteria, in particular those related to the similarity of tasks, and of an employer's preferences regarding these criteria. In the project's first phase, such mechanisms were developed only for so-called microtask markets and for very narrow use cases within them. These mechanisms are now to be extended to two additional use cases that have become increasingly important in both economic and social terms: Mobile Crowdsourcing and Enterprise Crowdsourcing. In Mobile Crowdsourcing, tasks are completed on a mobile device so that additional sensor readings, such as the current location and other physical properties, can be recorded. Enterprise Crowdsourcing subsumes all crowdsourcing applications within businesses; here, a single company provides all tasks, which are either fully or partially public or limited to the company's own employees. The particular characteristics of these two use cases are to be investigated in the second project phase and incorporated into the modeling and design of the mechanisms. Through this approach, the developed techniques will also be tested for their suitability in real-world scenarios.
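To make the notion of similarity-based selection criteria more concrete, the following minimal Python sketch ranks open tasks for a worker by the Jaccard similarity between each task's keywords and the keywords of tasks the worker has completed before. The keyword features and all names in the sketch are illustrative assumptions and do not represent the recommendation mechanisms designed in the project.

    # Illustrative sketch (assumed keyword features, not the project's design):
    # recommend open tasks to a worker based on their similarity to tasks the
    # worker has completed before.

    def jaccard(a: set, b: set) -> float:
        """Jaccard similarity of two keyword sets; 0.0 if both are empty."""
        return len(a & b) / len(a | b) if (a or b) else 0.0

    def recommend(open_tasks: dict, completed_keywords: list, top_k: int = 3) -> list:
        """Return the top_k task ids whose keywords best match the worker's history."""
        seen = set().union(*completed_keywords) if completed_keywords else set()
        scored = sorted(((jaccard(set(kw), seen), task_id)
                         for task_id, kw in open_tasks.items()), reverse=True)
        return [task_id for _, task_id in scored[:top_k]]

    if __name__ == "__main__":
        history = [{"video", "rating", "qoe"}, {"image", "rating"}]
        open_tasks = {"t1": {"video", "qoe", "rating"},
                      "t2": {"translation", "german"},
                      "t3": {"image", "tagging"}}
        print(recommend(open_tasks, history, top_k=2))  # ['t1', 't3']

An employer's preferences regarding such criteria could then be modelled, for instance, as weights that combine this similarity score with other scores such as a worker's trustworthiness; this, too, is only an assumption about one possible design.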

Highlights of the project include:
  • Elsevier Computer Networks Special Issue on “Crowdsourcing”, Volume 90, 2015
  • Dagstuhl Seminar 15481 “Evaluation in the Crowd: Crowdsourcing and Human-Centred Experiments”, 22–27 November 2015
  • ACM International Workshop on Crowdsourcing for Multimedia (CrowdMM): CrowdMM 2014, Orlando, FL, USA; CrowdMM 2015, Brisbane, Australia
  • IEEE Transactions on Multimedia article “Best Practices for QoE Crowdtesting: QoE Assessment with Crowdsourcing”, Volume 16, February 2014
  • Seufert, M., Zach, O., Hoßfeld, T., Slanina, M., & Tran-Gia, P. (2016, June). Impact of test condition selection in adaptive crowdsourcing studies on subjective quality. In Eighth International Conference on Quality of Multimedia Experience (QoMEX), 2016. Best Paper Award Candidate
  • Kathrin Borchert, Recommendation Systems in Crowdsourcing Platforms (Master’s thesis), 2015. Award for Outstanding Thesis in Engineering Sciences by the Bavarian Ministry of Economics
  • Hoßfeld, T., Hirth, M., Korshunov, P., Hanhart, P., Gardlo, B., Keimel, C., & Timmerer, C. (2014, September). Survey of web-based crowdsourcing frameworks for subjective quality assessment. In IEEE International Workshop on Multimedia Signal Processing (MMSP), 2014. Top 10% Paper Award

Funding: The project Crowdsourcing has been funded by the Deutsche Forschungsgemeinschaft (DFG) in the subject area "Operating, Communication and Information Systems" since 2014. The research grant holders are Tobias Hoßfeld (University of Duisburg-Essen, Chair of Modeling of Adaptive Systems), Phuoc Tran-Gia (University of Würzburg, Chair of Communication Networks), and Ralf Steinmetz and Christoph Rensing (TU Darmstadt, Multimedia Communications Lab), under grants HO 4770/2-2, TR257/38-2, STE 866/9-2, and RE 2593/3-2: "Design and Evaluation of new mechanisms for crowdsourcing as emerging paradigm for the organization of work in the Internet". The original German title of the project is "Design und Bewertung neuer Mechanismen für Crowdsourcing als neue Form der Arbeitsorganisation im Internet".