
QoE Assessment with Crowdsourcing

Quality of Experience (QoE) in multimedia applications is closely linked to the end users' perception, and its assessment therefore requires subjective user studies to evaluate the degree of delight or annoyance experienced by the users. QoE crowdtesting refers to QoE assessment via crowdsourcing, where anonymous test subjects conduct subjective tests remotely in their preferred environment. The advantages of QoE crowdtesting lie not only in reduced time and costs for the tests, but also in a large and diverse panel of international, geographically distributed users in realistic settings. However, the remote test setting gives rise to conceptual and technical challenges. Key issues in QoE crowdtesting include the reliability of user ratings as well as the influence of incentives, payment schemes, and the unknown environmental context of the tests on the results.

To counter these issues, suitable strategies and methods need to be developed, included in the test design, and implemented in the actual test campaign, while statistical methods are required to identify reliable user ratings and to ensure high data quality. The project results therefore provide a collection of best practices addressing these issues, based on our experience from a large set of conducted QoE crowdtesting studies. The focus is in particular on reliability, and video quality assessment serves as an example for the proposed best practices, showing that our recommended two-stage QoE crowdtesting design leads to more reliable results [1].
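To illustrate the kind of statistical reliability screening mentioned above, the sketch below shows one simple, hypothetical check in Python: workers whose ratings correlate poorly with the mean opinion scores (MOS) of the rest of the panel, or who give constant ratings, are flagged as unreliable. This is only a minimal example under assumed data structures (`screen_unreliable_workers`, the `min_corr` threshold, and the toy ratings are illustrative), not the exact screening procedure or the two-stage design recommended in [1].

```python
import numpy as np

def screen_unreliable_workers(ratings, min_corr=0.5):
    """Flag workers whose ratings correlate poorly with the panel MOS.

    ratings: dict mapping worker id -> {stimulus id: rating}.
    min_corr: illustrative threshold on the Pearson correlation between a
              worker's ratings and the MOS of all *other* workers.
    Returns the set of worker ids considered reliable.
    """
    reliable = set()
    for worker, own in ratings.items():
        # MOS per stimulus from all other workers, to avoid self-correlation.
        mos = {}
        for other, theirs in ratings.items():
            if other == worker:
                continue
            for stim, score in theirs.items():
                mos.setdefault(stim, []).append(score)
        common = [s for s in own if s in mos]
        if len(common) < 3:
            continue  # too few overlapping stimuli to judge reliability
        x = np.array([own[s] for s in common], dtype=float)
        y = np.array([np.mean(mos[s]) for s in common], dtype=float)
        if np.std(x) == 0 or np.std(y) == 0:
            continue  # constant ratings ("clicking through") are suspicious
        if np.corrcoef(x, y)[0, 1] >= min_corr:
            reliable.add(worker)
    return reliable


# Toy example: worker "w3" rates every clip identically and is screened out.
ratings = {
    "w1": {"clip_a": 5, "clip_b": 3, "clip_c": 1},
    "w2": {"clip_a": 4, "clip_b": 3, "clip_c": 2},
    "w3": {"clip_a": 3, "clip_b": 3, "clip_c": 3},
}
print(screen_unreliable_workers(ratings))  # contains "w1" and "w2" only
```

In practice such rating-based screening is typically combined with other reliability mechanisms discussed in the best practices, such as gold-standard or content questions and consistency checks, rather than used on its own.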



References

[1] Tobias Hoßfeld, Christian Keimel, Matthias Hirth, Bruno Gardlo, Julian Habigt, Klaus Diepold, Phuoc Tran-Gia. "Best Practices for QoE Crowdtesting: QoE Assessment with Crowdsourcing." IEEE Transactions on Multimedia, vol. 16, no. 2, pp. 541-558, Feb. 2014. doi:10.1109/TMM.2013.2291663
[2] Tobias Hoßfeld. "On Training the Crowd for Subjective Quality Studies." VQEG eLetter, 1, 2014, available online.
[3] Short review edited by Tobias Hoßfeld and Christian Timmerer, IEEE COMSOC MMTC R-Letter, vol. 5, no. 3, June 2014, available online as PDF.
[4] Tobias Hoßfeld, Christian Keimel. "Crowdsourcing in QoE Evaluation." In "Quality of Experience: Advanced Concepts, Applications and Methods", edited by Sebastian Möller and Alexander Raake, Springer, T-Labs Series in Telecommunication Services, ISBN 978-3-319-02680-0, March 2014. doi:10.1007/978-3-319-02681-7_21
[5] Bruno Gardlo, Michal Ries, Tobias Hoßfeld, Raimund Schatz. "Microworkers vs. Facebook: The Impact of Crowdsourcing Platform Choice on Experimental Results." QoMEX 2012, Yarra Valley, Australia, July 2012. doi:10.1109/QoMEX.2012.6263885
[6] Bruno Gardlo, Michal Ries, Tobias Hoßfeld. "Impact of Screening Technique on Crowdsourcing QoE Assessments." 22nd International Conference Radioelektronika 2012, Special Session on Quality in Multimedia Systems, Brno, Czech Republic, April 2012, IEEE Xplore.