Christian Keimel
Researcher at Technische Universität München
Publications - 43
Citations - 1735
Christian Keimel is an academic researcher at Technische Universität München. His research focuses on the topics of video quality and crowdsourcing. He has an h-index of 18 and has co-authored 43 publications receiving 1552 citations.
Papers
Qualinet White Paper on Definitions of Quality of Experience
Kjell Brunnström,Sergio Beker,Katrien De Moor,Ann Dooms,Sebastian Egger,Marie-Neige Garcia,Tobias Hossfeld,Satu Jumisko-Pyykkö,Christian Keimel,Mohamed-Chaker Larabi,Bob Lawlor,Patrick Le Callet,Sebastian Möller,Fernando Pereira,Manuela Pereira,Andrew Perkis,Jesenka Pibernik,Antonio M. G. Pinheiro,Alexander Raake,Peter Reichl,Ulrich Reiter,Raimund Schatz,Peter Schelkens,Lea Skorin-Kapov,Dominik Strohmeier,Christian Timmerer,Martin Varela,Ina Wechsung,Junyong You,Andrej Zgank +29 more
TL;DR: The concepts and ideas presented in this white paper mainly refer to the Quality of Experience (QoE) of multimedia communication systems, but they may also be helpful for other areas where QoE is an issue; the document does not necessarily reflect the opinion of each individual contributor at every point.
Journal ArticleDOI
Best Practices for QoE Crowdtesting: QoE Assessment With Crowdsourcing
Tobias Hossfeld,Christian Keimel,Matthias Hirth,Bruno Gardlo,Julian Habigt,Klaus Diepold,Phuoc Tran-Gia +6 more
TL;DR: The focus of this article is on the issue of reliability, using video quality assessment as an example for the proposed best practices and showing that the recommended two-stage QoE crowdtesting design leads to more reliable results.
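A two-stage crowdtesting design of this kind typically separates a short qualification task from the main rating task, so that unreliable workers can be screened out before their quality ratings are collected. The Python sketch below is a hypothetical illustration of that idea only, not the authors' implementation; the gold-standard questions, pass threshold, and data structures are all assumptions.

```python
# Hypothetical sketch of a two-stage crowdtesting filter (not the paper's code).
# Stage 1: workers answer gold-standard questions with known correct answers.
# Stage 2: only workers above a pass threshold contribute quality ratings.

GOLD_ANSWERS = {"q1": "A", "q2": "C", "q3": "B"}   # assumed qualification items
PASS_THRESHOLD = 2 / 3                              # assumed minimum accuracy


def passed_qualification(worker_answers: dict) -> bool:
    """Return True if the worker answered enough gold questions correctly."""
    correct = sum(worker_answers.get(q) == a for q, a in GOLD_ANSWERS.items())
    return correct / len(GOLD_ANSWERS) >= PASS_THRESHOLD


def reliable_ratings(stage1_answers: dict, stage2_ratings: dict) -> dict:
    """Keep stage-2 ratings only from workers who passed stage 1."""
    return {w: r for w, r in stage2_ratings.items()
            if passed_qualification(stage1_answers.get(w, {}))}


# Example with made-up workers: worker "w2" fails the qualification stage.
stage1 = {"w1": {"q1": "A", "q2": "C", "q3": "B"},
          "w2": {"q1": "B", "q2": "D", "q3": "B"}}
stage2 = {"w1": [4, 5, 3], "w2": [1, 1, 1]}
print(reliable_ratings(stage1, stage2))  # -> {'w1': [4, 5, 3]}
```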
Proceedings ArticleDOI
QualityCrowd — A framework for crowd-based quality evaluation
TL;DR: This contribution proposes the QualityCrowd framework, which allows codec-independent quality assessment through a simple web interface usable with common web browsers. The results from an online subjective test using this framework are compared with the results from a test in a standardized environment, showing that QualityCrowd delivers equivalent results within the acceptable inter-lab correlation.
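Inter-lab correlation here refers to how closely mean opinion scores (MOS) from one test environment track those from another; a crowdsourced test is usually considered acceptable if its correlation with a lab test falls within the range typically observed between two labs. The snippet below is a minimal sketch of that comparison using Pearson correlation via SciPy; the MOS values and the 0.9 reference threshold are assumptions for illustration, not figures from the paper.

```python
# Minimal sketch: compare crowdsourced MOS with lab MOS via Pearson correlation.
# The scores below are invented; real studies use per-stimulus MOS from both tests.
from scipy.stats import pearsonr

lab_mos   = [4.2, 3.8, 2.1, 1.5, 3.0, 4.6]   # hypothetical lab-test MOS
crowd_mos = [4.0, 3.9, 2.4, 1.7, 2.8, 4.5]   # hypothetical crowdsourced MOS

r, p_value = pearsonr(lab_mos, crowd_mos)
print(f"Pearson r = {r:.3f} (p = {p_value:.3g})")

# Assumed acceptance criterion: treat the crowd test as equivalent if r is at
# least as high as a typical inter-lab correlation (e.g. r >= 0.9).
print("within assumed inter-lab range" if r >= 0.9 else "below assumed threshold")
```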
Proceedings ArticleDOI
Crowdsourcing-based multimedia subjective evaluations: a case study on image recognizability and aesthetic appeal
TL;DR: A high correlation between crowdsourcing and lab scores is found for recognizability, but not for aesthetic appeal, indicating that crowdsourcing can be used for QoE subjective assessments as long as the workers' tasks are designed with extreme care to avoid misinterpretations.
Best Practices and Recommendations for Crowdsourced QoE - Lessons learned from the Qualinet Task Force Crowdsourcing
Tobias Hossfeld,Matthias Hirth,Judith Redi,Filippo Mazza,Pavel Korshunov,Babak Naderi,Michael Seufert,Bruno Gardlo,Sebastian Egger,Christian Keimel +9 more
TL;DR: This white paper summarizes the recommendations and best practices for crowdsourced quality assessment of multimedia applications from the Qualinet Task Force on “Crowdsourcing”. The recommendations result from experience in designing, implementing, and conducting crowdsourcing experiments, as well as from the analysis of crowdsourced user ratings and context data.