Publication: A review on the methods to evaluate crowd contributions in crowdsourcing applications
Date
2020
Authors
Aris H.
Azizan A.
Journal Title
Journal ISSN
Volume Title
Publisher
Springer
Abstract
Because crowdsourcing openly accepts contributions from the crowd, these contributions must be evaluated to ensure their reliability. A number of evaluation methods are used in existing crowdsourcing applications to evaluate such contributions, and this study aims to identify and document them. To do this, 50 crowdsourcing applications obtained from an extensive literature and online search were reviewed. Analysis of the applications found that three different methods are used, depending on the type of crowdsourcing application, whether simple, complex or creative. These methods are expert judgement, rating and feedback. While expert judgement is mostly used in complex and creative crowdsourcing initiatives, rating is widely used in simple ones. This paper is the only reference known so far that documents the current state of evaluation methods in existing crowdsourcing applications. It is useful for determining the way forward for research in the area, such as the design of a new evaluation method, and it also justifies the need for an automated evaluation method for crowdsourced contributions. © Springer Nature Switzerland AG 2020.
Keywords
Computation theory; Intelligent computing; Surveying; Automated evaluation; Evaluation method; Evaluation methods; Expert judgement; Grounded theory; New evaluation methods; Systematic review; Crowdsourcing