Publication:
A review on the methods to evaluate crowd contributions in crowdsourcing applications

dc.citedby: 1
dc.contributor.author: Aris H. [en_US]
dc.contributor.author: Azizan A. [en_US]
dc.contributor.authorid: 13608397500 [en_US]
dc.contributor.authorid: 57213355273 [en_US]
dc.date.accessioned: 2023-05-29T08:14:17Z
dc.date.available: 2023-05-29T08:14:17Z
dc.date.issued: 2020
dc.description: Computation theory; Intelligent computing; Surveying; Automated evaluation; Evaluation method; Evaluation methods; Expert judgement; Grounded theory; New evaluation methods; Systematic Review; Crowdsourcing [en_US]
dc.description.abstract: Due to the open nature of crowdsourcing, which accepts contributions from anyone in the crowd, evaluating these contributions is necessary to ensure their reliability. A number of evaluation methods are used in existing crowdsourcing applications for this purpose. This study aims to identify and document these methods. To do so, 50 crowdsourcing applications obtained from an extensive literature and online search were reviewed. Analysis of the applications found that, depending on the type of crowdsourcing application, whether simple, complex, or creative, three different methods are used: expert judgement, rating, and feedback. While expert judgement is mostly used in complex and creative crowdsourcing initiatives, rating is widely used in simple ones. This paper is the only reference known so far that documents the current state of evaluation methods in existing crowdsourcing applications. It would be useful in determining the way forward for research in the area, such as designing a new evaluation method. It also justifies the need for an automated evaluation method for crowdsourced contributions. © Springer Nature Switzerland AG 2020. [en_US]
dc.description.nature: Final [en_US]
dc.identifier.doi: 10.1007/978-3-030-33582-3_97
dc.identifier.epage: 1041
dc.identifier.scopus: 2-s2.0-85077774650
dc.identifier.spage: 1031
dc.identifier.uri: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85077774650&doi=10.1007%2f978-3-030-33582-3_97&partnerID=40&md5=3fb159fa6fe6068074b79fc784465406
dc.identifier.uri: https://irepository.uniten.edu.my/handle/123456789/25787
dc.identifier.volume: 1073
dc.publisher: Springer [en_US]
dc.source: Scopus
dc.sourcetitle: Advances in Intelligent Systems and Computing
dc.title: A review on the methods to evaluate crowd contributions in crowdsourcing applications [en_US]
dc.type: Conference Paper [en_US]
dspace.entity.type: Publication