Abstract
Because crowdsourcing openly accepts contributions from the crowd, these contributions must be evaluated to ensure their reliability. Existing crowdsourcing applications use a number of evaluation methods for this purpose, and this study aims to identify and document them. To do so, 50 crowdsourcing applications, obtained through an extensive literature and online search, were reviewed. Analysis of the applications found that, depending on the type of crowdsourcing application, whether simple, complex or creative, three different methods are used: expert judgement, rating and feedback. While expert judgement is mostly used in complex and creative crowdsourcing initiatives, rating is widely used in simple ones. To the best of our knowledge, this paper is the only reference to date that documents the current state of evaluation methods in existing crowdsourcing applications. It should be useful in determining the way forward for research in the area, such as designing a new evaluation method, and it also justifies the need for an automated evaluation method for crowdsourced contributions.
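The rating method mentioned above, widely used in simple crowdsourcing applications, can be sketched as a basic aggregation rule. The function below is an illustrative assumption only, not taken from the paper: the 5-point scale, the acceptance threshold, and the name `evaluate_by_rating` are all hypothetical.

```python
from statistics import mean

def evaluate_by_rating(ratings, scale_max=5, threshold=0.6):
    """Accept a crowdsourced contribution if its normalized mean
    rating meets a threshold (all parameters are illustrative)."""
    if not ratings:
        return False  # no evidence from the crowd yet
    score = mean(ratings) / scale_max  # normalize to [0, 1]
    return score >= threshold

# A contribution rated 4, 5 and 3 on a 5-point scale:
print(evaluate_by_rating([4, 5, 3]))  # normalized mean 0.8 -> True
```

Such a rule is trivially automated, which is one reason rating dominates in simple initiatives, whereas expert judgement and free-text feedback resist this kind of mechanical aggregation.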
Acknowledgement
Information presented in this paper forms part of the research work funded by Universiti Tenaga Nasional entitled Formulation of a Trust Building Mechanism for Trustworthy Non-profit Mobile Crowdsourcing Initiatives Using Beta Reputation System (J510050668).
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Aris, H., Azizan, A. (2020). A Review on the Methods to Evaluate Crowd Contributions in Crowdsourcing Applications. In: Saeed, F., Mohammed, F., Gazem, N. (eds) Emerging Trends in Intelligent Computing and Informatics. IRICT 2019. Advances in Intelligent Systems and Computing, vol 1073. Springer, Cham. https://doi.org/10.1007/978-3-030-33582-3_97
Print ISBN: 978-3-030-33581-6
Online ISBN: 978-3-030-33582-3
eBook Packages: Intelligent Technologies and Robotics