Pay for performance, satisfaction and retention in longitudinal crowdsourced research

Elena M. Auer, Tara S. Behrend, Andrew B. Collmus, Richard N. Landers, Ahleah F. Miles

Research output: Contribution to journal › Article › peer-review

Abstract

In the social and cognitive sciences, crowdsourcing provides up to half of all research participants. Despite this popularity, researchers typically do not conceptualize participants accurately as gig-economy worker-participants. Applying theories of employee motivation and the psychological contract between employees and employers, we hypothesized that pay and pay raises would drive worker-participant satisfaction, performance, and retention in a longitudinal study. In an experiment hiring 359 Amazon Mechanical Turk Workers, we found that initial pay, relative increase of pay over time, and overall pay did not have substantial influence on subsequent performance. However, pay significantly predicted participants’ perceived choice, justice perceptions, and attrition. Given this, we conclude that worker-participants are particularly vulnerable to exploitation, having relatively low power to negotiate pay. Results of this study suggest that researchers wishing to crowdsource research participants using MTurk might not face practical dangers such as decreased performance as a result of lower pay, but they must recognize an ethical obligation to treat Workers fairly.

Original language: English (US)
Article number: e0245460
Journal: PLoS ONE
Volume: 16
Issue number: 1 January
State: Published - Jan 2021

Bibliographical note

Publisher Copyright:
Copyright: This is an open access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication.

PubMed: MeSH publication types

  • Journal Article
