Comparing strategies for winning expert-rated and crowd-rated crowdsourcing contests: First findings

Liang Chen, De Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Scopus citations

Abstract

Many studies have examined expert-rated crowdsourcing contests, but few have examined crowd-rated contests, in which winners are determined by crowd voting. Because the rating mechanisms differ, the determinants of winning may also differ between the two types of contests. Based on previous studies, we identify three types of winning determinants: expertise, submission timing, and social capital. Our initial investigation, based on 91 entries from two contests on Zooppa, suggests that these variables play different roles in winning crowd-rated contests than in winning expert-rated contests. Specifically, past winning experience in crowd-rated contests predicts future success in crowd-rated contests, while past winning experience in expert-rated contests predicts future success in expert-rated contests. We find a U-shaped relationship between submission time and winning in both types of contests. Social capital increases the probability of winning a crowd-rated contest only when it is sufficiently high.

Original language: English (US)
Title of host publication: 18th Americas Conference on Information Systems 2012, AMCIS 2012
Pages: 97-107
Number of pages: 11
State: Published - Dec 1 2012
Externally published: Yes
Event: 18th Americas Conference on Information Systems 2012, AMCIS 2012 - Seattle, WA, United States
Duration: Aug 9 2012 - Aug 12 2012

Publication series

Name: 18th Americas Conference on Information Systems 2012, AMCIS 2012
Volume: 1

Other

Other: 18th Americas Conference on Information Systems 2012, AMCIS 2012
Country: United States
City: Seattle, WA
Period: 8/9/12 - 8/12/12

Keywords

  • Crowdsourcing
  • Crowdsourcing contest
  • Open innovation
  • Prize design
  • Social capital
  • Winning determinant
