Hoping for optimality or designing for inclusion: Persistence, learning, and the social network of citizen science

Julia K. Parrish, Timothy Jones, Hillary K. Burgess, Yurong He, Lucy Fortson, Darlene Cavalier

Research output: Contribution to journal › Article › peer-review


Abstract

The explosive growth in citizen science combined with a recalcitrance on the part of mainstream science to fully embrace this data collection technique demands a rigorous examination of the factors influencing data quality and project efficacy. Patterns of contributor effort and task performance have been well reviewed in online projects; however, studies of hands-on citizen science are lacking. We used a single hands-on, out-of-doors project, the Coastal Observation and Seabird Survey Team (COASST), to quantitatively explore the relationships among participant effort, task performance, and social connectedness as a function of the demographic characteristics and interests of participants, placing these results in the context of a meta-analysis of 54 citizen science projects. Although online projects were typified by high (>90%) rates of one-off participation and low retention (<10%) past 1 y, regular COASST participants were highly likely to continue past their first survey (86%), with 54% active 1 y later. Project-wide, task performance was high (88% correct species identifications over the 31,450 carcasses and 163 species found). However, there were distinct demographic differences. Age, birding expertise, and previous citizen science experience had the greatest impact on participant persistence and performance, albeit occasionally in opposite directions. Gender and sociality were relatively inconsequential, although highly gregarious social types, i.e., "nexus people," were extremely influential at recruiting others. Our findings suggest that hands-on citizen science can produce high-quality data, especially if participants persist, and that understanding the demography of participation could be used to maximize data quality and breadth of participation across the larger societal landscape.

Original language: English (US)
Pages (from-to): 1894-1901
Number of pages: 8
Journal: Proceedings of the National Academy of Sciences of the United States of America
Volume: 116
Issue number: 6
DOIs
State: Published - Feb 5 2019

Bibliographical note

Funding Information:
ACKNOWLEDGMENTS. This paper grew out of a presentation at the Sackler Colloquium on Creativity and Collaboration: Reimagining Cybernetic Serendipity. The authors thank the thousands of COASST participants for the steadfast contributions of their time, effort, interest, and demographic information; Emily Grayson (Crab Team), Tina Phillips (Feeder Watch), Gretchen LeBuhn (Great Sunflower Project), Jake Weltzin (NN), Toby Ross (Puget Sound Seabird Survey), and Christy Pattengill-Semmens (REEF) for providing project retention data; and Jackie Lindsey, Jazzmine Allen, and two anonymous reviewers for critical reviews. This work was supported by National Science Foundation (NSF) Education and Human Resources (EHR), Division of Research on Learning Grants 1114734 and 1322820 and Washington Department of Fish and Wildlife Grant 13-1435, all supporting COASST (to J.K.P.); NSF Computer and Information Science and Engineering, Division of Information and Intelligent Systems Grant 1619177 supporting Zooniverse (to L.F.); and NSF EHR/DRL Grant 1516703 supporting SciStarter (to D.C.). J.K.P. is the Lowell A. and Frankie L. Wakefield Professor of Ocean Fishery Sciences.

Keywords

  • Citizen science
  • Crowdsourcing
  • Dabblers
  • Data quality
  • Retention

