SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Chandler J, Mueller P, Paolacci G. Nonnaïveté among Amazon Mechanical Turk workers: consequences and solutions for behavioral researchers. Behav. Res. Methods 2014; 46(1): 112-130.

Affiliation

Woodrow Wilson School of Public Affairs, Princeton University, Princeton, NJ, USA, jjchandl@umich.edu.

Copyright

(Copyright © 2014, Holtzbrinck Springer Nature Publishing Group)

DOI

10.3758/s13428-013-0365-7

PMID

23835650

Abstract

Crowdsourcing services, particularly Amazon Mechanical Turk, have made it easy for behavioral scientists to recruit research participants. However, researchers have overlooked crucial differences between crowdsourcing and traditional recruitment methods that present unique opportunities and challenges. We show that crowdsourced workers are likely to participate across multiple related experiments and that researchers are overzealous in the exclusion of research participants. We describe how both of these problems can be avoided using advanced interface features that also allow prescreening and longitudinal data collection. Using these techniques can minimize the effects of previously ignored drawbacks and expand the scope of crowdsourcing as a tool for psychological research.


Language: en
