SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Zhou H, Fishbach A. The pitfall of experimenting on the Web: how unattended selective attrition leads to surprising (yet false) research conclusions. J. Pers. Soc. Psychol. 2016; 111(4): 493-504.

Copyright

(Copyright © 2016, American Psychological Association)

DOI

10.1037/pspa0000056

PMID

27295328

Abstract

The authors find that experimental studies using online samples (e.g., MTurk) often violate the assumption of random assignment, because participant attrition (quitting a study before completing it and getting paid) is not only prevalent but also varies systematically across experimental conditions. Using standard social psychology paradigms (e.g., ego depletion, construal level), they observed attrition rates ranging from 30% to 50% (Study 1). The authors show that failing to attend to attrition rates in online panels has grave consequences. By introducing experimental confounds, unattended attrition misled them to draw mind-boggling yet false conclusions: that recalling a few happy events is considerably more effortful than recalling many happy events, and that imagining applying eyeliner leads to weight loss (Study 2). In addition, attrition rates misled them to draw a logical yet false conclusion: that explaining one's views on gun rights decreases pro-gun sentiment (Study 3). The authors offer a partial remedy (Study 4) and call for minimizing and reporting experimental attrition in studies conducted on the Web.

(PsycINFO Database Record (c) 2016 APA, all rights reserved)
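
The methodological takeaway is that attrition should be examined per condition before interpreting treatment effects. As a rough illustration only (not taken from the article; the condition labels and counts below are hypothetical), a chi-square test on completed-versus-dropped counts is one simple way to flag differential attrition:

from scipy.stats import chi2_contingency

# Rows are hypothetical conditions; columns are [completed, dropped out].
counts = [
    [140, 60],  # "hard task" condition: 60 of 200 quit (30% attrition)
    [180, 20],  # "easy task" condition: 20 of 200 quit (10% attrition)
]

chi2, p, dof, expected = chi2_contingency(counts)

for label, (completed, dropped) in zip(["hard task", "easy task"], counts):
    print(f"{label}: attrition = {dropped / (completed + dropped):.0%}")
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")

A small p-value indicates that dropout depends on condition, which undermines random assignment among the participants who remain, the core problem the article documents.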


Language: en
