Crowdsourcing Hypothesis Tests: Making transparent how design choices shape research results
Research output: Journal contributions › Journal articles › Research › peer-review
Standard
Crowdsourcing Hypothesis Tests: Making transparent how design choices shape research results. / Landy, Justin F.; Jia, Miaolei (Liam); Ding, Isabel L. et al.
In: Psychological Bulletin, Vol. 146, No. 5, 01.05.2020, p. 451-479.
RIS
TY - JOUR
T1 - Crowdsourcing Hypothesis Tests
T2 - Making transparent how design choices shape research results
AU - Landy, Justin F.
AU - Jia, Miaolei (Liam)
AU - Ding, Isabel L.
AU - Viganola, Domenico
AU - Tierney, Warren
AU - Dreber, Anna
AU - Johannesson, Magnus
AU - Pfeiffer, Thomas
AU - Ebersole, Charles R.
AU - Gronau, Quentin F.
AU - Ly, Alexander
AU - van den Bergh, Don
AU - Marsman, Maarten
AU - Derks, Koen
AU - Wagenmakers, Eric-Jan
AU - Proctor, Andrew
AU - Bartels, Daniel M.
AU - Bauman, Christopher W.
AU - Brady, William J.
AU - Cheung, Felix
AU - Cimpian, Andrei
AU - Dohle, Simone
AU - Donnellan, M. Brent
AU - Hahn, Adam
AU - Hall, Michael P.
AU - Jiménez-Leal, William
AU - Johnson, David J.
AU - Lucas, Richard E.
AU - Monin, Benoît
AU - Montealegre, Andres
AU - Mullen, Elizabeth
AU - Pang, Jun
AU - Ray, Jennifer
AU - Reinero, Diego A.
AU - Reynolds, Jesse
AU - Sowden, Walter
AU - Storage, Daniel
AU - Su, Runkun
AU - Tworek, Christina M.
AU - Van Bavel, Jay J.
AU - Walco, Daniel
AU - Wills, Julian
AU - Xu, Xiaobing
AU - Yam, Kai Chi
AU - Yang, Xiaoyu
AU - Cunningham, William A.
AU - Schweinsberg, Martin
AU - Urwitz, Molly
AU - Uhlmann, Eric L.
AU - Adamkovič, Matúš
AU - Alaei, Ravin
AU - Albers, Casper J.
AU - Allard, Aurélien
AU - Anderson, Ian A.
AU - Andreychik, Michael R.
AU - Babinčák, Peter
AU - Baker, Bradley J.
AU - Baník, Gabriel
AU - Baskin, Ernest
AU - Bavolar, Jozef
AU - Berkers, Ruud M.W.J.
AU - Białek, Michał
AU - Blanke, Joel
AU - Breuer, Johannes
AU - Brizi, Ambra
AU - Brown, Stephanie E.V.
AU - Brühlmann, Florian
AU - Bruns, Hendrik
AU - Caldwell, Leigh
AU - Campourcy, Jean François
AU - Chan, Eugene Y.
AU - Chang, Yen Ping
AU - Cheung, Benjamin Y.
AU - Chin, Alycia
AU - Cho, Kit W.
AU - Columbus, Simon
AU - Conway, Paul
AU - Corretti, Conrad A.
AU - Craig, Adam W.
AU - Curran, Paul G.
AU - Danvers, Alexander F.
AU - Dawson, Ian G.J.
AU - Day, Martin V.
AU - Dietl, Erik
AU - Doerflinger, Johannes T.
AU - Dominici, Alice
AU - Dranseika, Vilius
AU - Edelsbrunner, Peter A.
AU - Edlund, John E.
AU - Fisher, Matthew
AU - Fung, Anna
AU - Genschow, Oliver
AU - Gnambs, Timo
AU - Goldberg, Matthew H.
AU - Graf-Vlachy, Lorenz
AU - Hafenbrack, Andrew C.
AU - Hafenbrädl, Sebastian
AU - Hartanto, Andree
AU - Heffner, Joseph P.
AU - Hilgard, Joseph
AU - Holzmeister, Felix
AU - Horchak, Oleksandr V.
AU - Huang, Tina S.T.
AU - Hüffmeier, Joachim
AU - Hughes, Sean
AU - Hussey, Ian
AU - Imhoff, Roland
AU - Jaeger, Bastian
AU - Jamro, Konrad
AU - Johnson, Samuel G.B.
AU - Jones, Andrew
AU - Keller, Lucas
AU - Kombeiz, Olga
AU - Krueger, Lacy E.
AU - Lantian, Anthony
AU - Laplante, Justin P.
AU - Lazarević, Ljiljana B.
AU - Leclerc, Jonathan
AU - Legate, Nicole
AU - Leonhardt, James M.
AU - Leung, Desmond W.
AU - Levitan, Carmel A.
AU - Lin, Hause
AU - Liu, Qinglan
AU - Liuzza, Marco Tullio
AU - Locke, Kenneth D.
AU - Ly, Albert L.
AU - MacEacheron, Melanie
AU - Madan, Christopher R.
AU - Manley, Harry
AU - Mari, Silvia
AU - Martončik, Marcel
AU - McLean, Scott L.
AU - McPhetres, Jonathon
AU - Mercier, Brett G.
AU - Michels, Corinna
AU - Mullarkey, Michael C.
AU - Musser, Erica D.
AU - Nalborczyk, Ladislas
AU - Nilsonne, Gustav
AU - Otis, Nicholas G.
AU - Otner, Sarah M.G.
AU - Otto, Philipp E.
AU - Oviedo-Trespalacios, Oscar
AU - Paruzel-Czachura, Mariola
AU - Pellegrini, Francesco
AU - Pereira, Vitor M.D.
AU - Perfecto, Hannah
AU - Pfuhl, Gerit
AU - Phillips, Mark H.
AU - Plonsky, Ori
AU - Pozzi, Maura
AU - Purić, Danka B.
AU - Raymond-Barker, Brett
AU - Redman, David E.
AU - Reynolds, Caleb J.
AU - Ropovik, Ivan
AU - Röseler, Lukas
AU - Ruessmann, Janna K.
AU - Ryan, William H.
AU - Sablaturova, Nika
AU - Schuepfer, Kurt J.
AU - Schütz, Astrid
AU - Sirota, Miroslav
AU - Stefan, Matthias
AU - Stocks, Eric L.
AU - Strosser, Garrett L.
AU - Suchow, Jordan W.
AU - Szabelska, Anna
AU - Tey, Kian Siong
AU - Tiokhin, Leonid
AU - Troian, Jais
AU - Utesch, Till
AU - Vásquez-Echeverría, Alejandro
AU - Vaughn, Leigh Ann
AU - Verschoor, Mark
AU - von Helversen, Bettina
AU - Wallisch, Pascal
AU - Weissgerber, Sophia C.
AU - Wichman, Aaron L.
AU - Woike, Jan K.
AU - Žeželj, Iris
AU - Zickfeld, Janis H.
AU - Ahn, Yeonsin
AU - Blaettchen, Philippe F.
AU - Kang, Xi
AU - Lee, Yoo Jin
AU - Parker, Philip M.
AU - Parker, Paul A.
AU - Song, Jamie S.
AU - Very, May Anne
AU - Wong, Lynn
N1 - Publisher Copyright: © 2020, American Psychological Association.
PY - 2020/5/1
Y1 - 2020/5/1
N2 - To what extent are research results influenced by subjective decisions that scientists make as they design studies? Fifteen research teams independently designed studies to answer five original research questions related to moral judgments, negotiations, and implicit cognition. Participants from 2 separate large samples (total N = 15,000) were then randomly assigned to complete 1 version of each study. Effect sizes varied dramatically across different sets of materials designed to test the same hypothesis: Materials from different teams rendered statistically significant effects in opposite directions for 4 of 5 hypotheses, with the narrowest range in estimates being d = −0.37 to +0.26. Meta-analysis and a Bayesian perspective on the results revealed overall support for 2 hypotheses and a lack of support for 3 hypotheses. Overall, practically none of the variability in effect sizes was attributable to the skill of the research team in designing materials, whereas considerable variability was attributable to the hypothesis being tested. In a forecasting survey, predictions of other scientists were significantly correlated with study results, both across and within hypotheses. Crowdsourced testing of research hypotheses helps reveal the true consistency of empirical support for a scientific claim.
KW - conceptual replications
KW - crowdsourcing
KW - forecasting
KW - research robustness
KW - scientific transparency
KW - Business psychology
UR - http://www.scopus.com/inward/record.url?scp=85081412411&partnerID=8YFLogxK
U2 - 10.1037/bul0000220
DO - 10.1037/bul0000220
M3 - Journal articles
C2 - 31944796
AN - SCOPUS:85081412411
VL - 146
SP - 451
EP - 479
JO - Psychological Bulletin
JF - Psychological Bulletin
SN - 0033-2909
IS - 5
ER -