Careless responding detection revisited: Accuracy of direct and indirect measures
Research output: Journal contributions › Journal articles › Research › peer-review
Standard
In: Behavior Research Methods, Vol. 56, No. 8, 12.2024, p. 8422-8449.
RIS
TY - JOUR
T1 - Careless responding detection revisited
T2 - Accuracy of direct and indirect measures
AU - Goldammer, Philippe
AU - Stöckli, Peter Lucas
AU - Escher, Yannik Andrea
AU - Annen, Hubert
AU - Jonas, Klaus
AU - Antonakis, John
N1 - Publisher Copyright: © The Author(s) 2024.
PY - 2024/12
Y1 - 2024/12
N2 - To screen for careless responding, researchers have a choice between several direct measures (i.e., bogus items, requiring the respondent to choose a specific answer) and indirect measures (i.e., unobtrusive post hoc indices). Given the dearth of research in the area, we examined how well direct and indirect indices perform relative to each other. In five experimental studies, we investigated whether the detection rates of the measures are affected by contextual factors: severity of the careless response pattern, type of item keying, and type of item presentation. We fully controlled the information environment by experimentally inducing careless response sets under a variety of contextual conditions. In Studies 1 and 2, participants rated the personality of an actor who presented himself in a 5-min-long videotaped speech. In Studies 3, 4, and 5, participants had to rate their own personality across two measurements. With the exception of maximum longstring, intra-individual response variability, and individual contribution to model misfit, all examined indirect indices performed better than chance in most of the examined conditions. Moreover, indirect indices had detection rates as good as and, in many cases, better than the detection rates of direct measures. We therefore encourage researchers to use indirect indices, especially within-person consistency indices, instead of direct measures.
KW - Bogus items
KW - Careless responding
KW - Careless responding detection
KW - Indirect methods
KW - Infrequency items
KW - Business psychology
KW - Psychology
UR - http://www.scopus.com/inward/record.url?scp=85201313915&partnerID=8YFLogxK
UR - https://www.mendeley.com/catalogue/6da4b518-c696-3303-ae97-95ae4fd1795a/
U2 - 10.3758/s13428-024-02484-3
DO - 10.3758/s13428-024-02484-3
M3 - Journal articles
C2 - 39147948
AN - SCOPUS:85201313915
VL - 56
SP - 8422
EP - 8449
JO - Behavior Research Methods
JF - Behavior Research Methods
SN - 1554-351X
IS - 8
ER -
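The abstract names two of the indirect indices by their standard definitions: maximum longstring (the length of the longest run of identical consecutive answers) and intra-individual response variability (the standard deviation of a respondent's own answers). A minimal sketch of both, assuming Likert-type integer responses; the function names and the example response vector are illustrative, not taken from the article:

```python
import numpy as np

def max_longstring(responses):
    """Maximum longstring: length of the longest run of identical
    consecutive answers in one respondent's response vector."""
    longest = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

def irv(responses):
    """Intra-individual response variability: sample standard
    deviation of a single respondent's answers across items."""
    return float(np.std(responses, ddof=1))

# Hypothetical 8-item response vector on a 1-5 scale.
row = [3, 3, 3, 3, 1, 5, 2, 4]
print(max_longstring(row))  # 4 (the run of 3s)
print(irv(row))
```

A very high longstring or a very low IRV flags a respondent as potentially careless; the article's point is that such unobtrusive indices can be screened post hoc, without planting bogus items in the survey.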