
Filtering survey responses from crowdsourcing platforms: current heuristics and alternative approaches

  • Information Systems research continues to rely on survey participants from crowdsourcing platforms (e.g., Amazon MTurk). Satisficing behavior of these survey participants may reduce attention and threaten validity. To address this, the current research paradigm mandates excluding participants through filtering heuristics (e.g., response time, instructional manipulation checks). Yet neither the selection of the filter nor the filtering threshold is standardized. This flexibility may lead to suboptimal filtering and potentially to “p-hacking”, as researchers can pick the most “successful” filter. This research is the first to test a comprehensive set of established and new filters against key metrics (validity, reliability, effect size, power). Additionally, we introduce a multivariate machine learning approach to identify inattentive participants. We find that while filtering heuristics require high filter levels (excluding 33% or 66% of participants), machine learning filters are often superior, especially at lower filter levels. Their “black box” character may also help prevent strategic filtering.
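  As an illustration of the kind of multivariate filter the abstract alludes to, the sketch below flags inattentive respondents with an Isolation Forest run over simple attention proxies (completion time, straightlining, failed attention checks). The feature set, the simulated data, and the choice of Isolation Forest are all assumptions for illustration; the paper's actual model and features are not described on this page.

```python
# Hypothetical sketch (not the paper's method): unsupervised multivariate
# filtering of inattentive crowdsourced respondents via Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=42)
n = 500

# Assumed per-respondent attention proxies (illustrative only):
#   completion_time: total seconds spent on the survey
#   straightlining:  share of identical consecutive Likert answers
#   imc_failures:    count of failed instructional manipulation checks
X = np.column_stack([
    rng.normal(600, 120, n),   # completion_time
    rng.beta(2, 8, n),         # straightlining
    rng.poisson(0.3, n),       # imc_failures
])

# `contamination` is the share of respondents to flag; the paper compares
# filter levels such as 33% and 66% of participants.
model = IsolationForest(contamination=0.33, random_state=42)
labels = model.fit_predict(X)   # -1 = flagged inattentive, 1 = retained

retained = X[labels == 1]
print(f"Retained {len(retained)} of {n} respondents after filtering")
```

  A classical heuristic would instead cut on a single variable, e.g., dropping the fastest third of completion times; the multivariate approach combines several signals into one anomaly score.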

Metadata
Document Type: Conference Proceeding
Language: English
Author: Lennard Schmidt, Erik Maier, Florian Dost
Chairs and Professorships: Chair of Marketing and Retail
URL: https://aisel.aisnet.org/icis2019/research_methods/research_methods/8/
Year of Completion: 2020
First Page: 4605
Last Page: 4621
Note:
Contained in: International Conference on Information Systems: 40th International Conference on Information Systems (ICIS 2019), Munich, Germany, 15-18 December 2019. Red Hook, NY: Curran Associates, Inc., 2020
Content Focus: Academic Audience
Licence (German): Urheberrechtlich geschützt (protected by copyright)