- Title
- Calibration of cognitive tests to address the reliability paradox for decision-conflict tasks
- Creator
- Kucina, Talira; Wells, Lindsay; Lewis, Ian; de Salas, Kristy; Kohl, Amelia; Palmer, Matthew A.; Sauer, James D.; Matzke, Dora; Aidman, Eugene; Heathcote, Andrew
- Relation
- Nature Communications Vol. 14, Issue 1, no. 2234
- Publisher Link
- http://dx.doi.org/10.1038/s41467-023-37777-2
- Publisher
- Nature Publishing Group
- Resource Type
- journal article
- Date
- 2023
- Description
- Standard, well-established cognitive tasks that produce reliable effects in group comparisons also lead to unreliable measurement when assessing individual differences. This reliability paradox has been demonstrated in decision-conflict tasks such as the Simon, Flanker, and Stroop tasks, which measure various aspects of cognitive control. We aim to address this paradox by implementing carefully calibrated versions of the standard tests with an additional manipulation to encourage processing of conflicting information, as well as combinations of standard tasks. Over five experiments, we show that a Flanker task and a combined Simon and Stroop task with the additional manipulation produced reliable estimates of individual differences in under 100 trials per task, which improves on the reliability seen in benchmark Flanker, Simon, and Stroop data. We make these tasks freely available and discuss both theoretical and applied implications regarding how the cognitive testing of individual differences is carried out.
- Subject
- calibration; neuropsychological tests; cognition; Stroop test; reaction time; reproducibility
- Identifier
- http://hdl.handle.net/1959.13/1485820
- Identifier
- uon:51704
- Identifier
- ISSN:2041-1723
- Rights
- Language
- eng
- Reviewed
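
The Description above concerns how reliably a decision-conflict task measures individual differences. As an illustration only, and not the authors' analysis pipeline, the sketch below computes a conventional split-half reliability for a simulated conflict effect (mean incongruent RT minus mean congruent RT) with a Spearman-Brown correction; all data, column names, and parameter values are invented for the example.

```python
# Minimal sketch (not from the paper): split-half reliability of a simulated
# decision-conflict effect, with the Spearman-Brown correction applied to the
# odd/even-trial correlation. All names and values here are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def simulate_participant(pid, n_trials=96, true_effect=0.05):
    """Simulate RTs (in seconds) for one participant; half the trials are incongruent."""
    congruency = rng.permutation(np.repeat(["congruent", "incongruent"], n_trials // 2))
    rt = rng.normal(0.55, 0.08, n_trials) + np.where(congruency == "incongruent", true_effect, 0.0)
    return pd.DataFrame({"participant": pid, "trial": np.arange(n_trials),
                         "congruency": congruency, "rt": rt})

# 60 simulated participants whose true conflict effects vary across people.
data = pd.concat(simulate_participant(p, true_effect=rng.normal(0.05, 0.02))
                 for p in range(60))

def conflict_effect(df):
    """Incongruent minus congruent mean RT for one participant's trial subset."""
    means = df.groupby("congruency")["rt"].mean()
    return means["incongruent"] - means["congruent"]

# Split each participant's trials into odd and even halves and correlate the two effect estimates.
odd = data[data["trial"] % 2 == 1].groupby("participant")[["congruency", "rt"]].apply(conflict_effect)
even = data[data["trial"] % 2 == 0].groupby("participant")[["congruency", "rt"]].apply(conflict_effect)
r_half = np.corrcoef(odd, even)[0, 1]

# Spearman-Brown correction projects the half-length correlation to full test length.
reliability = 2 * r_half / (1 + r_half)
print(f"split-half r = {r_half:.2f}, Spearman-Brown reliability = {reliability:.2f}")
```

This kind of split-half estimate is one common way to quantify the individual-differences reliability the abstract refers to; the paper itself should be consulted for the specific tasks, trial counts, and analyses used.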