
Integrating SDT and IRT Models for Mixed-Format Exams

Published: December 11, 2024

Lawrence T. DeCarlo’s recent article introduces a psychological framework for mixed-format exams, combining signal detection theory (SDT) for multiple-choice items and item response theory (IRT) for open-ended items. This fusion allows for a unified model that captures the nuances of each item type while providing insights into the underlying cognitive processes of examinees.

Background

Mixed-format exams, commonly used in large-scale assessments, present a challenge for researchers seeking to model responses across different item types. Historically, multiple-choice items have been analyzed using frameworks like signal detection theory, while open-ended items are typically modeled using item response theory. DeCarlo’s work builds on these approaches, introducing a method to unify them through the probability of knowing, a concept that bridges both models.
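
To make the bridging idea concrete, here is a minimal sketch of how a probability of knowing can link the latent state to a multiple-choice response: an examinee who knows the answer responds correctly, while one who does not know falls back on an SDT-style comparison among the response options. This is an illustration only, not DeCarlo's parameterization; the function name, the distance parameter d, and the simulation shortcut are assumptions introduced here.

```python
import numpy as np

def p_correct_mc(p_know, d, n_options=4, n_sim=100_000, seed=0):
    """Correct-response probability for one multiple-choice item under a
    simple know/guess mixture (illustrative, not DeCarlo's parameterization).

    If the examinee knows the answer, the response is correct; otherwise the
    choice follows an SDT-style comparison in which the keyed option's
    perceived strength is shifted upward by d."""
    rng = np.random.default_rng(seed)
    keyed = rng.normal(d, 1.0, size=n_sim)                       # keyed option strengths
    distractors = rng.normal(0.0, 1.0, size=(n_sim, n_options - 1))
    p_guess_correct = np.mean(keyed > distractors.max(axis=1))   # choose the strongest option
    return p_know + (1.0 - p_know) * p_guess_correct

# Example: a 60% chance of knowing plus weak partial information (d = 0.5)
# on a four-option item.
print(round(p_correct_mc(0.6, 0.5), 3))
```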

Key Insights

  • Unified Framework: The article demonstrates how the SDT choice model and IRT sequential logit model can be integrated into a single framework. This approach captures latent states such as “know” and “don’t know” to analyze responses across item types (see the simulation sketch after this list).
  • Psychological Processes: By modeling both item types simultaneously, the approach highlights differences in the cognitive processes involved in multiple-choice and open-ended responses. This sheds light on how examinees interact with each type of item.
  • Estimation Benefits: Fitting the SDT and IRT models together offers potential computational advantages and allows for the examination of shared covariates, improving the overall utility of the framework.
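
As a companion to the points above, the following hypothetical data-generating sketch shows one way the pieces could fit together: a logistic link from a single latent ability to the probability of knowing, an SDT-style choice among options when the examinee guesses, and a sequential-logit progression through the scoring steps of an open-ended item. The function names, step difficulties, and the 0.5 "knowing" bonus are inventions of this sketch, not parameters from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_examinee(theta, mc_d=1.0, n_options=4,
                      step_difficulties=(-0.5, 0.5, 1.5)):
    """Simulate one examinee's responses to a two-item mixed-format exam.

    theta             : latent ability
    mc_d              : SDT distance used when guessing on the multiple-choice item
    step_difficulties : sequential-logit step parameters for the open-ended item
    """
    # Latent state: the probability of knowing increases with ability (assumed link).
    knows = rng.random() < sigmoid(theta)

    # Multiple-choice item: correct if knowing, otherwise an SDT choice among options.
    if knows:
        mc_correct = True
    else:
        strengths = rng.normal(0.0, 1.0, n_options)
        strengths[0] += mc_d                      # option 0 is the keyed answer
        mc_correct = bool(strengths.argmax() == 0)

    # Open-ended item: each scoring step is attempted only if the previous one
    # was passed, with success governed by ability (plus a bump for knowing).
    oe_score = 0
    for b in step_difficulties:
        if rng.random() < sigmoid(theta + (0.5 if knows else 0.0) - b):
            oe_score += 1
        else:
            break

    return {"knows": knows, "mc_correct": mc_correct, "oe_score": oe_score}

# A small sample shows how both formats covary through the shared latent state.
for theta in rng.normal(0.0, 1.0, size=5):
    print(simulate_examinee(theta))
```

In an actual analysis the parameters would be estimated jointly from observed responses rather than fixed, which is where the estimation benefits noted above come into play.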

Significance

This fusion of SDT and IRT models represents a significant step forward in psychometric analysis. By addressing the differences and connections between item types, the framework provides a deeper understanding of examinee behavior. This has implications for designing fairer and more reliable assessments, particularly in international exams where mixed-format tests are prevalent.

Future Directions

Future research could focus on expanding the application of this model to other testing contexts, including formative assessments or specialized exams. Additionally, exploring how this framework performs with diverse populations and item designs could further validate its effectiveness and versatility.

Conclusion

DeCarlo’s work offers a robust framework for analyzing mixed-format exams by integrating SDT and IRT models. This unified approach not only enhances our understanding of psychological processes in test-taking but also opens the door to more comprehensive and equitable assessments.

Reference:

DeCarlo, L. T. (2024). Fused SDT/IRT Models for Mixed-Format Exams. Educational and Psychological Measurement, 84(6), 1076-1106. https://doi.org/10.1177/00131644241235333

Related Research

  • Computerized Adaptive Testing Explained (Technological Advances in Psychology, Feb 24, 2026): If you've taken the GRE, GMAT, or certain professional certification exams, you may have noticed something odd: the questions seemed to adjust to your level.…
  • Item Response Theory: How Modern Tests Work (Statistical Methods and Data Analysis, Nov 18, 2025): Every time you take a standardized test — an IQ assessment, a college entrance exam, a professional certification — the questions have been calibrated using…
  • Differential Item Functioning and Response Process (Statistical Methods and Data Analysis, Dec 16, 2024): A test item that scores differently for two groups of equally able examinees is called a differential item functioning (DIF) item, and identifying these items…
  • Bayesian Hierarchical 2PLM with ADVI (Statistical Methods and Data Analysis, Sep 19, 2024): This article discusses a Bayesian hierarchical framework for the Two-Parameter Logistic (2PL) Item Response Theory (IRT) model. By introducing hierarchical priors for both respondent abilities…


Cite This Article

Jouve, X. (2024, December 11). Integrating SDT and IRT Models for Mixed-Format Exams. PsychoLogic. https://www.psychologic.online/sdt-irt-mixed-format-exams/
