Bayesian Semiparametric Longitudinal Inverse-Probit Mixed Models for Category Learning
Minerva Mukhopadhyay,
Jacie R. McHaney,
Bharath Chandrasekaran and
Abhra Sarkar
Additional contact information
Minerva Mukhopadhyay: Indian Institute of Technology
Jacie R. McHaney: Northwestern University
Bharath Chandrasekaran: Northwestern University
Abhra Sarkar: University of Texas at Austin
Psychometrika, 2024, vol. 89, issue 2, No 5, 485 pages
Abstract:
Understanding how the adult human brain learns novel categories is an important problem in neuroscience. Drift-diffusion models are popular in such contexts for their ability to mimic the underlying neural mechanisms. One such model for gradual longitudinal learning was recently developed in Paulon et al. (J Am Stat Assoc 116:1114–1127, 2021). In practice, category response accuracies are often the only reliable measure recorded by behavioral scientists to describe human learning. To our knowledge, however, drift-diffusion models for such scenarios have never been considered in the literature before. To address this gap, in this article, we build carefully on Paulon et al. (2021), but now with latent response times integrated out, to derive a novel biologically interpretable class of ‘inverse-probit’ categorical probability models for observed categories alone. This new marginal model, however, presents significant identifiability and inferential challenges not encountered for the original joint model in Paulon et al. (2021). We address these new challenges using a novel projection-based approach with a symmetry-preserving identifiability constraint that allows us to work with conjugate priors in an unconstrained space. We adapt the model for group and individual-level inference in longitudinal settings. Building again on the model’s latent variable representation, we design an efficient Markov chain Monte Carlo algorithm for posterior computation. We evaluate the empirical performance of the method through simulation experiments and illustrate its practical efficacy in applications to longitudinal tone learning studies.
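The ‘inverse-probit’ construction described in the abstract arises from a race of evidence accumulators: under a drift-diffusion model, each candidate category's first-passage time to its decision boundary follows an inverse Gaussian distribution, and the observed category is the one whose accumulator finishes first; integrating out the latent response times leaves a probability model over categories alone. The following is a minimal, hypothetical Monte Carlo sketch of that marginalization (not the authors' code; the drift and boundary values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def category_probs(drifts, boundaries, n_sim=200_000):
    """Monte Carlo estimate of P(category = d) in an inverse Gaussian race.

    Each accumulator d has first-passage time distributed as
    inverse Gaussian with mean b_d / mu_d and shape b_d**2,
    where mu_d is the drift and b_d the boundary.
    """
    drifts = np.asarray(drifts, dtype=float)
    boundaries = np.asarray(boundaries, dtype=float)
    means = boundaries / drifts      # IG mean parameter b_d / mu_d
    shapes = boundaries ** 2         # IG shape parameter b_d^2
    # One first-passage time per accumulator per simulated trial.
    times = rng.wald(means, shapes, size=(n_sim, len(drifts)))
    # The fastest accumulator determines the observed category;
    # averaging over trials integrates out the latent response times.
    winners = times.argmin(axis=1)
    return np.bincount(winners, minlength=len(drifts)) / n_sim

# Illustrative example: two categories, the first with the stronger drift,
# so it should win the race more often.
p = category_probs(drifts=[2.0, 1.0], boundaries=[1.0, 1.0])
```

In the paper, these category probabilities are obtained analytically rather than by simulation, with drifts and boundaries modeled as smooth functions of learning time via B-splines; the sketch above only illustrates the race mechanism that the marginal model encodes.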
Keywords: category learning; B-splines; drift-diffusion models; functional models; inverse Gaussian distributions; longitudinal mixed models; speech learning
Date: 2024
Downloads:
http://link.springer.com/10.1007/s11336-024-09947-8 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:psycho:v:89:y:2024:i:2:d:10.1007_s11336-024-09947-8
Ordering information: This journal article can be ordered from
http://www.springer. ... gy/journal/11336/PS2
DOI: 10.1007/s11336-024-09947-8
Psychometrika is currently edited by Irini Moustaki
More articles in Psychometrika from Springer, The Psychometric Society
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.