EconPapers    
 

A Note on the Connection Between Trek Rules and Separable Nonlinear Least Squares in Linear Structural Equation Models

Maximilian S. Ernst, Aaron Peikert, Andreas M. Brandmaier and Yves Rosseel
Additional contact information
Maximilian S. Ernst: Max Planck Institute for Human Development
Aaron Peikert: Max Planck Institute for Human Development
Andreas M. Brandmaier: Max Planck Institute for Human Development
Yves Rosseel: Ghent University

Psychometrika, 2023, vol. 88, issue 1, No 5, 98-116

Abstract: We show that separable nonlinear least squares (SNLLS) estimation is applicable to all linear structural equation models (SEMs) that can be specified in RAM notation. SNLLS is an estimation technique that has been applied successfully to a wide range of models, for example, neural networks and dynamic systems, often leading to improvements in convergence and computation time. It is applicable to models of a special form, where a subset of parameters enters the objective linearly. Recently, Kreiberg et al. (Struct Equ Model Multidiscip J 28(5):725–739, 2021. https://doi.org/10.1080/10705511.2020.1835484) have shown that this is also the case for factor analysis models. We generalize this result to all linear SEMs. To that end, we show that undirected effects (variances and covariances) and mean parameters enter the objective linearly; therefore, in the least squares estimation of structural equation models, only the directed effects have to be obtained iteratively. For model classes without unknown directed effects, SNLLS can be used to compute least squares estimates analytically. To provide deeper insight into the nature of this result, we employ trek rules that link graphical representations of structural equation models to their covariance parametrization. We further give an efficient expression for the gradient, which is crucial for a fast implementation. Results from our simulation indicate that SNLLS leads to improved convergence rates and a reduced number of iterations.
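The core idea the abstract describes — parameters that enter the objective linearly can be solved in closed form at each step, so only the nonlinear parameters need iterative optimization — can be sketched with a minimal, hypothetical example. The exponential-regression model, data, and variable names below are illustrative only and are not from the article; they stand in for the paper's split between directed effects (iterative) and variance/mean parameters (linear).

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative SNLLS sketch: fit y ≈ beta * exp(-lam * t).
# beta enters the residual linearly, so for any fixed lam it has an
# analytic least squares solution; only lam is optimized iteratively.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 50)
y = 2.0 * np.exp(-0.7 * t) + 0.01 * rng.standard_normal(t.size)  # synthetic data

def profiled_sse(lam):
    """For a fixed nonlinear parameter lam, solve the linear parameter
    beta in closed form and return the resulting sum of squared errors."""
    basis = np.exp(-lam * t)              # one-column design "matrix"
    beta = (basis @ y) / (basis @ basis)  # analytic least squares solution
    resid = y - beta * basis
    return resid @ resid

# Iterate only over the nonlinear parameter; the linear one is profiled out.
res = minimize_scalar(profiled_sse, bounds=(0.01, 5.0), method="bounded")
lam_hat = res.x
basis = np.exp(-lam_hat * t)
beta_hat = (basis @ y) / (basis @ basis)  # recover the linear estimate
print(lam_hat, beta_hat)  # close to the generating values 0.7 and 2.0
```

The profiled objective depends on one parameter instead of two, which is the source of the convergence and speed gains the abstract reports in the SEM setting, where all variance, covariance, and mean parameters can be profiled out at once.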

Keywords: Gaussian graphical model; graph theory; numerical optimization; least squares estimation; RAM notation (search for similar items in EconPapers)
Date: 2023
References: View references in EconPapers; view complete reference list from CitEc

Downloads: (external link)
http://link.springer.com/10.1007/s11336-022-09891-5 Abstract (text/html)
Access to the full text of the articles in this series is restricted.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Export reference: BibTeX RIS (EndNote, ProCite, RefMan) HTML/Text

Persistent link: https://EconPapers.repec.org/RePEc:spr:psycho:v:88:y:2023:i:1:d:10.1007_s11336-022-09891-5

Ordering information: This journal article can be ordered from
http://www.springer. ... gy/journal/11336/PS2

DOI: 10.1007/s11336-022-09891-5

Access Statistics for this article

Psychometrika is currently edited by Irini Moustaki

More articles in Psychometrika from Springer, The Psychometric Society
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-03-20
Handle: RePEc:spr:psycho:v:88:y:2023:i:1:d:10.1007_s11336-022-09891-5