Computation for latent variable model estimation: a unified stochastic proximal framework
Latent variable models play a central role in psychometrics and related fields. In many modern applications, inference based on latent variable models involves one or more of the following features: (1) the presence of many latent variables, (2) observed and latent variables that are continuous, discrete, or a combination of both, (3) constraints on parameters, and (4) penalties on parameters to impose model parsimony. The estimation often involves maximizing an objective function based on a marginal likelihood/pseudo-likelihood, possibly with constraints and/or penalties on the parameters. Solving this optimization problem is highly non-trivial, due to the complexities brought by the features mentioned above. Although several efficient algorithms have been proposed, a unified computational framework that takes all these features into account is still lacking. In this paper, we fill this gap. Specifically, we provide a unified formulation of the optimization problem and then propose a quasi-Newton stochastic proximal algorithm. Theoretical properties of the proposed algorithm are established. Its computational efficiency and robustness are demonstrated by simulation studies under various latent variable model estimation settings.
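To make the optimization setting concrete, below is a minimal, generic sketch of a stochastic proximal gradient update for a penalized marginal-likelihood problem with an L1 penalty. This is an illustration of the general technique only, not the paper's quasi-Newton stochastic proximal algorithm; the function names (`soft_threshold`, `grad_estimator`, `stochastic_proximal_gradient`), the step-size rule, and the toy data are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the L1 penalty lam * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def stochastic_proximal_gradient(grad_estimator, theta0, lam, n_iter=1000, seed=0):
    """Generic stochastic proximal gradient loop (illustrative sketch).

    grad_estimator(theta, rng) should return a noisy (e.g. Monte Carlo or
    mini-batch) estimate of the gradient of the negative log-likelihood.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for t in range(1, n_iter + 1):
        step = 1.0 / (t + 10.0)          # Robbins-Monro type decaying step size
        g = grad_estimator(theta, rng)   # stochastic gradient estimate
        theta = soft_threshold(theta - step * g, step * lam)  # proximal step
    return theta

# Toy usage: L1-penalized estimation of a mean vector with mini-batch gradients.
if __name__ == "__main__":
    data = np.random.default_rng(1).normal(loc=[2.0, 0.0, -1.5], size=(500, 3))

    def grad_estimator(theta, rng):
        # Mini-batch gradient of 0.5 * ||theta - sample mean||^2.
        batch = data[rng.choice(len(data), size=32, replace=False)]
        return theta - batch.mean(axis=0)

    print(stochastic_proximal_gradient(grad_estimator, theta0=np.zeros(3), lam=0.1))
```

In a latent variable model, the stochastic gradient would instead come from sampling the latent variables (e.g. by MCMC) at the current parameter value; a quasi-Newton variant additionally rescales the gradient by an approximate curvature matrix, which this sketch omits.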
| Field | Value |
|---|---|
| Item Type | Article |
| Copyright holders | © 2022 The Authors |
| Departments | LSE > Academic Departments > Statistics |
| DOI | 10.1007/s11336-022-09863-9 |
| Date Deposited | 28 Mar 2022 |
| Acceptance Date | 27 Mar 2022 |
| URI | https://researchonline.lse.ac.uk/id/eprint/114489 |
Explore Further
- https://www.lse.ac.uk/Statistics/People/Yunxiao-Chen (Author)
- https://www.scopus.com/pages/publications/85129565917 (Scopus publication)
- https://www.springer.com/journal/11336 (Official URL)