glasso: Graphical lasso for learning sparse inverse covariance matrices

Aramayis Dallakyan (Texas A&M University)

2021 Stata Conference from Stata Users Group

Abstract: In modern multivariate statistics, where high-dimensional datasets are ubiquitous, learning large inverse covariance matrices is a fundamental problem. A popular approach is to place a sparsity-inducing penalty on the Gaussian log-likelihood and solve the resulting convex optimization problem. The graphical lasso (Glasso) (Friedman et al. 2008) is an efficient and popular algorithm for imposing sparsity on the inverse covariance matrix. In this article, we introduce a corresponding new command, glasso, and explore the details of the algorithm. Moreover, we discuss widely used criteria for tuning-parameter selection, such as the extended Bayesian information criterion (eBIC) and cross-validation (CV), and introduce corresponding commands. Simulation results and real-data analysis illustrate the use of glasso.
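The command described in the abstract is for Stata, but the estimator itself is widely implemented. As a rough illustration only (not the author's glasso command), the following Python sketch uses scikit-learn's GraphicalLassoCV, which fits the same penalized Gaussian log-likelihood and selects the tuning parameter by cross-validation, one of the two criteria the abstract mentions; the simulated tridiagonal precision matrix is an assumed toy example:

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)

# Toy example (assumed, not from the paper): simulate Gaussian data whose
# true precision (inverse covariance) matrix is sparse and tridiagonal.
p = 5
theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
sigma = np.linalg.inv(theta)
X = rng.multivariate_normal(np.zeros(p), sigma, size=500)

# Graphical lasso with the penalty parameter chosen by cross-validation,
# analogous to the CV tuning criterion discussed in the paper.
model = GraphicalLassoCV().fit(X)

print("selected alpha:", model.alpha_)
print("estimated precision matrix:")
print(np.round(model.precision_, 2))
```

The estimated precision matrix should concentrate its large entries on the diagonal and first off-diagonals, recovering the sparsity pattern of the simulated model; eBIC-based selection, also covered in the paper, would replace the CV step with an information-criterion search over the penalty grid.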

Date: 2021-08-07
New Economics Papers: this item is included in nep-isf and nep-ore
Downloads: (external link)
http://fmwww.bc.edu/repec/scon2021/US21_Dallakyan.pdf



Persistent link: https://EconPapers.repec.org/RePEc:boc:scon21:18


More papers in 2021 Stata Conference from Stata Users Group.
Bibliographic data for series maintained by Christopher F Baum.

Handle: RePEc:boc:scon21:18