Volume 36, Issue 1
Precision Matrix Estimation by Inverse Principal Orthogonal Decomposition

Cheng Yong Tang, Yingying Fan & Yinfei Kong

Commun. Math. Res., 36 (2020), pp. 68-92.

Published online: 2020-03

  • Abstract

We investigate the structure of a large precision matrix in Gaussian graphical models by decomposing it into a low-rank component and a remainder whose precision matrix is sparse. Based on this decomposition, we propose to estimate the large precision matrix by inverting a principal orthogonal decomposition (IPOD). The IPOD approach has appealing practical interpretations in conditional graphical models given the low-rank component, and it connects to Gaussian graphical models with latent variables. Specifically, we show that the low-rank component in the decomposition of the large precision matrix can be viewed as the contribution of the latent variables in a Gaussian graphical model. Compared with existing approaches for latent variable graphical models, the IPOD approach is computationally convenient in practice, as it requires inverting only a low-dimensional matrix. To identify the number of latent variables, which is of interest in its own right, we investigate and justify an approach that examines the ratios of adjacent eigenvalues of the sample covariance matrix. Theoretical properties, numerical examples, and a real data application demonstrate the merits of the IPOD approach in its convenience, performance, and interpretability.
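The two computational ingredients described above can be illustrated numerically. The sketch below is not the authors' estimator; it assumes a hypothetical latent factor model with a diagonal (hence trivially sparse-precision) remainder, and uses the eigenvalue-ratio rule to pick the number of latent variables and the Woodbury identity to invert the low-rank-plus-remainder covariance via a small matrix only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical latent factor model x = B f + u: the low-rank part of the
# covariance comes from K latent variables; the remainder is diagonal here
# purely for illustration (a stand-in for a sparse-precision component).
n, p, K = 500, 50, 3
B = rng.normal(size=(p, K))              # factor loadings (assumed)
d = rng.uniform(0.5, 1.5, size=p)        # idiosyncratic variances (assumed)
X = rng.normal(size=(n, K)) @ B.T + rng.normal(size=(n, p)) * np.sqrt(d)

# 1) Choose the number of latent variables as the maximizer of the ratio
#    of adjacent eigenvalues of the sample covariance matrix.
S = np.cov(X, rowvar=False)
ev = np.sort(np.linalg.eigvalsh(S))[::-1]        # descending eigenvalues
ratios = ev[:-1] / ev[1:]
K_hat = int(np.argmax(ratios[: p // 2])) + 1     # search the leading half

# 2) Invert Sigma = B B^T + D via the Woodbury identity, so the only
#    dense inversion is of the K x K "core" matrix.
Sigma = B @ B.T + np.diag(d)
D_inv = np.diag(1.0 / d)
core = np.eye(K) + B.T @ D_inv @ B               # K x K
Omega = D_inv - D_inv @ B @ np.linalg.inv(core) @ B.T @ D_inv

print(K_hat)                                     # recovers K = 3 here
print(np.allclose(Omega, np.linalg.inv(Sigma)))  # Woodbury matches direct inverse
```

In this simulation the ratio rule recovers K = 3, and the Woodbury inverse agrees with direct inversion; the point of the identity is that only a K-dimensional matrix (plus the easily inverted remainder) ever needs to be inverted.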

  • Keywords

High-dimensional data analysis, latent Gaussian graphical model, precision matrix.

  • Copyright

COPYRIGHT: © Global Science Press

  • Email address

yongtang@temple.edu (Cheng Yong Tang)

fanyingy@marshall.usc.edu (Yingying Fan)

yikong@fullerton.edu (Yinfei Kong)

@Article{CMR-36-68,
  author  = {Tang, Cheng Yong and Fan, Yingying and Kong, Yinfei},
  title   = {Precision Matrix Estimation by Inverse Principal Orthogonal Decomposition},
  journal = {Communications in Mathematical Research},
  year    = {2020},
  volume  = {36},
  number  = {1},
  pages   = {68--92},
  issn    = {2707-8523},
  doi     = {10.4208/cmr.2020-0001},
  url     = {http://global-sci.org/intro/article_detail/cmr/15790.html}
}

Cheng Yong Tang, Yingying Fan & Yinfei Kong (2020). Precision Matrix Estimation by Inverse Principal Orthogonal Decomposition. Communications in Mathematical Research, 36(1), 68-92. doi:10.4208/cmr.2020-0001