DC algorithm for estimation of sparse Gaussian graphical models.

Sparse estimation of a Gaussian graphical model (GGM) is an important technique for making relationships between observed variables more interpretable. Various methods have been proposed for sparse GGM estimation, including the graphical lasso that uses the ℓ1 norm regularization term, and other methods that use nonconvex regularization terms. Most of these methods approximate the ℓ0 (pseudo) norm by more tractable functions; however, to estimate more accurate solutions, it is preferable to directly use the ℓ0 norm for counting the number of nonzero elements. To this end, we focus on sparse estimation of GGM with the cardinality constraint based on the ℓ0 norm. Specifically, we convert the cardinality constraint into an equivalent constraint based on the largest-K norm, and reformulate the resultant constrained optimization problem into an unconstrained penalty form with a DC (difference of convex functions) representation. To solve this problem efficiently, we design a DC algorithm in which the graphical lasso algorithm is repeatedly executed to solve convex optimization subproblems. Experimental results using two synthetic datasets show that our method achieves results that are comparable to or better than conventional methods for sparse GGM estimation. Our method is particularly advantageous for selecting true edges when cross-validation is used to determine the number of edges. Moreover, our DC algorithm converges within a practical time frame compared to the graphical lasso.
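The key step described in the abstract — linearizing the concave largest-K-norm term so that the K largest entries of the current iterate escape the ℓ1 penalty in the next graphical-lasso subproblem — can be sketched as follows. This is an illustrative NumPy sketch, not the authors' code: the function names `largest_k_norm` and `dc_penalty_weights` are ours, K counts matrix entries rather than symmetric edge pairs, tie-breaking is ignored, and a weighted graphical-lasso solver would consume the resulting penalty matrix at each DC iteration.

```python
import numpy as np

def largest_k_norm(theta, k):
    """Largest-K norm: sum of the k largest absolute off-diagonal entries."""
    p = theta.shape[0]
    off = np.abs(theta[~np.eye(p, dtype=bool)])
    return float(np.sort(off)[::-1][:k].sum())

def dc_penalty_weights(theta, k, rho):
    """Element-wise l1 penalties for the next weighted graphical-lasso subproblem.

    Linearizing the concave part -rho * |||Theta|||_K at the current iterate
    cancels the l1 penalty on the k largest off-diagonal entries (by magnitude),
    so those entries get weight 0 and all other entries keep weight rho.
    """
    p = theta.shape[0]
    off = np.abs(theta[~np.eye(p, dtype=bool)])
    thresh = np.inf if k == 0 else np.sort(off)[::-1][k - 1]
    weights = np.where(np.abs(theta) >= thresh, 0.0, rho)
    np.fill_diagonal(weights, 0.0)  # diagonal is left unpenalized
    return weights

# One DC iteration on a toy precision-matrix iterate:
theta = np.array([[0.0, 3.0, -1.0],
                  [3.0, 0.0,  2.0],
                  [-1.0, 2.0, 0.0]])
print(largest_k_norm(theta, 2))        # 3 + 3 = 6.0
print(dc_penalty_weights(theta, 2, 0.5))
```

In this toy example the two largest off-diagonal entries (the symmetric pair at positions (0, 1) and (1, 0)) receive zero penalty, while the remaining entries keep the full penalty ρ, which is what drives them toward zero in the next convex subproblem.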


Saved in:
Bibliographic Details
Main Authors: Tomokaze Shiratori, Yuichi Takano
Format: Article
Language:English
Published: Public Library of Science (PLoS) 2024-01-01
Series:PLoS ONE
ISSN:1932-6203
Online Access:https://doi.org/10.1371/journal.pone.0315740