Spiked Dirichlet Process Priors for Gaussian Process Models
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2010-01-01 |
| Series: | Journal of Probability and Statistics |
| Online Access: | http://dx.doi.org/10.1155/2010/201489 |
| Summary: | We expand a framework for Bayesian variable selection for Gaussian process (GP) models by employing spiked Dirichlet process (DP) prior constructions over set partitions containing covariates. Our approach results in a nonparametric treatment of the distribution of the covariance parameters of the GP covariance matrix that in turn induces a clustering of the covariates. We evaluate two prior constructions: the first employs a mixture of a point mass and a continuous distribution as the centering distribution for the DP prior, thereby clustering all covariates; the second employs a mixture of a spike and a DP prior with a continuous centering distribution, which induces clustering of the selected covariates only. DP models borrow information across covariates through model-based clustering. In particular, our simulation results show a reduction in posterior sampling variability and, in turn, enhanced prediction performance. In our model formulations, we accomplish posterior inference by employing novel combinations and extensions of existing algorithms for inference with DP prior models, and we compare performance under the two prior constructions. (A notational sketch of the two constructions follows the record below.) |
| ISSN: | 1687-952X, 1687-9538 |
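
The abstract describes the two spiked DP prior constructions only verbally. The following is a minimal notational sketch of what such constructions typically look like; the symbols are assumptions not taken from this record ($\rho_k$ for the GP covariance parameter tied to covariate $k$, $\delta_0$ for the spike at the value that excludes a covariate, $\pi$ for the prior exclusion probability, $\alpha$ for the DP concentration, and $F$ for a continuous base distribution).

```latex
% Sketch only, with assumed notation; not the paper's exact formulation.

% Construction 1: the spike sits inside the DP centering distribution,
% so all covariates (selected or not) are clustered by the DP.
\begin{align*}
  \rho_k \mid G &\sim G, \qquad k = 1,\dots,p,\\
  G &\sim \mathrm{DP}\!\left(\alpha,\, G_0\right), \qquad
  G_0 = \pi\,\delta_0 + (1-\pi)\,F.
\end{align*}

% Construction 2: the spike is mixed with the DP itself,
% so only the selected covariates are clustered.
\begin{align*}
  \rho_k &\sim \pi\,\delta_0 + (1-\pi)\,G, \qquad k = 1,\dots,p,\\
  G &\sim \mathrm{DP}\!\left(\alpha,\, F\right).
\end{align*}
```

The structural difference mirrors the abstract: in the first construction the point mass is part of the centering distribution, so excluded and included covariates alike fall into DP clusters, while in the second the spike is placed outside the DP, so the model-based clustering acts only on the covariates drawn from the continuous component.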