Bibtype | Inproceedings
Bibkey | Munteanu/etal/2019a
Author | Munteanu, Alexander and Nayebi, Amin and Poloczek, Matthias
Title | A Framework for {B}ayesian Optimization in Embedded Subspaces
Booktitle | Proceedings of the 36th International Conference on Machine Learning (ICML)
Series | Proceedings of Machine Learning Research
Volume | 97
Pages | 4752--4761
Address | Long Beach, California, USA
Publisher | PMLR
Abstract | We present a theoretically founded approach for high-dimensional Bayesian optimization based on low-dimensional subspace embeddings. We prove that the error in the Gaussian process model is bounded tightly when going from the original high-dimensional search domain to the low-dimensional embedding. This implies that the optimization process in the low-dimensional embedding proceeds essentially as if it were run directly on an unknown active subspace of low dimensionality. The argument applies to a large class of algorithms and GP models, including non-stationary kernels. Moreover, we provide an efficient implementation based on hashing and demonstrate empirically that this subspace embedding achieves considerably better results than the previously proposed methods for high-dimensional BO based on Gaussian matrix projections and structure-learning.
Month | 06
Year | 2019
Projekt | SFB876-C4
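The abstract above mentions an efficient hashing-based subspace embedding for high-dimensional Bayesian optimization. The following is a minimal illustrative sketch of that general idea, not the paper's implementation: a count-sketch-style mapping from a low-dimensional box to the high-dimensional search domain, with the dimensions (D=100, d=8), the toy objective, and the random-search loop (standing in for a GP-based acquisition loop) all being assumptions chosen for illustration.

```python
# Illustrative sketch only (not the authors' code): a hashing-based
# subspace embedding for optimizing over a high-dimensional box.
import numpy as np

rng = np.random.default_rng(0)

D, d = 100, 8  # assumed high-dimensional domain and low embedding dimension

# Hashing embedding: each original coordinate i is assigned to one low
# dimension h[i] with a random sign s[i], so x_i = s_i * y_{h(i)}.
h = rng.integers(0, d, size=D)
s = rng.choice([-1.0, 1.0], size=D)

def lift(y):
    """Map a low-dimensional point y in [-1,1]^d to x in [-1,1]^D."""
    return s * y[h]

def objective(x):
    """Toy black-box function on the high-dimensional domain (assumption)."""
    return np.sum((x - 0.3) ** 2)

# The search runs entirely in the d-dimensional embedded space; a simple
# random search stands in here for the GP surrogate and acquisition loop.
best_y, best_val = None, np.inf
for _ in range(200):
    y = rng.uniform(-1.0, 1.0, size=d)
    val = objective(lift(y))
    if val < best_val:
        best_y, best_val = y, val

print("best value found:", best_val)
```

In an actual embedded-subspace BO setup, the Gaussian process model and acquisition optimization would operate on the d-dimensional points y, and lift(y) would only be used to evaluate the expensive objective in the original D-dimensional domain.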