On the inability of Gaussian process regression to optimally learn compositional functions
File(s): 2205.07764.pdf (302.36 KB)
Accepted version
Author(s)
Giordano, M
Ray, K
Schmidt-Hieber, J
Type
Conference Paper
Abstract
We rigorously prove that deep Gaussian process priors can outperform Gaussian process priors if the target function has a compositional structure. To this end, we study information-theoretic lower bounds for posterior contraction rates for Gaussian process regression in a continuous regression model. We show that if the true function is a generalized additive function, then the posterior based on any mean-zero Gaussian process can only recover the truth at a rate that is strictly slower than the minimax rate by a polynomial factor in the sample size n.
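For illustration, the setting of the abstract can be sketched numerically: fit a standard mean-zero Gaussian process posterior mean to noisy samples of a compositional (generalized additive) target. This is a minimal NumPy sketch, not the paper's construction; the specific target function, kernel, lengthscale, and noise level below are illustrative assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical compositional target: f(x) = g(h1(x1) + h2(x2)),
# an illustrative generalized additive function (not the one
# constructed in the paper's lower-bound argument).
def f(x):
    return np.cos(np.sin(3 * x[:, 0]) + x[:, 1] ** 2)

# Regression model: y_i = f(x_i) + noise
n = 200
X = rng.uniform(0.0, 1.0, size=(n, 2))
y = f(X) + 0.1 * rng.standard_normal(n)

# Mean-zero GP prior with a squared-exponential kernel (assumed
# lengthscale 0.3); the posterior mean is (K + sigma^2 I)^{-1}-weighted
# kernel interpolation of the observations.
def rbf(A, B, ell=0.3):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * ell ** 2))

K = rbf(X, X)
alpha = np.linalg.solve(K + 0.01 * np.eye(n), y)

# Evaluate the posterior mean on fresh test points.
Xtest = rng.uniform(0.0, 1.0, size=(500, 2))
post_mean = rbf(Xtest, X) @ alpha
rmse = np.sqrt(np.mean((post_mean - f(Xtest)) ** 2))
print(f"test RMSE of GP posterior mean: {rmse:.3f}")
```

The sketch only demonstrates the estimator under study; the paper's point is about rates, i.e. how this error decays as n grows, which a single fit cannot show.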
Date Issued
2022-11-28
Date Acceptance
2022-09-15
ISSN
1049-5258
Start Page
1
End Page
13
Journal / Book Title
Advances in neural information processing systems
Copyright Statement
© 2022 The Author(s)
Identifier
https://proceedings.neurips.cc/paper_files/paper/2022/hash/8c420176b45e923cf99dee1d7356a763-Abstract-Conference.html
Source
36th Conference on Neural Information Processing Systems (NeurIPS 2022)
Publication Status
Published
Start Date
2022-11-28
Finish Date
2022-12-09
Country
Virtual
Date Publish Online
2022-11-28