TY - JOUR
T1 - Uniform approximation rates and metric entropy of shallow neural networks
AU - Ma, Limin
AU - Siegel, Jonathan W.
AU - Xu, Jinchao
N1 - Generated from Scopus record by KAUST IRTS on 2023-02-15
PY - 2022/9/1
Y1 - 2022/9/1
AB - We study the approximation properties of the variation spaces corresponding to shallow neural networks with respect to the uniform norm. Specifically, we consider the spectral Barron space, which consists of the convex hull of decaying Fourier modes, and the convex hull of indicator functions of half-spaces, which corresponds to shallow neural networks with a sigmoidal activation function. Up to logarithmic factors, we determine the metric entropy and nonlinear dictionary approximation rates for these spaces with respect to the uniform norm. Combined with previous results with respect to the L^2-norm, this also gives the metric entropy, up to logarithmic factors, with respect to any L^p-norm with 1 ≤ p ≤ ∞. In addition, we study the approximation rates for high-order spectral Barron spaces using shallow neural networks with a ReLU^k activation function. Specifically, we show that for a sufficiently high-order spectral Barron space, ReLU^k networks are able to achieve an approximation rate of n^{-(k+1)} with respect to the uniform norm.
UR - https://link.springer.com/10.1007/s40687-022-00346-y
UR - http://www.scopus.com/inward/record.url?scp=85134396276&partnerID=8YFLogxK
DO - 10.1007/s40687-022-00346-y
M3 - Article
SN - 2522-0144
VL - 9
JO - Research in the Mathematical Sciences
JF - Research in the Mathematical Sciences
IS - 3
ER -