Optimal Rates of Approximation by Shallow ReLU^k Neural Networks and Applications to Nonparametric Regression

Yunfei Yang*, Ding-Xuan Zhou

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review


Abstract

We study the approximation capacity of some variation spaces corresponding to shallow ReLU^k neural networks. It is shown that sufficiently smooth functions are contained in these spaces with finite variation norms. For functions with less smoothness, approximation rates in terms of the variation norm are established. Using these results, we prove the optimal approximation rates in terms of the number of neurons for shallow ReLU^k neural networks. It is also shown how these results can be used to derive approximation bounds for deep neural networks and convolutional neural networks (CNNs). As applications, we study convergence rates for nonparametric regression using three ReLU neural network models: shallow neural networks, over-parameterized neural networks, and CNNs. In particular, we show that shallow neural networks can achieve the minimax optimal rates for learning Hölder functions, which complements recent results for deep neural networks. It is also proven that over-parameterized (deep or shallow) neural networks can achieve nearly optimal rates for nonparametric regression. © The Author(s) 2024.
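To make the object of study concrete, the following is a minimal NumPy sketch of a shallow (one-hidden-layer) ReLU^k network, f(x) = Σᵢ aᵢ · max(0, wᵢ·x + bᵢ)^k. The function names, the random parameters, and the choice k = 2 are illustrative assumptions, not the paper's construction; the paper's approximation rates concern how well such networks, with a given number of neurons, can approximate target functions.

```python
import numpy as np

def relu_k(x, k=2):
    # ReLU^k activation: max(0, x)^k; k = 1 recovers the standard ReLU.
    return np.maximum(x, 0.0) ** k

def shallow_relu_k_net(x, W, b, a, k=2):
    # One-hidden-layer ReLU^k network:
    #   f(x) = sum_i a[i] * relu_k(W[i] . x + b[i], k)
    # The number of rows of W is the number of neurons.
    return relu_k(x @ W.T + b, k) @ a

# Hypothetical example: an 8-neuron network on inputs in R^3.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 3))   # inner weights, one row per neuron
b = rng.normal(size=8)        # biases
a = rng.normal(size=8)        # outer coefficients
x = rng.normal(size=(5, 3))   # 5 sample points
y = shallow_relu_k_net(x, W, b, a)
print(y.shape)  # (5,)
```

In this parameterization, the variation norm studied in the paper controls (roughly) the size of the outer coefficients relative to normalized inner weights, and the approximation rates are stated in terms of the number of neurons (rows of W).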
Original language: English
Journal: Constructive Approximation
Online published: 26 Feb 2024
DOIs
Publication status: Online published - 26 Feb 2024

Funding

The work described in this paper was partially supported by InnoHK initiative, The Government of the HKSAR, Laboratory for AI-Powered Financial Technologies, the Research Grants Council of Hong Kong [Projects No. CityU 11306220 and 11308020] and National Natural Science Foundation of China [Project No. 12371103] when the second author worked at City University of Hong Kong. We thank the referees for their helpful comments and suggestions on the paper.

Research Keywords

  • Approximation rate
  • Neural network
  • Nonparametric regression
  • Spherical harmonic

Publisher's Copyright Statement

  • This full text is made available under CC-BY 4.0. https://creativecommons.org/licenses/by/4.0/

RGC Funding Information

  • RGC-funded
