Universality of deep convolutional neural networks

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

359 Scopus Citations

Author(s)

  • Ding-Xuan Zhou

Detail(s)

Original language: English
Pages (from-to): 787-794
Journal / Publication: Applied and Computational Harmonic Analysis
Volume: 48
Issue number: 2
Online published: 13 Jun 2019
Publication status: Published - Mar 2020

Abstract

Deep learning has been widely applied and has brought breakthroughs in speech recognition, computer vision, and many other domains. Deep neural network architectures and their computational issues have been well studied in machine learning. However, a theoretical foundation for understanding the approximation or generalization ability of deep learning methods generated by network architectures such as deep convolutional neural networks is still lacking. Here we show that a deep convolutional neural network (CNN) is universal, meaning that it can approximate any continuous function to arbitrary accuracy when the depth of the network is large enough. This answers an open question in learning theory. Our quantitative estimate, stated tightly in terms of the number of free parameters to be computed, verifies the efficiency of deep CNNs in dealing with high-dimensional data. Our study also demonstrates the role of convolutions in deep CNNs.
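
As a hedged gloss on the universality claim in the abstract (the symbols f, Omega, d, epsilon, J, and f_J below are generic notation introduced here for illustration, not taken from the record), the property can be stated as follows: for every continuous function f on a compact domain Omega in R^d and every accuracy epsilon > 0, there exist a depth J and a deep CNN f_J of that depth such that

    \sup_{x \in \Omega} \lvert f(x) - f_J(x) \rvert \le \varepsilon ,

that is, sufficiently deep CNNs approximate any continuous function uniformly on compact sets.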

Research Area(s)

  • Deep learning
  • Convolutional neural network
  • Universality
  • Approximation theory

Citation Format(s)

Universality of deep convolutional neural networks. / Zhou, Ding-Xuan.
In: Applied and Computational Harmonic Analysis, Vol. 48, No. 2, 03.2020, p. 787-794.
