Deep distributed convolutional neural networks : Universality

Research output: Journal Publications and Reviews · RGC 21 - Publication in refereed journal · Peer-review

118 Scopus Citations

Author(s)

  • Ding-Xuan Zhou

Detail(s)

Original language: English
Pages (from-to): 895-919
Journal / Publication: Analysis and Applications
Volume: 16
Issue number: 06
Online published: 3 May 2018
Publication status: Published - Nov 2018

Abstract

Deep learning based on structured deep neural networks has provided powerful applications in various fields. The structures imposed on deep neural networks are crucial; they make deep learning essentially different from classical schemes based on fully connected neural networks. One commonly used deep neural network structure is generated by convolutions, and the resulting deep learning algorithms form the family of deep convolutional neural networks. Despite their power in some practical domains, little is known about the mathematical foundations of deep convolutional neural networks, such as their universality of approximation. In this paper, we propose a family of new structured deep neural networks: deep distributed convolutional neural networks. We show that these deep neural networks have the same order of computational complexity as deep convolutional neural networks, and we prove their universality of approximation. Some ideas of our analysis come from ridge approximation, wavelets, and learning theory.
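The abstract refers to deep networks generated by convolutions with a filter mask. As a minimal illustrative sketch (not the paper's construction; all names, the ReLU activation, and the zero-padded "full" convolution are assumptions chosen for concreteness), each layer can be viewed as convolving the previous layer's output with a short filter mask of length s+1, so the width grows by only s per layer, which is consistent with the claim that such networks keep a low order of computational complexity:

```python
import numpy as np

def deep_cnn(x, masks, biases):
    """Apply a stack of 1-D convolutional layers.

    Each layer convolves the current vector with a filter mask
    (numpy's 'full' convolution: an input of length d and a mask of
    length s+1 give an output of length d+s), subtracts a bias
    vector, and applies the ReLU activation. Purely illustrative.
    """
    for w, b in zip(masks, biases):
        x = np.maximum(np.convolve(x, w) - b, 0.0)  # ReLU(conv(x, w) - b)
    return x

rng = np.random.default_rng(0)
d, s, layers = 8, 2, 2                       # toy sizes, chosen arbitrarily
x = rng.standard_normal(d)                   # input of dimension d
masks = [rng.standard_normal(s + 1) for _ in range(layers)]
biases = [np.zeros(d + s * (j + 1)) for j in range(layers)]  # width grows by s per layer
y = deep_cnn(x, masks, biases)
print(y.shape)  # (12,): d + layers * s = 8 + 2*2
```

The point of the sketch is only the structural one made in the abstract: a convolutional layer is parameterized by a short filter mask rather than a full weight matrix, so the parameter count per layer is O(s) instead of O(d^2).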

Research Area(s)

  • convolutional neural networks, deep distributed convolutional neural networks, Deep learning, filter mask, universality

Citation Format(s)

Deep distributed convolutional neural networks: Universality. / Zhou, Ding-Xuan.
In: Analysis and Applications, Vol. 16, No. 06, 11.2018, p. 895-919.