Deep distributed convolutional neural networks : Universality
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Zhou, Ding-Xuan
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 895-919 |
| Journal / Publication | Analysis and Applications |
| Volume | 16 |
| Issue number | 06 |
| Online published | 3 May 2018 |
| Publication status | Published - Nov 2018 |
Abstract
Deep learning based on structured deep neural networks has provided powerful applications in various fields. The structures imposed on the networks are crucial: they make deep learning essentially different from classical schemes based on fully connected neural networks. One of the most commonly used structures is generated by convolutions, and the resulting algorithms form the family of deep convolutional neural networks. Despite their power in some practical domains, little is known about the mathematical foundations of deep convolutional neural networks, such as their universality of approximation. In this paper, we propose a family of new structured deep neural networks: deep distributed convolutional neural networks. We show that these networks have the same order of computational complexity as deep convolutional neural networks, and we prove their universality of approximation. Some ideas in our analysis come from ridge approximation, wavelets, and learning theory.
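To illustrate the structure the abstract refers to, the following is a minimal sketch (not the paper's construction) of how a short filter mask induces a sparse, Toeplitz-type linear map via 1-D convolution, in contrast to a dense fully connected layer. The function names `conv_layer` and `deep_cnn`, the ReLU activation, and the choice of "full" convolution are illustrative assumptions.

```python
import numpy as np

def conv_layer(x, w, b):
    # A filter mask w (a short vector) acts on x through 1-D
    # convolution, which is a sparse Toeplitz-type linear map;
    # "full" mode sends R^d to R^(d + len(w) - 1).
    # A bias shift and ReLU activation follow (illustrative choice).
    z = np.convolve(x, w, mode="full")
    return np.maximum(z - b, 0.0)

def deep_cnn(x, masks, biases):
    # Stack several convolutional layers; the width grows by
    # len(w) - 1 at each layer, so each bias vector must match
    # the current layer width.
    for w, b in zip(masks, biases):
        x = conv_layer(x, w, b)
    return x
```

Because each layer is parameterized by a short mask rather than a dense weight matrix, the parameter count per layer is the mask length rather than the product of the layer widths, which is the source of the computational-complexity comparison mentioned in the abstract.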
Research Area(s)
- convolutional neural networks, deep distributed convolutional neural networks, deep learning, filter mask, universality
Citation Format(s)
Deep distributed convolutional neural networks: Universality. / Zhou, Ding-Xuan.
In: Analysis and Applications, Vol. 16, No. 06, 11.2018, p. 895-919.