Abstract
While classic studies proved that wide networks allow universal approximation, recent research and the successes of deep learning demonstrate the power of deep networks. Given this symmetry between width and depth, we investigate whether the design of artificial neural networks should have a directional preference, and what the mechanism of interaction is between the width and depth of a network. Inspired by De Morgan's law, we address this fundamental question by establishing a quasi-equivalence between the width and depth of ReLU networks. Specifically, we formulate two transforms that map an arbitrary ReLU network to a wide ReLU network and to a deep ReLU network, respectively, so that essentially the same capability as the original network is implemented. Based on our findings, a deep network has a wide equivalent, and vice versa, subject to an arbitrarily small error. ©2023 Fenglei Fan, Rongjie Lai, Ge Wang.
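The paper's two transforms are not reproduced here, but the width-depth interplay the abstract describes can be made concrete on a classic toy case. Below is a minimal NumPy sketch (our illustration, not the authors' construction) of the well-known sawtooth example: composing a small "hat" subnetwork k times gives a deep network using only O(k) ReLU units, while the same piecewise-linear function, written as a single hidden layer, needs on the order of 2^k units. All function names (`hat`, `sawtooth_deep`, `sawtooth_wide`) are hypothetical.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Hat function on [0, 1]: rises from 0 to 1 on [0, 1/2], falls back to 0
# on [1/2, 1]. Three ReLU units in one hidden layer realize it exactly.
def hat(x):
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

# Deep, narrow network: composing the hat k times yields a sawtooth with
# 2**(k-1) teeth while using only 3*k ReLU units in total.
def sawtooth_deep(x, k):
    y = x
    for _ in range(k):
        y = hat(y)
    return y

# Wide, shallow network: the same sawtooth has 2**k linear pieces, so a
# one-hidden-layer ReLU representation needs one unit per kink, i.e. the
# width grows exponentially in k.
def sawtooth_wide(x, k):
    n = 2 ** (k - 1)                        # number of teeth
    knots = np.arange(2 * n + 1) / (2 * n)  # breakpoints of the pieces
    y = np.zeros_like(x)
    slope_prev = 0.0
    for i, t in enumerate(knots[:-1]):
        slope = 2.0 * n if i % 2 == 0 else -2.0 * n  # slopes alternate
        y += (slope - slope_prev) * relu(x - t)      # one ReLU per kink
        slope_prev = slope
    return y

xs = np.linspace(0.0, 1.0, 1001)
assert np.allclose(sawtooth_deep(xs, 4), sawtooth_wide(xs, 4))
```

The assertion confirms that the deep and wide representations agree on [0, 1], which is the sense in which one architecture "implements" the other in this toy case; the paper's quasi-equivalence result is more general and allows an arbitrarily small error rather than exact equality.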
| Original language | English |
| --- | --- |
| Article number | 183 |
| Pages (from-to) | 8742-8763 |
| Journal | Journal of Machine Learning Research |
| Volume | 24 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2023 |
| Externally published | Yes |
Research Keywords
- Artificial neural networks
- deep learning
- quasi-equivalence
- ReLU networks
- wide learning
Publisher's Copyright Statement
- This full text is made available under a CC BY 4.0 license. https://creativecommons.org/licenses/by/4.0/