Quasi-Equivalence between Width and Depth of Neural Networks

Feng-Lei Fan, Rongjie Lai, Ge Wang*

*Corresponding author for this work

Research output: Journal Publications and Reviews; RGC 21 - Publication in refereed journal; peer-reviewed

1 Citation (Scopus)
1 Download (CityUHK Scholars)

Abstract

While classic studies proved that wide networks allow universal approximation, recent research and the successes of deep learning demonstrate the power of deep networks. Motivated by this symmetry, we investigate whether the design of artificial neural networks should have a directional preference, and what the mechanism of interaction between the width and depth of a network is. Inspired by the De Morgan law, we address this fundamental question by establishing a quasi-equivalence between the width and depth of ReLU networks. We formulate two transforms that map an arbitrary ReLU network to a wide ReLU network and to a deep ReLU network, respectively, so that essentially the same capability as the original network is realized. Based on our findings, a deep network has a wide equivalent, and vice versa, subject to an arbitrarily small error. ©2023 Fenglei Fan, Rongjie Lai, Ge Wang.
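As a minimal, hypothetical illustration of the "deep has a wide equivalent" direction (not the paper's actual transform), the NumPy sketch below builds a small deep ReLU network with one-dimensional input, then constructs a wide one-hidden-layer ReLU network that interpolates it on a knot grid. The names deep_net, wide_net, the layer sizes, the interval [-3, 3], and the grid resolution are assumptions made for this toy example only.

```python
# Hypothetical 1-D illustration (not the paper's construction): a deep ReLU
# network in one dimension computes a continuous piecewise-linear function,
# so a wide one-hidden-layer ReLU network with one neuron per grid knot can
# reproduce it up to an error that shrinks as the grid is refined.
import numpy as np

rng = np.random.default_rng(0)
relu = lambda z: np.maximum(z, 0.0)

# A small "deep" ReLU network: 1 -> 8 -> 8 -> 1, random weights.
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)
W3, b3 = rng.normal(size=(1, 8)), rng.normal(size=1)

def deep_net(x):
    h = relu(x[:, None] * W1.T + b1)   # first hidden layer
    h = relu(h @ W2.T + b2)            # second hidden layer
    return (h @ W3.T + b3).ravel()     # linear output

# Wide one-hidden-layer network that interpolates deep_net on a knot grid:
# g(x) = f(x0) + s0*relu(x - x0) + sum_i (s_i - s_{i-1}) * relu(x - x_i),
# i.e. the piecewise-linear interpolant written with one ReLU per knot.
knots = np.linspace(-3.0, 3.0, 401)
f_knots = deep_net(knots)
slopes = np.diff(f_knots) / np.diff(knots)
coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))  # slope changes

def wide_net(x):
    return f_knots[0] + relu(x[:, None] - knots[:-1]) @ coeffs

# The two networks agree up to a grid-dependent error on the interval.
x_test = np.linspace(-3.0, 3.0, 5000)
err = np.max(np.abs(deep_net(x_test) - wide_net(x_test)))
print(f"max |deep - wide| on [-3, 3]: {err:.2e}")
```

Refining the knot grid drives the reported error toward zero, mirroring the "arbitrarily small error" in the quasi-equivalence statement; the paper's transforms handle the general multivariate case.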
Original language: English
Article number: 183
Pages (from-to): 8742-8763
Journal: Journal of Machine Learning Research
Volume: 24
Issue number: 1
Publication status: Published - Jan 2023
Externally published: Yes

Research Keywords

  • Artificial neural networks
  • deep learning
  • quasi-equivalence
  • ReLU networks
  • wide learning

Publisher's Copyright Statement

  • This full text is made available under CC-BY 4.0. https://creativecommons.org/licenses/by/4.0/
