Generalized backpropagation algorithm for training second-order neural networks

Fenglei Fan, Wenxiang Cong, Ge Wang

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

The artificial neural network is a popular framework in machine learning. To empower individual neurons, we recently suggested that the current type of neuron could be upgraded to a second-order counterpart, in which the linear operation between the inputs to a neuron and the associated weights is replaced with a nonlinear quadratic operation. A single second-order neuron already has strong nonlinear modeling ability, such as implementing basic fuzzy logic operations. In this paper, we develop a generalized backpropagation algorithm to train networks consisting of second-order neurons. Numerical studies are performed to verify the generalized backpropagation algorithm. Copyright © 2017 John Wiley & Sons, Ltd.
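To make the idea concrete, here is a minimal sketch of training a second-order neuron by backpropagation. The specific quadratic form used below (a full quadratic pre-activation z = xᵀAx + wᵀx + b with a sigmoid output and cross-entropy loss) is an assumption for illustration, not necessarily the authors' exact formulation; it demonstrates the claimed nonlinear power by letting a single such neuron learn XOR, which no single first-order neuron can fit.

```python
import numpy as np

# Illustrative sketch (assumed quadratic form, not the paper's exact one):
# a single second-order neuron with pre-activation
#     z = x^T A x + w^T x + b
# trained by gradient descent (backpropagation through the quadratic term)
# on the XOR problem.

rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])  # XOR targets

A = rng.normal(scale=0.5, size=(2, 2))  # second-order (quadratic) weights
w = rng.normal(scale=0.5, size=2)       # first-order (linear) weights
b = 0.0
lr = 0.2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    for x, target in zip(X, t):
        z = x @ A @ x + w @ x + b
        y = sigmoid(z)
        # Backpropagation with cross-entropy loss: dL/dz = y - target.
        dz = y - target
        A -= lr * dz * np.outer(x, x)   # dz/dA = x x^T (quadratic term)
        w -= lr * dz * x                # dz/dw = x     (linear term)
        b -= lr * dz                    # dz/db = 1     (bias)

pred = np.array([sigmoid(x @ A @ x + w @ x + b) for x in X])
print(np.round(pred))
```

The only change from ordinary backpropagation is the extra gradient term dz/dA = xxᵀ flowing into the quadratic weights; everything upstream of z is unchanged, which is why the standard chain rule generalizes directly.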
Original language: English
Article number: e2956
Journal: International Journal for Numerical Methods in Biomedical Engineering
Volume: 34
Issue number: 5
DOIs
Publication status: Published - 1 May 2018
Externally published: Yes

Bibliographical note

Publication details (e.g. title, author(s), publication statuses and dates) are captured on an “AS IS” and “AS AVAILABLE” basis at the time of record harvesting from the data source. Suggestions for further amendments or supplementary information can be sent to [email protected].

Funding

This work is partially supported by the Clark & Crossan Endowment Fund at Rensselaer Polytechnic Institute, Troy, NY, USA.

Research Keywords

  • artificial neural network
  • backpropagation (BP)
  • second-order neurons
