Abstract
The artificial neural network is a popular framework in machine learning. To empower individual neurons, we recently suggested that the current type of neurons could be upgraded to second-order counterparts, in which the linear operation between the inputs to a neuron and the associated weights is replaced with a nonlinear quadratic operation. A single second-order neuron already has strong nonlinear modeling ability, such as implementing basic fuzzy logic operations. In this paper, we develop a general backpropagation algorithm to train networks consisting of second-order neurons. Numerical studies are performed to verify the generalized backpropagation algorithm. Copyright © 2017 John Wiley & Sons, Ltd.
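The abstract's idea of replacing the linear pre-activation with a quadratic one can be sketched as follows. The specific parameterization here (a product of two linear branches plus a squared-input term) is an assumption chosen for illustration, not necessarily the exact form defined in the paper; the gradient function shows the chain-rule terms a backpropagation step would need.

```python
import numpy as np

def second_order_neuron(x, w1, w2, w3, b1, b2, c):
    """Forward pass of one second-order (quadratic) neuron.

    The usual linear pre-activation w^T x + b is replaced with a
    quadratic operation (one plausible form, assumed for illustration):
        q = (w1^T x + b1)(w2^T x + b2) + w3^T (x * x) + c
    followed by a sigmoid activation.
    """
    u = w1 @ x + b1
    v = w2 @ x + b2
    q = u * v + w3 @ (x * x) + c        # quadratic pre-activation
    return 1.0 / (1.0 + np.exp(-q))     # sigmoid output

def second_order_grads(x, w1, w2, w3, b1, b2, c):
    """Gradients of the neuron output with respect to each parameter,
    as required by one backpropagation step (chain rule through sigmoid)."""
    u = w1 @ x + b1
    v = w2 @ x + b2
    q = u * v + w3 @ (x * x) + c
    y = 1.0 / (1.0 + np.exp(-q))
    dy_dq = y * (1.0 - y)               # derivative of the sigmoid
    return {
        "w1": dy_dq * v * x,            # product rule: d(uv)/dw1 = v * x
        "w2": dy_dq * u * x,
        "w3": dy_dq * (x * x),
        "b1": dy_dq * v,
        "b2": dy_dq * u,
        "c":  dy_dq,
    }
```

A quick sanity check is to compare an analytic gradient against a central finite difference on the same parameter, which is how such hand-derived backpropagation rules are typically verified.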
| Original language | English |
|---|---|
| Article number | e2956 |
| Journal | International Journal for Numerical Methods in Biomedical Engineering |
| Volume | 34 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 1 May 2018 |
| Externally published | Yes |
Funding
This work is partially supported by the Clark & Crossan Endowment Fund at Rensselaer Polytechnic Institute, Troy, NY, USA.
Research Keywords
- artificial neural network
- backpropagation (BP)
- second-order neurons