Successive-least-squares error algorithm on minimum description length neural networks for time series prediction

Yu Ning Lai, Shiu Yin Yuen

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

1 Citation (Scopus)

Abstract

A successive least-squares approach is proposed to find an optimal model of a flat neural network in a short time. It builds on a minimum description length (MDL) neural network that uses the MDL principle as its stopping criterion. Unlike conventional algorithms for flat neural networks, which apply the least-squares technique only to the weights between the hidden and output layers, it extends the technique to the weights between the input and hidden layers. We apply the algorithm to the chaotic Mackey-Glass time series and a chaotic laser time series. The results show that it provides satisfactory prediction within a small amount of time.
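The conventional step the abstract contrasts against — solving only the hidden-to-output weights of a flat (single-hidden-layer) network in closed form by least squares — can be sketched as follows. This is a minimal illustration under stated assumptions: the toy series (a logistic map standing in for Mackey-Glass), the network sizes, and the random input weights are all hypothetical choices, and the paper's successive extension of least squares to the input-to-hidden weights and its MDL stopping criterion are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy chaotic-looking series (logistic map) standing in for Mackey-Glass.
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

# Embed: predict x[t] from the previous 4 samples (delay dimension is an assumption).
d = 4
X = np.stack([x[t - d:t] for t in range(d, len(x))])   # inputs, shape (496, 4)
y = x[d:]                                              # one-step-ahead targets

# Conventional flat network: fixed random input-to-hidden weights;
# only the hidden-to-output weights are solved.
n_hidden = 20
W_in = rng.normal(size=(d, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W_in + b)                  # hidden-layer activations
H1 = np.hstack([H, np.ones((len(H), 1))])  # append a bias column

# Closed-form least-squares solution for the output weights.
w_out, *_ = np.linalg.lstsq(H1, y, rcond=None)
pred = H1 @ w_out
mse = np.mean((pred - y) ** 2)
print(f"training MSE: {mse:.6f}")
```

Because the output layer is linear in its weights, this step is a single linear least-squares solve; the paper's contribution is to apply the same technique successively to the input-layer weights as well, rather than leaving them fixed.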
Original language: English
Title of host publication: Proceedings - International Conference on Pattern Recognition
Pages: 609-612
Volume: 4
DOIs
Publication status: Published - 2004
Event: Proceedings of the 17th International Conference on Pattern Recognition, ICPR 2004 - Cambridge, United Kingdom
Duration: 23 Aug 2004 - 26 Aug 2004

Publication series

Name
Volume: 4
ISSN (Print): 1051-4651

Conference

Conference: Proceedings of the 17th International Conference on Pattern Recognition, ICPR 2004
Place: United Kingdom
City: Cambridge
Period: 23/08/04 - 26/08/04
