Partial least-squares algorithm for weights initialization of backpropagation network

Tzu-Chien Hsiao*, Chii Wann Lin, Huihua Kenny Chiang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

26 Scopus citations

Abstract

This paper proposes a hybrid scheme that sets the initial weights and the optimal number of hidden nodes of the backpropagation network (BPN) by applying the loading weights and factor numbers of the partial least-squares (PLS) algorithm. The joint PLS and BPN method (PLSBPN) starts with a small residual error, modifies the latent weight matrices, and obtains a near-global minimum in the calibration phase. Performances of the BPN, PLS, and PLSBPN were compared for the near-infrared spectroscopic analysis of glucose concentrations in aqueous matrices. The results showed that the PLSBPN had the smallest root mean square error. The PLSBPN approach addresses several conventional problems of the BPN method by providing good initial weights, reducing the calibration time, obtaining an optimal solution, and easily determining the number of hidden nodes.
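The core idea — using PLS loading weights to initialize the hidden layer of a BPN, with the number of PLS factors fixing the number of hidden nodes — can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the NIPALS routine, the synthetic data, and all variable names (`nipals_pls`, `W_hidden`, `n_hidden`) are assumptions for demonstration.

```python
import numpy as np

def nipals_pls(X, y, n_factors):
    """Compute PLS loading-weight vectors via the NIPALS algorithm
    (single response variable). Returns W with one column per factor."""
    Xc = X - X.mean(axis=0)          # center predictors
    yc = y - y.mean()                # center response
    W = np.zeros((X.shape[1], n_factors))
    for k in range(n_factors):
        w = Xc.T @ yc                # loading weights for this factor
        w /= np.linalg.norm(w)
        t = Xc @ w                   # scores
        p = Xc.T @ t / (t @ t)       # X loadings
        q = (yc @ t) / (t @ t)       # y loading
        Xc = Xc - np.outer(t, p)     # deflate X
        yc = yc - q * t              # deflate y
        W[:, k] = w
    return W

# Synthetic calibration data (stand-in for NIR spectra / glucose values)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 8))
y = X @ rng.normal(size=8) + 0.05 * rng.normal(size=40)

# Number of PLS factors determines the hidden-layer size
n_hidden = 3
W = nipals_pls(X, y, n_hidden)

# Initialize the BPN's input-to-hidden weight matrix from the PLS
# loading weights instead of random values; the output-layer weights
# would still start small and be refined by backpropagation.
W_hidden = W.copy()                          # shape (8, 3)
hidden = np.tanh((X - X.mean(axis=0)) @ W_hidden)
```

Because the loading weights already capture the directions in `X` most correlated with `y`, backpropagation starts from a small residual error rather than from a random point, which is the mechanism the abstract credits for faster calibration.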

Original language: English
Pages (from-to): 237-247
Number of pages: 11
Journal: Neurocomputing
Volume: 50
State: Published - 1 Jan 2003

Keywords

  • Backpropagation network
  • Feedforward neural networks
  • Partial least-squares
  • Weights initialization

