A new dynamic optimal learning rate for a two-layer neural network

Tong Zhang*, C. L. Philip Chen, Chi-Hsu Wang, Sik Chung Tam

*Corresponding author for this work

Research output: Conference contribution (peer-reviewed), chapter in conference proceedings

3 Scopus citations

Abstract

The learning rate is crucial to the training of a two-layer neural network (NN). Many studies have therefore sought an optimal learning rate, one that achieves the maximum error reduction at every iteration. In this paper, we show that this optimal learning rate can be improved further: by revising the search direction, we derive a new dynamic optimal learning rate that converges in fewer iterations than the previous approach. After the first iteration, there exists a ratio k between our new optimal learning rate and the previous one. In the same experiments, the two-layer NN trained with the new optimal learning rate outperforms earlier approaches. We therefore conclude that the new dynamic optimal learning rate is a useful tool for neural network applications.
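The abstract does not spell out the authors' update rule, so the following is an illustrative sketch only of the general idea behind a dynamic optimal learning rate: at each gradient-descent step, the rate is chosen to give the largest error reduction for that iteration. The tiny 1-2-1 network, the candidate-rate grid, and all function names here are assumptions for illustration, not the paper's formulation.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny two-layer (1-2-1) network; p holds all seven parameters:
# [w1a, w1b, b1a, b1b, w2a, w2b, b2]
def forward(p, x):
    h1 = sigmoid(p[0] * x + p[2])
    h2 = sigmoid(p[1] * x + p[3])
    return p[4] * h1 + p[5] * h2 + p[6]

def loss(p, data):
    # Mean squared error over the data set
    return sum((forward(p, x) - y) ** 2 for x, y in data) / len(data)

def num_grad(p, data, eps=1e-6):
    # Central-difference numerical gradient (keeps the sketch dependency-free)
    g = []
    for i in range(len(p)):
        hi = list(p); hi[i] += eps
        lo = list(p); lo[i] -= eps
        g.append((loss(hi, data) - loss(lo, data)) / (2 * eps))
    return g

def best_eta(p, g, data, etas):
    # Dynamic choice: the candidate rate that minimizes the loss
    # after this single gradient step, i.e. maximizes error reduction
    return min(etas, key=lambda e: loss([pi - e * gi for pi, gi in zip(p, g)], data))

random.seed(0)
data = [(x / 10.0, math.sin(x / 10.0)) for x in range(-10, 11)]
p = [random.uniform(-1, 1) for _ in range(7)]
etas = [0.01, 0.05, 0.1, 0.5, 1.0, 2.0, 5.0]  # candidate learning rates

init = loss(p, data)
for _ in range(200):
    g = num_grad(p, data)
    eta = best_eta(p, g, data, etas)
    p = [pi - eta * gi for pi, gi in zip(p, g)]
# After training, loss(p, data) is lower than init
```

Because the rate is re-selected at every iteration from the candidate grid, each step is at least as good as the best fixed-rate step from that grid, which is the intuition behind "maximum error reduction in all iterations" that the paper refines.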

Original language: English
Title of host publication: Proceedings 2012 International Conference on System Science and Engineering, ICSSE 2012
Pages: 55-59
Number of pages: 5
DOIs
State: Published - 1 Oct 2012
Event: 2012 International Conference on System Science and Engineering, ICSSE 2012 - Dalian, Liaoning, China
Duration: 30 Jun 2012 - 2 Jul 2012

Publication series

Name: Proceedings 2012 International Conference on System Science and Engineering, ICSSE 2012

Conference

Conference: 2012 International Conference on System Science and Engineering, ICSSE 2012
Country: China
City: Dalian, Liaoning
Period: 30/06/12 - 2/07/12

Keywords

  • learning rate
  • neural network
  • new optimal learning rate
  • ratio k
  • two-layer NN

