Finding the near optimal learning rates of Fuzzy Neural Networks (FNNs) via its equivalent fully connected neural networks (FFNNs)

Jing Wang*, C. L. Philip Chen, Chi-Hsu Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

In this paper, a Fuzzy Neural Network (FNN) is transformed into an equivalent fully connected three-layer neural network (FFNN). Based on the FFNN, a backpropagation (BP) training algorithm is derived. To improve the convergence rate, a new method for finding near-optimal learning rates for the FFNN is proposed. Illustrative examples are presented to verify the validity of the proposed theory and algorithms, and simulation results are satisfactory. Finding near-optimal learning rates for an FNN via its equivalent FFNN has emerging value in engineering applications that use FNNs, such as intelligent adaptive control, pattern recognition, and signal processing.
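The abstract's idea of pairing BP training with a per-step near-optimal learning rate can be illustrated with a minimal sketch. This is not the paper's derivation: the network sizes, the toy data, and the candidate-rate grid below are all hypothetical, and the "near-optimal" rate here is simply the candidate that most reduces the loss at each step (a crude line search standing in for the paper's analytical method).

```python
import numpy as np

# Hypothetical toy regression data, not from the paper.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = np.sin(X.sum(axis=1, keepdims=True))

# A fully connected three-layer network: input -> hidden (tanh) -> output.
W1 = rng.normal(scale=0.5, size=(3, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

def forward(W1, W2, X):
    h = np.tanh(X @ W1)
    return h, h @ W2

def loss(W1, W2):
    _, out = forward(W1, W2, X)
    return float(np.mean((out - y) ** 2))

def gradients(W1, W2):
    # Standard BP: propagate the output error back through both layers.
    h, out = forward(W1, W2, X)
    err = 2.0 * (out - y) / len(X)        # dL/d(out) for mean squared error
    gW2 = h.T @ err                       # gradient at the output layer
    gh = err @ W2.T * (1.0 - h ** 2)      # backprop through tanh
    gW1 = X.T @ gh                        # gradient at the hidden layer
    return gW1, gW2

# Hypothetical grid of candidate learning rates; each step keeps the one
# that yields the lowest loss, mimicking a near-optimal rate choice.
candidates = [0.01, 0.05, 0.1, 0.5, 1.0]
history = [loss(W1, W2)]
for _ in range(200):
    gW1, gW2 = gradients(W1, W2)
    best = min(candidates,
               key=lambda lr: loss(W1 - lr * gW1, W2 - lr * gW2))
    W1, W2 = W1 - best * gW1, W2 - best * gW2
    history.append(loss(W1, W2))

print(history[0], history[-1])
```

Compared with a single fixed learning rate, re-selecting the step size from the candidate set each iteration typically drives the loss down faster, which is the convergence-rate benefit the abstract attributes to near-optimal rates.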

Original language: English
Title of host publication: Proceedings 2012 International Conference on System Science and Engineering, ICSSE 2012
Pages: 137-142
Number of pages: 6
DOIs
State: Published - 1 Oct 2012
Event: 2012 International Conference on System Science and Engineering, ICSSE 2012 - Dalian, Liaoning, China
Duration: 30 Jun 2012 - 2 Jul 2012

Publication series

Name: Proceedings 2012 International Conference on System Science and Engineering, ICSSE 2012

Conference

Conference: 2012 International Conference on System Science and Engineering, ICSSE 2012
Country: China
City: Dalian, Liaoning
Period: 30/06/12 - 2/07/12

Keywords

  • Back Propagations
  • Fuzzy Logic
  • Fuzzy Neural Networks
  • Gradient Descent
  • Neural Networks
  • Optimal training

