Self-adjusting feature maps network and its applications

Dong Lin Li, Mukesh Prasad, Chin-Teng Lin, Jyh-Yeong Chang

Research output: Contribution to journal › Article › peer-review



This paper proposes a novel artificial neural network, called the self-adjusting feature map (SAM), and develops its unsupervised learning ability through a self-adjusting mechanism. The trained network structure of representative connected neurons not only displays the spatial relation of the input data distribution but also quantizes the data well. The SAM can automatically isolate sets of connected neurons, and the number of such sets may indicate the number of clusters. The self-adjusting mechanism combines mathematical statistics with the neurological principle of retaining useful neurons and discarding wasted ones. In the training process, each representative neuron passes through three phases: growth, adaptation, and decline. In the growth phase, the network of representative neurons first creates the necessary neurons according to the local density of the input data. In the adaptation phase, it continually adjusts the connected/disconnected topology of neighboring neuron pairs according to the statistics of the input feature data. Finally, in the decline phase, the unnecessary neurons of the network are merged or removed. In this paper, we exploit the SAM to handle peculiar cases that cannot be handled easily by classical unsupervised learning networks such as the self-organizing map (SOM). The remarkable characteristics of the SAM are demonstrated on various real-world cases in the experimental results. (C) 2016 Elsevier B.V. All rights reserved.
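The three training phases described in the abstract can be sketched as a toy loop. This is a hypothetical illustration, not the paper's algorithm: the function name `train_sam`, the thresholds, and the update rules are all assumptions, and the scheme is closer to a growing-neural-gas variant than to the actual SAM equations.

```python
import math
import random

def train_sam(data, max_neurons=20, epochs=30,
              grow_thresh=0.6, lr=0.2, prune_age=40, seed=0):
    """Hypothetical three-phase loop: growth, adaptation, decline.

    Names, thresholds, and update rules here are assumptions; this is a
    growing-neural-gas-style sketch, not the paper's actual SAM method.
    """
    rng = random.Random(seed)
    neurons = [list(rng.choice(data)) for _ in range(2)]  # minimal start
    edges = {}  # frozenset({i, j}) -> age since the edge was last refreshed

    for _ in range(epochs):
        for x in data:
            order = sorted(range(len(neurons)),
                           key=lambda i: math.dist(neurons[i], x))
            w1 = order[0]
            # Growth phase: spawn a neuron where local density demands one.
            if (math.dist(neurons[w1], x) > grow_thresh
                    and len(neurons) < max_neurons):
                neurons.append(list(x))
                continue
            # Adaptation phase: pull the winner toward the sample and
            # refresh the edge between the two nearest neurons.
            for d in range(len(x)):
                neurons[w1][d] += lr * (x[d] - neurons[w1][d])
            if len(order) > 1:
                edges[frozenset((w1, order[1]))] = 0
            for e in edges:              # age every edge at the winner
                if w1 in e:
                    edges[e] += 1
        # Decline phase: drop edges whose endpoints stopped winning
        # together, i.e. connections no longer supported statistically.
        edges = {e: a for e, a in edges.items() if a <= prune_age}

    # Decline phase, final step: remove isolated neurons and reindex.
    used = sorted({i for e in edges for i in e})
    remap = {old: new for new, old in enumerate(used)}
    neurons = [neurons[i] for i in used]
    edges = {frozenset(remap[i] for i in e): a for e, a in edges.items()}
    return neurons, edges
```

After training, the connected components of the surviving edge set play the role the abstract describes: each isolated set of connected neurons suggests one cluster in the data.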
Original language: English
Pages (from-to): 78-94
Number of pages: 17
State: Published - 26 Sep 2016


Keywords: Unsupervised learning; Self-adjusting; Statistics; Quantization; Self-organizing map; Artificial neural networks

