
Methods of training and constructing multilayer perceptrons with arbitrary pattern sets



International Journal of Neural Systems 6(3): 233-247, 1995



This paper presents two compensation methods for multilayer perceptrons (MLPs) that are very difficult to train with traditional back-propagation (BP) methods. For an MLP trapped in a local minimum, the compensation methods correct the wrong outputs one by one using constructive techniques until all outputs are right, so the MLP can escape the local minimum and reach the global minimum. One hidden neuron is added as compensation for a binary-input three-layer perceptron trapped in a local minimum, and one or two hidden neurons are added as compensation for a real-input three-layer perceptron. For a perceptron with more than three layers, the second-to-last hidden layer is temporarily treated as the input layer during compensation, so the same methods apply. Examples are given.
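The abstract describes the binary-input three-layer case only in outline, so the following NumPy sketch is merely an illustration of that constructive idea under assumptions of my own (hard-threshold units, a single 0/1 output, and the helper names forward and compensate, none of which come from the paper): for each pattern whose output is still wrong, it appends one hidden neuron that fires only on that binary pattern, plus an output weight just large enough to flip that one output. The paper's actual compensation rule may differ.

import numpy as np

def step(z):
    # Hard-threshold activation: 1 where z >= 0, else 0.
    return (z >= 0).astype(float)

def forward(x, W1, b1, W2, b2):
    # Three-layer threshold perceptron; returns the scalar 0/1 output.
    h = step(W1 @ x + b1)
    return step(W2 @ h + b2)[0]

def compensate(patterns, targets, W1, b1, W2, b2):
    # Correct wrong outputs one by one by adding hidden neurons
    # (an illustration of the constructive idea, not the paper's exact rule).
    # W1, b1, W2, b2 are assumed to come from a stalled back-propagation run.
    W1, b1, W2, b2 = (np.asarray(a, dtype=float) for a in (W1, b1, W2, b2))
    for x, t in zip(patterns, targets):
        x = np.asarray(x, dtype=float)
        net_out = (W2 @ step(W1 @ x + b1) + b2)[0]   # pre-threshold output
        if (net_out >= 0) == (t == 1):
            continue                                  # this output is already right
        # New hidden neuron that fires only on this binary pattern:
        # +1 weights where x is 1, -1 where x is 0, and a bias giving a net
        # input of +0.5 for x and at most -0.5 for every other binary pattern.
        w_new = np.where(x == 1, 1.0, -1.0)
        b_new = 0.5 - float(np.sum(x == 1))
        W1 = np.vstack([W1, w_new])
        b1 = np.append(b1, b_new)
        # Output weight just large enough to flip this pattern's output;
        # no other pattern activates the new neuron, so they are untouched.
        w_out = (1.0 if t == 1 else -1.0) * (abs(net_out) + 1.0)
        W2 = np.hstack([W2, [[w_out]]])
    return W1, b1, W2, b2

Because every added neuron stays silent on all other binary patterns, outputs that are already correct remain correct, which is what allows the wrong outputs to be fixed one at a time.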




PMID: 8589861

DOI: 10.1142/S0129065795000172


Related references

Specification of training sets and the number of hidden neurons for multilayer perceptrons. Neural Computation 13(12): 2673-2680, 2001

Fast training of multilayer perceptrons. IEEE Transactions on Neural Networks 8(6): 1314-1320, 1997

Efficient block training of multilayer perceptrons. Neural Computation 12(6): 1429-1447, 2000

Fast parallel off-line training of multilayer perceptrons. IEEE Transactions on Neural Networks 8(3): 646-653, 1997

Deterministic nonmonotone strategies for effective training of multilayer perceptrons. IEEE Transactions on Neural Networks 13(6): 1268-1284, 2002

Dynamic tunneling technique for efficient training of multilayer perceptrons. IEEE Transactions on Neural Networks 10(1): 48-55, 1999

Efficient training of multilayer perceptrons using principal component analysis. Physical Review E, Statistical, Nonlinear, and Soft Matter Physics 72(2 Pt 2): 026117, 2005

A new error function at hidden layers for fast training of multilayer perceptrons. IEEE Transactions on Neural Networks 10(4): 960-964, 1999

An equalized error backpropagation algorithm for the on-line training of multilayer perceptrons. IEEE Transactions on Neural Networks 13(3): 532-541, 2002

Determining and improving the fault tolerance of multilayer perceptrons in a pattern-recognition application. IEEE Transactions on Neural Networks 4(5): 788-793, 1993

Complexity issues in natural gradient descent method for training multilayer perceptrons. Neural Computation 10(8): 2137-2157, 1998

Three methods to speed up the training of feedforward and feedback perceptrons. Neural Networks 10(8): 1435-1443, 1997

Representation and extrapolation in multilayer perceptrons. Neural Computation 14(7): 1739-1754, 2002

Infinite-dimensional multilayer perceptrons. IEEE Transactions on Neural Networks 7(4): 889-896, 1996

Dynamic sizing of multilayer perceptrons. Biological Cybernetics 71(1): 49-63, 1994