Similar Documents
Chaos in a back propagation neural network while learning the XOR-function (1996).
| Content Provider | CiteSeerX |
|---|---|
| Author | P. T. Stathis, S. Vassiliadis, Ir. Koen Bertels |
| Abstract | The back propagation training algorithm for feed-forward multilayered neural networks is known for its difficult parameterization. Furthermore, when the learning rate is increased, chaotic behaviour is observed at the output while learning. This is related to the chaos that appears in the Verhulst equation, a well-studied chaotic equation that appears as part of the back propagation algorithm. We further examine the influence of chaos on the training performance of back propagation on a feedforward neural network. As shown in our investigation, chaos appears not to inhibit learning, but it may slow it down. It also appears that the Verhulst-like chaos occurred only at the output neuron and may relate primarily to its threshold. To understand such behaviour, we consider a one-neuron model and investigate the behaviour of the output neuron in relation to the parameters that determine its chaotic behaviour. Also, during experiments with large learning rates in non-epoch... |
| Publisher Date | 1996-01-01 |
| Access Restriction | Open |
| Subject Keyword | Back Propagation Neural Network; Chaotic Behaviour; Feed-forward Multilayered Neural Network; Verhulst Equation; One-neuron Model; Output Neuron; Learning Rate; Back Propagation Training Algorithm; Feedforward Neural Network; Verhulst-like Chaos; Chaotic Equation; Training Performance; Back Propagation Algorithm; Difficult Parameterization; Back Propagation |
| Content Type | Text |
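
The abstract attributes the chaotic behaviour seen at high learning rates to the Verhulst (logistic) equation, x_{n+1} = r·x_n·(1 − x_n), which the authors identify inside the back propagation update. As context only, a minimal sketch of the Verhulst map itself is given below; the parameter values and the convergent/chaotic comparison are illustrative assumptions and are not taken from the paper or its experiments.

```python
# Illustrative sketch of the Verhulst (logistic) map referenced in the abstract:
#   x_{n+1} = r * x_n * (1 - x_n)
# For moderate r the iterates settle to a fixed point; for r near 4 they become
# chaotic. The r values below are arbitrary examples, not the learning-rate
# settings investigated in the paper.

def verhulst_orbit(r, x0=0.2, steps=50):
    """Iterate the Verhulst map from x0 and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

if __name__ == "__main__":
    for r in (2.8, 3.9):  # 2.8: converges to a fixed point; 3.9: chaotic regime
        tail = verhulst_orbit(r)[-5:]
        print(f"r = {r}: last iterates {[round(x, 4) for x in tail]}")
```

Running the sketch shows the qualitative contrast the abstract relies on: the r = 2.8 orbit settles to a single value, while the r = 3.9 orbit keeps wandering without converging.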