Fast training algorithms for multi-layer neural nets
| Content Provider | CiteSeerX |
|---|---|
| Author | Brent, Richard P. |
| Abstract | Training a multilayer neural net by back-propagation is slow and requires arbitrary choices regarding the number of hidden units and layers. This paper describes an algorithm which is much faster than back-propagation and for which it is not necessary to specify the number of hidden units in advance. The relationship with other fast pattern recognition algorithms, such as algorithms based on k-d trees, is mentioned. The algorithm has been implemented and tested on artificial problems such as the parity problem and on real problems arising in speech recognition. Experimental results, including training times and recognition accuracy, are given. Generally, the algorithm achieves accuracy as good as or better than nets trained using back-propagation, and the training process is much faster. Accuracy is comparable to that of the "nearest neighbour" algorithm, which is slower and requires more storage space. |
| Access Restriction | Open |
| Subject Keyword | Speech Recognition, Hidden Unit, Real Problem, Parity Problem, Nearest Neighbour Algorithm, Training Time, Arbitrary Choice, Fast Training Algorithm, Storage Space, Multi-layer Neural Net, Training Process, Artificial Problem, Experimental Result, Multilayer Neural Net, K-d Tree, Fast Pattern Recognition Algorithm, Recognition Accuracy |
| Content Type | Text |
| Resource Type | Article |
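The abstract compares the proposed algorithm's accuracy against the nearest-neighbour baseline, noting that nearest neighbour is slower at query time and needs more storage. The sketch below (not from the paper; function and variable names are illustrative) shows why: a 1-nearest-neighbour classifier must retain every training example and scan them all per query, which k-d-tree-based methods like those the abstract mentions are designed to avoid.

```python
import math

def nearest_neighbour_classify(train, query):
    """Classify `query` by the label of its closest training point (1-NN).

    `train` is a list of (vector, label) pairs; all of them must be
    stored, and every one is examined per query (Euclidean distance).
    """
    best_label, best_dist = None, float("inf")
    for vec, label in train:
        dist = math.dist(vec, query)
        if dist < best_dist:
            best_dist, best_label = dist, label
    return best_label

# Example: XOR-style points from the parity problem family mentioned
# in the abstract; labels are the parity of the input bits.
train = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(nearest_neighbour_classify(train, (0.9, 0.1)))  # closest to (1, 0) -> 1
```

A trained neural net, by contrast, discards the training set after fitting and answers queries with a fixed number of weight multiplications, which is the storage/speed trade-off the abstract alludes to.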