Robust Neural Network Training Using Partial Gradient Probing
| Content Provider | Semantic Scholar |
|---|---|
| Author | Miloš Wilamowski |
| Copyright Year | 2004 |
| Abstract | The proposed algorithm features fast and robust convergence for one-hidden-layer neural networks. The search for weights is done only in the input layer, i.e., on a compressed network. Only forward propagation is performed, with the second layer trained automatically by pseudo-inversion for all patterns at once. Last-layer training is also modified to handle non-linear problems, not presented here. Through the iterations, the gradient is randomly probed along each dimension of the weight set. The algorithm further features a series of modifications, such as adaptive network parameters, that resolve problems like total-error fluctuations and slow convergence. For testing the algorithm, one of the most popular benchmarks, the parity problem, was chosen. The final version of the proposed algorithm typically provides a solution for the various tested parity problems in fewer than ten iterations, regardless of the initial weight set. Performance of the algorithm on parity problems is tested and illustrated with figures. (An illustrative sketch of the described training scheme follows this table.) |
| File Format | PDF / HTM / HTML |
| Alternate Webpage(s) | http://www.eng.auburn.edu/~wilambm/pap/2003/Robust%20NN%20traninf%20using%20partial.pdf |
| Alternate Webpage(s) | http://www.eng.auburn.edu/users/wilambm/pap/2003/Robust%20NN%20traninf%20using%20partial.pdf |
| Language | English |
| Access Restriction | Open |
| Subject Keyword | Algorithm, Artificial neural network, Benchmark (computing), Convergence, Gradient, Iteration, Neural network simulation, Nonlinear system, Pseudo-inverse, Randomness, Propagation, Weight |
| Content Type | Text |
| Resource Type | Article |
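
The abstract describes a one-hidden-layer network in which only the input-layer weights are searched, the output layer is solved by pseudo-inversion for all patterns at once, and the gradient is probed randomly along each weight dimension. The sketch below is one possible reading of that scheme, not the authors' exact algorithm: it assumes tanh hidden units, a linear output layer fitted with a NumPy pseudo-inverse, a total squared-error criterion, and a simple accept-if-better rule for each randomly perturbed weight dimension, demonstrated on a small parity problem. All function names and parameters here are illustrative assumptions.

```python
# Hedged sketch of the training scheme suggested by the abstract (assumptions noted above);
# it is NOT the paper's exact algorithm, only an illustration of the main ideas.
import numpy as np


def parity_data(n_bits=3):
    """All 2^n binary input patterns with their parity targets in {-1, +1}."""
    X = np.array([[(i >> b) & 1 for b in range(n_bits)] for i in range(2 ** n_bits)], float)
    y = np.where(X.sum(axis=1) % 2 == 1, 1.0, -1.0).reshape(-1, 1)
    return X, y


def hidden_out(X, W1):
    """Forward pass through the hidden layer (bias column appended to inputs)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.tanh(Xb @ W1)


def solve_output(H, y):
    """Train the output layer for all patterns at once via the pseudo-inverse."""
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])
    return np.linalg.pinv(Hb) @ y, Hb


def total_error(W1, X, y):
    """Total squared error after re-solving the output layer for the given hidden weights."""
    H = hidden_out(X, W1)
    W2, Hb = solve_output(H, y)
    return np.sum((Hb @ W2 - y) ** 2)


def train(n_bits=3, n_hidden=4, iters=50, step=1.0, seed=0):
    """Probe each hidden-weight dimension with a random perturbation; keep it only if the error drops."""
    rng = np.random.default_rng(seed)
    X, y = parity_data(n_bits)
    W1 = rng.normal(size=(n_bits + 1, n_hidden))  # input-layer (hidden) weights, the only ones searched
    err = total_error(W1, X, y)
    for _ in range(iters):
        for idx in np.ndindex(W1.shape):
            trial = W1.copy()
            trial[idx] += step * rng.normal()     # random probe along one weight dimension
            trial_err = total_error(trial, X, y)
            if trial_err < err:
                W1, err = trial, trial_err
        if err < 1e-6:
            break
    return W1, err


if __name__ == "__main__":
    W1, err = train()
    print(f"final total squared error: {err:.3e}")
```

Under these assumptions the output layer never needs backpropagation: for any fixed hidden weights it is recomputed in closed form for all patterns at once, so only the compressed input-layer weight set is searched, mirroring the division of labour the abstract describes.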