The Minimum Number of Hidden Neurons Does Not Necessarily Provide the Best Generalization
| Content Provider | Semantic Scholar |
|---|---|
| Author | Kinser, Jason M. |
| Copyright Year | 2001 |
| Abstract | The quality of a feedforward neural network that allows it to associate data not used in training is called generalization. A common method of creating the desired network is for the user to select the network architecture (largely by choosing the number of hidden neurons) and to allow a training algorithm to evolve the synaptic weights between the neurons. A popular belief is that the network with the fewest hidden neurons that correctly learns a sufficient training set is the network with better generalization. This paper contradicts that belief. Optimizing generalization requires that the network not assume information that does not exist in the training data. Unfortunately, a network with the minimum number of hidden neurons may have to assume information that does not exist. The network then skews the surface that maps the input space to the output space in order to accommodate the minimal architecture, which in turn sacrifices generalization. |
| File Format | PDF; HTM / HTML |
| Alternate Webpage(s) | http://binf.gmu.edu/kinser/pubs/MinNum.pdf |
| Language | English |
| Access Restriction | Open |
| Subject Keyword | Algorithm; Artificial neural network; Biological neural networks; Feedforward neural network; Generalization (psychology); Map; Mathematical optimization; Network architecture; Neuron; Synaptic weight; Test set |
| Content Type | Text |
| Resource Type | Article |
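
The abstract's claim can be seen in a toy experiment: a very narrow hidden layer can fit a small training set while generalizing worse than a somewhat wider one. The sketch below is not from the paper; the toy regression target, network widths, and training hyperparameters are illustrative assumptions, using a plain one-hidden-layer tanh network trained by gradient descent.

```python
# Minimal sketch (assumptions, not the paper's experiment): compare held-out error
# of one-hidden-layer feedforward networks of different widths on a noisy toy task.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Toy 1-D regression problem: noisy sine wave (illustrative assumption).
    x = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = np.sin(3.0 * x) + 0.1 * rng.normal(size=(n, 1))
    return x, y

def train_mlp(x, y, hidden, epochs=2000, lr=0.05):
    """Train a 1-hidden-layer tanh network with plain gradient descent."""
    w1 = rng.normal(scale=0.5, size=(1, hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ w1 + b1)      # hidden activations
        pred = h @ w2 + b2            # network output
        err = pred - y
        # Backpropagate the mean-squared-error gradient.
        g2 = h.T @ err / len(x)
        gb2 = err.mean(axis=0)
        dh = (err @ w2.T) * (1.0 - h ** 2)
        g1 = x.T @ dh / len(x)
        gb1 = dh.mean(axis=0)
        w2 -= lr * g2; b2 -= lr * gb2
        w1 -= lr * g1; b1 -= lr * gb1
    return w1, b1, w2, b2

def mse(params, x, y):
    w1, b1, w2, b2 = params
    pred = np.tanh(x @ w1 + b1) @ w2 + b2
    return float(np.mean((pred - y) ** 2))

x_train, y_train = make_data(40)
x_test, y_test = make_data(400)

# Compare training and held-out error as the hidden layer grows.
for hidden in (2, 4, 8, 16):
    params = train_mlp(x_train, y_train, hidden)
    print(f"hidden={hidden:2d}  train MSE={mse(params, x_train, y_train):.4f}  "
          f"test MSE={mse(params, x_test, y_test):.4f}")
```

On runs like this, the smallest network that still drives training error down may show a higher test error than a moderately larger one, which is the generalization trade-off the abstract describes; exact numbers depend on the random seed and hyperparameters.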