Neural network regression with input uncertainty.
| Content Provider | CiteSeerX |
|---|---|
| Author | Wright, W. A. |
| Abstract | It is generally assumed when using Bayesian inference methods for neural networks that the input data contain no noise or corruption. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which allows for input noise, given that some model of the noise process exists. In the limit where this noise is small and symmetric it is shown, using the Laplace approximation, that there is an additional term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable and sampling it jointly with the network's weights using Markov chain Monte Carlo methods, it is demonstrated that it is possible to infer the unbiased regression over the noiseless input. |
| Access Restriction | Open |
| Subject Keyword | Input Uncertainty, Neural Network Regression, Additional Term, Hidden Variable, Noise Process, Markov Chain Monte-Carlo Method, Input Data, Noise Process Exists, Usual Bayesian Error Bar, Unbiased Regression, Network Weight, Noiseless Input, Unsafe Assumption, Bayesian Inference Method, Laplace Approximation, Neural Network, Bayesian Neural Network Framework |
| Content Type | Text |
| Resource Type | Article |
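
The abstract's second approach treats the true (noiseless) inputs as latent variables and samples them jointly with the network weights by MCMC. The sketch below is not the paper's code; it is a minimal illustration of that idea under assumed choices: a one-hidden-layer tanh network, Gaussian input-noise and weight priors, and plain random-walk Metropolis over the concatenated state of weights and latent inputs. All sizes, priors, and step sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic errors-in-variables data: y depends on the true input x_true,
# but we only observe x_obs = x_true + Gaussian input noise.
n, sigma_x, sigma_y = 40, 0.3, 0.1
x_true = rng.uniform(-3, 3, n)
x_obs = x_true + sigma_x * rng.normal(size=n)
y = np.sin(x_true) + sigma_y * rng.normal(size=n)

H = 5  # hidden units (illustrative choice)

def predict(w, x):
    """One-hidden-layer tanh network; w packs (W1, b1, W2, b2)."""
    W1, b1 = w[:H], w[H:2 * H]
    W2, b2 = w[2 * H:3 * H], w[3 * H]
    return np.tanh(np.outer(x, W1) + b1) @ W2 + b2

def log_post(w, x_lat):
    """Joint log posterior over weights and latent (noiseless) inputs."""
    lik = -0.5 * np.sum((y - predict(w, x_lat)) ** 2) / sigma_y ** 2
    inp = -0.5 * np.sum((x_obs - x_lat) ** 2) / sigma_x ** 2  # input-noise model
    pri = -0.5 * np.sum(w ** 2)                               # Gaussian weight prior
    return lik + inp + pri

# Random-walk Metropolis over the concatenated state (weights, latent inputs).
w = 0.1 * rng.normal(size=3 * H + 1)
x_lat = x_obs.copy()
lp = log_post(w, x_lat)
samples = []
for it in range(20000):
    w_prop = w + 0.02 * rng.normal(size=w.shape)
    x_prop = x_lat + 0.02 * rng.normal(size=x_lat.shape)
    lp_prop = log_post(w_prop, x_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        w, x_lat, lp = w_prop, x_prop, lp_prop
    if it > 10000 and it % 20 == 0:  # keep thinned post-burn-in samples
        samples.append(w.copy())

# Posterior-mean regression evaluated on a grid of noiseless inputs.
grid = np.linspace(-3, 3, 50)
mean_pred = np.mean([predict(ws, grid) for ws in samples], axis=0)
```

Because the latent inputs are sampled along with the weights, the posterior over the regression is taken with respect to the noiseless inputs rather than the corrupted observations, which is the mechanism the abstract credits for removing the bias; a practical implementation would use a more efficient sampler than the single-block random walk shown here.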