Lecture 19: Leave-one-out approximations
| Content Provider | Semantic Scholar |
|---|---|
| Author | Mukherjee, Sayan |
| Copyright Year | 2002 |
| Abstract | We introduce the idea of cross-validation, with leave-one-out as its extreme form. We show that the leave-one-out estimate is almost unbiased. We then present a series of approximations and bounds on the leave-one-out error that are used for computational efficiency, first for the least-squares loss and then for the SVM loss function. We close by reporting that, in a worst-case analysis, the leave-one-out error is not a significantly better estimate of the expected error than the training error. |
| File Format | PDF, HTML |
| Alternate Webpage(s) | https://ocw.mit.edu/courses/brain-and-cognitive-sciences/9-520-statistical-learning-theory-and-applications-spring-2003/lecture-notes/lecture19.pdf |
| Language | English |
| Access Restriction | Open |
| Content Type | Text |
| Resource Type | Article |
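The abstract mentions computationally efficient approximations of the leave-one-out error for least-squares loss. A classical instance is the closed-form leave-one-out shortcut for ordinary least squares: the held-out residual at point i equals the training residual divided by one minus the leverage, so no model refits are needed. The sketch below (not taken from the lecture notes; data and function names are illustrative) checks this shortcut against brute-force refitting for simple linear regression:

```python
# Leave-one-out shortcut for simple linear regression (illustrative sketch).
# For OLS, the LOO residual at point i is e_i / (1 - h_ii), where e_i is the
# training residual and h_ii the leverage, so one fit suffices.

def loo_errors(xs, ys):
    """Exact LOO residuals from a single least-squares fit."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    intercept = ybar - slope * xbar
    out = []
    for x, y in zip(xs, ys):
        resid = y - (intercept + slope * x)   # training residual e_i
        h = 1.0 / n + (x - xbar) ** 2 / sxx   # leverage h_ii
        out.append(resid / (1.0 - h))         # LOO residual shortcut
    return out

def loo_errors_naive(xs, ys):
    """Brute force: refit with point i held out, then predict it."""
    out = []
    for i in range(len(xs)):
        xs_i = xs[:i] + xs[i + 1:]
        ys_i = ys[:i] + ys[i + 1:]
        n = len(xs_i)
        xbar = sum(xs_i) / n
        ybar = sum(ys_i) / n
        sxx = sum((x - xbar) ** 2 for x in xs_i)
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs_i, ys_i)) / sxx
        intercept = ybar - slope * xbar
        out.append(ys[i] - (intercept + slope * xs[i]))
    return out

# Made-up data for illustration.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 1.2, 1.9, 3.2, 3.8]
fast = loo_errors(xs, ys)
slow = loo_errors_naive(xs, ys)
assert all(abs(a - b) < 1e-9 for a, b in zip(fast, slow))
```

The agreement between `fast` and `slow` reflects that the shortcut is exact for least squares; for SVMs, as the abstract notes, one must instead rely on approximations and bounds on the leave-one-out error.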