On The Value of Leave-One-Out Cross-Validation Bounds
| Field | Value |
|---|---|
| Content Provider | Semantic Scholar |
| Author | Rennie, Jason D. M. |
| Copyright Year | 2003 |
| Abstract | A long-standing problem in classification is the determination of the regularization parameter. Nearly every classification algorithm uses a parameter (or set of parameters) to control classifier complexity. Cross-validation on the training set is usually done to determine the regularization parameter(s). [1] proved a leave-one-out cross-validation (LOOCV) bound for a class of kernel classifiers. [2] extended the bound to Regularized Least Squares Classification (RLSC). We provide the (trivial) extension to multiclass. Our contribution is empirical work. We evaluate the bound's usefulness as a selector for the regularization parameter for RLSC. We find that it works extremely poorly on the data set we experimented with (20 Newsgroups); the LOOCV bound consistently selects a regularization parameter that is too large. |
| File Format | PDF, HTM/HTML |
| Alternate Webpage(s) | http://qwone.com/~jason/writing/loocv.pdf |
| Alternate Webpage(s) | http://people.csail.mit.edu/jrennie/writing/loocv.pdf |
| Alternate Webpage(s) | http://www.ai.mit.edu/~jrennie/writing/loocv.pdf |
| Alternate Webpage(s) | http://www.csail.mit.edu/~jrennie/writing/loocv.pdf |
| Alternate Webpage(s) | http://people.csail.mit.edu/~jrennie/writing/loocv.pdf |
| Language | English |
| Access Restriction | Open |
| Content Type | Text |
| Resource Type | Article |
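
For context on the selection problem the abstract describes: for regularized least squares, the leave-one-out predictions can be computed exactly from a single fit via the hat-matrix identity, so exact LOOCV (rather than a bound on it) is cheap in this setting. Below is a minimal sketch of choosing the regularization parameter by minimizing the exact LOOCV misclassification rate, assuming ±1 labels and the standard ridge formulation; it illustrates the setting, not the paper's bound or its experiments, and the toy data and `loocv_error_rlsc` helper are purely illustrative.

```python
import numpy as np

def loocv_error_rlsc(X, y, lam):
    """Exact LOOCV misclassification rate for regularized least
    squares with +/-1 labels, via the hat-matrix identity."""
    n, d = X.shape
    # Ridge normal equations: (X^T X + lam * I) w = X^T y.
    # (Some RLSC formulations scale lam by n; that is a convention choice.)
    A = X.T @ X + lam * np.eye(d)
    H = X @ np.linalg.solve(A, X.T)   # hat matrix, n x n
    y_hat = H @ y                     # in-sample predictions
    h = np.diag(H)
    # Leave-one-out prediction for point i, without refitting:
    #   y_loo_i = (y_hat_i - h_i * y_i) / (1 - h_i)
    y_loo = (y_hat - h * y) / (1.0 - h)
    return np.mean(np.sign(y_loo) != y)

# Select lambda on a toy problem by minimizing exact LOOCV error.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = np.sign(X[:, 0] + 0.5 * rng.normal(size=200))
lams = np.logspace(-3, 3, 13)
errs = [loocv_error_rlsc(X, y, lam) for lam in lams]
best = lams[int(np.argmin(errs))]
print(f"best lambda by exact LOOCV: {best:g}")
```

The identity used above avoids refitting the model n times, which is what makes exact LOOCV practical for RLSC-style models and gives a concrete baseline against which a LOOCV *bound* would be judged as a selector.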