Consistency of support vector machines and other regularized kernel classifiers
| Content Provider | CiteSeerX |
|---|---|
| Author | Steinwart, Ingo |
| Abstract | It is shown that various classifiers based on minimization of a regularized risk are universally consistent, i.e., they can asymptotically learn in every classification task. The role of the loss functions used in these algorithms is considered in detail. As an application of our general framework, several types of support vector machines (SVMs) as well as regularization networks are treated. Our methods combine techniques from stochastics, approximation theory, and functional analysis. Index Terms: computational learning theory, kernel methods, pattern recognition, regularization, support vector machines (SVMs), universal consistency. |
| File Format | |
| Journal | IEEE Transactions on Information Theory |
| Language | English |
| Access Restriction | Open |
| Subject Keyword | Support Vector Machine, Regularized Kernel Classifier, Regularization Network, Pattern Recognition, Loss Function, Approximation Theory, Computational Learning Theory, Universal Consistency, Functional Analysis, Classification Task |
| Content Type | Text |
| Resource Type | Article |
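The abstract's central object, minimization of a regularized risk, can be illustrated with a minimal sketch. This is not the paper's construction (which concerns kernel classifiers such as SVMs with universal kernels); it is a hypothetical linear example that minimizes the regularized empirical hinge risk by subgradient descent, with made-up toy data:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.1, lr=0.1, epochs=200):
    """Minimize the regularized empirical hinge risk
        (1/n) * sum_i max(0, 1 - y_i <w, x_i>) + lam * ||w||^2
    by subgradient descent, with labels y in {-1, +1}.
    A simplified linear stand-in for the kernel classifiers
    studied in the paper."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        # Hinge loss contributes a subgradient -y_i x_i only where margin < 1
        active = margins < 1
        grad = -(y[active, None] * X[active]).sum(axis=0) / n + 2 * lam * w
        w -= lr * grad
    return w

# Hypothetical, well-separated toy data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=2.0, size=(50, 2)),
               rng.normal(loc=-2.0, size=(50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])

w = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)  # typically near 1.0 on this easy data
```

The regularization term `lam * ||w||^2` is what the title's "regularized" refers to; the paper's consistency results analyze how such penalties, combined with suitable loss functions and kernels, let the learned classifier approach the Bayes risk as the sample size grows.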