Online gradient descent learning algorithms
| Content Provider | CiteSeerX |
|---|---|
| Author | Ying, Yiming; Pontil, Massimiliano |
| Abstract | This paper considers the least-square online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) without explicit regularization. We present a novel capacity-independent approach to derive error bounds and convergence results for this algorithm. We show that, although the algorithm does not involve an explicit RKHS regularization term, choosing the step sizes appropriately can yield error rates competitive with those of both offline and online regularization algorithms in the literature. |
| Access Restriction | Open |
| Subject Keyword | Online Gradient Descent, Novel Capacity Independent Approach, Convergence Result, Explicit Regularization, Online Regularization Algorithm, Competitive Error Rate, Least-square Online Gradient Descent Algorithm, Reproducing Kernel Hilbert Space, Explicit RKHS Regularization Term, Error Bound |
| Content Type | Text |
| Resource Type | Article |
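
The abstract describes unregularized least-square online gradient descent in an RKHS, where the hypothesis after step t is a kernel expansion over the examples seen so far and the step-size schedule takes the place of an explicit regularizer. Below is a minimal sketch of that update rule, assuming a Gaussian kernel and the illustrative schedule η_t = (t + 1)^(-1/2); the function names, the kernel choice, and the schedule are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian RBF kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)). Illustrative choice."""
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def online_kernel_gd(X, y, step=lambda t: 1.0 / np.sqrt(t + 1), kernel=gaussian_kernel):
    """Least-square online gradient descent in an RKHS, no explicit regularization.

    The hypothesis is f_t = sum_i coef[i] * K(support[i], .).  For each new
    example the update f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K(x_t, .)
    appends one coefficient; the decaying step size eta_t acts as the
    implicit regularizer.  The schedule here is an assumed example.
    """
    support, coef = [], []
    for t, (x_t, y_t) in enumerate(zip(X, y)):
        # Evaluate the current hypothesis f_t at the incoming point x_t.
        f_x = sum(a * kernel(x_i, x_t) for a, x_i in zip(coef, support))
        # Gradient step on the pointwise squared loss (f_t(x_t) - y_t)^2 / 2,
        # whose RKHS gradient is (f_t(x_t) - y_t) * K(x_t, .).
        coef.append(-step(t) * (f_x - y_t))
        support.append(x_t)
    return lambda x: sum(a * kernel(x_i, x) for a, x_i in zip(coef, support))

# Usage: learn y = sin(x) from a stream of noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
f = online_kernel_gd(X, y)
print(f(np.array([1.0])))  # should be near sin(1.0) ~ 0.84
```

Note that each example adds one term to the kernel expansion, so a single pass over n samples costs O(n^2) kernel evaluations; the paper's contribution concerns how the step-size choice controls the error of exactly this kind of unregularized iteration, not the implementation details sketched here.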