More efficiency in multiple kernel learning (2007)
| Content Provider | CiteSeerX |
|---|---|
| Author | Canu, Stéphane; Rakotomamonjy, Alain |
| Description | In ICML |
| Abstract | An efficient and general multiple kernel learning (MKL) algorithm has been recently proposed by Sonnenburg et al. (2006). This approach has opened new perspectives, since it makes the MKL approach tractable for large-scale problems by iteratively using existing support vector machine code. However, it turns out that this iterative algorithm needs several iterations before converging towards a reasonable solution. In this paper, we address the MKL problem through an adaptive 2-norm regularization formulation. Weights on each kernel matrix are included in the standard SVM empirical risk minimization problem with an ℓ1 constraint to encourage sparsity. We propose an algorithm for solving this problem and provide a new insight on MKL algorithms based on block 1-norm regularization by showing that the two approaches are equivalent. Experimental results show that the resulting algorithm converges rapidly and that its efficiency compares favorably to other MKL algorithms. |
| Journal | ICML |
| Publisher Date | 2007-01-01 |
| Access Restriction | Open |
| Subject Keyword | Multiple Kernel Learning; Adaptive 2-norm Regularization Formulation; Support Vector Machine Code; MKL Approach; Tractable; New Insight; Algorithm Convergence; MKL Algorithm; Large-scale Problem; MKL Problem; Iterative Algorithm; Several Iterations; Experimental Results; Block 1-norm Regularization; New Perspective; General Multiple Kernel Learning; Kernel Matrix; Reasonable Solution |
| Content Type | Text |
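The abstract describes the core MKL construction: the effective kernel is a convex combination of base kernel matrices, K = Σₘ dₘ Kₘ, where the weights dₘ are nonnegative and constrained by an ℓ1 norm (Σₘ dₘ = 1), which encourages sparsity over kernels. A minimal sketch of that combination, in plain Python on a toy pair of points; the kernel choices and weight values here are illustrative assumptions, not the authors' implementation:

```python
import math

def linear_kernel(x, z):
    # Base kernel K_1: standard inner product.
    return sum(a * b for a, b in zip(x, z))

def rbf_kernel(x, z, gamma=0.5):
    # Base kernel K_2: Gaussian RBF (gamma chosen arbitrarily for the toy).
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

def combined_kernel(x, z, weights):
    # MKL-style convex combination: K(x, z) = sum_m d_m * K_m(x, z),
    # with d_m >= 0 and sum_m d_m = 1 (the l1 constraint from the abstract).
    kernels = [linear_kernel, rbf_kernel]
    assert all(d >= 0 for d in weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(d * k(x, z) for d, k in zip(weights, kernels))

x, z = [1.0, 0.0], [0.0, 1.0]
d = [0.3, 0.7]           # hypothetical kernel weights on the l1 simplex
value = combined_kernel(x, z, d)
```

In the paper's formulation these weights are not fixed by hand: they enter the SVM empirical risk minimization problem and are optimized jointly with the classifier, the ℓ1 constraint driving many dₘ to exactly zero.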