Hybrid Conditional Gradient-Smoothing Algorithms with Applications to Sparse and Low Rank Regularization
| Field | Value |
|---|---|
| Content Provider | Scilit |
| Copyright Year | 2014 |
| Description | Inspired by such algorithms, in this chapter we study a first-order method for solving certain convex optimization problems. We focus on problems of the form min { f(x) + g(Ax) + ω(x) : x ∈ H } (3.1) over a real Hilbert space H. We assume that f is a convex function with Hölder continuous gradient, g a Lipschitz continuous convex function, A a bounded linear operator, and ω a convex function defined over a bounded domain. We also assume that the computational operations available are the gradient of f, the proximity operator of g, and a subgradient of the convex conjugate ω∗. A particularly common type of problem covered by (3.1) is min { f(x) + g(Ax) : x ∈ C } (3.2), where C is a bounded, closed, convex subset of H. Common among such examples are regularization problems with one or more penalties in the objective (as the term g ◦ A) and one penalty as a constraint described by C. (A minimal sketch of one such iteration follows the table.) |
| Book Name | Regularization, Optimization, Kernels, and Support Vector Machines |
| Related Links | https://content.taylorfrancis.com/books/download?dac=C2013-0-27442-8&isbn=9780429076121&doi=10.1201/b17558-6&format=pdf |
| DOI | 10.1201/b17558-6 |
| Language | English |
| Publisher | Informa UK Limited |
| Publisher Date | 2014-10-23 |
| Access Restriction | Open |
| Subject Keyword | Automotive Engineering; Optimization; Regularization; Convex Function |
| Content Type | Text |
| Resource Type | Chapter |
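
The description lists the oracles the method relies on: the gradient of f, the proximity operator of g, and a subgradient of the convex conjugate ω∗ (which, for the constrained form (3.2), amounts to a linear minimization oracle over C). Below is a minimal sketch of one such hybrid conditional gradient-smoothing iteration for (3.2), not the chapter's reference implementation: the concrete choices f(x) = ½‖Mx − b‖², g(z) = λ‖z‖₁, C an ℓ1-ball, and the step-size and smoothing schedules are illustrative assumptions. The nonsmooth term g is handled through its Moreau envelope, whose gradient is computed from the proximity operator of g.

```python
import numpy as np

# Sketch of a hybrid conditional gradient-smoothing iteration for problem (3.2),
#     min { f(x) + g(A x) : x in C },
# under the ASSUMED concrete choices
#     f(x) = 0.5 * ||M x - b||^2      (smooth, quadratic)
#     g(z) = lam * ||z||_1            (Lipschitz, nonsmooth)
#     C    = { x : ||x||_1 <= r }     (bounded, closed, convex).
# The smoothing and step-size schedules below are illustrative, not the
# chapter's specific rules.

def prox_l1(z, t):
    """Proximity operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lmo_l1_ball(w, r):
    """Linear minimization oracle over the l1-ball of radius r:
    argmin_{||y||_1 <= r} <w, y>, i.e. a subgradient of the support
    function of C (here, omega*) evaluated at -w."""
    i = np.argmax(np.abs(w))
    y = np.zeros_like(w)
    y[i] = -r * np.sign(w[i])
    return y

def hcgs_sketch(M, b, A, lam, r, iters=300):
    x = np.zeros(A.shape[1])                 # start inside C
    for k in range(iters):
        beta = 1.0 / np.sqrt(k + 1)          # smoothing parameter (assumed schedule)
        alpha = 2.0 / (k + 2)                # standard conditional-gradient step size
        # Gradient of the smoothed objective f(x) + g_beta(A x), where g_beta is
        # the Moreau envelope of g:  grad g_beta(z) = (z - prox_{beta g}(z)) / beta.
        Ax = A @ x
        grad = M.T @ (M @ x - b) + A.T @ (Ax - prox_l1(Ax, beta * lam)) / beta
        y = lmo_l1_ball(grad, r)             # conditional-gradient direction in C
        x = x + alpha * (y - x)              # convex combination keeps x in C
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((30, 50))
    A = rng.standard_normal((40, 50))
    b = rng.standard_normal(30)
    x_hat = hcgs_sketch(M, b, A, lam=0.1, r=5.0)
    print("||x_hat||_1 =", np.abs(x_hat).sum())
```

Swapping `lmo_l1_ball` for a top-singular-pair oracle over a trace-norm ball (and treating x as a matrix) would give a low-rank variant in the spirit of the chapter title; only the linear minimization oracle and the proximity operator change.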