Open Access Book

Learning with Submodular Functions: A Convex Optimization Perspective

TLDR
In Learning with Submodular Functions: A Convex Optimization Perspective, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, presenting tight links between certain polyhedra, combinatorial optimization and convex optimization problems.
Abstract
Submodular functions are relevant to machine learning for at least two reasons: (1) some problems may be expressed directly as the optimization of submodular functions, and (2) the Lovász extension of submodular functions provides a useful set of regularization functions for supervised and unsupervised learning. In Learning with Submodular Functions: A Convex Optimization Perspective, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, presenting tight links between certain polyhedra, combinatorial optimization, and convex optimization problems. In particular, it describes how submodular function minimization is equivalent to solving a wide variety of convex optimization problems. This allows the derivation of new efficient algorithms for approximate and exact submodular function minimization with theoretical guarantees and good practical performance. By listing many examples of submodular functions, it reviews various applications to machine learning, such as clustering, experimental design, sensor placement, graphical model structure learning, and subset selection, as well as a family of structured sparsity-inducing norms that can be derived from submodular functions. Learning with Submodular Functions: A Convex Optimization Perspective is an ideal reference for researchers, scientists, or engineers with an interest in applying submodular functions to machine learning problems.
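For orientation, the two central objects can be stated compactly; the notation below is a standard textbook formulation used here for illustration and is not quoted from the monograph. A set function F on a ground set V = {1, ..., n} with F(\varnothing) = 0 is submodular if

    F(A) + F(B) \ge F(A \cup B) + F(A \cap B) \quad \text{for all } A, B \subseteq V,

and its Lovász extension f evaluates, at a vector w whose components are ordered as w_{j_1} \ge \dots \ge w_{j_n},

    f(w) = \sum_{k=1}^{n} w_{j_k} \bigl[ F(\{j_1, \dots, j_k\}) - F(\{j_1, \dots, j_{k-1}\}) \bigr].

The function f is convex if and only if F is submodular, and \min_{A \subseteq V} F(A) = \min_{w \in [0,1]^n} f(w), which is the equivalence between combinatorial and convex problems referred to above.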



Citations
Proceedings Article

Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization

TL;DR: A new general framework for convex optimization over matrix factorizations, in which every Frank-Wolfe iteration consists of a low-rank update, is presented, and the broad application areas of this approach are discussed.
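As a rough sketch of the projection-free idea (a generic Frank-Wolfe iteration, not the paper's matrix-factorization framework; the grad and lmo callables are illustrative placeholders):

import numpy as np

def frank_wolfe(grad, lmo, x0, n_iters=100):
    # Projection-free: each iteration calls a linear minimization oracle (lmo)
    # over the feasible set instead of computing a projection.
    x = np.asarray(x0, dtype=float)
    for t in range(n_iters):
        g = grad(x)                        # gradient of the smooth objective at x
        s = lmo(g)                         # extreme point minimizing <g, s> over the set
        gamma = 2.0 / (t + 2.0)            # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Toy usage: minimize ||x - b||^2 over the probability simplex, whose oracle
# simply returns the coordinate vertex with the smallest gradient entry.
b = np.array([0.1, 0.7, 0.2])
x_star = frank_wolfe(grad=lambda x: 2.0 * (x - b),
                     lmo=lambda g: np.eye(len(g))[np.argmin(g)],
                     x0=np.ones(3) / 3.0)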
Book

Convex Optimization: Algorithms and Complexity

TL;DR: This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms and provides a gentle introduction to structural optimization with FISTA, saddle-point mirror prox, Nemirovski's alternative to Nesterov's smoothing, and a concise description of interior point methods.
Proceedings Article

The Lovasz-Softmax Loss: A Tractable Surrogate for the Optimization of the Intersection-Over-Union Measure in Neural Networks

TL;DR: In this article, a method for direct optimization of the mean intersection-over-union loss in neural networks, based on the convex Lovász extension of submodular losses, is presented.
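For reference, the quantity being targeted is the Jaccard index of each class c (standard definition, written here for illustration rather than quoted from the paper):

    J_c(y^*, \tilde y) = \frac{|\{y^* = c\} \cap \{\tilde y = c\}|}{|\{y^* = c\} \cup \{\tilde y = c\}|}, \qquad \Delta_{J_c} = 1 - J_c.

Viewed as a function of the set of mispredicted pixels, the loss \Delta_{J_c} is submodular, so its Lovász extension is convex and can serve as a differentiable training surrogate.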
Posted Content

Taking Human out of Learning Applications: A Survey on Automated Machine Learning

TL;DR: Provides an up-to-date survey on AutoML and proposes a general AutoML framework that not only covers most existing approaches to date but can also guide the design of new methods.
Proceedings Article

On the global linear convergence of Frank-Wolfe optimization variants

TL;DR: In this paper, the authors highlight and clarify several variants of the Frank-Wolfe optimization algorithm that have been successfully applied in practice: away-steps, pairwise, fully-corrective and minimum norm point algorithms.
References
Book

Elements of Information Theory

TL;DR: The authors examine the role of entropy, inequalities, and randomness in the design and construction of codes.
Journal Article

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant, is proposed.
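In the standard formulation (written here from the usual textbook presentation rather than quoted from the paper), the lasso estimate solves

    \hat\beta = \arg\min_{\beta} \sum_{i=1}^{N} \Bigl( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\,\beta_j \Bigr)^2 \quad \text{subject to} \quad \sum_{j=1}^{p} |\beta_j| \le t,

where the bound t controls the amount of shrinkage and drives some coefficients exactly to zero, thereby performing variable selection.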
Book

Matrix Computations

Gene H. Golub
Book

Convex Optimization

TL;DR: A comprehensive introduction to the subject is given, with a focus not on the optimization problems themselves but on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Book

Matrix Analysis

TL;DR: In this book, the authors present results of both classic and recent matrix analysis, using canonical forms as a unifying theme, and demonstrate their importance throughout linear algebra and matrix theory as well as in a variety of applications.