Open Access · Posted Content
A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
TL;DR: In this article, an alternative inexact block coordinate descent (BCD) approach is proposed, which updates the variable blocks by successively minimizing a sequence of approximations of f that are either locally tight upper bounds of f or strictly convex local approximations of f. The convergence properties of a fairly wide class of such methods are characterized, especially for cases where the objective function is non-differentiable or nonconvex.

Abstract:
The block coordinate descent (BCD) method is widely used for minimizing a continuous function f of several block variables. At each iteration of this method, a single block of variables is optimized, while the remaining variables are held fixed. To ensure the convergence of the BCD method, the subproblem to be optimized in each iteration needs to be solved exactly to its unique optimal solution. Unfortunately, these requirements are often too restrictive for many practical scenarios. In this paper, we study an alternative inexact BCD approach which updates the variable blocks by successively minimizing a sequence of approximations of f which are either locally tight upper bounds of f or strictly convex local approximations of f. We focus on characterizing the convergence properties for a fairly wide class of such methods, especially for the cases where the objective functions are either non-differentiable or nonconvex. Our results unify and extend the existing convergence results for many classical algorithms such as the BCD method, the difference of convex functions (DC) method, the expectation maximization (EM) algorithm, as well as the alternating proximal minimization algorithm.
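As a hedged illustration of the inexact BCD idea in the abstract, the sketch below alternately minimizes a strictly convex proximal surrogate of each block (the alternating proximal minimization special case) for a toy nonconvex function f(x, y) = (xy − 1)² + 0.1(x² + y²). The function, the proximal parameter `tau`, and all names are illustrative assumptions, not taken from the paper.

```python
# Alternating proximal minimization sketch on a toy biconvex function.
# f(x, y) = (x*y - 1)^2 + 0.1*(x^2 + y^2) is nonconvex jointly but
# convex in each block; the proximal term makes each surrogate
# strictly convex, so every block update has a unique minimizer.

def f(x, y):
    return (x * y - 1) ** 2 + 0.1 * (x ** 2 + y ** 2)

def prox_block_update(other, current, tau=1.0):
    # Closed-form minimizer of the strictly convex surrogate
    #   (z*other - 1)^2 + 0.1*z^2 + (tau/2)*(z - current)^2
    # obtained by setting its derivative in z to zero.
    return (2 * other + tau * current) / (2 * other ** 2 + 0.2 + tau)

x, y = 2.0, -1.0          # illustrative starting point
f0 = f(x, y)
for _ in range(200):      # sweep over the two blocks
    x = prox_block_update(y, x)
    y = prox_block_update(x, y)
```

Because each surrogate is a locally tight, strictly convex upper bound of f (up to the proximal term), the objective value decreases at every sweep and the iterates approach a stationary point of f, consistent with the convergence theory the paper develops.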
Citations
Journal Article
Tensor Decomposition for Signal Processing and Machine Learning
Nicholas D. Sidiropoulos, Lieven De Lathauwer, Xiao Fu, Kejun Huang, Evangelos E. Papalexakis, Christos Faloutsos, +5 more
TL;DR: The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties; broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
Journal Article
Tensor Decompositions for Signal Processing Applications: From two-way to multiway component analysis
Andrzej Cichocki, Danilo P. Mandic, Lieven De Lathauwer, Guoxu Zhou, Qibin Zhao, Cesar F. Caiafa, Huy Anh Phan, +6 more
TL;DR: Benefiting from the power of multilinear algebra as their mathematical backbone, data analysis techniques using tensor decompositions are shown to have great flexibility in the choice of constraints which match data properties and extract more general latent components in the data than matrix-based methods.
Journal Article
A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
Yangyang Xu, Wotao Yin, +1 more
TL;DR: This paper considers regularized block multiconvex optimization, where the feasible set and objective function are generally nonconvex but convex in each block of variables and proposes a generalized block coordinate descent method.
Journal Article
Majorization-Minimization Algorithms in Signal Processing, Communications, and Machine Learning
TL;DR: An overview of the majorization-minimization (MM) algorithmic framework, which can provide guidance in deriving problem-driven algorithms with low computational cost, illustrated by a wide range of applications in signal processing, communications, and machine learning.
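The MM idea summarized above can be sketched on a scalar example. The following minimizes the lasso-type objective 0.5(x − a)² + λ|x| by repeatedly majorizing |x| at the current iterate x_k with the quadratic x²/(2|x_k|) + |x_k|/2, which is tight at x_k; the problem data are illustrative assumptions, not from the cited survey.

```python
# Majorization-minimization for the scalar problem
#   minimize 0.5*(x - a)^2 + lam*|x|
# The |x| term is majorized at x_k by x^2/(2*|x_k|) + |x_k|/2,
# so each surrogate is a smooth quadratic with a closed-form minimizer.

a, lam = 3.0, 1.0
x = a  # start away from the minimizer (and away from 0, so |x_k| > 0)
for _ in range(100):
    # Minimizer of the surrogate 0.5*(x - a)^2 + lam*x^2/(2*abs(x_k))
    x = a / (1.0 + lam / abs(x))
```

Each iterate decreases the true objective because the surrogate upper-bounds it and touches it at x_k; here the iterates converge to the soft-thresholding solution sign(a) · max(|a| − λ, 0) = 2.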
Journal Article
Convergence Analysis of Alternating Direction Method of Multipliers for a Family of Nonconvex Problems
TL;DR: It is shown that, in the presence of a nonconvex objective function, classical ADMM is able to reach the set of stationary solutions for these problems, provided the penalty parameter is chosen large enough.
References
Journal Article
Maximum likelihood from incomplete data via the EM algorithm
Journal Article
Tensor Decompositions and Applications
Tamara G. Kolda, Brett W. Bader, +1 more
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Book
Parallel and Distributed Computation: Numerical Methods
TL;DR: This work discusses parallel and distributed architectures, complexity measures, and communication and synchronization issues, and it presents both Jacobi and Gauss-Seidel iterations, which serve as algorithms of reference for many of the computational approaches addressed later.
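The Jacobi and Gauss-Seidel iterations mentioned above can be illustrated with a minimal sketch: Gauss-Seidel sweeps over the rows of a linear system Ax = b, reusing each freshly updated coordinate within the same sweep (Jacobi would instead use only the previous sweep's values). The 2×2 system below is an illustrative assumption, chosen diagonally dominant so the sweep converges.

```python
# Gauss-Seidel sweeps for A x = b on a small diagonally dominant system.
# Each coordinate update solves its own row exactly, holding the others
# fixed at their most recent values (the Gauss-Seidel ordering).

A = [[4.0, 1.0],
     [1.0, 3.0]]
b = [1.0, 2.0]
x = [0.0, 0.0]

for _ in range(50):
    for i in range(2):
        s = sum(A[i][j] * x[j] for j in range(2) if j != i)
        x[i] = (b[i] - s) / A[i][i]
```

For this system the exact solution is x = (1/11, 7/11); the freshly-updated-value rule is what distinguishes Gauss-Seidel from Jacobi and is the same successive-block structure exploited by the BCD-type methods in this paper.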
Journal Article
Analysis of Individual Differences in Multidimensional Scaling via an N-Way Generalization of 'Eckart-Young' Decomposition
J. Douglas Carroll, Jih-Jie Chang, +1 more
TL;DR: In this paper, an individual differences model for multidimensional scaling is outlined, in which individuals are assumed to weight the several dimensions of a common "psychological space" differently, and a corresponding method of analyzing similarities data is proposed, involving a generalization of Eckart-Young analysis to the decomposition of three-way (or higher-way) tables.
Related Papers (5)
A unified convergence analysis of block successive minimization methods for nonsmooth optimization
Block stochastic gradient iteration for convex and nonconvex optimization
Yangyang Xu, Wotao Yin, +1 more