Open Access Journal Article DOI

A majorization-minimization approach to the sparse generalized eigenvalue problem

TLDR
The proposed sparse GEV algorithm, which offers a general framework to solve any sparse GEV problem, will give rise to competitive algorithms for a variety of applications where specific instances of GEV problems arise.
Abstract
Generalized eigenvalue (GEV) problems have applications in many areas of science and engineering. For example, principal component analysis (PCA), canonical correlation analysis (CCA) and Fisher discriminant analysis (FDA) are specific instances of GEV problems that are widely used in statistical data analysis. The main contribution of this work is to formulate a general, efficient algorithm to obtain sparse solutions to a GEV problem. Specific instances of sparse GEV problems can then be solved by specific instances of this algorithm. We achieve this by solving the GEV problem while constraining the cardinality of the solution. Instead of relaxing the cardinality constraint using an l1-norm approximation, we consider a tighter approximation that is related to the negative log-likelihood of a Student's t-distribution. The problem is then framed as a d.c. (difference of convex functions) program and is solved as a sequence of convex programs by invoking the majorization-minimization method. The resulting algorithm is proved to exhibit global convergence behavior, i.e., for any random initialization, the sequence (subsequence) of iterates generated by the algorithm converges to a stationary point of the d.c. program. Finally, we illustrate the merits of this general sparse GEV algorithm with three specific examples of sparse GEV problems: sparse PCA, sparse CCA and sparse FDA. Empirical evidence for these examples suggests that the proposed sparse GEV algorithm, which offers a general framework to solve any sparse GEV problem, will give rise to competitive algorithms for a variety of applications where specific instances of GEV problems arise.
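As a rough sketch of the formulation described in the abstract (the penalty scaling, the constraint form x^T B x <= 1, and the surrogate weights are assumptions made for illustration, not details taken from the paper), the penalized sparse GEV problem and one MM step can be written as

\min_{x \in \mathbb{R}^n} \; -x^\top A x + \rho \sum_{i=1}^{n} \log\!\Big(1 + \frac{|x_i|}{\varepsilon}\Big) \quad \text{subject to} \quad x^\top B x \le 1,

x^{(t+1)} \in \arg\min_{x} \; -2\,(A x^{(t)})^\top x + \rho \sum_{i=1}^{n} \frac{|x_i|}{|x_i^{(t)}| + \varepsilon} \quad \text{subject to} \quad x^\top B x \le 1,

where the concave term -x^\top A x (for A \succeq 0) is majorized by its tangent at x^{(t)} and the concave logarithm is majorized by its linearization in |x_i|, so each subproblem is convex whenever B \succeq 0.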



Citations
Journal Article DOI

Principal component analysis

TL;DR: The paper focuses on the use of principal component analysis in typical chemometric areas, but the results are generally applicable.
Journal Article DOI

Nonlinear Programming: A Unified Approach

Journal Article DOI

Majorization-Minimization Algorithms in Signal Processing, Communications, and Machine Learning

TL;DR: An overview of the majorization-minimization (MM) algorithmic framework, which can provide guidance in deriving problem-driven algorithms with low computational cost; the framework is illustrated with a wide range of applications in signal processing, communications, and machine learning.
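For context, the MM principle that this survey describes replaces a difficult objective f with an easier surrogate that upper-bounds it and matches it at the current iterate:

g(x \mid x^{(t)}) \ge f(x) \;\; \forall x, \qquad g(x^{(t)} \mid x^{(t)}) = f(x^{(t)}), \qquad x^{(t+1)} \in \arg\min_{x} g(x \mid x^{(t)}),

which guarantees monotone descent, f(x^{(t+1)}) \le g(x^{(t+1)} \mid x^{(t)}) \le g(x^{(t)} \mid x^{(t)}) = f(x^{(t)}).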
Journal Article DOI

Group Sparse Beamforming for Green Cloud-RAN

TL;DR: This paper proposes a new framework for designing a green Cloud-RAN, formulating the design task as a joint RRH selection and power-minimization beamforming problem, and proposes a greedy selection algorithm shown to provide near-optimal performance.
Posted Content

Group Sparse Beamforming for Green Cloud-RAN

TL;DR: In this paper, the authors propose a new framework for designing a green cloud radio access network, formulated as a joint RRH selection and power-minimization beamforming problem; the proposed algorithms significantly reduce network power consumption and demonstrate the importance of accounting for transport-link power consumption.
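For context, RRH selection is typically coupled to beamforming through a group-sparsity penalty; a common illustrative form (an assumption for illustration, not necessarily this paper's exact objective) partitions the beamformer w into per-RRH blocks w_l and adds a weighted mixed \ell_1/\ell_2 norm to the transmit-power objective:

\Omega(w) = \sum_{l=1}^{L} \omega_l \, \| w_l \|_2,

so that blocks driven exactly to zero correspond to RRHs, and their transport links, that can be switched off.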
References
Journal Article DOI

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models, called the lasso, is proposed; it minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
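In symbols, the lasso estimate described in this TL;DR is

\hat{\beta}^{\text{lasso}} = \arg\min_{\beta} \sum_{i=1}^{N} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij} \beta_j \Big)^2 \quad \text{subject to} \quad \sum_{j=1}^{p} |\beta_j| \le t.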
Book

Convex Optimization

TL;DR: This book gives a comprehensive introduction to convex optimization, with the focus on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Book

Principal Component Analysis

TL;DR: In this book, the authors present graphical representations of data using principal component analysis (PCA), its use for time series and other non-independent data, and generalizations and adaptations of PCA.
Journal Article DOI

Regularization and variable selection via the elastic net

TL;DR: It is shown that the elastic net often outperforms the lasso, while enjoying a similar sparsity of representation, and an algorithm called LARS-EN is proposed for computing elastic net regularization paths efficiently, much like the LARS algorithm does for the lasso.
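In symbols, the (naive) elastic net criterion combines the ridge and lasso penalties, with Zou and Hastie further rescaling this naive solution to obtain the elastic net estimate:

\hat{\beta} = \arg\min_{\beta} \; \| y - X\beta \|_2^2 + \lambda_2 \| \beta \|_2^2 + \lambda_1 \| \beta \|_1.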