Open Access Journal Article

Robust Face Recognition via Sparse Representation

TLDR
This work considers the problem of automatically recognizing human faces from frontal views with varying expression and illumination, as well as occlusion and disguise, and proposes a general classification algorithm for (image-based) object recognition based on a sparse representation computed by ℓ1-minimization.
Abstract
We consider the problem of automatically recognizing human faces from frontal views with varying expression and illumination, as well as occlusion and disguise. We cast the recognition problem as one of classifying among multiple linear regression models and argue that new theory from sparse signal representation offers the key to addressing this problem. Based on a sparse representation computed by ℓ1-minimization, we propose a general classification algorithm for (image-based) object recognition. This new framework provides new insights into two crucial issues in face recognition: feature extraction and robustness to occlusion. For feature extraction, we show that if sparsity in the recognition problem is properly harnessed, the choice of features is no longer critical. What is critical, however, is whether the number of features is sufficiently large and whether the sparse representation is correctly computed. Unconventional features such as downsampled images and random projections perform just as well as conventional features such as eigenfaces and Laplacianfaces, as long as the dimension of the feature space surpasses a certain threshold, predicted by the theory of sparse representation. This framework can handle errors due to occlusion and corruption uniformly by exploiting the fact that these errors are often sparse with respect to the standard (pixel) basis. The theory of sparse representation helps predict how much occlusion the recognition algorithm can handle and how to choose the training images to maximize robustness to occlusion. We conduct extensive experiments on publicly available databases to verify the efficacy of the proposed algorithm and corroborate the above claims.
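
As a concrete illustration of the classification scheme the abstract describes, the sketch below implements the core idea in Python under simplifying assumptions: training samples (or their features) are stacked as columns of a dictionary, a sparse code for the test sample is obtained with an ℓ1-regularized solver (a basic ISTA loop here, standing in for whatever ℓ1-minimization routine the paper uses), and the test sample is assigned to the class whose training columns best reconstruct it. Function and variable names (`ista_l1`, `src_classify`, `train`, `labels`) are illustrative, not from the paper.

```python
import numpy as np

def ista_l1(A, y, lam=0.01, n_iter=500):
    """Minimize 0.5*||A x - y||_2^2 + lam*||x||_1 with ISTA (proximal gradient)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)              # gradient of the least-squares term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

def src_classify(train, labels, y):
    """Sparse-representation classification: code the test sample y over all
    training columns, then assign it to the class whose columns give the
    smallest reconstruction residual. `labels` holds one class label per column."""
    A = train / np.linalg.norm(train, axis=0, keepdims=True)   # column-normalized dictionary
    x = ista_l1(A, y)
    residuals = {}
    for c in np.unique(labels):
        xc = np.where(labels == c, x, 0.0)                     # keep only class-c coefficients
        residuals[c] = np.linalg.norm(y - A @ xc)
    return min(residuals, key=residuals.get)
```

In practice the columns could be downsampled images, random projections, or eigenface coefficients, consistent with the abstract's claim that the particular feature choice matters less than the feature dimension.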


Citations
Journal Article

Robust classification using ℓ2,1-norm based regression model

TL;DR: A novel classification method using ℓ2,1-norm based regression is proposed in this paper, and the results show competitive performance, particularly better than methods that use a dummy matrix as the response variables.
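
For readers unfamiliar with the ℓ2,1 norm mentioned above, here is a minimal sketch (assuming the common row-wise convention; the cited paper may group by columns instead) of the norm and of its proximal operator, which is the basic building block of most ℓ2,1-regularized regression solvers. Names and the tolerance are illustrative.

```python
import numpy as np

def l21_norm(W):
    """Sum of the Euclidean norms of the rows of W (one common convention)."""
    return np.sum(np.linalg.norm(W, axis=1))

def prox_l21(W, tau):
    """Row-wise shrinkage: the proximal operator of tau * ||.||_{2,1}.
    Rows with norm below tau are set to zero; the rest are shrunk toward zero."""
    row_norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(row_norms, 1e-12), 0.0)
    return scale * W
```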
Journal Article

Sparse Sensor Placement Optimization for Classification

TL;DR: A novel algorithm is proposed to solve sparse sensor placement optimization for classification (SSPOC); it exploits the low-dimensional structure exhibited by many high-dimensional systems and performs computationally efficient classification with accuracy approaching that of classification using full-state data.
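
The sketch below is one reading of the SSPOC idea summarized above, not the paper's exact formulation: given a low-dimensional basis `Psi_r` learned from training data and a discriminant vector `w` in that basis, a sparse sensor weighting is found by a lasso-style relaxation (the paper poses an ℓ1 program; the solver, parameter names, and thresholds here are assumptions).

```python
import numpy as np

def sspoc_sensors(Psi_r, w, lam=0.05, n_iter=500):
    """Given a low-rank basis Psi_r (n_pixels x r) and a discriminant vector w (r,),
    find a sparse sensor weighting s (n_pixels,) with Psi_r.T @ s close to w using
    proximal gradient descent on a lasso-style objective; the selected sensors are
    the indices where s is nonzero."""
    A = Psi_r.T                              # (r, n_pixels)
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    s = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ s - w)
        z = s - g / L
        s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return np.flatnonzero(np.abs(s) > 1e-8)  # sensor (pixel) indices
```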
Proceedings Article

Adaptive ADMM with Spectral Penalty Parameter Selection

TL;DR: In this paper, an adaptive alternating direction method of multipliers (AADMM) algorithm is proposed that automatically tunes the penalty parameters to achieve fast convergence and relative insensitivity to the initial step size and problem scaling.
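
To make the idea of penalty-parameter adaptation concrete, the following sketch runs ADMM on a lasso problem and adjusts the penalty rho by simple residual balancing. This is a stand-in for, not a reproduction of, the spectral rule proposed in the cited paper; all names, defaults, and thresholds are illustrative.

```python
import numpy as np

def lasso_admm_adaptive(A, b, lam=0.1, rho=1.0, n_iter=200):
    """ADMM for min 0.5*||A x - b||_2^2 + lam*||x||_1 with a residual-balancing
    update of the penalty parameter rho (a stand-in for the paper's spectral rule)."""
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))
        z_old = z
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)   # soft threshold
        u = u + x - z
        r_norm = np.linalg.norm(x - z)            # primal residual
        s_norm = rho * np.linalg.norm(z - z_old)  # dual residual
        if r_norm > 10 * s_norm:                  # penalty too small: increase it
            rho *= 2.0; u /= 2.0
        elif s_norm > 10 * r_norm:                # penalty too large: decrease it
            rho /= 2.0; u *= 2.0
    return z
```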
Journal Article

Research on sparsity indexes for fault diagnosis of rotating machinery

TL;DR: Analysis of the performance of representative sparsity indexes, including the Gini index, the ℓ2/ℓ1 norm, the Hoyer measure, and kurtosis, confirms that an optimal scheme can be designed for sparsity-based improvement under the proposed guideline.
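
For reference, the four indexes named in the summary can be computed as in the sketch below. Normalizations vary across the literature, so treat these as one common set of definitions rather than the cited paper's exact choices; the function name is illustrative.

```python
import numpy as np

def sparsity_indexes(x):
    """Four common sparsity measures of a signal (e.g., a vibration envelope);
    for all four, larger values indicate a sparser, more impulsive signal."""
    x = np.asarray(x, dtype=float)
    a = np.abs(x)
    n = a.size
    # Gini index of the sorted, normalized amplitudes
    xs = np.sort(a)
    gini = 1.0 - 2.0 * np.sum((xs / xs.sum()) * (n - np.arange(1, n + 1) + 0.5) / n)
    # l2/l1 norm ratio
    l2_l1 = np.linalg.norm(a, 2) / np.linalg.norm(a, 1)
    # Hoyer measure: rescales the l2/l1 ratio to [0, 1]
    hoyer = (np.sqrt(n) - 1.0 / l2_l1) / (np.sqrt(n) - 1.0)
    # Kurtosis (fourth standardized moment of the raw signal)
    kurt = np.mean((x - x.mean()) ** 4) / np.var(x) ** 2
    return {"gini": gini, "l2/l1": l2_l1, "hoyer": hoyer, "kurtosis": kurt}
```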
Posted Content

Face Recognition: From Traditional to Deep Learning Methods

TL;DR: This paper provides a comprehensive and up-to-date literature review of popular face recognition methods including both traditional (geometry-based, holistic, feature-based and hybrid methods) and deep learning methods.
References
Journal Article

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models called the lasso is proposed, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
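
A minimal sketch of the lasso in its Lagrangian (penalized) form, solved by coordinate descent; for a suitable penalty `lam` this is equivalent to the constrained form described above. The function name and defaults are illustrative.

```python
import numpy as np

def lasso_cd(X, y, lam=0.1, n_iter=100):
    """Coordinate descent for min_beta 0.5*||y - X beta||_2^2 + lam*||beta||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)                 # per-feature squared norms
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]  # partial residual excluding feature j
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta
```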
Book

The Nature of Statistical Learning Theory

TL;DR: The book covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Book

Convex Optimization

TL;DR: This book gives a comprehensive introduction to convex optimization, with the focus on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Journal Article

Eigenfaces for recognition

TL;DR: A near-real-time computer system is described that can locate and track a subject's head and then recognize the person by comparing characteristics of the face to those of known individuals; the approach is easy to implement using a neural network architecture.
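
A compact sketch of the eigenface pipeline the summary refers to: PCA on vectorized training faces followed by nearest-neighbour matching in the reduced space. Variable names, the SVD route to the principal components, and the plain nearest-neighbour rule are simplifications, not the cited system's exact implementation.

```python
import numpy as np

def eigenfaces(images, k=20):
    """Top-k eigenfaces from a stack of vectorized face images (one per row).
    Returns the mean face and the (k, d) projection basis."""
    X = images.astype(float)
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)  # rows of Vt = principal directions
    return mean, Vt[:k]

def recognize(probe, mean, basis, gallery_codes, gallery_labels):
    """Nearest-neighbour matching of the probe's eigenface coefficients
    against precomputed gallery coefficients."""
    code = basis @ (probe - mean)
    dists = np.linalg.norm(gallery_codes - code, axis=1)
    return gallery_labels[np.argmin(dists)]
```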
Journal Article

Eigenfaces vs. Fisherfaces: recognition using class specific linear projection

TL;DR: A face recognition algorithm that is insensitive to large variations in lighting direction and facial expression is developed; based on Fisher's linear discriminant, it produces well-separated classes in a low-dimensional subspace, even under severe variation in lighting and facial expression.
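
A sketch of the Fisher-discriminant projection underlying the Fisherfaces method described above, assuming the data have already been PCA-reduced so that the within-class scatter matrix is invertible (which is why the method applies PCA first); names and the eigen-decomposition route are illustrative.

```python
import numpy as np

def fisherfaces(X, y, k):
    """Fisher's linear discriminant on (PCA-reduced) features X (n x d) with labels y:
    find k directions that maximize between-class scatter relative to within-class scatter."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))                        # within-class scatter
    Sb = np.zeros((d, d))                        # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all)[:, None]
        Sb += Xc.shape[0] * diff @ diff.T
    # Generalized eigenproblem Sb w = lambda Sw w, solved via Sw^{-1} Sb
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(-vals.real)
    return vecs[:, order[:k]].real               # (d, k) projection matrix
```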
Trending Questions (1)
What is the minimum number of images required for a facial recognition model to sufficiently learn features?

The paper does not provide a specific minimum number of images required for a facial recognition model to sufficiently learn features.