Journal ArticleDOI

A Nonlocal Weighted Joint Sparse Representation Classification Method for Hyperspectral Imagery

TLDR
The simultaneous orthogonal matching pursuit technique is used to solve the nonlocal weighted joint sparsity model (NLW-JSM), and the proposed classification algorithm performs better than other sparsity-based algorithms and the classical support vector machine hyperspectral classifier.
Abstract
As a powerful and promising statistical signal modeling technique, sparse representation has been widely used in various image processing and analysis fields. For hyperspectral image classification, previous studies have shown the effectiveness of the sparsity-based classification methods. In this paper, we propose a nonlocal weighted joint sparse representation classification (NLW-JSRC) method to improve the hyperspectral image classification result. In the joint sparsity model (JSM), different weights are utilized for different neighboring pixels around the central test pixel. The weight of one specific neighboring pixel is determined by the structural similarity between the neighboring pixel and the central test pixel, which is referred to as a nonlocal weighting scheme. In this paper, the simultaneous orthogonal matching pursuit technique is used to solve the nonlocal weighted joint sparsity model (NLW-JSM). The proposed classification algorithm was tested on three hyperspectral images. The experimental results suggest that the proposed algorithm performs better than the other sparsity-based algorithms and the classical support vector machine hyperspectral classifier.
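A minimal NumPy sketch of this idea is given below, assuming the nonlocal weights have already been computed from a structural-similarity measure between each neighboring pixel and the central test pixel; the function name nlw_jsrc_predict, the sparsity level K, and the simple residual rule are illustrative placeholders, not the authors' exact implementation.

import numpy as np

def nlw_jsrc_predict(D, atom_labels, Y, w, K=5):
    # D           : (B, N) dictionary of training spectra, columns l2-normalized
    # atom_labels : (N,) class index of each dictionary atom
    # Y           : (B, T) central test pixel and its neighbors, one spectrum per column
    # w           : (T,) nonlocal weights, larger for neighbors structurally similar to the center
    # K           : number of atoms selected by simultaneous OMP (SOMP)
    Yw = Y * w                                    # down-weight dissimilar neighboring pixels
    R = Yw.copy()                                 # joint residual matrix
    support = []
    for _ in range(K):                            # SOMP: pick atoms shared by all pixels in the window
        corr = np.abs(D.T @ R)                    # (N, T) correlation of every atom with every residual
        support.append(int(np.argmax(corr.sum(axis=1))))
        A, *_ = np.linalg.lstsq(D[:, support], Yw, rcond=None)  # joint coefficients on the support
        R = Yw - D[:, support] @ A                # update the residual
    # assign the class whose selected atoms best reconstruct the weighted neighborhood
    best_class, best_err = None, np.inf
    for c in np.unique(atom_labels[support]):
        rows = [i for i, k in enumerate(support) if atom_labels[k] == c]
        err = np.linalg.norm(Yw - D[:, [support[i] for i in rows]] @ A[rows])
        if err < best_err:
            best_class, best_err = c, err
    return best_class

The weighting is what distinguishes this from a plain joint sparsity model: neighbors that are structurally dissimilar to the central pixel (for example, across a class boundary) contribute less to the joint recovery.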


Citations
Journal ArticleDOI

Spectral–Spatial Classification of Hyperspectral Data Based on Deep Belief Network

TL;DR: A new feature extraction (FE) and image classification framework is proposed for hyperspectral data analysis based on a deep belief network (DBN); the novel deep architecture combines spectral–spatial FE and classification to achieve high classification accuracy.
Journal ArticleDOI

Total-Variation-Regularized Low-Rank Matrix Factorization for Hyperspectral Image Restoration

TL;DR: A spatial–spectral hyperspectral image (HSI) mixed-noise removal method named total-variation-regularized low-rank matrix factorization (LRTV) is proposed, integrating the nuclear norm, TV regularization, and the L1-norm in a unified framework for HSI restoration.
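As a rough sketch (not necessarily the exact formulation of that paper), an objective combining these three terms can be written as

\min_{X,S} \; \|X\|_* + \tau\,\mathrm{TV}(X) + \lambda\,\|S\|_1 \quad \text{s.t.} \quad \|Y - X - S\|_F \le \varepsilon,

where Y is the observed noisy HSI unfolded into a matrix, X is the recovered low-rank image, S collects sparse impulse/stripe noise, \|\cdot\|_* is the nuclear norm, \mathrm{TV}(\cdot) is a spatial total-variation penalty, and \tau, \lambda, \varepsilon are regularization parameters.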
Journal ArticleDOI

Spectral–Spatial Hyperspectral Image Classification via Multiscale Adaptive Sparse Representation

TL;DR: Considering that regions of different scales incorporate complementary yet correlated information for classification, a multiscale adaptive sparse representation (MASR) model is proposed, and experiments demonstrate its qualitative and quantitative superiority over several well-known classifiers.
Journal ArticleDOI

Spectral–Spatial Classification of Hyperspectral Images With a Superpixel-Based Discriminative Sparse Model

TL;DR: Experimental results on four real HSI datasets demonstrate the superiority of the proposed SBDSM algorithm over several well-known classification approaches in terms of both classification accuracy and computational speed.
Journal ArticleDOI

Hyperspectral Image Denoising via Noise-Adjusted Iterative Low-Rank Matrix Approximation

TL;DR: A noise-adjusted iterative low-rank matrix approximation (NAILRMA) method is proposed for HSI denoising that can effectively preserve the high-SNR bands and denoise the low-SNR bands.
References
Book

The Nature of Statistical Learning Theory

TL;DR: The book covers the setting of the learning problem, the consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including perceptrons, polynomials, and radial basis functions.
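For linearly separable training pairs (x_i, y_i) with y_i \in \{-1, +1\}, the optimal-margin criterion behind this algorithm can be sketched as the quadratic program

\min_{w,b} \; \tfrac{1}{2}\|w\|^2 \quad \text{s.t.} \quad y_i\,(w^\top x_i + b) \ge 1 \;\; \forall i,

whose solution maximizes the margin 2/\|w\| between the two classes; its kernelized dual underlies the support vector machine used as a baseline classifier in the main paper.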
Journal ArticleDOI

Robust Face Recognition via Sparse Representation

TL;DR: This work considers the problem of automatically recognizing human faces from frontal views with varying expression and illumination, as well as occlusion and disguise, and proposes a general classification algorithm for (image-based) object recognition based on a sparse representation computed by ℓ1-minimization.
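In outline, the sparse representation classification scheme referred to here solves

\hat{x} = \arg\min_x \; \|x\|_1 \quad \text{s.t.} \quad \|A x - y\|_2 \le \varepsilon,

and then assigns the test sample y to the class c minimizing the residual \|y - A\,\delta_c(\hat{x})\|_2, where the columns of A are training samples from all classes and \delta_c(\cdot) keeps only the coefficients associated with class c; the symbols A, y, and \delta_c follow common usage rather than the paper's exact notation.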
Journal ArticleDOI

Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit

TL;DR: It is demonstrated theoretically and empirically that a greedy algorithm called orthogonal matching pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal.
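As an illustrative sketch of the greedy recovery loop described here (generic OMP in NumPy; the function name and arguments are placeholders, not the paper's code):

import numpy as np

def omp(D, y, m):
    # Greedily recover an m-sparse coefficient vector x such that y ≈ D @ x.
    # D: (d, n) measurement/dictionary matrix with l2-normalized columns; y: (d,) observation.
    r = y.copy()                                  # current residual
    support = []                                  # indices of selected atoms
    for _ in range(m):
        k = int(np.argmax(np.abs(D.T @ r)))       # atom most correlated with the residual
        support.append(k)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # least-squares re-fit on the support
        r = y - D[:, support] @ coef              # residual becomes orthogonal to selected atoms
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x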

Signal Recovery from Random Measurements Via Orthogonal Matching Pursuit: The Gaussian Case

TL;DR: In this paper, a greedy algorithm called Orthogonal Matching Pursuit (OMP) was proposed to recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal.