Journal ArticleDOI

Statistical Signal Processing

01 Jan 1985 - Vol. 148, Iss. 1, pp. 63-63
About: The article was published on 1985-01-01 and has received 992 citations to date. The article focuses on the topics: Statistical signal processing & Digital signal processing.
Citations
Posted Content
TL;DR: An overview of hyperspectral unmixing methods from the time of Keshava and Mustard's unmixing tutorial to the present, covering mixing models and signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms.
Abstract: Imaging spectrometers measure electromagnetic energy scattered in their instantaneous field of view in hundreds or thousands of spectral channels with higher spectral resolution than multispectral cameras. Imaging spectrometers are therefore often referred to as hyperspectral cameras (HSCs). Higher spectral resolution enables material identification via spectroscopic analysis, which facilitates countless applications that require identifying materials in scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial resolution of HSCs, microscopic material mixing, and multiple scattering, spectra measured by HSCs are mixtures of spectra of materials in a scene. Thus, accurate estimation requires unmixing. Pixels are assumed to be mixtures of a few materials, called endmembers. Unmixing involves estimating all or some of: the number of endmembers, their spectral signatures, and their abundances at each pixel. Unmixing is a challenging, ill-posed inverse problem because of model inaccuracies, observation noise, environmental conditions, endmember variability, and data set size. Researchers have devised and investigated many models searching for robust, stable, tractable, and accurate unmixing algorithms. This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models are first discussed. Signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms are described. Mathematical problems and potential solutions are described. Algorithm characteristics are illustrated experimentally.

1,808 citations
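
The linear mixing model at the heart of the unmixing problem summarized above can be stated compactly: each pixel spectrum is an endmember matrix times an abundance vector plus noise. The sketch below is only an illustration on synthetic data; the matrix E, vector a_true, and noise level are made-up placeholders, and the unconstrained least-squares inversion ignores the nonnegativity and sum-to-one constraints that real unmixing algorithms enforce.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_endmembers = 200, 3

# Synthetic endmember signatures (columns of E) and a mixed pixel y = E @ a + noise
E = rng.uniform(0.0, 1.0, size=(n_bands, n_endmembers))
a_true = np.array([0.6, 0.3, 0.1])                    # abundances summing to one
y = E @ a_true + 0.01 * rng.standard_normal(n_bands)  # observed pixel spectrum

# Unconstrained least-squares inversion of the linear mixing model
a_ls, *_ = np.linalg.lstsq(E, y, rcond=None)
print("estimated abundances:", a_ls)
```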

Journal ArticleDOI
TL;DR: A new framework for statistical signal processing based on wavelet-domain hidden Markov models (HMMs) that concisely models the statistical dependencies and non-Gaussian statistics encountered in real-world signals is developed.
Abstract: Wavelet-based statistical signal processing techniques such as denoising and detection typically model the wavelet coefficients as independent or jointly Gaussian. These models are unrealistic for many real-world signals. We develop a new framework for statistical signal processing based on wavelet-domain hidden Markov models (HMMs) that concisely models the statistical dependencies and non-Gaussian statistics encountered in real-world signals. Wavelet-domain HMMs are designed with the intrinsic properties of the wavelet transform in mind and provide powerful, yet tractable, probabilistic signal models. Efficient expectation maximization algorithms are developed for fitting the HMMs to observational signal data. The new framework is suitable for a wide range of applications, including signal estimation, detection, classification, prediction, and even synthesis. To demonstrate the utility of wavelet-domain HMMs, we develop novel algorithms for signal denoising, classification, and detection.

1,783 citations
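
As a rough illustration of the non-Gaussian wavelet statistics this abstract refers to, the sketch below fits a two-state, zero-mean Gaussian mixture ("small" vs. "large" coefficients) to a coefficient vector with EM. This is only the marginal building block; the paper's wavelet-domain HMMs additionally couple the hidden states across scales, which is omitted here, and the coefficients below are synthetic.

```python
import numpy as np

def fit_two_state_mixture(coeffs, n_iter=50):
    """EM fit of a two-component, zero-mean Gaussian mixture (small/large states)."""
    coeffs = np.asarray(coeffs, dtype=float)
    p = 0.5                                      # probability of the "large" state
    var = np.array([0.1, 10.0]) * coeffs.var()   # initial small/large variances
    for _ in range(n_iter):
        # E-step: responsibility of the "large" state for each coefficient
        lik_small = np.exp(-coeffs**2 / (2 * var[0])) / np.sqrt(2 * np.pi * var[0])
        lik_large = np.exp(-coeffs**2 / (2 * var[1])) / np.sqrt(2 * np.pi * var[1])
        resp = p * lik_large / (p * lik_large + (1 - p) * lik_small + 1e-300)
        # M-step: update the state probability and the two variances
        p = resp.mean()
        var[1] = (resp * coeffs**2).sum() / (resp.sum() + 1e-12)
        var[0] = ((1 - resp) * coeffs**2).sum() / ((1 - resp).sum() + 1e-12)
    return p, var

rng = np.random.default_rng(1)
# Synthetic "wavelet coefficients": mostly near zero, with a few large outliers
coeffs = np.where(rng.random(5000) < 0.1,
                  rng.normal(0.0, 3.0, 5000),
                  rng.normal(0.0, 0.3, 5000))
p_large, variances = fit_two_state_mixture(coeffs)
print("P(large state):", p_large, "variances:", variances)
```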

Journal ArticleDOI
TL;DR: The authors present a fully constrained least squares (FCLS) linear spectral mixture analysis method for material quantification; because no closed-form solution exists for this method, an efficient algorithm is developed to yield optimal solutions.
Abstract: Linear spectral mixture analysis (LSMA) is a widely used technique in remote sensing to estimate abundance fractions of materials present in an image pixel. In order for an LSMA-based estimator to produce accurate amounts of material abundance, it generally requires two constraints imposed on the linear mixture model used in LSMA, which are the abundance sum-to-one constraint and the abundance nonnegativity constraint. The first constraint requires the sum of the abundance fractions of materials present in an image pixel to be one and the second imposes a constraint that these abundance fractions be nonnegative. While the first constraint is easy to deal with, the second constraint is difficult to implement since it results in a set of inequalities and can only be solved by numerical methods. Consequently, most LSMA-based methods are unconstrained and produce solutions that do not necessarily reflect the true abundance fractions of materials. In this case, they can only be used for the purposes of material detection, discrimination, and classification, but not for material quantification. The authors present a fully constrained least squares (FCLS) linear spectral mixture analysis method for material quantification. Since no closed form can be derived for this method, an efficient algorithm is developed to yield optimal solutions. In order to further apply the designed algorithm to unknown image scenes, an unsupervised least squares error (LSE)-based method is also proposed to extend the FCLS method in an unsupervised manner.

1,676 citations
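
One common way to approximate the fully constrained problem described above is to handle nonnegativity with nonnegative least squares and to fold the sum-to-one constraint in by appending a heavily weighted row of ones. The sketch below follows that trick as a hedged illustration, not necessarily the paper's exact algorithm; the function name fcls_abundances, the weight delta, and the synthetic data are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def fcls_abundances(E, y, delta=1e3):
    """E: (bands x endmembers) signature matrix, y: (bands,) pixel spectrum."""
    n_endmembers = E.shape[1]
    # Append a heavily weighted row of ones so the solution approximately sums to one
    E_aug = np.vstack([E, delta * np.ones((1, n_endmembers))])
    y_aug = np.concatenate([y, [delta]])
    a, _ = nnls(E_aug, y_aug)   # nonnegativity enforced by NNLS
    return a

rng = np.random.default_rng(2)
E = rng.uniform(size=(100, 4))
a_true = np.array([0.5, 0.2, 0.2, 0.1])
y = E @ a_true + 0.005 * rng.standard_normal(100)
print(fcls_abundances(E, y))   # nonnegative and approximately summing to one
```

Increasing delta tightens the sum-to-one constraint at the cost of numerical conditioning; this trade-off is an implementation detail of the sketch, not a claim about the paper.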

Proceedings ArticleDOI
25 Jul 1995
TL;DR: The authors present the MMSE and LS estimators together with modifications that trade off complexity against performance; the symbol error rate for a 16-QAM system is evaluated by simulation.
Abstract: The use of multi-amplitude signaling schemes in wireless OFDM systems requires the tracking of the fading radio channel. The paper addresses channel estimation based on time-domain channel statistics. Using a general model for a slowly fading channel, the authors present the MMSE and LS estimators and a method for modifications compromising between complexity and performance. The symbol error rate for a 16-QAM system is presented by means of simulation results. Depending upon estimator complexity, up to 4 dB in SNR can be gained over the LS estimator.

1,647 citations
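
For the pilot-aided setting this abstract describes, the LS estimate simply divides out the known pilots per subcarrier, while a linear MMSE estimator smooths that result with the channel correlation matrix. The sketch below is a simplified illustration under a synthetic multipath channel with QPSK pilots; the correlation matrix R_hh, the SNR, and the constant beta are assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sub, n_taps, snr = 64, 8, 10.0

# Known constant-modulus pilots (QPSK) and a random multipath channel
x = np.exp(1j * 0.5 * np.pi * rng.integers(0, 4, n_sub))
taps = (rng.standard_normal(n_taps) + 1j * rng.standard_normal(n_taps)) / np.sqrt(2 * n_taps)
F = np.fft.fft(np.eye(n_sub))[:, :n_taps]   # maps time-domain taps to subcarriers
h = F @ taps                                # true frequency response

noise = (rng.standard_normal(n_sub) + 1j * rng.standard_normal(n_sub)) * np.sqrt(0.5 / snr)
y = x * h + noise                           # received pilot observations

# LS estimate: divide out the known pilots, subcarrier by subcarrier
h_ls = y / x

# Linear MMSE smoothing of the LS estimate using the channel correlation matrix;
# beta is a constellation-dependent constant (1 for constant-modulus pilots)
R_hh = F @ F.conj().T / n_taps
beta = 1.0
h_mmse = R_hh @ np.linalg.solve(R_hh + (beta / snr) * np.eye(n_sub), h_ls)

print("LS   MSE:", np.mean(np.abs(h_ls - h) ** 2))
print("MMSE MSE:", np.mean(np.abs(h_mmse - h) ** 2))
```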

Journal ArticleDOI
TL;DR: A tutorial/overview cross section of some relevant hyperspectral data analysis methods and algorithms, organized in six main topics: data fusion, unmixing, classification, target detection, physical parameter retrieval, and fast computing.
Abstract: Hyperspectral remote sensing technology has advanced significantly in the past two decades. Current sensors onboard airborne and spaceborne platforms cover large areas of the Earth surface with unprecedented spectral, spatial, and temporal resolutions. These characteristics enable a myriad of applications requiring fine identification of materials or estimation of physical parameters. Very often, these applications rely on sophisticated and complex data analysis methods. The main sources of difficulty are the high dimensionality and size of the hyperspectral data, the spectral mixing (linear and nonlinear), and the degradation mechanisms associated with the measurement process, such as noise and atmospheric effects. This paper presents a tutorial/overview cross section of some relevant hyperspectral data analysis methods and algorithms, organized in six main topics: data fusion, unmixing, classification, target detection, physical parameter retrieval, and fast computing. For each topic, we describe the state of the art, provide illustrative examples, and point to future challenges and research directions.

1,604 citations