Journal ArticleDOI

Automatic analysis of the difference image for unsupervised change detection

TL;DR: The authors propose two automatic techniques, based on Bayes theory, for the analysis of the difference image: one automatically selects the decision threshold that minimizes the overall change-detection error probability under the assumption that pixels in the difference image are independent of one another, and the other exploits the spatial-contextual information in the neighborhood of each pixel through a Markov random field model.
Abstract: One of the main problems related to unsupervised change detection methods based on the "difference image" lies in the lack of efficient automatic techniques for discriminating between changed and unchanged pixels in the difference image. Such discrimination is usually performed by using empirical strategies or manual trial-and-error procedures, which affect both the accuracy and the reliability of the change-detection process. To overcome such drawbacks, in this paper, the authors propose two automatic techniques (based on the Bayes theory) for the analysis of the difference image. One allows an automatic selection of the decision threshold that minimizes the overall change detection error probability under the assumption that pixels in the difference image are independent of one another. The other analyzes the difference image by considering the spatial-contextual information included in the neighborhood of each pixel. In particular, an approach based on Markov Random Fields (MRFs) that exploits interpixel class dependency contexts is presented. Both proposed techniques require the knowledge of the statistical distributions of the changed and unchanged pixels in the difference image. To perform an unsupervised estimation of the statistical terms that characterize these distributions, they propose an iterative method based on the Expectation-Maximization (EM) algorithm. Experimental results confirm the effectiveness of both proposed techniques.
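
The unsupervised threshold-selection technique summarized above can be sketched in a few lines: fit a two-component Gaussian mixture to the difference-image values with EM, then choose the threshold that minimizes the overall error probability under the pixel-independence assumption. The Python sketch below is only an illustration of that idea; the initialization, the synthetic data, and all variable names are assumptions of this example rather than the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def em_two_gaussians(x, n_iter=100):
    """Fit a two-component 1-D Gaussian mixture (unchanged / changed) with EM."""
    t0 = x.mean()                                     # crude initial split
    w = np.array([np.mean(x <= t0), np.mean(x > t0)])
    mu = np.array([x[x <= t0].mean(), x[x > t0].mean()])
    var = np.array([x[x <= t0].var(), x[x > t0].var()]) + 1e-6
    for _ in range(n_iter):
        # E-step: posterior probability of each class for every pixel value.
        pdf = np.stack([w[k] * norm.pdf(x, mu[k], np.sqrt(var[k])) for k in (0, 1)])
        resp = pdf / pdf.sum(axis=0, keepdims=True)
        # M-step: re-estimate priors, means and variances.
        w = resp.mean(axis=1)
        mu = (resp * x).sum(axis=1) / resp.sum(axis=1)
        var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / resp.sum(axis=1) + 1e-6
    order = np.argsort(mu)                            # component 0 = unchanged (lower mean)
    return w[order], mu[order], var[order]

def minimum_error_threshold(w, mu, var, grid):
    """Threshold minimizing P(unchanged)*P(x > T | unchanged) + P(changed)*P(x <= T | changed)."""
    err = (w[0] * norm.sf(grid, mu[0], np.sqrt(var[0]))
           + w[1] * norm.cdf(grid, mu[1], np.sqrt(var[1])))
    return grid[np.argmin(err)]

# Synthetic difference-image magnitudes: many unchanged pixels plus a changed minority.
rng = np.random.default_rng(0)
diff = np.concatenate([rng.normal(10, 3, 9000), rng.normal(40, 8, 1000)])
w, mu, var = em_two_gaussians(diff)
T = minimum_error_threshold(w, mu, var, np.linspace(diff.min(), diff.max(), 1000))
change_mask = diff > T
```

The paper's MRF-based alternative would then refine this per-pixel decision by exploiting neighborhood context rather than thresholding each pixel in isolation.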


Citations
More filters
Journal ArticleDOI
TL;DR: This paper comprehensively summarizes and reviews the major change detection approaches found in the literature.
Abstract: Timely and accurate change detection of Earth's surface features is extremely important for understanding relationships and interactions between human and natural phenomena in order to promote better decision making. Remote sensing data are primary sources extensively used for change detection in recent decades. Many change detection techniques have been developed. This paper summarizes and reviews these techniques. Previous literature has shown that image differencing, principal component analysis and post-classification comparison are the most common methods used for change detection. In recent years, spectral mixture analysis, artificial neural networks and integration of geographical information system and remote sensing data have become important techniques for change detection applications. Different change detection algorithms have their own merits and no single approach is optimal and applicable to all cases. In practice, different algorithms are often compared to find the best change detection results for a specific application. Research of change detection techniques is still an active topic and new techniques are needed to effectively use the increasingly diverse and complex remotely sensed data available or projected to be soon available from satellite and airborne sensors. This paper is a comprehensive exploration of all the major change detection approaches implemented as found in the literature.
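
Since image differencing is singled out in this review as one of the most common change detection methods, a minimal Python sketch of it follows; the mean-plus-k-standard-deviations threshold rule, the single-band setting, and the synthetic images are illustrative assumptions, not a prescription from the review.

```python
import numpy as np

def difference_image_change(img_t1, img_t2, k=2.0):
    """Change detection by simple image differencing of two co-registered bands.

    Pixels whose absolute difference exceeds mean(|diff|) + k * std(|diff|) are
    flagged as changed (an illustrative rule; in practice the threshold is often
    chosen empirically or automatically).
    """
    diff = img_t2.astype(np.float64) - img_t1.astype(np.float64)
    mag = np.abs(diff)
    return mag > mag.mean() + k * mag.std()

# Two single-band acquisitions of the same scene, with one simulated changed patch.
rng = np.random.default_rng(1)
img_t1 = rng.normal(100, 10, size=(256, 256))
img_t2 = img_t1 + rng.normal(0, 2, size=(256, 256))
img_t2[100:120, 100:140] += 80.0
mask = difference_image_change(img_t1, img_t2)
```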

2,785 citations

Journal ArticleDOI
TL;DR: In this paper, the authors present a systematic survey of the common processing steps and core decision rules in modern change detection algorithms, including significance and hypothesis testing, predictive models, the shading model, and background modeling.
Abstract: Detecting regions of change in multiple images of the same scene taken at different times is of widespread interest due to a large number of applications in diverse disciplines, including remote sensing, surveillance, medical diagnosis and treatment, civil infrastructure, and underwater sensing. This paper presents a systematic survey of the common processing steps and core decision rules in modern change detection algorithms, including significance and hypothesis testing, predictive models, the shading model, and background modeling. We also discuss important preprocessing methods, approaches to enforcing the consistency of the change mask, and principles for evaluating and comparing the performance of change detection algorithms. It is hoped that our classification of algorithms into a relatively small number of categories will provide useful guidance to the algorithm designer.
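
To make the "significance and hypothesis testing" family of decision rules mentioned in the survey concrete, here is a toy Python sketch that tests each image block against a no-change null hypothesis; the Gaussian noise model, the noise standard deviation, the block size, and the significance level are all assumptions of the example.

```python
import numpy as np
from scipy.stats import norm

def significance_test_change(img_t1, img_t2, sigma_noise=5.0, block=8, alpha=0.01):
    """Block-wise hypothesis test: H0 = 'no change', i.e. the difference is pure sensor noise.

    The mean difference of each block is compared with its distribution under H0
    (zero-mean Gaussian with standard deviation sigma_noise / sqrt(block pixels));
    blocks whose two-sided p-value falls below alpha are declared changed.
    """
    diff = img_t2.astype(np.float64) - img_t1.astype(np.float64)
    h, w = diff.shape
    mask = np.zeros((h, w), dtype=bool)
    n = block * block
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            z = diff[i:i + block, j:j + block].mean() / (sigma_noise / np.sqrt(n))
            if 2 * norm.sf(abs(z)) < alpha:        # two-sided test
                mask[i:i + block, j:j + block] = True
    return mask
```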

1,693 citations

Journal ArticleDOI
Masroor Hussain, Dongmei Chen, Angela Cheng, Hui Wei, David Stanley
TL;DR: This paper begins with a discussion of the traditionally pixel-based and (mostly) statistics-oriented change detection techniques, which focus mainly on spectral values and largely ignore the spatial context, followed by a review of object-based change detection techniques.
Abstract: The appetite for up-to-date information about earth’s surface is ever increasing, as such information provides a base for a large number of applications, including local, regional and global resources monitoring, land-cover and land-use change monitoring, and environmental studies. The data from remote sensing satellites provide opportunities to acquire information about land at varying resolutions and have been widely used for change detection studies. A large number of change detection methodologies and techniques, utilizing remotely sensed data, have been developed, and newer techniques are still emerging. This paper begins with a discussion of the traditionally pixel-based and (mostly) statistics-oriented change detection techniques which focus mainly on the spectral values and mostly ignore the spatial context. This is succeeded by a review of object-based change detection techniques. Finally, there is a brief discussion of spatial data mining techniques in image processing and change detection from remote sensing data. The merits and issues of different techniques are compared. The importance of the exponential increase in the image data volume and multiple sensors and associated challenges on the development of change detection techniques are highlighted. With the wide use of very-high-resolution (VHR) remotely sensed images, object-based methods and data mining techniques may have more potential in change detection.

1,159 citations


Cites background or methods from "Automatic analysis of the differenc..."

  • ...Other approaches to improve the threshold selection include: fuzzy set and fuzzy membership functions (Metternicht, 1999), Bayes rule analysis (Bruzzone and Prieto, 2000), and the use of existing objects for automated threshold calculation (Bouziani et al., 2010)....

    [...]

  • ...…overlapping classes may be difficult or non-informative (Kennedy et al., 2009) Topographic map revision (Metternicht, 1999) Unsupervised change detection (Bruzzone and Prieto, 2000) Landscape change analysis (Fisher et al., 2006) Multi-sensor data fusion for change detection Allow to take…...

    [...]

Journal ArticleDOI
TL;DR: A completely automatic algorithm (ADJUST) is proposed that identifies artifacted independent components by combining stereotyped artifact-specific spatial and temporal features, providing a fast, efficient, and automatic way to use ICA for artifact removal.
Abstract: A successful method for removing artifacts from electroencephalogram (EEG) recordings is Independent Component Analysis (ICA), but its implementation remains largely user-dependent. Here, we propose a completely automatic algorithm (ADJUST) that identifies artifacted independent components by combining stereotyped artifact-specific spatial and temporal features. Features were optimized to capture blinks, eye movements, and generic discontinuities on a feature selection dataset. Validation on a totally different EEG dataset shows that (1) ADJUST’s classification of independent components largely matches a manual one by experts (agreement on 95.2% of the data variance), and (2) removal of the artifacted components detected by ADJUST leads to neat reconstruction of visual and auditory event-related potentials from heavily artifacted data. These results demonstrate that ADJUST provides a fast, efficient, and automatic way to use ICA for artifact removal. Descriptors: Electroencephalography, Independent component analysis, EEG artifacts, EEG artefacts, Event-related potentials, Ongoing brain activity, Automatic classification, Thresholding
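
The excerpts below note that ADJUST computes each feature threshold with the EM-based thresholding of Bruzzone & Prieto (2000). As a rough sketch of how such a per-feature threshold could be applied to independent-component scores, the following Python snippet fits a two-Gaussian mixture to a hypothetical feature and takes the posterior crossing point as the threshold; the feature values and the use of scikit-learn's GaussianMixture are assumptions of this example, not ADJUST's actual code.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def em_feature_threshold(values):
    """Split a 1-D feature (one score per independent component) into low/high groups
    with a two-component Gaussian mixture, and return the value at which the
    high-mean component becomes the more probable one."""
    gmm = GaussianMixture(n_components=2, random_state=0).fit(values.reshape(-1, 1))
    grid = np.linspace(values.min(), values.max(), 1000).reshape(-1, 1)
    post = gmm.predict_proba(grid)
    hi = int(np.argmax(gmm.means_.ravel()))       # index of the high-mean component
    crossing = int(np.argmax(post[:, hi] > 0.5))  # first grid point where it dominates
    return float(grid[crossing, 0])

# Hypothetical artifact-feature values, one per independent component.
rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(0.2, 0.05, 60), rng.normal(0.8, 0.1, 4)])
threshold = em_feature_threshold(scores)
artifact_ics = np.where(scores > threshold)[0]    # ICs flagged for removal
```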

1,060 citations


Cites background or methods from "Automatic analysis of the differenc..."

  • ...For each feature, a threshold dividing artifacts from non-artifacts is estimated on the whole set of ICs in a completely automatic way by the Expectation-Maximization automatic thresholding method (Bruzzone & Prieto, 2000)....

    [...]

  • ...Automatic Classification For each feature included in the detectors, the threshold value was computed by means of a completely automatic image processing thresholding algorithm based on the Expectation-Maximization (EM) technique (Bruzzone & Prieto, 2000)....

    [...]

  • ...The key applicative aspect of ADJUST is its completely automatic nature: no trial-and-errors procedures are necessary for tuning parameters, as features are defined a priori and the algorithm that computes feature thresholds is completely unsupervised (Bruzzone & Prieto, 2000)....

    [...]

  • ...Remarkably, this result is obtained with a simple thresholding algorithm (Bruzzone & Prieto, 2000) in a completely unsupervised way....

    [...]

References
More filters
Journal ArticleDOI
TL;DR: An analogy is made between images and statistical mechanics systems; the annealing operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, resulting in a highly parallel "relaxation" algorithm for MAP estimation.
Abstract: We make an analogy between images and statistical mechanics systems. Pixel gray levels and the presence and orientation of edges are viewed as states of atoms or molecules in a lattice-like physical system. The assignment of an energy function in the physical system determines its Gibbs distribution. Because of the Gibbs distribution, Markov random field (MRF) equivalence, this assignment also determines an MRF image model. The energy function is a more convenient and natural mechanism for embodying picture attributes than are the local characteristics of the MRF. For a range of degradation mechanisms, including blurring, nonlinear deformations, and multiplicative or additive noise, the posterior distribution is an MRF with a structure akin to the image model. By the analogy, the posterior distribution defines another (imaginary) physical system. Gradual temperature reduction in the physical system isolates low energy states ("annealing"), or what is the same thing, the most probable states under the Gibbs distribution. The analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations. The result is a highly parallel "relaxation" algorithm for MAP estimation. We establish convergence properties of the algorithm and we experiment with some simple pictures, for which good restorations are obtained at low signal-to-noise ratios.
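
A compact Python sketch of the annealing idea described above: Gibbs sampling of a binary MRF (Ising-style smoothness prior plus a Gaussian data term) while the temperature is gradually lowered, so that low-energy, near-MAP configurations become increasingly probable. The energy weights, noise level, and cooling schedule are illustrative choices, not those of the paper.

```python
import numpy as np

def anneal_binary_mrf(obs, beta=1.5, sigma=0.5, n_sweeps=50, t0=4.0, t_min=0.1):
    """Simulated annealing for a binary scene observed through additive Gaussian noise.

    Local energy of a label at pixel (i, j):
        (obs[i, j] - label)^2 / (2 * sigma^2)  -  beta * (# agreeing 4-neighbours).
    Gibbs updates are drawn at a decreasing temperature T following a geometric schedule.
    """
    h, w = obs.shape
    x = (obs > 0.5).astype(np.int8)                     # initial labeling
    rng = np.random.default_rng(0)
    for T in np.geomspace(t0, t_min, n_sweeps):
        for i in range(h):
            for j in range(w):
                e = np.zeros(2)
                for label in (0, 1):
                    data = (obs[i, j] - label) ** 2 / (2 * sigma ** 2)
                    agree = sum(int(x[i2, j2] == label)
                                for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                                if 0 <= i2 < h and 0 <= j2 < w)
                    e[label] = data - beta * agree
                # Draw from the local conditional at temperature T.
                p_one = 1.0 / (1.0 + np.exp((e[1] - e[0]) / T))
                x[i, j] = int(rng.random() < p_one)
    return x

# Noisy observation of a binary scene: a bright square on a dark background.
truth = np.zeros((32, 32)); truth[8:24, 8:24] = 1.0
noisy = truth + np.random.default_rng(1).normal(0, 0.5, truth.shape)
restored = anneal_binary_mrf(noisy)
```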

18,761 citations


Additional excerpts

  • ..., the simulated annealing algorithm [27])....

    [...]

Journal ArticleDOI
Julian Besag1
TL;DR: The author proposes a simple iterative method for scene reconstruction in which the local characteristics of the true scene are represented by a non-degenerate Markov random field (MRF) and the reconstruction is estimated according to standard criteria.
Abstract: A continuous two-dimensional region is partitioned into a fine rectangular array of sites or "pixels", each pixel having a particular "colour" belonging to a prescribed finite set. The true colouring of the region is unknown but, associated with each pixel, there is a possibly multivariate record which conveys imperfect information about its colour according to a known statistical model. The aim is to reconstruct the true scene, with the additional knowledge that pixels close together tend to have the same or similar colours. In this paper, it is assumed that the local characteristics of the true scene can be represented by a nondegenerate Markov random field. Such information can be combined with the records by Bayes' theorem and the true scene can be estimated according to standard criteria. However, the computational burden is enormous and the reconstruction may reflect undesirable large-scale properties of the random field. Thus, a simple, iterative method of reconstruction is proposed, which does not depend on these large-scale characteristics. The method is illustrated by computer simulations in which the original scene is not directly related to the assumed random field. Some complications, including parameter estimation, are discussed. Potential applications are mentioned briefly.
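
The iterated conditional modes (ICM) procedure from this paper, which the change detection article adopts for its MRF analysis (see the excerpt below), can be sketched as a greedy counterpart of annealing: each pixel is repeatedly set to the label minimizing its local energy given the current neighbours, until the labeling stops changing. The Gaussian data term, smoothness weight, and binary label set in this Python sketch are illustrative assumptions.

```python
import numpy as np

def icm_binary(obs, beta=1.5, sigma=0.5, n_sweeps=10):
    """Iterated conditional modes for a binary labeling, sketched.

    Each pixel is greedily assigned the label minimizing
        (obs[i, j] - label)^2 / (2 * sigma^2)  -  beta * (# agreeing 4-neighbours),
    and full sweeps are repeated until no pixel changes (a local energy minimum).
    """
    h, w = obs.shape
    x = (obs > 0.5).astype(np.int8)                     # initial labeling
    for _ in range(n_sweeps):
        changed = False
        for i in range(h):
            for j in range(w):
                best_label, best_e = x[i, j], np.inf
                for label in (0, 1):
                    data = (obs[i, j] - label) ** 2 / (2 * sigma ** 2)
                    agree = sum(int(x[i2, j2] == label)
                                for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                                if 0 <= i2 < h and 0 <= j2 < w)
                    e = data - beta * agree
                    if e < best_e:
                        best_label, best_e = label, e
                if best_label != x[i, j]:
                    x[i, j] = best_label
                    changed = True
        if not changed:
            break
    return x
```

Compared with the annealing sketch above, ICM replaces the temperature-controlled random draw with a deterministic choice of the locally best label, converging quickly to a local minimum of the energy rather than searching for the global one.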

4,490 citations


"Automatic analysis of the differenc..." refers methods in this paper

  • ...In this paper, we suggest using a simple and fast approach based on Besag’s iterated conditional modes (ICM) algorithm, which has been proved to converge to a local minimum of the energy function [30]....

    [...]

Book
01 Sep 1990

4,384 citations


"Automatic analysis of the differenc..." refers methods in this paper

  • ...In order to analyze the difference image on the basis of the Bayes theory, the main problems to be solved are the estimation of the probability density functions and the a priori probabilities of the changed and unchanged classes [21]....

    [...]