
Showing papers in "IEEE Transactions on Geoscience and Remote Sensing in 2006"


Journal ArticleDOI
TL;DR: The Ozone Monitoring Instrument is an ultraviolet/visible nadir solar backscatter spectrometer, which provides nearly global coverage in one day with a spatial resolution of 13 km × 24 km and will enable detection of air pollution at urban-scale resolution.
Abstract: The Ozone Monitoring Instrument (OMI) flies on the National Aeronautics and Space Administration's Earth Observing System Aura satellite launched in July 2004. OMI is an ultraviolet/visible (UV/VIS) nadir solar backscatter spectrometer, which provides nearly global coverage in one day with a spatial resolution of 13 km × 24 km. Trace gases measured include O3, NO2, SO2, HCHO, BrO, and OClO. In addition, OMI will measure aerosol characteristics, cloud top heights, and UV irradiance at the surface. OMI's unique capabilities for measuring important trace gases with a small footprint and daily global coverage will be a major contribution to our understanding of stratospheric and tropospheric chemistry and climate change. OMI's high spatial resolution is unprecedented and will enable detection of air pollution at urban-scale resolution. In this paper, the instrument and its performance will be discussed.

1,644 citations


Journal ArticleDOI
TL;DR: A new spatial and temporal adaptive reflectance fusion model (STARFM) algorithm is presented to blend Landsat and MODIS surface reflectance so that high-frequency temporal information from MODIS and high-resolution spatial information from Landsat can be blended for applications that require high resolution in both time and space.
Abstract: The 16-day revisit cycle of Landsat has long limited its use for studying global biophysical processes, which evolve rapidly during the growing season. In cloudy areas of the Earth, the problem is compounded, and researchers are fortunate to get two to three clear images per year. At the same time, the coarse resolution of sensors such as the Advanced Very High Resolution Radiometer and Moderate Resolution Imaging Spectroradiometer (MODIS) limits the sensors' ability to quantify biophysical processes in heterogeneous landscapes. In this paper, the authors present a new spatial and temporal adaptive reflectance fusion model (STARFM) algorithm to blend Landsat and MODIS surface reflectance. Using this approach, high-frequency temporal information from MODIS and high-resolution spatial information from Landsat can be blended for applications that require high resolution in both time and space. The MODIS daily 500-m surface reflectance and the 16-day repeat cycle Landsat Enhanced Thematic Mapper Plus (ETM+) 30-m surface reflectance are used to produce a synthetic "daily" surface reflectance product at ETM+ spatial resolution. The authors present results both with simulated (model) data and actual Landsat/MODIS acquisitions. In general, the STARFM accurately predicts surface reflectance at an effective resolution close to that of the ETM+. However, the performance depends on the characteristic patch size of the landscape and degrades somewhat when used on extremely heterogeneous fine-grained landscapes.
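The core of this blending can be sketched as a toy prediction rule, assuming perfectly co-registered arrays with MODIS already resampled to the Landsat grid. The real STARFM additionally weights candidate pixels in a moving window by spectral, temporal, and spatial similarity, which this sketch omits:

```python
import numpy as np

def blend_simple(landsat_t0, modis_t0, modis_tp):
    """Minimal sketch of the STARFM-style blending identity: the
    fine-resolution image at the prediction date tp is the base-date
    Landsat image plus the coarse-scale (MODIS) change between dates.
    All inputs are co-registered arrays of the same shape.
    """
    return landsat_t0 + (modis_tp - modis_t0)

# toy example: MODIS observes a uniform brightening of 0.05 between dates
landsat_t0 = np.array([[0.10, 0.20], [0.30, 0.40]])
modis_t0 = np.full((2, 2), 0.25)
modis_tp = np.full((2, 2), 0.30)
pred = blend_simple(landsat_t0, modis_t0, modis_tp)
```

With spatially uniform coarse-scale change, the prediction simply shifts the Landsat image; the full algorithm's window weighting matters precisely when the change is not uniform.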

1,389 citations


Journal ArticleDOI
TL;DR: The Earth Observing System Microwave Limb Sounder measures several atmospheric chemical species to improve the authors' understanding of stratospheric ozone chemistry, the interaction of composition and climate, and pollution in the upper troposphere.
Abstract: The Earth Observing System Microwave Limb Sounder measures several atmospheric chemical species (OH, HO2, H2O, O3, HCl, ClO, HOCl, BrO, HNO3, N2O, CO, HCN, CH3CN, volcanic SO2), cloud ice, temperature, and geopotential height to improve our understanding of stratospheric ozone chemistry, the interaction of composition and climate, and pollution in the upper troposphere. All measurements are made simultaneously and continuously, during both day and night. The instrument uses heterodyne radiometers that observe thermal emission from the atmospheric limb in broad spectral regions centered near 118, 190, 240, and 640 GHz, and 2.5 THz. It was launched July 15, 2004 on the National Aeronautics and Space Administration's Aura satellite and started full-up science operations on August 13, 2004. An atmospheric limb scan and radiometric calibration for all bands are performed routinely every 25 s. Vertical profiles are retrieved every 165 km along the suborbital track, covering 82°S to 82°N latitudes on each orbit. Instrument performance to date has been excellent; data have been made publicly available; and initial science results have been obtained.

1,191 citations


Journal ArticleDOI
TL;DR: A novel (according to the authors' knowledge) type of scanning synthetic aperture radar (ScanSAR) that solves the problems of scalloping and azimuth-varying ambiguities is introduced, with the name terrain observation with progressive scan (TOPS).
Abstract: In this paper, a novel (to the authors' knowledge) type of scanning synthetic aperture radar (ScanSAR) that solves the problems of scalloping and azimuth-varying ambiguities is introduced. The technique employs a very simple counter-rotation of the radar beam, in the direction opposite to that used in spotlight (SPOT) mode: hence, the name terrain observation with progressive scan (TOPS). After a short summary of the characteristics of the ScanSAR technique and its problems, the TOPSAR technique, its design, its limits, and a focusing technique are introduced. A synthetic example based on a possible future system follows.

668 citations


Journal ArticleDOI
TL;DR: It was demonstrated that this new algorithm is able to distinguish dust plumes from fine-mode pollution particles even in complex aerosol environments such as the one over Beijing, and the values of satellite-retrieved aerosol optical thickness from Deep Blue are generally within 20%-30% of those measured by sunphotometers.
Abstract: During the ACE-Asia field campaign, unprecedented amounts of aerosol property data in East Asia during springtime were collected from an array of aircraft, shipboard, and surface instruments. However, most of the observations were obtained in areas downwind of the source regions. In this paper, the newly developed satellite aerosol algorithm called "Deep Blue" was employed to characterize the properties of aerosols over source regions using radiance measurements from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Moderate Resolution Imaging Spectroradiometer (MODIS). Based upon the Ångström exponent derived from the Deep Blue algorithm, it was demonstrated that this new algorithm is able to distinguish dust plumes from fine-mode pollution particles even in complex aerosol environments such as the one over Beijing. Furthermore, these results were validated by comparing them with observations from AERONET sites in China and Mongolia during spring 2001. These comparisons show that the values of satellite-retrieved aerosol optical thickness from Deep Blue are generally within 20%-30% of those measured by sunphotometers. The analyses also indicate that the roles of mineral dust and anthropogenic particles are comparable in contributing to the overall aerosol distributions during spring in northern China, while fine-mode particles are dominant over southern China. The spring season in East Asia presents one of the most complex environments in terms of frequent cloudiness and wide ranges of aerosol loadings and types. This paper will discuss how the factors contributing to this complexity influence the resulting aerosol monthly averages from various satellite sensors and, thus, the synergy among satellite aerosol products.
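The dust/pollution separation rests on the Ångström exponent, which follows from aerosol optical thickness at two wavelengths; the wavelengths and interpretation thresholds below are illustrative, not necessarily the algorithm's actual channels:

```python
import numpy as np

def angstrom_exponent(tau1, tau2, lam1, lam2):
    """Angstrom exponent from aerosol optical thickness at two wavelengths:
    alpha = -ln(tau1/tau2) / ln(lam1/lam2).
    Roughly, alpha near or above 1 indicates fine-mode particles
    (e.g., pollution); alpha well below 1 suggests coarse particles
    such as dust, which is the contrast Deep Blue exploits.
    """
    return -np.log(tau1 / tau2) / np.log(lam1 / lam2)

# fine-mode-like case: optical thickness halves from 0.44 to 0.87 micrometers
alpha = angstrom_exponent(0.50, 0.25, 0.44, 0.87)
```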

642 citations


Journal ArticleDOI
TL;DR: The results of this study indicate that the current MODIS GPP algorithm shows reasonable spatial patterns and temporal variability across a diverse range of biomes and climate regimes.
Abstract: The Moderate Resolution Imaging Spectroradiometer (MODIS) sensor has provided near real-time estimates of gross primary production (GPP) since March 2000. We compare four years (2000 to 2003) of satellite-based calculations of GPP with tower eddy CO2 flux-based estimates across diverse land cover types and climate regimes. We examine the potential error contributions from meteorology, leaf area index (LAI)/fPAR, and land cover. The error between annual GPP computed from NASA's Data Assimilation Office's (DAO) and tower-based meteorology is 28%, indicating that NASA's DAO global meteorology plays an important role in the accuracy of the GPP algorithm. Approximately 62% of MOD15-based estimates of LAI were within the estimates based on field optical measurements, although the remaining values overestimated site values. Land cover contributed the fewest errors, with most misclassifications occurring among the forest classes, which limits their impact. Tower-based and MODIS estimates of annual GPP compare favorably for most biomes, although MODIS GPP overestimates tower-based calculations by 20%-30%. Seasonally, summer estimates of MODIS GPP are closest to tower data, and spring estimates are the worst, most likely the result of the relatively rapid onset of leaf-out. The results of this study indicate, however, that the current MODIS GPP algorithm shows reasonable spatial patterns and temporal variability across a diverse range of biomes and climate regimes. So, while continued efforts are needed to isolate particular problems in specific biomes, we are optimistic about the general quality of these data, and continuation of the MOD17 GPP product will likely provide a key component of global terrestrial ecosystem analysis, providing continuous weekly measurements of global vegetation production.
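The MOD17 GPP algorithm is a light-use-efficiency model; the sketch below shows its structure with hypothetical ramp breakpoints standing in for the biome-specific look-up-table values:

```python
def ramp(x, lo, hi):
    """Linear 0-1 ramp: 0 at or below lo, 1 at or above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def gpp(par, fpar, eps_max, tmin_c, vpd_pa):
    """Light-use-efficiency GPP: eps_max * f(Tmin) * f(VPD) * fPAR * PAR.
    The ramp breakpoints below are illustrative assumptions; MOD17 takes
    eps_max and the scalar limits from a biome-specific look-up table.
    """
    f_tmin = ramp(tmin_c, -8.0, 10.0)           # cold-stress scalar
    f_vpd = 1.0 - ramp(vpd_pa, 650.0, 4000.0)   # dryness-stress scalar
    return eps_max * f_tmin * f_vpd * fpar * par

# mild day with no temperature or VPD stress
g = gpp(par=10.0, fpar=0.8, eps_max=1.2, tmin_c=12.0, vpd_pa=500.0)
```

The abstract's error attribution maps directly onto the factors above: meteorology enters through PAR, Tmin, and VPD; MOD15 enters through fPAR; land cover enters through the look-up parameters.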

628 citations


Journal ArticleDOI
TL;DR: This paper presents an independent component analysis (ICA) approach to DR, to be called ICA-DR which uses mutual information as a criterion to measure data statistical independency that exceeds second-order statistics.
Abstract: In hyperspectral image analysis, the principal components analysis (PCA) and the maximum noise fraction (MNF) are the most commonly used techniques for dimensionality reduction (DR), referred to as PCA-DR and MNF-DR, respectively. The criteria used by the PCA-DR and the MNF-DR are data variance and signal-to-noise ratio (SNR), which are designed to measure data second-order statistics. This paper presents an independent component analysis (ICA) approach to DR, called ICA-DR, which uses mutual information as a criterion to measure data statistical independency, going beyond second-order statistics. As a result, the ICA-DR can capture information that cannot be retained or preserved by second-order statistics-based DR techniques. In order for the ICA-DR to perform effectively, the virtual dimensionality (VD) is introduced to estimate the number of dimensions that need to be retained, as opposed to the energy percentage that has been used by the PCA-DR and MNF-DR to determine energies contributed by signal sources and noise. Since there is no prioritization among components generated by the ICA-DR, due to the use of random initial projection vectors, we further develop criteria and algorithms to measure the significance of the information contained in each ICA-generated component for component prioritization. Finally, a comparative study and analysis is conducted among the three DR techniques, PCA-DR, MNF-DR, and ICA-DR, in two applications, endmember extraction and data compression, where the proposed ICA-DR has been shown to provide advantages over the PCA-DR and MNF-DR.
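For contrast with the proposed ICA-DR, the second-order baseline (PCA-DR) fits in a few lines of numpy; this is the variance criterion the paper argues is insufficient, since independence involves higher-order statistics that eigenanalysis of the covariance cannot see:

```python
import numpy as np

def pca_dr(pixels, q):
    """Second-order dimensionality reduction (PCA-DR): project the
    mean-removed data onto the q eigenvectors of the covariance matrix
    with the largest eigenvalues.  `pixels` is (n_samples, n_bands).
    ICA-DR replaces this variance criterion with statistical
    independence, which this baseline cannot capture.
    """
    x = pixels - pixels.mean(axis=0)
    cov = x.T @ x / (x.shape[0] - 1)
    evals, evecs = np.linalg.eigh(cov)                 # ascending order
    top = evecs[:, np.argsort(evals)[::-1][:q]]        # leading q axes
    return x @ top

rng = np.random.default_rng(0)
# 200 "pixels", 5 "bands": nearly all variance lives in the first two bands
data = rng.normal(size=(200, 5)) * np.array([5.0, 3.0, 0.1, 0.1, 0.1])
reduced = pca_dr(data, 2)
```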

594 citations


Journal ArticleDOI
TL;DR: A novel modified TSVM classifier designed for addressing ill-posed remote-sensing problems is proposed that is able to mitigate the effects of suboptimal model selection and can address multiclass cases.
Abstract: This paper introduces a semisupervised classification method that exploits both labeled and unlabeled samples for addressing ill-posed problems with support vector machines (SVMs). The method is based on recent developments in statistical learning theory concerning transductive inference and in particular transductive SVMs (TSVMs). TSVMs exploit specific iterative algorithms which gradually search for a reliable separating hyperplane (in the kernel space) with a transductive process that incorporates both labeled and unlabeled samples in the training phase. Based on an analysis of the properties of the TSVMs presented in the literature, a novel modified TSVM classifier designed for addressing ill-posed remote-sensing problems is proposed. In particular, the proposed technique: 1) is based on a novel transductive procedure that exploits a weighting strategy for unlabeled patterns, based on a time-dependent criterion; 2) is able to mitigate the effects of suboptimal model selection (which is unavoidable in the presence of small-size training sets); and 3) can address multiclass cases. Experimental results confirm the effectiveness of the proposed method on a set of ill-posed remote-sensing classification problems representing different operative conditions.

560 citations


Journal ArticleDOI
TL;DR: The proposed simplex growing algorithm (SGA) improves one commonly used EEA, the N-finder algorithm (N-FINDR) developed by Winter, by including a process of growing simplexes one vertex at a time until it reaches a desired number of vertices estimated by the VD, which results in a tremendous reduction of computational complexity.
Abstract: A new growing method for simplex-based endmember extraction algorithms (EEAs), called the simplex growing algorithm (SGA), is presented in this paper. It is a sequential algorithm that finds the simplex with the maximum volume every time a new vertex is added. In order to terminate this algorithm, a recently developed concept, virtual dimensionality (VD), is implemented as a stopping rule to determine the number of vertices the algorithm must generate. The SGA improves one commonly used EEA, the N-finder algorithm (N-FINDR) developed by Winter, by growing simplexes one vertex at a time until a desired number of vertices, estimated by the VD, is reached, which results in a tremendous reduction of computational complexity. Additionally, it judiciously selects an appropriate initial vector to avoid the dilemma caused by the use of random initial vectors in the N-FINDR, which generally produces different sets of final endmembers when different sets of randomly generated initial endmembers are used. In order to demonstrate the performance of the proposed SGA, the N-FINDR and two other EEAs, the pixel purity index and vertex component analysis, are used for comparison.
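A simplified sketch of the growth step, using the Gram-determinant formula for simplex volume; the initialization here (farthest pair) is an illustrative stand-in for the SGA's actual initial-vector selection, and VD-based termination is replaced by an explicit vertex count:

```python
import numpy as np
from math import factorial

def simplex_volume(vertices):
    """Volume of a k-simplex: sqrt(det(A^T A)) / k!, where the columns
    of A are the edge vectors from the first vertex."""
    v = np.asarray(vertices, dtype=float)
    a = (v[1:] - v[0]).T
    k = a.shape[1]
    gram = a.T @ a
    return np.sqrt(max(np.linalg.det(gram), 0.0)) / factorial(k)

def grow_simplex(points, n_vertices):
    """Greedy SGA-style growth: start from the farthest pair of points,
    then repeatedly add the point that maximizes the simplex volume."""
    pts = [tuple(p) for p in points]
    i, j = max(((i, j) for i in range(len(pts)) for j in range(i + 1, len(pts))),
               key=lambda ij: np.linalg.norm(np.subtract(pts[ij[0]], pts[ij[1]])))
    chosen = [pts[i], pts[j]]
    while len(chosen) < n_vertices:
        cand = max((p for p in pts if p not in chosen),
                   key=lambda p: simplex_volume(chosen + [p]))
        chosen.append(cand)
    return chosen

# toy "image": square corners (pure pixels) plus an interior mixture
corners_plus_center = [(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.5)]
verts = grow_simplex(corners_plus_center, 3)
```

The interior point can never enlarge the simplex, so the greedy growth keeps only extreme points, which is exactly the endmember intuition.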

487 citations


Journal ArticleDOI
TL;DR: The Ozone Monitoring Instrument (OMI) flies on NASA's Earth Observing System AURA satellite, launched in July 2004, and will provide near-real-time data for operational agencies in Europe and the U.S.
Abstract: The Ozone Monitoring Instrument (OMI) flies on NASA's Earth Observing System AURA satellite, launched in July 2004. OMI is an ultraviolet/visible (UV/VIS) nadir solar backscatter spectrometer, which provides nearly global coverage in one day, with a spatial resolution of 13 km × 24 km. Trace gases measured include O3, NO2, SO2, HCHO, BrO, and OClO. In addition, OMI measures aerosol characteristics, cloud top heights and cloud coverage, and UV irradiance at the surface. OMI's unique capabilities for measuring important trace gases with daily global coverage and a small footprint will make a major contribution to our understanding of stratospheric and tropospheric chemistry and climate change along with Aura's other three instruments. OMI's high spatial resolution enables detection of air pollution at urban scales. Total Ozone Mapping Spectrometer and differential optical absorption spectroscopy heritage algorithms, as well as new ones developed by the international (Dutch, Finnish, and U.S.) OMI science team, are used to derive OMI's advanced backscatter data products. In addition to providing data for Aura's prime objectives, OMI will provide near-real-time data for operational agencies in Europe and the U.S. Examples of OMI's unique capabilities are presented in this paper.

477 citations


Journal ArticleDOI
TL;DR: A new approach to band selection is developed, referred to as constrained band selection (CBS) for hyperspectral imagery, which interprets a band image as a desired target signature vector while considering other band images as unknown signature vectors.
Abstract: Constrained energy minimization (CEM) has been shown to be effective in hyperspectral target detection. It linearly constrains a desired target signature while minimizing interfering effects caused by other unknown signatures. This paper explores this idea for band selection and develops a new approach, referred to as constrained band selection (CBS), for hyperspectral imagery. It interprets a band image as a desired target signature vector while considering other band images as unknown signature vectors. As a result, the proposed CBS, which uses the concept of the CEM to linearly constrain a band image while also minimizing the band correlation or dependence provided by other band images, is referred to as CEM-CBS. Four different criteria, referred to as Band Correlation Minimization (BCM), Band Correlation Constraint (BCC), Band Dependence Constraint (BDC), and Band Dependence Minimization (BDM), are derived for CEM-CBS. Since the dimensionality resulting from conversion of a band image to a vector may be huge, the CEM-CBS is further reinterpreted as linearly constrained minimum variance (LCMV)-based CBS by constraining a band image as a matrix, where the same four criteria, BCM, BCC, BDC, and BDM, can also be used for LCMV-CBS. In order to determine the number of bands, p, that must be selected, a recently developed concept, called virtual dimensionality, is used to estimate p. Once p is determined, a set of p desired bands can be selected by the CEM/LCMV-CBS. Finally, experiments are conducted to substantiate the four proposed CEM/LCMV-CBS criteria, BCM, BCC, BDC, and BDM, in comparison with variance-based band selection, information divergence-based band selection, and uniform band selection.
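The underlying CEM operator, which CBS reinterprets for band images, is compact in numpy once the sample correlation matrix is formed; the data here are synthetic and the regularization term is a practical assumption, not part of the paper's formulation:

```python
import numpy as np

def cem_filter(pixels, d, eps=1e-6):
    """Constrained energy minimization filter: w = R^-1 d / (d^T R^-1 d),
    where R is the sample correlation matrix of the pixel vectors.
    The filter passes the desired signature d with unit gain (w^T d = 1)
    while minimizing the average output energy from everything else.
    A small ridge term eps keeps the inversion well conditioned.
    """
    x = np.asarray(pixels, dtype=float)       # (n_pixels, n_bands)
    r = x.T @ x / x.shape[0]                  # sample correlation matrix R
    r = r + eps * np.eye(r.shape[0])
    rinv_d = np.linalg.solve(r, d)
    return rinv_d / (d @ rinv_d)

rng = np.random.default_rng(1)
background = rng.normal(size=(500, 4))        # synthetic background pixels
target = np.array([1.0, 0.0, 1.0, 0.0])      # hypothetical target signature
w = cem_filter(background, target)
```

The unit-gain constraint w·d = 1 holds by construction, which is the property the CBS criteria carry over to band images.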

Journal ArticleDOI
TL;DR: This paper proposes a classification system based on a genetic optimization framework formulated in such a way as to detect the best discriminative features without requiring the a priori setting of their number by the user and to estimate the best SVM parameters in a completely automatic way.
Abstract: Recent remote sensing literature has shown that support vector machine (SVM) methods generally outperform traditional statistical and neural methods in classification problems involving hyperspectral images. However, there are still open issues that, if suitably addressed, could allow further improvement of their performances in terms of classification accuracy. Two especially critical issues are: 1) the determination of the most appropriate feature subspace where to carry out the classification task and 2) model selection. In this paper, these two issues are addressed through a classification system that optimizes the SVM classifier accuracy for this kind of imagery. This system is based on a genetic optimization framework formulated in such a way as to detect the best discriminative features without requiring the a priori setting of their number by the user and to estimate the best SVM parameters (i.e., regularization and kernel parameters) in a completely automatic way. For these purposes, it exploits fitness criteria intrinsically related to the generalization capabilities of SVM classifiers. In particular, two criteria are explored, namely: 1) the simple support vector count and 2) the radius margin bound. The effectiveness of the proposed classification system in general and of these two criteria in particular is assessed both by simulated and real experiments. In addition, a comparison with classification approaches based on three different feature selection methods is reported, i.e., the steepest ascent (SA) algorithm and two other methods explicitly developed for SVM classifiers, namely: 1) the recursive feature elimination technique and 2) the radius margin bound minimization method.
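The genetic search itself can be sketched generically. In the paper's system the chromosome encodes the selected features plus the SVM regularization and kernel parameters, and the fitness is the support vector count or the radius-margin bound; here a toy quadratic fitness and a two-parameter chromosome stand in so the search logic is runnable on its own:

```python
import numpy as np

def genetic_search(fitness, lo, hi, pop=30, gens=60, seed=0):
    """Minimal real-coded genetic algorithm: elitist survival of the top
    half, blend (midpoint) crossover, and Gaussian mutation, maximizing
    `fitness` over the box [lo, hi].  A sketch of the search framework
    only; operator choices here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(pop, len(lo)))
    for _ in range(gens):
        f = np.array([fitness(ind) for ind in x])
        parents = x[np.argsort(f)[::-1][: pop // 2]]   # keep the fittest half
        kids = []
        while len(kids) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = (a + b) / 2 + rng.normal(0.0, 0.1, size=a.shape)
            kids.append(np.clip(child, lo, hi))
        x = np.vstack([parents, kids])
    f = np.array([fitness(ind) for ind in x])
    return x[np.argmax(f)]

# toy "model selection": pretend the best (C, gamma) is (3.0, 0.5)
lo, hi = np.array([0.0, 0.0]), np.array([10.0, 2.0])
best = genetic_search(lambda p: -((p[0] - 3.0) ** 2 + (p[1] - 0.5) ** 2), lo, hi)
```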

Journal ArticleDOI
Myungjin Choi
TL;DR: The author uses a tradeoff parameter in a new approach to image fusion based on fast IHS fusion that enables fast, easy implementation and provides a satisfactory result, both visually and quantitatively.
Abstract: A useful technique in various applications of remote sensing involves the fusion of panchromatic and multispectral satellite images. Recently, Tu et al. introduced a fast intensity-hue-saturation (IHS) fusion method. Aside from its fast computing capability for fusing images, this method can extend traditional three-order transformations to an arbitrary order. It can also quickly merge massive volumes of data by requiring only resampled multispectral data. However, fast IHS fusion also distorts color in the same way as fusion processes such as the IHS fusion technique. To overcome this problem, the minimization problem for a fast IHS method was considered, and the method proposed by González-Audícana et al. is presented as a solution. However, the method is not efficient enough to quickly merge massive volumes of data from satellite images. The author therefore uses a tradeoff parameter in a new approach to image fusion based on fast IHS fusion. This approach enables fast, easy implementation. Furthermore, the tradeoff between the spatial and spectral resolution of the image to be fused can be easily controlled with the aid of the tradeoff parameter. Therefore, with an appropriate tradeoff parameter, the new approach provides a satisfactory result, both visually and quantitatively.
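The tradeoff-parameter idea reduces to simple band arithmetic. In this sketch the intensity is a plain band average, a simplification of the sensor-specific weightings used in practice, and the scaling of the pan-minus-intensity detail by t is the assumed form of the tradeoff:

```python
import numpy as np

def fast_ihs_fuse(ms_bands, pan, t=1.0):
    """Fast-IHS-style fusion sketch: add the difference between the
    panchromatic image and the intensity component to every
    multispectral band, scaled by a tradeoff parameter t.
    t = 1 gives full spatial detail (more spectral distortion);
    t = 0 returns the multispectral input unchanged.
    """
    ms = np.asarray(ms_bands, dtype=float)    # (bands, rows, cols)
    intensity = ms.mean(axis=0)               # simplified intensity component
    return ms + t * (pan - intensity)

ms = np.array([[[0.2, 0.4]], [[0.4, 0.6]], [[0.6, 0.8]]])  # 3 bands, 1x2 image
pan = np.array([[0.5, 0.7]])
fused = fast_ihs_fuse(ms, pan, t=1.0)
```

Intermediate values of t interpolate between spectral fidelity and spatial sharpness, which is the control the abstract describes.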

Journal ArticleDOI
TL;DR: In this paper, the authors describe the main components of this international validation effort, including the current participants, their ground LAI measurements and scaling techniques, and the metadata and infrastructure established to share data.
Abstract: Initiated in 1984, the Committee on Earth Observation Satellites' Working Group on Calibration and Validation (CEOS WGCV) pursues activities to coordinate, standardize, and advance calibration and validation of civilian satellites and their data. One subgroup of CEOS WGCV, Land Product Validation (LPV), was established in 2000 to define standard validation guidelines and protocols and to foster data and information exchange relevant to the validation of land products. Since then, a number of leaf area index (LAI) products have become available to the science community at both global and regional extents. Having multiple global LAI products and multiple, disparate validation activities related to these products presents the opportunity to realize efficiency through international collaboration. The LPV subgroup therefore established an international LAI intercomparison validation activity. This paper describes the main components of this international validation effort. The paper documents the current participants, their ground LAI measurements and scaling techniques, and the metadata and infrastructure established to share data. The paper concludes by describing plans for sharing both field data and high-resolution LAI products from each site. Many considerations of this global LAI intercomparison can apply to other products, and this paper presents a framework for such collaboration.

Journal ArticleDOI
TL;DR: A simple method for estimating water depths from multispectral imagery is described and is applied to several IKONOS data sets for which independent measurements of the water depth are available.
Abstract: A simple method for estimating water depths from multispectral imagery is described and is applied to several IKONOS data sets for which independent measurements of the water depth are available. The methodology is based on a physical model for the shallow-water reflectance, and is capable of correcting for at least some range of water-quality and bottom-reflectance variations. Corrections for sun-glint effects are applied prior to the application of the bathymetry algorithm. The accuracy of the depth algorithm is determined by comparison with ground-truth measurements, and comparisons between the observed and calculated radiances are presented for one case to illustrate how the algorithm corrects for water-attenuation and/or bottom-reflectance variations.
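One common physically based shallow-water form (after Lyzenga) is log-linear in the water-leaving radiances, so depth retrieval reduces to a least-squares fit against ground truth; the model form, synthetic data, and coefficients below are illustrative assumptions, not necessarily the paper's exact formulation:

```python
import numpy as np

def fit_depth_model(log_bands, depths):
    """Fit z = a0 + a1*X1 + ... + an*Xn by least squares, where each
    Xi = ln(R_i - R_inf) is a log-transformed water-leaving radiance
    (deep-water radiance R_inf subtracted).  This log-linear form is a
    common shallow-water bathymetry model; coefficients are calibrated
    against independent depth measurements.
    """
    a = np.column_stack([np.ones(len(depths)), log_bands])
    coef, *_ = np.linalg.lstsq(a, depths, rcond=None)
    return coef

# synthetic truth: z = 1.0 - 4.0*X1 + 2.0*X2 (made-up coefficients)
rng = np.random.default_rng(2)
x = rng.normal(size=(100, 2))                 # stand-in log-band values
z = 1.0 - 4.0 * x[:, 0] + 2.0 * x[:, 1]
coef = fit_depth_model(x, z)
```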

Journal ArticleDOI
TL;DR: Results are reported that demonstrate the improved performance and reduction in the false-alarm rate when using the SVDD-based detector on wide-area airborne mine detection and hyperspectral digital imagery collection experiment (HYDICE) imagery.
Abstract: This paper presents a method for anomaly detection in hyperspectral images based on the support vector data description (SVDD), a kernel method for modeling the support of a distribution. Conventional anomaly-detection algorithms are based upon the popular Reed-Xiaoli detector. However, these algorithms typically suffer from large numbers of false alarms due to the assumptions that the local background is Gaussian and homogeneous. In practice, these assumptions are often violated, especially when the neighborhood of a pixel contains multiple types of terrain. To remove these assumptions, a novel anomaly detector that incorporates a nonparametric background model based on the SVDD is derived. Expanding on prior SVDD work, a geometric interpretation of the SVDD is used to propose a decision rule that utilizes a new test statistic and shares some of the properties of constant false-alarm rate detectors. Using receiver operating characteristic curves, the authors report results that demonstrate the improved performance and reduction in the false-alarm rate when using the SVDD-based detector on wide-area airborne mine detection (WAAMD) and hyperspectral digital imagery collection experiment (HYDICE) imagery.
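A crude stand-in for the SVDD decision rule, an enclosing ball around the background pixels with a distance-minus-radius test statistic, illustrates the shape of the detector; the real SVDD solves a quadratic program and, with kernels, the boundary need not be a sphere in the input space:

```python
import numpy as np

def ball_anomaly_scores(background, pixels, quantile=0.99):
    """Toy SVDD-like detector: model the background support as a ball
    (centroid plus an empirical radius quantile) and score test pixels
    by their distance beyond that radius.  Positive score = anomalous.
    This only sketches the geometry of the decision rule; it is not the
    paper's kernelized, QP-based SVDD.
    """
    center = background.mean(axis=0)
    radii = np.linalg.norm(background - center, axis=1)
    radius = np.quantile(radii, quantile)
    dist = np.linalg.norm(pixels - center, axis=1)
    return dist - radius

rng = np.random.default_rng(3)
bg = rng.normal(0.0, 1.0, size=(1000, 5))            # synthetic background
tests = np.vstack([np.zeros(5), np.full(5, 10.0)])   # typical vs. outlier pixel
scores = ball_anomaly_scores(bg, tests)
```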

Journal ArticleDOI
TL;DR: Experimental results and comparisons with a standard technique developed for the analysis of very high spatial resolution images confirm the effectiveness of the proposed pixel-based classification system.
Abstract: This paper proposes a novel pixel-based system for the supervised classification of very high geometrical (spatial) resolution images. This system is aimed at obtaining accurate and reliable maps both by preserving the geometrical details in the images and by properly considering the spatial-context information. It is made up of two main blocks: 1) a novel feature-extraction block that, extending and developing some concepts previously presented in the literature, adaptively models the spatial context of each pixel according to a complete hierarchical multilevel representation of the scene and 2) a classifier, based on support vector machines (SVMs), capable of analyzing hyperdimensional feature spaces. The choice of adopting an SVM-based classification architecture is motivated by the potentially large number of parameters derived from the contextual feature-extraction stage. Experimental results and comparisons with a standard technique developed for the analysis of very high spatial resolution images confirm the effectiveness of the proposed system.

Journal ArticleDOI
TL;DR: The statistical and correlation analyses demonstrate that due to the similarity in their overall variance, it is not necessary to choose between the longer time series of AVHRR and the higher quality of the more modern sensors.
Abstract: This paper evaluates the consistency of the Normalized Difference Vegetation Index (NDVI) records derived from Advanced Very High Resolution Radiometer (AVHRR), SPOT-Vegetation, SeaWiFS, Moderate Resolution Imaging Spectroradiometer, and Landsat ETM+. We used independently derived NDVI from atmospherically corrected ETM+ data at 13 Earth Observation System Land Validation core sites, eight locations of drought, and globally aggregated one-degree data from the four coarse resolution sensors to assess the NDVI records' agreement. The objectives of this paper are to: 1) compare the absolute and relative differences of the vegetation signal across these sensors from a user perspective, and, to a lesser degree, 2) evaluate the possibility of merging the AVHRR historical data record with that of the more modern sensors in order to provide historical perspective on current vegetation activities. The statistical and correlation analyses demonstrate that due to the similarity in their overall variance, it is not necessary to choose between the longer time series of AVHRR and the higher quality of the more modern sensors. The long-term AVHRR-NDVI record provides a critical historical perspective on vegetation activities necessary for global change research and, thus, should be the basis of an intercalibrated, sensor-independent NDVI data record. This paper suggests that continuity is achievable given the similarity between these datasets.
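The index under comparison is itself a one-line formula; each sensor realizes it with its own red and near-infrared band definitions, which is precisely why cross-sensor consistency must be evaluated. The reflectance values below are illustrative:

```python
def ndvi(nir, red, eps=1e-12):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values approach 1 for dense green vegetation and hover near 0 for
    bare surfaces; eps guards against division by zero over dark pixels.
    """
    return (nir - red) / (nir + red + eps)

dense_canopy = ndvi(0.50, 0.08)   # strongly vegetated pixel
bare_soil = ndvi(0.25, 0.20)      # sparsely vegetated pixel
```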

Journal ArticleDOI
TL;DR: Aura, the last of the large Earth Observing System observatories, was launched on July 15, 2004; with the exception of HIRDLS, all of the instruments are performing as expected, and HIRDLS will likely be able to deliver most of its planned data products.
Abstract: Aura, the last of the large Earth Observing System observatories, was launched on July 15, 2004. Aura is designed to make comprehensive stratospheric and tropospheric composition measurements from its four instruments, the High Resolution Dynamics Limb Sounder (HIRDLS), the Microwave Limb Sounder (MLS), the Ozone Monitoring Instrument (OMI), and the Tropospheric Emission Spectrometer (TES). With the exception of HIRDLS, all of the instruments are performing as expected, and HIRDLS will likely be able to deliver most of its planned data products. We summarize the mission, instruments, and synergies in this paper.

Journal ArticleDOI
TL;DR: The presented results confirm the effectiveness of the extended MCF unwrapping algorithm, which exploits both the spatial characteristics and the temporal relationships among multiple interferograms relevant to a properly chosen sequence.
Abstract: In this paper, an extension of the minimum cost flow (MCF) algorithm dealing with a sparse data grid, which allows the unwrapping of multitemporal differential synthetic aperture radar (SAR) interferograms for the generation of deformation time series, is presented. The proposed approach exploits both the spatial characteristics and the temporal relationships among multiple interferograms relevant to a properly chosen sequence. In particular, the presented solution involves two main steps: first of all, for each arc connecting neighboring pixels on the interferometric azimuth/range grid, the unwrapped phase gradients are estimated via the MCF technique applied in the temporal/perpendicular baseline plane. Following this step, these estimates are used as a starting point for the spatial-unwrapping operation implemented again via the MCF approach but carried out in the azimuth/range plane. The presented results, achieved on simulated and real European Remote Sensing satellite SAR data, confirm the effectiveness of the extended MCF unwrapping algorithm.
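What "unwrapping" means is easiest to see in one dimension: integrate phase differences after rewrapping each into (-pi, pi]. The sketch below shows only this 1-D case; the paper's problem is the much harder two-dimensional and temporal version on a sparse grid, which the MCF formulation solves as a network-flow optimization:

```python
import numpy as np

def unwrap_1d(phase):
    """Unwrap a 1-D wrapped phase signal: wrap each successive
    difference into the interval (-pi, pi], then integrate.  Valid when
    the true phase changes by less than pi between samples."""
    d = np.diff(phase)
    d = (d + np.pi) % (2 * np.pi) - np.pi      # rewrap the gradients
    return np.concatenate([[phase[0]], phase[0] + np.cumsum(d)])

true_phase = np.linspace(0.0, 6 * np.pi, 50)           # smooth phase ramp
wrapped = (true_phase + np.pi) % (2 * np.pi) - np.pi   # what the sensor sees
recovered = unwrap_1d(wrapped)
```

In 2-D, inconsistencies between such integrated gradients along different paths (residues) are what the MCF network-flow step resolves at minimum cost.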

Journal ArticleDOI
TL;DR: An automatic property-based approach for the detection and compensation of shadow regions with shape information preserved in complex urban color aerial images for solving problems caused by cast shadows in digital image mapping.
Abstract: In urban color aerial images, shadows cast by cultural features may cause false color tone, loss of feature information, shape distortion of objects, and failure of conjugate image matching within the shadow area. This paper presents an automatic property-based approach for the detection and compensation of shadow regions with shape information preserved in complex urban color aerial images for solving problems caused by cast shadows in digital image mapping. The technique is applied in several invariant color spaces that decouple luminance and chromaticity, including HSI, HSV, HCV, YIQ, and YC/sub b/C/sub r/ models. Experimental results from de-shadowing color aerial images of a complex building and a highway segment in these color models are evaluated in terms of visual comparisons and shadow detection accuracy assessments. The results show the effectiveness of the proposed approach in revealing details under shadows and the suitability of these color models in de-shadowing urban color aerial images.
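A toy version of property-based shadow detection, using only a luminance test (shadows keep chromaticity but lose intensity); the threshold and the rule itself are illustrative assumptions, whereas the paper applies more careful criteria in several invariant color spaces (HSI, HSV, HCV, YIQ, YCbCr):

```python
import numpy as np

def shadow_mask(rgb, intensity_thresh=0.3):
    """Flag shadow pixels by thresholding the intensity component
    (I of HSI, here the plain RGB mean).  A minimal sketch: the full
    property-based approach also examines the chromaticity components
    that shadows leave roughly unchanged.
    """
    rgb = np.asarray(rgb, dtype=float)
    intensity = rgb.mean(axis=-1)
    return intensity < intensity_thresh

image = np.array([[[0.8, 0.7, 0.6],     # sunlit pixel
                   [0.1, 0.1, 0.15]]])  # shadowed pixel
mask = shadow_mask(image)
```

The compensation stage described in the abstract would then relight only the masked pixels, preserving the shape information inside the shadow regions.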

Journal ArticleDOI
TL;DR: This paper summarizes the experience of several collaborating investigators on validation of MODIS LAI products and identifies three key factors that influence the accuracy of LAI retrievals, including uncertainties from the model used to build the look-up tables accompanying the algorithm.
Abstract: Global products of vegetation green Leaf Area Index (LAI) and Fraction of Photosynthetically Active Radiation absorbed by vegetation (FPAR) are being operationally produced from Terra and Aqua Moderate Resolution Imaging Spectroradiometers (MODIS) at 1-km resolution and eight-day frequency. This paper summarizes the experience of several collaborating investigators on validation of MODIS LAI products and demonstrates the close connection between product validation and algorithm refinement activities. The validation of moderate resolution LAI products includes three steps: 1) field sampling representative of LAI spatial distribution and dynamic range within each major land cover type at the validation site; 2) development of a transfer function between field LAI measurements and high-resolution satellite data to generate a reference LAI map over an extended area; and 3) comparison of MODIS LAI with the aggregated reference LAI map at patch (multipixel) scale in view of geolocation and pixel-shift uncertainties. The MODIS LAI validation experiences, summarized here, suggest three key factors that influence the accuracy of LAI retrievals: 1) uncertainties in input land cover data, 2) uncertainties in input surface reflectances, and 3) uncertainties from the model used to build the look-up tables accompanying the algorithm. This strategy of validation efforts guiding algorithm refinements has led to progressively more accurate LAI products from the MODIS sensors aboard NASA's Terra and Aqua platforms.
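Step 3 compares MODIS LAI with the reference map aggregated to patch (multipixel) scale, which reduces sensitivity to geolocation and pixel-shift errors. A minimal sketch of such block aggregation, using a hypothetical 8x8 reference map and 4x4-pixel patches (not the validation code itself):

```python
import numpy as np

def aggregate(fine, factor):
    """Block-average a fine-resolution map to a coarser patch scale.
    Assumes map dimensions are exact multiples of the factor."""
    h, w = fine.shape
    return fine.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Hypothetical fine-resolution reference LAI map, aggregated to 2x2 patches
rng = np.random.default_rng(0)
reference_lai = rng.uniform(0.5, 6.0, size=(8, 8))
patch_lai = aggregate(reference_lai, 4)
print(patch_lai.shape)  # (2, 2)
```

Each patch value is simply the mean of its 4x4 block, so the domain-wide mean LAI is preserved by the aggregation.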

Journal ArticleDOI
TL;DR: A method for accurate estimation of leaf area density and the cumulative leaf area index (LAI) profiles of small trees (Camellia sasanqua and Deutzia crenata) under different conditions was demonstrated, which used precise voxel-based tree models produced by high-resolution portable scanning lidar.
Abstract: A method for accurate estimation of leaf area density (LAD) and the cumulative leaf area index (LAI) profiles of small trees (Camellia sasanqua and Deutzia crenata) under different conditions was demonstrated, which used precise voxel-based tree models produced by high-resolution portable scanning lidar. In this voxel-based canopy profiling (VCP) method, data for each horizontal layer of the canopy of each tree were collected from symmetrical azimuthal measurement points around the tree using optimally inclined laser beams. The data were then converted into a voxel-based three-dimensional model that reproduced the tree precisely, including within the canopy. This precise voxel model allowed the LAD and LAI of these trees, which have extremely dense and nonrandomly distributed foliage, to be computed by direct counting of the beam-contact frequency in each layer using a point-quadrat method. Corrections for leaf inclination and nonphotosynthetic tissues reduced the estimation error. A beam incident zenith angle near 57.5° offered a good correction for leaf inclination without knowledge of the actual leaf inclination. Nonphotosynthetic tissues were removed by image-processing techniques. The best LAD estimations showed errors of 17% at the minimum horizontal layer thickness and of 0.7% at the maximum thickness. The error of the best LAI estimations was also 0.7%.
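The point-quadrat computation behind the method can be sketched as follows: with beams inclined near 57.5°, the leaf projection coefficient G is close to 0.5 regardless of leaf inclination, so LAD follows directly from the beam-contact frequency in each horizontal layer. All numbers below are hypothetical:

```python
def lad_from_contacts(n_contact, n_beams, layer_thickness_m, g_theta=0.5):
    """Point-quadrat estimate of leaf area density (m^2/m^3) from the
    fraction of laser beams contacting foliage in a horizontal layer.
    g_theta ~ 0.5 is the leaf projection coefficient near the 57.5 deg
    zenith angle, where it is nearly independent of leaf inclination."""
    contact_frequency = n_contact / n_beams
    return contact_frequency / (g_theta * layer_thickness_m)

# Hypothetical layer: 120 of 400 beams touch foliage in a 0.5-m-thick layer
lad = lad_from_contacts(120, 400, 0.5)
print(round(lad, 2))  # 1.2
```

Summing LAD times layer thickness over all layers then yields the cumulative LAI profile.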

Journal ArticleDOI
TL;DR: Subjective and objective performance analysis, including coherence edge detection, receiver operating characteristics plots, and bias reduction tables, recommends the proposed algorithm as an effective POL-InSAR postprocessing technique.
Abstract: In this paper, a new method to filter coherency matrices of polarimetric or interferometric data is presented. For each pixel, an adaptive neighborhood (AN) is determined by a region growing technique driven exclusively by the intensity image information. All the available intensity images of the polarimetric and interferometric terms are fused in the region growing process to ensure the validity of the stationarity assumption. Afterward, all the pixels within the obtained AN are used to yield the filtered values of the polarimetric and interferometric coherency matrices, which can be derived either by direct complex multilooking or from the locally linear minimum mean-squared error (LLMMSE) estimator. The entropy/alpha/anisotropy decomposition is then applied to the estimated polarimetric coherency matrices, and coherence optimization is performed on the estimated polarimetric and interferometric coherency matrices. Using this decomposition, unsupervised classification for land applications by an iterative algorithm based on a complex Wishart density function is also applied. The method has been tested on airborne high-resolution polarimetric interferometric synthetic aperture radar (POL-InSAR) images (Oberpfaffenhofen area, German Space Agency). For comparison purposes, the two estimation techniques (complex multilooking and LLMMSE) were tested using three different spatial supports: a fixed-size symmetric neighborhood (boxcar filter), directional nonsymmetric windows, and the proposed AN. Subjective and objective performance analysis, including coherence edge detection, receiver operating characteristics plots, and bias reduction tables, recommends the proposed algorithm as an effective POL-InSAR postprocessing technique.
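A toy version of intensity-driven region growing for a single channel is sketched below; the paper fuses all polarimetric and interferometric intensity images and its admission criterion differs, so the tolerance rule and names here are hypothetical:

```python
import numpy as np
from collections import deque

def adaptive_neighborhood(intensity, seed, rel_tol=0.5, max_size=50):
    """Grow a connected neighborhood around a seed pixel, admitting
    4-neighbors whose intensity stays within rel_tol of the seed value.
    A stand-in for the intensity-driven region growing idea."""
    h, w = intensity.shape
    seed_val = intensity[seed]
    region = {seed}
    queue = deque([seed])
    while queue and len(region) < max_size:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in region:
                if abs(intensity[ny, nx] - seed_val) <= rel_tol * seed_val:
                    region.add((ny, nx))
                    queue.append((ny, nx))
    return region

# Toy scene: homogeneous clutter of 1.0 with a bright 2x2 target of 10.0
img = np.ones((6, 6))
img[2:4, 2:4] = 10.0
an = adaptive_neighborhood(img, (0, 0))
print(all(img[p] == 1.0 for p in an))  # True
```

Averaging the coherency matrices only over such a neighborhood keeps bright scatterers out of the estimate, which is the point of an adaptive (rather than boxcar) spatial support.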

Journal ArticleDOI
TL;DR: A new noise reduction algorithm is introduced and applied to the problem of denoising hyperspectral imagery, and provides signal-to-noise-ratio improvement up to 84.44% and 98.35% in the first and the second datacubes, respectively.
Abstract: In this paper, a new noise reduction algorithm is introduced and applied to the problem of denoising hyperspectral imagery. This algorithm resorts to the spectral derivative domain, where the noise level is elevated, and benefits from the dissimilarity of the signal regularity in the spatial and the spectral dimensions of hyperspectral images. The performance of the new algorithm is tested on two different hyperspectral datacubes: an Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) datacube that is acquired in a vegetation-dominated site and a simulated AVIRIS datacube that simulates a geological site. The new algorithm provides signal-to-noise-ratio improvement up to 84.44% and 98.35% in the first and the second datacubes, respectively.
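A rough sketch of working in the spectral derivative domain (differentiate along wavelength, denoise, integrate back) is given below, with a simple spatial mean filter standing in for the paper's actual estimator. Note that with a purely linear filter the round trip is equivalent to smoothing each band directly; the sketch illustrates only the transform, not the source of the paper's reported gains:

```python
import numpy as np

def spatial_mean(stack, kernel=3):
    """Spatial mean filter applied independently to each band of a
    (rows, cols, bands) array, with edge-replicate padding."""
    pad = kernel // 2
    padded = np.pad(stack, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(stack)
    for dy in range(kernel):
        for dx in range(kernel):
            out += padded[dy:dy + stack.shape[0], dx:dx + stack.shape[1], :]
    return out / (kernel * kernel)

def denoise_in_derivative_domain(cube):
    """Differentiate along the spectral axis, denoise the anchor band and
    each derivative band, then integrate back along wavelength."""
    anchor = spatial_mean(cube[..., :1])
    deriv = spatial_mean(np.diff(cube, axis=-1))
    return np.concatenate([anchor, anchor + np.cumsum(deriv, axis=-1)], axis=-1)

# Hypothetical flat-field cube: every pixel shares one smooth spectrum
rng = np.random.default_rng(1)
clean = np.tile(np.linspace(0.1, 0.9, 32), (16, 16, 1))
noisy = clean + 0.05 * rng.standard_normal(clean.shape)
denoised = denoise_in_derivative_domain(noisy)
print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))  # True
```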

Journal ArticleDOI
TL;DR: The image-ratioing approach to SAR change detection is adopted, and the Kittler and Illingworth minimum-error thresholding algorithm is generalized to take into account the non-Gaussian distribution of the amplitude values of SAR images.
Abstract: The availability of synthetic aperture radar (SAR) data offers great potential for environmental monitoring due to the insensitivity of SAR imagery to atmospheric and sunlight-illumination conditions. In addition, the short revisit time provided by future SAR-based missions will allow a huge amount of multitemporal SAR data to become systematically available for monitoring applications. In this paper, the problem of detecting the changes that occurred on the ground by analyzing SAR imagery is addressed by a completely unsupervised approach, i.e., by developing an automatic thresholding technique. The image-ratioing approach to SAR change detection is adopted, and the Kittler and Illingworth minimum-error thresholding algorithm is generalized to take into account the non-Gaussian distribution of the amplitude values of SAR images. In particular, a SAR-specific parametric modeling approach for the ratio image is proposed and integrated into the thresholding process. Experimental results, which confirm the accuracy of the method for real X-band SAR and spaceborne imaging radar C-band images, are presented.
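For reference, the classical Kittler-Illingworth minimum-error criterion under two Gaussian class models can be sketched as below on a histogram of (log-)ratio values. The paper's contribution is replacing the Gaussian class models with SAR-specific non-Gaussian ones, which this sketch does not do; the test data are synthetic:

```python
import numpy as np

def kittler_illingworth_threshold(values, bins=256):
    """Minimum-error threshold assuming two Gaussian class models
    (the classical Kittler-Illingworth criterion)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_cost = centers[0], np.inf
    for t in range(1, bins - 1):
        p1, p2 = p[:t].sum(), p[t:].sum()
        if p1 < 1e-9 or p2 < 1e-9:
            continue
        m1 = (p[:t] * centers[:t]).sum() / p1
        m2 = (p[t:] * centers[t:]).sum() / p2
        v1 = (p[:t] * (centers[:t] - m1) ** 2).sum() / p1
        v2 = (p[t:] * (centers[t:] - m2) ** 2).sum() / p2
        if v1 < 1e-9 or v2 < 1e-9:
            continue
        # J(t) = 1 + P1 log v1 + P2 log v2 - 2 (P1 log P1 + P2 log P2)
        cost = 1 + p1 * np.log(v1) + p2 * np.log(v2) \
                 - 2 * (p1 * np.log(p1) + p2 * np.log(p2))
        if cost < best_cost:
            best_cost, best_t = cost, centers[t]
    return best_t

rng = np.random.default_rng(2)
unchanged = rng.normal(0.0, 0.2, 5000)  # log-ratio of unchanged pixels
changed = rng.normal(2.0, 0.3, 500)     # log-ratio of changed pixels
t = kittler_illingworth_threshold(np.concatenate([unchanged, changed]))
print(0.5 < t < 1.8)  # threshold falls between the two modes
```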

Journal ArticleDOI
TL;DR: The Joint Research Centre of the European Commission (JRC), in partnership with 30 institutions, has produced a global land cover map for the year 2000, the GLC2000 map, and the validation of the product has now been completed.
Abstract: The Joint Research Centre of the European Commission (JRC), in partnership with 30 institutions, has produced a global land cover map for the year 2000, the GLC2000 map. The validation of the GLC2000 product has now been completed. The accuracy assessment relied on two methods: a confidence-building method (quality control based on a comparison with ancillary data) and a quantitative accuracy assessment based on a stratified random sampling of reference data. The sample site stratification used an underlying grid of Landsat data and was based on the proportion of priority land cover classes and on the landscape complexity. A total of 1265 sample sites have been interpreted. The first results indicate an overall accuracy of 68.6%. The GLC2000 validation exercise has provided important experiences. The design-based inference conforms to the CEOS Cal-Val recommendations and has proven to be successful. Both the GLC2000 legend development and the reference data interpretations used the FAO Land Cover Classification System (LCCS). Problems in the validation process were identified for areas with heterogeneous land cover. This issue appears both in the GLC2000 (neighborhood pixel variations) and in the reference data (cartographic and thematic mixed units). Another interesting outcome of the GLC2000 validation is the accuracy reporting. Error statistics are provided from both the producer and user perspectives and incorporate measures of thematic similarity between land cover classes derived from LCCS.
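The overall, producer's, and user's accuracies reported above all follow directly from a confusion matrix of reference versus mapped classes. A minimal sketch with hypothetical two-class counts:

```python
import numpy as np

def accuracy_report(confusion):
    """Overall, producer's (1 - omission) and user's (1 - commission)
    accuracy from a confusion matrix with reference classes in rows
    and mapped classes in columns."""
    confusion = np.asarray(confusion, dtype=float)
    overall = np.trace(confusion) / confusion.sum()
    producers = np.diag(confusion) / confusion.sum(axis=1)
    users = np.diag(confusion) / confusion.sum(axis=0)
    return overall, producers, users

# Hypothetical two-class sample counts (e.g., forest vs. non-forest)
cm = [[80, 20],
      [10, 90]]
overall, producers, users = accuracy_report(cm)
print(round(overall, 2))   # 0.85
print(producers.round(2))  # [0.8 0.9]
print(users.round(3))      # [0.889 0.818]
```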

Journal ArticleDOI
TL;DR: An OMI SO₂ algorithm (the band residual difference) that uses calibrated residuals at SO₂ absorption band centers produced by the NASA operational ozone algorithm (OMTO3) is described; it permits daily global measurement of passive volcanic degassing of SO₂ and of heavy anthropogenic SO₂ pollution, providing new information on the relative importance of these sources for climate studies.
Abstract: The Ozone Monitoring Instrument (OMI) on EOS/Aura offers unprecedented spatial and spectral resolution, coupled with global coverage, for space-based UV measurements of sulfur dioxide (SO₂). This paper describes an OMI SO₂ algorithm (the band residual difference) that uses calibrated residuals at SO₂ absorption band centers produced by the NASA operational ozone algorithm (OMTO3). By using optimum wavelengths for retrieval of SO₂, the retrieval sensitivity is improved over the NASA predecessor Total Ozone Mapping Spectrometer (TOMS) by factors of 10 to 20, depending on location. The ground footprint of OMI is eight times smaller than that of TOMS. These factors produce two orders of magnitude improvement in the minimum detectable mass of SO₂. Thus, the diffuse boundaries of volcanic clouds can be imaged better and the clouds can be tracked longer. More significantly, the improved sensitivity now permits daily global measurement of passive volcanic degassing of SO₂ and of heavy anthropogenic SO₂ pollution, providing new information on the relative importance of these sources for climate studies.

Journal ArticleDOI
TL;DR: This paper presents a new algorithm for the global retrieval of LAI in which the bidirectional reflectance distribution function (BRDF) is considered explicitly, removing the need for BRDF corrections and normalizations of the input images.
Abstract: Leaf area index (LAI) is one of the most important Earth surface parameters in modeling ecosystems and their interaction with climate. Based on a geometrical optical model (Four-Scale) and LAI algorithms previously derived for Canada-wide applications, this paper presents a new algorithm for the global retrieval of LAI in which the bidirectional reflectance distribution function (BRDF) is considered explicitly, thus removing the need for BRDF corrections and normalizations of the input images. The core problem of integrating the BRDF into the LAI algorithm is that the nonlinear BRDF kernels used to relate spectral reflectances to LAI are themselves LAI dependent, and no analytical solution exists to derive LAI directly from reflectance data. This problem is solved by developing a simple iteration procedure. The relationships between LAI and the reflectances of various spectral bands (red, near infrared, and shortwave infrared) are simulated with Four-Scale using a multiple scattering scheme. Based on the model simulations, the key coefficients in the BRDF kernels are fitted with Chebyshev polynomials of the second kind. Spectral indices, namely the simple ratio and the reduced simple ratio, are used to combine the spectral bands effectively for LAI retrieval. Example regional and global LAI maps are produced. Accuracy assessment on a Canada-wide LAI map is made in comparison with a previously validated 1998 LAI map and ground measurements made in seven Landsat scenes.
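The two spectral indices have standard closed forms, and the LAI-dependent kernels motivate a fixed-point iteration. The sketch below uses a made-up kernel relation purely to show such an iteration converging; it is not the Four-Scale/Chebyshev formulation, and all coefficients are hypothetical:

```python
def simple_ratio(nir, red):
    """SR: near-infrared over red reflectance."""
    return nir / red

def reduced_simple_ratio(nir, red, swir, swir_min, swir_max):
    """RSR: SR modulated by shortwave-infrared reflectance scaled by the
    scene SWIR extremes, reducing background and cover-type effects."""
    return simple_ratio(nir, red) * (1.0 - (swir - swir_min) / (swir_max - swir_min))

def retrieve_lai(rsr, gain=0.5, tol=1e-6, max_iter=100):
    """Fixed-point iteration: the kernel correction depends on LAI, so the
    LAI estimate is refined until it stabilizes (toy relation only)."""
    lai = 1.0
    for _ in range(max_iter):
        kernel = 1.0 + 0.1 / (1.0 + lai)  # hypothetical LAI-dependent kernel
        new_lai = gain * rsr * kernel
        if abs(new_lai - lai) < tol:
            return new_lai
        lai = new_lai
    return lai

rsr = reduced_simple_ratio(nir=0.45, red=0.05, swir=0.12, swir_min=0.05, swir_max=0.35)
print(round(rsr, 2))  # 6.9
lai = retrieve_lai(rsr)
```

Because the toy kernel changes slowly with LAI, the update is a contraction and the loop settles within a few iterations; the paper reports the same qualitative behavior for its iteration procedure.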

Journal ArticleDOI
TL;DR: The optimized algorithm for the retrieval of stratospheric, tropospheric, and total column densities of nitrogen dioxide from earthshine radiances measured by the Ozone Monitoring Instrument is described, and global maps of NO₂ VCDs are presented for the first time.
Abstract: We describe the operational algorithm for the retrieval of stratospheric, tropospheric, and total column densities of nitrogen dioxide (NO₂) from earthshine radiances measured by the Ozone Monitoring Instrument (OMI), aboard the EOS-Aura satellite. The algorithm uses the differential optical absorption spectroscopy (DOAS) method for the retrieval of slant column NO₂ densities. Air mass factors (AMFs) calculated from a stratospheric NO₂ profile are used to make initial estimates of the vertical column density. Using data collected over a 24-h period, a smooth estimate of the global stratospheric field is constructed. Where the initial vertical column densities exceed the estimated stratospheric field, we infer the presence of tropospheric NO₂ and recalculate the vertical column density (VCD) using an AMF calculated from an assumed tropospheric NO₂ profile. The parameters that control the operational algorithm were selected with the aid of a set of data assembled from stratospheric and tropospheric chemical transport models. We apply the optimized algorithm to OMI data and present global maps of NO₂ VCDs for the first time.
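The column arithmetic described above (slant column to vertical column via an air mass factor, with the excess over the stratospheric field reassigned to the troposphere) can be sketched schematically. The function names and all numbers are hypothetical; this is the logic of the partitioning, not the operational code:

```python
def vertical_column(slant_column, amf):
    """Convert a DOAS slant column density to a vertical column density."""
    return slant_column / amf

def split_tropospheric(scd, amf_strat, amf_trop, strat_vcd):
    """If the initial VCD (computed with a stratospheric AMF) exceeds the
    estimated stratospheric field, attribute the excess slant column to
    the troposphere and reconvert it with a tropospheric AMF."""
    vcd_init = vertical_column(scd, amf_strat)
    if vcd_init <= strat_vcd:
        return vcd_init, 0.0
    excess_scd = scd - strat_vcd * amf_strat
    return strat_vcd, excess_scd / amf_trop

# Hypothetical polluted pixel (columns in molecules/cm^2)
scd = 1.2e16
strat, trop = split_tropospheric(scd, amf_strat=2.5, amf_trop=1.2, strat_vcd=3.0e15)
print(strat, trop)  # stratospheric and inferred tropospheric VCDs
```

Because tropospheric AMFs are smaller than stratospheric ones (less light path through the boundary layer), the same excess slant column maps to a proportionally larger tropospheric vertical column.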