Proceedings ArticleDOI

Development of multi-sensor global cloud and radiance composites for earth radiation budget monitoring from DSCOVR

02 Oct 2017 · Vol. 10424, pp. 21–33
TL;DR: In this article, a new algorithm is proposed for optimal merging of selected radiances and cloud properties derived from multiple spacecraft imagers to obtain seamless global hourly composites at 5-km resolution.
Abstract: The Deep Space Climate Observatory (DSCOVR) enables analysis of the daytime Earth radiation budget via the onboard Earth Polychromatic Imaging Camera (EPIC) and National Institute of Standards and Technology Advanced Radiometer (NISTAR). Radiance observations and cloud property retrievals from low earth orbit and geostationary satellite imagers have to be co-located with EPIC pixels to provide scene identification in order to select anisotropic directional models needed to calculate shortwave and longwave fluxes. A new algorithm is proposed for optimal merging of selected radiances and cloud properties derived from multiple satellite imagers to obtain seamless global hourly composites at 5-km resolution. An aggregated rating is employed to incorporate several factors and to select the best observation at the time nearest to the EPIC measurement. Spatial accuracy is improved using inverse mapping with gradient search during reprojection and bicubic interpolation for pixel resampling. The composite data are subsequently remapped into EPIC-view domain by convolving composite pixels with the EPIC point spread function defined with a half-pixel accuracy. PSF-weighted average radiances and cloud properties are computed separately for each cloud phase. The algorithm has demonstrated contiguous global coverage for any requested time of day with a temporal lag of under 2 hours in over 95% of the globe.

Summary

1. INTRODUCTION

  • The Deep Space Climate Observatory was launched in February 2015 to reach a looping halo orbit around Lagrangian point 1 (L1) with a spacecraft-Earth-Sun angle varying from 4 to 15 degrees [1, 2].
  • This goal implies calculation of the albedo and the outgoing longwave radiation using a combination of NISTAR, EPIC, and other imager-based products.
  • The process involves, first, deriving and merging cloud properties and radiation estimates from low earth orbit (LEO) and geosynchronous (GEO) satellite imagers.
  • These properties are then spatially averaged and collocated to match the EPIC pixels to provide the scene identification needed to select anisotropic directional models (ADMs).
  • These datasets have 409 pixels by about 12800 lines matching the 4 km/pixel resolution and temporal coverage of the original AVHRR Level 1B GAC data.

2. GENERATION OF GLOBAL GEO/LEO COMPOSITES

  • The input GEO, AVHRR, and MODIS datasets are carefully pre-processed using a variety of data quality control algorithms including both automated and human-analyzed techniques.
  • If a satellite observation occurred long before or long after the nominal time, then that data sample is given a lower rating and thus will likely be replaced with data from another source that was observed closer in time.
  • As an inherent result of the merging process, the obtained composite may reveal uneven boundaries between data that originate from two different satellites.
  • Most programming interfaces provide a uniformly distributed random variable, but converting it to a normally distributed number involves two logarithms, two square roots, and two trigonometric functions, severely impairing the computational performance.
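The cost of that conversion can be illustrated with the trigonometric Box–Muller transform, the textbook way of turning uniform variates into normal ones. This is an illustrative sketch only; the paper does not spell out which variant of the transform it profiles:

```python
import math
import random

def box_muller(u1: float, u2: float) -> tuple[float, float]:
    """Turn two independent U(0,1] variates into two independent N(0,1)
    variates with the trigonometric Box-Muller transform."""
    r = math.sqrt(-2.0 * math.log(u1))       # one log and one sqrt ...
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)  # ... plus two trig calls

# Draw a pair; 1 - random() keeps u1 in (0, 1] so log(u1) is defined.
z1, z2 = box_muller(1.0 - random.random(), random.random())
```

In practice the polar (Marsaglia) variant avoids the trigonometric calls at the price of rejection sampling, which is one common way around the performance issue noted above.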

3. EPIC-VIEW COMPOSITES

  • EPIC's total field of view (FOV) is 0.61° full angle and the recorded image dimensions are 2048×2048 pixels for high-resolution channels, which translates to a nominal pixel spacing of about 7.8 km at nadir.
  • To minimize the under-sampling of the global composite data and to improve the accuracy of the PSF sampling, the decision was made to double the dimensions of the EPIC domain by creating a virtual grid of 4096×4096 pixels at 3.9 km/pix resolution.
  • On the new grid, the PSF can be sampled with half-pixel accuracy, and so the matrix of PSF weights becomes 12×12 in size with the largest four weights at the center (corresponding to the former central pixel) of 0.07598.
  • The process of remapping and convolution can be summarized in 5 steps: 1) Convert the global composite data (where applicable); 2) Remap to the virtual grid with bilinear interpolation; …
  • The actual FOV fractions (sums of the PSF weights screened out by the same masks), which describe the percent ratio of a particular cloud phase in a given FOV, are also calculated and stored accordingly.
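The PSF-weighted averaging per cloud phase described above can be sketched as follows. The 12×12 weight matrix, phase codes, and field values here are illustrative placeholders, not the actual EPIC PSF or data:

```python
import numpy as np

def psf_weighted_average(field, phase, weights, target_phase):
    """PSF-weighted mean of `field` over pixels of one cloud phase.

    field, phase : 12x12 arrays of composite data and cloud-phase codes
                   on the virtual grid around one EPIC pixel
    weights      : 12x12 matrix of PSF weights (summing to 1 over the FOV)
    target_phase : phase code to average (here 1 = liquid, 2 = ice, assumed)

    Returns (mean, fov_fraction): the PSF-weighted average and the
    fraction of the FOV covered by that phase (sum of selected weights).
    """
    mask = (phase == target_phase)
    fov_fraction = weights[mask].sum()
    if fov_fraction == 0.0:
        return np.nan, 0.0
    mean = (field[mask] * weights[mask]).sum() / fov_fraction
    return mean, fov_fraction
```

A full implementation would repeat this for every EPIC pixel with the PSF weight matrix positioned at that pixel's sub-pixel location on the virtual grid.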

4. DISCUSSION AND CONCLUSION

  • A very good spatial collocation can be seen when comparing the two images.
  • Overall, the global composite data files provide well-characterized and consistent regional and global cloud and surface property datasets covering all time and space scales to match with EPIC.
  • The EPIC-view composites are useful for many applications, including: inter-calibration of non-UV EPIC channels; high-resolution independent scene identification for each EPIC pixel; convolution with EPIC radiances and CERES ADMs to compute daytime fluxes from NISTAR; a comparison source for EPIC cloud retrievals; and a cloud mask for other retrievals based on EPIC radiances.
  • Spatial variability and continuity of the global composite data have been analyzed to assess the performance of the merging criteria.
  • The described algorithm has demonstrated seamless global coverage for any requested time of day with a temporal lag of under 2 hours in over 95% of the globe.


Figures (12)


Development of Multi-Sensor Global Cloud and Radiance Composites for Earth
Radiation Budget Monitoring from DSCOVR
Konstantin Khlopenkov*a, David Dudaa, Mandana Thiemana, Patrick Minnisa, Wenying Sub, and Kristopher Bedkab
aScience Systems and Applications Inc., Hampton, VA 23666;
bNASA Langley Research Center, Hampton, VA 23681.
ABSTRACT
The Deep Space Climate Observatory (DSCOVR) enables analysis of the daytime Earth radiation budget via the
onboard Earth Polychromatic Imaging Camera (EPIC) and National Institute of Standards and Technology Advanced
Radiometer (NISTAR). Radiance observations and cloud property retrievals from low earth orbit and geostationary
satellite imagers have to be co-located with EPIC pixels to provide scene identification in order to select anisotropic
directional models needed to calculate shortwave and longwave fluxes.
A new algorithm is proposed for optimal merging of selected radiances and cloud properties derived from multiple
satellite imagers to obtain seamless global hourly composites at 5-km resolution. An aggregated rating is employed to
incorporate several factors and to select the best observation at the time nearest to the EPIC measurement. Spatial
accuracy is improved using inverse mapping with gradient search during reprojection and bicubic interpolation for pixel
resampling.
The composite data are subsequently remapped into EPIC-view domain by convolving composite pixels with the EPIC
point spread function defined with a half-pixel accuracy. PSF-weighted average radiances and cloud properties are
computed separately for each cloud phase. The algorithm has demonstrated contiguous global coverage for any
requested time of day with a temporal lag of under 2 hours in over 95% of the globe.
Keywords: DSCOVR, radiation budget, cloud properties, image processing, composite, subpixel, point spread function.
1. INTRODUCTION
The Deep Space Climate Observatory (DSCOVR) was launched in February 2015 to reach a looping halo orbit around
Lagrangian point 1 (L1) with a spacecraft-Earth-Sun angle varying from 4 to 15 degrees [1, 2]. The Earth science
instruments consist of the Earth Polychromatic Imaging Camera (EPIC) and the National Institute of Standards and
Technology Advanced Radiometer (NISTAR) [3]. The vantage point from L1, unique to Earth science, provides for
continuous monitoring of the Earth’s reflected and emitted radiation and enables analysis of the daytime Earth radiation
budget. This goal implies calculation of the albedo and the outgoing longwave radiation using a combination of
NISTAR, EPIC, and other imager-based products. The process involves, first, deriving and merging cloud properties and
radiation estimates from low earth orbit (LEO) and geosynchronous (GEO) satellite imagers. These properties are then
spatially averaged and collocated to match the EPIC pixels to provide the scene identification needed to select
anisotropic directional models (ADMs). Shortwave (SW) and longwave (LW) anisotropic factors are finally computed
for each EPIC pixel and convolved with EPIC reflectances to determine single average SW and LW factors used to
convert the NISTAR-measured radiances to global daytime SW and LW fluxes [4, 5].
The EPIC imager delivers 2048×2048-pixel imagery in 10 spectral bands, all shortwave, from 317 to 780 nm, while NISTAR
measures the full Earth disk radiance at the top-of-atmosphere (TOA) in three broadband spectral windows: 0.2–100,
0.2–4, and 0.7–4 µm. Although NISTAR can provide accurate TOA radiance measurements, the
low resolution of EPIC imagery (discussed further below) and its lack of infrared channels diminish its usefulness in
obtaining details on small-scale surface and cloud properties [6]. Previous studies [7] have shown that these properties
*
konstantin.khlopenkov@nasa.gov; phone 1 757 951-1914; fax 1 757 951-1900; www.ssaihq.com

have a strong influence on the anisotropy of the radiation at the TOA, and ignoring such effects can result in large TOA-
flux errors. To overcome this problem, high-resolution scene identification is derived from the radiance observations and
cloud properties retrievals from LEO (including NASA Terra and Aqua MODIS, and NOAA AVHRR) and from a
global constellation of GEO satellite imagers, which include Geostationary Operational Environmental Satellites
(GOES) operated by the NOAA, Meteosat satellites (by EUMETSAT), and the Multifunctional Transport Satellites
(MTSAT) and Himawari-8 satellites operated by Japan Meteorological Agency (JMA).
The NASA Clouds and the Earth's Radiant Energy System (CERES) [8] project was designed to monitor the Earth’s
energy balance in the shortwave and longwave broadband wavelengths. For the SYN1deg Edition 4 product [9],
1-hourly imager radiances are obtained from 5 contiguous GEO satellite positions (including GOES-13 and -15,
METEOSAT-7 and -10, MTSAT-2, and Himawari-8). The GEO data utilized for CERES were obtained from the Man-computer
Interactive Data Access System (McIDAS) [10] archive, which collects data from its antenna systems in near
real time. The imager radiances are then used to retrieve cloud and radiative properties using the CERES Cloud
Subsystem group algorithms [11, 12]. Radiative properties are derived from GEO and MODIS following the method
described in [13] to calculate broadband shortwave albedo, and following a modified version of the radiance-based
approach of [9] to calculate broadband longwave flux. For the 1st generation GEO imagers, the visible (0.65 µm), water
vapor (WV) (6.7 µm), and IR window (11 µm) channels are utilized in the cloud algorithm. For the 2nd generation GEO
imagers, the solar IR (SIR) (3.9 µm) and split-window (12 µm) channels are added. In GOES-12 through -15, the split
window is replaced with a CO2 slicing channel (13.2 µm). The native nominal pixel resolution varies as a function of
wavelength and by satellite. The channel data is sub-sampled to a ~8–10 km/pixel resolution. For MODIS, the algorithm
uses every fourth 1-km pixel and every other scanline and yields hourly datasets of 339 pixels by about 12000 lines. A
long-term cloud and radiation property data product has also been developed using AVHRR Global Area Coverage
(GAC) imagery [13], which has been georeferenced to sub-pixel accuracy [14]. These datasets have 409 pixels by about
12800 lines matching the 4 km/pixel resolution and temporal coverage of the original AVHRR Level 1B GAC data.
This work presents an algorithm for optimal merging of selected radiances and cloud properties derived from multiple
satellite imagers to obtain a seamless global composite product at 5-km resolution. These composite images are
produced for each observation time of the EPIC instrument (typically 300–500 composites per month). In the next step,
for accurate collocation with EPIC pixels, the global composites are remapped into the EPIC-view domain using
geolocation information supplied in EPIC Level 1B data.
2. GENERATION OF GLOBAL GEO/LEO COMPOSITES
The input GEO, AVHRR, and MODIS datasets are carefully pre-processed using a variety of data quality control
algorithms including both automated and human-analyzed techniques. This is especially true of the GEO satellites that
frequently contain bad scan lines and/or other artifacts that result in data quality issues despite the radiance values being
within a physical range. Some of the improvements to the quality of the imager data include a detector de-striping
algorithm [15] and an automated system for detection and filtering of transmission noise and corrupt data [16]. The
CERES Clouds Subsystem has pioneered these and other data quality tests to ensure the removal of as many satellite
artifacts as possible.
The global composites are produced on a rectangular latitude/longitude grid with a constant angle spacing of exactly
1/22 degree, which is about 5 km per pixel near the Equator, resulting in 7920×3960 grid dimensions. Each source data
image is remapped onto this grid by means of inverse mapping, which uses the latitude and longitude coordinates of a
pixel in the output grid for searching for the corresponding sample location in the input data domain (provided that the
latitude and longitude are known for every pixel of the input data). This process employs the concurrent gradient search
described in [17], which uses local gradients of latitude and longitude fields of the input data to locate the sought sample.
This search is very computationally efficient and yields a fractional row/column position, which is then used to
interpolate the adjacent data samples (such as reflectance or brightness temperature) by means of a 6×6-point resampling
function. At the image boundaries, the missing contents of the 6×6 window are padded by replicating the edge pixel
values. The image resampling operations are implemented as Lanczos filtering [18] extended to the 2D case with the
parameter a = 3. This interpolation method is based on the sinc filter, which is known to be an optimal reconstruction
filter for band-limited signals, e.g. digital imagery. For discrete input data, such as cloud phase or surface type, the
nearest-neighbor sampling is used instead of interpolation. The described remapping process allows us to preserve as

much as possible the spatial accuracy of the input imagery, which has a nominal resolution of 4–8 km/pix at the satellite
nadir.
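The 6×6-point resampling at a fractional row/column position can be sketched with a 2-D Lanczos (a = 3) filter and edge replication as described above. This is an illustrative reimplementation, not the authors' code:

```python
import numpy as np

def lanczos_kernel(x, a=3):
    """1-D Lanczos kernel: L(x) = sinc(x) * sinc(x/a) for |x| < a, else 0.
    Uses numpy's normalized sinc, sin(pi*x)/(pi*x)."""
    x = np.asarray(x, dtype=float)
    out = np.sinc(x) * np.sinc(x / a)
    out[np.abs(x) >= a] = 0.0
    return out

def lanczos_resample(img, row, col, a=3):
    """Interpolate img at fractional (row, col) with a 2-D Lanczos-3 filter
    over a 6x6 neighborhood; missing edge samples are padded by replication
    (indices clipped to the image bounds)."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    r_idx = np.arange(r0 - a + 1, r0 + a + 1)          # 6 rows
    c_idx = np.arange(c0 - a + 1, c0 + a + 1)          # 6 columns
    rows = np.clip(r_idx, 0, img.shape[0] - 1)         # edge replication
    cols = np.clip(c_idx, 0, img.shape[1] - 1)
    w = np.outer(lanczos_kernel(row - r_idx, a),
                 lanczos_kernel(col - c_idx, a))
    w /= w.sum()                                       # normalize weights
    return float((img[np.ix_(rows, cols)] * w).sum())
```

For discrete fields such as cloud phase or surface type, nearest-neighbor sampling would be used instead, as the text states.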
After the remapping, the new data are merged with data in the global composite. Composite pixels are replaced with the
new samples only if their quality is lower compared to that of the new data. The quality is measured by a specially
designed rating R, which incorporates five parameters: nominal satellite resolution F_resolution, pixel time relative to the
EPIC time, viewing zenith angle θ, distance from the day/night terminator F_terminator, and sun glint factor F_glint:

    R = F_resolution · (0.2 + 0.8·cos θ) / (1 + (Δt/τ)^1.5)^2 · F_terminator · F_glint     (1)

where Δt is the absolute difference, in hours, between the original observation time of a given sample and the EPIC
observation time. The latter defines the nominal time of the whole composite. If a satellite observation occurred long
before or long after the nominal time, then that data sample is given a lower rating and thus will likely be replaced with
data from another source that was observed closer in time. The characteristic time τ controls the attenuation rate of the
rating R with increasing time difference. Here τ = 5 hours is used, which will be justified further below. One can see that
a time difference of 2.8 hours decreases the rating by a factor of about 2. Observations with a time difference of more
than 4 hours have not been processed at all.
Observations with a larger viewing zenith angle (VZA) have a lower spatial resolution, and so the numerator in Equation
(1) reduces the rating accordingly. At a very high VZA, observations may still be usable when there is no alternative
data source to fill in the gap, and therefore the numerator does not decrease to zero. The F_resolution factor describes a
subjective preference in choosing a particular satellite due to its nominal resolution or other factors. It is set to 100 for
METEOSAT-7, 220 for MTSAT-1R, -2, and Himawari-8, 210 for all other GEO satellites, 185 for MODIS Terra and
Aqua, and 140 for all NOAA satellites. This is designed to prefer GEO satellites in equatorial and mid-latitude regions
and to help the overall continuity of the composite. Also, AVHRR lacks the water vapor channel (6.7 µm), which is
critical for correct retrieval of cloud properties, and therefore AVHRR data are assigned a lower initial rating. The F_glint
factor is designed to reduce the pixel's rating in the vicinity of sun glint and is calculated as follows:
    F_glint = 1 − 0.15·(1 − (1 − b)^2 / (1 − 0.92)^2)     (2)

where

    b = cos β if cos β > 0.92, and b = 0.92 otherwise;  cos β = cos θ·cos θ0 + sin θ·sin θ0·cos φ     (3)

where θ0 is the solar zenith angle and φ is the relative azimuth angle.
Finally, the F_terminator factor is designed to give lower priority to pixels around the day/night terminator, which may have
lower quality of the cloud property retrievals obtained by the daytime algorithm. It is calculated as:

    F_terminator = 0.625 − 0.375·cos(θ0 − 88.5)     (4)

where θ0 is taken in degrees but the argument of the cosine is treated as radians and is clipped to the range [−π, π].
Overall, the use of such an aggregated rating allows merging of multiple input factors into a single number that can be
compared and enables higher flexibility in choosing between two candidate pixels. A fixed-threshold approach would be
more difficult to implement for multiple factors, harder to fine-tune and achieve reliable and consistent results, and it
would still cause discontinuities in the final composite. For example, a moderate resolution off-nadir observation may
still be usable when all other candidates occurred too far in time. An opposite situation is also possible. The way all the
factors are accounted for in Equation (1) allows for an optimal compromise solution to this problem.
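Putting the factors together, the aggregated rating might be computed as sketched below. The constants (τ = 5 h, the F_resolution values, the 0.92 glint threshold, the 88.5° terminator angle, the 4-hour cutoff) are taken from the text, but the exact functional forms were reconstructed from a partially legible source and should be treated as approximate:

```python
import math

TAU = 5.0  # characteristic time, hours

F_RESOLUTION = {            # subjective per-satellite preference factors
    "METEOSAT-7": 100, "MTSAT-1R": 220, "MTSAT-2": 220, "Himawari-8": 220,
    "GEO-other": 210, "MODIS-Terra": 185, "MODIS-Aqua": 185, "NOAA-AVHRR": 140,
}

def rating(sat, dt_hours, vza_deg, sza_deg, raz_deg):
    """Aggregated pixel rating R (Equation 1, reconstructed); higher is better.

    dt_hours : |observation time - EPIC time| in hours
    vza_deg  : viewing zenith angle theta
    sza_deg  : solar zenith angle theta_0
    raz_deg  : relative azimuth angle phi
    """
    if dt_hours > 4.0:         # samples more than 4 h from EPIC time are unused
        return 0.0
    vza = math.radians(vza_deg)
    sza = math.radians(sza_deg)
    raz = math.radians(raz_deg)

    # Sun-glint factor: penalize pixels near the specular direction
    cos_beta = (math.cos(vza) * math.cos(sza)
                + math.sin(vza) * math.sin(sza) * math.cos(raz))
    b = max(cos_beta, 0.92)
    f_glint = 1.0 - 0.15 * (1.0 - (1.0 - b) ** 2 / (1.0 - 0.92) ** 2)

    # Terminator factor: argument in degrees treated as radians, clipped
    arg = min(max(sza_deg - 88.5, -math.pi), math.pi)
    f_term = 0.625 - 0.375 * math.cos(arg)

    time_penalty = (1.0 + (dt_hours / TAU) ** 1.5) ** 2
    return (F_RESOLUTION[sat] * (0.2 + 0.8 * math.cos(vza))
            / time_penalty * f_term * f_glint)
```

With this form, a Δt of 2.8 hours roughly halves the rating, matching the behavior described in the text, and a candidate pixel simply replaces the composite pixel whenever its rating is higher.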
Once the rating comparison indicates that a pixel in the composite is to be replaced with the input data, all parameters
(already remapped) associated with that particular satellite observation are copied to the composite pixel. A list of those
parameters is shown in Table 1. There are a few limitations here that are worth mentioning. First, the Near-Infrared

(NIR) channel is absent on GEO imagers and so reflectance in the 0.86 µm band is flagged as a missing value for
composite pixels that originate from GEO satellites. Similarly, the 6.7 µm water vapor band is absent for AVHRR pixels.
On GOES-12, -13, -14, and -15, the split-window brightness temperature (BT) is measured in the 13.5 µm band instead
of 12 µm. For this reason, BT at 12.0 µm is also flagged as a missing value for composite pixels originating from those
four satellites. The surface type information is retrieved from the International Geosphere-Biosphere Programme (IGBP)
map [19], with the addition of snow and ice flags taken from the NOAA daily snow and ice cover maps at 1/6 deg spatial
resolution.

Table 1. List of parameters included in the global composite (parameters are listed per source: AVHRR, MODIS, and the
GEO satellites; they include the satellite ID, the time relative to EPIC within a ±3.5-hour maximum, and the surface type
from IGBP plus snow/ice flags).

Figure 1. Global composite map of the brightness temperature in channel 10.8 µm generated for Sep-15-2015
13:23 UTC. The continuous coverage leaves no gaps or apparent disruptions in the temperature field.

The time relative to the EPIC observation is the Δt from Equation (1) but converted to seconds and stored as a
signed integer number. Satellite ID is an integer number unique for each satellite which is designed to indicate the
origination of a given composite pixel.
A typical result of the compositing algorithm is presented in Figure 1, which shows an example of the map of brightness
temperature in channel 10.8 µm composited for Sep-15-2015 13:23 UTC. The composite image presents continuous
coverage with no gaps and no artificial breaks or disruptions in the temperature data. A corresponding map of satellite
ID is shown in Figure 2, which represents typical spatial coverage from different satellites. Most of the equatorial
regions are covered by GEO satellites, while the polar regions are represented by Terra and Aqua, because MODIS was
initially given a higher F_resolution factor than AVHRR. Similarly, the MET-7 data were assigned a lower F_resolution
factor in order to correct for their lower quality, and therefore the MET-7 coverage automatically shrinks and is more
likely to …

Figure 2. A map of satellite coverage in a global composite generated for Sep-15-2015 13:23 UTC.

Figure 3. A map of the time relative to nominal for the case of Sep-15-2015 13:23 UTC. Pale colors correspond to a
lower difference in time and bright colors indicate a larger difference.

Citations
Journal ArticleDOI
TL;DR: In this article, the authors used a random forest model to estimate global hourly downward shortwave radiation (SW) and photosynthetically active radiation (PAR) at 0.1°×0.1° (about 10 km at the equator) spatial resolution based on EPIC measurements.

39 citations

Journal ArticleDOI
TL;DR: Comparison with co-located cloud retrievals from geosynchronous earth orbit (GEO) and low earth orbit (LEO) satellites shows that the EPIC cloud product algorithms are performing well and are consistent with theoretical expectations.
Abstract: . This paper presents the physical basis of the Earth Polychromatic Imaging Camera (EPIC) cloud product algorithms and an initial evaluation of their performance. Since June 2015, EPIC has been providing observations of the sunlit side of the Earth with its 10 spectral channels ranging from the UV to the near-infrared. A suite of algorithms has been developed to generate the standard EPIC Level 2 cloud products that include cloud mask, cloud effective pressure/height, and cloud optical thickness. The EPIC cloud mask adopts the threshold method and utilizes multichannel observations and ratios as tests. Cloud effective pressure/height is derived with observations from the O2 A-band (780 and 764 nm) and B-band (680 and 688 nm) pairs. The EPIC cloud optical thickness retrieval adopts a single-channel approach in which the 780 and 680 nm channels are used for retrievals over ocean and over land, respectively. Comparison with co-located cloud retrievals from geosynchronous earth orbit (GEO) and low earth orbit (LEO) satellites shows that the EPIC cloud product algorithms are performing well and are consistent with theoretical expectations. These products are publicly available at the Atmospheric Science Data Center at the NASA Langley Research Center for climate studies and for generating other geophysical products that require cloud properties as input.

31 citations

Journal ArticleDOI
TL;DR: Hao et al. as mentioned in this paper adopted a suite of data-driven machine-learning models to generate the first globally land products of downward shortwave radiation (SW) from June 2015 to June 2019, based on the Earth Polychromatic Imaging Camera (EPIC) data.
Abstract: Downward shortwave radiation (SW) and photosynthetically active radiation (PAR) play crucial roles in Earth system dynamics. Spaceborne remote sensing techniques provide a unique means for mapping accurate spatiotemporally continuous SW–PAR, globally. However, any individual polar-orbiting or geostationary satellite cannot satisfy the desired high temporal resolution (sub-daily) and global coverage simultaneously, while integrating and fusing multisource data from complementary satellites/sensors is challenging because of co-registration, intercalibration, near real-time data delivery and the effects of discrepancies in orbital geometry. The Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR), launched in February 2015, offers an unprecedented possibility to bridge the gap between high temporal resolution and global coverage and characterize the diurnal cycles of SW–PAR globally. In this study, we adopted a suite of well-validated data-driven machine-learning models to generate the first global land products of SW–PAR, from June 2015 to June 2019, based on DSCOVR/EPIC data. The derived products have high temporal resolution (hourly) and medium spatial resolution (0.1° × 0.1°), and they include estimates of the direct and diffuse components of SW–PAR. We used independent, widely distributed ground station data from the Baseline Surface Radiation Network (BSRN), the Surface Radiation Budget Network (SURFRAD), NOAA's Global Monitoring Division and the U.S. Department of Energy's Atmospheric System Research (ASR) program to evaluate the performance of our products, and we further analyzed and compared the spatiotemporal characteristics of the derived products with the benchmarking Clouds and the Earth's Radiant Energy System Synoptic (CERES) data.
We found both the hourly and daily products to be consistent with ground-based observations (e.g., hourly and daily total SWs have low biases of −3.96 and −0.71 W m⁻² and root-mean-square errors (RMSEs) of 103.50 and 35.40 W m⁻², respectively). The developed products capture the complex spatiotemporal patterns well and accurately track substantial diurnal, monthly, and seasonal variations in SW–PAR when compared to CERES data. They provide a reliable and valuable alternative for solar photovoltaic applications worldwide and can be used to improve our understanding of the diurnal and seasonal variabilities of the terrestrial water, carbon and energy fluxes at various spatial scales. The products are freely available at https://doi.org/10.25584/1595069 (Hao et al., 2020).

18 citations

Journal ArticleDOI
TL;DR: In this paper, an unfiltering algorithm was developed for the NISTAR SW and NIR channels using a spectral radiance database calculated for typical Earth scenes, which was then converted to full-disk daytime SW and LW flux by accounting for the anisotropic characteristics of the Earth-reflected and emitted radiances.
Abstract: The National Institute of Standards and Technology Advanced Radiometer (NISTAR) onboard the Deep Space Climate Observatory (DSCOVR) provides continuous full-disk global broadband irradiance measurements over most of the sunlit side of the Earth. The three active cavity radiometers measure the total radiant energy from the sunlit side of the Earth in shortwave (SW; 0.2–4 µm), total (0.4–100 µm), and near-infrared (NIR; 0.7–4 µm) channels. The Level 1 NISTAR dataset provides the filtered radiances (the ratio between irradiance and solid angle). To determine the daytime top-of-atmosphere (TOA) shortwave and longwave radiative fluxes, the NISTAR-measured shortwave radiances must be unfiltered first. An unfiltering algorithm was developed for the NISTAR SW and NIR channels using a spectral radiance database calculated for typical Earth scenes. The resulting unfiltered NISTAR radiances are then converted to full-disk daytime SW and LW flux by accounting for the anisotropic characteristics of the Earth-reflected and emitted radiances. The anisotropy factors are determined using scene identifications determined from multiple low-Earth orbit and geostationary satellites as well as the angular distribution models (ADMs) developed using data collected by the Clouds and the Earth's Radiant Energy System (CERES). Global annual daytime mean SW fluxes from NISTAR are about 6% greater than those from CERES, and both show strong diurnal variations with daily maximum–minimum differences as great as 20 W m⁻² depending on the conditions of the sunlit portion of the Earth. They are also highly correlated, having correlation coefficients of 0.89, indicating that they both capture the diurnal variation. Global annual daytime mean LW fluxes from NISTAR are 3% greater than those from CERES, but the correlation between them is only about 0.38.

18 citations

References
Journal ArticleDOI
TL;DR: The method was implemented to reproject MODIS Level 1B imagery over Canada, North America, and the Arctic circumpolar zone in the following four popular geographic projections: Plate Carrée, Lambert Conic Conformal, Universal Transverse Mercator, and Lambert Azimuthal Equal-Area.
Abstract: This paper presents details regarding implementation of a novel algorithm for reprojection of Moderate Resolution Imaging Spectroradiometer (MODIS) Level 1B imagery. The method is based on a simultaneous 2-D search in latitude and longitude geolocation fields by using their local gradients. Due to the segmented structure of MODIS imagery caused by the instrument whiskbroom electrooptical design, the gradient search is realized in the following two steps: intersegment and intrasegment search. This approach resolves the discontinuity of the latitude/longitude geolocation fields caused by overlap between consecutively scanned MODIS multidetector image segments. The structure of the algorithm allows equal efficiency with nearest neighbor and bilinear interpolation. A special procedure that combines analytical and numerical schemes is designed for reprojecting imagery near the polar region, where the standard gradient search may become unstable. The performance of the method was validated by comparison of reprojected MODIS/Terra and MODIS/Aqua images with georectified Landsat-7 Enhanced Thematic Mapper Plus imagery over Canada. It was found that the proposed method preserves the absolute geolocation accuracy of MODIS pixels determined by the MODIS geolocation team. The method was implemented to reproject MODIS Level 1B imagery over Canada, North America, and the Arctic circumpolar zone in the following four popular geographic projections: Plate Carrée (cylindrical equidistant), Lambert Conic Conformal, Universal Transverse Mercator, and Lambert Azimuthal Equal-Area. It was also found to be efficient for reprojection of Advanced Very High Resolution Radiometer and Medium Resolution Imaging Spectrometer satellite images and general-type meteorological fields, such as the North American Regional Reanalysis data sets.

58 citations


"Development of multi-sensor global ..." refers methods in this paper

  • ...This process employs the concurrent gradient search described in [17], which uses local gradients of latitude and longitude fields of the input data to locate the sought sample....

    [...]

Journal ArticleDOI
TL;DR: The application of the developed processing system showed that the algorithm achieved better than 1/3 FOV geolocation accuracy for AVHRR 1-km scenes, and was designed for processing daytime data as it intensively employs observations from optical solar bands, the near-infrared channel in particular.
Abstract: Precise geolocation is one of the fundamental requirements for satellite imagery to be suitable for climate applications. The Global Climate Observing System and the Committee on Earth Observing Satellites identified the requirement for the accuracy of geolocation of satellite data for climate applications as 1/3 field of view (FOV). This requirement for the series of the Advanced Very High Resolution Radiometer (AVHRR) on the National Oceanic and Atmospheric Administration platforms cannot be met without implementing the ground control point (GCP) correction, particularly for historical data, because of limited accuracy of orbit modeling and knowledge of satellite attitude angles. This paper presents a new method for precise georeferencing of the AVHRR imagery developed as part of the new Canadian AVHRR processing system (CAPS) designed for generating high-quality AVHRR satellite climate data record at 1-km spatial resolution. The method works in swath projection and uses the following: 1) the reference monthly images from Moderate Resolution Imaging Spectroradiometer at 250-m resolution; 2) orthorectification to correct for surface elevation; and 3) a novel image matching technique in swath projection to achieve the subpixel resolution. The method is designed for processing daytime data as it intensively employs observations from optical solar bands, the near-infrared channel in particular. The application of the developed processing system showed that the algorithm achieved better than 1/3 FOV geolocation accuracy for AVHRR 1-km scenes. It has very high efficiency rate (> 97%) due to the dense and uniform GCP coverage of the study area (5700 × 4800 km2 ), covering the entire Canada, the Northern U.S., Alaska, Greenland, and surrounding oceans.

46 citations


"Development of multi-sensor global ..." refers methods in this paper

  • ...A long-term cloud and radiation property data product has also been developed using AVHRR Global Area Coverage (GAC) imagery [13], which has been georeferenced to sub-pixel accuracy [14]....


Journal ArticleDOI
TL;DR: In this paper, the authors used satellite cloud property retrievals to define empirical angular distribution models (ADMs) for estimating top-of-atmosphere (TOA) albedo.
Abstract: The next generation of Earth radiation budget satellite instruments will routinely merge estimates of global top-of-atmosphere radiative fluxes with cloud properties. This information will offer many new opportunities for validating radiative transfer models and cloud parameterizations in climate models. In this study, five months of POLarization and Directionality of the Earth's Reflectances (POLDER) 670 nm radiance measurements are considered in order to examine how satellite cloud property retrievals can be used to define empirical Angular Distribution Models (ADMs) for estimating top-of-atmosphere (TOA) albedo. ADMs are defined for 19 scene types determined by satellite retrievals of cloud fraction and cloud optical depth. Two approaches are used to define the ADM scene types: the first assumes there are no biases in the retrieved cloud properties and defines ADMs for fixed discrete intervals of cloud fraction and cloud optical depth (fixed-tau approach). The second approach involves the same cloud fraction intervals, but uses percentile intervals of cloud optical depth instead (percentile-tau approach). Albedos generated using these methods are compared with albedos inferred directly from the mean observed reflectance field. Albedos based on ADMs that assume cloud properties are unbiased (fixed-tau approach) show a strong systematic dependence on viewing geometry. This dependence becomes more pronounced with increasing solar zenith angle, reaching approximately 12% (relative) between near-nadir and oblique viewing zenith angles for solar zenith angles between 60° and 70°. The cause of this bias is shown to be biases in the cloud optical depth retrievals.
In contrast, albedos based on ADMs built using percentile intervals of cloud optical depth (percentile-tau approach) show very little viewing zenith angle dependence and are in good agreement with albedos obtained by direct integration of the mean observed reflectance field (less than 1% relative error). When the ADMs are applied separately to populations consisting of only liquid water and ice clouds, significant biases in albedo with viewing geometry are observed (particularly at low sun elevations), highlighting the need to account for cloud phase both in cloud optical depth retrievals and in defining ADM scene types. ADM-derived monthly mean albedos determined for all 5° × 5° latitude/longitude regions over ocean are in good agreement (regional RMS relative errors less than 2%) with those obtained by direct integration when ADM albedos inferred from specific angular bins are averaged together. Albedos inferred from near-nadir and oblique viewing zenith angles are the least accurate, with regional RMS errors reaching approximately 5-10% (relative). Compared to an earlier study involving ERBE ADMs, regional mean albedos based on the 19 scene types considered here show a factor of 4 reduction in bias error and a factor of 3 reduction in RMS error.
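The core of the percentile-tau idea is that scene-type bins are defined by the *rank* of the retrieved optical depth rather than by fixed thresholds, so each bin receives an equal share of observations and is robust to a monotonic retrieval bias. A minimal sketch of that binning (the toy albedo relation in the usage below is an assumption for illustration, not the POLDER data):

```python
import numpy as np

def percentile_tau_bins(tau, n_bins=4):
    """Assign each cloud-optical-depth retrieval to a percentile interval
    (the 'percentile-tau' approach).  Bin edges are data-driven quantiles,
    so every bin holds a near-equal share of the observations, and a
    monotonic bias in tau leaves the binning unchanged."""
    interior_edges = np.percentile(tau, np.linspace(0, 100, n_bins + 1))[1:-1]
    return np.digitize(tau, interior_edges)  # integer bin index 0..n_bins-1

def scene_mean(values, scene_idx, n_bins):
    """Mean of an observed quantity (e.g. TOA albedo) per scene-type bin —
    a stand-in for accumulating an angular distribution model."""
    return np.array([values[scene_idx == b].mean() for b in range(n_bins)])
```

In the full method these bins are further crossed with cloud-fraction intervals and angular bins; only the optical-depth dimension is shown here.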

38 citations


Additional excerpts

  • ...Previous studies [7] have shown that these properties...


Journal ArticleDOI
TL;DR: Investigation of the expected uncertainties of a single-channel cloud optical thickness (COT) retrieval technique, as well as of a simple cloud-temperature-threshold-based thermodynamic phase approach, in support of the Deep Space Climate Observatory (DSCOVR) mission shows that a single-channel COT retrieval is feasible for EPIC.
Abstract: This paper presents an investigation of the expected uncertainties of a single-channel cloud optical thickness (COT) retrieval technique, as well as a simple cloud-temperature-threshold-based thermodynamic phase approach, in support of the Deep Space Climate Observatory (DSCOVR) mission. DSCOVR cloud products will be derived from Earth Polychromatic Imaging Camera (EPIC) observations in the ultraviolet and visible spectra. Since EPIC is not equipped with a spectral channel in the shortwave or mid-wave infrared that is sensitive to cloud effective radius (CER), COT will be inferred from a single visible channel with the assumption of appropriate CER values for liquid and ice phase clouds. One month of Aqua MODerate-resolution Imaging Spectroradiometer (MODIS) daytime granules from April 2005 is selected for investigating cloud phase sensitivity, and a subset of these granules that has similar EPIC Sun-view geometry is selected for investigating COT uncertainties. EPIC COT retrievals are simulated with the same algorithm as the operational MODIS cloud products (MOD06), except using fixed phase-dependent CER values. Uncertainty estimates are derived by comparing the single-channel COT retrievals with the baseline bi-spectral MODIS retrievals. Results show that a single-channel COT retrieval is feasible for EPIC. For ice clouds, single-channel retrieval errors are minimal.
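A single-channel COT retrieval of this kind reduces to a temperature-threshold phase test followed by inversion of a monotonic reflectance-vs-COT lookup table for that phase's fixed CER. The sketch below illustrates the mechanics only: the lookup values, the 233 K threshold, and the grid are made-up placeholders, not the EPIC/MOD06 tables (which come from a radiative-transfer model for the actual Sun-view geometry).

```python
import numpy as np

# Hypothetical lookup table: TOA reflectance vs. cloud optical thickness for
# one Sun-view geometry and one fixed effective radius per phase.
COT_GRID = np.array([0.5, 1, 2, 4, 8, 16, 32, 64, 128])
REFL_LUT = {
    "liquid": np.array([0.08, 0.12, 0.18, 0.27, 0.39, 0.52, 0.63, 0.70, 0.74]),
    "ice":    np.array([0.07, 0.10, 0.16, 0.24, 0.36, 0.49, 0.60, 0.68, 0.72]),
}

def retrieve_cot(reflectance, cloud_top_temp_k, phase_threshold_k=233.0):
    """Single-channel COT retrieval with a cloud-temperature phase test.

    Clouds colder than the threshold are treated as ice, warmer as liquid;
    the observed reflectance is then inverted through that phase's
    monotonic lookup table.
    """
    phase = "ice" if cloud_top_temp_k < phase_threshold_k else "liquid"
    lut = REFL_LUT[phase]
    # Interpolate in log(COT), since reflectance is near-linear in log(tau).
    log_cot = np.interp(reflectance, lut, np.log(COT_GRID))
    return float(np.exp(log_cot)), phase
```

Because both lookup curves are strictly increasing, `np.interp` gives a unique inversion; errors relative to a bi-spectral retrieval then stem from the fixed-CER assumption, which is exactly what the paper quantifies.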

20 citations


"Development of multi-sensor global ..." refers background in this paper

  • ...Although NISTAR can provide accurate top-of-atmosphere (TOA) radiance measurements, the low resolution of EPIC imagery (discussed further below) and its lack of infrared channels diminish its usefulness in obtaining details on small-scale surface and cloud properties [6]....


Proceedings ArticleDOI
18 Oct 2016
TL;DR: In this paper, an image processing methodology designed to detect most kinds of noise and corrupt data in all bands of raw imagery from modern and historic geostationary (GEO) satellites is presented.
Abstract: The Clouds and the Earth’s Radiant Energy System (CERES) has incorporated imagery from 16 individual geostationary (GEO) satellites across five contiguous domains since March 2000. In order to derive broadband fluxes that are uniform across satellite platforms, it is important to ensure good quality of the input raw count data. GEO data obtained by older GEO imagers (such as MTSAT-1, Meteosat-5, Meteosat-7, GMS-5, and GOES-9) are known to frequently contain various types of noise caused by transmission errors, sync errors, stray light contamination, and other sources. This work presents an image processing methodology designed to detect most kinds of noise and corrupt data in all bands of raw imagery from modern and historic GEO satellites. The algorithm is based on a set of different approaches to detect abnormal image patterns, including inter-line and inter-pixel differences within a scanline, correlation between scanlines, analysis of spatial variance, and also a 2D Fourier analysis of the image spatial frequencies. In spite of its computational complexity, the described method is highly optimized for performance to facilitate volume processing of multi-year data, and it runs in fully automated mode. Reliability of this noise detection technique has been assessed by human supervision for each GEO dataset obtained during selected time periods in 2005 and 2006. This assessment demonstrated an overall detection accuracy of over 99.5% and a false alarm rate of under 0.3%. The described noise detection routine is currently used in volume processing of historical GEO imagery for subsequent production of global gridded data products and for cross-platform calibration.
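One of the listed tests, correlation between scanlines, can be illustrated with a short sketch. This is a simplified stand-in for the paper's full suite of checks: a genuine scene edge decorrelates a line from one neighbour only, whereas a corrupt (noise-filled) line decorrelates from both neighbours, so requiring low correlation on both sides separates noise from real structure.

```python
import numpy as np

def flag_noisy_scanlines(img, corr_min=0.8):
    """Flag scanlines that decorrelate from BOTH neighbouring lines.

    img      : 2-D array (scanlines x samples) of raw counts.
    corr_min : minimum Pearson correlation with the better-matching
               neighbour for a line to be considered clean (illustrative
               threshold, not the operational value).
    """
    n = img.shape[0]
    flags = np.zeros(n, dtype=bool)
    for i in range(1, n - 1):
        r_prev = np.corrcoef(img[i], img[i - 1])[0, 1]
        r_next = np.corrcoef(img[i], img[i + 1])[0, 1]
        # Corrupt lines match neither neighbour; scene edges match one.
        if max(r_prev, r_next) < corr_min:
            flags[i] = True
    return flags
```

The operational system combines several such detectors (inter-pixel differences, spatial variance, 2D Fourier analysis) and votes across them; a single-test version like this is only the building block.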

2 citations


"Development of multi-sensor global ..." refers background in this paper

  • ...Some of the improvements to the quality of the imager data include a detector de-striping algorithm [15] and an automated system for detection and filtering of transmission noise and corrupt data [16]....


Frequently Asked Questions (1)
Q1. What contributions have the authors mentioned in the paper "Development of multi-sensor global cloud and radiance composites for earth radiation budget monitoring from DSCOVR"?

In this paper, a new algorithm is proposed for optimal merging of selected radiances and cloud properties derived from multiple satellite imagers to obtain seamless global hourly composites at 5-km resolution.