Showing papers by "Goddard Space Flight Center" published in 2019
••
TL;DR: The Simons Observatory (SO) is a new cosmic microwave background experiment being built on Cerro Toco in Chile, due to begin observations in the early 2020s.
Abstract: The Simons Observatory (SO) is a new cosmic microwave background experiment being built on Cerro Toco in Chile, due to begin observations in the early 2020s. We describe the scientific goals of the experiment, motivate the design, and forecast its performance. SO will measure the temperature and polarization anisotropy of the cosmic microwave background in six frequency bands centered at: 27, 39, 93, 145, 225 and 280 GHz. The initial configuration of SO will have three small-aperture 0.5-m telescopes and one large-aperture 6-m telescope, with a total of 60,000 cryogenic bolometers. Our key science goals are to characterize the primordial perturbations, measure the number of relativistic species and the mass of neutrinos, test for deviations from a cosmological constant, improve our understanding of galaxy evolution, and constrain the duration of reionization. The small aperture telescopes will target the largest angular scales observable from Chile, mapping ≈ 10% of the sky to a white noise level of 2 μK-arcmin in combined 93 and 145 GHz bands, to measure the primordial tensor-to-scalar ratio, r, at a target level of σ(r)=0.003. The large aperture telescope will map ≈ 40% of the sky at arcminute angular resolution to an expected white noise level of 6 μK-arcmin in combined 93 and 145 GHz bands, overlapping with the majority of the Large Synoptic Survey Telescope sky region and partially with the Dark Energy Spectroscopic Instrument. With up to an order of magnitude lower polarization noise than maps from the Planck satellite, the high-resolution sky maps will constrain cosmological parameters derived from the damping tail, gravitational lensing of the microwave background, the primordial bispectrum, and the thermal and kinematic Sunyaev-Zel'dovich effects, and will aid in delensing the large-angle polarization signal to measure the tensor-to-scalar ratio. 
The survey will also provide a legacy catalog of 16,000 galaxy clusters and more than 20,000 extragalactic sources.
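The quoted combined-band map noise follows from inverse-variance weighting of the individual frequency maps; a minimal sketch, where the per-band noise values are illustrative assumptions rather than published SO forecasts:

```python
import math

def combine_noise(noise_levels_uK_arcmin):
    """Inverse-variance combination of per-band white-noise levels.

    Each map contributes weight 1/sigma^2, so the combined map noise is
    sigma_comb = 1 / sqrt(sum_i 1/sigma_i^2).
    """
    inv_var = sum(1.0 / s**2 for s in noise_levels_uK_arcmin)
    return 1.0 / math.sqrt(inv_var)

# Hypothetical per-band noises, chosen so the combination lands near the
# quoted 2 uK-arcmin small-aperture target for 93+145 GHz:
print(combine_noise([2.8, 2.8]))  # ~1.98 uK-arcmin
```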
1,027 citations
••
University of Washington1, California Institute of Technology2, Stockholm University3, University of Maryland, College Park4, Humboldt University of Berlin5, Goddard Space Flight Center6, National Central University7, Weizmann Institute of Science8, Macau University of Science and Technology9, Tel Aviv University10, University of California, Santa Barbara11, University of Michigan12, Northwestern University13, Adler Planetarium14, Lawrence Berkeley National Laboratory15, University of California, Berkeley16, Soka University of America17, Centre national de la recherche scientifique18, Radboud University Nijmegen19, University of Wisconsin–Milwaukee20, Los Alamos National Laboratory21
TL;DR: The Zwicky Transient Facility (ZTF) is a new optical time-domain survey that uses the Palomar 48 inch Schmidt telescope; its custom wide-field camera provides a 47 deg^2 field of view and 8 s readout time, yielding more than an order of magnitude improvement in survey speed relative to its predecessor survey.
Abstract: The Zwicky Transient Facility (ZTF) is a new optical time-domain survey that uses the Palomar 48 inch Schmidt telescope. A custom-built wide-field camera provides a 47 deg^2 field of view and 8 s readout time, yielding more than an order of magnitude improvement in survey speed relative to its predecessor survey, the Palomar Transient Factory. We describe the design and implementation of the camera and observing system. The ZTF data system at the Infrared Processing and Analysis Center provides near-real-time reduction to identify moving and varying objects. We outline the analysis pipelines, data products, and associated archive. Finally, we present on-sky performance analysis and first scientific results from commissioning and the early survey. ZTF's public alert stream will serve as a useful precursor for that of the Large Synoptic Survey Telescope.
1,009 citations
••
University of Exeter1, École Normale Supérieure2, Norwich Research Park3, Alfred Wegener Institute for Polar and Marine Research4, University of Groningen5, Wageningen University and Research Centre6, Max Planck Society7, Ludwig Maximilian University of Munich8, Commonwealth Scientific and Industrial Research Organisation9, Centre national de la recherche scientifique10, Stanford University11, Karlsruhe Institute of Technology12, Atlantic Oceanographic and Meteorological Laboratory13, Cooperative Institute for Marine and Atmospheric Studies14, Geophysical Institute, University of Bergen15, Bjerknes Centre for Climate Research16, Japan Agency for Marine-Earth Science and Technology17, University of Maryland, College Park18, National Institute of Water and Atmospheric Research19, National Oceanic and Atmospheric Administration20, Appalachian State University21, Flanders Marine Institute22, Augsburg College23, ETH Zurich24, Leibniz Institute of Marine Sciences25, University of East Anglia26, Woods Hole Research Center27, University of Illinois at Urbana–Champaign28, University of Hong Kong29, Utrecht University30, Netherlands Environmental Assessment Agency31, University of Paris32, Hobart Corporation33, University of Tasmania34, University of Bern35, National Center for Atmospheric Research36, University of Reading37, Cooperative Institute for Research in Environmental Sciences38, National Institute for Environmental Studies39, Russian Academy of Sciences40, Goddard Space Flight Center41, Leibniz Institute for Baltic Sea Research42, Princeton University43, Met Office44, Lund University45, Auburn University46, Food and Agriculture Organization47, VU University Amsterdam48
TL;DR: In this article, the authors describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties, including emissions from land use and land use change, and show that the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere is a measure of imperfect data and understanding of the contemporary carbon cycle.
Abstract: Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere – the "global carbon budget" – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. Fossil CO2 emissions (EFF) are based on energy statistics and cement production data, while emissions from land use change (ELUC), mainly deforestation, are based on land use and land use change data and bookkeeping models. Atmospheric CO2 concentration is measured directly, and its growth rate (GATM) is computed from the annual changes in concentration. The ocean CO2 sink (SOCEAN) and terrestrial CO2 sink (SLAND) are estimated with global process models constrained by observations. The resulting carbon budget imbalance (BIM), the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere, is a measure of imperfect data and understanding of the contemporary carbon cycle. All uncertainties are reported as ±1σ. For the last decade available (2009–2018), EFF was 9.5±0.5 GtC yr⁻¹, ELUC 1.5±0.7 GtC yr⁻¹, GATM 4.9±0.02 GtC yr⁻¹ (2.3±0.01 ppm yr⁻¹), SOCEAN 2.5±0.6 GtC yr⁻¹, and SLAND 3.2±0.6 GtC yr⁻¹, with a budget imbalance BIM of 0.4 GtC yr⁻¹ indicating overestimated emissions and/or underestimated sinks. For the year 2018 alone, the growth in EFF was about 2.1% and fossil emissions increased to 10.0±0.5 GtC yr⁻¹, reaching 10 GtC yr⁻¹ for the first time in history; ELUC was 1.5±0.7 GtC yr⁻¹, for total anthropogenic CO2 emissions of 11.5±0.9 GtC yr⁻¹ (42.5±3.3 GtCO2). Also for 2018, GATM was 5.1±0.2 GtC yr⁻¹ (2.4±0.1 ppm yr⁻¹), SOCEAN was 2.6±0.6 GtC yr⁻¹, and SLAND was 3.5±0.7 GtC yr⁻¹, with a BIM of 0.3 GtC. The global atmospheric CO2 concentration reached 407.38±0.1 ppm averaged over 2018. For 2019, preliminary data for the first 6–10 months indicate a reduced growth in EFF of +0.6% (range of −0.2% to 1.5%), based on national emissions projections for China, the USA, the EU, and India and projections of gross domestic product corrected for recent changes in the carbon intensity of the economy for the rest of the world. Overall, the mean and trend in the five components of the global carbon budget are consistently estimated over the period 1959–2018, but discrepancies of up to 1 GtC yr⁻¹ persist for the representation of semi-decadal variability in CO2 fluxes. A detailed comparison among individual estimates and the introduction of a broad range of observations shows (1) no consensus in the mean and trend in land use change emissions over the last decade, (2) a persistent low agreement between the different methods on the magnitude of the land CO2 flux in the northern extra-tropics, and (3) an apparent underestimation of the CO2 variability by ocean models outside the tropics. This living data update documents changes in the methods and data sets used in this new global carbon budget and the progress in understanding of the global carbon cycle compared with previous publications of this data set (Le Quéré et al., 2018a, b, 2016, 2015a, b, 2014, 2013). The data generated by this work are available at https://doi.org/10.18160/gcp-2019 (Friedlingstein et al., 2019).
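The budget imbalance described above is simply the residual of the five components, BIM = (EFF + ELUC) − (GATM + SOCEAN + SLAND); a minimal check using the 2009–2018 decadal means quoted in the abstract:

```python
def budget_imbalance(e_ff, e_luc, g_atm, s_ocean, s_land):
    """B_IM = (E_FF + E_LUC) - (G_ATM + S_OCEAN + S_LAND), all in GtC/yr."""
    return (e_ff + e_luc) - (g_atm + s_ocean + s_land)

# Decadal means (2009-2018) from the abstract, in GtC/yr:
b_im = budget_imbalance(9.5, 1.5, 4.9, 2.5, 3.2)
print(round(b_im, 1))  # 0.4, matching the reported imbalance
```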
981 citations
••
University of Maryland, College Park1, University of Illinois at Urbana–Champaign2, Columbia University3, Goddard Space Flight Center4, Centre National D'Etudes Spatiales5, Haverford College6, University of Southampton7, Stony Brook University8, California Institute of Technology9, University of Alberta10, United States Naval Research Laboratory11, Kyoto University12, Massachusetts Institute of Technology13
TL;DR: In this paper, the mass and radius of the isolated 205.53 Hz millisecond pulsar PSR J0030+0451 were estimated using a Bayesian inference approach to analyze its energy-dependent thermal X-ray waveform, which was observed using the Neutron Star Interior Composition Explorer (NICER).
Abstract: Neutron stars are not only of astrophysical interest, but are also of great interest to nuclear physicists because their attributes can be used to determine the properties of the dense matter in their cores. One of the most informative approaches for determining the equation of state (EoS) of this dense matter is to measure both a star's equatorial circumferential radius Re and its gravitational mass M. Here we report estimates of the mass and radius of the isolated 205.53 Hz millisecond pulsar PSR J0030+0451 obtained using a Bayesian inference approach to analyze its energy-dependent thermal X-ray waveform, which was observed using the Neutron Star Interior Composition Explorer (NICER). This approach is thought to be less subject to systematic errors than other approaches for estimating neutron star radii. We explored a variety of emission patterns on the stellar surface. Our best-fit model has three oval, uniform-temperature emitting spots and provides an excellent description of the pulse waveform observed using NICER. The radius and mass estimates given by this model are Re = 13.02^{+1.24}_{-1.06} km and M = 1.44^{+0.15}_{-0.14} M⊙ (68% credibility). The independent analysis reported in the companion paper by Riley et al. explores different emitting spot models, but finds spot shapes and locations and estimates of Re and M that are consistent with those found in this work. We show that our measurements of Re and M for PSR J0030+0451 improve the astrophysical constraints on the EoS of cold, catalyzed matter above nuclear saturation density.
758 citations
••
University of Amsterdam1, Columbia University2, United States Naval Research Laboratory3, California Institute of Technology4, Hoffmann-La Roche5, University of Toulouse6, Goddard Space Flight Center7, Massachusetts Institute of Technology8, University of Southampton9, Haverford College10, Stony Brook University11, University of Alberta12
TL;DR: In this paper, the mass and equatorial radius of the millisecond pulsar PSR J0030+0451 were estimated based on a relativistic ray-tracing of thermal emission from hot regions of the pulsar surface.
Abstract: We report on Bayesian parameter estimation of the mass and equatorial radius of the millisecond pulsar PSR J0030+0451, conditional on pulse-profile modeling of Neutron Star Interior Composition Explorer (NICER) X-ray spectral-timing event data. We perform relativistic ray-tracing of thermal emission from hot regions of the pulsar's surface. We assume two distinct hot regions based on two clear pulsed components in the phase-folded pulse-profile data; we explore a number of forms (morphologies and topologies) for each hot region, inferring their parameters in addition to the stellar mass and radius. For the family of models considered, the evidence (prior predictive probability of the data) strongly favors a model that permits both hot regions to be located in the same rotational hemisphere. Models wherein both hot regions are assumed to be simply connected circular single-temperature spots, in particular those where the spots are assumed to be reflection-symmetric with respect to the stellar origin, are strongly disfavored. For the inferred configuration, one hot region subtends an angular extent of only a few degrees (in spherical coordinates with origin at the stellar center) and we are insensitive to other structural details; the second hot region is far more azimuthally extended in the form of a narrow arc, thus requiring a larger number of parameters to describe. The inferred mass M and equatorial radius Req are, respectively, 1.34^{+0.15}_{-0.16} M⊙ and 12.71^{+1.14}_{-1.19} km, while the compactness GM/(Req c²) = 0.156^{+0.008}_{-0.010} is more tightly constrained; the credible interval bounds reported here are approximately the 16% and 84% quantiles in marginal posterior mass.
737 citations
••
TL;DR: In this paper, the authors improved initial estimates of the binary's properties, including component masses, spins, and tidal parameters, using the known source location, improved modeling, and recalibrated Virgo data.
Abstract: On August 17, 2017, the Advanced LIGO and Advanced Virgo gravitational-wave detectors observed a low-mass compact binary inspiral. The initial sky localization of the source of the gravitational-wave signal, GW170817, allowed electromagnetic observatories to identify NGC 4993 as the host galaxy. In this work, we improve initial estimates of the binary's properties, including component masses, spins, and tidal parameters, using the known source location, improved modeling, and recalibrated Virgo data. We extend the range of gravitational-wave frequencies considered down to 23 Hz, compared to 30 Hz in the initial analysis. We also compare results inferred using several signal models, which are more accurate and incorporate additional physical effects as compared to the initial analysis. We improve the localization of the gravitational-wave source to a 90% credible region of 16 deg2. We find tighter constraints on the masses, spins, and tidal parameters, and continue to find no evidence for nonzero component spins. The component masses are inferred to lie between 1.00 and 1.89 M⊙ when allowing for large component spins, and to lie between 1.16 and 1.60 M⊙ (with a total mass 2.73^{+0.04}_{-0.01} M⊙) when the spins are restricted to be within the range observed in Galactic binary neutron stars. Using a precessing model and allowing for large component spins, we constrain the dimensionless spins of the components to be less than 0.50 for the primary and 0.61 for the secondary. Under minimal assumptions about the nature of the compact objects, our constraints for the tidal deformability parameter Λ are (0, 630) when we allow for large component spins, and 300^{+420}_{-230} (using a 90% highest posterior density interval) when restricting the magnitude of the component spins, ruling out several equation-of-state models at the 90% credible level.
Finally, with LIGO and GEO600 data, we use a Bayesian analysis to place upper limits on the amplitude and spectral energy density of a possible postmerger signal.
715 citations
••
National Center for Atmospheric Research1, Lawrence Berkeley National Laboratory2, Oak Ridge National Laboratory3, Utrecht University4, Columbia University5, Chinese Academy of Sciences6, University of Houston7, University of California, Los Angeles8, California Institute of Technology9, Institute of Arctic and Alpine Research10, Los Alamos National Laboratory11, Harvard University12, Cooperative Institute for Research in Environmental Sciences13, University of Arizona14, University of Colorado Boulder15, Purdue University16, Michigan State University17, Argonne National Laboratory18, University of Michigan19, Auburn University20, Pacific Northwest National Laboratory21, Goddard Space Flight Center22, University of California, Irvine23, Virginia Tech24, University of Sheffield25
TL;DR: The Community Land Model (CLM) is the land component of the Community Earth System Model (CESM) and is used in several global and regional modeling systems.
Abstract: The Community Land Model (CLM) is the land component of the Community Earth System Model (CESM) and is used in several global and regional modeling systems. In this paper, we introduce model developments included in CLM version 5 (CLM5), which is the default land component for CESM2. We assess an ensemble of simulations, including prescribed and prognostic vegetation state, multiple forcing data sets, and CLM4, CLM4.5, and CLM5, against a range of metrics including from the International Land Model Benchmarking (ILAMBv2) package. CLM5 includes new and updated processes and parameterizations: (1) dynamic land units, (2) updated parameterizations and structure for hydrology and snow (spatially explicit soil depth, dry surface layer, revised groundwater scheme, revised canopy interception and canopy snow processes, updated fresh snow density, simple firn model, and Model for Scale Adaptive River Transport), (3) plant hydraulics and hydraulic redistribution, (4) revised nitrogen cycling (flexible leaf stoichiometry, leaf N optimization for photosynthesis, and carbon costs for plant nitrogen uptake), (5) global crop model with six crop types and time‐evolving irrigated areas and fertilization rates, (6) updated urban building energy, (7) carbon isotopes, and (8) updated stomatal physiology. New optional features include demographically structured dynamic vegetation model (Functionally Assembled Terrestrial Ecosystem Simulator), ozone damage to plants, and fire trace gas emissions coupling to the atmosphere. Conclusive establishment of improvement or degradation of individual variables or metrics is challenged by forcing uncertainty, parametric uncertainty, and model structural complexity, but the multivariate metrics presented here suggest a general broad improvement from CLM4 to CLM5.
661 citations
••
TL;DR: The AERONET Version 3 (V3) algorithm provides fully automatic cloud screening and instrument anomaly quality control for near-real-time AOD data.
Abstract: The Aerosol Robotic Network (AERONET) has provided highly accurate, ground-truth measurements of the aerosol optical depth (AOD) using Cimel Electronique Sun–sky radiometers for more than 25 years. In Version 2 (V2) of the AERONET database, the near-real-time AOD was semiautomatically quality controlled utilizing mainly cloud-screening methodology, while additional AOD data contaminated by clouds or affected by instrument anomalies were removed manually before attaining quality-assured status (Level 2.0). The large growth in the number of AERONET sites over the past 25 years resulted in a significant burden to the manual quality control of millions of measurements in a consistent manner. The AERONET Version 3 (V3) algorithm provides fully automatic cloud screening and instrument anomaly quality controls. All of these new algorithm updates apply to near-real-time data as well as post-field-deployment processed data, and AERONET reprocessed the database in 2018. A full algorithm redevelopment provided the opportunity to improve data inputs and corrections such as unique filter-specific temperature characterizations for all visible and near-infrared wavelengths, updated gaseous and water vapor absorption coefficients, and ancillary data sets. The Level 2.0 AOD quality-assured data set is now available within a month after post-field calibration, reducing the lag time from up to several months. Near-real-time estimated uncertainty is determined using data qualified as V3 Level 2.0 AOD and considering the difference between the AOD computed with the pre-field calibration and the AOD computed with pre-field and post-field calibration. This assessment provides a near-real-time uncertainty estimate for which average differences of AOD suggest a +0.02 bias and one-sigma uncertainty of 0.02, spectrally, but the bias and uncertainty can be significantly larger for specific instrument deployments. Long-term monthly averages analyzed for the entire V3 and V2 databases produced average differences (V3–V2) of +0.002 with a ±0.02 SD (standard deviation), yet monthly averages calculated using time-matched observations in both databases were analyzed to compute an average difference of −0.002 with a ±0.004 SD. The high statistical agreement in multiyear monthly averaged AOD validates the advanced automatic data quality control algorithms and suggests that migrating research to the V3 database will corroborate most V2 research conclusions and likely lead to more accurate results in some cases.
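The near-real-time uncertainty assessment described above reduces to a bias and spread of calibration differences; a minimal sketch, where the AOD arrays are hypothetical illustrations rather than AERONET data:

```python
import statistics

def nrt_uncertainty(aod_prefield, aod_final):
    """Bias (mean) and one-sigma (sample stdev) of near-real-time AOD,
    computed with the pre-field calibration only, relative to the final
    AOD computed with both pre- and post-field calibrations."""
    diffs = [nrt - final for nrt, final in zip(aod_prefield, aod_final)]
    return statistics.mean(diffs), statistics.stdev(diffs)

# Hypothetical matched AOD retrievals from one deployment:
bias, sigma = nrt_uncertainty([0.12, 0.14, 0.16], [0.10, 0.12, 0.14])
```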
629 citations
••
TL;DR: Three decades of high-resolution Landsat 5 satellite imagery are used to investigate long-term trends in intense summertime near-surface phytoplankton blooms for 71 large lakes globally, revealing a worldwide exacerbation of bloom conditions.
Abstract: Freshwater blooms of phytoplankton affect public health and ecosystem services globally1,2. Harmful effects of such blooms occur when the intensity of a bloom is too high, or when toxin-producing phytoplankton species are present. Freshwater blooms result in economic losses of more than US$4 billion annually in the United States alone, primarily from harm to aquatic food production, recreation and tourism, and drinking-water supplies3. Studies that document bloom conditions in lakes have either focused only on individual or regional subsets of lakes4–6, or have been limited by a lack of long-term observations7–9. Here we use three decades of high-resolution Landsat 5 satellite imagery to investigate long-term trends in intense summertime near-surface phytoplankton blooms for 71 large lakes globally. We find that peak summertime bloom intensity has increased in most (68 per cent) of the lakes studied, revealing a global exacerbation of bloom conditions. Lakes that have experienced a significant (P < 0.1) decrease in bloom intensity are rare (8 per cent). The reason behind the increase in phytoplankton bloom intensity remains unclear, however, as temporal trends do not track consistently with temperature, precipitation, fertilizer-use trends or other previously hypothesized drivers. We do find, however, that lakes with a decrease in bloom intensity warmed less compared to other lakes, suggesting that lake warming may already be counteracting management efforts to ameliorate eutrophication10,11. Our findings support calls for water quality management efforts to better account for the interactions between climate change and local hydrological conditions12,13. Analyses show that the peak intensity of summertime phytoplankton blooms has increased in 71 large lakes globally over the past three decades, revealing a worldwide exacerbation of bloom conditions.
595 citations
••
University of Maryland, College Park1, University of Illinois at Urbana–Champaign2, Columbia University3, Goddard Space Flight Center4, Centre National D'Etudes Spatiales5, Haverford College6, University of Southampton7, Stony Brook University8, California Institute of Technology9, University of Alberta10, United States Naval Research Laboratory11, Kyoto University12, Massachusetts Institute of Technology13
TL;DR: In this article, the mass and radius of the isolated 205.53 Hz millisecond pulsar PSR J0030+0451 were estimated using a Bayesian inference approach to analyze its energy-dependent thermal X-ray waveform.
Abstract: Neutron stars are not only of astrophysical interest, but are also of great interest to nuclear physicists, because their attributes can be used to determine the properties of the dense matter in their cores. One of the most informative approaches for determining the equation of state of this dense matter is to measure both a star's equatorial circumferential radius $R_e$ and its gravitational mass $M$. Here we report estimates of the mass and radius of the isolated 205.53 Hz millisecond pulsar PSR J0030+0451 obtained using a Bayesian inference approach to analyze its energy-dependent thermal X-ray waveform, which was observed using the Neutron Star Interior Composition Explorer (NICER). This approach is thought to be less subject to systematic errors than other approaches for estimating neutron star radii. We explored a variety of emission patterns on the stellar surface. Our best-fit model has three oval, uniform-temperature emitting spots and provides an excellent description of the pulse waveform observed using NICER. The radius and mass estimates given by this model are $R_e = 13.02^{+1.24}_{-1.06}$ km and $M = 1.44^{+0.15}_{-0.14}\ M_\odot$ (68%). The independent analysis reported in the companion paper by Riley et al. (2019) explores different emitting spot models, but finds spot shapes and locations and estimates of $R_e$ and $M$ that are consistent with those found in this work. We show that our measurements of $R_e$ and $M$ for PSR J0030$+$0451 improve the astrophysical constraints on the equation of state of cold, catalyzed matter above nuclear saturation density.
586 citations
••
University of Amsterdam1, Columbia University2, United States Naval Research Laboratory3, California Institute of Technology4, Hoffmann-La Roche5, University of Toulouse6, Goddard Space Flight Center7, Massachusetts Institute of Technology8, Haverford College9, University of Southampton10, Stony Brook University11, University of Alberta12
TL;DR: In this article, the mass and equatorial radius of the millisecond pulsar PSR J0030$+$0451 were estimated from the NICER X-ray spectral-timing event data.
Abstract: We report on Bayesian parameter estimation of the mass and equatorial radius of the millisecond pulsar PSR J0030$+$0451, conditional on pulse-profile modeling of Neutron Star Interior Composition Explorer (NICER) X-ray spectral-timing event data. We perform relativistic ray-tracing of thermal emission from hot regions of the pulsar's surface. We assume two distinct hot regions based on two clear pulsed components in the phase-folded pulse-profile data; we explore a number of forms (morphologies and topologies) for each hot region, inferring their parameters in addition to the stellar mass and radius. For the family of models considered, the evidence (prior predictive probability of the data) strongly favors a model that permits both hot regions to be located in the same rotational hemisphere. Models wherein both hot regions are assumed to be simply-connected circular single-temperature spots, in particular those where the spots are assumed to be reflection-symmetric with respect to the stellar origin, are strongly disfavored. For the inferred configuration, one hot region subtends an angular extent of only a few degrees (in spherical coordinates with origin at the stellar center) and we are insensitive to other structural details; the second hot region is far more azimuthally extended in the form of a narrow arc, thus requiring a larger number of parameters to describe. The inferred mass $M$ and equatorial radius $R_\mathrm{eq}$ are, respectively, $1.34_{-0.16}^{+0.15}$ M$_{\odot}$ and $12.71_{-1.19}^{+1.14}$ km, whilst the compactness $GM/R_\mathrm{eq}c^2 = 0.156_{-0.010}^{+0.008}$ is more tightly constrained; the credible interval bounds reported here are approximately the $16\%$ and $84\%$ quantiles in marginal posterior mass.
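The compactness quoted above, $GM/(R_\mathrm{eq} c^2)$, can be checked directly from the inferred mass and radius; the constants below are rounded reference values:

```python
# Rounded physical constants (SI):
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def compactness(mass_msun, radius_km):
    """Dimensionless compactness GM / (R c^2)."""
    return G * mass_msun * M_SUN / (radius_km * 1e3 * C**2)

# Median values inferred for PSR J0030+0451:
print(round(compactness(1.34, 12.71), 3))  # ~0.156, as reported
```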
••
Natural Resources Canada1, United States Geological Survey2, Michigan State University3, Goddard Space Flight Center4, Boston University5, University of Idaho6, Agricultural Research Service7, United States Forest Service8, University of Massachusetts Boston9, European Space Agency10, South Dakota State University11, University of British Columbia12, United States Department of Agriculture13, Humboldt University of Berlin14, Oregon State University15, Desert Research Institute16, University of Nebraska–Lincoln17, Geoscience Australia18, University of Colorado Boulder19, Rochester Institute of Technology20, University of California, Los Angeles21, Virginia Tech22, Texas Tech University23, University of Connecticut24
TL;DR: The programmatic developments and institutional context for the Landsat program and the unique ability of Landsat to meet the needs of national and international programs are described and the key trends in Landsat science are presented.
••
TL;DR: The Gravity Recovery and Climate Experiment (GRACE) mission allows monitoring of changes in hydrology and the cryosphere, with terrestrial and ocean applications; this review focuses on its contribution to the detection and quantification of climate change signals.
Abstract: Time-resolved satellite gravimetry has revolutionized understanding of mass transport in the Earth system. Since 2002, the Gravity Recovery and Climate Experiment (GRACE) has enabled monitoring of the terrestrial water cycle, ice sheet and glacier mass balance, sea level change, and ocean bottom pressure variations, improving understanding of responses to changes in the global climate system. Initially a pioneering geodesy experiment, the time-variable observations have matured into reliable mass transport products, allowing assessment and forecasting of a number of important climate trends and improving service applications such as the U.S. Drought Monitor. With the successful launch of the GRACE Follow-On mission, a multidecadal record of mass variability in the Earth system is within reach.
••
TL;DR: In this paper, the mass, spin, and redshift distributions of binary black hole (BBH) mergers with LIGO and Advanced Virgo observations were analyzed using phenomenological population models.
Abstract: We present results on the mass, spin, and redshift distributions with phenomenological population models using the 10 binary black hole (BBH) mergers detected in the first and second observing runs completed by Advanced LIGO and Advanced Virgo. We constrain properties of the BBH mass spectrum using models with a range of parameterizations of the BBH mass and spin distributions. We find that the mass distribution of the more massive BH in such binaries is well approximated by models with no more than 1% of BHs more massive than 45 M⊙ and a power-law index of (90% credibility). We also show that BBHs are unlikely to be composed of BHs with large spins aligned to the orbital angular momentum. Modeling the evolution of the BBH merger rate with redshift, we show that it is flat or increasing with redshift with 93% probability. Marginalizing over uncertainties in the BBH population, we find robust estimates of the BBH merger rate density of R= (90% credibility). As the BBH catalog grows in future observing runs, we expect that uncertainties in the population model parameters will shrink, potentially providing insights into the formation of BHs via supernovae, binary interactions of massive stars, stellar cluster dynamics, and the formation history of BHs across cosmic time.
••
California Institute of Technology1, Carnegie Learning2, University of Michigan3, National Central University4, Goddard Space Flight Center5, University of Maryland, College Park6, Northwestern University7, Adler Planetarium8, University of Washington9, Weizmann Institute of Science10, University of California, Santa Barbara11, University of Wisconsin–Milwaukee12
TL;DR: The Zwicky Transient Facility (ZTF) as mentioned in this paper is a robotic time-domain survey currently in progress using the Palomar 48-inch Schmidt Telescope, which uses a 600 megapixel camera to scan the entire northern visible sky at rates of ~3760 square degrees/hour.
Abstract: The Zwicky Transient Facility (ZTF) is a new robotic time-domain survey currently in progress using the Palomar 48-inch Schmidt Telescope. ZTF uses a 47 square degree field with a 600 megapixel camera to scan the entire northern visible sky at rates of ~3760 square degrees/hour to median depths of g ~ 20.8 and r ~ 20.6 mag (AB, 5σ in 30 sec). We describe the Science Data System that is housed at IPAC, Caltech. This comprises the data-processing pipelines, alert production system, data archive, and user interfaces for accessing and analyzing the products. The real-time pipeline employs a novel image-differencing algorithm, optimized for the detection of point-source transient events. These events are vetted for reliability using a machine-learned classifier and combined with contextual information to generate data-rich alert packets. The packets become available for distribution typically within 13 minutes (95th percentile) of observation. Detected events are also linked to generate candidate moving-object tracks using a novel algorithm. Objects that move fast enough to streak in the individual exposures are also extracted and vetted. We present some preliminary results of the calibration performance delivered by the real-time pipeline. The reconstructed astrometric accuracy per science image with respect to Gaia DR1 is typically 45 to 85 milliarcsec. This is the RMS per-axis on the sky for sources extracted with photometric S/N ≥ 10 and hence corresponds to the typical astrometric uncertainty down to this limit. The derived photometric precision (repeatability) at bright unsaturated fluxes varies between 8 and 25 millimag. The high end of these ranges corresponds to an airmass approaching ~2—the limit of the public survey. Photometric calibration accuracy with respect to Pan-STARRS1 is generally better than 2%. 
The products support a broad range of scientific applications: fast and young supernovae; rare flux transients; variable stars; eclipsing binaries; variability from active galactic nuclei; counterparts to gravitational wave sources; a more complete census of Type Ia supernovae; and solar-system objects.
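The quoted astrometric accuracy (45 to 85 milliarcsec RMS per axis for sources with photometric S/N ≥ 10) is an RMS of matched-source residuals against the reference catalog; a minimal sketch of that computation with toy residuals (the numbers and the S/N cut below are illustrative, not the ZTF pipeline code):

```python
import math

def rms_per_axis(residuals_mas):
    """Root-mean-square of single-axis astrometric residuals (milliarcsec)."""
    return math.sqrt(sum(r * r for r in residuals_mas) / len(residuals_mas))

def per_axis_accuracy(matches, snr_cut=10.0):
    """matches: (snr, dra_mas, ddec_mas) source-minus-reference offsets.
    Keep sources above the S/N cut and return the RMS for each axis."""
    kept = [(dra, ddec) for snr, dra, ddec in matches if snr >= snr_cut]
    ra = rms_per_axis([dra for dra, _ in kept])
    dec = rms_per_axis([ddec for _, ddec in kept])
    return ra, dec

# Toy matched-source residuals (hypothetical values, not ZTF data);
# the S/N = 8 entry falls below the cut and is excluded:
matches = [(50, 60.0, -40.0), (12, -30.0, 50.0), (8, 400.0, -400.0), (25, 45.0, 65.0)]
ra_rms, dec_rms = per_axis_accuracy(matches)
```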
••
TL;DR: In this paper, the authors place constraints on the dipole radiation and possible deviations from GR in the post-Newtonian coefficients that govern the inspiral regime of a binary neutron star inspiral.
Abstract: The recent discovery by Advanced LIGO and Advanced Virgo of a gravitational wave signal from a binary neutron star inspiral has enabled tests of general relativity (GR) with this new type of source. This source, for the first time, permits tests of strong-field dynamics of compact binaries in the presence of matter. In this Letter, we place constraints on the dipole radiation and possible deviations from GR in the post-Newtonian coefficients that govern the inspiral regime. Bounds on modified dispersion of gravitational waves are obtained; in combination with information from the observed electromagnetic counterpart we can also constrain effects due to large extra dimensions. Finally, the polarization content of the gravitational wave signal is studied. The results of all tests performed here show good agreement with GR.
••
Vanderbilt University1, Fisk University2, Harvard University3, Lehigh University4, Northern Kentucky University5, Boston University6, Vassar College7, Andrés Bello National University8, Space Telescope Science Institute9, Ames Research Center10, University of California, Riverside11, University of Maryland, Baltimore County12, Goddard Space Flight Center13, University of Chicago14, University of Florida15, University of North Carolina at Chapel Hill16, George Mason University17, Massachusetts Institute of Technology18, University of Sydney19, INAF20, Aarhus University21, University of New South Wales22
••
University of California, Berkeley1, Queen Mary University of London2, Smithsonian Astrophysical Observatory3, University of Minnesota4, University of New Hampshire5, University of Maryland, College Park6, University of Orléans7, Imperial College London8, University of Colorado Boulder9, Goddard Space Flight Center10, University of Maryland, Baltimore County11, University of Iowa12, University of Michigan13, University of Applied Sciences and Arts Northwestern Switzerland FHNW14, University of Paris15, Princeton University16, Johns Hopkins University Applied Physics Laboratory17, University of California, Los Angeles18
TL;DR: Measurements from the Parker Solar Probe show that slow solar wind near the Sun’s equator originates in coronal holes, and plasma-wave measurements suggest the existence of electron and ion velocity-space micro-instabilities that are associated with plasma heating and thermalization processes.
Abstract: During the solar minimum, when the Sun is at its least active, the solar wind1,2 is observed at high latitudes as a predominantly fast (more than 500 kilometres per second), highly Alfvenic rarefied stream of plasma originating from deep within coronal holes. Closer to the ecliptic plane, the solar wind is interspersed with a more variable slow wind3 of less than 500 kilometres per second. The precise origins of the slow wind streams are less certain4; theories and observations suggest that they may originate at the tips of helmet streamers5,6, from interchange reconnection near coronal hole boundaries7,8, or within coronal holes with highly diverging magnetic fields9,10. The heating mechanism required to drive the solar wind is also unresolved, although candidate mechanisms include Alfven-wave turbulence11,12, heating by reconnection in nanoflares13, ion cyclotron wave heating14 and acceleration by thermal gradients1. At a distance of one astronomical unit, the wind is mixed and evolved, and therefore much of the diagnostic structure of these sources and processes has been lost. Here we present observations from the Parker Solar Probe15 at 36 to 54 solar radii that show evidence of slow Alfvenic solar wind emerging from a small equatorial coronal hole. The measured magnetic field exhibits patches of large, intermittent reversals that are associated with jets of plasma and enhanced Poynting flux and that are interspersed in a smoother and less turbulent flow with a near-radial magnetic field. Furthermore, plasma-wave measurements suggest the existence of electron and ion velocity-space micro-instabilities10,16 that are associated with plasma heating and thermalization processes. Our measurements suggest that there is an impulsive mechanism associated with solar-wind energization and that micro-instabilities play a part in heating, and we provide evidence that low-latitude coronal holes are a key source of the slow solar wind. 
Measurements from the Parker Solar Probe show that slow solar wind near the Sun’s equator originates in coronal holes.
••
University of California, Los Angeles1, University of California, Berkeley2, Goddard Space Flight Center3, Nagoya University4, Kanazawa University5, Tohoku University6, Korea Astronomy and Space Science Institute7, The Aerospace Corporation8, University of Washington9, Dartmouth College10, Montana State University11, University of California, Santa Cruz12, National Cheng Kung University13, Academia Sinica Institute of Astronomy and Astrophysics14, University of Tokyo15, National Central University16, National Oceanic and Atmospheric Administration17, Cooperative Institute for Research in Environmental Sciences18, Johns Hopkins University Applied Physics Laboratory19, Kyushu University20, Kyoto University21, National Institute of Polar Research22, University of Colorado Boulder23, University of Iowa24, University of New Hampshire25, Southwest Research Institute26, National Center for Atmospheric Research27, Université Paris-Saclay28, Boston University29, Braunschweig University of Technology30, University of Calgary31, University of Graz32, University of Minnesota33
TL;DR: The SPEDAS development history, goals, and current implementation are reviewed, and its “modes of use” are explained with examples geared for users and its technical implementation and requirements with software developers in mind are outlined.
Abstract: With the advent of the Heliophysics/Geospace System Observatory (H/GSO), a complement of multi-spacecraft missions and ground-based observatories to study the space environment, data retrieval, analysis, and visualization of space physics data can be daunting. The Space Physics Environment Data Analysis System (SPEDAS), a grass-roots software development platform (www.spedas.org), is now officially supported by NASA Heliophysics as part of its data environment infrastructure. It serves more than a dozen space missions and ground observatories and can integrate the full complement of past and upcoming space physics missions with minimal resources, following clear, simple, and well-proven guidelines. Free, modular, and configurable to the needs of individual missions, it works in both command-line mode (ideal for experienced users) and Graphical User Interface (GUI) mode (reducing the learning curve for first-time users). Both options have “crib-sheets,” user-command sequences in ASCII format that can facilitate record-and-repeat actions, especially for complex operations and plotting. Crib-sheets enhance scientific interactions, as users can move rapidly and accurately from exchanges of technical information on data processing to efficient discussions regarding data interpretation and science. SPEDAS can readily query and ingest all International Solar Terrestrial Physics (ISTP)-compatible products from the Space Physics Data Facility (SPDF), enabling access to a vast collection of historic and current mission data. The planned incorporation of Heliophysics Application Programmer’s Interface (HAPI) standards will facilitate data ingestion from distributed datasets that adhere to these standards. Although SPEDAS is currently Interactive Data Language (IDL)-based (and interfaces to Java-based tools such as Autoplot), efforts are underway to expand it further to work with Python (first as an interface tool and potentially even receiving an under-the-hood replacement). We review the SPEDAS development history, goals, and current implementation. We explain its “modes of use” with examples geared for users and outline its technical implementation and requirements with software developers in mind.
We also describe SPEDAS personnel and software management, interfaces with other organizations, resources and support structure available to the community, and future development plans.
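The HAPI standard mentioned above specifies a small set of REST endpoints for time-series access; as a sketch, a HAPI 3-style data request can be assembled as below. The server base URL and dataset name are hypothetical, and older HAPI 2 servers use id/time.min/time.max instead of dataset/start/stop.

```python
from urllib.parse import urlencode

def hapi_data_url(base, dataset, start, stop, parameters=None):
    """Build a HAPI 3-style /data request URL from a HAPI base endpoint.
    Times are ISO 8601 strings; `parameters` optionally restricts columns."""
    query = {"dataset": dataset, "start": start, "stop": stop}
    if parameters:
        query["parameters"] = ",".join(parameters)
    return f"{base.rstrip('/')}/data?{urlencode(query)}"

# Hypothetical server and dataset names, for illustration only:
url = hapi_data_url("https://example.org/hapi", "THEMIS_FGM",
                    "2019-01-01T00:00:00Z", "2019-01-02T00:00:00Z",
                    parameters=["Bx", "By"])
```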
••
National Radio Astronomy Observatory1, University of Cambridge2, University of Washington3, Space Telescope Science Institute4, Harvard University5, Minnesota State University Moorhead6, Max Planck Society7, Goddard Space Flight Center8, Villanova University9, Lowell Observatory10, European Space Agency11, Chalmers University of Technology12, University of Maryland, College Park13, The Catholic University of America14, European Southern Observatory15
TL;DR: Astroquery as discussed by the authors is a collection of tools for requesting data from databases hosted on remote servers with interfaces exposed on the Internet, including those with web pages but without formal application program interfaces.
Abstract: Astroquery is a collection of tools for requesting data from databases hosted on remote servers with interfaces exposed on the Internet, including those with web pages but without formal application program interfaces. These tools are built on the Python requests package, which is used to make HTTP requests, and astropy, which provides most of the data parsing functionality. astroquery modules generally attempt to replicate the web page interface provided by a given service as closely as possible, making the transition from browser-based to command-line interaction easy. astroquery has received significant contributions from throughout the astronomical community, including several from telescope archives. astroquery enables the creation of fully reproducible workflows from data acquisition through publication. This paper describes the philosophy, basic structure, and development model of the astroquery package. The complete documentation for astroquery can be found at http://astroquery.readthedocs.io/.
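The module pattern described above (mirroring a service's web form as keyword arguments, issuing the HTTP request, then parsing the reply into a table) can be sketched as follows; the class, service URL, and parameter names here are hypothetical illustrations, not astroquery's actual internals.

```python
from urllib.parse import urlencode

class ExampleQueryClass:
    """Sketch of the astroquery module pattern: each keyword argument maps
    onto one field of a service's web form. All names here are hypothetical."""
    BASE_URL = "https://archive.example.edu/search"

    def _build_request(self, object_name, radius_arcmin=5.0, fmt="votable"):
        # Mirror the web form fields as GET parameters.
        params = {"object": object_name, "radius": radius_arcmin, "format": fmt}
        return f"{self.BASE_URL}?{urlencode(params)}"

    def query_object(self, object_name, **kwargs):
        url = self._build_request(object_name, **kwargs)
        # A real module would fetch `url` with the requests package and
        # parse the response with astropy; here we just return the URL.
        return url

url = ExampleQueryClass().query_object("M31", radius_arcmin=2.0)
```

This is why browser-to-command-line transitions are easy in astroquery: the keyword arguments correspond one-to-one with the form fields a user would fill in on the web page.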
••
TL;DR: The satellite record reveals that a gradual, decades-long overall increase in Antarctic sea ice extents reversed in 2014, with subsequent rates of decrease in 2014–2017 far exceeding the more widely publicized decay rates experienced in the Arctic.
Abstract: Following over 3 decades of gradual but uneven increases in sea ice coverage, the yearly average Antarctic sea ice extents reached a record high of 12.8 × 10⁶ km² in 2014, followed by a decline so precipitous that they reached their lowest value in the 40-y 1979–2018 satellite multichannel passive-microwave record, 10.7 × 10⁶ km², in 2017. In contrast, it took the Arctic sea ice cover a full 3 decades to register a loss that great in yearly average ice extents. Still, when considering the 40-y record as a whole, the Antarctic sea ice continues to have a positive overall trend in yearly average ice extents, although at 11,300 ± 5,300 km²⋅y⁻¹, this trend is only 50% of the trend for 1979–2014, before the precipitous decline. Four of the 5 sectors into which the Antarctic sea ice cover is divided also have 40-y positive trends that are well reduced from their 2014–2017 values. The one anomalous sector in this regard, the Bellingshausen/Amundsen Seas, has a 40-y negative trend, with the yearly average ice extents decreasing overall in the first 3 decades, reaching a minimum in 2007, and exhibiting an overall upward trend since 2007 (i.e., reflecting a reversal in the opposite direction from the other 4 sectors and the Antarctic sea ice cover as a whole).
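Trends quoted above in km²⋅y⁻¹ are linear fits to the yearly average extents; a minimal ordinary-least-squares slope sketch (the yearly values below are illustrative, not the satellite record):

```python
def linear_trend(years, extents):
    """Ordinary least-squares slope of extent versus year
    (returned in extent units per year)."""
    n = len(years)
    ybar = sum(years) / n
    ebar = sum(extents) / n
    num = sum((y - ybar) * (e - ebar) for y, e in zip(years, extents))
    den = sum((y - ybar) ** 2 for y in years)
    return num / den

# Toy yearly average extents in 10^6 km^2 (illustrative, not the record):
years = list(range(2010, 2018))
extents = [11.9, 12.0, 12.1, 12.3, 12.8, 11.8, 11.0, 10.7]
trend = linear_trend(years, extents)  # negative for this declining toy series
```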
••
TL;DR: An ensemble model integrating multiple machine-learning algorithms and predictor variables estimates daily PM2.5 at a resolution of 1 km × 1 km across the contiguous United States, allowing epidemiologists to accurately estimate the adverse health effects of PM2.5.
••
Smithsonian Astrophysical Observatory1, University of Michigan2, University of California, Berkeley3, Imperial College London4, Massachusetts Institute of Technology5, Université Paris-Saclay6, University of New Hampshire7, Marshall Space Flight Center8, Los Alamos National Laboratory9, University of Iowa10, Johns Hopkins University Applied Physics Laboratory11, University of Alabama in Huntsville12, University of Arizona13, University of Delaware14, University of Toulouse15, University of Paris16, Goddard Space Flight Center17, University of California, Los Angeles18, Universities Space Research Association19, Princeton University20
TL;DR: Observations of solar-wind plasma at heliocentric distances of about 35 solar radii reveal an increasing rotational component to the flow velocity of the solar wind around the Sun, peaking at 35 to 50 kilometres per second—considerably above the amplitude of the waves.
Abstract: The prediction of a supersonic solar wind1 was first confirmed by spacecraft near Earth2,3 and later by spacecraft at heliocentric distances as small as 62 solar radii4. These missions showed that plasma accelerates as it emerges from the corona, aided by unidentified processes that transport energy outwards from the Sun before depositing it in the wind. Alfvenic fluctuations are a promising candidate for such a process because they are seen in the corona and solar wind and contain considerable energy5–7. Magnetic tension forces the corona to co-rotate with the Sun, but any residual rotation far from the Sun reported until now has been much smaller than the amplitude of waves and deflections from interacting wind streams8. Here we report observations of solar-wind plasma at heliocentric distances of about 35 solar radii9–11, well within the distance at which stream interactions become important. We find that Alfven waves organize into structured velocity spikes with durations of up to minutes, which are associated with propagating S-like bends in the magnetic-field lines. We detect an increasing rotational component to the flow velocity of the solar wind around the Sun, peaking at 35 to 50 kilometres per second, considerably above the amplitude of the waves. These flows exceed classical velocity predictions of a few kilometres per second, challenging models of circulation in the corona and calling into question our understanding of how stars lose angular momentum and spin down as they age12–14.
Data collected by the Parker Solar Probe in the solar corona are used to determine the organization of Alfven waves, revealing an increasing flow velocity peaking at 35–50 km s−1.
••
TL;DR: Early OSIRIS-REx observations of Bennu are considered to understand how the asteroid’s properties compare to pre-encounter expectations and to assess the prospects for sample return.
Abstract: NASA'S Origins, Spectral Interpretation, Resource Identification and Security-Regolith Explorer (OSIRIS-REx) spacecraft recently arrived at the near-Earth asteroid (101955) Bennu, a primitive body that represents the objects that may have brought prebiotic molecules and volatiles such as water to Earth1. Bennu is a low-albedo B-type asteroid2 that has been linked to organic-rich hydrated carbonaceous chondrites3. Such meteorites are altered by ejection from their parent body and contaminated by atmospheric entry and terrestrial microbes. Therefore, the primary mission objective is to return a sample of Bennu to Earth that is pristine-that is, not affected by these processes4. The OSIRIS-REx spacecraft carries a sophisticated suite of instruments to characterize Bennu's global properties, support the selection of a sampling site and document that site at a sub-centimetre scale5-11. Here we consider early OSIRIS-REx observations of Bennu to understand how the asteroid's properties compare to pre-encounter expectations and to assess the prospects for sample return. The bulk composition of Bennu appears to be hydrated and volatile-rich, as expected. However, in contrast to pre-encounter modelling of Bennu's thermal inertia12 and radar polarization ratios13-which indicated a generally smooth surface covered by centimetre-scale particles-resolved imaging reveals an unexpected surficial diversity. The albedo, texture, particle size and roughness are beyond the spacecraft design specifications. On the basis of our pre-encounter knowledge, we developed a sampling strategy to target 50-metre-diameter patches of loose regolith with grain sizes smaller than two centimetres4. We observe only a small number of apparently hazard-free regions, of the order of 5 to 20 metres in extent, the sampling of which poses a substantial challenge to mission success.
••
University of Milano-Bicocca1, Goddard Space Flight Center2, Forschungszentrum Jülich3, University of Twente4, École Polytechnique5, Max Planck Society6, University of Zurich7, Swiss Federal Institute of Aquatic Science and Technology8, University of Tasmania9, University of Toulouse10, York University11, University of Valencia12, Carnegie Institution for Science13, California Institute of Technology14
TL;DR: This review distills the last several decades of SIF research: its heritage and complementarity within the broader field of fluorescence science, the maturation of physiological and radiative-transfer modelling, SIF signal-retrieval strategies, techniques for field and airborne sensing, advances in satellite-based systems, and applications of these capabilities in evaluating photosynthesis and stress effects.
••
TL;DR: The results confirm that it is not possible to understand the current global terrestrial carbon sink without accounting for the sizeable sink due to forest demography, and imply that a large portion of the current terrestrial carbon sink is strictly transient in nature.
Abstract: Although the existence of a large carbon sink in terrestrial ecosystems is well-established, the drivers of this sink remain uncertain. It has been suggested that perturbations to forest demography caused by past land-use change, management, and natural disturbances may be causing a large component of current carbon uptake. Here we use a global compilation of forest age observations, combined with a terrestrial biosphere model with explicit modeling of forest regrowth, to partition the global forest carbon sink between old-growth and regrowth stands over the period 1981–2010. For 2001–2010 we find a carbon sink of 0.85 (0.66–0.96) Pg C yr−1 located in intact old-growth forest, primarily in the moist tropics and boreal Siberia, and 1.30 (1.03–1.96) Pg C yr−1 located in stands regrowing after past disturbance. Nearly half of the sink in regrowth stands would have occurred from demographic changes alone, in the absence of other environmental changes. These age-constrained results are consistent with those simulated using an ensemble of demographically-enabled terrestrial biosphere models following an independent reconstruction of historical land use and management. We estimate that forests will accumulate an additional 69 (44–131) Pg C in live biomass from changes in demography alone if natural disturbances, wood harvest, and reforestation continue at rates comparable to those during 1981–2010. Our results confirm that it is not possible to understand the current global terrestrial carbon sink without accounting for the sizeable sink due to forest demography. They also imply that a large portion of the current terrestrial carbon sink is strictly transient in nature.
••
TL;DR: In this article, the authors evaluated the performance of 26 gridded (sub-) daily P datasets to obtain a view into the merit of these innovations using the Kling-Gupta efficiency (KGE).
Abstract: New precipitation (P) datasets are released regularly, following innovations in weather forecasting models, satellite retrieval methods, and multi-source merging techniques. Using the conterminous US as a case study, we evaluated the performance of 26 gridded (sub-)daily P datasets to obtain insight into the merit of these innovations. The evaluation was performed at a daily timescale for the period 2008–2017 using the Kling–Gupta efficiency (KGE), a performance metric combining correlation, bias, and variability. As a reference, we used the high-resolution (4 km) Stage-IV gauge-radar P dataset. Among the three KGE components, the P datasets performed worst overall in terms of correlation (related to event identification). In terms of improving KGE scores for these datasets, improved P totals (affecting the bias score) and improved distribution of P intensity (affecting the variability score) are of secondary importance. Among the 11 gauge-corrected P datasets, the best overall performance was obtained by MSWEP V2.2, underscoring the importance of applying daily gauge corrections and accounting for gauge reporting times. Several uncorrected P datasets outperformed gauge-corrected ones. Among the 15 uncorrected P datasets, the best performance was obtained by the ERA5-HRES fourth-generation reanalysis, reflecting the significant advances in earth system modeling during the last decade. The (re)analyses generally performed better in winter than in summer, while the opposite was the case for the satellite-based datasets. IMERGHH V05 performed substantially better than TMPA-3B42RT V7, attributable to the many improvements implemented in the IMERG satellite P retrieval algorithm. IMERGHH V05 outperformed ERA5-HRES in regions dominated by convective storms, while the opposite was observed in regions of complex terrain. The ERA5-EDA ensemble average exhibited higher correlations than the ERA5-HRES deterministic run, highlighting the value of ensemble modeling. The WRF regional convection-permitting climate model showed considerably more accurate P totals over the mountainous west and performed best among the uncorrected datasets in terms of variability, suggesting there is merit in using high-resolution models to obtain climatological P statistics. Our findings provide some guidance to choose the most suitable P dataset for a particular application.
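The Kling–Gupta efficiency used in the evaluation above combines the three named components; a minimal sketch of the original Gupta et al. (2009) formulation, assuming the standard definitions r (Pearson correlation), alpha = sigma_sim/sigma_obs (variability ratio), and beta = mu_sim/mu_obs (bias ratio):

```python
import math

def kge(sim, obs):
    """Kling-Gupta efficiency (Gupta et al., 2009):
    KGE = 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2).
    Perfect agreement gives KGE = 1; lower values indicate worse skill."""
    n = len(sim)
    ms, mo = sum(sim) / n, sum(obs) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(sim, obs)) / n
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim) / n)
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    r = cov / (ss * so)          # correlation component
    alpha = ss / so              # variability component
    beta = ms / mo               # bias component
    return 1.0 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [0.0, 5.2, 0.0, 12.4, 3.1, 0.0, 7.8]   # toy daily P series (mm)
score_perfect = kge(obs, obs)                 # 1.0 by construction
score_biased = kge([2 * o for o in obs], obs) # r = 1 but alpha = beta = 2
```

A dataset that doubles every observed value keeps perfect correlation yet is penalized through both the bias and variability terms, which is exactly why the paper can attribute dataset weaknesses to individual KGE components.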
••
University of Manchester1, Arecibo Observatory2, Lafayette College3, National Radio Astronomy Observatory4, United States Naval Research Laboratory5, Swinburne University of Technology6, Goddard Space Flight Center7, West Virginia University8, ASTRON9, Curtin University10, Cornell University11, INAF12, Canadian Institute for Advanced Research13, Peking University14, Max Planck Society15, PSL Research University16, University of Orléans17, University of British Columbia18, Australia Telescope National Facility19, Paris Diderot University20, Hillsdale College21, University of East Anglia22, University of Maryland, College Park23, McGill University24, University of Birmingham25, University of Washington26, University of Wisconsin–Milwaukee27, Radboud University Nijmegen28, University of Toronto29, York University30, Hungarian Academy of Sciences31, University of Cagliari32, Commonwealth Scientific and Industrial Research Organisation33, University of Milan34, Oregon State University35, California Institute of Technology36, Vanderbilt University37, Chinese Academy of Sciences38, Monash University, Clayton campus39
TL;DR: In this article, the authors describe the International Pulsar Timing Array second data release, which includes recent pulsar timing data obtained by three regional consortia: the European Pulsar Timing Array, the North American Nanohertz Observatory for Gravitational Waves, and the Parkes Pulsar Timing Array. They find that the timing precisions of the pulsars are generally improved compared to the previous data release, mainly due to the addition of new data in the combination.
Abstract: In this paper, we describe the International Pulsar Timing Array second data release, which includes recent pulsar timing data obtained by three regional consortia: the European Pulsar Timing Array, the North American Nanohertz Observatory for Gravitational Waves, and the Parkes Pulsar Timing Array. We analyse and where possible combine high-precision timing data for 65 millisecond pulsars which are regularly observed by these groups. A basic noise analysis, including the processes which are both correlated and uncorrelated in time, provides noise models and timing ephemerides for the pulsars. We find that the timing precisions of pulsars are generally improved compared to the previous data release, mainly due to the addition of new data in the combination. The main purpose of this work is to create the most up-to-date IPTA data release. These data are publicly available for searches for low-frequency gravitational waves and other pulsar science.
••
California Institute of Technology1, University of Washington2, Stockholm University3, University of Maryland, College Park4, Auburn University5, University of Wisconsin–Milwaukee6, Goddard Space Flight Center7, National Central University8, University of California, Santa Barbara9, University of Michigan10, Adler Planetarium11, Northwestern University12, Lawrence Berkeley National Laboratory13, University of California, Berkeley14, Weizmann Institute of Science15, Radboud University Nijmegen16, Humboldt University of Berlin17, Macau University of Science and Technology18, Tel Aviv University19, Soka University of America20, Centre national de la recherche scientifique21, Los Alamos National Laboratory22
TL;DR: The Zwicky Transient Facility (ZTF) as mentioned in this paper is a new time-domain survey employing a dedicated camera on the Palomar 48-inch Schmidt telescope with a 47 deg^2 field of view and an 8 second readout time.
Abstract: The Zwicky Transient Facility (ZTF), a public–private enterprise, is a new time-domain survey employing a dedicated camera on the Palomar 48-inch Schmidt telescope with a 47 deg^2 field of view and an 8 second readout time. It is well positioned in the development of time-domain astronomy, offering operations at 10% of the scale and style of the Large Synoptic Survey Telescope (LSST) with a single 1-m class survey telescope. The public surveys will cover the observable northern sky every three nights in g and r filters and the visible Galactic plane every night in g and r. Alerts generated by these surveys are sent in real time to brokers. A consortium of universities that provided funding ("partnership") is undertaking several boutique surveys. The combination of these surveys producing one million alerts per night allows for exploration of transient and variable astrophysical phenomena brighter than r ~ 20.5 on timescales of minutes to years. We describe the primary science objectives driving ZTF, including the physics of supernovae and relativistic explosions, multi-messenger astrophysics, supernova cosmology, active galactic nuclei, tidal disruption events, stellar variability, and solar-system objects.