Showing papers by "Jet Propulsion Laboratory" published in 2019
••
TL;DR: The nuclei of most normal galaxies contain supermassive black holes, which can accrete gas through a disk and become active; these active galactic nuclei can form jets that are observed on s...
Abstract: The nuclei of most normal galaxies contain supermassive black holes, which can accrete gas through a disk and become active. These active galactic nuclei (AGNs) can form jets that are observed on s...
401 citations
••
TL;DR: The MODIS Collection 6 (C6) Global Land Cover Type product as discussed by the authors uses a hierarchical classification model where the classes included in each level of the hierarchy reflect structured distinctions between land cover properties.
367 citations
••
TL;DR: In this paper, the authors identify and quantify the key material properties that make Bi2Te3 such a good thermoelectric material, which can be used for benchmarking future improvements in Bi2Te3 or new replacement materials.
Abstract: DOI: 10.1002/aelm.201800904. Nearly every device made for efficient thermoelectric cooling or temperature management uses Bi2Te3 alloys. Such solid-state devices dominate the market for temperature control in optoelectronics. As the need to eliminate greenhouse-gas refrigerants increases, Peltier cooling is becoming more attractive, particularly in small systems where efficiencies are comparable to traditional refrigerant-based cooling. Such small devices may enable distributive heating/cooling (only where and when it is needed) with higher system-level energy efficiency, for example in electric vehicles where energy for heating/cooling competes with vehicle range. Even for thermoelectric power generation, e.g., recovery of waste heat, Bi2Te3 alloys are the most used because of superior efficiency up to 200 °C, and the technology to make devices with Bi2Te3 is the most advanced.[1–3] While the material and production technology for making Bi2Te3-based devices has remained essentially unchanged since the 1960s, our understanding of these materials has advanced considerably. Most recently, the interest in topological insulators (TI) has led to new insights into the complex electronic structure,[4,5] revealing that, with the accuracy in assessing band structures available today, improvements in the electronic structure by band engineering should be not only possible but predictable.[6–9] Indeed, the p-type alloys chosen for use in commercial Peltier coolers appear to have unintentionally arrived at a composition close to a band convergence. The understanding of defects and doping is also advancing rapidly, which will lead to new strategies for additional improvements in the electronic properties.
The thermal conductivity of Bi2Te3-based alloys can also be engineered, where in particular there is much recent interest in microstructure engineering or nanostructuring.[10–22] Reduced thermal conductivity has led to numerous reports of exceptionally high efficiency (zT) that would be sufficient to revolutionize the industry. However, between measurement and material uncertainties, a revolutionary new Bi2Te3-based material has not made it to the market. Because even small but reliable improvements could make a significant impact, it is worthwhile to better understand all the complex, interdependent effects of band engineering and microstructure engineering. To demonstrate and quantify improvements in thermoelectric properties, it is necessary to have well characterized properties or reliable benchmarks for comparison. Bismuth telluride is the working material for most Peltier cooling devices and thermoelectric generators. This is because Bi2Te3 (or more precisely its alloys with Sb2Te3 for p-type and Bi2Se3 for n-type material) has the highest thermoelectric figure of merit, zT, of any material around room temperature. Since thermoelectric technology will be greatly enhanced by improving Bi2Te3 or finding a superior material, this review aims to identify and quantify the key material properties that make Bi2Te3 such a good thermoelectric. The large zT can be traced to the high band degeneracy, low effective mass, high carrier mobility, and relatively low lattice thermal conductivity, which all contribute to its remarkably high thermoelectric quality factor. Using literature data augmented with newer results, these material parameters are quantified, giving clear insight into the tailoring of the electronic band structure of Bi2Te3 by alloying, or reducing thermal conductivity by nanostructuring.
For example, this analysis clearly shows that the minority carrier excitation across the small bandgap significantly limits the thermoelectric performance of Bi2Te3, even at room temperature, showing that larger bandgap alloys are needed for higher temperature operation. Such effective material parameters can also be used for benchmarking future improvements in Bi2Te3 or new replacement materials.
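The figure of merit zT discussed in this abstract combines the Seebeck coefficient, electrical conductivity, thermal conductivity, and absolute temperature. A minimal sketch of the standard formula, using illustrative room-temperature values for a Bi2Te3-type alloy (the numerical inputs are assumptions for demonstration, not data from this review):

```python
def figure_of_merit(seebeck, sigma, kappa, temperature):
    """Thermoelectric figure of merit zT = S^2 * sigma * T / kappa (dimensionless).

    seebeck: Seebeck coefficient S in V/K
    sigma: electrical conductivity in S/m
    kappa: total thermal conductivity in W/(m K)
    temperature: absolute temperature in K
    """
    return seebeck**2 * sigma * temperature / kappa

# Assumed illustrative values for an optimized Bi2Te3-type alloy near 300 K
zT = figure_of_merit(seebeck=225e-6, sigma=1.1e5, kappa=1.4, temperature=300.0)
print(round(zT, 2))  # a value near 1.2, consistent with zT ~ 1 at room temperature
```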
350 citations
••
TL;DR: The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) spectral library version 2.0 has been expanded to support ECOSTRESS studies by including major additions of laboratory measured vegetation and non-photosynthetic vegetation (NPV) spectra as discussed by the authors.
205 citations
••
TL;DR: The science case and observational strategy for the Very Large Array Sky Survey are presented, together with results from early survey observations.
Abstract: The Very Large Array Sky Survey (VLASS) is a synoptic, all-sky radio survey with a unique combination of high angular resolution ($\approx$2.5"), sensitivity (a 1$\sigma$ goal of 70 $\mu$Jy/beam in the coadded data), full linear Stokes polarimetry, time domain coverage, and wide bandwidth (2-4 GHz). The first observations began in September 2017, and observing for the survey will finish in 2024. VLASS will use approximately 5500 hours of time on the Karl G. Jansky Very Large Array (VLA) to cover the whole sky visible to the VLA (Declination $>-40^{\circ}$), a total of 33,885 deg$^2$. The data will be taken in three epochs to allow the discovery of variable and transient radio sources. The survey is designed to engage radio astronomy experts, multi-wavelength astronomers, and citizen scientists alike. By utilizing an "on the fly" interferometry mode, the observing overheads are much reduced compared to a conventional pointed survey. In this paper, we present the science case and observational strategy for the survey, and also results from early survey observations.
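Because VLASS coadds three epochs, the stated 1σ goal of 70 µJy/beam in the coadded data implies a per-epoch depth of roughly 70·√3 ≈ 120 µJy/beam, assuming independent Gaussian noise of equal depth per epoch. A sketch of that inference (the per-epoch figure is deduced from the stated goal, not quoted in the abstract):

```python
import math

def coadded_rms(per_epoch_rms_ujy, n_epochs):
    """RMS noise of a coadd of n independent, equal-depth epochs (µJy/beam)."""
    return per_epoch_rms_ujy / math.sqrt(n_epochs)

# Assumed per-epoch depth of ~120 µJy/beam over the three VLASS epochs
print(round(coadded_rms(120.0, 3), 1))  # 69.3, close to the 70 µJy/beam coadd goal
```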
202 citations
••
West Virginia University1, Canadian Institute for Advanced Research2, California Institute of Technology3, Jet Propulsion Laboratory4, Hillsdale College5, University of Washington6, University of Illinois at Urbana–Champaign7, Harvard University8, Northwestern University9, National Radio Astronomy Observatory10, Vanderbilt University11, Fisk University12, York University13, Eötvös Loránd University14, University of Wisconsin–Milwaukee15, Swarthmore College16
TL;DR: Pulsar timing array (PTA) collaborations in North America, Australia, and Europe have been exploiting the exquisite timing precision of millisecond pulsars over decades of observations to search for correlated timing deviations induced by gravitational waves (GWs).
Abstract: Pulsar timing array (PTA) collaborations in North America, Australia, and Europe have been exploiting the exquisite timing precision of millisecond pulsars over decades of observations to search for correlated timing deviations induced by gravitational waves (GWs). PTAs are sensitive to the frequency band ranging from just below 1 nanohertz to a few tens of microhertz. The discovery space of this band is potentially rich with populations of inspiraling supermassive black hole binaries, decaying cosmic string networks, relic post-inflation GWs, and even non-GW imprints of axionic dark matter. This article aims to provide an understanding of the exciting open science questions in cosmology, galaxy evolution, and fundamental physics that will be addressed by the detection and study of GWs through PTAs. The focus of the article is on providing an understanding of the mechanisms by which PTAs can address specific questions in these fields, and on outlining some of the subtleties and difficulties in each case. The material included is weighted most heavily toward the questions which we expect will be answered in the near-term with PTAs; however, we have made efforts to include most currently anticipated applications of nanohertz GWs.
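The frequency band quoted above follows from the observing setup: the lowest accessible GW frequency is roughly the inverse of the total timing baseline, and the highest is the Nyquist frequency of the observing cadence. A sketch with assumed example values (a 15-year baseline and two-week cadence, which are illustrative, not parameters from the article):

```python
SECONDS_PER_YEAR = 3.156e7

def pta_band(baseline_years, cadence_days):
    """Approximate PTA sensitivity band in Hz: (1/T_obs, Nyquist of the cadence)."""
    f_min = 1.0 / (baseline_years * SECONDS_PER_YEAR)
    f_max = 1.0 / (2.0 * cadence_days * 86400.0)
    return f_min, f_max

f_min, f_max = pta_band(baseline_years=15.0, cadence_days=14.0)
# f_min is a few nanohertz (~2e-9 Hz); f_max ~4e-7 Hz for this cadence,
# with denser sampling of some pulsars pushing the upper edge higher
```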
196 citations
••
TL;DR: The results show that Saturn's rings are substantially younger than the planet itself and constrain models of their origin; five small moons located in and around the rings are presented; and the flows are confirmed to be very deep, likely extending down to the levels where magnetic dissipation occurs.
Abstract: The interior structure of Saturn, the depth of its winds, and the mass and age of its rings constrain its formation and evolution. In the final phase of the Cassini mission, the spacecraft dived between the planet and its innermost ring, at altitudes of 2600 to 3900 kilometers above the cloud tops. During six of these crossings, a radio link with Earth was monitored to determine the gravitational field of the planet and the mass of its rings. We find that Saturn's gravity deviates from theoretical expectations and requires differential rotation of the atmosphere extending to a depth of at least 9000 kilometers. The total mass of the rings is (1.54 ± 0.49) × 10¹⁹ kilograms (0.41 ± 0.13 times that of the moon Mimas), indicating that the rings may have formed 10⁷ to 10⁸ years ago.
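The quoted ring mass in Mimas units can be checked directly from the numbers in the abstract, taking Mimas's mass as about 3.75 × 10¹⁹ kg (a standard reference value, not stated in the abstract):

```python
RING_MASS_KG = 1.54e19    # total ring mass, from the abstract
MIMAS_MASS_KG = 3.75e19   # assumed standard value for the mass of Mimas

ratio = RING_MASS_KG / MIMAS_MASS_KG
print(round(ratio, 2))  # 0.41, matching the quoted 0.41 ± 0.13
```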
177 citations
••
TL;DR: In this paper, the authors highlight the issues being addressed by the SWOT science community to understand SWOT's very precise SSH / surface pressure observations, and explore how SWOT data will be combined with other satellite and in-situ data and models to better understand the upper ocean 4D circulation.
Abstract: The future international Surface Water and Ocean Topography (SWOT) Mission, planned for launch in 2021, will make high-resolution 2D observations of sea-surface height using SAR radar interferometric techniques. SWOT will map the global and coastal oceans up to 77.6° latitude every 21 days over a swath of 120 km (20 km nadir gap). Today’s 2D mapped altimeter data can resolve ocean scales of 150 km wavelength, whereas the SWOT measurement will extend our 2D observations down to 15-30 km, depending on sea state. SWOT will offer new opportunities to observe the oceanic dynamic processes at these scales, which are important in the generation and dissipation of kinetic energy in the ocean and act as one of the main gateways connecting the interior of the ocean to the upper layer. The active vertical exchanges linked to these scales have impacts on the local and global budgets of heat and carbon, and on nutrients for biogeochemical cycles. This review paper highlights the issues being addressed by the SWOT science community to understand SWOT’s very precise SSH / surface pressure observations, and it explores how SWOT data will be combined with other satellite and in-situ data and models to better understand the upper ocean 4D circulation (x,y,z,t) over the next decade. SWOT’s new SAR-interferometry technology aims to observe ocean SSH scales down to 15-30 km in wavelength. At these scales, SSH includes “balanced” geostrophic eddy motions and high-frequency internal tides and internal waves. This presents a challenge both for reconstructing the 4D upper ocean circulation and for the assimilation of SSH in models, but also an opportunity to have global observations of the 2D structure of these phenomena, and to learn more about their interactions. At these small scales, the ocean dynamics evolve rapidly, and combining SWOT 2D SSH data with other satellite or in-situ data with different space-time coverage is also a challenge.
SWOT’s new technology will be a forerunner for the future altimetric observing system, and so advancing on these issues today will pave the way for our future.
163 citations
••
University of Sheffield1, Paul Sabatier University2, University of Toulouse3, Technical University of Denmark4, University of Edinburgh5, University of Montpellier6, Polytechnic University of Milan7, University of Bordeaux8, German Aerospace Center9, Jet Propulsion Laboratory10, European Space Research and Technology Centre11, University of Virginia12, University of Tasmania13, Hobart Corporation14, Chalmers University of Technology15
TL;DR: The European Space Agency's 7th Earth Explorer mission, BIOMASS, is to determine the worldwide distribution of forest above-ground biomass (AGB) in order to reduce the major uncertainties in calculations of carbon stocks and fluxes associated with the terrestrial biosphere, including carbon fluxes associated with Land Use Change, forest degradation and forest regrowth as mentioned in this paper.
160 citations
••
University of Michigan1, University of California, Los Angeles2, Texas A&M University3, University of Washington4, Imperial College London5, ETH Zurich6, University of Chicago7, Heidelberg University8, Goethe University Frankfurt9, Institut de Physique du Globe de Paris10, Ruhr University Bochum11, University of Cambridge12, Stony Brook University13, Hungarian Academy of Sciences14, Jet Propulsion Laboratory15, University of South Florida St. Petersburg16, Grand Valley State University17
TL;DR: In this article, the authors investigate whether use of updated and more accurate values for these parameters can remove observed interlaboratory differences in the measured T-Δ47 relationship. Using the updated parameters, they reprocess 14 published calibration data sets measured in 11 different laboratories, representing many mineralogies, bulk compositions, sample types, reaction temperatures, and sample preparation and analysis methods.
Abstract: The clumped isotopic composition of carbonate-derived CO2 (denoted Δ47) is a function of carbonate formation temperature and in natural samples can act as a recorder of paleoclimate, burial, or diagenetic conditions. The absolute abundance of heavy isotopes in the universal standards VPDB and VSMOW (defined by four parameters: ¹³R, ¹⁷R, ¹⁸R, and λ) impacts calculated Δ47 values. Here, we investigate whether use of updated and more accurate values for these parameters can remove observed interlaboratory differences in the measured T-Δ47 relationship. Using the updated parameters, we reprocess 14 published calibration data sets measured in 11 different laboratories, representing many mineralogies, bulk compositions, sample types, reaction temperatures, and sample preparation and analysis methods. Exploiting this large composite data set (n = 1,253 sample replicates), we investigate the possibility for a “universal” clumped isotope calibration. We find that applying updated parameters improves the T-Δ47 relationship (reduces residuals) within most labs and improves overall agreement but does not eliminate all interlaboratory differences. We reaffirm earlier findings that different mineralogies do not require different calibration equations and that cleaning procedures, method of pressure baseline correction, and mass spectrometer type do not affect interlaboratory agreement. We also present new estimates of the temperature dependence of the acid digestion fractionation for Δ47 (Δ*47), based on combining reprocessed data from four studies, and new theoretical equilibrium values to be used in calculation of the empirical transfer function. Overall, we have ruled out a number of possible causes of interlaboratory disagreement in the T-Δ47 relationship, but many more remain to be investigated.
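Clumped isotope paleothermometry of the kind described here usually expresses the T-Δ47 relationship as Δ47 = a × 10⁶/T² + b, which can be inverted to recover a formation temperature from a measured Δ47. A minimal sketch with illustrative calibration coefficients (the a and b values below are assumptions for demonstration, not the calibration derived in this study):

```python
import math

def temperature_from_d47(d47, a=0.0383, b=0.258):
    """Invert D47 = a * 1e6 / T^2 + b to get the formation temperature T in kelvin.

    a, b: illustrative (assumed) calibration coefficients for a D47-T relation.
    """
    return math.sqrt(a * 1e6 / (d47 - b))

t_kelvin = temperature_from_d47(0.70)
# roughly 294 K (~21 °C), a plausible near-surface carbonate formation temperature
```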
126 citations
••
TL;DR: In this article, satellite measurements from Nimbus-7 and Earth Probe Total Ozone Mapping Spectrometer (TOMS) are combined with ozone measurements from the Aura Ozone Monitoring Instrument/Microwave LimbSounder (OMI/MLS) to determine trends in tropospheric ozone for 1979-2016.
Abstract: Past studies have suggested that ozone in the troposphere has increased globally throughout much of the 20th century due to increases in anthropogenic emissions and transport. We show, by combining satellite measurements with a chemical transport model, that during the last four decades tropospheric ozone does indeed indicate increases that are global in nature, yet still highly regional. Satellite ozone measurements from Nimbus-7 and Earth Probe Total Ozone Mapping Spectrometer (TOMS) are merged with ozone measurements from the Aura Ozone Monitoring Instrument/Microwave Limb Sounder (OMI/MLS) to determine trends in tropospheric ozone for 1979–2016. Both TOMS (1979–2005) and OMI/MLS (2005–2016) depict large increases in tropospheric ozone from the Near East to India and East Asia and further eastward over the Pacific Ocean. The 38-year merged satellite record shows a total net change over this region of about +6 to +7 Dobson units (DU) (i.e., ∼15%–20% of average background ozone), with the largest increase (∼4 DU) occurring during the 2005–2016 Aura period. The Global Modeling Initiative (GMI) chemical transport model with time-varying emissions is used to aid in the interpretation of tropospheric ozone trends for 1980–2016. The GMI simulation for the combined record also depicts the greatest increases of +6 to +7 DU over India and East Asia, very similar to the satellite measurements. In regions of significant increases in tropospheric column ozone (TCO) the trends are a factor of 2–2.5 larger for the Aura record when compared to the earlier TOMS record; for India and East Asia the trends in TCO for both GMI and satellite measurements are ∼+3 DU decade⁻¹ or greater during 2005–2016 compared to about +1.2 to +1.4 DU decade⁻¹ for 1979–2005. The GMI simulation and satellite data also reveal tropospheric ozone increases of ∼+4 to +5 DU for the 38-year record over central Africa and the tropical Atlantic Ocean. Both the GMI simulation and satellite-measured tropospheric ozone during the latter Aura time period show increases of ∼+3 DU decade⁻¹ over the N Atlantic and NE Pacific.
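Trends quoted in DU per decade, as in this abstract, are ordinary linear fits to a column-ozone time series, with the annual slope scaled by ten. A minimal sketch using a synthetic series built to carry a known +3 DU decade⁻¹ trend (the data are fabricated purely for illustration, not taken from the study):

```python
import numpy as np

years = np.arange(2005, 2017)              # Aura-period years, for illustration
tco = 30.0 + 0.3 * (years - years[0])      # synthetic TCO in DU: +0.3 DU per year

slope_du_per_year = np.polyfit(years, tco, 1)[0]   # least-squares linear fit
trend_du_per_decade = 10.0 * slope_du_per_year
print(round(trend_du_per_decade, 1))  # 3.0
```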
••
Massachusetts Institute of Technology1, University of Hawaii2, Mount Holyoke College3, Université Paris-Saclay4, Weizmann Institute of Science5, Centre national de la recherche scientifique6, Paris Observatory7, Lowell Observatory8, Wellesley College9, Northern Arizona University10, Johns Hopkins University Applied Physics Laboratory11, Jet Propulsion Laboratory12, Colby College13, University of Arizona14, Planetary Science Institute15, Luleå University of Technology16, University of Helsinki17
TL;DR: Lantz et al. as discussed by the authors reported measured spectral properties for more than 1000 near-Earth objects (NEOs), representing approximately 5% of the currently discovered population of NEOs.
••
Environment Canada1, Commonwealth Scientific and Industrial Research Organisation2, University of Wollongong3, Cooperative Institute for Research in Environmental Sciences4, Earth System Research Laboratory5, Jet Propulsion Laboratory6, Ford Motor Company7, Goddard Space Flight Center8, Harvard University9, Swiss Federal Laboratories for Materials Science and Technology10, ETH Zurich11, Belgian Institute for Space Aeronomy12, National Center for Atmospheric Research13, Cooperative Institute for Mesoscale Meteorological Studies14, University of Toronto15, Geophysical Fluid Dynamics Laboratory16, Princeton University17, National Oceanic and Atmospheric Administration18
TL;DR: In this paper, various ozone measurement methods and ozone datasets are reviewed and selected for inclusion in the historical record of background ozone levels, based on the relationship of the measurement technique to the modern UV absorption standard, absence of interfering pollutants, representativeness of the well-mixed boundary layer, and expert judgement of their credibility.
Abstract: From the earliest observations of ozone in the lower atmosphere in the 19th century, both measurement methods and the portion of the globe observed have evolved and changed. These methods have different uncertainties and biases, and the data records differ with respect to coverage (space and time), information content, and representativeness. In this study, various ozone measurement methods and ozone datasets are reviewed and selected for inclusion in the historical record of background ozone levels, based on the relationship of the measurement technique to the modern UV absorption standard, absence of interfering pollutants, representativeness of the well-mixed boundary layer and expert judgement of their credibility. There are significant uncertainties with the 19th and early 20th-century measurements related to interference of other gases. Spectroscopic methods applied before 1960 have likely underestimated ozone by as much as 11% at the surface and by about 24% in the free troposphere, due to the use of differing ozone absorption coefficients.
There is no unambiguous evidence in the measurement record back to 1896 that typical mid-latitude background surface ozone values were below about 20 nmol mol⁻¹, but there is robust evidence for increases in the temperate and polar regions of the northern hemisphere of 30–70%, with large uncertainty, between the period of historic observations, 1896–1975, and the modern period (1990–2014). Independent historical observations from balloons and aircraft indicate similar changes in the free troposphere. Changes in the southern hemisphere are much less. Regional representativeness of the available observations remains a potential source of large errors, which are difficult to quantify.
The great majority of validation and intercomparison studies of free tropospheric ozone measurement methods use ECC ozonesondes as reference. Compared to UV-absorption measurements they show a modest (~1–5% ±5%) high bias in the troposphere, but no evidence of a change with time. Umkehr, lidar, and FTIR methods all show modest low biases relative to ECCs, and so, using ECC sondes as a transfer standard, all appear to agree to within one standard deviation with the modern UV-absorption standard. Other sonde types show an increase of 5–20% in sensitivity to tropospheric ozone from 1970–1995.
Biases and standard deviations of satellite retrieval comparisons are often 2–3 times larger than those of other free tropospheric measurements. The lack of information on temporal changes of bias for satellite measurements of tropospheric ozone is an area of concern for long-term trend studies.
••
National Oceanic and Atmospheric Administration1, University of Washington2, Jet Propulsion Laboratory3, Colorado State University4, California Institute of Technology5, University of Hamburg6, Langley Research Center7, University of Wisconsin-Madison8, University of St. Thomas (Minnesota)9, University of New South Wales10, Hobart Corporation11, Australian Research Council12, Cooperative Research Centre13, University of Tasmania14, University of California, San Diego15, Japan Meteorological Agency16, Pacific Marine Environmental Laboratory17, Met Office18, Silver Spring Networks19, Joint Institute for Marine and Atmospheric Research20, École Normale Supérieure21, National University of Defense Technology22, Woods Hole Oceanographic Institution23
TL;DR: In this article, the authors review the current state-of-the-art methods to estimate global OHC changes and evaluate their relevance to derive Earth's Energy Imbalance (EEI) estimate on different time scales.
Abstract: The energy radiated by the Earth towards space does not compensate the incoming radiation from the Sun, leading to a small positive energy imbalance at the top of the atmosphere (0.4–1 W m⁻²). This imbalance is coined Earth’s Energy Imbalance (EEI). It is mostly caused by anthropogenic greenhouse gas emissions and is driving the current warming of the planet. Precise monitoring of EEI is critical to assess the current status of climate change and the future evolution of climate. But the monitoring of EEI is challenging, as EEI is two orders of magnitude smaller than the radiation fluxes in and out of the Earth. Over 93% of the excess energy that is gained by the Earth in response to the positive EEI accumulates into the ocean in the form of heat. This accumulation of heat can be tracked with the ocean observing system, such that today the monitoring of Ocean Heat Content (OHC) and its long-term change provides the most efficient approach to estimate EEI. In this community paper we review the four current state-of-the-art methods to estimate global OHC changes and evaluate their relevance to derive EEI estimates on different time scales. These four methods make use of: (1) direct observations of in situ temperature; (2) satellite-based measurements of the ocean surface net heat fluxes; (3) satellite-based estimates of the thermal expansion of the ocean; and (4) ocean reanalyses that assimilate observations from both satellite and in situ instruments. For each method we review the potential and the uncertainty of the method to estimate global OHC changes. We also analyze gaps in the current capability of each method and identify ways of progress for the future to fulfill the requirements of EEI monitoring. Achieving the observation of EEI with sufficient accuracy will depend on merging the remote sensing techniques with in situ measurements of key variables as an integral part of the Ocean Observing System.
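The link between OHC and EEI described above is a simple unit conversion: divide the OHC trend (J/yr) by the seconds in a year and Earth's total surface area to get a flux in W/m², then scale up by the ocean's share of the stored heat (>93% per the text). A sketch with an assumed illustrative OHC trend (the 9 ZJ/yr input is not a number from this paper):

```python
SECONDS_PER_YEAR = 3.156e7
EARTH_SURFACE_AREA_M2 = 5.101e14   # total surface area of the Earth

def eei_from_ohc_trend(ohc_trend_j_per_year, ocean_fraction=0.93):
    """Infer Earth's Energy Imbalance (W/m^2) from a global OHC trend.

    ocean_fraction: share of the excess heat stored in the ocean (>93% per the text).
    """
    ocean_heat_flux = ohc_trend_j_per_year / (SECONDS_PER_YEAR * EARTH_SURFACE_AREA_M2)
    return ocean_heat_flux / ocean_fraction

eei = eei_from_ohc_trend(9e21)   # assumed illustrative OHC trend of ~9 ZJ/yr
print(round(eei, 2))  # 0.6, within the 0.4-1 W/m^2 range quoted above
```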
••
TL;DR: The Auxiliary Payload Sensor Suite (APSS) as mentioned in this paper includes a magnetometer, an atmospheric pressure sensor, and a pair of wind and air temperature sensors for the InSight mission to Mars.
Abstract: NASA’s InSight mission to Mars will measure seismic signals to determine the planet’s interior structure. These highly sensitive seismometers are susceptible to corruption of their measurements by environmental changes. Magnetic fields, atmosphere pressure changes, and local winds can all induce apparent changes in the seismic records that are not due to propagating ground motions. Thus, InSight carries a set of sensors called the Auxiliary Payload Sensor Suite (APSS) which includes a magnetometer, an atmospheric pressure sensor, and a pair of wind and air temperature sensors. In the case of the magnetometer, knowledge of the amplitude of the fluctuating magnetic field at the InSight lander will allow the separation of seismic signals from potentially interfering magnetic signals of either natural or spacecraft origin. To acquire such data, a triaxial fluxgate magnetometer was installed on the deck of the lander to obtain magnetic records at the same cadence as the seismometer. Similarly, a highly sensitive pressure sensor is carried by InSight to enable the removal of local ground-surface tilts due to advecting pressure perturbations. Finally, the local winds (speed and direction) and air temperature are estimated using a hot-film wind sensor with heritage from REMS on the Curiosity rover. When winds are too high, seismic signals can be ignored or discounted. Herein we describe the APSS sensor suite, the test programs for its components, and the possible additional science investigations it enables.
••
Durham University1, Radboud University Nijmegen2, Harvard University3, Université Paris-Saclay4, Jet Propulsion Laboratory5, Luleå University of Technology6, École Normale Supérieure7, University at Buffalo8, Snow College9, University of Colorado Boulder10, University of California, Berkeley11, Russian Academy of Sciences12
TL;DR: In this paper, the authors significantly revise and extend the HITRAN data with respect to the original effort described in Richard et al. [JQSRT 113, 1276 (2012)].
••
TL;DR: In this paper, the authors review the recent technical advances in processing and the new technological capabilities of satellite radar altimetry in the coastal zone and illustrate the fast-growing use of coastal data sets in coastal sea level research and applications, such as high-frequency (tides and storm surge) and long-term sea level change studies.
Abstract: Satellite radar altimetry provides a unique sea level data set that extends over more than 25 years back in time and that has an almost global coverage. However, when approaching the coasts, the extraction of correct sea level estimates is challenging due to corrupted waveforms and to errors in most of the corrections and in some auxiliary information used in the data processing. The development of methods dedicated to the improvement of altimeter data in the coastal zone dates back to the 1990s, but the major progress happened during the last decade thanks to progress in radar technology [e.g., synthetic aperture radar (SAR) mode and Ka-band frequency], improved waveform retracking algorithms, the availability of new/improved corrections (e.g., wet troposphere and tidal models) and processing workflows oriented to the coastal zone. Today, a set of techniques exists for the processing of coastal altimetry data, generally called “coastal altimetry.” They have been used to generate coastal altimetry products. Altimetry is now recognized as part of the integrated observing system devoted to coastal sea level monitoring. In this article, we review the recent technical advances in processing and the new technological capabilities of satellite radar altimetry in the coastal zone. We also illustrate the fast-growing use of coastal altimetry data sets in coastal sea level research and applications, such as high-frequency (tides and storm surge) and long-term sea level change studies.
••
TL;DR: In fact, a minimal deviation from Gaussianity is perhaps the most robust theoretical prediction of models that explain the observed universe; it is necessarily present even in the simplest scenarios as discussed by the authors.
Abstract: Our current understanding of the Universe is established through the pristine measurements of structure in the cosmic microwave background (CMB) and the distribution and shapes of galaxies tracing the large scale structure (LSS) of the Universe. One key ingredient that underlies cosmological observables is that the field that sources the observed structure is assumed to be initially Gaussian with high precision. Nevertheless, a minimal deviation from Gaussianity is perhaps the most robust theoretical prediction of models that explain the observed Universe; it is necessarily present even in the simplest scenarios. In addition, most inflationary models produce far higher levels of non-Gaussianity. Since non-Gaussianity directly probes the dynamics in the early Universe, a detection would present a monumental discovery in cosmology, providing clues about physics at energy scales as high as the GUT scale.
••
Johns Hopkins University1, University of Reims Champagne-Ardenne2, University of Arizona3, Institut d'Astrophysique de Paris4, Ames Research Center5, University of Geneva6, Technical University of Denmark7, University of Maryland, College Park8, Massachusetts Institute of Technology9, Jet Propulsion Laboratory10, Cornell University11, Smithsonian Institution12, Technical University of Berlin13, Tennessee State University14, Spanish National Research Council15, University of Exeter16, Space Telescope Science Institute17
TL;DR: In this paper, the authors presented the results of a mission to the International Space Station (ISS) under the European Union's Horizon 2020 research and innovation programme (project Four Aces) under project PACES.
Abstract: NASA through STScI [HST-GO-14767]; DFG [SPP 1992, GA 2557/1-1]; Programme National de Planetologie (PNP) of CNRS/INSU; CNES; CNES (France) under project PACES; Spanish MINECO [AYA2016-79425-C3-2-P]; Swiss National Science Foundation (SNSF); European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (project Four Aces) [724427]
••
California Institute of Technology1, University of Toulouse2, Technical University of Denmark3, National Oceanography Centre4, Spanish National Research Council5, European Space Agency6, University of Porto7, Fisheries and Oceans Canada8, Danish Meteorological Institute9, University of Miami10, National Oceanic and Atmospheric Administration11, Jet Propulsion Laboratory12, Deutsches Geodätisches Forschungsinstitut13, EUMETSAT14, Rutgers University15, Natural Environment Research Council16
TL;DR: In this paper, the main forcing agents acting on coastal regions (e.g., sea level, winds, waves and currents, river runoff, sediment supply and transport, vertical land motions, land use) and the induced coastal response are discussed.
Abstract: Coastal zones are highly dynamical systems affected by a variety of natural and anthropogenic forcing factors, including sea level rise, extreme events, local oceanic and atmospheric processes, ground subsidence, etc. However, so far they remain poorly monitored on a global scale. To better understand the changes affecting the world's coastal zones and to provide crucial information to decision-makers involved in adaptation to and mitigation of environmental risks, coastal observations of various types need to be collected and analyzed. In this white paper, we first discuss the main forcing agents acting on coastal regions (e.g., sea level, winds, waves and currents, river runoff, sediment supply and transport, vertical land motions, land use) and the induced coastal response (e.g., shoreline position, estuary morphology, land topography at the land-sea interface and coastal bathymetry). We identify a number of space-based observational needs that have to be addressed in the near future to understand coastal zone evolution. Among these, improved monitoring of coastal sea level by satellite altimetry techniques is recognized as a high priority. Classical altimeter data in the coastal zone are adversely affected by land contamination, with degraded range and geophysical corrections. However, recent progress in coastal altimetry data processing and multisensor data synergy offers new perspectives for measuring sea level change very close to the coast. This issue is discussed in detail in this paper, including the development of a global coastal sea-level and sea-state climate record with mission-consistent coastal processing and products dedicated to coastal regimes. Finally, we present a promising new technology based on the use of Signals of Opportunity (SoOp), i.e., communication satellite transmissions that are reutilized as illumination sources in a bistatic radar configuration, for measuring coastal sea level.
Since SoOp technology requires only receiver technology to be placed in orbit, small satellite platforms could be used, enabling a constellation to achieve high spatio-temporal resolutions of sea level in coastal zones.
••
18 Dec 2019
TL;DR: A psychological experiment to examine what forms of explanations best foster human trust in the robot found that comprehensive and real-time visualizations of the robot's internal decisions were more effective in promoting human trust than explanations based on summary text descriptions.
Abstract: The ability to provide comprehensive explanations of chosen actions is a hallmark of intelligence. Lack of this ability impedes the general acceptance of AI and robot systems in critical tasks. This paper examines what forms of explanations best foster human trust in machines and proposes a framework in which explanations are generated from both functional and mechanistic perspectives. The robot system learns from human demonstrations to open medicine bottles using (i) an embodied haptic prediction model to extract knowledge from sensory feedback, (ii) a stochastic grammar model induced to capture the compositional structure of a multistep task, and (iii) an improved Earley parsing algorithm to jointly leverage both the haptic and grammar models. The robot system not only shows the ability to learn from human demonstrators but also succeeds in opening new, unseen bottles. Using different forms of explanations generated by the robot system, we conducted a psychological experiment to examine what forms of explanations best foster human trust in the robot. We found that comprehensive and real-time visualizations of the robot’s internal decisions were more effective in promoting human trust than explanations based on summary text descriptions. In addition, forms of explanation that are best suited to foster trust do not necessarily correspond to the model components contributing to the best task performance. This divergence shows a need for the robotics community to integrate model components to enhance both task execution and human trust in machines.
••
Centre national de la recherche scientifique1, Memorial University of Newfoundland2, University of California, San Diego3, Rutgers University4, University of Cyprus5, University of Washington6, University of Western Australia7, World Meteorological Organization8, Finnish Meteorological Institute9, Oceanic Platform of the Canary Islands10, Oregon State University11, Woods Hole Oceanographic Institution12, University of Bergen13, University of Perpignan14, British Antarctic Survey15, Jet Propulsion Laboratory16, United States Naval Research Laboratory17, Pierre-and-Marie-Curie University18, Bermuda Institute of Ocean Sciences19, Dalhousie University20, Texas A&M University21, University of Georgia22, National Oceanography Centre23, Hebrew University of Jerusalem24, National Oceanic and Atmospheric Administration25, Fisheries and Oceans Canada26, Massachusetts Institute of Technology27, Scottish Association for Marine Science28, Korean Ocean Research and Development Institute29, University of Tokyo30, National Taiwan University31, University of Victoria32, Council for Scientific and Industrial Research33, Institut de recherche pour le développement34, University of Puerto Rico at Mayagüez35, Superior National School of Advanced Techniques36, National Institute of Water and Atmospheric Research37, Marine Institute of Memorial University of Newfoundland38, Hobart Corporation39, Ensenada Center for Scientific Research and Higher Education40, Korea University41, NATO42, Royal Dutch Shell43, University of New South Wales44, Spanish National Research Council45, Commonwealth Scientific and Industrial Research Organisation46, University of Gothenburg47, California Institute of Technology48, University of British Columbia49, University of the Virgin Islands50
TL;DR: OceanGliders, as discussed in this paper, is a program for active coordination and enhancement of global glider activity, bringing together marine scientists and engineers operating gliders around the world to observe the long-term physical, biogeochemical, and biological ocean processes and phenomena that are relevant for societal applications.
Abstract: The OceanGliders program started in 2016 to support active coordination and enhancement of global glider activity. OceanGliders contributes to the international efforts of the Global Ocean Observation System (GOOS) for Climate, Ocean Health and Operational Services. It brings together marine scientists and engineers operating gliders around the world: (1) to observe the long-term physical, biogeochemical, and biological ocean processes and phenomena that are relevant for societal applications; and, (2) to contribute to the GOOS through real-time and delayed-mode data dissemination. The OceanGliders program is distributed across national and regional observing systems and significantly contributes to integrated, multi-scale and multi-platform sampling strategies. OceanGliders shares best practices, requirements, and scientific knowledge needed for glider operations, data collection and analysis. It also monitors global glider activity and supports the dissemination of glider data through regional and global databases, in real-time and delayed modes, facilitating data access to the wider community. OceanGliders currently supports national, regional and global initiatives to maintain and expand the capabilities and application of gliders to meet key global challenges such as improved measurement of ocean boundary currents, water transformation and storm forecast.
••
Jet Propulsion Laboratory1, Bureau of Meteorology2, University of São Paulo3, National Oceanic and Atmospheric Administration4, European Space Research and Technology Centre5, Danish Meteorological Institute6, University Of Energy And Natural Resources7, Japan Aerospace Exploration Agency8, Technical University of Denmark9, University of Reading10, ENEA11, University of Miami12, Indian Institute of Technology Bombay13, University of Southampton14
TL;DR: This article reviews progress against the challenges set out 10 years ago in a previous paper, highlights remaining and new research and development challenges for the next 10 years (such as the need for sustained continuity of passive microwave SST using a 6.9 GHz channel), and concludes with the needs for achieving an integrated global high-resolution SST observing system, with a focus on satellite observations exploited in conjunction with in situ SSTs.
Abstract: Sea surface temperature (SST) is a fundamental physical variable for understanding, quantifying and predicting complex interactions between the ocean and the atmosphere. Such processes determine how heat from the sun is redistributed across the global oceans, directly impacting large- and small-scale weather and climate patterns. The provision of daily maps of global SST for operational systems, climate modelling and the broader scientific community is now a mature and sustained service coordinated by the Group for High Resolution Sea Surface Temperature (GHRSST) and the CEOS SST Virtual Constellation (CEOS SST-VC). Data streams are shared, indexed, processed, quality controlled, analyzed, and documented within a Regional/Global Task Sharing (R/GTS) framework, which is implemented internationally in a distributed manner. Products rely on a combination of low-Earth orbit infrared and microwave satellite imagery, geostationary orbit infrared satellite imagery, and in situ data from moored and drifting buoys, Argo floats, and a suite of independent, fully characterized and traceable in situ measurements for product validation (Fiducial Reference Measurements, FRM). Research and development continues to tackle problems such as instrument calibration, algorithm development, diurnal variability, derivation of high-quality skin and depth temperatures, and areas of specific interest such as the high latitudes and coastal areas. In this white paper, we review progress versus the challenges we set out 10 years ago in a previous paper, highlight remaining and new research and development challenges for the next 10 years (such as the need for sustained continuity of passive microwave SST using a 6.9 GHz channel), and conclude with needs to achieve an integrated global high-resolution SST observing system, with focus on satellite observations exploited in conjunction with in situ SSTs. 
The paper directly relates to the theme of Data Information Systems and also contributes to Ocean Observing Governance and Ocean Technology and Networks within the OceanObs2019 objectives. Applications of SST contribute to all the seven societal benefits, covering Discovery; Ecosystem Health & Biodiversity; Climate Variability & Change; Water, Food, & Energy Security; Pollution & Human Health; Hazards and Maritime Safety; and the Blue Economy.
••
Goddard Space Flight Center1, University of Maryland, College Park2, Carnegie Institution for Science3, University of Paris4, Misericordia University5, Spanish National Research Council6, Lunar and Planetary Institute7, Ames Research Center8, National Autonomous University of Mexico9, Jet Propulsion Laboratory10
TL;DR: Navarro-Gonzalez et al. acknowledge that the successful operation of the Curiosity rover and the SAM instrument on Mars is due to the hard work and dedication of hundreds of scientists, engineers, and managers over more than a decade.
Abstract: All MSL data used in this manuscript (REMS and SAM) are freely available on NASA's Planetary Data System (PDS) Geosciences Node, from within 6 months after receipt on Earth (http://pds‐geosciences.wustl.edu/missions/msl/). The mixing ratios developed and presented in this paper are available at a publicly available archive (dataverse.org: doi.org/10.7910/DVN/CVUOWW) as cited within the manuscript. The successful operation of the Curiosity rover and the SAM instrument on Mars is due to the hard work and dedication of hundreds of scientists, engineers, and managers over more than a decade. Essential contributions to the successful operation of SAM on Mars and the acquisition of SAM data were provided by the SAM development, operations, and test bed teams. The authors gratefully thank the SAM and MSL teams that have contributed in numerous ways to obtain the data that enabled this scientific work. We also thank NASA for the support of the development of SAM, SAM data analysis, and the continued support of the Mars Science Laboratory mission. The contribution of F. Lefevre was supported by the Programme National de Planetologie (PNP). R. Navarro‐Gonzalez acknowledges support from the Universidad Nacional Autonoma de Mexico (PAPIIT IN111619). LPI is operated by USRA under a cooperative agreement with the Science Mission Directorate of the National Aeronautics and Space Administration. We thank members of the SAM and larger MSL team for insightful discussions and support. In particular, we thank R. Becker and R. O. Pepin for careful review of data analysis and interpretation. We thank M. D. Smith for discussion of CRISM CO measurements. We thank A. Brunner, M. Johnson, and M. Lefavor for their development of customized data analysis tools used here and in other SAM publications.
••
TL;DR: In this article, the authors present a detailed review of the physics basis for the DTE2 operational scenarios, including fusion power predictions through first-principles and integrated modelling, and the impact of isotopes on the operation and physics of D–T plasmas (thermal and particle transport, high-confinement-mode access, Be and W erosion, fuel recovery, etc.).
Abstract: For the past several years, the JET scientific programme (Pamela et al 2007 Fusion Eng. Des. 82 590) has been engaged in a multi-campaign effort, including experiments in D, H and T, leading up to 2020 and the first experiments with 50%/50% D–T mixtures since 1997 and the first ever D–T plasmas with the ITER mix of plasma-facing component materials. For this purpose, a concerted physics and technology programme was launched with a view to preparing the D–T campaign (DTE2). This paper addresses the key elements developed by the JET programme directly contributing to the D–T preparation. This intense preparation includes the review of the physics basis for the D–T operational scenarios, including the fusion power predictions through first-principles and integrated modelling, and the impact of isotopes in the operation and physics of D–T plasmas (thermal and particle transport, high confinement mode (H-mode) access, Be and W erosion, fuel recovery, etc). This effort also requires improving several aspects of plasma operation for DTE2, such as real-time control schemes, heat load control, disruption avoidance and a mitigation system (including the installation of a new shattered pellet injector), novel ion cyclotron resonance heating schemes (such as the three-ion scheme), new diagnostics (neutron camera and spectrometer, active Alfvén eigenmode antennas, neutral gauges, radiation-hard imaging systems…) and the calibration of the JET neutron diagnostics at 14 MeV for accurate fusion power measurement. The active preparation of JET for the 2020 D–T campaign provides an incomparable source of information and a basis for the future D–T operation of ITER, and it is also foreseen that a large number of key physics issues will be addressed in support of burning plasmas.
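The 14 MeV figure quoted for the neutron-diagnostic calibration comes from the kinematics of the D–T fusion reaction; as a reminder, these are standard textbook values, not numbers taken from the paper itself:

```latex
% D-T fusion: the 17.6 MeV reaction energy is shared between the
% products in inverse proportion to their masses, so the neutron
% carries ~14.1 MeV and the alpha particle ~3.5 MeV.
\mathrm{D} + \mathrm{T} \;\rightarrow\;
{}^{4}\mathrm{He}\,(3.5~\mathrm{MeV}) + \mathrm{n}\,(14.1~\mathrm{MeV}),
\qquad Q = 17.6~\mathrm{MeV}
```

Because each D–T fusion event yields one 14.1 MeV neutron, an absolutely calibrated neutron count rate translates directly into fusion power, which is why the calibration campaign matters for DTE2.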
••
Massachusetts Institute of Technology1, Johns Hopkins University2, University of Exeter3, Tennessee State University4, Space Telescope Science Institute5, Cornell University6, Ames Research Center7, Université Paris-Saclay8, Jet Propulsion Laboratory9, University of Maryland, College Park10, University of Arizona11
TL;DR: In this paper, the authors acknowledge support from NASA through the Space Telescope Science Institute (STScI), the Leverhulme Trust, and a University of Exeter PhD studentship.
Abstract: NASA [NAS 5-26555]; Space Telescope Science Institute; Leverhulme Trust; University of Exeter PhD Studentship - UK Science and Technology Facilities Council (STFC) studentship; STFC Consolidated Grant [ST/R000395/1]; European Research Council;[ATMO 757858]
••
Ferdowsi University of Mashhad1, University of California, Irvine2, Concordia University3, Oklahoma State University–Stillwater4, Boise State University5, University of Alabama6, École Polytechnique de Montréal7, Beijing Normal University8, Stockholm University9, Imperial College London10, City University of New York11, Fairleigh Dickinson University12, University of Stuttgart13, University of California, Los Angeles14, Jet Propulsion Laboratory15
TL;DR: In this article, the authors quantified the compounding effects of human activities and climate change on surface water availability in Iran over the twenty-first century by combining long-term ground-based data on water withdrawal with climate model projections.
Abstract: By combining long-term ground-based data on water withdrawal with climate model projections, this study quantifies the compounding effects of human activities and climate change on surface water availability in Iran over the twenty-first century. Our findings show that increasing water withdrawal in Iran, driven by population growth and increased agricultural activity, has been the main source of historical water stress. Increased levels of water stress across Iran are expected to continue or even worsen over the next decades due to projected variability and change in precipitation, combined with heightened water withdrawals from a growing population and expanding socio-economic activities. The greatest rate of decrease in water storage is expected in the Urmia Basin in northwestern Iran (varying from ~−8.3 mm/year in 2010–2039 to ~−61.6 mm/year in 2070–2099, compared with an observed rate of 4 mm/year in 1976–2005). Human activities, however, strongly dominate the effects of precipitation variability and change. Major shifts toward sustainable land and water management are needed to reduce the impacts of water scarcity in the future, particularly in Iran's heavily stressed basins like the Urmia Basin, which feeds the shrinking Lake Urmia.
••
Dartmouth College1, University of Bern2, International Space Science Institute3, Tel Aviv University4, Russian Academy of Sciences5, European Space Agency6, Jet Propulsion Laboratory7, Technical University of Denmark8, École Polytechnique Fédérale de Lausanne9, Istituto Nazionale di Fisica Nucleare10, University of Trieste11, Montana State University12, Science and Technology Policy Institute13, University of Strathclyde14, Korea Astronomy and Space Science Institute15, PES University16, Advanced Technology Center17, Chinese Academy of Sciences18
TL;DR: The goal of this roadmap is to encourage the space science community to leverage developments in the small satellite industry in order to increase flight rates, and change the way small science satellites are built and managed.
12 Dec 2019
Abstract: The InSight (Interior Exploration using Seismic Investigations, Geodesy and Heat Transport) mission landed in Elysium Planitia on Mars on 26 November 2018 and fully deployed its seismometer by the end of February 2019. The mission aims to detect, characterize and locate seismic activity on Mars, and to further constrain the internal structure, composition and dynamics of the planet. Here, we present seismometer data recorded until 30 September 2019, which reveal that Mars is seismically active. We identify 174 marsquakes, comprising two distinct populations: 150 small-magnitude, high-frequency events with waves propagating at crustal depths, and 24 low-frequency, subcrustal events of magnitude Mw 3–4 with waves propagating at various depths in the mantle. These marsquakes have spectral characteristics similar to the seismicity observed on the Earth and Moon. We determine that two of the largest detected marsquakes were located near the Cerberus Fossae fracture system. From the recorded seismicity, we constrain attenuation in the crust and mantle, and find indications of a potential low-S-wave-velocity layer in the upper mantle. Mars is seismically active: 24 subcrustal magnitude 3–4 marsquakes and 150 smaller events have been identified up to 30 September 2019 by an analysis of seismometer data from the InSight lander.
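The magnitude Mw 3–4 quoted above is a moment magnitude. To give a sense of the scale, the corresponding scalar seismic moments can be sketched with the generic Hanks–Kanamori (1979) conversion; this is standard seismology, not code from the InSight team:

```python
import math

def moment_magnitude(m0_newton_meters: float) -> float:
    """Moment magnitude Mw from scalar seismic moment M0 (in N*m),
    using the Hanks & Kanamori (1979) relation."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# A marsquake with M0 ~ 1e13 N*m corresponds to Mw = 2.60; the Mw 3-4
# events reported above thus span roughly M0 ~ 4e13 to 1e15 N*m.
for m0 in (1e13, 4e13, 1e15):
    print(f"M0 = {m0:.1e} N*m -> Mw = {moment_magnitude(m0):.2f}")
```

Inverting the relation shows why low-frequency mantle events dominate the energy budget: each unit step in Mw corresponds to a factor of about 32 in seismic moment.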