
Showing papers by "University of Grenoble published in 2019"


Journal ArticleDOI
TL;DR: Together, rational deployment of prevention, attainment of global goals for viral hepatitis eradication, and improvements in HCC surveillance and therapy hold promise for achieving a substantial reduction in the worldwide HCC burden within the next few decades.
Abstract: Hepatocellular carcinoma (HCC) is the fourth most common cause of cancer-related death worldwide. Risk factors for HCC include chronic hepatitis B and hepatitis C, alcohol addiction, metabolic liver disease (particularly nonalcoholic fatty liver disease) and exposure to dietary toxins such as aflatoxins and aristolochic acid. All these risk factors are potentially preventable, highlighting the considerable potential of risk prevention for decreasing the global burden of HCC. HCC surveillance and early detection increase the chance of potentially curative treatment; however, HCC surveillance is substantially underutilized, even in countries with sufficient medical resources. Early-stage HCC can be treated curatively by local ablation, surgical resection or liver transplantation. Treatment selection depends on tumour characteristics, the severity of underlying liver dysfunction, age, other medical comorbidities, and available medical resources and local expertise. Catheter-based locoregional treatment is used in patients with intermediate-stage cancer. Kinase and immune checkpoint inhibitors have been shown to be effective treatment options in patients with advanced-stage HCC. Together, rational deployment of prevention, attainment of global goals for viral hepatitis eradication, and improvements in HCC surveillance and therapy hold promise for achieving a substantial reduction in the worldwide HCC burden within the next few decades.

2,122 citations


Journal ArticleDOI
TL;DR: This is the first study to report the global prevalence of obstructive sleep apnoea; with almost 1 billion people affected, and with prevalence exceeding 50% in some countries, effective diagnostic and treatment strategies are needed to minimise the negative health impacts and to maximise cost-effectiveness.

1,487 citations


Journal ArticleDOI
TL;DR: In this 8th release of JASPAR, the CORE collection has been expanded with 245 new PFMs and 156 PFMs have been updated; the genomic tracks, inference tool, and TF-binding profile similarity clusters have also been updated.
Abstract: JASPAR (http://jaspar.genereg.net) is an open-access database of curated, non-redundant transcription factor (TF)-binding profiles stored as position frequency matrices (PFMs) for TFs across multiple species in six taxonomic groups. In this 8th release of JASPAR, the CORE collection has been expanded with 245 new PFMs (169 for vertebrates, 42 for plants, 17 for nematodes, 10 for insects, and 7 for fungi), and 156 PFMs were updated (125 for vertebrates, 28 for plants and 3 for insects). These new profiles represent an 18% expansion compared to the previous release. JASPAR 2020 comes with a novel collection of unvalidated TF-binding profiles for which our curators did not find orthogonal supporting evidence in the literature. This collection has a dedicated web form to engage the community in the curation of unvalidated TF-binding profiles. Moreover, we created a Q&A forum to ease the communication between the user community and JASPAR curators. Finally, we updated the genomic tracks, inference tool, and TF-binding profile similarity clusters. All the data is available through the JASPAR website, its associated RESTful API, and through the JASPAR2020 R/Bioconductor package.
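As a minimal illustration of what the position frequency matrices (PFMs) in the CORE collection contain, the sketch below converts a small, made-up PFM into a log-odds position weight matrix using a pseudocount and a uniform background; the counts, parameter values and helper name are hypothetical and the JASPAR database itself is not queried.

```python
import numpy as np

def pfm_to_pwm(pfm, pseudocount=0.8, background=0.25):
    """Convert a position frequency matrix (rows A, C, G, T; columns = motif
    positions) into a log2-odds position weight matrix."""
    counts = np.asarray(pfm, dtype=float) + pseudocount   # avoid log(0)
    probs = counts / counts.sum(axis=0)                   # per-position base probabilities
    return np.log2(probs / background)                    # log-odds against a uniform background

# Hypothetical PFM for a 4-position motif (not a real JASPAR profile).
toy_pfm = [[10,  1,  0, 12],   # A
           [ 2,  0, 14,  1],   # C
           [ 1, 15,  0,  1],   # G
           [ 3,  0,  2,  2]]   # T

print(np.round(pfm_to_pwm(toy_pfm), 2))
```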

1,219 citations


Journal ArticleDOI
Željko Ivezić1, Steven M. Kahn2, J. Anthony Tyson3, Bob Abel4  +332 moreInstitutions (55)
TL;DR: The Large Synoptic Survey Telescope (LSST) as discussed by the authors is a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile.
Abstract: We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the solar system, exploring the transient optical sky, and mapping the Milky Way. LSST will be a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg2 field of view, a 3.2-gigapixel camera, and six filters (ugrizy) covering the wavelength range 320–1050 nm. The project is in the construction phase and will begin regular survey operations by 2022. About 90% of the observing time will be devoted to a deep-wide-fast survey mode that will uniformly observe an 18,000 deg2 region about 800 times (summed over all six bands) during the anticipated 10 yr of operations and will yield a co-added map to r ~ 27.5. These data will result in databases including about 32 trillion observations of 20 billion galaxies and a similar number of stars, and they will serve the majority of the primary science programs. The remaining 10% of the observing time will be allocated to special projects such as Very Deep and Very Fast time domain surveys, whose details are currently under discussion. We illustrate how the LSST science drivers led to these choices of system parameters, and we describe the expected data products and their characteristics.
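As a back-of-envelope check of the survey numbers quoted above (18,000 deg2 observed about 800 times with a 9.6 deg2 field of view over 10 years), the sketch below works out the implied number of pointings; the number of usable nights per year is an illustrative assumption, not a figure from the paper.

```python
# Rough arithmetic on the deep-wide-fast survey numbers quoted in the abstract.
area_deg2 = 18_000          # survey footprint
visits_per_field = 800      # visits per field, summed over all six bands
fov_deg2 = 9.6              # camera field of view per pointing
years = 10
nights_per_year = 330       # assumed usable observing nights per year (illustrative)

total_pointings = area_deg2 * visits_per_field / fov_deg2
per_night = total_pointings / (years * nights_per_year)

print(f"implied pointings over the survey : {total_pointings:,.0f}")
print(f"implied pointings per night       : {per_night:,.0f}")
```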

921 citations


Journal ArticleDOI
TL;DR: In this paper, the authors acknowledge support from the EU FET Open RIA Grant No 766566, the Ministry of Education of the Czech Republic Grant No LM2015087 and LNSM-LNSpin.
Abstract: A. M. was supported by the King Abdullah University of Science and Technology (KAUST). T. J. acknowledges support from the EU FET Open RIA Grant No. 766566, the Ministry of Education of the Czech Republic Grant No. LM2015087 and LNSM-LNSpin, and the Grant Agency of the Czech Republic Grant No. 19-28375X. J. S. acknowledges the Alexander von Humboldt Foundation, EU FET Open Grant No. 766566, EU ERC Synergy Grant No. 610115, and the Transregional Collaborative Research Center (SFB/TRR) 173 SPIN+X. K. G. and P. G. acknowledge stimulating discussions with C. O. Avci and financial support by the Swiss National Science Foundation (Grants No. 200021-153404 and No. 200020-172775) and the European Commission under the Seventh Framework Program (spOt project, Grant No. 318144). A. T. acknowledges support by the Agence Nationale de la Recherche, Project No. ANR-17-CE24-0025 (TopSky). J. Ž. acknowledges the Grant Agency of the Czech Republic Grant No. 19-18623Y and support from the Institute of Physics of the Czech Academy of Sciences and the Max Planck Society through the Max Planck Partner Group programme.

863 citations


Journal ArticleDOI
TL;DR: In certain subgroups, PFS was positively associated with PD-L1 expression (KRAS, EGFR) and with smoking status (BRAF, HER2); the lack of response in the ALK group was notable.

719 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott2, T. D. Abbott, Fausto Acernese3  +1157 moreInstitutions (70)
TL;DR: In this paper, the authors improved initial estimates of the binary's properties, including component masses, spins, and tidal parameters, using the known source location, improved modeling, and recalibrated Virgo data.
Abstract: On August 17, 2017, the Advanced LIGO and Advanced Virgo gravitational-wave detectors observed a low-mass compact binary inspiral. The initial sky localization of the source of the gravitational-wave signal, GW170817, allowed electromagnetic observatories to identify NGC 4993 as the host galaxy. In this work, we improve initial estimates of the binary's properties, including component masses, spins, and tidal parameters, using the known source location, improved modeling, and recalibrated Virgo data. We extend the range of gravitational-wave frequencies considered down to 23 Hz, compared to 30 Hz in the initial analysis. We also compare results inferred using several signal models, which are more accurate and incorporate additional physical effects as compared to the initial analysis. We improve the localization of the gravitational-wave source to a 90% credible region of 16 deg2. We find tighter constraints on the masses, spins, and tidal parameters, and continue to find no evidence for nonzero component spins. The component masses are inferred to lie between 1.00 and 1.89 M⊙ when allowing for large component spins, and to lie between 1.16 and 1.60 M⊙ (with a total mass of 2.73 (+0.04/−0.01) M⊙) when the spins are restricted to be within the range observed in Galactic binary neutron stars. Using a precessing model and allowing for large component spins, we constrain the dimensionless spins of the components to be less than 0.50 for the primary and 0.61 for the secondary. Under minimal assumptions about the nature of the compact objects, our constraints for the tidal deformability parameter Λ are (0,630) when we allow for large component spins, and 300 (+420/−230) (using a 90% highest posterior density interval) when restricting the magnitude of the component spins, ruling out several equation-of-state models at the 90% credible level. Finally, with LIGO and GEO600 data, we use a Bayesian analysis to place upper limits on the amplitude and spectral energy density of a possible postmerger signal.
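The component-mass ranges quoted above are usually summarized through the chirp mass, M_c = (m1·m2)^(3/5) / (m1+m2)^(1/5), which is the best-measured mass combination for an inspiral. The sketch below evaluates it for illustrative component masses chosen inside the low-spin range quoted in the abstract; these are not the published posterior values.

```python
def chirp_mass(m1, m2):
    """Chirp mass of a compact binary, in the same units as m1 and m2."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

# Illustrative masses inside the 1.16-1.60 Msun low-spin range quoted above.
m1, m2 = 1.48, 1.26
print(f"total mass : {m1 + m2:.2f} Msun")
print(f"chirp mass : {chirp_mass(m1, m2):.3f} Msun")
```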

715 citations


Journal ArticleDOI
TL;DR: During the entire period, the mass loss concentrated in areas closest to warm, salty, subsurface, circumpolar deep water (CDW), consistent with enhanced polar westerlies pushing CDW toward Antarctica to melt its floating ice shelves, destabilize the glaciers, and raise sea level.
Abstract: We use updated drainage inventory, ice thickness, and ice velocity data to calculate the grounding line ice discharge of 176 basins draining the Antarctic Ice Sheet from 1979 to 2017. We compare the results with a surface mass balance model to deduce the ice sheet mass balance. The total mass loss increased from 40 ± 9 Gt/y in 1979–1990 to 50 ± 14 Gt/y in 1989–2000, 166 ± 18 Gt/y in 1999–2009, and 252 ± 26 Gt/y in 2009–2017. In 2009–2017, the mass loss was dominated by the Amundsen/Bellingshausen Sea sectors, in West Antarctica (159 ± 8 Gt/y), Wilkes Land, in East Antarctica (51 ± 13 Gt/y), and West and Northeast Peninsula (42 ± 5 Gt/y). The contribution to sea-level rise from Antarctica averaged 3.6 ± 0.5 mm per decade with a cumulative 14.0 ± 2.0 mm since 1979, including 6.9 ± 0.6 mm from West Antarctica, 4.4 ± 0.9 mm from East Antarctica, and 2.5 ± 0.4 mm from the Peninsula (i.e., East Antarctica is a major participant in the mass loss). During the entire period, the mass loss concentrated in areas closest to warm, salty, subsurface, circumpolar deep water (CDW), that is, consistent with enhanced polar westerlies pushing CDW toward Antarctica to melt its floating ice shelves, destabilize the glaciers, and raise sea level.
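The mass-loss rates quoted above can be converted to sea-level terms with the standard equivalence of roughly 362 Gt of land-ice loss per millimetre of global mean sea-level rise; the sketch below is a back-of-envelope conversion of the central values, not a reproduction of the authors' mass-budget calculation.

```python
# Convert the quoted Antarctic mass-loss rates (central values) to sea-level rise.
GT_PER_MM = 362.0  # ~362 Gt of land ice corresponds to ~1 mm of global mean sea level

rates = [("1979-1990", 40), ("1989-2000", 50), ("1999-2009", 166), ("2009-2017", 252)]

for period, gt_per_yr in rates:
    mm_per_yr = gt_per_yr / GT_PER_MM
    print(f"{period}: {gt_per_yr:4d} Gt/yr  ->  {mm_per_yr:.2f} mm/yr ({10 * mm_per_yr:.1f} mm per decade)")

# The most recent rate (~7 mm per decade) sits well above the 3.6 mm per decade
# average reported for the full 1979-2017 period, reflecting the acceleration.
```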

654 citations


Journal ArticleDOI
TL;DR: In patients with acute major bleeding associated with the use of a factor Xa inhibitor, treatment with andexanet markedly reduced anti–factor Xa activity, and 82% of patients had excellent or good hemostatic efficacy at 12 hours, as adjudicated according to prespecified criteria.
Abstract: Background: Andexanet alfa is a modified recombinant inactive form of human factor Xa developed for reversal of factor Xa inhibitors. Methods: We evaluated 352 patients who had acute major b...

612 citations


Journal ArticleDOI
TL;DR: In this paper, the authors presented a paper on the African Climate and Development Initiative (ACDI) in South Africa, focusing on the effects of climate change on the local environment.
Abstract: 1 Department of Forest Ecosystems and Society, Oregon State University, Corvallis, OR 97331, USA 2 School of Life and Environmental Sciences, The University of Sydney, Sydney, NSW 2006, Australia 3 Conservation Biology Institute, 136 SW Washington Avenue, Suite 202, Corvallis, OR 97333, USA 4 African Climate and Development Initiative, University of Cape Town, Cape Town, 7700, South Africa. 5 The Fletcher School and Global Development and Environment Institute, Tufts University, Medford, MA, USA

609 citations


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1491 moreInstitutions (239)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

Journal ArticleDOI
01 Apr 2019-Nature
TL;DR: This article used an extrapolation of glaciological and geodetic observations to show that glaciers contributed 27 ± 22 millimetres to global mean sea-level rise from 1961 to 2016.
Abstract: Glaciers distinct from the Greenland and Antarctic ice sheets cover an area of approximately 706,000 square kilometres globally [1], with an estimated total volume of 170,000 cubic kilometres, or 0.4 metres of potential sea-level-rise equivalent [2]. Retreating and thinning glaciers are icons of climate change [3] and affect regional runoff [4] as well as global sea level [5,6]. In past reports from the Intergovernmental Panel on Climate Change, estimates of changes in glacier mass were based on the multiplication of averaged or interpolated results from available observations of a few hundred glaciers by defined regional glacier areas [7–10]. For data-scarce regions, these results had to be complemented with estimates based on satellite altimetry and gravimetry [11]. These past approaches were challenged by the small number and heterogeneous spatiotemporal distribution of in situ measurement series and their often unknown ability to represent their respective mountain ranges, as well as by the spatial limitations of satellite altimetry (for which only point data are available) and gravimetry (with its coarse resolution). Here we use an extrapolation of glaciological and geodetic observations to show that glaciers contributed 27 ± 22 millimetres to global mean sea-level rise from 1961 to 2016. Regional specific-mass-change rates for 2006–2016 range from −0.1 metres to −1.2 metres of water equivalent per year, resulting in a global sea-level contribution of 335 ± 144 gigatonnes, or 0.92 ± 0.39 millimetres, per year. Although statistical uncertainty ranges overlap, our conclusions suggest that glacier mass loss may be larger than previously reported [11]. The present glacier mass loss is equivalent to the sea-level contribution of the Greenland Ice Sheet [12], clearly exceeds the loss from the Antarctic Ice Sheet [13], and accounts for 25 to 30 per cent of the total observed sea-level rise [14]. Present mass-loss rates indicate that glaciers could almost disappear in some mountain ranges in this century, while heavily glacierized regions will continue to contribute to sea-level rise beyond 2100. The largest collection so far of glaciological and geodetic observations suggests that glaciers contributed about 27 millimetres to sea-level rise from 1961 to 2016, at rates of ice loss that could see the disappearance of many glaciers this century.

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Sheelu Abraham3  +1215 moreInstitutions (134)
TL;DR: In this paper, the mass, spin, and redshift distributions of binary black hole (BBH) mergers with LIGO and Advanced Virgo observations were analyzed using phenomenological population models.
Abstract: We present results on the mass, spin, and redshift distributions with phenomenological population models using the 10 binary black hole (BBH) mergers detected in the first and second observing runs completed by Advanced LIGO and Advanced Virgo. We constrain properties of the BBH mass spectrum using models with a range of parameterizations of the BBH mass and spin distributions. We find that the mass distribution of the more massive BH in such binaries is well approximated by models with no more than 1% of BHs more massive than 45 M⊙ and a power-law index of (90% credibility). We also show that BBHs are unlikely to be composed of BHs with large spins aligned to the orbital angular momentum. Modeling the evolution of the BBH merger rate with redshift, we show that it is flat or increasing with redshift with 93% probability. Marginalizing over uncertainties in the BBH population, we find robust estimates of the BBH merger rate density of R= (90% credibility). As the BBH catalog grows in future observing runs, we expect that uncertainties in the population model parameters will shrink, potentially providing insights into the formation of BHs via supernovae, binary interactions of massive stars, stellar cluster dynamics, and the formation history of BHs across cosmic time.

Journal ArticleDOI
TL;DR: In this paper, the authors describe the main characteristics of CNRM-CM6-1, the sixth-generation fully coupled atmosphere-ocean general circulation model jointly developed by the Centre National de Recherches Meteorologiques (CNRM) and Cerfacs for the sixth phase of the Coupled Model Intercomparison Project (CMIP6).
Abstract: This paper describes the main characteristics of CNRM-CM6-1, the sixth-generation fully coupled atmosphere-ocean general circulation model jointly developed by Centre National de Recherches Meteorologiques (CNRM) and Cerfacs for the sixth phase of the Coupled Model Intercomparison Project (CMIP6). The paper provides a description of each component of CNRM-CM6-1, including the coupling method and the new online output software. We emphasize where the model's components have been updated with respect to the former model version, CNRM-CM5.1. In particular, we highlight major improvements in the representation of atmospheric and land processes. Particular attention has also been devoted to mass and energy conservation in the simulated climate system to limit long-term drifts. The climate simulated by CNRM-CM6-1 is then evaluated using CMIP6 historical and Diagnostic, Evaluation and Characterization of Klima (DECK) experiments in comparison with CMIP5 CNRM-CM5.1 equivalent experiments. Overall, the mean surface biases are of similar magnitude but with different spatial patterns. Deep ocean biases are generally reduced, whereas sea ice is too thin in the Arctic. Although the simulated climate variability remains roughly consistent with CNRM-CM5.1, its sensitivity to rising CO2 has increased: the equilibrium climate sensitivity is 4.9 K, which is now close to the upper bound of the range estimated from CMIP5 models.


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1496 moreInstitutions (238)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts do require the extension of the Standard Model and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem, and, more generally, the dynamical origin of the Higgs mechanism, do likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity at least a factor of 5 larger than that of the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will reach several tens of TeV, and allow, for example, the production of new particles whose existence could be indirectly exposed by precision measurements during the preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century. Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. Now, a coordinated preparation effort can be based on a core of an ever-growing consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for construction within the coming ten years through a focused R&D programme.
The FCC-hh concept comprises in the baseline scenario a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC, an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture, a high-reliability and low loss cryogen distribution infrastructure based on Invar, high-power distributed beam transfer using superconducting elements and local magnet energy recovery and re-use technologies that are already being gradually introduced at other CERN accelerators. On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve for a concurrently running physics programme, is an essential lever to come to an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.
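The connection between the ~100 km circumference, the Nb3Sn dipole technology and the 100 TeV collision energy follows from the standard bending relation p[GeV] ≈ 0.3·B[T]·ρ[m]. The sketch below evaluates it; the ~16 T dipole field and the dipole filling factor are illustrative assumptions, not figures quoted in the text above.

```python
import math

# p[GeV] ~ 0.3 * B[T] * rho[m]: why a ~100 km ring with strong dipoles reaches ~100 TeV.
dipole_field_T = 16.0       # assumed Nb3Sn dipole field (illustrative)
circumference_m = 100_000   # ~100 km tunnel, as in the text
dipole_fill = 0.65          # assumed fraction of the circumference that actually bends the beam

bending_radius_m = dipole_fill * circumference_m / (2 * math.pi)
beam_energy_TeV = 0.3 * dipole_field_T * bending_radius_m / 1e3   # GeV -> TeV

print(f"effective bending radius : {bending_radius_m / 1e3:.1f} km")
print(f"energy per beam          : {beam_energy_TeV:.0f} TeV")
print(f"centre-of-mass energy    : {2 * beam_energy_TeV:.0f} TeV")
```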

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1501 moreInstitutions (239)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FC) were reviewed, covering its e+e-, pp, ep and heavy ion programs, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.

Journal ArticleDOI
20 Sep 2019-Science
TL;DR: The climate change–impact literature is reviewed, expanding on the recent report of the Intergovernmental Panel on Climate Change, and it is argued that impacts accelerate as a function of distance from the optimal temperature for an organism or an ecosystem process.
Abstract: Increased concentrations of atmospheric greenhouse gases have led to a global mean surface temperature 1.0°C higher than during the pre-industrial period. We expand on the recent IPCC Special Report on global warming of 1.5°C and review the additional risks associated with higher levels of warming, each having major implications for multiple geographies, climates, and ecosystems. Limiting warming to 1.5°C rather than 2.0°C would be required to maintain substantial proportions of ecosystems and would have clear benefits for human health and economies. These conclusions are relevant for people everywhere, particularly in low- and middle-income countries, where the escalation of climate-related risks may prevent the achievement of the United Nations Sustainable Development Goals.

Journal ArticleDOI
TL;DR: In this article, the authors provide a guide to cellulose nanofibril (CNF) chemical pretreatment possibilities to optimize its production, and exhaustively report the available CNF and bacterial cellulose (BC) chemical modification techniques capable of producing high value-added materials.

Journal ArticleDOI
Jean-Luc Beuzit1, Jean-Luc Beuzit2, Arthur Vigan1, David Mouillet2, Kjetil Dohlen1, Raffaele Gratton3, Anthony Boccaletti4, Jean-François Sauvage1, Jean-François Sauvage5, H. M. Schmid6, Maud Langlois1, Maud Langlois7, Cyril Petit5, Andrea Baruffolo3, M. Feldt8, Julien Milli9, Zahed Wahhaj9, L. Abe10, U. Anselmi3, Jacopo Antichi3, Rudy Barette1, J. Baudrand4, Pierre Baudoz4, Andreas Bazzon6, P. Bernardi4, P. Blanchard1, R. Brast9, Pietro Bruno3, Tristan Buey4, Marcel Carbillet10, M. Carle1, Enrico Cascone11, F. Chapron4, Julien Charton2, Gael Chauvin12, Gael Chauvin2, Riccardo Claudi3, Anne Costille1, V. De Caprio11, J. de Boer13, A. Delboulbe2, Silvano Desidera3, Carsten Dominik14, Mark Downing9, O. Dupuis4, Christophe Fabron1, Daniela Fantinel3, G. Farisato3, Philippe Feautrier2, Enrico Fedrigo9, Thierry Fusco5, Thierry Fusco1, P. Gigan4, Christian Ginski14, Christian Ginski13, Julien Girard15, Julien Girard2, Enrico Giro3, D. Gisler6, L. Gluck2, Cecile Gry1, Th. Henning8, Norbert Hubin9, Emmanuel Hugot1, S. Incorvaia3, M. Jaquet1, M. Kasper9, Eric Lagadec10, Anne-Marie Lagrange2, H. Le Coroller1, D. Le Mignant1, B. Le Ruyet4, G. Lessio3, J. L. Lizon9, M. Llored1, Lars Lundin9, F. Madec1, Yves Magnard2, M. Marteaud4, Patrice Martinez10, D. Maurel2, Francois Menard2, Dino Mesa3, O. Möller-Nilsson8, Thibaut Moulin2, C. Moutou1, Alain Origne1, J. Parisot4, A. Pavlov8, D. Perret4, J. Pragt, Pascal Puget2, P. Rabou2, Joany Andreina Manjarres Ramos8, J.-M. Reess4, F. Rigal, S. Rochat2, Ronald Roelfsema, Gérard Rousset4, A. Roux2, Michel Saisse1, Bernardo Salasnich3, E. Sant'Ambrogio3, Salvo Scuderi3, Damien Ségransan16, Arnaud Sevin4, Ralf Siebenmorgen9, Christian Soenke9, Eric Stadler2, Marcos Suarez9, D. Tiphène4, Massimo Turatto3, Stéphane Udry16, Farrokh Vakili10, L. B. F. M. Waters14, L. B. F. M. Waters17, L. Weber16, Francois Wildi16, Gérard Zins9, Alice Zurlo1, Alice Zurlo18 
TL;DR: The Spectro-Polarimetric High contrast imager for Exoplanets REsearch (SPHERE) was designed and built for the ESO Very Large Telescope (VLT) in Chile as discussed by the authors.
Abstract: Observations of circumstellar environments that look for the direct signal of exoplanets and the scattered light from disks have significant instrumental implications. In the past 15 years, major developments in adaptive optics, coronagraphy, optical manufacturing, wavefront sensing, and data processing, together with a consistent global system analysis have brought about a new generation of high-contrast imagers and spectrographs on large ground-based telescopes with much better performance. One of the most productive imagers is the Spectro-Polarimetric High contrast imager for Exoplanets REsearch (SPHERE), which was designed and built for the ESO Very Large Telescope (VLT) in Chile. SPHERE includes an extreme adaptive optics system, a highly stable common path interface, several types of coronagraphs, and three science instruments. Two of them, the Integral Field Spectrograph (IFS) and the Infra-Red Dual-band Imager and Spectrograph (IRDIS), were designed to efficiently cover the near-infrared range in a single observation for an efficient search of young planets. The third instrument, ZIMPOL, was designed for visible polarimetric observation to look for the reflected light of exoplanets and the light scattered by debris disks. These three scientific instruments enable the study of circumstellar environments at unprecedented angular resolution, both in the visible and the near-infrared. In this work, we thoroughly present SPHERE and its on-sky performance after four years of operations at the VLT.
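As a rough sense of the angular resolution involved, the diffraction limit λ/D can be evaluated for an 8.2 m VLT unit telescope at wavelengths representative of the visible (ZIMPOL) and near-infrared (IRDIS) channels; the aperture and wavelengths below are assumed representative values, not specifications taken from this abstract.

```python
import math

D_m = 8.2                                   # assumed VLT unit-telescope aperture
RAD_TO_MAS = 180 / math.pi * 3600 * 1000    # radians -> milliarcseconds

for label, wavelength_um in [("visible, ~0.65 um", 0.65), ("near-infrared, ~1.6 um", 1.6)]:
    theta_mas = wavelength_um * 1e-6 / D_m * RAD_TO_MAS
    print(f"lambda/D at {label:22s}: ~{theta_mas:.0f} mas")
```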

Journal ArticleDOI
Kazuhiko Nakagawa1, Edward B Garon2, Takashi Seto, Makoto Nishio3, Santiago Ponce Aix4, Luis Paz-Ares4, Chao-Hua Chiu5, Keunchil Park6, Silvia Novello7, Ernest Nadal, Fumio Imamura, Kiyotaka Yoh, Jin-Yuan Shih8, Kwok Hung Au, Denis Moro-Sibilot9, Sotaro Enatsu10, Annamaria Zimmermann10, Bente Frimodt-Moller10, Carla Visseren-Grul, Martin Reck, Quincy Chu, Alexis B. Cortot, Jean-Louis Pujol, Elizabeth Fabre, Corinne Lamour, Helge Bischoff, Jens Kollmeier, M Kimmich, Walburga Engel-Riedel, Stefan Hammerschmidt, Wolfgang Schütte, Konstantinos N. Syrigos, James Chung-Man Ho, Kwok-Hung Au, Andrea Ardizzoni, Giulia Pasello, Vanessa Gregorc, Alessandro Del Conte, Domenico Galetta, Toshiaki Takahashi, Toru Kumagai, Katsuyuki Hotta, Yasushi Goto, Yukio Hosomi, Hiroshi Sakai, Yuichi Takiguchi, Young Hak Kim, Takayasu Kurata, Hiroyuki Yamaguchi, Haruko Daga, Isamu Okamoto, Miyako Satouchi, Satoshi Ikeda, Kazuo Kasahara, Shinji Atagi, Koichi Azuma, Keisuke Aoe, Yoshitsugu Horio, Nobuyuki Yamamoto, Hiroshi Tanaka, Satoshi Watanabe, Naoyuki Nogami, Tomohiro Ozaki, Ryo Koyama, Tomonori Hirashima, Hiroyasu Kaneda, Keisuke Tomii, Yuka Fujita, Masahiro Seike, Naoki Nishimura, Terufumi Kato, Masao Ichiki, Hideo Saka, Katsuya Hirano, Yasuharu Nakahara, Shunichi Sugawara, Sang-We Kim, Young Joo Min, Hyun Woo Lee, Jin-Hyoung Kang, Ho Jung An, Ki Hyeong Lee, Jin Soo Kim, Gyeong-Won Lee, Sung Yong Lee, A. Alexandru, Anghel Adrian Udrea, Óscar Juan-Vidal, Ernest Nadal-Alforja, Ignacio Gil-Bazo, Santiago Ponce-Aix, Belén Rubio-Viqueira, Miriam Alonso Garcia, Enriqueta Felip Font, Jose Fuentes Pradera, Juan Coves Sarto, Meng-Chih Lin, Wu Chou Su, Te Chun Hsia, Gee-Chen Chang, Yu-Feng Wei, Jian Su, Irfan Cicin, Tuncay Göksel, Hakan Harputluoglu, Ozgur Ozyilkan, Ivo Henning, Sanjay Popat, Olivia Hatcher, Kathryn Mileham, Jared Acoba, Edward B. Garon2, Gabriel Jung, Moses Sundar Raj, William J. Martin, Shaker R. Dakhil 
TL;DR: The RELAY trial as mentioned in this paper evaluated erlotinib, an EGFR tyrosine kinase inhibitor (TKI) standard of care, plus ramucirumab, a human IgG1 VEGFR2 antagonist, or placebo in patients with untreated EGFR-mutated metastatic NSCLC.
Abstract: Background: Dual blockade of the EGFR and VEGF pathways in EGFR-mutated metastatic non-small-cell lung cancer (NSCLC) is supported by preclinical and clinical data, yet the approach is not widely implemented. RELAY assessed erlotinib, an EGFR tyrosine kinase inhibitor (TKI) standard of care, plus ramucirumab, a human IgG1 VEGFR2 antagonist, or placebo in patients with untreated EGFR-mutated metastatic NSCLC. Methods: This is a worldwide, double-blind, phase 3 trial done in 100 hospitals, clinics, and medical centres in 13 countries. Eligible patients were aged 18 years or older (20 years or older in Japan and Taiwan) at the time of study entry, had stage IV NSCLC, with an EGFR exon 19 deletion (ex19del) or exon 21 substitution (Leu858Arg) mutation, an Eastern Cooperative Oncology Group performance status of 0 or 1, and no CNS metastases. We randomly assigned eligible patients in a 1:1 ratio to receive oral erlotinib (150 mg/day) plus either intravenous ramucirumab (10 mg/kg) or matching placebo once every 2 weeks. Randomisation was done by an interactive web response system with a computer-generated sequence and stratified by sex, geographical region, EGFR mutation type, and EGFR testing method. The primary endpoint was investigator-assessed progression-free survival in the intention-to-treat population. Safety was assessed in all patients who received at least one dose of study treatment. This trial is registered at ClinicalTrials.gov, NCT02411448, and is ongoing for long-term survival follow-up. Findings: Between Jan 28, 2016, and Feb 1, 2018, 449 eligible patients were enrolled and randomly assigned to treatment with ramucirumab plus erlotinib (n=224) or placebo plus erlotinib (n=225). Median duration of follow-up was 20·7 months (IQR 15·8–27·2). At the time of primary analysis, progression-free survival was significantly longer in the ramucirumab plus erlotinib group (19·4 months [95% CI 15·4–21·6]) than in the placebo plus erlotinib group (12·4 months [11·0–13·5]), with a stratified hazard ratio of 0·59 (95% CI 0·46–0·76). Interpretation: Ramucirumab plus erlotinib demonstrated superior progression-free survival compared with placebo plus erlotinib in patients with untreated EGFR-mutated metastatic NSCLC. Safety was consistent with the safety profiles of the individual compounds in advanced lung cancer. The RELAY regimen is a viable new treatment option for the initial treatment of EGFR-mutated metastatic NSCLC. Funding: Eli Lilly.

Journal ArticleDOI
TL;DR: This ERS task force summarises the most recent scientific and methodological developments regarding respiratory mechanics and respiratory muscle assessment by addressing the validity, precision, reproducibility, prognostic value and responsiveness to interventions of various methods.
Abstract: Assessing respiratory mechanics and muscle function is critical for both clinical practice and research purposes. Several methodological developments over the past two decades have enhanced our understanding of respiratory muscle function and responses to interventions across the spectrum of health and disease. They are especially useful in diagnosing, phenotyping and assessing treatment efficacy in patients with respiratory symptoms and neuromuscular diseases. Considerable research has been undertaken over the past 17 years, since the publication of the previous American Thoracic Society (ATS)/European Respiratory Society (ERS) statement on respiratory muscle testing in 2002. Key advances have been made in the field of mechanics of breathing, respiratory muscle neurophysiology (electromyography, electroencephalography and transcranial magnetic stimulation) and on respiratory muscle imaging (ultrasound, optoelectronic plethysmography and structured light plethysmography). Accordingly, this ERS task force reviewed the field of respiratory muscle testing in health and disease, with particular reference to data obtained since the previous ATS/ERS statement. It summarises the most recent scientific and methodological developments regarding respiratory mechanics and respiratory muscle assessment by addressing the validity, precision, reproducibility, prognostic value and responsiveness to interventions of various methods. A particular emphasis is placed on assessment during exercise, which is a useful condition to stress the respiratory system.

Journal ArticleDOI
Nabila Aghanim1, Yashar Akrami2, Yashar Akrami3, Yashar Akrami4  +213 moreInstitutions (66)
TL;DR: The 2018 Planck CMB likelihoods were presented in this paper, following a hybrid approach similar to the 2015 one, with different approximations at low and high multipoles, and implementing several methodological and analysis refinements.
Abstract: This paper describes the 2018 Planck CMB likelihoods, following a hybrid approach similar to the 2015 one, with different approximations at low and high multipoles, and implementing several methodological and analysis refinements. With more realistic simulations, and better correction and modelling of systematics, we can now make full use of the High Frequency Instrument polarization data. The low-multipole 100x143 GHz EE cross-spectrum constrains the reionization optical-depth parameter $\tau$ to better than 15% (in combination with the other low- and high-$\ell$ likelihoods). We also update the 2015 baseline low-$\ell$ joint TEB likelihood based on the Low Frequency Instrument data, which provides a weaker $\tau$ constraint. At high multipoles, a better model of the temperature-to-polarization leakage and corrections for the effective calibrations of the polarization channels (polarization efficiency or PE) allow us to fully use the polarization spectra, improving the constraints on the $\Lambda$CDM parameters by 20 to 30% compared to TT-only constraints. Tests on the modelling of the polarization demonstrate good consistency, with some residual modelling uncertainties, the accuracy of the PE modelling being the main limitation. Using our various tests, simulations, and comparison between different high-$\ell$ implementations, we estimate the consistency of the results to be better than the 0.5$\sigma$ level. Minor curiosities already present before (differences between $\ell<800$ and $\ell>800$ parameters or the preference for more smoothing of the $C_\ell$ peaks) are shown to be driven by the TT power spectrum and are not significantly modified by the inclusion of polarization. Overall, the legacy Planck CMB likelihoods provide a robust tool for constraining the cosmological model and represent a reference for future CMB observations. (Abridged)

Journal ArticleDOI
TL;DR: Nielsen et al. present a statistical analysis of the first 300 stars observed by the Gemini Planet Imager Exoplanet Survey (GPIES) to infer the underlying distributions of substellar companions with respect to their mass, semimajor axis, and host stellar mass.
Abstract: Author(s): Nielsen, EL; De Rosa, RJ; Macintosh, B; Wang, JJ; Ruffio, JB; Chiang, E; Marley, MS; Saumon, D; Savransky, D; Mark Ammons, S; Bailey, VP; Barman, T; Blain, C; Bulger, J; Burrows, A; Chilcote, J; Cotten, T; Czekala, I; Doyon, R; Duchene, G; Esposito, TM; Fabrycky, D; Fitzgerald, MP; Follette, KB; Fortney, JJ; Gerard, BL; Goodsell, SJ; Graham, JR; Greenbaum, AZ; Hibon, P; Hinkley, S; Hirsch, LA; Hom, J; Hung, LW; Ilene Dawson, R; Ingraham, P; Kalas, P; Konopacky, Q; Larkin, JE; Lee, EJ; Lin, JW; Maire, J; Marchis, F; Marois, C; Metchev, S; Millar-Blanchaer, MA; Morzinski, KM; Oppenheimer, R; Palmer, D; Patience, J; Perrin, M; Poyneer, L; Pueyo, L; Rafikov, RR; Rajan, A; Rameau, J; Rantakyro, FT; Ren, B; Schneider, AC; Sivaramakrishnan, A; Song, I; Soummer, R; Tallis, M; Thomas, S; Ward-Duong, K; Wolff, S | Abstract: We present a statistical analysis of the first 300 stars observed by the Gemini Planet Imager Exoplanet Survey. This subsample includes six detected planets and three brown dwarfs; from these detections and our contrast curves we infer the underlying distributions of substellar companions with respect to their mass, semimajor axis, and host stellar mass. We uncover a strong correlation between planet occurrence rate and host star mass, with stars of M* ≳ 1.5 M⊙ more likely to host planets with masses between 2 and 13 M_Jup and semimajor axes of 3-100 au at 99.92% confidence. We fit a double power-law model in planet mass (m) and semimajor axis (a) for planet populations around high-mass stars (M* ≳ 1.5 M⊙) of the form d²N/(dm da) ∝ m^α a^β, finding α = -2.4 (+0.8) and β = -2.0 (+0.5), and an integrated occurrence rate of % between 5-13 M_Jup and 10-100 au. A significantly lower occurrence rate is obtained for brown dwarfs around all stars, with % of stars hosting a brown dwarf companion between 13-80 M_Jup and 10-100 au. Brown dwarfs also appear to be distributed differently in mass and semimajor axis compared to giant planets; whereas giant planets follow a bottom-heavy mass distribution and favor smaller semimajor axes, brown dwarfs exhibit just the opposite behaviors. Comparing to studies of short-period giant planets from the radial velocity method, our results are consistent with a peak in occurrence of giant planets between ∼1 and 10 au. We discuss how these trends, including the preference of giant planets for high-mass host stars, point to formation of giant planets by core/pebble accretion, and formation of brown dwarfs by gravitational instability.
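The fitted companion distribution above is a double power law in planet mass and semimajor axis, d²N/(dm da) ∝ m^α a^β. Since the overall normalization is not given in this extract, the sketch below only integrates the fitted shape (α = −2.4, β = −2.0) over the 2–13 M_Jup, 3–100 au range to show how strongly it favours lower masses and smaller separations; the sub-ranges compared are chosen here for illustration.

```python
# Relative weights under the double power law d^2N/(dm da) ∝ m**alpha * a**beta.
alpha, beta = -2.4, -2.0   # indices quoted in the abstract (central values)

def power_law_integral(lo, hi, p):
    """Integral of x**p from lo to hi (valid for p != -1)."""
    return (hi ** (p + 1) - lo ** (p + 1)) / (p + 1)

def box_weight(m_lo, m_hi, a_lo, a_hi):
    """Unnormalized weight of the separable double power law over a mass/separation box."""
    return power_law_integral(m_lo, m_hi, alpha) * power_law_integral(a_lo, a_hi, beta)

total = box_weight(2, 13, 3, 100)  # masses in M_Jup, separations in au, range from the abstract
print("fraction of companions above 5 M_Jup :", round(box_weight(5, 13, 3, 100) / total, 2))
print("fraction of companions beyond 30 au  :", round(box_weight(2, 13, 30, 100) / total, 2))
```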

Journal ArticleDOI
TL;DR: This multicentre randomised, non-comparative, open-label, phase 2 trial aimed to prospectively assess the anti-PD-1 monoclonal antibody alone or in combination with anti-cytotoxic T-lymphocyte protein 4 (CTLA-4) antibody in patients with malignant pleural mesothelioma.
Abstract: Background: There is no recommended therapy for malignant pleural mesothelioma that has progressed after first-line pemetrexed and platinum-based chemotherapy. Disease control has been less than 30% in all previous studies of second-line drugs. Preliminary results have suggested that anti-programmed cell death 1 (PD-1) monoclonal antibody could be efficacious in these patients. We thus aimed to prospectively assess the anti-PD-1 monoclonal antibody alone or in combination with anti-cytotoxic T-lymphocyte protein 4 (CTLA-4) antibody in patients with malignant pleural mesothelioma. Methods: This multicentre randomised, non-comparative, open-label, phase 2 trial was done at 21 hospitals in France. Eligible patients were aged 18 years or older with an Eastern Cooperative Oncology Group performance status of 0–1, histologically proven malignant pleural mesothelioma progressing after first-line or second-line pemetrexed and platinum-based treatments, measurable disease by CT, and life expectancy greater than 12 weeks. Patients were randomly allocated (1:1) to receive intravenous nivolumab (3 mg/kg bodyweight) every 2 weeks, or intravenous nivolumab (3 mg/kg every 2 weeks) plus intravenous ipilimumab (1 mg/kg every 6 weeks), given until progression or unacceptable toxicity. Central randomisation was stratified by histology (epithelioid vs non-epithelioid), treatment line (second line vs third line), and chemosensitivity to previous treatment (progression ≥3 months vs <3 months). Findings: Between March 24 and August 25, 2016, 125 eligible patients were recruited and assigned to either nivolumab (n=63) or nivolumab plus ipilimumab (n=62). In the first 108 eligible patients, 12-week disease control was achieved by 24 (44%; 95% CI 31–58) of 54 patients in the nivolumab group and 27 (50%; 37–63) of 54 patients in the nivolumab plus ipilimumab group. In the intention-to-treat population, 12-week disease control was achieved by 25 (40%; 28–52) of 63 patients in the nivolumab group and 32 (52%; 39–64) of 62 patients in the combination group. Nine (14%) of 63 patients in the nivolumab group and 16 (26%) of 61 patients in the combination group had grade 3–4 toxicities. The most frequent grade 3 adverse events were asthenia (one [2%] in the nivolumab group vs three [5%] in the combination group), asymptomatic increase in aspartate aminotransferase or alanine aminotransferase (none vs four [7%] of each), and asymptomatic lipase increase (two [3%] vs one [2%]). No patients had toxicities leading to death in the nivolumab group, whereas three (5%) of 62 in the combination group did (one fulminant hepatitis, one encephalitis, and one acute kidney failure). Interpretation: Anti-PD-1 nivolumab monotherapy or nivolumab plus anti-CTLA-4 ipilimumab combination therapy both showed promising activity in relapsed patients with malignant pleural mesothelioma, without unexpected toxicity. These regimens require confirmation in larger clinical trials. Funding: French Cooperative Thoracic Intergroup.

Journal ArticleDOI
TL;DR: In this article, the authors reported on the detection of strong Hα emission from two distinct locations in the PDS 70 system, one corresponding to the previously discovered planet PDS 70 b, which confirms the earlier Hα detection, and another located close to the outer edge of the gap.
Abstract: Newly forming protoplanets are expected to create cavities and substructures in young, gas-rich protoplanetary disks [1–3], but they are difficult to detect as they could be confused with disk features affected by advanced image analysis techniques [4,5]. Recently, a planet was discovered inside the gap of the transitional disk of the T Tauri star PDS 70 [6,7]. Here, we report on the detection of strong Hα emission from two distinct locations in the PDS 70 system, one corresponding to the previously discovered planet PDS 70 b, which confirms the earlier Hα detection [8], and another located close to the outer edge of the gap, coinciding with a previously identified bright dust spot in the disk and with a small opening in a ring of molecular emission [6,7,9]. We identify this second Hα peak as a second protoplanet in the PDS 70 system. The Hα emission spectra of both protoplanets indicate ongoing accretion onto the protoplanets [10,11], which appear to be near a 2:1 mean motion resonance. Our observations show that adaptive-optics-assisted, medium-resolution integral field spectroscopy with MUSE [12] targeting accretion signatures will be a powerful way to trace ongoing planet formation in transitional disks at different stages of their evolution. Finding more young planetary systems in mean motion resonance would give credibility to the Grand Tack hypothesis in which Jupiter and Saturn migrated in a resonance orbit during the early formation period of our Solar System [13]. Two Hα emission peaks are detected within the disk of the T Tauri star PDS 70: one corresponds to protoplanet PDS 70 b, and the other is associated with a second accreting planet of a few Jupiter masses at ~35 au. The two protoplanets are near 2:1 mean motion resonance, supporting migration scenarios of giant planets during planetary formation.
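With the outer protoplanet near ~35 au and the pair close to a 2:1 mean-motion resonance, Kepler's third law fixes the implied orbital geometry; the short calculation below is a consistency sketch, with only the ~35 au separation taken from the abstract.

```python
# Kepler's third law: a ∝ P^(2/3), so a 2:1 period ratio implies an axis ratio of 2**(2/3).
a_outer_au = 35.0                       # approximate location of the second protoplanet (from the abstract)
axis_ratio = 2 ** (2 / 3)               # semimajor-axis ratio for a 2:1 mean-motion resonance
a_inner_au = a_outer_au / axis_ratio    # implied location of the inner protoplanet, PDS 70 b

print(f"semimajor-axis ratio for a 2:1 resonance : {axis_ratio:.2f}")
print(f"implied inner-planet separation          : {a_inner_au:.0f} au")
```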

Journal ArticleDOI
Tomotada Akutsu1, Masaki Ando1, Masaki Ando2, Koya Arai2  +199 moreInstitutions (48)
TL;DR: KAGRA as discussed by the authors is a 2.5-generation GW detector with two 3 km baseline arms arranged in an 'L' shape, similar to the second generations of Advanced LIGO and Advanced Virgo, but it will be operating at cryogenic temperatures with sapphire mirrors.
Abstract: The recent detections of gravitational waves (GWs) reported by the LIGO and Virgo collaborations have made a significant impact on physics and astronomy. A global network of GW detectors will play a key role in uncovering the unknown nature of the sources in coordinated observations with astronomical telescopes and detectors. Here we introduce KAGRA, a new GW detector with two 3 km baseline arms arranged in an ‘L’ shape. KAGRA’s design is similar to the second generations of Advanced LIGO and Advanced Virgo, but it will be operating at cryogenic temperatures with sapphire mirrors. This low-temperature feature is advantageous for improving the sensitivity around 100 Hz and is considered to be an important feature for the third-generation GW detector concept (for example, the Einstein Telescope of Europe or the Cosmic Explorer of the United States). Hence, KAGRA is often called a 2.5-generation GW detector based on laser interferometry. KAGRA’s first observation run is scheduled in late 2019, aiming to join the third observation run of the advanced LIGO–Virgo network. When operating along with the existing GW detectors, KAGRA will be helpful in locating GW sources more accurately and determining the source parameters with higher precision, providing information for follow-up observations of GW trigger candidates.

Journal ArticleDOI
TL;DR: Retrospective analyses suggest that the echinoderm microtubule-associated protein-like 4 gene-ALK variant (EML4-ALK) may influence ALK-inhibitor treatment benefit, and alectinib continues to demonstrate superior investigator-assessed PFS versus crizotinib in untreated ALK-positive NSCLC, irrespective of EML4-ALK variant.

Journal ArticleDOI
TL;DR: In this article, the authors introduce CNRM-ESM2-1, the second-generation Earth system (ES) model developed by CNRM-CERFACS for the sixth phase of the Coupled Model Intercomparison Project (CMIP6), which extends the Atmosphere-Ocean General Circulation Model CNRM-CM6-1 by adding interactive ES components such as the carbon cycle, aerosols, and atmospheric chemistry.
Abstract: This study introduces CNRM‐ESM2‐1, the Earth system (ES) model of second generation developed by CNRM‐CERFACS for the sixth phase of the Coupled Model Intercomparison Project (CMIP6). CNRM‐ESM2‐1 offers a higher model complexity than the Atmosphere‐Ocean General Circulation Model CNRM‐CM6‐1 by adding interactive ES components such as carbon cycle, aerosols, and atmospheric chemistry. As both models share the same code, physical parameterizations, and grid resolution, they offer a fully traceable framework to investigate how far the represented ES processes impact the model performance over present‐day, response to external forcing and future climate projections. Using a large variety of CMIP6 experiments, we show that represented ES processes impact more prominently the model response to external forcing than the model performance over present‐day. Both models display comparable performance at replicating modern observations although the mean climate of CNRM‐ESM2‐1 is slightly warmer than that of CNRM‐CM6‐1. This difference arises from land cover‐aerosol interactions where the use of different soil vegetation distributions between both models impacts the rate of dust emissions. This interaction results in a smaller aerosol burden in CNRM‐ESM2‐1 than in CNRM‐CM6‐1, leading to a different surface radiative budget and climate. Greater differences are found when comparing the model response to external forcing and future climate projections. Represented ES processes damp future warming by up to 10% in CNRM‐ESM2‐1 with respect to CNRM‐CM6‐1. The representation of land vegetation and the CO2‐water‐stomatal feedback between both models explain about 60% of this difference. The remainder is driven by other ES feedbacks such as the natural aerosol feedback.

Journal ArticleDOI
TL;DR: In this paper, a novel spectral mixture model, called the augmented linear mixing model (ALMM), is proposed to address spectral variability by applying a data-driven learning strategy in inverse problems of hyperspectral unmixing.
Abstract: Hyperspectral imagery collected from airborne or satellite sources inevitably suffers from spectral variability, making it difficult for spectral unmixing to accurately estimate abundance maps. The classical unmixing model, the linear mixing model (LMM), generally fails to handle this sticky issue effectively. To this end, we propose a novel spectral mixture model, called the augmented LMM, to address spectral variability by applying a data-driven learning strategy in inverse problems of hyperspectral unmixing. The proposed approach models the main spectral variability (i.e., scaling factors) generated by variations in illumination or topography separately by means of the endmember dictionary. It then models other spectral variabilities caused by environmental conditions (e.g., local temperature and humidity and atmospheric effects) and instrumental configurations (e.g., sensor noise), and material nonlinear mixing effects, by introducing a spectral variability dictionary. To effectively run the data-driven learning strategy, we also propose a reasonable prior knowledge for the spectral variability dictionary, whose atoms are assumed to be low-coherent with spectral signatures of endmembers, which leads to a well-known low-coherence dictionary learning problem. Thus, a dictionary learning technique is embedded in the framework of spectral unmixing so that the algorithm can learn the spectral variability dictionary and estimate the abundance maps simultaneously. Extensive experiments on synthetic and real datasets are performed to demonstrate the superiority and effectiveness of the proposed method in comparison with the previous state-of-the-art methods.
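To make the model structure described above concrete, the sketch below writes out the augmented mixing of a single pixel in NumPy: a per-pixel scaling of the classical linear mixture plus a structured spectral-variability term. The dimensions, random values and variable names are purely illustrative, and the paper's dictionary-learning and unmixing algorithm is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): L bands, P endmembers, K variability atoms.
L, P, K = 200, 4, 10

E = rng.random((L, P))                    # endmember dictionary (one spectral signature per column)
a = rng.dirichlet(np.ones(P))             # abundances for one pixel (non-negative, sum to one)
s = 0.9                                   # per-pixel scaling factor (illumination/topography effect)
D = 0.01 * rng.standard_normal((L, K))    # spectral-variability dictionary (assumed low-coherent with E)
b = rng.standard_normal(K)                # variability coefficients for this pixel
noise = 0.001 * rng.standard_normal(L)    # sensor noise

# Augmented linear mixing of one observed pixel:
# scaled classical LMM term + structured spectral variability + noise.
y = s * (E @ a) + D @ b + noise

print("pixel spectrum shape:", y.shape, "| abundances:", np.round(a, 3))
```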