
Showing papers published by the University of Colorado Boulder in 2019


Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, and 403 more (82 institutions)
TL;DR: In this article, the Event Horizon Telescope was used to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87.
Abstract: When surrounded by a transparent emission region, black holes are expected to reveal a dark shadow caused by gravitational light bending and photon capture at the event horizon. To image and study this phenomenon, we have assembled the Event Horizon Telescope, a global very long baseline interferometry array observing at a wavelength of 1.3 mm. This allows us to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87. We have resolved the central compact radio source as an asymmetric bright emission ring with a diameter of 42 ± 3 μas, which is circular and encompasses a central depression in brightness with a flux ratio ≳10:1. The emission ring is recovered using different calibration and imaging schemes, with its diameter and width remaining stable over four different observations carried out on different days. Overall, the observed image is consistent with expectations for the shadow of a Kerr black hole as predicted by general relativity. The asymmetry in brightness in the ring can be explained in terms of relativistic beaming of the emission from a plasma rotating close to the speed of light around a black hole. We compare our images to an extensive library of ray-traced general-relativistic magnetohydrodynamic simulations of black holes and derive a central mass of M = (6.5 ± 0.7) × 10⁹ M⊙. Our radio-wave observations thus provide powerful evidence for the presence of supermassive black holes in the centers of galaxies and as the central engines of active galactic nuclei. They also present a new tool to explore gravity in its most extreme limit and on a mass scale that was so far not accessible.

2,589 citations
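The quoted mass and ring size can be cross-checked with a back-of-the-envelope calculation (not part of the paper): for a non-rotating black hole, the photon-capture shadow has angular diameter ≈ 2√27·GM/(c²D). Assuming a distance to M87 of ~16.8 Mpc (a literature value not stated in the abstract), the derived mass of 6.5×10⁹ M⊙ gives a shadow of roughly 40 μas:

```python
import math

# Physical constants (SI)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
PC = 3.086e16        # parsec, m

def shadow_diameter_uas(mass_msun, distance_mpc):
    """Angular diameter (micro-arcsec) of the photon-capture shadow,
    2*sqrt(27)*GM/c^2 / D, for an idealized non-rotating black hole."""
    r_g = G * mass_msun * M_SUN / c**2                   # gravitational radius, m
    theta_rad = 2 * math.sqrt(27) * r_g / (distance_mpc * 1e6 * PC)
    return theta_rad * (180 / math.pi) * 3600 * 1e6      # rad -> micro-arcsec

print(round(shadow_diameter_uas(6.5e9, 16.8), 1))  # ~39.7 micro-arcsec
```

The observed emission ring sits slightly outside the bare shadow, so a value a few μas below the measured 42 ± 3 μas is expected from this idealized estimate.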


Journal ArticleDOI
TL;DR: Efforts to reverse global trends in freshwater degradation now depend on bridging an immense gap between the aspirations of conservation biologists and the accelerating rate of species endangerment.
Abstract: In the 12 years since Dudgeon et al. (2006) reviewed major pressures on freshwater ecosystems, the biodiversity crisis in the world’s lakes, reservoirs, rivers, streams and wetlands has deepened. While lakes, reservoirs and rivers cover only 2.3% of the Earth’s surface, these ecosystems host at least 9.5% of the Earth’s described animal species. Furthermore, using the World Wide Fund for Nature’s Living Planet Index, freshwater population declines (83% between 1970 and 2014) continue to outpace contemporaneous declines in marine or terrestrial systems. The Anthropocene has brought multiple new and varied threats that disproportionately impact freshwater systems. We document 12 emerging threats to freshwater biodiversity that are either entirely new since 2006 or have since intensified: (i) changing climates; (ii) e-commerce and invasions; (iii) infectious diseases; (iv) harmful algal blooms; (v) expanding hydropower; (vi) emerging contaminants; (vii) engineered nanomaterials; (viii) microplastic pollution; (ix) light and noise; (x) freshwater salinisation; (xi) declining calcium; and (xii) cumulative stressors. Effects are evidenced for amphibians, fishes, invertebrates, microbes, plants, turtles and waterbirds, with potential for ecosystem-level changes through bottom-up and top-down processes. In our highly uncertain future, the net effects of these threats raise serious concerns for freshwater ecosystems. However, we also highlight opportunities for conservation gains as a result of novel management tools (e.g. environmental flows, environmental DNA) and specific conservation-oriented actions (e.g. dam removal, habitat protection policies, managed relocation of species) that have been met with varying levels of success. Moving forward, we advocate hybrid approaches that manage fresh waters as crucial ecosystems for human life support as well as essential hotspots of biodiversity and ecological function.

1,230 citations


Journal ArticleDOI
TL;DR: The NCCN Guidelines for Prostate Cancer include recommendations regarding diagnosis, risk stratification and workup, treatment options for localized disease, and management of recurrent and advanced disease for clinicians who treat patients with prostate cancer.
Abstract: The NCCN Guidelines for Prostate Cancer include recommendations regarding diagnosis, risk stratification and workup, treatment options for localized disease, and management of recurrent and advanced disease for clinicians who treat patients with prostate cancer. The portions of the guidelines included herein focus on the roles of germline and somatic genetic testing, risk stratification with nomograms and tumor multigene molecular testing, androgen deprivation therapy, secondary hormonal therapy, chemotherapy, and immunotherapy in patients with prostate cancer.

1,218 citations


Journal ArticleDOI
21 Jan 2019
TL;DR: CheXpert is a large dataset of 224,316 chest radiographs of 65,240 patients, automatically labeled for 14 observations from radiology reports in a way that captures the uncertainties inherent in radiograph interpretation; a 200-study validation set was manually annotated by 3 board-certified radiologists.
Abstract: Large, labeled datasets have driven deep learning methods to achieve expert-level performance on a variety of medical imaging tasks. We present CheXpert, a large dataset that contains 224,316 chest radiographs of 65,240 patients. We design a labeler to automatically detect the presence of 14 observations in radiology reports, capturing uncertainties inherent in radiograph interpretation. We investigate different approaches to using the uncertainty labels for training convolutional neural networks that output the probability of these observations given the available frontal and lateral radiographs. On a validation set of 200 chest radiographic studies which were manually annotated by 3 board-certified radiologists, we find that different uncertainty approaches are useful for different pathologies. We then evaluate our best model on a test set composed of 500 chest radiographic studies annotated by a consensus of 5 board-certified radiologists, and compare the performance of our model to that of 3 additional radiologists in the detection of 5 selected pathologies. On Cardiomegaly, Edema, and Pleural Effusion, the model ROC and PR curves lie above all 3 radiologist operating points. We release the dataset to the public as a standard benchmark to evaluate performance of chest radiograph interpretation models.

1,070 citations
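The "different uncertainty approaches" mentioned in the abstract can be illustrated with a minimal sketch. The paper names policies such as U-Ignore, U-Zeros, and U-Ones for handling uncertain labels; the function below is a hypothetical implementation of that label mapping, not the authors' code, using the common convention of 1 = positive, 0 = negative, -1 = uncertain:

```python
import numpy as np

def map_uncertain(labels, policy):
    """Map uncertain (-1) labels to binary training targets.
    U-Zeros / U-Ones treat uncertain as negative / positive;
    U-Ignore masks uncertain entries out of the loss (here: nan)."""
    out = np.asarray(labels, dtype=float).copy()
    if policy == "U-Zeros":
        out[out == -1] = 0.0
    elif policy == "U-Ones":
        out[out == -1] = 1.0
    elif policy == "U-Ignore":
        out[out == -1] = np.nan   # downstream loss skips nan entries
    else:
        raise ValueError(f"unknown policy: {policy}")
    return out

print(map_uncertain([1, -1, 0], "U-Ones"))  # uncertain label mapped to positive
```

The paper's finding that different policies work best for different pathologies amounts to choosing the mapping per output column rather than globally.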


Journal ArticleDOI
TL;DR: Recommendations are made on how accelerated testing should be performed to rapidly develop solar cells that are both extraordinarily efficient and stable.
Abstract: This review article examines the current state of understanding in how metal halide perovskite solar cells can degrade when exposed to moisture, oxygen, heat, light, mechanical stress, and reverse bias. It also highlights strategies for improving stability, such as tuning the composition of the perovskite, introducing hydrophobic coatings, replacing metal electrodes with carbon or transparent conducting oxides, and packaging. The article concludes with recommendations on how accelerated testing should be performed to rapidly develop solar cells that are both extraordinarily efficient and stable.

962 citations


Journal ArticleDOI
TL;DR: The authors characterize the operational environment of asteroid Bennu, validate its photometric phase function, and demonstrate its accelerating rotation rate, likely due to the YORP effect, using data acquired during the approach phase of the OSIRIS-REx mission.
Abstract: During its approach to asteroid (101955) Bennu, NASA’s Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) spacecraft surveyed Bennu’s immediate environment, photometric properties, and rotation state. Discovery of a dusty environment, a natural satellite, or unexpected asteroid characteristics would have had consequences for the mission’s safety and observation strategy. Here we show that spacecraft observations during this period were highly sensitive to satellites (sub-meter scale) but reveal none, although later navigational images indicate that further investigation is needed. We constrain average dust production in September 2018 from Bennu’s surface to an upper limit of 150 g s⁻¹ averaged over 34 min. Bennu’s disk-integrated photometric phase function validates measurements from the pre-encounter astronomical campaign. We demonstrate that Bennu’s rotation rate is accelerating continuously at (3.63 ± 0.52) × 10⁻⁶ degrees day⁻², likely due to the Yarkovsky–O’Keefe–Radzievskii–Paddack (YORP) effect, with evolutionary implications.

905 citations
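To put the measured spin-up in context, a quick estimate converts the angular acceleration into a period drift. The rotation period of ≈4.30 h is an assumed literature value, not stated in the abstract:

```python
P_HOURS = 4.30          # Bennu rotation period, hours (assumed literature value)
DOMEGA = 3.63e-6        # measured spin-up, degrees / day^2 (from the abstract)

P_days = P_HOURS / 24
omega = 360.0 / P_days                 # rotation rate, degrees/day
dP_dt = -360.0 / omega**2 * DOMEGA     # period drift, day/day (negative: spinning up)
seconds_per_century = dP_dt * 36525 * 86400
print(round(seconds_per_century, 2))   # period shortens by about 1 s per century
```

A one-second-per-century change sounds tiny, but compounded over millions of years it can drive the shape evolution and surface mass movement alluded to by "evolutionary implications."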


Journal ArticleDOI
TL;DR: The application and evolution of RE-AIM are described, along with lessons learned from its use; current foci include an increased emphasis on cost and adaptations to programs and expanded use of qualitative methods to understand "how" and "why" results came about.
Abstract: The RE-AIM planning and evaluation framework was conceptualized two decades ago. As one of the most frequently applied implementation frameworks, RE-AIM has now been cited in over 2,800 publications. This paper describes the application and evolution of RE-AIM as well as lessons learned from its use. RE-AIM has been applied most often in public health and health behavior change research, but increasingly in more diverse content areas and within clinical, community, and corporate settings. We discuss challenges of using RE-AIM while encouraging a more pragmatic use of key dimensions rather than comprehensive applications of all elements. Current foci of RE-AIM include increasing the emphasis on cost and adaptations to programs and expanding the use of qualitative methods to understand "how" and "why" results came about. The framework will continue to evolve to focus on contextual and explanatory factors related to RE-AIM outcomes, package RE-AIM for use by non-researchers, and integrate RE-AIM with other pragmatic and reporting frameworks.

819 citations


Journal ArticleDOI
TL;DR: The 2019 report of The Lancet Countdown on health and climate change: ensuring that the health of a child born today is not defined by a changing climate.

794 citations


Posted Content
TL;DR: CheXpert is a large dataset of 224,316 chest radiographs of 65,240 patients, automatically labeled for 14 observations from radiology reports in a way that captures the uncertainties inherent in radiograph interpretation; different approaches to using the uncertainty labels are investigated for training convolutional neural networks that output the probability of these observations given the available frontal and lateral radiographs.
Abstract: Large, labeled datasets have driven deep learning methods to achieve expert-level performance on a variety of medical imaging tasks. We present CheXpert, a large dataset that contains 224,316 chest radiographs of 65,240 patients. We design a labeler to automatically detect the presence of 14 observations in radiology reports, capturing uncertainties inherent in radiograph interpretation. We investigate different approaches to using the uncertainty labels for training convolutional neural networks that output the probability of these observations given the available frontal and lateral radiographs. On a validation set of 200 chest radiographic studies which were manually annotated by 3 board-certified radiologists, we find that different uncertainty approaches are useful for different pathologies. We then evaluate our best model on a test set composed of 500 chest radiographic studies annotated by a consensus of 5 board-certified radiologists, and compare the performance of our model to that of 3 additional radiologists in the detection of 5 selected pathologies. On Cardiomegaly, Edema, and Pleural Effusion, the model ROC and PR curves lie above all 3 radiologist operating points. We release the dataset to the public as a standard benchmark to evaluate performance of chest radiograph interpretation models. The dataset is freely available at this https URL .

783 citations


Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi, Walter Alef, Keiichi Asada, and 394 more (78 institutions)
TL;DR: The Event Horizon Telescope (EHT) as mentioned in this paper is a very long baseline interferometry (VLBI) array that comprises millimeter and submillimeter-wavelength telescopes separated by distances comparable to the diameter of the Earth.
Abstract: The Event Horizon Telescope (EHT) is a very long baseline interferometry (VLBI) array that comprises millimeter- and submillimeter-wavelength telescopes separated by distances comparable to the diameter of the Earth. At a nominal operating wavelength of ~1.3 mm, EHT angular resolution (λ/D) is ~25 μas, which is sufficient to resolve nearby supermassive black hole candidates on spatial and temporal scales that correspond to their event horizons. With this capability, the EHT scientific goals are to probe general relativistic effects in the strong-field regime and to study accretion and relativistic jet formation near the black hole boundary. In this Letter we describe the system design of the EHT, detail the technology and instrumentation that enable observations, and provide measures of its performance. Meeting the EHT science objectives has required several key developments that have facilitated the robust extension of the VLBI technique to EHT observing wavelengths and the production of instrumentation that can be deployed on a heterogeneous array of existing telescopes and facilities. To meet sensitivity requirements, high-bandwidth digital systems were developed that process data at rates of 64 gigabit s^(−1), exceeding those of currently operating cm-wavelength VLBI arrays by more than an order of magnitude. Associated improvements include the development of phasing systems at array facilities, new receiver installation at several sites, and the deployment of hydrogen maser frequency standards to ensure coherent data capture across the array. These efforts led to the coordination and execution of the first Global EHT observations in 2017 April, and to event-horizon-scale imaging of the supermassive black hole candidate in M87.

756 citations


Journal ArticleDOI
TL;DR: Among patients with advanced heart failure, a fully magnetically levitated centrifugal‐flow left ventricular assist device was associated with less frequent need for pump replacement than an axial‐flow device and was superior with respect to survival free of disabling stroke or reoperation to replace or remove a malfunctioning device.
Abstract: Background In two interim analyses of this trial, patients with advanced heart failure who were treated with a fully magnetically levitated centrifugal-flow left ventricular assist device ...

Journal ArticleDOI
TL;DR: In certain subgroups, PFS was positively associated with PD-L1 expression (KRAS, EGFR) and with smoking status (BRAF, HER2); the lack of response in the ALK group was notable.

Journal ArticleDOI
TL;DR: This selection from the NCCN Guidelines for Esophageal and Esophagogastric Junction Cancers focuses on recommendations for the management of locally advanced and metastatic adenocarcinoma of the esophagus and EGJ.
Abstract: Esophageal cancer is the sixth leading cause of cancer-related deaths worldwide. Squamous cell carcinoma is the most common histology in Eastern Europe and Asia, and adenocarcinoma is most common in North America and Western Europe. Surgery is a major component of treatment of locally advanced resectable esophageal and esophagogastric junction (EGJ) cancer, and randomized trials have shown that the addition of preoperative chemoradiation or perioperative chemotherapy to surgery significantly improves survival. Targeted therapies including trastuzumab, ramucirumab, and pembrolizumab have produced encouraging results in the treatment of patients with advanced or metastatic disease. Multidisciplinary team management is essential for all patients with esophageal and EGJ cancers. This selection from the NCCN Guidelines for Esophageal and Esophagogastric Junction Cancers focuses on recommendations for the management of locally advanced and metastatic adenocarcinoma of the esophagus and EGJ.

Journal ArticleDOI
24 May 2019-Science
TL;DR: By a process of complete delignification and densification of wood, a structural material with a mechanical strength of 404.3 megapascals is developed, more than eight times that of natural wood; its cellulose nanofibers backscatter solar radiation and emit strongly in the mid-infrared, resulting in continuous subambient cooling during both day and night.
Abstract: Reducing human reliance on energy-inefficient cooling methods such as air conditioning would have a large impact on the global energy landscape. By a process of complete delignification and densification of wood, we developed a structural material with a mechanical strength of 404.3 megapascals, more than eight times that of natural wood. The cellulose nanofibers in our engineered material backscatter solar radiation and emit strongly in mid-infrared wavelengths, resulting in continuous subambient cooling during both day and night. We model the potential impact of our cooling wood and find energy savings between 20 and 60%, which is most pronounced in hot and dry climates.

Journal ArticleDOI
03 May 2019-Science
TL;DR: The addition of guanidinium thiocyanate (GuaSCN) resulted in marked improvements in the structural and optoelectronic properties of Sn-Pb mixed, low-band gap perovskite films, enabling the demonstration of >20% efficient low–band gap PSCs.
Abstract: All-perovskite-based polycrystalline thin-film tandem solar cells have the potential to deliver efficiencies of >30%. However, the performance of all-perovskite-based tandem devices has been limited by the lack of high-efficiency, low-band gap tin-lead (Sn-Pb) mixed-perovskite solar cells (PSCs). We found that the addition of guanidinium thiocyanate (GuaSCN) resulted in marked improvements in the structural and optoelectronic properties of Sn-Pb mixed, low-band gap (~1.25 electron volt) perovskite films. The films have defect densities that are lower by a factor of 10, leading to carrier lifetimes of greater than 1 microsecond and diffusion lengths of 2.5 micrometers. These improved properties enable our demonstration of >20% efficient low-band gap PSCs. When combined with wider-band gap PSCs, we achieve 25% efficient four-terminal and 23.1% efficient two-terminal all-perovskite-based polycrystalline thin-film tandem solar cells.
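The reported carrier lifetime and diffusion length can be sanity-checked against each other via L = √(Dτ) and the Einstein relation μ = qD/kT. A minimal sketch, assuming room temperature (300 K, a value not stated in the abstract):

```python
L = 2.5e-4     # diffusion length, cm (2.5 micrometers, from the abstract)
tau = 1e-6     # carrier lifetime, s (>1 microsecond, from the abstract)
kT_q = 0.02585 # thermal voltage kT/q at 300 K, volts

D = L**2 / tau           # diffusivity, cm^2/s, inverted from L = sqrt(D * tau)
mu = D / kT_q            # Einstein relation: mobility, cm^2 / (V s)
print(round(D, 4), round(mu, 1))
```

The implied mobility of roughly 2–3 cm²/(V·s) is in the range commonly reported for solution-processed perovskite films, so the two quoted numbers are mutually consistent.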

Posted ContentDOI
Daniel Taliun, Daniel N. Harris, Michael D. Kessler, Jedidiah Carlson, and 191 more (61 institutions)
06 Mar 2019-bioRxiv
TL;DR: The nearly complete catalog of genetic variation in TOPMed studies provides unique opportunities for exploring the contributions of rare and non-coding sequence variants to phenotypic variation as well as resources and early insights from the sequence data.
Abstract: Summary paragraph The Trans-Omics for Precision Medicine (TOPMed) program seeks to elucidate the genetic architecture and disease biology of heart, lung, blood, and sleep disorders, with the ultimate goal of improving diagnosis, treatment, and prevention. The initial phases of the program focus on whole genome sequencing of individuals with rich phenotypic data and diverse backgrounds. Here, we describe TOPMed goals and design as well as resources and early insights from the sequence data. The resources include a variant browser, a genotype imputation panel, and sharing of genomic and phenotypic data via dbGaP. In 53,581 TOPMed samples, >400 million single-nucleotide and insertion/deletion variants were detected by alignment with the reference genome. Additional novel variants are detectable through assembly of unmapped reads and customized analysis in highly variable loci. Among the >400 million variants detected, 97% have frequency

Journal ArticleDOI
TL;DR: The Community Land Model (CLM) is the land component of the Community Earth System Model (CESM) and is used in several global and regional modeling systems.
Abstract: The Community Land Model (CLM) is the land component of the Community Earth System Model (CESM) and is used in several global and regional modeling systems. In this paper, we introduce model developments included in CLM version 5 (CLM5), which is the default land component for CESM2. We assess an ensemble of simulations, including prescribed and prognostic vegetation state, multiple forcing data sets, and CLM4, CLM4.5, and CLM5, against a range of metrics including from the International Land Model Benchmarking (ILAMBv2) package. CLM5 includes new and updated processes and parameterizations: (1) dynamic land units, (2) updated parameterizations and structure for hydrology and snow (spatially explicit soil depth, dry surface layer, revised groundwater scheme, revised canopy interception and canopy snow processes, updated fresh snow density, simple firn model, and Model for Scale Adaptive River Transport), (3) plant hydraulics and hydraulic redistribution, (4) revised nitrogen cycling (flexible leaf stoichiometry, leaf N optimization for photosynthesis, and carbon costs for plant nitrogen uptake), (5) global crop model with six crop types and time‐evolving irrigated areas and fertilization rates, (6) updated urban building energy, (7) carbon isotopes, and (8) updated stomatal physiology. New optional features include demographically structured dynamic vegetation model (Functionally Assembled Terrestrial Ecosystem Simulator), ozone damage to plants, and fire trace gas emissions coupling to the atmosphere. Conclusive establishment of improvement or degradation of individual variables or metrics is challenged by forcing uncertainty, parametric uncertainty, and model structural complexity, but the multivariate metrics presented here suggest a general broad improvement from CLM4 to CLM5.

Journal ArticleDOI
TL;DR: This manuscript discusses guiding principles for the workup, staging, and treatment of early stage and locally advanced cervical cancer, as well as evidence for these recommendations.
Abstract: Cervical cancer is a malignant epithelial tumor that forms in the uterine cervix. Most cases of cervical cancer are preventable through human papilloma virus (HPV) vaccination, routine screening, and treatment of precancerous lesions. However, due to inadequate screening protocols in many regions of the world, cervical cancer remains the fourth-most common cancer in women globally. The complete NCCN Guidelines for Cervical Cancer provide recommendations for the diagnosis, evaluation, and treatment of cervical cancer. This manuscript discusses guiding principles for the workup, staging, and treatment of early stage and locally advanced cervical cancer, as well as evidence for these recommendations. For recommendations regarding treatment of recurrent or metastatic disease, please see the full guidelines on NCCN.org.

Journal ArticleDOI
TL;DR: The estimated US national MS prevalence for 2010 is the highest reported to date and provides evidence that the north-south gradient persists; the algorithm-based approach has the potential to be used for other chronic neurologic conditions.
Abstract: Objective To generate a national multiple sclerosis (MS) prevalence estimate for the United States by applying a validated algorithm to multiple administrative health claims (AHC) datasets. Methods A validated algorithm was applied to private, military, and public AHC datasets to identify adult cases of MS between 2008 and 2010. In each dataset, we determined the 3-year cumulative prevalence overall and stratified by age, sex, and census region. We applied insurance-specific and stratum-specific estimates to the 2010 US Census data and pooled the findings to calculate the 2010 prevalence of MS in the United States cumulated over 3 years. We also estimated the 2010 prevalence cumulated over 10 years using 2 models and extrapolated our estimate to 2017. Results The estimated 2010 prevalence of MS in the US adult population cumulated over 10 years was 309.2 per 100,000 (95% confidence interval [CI] 308.1–310.1), representing 727,344 cases. During the same time period, the MS prevalence was 450.1 per 100,000 (95% CI 448.1–451.6) for women and 159.7 (95% CI 158.7–160.6) for men (female:male ratio 2.8). The estimated 2010 prevalence of MS was highest in the 55- to 64-year age group. A US north-south decreasing prevalence gradient was identified. The estimated MS prevalence is also presented for 2017. Conclusion The estimated US national MS prevalence for 2010 is the highest reported to date and provides evidence that the north-south gradient persists. Our rigorous algorithm-based approach to estimating prevalence is efficient and has the potential to be used for other chronic neurologic conditions.
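The pooling step the Methods describe, applying stratum-specific prevalence estimates to census counts and summing, can be sketched generically. The strata and numbers below are hypothetical, purely to show the arithmetic, not the paper's actual data:

```python
# Hypothetical (sex, age-group) strata: (prevalence per 100,000, census count).
strata = {
    ("F", "18-34"): (150.0, 30_000_000),
    ("F", "35-54"): (520.0, 42_000_000),
    ("M", "18-34"): ( 60.0, 31_000_000),
    ("M", "35-54"): (190.0, 41_000_000),
}

# Expected cases per stratum = rate/100,000 * population; then pool.
total_cases = sum(rate / 1e5 * pop for rate, pop in strata.values())
total_pop = sum(pop for _, pop in strata.values())
overall = total_cases / total_pop * 1e5   # pooled prevalence per 100,000
print(round(total_cases), round(overall, 1))
```

The paper additionally stratifies by insurance type and census region and pools across datasets, but the per-stratum rate-times-population arithmetic is the same.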

Journal ArticleDOI
TL;DR: A severe test of the empirical prevalence of scale-free networks, using state-of-the-art statistical tools applied to nearly 1000 social, biological, technological, transportation, and information networks, finds robust evidence that strongly scale-free structure is empirically rare, while for most networks, log-normal distributions fit the data as well or better than power laws.
Abstract: Real-world networks are often claimed to be scale free, meaning that the fraction of nodes with degree k follows a power law k^(−α), a pattern with broad implications for the structure and dynamics of complex systems. However, the universality of scale-free networks remains controversial. Here, we organize different definitions of scale-free networks and construct a severe test of their empirical prevalence using state-of-the-art statistical tools applied to nearly 1000 social, biological, technological, transportation, and information networks. Across these networks, we find robust evidence that strongly scale-free structure is empirically rare, while for most networks, log-normal distributions fit the data as well or better than power laws. Furthermore, social networks are at best weakly scale free, while a handful of technological and biological networks appear strongly scale free. These findings highlight the structural diversity of real-world networks and the need for new theoretical explanations of these non-scale-free patterns. Real-world networks are often said to be "scale free", meaning their degree distribution follows a power law. Broido and Clauset perform statistical tests of this claim using a large and diverse corpus of real-world networks, showing that scale-free structure is far from universal.
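The model comparison at the heart of the paper, maximum-likelihood power-law fits tested against log-normal alternatives in the spirit of the Clauset–Shalizi–Newman methodology, can be sketched on synthetic data. This is an illustrative simplification (continuous MLE, truncation ignored in the log-normal fit), not the authors' pipeline:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic "degree-like" data drawn from a log-normal, truncated at xmin.
xmin = 1.0
x = rng.lognormal(mean=1.0, sigma=1.0, size=20000)
x = x[x >= xmin]
n = len(x)

# Continuous power-law MLE: alpha = 1 + n / sum(ln(x / xmin)),
# with log-likelihood n*ln((alpha-1)/xmin) - alpha * sum(ln(x / xmin)).
alpha = 1 + n / np.sum(np.log(x / xmin))
loglik_pl = n * np.log((alpha - 1) / xmin) - alpha * np.sum(np.log(x / xmin))

# Log-normal fit (loc fixed at 0; truncation ignored -- a simplification).
shape, loc, scale = stats.lognorm.fit(x, floc=0)
loglik_ln = np.sum(stats.lognorm.logpdf(x, shape, loc, scale))

print(loglik_ln > loglik_pl)   # log-normal fits this log-normal data better
```

On real networks the paper compares such likelihoods (via a likelihood-ratio test) per network, which is how it reaches the "log-normal fits as well or better" conclusion for most of the corpus.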

Journal ArticleDOI
TL;DR: In this article, the authors developed an accurate, physically interpretable, and one-dimensional tolerance factor, τ, that correctly predicts 92% of compounds as perovskite or nonperovskite for an experimental dataset of 576 ABX3 materials.
Abstract: Predicting the stability of the perovskite structure remains a long-standing challenge for the discovery of new functional materials for many applications including photovoltaics and electrocatalysts. We developed an accurate, physically interpretable, and one-dimensional tolerance factor, τ, that correctly predicts 92% of compounds as perovskite or nonperovskite for an experimental dataset of 576 ABX3 materials (X = O²⁻, F⁻, Cl⁻, Br⁻, I⁻) using a novel data analytics approach based on SISSO (sure independence screening and sparsifying operator). τ is shown to generalize outside the training set for 1034 experimentally realized single and double perovskites (91% accuracy) and is applied to identify 23,314 new double perovskites (A2BB′X6) ranked by their probability of being stable as perovskite. This work guides experimentalists and theorists toward which perovskites are most likely to be successfully synthesized and demonstrates an approach to descriptor identification that can be extended to arbitrary applications beyond perovskite stability predictions.
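For reference, the classic Goldschmidt tolerance factor and the paper's τ can both be computed from Shannon ionic radii. The τ formula below (with τ < 4.18 predicting perovskite) is reproduced from memory of Bartel et al. and should be checked against the original before relying on it; the SrTiO₃ radii are standard Shannon values:

```python
import math

def goldschmidt_t(rA, rB, rX):
    """Classic Goldschmidt tolerance factor; values near ~0.8-1.0
    have traditionally suggested a stable perovskite."""
    return (rA + rX) / (math.sqrt(2) * (rB + rX))

def tau(rA, rB, rX, nA):
    """One-dimensional tolerance factor of Bartel et al. (2019);
    tau < 4.18 predicts the perovskite structure. nA is the A-site
    oxidation state. Formula reproduced from memory -- verify
    against the original paper."""
    return rX / rB - nA * (nA - (rA / rB) / math.log(rA / rB))

# SrTiO3 with Shannon radii (angstroms): Sr2+ (XII-coord), Ti4+ (VI), O2-
rA, rB, rX = 1.44, 0.605, 1.40
print(round(goldschmidt_t(rA, rB, rX), 2), tau(rA, rB, rX, nA=2) < 4.18)
```

Both descriptors classify SrTiO₃, the textbook cubic perovskite, as perovskite; the paper's point is that τ does so far more reliably than t across the 576-compound dataset.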

Journal ArticleDOI
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, and 1491 more (239 institutions)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

Journal ArticleDOI
TL;DR: The programmatic developments and institutional context for the Landsat program and the unique ability of Landsat to meet the needs of national and international programs are described, and the key trends in Landsat science are presented.

Journal ArticleDOI
TL;DR: The NCCN Guidelines for Non-Small Cell Lung Cancer (NSCLC) as discussed by the authors address all aspects of management for NSCLC, focusing on recent updates in immunotherapy.
Abstract: The NCCN Guidelines for Non-Small Cell Lung Cancer (NSCLC) address all aspects of management for NSCLC. These NCCN Guidelines Insights focus on recent updates in immunotherapy. For the 2020 update, all of the systemic therapy regimens have been categorized using a new preference stratification system; certain regimens are now recommended as "preferred interventions," whereas others are categorized as either "other recommended interventions" or "useful under certain circumstances."

Journal ArticleDOI
TL;DR: An optical atomic clock based on quantum-logic spectroscopy of the ¹S₀ ↔ ³P₀ transition in ²⁷Al⁺ is described, with a systematic uncertainty of 9.4×10⁻¹⁹ and a frequency stability of 1.2×10⁻¹⁵/√τ.
Abstract: We describe an optical atomic clock based on quantum-logic spectroscopy of the ¹S₀ ↔ ³P₀ transition in ²⁷Al⁺ with a systematic uncertainty of 9.4×10⁻¹⁹ and a frequency stability of 1.2×10⁻¹⁵/√τ. A ²⁵Mg⁺ ion is simultaneously trapped with the ²⁷Al⁺ ion and used for sympathetic cooling and state readout. Improvements in a new trap have led to reduced secular motion heating, compared to previous ²⁷Al⁺ clocks, enabling clock operation with ion secular motion near the three-dimensional ground state. Operating the clock with a lower trap drive frequency has reduced excess micromotion compared to previous ²⁷Al⁺ clocks. Both of these improvements have led to a reduced time-dilation shift uncertainty. Other systematic uncertainties including those due to blackbody radiation and the second-order Zeeman effect have also been reduced.
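A useful corollary of the two quoted numbers: if the instability averages down as white frequency noise, σ(τ) = 1.2×10⁻¹⁵/√τ, the averaging time needed for the statistical uncertainty to reach the 9.4×10⁻¹⁹ systematic floor follows directly:

```python
stability = 1.2e-15   # fractional frequency instability at 1 s (from the abstract)
target = 9.4e-19      # systematic uncertainty floor (from the abstract)

# White-frequency-noise averaging: sigma(tau) = stability / sqrt(tau),
# so tau = (stability / target)^2 seconds.
tau_seconds = (stability / target) ** 2
print(round(tau_seconds / 86400, 1))   # days of averaging to reach the floor
```

That is, roughly 19 days of continuous averaging, which is why clock instability matters as much as the systematic budget for practical comparisons.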

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the effect of anthropogenic climate change on wildfire in western North America and especially in California and found that the response of summer forest fire area to atmospheric vapor pressure deficit (VPD) is exponential, meaning that warming has grown increasingly impactful.
Abstract: Recent fire seasons have fueled intense speculation regarding the effect of anthropogenic climate change on wildfire in western North America and especially in California. During 1972–2018, California experienced a fivefold increase in annual burned area, mainly due to more than an eightfold increase in summer forest‐fire extent. Increased summer forest‐fire area very likely occurred due to increased atmospheric aridity caused by warming. Since the early 1970s, warm‐season days warmed by approximately 1.4 °C as part of a centennial warming trend, significantly increasing the atmospheric vapor pressure deficit (VPD). These trends are consistent with anthropogenic trends simulated by climate models. The response of summer forest‐fire area to VPD is exponential, meaning that warming has grown increasingly impactful. Robust interannual relationships between VPD and summer forest‐fire area strongly suggest that nearly all of the increase in summer forest‐fire area during 1972–2018 was driven by increased VPD. Climate change effects on summer wildfire were less evident in nonforested lands. In fall, wind events and delayed onset of winter precipitation are the dominant promoters of wildfire. While these variables did not change much over the past century, background warming and consequent fuel drying is increasingly enhancing the potential for large fall wildfires. Among the many processes important to California's diverse fire regimes, warming‐driven fuel drying is the clearest link between anthropogenic climate change and increased California wildfire activity to date. Plain Language Summary: Since the early 1970s, California's annual wildfire extent increased fivefold, punctuated by extremely large and destructive wildfires in 2017 and 2018. This trend was mainly due to an eightfold increase in summertime forest‐fire area and was very likely driven by drying of fuels promoted by human‐induced warming.
Warming effects were also apparent in the fall by enhancing the odds that fuels are dry when strong fall wind events occur. The ability of dry fuels to promote large fires is nonlinear, which has allowed warming to become increasingly impactful. Human‐caused warming has already significantly enhanced wildfire activity in California, particularly in the forests of the Sierra Nevada and North Coast, and will likely continue to do so in the coming decades.
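The abstract's point that the fire-area response to VPD is exponential has a concrete consequence: equal increments of VPD multiply burned area by the same factor, so each further degree of warming adds more absolute burned area than the last. A minimal sketch of that nonlinearity, where the sensitivity constant k is hypothetical and chosen only for illustration (it is not a value from the paper):

```python
import math

# Illustrative exponential response: burned area proportional to exp(k * VPD).
# k is a hypothetical sensitivity (per kPa of VPD), NOT a fitted value.
K = 2.0

def area_multiplier(delta_vpd_kpa: float) -> float:
    """Multiplicative change in burned area for a VPD increase of delta_vpd_kpa."""
    return math.exp(K * delta_vpd_kpa)

# Each successive +0.2 kPa increment multiplies area by the same factor,
# so warming becomes increasingly impactful in absolute burned area.
first_step = area_multiplier(0.2)                       # factor for +0.2 kPa
second_step = area_multiplier(0.4) / area_multiplier(0.2)  # next +0.2 kPa
print(first_step, second_step)  # the two factors are identical
```

The constant multiplicative factor per increment is exactly what "warming has grown increasingly impactful" means in absolute terms: the same warming-driven VPD change acts on an ever-larger baseline.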

Journal ArticleDOI
TL;DR: These NCCN Guidelines Insights summarize the 2020 updates to the NCCN Guidelines for Kidney Cancer, focusing on initial management and first-line systemic therapy options for patients with advanced clear cell renal cell carcinoma.
Abstract: The NCCN Guidelines for Kidney Cancer provide multidisciplinary recommendations for the clinical management of patients with clear cell and non-clear cell renal cell carcinoma, and are intended to assist with clinical decision-making. These NCCN Guidelines Insights summarize the NCCN Kidney Cancer Panel discussions for the 2020 update to the guidelines regarding initial management and first-line systemic therapy options for patients with advanced clear cell renal cell carcinoma.

Journal ArticleDOI
Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam1, Federico Ambrogi1  +2265 moreInstitutions (153)
TL;DR: Combined measurements of the production and decay rates of the Higgs boson, as well as its couplings to vector bosons and fermions, are presented and constraints are placed on various two Higgs doublet models.
Abstract: Combined measurements of the production and decay rates of the Higgs boson, as well as its couplings to vector bosons and fermions, are presented. The analysis uses the LHC proton–proton collision data set recorded with the CMS detector in 2016 at $\sqrt{s}=13\,\text{TeV}$, corresponding to an integrated luminosity of $35.9\,\text{fb}^{-1}$. The combination is based on analyses targeting the five main Higgs boson production mechanisms (gluon fusion, vector boson fusion, and associated production with a $\mathrm{W}$ or $\mathrm{Z}$ boson, or a top quark–antiquark pair) and the following decay modes: $\mathrm{H}\rightarrow\gamma\gamma$, $\mathrm{Z}\mathrm{Z}$, $\mathrm{W}\mathrm{W}$, $\tau\tau$, $\mathrm{b}\mathrm{b}$, and $\mu\mu$. Searches for invisible Higgs boson decays are also considered. The best-fit ratio of the signal yield to the standard model expectation is measured to be $\mu=1.17\pm0.10$, assuming a Higgs boson mass of $125.09\,\text{GeV}$. Additional results are given for various assumptions on the scaling behavior of the production and decay modes, including generic parametrizations based on ratios of cross sections and branching fractions or couplings. The results are compatible with the standard model predictions in all parametrizations considered. In addition, constraints are placed on various two Higgs doublet models.
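The quoted compatibility with the standard model can be sanity-checked directly from the numbers in the abstract: the best-fit signal strength μ = 1.17 ± 0.10 sits 1.7 standard deviations from the SM expectation μ = 1. A minimal sketch of that pull calculation (illustrative arithmetic only, not the collaboration's statistical treatment):

```python
# Pull of the measured Higgs signal strength from the SM expectation,
# using the central value and uncertainty quoted in the abstract.
mu, sigma = 1.17, 0.10   # best-fit signal strength and its uncertainty
mu_sm = 1.0              # standard model expectation

pull = (mu - mu_sm) / sigma
print(f"{pull:.1f} sigma")  # prints "1.7 sigma"
```

A 1.7σ deviation is well within the range expected from statistical fluctuation, consistent with the abstract's conclusion that the results are compatible with SM predictions.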

Journal ArticleDOI
TL;DR: In this article, key observational indicators of climate change in the Arctic, most spanning a 47-year period (1971–2017), demonstrate fundamental changes among nine key elements of the Arctic system.
Abstract: Key observational indicators of climate change in the Arctic, most spanning a 47 year period (1971–2017) demonstrate fundamental changes among nine key elements of the Arctic system. We find that, coherent with increasing air temperature, there is an intensification of the hydrological cycle, evident from increases in humidity, precipitation, river discharge, glacier equilibrium line altitude and land ice wastage. Downward trends continue in sea ice thickness (and extent) and spring snow cover extent and duration, while near-surface permafrost continues to warm. Several of the climate indicators exhibit a significant statistical correlation with air temperature or precipitation, reinforcing the notion that increasing air temperatures and precipitation are drivers of major changes in various components of the Arctic system. To progress beyond a presentation of the Arctic physical climate changes, we find a correspondence between air temperature and biophysical indicators such as tundra biomass and identify numerous biophysical disruptions with cascading effects throughout the trophic levels. These include: increased delivery of organic matter and nutrients to Arctic near‐coastal zones; condensed flowering and pollination plant species periods; timing mismatch between plant flowering and pollinators; increased plant vulnerability to insect disturbance; increased shrub biomass; increased ignition of wildfires; increased growing season CO2 uptake, with counterbalancing increases in shoulder season and winter CO2 emissions; increased carbon cycling, regulated by local hydrology and permafrost thaw; conversion between terrestrial and aquatic ecosystems; and shifting animal distribution and demographics. The Arctic biophysical system is now clearly trending away from its 20th Century state and into an unprecedented state, with implications not only within but beyond the Arctic. The indicator time series of this study are freely downloadable at AMAP.no.

Journal ArticleDOI
TL;DR: Teplizumab delayed progression to clinical type 1 diabetes in high-risk participants; among the participants who were HLA-DR3-negative, HLA-DR4-positive, or anti-zinc transporter 8 antibody-negative, fewer participants in the teplizumab group than in the placebo group had diabetes diagnosed.
Abstract: Background Type 1 diabetes is a chronic autoimmune disease that leads to destruction of insulin-producing beta cells and dependence on exogenous insulin for survival. Some interventions ha...