
Showing papers by "Queen's University" published in 2019


Journal ArticleDOI
TL;DR: Efforts to reverse global trends in freshwater degradation now depend on bridging an immense gap between the aspirations of conservation biologists and the accelerating rate of species endangerment.
Abstract: In the 12 years since Dudgeon et al. (2006) reviewed major pressures on freshwater ecosystems, the biodiversity crisis in the world’s lakes, reservoirs, rivers, streams and wetlands has deepened. While lakes, reservoirs and rivers cover only 2.3% of the Earth’s surface, these ecosystems host at least 9.5% of the Earth’s described animal species. Furthermore, using the World Wide Fund for Nature’s Living Planet Index, freshwater population declines (83% between 1970 and 2014) continue to outpace contemporaneous declines in marine or terrestrial systems. The Anthropocene has brought multiple new and varied threats that disproportionately impact freshwater systems. We document 12 emerging threats to freshwater biodiversity that are either entirely new since 2006 or have since intensified: (i) changing climates; (ii) e-commerce and invasions; (iii) infectious diseases; (iv) harmful algal blooms; (v) expanding hydropower; (vi) emerging contaminants; (vii) engineered nanomaterials; (viii) microplastic pollution; (ix) light and noise; (x) freshwater salinisation; (xi) declining calcium; and (xii) cumulative stressors. Effects are evidenced for amphibians, fishes, invertebrates, microbes, plants, turtles and waterbirds, with potential for ecosystem-level changes through bottom-up and top-down processes. In our highly uncertain future, the net effects of these threats raise serious concerns for freshwater ecosystems. However, we also highlight opportunities for conservation gains as a result of novel management tools (e.g. environmental flows, environmental DNA) and specific conservation-oriented actions (e.g. dam removal, habitat protection policies, managed relocation of species) that have been met with varying levels of success. Moving forward, we advocate hybrid approaches that manage fresh waters as crucial ecosystems for human life support as well as essential hotspots of biodiversity and ecological function.

1,230 citations


Journal ArticleDOI
TL;DR: With prolonged follow-up, first-line pembrolizumab monotherapy continues to demonstrate an OS benefit over chemotherapy in patients with previously untreated, advanced NSCLC without EGFR/ALK aberrations, despite crossover from the control arm to pembrolizumab as subsequent therapy.
Abstract: PurposeIn the randomized, open-label, phase III KEYNOTE-024 study, pembrolizumab significantly improved progression-free survival and overall survival (OS) compared with platinum-based chemotherapy in patients with previously untreated advanced non–small-cell lung cancer (NSCLC) with a programmed death ligand 1 tumor proportion score of 50% or greater and without EGFR/ALK aberrations. We report an updated OS and tolerability analysis, including analyses adjusting for potential bias introduced by crossover from chemotherapy to pembrolizumab.Patients and MethodsPatients were randomly assigned to pembrolizumab 200 mg every 3 weeks (for up to 2 years) or investigator’s choice of platinum-based chemotherapy (four to six cycles). Patients assigned to chemotherapy could cross over to pembrolizumab upon meeting eligibility criteria. The primary end point was progression-free survival; OS was an important key secondary end point. Crossover adjustment analysis was done using the following three methods: simplified ...

988 citations


Journal ArticleDOI
TL;DR: Enzalutamide was associated with significantly longer progression-free and overall survival than standard care in men with metastatic, hormone-sensitive prostate cancer receiving testosterone suppression.
Abstract: Background Enzalutamide, an androgen-receptor inhibitor, has been associated with improved overall survival in men with castration-resistant prostate cancer. It is not known whether adding enzalutamide to testosterone suppression, with or without early docetaxel, will improve survival in men with metastatic, hormone-sensitive prostate cancer. Methods In this open-label, randomized, phase 3 trial, we assigned patients to receive testosterone suppression plus either open-label enzalutamide or a standard nonsteroidal antiandrogen therapy (standard-care group). The primary end point was overall survival. Secondary end points included progression-free survival as determined by the prostate-specific antigen (PSA) level, clinical progression-free survival, and adverse events. Results A total of 1125 men underwent randomization; the median follow-up was 34 months. There were 102 deaths in the enzalutamide group and 143 deaths in the standard-care group (hazard ratio, 0.67; 95% confidence interval [CI], 0.52 to 0.86; P = 0.002). Kaplan-Meier estimates of overall survival at 3 years were 80% (based on 94 events) in the enzalutamide group and 72% (based on 130 events) in the standard-care group. Better results with enzalutamide were also seen in PSA progression-free survival (174 and 333 events, respectively; hazard ratio, 0.39; P Conclusions Enzalutamide was associated with significantly longer progression-free and overall survival than standard care in men with metastatic, hormone-sensitive prostate cancer receiving testosterone suppression. The enzalutamide group had a higher incidence of seizures and other toxic effects, especially among those treated with early docetaxel. (Funded by Astellas Scientific and Medical Affairs and others; ENZAMET (ANZUP 1304) ANZCTR number, ACTRN12614000110684; ClinicalTrials.gov number, NCT02446405; and EU Clinical Trials Register number, 2014-003190-42.).

865 citations


Journal ArticleDOI
Nasim Mavaddat1, Kyriaki Michailidou1, Kyriaki Michailidou2, Joe Dennis1, +307 more (105 institutions)
TL;DR: The authors develop and empirically validate a PRS, optimized for prediction of estrogen receptor (ER)-specific disease, from the largest available genome-wide association dataset; the resulting PRS is a powerful and reliable predictor of breast cancer risk that may improve breast cancer prevention programs.
Abstract: Stratification of women according to their risk of breast cancer based on polygenic risk scores (PRSs) could improve screening and prevention strategies. Our aim was to develop PRSs, optimized for prediction of estrogen receptor (ER)-specific disease, from the largest available genome-wide association dataset and to empirically validate the PRSs in prospective studies. The development dataset comprised 94,075 case subjects and 75,017 control subjects of European ancestry from 69 studies, divided into training and validation sets. Samples were genotyped using genome-wide arrays, and single-nucleotide polymorphisms (SNPs) were selected by stepwise regression or lasso penalized regression. The best performing PRSs were validated in an independent test set comprising 11,428 case subjects and 18,323 control subjects from 10 prospective studies and 190,040 women from UK Biobank (3,215 incident breast cancers). For the best PRSs (313 SNPs), the odds ratio for overall disease per 1 standard deviation in ten prospective studies was 1.61 (95%CI: 1.57-1.65) with area under receiver-operator curve (AUC) = 0.630 (95%CI: 0.628-0.651). The lifetime risk of overall breast cancer in the top centile of the PRSs was 32.6%. Compared with women in the middle quintile, those in the highest 1% of risk had 4.37- and 2.78-fold risks, and those in the lowest 1% of risk had 0.16- and 0.27-fold risks, of developing ER-positive and ER-negative disease, respectively. Goodness-of-fit tests indicated that this PRS was well calibrated and predicts disease risk accurately in the tails of the distribution. This PRS is a powerful and reliable predictor of breast cancer risk that may improve breast cancer prevention programs.
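As a rough illustration of what a PRS is mechanically, the sketch below computes a weighted sum of allele dosages and standardizes it to SD units; the genotypes and weights are simulated placeholders, not the actual 313-SNP score, and the odds ratio of 1.61 per standard deviation is the only number taken from the abstract:

```python
import numpy as np

# Hypothetical sketch of a polygenic risk score (PRS): a weighted sum of
# risk-allele dosages (0, 1, or 2 per SNP), with per-SNP log-odds weights
# from a GWAS. The dosages and weights here are simulated, purely for
# illustration; they are not the paper's 313-SNP score.
rng = np.random.default_rng(0)

n_people, n_snps = 1000, 313
dosages = rng.integers(0, 3, size=(n_people, n_snps))  # genotype dosages per SNP
weights = rng.normal(0, 0.05, size=n_snps)             # per-SNP log-odds weights

prs = dosages @ weights                                # raw score per person
prs_std = (prs - prs.mean()) / prs.std()               # standardize to SD units

# With an odds ratio of 1.61 per standard deviation (as reported in the
# abstract), a woman 2 SD above the mean has her odds of disease scaled by:
or_per_sd = 1.61
relative_odds = or_per_sd ** 2.0
print(f"odds multiplier at +2 SD: {relative_odds:.2f}")  # 1.61^2 ≈ 2.59
```

This is what underlies the tail-risk figures in the abstract: risk in the top and bottom centiles follows from exponentiating the per-SD log-odds at the extremes of the standardized score distribution.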

653 citations


Journal ArticleDOI
29 Apr 2019-eLife
TL;DR: The goal is to facilitate a more accurate use of the stop-signal task and provide user-friendly open-source resources intended to inform statistical-power considerations, facilitate the correct implementation of the task, and assist in proper data analysis.
Abstract: Response inhibition is essential for navigating everyday life. Its derailment is considered integral to numerous neurological and psychiatric disorders, and more generally, to a wide range of behavioral and health problems. Response-inhibition efficiency furthermore correlates with treatment outcome in some of these conditions. The stop-signal task is an essential tool to determine how quickly response inhibition is implemented. Despite its apparent simplicity, there are many features (ranging from task design to data analysis) that vary across studies in ways that can easily compromise the validity of the obtained results. Our goal is to facilitate a more accurate use of the stop-signal task. To this end, we provide 12 easy-to-implement consensus recommendations and point out the problems that can arise when they are not followed. Furthermore, we provide user-friendly open-source resources intended to inform statistical-power considerations, facilitate the correct implementation of the task, and assist in proper data analysis.
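For readers unfamiliar with the task's core estimate, a minimal sketch of one standard estimation approach, the integration method for the stop-signal reaction time (SSRT), might look like the following; the simulated data are illustrative only, and the paper's own open-source resources should be preferred for real analyses:

```python
import numpy as np

# Minimal sketch of the integration method for estimating stop-signal
# reaction time (SSRT), one approach discussed in the consensus
# recommendations. All data below are simulated for illustration.
rng = np.random.default_rng(1)

go_rt = rng.normal(500, 80, size=200)        # go-trial reaction times (ms)
ssd = np.full(60, 250.0)                     # stop-signal delays (ms)
responded_on_stop = rng.random(60) < 0.5     # did a response escape inhibition?

p_respond = responded_on_stop.mean()         # p(respond | stop signal)
# The go RT at the quantile equal to p(respond | signal), minus mean SSD:
nth_rt = np.quantile(np.sort(go_rt), p_respond)
ssrt = nth_rt - ssd.mean()
print(f"p(respond|signal) = {p_respond:.2f}, SSRT = {ssrt:.0f} ms")
```

In practice the recommendations cover additional details (e.g. handling go omissions and choice errors) that this sketch omits.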

617 citations


Journal ArticleDOI
TL;DR: This article summarises the evidence for visceral adiposity and ectopic fat as emerging risk factors for type 2 diabetes, atherosclerosis, and cardiovascular disease, with a focus on practical recommendations for health professionals and future directions for research and clinical practice.

553 citations


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4, +1491 more (239 institutions)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

526 citations


Journal ArticleDOI
01 Apr 2019-Nature
TL;DR: The authors used an extrapolation of glaciological and geodetic observations to show that glaciers contributed 27 ± 22 millimetres to global mean sea-level rise from 1961 to 2016.
Abstract: Glaciers distinct from the Greenland and Antarctic ice sheets cover an area of approximately 706,000 square kilometres globally1, with an estimated total volume of 170,000 cubic kilometres, or 0.4 metres of potential sea-level-rise equivalent2. Retreating and thinning glaciers are icons of climate change3 and affect regional runoff4 as well as global sea level5,6. In past reports from the Intergovernmental Panel on Climate Change, estimates of changes in glacier mass were based on the multiplication of averaged or interpolated results from available observations of a few hundred glaciers by defined regional glacier areas7–10. For data-scarce regions, these results had to be complemented with estimates based on satellite altimetry and gravimetry11. These past approaches were challenged by the small number and heterogeneous spatiotemporal distribution of in situ measurement series and their often unknown ability to represent their respective mountain ranges, as well as by the spatial limitations of satellite altimetry (for which only point data are available) and gravimetry (with its coarse resolution). Here we use an extrapolation of glaciological and geodetic observations to show that glaciers contributed 27 ± 22 millimetres to global mean sea-level rise from 1961 to 2016. Regional specific-mass-change rates for 2006–2016 range from −0.1 metres to −1.2 metres of water equivalent per year, resulting in a global sea-level contribution of 335 ± 144 gigatonnes, or 0.92 ± 0.39 millimetres, per year. Although statistical uncertainty ranges overlap, our conclusions suggest that glacier mass loss may be larger than previously reported11. The present glacier mass loss is equivalent to the sea-level contribution of the Greenland Ice Sheet12, clearly exceeds the loss from the Antarctic Ice Sheet13, and accounts for 25 to 30 per cent of the total observed sea-level rise14. 
Present mass-loss rates indicate that glaciers could almost disappear in some mountain ranges in this century, while heavily glacierized regions will continue to contribute to sea-level rise beyond 2100. The largest collection so far of glaciological and geodetic observations suggests that glaciers contributed about 27 millimetres to sea-level rise from 1961 to 2016, at rates of ice loss that could see the disappearance of many glaciers this century.
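As a quick consistency check on the numbers above, the gigatonne-to-millimetre conversion can be reproduced with a one-line calculation, assuming the commonly used ocean surface area of about 3.618 × 10^8 km² (the paper's exact constant may differ slightly):

```python
# Sanity-check the abstract's unit conversion: 335 Gt of ice loss per year
# versus 0.92 mm of sea-level rise per year. Assumes an ocean surface area
# of ~3.618e8 km^2, a commonly used constant.
OCEAN_AREA_KM2 = 3.618e8

def gt_per_year_to_mm_slr(gigatonnes):
    """1 Gt of water is ~1 km^3; depth spread over the ocean, km -> mm."""
    return gigatonnes / OCEAN_AREA_KM2 * 1e6

print(f"{gt_per_year_to_mm_slr(335):.2f} mm/yr")  # ~0.93 mm/yr, consistent with 0.92 ± 0.39
```

A 1 mm rise in global mean sea level thus corresponds to roughly 362 Gt of water, which is why 335 ± 144 Gt per year maps onto 0.92 ± 0.39 mm per year.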

486 citations




Journal ArticleDOI
TL;DR: The wild bootstrap was originally developed for regression models with heteroskedasticity of unknown form and has been extended to models estimated by instrumental variables over the past 30 years as discussed by the authors.
Abstract: The wild bootstrap was originally developed for regression models with heteroskedasticity of unknown form. Over the past 30 years, it has been extended to models estimated by instrumental variables...
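As a hedged illustration of the resampling scheme the article surveys, here is a minimal wild bootstrap for the slope of an OLS regression with heteroskedasticity of unknown form, using Rademacher weights; the data-generating process and settings are invented for the sketch and are not from the article:

```python
import numpy as np

# Minimal wild bootstrap for an OLS slope under heteroskedasticity of
# unknown form. Residuals are multiplied by random Rademacher weights
# (+1/-1) to generate bootstrap samples that preserve the error-variance
# pattern at each observation.
rng = np.random.default_rng(42)

n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + np.abs(x) * rng.normal(size=n)  # heteroskedastic errors

X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

B = 999
boot_slopes = np.empty(B)
for b in range(B):
    v = rng.choice([-1.0, 1.0], size=n)             # Rademacher weights
    y_star = X @ beta_hat + resid * v               # wild bootstrap sample
    boot_slopes[b] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]

se = boot_slopes.std(ddof=1)
print(f"OLS slope = {beta_hat[1]:.3f}, wild-bootstrap SE = {se:.3f}")
```

Extensions to instrumental-variables estimation, the article's subject, modify how the bootstrap samples are generated (e.g. restricted residuals under the null), but the weighting idea is the same.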

436 citations


Journal ArticleDOI
TL;DR: FNDC5/irisin is placed as a novel agent capable of opposing synapse failure and memory impairment in AD, and restoration of its expression can ameliorate these phenotypes in rodent models.
Abstract: Defective brain hormonal signaling has been associated with Alzheimer's disease (AD), a disorder characterized by synapse and memory failure. Irisin is an exercise-induced myokine released on cleavage of the membrane-bound precursor protein fibronectin type III domain-containing protein 5 (FNDC5), also expressed in the hippocampus. Here we show that FNDC5/irisin levels are reduced in AD hippocampi and cerebrospinal fluid, and in experimental AD models. Knockdown of brain FNDC5/irisin impairs long-term potentiation and novel object recognition memory in mice. Conversely, boosting brain levels of FNDC5/irisin rescues synaptic plasticity and memory in AD mouse models. Peripheral overexpression of FNDC5/irisin rescues memory impairment, whereas blockade of either peripheral or brain FNDC5/irisin attenuates the neuroprotective actions of physical exercise on synaptic plasticity and memory in AD mice. By showing that FNDC5/irisin is an important mediator of the beneficial effects of exercise in AD models, our findings place FNDC5/irisin as a novel agent capable of opposing synapse failure and memory impairment in AD.

Journal ArticleDOI
TL;DR: Editors of respiratory, sleep, and critical care journals publish guidance for authors on the control of confounding and the reporting of results in causal inference studies.
Abstract: Control of Confounding and Reporting of Results in Causal Inference Studies: Guidance for Authors from Editors of Respiratory, Sleep, and Critical Care Journals. David J. Lederer, Scott C. Bell, Richard D. Branson, James D. Chalmers, Rachel Marshall, David M. Maslove, David E. Ost, Naresh M. Punjabi, Michael Schatz, Alan R. Smyth, Paul W. Stewart, Samy Suissa, and co-authors, representing the editors of respiratory, sleep, and critical care journals. (Full author list and affiliations omitted.)

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4, +1496 more (238 institutions)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts do require the extension of the Standard Model and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem, and, more in general, the dynamical origin of the Higgs mechanism, do likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity of at least a factor of 5 larger than the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will reach several tens of TeV, and allow, for example, to produce new particles whose existence could be indirectly exposed by precision measurements during the earlier preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century. 
Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. Now, a coordinated preparation effort can be based on a core of an ever-growing consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought, through a focused R&D programme, to the technology readiness level required to begin construction within the coming ten years. The FCC-hh concept comprises in the baseline scenario a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC, an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture, a high-reliability and low loss cryogen distribution infrastructure based on Invar, high-power distributed beam transfer using superconducting elements and local magnet energy recovery and re-use technologies that are already gradually introduced at other CERN accelerators.
On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve for a concurrently running physics programme, is an essential lever to come to an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4, +1501 more (239 institutions)
TL;DR: In this article, the authors review the physics opportunities of the Future Circular Collider (FCC), covering its e+e-, pp, ep and heavy-ion programmes, and describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.

Journal ArticleDOI
TL;DR: In this article, the authors employed an equilibrium-state model for the temperature at the top of the permafrost (TTOP model) for the 2000-2016 period, driven by remotely sensed land-surface temperatures, downscaled ERA-Interim climate reanalysis data, tundra wetness classes, and a landcover map from the ESA Landcover Climate Change Initiative (CCI).

Journal ArticleDOI
TL;DR: The task force of the International Conference of Frailty and Sarcopenia Research developed these clinical practice guidelines to overview the current evidence-base and provide recommendations for the identification and management of frailty in older adults using the GRADE approach.
Abstract: Objective: The task force of the International Conference of Frailty and Sarcopenia Research (ICFSR) developed these clinical practice guidelines to overview the current evidence-base and to provide recommendations for the identification and management of frailty in older adults. Methods: These recommendations were formed using the GRADE approach, which ranked the strength and certainty (quality) of the supporting evidence behind each recommendation. Where the evidence-base was limited or of low quality, Consensus Based Recommendations (CBRs) were formulated. The recommendations focus on the clinical and practical aspects of care for older people with frailty, and promote person-centred care. Recommendations for Screening and Assessment: The task force recommends that health practitioners case identify/screen all older adults for frailty using a validated instrument suitable for the specific setting or context (strong recommendation). Ideally, the screening instrument should exclude disability as part of the screening process. For individuals screened as positive for frailty, a more comprehensive clinical assessment should be performed to identify signs and underlying mechanisms of frailty (strong recommendation). Recommendations for Management: A comprehensive care plan for frailty should address polypharmacy (whether rational or nonrational), the management of sarcopenia, the treatable causes of weight loss, and the causes of exhaustion (depression, anaemia, hypotension, hypothyroidism, and B12 deficiency) (strong recommendation). All persons with frailty should receive social support as needed to address unmet needs and encourage adherence to a comprehensive care plan (strong recommendation). First-line therapy for the management of frailty should include a multi-component physical activity programme with a resistance-based training component (strong recommendation). 
Protein/caloric supplementation is recommended when weight loss or undernutrition are present (conditional recommendation). No recommendation was given for systematic additional therapies such as cognitive therapy, problem-solving therapy, vitamin D supplementation, and hormone-based treatment. Pharmacological treatment as presently available is not recommended therapy for the treatment of frailty.

Journal ArticleDOI
TL;DR: In this article, the MRgFUS with intravenously injected microbubbles can temporarily and repeatedly disrupt the blood-brain barrier (BBB) in a targeted fashion, without open surgery.
Abstract: The blood-brain barrier (BBB) has long limited therapeutic access to brain tumor and peritumoral tissue. In animals, MR-guided focused ultrasound (MRgFUS) with intravenously injected microbubbles can temporarily and repeatedly disrupt the BBB in a targeted fashion, without open surgery. Our objective was to demonstrate the safety and feasibility of MRgFUS BBB opening with systemically administered chemotherapy in patients with glioma in a phase I, single-arm, open-label study. Five patients with previously confirmed or suspected high-grade glioma based on imaging underwent MRgFUS in conjunction with administration of chemotherapy (n = 1 liposomal doxorubicin, n = 4 temozolomide) one day prior to their scheduled surgical resection. Samples of “sonicated” and “unsonicated” tissue were measured for the chemotherapy by liquid-chromatography-mass spectrometry. Complete follow-up was three months. The procedure was well-tolerated, with no adverse clinical or radiologic events related to the procedure. The BBB within the target volume showed radiographic evidence of opening, with an immediate 15–50% increase in contrast enhancement on T1-weighted MRI that resolved approximately 20 hours later. Biochemical analysis of sonicated versus unsonicated tissue suggests that chemotherapy delivery is feasible. In this study, we demonstrated that transient BBB opening in tumor and peritumoral tissue using non-invasive low-intensity MRgFUS with systemically administered chemotherapy was safe and feasible. The characterization of therapeutic delivery and clinical response to this treatment paradigm requires further investigation.

Journal ArticleDOI
TL;DR: This review provides an in-depth look at recent advances in the use of NHCs for the development of functional materials.
Abstract: N-Heterocyclic carbenes (NHCs) have become one of the most widely studied class of ligands in molecular chemistry and have found applications in fields as varied as catalysis, the stabilization of reactive molecular fragments, and biochemistry. More recently, NHCs have found applications in materials chemistry and have allowed for the functionalization of surfaces, polymers, nanoparticles, and discrete, well-defined clusters. In this review, we provide an in-depth look at recent advances in the use of NHCs for the development of functional materials.

Journal ArticleDOI
TL;DR: An extensive state-of-the-art survey of OBIA techniques is conducted, discussing different segmentation techniques and their applicability to OBIA; the key challenge is to select optimal parameters and algorithms that can generate image objects matching meaningful geographic objects.
Abstract: Image segmentation is a critical and important step in (GEographic) Object-Based Image Analysis (GEOBIA or OBIA). The final feature extraction and classification in OBIA is highly dependent on the quality of image segmentation. Segmentation has been used in remote sensing image processing since the advent of the Landsat-1 satellite. However, after the launch of the high-resolution IKONOS satellite in 1999, the paradigm of image analysis moved from pixel-based to object-based. As a result, the purpose of segmentation has changed from aiding pixel labeling to object identification. Although several articles have reviewed segmentation algorithms, it is unclear whether some segmentation algorithms are generally better suited to (GE)OBIA than others. This article conducts an extensive state-of-the-art survey of OBIA techniques and discusses different segmentation techniques and their applicability to OBIA. Conceptual details of those techniques are explained along with their strengths and weaknesses. The available tools and software packages for segmentation are also summarized. The key challenge in image segmentation is to select optimal parameters and algorithms that can generate image objects matching meaningful geographic objects. Recent research indicates an apparent movement towards the improvement of segmentation algorithms, aiming at more accurate, automated, and computationally efficient techniques.
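The object-based paradigm the abstract describes, grouping adjacent pixels into image objects before classification, can be illustrated with a deliberately minimal sketch. This is not any of the algorithms surveyed in the paper; it is a toy 4-connected region-growing segmenter over an invented grid, with the tolerance threshold chosen purely for demonstration.

```python
def segment(grid, tol=10):
    """Label 4-connected regions of pixels whose values stay within
    `tol` of the region's seed pixel (a toy region-growing scheme)."""
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if labels[r][c]:
                continue  # pixel already belongs to an image object
            current += 1
            seed = grid[r][c]
            labels[r][c] = current
            stack = [(r, c)]
            while stack:  # flood-fill from the seed
                y, x = stack.pop()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not labels[ny][nx]
                            and abs(grid[ny][nx] - seed) <= tol):
                        labels[ny][nx] = current
                        stack.append((ny, nx))
    return labels

# Two bright/dark patches plus a mid-valued strip -> three image objects.
grid = [
    [10, 12, 90, 92],
    [11, 13, 91, 93],
    [50, 51, 52, 53],
]
labels = segment(grid)
```

Real OBIA segmenters (e.g. multiresolution segmentation) additionally weigh spectral homogeneity against object shape and scale, which is exactly where the parameter-selection challenge discussed above arises.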

Journal ArticleDOI
TL;DR: A framework for the systematic advancement of EndMT research at the molecular and translational levels is proposed, and a number of challenges are reviewed, including the lack of a precise functional and molecular definition of EndMT, a lack of understanding of its causative pathological role in CVDs, and a lack of robust human data corroborating its extent and causality.

Journal ArticleDOI
TL;DR: Major cardiovascular events were more common among those with low levels of education in all types of country studied, but much more so in low-income countries, and differences in outcomes between educational groups were not explained by differences in risk factors.

Journal ArticleDOI
TL;DR: In this article, the authors study the impact of parameter optimization on defect prediction models and find that automated parameter optimization can substantially shift the importance ranking of variables, with as few as 28 percent of the top-ranked variables in optimized classifiers also being top-ranked in non-optimized classifiers.
Abstract: Defect prediction models—classifiers that identify defect-prone software modules—have configurable parameters that control their characteristics (e.g., the number of trees in a random forest). Recent studies show that these classifiers underperform when default settings are used. In this paper, we study the impact of automated parameter optimization on defect prediction models. Through a case study of 18 datasets, we find that automated parameter optimization: (1) improves AUC performance by up to 40 percentage points; (2) yields classifiers that are at least as stable as those trained using default settings; (3) substantially shifts the importance ranking of variables, with as few as 28 percent of the top-ranked variables in optimized classifiers also being top-ranked in non-optimized classifiers; (4) yields optimized settings for 17 of the 20 most sensitive parameters that transfer among datasets without a statistically significant drop in performance; and (5) adds less than 30 minutes of additional computation to 12 of the 26 studied classification techniques. While widely-used classification techniques like random forest and support vector machines are not optimization-sensitive, traditionally overlooked techniques like C5.0 and neural networks can actually outperform widely-used techniques after optimization is applied. This highlights the importance of exploring the parameter space when using parameter-sensitive classification techniques.
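The automated parameter optimization the study evaluates amounts to searching a classifier's parameter space and keeping the setting that performs best on held-out data. As a hedged illustration only (not the paper's actual experimental setup, datasets, or classifiers), the sketch below grid-searches the neighbourhood size `k` of a toy one-dimensional k-nearest-neighbour classifier; the data and parameter grid are invented for demonstration.

```python
def knn_predict(train, x, k):
    """Majority label among the k training points nearest to x."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    ones = sum(label for _, label in nearest)
    return 1 if ones * 2 > k else 0

def grid_search(train, valid, grid):
    """Exhaustively try each candidate k; return the one with the
    highest validation accuracy (the essence of parameter tuning)."""
    def accuracy(k):
        return sum(knn_predict(train, x, k) == y for x, y in valid) / len(valid)
    return max(grid, key=accuracy)

# Class 0 clusters near 0, class 1 near 10, plus one noisy point at 4.9
# that misleads the default-like setting k=1 but not larger k.
train = [(0.0, 0), (0.5, 0), (1.0, 0), (9.0, 1), (9.5, 1), (10.0, 1), (4.9, 1)]
valid = [(0.2, 0), (0.8, 0), (4.0, 0), (9.2, 1), (9.8, 1)]
best_k = grid_search(train, valid, grid=[1, 3, 5])
```

Tools such as Caret or scikit-learn automate the same loop over far larger grids with cross-validation, which is what makes exploring parameter-sensitive techniques like C5.0 or neural networks practical.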

Journal ArticleDOI
TL;DR: Women with a broader array of pregnancy complications, including placental abruption and stillbirth, are at increased risk of future CVD, and the findings support the need for assessment and risk factor management beyond the postpartum period.
Abstract: Background: Women with a history of certain pregnancy complications are at higher risk for cardiovascular disease (CVD). However, most clinical guidelines only recommend postpartum follow-up of tho...

Journal ArticleDOI
TL;DR: The various state-switching mechanisms of these boron-based materials are introduced, followed by a detailed account of recent advances in the field, with emphasis on structure-property relationships and the potential applications of the stimuli-responsive boron compounds.
Abstract: Boron-based stimuli-responsive systems represent an emerging class of useful materials with a wide variety of applications. Functions within these boron-doped molecules are derived from external stimuli such as light, heat, and force, which alter their intra- and/or intermolecular interactions, yielding unique electronic/photophysical or mechanical properties that can be exploited as optical probes or switchable materials. In this review, the various state-switching mechanisms of these boron-based materials will be introduced, followed by a detailed account of recent advances in the field. Emphasis will be placed on structure-property relationships and the potential applications of the stimuli-responsive boron compounds.

Journal ArticleDOI
TL;DR: A previously unknown ligand for gold(0) nanoclusters—N-heterocyclic carbenes (NHCs)—which feature a robust metal–carbon single bond and impart high stability to the corresponding gold cluster is reported.
Abstract: Magic-number gold nanoclusters are atomically precise nanomaterials that have enabled unprecedented insight into structure-property relationships in nanoscience. Thiolates are the most common ligand, binding to the cluster via a staple motif in which only central gold atoms are in the metallic state. The lack of other strongly bound ligands for nanoclusters with different bonding modes has been a significant limitation in the field. Here, we report a previously unknown ligand for gold(0) nanoclusters-N-heterocyclic carbenes (NHCs)-which feature a robust metal-carbon single bond and impart high stability to the corresponding gold cluster. The addition of a single NHC to gold nanoclusters results in significantly improved stability and catalytic properties in the electrocatalytic reduction of CO2. By varying the conditions, nature and number of equivalents of the NHC, predominantly or exclusively monosubstituted NHC-functionalized clusters result. Clusters can also be obtained with up to five NHCs, as a mixture of species.

Journal ArticleDOI
Susan M. Natali1, Jennifer D. Watts1, Brendan M. Rogers1, S. Potter1, S. Ludwig1, A. K. Selbmann2, Patrick F. Sullivan3, Benjamin W. Abbott4, Kyle A. Arndt5, Leah Birch1, Mats P. Björkman6, A. Anthony Bloom7, Gerardo Celis8, Torben R. Christensen9, Casper T. Christiansen10, Roisin Commane11, Elisabeth J. Cooper12, Patrick M. Crill13, Claudia I. Czimczik14, S. P. Davydov, Jinyang Du15, Jocelyn Egan16, Bo Elberling17, Eugénie S. Euskirchen18, Thomas Friborg17, Hélène Genet18, Mathias Göckede19, Jordan P. Goodrich20, Jordan P. Goodrich5, Paul Grogan21, Manuel Helbig22, Manuel Helbig23, Elchin Jafarov24, Julie D. Jastrow25, Aram Kalhori5, Yongwon Kim18, John S. Kimball15, Lars Kutzbach26, Mark J. Lara27, Klaus Steenberg Larsen17, Bang Yong Lee, Zhihua Liu28, Michael M. Loranty29, Magnus Lund9, Massimo Lupascu30, Nima Madani7, Avni Malhotra31, Roser Matamala25, Jack W. McFarland32, A. David McGuire18, Anders Michelsen17, Christina Minions1, Walter C. Oechel5, Walter C. Oechel33, David Olefeldt34, Frans-Jan W. Parmentier35, Frans-Jan W. Parmentier36, N. Pirk36, N. Pirk35, Ben Poulter37, William L. Quinton38, Fereidoun Rezanezhad39, David Risk40, Torsten Sachs, Kevin Schaefer41, Niels Martin Schmidt9, Edward A. G. Schuur8, Philipp R. Semenchuk42, Gaius R. Shaver43, Oliver Sonnentag22, Gregory Starr44, Claire C. Treat45, M. P. Waldrop32, Yihui Wang5, Jeffrey M. Welker46, Jeffrey M. Welker3, Christian Wille, Xiaofeng Xu5, Zhen Zhang47, Qianlai Zhuang48, Donatella Zona5, Donatella Zona49 
TL;DR: In this paper, the authors synthesize regional in situ observations of CO2 flux from Arctic and boreal soils to assess current and future winter carbon losses from the northern permafrost domain.
Abstract: Recent warming in the Arctic, which has been amplified during the winter1–3, greatly enhances microbial decomposition of soil organic matter and subsequent release of carbon dioxide (CO2)4. However, the amount of CO2 released in winter is not known and has not been well represented by ecosystem models or empirically based estimates5,6. Here we synthesize regional in situ observations of CO2 flux from Arctic and boreal soils to assess current and future winter carbon losses from the northern permafrost domain. We estimate a contemporary loss of 1,662 TgC per year from the permafrost region during the winter season (October–April). This loss is greater than the average growing season carbon uptake for this region estimated from process models (−1,032 TgC per year). Extending model predictions to warmer conditions up to 2100 indicates that winter CO2 emissions will increase 17% under a moderate mitigation scenario—Representative Concentration Pathway 4.5—and 41% under business-as-usual emissions scenario—Representative Concentration Pathway 8.5. Our results provide a baseline for winter CO2 emissions from northern terrestrial regions and indicate that enhanced soil CO2 loss due to winter warming may offset growing season carbon uptake under future climatic conditions. Winter warming in the Arctic will increase the CO2 flux from soils. A pan-Arctic analysis shows a current loss of 1,662 TgC per year over the winter, exceeding estimated carbon uptake in the growing season; projections suggest a 17% increase under RCP 4.5 and a 41% increase under RCP 8.5 by 2100.
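The abstract's headline numbers can be sanity-checked with simple arithmetic. This is purely an illustrative back-of-the-envelope calculation using the figures quoted above; the paper's own projections come from in situ flux synthesis and process models, not from this arithmetic.

```python
# Figures quoted in the abstract (TgC per year).
winter_loss = 1662      # contemporary winter-season (Oct-Apr) CO2 loss
growing_uptake = 1032   # average growing-season uptake from process models

# Contemporary net winter-dominated imbalance.
net_today = winter_loss - growing_uptake

# Projected winter losses by 2100 under the two scenarios.
winter_rcp45 = winter_loss * 1.17   # +17%, moderate mitigation (RCP 4.5)
winter_rcp85 = winter_loss * 1.41   # +41%, business-as-usual (RCP 8.5)
```

Even today the quoted winter loss exceeds the modelled growing-season uptake by roughly 630 TgC per year, which is why the authors conclude that winter warming may offset growing-season carbon gains.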

Journal ArticleDOI
TL;DR: A sensor-based, data-driven scheme using a deep learning tool and the similarity-based curve matching technique to estimate the RUL of a system, demonstrating the competitiveness of the proposed method for RUL estimation.
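The similarity-based curve matching idea mentioned in the TL;DR can be sketched in miniature: compare the observed degradation prefix of a new unit against a library of complete run-to-failure curves and read the remaining useful life (RUL) off the best match. This toy sketch is an assumption, not the paper's deep-learning pipeline; the health-index curves below are invented for demonstration.

```python
def rul_estimate(prefix, library):
    """Estimate RUL as the remaining length of the historical
    run-to-failure curve whose start best matches the observed prefix."""
    best_rul, best_dist = None, float("inf")
    for curve in library:
        if len(curve) < len(prefix):
            continue  # curve too short to compare against the prefix
        # Euclidean distance between the prefix and the curve's start.
        dist = sum((a - b) ** 2 for a, b in zip(prefix, curve)) ** 0.5
        if dist < best_dist:
            best_dist, best_rul = dist, len(curve) - len(prefix)
    return best_rul

# Historical degradation curves (health index per time step); each ends
# at failure, so a curve of length n fails at t = n.
library = [
    [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4],
    [1.0, 0.95, 0.9, 0.85, 0.8, 0.75, 0.7, 0.65, 0.6],
]
observed = [1.0, 0.94, 0.89]   # three observations of a new unit
rul = rul_estimate(observed, library)
```

In the deep-learning variant, a network first maps raw multi-sensor data to such health-index curves, and the matching step then operates on the learned curves rather than on raw signals.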

Journal ArticleDOI
TL;DR: An unprecedented 60-fold increase in RTS numbers for Banks Island, Canada, from 1984 to 2015 due to a warming summer climate is shown, providing additional evidence that ice-rich continuous permafrost terrain can be highly vulnerable to changing summer climate.
Abstract: Retrogressive thaw slumps (RTS) – landslides caused by the melt of ground ice in permafrost – have become more common in the Arctic, but the timing of this recent increase and its links to climate have not been fully established. Here we annually resolve RTS formation and longevity for Banks Island, Canada (70,000 km2) using the Google Earth Engine Timelapse dataset. We describe a 60-fold increase in numbers between 1984 and 2015 as more than 4000 RTS were initiated, primarily following four particularly warm summers. Colour change due to increased turbidity occurred in 288 lakes affected by RTS outflows and sediment accumulated in many valley floors. Modelled RTS initiation rates increased by an order of magnitude between 1906–1985 and 2006–2015, and are projected under RCP 4.5 to rise to >10,000 per decade after 2075. These results provide additional evidence that ice-rich continuous permafrost terrain can be highly vulnerable to changing summer climate. Retrogressive thaw slumps (RTS) are landslides caused by melting ground ice in permafrost areas. Based on Google Earth Engine Timelapse data, the authors show an unprecedented 60-fold increase in RTS numbers for Banks Island, Canada, from 1984 to 2015 due to a warming summer climate.

Journal ArticleDOI
TL;DR: In this article, the authors use literature and field data to outline some key trends being observed at the nexus of agricultural production, technology, and labour in North America, with a particular focus on the Canadian context.

Journal ArticleDOI
TL;DR: The challenges of DC microgrid protection are investigated from various aspects, including dc fault current characteristics, ground systems, fault detection methods, protective devices, and fault location methods.
Abstract: DC microgrids have attracted significant attention over the last decade in both academia and industry. DC microgrids have demonstrated superiority over AC microgrids with respect to reliability, efficiency, control simplicity, integration of renewable energy sources, and connection of dc loads. Despite these numerous advantages, designing and implementing an appropriate protection system for dc microgrids remains a significant challenge. The challenge stems from the rapid rise of dc fault current, which must be extinguished in the absence of naturally occurring zero crossings, potentially leading to sustained arcs. In this paper, the challenges of DC microgrid protection are investigated from various aspects, including dc fault current characteristics, ground systems, fault detection methods, protective devices, and fault location methods. In each part, a comprehensive review has been carried out. Finally, future trends in the protection of DC microgrids are briefly discussed.