Showing papers by "University of Aberdeen" published in 2020
••
University of Amsterdam; Ghent University; University of Chicago; University of Pennsylvania; Lund University; Auckland City Hospital; University of Antwerp; University of New South Wales; Katholieke Universiteit Leuven; Guy's and St Thomas' NHS Foundation Trust; Queen's University; University of Zagreb; Northwestern University; Medical University of Łódź; University of Aberdeen; Medical University of South Carolina; University of North Carolina at Chapel Hill; University of Southampton; University of São Paulo; National University of Singapore; Flinders University
TL;DR: The European Position Paper on Rhinosinusitis and Nasal Polyps 2020 is the update of similar evidence-based position papers published in 2005, 2007 and 2012, and addresses areas not extensively covered in EPOS2012, such as paediatric CRS and sinus surgery.
Abstract: The European Position Paper on Rhinosinusitis and Nasal Polyps 2020 is the update of similar evidence-based position papers published in 2005, 2007 and 2012. The core objective of the EPOS2020 guideline is to provide revised, up-to-date and clear evidence-based recommendations and integrated care pathways in ARS and CRS. EPOS2020 provides an update on the literature published and studies undertaken in the eight years since the EPOS2012 position paper was published, and addresses areas not extensively covered in EPOS2012, such as paediatric CRS and sinus surgery. EPOS2020 also involves new stakeholders, including pharmacists and patients, and addresses new target users who have become more involved in the management and treatment of rhinosinusitis since the publication of the last EPOS document, including pharmacists, nurses, specialised care givers and indeed patients themselves, who increasingly self-manage their condition using over-the-counter treatments. The document provides suggestions for future research in this area and offers updated guidance for definitions and outcome measurements in research in different settings. EPOS2020 contains chapters on definitions and classification where we have defined a large number of terms and indicated preferred terms. A new classification of CRS into primary and secondary CRS, and further division into localized and diffuse disease based on anatomic distribution, is proposed. There are extensive chapters on epidemiology and predisposing factors, inflammatory mechanisms, (differential) diagnosis of facial pain, allergic rhinitis, genetics, cystic fibrosis, aspirin-exacerbated respiratory disease, immunodeficiencies, allergic fungal rhinosinusitis and the relationship between upper and lower airways. The chapters on paediatric acute and chronic rhinosinusitis are totally rewritten.
All available evidence for the management of acute rhinosinusitis and chronic rhinosinusitis with or without nasal polyps in adults and children is systematically reviewed, and integrated care pathways based on the evidence are proposed. Despite considerable increases in the number of quality publications in recent years, a large number of practical clinical questions remain. It was agreed that the best way to address these was to conduct a Delphi exercise. The results have been integrated into the respective sections. Last but not least, advice for patients and pharmacists and a new list of research needs are included. The full document can be downloaded for free on the website of this journal: http://www.rhinologyjournal.com.
2,853 citations
••
Katholieke Universiteit Leuven; Public Health Research Institute; Leiden University; John Radcliffe Hospital; University of Oxford; Keele University; Medical University of Vienna; University Medical Center Utrecht; University College Cork; University of Pennsylvania; University of Cologne; Manchester Academic Health Science Centre; University of Aberdeen; RMIT University; University of Manchester; University of Amsterdam; University of Ioannina; Imperial College London; Maastricht University Medical Centre; Humboldt University of Berlin
TL;DR: Proposed models for covid-19 are poorly reported, at high risk of bias, and their reported performance is probably optimistic, according to a review of published and preprint reports.
Abstract: Objective To review and appraise the validity and usefulness of published and preprint reports of prediction models for diagnosing coronavirus disease 2019 (covid-19) in patients with suspected infection, for prognosis of patients with covid-19, and for detecting people in the general population at increased risk of covid-19 infection or being admitted to hospital with the disease. Design Living systematic review and critical appraisal by the COVID-PRECISE (Precise Risk Estimation to optimise covid-19 Care for Infected or Suspected patients in diverse sEttings) group. Data sources PubMed and Embase through Ovid, up to 1 July 2020, supplemented with arXiv, medRxiv, and bioRxiv up to 5 May 2020. Study selection Studies that developed or validated a multivariable covid-19 related prediction model. Data extraction At least two authors independently extracted data using the CHARMS (critical appraisal and data extraction for systematic reviews of prediction modelling studies) checklist; risk of bias was assessed using PROBAST (prediction model risk of bias assessment tool). Results 37 421 titles were screened, and 169 studies describing 232 prediction models were included. The review identified seven models for identifying people at risk in the general population; 118 diagnostic models for detecting covid-19 (75 were based on medical imaging, 10 to diagnose disease severity); and 107 prognostic models for predicting mortality risk, progression to severe disease, intensive care unit admission, ventilation, intubation, or length of hospital stay. The most frequent types of predictors included in the covid-19 prediction models are vital signs, age, comorbidities, and image features. Flu-like symptoms are frequently predictive in diagnostic models, while sex, C reactive protein, and lymphocyte counts are frequent prognostic factors. 
Reported C index estimates from the strongest form of validation available per model ranged from 0.71 to 0.99 in prediction models for the general population, from 0.65 to more than 0.99 in diagnostic models, and from 0.54 to 0.99 in prognostic models. All models were rated at high or unclear risk of bias, mostly because of non-representative selection of control patients, exclusion of patients who had not experienced the event of interest by the end of the study, high risk of model overfitting, and unclear reporting. Many models did not include a description of the target population (n=27, 12%) or care setting (n=75, 32%), and only 11 (5%) were externally validated by a calibration plot. The Jehi diagnostic model and the 4C mortality score were identified as promising models. Conclusion Prediction models for covid-19 are quickly entering the academic literature to support medical decision making at a time when they are urgently needed. This review indicates that almost all published prediction models are poorly reported, and at high risk of bias such that their reported predictive performance is probably optimistic. However, we have identified two (one diagnostic and one prognostic) promising models that should soon be validated in multiple cohorts, preferably through collaborative efforts and data sharing to also allow an investigation of the stability and heterogeneity in their performance across populations and settings. Details on all reviewed models are publicly available at https://www.covprecise.org/. Methodological guidance as provided in this paper should be followed because unreliable predictions could cause more harm than benefit in guiding clinical decisions. Finally, prediction model authors should adhere to the TRIPOD (transparent reporting of a multivariable prediction model for individual prognosis or diagnosis) reporting guideline. Systematic review registration Protocol https://osf.io/ehc47/, registration https://osf.io/wy245.
Readers’ note This article is a living systematic review that will be updated to reflect emerging evidence. Updates may occur for up to two years from the date of original publication. This version is update 3 of the original article published on 7 April 2020 (BMJ 2020;369:m1328). Previous updates can be found as data supplements (https://www.bmj.com/content/369/bmj.m1328/related#datasupp). When citing this paper please consider adding the update number and date of access for clarity.
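The C index estimates quoted in this abstract measure how well a model ranks patients by risk. As an illustrative aside (not drawn from the review itself, and using hypothetical scores), a minimal Python sketch of the concordance calculation for binary outcomes:

```python
from itertools import combinations

def c_index(risk_scores, outcomes):
    """Fraction of usable pairs in which the subject with the event
    (outcome 1) received the higher predicted risk than the subject
    without it (outcome 0). Tied predictions count as half-concordant."""
    concordant = 0.0
    usable = 0
    for (r_i, y_i), (r_j, y_j) in combinations(zip(risk_scores, outcomes), 2):
        if y_i == y_j:
            continue  # same outcome: the pair gives no ordering information
        usable += 1
        # risk score of the subject with the event vs. without it
        hi = r_i if y_i == 1 else r_j
        lo = r_j if y_i == 1 else r_i
        if hi > lo:
            concordant += 1.0
        elif hi == lo:
            concordant += 0.5
    return concordant / usable

# Hypothetical predictions: events got the higher scores, so discrimination is perfect
print(c_index([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))  # → 1.0
```

A value of 0.5 corresponds to random ranking, so the review's reported range of 0.54 to 0.99 spans nearly useless to near-perfect discrimination.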
2,183 citations
••
University of Sydney; University of Michigan; Duke University; University of Alabama at Birmingham; University of Pittsburgh; University of Florida; Centers for Disease Control and Prevention; University of Münster; University of Udine; Ankara University; University of Wisconsin-Madison; Paris Diderot University; University of Arkansas for Medical Sciences; University of Paris; University of Lausanne; Brown University; Istituto Giannina Gaslini; Carlos III Health Institute; Uniformed Services University of the Health Sciences; National Institutes of Health; University of Pennsylvania; St George's, University of London; Heidelberg University; University of Copenhagen; University College London; University of Texas MD Anderson Cancer Center; Katholieke Universiteit Leuven; Goethe University Frankfurt; University of Würzburg; Johns Hopkins University; Monash University; Federal University of Rio de Janeiro; Catholic University of the Sacred Heart; University of Texas Health Science Center at San Antonio; Masaryk University; RMIT University; Radboud University Nijmegen; University of Melbourne; Stanford University; University of California, Davis; Georgia Regents University; Cornell University; University of Aberdeen; University Hospital of Wales
TL;DR: These updated definitions of IFDs should prove applicable in clinical, diagnostic, and epidemiologic research of a broader range of patients at high risk.
Abstract: BACKGROUND: Invasive fungal diseases (IFDs) remain important causes of morbidity and mortality. The consensus definitions of the Infectious Diseases Group of the European Organization for Research and Treatment of Cancer and the Mycoses Study Group have been of immense value to researchers who conduct clinical trials of antifungals, assess diagnostic tests, and undertake epidemiologic studies. However, their utility has not extended beyond patients with cancer or recipients of stem cell or solid organ transplants. With newer diagnostic techniques available, it was clear that an update of these definitions was essential. METHODS: To achieve this, 10 working groups looked closely at imaging, laboratory diagnosis, and special populations at risk of IFD. A final version of the manuscript was agreed upon after the groups' findings were presented at a scientific symposium and after a 3-month period for public comment. There were several rounds of discussion before a final version of the manuscript was approved. RESULTS: There is no change in the classifications of "proven," "probable," and "possible" IFD, although the definition of "probable" has been expanded and the scope of the category "possible" has been diminished. The category of proven IFD can apply to any patient, regardless of whether the patient is immunocompromised. The probable and possible categories are proposed for immunocompromised patients only, except for endemic mycoses. CONCLUSIONS: These updated definitions of IFDs should prove applicable in clinical, diagnostic, and epidemiologic research of a broader range of patients at high risk.
1,211 citations
••
TL;DR: This Consensus Statement outlines the definition and scope of the term ‘synbiotics’ as determined by an expert panel convened by the International Scientific Association for Probiotics and Prebiotics in May 2019 and explores the levels of evidence, safety, effects upon targets and implications for stakeholders of the synbiotic concept.
Abstract: In May 2019, the International Scientific Association for Probiotics and Prebiotics (ISAPP) convened a panel of nutritionists, physiologists and microbiologists to review the definition and scope of synbiotics. The panel updated the definition of a synbiotic to “a mixture comprising live microorganisms and substrate(s) selectively utilized by host microorganisms that confers a health benefit on the host”. The panel concluded that defining synbiotics as simply a mixture of probiotics and prebiotics could suppress the innovation of synbiotics that are designed to function cooperatively. Requiring that each component must meet the evidence and dose requirements for probiotics and prebiotics individually could also present an obstacle. Rather, the panel clarified that a complementary synbiotic, which has not been designed so that its component parts function cooperatively, must be composed of a probiotic plus a prebiotic, whereas a synergistic synbiotic does not need to be so. A synergistic synbiotic is a synbiotic for which the substrate is designed to be selectively utilized by the co-administered microorganisms. This Consensus Statement further explores the levels of evidence (existing and required), safety, effects upon targets and implications for stakeholders of the synbiotic concept. Gut microbiota can be manipulated to benefit host health, including the use of probiotics, prebiotics and synbiotics. This Consensus Statement outlines the definition and scope of the term ‘synbiotics’ as determined by an expert panel convened by the International Scientific Association for Probiotics and Prebiotics in May 2019.
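The complementary/synergistic distinction above amounts to a decision rule. A hedged sketch encoding it in Python (the function and flag names are illustrative, not ISAPP terminology):

```python
def classify_synbiotic(contains_probiotic: bool,
                       contains_prebiotic: bool,
                       substrate_selectively_used_by_coadministered: bool) -> str:
    """Illustrative encoding of the panel's distinction: a synergistic
    synbiotic is defined by its substrate being selectively utilized by
    the co-administered microorganisms; a complementary synbiotic must
    instead pair a probiotic with a prebiotic."""
    if substrate_selectively_used_by_coadministered:
        return "synergistic synbiotic"
    if contains_probiotic and contains_prebiotic:
        return "complementary synbiotic"
    return "not a synbiotic under the complementary definition"
```

Per the Consensus Statement, the synergistic check takes precedence: such a mixture need not meet the individual probiotic and prebiotic requirements.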
953 citations
••
TL;DR: The extent of the trait data compiled in TRY is evaluated and emerging patterns of data coverage and representativeness are analyzed to conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements.
Abstract: Plant traits (the morphological, anatomical, physiological, biochemical and phenological characteristics of plants) determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits: almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We, therefore, conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.
882 citations
••
Aneurin Bevan University Health Board; Cardiff University; King's College London; University of Manchester; University of Salford; University of Glasgow; North Bristol NHS Trust; University of Modena and Reggio Emilia; Aberdeen Royal Infirmary; Woodend Hospital; Alexandra Hospital; University of Aberdeen
TL;DR: This study establishes the prevalence of frailty in patients with COVID-19 admitted to hospital and investigates its association with mortality and duration of hospital stay; disease outcomes were better predicted by frailty than by either age or comorbidity.
Abstract: Background
The COVID-19 pandemic has placed unprecedented strain on health-care systems. Frailty is being used in clinical decision making for patients with COVID-19, yet the prevalence and effect of frailty in people with COVID-19 is not known. In the COVID-19 in Older PEople (COPE) study we aimed to establish the prevalence of frailty in patients with COVID-19 who were admitted to hospital and investigate its association with mortality and duration of hospital stay.
Methods
This was an observational cohort study conducted at ten hospitals in the UK and one in Italy. All adults (≥18 years) admitted to participating hospitals with COVID-19 were included. Patients with incomplete hospital records were excluded. The study analysed routinely generated hospital data for patients with COVID-19. Frailty was assessed by specialist COVID-19 teams using the clinical frailty scale (CFS) and patients were grouped according to their score (1–2=fit; 3–4=vulnerable, but not frail; 5–6=initial signs of frailty but with some degree of independence; and 7–9=severe or very severe frailty). The primary outcome was in-hospital mortality (time from hospital admission to mortality and day-7 mortality).
Findings
Between Feb 27 and April 28, 2020, we enrolled 1564 patients with COVID-19. The median age was 74 years (IQR 61–83); 903 (57·7%) were men and 661 (42·3%) were women; 425 (27·2%) had died at data cutoff (April 28, 2020). 772 (49·4%) were classed as frail (CFS 5–8) and 27 (1·7%) were classed as terminally ill (CFS 9). Compared with CFS 1–2, the adjusted hazard ratios for time from hospital admission to death were 1·55 (95% CI 1·00–2·41) for CFS 3–4, 1·83 (1·15–2·91) for CFS 5–6, and 2·39 (1·50–3·81) for CFS 7–9, and adjusted odds ratios for day-7 mortality were 1·22 (95% CI 0·63–2·38) for CFS 3–4, 1·62 (0·81–3·26) for CFS 5–6, and 3·12 (1·56–6·24) for CFS 7–9.
Interpretation
In a large population of patients admitted to hospital with COVID-19, disease outcomes were better predicted by frailty than either age or comorbidity. Our results support the use of CFS to inform decision making about medical care in adult patients admitted to hospital with COVID-19.
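The CFS grouping described in the Methods above maps a 1–9 score onto four reporting bands. A small illustrative sketch (the function name is ours, not the study's):

```python
def cfs_group(score: int) -> str:
    """Map a clinical frailty scale (CFS) score to the reporting
    groups described in the COPE study's Methods."""
    if not 1 <= score <= 9:
        raise ValueError("CFS scores range from 1 to 9")
    if score <= 2:
        return "fit"
    if score <= 4:
        return "vulnerable, but not frail"
    if score <= 6:
        return "initial signs of frailty but with some degree of independence"
    return "severe or very severe frailty"

print(cfs_group(8))  # → severe or very severe frailty
```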
475 citations
••
University of Edinburgh; University of California; University of Sheffield; University of Virginia; Aarhus University; University of California, Davis; University of Barcelona; Northern Arizona University; University of Alaska Fairbanks; Netherlands Organisation for Scientific Research; University of Oslo; University of Bergen; VU University Amsterdam; University of Exeter; Institute of Arctic and Alpine Research; University of Lapland; Grand Valley State University; University of Zurich; Colgate University; University of Oxford; Open University; Umeå University; University of Stirling; University of Tromsø; Lund University; University of Alaska Anchorage; University of Texas at El Paso; University of Greifswald; Swiss Federal Institute for Forest, Snow and Landscape Research; University of Aberdeen
TL;DR: A consensus is emerging that the underlying causes and future dynamics of so-called Arctic greening and browning trends are more complex, variable and inherently scale-dependent than previously thought.
Abstract: As the Arctic warms, vegetation is responding, and satellite measures indicate widespread greening at high latitudes. This ‘greening of the Arctic’ is among the world’s most important large-scale ecological responses to global climate change. However, a consensus is emerging that the underlying causes and future dynamics of so-called Arctic greening and browning trends are more complex, variable and inherently scale-dependent than previously thought. Here we summarize the complexities of observing and interpreting high-latitude greening to identify priorities for future research. Incorporating satellite and proximal remote sensing with in-situ data, while accounting for uncertainties and scale issues, will advance the study of past, present and future Arctic vegetation change.
407 citations
••
Royal Museum for Central Africa; Ghent University; University of Leeds; University College London; Forestry Commission; University of York; Wildlife Conservation Society; University of Kisangani; University of Plymouth; World Wide Fund for Nature; Norwegian University of Life Sciences; University of Yaoundé I; Manchester Metropolitan University; University of British Columbia; Center for International Forestry Research; Bioversity International; University of Toronto; University of Stirling; Forestry Research Institute of Ghana; Centre de coopération internationale en recherche agronomique pour le développement; University of Montpellier; Mbarara University of Science and Technology; Marien Ngouabi University; University of Buea; Duke University; University of Edinburgh; National Park Service; Smithsonian Institution; University of Cambridge; Gembloux Agro-Bio Tech; University of Birmingham; University of Exeter; Smithsonian Tropical Research Institute; Chinese Academy of Sciences; Royal Botanic Garden Edinburgh; American Museum of Natural History; African Wildlife Foundation; University of Bristol; University of Hong Kong; Royal Society for the Protection of Birds; Royal Botanic Gardens; Environmental Change Institute; University of the Sunshine Coast; Fleming College; Sokoine University of Agriculture; University of Southampton; University of Lincoln; University of Florence; University of Aberdeen; Innovate UK; National University of Singapore; Washington State University Vancouver; Yale University; University of Nottingham; Florida International University; Université libre de Bruxelles; Bangor University; University of Liberia
TL;DR: Overall, the uptake of carbon into Earth's intact tropical forests peaked in the 1990s; independent observations indicating greater recent carbon uptake into the Northern Hemisphere landmass reinforce the conclusion that the intact tropical forest carbon sink has already peaked.
Abstract: Structurally intact tropical forests sequestered about half of the global terrestrial carbon uptake over the 1990s and early 2000s, removing about 15 per cent of anthropogenic carbon dioxide emissions. Climate-driven vegetation models typically predict that this tropical forest ‘carbon sink’ will continue for decades. Here we assess trends in the carbon sink using 244 structurally intact African tropical forests spanning 11 countries, compare them with 321 published plots from Amazonia and investigate the underlying drivers of the trends. The carbon sink in live aboveground biomass in intact African tropical forests has been stable for the three decades to 2015, at 0.66 tonnes of carbon per hectare per year (95 per cent confidence interval 0.53–0.79), in contrast to the long-term decline in Amazonian forests. Therefore the carbon sink responses of Earth’s two largest expanses of tropical forest have diverged. The difference is largely driven by carbon losses from tree mortality, with no detectable multi-decadal trend in Africa and a long-term increase in Amazonia. Both continents show increasing tree growth, consistent with the expected net effect of rising atmospheric carbon dioxide and air temperature. Despite the past stability of the African carbon sink, our most intensively monitored plots suggest a post-2010 increase in carbon losses, delayed compared to Amazonia, indicating asynchronous carbon sink saturation on the two continents. A statistical model including carbon dioxide, temperature, drought and forest dynamics accounts for the observed trends and indicates a long-term future decline in the African sink, whereas the Amazonian sink continues to weaken rapidly. Overall, the uptake of carbon into Earth’s intact tropical forests peaked in the 1990s. 
Given that the global terrestrial carbon sink is increasing in size, independent observations indicating greater recent carbon uptake into the Northern Hemisphere landmass reinforce our conclusion that the intact tropical forest carbon sink has already peaked. This saturation and ongoing decline of the tropical forest carbon sink has consequences for policies intended to stabilize Earth’s climate.
395 citations
••
TL;DR: Overall, the available mechanistic data and limited human data on the metabolic consequences of elevated gut-derived SCFA production strongly suggest that increasing SCFA production could be a valuable strategy for preventing gastro-intestinal dysfunction, obesity and type 2 diabetes mellitus.
Abstract: Evidence is accumulating that short chain fatty acids (SCFA) play an important role in the maintenance of gut and metabolic health. The SCFA acetate, propionate and butyrate are produced from the microbial fermentation of indigestible carbohydrates and appear to be key mediators of the beneficial effects elicited by the gut microbiome. Microbial SCFA production is essential for gut integrity: it regulates the luminal pH and mucus production, provides fuel for epithelial cells and affects mucosal immune function. SCFA also directly modulate host metabolic health through a range of tissue-specific mechanisms related to appetite regulation, energy expenditure, glucose homeostasis and immunomodulation. Therefore, an increased microbial SCFA production can be considered a health benefit, but the data are mainly based on animal studies, whereas well-controlled human studies are limited. In this review, an expert group convened by ILSI Europe's Prebiotics Task Force discussed the current scientific knowledge on SCFA and considered the relationship between SCFA and gut and metabolic health, with a particular focus on human evidence. Overall, the available mechanistic data and limited human data on the metabolic consequences of elevated gut-derived SCFA production strongly suggest that increasing SCFA production could be a valuable strategy for preventing gastro-intestinal dysfunction, obesity and type 2 diabetes mellitus. Nevertheless, there is an urgent need for well-controlled, longer-term human SCFA intervention studies, including measurement of SCFA fluxes and kinetics, the heterogeneity in response based on metabolic phenotype, the type of dietary fibre and fermentation site in fibre intervention studies, and control for factors that could shape the microbiome, such as diet, physical activity and use of medication.
323 citations
••
27 Jul 2020
TL;DR: The findings show that a single deep learning algorithm can be trained to predict a wide range of molecular alterations from routine, paraffin-embedded histology slides stained with hematoxylin and eosin.
Abstract: Molecular alterations in cancer can cause phenotypic changes in tumor cells and their micro-environment. Routine histopathology tissue slides - which are ubiquitously available - can reflect such morphological changes. Here, we show that deep learning can consistently infer a wide range of genetic mutations, molecular tumor subtypes, gene expression signatures and standard pathology biomarkers directly from routine histology. We developed, optimized, validated and publicly released a one-stop-shop workflow and applied it to tissue slides of more than 5000 patients across multiple solid tumors. Our findings show that a single deep learning algorithm can be trained to predict a wide range of molecular alterations from routine, paraffin-embedded histology slides stained with hematoxylin and eosin. These predictions generalize to other populations and are spatially resolved. Our method can be implemented on mobile hardware, potentially enabling point-of-care diagnostics for personalized cancer treatment. More generally, this approach could elucidate and quantify genotype-phenotype links in cancer.
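The abstract describes spatially resolved predictions from routine slides. In tile-based workflows of this kind, a whole-slide image is split into tiles, each tile is scored, and the tile scores are pooled into a slide-level prediction. A toy sketch using simple averaging as the pooling step (the values and the pooling choice are illustrative, not the paper's exact method):

```python
from statistics import mean

def slide_score(tile_scores):
    """Aggregate per-tile predictions into one slide-level score by
    averaging, a simple stand-in for the pooling used in tile-based
    deep learning workflows on whole-slide images."""
    return mean(tile_scores)

# Hypothetical per-tile mutation probabilities from a 3x3 grid of tiles;
# the per-tile values themselves are what makes the prediction spatially resolved
tiles = [0.9, 0.8, 0.1,
         0.7, 0.6, 0.2,
         0.1, 0.1, 0.0]
print(round(slide_score(tiles), 3))  # → 0.389
```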
317 citations
••
16 Mar 2020
TL;DR: In this article, the authors quantify the role of soil carbon in natural (land-based) climate solutions and review some of the project design mechanisms available to tap into the potential.
Abstract: Mitigating climate change requires clean energy and the removal of atmospheric carbon. Building soil carbon is an appealing way to increase carbon sinks and reduce emissions owing to the associated benefits to agriculture. However, the practical implementation of soil carbon climate strategies lags behind the potential, partly because we lack clarity around the magnitude of opportunity and how to capitalize on it. Here we quantify the role of soil carbon in natural (land-based) climate solutions and review some of the project design mechanisms available to tap into the potential. We show that soil carbon represents 25% of the potential of natural climate solutions (total potential, 23.8 Gt of CO2-equivalent per year), of which 40% is protection of existing soil carbon and 60% is rebuilding depleted stocks. Soil carbon comprises 9% of the mitigation potential of forests, 72% for wetlands and 47% for agriculture and grasslands. Soil carbon is important to land-based efforts to prevent carbon emissions, remove atmospheric carbon dioxide and deliver ecosystem services in addition to climate mitigation. Diverse strategies are needed to mitigate climate change. This study finds that storing carbon in soils represents 25% of land-based potential, of which 60% must come from rebuilding depleted carbon stores.
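The headline decomposition can be checked with simple arithmetic using the figures quoted in the abstract above:

```python
# Figures from the abstract (Gt CO2-equivalent per year)
total_ncs = 23.8        # total potential of natural climate solutions
soil_share = 0.25       # soil carbon's share of that potential

soil_potential = total_ncs * soil_share   # soil carbon's absolute potential
protect = 0.40 * soil_potential           # protecting existing soil carbon
rebuild = 0.60 * soil_potential           # rebuilding depleted stocks

print(f"soil: {soil_potential:.2f}; protect: {protect:.2f}; rebuild: {rebuild:.2f}")
# → soil: 5.95; protect: 2.38; rebuild: 3.57
```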
••
University of Aberdeen; Institut national de la recherche agronomique; Agriculture and Agri-Food Canada; University of Waikato; Agro ParisTech; Landcare Research; Aarhus University; International Center for Tropical Agriculture; University of Vermont; Spanish National Research Council; Technical University of Madrid
TL;DR: A new vision for a global framework for MRV of SOC change is described, to support national and international initiatives seeking to effect change in the way soils are managed.
Abstract: There is growing international interest in better managing soils to increase soil organic carbon (SOC) content to contribute to climate change mitigation, to enhance resilience to climate change and to underpin food security, through initiatives such as the international '4p1000' initiative and the FAO's Global assessment of SOC sequestration potential (GSOCseq) programme. Since the SOC content of soils cannot be easily measured, a key barrier to implementing programmes to increase SOC at large scale is the need for credible and reliable measurement/monitoring, reporting and verification (MRV) platforms, both for national reporting and for emissions trading. Without such platforms, investments could be considered risky. In this paper, we review methods and challenges of measuring SOC change directly in soils, before examining some recent novel developments that show promise for quantifying SOC. We describe how repeat soil surveys are used to estimate changes in SOC over time, and how long-term experiments and space-for-time substitution sites can serve as sources of knowledge and can be used to test models, and as potential benchmark sites in global frameworks to estimate SOC change. We briefly consider models that can be used to simulate and project change in SOC and examine the MRV platforms for SOC change already in use in various countries/regions. In the final section, we bring together the various components described in this review to describe a new vision for a global framework for MRV of SOC change, to support national and international initiatives seeking to effect change in the way we manage our soils.
••
TL;DR: This paper provides a comprehensive literature review of the security issues and problems that impact the deployment of blockchain systems in smart cities, and presents a detailed discussion of several key factors for the convergence of Blockchain and AI technologies that will help form a sustainable smart society.
••
TL;DR: The authors review how genomics is being applied to aquaculture species at all stages of the domestication process to optimize selective breeding and how combining genomic selection with biotechnological innovations, such as genome editing and surrogate broodstock technologies, may further expedite genetic improvement in aquaculture.
Abstract: Aquaculture is the fastest-growing farmed food sector and will soon become the primary source of fish and shellfish for human diets. In contrast to crop and livestock production, aquaculture production is derived from numerous, exceptionally diverse species that are typically in the early stages of domestication. Genetic improvement of production traits via well-designed, managed breeding programmes has great potential to help meet the rising seafood demand driven by human population growth. Supported by continuous advances in sequencing and bioinformatics, genomics is increasingly being applied across the broad range of aquaculture species and at all stages of the domestication process to optimize selective breeding. In the future, combining genomic selection with biotechnological innovations, such as genome editing and surrogate broodstock technologies, may further expedite genetic improvement in aquaculture.
••
University of Barcelona1, University of Liverpool2, Vita-Salute San Raffaele University3, University of Tübingen4, Medical University of Vienna5, Charles University in Prague6, Freeman Hospital7, Autonomous University of Barcelona8, Royal Free London NHS Foundation Trust9, Netherlands Cancer Institute10, Heidelberg University11, University of Patras12, Leeds Teaching Hospitals NHS Trust13, University of Ioannina14, University of Modena and Reggio Emilia15, Zhejiang University16, Radboud University Nijmegen17, Umeå University18, University of Paris19, Istanbul Medipol University20, University of Basel21, Erasmus University Rotterdam22, Praxis23, Innsbruck Medical University24, University of St. Gallen25, Ghent University Hospital26, St. George's University27, University College London Hospitals NHS Foundation Trust28, European Association of Urology29, University of Sheffield30, Katholieke Universiteit Leuven31, Dresden University of Technology32, University of Copenhagen33, University of Aberdeen34
TL;DR: It is hoped that the revised recommendations will assist urologist surgeons across the globe to guide the management of urological conditions during the current COVID-19 pandemic.
••
Commonwealth Scientific and Industrial Research Organisation1, International Livestock Research Institute2, Chatham House3, Potsdam Institute for Climate Impact Research4, University of Oslo5, Deakin University6, University of Copenhagen7, International Center for Tropical Agriculture8, University of Oxford9, Wageningen University and Research Centre10, International Institute for Applied Systems Analysis11, University of Minnesota12, Stanford University13, University of Queensland14, University of Potsdam15, University of Aberdeen16, Netherlands Environmental Assessment Agency17, CGIAR18, Utrecht University19
TL;DR: In this article, the authors identify technologies, assess their readiness and propose eight action points that could accelerate the transition towards a more sustainable food system and argue that the speed of innovation could be significantly increased with the appropriate incentives, regulations and social licence.
Abstract: Future technologies and systemic innovation are critical for the profound transformation the food system needs. These innovations range from food production, land use and emissions, all the way to improved diets and waste management. Here, we identify these technologies, assess their readiness and propose eight action points that could accelerate the transition towards a more sustainable food system. We argue that the speed of innovation could be significantly increased with the appropriate incentives, regulations and social licence. These, in turn, require constructive stakeholder dialogue and clear transition pathways.
••
TL;DR: The processes of globalisation and time-space compression, driven mainly by the neoliberal agenda and the advancement of various space-shrinking technologies, have markedly re-shaped the world over the last decade as discussed by the authors.
Abstract: The processes of globalisation and time-space compression, driven mainly by the neoliberal agenda and the advancement of various space-shrinking technologies, have markedly re-shaped the world over...
••
TL;DR: Research on the biochemistry, physiology and ecology of ammonia oxidisers is reviewed and the ways in which this knowledge, coupled with improved methods for characterising communities, might lead to improved fertiliser use efficiency and mitigation of N2O emissions are discussed.
Abstract: Oxidation of ammonia to nitrite by bacteria and archaea is responsible for global emissions of nitrous oxide directly and indirectly through provision of nitrite and, after further oxidation, nitrate to denitrifiers. Their contributions to increasing N2O emissions are greatest in terrestrial environments, due to the dramatic and continuing increases in use of ammonia-based fertilizers, which have been driven by requirement for increased food production, but which also provide a source of energy for ammonia oxidizers (AO), leading to an imbalance in the terrestrial nitrogen cycle. Direct N2O production by AO results from several metabolic processes, sometimes combined with abiotic reactions. Physiological characteristics, including mechanisms for N2O production, vary within and between ammonia-oxidizing archaea (AOA) and bacteria (AOB) and comammox bacteria and N2O yield of AOB is higher than in the other two groups. There is also strong evidence for niche differentiation between AOA and AOB with respect to environmental conditions in natural and engineered environments. In particular, AOA are favored by low soil pH and AOA and AOB are, respectively, favored by low rates of ammonium supply, equivalent to application of slow-release fertilizer, or high rates of supply, equivalent to addition of high concentrations of inorganic ammonium or urea. These differences between AOA and AOB provide the potential for better fertilization strategies that could both increase fertilizer use efficiency and reduce N2O emissions from agricultural soils. This article reviews research on the biochemistry, physiology and ecology of AO and discusses the consequences for AO communities subjected to different agricultural practices and the ways in which this knowledge, coupled with improved methods for characterizing communities, might lead to improved fertilizer use efficiency and mitigation of N2O emissions.
••
TL;DR: COVID-19 misinformation exposure was associated with misinformation belief, while misinformation belief was associated with fewer preventive behaviors, and up-to-date public health strategies are required to counter the proliferation of misinformation.
Abstract: Background: Online misinformation proliferation during the COVID-19 pandemic has become a major public health concern.
Objective: We aimed to assess the prevalence of COVID-19 misinformation exposure and beliefs, the factors (including psychological distress) associated with misinformation exposure, and the associations of misinformation belief with COVID-19 knowledge and the number of preventive behaviors.
Methods: A cross-sectional online survey was conducted with 1049 South Korean adults in April 2020. Respondents were asked about receiving COVID-19 misinformation using 12 items identified by the World Health Organization. Logistic regression was used to compute adjusted odds ratios (aORs) for the association of receiving misinformation with sociodemographic characteristics, source of information, COVID-19 misinformation belief, and psychological distress, as well as the associations of COVID-19 misinformation belief with COVID-19 knowledge and the number of COVID-19 preventive behaviors among those who received the misinformation. All data were weighted according to the Korea census data in 2018.
Results: Overall, 67.78% (n=711) of respondents reported exposure to at least one COVID-19 misinformation item. Misinformation exposure was associated with younger age, higher education levels, and lower income. Sources of information associated with misinformation exposure were social networking services (aOR 1.67, 95% CI 1.20-2.32) and instant messaging (aOR 1.79, 1.27-2.51). Misinformation exposure was also associated with psychological distress including anxiety (aOR 1.80, 1.24-2.61), depressive (aOR 1.47, 1.09-2.00), and posttraumatic stress disorder symptoms (aOR 1.97, 1.42-2.73), as well as misinformation belief (aOR 7.33, 5.17-10.38). Misinformation belief was associated with poorer COVID-19 knowledge (high: aOR 0.62, 0.45-0.84) and fewer preventive behaviors (≥7 behaviors: aOR 0.54, 0.39-0.74).
Conclusions: COVID-19 misinformation exposure was associated with misinformation belief, while misinformation belief was associated with fewer preventive behaviors. Given the potential of misinformation to undermine global efforts in COVID-19 disease control, up-to-date public health strategies are required to counter the proliferation of misinformation.
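The adjusted odds ratios reported above come from logistic regression on the survey data. As a simpler illustration of the underlying quantity, the sketch below computes an unadjusted odds ratio and a Woolf 95% confidence interval from a 2x2 exposure-outcome table; the counts are made up for illustration and are not the study's data.

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: a = exposed with outcome,
    b = exposed without, c = unexposed with, d = unexposed without."""
    return (a * d) / (b * c)

def ci95(a, b, c, d):
    """Woolf's 95% confidence interval, computed on the log-odds scale."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)

# Illustrative counts only (not the survey's data)
a, b, c, d = 120, 591, 10, 328
print(odds_ratio(a, b, c, d), ci95(a, b, c, d))
```

A regression-based adjusted OR would additionally condition on covariates (age, education, income), but the exponentiated-log-odds interpretation is the same.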
••
Martin J. P. Sullivan, Simon L. Lewis +247 more authors • Institutions (104)
TL;DR: This synthesis of plot networks across climatic and biogeographic gradients shows that forest thermal sensitivity is dominated by high daytime temperatures, and biome-wide variation in tropical forest carbon stocks and dynamics shows long-term resilience to increasing high temperatures.
Abstract: The sensitivity of tropical forest carbon to climate is a key uncertainty in predicting global climate change. Although short-term drying and warming are known to affect forests, it is unknown if such effects translate into long-term responses. Here, we analyze 590 permanent plots measured across the tropics to derive the equilibrium climate controls on forest carbon. Maximum temperature is the most important predictor of aboveground biomass (−9.1 megagrams of carbon per hectare per degree Celsius), primarily by reducing woody productivity, and has a greater impact per °C in the hottest forests (>32.2°C). Our results nevertheless reveal greater thermal resilience than observations of short-term variation imply. To realize the long-term climate adaptation potential of tropical forests requires both protecting them and stabilizing Earth’s climate.
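The per-degree biomass sensitivity quoted above (−9.1 Mg C per hectare per °C) is a regression slope. The sketch below recovers a slope of that form with ordinary least squares on synthetic, noise-free data; the temperature range, intercept and slope are assumed for illustration only.

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Synthetic maximum temperatures (degC) and aboveground biomass (Mg C/ha),
# generated noise-free from an assumed slope of -9.1 and intercept of 450
temps = [26.0, 28.0, 30.0, 32.0, 34.0]
biomass = [450 - 9.1 * t for t in temps]
print(ols_slope(temps, biomass))  # recovers the generating slope, -9.1
```

The paper's analysis is far richer (multiple predictors, plot-level covariates), but the reported −9.1 figure is interpretable exactly as such a slope.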
••
University of California, Davis1, University of Eastern Finland2, National Institute for Health and Welfare3, University of Zurich4, Boston Children's Hospital5, Swiss Institute of Allergy and Asthma Research6, University of Lorraine7, Centre national de la recherche scientifique8, University Hospital Regensburg9, University of Marburg10, University of Helsinki11, University of Aberdeen12, Ludwig Maximilian University of Munich13
TL;DR: The gut microbiome may contribute to asthma protection through metabolites, supporting the concept of a gut-lung axis in humans.
Abstract: Growing up on a farm is associated with an asthma-protective effect, but the mechanisms underlying this effect are largely unknown. In the Protection against Allergy: Study in Rural Environments (PASTURE) birth cohort, we modeled maturation using 16S rRNA sequence data of the human gut microbiome in infants from 2 to 12 months of age. The estimated microbiome age (EMA) in 12-month-old infants was associated with previous farm exposure (β = 0.27 (0.12–0.43), P = 0.001, n = 618) and reduced risk of asthma at school age (odds ratio (OR) = 0.72 (0.56–0.93), P = 0.011). EMA mediated the protective farm effect by 19%. In a nested case–control sample (n = 138), we found inverse associations of asthma with the measured level of fecal butyrate (OR = 0.28 (0.09–0.91), P = 0.034), bacterial taxa that predict butyrate production (OR = 0.38 (0.17–0.84), P = 0.017) and the relative abundance of the gene encoding butyryl–coenzyme A (CoA):acetate–CoA-transferase, a major enzyme in butyrate metabolism (OR = 0.43 (0.19–0.97), P = 0.042). The gut microbiome may contribute to asthma protection through metabolites, supporting the concept of a gut–lung axis in humans. Growing up in the rich microbial environment of a farm strongly influences the maturation of the gut microbiome in the first year of life, which helps protect against the development of asthma in children.
••
TL;DR: The purpose of this study is to explore the possibilities for the application of laser therapy in medicine and dentistry by analyzing lasers’ underlying mechanism of action on different cells, with a special focus on stem cells and mechanisms of repair.
Abstract: The purpose of this study is to explore the possibilities for the application of laser therapy in medicine and dentistry by analyzing lasers' underlying mechanism of action on different cells, with a special focus on stem cells and mechanisms of repair. The interest in the application of laser therapy in medicine and dentistry has remarkably increased in the last decade. There are different types of lasers available and their usage is well defined by different parameters, such as: wavelength, energy density, power output, and duration of radiation. Laser irradiation can induce a photobiomodulatory (PBM) effect on cells and tissues, contributing to a directed modulation of cell behaviors, enhancing the processes of tissue repair. Photobiomodulation (PBM), also known as low-level laser therapy (LLLT), can induce cell proliferation and enhance stem cell differentiation. Laser therapy is a non-invasive method that contributes to pain relief and reduces inflammation, parallel to the enhanced healing and tissue repair processes. The application of these properties was employed and observed in the treatment of various diseases and conditions, such as diabetes, brain injury, spinal cord damage, dermatological conditions, oral irritation, and in different areas of dentistry.
••
Memorial Sloan Kettering Cancer Center1, University of Pittsburgh2, University of Texas MD Anderson Cancer Center3, Brigham and Women's Hospital4, University of Basel5, University of Colorado Boulder6, Fudan University7, Taipei Veterans General Hospital8, Royal Prince Alfred Hospital9, New Generation University College10, Mount Sinai Hospital11, University of Aberdeen12, All India Institute of Medical Sciences13, Shanghai Chest Hospital14, Kindai University15, New York University16, University of Turin17, University of Milan18, Ariès19, Mayo Clinic20, VU University Medical Center21, University of Toronto22, Katholieke Universiteit Leuven23, University of Zurich24
TL;DR: Detailed recommendations on how to process lung cancer resection specimens and to define pathologic response including MPR and CPR following neoadjuvant therapy are outlined to improve consistency of pathologic assessment of treatment response.
••
University of Aberdeen1, Joint Global Change Research Institute2, United Nations Economic Commission for Africa3, University of the West Indies4, Norwegian University of Science and Technology5, Ministry of Agriculture and Rural Development6, Makerere University7, Rutgers University8, International Food Policy Research Institute9, National Institute for Environmental Studies10, Institut national de la recherche agronomique11, International Trademark Association12, University of Bristol13, University of Virginia14, University of New England (Australia)15, University of Edinburgh16, Karlsruhe Institute of Technology17
TL;DR: A number of practices, such as increased food productivity, dietary change and reduced food loss and waste, can reduce demand for land conversion, thereby potentially freeing‐up land and creating opportunities for enhanced implementation of other practices, making them important components of portfolios of practices to address the combined land challenges.
Abstract: There is a clear need for transformative change in the land management and food production sectors to address the global land challenges of climate change mitigation, climate change adaptation, combatting land degradation and desertification, and delivering food security (referred to hereafter as "land challenges"). We assess the potential for 40 practices to address these land challenges and find that: Nine options deliver medium to large benefits for all four land challenges. A further two options have no global estimates for adaptation, but have medium to large benefits for all other land challenges. Five options have large mitigation potential (>3 Gt CO2 eq/year) without adverse impacts on the other land challenges. Five options have moderate mitigation potential, with no adverse impacts on the other land challenges. Sixteen practices have large adaptation potential (>25 million people benefit), without adverse side effects on other land challenges. Most practices can be applied without competing for available land. However, seven options could result in competition for land. A large number of practices do not require dedicated land, including several land management options, all value chain options, and all risk management options. Four options could greatly increase competition for land if applied at a large scale, though the impact is scale and context specific, highlighting the need for safeguards to ensure that expansion of land for mitigation does not impact natural systems and food security. A number of practices, such as increased food productivity, dietary change and reduced food loss and waste, can reduce demand for land conversion, thereby potentially freeing-up land and creating opportunities for enhanced implementation of other practices, making them important components of portfolios of practices to address the combined land challenges.
••
TL;DR: The review showed that the prevalence of polypharmacy varied from 10% to around 90% across populations, and that chronic conditions, demographics, socioeconomics and self-assessed health factors were independent predictors of polypharmacy.
Abstract: A high rate of polypharmacy is, in part, a consequence of the increasing proportion of multimorbidity in the ageing population worldwide. Our understanding of the potential harm of taking multiple medications in an older, multi-morbid population, who are likely to be on a polypharmacy regime, is limited. This is a narrative literature review that aims to appraise and summarise recent studies published about polypharmacy. We searched MEDLINE using the search terms polypharmacy (and its variations, e.g. multiple prescriptions, inappropriate drug use, etc.) in titles. Systematic reviews and original studies in English published between 2003 and 2018 were included. In this review, we provide current definitions of polypharmacy. We identify the determinants and prevalence of polypharmacy reported in different studies. Finally, we summarise some of the findings regarding the association between polypharmacy and health outcomes in older adults, with a focus on frailty, hospitalisation and mortality. Polypharmacy was most often defined in terms of the number of medications that are being taken by an individual at any given time. Our review showed that the prevalence of polypharmacy varied from 10% to around 90% across populations. Chronic conditions, demographics, socioeconomics and self-assessed health factors were independent predictors of polypharmacy. Polypharmacy was reported to be associated with various adverse outcomes after adjusting for health conditions. Optimising care for polypharmacy with valid, reliable measures, relevant to all patients, will improve the health outcomes of the older adult population.
••
TL;DR: Genomic feature, gene-expression and gene-set analyses revealed distinct biological signatures for each trait, highlighting different underlying biological pathways and increasing understanding of diabetes pathophysiology through the use of trans-ancestry studies for improved power and resolution.
Abstract: Glycaemic traits are used to diagnose and monitor type 2 diabetes, and cardiometabolic health. To date, most genetic studies of glycaemic traits have focused on individuals of European ancestry. Here, we aggregated genome-wide association studies in up to 281,416 individuals without diabetes (30% non-European ancestry) with fasting glucose, 2h-glucose post-challenge, glycated haemoglobin, and fasting insulin data. Trans-ancestry and single-ancestry meta-analyses identified 242 loci (99 novel; P
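Trans-ancestry meta-analyses of this kind typically combine per-ancestry effect estimates with inverse-variance weighting. The sketch below shows a minimal fixed-effect meta-analysis for a single variant; the betas and standard errors are made-up values for a hypothetical variant, not results from the study.

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted (fixed-effect) meta-analysis of
    per-study effect sizes and their standard errors."""
    weights = [1 / se ** 2 for se in ses]
    pooled_beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled_beta, pooled_se

# Made-up per-ancestry estimates for one hypothetical variant
betas = [0.05, 0.04, 0.06]
ses = [0.010, 0.020, 0.015]
b, s = fixed_effect_meta(betas, ses)
print(b, s)
```

The pooled standard error is always smaller than any single study's, which is the sense in which combining ancestries "improves power and resolution".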
••
Centre national de la recherche scientifique1, University of Tehran2, Université Paris-Saclay3, Higher University of San Andrés4, University of California, Davis5, Empresa Brasileira de Pesquisa Agropecuária6, National Agriculture and Food Research Organization7, University of Aberdeen8, Institut national de la recherche agronomique9, Landcare Research10, University of Vermont11
TL;DR: The objective of this paper is to present the aims of the initiative, to discuss critical issues and to present challenges for its implementation, and to advocate for collaboration between multiple parties in order to stimulate innovation and to initiate the transition of agricultural systems toward sustainability.
Abstract: The authors would like to acknowledge the executive secretariat of the 4p1000 initiative, Charlotte Verger and Claire Weill, for their valuable contributions during the preparation of this manuscript. The input of PS contributes to the UK NERC-funded Soils-R-GGREAT project (NE/P019455/1).
••
TL;DR: Patients with NC had a reduced risk of mortality compared with patients with CAC infection; however, caution should be taken when interpreting this finding.
••
TL;DR: Oral/systemic corticosteroids were commonly used for asthma management and were more frequently used in patients with severe asthma than in those with milder disease, and the risks of acute and chronic complications increase with the cumulative oral corticosteroid dosage.
Abstract: Systemic corticosteroid use to manage uncontrolled asthma and its associated healthcare burden may account for important health-related adverse effects. We conducted a systematic literature review to investigate the real-world extent and burden of systemic corticosteroid use in asthma. We searched MEDLINE and Embase databases to identify English-language articles published in 2010-2017, using search terms for asthma with keywords for oral corticosteroids and systemic corticosteroids. Observational studies, prescription database analyses, economic analyses, and surveys on oral/systemic corticosteroid use in children (>5 yr old), adolescents (12-17 yr old), and adults with asthma were included. We identified and reviewed 387 full-text articles, and our review included data from 139 studies. The included studies were conducted in Europe, North America, and Asia. Overall, oral/systemic corticosteroids were commonly used for asthma management and were more frequently used in patients with severe asthma than in those with milder disease. Long-term oral/systemic corticosteroid use was, in general, less frequent than short-term use. Compared with no use, long-term and repeated short-term oral/systemic corticosteroid use were associated with an increased risk of acute and chronic adverse events, even when doses were comparatively low. Greater oral/systemic corticosteroid exposure was also associated with increased costs and healthcare resource use. This review provides a comprehensive overview of oral/systemic corticosteroid use and associated adverse events for patients with all degrees of asthma severity and exposure duration. We report that oral/systemic corticosteroid use is prevalent in asthma management, and the risks of acute and chronic complications increase with the cumulative oral corticosteroid dosage.
••
TL;DR: Wang et al. simulated methane flow behavior through authentic kerogen-based circular nanopores using molecular dynamics (MD) for the first time, and developed a novel construction method to generate kerogen-based organic nanopores with the desired pore size and different kerogen types.