Showing papers from the University of Vermont published in 2019
••
TL;DR: In this article, the Event Horizon Telescope was used to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87.
Abstract: When surrounded by a transparent emission region, black holes are expected to reveal a dark shadow caused by gravitational light bending and photon capture at the event horizon. To image and study this phenomenon, we have assembled the Event Horizon Telescope, a global very long baseline interferometry array observing at a wavelength of 1.3 mm. This allows us to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87. We have resolved the central compact radio source as an asymmetric bright emission ring with a diameter of 42 ± 3 μas, which is circular and encompasses a central depression in brightness with a flux ratio ≳10:1. The emission ring is recovered using different calibration and imaging schemes, with its diameter and width remaining stable over four observations carried out on different days. Overall, the observed image is consistent with expectations for the shadow of a Kerr black hole as predicted by general relativity. The asymmetry in brightness in the ring can be explained in terms of relativistic beaming of the emission from a plasma rotating close to the speed of light around a black hole. We compare our images to an extensive library of ray-traced general-relativistic magnetohydrodynamic simulations of black holes and derive a central mass of M = (6.5 ± 0.7) × 10⁹ M☉. Our radio-wave observations thus provide powerful evidence for the presence of supermassive black holes in the centers of galaxies and as the central engines of active galactic nuclei. They also present a new tool to explore gravity in its most extreme limit and on a mass scale that was previously inaccessible.
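The quoted mass and ring diameter can be checked against each other with a back-of-envelope calculation. The sketch below is an illustration, not the paper's analysis: it uses the Schwarzschild shadow diameter 2√27·GM/c² and assumes a distance to M87 of about 16.8 Mpc, a figure not stated in this abstract.

```python
import math

# Physical constants (SI)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
PC = 3.086e16        # parsec, m
RAD_TO_UAS = 180 / math.pi * 3600 * 1e6  # radians -> microarcseconds

def shadow_diameter_uas(mass_solar, distance_mpc):
    """Apparent diameter of a Schwarzschild black hole shadow, in microarcseconds.

    Photon capture gives a shadow radius of sqrt(27) * GM/c^2, so the
    angular diameter is 2 * sqrt(27) * GM / (c^2 * D).
    """
    gm_over_c2 = G * mass_solar * M_SUN / c**2        # gravitational radius, m
    linear_diameter = 2 * math.sqrt(27) * gm_over_c2  # shadow diameter, m
    distance = distance_mpc * 1e6 * PC                # distance to source, m
    return linear_diameter / distance * RAD_TO_UAS

# M = 6.5e9 solar masses (from the abstract); D ~ 16.8 Mpc (assumed)
print(round(shadow_diameter_uas(6.5e9, 16.8), 1))
```

The result, roughly 40 μas, is consistent with the measured 42 ± 3 μas ring diameter.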
2,589 citations
••
University of Oklahoma1, University of Vermont2, University of Missouri–Kansas City3, Children's Mercy Hospital4, Boston Children's Hospital5, Harvard University6, University of North Carolina at Chapel Hill7, Ohio University8, American Academy of Pediatrics9, University of Cincinnati10, Cincinnati Children's Hospital Medical Center11, American Academy of Family Physicians12, Centers for Disease Control and Prevention13, Vanderbilt University14, Northwestern University15, American Academy of Child and Adolescent Psychiatry16
TL;DR: Attention-deficit/hyperactivity disorder (ADHD) is the most common neurobehavioral disorder of childhood and can profoundly affect the academic achievement, well-being, and social interactions of children.
Abstract: Attention-deficit/hyperactivity disorder (ADHD) is the most common neurobehavioral disorder of childhood and can profoundly affect the academic achievement, well-being, and social interactions of children. The American Academy of Pediatrics first published clinical recommendations for the diagnosis and evaluation of ADHD in children in 2000; recommendations for treatment followed in 2001.
1,657 citations
••
University of New South Wales1, Oregon State University2, Braunschweig University of Technology3, University of California, San Diego4, Norwegian University of Life Sciences5, University of Liverpool6, Max Planck Society7, University of Tasmania8, University of Vermont9, ETH Zurich10, Stazione Zoologica Anton Dohrn11, Montana State University12, University of Amsterdam13, University of Southern California14, Pacific Northwest National Laboratory15, University of Hawaii at Manoa16, University of California, Berkeley17, Marine Biological Laboratory18, University of California, Irvine19, University of Georgia20, California Institute of Technology21, University of Edinburgh22, Ohio State University23, University of Sydney24, University of Alberta25, Georgia Institute of Technology26, Australian Institute of Marine Science27, University of Melbourne28, University of Texas Medical Branch29, University of Queensland30
TL;DR: This Consensus Statement documents the central role and global importance of microorganisms in climate change biology and puts humanity on notice that the impact of climate change will depend heavily on responses of microorganisms, which are essential for achieving an environmentally sustainable future.
Abstract: In the Anthropocene, in which we now live, climate change is impacting most life on Earth. Microorganisms support the existence of all higher trophic life forms. To understand how humans and other life forms on Earth (including those we are yet to discover) can withstand anthropogenic climate change, it is vital to incorporate knowledge of the microbial 'unseen majority'. We must learn not just how microorganisms affect climate change (including production and consumption of greenhouse gases) but also how they will be affected by climate change and other human activities. This Consensus Statement documents the central role and global importance of microorganisms in climate change biology. It also puts humanity on notice that the impact of climate change will depend heavily on responses of microorganisms, which are essential for achieving an environmentally sustainable future.
963 citations
••
University of Vermont1, Karolinska University Hospital2, Universidade Federal de Minas Gerais3, Universidade Católica de Pelotas4, University of Tokyo5, Fujita Health University6, Central University of Venezuela7, University of Trieste8, University of Cape Town9, Monash University10, Ohio State University11, University of Alberta12, Hospital General de México13, University of Waterloo14, American Society for Parenteral and Enteral Nutrition15, Brigham and Women's Hospital16, Saint Louis University Hospital17, Sapienza University of Rome18, La Trobe University19, Khon Kaen University20, HAN University of Applied Sciences21, Rabin Medical Center22, University of Illinois at Chicago23, Pontifical Catholic University of Chile24, University of São Paulo25, Peking Union Medical College Hospital26, University of Pennsylvania27, Free University of Brussels28
TL;DR: A consensus scheme for diagnosing malnutrition in adults in clinical settings on a global scale is proposed and it is recommended that the etiologic criteria be used to guide intervention and anticipated outcomes.
Abstract: Rationale: This initiative is focused on building a global consensus around core diagnostic criteria for malnutrition in adults in clinical settings. Methods: In January 2016, the Global Leadership Initiative on Malnutrition (GLIM) was convened by several of the major global clinical nutrition societies. GLIM appointed a core leadership committee and a supporting working group with representatives bringing additional global diversity and expertise. Empirical consensus was reached through a series of face-to-face meetings, telephone conferences, and e-mail communications. Results: A two-step approach for the malnutrition diagnosis was selected, i.e., first, screening to identify "at risk" status by the use of any validated screening tool, and second, assessment for diagnosis and grading of the severity of malnutrition. The malnutrition criteria for consideration were retrieved from existing approaches for screening and assessment. Potential criteria were subjected to a ballot among the GLIM core and supporting working group members. The top five ranked criteria included three phenotypic criteria (non-volitional weight loss, low body mass index, and reduced muscle mass) and two etiologic criteria (reduced food intake or assimilation, and inflammation or disease burden). To diagnose malnutrition, at least one phenotypic criterion and one etiologic criterion should be present. Phenotypic metrics for grading severity as Stage 1 (moderate) and Stage 2 (severe) malnutrition are proposed. It is recommended that the etiologic criteria be used to guide intervention and anticipated outcomes. The recommended approach supports classification of malnutrition into four etiology-related diagnosis categories. Conclusion: A consensus scheme for diagnosing malnutrition in adults in clinical settings on a global scale is proposed. Next steps are to secure further collaboration and endorsements from leading nutrition professional societies, to identify overlaps with syndromes like cachexia and sarcopenia, and to promote dissemination, validation studies, and feedback. The diagnostic construct should be re-considered every 3–5 years.
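The two-step diagnostic rule summarized in the abstract (at least one phenotypic and at least one etiologic criterion) can be sketched as a small decision function. The criterion names below are illustrative shorthand, not terminology from the GLIM document:

```python
# Phenotypic and etiologic criteria from the GLIM consensus (names are shorthand)
PHENOTYPIC = {"weight_loss", "low_bmi", "reduced_muscle_mass"}
ETIOLOGIC = {"reduced_intake_or_assimilation", "inflammation_or_disease_burden"}

def glim_malnutrition(criteria_met: set) -> bool:
    """Diagnose malnutrition when >=1 phenotypic AND >=1 etiologic criterion is met."""
    return bool(criteria_met & PHENOTYPIC) and bool(criteria_met & ETIOLOGIC)

print(glim_malnutrition({"low_bmi", "inflammation_or_disease_burden"}))  # True: both arms met
print(glim_malnutrition({"low_bmi", "weight_loss"}))                     # False: phenotypic only
```

Note that the consensus also grades severity (Stage 1 vs. Stage 2) using phenotypic metrics, which this sketch omits.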
885 citations
••
Karolinska University Hospital1, Uppsala University2, University of Vermont3, Universidade Federal de Minas Gerais4, Universidade Católica de Pelotas5, University of Tokyo6, Fujita Health University7, Central University of Venezuela8, University of Trieste9, University of Cape Town10, University of Warwick11, Monash University12, Ohio State University13, University of Alberta14, Hospital General de México15, University of Waterloo16, American Society for Parenteral and Enteral Nutrition17, Brigham and Women's Hospital18, Saint Louis University Hospital19, Sapienza University of Rome20, Khon Kaen University21, VU University Amsterdam22, HAN University of Applied Sciences23, Tel Aviv University24, Rabin Medical Center25, University of Illinois at Chicago26, Pontifical Catholic University of Chile27, University of São Paulo28, Peking Union Medical College Hospital29, Free University of Brussels30, University of Pennsylvania31
TL;DR: This initiative is focused on building a global consensus around core diagnostic criteria for malnutrition in adults in clinical settings.
Abstract: Rationale: This initiative is focused on building a global consensus around core diagnostic criteria for malnutrition in adults in clinical settings.
827 citations
••
TL;DR: The Event Horizon Telescope (EHT) as mentioned in this paper is a very long baseline interferometry (VLBI) array that comprises millimeter and submillimeter-wavelength telescopes separated by distances comparable to the diameter of the Earth.
Abstract: The Event Horizon Telescope (EHT) is a very long baseline interferometry (VLBI) array that comprises millimeter- and submillimeter-wavelength telescopes separated by distances comparable to the diameter of the Earth. At a nominal operating wavelength of ~1.3 mm, EHT angular resolution (λ/D) is ~25 μas, which is sufficient to resolve nearby supermassive black hole candidates on spatial and temporal scales that correspond to their event horizons. With this capability, the EHT scientific goals are to probe general relativistic effects in the strong-field regime and to study accretion and relativistic jet formation near the black hole boundary. In this Letter we describe the system design of the EHT, detail the technology and instrumentation that enable observations, and provide measures of its performance. Meeting the EHT science objectives has required several key developments that have facilitated the robust extension of the VLBI technique to EHT observing wavelengths and the production of instrumentation that can be deployed on a heterogeneous array of existing telescopes and facilities. To meet sensitivity requirements, high-bandwidth digital systems were developed that process data at rates of 64 Gbit s⁻¹, exceeding those of currently operating cm-wavelength VLBI arrays by more than an order of magnitude. Associated improvements include the development of phasing systems at array facilities, new receiver installation at several sites, and the deployment of hydrogen maser frequency standards to ensure coherent data capture across the array. These efforts led to the coordination and execution of the first Global EHT observations in 2017 April, and to event-horizon-scale imaging of the supermassive black hole candidate in M87.
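The quoted ~25 μas resolution follows directly from the diffraction limit λ/D. A quick sanity check, assuming a longest projected baseline of roughly 10,000 km (comparable to, but somewhat less than, the Earth's diameter, a value assumed here rather than taken from the abstract):

```python
import math

RAD_TO_UAS = 180 / math.pi * 3600 * 1e6  # radians -> microarcseconds

wavelength = 1.3e-3   # observing wavelength, m
baseline = 1.0e7      # assumed longest projected baseline, m (~10,000 km)

# Diffraction-limited angular resolution, lambda / D
resolution_uas = wavelength / baseline * RAD_TO_UAS
print(round(resolution_uas, 1))  # ~27 microarcseconds
```

The result is in line with the ~25 μas figure quoted in the abstract.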
756 citations
••
Daniel Taliun1, Daniel N. Harris2, Michael D. Kessler2, Jedidiah Carlson1 +191 more • Institutions (61)
TL;DR: The nearly complete catalog of genetic variation in TOPMed studies provides unique opportunities for exploring the contributions of rare and non-coding sequence variants to phenotypic variation as well as resources and early insights from the sequence data.
Abstract: The Trans-Omics for Precision Medicine (TOPMed) program seeks to elucidate the genetic architecture and disease biology of heart, lung, blood, and sleep disorders, with the ultimate goal of improving diagnosis, treatment, and prevention. The initial phases of the program focus on whole-genome sequencing of individuals with rich phenotypic data and diverse backgrounds. Here, we describe TOPMed goals and design as well as resources and early insights from the sequence data. The resources include a variant browser, a genotype imputation panel, and sharing of genomic and phenotypic data via dbGaP. In 53,581 TOPMed samples, >400 million single-nucleotide and insertion/deletion variants were detected by alignment with the reference genome. Additional novel variants are detectable through assembly of unmapped reads and customized analysis in highly variable loci. Among the >400 million variants detected, 97% have frequency
662 citations
••
King Mongkut's University of Technology Thonburi1, Ferdowsi University of Mashhad2, Xi'an Jiaotong University3, University of Monastir4, Shahid Beheshti University5, University of Rennes6, Clarkson University7, North Carolina State University8, University of Vermont9, Iran University of Science and Technology10, University of New South Wales11, Royal Society12, King Abdulaziz University13, Quaid-i-Azam University14, University of Tehran15, Babeș-Bolyai University16
TL;DR: A review of the latest developments in modeling of nanofluid flows and heat transfer with an emphasis on 3D simulations can be found in this paper, where the main models used to calculate the thermophysical properties of Nanofluids are reviewed.
659 citations
••
Ghent University1, University of California, San Diego2, Leiden University3, Dresden University of Technology4, Stanford University5, University of Maryland, College Park6, Indiana University7, University of Cambridge8, Cardiff University9, University of Western Ontario10, Monash University, Clayton campus11, University of Toronto12, University of Vermont13, University of Oregon14, University of Tasmania15, University of Oslo16, Utrecht University17, Katholieke Universiteit Leuven18, Yale University19, Vanderbilt University20, University of Amsterdam21, Anglia Ruskin University22, Indian Institute of Science23, Queen's University24, King's College London25, Michigan State University26, University of Iowa27, Trinity College, Dublin28
TL;DR: The goal is to facilitate a more accurate use of the stop-signal task and provide user-friendly open-source resources intended to inform statistical-power considerations, facilitate the correct implementation of the task, and assist in proper data analysis.
Abstract: Response inhibition is essential for navigating everyday life. Its derailment is considered integral to numerous neurological and psychiatric disorders, and more generally, to a wide range of behavioral and health problems. Response-inhibition efficiency furthermore correlates with treatment outcome in some of these conditions. The stop-signal task is an essential tool to determine how quickly response inhibition is implemented. Despite its apparent simplicity, there are many features (ranging from task design to data analysis) that vary across studies in ways that can easily compromise the validity of the obtained results. Our goal is to facilitate a more accurate use of the stop-signal task. To this end, we provide 12 easy-to-implement consensus recommendations and point out the problems that can arise when they are not followed. Furthermore, we provide user-friendly open-source resources intended to inform statistical-power considerations, facilitate the correct implementation of the task, and assist in proper data analysis.
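The task's core estimate, stop-signal reaction time (SSRT), is commonly computed with the integration method. The sketch below is a minimal illustration of that idea under simplified assumptions, omitting refinements the consensus recommendations discuss (such as replacing go omissions with the maximum RT):

```python
def ssrt_integration(go_rts, p_respond_given_signal, mean_ssd):
    """Estimate stop-signal reaction time (SSRT) via the integration method.

    The nth fastest go RT, with n = p(respond|signal) * number of go trials,
    estimates the finishing time of the go process that stopping just beats;
    subtracting the mean stop-signal delay (SSD) yields SSRT.
    """
    rts = sorted(go_rts)
    n = int(round(p_respond_given_signal * len(rts)))
    n = min(max(n, 1), len(rts))  # clamp to a valid rank
    return rts[n - 1] - mean_ssd

# Hypothetical data: 100 go RTs evenly spaced from 400 to 598 ms,
# 50% failed stops, mean SSD of 200 ms.
ssrt = ssrt_integration(list(range(400, 600, 2)), 0.5, 200)
print(ssrt)  # 298 ms
```

With half the stop trials failing, the method picks the median go RT (~498 ms here), so the estimated SSRT is roughly 300 ms, a typical value for healthy adults.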
617 citations
••
Stanford University1, University of North Carolina at Chapel Hill2, Fred Hutchinson Cancer Research Center3, Vanderbilt University4, Anschutz Medical Campus5, University of Southern California6, Icahn School of Medicine at Mount Sinai7, University of California, San Francisco8, University of Hawaii9, University of Washington10, University of Texas Health Science Center at Houston11, University of Potsdam12, Johns Hopkins University13, Pennsylvania State University14, National Institutes of Health15, University of California, Davis16, Ohio State University17, Cancer Prevention Institute of California18, University of Vermont19, Rutgers University20
TL;DR: The value of diverse, multi-ethnic participants in large-scale genomic studies is demonstrated and evidence of effect-size heterogeneity across ancestries for published GWAS associations, substantial benefits for fine-mapping using diverse cohorts and insights into clinical implications are shown.
Abstract: Genome-wide association studies (GWAS) have laid the foundation for investigations into the biology of complex traits, drug development and clinical guidelines. However, the majority of discovery efforts are based on data from populations of European ancestry1-3. In light of the differential genetic architecture that is known to exist between populations, bias in representation can exacerbate existing disease and healthcare disparities. Critical variants may be missed if they have a low frequency or are completely absent in European populations, especially as the field shifts its attention towards rare variants, which are more likely to be population-specific4-10. Additionally, effect sizes and risk prediction scores derived in one population may not accurately extrapolate to other populations11,12. Here we demonstrate the value of diverse, multi-ethnic participants in large-scale genomic studies. The Population Architecture using Genomics and Epidemiology (PAGE) study conducted a GWAS of 26 clinical and behavioural phenotypes in 49,839 non-European individuals. Using strategies tailored for analysis of multi-ethnic and admixed populations, we describe a framework for analysing diverse populations, identify 27 novel loci and 38 secondary signals at known loci, as well as replicate 1,444 GWAS catalogue associations across these traits. Our data show evidence of effect-size heterogeneity across ancestries for published GWAS associations, substantial benefits for fine-mapping using diverse cohorts and insights into clinical implications. In the United States, where minority populations have a disproportionately higher burden of chronic conditions13, the lack of representation of diverse populations in genetic research will result in inequitable access to precision medicine for those with the highest burden of disease. We strongly advocate for continued, large genome-wide efforts in diverse populations to maximize genetic discovery and reduce health disparities.
591 citations
••
TL;DR: Systemic administration of a brain-penetrant selenopeptide activates homeostatic transcription to inhibit cell death and improves function when delivered after hemorrhagic or ischemic stroke.
••
Xi'an Jiaotong University1, Ferdowsi University of Mashhad2, King Mongkut's University of Technology Thonburi3, University of Monastir4, Shahid Beheshti University5, University of Rennes6, Clarkson University7, North Carolina State University8, University of Vermont9, University of New South Wales10, Khalifa University11, Royal Society12, Quaid-i-Azam University13, King Abdulaziz University14, University of Tehran15, Babeș-Bolyai University16
TL;DR: In this paper, the authors present a review of the main computational methods for solving the transport equations associated with nanofluid flow, including finite difference, finite volume, finite element, lattice Boltzmann methods, and Lagrangian methods.
••
University of California, San Diego1, McGill University2, Oregon Health & Science University3, Florida International University4, Yale University5, Washington University in St. Louis6, Virginia Commonwealth University7, University of Vermont8, University of Michigan9, Medical University of South Carolina10, National Institutes of Health11, SRI International12, University of Southern California13, McGovern Institute for Brain Research14, Harvard University15, Medical College of Wisconsin16, University of California, Irvine17, University of California, Los Angeles18, University of California, San Francisco19, University of Colorado Boulder20, University of Florida21, University of Maryland, Baltimore22, University of Massachusetts Boston23, University of Minnesota24, University of Pittsburgh25, University of Rochester26, University of Tennessee27, University of Utah28, University of Wisconsin–Milwaukee29, United States Department of Veterans Affairs30, Boston University31
TL;DR: The baseline neuroimaging processing and subject-level analysis methods used by the Adolescent Brain Cognitive Development Study are described to be a resource of unprecedented scale and depth for studying typical and atypical development.
••
Tufts University1, Harvard University2, National Institutes of Health3, Maine Medical Center4, HealthPartners5, Duke University6, University of Nebraska Medical Center7, Baylor College of Medicine8, Memorial Hospital of South Bend9, Pennington Biomedical Research Center10, University of Tennessee Health Science Center11, Cleveland Clinic12, Stanford University13, Kaiser Permanente14, University of Vermont15, Lenox Hill Hospital16, Emory University17, Medical University of South Carolina18, University of Southern California19, United States Department of Veterans Affairs20, Translational Research Institute21, University of Colorado Denver22, University of Kansas23
TL;DR: Among persons at high risk for type 2 diabetes not selected for vitamin D insufficiency, vitamin D3 supplementation at a dose of 4000 IU per day did not result in a significantly lower risk of diabetes than placebo.
Abstract: Background Observational studies support an association between a low blood 25-hydroxyvitamin D level and the risk of type 2 diabetes. However, whether vitamin D supplementation lowers the...
••
Johns Hopkins University School of Medicine1, Utrecht University2, University of Turin3, Agency for Science, Technology and Research4, University of Modena and Reggio Emilia5, Paracelsus Private Medical University of Salzburg6, La Trobe University7, St. George's University8, Health Sciences Authority9, Boston Children's Hospital10, University of Pittsburgh11, Kyoto University12, National University of Singapore13, University of Vermont14, Southern Medical University15, University of Duisburg-Essen16
TL;DR: MSC-sEVs should be defined by quantifiable metrics that identify the cellular origin of the sEVs in a preparation, the presence of lipid-membrane vesicles, and the degree of physical and biochemical integrity of the vesicles.
Abstract: Small extracellular vesicles (sEVs) from mesenchymal stromal/stem cells (MSCs) are transiting rapidly towards clinical applications. However, discrepancies and controversies about the biology, functions, and potency of MSC-sEVs have arisen due to several factors: the diversity of MSCs and their preparation; various methods of sEV production and separation; a lack of standardized quality assurance assays; and limited reproducibility of in vitro and in vivo functional assays. To address these issues, members of four societies (SOCRATES, ISEV, ISCT and ISBT) propose specific harmonization criteria for MSC-sEVs to facilitate data sharing and comparison, which should help to advance the field towards clinical applications. Specifically, MSC-sEVs should be defined by quantifiable metrics to identify the cellular origin of the sEVs in a preparation, presence of lipid-membrane vesicles, and the degree of physical and biochemical integrity of the vesicles. For practical purposes, new MSC-sEV preparations might also be measured against a well-characterized MSC-sEV biological reference. The ultimate goal of developing these metrics is to map aspects of MSC-sEV biology and therapeutic potency onto quantifiable features of each preparation.
••
Vanderbilt University1, Mayo Clinic2, University of Wisconsin-Madison3, University of Texas Southwestern Medical Center4, Stanford University5, University of Alabama at Birmingham6, Washington University in St. Louis7, Medical College of Wisconsin8, Case Western Reserve University9, Medical University of South Carolina10, Duke University11, Oregon Health & Science University12, University of Vermont13
TL;DR: Patients with MM refractory to CD38 MoAB have a poor prognosis, and this study provides a benchmark for new therapies to be tested in this population.
Abstract: The introduction of CD38-targeting monoclonal antibodies (CD38 MoABs), daratumumab and isatuximab, has significantly impacted the management of patients with multiple myeloma (MM). Outcomes of patients with MM refractory to CD38 MoABs have not been described. We analyzed outcomes of 275 MM patients at 14 academic centers with disease refractory to CD38 MoABs. Median interval between MM diagnosis and refractoriness to CD38 MoAB (T0) was 50.1 months. The median overall survival (OS) from T0 for the entire cohort was 8.6 [95% C.I. 7.5–9.9] months, ranging from 11.2 months for patients not simultaneously refractory to an immunomodulatory (IMiD) agent and a proteasome inhibitor (PI) to 5.6 months for "penta-refractory" patients (refractory to CD38 MoAB, 2 PIs and 2 IMiDs). At least one subsequent treatment regimen was employed after T0 in 249 (90%) patients. Overall response rate to the first regimen after T0 was 31%, with median progression-free survival (PFS) and OS of 3.4 and 9.3 months, respectively. PFS was best achieved with combinations of carfilzomib and alkylator (median 5.7 months), and daratumumab and IMiD (median 4.5 months). Patients with MM refractory to CD38 MoAB have a poor prognosis, and this study provides a benchmark for new therapies to be tested in this population.
••
Henan University1, Hebei University2, Peking University3, Chinese Academy of Sciences4, Colorado State University5, University of Vermont6, University of Antwerp7, University of Tasmania8, Auckland University of Technology9, University of Copenhagen10, Swedish University of Agricultural Sciences11, East China Normal University12, Northern Arizona University13, Villanova University14, University of Bern15, ETH Zurich16, Purdue University17, Marine Biological Laboratory18, Michigan State University19, Iowa State University20, Environmental Molecular Sciences Laboratory21, University of California, Berkeley22, Lawrence Berkeley National Laboratory23, United States Department of Agriculture24, Boston University25, Virginia Tech26, Oak Ridge National Laboratory27, Indiana University28, Commonwealth Scientific and Industrial Research Organisation29, Binzhou University30
TL;DR: There is an urgent need to explore the interactions among multiple global change drivers in underrepresented regions such as semi-arid ecosystems, forests in the tropics and subtropics, and Arctic tundra when forecasting future terrestrial carbon-climate feedback.
Abstract: Direct quantification of terrestrial biosphere responses to global change is crucial for projections of future climate change in Earth system models. Here, we synthesized ecosystem carbon-cycling data from 1,119 experiments performed over the past four decades concerning changes in temperature, precipitation, CO2 and nitrogen across major terrestrial vegetation types of the world. Most experiments manipulated single rather than multiple global change drivers in temperate ecosystems of the USA, Europe and China. The magnitudes of warming and elevated CO2 treatments were consistent with the ranges of future projections, whereas those of precipitation changes and nitrogen inputs often exceeded the projected ranges. Increases in global change drivers consistently accelerated carbon-cycle processes, whereas decreased precipitation slowed them. Nonlinear (including synergistic and antagonistic) effects among global change drivers were rare. Belowground carbon allocation responded negatively to increased precipitation and nitrogen addition and positively to decreased precipitation and elevated CO2. The sensitivities of carbon variables to multiple global change drivers depended on the background climate and ecosystem condition, suggesting that Earth system models should be evaluated using site-specific conditions for best uses of this large dataset. Together, this synthesis underscores an urgent need to explore the interactions among multiple global change drivers in underrepresented regions such as semi-arid ecosystems, forests in the tropics and subtropics, and Arctic tundra when forecasting future terrestrial carbon-climate feedback.
••
TL;DR: In this paper, the authors reviewed observed and projected impacts on river flows along both the Mekong mainstream and its tributaries, including the effects of diversions and inter-basin transfers.
••
College of William & Mary1, Macquarie University2, University of Kansas3, University of Amsterdam4, Pennsylvania State University5, University at Albany, SUNY6, Oklahoma State University–Stillwater7, University of Maryland, College Park8, University of Arizona9, Purdue University10, University of New South Wales11, Vanderbilt University12, Université de Montréal13, University of South Florida14, University of Utah15, University of Minnesota16, University of Liverpool17, Northwestern University18, Maastricht University19, King's College London20, Emory University21, University of Pittsburgh22, University of Kassel23, University of Toronto24, Southern Methodist University25, University of Hawaii at Manoa26, University of Notre Dame27, Medical Research Council28, University of California, Davis29, University of Vermont30, Georgia State University31, Florida State University32, University of North Texas33, Stony Brook University34
TL;DR: The Hierarchical Taxonomy of Psychopathology (HiTOP) as discussed by the authors is based on empirical patterns of co-occurrence among psychological symptoms, and it has the potential to accelerate and improve research on mental health problems as well as efforts to more effectively assess, prevent, and treat mental illness.
Abstract: For more than a century, research on psychopathology has focused on categorical diagnoses. Although this work has produced major discoveries, growing evidence points to the superiority of a dimensional approach to the science of mental illness. Here we outline one such dimensional system-the Hierarchical Taxonomy of Psychopathology (HiTOP)-that is based on empirical patterns of co-occurrence among psychological symptoms. We highlight key ways in which this framework can advance mental-health research, and we provide some heuristics for using HiTOP to test theories of psychopathology. We then review emerging evidence that supports the value of a hierarchical, dimensional model of mental illness across diverse research areas in psychological science. These new data suggest that the HiTOP system has the potential to accelerate and improve research on mental-health problems as well as efforts to more effectively assess, prevent, and treat mental illness.
••
Institut national de la recherche agronomique1, CGIAR2, International Institute for Applied Systems Analysis3, University of Vermont4, Institut de recherche pour le développement5, Centre de coopération internationale en recherche agronomique pour le développement6, Versailles Saint-Quentin-en-Yvelines University7, University of Aberdeen8, Ohio State University9
TL;DR: In this article, the authors discussed the potential benefits of soil organic matter (SOM) stewardship for both degraded and healthy soils along contrasting spatial scales (field, farm, landscape and country) and temporal (year to century) horizons.
Abstract: At the 21st session of the United Nations Framework Convention on Climate Change (UNFCCC, COP21), a voluntary action plan, the ‘4 per 1000 Initiative: Soils for Food Security and Climate’ was proposed under the Agenda for Action. The Initiative underlines the role of soil organic matter (SOM) in addressing the three-fold challenge of food and nutritional security, adaptation to climate change and mitigation of human-induced greenhouse gas (GHG) emissions. It sets an ambitious aspirational target of a 4 per 1000 (i.e. 0.4%) rate of annual increase in global soil organic carbon (SOC) stocks, with a focus on agricultural lands where farmers would ensure the carbon stewardship of soils, much as they manage day-to-day multipurpose production systems in a changing environment. In this paper, the opportunities and challenges for the 4 per 1000 initiative are discussed. We show that the 4 per 1000 target, calculated relative to global topsoil SOC stocks, is consistent with literature estimates of the technical potential for SOC sequestration, though the achievable potential is likely to be substantially lower given socio-economic constraints. We calculate that land-based negative emissions from additional SOC sequestration could significantly contribute to reducing the anthropogenic CO2 equivalent emission gap identified from Nationally Determined Contributions pledged by countries to stabilize global warming levels below 2 °C or even 1.5 °C under the Paris agreement on climate. The 4 per 1000 target could be implemented by taking into account differentiated SOC stock baselines, reversing the current trend of huge soil CO2 losses, e.g. from agriculture encroaching on peatland soils. We further discuss the potential benefits of SOC stewardship for both degraded and healthy soils along contrasting spatial scales (field, farm, landscape and country) and temporal (year to century) horizons.
Last, we present some of the implications relative to non-CO2 GHGs emissions, water and nutrients use as well as co-benefits for crop yields and climate change adaptation. We underline the considerable challenges associated with the non-permanence of SOC stocks and show how the rates of adoption and the duration of improved soil management practices could alter the global impacts of practices under the 4 per 1000 initiative. We conclude that the 4 per 1000 initiative has potential to support multiple sustainable development goals (SDGs) of the 2030 Agenda. It can be regarded as no-regret since increasing SOC in agricultural soils will contribute to food security benefits that will enhance resilience to climate change. However, social, economic and environmental safeguards will be needed to ensure an equitable and sustainable implementation of the 4 per 1000 target.
••
TL;DR: In the largest and most comprehensive socioeconomic-environmental dataset yet assembled, there is no evidence of negative PA impacts, and consistent statistical evidence suggests that PAs can positively affect human well-being.
Abstract: Protected areas (PAs) are fundamental for biodiversity conservation, yet their impacts on nearby residents are contested. We synthesized environmental and socioeconomic conditions of >87,000 children in >60,000 households situated either near or far from >600 PAs within 34 developing countries. We used quasi-experimental hierarchical regression to isolate the impact of living near a PA on several aspects of human well-being. Households near PAs with tourism had higher wealth levels (by 17%) and a lower likelihood of poverty (by 16%) than similar households living far from PAs. Children under 5 years old living near multiple-use PAs with tourism also had higher height-for-age scores (by 10%) and were less likely to be stunted (by 13%) than similar children living far from PAs. In the largest and most comprehensive socioeconomic-environmental dataset yet assembled, we found no evidence of negative PA impacts and consistent statistical evidence to suggest PAs can positively affect human well-being.
••
International Crops Research Institute for the Semi-Arid Tropics; Institut de recherche pour le développement; University of Western Australia; Osmania University; Indian Council of Agricultural Research; Central Agricultural University; Ethiopian Institute of Agricultural Research; Egerton University; University of Agricultural Sciences, Bangalore; Indian Agricultural Research Institute; Seoul National University; University of California, Davis; International Maize and Wheat Improvement Center; University of Missouri; South Australian Research and Development Institute; University of Adelaide; University of Vermont; Beijing Genomics Institute
TL;DR: This study establishes a foundation for large-scale characterization of germplasm and population genomics, and a resource for trait dissection, accelerating genetic gains in future chickpea breeding.
Abstract: We report a map of 4.97 million single-nucleotide polymorphisms of the chickpea from whole-genome resequencing of 429 lines sampled from 45 countries. We identified 122 candidate regions with 204 genes under selection during chickpea breeding. Our data suggest the Eastern Mediterranean as the primary center of origin and migration route of chickpea from the Mediterranean/Fertile Crescent to Central Asia, and probably in parallel from Central Asia to East Africa (Ethiopia) and South Asia (India). Genome-wide association studies identified 262 markers and several candidate genes for 13 traits. Our study establishes a foundation for large-scale characterization of germplasm and population genomics, and a resource for trait dissection, accelerating genetic gains in future chickpea breeding.
••
University of North Carolina at Chapel Hill; Harvard University; University of Washington; University of Illinois at Chicago; University of Kentucky; University of Colorado Denver; Johns Hopkins University; Fred Hutchinson Cancer Research Center; University of Texas at Austin; University of Texas Health Science Center at Houston; Wake Forest University; Rutgers University; Brigham and Women's Hospital; Broad Institute; Kaiser Permanente; Boston University; University of Vermont; University of Michigan; Tulane University; University of Alabama at Birmingham; Albert Einstein College of Medicine; Icahn School of Medicine at Mount Sinai; Oklahoma Medical Research Foundation; University of Minnesota; University of Virginia; Los Angeles Biomedical Research Institute; United States Department of Veterans Affairs; University of Mississippi Medical Center; University of Wisconsin–Milwaukee
TL;DR: It is demonstrated that using TOPMed sequencing data as the imputation reference panel improves genotype imputation quality in admixed African and Hispanic/Latino samples with genome-wide genotyping array data, which subsequently enhances gene-mapping power for complex traits.
Abstract: Most genome-wide association and fine-mapping studies to date have been conducted in individuals of European descent, and genetic studies of populations of Hispanic/Latino and African ancestry are limited. In addition, these populations have more complex linkage disequilibrium structure. In order to better define the genetic architecture of these understudied populations, we leveraged >100,000 phased sequences available from deep-coverage whole genome sequencing through the multi-ethnic NHLBI Trans-Omics for Precision Medicine (TOPMed) program to impute genotypes into admixed African and Hispanic/Latino samples with genome-wide genotyping array data. We demonstrated that using TOPMed sequencing data as the imputation reference panel improves genotype imputation quality in these populations, which subsequently enhanced gene-mapping power for complex traits. For rare variants with minor allele frequency (MAF) 86%. Subsequent association analyses of TOPMed reference panel-imputed genotype data with hematological traits (hemoglobin (HGB), hematocrit (HCT), and white blood cell count (WBC)) in ~21,600 African-ancestry and ~21,700 Hispanic/Latino individuals identified associations with two rare variants in the HBB gene (rs33930165 with higher WBC [p = 8.8x10-15] in African populations, rs11549407 with lower HGB [p = 1.5x10-12] and HCT [p = 8.8x10-10] in Hispanics/Latinos). By comparison, neither variant would have been genome-wide significant if either 1000 Genomes Project Phase 3 or Haplotype Reference Consortium reference panels had been used for imputation. Our findings highlight the utility of the TOPMed imputation reference panel for identification of novel rare variant associations not previously detected in similarly sized genome-wide studies of under-represented African and Hispanic/Latino populations.
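Imputation quality in studies like this is usually summarized by a dosage r² statistic: the squared correlation between imputed allele dosages and genotypes across samples. The sketch below is a simplified illustration of that idea against known true genotypes (the function name is hypothetical, and imputation servers report a Minimac-style estimated r² rather than this true-genotype version):

```python
def dosage_r2(imputed_dosages, true_genotypes):
    """Aggregate imputation quality for one variant: squared Pearson
    correlation between imputed allele dosages (continuous, 0..2) and
    true genotype counts (0, 1, or 2) across samples."""
    n = len(imputed_dosages)
    mx = sum(imputed_dosages) / n
    my = sum(true_genotypes) / n
    cov = sum((x - mx) * (y - my)
              for x, y in zip(imputed_dosages, true_genotypes))
    vx = sum((x - mx) ** 2 for x in imputed_dosages)
    vy = sum((y - my) ** 2 for y in true_genotypes)
    return cov * cov / (vx * vy)

# A well-imputed variant: dosages track the true genotypes closely.
dosage_r2([0.05, 0.9, 1.95, 1.1, 0.0], [0, 1, 2, 1, 0])  # ≈ 0.99
```

Higher panel density and better haplotype matching (the advantage of TOPMed over smaller panels) push this statistic toward 1, especially for rare variants.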
••
TL;DR: After calculating the maximally localized superlattice Wannier wave functions, it is suggested that the system is an exotic ferromagnetic Mott insulator with well-defined experimental signatures.
Abstract: We address the effective tight-binding Hamiltonian that describes the insulating Mott state of twisted graphene bilayers at a magic angle. In that configuration, twisted bilayers form a honeycomb superlattice of localized states, characterized by the appearance of flat bands with fourfold degeneracy. After calculating the maximally localized superlattice Wannier wave functions, we derive the effective spin model that describes the Mott state. We suggest that the system is an exotic ferromagnetic Mott insulator, with well-defined experimental signatures.
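Schematically, an effective spin model of this kind takes the form of a nearest-neighbour Heisenberg Hamiltonian on the honeycomb superlattice (a generic sketch for orientation; the actual couplings and their signs follow from the paper's Wannier-function calculation):

```latex
H_{\mathrm{eff}} \;=\; J \sum_{\langle i j \rangle} \mathbf{S}_i \cdot \mathbf{S}_j,
\qquad J < 0 \ \text{(ferromagnetic exchange)},
```

where the sum runs over nearest-neighbour sites of the superlattice of localized states and a negative exchange constant favours the ferromagnetic Mott state described in the abstract.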
••
University of Vermont; University of Oregon; University of Melbourne; Icahn School of Medicine at Mount Sinai; University of Barcelona; Radboud University Nijmegen; Yale University; University of Cape Town; National Institute on Drug Abuse; Monash University; Utrecht University; Montreal Neurological Institute and Hospital; King's College London; Oregon Health & Science University; University of Rochester; University of Amsterdam; University of Michigan; University of Colorado Boulder; Washington University in St. Louis; University of California, Los Angeles; Australian Catholic University; University of Liverpool; National Institutes of Health; University of California, San Diego; McGovern Institute for Brain Research; Max Planck Society; Leiden University; Illawarra Health & Medical Research Institute; VU University Medical Center; QIMR Berghofer Medical Research Institute; University of Utah
TL;DR: The results indicate that dependence on a range of different substances shares a common neural substrate and that differential patterns of regional volume could serve as useful biomarkers of dependence on alcohol and nicotine.
Abstract: Objective: Although lower brain volume has been routinely observed in individuals with substance dependence compared with nondependent control subjects, the brain regions exhibiting lower volume have not been consistent across studies. In addition, it is not clear whether a common set of regions is involved in substance dependence regardless of the substance used or whether some brain volume effects are substance specific. Resolution of these issues may contribute to the identification of clinically relevant imaging biomarkers. Using pooled data from 14 countries, the authors sought to identify general and substance-specific associations between dependence and regional brain volumes. Method: Brain structure was examined in a mega-analysis of previously published data pooled from 23 laboratories, including 3,240 individuals, 2,140 of whom had substance dependence on one of five substances: alcohol, nicotine, cocaine, methamphetamine, or cannabis. Subcortical volume and cortical thickness in regions defined by FreeSurfer were compared between dependent individuals and nondependent control subjects, both with all sampled substance categories combined and for each substance separately, while controlling for age, sex, imaging site, and total intracranial volume. Because of extensive associations with alcohol dependence, a secondary contrast was also performed for dependence on all substances except alcohol. An optimized split-half strategy was used to assess the reliability of the findings. Results: Lower volume or thickness was observed in many brain regions in individuals with substance dependence. The greatest effects were associated with alcohol use disorder. A set of affected regions related to dependence in general, regardless of the substance, included the insula and the medial orbitofrontal cortex.
Furthermore, a support vector machine multivariate classification of regional brain volumes successfully classified individuals with substance dependence on alcohol or nicotine relative to nondependent control subjects. Conclusions: The results indicate that dependence on a range of different substances shares a common neural substrate and that differential patterns of regional volume could serve as useful biomarkers of dependence on alcohol and nicotine.
••
TL;DR: Estimates of the rate of a first recurrent venous thromboembolism (VTE) event after discontinuation of anticoagulant treatment in patients with a first episode of unprovoked VTE, and of the cumulative incidence of recurrent VTE up to 10 years, should inform clinical practice guidelines, enhance confidence in counselling patients about their prognosis, and help guide decision making about the long-term management of unprovoked VTE.
Abstract: Objectives To determine the rate of a first recurrent venous thromboembolism (VTE) event after discontinuation of anticoagulant treatment in patients with a first episode of unprovoked VTE, and the cumulative incidence for recurrent VTE up to 10 years. Design Systematic review and meta-analysis. Data sources Medline, Embase, and the Cochrane Central Register of Controlled Trials (from inception to 15 March 2019). Study selection Randomised controlled trials and prospective cohort studies reporting symptomatic recurrent VTE after discontinuation of anticoagulant treatment in patients with a first unprovoked VTE event who had completed at least three months of treatment. Data extraction and synthesis Two investigators independently screened studies, extracted data, and appraised risk of bias. Data clarifications were sought from authors of eligible studies. Recurrent VTE events and person years of follow-up after discontinuation of anticoagulant treatment were used to calculate rates for individual studies, and data were pooled using random effects meta-analysis. Sex and site of initial VTE were investigated as potential sources of between study heterogeneity. Results 18 studies involving 7515 patients were included in the analysis. The pooled rate of recurrent VTE per 100 person years after discontinuation of anticoagulant treatment was 10.3 events (95% confidence interval 8.6 to 12.1) in the first year, 6.3 events (5.1 to 7.7) in the second year, 3.8 events per year (3.2 to 4.5) in years 3-5, and 3.1 events per year (1.7 to 4.9) in years 6-10. The cumulative incidence for recurrent VTE was 16% (95% confidence interval 13% to 19%) at 2 years, 25% (21% to 29%) at 5 years, and 36% (28% to 45%) at 10 years.
The pooled rate of recurrent VTE per 100 person years in the first year was 11.9 events (9.6 to 14.4) for men and 8.9 events (6.8 to 11.3) for women, with a cumulative incidence for recurrent VTE of 41% (28% to 56%) and 29% (20% to 38%), respectively, at 10 years. Compared to patients with isolated pulmonary embolism, the rate of recurrent VTE was higher in patients with proximal deep vein thrombosis (rate ratio 1.4, 95% confidence interval 1.1 to 1.7) and in patients with pulmonary embolism plus deep vein thrombosis (1.5, 1.1 to 1.9). In patients with distal deep vein thrombosis, the pooled rate of recurrent VTE per 100 person years was 1.9 events (95% confidence interval 0.5 to 4.3) in the first year after anticoagulation had stopped. The case fatality rate for recurrent VTE was 4% (95% confidence interval 2% to 6%). Conclusions In patients with a first episode of unprovoked VTE who completed at least three months of anticoagulant treatment, the risk of recurrent VTE was 10% in the first year after treatment, 16% at two years, 25% at five years, and 36% at 10 years, with 4% of recurrent VTE events resulting in death. These estimates should inform clinical practice guidelines, enhance confidence in counselling patients of their prognosis, and help guide decision making about long term management of unprovoked VTE. Systematic review registration PROSPERO CRD42017056309.
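As a back-of-envelope check (not the authors' random-effects computation), the reported cumulative incidences follow from the pooled annual rates under a constant-hazard-within-interval assumption, via 1 - exp(-cumulative hazard):

```python
from math import exp

# Pooled recurrence rates per 100 person-years, expanded to one value
# per year (years 3-5 and years 6-10 each share a pooled rate).
ANNUAL_RATES = [10.3, 6.3] + [3.8] * 3 + [3.1] * 5

def cumulative_incidence(years, rates_per_100py=ANNUAL_RATES):
    """1 - exp(-H), where H is the hazard accumulated over the first
    `years` one-year intervals, treating the hazard as constant within
    each interval."""
    cum_hazard = sum(r / 100.0 for r in rates_per_100py[:years])
    return 1.0 - exp(-cum_hazard)

for y in (2, 5, 10):
    # Prints values close to the reported 16%, 25%, and 36%.
    print(y, round(100 * cumulative_incidence(y), 1))
```

The small remaining gaps from the published figures reflect the difference between this simple survival approximation and the study's pooled person-time analysis.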
••
TL;DR: It is concluded that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.
Abstract: We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. 40% of research-intensive institutions and 18% of master’s institutions mentioned the JIF, or closely related terms. Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.
••
TL;DR: Studies suggest an association between exposure to e-cigarette marketing and lower harm perceptions of e-cigarettes, intention to use e-cigarettes, and e-cigarette trial, highlighting the need for advertising regulations that support public health goals.
Abstract: Introduction Given the lack of regulation on marketing of electronic cigarettes (e-cigarettes) in the United States and the increasing exchange of e-cigarette-related information online, it is critical to understand how e-cigarette companies market e-cigarettes and how the public engages with e-cigarette information. Methods Results are from a systematic review of peer-reviewed literature on e-cigarettes via a PubMed search through June 1, 2017. Search terms included: "e-cigarette*" or "electronic cigarette" or "electronic cigarettes" or "electronic nicotine delivery" or "vape" or "vaping." Experimental studies, quasi-experimental studies, observational studies, qualitative studies, and mixed methods studies providing empirical findings on e-cigarette marketing and communication (i.e., nonmarketing communication in the public) were included. Results One hundred twenty-four publications on e-cigarette marketing and communication were identified. They covered topics including e-cigarette advertisement claims/promotions and exposure/receptivity, the effect of e-cigarette advertisements on e-cigarette and cigarette use, public engagement with e-cigarette information, and the public's portrayal of e-cigarettes. Studies show increases in e-cigarette marketing expenditures and online engagement through social media over time, that e-cigarettes are often framed as an alternative to combustible cigarettes, and that e-cigarette advertisement exposure may be associated with e-cigarette trial in adolescents and young adults. Discussion Few studies examine the effects of e-cigarette marketing on perceptions and e-cigarette and cigarette use. Evidence suggests that exposure to e-cigarette advertisements affects perceptions and trial of e-cigarettes, but there is no evidence that exposure affects cigarette use. No studies examined how exposure to e-cigarette communication, particularly misleading or inaccurate information, impacts e-cigarette and tobacco use behaviors.
Implications The present article provides a comprehensive review of e-cigarette marketing and how the public engages with e-cigarette information. Studies suggest an association between exposure to e-cigarette marketing and lower harm perceptions of e-cigarettes, intention to use e-cigarettes, and e-cigarette trial, highlighting the need for advertising regulations that support public health goals. Findings from this review also present the methodological limitations of the existing research (primarily due to cross-sectional and correlational analyses) and underscore the need for timely, rigorous research to provide an accurate understanding of e-cigarette marketing and communication and its impact on e-cigarette and tobacco product use.
••
TL;DR: By fostering equanimity with feelings of loneliness and social disconnect, acceptance-skills training may allow loneliness to dissipate and encourage greater engagement with others in daily life; this identifies a behavioral therapeutic target for improving social-relationship functioning.
Abstract: Loneliness and social isolation are a growing public health concern, yet there are few evidence-based interventions for mitigating these social risk factors. Accumulating evidence suggests that mindfulness interventions can improve social-relationship processes. However, the active ingredients of mindfulness training underlying these improvements are unclear. Developing mindfulness-specific skills-namely, (i) monitoring present-moment experiences with (ii) an orientation of acceptance-may change the way people perceive and relate toward others. We predicted that developing openness and acceptance toward present experiences is critical for reducing loneliness and increasing social contact and that removing acceptance-skills training from a mindfulness intervention would eliminate these benefits. In this dismantling trial, 153 community adults were randomly assigned to a 14-lesson smartphone-based intervention: (i) training in both monitoring and acceptance (Monitor+Accept), (ii) training in monitoring only (Monitor Only), or (iii) active control training. For 3 d before and after the intervention, ambulatory assessments were used to measure loneliness and social contact in daily life. Consistent with predictions, Monitor+Accept training reduced daily-life loneliness by 22% (d = 0.44, P = 0.0001) and increased social contact by two more interactions each day (d = 0.47, P = 0.001) and one more person each day (d = 0.39, P = 0.004), compared with both Monitor Only and control trainings. These findings describe a behavioral therapeutic target for improving social-relationship functioning; by fostering equanimity with feelings of loneliness and social disconnect, acceptance-skills training may allow loneliness to dissipate and encourage greater engagement with others in daily life.
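The d values reported above are standardized mean differences (Cohen's d). A minimal sketch of the statistic for two independent groups (illustrative only; the trial's effect sizes come from its own models over repeated ambulatory assessments):

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Difference in group means divided by the pooled standard
    deviation; |d| around 0.4-0.5 is conventionally a small-to-medium
    effect."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2
                  + (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

cohens_d([1, 2, 3], [2, 3, 4])  # → -1.0
```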
••
TL;DR: It is found that a minimum of eleven total pathogenic and benign variant controls is required to reach moderate-level evidence in the absence of rigorous statistical analysis.
Abstract: Background The American College of Medical Genetics and Genomics (ACMG)/Association for Molecular Pathology (AMP) clinical variant interpretation guidelines established criteria (PS3/BS3) for functional assays that specified a “strong” level of evidence. However, they did not provide detailed guidance on how functional evidence should be evaluated, and differences in the application of the PS3/BS3 codes are a contributor to variant interpretation discordance between laboratories. This recommendation seeks to provide a more structured approach to the assessment of functional assays for variant interpretation and guidance on the use of various levels of strength based on assay validation. Methods The Clinical Genome Resource (ClinGen) Sequence Variant Interpretation (SVI) Working Group used curated functional evidence from ClinGen Variant Curation Expert Panel-developed rule specifications and expert opinions to refine the PS3/BS3 criteria over multiple in-person and virtual meetings. We estimated odds of pathogenicity for assays using various numbers of variant controls to determine the minimum controls required to reach moderate level evidence. Feedback from the ClinGen Steering Committee and outside experts was incorporated into the recommendations at multiple stages of development. Results The SVI Working Group developed recommendations for evaluators regarding the assessment of the clinical validity of functional data and a four-step provisional framework to determine the appropriate strength of evidence that can be applied in clinical variant interpretation. These steps are: 1. Define the disease mechanism; 2. Evaluate applicability of general classes of assays used in the field; 3. Evaluate validity of specific instances of assays; 4. Apply evidence to individual variant interpretation.
We found that a minimum of eleven total pathogenic and benign variant controls is required to reach moderate-level evidence in the absence of rigorous statistical analysis. Conclusions The recommendations and approach to functional evidence evaluation described here should help clarify the clinical variant interpretation process for functional assays. Further, we hope that these recommendations will help develop productive partnerships with basic scientists who have developed functional assays that are useful for interrogating the function of a variety of genes.
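The odds-of-pathogenicity calculations behind the control-count recommendation use a Bayesian framing in which assay evidence is expressed as an odds ratio, OddsPath = [P2 x (1 - P1)] / [(1 - P2) x P1], where P1 is the prior proportion of pathogenic variants among the assay's controls and P2 the proportion after conditioning on the assay readout. A minimal sketch (the strength cut-points in the comment are the commonly cited Tavtigian-style thresholds, not values restated from this abstract):

```python
def odds_path(p1, p2):
    """Odds of pathogenicity implied by a functional assay: the shift
    from a prior proportion p1 of pathogenic variants among controls to
    a posterior proportion p2 given the assay result."""
    return (p2 * (1.0 - p1)) / ((1.0 - p2) * p1)

# Commonly cited evidence-strength cut-points (approximate):
# supporting ~2.1, moderate ~4.3, strong ~18.7.
odds_path(0.1, 0.5)  # → 9.0
```

With few variant controls, P2 cannot move far from P1, which is why a minimum number of concordant pathogenic and benign controls is needed before the implied odds clear the moderate threshold.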