Showing papers by "University of York" published in 2016
••
University of Bristol1, Harvard University2, University Hospitals Bristol NHS Foundation Trust3, Research Triangle Park4, University of Toronto5, University of Oxford6, University of Ottawa7, Paris Descartes University8, University of London9, University of York10, University of Birmingham11, University of Southern Denmark12, University of Liverpool13, University of East Anglia14, Loyola University Chicago15, University of Aberdeen16, Kaiser Permanente17, Baruch College18, McMaster University19, Cochrane Collaboration20, McGill University21, Ottawa Hospital Research Institute22, University of Louisville23, University of Melbourne24
TL;DR: ROBINS-I (Risk Of Bias In Non-randomised Studies - of Interventions) is developed, a new tool for evaluating risk of bias in estimates of the comparative effectiveness of interventions from studies that did not use randomisation to allocate units (individuals or clusters of individuals) to comparison groups.
Abstract: Non-randomised studies of the effects of interventions are critical to many areas of healthcare evaluation, but their results may be biased. It is therefore important to understand and appraise their strengths and weaknesses. We developed ROBINS-I (“Risk Of Bias In Non-randomised Studies - of Interventions”), a new tool for evaluating risk of bias in estimates of the comparative effectiveness (harm or benefit) of interventions from studies that did not use randomisation to allocate units (individuals or clusters of individuals) to comparison groups. The tool will be particularly useful to those undertaking systematic reviews that include non-randomised studies.
8,028 citations
••
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes.
For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy.
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
5,187 citations
••
TL;DR: The Global Burden of Disease 2015 Study provides a comprehensive assessment of all-cause and cause-specific mortality for 249 causes in 195 countries and territories from 1980 to 2015, finding that several countries in sub-Saharan Africa had very large gains in life expectancy, rebounding from an era of exceedingly high loss of life due to HIV/AIDS.
4,804 citations
••
Duke University1, Tufts Medical Center2, University of Washington3, Harvard University4, McMaster University5, University Health Network6, University of Minnesota7, University of Chicago8, VA Palo Alto Healthcare System9, University of Michigan10, University of York11, Brown University12, Rutgers University13, Patient-Centered Outcomes Research Institute14, University of Miami15
TL;DR: The Second Panel on Cost-Effectiveness in Health and Medicine reviewed the current status of the field of cost-effectiveness analysis and developed a new set of recommendations, including the recommendation to perform analyses from 2 reference case perspectives and to provide an impact inventory to clarify included consequences.
Abstract: Importance Since publication of the report by the Panel on Cost-Effectiveness in Health and Medicine in 1996, researchers have advanced the methods of cost-effectiveness analysis, and policy makers have experimented with its application. The need to deliver health care efficiently and the importance of using analytic techniques to understand the clinical and economic consequences of strategies to improve health have increased in recent years. Objective To review the state of the field and provide recommendations to improve the quality of cost-effectiveness analyses. The intended audiences include researchers, government policy makers, public health officials, health care administrators, payers, businesses, clinicians, patients, and consumers. Design In 2012, the Second Panel on Cost-Effectiveness in Health and Medicine was formed and included 2 co-chairs, 13 members, and 3 additional members of a leadership group. These members were selected on the basis of their experience in the field to provide broad expertise in the design, conduct, and use of cost-effectiveness analyses. Over the next 3.5 years, the panel developed recommendations by consensus. These recommendations were then reviewed by invited external reviewers and through a public posting process. Findings The concept of a “reference case” and a set of standard methodological practices that all cost-effectiveness analyses should follow to improve quality and comparability are recommended. All cost-effectiveness analyses should report 2 reference case analyses: one based on a health care sector perspective and another based on a societal perspective. The use of an “impact inventory,” which is a structured table that contains consequences (both inside and outside the formal health care sector), intended to clarify the scope and boundaries of the 2 reference case analyses is also recommended. 
This special communication reviews these recommendations and others concerning the estimation of the consequences of interventions, the valuation of health outcomes, and the reporting of cost-effectiveness analyses. Conclusions and Relevance The Second Panel reviewed the current status of the field of cost-effectiveness analysis and developed a new set of recommendations. Major changes include the recommendation to perform analyses from 2 reference case perspectives and to provide an impact inventory to clarify included consequences.
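The reference case analyses the Panel recommends ultimately reduce to incremental cost-effectiveness ratios, with the two perspectives differing in which costs enter the numerator. A minimal sketch of that arithmetic (all numbers are hypothetical illustrations, not from the report; the societal perspective simply adds non-health-sector costs such as caregiver time):

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per extra
    unit of health effect (e.g. per QALY gained)."""
    return delta_cost / delta_effect

# Hypothetical new strategy vs. comparator: +0.5 QALYs per patient.
health_sector_cost = 12000 - 10000          # formal health care costs only
societal_cost = health_sector_cost + 1500   # plus e.g. caregiver time, lost productivity

icer_health = icer(health_sector_cost, 0.5)   # per-QALY ratio, health care sector perspective
icer_societal = icer(societal_cost, 0.5)      # per-QALY ratio, societal perspective
```

The "impact inventory" the Panel describes is essentially the bookkeeping table that determines which consequences flow into `societal_cost` but not `health_sector_cost`.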
1,995 citations
••
Nicholas J Kassebaum1, Megha Arora1, Ryan M Barber1, Zulfiqar A Bhutta2 +679 more•Institutions (268)
TL;DR: In this paper, the authors used estimates of all-cause mortality, cause-specific mortality, and non-fatal disease burden from the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) to derive HALE and DALYs by sex for 195 countries and territories from 1990 to 2015.
1,533 citations
••
Max Planck Society1, Harvard University2, Massachusetts Institute of Technology3, University of Hamburg4, Medical University of Vienna5, Montreal Neurological Institute and Hospital6, University of Düsseldorf7, New York University8, Nathan Kline Institute for Psychiatric Research9, University of York10
TL;DR: An overarching organization of large-scale connectivity that situates the default-mode network at the opposite end of a spectrum from primary sensory and motor regions is described, suggesting that the role of the DMN in cognition might arise from its position at one extreme of a hierarchy, allowing it to process transmodal information that is unrelated to immediate sensory input.
Abstract: Understanding how the structure of cognition arises from the topographical organization of the cortex is a primary goal in neuroscience. Previous work has described local functional gradients extending from perceptual and motor regions to cortical areas representing more abstract functions, but an overarching framework for the association between structure and function is still lacking. Here, we show that the principal gradient revealed by the decomposition of connectivity data in humans and the macaque monkey is anchored by, at one end, regions serving primary sensory/motor functions and at the other end, transmodal regions that, in humans, are known as the default-mode network (DMN). These DMN regions exhibit the greatest geodesic distance along the cortical surface from primary sensory/motor morphological landmarks, and are precisely equidistant from them. The principal gradient also provides an organizing spatial framework for multiple large-scale networks and characterizes a spectrum from unimodal to heteromodal activity in a functional meta-analysis. Together, these observations provide a characterization of the topographical organization of cortex and indicate that the role of the DMN in cognition might arise from its position at one extreme of a hierarchy, allowing it to process transmodal information that is unrelated to immediate sensory input.
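The decomposition referred to here is, in broad strokes, a diffusion-map-style spectral embedding of a connectivity matrix: after degree normalisation, the first non-trivial eigenvector orders regions along the principal gradient. A minimal sketch on a toy affinity matrix (the actual pipeline uses dense human connectome data and considerably more preprocessing; this only illustrates the eigendecomposition step):

```python
import numpy as np

def principal_gradient(affinity):
    """First non-trivial eigenvector of the symmetrically normalised
    affinity matrix D^-1/2 A D^-1/2 (a diffusion-map-style embedding).
    `affinity` is a symmetric, non-negative connectivity matrix."""
    d = affinity.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    m = d_inv_sqrt[:, None] * affinity * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(m)
    # eigh returns eigenvalues in ascending order; the last one (value 1)
    # carries the trivial stationary component, so the second-to-last
    # eigenvector is the principal gradient.
    return d_inv_sqrt * vecs[:, -2]

# Toy connectivity with two communities: the gradient separates them.
a = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
g = principal_gradient(a)
```

Nodes within a block get gradient values of the same sign, and the two blocks get opposite signs, which is the one-dimensional ordering the paper anchors at sensory/motor versus DMN regions.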
1,346 citations
••
TL;DR: A selection guide of common solvents has been elaborated, based on a survey of publicly available solvent selection guides, and a set of Safety, Health and Environment criteria is proposed, aligned with the GHS and European regulations.
1,161 citations
••
Ghent University1, Forschungszentrum Jülich2, Åbo Akademi University3, Aalto University4, Vienna University of Technology5, Duke University6, University of Grenoble7, École Polytechnique Fédérale de Lausanne8, Durham University9, International School for Advanced Studies10, Max Planck Society11, Uppsala University12, Fritz Haber Institute of the Max Planck Society13, Humboldt University of Berlin14, Technical University of Denmark15, National Institute of Standards and Technology16, University of Udine17, Université catholique de Louvain18, University of Basel19, Harvard University20, University of California, Davis21, Rutgers University22, University of York23, Wake Forest University24, Science and Technology Facilities Council25, University of Oxford26, University of Vienna27, Leibniz Institute for Neurobiology28, Dresden University of Technology29, Radboud University Nijmegen30, University of Tokyo31, Centre national de la recherche scientifique32, University of Cambridge33, Royal Holloway, University of London34, University of California, Santa Barbara35, University of Luxembourg36, Los Alamos National Laboratory37, Harbin Institute of Technology38
TL;DR: A procedure to assess the precision of DFT methods was devised and used to demonstrate reproducibility among many of the most widely used DFT codes, showing that the precision of DFT implementations can be determined even in the absence of one absolute reference code.
Abstract: The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
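Comparisons of this kind are typically made by fitting each code's energy-volume data to the Birch-Murnaghan equation of state and then quantifying the difference between the fitted curves. A sketch under that assumption (the made-up parameter tuples stand in for two codes' fits; the study's actual comparison metric is more carefully normalised):

```python
import math

def birch_murnaghan(v, e0, v0, b0, bp):
    """Third-order Birch-Murnaghan equation of state: energy as a function
    of volume, given equilibrium energy e0, equilibrium volume v0,
    bulk modulus b0 and its pressure derivative bp."""
    eta = (v0 / v) ** (2.0 / 3.0)
    return e0 + 9.0 * v0 * b0 / 16.0 * (
        (eta - 1.0) ** 3 * bp + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )

def eos_rms_difference(params_a, params_b, v_lo, v_hi, n=1000):
    """Root-mean-square difference between two fitted E(V) curves over
    [v_lo, v_hi], approximated on a uniform volume grid."""
    vs = [v_lo + (v_hi - v_lo) * i / (n - 1) for i in range(n)]
    sq = [(birch_murnaghan(v, *params_a) - birch_murnaghan(v, *params_b)) ** 2
          for v in vs]
    return math.sqrt(sum(sq) / n)
```

Two codes agree "very well" in the paper's sense when this kind of curve-to-curve difference is comparable to the scatter between high-precision experiments.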
1,141 citations
••
01 Jan 2016
TL;DR: This paper describes the SemEval 2016 shared task on Aspect Based Sentiment Analysis (ABSA), a continuation of the respective tasks of 2014 and 2015; the 2016 task provided 19 training and 20 testing datasets and attracted 245 submissions from 29 teams.
Abstract: This paper describes the SemEval 2016 shared task on Aspect Based Sentiment Analysis (ABSA), a continuation of the respective tasks of 2014 and 2015. In its third year, the task provided 19 training and 20 testing datasets for 8 languages and 7 domains, as well as a common evaluation procedure. From these datasets, 25 were for sentence-level and 14 for text-level ABSA; the latter was introduced for the first time as a subtask in SemEval. The task attracted 245 submissions from 29 teams.
1,139 citations
•
TL;DR: Cox, in this classic article, discusses various Gramscian concepts and their implications for the study of different historical forms of hegemony and counter-hegemony, and suggests that domestic counter-hegemonic historic blocs could have a revolutionary effect on international structures and organizations, rupturing the hegemony embodied in the transnational economic order.
Abstract: This article is a classic and foundational piece for approaching global power relations with the conceptual tools developed by Gramsci. Cox contributes to critical thought in International Relations by discussing various Gramscian concepts and what their implications are for the study of different historical forms of hegemony and counter-hegemony. Also, the author draws our attention, novel at the time of its publication, to the relevance of taking into account the construction of domestic counter-hegemonic historic blocs. He suggests that these could have a revolutionary effect on international structures and organizations, as well as a rupture with the hegemony embodied in the transnational economic order.
1,081 citations
••
University of Minnesota1, University of Michigan2, Florida State University3, University of Twente4, Queen's University Belfast5, University of California, Berkeley6, University of Belgrade7, University of Bristol8, University of Padua9, University of York10, Osaka University11, Loughborough University12, Leibniz Association13, Brno University of Technology14, Academy of Sciences of the Czech Republic15, Comenius University in Bratislava16, École Polytechnique17, Ulster University18, Clarkson University19, Michigan Technological University20, University of Antwerp21, Lublin University of Technology22, University of Montpellier23, Eindhoven University of Technology24, Max Planck Society25, University of Alberta26, Durham University27, Lawrence Berkeley National Laboratory28, National Institute of Advanced Industrial Science and Technology29, Saint Petersburg State University30
TL;DR: This paper reviews the state of the art of this multidisciplinary area, identifies the key research challenges, and discusses the developments in diagnostics, modelling, and extensions of cross-section and reaction-rate databases needed to address them.
Abstract: Plasma–liquid interactions represent a growing interdisciplinary area of research involving plasma science, fluid dynamics, heat and mass transfer, photolysis, multiphase chemistry and aerosol science. This review provides an assessment of the state-of-the-art of this multidisciplinary area and identifies the key research challenges. The developments in diagnostics, modeling and further extensions of cross section and reaction rate databases that are necessary to address these challenges are discussed. The review focusses on non-equilibrium plasmas.
••
TL;DR: A systematic review investigating the ability of simple measures of childhood obesity, such as body mass index (BMI), to predict future obesity in adolescence and adulthood found that obese children and adolescents were around five times more likely to be obese in adulthood than those who were not obese.
Abstract: A systematic review and meta-analysis was performed to investigate the ability of simple measures of childhood obesity such as body mass index (BMI) to predict future obesity in adolescence and adulthood. Large cohort studies, which measured obesity both in childhood and in later adolescence or adulthood, using any recognized measure of obesity were sought. Study quality was assessed. Studies were pooled using diagnostic meta-analysis methods. Fifteen prospective cohort studies were included in the meta-analysis. BMI was the only measure of obesity reported in any study, with 200,777 participants followed up. Obese children and adolescents were around five times more likely to be obese in adulthood than those who were not obese. Around 55% of obese children go on to be obese in adolescence, around 80% of obese adolescents will still be obese in adulthood and around 70% will be obese over age 30. Therefore, action to reduce and prevent obesity in these adolescents is needed. However, 70% of obese adults were not obese in childhood or adolescence, so targeting obesity reduction solely at obese or overweight children needs to be considered carefully as this may not substantially reduce the overall burden of adult obesity.
••
TL;DR: The findings suggest that deficiencies in social relationships are associated with an increased risk of developing CHD and stroke in high-income countries.
Abstract: Background The influence of social relationships on morbidity is widely accepted, but the size of the risk to cardiovascular health is unclear.
Objective We undertook a systematic review and meta-analysis to investigate the association between loneliness or social isolation and incident coronary heart disease (CHD) and stroke.
Methods Sixteen electronic databases were systematically searched for longitudinal studies set in high-income countries and published up until May 2015. Two independent reviewers screened studies for inclusion and extracted data. We assessed quality using a component approach and pooled data for analysis using random effects models.
Results Of the 35 925 records retrieved, 23 papers met inclusion criteria for the narrative review. They reported data from 16 longitudinal datasets, for a total of 4628 CHD and 3002 stroke events recorded over follow-up periods ranging from 3 to 21 years. Reports of 11 CHD studies and 8 stroke studies provided data suitable for meta-analysis. Poor social relationships were associated with a 29% increase in risk of incident CHD (pooled relative risk: 1.29, 95% CI 1.04 to 1.59) and a 32% increase in risk of stroke (pooled relative risk: 1.32, 95% CI 1.04 to 1.68). Subgroup analyses did not identify any differences by gender.
Conclusions Our findings suggest that deficiencies in social relationships are associated with an increased risk of developing CHD and stroke. Future studies are needed to investigate whether interventions targeting loneliness and social isolation can help to prevent two of the leading causes of death and disability in high-income countries.
Study registration number CRD42014010225.
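The pooled relative risks reported above come from inverse-variance random-effects models fitted on the log scale. A minimal sketch of that pooling step (illustrative numbers, not the review's data; the between-study variance `tau2` is assumed to have been estimated separately, e.g. by DerSimonian-Laird):

```python
import math

def pooled_rr_random_effects(rrs, ses, tau2):
    """Inverse-variance random-effects pool of relative risks.
    `rrs` are study relative risks, `ses` their standard errors on the
    log scale, `tau2` the between-study variance. Returns the pooled
    RR and its 95% confidence interval."""
    logs = [math.log(r) for r in rrs]
    w = [1.0 / (se ** 2 + tau2) for se in ses]   # random-effects weights
    mu = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    se_mu = math.sqrt(1.0 / sum(w))
    lo, hi = mu - 1.96 * se_mu, mu + 1.96 * se_mu
    return math.exp(mu), (math.exp(lo), math.exp(hi))
```

With tau2 = 0 this reduces to the fixed-effect inverse-variance estimate; larger tau2 flattens the weights toward an unweighted average of the study effects.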
••
TL;DR: The prevalence of depression in adolescents and young adults has increased in recent years; with little change in treatment, these trends translate into a growing number of young people with untreated depression, calling for renewed efforts to expand service capacity to best meet the mental health care needs of this age group.
Abstract: OBJECTIVES: This study examined national trends in 12-month prevalence of major depressive episodes (MDEs) in adolescents and young adults overall and in different sociodemographic groups, as well as trends in depression treatment between 2005 and 2014.
METHODS: Data were drawn from the National Surveys on Drug Use and Health for 2005 to 2014, which are annual cross-sectional surveys of the US general population. Participants included 172 495 adolescents aged 12 to 17 and 178 755 adults aged 18 to 25. Time trends in 12-month prevalence of MDEs were examined overall and in different subgroups, as were time trends in the use of treatment services.
RESULTS: The 12-month prevalence of MDEs increased from 8.7% in 2005 to 11.3% in 2014 in adolescents and from 8.8% to 9.6% in young adults (both P < .001). The increase was larger and statistically significant only in the age range of 12 to 20 years. The trends remained significant after adjustment for substance use disorders and sociodemographic factors. Mental health care contacts overall did not change over time; however, the use of specialty mental health providers increased in adolescents and young adults, and the use of prescription medications and inpatient hospitalizations increased in adolescents.
CONCLUSIONS: The prevalence of depression in adolescents and young adults has increased in recent years. In the context of little change in mental health treatments, trends in prevalence translate into a growing number of young people with untreated depression. The findings call for renewed efforts to expand service capacity to best meet the mental health care needs of this age group.
••
TL;DR: Poor wellbeing and moderate to high levels of burnout are associated, in the majority of studies reviewed, with poor patient safety outcomes such as medical errors; however, the lack of prospective studies reduces the ability to determine causality.
Abstract: Objective To determine whether there is an association between healthcare professionals' wellbeing and burnout, and patient safety. Design Systematic research review. Data Sources PsycINFO (1806 to July 2015), Medline (1946 to July 2015), Embase (1947 to July 2015) and Scopus (1823 to July 2015) were searched, along with reference lists of eligible articles. Eligibility Criteria for Selecting Studies Quantitative, empirical studies that included i) either a measure of wellbeing or burnout, and ii) patient safety, in healthcare staff populations. Results Forty-six studies were identified. Sixteen out of the 27 studies that measured wellbeing found a significant correlation between poor wellbeing and worse patient safety, with six additional studies finding an association with some but not all scales used, and one study finding a significant association but in the opposite direction to the majority of studies. Twenty-one out of the 30 studies that measured burnout found a significant association between burnout and patient safety, whilst a further four studies found an association between patient safety and one or more (but not all) subscales of the burnout measures employed. Conclusions Poor wellbeing and moderate to high levels of burnout are associated, in the majority of studies reviewed, with poor patient safety outcomes such as medical errors; however, the lack of prospective studies reduces the ability to determine causality. Further prospective studies, research in primary care and within the UK, and a clearer definition of healthcare staff wellbeing are needed. Implications This review illustrates the need for healthcare organisations to consider improving employees' mental health as well as creating safer work environments when planning interventions to improve patient safety. Systematic Review Registration PROSPERO registration number: CRD42015023340.
••
West Virginia University1, Yale University2, Food and Agriculture Organization3, Landcare Research4, University of Udine5, Max Planck Society6, University of Alaska Fairbanks7, Technische Universität München8, Université du Québec à Montréal9, University of the French West Indies and Guiana10, University of Freiburg Faculty of Biology11, Cornell University12, Wageningen University and Research Centre13, University of Sydney14, Polytechnic Institute of Viseu15, University of Trás-os-Montes and Alto Douro16, University of Göttingen17, Russian Academy of Sciences18, Oeschger Centre for Climate Change Research19, Lakehead University20, University of La Frontera21, Seoul National University22, Martin Luther University of Halle-Wittenberg23, University of Cambridge24, Center for International Forestry Research25, James Cook University26, University of Zurich27, University of Yaoundé I28, University of Wisconsin-Madison29, Queensland Government30, Florida International University31, Institut national de la recherche agronomique32, Forest Research Institute33, Polish Academy of Sciences34, University of Minnesota35, Warsaw University of Life Sciences36, Ştefan cel Mare University of Suceava37, University of Florence38, University of Warsaw39, King Juan Carlos University40, Spanish National Research Council41, International Trademark Association42, National University of Austral Patagonia43, National Scientific and Technical Research Council44, Wildlife Conservation Society45, College of African Wildlife Management46, University of York47, Durham University48, Ontario Ministry of Natural Resources49, Pontificia Universidad Católica del Ecuador50, Centre national de la recherche scientifique51, Museu Paraense Emílio Goeldi52, University College London53, University of Leeds54
TL;DR: A consistent positive concave-down effect of biodiversity on forest productivity across the world is revealed, showing that continued biodiversity loss would result in an accelerating decline in forest productivity worldwide.
Abstract: The biodiversity-productivity relationship (BPR) is foundational to our understanding of the global extinction crisis and its impacts on ecosystem functioning. Understanding BPR is critical for the accurate valuation and effective conservation of biodiversity. Using ground-sourced data from 777,126 permanent plots, spanning 44 countries and most terrestrial biomes, we reveal a globally consistent positive concave-down BPR, showing that continued biodiversity loss would result in an accelerating decline in forest productivity worldwide. The value of biodiversity in maintaining commercial forest productivity alone-US$166 billion to 490 billion per year according to our estimation-is more than twice what it would cost to implement effective global conservation. This highlights the need for a worldwide reassessment of biodiversity values, forest management strategies, and conservation priorities.
••
Institute for Health Metrics and Evaluation1, College of Health Sciences, Bahrain2, Harvard University3, Kwame Nkrumah University of Science and Technology4, Charité5, Ahmadu Bello University6, University of the Philippines Manila7, Pontifical Xavierian University8, Madawalabu University9, World Bank10, Public Health Foundation of India11, Guy's and St Thomas' NHS Foundation Trust12, Griffith University13, University of New South Wales14, Massey University15, University of Peradeniya16, University of Sydney17, Chinese Center for Disease Control and Prevention18, Russian Academy of Sciences19, Tehran University of Medical Sciences20, Auckland University of Technology21, James Cook University22, Monash University23, University of California, San Francisco24, Arabian Gulf University25, Central South University26, Virginia Commonwealth University27, Jordan University of Science and Technology28, Health Services Academy29, Oregon Health & Science University30, University of Sheffield31, University at Albany, SUNY32, Aintree University Hospitals NHS Foundation Trust33, Swansea University34, University of York35, South African Medical Research Council36, Children's Hospital of Philadelphia37, Addis Ababa University38, Curtin University39, University of Washington40, Queensland University of Technology41, University of British Columbia42, Suez Canal University43, Karolinska Institutet44, University of Alabama at Birmingham45, An-Najah National University46, Tufts Medical Center47, Norwegian Institute of Public Health48, Stavanger University Hospital49, University of Cape Town50, University of California, Irvine51, University of Illinois at Urbana–Champaign52, St. John's University53, Johns Hopkins University54, Hanoi Medical University55, National Research University – Higher School of Economics56, University of Gondar57, University of Hong Kong58, Jackson State University59, Wuhan University60
TL;DR: An overview of injury estimates from the 2013 update of GBD is provided, with detailed information on incidence, mortality, DALYs and rates of change from 1990 to 2013 for 26 causes of injury, globally, by region and by country.
Abstract: Background The Global Burden of Diseases (GBD), Injuries, and Risk Factors study used the disability-adjusted life year (DALY) to quantify the burden of diseases, injuries, and risk factors. This paper provides an overview of injury estimates from the 2013 update of GBD, with detailed information on incidence, mortality, DALYs and rates of change from 1990 to 2013 for 26 causes of injury, globally, by region and by country.
Methods Injury mortality was estimated using the extensive GBD mortality database, corrections for ill-defined cause of death and the cause of death ensemble modelling tool. Morbidity estimation was based on inpatient and outpatient data sets, 26 cause-of-injury and 47 nature-of-injury categories, and seven follow-up studies with patient-reported long-term outcome measures.
Results In 2013, 973 million (uncertainty interval (UI) 942 to 993) people sustained injuries that warranted some type of healthcare and 4.8 million (UI 4.5 to 5.1) people died from injuries. Between 1990 and 2013 the global age-standardised injury DALY rate decreased by 31% (UI 26% to 35%). The rate of decline in DALY rates was significant for 22 cause-of-injury categories, including all the major injuries.
Conclusions Injuries continue to be an important cause of morbidity and mortality in the developed and developing world. The decline in rates for almost all injuries is so prominent that it warrants a general statement that the world is becoming a safer place to live in. However, the patterns vary widely by cause, age, sex, region and time and there are still large improvements that need to be made.
••
TL;DR: The GAD-7 had acceptable properties for identifying GAD at cutoff scores of 7-10, and the GAD-2 had acceptable properties for identifying GAD at a cutoff score of 3.
••
TL;DR: The aim is to identify known methods for estimating the between-study variance and its corresponding uncertainty, and to summarise the simulation and empirical evidence that compares them; the Q-profile method and the alternative approach based on a 'generalised Cochran between-study variance statistic' are recommended.
Abstract: Meta-analyses are typically used to estimate the overall mean of an outcome of interest. However, inference about between-study variability, which is typically modelled using a between-study variance parameter, is usually an additional aim. The DerSimonian and Laird method, currently widely used by default to estimate the between-study variance, has long been challenged. Our aim is to identify known methods for estimation of the between-study variance and its corresponding uncertainty, and to summarise the simulation and empirical evidence that compares them. We identified 16 estimators for the between-study variance, seven methods to calculate confidence intervals, and several comparative studies. Simulation studies suggest that for both dichotomous and continuous data the estimator proposed by Paule and Mandel, and for continuous data the restricted maximum likelihood estimator, are better alternatives for estimating the between-study variance. Based on the scenarios and results presented in the published studies, we recommend the Q-profile method and the alternative approach based on a 'generalised Cochran between-study variance statistic' to compute corresponding confidence intervals around the resulting estimates. Our recommendations are based on a qualitative evaluation of the existing literature and expert consensus. Evidence-based recommendations require an extensive simulation study where all methods would be compared under the same scenarios.
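For context, a minimal sketch of the DerSimonian-Laird moment estimator discussed above (this implementation is illustrative, not taken from the paper): given per-study effect estimates and their within-study variances, it computes Cochran's Q under inverse-variance weights and truncates the resulting variance estimate at zero.

```python
# Illustrative sketch of the DerSimonian-Laird estimator of the
# between-study variance tau^2 (not code from the reviewed paper).
def dersimonian_laird_tau2(effects, variances):
    """effects: per-study effect estimates; variances: within-study variances."""
    k = len(effects)
    w = [1.0 / v for v in variances]  # fixed-effect (inverse-variance) weights
    sw = sum(w)
    y_bar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    # Cochran's Q: weighted squared deviations from the pooled estimate
    q = sum(wi * (yi - y_bar) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    return max(0.0, (q - (k - 1)) / c)  # truncated at zero, a known limitation
```

The truncation at zero is one source of the bias that motivates the alternatives (Paule-Mandel, REML) recommended in the abstract.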
••
23 May 2016
TL;DR: A review of general purpose solvent selection guides can be found in this paper, highlighting their similarities and differences and how they can be used to enhance the greenness of chemical processes, particularly in laboratory organic synthesis and the pharmaceutical industry.
Abstract: Driven by legislation and evolving attitudes towards environmental issues, establishing green solvents for extractions, separations, formulations and reaction chemistry has become an increasingly important area of research. Several general purpose solvent selection guides have now been published with the aim of reducing use of the most hazardous solvents. This review serves the purpose of explaining the role of these guides, highlighting their similarities and differences. How they can be used most effectively to enhance the greenness of chemical processes, particularly in laboratory organic synthesis and the pharmaceutical industry, is addressed in detail.
••
TL;DR: The current state of evidence supports that gamification can have a positive impact on health and wellbeing, particularly for health behaviours; however, several studies report mixed or neutral effects.
••
TL;DR: The offspring of depressed parents constitute a high-risk group for psychiatric and medical problems, which begin early and continue through adulthood, and early detection seems warranted.
Abstract: Objective: While the increased risk of psychopathology in the biological offspring of depressed parents has been widely replicated, the long-term outcome through their full age of risk is less known. The authors present a 30-year follow-up of biological offspring (mean age=47 years) of depressed (high-risk) and nondepressed (low-risk) parents. Method: One hundred forty-seven offspring of moderately to severely depressed or nondepressed parents selected from the same community were followed for up to 30 years. Diagnostic assessments were conducted blind to parents' clinical status. Final diagnoses were made by a blinded M.D. or Ph.D. evaluator. Results: The risk for major depression was approximately three times as high in the high-risk offspring. The period of highest risk for first onset was between ages 15 and 25 in both groups. Prepubertal onsets were uncommon, but high-risk offspring had over 10-fold increased risk. The early onset of major depression seen in the offspring of depressed parents was not offs...
••
TL;DR: Findings indicate an urgent need to destigmatize DUD and educate the public, clinicians, and policy makers about its treatment to encourage affected individuals to obtain help.
Abstract: Importance Current information on the prevalence and sociodemographic and clinical profiles of individuals in the general population with DSM-5 drug use disorder (DUD) is limited. Given the present societal and economic context in the United States and the new diagnostic system, up-to-date national information is needed from a single uniform data source. Objective To present nationally representative findings on the prevalence, correlates, psychiatric comorbidity, disability, and treatment of DSM-5 DUD diagnoses overall and by severity level. Design, Setting, and Participants In-person interviews were conducted with 36 309 adults in the 2012-2013 National Epidemiologic Survey on Alcohol and Related Conditions–III, a cross-sectional representative survey of the United States. The household response rate was 72%; person-level response rate, 84%; and overall response rate, 60.1%. Data were collected April 2012 through June 2013 and analyzed from February through March 2015. Main Outcomes and Measures Twelve-month and lifetime DUD, based on amphetamine, cannabis, club drug, cocaine, hallucinogen, heroin, nonheroin opioid, sedative/tranquilizer, and/or solvent/inhalant use disorders. Results Prevalences of 12-month and lifetime DUD were 3.9% and 9.9%, respectively. Drug use disorder was generally greater among men, white and Native American individuals, younger and previously or never married adults, those with lower education and income, and those residing in the West. Significant associations were found between 12-month and lifetime DUD and other substance use disorders. 
Significant associations were also found between any 12-month DUD and major depressive disorder (odds ratio [OR], 1.3; 95% CI, 1.09-1.64), dysthymia (OR, 1.5; 95% CI, 1.09-2.02), bipolar I (OR, 1.5; 95% CI, 1.06-2.05), posttraumatic stress disorder (OR, 1.6; 95% CI, 1.27-2.10), and antisocial (OR, 1.4; 95% CI, 1.11-1.75), borderline (OR, 1.8; 95% CI, 1.41-2.24), and schizotypal (OR, 1.5; 95% CI, 1.18-1.87) personality disorders. Similar associations were found for any lifetime DUD with the exception that lifetime DUD was also associated with generalized anxiety disorder (OR, 1.3; 95% CI, 1.06-1.49), panic disorder (OR, 1.3; 95% CI, 1.06-1.59), and social phobia (OR, 1.3; 95% CI, 1.09-1.64). Twelve-month DUD was associated with significant disability, increasing with DUD severity. Among respondents with 12-month and lifetime DUD, only 13.5% and 24.6% received treatment, respectively. Conclusions and Relevance DSM-5 DUD is a common, highly comorbid, and disabling disorder that largely goes untreated in the United States. These findings indicate the need for additional studies to understand the broad relationships in more detail; estimate present-day economic costs of DUDs; investigate hypotheses regarding etiology, chronicity, and treatment use; and provide information to policy makers about allocation of resources for service delivery and research. Findings also indicate an urgent need to destigmatize DUD and educate the public, clinicians, and policy makers about its treatment to encourage affected individuals to obtain help.
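The odds ratios and 95% confidence intervals reported above follow the standard Wald construction on the log scale; the sketch below (with hypothetical counts, not the NESARC-III data) shows how such an interval is computed from a 2x2 table.

```python
import math

# Illustrative odds ratio with Wald 95% CI from a 2x2 table.
# Counts are hypothetical, not taken from the survey reported above.
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b: exposed with/without outcome; c, d: unexposed with/without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An interval whose lower bound exceeds 1 (e.g. the borderline personality disorder CI of 1.41-2.24 above) corresponds to a statistically significant association at the 5% level.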
••
Nicholas J Kassebaum1, Ryan M Barber1, Zulfiqar A Bhutta2, Zulfiqar A Bhutta3 +613 more • Institutions (272)
TL;DR: In this article, the authors quantified maternal mortality throughout the world by underlying cause and age from 1990 to 2015 for ages 10-54 years by systematically compiling and processing all available data sources from 186 of 195 countries and territories.
••
TL;DR: How to design fragment libraries, how to select screening techniques and how to make the most of information gleaned from them are discussed, and how concepts from FBDD have permeated and enhanced drug discovery efforts are shown.
Abstract: Fragment-based methods have made substantial contributions to drug discovery in the past 20 years, particularly for challenging targets. Erlanson and colleagues discuss progress in the field, key aspects such as the design of fragment libraries and the choice of screening technique, and how current challenges in fragment-based drug discovery might be overcome.
••
TL;DR: It is argued that there is a sound principled reason for including both the costs and benefits of unrelated care in technology appraisals, and that changing this practice would have material consequences for decisions about reimbursing particular technologies.
Abstract: In this editorial, we consider the vexing issue of 'unrelated future costs' (for example, the costs of caring for people with dementia or kidney failure after preventing their deaths from a heart attack). The National Institute of Health and Care Excellence (NICE) guidance is not to take such costs into account in technology appraisals. However, standard appraisal practice involves modelling the benefits of those unrelated technologies. We argue that there is a sound principled reason for including both the costs and benefits of unrelated care. Changing this practice would have material consequences for decisions about reimbursing particular technologies, and we urge future research to understand this better. Copyright © 2016 John Wiley & Sons, Ltd.
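A toy calculation (with made-up figures, not drawn from the editorial) illustrates why the practice matters: adding unrelated future costs to the numerator of an incremental cost-effectiveness ratio (ICER) can move a technology across a reimbursement threshold.

```python
# Toy ICER calculation; all figures are hypothetical assumptions,
# not taken from the editorial or NICE guidance.
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: cost per QALY gained."""
    return delta_cost / delta_qaly

related_cost = 10_000.0          # incremental cost of the intervention itself
qaly_gain = 1.0                  # incremental QALYs gained
unrelated_future_cost = 12_000.0 # e.g. unrelated care during added life-years

icer_excluding = icer(related_cost, qaly_gain)
icer_including = icer(related_cost + unrelated_future_cost, qaly_gain)
```

With a notional threshold of 20,000 per QALY, the same intervention is cost-effective when unrelated future costs are excluded (10,000 per QALY) but not when they are included (22,000 per QALY), which is the material consequence the editorial highlights.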
••
TL;DR: If suitably designed, supramolecular gels can be recyclable and environmentally benign, while the responsive and tunable nature of the self-assembled network offers significant advantages over other materials solutions to problems caused by pollution in an environmental setting.
Abstract: This review explores supramolecular gels as materials for environmental remediation. These soft materials are formed by self-assembling low-molecular-weight building blocks, which can be programmed with molecular-scale information by simple organic synthesis. The resulting gels often have nanoscale ‘solid-like’ networks which are sample-spanning within a ‘liquid-like’ solvent phase. There is intimate contact between the solvent and the gel nanostructure, which has a very high effective surface area as a result of its dimensions. As such, these materials have the ability to bring a solid-like phase into contact with liquids in an environmental setting. Such materials can therefore remediate unwanted pollutants from the environment including: immobilisation of oil spills, removal of dyes, extraction of heavy metals or toxic anions, and the detection or removal of chemical weapons. Controlling the interactions between the gel nanofibres and pollutants can lead to selective uptake and extraction. Furthermore, if suitably designed, such materials can be recyclable and environmentally benign, while the responsive and tunable nature of the self-assembled network offers significant advantages over other materials solutions to problems caused by pollution in an environmental setting.
••
TL;DR: Opportunity-cost-based cost-effectiveness thresholds estimated for low- and middle-income countries can provide a useful input to inform resource allocation decisions, and the estimates suggest that routinely used CETs have been too high.
••
TL;DR: Populations of P. aeruginosa in chronic CF lung infections typically exhibit high phenotypic diversity, including for clinically important traits such as antibiotic resistance and toxin production, and this diversity is dynamic over time, making accurate diagnosis and treatment challenging.
••
TL;DR: This report provides national estimates of levels and trends of HIV/AIDS incidence, prevalence, coverage of antiretroviral therapy (ART), and mortality for 195 countries and territories from 1980 to 2015.