
Showing papers by "University of Oxford" published in 2009


Journal ArticleDOI
TL;DR: Moher and colleagues introduce PRISMA, an update of the QUOROM guidelines for reporting systematic reviews and meta-analyses.
Abstract: David Moher and colleagues introduce PRISMA, an update of the QUOROM guidelines for reporting systematic reviews and meta-analyses

62,157 citations


Journal Article
TL;DR: The QUOROM Statement (QUality Of Reporting Of Meta-analyses) was developed to address the suboptimal reporting of systematic reviews and meta-analyses of randomized controlled trials.
Abstract: Systematic reviews and meta-analyses have become increasingly important in health care. Clinicians read them to keep up to date with their field,1,2 and they are often used as a starting point for developing clinical practice guidelines. Granting agencies may require a systematic review to ensure there is justification for further research,3 and some health care journals are moving in this direction.4 As with all research, the value of a systematic review depends on what was done, what was found, and the clarity of reporting. As with other publications, the reporting quality of systematic reviews varies, limiting readers' ability to assess the strengths and weaknesses of those reviews. Several early studies evaluated the quality of review reports. In 1987, Mulrow examined 50 review articles published in 4 leading medical journals in 1985 and 1986 and found that none met all 8 explicit scientific criteria, such as a quality assessment of included studies.5 In 1987, Sacks and colleagues6 evaluated the adequacy of reporting of 83 meta-analyses on 23 characteristics in 6 domains. Reporting was generally poor; between 1 and 14 characteristics were adequately reported (mean = 7.7; standard deviation = 2.7). A 1996 update of this study found little improvement.7 In 1996, to address the suboptimal reporting of meta-analyses, an international group developed a guidance called the QUOROM Statement (QUality Of Reporting Of Meta-analyses), which focused on the reporting of meta-analyses of randomized controlled trials.8 In this article, we summarize a revision of these guidelines, renamed PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses), which have been updated to address several conceptual and practical advances in the science of systematic reviews (Box 1). Box 1 Conceptual issues in the evolution from QUOROM to PRISMA

46,935 citations


Journal ArticleDOI
TL;DR: An Explanation and Elaboration of the PRISMA Statement is presented, providing updated guidance for the reporting of systematic reviews and meta-analyses.
Abstract: Systematic reviews and meta-analyses are essential to summarize evidence relating to efficacy and safety of health care interventions accurately and reliably. The clarity and transparency of these reports, however, is not optimal. Poor reporting of systematic reviews diminishes their value to clinicians, policy makers, and other users. Since the development of the QUOROM (QUality Of Reporting Of Meta-analysis) Statement—a reporting guideline published in 1999—there have been several conceptual, methodological, and practical advances regarding the conduct and reporting of systematic reviews and meta-analyses. Also, reviews of published systematic reviews have found that key information about these studies is often poorly reported. Realizing these issues, an international group that included experienced authors and methodologists developed PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) as an evolution of the original QUOROM guideline for systematic reviews and meta-analyses of evaluations of health care interventions. The PRISMA Statement consists of a 27-item checklist and a four-phase flow diagram. The checklist includes items deemed essential for transparent reporting of a systematic review. In this Explanation and Elaboration document, we explain the meaning and rationale for each checklist item. For each item, we include an example of good reporting and, where possible, references to relevant empirical studies and methodological literature. The PRISMA Statement, this document, and the associated Web site (http://www.prisma-statement.org/) should be helpful resources to improve reporting of systematic reviews and meta-analyses.

25,711 citations


Journal ArticleDOI
TL;DR: PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) is introduced, an update of the QUOROM guidelines for reporting systematic reviews and meta-analyses.
Abstract: Moher and colleagues introduce PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), an update of the QUOROM guidelines for reporting systematic reviews and meta-analyses. Us...

23,203 citations


Journal ArticleDOI
Abstract: Additional co-authors: TJ Heaton, AG Hogg, KA Hughen, KF Kaiser, B Kromer, SW Manning, RW Reimer, DA Richards, JR Southon, S Talamo, CSM Turney, J van der Plicht, CE Weyhenmeyer

13,605 citations


Journal ArticleDOI
23 Sep 2009-Nature
TL;DR: Identifying and quantifying planetary boundaries that must not be transgressed could help prevent human activities from causing unacceptable environmental change, argue Johan Rockstrom and colleagues.
Abstract: Identifying and quantifying planetary boundaries that must not be transgressed could help prevent human activities from causing unacceptable environmental change, argue Johan Rockstrom and colleagues.

8,837 citations


Journal ArticleDOI
TL;DR: This Explanation and Elaboration document explains the meaning and rationale for each checklist item and includes an example of good reporting and, where possible, references to relevant empirical studies and methodological literature.

8,021 citations


Journal ArticleDOI
08 Oct 2009-Nature
TL;DR: This paper examined potential sources of missing heritability and proposed research strategies, including and extending beyond current genome-wide association approaches, to illuminate the genetics of complex diseases and enhance its potential to enable effective disease prevention or treatment.
Abstract: Genome-wide association studies have identified hundreds of genetic variants associated with complex human diseases and traits, and have provided valuable insights into their genetic architecture. Most variants identified so far confer relatively small increments in risk, and explain only a small proportion of familial clustering, leading many to question how the remaining, 'missing' heritability can be explained. Here we examine potential sources of missing heritability and propose research strategies, including and extending beyond current genome-wide association approaches, to illuminate the genetics of complex diseases and enhance its potential to enable effective disease prevention or treatment.

7,797 citations


Journal ArticleDOI
TL;DR: An overview of the main model components used in chronological analysis, their mathematical formulation, and examples of how such analyses can be performed using the latest version of the OxCal software (v4) are given.
Abstract: If radiocarbon measurements are to be used at all for chronological purposes, we have to use statistical methods for calibration. The most widely used method of calibration can be seen as a simple application of Bayesian statistics, which uses both the information from the new measurement and information from the 14C calibration curve. In most dating applications, however, we have larger numbers of 14C measurements and we wish to relate those to events in the past. Bayesian statistics provides a coherent framework in which such analysis can be performed and is becoming a core element in many 14C dating projects. This article gives an overview of the main model components used in chronological analysis, their mathematical formulation, and examples of how such analyses can be performed using the latest version of the OxCal software (v4). Many such models can be put together, in a modular fashion, from simple elements, with defined constraints and groupings. In other cases, the commonly used "uniform phase" models might not be appropriate, and ramped, exponential, or normal distributions of events might be more useful. When considering analyses of these kinds, it is useful to be able to run simulations on synthetic data. Methods for performing such tests are discussed here along with other methods of diagnosing possible problems with statistical models of this kind.
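
As a sketch of the single-measurement calibration step described above (standard notation, not taken from the abstract or from OxCal's documentation): for a measured radiocarbon age d with laboratory uncertainty sigma, and a calibration curve with mean mu(theta) and uncertainty sigma_cal(theta) at calendar date theta, the calibrated date is the posterior

\[ p(\theta \mid d) \propto p(d \mid \theta)\,p(\theta), \qquad p(d \mid \theta) = \frac{1}{\sqrt{2\pi\left(\sigma^2 + \sigma_{\mathrm{cal}}^2(\theta)\right)}} \exp\!\left(-\frac{\left(d - \mu(\theta)\right)^2}{2\left(\sigma^2 + \sigma_{\mathrm{cal}}^2(\theta)\right)}\right), \]

with a prior p(theta) that is uniform for a single date and that, in the multi-measurement models discussed in the article, encodes the constraints and groupings (sequences, phases) between events.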

6,323 citations


Journal ArticleDOI
TL;DR: In this article, the Wilkinson Microwave Anisotropy Probe (WMAP) 5-year data were used to constrain the physics of cosmic inflation via Gaussianity, adiabaticity, the power spectrum of primordial fluctuations, gravitational waves, and spatial curvature.
Abstract: The Wilkinson Microwave Anisotropy Probe (WMAP) 5-year data provide stringent limits on deviations from the minimal, six-parameter Λ cold dark matter model. We report these limits and use them to constrain the physics of cosmic inflation via Gaussianity, adiabaticity, the power spectrum of primordial fluctuations, gravitational waves, and spatial curvature. We also constrain models of dark energy via its equation of state, parity-violating interaction, and neutrino properties, such as mass and the number of species. We detect no convincing deviations from the minimal model. The six parameters and the corresponding 68% uncertainties, derived from the WMAP data combined with the distance measurements from the Type Ia supernovae (SN) and the Baryon Acoustic Oscillations (BAO) in the distribution of galaxies, are: Ω_b h² = 0.02267 +0.00058/−0.00059, Ω_c h² = 0.1131 ± 0.0034, Ω_Λ = 0.726 ± 0.015, n_s = 0.960 ± 0.013, τ = 0.084 ± 0.016, and Δ²_R = (2.445 ± 0.096) × 10⁻⁹ at k = 0.002 Mpc⁻¹. From these, we derive σ_8 = 0.812 ± 0.026, H_0 = 70.5 ± 1.3 km s⁻¹ Mpc⁻¹, Ω_b = 0.0456 ± 0.0015, Ω_c = 0.228 ± 0.013, Ω_m h² = 0.1358 +0.0037/−0.0036, z_reion = 10.9 ± 1.4, and t_0 = 13.72 ± 0.12 Gyr. With the WMAP data combined with BAO and SN, we find the limit on the tensor-to-scalar ratio of r < 0.22 (95% CL); n_s > 1 is disfavored even when gravitational waves are included, which constrains the models of inflation that can produce significant gravitational waves, such as chaotic or power-law inflation models, or a blue spectrum, such as hybrid inflation models. We obtain tight, simultaneous limits on the (constant) equation of state of dark energy and the spatial curvature of the universe: −0.14 < 1 + w < 0.12 (95% CL) and −0.0179 < Ω_k < 0.0081 (95% CL). We provide a set of WMAP distance priors, to test a variety of dark energy models with spatial curvature. We test a time-dependent w with a present value constrained as −0.33 < 1 + w_0 < 0.21 (95% CL). Temperature and dark matter fluctuations are found to obey the adiabatic relation to within 8.9% and 2.1% for the axion-type and curvaton-type dark matter, respectively. The power spectra of TB and EB correlations constrain a parity-violating interaction, which rotates the polarization angle and converts E to B. The polarization angle could not have been rotated by more than −5.9° < Δα < 2.4° (95% CL) between the decoupling and the present epochs. We find a limit on the total mass of massive neutrinos of ∑m_ν < 0.67 eV (95% CL), which is free from the uncertainty in the normalization of the large-scale structure data. The number of relativistic degrees of freedom (dof), expressed in units of the effective number of neutrino species, is constrained as N_eff = 4.4 ± 1.5 (68%), consistent with the standard value of 3.04. Finally, quantitative limits on physically motivated primordial non-Gaussianity parameters are −9 < f_NL^local < 111 (95% CL) and −151 < f_NL^equil < 253 (95% CL) for the local and equilateral models, respectively.
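
A quick consistency check using only the numbers quoted above (worked arithmetic added for this listing, not part of the original abstract): the derived matter density is the sum of the baryon and cold dark matter densities,

\[ \Omega_m h^2 = \Omega_b h^2 + \Omega_c h^2 = 0.02267 + 0.1131 \approx 0.1358, \]

and with h = H_0 / (100 km s⁻¹ Mpc⁻¹) = 0.705,

\[ \Omega_b = \frac{0.02267}{0.705^2} \approx 0.0456, \qquad \Omega_c = \frac{0.1131}{0.705^2} \approx 0.228, \]

matching the derived values quoted in the abstract.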

5,904 citations


Journal ArticleDOI
TL;DR: The authors propose a new approach to global sustainability in which they define planetary boundaries within which humanity can be expected to operate safely. The proposed concept of "planetary boundaries" lays the groundwork for shifting our approach to governance and management away from the essentially sectoral analyses of limits to growth aimed at minimizing negative externalities, toward the estimation of the safe space for human development.
Abstract: Anthropogenic pressures on the Earth System have reached a scale where abrupt global environmental change can no longer be excluded. We propose a new approach to global sustainability in which we define planetary boundaries within which we expect that humanity can operate safely. Transgressing one or more planetary boundaries may be deleterious or even catastrophic due to the risk of crossing thresholds that will trigger non-linear, abrupt environmental change within continental- to planetary-scale systems. We have identified nine planetary boundaries and, drawing upon current scientific understanding, we propose quantifications for seven of them. These seven are climate change (CO2 concentration in the atmosphere <350 ppm and/or a maximum change of +1 W m-2 in radiative forcing); ocean acidification (mean surface seawater saturation state with respect to aragonite ≥ 80% of pre-industrial levels); stratospheric ozone (<5% reduction in O3 concentration from pre-industrial level of 290 Dobson Units); biogeochemical nitrogen (N) cycle (limit industrial and agricultural fixation of N2 to 35 Tg N yr-1) and phosphorus (P) cycle (annual P inflow to oceans not to exceed 10 times the natural background weathering of P); global freshwater use (<4000 km3 yr-1 of consumptive use of runoff resources); land system change (<15% of the ice-free land surface under cropland); and the rate at which biological diversity is lost (annual rate of <10 extinctions per million species). The two additional planetary boundaries for which we have not yet been able to determine a boundary level are chemical pollution and atmospheric aerosol loading. We estimate that humanity has already transgressed three planetary boundaries: for climate change, rate of biodiversity loss, and changes to the global nitrogen cycle. Planetary boundaries are interdependent, because transgressing one may both shift the position of other boundaries or cause them to be transgressed. The social impacts of transgressing boundaries will be a function of the social-ecological resilience of the affected societies. Our proposed boundaries are rough, first estimates only, surrounded by large uncertainties and knowledge gaps. Filling these gaps will require major advancements in Earth System and resilience science. The proposed concept of "planetary boundaries" lays the groundwork for shifting our approach to governance and management, away from the essentially sectoral analyses of limits to growth aimed at minimizing negative externalities, toward the estimation of the safe space for human development. Planetary boundaries define, as it were, the boundaries of the "planetary playing field" for humanity if we want to be sure of avoiding major human-induced environmental change on a global scale.
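
The seven boundaries quantified in the abstract can be restated compactly as a small data structure; the sketch below (in Python, with illustrative key names) only repeats the numbers given above.

# Proposed planetary boundaries as quantified in the abstract (illustrative restatement).
planetary_boundaries = {
    "climate_change": "atmospheric CO2 < 350 ppm and/or radiative forcing change <= +1 W m^-2",
    "ocean_acidification": "mean surface aragonite saturation state >= 80% of pre-industrial level",
    "stratospheric_ozone": "< 5% reduction in O3 from the pre-industrial level of 290 Dobson Units",
    "biogeochemical_cycles": {
        "nitrogen": "industrial and agricultural fixation of N2 <= 35 Tg N per year",
        "phosphorus": "annual P inflow to oceans <= 10x natural background weathering",
    },
    "global_freshwater_use": "< 4000 km^3 per year of consumptive use of runoff resources",
    "land_system_change": "< 15% of the ice-free land surface under cropland",
    "biodiversity_loss": "annual rate of < 10 extinctions per million species",
}

# Boundaries not yet quantified, and those the authors estimate have already been transgressed.
not_yet_quantified = ["chemical_pollution", "atmospheric_aerosol_loading"]
already_transgressed = ["climate_change", "biodiversity_loss", "nitrogen_cycle"]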

Journal ArticleDOI
TL;DR: It is concluded that the full repertoire of functional networks utilized by the brain in action is continuously and dynamically “active” even when at “rest.”
Abstract: Neural connections, providing the substrate for functional networks, exist whether or not they are functionally active at any given moment. However, it is not known to what extent brain regions are continuously interacting when the brain is “at rest.” In this work, we identify the major explicit activation networks by carrying out an image-based activation network analysis of thousands of separate activation maps derived from the BrainMap database of functional imaging studies, involving nearly 30,000 human subjects. Independently, we extract the major covarying networks in the resting brain, as imaged with functional magnetic resonance imaging in 36 subjects at rest. The sets of major brain networks, and their decompositions into subnetworks, show close correspondence between the independent analyses of resting and activation brain dynamics. We conclude that the full repertoire of functional networks utilized by the brain in action is continuously and dynamically “active” even when at “rest.”

Journal ArticleDOI
Shaun Purcell1, Shaun Purcell2, Naomi R. Wray3, Jennifer Stone1, Jennifer Stone2, Peter M. Visscher, Michael Conlon O'Donovan4, Patrick F. Sullivan5, Pamela Sklar2, Pamela Sklar1, Douglas M. Ruderfer, Andrew McQuillin, Derek W. Morris6, Colm O'Dushlaine6, Aiden Corvin6, Peter Holmans4, Stuart MacGregor3, Hugh Gurling, Douglas Blackwood7, Nicholas John Craddock5, Michael Gill6, Christina M. Hultman8, Christina M. Hultman9, George Kirov4, Paul Lichtenstein8, Walter J. Muir7, Michael John Owen4, Carlos N. Pato10, Edward M. Scolnick2, Edward M. Scolnick1, David St Clair, Nigel Williams4, Lyudmila Georgieva4, Ivan Nikolov4, Nadine Norton4, Hywel Williams4, Draga Toncheva, Vihra Milanova, Emma Flordal Thelander8, Patrick Sullivan11, Elaine Kenny6, Emma M. Quinn6, Khalid Choudhury12, Susmita Datta12, Jonathan Pimm12, Srinivasa Thirumalai13, Vinay Puri12, Robert Krasucki12, Jacob Lawrence12, Digby Quested14, Nicholas Bass12, Caroline Crombie15, Gillian Fraser15, Soh Leh Kuan, Nicholas Walker, Kevin A. McGhee7, Ben S. Pickard16, P. Malloy7, Alan W Maclean7, Margaret Van Beck7, Michele T. Pato10, Helena Medeiros10, Frank A. Middleton17, Célia Barreto Carvalho10, Christopher P. Morley17, Ayman H. Fanous, David V. Conti10, James A. Knowles10, Carlos Ferreira, António Macedo18, M. Helena Azevedo18, Andrew Kirby1, Andrew Kirby2, Manuel A. R. Ferreira1, Manuel A. R. Ferreira2, Mark J. Daly2, Mark J. Daly1, Kimberly Chambert1, Finny G Kuruvilla1, Stacey Gabriel1, Kristin G. Ardlie1, Jennifer L. Moran1 
06 Aug 2009-Nature
TL;DR: Using two analytic approaches, the authors show the extent to which common genetic variation underlies the risk of schizophrenia, implicating the major histocompatibility complex and providing evidence for a polygenic component involving thousands of common alleles of very small effect.
Abstract: Schizophrenia is a severe mental disorder with a lifetime risk of about 1%, characterized by hallucinations, delusions and cognitive deficits, with heritability estimated at up to 80%(1,2). We performed a genome-wide association study of 3,322 European individuals with schizophrenia and 3,587 controls. Here we show, using two analytic approaches, the extent to which common genetic variation underlies the risk of schizophrenia. First, we implicate the major histocompatibility complex. Second, we provide molecular genetic evidence for a substantial polygenic component to the risk of schizophrenia involving thousands of common alleles of very small effect. We show that this component also contributes to the risk of bipolar disorder, but not to several non-psychiatric diseases.
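
The "substantial polygenic component" above was assessed with a score analysis: per-allele effects estimated in a discovery sample are used to compute, for each individual in an independent target sample, a weighted sum of risk-allele counts, which is then tested for association with case status. A minimal sketch of that idea follows (in Python; function and variable names are illustrative, and the study's actual implementation differs in detail, e.g. in how SNPs are selected and how the scores are evaluated).

import numpy as np

def polygenic_scores(genotypes, effect_sizes, include):
    """Weighted sum of risk-allele counts per individual.

    genotypes    : (n_individuals, n_snps) array of 0/1/2 allele counts
    effect_sizes : (n_snps,) effect estimates from a discovery GWAS
    include      : (n_snps,) boolean mask of SNPs passing a discovery P-value threshold
    """
    return genotypes[:, include] @ effect_sizes[include]

# Toy usage with random data (illustrative only).
rng = np.random.default_rng(0)
geno = rng.integers(0, 3, size=(100, 1000))   # 100 people, 1000 SNPs
beta = rng.normal(0.0, 0.02, size=1000)       # small per-SNP effects
mask = rng.random(1000) < 0.5                 # SNPs below a nominal P threshold
scores = polygenic_scores(geno, beta, mask)   # one score per individual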

Journal ArticleDOI
TL;DR: A new method is proposed which attempts to keep the sensitivity benefits of cluster-based thresholding (and indeed the general concept of "clusters" of signal), while avoiding (or at least minimising) these problems, and is referred to as "threshold-free cluster enhancement" (TFCE).
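
For context, the TFCE score summarized in this TL;DR is usually defined as an integral over cluster-forming thresholds: for each voxel p with statistic value h_p, the extent e(h) of the cluster containing p at threshold h is accumulated with weights on extent and height,

\[ \mathrm{TFCE}(p) = \int_{0}^{h_p} e(h)^{E}\, h^{H}\, dh, \]

with E ≈ 0.5 and H ≈ 2 as the commonly used defaults. This form is quoted here from the general TFCE literature as a sketch, not from the text of this entry.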

Journal ArticleDOI
TL;DR: The Consolidated Standards of Reporting Trials (CONSORT) statement provides readers of RCTs with a list of criteria useful to assess trial validity (for full details visit www.consortstatement.org).
Abstract: Habilitation programmes for intellectual disability are primitive in developing countries (Heron & Myers, 1983). Resources to develop specialist care are scarce in these nations. One compensatory option for this deficit is to facilitate the primary care-giver to take on the role of therapist (McLoughlin, 1992), because parents are the focus of intervention (Myreddi, 1992). Parental attitude influences the development and training of the developmentally disabled child (Beckett-Edwards, 1994) and is a dynamic adaptational process subject to change (Gallimore et al, 1993). Changes in parental attitude occur with intervention (Bruiner & Beck, 1984; Sameroff & ..., 1990). Interventions with parents are varied (Girimaji, 1993), including a model with an opportunity to raise questions and discuss problems over a period of time (Stephens & Wyatt, 1969; Cunningham et al, 1993). This randomised-controlled trial evaluates the efficacy of Interactive Group psychoeducation (IGP) in changing attitudes towards children with intellectual disability. Method: Fifty-seven parents randomised to 10 weeks of experimental and control therapy were assessed using the Parental Attitude Scale towards the Management of Intellectual Disability. The pre- and post-intervention measurements were done by a single-blinded rater and compared. Results: The intervention group had a statistically significant increase in the outcome scores and clinical improvement in the total parental attitude score, orientation towards child-rearing, ...

Journal ArticleDOI
20 Feb 2009-Cell
TL;DR: The evolution of long noncoding RNAs and their roles in transcriptional regulation, epigenetic gene regulation, and disease are reviewed.

Journal ArticleDOI
TL;DR: It is found that imputation accuracy can be greatly enhanced by expanding the reference panel to contain thousands of chromosomes and that IMPUTE v2 outperforms other methods in this setting at both rare and common SNPs, with overall error rates that are 15%–20% lower than those of the closest competing method.
Abstract: Genotype imputation methods are now being widely used in the analysis of genome-wide association studies. Most imputation analyses to date have used the HapMap as a reference dataset, but new reference panels (such as controls genotyped on multiple SNP chips and densely typed samples from the 1,000 Genomes Project) will soon allow a broader range of SNPs to be imputed with higher accuracy, thereby increasing power. We describe a genotype imputation method (IMPUTE version 2) that is designed to address the challenges presented by these new datasets. The main innovation of our approach is a flexible modelling framework that increases accuracy and combines information across multiple reference panels while remaining computationally feasible. We find that IMPUTE v2 attains higher accuracy than other methods when the HapMap provides the sole reference panel, but that the size of the panel constrains the improvements that can be made. We also find that imputation accuracy can be greatly enhanced by expanding the reference panel to contain thousands of chromosomes and that IMPUTE v2 outperforms other methods in this setting at both rare and common SNPs, with overall error rates that are 15%–20% lower than those of the closest competing method. One particularly challenging aspect of next-generation association studies is to integrate information across multiple reference panels genotyped on different sets of SNPs; we show that our approach to this problem has practical advantages over other suggested solutions.
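
As a conceptual illustration of reference-panel imputation (a deliberately simplified toy in Python, not the haplotype-based hidden Markov model that IMPUTE v2 actually implements): a genotype at an untyped SNP can be estimated by finding the reference samples that best match a study sample at the SNPs that were typed and averaging their alleles at the untyped SNP.

import numpy as np

def impute_missing(study_genotypes, reference_genotypes, typed, untyped, k=10):
    """Toy imputation: average the untyped-SNP genotypes of the k reference
    samples closest (L1 distance over typed SNPs) to each study sample.

    study_genotypes     : (n_study, n_typed) array of 0/1/2 allele counts
    reference_genotypes : (n_ref, n_snps) array of 0/1/2 allele counts
    typed, untyped      : index arrays of typed and untyped SNP columns
    """
    ref_typed = reference_genotypes[:, typed]            # reference restricted to typed SNPs
    imputed = np.empty((study_genotypes.shape[0], len(untyped)))
    for i, g in enumerate(study_genotypes):
        dist = np.abs(ref_typed - g).sum(axis=1)         # mismatch with each reference sample
        nearest = np.argsort(dist)[:k]                   # k best-matching reference samples
        imputed[i] = reference_genotypes[np.ix_(nearest, untyped)].mean(axis=0)
    return imputed                                       # expected allele dosages in [0, 2]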

Journal ArticleDOI
TL;DR: Below the range 22.5-25 kg/m², BMI was associated inversely with overall mortality, mainly because of strong inverse associations with respiratory disease and lung cancer, despite cigarette consumption per smoker varying little with BMI.

Journal ArticleDOI
TL;DR: This revision of the consensus algorithm for the medical management of type 2 diabetes focuses on the new classes of medications that now have more clinical data and experience and addresses safety issues surrounding the thiazolidinediones.
Abstract: The consensus algorithm for the medical management of type 2 diabetes was published in August 2006 with the expectation that it would be updated, based on the availability of new interventions and new evidence to establish their clinical role. The authors continue to endorse the principles used to develop the algorithm and its major features. We are sensitive to the risks of changing the algorithm cavalierly or too frequently, without compelling new information. An update to the consensus algorithm published in January 2008 specifically addressed safety issues surrounding the thiazolidinediones. In this revision, we focus on the new classes of medications that now have more clinical data and experience.

Journal ArticleDOI
TL;DR: Provide a structured summary including, as applicable: background; objectives; data sources; study eligibility criteria, participants, and interventions; study appraisal and synthesis methods; results; limitations; conclusions and implications of key findings; and systematic review registration number.
Abstract: Provide a structured summary including, as applicable: background; objectives; data sources; study eligibility criteria, participants, and interventions; study appraisal and synthesis methods; results; limitations; conclusions and implications of key findings; and systematic review registration number. (PRISMA checklist item 2: Structured summary)

Journal ArticleDOI
TL;DR: The cellular sources of these cytokines, receptor signaling pathways, and induced markers and gene signatures are reviewed and the concept of macrophage activation in the context of the immune response is revisit.
Abstract: Macrophages are innate immune cells with well-established roles in the primary response to pathogens, but also in tissue homeostasis, coordination of the adaptive immune response, inflammation, resolution, and repair. These cells recognize danger signals through receptors capable of inducing specialized activation programs. The classically known macrophage activation is induced by IFN-gamma, which triggers a harsh proinflammatory response that is required to kill intracellular pathogens. Macrophages also undergo alternative activation by IL-4 and IL-13, which trigger a different phenotype that is important for the immune response to parasites. Here we review the cellular sources of these cytokines, receptor signaling pathways, and induced markers and gene signatures. We draw attention to discrepancies found between mouse and human models of alternative activation. The evidence for in vivo alternative activation of macrophages is also analyzed, with nematode infection as prototypic disease. Finally, we revisit the concept of macrophage activation in the context of the immune response.

Journal ArticleDOI
30 Apr 2009-Nature
TL;DR: A comprehensive probabilistic analysis aimed at quantifying GHG emission budgets for the 2000–50 period that would limit warming throughout the twenty-first century to below 2 °C, based on a combination of published distributions of climate system properties and observational constraints is provided.
Abstract: More than 100 countries have adopted a global warming limit of 2 degrees C or below (relative to pre-industrial levels) as a guiding principle for mitigation efforts to reduce climate change risks, impacts and damages. However, the greenhouse gas (GHG) emissions corresponding to a specified maximum warming are poorly known owing to uncertainties in the carbon cycle and the climate response. Here we provide a comprehensive probabilistic analysis aimed at quantifying GHG emission budgets for the 2000-50 period that would limit warming throughout the twenty-first century to below 2 degrees C, based on a combination of published distributions of climate system properties and observational constraints. We show that, for the chosen class of emission scenarios, both cumulative emissions up to 2050 and emission levels in 2050 are robust indicators of the probability that twenty-first century warming will not exceed 2 degrees C relative to pre-industrial temperatures. Limiting cumulative CO(2) emissions over 2000-50 to 1,000 Gt CO(2) yields a 25% probability of warming exceeding 2 degrees C-and a limit of 1,440 Gt CO(2) yields a 50% probability-given a representative estimate of the distribution of climate system properties. As known 2000-06 CO(2) emissions were approximately 234 Gt CO(2), less than half the proven economically recoverable oil, gas and coal reserves can still be emitted up to 2050 to achieve such a goal. Recent G8 Communiques envisage halved global GHG emissions by 2050, for which we estimate a 12-45% probability of exceeding 2 degrees C-assuming 1990 as emission base year and a range of published climate sensitivity distributions. Emissions levels in 2020 are a less robust indicator, but for the scenarios considered, the probability of exceeding 2 degrees C rises to 53-87% if global GHG emissions are still more than 25% above 2000 levels in 2020.
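
The headline budget figures above imply a simple subtraction (worked arithmetic added here for clarity, not quoted from the abstract): with roughly 234 Gt CO2 already emitted over 2000-06, about

\[ 1000 - 234 \approx 766\ \mathrm{Gt\,CO_2} \]

remained of the budget associated with a 25% probability of exceeding 2 °C, and about 1440 - 234 ≈ 1206 Gt CO2 of the budget associated with a 50% probability; the abstract notes that these remainders amount to less than half of the proven economically recoverable oil, gas and coal reserves.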

Journal ArticleDOI
TL;DR: This systematic review of population-based studies of the incidence and early (21 days to 1 month) case fatality of stroke is based on studies published from 1970 to 2008.
Abstract: This systematic review of population-based studies of the incidence and early (21 days to 1 month) case fatality of stroke is based on studies published from 1970 to 2008. Stroke incidence (incident strokes only) and case fatality from 21 days to 1 month post-stroke were analysed by four decades of study, two country income groups (high-income countries and low to middle income countries, in accordance with the World Bank's country classification) and, when possible, by stroke pathological type: ischaemic stroke, primary intracerebral haemorrhage, and subarachnoid haemorrhage. This Review shows a divergent, statistically significant trend in stroke incidence rates over the past four decades, with a 42% decrease in stroke incidence in high-income countries and a greater than 100% increase in stroke incidence in low to middle income countries. In 2000-08, the overall stroke incidence rates in low to middle income countries have, for the first time, exceeded the level of stroke incidence seen in high-income countries, by 20%. The time to decide whether or not stroke is an issue that should be on the governmental agenda in low to middle income countries has now passed. Now is the time for action.

Journal ArticleDOI
TL;DR: This paper describes how Bayesian techniques have made a significant impact on tackling neuroimaging problems, particularly through the analysis tools in the FMRIB Software Library (FSL).


Journal ArticleDOI
25 Jun 2009-Nature
TL;DR: It is shown that the new swine-origin influenza A (H1N1) virus that emerged in Mexico and the United States was derived from several viruses circulating in swine, and that the initial transmission to humans occurred several months before recognition of the outbreak.
Abstract: In March and early April 2009, a new swine-origin influenza A (H1N1) virus (S-OIV) emerged in Mexico and the United States. During the first few weeks of surveillance, the virus spread worldwide to 30 countries (as of May 11) by human-to-human transmission, causing the World Health Organization to raise its pandemic alert to level 5 of 6. This virus has the potential to develop into the first influenza pandemic of the twenty-first century. Here we use evolutionary analysis to estimate the timescale of the origins and the early development of the S-OIV epidemic. We show that it was derived from several viruses circulating in swine, and that the initial transmission to humans occurred several months before recognition of the outbreak. A phylogenetic estimate of the gaps in genetic surveillance indicates a long period of unsampled ancestry before the S-OIV outbreak, suggesting that the reassortment of swine lineages may have occurred years before emergence in humans, and that the multiple genetic ancestry of S-OIV is not indicative of an artificial origin. Furthermore, the unsampled history of the epidemic means that the nature and location of the genetically closest swine viruses reveal little about the immediate origin of the epidemic, despite the fact that we included a panel of closely related and previously unpublished swine influenza isolates. Our results highlight the need for systematic surveillance of influenza in swine, and provide evidence that the mixing of new genetic elements in swine can result in the emergence of viruses with pandemic potential in humans.

Journal ArticleDOI
19 Jun 2009-Science
TL;DR: By analyzing the outbreak in Mexico, early data on international spread, and viral genetic diversity, the authors make an early assessment of transmissibility and severity, finding transmissibility substantially higher than that of seasonal flu and comparable with lower estimates of R0 obtained from previous influenza pandemics.
Abstract: A novel influenza A (H1N1) virus has spread rapidly across the globe. Judging its pandemic potential is difficult with limited data, but nevertheless essential to inform appropriate health responses. By analyzing the outbreak in Mexico, early data on international spread, and viral genetic diversity, we make an early assessment of transmissibility and severity. Our estimates suggest that 23,000 (range 6000 to 32,000) individuals had been infected in Mexico by late April, giving an estimated case fatality ratio (CFR) of 0.4% (range: 0.3 to 1.8%) based on confirmed and suspected deaths reported to that time. In a community outbreak in the small community of La Gloria, Veracruz, no deaths were attributed to infection, giving an upper 95% bound on CFR of 0.6%. Thus, although substantial uncertainty remains, clinical severity appears less than that seen in the 1918 influenza pandemic but comparable with that seen in the 1957 pandemic. Clinical attack rates in children in La Gloria were twice those in adults (≥15 years: 29%). Three different epidemiological analyses gave basic reproduction number (R0) estimates in the range of 1.4 to 1.6, whereas a genetic analysis gave a central estimate of 1.2. This range of values is consistent with 14 to 73 generations of human-to-human transmission having occurred in Mexico to late April. Transmissibility is therefore substantially higher than that of seasonal flu, and comparable with lower estimates of R0 obtained from previous influenza pandemics.
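
A back-of-envelope reading of the transmission-generation figures (an illustrative approximation added here, not the paper's method): if each generation multiplies infections by roughly R0, then reaching N cumulative infections from a single introduction takes about ln(N)/ln(R0) generations. Using only the ranges quoted above, this gives roughly 20 to 60 generations, in the same ballpark as the published 14 to 73 range, which comes from the paper's fuller uncertainty analysis.

import math

def generations(n_infected, r0):
    """Approximate number of transmission generations to reach n_infected cases."""
    return math.log(n_infected) / math.log(r0)

print(round(generations(23_000, 1.4)))  # ~30, central estimates quoted above
print(round(generations(6_000, 1.6)))   # ~19, fewer infections and faster spread
print(round(generations(32_000, 1.2)))  # ~57, more infections and slower spread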

Journal ArticleDOI
TL;DR: The Wilkinson Microwave Anisotropy Probe (WMAP) is a Medium-Class Explorer (MIDEX) satellite aimed at elucidating cosmology through full-sky observations of the cosmic microwave background (CMB).
Abstract: The Wilkinson Microwave Anisotropy Probe (WMAP) is a Medium-Class Explorer (MIDEX) satellite aimed at elucidating cosmology through full-sky observations of the cosmic microwave background (CMB). The WMAP full-sky maps of the temperature and polarization anisotropy in five frequency bands provide our most accurate view to date of conditions in the early universe. The multi-frequency data facilitate the separation of the CMB signal from foreground emission arising both from our Galaxy and from extragalactic sources. The CMB angular power spectrum derived from these maps exhibits a highly coherent acoustic peak structure which makes it possible to extract a wealth of information about the composition and history of the universe, as well as the processes that seeded the fluctuations. WMAP data have played a key role in establishing ΛCDM as the new standard model of cosmology (Bennett et al. 2003; Spergel et al. 2003; Hinshaw et al. 2007; Spergel et al. 2007): a flat universe dominated by dark energy, supplemented by dark matter and atoms, with density fluctuations seeded by a Gaussian, adiabatic, nearly scale-invariant process. The basic properties of this universe are determined by five numbers: the density of matter, the density of atoms, the age of the universe (or equivalently, the Hubble constant today), the amplitude of the initial fluctuations, and their scale dependence. By accurately measuring the first few peaks in the angular power spectrum, WMAP data have enabled the following accomplishments: Showing that the dark matter must be non-baryonic and interact only weakly with atoms and radiation. The WMAP measurement of the dark matter density puts important constraints on supersymmetric dark matter models and on the properties of other dark matter candidates. With five years of data and a better determination of our beam response, this measurement has been significantly improved. Precise determination of the density of atoms in the universe. The agreement between the atomic density derived from WMAP and the density inferred from the deuterium abundance is an important test of the standard big bang model. Determination of the acoustic scale at redshift z = 1090. Similarly, the recent measurement of baryon acoustic oscillations (BAO) in the galaxy power spectrum (Eisenstein et al. 2005) has determined the acoustic scale at redshift z ≈ 0.35. When combined, these standard rulers accurately measure the geometry of the universe and the properties of the dark energy. These data require a nearly flat universe dominated by dark energy consistent with a cosmological constant. Precise determination of the Hubble Constant, in conjunction with BAO observations. Even when allowing curvature (Ω₀ ≠ 1) and a free dark energy equation of state (w ≠ −1), the acoustic data determine the Hubble constant to within 3%. The measured value is in excellent agreement with independent results from the Hubble Key Project (Freedman et al. 2001), providing yet another important consistency test for the standard model. Significant constraint of the basic properties of the primordial fluctuations. The anti-correlation seen in the temperature/polarization (TE) correlation spectrum on 4° scales implies that the fluctuations are primarily adiabatic and rules out defect models and isocurvature models as the primary source of fluctuations (Peiris et al. 2003).

Journal ArticleDOI
TL;DR: It is shown that a protein nanopore with a covalently attached adapter molecule can continuously identify unlabelled nucleoside 5'-monophosphate molecules with accuracies averaging 99.8%.
Abstract: A single-molecule method for sequencing DNA that does not require fluorescent labelling could reduce costs and increase sequencing speeds. An exonuclease enzyme might be used to cleave individual nucleotide molecules from the DNA, and when coupled to an appropriate detection system, these nucleotides could be identified in the correct order. Here, we show that a protein nanopore with a covalently attached adapter molecule can continuously identify unlabelled nucleoside 5'-monophosphate molecules with accuracies averaging 99.8%. Methylated cytosine can also be distinguished from the four standard DNA bases: guanine, adenine, thymine and cytosine. The operating conditions are compatible with the exonuclease, and the kinetic data show that the nucleotides have a high probability of translocation through the nanopore and, therefore, of not being registered twice. This highly accurate tool is suitable for integration into a system for sequencing nucleic acids and for analysing epigenetic modifications. A protein nanopore with a permanent adaptor molecule can continuously identify unlabelled DNA bases with ∼99.8% accuracy. This level of performance could provide the foundation for the development of nanopore-based DNA sequencing technologies that are faster and less expensive than existing approaches.

Journal ArticleDOI
TL;DR: Several of the likely causal genes are highly expressed or known to act in the central nervous system (CNS), emphasizing, as in rare monogenic forms of obesity, the role of the CNS in predisposition to obesity.
Abstract: Common variants at only two loci, FTO and MC4R, have been reproducibly associated with body mass index (BMI) in humans. To identify additional loci, we conducted meta-analysis of 15 genome-wide association studies for BMI (n > 32,000) and followed up top signals in 14 additional cohorts (n > 59,000). We strongly confirm FTO and MC4R and identify six additional loci (P < 5 x 10(-8)): TMEM18, KCTD15, GNPDA2, SH2B1, MTCH2 and NEGR1 (where a 45-kb deletion polymorphism is a candidate causal variant). Several of the likely causal genes are highly expressed or known to act in the central nervous system (CNS), emphasizing, as in rare monogenic forms of obesity, the role of the CNS in predisposition to obesity.
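
The significance threshold quoted above, P < 5 × 10⁻⁸, is the conventional genome-wide significance level used in GWAS; it corresponds to a Bonferroni correction of a 0.05 type I error rate for roughly one million effectively independent common-variant tests (stated here for context, not taken from this abstract):

\[ \frac{0.05}{10^{6}} = 5 \times 10^{-8}. \]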