Institution
McGill University
Education • Montreal, Quebec, Canada
About: McGill University is an education organization based in Montreal, Quebec, Canada. It is known for its research contributions in the topics of Population and Poison control. The organization has 72,688 authors who have published 162,565 publications receiving 6,966,523 citations. The organization is also known as: Royal institution of advanced learning & University of McGill College.
Topics: Population, Poison control, Health care, Cancer, Receptor
Papers published on a yearly basis
Papers
Author affiliations: Johns Hopkins University School of Medicine, Johns Hopkins University, Mayo Clinic, McGill University, Harvard University, University of California, Irvine, University of Pittsburgh, Columbia University Medical Center, Eli Lilly and Company, Washington University in St. Louis, UCL Institute of Neurology, VU University Medical Center, Alzheimer's Association, Northwestern University, National Institutes of Health
TL;DR: The workgroup sought to ensure that the revised criteria would be flexible enough to be used by both general healthcare providers without access to neuropsychological testing, advanced imaging, and cerebrospinal fluid measures, and specialized investigators involved in research or in clinical trial studies who would have these tools available.
Abstract: The National Institute on Aging and the Alzheimer's Association charged a workgroup with the task of revising the 1984 criteria for Alzheimer's disease (AD) dementia. The workgroup sought to ensure that the revised criteria would be flexible enough to be used by both general healthcare providers without access to neuropsychological testing, advanced imaging, and cerebrospinal fluid measures, and specialized investigators involved in research or in clinical trial studies who would have these tools available. We present criteria for all-cause dementia and for AD dementia. We retained the general framework of probable AD dementia from the 1984 criteria. On the basis of the past 27 years of experience, we made several changes in the clinical criteria for the diagnosis. We also retained the term possible AD dementia, but redefined it in a manner more focused than before. Biomarker evidence was also integrated into the diagnostic formulations for probable and possible AD dementia for use in research settings. The core clinical criteria for AD dementia will continue to be the cornerstone of the diagnosis in clinical practice, but biomarker evidence is expected to enhance the pathophysiological specificity of the diagnosis of AD dementia. Much work lies ahead for validating the biomarker diagnosis of AD dementia.
11,067 citations
TL;DR: This study evaluated a modified, timed version of the "Get-Up and Go" Test (Mathias et al, 1986) in 60 patients referred to a Geriatric Day Hospital and suggested that the timed "Up & Go" test is a reliable and valid test for quantifying functional mobility that may also be useful in following clinical change over time.
Abstract: This study evaluated a modified, timed version of the "Get-Up and Go" Test (Mathias et al, 1986) in 60 patients referred to a Geriatric Day Hospital (mean age 79.5 years). The patient is observed and timed while he rises from an arm chair, walks 3 meters, turns, walks back, and sits down again. The results indicate that the time score is (1) reliable (inter-rater and intra-rater); (2) correlates well with log-transformed scores on the Berg Balance Scale (r = -0.81), gait speed (r = -0.61) and Barthel Index of ADL (r = -0.78); and (3) appears to predict the patient's ability to go outside alone safely. These data suggest that the timed "Up & Go" test is a reliable and valid test for quantifying functional mobility that may also be useful in following clinical change over time. The test is quick, requires no special equipment or training, and is easily included as part of the routine medical examination.
10,502 citations
TL;DR: In this article, the authors present a cosmological analysis based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation.
Abstract: This paper presents cosmological results based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation. Our results are in very good agreement with the 2013 analysis of the Planck nominal-mission temperature data, but with increased precision. The temperature and polarization power spectra are consistent with the standard spatially-flat 6-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations (denoted “base ΛCDM” in this paper). From the Planck temperature data combined with Planck lensing, for this cosmology we find a Hubble constant, H0 = (67.8 ± 0.9) km s⁻¹ Mpc⁻¹, a matter density parameter Ωm = 0.308 ± 0.012, and a tilted scalar spectral index with ns = 0.968 ± 0.006, consistent with the 2013 analysis. Note that in this abstract we quote 68% confidence limits on measured parameters and 95% upper limits on other parameters. We present the first results of polarization measurements with the Low Frequency Instrument at large angular scales. Combined with the Planck temperature and lensing data, these measurements give a reionization optical depth of τ = 0.066 ± 0.016, corresponding to a reionization redshift of . These results are consistent with those from WMAP polarization measurements cleaned for dust emission using 353-GHz polarization maps from the High Frequency Instrument. We find no evidence for any departure from base ΛCDM in the neutrino sector of the theory; for example, combining Planck observations with other astrophysical data we find Neff = 3.15 ± 0.23 for the effective number of relativistic degrees of freedom, consistent with the value Neff = 3.046 of the Standard Model of particle physics. The sum of neutrino masses is constrained to ∑ mν < 0.23 eV. The spatial curvature of our Universe is found to be very close to zero, with |ΩK| < 0.005. Adding a tensor component as a single-parameter extension to base ΛCDM we find an upper limit on the tensor-to-scalar ratio of r_0.002 < 0.11, consistent with the Planck 2013 results and consistent with the B-mode polarization constraints from a joint analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP B-mode data to our analysis leads to a tighter constraint of r_0.002 < 0.09 and disfavours inflationary models with a V(φ) ∝ φ² potential. The addition of Planck polarization data leads to strong constraints on deviations from a purely adiabatic spectrum of fluctuations. We find no evidence for any contribution from isocurvature perturbations or from cosmic defects. Combining Planck data with other astrophysical data, including Type Ia supernovae, the equation of state of dark energy is constrained to w = −1.006 ± 0.045, consistent with the expected value for a cosmological constant. The standard big bang nucleosynthesis predictions for the helium and deuterium abundances for the best-fit Planck base ΛCDM cosmology are in excellent agreement with observations. We also present constraints on annihilating dark matter and on possible deviations from the standard recombination history; in neither case do we find evidence for new physics. The Planck results for base ΛCDM are in good agreement with baryon acoustic oscillation data and with the JLA sample of Type Ia supernovae. However, as in the 2013 analysis, the amplitude of the fluctuation spectrum is found to be higher than inferred from some analyses of rich cluster counts and weak gravitational lensing.
We show that these tensions cannot easily be resolved with simple modifications of the base ΛCDM cosmology. Apart from these tensions, the base ΛCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.
10,334 citations
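A quick sanity check on these numbers: for a spatially flat ΛCDM model, the quoted H0 and Ωm alone fix the expansion history and hence the age of the universe. The short Python sketch below (not from the paper; it assumes exact flatness and neglects radiation) integrates the Friedmann equation with the quoted central values and recovers an age of roughly 13.8 Gyr.

```python
import numpy as np
from scipy.integrate import quad

# Quoted Planck 2015 base-LambdaCDM central values (flatness assumed,
# radiation neglected) -- a back-of-the-envelope check, not the paper's pipeline.
H0 = 67.8            # Hubble constant, km / s / Mpc
Om = 0.308           # matter density parameter
OL = 1.0 - Om        # dark-energy density implied by flatness

km_per_Mpc = 3.0857e19
s_per_Gyr = 3.156e16
hubble_time_Gyr = km_per_Mpc / H0 / s_per_Gyr   # 1/H0 expressed in Gyr

# Age of the universe: t0 = (1/H0) * Int_0^1 da / (a * sqrt(Om/a^3 + OL))
integrand = lambda a: 1.0 / (a * np.sqrt(Om / a**3 + OL))
integral, _ = quad(integrand, 1e-8, 1.0)

print(f"Hubble time ~ {hubble_time_Gyr:.2f} Gyr")
print(f"Age of a flat LCDM universe ~ {hubble_time_Gyr * integral:.2f} Gyr")  # ~13.8 Gyr
```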
TL;DR: The 1000 Genomes Project set out to provide a comprehensive description of common human genetic variation by applying whole-genome sequencing to a diverse set of individuals from multiple populations, and has reconstructed the genomes of 2,504 individuals from 26 populations using a combination of low-coverage whole-genome sequencing, deep exome sequencing, and dense microarray genotyping.
Abstract: The 1000 Genomes Project set out to provide a comprehensive description of common human genetic variation by applying whole-genome sequencing to a diverse set of individuals from multiple populations. Here we report completion of the project, having reconstructed the genomes of 2,504 individuals from 26 populations using a combination of low-coverage whole-genome sequencing, deep exome sequencing, and dense microarray genotyping. We characterized a broad spectrum of genetic variation, in total over 88 million variants (84.7 million single nucleotide polymorphisms (SNPs), 3.6 million short insertions/deletions (indels), and 60,000 structural variants), all phased onto high-quality haplotypes. This resource includes >99% of SNP variants with a frequency of >1% for a variety of ancestries. We describe the distribution of genetic variation across the global sample, and discuss the implications for common disease studies.
9,821 citations
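The released call set is distributed as phased VCF files, and the SNP/indel/structural-variant breakdown quoted above comes from classifying the REF and ALT alleles of each record. The sketch below is a minimal, hypothetical illustration of that bookkeeping; the file name is made up, and a real release file is bgzip-compressed, so it would first need decompressing (or a VCF library).

```python
from collections import Counter

counts = Counter()
with open("chr22.phase3.vcf") as vcf:          # hypothetical, uncompressed VCF
    for line in vcf:
        if line.startswith("#"):               # skip header lines
            continue
        ref, alts = line.split("\t")[3:5]      # VCF columns: CHROM POS ID REF ALT ...
        for alt in alts.split(","):
            if len(ref) == 1 and len(alt) == 1:
                counts["SNP"] += 1
            elif alt.startswith("<"):          # symbolic allele, e.g. <DEL>, counts as structural
                counts["structural"] += 1
            else:
                counts["indel"] += 1

print(counts)   # e.g. Counter({'SNP': ..., 'indel': ..., 'structural': ...})
```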
TL;DR: In this paper, the authors present a general approach that accommodates most forms of experimental layout and ensuing analysis (designed experiments with fixed effects for factors, covariates and interaction of factors).
Abstract: Statistical parametric maps are spatially extended statistical processes that are used to test hypotheses about regionally specific effects in neuroimaging data. The most established sorts of statistical parametric maps (e.g., Friston et al. (1991): J Cereb Blood Flow Metab 11:690-699; Worsley et al. (1992): J Cereb Blood Flow Metab 12:900-918) are based on linear models, for example ANCOVA, correlation coefficients and t tests. In the sense that these examples are all special cases of the general linear model it should be possible to implement them (and many others) within a unified framework. We present here a general approach that accommodates most forms of experimental layout and ensuing analysis (designed experiments with fixed effects for factors, covariates and interaction of factors). This approach brings together two well established bodies of theory (the general linear model and the theory of Gaussian fields) to provide a complete and simple framework for the analysis of imaging data. The importance of this framework is twofold: (i) conceptual and mathematical simplicity, in that the same small number of operational equations is used irrespective of the complexity of the experiment or nature of the statistical model, and (ii) the generality of the framework provides for great latitude in experimental design and analysis.
9,254 citations
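To make the unification concrete: a two-sample t test is the special case of the general linear model Y = Xβ + ε in which the design matrix X holds an intercept and a group indicator, and the test is a contrast on β. The Python sketch below (simulated data and names are illustrative, not the SPM implementation) fits such a model by ordinary least squares and forms the corresponding t statistic.

```python
import numpy as np

# Minimal illustration of the general linear model Y = X @ beta + error,
# of which a two-sample t test is a special case (as the abstract notes).
rng = np.random.default_rng(0)

# Hypothetical data: 20 "scans", two conditions coded in the design matrix X
n = 20
group = np.repeat([0, 1], n // 2)
X = np.column_stack([np.ones(n), group])          # intercept + condition regressor
Y = 2.0 + 1.5 * group + rng.normal(0.0, 1.0, n)   # simulated response

# Ordinary least-squares estimate of beta and its covariance
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ beta
dof = n - np.linalg.matrix_rank(X)
sigma2 = resid @ resid / dof
cov = sigma2 * np.linalg.inv(X.T @ X)

# t statistic for the contrast c = [0, 1] (effect of the condition regressor)
c = np.array([0.0, 1.0])
t = (c @ beta) / np.sqrt(c @ cov @ c)
print(f"beta = {beta}, t({dof}) = {t:.2f}")
```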
Authors
Name | H-index | Papers | Citations
--- | --- | --- | ---
Karl J. Friston | 217 | 1267 | 217169 |
Yi Chen | 217 | 4342 | 293080 |
Yoshua Bengio | 202 | 1033 | 420313 |
Irving L. Weissman | 201 | 1141 | 172504 |
Mark I. McCarthy | 200 | 1028 | 187898 |
Lewis C. Cantley | 196 | 748 | 169037 |
Martin White | 196 | 2038 | 232387 |
Michael Marmot | 193 | 1147 | 170338 |
Michael A. Strauss | 185 | 1688 | 208506 |
Alan C. Evans | 183 | 866 | 134642 |
Douglas R. Green | 182 | 661 | 145944 |
David A. Weitz | 178 | 1038 | 114182 |
David L. Kaplan | 177 | 1944 | 146082 |
Hyun-Chul Kim | 176 | 4076 | 183227 |
Feng Zhang | 172 | 1278 | 181865 |