
Showing papers by "McGill University" published in 2014


Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, C. Armitage-Caplan3, Monique Arnaud4  +324 moreInstitutions (70)
TL;DR: In this paper, the authors present the first cosmological results based on Planck measurements of the cosmic microwave background (CMB) temperature and lensing-potential power spectra, which are extremely well described by the standard spatially-flat six-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations.
Abstract: This paper presents the first cosmological results based on Planck measurements of the cosmic microwave background (CMB) temperature and lensing-potential power spectra. We find that the Planck spectra at high multipoles (l ≳ 40) are extremely well described by the standard spatially-flat six-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations. Within the context of this cosmology, the Planck data determine the cosmological parameters to high precision: the angular size of the sound horizon at recombination, the physical densities of baryons and cold dark matter, and the scalar spectral index are estimated to be θ∗ = (1.04147 ± 0.00062) × 10⁻², Ωbh² = 0.02205 ± 0.00028, Ωch² = 0.1199 ± 0.0027, and ns = 0.9603 ± 0.0073, respectively (note that in this abstract we quote 68% errors on measured parameters and 95% upper limits on other parameters). For this cosmology, we find a low value of the Hubble constant, H0 = (67.3 ± 1.2) km s⁻¹ Mpc⁻¹, and a high value of the matter density parameter, Ωm = 0.315 ± 0.017. These values are in tension with recent direct measurements of H0 and the magnitude-redshift relation for Type Ia supernovae, but are in excellent agreement with geometrical constraints from baryon acoustic oscillation (BAO) surveys. Including curvature, we find that the Universe is consistent with spatial flatness to percent-level precision using Planck CMB data alone. We use high-resolution CMB data together with Planck to provide greater control on extragalactic foreground components in an investigation of extensions to the six-parameter ΛCDM model. We present selected results from a large grid of cosmological models, using a range of additional astrophysical data sets in addition to Planck and high-resolution CMB data. None of these models are favoured over the standard six-parameter ΛCDM cosmology. 
The deviation of the scalar spectral index from unity is insensitive to the addition of tensor modes and to changes in the matter content of the Universe. We find an upper limit of r0.002 < 0.11 on the tensor-to-scalar ratio. There is no evidence for additional neutrino-like relativistic particles beyond the three families of neutrinos in the standard model. Using BAO and CMB data, we find Neff = 3.30 ± 0.27 for the effective number of relativistic degrees of freedom, and an upper limit of 0.23 eV for the sum of neutrino masses. Our results are in excellent agreement with big bang nucleosynthesis and the standard value of Neff = 3.046. We find no evidence for dynamical dark energy; using BAO and CMB data, the dark energy equation of state parameter is constrained to be w = −1.13 +0.13/−0.10. We also use the Planck data to set limits on a possible variation of the fine-structure constant, dark matter annihilation and primordial magnetic fields. Despite the success of the six-parameter ΛCDM model in describing the Planck data at high multipoles, we note that this cosmology does not provide a good fit to the temperature power spectrum at low multipoles. The unusual shape of the spectrum in the multipole range 20 ≲ l ≲ 40 was seen previously in the WMAP data and is a real feature of the primordial CMB anisotropies. The poor fit to the spectrum at low multipoles is not of decisive significance, but is an “anomaly” in an otherwise self-consistent analysis of the Planck temperature data.
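
As a quick consistency check (not part of the paper's own analysis code), the quoted physical densities and Hubble constant can be combined into the matter density parameter via Ωm ≈ (Ωbh² + Ωch²)/h². A minimal Python sketch; the small neutrino contribution is neglected here, which is why the result lands slightly below the quoted Ωm = 0.315:

```python
# Rough consistency check on the quoted Planck parameters.
# Omega_m ≈ (Omega_b h^2 + Omega_c h^2) / h^2, neglecting neutrinos.
omega_b_h2 = 0.02205   # baryon density, Omega_b * h^2
omega_c_h2 = 0.1199    # cold dark matter density, Omega_c * h^2
h = 0.673              # H0 = 67.3 km/s/Mpc, in units of 100 km/s/Mpc

omega_m = (omega_b_h2 + omega_c_h2) / h**2
print(f"Omega_m ≈ {omega_m:.3f}")  # ≈ 0.313, close to the quoted 0.315
```

The small residual against the quoted value comes from the massive-neutrino term omitted in this sketch.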

7,060 citations


Journal ArticleDOI
TL;DR: Recommendations and guidelines on the evaluation and treatment of severe asthma in children and adults are provided; coordinated research efforts for improved phenotyping will provide safe and effective biomarker-driven approaches to severe asthma therapy.
Abstract: Severe or therapy-resistant asthma is increasingly recognised as a major unmet need. A Task Force, supported by the European Respiratory Society and American Thoracic Society, reviewed the definition and provided recommendations and guidelines on the evaluation and treatment of severe asthma in children and adults. A literature review was performed, followed by discussion by an expert committee according to the GRADE (Grading of Recommendations, Assessment, Development and Evaluation) approach for development of specific clinical recommendations. When the diagnosis of asthma is confirmed and comorbidities addressed, severe asthma is defined as asthma that requires treatment with high dose inhaled corticosteroids plus a second controller and/or systemic corticosteroids to prevent it from becoming “uncontrolled” or that remains “uncontrolled” despite this therapy. Severe asthma is a heterogeneous condition consisting of phenotypes such as eosinophilic asthma. Specific recommendations on the use of sputum eosinophil count and exhaled nitric oxide to guide therapy, as well as treatment with anti-IgE antibody, methotrexate, macrolide antibiotics, antifungal agents and bronchial thermoplasty are provided. Coordinated research efforts for improved phenotyping will provide safe and effective biomarker-driven approaches to severe asthma therapy.

2,795 citations


Journal ArticleDOI
TL;DR: It is proposed that downstream topographical biomarkers of the disease, such as volumetric MRI and fluorodeoxyglucose PET, might better serve in the measurement and monitoring of the course of disease.
Abstract: In the past 8 years, both the International Working Group (IWG) and the US National Institute on Aging-Alzheimer's Association have contributed criteria for the diagnosis of Alzheimer's disease (AD) that better define clinical phenotypes and integrate biomarkers into the diagnostic process, covering the full staging of the disease. This Position Paper considers the strengths and limitations of the IWG research diagnostic criteria and proposes advances to improve the diagnostic framework. On the basis of these refinements, the diagnosis of AD can be simplified, requiring the presence of an appropriate clinical AD phenotype (typical or atypical) and a pathophysiological biomarker consistent with the presence of Alzheimer's pathology. We propose that downstream topographical biomarkers of the disease, such as volumetric MRI and fluorodeoxyglucose PET, might better serve in the measurement and monitoring of the course of disease. This paper also elaborates on the specific diagnostic criteria for atypical forms of AD, for mixed AD, and for the preclinical states of AD.

2,581 citations


Journal ArticleDOI
TL;DR: The addition of bevacizumab to radiotherapy-temozolomide did not improve survival in patients with glioblastoma, and the glucocorticoid requirement was lower.
Abstract: Background Standard therapy for newly diagnosed glioblastoma is radiotherapy plus temozolomide. In this phase 3 study, we evaluated the effect of the addition of bevacizumab to radiotherapy–temozolomide for the treatment of newly diagnosed glioblastoma. Methods We randomly assigned patients with supratentorial glioblastoma to receive intravenous bevacizumab (10 mg per kilogram of body weight every 2 weeks) or placebo, plus radiotherapy (2 Gy 5 days a week; maximum, 60 Gy) and oral temozolomide (75 mg per square meter of body-surface area per day) for 6 weeks. After a 28-day treatment break, maintenance bevacizumab (10 mg per kilogram intravenously every 2 weeks) or placebo, plus temozolomide (150 to 200 mg per square meter per day for 5 days), was continued for six 4-week cycles, followed by bevacizumab monotherapy (15 mg per kilogram intravenously every 3 weeks) or placebo until the disease progressed or unacceptable toxic effects developed. The coprimary end points were investigator-assessed progression-free survival and overall survival. Results A total of 458 patients were assigned to the bevacizumab group, and 463 patients to the placebo group. The median progression-free survival was longer in the bevacizumab group than in the placebo group (10.6 months vs. 6.2 months; stratified hazard ratio for progression or death, 0.64; 95% confidence interval [CI], 0.55 to 0.74; P<0.001). The benefit with respect to progression-free survival was observed across subgroups. Overall survival did not differ significantly between groups (stratified hazard ratio for death, 0.88; 95% CI, 0.76 to 1.02; P = 0.10). The respective overall survival rates with bevacizumab and placebo were 72.4% and 66.3% at 1 year (P = 0.049) and 33.9% and 30.1% at 2 years (P = 0.24). Baseline health-related quality of life and performance status were maintained longer in the bevacizumab group, and the glucocorticoid requirement was lower. 
More patients in the bevacizumab group than in the placebo group had grade 3 or higher adverse events (66.8% vs. 51.3%) and grade 3 or higher adverse events often associated with bevacizumab (32.5% vs. 15.8%). Conclusions The addition of bevacizumab to radiotherapy–temozolomide did not improve survival in patients with glioblastoma. Improved progression-free survival and maintenance of baseline quality of life and performance status were observed with bevacizumab; however, the rate of adverse events was higher with bevacizumab than with placebo. (Funded by F. Hoffmann–La Roche; ClinicalTrials.gov number, NCT00943826.)

1,996 citations


Journal ArticleDOI
Yukinori Okada1, Yukinori Okada2, Di Wu3, Di Wu2, Di Wu1, Gosia Trynka1, Gosia Trynka2, Towfique Raj2, Towfique Raj1, Chikashi Terao4, Katsunori Ikari, Yuta Kochi, Koichiro Ohmura4, Akari Suzuki, Shinji Yoshida, Robert R. Graham5, A. Manoharan5, Ward Ortmann5, Tushar Bhangale5, Joshua C. Denny6, Robert J. Carroll6, Anne E. Eyler6, Jeff Greenberg7, Joel M. Kremer, Dimitrios A. Pappas8, Lei Jiang9, Jian Yin9, Lingying Ye9, Ding Feng Su9, Jian Yang10, Gang Xie11, E.C. Keystone11, Harm-Jan Westra12, Tõnu Esko1, Tõnu Esko2, Tõnu Esko13, Andres Metspalu13, Xuezhong Zhou14, Namrata Gupta2, Daniel B. Mirel2, Eli A. Stahl15, Dorothee Diogo2, Dorothee Diogo1, Jing Cui2, Jing Cui1, Katherine P. Liao1, Katherine P. Liao2, Michael H. Guo1, Michael H. Guo2, Keiko Myouzen, Takahisa Kawaguchi4, Marieke J H Coenen16, Piet L. C. M. van Riel16, Mart A F J van de Laar17, Henk-Jan Guchelaar18, Tom W J Huizinga18, Philippe Dieudé19, Xavier Mariette20, S. Louis Bridges21, Alexandra Zhernakova18, Alexandra Zhernakova12, René E. M. Toes18, Paul P. Tak22, Paul P. Tak23, Paul P. Tak24, Corinne Miceli-Richard20, So Young Bang25, Hye Soon Lee25, Javier Martin26, Miguel A. Gonzalez-Gay, Luis Rodriguez-Rodriguez27, Solbritt Rantapää-Dahlqvist28, Lisbeth Ärlestig28, Hyon K. Choi29, Hyon K. Choi1, Yoichiro Kamatani30, Pilar Galan19, Mark Lathrop31, Steve Eyre32, Steve Eyre33, John Bowes33, John Bowes32, Anne Barton32, Niek de Vries22, Larry W. Moreland34, Lindsey A. Criswell35, Elizabeth W. Karlson1, Atsuo Taniguchi, Ryo Yamada4, Michiaki Kubo, Jun Liu1, Sang Cheol Bae25, Jane Worthington33, Jane Worthington32, Leonid Padyukov36, Lars Klareskog36, Peter K. Gregersen37, Soumya Raychaudhuri1, Soumya Raychaudhuri2, Barbara E. Stranger38, Philip L. De Jager1, Philip L. De Jager2, Lude Franke12, Peter M. Visscher10, Matthew A. Brown10, Hisashi Yamanaka, Tsuneyo Mimori4, Atsushi Takahashi, Huji Xu9, Timothy W. Behrens5, Katherine A. Siminovitch11, Shigeki Momohara, Fumihiko Matsuda4, Kazuhiko Yamamoto39, Robert M. Plenge2, Robert M. Plenge1
20 Feb 2014-Nature
TL;DR: A genome-wide association study meta-analysis in a total of >100,000 subjects of European and Asian ancestries provides empirical evidence that the genetics of RA can provide important information for drug discovery, and sheds light on fundamental genes, pathways and cell types that contribute to RA pathogenesis.
Abstract: A major challenge in human genetics is to devise a systematic strategy to integrate disease-associated variants with diverse genomic and biological data sets to provide insight into disease pathogenesis and guide drug discovery for complex traits such as rheumatoid arthritis (RA) [1]. Here we performed a genome-wide association study meta-analysis in a total of >100,000 subjects of European and Asian ancestries (29,880 RA cases and 73,758 controls), by evaluating ~10 million single-nucleotide polymorphisms. We discovered 42 novel RA risk loci at a genome-wide level of significance, bringing the total to 101 (refs 2, 3, 4). We devised an in silico pipeline using established bioinformatics methods based on functional annotation [5], cis-acting expression quantitative trait loci [6] and pathway analyses [7-9]—as well as novel methods based on genetic overlap with human primary immunodeficiency, haematological cancer somatic mutations and knockout mouse phenotypes—to identify 98 biological candidate genes at these 101 risk loci. We demonstrate that these genes are the targets of approved therapies for RA, and further suggest that drugs approved for other indications may be repurposed for the treatment of RA. Together, this comprehensive genetic study sheds light on fundamental genes, pathways and cell types that contribute to RA pathogenesis, and provides empirical evidence that the genetics of RA can provide important information for drug discovery.
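
The "genome-wide level of significance" referred to above conventionally means p < 5 × 10⁻⁸, a Bonferroni-style correction for roughly one million independent common variants; the numeric threshold is a GWAS convention, not stated in this abstract. A back-of-the-envelope sketch of why so stringent a cutoff is needed when ~10 million SNPs are evaluated:

```python
# Illustrative only: expected false positives under the global null if
# ~10 million SNPs (the count evaluated in this meta-analysis) were each
# tested at the conventional genome-wide threshold p < 5e-8.
n_snps = 10_000_000
alpha = 5e-8  # conventional GWAS significance threshold (assumption, not from the abstract)

expected_false_positives = n_snps * alpha
print(expected_false_positives)  # ≈ 0.5 spurious hits expected by chance
```

At this threshold, fewer than one of the 101 reported loci would be expected to be a chance finding, which is what makes the 42 newly discovered loci credible.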

1,910 citations


Journal ArticleDOI
TL;DR: In this paper, the evolution of the new microstructures produced by two types of dynamic recrystallization is reviewed, including those brought about by severe plastic deformation (SPD).

1,777 citations


Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, M. I. R. Alves2, C. Armitage-Caplan3  +469 moreInstitutions (89)
TL;DR: The European Space Agency's Planck satellite, dedicated to studying the early Universe and its subsequent evolution, was launched 14 May 2009 and has been scanning the microwave and submillimetre sky continuously since 12 August 2009 as discussed by the authors.
Abstract: The European Space Agency’s Planck satellite, dedicated to studying the early Universe and its subsequent evolution, was launched 14 May 2009 and has been scanning the microwave and submillimetre sky continuously since 12 August 2009. In March 2013, ESA and the Planck Collaboration released the initial cosmology products based on the first 15.5 months of Planck data, along with a set of scientific and technical papers and a web-based explanatory supplement. This paper gives an overview of the mission and its performance, the processing, analysis, and characteristics of the data, the scientific results, and the science data products and papers in the release. The science products include maps of the cosmic microwave background (CMB) and diffuse extragalactic foregrounds, a catalogue of compact Galactic and extragalactic sources, and a list of sources detected through the Sunyaev-Zeldovich effect. The likelihood code used to assess cosmological models against the Planck data and a lensing likelihood are described. Scientific results include robust support for the standard six-parameter ΛCDM model of cosmology and improved measurements of its parameters, including a highly significant deviation from scale invariance of the primordial power spectrum. The Planck values for these parameters and others derived from them are significantly different from those previously determined. Several large-scale anomalies in the temperature distribution of the CMB, first detected by WMAP, are confirmed with higher confidence. Planck sets new limits on the number and mass of neutrinos, and has measured gravitational lensing of CMB anisotropies at greater than 25σ. Planck finds no evidence for non-Gaussianity in the CMB. Planck’s results agree well with results from the measurements of baryon acoustic oscillations. Planck finds a lower Hubble constant than found in some more local measures. 
Some tension is also present between the amplitude of matter fluctuations (σ8) derived from CMB data and that derived from Sunyaev-Zeldovich data. The Planck and WMAP power spectra are offset from each other by an average level of about 2% around the first acoustic peak. Analysis of Planck polarization data is not yet mature, therefore polarization results are not released, although the robust detection of E-mode polarization around CMB hot and cold spots is shown graphically.

1,719 citations


Journal ArticleDOI
TL;DR: This Review highlights the recent progress in the field of cross-dehydrogenative Csp3–C bond formations and provides a comprehensive overview on existing procedures and employed methodologies.
Abstract: Over the last decade, substantial research has led to the introduction of an impressive number of efficient procedures which allow the selective construction of C–C bonds by directly connecting two different C–H bonds under oxidative conditions. Common to these methodologies is the generation of the reactive intermediates in situ by activation of both C–H bonds. This strategy was introduced by the group of Li as cross-dehydrogenative coupling (CDC) and discloses waste-minimized synthetic alternatives to classic coupling procedures which rely on the use of prefunctionalized starting materials. This Review highlights the recent progress in the field of cross-dehydrogenative Csp3–C bond formations and provides a comprehensive overview on existing procedures and employed methodologies.

1,528 citations


Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, Frederico Arroja4  +321 moreInstitutions (79)
TL;DR: In this article, the authors present the implications for cosmic inflation of the Planck measurements of the cosmic microwave background (CMB) anisotropies in both temperature and polarization based on the full Planck survey.
Abstract: We present the implications for cosmic inflation of the Planck measurements of the cosmic microwave background (CMB) anisotropies in both temperature and polarization based on the full Planck survey, which includes more than twice the integration time of the nominal survey used for the 2013 release papers. The Planck full mission temperature data and a first release of polarization data on large angular scales measure the spectral index of curvature perturbations to be ns = 0.968 ± 0.006 and tightly constrain its scale dependence to dns/dlnk = −0.003 ± 0.007 when combined with the Planck lensing likelihood. When the Planck high-l polarization data are included, the results are consistent and uncertainties are further reduced. The upper bound on the tensor-to-scalar ratio is r0.002 < 0.11 (95% CL). This upper limit is consistent with the B-mode polarization constraint r < 0.12 (95% CL) obtained from a joint analysis of the BICEP2/Keck Array and Planck data. These results imply that V(φ) ∝ φ² and natural inflation are now disfavoured compared to models predicting a smaller tensor-to-scalar ratio, such as R² inflation. We search for several physically motivated deviations from a simple power-law spectrum of curvature perturbations, including those motivated by a reconstruction of the inflaton potential not relying on the slow-roll approximation. We find that such models are not preferred, either according to a Bayesian model comparison or according to a frequentist simulation-based analysis. Three independent methods reconstructing the primordial power spectrum consistently recover a featureless and smooth spectrum over the range of scales 0.008 Mpc⁻¹ ≲ k ≲ 0.1 Mpc⁻¹. At large scales, each method finds deviations from a power law, connected to a deficit at multipoles l ≈ 20−40 in the temperature power spectrum, but at an uncompelling statistical significance owing to the large cosmic variance present at these multipoles. 
By combining power spectrum and non-Gaussianity bounds, we constrain models with generalized Lagrangians, including Galileon models and axion monodromy models. The Planck data are consistent with adiabatic primordial perturbations, and the estimated values for the parameters of the base Λ cold dark matter (ΛCDM) model are not significantly altered when more general initial conditions are admitted. In correlated mixed adiabatic and isocurvature models, the 95% CL upper bound for the non-adiabatic contribution to the observed CMB temperature variance is |αnon-adi| < 1.9%, 4.0%, and 2.9% for CDM, neutrino density, and neutrino velocity isocurvature modes, respectively. We have tested inflationary models producing an anisotropic modulation of the primordial curvature power spectrum, finding that the dipolar modulation in the CMB temperature field induced by a CDM isocurvature perturbation is not preferred at a statistically significant level. We also establish tight constraints on a possible quadrupolar modulation of the curvature perturbation. These results are consistent with the Planck 2013 analysis based on the nominal mission data and further constrain slow-roll single-field inflationary models, as expected from the increased precision of Planck data using the full set of observations.

1,401 citations


Journal ArticleDOI
TL;DR: The members of the AASLD/EASL Practice Guideline Subcommittee on Hepatic Encephalopathy are: Jayant A. Talwalkar, Hari S. Conjeevaram, Michael Porayko, Raphael B. Merriman, Peter L. Jansen, and Fabien Zoulim.

1,375 citations


Journal ArticleDOI
TL;DR: The noninvasive evaluation of LVEF has gained importance, and notwithstanding the limitations of the techniques used for its calculation, has emerged as the most widely used strategy for monitoring the changes in cardiac function, both during and after the administration of potentially cardiotoxic cancer treatment.
Abstract: Cardiac dysfunction resulting from exposure to cancer therapeutics was first recognized in the 1960s, with the widespread introduction of anthracyclines into the oncologic therapeutic armamentarium. Heart failure (HF) associated with anthracyclines was then recognized as an important side effect. As a result, physicians learned to limit their doses to avoid cardiac dysfunction. Several strategies have been used over the past decades to detect it. Two of them evolved over time to be very useful: endomyocardial biopsies and monitoring of left ventricular (LV) ejection fraction (LVEF) by cardiac imaging. Examination of endomyocardial biopsies proved to be the most sensitive and specific parameter for the identification of anthracycline-induced LV dysfunction and became the gold standard in the 1970s. However, the interest in endomyocardial biopsy has diminished over time because of the reduction in the cumulative dosages used to treat malignancies, the invasive nature of the procedure, and the remarkable progress made in noninvasive cardiac imaging. The noninvasive evaluation of LVEF has gained importance, and notwithstanding the limitations of the techniques used for its calculation, has emerged as the most widely used strategy for monitoring the changes in cardiac function, both during and after the administration of potentially cardiotoxic cancer treatment.

Journal ArticleDOI
20 Nov 2014-Cell
TL;DR: The map uncovers significant interconnectivity between known and candidate cancer gene products, providing unbiased evidence for an expanded functional cancer landscape, while demonstrating how high-quality interactome models will help "connect the dots" of the genomic revolution.

Journal ArticleDOI
30 Apr 2014-Nature
TL;DR: The results show that PINK1-dependent phosphorylation of both parkin and ubiquitin is sufficient for full activation of parkin E3 activity, and demonstrate that phosphorylated ubiquitin is a parkin activator.
Abstract: Ubiquitin, known for its role in post-translational modification of other proteins, undergoes post-translational modification itself; after a decrease in mitochondrial membrane potential, the kinase enzyme PINK1 phosphorylates ubiquitin at Ser 65, and the phosphorylated ubiquitin then interacts with ubiquitin ligase (E3) enzyme parkin, which is also phosphorylated by PINK1, and this process is sufficient for full activation of parkin enzymatic activity. The small protein ubiquitin, familiar for its role in post-translational modification of other proteins by binding to them and regulating their activity or stability, is shown here to be the substrate of the kinase PINK1, which together with the ubiquitin ligase parkin is a causal gene for hereditary recessive Parkinsonism. Noriyuki Matsuda and colleagues show that following a decrease in mitochondrial membrane potential, PINK1 phosphorylates ubiquitin at serine residue 65; the phosphorylated ubiquitin then interacts with parkin, which is also phosphorylated by PINK1. This interaction allows full activation of parkin enzymatic activity, which involves tagging mitochondrial substrates with ubiquitin. PINK1 (PTEN induced putative kinase 1) and PARKIN (also known as PARK2) have been identified as the causal genes responsible for hereditary recessive early-onset Parkinsonism [1,2]. PINK1 is a Ser/Thr kinase that specifically accumulates on depolarized mitochondria, whereas parkin is an E3 ubiquitin ligase that catalyses ubiquitin transfer to mitochondrial substrates [3-5]. PINK1 acts as an upstream factor for parkin [6,7] and is essential both for the activation of latent E3 parkin activity [8] and for recruiting parkin onto depolarized mitochondria [8-12]. Recently, mechanistic insights into mitochondrial quality control mediated by PINK1 and parkin have been revealed [3-5], and PINK1-dependent phosphorylation of parkin has been reported [13-15]. 
However, the requirement of PINK1 for parkin activation was not bypassed by phosphomimetic parkin mutation [15], and how PINK1 accelerates the E3 activity of parkin on damaged mitochondria is still obscure. Here we report that ubiquitin is the genuine substrate of PINK1. PINK1 phosphorylated ubiquitin at Ser 65 both in vitro and in cells, and a Ser 65 phosphopeptide derived from endogenous ubiquitin was only detected in cells in the presence of PINK1 and following a decrease in mitochondrial membrane potential. Unexpectedly, phosphomimetic ubiquitin bypassed PINK1-dependent activation of a phosphomimetic parkin mutant in cells. Furthermore, phosphomimetic ubiquitin accelerates discharge of the thioester conjugate formed by UBCH7 (also known as UBE2L3) and ubiquitin (UBCH7∼ubiquitin) in the presence of parkin in vitro, indicating that it acts allosterically. The phosphorylation-dependent interaction between ubiquitin and parkin suggests that phosphorylated ubiquitin unlocks autoinhibition of the catalytic cysteine. Our results show that PINK1-dependent phosphorylation of both parkin and ubiquitin is sufficient for full activation of parkin E3 activity. These findings demonstrate that phosphorylated ubiquitin is a parkin activator.

Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, C. Armitage-Caplan3, Monique Arnaud4  +273 moreInstitutions (59)
TL;DR: In this article, the authors characterized the effective beams, the effective beam window functions and the associated errors for the Planck High Frequency Instrument (HFI) detectors, including the effect of the optics, detectors, data processing and the scan strategy.
Abstract: This paper characterizes the effective beams, the effective beam window functions and the associated errors for the Planck High Frequency Instrument (HFI) detectors. The effective beam is the angular response including the effect of the optics, detectors, data processing and the scan strategy. The window function is the representation of this beam in the harmonic domain which is required to recover an unbiased measurement of the cosmic microwave background angular power spectrum. The HFI is a scanning instrument and its effective beams are the convolution of: a) the optical response of the telescope and feeds; b) the processing of the time-ordered data and deconvolution of the bolometric and electronic transfer function; and c) the merging of several surveys to produce maps. The time response transfer functions are measured using observations of Jupiter and Saturn and by minimizing survey difference residuals. The scanning beam is the post-deconvolution angular response of the instrument, and is characterized with observations of Mars. The main beam solid angles are determined to better than 0.5% at each HFI frequency band. Observations of Jupiter and Saturn limit near sidelobes (within 5 degrees) to about 0.1% of the total solid angle. Time response residuals remain as long tails in the scanning beams, but contribute less than 0.1% of the total solid angle. The bias and uncertainty in the beam products are estimated using ensembles of simulated planet observations that include the impact of instrumental noise and known systematic effects. The correlation structure of these ensembles is well-described by five error eigenmodes that are sub-dominant to sample variance and instrumental noise in the harmonic domain. A suite of consistency tests provide confidence that the error model represents a sufficient description of the data. 
The total error in the effective beam window functions is below 1% at 100 GHz up to multipole l ≈ 1500, and below 0.5% at 143 and 217 GHz up to l ≈ 2000.

Journal ArticleDOI
TL;DR: The construct of mild cognitive impairment (MCI) has evolved over the past 10 years since the publication of the new MCI definition at the Key Symposium in 2003, but the core criteria have remained unchanged.
Abstract: The construct of mild cognitive impairment (MCI) has evolved over the past 10 years since the publication of the new MCI definition at the Key Symposium in 2003, but the core criteria have remained unchanged. The construct has been extensively used worldwide, both in clinical and in research settings, to define the grey area between intact cognitive functioning and clinical dementia. A rich set of data regarding occurrence, risk factors and progression of MCI has been generated. Discrepancies between studies can be mostly explained by differences in the operationalization of the criteria, differences in the setting where the criteria have been applied, selection of subjects and length of follow-up in longitudinal studies. Major controversial issues that remain to be further explored are algorithmic versus clinical classification, reliability of clinical judgment, temporal changes in cognitive performances and predictivity of putative biomarkers. Some suggestions to further develop the MCI construct include the tailoring of the clinical criteria to specific populations and to specific contexts. The addition of biomarkers to the clinical phenotypes is promising but requires deeper investigation. Translation of findings from the specialty clinic to the population setting, although challenging, will enhance uniformity of outcomes. More longitudinal population-based studies on cognitive ageing and MCI need to be performed to clarify all these issues.

Journal ArticleDOI
TL;DR: This updated Cochrane Review on the diagnostic accuracy of Xpert® MTB/RIF for pulmonary TB and rifampicin resistance detection was performed as part of a WHO process to develop updated guidelines on the use of the test.
Abstract: Background Accurate, rapid detection of tuberculosis (TB) and TB drug resistance is critical for improving patient care and decreasing TB transmission. Xpert® MTB/RIF assay is an automated test that can detect both TB and rifampicin resistance, generally within two hours after starting the test, with minimal hands-on technical time. The World Health Organization (WHO) issued initial recommendations on Xpert® MTB/RIF in early 2011. A Cochrane Review on the diagnostic accuracy of Xpert® MTB/RIF for pulmonary TB and rifampicin resistance was published January 2013. We performed this updated Cochrane Review as part of a WHO process to develop updated guidelines on the use of the test. Objectives To assess the diagnostic accuracy of Xpert® MTB/RIF for pulmonary TB (TB detection), where Xpert® MTB/RIF was used as both an initial test replacing microscopy and an add-on test following a negative smear microscopy result. To assess the diagnostic accuracy of Xpert® MTB/RIF for rifampicin resistance detection, where Xpert® MTB/RIF was used as the initial test replacing culture-based drug susceptibility testing (DST). The populations of interest were adults presumed to have pulmonary, rifampicin-resistant or multidrug-resistant TB (MDR-TB), with or without HIV infection. The settings of interest were intermediate- and peripheral-level laboratories. The latter may be associated with primary health care facilities. Search methods We searched for publications in any language up to 7 February 2013 in the following databases: Cochrane Infectious Diseases Group Specialized Register; MEDLINE; EMBASE; ISI Web of Knowledge; MEDION; LILACS; BIOSIS; and SCOPUS. We also searched the metaRegister of Controlled Trials (mRCT) and the search portal of the WHO International Clinical Trials Registry Platform to identify ongoing trials. 
Selection criteria We included randomized controlled trials, cross-sectional studies, and cohort studies using respiratory specimens that allowed for extraction of data evaluating Xpert® MTB/RIF against the reference standard. We excluded gastric fluid specimens. The reference standard for TB was culture and for rifampicin resistance was phenotypic culture-based DST. Data collection and analysis For each study, two review authors independently extracted data using a standardized form. When possible, we extracted data for subgroups by smear and HIV status. We assessed the quality of studies using QUADAS-2 and carried out meta-analyses to estimate pooled sensitivity and specificity of Xpert® MTB/RIF separately for TB detection and rifampicin resistance detection. For TB detection, we performed the majority of analyses using a bivariate random-effects model and compared the sensitivity of Xpert® MTB/RIF and smear microscopy against culture as reference standard. For rifampicin resistance detection, we undertook univariate meta-analyses for sensitivity and specificity separately to include studies in which no rifampicin resistance was detected. Main results We included 27 unique studies (incorporating nine new studies) involving 9557 participants. Sixteen studies (59%) were performed in low- or middle-income countries. For all QUADAS-2 domains, most studies were at low risk of bias and low concern regarding applicability. As an initial test replacing smear microscopy, Xpert® MTB/RIF pooled sensitivity was 89% [95% Credible Interval (CrI) 85% to 92%] and pooled specificity 99% (95% CrI 98% to 99%; 22 studies, 8998 participants: 2953 confirmed TB, 6045 non-TB). As an add-on test following a negative smear microscopy result, Xpert® MTB/RIF pooled sensitivity was 67% (95% CrI 60% to 74%) and pooled specificity 99% (95% CrI 98% to 99%; 21 studies, 6950 participants).
For smear-positive, culture-positive TB, Xpert® MTB/RIF pooled sensitivity was 98% (95% CrI 97% to 99%; 21 studies, 1936 participants). For people with HIV infection, Xpert® MTB/RIF pooled sensitivity was 79% (95% CrI 70% to 86%; seven studies, 1789 participants), and for people without HIV infection, it was 86% (95% CrI 76% to 92%; seven studies, 1470 participants). Among 180 specimens with nontuberculous mycobacteria (NTM), Xpert® MTB/RIF was positive in only one specimen that grew NTM (14 studies, 2626 participants). Comparison with smear microscopy: In comparison with smear microscopy, Xpert® MTB/RIF increased TB detection among culture-confirmed cases by 23% (95% CrI 15% to 32%; 21 studies, 8880 participants). For TB detection, if pooled sensitivity estimates for Xpert® MTB/RIF and smear microscopy are applied to a hypothetical cohort of 1000 patients where 10% of those with symptoms have TB, Xpert® MTB/RIF will diagnose 88 cases and miss 12 cases, whereas sputum microscopy will diagnose 65 cases and miss 35 cases. Rifampicin resistance: For rifampicin resistance detection, Xpert® MTB/RIF pooled sensitivity was 95% (95% CrI 90% to 97%; 17 studies, 555 rifampicin resistance positives) and pooled specificity was 98% (95% CrI 97% to 99%; 24 studies, 2411 rifampicin resistance negatives). If the pooled accuracy estimates for Xpert® MTB/RIF are applied to a hypothetical cohort of 1000 individuals where 15% of those with symptoms are rifampicin resistant, Xpert® MTB/RIF would correctly identify 143 individuals as rifampicin resistant and miss eight cases, and correctly identify 833 individuals as rifampicin susceptible and misclassify 17 individuals as resistant. Where 5% of those with symptoms are rifampicin resistant, Xpert® MTB/RIF would correctly identify 48 individuals as rifampicin resistant and miss three cases, and correctly identify 931 individuals as rifampicin susceptible and misclassify 19 individuals as resistant.
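The hypothetical-cohort figures above follow from applying pooled sensitivity and specificity to a cohort at a given prevalence. A minimal sketch of that arithmetic (the function name and structure are illustrative, not from the review):

```python
def cohort_counts(n, prevalence, sensitivity, specificity):
    """Expected test outcomes when a diagnostic test with the given
    sensitivity and specificity is applied to n individuals."""
    diseased = n * prevalence
    healthy = n - diseased
    tp = diseased * sensitivity      # true positives: cases detected
    fn = diseased - tp               # false negatives: cases missed
    tn = healthy * specificity       # true negatives: correctly ruled out
    fp = healthy - tn                # false positives: misclassified
    return {"tp": tp, "fn": fn, "tn": tn, "fp": fp}

# TB-detection example: 1000 symptomatic patients, 10% prevalence,
# pooled sensitivity 89% and specificity 99% from the review.
counts = cohort_counts(1000, 0.10, 0.89, 0.99)
```

With these rounded point estimates the expected counts are about 89 cases detected and 11 missed; the review quotes 88 and 12 because it applies the unrounded pooled estimates, so small rounding differences are expected.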
Authors' conclusions In adults thought to have TB, with or without HIV infection, Xpert® MTB/RIF is sensitive and specific. Compared with smear microscopy, Xpert® MTB/RIF substantially increases TB detection among culture-confirmed cases. Xpert® MTB/RIF has higher sensitivity for TB detection in smear-positive than smear-negative patients. Nonetheless, this test may be valuable as an add-on test following smear microscopy in patients previously found to be smear-negative. For rifampicin resistance detection, Xpert® MTB/RIF provides accurate results and can allow rapid initiation of MDR-TB treatment, pending results from conventional culture and DST. The test is expensive, so current research evaluating its use in TB programmes in high TB burden settings will help determine whether this investment improves the promptness of treatment initiation and patient outcomes.

Journal ArticleDOI
TL;DR: The most comprehensive exploration of genetic loci influencing human metabolism to date, comprising 7,824 adult individuals from two European population studies, reports genome-wide significant associations at 145 metabolic loci and their biochemical connectivity with more than 400 metabolites in human blood.
Abstract: Genome-wide association scans with high-throughput metabolic profiling provide unprecedented insights into how genetic variation influences metabolism and complex disease. Here we report the most comprehensive exploration of genetic loci influencing human metabolism thus far, comprising 7,824 adult individuals from 2 European population studies. We report genome-wide significant associations at 145 metabolic loci and their biochemical connectivity with more than 400 metabolites in human blood. We extensively characterize the resulting in vivo blueprint of metabolism in human blood by integrating it with information on gene expression, heritability and overlap with known loci for complex disorders, inborn errors of metabolism and pharmacological targets. We further developed a database and web-based resources for data mining and results visualization. Our findings provide new insights into the role of inherited variation in blood metabolic diversity and identify potential new opportunities for drug development and for understanding disease.

Journal ArticleDOI
TL;DR: Evidence for genetic adaptation to climate change has been found in some systems but remains relatively scarce; more studies, employing better inferential methods, are needed before general conclusions can be drawn.
Abstract: Many studies have recorded phenotypic changes in natural populations and attributed them to climate change. However, controversy and uncertainty have arisen around three levels of inference in such studies. First, it has proven difficult to conclusively distinguish whether phenotypic changes are genetically based or the result of phenotypic plasticity. Second, whether or not the change is adaptive is usually assumed rather than tested. Third, inferences that climate change is the specific causal agent have rarely involved the testing – and exclusion – of other potential drivers. Here we review the various ways in which the above inferences have been attempted, and evaluate the strength of support that each approach can provide. This methodological assessment sets the stage for 11 accompanying review articles that attempt comprehensive syntheses of what is currently known – and not known – about responses to climate change in a variety of taxa and in theory. Summarizing and relying on the results of these reviews, we conclude that evidence for genetic adaptation to climate change has been found in some systems, but is still relatively scarce. Most importantly, it is clear that more studies are needed, and these must employ better inferential methods, before general conclusions can be drawn. Overall, we hope that the present paper and special issue provide inspiration for future research and guidelines on best practices for its execution.

Journal ArticleDOI
TL;DR: Clinical Practice Guidelines for the Management of Hypertension in the Community: a statement by the American Society of Hypertension (ASH) and the International Society of Hypertension (ISH).
Abstract: Clinical Practice Guidelines for the Management of Hypertension in the Community A Statement by the American Society of Hypertension and the International Society of Hypertension

Journal ArticleDOI
TL;DR: The non-invasive evaluation of LVEF has gained importance, and notwithstanding the limitations of the techniques used for its calculation, has emerged as the most widely used strategy for monitoring the changes in cardiac function, both during and after the administration of potentially cardiotoxic cancer treatment.
Abstract: ### A. Definition, classification, and mechanisms of toxicity Cardiac dysfunction resulting from exposure to cancer therapeutics was first recognized in the 1960s, with the widespread introduction of anthracyclines into the oncological therapeutic armamentarium.1 Heart failure (HF) associated with anthracyclines was then recognized as an important side effect. As a result, physicians learned to limit their doses to avoid cardiac dysfunction.2 Several strategies have been used over the past decades to detect it. Two of them evolved over time to be very useful: endomyocardial biopsies and monitoring of left ventricular (LV) ejection fraction (LVEF) by cardiac imaging. Examination of endomyocardial biopsies proved to be the most sensitive and specific parameter for the identification of anthracycline-induced LV dysfunction and became the gold standard in the 1970s. However, the interest in endomyocardial biopsy has diminished over time because of the reduction in the cumulative dosages used to treat malignancies, the invasive nature of the procedure, and the remarkable progress made in non-invasive cardiac imaging. The non-invasive evaluation of LVEF has gained importance, and notwithstanding the limitations of the techniques used for its calculation, has emerged as the most widely used strategy for monitoring the changes in cardiac function, both during and after the administration of potentially cardiotoxic cancer treatment.3–5 The timing of LV dysfunction can vary among agents. In the case of anthracyclines, the damage occurs immediately after the exposure;6 for others, the time frame between drug administration and detectable cardiac dysfunction appears to be more variable. Nevertheless, the heart has significant cardiac reserve, and the expression of damage in the form of alterations in systolic or diastolic parameters may not be overt until a substantial amount of cardiac reserve has been exhausted. 
Thus, cardiac damage may not become apparent until years or even decades after receiving the cardiotoxic treatment. This is particularly applicable to …

Journal ArticleDOI
TL;DR: The epistemological background for mixed methods research and mixed studies reviews is presented, along with the main types of mixed methods research designs and techniques and guidance for planning, conducting, and appraising mixed methods research.
Abstract: This article provides an overview of mixed methods research and mixed studies reviews. These two approaches are used to combine the strengths of quantitative and qualitative methods and to compensate for their respective limitations. This article is structured in three main parts. First, the epistemological background for mixed methods will be presented. Afterward, we present the main types of mixed methods research designs and techniques as well as guidance for planning, conducting, and appraising mixed methods research. In the last part, we describe the main types of mixed studies reviews and provide a tool kit and examples. Future research needs to offer guidance for assessing mixed methods research and reporting mixed studies reviews, among other challenges.

Journal ArticleDOI
TL;DR: This phase 3 randomised controlled trial assessed whether dose intensification of doxorubicin with ifosfamide improves survival of patients with advanced soft-tissue sarcoma compared with doxorbicin alone.
Abstract: Summary Background Effective targeted treatment is unavailable for most sarcomas, and doxorubicin and ifosfamide, which have been used to treat soft-tissue sarcoma for more than 30 years, still have an important role. Whether doxorubicin alone or the combination of doxorubicin and ifosfamide should be used routinely is still controversial. We assessed whether dose intensification of doxorubicin with ifosfamide improves survival of patients with advanced soft-tissue sarcoma compared with doxorubicin alone. Methods We did this phase 3 randomised controlled trial (EORTC 62012) at 38 hospitals in ten countries. We included patients with locally advanced, unresectable, or metastatic high-grade soft-tissue sarcoma, aged 18–60 years with a WHO performance status of 0 or 1. They were randomly assigned (1:1) by the minimisation method to either doxorubicin (75 mg/m² by intravenous bolus on day 1 or 72 h continuous intravenous infusion) or intensified doxorubicin (75 mg/m²; 25 mg/m² per day, days 1–3) plus ifosfamide (10 g/m² over 4 days with mesna and pegfilgrastim) as first-line treatment. Randomisation was stratified by centre, performance status (0 vs 1), age (<50 vs ≥50 years), presence of liver metastases, and histopathological grade (2 vs 3). Patients were treated every 3 weeks until progression or unacceptable toxic effects, for up to six cycles. The primary endpoint was overall survival in the intention-to-treat population. The trial is registered with ClinicalTrials.gov, number NCT00061984. Findings Between April 30, 2003, and May 25, 2010, 228 patients were randomly assigned to receive doxorubicin and 227 to receive doxorubicin and ifosfamide. Median follow-up was 56 months (IQR 31–77) in the doxorubicin only group and 59 months (36–72) in the combination group. There was no significant difference in overall survival between groups (median overall survival 12·8 months [95·5% CI 10·5–14·3] in the doxorubicin group vs 14·3 months [12·5–16·5] in the doxorubicin and
ifosfamide group; hazard ratio [HR] 0·83 [95·5% CI 0·67–1·03]; stratified log-rank test p=0·076). Median progression-free survival was significantly longer in the doxorubicin and ifosfamide group (7·4 months [95% CI 6·6–8·3]) than in the doxorubicin group (4·6 months [2·9–5·6]; HR 0·74 [95% CI 0·60–0·90], stratified log-rank test p=0·003). More patients in the doxorubicin and ifosfamide group than in the doxorubicin group had an overall response (60 [26%] of 227 patients vs 31 [14%] of 228). Grade 3 and 4 toxic effects were more common in the combination group, including leucopenia (… vs 40 [18%] of 223 patients), neutropenia (93 [42%] vs 83 [37%]), febrile neutropenia (103 [46%] vs 30 [13%]), anaemia (78 [35%] vs 10 [5%]), and thrombocytopenia (75 [33%] vs one [<1%]). Interpretation Our results do not support the use of intensified doxorubicin and ifosfamide for palliation of advanced soft-tissue sarcoma unless the specific goal is tumour shrinkage. These findings should help individualise the care of patients with this disease. Funding Cancer Research UK, EORTC Charitable Trust, UK NHS, Canadian Cancer Society Research Institute, Amgen.

Journal ArticleDOI
TL;DR: The rapid induction of glycolytic flux increased within minutes of exposure to TLR agonists and served an essential role in supporting the de novo synthesis of fatty acids for the expansion of the endoplasmic reticulum and Golgi required for the production and secretion of proteins that are integral to DC activation.
Abstract: The ligation of Toll-like receptors (TLRs) leads to rapid activation of dendritic cells (DCs). However, the metabolic requirements that support this process remain poorly defined. We found that DC glycolytic flux increased within minutes of exposure to TLR agonists and that this served an essential role in supporting the de novo synthesis of fatty acids for the expansion of the endoplasmic reticulum and Golgi required for the production and secretion of proteins that are integral to DC activation. Signaling via the kinases TBK1, IKKɛ and Akt was essential for the TLR-induced increase in glycolysis by promoting the association of the glycolytic enzyme HK-II with mitochondria. In summary, we identified the rapid induction of glycolysis as an integral component of TLR signaling that is essential for the anabolic demands of the activation and function of DCs.

Journal ArticleDOI
TL;DR: This work sought to synthesize the existing body of evidence on frailty and offer a perspective on how to integrate frailty into clinical practice, where it contributes prognostic insights incremental to existing risk models and assists clinicians in defining optimal care pathways for their patients.

Journal ArticleDOI
TL;DR: These guidelines were developed by Canadian experts in anxiety and related disorders through a consensus process based on global impression of efficacy, effectiveness, and side effects, using a modified version of the periodic health examination guidelines.
Abstract: Anxiety and related disorders are among the most common mental disorders, with lifetime prevalence reportedly as high as 31%. Unfortunately, anxiety disorders are under-diagnosed and under-treated. These guidelines were developed by Canadian experts in anxiety and related disorders through a consensus process. Data on the epidemiology, diagnosis, and treatment (psychological and pharmacological) were obtained through MEDLINE, PsycINFO, and manual searches (1980–2012). Treatment strategies were rated on strength of evidence, and a clinical recommendation for each intervention was made, based on global impression of efficacy, effectiveness, and side effects, using a modified version of the periodic health examination guidelines. These guidelines are presented in 10 sections, including an introduction, principles of diagnosis and management, six sections (Sections 3 through 8) on the specific anxiety-related disorders (panic disorder, agoraphobia, specific phobia, social anxiety disorder, generalized anxiety disorder, obsessive-compulsive disorder, and posttraumatic stress disorder), and two additional sections on special populations (children/adolescents, pregnant/lactating women, and the elderly) and clinical issues in patients with comorbid conditions. Anxiety and related disorders are very common in clinical practice, and frequently comorbid with other psychiatric and medical conditions. Optimal management requires a good understanding of the efficacy and side effect profiles of pharmacological and psychological treatments.

Journal ArticleDOI
Alain Abergel1, Peter A. R. Ade2, Nabila Aghanim1, M. I. R. Alves1  +307 moreInstitutions (66)
TL;DR: In this article, the authors presented an all-sky model of dust emission from the Planck 857, 545 and 353 GHz, and IRAS 100 micron data.
Abstract: This paper presents an all-sky model of dust emission from the Planck 857, 545 and 353 GHz, and IRAS 100 micron data. Using a modified black-body fit to the data we present all-sky maps of the dust optical depth, temperature, and spectral index over the 353-3000 GHz range. This model is a tight representation of the data at 5 arcmin. It shows variations of the order of 30 % compared with the widely-used model of Finkbeiner, Davis, and Schlegel. The Planck data allow us to estimate the dust temperature uniformly over the whole sky, providing an improved estimate of the dust optical depth compared to previous all-sky dust model, especially in high-contrast molecular regions. An increase of the dust opacity at 353 GHz, tau_353/N_H, from the diffuse to the denser interstellar medium (ISM) is reported. It is associated with a decrease in the observed dust temperature, T_obs, that could be due at least in part to the increased dust opacity. We also report an excess of dust emission at HI column densities lower than 10^20 cm^-2 that could be the signature of dust in the warm ionized medium. In the diffuse ISM at high Galactic latitude, we report an anti-correlation between tau_353/N_H and T_obs while the dust specific luminosity, i.e., the total dust emission integrated over frequency (the radiance) per hydrogen atom, stays about constant. The implication is that in the diffuse high-latitude ISM tau_353 is not as reliable a tracer of dust column density as we conclude it is in molecular clouds where the correlation of tau_353 with dust extinction estimated using colour excess measurements on stars is strong. To estimate Galactic E(B-V) in extragalactic fields at high latitude we develop a new method based on the thermal dust radiance, instead of the dust optical depth, calibrated to E(B-V) using reddening measurements of quasars deduced from Sloan Digital Sky Survey data.
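The modified black-body fit described above models the dust intensity at frequency ν as I_ν = τ_353 · B_ν(T) · (ν / 353 GHz)^β, where B_ν is the Planck function and τ_353 is the optical depth at 353 GHz. A minimal sketch of evaluating that parameterization (the parameter values below are illustrative, not Planck's fitted results):

```python
import math

H = 6.62607015e-34   # Planck constant, J s
K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def planck_bnu(nu_hz, temp_k):
    """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    x = H * nu_hz / (K_B * temp_k)
    return 2.0 * H * nu_hz**3 / C**2 / math.expm1(x)

def modified_blackbody(nu_hz, tau_353, temp_k, beta, nu0_hz=353e9):
    """Dust intensity: optical depth at 353 GHz, times the Planck
    function, times a power-law emissivity (nu / 353 GHz)**beta."""
    return tau_353 * planck_bnu(nu_hz, temp_k) * (nu_hz / nu0_hz) ** beta

# Illustrative diffuse-ISM-like parameters (not fitted values).
i_353 = modified_blackbody(353e9, 1e-5, 20.0, 1.6)
```

At the reference frequency the power-law factor is exactly 1, so the intensity reduces to τ_353 · B_353(T), which is why τ_353 serves as the model's column-density tracer.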

Journal ArticleDOI
TL;DR: The directed and weighted G29 × 91 connectivity matrix for the macaque will be valuable for comparison with connectivity analyses in other species, including humans, and inform future modeling studies that explore the regularities of cortical networks.
Abstract: Retrograde tracer injections in 29 of the 91 areas of the macaque cerebral cortex revealed 1,615 interareal pathways, a third of which have not previously been reported. A weight index (extrinsic fraction of labeled neurons [FLNe]) was determined for each area-to-area pathway. Newly found projections were weaker on average compared with the known projections; nevertheless, the 2 sets of pathways had extensively overlapping weight distributions. Repeat injections across individuals revealed modest FLNe variability given the range of FLNe values (standard deviation <1 log unit, range 5 log units). The connectivity profile for each area conformed to a lognormal distribution, where a majority of projections are moderate or weak in strength. In the G29 × 29 interareal subgraph, two-thirds of the connections that can exist do exist. Analysis of the smallest set of areas that collects links from all 91 nodes of the G29 × 91 subgraph (dominating set analysis) confirms the dense (66%) structure of the cortical matrix. The G29 × 29 subgraph suggests an unexpectedly high incidence of unidirectional links. The directed and weighted G29 × 91 connectivity matrix for the macaque will be valuable for comparison with connectivity analyses in other species, including humans. It will also inform future modeling studies that explore the regularities of cortical networks.
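The density and reciprocity figures above reduce to simple counts over a directed graph: the fraction of possible area-to-area links that are present, and the links whose reciprocal is absent. A minimal sketch on a toy edge list (the data here are illustrative, not the macaque matrix):

```python
def directed_density(n_nodes, edges):
    """Fraction of possible directed edges (self-loops excluded) present."""
    possible = n_nodes * (n_nodes - 1)
    return len(set(edges)) / possible

def unidirectional(edges):
    """Edges (a, b) whose reciprocal (b, a) is absent."""
    present = set(edges)
    return [(a, b) for (a, b) in present if (b, a) not in present]

# Toy 4-node graph: 8 of 12 possible directed edges, 4 of them one-way,
# loosely mirroring the dense (66%) subgraph described above.
toy_edges = [(0, 1), (1, 0), (0, 2), (2, 0), (1, 2), (2, 3), (3, 1), (3, 0)]
density = directed_density(4, toy_edges)   # 8/12, i.e. two-thirds
one_way = unidirectional(toy_edges)
```

The same two counts, applied to the 29 × 29 FLNe matrix, give the two-thirds density and the unidirectional-link incidence reported in the abstract.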

Journal ArticleDOI
TL;DR: The characteristics of VBNC cells, including the similarities and differences to viable, culturable cells and dead cells, and different detection methods are discussed, and their potential influence on human health is reviewed.
Abstract: Many bacterial species have been found to exist in a viable but non-culturable (VBNC) state since its discovery in 1982. VBNC cells are characterized by a loss of culturability on routine agar, which impairs their detection by conventional plate count techniques. This leads to an underestimation of total viable cells in environmental or clinical samples, and thus poses a risk to public health. In this review, we present recent findings on the VBNC state of human bacterial pathogens. The characteristics of VBNC cells, including the similarities and differences to viable, culturable cells and dead cells, and different detection methods are discussed. Exposure to various stresses can induce the VBNC state, and VBNC cells may be resuscitated back to culturable cells under suitable stimuli. The conditions that trigger the induction of the VBNC state and resuscitation from it are summarized and the mechanisms underlying these two processes are discussed. Last but not least, the significance of VBNC cells and their potential influence on human health are also reviewed.

Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Yashar Akrami3, Yashar Akrami4  +310 moreInstitutions (70)
TL;DR: In this article, the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite were investigated.
Abstract: We test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

Journal ArticleDOI
TL;DR: The analyses suggest that DNA methylation changes may have a role in the onset of AD given that they were observed in presymptomatic subjects and that six of the validated genes connect to a known AD susceptibility gene network.
Abstract: We used a collection of 708 prospectively collected autopsied brains to assess the methylation state of the brain's DNA in relation to Alzheimer's disease (AD). We found that the level of methylation at 71 of the 415,848 interrogated CpGs was significantly associated with the burden of AD pathology, including CpGs in the ABCA7 and BIN1 regions, which harbor known AD susceptibility variants. We validated 11 of the differentially methylated regions in an independent set of 117 subjects. Furthermore, we functionally validated these CpG associations and identified the nearby genes whose RNA expression was altered in AD: ANK1, CDH23, DIP2A, RHBDF2, RPL13, SERPINF1 and SERPINF2. Our analyses suggest that these DNA methylation changes may have a role in the onset of AD given that we observed them in presymptomatic subjects and that six of the validated genes connect to a known AD susceptibility gene network.