
Showing papers by "University of British Columbia published in 2016"


Journal ArticleDOI
Peter A. R. Ade, Nabila Aghanim, Monique Arnaud, M. Ashdown +334 more · Institutions (82)
TL;DR: In this article, the authors present a cosmological analysis based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation.
Abstract: This paper presents cosmological results based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation. Our results are in very good agreement with the 2013 analysis of the Planck nominal-mission temperature data, but with increased precision. The temperature and polarization power spectra are consistent with the standard spatially-flat 6-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations (denoted “base ΛCDM” in this paper). From the Planck temperature data combined with Planck lensing, for this cosmology we find a Hubble constant, H0 = (67.8 ± 0.9) km s^-1 Mpc^-1, a matter density parameter Ωm = 0.308 ± 0.012, and a tilted scalar spectral index ns = 0.968 ± 0.006, consistent with the 2013 analysis. Note that in this abstract we quote 68% confidence limits on measured parameters and 95% upper limits on other parameters. We present the first results of polarization measurements with the Low Frequency Instrument at large angular scales. Combined with the Planck temperature and lensing data, these measurements give a reionization optical depth of τ = 0.066 ± 0.016 and a corresponding reionization redshift zre. These results are consistent with those from WMAP polarization measurements cleaned for dust emission using 353-GHz polarization maps from the High Frequency Instrument. We find no evidence for any departure from base ΛCDM in the neutrino sector of the theory; for example, combining Planck observations with other astrophysical data we find Neff = 3.15 ± 0.23 for the effective number of relativistic degrees of freedom, consistent with the value Neff = 3.046 of the Standard Model of particle physics. The sum of neutrino masses is constrained to ∑ mν < 0.23 eV. The spatial curvature of our Universe is found to be very close to zero, with | ΩK | < 0.005.
Adding a tensor component as a single-parameter extension to base ΛCDM, we find an upper limit on the tensor-to-scalar ratio of r_0.002 < 0.11, consistent with the Planck 2013 results and with the B-mode polarization constraints from a joint analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP B-mode data to our analysis leads to a tighter constraint of r_0.002 < 0.09 and disfavours inflationary models with a V(φ) ∝ φ^2 potential. The addition of Planck polarization data leads to strong constraints on deviations from a purely adiabatic spectrum of fluctuations. We find no evidence for any contribution from isocurvature perturbations or from cosmic defects. Combining Planck data with other astrophysical data, including Type Ia supernovae, the equation of state of dark energy is constrained to w = −1.006 ± 0.045, consistent with the expected value for a cosmological constant. The standard big bang nucleosynthesis predictions for the helium and deuterium abundances for the best-fit Planck base ΛCDM cosmology are in excellent agreement with observations. We also place constraints on annihilating dark matter and on possible deviations from the standard recombination history. In neither case do we find evidence for new physics. The Planck results for base ΛCDM are in good agreement with baryon acoustic oscillation data and with the JLA sample of Type Ia supernovae. However, as in the 2013 analysis, the amplitude of the fluctuation spectrum is found to be higher than inferred from some analyses of rich cluster counts and weak gravitational lensing. We show that these tensions cannot easily be resolved with simple modifications of the base ΛCDM cosmology. Apart from these tensions, the base ΛCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.
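
A quick arithmetic check, purely illustrative: the base-ΛCDM values quoted in the abstract imply a physical matter density ω_m = Ω_m h², where h = H0 / (100 km/s/Mpc). The variable names and the derived quantity below are ours, not the paper's.

```python
# Illustrative consistency check of the quoted Planck base-LCDM parameters.
H0 = 67.8          # Hubble constant, km s^-1 Mpc^-1 (Planck TT + lensing)
Omega_m = 0.308    # matter density parameter
h = H0 / 100.0     # dimensionless Hubble parameter
omega_m = Omega_m * h ** 2   # physical matter density Omega_m h^2
print(f"omega_m = Omega_m * h^2 = {omega_m:.4f}")
```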

10,728 citations


Journal ArticleDOI
Daniel J. Klionsky, Kotb Abdelmohsen, Akihisa Abe, Joynal Abedin +2519 more · Institutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macro-autophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
Theo Vos, Christine Allen, Megha Arora, Ryan M Barber +696 more · Institutions (260)
TL;DR: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) as discussed by the authors was used to estimate the incidence, prevalence, and years lived with disability for diseases and injuries at the global, regional, and national scale over the period of 1990 to 2015.

5,050 citations


Journal ArticleDOI
Haidong Wang, Mohsen Naghavi, Christine Allen, Ryan M Barber +841 more · Institutions (293)
TL;DR: The Global Burden of Disease 2015 Study provides a comprehensive assessment of all-cause and cause-specific mortality for 249 causes in 195 countries and territories from 1980 to 2015, finding several countries in sub-Saharan Africa had very large gains in life expectancy, rebounding from an era of exceedingly high loss of life due to HIV/AIDS.

4,804 citations


Journal ArticleDOI
TL;DR: In intermediate-risk patients, TAVR was similar to surgical aortic-valve replacement with respect to the primary end point of death or disabling stroke; surgery resulted in fewer major vascular complications and less paravalvular aortic regurgitation.
Abstract: Background: Previous trials have shown that among high-risk patients with aortic stenosis, survival rates are similar with transcatheter aortic-valve replacement (TAVR) and surgical aortic-valve replacement. We evaluated the two procedures in a randomized trial involving intermediate-risk patients. Methods: We randomly assigned 2032 intermediate-risk patients with severe aortic stenosis, at 57 centers, to undergo either TAVR or surgical replacement. The primary end point was death from any cause or disabling stroke at 2 years. The primary hypothesis was that TAVR would not be inferior to surgical replacement. Before randomization, patients were entered into one of two cohorts on the basis of clinical and imaging findings; 76.3% of the patients were included in the transfemoral-access cohort and 23.7% in the transthoracic-access cohort. Results: The rate of death from any cause or disabling stroke was similar in the TAVR group and the surgery group (P=0.001 for noninferiority). At 2 years, the Kaplan–Meier event...

3,744 citations


Journal ArticleDOI
01 Jan 2016
TL;DR: This review paper introduces Bayesian optimization, highlights some of its methodological aspects, and showcases a wide range of applications.
Abstract: Big Data applications are typically associated with systems involving large numbers of users, massive complex software systems, and large-scale heterogeneous computing and storage architectures. The construction of such systems involves many distributed design choices. The end products (e.g., recommendation systems, medical analysis tools, real-time game engines, speech recognizers) thus involve many tunable configuration parameters. These parameters are often specified and hard-coded into the software by various developers or teams. If optimized jointly, these parameters can result in significant improvements. Bayesian optimization is a powerful tool for the joint optimization of design choices that is gaining great popularity in recent years. It promises greater automation so as to increase both product quality and human productivity. This review paper introduces Bayesian optimization, highlights some of its methodological aspects, and showcases a wide range of applications.
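
The review above describes Bayesian optimization as sequentially fitting a surrogate model and choosing the next configuration to evaluate via an acquisition function. A minimal sketch of that loop, under our own assumptions (a zero-mean Gaussian-process surrogate with an RBF kernel, the expected-improvement acquisition, and a made-up 1-D toy objective; all names and settings are ours, not from the review):

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, length=0.3):
    # Squared-exponential kernel between two 1-D point sets (unit variance).
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Zero-mean GP posterior mean and std at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = 1.0 - np.einsum('ij,jk,ki->i', Ks.T, Kinv, Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for maximization: E[max(f - best, 0)] under the GP posterior.
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

f = lambda x: np.sin(3 * x) * x           # toy objective on [0, 2]
grid = np.linspace(0, 2, 200)             # candidate configurations
X = np.array([0.2, 1.0, 1.8])             # initial design
y = f(X)
for _ in range(10):                        # BO loop: fit, acquire, evaluate
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print("best x, f(x):", X[np.argmax(y)], y.max())
```

With only 13 evaluations the loop typically homes in on the objective's maximum, which is the kind of sample efficiency that makes the approach attractive for expensive configuration tuning.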

3,703 citations


Journal ArticleDOI
TL;DR: In this article, a review of the classification schemes of both fully gapped and gapless topological materials is presented, and a pedagogical introduction to the field of topological band theory is given.
Abstract: In recent years an increasing amount of attention has been devoted to quantum materials with topological characteristics that are robust against disorder and other perturbations. In this context it was discovered that topological materials can be classified with respect to their dimension and symmetry properties. This review provides an overview of the classification schemes of both fully gapped and gapless topological materials and gives a pedagogical introduction into the field of topological band theory.

2,123 citations


Journal ArticleDOI
07 Jan 2016-Nature
TL;DR: It is shown that droughts and extreme heat significantly reduced national cereal production by 9–10%, whereas the analysis could not identify an effect from floods and extreme cold in the national data; these findings may help to guide agricultural priorities in international disaster risk reduction and adaptation efforts.
Abstract: In recent years, several extreme weather disasters have partially or completely damaged regional crop production. While detailed regional accounts of the effects of extreme weather disasters exist, the global scale effects of droughts, floods and extreme temperature on crop production are yet to be quantified. Here we estimate for the first time, to our knowledge, national cereal production losses across the globe resulting from reported extreme weather disasters during 1964-2007. We show that droughts and extreme heat significantly reduced national cereal production by 9-10%, whereas our analysis could not identify an effect from floods and extreme cold in the national data. Analysing the underlying processes, we find that production losses due to droughts were associated with a reduction in both harvested area and yields, whereas extreme heat mainly decreased cereal yields. Furthermore, the results highlight ~7% greater production damage from more recent droughts and 8-11% more damage in developed countries than in developing ones. Our findings may help to guide agricultural priorities in international disaster risk reduction and adaptation efforts.

1,934 citations


Journal ArticleDOI
TL;DR: Among patients with previously untreated ER-positive, HER2-negative advanced breast cancer, palbociclib combined with letrozole resulted in significantly longer progression-free survival than letrozole alone, although the rates of myelotoxic effects were higher with palbociclib–letrozole.
Abstract: Background: A phase 2 study showed that progression-free survival was longer with palbociclib plus letrozole than with letrozole alone in the initial treatment of postmenopausal women with estrogen-receptor (ER)–positive, human epidermal growth factor receptor 2 (HER2)–negative advanced breast cancer. We performed a phase 3 study that was designed to confirm and expand the efficacy and safety data for palbociclib plus letrozole for this indication. Methods: In this double-blind study, we randomly assigned, in a 2:1 ratio, 666 postmenopausal women with ER-positive, HER2-negative breast cancer, who had not had prior treatment for advanced disease, to receive palbociclib plus letrozole or placebo plus letrozole. The primary end point was progression-free survival, as assessed by the investigators; secondary end points were overall survival, objective response, clinical benefit response, patient-reported outcomes, pharmacokinetic effects, and safety. Results: The median progression-free survival was 24.8 months (95...

1,737 citations


Journal ArticleDOI
16 Sep 2016-Science
TL;DR: It is found that environmental conditions strongly influence the distribution of functional groups in marine microbial communities by shaping metabolic niches, but only weakly influence taxonomic composition within individual functional groups.
Abstract: Microbial metabolism powers biogeochemical cycling in Earth’s ecosystems. The taxonomic composition of microbial communities varies substantially between environments, but the ecological causes of this variation remain largely unknown. We analyzed taxonomic and functional community profiles to determine the factors that shape marine bacterial and archaeal communities across the global ocean. By classifying >30,000 marine microorganisms into metabolic functional groups, we were able to disentangle functional from taxonomic community variation. We find that environmental conditions strongly influence the distribution of functional groups in marine microbial communities by shaping metabolic niches, but only weakly influence taxonomic composition within individual functional groups. Hence, functional structure and composition within functional groups constitute complementary and roughly independent “axes of variation” shaped by markedly different processes.

1,566 citations


Journal ArticleDOI
Nicholas J Kassebaum, Megha Arora, Ryan M Barber, Zulfiqar A Bhutta +679 more · Institutions (268)
TL;DR: In this paper, the authors used the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) for all-cause mortality, cause-specific mortality, and non-fatal disease burden to derive HALE and DALYs by sex for 195 countries and territories from 1990 to 2015.

Journal ArticleDOI
TL;DR: In this article, a hybrid bidirectional LSTM and CNN architecture was proposed to automatically detect word- and character-level features, eliminating the need for feature engineering and lexicons to achieve high performance.
Abstract: Named entity recognition is a challenging task that has traditionally required large amounts of knowledge in the form of feature engineering and lexicons to achieve high performance. In this paper, we present a novel neural network architecture that automatically detects word- and character-level features using a hybrid bidirectional LSTM and CNN architecture, eliminating the need for most feature engineering. We also propose a novel method of encoding partial lexicon matches in neural networks and compare it to existing approaches. Extensive evaluation shows that, given only tokenized text and publicly available word embeddings, our system is competitive on the CoNLL-2003 dataset and surpasses the previously reported state of the art performance on the OntoNotes 5.0 dataset by 2.13 F1 points. By using two lexicons constructed from publicly-available sources, we establish new state of the art performance with an F1 score of 91.62 on CoNLL-2003 and 86.28 on OntoNotes, surpassing systems that employ heavy feature engineering, proprietary lexicons, and rich entity linking information.
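
The abstract mentions encoding lexicon matches as features for the neural tagger. A toy illustration of the general idea, much simplified from the paper's actual scheme (we use a plain longest-match BIOES-style tagging; the function name, tagset, and example lexicon are our own, and the paper's method for *partial* matches is not reproduced here):

```python
def lexicon_features(tokens, lexicon):
    """Tag each token O, or S (single-token match) / B, I, E (begin,
    inside, end) for the longest lexicon entry starting at that token."""
    feats = ["O"] * len(tokens)
    i = 0
    while i < len(tokens):
        match_end = None
        for j in range(len(tokens), i, -1):   # prefer the longest match
            if " ".join(tokens[i:j]).lower() in lexicon:
                match_end = j
                break
        if match_end is None:
            i += 1
        elif match_end - i == 1:
            feats[i] = "S"
            i = match_end
        else:
            feats[i] = "B"
            for k in range(i + 1, match_end - 1):
                feats[k] = "I"
            feats[match_end - 1] = "E"
            i = match_end
    return feats

lex = {"new york", "new york city", "york"}
print(lexicon_features("He moved to New York City .".split(), lex))
# ['O', 'O', 'O', 'B', 'I', 'E', 'O']  -- longest match wins
```

In the paper, features of this kind are fed to the network alongside word embeddings and CNN-derived character features.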

Journal ArticleDOI
TL;DR: An updated review of the literature and evidence is provided on the definitions and lexicon, the limits, the natural history, the markers of progression, and the ethical consequences of detecting Alzheimer's disease at its asymptomatic, preclinical stage.
Abstract: During the past decade, a conceptual shift occurred in the field of Alzheimer's disease (AD) considering the disease as a continuum. Thanks to evolving biomarker research and substantial discoveries, it is now possible to identify the disease even at the preclinical stage before the occurrence of the first clinical symptoms. This preclinical stage of AD has become a major research focus as the field postulates that early intervention may offer the best chance of therapeutic success. To date, very little evidence is established on this "silent" stage of the disease. A clarification is needed about the definitions and lexicon, the limits, the natural history, the markers of progression, and the ethical consequence of detecting the disease at this asymptomatic stage. This article is aimed at addressing all the different issues by providing for each of them an updated review of the literature and evidence, with practical recommendations.

Journal ArticleDOI
TL;DR: This report frames defeating Alzheimer's disease and other dementias as a priority for European science and society.
Abstract: Defeating Alzheimer's disease and other dementias: a priority for European science and society

Journal ArticleDOI
TL;DR: This work proposes the “A/T/N” system, a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use and suited to population studies of cognitive aging.
Abstract: Biomarkers have become an essential component of Alzheimer disease (AD) research and because of the pervasiveness of AD pathology in the elderly, the same biomarkers are used in cognitive aging research. A number of current issues suggest that an unbiased descriptive classification scheme for these biomarkers would be useful. We propose the "A/T/N" system in which 7 major AD biomarkers are divided into 3 binary categories based on the nature of the pathophysiology that each measures. "A" refers to the value of a β-amyloid biomarker (amyloid PET or CSF Aβ42); "T," the value of a tau biomarker (CSF phospho tau, or tau PET); and "N," biomarkers of neurodegeneration or neuronal injury ([(18)F]-fluorodeoxyglucose-PET, structural MRI, or CSF total tau). Each biomarker category is rated as positive or negative. An individual score might appear as A+/T+/N-, or A+/T-/N-, etc. The A/T/N system includes the new modality tau PET. It is agnostic to the temporal ordering of mechanisms underlying AD pathogenesis. It includes all individuals in any population regardless of the mix of biomarker findings and therefore is suited to population studies of cognitive aging. It does not specify disease labels and thus is not a diagnostic classification system. It is a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use. Given the present lack of consensus among AD specialists on terminology across the clinically normal to dementia spectrum, a biomarker classification scheme will have broadest acceptance if it is independent from any one clinically defined diagnostic scheme.
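
Since the A/T/N system reduces to three binary flags rendered in a fixed order, it is easy to see how an individual's profile string is formed. A toy helper of our own devising (not from the paper):

```python
def atn_profile(amyloid_positive, tau_positive, neurodegeneration_positive):
    """Render the three binary A/T/N biomarker categories as a compact
    profile string, e.g. "A+/T+/N-" as in the classification scheme."""
    sign = lambda flag: "+" if flag else "-"
    return (f"A{sign(amyloid_positive)}/"
            f"T{sign(tau_positive)}/"
            f"N{sign(neurodegeneration_positive)}")

print(atn_profile(True, True, False))   # -> A+/T+/N-
print(atn_profile(True, False, False))  # -> A+/T-/N-
```

The 3 binary categories give 2³ = 8 possible profiles, which is what lets the scheme cover every individual in a population regardless of biomarker mix.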

Journal ArticleDOI
TL;DR: Combined with recent advances in human pluripotent stem cell technologies, 3D-bioprinted tissue models could serve as an enabling platform for high-throughput predictive drug screening and more effective regenerative therapies.

Journal ArticleDOI
TL;DR: The Canadian 24-Hour Movement Guidelines for Children and Youth: An Integration of Physical Activity, Sedentary Behaviour, and Sleep provide evidence-informed recommendations for a healthy day (24 h), comprising a combination of sleep, sedentary behaviours, light-, moderate-, and vigorous-intensity physical activity.
Abstract: Leaders from the Canadian Society for Exercise Physiology convened representatives of national organizations, content experts, methodologists, stakeholders, and end-users who followed rigorous and transparent guideline development procedures to create the Canadian 24-Hour Movement Guidelines for Children and Youth: An Integration of Physical Activity, Sedentary Behaviour, and Sleep. These novel guidelines for children and youth aged 5-17 years respect the natural and intuitive integration of movement behaviours across the whole day (24-h period). The development process was guided by the Appraisal of Guidelines for Research Evaluation (AGREE) II instrument and systematic reviews of evidence informing the guidelines were assessed using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) approach. Four systematic reviews (physical activity, sedentary behaviour, sleep, integrated behaviours) examining the relationships between and among movement behaviours and several health indicators were completed and interpreted by expert consensus. Complementary compositional analyses were performed using Canadian Health Measures Survey data to examine the relationships between movement behaviours and health indicators. A stakeholder survey was employed (n = 590) and 28 focus groups/stakeholder interviews (n = 104) were completed to gather feedback on draft guidelines. Following an introductory preamble, the guidelines provide evidence-informed recommendations for a healthy day (24 h), comprising a combination of sleep, sedentary behaviours, light-, moderate-, and vigorous-intensity physical activity. Proactive dissemination, promotion, implementation, and evaluation plans have been prepared in an effort to optimize uptake and activation of the new guidelines. Future research should consider the integrated relationships among movement behaviours, and similar integrated guidelines for other age groups should be developed.

Journal ArticleDOI
TL;DR: The JASPAR CORE collection was expanded with 494 new TF binding profiles, and 130 transcription factor flexible models, trained on ChIP-seq data for vertebrates, were introduced to capture dinucleotide dependencies within TF binding sites.
Abstract: JASPAR (http://jaspar.genereg.net) is an open-access database storing curated, non-redundant transcription factor (TF) binding profiles representing transcription factor binding preferences as position frequency matrices for multiple species in six taxonomic groups. For this 2016 release, we expanded the JASPAR CORE collection with 494 new TF binding profiles (315 in vertebrates, 11 in nematodes, 3 in insects, 1 in fungi and 164 in plants) and updated 59 profiles (58 in vertebrates and 1 in fungi). The introduced profiles represent an 83% expansion and 10% update when compared to the previous release. We updated the structural annotation of the TF DNA binding domains (DBDs) following a published hierarchical structural classification. In addition, we introduced 130 transcription factor flexible models trained on ChIP-seq data for vertebrates, which capture dinucleotide dependencies within TF binding sites. This new JASPAR release is accompanied by a new web tool to infer JASPAR TF binding profiles recognized by a given TF protein sequence. Moreover, we provide the users with a Ruby module complementing the JASPAR API to ease programmatic access and use of the JASPAR collection of profiles. Finally, we provide the JASPAR2016 R/Bioconductor data package with the data of this release.
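
JASPAR stores binding preferences as position frequency matrices (counts of each base at each motif position). A common way to use such a matrix, sketched here under our own assumptions (the tiny PFM, the pseudocount, and the uniform background are made up for illustration and are not a real JASPAR profile):

```python
import numpy as np

# A made-up 3-position PFM: rows are counts for A, C, G, T.
pfm = np.array([[8, 2, 10],    # A
                [1, 1,  0],    # C
                [1, 6,  0],    # G
                [0, 1,  0]])   # T
pseudo = 0.8                   # pseudocount to avoid log(0)

# Convert counts to probabilities, then to a log-odds PWM vs. a
# uniform 0.25 background.
probs = (pfm + pseudo) / (pfm + pseudo).sum(axis=0)
pwm = np.log2(probs / 0.25)

def score(seq):
    """Sum the per-position log-odds for a sequence window."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    return sum(pwm[idx[base], pos] for pos, base in enumerate(seq))

print(round(score("AGA"), 3))  # consensus AGA gets the top score
```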

Journal ArticleDOI
TL;DR: It is proposed that focusing only on instrumental or intrinsic values may fail to resonate with views on personal and collective well-being, or “what is right,” with regard to nature and the environment, and it is time to engage seriously with a third class of values, one with diverse roots and current expressions: relational values.
Abstract: A cornerstone of environmental policy is the debate over protecting nature for humans’ sake (instrumental values) or for nature’s (intrinsic values) (1). We propose that focusing only on instrumental or intrinsic values may fail to resonate with views on personal and collective well-being, or “what is right,” with regard to nature and the environment. Without complementary attention to other ways that value is expressed and realized by people, such a focus may inadvertently promote worldviews at odds with fair and desirable futures. It is time to engage seriously with a third class of values, one with diverse roots and current expressions: relational values. By doing so, we reframe the discussion about environmental protection, and open the door to new, potentially more productive policy approaches.

Journal ArticleDOI
11 Nov 2016
TL;DR: This work proposes a novel active learning method capable of enriching massive geometric datasets with accurate semantic region annotations, and demonstrates that incorporating verification of all produced labelings within this unified objective improves both accuracy and efficiency of the active learning procedure.
Abstract: Large repositories of 3D shapes provide valuable input for data-driven analysis and modeling tools. They are especially powerful once annotated with semantic information such as salient regions and functional parts. We propose a novel active learning method capable of enriching massive geometric datasets with accurate semantic region annotations. Given a shape collection and a user-specified region label our goal is to correctly demarcate the corresponding regions with minimal manual work. Our active framework achieves this goal by cycling between manually annotating the regions, automatically propagating these annotations across the rest of the shapes, manually verifying both human and automatic annotations, and learning from the verification results to improve the automatic propagation algorithm. We use a unified utility function that explicitly models the time cost of human input across all steps of our method. This allows us to jointly optimize for the set of models to annotate and for the set of models to verify based on the predicted impact of these actions on the human efficiency. We demonstrate that incorporating verification of all produced labelings within this unified objective improves both accuracy and efficiency of the active learning procedure. We automatically propagate human labels across a dynamic shape network using a conditional random field (CRF) framework, taking advantage of global shape-to-shape similarities, local feature similarities, and point-to-point correspondences. By combining these diverse cues we achieve higher accuracy than existing alternatives. We validate our framework on existing benchmarks demonstrating it to be significantly more efficient at using human input compared to previous techniques. 
We further validate its efficiency and robustness by annotating a massive shape dataset, labeling over 93,000 shape parts, across multiple model classes, and providing a labeled part collection more than one order of magnitude larger than existing ones.
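
The framework above is far richer than generic active learning (it jointly budgets annotation and verification time), but the core query loop it builds on can be sketched generically. A toy of our own, not the paper's method: uncertainty sampling with a 1-D logistic model, where the "human" is a hidden ground-truth labeler.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=200)         # unlabeled pool (1-D features)
true_label = (X > 0.5).astype(float)     # hidden ground truth (the "human")

def sigmoid(z):
    return 1 / (1 + np.exp(-np.clip(z, -30, 30)))

def fit_logistic(x, y, steps=500, lr=0.5):
    # Plain gradient descent on the logistic loss.
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = sigmoid(w * x + b)
        w -= lr * np.mean((p - y) * x)
        b -= lr * np.mean(p - y)
    return w, b

labeled = [0, 1]                         # seed with two queried labels
for _ in range(8):                       # budget: 8 more human queries
    w, b = fit_logistic(X[labeled], true_label[labeled])
    p = sigmoid(w * X + b)
    uncertainty = -np.abs(p - 0.5)       # highest where p is near 0.5
    uncertainty[labeled] = -np.inf       # never re-query a labeled point
    labeled.append(int(np.argmax(uncertainty)))

w, b = fit_logistic(X[labeled], true_label[labeled])
accuracy = ((sigmoid(w * X + b) > 0.5) == true_label).mean()
print(f"accuracy with {len(labeled)} labels: {accuracy:.3f}")
```

Queries concentrate near the decision boundary, so a handful of labels suffices; the paper's contribution is, in part, extending this kind of loop with explicit verification steps and a time-cost model for the annotator.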

Journal ArticleDOI
TL;DR: The concept of no longer clinically benefiting is introduced to underscore the distinction between first evidence of progression and the clinical need to terminate or change treatment, and the importance of documenting progression in existing lesions as distinct from the development of new lesions.
Abstract: Purpose: Evolving treatments, disease phenotypes, and biology, together with a changing drug development environment, have created the need to revise castration-resistant prostate cancer (CRPC) clinical trial recommendations to succeed those from prior Prostate Cancer Clinical Trials Working Groups. Methods: An international expert committee of prostate cancer clinical investigators (the Prostate Cancer Clinical Trials Working Group 3 [PCWG3]) was reconvened and expanded and met in 2012-2015 to formulate updated criteria on the basis of emerging trial data and validation studies of the Prostate Cancer Clinical Trials Working Group 2 recommendations. Results: PCWG3 recommends that baseline patient assessment include tumor histology, detailed records of prior systemic treatments and responses, and a detailed reporting of disease subtypes based on an anatomic pattern of metastatic spread. New recommendations for trial outcome measures include the time to event end point of symptomatic skeletal events, as well as tim...

Journal ArticleDOI
Nabila Aghanim, Monique Arnaud, M. Ashdown, J. Aumont +291 more · Institutions (73)
TL;DR: In this article, the authors present the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties.
Abstract: This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (l < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone.
We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck’s wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK^2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.
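The "Gaussian approximation to the distribution of cross-power spectra" mentioned above reduces, for a set of band powers, to a quadratic form in the residuals between measured and model spectra. A minimal sketch of that quadratic form follows; the band powers and the diagonal covariance are toy numbers for illustration, not Planck's actual likelihood code or covariance matrix:

```python
import numpy as np

def gaussian_loglike(c_hat, c_model, cov):
    """-2 ln L for band powers under the Gaussian approximation:
    (c_hat - c_model)^T cov^-1 (c_hat - c_model), additive constants dropped."""
    r = c_hat - c_model
    return r @ np.linalg.solve(cov, r)

# Toy example: three band powers with a diagonal covariance.
c_hat = np.array([1000.0, 2500.0, 1800.0])    # "measured" band powers
c_model = np.array([1010.0, 2480.0, 1805.0])  # model prediction
cov = np.diag([100.0, 400.0, 25.0])           # band-power covariance
chi2 = gaussian_loglike(c_hat, c_model, cov)
```

In the real likelihood the covariance is dense (bands are correlated by the sky mask) and the model spectrum depends on cosmological and foreground parameters; the quadratic structure is the same.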

Journal ArticleDOI
TL;DR: To better reflect the current state of knowledge and improve the feasibility of future research into its etiology and treatment, the working group proposes a new conceptual framework for acute respiratory deterioration in idiopathic pulmonary fibrosis and a revised definition and diagnostic criteria for acute exacerbation.
Abstract: Acute exacerbation of idiopathic pulmonary fibrosis has been defined as an acute, clinically significant, respiratory deterioration of unidentifiable cause. The objective of this international working group report on acute exacerbation of idiopathic pulmonary fibrosis was to provide a comprehensive update on the topic. A literature review was conducted to identify all relevant English text publications and abstracts. Evidence-based updates on the epidemiology, etiology, risk factors, prognosis, and management of acute exacerbations of idiopathic pulmonary fibrosis are provided. Finally, to better reflect the current state of knowledge and improve the feasibility of future research into its etiology and treatment, the working group proposes a new conceptual framework for acute respiratory deterioration in idiopathic pulmonary fibrosis and a revised definition and diagnostic criteria for acute exacerbation of idiopathic pulmonary fibrosis.

Journal ArticleDOI
TL;DR: A decade-long multinational ‘catch reconstruction’ project covering the Exclusive Economic Zones of the world's maritime countries and the High Seas from 1950 to 2010, and accounting for all fisheries, suggests that catch actually peaked at 130 million tonnes, and has been declining much more strongly since.
Abstract: Fisheries data assembled by the Food and Agriculture Organization (FAO) suggest that global marine fisheries catches increased to 86 million tonnes in 1996, then slightly declined. Here, using a decade-long multinational ‘catch reconstruction’ project covering the Exclusive Economic Zones of the world's maritime countries and the High Seas from 1950 to 2010, and accounting for all fisheries, we identify catch trajectories differing considerably from the national data submitted to the FAO. We suggest that catch actually peaked at 130 million tonnes, and has been declining much more strongly since. This decline in reconstructed catches reflects declines in industrial catches and, to a smaller extent, declining discards, despite industrial fishing having expanded from industrialized countries to the waters of developing countries. The differing trajectories documented here suggest a need for improved monitoring of all fisheries, including often neglected small-scale fisheries, and illegal and other problematic fisheries, as well as discarded bycatch.

Journal ArticleDOI
10 Mar 2016-Nature
TL;DR: These repeat bursts with high dispersion measure and variable spectra specifically seen from the direction of FRB 121102 support an origin in a young, highly magnetized, extragalactic neutron star.
Abstract: Observations of repeated fast radio bursts, having dispersion measures and sky positions consistent with those of FRB 121102, show that the signals do not originate in a single cataclysmic event and may come from a young, highly magnetized, extragalactic neutron star. Fast radio bursts (FRBs) are transient radio pulses that last a few milliseconds. They are thought to be extragalactic, and are of unknown physical origin. Many FRB models have proposed the cause to be one-time-only cataclysmic events. Follow-up monitoring of detected bursts did not reveal repeat bursts, consistent with such models. However, this paper reports ten additional bursts from the direction of FRB 121102, demonstrating that its source survived the energetic events that caused the bursts. Although there may be multiple physical origins for the bursts, the repeating bursts seen from FRB 121102 support an origin in a young, highly magnetized, extragalactic neutron star. Fast radio bursts are millisecond-duration astronomical radio pulses of unknown physical origin that appear to come from extragalactic distances [1-8]. Previous follow-up observations have failed to find additional bursts at the same dispersion measure (that is, the integrated column density of free electrons between source and telescope) and sky position as the original detections [9]. The apparent non-repeating nature of these bursts has led to the suggestion that they originate in cataclysmic events [10]. Here we report observations of ten additional bursts from the direction of the fast radio burst FRB 121102. These bursts have dispersion measures and sky positions consistent with the original burst [4]. This unambiguously identifies FRB 121102 as repeating and demonstrates that its source survives the energetic events that cause the bursts.
Additionally, the bursts from FRB 121102 show a wide range of spectral shapes that appear to be predominantly intrinsic to the source and which vary on timescales of minutes or less. Although there may be multiple physical origins for the population of fast radio bursts, these repeat bursts with high dispersion measure and variable spectra specifically seen from the direction of FRB 121102 support an origin in a young, highly magnetized, extragalactic neutron star [11,12].
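The dispersion measure described above (the integrated column density of free electrons along the line of sight) produces a frequency-dependent arrival delay scaling as ν⁻², which is how bursts at a common DM are matched. A minimal sketch using the standard cold-plasma dispersion constant (≈ 4.149 ms GHz² per pc cm⁻³); the DM value and band edges below are illustrative numbers, not measurements from the paper:

```python
# Standard cold-plasma dispersion: delay(nu) = K_DM * DM / nu^2,
# with nu in GHz, DM in pc cm^-3, and delay in milliseconds.
K_DM_MS = 4.149  # ms GHz^2 per (pc cm^-3), standard dispersion constant

def dispersion_delay_ms(dm_pc_cm3, nu_lo_ghz, nu_hi_ghz):
    """Extra arrival delay of the low-frequency band edge
    relative to the high-frequency edge."""
    return K_DM_MS * dm_pc_cm3 * (nu_lo_ghz ** -2 - nu_hi_ghz ** -2)

# Illustrative: a DM of 560 pc cm^-3 observed across a 1.2-1.5 GHz band.
delay = dispersion_delay_ms(560.0, 1.2, 1.5)
```

A delay of this size (hundreds of milliseconds across an L-band receiver) is far longer than the millisecond burst itself, which is why survey pipelines must de-disperse over a range of trial DMs to detect bursts at all.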

Journal ArticleDOI
Juanita A. Haagsma1, Nicholas Graetz1, Ian Bolliger1, Mohsen Naghavi1, Hideki Higashi1, Erin C Mullany1, Semaw Ferede Abera2, Jerry Puthenpurakal Abraham3, Koranteng Adofo4, Ubai Alsharif5, Emmanuel A. Ameh6, Walid Ammar, Carl Abelardo T. Antonio7, Lope H Barrero8, Tolesa Bekele9, Dipan Bose10, Alexandra Brazinova, Ferrán Catalá-López, Lalit Dandona1, Rakhi Dandona11, Paul I. Dargan12, Diego De Leo13, Louisa Degenhardt14, Sarah Derrett15, Samath D Dharmaratne16, Tim Driscoll17, Leilei Duan18, Sergey Petrovich Ermakov19, Farshad Farzadfar20, Valery L. Feigin21, Richard C. Franklin22, Belinda J. Gabbe23, Richard A. Gosselin24, Nima Hafezi-Nejad20, Randah R. Hamadeh25, Martha Híjar, Guoqing Hu26, Sudha Jayaraman27, Guohong Jiang, Yousef Khader28, Ejaz Ahmad Khan29, Sanjay Krishnaswami30, Chanda Kulkarni, Fiona Lecky31, Ricky Leung32, Raimundas Lunevicius33, Ronan A Lyons34, Marek Majdan, Amanda J. Mason-Jones35, Richard Matzopoulos36, Peter A. Meaney37, Wubegzier Mekonnen38, Ted R. Miller39, Charles Mock40, Rosana E. Norman41, Ricardo Orozco, Suzanne Polinder, Farshad Pourmalek42, Vafa Rahimi-Movaghar20, Amany H. Refaat43, David Rojas-Rueda, Nobhojit Roy44, David C. Schwebel45, Amira Shaheen46, Saeid Shahraz47, Vegard Skirbekk48, Kjetil Søreide49, Sergey Soshnikov, Dan J. Stein50, Bryan L. Sykes51, Karen M. Tabb52, Awoke Misganaw Temesgen, Eric Y. Tenkorang53, Alice Theadom21, Bach Xuan Tran54, Bach Xuan Tran55, Tommi Vasankari, Monica S. Vavilala40, Vasiliy Victorovich Vlassov56, Solomon Meseret Woldeyohannes57, Paul S. F. Yip58, Naohiro Yonemoto, Mustafa Z. Younis59, Chuanhua Yu60, Christopher J L Murray1, Theo Vos1 
Institute for Health Metrics and Evaluation1, College of Health Sciences, Bahrain2, Harvard University3, Kwame Nkrumah University of Science and Technology4, Charité5, Ahmadu Bello University6, University of the Philippines Manila7, Pontifical Xavierian University8, Madawalabu University9, World Bank10, Public Health Foundation of India11, Guy's and St Thomas' NHS Foundation Trust12, Griffith University13, University of New South Wales14, Massey University15, University of Peradeniya16, University of Sydney17, Chinese Center for Disease Control and Prevention18, Russian Academy of Sciences19, Tehran University of Medical Sciences20, Auckland University of Technology21, James Cook University22, Monash University23, University of California, San Francisco24, Arabian Gulf University25, Central South University26, Virginia Commonwealth University27, Jordan University of Science and Technology28, Health Services Academy29, Oregon Health & Science University30, University of Sheffield31, University at Albany, SUNY32, Aintree University Hospitals NHS Foundation Trust33, Swansea University34, University of York35, South African Medical Research Council36, Children's Hospital of Philadelphia37, Addis Ababa University38, Curtin University39, University of Washington40, Queensland University of Technology41, University of British Columbia42, Suez Canal University43, Karolinska Institutet44, University of Alabama at Birmingham45, An-Najah National University46, Tufts Medical Center47, Norwegian Institute of Public Health48, Stavanger University Hospital49, University of Cape Town50, University of California, Irvine51, University of Illinois at Urbana–Champaign52, St. John's University53, Hanoi Medical University54, Johns Hopkins University55, National Research University – Higher School of Economics56, University of Gondar57, University of Hong Kong58, Jackson State University59, Wuhan University60
TL;DR: An overview of injury estimates from the 2013 update of GBD is provided, with detailed information on incidence, mortality, DALYs and rates of change from 1990 to 2013 for 26 causes of injury, globally, by region and by country.
Abstract: Background The Global Burden of Diseases (GBD), Injuries, and Risk Factors study used the disability-adjusted life year (DALY) to quantify the burden of diseases, injuries, and risk factors. This paper provides an overview of injury estimates from the 2013 update of GBD, with detailed information on incidence, mortality, DALYs and rates of change from 1990 to 2013 for 26 causes of injury, globally, by region and by country. Methods Injury mortality was estimated using the extensive GBD mortality database, corrections for ill-defined cause of death and the cause of death ensemble modelling tool. Morbidity estimation was based on inpatient and outpatient data sets, 26 cause-of-injury and 47 nature-of-injury categories, and seven follow-up studies with patient-reported long-term outcome measures. Results In 2013, 973 million (uncertainty interval (UI) 942 to 993) people sustained injuries that warranted some type of healthcare and 4.8 million (UI 4.5 to 5.1) people died from injuries. Between 1990 and 2013 the global age-standardised injury DALY rate decreased by 31% (UI 26% to 35%). The rate of decline in DALY rates was significant for 22 cause-of-injury categories, including all the major injuries. Conclusions Injuries continue to be an important cause of morbidity and mortality in the developed and developing world. The decline in rates for almost all injuries is so prominent that it warrants a general statement that the world is becoming a safer place to live in. However, the patterns vary widely by cause, age, sex, region and time and there are still large improvements that need to be made.
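The age-standardised DALY rates reported above weight age-specific rates by a fixed standard population, so that comparisons across years are not confounded by shifting age structure. A minimal sketch of the weighting; the population shares and rates below are made-up illustrative numbers, not GBD's standard population or its estimates:

```python
import numpy as np

# Fixed standard-population shares for three age groups (illustrative).
std_weights = np.array([0.40, 0.35, 0.25])

# Age-specific DALY rates per 100,000 in two years (illustrative).
rates_1990 = np.array([900.0, 1200.0, 2000.0])
rates_2013 = np.array([600.0, 850.0, 1400.0])

# Age-standardised rate = sum over age groups of (weight * rate).
asr_1990 = std_weights @ rates_1990
asr_2013 = std_weights @ rates_2013
pct_change = 100.0 * (asr_2013 - asr_1990) / asr_1990
```

Because both years use the same weights, the percentage change reflects changes in the age-specific rates themselves rather than in population ageing.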

Journal ArticleDOI
TL;DR: This approach demonstrates that the addition of even sparse ground-based measurements to more globally continuous PM2.5 data sources can yield valuable improvements to PM 2.5 characterization on a global scale.
Abstract: We estimated global fine particulate matter (PM2.5) concentrations using information from satellite-, simulation- and monitor-based sources by applying a Geographically Weighted Regression (GWR) to global geophysically based satellite-derived PM2.5 estimates. Aerosol optical depth from multiple satellite products (MISR, MODIS Dark Target, MODIS and SeaWiFS Deep Blue, and MODIS MAIAC) was combined with simulation (GEOS-Chem) based upon their relative uncertainties as determined using ground-based sun photometer (AERONET) observations for 1998-2014. The GWR predictors included simulated aerosol composition and land use information. The resultant PM2.5 estimates were highly consistent (R² = 0.81) with out-of-sample cross-validated PM2.5 concentrations from monitors. The global population-weighted annual average PM2.5 concentrations were 3-fold higher than the 10 μg/m³ WHO guideline, driven by exposures in Asian and African regions. Estimates in regions with high contributions from mineral dust were associated with higher uncertainty, resulting from both sparse ground-based monitoring and challenging conditions for retrieval and simulation. This approach demonstrates that the addition of even sparse ground-based measurements to more globally continuous PM2.5 data sources can yield valuable improvements to PM2.5 characterization on a global scale.
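The geographically weighted regression described above fits, at each prediction location, a separate weighted least-squares model in which nearby monitors count more than distant ones. A minimal sketch with a Gaussian distance-decay kernel and synthetic data; the kernel choice, bandwidth, and data are illustrative assumptions, not the authors' actual pipeline or predictors:

```python
import numpy as np

def gwr_predict(coords, X, y, target_coord, target_x, bandwidth):
    """Locally weighted least-squares prediction at one location.

    coords : (n, 2) monitor coordinates
    X      : (n, p) predictor values at the monitors
    y      : (n,)   observed values at the monitors
    """
    d = np.linalg.norm(coords - target_coord, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)      # Gaussian distance-decay weights
    Xb = np.column_stack([np.ones(len(y)), X])   # add an intercept column
    XtW = Xb.T * w                               # scale each observation by its weight
    beta = np.linalg.solve(XtW @ Xb, XtW @ y)    # weighted normal equations
    return beta[0] + beta[1:] @ np.atleast_1d(target_x)

# Synthetic check: data that are exactly linear in one predictor,
# so the local fit should recover the relationship everywhere.
rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 1.0, size=(50, 2))
x = rng.uniform(0.0, 1.0, size=50)
y = 1.0 + 2.0 * x
pred = gwr_predict(coords, x.reshape(-1, 1), y,
                   np.array([0.5, 0.5]), 0.3, bandwidth=0.2)
```

Refitting the coefficients per location is what lets the monitor-based correction vary smoothly across regions with very different monitor density.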

Journal ArticleDOI
TL;DR: It is demonstrated that zygomycetes comprise two major clades that form a paraphyletic grade, and the phyla Mucoromycota and ZoopagomyCota are circumscribed.
Abstract: Zygomycete fungi were classified as a single phylum, Zygomycota, based on sexual reproduction by zygospores, frequent asexual reproduction by sporangia, absence of multicellular sporocarps, and production of coenocytic hyphae, all with some exceptions. Molecular phylogenies based on one or a few genes did not support the monophyly of the phylum, however, and the phylum was subsequently abandoned. Here we present phylogenetic analyses of a genome-scale data set for 46 taxa, including 25 zygomycetes, and 192 proteins, and we demonstrate that zygomycetes comprise two major clades that form a paraphyletic grade. A formal phylogenetic classification is proposed herein and includes two phyla, six subphyla, four classes and 16 orders. On the basis of these results, the phyla Mucoromycota and Zoopagomycota are circumscribed. Zoopagomycota comprises Entomophthoromycotina, Kickxellomycotina and Zoopagomycotina; it constitutes the earliest diverging lineage of zygomycetes and contains species that are primarily parasites and pathogens of small animals (e.g. amoebae, insects) and other fungi, i.e. mycoparasites. Mucoromycota comprises Glomeromycotina, Mortierellomycotina, and Mucoromycotina and is sister to Dikarya. It is the more derived clade of zygomycetes and mainly consists of mycorrhizal fungi, root endophytes, and decomposers of plant material. Evolution of trophic modes, morphology, and analysis of genome-scale data are discussed.

Journal ArticleDOI
TL;DR: In this paper, the authors combined satellite-based estimates, chemical transport model simulations, and ground measurements from 79 different countries to produce global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0. 1° spatial resolution for five-year intervals from 1990 to 2010 and the year 2013.
Abstract: Exposure to ambient air pollution is a major risk factor for global disease. Assessment of the impacts of air pollution on population health and evaluation of trends relative to other major risk factors requires regularly updated, accurate, spatially resolved exposure estimates. We combined satellite-based estimates, chemical transport model simulations, and ground measurements from 79 different countries to produce global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0.1° spatial resolution for five-year intervals from 1990 to 2010 and the year 2013. These estimates were applied to assess population-weighted mean concentrations for 1990-2013 for each of 188 countries. In 2013, 87% of the world's population lived in areas exceeding the World Health Organization Air Quality Guideline of 10 μg/m³ PM2.5 (annual average). Between 1990 and 2013, global population-weighted PM2.5 increased by 20.4%, driven by trends in South Asia, Southeast Asia, and China. Decreases in population-weighted mean concentrations of PM2.5 were evident in most high-income countries. Population-weighted mean concentrations of ozone increased globally by 8.9% from 1990 to 2013, with increases in most countries, except for modest decreases in North America, parts of Europe, and several countries in Southeast Asia.
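A population-weighted mean concentration of the kind reported above is simply the grid-cell concentrations averaged with population counts as weights, so heavily populated cells dominate the exposure estimate. A minimal sketch with three hypothetical grid cells (all numbers are illustrative, not from the study):

```python
import numpy as np

# Hypothetical grid cells: population counts and annual-mean PM2.5 (ug/m^3).
population = np.array([1_000_000, 250_000, 4_000_000])
pm25 = np.array([8.0, 15.0, 35.0])

# Population-weighted mean: sum(pop * conc) / sum(pop).
weighted_mean = np.average(pm25, weights=population)

# Share of the population living above the 10 ug/m^3 WHO guideline.
share_above = population[pm25 > 10.0].sum() / population.sum()
```

Note how the result sits far above the simple (unweighted) cell average whenever the most polluted cells are also the most populated, which is the situation the abstract describes for South and East Asia.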

Journal ArticleDOI
TL;DR: Members of the American Academy of Sleep Medicine developed consensus recommendations for the amount of sleep needed to promote optimal health in children and adolescents using a modified RAND Appropriateness Method.
Abstract: Sleep is essential for optimal health in children and adolescents. Members of the American Academy of Sleep Medicine developed consensus recommendations for the amount of sleep needed to promote optimal health in children and adolescents, using a modified RAND Appropriateness Method.