
Showing papers by "Trinity College, Dublin" published in 2016


Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, M. Ashdown4  +334 moreInstitutions (82)
TL;DR: In this article, the authors present a cosmological analysis based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation.
Abstract: This paper presents cosmological results based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation. Our results are in very good agreement with the 2013 analysis of the Planck nominal-mission temperature data, but with increased precision. The temperature and polarization power spectra are consistent with the standard spatially-flat 6-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations (denoted “base ΛCDM” in this paper). From the Planck temperature data combined with Planck lensing, for this cosmology we find a Hubble constant, H0 = (67.8 ± 0.9) km s⁻¹ Mpc⁻¹, a matter density parameter Ωm = 0.308 ± 0.012, and a tilted scalar spectral index with ns = 0.968 ± 0.006, consistent with the 2013 analysis. Note that in this abstract we quote 68% confidence limits on measured parameters and 95% upper limits on other parameters. We present the first results of polarization measurements with the Low Frequency Instrument at large angular scales. Combined with the Planck temperature and lensing data, these measurements give a reionization optical depth of τ = 0.066 ± 0.016, corresponding to a reionization redshift of z_re = 8.8 (+1.7, −1.4). These results are consistent with those from WMAP polarization measurements cleaned for dust emission using 353-GHz polarization maps from the High Frequency Instrument. We find no evidence for any departure from base ΛCDM in the neutrino sector of the theory; for example, combining Planck observations with other astrophysical data we find Neff = 3.15 ± 0.23 for the effective number of relativistic degrees of freedom, consistent with the value Neff = 3.046 of the Standard Model of particle physics. The sum of neutrino masses is constrained to ∑ mν < 0.23 eV. The spatial curvature of our Universe is found to be very close to zero, with |ΩK| < 0.005. Adding a tensor component as a single-parameter extension to base ΛCDM we find an upper limit on the tensor-to-scalar ratio of r_0.002 < 0.11, consistent with the Planck 2013 results and with the B-mode polarization constraints from a joint analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP B-mode data to our analysis leads to a tighter constraint of r_0.002 < 0.09 and disfavours inflationary models with a V(φ) ∝ φ² potential. The addition of Planck polarization data leads to strong constraints on deviations from a purely adiabatic spectrum of fluctuations. We find no evidence for any contribution from isocurvature perturbations or from cosmic defects. Combining Planck data with other astrophysical data, including Type Ia supernovae, the equation of state of dark energy is constrained to w = −1.006 ± 0.045, consistent with the expected value for a cosmological constant. The standard big bang nucleosynthesis predictions for the helium and deuterium abundances for the best-fit Planck base ΛCDM cosmology are in excellent agreement with observations. We also provide constraints on annihilating dark matter and on possible deviations from the standard recombination history. In neither case do we find evidence for new physics. The Planck results for base ΛCDM are in good agreement with baryon acoustic oscillation data and with the JLA sample of Type Ia supernovae. However, as in the 2013 analysis, the amplitude of the fluctuation spectrum is found to be higher than inferred from some analyses of rich cluster counts and weak gravitational lensing.
We show that these tensions cannot easily be resolved with simple modifications of the base ΛCDM cosmology. Apart from these tensions, the base ΛCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.
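As a quick illustration (not part of the paper), the quoted base-ΛCDM central values fully specify the background expansion through the flat-universe Friedmann equation, H(z) = H0 √(Ωm(1+z)³ + ΩΛ). A minimal sketch, neglecting radiation and treating dark energy as a cosmological constant:

```python
import math

# Central values quoted in the abstract (Planck TT + lensing, base LCDM)
H0 = 67.8        # Hubble constant, km s^-1 Mpc^-1
OM = 0.308       # matter density parameter
OL = 1.0 - OM    # flatness: Omega_Lambda = 1 - Omega_m (radiation neglected)

def hubble(z):
    """Expansion rate H(z) from the flat-LCDM Friedmann equation."""
    return H0 * math.sqrt(OM * (1.0 + z) ** 3 + OL)

print(hubble(0.0))   # 67.8 by construction
print(hubble(1.0))   # ~120 km s^-1 Mpc^-1 at z = 1
```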

10,728 citations


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4  +2519 moreInstitutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
TL;DR: A brief refresher course on six of the major metabolic pathways involved in immunometabolism is provided, giving specific examples of how precise changes in the metabolites of these pathways shape the immune cell response.
Abstract: Immunometabolism is emerging as an important area of immunological research, but for many immunologists the complexity of the field can be daunting. Here, the authors provide an overview of six key metabolic pathways that occur in immune cells and explain what is known (and what is still to be uncovered) concerning their effects on immune cell function. In recent years a substantial number of findings have been made in the area of immunometabolism, by which we mean the changes in intracellular metabolic pathways in immune cells that alter their function. Here, we provide a brief refresher course on six of the major metabolic pathways involved (specifically, glycolysis, the tricarboxylic acid (TCA) cycle, the pentose phosphate pathway, fatty acid oxidation, fatty acid synthesis and amino acid metabolism), giving specific examples of how precise changes in the metabolites of these pathways shape the immune cell response. What is emerging is a complex interplay between metabolic reprogramming and immunity, which is providing an extra dimension to our understanding of the immune system in health and disease.

1,857 citations


Journal ArticleDOI
22 Apr 2016-Science
TL;DR: Proof-of-principle experimental studies support the hypothesis that trained immunity is one of the main immunological processes that mediate the nonspecific protective effects against infections induced by vaccines, such as bacillus Calmette-Guérin or measles vaccination.
Abstract: The general view that only adaptive immunity can build immunological memory has recently been challenged. In organisms lacking adaptive immunity, as well as in mammals, the innate immune system can mount resistance to reinfection, a phenomenon termed "trained immunity" or "innate immune memory." Trained immunity is orchestrated by epigenetic reprogramming, broadly defined as sustained changes in gene expression and cell physiology that do not involve permanent genetic changes such as mutations and recombination, which are essential for adaptive immunity. The discovery of trained immunity may open the door for novel vaccine approaches, new therapeutic strategies for the treatment of immune deficiency states, and modulation of exaggerated inflammation in autoinflammatory diseases.

1,690 citations


Journal ArticleDOI
06 Oct 2016-Cell
TL;DR: It is demonstrated that upon lipopolysaccharide stimulation, macrophages shift from producing ATP by oxidative phosphorylation to glycolysis while also increasing succinate levels, and repurpose mitochondria from ATP synthesis to ROS production in order to promote a pro-inflammatory state.

1,249 citations


Journal ArticleDOI
TL;DR: In this review, O’Neill and Pearce discuss recent intriguing findings on metabolic changes regulating the function of macrophages and dendritic cells.
Abstract: Recent studies on intracellular metabolism in dendritic cells (DCs) and macrophages provide new insights on the functioning of these critical controllers of innate and adaptive immunity. Both cell types undergo profound metabolic reprogramming in response to environmental cues, such as hypoxia or nutrient alterations, but importantly also in response to danger signals and cytokines. Metabolites such as succinate and citrate have a direct impact on the functioning of macrophages. Immunogenicity and tolerogenicity of DCs are also determined by anabolic and catabolic processes, respectively. These findings provide new prospects for therapeutic manipulation in inflammatory diseases and cancer.

1,089 citations


Journal ArticleDOI
Nabila Aghanim1, Monique Arnaud2, M. Ashdown3, J. Aumont1  +291 moreInstitutions (73)
TL;DR: In this article, the authors present the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties.
Abstract: This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (l < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck’s wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK^2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.
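For illustration only: the Gaussian approximation used at high multipoles amounts to a multivariate normal likelihood over power-spectrum band powers. A minimal numpy sketch with hypothetical inputs (the names and numbers below are not from the Planck pipeline):

```python
import numpy as np

def gaussian_loglike(cl_data, cl_model, cov):
    """Multivariate normal log-likelihood for a vector of band powers.
    cl_data, cl_model: (n,) arrays; cov: (n, n) covariance matrix."""
    r = cl_data - cl_model
    sign, logdet = np.linalg.slogdet(cov)
    return -0.5 * (r @ np.linalg.solve(cov, r)
                   + logdet + r.size * np.log(2.0 * np.pi))

# hypothetical band powers and a diagonal covariance
cl_hat = np.array([2400.0, 2210.0, 1980.0])
cl_th = np.array([2380.0, 2250.0, 1995.0])
print(gaussian_loglike(cl_hat, cl_th, np.diag([40.0**2, 35.0**2, 30.0**2])))
```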

932 citations


Journal ArticleDOI
TL;DR: Worldwide cooperative analyses of brain imaging data support a profile of subcortical abnormalities in schizophrenia, which is consistent with that based on traditional meta-analytic approaches, and validates that collaborative data analyses can readily be used across brain phenotypes and disorders.
Abstract: The profile of brain structural abnormalities in schizophrenia is still not fully understood, despite decades of research using brain scans. To validate a prospective meta-analysis approach to analyzing multicenter neuroimaging data, we analyzed brain MRI scans from 2028 schizophrenia patients and 2540 healthy controls, assessed with standardized methods at 15 centers worldwide. We identified subcortical brain volumes that differentiated patients from controls, and ranked them according to their effect sizes. Compared with healthy controls, patients with schizophrenia had smaller hippocampus (Cohen's d=-0.46), amygdala (d=-0.31), thalamus (d=-0.31), accumbens (d=-0.25) and intracranial volumes (d=-0.12), as well as larger pallidum (d=0.21) and lateral ventricle volumes (d=0.37). Putamen and pallidum volume augmentations were positively associated with duration of illness and hippocampal deficits scaled with the proportion of unmedicated patients. Worldwide cooperative analyses of brain imaging data support a profile of subcortical abnormalities in schizophrenia, which is consistent with that based on traditional meta-analytic approaches. This first ENIGMA Schizophrenia Working Group study validates that collaborative data analyses can readily be used across brain phenotypes and disorders and encourages analysis and data sharing efforts to further our understanding of severe mental illness.
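The effect sizes quoted above are Cohen's d values, i.e. group mean differences scaled by a pooled standard deviation. A minimal sketch; the volumes below are hypothetical, chosen only to reproduce the reported hippocampal d ≈ −0.46 at the study's sample sizes:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d: mean difference scaled by the pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

# hypothetical hippocampal volumes (cm^3): patients vs. controls
print(cohens_d(3.9, 0.45, 2028, 4.1, 0.42, 2540))   # ~ -0.46
```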

919 citations


Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, M. Ashdown4  +301 moreInstitutions (72)
TL;DR: In this paper, the implications of Planck data for models of dark energy (DE) and modified gravity (MG) beyond the standard cosmological constant scenario were studied, and it was shown that the density of DE at early times has to be below 2% of the critical density, even when forced to play a role for z < 50.
Abstract: We study the implications of Planck data for models of dark energy (DE) and modified gravity (MG) beyond the standard cosmological constant scenario. We start with cases where the DE only directly affects the background evolution, considering Taylor expansions of the equation of state w(a), as well as principal component analysis and parameterizations related to the potential of a minimally coupled DE scalar field. When estimating the density of DE at early times, we significantly improve present constraints and find that it has to be below ~2% (at 95% confidence) of the critical density, even when forced to play a role for z < 50 only. We then move to general parameterizations of the DE or MG perturbations that encompass both effective field theories and the phenomenology of gravitational potentials in MG models. Lastly, we test a range of specific models, such as k-essence, f(R) theories, and coupled DE. In addition to the latest Planck data, for our main analyses, we use background constraints from baryonic acoustic oscillations, type-Ia supernovae, and local measurements of the Hubble constant. We further show the impact of measurements of the cosmological perturbations, such as redshift-space distortions and weak gravitational lensing. These additional probes are important tools for testing MG models and for breaking degeneracies that are still present in the combination of Planck and background data sets. All results that include only background parameterizations (expansion of the equation of state, early DE, general potentials in minimally-coupled scalar fields or principal component analysis) are in agreement with ΛCDM. When testing models that also change perturbations (even when the background is fixed to ΛCDM), some tensions appear in a few scenarios: the maximum one found is ~2σ for Planck TT+lowP when parameterizing observables related to the gravitational potentials with a chosen time dependence; the tension increases to, at most, 3σ when external data sets are included. It however disappears when including CMB lensing.
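A hedged aside: the Taylor expansions of the equation of state w(a) mentioned above are commonly truncated at first order as the CPL form w(a) = w0 + wa(1 − a), for which the dark-energy density evolution has a closed form. A sketch with illustrative parameter values (not constraints from the paper):

```python
import math

def w_cpl(a, w0, wa):
    """First-order (CPL) expansion of the equation of state: w(a) = w0 + wa*(1 - a)."""
    return w0 + wa * (1.0 - a)

def rho_de_ratio(a, w0, wa):
    """rho_DE(a)/rho_DE(1): closed-form solution of the continuity equation
    for the CPL form; a = 1/(1+z) is the scale factor."""
    return a ** (-3.0 * (1.0 + w0 + wa)) * math.exp(-3.0 * wa * (1.0 - a))

print(rho_de_ratio(0.5, -1.0, 0.0))   # 1.0: a cosmological constant
print(rho_de_ratio(0.5, -0.9, 0.1))   # an illustrative evolving-DE case
```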

816 citations


Journal ArticleDOI
TL;DR: This paper outlines a set of requirements for IoT middleware and presents a comprehensive review of the existing middleware solutions against those requirements; open research issues, challenges, and future research directions are also highlighted.
Abstract: The Internet of Things (IoT) envisages a future in which digital and physical things or objects (e.g., smartphones, TVs, cars) can be connected by means of suitable information and communication technologies, to enable a range of applications and services. The IoT’s characteristics, including an ultra-large-scale network of things, device and network level heterogeneity, and large numbers of events generated spontaneously by these things, will make development of the diverse applications and services a very challenging task. In general, middleware can ease a development process by integrating heterogeneous computing and communications devices, and supporting interoperability within the diverse applications and services. Recently, there have been a number of proposals for IoT middleware. These proposals mostly addressed wireless sensor networks (WSNs), a key component of IoT, but do not consider RF identification (RFID), machine-to-machine (M2M) communications, and supervisory control and data acquisition (SCADA), the three other core elements in the IoT vision. In this paper, we outline a set of requirements for IoT middleware, and present a comprehensive review of the existing middleware solutions against those requirements. In addition, open research issues, challenges, and future research directions are highlighted.
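As a minimal, hypothetical illustration of the decoupling role the paper attributes to middleware (not an API from any surveyed system), a topic-based publish/subscribe broker lets heterogeneous things publish events without knowing which applications consume them:

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Hypothetical in-process pub/sub broker illustrating device/application decoupling."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]):
        self._subs[topic].append(handler)

    def publish(self, topic: str, event: dict):
        # Devices publish events; the broker routes them to interested applications.
        for handler in self._subs[topic]:
            handler(event)

broker = Broker()
broker.subscribe("home/temperature", lambda e: print("app got:", e))
broker.publish("home/temperature", {"sensor": "t1", "celsius": 21.5})
```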

805 citations


Posted ContentDOI
23 Feb 2016-bioRxiv
TL;DR: A collaborative effort in which a centralized analysis pipeline is applied to a SCZ cohort, finding support at a suggestive level for nine additional candidate susceptibility and protective loci, which consist predominantly of CNVs mediated by non-allelic homologous recombination (NAHR).
Abstract: Genomic copy number variants (CNVs) have been strongly implicated in the etiology of schizophrenia (SCZ). However, apart from a small number of risk variants, elucidation of the CNV contribution to risk has been difficult due to the rarity of risk alleles, all occurring in less than 1% of cases. We sought to address this obstacle through a collaborative effort in which we applied a centralized analysis pipeline to a SCZ cohort of 21,094 cases and 20,227 controls. We observed a global enrichment of CNV burden in cases (OR = 1.11, P = 5.7e-15), which persisted after excluding loci implicated in previous studies (OR = 1.07, P = 1.7e-6). CNV burden is also enriched for genes associated with synaptic function (OR = 1.68, P = 2.8e-11) and neurobehavioral phenotypes in mouse (OR = 1.18, P = 7.3e-5). We identified genome-wide significant support for eight loci, including 1q21.1, 2p16.3 (NRXN1), 3q29, 7q11.2, 15q13.3, distal 16p11.2, proximal 16p11.2 and 22q11.2. We find support at a suggestive level for nine additional candidate susceptibility and protective loci, which consist predominantly of CNVs mediated by non-allelic homologous recombination (NAHR).
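The burden statistics above are odds ratios. A generic sketch of how an OR and its 95% confidence interval follow from a 2×2 table; the carrier counts below are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
       a = exposed cases, b = unexposed cases,
       c = exposed controls, d = unexposed controls."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(orr) - z * se)
    hi = math.exp(math.log(orr) + z * se)
    return orr, (lo, hi)

# hypothetical CNV-carrier counts in cases vs. controls
print(odds_ratio_ci(420, 20674, 360, 19867))
```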

Journal ArticleDOI
TL;DR: The goal of this paper is to update a more extensive review and guidelines paper published in 2012 with any pertinent update pertaining to the diagnosis and staging of individual primary cancer patients.
Abstract: The goal of this paper is to update a more extensive review and guidelines paper published in 2012 [1]. Generally, any pertinent update pertaining to the diagnosis and staging of individual prima ...

Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, Frederico Arroja4  +306 moreInstitutions (75)
TL;DR: In this article, the Planck full mission cosmic microwave background (CMB) temperature and E-mode polarization maps are analysed to obtain constraints on primordial non-Gaussianity (NG).
Abstract: The Planck full mission cosmic microwave background (CMB) temperature and E-mode polarization maps are analysed to obtain constraints on primordial non-Gaussianity (NG). Using three classes of optimal bispectrum estimators – separable template-fitting (KSW), binned, and modal – we obtain consistent values for the primordial local, equilateral, and orthogonal bispectrum amplitudes, quoting as our final result from temperature alone f_NL^local = 2.5 ± 5.7, f_NL^equil = −16 ± 70, and f_NL^ortho = −34 ± 32 (68% CL, statistical). Combining temperature and polarization data we obtain f_NL^local = 0.8 ± 5.0, f_NL^equil = −4 ± 43, and f_NL^ortho = −26 ± 21 (68% CL, statistical). The results are based on comprehensive cross-validation of these estimators on Gaussian and non-Gaussian simulations, are stable across component separation techniques, pass an extensive suite of tests, and are consistent with estimators based on measuring the Minkowski functionals of the CMB. The effect of time-domain de-glitching systematics on the bispectrum is negligible. In spite of these test outcomes we conservatively label the results including polarization data as preliminary, owing to a known mismatch of the noise model in simulations and the data. Beyond estimates of individual shape amplitudes, we present model-independent, three-dimensional reconstructions of the Planck CMB bispectrum and derive constraints on early universe scenarios that generate primordial NG, including general single-field models of inflation, axion inflation, initial state modifications, models producing parity-violating tensor bispectra, and directionally dependent vector models. We present a wide survey of scale-dependent feature and resonance models, accounting for the “look elsewhere” effect in estimating the statistical significance of features. We also look for isocurvature NG, and find no signal, but we obtain constraints that improve significantly with the inclusion of polarization. The primordial trispectrum amplitude in the local model is constrained to be g_NL^local = (−9.0 ± 7.7) × 10^4 (68% CL, statistical).

Journal ArticleDOI
TL;DR: A communication system is considered in which status updates arrive at a source node and must be transmitted through a network to the intended destination node; the source-destination link is modeled using queueing theory, and the time it takes to successfully transmit a packet to the destination is assumed to be an exponentially distributed service time.
Abstract: We consider a communication system in which status updates arrive at a source node, and should be transmitted through a network to the intended destination node. The status updates are samples of a random process under observation, transmitted as packets, which also contain the time stamp to identify when the sample was generated. The age of the information available to the destination node is the time elapsed since the last received update was generated. In this paper, we model the source-destination link using queueing theory, and we assume that the time it takes to successfully transmit a packet to the destination is an exponentially distributed service time. We analyze the age of information in the case that the source node has the capability to manage the arriving samples, possibly discarding packets in order to avoid wasting network resources with the transmission of stale information. In addition to characterizing the average age, we propose a new metric, called peak age, which provides information about the maximum value of the age, achieved immediately before receiving an update.
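A hedged sketch of this kind of system: simulate a single-server status-update link with exponential service and the simplest packet-management policy (a sample arriving while the server is busy is discarded), then estimate the average age and the proposed peak-age metric from the sawtooth age process. The rates and horizon below are arbitrary:

```python
import random

def simulate_age(lam=1.0, mu=2.0, horizon=200000.0, seed=7):
    """M/M/1/1 status updates: arrivals Poisson(lam), service Exp(mu),
    samples arriving while the server is busy are discarded."""
    rng = random.Random(seed)
    t_arr = rng.expovariate(lam)   # next arrival of a fresh sample
    t_dep = float("inf")           # next delivery to the destination
    gen = None                     # generation time of the packet in service
    deliveries = []                # (delivery_time, generation_time)
    while min(t_arr, t_dep) < horizon:
        if t_arr < t_dep:                      # a new sample is generated
            if gen is None:                    # server idle: start service
                gen = t_arr
                t_dep = t_arr + rng.expovariate(mu)
            t_arr += rng.expovariate(lam)      # if busy, the sample is discarded
        else:                                  # an update is delivered
            deliveries.append((t_dep, gen))
            gen, t_dep = None, float("inf")
    area, peaks = 0.0, []
    t0, g0 = deliveries[0]
    for t1, g1 in deliveries[1:]:
        a0 = t0 - g0                           # age just after an update
        area += (a0 + (a0 + t1 - t0)) * (t1 - t0) / 2.0   # trapezoid under sawtooth
        peaks.append(t1 - g0)                  # age just before the next update
        t0, g0 = t1, g1
    span = deliveries[-1][0] - deliveries[0][0]
    return area / span, sum(peaks) / len(peaks)

avg_age, peak_age = simulate_age()
print(round(avg_age, 3), round(peak_age, 3))
```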

Journal ArticleDOI
09 Dec 2016-Science
TL;DR: By considering both the connectivity and mobility of the nanosheets, a quantitative model is developed that completely describes the electromechanical properties of the graphene-polymer nanocomposites, allowing the manufacture of strain sensors that can detect respiration and the footsteps of spiders.
Abstract: Despite its widespread use in nanocomposites, the effect of embedding graphene in highly viscoelastic polymer matrices is not well understood. We added graphene to a lightly cross-linked polysilicone, often encountered as Silly Putty, changing its electromechanical properties substantially. The resulting nanocomposites display unusual electromechanical behavior, such as postdeformation temporal relaxation of electrical resistance and nonmonotonic changes in resistivity with strain. These phenomena are associated with the mobility of the nanosheets in the low-viscosity polymer matrix. By considering both the connectivity and mobility of the nanosheets, we developed a quantitative model that completely describes the electromechanical properties. These nanocomposites are sensitive electromechanical sensors with gauge factors >500 that can measure pulse, blood pressure, and even the impact associated with the footsteps of a small spider.
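For context, the gauge factor quoted above is defined as GF = (ΔR/R0)/ε, the fractional resistance change per unit strain. A minimal sketch with hypothetical readings:

```python
def gauge_factor(r0, r, strain):
    """r0: unstrained resistance (ohm); r: strained resistance (ohm);
    strain: dimensionless (e.g. 0.01 for 1%)."""
    return (r - r0) / r0 / strain

# hypothetical readings: 10 kOhm at rest, 62 kOhm at 1% strain
print(gauge_factor(10e3, 62e3, 0.01))   # 520.0, i.e. GF > 500
```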

Journal ArticleDOI
TL;DR: This review identifies the emerging evidence to support policy for the management of people with multimorbidity and common comorbidities in primary care and community settings; confidence in the results regarding the effectiveness of interventions ranged from low to high certainty.
Abstract: Background Many people with chronic disease have more than one chronic condition, which is referred to as multimorbidity. The term comorbidity is also used but this is now taken to mean that there is a defined index condition with other linked conditions, for example diabetes and cardiovascular disease. It is also used when there are combinations of defined conditions that commonly co-exist, for example diabetes and depression. While this is not a new phenomenon, there is greater recognition of its impact and the importance of improving outcomes for individuals affected. Research in the area to date has focused mainly on descriptive epidemiology and impact assessment. There has been limited exploration of the effectiveness of interventions to improve outcomes for people with multimorbidity. Objectives To determine the effectiveness of health-service or patient-oriented interventions designed to improve outcomes in people with multimorbidity in primary care and community settings. Multimorbidity was defined as two or more chronic conditions in the same individual. Search methods We searched MEDLINE, EMBASE, CINAHL and seven other databases to 28 September 2015. We also searched grey literature and consulted experts in the field for completed or ongoing studies. Selection criteria Two review authors independently screened and selected studies for inclusion. We considered randomised controlled trials (RCTs), non-randomised clinical trials (NRCTs), controlled before-after studies (CBAs), and interrupted time series analyses (ITS) evaluating interventions to improve outcomes for people with multimorbidity in primary care and community settings. Multimorbidity was defined as two or more chronic conditions in the same individual. This includes studies where participants can have combinations of any condition or have combinations of pre-specified common conditions (comorbidity), for example, hypertension and cardiovascular disease. The comparison was usual care as delivered in that setting. Data collection and analysis Two review authors independently extracted data from the included studies, evaluated study quality, and judged the certainty of the evidence using the GRADE approach. We conducted a meta-analysis of the results where possible and carried out a narrative synthesis for the remainder of the results. We present the results in a 'Summary of findings' table and tabular format to show effect sizes across all outcome types. Main results We identified 18 RCTs examining a range of complex interventions for people with multimorbidity. Nine studies focused on defined comorbid conditions with an emphasis on depression, diabetes and cardiovascular disease. The remaining studies focused on multimorbidity, generally in older people. In 12 studies, the predominant intervention element was a change to the organisation of care delivery, usually through case management or enhanced multidisciplinary team work. In six studies, the interventions were predominantly patient-oriented, for example, educational or self-management support-type interventions delivered directly to participants. Overall our confidence in the results regarding the effectiveness of interventions ranged from low to high certainty. There was little or no difference in clinical outcomes (based on moderate certainty evidence). 
Mental health outcomes improved (based on high certainty evidence) and there were modest reductions in mean depression scores for the comorbidity studies that targeted participants with depression (standardized mean difference (SMD) −2.23, 95% confidence interval (CI) −2.52 to −1.95). There was probably a small improvement in patient-reported outcomes (moderate certainty evidence) although two studies that specifically targeted functional difficulties in participants had positive effects on functional outcomes with one of these studies also reporting a reduction in mortality at four year follow-up (Int 6%, Con 13%, absolute difference 7%). The intervention may make little or no difference to health service use (low certainty evidence), may slightly improve medication adherence (low certainty evidence), probably slightly improves patient-related health behaviours (moderate certainty evidence), and probably improves provider behaviour in terms of prescribing behaviour and quality of care (moderate certainty evidence). Cost data were limited. Authors' conclusions This review identifies the emerging evidence to support policy for the management of people with multimorbidity and common comorbidities in primary care and community settings. There are remaining uncertainties about the effectiveness of interventions for people with multimorbidity in general due to the relatively small number of RCTs conducted in this area to date, with mixed findings overall. It is possible that the findings may change with the inclusion of large ongoing well-organised trials in future updates. The results suggest an improvement in health outcomes if interventions can be targeted at risk factors such as depression, or specific functional difficulties in people with multimorbidity.

Journal ArticleDOI
TL;DR: It is shown that non-bee insect pollinators play a significant role in global crop production and respond differently than bees to landscape structure, probably making their crop pollination services more robust to changes in land use.
Abstract: Wild and managed bees are well documented as effective pollinators of global crops of economic importance. However, the contributions by pollinators other than bees have been little explored despite their potential to contribute to crop production and stability in the face of environmental change. Non-bee pollinators include flies, beetles, moths, butterflies, wasps, ants, birds, and bats, among others. Here we focus on non-bee insects and synthesize 39 field studies from five continents that directly measured the crop pollination services provided by non-bees, honey bees, and other bees to compare the relative contributions of these taxa. Non-bees performed 25–50% of the total number of flower visits. Although non-bees were less effective pollinators than bees per flower visit, they made more visits; thus these two factors compensated for each other, resulting in pollination services rendered by non-bees that were similar to those provided by bees. In the subset of studies that measured fruit set, fruit set increased with non-bee insect visits independently of bee visitation rates, indicating that non-bee insects provide a unique benefit that is not provided by bees. We also show that non-bee insects are not as reliant as bees on the presence of remnant natural or seminatural habitat in the surrounding landscape. These results strongly suggest that non-bee insect pollinators play a significant role in global crop production and respond differently than bees to landscape structure, probably making their crop pollination services more robust to changes in land use. Non-bee insects provide a valuable service and provide potential insurance against bee population declines.

Journal ArticleDOI
TL;DR: The changing perception of inflammation and inflammatory mediators in normal liver homeostasis is explored and targeting of liver-specific immune regulation pathways as a therapeutic approach to treat liver disease is proposed.
Abstract: The human liver is usually perceived as a non-immunological organ engaged primarily in metabolic, nutrient storage and detoxification activities. However, we now know that the healthy liver is also a site of complex immunological activity mediated by a diverse immune cell repertoire as well as non-hematopoietic cell populations. In the non-diseased liver, metabolic and tissue remodeling functions require elements of inflammation. This inflammation, in combination with regular exposure to dietary and microbial products, creates the potential for excessive immune activation. In this complex microenvironment, the hepatic immune system tolerates harmless molecules while at the same time remaining alert to possible infectious agents, malignant cells or tissue damage. Upon appropriate immune activation to challenge by pathogens or tissue damage, mechanisms to resolve inflammation are essential to maintain liver homeostasis. Failure to clear ‘dangerous’ stimuli or regulate appropriately activated immune mechanisms leads to pathological inflammation and disrupted tissue homeostasis characterized by the progressive development of fibrosis, cirrhosis and eventual liver failure. Hepatic inflammatory mechanisms therefore have a spectrum of roles in the healthy adult liver; they are essential to maintain tissue and organ homeostasis and, when dysregulated, are key drivers of the liver pathology associated with chronic infection, autoimmunity and malignancy. In this review, we explore the changing perception of inflammation and inflammatory mediators in normal liver homeostasis and propose targeting of liver-specific immune regulation pathways as a therapeutic approach to treat liver disease.


Journal ArticleDOI
R. Adam1, Peter A. R. Ade2, Nabila Aghanim3, M. I. R. Alves4  +281 moreInstitutions (69)
TL;DR: In this paper, the authors consider the problem of diffuse astrophysical component separation, and process these maps within a Bayesian framework to derive an internally consistent set of full-sky astrophysical components maps.
Abstract: Planck has mapped the microwave sky in temperature over nine frequency bands between 30 and 857 GHz and in polarization over seven frequency bands between 30 and 353 GHz. In this paper we consider the problem of diffuse astrophysical component separation, and process these maps within a Bayesian framework to derive an internally consistent set of full-sky astrophysical component maps. Component separation dedicated to cosmic microwave background (CMB) reconstruction is described in a companion paper. For the temperature analysis, we combine the Planck observations with the 9-yr Wilkinson Microwave Anisotropy Probe (WMAP) sky maps and the Haslam et al. 408 MHz map, to derive a joint model of CMB, synchrotron, free-free, spinning dust, CO, line emission in the 94 and 100 GHz channels, and thermal dust emission. Full-sky maps are provided for each component, with an angular resolution varying between 7.5 arcmin and 1 deg. Global parameters (monopoles, dipoles, relative calibration, and bandpass errors) are fitted jointly with the sky model, and best-fit values are tabulated. For polarization, the model includes CMB, synchrotron, and thermal dust emission. These models provide excellent fits to the observed data, with rms temperature residuals smaller than 4 μK over 93% of the sky for all Planck frequencies up to 353 GHz, and fractional errors smaller than 1% in the remaining 7% of the sky. The main limitations of the temperature model at the lower frequencies are internal degeneracies among the spinning dust, free-free, and synchrotron components; additional observations from external low-frequency experiments will be essential to break these degeneracies. The main limitations of the temperature model at the higher frequencies are uncertainties in the 545 and 857 GHz calibration and zero-points. For polarization, the main outstanding issues are instrumental systematics in the 100–353 GHz bands on large angular scales in the form of temperature-to-polarization leakage, uncertainties in the analogue-to-digital conversion, and corrections for the very long time constant of the bolometer detectors, all of which are expected to improve in the near future.

Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, M. Ashdown4  +289 moreInstitutions (73)
TL;DR: The most significant measurement of the cosmic microwave background (CMB) lensing potential at a level of 40σ using temperature and polarization data from the Planck 2015 full-mission release was presented in this article.
Abstract: We present the most significant measurement of the cosmic microwave background (CMB) lensing potential to date (at a level of 40σ), using temperature and polarization data from the Planck 2015 full-mission release. Using a polarization-only estimator, we detect lensing at a significance of 5σ. We cross-check the accuracy of our measurement using the wide frequency coverage and complementarity of the temperature and polarization measurements. Public products based on this measurement include an estimate of the lensing potential over approximately 70% of the sky, an estimate of the lensing potential power spectrum in bandpowers for the multipole range 40 ≤ L ≤ 400, and an associated likelihood for cosmological parameter constraints. We find good agreement between our measurement of the lensing potential power spectrum and that found in the ΛCDM model that best fits the Planck temperature and polarization power spectra. Using the lensing likelihood alone we obtain a percent-level measurement of the parameter combination σ8 Ωm^0.25 = 0.591 ± 0.021. We combine our determination of the lensing potential with the E-mode polarization, also measured by Planck, to generate an estimate of the lensing B-mode. We show that this lensing B-mode estimate is correlated with the B-modes observed directly by Planck at the expected level and with a statistical significance of 10σ, confirming Planck’s sensitivity to this known sky signal. We also correlate our lensing potential estimate with the large-scale temperature anisotropies, detecting a cross-correlation at the 3σ level, as expected because of dark energy in the concordance ΛCDM model.

Journal ArticleDOI
21 Jan 2016-Nature
TL;DR: It is found that an integrative model has substantially higher explanatory power than traditional bivariate analyses and several surprising findings that conflict with classical models are revealed.
Abstract: How ecosystem productivity and species richness are interrelated is one of the most debated subjects in the history of ecology. Decades of intensive study have yet to discern the actual mechanisms behind observed global patterns. Here, by integrating the predictions from multiple theories into a single model and using data from 1,126 grassland plots spanning five continents, we detect the clear signals of numerous underlying mechanisms linking productivity and richness. We find that an integrative model has substantially higher explanatory power than traditional bivariate analyses. In addition, the specific results unveil several surprising findings that conflict with classical models. These include the isolation of a strong and consistent enhancement of productivity by richness, an effect in striking contrast with superficial data patterns. Also revealed is a consistent importance of competition across the full range of productivity values, in direct conflict with some (but not all) proposed models. The promotion of local richness by macroecological gradients in climatic favourability, generally seen as a competing hypothesis, is also found to be important in our analysis. The results demonstrate that an integrative modelling approach leads to a major advance in our ability to discern the underlying processes operating in ecological systems.

Journal ArticleDOI
TL;DR: Evidence of ALS being a complex genetic trait with a polygenic architecture is established and the SNP-based heritability is estimated at 8.5%, with a distinct and important role for low-frequency variants (frequency 1–10%).
Abstract: To elucidate the genetic architecture of amyotrophic lateral sclerosis (ALS) and find associated loci, we assembled a custom imputation reference panel from whole-genome-sequenced patients with ALS and matched controls (n = 1,861). Through imputation and mixed-model association analysis in 12,577 cases and 23,475 controls, combined with 2,579 cases and 2,767 controls in an independent replication cohort, we fine-mapped a new risk locus on chromosome 21 and identified C21orf2 as a gene associated with ALS risk. In addition, we identified MOBP and SCFD1 as new associated risk loci. We established evidence of ALS being a complex genetic trait with a polygenic architecture. Furthermore, we estimated the SNP-based heritability at 8.5%, with a distinct and important role for low-frequency variants (frequency 1-10%). This study motivates the interrogation of larger samples with full genome coverage to identify rare causal variants that underpin ALS risk.

Journal ArticleDOI
TL;DR: A new open-source toolbox is introduced for deriving temporal response functions that describe a mapping between stimulus and response in both directions; the importance of regularizing the analysis is explained, along with how this regularization can be optimized for a particular dataset.
Abstract: Understanding how brains process sensory signals in natural environments is one of the key goals of 21st century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter – often referred to as a temporal response function – that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application.
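The toolbox itself is implemented in MATLAB, so the following is only a Python/numpy sketch of the underlying technique: build a time-lagged design matrix and solve the closed-form ridge (regularized linear) regression for a forward TRF. All signals and parameter values here are illustrative:

```python
import numpy as np

def trf_ridge(stim, resp, fs, tmin, tmax, lam):
    """stim: (T,) stimulus feature; resp: (T, n_chan) neural data;
    fs: sampling rate in Hz; tmin/tmax: lag window in seconds;
    lam: ridge parameter, typically tuned by cross-validation."""
    lags = np.arange(int(round(tmin * fs)), int(round(tmax * fs)) + 1)
    T = len(stim)
    X = np.zeros((T, lags.size))
    for j, L in enumerate(lags):               # build the lagged design matrix
        if L >= 0:
            X[L:, j] = stim[:T - L]
        else:
            X[:T + L, j] = stim[-L:]
    # closed-form ridge solution: w = (X'X + lam*I)^(-1) X'Y
    w = np.linalg.solve(X.T @ X + lam * np.eye(lags.size), X.T @ resp)
    return lags / fs, w

fs = 128.0
stim = np.random.randn(int(60 * fs))           # 60 s of a stimulus envelope
resp = np.random.randn(int(60 * fs), 4)        # 4 hypothetical channels
times, w = trf_ridge(stim, resp, fs, -0.1, 0.4, lam=1e2)
print(times.shape, w.shape)                    # (65,), (65, 4)
```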

Journal ArticleDOI
R. Adam1, Peter A. R. Ade2, Nabila Aghanim3, Monique Arnaud4  +298 moreInstitutions (69)
TL;DR: In this article, the authors exploit the uniqueness of the Planck HFI polarization data from 100 to 353 GHz to measure the polarized dust angular power spectra C_l^(EE) and C_l^(BB) over the multipole range 40 < l < 600.
Abstract: The polarized thermal emission from diffuse Galactic dust is the main foreground present in measurements of the polarization of the cosmic microwave background (CMB) at frequencies above 100 GHz. In this paper we exploit the uniqueness of the Planck HFI polarization data from 100 to 353 GHz to measure the polarized dust angular power spectra C_l^(EE) and C_l^(BB) over the multipole range 40 < l < 600.

Journal ArticleDOI
TL;DR: a Medical Oncology Department, Hospital Universitario Doce de Octubre, Madrid, Spain; b Department of Oncology, Haukeland University Hospital, Bergen, Norway; c Institut Gustave Roussy, Villejuif, and d Oncologie Médicale, Hôpitaux Universitaires Paris Nord Val de Seine, Paris, France.
Abstract: a Medical Oncology Department, Hospital Universitario Doce de Octubre, Madrid, Spain; b Department of Oncology, Haukeland University Hospital, Bergen, Norway; c Institut Gustave Roussy, Villejuif, and d Oncologie Médicale, Hôpitaux Universitaires Paris Nord Val de Seine, Paris, France; e Department of Hepatology and Gastroenterology, Campus Virchow Klinikum, Charité Universitätsmedizin Berlin, Berlin, Germany; f Department of Surgery, Medical University of Vienna, Vienna, Austria; g Department of Oncology, First Faculty of Medicine and General Teaching Hospital, Prague, Czech Republic; h Neuroendocrine Tumour Unit, Royal Free Hospital, London, UK; i Institut für Pathologie und Zytologie, St. Vincenz Krankenhaus, Limburg, Germany; j Department of Radiology, Faculty of Medical Sciences, University of Warmia and Mazury, Olsztyn, Poland; k Neuroendocrine Tumour Unit, Royal Free Hospital, London, UK; l NET Centre, St. Vincent’s University and Department of Clinical Medicine, St. James Hospital and Trinity College, Dublin, Ireland; m Institute of Pathology, University of Bern, Bern, Switzerland

Journal ArticleDOI
TL;DR: The results suggest that boosting the metabolic activity of antitumor lymphocytes could be an effective strategy to promote immune-mediated tumor suppression and provide genetic, pharmacologic, and biochemical evidence that the kinase mTOR is a crucial signaling integrator of pro- and anti-inflammatory cytokines in NK cells.
Abstract: Transforming growth factor-β (TGF-β) is a major immunosuppressive cytokine that maintains immune homeostasis and prevents autoimmunity through its antiproliferative and anti-inflammatory properties in various immune cell types. We provide genetic, pharmacologic, and biochemical evidence that a critical target of TGF-β signaling in mouse and human natural killer (NK) cells is the serine and threonine kinase mTOR (mammalian target of rapamycin). Treatment of mouse or human NK cells with TGF-β in vitro blocked interleukin-15 (IL-15)-induced activation of mTOR. TGF-β and the mTOR inhibitor rapamycin both reduced the metabolic activity and proliferation of NK cells and reduced the abundances of various NK cell receptors and the cytotoxic activity of NK cells. In vivo, constitutive TGF-β signaling or depletion of mTOR arrested NK cell development, whereas deletion of the TGF-β receptor subunit TGF-βRII enhanced mTOR activity and the cytotoxic activity of the NK cells in response to IL-15. Suppression of TGF-β signaling in NK cells did not affect either NK cell development or homeostasis; however, it enhanced the ability of NK cells to limit metastases in two different tumor models in mice. Together, these results suggest that the kinase mTOR is a crucial signaling integrator of pro- and anti-inflammatory cytokines in NK cells. Moreover, we propose that boosting the metabolic activity of antitumor lymphocytes could be an effective strategy to promote immune-mediated tumor suppression.

Journal ArticleDOI
TL;DR: This work demonstrates robust zero-field SOT switching of a perpendicular CoFe free layer where the symmetry is broken by magnetic coupling to a second in-plane exchange-biased CoFe layer via a nonmagnetic Ru or Pt spacer.
Abstract: A new approach to magnetic switching by spin–orbit torque uses interlayer exchange coupling to overcome the need for an external magnetic field. Manipulation of the magnetization of a perpendicular ferromagnetic free layer by spin–orbit torque (SOT) is an attractive alternative to spin-transfer torque (STT) in oscillators and switches such as magnetic random-access memory (MRAM), where a high current is passed across an ultrathin tunnel barrier. A small symmetry-breaking bias field is usually needed for deterministic SOT switching, but it is impractical to generate the field externally for spintronic applications. Here, we demonstrate robust zero-field SOT switching of a perpendicular CoFe free layer where the symmetry is broken by magnetic coupling to a second in-plane exchange-biased CoFe layer via a nonmagnetic Ru or Pt spacer. The preferred magnetic state of the free layer is determined by the current polarity and the sign of the interlayer exchange coupling (IEC). Our strategy offers a potentially scalable solution to realize bias-field-free switching that can lead to a generation of SOT devices, combining a high storage density and endurance with a low power consumption.

Journal ArticleDOI
TL;DR: Oropharyngeal dysphagia should be given more importance and attention and thus be included in all standard screening protocols, treated, and regularly monitored to prevent its main complications.
Abstract: This position document has been developed by the Dysphagia Working Group, a committee of members from the European Society for Swallowing Disorders and the European Union Geriatric Medicine Society, and invited experts. It consists of 12 sections that cover all aspects of clinical management of oropharyngeal dysphagia (OD) related to geriatric medicine and discusses prevalence, quality of life, and legal and ethical issues, as well as health economics and social burden. OD constitutes impaired or uncomfortable transit of food or liquids from the oral cavity to the esophagus, and it is included in the World Health Organization’s classification of diseases. It can cause severe complications such as malnutrition, dehydration, respiratory infections, aspiration pneumonia, and increased readmissions, institutionalization, and morbimortality. OD is a prevalent and serious problem among all phenotypes of older patients as oropharyngeal swallow response is impaired in older people and can cause aspiration. Despite its prevalence and severity, OD is still underdiagnosed and untreated in many medical centers. There are several validated clinical and instrumental methods (videofluoroscopy and fiberoptic endoscopic evaluation of swallowing) to diagnose OD, and treatment is mainly based on compensatory measures, although new treatments to stimulate the oropharyngeal swallow response are under research. OD matches the definition of a geriatric syndrome as it is highly prevalent among older people, is caused by multiple factors, is associated with several comorbidities and poor prognosis, and needs a multidimensional approach to be treated. OD should be given more importance and attention and thus be included in all standard screening protocols, treated, and regularly monitored to prevent its main complications. More research is needed to develop and standardize new treatments and management protocols for older patients with OD, which is a challenging mission for our societies.