
Showing papers by "Federal University of Rio de Janeiro published in 2016"


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4 +2519 more · Institutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
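The flux-versus-steady-state distinction stressed above is often operationalized with an LC3-II turnover assay: LC3-II levels are compared with and without a lysosomal inhibitor such as bafilomycin A1, and the difference estimates how much LC3-II was actually degraded. A minimal sketch of that arithmetic (all densitometry values are hypothetical):

```python
def lc3_net_flux(lc3ii_untreated, lc3ii_inhibited):
    """Net LC3-II flux: the LC3-II that would have been degraded in
    lysosomes, estimated as (with inhibitor) - (without inhibitor)."""
    return lc3ii_inhibited - lc3ii_untreated

# Hypothetical densitometry values (LC3-II normalized to a loading control).
basal = lc3_net_flux(1.0, 2.5)    # substantial turnover
blocked = lc3_net_flux(3.0, 3.1)  # autophagosomes accumulate, ~no turnover

# More autophagosomes at steady state (3.0 > 1.0) yet lower flux:
# a trafficking block, not increased autophagy.
print(basal, blocked)
```

This is exactly why the guidelines warn that more autophagosomes do not equate with more autophagy: only the difference against the inhibited condition reports on flux.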

5,187 citations


Journal ArticleDOI
30 Jun 2016-Nature
TL;DR: Substantial enhancement or over-delivery on current INDCs by additional national, sub-national and non-state actions is required to maintain a reasonable chance of meeting the target of keeping warming well below 2 degrees Celsius.
Abstract: The Paris climate agreement aims at holding global warming to well below 2 degrees Celsius and to "pursue efforts" to limit it to 1.5 degrees Celsius. To accomplish this, countries have submitted Intended Nationally Determined Contributions (INDCs) outlining their post-2020 climate action. Here we assess the effect of current INDCs on reducing aggregate greenhouse gas emissions, its implications for achieving the temperature objective of the Paris climate agreement, and potential options for overachievement. The INDCs collectively lower greenhouse gas emissions compared to where current policies stand, but still imply a median warming of 2.6-3.1 degrees Celsius by 2100. More can be achieved, because the agreement stipulates that targets for reducing greenhouse gas emissions are strengthened over time, both in ambition and scope. Substantial enhancement or over-delivery on current INDCs by additional national, sub-national and non-state actions is required to maintain a reasonable chance of meeting the target of keeping warming well below 2 degrees Celsius.

2,333 citations


Journal ArticleDOI
TL;DR: The Zika virus genome was detected in amniotic fluid samples of two pregnant women in Brazil whose fetuses were diagnosed with microcephaly and results suggest that the virus can cross the placental barrier.
Abstract: Summary Background The incidence of microcephaly in Brazil in 2015 was 20 times higher than in previous years. Congenital microcephaly is associated with genetic factors and several causative agents. Epidemiological data suggest that microcephaly cases in Brazil might be associated with the introduction of Zika virus. We aimed to detect and sequence the Zika virus genome in amniotic fluid samples of two pregnant women in Brazil whose fetuses were diagnosed with microcephaly. Methods In this case study, amniotic fluid samples from two pregnant women from the state of Paraiba in Brazil whose fetuses had been diagnosed with microcephaly were obtained, on the recommendation of the Brazilian health authorities, by ultrasound-guided transabdominal amniocentesis at 28 weeks' gestation. The women had presented at 18 weeks' and 10 weeks' gestation, respectively, with clinical manifestations that could have been symptoms of Zika virus infection, including fever, myalgia, and rash. After the amniotic fluid samples were centrifuged, DNA and RNA were extracted from the purified virus particles before the viral genome was identified by quantitative reverse transcription PCR and viral metagenomic next-generation sequencing. Phylogenetic reconstruction and investigation of recombination events were done by comparing the Brazilian Zika virus genome with sequences from other Zika strains and from flaviviruses that occur in similar regions in Brazil. Findings We detected the Zika virus genome in the amniotic fluid of both pregnant women. The virus was not detected in their urine or serum. Tests for dengue virus, chikungunya virus, Toxoplasma gondii , rubella virus, cytomegalovirus, herpes simplex virus, HIV, Treponema pallidum , and parvovirus B19 were all negative. 
After sequencing of the complete genome of the Brazilian Zika virus isolated from patient 1, phylogenetic analyses showed that the virus shares 97–100% of its genomic identity with lineages isolated during an outbreak in French Polynesia in 2013, and that in both envelope and NS5 genomic regions, it clustered with sequences from North and South America, southeast Asia, and the Pacific. After assessing the possibility of recombination events between the Zika virus and other flaviviruses, we ruled out the hypothesis that the Brazilian Zika virus genome is a recombinant strain with other mosquito-borne flaviviruses. Interpretation These findings strengthen the putative association between Zika virus and cases of microcephaly in neonates in Brazil. Moreover, our results suggest that the virus can cross the placental barrier. As a result, Zika virus should be considered as a potential infectious agent for human fetuses. Pathogenesis studies that confirm the tropism of Zika virus for neuronal cells are warranted. Funding Conselho Nacional de Desenvolvimento e Pesquisa (CNPq), Fundação de Amparo à Pesquisa do Estado do Rio de Janeiro (FAPERJ).

1,004 citations


Journal ArticleDOI
TL;DR: Current knowledge of the classic immune components is reviewed and the concept of IBD immunopathogenesis is expanded to include various cells, mediators and pathways that have not been traditionally associated with disease mechanisms, but that profoundly affect the overall intestinal inflammatory process.
Abstract: IBD is a chronic inflammatory condition of the gastrointestinal tract encompassing two main clinical entities: Crohn's disease and ulcerative colitis. Although Crohn's disease and ulcerative colitis have historically been studied together because they share common features (such as symptoms, structural damage and therapy), it is now clear that they represent two distinct pathophysiological entities. Both Crohn's disease and ulcerative colitis are associated with multiple pathogenic factors including environmental changes, an array of susceptibility gene variants, a qualitatively and quantitatively abnormal gut microbiota and a broadly dysregulated immune response. In spite of this realization and the identification of seemingly pertinent environmental, genetic, microbial and immune factors, a full understanding of IBD pathogenesis is still out of reach and, consequently, treatment is far from optimal. An important reason for this unsatisfactory situation is the currently limited comprehension of what are the truly relevant components of IBD immunopathogenesis. This article will comprehensively review current knowledge of the classic immune components and will expand the concept of IBD immunopathogenesis to include various cells, mediators and pathways that have not been traditionally associated with disease mechanisms, but that profoundly affect the overall intestinal inflammatory process.

999 citations


Journal ArticleDOI
13 May 2016-Science
TL;DR: Results suggest that ZIKV abrogates neurogenesis during human brain development when it targets human brain cells, reducing their viability and growth as neurospheres and brain organoids.
Abstract: Since the emergence of Zika virus (ZIKV), reports of microcephaly have increased considerably in Brazil; however, causality between the viral epidemic and malformations in fetal brains needs further confirmation. We examined the effects of ZIKV infection in human neural stem cells growing as neurospheres and brain organoids. Using immunocytochemistry and electron microscopy, we showed that ZIKV targets human brain cells, reducing their viability and growth as neurospheres and brain organoids. These results suggest that ZIKV abrogates neurogenesis during human brain development.

975 citations


Journal ArticleDOI
TL;DR: A survey of original evidence shows that histological data always supported a roughly 1:1 ratio of glia to neurons in the entire human brain and a range of 40–130 billion glial cells; the current status of knowledge about cell numbers is also reviewed.
Abstract: For half a century, the human brain was believed to contain about 100 billion neurons and one trillion glial cells, with a glia:neuron ratio of 10:1. A new counting method, the isotropic fractionator, has challenged the notion that glia outnumber neurons and revived a question that was widely thought to have been resolved. The recently validated isotropic fractionator demonstrates a glia:neuron ratio of less than 1:1 and a total number of less than 100 billion glial cells in the human brain. A survey of original evidence shows that histological data always supported a 1:1 ratio of glia to neurons in the entire human brain, and a range of 40-130 billion glial cells. We review how the claim of one trillion glial cells originated, was perpetuated, and eventually refuted. We compile how numbers of neurons and glial cells in the adult human brain were reported and we examine the reasons for an erroneous consensus about the relative abundance of glial cells in human brains that persisted for half a century. Our review includes a brief history of cell counting in human brains, types of counting methods that were and are employed, ranges of previous estimates, and the current status of knowledge about the number of cells. We also discuss implications and consequences of the new insights into true numbers of glial cells in the human brain, and the promise and potential impact of the newly validated isotropic fractionator for reliable quantification of glia and neurons in neurological and psychiatric diseases. J. Comp. Neurol. 524:3865-3895, 2016. © 2016 Wiley Periodicals, Inc.

683 citations


Journal ArticleDOI
TL;DR: Selected capabilities of HYDRUS implemented since 2008 are reviewed; new standard and nonstandard specialized add-on modules have significantly expanded the capabilities of the software.
Abstract: The HYDRUS-1D and HYDRUS (2D/3D) computer software packages are widely used finite-element models for simulating the one- and two- or three-dimensional movement of water, heat, and multiple solutes in variably saturated media, respectively. In 2008, Simůnek et al. (2008b) described the entire history of the development of the various HYDRUS programs and related models and tools such as STANMOD, RETC, ROSETTA, UNSODA, UNSATCHEM, HP1, and others. The objective of this manuscript is to review selected capabilities of HYDRUS that have been implemented since 2008. Our review is not limited to listing additional processes that were implemented in the standard computational modules, but also describes many new standard and nonstandard specialized add-on modules that significantly expanded the capabilities of the two software packages. We also review additional capabilities that have been incorporated into the graphical user interface (GUI) that supports the use of HYDRUS (2D/3D). Another objective of this manuscript is to review selected applications of the HYDRUS models such as evaluation of various irrigation schemes, evaluation of the effects of plant water uptake on groundwater recharge, assessing the transport of particle-like substances in the subsurface, and using the models in conjunction with various geophysical methods.
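HYDRUS describes variably saturated media through standard soil hydraulic functions; one of the most common, the van Genuchten retention curve, relates pressure head to water content. A minimal sketch (the loam-like parameter values are illustrative assumptions, not taken from the manuscript):

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta(h) from the van Genuchten retention
    model, one of the standard soil hydraulic functions used in HYDRUS.
    h: pressure head (negative when unsaturated), alpha in 1/cm."""
    if h >= 0:
        return theta_s  # saturated
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * abs(h)) ** n) ** m

# Illustrative loam-like parameters (assumed values for demonstration).
theta = van_genuchten_theta(-100.0, 0.078, 0.43, 0.036, 1.56)
print(round(theta, 3))
```

Drier soil (more negative h) yields lower water content, which the model captures monotonically between the residual and saturated contents.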

661 citations


Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, Ovsat Abdinov4, Baptiste Abeloos5, Rosemarie Aben6, Ossama AbouZeid7, N. L. Abraham8, Halina Abramowicz9, Henso Abreu10, Ricardo Abreu11, Yiming Abulaiti12, Bobby Samir Acharya13, Bobby Samir Acharya14, Leszek Adamczyk15, David H. Adams16, Jahred Adelman17, Stefanie Adomeit18, Tim Adye19, A. A. Affolder20, Tatjana Agatonovic-Jovin21, Johannes Agricola22, Juan Antonio Aguilar-Saavedra23, Steven Ahlen24, Faig Ahmadov25, Faig Ahmadov4, Giulio Aielli26, Henrik Akerstedt12, T. P. A. Åkesson27, Andrei Akimov, Gian Luigi Alberghi28, Justin Albert29, S. Albrand30, M. J. Alconada Verzini31, Martin Aleksa32, Igor Aleksandrov25, Calin Alexa, Gideon Alexander9, Theodoros Alexopoulos33, Muhammad Alhroob2, Malik Aliev34, Gianluca Alimonti, John Alison35, Steven Patrick Alkire36, Bmm Allbrooke8, Benjamin William Allen11, Phillip Allport37, Alberto Aloisio38, Alejandro Alonso39, Francisco Alonso31, Cristiano Alpigiani40, Mahmoud Alstaty1, B. Alvarez Gonzalez32, D. Álvarez Piqueras41, Mariagrazia Alviggi38, Brian Thomas Amadio42, K. Amako, Y. Amaral Coutinho43, Christoph Amelung44, D. Amidei45, S. P. Amor Dos Santos46, António Amorim47, Simone Amoroso32, Glenn Amundsen44, Christos Anastopoulos48, Lucian Stefan Ancu49, Nansi Andari17, Timothy Andeen50, Christoph Falk Anders51, G. Anders32, John Kenneth Anders20, Kelby Anderson35, Attilio Andreazza52, Andrei51, Stylianos Angelidakis53, Ivan Angelozzi6, Philipp Anger54, Aaron Angerami36, Francis Anghinolfi32, Alexey Anisenkov55, Nuno Anjos56 
Aix-Marseille University1, University of Oklahoma2, University of Iowa3, Azerbaijan National Academy of Sciences4, Université Paris-Saclay5, University of Amsterdam6, University of California, Santa Cruz7, University of Sussex8, Tel Aviv University9, Technion – Israel Institute of Technology10, University of Oregon11, Stockholm University12, King's College London13, International Centre for Theoretical Physics14, AGH University of Science and Technology15, Brookhaven National Laboratory16, Northern Illinois University17, Ludwig Maximilian University of Munich18, Rutherford Appleton Laboratory19, University of Liverpool20, University of Belgrade21, University of Göttingen22, University of Granada23, Boston University24, Joint Institute for Nuclear Research25, University of Rome Tor Vergata26, Lund University27, University of Bologna28, University of Victoria29, University of Grenoble30, National University of La Plata31, CERN32, National Technical University of Athens33, University of Salento34, University of Chicago35, Columbia University36, University of Birmingham37, University of Naples Federico II38, University of Copenhagen39, University of Washington40, University of Valencia41, Lawrence Berkeley National Laboratory42, Federal University of Rio de Janeiro43, Brandeis University44, University of Michigan45, University of Coimbra46, University of Lisbon47, University of Sheffield48, University of Geneva49, University of Texas at Austin50, Heidelberg University51, University of Milan52, National and Kapodistrian University of Athens53, Dresden University of Technology54, Novosibirsk State University55, IFAE56
TL;DR: In this article, combined ATLAS and CMS measurements of the Higgs boson production and decay rates, as well as constraints on its couplings to vector bosons and fermions, are presented.
Abstract: Combined ATLAS and CMS measurements of the Higgs boson production and decay rates, as well as constraints on its couplings to vector bosons and fermions, are presented. The combination is based on the analysis of five production processes, namely gluon fusion, vector boson fusion, and associated production with a $W$ or a $Z$ boson or a pair of top quarks, and of the six decay modes $H \to ZZ, WW$, $\gamma\gamma, \tau\tau, bb$, and $\mu\mu$. All results are reported assuming a value of 125.09 GeV for the Higgs boson mass, the result of the combined measurement by the ATLAS and CMS experiments. The analysis uses the CERN LHC proton--proton collision data recorded by the ATLAS and CMS experiments in 2011 and 2012, corresponding to integrated luminosities per experiment of approximately 5 fb$^{-1}$ at $\sqrt{s}=7$ TeV and 20 fb$^{-1}$ at $\sqrt{s} = 8$ TeV. The Higgs boson production and decay rates measured by the two experiments are combined within the context of three generic parameterisations: two based on cross sections and branching fractions, and one on ratios of coupling modifiers. Several interpretations of the measurements with more model-dependent parameterisations are also given. The combined signal yield relative to the Standard Model prediction is measured to be 1.09 $\pm$ 0.11. The combined measurements lead to observed significances for the vector boson fusion production process and for the $H \to \tau\tau$ decay of $5.4$ and $5.5$ standard deviations, respectively. The data are consistent with the Standard Model predictions for all parameterisations considered.
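The actual ATLAS+CMS result comes from a joint likelihood fit with correlated systematic uncertainties, but the basic idea of combining compatible measurements can be illustrated with an inverse-variance weighted average (the per-experiment signal strengths below are hypothetical, chosen only for illustration):

```python
def combine(measurements):
    """Inverse-variance weighted average of (value, sigma) pairs,
    valid for independent Gaussian measurements."""
    weights = [1.0 / s ** 2 for _, s in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return value, sigma

# Hypothetical per-experiment signal strengths (illustration only; the real
# combination is a joint fit with correlated systematics).
mu, err = combine([(1.18, 0.15), (1.00, 0.14)])
print(round(mu, 2), round(err, 2))
```

The more precise input dominates the average, and the combined uncertainty is smaller than either individual one.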

618 citations


Journal ArticleDOI
29 Sep 2016-Nature
TL;DR: A global map of abundant, double-stranded DNA viruses complete with genomic and ecological contexts is presented, providing a necessary foundation for the meaningful integration of viruses into ecosystem models where they act as key players in nutrient cycling and trophic networks.
Abstract: Ocean microbes drive biogeochemical cycling on a global scale. However, this cycling is constrained by viruses that affect community composition, metabolic activity, and evolutionary trajectories. Owing to challenges with the sampling and cultivation of viruses, genome-level viral diversity remains poorly described and grossly understudied, with less than 1% of observed surface-ocean viruses known. Here we assemble complete genomes and large genomic fragments from both surface- and deep-ocean viruses sampled during the Tara Oceans and Malaspina research expeditions, and analyse the resulting 'global ocean virome' dataset to present a global map of abundant, double-stranded DNA viruses complete with genomic and ecological contexts. A total of 15,222 epipelagic and mesopelagic viral populations were identified, comprising 867 viral clusters (defined as approximately genus-level groups). This roughly triples the number of known ocean viral populations and doubles the number of candidate bacterial and archaeal virus genera, providing a near-complete sampling of epipelagic communities at both the population and viral-cluster level. We found that 38 of the 867 viral clusters were locally or globally abundant, together accounting for nearly half of the viral populations in any global ocean virome sample. While two-thirds of these clusters represent newly described viruses lacking any cultivated representative, most could be computationally linked to dominant, ecologically relevant microbial hosts. Moreover, we identified 243 viral-encoded auxiliary metabolic genes, of which only 95 were previously known. Deeper analyses of four of these auxiliary metabolic genes (dsrC, soxYZ, P-II (also known as glnB) and amoC) revealed that abundant viruses may directly manipulate sulfur and nitrogen cycling throughout the epipelagic ocean. 
This viral catalog and functional analyses provide a necessary foundation for the meaningful integration of viruses into ecosystem models where they act as key players in nutrient cycling and trophic networks.

557 citations


Journal ArticleDOI
TL;DR: In this article, the authors synthesize reservoir CH4, CO2, and N2O emission data with three main objectives: (1) to generate a global estimate of GHG emissions from reservoirs, (2) to identify the best predictors of these emissions, and (3) to consider the effect of methodology on emission estimates.
Abstract: Collectively, reservoirs created by dams are thought to be an important source of greenhouse gases (GHGs) to the atmosphere. So far, efforts to quantify, model, and manage these emissions have been limited by data availability and inconsistencies in methodological approach. Here, we synthesize reservoir CH4, CO2, and N2O emission data with three main objectives: (1) to generate a global estimate of GHG emissions from reservoirs, (2) to identify the best predictors of these emissions, and (3) to consider the effect of methodology on emission estimates. We estimate that GHG emissions from reservoir water surfaces account for 0.8 (0.5-1.2) Pg CO2 equivalents per year, with the majority of this forcing due to CH4. We then discuss the potential for several alternative pathways such as dam degassing and downstream emissions to contribute significantly to overall emissions. Although prior studies have linked reservoir GHG emissions to reservoir age and latitude, we find that factors related to reservoir productivity are better predictors of emission.
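Aggregating CH4, CO2, and N2O fluxes into a single "CO2 equivalents" figure relies on global warming potentials. A minimal sketch with assumed GWPs and hypothetical fluxes (actual GWP values depend on the IPCC assessment and time horizon used, and the fluxes below are not from the paper):

```python
# Assumed 100-year GWPs (illustrative; values vary by IPCC report/horizon).
GWP = {"CO2": 1.0, "CH4": 34.0, "N2O": 298.0}

def co2_equivalents(emissions_tg):
    """Convert per-gas emissions (Tg of gas) to total Tg CO2-equivalents."""
    return sum(GWP[gas] * mass for gas, mass in emissions_tg.items())

# Hypothetical reservoir fluxes in Tg/yr (illustration only).
total = co2_equivalents({"CO2": 50.0, "CH4": 15.0, "N2O": 0.03})
print(total)
```

Even a modest CH4 mass dominates the total once weighted by its GWP, which is why the paper attributes the majority of the forcing to CH4.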

515 citations


Journal ArticleDOI
TL;DR: It is demonstrated that expression and activity of the NADase CD38 increase with aging and that CD38 is required for the age-related NAD decline and mitochondrial dysfunction, via a pathway mediated at least in part by regulation of SIRT3 activity.

Journal ArticleDOI
TL;DR: The biology of phosphatidylserine is discussed with respect to its role as a global immunosuppressive signal and how PS is exploited to drive diverse pathological processes such as infection and cancer.
Abstract: Apoptosis is an evolutionarily conserved and tightly regulated cell death modality. It serves important roles in physiology by sculpting complex tissues during embryogenesis and by removing effete cells that have reached advanced age or whose genomes have been irreparably damaged. Apoptosis culminates in the rapid and decisive removal of cell corpses by efferocytosis, a term used to distinguish the engulfment of apoptotic cells from other phagocytic processes. Over the past decades, the molecular and cell biological events associated with efferocytosis have been rigorously studied, and many eat-me signals and receptors have been identified. The externalization of phosphatidylserine (PS) is arguably the most emblematic eat-me signal that is in turn bound by a large number of serum proteins and opsonins that facilitate efferocytosis. Under physiological conditions, externalized PS functions as a dominant and evolutionarily conserved immunosuppressive signal that promotes tolerance and prevents local and systemic immune activation. Pathologically, the innate immunosuppressive effect of externalized PS has been hijacked by numerous viruses, microorganisms, and parasites to facilitate infection, and in many cases, establish infection latency. PS is also profoundly dysregulated in the tumor microenvironment and antagonizes the development of tumor immunity. In this review, we discuss the biology of PS with respect to its role as a global immunosuppressive signal and how PS is exploited to drive diverse pathological processes such as infection and cancer. Finally, we outline the rationale that agents targeting PS could have significant value in cancer and infectious disease therapeutics.

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, Ovsat Abdinov4 +2828 more · Institutions (191)
TL;DR: In this article, the performance of the ATLAS muon identification and reconstruction is evaluated using the first LHC dataset recorded at √s = 13 TeV in 2015 and compared to Monte Carlo simulations.
Abstract: This article documents the performance of the ATLAS muon identification and reconstruction using the first LHC dataset recorded at √s = 13 TeV in 2015. Using a large sample of J/ψ→μμ and Z→μμ decays from 3.2 fb−1 of pp collision data, measurements of the reconstruction efficiency, as well as of the momentum scale and resolution, are presented and compared to Monte Carlo simulations. The reconstruction efficiency is measured to be close to 99% over most of the covered phase space (|η| < 2.5 and 5 < pT < 100 GeV). For muons with |η| > 2.2, the pT resolution for muons from Z→μμ decays is 2.9% while the precision of the momentum scale for low-pT muons from J/ψ→μμ decays is about 0.2%.
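Efficiencies of this kind are typically extracted with a tag-and-probe method on Z→μμ and J/ψ→μμ decays: one well-identified "tag" muon selects the event, and the fraction of "probe" tracks that are successfully reconstructed gives the efficiency. A highly simplified sketch (hypothetical counts, ignoring the background subtraction and corrections used in the real analysis):

```python
def tag_and_probe_efficiency(n_passing, n_total):
    """Efficiency from probe counts, with a simple binomial uncertainty
    (the real measurement also subtracts background and applies
    simulation-based corrections)."""
    eff = n_passing / n_total
    err = (eff * (1.0 - eff) / n_total) ** 0.5
    return eff, err

# Hypothetical probe counts.
eff, err = tag_and_probe_efficiency(98_900, 100_000)
print(f"{eff:.3f} +/- {err:.4f}")
```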

Journal ArticleDOI
TL;DR: The figures suggest that the conservation status of South American freshwater fish faunas is better than in most other regions of the world, but the marine fishes are as threatened as elsewhere.
Abstract: The freshwater and marine fish faunas of South America are the most diverse on Earth, with current species richness estimates standing above 9100 species. In addition, over the last decade at least 100 species were described every year. There are currently about 5160 freshwater fish species, and the estimate for the freshwater fish fauna alone points to a final diversity between 8000 and 9000 species. South America also has c. 4000 species of marine fishes. The mega-diverse fish faunas of South America evolved over a period of >100 million years, with most lineages tracing origins to Gondwana and the adjacent Tethys Sea. This high diversity was in part maintained by escaping the mass extinctions and biotic turnovers associated with Cenozoic climate cooling, the formation of boreal and temperate zones at high latitudes and aridification in many places at equatorial latitudes. The fresh waters of the continent are divided into 13 basin complexes, large basins consolidated as a single unit plus historically connected adjacent coastal drainages, and smaller coastal basins grouped together on the basis of biogeographic criteria. Species diversity, endemism, noteworthy groups and state of knowledge of each basin complex are described. Marine habitats around South America, both coastal and oceanic, are also described in terms of fish diversity, endemism and state of knowledge. Because of extensive land use changes, hydroelectric damming, water divergence for irrigation, urbanization, sedimentation and overfishing 4-10% of all fish species in South America face some degree of extinction risk, mainly due to habitat loss and degradation. These figures suggest that the conservation status of South American freshwater fish faunas is better than in most other regions of the world, but the marine fishes are as threatened as elsewhere. 
Conserving the remarkable aquatic habitats and fishes of South America is a growing challenge in face of the rapid anthropogenic changes of the 21st century, and deserves attention from conservationists and policy makers.

Journal ArticleDOI
TL;DR: As discussed in this paper, there is persistent interest in extending cosmology beyond the standard model, ΛCDM, motivated by a range of apparently serious theoretical issues: the cosmological constant problem, the particle nature of dark matter, the validity of general relativity on large scales, anomalies in the CMB and on small scales, and the predictivity and testability of the inflationary paradigm.

Journal ArticleDOI
Roel Aaij1, C. Abellán Beteta2, Bernardo Adeva3, Marco Adinolfi4 +761 more · Institutions (64)
TL;DR: An angular analysis of the B0 → K*0(→ K+π−)μ+μ− decay is presented in this paper, where the angular observables and their correlations are reported in bins of q2, the invariant mass squared of the dimuon system.
Abstract: An angular analysis of the B0 → K*0(→ K+π−)μ+μ− decay is presented. The dataset corresponds to an integrated luminosity of 3.0 fb−1 of pp collision data collected at the LHCb experiment. The complete angular information from the decay is used to determine CP-averaged observables and CP asymmetries, taking account of possible contamination from decays with the K+π− system in an S-wave configuration. The angular observables and their correlations are reported in bins of q2, the invariant mass squared of the dimuon system. The observables are determined both from an unbinned maximum likelihood fit and by using the principal moments of the angular distribution. In addition, by fitting for q2-dependent decay amplitudes in the region 1.1 < q2 < 6.0 GeV2/c4, the zero-crossing points of several angular observables are computed. A global fit is performed to the complete set of CP-averaged observables obtained from the maximum likelihood fit. This fit indicates differences with predictions based on the Standard Model at the level of 3.4 standard deviations. These differences could be explained by contributions from physics beyond the Standard Model, or by an unexpectedly large hadronic effect that is not accounted for in the Standard Model predictions.

Journal ArticleDOI
TL;DR: It is shown that the brains of parrots and songbirds contain on average twice as many neurons as primate brains of the same mass, indicating that avian brains have higher neuron packing densities than mammalian brains.
Abstract: Some birds achieve primate-like levels of cognition, even though their brains tend to be much smaller in absolute size. This poses a fundamental problem in comparative and computational neuroscience, because small brains are expected to have a lower information-processing capacity. Using the isotropic fractionator to determine numbers of neurons in specific brain regions, here we show that the brains of parrots and songbirds contain on average twice as many neurons as primate brains of the same mass, indicating that avian brains have higher neuron packing densities than mammalian brains. Additionally, corvids and parrots have much higher proportions of brain neurons located in the pallial telencephalon compared with primates or other mammals and birds. Thus, large-brained parrots and corvids have forebrain neuron counts equal to or greater than primates with much larger brains. We suggest that the large numbers of neurons concentrated in high densities in the telencephalon substantially contribute to the neural basis of avian intelligence.

Journal ArticleDOI
16 Dec 2016-Science
TL;DR: Applying a 1-kilometer buffer to all roads, a global map of roadless areas is presented, together with an assessment of their status, quality, and extent of coverage by protected areas, in support of halting their continued loss.
Abstract: Roads fragment landscapes and trigger human colonization and degradation of ecosystems, to the detriment of biodiversity and ecosystem functions. The planet’s remaining large and ecologically important tracts of roadless areas sustain key refugia for biodiversity and provide globally relevant ecosystem services. Applying a 1-kilometer buffer to all roads, we present a global map of roadless areas and an assessment of their status, quality, and extent of coverage by protected areas. About 80% of Earth’s terrestrial surface remains roadless, but this area is fragmented into ~600,000 patches, more than half of which are ...

Journal ArticleDOI
TL;DR: In this largest meta-analysis of hypertensive patients, the nocturnal BP fall provided substantial prognostic information independent of 24-hour SBP levels; heterogeneity was low for the systolic night-to-day ratio and reverse/reduced dipping and moderate for extreme dippers.
Abstract: The prognostic importance of the nocturnal systolic blood pressure (SBP) fall, adjusted for average 24-hour SBP levels, is unclear. The Ambulatory Blood Pressure Collaboration in Patients With Hypertension (ABC-H) examined this issue in a meta-analysis of 17 312 hypertensives from 3 continents. Risks were computed for the systolic night-to-day ratio and for different dipping patterns (extreme, reduced, and reverse dippers) relative to normal dippers. ABC-H investigators provided multivariate adjusted hazard ratios (HRs), with and without adjustment for 24-hour SBP, for total cardiovascular events (CVEs), coronary events, strokes, cardiovascular mortality, and total mortality. Average 24-hour SBP varied from 131 to 140 mm Hg and systolic night-to-day ratio from 0.88 to 0.93. There were 1769 total CVEs, 916 coronary events, 698 strokes, 450 cardiovascular deaths, and 903 total deaths. After adjustment for 24-hour SBP, the systolic night-to-day ratio predicted all outcomes: from a 1-SD increase, summary HRs were 1.12 to 1.23. Reverse dipping also predicted all end points: HRs were 1.57 to 1.89. Reduced dippers, relative to normal dippers, had a significant 27% higher risk for total CVEs. Risks for extreme dippers were significantly influenced by antihypertensive treatment (P<0.001): untreated patients had increased risk of total CVEs (HR, 1.92), whereas treated patients had borderline lower risk (HR, 0.72) than normal dippers. For CVEs, heterogeneity was low for systolic night-to-day ratio and reverse/reduced dipping and moderate for extreme dippers. Quality of included studies was moderate to high, and publication bias was undetectable. In conclusion, in this largest meta-analysis of hypertensive patients, the nocturnal BP fall provided substantial prognostic information, independent of 24-hour SBP levels.
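The dipping categories analyzed above can be made concrete with a small sketch. The cut-offs below follow the widely used clinical convention and are an assumption here, not taken from the meta-analysis itself; the variable names and sample readings are likewise illustrative.

```python
# Illustrative sketch (cut-offs assumed, not from the paper): classifying
# nocturnal blood pressure dipping patterns from the systolic night-to-day ratio.

def night_to_day_ratio(night_sbp, day_sbp):
    """Mean nighttime SBP divided by mean daytime SBP."""
    return (sum(night_sbp) / len(night_sbp)) / (sum(day_sbp) / len(day_sbp))

def dipping_pattern(ratio):
    """Map a systolic night-to-day ratio to a dipping category.

    Conventional cut-offs (an assumption here):
      extreme dipper : ratio <= 0.80 (nocturnal fall > 20%)
      normal dipper  : 0.80 < ratio <= 0.90 (fall 10-20%)
      reduced dipper : 0.90 < ratio <= 1.00 (fall 0-10%)
      reverse dipper : ratio > 1.00 (nighttime SBP above daytime)
    """
    if ratio <= 0.80:
        return "extreme dipper"
    if ratio <= 0.90:
        return "normal dipper"
    if ratio <= 1.00:
        return "reduced dipper"
    return "reverse dipper"

day = [138, 142, 135, 140]    # daytime SBP readings, mm Hg (toy data)
night = [120, 118, 122, 124]  # nighttime SBP readings, mm Hg (toy data)
r = night_to_day_ratio(night, day)
print(round(r, 2), dipping_pattern(r))  # → 0.87 normal dipper
```

The ratio, not the raw nighttime pressure, is what carried prognostic information in the meta-analysis, which is why the sketch normalizes by the daytime mean.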

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, Ovsat Abdinov4  +2812 moreInstitutions (207)
TL;DR: In addition to the main tagging algorithms, an independent b-tagging algorithm based on the reconstruction of muons inside jets and the b-tagging algorithm used in the online trigger are presented.
Abstract: The identification of jets containing b hadrons is important for the physics programme of the ATLAS experiment at the Large Hadron Collider. Several algorithms to identify jets containing b hadrons are described, ranging from those based on the reconstruction of an inclusive secondary vertex or the presence of tracks with large impact parameters to combined tagging algorithms making use of multi-variate discriminants. An independent b-tagging algorithm based on the reconstruction of muons inside jets as well as the b-tagging algorithm used in the online trigger are also presented. The b-jet tagging efficiency, the c-jet tagging efficiency and the mistag rate for light flavour jets in data have been measured with a number of complementary methods. The calibration results are presented as scale factors defined as the ratio of the efficiency (or mistag rate) in data to that in simulation. In the case of b jets, where more than one calibration method exists, the results from the various analyses have been combined taking into account the statistical correlation as well as the correlation of the sources of systematic uncertainty.

Journal ArticleDOI
14 Oct 2016-Science
TL;DR: It is found that vaccination with DNA expressing the premembrane and envelope proteins of ZIKV was immunogenic in mice and nonhuman primates, and protection against viremia after ZikV challenge correlated with serum neutralizing activity.
Abstract: Zika virus (ZIKV) was identified as a cause of congenital disease during the explosive outbreak in the Americas and Caribbean that began in 2015. Because of the ongoing fetal risk from endemic disease and travel-related exposures, a vaccine to prevent viremia in women of childbearing age and their partners is imperative. We found that vaccination with DNA expressing the premembrane and envelope proteins of ZIKV was immunogenic in mice and nonhuman primates, and protection against viremia after ZIKV challenge correlated with serum neutralizing activity. These data not only indicate that DNA vaccination could be a successful approach to protect against ZIKV infection, but also suggest a protective threshold of vaccine-induced neutralizing activity that prevents viremia after acute infection.

Journal ArticleDOI
TL;DR: A meta-analysis encompassing 221 study landscapes worldwide reveals that forest restoration enhances biodiversity by 15–84% and vegetation structure by 36–77% compared with degraded ecosystems.
Abstract: Two billion ha have been identified globally for forest restoration. Our meta-analysis encompassing 221 study landscapes worldwide reveals forest restoration enhances biodiversity by 15-84% and vegetation structure by 36-77%, compared with degraded ecosystems. For the first time, we identify the main ecological drivers of forest restoration success (defined as a return to a reference condition, that is, old-growth forest) at both the local and landscape scale. These are as follows: the time elapsed since restoration began, disturbance type and landscape context. The time elapsed since restoration began strongly drives restoration success in secondary forests, but not in selectively logged forests (which are more ecologically similar to reference systems). Landscape restoration will be most successful when previous disturbance is less intensive and habitat is less fragmented in the landscape. Restoration does not result in full recovery of biodiversity and vegetation structure, but can complement old-growth forests if there is sufficient time for ecological succession.

Journal ArticleDOI
TL;DR: Analysis of 820 phages with annotated hosts shows how current knowledge and insights about the interaction mechanisms and ecology of coevolving phages and bacteria can be exploited to predict phage–host relationships, with potential relevance for medical and industrial applications.
Abstract: Metagenomics has changed the face of virus discovery by enabling the accurate identification of viral genome sequences without requiring isolation of the viruses. As a result, metagenomic virus discovery leaves the first and most fundamental question about any novel virus unanswered: What host does the virus infect? The diversity of the global virosphere and the volumes of data obtained in metagenomic sequencing projects demand computational tools for virus-host prediction. We focus on bacteriophages (phages, viruses that infect bacteria), the most abundant and diverse group of viruses found in environmental metagenomes. By analyzing 820 phages with annotated hosts, we review and assess the predictive power of in silico phage-host signals. Sequence homology approaches are the most effective at identifying known phage-host pairs. Compositional and abundance-based methods contain significant signal for phage-host classification, providing opportunities for analyzing the unknowns in viral metagenomes. Together, these computational approaches further our knowledge of the interactions between phages and their hosts. Importantly, we find that all reviewed signals significantly link phages to their hosts, illustrating how current knowledge and insights about the interaction mechanisms and ecology of coevolving phages and bacteria can be exploited to predict phage-host relationships, with potential relevance for medical and industrial applications.
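One of the composition-based signals the authors assess can be sketched in a few lines: match a phage to the candidate host whose k-mer frequency profile is most similar. The function names, distance choice (L1), and toy genomes below are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of a composition-based phage-host signal: assign the
# phage to the host genome with the most similar tetranucleotide profile.
from collections import Counter
from itertools import product

def kmer_profile(seq, k=4):
    """Normalized k-mer frequency vector over the fixed ACGT k-mer alphabet."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    # Counter returns 0 for absent k-mers, so the vector has fixed length 4**k.
    return [counts["".join(p)] / total for p in product("ACGT", repeat=k)]

def predict_host(phage_seq, host_seqs, k=4):
    """Return the host whose profile has the smallest L1 distance to the phage's."""
    phage = kmer_profile(phage_seq, k)

    def dist(name):
        host = kmer_profile(host_seqs[name], k)
        return sum(abs(a - b) for a, b in zip(phage, host))

    return min(host_seqs, key=dist)

# Toy genomes: the phage's composition resembles host_a far more than host_b.
phage = "ACGTTGCA" * 40
hosts = {"host_a": "ACGTTGCA" * 60, "host_b": "AATTAATT" * 60}
print(predict_host(phage, hosts))  # → host_a
```

Composition signals like this are weaker than direct sequence homology, as the review finds, but they apply even when no homologous region is shared between phage and host.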

Journal ArticleDOI
TL;DR: This work uses genome-resolved metagenomics to build a genome-based ecological model of the microbial community in a full-scale PNA reactor, yielding 23 near-complete draft genomes, including the most complete Omnitrophica genome to date.
Abstract: Partial-nitritation anammox (PNA) is a novel wastewater treatment procedure for energy-efficient ammonium removal. Here we use genome-resolved metagenomics to build a genome-based ecological model of the microbial community in a full-scale PNA reactor. Sludge from the bioreactor examined here is used to seed reactors in wastewater treatment plants around the world; however, the role of most of its microbial community in ammonium removal remains unknown. Our analysis yielded 23 near-complete draft genomes that together represent the majority of the microbial community. We assign these genomes to distinct anaerobic and aerobic microbial communities. In the aerobic community, nitrifying organisms and heterotrophs predominate. In the anaerobic community, widespread potential for partial denitrification suggests a nitrite loop increases treatment efficiency. Of our genomes, 19 have no previously cultivated or sequenced close relatives and six belong to bacterial phyla without any cultivated members, including the most complete Omnitrophica (formerly OP3) genome to date.

Journal ArticleDOI
16 Sep 2016-Science
TL;DR: The site-resolved observation of charge and spin correlations in the two-dimensional (2D) Fermi-Hubbard model realized with ultracold atoms shows strong bunching of doublons and holes, in agreement with numerical calculations.
Abstract: Strong electron correlations lie at the origin of high-temperature superconductivity. Its essence is believed to be captured by the Fermi-Hubbard model of repulsively interacting fermions on a lattice. Here we report on the site-resolved observation of charge and spin correlations in the two-dimensional (2D) Fermi-Hubbard model realized with ultracold atoms. Antiferromagnetic spin correlations are maximal at half-filling and weaken monotonically upon doping. At large doping, nearest-neighbor correlations between singly charged sites are negative, revealing the formation of a correlation hole, the suppressed probability of finding two fermions near each other. As the doping is reduced, the correlations become positive, signaling strong bunching of doublons and holes, in agreement with numerical calculations. The dynamics of the doublon-hole correlations should play an important role for transport in the Fermi-Hubbard model.
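The connected correlators reported above have a simple computational form. The toy below (not the experiment's analysis code; snapshot data and names are assumed) evaluates a connected nearest-neighbour correlator ⟨n_i n_{i+1}⟩ − ⟨n_i⟩⟨n_{i+1}⟩ from site-resolved occupation snapshots of a 1D chain, averaging over shots and bonds.

```python
# Illustrative sketch: connected nearest-neighbour density correlator from
# site-resolved 0/1 occupation snapshots (1D chain for simplicity).

def connected_nn_correlator(snapshots):
    """snapshots: list of equal-length 0/1 occupation lists, one per shot."""
    length = len(snapshots[0])
    n_shots = len(snapshots)
    bonds = length - 1  # number of nearest-neighbour pairs per snapshot
    norm = n_shots * bonds
    mean_i = sum(s[i] for s in snapshots for i in range(bonds)) / norm
    mean_j = sum(s[i + 1] for s in snapshots for i in range(bonds)) / norm
    mean_ij = sum(s[i] * s[i + 1] for s in snapshots for i in range(bonds)) / norm
    return mean_ij - mean_i * mean_j

# Perfectly anticorrelated snapshots give a negative connected correlator,
# the analogue of the "correlation hole" between singly charged sites.
shots = [[1, 0, 1, 0], [0, 1, 0, 1]]
print(connected_nn_correlator(shots))  # → -0.25
```

A negative value, as here, signals suppressed probability of finding two particles next to each other; a positive value would signal bunching, as the experiment observes for doublons and holes near half-filling.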

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, Ovsat Abdinov4  +2862 moreInstitutions (191)
TL;DR: The methods employed in the ATLAS experiment to correct for the impact of pile-up on jet energy and jet shapes, and for the presence of spurious additional jets, are described, with a primary focus on the large 20.3 fb−1 data sample.
Abstract: The large rate of multiple simultaneous proton–proton interactions, or pile-up, generated by the Large Hadron Collider in Run 1 required the development of many new techniques to mitigate the adverse ...

Journal ArticleDOI
TL;DR: Combined findings from clinical, laboratory, imaging, and pathological examinations provided a more complete picture of the severe damage and developmental abnormalities caused by ZIKV infection than has been previously reported.
Abstract: Importance: Recent studies have reported an increase in the number of fetuses and neonates with microcephaly whose mothers were infected with the Zika virus (ZIKV) during pregnancy. To our knowledge, most reports to date have focused on select aspects of the maternal or fetal infection and fetal effects. Objective: To describe the prenatal evolution and perinatal outcomes of 11 neonates who had developmental abnormalities and neurological damage associated with ZIKV infection in Brazil. Design, Setting, and Participants: We observed 11 infants with congenital ZIKV infection from gestation to 6 months in the state of Paraiba, Brazil. Ten of 11 women included in this study presented with symptoms of ZIKV infection during the first half of pregnancy, and all 11 had laboratory evidence of the infection in several tissues by serology or polymerase chain reaction. Brain damage was confirmed through intrauterine ultrasonography and was complemented by magnetic resonance imaging. Histopathological analysis was performed on the placenta and brain tissue from infants who died. The ZIKV genome was investigated in several tissues and sequenced for further phylogenetic analysis. Main Outcomes and Measures: Description of the major lesions caused by ZIKV congenital infection. Results: Of the 11 infants, 7 (63.6%) were female, and the median (SD) maternal age at delivery was 25 (6) years. Three of 11 neonates died, giving a perinatal mortality rate of 27.3%. The median (SD) cephalic perimeter at birth was 31 (3) cm, a value lower than the limit to consider a microcephaly case. In all patients, neurological impairments were identified, including microcephaly, a reduction in cerebral volume, ventriculomegaly, cerebellar hypoplasia, lissencephaly with hydrocephalus, and fetal akinesia deformation sequence (ie, arthrogryposis). Results of limited testing for other causes of microcephaly, such as genetic disorders and viral and bacterial infections, were negative, and the ZIKV genome was found in both maternal and neonatal tissues (eg, amniotic fluid, cord blood, placenta, and brain). Phylogenetic analyses showed an intrahost virus variation with some polymorphisms in envelope genes associated with different tissues. Conclusions and Relevance: Combined findings from clinical, laboratory, imaging, and pathological examinations provided a more complete picture of the severe damage and developmental abnormalities caused by ZIKV infection than has been previously reported. The term congenital Zika syndrome is preferable to refer to these cases, as microcephaly is just one of the clinical signs of this congenital malformation disorder.

Journal ArticleDOI
Morad Aaboud, Alexander Kupco1, P. Davison2, Samuel Webb3  +2869 moreInstitutions (194)
TL;DR: The luminosity determination for the ATLAS detector at the LHC during pp collisions at √s = 8 TeV in 2012 is presented in this article, where the evaluation of the luminosity scale is performed using several luminometers.
Abstract: The luminosity determination for the ATLAS detector at the LHC during pp collisions at √s = 8 TeV in 2012 is presented. The evaluation of the luminosity scale is performed using several luminometers ...


Journal ArticleDOI
Abstract: Middleboxes or network appliances like firewalls, proxies, and WAN optimizers have become an integral part of today’s ISP and enterprise networks Middlebox functionalities are usually deployed on expensive and proprietary hardware that require trained personnel for deployment and maintenance Middleboxes contribute significantly to a network’s capital and operation costs In addition, organizations often require their traffic to pass through a specific sequence of middleboxes for compliance with security and performance policies This makes the middlebox deployment and maintenance tasks even more complicated Network function virtualization (NFV) is an emerging and promising technology that is envisioned to overcome these challenges It proposes to move packet processing from dedicated hardware middleboxes to software running on commodity servers In NFV terminology, software middleboxes are referred to as virtualized network functions (VNFs) It is a challenging problem to determine the required number and placement of VNFs that optimizes network operational costs and utilization, without violating service level agreements We call this the VNF orchestration problem (VNF-OP) and provide an integer linear programming formulation with implementation in CPLEX We also provide a dynamic programming-based heuristic to solve larger instances of VNF-OP Trace driven simulations on real-world network topologies demonstrate that the heuristic can provide solutions that are within 13 times of the optimal solution Our experiments suggest that a VNF-based approach can provide more than $ {4\times }$ reduction in the operational cost of a network