
Showing papers by "University of Maryland, College Park" published in 2013


Journal ArticleDOI
TL;DR: TopHat2 is described, which incorporates many significant enhancements to TopHat, and combines the ability to identify novel splice sites with direct mapping to known transcripts, producing sensitive and accurate alignments, even for highly repetitive genomes or in the presence of pseudogenes.
Abstract: TopHat is a popular spliced aligner for RNA-sequencing (RNA-seq) experiments. In this paper, we describe TopHat2, which incorporates many significant enhancements to TopHat. TopHat2 can align reads of various lengths produced by the latest sequencing technologies, while allowing for variable-length indels with respect to the reference genome. In addition to de novo spliced alignment, TopHat2 can align reads across fusion breaks, which can occur after genomic translocations. TopHat2 combines the ability to identify novel splice sites with direct mapping to known transcripts, producing sensitive and accurate alignments, even for highly repetitive genomes or in the presence of pseudogenes. TopHat2 is available at http://ccb.jhu.edu/software/tophat.

11,380 citations


Journal ArticleDOI
15 Nov 2013-Science
TL;DR: Intensive forestry practiced within subtropical forests resulted in the highest rates of forest change globally, and boreal forest loss due largely to fire and forestry was second to that in the tropics in absolute and proportional terms.
Abstract: Quantification of global forest change has been lacking despite the recognized importance of forest ecosystem services. In this study, Earth observation satellite data were used to map global forest loss (2.3 million square kilometers) and gain (0.8 million square kilometers) from 2000 to 2012 at a spatial resolution of 30 meters. The tropics were the only climate domain to exhibit a trend, with forest loss increasing by 2101 square kilometers per year. Brazil's well-documented reduction in deforestation was offset by increasing forest loss in Indonesia, Malaysia, Paraguay, Bolivia, Zambia, Angola, and elsewhere. Intensive forestry practiced within subtropical forests resulted in the highest rates of forest change globally. Boreal forest loss due largely to fire and forestry was second to that in the tropics in absolute and proportional terms. These results depict a globally consistent and locally relevant record of forest change.

7,890 citations


Journal ArticleDOI
TL;DR: The results show how the changes in the internal parameters associated with the peptide backbone via CMAP and the χ1 and χ2 dihedral parameters lead to improved treatment of the analyzed nonbond interactions.
Abstract: Protein structure and dynamics can be characterized on the atomistic level with both nuclear magnetic resonance (NMR) experiments and molecular dynamics (MD) simulations. Here, we quantify the ability of the recently presented CHARMM36 (C36) force field (FF) to reproduce various NMR observables using MD simulations. The studied NMR properties include backbone scalar couplings across hydrogen bonds, residual dipolar couplings (RDCs) and relaxation order parameters, as well as scalar couplings, RDCs, and order parameters for side-chain amino- and methyl-containing groups. It is shown that the C36 FF leads to better correlation with experimental data compared to the CHARMM22/CMAP FF, and we suggest using C36 in protein simulations. Although both CHARMM FFs contain the same nonbond parameters, our results show how the changes in the internal parameters associated with the peptide backbone via CMAP and the χ1 and χ2 dihedral parameters lead to improved treatment of the analyzed nonbond interactions. This highlights the importance of proper treatment of the internal covalent components in modeling nonbond interactions with molecular mechanics FFs.

2,288 citations


Journal ArticleDOI
Christopher J L Murray1, Jerry Puthenpurakal Abraham2, Mohammed K. Ali3, Miriam Alvarado1, Charles Atkinson1, Larry M. Baddour4, David Bartels5, Emelia J. Benjamin6, Kavi Bhalla5, Gretchen L. Birbeck7, Ian Bolliger1, Roy Burstein1, Emily Carnahan1, Honglei Chen8, David Chou1, Sumeet S. Chugh9, Aaron Cohen10, K. Ellicott Colson1, Leslie T. Cooper11, William G. Couser12, Michael H. Criqui13, Kaustubh Dabhadkar3, Nabila Dahodwala14, Goodarz Danaei5, Robert P. Dellavalle15, Don C. Des Jarlais16, Daniel Dicker1, Eric L. Ding5, E. Ray Dorsey17, Herbert C. Duber1, Beth E. Ebel12, Rebecca E. Engell1, Majid Ezzati18, David T. Felson6, Mariel M. Finucane5, Seth Flaxman19, Abraham D. Flaxman1, Thomas D. Fleming1, Mohammad H. Forouzanfar1, Greg Freedman1, Michael Freeman1, Sherine E. Gabriel4, Emmanuela Gakidou1, Richard F. Gillum20, Diego Gonzalez-Medina1, Richard A. Gosselin21, Bridget F. Grant8, Hialy R. Gutierrez22, Holly Hagan23, Rasmus Havmoeller24, Rasmus Havmoeller9, Howard J. Hoffman8, Kathryn H. Jacobsen25, Spencer L. James1, Rashmi Jasrasaria1, Sudha Jayaraman5, Nicole E. Johns1, Nicholas J Kassebaum12, Shahab Khatibzadeh5, Lisa M. Knowlton5, Qing Lan, Janet L Leasher26, Stephen S Lim1, John K Lin5, Steven E. Lipshultz27, Stephanie J. London8, Rafael Lozano, Yuan Lu5, Michael F. Macintyre1, Leslie Mallinger1, Mary M. McDermott28, Michele Meltzer29, George A. Mensah8, Catherine Michaud30, Ted R. Miller31, Charles Mock12, Terrie E. Moffitt32, Ali A. Mokdad1, Ali H. Mokdad1, Andrew E. Moran22, Dariush Mozaffarian5, Dariush Mozaffarian33, Tasha B. Murphy1, Mohsen Naghavi1, K.M. Venkat Narayan3, Robert G. Nelson8, Casey Olives12, Saad B. Omer3, Katrina F Ortblad1, Bart Ostro34, Pamela M. Pelizzari35, David Phillips1, C. Arden Pope36, Murugesan Raju37, Dharani Ranganathan1, Homie Razavi, Beate Ritz38, Frederick P. Rivara12, Thomas Roberts1, Ralph L. Sacco27, Joshua A. Salomon5, Uchechukwu K.A. Sampson39, Ella Sanman1, Amir Sapkota40, David C. Schwebel41, Saeid Shahraz42, Kenji Shibuya43, Rupak Shivakoti17, Donald H. Silberberg14, Gitanjali M Singh5, David Singh44, Jasvinder A. Singh41, David A. Sleet, Kyle Steenland3, Mohammad Tavakkoli5, Jennifer A. Taylor45, George D. Thurston23, Jeffrey A. Towbin46, Monica S. Vavilala12, Theo Vos1, Gregory R. Wagner47, Martin A. Weinstock48, Marc G. Weisskopf5, James D. Wilkinson27, Sarah Wulf1, Azadeh Zabetian3, Alan D. Lopez49 
14 Aug 2013-JAMA
TL;DR: To measure the burden of diseases, injuries, and leading risk factors in the United States from 1990 to 2010 and to compare these measurements with those of the 34 countries in the Organisation for Economic Co-operation and Development (OECD), systematic analysis of descriptive epidemiology was used.
Abstract: Importance Understanding the major health problems in the United States and how they are changing over time is critical for informing national health policy. Objectives To measure the burden of diseases, injuries, and leading risk factors in the United States from 1990 to 2010 and to compare these measurements with those of the 34 countries in the Organisation for Economic Co-operation and Development (OECD). Design We used the systematic analysis of descriptive epidemiology of 291 diseases and injuries, 1160 sequelae of these diseases and injuries, and 67 risk factors or clusters of risk factors from 1990 to 2010 for 187 countries developed for the Global Burden of Disease 2010 Study to describe the health status of the United States and to compare US health outcomes with those of 34 OECD countries. Years of life lost due to premature mortality (YLLs) were computed by multiplying the number of deaths at each age by a reference life expectancy at that age. Years lived with disability (YLDs) were calculated by multiplying prevalence (based on systematic reviews) by the disability weight (based on population-based surveys) for each sequela; disability in this study refers to any short- or long-term loss of health. Disability-adjusted life-years (DALYs) were estimated as the sum of YLDs and YLLs. Deaths and DALYs related to risk factors were based on systematic reviews and meta-analyses of exposure data and relative risks for risk-outcome pairs. Healthy life expectancy (HALE) was used to summarize overall population health, accounting for both length of life and levels of ill health experienced at different ages. Results US life expectancy for both sexes combined increased from 75.2 years in 1990 to 78.2 years in 2010; during the same period, HALE increased from 65.8 years to 68.1 years. The diseases and injuries with the largest number of YLLs in 2010 were ischemic heart disease, lung cancer, stroke, chronic obstructive pulmonary disease, and road injury. Age-standardized YLL rates increased for Alzheimer disease, drug use disorders, chronic kidney disease, kidney cancer, and falls. The diseases with the largest number of YLDs in 2010 were low back pain, major depressive disorder, other musculoskeletal disorders, neck pain, and anxiety disorders. As the US population has aged, YLDs have comprised a larger share of DALYs than have YLLs. The leading risk factors related to DALYs were dietary risks, tobacco smoking, high body mass index, high blood pressure, high fasting plasma glucose, physical inactivity, and alcohol use. Among 34 OECD countries between 1990 and 2010, the US rank for the age-standardized death rate changed from 18th to 27th, for the age-standardized YLL rate from 23rd to 28th, for the age-standardized YLD rate from 5th to 6th, for life expectancy at birth from 20th to 27th, and for HALE from 14th to 26th. Conclusions and Relevance From 1990 to 2010, the United States made substantial progress in improving health. Life expectancy at birth and HALE increased, all-cause death rates at all ages decreased, and age-specific rates of years lived with disability remained stable. However, morbidity and chronic disability now account for nearly half of the US health burden, and improvements in population health in the United States have not kept pace with advances in population health in other wealthy nations.
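The YLL, YLD, and DALY definitions quoted above reduce to straightforward arithmetic; a minimal Python sketch using entirely hypothetical inputs (none of the numbers below are study results):

```python
# Illustrative DALY arithmetic following the definitions in the abstract.
# All numbers below are made up for demonstration; they are not study results.

reference_life_expectancy = {60: 25.0, 70: 16.5, 80: 9.6}  # years remaining at each age (hypothetical)
deaths_by_age = {60: 1200, 70: 3400, 80: 5100}             # deaths at each age (hypothetical)

# Years of life lost: deaths at each age times the reference life expectancy at that age
yll = sum(deaths_by_age[age] * reference_life_expectancy[age] for age in deaths_by_age)

# Years lived with disability: prevalence times disability weight, summed over sequelae
sequelae = [
    {"name": "sequela A", "prevalence": 150_000, "disability_weight": 0.037},
    {"name": "sequela B", "prevalence": 40_000, "disability_weight": 0.145},
]
yld = sum(s["prevalence"] * s["disability_weight"] for s in sequelae)

# Disability-adjusted life-years are the sum of the two components
daly = yll + yld
print(f"YLL={yll:,.0f}  YLD={yld:,.0f}  DALY={daly:,.0f}")
```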

2,159 citations


Journal ArticleDOI
TL;DR: In this article, the authors review the theoretical underpinning, techniques, and results of efforts to estimate the CO-to-H2 conversion factor in different environments, and recommend a conversion factor XCO = 2×10^20 cm^-2 (K km s^-1)^-1 with ±30% uncertainty.
Abstract: CO line emission represents the most accessible and widely used tracer of the molecular interstellar medium. This renders the translation of observed CO intensity into total H2 gas mass critical to understand star formation and the interstellar medium in our Galaxy and beyond. We review the theoretical underpinning, techniques, and results of efforts to estimate this CO-to-H2 "conversion factor," XCO, in different environments. In the Milky Way disk, we recommend a conversion factor XCO = 2×10^20 cm^-2 (K km s^-1)^-1 with ±30% uncertainty. Studies of other "normal galaxies" return similar values in Milky Way-like disks, but with greater scatter and systematic uncertainty. Departures from this Galactic conversion factor are both observed and expected. Dust-based determinations, theoretical arguments, and scaling relations all suggest that XCO increases with decreasing metallicity, turning up sharply below metallicity ≈ 1/3–1/2 solar in a manner consistent with model predictions that identify shielding as a key parameter. Based on spectral line modeling and dust observations, XCO appears to drop in the central, bright regions of some but not all galaxies, often coincident with regions of bright CO emission and high stellar surface density. This lower XCO is also present in the overwhelmingly molecular interstellar medium of starburst galaxies, where several lines of evidence point to a lower CO-to-H2 conversion factor. At high redshift, direct evidence regarding the conversion factor remains scarce; we review what is known based on dynamical modeling and other arguments. Subject headings: ISM: general — ISM: molecules — galaxies: ISM — radio lines: ISM
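Applying the recommended Galactic value amounts to a single multiplication of the integrated CO intensity by XCO; a minimal Python sketch, with a hypothetical line intensity chosen only for illustration:

```python
# Minimal illustration of applying the recommended Milky Way conversion factor.
# The W_CO value is hypothetical; X_CO and its +/-30% uncertainty are from the text.

X_CO = 2.0e20          # cm^-2 (K km s^-1)^-1, recommended Galactic value
uncertainty = 0.30     # fractional uncertainty quoted in the review

W_CO = 5.0             # K km s^-1, hypothetical integrated CO line intensity
N_H2 = X_CO * W_CO     # inferred H2 column density in cm^-2

print(f"N(H2) = {N_H2:.2e} cm^-2  (+/- {uncertainty * N_H2:.1e})")
```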

2,004 citations


Journal ArticleDOI
TL;DR: It is shown that metagenomeSeq outperforms the tools currently used in this field and relies on a novel normalization technique and a statistical model that accounts for undersampling in large-scale marker-gene studies.
Abstract: We introduce a methodology to assess differential abundance in sparse high-throughput microbial marker-gene survey data. Our approach, implemented in the metagenomeSeq Bioconductor package, relies on a novel normalization technique and a statistical model that accounts for undersampling, a common feature of large-scale marker-gene studies. Using simulated data and several published microbiota data sets, we show that metagenomeSeq outperforms the tools currently used in this field.
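metagenomeSeq itself is an R/Bioconductor package; purely as an illustration of the normalization idea described above (scaling each sample by a robust cumulative sum of its lower counts rather than by its total count), here is a hedged Python sketch. The function name, quantile choice, and rescaling constant are assumptions for demonstration, not the package's API or exact method.

```python
import numpy as np

# Hedged sketch of a cumulative-sum-scaling-style normalization for sparse
# marker-gene count data (rows = samples, columns = features). This mimics the
# idea described in the abstract; it is not the metagenomeSeq implementation.

def css_normalize(counts, quantile=0.5):
    counts = np.asarray(counts, dtype=float)
    scaled = np.empty_like(counts)
    for i, sample in enumerate(counts):
        nonzero = sample[sample > 0]
        if nonzero.size == 0:
            scaled[i] = sample           # nothing observed in this sample
            continue
        q = np.quantile(nonzero, quantile)   # per-sample count quantile
        s = sample[sample <= q].sum()        # cumulative sum up to that quantile
        scaled[i] = sample / s * 1000.0      # rescale to a common constant
    return scaled

toy = [[0, 5, 120, 3], [2, 0, 40, 1]]
print(css_normalize(toy))
```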

1,664 citations


Journal ArticleDOI
M. G. Aartsen1, Rasha Abbasi2, Y. Abdou3, Markus Ackermann, Jenni Adams4, Juanan Aguilar5, Markus Ahlers2, D. Altmann6, J. Auffenberg2, X. Bai, Michael J. Baker2, S. W. Barwick7, V. Baum8, R. C. Bay9, J. J. Beatty10, S. Bechet11, J. Becker Tjus12, K.-H. Becker13, M. L. Benabderrahmane, Segev BenZvi2, P. Berghaus, D. Berley14, Elisa Bernardini, A. Bernhard, D. Bertrand11, D. Z. Besson15, Gary Binder9, Gary Binder16, Daniel Bindig13, M. Bissok17, E. Blaufuss14, J. Blumenthal17, D. J. Boersma18, S. Bohaichuk19, C. Bohm20, D. Bose21, S. Böser22, Olga Botner18, L. Brayeur21, H.-P. Bretz, A. M. Brown4, R. Bruijn23, Jürgen Brunner, M. J. Carson3, J. Casey24, M. Casier21, Dmitry Chirkin2, A. Christov5, B. Christy14, K. Clark25, F. Clevermann26, S. Coenders17, Seth M. Cohen23, D. F. Cowen25, A. H. Cruz Silva, M. Danninger20, J. Daughhetee24, J. C. Davis10, M. Day2, C. De Clercq21, S. De Ridder3, Paolo Desiati2, K. D. de Vries21, Tyce DeYoung25, Juan Carlos Diaz-Velez2, Matt Dunkman25, R. Eagan25, B. Eberhardt8, B. Eichmann12, J. Eisch2, R. W. Ellsworth14, S. Euler17, Paul Evenson, O. Fadiran2, A. R. Fazely27, Anatoli Fedynitch12, J. Feintzeig2, T. Feusels3, Kirill Filimonov9, Chad Finley20, T. Fischer-Wasels13, S. Flis20, A. Franckowiak22, K. Frantzen26, T. Fuchs26, Thomas K. Gaisser, J. C. Gallagher2, L. Gerhardt9, L. Gerhardt16, L. Gladstone2, Thorsten Glusenkamp, A. Goldschmidt16, G. Golup21, J. G. Gonzalez, J. A. Goodman14, Dariusz Gora, Dylan T. Grandmont19 
20 Nov 2013-Science
TL;DR: The presence of a high-energy neutrino flux containing the most energetic neutrinos ever observed is revealed, including 28 events at energies between 30 and 1200 TeV; although the origin of this flux is unknown, the findings are consistent with expectations for a neutrino population with origins outside the solar system.
Abstract: We report on results of an all-sky search for high-energy neutrino events interacting within the IceCube neutrino detector conducted between May 2010 and May 2012. The search follows up on the previous detection of two PeV neutrino events, with improved sensitivity and extended energy coverage down to about 30 TeV. Twenty-six additional events were observed, substantially more than expected from atmospheric backgrounds. Combined, both searches reject a purely atmospheric origin for the 28 events at the 4 sigma level. These 28 events, which include the highest energy neutrinos ever observed, have flavors, directions, and energies inconsistent with those expected from the atmospheric muon and neutrino backgrounds. These properties are, however, consistent with generic predictions for an additional component of extraterrestrial origin.

1,490 citations


Journal ArticleDOI
TL;DR: In this article, the authors used data from the Census Bureau's Business Dynamics Statistics and Longitudinal Business Database to explore the many issues at the core of this ongoing debate and find that the relationship between firm size and employment growth is sensitive to these issues.
Abstract: The view that small businesses create the most jobs remains appealing to policymakers and small business advocates. Using data from the Census Bureau's Business Dynamics Statistics and Longitudinal Business Database, we explore the many issues at the core of this ongoing debate. We find that the relationship between firm size and employment growth is sensitive to these issues. However, our main finding is that once we control for firm age, there is no systematic relationship between firm size and growth. Our findings highlight the important role of business start-ups and young businesses in U.S. job creation.

1,430 citations


Journal ArticleDOI
TL;DR: A label consistent K-SVD (LC-KSVD) algorithm to learn a discriminative dictionary for sparse coding and introduces a new label consistency constraint called "discriminative sparse-code error" to enforce discriminability in sparse codes during the dictionary learning process.
Abstract: A label consistent K-SVD (LC-KSVD) algorithm to learn a discriminative dictionary for sparse coding is presented. In addition to using class labels of training data, we also associate label information with each dictionary item (columns of the dictionary matrix) to enforce discriminability in sparse codes during the dictionary learning process. More specifically, we introduce a new label consistency constraint called "discriminative sparse-code error" and combine it with the reconstruction error and the classification error to form a unified objective function. The optimal solution is efficiently obtained using the K-SVD algorithm. Our algorithm learns a single overcomplete dictionary and an optimal linear classifier jointly. An incremental dictionary learning algorithm is presented for situations with limited memory resources. It yields dictionaries so that feature points with the same class labels have similar sparse codes. Experimental results demonstrate that our algorithm outperforms many recently proposed sparse-coding techniques for face, action, scene, and object category recognition under the same learning conditions.
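Written schematically, the unified objective combining the three error terms mentioned above takes the following form (notation follows the conventional LC-KSVD presentation: Y are training signals, D the dictionary, X the sparse codes, Q the target "discriminative" codes, H the label matrix, A and W linear maps; the weights α, β and the sparsity level T are the assumed tuning parameters):

$$\min_{D,\,W,\,A,\,X}\; \lVert Y - DX\rVert_F^2 \;+\; \alpha\,\lVert Q - AX\rVert_F^2 \;+\; \beta\,\lVert H - WX\rVert_F^2 \quad \text{subject to } \lVert x_i\rVert_0 \le T \;\; \forall i$$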

1,232 citations


Journal ArticleDOI
TL;DR: It is emphasized that the facile synthesis of a GO membrane exploiting the ideal properties of inexpensive GO materials offers a myriad of opportunities to modify its physicochemical properties, potentially making the GO membrane a next-generation, cost-effective, and sustainable alternative to the long-existing thin-film composite polyamide membranes for water separation applications.
Abstract: We report a novel procedure to synthesize a new type of water separation membrane using graphene oxide (GO) nanosheets such that water can flow through the nanochannels between GO layers while unwanted solutes are rejected by size exclusion and charge effects. The GO membrane was made via layer-by-layer deposition of GO nanosheets, which were cross-linked by 1,3,5-benzenetricarbonyl trichloride, on a polydopamine-coated polysulfone support. The cross-linking not only provided the stacked GO nanosheets with the necessary stability to overcome their inherent dispersibility in water environments but also fine-tuned the charges, functionality, and spacing of the GO nanosheets. We then tested the membranes synthesized with different numbers of GO layers to demonstrate their interesting water separation performance. It was found that the GO membrane flux ranged between 80 and 276 LMH/MPa, roughly 4–10 times higher than that of most commercial nanofiltration membranes. Although the GO membrane in the present deve...
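The flux figures above are normalized by applied pressure (LMH/MPa, i.e., liters per square meter per hour per MPa); converting to an expected flux at a given operating pressure is a single multiplication. A small sketch, with the operating pressure chosen hypothetically for illustration:

```python
# Converting the pressure-normalized flux range quoted in the abstract into an
# actual flux at a hypothetical operating pressure.

flux_per_MPa_low, flux_per_MPa_high = 80.0, 276.0   # LMH/MPa, range from the text
applied_pressure_MPa = 0.5                          # hypothetical operating pressure

low = flux_per_MPa_low * applied_pressure_MPa
high = flux_per_MPa_high * applied_pressure_MPa
print(f"Expected flux at {applied_pressure_MPa} MPa: {low:.0f}-{high:.0f} L m^-2 h^-1")
```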

1,224 citations


Journal ArticleDOI
TL;DR: The Global Fire Emissions Database (GFED4) as discussed by the authors provides global monthly burned area at 0.25° spatial resolution from mid-1995 through the present and daily burned area for the time series extending back to August 2000.
Abstract: We describe the fourth generation of the Global Fire Emissions Database (GFED4) burned area data set, which provides global monthly burned area at 0.25° spatial resolution from mid-1995 through the present and daily burned area for the time series extending back to August 2000. We produced the full data set by combining 500 m MODIS burned area maps with active fire data from the Tropical Rainfall Measuring Mission (TRMM) Visible and Infrared Scanner (VIRS) and the Along-Track Scanning Radiometer (ATSR) family of sensors. We found that the global annual area burned for the years 1997 through 2011 varied from 301 to 377 Mha, with an average of 348 Mha. We assessed the interannual variability and trends in burned area on the basis of a region-specific definition of fire years. With respect to trends, we found a gradual decrease of 1.7 Mha yr^-1 (-1.4% yr^-1) in Northern Hemisphere Africa since 2000, a gradual increase of 2.3 Mha yr^-1 (+1.8% yr^-1) in Southern Hemisphere Africa also since 2000, a slight increase of 0.2 Mha yr^-1 (+2.5% yr^-1) in Southeast Asia since 1997, and a rapid decrease of approximately 5.5 Mha yr^-1 (-10.7% yr^-1) from 2001 through 2011 in Australia, followed by a major upsurge in 2011 that exceeded the annual area burned in at least the previous 14 years. The net trend in global burned area from 2000 to 2012 was a modest decrease of 4.3 Mha yr^-1 (-1.2% yr^-1). We also performed a spectral analysis of the daily burned area time series and found no vestiges of the 16-day MODIS repeat cycle.
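As a quick consistency check, the percentage trend quoted for the global total is simply the absolute trend divided by the mean annual burned area given in the text:

```python
# Consistency check of the global trend quoted in the abstract:
# -4.3 Mha/yr relative to the ~348 Mha average annual burned area.

mean_burned_area_Mha = 348.0     # average annual area burned, 1997-2011 (from text)
global_trend_Mha_per_yr = -4.3   # net trend, 2000-2012 (from text)

relative_trend = global_trend_Mha_per_yr / mean_burned_area_Mha * 100
print(f"{relative_trend:.1f}% per year")   # ~ -1.2%/yr, matching the stated value
```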

Journal ArticleDOI
M. Aguilar1, G Alberti2, Behcet Alpat, A. Alvino2 +344 more authors (39 institutions)
TL;DR: The very accurate data show that the positron fraction is steadily increasing from 10 to ∼ 250 GeV, but, from 20 to 250 GeV, the slope decreases by an order of magnitude, showing the existence of new physical phenomena.
Abstract: A precision measurement by the Alpha Magnetic Spectrometer on the International Space Station of the positron fraction in primary cosmic rays in the energy range from 0.5 to 350 GeV based on 6.8 × 10^6 positron and electron events is presented. The very accurate data show that the positron fraction is steadily increasing from 10 to ∼250 GeV, but, from 20 to 250 GeV, the slope decreases by an order of magnitude. The positron fraction spectrum shows no fine structure, and the positron to electron ratio shows no observable anisotropy. Together, these features show the existence of new physical phenomena.

Journal ArticleDOI
TL;DR: The NIH-TB Cognition Battery is intended to serve as a brief, convenient set of measures to supplement other outcome measures in epidemiologic and longitudinal research and clinical trials and will provide a “common currency” among researchers for comparisons across a wide range of studies and populations.
Abstract: Motor function involves complex physiologic processes and requires the integration of multiple systems, including neuromuscular, musculoskeletal, and cardiopulmonary, and neural motor and sensory-perceptual systems. Motor-functional status is indicative of current physical health status, burden of disease, and long-term health outcomes, and is integrally related to daily functioning and quality of life. Given its importance to overall neurologic health and function, motor function was identified as a key domain for inclusion in the NIH Toolbox for Assessment of Neurological and Behavioral Function (NIH Toolbox). We engaged in a 3-stage developmental process to: 1) identify key subdomains and candidate measures for inclusion in the NIH Toolbox, 2) pretest candidate measures for feasibility across the age span of people aged 3 to 85 years, and 3) validate candidate measures against criterion measures in a sample of healthy individuals aged 3 to 85 years (n = 340). Based on extensive literature review and input from content experts, the 5 subdomains of dexterity, strength, balance, locomotion, and endurance were recommended for inclusion in the NIH Toolbox motor battery. Based on our validation testing, valid and reliable measures that are simultaneously low-cost and portable have been recommended to assess each subdomain, including the 9-hole peg board for dexterity, grip dynamometry for upper-extremity strength, standing balance test, 4-m walk test for gait speed, and a 2-minute walk test for endurance.

Journal ArticleDOI
18 Jan 2013-Science
TL;DR: With the first plenary meeting of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) soon under way, partners are developing—and seeking consensus around—Essential Biodiversity Variables (EBVs) that could form the basis of monitoring programs worldwide.
Abstract: Reducing the rate of biodiversity loss and averting dangerous biodiversity change are international goals, reasserted by the Aichi Targets for 2020 by Parties to the United Nations (UN) Convention on Biological Diversity (CBD) after failure to meet the 2010 target (1, 2). However, there is no global, harmonized observation system for delivering regular, timely data on biodiversity change (3). With the first plenary meeting of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) soon under way, partners from the Group on Earth Observations Biodiversity Observation Network (GEO BON) (4) are developing—and seeking consensus around—Essential Biodiversity Variables (EBVs) that could form the basis of monitoring programs worldwide.

Journal ArticleDOI
TL;DR: Using a high-resolution neutron powder diffraction technique, the first direct structural evidence is found showing that real UiO-66 material contains a significant amount of missing-linker defects, an unusual phenomenon for MOFs.
Abstract: UiO-66 is a highly important prototypical zirconium metal–organic framework (MOF) compound because of its excellent stabilities not typically found in common porous MOFs. In its perfect crystal structure, each Zr metal center is fully coordinated by 12 organic linkers to form a highly connected framework. Using a high-resolution neutron powder diffraction technique, we found the first direct structural evidence showing that real UiO-66 material contains a significant amount of missing-linker defects, an unusual phenomenon for MOFs. The concentration of the missing-linker defects is surprisingly high, ∼10% in our sample, effectively reducing the framework connection from 12 to ∼11. We show that by varying the concentration of the acetic acid modulator and the synthesis time, the linker vacancies can be tuned systematically, leading to dramatically enhanced porosity. We obtained samples with pore volumes ranging from 0.44 to 1.0 cm^3/g and Brunauer–Emmett–Teller surface areas ranging from 1000 to 1600 m^2/g, the l...
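The stated drop in framework connectivity follows directly from the defect concentration quoted above: with roughly 10% of the 12 linkers around each Zr center missing, the average connectivity is

$$12 \times (1 - 0.10) = 10.8 \approx 11,$$

consistent with the reduction from 12 to ∼11 reported in the abstract.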

Journal ArticleDOI
TL;DR: In this paper, the IRAM Plateau de Bure high-z blue sequence CO 3-2 survey of the molecular gas properties in massive, main-sequence star-forming galaxies (SFGs) near the cosmic star formation peak is presented.
Abstract: We present PHIBSS, the IRAM Plateau de Bure high-z blue sequence CO 3-2 survey of the molecular gas properties in massive, main-sequence star-forming galaxies (SFGs) near the cosmic star formation peak. PHIBSS provides 52 CO detections in two redshift slices at z ~ 1.2 and 2.2, with log(M*/M☉) ≥ 10.4 and log(SFR/(M☉ yr^-1)) ≥ 1.5. Including a correction for the incomplete coverage of the M*-SFR plane, and adopting a "Galactic" value for the CO-H2 conversion factor, we infer average gas fractions of ~0.33 at z ~ 1.2 and ~0.47 at z ~ 2.2. Gas fractions drop with stellar mass, in agreement with cosmological simulations including strong star formation feedback. Most of the z ~ 1-3 SFGs are rotationally supported turbulent disks. The sizes of CO and UV/optical emission are comparable. The molecular-gas-star-formation relation for the z = 1-3 SFGs is near-linear, with a ~0.7 Gyr gas depletion timescale; changes in depletion time are only a secondary effect. Since this timescale is much less than the Hubble time in all SFGs between z ~ 0 and 2, fresh gas must be supplied with a fairly high duty cycle over several billion years. At given z and M*, gas fractions correlate strongly with the specific star formation rate (sSFR). The variation of sSFR between z ~ 0 and 3 is mainly controlled by the fraction of baryonic mass that resides in cold gas.
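The two quantities at the heart of this analysis, the gas fraction and the depletion timescale, are simple ratios; a minimal Python sketch with hypothetical masses and star formation rate (illustrative values, not PHIBSS measurements):

```python
# Gas fraction and depletion timescale as defined in the abstract, computed
# from hypothetical values for a single z ~ 1-2 galaxy.

M_star = 5.0e10   # stellar mass in solar masses (hypothetical)
M_gas  = 3.0e10   # molecular gas mass in solar masses (hypothetical)
SFR    = 60.0     # star formation rate in solar masses per year (hypothetical)

gas_fraction = M_gas / (M_gas + M_star)   # fraction of baryonic mass in cold gas
depletion_time_Gyr = M_gas / SFR / 1e9    # time to exhaust the gas at the current SFR

print(f"f_gas = {gas_fraction:.2f}, t_depl = {depletion_time_Gyr:.2f} Gyr")
```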

Journal ArticleDOI
A. A. Abdo1, A. A. Abdo2, Marco Ajello3, Alice Allafort4 +254 more authors (60 institutions)
TL;DR: In this article, a catalog of gamma-ray pulsar detections using three years of data acquired by the Large Area Telescope (LAT) on the Fermi satellite is presented.
Abstract: This catalog summarizes 117 high-confidence > 0.1 GeV gamma-ray pulsar detections using three years of data acquired by the Large Area Telescope (LAT) on the Fermi satellite. Half are neutron stars discovered using LAT data, through periodicity searches in gamma-ray and radio data around LAT unassociated source positions. The 117 pulsars are evenly divided into three groups: millisecond pulsars, young radio-loud pulsars, and young radio-quiet pulsars. We characterize the pulse profiles and energy spectra and derive luminosities when distance information exists. Spectral analysis of the off-peak phase intervals indicates probable pulsar wind nebula emission for four pulsars, and off-peak magnetospheric emission for several young and millisecond pulsars. We compare the gamma-ray properties with those in the radio, optical, and X-ray bands. We provide flux limits for pulsars with no observed gamma-ray emission, highlighting a small number of gamma-faint, radio-loud pulsars. The large, varied gamma-ray pulsar sample constrains emission models. Fermi's selection biases complement those of radio surveys, enhancing comparisons with predicted population distributions.

Book
09 Jun 2013
TL;DR: The Lay Epistemic Framework as discussed by the authors is a theory of lay epistemics; its history and scope are described in detail in the chapter "Knowing All: A Theory of Lay Epistemics".
Abstract: 1. The Lay Epistemic Framework: Its History and Scope. 2. Knowing All: A Theory of Lay Epistemics. 3. Empirical Research in the Lay Epistemic Framework. 4. Unique and Nonunique Aspects of Attributions. 5. A Bridge to Consistency Theories. 6. Attitudes as Knowledge Structures. 7. Further Domains of Application: Social Comparison Processes and Minority-Influence Phenomena. 8. The Issue of Accuracy in Social Perception and Cognition. 9. Knowing How to Cure: Implications for Cognitive Therapy. 10. The Social Psychology of Science: On the Lay Epistemic Underpinnings of Research Methodology. References. Author Index.

Journal ArticleDOI
TL;DR: This synthesis reveals that pollinator persistence will depend on both the maintenance of high-quality habitats around farms and on local management practices that may offset impacts of intensive monoculture agriculture.
Abstract: Bees provide essential pollination services that are potentially affected both by local farm management and the surrounding landscape. To better understand these different factors, we modelled the relative effects of landscape composition (nesting and floral resources within foraging distances), landscape configuration (patch shape, interpatch connectivity and habitat aggregation) and farm management (organic vs. conventional and local-scale field diversity), and their interactions, on wild bee abundance and richness for 39 crop systems globally. Bee abundance and richness were higher in diversified and organic fields and in landscapes comprising more high-quality habitats; bee richness on conventional fields with low diversity benefited most from high-quality surrounding land cover. Landscape configuration effects were weak. Bee responses varied slightly by biome. Our synthesis reveals that pollinator persistence will depend on both the maintenance of high-quality habitats around farms and on local management practices that may offset impacts of intensive monoculture agriculture.

Journal ArticleDOI
TL;DR: The authors examined the short- and long-run average causal impact of catastrophic natural disasters on economic growth by combining information from comparative case studies and found that only extremely large disasters have a negative effect on output, both in the short and long run.
Abstract: This paper examines the short- and long-run average causal impact of catastrophic natural disasters on economic growth by combining information from comparative case studies. The counterfactual of the cases studied is assessed by constructing synthetic control groups, taking advantage of the fact that the timing of large sudden natural disasters is an exogenous event. It is found that only extremely large disasters have a negative effect on output, both in the short and long run. However, this result appears in two events where radical political revolutions followed the natural disasters. Once these political changes are controlled for, even extremely large disasters do not display any significant effect on economic growth. It is also found that smaller, but still very large, natural disasters have no discernible effect on output.
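A hedged sketch of the synthetic-control construction described above: non-negative weights over untreated comparison countries, constrained to sum to one, are chosen so that the weighted combination reproduces the treated country's pre-disaster output path. The data and the solver choice (SciPy's SLSQP) below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy synthetic-control weighting: fit the treated unit's pre-disaster GDP path
# with a convex combination of untreated "donor" units. All data are made up.

pre_treated = np.array([100.0, 103.0, 107.0, 110.0])   # treated unit, pre-disaster GDP index
pre_donors = np.array([                                 # donor units, same periods
    [ 98.0, 101.0, 105.0, 109.0],
    [102.0, 104.0, 106.0, 108.0],
    [ 95.0, 100.0, 104.0, 112.0],
]).T                                                     # shape: (periods, donors)

n_donors = pre_donors.shape[1]

def fit_error(w):
    return np.sum((pre_treated - pre_donors @ w) ** 2)

res = minimize(
    fit_error,
    x0=np.full(n_donors, 1.0 / n_donors),
    bounds=[(0.0, 1.0)] * n_donors,                      # non-negative weights
    constraints=[{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}],  # sum to one
    method="SLSQP",
)
weights = res.x
print("synthetic-control weights:", np.round(weights, 3))
# The counterfactual post-disaster path is the post-period donor outcomes @ weights.
```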

Journal ArticleDOI
Markus Ackermann, Marco Ajello1, Alice Allafort2, Luca Baldini3 +197 more authors (42 institutions)
15 Feb 2013-Science
TL;DR: The characteristic pion-decay feature is detected in the gamma-ray spectra of two SNRs, IC 443 and W44, with the Fermi Large Area Telescope, providing direct evidence that cosmic-ray protons are accelerated in SNRs.
Abstract: Cosmic rays are particles (mostly protons) accelerated to relativistic speeds. Despite wide agreement that supernova remnants (SNRs) are the sources of galactic cosmic rays, unequivocal evidence for the acceleration of protons in these objects is still lacking. When accelerated protons encounter interstellar material, they produce neutral pions, which in turn decay into gamma rays. This offers a compelling way to detect the acceleration sites of protons. The identification of pion-decay gamma rays has been difficult because high-energy electrons also produce gamma rays via bremsstrahlung and inverse Compton scattering. We detected the characteristic pion-decay feature in the gamma-ray spectra of two SNRs, IC 443 and W44, with the Fermi Large Area Telescope. This detection provides direct evidence that cosmic-ray protons are accelerated in SNRs.
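For context, the "characteristic pion-decay feature" is the well-known spectral signature of neutral-pion decay: each π0 decays into two photons, so the resulting gamma-ray spectrum is symmetric (in logarithmic energy) about half the pion rest energy,

$$E_\gamma = \tfrac{1}{2} m_{\pi^0} c^2 \approx 67.5\ \text{MeV},$$

producing a low-energy turnover (the "pion bump") that bremsstrahlung and inverse Compton emission from electrons do not reproduce.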

Book
21 Mar 2013
TL;DR: In this book, the authors give a systematic grounding in the theory of Hamiltonian differential equations from a dynamical systems point of view and develop a solid foundation for students to read some of the current research on Hamiltonian systems.
Abstract: This book gives a systematic grounding in the theory of Hamiltonian differential equations from a dynamical systems point of view. It develops a solid foundation for students to read some of the current research on Hamiltonian systems. Topics covered include a detailed discussion of linear Hamiltonian systems, an introduction to the theory of integrals and reduction, Poincaré's continuation of periodic solutions, normal forms, and applications of KAM theory. A chapter is devoted to the theory of twist maps and various extensions of the classic Poincaré-Birkhoff fixed point theorem.

Journal ArticleDOI
J. Aasi1, J. Abadie1, B. P. Abbott1, R. Abbott1 +745 more authors (73 institutions)
TL;DR: In this article, the authors inject squeezed states to improve the performance of one of the detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) beyond the quantum noise limit, most notably in the frequency region down to 150 Hz.
Abstract: Nearly a century after Einstein first predicted the existence of gravitational waves, a global network of Earth-based gravitational wave observatories [1-4] is seeking to directly detect this faint radiation using precision laser interferometry. Photon shot noise, due to the quantum nature of light, imposes a fundamental limit on the attometre-level sensitivity of the kilometre-scale Michelson interferometers deployed for this task. Here, we inject squeezed states to improve the performance of one of the detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) beyond the quantum noise limit, most notably in the frequency region down to 150 Hz, critically important for several astrophysical sources, with no deterioration of performance observed at any frequency. With the injection of squeezed states, this LIGO detector demonstrated the best broadband sensitivity to gravitational waves ever achieved, with important implications for observing the gravitational-wave Universe with unprecedented sensitivity.

Journal ArticleDOI
TL;DR: In this paper, a topological invariant for periodically driven systems of noninteracting particles is proposed, based on the analysis of the Floquet spectra of driven systems and the band structures of static Hamiltonians.
Abstract: Recently, several authors have investigated topological phenomena in periodically driven systems of noninteracting particles. These phenomena are identified through analogies between the Floquet spectra of driven systems and the band structures of static Hamiltonians. Intriguingly, these works have revealed phenomena that cannot be characterized by analogy to the topological classification framework for static systems. In particular, in driven systems in two dimensions (2D), robust chiral edge states can appear even though the Chern numbers of all the bulk Floquet bands are zero. Here, we elucidate the crucial distinctions between static and driven 2D systems, and construct a new topological invariant that yields the correct edge-state structure in the driven case. We provide formulations in both the time and frequency domains, which afford additional insight into the origins of the “anomalous” spectra that arise in driven systems. Possibilities for realizing these phenomena in solid-state and cold-atomic systems are discussed.
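Schematically, the invariant introduced here is, in its time-domain formulation, a winding number of the full Bloch evolution operator U(kx, ky, t) over the two-dimensional Brillouin zone and one driving period; up to normalization conventions it takes the form

$$W[U] = \frac{1}{8\pi^2}\int_0^T dt \int_{\mathrm{BZ}} d^2\mathbf{k}\;\mathrm{Tr}\!\left(U^{-1}\partial_t U\,\left[U^{-1}\partial_{k_x}U,\;U^{-1}\partial_{k_y}U\right]\right),$$

and, unlike the Chern numbers of the individual Floquet bands, it can be nonzero even when all of those Chern numbers vanish, which is what permits the anomalous chiral edge states discussed above.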

Journal ArticleDOI
M. G. Aartsen1, Rasha Abbasi2, Y. Abdou3, Markus Ackermann +284 more authors (36 institutions)
TL;DR: These two neutrino-induced events could be a first indication of an astrophysical neutrino flux; the moderate significance, however, does not permit a definitive conclusion at this time.
Abstract: We report on the observation of two neutrino-induced events which have an estimated deposited energy in the IceCube detector of 1.04 ± 0.16 and 1.14 ± 0.17 PeV, respectively, the highest neutrino energies observed so far. These events are consistent with fully contained particle showers induced by neutral-current ν_(e,μ,τ) (ν̄_(e,μ,τ)) or charged-current ν_e (ν̄_e) interactions within the IceCube detector. The events were discovered in a search for ultrahigh energy neutrinos using data corresponding to 615.9 days effective live time. The expected number of atmospheric background events is 0.082 ± 0.004 (stat) +0.041/−0.057 (syst). The probability of observing two or more candidate events under the atmospheric background-only hypothesis is 2.9 × 10^-3 (2.8σ), taking into account the uncertainty on the expected number of background events. These two events could be a first indication of an astrophysical neutrino flux; the moderate significance, however, does not permit a definitive conclusion at this time.
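As a back-of-the-envelope check of the quoted significance, the probability of two or more events from a Poisson background with mean 0.082 is easy to compute; the paper's 2.9 × 10^-3 additionally folds in the background uncertainty, so the numbers differ slightly:

```python
import math

# Probability of seeing >= 2 events from a Poisson background with mean 0.082,
# ignoring the systematic uncertainty on the background expectation.

background_mean = 0.082
p_at_least_two = 1.0 - math.exp(-background_mean) * (1.0 + background_mean)
print(f"P(>=2 | background only) ~ {p_at_least_two:.1e}")   # ~3.2e-3, close to the quoted 2.9e-3
```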

Journal ArticleDOI
TL;DR: In this article, an integrated framework based on telecoupling, an umbrella concept that refers to socioeconomic and environmental interactions over distances, is proposed to understand and integrate various distant interactions better.
Abstract: Interactions between distant places are increasingly widespread and influential, often leading to unexpected outcomes with profound implications for sustainability. Numerous sustainability studies have been conducted within a particular place with little attention to the impacts of distant interactions on sustainability in multiple places. Although distant forces have been studied, they are usually treated as exogenous variables and feedbacks have rarely been considered. To understand and integrate various distant interactions better, we propose an integrated framework based on telecoupling, an umbrella concept that refers to socioeconomic and environmental interactions over distances. The concept of telecoupling is a logical extension of research on coupled human and natural systems, in which interactions occur within particular geographic locations. The telecoupling framework contains five major interrelated components, i.e., coupled human and natural systems, flows, agents, causes, and effects. We illustrate the framework using two examples of distant interactions associated with trade of agricultural commodities and invasive species, highlight the implications of the framework, and discuss research needs and approaches to move research on telecouplings forward. The framework can help to analyze system components and their interrelationships, identify research gaps, detect hidden costs and untapped benefits, provide a useful means to incorporate feedbacks as well as trade-offs and synergies across multiple systems (sending, receiving, and spillover systems), and improve the understanding of distant interactions and the effectiveness of policies for socioeconomic and environmental sustainability from local to global levels.

Journal ArticleDOI
TL;DR: In this paper, the authors report on the AeroCom Phase II direct aerosol effect (DAE) experiment where 16 detailed global aerosol models have been used to simulate the changes in the aerosol distribution over the industrial era.
Abstract: We report on the AeroCom Phase II direct aerosol effect (DAE) experiment, where 16 detailed global aerosol models have been used to simulate the changes in the aerosol distribution over the industrial era. All 16 models have estimated the radiative forcing (RF) of the anthropogenic DAE, and have taken into account anthropogenic sulphate, black carbon (BC) and organic aerosols (OA) from fossil fuel, biofuel, and biomass burning emissions. In addition, several models have simulated the DAE of anthropogenic nitrate and anthropogenically influenced secondary organic aerosols (SOA). The model-simulated all-sky RF of the DAE from total anthropogenic aerosols ranges from −0.58 to −0.02 W m^-2, with a mean of −0.27 W m^-2 for the 16 models. Several models did not include nitrate or SOA, and modifying the estimate by accounting for this with information from the other AeroCom models reduces the range and slightly strengthens the mean. Modifying the model estimates for missing aerosol components and for the time period 1750 to 2010 results in a mean RF for the DAE of −0.35 W m^-2. Compared to AeroCom Phase I (Schulz et al., 2006) we find very similar spreads in both total DAE and aerosol component RF. However, the RF of the total DAE is more strongly negative, and the RF from BC from fossil fuel and biofuel emissions is more strongly positive, in the present study than in the previous AeroCom study. We find a tendency for models having a strong (positive) BC RF to also have strong (negative) sulphate or OA RF. This relationship leads to smaller uncertainty in the total RF of the DAE compared to the RF of the sum of the individual aerosol components. The spread in results for the individual aerosol components is substantial, and can be divided into diversities in burden, mass extinction coefficient (MEC), and normalized RF with respect to AOD. We find that these three factors give similar contributions to the spread in results.

Journal ArticleDOI
TL;DR: In this article, the electrochemical performance of mesoporous carbon (C)/tin (Sn) anodes in Na-ion and Li-ion batteries is systematically investigated, showing that the desodiation potential of Sn anodes is approximately 0.21 V lower than the delithiation potential, and that performance is limited by the large Na-ion size and the large volume change of the porous C/Sn composite anode during alloy/dealloy reactions.
Abstract: The electrochemical performance of mesoporous carbon (C)/tin (Sn) anodes in Na-ion and Li-ion batteries is systematically investigated. The mesoporous C/Sn anodes in a Na-ion battery show similar cycling stability but lower capacity and poorer rate capability than in a Li-ion battery. The desodiation potentials of Sn anodes are approximately 0.21 V lower than the delithiation potentials. The low capacity and poor rate capability of the C/Sn anode in Na-ion batteries are mainly due to the large Na-ion size, resulting in slow Na-ion diffusion and a large volume change of the porous C/Sn composite anode during alloy/dealloy reactions. Understanding of the reaction mechanism between Sn and Na ions will provide insight towards exploring and designing new alloy-based anode materials for Na-ion batteries.

Journal ArticleDOI
TL;DR: This research commentary recommends a series of actions the researcher can take to mitigate the p-value problem in large samples and illustrates them with an example of over 300,000 camera sales on eBay.
Abstract: The Internet has provided IS researchers with the opportunity to conduct studies with extremely large samples, frequently well over 10,000 observations. There are many advantages to large samples, but researchers using statistical inference must be aware of the p-value problem associated with them. In very large samples, p-values go quickly to zero, and solely relying on p-values can lead the researcher to claim support for results of no practical significance. In a survey of large sample IS research, we found that a significant number of papers rely on a low p-value and the sign of a regression coefficient alone to support their hypotheses. This research commentary recommends a series of actions the researcher can take to mitigate the p-value problem in large samples and illustrates them with an example of over 300,000 camera sales on eBay. We believe that addressing the p-value problem will increase the credibility of large sample IS research as well as provide more insights for readers.
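A small simulation makes the point concrete: with hundreds of thousands of observations, a practically negligible difference is still "statistically significant", which is why the commentary recommends reporting effect sizes and practical significance alongside p-values. The data below are simulated for illustration, not the eBay camera sales analyzed in the paper.

```python
import numpy as np
from scipy import stats

# Large-sample p-value problem: n = 300,000 per group, trivially small effect.

rng = np.random.default_rng(0)
n = 300_000
group_a = rng.normal(loc=100.0, scale=15.0, size=n)
group_b = rng.normal(loc=100.2, scale=15.0, size=n)   # difference of 0.2, ~1.3% of one SD

t_stat, p_value = stats.ttest_ind(group_a, group_b)
cohens_d = (group_b.mean() - group_a.mean()) / 15.0    # effect size on the known scale

print(f"p = {p_value:.1e}, Cohen's d = {cohens_d:.3f}")
# The p-value is tiny even though the effect is trivially small.
```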

Journal ArticleDOI
07 Feb 2013-Nature
TL;DR: The current experimental and theoretical status of spin–orbit coupling in ultracold atomic systems is outlined, discussing unique features that enable physics impossible in any other known setting.
Abstract: Spin–orbit coupling links a particle's velocity to its quantum-mechanical spin, and is essential in numerous condensed matter phenomena, including topological insulators and Majorana fermions. In solid-state materials, spin–orbit coupling originates from the movement of electrons in a crystal's intrinsic electric field, which is uniquely prescribed in any given material. In contrast, for ultracold atomic systems, the engineered 'material parameters' are tunable: a variety of synthetic spin–orbit couplings can be engineered on demand using laser fields. Here we outline the current experimental and theoretical status of spin–orbit coupling in ultracold atomic systems, discussing unique features that enable physics impossible in any other known setting.