
Showing papers by "Max Planck Society" published in 2013


Journal ArticleDOI
06 Jun 2013-Cell
TL;DR: Nine tentative hallmarks that represent common denominators of aging in different organisms are enumerated, with special emphasis on mammalian aging, to identify pharmaceutical targets to improve human health during aging, with minimal side effects.

9,980 citations


Journal ArticleDOI
TL;DR: Astropy as discussed by the authors is a Python package for astronomy-related functionality, including support for domain-specific file formats such as flexible image transport system (FITS) files, Virtual Observatory (VO) tables, common ASCII table formats, unit and physical quantity conversions, physical constants specific to astronomy, celestial coordinate and time transformations, world coordinate system (WCS) support, generalized containers for representing gridded as well as tabular data, and a framework for cosmological transformations and conversions.
Abstract: We present the first public version (v0.2) of the open-source and community-developed Python package, Astropy. This package provides core astronomy-related functionality to the community, including support for domain-specific file formats such as flexible image transport system (FITS) files, Virtual Observatory (VO) tables, and common ASCII table formats, unit and physical quantity conversions, physical constants specific to astronomy, celestial coordinate and time transformations, world coordinate system (WCS) support, generalized containers for representing gridded as well as tabular data, and a framework for cosmological transformations and conversions. Significant functionality is under active development, such as a model fitting framework, VO client and server tools, and aperture and point spread function (PSF) photometry tools. The core development team is actively making additions and enhancements to the current code base, and we encourage anyone interested to participate in the development of future Astropy versions.
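A minimal usage sketch of the kind of functionality listed above; it uses the present-day Astropy interface, which differs in places from the v0.2 API the paper describes (the high-level SkyCoord class, for instance, postdates v0.2):

```python
# Minimal sketch of the functionality described above, using the modern
# Astropy API (details differ from the v0.2 release the paper presents).
from astropy import units as u
from astropy.coordinates import SkyCoord

# Unit and physical quantity conversion
v = 30 * u.km / u.s
print(v.to(u.pc / u.Myr))           # ~30.7 pc / Myr

# Celestial coordinate transformation: ICRS -> Galactic
c = SkyCoord(ra=10.68458 * u.deg, dec=41.26917 * u.deg, frame="icrs")
print(c.galactic)                    # Galactic longitude/latitude of M31
```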

9,720 citations


Journal ArticleDOI
18 Jul 2013-Nature
TL;DR: A sequential deposition method for the formation of the perovskite pigment within the porous metal oxide film that greatly increases the reproducibility of their performance and allows the fabrication of solid-state mesoscopic solar cells with unprecedented power conversion efficiencies and high stability.
Abstract: Following pioneering work, solution-processable organic-inorganic hybrid perovskites-such as CH3NH3PbX3 (X = Cl, Br, I)-have attracted attention as light-harvesting materials for mesoscopic solar cells. So far, the perovskite pigment has been deposited in a single step onto mesoporous metal oxide films using a mixture of PbX2 and CH3NH3X in a common solvent. However, the uncontrolled precipitation of the perovskite produces large morphological variations, resulting in a wide spread of photovoltaic performance in the resulting devices, which hampers the prospects for practical applications. Here we describe a sequential deposition method for the formation of the perovskite pigment within the porous metal oxide film. PbI2 is first introduced from solution into a nanoporous titanium dioxide film and subsequently transformed into the perovskite by exposing it to a solution of CH3NH3I. We find that the conversion occurs within the nanoporous host as soon as the two components come into contact, permitting much better control over the perovskite morphology than is possible with the previously employed route. Using this technique for the fabrication of solid-state mesoscopic solar cells greatly increases the reproducibility of their performance and allows us to achieve a power conversion efficiency of approximately 15 per cent (measured under standard AM1.5G test conditions of solar zenith angle, solar light intensity and cell temperature). This two-step method should provide new opportunities for the fabrication of solution-processed photovoltaic cells with unprecedented power conversion efficiencies and high stability equal to or even greater than those of today's best thin-film photovoltaic devices.

8,427 citations


Journal ArticleDOI
TL;DR: In this article, the authors make a case for the importance of reporting variance explained (R2) as a relevant summarizing statistic of mixed-effects models; such reporting is rare, even though R2 is routinely reported for linear models and also generalized linear models (GLM).
Abstract: The use of both linear and generalized linear mixed-effects models (LMMs and GLMMs) has become popular not only in social and medical sciences, but also in biological sciences, especially in the field of ecology and evolution. Information criteria, such as Akaike Information Criterion (AIC), are usually presented as model comparison tools for mixed-effects models. The presentation of ‘variance explained’ (R2) as a relevant summarizing statistic of mixed-effects models, however, is rare, even though R2 is routinely reported for linear models (LMs) and also generalized linear models (GLMs). R2 has the extremely useful property of providing an absolute value for the goodness-of-fit of a model, which cannot be given by the information criteria. As a summary statistic that describes the amount of variance explained, R2 can also be a quantity of biological interest. One reason for the under-appreciation of R2 for mixed-effects models lies in the fact that R2 can be defined in a number of ways. Furthermore, most definitions of R2 for mixed-effects models have theoretical problems (e.g. decreased or negative R2 values in larger models) and/or their use is hindered by practical difficulties (e.g. implementation). Here, we make a case for the importance of reporting R2 for mixed-effects models. We first provide the common definitions of R2 for LMs and GLMs and discuss the key problems associated with calculating R2 for mixed-effects models. We then recommend a general and simple method for calculating two types of R2 (marginal and conditional R2) for both LMMs and GLMMs, which are less susceptible to common problems. This method is illustrated by examples and can be widely employed by researchers in any field of research, regardless of software packages used for fitting mixed-effects models. The proposed method has the potential to facilitate the presentation of R2 for a wide range of circumstances.
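For a random-intercept LMM, the two quantities recommended in the paper are usually written as follows (a sketch in standard notation; the paper itself gives the general definitions for LMMs and GLMMs):

```latex
% Marginal and conditional R^2 for a random-intercept LMM
% (\sigma_f^2: variance of the fixed-effect predictions,
%  \sigma_\alpha^2: random-intercept variance, \sigma_\varepsilon^2: residual variance)
\[
R^2_{\mathrm{marginal}} =
  \frac{\sigma_f^2}{\sigma_f^2 + \sigma_\alpha^2 + \sigma_\varepsilon^2},
\qquad
R^2_{\mathrm{conditional}} =
  \frac{\sigma_f^2 + \sigma_\alpha^2}{\sigma_f^2 + \sigma_\alpha^2 + \sigma_\varepsilon^2}.
\]
```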

7,749 citations


Journal ArticleDOI
TL;DR: A novel dataset captured from a VW station wagon for use in mobile robotics and autonomous driving research, using a variety of sensor modalities such as high-resolution color and grayscale stereo cameras and a high-precision GPS/IMU inertial navigation system.
Abstract: We present a novel dataset captured from a VW station wagon for use in mobile robotics and autonomous driving research. In total, we recorded 6 hours of traffic scenarios at 10-100 Hz using a variety of sensor modalities such as high-resolution color and grayscale stereo cameras, a Velodyne 3D laser scanner and a high-precision GPS/IMU inertial navigation system. The scenarios are diverse, capturing real-world traffic situations, and range from freeways over rural areas to inner-city scenes with many static and dynamic objects. Our data is calibrated, synchronized and timestamped, and we provide the rectified and raw image sequences. Our dataset also contains object labels in the form of 3D tracklets, and we provide online benchmarks for stereo, optical flow, object detection and other tasks. This paper describes our recording platform, the data format and the utilities that we provide.
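The calibrated, synchronized sensor streams are typically combined through a standard pinhole camera model; the sketch below illustrates that projection in numpy with placeholder matrix names, not the dataset's actual calibration keys or development kit.

```python
# Schematic numpy sketch of how laser points are related to camera pixels in a
# calibrated, rectified setup like the one described above.
# The matrix names are placeholders, not the dataset's actual file keys.
import numpy as np

def project_to_image(points_xyz, T_velo_to_cam, R_rect, P_rect):
    """Project (N, 3) laser points to (N, 2) pixel coordinates.

    T_velo_to_cam : (4, 4) rigid transform, laser frame -> camera frame
    R_rect        : (4, 4) rectifying rotation (3x3 padded to homogeneous form)
    P_rect        : (3, 4) projection matrix of the rectified camera
    """
    pts_h = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])  # (N, 4)
    cam = R_rect @ T_velo_to_cam @ pts_h.T                          # (4, N)
    img = P_rect @ cam                                              # (3, N)
    in_front = img[2] > 0              # keep only points ahead of the camera
    return (img[:2, in_front] / img[2, in_front]).T                 # perspective divide
```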

7,153 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present cosmological parameter constraints based on the final nine-year WMAP data, in conjunction with a number of additional cosmology data sets.
Abstract: We present cosmological parameter constraints based on the final nine-year WMAP data, in conjunction with a number of additional cosmological data sets. The WMAP data alone, and in combination, continue to be remarkably well fit by a six-parameter ΛCDM model. When WMAP data are combined with measurements of the high-l cosmic microwave background (CMB) anisotropy, the baryon acoustic oscillation (BAO) scale, and the Hubble constant, the matter and energy densities, Ω_b h², Ω_c h², and Ω_Λ, are each determined to a precision of 1.5%. The amplitude of the primordial spectrum is measured to within 3%, and there is now evidence for a tilt in the primordial spectrum at the 5σ level, confirming the first detection of tilt based on the five-year WMAP data. At the end of the WMAP mission, the nine-year data decrease the allowable volume of the six-dimensional ΛCDM parameter space by a factor of 68,000 relative to pre-WMAP measurements. We investigate a number of data combinations and show that their ΛCDM parameter fits are consistent. New limits on deviations from the six-parameter model are presented, for example: the fractional contribution of tensor modes is limited to r < 0.13 (95% CL); the spatial curvature parameter is limited to Ω_k = −0.0027 +0.0039/−0.0038; the summed mass of neutrinos is limited to Σm_ν < 0.44 eV (95% CL); and the number of relativistic species is found to lie within N_eff = 3.84 ± 0.40, when the full data are analyzed. The joint constraint on N_eff and the primordial helium abundance, Y_He, agrees with the prediction of standard Big Bang nucleosynthesis. We compare recent Planck measurements of the Sunyaev–Zel'dovich effect with our seven-year measurements, and show their mutual agreement. Our analysis of the polarization pattern around temperature extrema is updated. This confirms a fundamental prediction of the standard cosmological model and provides a striking illustration of acoustic oscillations and adiabatic initial conditions in the early universe. Subject headings: cosmic microwave background, cosmology: observations, early universe, dark matter, space vehicles, space vehicles: instruments, instrumentation: detectors, telescopes

5,488 citations


Journal ArticleDOI
TL;DR: The results of this study may be used as a guideline for selecting primer pairs with the best overall coverage and phylum spectrum for specific applications, therefore reducing the bias in PCR-based microbial diversity studies.
Abstract: 16S ribosomal RNA gene (rDNA) amplicon analysis remains the standard approach for the cultivation-independent investigation of microbial diversity. The accuracy of these analyses depends strongly on the choice of primers. The overall coverage and phylum spectrum of 175 primers and 512 primer pairs were evaluated in silico with respect to the SILVA 16S/18S rDNA non-redundant reference dataset (SSURef 108 NR). Based on this evaluation a selection of 'best available' primer pairs for Bacteria and Archaea for three amplicon size classes (100-400, 400-1000, ≥ 1000 bp) is provided. The most promising bacterial primer pair (S-D-Bact-0341-b-S-17/S-D-Bact-0785-a-A-21), with an amplicon size of 464 bp, was experimentally evaluated by comparing the taxonomic distribution of the 16S rDNA amplicons with 16S rDNA fragments from directly sequenced metagenomes. The results of this study may be used as a guideline for selecting primer pairs with the best overall coverage and phylum spectrum for specific applications, therefore reducing the bias in PCR-based microbial diversity studies.
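As an illustration of the kind of in-silico evaluation described, the toy sketch below scores how many reference sequences contain an exact match to a degenerate IUPAC primer; the primer and reference sequences are made up for the example and are not taken from the study or from SILVA.

```python
# Toy sketch of in-silico primer evaluation: count how many reference
# sequences contain an exact match to a degenerate (IUPAC) primer.
# The primer and the reference sequences below are illustrative only.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
    "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
}

def primer_matches(primer, window):
    # A window matches if every base is allowed by the corresponding IUPAC code.
    return len(primer) == len(window) and all(
        base in IUPAC[code] for code, base in zip(primer, window)
    )

def coverage(primer, references):
    # Fraction of reference sequences with at least one matching window.
    hits = sum(
        any(primer_matches(primer, seq[i:i + len(primer)])
            for i in range(len(seq) - len(primer) + 1))
        for seq in references
    )
    return hits / len(references)

refs = ["CCTACGGGAGGCAGCAGTGGG", "CCTACGGGTGGCAGCAGAACG", "TTTACGGGAGGCAGCAGGGGG"]
print(coverage("CCTACGGGNGGCWGCAG", refs))   # fraction of references matched
```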

5,346 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provided an assessment of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed phase, and ice clouds; and deposition on snow and ice.
Abstract: Black carbon aerosol plays a unique and important role in Earth's climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr−1 in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m−2 with 90% uncertainty bounds of (+0.08, +1.27) W m−2. Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m−2. Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m−2 with 90% uncertainty bounds of +0.17 to +2.1 W m−2. Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m−2, is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m−2 during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m−2 with 90% uncertainty bounds of −1.45 to +1.29 W m−2). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon. 
In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility for near-term mitigation. This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.

4,591 citations


Journal ArticleDOI
Kerstin Howe, Matthew D. Clark, Carlos Torroja, +171 more (11 institutions)
25 Apr 2013-Nature
TL;DR: A high-quality sequence assembly of the zebrafish genome is generated, made up of an overlapping set of completely sequenced large-insert clones that were ordered and oriented using a high-resolution high-density meiotic map, providing a clearer understanding of key genomic features such as a unique repeat content, a scarcity of pseudogenes, an enrichment of zebrafish-specific genes on chromosome 4 and chromosomal regions that influence sex determination.
Abstract: Zebrafish have become a popular organism for the study of vertebrate gene function. The virtually transparent embryos of this species, and the ability to accelerate genetic studies by gene knockdown or overexpression, have led to the widespread use of zebrafish in the detailed investigation of vertebrate gene function and increasingly, the study of human genetic disease. However, for effective modelling of human genetic disease it is important to understand the extent to which zebrafish genes and gene structures are related to orthologous human genes. To examine this, we generated a high-quality sequence assembly of the zebrafish genome, made up of an overlapping set of completely sequenced large-insert clones that were ordered and oriented using a high-resolution high-density meiotic map. Detailed automatic and manual annotation provides evidence of more than 26,000 protein-coding genes, the largest gene set of any vertebrate so far sequenced. Comparison to the human reference genome shows that approximately 70% of human genes have at least one obvious zebrafish orthologue. In addition, the high quality of this genome assembly provides a clearer understanding of key genomic features such as a unique repeat content, a scarcity of pseudogenes, an enrichment of zebrafish-specific genes on chromosome 4 and chromosomal regions that influence sex determination.

3,573 citations


Journal ArticleDOI
26 Apr 2013-Science
TL;DR: Pulsar J0348+0432 is only the second neutron star with a precisely determined mass of 2 M☉
Abstract: Many physically motivated extensions to general relativity (GR) predict significant deviations at energies present in massive neutron stars. We report the measurement of a 2.01 ± 0.04 solar mass (M☉) pulsar in a 2.46-h orbit around a 0.172 ± 0.003 M☉ white dwarf. The high pulsar mass and the compact orbit make this system a sensitive laboratory of a previously untested strong-field gravity regime. Thus far, the observed orbital decay agrees with GR, supporting its validity even for the extreme conditions present in the system. The resulting constraints on deviations support the use of GR-based templates for ground-based gravitational wave detection experiments. Additionally, the system strengthens recent constraints on the properties of dense matter and provides novel insight into binary stellar astrophysics and pulsar recycling.

3,224 citations


Proceedings ArticleDOI
02 Dec 2013
TL;DR: This paper lifts two state-of-the-art 2D object representations to 3D, on the level of both local feature appearance and location, and shows their efficacy for estimating 3D geometry from images via ultra-wide baseline matching and 3D reconstruction.
Abstract: While 3D object representations are being revived in the context of multi-view object class detection and scene understanding, they have not yet attained wide-spread use in fine-grained categorization. State-of-the-art approaches achieve remarkable performance when training data is plentiful, but they are typically tied to flat, 2D representations that model objects as a collection of unconnected views, limiting their ability to generalize across viewpoints. In this paper, we therefore lift two state-of-the-art 2D object representations to 3D, on the level of both local feature appearance and location. In extensive experiments on existing and newly proposed datasets, we show our 3D object representations outperform their state-of-the-art 2D counterparts for fine-grained categorization and demonstrate their efficacy for estimating 3D geometry from images via ultra-wide baseline matching and 3D reconstruction.

Journal ArticleDOI
TL;DR: The final nine-year maps and basic results from the Wilkinson Microwave Anisotropy Probe (WMAP) mission are presented in this paper, where the authors present a highly constrained Lambda-CDM cosmological model with precise and accurate parameters.
Abstract: We present the final nine-year maps and basic results from the Wilkinson Microwave Anisotropy Probe (WMAP) mission. The full nine-year analysis of the time-ordered data provides updated characterizations and calibrations of the experiment. We also provide new nine-year full sky temperature maps that were processed to reduce the asymmetry of the effective beams. Temperature and polarization sky maps are examined to separate cosmic microwave background (CMB) anisotropy from foreground emission, and both types of signals are analyzed in detail. We provide new point source catalogs as well as new diffuse and point source foreground masks. An updated template-removal process is used for cosmological analysis; new foreground fits are performed, and new foreground-reduced maps are presented. We now implement an optimal C⁻¹ weighting to compute the temperature angular power spectrum. The WMAP mission has resulted in a highly constrained Lambda-CDM cosmological model with precise and accurate parameters in agreement with a host of other cosmological measurements. When WMAP data are combined with finer scale CMB, baryon acoustic oscillation, and Hubble constant measurements, we find that big bang nucleosynthesis is well supported and there is no compelling evidence for a non-standard number of neutrino species (N_eff = 3.84 ± 0.40). The model fit also implies that the age of the universe is t_0 = 13.772 ± 0.059 Gyr, and the fit Hubble constant is H_0 = 69.32 ± 0.80 km/s/Mpc. Inflation is also supported: the fluctuations are adiabatic, with Gaussian random phases; the detection of a deviation of the scalar spectral index from unity, reported earlier by the WMAP team, now has high statistical significance (n_s = 0.9608 ± 0.0080); and the universe is close to flat/Euclidean (Ω_k = −0.0027 +0.0039/−0.0038). Overall, the WMAP mission has resulted in a reduction of the cosmological parameter volume by a factor of 68,000 for the standard six-parameter Lambda-CDM model, based on CMB data alone. For a model including tensors, the allowed seven-parameter volume has been reduced by a factor of 117,000. Other cosmological observations are in accord with the CMB predictions, and the combined data reduce the cosmological parameter volume even further. With no significant anomalies and an adequate goodness of fit, the inflationary flat Lambda-CDM model and its precise and accurate parameters rooted in WMAP data stand as the standard model of cosmology.
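As a side note, the nine-year WMAP parameter set is bundled with the Astropy package described elsewhere on this page, so the headline numbers can be checked in a few lines (a minimal sketch, assuming a reasonably recent Astropy release):

```python
# Sketch: the nine-year WMAP cosmology ships with Astropy, so headline numbers
# such as H0 and the age of the universe can be reproduced directly.
from astropy.cosmology import WMAP9
import astropy.units as u

print(WMAP9.H0)                     # ~69.32 km / (Mpc s)
print(WMAP9.age(0).to(u.Gyr))       # ~13.77 Gyr, matching the value quoted above
print(WMAP9.Om0, WMAP9.Ode0)        # matter and dark-energy density parameters
```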

Journal ArticleDOI
TL;DR: This article attempts to strengthen the links between the two research communities by providing a survey of work in reinforcement learning for behavior generation in robots, highlighting both key challenges in robot reinforcement learning and notable successes.
Abstract: Reinforcement learning offers to robotics a framework and set of tools for the design of sophisticated and hard-to-engineer behaviors. Conversely, the challenges of robotic problems provide both inspiration, impact, and validation for developments in reinforcement learning. The relationship between disciplines has sufficient promise to be likened to that between physics and mathematics. In this article, we attempt to strengthen the links between the two research communities by providing a survey of work in reinforcement learning for behavior generation in robots. We highlight both key challenges in robot reinforcement learning as well as notable successes. We discuss how contributions tamed the complexity of the domain and study the role of algorithms, representations, and prior knowledge in achieving these successes. As a result, a particular focus of our paper lies on the choice between model-based and model-free as well as between value-function-based and policy-search methods. By analyzing a simple problem in some detail we demonstrate how reinforcement learning approaches may be profitably applied, and we note throughout open questions and the tremendous potential for future research.
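For readers unfamiliar with the distinctions the survey draws (model-based versus model-free, value-function-based versus policy search), the minimal sketch below shows tabular Q-learning, a model-free value-function method, on a toy chain task; it is illustrative only and not an example from the paper.

```python
# Minimal tabular Q-learning on a toy 5-state chain: a value-function-based,
# model-free method of the kind the survey contrasts with policy search.
import random

N_STATES, ACTIONS = 5, (-1, +1)           # move left or right; reward at the right end
alpha, gamma, eps, episodes = 0.1, 0.95, 0.1, 500
Q = [[0.0, 0.0] for _ in range(N_STATES)]

for _ in range(episodes):
    s = 0
    while s != N_STATES - 1:
        # epsilon-greedy action selection
        a = random.randrange(2) if random.random() < eps else Q[s].index(max(Q[s]))
        s_next = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        r = 1.0 if s_next == N_STATES - 1 else 0.0
        Q[s][a] += alpha * (r + gamma * max(Q[s_next]) - Q[s][a])   # TD update
        s = s_next

print([q.index(max(q)) for q in Q[:-1]])  # greedy policy: should prefer "right" (1)
```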

Journal ArticleDOI
TL;DR: The plant microbiota emerges as a fundamental trait that includes mutualism enabled through diverse biochemical mechanisms, as revealed by studies on plant growth- Promoting and plant health-promoting bacteria.
Abstract: Plants host distinct bacterial communities on and inside various plant organs, of which those associated with roots and the leaf surface are best characterized. The phylogenetic composition of these communities is defined by relatively few bacterial phyla, including Actinobacteria, Bacteroidetes, Firmicutes, and Proteobacteria. A synthesis of available data suggests a two-step selection process by which the bacterial microbiota of roots is differentiated from the surrounding soil biome. Rhizodeposition appears to fuel an initial substrate-driven community shift in the rhizosphere, which converges with host genotype–dependent finetuning of microbiota profiles in the selection of root endophyte assemblages. Substrate-driven selection also underlies the establishment of phyllosphere communities but takes place solely at the immediate leaf surface. Both the leaf and root microbiota contain bacteria that provide indirect pathogen protection, but root microbiota members appear to serve additional host functions through the acquisition of nutrients from soil for plant growth. Thus, the plant microbiota emerges as a fundamental trait that includes mutualism enabled through diverse biochemical mechanisms, as revealed by studies on plant growth–promoting and plant health–promoting bacteria.

Journal ArticleDOI
TL;DR: Recent technological and intellectual advances are highlighted that have changed thinking about five questions, including how bacteria have facilitated the origin and evolution of animals; how animals and bacteria affect each other's genomes; how normal animal development depends on bacterial partners; and how homeostasis is maintained between animals and their symbionts.
Abstract: In the last two decades, the widespread application of genetic and genomic approaches has revealed a bacterial world astonishing in its ubiquity and diversity. This review examines how a growing knowledge of the vast range of animal–bacterial interactions, whether in shared ecosystems or intimate symbioses, is fundamentally altering our understanding of animal biology. Specifically, we highlight recent technological and intellectual advances that have changed our thinking about five questions: how have bacteria facilitated the origin and evolution of animals; how do animals and bacteria affect each other’s genomes; how does normal animal development depend on bacterial partners; how is homeostasis maintained between animals and their symbionts; and how can ecological approaches deepen our understanding of the multiple levels of animal–bacterial interaction. As answers to these fundamental questions emerge, all biologists will be challenged to broaden their appreciation of these interactions and to include investigations of the relationships between and among bacteria and their animal partners as we seek a better understanding of the natural world.

Journal ArticleDOI
M. P. van Haarlem, Michael W. Wise, A. W. Gunst, +219 more (27 institutions)
TL;DR: In this article, the authors describe LOFAR, from the astronomical possibilities offered by the new telescope to a more detailed technical description of the instrument.
Abstract: LOFAR, the LOw-Frequency ARray, is a new-generation radio interferometer constructed in the north of the Netherlands and across Europe. Utilizing a novel phased-array design, LOFAR covers the largely unexplored low-frequency range from 10-240 MHz and provides a number of unique observing capabilities. Spreading out from a core located near the village of Exloo in the northeast of the Netherlands, a total of 40 LOFAR stations are nearing completion. A further five stations have been deployed throughout Germany, and one station has been built in each of France, Sweden, and the UK. Digital beam-forming techniques make the LOFAR system agile and allow for rapid repointing of the telescope as well as the potential for multiple simultaneous observations. With its dense core array and long interferometric baselines, LOFAR achieves unparalleled sensitivity and angular resolution in the low-frequency radio regime. The LOFAR facilities are jointly operated by the International LOFAR Telescope (ILT) foundation, as an observatory open to the global astronomical community. LOFAR is one of the first radio observatories to feature automated processing pipelines to deliver fully calibrated science products to its user community. LOFAR's new capabilities, techniques and modus operandi make it an important pathfinder for the Square Kilometre Array (SKA). We give an overview of the LOFAR instrument, its major hardware and software components, and the core science objectives that have driven its design. In addition, we present a selection of new results from the commissioning phase of this new radio observatory.
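A back-of-the-envelope sketch of why the long baselines matter at these frequencies, using the diffraction limit θ ≈ λ/B; the baseline lengths below are illustrative values rather than the array's exact configuration:

```python
# Back-of-the-envelope: diffraction-limited angular resolution theta ~ lambda/B
# across the 10-240 MHz range. Baseline lengths are illustrative values only.
from math import degrees

C = 299_792_458.0  # speed of light, m/s

def resolution_arcsec(freq_hz, baseline_m):
    return degrees(C / freq_hz / baseline_m) * 3600.0

for freq_mhz in (30, 150, 240):
    for baseline_km in (2, 100, 1000):      # roughly: core, remote, international
        print(f"{freq_mhz:3d} MHz, {baseline_km:5d} km baseline: "
              f"{resolution_arcsec(freq_mhz * 1e6, baseline_km * 1e3):8.1f} arcsec")
```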

Journal ArticleDOI
S. Hong Lee, Stephan Ripke, Benjamin M. Neale, +402 more (124 institutions)
TL;DR: Empirical evidence of shared genetic etiology for psychiatric disorders can inform nosology and encourages the investigation of common pathophysiologies for related disorders.
Abstract: Most psychiatric disorders are moderately to highly heritable. The degree to which genetic variation is unique to individual disorders or shared across disorders is unclear. To examine shared genetic etiology, we use genome-wide genotype data from the Psychiatric Genomics Consortium (PGC) for cases and controls in schizophrenia, bipolar disorder, major depressive disorder, autism spectrum disorders (ASD) and attention-deficit/hyperactivity disorder (ADHD). We apply univariate and bivariate methods for the estimation of genetic variation within and covariation between disorders. SNPs explained 17-29% of the variance in liability. The genetic correlation calculated using common SNPs was high between schizophrenia and bipolar disorder (0.68 ± 0.04 s.e.), moderate between schizophrenia and major depressive disorder (0.43 ± 0.06 s.e.), bipolar disorder and major depressive disorder (0.47 ± 0.06 s.e.), and ADHD and major depressive disorder (0.32 ± 0.07 s.e.), low between schizophrenia and ASD (0.16 ± 0.06 s.e.) and non-significant for other pairs of disorders as well as between psychiatric disorders and the negative control of Crohn's disease. This empirical evidence of shared genetic etiology for psychiatric disorders can inform nosology and encourages the investigation of common pathophysiologies for related disorders.

Journal ArticleDOI
TL;DR: The physics of DW-MRI is reviewed, the currently preferred methodology is indicated, and the limits of interpretation of its results are explained, together with a list of 'Do's and Don'ts' that define good practice in this expanding area of imaging neuroscience.

Journal ArticleDOI
TL;DR: Astropy as mentioned in this paper provides core astronomy-related functionality to the community, including support for domain-specific file formats such as Flexible Image Transport System (FITS) files, Virtual Observatory (VO) tables, and common ASCII table formats, unit and physical quantity conversions, physical constants specific to astronomy, celestial coordinate and time transformations, world coordinate system (WCS) support, generalized containers for representing gridded as well as tabular data, and a framework for cosmological transformations and conversions.
Abstract: We present the first public version (v0.2) of the open-source and community-developed Python package, Astropy. This package provides core astronomy-related functionality to the community, including support for domain-specific file formats such as Flexible Image Transport System (FITS) files, Virtual Observatory (VO) tables, and common ASCII table formats, unit and physical quantity conversions, physical constants specific to astronomy, celestial coordinate and time transformations, world coordinate system (WCS) support, generalized containers for representing gridded as well as tabular data, and a framework for cosmological transformations and conversions. Significant functionality is under active development, such as a model fitting framework, VO client and server tools, and aperture and point spread function (PSF) photometry tools. The core development team is actively making additions and enhancements to the current code base, and we encourage anyone interested to participate in the development of future Astropy versions.

Journal ArticleDOI
TL;DR: The Baryon Oscillation Spectroscopic Survey (BOSS) as discussed by the authors was designed to measure the scale of baryon acoustic oscillations (BAO) in the clustering of matter over a larger volume than the combined efforts of all previous spectroscopic surveys of large-scale structure.
Abstract: The Baryon Oscillation Spectroscopic Survey (BOSS) is designed to measure the scale of baryon acoustic oscillations (BAO) in the clustering of matter over a larger volume than the combined efforts of all previous spectroscopic surveys of large-scale structure. BOSS uses 1.5 million luminous galaxies as faint as i = 19.9 over 10,000 deg² to measure BAO to redshifts z < 0.7. Observations of neutral hydrogen in the Lyα forest in more than 150,000 quasar spectra (g < 22) will constrain BAO over the redshift range 2.15 < z < 3.5. Early results from BOSS include the first detection of the large-scale three-dimensional clustering of the Lyα forest and a strong detection from the Data Release 9 data set of the BAO in the clustering of massive galaxies at an effective redshift z = 0.57. We project that BOSS will yield measurements of the angular diameter distance d_A to an accuracy of 1.0% at redshifts z = 0.3 and z = 0.57 and measurements of H(z) to 1.8% and 1.7% at the same redshifts. Forecasts for Lyα forest constraints predict a measurement of an overall dilation factor that scales the highly degenerate d_A(z) and H⁻¹(z) parameters to an accuracy of 1.9% at z ~ 2.5 when the survey is complete. Here, we provide an overview of the selection of spectroscopic targets, planning of observations, and analysis of data and data quality of BOSS.

Journal ArticleDOI
26 Sep 2013-Nature
TL;DR: Sequencing and deep analysis of messenger RNA and microRNA from lymphoblastoid cell lines of 462 individuals from the 1000 Genomes Project (the first uniformly processed high-throughput RNA-sequencing data from multiple human populations with high-quality genome sequences) reveals extremely widespread genetic variation affecting the regulation of most genes.
Abstract: Genome sequencing projects are discovering millions of genetic variants in humans, and interpretation of their functional effects is essential for understanding the genetic basis of variation in human traits. Here we report sequencing and deep analysis of messenger RNA and microRNA from lymphoblastoid cell lines of 462 individuals from the 1000 Genomes Project--the first uniformly processed high-throughput RNA-sequencing data from multiple human populations with high-quality genome sequences. We discover extremely widespread genetic variation affecting the regulation of most genes, with transcript structure and expression level variation being equally common but genetically largely independent. Our characterization of causal regulatory variation sheds light on the cellular mechanisms of regulatory and loss-of-function variation, and allows us to infer putative causal variants for dozens of disease-associated loci. Altogether, this study provides a deep understanding of the cellular mechanisms of transcriptome variation and of the landscape of functional variants in the human genome.

Journal ArticleDOI
TL;DR: It is shown that label-free snapshot proteomics can be used to obtain quantitative time-resolved profiles of human plasma coronas formed on silica and polystyrene nanoparticles of various size and surface functionalization.
Abstract: In biological fluids, proteins bind to the surface of nanoparticles to form a coating known as the protein corona, which can critically affect the interaction of the nanoparticles with living systems. As physiological systems are highly dynamic, it is important to obtain a time-resolved knowledge of protein-corona formation, development and biological relevancy. Here we show that label-free snapshot proteomics can be used to obtain quantitative time-resolved profiles of human plasma coronas formed on silica and polystyrene nanoparticles of various size and surface functionalization. Complex time- and nanoparticle-specific coronas, which comprise almost 300 different proteins, were found to form rapidly (<0.5 minutes) and, over time, to change significantly in terms of the amount of bound protein, but not in composition. Rapid corona formation is found to affect haemolysis, thrombocyte activation, nanoparticle uptake and endothelial cell death at an early exposure time.

Journal ArticleDOI
TL;DR: A perspective on the context and evolutionary significance of hybridization during speciation is offered, highlighting issues of current interest and debate and suggesting that the Dobzhansky–Muller model of hybrid incompatibilities requires a broader interpretation.
Abstract: Hybridization has many and varied impacts on the process of speciation. Hybridization may slow or reverse differentiation by allowing gene flow and recombination. It may accelerate speciation via adaptive introgression or cause near-instantaneous speciation by allopolyploidization. It may have multiple effects at different stages and in different spatial contexts within a single speciation event. We offer a perspective on the context and evolutionary significance of hybridization during speciation, highlighting issues of current interest and debate. In secondary contact zones, it is uncertain if barriers to gene flow will be strengthened or broken down due to recombination and gene flow. Theory and empirical evidence suggest the latter is more likely, except within and around strongly selected genomic regions. Hybridization may contribute to speciation through the formation of new hybrid taxa, whereas introgression of a few loci may promote adaptive divergence and so facilitate speciation. Gene regulatory networks, epigenetic effects and the evolution of selfish genetic material in the genome suggest that the Dobzhansky-Muller model of hybrid incompatibilities requires a broader interpretation. Finally, although the incidence of reinforcement remains uncertain, this and other interactions in areas of sympatry may have knock-on effects on speciation both within and outside regions of hybridization.

Book ChapterDOI
01 Nov 2013
TL;DR: In this article, an overview of model capabilities as assessed in this chapter, including improvements, or lack thereof, relative to models assessed in the AR4, is presented, along with an assessment of recent work connecting model performance to the detection and attribution of climate change as well as to future projections.
Abstract: Climate models have continued to be developed and improved since the AR4, and many models have been extended into Earth System models by including the representation of biogeochemical cycles important to climate change. These models allow for policy-relevant calculations such as the carbon dioxide (CO2) emissions compatible with a specified climate stabilization target. In addition, the range of climate variables and processes that have been evaluated has greatly expanded, and differences between models and observations are increasingly quantified using ‘performance metrics’. In this chapter, model evaluation covers simulation of the mean climate, of historical climate change, of variability on multiple time scales and of regional modes of variability. This evaluation is based on recent internationally coordinated model experiments, including simulations of historic and paleo climate, specialized experiments designed to provide insight into key climate processes and feedbacks and regional climate downscaling. Figure 9.44 provides an overview of model capabilities as assessed in this chapter, including improvements, or lack thereof, relative to models assessed in the AR4. The chapter concludes with an assessment of recent work connecting model performance to the detection and attribution of climate change as well as to future projections.

Journal ArticleDOI
TL;DR: In this paper, the authors construct decadal budgets for methane sources and sinks between 1980 and 2010, using a combination of atmospheric measurements and results from chemical transport models, ecosystem models, climate chemistry models and inventories of anthropogenic emissions.
Abstract: Methane is an important greenhouse gas, responsible for about 20% of the warming induced by long-lived greenhouse gases since pre-industrial times. By reacting with hydroxyl radicals, methane reduces the oxidizing capacity of the atmosphere and generates ozone in the troposphere. Although most sources and sinks of methane have been identified, their relative contributions to atmospheric methane levels are highly uncertain. As such, the factors responsible for the observed stabilization of atmospheric methane levels in the early 2000s, and the renewed rise after 2006, remain unclear. Here, we construct decadal budgets for methane sources and sinks between 1980 and 2010, using a combination of atmospheric measurements and results from chemical transport models, ecosystem models, climate chemistry models and inventories of anthropogenic emissions. The resultant budgets suggest that data-driven approaches and ecosystem models overestimate total natural emissions. We build three contrasting emission scenarios-which differ in fossil fuel and microbial emissions-to explain the decadal variability in atmospheric methane levels detected, here and in previous studies, since 1985. Although uncertainties in emission trends do not allow definitive conclusions to be drawn, we show that the observed stabilization of methane levels between 1999 and 2006 can potentially be explained by decreasing-to-stable fossil fuel emissions, combined with stable-to-increasing microbial emissions. We show that a rise in natural wetland emissions and fossil fuel emissions probably accounts for the renewed increase in global methane levels after 2006, although the relative contribution of these two sources remains uncertain.

Journal ArticleDOI
TL;DR: Providing a future energy supply that is secure and CO_2-neutral will require switching to nonfossil energy sources such as wind, solar, nuclear, and geothermal energy and developing methods for transforming the energy produced by these new sources into forms that can be stored, transported, and used upon demand.
Abstract: Two major energy-related problems confront the world in the next 50 years. First, increased worldwide competition for gradually depleting fossil fuel reserves (derived from past photosynthesis) will lead to higher costs, both monetarily and politically. Second, atmospheric CO_2 levels are at their highest recorded level since records began. Further increases are predicted to produce large and uncontrollable impacts on the world climate. These projected impacts extend beyond climate to ocean acidification, because the ocean is a major sink for atmospheric CO_2. Providing a future energy supply that is secure and CO_2-neutral will require switching to nonfossil energy sources such as wind, solar, nuclear, and geothermal energy and developing methods for transforming the energy produced by these new sources into forms that can be stored, transported, and used upon demand.

Journal ArticleDOI
TL;DR: The collective vision of the future of extracellular enzyme research is offered: one that will depend on imaginative thinking as well as technological advances, and be built upon synergies between diverse disciplines.
Abstract: This review focuses on some important and challenging aspects of soil extracellular enzyme research. We report on recent discoveries, identify key research needs and highlight the many opportunities offered by interactions with other microbial enzymologists. The biggest challenges are to understand how the chemical, physical and biological properties of soil affect enzyme production, diffusion, substrate turnover and the proportion of the product that is made available to the producer cells. Thus, the factors that regulate the synthesis and secretion of extracellular enzymes and their distribution after they are externalized are important topics, not only for soil enzymologists, but also in the broader context of microbial ecology. In addition, there are many uncertainties about the ways in which microbes and their extracellular enzymes overcome the generally destructive, inhibitory and competitive properties of the soil matrix, and the various strategies they adopt for effective substrate detection and utilization. The complexity of extracellular enzyme activities in depolymerising macromolecular organics is exemplified by lignocellulose degradation and how the many enzymes involved respond to structural diversity and changing nutrient availabilities. The impacts of climate change on microbes and their extracellular enzymes, although of profound importance, are not well understood but we suggest how they may be predicted, assessed and managed. We describe recent advances that allow for the manipulation of extracellular enzyme activities to facilitate bioremediation, carbon sequestration and plant growth promotion. We also contribute to the ongoing debate as to how to assay enzyme activities in soil and what the measurements tell us, in the context of both traditional methods and the newer techniques that are being developed and adopted. Finally, we offer our collective vision of the future of extracellular enzyme research: one that will depend on imaginative thinking as well as technological advances, and be built upon synergies between diverse disciplines.

Journal ArticleDOI
TL;DR: SHARE's scientific power is based on its panel design, which grasps the dynamic character of the ageing process; its multidisciplinary approach, which delivers the full picture of individual and societal ageing; and its cross-nationally ex-ante harmonized design, which permits international comparisons of health, economic and social outcomes in Europe and the USA.
Abstract: SHARE is a unique panel database of micro data on health, socio-economic status and social and family networks covering most of the European Union and Israel. To date, SHARE has collected three panel waves (2004, 2006, 2010) of current living circumstances and retrospective life histories (2008, SHARELIFE); 6 additional waves are planned until 2024. The more than 150 000 interviews give a broad picture of life after the age of 50 years, measuring physical and mental health, economic and non-economic activities, income and wealth, transfers of time and money within and outside the family as well as life satisfaction and well-being. The data are available to the scientific community free of charge at www.share-project.org after registration. SHARE is harmonized with the US Health and Retirement Study (HRS) and the English Longitudinal Study of Ageing (ELSA) and has become a role model for several ageing surveys worldwide. SHARE's scientific power is based on its panel design that grasps the dynamic character of the ageing process, its multidisciplinary approach that delivers the full picture of individual and societal ageing, and its cross-nationally ex-ante harmonized design that permits international comparisons of health, economic and social outcomes in Europe and the USA.

Journal ArticleDOI
TL;DR: In this article, the Max-Planck-Institute Earth System Model (MPI-ESM) is used in the Coupled Model Intercomparison Project phase 5 (CMIP5) in a series of climate change experiments for either idealized CO2-only forcing or forcings based on observations and the Representative Concentration Pathway (RCP) scenarios.
Abstract: The new Max-Planck-Institute Earth System Model (MPI-ESM) is used in the Coupled Model Intercomparison Project phase 5 (CMIP5) in a series of climate change experiments for either idealized CO2-only forcing or forcings based on observations and the Representative Concentration Pathway (RCP) scenarios. The paper gives an overview of the model configurations, experiments, related forcings, and initialization procedures and presents results for the simulated changes in climate and carbon cycle. It is found that the climate feedback depends on the global warming and possibly the forcing history. The global warming from climatological 1850 conditions to 2080–2100 ranges from 1.5°C under the RCP2.6 scenario to 4.4°C under the RCP8.5 scenario. Over this range, the patterns of temperature and precipitation change are nearly independent of the global warming. The model shows a tendency to reduce the ocean heat uptake efficiency toward a warmer climate, and hence acceleration in warming in the later years. The precipitation sensitivity can be as high as 2.5% K−1 if the CO2 concentration is constant, or as small as 1.6% K−1, if the CO2 concentration is increasing. The oceanic uptake of anthropogenic carbon increases over time in all scenarios, being smallest in the experiment forced by RCP2.6 and largest in that for RCP8.5. The land also serves as a net carbon sink in all scenarios, predominantly in boreal regions. The strong tropical carbon sources found in the RCP2.6 and RCP8.5 experiments are almost absent in the RCP4.5 experiment, which can be explained by reforestation in the RCP4.5 scenario.

Journal ArticleDOI
TL;DR: In this article, a multi-epoch abundance matching (MEAM) model was proposed to determine the relationship between the stellar masses of galaxies and the masses of their host dark matter haloes over the entire cosmic history from z ≈ 4 to the present.
Abstract: We present a new statistical method to determine the relationship between the stellar masses of galaxies and the masses of their host dark matter haloes over the entire cosmic history from z ≈ 4 to the present. This multi-epoch abundance matching (MEAM) model self-consistently takes into account that satellite galaxies first become satellites at times earlier than they are observed. We employ a redshift-dependent parameterization of the stellar-to-halo mass relation to populate haloes and subhaloes in the Millennium simulations with galaxies, requiring that the observed stellar mass functions at several redshifts be reproduced simultaneously. We show that physically meaningful growth of massive galaxies is consistent with these data only if observational mass errors are taken into account. Using merger trees extracted from the dark matter simulations in combination with MEAM, we predict the average assembly histories of galaxies, separating into star formation within the galaxies (in-situ) and accretion of stars (ex-situ). Our main results are: The peak star formation efficiency decreases with redshift from 23 per cent at z = 0 to 9 per cent at z = 4 while the corresponding halo mass increases from 10^11.8 M⊙ to 10^12.5 M⊙. The star formation rate of central galaxies peaks at a redshift which depends on halo mass; for massive haloes this peak is at early cosmic times while for low-mass galaxies the peak has not been reached yet. In haloes similar to that of the Milky Way about half of the central stellar mass is assembled after z = 0.7. In low-mass haloes, the accretion of satellites contributes little to the assembly of their central galaxies, while in massive haloes more than half of the central stellar mass is formed ex-situ with significant accretion of satellites at z < 2. We find that our method implies a cosmic star formation history and an evolution of specific star formation rates which are consistent with those inferred directly. We present convenient fitting functions for stellar masses, star formation rates, and accretion rates as functions of halo mass and redshift.
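The redshift-dependent stellar-to-halo mass relation referred to in the abstract is commonly parameterized as a double power law; the sketch below uses that functional form with placeholder parameter values rather than the fits reported in the paper.

```python
# Sketch of a double power-law stellar-to-halo mass relation of the kind the
# abstract refers to. The functional form is the commonly used one; the
# parameter values below are placeholders, not the fits from the paper.
import numpy as np

def stellar_mass(M_halo, M1=10**11.6, N=0.03, beta=1.3, gamma=0.6):
    """Return stellar mass for halo mass M_halo (both in solar masses)."""
    x = M_halo / M1
    return 2.0 * N * M_halo / (x**(-beta) + x**gamma)

halos = np.logspace(11, 14, 4)
for M, m in zip(halos, stellar_mass(halos)):
    print(f"M_halo = {M:9.2e} Msun -> m_star = {m:9.2e} Msun "
          f"(m_star/M_halo = {m / M:.3f})")
```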