
Showing papers by "Max Planck Society" published in 2005


Journal ArticleDOI
TL;DR: Describes the software suite GROMACS (Groningen MAchine for Chemical Simulation), developed at the University of Groningen, The Netherlands, in the early 1990s; careful optimization of neighbor searching and inner loop performance makes it a very fast program for molecular dynamics simulation.
Abstract: This article describes the software suite GROMACS (Groningen MAchine for Chemical Simulation) that was developed at the University of Groningen, The Netherlands, in the early 1990s. The software, written in ANSI C, originates from a parallel hardware project, and is well suited for parallelization on processor clusters. By careful optimization of neighbor searching and of inner loop performance, GROMACS is a very fast program for molecular dynamics simulation. It does not have a force field of its own, but is compatible with GROMOS, OPLS, AMBER, and ENCAD force fields. In addition, it can handle polarizable shell models and flexible constraints. The program is versatile, as force routines can be added by the user, tabulated functions can be specified, and analyses can be easily customized. Nonequilibrium dynamics and free energy determinations are incorporated. Interfaces with popular quantum-chemical packages (MOPAC, GAMESS-UK, GAUSSIAN) are provided to perform mixed MM/QM simulations. The package includes about 100 utility and analysis programs. GROMACS is in the public domain and distributed (with source code and documentation) under the GNU General Public License. It is maintained by a group of developers from the Universities of Groningen, Uppsala, and Stockholm, and the Max Planck Institute for Polymer Research in Mainz. Its Web site is http://www.gromacs.org.

13,116 citations
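
The speed the abstract attributes to "careful optimization of neighbor searching" rests on the classic cell-list idea: bin particles into cells at least as wide as the interaction cutoff, so each particle is tested only against the 27 surrounding cells. A minimal Python sketch of that generic technique (illustrative only, not GROMACS's actual C implementation; all names are ours):

```python
import numpy as np

def cell_list_neighbors(positions, box, cutoff):
    """Find all particle pairs within `cutoff` in a periodic cubic box.

    Binning into cells of width >= cutoff gives O(N) scaling instead of
    the O(N^2) all-pairs search. Assumes at least 3 cells per dimension,
    otherwise periodic wraparound would double-count neighbor cells.
    """
    ncell = int(box // cutoff)                    # cells per dimension
    width = box / ncell
    cells = {}
    for i, p in enumerate(positions):
        idx = tuple((p // width).astype(int) % ncell)
        cells.setdefault(idx, []).append(i)

    pairs = []
    for (cx, cy, cz), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    other = ((cx + dx) % ncell, (cy + dy) % ncell, (cz + dz) % ncell)
                    for i in members:
                        for j in cells.get(other, []):
                            if j <= i:            # count each pair once
                                continue
                            d = positions[i] - positions[j]
                            d -= box * np.round(d / box)   # minimum image
                            if np.dot(d, d) < cutoff ** 2:
                                pairs.append((i, j))
    return pairs

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, size=(500, 3))   # 500 particles, 10x10x10 box
print(len(cell_list_neighbors(pos, box=10.0, cutoff=1.2)), "pairs within cutoff")
```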


Book
23 Nov 2005
TL;DR: The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Abstract: A comprehensive and self-contained introduction to Gaussian processes (GPs), which provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.

11,357 citations
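
The supervised regression problem the book treats has a closed-form solution that fits in a few lines of linear algebra. Below is a minimal NumPy sketch of the standard GP predictive equations with a squared-exponential kernel, in the spirit of the book's regression algorithm (a sketch under simplified assumptions, not the book's reference code):

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance k(x, x') for 1-D inputs."""
    sq = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / lengthscale ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """GP posterior mean and variance:
        mean = K*^T (K + sigma^2 I)^{-1} y
        var  = diag(K** - K*^T (K + sigma^2 I)^{-1} K*)
    """
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)               # numerically stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    var = np.diag(Kss - v.T @ v)
    return mean, var

x = np.linspace(0, 5, 20)
y = np.sin(x) + 0.1 * np.random.default_rng(1).normal(size=20)
xs = np.linspace(0, 5, 100)
mu, var = gp_predict(x, y, xs)
print(mu[:3], var[:3])
```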


Journal ArticleDOI
TL;DR: ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions.
Abstract: ERA-40 is a re-analysis of meteorological observations from September 1957 to August 2002 produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with many institutions. The observing system changed considerably over this re-analysis period, with assimilable data provided by a succession of satellite-borne instruments from the 1970s onwards, supplemented by increasing numbers of observations from aircraft, ocean buoys and other surface platforms, but with a declining number of radiosonde ascents since the late 1980s. The observations used in ERA-40 were accumulated from many sources. The first part of this paper describes the data acquisition and the principal changes in data type and coverage over the period. It also describes the data assimilation system used for ERA-40. This benefited from many of the changes introduced into operational forecasting since the mid-1990s, when the systems used for the 15-year ECMWF re-analysis (ERA-15) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) re-analysis were implemented. Several of the improvements are discussed. General aspects of the production of the analyses are also summarized. A number of results indicative of the overall performance of the data assimilation system, and implicitly of the observing system, are presented and discussed. The comparison of background (short-range) forecasts and analyses with observations, the consistency of the global mass budget, the magnitude of differences between analysis and background fields and the accuracy of medium-range forecasts run from the ERA-40 analyses are illustrated. Several results demonstrate the marked improvement that was made to the observing system for the southern hemisphere in the 1970s, particularly towards the end of the decade. In contrast, the synoptic quality of the analysis for the northern hemisphere is sufficient to provide forecasts that remain skilful well into the medium range for all years. Two particular problems are also examined: excessive precipitation over tropical oceans and a too strong Brewer-Dobson circulation, both of which are pronounced in later years. Several other aspects of the quality of the re-analyses revealed by monitoring and validation studies are summarized. Expectations that the ‘second-generation’ ERA-40 re-analysis would provide products that are better than those from the first-generation ERA-15 and NCEP/NCAR re-analyses are found to have been met in most cases. © Royal Meteorological Society, 2005. The contributions of N. A. Rayner and R. W. Saunders are Crown copyright.

7,110 citations


Journal ArticleDOI
TL;DR: GADGET-2 is a massively parallel TreeSPH code, capable of following a collisionless fluid with the N-body method, and an ideal gas by means of smoothed particle hydrodynamics.
Abstract: We discuss the cosmological simulation code GADGET-2, a new massively parallel TreeSPH code, capable of following a collisionless fluid with the N-body method, and an ideal gas by means of smoothed particle hydrodynamics (SPH). Our implementation of SPH manifestly conserves energy and entropy in regions free of dissipation, while allowing for fully adaptive smoothing lengths. Gravitational forces are computed with a hierarchical multipole expansion, which can optionally be applied in the form of a TreePM algorithm, where only short-range forces are computed with the ‘tree’ method while long-range forces are determined with Fourier techniques. Time integration is based on a quasi-symplectic scheme where long-range and short-range forces can be integrated with different time-steps. Individual and adaptive short-range time-steps may also be employed. The domain decomposition used in the parallelization algorithm is based on a space-filling curve, resulting in high flexibility and tree force errors that do not depend on the way the domains are cut. The code is efficient in terms of memory consumption and required communication bandwidth. It has been used to compute the first cosmological N-body simulation with more than 10^10 dark matter particles, reaching a homogeneous spatial dynamic range of 10^5 per dimension in a three-dimensional box. It has also been used to carry out very large cosmological SPH simulations that account for radiative cooling and star formation, reaching total particle numbers of more than 250 million. We present the algorithms used by the code and discuss their accuracy and performance using a number of test problems. GADGET-2 is publicly released to the research community. Keywords: methods: numerical – galaxies: interactions – dark matter.

6,196 citations
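
The "quasi-symplectic scheme" mentioned in the abstract is built around the kick-drift-kick (KDK) leapfrog. A toy Python sketch on a Kepler two-body problem shows why the scheme is attractive for long integrations: energy errors stay bounded instead of drifting (illustrative only; GADGET-2's actual integrator adds hierarchical, adaptive time-steps):

```python
import numpy as np

def accel(pos):
    """Acceleration of a unit-mass test particle around a unit point mass (G = 1)."""
    r = np.linalg.norm(pos)
    return -pos / r ** 3

def kdk_step(pos, vel, dt):
    """One kick-drift-kick leapfrog step: symplectic, second-order accurate."""
    vel = vel + 0.5 * dt * accel(pos)   # half kick
    pos = pos + dt * vel                # full drift
    vel = vel + 0.5 * dt * accel(pos)   # half kick
    return pos, vel

# Circular orbit at r = 1 (v = 1 in code units)
pos, vel = np.array([1.0, 0.0]), np.array([0.0, 1.0])
energy0 = 0.5 * vel @ vel - 1.0 / np.linalg.norm(pos)
for _ in range(10000):                  # ~16 orbital periods at dt = 0.01
    pos, vel = kdk_step(pos, vel, dt=0.01)
energy = 0.5 * vel @ vel - 1.0 / np.linalg.norm(pos)
print(f"relative energy drift: {abs(energy - energy0) / abs(energy0):.2e}")
```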


Journal ArticleDOI
TL;DR: This paper considers the requirements and implementation constraints on a framework that simultaneously enables an efficient discretization with associated hierarchical indexation and fast analysis/synthesis of functions defined on the sphere and demonstrates how these are explicitly satisfied by HEALPix.
Abstract: HEALPix, the Hierarchical Equal Area isoLatitude Pixelization, is a versatile structure for the pixelization of data on the sphere. An associated library of computational algorithms and visualization software supports fast scientific applications executable directly on discretized spherical maps generated from very large volumes of astronomical data. Originally developed to address the data processing and analysis needs of the present generation of cosmic microwave background experiments (e.g., BOOMERANG, WMAP), HEALPix can be expanded to meet many of the profound challenges that will arise in confrontation with the observational output of future missions and experiments, including, e.g., Planck, Herschel, SAFIR, and the Beyond Einstein inflation probe. In this paper we consider the requirements and implementation constraints on a framework that simultaneously enables an efficient discretization with associated hierarchical indexation and fast analysis/synthesis of functions defined on the sphere. We demonstrate how these are explicitly satisfied by HEALPix.

5,518 citations
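
For experimenting with the pixelization itself, the healpy package provides Python bindings to the HEALPix library. A short sketch of the basic operations (assumes healpy is installed; the map values here are synthetic):

```python
import numpy as np
import healpy as hp  # Python bindings to the HEALPix library

nside = 64                           # resolution parameter, a power of 2
npix = hp.nside2npix(nside)          # 12 * nside^2 equal-area pixels
print(f"nside={nside}: {npix} pixels, "
      f"{hp.nside2resol(nside, arcmin=True):.1f} arcmin per pixel")

# Map a sky direction (colatitude theta, longitude phi, radians) to a pixel index
theta, phi = np.radians(90.0), np.radians(45.0)
ipix = hp.ang2pix(nside, theta, phi)

# Build a toy map and take its spherical-harmonic power spectrum
sky = np.random.default_rng(0).normal(size=npix)
cl = hp.anafast(sky, lmax=128)       # fast analysis of a function on the sphere
print(f"pixel {ipix}, first C_l values: {cl[:3]}")
```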


Journal ArticleDOI
02 Jun 2005-Nature
TL;DR: It is shown that baryon-induced features in the initial conditions of the Universe are reflected in distorted form in the low-redshift galaxy distribution, an effect that can be used to constrain the nature of dark energy with future generations of observational surveys of galaxies.
Abstract: The cold dark matter model has become the leading theoretical picture for the formation of structure in the Universe. This model, together with the theory of cosmic inflation, makes a clear prediction for the initial conditions for structure formation and predicts that structures grow hierarchically through gravitational instability. Testing this model requires that the precise measurements delivered by galaxy surveys can be compared to robust and equally precise theoretical calculations. Here we present a simulation of the growth of dark matter structure using 2,160^3 particles, following them from redshift z = 127 to the present in a cube-shaped region 2.230 billion light years on a side. In postprocessing, we also follow the formation and evolution of the galaxies and quasars. We show that baryon-induced features in the initial conditions of the Universe are reflected in distorted form in the low-redshift galaxy distribution, an effect that can be used to constrain the nature of dark energy with future generations of observational surveys of galaxies.

4,814 citations


Journal ArticleDOI
27 May 2005-Science
TL;DR: Using in vivo two-photon imaging in neocortex, it is found that microglial cells are highly active in their presumed resting state, continually surveying their microenvironment with extremely motile processes and protrusions.
Abstract: Microglial cells represent the immune system of the mammalian brain and therefore are critically involved in various injuries and diseases. Little is known about their role in the healthy brain and their immediate reaction to brain damage. By using in vivo two-photon imaging in neocortex, we found that microglial cells are highly active in their presumed resting state, continually surveying their microenvironment with extremely motile processes and protrusions. Furthermore, blood-brain barrier disruption provoked immediate and focal activation of microglia, switching their behavior from patrolling to shielding of the injured site. Microglia thus are busy and vigilant housekeepers in the adult brain.

4,458 citations


Journal ArticleDOI
TL;DR: In this paper, the authors adapted the naturally occurring algal protein Channelrhodopsin-2, a rapidly gated light-sensitive cation channel, by using lentiviral gene delivery in combination with high-speed optical switching to photostimulate mammalian neurons.
Abstract: Temporally precise, noninvasive control of activity in well-defined neuronal populations is a long-sought goal of systems neuroscience. We adapted for this purpose the naturally occurring algal protein Channelrhodopsin-2, a rapidly gated light-sensitive cation channel, by using lentiviral gene delivery in combination with high-speed optical switching to photostimulate mammalian neurons. We demonstrate reliable, millisecond-timescale control of neuronal spiking, as well as control of excitatory and inhibitory synaptic transmission. This technology allows the use of light to alter neural processing at the level of single spikes and synaptic events, yielding a widely applicable tool for neuroscientists and biomedical engineers.

4,411 citations


Journal ArticleDOI
29 Sep 2005-Nature
TL;DR: 13 models of the ocean–carbon cycle are used to assess calcium carbonate saturation under the IS92a ‘business-as-usual’ scenario for future emissions of anthropogenic carbon dioxide and indicate that conditions detrimental to high-latitude ecosystems could develop within decades, not centuries as suggested previously.
Abstract: Today's surface ocean is saturated with respect to calcium carbonate, but increasing atmospheric carbon dioxide concentrations are reducing ocean pH and carbonate ion concentrations, and thus the level of calcium carbonate saturation. Experimental evidence suggests that if these trends continue, key marine organisms—such as corals and some plankton—will have difficulty maintaining their external calcium carbonate skeletons. Here we use 13 models of the ocean–carbon cycle to assess calcium carbonate saturation under the IS92a 'business-as-usual' scenario for future emissions of anthropogenic carbon dioxide. In our projections, Southern Ocean surface waters will begin to become undersaturated with respect to aragonite, a metastable form of calcium carbonate, by the year 2050. By 2100, this undersaturation could extend throughout the entire Southern Ocean and into the subarctic Pacific Ocean. When live pteropods were exposed to our predicted level of undersaturation during a two-day shipboard experiment, their aragonite shells showed notable dissolution. Our findings indicate that conditions detrimental to high-latitude ecosystems could develop within decades, not centuries as suggested previously.

4,244 citations
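
The threshold the projections track is the aragonite saturation state: the ion product of dissolved calcium and carbonate over the stoichiometric solubility product, with values below one favouring dissolution of aragonite shells. As standard carbonate-system background (not an equation quoted from the paper):

```latex
\Omega_{\mathrm{arag}} \;=\; \frac{[\mathrm{Ca}^{2+}]\,[\mathrm{CO}_3^{2-}]}{K^{*}_{\mathrm{sp,arag}}},
\qquad
\Omega_{\mathrm{arag}} < 1 \;\Rightarrow\; \text{undersaturated (dissolution favoured)}.
```

Rising CO2 lowers [CO3^2-] by shifting carbonate speciation toward bicarbonate, which is why Ω falls even though [Ca2+] is nearly constant in seawater.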


Journal ArticleDOI
TL;DR: A new MATLAB-based toolbox for the SPM2 software package is introduced that enables the integration of probabilistic cytoarchitectonic maps with the results of functional imaging studies, providing an easy-to-use tool for the integrated analysis of functional and anatomical data in a common reference space.

3,911 citations


Journal ArticleDOI
TL;DR: Fundamental concepts of nonlinear microscopy are reviewed and conditions relevant for achieving large imaging depths in intact tissue are discussed.
Abstract: With few exceptions biological tissues strongly scatter light, making high-resolution deep imaging impossible for traditional (including confocal) fluorescence microscopy. Nonlinear optical microscopy, in particular two-photon excited fluorescence microscopy, has overcome this limitation, providing large depth penetration mainly because even multiply scattered signal photons can be assigned to their origin as the result of localized nonlinear signal generation. Two-photon microscopy thus allows cellular imaging several hundred microns deep in various organs of living animals. Here we review fundamental concepts of nonlinear microscopy and discuss conditions relevant for achieving large imaging depths in intact tissue.
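
The "localized nonlinear signal generation" invoked above comes from the intensity dependence of the excitation: two photons must arrive nearly simultaneously, so the fluorescence rate grows quadratically with illumination intensity and falls off sharply away from the focus. Schematically (standard nonlinear-optics background, not a formula from the review):

```latex
F_{\mathrm{2P}}(\mathbf{r}) \;\propto\; I^{2}(\mathbf{r})
\qquad\text{vs.}\qquad
F_{\mathrm{1P}}(\mathbf{r}) \;\propto\; I(\mathbf{r}).
```

Because excitation is confined to the focal volume, every collected fluorescence photon, however multiply scattered on the way out, can be attributed to that volume.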

Journal ArticleDOI
TL;DR: In response to stress, the brain activates several neuropeptide-secreting systems, eventually leading to the release of adrenal corticosteroid hormones, which feed back on the brain and bind to two types of nuclear receptor that act as transcriptional regulators.
Abstract: In response to stress, the brain activates several neuropeptide-secreting systems. This eventually leads to the release of adrenal corticosteroid hormones, which subsequently feed back on the brain and bind to two types of nuclear receptor that act as transcriptional regulators. By targeting many genes, corticosteroids function in a binary fashion, and serve as a master switch in the control of neuronal and network responses that underlie behavioural adaptation. In genetically predisposed individuals, an imbalance in this binary control mechanism can introduce a bias towards stress-related brain disease after adverse experiences. New candidate susceptibility genes that serve as markers for the prediction of vulnerable phenotypes are now being identified.

Journal ArticleDOI
TL;DR: The authors argue and present evidence that great apes understand the basics of intentional action but still do not participate in activities involving joint intentions and attention (shared intentionality), and that children's skills of shared intentionality develop gradually during the first 14 months of life.
Abstract: We propose that the crucial difference between human cognition and that of other species is the ability to participate with others in collaborative activities with shared goals and intentions: shared intentionality. Participation in such activities requires not only especially powerful forms of intention reading and cultural learning, but also a unique motivation to share psychological states with others and unique forms of cognitive representation for doing so. The result of participating in these activities is species-unique forms of cultural cognition and evolution, enabling everything from the creation and use of linguistic symbols to the construction of social norms and individual beliefs to the establishment of social institutions. In support of this proposal we argue and present evidence that great apes (and some children with autism) understand the basics of intentional action, but they still do not participate in activities involving joint intentions and attention (shared intentionality). Human children's skills of shared intentionality develop gradually during the first 14 months of life as two ontogenetic pathways intertwine: (1) the general ape line of understanding others as animate, goal-directed, and intentional agents; and (2) a species-unique motivation to share emotions, experience, and activities with other persons. The developmental outcome is children's ability to construct dialogic cognitive representations, which enable them to participate in earnest in the collectivity that is human cognition.

Journal ArticleDOI
Piero Carninci, Takeya Kasukawa, Shintaro Katayama, Julian Gough +194 more · Institutions (36)
02 Sep 2005-Science
TL;DR: Detailed polling of transcription start and termination sites and analysis of previously unidentified full-length complementary DNAs derived from the mouse genome provide a comprehensive platform for the comparative analysis of mammalian transcriptional regulation in differentiation and development.
Abstract: This study describes comprehensive polling of transcription start and termination sites and analysis of previously unidentified full-length complementary DNAs derived from the mouse genome. We identify the 5' and 3' boundaries of 181,047 transcripts with extensive variation in transcripts arising from alternative promoter usage, splicing, and polyadenylation. There are 16,247 new mouse protein-coding transcripts, including 5154 encoding previously unidentified proteins. Genomic mapping of the transcriptome reveals transcriptional forests, with overlapping transcription on both strands, separated by deserts in which few transcripts are observed. The data provide a comprehensive platform for the comparative analysis of mammalian transcriptional regulation in differentiation and development.

Journal ArticleDOI
22 Sep 2005-Nature
TL;DR: An increase in future drought events could turn temperate ecosystems into carbon sources, contributing to positive carbon-climate feedbacks already anticipated in the tropics and at high latitudes.
Abstract: Future climate warming is expected to enhance plant growth in temperate ecosystems and to increase carbon sequestration. But although severe regional heatwaves may become more frequent in a changing climate their impact on terrestrial carbon cycling is unclear. Here we report measurements of ecosystem carbon dioxide fluxes, remotely sensed radiation absorbed by plants, and country-level crop yields taken during the European heatwave in 2003. We use a terrestrial biosphere simulation model to assess continental-scale changes in primary productivity during 2003, and their consequences for the net carbon balance. We estimate a 30 per cent reduction in gross primary productivity over Europe, which resulted in a strong anomalous net source of carbon dioxide (0.5 Pg C yr^-1) to the atmosphere and reversed the effect of four years of net ecosystem carbon sequestration. Our results suggest that productivity reduction in eastern and western Europe can be explained by rainfall deficit and extreme summer heat, respectively. We also find that ecosystem respiration decreased together with gross primary productivity, rather than accelerating with the temperature rise. Model results, corroborated by historical records of crop yields, suggest that such a reduction in Europe's primary productivity is unprecedented during the last century. An increase in future drought events could turn temperate ecosystems into carbon sources, contributing to positive carbon-climate feedbacks already anticipated in the tropics and at high latitudes.

Journal ArticleDOI
TL;DR: HHpred is a fast server for remote protein homology detection and structure prediction; it is the first to implement pairwise comparison of profile hidden Markov models (HMMs) and allows searching a wide choice of databases.
Abstract: HHpred is a fast server for remote protein homology detection and structure prediction and is the first to implement pairwise comparison of profile hidden Markov models (HMMs). It allows searching a wide choice of databases, such as the PDB, SCOP, Pfam, SMART, COGs and CDD. It accepts a single query sequence or a multiple alignment as input. Within only a few minutes it returns the search results in a user-friendly format similar to that of PSI-BLAST. Search options include local or global alignment and scoring secondary structure similarity. HHpred can produce pairwise query-template alignments, multiple alignments of the query with a set of templates selected from the search results, as well as 3D structural models that are calculated by the MODELLER software from these alignments. A detailed help facility is available. As a demonstration, we analyze the sequence of SpoVT, a transcriptional regulator from Bacillus subtilis. HHpred can be accessed at http://protevo.eb.tuebingen.mpg.de/hhpred.

Journal ArticleDOI
10 Feb 2005-Nature
TL;DR: Simulations that simultaneously follow star formation and the growth of black holes during galaxy–galaxy collisions find that, in addition to generating a burst of star formation, a merger leads to strong inflows that feed gas to the supermassive black hole and thereby power the quasar.
Abstract: In the early Universe, while galaxies were still forming, black holes as massive as a billion solar masses powered quasars. Supermassive black holes are found at the centres of most galaxies today, where their masses are related to the velocity dispersions of stars in their host galaxies and hence to the mass of the central bulge of the galaxy. This suggests a link between the growth of the black holes and their host galaxies, which has indeed been assumed for a number of years. But the origin of the observed relation between black hole mass and stellar velocity dispersion, and its connection with the evolution of galaxies, have remained unclear. Here we report simulations that simultaneously follow star formation and the growth of black holes during galaxy-galaxy collisions. We find that, in addition to generating a burst of star formation, a merger leads to strong inflows that feed gas to the supermassive black hole and thereby power the quasar. The energy released by the quasar expels enough gas to quench both star formation and further black hole growth. This determines the lifetime of the quasar phase (approaching 100 million years) and explains the relationship between the black hole mass and the stellar velocity dispersion.

Journal ArticleDOI
01 Jul 2005-Carbon
TL;DR: In this article, experimental conditions and mathematical fitting procedures for the collection and analysis of Raman spectra of soot and related carbonaceous materials have been investigated and optimised with a Raman microscope system operated at three different laser excitation wavelengths (514, 633, and 780 nm).
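
The fitting procedures the paper evaluates are of the multi-band curve-fitting kind. A minimal Python illustration fits two Lorentzian bands near the well-known D (~1350 cm⁻¹) and G (~1580 cm⁻¹) positions of disordered carbon (synthetic data; the paper's band-combination schemes are more elaborate than this two-band sketch):

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, center, width):
    """Lorentzian band profile (width = half-width at half-maximum)."""
    return amp * width ** 2 / ((x - center) ** 2 + width ** 2)

def two_band(x, a1, c1, w1, a2, c2, w2):
    """Sum of a D band and a G band."""
    return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

# Synthetic first-order spectrum: D band ~1350 cm^-1, G band ~1585 cm^-1
shift = np.linspace(1000, 1800, 400)
rng = np.random.default_rng(2)
spectrum = (two_band(shift, 0.8, 1350, 60, 1.0, 1585, 40)
            + 0.02 * rng.normal(size=shift.size))

p0 = [1.0, 1340, 50, 1.0, 1590, 50]     # initial guesses near known positions
popt, _ = curve_fit(two_band, shift, spectrum, p0=p0)
a1, c1, w1, a2, c2, w2 = popt
print(f"D band: {c1:.0f} cm^-1 (FWHM ~{2*w1:.0f}), "
      f"G band: {c2:.0f} cm^-1 (FWHM ~{2*w2:.0f})")
print(f"I(D)/I(G) = {a1/a2:.2f}")        # a standard structural-order parameter
```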

Journal ArticleDOI
TL;DR: The atomic force microscope (AFM) is not only used to image the topography of solid surfaces at high resolution but also to measure force-versus-distance curves, which provide valuable information on local material properties such as elasticity, hardness, Hamaker constant, adhesion and surface charge densities.
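
One of the properties force curves yield, local elasticity, is typically extracted by fitting a contact-mechanics model to the approach curve. A sketch fitting the Hertz model for a spherical tip (synthetic data; real analyses must first locate the contact point and convert deflection to force, and the tip radius and Poisson ratio below are assumed values):

```python
import numpy as np
from scipy.optimize import curve_fit

R = 20e-9    # tip radius (m), assumed
NU = 0.5     # Poisson ratio of a soft, incompressible sample, assumed

def hertz_sphere(indentation, young_modulus):
    """Hertz model, sphere on a flat sample:
    F = (4/3) * E / (1 - nu^2) * sqrt(R) * delta^(3/2)
    """
    return (4.0 / 3.0) * young_modulus / (1 - NU ** 2) * np.sqrt(R) * indentation ** 1.5

# Synthetic force-indentation data for a 10 kPa sample plus measurement noise
delta = np.linspace(0, 200e-9, 100)               # indentation depth (m)
rng = np.random.default_rng(3)
force = hertz_sphere(delta, 10e3) + 2e-12 * rng.normal(size=delta.size)

popt, _ = curve_fit(hertz_sphere, delta, force, p0=[1e3])
print(f"fitted Young's modulus: {popt[0] / 1e3:.1f} kPa")
```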


Journal ArticleDOI
TL;DR: In this paper, the authors analyse the effect of extrapolation of night-time values of ecosystem respiration into the daytime; this is usually done with a temperature response function that is derived from long-term data sets.
Abstract: This paper discusses the advantages and disadvantages of the different methods that separate net ecosystem exchange (NEE) into its major components, gross ecosystem carbon uptake (GEP) and ecosystem respiration (Reco). In particular, we analyse the effect of the extrapolation of night-time values of ecosystem respiration into the daytime; this is usually done with a temperature response function that is derived from long-term data sets. For this analysis, we used 16 one-year-long data sets of carbon dioxide exchange measurements from European and US-American eddy covariance networks. These sites span from the boreal to Mediterranean climates, and include deciduous and evergreen forest, scrubland and crop ecosystems. We show that the temperature sensitivity of Reco, derived from long-term (annual) data sets, does not reflect the short-term temperature sensitivity that is effective when extrapolating from night- to daytime. Specifically, in summer active ecosystems the long
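
Concretely, the extrapolation under analysis works like this: at night photosynthesis is zero, so measured NEE equals Reco; fitting a temperature response to night-time data then lets one estimate daytime Reco and hence GEP. A schematic Python sketch using a generic Q10 form (one common choice of response function, not necessarily the paper's exact regression):

```python
import numpy as np
from scipy.optimize import curve_fit

def q10_model(temp, r_ref, q10, t_ref=15.0):
    """Exponential temperature response: Reco = R_ref * Q10^((T - T_ref)/10)."""
    return r_ref * q10 ** ((temp - t_ref) / 10.0)

# Night-time: photosynthesis is zero, so measured NEE equals Reco
night_temp = np.array([4.0, 7.0, 10.0, 12.0, 15.0, 18.0])    # deg C
night_nee = np.array([1.1, 1.4, 1.8, 2.1, 2.6, 3.2])         # umol CO2 m-2 s-1

(r_ref, q10), _ = curve_fit(q10_model, night_temp, night_nee, p0=[2.0, 2.0])
print(f"R_ref = {r_ref:.2f}, Q10 = {q10:.2f}")

# Daytime partitioning (sign convention: net uptake gives NEE < 0):
# NEE = Reco - GEP, so GEP = Reco(T_day) - NEE_day
day_temp, day_nee = 22.0, -12.0
reco_day = q10_model(day_temp, r_ref, q10)
print(f"extrapolated daytime Reco = {reco_day:.2f}, GEP = {reco_day - day_nee:.2f}")
```

The paper's point is precisely that the Q10-type sensitivity fitted to a whole year of night-time data can differ from the short-term sensitivity that governs this night-to-day extrapolation.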

Journal ArticleDOI
TL;DR: ROCR is a package for evaluating and visualizing the performance of scoring classifiers in the statistical language R; it features over 25 performance measures that can be freely combined to create two-dimensional performance curves.
Abstract: Summary: ROCR is a package for evaluating and visualizing the performance of scoring classifiers in the statistical language R. It features over 25 performance measures that can be freely combined to create two-dimensional performance curves. Standard methods for investigating trade-offs between specific performance measures are available within a uniform framework, including receiver operating characteristic (ROC) graphs, precision/recall plots, lift charts and cost curves. ROCR integrates tightly with R's powerful graphics capabilities, thus allowing for highly adjustable plots. Being equipped with only three commands and reasonable default values for optional parameters, ROCR combines flexibility with ease of usage. Availability: http://rocr.bioinf.mpi-sb.mpg.de. ROCR can be used under the terms of the GNU General Public License. Running within R, it is platform-independent. Contact: tobias.sing@mpi-sb.mpg.de
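
The core computation behind an ROC graph, sweeping a score cutoff and tallying true and false positive rates, fits in a few lines. A compact Python illustration of that idea (ROCR itself is an R package; the function and variable names here are ours, not ROCR's API):

```python
import numpy as np

def roc_curve(scores, labels):
    """Compute (FPR, TPR) pairs by sweeping the decision cutoff over all
    scores, exactly the trade-off an ROC graph visualizes."""
    order = np.argsort(-scores)          # descending by score
    labels = labels[order].astype(bool)
    tp = np.cumsum(labels)               # true positives at each cutoff
    fp = np.cumsum(~labels)              # false positives at each cutoff
    tpr = tp / labels.sum()
    fpr = fp / (~labels).sum()
    return np.r_[0.0, fpr], np.r_[0.0, tpr]

scores = np.array([0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2])
labels = np.array([1, 1, 0, 1, 0, 1, 0, 0])
fpr, tpr = roc_curve(scores, labels)
# Area under the curve via the trapezoidal rule
auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)
print(f"AUC = {auc:.2f}")
```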

Journal ArticleDOI
Joseph Adams, Madan M. Aggarwal, Zubayer Ahammed, J. Amonett +363 more · Institutions (46)
TL;DR: The most important experimental results from the first three years of nucleus-nucleus collision studies at RHIC are reviewed, with emphasis on results of the STAR experiment.

Journal ArticleDOI
TL;DR: Hundreds of Arabidopsis genes were found that outperform traditional reference genes in terms of expression stability throughout development and under a range of environmental conditions, and the developed PCR primers or hybridization probes for the novel reference genes will enable better normalization and quantification of transcript levels inArabidopsis in the future.
Abstract: Gene transcripts with invariant abundance during development and in the face of environmental stimuli are essential reference points for accurate gene expression analyses, such as RNA gel-blot analysis or quantitative reverse transcription-polymerase chain reaction (PCR). An exceptionally large set of data from Affymetrix ATH1 whole-genome GeneChip studies provided the means to identify a new generation of reference genes with very stable expression levels in the model plant species Arabidopsis (Arabidopsis thaliana). Hundreds of Arabidopsis genes were found that outperform traditional reference genes in terms of expression stability throughout development and under a range of environmental conditions. Most of these were expressed at much lower levels than traditional reference genes, making them very suitable for normalization of gene expression over a wide range of transcript levels. Specific and efficient primers were developed for 22 genes and tested on a diverse set of 20 cDNA samples. Quantitative reverse transcription-PCR confirmed superior expression stability and lower absolute expression levels for many of these genes, including genes encoding a protein phosphatase 2A subunit, a coatomer subunit, and a ubiquitin-conjugating enzyme. The developed PCR primers or hybridization probes for the novel reference genes will enable better normalization and quantification of transcript levels in Arabidopsis in the future.
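
A toy version of the underlying selection criterion, ranking genes by how little their expression varies across arrays, can be sketched in a few lines of Python (the coefficient of variation used here is a generic stability measure, not necessarily the paper's exact statistic, and the gene identifiers are illustrative):

```python
import numpy as np

def rank_by_stability(expression, gene_ids):
    """Rank genes by coefficient of variation (std/mean) across samples.

    `expression` is a (genes x samples) matrix of normalized signal values;
    a low CV across many conditions marks a candidate reference gene.
    """
    mean = expression.mean(axis=1)
    cv = expression.std(axis=1) / mean
    order = np.argsort(cv)
    return [(gene_ids[i], cv[i], mean[i]) for i in order]

# Synthetic data: 5 genes x 8 conditions (AGI-style identifiers, illustrative)
rng = np.random.default_rng(5)
base = np.array([500.0, 80.0, 1200.0, 60.0, 300.0])
noise = np.array([0.05, 0.04, 0.30, 0.06, 0.15])      # per-gene variability
expr = base[:, None] * (1 + noise[:, None] * rng.normal(size=(5, 8)))
genes = ["GENE1", "GENE2", "GENE3", "GENE4", "GENE5"]

for gid, cv, mean in rank_by_stability(expr, genes)[:3]:
    print(f"{gid}: CV = {cv:.3f}, mean signal = {mean:.0f}")
```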

Journal ArticleDOI
TL;DR: In this paper, the authors present longitudinal measures of five-year change in the regional brain volumes in healthy adults and assess the average and individual differences in volume changes and the effects of age, sex and hypertension with latent difference score modeling.
Abstract: Brain aging research relies mostly on cross-sectional studies, which infer true changes from age differences. We present longitudinal measures of five-year change in the regional brain volumes in healthy adults. Average and individual differences in volume changes and the effects of age, sex and hypertension were assessed with latent difference score modeling. The caudate, the cerebellum, the hippocampus and the association cortices shrank substantially. There was minimal change in the entorhinal and none in the primary visual cortex. Longitudinal measures of shrinkage exceeded cross-sectional estimates. All regions except the inferior parietal lobule showed individual differences in change. Shrinkage of the cerebellum decreased from young to middle adulthood, and increased from middle adulthood to old age. Shrinkage of the hippocampus, the entorhinal cortices, the inferior temporal cortex and the prefrontal white matter increased with age. Moreover, shrinkage in the hippocampus and the cerebellum accelerated with age. In the hippocampus, both linear and quadratic trends in incremental age-related shrinkage were limited to the hypertensive participants. Individual differences in shrinkage correlated across some regions, suggesting common causes. No sex differences in age trends except for the caudate were observed. We found no evidence of neuroprotective effects of larger brain size or educational attainment.

Journal ArticleDOI
TL;DR: Examining the expression patterns of large gene families, the authors found that they are often more similar than would be expected by chance, indicating that many gene families have been co-opted for specific developmental processes.
Abstract: Regulatory regions of plant genes tend to be more compact than those of animal genes, but the complement of transcription factors encoded in plant genomes is as large or larger than that found in those of animals. Plants therefore provide an opportunity to study how transcriptional programs control multicellular development. We analyzed global gene expression during development of the reference plant Arabidopsis thaliana in samples covering many stages, from embryogenesis to senescence, and diverse organs. Here, we provide a first analysis of this data set, which is part of the AtGenExpress expression atlas. We observed that the expression levels of transcription factor genes and signal transduction components are similar to those of metabolic genes. Examining the expression patterns of large gene families, we found that they are often more similar than would be expected by chance, indicating that many gene families have been co-opted for specific developmental processes.

Journal ArticleDOI
01 Apr 2005-Science
TL;DR: The iron cycle, in which iron-containing soil dust is transported from land through the atmosphere to the oceans, affecting ocean biogeochemistry and hence having feedback effects on climate and dust production, is reviewed.
Abstract: The environmental conditions of Earth, including the climate, are determined by physical, chemical, biological, and human interactions that transform and transport materials and energy. This is the "Earth system": a highly complex entity characterized by multiple nonlinear responses and thresholds, with linkages between disparate components. One important part of this system is the iron cycle, in which iron-containing soil dust is transported from land through the atmosphere to the oceans, affecting ocean biogeochemistry and hence having feedback effects on climate and dust production. Here we review the key components of this cycle, identifying critical uncertainties and priorities for future research.

Journal ArticleDOI
TL;DR: A method for detecting distant homologous relationships between proteins based on the generalized alignment of protein sequences with a profile hidden Markov model (HMM) to the case of pairwise alignment of profile HMMs is presented.
Abstract: Motivation: Protein homology detection and sequence alignment are at the basis of protein structure prediction, function prediction and evolution. Results: We have generalized the alignment of protein sequences with a profile hidden Markov model (HMM) to the case of pairwise alignment of profile HMMs. We present a method for detecting distant homologous relationships between proteins based on this approach. The method (HHsearch) is benchmarked together with BLAST, PSI-BLAST, HMMER and the profile-profile comparison tools PROF_SIM and COMPASS, in an all-against-all comparison of a database of 3691 protein domains from SCOP 1.63 with pairwise sequence identities below 20%. Sensitivity: When the predicted secondary structure is included in the HMMs, HHsearch is able to detect between 2.7 and 4.2 times more homologs than PSI-BLAST or HMMER and between 1.44 and 1.9 times more than COMPASS or PROF_SIM for a rate of false positives of 10%. Approximately half of the improvement over the profile-profile comparison methods is attributable to the use of profile HMMs in place of simple profiles. Alignment quality: Higher sensitivity is mirrored by an increased alignment quality. HHsearch produced 1.2, 1.7 and 3.3 times more good alignments ('balanced' score >0.3) than the next best method (COMPASS), and 1.6, 2.9 and 9.4 times more than PSI-BLAST, at the family, superfamily and fold level, respectively. Speed: HHsearch scans a query of 200 residues against 3691 domains in 33 s on an AMD64 2GHz PC. This is 10 times faster than PROF_SIM and 17 times faster than COMPASS. Availability: HHsearch can be downloaded from http://www.protevo.eb.tuebingen.mpg.de/download/ together with up-to-date versions of SCOP and PFAM. A web server is available at http://www.protevo.eb.tuebingen.mpg.de/toolkit/index.php?view=hhpred Contact: johannes.soeding@tuebingen.mpg.de
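
The core generalization, scoring one profile column against another rather than a residue against a profile column, can be illustrated with a toy log-odds co-emission score inside a standard Smith-Waterman local alignment. This is only a schematic of profile-profile comparison (linear gap penalty, no HMM transition probabilities), not HHsearch's actual HMM-HMM algorithm:

```python
import numpy as np

def column_score(p, q, background):
    """Log-odds score for two profile columns p, q (length-20 amino acid
    distributions): probability of co-emission relative to chance."""
    return np.log2(np.sum(p * q / background))

def profile_sw(P, Q, background, gap=2.0):
    """Smith-Waterman local alignment of two profiles, where each row of
    P and Q is one alignment column. Returns the best local score."""
    n, m = len(P), len(Q)
    H = np.zeros((n + 1, m + 1))
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = H[i - 1, j - 1] + column_score(P[i - 1], Q[j - 1], background)
            H[i, j] = max(0.0, match, H[i - 1, j] - gap, H[i, j - 1] - gap)
    return H.max()

rng = np.random.default_rng(4)
bg = np.full(20, 0.05)                          # flat background, assumed
P = rng.dirichlet(np.ones(20) * 0.3, size=30)   # toy 30-column profile
Q = np.vstack([P[5:20],                         # Q shares a 15-column block with P
               rng.dirichlet(np.ones(20) * 0.3, size=10)])
print(f"local profile-profile score: {profile_sw(P, Q, bg):.1f}")
```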

Journal ArticleDOI
23 Sep 2005-Cell
TL;DR: A large, highly connected network of interacting pairs of human proteins was identified; ANP32A and CRMP1 were characterized as modulators of Wnt signaling, and two novel Axin-1 interactions were validated experimentally.

Journal Article
TL;DR: This paper generalizes the well-known notion of a separation margin, derives a corresponding maximum-margin formulation, and presents a cutting-plane algorithm that solves the optimization problem in polynomial time for a large class of problems.
Abstract: Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary issue of designing classification algorithms that can deal with more complex outputs, such as trees, sequences, or sets. More generally, we consider problems involving multiple dependent output variables, structured output spaces, and classification problems with class attributes. In order to accomplish this, we propose to appropriately generalize the well-known notion of a separation margin and derive a corresponding maximum-margin formulation. While this leads to a quadratic program with a potentially prohibitive, i.e. exponential, number of constraints, we present a cutting plane algorithm that solves the optimization problem in polynomial time for a large class of problems. The proposed method has important applications in areas such as computational biology, natural language processing, information retrieval/extraction, and optical character recognition. Experiments from various domains involving different types of output spaces emphasize the breadth and generality of our approach.
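
The maximum-margin formulation the abstract refers to is usually written as the following quadratic program (the standard margin-rescaling form from the structured-output SVM literature, with joint feature map Ψ and output-space loss Δ):

```latex
\min_{\mathbf{w},\,\boldsymbol{\xi}\ge 0}\;
\frac{1}{2}\lVert \mathbf{w}\rVert^{2} + \frac{C}{n}\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad
\langle \mathbf{w},\,\Psi(x_i,y_i)-\Psi(x_i,y)\rangle \;\ge\; \Delta(y_i,y) - \xi_i
\qquad \forall i,\;\forall y \in \mathcal{Y}\setminus\{y_i\}.
```

The constraint set is exponential because y ranges over the whole structured output space; the cutting-plane algorithm sidesteps this by repeatedly finding the most violated constraint for each example and adding it to a working set until all constraints are satisfied within a tolerance.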