
Showing papers by the Max Planck Society, published in 1999


Proceedings ArticleDOI
08 Feb 1999
TL;DR: Support vector machines for dynamic reconstruction of a chaotic system (Davide Mattera and Simon Haykin); using support vector machines for time series prediction (Klaus-Robert Muller et al.); pairwise classification and support vector machines (Ulrich Kressel).
Abstract: Introduction to support vector learning: roadmap.
Part 1, Theory: three remarks on the support vector method of function estimation (Vladimir Vapnik); generalization performance of support vector machines and other pattern classifiers (Peter Bartlett and John Shawe-Taylor); Bayesian voting schemes and large margin classifiers (Nello Cristianini and John Shawe-Taylor); support vector machines, reproducing kernel Hilbert spaces, and randomized GACV (Grace Wahba); geometry and invariance in kernel based methods (Christopher J.C. Burges); on the annealed VC entropy for margin classifiers, a statistical mechanics study (Manfred Opper); entropy numbers, operators and support vector kernels (Robert C. Williamson et al.).
Part 2, Implementations: solving the quadratic programming problem arising in support vector classification (Linda Kaufman); making large-scale support vector machine learning practical (Thorsten Joachims); fast training of support vector machines using sequential minimal optimization (John C. Platt).
Part 3, Applications: support vector machines for dynamic reconstruction of a chaotic system (Davide Mattera and Simon Haykin); using support vector machines for time series prediction (Klaus-Robert Muller et al.); pairwise classification and support vector machines (Ulrich Kressel).
Part 4, Extensions of the algorithm: reducing the run-time complexity in support vector machines (Edgar E. Osuna and Federico Girosi); support vector regression with ANOVA decomposition kernels (Mark O. Stitson et al.); support vector density estimation (Jason Weston et al.); combining support vector and mathematical programming methods for classification (Bernhard Scholkopf et al.).

5,506 citations


Proceedings ArticleDOI
01 Jul 1999
TL;DR: A new technique for modeling textured 3D faces by transforming the shape and texture of example faces into a vector space representation, which regulates the naturalness of modeled faces by avoiding faces with an “unlikely” appearance.
Abstract: In this paper, a new technique for modeling textured 3D faces is introduced. 3D faces can either be generated automatically from one or more photographs, or modeled directly through an intuitive user interface. Users are assisted in two key problems of computer aided face modeling. First, new face images or new 3D face models can be registered automatically by computing dense one-to-one correspondence to an internal face model. Second, the approach regulates the naturalness of modeled faces avoiding faces with an “unlikely” appearance. Starting from an example set of 3D face models, we derive a morphable face model by transforming the shape and texture of the examples into a vector space representation. New faces and expressions can be modeled by forming linear combinations of the prototypes. Shape and texture constraints derived from the statistics of our example faces are used to guide manual modeling or automated matching algorithms. We show 3D face reconstructions from single images and their applications for photo-realistic image manipulations. We also demonstrate face manipulations according to complex parameters such as gender, fullness of a face or its distinctiveness.
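The core linear-combination idea of the morphable model can be sketched in a few lines. This is illustrative code only, not the authors' implementation; the random "faces", array sizes, and function names are invented for the example.

```python
import numpy as np

# Hypothetical illustration: example faces flattened into shape vectors
# (x1, y1, z1, ..., xn, yn, zn); new faces are linear combinations of the
# examples' deviations from the average face.
rng = np.random.default_rng(0)
n_vertices = 5
examples = rng.normal(size=(3, 3 * n_vertices))  # 3 example "face meshes"

mean_shape = examples.mean(axis=0)
offsets = examples - mean_shape                  # per-example deviations

def morph(coeffs):
    """Build a new shape as mean + weighted sum of example deviations."""
    coeffs = np.asarray(coeffs, dtype=float)
    return mean_shape + coeffs @ offsets

new_face = morph([0.6, 0.3, 0.1])                # a blend of the prototypes
```

The same construction applies to the texture vectors; the paper's statistical constraints amount to keeping the coefficients in a region of high probability under the example distribution.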

4,514 citations


Book
01 Jan 1999
TL;DR: Fast and frugal heuristics as discussed by the authors are simple rules for making decisions with realistic mental resources and can enable both living organisms and artificial systems to make smart choices, classifications, and predictions by employing bounded rationality.
Abstract: Fast and frugal heuristics - simple rules for making decisions with realistic mental resources - are presented here. These heuristics can enable both living organisms and artificial systems to make smart choices, classifications, and predictions by employing bounded rationality. But when and how can such fast and frugal heuristics work? What heuristics are in the mind's adaptive toolbox, and what building blocks compose them? Can judgments based simply on a single reason be as accurate as those based on many reasons? Could less knowledge even lead to systematically better predictions than more knowledge? This book explores these questions by developing computational models of heuristics and testing them through experiments and analysis. It shows how fast and frugal heuristics can yield adaptive decisions in situations as varied as choosing a mate, dividing resources among offspring, predicting high school drop-out rates, and playing the stock market.

4,384 citations


Journal ArticleDOI
TL;DR: The model can handle some of the main observations in the domain of speech errors (the major empirical domain for most other theories of lexical access), and the theory opens new ways of approaching the cerebral organization of speech production by way of high-temporal-resolution imaging.
Abstract: Preparing words in speech production is normally a fast and accurate process. We generate them two or three per second in fluent conversation; and overtly naming a clear picture of an object can easily be initiated within 600 msec after picture onset. The underlying process, however, is exceedingly complex. The theory reviewed in this target article analyzes this process as staged and feed-forward. After a first stage of conceptual preparation, word generation proceeds through lexical selection, morphological and phonological encoding, phonetic encoding, and articulation itself. In addition, the speaker exerts some degree of output control, by monitoring of self-produced internal and overt speech. The core of the theory, ranging from lexical selection to the initiation of phonetic encoding, is captured in a computational model, called WEAVER++. Both the theory and the computational model have been developed in interaction with reaction time experiments, particularly in picture naming or related word production paradigms, with the aim of accounting for the real-time processing in normal word production. A comprehensive review of theory, model, and experiments is presented. The model can handle some of the main observations in the domain of speech errors (the major empirical domain for most other theories of lexical access), and the theory opens new ways of approaching the cerebral organization of speech production by way of high-temporal-resolution imaging.

3,958 citations


Book
01 Jan 1999
TL;DR: Tomasello as discussed by the authors argued that the roots of the human capacity for symbol-based culture, and the kind of psychological development that takes place within it, are based in a cluster of unique human cognitive capacities that emerge early in human ontogeny.
Abstract: This work builds a bridge between evolutionary theory and cultural psychology. The author is one of very few people to have done systematic research on the cognitive capacities of both nonhuman primates and human children. This work identifies what the differences are, and suggests where they might have come from. Tomasello argues that the roots of the human capacity for symbol-based culture, and the kind of psychological development that takes place within it, are based in a cluster of unique human cognitive capacities that emerge early in human ontogeny. These include capacities for sharing attention with other persons, for understanding that others have intentions of their own, and for imitating, not just what someone else does, but what someone else has intended to do. In his discussions of language, symbolic representation, and cognitive development, the author describes with authority and ingenuity the "ratchet effect" of these capacities working over evolutionary and historical time to create the kind of cultural artifacts and settings within which each new generation of children develops. He also proposes a novel hypothesis, based on processes of social cognition and cultural evolution, about what makes the cognitive representations of humans different from those of other primates.

3,901 citations


Proceedings ArticleDOI
23 Aug 1999
TL;DR: In this article, a non-linear classification technique based on Fisher's discriminant is proposed; the main ingredient is the kernel trick, which allows the efficient computation of the Fisher discriminant in feature space.
Abstract: A non-linear classification technique based on Fisher's discriminant is proposed. The main ingredient is the kernel trick, which allows the efficient computation of the Fisher discriminant in feature space. The linear classification in feature space corresponds to a (powerful) non-linear decision function in input space. Large scale simulations demonstrate the competitiveness of our approach.
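A minimal sketch of the kernel Fisher discriminant idea: build the kernel matrix, form the class means and within-class scatter in the kernel-induced space, and solve a regularized linear system for the expansion coefficients. The data, RBF kernel width, and ridge constant below are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two Gaussian blobs as toy training data (30 points per class)
X = np.vstack([rng.normal(-1.0, size=(30, 2)), rng.normal(+1.0, size=(30, 2))])
y = np.array([0] * 30 + [1] * 30)

def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(X, X)
idx = [np.where(y == c)[0] for c in (0, 1)]
M = [K[:, i].mean(axis=1) for i in idx]          # class means in expansion space

# Within-class scatter expressed through the kernel matrix
N = np.zeros((len(X), len(X)))
for i in idx:
    Kc = K[:, i]
    l = len(i)
    N += Kc @ (np.eye(l) - np.full((l, l), 1.0 / l)) @ Kc.T

# Ridge term regularizes the (singular) scatter matrix
alpha = np.linalg.solve(N + 1e-3 * np.eye(len(X)), M[1] - M[0])

proj = K @ alpha                                 # 1-D projections of the data
thresh = 0.5 * (proj[idx[0]].mean() + proj[idx[1]].mean())
accuracy = ((proj > thresh).astype(int) == y).mean()
```

A new point x is classified by projecting it with sum_i alpha_i k(x_i, x) and comparing against the same threshold.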

2,896 citations


Journal ArticleDOI
TL;DR: In this paper, a simple model was proposed to estimate the bias of dark matter halos and their spatial distribution on large scales using the unconditional mass function, which was measured in numerical simulations of SCDM, OCDM and ΛCDM.
Abstract: Dark matter haloes are biased tracers of the underlying dark matter distribution. We use a simple model to provide a relation between the abundance of dark matter haloes and their spatial distribution on large scales. Our model shows that knowledge of the unconditional mass function alone is sufficient to provide an accurate estimate of the large-scale bias factor. We then use the mass function measured in numerical simulations of SCDM, OCDM and ΛCDM to compute this bias. Comparison with these simulations shows that this simple way of estimating the bias relation and its evolution is accurate for less massive haloes as well as massive ones. In particular, we show that haloes that are less/more massive than typical M* haloes at the time they form are more/less strongly clustered than is predicted by formulae based on the standard Press–Schechter mass function.
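For context, the "formulae based on the standard Press–Schechter mass function" referred to above give a bias that depends only on the peak height nu = delta_c / sigma(M); a commonly quoted form (the Mo & White expression) is b = 1 + (nu^2 - 1)/delta_c. A sketch, with the usual spherical-collapse threshold taken as an assumed standard value:

```python
import numpy as np

DELTA_C = 1.686  # spherical-collapse overdensity threshold (assumed standard value)

def bias_ps(nu):
    """Large-scale halo bias from the standard Press-Schechter mass function
    (Mo & White form), with nu = delta_c / sigma(M)."""
    nu = np.asarray(nu, dtype=float)
    return 1.0 + (nu ** 2 - 1.0) / DELTA_C

# Rare, massive haloes (nu > 1) come out more clustered than the mass field,
# common low-mass haloes (nu < 1) less so. The paper's point is that
# simulations show this PS-based formula misestimates both regimes,
# which an improved mass function corrects.
b_rare, b_common = bias_ps(2.0), bias_ps(0.5)
```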

2,766 citations


Book
01 Jan 1999
TL;DR: In this book, the author examines political democracy in a capitalist economy, covering negative and positive integration, regulatory competition and re-regulation, national solutions without boundary control, the European contribution, and multi-level problem-solving in Europe.
Abstract: Contents: 1. Political Democracy in a Capitalist Economy; 2. Negative and Positive Integration; 3. Regulatory Competition and Re-Regulation; 4. National Solutions without Boundary Control; 5. The European Contribution; Conclusion: Multi-level Problem-Solving in Europe.

2,726 citations


Journal ArticleDOI
24 Dec 1999-Science
TL;DR: Two areas were found, in the left inferior frontal cortex and the right superior parietal lobule, that become active during finger movement regardless of how it is evoked, and whose activation increases when the same movement is elicited by the observation of an identical movement made by another individual.
Abstract: How does imitation occur? How can the motor plans necessary for imitating an action derive from the observation of that action? Imitation may be based on a mechanism directly matching the observed action onto an internal motor representation of that action (“direct matching hypothesis”). To test this hypothesis, normal human participants were asked to observe and imitate a finger movement and to perform the same movement after spatial or symbolic cues. Brain activity was measured with functional magnetic resonance imaging. If the direct matching hypothesis is correct, there should be areas that become active during finger movement, regardless of how it is evoked, and their activation should increase when the same movement is elicited by the observation of an identical movement made by another individual. Two areas with these properties were found in the left inferior frontal cortex (opercular region) and the rostral-most region of the right superior parietal lobule. Imitation has a central role in human development and learning of motor, communicative, and social skills (1, 2). However, the neural basis of imitation and its functional mechanisms are poorly understood. Data from patients with brain lesions suggest that frontal and parietal regions may be critical for human imitation (3) but do not provide insights on the mechanisms underlying it. Models of imitation based on instrumental

2,536 citations


Journal ArticleDOI
21 May 1999-Science
TL;DR: Predictions based on measurements suggest that actuators using optimized nanotube sheets may eventually provide substantially higher work densities per cycle than any previously known technology.
Abstract: Electromechanical actuators based on sheets of single-walled carbon nanotubes were shown to generate higher stresses than natural muscle and higher strains than high-modulus ferroelectrics. Like natural muscles, the macroscopic actuators are assemblies of billions of individual nanoscale actuators. The actuation mechanism (quantum chemical-based expansion due to electrochemical double-layer charging) does not require ion intercalation, which limits the life and rate of faradaic conducting polymer actuators. Unlike conventional ferroelectric actuators, low operating voltages of a few volts generate large actuator strains. Predictions based on measurements suggest that actuators using optimized nanotube sheets may eventually provide substantially higher work densities per cycle than any previously known technology.

2,334 citations


Journal ArticleDOI
TL;DR: Two supplementary versions of probe EUB338 are designed and evaluated for in situ detection of most of the phyla not detected with the original probe, which should allow more accurate quantification of members of the domain Bacteria in future molecular ecological studies.


Journal ArticleDOI
TL;DR: A focus of this review is nuclear export of messenger RNA, which apparently largely relies on export mediators distinct from importin beta-related factors.
Abstract: ▪ Abstract The compartmentation of eukaryotic cells requires all nuclear proteins to be imported from the cytoplasm, whereas, for example, transfer RNAs, messenger RNAs, and ribosomes are made in the nucleus and need to be exported to the cytoplasm. Nuclear import and export proceed through nuclear pore complexes and can occur along a great number of distinct pathways, many of which are mediated by importin β-related nuclear transport receptors. These receptors shuttle between nucleus and cytoplasm, and they bind transport substrates either directly or via adapter molecules. They all cooperate with the RanGTPase system to regulate the interactions with their cargoes. Another focus of our review is nuclear export of messenger RNA, which apparently largely relies on export mediators distinct from importin β-related factors. We discuss mechanistic aspects and the energetics of transport receptor function and describe a number of pathways in detail.

Journal ArticleDOI
17 Jun 1999-Nature
TL;DR: It is found that 39 different behaviour patterns, including tool usage, grooming and courtship behaviours, are customary or habitual in some communities but are absent in others where ecological explanations have been discounted.
Abstract: As an increasing number of field studies of chimpanzees (Pan troglodytes) have achieved long-term status across Africa, differences in the behavioural repertoires described have become apparent that suggest there is significant cultural variation1,2,3,4,5,6,7. Here we present a systematic synthesis of this information from the seven most long-term studies, which together have accumulated 151 years of chimpanzee observation. This comprehensive analysis reveals patterns of variation that are far more extensive than have previously been documented for any animal species except humans8,9,10,11. We find that 39 different behaviour patterns, including tool usage, grooming and courtship behaviours, are customary or habitual in some communities but are absent in others where ecological explanations have been discounted. Among mammalian and avian species, cultural variation has previously been identified only for single behaviour patterns, such as the local dialects of song-birds12,13. The extensive, multiple variations now documented for chimpanzees are thus without parallel. Moreover, the combined repertoire of these behaviour patterns in each chimpanzee community is itself highly distinctive, a phenomenon characteristic of human cultures14 but previously unrecognised in non-human species.

Journal ArticleDOI
TL;DR: In eukaryotic cells, most proteins in the cytosol and nucleus are degraded via the ubiquitin-proteasome pathway; the 26S proteasome is a 2.5-MDa molecular machine built from approximately 31 different subunits, which catalyzes protein degradation.
Abstract: ▪ Abstract In eukaryotic cells, most proteins in the cytosol and nucleus are degraded via the ubiquitin-proteasome pathway. The 26S proteasome is a 2.5-MDa molecular machine built from ∼31 different subunits, which catalyzes protein degradation. It contains a barrel-shaped proteolytic core complex (the 20S proteasome), capped at one or both ends by 19S regulatory complexes, which recognize ubiquitinated proteins. The regulatory complexes are also implicated in unfolding and translocation of ubiquitinated targets into the interior of the 20S complex, where they are degraded to oligopeptides. Structure, assembly and enzymatic mechanism of the 20S complex have been elucidated, but the functional organization of the 19S complex is less well understood. Most subunits of the 19S complex have been identified, however, specific functions have been assigned to only a few. A low-resolution structure of the 26S proteasome has been obtained by electron microscopy, but the precise arrangement of subunits in the 19S co...

Journal ArticleDOI
05 Nov 1999-Science
TL;DR: Niche complementarity and positive species interactions appear to play a role in generating diversity-productivity relationships within sites in addition to sampling from the species pool.
Abstract: At eight European field sites, the impact of loss of plant diversity on primary productivity was simulated by synthesizing grassland communities with different numbers of plant species. Results differed in detail at each location, but there was an overall log-linear reduction of average aboveground biomass with loss of species. For a given number of species, communities with fewer functional groups were less productive. These diversity effects occurred along with differences associated with species composition and geographic location. Niche complementarity and positive species interactions appear to play a role in generating diversity-productivity relationships within sites in addition to sampling from the species pool.

Journal ArticleDOI
06 May 1999-Nature
TL;DR: After induction of long-lasting (but not short-lasting) functional enhancement of synapses in area CA1, new spines appear on the postsynaptic dendrite, whereas in control regions on the same dendrites or in slices where long-term potentiation was blocked, no significant spine growth occurred.
Abstract: Long-term enhancement of synaptic efficacy in the hippocampus is an important model for studying the cellular mechanisms of neuronal plasticity, circuit reorganization, and even learning and memory. Although these long-lasting functional changes are easy to induce, it has been very difficult to demonstrate that they are accompanied or even caused by morphological changes on the subcellular level. Here we combined a local superfusion technique with two-photon imaging, which allowed us to scrutinize specific regions of the postsynaptic dendrite where we knew that the synaptic changes had to occur. We show that after induction of long-lasting (but not short-lasting) functional enhancement of synapses in area CA1, new spines appear on the postsynaptic dendrite, whereas in control regions on the same dendrite or in slices where long-term potentiation was blocked, no significant spine growth occurred.

Journal ArticleDOI
TL;DR: In this article, an overview of the actual knowledge of the biogenic emissions of some volatile organic compounds (VOCs), i.e., isoprene, terpenes, alkanes, alkenes, carbonyls, alcohols, esters, and acids, is presented.
Abstract: This overview compiles the actual knowledge of the biogenic emissions of some volatile organic compounds (VOCs), i.e., isoprene, terpenes, alkanes, alkenes, alcohols, esters, carbonyls, and acids. We discuss VOC biosynthesis, emission inventories, relations between emission and plant physiology as well as temperature and radiation, and ecophysiological functions. For isoprene and monoterpenes, an extended summary of standard emission factors, with data related to the plant genus and species, is included. The data compilation shows that we have quite a substantial knowledge of the emission of isoprene and monoterpenes, including emission rates, emission regulation, and biosynthesis. The situation is worse in the case of numerous other compounds (other VOCs or OVOCs) being emitted by the biosphere. This is reflected in the insufficient knowledge of emission rates and biological functions. Except for the terpenoids, only a limited number of studies of OVOCs are available; data are summarized for alkanes, alkenes, carbonyls, alcohols, acids, and esters. In addition to closing these gaps of knowledge, one of the major objectives for future VOC research is improving our knowledge of the fate of organic carbon in the atmosphere, ending up in oxidation products and/or as aerosol particles.

Journal ArticleDOI
TL;DR: Heteronuclear multidimensional NMR experiments for the structure determination of proteins in solution employing pulsed field gradients.

Journal ArticleDOI
TL;DR: The current status of the technique and recommendations on the measurement of [18F]-FDG uptake for tumour response monitoring from a consensus meeting of the EORTC PET study group held in Brussels in February 1998 and confirmed at a subsequent meeting in March 1999 are summarized.

Journal ArticleDOI
TL;DR: A review of recent research across several disciplines not surprisingly finds a wide variety of descriptions surrounding meanings, processes, scales and methods concerning the notion of transnationalism as discussed by the authors, and several clusters or themes are suggested by way of disentangling the term.
Abstract: A review of recent research across several disciplines not surprisingly finds a wide variety of descriptions surrounding meanings, processes, scales and methods concerning the notion of 'transnationalism'. Here, several clusters or themes are suggested by way of disentangling the term. These include transnationalism as a social morphology, as a type of consciousness, as a mode of cultural reproduction, as an avenue of capital, as a site of political engagement, and as a reconstruction of 'place' or locality. These and other approaches to transnationalism are being explored in a newly commissioned ESRC research programme on Transnational Communities (see http://www.transcomm.ox.ac.uk).

PatentDOI
14 Dec 1999-Nature
TL;DR: In this article, agents and methods for growth factor receptor activation by modulating the G-protein mediated signal transduction pathway were described, and a method to activate the growth factor receptors was proposed.
Abstract: The present invention relates to agents and methods for growth-factor receptor activation by modulating the G-protein mediated signal transduction pathway.

Journal ArticleDOI
TL;DR: The successes, as well as the limits, of perturbation theory are presented, and its role in the emerging era of numerical relativity and supercomputers is discussed.
Abstract: Perturbations of stars and black holes have been one of the main topics of relativistic astrophysics for the last few decades. They are of particular importance today, because of their relevance to gravitational wave astronomy. In this review we present the theory of quasi-normal modes of compact objects from both the mathematical and astrophysical points of view. The discussion includes perturbations of black holes (Schwarzschild, Reissner-Nordstrom, Kerr and Kerr-Newman) and relativistic stars (non-rotating and slowly-rotating). The properties of the various families of quasi-normal modes are described, and numerical techniques for calculating quasi-normal modes reviewed. The successes, as well as the limits, of perturbation theory are presented, and its role in the emerging era of numerical relativity and supercomputers is discussed.

Journal ArticleDOI
18 Feb 1999-Nature
TL;DR: In this article, the authors used ab initio path integral simulations to study the structure of the hydrated proton, finding that it forms a fluxional defect in the hydrogen-bonded network, with both H9O4+ and H5O2+ occurring only in the sense of "limiting" or "ideal" structures.
Abstract: Explanations for the anomalously high mobility of protons in liquid water began with Grotthuss's idea1,2 of ‘structural diffusion’ nearly two centuries ago. Subsequent explanations have refined this concept by invoking thermal hopping3,4, proton tunnelling5,6 or solvation effects7. More recently, two main structural models have emerged for the hydrated proton. Eigen8,9 proposed the formation of an H9O4+ complex in which an H3O+ core is strongly hydrogen-bonded to three H2O molecules. Zundel10,11, meanwhile, supported the notion of an H5O2+ complex in which the proton is shared between two H2O molecules. Here we use ab initio path integral12,13,14 simulations to address this question. These simulations include time-independent equilibrium thermal and quantum fluctuations of all nuclei, and determine interatomic interactions from the electronic structure. We find that the hydrated proton forms a fluxional defect in the hydrogen-bonded network, with both H9O4+ and H5O2+ occurring only in the sense of ‘limiting’ or ‘ideal’ structures. The defect can become delocalized over several hydrogen bonds owing to quantum fluctuations. Solvent polarization induces a small barrier to proton transfer, which is washed out by zero-point motion. The proton can consequently be considered part of a ‘low-barrier hydrogen bond’15,16, in which tunnelling is negligible and the simplest concepts of transition-state theory do not apply. The rate of proton diffusion is determined by thermally induced hydrogen-bond breaking in the second solvation shell.

Journal ArticleDOI
26 May 1999-Chaos
TL;DR: In this paper, the authors describe the implementation of methods of nonlinear time series analysis which are based on the paradigm of deterministic chaos and present a variety of algorithms for data representation, prediction, noise reduction, dimension and Lyapunov estimation.
Abstract: We describe the implementation of methods of nonlinear time series analysis which are based on the paradigm of deterministic chaos. A variety of algorithms for data representation, prediction, noise reduction, dimension and Lyapunov estimation, and nonlinearity testing are discussed with particular emphasis on issues of implementation and choice of parameters. Computer programs that implement the resulting strategies are publicly available as the TISEAN software package. The use of each algorithm will be illustrated with a typical application. As to the theoretical background, we will essentially give pointers to the literature. (c) 1999 American Institute of Physics.
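The delay-coordinate embedding step underlying most of the TISEAN routines can be illustrated with a toy example. This is not TISEAN code; the logistic-map series and the zeroth-order nearest-neighbour predictor are stand-ins chosen for brevity.

```python
import numpy as np

# Toy deterministic-chaos series: the logistic map at r = 3.9
# (in practice the input would be a measured scalar time series).
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

def delay_embed(series, dim, tau):
    """Delay embedding: row t is (s[t], s[t+tau], ..., s[t+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

emb = delay_embed(x, dim=2, tau=1)      # shape (499, 2)

# Zeroth-order prediction: find the nearest neighbour of the latest delay
# vector among all earlier windows and use that neighbour's successor.
dists = np.linalg.norm(emb[:-1] - emb[-1], axis=1)
nn = int(np.argmin(dists))
prediction = x[nn + 2]                  # successor of the neighbouring window
truth = 3.9 * x[499] * (1.0 - x[499])   # what the map actually does next
error = abs(prediction - truth)
```

The same embedded vectors feed the dimension, Lyapunov, and noise-reduction algorithms the review discusses; the choice of dim and tau is itself one of the parameter-selection issues the paper emphasizes.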

Journal ArticleDOI
TL;DR: In this article, the effects of chirality and the structures of simple lipids are described in detail, including structures revealed by x-ray-diffraction experiments, computer simulations, molecular models, and a phenomenological theory of phase transitions.
Abstract: Lipid monolayers on the surface of water have been studied for over a hundred years, but in the last decade there has been a dramatic evolution in our understanding of the structures and phase transitions of these systems, driven by new experimental techniques and theoretical advances. In this review, dense monolayers of simple lipids are described in detail, including structures revealed by x-ray-diffraction experiments, computer simulations, molecular models, and a phenomenological theory of phase transitions. The effects of chirality and the structures of phospholipid monolayers are considered. Open questions and possible approaches to finding answers are discussed.

Journal ArticleDOI
TL;DR: This report describes the systematic and up-to-date analysis of genomes (PEDANT), the comprehensive database of the yeast genome (CYGD), the database reflecting the progress in sequencing the Arabidopsis thaliana genome (MATDB), the database of annotated human EST clusters (HIB), and the collection of protein sequence data within the framework of the PIR-International Protein Sequence Database (described elsewhere in this volume).
Abstract: The Munich Information Center for Protein Sequences (MIPS-GSF, Neuherberg, Germany) continues to provide genome-related information in a systematic way. MIPS supports both national and European sequencing and functional analysis projects, develops and maintains automatically generated and manually annotated genome-specific databases, develops systematic classification schemes for the functional annotation of protein sequences, and provides tools for the comprehensive analysis of protein sequences. This report updates the information on the yeast genome (CYGD), the Neurospora crassa genome (MNCDB), the databases for the comprehensive set of genomes (PEDANT genomes), the database of annotated human EST clusters (HIB), the database of complete cDNAs from the DHGP (German Human Genome Project), as well as the project specific databases for the GABI (Genome Analysis in Plants) and HNB (Helmholtz-Netzwerk Bioinformatik) networks. The Arabidopsis thaliana database (MATDB), the database of mitochondrial proteins (MITOP) and our contribution to the PIR International Protein Sequence Database have been described elsewhere [Schoof et al. (2002) Nucleic Acids Res., 30, 91-93; Scharfe et al. (2000) Nucleic Acids Res., 28, 155-158; Barker et al. (2001) Nucleic Acids Res., 29, 29-32]. All databases described, the protein analysis tools provided and the detailed descriptions of our projects can be accessed through the MIPS World Wide Web server (http://mips.gsf.de).

Journal ArticleDOI
TL;DR: In this paper, the physical decay properties of the density matrix were studied for both metals and insulators, and several strategies for constructing O(N) algorithms were presented and critically examined.
Abstract: Methods exhibiting linear scaling with respect to the size of the system, the so-called O(N) methods, are an essential tool for the calculation of the electronic structure of large systems containing many atoms. They are based on algorithms that take advantage of the decay properties of the density matrix. In this article the physical decay properties of the density matrix will first be studied for both metals and insulators. Several strategies for constructing O(N) algorithms will then be presented and critically examined. Some issues that are relevant only for self-consistent O(N) methods, such as the calculation of the Hartree potential and mixing issues, will also be discussed. Finally some typical applications of O(N) methods are briefly described.

Journal ArticleDOI
22 Apr 1999-Nature
TL;DR: In this paper, the authors present results from a global climate model with sufficient resolution in the tropics to adequately represent the narrow equatorial upwelling and low-frequency waves; when the model is forced by a realistic future scenario of increasing greenhouse-gas concentrations, more frequent El Nino-like conditions and stronger cold events result in the tropical Pacific Ocean.
Abstract: The El Nino/Southern Oscillation (ENSO) phenomenon is the strongest natural interannual climate fluctuation1. ENSO originates in the tropical Pacific Ocean and has large effects on the ecology of the region, but it also influences the entire global climate system and affects the societies and economies of many countries2. ENSO can be understood as an irregular low-frequency oscillation between a warm (El Nino) and a cold (La Nina) state. The strong El Ninos of 1982/1983 and 1997/1998, along with the more frequent occurrences of El Ninos during the past few decades, raise the question of whether human-induced ‘greenhouse’ warming affects, or will affect, ENSO3. Several global climate models have been applied to transient greenhouse-gas-induced warming simulations to address this question4,6, but the results have been debated owing to the inability of the models to fully simulate ENSO (because of their coarse equatorial resolution)7. Here we present results from a global climate model with sufficient resolution in the tropics to adequately represent the narrow equatorial upwelling and low-frequency waves. When the model is forced by a realistic future scenario of increasing greenhouse-gas concentrations, more frequent El-Nino-like conditions and stronger cold events in the tropical Pacific Ocean result.

Journal ArticleDOI
TL;DR: The geometry of feature space is reviewed, and the connection between feature space and input space is discussed by dealing with the question of how one can, given some vector in feature space, find a preimage in input space.
Abstract: This paper collects some ideas targeted at advancing our understanding of the feature spaces associated with support vector (SV) kernel functions. We first discuss the geometry of feature space. In particular, we review what is known about the shape of the image of input space under the feature space map, and how this influences the capacity of SV methods. Following this, we describe how the metric governing the intrinsic geometry of the mapped surface can be computed in terms of the kernel, using the example of the class of inhomogeneous polynomial kernels, which are often used in SV pattern recognition. We then discuss the connection between feature space and input space by dealing with the question of how one can, given some vector in feature space, find a preimage (exact or approximate) in input space. We describe algorithms to tackle this issue, and show their utility in two applications of kernel methods. First, we use it to reduce the computational complexity of SV decision functions; second, we combine it with the kernel PCA algorithm, thereby constructing a nonlinear statistical denoising technique which is shown to perform well on real-world data.
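One well-known approach to the preimage problem for Gaussian kernels is a fixed-point iteration on the input-space point z. The sketch below illustrates that idea for the simplest case, the feature-space mean of a point cloud; the data, kernel width, and iteration count are invented, and this is not claimed to be the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(loc=1.0, scale=0.3, size=(20, 2))  # a cluster of input points
gamma = np.full(20, 1.0 / 20)                     # coefficients of Psi = mean of mapped points
sigma2 = 1.0                                      # Gaussian kernel width

def rbf_preimage(X, gamma, sigma2, iters=50):
    """Fixed-point iteration for an approximate preimage of
    Psi = sum_i gamma_i phi(x_i) under the kernel k(x, z) = exp(-|x-z|^2 / (2*sigma2)).
    Each step re-weights the x_i by their kernel similarity to the current z."""
    z = X.mean(axis=0)                            # a sensible initialization
    for _ in range(iters):
        w = gamma * np.exp(-((X - z) ** 2).sum(axis=1) / (2 * sigma2))
        z = (w[:, None] * X).sum(axis=0) / w.sum()
    return z

z = rbf_preimage(X, gamma, sigma2)
```

In the denoising application described above, the gamma_i would come from projecting a noisy point onto the leading kernel PCA components rather than from a uniform average.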