
Showing papers by "Santa Fe Institute" published in 2014


Book
02 Oct 2014
TL;DR: After two centuries of studying equilibria (static patterns that call for no further behavioral adjustments), economists are beginning to study the general emergence of structures and the unfolding of patterns in the economy.
Abstract: After two centuries of studying equilibria—static patterns that call for no further behavioral adjustments—economists are beginning to study the general emergence of structures and the unfolding of patterns in the economy. When viewed in out-of-equilibrium formation, economic patterns sometimes simplify into the simple static equilibria of standard economics. More often they are ever changing, showing perpetually novel behavior and emergent phenomena. Complexity portrays the economy not as deterministic, predictable, and mechanistic, but as process dependent, organic, and always evolving.

783 citations


Journal ArticleDOI
TL;DR: In this article, the authors model contagions and cascades of failures among organizations linked through a network of financial interdependencies and identify how the network propagates discontinuous changes in asset values triggered by failures.
Abstract: We model contagions and cascades of failures among organizations linked through a network of financial interdependencies. We identify how the network propagates discontinuous changes in asset values triggered by failures (e.g., bankruptcies, defaults, and other insolvencies) and use that to study the consequences of integration (each organization becoming more dependent on its counterparties) and diversification (each organization interacting with a larger number of counterparties). Integration and diversification have different, nonmonotonic effects on the extent of cascades. Initial increases in diversification connect the network which permits cascades to propagate further, but eventually, more diversification makes contagion between any pair of organizations less likely as they become less dependent on each other. Integration also faces tradeoffs: increased dependence on other organizations versus less sensitivity to own investments. Finally, we illustrate some aspects of the model with data on European debt cross-holdings.

760 citations
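The cascade mechanism described in this entry can be illustrated with a small numerical sketch. The code below is a minimal, illustrative implementation of a cross-holdings cascade in the spirit of the model described above: organizations hold primitive assets and shares of one another, and a failure imposes a discontinuous cost that can push counterparties below their own failure thresholds. The matrices, thresholds, and failure costs are assumptions for illustration, not the paper's calibration.

```python
import numpy as np

def cascade_values(D, C, p, thresholds, failure_costs, max_rounds=100):
    """Illustrative cross-holdings cascade (a sketch, not the paper's calibrated model).

    D: (n, m) direct holdings of n organizations in m primitive assets.
    C: (n, n) cross-holdings; C[i, j] = fraction of organization j's value held by i
       (zero diagonal, column sums strictly below 1).
    p: (m,) prices of the primitive assets.
    thresholds, failure_costs: (n,) failure threshold and discontinuous cost per organization.
    """
    n = C.shape[0]
    C_hat = np.diag(1.0 - C.sum(axis=0))          # share of each organization not cross-held by others
    A = C_hat @ np.linalg.inv(np.eye(n) - C)      # maps primitive asset income to market values
    failed = np.zeros(n, dtype=bool)
    v = A @ (D @ p)
    for _ in range(max_rounds):
        v = A @ (D @ p - failure_costs * failed)  # market values, net of failure costs incurred so far
        newly_failed = (~failed) & (v < thresholds)
        if not newly_failed.any():
            break
        failed |= newly_failed                    # failures feed back into counterparties' values
    return v, failed

# Toy example: a drop in one asset price can trigger a chain of failures.
D = np.eye(3)                                     # each organization holds one primitive asset
C = np.array([[0.0, 0.3, 0.0],
              [0.3, 0.0, 0.3],
              [0.0, 0.3, 0.0]])
v, failed = cascade_values(D, C, p=np.array([0.2, 1.0, 1.0]),
                           thresholds=np.full(3, 0.7), failure_costs=np.full(3, 0.4))
print(v, failed)
```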


Journal ArticleDOI
TL;DR: This work couples fine-grained climate projections to thermal performance data from 38 ectothermic invertebrate species and contrasts the resulting projections with those of a simple model, showing that projections based on mean temperature change alone differ substantially from those incorporating changes to the variation, and to the mean and variation in concert.
Abstract: Increases in the frequency, severity and duration of temperature extremes are anticipated in the near future. Although recent work suggests that changes in temperature variation will have disproportionately greater effects on species than changes to the mean, much of climate change research in ecology has focused on the impacts of mean temperature change. Here, we couple fine-grained climate projections (2050–2059) to thermal performance data from 38 ectothermic invertebrate species and contrast projections with those of a simple model. We show that projections based on mean temperature change alone differ substantially from those incorporating changes to the variation, and to the mean and variation in concert. Although most species show increases in performance at greater mean temperatures, the effect of mean and variance change together yields a range of responses, with temperate species at greatest risk of performance declines. Our work highlights the importance of using fine-grained temporal data to incorporate the full extent of temperature variation when assessing and projecting performance.

714 citations
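The contrast drawn above between mean-only and mean-plus-variance projections follows from the nonlinearity of thermal performance curves: performance averaged over variable temperatures is not the same as performance evaluated at the average temperature. The sketch below illustrates the point with a hypothetical Gaussian performance curve and synthetic temperatures; the curve, parameter values, and temperature distributions are assumptions for illustration only, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

def performance(temp, t_opt=25.0, breadth=5.0):
    # Hypothetical Gaussian thermal performance curve (illustrative only).
    return np.exp(-((temp - t_opt) ** 2) / (2 * breadth ** 2))

baseline = rng.normal(22.0, 2.0, 100_000)     # synthetic "current" temperatures
mean_shift, extra_sd = 2.0, 3.0               # assumed warming and added variability

proj_mean_only = performance(baseline + mean_shift).mean()
proj_mean_and_variance = performance(
    baseline + mean_shift + rng.normal(0.0, extra_sd, baseline.size)
).mean()

print(f"mean change only:        {proj_mean_only:.3f}")
print(f"mean + variance change:  {proj_mean_and_variance:.3f}")
```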


Journal ArticleDOI
TL;DR: In this paper, the authors study cascades of failures in a network of interdependent financial organizations: how discontinuous changes in asset values (e.g., defaults and shutdowns) trigger further failures, and how this depends on network structure.
Abstract: We study cascades of failures in a network of interdependent financial organizations: how discontinuous changes in asset values (e.g., defaults and shutdowns) trigger further failures, and how this depends on network structure. Integration (greater dependence on counterparties) and diversification (more counterparties per organization) have different, nonmonotonic effects on the extent of cascades. Diversification connects the network initially, permitting cascades to travel; but as it increases further, organizations are better insured against one another’s failures. Integration also faces trade-offs: increased dependence on other organizations versus less sensitivity to own investments. Finally, we illustrate the model with data on European debt cross-holdings. (JEL D85, F15, F34, F36, F65, G15, G32, G33, G38) Globalization brings with it increased financial interdependencies among many kinds of organizations—governments, central banks, investment banks, firms, etc.—that hold each other’s shares, debts, and other obligations. Such interdependencies can lead to cascading defaults and failures, which are often avoided through massive bailouts of institutions deemed “too big to fail.” Recent examples include the US government’s interventions in AIG, Fannie Mae, Freddie Mac, and General Motors; and the European Commission’s interventions in Greece and Spain. Although such bailouts circumvent the widespread failures that were more prevalent in the nineteenth and early twentieth centuries, they emphasize the need to study the risks created by a network of interdependencies. Understanding these risks is crucial to designing incentives and regulatory responses which defuse cascades before they are imminent. In this paper we develop a general model that produces new insights regarding financial contagions and cascades of failures among organizations linked through a network of financial interdependencies. Organizations’ values depend on each other—e.g., through cross-holdings of shares, debt, or other liabilities.

639 citations


Journal ArticleDOI
TL;DR: It is shown how functional biogeography bridges species-based biogeography and earth science to provide ideas and tools to help explain gradients in multifaceted diversity (including species, functional, and phylogenetic diversities), predict ecosystem functioning and services worldwide, and infuse regional and global conservation programs with a functional basis.
Abstract: Understanding, modeling, and predicting the impact of global change on ecosystem functioning across biogeographical gradients can benefit from enhanced capacity to represent biota as a continuous distribution of traits. However, this is a challenge for the field of biogeography historically grounded on the species concept. Here we focus on the newly emergent field of functional biogeography: the study of the geographic distribution of trait diversity across organizational levels. We show how functional biogeography bridges species-based biogeography and earth science to provide ideas and tools to help explain gradients in multifaceted diversity (including species, functional, and phylogenetic diversities), predict ecosystem functioning and services worldwide, and infuse regional and global conservation programs with a functional basis. Although much recent progress has been made possible because of the rising of multiple data streams, new developments in ecoinformatics, and new methodological advances, future directions should provide a theoretical and comprehensive framework for the scaling of biotic interactions across trophic levels and its ecological implications.

517 citations


Journal ArticleDOI
TL;DR: This work highlights the conceptual and computational issues that have prevented a more direct approach to measuring hypervolumes and presents a new multivariate kernel density estimation method that resolves many of these problems in an arbitrary number of dimensions.
Abstract: Aim: The Hutchinsonian hypervolume is the conceptual foundation for many lines of ecological and evolutionary inquiry, including functional morphology, comparative biology, community ecology and niche theory. However, extant methods to sample from hypervolumes or measure their geometry perform poorly on high-dimensional or holey datasets. Innovation: We first highlight the conceptual and computational issues that have prevented a more direct approach to measuring hypervolumes. Next, we present a new multivariate kernel density estimation method that resolves many of these problems in an arbitrary number of dimensions. Main conclusions: We show that our method (implemented as the ‘hypervolume’ R package) can match several extant methods for hypervolume geometry and species distribution modelling. Tools to quantify high-dimensional ecological hypervolumes will enable a wide range of fundamental descriptive, inferential and comparative questions to be addressed.

484 citations
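For readers who want a feel for the kernel-density approach described above, the sketch below estimates an n-dimensional hypervolume by thresholding a Gaussian kernel density estimate and measuring the enclosed region with uniform Monte Carlo sampling. It is a simplified illustration of the general idea only; the actual 'hypervolume' R package uses more careful bandwidth selection and importance sampling, and the uniform sampling written here degrades quickly beyond a handful of dimensions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_hypervolume(points, quantile=0.95, n_samples=100_000, seed=0):
    """Rough n-dimensional hypervolume estimate from a kernel density estimate.

    points: (n_obs, n_dims) array of trait observations.
    Returns the volume of the region whose estimated density exceeds a threshold
    chosen to enclose roughly `quantile` of the observations.
    """
    rng = np.random.default_rng(seed)
    kde = gaussian_kde(points.T)
    dens_at_points = kde(points.T)
    threshold = np.quantile(dens_at_points, 1.0 - quantile)   # density cutoff
    # uniform Monte Carlo samples over a padded bounding box
    lo, hi = points.min(axis=0), points.max(axis=0)
    pad = 0.5 * (hi - lo)
    samples = rng.uniform(lo - pad, hi + pad, size=(n_samples, points.shape[1]))
    inside = kde(samples.T) >= threshold
    box_volume = np.prod((hi + pad) - (lo - pad))
    return box_volume * inside.mean()
```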


Journal ArticleDOI
TL;DR: In this paper, the authors present a mechanistic model for the thermal response of consumer-resource interactions, which predicts that temperature affects species interactions via key traits such as body velocity, detection distance, search rate and handling time.
Abstract: Summary Environmental temperature has systematic effects on rates of species interactions, primarily through its influence on organismal physiology. We present a mechanistic model for the thermal response of consumer–resource interactions. We focus on how temperature affects species interactions via key traits – body velocity, detection distance, search rate and handling time – that underlie per capita consumption rate. The model is general because it applies to all foraging strategies: active-capture (both consumer and resource body velocity are important), sit-and-wait (resource velocity dominates) and grazing (consumer velocity dominates). The model predicts that temperature influences consumer–resource interactions primarily through its effects on body velocity (either of the consumer, resource or both), which determines how often consumers and resources encounter each other, and that asymmetries in the thermal responses of interacting species can introduce qualitative, not just quantitative, changes in consumer–resource dynamics. We illustrate this by showing how asymmetries in thermal responses determine equilibrium population densities in interacting consumer–resource pairs. We test for the existence of asymmetries in consumer–resource thermal responses by analysing an extensive database on thermal response curves of ecological traits for 309 species spanning 15 orders of magnitude in body size from terrestrial, marine and freshwater habitats. We find that asymmetries in consumer–resource thermal responses are likely to be a common occurrence. Overall, our study reveals the importance of asymmetric thermal responses in consumer–resource dynamics. In particular, we identify three general types of asymmetries: (i) different levels of performance of the response, (ii) different rates of response (e.g. activation energies) and (iii) different peak or optimal temperatures. Such asymmetries should occur more frequently as the climate changes and species' geographical distributions and phenologies are altered, such that previously noninteracting species come into contact. By using characteristics of trophic interactions that are often well known, such as body size, foraging strategy, thermy and environmental temperature, our framework should allow more accurate predictions about the thermal dependence of consumer–resource interactions. Ultimately, integration of our theory into models of food web and ecosystem dynamics should be useful in understanding how natural systems will respond to current and future temperature change.

372 citations
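The trait-level thermal responses described above are commonly modelled with a Boltzmann-Arrhenius factor, and per capita consumption is commonly assembled from search rate and handling time via a type II functional response. The equations below sketch that standard formulation; the paper's specific functional forms and fitted parameters may differ.

```latex
r(T) \;=\; r_0\, e^{-E/(kT)},
\qquad
f(R, T) \;=\; \frac{a(T)\,R}{1 + a(T)\,h(T)\,R}
```

Here r(T) stands for any rate trait (body velocity, search rate a, or the inverse of handling time h), E is an activation energy, k is Boltzmann's constant, T is absolute temperature, and R is resource density. Asymmetries of the kind identified in the paper arise when the consumer's and the resource's traits have different activation energies, different performance levels, or different optimal temperatures.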


Journal ArticleDOI
TL;DR: It is found in numerical simulations of artificially generated power grids that tree-like connection schemes (so-called dead ends and dead trees) strongly diminish stability, which may indicate a topological design principle for future power grids: avoid dead ends.
Abstract: The cheapest and thus widespread way to add new generators to a high-voltage power grid is by a simple tree-like connection scheme. However, it is not entirely clear how such locally cost-minimizing connection schemes affect overall system performance, in particular the stability against blackouts. Here we investigate how local patterns in the network topology influence a power grid's ability to withstand blackout-prone large perturbations. Employing basin stability, a nonlinear concept, we find in numerical simulations of artificially generated power grids that tree-like connection schemes (so-called dead ends and dead trees) strongly diminish stability. A case study of the Northern European power system confirms this result and demonstrates that the inverse is also true: repairing dead ends by addition of a few transmission lines substantially enhances stability. This may indicate a topological design principle for future power grids: avoid dead ends.

368 citations
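Basin stability, the concept employed above, is typically estimated by Monte Carlo: perturb a single node of a synchronized oscillator (swing-equation) model of the grid and record the fraction of perturbations from which the grid returns to synchrony. The sketch below shows that procedure on a generic second-order oscillator network; the power injections, parameter values, perturbation box, and convergence criterion are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.integrate import solve_ivp

def swing_rhs(t, y, K, P, alpha, adj):
    """Second-order (swing-equation) oscillator network; y stacks phases and frequencies."""
    n = adj.shape[0]
    phi, omega = y[:n], y[n:]
    coupling = K * (adj * np.sin(phi[None, :] - phi[:, None])).sum(axis=1)
    return np.concatenate([omega, P - alpha * omega + coupling])

def single_node_basin_stability(adj, node, K=8.0, alpha=0.1, n_trials=200, seed=0):
    """Fraction of random perturbations at `node` after which the grid re-synchronizes (a sketch)."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    P = np.where(np.arange(n) % 2 == 0, 1.0, -1.0)   # alternating generators/consumers (hypothetical)
    P = P - P.mean()                                  # balanced injections
    # relax from rest to (approximately) the synchronous operating state
    relax = solve_ivp(swing_rhs, (0.0, 200.0), np.zeros(2 * n), args=(K, P, alpha, adj), rtol=1e-6)
    y_sync = relax.y[:, -1]
    hits = 0
    for _ in range(n_trials):
        y0 = y_sync.copy()
        y0[node] += rng.uniform(-np.pi, np.pi)        # phase perturbation
        y0[n + node] += rng.uniform(-10.0, 10.0)      # frequency perturbation
        sol = solve_ivp(swing_rhs, (0.0, 100.0), y0, args=(K, P, alpha, adj), rtol=1e-6)
        hits += bool(np.all(np.abs(sol.y[n:, -1]) < 0.05))  # frequencies back near zero
    return hits / n_trials
```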


Journal ArticleDOI
TL;DR: In this article, the authors developed a network approach to the amplification of financial contagion due to the combination of overlapping portfolios and leverage, and showed how it can be understood in terms of a generalized branching process.
Abstract: Common asset holdings are widely believed to have been the primary vector of contagion in the recent financial crisis. We develop a network approach to the amplification of financial contagion due to the combination of overlapping portfolios and leverage, and we show how it can be understood in terms of a generalized branching process. This can be used to compute the stability for any particular configuration of portfolios. By studying a stylized model we estimate the circumstances under which systemic instabilities are likely to occur as a function of parameters such as leverage, market crowding, diversification, and market impact. Although diversification may be good for individual institutions, it can create dangerous systemic effects, and as a result financial contagion gets worse with too much diversification. There is a critical threshold for leverage; below it financial networks are always stable, and above it the unstable region grows as leverage increases. Note that our model assumes passive portfolio management during a crisis; however, we show that dynamic deleveraging during a crisis can amplify instabilities. The financial system exhibits “robust yet fragile” behavior, with regions of the parameter space where contagion is rare but catastrophic whenever it occurs. Our model and methods of analysis can be calibrated to real data and provide simple yet powerful tools for macroprudential stress testing.

292 citations
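A stripped-down version of the mechanism described above (overlapping portfolios, leverage, and market impact) can be written in a few lines: a bank fails when price declines on its portfolio exhaust its equity, failed banks liquidate, and liquidation depresses prices for every other bank holding the same assets. The sketch below is illustrative only; the holdings matrix, equity buffers, and the linear impact function are assumptions, not the paper's calibrated model or branching-process analysis.

```python
import numpy as np

def fire_sale_cascade(holdings, equity, impact=1e-3, max_rounds=100):
    """Contagion through overlapping portfolios (a sketch of the mechanism, not the paper's model).

    holdings: (n_banks, n_assets) monetary positions; leverage is holdings.sum(axis=1) / equity.
    equity:   (n_banks,) equity buffers.
    impact:   assumed linear price impact per unit of liquidated holdings.
    """
    prices = np.ones(holdings.shape[1])
    failed = np.zeros(holdings.shape[0], dtype=bool)
    for _ in range(max_rounds):
        losses = holdings @ (1.0 - prices)            # mark-to-market loss on each bank's portfolio
        newly_failed = (~failed) & (losses >= equity)
        if not newly_failed.any():
            return failed
        failed |= newly_failed
        liquidated = holdings[failed].sum(axis=0)     # failed banks sell everything
        prices = np.maximum(1.0 - impact * liquidated, 0.0)
    return failed

# Example: random bank-asset network; sweep leverage to see when contagion takes off.
rng = np.random.default_rng(0)
holdings = rng.uniform(0.0, 10.0, size=(50, 20))
for leverage in (5.0, 15.0, 30.0):
    equity = holdings.sum(axis=1) / leverage
    equity[0] = 0.0                                   # seed the cascade with one failed bank
    print(leverage, int(fire_sale_cascade(holdings, equity).sum()), "failures")
```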


Journal ArticleDOI
TL;DR: In this paper, the authors show that both the total number of contacts and the total communication activity grow superlinearly with city population size, according to well-defined scaling relations and resulting from a multiplicative increase that affects most citizens.
Abstract: The size of cities is known to play a fundamental role in social and economic life. Yet, its relation to the structure of the underlying network of human interactions has not been investigated empirically in detail. In this paper, we map society-wide communication networks to the urban areas of two European countries. We show that both the total number of contacts and the total communication activity grow superlinearly with city population size, according to well-defined scaling relations and resulting from a multiplicative increase that affects most citizens. Perhaps surprisingly, however, the probability that an individual's contacts are also connected with each other remains largely unaffected. These empirical results predict a systematic and scale-invariant acceleration of interaction-based spreading phenomena as cities get bigger, which is numerically confirmed by applying epidemiological models to the studied networks. Our findings should provide a microscopic basis towards understanding the superlinear increase of different socioeconomic quantities with city size, that applies to almost all urban systems and includes, for instance, the creation of new inventions or the prevalence of certain contagious diseases.

291 citations
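The superlinear scaling relations reported above have the standard form Y = Y0 * N^beta with beta > 1, and the exponent is usually estimated by ordinary least squares on log-transformed data. The sketch below shows that fit on synthetic city data; the population sizes, the exponent, and the noise model are made up for illustration and are not the paper's call-network data.

```python
import numpy as np

def fit_scaling_exponent(population, quantity):
    """OLS fit of log Y = log Y0 + beta * log N, the standard urban-scaling regression."""
    log_n, log_y = np.log(population), np.log(quantity)
    beta, log_y0 = np.polyfit(log_n, log_y, 1)
    return beta, np.exp(log_y0)

# Hypothetical data: total contacts per city (illustrative only).
rng = np.random.default_rng(1)
population = rng.uniform(1e4, 1e6, 200)
contacts = 0.5 * population ** 1.12 * rng.lognormal(0.0, 0.1, population.size)
beta, y0 = fit_scaling_exponent(population, contacts)
print(f"estimated exponent beta = {beta:.2f} (superlinear if > 1)")
```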


Journal ArticleDOI
TL;DR: An intensive temporal study of indoor airborne bacterial communities in a high-traffic university building with a hybrid HVAC system indicates that both occupancy patterns and ventilation strategies are important for understanding airborne microbial community dynamics in the built environment.
Abstract: Architects and engineers are beginning to consider a new dimension of indoor air: the structure and composition of airborne microbial communities. A first step in this emerging field is to understand the forces that shape the diversity of bioaerosols across space and time within the built environment. In an effort to elucidate the relative influences of three likely drivers of indoor bioaerosol diversity (variation in outdoor bioaerosols, ventilation strategy, and occupancy load), we conducted an intensive temporal study of indoor airborne bacterial communities in a high-traffic university building with a hybrid HVAC (mechanically and naturally ventilated) system. Indoor air communities closely tracked outdoor air communities, but human-associated bacterial genera were more than twice as abundant in indoor air compared with outdoor air. Ventilation had a demonstrated effect on indoor airborne bacterial community composition; changes in outdoor air communities were detected inside following a time lag associated with differing ventilation strategies relevant to modern building design. Our results indicate that both occupancy patterns and ventilation strategies are important for understanding airborne microbial community dynamics in the built environment.

Journal ArticleDOI
TL;DR: In this article, the authors present a systematic effort to answer the question, What are archaeology's most important scientific challenges? Starting with a crowd-sourced query directed broadly to the professional community of archaeologists, the authors augmented, prioritized, and refined the responses during a two-day workshop focused specifically on this question.
Abstract: This article represents a systematic effort to answer the question, What are archaeology’s most important scientific challenges? Starting with a crowd-sourced query directed broadly to the professional community of archaeologists, the authors augmented, prioritized, and refined the responses during a two-day workshop focused specifically on this question. The resulting 25 “grand challenges” focus on dynamic cultural processes and the operation of coupled human and natural systems. We organize these challenges into five topics: (1) emergence, communities, and complexity; (2) resilience, persistence, transformation, and collapse; (3) movement, mobility, and migration; (4) cognition, behavior, and identity; and (5) human-environment interactions. A discussion and a brief list of references accompany each question. An important goal in identifying these challenges is to inform decisions on infrastructure investments for archaeology. Our premise is that the highest priority investments should enable us to address the most important questions. Addressing many of these challenges will require both sophisticated modeling and large-scale synthetic research that are only now becoming possible. Although new archaeological fieldwork will be essential, the greatest payoff will derive from investments that provide sophisticated research access to the explosion in systematically collected archaeological data that has occurred over the last several decades.

Journal ArticleDOI
TL;DR: It is argued that recent rapid warming in the Arctic and associated changes in the zonal mean zonal wind have created favorable conditions for double jet formation in the extratropics, which promotes the development of resonant flow regimes.
Abstract: The recent decade has seen an exceptional number of high-impact summer extremes in the Northern Hemisphere midlatitudes. Many of these events were associated with anomalous jet stream circulation patterns characterized by persistent high-amplitude quasi-stationary Rossby waves. Two mechanisms have recently been proposed that could provoke such patterns: (i) a weakening of the zonal mean jets and (ii) an amplification of quasi-stationary waves by resonance between free and forced waves in midlatitude waveguides. Based upon spectral analysis of the midtroposphere wind field, we show that the persistent jet stream patterns were, in the first place, due to an amplification of quasi-stationary waves with zonal wave numbers 6–8. However, we also detect a weakening of the zonal mean jet during these events; thus both mechanisms appear to be important. Furthermore, we demonstrate that the anomalous circulation regimes lead to persistent surface weather conditions and therefore to midlatitude synchronization of extreme heat and rainfall events on monthly timescales. The recent cluster of resonance events has resulted in a statistically significant increase in the frequency of high-amplitude quasi-stationary waves of wave numbers 7 and 8 in July and August. We show that this is a robust finding that holds for different pressure levels and reanalysis products. We argue that recent rapid warming in the Arctic and associated changes in the zonal mean zonal wind have created favorable conditions for double jet formation in the extratropics, which promotes the development of resonant flow regimes.

Journal ArticleDOI
TL;DR: The authors argue for expanding the role of theory in ecology to accelerate scientific progress, enhance the ability to address environmental challenges, foster the development of synthesis and unification, and improve the design of experiments and large-scale environmental-monitoring programs.
Abstract: We argue for expanding the role of theory in ecology to accelerate scientific progress, enhance the ability to address environmental challenges, foster the development of synthesis and unification, and improve the design of experiments and large-scale environmental-monitoring programs. To achieve these goals, it is essential to foster the development of what we call efficient theories, which have several key attributes. Efficient theories are grounded in first principles, are usually expressed in the language of mathematics, make few assumptions and generate a large number of predictions per free parameter, are approximate, and entail predictions that provide well-understood standards for comparison with empirical data. We contend that the development and successive refinement of efficient theories provide a solid foundation for advancing environmental science in the era of big data.

Journal ArticleDOI
TL;DR: Some of the ways in which networks are helping economists to model and understand behavior are described, including a taxonomy of network properties and how they impact behaviors.
Abstract: In this paper I discuss what we have learned about how the structure of social networks impacts economic behaviors, and why it is important to include network information in many economic studies. I also discuss some issues of estimating models of network formation, and some of the challenges of accounting for endogenous networks in analyzing interactions.

Journal ArticleDOI
TL;DR: Proof-of-concept models clarify thinking, uncover hidden assumptions, and spur new directions of study in evolutionary biology by formally testing the logic of verbal hypotheses.
Abstract: Summary: Progress in science often begins with verbal hypotheses meant to explain why certain biological phenomena exist. An important purpose of mathematical models in evolutionary research, as in many other fields, is to act as “proof-of-concept” tests of the logic in verbal explanations, paralleling the way in which empirical data are used to test hypotheses. Because not all subfields of biology use mathematics for this purpose, misunderstandings of the function of proof-of-concept modeling are common. In the hope of facilitating communication, we discuss the role of proof-of-concept modeling in evolutionary biology.

Journal ArticleDOI
TL;DR: An approach based on network analysis, which allows projection of an El Niño event about 1 y ahead, is developed; it is shown that this method correctly predicted the absence of El Niño events in 2012 and 2013, and that the approach indicated (already in September 2013) the return of El Niño in late 2014 with a 3-in-4 likelihood.
Abstract: The most important driver of climate variability is the El Niño Southern Oscillation, which can trigger disasters in various parts of the globe. Despite its importance, conventional forecasting is still limited to 6 mo ahead. Recently, we developed an approach based on network analysis, which allows projection of an El Niño event about 1 y ahead. Here we show that our method correctly predicted the absence of El Niño events in 2012 and 2013 and now announce that our approach indicated (in September 2013 already) the return of El Niño in late 2014 with a 3-in-4 likelihood. We also discuss the relevance of the next El Niño to the question of global warming and the present hiatus in the global mean surface temperature.

Journal ArticleDOI
TL;DR: This paper employs a definition of generalized Ricci curvature proposed by Ollivier in a general framework of Markov processes and metric spaces, and applied in graph theory by Lin–Yau, to derive lower Ricci curvature bounds on graphs in terms of local clustering coefficients.
Abstract: In this paper, we explore the relationship between one of the most elementary and important properties of graphs, the presence and relative frequency of triangles, and a combinatorial notion of Ricci curvature. We employ a definition of generalized Ricci curvature proposed by Ollivier in a general framework of Markov processes and metric spaces and applied in graph theory by Lin–Yau. In analogy with curvature notions in Riemannian geometry, we interpret this Ricci curvature as a control on the amount of overlap between neighborhoods of two neighboring vertices. It is therefore naturally related to the presence of triangles containing those vertices, or more precisely, the local clustering coefficient, that is, the relative proportion of connected neighbors among all the neighbors of a vertex. This suggests deriving lower Ricci curvature bounds on graphs in terms of such local clustering coefficients. We also study curvature-dimension inequalities on graphs, building upon previous work of several authors.
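For reference, the Ollivier curvature discussed above compares the random-walk neighborhood measures of adjacent vertices by optimal transport. The standard definition is given below (the paper's notation and choice of random-walk measure may differ slightly):

```latex
\kappa(x, y) \;=\; 1 \;-\; \frac{W_1(m_x, m_y)}{d(x, y)}
```

Here m_x is the probability measure placed by one step of the random walk started at x (for example, uniform on the neighbors of x), W_1 is the L^1 Wasserstein (optimal transport) distance, and d(x, y) is the graph distance. When the neighborhoods of x and y overlap heavily, that is, when many triangles contain the edge xy, the transport is cheap and the curvature is larger, which is the link to the local clustering coefficient.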

Journal ArticleDOI
TL;DR: The question “What are archaeology’s most important scientific challenges?” arose as the discipline sought to develop recommendations for investments in computational infrastructure that would enable it to address its most compelling questions.
Abstract: Archaeology is a source of essential data regarding the fundamental nature of human societies. Researchers across the behavioral and social sciences use archeological data in framing foundational arguments. Archaeological evidence frequently undergirds debate on contemporary issues. We propose here to answer “What are archaeology’s most important scientific challenges?” The question arose as we sought to develop recommendations for investments in computational infrastructure that would enable the discipline to address its most compelling questions. Absent a list of these questions, we undertook to develop our own.

Journal ArticleDOI
TL;DR: By applying the proposed algorithm recursively, subdividing communities until no statistically significant subcommunities can be found, it is shown that the algorithm can detect hierarchical structure in real-world networks more efficiently than previous methods.
Abstract: Modularity is a popular measure of community structure. However, maximizing the modularity can lead to many competing partitions, with almost the same modularity, that are poorly correlated with each other. It can also produce illusory “communities” in random graphs where none exist. We address this problem by using the modularity as a Hamiltonian at finite temperature and using an efficient belief propagation algorithm to obtain the consensus of many partitions with high modularity, rather than looking for a single partition that maximizes it. We show analytically and numerically that the proposed algorithm works all of the way down to the detectability transition in networks generated by the stochastic block model. It also performs well on real-world networks, revealing large communities in some networks where previous work has claimed no communities exist. Finally we show that by applying our algorithm recursively, subdividing communities until no statistically significant subcommunities can be found, we can detect hierarchical structure in real-world networks more efficiently than previous methods.
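The construction described above treats modularity as (minus) an energy, so that partitions are weighted by a Gibbs distribution at inverse temperature β rather than selected by strict maximization. In standard notation (how β is scaled, for example by the number of edges, is a convention that may differ from the paper's):

```latex
Q(\{c_i\}) \;=\; \frac{1}{2m}\sum_{i,j}\Bigl(A_{ij} - \frac{k_i k_j}{2m}\Bigr)\,\delta(c_i, c_j),
\qquad
P(\{c_i\}) \;\propto\; e^{\beta\, Q(\{c_i\})}
```

Here A is the adjacency matrix, k_i the degree of vertex i, m the number of edges, and c_i the group label of vertex i. Belief propagation estimates each vertex's marginal label distribution under P, and the consensus of many high-modularity partitions is read off from those marginals.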

Journal ArticleDOI
TL;DR: Google Flu Trends (GFT) may be inaccurate, but improved methodologic underpinnings can yield accurate predictions, and applying similar methods elsewhere can improve digital disease detection, with broader transparency, improved accuracy, and real-world public health impacts.

Journal ArticleDOI
TL;DR: Recent benchmarking shows that quantum annealing machines of the D-Wave-2 type do not perform faster than a standard desktop computer; a theoretical study of the computational tests used in the benchmarking explains why that may be the case.
Abstract: Recent benchmarking of the computational speedup of quantum “annealing” machines of the D-Wave-2 type shows that they do not perform faster than a standard desktop computer. A timely theoretical study of the computational tests used in the benchmarking explains why that may be the case.

Journal ArticleDOI
14 Mar 2014
TL;DR: This work explores how big data can be useful in urban planning by formalizing the planning process as a general computational problem and shows that, under general conditions, new sources of data coordinated with urban policy can be applied following fundamental principles of engineering to achieve new solutions to important age-old urban problems.
Abstract: There is much enthusiasm currently about the possibilities created by new and more extensive sources of data to better understand and manage cities. Here, I explore how big data can be useful in urban planning by formalizing the planning process as a general computational problem. I show that, under general conditions, new sources of data coordinated with urban policy can be applied following fundamental principles of engineering to achieve new solutions to important age-old urban problems. I also show that comprehensive urban planning is computationally intractable (i.e., practically impossible) in large cities, regardless of the amounts of data available. This dilemma between the need for planning and coordination and its impossibility in detail is resolved by the recognition that cities are first and foremost self-organizing social networks embedded in space and enabled by urban infrastructure and services. As such, the primary role of big data in cities is to facilitate information flows and ...

Journal ArticleDOI
TL;DR: There is a pressing need for an increased research effort to develop a more comprehensive understanding of impacts, as well as for the development of policy measures under existing uncertainty.
Abstract: The impacts of global climate change on different aspects of humanity’s diverse life-support systems are complex and often difficult to predict. To facilitate policy decisions on mitigation and adaptation strategies, it is necessary to understand, quantify, and synthesize these climate-change impacts, taking into account their uncertainties. Crucial to these decisions is an understanding of how impacts in different sectors overlap, as overlapping impacts increase exposure, lead to interactions of impacts, and are likely to raise adaptation pressure. As a first step we develop herein a framework to study coinciding impacts and identify regional exposure hotspots. This framework can then be used as a starting point for regional case studies on vulnerability and multifaceted adaptation strategies. We consider impacts related to water, agriculture, ecosystems, and malaria at different levels of global warming. Multisectoral overlap starts to be seen robustly at a mean global warming of 3 °C above the 1980–2010 mean, with 11% of the world population subject to severe impacts in at least two of the four impact sectors at 4 °C. Despite these general conclusions, we find that uncertainty arising from the impact models is considerable, and larger than that from the climate models. In a low probability-high impact worst-case assessment, almost the whole inhabited world is at risk for multisectoral pressures. Hence, there is a pressing need for an increased research effort to develop a more comprehensive understanding of impacts, as well as for the development of policy measures under existing uncertainty.

Book ChapterDOI
TL;DR: This chapter puts the emergence of plant viruses into the framework of evolutionary ecology, genetics, and epidemiology, and stresses that viral emergence begins with the stochastic transmission of preexisting genetic variants from the reservoir to the new host, is followed by adaptation to new hosts or vectors, and ends with efficient epidemiological spread.
Abstract: Viruses are common agents of plant infectious diseases. During recent decades, worldwide agricultural production has been compromised by a series of epidemics caused by new viruses that spilled over from reservoir species or by new variants of classic viruses that show new pathogenic and epidemiological properties. Virus emergence has generally been associated with ecological change or with intensive agronomic practices. However, the complete picture is much more complex, since viral populations constantly evolve and adapt to their new hosts and vectors. This chapter puts the emergence of plant viruses into the framework of evolutionary ecology, genetics, and epidemiology. We stress that viral emergence begins with the stochastic transmission of preexisting genetic variants from the reservoir to the new host, whose fate depends on their fitness in each host; continues with adaptation to new hosts or vectors; and ends with efficient epidemiological spread.

Journal ArticleDOI
21 Feb 2014 - Science
TL;DR: It is shown that the binding sites of larger genotype networks are not only more robust, but the sequences adjacent to such networks can also bind more transcription factors, thus demonstrating that robustness can facilitate evolvability.
Abstract: Robustness, the maintenance of a character in the presence of genetic change, can help preserve adaptive traits but also may hinder evolvability, the ability to bring forth novel adaptations. We used genotype networks to analyze the binding site repertoires of 193 transcription factors from mice and yeast, providing empirical evidence that robustness and evolvability need not be conflicting properties. Network vertices represent binding sites where two sites are connected if they differ in a single nucleotide. We show that the binding sites of larger genotype networks are not only more robust, but the sequences adjacent to such networks can also bind more transcription factors, thus demonstrating that robustness can facilitate evolvability.

Journal ArticleDOI
12 Feb 2014 - PLOS ONE
TL;DR: A theory of settlement scaling in archaeology is developed, deriving the relationship between population and settled area from a consideration of the interplay between social and infrastructural networks and showing that total settlement area increases with population size, on average.
Abstract: Cities are increasingly the fundamental socio-economic units of human societies worldwide, but we still lack a unified characterization of urbanization that captures the social processes realized by cities across time and space. This is especially important for understanding the role of cities in the history of human civilization and for determining whether studies of ancient cities are relevant for contemporary science and policy. As a step in this direction, we develop a theory of settlement scaling in archaeology, deriving the relationship between population and settled area from a consideration of the interplay between social and infrastructural networks. We then test these models on settlement data from the Pre-Hispanic Basin of Mexico to show that this ancient settlement system displays spatial scaling properties analogous to those observed in modern cities. Our data derive from over 1,500 settlements occupied over two millennia and spanning four major cultural periods characterized by different levels of agricultural productivity, political centralization and market development. We show that, in agreement with theory, total settlement area increases with population size, on average, according to a scale invariant relation with an exponent in the range . As a consequence, we are able to infer aggregate socio-economic properties of ancient societies from archaeological measures of settlement organization. Our findings, from an urban settlement system that evolved independently from its old-world counterparts, suggest that principles of settlement organization are very general and may apply to the entire range of human history.
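The scale-invariant relation referred to above is a power law between settled area A and population N; the numeric exponent range is reported in the paper and is elided in this abstract. The general form is:

```latex
A(N) \;=\; a\,N^{\alpha}
```

Scale invariance here means that doubling N multiplies A by the same factor 2^α whether the settlement is a hamlet or a regional center, which is what allows aggregate socio-economic properties to be inferred from settlement areas alone.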

Journal ArticleDOI
TL;DR: The results indicate that human-associated microbial communities can be transferred to indoor surfaces following contact, and that such transmission is possible even when contact is indirect, but that proximity to other surfaces in the classroom does not influence community composition.
Abstract: Background: Humans can spend the majority of their time indoors, but little is known about the interactions between the human and built-environment microbiomes or the forces that drive microbial community assembly in the built environment. We sampled 16S rRNA genes from four different surface types throughout a university classroom to determine whether bacterial assemblages on each surface were best predicted by routine human interactions or by proximity to other surfaces within the classroom. We then analyzed our data with publicly-available datasets representing potential source environments. Results: Bacterial assemblages from the four surface types, as well as individual taxa, were indicative of different source pools related to the type of human contact each surface routinely encounters. Spatial proximity to other surfaces in the classroom did not predict community composition. Conclusions: Our results indicate that human-associated microbial communities can be transferred to indoor surfaces following contact, and that such transmission is possible even when contact is indirect, but that proximity to other surfaces in the classroom does not influence community composition.

Journal ArticleDOI
13 Jun 2014 - Science
TL;DR: Analysis of animal growth energetics indicates that dinosaurs had intermediate metabolic rates and elevated but labile temperatures, suggesting that the modern dichotomy of endothermic versus ectothermic is overly simplistic.
Abstract: Were dinosaurs ectotherms or fast-metabolizing endotherms whose activities were unconstrained by temperature? To date, some of the strongest evidence for endothermy comes from the rapid growth rates derived from the analysis of fossil bones. However, these studies are constrained by a lack of comparative data and an appropriate energetic framework. Here we compile data on ontogenetic growth for extant and fossil vertebrates, including all major dinosaur clades. Using a metabolic scaling approach, we find that growth and metabolic rates follow theoretical predictions across clades, although some groups deviate. Moreover, when the effects of size and temperature are considered, dinosaur metabolic rates were intermediate to those of endotherms and ectotherms and closest to those of extant mesotherms. Our results suggest that the modern dichotomy of endothermic versus ectothermic is overly simplistic.
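The "metabolic scaling approach" mentioned above builds on the standard allometric relation between metabolic rate, body mass, and temperature from metabolic theory; a common general form (not necessarily the exact model fitted in the paper) is:

```latex
B(M, T) \;=\; b_0\, M^{3/4}\, e^{-E/(kT)}
```

Here B is metabolic (or maximum growth) rate, M is body mass, T is body temperature, E is an activation energy, and k is Boltzmann's constant. Comparing rates after correcting for mass and temperature in this way is what places dinosaur estimates between those of extant endotherms and ectotherms, alongside extant mesotherms.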

Journal ArticleDOI
26 Sep 2014 - PLOS ONE
TL;DR: It is shown that combining phage and antibiotics substantially increases bacterial control compared to either separately, and that there is a specific time delay in antibiotic introduction, independent of antibiotic dose, that minimizes both bacterial density and resistance to either antibiotics or phage.
Abstract: The evolution of antibiotic resistance in bacteria is a global concern, and the use of bacteriophages alone or in combined therapies is attracting increasing attention as an alternative. Evolutionary theory predicts that the probability of bacterial resistance to both phages and antibiotics will be lower than to either separately, due for example to fitness costs or to trade-offs between phage resistance mechanisms and bacterial growth. In this study, we assess the population impacts of either individual or combined treatments of a bacteriophage and streptomycin on the nosocomial pathogen Pseudomonas aeruginosa. We show that combining phage and antibiotics substantially increases bacterial control compared to either separately, and that there is a specific time delay in antibiotic introduction, independent of antibiotic dose, that minimizes both bacterial density and resistance to either antibiotics or phage. These results have implications for optimal combined therapeutic approaches.