
Showing papers by "Santa Fe Institute" published in 1998


Journal ArticleDOI
10 Sep 1998-Nature
TL;DR: In this article, the scaling relationship between density and mass in resource-limited plants was investigated and a mechanistic model was developed to predict that average plant size should scale as the −4/3 power of maximum population density, in agreement with empirical evidence and comparable relationships in animals.
Abstract: Scaling relationships that describe variation in population density with body size in ecological communities, such as the thinning law in plant ecology1,2,3, can be explained in terms of how individuals use resources as a function of their size. Data for rates of xylem transport as a function of stem diameter show that rates of resource use in individual plants scale as approximately the 3/4 power of body mass, which is the same as metabolic rates of animals4,5,6,7. Here we use this relationship to develop a mechanistic model for relationships between density and mass in resource-limited plants. It predicts that average plant size should scale as the −4/3 power of maximum population density, in agreement with empirical evidence and comparable relationships in animals5,6,8, but significantly less than the −3/2 power predicted by geometric models1. Our model implies that fundamental constraints on metabolic rate are reflected in the scaling of population density and other ecological and evolutionary phenomena, including the finding that resource allocation among species in ecosystems is independent of body size5,6,8.
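
As a compact sketch of the argument (assuming, as the model does, that the total rate of resource supply per unit area is fixed at carrying capacity):

```latex
% Sketch of the density-mass scaling argument. B: individual resource use,
% M: plant mass, N_max: maximum population density, R: total resource
% supply per unit area (assumed constant at carrying capacity).
\begin{equation*}
  B \propto M^{3/4}, \qquad N_{\max}\,B = R = \text{const.}
  \;\Longrightarrow\; N_{\max} \propto M^{-3/4}
  \;\Longleftrightarrow\; M \propto N_{\max}^{-4/3}.
\end{equation*}
```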

842 citations


Journal ArticleDOI
05 Feb 1998-Nature
TL;DR: Multiple phylogenetic analyses not only authenticate this case as the oldest known HIV-1 infection, but also place its viral sequence near the ancestral node of subtypes B and D in the major group, indicating that these HIV-1 subtypes, and perhaps all major-group viruses, may have evolved from a single introduction into the African population not long before 1959.
Abstract: There is considerable genetic diversity among viruses of different subtypes (designated A to J) in the major group of human immunodeficiency virus type 1 (HIV-1), the form of HIV that is dominant in the global epidemic. If available, HIV-1 sequences pre-dating the recognition of AIDS could be crucial in defining the time of origin and the subsequent evolution of these viruses in humans. The oldest known case of HIV-1 infection was reported to be that of a sailor from Manchester who died of an AIDS-like illness in 1959; however, the authenticity of this case has not been confirmed. Genetic analysis of sequences from clinical materials obtained from 1971 to 1976 from members of a Norwegian family infected earlier than 1971 showed that they carried viruses of the HIV-1 outlier group, a variant form that is mainly restricted to West Africa. Here we report the amplification and characterization of viral sequences from a 1959 African plasma sample that was previously found to be HIV-1 seropositive. Multiple phylogenetic analyses not only authenticate this case as the oldest known HIV-1 infection, but also place its viral sequence near the ancestral node of subtypes B and D in the major group, indicating that these HIV-1 subtypes, and perhaps all major-group viruses, may have evolved from a single introduction into the African population not long before 1959.

593 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined the properties of C_LMC and found that it is neither an intensive nor an extensive thermodynamic variable and that it vanishes exponentially in the thermodynamic limit for all one-dimensional finite-range spin systems.

403 citations


Journal ArticleDOI
TL;DR: In this article, a model of division of labour in insect societies, based on variable response thresholds, is introduced, where response thresholds refer to the likelihood of reacting to task-associated stimuli.
Abstract: A model of division of labour in insect societies, based on variable response thresholds, is introduced. Response thresholds refer to the likelihood of reacting to task-associated stimuli. Low-threshold individuals perform tasks at a lower level of stimulus than high-threshold individuals. Within individual workers, performing a given task induces a decrease in the corresponding threshold, and not performing the task induces an increase in the threshold. This combined reinforcement process leads to the emergence of specialized workers, i.e. workers that are more responsive to stimuli associated with particular task requirements, from a group of initially identical individuals. Predictions of the dynamics of task specialization resulting from this model are presented. Predictions are also made as to what should be observed when specialists of a given task are removed from the colony and reintroduced after a varying amount of time: the colony does not recover the same state as that prior to the perturbation, and the difference between before and after the perturbation is more strongly marked as the time between separation and reintroduction increases.
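
A minimal simulation sketch of this threshold-reinforcement scheme, using the common sigmoidal response function T(s) = s^2/(s^2 + theta^2); all parameter values below are illustrative assumptions, not the paper's calibration:

```python
import random

# Minimal sketch of the variable-response-threshold model. The response
# function T(s) = s^2 / (s^2 + theta^2) and all parameter values are
# illustrative assumptions, not the paper's calibration.
N_WORKERS, STEPS = 20, 5000
DELTA = 1.0          # stimulus growth per step while the task is undone
ALPHA = 3.0          # stimulus reduction per acting worker
XI, PHI = 0.5, 0.01  # threshold decrease (learning) / increase (forgetting)

thresholds = [10.0] * N_WORKERS   # initially identical workers
stimulus = 0.0
random.seed(0)
for _ in range(STEPS):
    stimulus += DELTA
    for i in range(N_WORKERS):
        p = stimulus**2 / (stimulus**2 + thresholds[i]**2)
        if random.random() < p:                           # performs the task
            stimulus = max(0.0, stimulus - ALPHA)
            thresholds[i] = max(0.5, thresholds[i] - XI)  # reinforcement
        else:
            thresholds[i] += PHI                          # forgetting

# Reinforcement splits an initially homogeneous group into low-threshold
# specialists and high-threshold non-specialists.
print(sorted(round(th, 1) for th in thresholds))
```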

398 citations


Posted Content
TL;DR: In this article, a nonequilibrium price formation rule, developed in the context of trading with market orders, is used to study multi-period markets analytically, capturing internal dynamics, such as excess volatility, that are difficult to explain using rational expectations models.
Abstract: Markets have internal dynamics leading to excess volatility and other phenomena that are difficult to explain using rational expectations models. This paper studies these using a nonequilibrium price formation rule, developed in the context of trading with market orders. Because this is so much simpler than a standard inter-temporal equilibrium model, it is possible to study multi-period markets analytically. Under this rule, price dynamics have second-order oscillatory terms. Value investing does not necessarily cause prices to track values. Trend following causes short-term trends in prices, but also causes longer-term oscillations. When value investing and trend following are combined, even though there is little linear structure, there can be boom-bust cycles, excess and temporally correlated volatility, and fat tails in price fluctuations. The long-term evolution of markets can be studied in terms of flows of money. Profits can be decomposed in terms of aggregate pairwise correlations. Under reinvestment of profits this leads to a capital allocation model that is equivalent to a standard model in population biology. An investigation of market efficiency shows that patterns created by trend followers are more resistant to efficiency than those created by value investors, and that profit-maximizing behavior slows the progression to efficiency. Order-of-magnitude estimates suggest that the timescale for efficiency is years to decades.
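
As an illustration of the price-formation rule in action, a toy simulation in which the log-price moves in proportion to net order flow (liquidity lambda), traded on by one value investor and one trend follower; the functional forms and coefficients are illustrative assumptions, not the paper's:

```python
import random

# Toy sketch: log-price moves in proportion to net market orders (liquidity
# LAM), traded on by a value investor and a trend follower. Functional forms
# and coefficients are illustrative assumptions, not the paper's.
STEPS, LAM = 2000, 10.0
C_VALUE, C_TREND, THETA = 0.5, 0.8, 10   # strategy strengths, trend lag
random.seed(1)
v = 0.0                                  # log fundamental value (random walk)
prices = [0.0] * (THETA + 1)
for _ in range(STEPS):
    v += random.gauss(0.0, 0.1)
    p = prices[-1]
    order_value = C_VALUE * (v - p)                   # buy when undervalued
    order_trend = C_TREND * (p - prices[-1 - THETA])  # chase the recent trend
    prices.append(p + (order_value + order_trend) / LAM)

# Trend following amplifies short-term moves and induces oscillations, while
# the mispricing (v - p) decays only partially: prices need not track values.
returns = [b - a for a, b in zip(prices, prices[1:])]
print("return std:", (sum(r * r for r in returns) / len(returns)) ** 0.5)
```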

353 citations


Journal ArticleDOI
TL;DR: It is demonstrated that it is possible to incorporate distributed intracellular delays into existing models for HIV dynamics and to use these refined models to estimate the half-life of free virus from data on the decline in HIV-1 RNA following treatment.
Abstract: We present and analyze a model for the interaction of human immunodeficiency virus type 1 (HIV-1) with target cells that includes a time delay between initial infection and the formation of productively infected cells. Assuming that the variation among cells with respect to this 'intracellular' delay can be approximated by a gamma distribution, a highly flexible distribution that can mimic a variety of biologically plausible delays, we provide analytical solutions for the expected decline in plasma virus concentration after the initiation of antiretroviral therapy with one or more protease inhibitors. We then use the model to investigate whether the parameters that characterize viral dynamics can be identified from biological data. Using non-linear least-squares regression to fit the model to simulated data in which the delays conform to a gamma distribution, we show that good estimates for free viral clearance rates, infected cell death rates, and parameters characterizing the gamma distribution can be obtained. For simulated data sets in which the delays were generated using other biologically plausible distributions, reasonably good estimates for viral clearance rates, infected cell death rates, and mean delay times can be obtained using the gamma-delay model. For simulated data sets that include added simulated noise, viral clearance rate estimates are not as reliable. If the mean intracellular delay is known, however, we show that reasonable estimates for the viral clearance rate can be obtained by taking the harmonic mean of viral clearance rate estimates from a group of patients. These results demonstrate that it is possible to incorporate distributed intracellular delays into existing models for HIV dynamics and to use these refined models to estimate the half-life of free virus from data on the decline in HIV-1 RNA following treatment.
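
A hedged sketch of the model class described here (the paper's exact terms may differ): productively infected cells arise from infections initiated a gamma-distributed time tau earlier.

```latex
% Sketch of an HIV-1 model with a gamma-distributed intracellular delay,
% following the abstract's description (details may differ from the paper).
% T*: productively infected cells, V: free virus, c: viral clearance rate,
% delta: infected-cell death rate, k: infection rate, T0: target cells.
\begin{align*}
  \frac{dT^{*}(t)}{dt} &= k\,T_{0}\!\int_{0}^{\infty} f(\tau)\,V(t-\tau)\,d\tau
      \;-\; \delta\,T^{*}(t), \\
  \frac{dV(t)}{dt} &= N\,\delta\,T^{*}(t) \;-\; c\,V(t), \qquad
  f(\tau) = \frac{\tau^{\,n-1}\,e^{-\tau/b}}{b^{\,n}\,(n-1)!},
\end{align*}
% f is the gamma density with mean nb: n = 1 recovers an exponential delay,
% and n -> infinity approaches a fixed delay.
```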

274 citations


Journal ArticleDOI
TL;DR: A simple mathematical model of regulation of division of labor in insect societies based on fixed-response thresholds that can account for experimental observations of Wilson (1984), extended to more complicated situations, and explored its properties are introduced.

261 citations


Journal ArticleDOI
TL;DR: It is argued that, in order to explain the observed distributions, gene families have to behave in a coherent fashion within the genome; i.e., the probabilities of duplications of genes within a gene family are not independent of each other.
Abstract: We compare the frequency distribution of gene family sizes in the complete genomes of six bacteria (Escherichia coli, Haemophilus influenzae, Helicobacter pylori, Mycoplasma genitalium, Mycoplasma pneumoniae, and Synechocystis sp. PCC6803), two Archaea (Methanococcus jannaschii and Methanobacterium thermoautotrophicum), one eukaryote (Saccharomyces cerevisiae), the vaccinia virus, and the bacteriophage T4. The sizes of the gene families versus their frequencies show power-law distributions that tend to become flatter (have a larger exponent) as the number of genes in the genome increases. Power-law distributions generally occur as the limit distribution of a multiplicative stochastic process with a boundary constraint. We discuss various models that can account for a multiplicative process determining the sizes of gene families in the genome. In particular, we argue that, in order to explain the observed distributions, gene families have to behave in a coherent fashion within the genome; i.e., the probabilities of duplications of genes within a gene family are not independent of each other. Likewise, the probabilities of deletions of genes within a gene family are not independent of each other.
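
A minimal sketch of the mechanism invoked here, assuming a lognormal multiplicative step with a slight loss bias and a reflecting boundary at one gene (all parameters are illustrative):

```python
import math
import random

# Minimal sketch: a multiplicative stochastic process (random duplication and
# loss) with a lower boundary of one gene yields a power-law distribution of
# family sizes, the limit behavior the paper invokes. Parameters illustrative.
N_FAMILIES, STEPS = 5000, 1000
random.seed(2)
sizes = [1.0] * N_FAMILIES
for _ in range(STEPS):
    for i in range(N_FAMILIES):
        # slight loss bias plus a reflecting boundary => stationary power law
        sizes[i] = max(1.0, sizes[i] * random.lognormvariate(-0.01, 0.2))

# Log-binned histogram: roughly a straight line in log-log coordinates.
bins = {}
for s in sizes:
    b = int(math.log2(s))
    bins[b] = bins.get(b, 0) + 1
for b in sorted(bins):
    print(f"family size ~2^{b}: {bins[b]} families")
```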

230 citations


Journal ArticleDOI
TL;DR: Recursion formulae enumerating various sub-classes of polynucleotides as well as certain structural elements (sub-graphs) are constructed and first order asymptotics are derived.

199 citations


Journal ArticleDOI
TL;DR: The extent to which the folding of RNA sequences induces a "statistical topology" on the set of minimum free energy secondary structures is studied; the resulting nearness relation suggests a notion of "continuous" structure transformation.

194 citations


Journal ArticleDOI
21 May 1998-Nature
TL;DR: The statistical properties of a data set containing over 600 species, namely the North American breeding bird survey, are studied, finding that the distribution of changes in population abundance over a one-year interval is remarkably symmetrical, with long tails extending over six orders of magnitude.
Abstract: Population biologists have long been interested in the variability of natural populations1,2,3,4,5,6. One approach to dealing with ecological complexity is to reduce the system to one or a few species, for which meaningful equations can be solved. Here we explore an alternative approach7,8 by studying the statistical properties of a data set containing over 600 species, namely the North American breeding bird survey9. The survey has recorded annual species abundances over a 31-year period along more than 3,000 observation routes10. We now analyse the dynamics of population variability using this data set, and find scaling features in common with inanimate systems composed of strongly interacting subunits11. Specifically, we find that the distribution of changes in population abundance over a one-year interval is remarkably symmetrical, with long tails extending over six orders of magnitude. The variance of the population over a time series increases as a power-law with increasing time lag, indicating long-range correlation in population size fluctuations12. We also find that the distribution of species lifetimes (the time between colonization and local extinction) within local patches is a power-law with an exponential cutoff imposed by the finite length of the time series. Our results provide a quantitative basis for modelling the dynamics of large species assemblages.
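
A sketch of the first two diagnostics, run on a synthetic species-by-year abundance matrix standing in for the survey data; the shapes and noise levels are assumptions, and with the real survey one would load counts per route instead:

```python
import numpy as np

# Sketch of two diagnostics from the abstract on a synthetic species-by-year
# matrix of log abundances; shapes and noise levels are assumptions.
rng = np.random.default_rng(0)
n_species, n_years = 600, 31
log_n = np.cumsum(rng.normal(0.0, 0.3, size=(n_species, n_years)), axis=1)

# 1. Distribution of one-year changes in log abundance (symmetry check).
changes = np.diff(log_n, axis=1).ravel()
skew = float(((changes - changes.mean()) ** 3).mean() / changes.std() ** 3)
print(f"skewness (~0 if symmetric): {skew:.3f}")

# 2. Variance of the change as a function of time lag; a power law appears as
#    a straight line in log-log coordinates (slope 1 here because the
#    synthetic random walk has uncorrelated increments).
for lag in (1, 2, 4, 8, 16):
    var = float(np.var(log_n[:, lag:] - log_n[:, :-lag]))
    print(f"lag {lag:2d} years: variance {var:.3f}")
```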

Journal ArticleDOI
TL;DR: It is shown that, under certain conditions, pillars are transformed into walls or galleries or chambers, and that this transformation may not be driven by any change in the termites' behaviour.
Abstract: A simple model of the emergence of pillars in termite nests by Deneubourg is modified to include several additional features that break the homogeneity of the original model: (i) a convection air stream that drives molecules of pheromone along a given direction; (ii) a net flux of individuals in a specific direction; (iii) a well-defined self-maintained pheromone trail; and (iv) a pheromonal template representing the effect of the presence of a queen that continuously emits pheromone. It is shown that, under certain conditions, pillars are transformed into walls or galleries or chambers, and that this transformation may not be driven by any change in the termites' behaviour. Because the same type of response at the individual level can generate different patterns under different conditions, and because previous construction modifies current building conditions, we hypothesize that nest complexity can result from the unfolding of a morphogenetic process that progressively generates a diversity of history-dependent structures.

Proceedings Article
01 Jan 1998
TL;DR: An attempt to understand genomic networks would benefit from the context of a general theory of discrete dynamical networks which is currently emerging, as well as order-chaos measures on typical trajectories that further characterize network dynamics.
Abstract: Many natural processes consist of networks of interacting elements which affect each other's state over time, the dynamics depending on the pattern of connections and the updating rules for each element. Genomic regulatory networks are arguably networks of this sort. An attempt to understand genomic networks would benefit from the context of a general theory of discrete dynamical networks which is currently emerging. A key notion here is global dynamics, whereby state-space is organized into basins of attraction, objects that have only recently become accessible by computer simulation of idealized models, in particular "random Boolean networks". Cell types have been explained as attractors in genomic networks, where the network architecture is biased to achieve a balance between stability and adaptability in response to perturbation. Based on computer simulations using the software Discrete Dynamics Lab (DDLab), these ideas are described, as well as order-chaos measures on typical trajectories that further characterize network dynamics.
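
A minimal random Boolean network with exhaustive enumeration of attractors and basin sizes, a toy-scale sketch of the kind of computation DDLab performs; N, K, and the seed are illustrative:

```python
import random
from itertools import product

# Minimal random Boolean network (N nodes, K inputs each) with exhaustive
# enumeration of attractors and basin-of-attraction sizes; illustrative
# scale only (DDLab handles far larger systems and many more measures).
N, K = 10, 2
random.seed(3)
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [{bits: random.randint(0, 1) for bits in product((0, 1), repeat=K)}
          for _ in range(N)]

def step(state):
    return tuple(tables[i][tuple(state[j] for j in inputs[i])]
                 for i in range(N))

basins = {}   # canonical attractor state -> (period, basin size)
for init in product((0, 1), repeat=N):    # all 2^N initial conditions
    seen, state = set(), init
    while state not in seen:              # iterate until a cycle is reached
        seen.add(state)
        state = step(state)
    cycle, s = [state], step(state)       # walk once around the attractor
    while s != state:
        cycle.append(s)
        s = step(s)
    key = min(cycle)                      # canonical representative
    period, count = basins.get(key, (len(cycle), 0))
    basins[key] = (period, count + 1)

for period, count in sorted(basins.values()):
    print(f"attractor period {period:3d}, basin size {count} / {2**N}")
```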

Journal ArticleDOI
TL;DR: The method confirms the known structures in HIV-1 and predicts previously unknown conserved RNA secondary structures in HCV, and can be used for small data sets of approximately 10 sequences, efficiently exploiting the information contained in the sequence variability.
Abstract: We propose a new method for detecting conserved RNA secondary structures in a family of related RNA sequences. Our method is based on a combination of thermodynamic structure prediction and phylogenetic comparison. In contrast to purely phylogenetic methods, our algorithm can be used for small data sets of approximately 10 sequences, efficiently exploiting the information contained in the sequence variability. The procedure constructs a prediction only for those parts of sequences that are consistent with a single conserved structure. Our implementation produces reasonable consensus structures without user interference. As an example we have analysed the complete HIV-1 and hepatitis C virus (HCV) genomes as well as the small segment of hantavirus. Our method confirms the known structures in HIV-1 and predicts previously unknown conserved RNA secondary structures in HCV.

Journal ArticleDOI
TL;DR: The performance of ACO in finding good solutions to the traveling salesman problem is demonstrated, using a genetic algorithm to find the best set of parameters.
Abstract: Ant Colony Optimization (ACO) is a promising new approach to combinatorial optimization. Here ACO is applied to the traveling salesman problem (TSP). Using a genetic algorithm (GA) to find the best set of parameters, we demonstrate the good performance of ACO in finding good solutions to the TSP.
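
A minimal Ant System sketch on a random TSP instance; the parameters alpha, beta, rho, and Q are the usual Ant System knobs, fixed by hand here rather than tuned with a GA as in the paper, and all values are illustrative:

```python
import math
import random

# Minimal Ant System sketch on a random Euclidean TSP instance.
random.seed(4)
N, ANTS, ITERS = 15, 20, 100
ALPHA, BETA, RHO, Q = 1.0, 3.0, 0.5, 1.0
pts = [(random.random(), random.random()) for _ in range(N)]
d = [[math.dist(a, b) or 1e-9 for b in pts] for a in pts]  # guard zero dist
tau = [[1.0] * N for _ in range(N)]   # pheromone levels

def length(tour):
    return sum(d[tour[i]][tour[(i + 1) % N]] for i in range(N))

best_len = float("inf")
for _ in range(ITERS):
    tours = []
    for _ in range(ANTS):
        tour = [random.randrange(N)]
        unvisited = set(range(N)) - {tour[0]}
        while unvisited:                  # build a tour city by city
            i, js = tour[-1], list(unvisited)
            weights = [tau[i][j] ** ALPHA / d[i][j] ** BETA for j in js]
            nxt = random.choices(js, weights=weights)[0]
            tour.append(nxt)
            unvisited.remove(nxt)
        tours.append(tour)
    for row in tau:                       # evaporation
        for j in range(N):
            row[j] *= 1.0 - RHO
    for tour in tours:                    # pheromone deposit
        L = length(tour)
        best_len = min(best_len, L)
        for i in range(N):
            a, b = tour[i], tour[(i + 1) % N]
            tau[a][b] += Q / L
            tau[b][a] += Q / L

print("best tour length found:", round(best_len, 3))
```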

Journal ArticleDOI
Eric Bonabeau
TL;DR: The scope of this remark goes beyond social insects and applies to a wide range of biological systems, including ecosystems.
Abstract: Social insect societies are complex adaptive systems that self-organize within a set of constraints. Although it is important to acknowledge that global order in social insects can arise as a result of internal interactions among insects, it is equally important to include external factors and constraints in the picture, especially as the colony and its environment may influence each other through interactions among internal and external factors. The scope of this remark goes beyond social insects and applies to a wide range of biological systems, including ecosystems.

Journal ArticleDOI
TL;DR: In this paper, the first few levels of a hierarchy of complexity for two-or-more-dimensional patterns are studied, and several definitions of "regular language" or "local rule" that are equivalent in d = 1 lead to distinct classes in d ≥ 2.
Abstract: In dynamical systems such as cellular automata and iterated maps, it is often useful to look at a language or set of symbol sequences produced by the system. There are well-established classification schemes, such as the Chomsky hierarchy, with which we can measure the complexity of these sets of sequences, and thus the complexity of the systems which produce them. In this paper, we look at the first few levels of a hierarchy of complexity for two-or-more-dimensional patterns. We show that several definitions of “regular language” or “local rule” that are equivalent in d=1 lead to distinct classes in d≥2. We explore the closure properties and computational complexity of these classes, including undecidability and L, NL, and NP-completeness results. We apply these classes to cellular automata, in particular to their sets of fixed and periodic points, finite-time images, and limit sets. We show that it is undecidable whether a CA in d≥2 has a periodic point of a given period, and that certain “local lattice languages” are not finite-time images or limit sets of any CA. We also show that the entropy of a d-dimensional CA's finite-time image cannot decrease faster than t^{-d} unless it maps every initial condition to a single homogeneous state.

Journal ArticleDOI
TL;DR: In the Demographic Prisoner's Dilemma (PD), agents with finite vision move to random sites on a lattice and play a fixed, culturally inherited, zero-memory strategy of cooperate (C) or defect (D) against neighbors.
Abstract: The emergence of cooperation in Prisoner's Dilemma (PD) games is generally assumed to require repeated play (and strategies such as Tit-For-Tat, involving memory of previous interactions) or features ("tags") permitting cooperators and defectors to distinguish one another. In the Demographic Prisoner's Dilemma, neither assumption is made: agents with finite vision move to random sites on a lattice and play a fixed culturally-inherited zero-memory strategy of cooperate (C) or defect (D) against neighbors. Agents are indistinguishable from one another--they are "tagless". Positive payoffs accrue to agents playing C against C, or D against C. Negative payoffs accrue to agents playing C against D, or D against D. Payoffs accumulate. If accumulated payoffs exceed some threshold, agents clone offspring of the same strategy onto neighboring sites and continue play. If accumulated payoffs are negative, agents die and are removed. Spatial zones of cooperation emerge.
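
A compact sketch of the model as described in the abstract; the lattice size, payoff values, vision (taken as 1 here), and the fission threshold are illustrative choices, not the paper's exact settings:

```python
import random

# Compact sketch of the Demographic Prisoner's Dilemma described above.
SIZE, STEPS, THRESH = 30, 200, 10.0
# positive for playing against C, negative for playing against D
PAYOFF = {("C", "C"): 2, ("C", "D"): -3, ("D", "C"): 3, ("D", "D"): -1}
random.seed(5)
grid = {}   # (x, y) -> [strategy, accumulated payoff]
while len(grid) < 200:
    grid[(random.randrange(SIZE), random.randrange(SIZE))] = \
        [random.choice("CD"), 0.0]

def neighbors(x, y):
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        yield (x + dx) % SIZE, (y + dy) % SIZE

for _ in range(STEPS):
    for pos, agent in list(grid.items()):
        if grid.get(pos) is not agent:        # died or site reoccupied
            continue
        empty = [n for n in neighbors(*pos) if n not in grid]
        if empty:                             # move within vision
            del grid[pos]
            pos = random.choice(empty)
            grid[pos] = agent
        for n in neighbors(*pos):             # one-shot PD vs each neighbor
            if n in grid:
                agent[1] += PAYOFF[(agent[0], grid[n][0])]
        open_sites = [n for n in neighbors(*pos) if n not in grid]
        if agent[1] < 0:                      # death
            del grid[pos]
        elif agent[1] > THRESH and open_sites:  # clone offspring nearby
            grid[random.choice(open_sites)] = [agent[0], 0.0]
            agent[1] -= THRESH

coop = sum(a[0] == "C" for a in grid.values())
print(f"survivors: {len(grid)}  cooperators: {coop}")
```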

Journal ArticleDOI
TL;DR: It is shown that the maximum fitness attained during the adaptive walk of a population evolving on such a fitness landscape increases with increasing degree of neutrality, and is directly related to the fitness of the most fit percolating network.
Abstract: We introduce a model of evolution on a fitness landscape possessing a tunable degree of neutrality. The model allows us to study the general properties of molecular species undergoing neutral evolution. We find that a number of phenomena seen in RNA sequence-structure maps are present also in our general model. Examples are the occurrence of 'common' structures that occupy a fraction of the genotype space which tends to unity as the length of the genotype increases, and the formation of percolating neutral networks that cover the genotype space in such a way that a member of such a network can be found within a small radius of any point in the space. We also describe a number of new phenomena that appear to be general properties of systems possessing selective neutrality. In particular, we show that the maximum fitness attained during the adaptive walk of a population evolving on such a fitness landscape increases with increasing degree of neutrality, and is directly related to the fitness of the most fit percolating network.

Journal ArticleDOI
TL;DR: Guy Theraulaz is a CNRS (Centre National de la Recherche Scientifique) fellow and is currently at the Ethology and Animal Psychology Laboratory, Paul Sabatier University in Toulouse, France.
Abstract: Guy Theraulaz is a CNRS (Centre National de la Recherche Scientifique) fellow and is currently at the Ethology and Animal Psychology Laboratory, Paul Sabatier University in Toulouse, France. Eric Bonabeau is the Interval Research Fellow at the Santa Fe Institute, Santa Fe, New Mexico. Jean-Louis Deneubourg is a Lecturer at the Universite Libre de Bruxelles and a fellow of the Belgian FNRS. He is currently running the Theoretical Behavioral Ecology unit at the Non-Linear Phenomena and Complex Systems Study Center.

Journal ArticleDOI
TL;DR: One of the proposed Monte Carlo algorithms, a cluster algorithm for the equivalent three-color model, appears to have a dynamic exponent close to zero, making it particularly useful for simulations of critical ice models.
Abstract: Ice models are a class of simple classical models of the statistical properties of the hydrogen atoms in water ice. In ice, the oxygen atoms are located on a lattice, and each oxygen atom has four hydrogen bonds to neighboring oxygen atoms, giving a fourfold-coordinated lattice. However, as has long been known, the proton (hydrogen atom) which forms a hydrogen bond is located not at the center point of the line between two oxygens, but at a point closer to one of the two. Bernal and Fowler [1] and Pauling [2] proposed that the protons are arranged according to two rules, known as the ice rules: (1) there is precisely one hydrogen atom on each hydrogen bond, and (2) there are precisely two hydrogen atoms near each oxygen atom. Ice models are a class of models mimicking the behavior of systems which obey these rules. The most widely studied ice model is the model on a square lattice in two dimensions. A version of this model was solved exactly by Lieb [3-5]. The exact solution gives us, for instance, the critical temperature and the free energy of the model. However, there are a number of quantities of interest which cannot be obtained from the exact solution, and for these quantities we turn to Monte Carlo simulation. In this paper we introduce a number of Monte Carlo algorithms for the simulation of ice models, and compare their efficiency. We will show that one of them, the three-color cluster algorithm for square ice described in Sec. V, possesses a very small dynamic exponent (possibly zero), and so suffers very little from critical slowing down. We also extend this algorithm to the case of energetic ice models in Sec. VII. Using these algorithms we determine numerically several critical exponents which have not been accurately measured previously: the dimensionality of the percolating cluster of symmetric vertices in the F model at critical temperature, the scaling of the largest loop in the loop representation of both square ice and the F model at critical temperature, and the scaling of the trajectory of a wandering defect in square ice.

Journal ArticleDOI
TL;DR: The authors systematically investigate the effect of blockage sites in a cellular automata model for traffic flow and use this information for a fast implementation of traffic in Dallas.
Abstract: The authors systematically investigate the effect of blockage sites in a cellular automata model for traffic flow. Different scheduling schemes for the blockage sites are considered. None of them returns a linear relationship between the fraction of green time and the throughput. The authors use this information for a fast implementation of traffic in Dallas.
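
A minimal Nagel-Schreckenberg-style sketch with a single blockage site operated as a fixed-cycle signal; all parameters (v_max, braking probability, density, cycle) are illustrative, not those of the Dallas implementation:

```python
import random

# Minimal traffic CA (Nagel-Schreckenberg-style) on a ring with one blockage
# site operated as a fixed-cycle traffic light. Parameters are illustrative.
L, VMAX, P_BRAKE, DENSITY = 200, 5, 0.25, 0.15
LIGHT, CYCLE, GREEN_FRAC = 100, 60, 0.5
random.seed(6)
vel = {x: 0 for x in random.sample(range(L), int(DENSITY * L))}

throughput, T = 0, 2000
for t in range(T):
    green = (t % CYCLE) < GREEN_FRAC * CYCLE
    occupied = set(vel)
    moves = {}
    for x in vel:
        v = min(vel[x] + 1, VMAX)                     # accelerate
        gap = 1                                       # distance to obstacle
        while gap <= v and (x + gap) % L not in occupied:
            if not green and (x + gap) % L == LIGHT:  # red light blocks cell
                break
            gap += 1
        v = min(v, gap - 1)                           # brake
        if v > 0 and random.random() < P_BRAKE:       # random slowdown
            v -= 1
        moves[x] = v
    vel = {}
    for x, v in moves.items():
        if x < LIGHT <= x + v:                        # crossed the light
            throughput += 1
        vel[(x + v) % L] = v

# The green fraction maps nonlinearly onto throughput, as in the abstract.
print("cars through the blockage per step:", throughput / T)
```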

Journal ArticleDOI
TL;DR: A model of analog computation that can recognize various languages in real time by composing iterated maps; it can be seen as a real-time, constant-space, off-line version of Blum, Shub, and Smale's real-valued machines.

Journal ArticleDOI
TL;DR: A highly detailed model of phage display, specifically designed to study the influence of the stochastic nature of each laboratory step, is devised and analyzed in order to better devise molecular search strategies that take the errors into account.

Journal ArticleDOI
TL;DR: In this article, the authors study how infectious diseases spread in space within one cycle of an epidemic, a question that has received considerable theoretical attention but few empirical tests.
Abstract: How infectious diseases spread in space within one cycle of an epidemic is an important question that has received considerable theoretical attention. There are, however, few empirical studies to s...

Book ChapterDOI
27 Sep 1998
TL;DR: The results show, via a close quantitative agreement, that the embedded-particle framework captures the main information processing mechanisms of the emergent computation that arise in these evolved CAs.
Abstract: We introduce a class of embedded-particle models for describing the emergent computational strategies observed in cellular automata (CAs) that were evolved for performing certain computational tasks. The models are evaluated by comparing their estimated performances with the actual performances of the CAs they model. The results show, via a close quantitative agreement, that the embedded-particle framework captures the main information processing mechanisms of the emergent computation that arise in these evolved CAs.

Journal ArticleDOI
TL;DR: The theory of autocatalytic binary ligation is reviewed within the context of a consistently applied Michaelis-Menten quasi-steady-state approximation to obtain explicit analytical results describing time-course data from experiments.

Journal ArticleDOI
TL;DR: It is shown that a wide variety of nonlinear cellular automata (CAs) can be decomposed into a quasidirect product of linear ones, and that CAs based on nilpotent groups can be predicted in depth O(log t) or O(1) by circuits with binary or “sum mod p” gates, respectively.

Journal ArticleDOI
TL;DR: The notion of "temporal granularity" is introduced, which measures the extent to which discrete-time implementations of continuous-time models can track the payoff of a derivative security, by characterizing the asymptotic distribution of the replication errors that arise from delta-hedging derivative securities in discrete time.
Abstract: Continuous-time stochastic processes have become central to many disciplines, yet the fact that they are approximations to physically realizable phenomena is often overlooked. We quantify one aspect of the approximation errors of continuous-time models by investigating the replication errors that arise from delta hedging derivative securities in discrete time. We characterize the asymptotic distribution of these replication errors and their joint distribution with other assets as the number of discrete time periods increases. We introduce the notion of "temporal granularity" for continuous-time stochastic processes, which allows us to quantify the extent to which discrete-time implementations of continuous-time models can track the payoff of a derivative security. We show that granularity is a function of the contract specifications of the derivative security, and of the degree of market completeness. We derive closed form expressions for the granularity of geometric Brownian motion and of an Ornstein-Uhlenbeck process for call and put options, and perform Monte Carlo simulations that illustrate the practical relevance of granularity.
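
A Monte Carlo sketch of the replication error that granularity quantifies: delta-hedging a European call at N discrete times under geometric Brownian motion. Standard Black-Scholes formulas; parameters are illustrative, with a zero interest rate for brevity:

```python
import random
from math import erf, exp, log, sqrt

# Replication error of discrete-time delta hedging of a European call under
# geometric Brownian motion; zero interest rate, illustrative parameters.
S0, K, T, SIGMA = 100.0, 100.0, 0.25, 0.2

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def call_price_delta(S, tau):
    if tau <= 0:
        return max(S - K, 0.0), 1.0 if S > K else 0.0
    d1 = (log(S / K) + 0.5 * SIGMA**2 * tau) / (SIGMA * sqrt(tau))
    d2 = d1 - SIGMA * sqrt(tau)
    return S * norm_cdf(d1) - K * norm_cdf(d2), norm_cdf(d1)

random.seed(7)
for n_steps in (10, 50, 250):
    dt, errors = T / n_steps, []
    for _ in range(2000):
        S = S0
        price, delta = call_price_delta(S, T)
        cash = price - delta * S            # self-financing hedge portfolio
        for step in range(1, n_steps + 1):
            S *= exp(-0.5 * SIGMA**2 * dt
                     + SIGMA * sqrt(dt) * random.gauss(0.0, 1.0))
            _, new_delta = call_price_delta(S, T - step * dt)
            cash -= (new_delta - delta) * S  # rebalance at the new price
            delta = new_delta
        errors.append(delta * S + cash - max(S - K, 0.0))
    rms = (sum(e * e for e in errors) / len(errors)) ** 0.5
    print(f"N={n_steps:4d}  RMS hedging error {rms:.4f}")  # shrinks ~N^-1/2
```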

Journal ArticleDOI
TL;DR: A reliable procedure for estimating amplitude spectra is presented, based on certain correlation functions that are easily obtained from empirical studies of the landscapes.
Abstract: Fitness landscapes can be decomposed into elementary landscapes using a Fourier transform that is determined by the structure of the underlying configuration space. The amplitude spectrum obtained from the Fourier transform contains information about the ruggedness of the landscape. It can be used for classification and comparison purposes. We consider here three very different types of landscapes using both mutation and recombination to define the topological structure of the configuration spaces. A reliable procedure for estimating the amplitude spectra is presented. The method is based on certain correlation functions that are easily obtained from empirical studies of the landscapes.
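
A sketch of the correlation-function approach: estimate the autocorrelation of fitness along a simple random walk under the chosen move set, here single-bit mutation on a toy landscape with nearest-neighbour couplings as an illustrative stand-in:

```python
import random

# Estimate the fitness autocorrelation function r(lag) along a random walk,
# the empirical quantity the amplitude-spectrum procedure builds on. The
# landscape (random nearest-neighbour couplings on binary strings) and all
# parameters are illustrative assumptions.
N, WALK, MAXLAG = 20, 20000, 10
random.seed(8)
table = {}   # random site contributions, generated lazily but deterministically

def fitness(s):
    return sum(table.setdefault((i, s[i], s[(i + 1) % N]), random.random())
               for i in range(N)) / N

s = tuple(random.randint(0, 1) for _ in range(N))
f = []
for _ in range(WALK):
    i = random.randrange(N)
    s = s[:i] + (1 - s[i],) + s[i + 1:]   # mutation move: flip one random bit
    f.append(fitness(s))

mean = sum(f) / len(f)
var = sum((x - mean) ** 2 for x in f) / len(f)
for lag in range(1, MAXLAG + 1):
    cov = sum((f[t] - mean) * (f[t + lag] - mean)
              for t in range(len(f) - lag)) / (len(f) - lag)
    print(f"r({lag:2d}) = {cov / var:+.3f}")   # near-exponential decay here
```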