
Showing papers by the Santa Fe Institute published in 2016


Journal ArticleDOI
TL;DR: An evolutionary definition of a cell type is proposed that allows cell types to be delineated and compared within and between species, and the distinction between developmental and evolutionary lineages is discussed.
Abstract: Cell types are the basic building blocks of multicellular organisms and are extensively diversified in animals. Despite recent advances in characterizing cell types, classification schemes remain ambiguous. We propose an evolutionary definition of a cell type that allows cell types to be delineated and compared within and between species. Key to cell type identity are evolutionary changes in the 'core regulatory complex' (CoRC) of transcription factors that make emergent sister cell types distinct, enable their independent evolution and regulate cell type-specific traits termed apomeres. We discuss the distinction between developmental and evolutionary lineages, and present a roadmap for future research.

523 citations


Journal ArticleDOI
TL;DR: This work focuses on four major phases that witnessed broad anthropogenic alterations to biodiversity—the Late Pleistocene global human expansion, the Neolithic spread of agriculture, the era of island colonization, and the emergence of early urbanized societies and commercial networks.
Abstract: The exhibition of increasingly intensive and complex niche construction behaviors through time is a key feature of human evolution, culminating in the advanced capacity for ecosystem engineering exhibited by Homo sapiens. A crucial outcome of such behaviors has been the dramatic reshaping of the global biosphere, a transformation whose early origins are increasingly apparent from cumulative archaeological and paleoecological datasets. Such data suggest that, by the Late Pleistocene, humans had begun to engage in activities that have led to alterations in the distributions of a vast array of species across most, if not all, taxonomic groups. Changes to biodiversity have included extinctions, extirpations, and shifts in species composition, diversity, and community structure. We outline key examples of these changes, highlighting findings from the study of new datasets, like ancient DNA (aDNA), stable isotopes, and microfossils, as well as the application of new statistical and computational methods to datasets that have accumulated significantly in recent decades. We focus on four major phases that witnessed broad anthropogenic alterations to biodiversity—the Late Pleistocene global human expansion, the Neolithic spread of agriculture, the era of island colonization, and the emergence of early urbanized societies and commercial networks. Archaeological evidence documents millennia of anthropogenic transformations that have created novel ecosystems around the world. This record has implications for ecological and evolutionary research, conservation strategies, and the maintenance of ecosystem services, pointing to a significant need for broader cross-disciplinary engagement between archaeology and the biological and environmental sciences.

516 citations


Journal ArticleDOI
TL;DR: Growthcurver summarizes the growth characteristics of microbial growth curve experiments conducted in a plate reader and is an easy-to-use R package available for installation from the Comprehensive R Archive Network (CRAN).
Abstract: Plate readers can measure the growth curves of many microbial strains in a high-throughput fashion. The hundreds of absorbance readings collected simultaneously for hundreds of samples create technical hurdles for data analysis. Growthcurver summarizes the growth characteristics of microbial growth curve experiments conducted in a plate reader. The data are fitted to a standard form of the logistic equation, and the parameters have clear interpretations on population-level characteristics, like doubling time, carrying capacity, and growth rate. Growthcurver is an easy-to-use R package available for installation from the Comprehensive R Archive Network (CRAN). The source code is available under the GNU General Public License and can be obtained from Github (Sprouffske K, Growthcurver sourcecode, 2016).
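Growthcurver itself is an R package, but the logistic fit it performs can be illustrated in a few lines of Python. The sketch below is a hypothetical re-creation, not the package's own code: it generates synthetic plate-reader absorbance data, fits the standard logistic equation, and derives the doubling time from the fitted growth rate.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, n0, r):
    """Standard logistic growth: carrying capacity k, initial size n0, rate r."""
    return k / (1 + ((k - n0) / n0) * np.exp(-r * t))

# Synthetic absorbance readings standing in for one well of a plate reader.
t = np.linspace(0, 24, 49)                       # hours
rng = np.random.default_rng(0)
od = logistic(t, 1.2, 0.05, 0.6) + rng.normal(0, 0.01, t.size)

# Fit the logistic model; the parameters have direct population-level meaning.
(k, n0, r), _ = curve_fit(logistic, t, od, p0=[1.0, 0.1, 0.5])
doubling_time = np.log(2) / r                    # hours per doubling
```

The fitted `k`, `n0` and `r` correspond to the carrying capacity, initial population and intrinsic growth rate described in the abstract.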

461 citations


Journal ArticleDOI
TL;DR: This target article sketches the evidence from five domains that bear on the explanatory adequacy of cultural group selection and competing hypotheses to explain human cooperation and presents evidence, including quantitative evidence, that the answer to all of the questions is “yes” and argues that it is not clear that any extant alternative tocultural group selection can be a complete explanation.
Abstract: Human cooperation is highly unusual. We live in large groups composed mostly of non-relatives. Evolutionists have proposed a number of explanations for this pattern, including cultural group selection and extensions of more general processes such as reciprocity, kin selection, and multi-level selection acting on genes. Evolutionary processes are consilient; they affect several different empirical domains, such as patterns of behavior and the proximal drivers of that behavior. In this target article, we sketch the evidence from five domains that bear on the explanatory adequacy of cultural group selection and competing hypotheses to explain human cooperation. Does cultural transmission constitute an inheritance system that can evolve in a Darwinian fashion? Are the norms that underpin institutions among the cultural traits so transmitted? Do we observe sufficient variation at the level of groups of considerable size for group selection to be a plausible process? Do human groups compete, and do success and failure in competition depend upon cultural variation? Do we observe adaptations for cooperation in humans that most plausibly arose by cultural group selection? If the answer to one of these questions is "no," then we must look to other hypotheses. We present evidence, including quantitative evidence, that the answer to all of the questions is "yes" and argue that we must take the cultural group selection hypothesis seriously. If culturally transmitted systems of rules (institutions) that limit individual deviance organize cooperation in human societies, then it is not clear that any extant alternative to cultural group selection can be a complete explanation.

422 citations


Journal ArticleDOI
TL;DR: This is the first study demonstrating that the diversity of microbial groups turns over at significantly lower rates across temperature gradients than that of other major taxa, which has important implications for assessing the effects of human-caused changes in climate, land use and other factors.
Abstract: Climate warming is increasingly leading to marked changes in plant and animal biodiversity, but it remains unclear how temperatures affect microbial biodiversity, particularly in terrestrial soils. Here we show that, in accordance with metabolic theory of ecology, taxonomic and phylogenetic diversity of soil bacteria, fungi and nitrogen fixers are all better predicted by variation in environmental temperature than pH. However, the rates of diversity turnover across the global temperature gradients are substantially lower than those recorded for trees and animals, suggesting that the diversity of plant, animal and soil microbial communities show differential responses to climate change. To the best of our knowledge, this is the first study demonstrating that the diversity of different microbial groups has significantly lower rates of turnover across temperature gradients than other major taxa, which has important implications for assessing the effects of human-caused changes in climate, land use and other factors.

383 citations


Journal ArticleDOI
19 Feb 2016-Science
TL;DR: Economic policy needs interdisciplinary network analysis and behavioral modeling, which offer potential for better monitoring and management of highly interconnected economic and financial systems and, thus, may help anticipate and manage future crises.
Abstract: Traditional economic theory could not explain, much less predict, the near collapse of the financial system and its long-lasting effects on the global economy. Since the 2008 crisis, there has been increasing interest in using ideas from complexity theory to make sense of economic and financial markets. Concepts, such as tipping points, networks, contagion, feedback, and resilience have entered the financial and regulatory lexicon, but actual use of complexity models and results remains at an early stage. Recent insights and techniques offer potential for better monitoring and management of highly interconnected economic and financial systems and, thus, may help anticipate and manage future crises.

344 citations


Journal ArticleDOI
TL;DR: This work focuses in particular on the problem of community detection in networks and develops a mathematically principled approach that combines a network and its metadata to detect communities more accurately than can be done with either alone.
Abstract: For many networks of scientific interest we know both the connections of the network and information about the network nodes, such as the age or gender of individuals in a social network. Here we demonstrate how this 'metadata' can be used to improve our understanding of network structure. We focus in particular on the problem of community detection in networks and develop a mathematically principled approach that combines a network and its metadata to detect communities more accurately than can be done with either alone. Crucially, the method does not assume that the metadata are correlated with the communities we are trying to find. Instead, the method learns whether a correlation exists and correctly uses or ignores the metadata depending on whether they contain useful information. We demonstrate our method on synthetic networks with known structure and on real-world networks, large and small, drawn from social, biological and technological domains.

344 citations


Journal ArticleDOI
TL;DR: By analyzing word lists covering nearly two-thirds of the world’s languages, it is demonstrated that a considerable proportion of 100 basic vocabulary items carry strong associations with specific kinds of human speech sounds, occurring persistently across continents and linguistic lineages (linguistic families or isolates).
Abstract: It is widely assumed that one of the fundamental properties of spoken language is the arbitrary relation between sound and meaning. Some exceptions in the form of nonarbitrary associations have been documented in linguistics, cognitive science, and anthropology, but these studies only involved small subsets of the 6,000+ languages spoken in the world today. By analyzing word lists covering nearly two-thirds of the world’s languages, we demonstrate that a considerable proportion of 100 basic vocabulary items carry strong associations with specific kinds of human speech sounds, occurring persistently across continents and linguistic lineages (linguistic families or isolates). Prominently among these relations, we find property words (“small” and i, “full” and p or b) and body part terms (“tongue” and l, “nose” and n). The areal and historical distribution of these associations suggests that they often emerge independently rather than being inherited or borrowed. Our results therefore have important implications for the language sciences, given that nonarbitrary associations have been proposed to play a critical role in the emergence of cross-modal mappings, the acquisition of language, and the evolution of our species’ unique communication system.

320 citations


Journal ArticleDOI
TL;DR: The Paris Agreement duly reflects the latest scientific understanding of systemic global warming risks. Limiting the anthropogenic temperature anomaly to 1.5–2 °C is possible, yet requires transformational change across the board of modernity.
Abstract: The Paris Agreement duly reflects the latest scientific understanding of systemic global warming risks. Limiting the anthropogenic temperature anomaly to 1.5–2 °C is possible, yet requires transformational change across the board of modernity.

276 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that energy storage can add value to wind and solar technologies, but that cost reduction remains necessary to reach widespread profitability, and that energy storage is vital to the widespread rollout of renewable electricity technologies.
Abstract: Energy storage is vital to the widespread rollout of renewable electricity technologies. Modelling shows that energy storage can add value to wind and solar technologies, but cost reduction remains necessary to reach widespread profitability.

258 citations


Journal ArticleDOI
TL;DR: It is suggested that the increased perceptual salience of the violation in utterance-final position (due to phrase-final lengthening) influenced how S-V agreement violations were processed during sentence comprehension.
Abstract: Previous ERP studies have often reported two ERP components—LAN and P600—in response to subject-verb (S-V) agreement violations (e.g., the boys *runs). However, the latency, amplitude and scalp distribution of these components have been shown to vary depending on various experiment-related factors. One factor that has not received attention is the extent to which the relative perceptual salience related to either the utterance position (verbal inflection in utterance-medial vs. utterance-final contexts) or the type of agreement violation (errors of omission vs. errors of commission) may influence the auditory processing of S-V agreement. The lack of reports on these effects in ERP studies may be due to the fact that most studies have used the visual modality, which does not reveal acoustic information. To address this gap, we used ERPs to measure the brain activity of Australian English-speaking adults while they listened to sentences in which the S-V agreement differed by type of agreement violation and utterance position. We observed early negative and positive clusters (AN/P600 effects) for the overall grammaticality effect. Further analysis revealed that the mean amplitude and distribution of the P600 effect was only significant in contexts where the S-V agreement violation occurred utterance-finally, regardless of the type of agreement violation. The mean amplitude and distribution of the negativity did not differ significantly across types of agreement violation and utterance position. These findings suggest that the increased perceptual salience of the violation in utterance-final position (due to phrase-final lengthening) influenced how S-V agreement violations were processed during sentence comprehension. Implications for the functional interpretation of language-related ERPs and experimental design are discussed.

Journal ArticleDOI
TL;DR: In this paper, the authors survey the literature on the economic consequences of the structure of social networks and develop a taxonomy of macro and micro characteristics of social interaction networks and discuss both the theoretical and empirical findings concerning the role of those characteristics in determining learning, diffusion, decisions, and resulting behaviors.
Abstract: We survey the literature on the economic consequences of the structure of social networks. We develop a taxonomy of 'macro' and 'micro' characteristics of social interaction networks and discuss both the theoretical and empirical findings concerning the role of those characteristics in determining learning, diffusion, decisions, and resulting behaviors. We also discuss the challenges of accounting for the endogeneity of networks in assessing the relationship between the patterns of interactions and behaviors.

Journal ArticleDOI
TL;DR: The use of phages in combination with antibiotics is advocated, and the evolutionary basis for this claim is presented.

Journal ArticleDOI
TL;DR: In this paper, the authors present a model of the energy consumption of personal vehicles in the USA, allowing an evaluation of the potential of electric vehicles to meet the needs of individual drivers.
Abstract: Large-scale adoption of electric vehicles will only occur if the needs of individual drivers are met. Here the authors present a model of the energy consumption of personal vehicles in the USA, allowing an evaluation of the adoption potential of electric vehicles.

Journal ArticleDOI
TL;DR: This work formulates Moore's law as a correlated geometric random walk with drift, and derives a closed form expression approximating the distribution of forecast errors as a function of time, making it possible to collapse the forecast errors for many different technologies at different time horizons onto the same universal distribution.
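The random-walk formulation can be sketched by simulation. The snippet below is an illustrative toy, not the paper's model: log cost follows a random walk with drift, successive increments are correlated through MA(1) noise (the `theta` term is an assumption for illustration), and the spread of forecast errors is measured at a fixed horizon by extrapolating the drift and pooling many runs.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_log_cost(n_steps, drift=-0.1, sigma=0.05, theta=0.3):
    """Correlated geometric random walk in log space: increments are
    drift plus MA(1) noise, so consecutive steps are correlated."""
    eps = rng.normal(0, sigma, n_steps + 1)
    noise = eps[1:] + theta * eps[:-1]
    return np.cumsum(drift + noise)

# Forecast error at horizon tau: realized walk minus the pure-drift forecast.
tau, runs = 10, 2000
errors = np.array([simulate_log_cost(tau)[-1] - tau * (-0.1) for _ in range(runs)])
# For a random walk the error spread grows roughly like sigma * sqrt(tau),
# which is what lets errors at different horizons be rescaled and pooled.
error_std = errors.std()
```

Repeating this at several horizons and dividing each error by its horizon-dependent scale is the kind of collapse onto a single distribution the TL;DR describes.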

Journal ArticleDOI
TL;DR: It is shown that while most large urban systems in Western Europe approximately agree with theoretical expectations, the small number of cities in each nation and their natural variability preclude drawing strong conclusions, and a simple statistical procedure is demonstrated to identify urban scaling relations, which then clearly emerge as a property of European cities.
Abstract: Over the last few decades, in disciplines as diverse as economics, geography and complex systems, a perspective has arisen proposing that many properties of cities are quantitatively predictable due to agglomeration or scaling effects. Using new harmonized definitions for functional urban areas, we examine to what extent these ideas apply to European cities. We show that while most large urban systems in Western Europe (France, Germany, Italy, Spain, UK) approximately agree with theoretical expectations, the small number of cities in each nation and their natural variability preclude drawing strong conclusions. We demonstrate how this problem can be overcome so that cities from different urban systems can be pooled together to construct larger datasets. This leads to a simple statistical procedure to identify urban scaling relations, which then clearly emerge as a property of European cities. We compare the predictions of urban scaling to Zipf's law for the size distribution of cities and show that while the former holds well the latter is a poor descriptor of European cities. We conclude with scenarios for the size and properties of future pan-European megacities and their implications for the economic productivity, technological sophistication and regional inequalities of an integrated European urban system.
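The scaling relations at issue have the form Y = Y0 * N^beta, where N is city population and beta is the scaling exponent, estimated by ordinary least squares in log-log coordinates. The sketch below uses hypothetical synthetic cities (the superlinear beta = 1.15 is a commonly cited value for socioeconomic outputs, assumed here for illustration) rather than the paper's harmonized European data.

```python
import numpy as np

# Hypothetical city populations and a socioeconomic output (e.g. GDP),
# generated with a superlinear exponent beta = 1.15 plus lognormal noise.
rng = np.random.default_rng(2)
pop = 10 ** rng.uniform(4.5, 7, 200)             # 200 synthetic cities
gdp = 2.0 * pop ** 1.15 * np.exp(rng.normal(0, 0.2, pop.size))

# The scaling exponent is the slope of an OLS fit in log-log coordinates.
beta, log_y0 = np.polyfit(np.log(pop), np.log(gdp), 1)
```

The paper's point about small national samples can be seen directly here: refitting with only a dozen cities makes the estimated `beta` far noisier, which is why pooling cities across urban systems helps.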

Journal ArticleDOI
TL;DR: This work forecasts climate impacts on future forest growth in North America using a network of over two million tree-ring observations spanning North America and a space-for-time substitution methodology, and explores differing scenarios of increased water-use efficiency due to CO2-fertilisation and increased effective precipitation.
Abstract: Predicting long-term trends in forest growth requires accurate characterisation of how the relationship between forest productivity and climatic stress varies across climatic regimes. Using a network of over two million tree-ring observations spanning North America and a space-for-time substitution methodology, we forecast climate impacts on future forest growth. We explored differing scenarios of increased water-use efficiency (WUE) due to CO2 -fertilisation, which we simulated as increased effective precipitation. In our forecasts: (1) climate change negatively impacted forest growth rates in the interior west and positively impacted forest growth along the western, southeastern and northeastern coasts; (2) shifting climate sensitivities offset positive effects of warming on high-latitude forests, leaving no evidence for continued 'boreal greening'; and (3) it took a 72% WUE enhancement to compensate for continentally averaged growth declines under RCP 8.5. Our results highlight the importance of locally adapted forest management strategies to handle regional differences in growth responses to climate change.

Journal ArticleDOI
TL;DR: A mathematical model was developed that predicted neutralization by a subset of experimentally evaluated bnAb combinations with high accuracy, and triple and quadruple combinations of bnAbs were identified that were significantly more effective than the best double combinations, and further improved the probability of having multiple bnAbs simultaneously active against a given virus.
Abstract: The identification of a new generation of potent broadly neutralizing HIV-1 antibodies (bnAbs) has generated substantial interest in their potential use for the prevention and/or treatment of HIV-1 infection. While combinations of bnAbs targeting distinct epitopes on the viral envelope (Env) will likely be required to overcome the extraordinary diversity of HIV-1, a key outstanding question is which bnAbs, and how many, will be needed to achieve optimal clinical benefit. We assessed the neutralizing activity of 15 bnAbs targeting four distinct epitopes of Env, including the CD4-binding site (CD4bs), the V1/V2-glycan region, the V3-glycan region, and the gp41 membrane proximal external region (MPER), against a panel of 200 acute/early clade C HIV-1 Env pseudoviruses. A mathematical model was developed that predicted neutralization by a subset of experimentally evaluated bnAb combinations with high accuracy. Using this model, we performed a comprehensive and systematic comparison of the predicted neutralizing activity of over 1,600 possible double, triple, and quadruple bnAb combinations. The most promising bnAb combinations were identified based not only on breadth and potency of neutralization, but also other relevant measures, such as the extent of complete neutralization and instantaneous inhibitory potential (IIP). By this set of criteria, triple and quadruple combinations of bnAbs were identified that were significantly more effective than the best double combinations, and further improved the probability of having multiple bnAbs simultaneously active against a given virus, a requirement that may be critical for countering escape in vivo. These results provide a rationale for advancing bnAb combinations with the best in vitro predictors of success into clinical trials for both the prevention and treatment of HIV-1 infection.

Journal ArticleDOI
TL;DR: Reciprocally, domestication provides a model system for evaluating on-going debates in evolutionary biology concerning the impact of niche construction, phenotypic plasticity, extra-genetic inheritance, and developmental bias in shaping the direction and tempo of evolutionary change.
Abstract: Niche Construction Theory (NCT) provides a powerful conceptual framework for understanding how and why humans and target species entered into domesticatory relationships that have transformed Earth's biota, landforms, and atmosphere, and shaped the trajectory of human cultural development. NCT provides fresh perspective on how niche-constructing behaviors of humans and plants and animals promote co-evolutionary interactions that alter selection pressures and foster genetic responses in domesticates. It illuminates the role of niche-altering activities in bequeathing an ecological inheritance that perpetuates the co-evolutionary relationships leading to domestication, especially as it pertains to traditional ecological knowledge and the transmission of learned behaviors aimed at enhancing returns from local environments. NCT also provides insights into the contexts and mechanisms that promote cooperative interactions in both humans and target species needed to sustain niche-constructing activities, ensuring that these activities produce an ecological inheritance in which domesticates play an increasing role. A NCT perspective contributes to on-going debates in the social sciences over explanatory frameworks for domestication, in particular as they pertain to issues of reciprocal causation, co-evolution, and the role of human intentionality. Reciprocally, domestication provides a model system for evaluating on-going debates in evolutionary biology concerning the impact of niche construction, phenotypic plasticity, extra-genetic inheritance, and developmental bias in shaping the direction and tempo of evolutionary change.

Journal ArticleDOI
TL;DR: In this paper, the authors examine the interplay between social norms and the enforcement of laws and show that laws that are in strong conflict with prevailing social norms may backfire, while gradual tightening of laws can be more effective in influencing social norms.
Abstract: We examine the interplay between social norms and the enforcement of laws. Agents choose a behavior (e.g., tax evasion, production of low-quality products, corruption, harassing behavior, substance abuse, etc.) and then are randomly matched with another agent. There are complementarities in behaviors so that an agent's payoff decreases with the mismatch between her behavior and her partner's, and from overall negative externalities created by the behavior of others. A law is an upper bound (cap) on behavior. A law-breaker, when detected, pays a fine and has her behavior forced down to the level of the law. Equilibrium law-breaking depends on social norms because detection relies, at least in part, on whistle-blowing. Law-abiding agents have an incentive to whistle-blow on a law-breaking partner because this reduces the mismatch with their partners' behaviors as well as the negative externalities. When laws are in conflict with norms and many agents are breaking the law, each agent anticipates little whistle-blowing and is more likely to also break the law. Tighter laws (banning more behaviors), greater fines, and better public enforcement, all have counteracting effects, reducing behavior among law-abiding individuals but increasing it among law-breakers. We show that laws that are in strong conflict with prevailing social norms may backfire, while gradual tightening of laws can be more effective in influencing social norms and behavior.

Posted Content
TL;DR: This work studies the subtle but important decisions underlying the specification of a configuration model, and investigates the role these choices play in graph sampling procedures and a suite of applications, placing particular emphasis on the importance of specifying the appropriate graph labeling under which to consider a null model.
Abstract: Random graph null models have found widespread application in diverse research communities analyzing network datasets, including social, information, and economic networks, as well as food webs, protein-protein interactions, and neuronal networks. The most popular family of random graph null models, called configuration models, are defined as uniform distributions over a space of graphs with a fixed degree sequence. Commonly, properties of an empirical network are compared to properties of an ensemble of graphs from a configuration model in order to quantify whether empirical network properties are meaningful or whether they are instead a common consequence of the particular degree sequence. In this work we study the subtle but important decisions underlying the specification of a configuration model, and investigate the role these choices play in graph sampling procedures and a suite of applications. We place particular emphasis on the importance of specifying the appropriate graph labeling (stub-labeled or vertex-labeled) under which to consider a null model, a choice that closely connects the study of random graphs to the study of random contingency tables. We show that the choice of graph labeling is inconsequential for studies of simple graphs, but can have a significant impact on analyses of multigraphs or graphs with self-loops. The importance of these choices is demonstrated through a series of three vignettes, analyzing network datasets under many different configuration models and observing substantial differences in study conclusions under different models. We argue that in each case, only one of the possible configuration models is appropriate. While our work focuses on undirected static networks, it aims to guide the study of directed networks, dynamic networks, and all other network contexts that are suitably studied through the lens of random graph null models.
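The stub-labeled configuration model the abstract discusses can be sampled by uniform stub matching. The sketch below is a minimal illustration under that labeling choice, not the paper's code: each vertex gets as many stubs as its degree, stubs are paired uniformly at random, and the resulting self-loops and multi-edges are kept, which is exactly where the stub-labeled versus vertex-labeled distinction bites.

```python
import random
from collections import Counter

def stub_labeled_configuration_sample(degrees, seed=0):
    """Draw one multigraph from the stub-labeled configuration model by
    uniform stub matching. Self-loops and multi-edges are retained."""
    rng = random.Random(seed)
    stubs = [v for v, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    return [tuple(sorted((stubs[i], stubs[i + 1])))
            for i in range(0, len(stubs), 2)]

edges = stub_labeled_configuration_sample([3, 3, 2, 2, 1, 1])

# The degree sequence is preserved exactly by construction
# (a self-loop contributes 2 to its endpoint's degree).
deg = Counter()
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
```

Conditioning this sampler to reject self-loops or multi-edges changes the distribution over graphs, which is one way the choice of configuration model can alter study conclusions.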

Journal ArticleDOI
TL;DR: In this article, an overview of different sequential, nontailored, as well as specialized tailored algorithms on the Google instances is given, and the typical complexity of the benchmark problems using insights from the study of spin glasses.
Abstract: To date, a conclusive detection of quantum speedup remains elusive. Recently, a team from Google Inc. [V. S. Denchev et al., Phys. Rev. X 6, 031015 (2016)] proposed a weak-strong cluster model tailored to have tall and narrow energy barriers separating local minima, with the aim to highlight the value of finite-range tunneling. More precisely, results from quantum Monte Carlo simulations as well as the D-Wave 2X quantum annealer scale considerably better than state-of-the-art simulated annealing simulations. Moreover, the D-Wave 2X quantum annealer is ~10^8 times faster than simulated annealing on conventional computer hardware for problems with approximately 10^3 variables. Here, an overview of different sequential, nontailored, as well as specialized tailored algorithms on the Google instances is given. We show that the quantum speedup is limited to sequential approaches and study the typical complexity of the benchmark problems using insights from the study of spin glasses.

Journal ArticleDOI
TL;DR: In this paper, the authors used an extensive dataset on mobile phone activity in Rwanda and exploited the quasi-random timing and location of natural disasters to show that individuals make transfers and calls to people affected by disasters.

Journal ArticleDOI
14 Jan 2016-Nature
TL;DR: A critical functional relationship between boreal summer insolation and global carbon dioxide (CO2) concentration is proposed, which explains the beginning of the past eight glacial cycles and might anticipate future periods of glacial inception.
Abstract: The past rapid growth of Northern Hemisphere continental ice sheets, which terminated warm and stable climate periods, is generally attributed to reduced summer insolation in boreal latitudes. Yet such summer insolation is near to its minimum at present, and there are no signs of a new ice age. This challenges our understanding of the mechanisms driving glacial cycles and our ability to predict the next glacial inception. Here we propose a critical functional relationship between boreal summer insolation and global carbon dioxide (CO2) concentration, which explains the beginning of the past eight glacial cycles and might anticipate future periods of glacial inception. Using an ensemble of simulations generated by an Earth system model of intermediate complexity constrained by palaeoclimatic data, we suggest that glacial inception was narrowly missed before the beginning of the Industrial Revolution. The missed inception can be accounted for by the combined effect of relatively high late-Holocene CO2 concentrations and the low orbital eccentricity of the Earth. Additionally, our analysis suggests that even in the absence of human perturbations no substantial build-up of ice sheets would occur within the next several thousand years and that the current interglacial would probably last for another 50,000 years. However, moderate anthropogenic cumulative CO2 emissions of 1,000 to 1,500 gigatonnes of carbon will postpone the next glacial inception by at least 100,000 years. Our simulations demonstrate that under natural conditions alone the Earth system would be expected to remain in the present delicately balanced interglacial climate state, steering clear of both large-scale glaciation of the Northern Hemisphere and its complete deglaciation, for an unusually long time.

Journal ArticleDOI
TL;DR: It is suggested that opportunities exist to produce process-based range models for many species, by using hierarchical and inverse modeling to borrow strength across species, fill data gaps, fuse diverse data sets, and model across biological and spatial scales.
Abstract: Understanding and forecasting species' geographic distributions in the face of global change is a central priority in biodiversity science. The existing view is that one must choose between correlative models for many species versus process-based models for few species. We suggest that opportunities exist to produce process-based range models for many species, by using hierarchical and inverse modeling to borrow strength across species, fill data gaps, fuse diverse data sets, and model across biological and spatial scales. We review the statistical ecology and population and range modeling literature, illustrating these modeling strategies in action. A variety of large, coordinated ecological datasets that can feed into these modeling solutions already exist, and we highlight organisms that seem ripe for the challenge.

Journal ArticleDOI
TL;DR: This work adapts Forman's discretization of Ricci curvature to the case of undirected networks, both weighted and unweighted, and investigates the measure in a variety of model and real-world networks, suggesting that it can be employed to gain novel insights into the organization of complex networks.
Abstract: We adapt Forman's discretization of Ricci curvature to the case of undirected networks, both weighted and unweighted, and investigate the measure in a variety of model and real-world networks. We find that most nodes and edges in model and real networks have a negative curvature. Furthermore, the distribution of Forman curvature of nodes and edges is narrow in random and small-world networks, while the distribution is broad in scale-free and real-world networks. In most networks, Forman curvature is found to display significant negative correlation with degree and centrality measures. However, Forman curvature is uncorrelated with clustering coefficient in most networks. Importantly, we find that both model and real networks are vulnerable to targeted deletion of nodes with highly negative Forman curvature. Our results suggest that Forman curvature can be employed to gain novel insights on the organization of complex networks.
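In the unweighted case studied above, Forman's curvature of an edge (u, v) reduces to 4 − deg(u) − deg(v), so high-degree hubs sit on highly negative edges. The sketch below is a minimal pure-Python illustration; the toy graph and the convention of obtaining a node's curvature by summing its incident edge curvatures are assumptions made here for illustration, not the paper's code.

```python
def forman_curvature(adj, u, v):
    """Forman curvature of edge (u, v) in an unweighted, undirected graph:
    F(u, v) = 4 - deg(u) - deg(v)."""
    return 4 - len(adj[u]) - len(adj[v])

# Toy graph: a star with centre 0 and leaves 1..4, plus an edge between 1 and 2.
adj = {0: {1, 2, 3, 4}, 1: {0, 2}, 2: {0, 1}, 3: {0}, 4: {0}}
edges = {(u, v) for u in adj for v in adj[u] if u < v}

# Edge curvatures: hub edges are negative, the peripheral edge (1, 2) is flat.
curv = {e: forman_curvature(adj, *e) for e in edges}

# One convention for node curvature: sum over incident edges.
node_curv = {n: sum(c for e, c in curv.items() if n in e) for n in adj}
```

Running this, every edge touching the hub has negative curvature, consistent with the abstract's observation that most edges in hub-dominated (scale-free) networks are negatively curved.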

Journal ArticleDOI
TL;DR: It is found that 5,310 years ago, maize in the Tehuacan Valley was on the whole genetically closer to modern maize than to its wild counterpart, although many genes associated with key domestication traits still carried the ancestral state, sharply contrasting with the ubiquity of derived alleles in living landraces.

Journal ArticleDOI
TL;DR: A simple mathematical derivation of the universality is presented, and a model for understanding the observed empirical distribution of business-category abundances across US metropolitan statistical areas is provided, together with its economic implications for the open-ended diversity created by urbanization.
Abstract: Understanding cities is central to addressing major global challenges from climate change to economic resilience. Although increasingly perceived as fundamental socio-economic units, the detailed fabric of urban economic activities is only recently accessible to comprehensive analyses with the availability of large datasets. Here, we study abundances of business categories across US metropolitan statistical areas, and provide a framework for measuring the intrinsic diversity of economic activities that transcends scales of the classification scheme. A universal structure common to all cities is revealed, manifesting self-similarity in internal economic structure as well as aggregated metrics (GDP, patents, crime). We present a simple mathematical derivation of the universality, and provide a model, together with its economic implications of open-ended diversity created by urbanization, for understanding the observed empirical distribution. Given the universal distribution, scaling analyses for individual business categories enable us to determine their relative abundances as a function of city size. These results shed light on the processes of economic differentiation with scale, suggesting a general structure for the growth of national economies as integrated urban systems.
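One simple way to make the idea of a diversity measure "that transcends scales of the classification scheme" concrete is the effective number of categories, the exponential of the Shannon entropy of the abundance distribution (a Hill number of order 1), which discounts rare categories rather than merely counting them. The estimator and the toy abundance data below are illustrative assumptions, not the paper's actual measure or data.

```python
import math

def effective_diversity(abundances):
    """Effective number of categories: exp of the Shannon entropy of the
    normalized abundance distribution (Hill number of order 1)."""
    total = sum(abundances)
    ps = [a / total for a in abundances if a > 0]
    return math.exp(-sum(p * math.log(p) for p in ps))

# Toy abundances of business categories in two hypothetical cities.
city_a = [100, 100, 100, 100]  # evenly spread across 4 categories
city_b = [370, 10, 10, 10]     # dominated by a single category
```

Both cities list four categories, but the effective diversity of `city_a` is exactly 4 while that of `city_b` is well below 2, illustrating how such a measure separates genuine economic differentiation from the granularity of the classification scheme.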

Journal ArticleDOI
TL;DR: The fundamental limits on learning latent community structure are studied in dynamic networks where nodes change their community membership over time but edges are generated independently at each time step, identifying a detectability threshold below which no algorithm can recover the communities better than chance.
Abstract: Dynamic networks are common in complex systems, and coarse-graining their evolving structure is a key step to understanding them. General mathematical tools for identifying the theoretical limits of such methods are presented.

Journal ArticleDOI
TL;DR: An empirical measure of semantic proximity between concepts is provided using cross-linguistic dictionaries to translate words to and from languages carefully selected to be representative of worldwide diversity to reveal cases where a particular language uses a single “polysemous” word to express multiple concepts that another language represents using distinct words.
Abstract: How universal is human conceptual structure? The way concepts are organized in the human brain may reflect distinct features of cultural, historical, and environmental background in addition to properties universal to human cognition. Semantics, or meaning expressed through language, provides indirect access to the underlying conceptual structure, but meaning is notoriously difficult to measure, let alone parameterize. Here, we provide an empirical measure of semantic proximity between concepts using cross-linguistic dictionaries to translate words to and from languages carefully selected to be representative of worldwide diversity. These translations reveal cases where a particular language uses a single "polysemous" word to express multiple concepts that another language represents using distinct words. We use the frequency of such polysemies linking two concepts as a measure of their semantic proximity and represent the pattern of these linkages by a weighted network. This network is highly structured: Certain concepts are far more prone to polysemy than others, and naturally interpretable clusters of closely related concepts emerge. Statistical analysis of the polysemies observed in a subset of the basic vocabulary shows that these structural properties are consistent across different language groups, and largely independent of geography, environment, and the presence or absence of a literary tradition. The methods developed here can be applied to any semantic domain to reveal the extent to which its conceptual structure is, similarly, a universal attribute of human cognition and language use.