
Showing papers in "Complexity in 2013"


Journal ArticleDOI
TL;DR: Major distinctions are illustrated by current specific examples, including the evolution of cornets and the historical dynamics of information technologies, which provide examples of planned design that have no equivalent in natural evolution.

Abstract: Technological evolution has been compared to biological evolution by many authors over the last two centuries. As a parallel experiment of innovation involving economic, historical, and social components, artifacts define a universe of evolving properties that displays episodes of diversification and extinction. Here, we critically review previous work comparing the two types of evolution. Like biological evolution, technological evolution is driven by descent with variation and selection, and includes tinkering, convergence, and contingency. At the same time, there are essential differences that make the two types of evolution quite distinct. Major distinctions are illustrated by current specific examples, including the evolution of cornets and the historical dynamics of information technologies. Due to their fast and rich development, the latter provide a unique opportunity to study technological evolution at all scales with unprecedented resolution. Despite the presence of patterns suggesting convergent trends between man-made systems and biological ones, they provide examples of planned design that have no equivalent in natural evolution.

116 citations



Journal ArticleDOI
TL;DR: This computational study synchronizes the Circular Restricted Three Body Problem with the Lorenz Hyper Chaotic System (LHCS) using a Robust Adaptive Sliding Mode Controller (RASMC) in the presence of uncertainties, external disturbances, and fully unknown parameters.

Abstract: In this computational study, we synchronize the Circular Restricted Three Body Problem (CRTBP) with the Lorenz Hyper Chaotic System (LHCS) using a Robust Adaptive Sliding Mode Controller (RASMC) in the presence of uncertainties, external disturbances, and fully unknown parameters. A simple, suitable sliding surface that includes the synchronization errors is constructed, and appropriate update laws are used to tackle the uncertainties, external disturbances, and unknown parameters. All simulations to achieve synchronization of the two non-identical systems under consideration with the implemented technique are performed using Mathematica. © 2013 Wiley Periodicals, Inc. Complexity 18: 58-64, 2013

69 citations


Journal ArticleDOI
TL;DR: A critical threshold is derived for the Deffuant model on Z, above which the opinions converge toward the average value of the initial opinion distribution with probability one, provided the initial distribution has a finite second order moment.
Abstract: In the Deffuant model for social influence, pairs of adjacent agents interact at a constant rate and mix up their opinions (represented by continuous variables) only if the distance between opinions is short according to a threshold. We derive a critical threshold for the Deffuant model on Z, above which the opinions converge toward the average value of the initial opinion distribution with probability one, provided the initial distribution has a finite second-order moment. We demonstrate our theoretical results by performing extensive numerical simulations on some continuous probability distributions, including uniform, Beta, power-law, and normal distributions. We observe a clear differentiation in convergence rate: unimodal opinion distributions (whether biased or not) achieve consensus much faster than even or polarized ones. Hereby, the emergence of a single mainstream view is a prominent feature giving rise to fast consensus in public opinion formation and socially contagious behavior. Finally, we discuss the Deffuant model on an infinite Cayley tree, through which general network architectures might be factored in. © 2013 Wiley Periodicals, Inc. Complexity 19: 38–49, 2013
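As an illustration, the bounded-confidence interaction rule described above is easy to sketch. The following minimal simulation uses a ring as a finite stand-in for Z; the parameter values (number of agents, threshold, mixing factor) are illustrative choices, not the paper's.

```python
import random

def deffuant_ring(n=50, threshold=0.9, mu=0.5, steps=300_000, seed=1):
    """Bounded-confidence opinion dynamics on a ring (a finite stand-in for Z).

    A random adjacent pair is picked at each step; the two agents move
    toward each other only when their opinions differ by less than
    `threshold`. All parameter values here are illustrative.
    """
    rng = random.Random(seed)
    x = [rng.random() for _ in range(n)]   # initial opinions ~ Uniform(0, 1)
    initial_mean = sum(x) / n              # conserved by the symmetric update
    for _ in range(steps):
        i = rng.randrange(n)
        j = (i + 1) % n                    # an adjacent pair on the ring
        diff = x[j] - x[i]
        if abs(diff) < threshold:          # interact only within the threshold
            x[i] += mu * diff
            x[j] -= mu * diff
    return x, initial_mean
```

Because the update is symmetric, the opinion mean is conserved, which is why a supercritical threshold drives every opinion toward the average of the initial distribution.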

65 citations


Journal ArticleDOI
TL;DR: This work tracks user-generated messages that contain links to New York Times online articles and labels users according to the topic of the links they share, their geographic location, and their self-descriptive keywords, to map the social, political, and geographical properties of news-sharing communities on Twitter.
Abstract: The importance of collective social action in current events is manifest in the Arab Spring and Occupy movements. Electronic social media have become a pervasive channel for social interactions, and a basis of collective social response to information. The study of social media can reveal how individual actions combine to become the collective dynamics of society. Characterizing the groups that form spontaneously may reveal both how individuals self-identify and how they will act together. Here we map the social, political, and geographical properties of news-sharing communities on Twitter, a popular microblogging platform. We track user-generated messages that contain links to New York Times online articles and we label users according to the topic of the links they share, their geographic location, and their self-descriptive keywords. When users are clustered based on who follows whom in Twitter, we find social groups separate by whether they are interested in local (NY), national (US) or global (cosmopolitan) issues. The national group subdivides into liberal, conservative and other, the latter being a diverse but mostly business oriented group with sports, arts, and other splinters. The national political groups are based across the US but are distinct from the national group that is broadly interested in a variety of topics. A person who is cosmopolitan associates with others who are cosmopolitan, and a US liberal/conservative associates with others who are US liberal/conservative, creating separated social groups with those identities. The existence of “citizens” of local, national, and cosmopolitan communities is a basis for dialog and action at each of these levels of societal organization. © 2013 Wiley Periodicals, Inc. Complexity 19: 10–20, 2013

55 citations


Journal ArticleDOI
TL;DR: Systems biology and complexity theory reveal that, as in the quantum realm, experimental observations themselves limit the authors' capacity to understand a biological system completely because of scale-dependent "horizons of knowledge," a form of biological complementarity as predicted by Bohr and Delbrück.

Abstract: Niels Bohr and Max Delbrück believed that complementarity, such as wave–particle duality, was not limited to the quantum realm, but had correlates in the study of living things. Biological complementarity would indicate that no single technique or perspective allows comprehensive viewing of all of a biological entity's complete qualities and behaviors; instead, complementary perspectives, necessarily and irrevocably excluding all others at the moment an experimental approach is selected, would be necessary to understand the whole. Systems biology and complexity theory reveal that, as in the quantum realm, experimental observations themselves limit our capacity to understand a biological system completely because of scale-dependent "horizons of knowledge," a form of biological complementarity as predicted by Bohr and Delbrück. Specifically, observational selection is inherently, irreducibly coupled to observed biological systems as in the quantum realm. These nested systems, beginning with biomolecules in aqueous solution all the way up to the global ecosystem itself, are understood as a seamless whole operating simultaneously and complementarily at various levels. That this selection of an observational stance is inseparable from descriptions of biology indicates, in accordance with the views of thinkers such as von Neumann, Wigner, and Stapp, that even at levels of scale governed by classical physics, at biological scales, observational choice remains inextricably woven into the establishment, in the observational moment, of the present conditions of existence. These conceptual shifts will not only have theoretical impact, but may point the way to new, successful therapeutic interventions, medically (at the scale of organisms) or environmentally/economically (at a global scale). © 2013 Wiley Periodicals, Inc.

49 citations


Journal ArticleDOI
TL;DR: On the basis of biological examples and examples from the history of technology, the authors demonstrate the centrality of exaptation for a modern understanding of niche, selection, and environment.
Abstract: Biological adaptation assumes the evolution of structures toward better functions. Yet, the roots of adaptive trajectories usually entail subverted—perverted—structures, derived from a different function: what Gould and Vrba called “exaptation.” Generally, this derivation is regarded as contingent or serendipitous, but it also may have regularities, if not rules, in both biological evolution and technological innovation. On the basis of biological examples and examples from the history of technology, the authors demonstrate the centrality of exaptation for a modern understanding of niche, selection, and environment. In some cases, biological understanding illuminates technical exaptation. Thus, the driver of exaptation is not simply chance matching of function and form; it depends on particular, permissive contexts.

48 citations


Journal ArticleDOI
TL;DR: To establish patterns of materialization of beliefs, the authors consider that beliefs have defined mathematical structures, which allows a better understanding of the cultural processes of text, architecture, norms, and education that are forms of the materialization of an ideology.

Abstract: The concepts of substantive beliefs and derived beliefs are defined, along with a set of substantive beliefs S as an open set and the neighborhood of a substantive-belief element. A semantic operation of conjunction is defined with the structure of an Abelian group. Mathematical structures such as belief posets and belief join-semilattices exist. A metric space of beliefs and a distance of belief depending on the believer are defined, together with the concepts of closed and open balls. S′ is defined as a subgroup of the metric space of beliefs Σ, and S′ is a totally bounded set. The term s (substantive belief) is defined in terms of the closure of S′. It is deduced that Σ is paracompact due to Stone's theorem. The pseudometric space of beliefs is defined to show how the metric of the nonbelieving subject yields a topological space, a nonmaterial abstract ideal space formed in the mind of the believing subject, fulfilling the Kuratowski closure axioms. To establish patterns of materialization of beliefs, we consider that beliefs have defined mathematical structures. This allows a better understanding of the cultural processes of text, architecture, norms, and education that are forms of the materialization of an ideology. This materialization is the conversion, by means of certain mathematical correspondences, of an abstract set whose elements are beliefs or ideas into an impure set whose elements are material or energetic. Text is a materialization of ideology. © 2013 Wiley Periodicals, Inc. Complexity 19: 46–62, 2013

47 citations


Journal ArticleDOI
TL;DR: A new multivariate radial basis function neural network model is proposed to predict complex chaotic time series, and the evaluation performance and prediction accuracy are found to reach an excellent level.

Abstract: In this article, a new multivariate radial basis function neural network model is proposed to predict complex chaotic time series. To realize the reconstruction of phase space, we apply the mutual information method and the false nearest-neighbor method to obtain the crucial parameters, time delay and embedding dimension, respectively, and then extend them to the multivariate situation. We also propose two objective evaluations, mean absolute error and prediction mean square error, to evaluate the prediction accuracy. To illustrate the prediction model, we use two coupled Rössler systems as examples to perform single-step and multistep prediction simultaneously, and find that the evaluation performance and prediction accuracy reach an excellent level. © 2013 Wiley Periodicals, Inc. Complexity, 2013.
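The delay-coordinate (phase-space) reconstruction step mentioned above can be sketched in a few lines: a scalar series is unfolded into vectors once a time delay and an embedding dimension are chosen. In practice those parameters would come from the mutual-information and false-nearest-neighbor analyses, which are not reproduced here; the values below are illustrative.

```python
def delay_embed(series, dim, tau):
    """Unfold a scalar time series into delay vectors
    [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]."""
    n = len(series) - (dim - 1) * tau  # number of complete vectors available
    return [[series[t + k * tau] for k in range(dim)] for t in range(n)]

# e.g. an 8-sample series, embedding dimension 3, delay 2:
vectors = delay_embed(list(range(8)), dim=3, tau=2)
# -> [[0, 2, 4], [1, 3, 5], [2, 4, 6], [3, 5, 7]]
```

Each row is one reconstructed phase-space point; a predictor (here, the proposed RBF network) is then trained to map each point to a future value of the series.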

39 citations


Journal ArticleDOI
TL;DR: It has been argued that Menzerath's law is inevitable and that noncoding DNA dominates genomes; however, the wide range of manifestations of the law in and outside genomes suggests that the striking similarities between noncoding DNA and certain linguistic units could be anecdotal for understanding the recurrence of that statistical law.

Abstract: The importance of statistical patterns of language has been debated over decades. Although Zipf's law is perhaps the most popular case, recently, Menzerath's law has begun to be involved. Menzerath's law manifests in language, music, and genomes as a tendency of the mean size of the parts to decrease as the number of parts increases in many situations. This statistical regularity emerges also in the context of genomes, for instance, as a tendency of species with more chromosomes to have a smaller mean chromosome size. It has been argued that the instantiation of this law in genomes is not indicative of any parallel between language and genomes because (a) the law is inevitable and (b) noncoding DNA dominates genomes. Here, the mathematical, statistical, and conceptual challenges of these criticisms are discussed. Two major conclusions are drawn: the law is not inevitable, and languages also have a correlate of noncoding DNA. However, the wide range of manifestations of the law in and outside genomes suggests that the striking similarities between noncoding DNA and certain linguistic units could be anecdotal for understanding the recurrence of that statistical law. © 2012 Wiley Periodicals, Inc. Complexity, 2012
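A Menzerath-style relation is commonly modeled as y = a·n^b with b < 0 (mean part size y shrinking as the number of parts n grows). A least-squares fit in log-log space, shown below on synthetic data of our own, is a minimal way to estimate the exponent; it is not the statistical machinery used in the paper.

```python
import math

def fit_power_law(n_parts, mean_sizes):
    """Ordinary least squares on log y = log a + b log n; returns (a, b).

    For a Menzerath-style relation, b < 0: the mean part size shrinks
    as the number of parts grows.
    """
    xs = [math.log(n) for n in n_parts]
    ys = [math.log(y) for y in mean_sizes]
    m = len(xs)
    xbar = sum(xs) / m
    ybar = sum(ys) / m
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic check: data generated exactly as y = 5 * n**-0.7
a, b = fit_power_law([2, 4, 8, 16], [5.0 * n ** -0.7 for n in [2, 4, 8, 16]])
# recovers a ~ 5 and b ~ -0.7, since the points lie exactly on the log-line
```

On real data (e.g. chromosome number vs. mean chromosome size per species), a significantly negative fitted b is the signature of the law.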

34 citations


Journal ArticleDOI
TL;DR: It is concluded that ideologies, myths and beliefs can all be analyzed in terms of systems within a cultural context, which means that such systems can figure in logico-mathematical analyses.

Abstract: Mythical and religious belief systems in a social context can be regarded as a conglomeration of sacrosanct rites which revolve around substantive values that involve an element of faith. Moreover, we can conclude that ideologies, myths, and beliefs can all be analyzed in terms of systems within a cultural context. The significance of being able to define ideologies, myths, and beliefs as systems is that they can figure in cultural explanations. This, in turn, means that such systems can figure in logico-mathematical analyses.

Journal ArticleDOI
TL;DR: This work captures the dynamics of CNS using Price's equation, and captures the adaptive purpose of the universe using an optimization program, confirming that CNS acts according to a formal design objective.
Abstract: The cosmological natural selection (CNS) hypothesis holds that the fundamental constants of nature have been fine-tuned by an evolutionary process in which universes produce daughter universes via the formation of black holes. Here, we formulate the CNS hypothesis using standard mathematical tools of evolutionary biology. Specifically, we capture the dynamics of CNS using Price's equation, and we capture the adaptive purpose of the universe using an optimization program. We establish mathematical correspondences between the dynamics and optimization formalisms, confirming that CNS acts according to a formal design objective, with successive generations of universes appearing designed to produce black holes. © 2013 Wiley Periodicals, Inc. Complexity 18: 48–56, 2013
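Price's equation, which the authors use to capture the dynamics, decomposes the change in a population-average trait into a selection (covariance) term and a transmission term. A minimal numerical check of the identity, with variable names of our own choosing, is:

```python
def price_decomposition(w, z, z_prime):
    """Price's equation: w_bar * (z_bar' - z_bar) = Cov(w, z) + E[w * (z' - z)].

    w: fitness of each parent (e.g. number of daughter universes),
    z: parental trait value, z_prime: mean trait among that parent's
    offspring. Returns the selection and transmission terms, each
    already divided by mean fitness.
    """
    m = len(w)
    w_bar = sum(w) / m
    z_bar = sum(z) / m
    cov_wz = sum(wi * zi for wi, zi in zip(w, z)) / m - w_bar * z_bar
    e_w_dz = sum(wi * (zpi - zi) for wi, zi, zpi in zip(w, z, z_prime)) / m
    return cov_wz / w_bar, e_w_dz / w_bar

# Check the identity on arbitrary toy numbers:
w, z, zp = [2.0, 1.0, 3.0], [0.5, 0.2, 0.9], [0.6, 0.1, 1.0]
selection, transmission = price_decomposition(w, z, zp)
new_mean = sum(wi * zpi for wi, zpi in zip(w, zp)) / sum(w)  # offspring-weighted
delta = new_mean - sum(z) / len(z)
# selection + transmission equals delta, by Price's identity
```

In the CNS reading, w counts black holes (daughter universes) and z is a fundamental constant; a positive covariance term is what makes successive generations of universes appear designed to produce black holes.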

Journal ArticleDOI
TL;DR: A hybrid binary classification model, namely FMLP, is proposed for credit scoring, based on the basic concepts of fuzzy logic and artificial neural networks (ANNs), and it can be concluded that the proposed model can be an appropriate alternative tool for financial binary classification problems, especially in high uncertainty conditions.
Abstract: Credit scoring is a risk-evaluation task considered a critical decision for financial institutions, since wrong decisions may result in huge losses. Classification models are one of the most widely used groups of data mining approaches that greatly help decision makers and managers to reduce the credit risk of granting credits to customers, instead of relying on intuitive experience or portfolio management. Accuracy is one of the most important criteria in choosing a credit-scoring model; hence, research aimed at improving the effectiveness of credit-scoring models has never stopped. In this article, a hybrid binary classification model, namely FMLP, is proposed for credit scoring, based on the basic concepts of fuzzy logic and artificial neural networks (ANNs). In the proposed model, instead of the crisp weights and biases used in traditional multilayer perceptrons (MLPs), fuzzy numbers are used in order to better model the uncertainties and complexities in financial data sets. Empirical results on three well-known benchmark credit data sets indicate that the proposed hybrid model outperforms its components, as well as other classification models such as support vector machines (SVMs), K-nearest neighbor (KNN), quadratic discriminant analysis (QDA), and linear discriminant analysis (LDA). Therefore, it can be concluded that the proposed model can be an appropriate alternative tool for financial binary-classification problems, especially under conditions of high uncertainty. © 2013 Wiley Periodicals, Inc. Complexity 18: 46–57, 2013

Journal ArticleDOI
TL;DR: It is shown that the nonextensivity and self-organization of systems are a result of mismatch between their elements.

Abstract: The article concerns a new nonextensive model of self-organizing systems and consists of two interrelated parts. The first presents a new nonextensive model of interaction between the elements of systems. The second concerns the relationship between microscopic and macroscopic processes in complex systems. It is shown that the nonextensivity and self-organization of systems are a result of mismatch between their elements. © 2013 Wiley Periodicals, Inc. Complexity 18: 28–36, 2013

Journal ArticleDOI
TL;DR: It is claimed that a combination of groundwater modeling, optimization, and a game theoretical analysis may in fact avoid the tragedy of the commons and shown that the success of the optimal management program depends heavily on the information the users have about the resource.
Abstract: Groundwater is the most extracted natural resource in the world. It supplies 50% of total potable-water requirements, 40% of the industrial take, and 20% of the agricultural take; groundwater is a strategic resource for every country. That common-pool resources are highly susceptible to a tragedy of the commons is a well-known fact. We claim that a combination of groundwater modeling, optimization, and game-theoretical analysis may in fact avoid the tragedy. A MODFLOW groundwater model of the Zamora aquifer in Mexico was used as input to a basic but instructive optimization problem: extract the greatest possible volume of water while minimizing the drawdown and the drawdown velocity. The solutions of the optimization problem were used to construct the payoffs of a hypothetical game among the aquifer users, the resource's administrator, and a resource-protector entity. We show that the success of the optimal management program depends heavily on the information that the users have about the resource. Therefore, better decision-making processes are a consequence of sustainability literacy. In particular, water literacy could lead to the usage of water considering it as a part of an ecosystem and not only as a natural resource. Additionally, a new nonclassical equation for underground flow was derived, which may be especially important for understanding and predicting groundwater flow under highly heterogeneous conditions, as in karstic aquifers or fractured media. © 2013 Wiley Periodicals, Inc. Complexity 19: 9–21, 2013

Journal ArticleDOI
TL;DR: It is shown that during the deceleration process more than 90% of kinetic energy of charged nuclear reaction products is converted to electric energy and stored as electric energy in a stack of charged capacitors with a gap size of 500 nm and graphene electrodes.
Abstract: The efficiency of conventional techniques used to harvest energy in nuclear reactors lies around 35%. This limit exists because the nuclear energy is converted to electrical energy via heat engines. We study an alternative approach where the kinetic energy of nuclear reaction products is directly converted into electric energy in a stack of charged capacitors with a gap size of 500 nm and graphene electrodes. Graphene is expected to be chemically and mechanically stable in high-radiation environments, because its tensile strength of 130 GPa is very large, about 100 times larger than that of most metals. The dielectric strength of such nanocapacitors exceeds 1 GV/m, because avalanching is suppressed at small gap sizes. In a 1 GV/m electric field, charged nuclear reaction products, such as 5.6 MeV alpha particles, come to rest within a stack of 5000 nanocapacitors. We show that during the deceleration process more than 90% of the kinetic energy of the charged nuclear reaction products is converted to electric energy and stored in the stack. Each stack is 2.5-mm thick and produces a high-voltage DC current. A device with a 1-Ci 241Am source is expected to generate 22 mW of electric power.
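The stack thickness quoted above follows from simple electrostatics: a particle of charge q stopping against a retarding field E travels d = E_kin / (qE). A back-of-envelope check (our own arithmetic, assuming the full 1 GV/m field acts over the whole path):

```python
energy_eV = 5.6e6      # alpha-particle kinetic energy, in eV
charge_e = 2           # an alpha particle carries charge +2e
field = 1e9            # retarding field, V/m (the quoted dielectric strength)

stop_distance = energy_eV / (charge_e * field)   # metres; eV/e cancels to volts
gaps_needed = stop_distance / 500e-9             # number of 500-nm gaps
# stop_distance is about 2.8 mm and gaps_needed about 5600, consistent with
# the ~5000-capacitor, 2.5-mm stack if the average field is somewhat below
# the 1 GV/m maximum.
```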

Journal ArticleDOI
TL;DR: With these schemes many complex economic systems subject to increasing returns can be formalized mathematically, for they allow for positive and negative feedbacks among many variables, “jumps,” “badly” behaved dynamics, discontinuities, and interrelations among systems.

Abstract: This article extends the traditional Polya scheme, consisting of one urn with two colors, to schemes with multiple independent and/or interdependent urns, multiple additions and/or withdrawals, and several independent and/or interdependent colors. It also argues that with these schemes many complex economic systems subject to increasing returns can be formalized mathematically, for they allow for positive and negative feedbacks among many variables, “jumps,” “badly” behaved dynamics, discontinuities, and interrelations among systems. © 2013 Wiley Periodicals, Inc. Complexity 19: 21–37, 2013
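For readers unfamiliar with the baseline being extended, the traditional one-urn, two-color Polya scheme can be simulated in a few lines; the reinforcement step is the positive-feedback ("increasing returns") mechanism. Parameter values are illustrative.

```python
import random

def polya_urn(add=1, steps=10_000, seed=0):
    """Classic one-urn, two-color Polya scheme: draw a ball at random,
    return it together with `add` extra balls of the same color.
    Returns the final (red, blue) counts."""
    rng = random.Random(seed)
    red, blue = 1, 1
    for _ in range(steps):
        if rng.random() < red / (red + blue):
            red += add        # positive feedback: success breeds success
        else:
            blue += add
    return red, blue
```

With add = 1 the limiting red fraction is itself random (uniformly distributed over [0, 1]), so different runs lock in to different outcomes: the path dependence that the extended urn schemes formalize for economic systems.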

Journal ArticleDOI
TL;DR: A network-based analysis of the economy as a network of economic sectors connected by trade flows is applied to the United States in the years before and after the 2008 financial crisis to provide a fresh look at the U.S. government's economic revival policies.
Abstract: To compare the relative power of individual sectors to pull the entire economy, i.e., the power-of-pull, this article utilizes a complex system perspective to model the economy as a network of economic sectors connected by trade flows. A sector's power-of-pull is defined and calculated as a function of the powers-of-pull of those sectors that it pulls through network linkages, and their powers-of-pull are, in turn, functions of those sectors that they further pull ad infinitum throughout the network. Theoretically, boosting activities in sectors with a higher power-of-pull will generate greater network effects while stimulating the entire economy, especially during recessions. This method is applied to the United States in the years before and after the 2008 financial crisis. The results provide a fresh look at the U.S. government's economic revival policies and reveal fundamental changes in the economic structure of the U.S. This work advocates a network-based analysis of the economy as a complex system. © 2013 Wiley Periodicals, Inc. Complexity 18: 37–47, 2013
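The recursive definition above, each sector's power-of-pull being a function of the powers-of-pull of the sectors it pulls, is structurally like eigenvector centrality and can be approximated by power iteration. The sketch below is our reading of that recursion, not the paper's exact formula:

```python
def power_of_pull(flows, iters=200):
    """Eigenvector-style recursion on a trade-flow matrix (an illustrative
    sketch of the idea, not the paper's exact definition).

    flows[i][j]: strength with which sector i pulls sector j, so a
    sector's score is the flow-weighted sum of the scores of the
    sectors it pulls. Scores are normalized to sum to 1.
    """
    n = len(flows)
    p = [1.0 / n] * n
    for _ in range(iters):
        q = [sum(flows[i][j] * p[j] for j in range(n)) for i in range(n)]
        s = sum(q)
        p = [v / s for v in q]
    return p

# toy 3-sector economy: sector 0 pulls heavily on sectors 1 and 2
flows = [[0.0, 2.0, 2.0],
         [1.0, 0.0, 1.0],
         [1.0, 1.0, 0.0]]
scores = power_of_pull(flows)   # sector 0 ends up with the top score
```

In this reading, stimulating the sector with the highest score propagates the largest network effect through the linkages, which is the policy logic the abstract describes.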

Journal ArticleDOI
TL;DR: In this article, a new formulation of the complexity profile is presented, which expands its possible application to high-dimensional real-world and mathematically defined systems, and is constructed from the pairwise dependencies between components of the system.
Abstract: Quantifying the complexity of systems consisting of many interacting parts has been an important challenge in the field of complex systems in both abstract and applied contexts. One approach, the complexity profile, is a measure of the information to describe a system as a function of the scale at which it is observed. We present a new formulation of the complexity profile, which expands its possible application to high-dimensional real-world and mathematically defined systems. The new method is constructed from the pairwise dependencies between components of the system. The pairwise approach may serve as both a formulation in its own right and a computationally feasible approximation to the original complexity profile. We compare it to the original complexity profile by giving cases where they are equivalent, proving properties common to both methods, and demonstrating where they differ. Both formulations satisfy linear superposition for unrelated systems and conservation of total degrees of freedom (sum rule). The new pairwise formulation is also a monotonically nonincreasing function of scale. Furthermore, we show that the new formulation defines a class of related complexity profile functions for a given system, demonstrating the generality of the formalism. © 2013 Wiley Periodicals, Inc. Complexity 18:20–27, 2013

Journal ArticleDOI
TL;DR: The Multiple Scales Method is used to analyze the chaotic behavior and the different types of fixed points in the ferroresonance of voltage transformers considering core loss, and shows how chaos is created and intensified in the system.

Abstract: In this article, the Multiple Scales Method is used to analyze the chaotic behavior and the different types of fixed points in the ferroresonance of voltage transformers, taking core loss into account. This phenomenon has nonlinear chaotic dynamics and includes subharmonic, quasi-periodic, and chaotic oscillations. The chaotic behavior and the various ferroresonant oscillation modes of the voltage transformer are studied. The phenomenon exhibits different types of bifurcations, such as Period Doubling Bifurcation (PDB), Saddle Node Bifurcation (SNB), Hopf Bifurcation (HB), and chaos. The dynamic analysis of the ferroresonant circuit is based on bifurcation theory. The bifurcation and phase-plane diagrams are illustrated using a continuation method with linear and nonlinear models of core loss. To analyze the ferroresonance phenomenon, the Lyapunov exponents are calculated via the Multiple Scales Method, and Feigenbaum numbers are obtained. The bifurcation diagrams show how, as the control parameter varies, chaos is created and intensified in the system. © 2013 Wiley Periodicals, Inc. Complexity 18: 34-45, 2013

Journal ArticleDOI
TL;DR: It is shown that even a simple protein exhibits the hallmarks of complex systems, and the molecular bases of this complex behavior are possessed completely by the protein itself, because such complexity emerges without considering the solvent explicitly.
Abstract: Biological functions are intimately rooted in biopolymer dynamics. It is commonly accepted that proteins can be considered as complex systems, but the origin of such complexity is still not fully understood. Moreover, it is still not really clear if proteins are true complex systems or complicated ones. Here, molecular dynamics simulations on a two helix bundle protein have been performed, and protein trajectories have been analyzed by using correlation functions in the frequency domain. We show that even a simple protein exhibits the hallmarks of complex systems. Moreover, the molecular bases of this complex behavior are possessed completely by the protein itself, because such complexity emerges without considering the solvent explicitly. © 2012 Wiley Periodicals, Inc. Complexity, 2012

Journal ArticleDOI
TL;DR: A partial theory of consciousness as relations defined by typical data is proposed, based on the idea that a brain state on its own is almost meaningless but in the context of the typical brain states, defined by the brain's structure, a particular brain state is highly structured by relations.
Abstract: The theoretical base for consciousness, in particular, an explanation of how consciousness is defined by the brain, has long been sought by science. We propose a partial theory of consciousness as relations defined by typical data. The theory is based on the idea that a brain state on its own is almost meaningless but in the context of the typical brain states, defined by the brain's structure, a particular brain state is highly structured by relations. The proposed theory can be applied and tested both theoretically and experimentally. Precisely how typical data determines relations is fully established using discrete mathematics. © 2012 Wiley Periodicals, Inc. Complexity, 2012

Journal ArticleDOI
TL;DR: This article presents a state-observer-based iterative learning control to solve the trajectory-tracking problem for a class of time-varying Multi-Input-Multi-Output nonlinear systems with arbitrary relative degree.

Abstract: This article presents a state-observer-based iterative learning control to solve the trajectory-tracking problem for a class of time-varying Multi-Input-Multi-Output nonlinear systems with arbitrary relative degree. For this purpose, an asymptotically stable observer is derived for the system under consideration. Thereafter, this observer is integrated with the iterative learning controller by replacing the state in the control law with the estimate yielded by the state observer. Hence, the stability of the whole control loop (nonlinear system plus controller plus observer) is guaranteed. Simulation results on a nonlinear system show that the trajectory-tracking error decreases over the iterations. © 2013 Wiley Periodicals, Inc. Complexity 19: 37–45, 2013

Journal ArticleDOI
TL;DR: The variation of wavelet entropy during low beta NFT was investigated, revealing a highly significant negative correlation between the change in low beta activity and the change in wavelet entropy.

Abstract: Neurofeedback training (NFT) has an important role in the improvement of cognitive functions in both clinical populations and healthy individuals. In this study, the variation of wavelet entropy during low beta NFT was investigated. To investigate the effect of low beta NFT on wavelet entropy, the correlation between the change in low beta activity and the change in wavelet entropy was computed. The results revealed a highly significant negative correlation between the change in low beta activity and wavelet entropy. This outcome suggests that enhancing low beta activity through NFT is associated with decrements in wavelet entropy. Furthermore, we discuss a new implementation of NFT, based on wavelet entropy, for future research. © 2012 Wiley Periodicals, Inc. Complexity, 2012

Journal ArticleDOI
TL;DR: A genetic algorithm was used to evolve cellular automata to perform certain computational tasks, in an effort to gain more insight into the question: “How does evolution produce sophisticated emergent computation in systems composed of simple components limited to local interactions?”
Abstract: This article presents a brief history of the Evolving Cellular Automata (EvCA) project. In the EvCA project, a genetic algorithm was used to evolve cellular automata to perform certain (nontrivial) computational tasks, in an effort to gain more insight into the question: “How does evolution produce sophisticated emergent computation in systems composed of simple components limited to local interactions?” Next to providing many interesting results and useful insights, the EvCA project seems to have spawned a whole research area of its own. Here, a brief overview is given of how it all started, developed, and inspired further work. © 2013 Wiley Periodicals, Inc. Complexity 18: 15–19, 2013

Journal ArticleDOI
TL;DR: This analysis indicates that strong clustering can be a warning sign, and that collusion amongst construction firms in a number of regions in Japan in the 2000s can be identified with the formation of clusters of anomalously highly connected companies.
Abstract: The world economy consists of highly interconnected and interdependent commercial and financial networks. Here, we develop temporal and structural network tools to analyze the state of the economy and the financial markets. Our analysis indicates that strong clustering can be a warning sign. Reduction in diversity, which was an essential aspect of the dynamics surrounding the financial markets crisis of 2008, is seen as a key emergent feature arising naturally from the evolutionary and adaptive dynamics inherent to the financial markets. Similarly, collusion amongst construction firms in a number of regions in Japan in the 2000s can be identified with the formation of clusters of anomalously highly connected companies. © 2013 Wiley Periodicals, Inc. Complexity 19: 22–36, 2013
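The clustering signal behind this warning sign can be illustrated with the average local clustering coefficient: the fraction of a node's neighbours that are themselves connected, averaged over all nodes. A minimal sketch (the paper's temporal network tools are far richer than this):

```python
def clustering_coefficient(adj):
    """Average local clustering coefficient of an undirected graph
    given as {node: set_of_neighbours}."""
    coeffs = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)  # fewer than 2 neighbours: no possible triangle
            continue
        # Count links among the neighbours (each unordered pair once).
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
        coeffs.append(2 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

# A tight cluster (triangle plus a pendant node) versus a star of the
# same size: the cluster scores high, the star scores zero.
cluster = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
star    = {1: {2, 3, 4}, 2: {1}, 3: {1}, 4: {1}}
print(clustering_coefficient(cluster))  # → 0.5833...
print(clustering_coefficient(star))     # → 0.0
```

In the paper's setting, an anomalous rise of this kind of statistic among a group of firms (construction companies bidding together, say) is what flags the cluster for closer inspection.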

Journal ArticleDOI
TL;DR: It is shown that firm sizes at the national and industry level are highly skewed but not Zipf-distributed, and that, while self-organizing industrial structures of this kind are due to increasing returns and are hard to describe with conventional theories, system dynamics and urn theory are equipped with adequate tools to analyze them.
Abstract: Using data from Fortune Magazine's list of the 500 largest American corporations from 1955 to 2010, this article shows that firm sizes at the national and industry level are highly skewed but not Zipf-distributed. It also argues that, while self-organizing industrial structures of this kind are due to increasing returns and are hard to describe with conventional theories, system dynamics and urn theory are equipped with adequate tools to analyze them.
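The Zipf question can be framed as a rank-size regression: under Zipf's law, plotting log(size) against log(rank) gives a straight line of slope about −1, so a fitted slope well away from −1 (or visible curvature) argues against the law even when the distribution is heavily skewed. A simplified illustration on synthetic data (the function and samples below are mine, not the article's Fortune 500 data):

```python
import math
import random

def rank_size_slope(sizes):
    """OLS slope of log(size) on log(rank); Zipf's law predicts about -1."""
    sizes = sorted(sizes, reverse=True)
    xs = [math.log(r) for r in range(1, len(sizes) + 1)]
    ys = [math.log(s) for s in sizes]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(42)
zipf_like = [1000.0 / r for r in range(1, 501)]                # exact Zipf: size ∝ 1/rank
lognormal = [random.lognormvariate(0, 1) for _ in range(500)]  # skewed, but not Zipf

print(rank_size_slope(zipf_like))   # → -1.0 (up to rounding)
print(rank_size_slope(lognormal))
```

The lognormal sample is just as "highly skewed" as the Zipf one, which is the article's point: skewness alone does not establish Zipf's law.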

Journal ArticleDOI
TL;DR: It is found that two absorbing states, full cooperation and full defection, can be reached, assuming that players can delete interaction relations unilaterally, but new relations can only be created with the mutual consent of both partners.
Abstract: We study the emergence of cooperation in an environment where players in the prisoner's dilemma game (PDG) not only update their strategies but also change their interaction relations. Different from previous studies in which players update their strategies according to the imitation rule, in this article, the strategies are updated with limited foresight. We find that two absorbing states, full cooperation and full defection, can be reached, assuming that players can delete interaction relations unilaterally, but new relations can only be created with the mutual consent of both partners. Simulation experiments show that high levels of cooperation in large populations can be achieved when the temptation to defect in the PDG is low. Moreover, we explore the factors that influence the level of cooperation. These results provide new insights into cooperation in social dilemmas and into corresponding control strategies. © 2012 Wiley Periodicals, Inc. Complexity, 2012
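The asymmetric rewiring rule (unilateral deletion, bilateral creation) can be sketched on its own, with strategies held fixed for simplicity; the actual model also updates strategies with limited foresight, and every parameter below is illustrative:

```python
import random

random.seed(3)
n = 30
coop = [random.random() < 0.5 for _ in range(n)]  # fixed strategies: True = cooperator

# Random initial interaction network (edges as frozensets of two players).
links = {frozenset((i, j)) for i in range(n) for j in range(i + 1, n)
         if random.random() < 0.2}

for _ in range(500):
    # Unilateral deletion: a player may cut a link to a partner with the
    # opposite strategy without asking for consent.
    for e in list(links):
        i, j = tuple(e)
        if coop[i] != coop[j] and random.random() < 0.5:
            links.discard(e)
    # Bilateral creation: a new link needs the consent of both partners;
    # here (a simplification) only two cooperators agree to connect.
    i, j = random.sample(range(n), 2)
    if coop[i] and coop[j]:
        links.add(frozenset((i, j)))

mixed = [e for e in links if len({coop[v] for v in e}) == 2]
print("mixed links remaining:", len(mixed))  # → 0 with overwhelming probability
```

Even without strategy updating, the asymmetry segregates the network: cooperator–defector links are steadily deleted and never recreated, so defectors end up isolated, which hints at why full cooperation can become absorbing once strategies also adapt.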

Journal ArticleDOI
TL;DR: The dynamics of the set of oscillators and the base tend to evolve towards a certain region that is close to a transition in the dynamics of the oscillators, where more frequencies start to appear in the frequency spectra of the phases of the metronomes.
Abstract: In this article, we study the dynamics of coupled oscillators. We use mechanical metronomes that are placed over a rigid base. The base is moved by a motor in one dimension, and the movements of the base follow some function of the phases of the metronomes (in other words, it is controlled to move according to a provided function). Because of the motor and the feedback, the phases of the metronomes affect the movements of the base, whereas on the other hand, when the base moves, it affects the phases of the metronomes in return. For a simple base-movement function (such as y = γx[rθ1 + (1 − r)θ2], in which y is the velocity of the base, γx is a multiplier, r is a proportion, and θ1 and θ2 are the phases of the metronomes), we show the effects on the dynamics of the oscillators. Then, we study how this function changes in time when its parameters adapt by a feedback. By numerical simulations and experimental tests, we show that the dynamics of the set of oscillators and the base tend to evolve towards a certain region. This region is close to a transition in the dynamics of the oscillators, where more frequencies start to appear in the frequency spectra of the phases of the metronomes. We interpret this as an adaptation towards the edge of chaos.
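The basic mechanism (phases driving the base, the base driving the phases back) can be illustrated with a much simpler stand-in: reduce the base-mediated coupling to an effective Kuramoto-style term of strength K. This is an assumption for illustration, not the authors' model or their feedback function:

```python
import math

# Two metronome phases with natural frequencies omega1, omega2; the
# base-mediated interaction is replaced by a sinusoidal coupling K.
omega1, omega2, K = 1.0, 1.2, 0.5
theta1, theta2 = 0.0, 2.0
dt = 0.01
for _ in range(10_000):  # forward Euler integration
    d1 = omega1 + K * math.sin(theta2 - theta1)
    d2 = omega2 + K * math.sin(theta1 - theta2)
    theta1 += d1 * dt
    theta2 += d2 * dt

# When K > |omega2 - omega1| / 2 the two oscillators phase-lock at a
# constant offset satisfying sin(offset) = (omega2 - omega1) / (2 * K).
offset = theta2 - theta1
print(offset, math.asin((omega2 - omega1) / (2 * K)))
```

In this reduced model the locked state is the whole story; the article's point is that once the coupling function itself adapts, the system drifts toward the boundary where this simple locking breaks down and richer spectra appear.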

Journal ArticleDOI
TL;DR: Thanks to substantial improvements in the theory of metabolic fluxes and the application of 13C isotope markers in experimental flux studies, the Pareto efficiency of bacterial metabolism can be determined and direct answers can be given to the long-standing questions of optimization according to multiple criteria in nature.
Abstract: Thanks to substantial improvements in the theory of metabolic fluxes and the application of 13C isotope markers in experimental flux studies, the Pareto efficiency of bacterial metabolism can now be determined and direct answers can be given to the long-standing questions of optimization according to multiple criteria in nature. Cells or organisms operate close to Pareto optima, but the performance with respect to every single criterion is almost always improvable. Rational design and evolutionary methods are routinely used for the production of biomolecules with optimized properties. Examples are proteins for technical applications, for example in detergents, and optimally binding nucleic acid molecules called aptamers. Among the various perspectives of synthetic biology, the usage of DNA for information storage is particularly promising: in a pilot experiment, an entire book including figures and a JavaScript program, in total more than 5 megabits, was stored on a single DNA molecule. © 2013 Wiley Periodicals, Inc. Complexity 18: 21–31, 2013
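Pareto efficiency itself is easy to state operationally: a point is on the Pareto front if no other point is at least as good in every objective (and strictly better in at least one). A minimal sketch with hypothetical trade-off points, labeled here as (growth yield, ATP yield) purely for illustration:

```python
def pareto_front(points):
    """Return the points not weakly dominated by any other point,
    maximizing every objective."""
    front = []
    for p in points:
        dominated = any(all(qi >= pi for qi, pi in zip(q, p)) and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (growth_yield, ATP_yield) trade-off measurements.
points = [(1.0, 0.2), (0.8, 0.5), (0.5, 0.8),
          (0.2, 1.0), (0.4, 0.4), (0.7, 0.3)]
print(pareto_front(points))
# → [(1.0, 0.2), (0.8, 0.5), (0.5, 0.8), (0.2, 1.0)]
```

"Operating close to Pareto optima" then means a cell's measured flux state lies near this front: any single objective could still be pushed higher, but only at the cost of another.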