
Showing papers on "Turing machine published in 2005"


Journal ArticleDOI
TL;DR: The main result in this paper establishes the energy savings derived by using probabilistic AND as well as NOT gates constructed from an idealized switch that produces a probabilistic bit (PBIT).
Abstract: The main result in this paper establishes the energy savings derived by using probabilistic AND as well as NOT gates constructed from an idealized switch that produces a probabilistic bit (PBIT). A probabilistic switch produces the desired value as an output that is 0 or 1 with probability p, represented as a PBIT, and, hence, can produce the wrong output value with a probability of (1-p). In contrast with a probabilistic switch, a conventional deterministic switch produces a BIT whose value is always correct. Our switch-based gate constructions are a particular case of a systematic methodology developed for building energy-aware networks for computing, using PBITS. Interesting examples of such networks include AND, OR, and NOT gates (or, as functions, Boolean conjunction, disjunction, and negation, respectively). To quantify the energy savings, novel measures of "technology independent" energy complexity are also introduced - these measures parallel conventional machine-independent notions of computational complexity such as the algorithm's running time and space. Networks of switches can be related to Turing machines and to Boolean circuits, both of which are widely known and well-understood models of computation. Our gate and network constructions lend substance to the following thesis (established for the first time by K.V. Palem): the mathematical technique referred to as randomization yielding probabilistic algorithms results in energy savings through a physical interpretation based on statistical thermodynamics and, hence, can serve as a basis for energy-aware computing. While the estimates of the energy saved through PBIT-based probabilistic computing switches and networks developed rely on the constructs and thermodynamic models due to Boltzmann, Gibbs, and Planck, this work has also led to the innovation of probabilistic CMOS-based devices and computing frameworks. Thus, for completeness, the relationship between the physical models on which this work is based and the electrical domain of CMOS-based switching is discussed.
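As a rough, self-contained illustration of the PBIT idea (a Monte Carlo sketch under assumed parameters, not the switch-level construction or energy analysis from the paper), the following Python snippet models an idealized probabilistic switch that emits the intended bit with probability p and estimates how often an AND gate whose output passes through such a switch agrees with the deterministic gate:

```python
import random

def pbit_switch(intended: int, p: float) -> int:
    """Idealized probabilistic switch: return the intended bit with
    probability p, and the flipped bit with probability 1 - p."""
    return intended if random.random() < p else 1 - intended

def probabilistic_and(a: int, b: int, p: float) -> int:
    """Hypothetical AND gate whose single output is produced by a
    probabilistic switch (illustrative only; not the paper's network)."""
    return pbit_switch(a & b, p)

def estimate_correctness(p: float, trials: int = 100_000) -> float:
    """Monte Carlo estimate of how often the gate output matches the
    deterministic AND over uniformly random input pairs."""
    correct = 0
    for _ in range(trials):
        a, b = random.randint(0, 1), random.randint(0, 1)
        correct += probabilistic_and(a, b, p) == (a & b)
    return correct / trials

if __name__ == "__main__":
    for p in (1.0, 0.95, 0.85, 0.75):
        print(f"p = {p:.2f}: empirical correctness = {estimate_correctness(p):.3f}")
```

In the paper's thermodynamic setting it is precisely this tolerated error probability 1 - p that is traded against switching energy; the snippet only visualizes the error side of that trade-off.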

194 citations


Journal ArticleDOI
Steven Pinker1
TL;DR: This review shows that Jerry Fodor's arguments dismissing the relevance of evolution to psychology are unsound and that the supposed gap between human cognition and computational models may be illusory.
Abstract: In my book How the Mind Works, I defended the theory that the human mind is a naturally selected system of organs of computation. Jerry Fodor claims that 'the mind doesn't work that way' (in a book with that title) because (1) Turing Machines cannot duplicate humans' ability to perform abduction (inference to the best explanation); (2) though a massively modular system could succeed at abduction, such a system is implausible on other grounds; and (3) evolution adds nothing to our understanding of the mind. In this review I show that these arguments are flawed. First, my claim that the mind is a computational system is different from the claim Fodor attacks (that the mind has the architecture of a Turing Machine); therefore the practical limitations of Turing Machines are irrelevant. Second, Fodor identifies abduction with the cumulative accomplishments of the scientific community over millennia. This is very different from the accomplishments of human common sense, so the supposed gap between human cognition and computational models may be illusory. Third, my claim about biological specialization, as seen in organ systems, is distinct from Fodor's own notion of encapsulated modules, so the limitations of the latter are irrelevant. Fourth, Fodor's arguments dismissing the relevance of evolution to psychology are unsound.

166 citations


Journal ArticleDOI
TL;DR: The first polynomial time-space lower bounds for satisfiability on general models of computation are established, showing that for any constant c less than the golden ratio there exists a positive constant d such that no deterministic random-access Turing machine can solve satisfiability in time n^c and space n^d.
Abstract: We establish the first polynomial time-space lower bounds for satisfiability on general models of computation. We show that for any constant c less than the golden ratio there exists a positive constant d such that no deterministic random-access Turing machine can solve satisfiability in time n^c and space n^d, where d approaches 1 when c does. For co-nondeterministic instead of deterministic machines, we prove the same for any constant c less than √2. Our lower bounds apply to nondeterministic linear time and almost all natural NP-complete problems known. In fact, they even apply to the class of languages that can be solved on a nondeterministic machine in linear time and space n^(1/c). Our proofs follow the paradigm of indirect diagonalization. We also use that paradigm to prove time-space lower bounds for languages higher up in the polynomial-time hierarchy.
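Restated symbolically (using the standard DTISP notation for simultaneous time-space classes, which is our shorthand rather than the abstract's), the deterministic result reads:

$$\text{for every } c < \varphi = \frac{1+\sqrt{5}}{2} \approx 1.618 \text{ there exists } d > 0 \text{ such that } \mathrm{SAT} \notin \mathrm{DTISP}(n^{c}, n^{d}),$$

where d can be taken arbitrarily close to 1 as c approaches 1, and the analogous statement for co-nondeterministic machines holds for every c < √2.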

95 citations


01 Jan 2005
TL;DR: The CHR machine, a model of computation based on the operational semantics of CHR, is introduced; its computational power and time complexity are compared to those of the well-understood Turing machine and Random Access Memory machine, which allows the authors to prove the interesting result that every algorithm can be implemented in CHR with the best known time and space complexity.
Abstract: Constraint Handling Rules (CHR) is a high-level rule-based programming language which is increasingly used for general-purpose programming. We introduce the CHR machine, a model of computation based on the operational semantics of CHR. Its computational power and time complexity properties are compared to those of the well-understood Turing machine and Random Access Memory machine. This allows us to prove the interesting result that every algorithm can be implemented in CHR with the best known time and space complexity. We also investigate the practical relevance of this result and the constant factors involved. Finally we expand the scope of the discussion to other (declarative) programming languages.

79 citations


Book ChapterDOI
08 Jun 2005
TL;DR: This paper identifies and analyzes the historical reasons for the widespread belief that no model of computation more expressive than Turing machines can exist, and presents one such model, Persistent Turing Machines (PTMs), which capture sequential interaction, a limited form of concurrency.
Abstract: According to the interactive view of computation, communication happens during the computation, not before or after it. This approach, distinct from concurrency theory and the theory of computation, represents a paradigm shift that changes our understanding of what is computation and how it is modeled. Interaction machines extend Turing machines with interaction to capture the behavior of concurrent systems, promising to bridge these two fields. This promise is hindered by the widespread belief, incorrectly known as the Church-Turing thesis, that no model of computation more expressive than Turing machines can exist. Yet Turing's original thesis only refers to the computation of functions and explicitly excludes other computational paradigms such as interaction. In this paper, we identify and analyze the historical reasons for this widespread belief. Only by accepting that it is false can we begin to properly investigate formal models of interaction machines. We conclude the paper by presenting one such model, Persistent Turing Machines (PTMs). PTMs capture sequential interaction, which is a limited form of concurrency; they allow us to formulate the Sequential Interaction Thesis, going beyond the expressiveness of Turing machines and of the Church-Turing thesis.
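A minimal way to picture a PTM (an informal sketch of the idea, not the formal definition from the paper) is a machine whose worktape persists across an unbounded stream of interactions: each macrostep consumes one input token, may update the persistent memory, and emits one output token. The toy Python generator below, with an invented worktape and update rule, shows why the overall input/output behavior cannot be reduced to a single function evaluation:

```python
from typing import Iterable, Iterator

def persistent_machine(inputs: Iterable[str]) -> Iterator[str]:
    """Toy persistent computation: the 'worktape' (a running count here)
    survives between interactions instead of being reset for each input."""
    worktape = {"chars_seen": 0}            # persistent memory across macrosteps
    for token in inputs:                    # one macrostep per input token
        worktape["chars_seen"] += len(token)
        yield f"seen {worktape['chars_seen']} characters so far"

if __name__ == "__main__":
    for reply in persistent_machine(["hello", "world", "!"]):
        print(reply)
```

Because the accumulated history can influence every later answer, the stream-level behavior is richer than a function from single inputs to single outputs, which is the sense in which sequential interaction is said to exceed the classical function-computation setting.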

73 citations


Journal ArticleDOI
TL;DR: This paper demonstrates the modeling power of event graphs by presenting a model that simulates a Turing machine; therefore, according to Church's thesis, event-graph models are able to model any system that can be implemented on a modern computer.
Abstract: Event graphs model the dynamics of a discrete-event simulation model. This paper demonstrates the modeling power of event graphs by presenting a model that simulates a Turing machine. Therefore, according to Church's thesis, event-graph models are able to model any system that can be implemented on a modern computer. Theoretical and practical implications of this assertion are also discussed.
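As a toy illustration of the underlying idea (a generic discrete-event sketch with an invented two-rule machine, not the event-graph construction given in the paper), a single self-rescheduling "step" event can drive a Turing machine: each occurrence of the event performs one transition and, unless the machine has halted, schedules the next occurrence on the future-event list.

```python
import heapq

# A tiny Turing machine: a unary incrementer that appends one '1' and halts.
# delta maps (state, symbol) -> (next_state, write_symbol, head_move)
delta = {
    ("scan", "1"): ("scan", "1", +1),
    ("scan", "_"): ("halt", "1", 0),
}

def simulate(tape_str: str) -> str:
    tape = dict(enumerate(tape_str))
    state, head = "scan", 0
    events = [(0.0, "step")]                 # future-event list: (time, event)
    while events:
        clock, event = heapq.heappop(events)
        if event == "step":
            symbol = tape.get(head, "_")
            state, write, move = delta[(state, symbol)]
            tape[head] = write
            head += move
            if state != "halt":
                # the step event reschedules itself after a unit delay
                heapq.heappush(events, (clock + 1.0, "step"))
    return "".join(tape[i] for i in sorted(tape))

if __name__ == "__main__":
    print(simulate("111"))   # prints "1111"
```

The paper's point is that the event-graph formalism itself (event vertices, scheduling edges, conditions, and delays) already supports constructions of this kind, which is what gives event-graph models their Turing-equivalent modeling power.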

63 citations


Book ChapterDOI
01 Jan 2005
TL;DR: This tutorial reviews basic concepts in complexity theory, as well as various No Free Lunch results and how these results relate to computational complexity.
Abstract: This tutorial reviews basic concepts in complexity theory, as well as various No Free Lunch results and how these results relate to computational complexity. The tutorial explains basic concepts in an informal fashion that illuminates key concepts. No Free Lunch theorems for search can be summarized by the following result: For all possible performance measures, no search algorithm is better than another when its performance is averaged over all possible discrete functions.
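For context, the usual Wolpert-Macready formulation of the result paraphrased above (taken from the original NFL literature rather than from this tutorial) says that for any two search algorithms $a_1$ and $a_2$ and any number $m$ of distinct function evaluations,

$$\sum_{f} P\left(d_m^{y} \mid f, m, a_1\right) \;=\; \sum_{f} P\left(d_m^{y} \mid f, m, a_2\right),$$

where the sum ranges over all functions $f : \mathcal{X} \to \mathcal{Y}$ on the finite search space and $d_m^{y}$ is the sequence of objective values observed after $m$ evaluations; averaging any performance measure of $d_m^{y}$ over all $f$ therefore cannot favor one algorithm over the other.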

63 citations


Journal ArticleDOI
TL;DR: The first electronic digital computers were variations on the protean design of a limited Turing machine, which described not a single device but a schema, and which could assume many forms and could develop in many directions as discussed by the authors.
Abstract: The first electronic digital computers were variations on the protean design of a limited Turing machine, which described not a single device but a schema, and which could assume many forms and could develop in many directions. It became what various groups of people made of it. The computer thus has little or no history of its own. Rather, it has histories derived from the histories of the groups of practitioners who saw in it, or in some yet to be envisioned form of it, the potential to realise their agendas and aspirations. What kinds of computers we have designed since 1945, and what kinds of programs we have written for them, reflect not so much the nature of the computer as the purposes and aspirations of the communities who guided those designs and wrote those programs. Their work reflects not the history of the computer but the histories of those groups, even as the use of computers in many cases fundamentally redirected the course of those histories. Separating the histories of computing,...

61 citations


Journal ArticleDOI
TL;DR: It is shown that an Evolutionary Turing Machine is able to solve nonalgorithmically the halting problem of the Universal Turing Machine and, asymptotically, the best evolutionary algorithm problem, suggesting that the best evolutionary algorithm does not exist, but it can be potentially indefinitely approximated using evolutionary techniques.
Abstract: We outline a theory of evolutionary computation using a formal model of evolutionary computation – the Evolutionary Turing Machine – which is introduced as an extension of the Turing Machine model. Evolutionary Turing Machines provide a better and more complete model for evolutionary computing than conventional Turing Machines, algorithms, and Markov chains. The convergence and convergence rate are defined and investigated in terms of this new model. The sufficient conditions needed for the completeness and optimality of evolutionary search are investigated. In particular, the notion of total optimality as an instance of the multiobjective optimization of the Universal Evolutionary Turing Machine is introduced. This provides an automatic way to deal with the intractability of evolutionary search by optimizing the quality of solutions and search costs simultaneously. Based on the new model, a very flexible classification of optimization problem hardness for evolutionary techniques is proposed. The expressiveness of evolutionary computation is investigated. We show that the problem of the best evolutionary algorithm is undecidable independently of whether the fitness function is time dependent or fixed. It is demonstrated that the evolutionary computation paradigm is more expressive than Turing Machines, and thus than the conventional computer science based on them. We show that an Evolutionary Turing Machine is able to solve nonalgorithmically the halting problem of the Universal Turing Machine and, asymptotically, the best evolutionary algorithm problem. In other words, the best evolutionary algorithm does not exist, but it can be potentially indefinitely approximated using evolutionary techniques.

54 citations


Journal ArticleDOI
TL;DR: In this article, the reachability of nonlinear dynamic and control systems is investigated using Turing machines to perform approximate computations, and the main result is that the reachable set is lower-computable, but is outer-computable only if it equals the chain-reachable set.

51 citations


Book ChapterDOI
24 Feb 2005
TL;DR: A complete characterization in the case of finitely generated groups is given and it is shown that such a group has an automatic presentation if and only if it is virtually abelian.
Abstract: A structure is said to be computable if its domain can be represented by a set which is accepted by a Turing machine and if its relations can then be checked using Turing machines. Restricting the Turing machines in this definition to finite automata gives us a class of structures with a particularly simple computational structure; these structures are said to have automatic presentations. Given their nice algorithmic properties, these have been of interest in a wide variety of areas. An area of particular interest has been the classification of automatic structures. One of the prime examples under consideration has been the class of groups. We give a complete characterization in the case of finitely generated groups and show that such a group has an automatic presentation if and only if it is virtually abelian.

Journal ArticleDOI
TL;DR: In this article, the fundamental conditions to generate complex features at the edge of chaos have been established, and using the CNN-UM architecture, a new world of algorithms is opening.
Abstract: Present day classical computers, developed during the last sixty years, are logic machines based on binary logic and arithmetic, acting on discrete valued (binary coded) data. Their unique property is algorithmic (stored) programmability, invented by John von Neumann. The mathematical concept is based on a universal machine on integers (the Turing machine). Cellular automata, also introduced by J. von Neumann, are fully parallel array processors with discrete space, time, and state values. Their beautiful properties have recently been rediscovered, showing deeper qualitative behavior: if we allow the states and time to take continuous values, as in CNN, a broader class of dynamics is generated. Even more, the fundamental conditions to generate complex features at the edge of chaos have been established: the need for local activity. Taking one step further, and using the CNN-UM architecture, a new world of algorithms is opening.

Journal ArticleDOI
TL;DR: The existence of polynomial time algorithms to approximate the Julia sets of given hyperbolic rational functions is proved, and a strict computable error estimate is given w.r.t. the Hausdorff metric on the complex sphere.

Book ChapterDOI
04 Oct 2005
TL;DR: A semi-decidable symbolic algebraic dense-time TCTL model checking algorithm, which satisfies two desirable properties: it can be derived automatically from the symbolic description, and it extends to and generalizes other versions of temporal logics.
Abstract: Motivated by applications to systems biology, and the emergence of semi-algebraic hybrid systems as a natural framework for modeling biochemical networks, we continue exploring the decidability problem for model-checking with TCTL (Timed Computation Tree Logic) over this broad class of semi-algebraic hybrid systems. Previously, we had introduced these models and demonstrated their close connection to the goals of systems biology. However, we had only developed the techniques for bounded reachability, arguing for the adequacy of such an approach in a majority of the biological applications. Here, we present a semi-decidable symbolic algebraic dense-time TCTL model checking algorithm, which satisfies two desirable properties: it can be derived automatically from the symbolic description, and it extends to and generalizes other versions of temporal logics. The main mathematical device at the core of this approach is Tarski-Collins’ real quantifier elimination employed at each fixpoint iteration, whose high complexity is the crux of its unfortunate limitation. Along with these results, we prove the undecidability of this problem in the more powerful “real” Turing machine formalism of Blum, Shub and Smale. We then demonstrate a preliminary version of our model-checker Tolque on the Delta-Notch example.

Book ChapterDOI
08 Jun 2005
TL;DR: It is shown that closed-form analytic maps and flows can simulate Turing machines in an error-robust manner; the maps and flows are explicitly obtained and the simulation is performed in real time.
Abstract: In this paper, we show that closed-form analytic maps and flows can simulate Turing machines in an error-robust manner. The maps and ODEs defining the flows are explicitly obtained and the simulation is performed in real time.

Book
01 Jan 2005
TL;DR: In this paper, the authors present a Primer on the Tools and Concepts of Computable Economics (K. Vela Velupillai and F. A. Doria), as well as a discussion of the relationship between knowledge and information in modeling.
Abstract: Preface. I. Introduction (K. Vela Velupillai). II. Computing the Future (N. C. A. da Costa and F. A. Doria). III. Making Mathematics Effective in Economics (Joseph L. McCauley). IV. Complexity and Information in Modeling (J. Rissanen). V. Introduction: Algorithmic and Exchangeable Aspects (John J. McCall). VI. Constructive and Classical Models for Results in Economics and Game Theory (Kislaya Prasad). VII. A Primer on the Tools and Concepts of Computable Economics (K. Vela Velupillai). VIII. Emergence and Universal Computation (Cassey Lee). IX. Research and Development in Computable Production Functions (Francesco Luna). X. Computable Knowledge and Undecidability: A Turing Machine Metaphor Applied to Endogenous Growth Models (Stefano Zambelli). XI. Rights and Decentralized Computation (Hakan J. Holm). Author Index. Subject Index.

Journal Article
TL;DR: The design of a nanomechanical DNA device that autonomously mimics the operation of a 2-state 5-color universal Turing machine, called an Autonomous DNA Turing Machine (ADTM), is presented; the ADTM is thus capable of universal computation and hence of complex translational motion, defined as universal translational motion.
Abstract: Intelligent nanomechanical devices that operate in an autonomous fashion are of great theoretical and practical interest. Recent successes in building large scale DNA nano-structures, in constructing DNA mechanical devices, and in DNA computing provide a solid foundation for the next step forward: designing autonomous DNA mechanical devices capable of arbitrarily complex behavior. One prototype system towards this goal can be an autonomous DNA mechanical device capable of universal computation, by mimicking the operation of a universal Turing machine. Building on our prior theoretical design and prototype experimental construction of an autonomous unidirectional DNA walking device moving along a linear track, we present here the design of a nanomechanical DNA device that autonomously mimics the operation of a 2-state 5-color universal Turing machine. Our autonomous nanomechanical device, called an Autonomous DNA Turing Machine (ADTM), is thus capable of universal computation and hence complex translational motion, which we define as universal translational motion.

Journal ArticleDOI
TL;DR: Decentralized supervisor synthesis problems may be undecidable, even in cases where all pertinent languages are represented as finite automata; this is shown by reduction from the halting problem for Turing machines.

Journal Article
TL;DR: It is argued that producing reliable self-organised software systems (SOSS) will necessarily involve considerable use of adaptive approaches, and a system for annotating such systems with hypotheses and conditions of application is proposed that would be a natural extension of current methods of open source code development.
Abstract: The 'engineering' and 'adaptive' approaches to system production are distinguished. It is argued that producing reliable self-organised software systems (SOSS) will necessarily involve considerable use of adaptive approaches. A class of apparently simple multi-agent systems is defined, which however has all the power of a Turing machine, and hence is beyond formal specification and design methods (in general). It is then shown that such systems can be evolved to perform simple tasks. This highlights how we may be faced with systems whose workings we have not wholly designed and hence that we will have to treat them more as natural science treats the systems it encounters, namely using the classic experimental method. An example is briefly discussed. A system for annotating such systems with hypotheses and conditions of application is proposed that would be a natural extension of current methods of open source code development.

Posted Content
TL;DR: The halting problem for Turing machines is decidable on a set of programs of asymptotic probability one, as discussed by the authors; the set B involved is polynomial time decidable, and so is the halting problem H ∩ B.
Abstract: The halting problem for Turing machines is decidable on a set of asymptotic probability one. Specifically, there is a set B of Turing machine programs such that (i) B has asymptotic probability one, so that as the number of states n increases, the proportion of all n-state programs that are in B goes to one; (ii) B is polynomial time decidable; and (iii) the halting problem H ∩ B is polynomial time decidable. The proof is sensitive to the particular computational model.
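In symbols, writing $P_n$ for the set of $n$-state programs in the fixed model (notation introduced here for readability, not taken from the abstract), property (i) says

$$\lim_{n \to \infty} \frac{|B \cap P_n|}{|P_n|} = 1,$$

so a decision procedure that answers the halting question on $B$ and reports "don't know" elsewhere is correct on all but a vanishing fraction of programs as $n$ grows.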

Book ChapterDOI
01 Jan 2005
TL;DR: In this article, a class of apparently simple multi-agent systems is defined, which however has all the power of a Turing machine, and hence is beyond formal specification and design methods (in general).
Abstract: The 'engineering' and 'adaptive' approaches to system production are distinguished. It is argued that producing reliable self-organised software systems (SOSS) will necessarily involve considerable use of adaptive approaches. A class of apparently simple multi-agent systems is defined, which however has all the power of a Turing machine, and hence is beyond formal specification and design methods (in general). It is then shown that such systems can be evolved to perform simple tasks. This highlights how we may be faced with systems whose workings we have not wholly designed and hence that we will have to treat them more as natural science treats the systems it encounters, namely using the classic experimental method. An example is briefly discussed. A system for annotating such systems with hypotheses and conditions of application is proposed that would be a natural extension of current methods of open source code development.

Journal ArticleDOI
TL;DR: It is demonstrated why a contradiction only occurs if a type of machine can compute its own diagonal function, why such a situation does not occur for the methods of hypercomputation under attack, and why it is unlikely to occur for any other serious methods.
Abstract: The diagonal method is often used to show that Turing machines cannot solve their own halting problem. There have been several recent attempts to show that this method also exposes either contradiction or arbitrariness in other theoretical models of computation which claim to be able to solve the halting problem for Turing machines. We show that such arguments are flawed—a contradiction only occurs if a type of machine can compute its own diagonal function. We then demonstrate why such a situation does not occur for the methods of hypercomputation under attack, and why it is unlikely to occur for any other serious methods.
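For readers who have not seen the construction under discussion, here is the classical diagonal argument in schematic Python (an illustration of the standard argument about ordinary programs, not of the hypercomputation models the paper defends; the function names are invented):

```python
def halts(prog_source: str, input_data: str) -> bool:
    """Hypothetical total halting decider for ordinary programs,
    assumed to exist only for the sake of the argument."""
    raise NotImplementedError("no such total decider exists")

def diagonal(prog_source: str) -> None:
    """Diagonal program: do the opposite of whatever `halts` predicts
    about a program run on its own source."""
    if halts(prog_source, prog_source):
        while True:       # predicted to halt -> loop forever
            pass
    # predicted to loop forever -> halt immediately

# Feeding `diagonal` its own source yields the contradiction:
# diagonal(src) halts if and only if halts(src, src) is False,
# i.e. if and only if diagonal(src) does not halt.
```

The paper's observation is that the contradiction requires `halts` and `diagonal` to be machines of the same type; a hypercomputer that decides halting for ordinary Turing machines is not thereby required to decide it for machines of its own kind, so no analogous contradiction arises.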

Journal ArticleDOI
TL;DR: It is proved that there is no algorithm to tell whether an arbitrarily constructed Quantum Turing Machine has the same time steps for different branches of computation; hence, a probabilistic notion of halting in Quantum Turing Machines cannot be avoided.
Abstract: We prove that there is no algorithm to tell whether an arbitrarily constructed Quantum Turing Machine has the same time steps for different branches of computation. Hence, we cannot avoid a probabilistic notion of halting in the Quantum Turing Machine.

Journal Article
TL;DR: It is proved that if the rules of the P system are applied sequentially, then the accepted language class is strictly included in the class of languages accepted by one-way Turing machines with a logarithmically bounded workspace.
Abstract: We characterize the classes of languages described by P automata, i.e., accepting P systems with communication rules only. Motivated by properties of natural computing systems, we study computational complexity classes with a certain restriction on the use of the available workspace in the course of computations and relate these to the language classes described by P automata. We prove that if the rules of the P system are applied sequentially, then the accepted language class is strictly included in the class of languages accepted by one-way Turing machines with a logarithmically bounded workspace, and if the rules are applied in the maximal parallel manner, then the class of context-sensitive languages is obtained.

Book ChapterDOI
18 Dec 2005
TL;DR: This proof is the first entirely semantical proof of polytime soundness for light logics and provides a notable simplification of the original, already semantical proof of polytime soundness for LFPL.
Abstract: We give new proofs of soundness (all representable functions on base types lie in certain complexity classes) for Elementary Affine Logic, LFPL (a language for polytime computation close to realistic functional programming introduced by one of us), Light Affine Logic and Soft Affine Logic. The proofs are based on a common semantical framework which is merely instantiated in four different ways. The framework consists of an innovative modification of realizability which allows us to use resource-bounded computations as realisers, as opposed to including all Turing computable functions as is usually the case in realizability constructions. For example, all realisers in the model for LFPL are polynomially bounded computations, whence soundness holds by construction of the model. The work then lies in being able to interpret all the required constructs in the model. While being the first entirely semantical proof of polytime soundness for light logics, our proof also provides a notable simplification of the original, already semantical proof of polytime soundness for LFPL. A new result made possible by the semantic framework is the addition of polymorphism and a modality to LFPL thus allowing for an internal definition of inductive datatypes.

Posted Content
TL;DR: In this article, the computational complexity of uniformizing a domain with a given computable boundary is studied, and nontrivial upper and lower bounds are given in two settings: when the approximation of the boundary is given either as a list of pixels, or by a Turing Machine.
Abstract: In this paper we consider the computational complexity of uniformizing a domain with a given computable boundary. We give nontrivial upper and lower bounds in two settings: when the approximation of the boundary is given either as a list of pixels, or by a Turing Machine.

Book ChapterDOI
04 Apr 2005
TL;DR: The problem of reachability is tackled and a public (i.e., restriction-free) fragment is characterized for which it is decidable, even though this fragment has been shown to be Turing complete by Maffeis and Phillips.
Abstract: Mobile Ambients has been proposed by Cardelli and Gordon as a foundational calculus for mobile computing. Since its introduction, the computational strength as well as the decidability of properties have been investigated for several fragments and variants of the standard calculus. We tackle the problem of reachability and we characterize a public (i.e., restriction free) fragment for which it is decidable. This fragment is obtained by removing the open capability and restricting the use of replication to guarded processes. Quite surprisingly, this fragment has been shown to be Turing complete by Maffeis and Phillips.

Journal Article
TL;DR: The goal of the $-calculus is to propose a computational model with a built-in performance measure as its central element; this measure not only allows the expression of solutions, but also provides the means to incrementally construct solutions for computationally hard, real-life problems.
Abstract: This paper presents a novel model for resource bounded computation based on process algebras. This model is called the $-calculus (cost calculus). Resource bounded computation attempts to find the best answer possible given operational constraints. The $-calculus provides a uniform representation for optimization in the presence of limited resources. It uses cost-optimization to find the best quality solutions while using a minimal amount of resources. A unique aspect of the approach is to propose a resource bounded process algebra as a generic problem solving paradigm targeting interactive AI applications. The goal of the $-calculus is to propose a computational model with a built-in performance measure as its central element. This measure allows not only the expression of solutions, but also provides the means to incrementally construct solutions for computationally hard, real-life problems. This is a dramatic contrast with other models like Turing machines, λ-calculus, or conventional process algebras. This highly expressive model must therefore be able to express approximate solutions. This paper describes the syntax and operational cost semantics of the calculus. A standard cost function has been defined for strongly and weakly congruent cost expressions. Example optimization problems are given which take into account the incomplete knowledge and the amount of resources used by an agent. The contributions of the paper are twofold: firstly, some necessary conditions for achieving global optimization by performing local optimization in time and/or space are found. That deals with incomplete information and complexity during problem solving. Secondly, developing an algebra which expresses current practices, e.g., neural nets, cellular automata, dynamic programming, evolutionary computation, or mobile robotics as limiting cases, provides a tool for exploring the theoretical underpinnings of these methods. As a result, hybrid methods can be naturally expressed and developed using the algebra.

Journal Article
TL;DR: It is proved that the classical complexity class NP equals the set of languages accepted by AHNEPs in polynomial time.
Abstract: We consider time complexity classes defined on accepting hybrid networks of evolutionary processors (AHNEP) similarly to the classical time complexity classes defined on the standard computing model of Turing machine. By definition, AHNEPs are deterministic. We prove that the classical complexity class NP equals the set of languages accepted by AHNEPs in polynomial time.