Showing papers on "Computability published in 1990"
••
TL;DR: An alternative approach inspired by the theories of process-algebra as developed in the computer science literature is presented, which rests on a new formalism of concurrency that can adequately handle nondeterminism and can be used for analysis of a wide range of discrete event phenomena.
Abstract: Much of discrete event control theory has been developed within the framework of automata and formal languages. An alternative approach inspired by the theories of process-algebra as developed in the computer science literature is presented. The framework, which rests on a new formalism of concurrency, can adequately handle nondeterminism and can be used for analysis of a wide range of discrete event phenomena.
156 citations
••
11 Nov 1990
TL;DR: The present approach breaks new ground by simultaneously scheduling and allocating in practical execution times, guaranteeing globally optimal solutions for a specific objective function, and providing a polynomial run-time algorithm for solving some instances of this NP-complete problem.
Abstract: A relaxed linear programming model which simultaneously schedules and allocates functional units and registers is presented for synthesizing cost-constrained globally optimal architectures. This approach is important for industrial applications because it provides exploration of optimal synthesized architectures, and early architectural decisions have the greatest impact on the final design. An integer programming formulation of the architectural synthesis problem is transformed into the node packing problem. Polyhedral theory is used to formulate constraints that decrease the size of the search space, thus improving solution efficiency. Execution times are an order of magnitude faster than for previous heuristic techniques. The present approach breaks new ground by (1) simultaneously scheduling and allocating in practical execution times, (2) guaranteeing globally optimal solutions for a specific objective function, and (3) providing a polynomial run-time algorithm for solving some instances of this NP-complete problem.
131 citations
••
01 Jan 1990
TL;DR: The purpose of this article is to survey the existing literature related to the latter with an emphasis on enumeration reducibility and its associated degree structure.
Abstract: In a computation using auxiliary informational inputs one can think of the external resource making itself available in different ways. One way is via an oracle as in Turing reducibility, where information is supplied on demand without any time delay. Alternatively the Scott graph model for lambda calculus suggests a situation where new information, only some of it immediately related to the current computation, is constantly being generated (or enumerated) over a period of time in an order which is not under the control of the computer. For some purposes, such as in classifying the relative computability of total functions without any time restrictions, it makes no difference whether oracles or enumerations supply auxiliary informational inputs. But this is not generally the case in situations involving partially accessible information or time-bounds, where nondeterministic computations are involved. Clearly both models of computation (based on oracles or enumerations) have wide validity, although much more is known about the former via the rich and extensive theory of Turing computability. The purpose of this article is to survey the existing literature related to the latter with an emphasis on enumeration reducibility and its associated degree structure.
86 citations
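As a toy illustration of the enumeration model discussed above (not from the survey; the encoding is hypothetical): an enumeration operator is a set of axioms (D, x) meaning "once the finite set D has appeared in the enumeration of B, the element x may be enumerated into A". A is enumeration-reducible to B when some such operator turns every enumeration of B into an enumeration of A, regardless of the order in which B's elements arrive.

```python
def apply_operator(axioms, b_enumeration):
    """Yield elements of A as elements of B arrive, in an order we don't control.

    axioms: iterable of (D, x) pairs with D a frozenset of B-elements.
    """
    seen = set()      # elements of B enumerated so far
    emitted = set()   # elements of A already produced
    for b in b_enumeration:
        seen.add(b)
        for d, x in axioms:
            if x not in emitted and d <= seen:   # axiom (D, x) has fired
                emitted.add(x)
                yield x

# Example: A = {n : both 2n and 2n+1 appear in B}
axioms = [(frozenset({2 * n, 2 * n + 1}), n) for n in range(10)]
print(list(apply_operator(axioms, [3, 2, 7, 0, 6, 1])))   # [1, 3, 0]
```

Note how the output order depends on the enumeration order of B, which is exactly the feature absent from the oracle model.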
••
18 Jun 1990
TL;DR: This theory is motivated by the observation that it is impossible to control the initial state of a machine when it is powered on, and by the desire to decide equivalence of two designs based solely on their netlists and logic device models, without knowledge of intended initial states or intended environments.
Abstract: A theory of sequential hardware equivalence [1] is presented, including the notions of gate-level model (GLM), hardware finite state machine (HFSM), state equivalence (∼), alignability, resetability, and sequential hardware equivalence (≈). This theory is motivated by (1) the observation that it is impossible to control the initial state of a machine when it is powered on, and (2) the desire to decide equivalence of two designs based solely on their netlists and logic device models, without knowledge of intended initial states or intended environments.
55 citations
••
02 Apr 1990
TL;DR: A bottom-up query processing algorithm BT is presented that is guaranteed to terminate in polynomial time if the periods are polynomially bounded, and it is shown that it can be decided whether a set of temporal rules is inflationary.
Abstract: We study conditions guaranteeing polynomial time computability of queries in temporal deductive databases. We show that if, for a given set of temporal rules, the period of its least models is bounded from above by a polynomial in the database size, then the time to process yes-no queries (as well as to compute finite representations of all query answers) can also be polynomially bounded. We present a bottom-up query processing algorithm BT that is guaranteed to terminate in polynomial time if the periods are polynomially bounded. Polynomial periodicity is our most general criterion; however, it cannot be applied directly. Therefore, we exhibit two weaker criteria, defining inflationary and I-periodic sets of temporal rules. We show that it can be decided whether a set of temporal rules is inflationary. I-periodicity is undecidable (as we show), but it can be closely approximated by a syntactic notion of multi-separability.
48 citations
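The paper's BT algorithm is not reproduced here, but the underlying bottom-up idea can be sketched with ordinary (non-temporal) Datalog: apply the rules repeatedly until the least fixpoint is reached. A minimal sketch, using a hypothetical transitive-closure program:

```python
def naive_bottom_up(edges):
    """Compute path/2 from edge/2 under the rules:
         path(X, Y) :- edge(X, Y).
         path(X, Z) :- path(X, Y), edge(Y, Z).
    by naive iteration to the least fixpoint."""
    path = set(edges)
    while True:
        new = {(x, z) for (x, y) in path for (y2, z) in edges if y == y2}
        if new <= path:
            return path          # no new facts: least fixpoint reached
        path |= new

print(sorted(naive_bottom_up({(1, 2), (2, 3)})))   # [(1, 2), (1, 3), (2, 3)]
```

The temporal setting of the paper adds time arguments to the facts, which is why termination hinges on the periodicity conditions described above.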
•
01 Jan 1990
TL;DR: Processes as data types: Observational semantics and logic, action versus state based logics for transition systems, and approaching fair computations by ultra metrics.
Abstract: Processes as data types: Observational semantics and logic.- Metric pomset semantics for a concurrent language with recursion.- Fault-tolerant naming and mutual exclusion.- Flow event structures and flow nets.- Three equivalent semantics for CCS.- Towards a semantic approach to SIMD architectures and their languages.- Concerning the size of clocks.- Transition systems with algebraic structure as models of computations.- Concurrency and computability.- Causal trees interleaving + causality.- Partially commutative formal power series.- Infinite traces.- Equivalences and refinement.- CCS and petri nets.- About fixpoints for concurrency.- Observers, experiments, and agents: A comprehensive approach to parallelism.- Action versus state based logics for transition systems.- Approaching fair computations by ultra metrics.- On distributed languages and models for distributed computation.
46 citations
••
40 citations
••
04 Dec 1990
TL;DR: It turned out that in the area of inductive inference of total recursive functions monotonicity can rarely be guaranteed and these results are compared to the problem of inductively inferring text patterns from finite samples.
Abstract: Monotonic and non-monotonic reasoning is introduced into inductive inference. In inductive inference, which is a mathematical theory of algorithmic learning from possibly incomplete information, monotonicity means constructing hypotheses somehow incrementally, whereas the necessity of non-monotonic reasoning indicates that considerable belief revisions may be required during hypothesis formation. Therefore, it is of particular interest to find areas of inductive inference where monotonic construction of hypotheses is always possible. It turns out that in the area of inductive inference of total recursive functions, monotonicity can rarely be guaranteed. These results are compared to the problem of inductively inferring text patterns from finite samples. For this area, there is a universal weakly monotonic inductive inference algorithm. The computability of a stronger algorithm which is developed depends on the decidability of the inclusion problem for pattern languages; this problem remains open. Unfortunately, the latter algorithm turns out to be inconsistent, i.e. it sometimes generates hypotheses unable to reflect the information they are built upon. Consistency and monotonicity can hardly be achieved simultaneously. This raises the question under which circumstances an inductive inference algorithm for learning text patterns can be both consistent and monotonic. This problem class is characterized by closedness under intersection.
33 citations
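For intuition about the pattern languages mentioned above (a sketch, not the paper's algorithm): a pattern is a string of constants and variables, and its language is everything obtained by substituting a nonempty string for each variable, with equal variables substituted equally. Membership can be decided by brute-force backtracking (in general this membership problem is NP-complete):

```python
def matches(pattern, s, binding=None):
    """Does string s belong to the language of `pattern`?

    pattern: list of symbols; strings are constants, ints are variables.
    Each variable must be replaced by the same nonempty string everywhere."""
    if binding is None:
        binding = {}
    if not pattern:
        return s == ""
    head, rest = pattern[0], pattern[1:]
    if isinstance(head, str):                    # constant symbol
        return s.startswith(head) and matches(rest, s[len(head):], binding)
    if head in binding:                          # variable already bound
        w = binding[head]
        return s.startswith(w) and matches(rest, s[len(w):], binding)
    for k in range(1, len(s) + 1):               # try every nonempty substitution
        if matches(rest, s[k:], {**binding, head: s[:k]}):
            return True
    return False

# Pattern x a x (x a variable, "a" a constant): strings of the form w a w.
print(matches([0, "a", 0], "bab"))   # True
print(matches([0, "a", 0], "bac"))   # False
```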
••
TL;DR: In this paper, the authors investigated the computability of choice functions and their complexity, and demystified the excessive demands upon predictability in rational economic behavior in economic theory.
31 citations
01 Jan 1990
30 citations
••
22 Oct 1990
TL;DR: The results show that, depending on the optical model, ray tracing is sometimes undecidable, sometimes PSPACE-hard, and sometimes in PSPACE.
Abstract: The ray-tracing problem is considered for optical systems consisting of a set of refractive or reflective surfaces. It is assumed that the position and the tangent of the incident angle of the initial light ray are rational. The computability and complexity of the ray-tracing problems are investigated for various optical models. The results show that, depending on the optical model, ray tracing is sometimes undecidable, sometimes PSPACE-hard, and sometimes in PSPACE.
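As background for the rationality assumption (an illustrative sketch, not the paper's construction): with rational data a single reflection step can be carried out exactly, so the hardness lies in how this exact state evolves over unboundedly many bounces.

```python
from fractions import Fraction as F

def reflect(d, n):
    """Reflect direction d off a mirror with normal n:
       d' = d - 2 (d.n)/(n.n) n.
    With rational components the result is rational too (exact, no rounding)."""
    dot = d[0] * n[0] + d[1] * n[1]
    nn = n[0] * n[0] + n[1] * n[1]
    c = 2 * dot / nn
    return (d[0] - c * n[0], d[1] - c * n[1])

# A ray heading down-right bouncing off a horizontal mirror (normal pointing up):
print(reflect((F(1), F(-1, 2)), (F(0), F(1))))   # (Fraction(1, 1), Fraction(1, 2))
```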
••
TL;DR: The structural analysis of the partial ordering of the Turing degrees has been a major area of research in recursion theory since the pioneering paper of Kleene and Post [14]; a characterization of the possible ideals of this structure can be found in this paper.
Abstract: Recursion theory deals with computability on the natural numbers. A function ƒ from N to N is computable (or recursive) if it can be calculated by some program on a Turing machine, or equivalently on any other general purpose computer. A major topic of interest, introduced in Post [23], is the notion of relative difficulty of computation. A function ƒ is computable relative to a function g if after equipping the machine with a black box subroutine that provides the values of g, there is a program (which now may call g via the subroutine) which computes ƒ . In this case we write ƒ
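The oracle model described above can be sketched as follows (a hypothetical toy example, not from the paper): the program for ƒ may call g as a black-box subroutine, and it becomes an ordinary program once any concrete g is plugged in.

```python
def make_f(g):
    """Return a program for f that computes *relative to* the oracle g.

    Here f(n) is (hypothetically) the parity of g(0) + ... + g(n):
    f need not be computable on its own, but it is computable relative to g."""
    def f(n):
        # each call g(i) is a query to the black-box subroutine
        return sum(g(i) for i in range(n + 1)) % 2
    return f

# With a concrete (computable) g plugged in, f becomes an ordinary program.
f = make_f(lambda i: i * i)
print([f(n) for n in range(5)])   # [0, 1, 1, 0, 0]
```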
••
TL;DR: It follows from currently held conjectures in computational complexity that a wide class of inference processes considered for reasoning about uncertainty in AI are computationally infeasible.
Abstract: We observe that it follows from currently held conjectures in computational complexity that a wide class of inference processes considered for reasoning about uncertainty in AI are computationally infeasible.
••
TL;DR: A new definition of computability is proposed which lays the foundations for a theory of cybernetic and intelligent machines in which the classical limits imposed by discrete algorithmic procedures are offset by the use of continuous operators on unlimited data.
Abstract: Many attempts [1, 7, 8, 35] have been made to overcome the limit imposed by the Turing Machine [34] to realise general mathematical functions and models of (physical) phenomena. They center around the notion of computability. In this paper we propose a new definition of computability which lays the foundations for a theory of cybernetic and intelligent machines in which the classical limits imposed by discrete algorithmic procedures are offset by the use of continuous operators on unlimited data. This data is supplied to the machine in a totally parallel mode, as a field or wave. This theory of machines draws its concepts from category theory, Lie algebras, and general systems theory. It permits the incorporation of intelligent control into the design of the machine as a virtual element. The incorporated control can be realized in many (machine) configurations of which we give three: a) a quantum mechanical realization appropriate to a possible understanding of the quantum computer and other models of the physi...
••
01 Jan 1990
••
23 Apr 1990
TL;DR: This paper surveys some computability aspects in structural operational specifications, fairness concepts and testing equivalences, with special emphasis on infinite behaviours.
Abstract: This paper surveys some computability aspects in structural operational specifications, fairness concepts and testing equivalences, with special emphasis on infinite behaviours.
••
TL;DR: It is proved that a function f between two coherent domains X and Y is stable and computable if and only if f may be computed by an oracle-machine questioning “in a positive way” a simple class of oracles that supply information about elements in X.
Abstract: The relation between stability and sequentiality is investigated in the category of Girard's coherent domains. We introduce and discuss a notion of computability for stable functions based on the recursive enumerability of their traces, in a way similar to the definition of computable functions in Scott's effectively given domains. We then relate this notion of computability to regular sets and relative algorithms (oracle-machines) of the theory of relativized computability. The notion of oracle-machine is used to formalize the idea of a main sequential program which calls an unspecified external agent O (a sort of subroutine call). In particular we prove that a function f between two coherent domains X and Y is stable and computable if and only if f may be computed by an oracle-machine questioning “in a positive way” a simple class of oracles that supply information about elements in X.
•
TL;DR: In this paper, the authors provide a theory of equilibrium selection for one-shot two-player finite-action strategic-form common interest games, where players are restricted to use strategies which are computable in the sense of Church's thesis.
Abstract: This paper provides a theory of equilibrium selection for one-shot two-player finite-action strategic-form common interest games. A single round of costless unlimited pre-play communication is allowed. Players are restricted to use strategies which are computable in the sense of Church's thesis. The equilibrium notion used involves perturbations which are themselves computable. The only equilibrium payoff vector which survives these strategic restrictions and the computable perturbations is the unique Pareto-efficient one.
••
01 Jul 1990
TL;DR: It is proved that if an existential assertion is provable in either of these systems, then it has a primitive recursive selection function; it is a corollary that if a μ-recursive scheme is provably total, then it is extensionally equivalent to a primitive recursive scheme.
Abstract: We work in the context of abstract data types, modelled as classes of many-sorted algebras. We develop notions of computability over such data types, in particular notions of primitive recursiveness and μ-recursiveness, which generalize the corresponding classical notions over the natural numbers. We also develop classical and intuitionistic formal systems for theories over such data types, and prove (in the case of universal theories) that if an existential assertion is provable in either of these systems, then it has a primitive recursive selection function. It is a corollary that if a μ-recursive scheme is provably total, then it is extensionally equivalent to a primitive recursive scheme. The methods are proof-theoretical, involving cut elimination. These results generalize to an abstract setting previous results of Parsons and Mints over the natural numbers.
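For readers less familiar with the classical notions being generalized, here is a minimal sketch of primitive recursion and the μ-operator over the natural numbers (the paper's abstract many-sorted setting is not attempted here):

```python
def prim_rec(g, h):
    """Primitive recursion: f(0, x) = g(x); f(n+1, x) = h(n, f(n, x), x)."""
    def f(n, x):
        acc = g(x)
        for i in range(n):
            acc = h(i, acc, x)
        return acc
    return f

def mu(p):
    """Unbounded search: mu(p)(x) = least n with p(n, x); may diverge."""
    def f(x):
        n = 0
        while not p(n, x):
            n += 1
        return n
    return f

add = prim_rec(lambda x: x, lambda n, acc, x: acc + 1)   # add(n, x) = n + x
print(add(3, 4))                                          # 7
isqrt = mu(lambda n, x: (n + 1) * (n + 1) > x)            # integer square root
print(isqrt(10))                                          # 3
```

Every primitive recursive function is total, while the μ-operator is what introduces partiality; the paper's corollary relates provably total μ-recursive schemes back to primitive recursive ones.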
••
TL;DR: It is shown that, by using a wider class of enumerations (not only one-to-one as in [2]), one can obtain an external characterization of search computability on arbitrary denumerable structures; in the present paper this is done for REDS-computability.
Abstract: The best known concepts of "effective" computability on first order structures are, perhaps, prime and search computability of Moschovakis [3] and computability by means of effectively definable schemes (EDS) of Friedman [1]. A slight generalization of EDS appropriate for their use on partial structures are the recursively enumerable definitional schemes (REDS) introduced by Shepherdson in [6]. The connections between search computability, prime computability and REDS-computability are studied in [7]. It is proved there that on every partial structure prime computability implies REDS-computability and REDS-computability implies search computability. Moreover it is proved that there exist partial structures on which search computability is not equivalent to REDS-computability and REDS-computability is not equivalent to prime computability. On total structures REDS-computability and prime computability are equivalent but weaker than search computability. In [2] Lacombe uses an external approach to the definition of the effective functions on abstract structures with denumerable domains. He calls his notion V-recursiveness. The equivalence between V-recursiveness and search computability on denumerable structures with equality is proved by Moschovakis [4]. In [8] a generalization of V-recursiveness is studied. The main observation there is that using a wider class of enumerations (not only "one to one" as in [2]), one can obtain an external characterization of search computability on arbitrary denumerable structures. A natural problem is to find external characterizations of other notions of computability and especially of prime computability and REDS-computability. In the present paper this is done for REDS-computability. The more complicated case of prime computability will be considered in a forthcoming paper.
••
19 Nov 1990
TL;DR: It is shown that a hierarchy can be characterized with alternating pushdown automata, which is expected to be strict, in contrast to a hierarchy with alternating finite automata or alternating space-bounded automata.
Abstract: Alternation is a generalized principle of nondeterminism. The alternating Turing machine is used to characterize the polynomial hierarchy. In this paper we show that a hierarchy can be characterized with alternating pushdown automata, which we expect to be strict, in contrast to a hierarchy with alternating finite automata or alternating space-bounded automata. We describe a similar oracle hierarchy over the context-free languages, for which we construct complete languages. We show that each level of the hierarchy with alternating pushdown automata is included in the corresponding level of the oracle hierarchy, and that the logarithmic closure over both levels is the corresponding level of the polynomial hierarchy with one alternation less.
••
01 Jan 1990
TL;DR: It is proved that both problems, the computability of the fixpoint in the presence of an infinite universe and the completeness of the semantics restricted to interpretations containing only bounded depth terms, are undecidable in general.
Abstract: This work is devoted to the integration of functions in Datalog. Functions are defined with a rewrite relation. We define a fixpoint semantics for Horn programs containing both relations and rewriting rules. The principal contribution is the bounded semantics. We study the two following problems: the computability of the fixpoint in the presence of an infinite universe, and the completeness of the semantics restricted to interpretations containing only bounded depth terms. We prove that both problems are undecidable in general, but decidable subcases are presented.
••
TL;DR: It is shown that a certain notion of computability via gödelization is different from Lacombe's notion of V-recursiveness, and the complexity of translating a Gödelnumbering into a direct sum of Friedberg numberings is discussed.
Abstract: We present a simple proof of a Theorem of Khutoretskij on the number of incomparable one-one numberings of an r.e. family of r.e. sets. The proof directly generalizes to effective domains. In the second part, applying a Theorem of Goncharov, we show that for any k ≧ there exist total recursive functions having exactly k recursive isomorphism classes. Using a Theorem of Selivanov, it is shown that a certain notion of computability via gödelization is different from Lacombe's notion of V-recursiveness. Finally, we discuss the complexity (w.r.t. T-degrees) of translating a Gödelnumbering into a direct sum of Friedberg numberings.
••
01 Jan 1990
TL;DR: Parity and the Pigeonhole Principle, Computational Models for Feasible Real Analysis, and Subrecursion and Lambda Representation over Free Algebras.
Abstract: Parity and the Pigeonhole Principle.- Computing over the Reals (or an Arbitrary Ring) Abstract.- On Model Theory for Intuitionistic Bounded Arithmetic with Applications to Independence Results.- Sequential, Machine Independent Characterizations of the Parallel Complexity Classes ALogTIME, ACk, NCk and NC.- Characterizations of the Basic Feasible Functionals of Finite Type.- Functional Interpretations of Feasibly Constructive Arithmetic - Abstract.- Polynomial-time Combinatorial Operators are Polynomials.- Isols and Kneser Graphs.- Stockmeyer Induction.- Probabilities of Sentences about Two Linear Orderings.- Bounded Linear Logic: a Modular Approach to Polynomial Time Computability, Extended Abstract.- On Finite Model Theory (Extended Abstract).- Computational Models for Feasible Real Analysis.- Inverting a One-to-One Real Function is Inherently Sequential.- On Bounded ?11 Polynomial Induction.- Subrecursion and Lambda Representation over Free Algebras (Preliminary Summary).- Complexity-Theoretic Algebra: Vector Space Bases.- When is every Recursive Linear Ordering of Type ω Recursively Isomorphic to a Polynomial Time Linear Ordering over the Natural Numbers in Binary Form?.
••
01 Jan 1990
TL;DR: The aim of the present paper is a comparative study of the computational power of the logic programs and search computability and the results show strong support for the claim that these programs are Turing-complete.
Abstract: The aim of the present paper is a comparative study of the computational power of the logic programs and search computability [1]
••
TL;DR: The concept of efficient computability is discussed, and several formulas are given for the evaluation of the Catalan numbers and other quantities, which are best suited for particular computation devices.
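The TL;DR above does not reproduce the formulas; two standard ones for the Catalan numbers are the closed form Cn = C(2n, n)/(n+1) and the recurrence C0 = 1, Cn+1 = Σ Ci·Cn−i, sketched here:

```python
from math import comb

def catalan_closed(n):
    # Closed form: C_n = binomial(2n, n) / (n + 1); the division is exact.
    return comb(2 * n, n) // (n + 1)

def catalan_recurrence(n):
    # Recurrence: C_0 = 1, C_m = sum_{i=0}^{m-1} C_i * C_{m-1-i}
    c = [1] * (n + 1)
    for m in range(1, n + 1):
        c[m] = sum(c[i] * c[m - 1 - i] for i in range(m))
    return c[n]

print([catalan_closed(n) for n in range(8)])   # [1, 1, 2, 5, 14, 42, 132, 429]
```

The closed form costs one binomial coefficient, while the recurrence needs quadratically many multiplications; which is "best suited" depends on the computation device, as the paper discusses.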
••
TL;DR: An attempt is made to propose a concept of limited rationality for choice functions based on computability theory in computer science; the main result states that effective realization of game strategies is bounded by the “complexity” of computing machines.
Abstract: An attempt is made to propose a concept of limited rationality for choice functions based on computability theory in computer science. Starting with the observation that it is possible to construct a machine simulating the strategies of each individual in society, one machine for each individual's preference structure, we identify the internal states of this machine with strategies or strategic preferences. Inputs are the possible actions of other agents in society; thus society is effectively operating as a game generated by machines. The main result states that effective realization of game strategies is bounded by the “complexity” of computing machines.
••
TL;DR: It is shown that in the deterministic case, 2-tape Turing machines can simulate k-tape Turing machines without much increase in reversals, while 1-tape Turing machines do not have such a property if P ≠ PSPACE.