Open Access Journal Article DOI

On the Computational Power of Neural Nets

TLDR
It is proved that one may simulate all Turing machines by such nets; in particular, one can simulate any multi-stack Turing machine in real time, and there is a net made up of 886 processors which computes a universal partial-recursive function.
About
This article was published in the Journal of Computer and System Sciences on 1995-02-01 and is currently open access. It has received 837 citations to date. The article focuses on the topics: Super-recursive algorithm & Universal Turing machine.
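The net model behind this result is simple: each processor applies a saturated-linear activation to a rational affine combination of the current processor states and the external input, and Turing-machine stacks are encoded in the rational values those states take. Below is a minimal Python sketch of that update rule under simplifying assumptions (illustrative names, a plain binary stack encoding rather than the base-4 encoding used in the paper); it is not the paper's 886-processor construction.

    # Sketch of the saturated-linear processor net model (illustrative only).
    from fractions import Fraction

    def sigma(x):
        # Saturated-linear activation: 0 below 0, identity on [0, 1], 1 above 1.
        return max(Fraction(0), min(Fraction(1), x))

    def step(x, u, A, B, c):
        # One synchronous update: x(t+1) = sigma(A x(t) + B u(t) + c).
        return [
            sigma(sum(A[i][j] * x[j] for j in range(len(x)))
                  + sum(B[i][k] * u[k] for k in range(len(u)))
                  + c[i])
            for i in range(len(x))
        ]

    # Toy single-processor net: each input bit is pushed onto a "stack" stored
    # in the binary expansion of the state (the paper uses a base-4 encoding
    # so that the top of the stack can be read off robustly).
    A, B, c = [[Fraction(1, 2)]], [[Fraction(1, 2)]], [Fraction(0)]
    x = [Fraction(0)]
    for bit in (1, 0, 1):
        x = step(x, [Fraction(bit)], A, B, c)
    print(x)  # [Fraction(5, 8)] -> 0.101 in binary, the bits in reverse push order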

Citations
Book

Deep Learning

TL;DR: Deep learning is a form of machine learning that enables computers to learn from experience and to understand the world in terms of a hierarchy of concepts. It is used in many applications, such as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and video games.
Posted Content

On the difficulty of training Recurrent Neural Networks

TL;DR: This paper proposes a gradient norm clipping strategy to deal with exploding gradients and a soft constraint for the vanishing gradients problem, and empirically validates the hypothesis and the proposed solutions.
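The clipping strategy referred to here is straightforward: when the global norm of the gradient exceeds a chosen threshold, rescale the gradient so its norm equals that threshold before taking the update step. A minimal NumPy sketch follows, with an illustrative threshold; it is a generic rendering of gradient norm clipping, not the paper's code.

    # Generic gradient norm clipping (illustrative, not the paper's implementation).
    import numpy as np

    def clip_by_global_norm(grads, threshold):
        # Rescale a list of gradient arrays so their joint L2 norm is at most `threshold`.
        total_norm = np.sqrt(sum(np.sum(g * g) for g in grads))
        if total_norm > threshold:
            grads = [g * (threshold / total_norm) for g in grads]
        return grads

    # Hypothetical use inside a training step:
    #   grads = backprop(model, batch)               # assumed helper
    #   grads = clip_by_global_norm(grads, 1.0)
    #   params = [p - lr * g for p, g in zip(params, grads)]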
Journal Article DOI

The empirical case for two systems of reasoning.

TL;DR: The distinction between rule-based and associative systems of reasoning has been discussed extensively in cognitive psychology; the authors draw this distinction on the basis of the properties that are normally assigned to rules.
Journal Article DOI

An Introduction to Deep Learning for the Physical Layer

TL;DR: In this article, an end-to-end reconstruction task is proposed that jointly optimizes transmitter and receiver components in a single process and can be extended to networks of multiple transmitters and receivers.
References
Journal Article DOI

A logical calculus of the ideas immanent in nervous activity

TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under another and gives the same results, although perhaps not in the same time.
Book

Introduction to Automata Theory, Languages, and Computation

TL;DR: This book is a rigorous exposition of formal languages and models of computation, with an introduction to computational complexity, appropriate for upper-level computer science undergraduates who are comfortable with mathematical arguments.
Journal Article DOI

Finding Structure in Time

TL;DR: This paper develops a proposal along these lines, first described by Jordan (1986), which uses recurrent links to provide networks with a dynamic memory, and suggests a method for representing lexical categories and the type/token distinction.
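The "dynamic memory" in such a simple recurrent (Elman-style) network comes from feeding the previous hidden state back in alongside the current input. A minimal NumPy sketch with illustrative sizes and names is shown below; it is not the paper's model or code.

    # Elman-style recurrent step (illustrative): the previous hidden state acts
    # as context, giving the network a memory of the sequence seen so far.
    import numpy as np

    def elman_step(x_t, h_prev, W_xh, W_hh, b_h):
        # h(t) = tanh(W_xh x(t) + W_hh h(t-1) + b)
        return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

    rng = np.random.default_rng(0)
    input_dim, hidden_dim = 3, 5                      # illustrative sizes
    W_xh = rng.normal(size=(hidden_dim, input_dim))
    W_hh = rng.normal(size=(hidden_dim, hidden_dim))
    b_h = np.zeros(hidden_dim)

    h = np.zeros(hidden_dim)                          # context units start empty
    for x_t in rng.normal(size=(4, input_dim)):       # a short input sequence
        h = elman_step(x_t, h, W_xh, W_hh, b_h)       # h now summarizes the history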
Journal Article DOI

An introduction to computing with neural nets

TL;DR: This paper provides an introduction to the field of artificial neural nets by reviewing six important neural net models that can be used for pattern classification and exploring how some existing classification and clustering algorithms can be performed using simple neuron-like components.
Journal Article DOI

Neural computation of decisions in optimization problems

TL;DR: Results of computer simulations of a network designed to solve a difficult but well-defined optimization problem, the Traveling-Salesman Problem, are presented and used to illustrate the computational power of the networks.