Open Access · Posted Content

Connectionist-Symbolic Machine Intelligence using Cellular Automata based Reservoir-Hyperdimensional Computing

TLDR
A novel reservoir computing framework is introduced that is capable of both connectionist machine intelligence and symbolic computation, and the cellular automaton reservoir is proved to hold a distributed representation of attribute statistics, which provides more effective computation than a local representation.
Abstract
We introduce a novel framework of reservoir computing that is capable of both connectionist machine intelligence and symbolic computation. A cellular automaton is used as the reservoir dynamical system. The input is randomly projected onto the initial conditions of the automaton cells, and nonlinear computation is performed on the input by applying an automaton rule for a period of time. The evolution of the automaton creates a space-time volume of the automaton state space, and this volume is used as the reservoir. The proposed framework is capable of long short-term memory, and it requires orders of magnitude less computation than Echo State Networks. We prove that the cellular automaton reservoir holds a distributed representation of attribute statistics, which provides more effective computation than a local representation. It is possible to estimate the kernel for linear cellular automata via metric learning, which enables much more efficient distance computation in the support vector machine framework. In addition, binary reservoir feature vectors can be combined using Boolean operations, as in hyperdimensional computing, paving a direct way for concept building and symbolic processing.
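The pipeline the abstract describes — randomly projecting input bits onto cellular automaton initial conditions, evolving the automaton under a rule, and flattening the resulting space-time volume into a feature vector — can be sketched as follows. The cell count, number of steps, and the choice of the linear Rule 90 are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def rule90_step(state):
    """One step of elementary CA Rule 90: each cell becomes the XOR of
    its left and right neighbors (periodic boundary). A linear rule,
    chosen here purely for illustration."""
    return np.roll(state, 1) ^ np.roll(state, -1)

def ca_reservoir_features(x_bits, n_cells=64, n_steps=16, seed=0):
    """Randomly project binary input onto CA initial conditions, evolve
    the automaton, and flatten the space-time volume into a feature vector."""
    rng = np.random.default_rng(seed)
    # Random projection: each input bit lands on a distinct random cell.
    positions = rng.choice(n_cells, size=len(x_bits), replace=False)
    state = np.zeros(n_cells, dtype=np.uint8)
    state[positions] = x_bits
    volume = [state.copy()]
    for _ in range(n_steps - 1):
        state = rule90_step(state)
        volume.append(state.copy())
    # The concatenated space-time volume is the reservoir feature vector.
    return np.concatenate(volume)

features = ca_reservoir_features(np.array([1, 0, 1, 1], dtype=np.uint8))
```

A linear readout (for example an SVM, in line with the framework's use of support vector machines) would then be trained on `features`.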


Citations
Journal Article (DOI)

Classification Using Hyperdimensional Computing: A Review

TL;DR: Hyperdimensional (HD) computing, as discussed by the authors, is built upon a unique data type referred to as hypervectors, which typically have tens of thousands of dimensions and are used to solve cognitive tasks.
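The core hypervector operations such reviews describe — binding via XOR, bundling via majority vote, and Hamming similarity — can be sketched minimally as below. The 10,000-dimensional binary vectors and the role-filler record are illustrative assumptions, not taken from any specific paper.

```python
import numpy as np

D = 10_000  # typical hypervector dimensionality
rng = np.random.default_rng(42)

def random_hv():
    """Random dense binary hypervector; any two are ~50% similar (quasi-orthogonal)."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Binding (bitwise XOR): pairs two hypervectors; the result is
    dissimilar to both inputs, and XOR is its own inverse."""
    return a ^ b

def bundle(*hvs):
    """Bundling (bitwise majority vote over an odd number of vectors):
    the result remains similar to each input."""
    return (np.sum(hvs, axis=0) > len(hvs) / 2).astype(np.uint8)

def similarity(a, b):
    """Normalized Hamming similarity: 1.0 identical, ~0.5 unrelated."""
    return 1.0 - float(np.mean(a ^ b))

# Hypothetical toy record: three role-filler pairs bound, then bundled.
country, capital, currency = random_hv(), random_hv(), random_hv()
usa, washington, dollar = random_hv(), random_hv(), random_hv()
record = bundle(bind(country, usa),
                bind(capital, washington),
                bind(currency, dollar))

# Query: XOR-unbinding the `capital` role yields a vector noticeably
# more similar to `washington` than to unrelated hypervectors.
query = bind(record, capital)
```

Because XOR is self-inverse, unbinding is the same operation as binding; the bundled record stays queryable because majority voting preserves similarity to each bound pair.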
Book Chapter (DOI)

Reservoir Computing as a Model for In-Materio Computing

TL;DR: The prospects of using Reservoir Computing as a model for in-materio computing are discussed, and new training techniques are introduced that could overcome training difficulties found in the current Evolution-in-Materio technique.
Journal Article (DOI)

Symbolic computation using cellular automata-based hyperdimensional computing

TL;DR: This letter introduces a novel framework of reservoir computing that is capable of both connectionist machine intelligence and symbolic computation, and suggests that binary reservoir feature vectors can be combined using Boolean operations as in hyperdimensional computing.
Journal Article (DOI)

Deep learning with cellular automaton-based reservoir computing

Stefano Nichele, +1 more · 15 Dec 2017
TL;DR: This work lays the foundation for implementations of deep learning with CA-based reservoir systems by providing a method of mapping binary inputs from the task onto the automata and a recurrent architecture for handling the sequential aspects.
References
Journal Article (DOI)

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
Journal Article (DOI)

Gradient-based learning applied to document recognition

TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition, which can be used to synthesize a complex decision surface that can classify high-dimensional patterns, such as handwritten characters.
Journal Article (DOI)

LIBSVM: A library for support vector machines

TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Book

The Nature of Statistical Learning Theory

TL;DR: Topics covered include the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of learning process, the author covers function estimates from small data pools, applying these estimations to real-life problems, and much more.