Journal ArticleDOI

Reliable information storage in memories designed from unreliable components

Michael G. Taylor
01 Dec 1968 · Vol. 47, Iss. 10, pp. 2299–2337
TL;DR
The capabilities of memories are discussed, and it is shown that there exist memories, constructed entirely from unreliable components of the assumed type, which have nonzero information storage capacities.
Abstract
This is the first of two papers which consider the theoretical capabilities of computing systems designed from unreliable components. This paper discusses the capabilities of memories; the second paper discusses the capabilities of entire computing systems. Both present existence theorems analogous to the existence theorems of information theory. The fundamental result of information theory is that communication channels have a capacity, C, such that for all information rates less than C, arbitrarily reliable communication can be achieved. In analogy with this result, it is shown that each type of memory has an information storage capacity, C, such that for all memory redundancies greater than 1/C, arbitrarily reliable information storage can be achieved. Since memory components malfunction in many different ways, two representative models for component malfunctions are considered. The first is based on the assumption that malfunctions of a particular component are statistically independent from one use to another. The second is based on the assumption that components fail permanently but that bad components are periodically replaced with good ones. In both cases, malfunctions in different components are assumed to be independent. For both models it is shown that there exist memories, constructed entirely from unreliable components of the assumed type, which have nonzero information storage capacities.
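The redundancy-versus-reliability trade-off described in the abstract can be illustrated with a toy simulation of the paper's first malfunction model (each cell fails independently on each use). This is a minimal sketch, not Taylor's construction: it assumes a simple repetition code with redundancy n and an idealized, error-free majority-vote refresh, whereas the paper's point is that even the correcting circuitry can be built from unreliable components. All parameter values below are illustrative.

```python
import random

def store(bits, n):
    # Replicate each logical bit into n unreliable cells (redundancy n).
    return [[b] * n for b in bits]

def noisy_step(memory, p, rng):
    # Model 1: each cell flips independently with probability p per time step.
    for cells in memory:
        for i in range(len(cells)):
            if rng.random() < p:
                cells[i] ^= 1

def refresh(memory):
    # Periodic majority-vote correction (idealized as error-free here).
    for cells in memory:
        maj = int(sum(cells) * 2 > len(cells))
        for i in range(len(cells)):
            cells[i] = maj

def read(memory):
    # Decode each logical bit by majority vote over its cells.
    return [int(sum(c) * 2 > len(c)) for c in memory]

rng = random.Random(0)
bits = [rng.randint(0, 1) for _ in range(200)]
p, steps, period = 0.01, 100, 5

results = {}
for n in (1, 3, 9):
    mem = store(bits, n)
    for t in range(1, steps + 1):
        noisy_step(mem, p, rng)
        if t % period == 0:
            refresh(mem)
    results[n] = sum(a != b for a, b in zip(bits, read(mem)))

print(results)  # residual bit errors per redundancy level
```

With these settings the unprotected memory (n = 1) decays toward randomness over 100 steps, while modest redundancy (n = 9) keeps essentially every bit intact, mirroring the qualitative claim that sufficient redundancy buys arbitrarily reliable storage.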


Citations
Journal ArticleDOI

Coding for interactive communication

TL;DR: A deterministic method is described for simulating noiseless-channel protocols on noisy channels with only a constant slowdown, an analog for general, interactive protocols of Shannon's coding theorem, which deals only with data transmission, i.e., one-way protocols.
Journal ArticleDOI

On the Optimal Recovery Threshold of Coded Matrix Multiplication

TL;DR: Novel coded computation strategies are provided for distributed matrix–matrix products that outperform the recent "Polynomial code" constructions in recovery threshold, i.e., the required number of successful workers.
Proceedings ArticleDOI

On networks of noisy gates

TL;DR: It is shown that many Boolean functions (including, in a certain sense, "almost all" Boolean functions) have the property that the number of noisy gates needed to compute them differs from the number of noiseless gates by at most a constant factor.
Journal Article

Some Applications of Coding Theory in Computational Complexity

TL;DR: A survey of locally testable and locally decodable error-correcting codes can be found in this article, where the authors present a combinatorial core of probabilistically checkable proofs.
Journal ArticleDOI

Reliable computation with cellular automata

TL;DR: A one-dimensional array of cellular automata is constructed on which arbitrarily large computations can be implemented reliably, even though each automaton makes an error at each step with some constant probability; this leads to a refutation of the positive probability conjecture.
References
Journal ArticleDOI

A mathematical theory of communication

TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Journal ArticleDOI

A simple derivation of the coding theorem and some applications

TL;DR: Both amplitude-discrete and amplitude-continuous channels are treated, both with and without input constraints, and the exponential behavior of the bounds with block length is the best known for all transmission rates between 0 and capacity.
Journal ArticleDOI

A heuristic discussion of probabilistic decoding

TL;DR: The editors invited Professor Fano to commit to paper his elegant but unelaborated explanation of the principles of sequential decoding, a scheme which is currently contending for a position as the most practical implementation of Shannon's theory of noisy communication channels.
Journal ArticleDOI

Computation in the presence of noise

TL;DR: It is shown that a simple combinational computer which can take the AND or OR of k or more input blocks can be made arbitrarily reliable only by making n/k arbitrarily large, so that the capacity for computation, in an information-theoretic coding sense, is zero.