
Showing papers on "Randomness published in 1987"


Book
01 Jan 1987
TL;DR: Correlation Theory of Random Processes, as discussed in this paper, is the second volume of a four-volume series that introduces the newcomer to the theory of random functions and provides the background necessary to understand papers and monographs on the subject and to carry out independent research in fields where fluctuations are of importance, e.g. radiophysics, optics, astronomy, and acoustics.
Abstract: This book is part of a four-volume series that introduces the newcomer to the theory of random functions. It aims at providing the background necessary to understand papers and monographs on the subject and to carry out independent research in fields where fluctuations are of importance, e.g. radiophysics, optics, astronomy, and acoustics. Volume 2, Correlation Theory of Random Processes, presents the correlation theory of nonstationary processes, paying particular attention to periodically nonstationary processes. Physical phenomena like interference, coherence and polarisation of random oscillations, thermal noise in discrete dynamical systems, and the spectral representations of random actions on discrete systems are dealt with.

697 citations


Journal ArticleDOI
TL;DR: In this paper, the statistical mechanics of interfaces subject to quenched impurities is studied in two dimensions; the presence of randomness changes the scaling of domain-wall fluctuations and modifies critical behavior at interface depinning (wetting) and at commensurate-to-incommensurate phase transitions.

286 citations


Book
01 Dec 1987
TL;DR: The papers gathered in this book were published over a period of more than twenty years in widely scattered journals and led to the discovery of randomness in arithmetic, which was presented in the recently published monograph on “Algorithmic Information Theory” by the author.
Abstract: The papers gathered in this book were published over a period of more than twenty years in widely scattered journals. They led to the discovery of randomness in arithmetic which was presented in the recently published monograph on "Algorithmic Information Theory" by the author. There the strongest possible version of Goedel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs, was discussed. The present book is intended as a companion volume to the monograph and it will serve as a stimulus for work on complexity, randomness and unpredictability, in physics and biology as well as in metamathematics.

164 citations


Journal ArticleDOI
TL;DR: In this article, a correspondence between Eigen's model of macromolecular evolution and the equilibrium statistical mechanics of an inhomogeneous Ising system is developed, where the free energy landscape of random Ising systems with the Hopfield Hamiltonian as a special example is applied to the replication rate coefficient landscape.
Abstract: The correspondence between Eigen's model of macromolecular evolution and the equilibrium statistical mechanics of an inhomogeneous Ising system is developed. The free energy landscape of random Ising systems with the Hopfield Hamiltonian as a special example is applied to the replication rate coefficient landscape. The coupling constants are scaled with 1/l, since the maxima of any landscape must not increase with the length of the macromolecules. The calculated error threshold relation then agrees with Eigen's expression, which was derived in a different way. It gives an explicit expression for the superiority parameter in terms of the parameters of the landscape. The dynamics of selection and evolution is discussed.
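
For orientation, the error-threshold relation usually quoted for Eigen's model takes the generic form below (the standard textbook version; the paper's landscape-specific expression for the superiority parameter is not reproduced here):

```latex
% Generic Eigen error threshold (standard form, not this paper's derivation).
% q      : single-digit copying fidelity (0 < q < 1)
% l      : length of the macromolecule
% \sigma : superiority parameter of the master sequence
q^{\,l}\,\sigma > 1
\qquad\Longleftrightarrow\qquad
l < \frac{\ln\sigma}{-\ln q} \;\approx\; \frac{\ln\sigma}{1-q} \quad (q \to 1).
```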

149 citations


Book
01 Dec 1987

147 citations


Journal ArticleDOI
TL;DR: Probabilistic methods, synthesizing the power of finite element methods with second-order perturbation techniques, are formulated for linear and nonlinear problems in this paper, where the effects of combined random fields and cyclic loading/stress reversal are studied and compared with Monte Carlo simulation results.

130 citations


Journal ArticleDOI
Gregory J. Chaitin
TL;DR: In this paper, the authors construct an equation involving only whole numbers and addition, multiplication, and exponentiation, with the property that if one varies a parameter and asks whether the number of solutions is finite or infinite, the answer to this question is indistinguishable from the result of independent tosses of a fair coin.

120 citations


Journal ArticleDOI
TL;DR: An algorithm is presented which efficiently generates “high quality” random sequences (quasirandom bit-sequences) from two independent semi-random sources.
Abstract: The semi-random source, defined by Santha and Vazirani, is a general mathematical model for imperfect and correlated sources of randomness (physical sources such as noise diodes). In this paper an algorithm is presented which efficiently generates “high quality” random sequences (quasirandom bit-sequences) from two independent semi-random sources. The general problem of extracting “high quality” bits is shown to be related to communication complexity theory, leading to a definition of strong communication complexity of a boolean function. A hierarchy theorem for strong communication complexity classes is proved; this allows the previous algorithm to be generalized to one that can generate quasi-random sequences from two communicating semi-random sources.
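
As a rough illustration of combining two independent imperfect sources, the sketch below uses the classical inner-product construction applied to simulated Santha-Vazirani-style sources; it is a toy simulation under assumed parameters, not the algorithm of this paper.

```python
import random

def semi_random_bits(n, delta=0.4):
    """Simulate a semi-random (Santha-Vazirani) source: each bit is 1 with a
    probability that may drift anywhere in [delta, 1 - delta]."""
    bits = []
    for _ in range(n):
        p = random.uniform(delta, 1 - delta)   # adversarially biased probability
        bits.append(1 if random.random() < p else 0)
    return bits

def inner_product_bit(x, y):
    """Combine one block from each source into one output bit (inner product mod 2)."""
    return sum(a & b for a, b in zip(x, y)) & 1

def quasi_random_sequence(num_bits, block=32):
    """One output bit per pair of fresh blocks from the two sources."""
    return [inner_product_bit(semi_random_bits(block), semi_random_bits(block))
            for _ in range(num_bits)]

print(quasi_random_sequence(16))
```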

117 citations


Posted Content
01 Jan 1987
TL;DR: In this article, the authors considered the consistency property of some test statistics based on a time series of data and provided Monte Carlo evidence on the power of the tests in finite samples.
Abstract: This paper considers the consistency property of some test statistics based on a time series of data. While the usual consistency criterion is based on keeping the sampling interval fixed, we let the sampling interval take any path as the sample size increases to infinity. We consider tests of the null hypotheses of the random walk and randomness against positive autocorrelation. We show that tests of the unit root hypothesis based on the first-order correlation coefficient of the original data are consistent as long as the span of the data is increasing. Tests of the same hypothesis based on the first-order correlation coefficient using the first-differenced data are consistent only if the span is increasing at a rate greater than the square root of T. On the other hand, tests of the randomness hypothesis based on the first-order correlation coefficient applied to the original data are consistent as long as the span is not increasing too fast. We provide Monte Carlo evidence on the power, in finite samples, of the tests studied, allowing various combinations of span and sampling frequencies. It is found that the consistency properties summarize well the behavior of the power in finite samples. The power of tests for a unit root is more influenced by the span than the number of observations, while tests of randomness are more powerful when a small sampling frequency is available.
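
A toy illustration of the statistics involved (first-order autocorrelation of the levels versus the first differences of a discretely sampled random walk); the span, sampling interval, and seed below are hypothetical, and the paper's actual test designs and critical values are not reproduced.

```python
import numpy as np

def first_order_corr(x):
    """Sample first-order autocorrelation coefficient."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[1:], x[:-1]) / np.dot(x, x))

rng = np.random.default_rng(0)
span, h = 100.0, 0.1                                  # data span and sampling interval
n = int(span / h)                                     # number of observations
walk = rng.normal(scale=np.sqrt(h), size=n).cumsum()  # discretely sampled random walk

print("levels      rho_1 =", round(first_order_corr(walk), 3))           # near 1 under a unit root
print("differences rho_1 =", round(first_order_corr(np.diff(walk)), 3))  # near 0 under randomness
```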

111 citations


Journal ArticleDOI
TL;DR: In this article, the specific heat singularity of the 2D q-state Potts model with weak quenched bond randomness is obtained to two-loop order in an expansion in the parameter ( q − 2) around the Ising model, in analogy to the e-expansion for φ 4 scalar field theory around the gaussian model.

110 citations


Patent
11 Sep 1987
TL;DR: In this article, a zener diode random number generator circuit is described which produces a random binary number output having a statistical distribution exhibiting a controlled degree of randomness determined in response to an input control signal.
Abstract: A zener diode random number generator circuit is described which produces a random binary number output having a statistical distribution exhibiting a controlled degree of randomness determined in response to an input control signal. A microprocessor feedback circuit monitors the random number output and produces the input control signal in response to the difference between the degree of randomness of the output signal and that of a pre-determined statistical distribution. The digital feedback automatically adjusts the zener diode biasing point and the limiter threshold such that part-to-part tolerance, component aging, temperature variations, or voltage fluctuations will not adversely affect the randomness of the bit stream output. In the preferred embodiment, the microprocessor tests the ratio of ONES bits to ZERO bits of the random number such that a desired 1:1 ONES/ZERO ratio is approximated.
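
A software caricature of the feedback loop described above, with a simulated noise voltage standing in for the zener diode and hypothetical gain and window values; the actual analog biasing and limiter circuitry is not modeled.

```python
import random

def noise_sample(offset):
    """Stand-in for the amplified zener noise voltage; 'offset' mimics drift
    from part tolerances, aging, temperature, or supply variations."""
    return random.gauss(offset, 1.0)

def feedback_rng(n_bits, window=1000, gain=1.0, offset=0.3):
    """Compare noise to a threshold to get bits; after each window, nudge the
    threshold so the ONES/ZEROS ratio moves toward 1:1."""
    threshold, bits, ones = 0.0, [], 0
    for i in range(1, n_bits + 1):
        bit = 1 if noise_sample(offset) > threshold else 0
        bits.append(bit)
        ones += bit
        if i % window == 0:                       # microprocessor-style feedback step
            threshold += gain * (ones / window - 0.5)
            ones = 0
    return bits

stream = feedback_rng(50_000)
print("fraction of ones:", sum(stream) / len(stream))  # settles near 0.5 as the threshold adapts
```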

Book ChapterDOI
13 Apr 1987
TL;DR: Stream ciphers are based on pseudorandom key streams, i.e. on deterministically generated sequences of bits with acceptable properties of unpredictability and randomness.
Abstract: Stream ciphers are based on pseudorandom key streams, i.e. on deterministically generated sequences of bits with acceptable properties of unpredictability and randomness (see [4, ch. 9], [9]).
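
A minimal sketch of the principle: a toy linear feedback shift register produces a deterministic key stream that is XORed with the plaintext. The 16-bit register and tap set below are the common textbook example, not a cipher proposed in this chapter, and are far too weak for real use.

```python
def lfsr_keystream(state, taps=(16, 14, 13, 11), nbits=16):
    """Toy Fibonacci LFSR (maximal-length tap set for 16 bits).
    Yields one pseudorandom key bit per step. NOT cryptographically secure."""
    mask = (1 << nbits) - 1
    while True:
        bit = 0
        for t in taps:
            bit ^= (state >> (nbits - t)) & 1
        state = ((state >> 1) | (bit << (nbits - 1))) & mask
        yield state & 1

def stream_xor(data: bytes, key: int) -> bytes:
    """Encrypt or decrypt by XORing each plaintext byte with 8 key-stream bits."""
    ks = lfsr_keystream(key)
    out = bytearray()
    for byte in data:
        k = 0
        for i in range(8):
            k |= next(ks) << i
        out.append(byte ^ k)
    return bytes(out)

ciphertext = stream_xor(b"randomness", key=0xACE1)
print(ciphertext, stream_xor(ciphertext, key=0xACE1))  # second call recovers the plaintext
```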

Journal ArticleDOI
TL;DR: In this paper, the authors investigate how long it takes for a random number generator of the form $X_{n+1} \equiv aX_n + b \pmod p$ to become random when $a$ and $b$ are themselves chosen each time by a second, truly random source, and show that for almost all odd values of $p$, $1.02 \log_2 p$ steps are enough.
Abstract: Random number generators often work by recursively computing $X_{n+1} \equiv aX_n + b \pmod p$. Various schemes exist for combining these random number generators. In one scheme, $a$ and $b$ are themselves chosen each time from another generator. Assuming that this second source is truly random, we investigate how long it takes for $X_n$ to become random. For example, if $a = 1$ and $b = 0, 1$, or $-1$ each with probability $\frac{1}{3}$, then $cp^2$ steps are required to achieve randomness. On the other hand, if $a = 2$ and $b = 0, 1$, or $-1$, each with probability $\frac{1}{3}$, then $c \log p \log\log p$ steps always suffice to guarantee randomness, and for infinitely many $p$ are necessary as well, although, in fact, for almost all odd $p$, $1.02 \log_2 p$ steps are enough.
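
A quick empirical look at the two regimes described above: estimate the total-variation distance of $X_n$ from uniform for $a = 1$ versus $a = 2$, with $b$ drawn uniformly from $\{-1, 0, 1\}$. The modulus and trial counts are small, arbitrary choices for illustration only.

```python
import random

def tv_distance_after(steps, a, p, trials=10_000):
    """Monte Carlo estimate of the total-variation distance from uniform of
    X_steps, where X_0 = 0 and X_{n+1} = a*X_n + b (mod p), b in {-1, 0, 1}."""
    counts = [0] * p
    for _ in range(trials):
        x = 0
        for _ in range(steps):
            x = (a * x + random.choice((-1, 0, 1))) % p
        counts[x] += 1
    return 0.5 * sum(abs(c / trials - 1 / p) for c in counts)

p = 101
for steps in (10, 100, 1000):
    print(steps,
          "a=1:", round(tv_distance_after(steps, 1, p), 3),   # mixes slowly (~ c p^2 steps)
          "a=2:", round(tv_distance_after(steps, 2, p), 3))   # mixes in ~ c log p loglog p steps
```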

Journal ArticleDOI
TL;DR: In this paper, the exact probability distribution is calculated of the two-dimensional displacement at time t of a particle that starts at the origin, moves in straight-line paths at constant speed, and changes its direction after exponentially distributed time intervals, where the lengths of the straight-line paths and the turn angles are independent, the angles being uniformly distributed.
Abstract: A calculation is made of the exact probability distribution of the two-dimensional displacement of a particle at time t that starts at the origin, moves in straight-line paths at constant speed, and changes its direction after exponentially distributed time intervals, where the lengths of the straight-line paths and the turn angles are independent, the angles being uniformly distributed. This random walk is the simplest model for the locomotion of microorganisms on surfaces. Its weak convergence to a Wiener process is also shown.
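
A direct simulation of this random walk (constant speed, exponentially distributed run times, uniformly distributed new directions), with arbitrary parameter values; the exact distribution derived in the paper is not reproduced.

```python
import math, random

def position_at(t, speed=1.0, rate=1.0):
    """Position at time t of a particle starting at the origin that moves in
    straight lines at constant speed and picks a fresh uniform direction
    after each Exp(rate)-distributed run."""
    x = y = 0.0
    theta = random.uniform(0.0, 2.0 * math.pi)
    remaining = t
    while remaining > 0.0:
        run = min(random.expovariate(rate), remaining)
        x += speed * run * math.cos(theta)
        y += speed * run * math.sin(theta)
        remaining -= run
        theta = random.uniform(0.0, 2.0 * math.pi)
    return x, y

# Empirical mean squared displacement at t = 10 (diffusive behaviour for large t).
samples = [position_at(10.0) for _ in range(5000)]
print(sum(x * x + y * y for x, y in samples) / len(samples))
```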

Journal ArticleDOI
TL;DR: A review of the attempts to define random sequences can be found in this article, where the authors suggest two theorems: one concerning the number of subsequence selection procedures that transform a random sequence into a random sequence, and the other concerning the relationship between definitions of randomness based on subsequence selection and those based on statistical tests.
Abstract: We review briefly the attempts to define random sequences (§0). These attempts suggest two theorems: one concerning the number of subsequence selection procedures that transform a random sequence into a random sequence (§§1–3 and 5); the other concerning the relationship between definitions of randomness based on subsequence selection and those based on statistical tests (§4).

Proceedings ArticleDOI
01 Jan 1987
TL;DR: Randomness is an important computational resource, and has found application in such diverse computational tasks as combinatorial algorithms, synchronization and deadlock resolution protocols, encrypting data and cryptographic protocols, and so on.
Abstract: Randomness is an important computational resource, and has found application in such diverse computational tasks as combinatorial algorithms, synchronization and deadlock resolution protocols, encrypting data and cryptographic protocols. Blum [Bl] pointed out the fundamental fact that whereas all these applications of randomness assume a source of independent, unbiased bits, the available physical sources of randomness (such as zener diodes) suffer seriously from problems of correlation. A general

Journal ArticleDOI
TL;DR: In this paper, a relative entropy function is proposed as a measure of dependence or conditional dependence for multivariate densities with the same marginal densities, which can be interpreted as an ordering of dependence.
Abstract: The preorder relation of Hardy, Littlewood and Polya (1929), Day (1973) and Chong (1974, 1976) is applied to multivariate probability densities. This preorder, which is called majorization here, can be interpreted as an ordering of randomness. When used to compare multivariate densities with the same marginal densities, it can be interpreted as an ordering of dependence or conditional dependence. Results in Hickey (1983, 1984) and Joe (1985) are generalized. A relative entropy function is proposed as a measure of dependence or conditional dependence for multivariate densities with the same marginals.
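
As a concrete discrete instance of the idea, the sketch below computes the relative entropy between a joint distribution and the product of its marginals (the mutual-information special case); the paper's general majorization framework and its specific entropy functional are not reproduced.

```python
import numpy as np

def dependence_measure(joint):
    """Relative entropy D(joint || product of marginals) for a discrete
    bivariate probability table; zero iff the two variables are independent."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    indep = px * py
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / indep[mask])))

independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])
dependent   = np.array([[0.45, 0.05],
                        [0.05, 0.45]])
print(dependence_measure(independent))  # 0.0: same marginals, no dependence
print(dependence_measure(dependent))    # > 0: same marginals, strong dependence
```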

Journal ArticleDOI
TL;DR: In this article, the authors describe an alternative class of generators, which produce random values by compounding, or intermixing, several different sequences of random values. But these generators are easier to implement, especially on smaller computers, than Marsaglia's recommended generators.
Abstract: Recently, Marsaglia (1984, Proceedings of the Sixteenth Symposium on the Interface) noted a trend toward increasingly more demanding use of random number generators. After observing that, for current uses, generators must pass increasingly more sophisticated tests for randomness in order to provide acceptable results, Marsaglia described a class of “combination” generators and recommended two specific combination generators that passed an extensive battery of stringent tests for randomness. The present article describes an alternative class of generators, which produce random values by compounding, or intermixing, several different sequences of random values. These “compound” generators are easier to implement, especially on smaller computers, than are Marsaglia's recommended generators. Five specific compound generators, two standard generators, and Marsaglia's two combination generators were subjected to several tests for randomness, including Marsaglia's stringent tests. The standard generator...
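
A hedged sketch of the compounding idea: two different congruential sequences are intermixed, one filling a table and the other choosing which entry to emit (a Bays-Durham-style shuffle). The parameter choices are common textbook values, not one of the article's five specific compound generators.

```python
def lcg(seed, a, c, m):
    """Simple linear congruential generator yielding values in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def compound_generator(seed1=12345, seed2=54321, table_size=64):
    """Intermix two LCG streams: stream 1 fills a table, stream 2 selects
    which table entry to emit next."""
    g1 = lcg(seed1, 1103515245, 12345, 2**31)   # classic ANSI-C-style parameters
    g2 = lcg(seed2, 69069, 1, 2**32)            # classic multiplier 69069
    table = [next(g1) for _ in range(table_size)]
    while True:
        j = int(next(g2) * table_size)
        out, table[j] = table[j], next(g1)
        yield out

gen = compound_generator()
print([round(next(gen), 4) for _ in range(5)])
```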

Journal ArticleDOI
TL;DR: In this paper, the authors considered Derrida's generalized random energy model and proved almost sure and Lp convergence of the free energy at any inverse temperature β for an arbitrary number n of hierarchical levels.
Abstract: Derrida's generalized random energy model is considered. Almost sure and Lp convergence of the free energy at any inverse temperature β are proven for an arbitrary number n of hierarchical levels. The explicit form of the free energy is given in the most general case and the limit n → ∞ is discussed.

Journal ArticleDOI
15 Sep 1987-EPL
TL;DR: In this paper, an infinite chain of atoms, where the bond lengths between neighbouring sites take two values, according to a quasi-periodic rule, associated to a circle map with an irrational rotation angle, is considered.
Abstract: We consider an infinite chain of atoms, where the bond lengths between neighbouring sites take two values, according to a quasi-periodic rule, associated to a circle map with an irrational rotation angle. In a particular case, this construction is related to the projection method used to describe quasi-crystals. The structure factor (Fourier spectrum) of the structure is shown, through a combined analytical and numerical analysis, to be "singular continuous", with nontrivial scaling properties. It does not contain any Dirac peak (discrete spectrum), nor a smooth component (absolutely continuous spectrum). This structure, therefore, exhibits a new kind of order, intermediate between quasi-periodicity and randomness.
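
A numerical sketch of such a construction: bond lengths are chosen by whether the fractional part of nα falls below a cut, and the structure factor is evaluated by a direct Fourier sum. The lengths, cut, and rotation angle below are illustrative; depending on these parameters the spectrum can be Bragg-like (quasi-crystalline) or singular continuous, and the sketch only shows how the chain and the Fourier sum are set up.

```python
import math
import numpy as np

def chain_positions(n_atoms, alpha, la=1.0, lb=0.5, cut=None):
    """Atom positions of a chain whose bond lengths (la or lb) are selected by
    a circle map with rotation angle alpha: long bond iff frac(n*alpha) < cut."""
    cut = alpha if cut is None else cut
    bonds = [la if math.modf(n * alpha)[0] < cut else lb for n in range(n_atoms)]
    return np.concatenate(([0.0], np.cumsum(bonds)))

def structure_factor(positions, qs):
    """S(q) = |sum_n exp(i q x_n)|^2 / N for each wavevector q."""
    x = np.asarray(positions)
    return np.array([abs(np.exp(1j * q * x).sum()) ** 2 / len(x) for q in qs])

alpha = (math.sqrt(5.0) - 1.0) / 2.0       # an irrational rotation angle
positions = chain_positions(2000, alpha)
qs = np.linspace(0.1, 20.0, 400)
S = structure_factor(positions, qs)
print("strongest peaks near q =", qs[np.argsort(S)[-3:]])
```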

Journal ArticleDOI
01 Dec 1987
TL;DR: In this article, two methods for the solution of partial differential equations (PDE) for the general case of random-in-time physical parameters are presented, and their application to the solution of unsteady regional groundwater flow equations is illustrated.
Abstract: Two methods for the solution of partial differential equations (PDE) for the general case of random-in-time physical parameters are presented and their application to the solution of unsteady regional groundwater flow equations is illustrated. The first method is the semigroup approach, which directly offers a solution without resorting to “closure approximations” (hierarchy techniques), perturbation techniques, or Monte Carlo simulation techniques. The semigroup approach can also handle the general stochastic problem when randomness also appears in initial conditions, boundary conditions, or forcing terms. The second method is an approximation scheme to obtain the semigroup solution in complex cases and permits the solution of equations with more than one random coefficient.
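
For context, the semigroup representation invoked here is the standard mild-solution (Duhamel) form of an abstract evolution equation; the generic form is shown below, not the paper's specific groundwater flow equations.

```latex
% Abstract evolution problem with generator A and forcing f:
%   du/dt = A\,u(t) + f(t), \qquad u(0) = u_0 .
% The semigroup \{T(t)\}_{t \ge 0} generated by A gives the mild solution
u(t) \;=\; T(t)\,u_0 \;+\; \int_0^t T(t-s)\, f(s)\, \mathrm{d}s .
% When randomness enters through u_0, the boundary data, or the forcing f,
% this representation applies realization by realization; time-dependent
% random operators require the corresponding evolution family in place of T.
```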

Journal ArticleDOI
TL;DR: It is concluded that extremely long maximum-length sequences, with periods of up to 2^9689 − 1, can in fact be used as reliable random-number generators for many purposes.

Journal ArticleDOI
TL;DR: In this paper, a crack-length distribution function is derived in closed form under some assumptions, and a cross effect is shown to appear between the randomness due to loading and that due to propagation resistance; with the aid of this result, it is discussed how the reliability of structural components degrades with time, in consideration of uncertainties associated with initial flaws.

Journal ArticleDOI
TL;DR: The existence of nonuniform schemes is shown for the following sampling problem: given a sample space with n points, an unknown set of size n/2, and s random points, it is possible to generate deterministically from them s + k points such that the probability of not hitting the unknown set is exponentially smaller in k than 2^-s.
Abstract: We show the existence of nonuniform schemes for the following sampling problem: Given a sample space with n points, an unknown set of size n/2, and s random points, it is possible to generate deterministically from them s + k points such that the probability of not hitting the unknown set is exponentially smaller in k than 2^-s. Tight bounds are given for the quality of such schemes. Explicit, uniform versions of these schemes could be used for efficiently reducing the error probability of randomized algorithms. A survey of known constructions (whose quality is very far from the existential result) is included.
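
A small Monte Carlo check of the baseline figure quoted above: s independent uniform points all miss a set covering half of an n-point sample space with probability 2^-s. The paper's deterministic schemes for stretching s random points into s + k points are existential and are not reproduced here.

```python
import random

def miss_probability(n, s, trials=200_000):
    """Empirical probability that s uniform points all avoid a fixed
    'unknown' set containing half of an n-point sample space."""
    bad = set(range(n // 2))                     # the unknown set, size n/2
    misses = 0
    for _ in range(trials):
        if all(random.randrange(n) not in bad for _ in range(s)):
            misses += 1
    return misses / trials

for s in (1, 2, 4, 8):
    print(s, miss_probability(1000, s), "vs 2^-s =", 2.0 ** -s)
```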

Journal ArticleDOI
TL;DR: In this article, the use of a building thermal analysis methodology in which the stochastic nature of the forcing functions is considered is discussed; this method provides a rational and convenient way of handling uncertainty in the analysis and design of thermal systems.


Journal ArticleDOI
TL;DR: In this article, the fractal dimension of the support of the invariant measure is calculated in a simple approximation, and its dependence on the physical parameters of the Ising model is discussed.
Abstract: Previous results relating the one-dimensional random field Ising model to a discrete stochastic mapping are generalized to a two-valued correlated random (Markovian) field and to the case of zero temperature. The fractal dimension of the support of the invariant measure is calculated in a simple approximation and its dependence on the physical parameters is discussed.
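
One standard form of the discrete stochastic mapping for the one-dimensional random-field Ising chain is the effective-field recursion sketched below, here with a symmetric two-valued uncorrelated field at finite temperature; the paper's correlated Markovian field, zero-temperature limit, and exact parametrization are not reproduced.

```python
import math, random

def effective_field_samples(beta, J, h0, n_steps=100_000, burn_in=1_000):
    """Iterate B_{n+1} = h_{n+1} + (1/beta) * artanh(tanh(beta*J) * tanh(beta*B_n))
    with h_{n+1} = +/- h0 equally likely, and collect samples of the
    invariant measure of the effective field B."""
    tJ = math.tanh(beta * J)
    B, samples = 0.0, []
    for n in range(n_steps):
        h = h0 if random.random() < 0.5 else -h0
        B = h + math.atanh(tJ * math.tanh(beta * B)) / beta
        if n >= burn_in:
            samples.append(B)
    return samples

samples = effective_field_samples(beta=2.0, J=1.0, h0=0.7)
print("support of the invariant measure roughly [", min(samples), ",", max(samples), "]")
```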

Journal ArticleDOI
TL;DR: In this article, a density parameter λ and a randomness parameter η are used to generate a point distribution about a regular lattice base, where the density parameter varies between P_λ = e^(−λ) λ^λ / λ! and ∞.

01 Jan 1987
TL;DR: In the Probabilistic Finite Element Method (PFEM) as mentioned in this paper, finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem.
Abstract: In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress modeled as a random field.
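
A toy illustration of the mean/variance bookkeeping that second-order perturbation provides, for a single random input (a hypothetical Young's modulus entering a one-element bar); the PFEM machinery for large systems, random fields, and cyclic plasticity is not reproduced.

```python
# Response of a bar under axial load: u(E) = F*L / (E*A), with E random.
F, L, A = 10.0e3, 2.0, 1.0e-3            # hypothetical load [N], length [m], area [m^2]
mu_E, var_E = 200.0e9, (20.0e9) ** 2     # hypothetical mean and variance of E [Pa]

def u(E):        # response
    return F * L / (E * A)

def du_dE(E):    # first derivative of the response
    return -F * L / (E ** 2 * A)

def d2u_dE2(E):  # second derivative of the response
    return 2.0 * F * L / (E ** 3 * A)

mean_u = u(mu_E) + 0.5 * d2u_dE2(mu_E) * var_E   # second-order estimate of the mean
var_u = du_dE(mu_E) ** 2 * var_E                 # first-order estimate of the variance

print("mean response [m]:", mean_u)
print("std of response [m]:", var_u ** 0.5)
```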

Journal ArticleDOI
TL;DR: It will be shown how the model parameter determining the mode of growth can be estimated with the maximum likelihood procedure from observed data, and a notion of complete partition randomness is presented as an alternative to the growth hypotheses.