
Showing papers on "Randomness published in 1976"


Journal ArticleDOI
TL;DR: A new approach to the problem of evaluating the complexity ("randomness") of finite sequences is presented, related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated.
Abstract: A new approach to the problem of evaluating the complexity ("randomness") of finite sequences is presented. The proposed complexity measure is related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated. It is further related to the number of distinct substrings and the rate of their occurrence along the sequence. The derived properties of the proposed measure are discussed and motivated in conjunction with other well-established complexity criteria.
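The substring-based idea can be illustrated with a short sketch. This is a simplified Lempel-Ziv-style phrase count in the spirit of the measure described, not the paper's exact self-delimiting production complexity:

```python
def lz_complexity(s: str) -> int:
    """Count phrases in a Lempel-Ziv-style parse of s.

    A new phrase ends as soon as the current substring has not
    occurred earlier in the sequence (overlap allowed), so highly
    repetitive sequences yield few phrases and "random" ones many.
    """
    i, phrases, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the phrase while s[i:i+k] already occurs in s[:i+k-1]
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        phrases += 1
        i += k
    return phrases
```

On the classic example, lz_complexity("0001101001000101") parses into the 6 phrases 0, 001, 10, 100, 1000, 101, while the periodic "01" * 8 yields only 3 phrases.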

2,473 citations


Journal ArticleDOI
TL;DR: In this paper, various distance-based methods of testing for randomness in a population of spatially distributed events are described, with special emphasis placed upon preliminary analysis in which the complete enumeration of the events within the study area is not available.
Abstract: Various distance-based methods of testing for randomness in a population of spatially distributed events are described. Special emphasis is placed upon preliminary analysis in which the complete enumeration of the events within the study area is not available. Analytical progress in assessing the power of the techniques against extremes of aggregation and regularity is reviewed and the results obtained from the Monte Carlo simulation of more realistic processes are presented. It is maintained that the method of T-square sampling can help to provide quick and informative results and is especially suited to large populations. Some comments on contiguous quadrat methods are made.
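T-square sampling can be sketched as follows. The half-plane search and the index below follow one common formulation (Besag-Gleaves style); the point coordinates are hypothetical and the routine is an illustration, not the paper's own procedure:

```python
import random

def t_square_index(origins, events):
    """T-square index of spatial randomness (hedged sketch).

    For each sampling origin O: x is the distance to the nearest
    event P; z is the distance from P to its nearest neighbour E
    restricted to the half-plane beyond P (angle OPE >= 90 degrees).
    Under complete spatial randomness the index is close to 1/2;
    aggregation pushes it up, regularity pushes it down.
    """
    sx2 = sz2 = 0.0
    for ox, oy in origins:
        px, py = min(events, key=lambda e: (e[0] - ox) ** 2 + (e[1] - oy) ** 2)
        x2 = (px - ox) ** 2 + (py - oy) ** 2
        z2 = None
        for ex, ey in events:
            if (ex, ey) == (px, py):
                continue
            # half-plane condition: E lies on the far side of P from O
            if (ox - px) * (ex - px) + (oy - py) * (ey - py) <= 0:
                d2 = (ex - px) ** 2 + (ey - py) ** 2
                z2 = d2 if z2 is None or d2 < z2 else z2
        if z2 is not None:
            sx2 += x2
            sz2 += z2
    return sx2 / (sx2 + 0.5 * sz2)

rng = random.Random(42)
events = [(rng.random(), rng.random()) for _ in range(300)]
origins = [(rng.random(), rng.random()) for _ in range(60)]
idx = t_square_index(origins, events)  # near 0.5 for a random pattern
```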

235 citations


Journal ArticleDOI
Milan Zeleny1
TL;DR: It is suggested here that the traditional compensatory multi-attribute model rests on a number of significant conceptual simplifications; a new model is presented, designed to explain and predict changes in individual choice behavior under varying conditions of choice.
Abstract: The traditional compensatory multi-attribute model persists in being inconsistent with empirical findings. It is suggested here that the model is based on a number of significant conceptual simplifications. Although the individual choice behavior is partially characterized by randomness, the model attempts to reveal the underlying deterministic rationale of the choice behavior, abstracted from random fluctuations. If such a model is conceptually false then no stochastic refinement can validate it, although some statistical tests might be improved slightly. We need a different kind of deterministic choice rationale, a new paradigm, capable of explaining and predicting those changes, intransitivities and inconsistencies of the individual choice behavior for which we currently have only a stochastic explanation. Such a methodology can then be further refined via stochastic extension. The model presented here is designed to explain and predict changes in individual choice behavior under varying conditions of choice.

79 citations


Journal ArticleDOI
TL;DR: A procedure is proposed for studying the hypothesis that a sequence of dichotomous random variables are independent and identically (randomly) distributed, and an application to testing for Schwann cell disease is discussed.
Abstract: A procedure is proposed for studying the hypothesis that a sequence of dichotomous random variables is independent and identically (randomly) distributed. The procedure is relatively sensitive to departures from randomness involving multiple clustering. An application to testing for Schwann cell disease is discussed. The distribution of the sample coefficient of variation from an exponential distribution, a useful statistic in testing randomness for a continuous model, is also studied.
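The coefficient-of-variation statistic for the continuous model can be sketched in a few lines. For exponential inter-event times (complete randomness) the CV is 1, so departures from 1 signal clustering or regularity; the sample below is synthetic:

```python
import math, random

def coefficient_of_variation(intervals):
    """Sample coefficient of variation (std / mean) of inter-event times.

    For an exponential distribution the CV equals 1; clustering
    inflates the sample CV and regularity deflates it, so it serves
    as a simple test statistic for randomness in a continuous model.
    """
    n = len(intervals)
    m = sum(intervals) / n
    var = sum((x - m) ** 2 for x in intervals) / (n - 1)
    return math.sqrt(var) / m

rng = random.Random(7)
exp_sample = [rng.expovariate(1.0) for _ in range(2000)]
cv = coefficient_of_variation(exp_sample)  # should be near 1
```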

73 citations


Journal ArticleDOI
TL;DR: In this article, the behavior of a simple aquatic ecosystem with algae, bacteria, Daphnia, detritus, and usable dissolved organic carbon as its components was studied using random differential equation models.
Abstract: The behavior of a simple aquatic ecosystem with algae, bacteria, Daphnia, detritus, and usable dissolved organic carbon as its components was studied using random differential equation models. The randomness in the system has been introduced through initial conditions, input variables, and parameters. The computer simulation results show that the deterministic solution and the stochastic mean are different. Also, there was some degree of increased “stability” in the stochastic system. A large number of numerical solutions generated by the Monte Carlo procedure give a range of biomass values which is very similar to the range of values obtained from field measurements. These models enable us to make predictions in terms of means and associated variances and thus are more useful than deterministic models for applied problems in ecology and resource management.
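As a toy stand-in for the five-compartment ecosystem (the paper's actual equations are not given here), a logistic equation with random initial condition and growth rate shows the abstract's central point, that the stochastic mean differs from the deterministic solution; all coefficients are hypothetical:

```python
import random

def euler_logistic(x0, r, k=1.0, dt=0.01, steps=500):
    # Forward-Euler integration of dx/dt = r * x * (1 - x/k)
    x = x0
    for _ in range(steps):
        x += dt * r * x * (1 - x / k)
    return x

rng = random.Random(1)
deterministic = euler_logistic(0.1, 1.0)          # mean parameters
runs = [euler_logistic(rng.uniform(0.05, 0.15),   # random initial condition
                       rng.uniform(0.7, 1.3))     # random growth rate
        for _ in range(2000)]
stochastic_mean = sum(runs) / len(runs)
# The stochastic mean falls below the deterministic solution here
# because the solution responds concavely to x0 and r near saturation.
```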

69 citations


Journal ArticleDOI
TL;DR: A central limit theorem for exchangeably dissociated random variables is proved and some remarks on the closeness of the normal approximation are made in this paper, where the weak convergence of the empirical distribution process to a Gaussian process is proved.
Abstract: Families of exchangeably dissociated random variables are defined and discussed. These include families of the form g(Y_{i1}, Y_{i2}, ..., Y_{im}) for some function g of m arguments and some sequence {Y_i} of i.i.d. random variables on any suitable space. A central limit theorem for exchangeably dissociated random variables is proved and some remarks on the closeness of the normal approximation are made. The weak convergence of the empirical distribution process to a Gaussian process is proved. Some applications to data analysis are discussed. CENTRAL LIMIT THEOREM; DISTANCE DISTRIBUTION; SIMILARITY MEASURE; TEST OF RANDOMNESS; TEST OF CLUSTERING; CLUSTER ANALYSIS; DEPENDENT RANDOM VARIABLES; WEAK CONVERGENCE; EMPIRICAL DISTRIBUTION FUNCTION; GAUSSIAN PROCESS; GRAPH COLOURING

65 citations


Journal ArticleDOI
TL;DR: The concept of Markov chains, applied to stratigraphic sections, is reliable in analyzing cyclic patterns in lithologic successions, and entropy for the whole system of sedimentation is introduced to discuss variability of the condition in the depositional processes.
Abstract: The concept of Markov chains, applied to stratigraphic sections, is reliable in analyzing cyclic patterns in lithologic successions. Randomness in the occurrence of lithologies repeating in a succession is evaluated generally in terms of entropies which can be calculated for the Markov matrix equated with the succession. Two types of entropies pertain to every state; one is relevant to the Markov matrix expressing the upward transitions, and the other, relevant to the matrix expressing the downward transitions. The latter and the former with respect to a certain state, making an entropy set, correspond to the degree of randomness in its linking with the others which occur as the precursor and the successor, respectively. It is obvious that the entropy sets which are calculated for all state variables serve as a reliable criterion in the discrimination of cyclic pattern of the succession. We are able based on the entropy sets to classify the various patterns into asymmetric, symmetric, and random cycles, which are exhibited also in actual lithologic successions. The entropy sets are calculated for Markov matrices which have been reported from a number of areas in the world, and compared with the cyclic patterns supposed there. Entropy for the whole system of sedimentation also is introduced to discuss variability of the condition in the depositional processes.
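The per-state entropy calculation for upward transitions can be sketched as follows; the state labels and the short successions in the examples are hypothetical:

```python
import math
from collections import Counter, defaultdict

def upward_entropies(succession):
    """Entropy of each row of the upward-transition Markov matrix.

    succession is a list of lithologic states read upward through a
    section.  Zero entropy means a state is always followed by the
    same successor (deterministic cyclicity); the maximum, log2 of
    the number of successors, means the linkage is fully random.
    """
    counts = defaultdict(Counter)
    for a, b in zip(succession, succession[1:]):
        counts[a][b] += 1
    ent = {}
    for state, row in counts.items():
        total = sum(row.values())
        ent[state] = -sum((c / total) * math.log2(c / total)
                          for c in row.values())
    return ent
```

For the ideal asymmetric cycle A, B, C, A, B, C, ... every upward entropy is 0 bits, while a state that splits evenly between two successors has an upward entropy of 1 bit.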

62 citations


Journal ArticleDOI
TL;DR: In this article, an exactly soluble model for a spin-glass phase transition is presented, which is essentially a mean-field theory; but instead of "bond" randomness of the exchange interaction (as used by other authors), a site randomness is assumed.
Abstract: An exactly soluble model for a spin-glass phase transition is presented. It is essentially a mean-field theory; but instead of "bond" randomness of the exchange interaction (as used by other authors), a "site" randomness is assumed. This enables one to calculate the "quenched" free energy without any uncertain mathematical procedures. Typical results are given for a variety of interesting cases.

52 citations


Journal ArticleDOI
TL;DR: In this article, the cumulative probabilities of the minimum number of edges needed to connect a random graph with n vertices and v edges were derived for n = 10(1)30(5)80(10)100.
Abstract: Statistics based on a theory of random graphs have been proposed as an analytic aid to assess the randomness of a clustered structure. Probability tables for two such statistics are tabulated. Exact values of Pn,v, the cumulative probabilities of the minimum number of edges needed to connect a random graph, are tabulated for n = 10(1)30(5)80(10)100. Exact and approximate values of En,v, the expected number of components in a random graph with n vertices and v edges, are tabulated for n = 10(1)30(5)100.
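The expected component count En,v can also be estimated by direct simulation, a useful cross-check on such tables (a sketch, not the authors' exact tabulation method):

```python
import random
from itertools import combinations

def mean_components(n, v, trials=200, seed=0):
    """Monte Carlo estimate of the expected number of connected
    components in a random graph with n labelled vertices and v
    edges chosen uniformly without replacement."""
    rng = random.Random(seed)
    all_edges = list(combinations(range(n), 2))
    total = 0
    for _ in range(trials):
        parent = list(range(n))           # union-find forest

        def find(a):
            while parent[a] != a:
                parent[a] = parent[parent[a]]   # path halving
                a = parent[a]
            return a

        for a, b in rng.sample(all_edges, v):
            parent[find(a)] = find(b)     # union the endpoints
        total += sum(1 for i in range(n) if find(i) == i)
    return total / trials
```

With no edges every vertex is its own component; with all n(n-1)/2 edges present there is exactly one.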

45 citations


Journal ArticleDOI
TL;DR: A stochastic computational method was developed to study properties of cross-bridge models for muscle contraction by following the time history of individual cross-bridges, based on the cross-bridge model of Andrew Huxley (1957).

45 citations


Journal ArticleDOI
TL;DR: A new technique suitable for the detection of randomness in patterns of plant individuals, based on the properties of the Simplicial graph, is described and shown to be a superior method of analysis to those currently used.
Abstract: A new technique suitable for the detection of randomness in patterns of plant individuals, based on the properties of the Simplicial graph, is described and shown to be a superior method of analysis to those currently used.

Posted ContentDOI
TL;DR: In this article, the statistical properties of daily closing futures prices for nine commodities are studied and two hypotheses are examined: price changes are normally distributed, and prices follow a random walk process.
Abstract: The statistical properties of daily closing futures prices for nine commodities are studied. Two hypotheses are examined: Price changes are normally distributed, and prices follow a random walk process. Normality is tested by estimating kurtosis, the R/S statistic, and characteristic exponents. The Gaussian hypothesis is rejected in a large proportion of cases. Randomness is tested by using the turning point test and the phase length test. Both tests reject the random walk hypothesis.
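The turning point test has a simple closed form for its null mean and variance under an i.i.d. series, so it is easy to sketch (the series in the example are synthetic, not the commodity data):

```python
import math

def turning_point_test(series):
    """Turning point test for randomness of a series.

    T counts interior points that are strict local maxima or minima.
    For an i.i.d. series, E[T] = 2(n-2)/3 and Var[T] = (16n-29)/90,
    so the standardized z is approximately N(0,1) for large n; a
    large |z| is evidence against randomness.
    """
    n = len(series)
    t = sum(1 for i in range(1, n - 1)
            if series[i - 1] < series[i] > series[i + 1]
            or series[i - 1] > series[i] < series[i + 1])
    mean = 2 * (n - 2) / 3
    var = (16 * n - 29) / 90
    return t, (t - mean) / math.sqrt(var)
```

A trending series has too few turning points (z well below 0); a mean-reverting, saw-toothed series has too many (z well above 0).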

Journal ArticleDOI
TL;DR: A number of quantitative tests have been used recently to seek for randomness in sequences of commodity futures price changes, such as spectral analysis, serial correlation coefficients and runs tests as discussed by the authors.
Abstract: THIS NOTE CONSIDERS the underlying logic and method of operation of a number of quantitative tests which have been used recently to seek for randomness in sequences of commodity futures price changes. In the absence of any experimental design, and through faulty analysis of the results, randomness in the price changes could be rejected when it would in fact seem to hold rather well. The methods used have been filter tests, spectral analysis, serial correlation coefficients and runs tests. These are all complementary, as filters seek non-linear "financial-type" dependence, spectra give a non-parametric frequency domain approach, and serial correlations and runs are parametric and non-parametric tests in the time domain. Theoretical evidence for independence has been given by Samuelson [5], who proved that properly anticipated prices fluctuate randomly. Economically, it makes good sense to consider the commodity futures market as an example of an efficient capital market (Fama [2]), where prices fully reflect all available information. The link between these two markets has been discussed by Dusak [1], who argues that the appropriate asset is the discounted value of the spot commodity. It is necessary to use a test procedure which enables us to distinguish between the null hypothesis (independence) and the alternative hypothesis (dependence); if this is not done, it is logically impossible to distinguish between these two alternatives. There is no problem in using serial correlation coefficients and runs tests, as their sampling distributions are well approximated by the normal distribution for larger samples; significant values are used as evidence of dependence. However, the situation is much more complex for both filters and spectra. Returns from using filter rules are conventionally compared with returns from a buy-and-hold investment strategy, because otherwise there is no way of measuring dependence or market inefficiency.
If there is a positive return, then without a benchmark like buy-and-hold, there is no way of telling whether it was due to a successful filter strategy or to an upward-drifting random walk. The returns are then often averaged over filter sizes, different contracts or years. In spite of this averaging, some measure of dispersion is still needed to say exactly how much better the filter is performing in absolute terms relative to buy-and-hold. If no measure of dispersion is presented, then we do not know whether we have differed significantly from buy-and-hold, the null hypothesis of independence. Due to the common "market" factor across filter sizes and contracts, conventional dispersion measures may be biased. Also, no formal proof seems to exist of the equality of returns from filters and buy-and-hold under the assumption of independent price changes. To make the use of filters inferentially sound, the
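The comparison the note calls for, filter returns measured against a buy-and-hold benchmark, can be sketched as follows; the filter size and the price path are hypothetical, and the long/flat variant here is only one common form of the rule:

```python
def filter_rule_return(prices, f=0.05):
    """Return of a long/flat x% filter rule (a hedged sketch).

    Buy once the price has risen by a fraction f above the lowest
    point since the last sale; sell once it has fallen f below the
    highest point since the last purchase."""
    pos, entry, extreme, growth = 0, 0.0, prices[0], 1.0
    for p in prices[1:]:
        if pos == 0:
            extreme = min(extreme, p)          # track the trough
            if p >= extreme * (1 + f):
                pos, entry, extreme = 1, p, p  # buy
        else:
            extreme = max(extreme, p)          # track the peak
            if p <= extreme * (1 - f):
                growth *= p / entry            # sell
                pos, extreme = 0, p
    if pos:
        growth *= prices[-1] / entry           # close out at the end
    return growth - 1.0

def buy_and_hold_return(prices):
    return prices[-1] / prices[0] - 1.0
```

On an upward-drifting path the filter lags buy-and-hold simply because it enters late, which is exactly why a benchmark and a dispersion measure are needed before claiming dependence.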

Journal ArticleDOI
TL;DR: In this article, a theory of dynamical electron-electron correlation in a random A_x B_{1-x} alloy is developed as an extension of the Kanamori model: two electrons in a narrow band, interacting via a randomly varying contact pair interaction, are considered.

Journal ArticleDOI
TL;DR: In this paper, conditions for quasifixation are investigated and observed frequencies of the gene medionigra of Panaxia dominula are considered in relation to the model of random selection without dominance.
Abstract: The equations satisfied by the gene frequency for various cases of dominance are converted to the type of stochastic differential equation associated with a diffusion process. Using the physical approach to stochastic integrals, the solutions of the corresponding Fokker-Planck equations are obtained. Conditions for quasifixation are investigated and observed frequencies of the gene medionigra of Panaxia dominula are considered in relation to the model of random selection without dominance. Stochastic changes in the gene frequencies of populations are attributable to random sampling of gametes or random fluctuations in the pressures of selection and migration. Wright [1] pointed out that in real populations the relative importance of these various sources of randomness may be quite different for different genes. In this work we wish to study the effects of random selection on gene frequency. In order to do this we will assume that the population size is large enough to make the effects of random sampling of gametes negligible. We will also suppose that the population considered is sufficiently isolated from others of the same species that the effects of migration, whether deterministic or stochastic, may be ignored. Thus the only source of variability will be the fluctuations in the relative fitnesses of the genotypes. Furthermore, because the population size is assumed to be large, the gene frequencies may be studied through their continuous approximations. The first study of the temporal evolution of gene frequency distributions in the case of randomly varying selection was performed by Kimura [2]. If X(t) denotes an allele frequency at time t, Kimura found an analytic
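A diffusion of this kind can be simulated with an Euler-Maruyama step. The drift and noise terms below are a sketch in the spirit of random selection without dominance, and every coefficient is a hypothetical choice, not taken from the paper:

```python
import math, random

def gene_frequency_path(p0, s_mean, s_sd, dt=0.01, steps=1000, seed=0):
    """Euler-Maruyama sketch of gene frequency under random selection:
        dp = s_mean * p(1-p) dt + s_sd * p(1-p) dW
    (no dominance; gamete sampling and migration neglected, as in
    the assumptions quoted above)."""
    rng = random.Random(seed)
    p = p0
    for _ in range(steps):
        p += (s_mean * p * (1 - p) * dt
              + s_sd * p * (1 - p) * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        p = min(max(p, 0.0), 1.0)   # frequencies stay in [0, 1]
    return p
```

With s_sd = 0 this reduces to the deterministic logistic change of gene frequency; with s_mean = 0 the frequency wanders and can linger near 0 or 1, the quasifixation behaviour the abstract mentions.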

Journal ArticleDOI
TL;DR: In this article, a phase transition analysis of the two-dimensional random antiferromagnet Rb2Mn0.5Ni 0.5F4 was performed and the system was found to exhibit a well-defined phase transition with critical exponents identical to those of the isomorphous pure materials K2NiF4 and K2mnF4.
Abstract: A neutron scattering study of the order parameter, correlation length and staggered susceptibility of the two-dimensional random antiferromagnet Rb2Mn0.5Ni0.5F4 is reported. The system is found to exhibit a well-defined phase transition with critical exponents identical to those of the isomorphous pure materials K2NiF4 and K2MnF4. Thus, in these systems, which have the asymptotic critical behaviour of the two-dimensional Ising model, randomness has no measurable effect on the phase-transition behaviour.

Journal ArticleDOI
TL;DR: In this paper, a random wave simulation system, which makes it possible to generate random waves having statistically the same properties as those of sea waves, is proposed, and the characteristics of this system are demonstrated experimentally through several cases of random wave simulation.

Journal ArticleDOI
TL;DR: A memory network designed to learn sequences of inputs separated by various time intervals and to repeat these sequences when cued by their initial portions is considered, illustrating the model's operating characteristics and memory capacity.
Abstract: Models of circuit action in the mammalian hippocampus have led us to a study of habituation circuits. In order to help model the process of habituation we consider here a memory network designed to learn sequences of inputs separated by various time intervals and to repeat these sequences when cued by their initial portions. The structure of the memory is based on the anatomy of the dentate gyrus region of the mammalian hippocampus. The model consists of a number of arrays of cells called lamellae. Each array consists of four lines of model cells coupled uniformly to neighbors within the array and with some randomness to cells in other lamellae. All model cells operate according to first-order differential equations. Two of the lines of cells in each lamella are coupled such that sufficient excitation by a system input generates a wave of activity that travels down the lamella. Such waves effect dynamic storage of the representation of each input, allowing association connections to form that code both the set of cells stimulated by each input and the time interval between successive inputs. Results of simulation of two networks are presented illustrating the model's operating characteristics and memory capacity.

Journal ArticleDOI
TL;DR: In this article, an exact kinetic equation is derived for the spectral density function in the case of wave propagation in a nondispersive medium characterized by large-scale space-time fluctuations, and a quantity called the degree of coherence function is defined as a quantitative measure of the irreversible effects of randomness.
Abstract: Within the framework of the quasioptical description and the pure Markovian random process approximation, an exact kinetic equation is derived for the spectral density function in the case of wave propagation in a nondispersive medium characterized by large‐scale space–time fluctuations. Also, a quantity, called the degree of coherence function, is defined as a quantitative measure of the irreversible effects of randomness.

Journal ArticleDOI
TL;DR: A method for measuring areas using Visual Scene Analysis techniques is described, and a comparison made with a point counting approach, in the light of different randomness criteria.
Abstract: A method for measuring areas using Visual Scene Analysis techniques is described, and a comparison made with a point counting approach. The theoretical accuracy of such methods is discussed with reference to a previously published analysis, in the light of different randomness criteria.

Journal ArticleDOI
TL;DR: In this paper, the authors present a theory for treating randomly diluted antiferromagnets with uniaxial anisotropy in the presence of an arbitrary external field along the symmetry axis.
Abstract: The authors present a theory for treating randomly diluted antiferromagnets with uniaxial anisotropy in the presence of an arbitrary external field along the symmetry axis. In particular, they examine the behaviour of the interface between the antiferromagnetic and the spin-flop phases as a function of the dilution. It is found that the dependence of the critical field on magnetic concentration is steeper than indicated by virtual-crystal-mean-field arguments. Moreover, this dependence is found to be a strong function of the relative size of the anisotropy. The present theory uses a coherent potential ansatz for treating randomness in the exchange interactions but ignores dynamical effects occasioned by fluctuations in the local anisotropy field. Thus, its relevance would be limited to systems where the average anisotropy, D(2S-1), acting on a magnetic ion is substantially smaller than the average exchange, 2SzJ, experienced by the same ion.


Journal Article
TL;DR: Some fundamentals of probability theory that form the building blocks of stochastic processes are presented and the concept of chance as it applies to dice or cards is discussed.
Abstract: The components of a pavement system, its loadings and responses, its constitutive materials, and conditions of weather vary in time and location in a random manner. Mathematical models of such systems are known as stochastic processes. This paper presents some fundamentals of probability theory that form the building blocks of such processes. Specific topics treated are deterministic and stochastic systems, randomness and probability, tree diagrams, permutations and combinations, conditional probabilities, independence, and Bayes' theorem. Examples are presented to demonstrate the use of the concepts relative to factors entering the analysis, design, construction, and proofing of pavement systems. The concept of chance as it applies to dice or cards is discussed. In this paper a collection of tools is described, and their use is demonstrated.
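Bayes' theorem, one of the building blocks listed, can be sketched in a pavement-inspection flavour; all the probabilities in the example are hypothetical:

```python
def posterior_defect(prior, hit_rate, false_alarm_rate):
    """Bayes' theorem: P(defect | alarm) from the prior P(defect),
    the hit rate P(alarm | defect) and the false-alarm rate
    P(alarm | no defect)."""
    num = prior * hit_rate
    return num / (num + (1.0 - prior) * false_alarm_rate)
```

If 10% of pavement sections are defective and an inspection flags 90% of defective sections but also 20% of sound ones, a flagged section is defective with probability only 1/3, a classic illustration of why the prior matters.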

ReportDOI
01 Jan 1976
TL;DR: In this article, the authors used the wave equation to connect the statistical properties of the random medium to the implied statistical properties for the wave parameters within the framework of a correlation theory, thus restricting the realm of validity to high frequencies and small refractive index fluctuations.
Abstract: The observation fundamental to this work is that the ocean is usually in a state of turbulent motion. Correspondingly, the value of the temperature at every point in the ocean undergoes irregular fluctuations. In particular, since the index of refraction of the ocean is a function of temperature, we shall take the viewpoint that the refractive index is random and assume that the Kolmogorov theory of locally homogeneous and isotropic turbulence provides a sufficiently good description of the refractive index microstructure. To extract information concerning the randomness of an acoustic wave propagating through this turbulent and unbounded ocean, we make use of the wave equation to connect the statistical properties of the random medium to the implied statistical properties of the wave parameters within the framework of a correlation theory. We accomplish this only to first order in perturbation theory, thus restricting the realm of validity of our results to high frequencies and small refractive index fluctuations. The structure function of the logarithmic amplitude that we find generalizes similar results of Tatarski and Chernov beyond the transversal and, correspondingly, longitudinal restrictions inherent in their work.

Journal ArticleDOI
TL;DR: In this paper, the probability of collapse of plates of random plastic moment is considered within the yield-line theory, where the unit yield moments in the hinges are assumed statistically independent and a weight factor is introduced to account for combined stress states in a hinge line.

Journal ArticleDOI
TL;DR: In this paper, a methodology is presented for incorporating into the design decision-making process explicit consideration of the possible future performance of the designed structure, in the presence of uncertainty in both the loading and the structural parameters assumed in the design process.

Journal ArticleDOI
TL;DR: Complexity measures of finite sequences of symbols, based on finite automata, are proposed: A-complexity is defined and characterized using ultimately periodic sequences, and a refined measure, F-complexity, is introduced, for which highly random sequences have large F-complexities although the converse is not always true.
Abstract: Here proposed are complexity measures of finite sequences of symbols, based on finite automata. Basic properties of these measures are demonstrated. The relation between the complexity for generating a sequence and the randomness of the generated sequence is also discussed. First, the notion of A-complexity is defined and characterized using ultimately periodic sequences (Theorem 1). A refined measure, F-complexity, is then introduced. It is shown that highly random sequences have large F-complexities (Theorem 2), but the converse is not always true (Theorem 3). Finally, the c-complexity is proposed to remedy this shortcoming of F-complexity. It includes as special cases both A-complexity and F-complexity. It is shown that certain sequences with high c-complexities, complete periodic sequences, are equidistributed (Theorem 4).

Journal ArticleDOI
TL;DR: In this paper, a Monte Carlo simulation method for fatigue failure is presented, by which the randomness of two material properties as well as that of the applied load can be incorporated into a stochastic model using an appropriate failure criterion to predict the statistical characteristics of fatigue life under constant and random amplitude cyclic loading conditions.
Abstract: This paper presents a Monte Carlo simulation method for fatigue failure, by which the randomness of two material properties as well as that of the applied load can be incorporated into a stochastic model using an appropriate failure criterion to predict the statistical characteristics of fatigue life under constant and random amplitude cyclic loading conditions. In this technique, both the endurance limit S_e and the fatigue strength coefficient S_f are treated as stochastic variables. The combined effect of the randomness of S_e, S_f, and the applied stress on the statistical characteristics of fatigue lives is predicted analytically using digital simulation of fatigue tests. The life distributions and their statistical characteristics are found to be in good agreement with those obtained from analyzing the experimental results, indicating that the proposed technique and the underlying assumptions and hypotheses are adequate. The suggested method is believed to be an effective, fast, and easy-to-use design tool which is suitable for use on electronic computers. It is ideal for parametric studies compared with the costly and time-consuming laboratory fatigue tests. Minimum experimental data are needed as a basis for the analysis. New results are presented which show the effect of the randomness of the loads and material properties on the randomness of fatigue life distribution.
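The technique can be sketched as follows. The Basquin-type S-N law and every numerical value below are illustrative assumptions, not the paper's data or failure criterion:

```python
import random, statistics

def simulate_fatigue_lives(stress, n_sim=2000, b=-0.1, seed=0):
    """Monte Carlo sketch of fatigue-life scatter (hypothetical numbers).

    A Basquin-type S-N law S = S_f * N**b gives N = (S / S_f)**(1/b);
    the fatigue strength coefficient S_f and endurance limit S_e are
    drawn at random for each simulated specimen, and stresses at or
    below the sampled S_e are treated as run-outs (no failure)."""
    rng = random.Random(seed)
    lives = []
    for _ in range(n_sim):
        sf = rng.gauss(900.0, 50.0)   # MPa, illustrative
        se = rng.gauss(250.0, 20.0)   # MPa, illustrative
        if stress <= se:
            continue                  # run-out: no predicted failure
        lives.append((stress / sf) ** (1.0 / b))
    return lives

lives = simulate_fatigue_lives(400.0)
# randomness in S_f alone already spreads the predicted lives
spread = statistics.stdev(lives) / statistics.mean(lives)
```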

Journal ArticleDOI
TL;DR: There exists no known random sequence since each test for randomness rejects sequences generatable "at random" and hence any sequence accepted on such a basis is from a stratified space and is not random.
Abstract: There exists no known random sequence, since each test for randomness rejects sequences generatable "at random"; hence any sequence accepted on such a basis is drawn from a stratified space and is not random. A property (randomness) of a sequence which disappears on testing is not a property of the sequence. Hence there is either no random sequence or all sequences are random. This situation is not avoidable.

Journal ArticleDOI
TL;DR: In this article, the second-moment structure of a weakly stationary point process is determined by its variance-time curve V(t), and a suitably normalized version of V*(t) is shown to converge weakly to a non-stationary Gaussian random function T1(t) in the case of a Poisson process.
Abstract: It is well known that the second-moment structure of a weakly stationary point process is determined by its variance-time curve V(t). An estimator, V*(t), of the variance-time curve is proposed. A suitably normalized version of V*(t) is shown to converge weakly to a non-stationary Gaussian random function T1(t) in the case of a Poisson process. The normality of the finite-dimensional distributions of T1(t) leads to a multivariate test of randomness based upon the index of dispersion of the underlying process. Two functionals of T1(t), ζ1 = ∫₀¹ T1(t) dt and ζ2 = ∫₀¹ |T1(t)| dt, are also considered as tests of randomness. The first is easily seen to be normal, and the characteristic function of the second is derived. Approximate percentage points of the distribution function of ζ2 have been computed and are seen to compare favorably with those obtained from a Monte Carlo simulation. Finally, all three test statistics appear to perform well when applied to real and simulated data, and each can be generalized to higher dimensions.
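The index of dispersion underlying the multivariate test can be computed directly; this minimal sketch uses synthetic count data rather than the paper's processes:

```python
def dispersion_index(counts):
    """Index of dispersion: sample variance / sample mean of event
    counts in equal intervals.  For a Poisson process the index is
    near 1, and (n-1) * index is approximately chi-square with n-1
    degrees of freedom, giving a simple test of randomness."""
    n = len(counts)
    m = sum(counts) / n
    var = sum((c - m) ** 2 for c in counts) / (n - 1)
    return var / m
```

Perfectly regular counts give an index of 0, while counts more variable than Poisson (clustering) push the index above 1.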