
Showing papers on "Randomness published in 2011"


Book
12 Feb 2011
TL;DR: In this paper, a generalisation of the random phase approximation (RPA), in which both the phases and the amplitudes of the wave Fourier modes are treated as random quantities, is used to revise the statistical theory of Wave Turbulence (WT).
Abstract: In this paper we review recent developments in the statistical theory of weakly nonlinear dispersive waves, the subject known as Wave Turbulence (WT). We revise WT theory using a generalisation of the random phase approximation (RPA). This generalisation takes into account that not only the phases but also the amplitudes of the wave Fourier modes are random quantities, and it is called the ``Random Phase and Amplitude'' approach. This approach allows one to systematically derive the kinetic equation for the energy spectrum from the Peierls-Brout-Prigogine (PBP) equation for the multi-mode probability density function (PDF). The PBP equation was originally derived for three-wave systems, and in the present paper we derive a similar equation for the four-wave case. The equation for the multi-mode PDF will be used to validate the statistical assumptions about phase and amplitude randomness used in WT closures. Further, the multi-mode PDF contains detailed statistical information beyond spectra, which finally allows one to study non-Gaussianity and intermittency in WT, as described in the present paper. In particular, we will show that intermittency of stochastic nonlinear waves is related to a flux of probability in the space of wave amplitudes.

433 citations


Journal ArticleDOI
TL;DR: This work introduces a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string.
Abstract: Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

348 citations


Journal ArticleDOI
TL;DR: This work analyzes the number of samples required to guarantee that, with probability at least 1−δ, the relative error in the estimate is at most ε, and argues that such bounds are much more useful in applications than the variance.
Abstract: We analyze the convergence of randomized trace estimators. Starting in 1989, several algorithms have been proposed for estimating the trace of a matrix by (1/M)Σ_{i=1}^{M} z_i^T A z_i, where the z_i are random vectors; different estimators use different distributions for the z_i, all of which lead to E((1/M)Σ_{i=1}^{M} z_i^T A z_i) = trace(A). These algorithms are useful in applications in which there is no explicit representation of A but rather an efficient method to compute z^T A z given z. Existing results only analyze the variance of the different estimators. In contrast, we analyze the number of samples M required to guarantee that with probability at least 1−δ, the relative error in the estimate is at most ε. We argue that such bounds are much more useful in applications than the variance. We found that these bounds rank the estimators differently than the variance; this suggests that minimum-variance estimators may not be the best. We also make two additional contributions to this area. The first is a specialized bound for projection matrices, whose trace (rank) needs to be computed in electronic structure calculations. The second is a new estimator that uses less randomness than all the existing estimators.

319 citations
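The estimator analyzed above is easy to sketch. The following pure-Python toy (our own illustration; the function name, the test matrix, and the sample count are hypothetical choices, and a real implementation would use an optimized z ↦ Az routine) draws Rademacher vectors z with ±1 entries, for which E[zᵀAz] = trace(A):

```python
import random

def mat_vec(A, z):
    """Dense matrix-vector product over plain Python lists."""
    return [sum(a * zi for a, zi in zip(row, z)) for row in A]

def hutchinson_trace(A, num_samples=2000, seed=0):
    """Randomized trace estimate (1/M) * sum_i z_i^T A z_i using
    Rademacher vectors z_i (independent +/-1 entries)."""
    rng = random.Random(seed)
    n = len(A)
    total = 0.0
    for _ in range(num_samples):
        z = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        total += sum(zi * v for zi, v in zip(z, mat_vec(A, z)))
    return total / num_samples

# Symmetric test matrix with trace exactly 30 (all diagonal entries are 1).
n = 30
A = [[1.0 / (1 + abs(i - j)) for j in range(n)] for i in range(n)]
est = hutchinson_trace(A)  # close to 30, with Monte Carlo error O(1/sqrt(M))
```

The paper's point is precisely about how large `num_samples` must be for a target (ε, δ) guarantee, rather than about the per-sample variance of such an estimator.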


Journal ArticleDOI
TL;DR: This work shows that, under the assumption that measurements can be chosen freely, no extension of quantum theory can give more information about the outcomes of future measurements than quantum theory itself.
Abstract: According to quantum theory, measurements generate random outcomes, in stark contrast with classical mechanics. This raises the question of whether there could exist an extension of the theory that removes this indeterminism, as suspected by Einstein, Podolsky and Rosen. Although this has been shown to be impossible, existing results do not imply that the current theory is maximally informative. Here we ask the more general question of whether any improved predictions can be achieved by any extension of quantum theory. Under the assumption that measurements can be chosen freely, we answer this question in the negative: no extension of quantum theory can give more information about the outcomes of future measurements than quantum theory itself. Our result has significance for the foundations of quantum mechanics, as well as applications to tasks that exploit the inherent randomness in quantum theory, such as quantum cryptography.

217 citations


Journal ArticleDOI
TL;DR: The randomness and mean orientation angle maps generated using the adaptive decomposition significantly improve the physical interpretation of the scattering observed at the three different frequencies.
Abstract: Previous model-based decomposition techniques are applicable to a limited range of vegetation types because of their specific assumptions about the volume scattering component. Furthermore, most of these techniques use the same model, or just a few models, to characterize the volume scattering component in the decomposition for all pixels in an image. In this paper, we extend the model-based decomposition idea by creating an adaptive model-based decomposition technique, allowing us to estimate both the mean orientation angle and a degree of randomness for the canopy scattering for each pixel in an image. No scattering reflection symmetry assumption is required to determine the volume contribution. We examined the usefulness of the proposed decomposition technique by decomposing the covariance matrix using the National Aeronautics and Space Administration/Jet Propulsion Laboratory Airborne Synthetic Aperture Radar data at the C-, L-, and P-bands. The randomness and mean orientation angle maps generated using our adaptive decomposition significantly improve the physical interpretation of the scattering observed at the three different frequencies.

196 citations


Book ChapterDOI
28 Mar 2011
TL;DR: In this article, a lower bound on the amount of randomness needed for implementing an information theoretically secure oblivious RAM is proved, without assuming that the CPU has access to a random oracle.
Abstract: We present an algorithm for implementing a secure oblivious RAM where the access pattern is perfectly hidden in the information theoretic sense, without assuming that the CPU has access to a random oracle. In addition we prove a lower bound on the amount of randomness needed for implementing an information theoretically secure oblivious RAM.

187 citations


Book ChapterDOI
28 Sep 2011
TL;DR: A novel and efficient method to generate true random numbers on FPGAs by inducing metastability in bi-stable circuit elements, e.g. flip-flops, by using precise programmable delay lines (PDL) that accurately equalize the signal arrival times to flip-Flops.
Abstract: The paper presents a novel and efficient method to generate true random numbers on FPGAs by inducing metastability in bi-stable circuit elements, e.g. flip-flops. Metastability is achieved by using precise programmable delay lines (PDLs) that accurately equalize the signal arrival times to flip-flops. The PDLs are capable of adjusting signal propagation delays with sub-picosecond resolution. In addition, a real-time monitoring system is utilized to assure a high degree of randomness in the generated output bits, resilience against fluctuations in environmental conditions, and robustness against active adversarial attacks. The monitoring system employs a feedback loop that actively monitors the probability of the output bits; as soon as any bias is observed in the probabilities, it adjusts the delay through the PDLs to return to the metastable operation region. Implementation on Xilinx Virtex 5 FPGAs and results of NIST randomness tests show the effectiveness of our approach.

144 citations


Book
20 Jun 2011
TL;DR: In this paper, the basics of stochastic calculus and its application to the study of noise-induced phenomena in environmental systems are presented, and a reference text for ecologists, geoscientists and environmental engineers interested in the subject is provided.
Abstract: Randomness is ubiquitous in nature. Random drivers are generally considered a source of disorder in environmental systems. However, the interaction between noise and nonlinear dynamics may lead to the emergence of a number of ordered behaviors (in time and space) that would not exist in the absence of noise. This counterintuitive effect of randomness may play a crucial role in environmental processes. For example, seemingly 'random' background events in the atmosphere can grow into larger instabilities that have great effects on weather patterns. This book presents the basics of the theory of stochastic calculus and its application to the study of noise-induced phenomena in environmental systems. It will be an invaluable reference text for ecologists, geoscientists and environmental engineers interested in the study of stochastic environmental dynamics.

132 citations


Journal ArticleDOI
TL;DR: Bucher as discussed by the authors presented a Computational Analysis of Randomness in Structural Mechanics by Christian Bucher, CRC Press, Leiden, The Netherlands, 2009, 231 pp., US$102.95, ISBN 978-0415403542
Abstract: Computational Analysis of Randomness in Structural Mechanics by Christian Bucher, CRC Press, Leiden, The Netherlands, 2009, 231 pp., US$102.95, ISBN 978-0415403542 This high-quality book captures i...

132 citations


Journal ArticleDOI
Min Ren, Eliza Wu, Yan Liang, Yi Jian, Guang Wu, Heping Zeng
TL;DR: A high-efficiency quantum random number generator is demonstrated which takes inherent advantage of the photon number distribution randomness of a coherent light source and passed all the stringent statistical tests.
Abstract: We demonstrated a high-efficiency quantum random number generator which takes inherent advantage of the randomness of the photon number distribution of a coherent light source. This scheme was realized by comparing the photon flux of consecutive pulses with a photon-number-resolving detector. The random bit generation rate could reach 2.4 MHz with a system clock of 6.0 MHz, corresponding to a random bit generation efficiency as high as 40%. The random number files passed all the stringent statistical tests.

117 citations
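The bit-extraction step can be illustrated with a classical simulation (a toy model of the comparison scheme, not the authors' hardware; the mean photon number and pulse count below are our arbitrary choices). Photon counts of a coherent source are Poisson-distributed, and comparing counts of consecutive pulse pairs yields unbiased bits, with ties discarded:

```python
import math
import random

def poisson(lam, rng):
    """Poisson sample via Knuth's algorithm; adequate for small means."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def qrng_bits(mean_photons=2.0, num_pairs=5000, seed=1):
    """Compare photon counts of consecutive pulse pairs: emit 1 if the
    first count is larger, 0 if smaller, and discard ties. By symmetry
    the surviving bits are unbiased."""
    rng = random.Random(seed)
    bits = []
    for _ in range(num_pairs):
        n1, n2 = poisson(mean_photons, rng), poisson(mean_photons, rng)
        if n1 != n2:
            bits.append(1 if n1 > n2 else 0)
    return bits

bits = qrng_bits()  # roughly 80% of pairs survive for a mean of 2 photons
```

The discarded ties are what make the generation efficiency less than 100%, mirroring the 40% figure reported for the actual device.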


Proceedings ArticleDOI
25 Jul 2011
TL;DR: A new approach for generating point sets with high-quality blue noise properties that formulates the problem using a statistical mechanics interacting particle model, and derives a highly efficient multi-scale sampling scheme for drawing random point distributions from this model.
Abstract: Stochastic point distributions with a blue-noise spectrum are used extensively in computer graphics for various applications such as avoiding aliasing artifacts in ray tracing, halftoning, stippling, etc. In this paper we present a new approach for generating point sets with high-quality blue noise properties that formulates the problem using a statistical mechanics interacting particle model. Point distributions are generated by sampling this model. This new formulation of the problem unifies randomness with the requirement for equidistant point spacing, responsible for the enhanced blue noise spectral properties. We derive a highly efficient multi-scale sampling scheme for drawing random point distributions from this model. The new scheme avoids the critical slowing-down phenomenon that plagues this type of model. This derivation is accompanied by a model-specific analysis. Altogether, our approach generates high-quality point distributions, supports spatially varying point density, and runs in time that is linear in the number of points generated.
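The particle-model sampler itself is beyond a short sketch, but the classical baseline it competes with, naive dart throwing (rejection-based Poisson-disk sampling), shows the equidistant-spacing constraint that blue-noise sets combine with randomness. All names and parameters below are our illustrative choices, not the authors' method:

```python
import random

def dart_throwing(n_points=100, min_dist=0.05, max_tries=20000, seed=0):
    """Naive Poisson-disk baseline: accept a uniformly random point in
    the unit square only if it stays at least min_dist away from every
    previously accepted point."""
    rng = random.Random(seed)
    pts = []
    tries = 0
    while len(pts) < n_points and tries < max_tries:
        tries += 1
        x, y = rng.random(), rng.random()
        if all((x - px) ** 2 + (y - py) ** 2 >= min_dist ** 2 for px, py in pts):
            pts.append((x, y))
    return pts

pts = dart_throwing()
```

Dart throwing slows down badly as the domain fills up, which is one reason schemes with better asymptotics, like the paper's linear-time multi-scale sampler, matter in practice.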

Journal ArticleDOI
TL;DR: An ensemble investigation of the computational capabilities of small-world networks, as compared to ordered and random topologies, finds that the ordered phase of the dynamics and topologies with low randomness are dominated by information storage, while the chaotic phase and topologies with high randomness are dominated by information transfer.
Abstract: Small-world networks have been one of the most influential concepts in complex systems science, partly due to their prevalence in naturally occurring networks. It is often suggested that this prevalence is due to an inherent capability to store and transfer information efficiently. We perform an ensemble investigation of the computational capabilities of small-world networks as compared to ordered and random topologies. To generate dynamic behavior for this experiment, we imbue the nodes in these networks with random Boolean functions. We find that the ordered phase of the dynamics (low activity in dynamics) and topologies with low randomness are dominated by information storage, while the chaotic phase (high activity in dynamics) and topologies with high randomness are dominated by information transfer. Information storage and information transfer are somewhat balanced (crossed over) near the small-world regime, providing quantitative evidence that small-world networks do indeed have a propensity to combine comparably large information storage and transfer capacity.
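The experimental setup described above, random Boolean functions on a topology whose randomness is tunable, can be sketched as follows (a minimal toy; the network size, in-degree, and rewiring probability are our assumed parameters, with small rewiring probabilities giving the small-world regime):

```python
import random

def build_network(n=50, k=4, rewire_p=0.1, seed=0):
    """Watts-Strogatz-style topology: each node reads k ring neighbours,
    and each link is rewired to a uniformly random node with probability
    rewire_p (rewire_p near 0: ordered; near 1: random)."""
    rng = random.Random(seed)
    inputs, tables = [], []
    for i in range(n):
        nbrs = [(i + d) % n for d in range(1, k + 1)]
        nbrs = [rng.randrange(n) if rng.random() < rewire_p else j for j in nbrs]
        inputs.append(nbrs)
        # random Boolean function: a random truth table over the 2^k input patterns
        tables.append([rng.randint(0, 1) for _ in range(2 ** k)])
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update every node from its inputs' current values."""
    new_state = []
    for nbrs, table in zip(inputs, tables):
        idx = 0
        for j in nbrs:
            idx = (idx << 1) | state[j]
        new_state.append(table[idx])
    return new_state

inputs, tables = build_network()
rng = random.Random(1)
state = [rng.randint(0, 1) for _ in range(50)]
for _ in range(20):
    state = step(state, inputs, tables)
```

The paper's measurements of information storage and transfer are then computed over long trajectories of exactly this kind of dynamics, swept across rewiring probabilities.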

Book ChapterDOI
04 Jul 2011
TL;DR: The main concepts of this area of "randomness extraction" are surveyed: deterministic extractors, seeded extractors and multiple sources extractors.
Abstract: We give an introduction to the area of "randomness extraction" and survey the main concepts of this area: deterministic extractors, seeded extractors and multiple sources extractors. For each one we briefly discuss background, definitions, explicit constructions and applications.

Journal ArticleDOI
TL;DR: The algorithm improves upon existing multilevel splitting methods and can be analyzed using Poisson process tools that lead to exact description of the distribution of the estimated probabilities and quantiles.
Abstract: Let X be a random vector with distribution μ on ℝ^d and φ be a mapping from ℝ^d to ℝ. That mapping acts as a black box, e.g., the result of some computer experiment for which no analytical expression is available. This paper presents an efficient algorithm to estimate a tail probability given a quantile, or a quantile given a tail probability. The algorithm improves upon existing multilevel splitting methods and can be analyzed using Poisson process tools that lead to an exact description of the distribution of the estimated probabilities and quantiles. The performance of the algorithm is demonstrated in a problem related to digital watermarking.
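A minimal "last particle" variant of multilevel splitting (our simplification of the family of methods the paper improves, not the paper's algorithm; the standard normal target, identity black-box mapping, Metropolis kernel, and all parameters are our own choices) conveys the core idea: repeatedly kill the lowest particle, resample it above the current level, and multiply the running estimate by (1 − 1/N):

```python
import math
import random

def mcmc_step(x, rng):
    """Symmetric random-walk proposal with a Metropolis correction for
    a standard normal target."""
    y = x + 0.5 * rng.gauss(0.0, 1.0)
    if math.log(rng.random() + 1e-300) < (x * x - y * y) / 2.0:
        return y
    return x

def last_particle_splitting(q, n_particles=100, mcmc_moves=20, seed=0):
    """Estimate p = P(X > q) for X ~ N(0,1): kill the lowest of N
    particles, resample it by cloning a survivor and applying MCMC
    moves restricted to stay above the current level, and multiply the
    estimate by (1 - 1/N) per kill."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    prob = 1.0
    for _ in range(100000):  # safety cap on iterations
        m = min(particles)
        if m > q:
            break
        i = particles.index(m)
        j = rng.randrange(n_particles)
        while j == i:
            j = rng.randrange(n_particles)
        x = particles[j]
        for _ in range(mcmc_moves):
            y = mcmc_step(x, rng)
            if y > m:  # keep the move only if it stays above the level
                x = y
        particles[i] = x
        prob *= 1.0 - 1.0 / n_particles
    return prob

est = last_particle_splitting(2.0)  # true value: P(Z > 2) is about 0.0228
```

The Poisson-process analysis mentioned in the abstract is what gives the exact distribution of estimators of this (1 − 1/N)^K form.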

Journal ArticleDOI
TL;DR: This work proposes a true random-number expansion protocol without entanglement, where the randomness can be guaranteed only by the two-dimensional quantum witness violation.
Abstract: By testing the classical correlation violation between two systems, true random numbers can be generated and certified without applying classical statistical methods. In this work, we propose a true random-number expansion protocol without entanglement, where the randomness can be guaranteed only by the two-dimensional quantum witness violation. Furthermore, we only assume that the dimensionality of the system used in the protocol has a tight bound, and the whole protocol can be regarded as a semi-device-independent black-box scenario. Compared with the device-independent random-number expansion protocol based on entanglement, our protocol is much easier to implement and test.


Journal ArticleDOI
TL;DR: In this paper, wind-tunnel experiments were conducted on seven types of urban building arrays with various roughness packing densities to measure the bulk drag coefficient and mean wind profile; aerodynamic parameters such as roughness length and displacement height were also estimated.
Abstract: It is difficult to describe the flow characteristics within and above urban canopies using only geometrical parameters such as plan area index (λ_p) and frontal area index (λ_f) because urban surfaces comprise buildings with random layouts, shapes, and heights. Furthermore, two types of ‘randomness’ are associated with the geometry of building arrays: the randomness of element heights (vertical) and that of the rotation angles of each block (horizontal). In this study, wind-tunnel experiments were conducted on seven types of urban building arrays with various roughness packing densities to measure the bulk drag coefficient (C_d) and mean wind profile; aerodynamic parameters such as roughness length (z_o) and displacement height (d) were also estimated. The results are compared with previous results from regular arrays having neither ‘vertical’ nor ‘horizontal’ randomness. In vertical random arrays, the plot of C_d and z_o versus λ_f exhibited a monotonic increase, and z_o increased by a factor of almost two for λ_f = 48–70%. C_d was strongly influenced by the standard deviation of the height of blocks (σ) when λ_p ≥ 17%, whereas C_d was independent of σ when λ_p = 7%. In the case of horizontal random arrays, the plot of the estimated C_d against λ_f showed a peak. The effect of both vertical and horizontal randomness of the layout on aerodynamic parameters can be explained by the structure of the vortices around the blocks; the aspect ratio of the block is an appropriate index for the estimation of such features.

01 Jan 2011
TL;DR: Probability theory is that part of mathematics that is concerned with the description and modeling of random phenomena, or in a more general — but not unanimously accepted — sense, of any kind of uncertainty.
Abstract: Probability theory is that part of mathematics that is concerned with the description and modeling of random phenomena, or in a more general — but not unanimously accepted — sense, of any kind of uncertainty. Probability is assigned to random events, expressing their tendency to occur in a random experiment, or more generally to propositions, characterizing the degree of belief in their truth. Probability is the fundamental concept underlying most statistical analyses that go beyond a mere description of the observed data. In statistical inference, where conclusions about the properties of the underlying population have to be drawn from a random sample, arguments based on probability allow one to cope with the sampling error and therefore to control the inference error, which is necessarily present in any generalization from a part to the whole. Statistical modeling aims at separating regularities (structure explainable by a model) from randomness. There, the sampling error and all the variation not explained by the chosen optimal model are subsumed in an error probability as a residual category.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the sensitivity of the resulting mass yields to a variety of model ingredients, including in particular the dimensionality and discretization of the shape space and the structure of the dissipation tensor.
Abstract: Random walks on five-dimensional potential-energy surfaces were recently found to yield fission-fragment mass distributions that are in remarkable agreement with experimental data. Within the framework of the Smoluchowski equation of motion, which is appropriate for highly dissipative evolutions, we discuss the physical justification for that treatment and investigate the sensitivity of the resulting mass yields to a variety of model ingredients, including in particular the dimensionality and discretization of the shape space and the structure of the dissipation tensor. The mass yields are found to be relatively robust, suggesting that the simple random walk presents a useful calculational tool. Quantitatively refined results can be obtained by including physically plausible forms of the dissipation, which amounts to simulating the Brownian shape motion in an anisotropic medium.

Journal ArticleDOI
TL;DR: In this paper, the authors apply kernel principal component analysis (KPCA) to construct a reduced-order stochastic input model for the material property variation in heterogeneous media.

Journal ArticleDOI
TL;DR: In this paper, it was shown that the entropy per coordinate in a log-concave random vector of any dimension with given density at the mode has a range of just 1.
Abstract: The entropy per coordinate in a log-concave random vector of any dimension with given density at the mode is shown to have a range of just 1. Uniform distributions on convex bodies are at the lower end of this range, the distribution with i.i.d. exponentially distributed coordinates is at the upper end, and the normal is exactly in the middle. Thus, in terms of the amount of randomness as measured by entropy per coordinate, any log-concave random vector of any dimension contains randomness that differs from that in the normal random variable with the same maximal density value by at most 1/2. As applications, we obtain an information-theoretic formulation of the famous hyperplane conjecture in convex geometry, entropy bounds for certain infinitely divisible distributions, and quantitative estimates for the behavior of the density at the mode on convolution. More generally, one may consider so-called convex or hyperbolic probability measures on Euclidean spaces; we give new constraints on entropy per coordinate for this class of measures, which generalize our results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.
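The stated range of exactly 1 (in nats) can be checked directly for the three distributions named in the abstract, each normalized so that the density at the mode equals 1. The closed-form differential entropies used below are standard; the code just evaluates them:

```python
import math

# Entropy per coordinate, in nats, for densities whose value at the mode is 1.
h_uniform = math.log(1.0)               # Uniform[0,1]: h = log(length) = 0
h_exponential = 1.0 - math.log(1.0)     # Exp(rate 1): h = 1 - log(rate) = 1
sigma = 1.0 / math.sqrt(2.0 * math.pi)  # the normal whose peak density is 1
h_normal = 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)

# approximately 0.0, 0.5, 1.0 (exact values 0, 1/2, and 1 nat):
print(h_uniform, h_normal, h_exponential)
```

The uniform sits at the bottom of the range (0), the exponential at the top (1), and the normal exactly in the middle (1/2), matching the abstract's claim that any log-concave vector differs from the matching normal by at most 1/2 per coordinate.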

Book
13 Sep 2011
TL;DR: In this article, the authors modify the random choice method of Glimm by replacing the exact solution of the Riemann problem with an appropriate finite difference approximation, which is computationally more efficient and is easier to extend to more general situations.
Abstract: In this paper, we show how to modify the random choice method of Glimm by replacing the exact solution of the Riemann problem with an appropriate finite difference approximation. Our modification resolves discontinuities as well as Glimm’s scheme, but is computationally more efficient and is easier to extend to more general situations.

Journal ArticleDOI
TL;DR: The result can be used to prove the soundness of locally computable extractors in a context where side information might be quantum-mechanical and implies that key agreement in the bounded-storage model-using a standard sample-and-hash protocol-is fully secure against quantum adversaries, thus solving a long-standing open problem.
Abstract: Let X1,..., Xn be a sequence of n classical random variables and consider a sample Xs1,..., Xsr of r ≤ n positions selected at random. Then, except with (exponentially in r) small probability, the min-entropy Hmin(Xs1 ...Xsr) of the sample is not smaller than, roughly, a fraction r/n of the overall entropy Hmin(X1 ...Xn), which is optimal. Here, we show that this statement, originally proved in [S. Vadhan, LNCS 2729, Springer, 2003] for the purely classical case, is still true if the min-entropy Hmin is measured relative to a quantum system. Because min-entropy quantifies the amount of randomness that can be extracted from a given random variable, our result can be used to prove the soundness of locally computable extractors in a context where side information might be quantum-mechanical. In particular, it implies that key agreement in the bounded-storage model-using a standard sample-and-hash protocol-is fully secure against quantum adversaries, thus solving a long-standing open problem.

Posted Content
TL;DR: It is shown that as long as the source of randomness is suitably ergodic — it converges quickly enough to a stationary distribution — the method enjoys strong convergence guarantees, both in expectation and with high probability.
Abstract: We generalize stochastic subgradient descent methods to situations in which we do not receive independent samples from the distribution over which we optimize, but instead receive samples that are coupled over time. We show that as long as the source of randomness is suitably ergodic---it converges quickly enough to a stationary distribution---the method enjoys strong convergence guarantees, both in expectation and with high probability. This result has implications for stochastic optimization in high-dimensional spaces, peer-to-peer distributed optimization schemes, decision problems with dependent data, and stochastic optimization problems over combinatorial spaces.
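A toy instance of the setting (our own illustration; the quadratic losses, the "sticky" index chain, and all step-size choices are assumptions, not the paper's setup) replaces i.i.d. sampling with an ergodic Markov chain over the data, which is exactly the condition the paper's guarantees rest on:

```python
import random

def markov_sgd(data, grad, steps=5000, step0=0.01, stay_prob=0.9, seed=0):
    """Stochastic subgradient descent whose sample index follows a
    'sticky' ergodic Markov chain over the data (uniform stationary
    distribution) instead of being drawn i.i.d."""
    rng = random.Random(seed)
    w = 0.0
    i = rng.randrange(len(data))
    eta = step0
    for _ in range(steps):
        if rng.random() > stay_prob:  # occasionally jump to a uniform index
            i = rng.randrange(len(data))
        w -= eta * grad(w, data[i])
        eta *= 0.9995                 # slowly decaying step size
    return w

# Quadratic losses (w - x)^2 / 2: the average loss is minimized at mean(data).
data = [1.0, 2.0, 3.0, 4.0]
w = markov_sgd(data, lambda w, x: w - x)  # approaches 2.5
```

The chain mixes quickly (it forgets its state in about ten steps here), so time averages approach the stationary average and the iterate still converges to the minimizer of the population objective, in line with the paper's ergodicity condition.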

Journal ArticleDOI
TL;DR: In this paper, the authors studied backward stochastic differential equations with general terminal value and general random generator and obtained the rate of convergence of the schemes based on the obtained Lp-Holder continuity results.
Abstract: In this paper we study backward stochastic differential equations with general terminal value and general random generator. In particular, we do not require the terminal value to be given by a forward diffusion equation. The randomness of the generator does not need to come from a forward equation, either. Motivated by applications to numerical simulation, we first obtain the Lp-Hölder continuity of the solution. Then we construct several numerical approximation schemes for backward stochastic differential equations and obtain the rate of convergence of the schemes based on the obtained Lp-Hölder continuity results. The main tool is the Malliavin calculus.

Journal ArticleDOI
01 Jan 2011
TL;DR: Three chance-constrained programming models are constructed for the railway freight transportation planning problem under the mixed uncertain environment of fuzziness and randomness, in which the optimal paths, the amount of commodities passing through each path and the frequency of services need to be determined.
Abstract: The railway freight transportation planning problem under the mixed uncertain environment of fuzziness and randomness is investigated in this paper, in which the optimal paths, the amount of commodities passing through each path and the frequency of services need to be determined. Based on the chance measure and critical values of the random fuzzy variable, three chance-constrained programming models are constructed for the problem with respect to different criteria. Some equivalents of objectives and constraints are also discussed in order to investigate mathematical properties of the models. To solve the models, a potential path searching algorithm, simulation algorithms and a genetic algorithm are integrated into a hybrid algorithm to search for an optimal solution. Finally, some numerical examples are presented to show the applications of the models and the algorithm.

Journal ArticleDOI
TL;DR: This paper investigates an opportunistic cooperative system with multiple relays that is derived based on the theory of point processes and develops a general analytical approach to performance analysis to accommodate the randomness of the locations as well as the underlying channels.
Abstract: This paper investigates an opportunistic cooperative system with multiple relays. The locations of the relays are essentially random due to their unpredictable mobility and are thus assumed to form a spatial Poisson process. A general analytical approach to performance analysis is developed to accommodate the randomness of the locations as well as the underlying channels. The outage probability of the system is derived based on the theory of point processes. In particular, two relay selection criteria, namely the best forward channel selection and the best worse channel selection, are used as examples to illustrate the proposed approach. The accuracy of the analytical results is verified by Monte-Carlo simulations with various system configurations.

Proceedings ArticleDOI
13 Jul 2011
TL;DR: A solution for accelerating statistical tests of Diehard Battery based on reconfigurable hardware, benefiting from task parallelization and high frequencies is described.
Abstract: Pseudorandom number generators (PRNGs) are used frequently in secure data processing algorithms. Randomness measurement is an essential test performed on these generators that helps to gauge the security of the designed algorithm, with respect to ensuring strong mixing of the processed data. This paper describes a solution for accelerating the statistical tests of the Diehard Battery based on reconfigurable hardware, benefiting from task parallelization and high frequencies. With this proposal, users can obtain a fast, cheap and reliable measure of randomness properties, enabling a complete exploration of the design space to produce better devices in shorter times.
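The paper accelerates the Diehard Battery in hardware; the flavor of such statistical testing can be shown with a much simpler software test, the frequency (monobit) test in the style of NIST SP 800-22 (our substitute example, not one of the Diehard tests; the generators below are illustrative):

```python
import math
import random

def monobit_pvalue(bits):
    """Frequency (monobit) test: under the null hypothesis of unbiased
    independent bits, the sum S of +/-1 values normalized by sqrt(n) is
    approximately standard normal; the two-sided p-value is
    erfc(|S| / sqrt(2n)). Small p-values indicate bias."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2.0 * n))

rng = random.Random(42)
good = [rng.randint(0, 1) for _ in range(10000)]
bad = [1 if rng.random() < 0.6 else 0 for _ in range(10000)]  # 60% ones

p_good = monobit_pvalue(good)  # typically well above the 0.01 threshold
p_bad = monobit_pvalue(bad)    # essentially zero for a 60%-biased source
```

A full battery such as Diehard runs many tests of this kind in sequence, which is why parallelizing them on an FPGA yields the speedups the paper reports.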

Journal ArticleDOI
TL;DR: A parameter estimation method is proposed and computational guidelines for an efficient implementation are given, and the method is evaluated using simulations from standard models like the two-dimensional Ornstein-Uhlenbeck (OU) and the square root models.

Journal ArticleDOI
TL;DR: The analysis of the NLSE with a random (Anderson like) potential has been done at various levels of control: numerical, analytical and rigorous as discussed by the authors, and this model equation presents us with a highly inconclusive and often contradictory picture.
Abstract: The Nonlinear Schroedinger Equation (NLSE) with a random potential is motivated by experiments in optics and in atom optics and is a paradigm for the competition between randomness and nonlinearity. The analysis of the NLSE with a random (Anderson-like) potential has been done at various levels of control: numerical, analytical and rigorous. Yet, this model equation presents us with a highly inconclusive and often contradictory picture. We describe the main recent results obtained in this field and propose a list of specific problems to focus on, which we hope will enable these outstanding questions to be resolved.