Author

J. M. Hammersley

Bio: J. M. Hammersley is an academic researcher from the University of Oxford. The author has contributed to research in topics: Monte Carlo method & Dynamic Monte Carlo method. The author has an h-index of 29 and has co-authored 71 publications receiving 13,768 citations. Previous affiliations of J. M. Hammersley include Trinity College, Dublin & American University.


Papers
Book
01 Jan 1964
TL;DR: This book surveys the general nature of Monte Carlo methods, gives a short résumé of statistical terms, treats random, pseudorandom, and quasirandom numbers, and then works through applications ranging from radiation shielding to polymers and percolation.
Abstract (table of contents):
1. The general nature of Monte Carlo methods
2. Short résumé of statistical terms
3. Random, pseudorandom, and quasirandom numbers
4. Direct simulation
5. General principles of the Monte Carlo method
6. Conditional Monte Carlo
7. Solution of linear operator equations
8. Radiation shielding and reactor criticality
9. Problems in statistical mechanics
10. Long polymer molecules
11. Percolation processes
12. Multivariable problems
References

3,226 citations

Journal ArticleDOI
01 Jul 1957
TL;DR: In this paper, the authors study in a general way how the random properties of a medium influence the percolation of a fluid through it; the treatment differs from conventional diffusion theory, in which it is the random properties of the fluid that matter.
Abstract: The paper studies, in a general way, how the random properties of a ‘medium’ influence the percolation of a ‘fluid’ through it. The treatment differs from conventional diffusion theory, in which it is the random properties of the fluid that matter. Fluid and medium bear general interpretations: for example, solute diffusing through solvent, electrons migrating over an atomic lattice, molecules penetrating a porous solid, disease infecting a community, etc.
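
For concreteness, the 'medium with random properties' is often modelled today as site percolation on a lattice, a setting that grew out of this work. The sketch below is illustrative only; the grid model, neighbourhood, and all parameters are our assumptions, not taken from the paper. It asks whether fluid entering at the top of a random grid can reach the bottom:

```python
import random
from collections import deque

def percolates(n, p, rng):
    """Site percolation on an n x n grid: each site is open with
    probability p; return True if an open path joins the top row
    to the bottom row (4-neighbour connectivity)."""
    open_site = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    queue = deque((0, c) for c in range(n) if open_site[0][c])
    seen = set(queue)
    while queue:
        r, c = queue.popleft()
        if r == n - 1:
            return True  # fluid reached the bottom row
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n and open_site[rr][cc] \
                    and (rr, cc) not in seen:
                seen.add((rr, cc))
                queue.append((rr, cc))
    return False

# Estimate the percolation probability near the known 2-D site
# threshold (about 0.593 on the square lattice).
for p in (0.5, 0.59, 0.7):
    hits = sum(percolates(50, p, random.Random(t)) for t in range(100))
    print(f"p = {p:.2f}: percolated in {hits}/100 trials")
```

Runs with p well below the threshold almost never percolate, and runs well above it almost always do, which is the sharp-transition behaviour the percolation literature went on to study.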

1,707 citations

Journal ArticleDOI
01 Oct 1959
TL;DR: In this paper, it was shown that the length of the shortest closed path through n points in a bounded plane region of area v is almost always asymptotically proportional to √(nv) for large n; and this result was extended to bounded Lebesgue sets in k-dimensional Euclidean space.
Abstract: We prove that the length of the shortest closed path through n points in a bounded plane region of area v is ‘almost always’ asymptotically proportional to √(nv) for large n; and we extend this result to bounded Lebesgue sets in k-dimensional Euclidean space. The constants of proportionality depend only upon the dimensionality of the space, and are independent of the shape of the region. We give numerical bounds for these constants for various values of k; and we estimate the constant in the particular case k = 2. The results are relevant to the travelling-salesman problem, Steiner's street network problem, and the Loberman-Weinberger wiring problem. They have possible generalizations in the direction of Plateau's problem and Douglas' problem.
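
The √(nv) law is easy to see numerically. The sketch below is a rough illustration under assumed settings: it uses a greedy nearest-neighbour tour, which only upper-bounds the shortest closed path, so the printed ratio approximates a constant somewhat larger than the optimal-tour constant the paper estimates for k = 2:

```python
import math
import random

def nearest_neighbour_tour(points):
    """Length of a closed tour built greedily (an upper bound on the
    shortest tour, used here only to illustrate the sqrt(n) scaling)."""
    unvisited = points[1:]
    tour = [points[0]]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda q: math.dist(last, q))
        unvisited.remove(nxt)
        tour.append(nxt)
    # sum consecutive edges, wrapping around to close the loop
    return sum(math.dist(tour[i], tour[(i + 1) % len(tour)])
               for i in range(len(tour)))

rng = random.Random(1)
for n in (100, 400, 1600):
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    length = nearest_neighbour_tour(pts)
    # unit square, so v = 1 and the ratio below should stabilise
    print(f"n = {n:5d}: tour length = {length:6.2f}, "
          f"length/sqrt(n*v) = {length / math.sqrt(n):.3f}")
```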

902 citations

Journal ArticleDOI
TL;DR: This paper opens with a brief general introduction to the nature of Monte Carlo methods (which readers acquainted with them can skip), then discusses the application of these methods to multivariable problems and points out relatively unexplored areas where further research might be profitable.
Abstract: This paper opens with a brief general introduction on the nature of Monte Carlo methods that can be skipped by readers acquainted with them. I then deal more specifically with the application of these methods to multivariable problems, and I indicate certain relatively unexplored areas of this field where further research might be profitable. As I believe is appropriate, some of my material is exploratory, speculative, and controversial, and accordingly I hope it will stimulate discussion.

497 citations


Cited by
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
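
The optimization procedure this analogy yields is simulated annealing. Here is a minimal illustrative sketch of the idea; the test function, proposal width, and cooling schedule are arbitrary choices of ours, not parameters from the paper:

```python
import math
import random

def anneal(f, x0, steps=20000, t0=2.0, rng=random.Random(0)):
    """Minimal simulated annealing: always accept downhill moves,
    accept uphill moves with probability exp(-dE/T), and lower the
    temperature T on a geometric schedule."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, 0.5)   # random local proposal
        f_new = f(x_new)
        if f_new < fx or rng.random() < math.exp(-(f_new - fx) / t):
            x, fx = x_new, f_new
            if fx < fbest:
                best, fbest = x, fx
        t *= 0.9995                        # geometric cooling
    return best, fbest

# A rugged 1-D landscape with many local minima: greedy descent from
# x0 = 8 would get stuck, but annealing escapes the local traps.
f = lambda x: x * x + 10 * math.sin(5 * x)
print(anneal(f, x0=8.0))
```

The occasional acceptance of uphill moves at high temperature is exactly the thermal-equilibrium mechanism the abstract refers to; as T falls, the walk freezes into a low-cost configuration.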

41,772 citations

Journal ArticleDOI
TL;DR: In this paper, a simple model based on network growth and preferential attachment was proposed, which reproduces the power-law degree distribution observed in real networks and captures the evolution of networks, not just their static topology.
Abstract: The emergence of order in natural systems is a constant source of inspiration for both the physical and biological sciences. While the spatial order characterizing, for example, crystals has been the basis of many advances in contemporary physics, most complex systems in nature do not offer such a high degree of order. Many of these systems form complex networks whose nodes are the elements of the system and whose edges represent the interactions between them. Traditionally, complex networks have been described by the random graph theory founded in 1959 by Paul Erdős and Alfréd Rényi. One of the defining features of random graphs is that they are statistically homogeneous, and their degree distribution (characterizing the spread in the number of edges starting from a node) is a Poisson distribution. In contrast, recent empirical studies, including the work of our group, indicate that the topology of real networks is much richer than that of random graphs. In particular, the degree distribution of real networks is a power law, indicating a heterogeneous topology in which the majority of the nodes have a small degree, but there is a significant fraction of highly connected nodes that play an important role in the connectivity of the network. The scale-free topology of real networks has very important consequences for their functioning. For example, we have discovered that scale-free networks are extremely resilient to the random disruption of their nodes. On the other hand, the selective removal of the nodes with the highest degree induces a rapid breakdown of the network into isolated subparts that cannot communicate with each other. The non-trivial scaling of the degree distribution of real networks is also an indication of their assembly and evolution. Indeed, our modeling studies have shown that there are general principles governing the evolution of networks. Most networks start from a small seed and grow by the addition of new nodes, which attach to the nodes already in the system. This process obeys preferential attachment: the new nodes are more likely to connect to nodes with already high degree. We have proposed a simple model based on these two principles which was able to reproduce the power-law degree distribution of real networks. Perhaps even more importantly, this model paved the way to a new paradigm of network modeling, one that tries to capture the evolution of networks, not just their static topology.
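
A minimal sketch of a growth-with-preferential-attachment model in the spirit described above; the seed size, the choice of m, and the endpoint-list sampling trick are our own simplifications, not the authors' exact formulation:

```python
import random

def barabasi_albert(n, m, rng=random.Random(0)):
    """Grow a graph by preferential attachment: each new node attaches
    m edges to existing nodes chosen proportionally to their degree.
    Sampling uniformly from the flat endpoint list gives exactly that
    bias, since node i appears in it deg(i) times."""
    # start from a small fully connected seed of m + 1 nodes
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    endpoints = [v for e in edges for v in e]
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:        # m distinct degree-biased picks
            targets.add(rng.choice(endpoints))
        for t in targets:
            edges.append((new, t))
            endpoints.extend((new, t))
    return edges

edges = barabasi_albert(10000, 2)
deg = {}
for u, v in edges:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1
# heavy tail: a few hubs dominate, unlike a Poisson-degree random graph
print("mean degree:", sum(deg.values()) / len(deg))
print("max degree: ", max(deg.values()))
```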

18,415 citations

Journal ArticleDOI
TL;DR: Developments in this field are reviewed, including such concepts as the small-world effect, degree distributions, clustering, network correlations, random graph models, models of network growth and preferential attachment, and dynamical processes taking place on networks.
Abstract: Inspired by empirical studies of networked systems such as the Internet, social networks, and biological networks, researchers have in recent years developed a variety of techniques and models to help us understand or predict the behavior of these systems. Here we review developments in this field, including such concepts as the small-world effect, degree distributions, clustering, network correlations, random graph models, models of network growth and preferential attachment, and dynamical processes taking place on networks.
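
As a small illustration of two of the reviewed concepts, the sketch below builds an Erdős–Rényi random graph and measures its mean clustering, which for such graphs is close to the connection probability p; all parameters here are arbitrary choices for illustration:

```python
import itertools
import random

def gnp(n, p, rng=random.Random(0)):
    """Erdős–Rényi random graph G(n, p) as an adjacency-set dict."""
    adj = {i: set() for i in range(n)}
    for i, j in itertools.combinations(range(n), 2):
        if rng.random() < p:
            adj[i].add(j)
            adj[j].add(i)
    return adj

def mean_clustering(adj):
    """Mean local clustering: the fraction of each node's neighbour
    pairs that are themselves connected, averaged over nodes."""
    coeffs = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for a, b in itertools.combinations(nbrs, 2)
                    if b in adj[a])
        coeffs.append(2 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

g = gnp(500, 0.02)
print("mean clustering:", round(mean_clustering(g), 4),
      "(about p = 0.02 for a random graph)")
```

Real social and biological networks typically show clustering far above this random-graph baseline, which is one of the empirical observations the review surveys.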

17,647 citations

Journal ArticleDOI
TL;DR: A generalization of the sampling method introduced by Metropolis et al. (1953) is presented, along with an exposition of the relevant theory, techniques of application, and the methods and difficulties of assessing the error in Monte Carlo estimates.
Abstract: SUMMARY A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application and methods and difficulties of assessing the error in Monte Carlo estimates. Examples of the methods, including the generation of random orthogonal matrices and potential applications of the methods to numerical problems arising in statistics, are discussed. For numerical problems in a large number of dimensions, Monte Carlo methods are often more efficient than conventional numerical methods. However, implementation of the Monte Carlo methods requires sampling from high dimensional probability distributions and this may be very difficult and expensive in analysis and computer time. General methods for sampling from, or estimating expectations with respect to, such distributions are as follows. (i) If possible, factorize the distribution into the product of one-dimensional conditional distributions from which samples may be obtained. (ii) Use importance sampling, which may also be used for variance reduction. That is, in order to evaluate the integral $J = \int f(x)\,p(x)\,dx = E_p(f)$, where $p(x)$ is a probability density function, instead of obtaining independent samples $x_1, \ldots, x_N$ from $p(x)$ and using the estimate $J_1 = \sum f(x_i)/N$, we instead obtain the sample from a distribution with density $q(x)$ and use the estimate $J_2 = \sum \{f(x_i)\,p(x_i)\}/\{q(x_i)\,N\}$. This may be advantageous if it is easier to sample from $q(x)$ than $p(x)$, but it is a difficult method to use in a large number of dimensions, since the values of the weights $w(x_i) = p(x_i)/q(x_i)$ for reasonable values of $N$ may all be extremely small, or a few may be extremely large. In estimating the probability of an event $A$, however, these difficulties may not be as serious since the only values of $w(x)$ which are important are those for which $x \in A$. Since the methods proposed by Trotter & Tukey (1956) for the estimation of conditional expectations require the use of importance sampling, the same difficulties may be encountered in their use. (iii) Use a simulation technique; that is, if it is difficult to sample directly from $p(x)$ or if $p(x)$ is unknown, sample from some distribution $q(y)$ and obtain the sample $x$ values as some function of the corresponding $y$ values. If we want samples from the conditional dis…
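
The generalization described here underlies what is now called the Metropolis–Hastings algorithm. Below is a minimal random-walk sketch of that scheme; the target density, proposal width, and burn-in length are arbitrary choices for illustration:

```python
import math
import random

def metropolis_hastings(log_p, x0, steps, step_size=1.0,
                        rng=random.Random(0)):
    """Random-walk Metropolis sampler: propose x' ~ N(x, step_size^2)
    and accept with probability min(1, p(x')/p(x)). With a symmetric
    proposal, the Hastings correction q(x|x')/q(x'|x) cancels."""
    x = x0
    samples = []
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, step_size)
        delta = log_p(x_new) - log_p(x)
        if rng.random() < math.exp(min(0.0, delta)):
            x = x_new
        samples.append(x)  # the chain's states are the samples
    return samples

# Target: standard normal, known only up to a constant: log p(x) = -x^2/2.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=5.0, steps=50000)
tail = samples[10000:]  # discard burn-in while the chain forgets x0
mean = sum(tail) / len(tail)
var = sum((s - mean) ** 2 for s in tail) / len(tail)
print(f"mean = {mean:.3f}, variance = {var:.3f}  (target: 0, 1)")
```

Note that only ratios of the target density appear, so the normalizing constant never needs to be known, which is what makes the method practical for the high-dimensional distributions the abstract discusses.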

14,965 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations