Open Access · Journal ArticleDOI

Performance limitations of flat-histogram methods.

TLDR
The shape parameters of these distributions indicate that statistical sample means become ill defined already for moderate system sizes within these complex energy landscapes, and they also yield the optimal scaling of local-update flat-histogram methods with system size.
Abstract
Monte Carlo methods are well suited for the simulation of large many-body problems, since the complexity of a single Monte Carlo update step scales only polynomially, and often linearly, in the system size, while the configuration space grows exponentially with the system size. The performance of a Monte Carlo method is then determined by how many update steps are needed to efficiently sample the configuration space. For second-order phase transitions in unfrustrated systems the problem of "critical slowing down" - a rapid divergence of the number of Monte Carlo steps needed to obtain a subsequent uncorrelated configuration - was solved more than a decade ago by cluster update algorithms (1). At first-order phase transitions and in systems with many local minima of the free energy, such as frustrated magnets or spin glasses, there is the similar problem of long tunneling times between local minima. With energy barriers ΔE scaling linearly with the linear system size L, the tunneling times τ at an inverse temperature β = 1/k_B T scale exponentially with the system size, τ ∝ exp(βΔE) ∝ exp(const × L).

Several methods were developed to overcome this tunneling problem, such as the multicanonical method (2), broad histograms (4), simulated and parallel tempering (3), and Wang-Landau sampling (5). The common aim of all these methods is to broaden the range of energies sampled within Monte Carlo simulations from the sharply peaked distribution of canonical sampling at fixed temperature, in order to ease the tunneling through barriers. Ideally, all relevant energy levels are sampled equally often during a simulation, thus producing a "flat histogram" in energy space. Some methods approach this goal by variations and generalizations of canonical distributions (2, 3), while others (4, 5) discard the notion of temperature completely and instead are formulated in terms of the density of states. With a probability p(E) for a single configuration with energy E, the probability of sampling an arbitrary configuration with energy E is given as P_E = ρ(E) p(E), where the density of states ρ(E) counts the number of states with energy E. Upon choosing p(E) ∝ 1/ρ(E) instead of p(E) ∝ exp(-βE) one obtains a constant probability P_E for visiting each energy level E, and hence a flat histogram. Wang and Landau (5) proposed a simple and elegant flat-histogram algorithm that iteratively improves approximations to the initially unknown density of states ρ(E). Once ρ(E) is determined with sufficient accuracy, the Monte Carlo algorithm just performs a random walk in energy space. Within two years of publication this algorithm has been applied to a large number of problems (6, 7, 8) and extended to quantum systems (9).

In this Letter we investigate the performance of flat-histogram algorithms in general, and the Wang-Landau algorithm in particular, for three systems for which the density of states ρ(E) is known exactly on finite two-dimensional (2D) lattices: the Ising ferromagnet as the simplest example, the fully frustrated Ising model as a prototype for frustrated systems, and the ±J Ising spin glass. For each of these models we construct a perfect flat-histogram method by simulating a random walk in configuration space, where we employ the known density of states for these models to set p(E) ∝ 1/ρ(E).
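To make the flat-histogram sampling rule concrete, the sketch below shows single-spin-flip updates for a 2D Ising ferromagnet that accept a move with probability min(1, ρ(E_old)/ρ(E_new)), which realizes p(E) ∝ 1/ρ(E). This is a minimal illustration, not the authors' code: it assumes NumPy, a periodic L×L lattice, and that log ρ(E) is already available (for example an exact result or a converged Wang-Landau estimate); the names flat_histogram_sweep, ising_energy and log_rho are placeholders introduced here.

```python
import numpy as np

def ising_energy(spins):
    """Total energy E = -sum_<ij> s_i s_j of a periodic 2D Ising lattice (J = 1)."""
    return -int(np.sum(spins * (np.roll(spins, 1, axis=0) +
                                np.roll(spins, 1, axis=1))))

def flat_histogram_sweep(spins, log_rho, rng):
    """One sweep of single-spin-flip updates with p(E) proportional to 1/rho(E).

    log_rho[k] holds log rho(E) for the energy with index k = (E + 2N) // 4;
    it is assumed to be known (exact density of states or a converged
    Wang-Landau estimate).
    """
    L = spins.shape[0]
    N = L * L
    E = ising_energy(spins)
    for _ in range(N):
        i, j = rng.integers(L, size=2)
        # energy change of flipping spin (i, j) on the periodic lattice
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nn
        # accept with min(1, rho(E_old) / rho(E_new)) -> flat histogram in E
        log_ratio = log_rho[(E + 2 * N) // 4] - log_rho[(E + dE + 2 * N) // 4]
        if log_ratio >= 0 or rng.random() < np.exp(log_ratio):
            spins[i, j] *= -1
            E += dE
    return E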
As a measure of performance we use the average tunneling time τ to get from a ground state (lowest-energy configuration) to an anti-ground state (configuration of highest energy), which is the relevant time scale for sampling the whole phase space (10). Since the number of energy levels in a d-dimensional system with linear size L scales with the number of spins N = L^d, the tunneling time for a pure random walk in energy space grows as τ ∝ N^2.
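The quadratic random-walk scaling can be checked with a toy first-passage simulation: an unbiased random walk over n_levels energy levels, timed from the lowest to the highest level. This is an illustrative sketch only (it assumes NumPy, and the function name and the reflecting boundary handling are choices made here, not taken from the paper):

```python
import numpy as np

def random_walk_tunneling_time(n_levels, rng):
    """Steps an unbiased +-1 random walk needs to go from level 0 to
    level n_levels - 1, with a (lazy) reflecting boundary at the bottom."""
    pos, steps = 0, 0
    while pos < n_levels - 1:
        pos = min(max(pos + (1 if rng.random() < 0.5 else -1), 0), n_levels - 1)
        steps += 1
    return steps

rng = np.random.default_rng(1)
for n in (8, 16, 32, 64):
    tau = np.mean([random_walk_tunneling_time(n, rng) for _ in range(200)])
    print(f"n_levels = {n:3d}   mean tunneling time ~ {tau:8.0f}   tau / n^2 ~ {tau / n**2:.2f}")
```

The ratio tau / n^2 settles near a constant, illustrating the τ ∝ N^2 scaling of an ideal random walk in energy space; this is the baseline against which the paper measures the tunneling times of the three models.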



Citations
Journal ArticleDOI

Continuous-time Monte Carlo methods for quantum impurity models

TL;DR: In this paper, the continuous-time quantum Monte Carlo (QMC) algorithm is used to solve the local correlation problem in quantum impurity models with high and low energy scales and is effective for wide classes of physically realistic models.
Journal ArticleDOI

Feedback-optimized parallel tempering Monte Carlo

TL;DR: This paper shows that by choosing the temperatures with a modified version of the optimized ensemble feedback method, one can minimize the round-trip times between the lowest and highest temperatures, which effectively increases the efficiency of the parallel tempering algorithm.
Journal ArticleDOI

Optimized parallel tempering simulations of proteins.

TL;DR: An adaptive algorithm is applied that systematically improves the efficiency of parallel tempering or replica exchange methods in the numerical simulation of small proteins and finds the lowest-energy configuration with a root-mean-square deviation of less than 4 Å to the experimentally determined structure.
Journal Article

Optimized parallel tempering simulations of proteins

TL;DR: In this paper, the authors apply a recently developed adaptive algorithm that systematically improves the efficiency of parallel tempering or replica exchange methods in the numerical simulation of small proteins and test their algorithm by simulating the 36-residue villin headpiece subdomain HP-36, where they find a lowest-energy configuration with a root-mean-square deviation of less than 4 Å to the experimentally determined structure.
Journal ArticleDOI

Wang-Landau algorithm for continuous models and joint density of states.

TL;DR: A modified Wang-Landau algorithm for models with continuous degrees of freedom is presented, along with strategies to significantly speed up the calculation of the joint density of states for large systems over a large range of energy and order parameter.
References
Book

Statistical Analysis of Extreme Values

TL;DR: This book provides a self-contained introduction to the parametric modeling, exploratory analysis and statistical inference for extreme values; additional sections and chapters, elaborated over more than 100 pages, are particularly concerned with topics like dependencies, conditional analysis and the multivariate modeling of extreme data.
Journal ArticleDOI

Statistical Analysis of Extreme Values

Katherine Campbell
- 01 Aug 2002 - 
TL;DR: The author draws on his extensive experience with this topic through the SHADOW (an intrusion detection system) project at the Naval Surface Warfare Center and builds a comprehensive overview of problems related to network monitoring and intrusion, along with approaches for addressing them.