Journal ArticleDOI

Beitrag zur Theorie des Ferromagnetismus (Contribution to the Theory of Ferromagnetism)

01 Feb 1925 - European Physical Journal A (Springer-Verlag) - Vol. 31, Iss. 1, pp. 253-258
About: This article was published on 1925-02-01 in European Physical Journal A (originally Zeitschrift für Physik). It has received 2,983 citations to date.
Citations
Journal ArticleDOI
TL;DR: An analogy is drawn between images and statistical-mechanics systems; performing the analogue of annealing under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, and the result is a highly parallel ``relaxation'' algorithm for MAP estimation.
Abstract: We make an analogy between images and statistical mechanics systems. Pixel gray levels and the presence and orientation of edges are viewed as states of atoms or molecules in a lattice-like physical system. The assignment of an energy function in the physical system determines its Gibbs distribution. Because of the Gibbs distribution, Markov random field (MRF) equivalence, this assignment also determines an MRF image model. The energy function is a more convenient and natural mechanism for embodying picture attributes than are the local characteristics of the MRF. For a range of degradation mechanisms, including blurring, nonlinear deformations, and multiplicative or additive noise, the posterior distribution is an MRF with a structure akin to the image model. By the analogy, the posterior distribution defines another (imaginary) physical system. Gradual temperature reduction in the physical system isolates low energy states (``annealing''), or what is the same thing, the most probable states under the Gibbs distribution. The analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations. The result is a highly parallel ``relaxation'' algorithm for MAP estimation. We establish convergence properties of the algorithm and we experiment with some simple pictures, for which good restorations are obtained at low signal-to-noise ratios.

18,761 citations
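The abstract above describes MAP restoration by annealed Gibbs sampling over an MRF. Below is a minimal sketch of that idea for a binary image with an Ising-style smoothness prior and a quadratic data term; the energy parameters (`beta`, `lam`), the geometric cooling schedule, and all function names are illustrative assumptions, not the paper's exact formulation (which uses a logarithmic cooling schedule and a richer pixel/line process).

```python
# Sketch: annealed Gibbs sampling for MAP restoration of a binary image
# under an Ising-style MRF prior (illustrative, not the paper's exact model).
import numpy as np

def neighbors(i, j, shape):
    """4-connected lattice neighbours of pixel (i, j)."""
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < shape[0] and 0 <= nj < shape[1]:
            yield ni, nj

def local_energy(x, y, i, j, value, beta=1.5, lam=2.0):
    """Energy of pixel (i, j) taking `value` in {-1, +1}:
    smoothness term (agree with neighbours) + data term (agree with observation)."""
    smooth = -beta * sum(value * x[ni, nj] for ni, nj in neighbors(i, j, x.shape))
    data = lam * (value - y[i, j]) ** 2
    return smooth + data

def anneal_map(y, sweeps=50, t0=4.0, t_min=0.05, rng=np.random.default_rng(0)):
    """Gradually lower the temperature while Gibbs-sampling each pixel in turn."""
    x = np.sign(y).astype(float)
    x[x == 0] = 1.0
    for s in range(sweeps):
        t = max(t_min, t0 * (0.9 ** s))   # simple geometric annealing schedule
        for i in range(y.shape[0]):
            for j in range(y.shape[1]):
                e_plus = local_energy(x, y, i, j, +1.0)
                e_minus = local_energy(x, y, i, j, -1.0)
                p_plus = 1.0 / (1.0 + np.exp((e_plus - e_minus) / t))
                x[i, j] = 1.0 if rng.random() < p_plus else -1.0
    return x

# Usage: restore a noisy +/-1 image.
clean = np.ones((32, 32))
clean[8:24, 8:24] = -1.0
noisy = clean + np.random.default_rng(1).normal(0, 0.8, clean.shape)
restored = anneal_map(noisy)
```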

Journal ArticleDOI
TL;DR: In this paper, the effect of a uniform, time-varying magnetic field upon the Ising model is discussed, and the frequency-dependent magnetic susceptibility is found in the weak-field limit.
Abstract: The individual spins of the Ising model are assumed to interact with an external agency (e.g., a heat reservoir) which causes them to change their states randomly with time. Coupling between the spins is introduced through the assumption that the transition probabilities for any one spin depend on the values of the neighboring spins. This dependence is determined, in part, by the detailed balancing condition obeyed by the equilibrium state of the model. The Markoff process which describes the spin functions is analyzed in detail for the case of a closed N‐member chain. The expectation values of the individual spins and of the products of pairs of spins, each of the pair evaluated at a different time, are found explicitly. The influence of a uniform, time‐varying magnetic field upon the model is discussed, and the frequency‐dependent magnetic susceptibility is found in the weak‐field limit. Some fluctuation‐dissipation theorems are derived which relate the susceptibility to the Fourier transform of the time‐dependent correlation function of the magnetization at equilibrium.

2,833 citations
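The dynamics described above (single spins flipping with neighbour-dependent rates that respect detailed balance) can be illustrated with a short simulation. The sketch below uses the standard single-spin-flip rate w = ½[1 − s_k tanh(βh_k)] on a closed chain; the parameter values and names are chosen purely for illustration.

```python
# Sketch: Glauber-style single-spin-flip dynamics on a closed (periodic)
# N-member Ising chain, with flip rates satisfying detailed balance.
import numpy as np

def simulate_chain(n=100, steps=20000, j_coupling=1.0, temperature=1.0, seed=0):
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=n)
    beta = 1.0 / temperature
    magnetization = []
    for _ in range(steps):
        k = rng.integers(n)
        # Local field from the two neighbours on the ring.
        h = j_coupling * (spins[(k - 1) % n] + spins[(k + 1) % n])
        # Flip probability w = 1/2 [1 - s_k tanh(beta h)] is consistent with
        # detailed balance for the equilibrium Gibbs distribution.
        w = 0.5 * (1.0 - spins[k] * np.tanh(beta * h))
        if rng.random() < w:
            spins[k] *= -1
        magnetization.append(spins.mean())
    return np.array(magnetization)

m = simulate_chain()
print("mean magnetization over the last 5000 steps:", m[-5000:].mean())
```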

Journal ArticleDOI
TL;DR: An expectation-maximization algorithm for simultaneous truth and performance level estimation (STAPLE) is presented; it considers a collection of segmentations and computes a probabilistic estimate of the true segmentation together with a measure of the performance level represented by each segmentation.
Abstract: Characterizing the performance of image segmentation approaches has been a persistent challenge. Performance analysis is important since segmentation algorithms often have limited accuracy and precision. Interactive drawing of the desired segmentation by human raters has often been the only acceptable approach, and yet suffers from intra-rater and inter-rater variability. Automated algorithms have been sought in order to remove the variability introduced by raters, but such algorithms must be assessed to ensure they are suitable for the task. The performance of raters (human or algorithmic) generating segmentations of medical images has been difficult to quantify because of the difficulty of obtaining or estimating a known true segmentation for clinical data. Although physical and digital phantoms can be constructed for which ground truth is known or readily estimated, such phantoms do not fully reflect clinical images due to the difficulty of constructing phantoms which reproduce the full range of imaging characteristics and normal and pathological anatomical variability observed in clinical data. Comparison to a collection of segmentations by raters is an attractive alternative since it can be carried out directly on the relevant clinical imaging data. However, the most appropriate measure or set of measures with which to compare such segmentations has not been clarified and several measures are used in practice. We present here an expectation-maximization algorithm for simultaneous truth and performance level estimation (STAPLE). The algorithm considers a collection of segmentations and computes a probabilistic estimate of the true segmentation and a measure of the performance level represented by each segmentation. The source of each segmentation in the collection may be an appropriately trained human rater or raters, or may be an automated segmentation algorithm. The probabilistic estimate of the true segmentation is formed by estimating an optimal combination of the segmentations, weighting each segmentation depending upon the estimated performance level, and incorporating a prior model for the spatial distribution of structures being segmented as well as spatial homogeneity constraints. STAPLE is straightforward to apply to clinical imaging data, it readily enables assessment of the performance of an automated image segmentation algorithm, and enables direct comparison of human rater and algorithm performance.

1,923 citations
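As a rough illustration of the EM structure described in the abstract, here is a stripped-down STAPLE-style iteration for binary segmentations that alternates between estimating per-voxel posterior label probabilities and per-rater sensitivity/specificity. The spatial prior and homogeneity constraints of the full method are omitted, and all names and initial values are assumptions made for the sketch.

```python
# Sketch: simplified STAPLE-style EM for binary segmentations (no spatial prior).
import numpy as np

def staple_binary(D, prior=0.5, iters=50):
    """D: (n_raters, n_voxels) array of 0/1 decisions."""
    R, V = D.shape
    p = np.full(R, 0.9)   # sensitivity estimates (initial guess)
    q = np.full(R, 0.9)   # specificity estimates (initial guess)
    for _ in range(iters):
        # E-step: posterior probability that each voxel's true label is 1.
        a = prior * np.prod(np.where(D == 1, p[:, None], 1 - p[:, None]), axis=0)
        b = (1 - prior) * np.prod(np.where(D == 0, q[:, None], 1 - q[:, None]), axis=0)
        w = a / (a + b + 1e-12)
        # M-step: re-estimate each rater's performance from the soft labels.
        p = (D * w).sum(axis=1) / (w.sum() + 1e-12)
        q = ((1 - D) * (1 - w)).sum(axis=1) / ((1 - w).sum() + 1e-12)
    return w, p, q

# Usage: three raters labelling 10 voxels.
D = np.array([[1, 1, 0, 1, 0, 1, 1, 0, 0, 1],
              [1, 0, 0, 1, 0, 1, 1, 0, 1, 1],
              [1, 1, 0, 1, 1, 1, 0, 0, 0, 1]])
truth_prob, sensitivity, specificity = staple_binary(D)
```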

Journal ArticleDOI
TL;DR: This review examines methodologies suited to identifying symptom networks and discusses network analysis techniques that can be used to extract clinically and scientifically useful information from such networks (e.g., which symptom is most central in a person's network).
Abstract: In network approaches to psychopathology, disorders result from the causal interplay between symptoms (e.g., worry → insomnia → fatigue), possibly involving feedback loops (e.g., a person may engage in substance abuse to forget the problems that arose due to substance abuse). The present review examines methodologies suited to identify such symptom networks and discusses network analysis techniques that may be used to extract clinically and scientifically useful information from such networks (e.g., which symptom is most central in a person's network). The authors also show how network analysis techniques may be used to construct simulation models that mimic symptom dynamics. Network approaches naturally explain the limited success of traditional research strategies, which are typically based on the idea that symptoms are manifestations of some common underlying factor, while offering promising methodological alternatives. In addition, these techniques may offer possibilities to guide and evaluate therapeutic interventions.

1,824 citations
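One concrete network-analysis step mentioned above is ranking symptoms by centrality. The toy sketch below computes strength centrality (sum of absolute edge weights) from a weighted symptom adjacency matrix; the symptom names and weights are invented for illustration and do not come from the review.

```python
# Toy sketch: strength centrality of symptoms in a hypothetical weighted network.
import numpy as np

symptoms = ["worry", "insomnia", "fatigue", "concentration", "irritability"]
# Hypothetical symmetric edge weights (e.g., partial correlations).
W = np.array([
    [0.0, 0.4, 0.1, 0.2, 0.3],
    [0.4, 0.0, 0.5, 0.2, 0.1],
    [0.1, 0.5, 0.0, 0.4, 0.2],
    [0.2, 0.2, 0.4, 0.0, 0.3],
    [0.3, 0.1, 0.2, 0.3, 0.0],
])

strength = np.abs(W).sum(axis=1)
for name, s in sorted(zip(symptoms, strength), key=lambda t: -t[1]):
    print(f"{name:15s} strength = {s:.2f}")
```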

Journal ArticleDOI
TL;DR: The theory of critical phenomena in systems at equilibrium is reviewed at an introductory level, with special emphasis on the values of the critical-point exponents α, β, γ, ..., and their interrelations.
Abstract: The theory of critical phenomena in systems at equilibrium is reviewed at an introductory level with special emphasis on the values of the critical point exponents α, β, γ,..., and their interrelations. The experimental observations are surveyed and the analogies between different physical systems - fluids, magnets, superfluids, binary alloys, etc. - are developed phenomenologically. An exact theoretical basis for the analogies follows from the equivalence between classical and quantal `lattice gases' and the Ising and Heisenberg-Ising magnetic models. General rigorous inequalities for critical exponents at and below Tc are derived. The nature and validity of the `classical' (phenomenological and mean field) theories are discussed, their predictions being contrasted with the exact results for plane Ising models, which are summarized concisely. Pade approximant and ratio techniques applied to appropriate series expansions lead to precise critical-point estimates for the three-dimensional Heisenberg and Ising models (tables of data are presented). With this background a critique is presented of recent theoretical ideas: namely, the `droplet' picture of the critical point and the `homogeneity' and `scaling' hypotheses. These lead to a `law of corresponding states' near a critical point and to relations between the various exponents which suggest that perhaps only two or three exponents might be algebraically independent for any system.

1,792 citations
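The "relations between the various exponents" mentioned in the abstract are, in their modern equality form, the standard scaling laws; the review itself derives some of them as rigorous inequalities (e.g., α′ + 2β + γ′ ≥ 2). The following is a compact summary of these standard relations, given here as background rather than quoted from the review.

```latex
% Standard critical-point exponent definitions, t = (T - T_c)/T_c:
%   C ~ |t|^{-\alpha},  M ~ (-t)^{\beta},  \chi ~ |t|^{-\gamma},
%   M ~ H^{1/\delta} at t = 0,  \xi ~ |t|^{-\nu},  G(r) ~ r^{-(d-2+\eta)} at t = 0.
% Requires amsmath.
\begin{align}
  \alpha + 2\beta + \gamma &= 2      && \text{(Rushbrooke)} \\
  \gamma &= \beta(\delta - 1)        && \text{(Widom)} \\
  \gamma &= \nu(2 - \eta)            && \text{(Fisher)} \\
  d\,\nu &= 2 - \alpha               && \text{(Josephson hyperscaling)}
\end{align}
```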