Entropy (energy dispersal)
About: Entropy (energy dispersal) is a research topic. Over its lifetime, 4468 publications have been published on this topic, receiving 78008 citations.
Papers published on a yearly basis
TL;DR: Results indicate that the normalised entropy measure provides significantly improved behaviour over a range of imaged fields of view.
Abstract: This paper is concerned with the development of entropy-based registration criteria for automated 3D multi-modality medical image alignment. In this application, where misalignment can be large with respect to the imaged field of view, invariance to overlap statistics is an important consideration. Current entropy measures are reviewed and a normalised measure is proposed which is simply the ratio of the sum of the marginal entropies to the joint entropy. The effect of changing overlap on current entropy measures and this normalised measure are compared using a simple image model and experiments on clinical image data. Results indicate that the normalised entropy measure provides significantly improved behaviour over a range of imaged fields of view.
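The proposed measure, (H(A) + H(B)) / H(A, B), can be sketched for two co-registered intensity images as follows. This is a minimal illustration, not the authors' implementation; the function name, bin count, and histogram-based entropy estimate are assumptions.

```python
import numpy as np

def normalised_entropy(img_a, img_b, bins=32):
    """Normalised measure (H(A) + H(B)) / H(A, B) over the overlap region.

    img_a, img_b: equally shaped intensity arrays. The value lies in
    (1, 2]; it is maximal when the images are perfectly aligned.
    """
    # Joint histogram of corresponding pixel intensities.
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1)  # marginal of image A
    p_b = p_ab.sum(axis=0)  # marginal of image B

    def entropy(p):
        p = p[p > 0]  # 0 * log 0 is taken as 0
        return -np.sum(p * np.log(p))

    return (entropy(p_a) + entropy(p_b)) / entropy(p_ab)
```

For an image compared with itself the joint entropy equals each marginal entropy, so the measure is exactly 2; for statistically independent images it approaches 1, which is why it is robust to the amount of overlap.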
TL;DR: Entropy balancing, a data preprocessing method to achieve covariate balance in observational studies with binary treatments, obviates the need for continual balance checking and iterative searching over propensity score models that may stochastically balance the covariate moments.
Abstract: This paper proposes entropy balancing, a data preprocessing method to achieve covariate balance in observational studies with binary treatments. Entropy balancing relies on a maximum entropy reweighting scheme that calibrates unit weights so that the reweighted treatment and control group satisfy a potentially large set of prespecified balance conditions that incorporate information about known sample moments. Entropy balancing thereby exactly adjusts inequalities in representation with respect to the first, second, and possibly higher moments of the covariate distributions. These balance improvements can reduce model dependence for the subsequent estimation of treatment effects. The method assures that balance improves on all covariate moments included in the reweighting. It also obviates the need for continual balance checking and iterative searching over propensity score models that may stochastically balance the covariate moments. We demonstrate the use of entropy balancing with Monte Carlo simulations and empirical applications.
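The maximum entropy reweighting scheme admits a convex dual in which the weights are exponential tilts of the base weights. The following is a minimal sketch for matching first moments only, using plain gradient descent on the dual; the function name and optimiser are assumptions (production implementations such as the authors' use Newton-type updates and handle higher moments by appending transformed columns).

```python
import numpy as np

def entropy_balance(X_control, target_means, n_iter=5000, lr=0.1):
    """Sketch of entropy balancing in its dual form.

    Finds control-unit weights w_i proportional to exp(lam . z_i) so that
    the weighted control means of the columns of X_control match
    target_means (e.g. the treated-group means). Higher moments can be
    balanced by appending columns such as X**2 and matching their means.
    """
    Z = X_control - target_means  # centre covariates at the target moments
    lam = np.zeros(Z.shape[1])
    w = np.full(len(Z), 1.0 / len(Z))
    for _ in range(n_iter):
        w = np.exp(Z @ lam)
        w /= w.sum()
        grad = w @ Z          # remaining weighted imbalance in each moment
        lam -= lr * grad      # descend the (convex) log-sum-exp dual
    return w
```

Because the dual is globally convex, the weighted moments converge to the prespecified targets whenever they lie inside the convex hull of the control covariates.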
TL;DR: The authors outline a new scheme for parameterizing polarimetric scattering problems that relies on an eigenvalue analysis of the coherency matrix and employs a three-level Bernoulli statistical model to generate estimates of the average target scattering matrix parameters from the data.
Abstract: The authors outline a new scheme for parameterizing polarimetric scattering problems, which has application in the quantitative analysis of polarimetric SAR data. The method relies on an eigenvalue analysis of the coherency matrix and employs a three-level Bernoulli statistical model to generate estimates of the average target scattering matrix parameters from the data. The scattering entropy is a key parameter in determining the randomness in this model and is seen as a fundamental parameter in assessing the importance of polarimetry in remote sensing problems. The authors show application of the method to some important classical random media scattering problems and apply it to POLSAR data from the NASA/JPL AIRSAR database.
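The scattering entropy from the eigenvalue analysis can be sketched as follows: the eigenvalues of the 3x3 Hermitian coherency matrix are normalised into pseudo-probabilities and an entropy with base-3 logarithm is taken, giving a value in [0, 1]. The function name and the round-off guard are assumptions.

```python
import numpy as np

def scattering_entropy(T):
    """Scattering entropy H of a 3x3 Hermitian coherency matrix T.

    H = -sum_i p_i * log3(p_i), with p_i the normalised eigenvalues.
    H = 0 means a single deterministic scattering mechanism;
    H = 1 means fully random (depolarising) scattering.
    """
    lam = np.linalg.eigvalsh(T)      # real eigenvalues, ascending order
    lam = np.clip(lam, 0.0, None)    # guard tiny negative round-off
    p = lam / lam.sum()
    p = p[p > 0]                     # 0 * log 0 contributes nothing
    return float(-np.sum(p * np.log(p) / np.log(3)))
```

A rank-one coherency matrix (one dominant mechanism) yields H = 0, while equal eigenvalues yield H = 1, matching the role of entropy as a randomness measure in the model.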
15 Jun 2019
TL;DR: This work addresses unsupervised domain adaptation in semantic segmentation with losses based on the entropy of the pixel-wise predictions, proposing two novel, complementary methods using (i) an entropy loss and (ii) an adversarial loss, respectively.
Abstract: Semantic segmentation is a key problem for many computer vision tasks. While approaches based on convolutional neural networks constantly break new records on different benchmarks, generalizing well to diverse testing environments remains a major challenge. In numerous real-world applications, there is indeed a large gap between data distributions in train and test domains, which results in severe performance loss at run-time. In this work, we address the task of unsupervised domain adaptation in semantic segmentation with losses based on the entropy of the pixel-wise predictions. To this end, we propose two novel, complementary methods using (i) entropy loss and (ii) adversarial loss respectively. We demonstrate state-of-the-art performance in semantic segmentation on two challenging “synthetic-2-real” set-ups and show that the approach can also be used for detection.
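The entropy-loss idea can be sketched as follows: for each pixel of an unlabelled target-domain image, compute the Shannon entropy of the predicted class distribution and minimise its mean, pushing the network toward confident predictions. This is an illustrative numpy sketch under assumed conventions (normalisation by log C, channel-last layout), not the authors' exact implementation.

```python
import numpy as np

def entropy_loss(probs, eps=1e-12):
    """Mean normalised entropy of pixel-wise softmax predictions.

    probs: array of shape (H, W, C) holding per-pixel class probabilities.
    Returns a value in [0, 1]: 0 for one-hot (confident) predictions,
    1 for uniform (maximally uncertain) predictions.
    """
    C = probs.shape[-1]
    # Per-pixel Shannon entropy; eps avoids log(0).
    ent = -np.sum(probs * np.log(probs + eps), axis=-1)
    return float(np.mean(ent) / np.log(C))  # normalise by max entropy
```

In training, this scalar would be computed on target-domain batches and added to the supervised source-domain loss; the paper's adversarial variant instead discriminates the per-pixel entropy maps themselves.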
TL;DR: In this paper, a new version of smoothed particle hydrodynamics (SPH) was derived that conserves both energy and entropy if smoothing lengths are allowed to adapt freely to the local mass resolution.
Abstract: We discuss differences in simulation results that arise between the use of either the thermal energy or the entropy as an independent variable in smoothed particle hydrodynamics (SPH). In this context, we derive a new version of SPH that, when appropriate, manifestly conserves both energy and entropy if smoothing lengths are allowed to adapt freely to the local mass resolution. To test various formulations of SPH, we consider point-like energy injection, as in certain models of supernova feedback, and find that powerful explosions are well represented by SPH even when the energy is deposited into a single particle, provided that the entropy equation is integrated. If the thermal energy is instead used as an independent variable, unphysical solutions can be obtained for this problem. We also examine the radiative cooling of gas spheres that collapse and virialize in isolation, and of haloes that form in cosmological simulations of structure formation. When applied to these problems, the thermal energy version of SPH leads to substantial overcooling in haloes that are resolved with up to a few thousand particles, while the entropy formulation is biased only moderately low for these haloes under the same circumstances. For objects resolved with much larger particle numbers, the two approaches yield consistent results. We trace the origin of the differences to systematic resolution effects in the outer parts of cooling flows. When the thermal energy equation is integrated and the resolution is low, the compressional heating of the gas in the inflow region is underestimated, violating entropy conservation and improperly accelerating cooling. The cumulative effect of this overcooling can be significant. 
In cosmological simulations of moderate size, we find that the fraction of baryons which cool and condense can be reduced by up to a factor ∼2 if the entropy equation is employed rather than the thermal energy equation, partly explaining discrepancies with semi-analytic treatments of galaxy formation. We also demonstrate that the entropy method leads to a greatly reduced scatter in the density–temperature relation of the low-density Lyα forest relative to the thermal energy approach, in accord with theoretical expectations.
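The two formulations above differ in which thermodynamic variable each SPH particle carries. A minimal sketch of the relation, assuming an ideal gas with P = (gamma - 1) * rho * u (function names are mine): the entropy formulation integrates the entropic function A(s) = P / rho^gamma, which stays constant along adiabatic flow, whereas the thermal-energy formulation integrates u directly.

```python
GAMMA = 5.0 / 3.0  # adiabatic index of a monatomic ideal gas

def entropic_function(u, rho, gamma=GAMMA):
    """Entropic function A(s) = P / rho**gamma, with P = (gamma-1)*rho*u.

    A is conserved along adiabatic flow, so integrating it (rather than u)
    manifestly conserves entropy between dissipative events.
    """
    return (gamma - 1.0) * u * rho ** (1.0 - gamma)

def thermal_energy(A, rho, gamma=GAMMA):
    """Recover the specific thermal energy u from A and the density."""
    return A * rho ** (gamma - 1.0) / (gamma - 1.0)
```

Under adiabatic compression A is held fixed, so doubling the density raises u by a factor 2**(gamma - 1); mis-estimating this compressional heating at low resolution is exactly the entropy violation the abstract attributes to the thermal-energy formulation.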