Showing papers on "Entropy (information theory)" published in 2022


Book
17 Jan 2022
TL;DR: This book clarifies how information theory works behind thermodynamics and sheds modern light on it, presenting self-contained and rigorous proofs of several fundamental properties of entropies, divergences, and majorization.
Abstract: In recent decades, it has been revealed that there is a rich information-theoretic structure in the thermodynamics of out-of-equilibrium systems, in both the classical and quantum regimes. This has led to fruitful interplay among statistical physics, quantum information theory, and mathematical theories including matrix analysis and asymptotic probability theory. The main purpose of this book is to clarify how information theory works behind thermodynamics and to shed modern light on it. We focus on both purely information-theoretic concepts and their physical implications: we present self-contained and rigorous proofs of several fundamental properties of entropies, divergences, and majorization. We also discuss modern formulations of thermodynamics, especially from the perspectives of stochastic thermodynamics and the resource theory of thermodynamics. Resource theory is a recently developed branch of quantum information theory that quantifies (energetically or information-theoretically) "useful resources." We show that resource theory has an intrinsic connection to various fundamental ideas of mathematics and information theory. This book is not intended to be a comprehensive review of the field, but serves as a concise introduction to several important ingredients of the information-theoretic formulation of thermodynamics.
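To make the majorization claim concrete: if a distribution p majorizes q, then the Shannon entropy of p is at most that of q (Schur concavity), one of the fundamental properties of the kind the abstract alludes to. Below is a minimal sketch of that check; it is an illustration, not code from the book:

```python
import numpy as np

def majorizes(p, q):
    """Check whether distribution p majorizes q: the partial sums of the
    descending-sorted entries of p dominate those of q."""
    p_sorted = np.sort(p)[::-1]
    q_sorted = np.sort(q)[::-1]
    return np.all(np.cumsum(p_sorted) >= np.cumsum(q_sorted) - 1e-12)

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# If p majorizes q, then H(p) <= H(q): majorization orders
# distributions by "disorder".
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.35, 0.25])
print(majorizes(p, q))                         # True
print(shannon_entropy(p) <= shannon_entropy(q))  # True
```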

35 citations


Journal ArticleDOI
TL;DR: In this article, an efficient computer-aided technological solution for detecting ADHD from acquired electroencephalography (EEG) signals, based on different nonlinear entropy estimators and an artificial neural network classifier, is investigated.
Abstract: Attention deficit hyperactivity disorder (ADHD) is a prevalent behavioral, cognitive, neurodevelopmental pediatric disorder. Clinical evaluations, symptom surveys, and neuropsychological assessments are some of the ADHD assessment methods, which are time-consuming processes and carry a certain degree of uncertainty. This research investigates an efficient computer-aided technological solution for detecting ADHD from acquired electroencephalography (EEG) signals based on different nonlinear entropy estimators and an artificial neural network classifier. Features extracted through fuzzy entropy, log energy entropy, permutation entropy, SURE entropy, and Shannon entropy are analyzed for effective discrimination of ADHD subjects from the control group. The experimental results confirm that the proposed techniques can effectively detect and classify ADHD subjects. Permutation entropy gives the highest classification accuracy of 99.82%, sensitivity of 98.21%, and specificity of 98.82%. Also, the potency of the different entropy estimators derived from the t-test reflects that Shannon entropy has a higher P-value (> .001) and therefore a more limited scope than the other entropy estimators for ADHD diagnosis. Furthermore, the considerable variance of the potential features obtained in the frontal polar (FP) and frontal (F) lobes using different entropy estimators under the eyes-closed condition shows that signals recorded at these lobes carry more significance in distinguishing ADHD from normal subjects.
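The paper's pipeline is not reproduced here; as a sketch of the best-performing feature above, the following is a minimal Bandt-Pompe permutation entropy implementation. The EEG channel is a random placeholder, and the embedding parameters m and tau are illustrative defaults, not the paper's settings:

```python
import numpy as np
from math import factorial, log

def permutation_entropy(signal, m=3, tau=1):
    """Bandt-Pompe permutation entropy, normalized to [0, 1]:
    Shannon entropy of the distribution of ordinal patterns of
    m delayed samples."""
    n = len(signal) - (m - 1) * tau
    counts = {}
    for i in range(n):
        # Ordinal pattern: ranking of the m delayed samples.
        pattern = tuple(np.argsort(signal[i:i + m * tau:tau]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values())) / n
    return -np.sum(probs * np.log(probs)) / log(factorial(m))

# Placeholder "EEG channel" (white noise has entropy near 1;
# more structured signals score lower).
rng = np.random.default_rng(0)
eeg_channel = rng.standard_normal(2560)
print(permutation_entropy(eeg_channel, m=3, tau=1))
```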

15 citations


Journal ArticleDOI
TL;DR: In this paper, a new measure of discrimination, called Tsallis extropy, is introduced; several of its properties and bounds are discussed, and an application of this extropy to pattern recognition is demonstrated.
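The paper's exact definition is not reproduced in the summary above, so the sketch below is hedged: it computes the classical discrete extropy of Lad, Sanfilippo, and Agro, together with one plausible Tsallis-style generalization that recovers extropy in the limit alpha -> 1. Treat the tsallis_extropy form as an assumption, not the paper's formula:

```python
import numpy as np

def extropy(p):
    """Discrete extropy (Lad, Sanfilippo and Agro, 2015):
    J(p) = -sum_i (1 - p_i) * log(1 - p_i)."""
    comp = 1.0 - np.asarray(p)
    comp = comp[comp > 0]
    return -np.sum(comp * np.log(comp))

def tsallis_extropy(p, alpha):
    """ASSUMED form of a Tsallis-style extropy, not verified against
    the paper: (n - 1 - sum_i (1 - p_i)^alpha) / (alpha - 1),
    which tends to extropy as alpha -> 1 (by L'Hopital)."""
    assert alpha != 1
    comp = 1.0 - np.asarray(p)
    return (len(p) - 1 - np.sum(comp ** alpha)) / (alpha - 1)

p = np.array([0.5, 0.3, 0.2])
print(extropy(p), tsallis_extropy(p, alpha=0.99))  # close to each other
```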

11 citations


Journal ArticleDOI
TL;DR: In this paper, a process-oriented probabilistic linguistic decision-making framework, based on the probabilistic linguistic multi-alternative decision field theory (PLMDFT) and deviation entropy, is proposed for multi-attribute decision making (MADM) problems.
Abstract: As an effective tool for describing qualitative evaluations, the probabilistic linguistic term set (PLTS) can identify different preference degrees for the possible linguistic evaluations. For multi-attribute decision making (MADM) problems based on PLTSs, making a decision is not an instantaneous act but requires time for information processing. Considering the dynamic nature of decision-making behavior, this study develops a process-oriented probabilistic linguistic decision-making framework. First, we introduce the parameters of the probabilistic linguistic multi-alternative decision field theory (PLMDFT) model, along with an improved decision rule for selecting the optimal alternative(s). Then, a deviation entropy-based model is developed to determine attribute weights. Furthermore, we construct a probabilistic linguistic decision-making framework based on the PLMDFT and deviation entropy. Finally, the constructed framework is applied to an emergency scheme selection problem. A discussion and comparative analysis are provided to demonstrate the validity of the proposed framework.
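The paper's deviation-entropy model is specific to PLTSs; as a rough illustration of the underlying idea of entropy-driven attribute weighting, here is the classical Shannon entropy weight method for a crisp decision matrix. This is a generic sketch, not the proposed model:

```python
import numpy as np

def entropy_weights(X):
    """Classical entropy weight method for a decision matrix X
    (rows = alternatives, columns = attributes, benefit-type values).
    Attributes whose values differ more across alternatives (lower
    entropy) receive more weight. Generic illustration only."""
    P = X / X.sum(axis=0)                      # normalize each column
    k = 1.0 / np.log(X.shape[0])
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    E = -k * np.sum(P * logs, axis=0)          # column entropies in [0, 1]
    d = 1.0 - E                                # degree of divergence
    return d / d.sum()                         # attribute weights

# Hypothetical 3 alternatives x 3 attributes:
X = np.array([[7.0, 0.3, 90.0],
              [6.5, 0.5, 70.0],
              [8.0, 0.4, 80.0]])
print(entropy_weights(X))
```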

10 citations


Journal ArticleDOI
TL;DR: This paper proposes a novel entropy-based metric utilizing the Jensen–Shannon divergence, an effective and efficient tool for solving inverse problems in the presence of mixed uncertainty, such as is encountered in the context of imprecise probabilities.
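As a minimal sketch of the building block the metric rests on (not the paper's metric itself), the Jensen–Shannon divergence between two discrete distributions is a symmetrized, smoothed Kullback–Leibler divergence; with base-2 logarithms it is bounded in [0, 1], and its square root is a proper metric:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL to the mixture m,
    symmetric in p and q and always finite."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.10, 0.40, 0.50])
q = np.array([0.80, 0.15, 0.05])
print(js_divergence(p, q), np.sqrt(js_divergence(p, q)))  # divergence, metric
```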

8 citations


Journal ArticleDOI
TL;DR: In this article, the authors employ Shannon entropy to capture the informativeness of sentences, using non-negative matrix factorization to reveal probability distributions for computing the entropy of terms, topics, and sentences in latent space.
Abstract: Automatic text summarization aims to cut down readers’ time and cognitive effort by reducing the content of a text document without compromising its essence. Informativeness is therefore the prime attribute of a document summary generated by an algorithm, and selecting sentences that capture the essence of a document is the primary goal of extractive document summarization. In this paper, we employ Shannon’s entropy to capture the informativeness of sentences. We employ Non-negative Matrix Factorization (NMF) to reveal probability distributions for computing the entropy of terms, topics, and sentences in latent space. We present an information-theoretic interpretation of the computed entropy, which is the bedrock of the proposed E-Summ algorithm, an unsupervised method for extractive document summarization. The algorithm systematically applies information-theoretic principles to select informative sentences from important topics in the document. The proposed algorithm is generic and fast, and hence amenable to real-time summarization of documents. Furthermore, it is domain- and collection-independent and agnostic to the language of the document. Benefiting from strictly positive NMF factor matrices, the E-Summ algorithm is also transparent and explainable. We use the standard ROUGE toolkit for performance evaluation of the proposed method on four well-known public datasets. We also perform a quantitative assessment of E-Summ summary quality by computing its semantic similarity with respect to the original document. Our investigation reveals that although the NMF-based information-theoretic approach to document summarization promises efficient, explainable, and language-independent text summarization, it needs to be bolstered to match the performance of deep neural methods.
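E-Summ's published scoring is more involved than can be shown here, but the following sketch illustrates the general recipe the abstract describes: factorize a sentence–term matrix with NMF, turn the non-negative factors into probability distributions, and use entropies in the latent topic space to pick sentences. The scoring heuristic below is an illustrative assumption, not the published algorithm:

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

sentences = [
    "Entropy measures the average information content of a source.",
    "NMF factorizes a non-negative matrix into two non-negative factors.",
    "A summary should preserve the essence of the source document.",
    "Informative sentences are selected from important topics.",
]
A = TfidfVectorizer().fit_transform(sentences)              # sentences x terms
W = NMF(n_components=2, init="nndsvda", random_state=0).fit_transform(A)

P = W + 1e-12
P = P / P.sum(axis=1, keepdims=True)                        # p(topic | sentence)
H = -np.sum(P * np.log2(P), axis=1)                         # sentence entropy over topics
topic_importance = W.sum(axis=0) / W.sum()

# Illustrative heuristic: from the most important topic, take the sentence
# with the strongest, most topic-focused (low-entropy) membership.
t = int(np.argmax(topic_importance))
score = W[:, t] * (1.0 - H / np.log2(W.shape[1]))
print(sentences[int(np.argmax(score))])
```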

5 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed a method for determining the interaction information between three variables using total correlation and conditional mutual information, which can capture both well-known and yet-to-be-discovered functional brain connections.
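As a hedged sketch of the quantity involved (not the authors' estimation method), the interaction information, or co-information, of three discrete variables can be computed directly from the joint distribution via an inclusion–exclusion of entropies; note that the sign convention varies across the literature:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a (possibly multi-dimensional) pmf."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def interaction_information(pxyz):
    """Co-information I(X;Y;Z) = I(X;Y) - I(X;Y|Z), computed via
    inclusion-exclusion of entropies; positive values indicate
    redundancy, negative values synergy."""
    px, py, pz = pxyz.sum((1, 2)), pxyz.sum((0, 2)), pxyz.sum((0, 1))
    pxy, pxz, pyz = pxyz.sum(2), pxyz.sum(1), pxyz.sum(0)
    return (H(px) + H(py) + H(pz)
            - H(pxy) - H(pxz) - H(pyz)
            + H(pxyz))

# XOR example: Z = X xor Y with uniform X, Y is pure synergy (-1 bit).
pxyz = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        pxyz[x, y, x ^ y] = 0.25
print(interaction_information(pxyz))   # -1.0
```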

5 citations


Journal ArticleDOI
TL;DR: The small-ball probabilities of Gaussian rough paths are studied, and upper and lower bounds are found for the rate of convergence of an empirical rough Gaussian measure to its true law in path space.
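For readers unfamiliar with the term, the small-ball problem concerns the following generic quantity, stated here for a norm on path space rather than in the rough-path-specific form used by the paper:

```latex
% Small-ball probability of a process (X_t): the decay rate of the
% probability that the path stays in an eps-ball around the origin,
% summarized by the small-ball function phi.
\[
  \phi(\varepsilon) \;=\; -\log \mathbb{P}\!\left( \|X\| \le \varepsilon \right),
  \qquad \varepsilon \downarrow 0 .
\]
```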

3 citations


Journal ArticleDOI
TL;DR: In this article, the authors improved entropy bounds for a self-timed ring-based true random number generator, taking the timing of the reference clock signals into account, and used parity filters as post-processing blocks.
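The paper's bounds themselves are analytical, but the post-processing block it refers to is easy to sketch: a parity filter XORs k consecutive raw bits into one output bit, trading throughput for entropy per bit (by the piling-up lemma, an input bias e shrinks to 2^(k-1) * e^k for independent bits). A minimal sketch with a placeholder bitstream:

```python
from functools import reduce
from operator import xor

def parity_filter(bits, k):
    """Parity (XOR) post-processing for a raw TRNG bitstream:
    each output bit is the XOR of k consecutive non-overlapping
    input bits, raising entropy per output bit at 1/k throughput."""
    return [reduce(xor, bits[i:i + k])
            for i in range(0, len(bits) - k + 1, k)]

raw = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]   # placeholder raw bits
print(parity_filter(raw, k=4))               # 3 output bits
```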

1 citation


Journal ArticleDOI
TL;DR: In this paper, Li et al. propose a computational framework to evaluate the gaps between different domains (e.g., judging which source domain is closer to the target domain), based on the observation that, given a classifier well trained on the source domain, the entropy of the classification scores at its output layer can be used as an indicator of the domain gap.
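A hedged sketch of the observation (the framework's full scoring is not reproduced here): evaluate a source-trained classifier on unlabeled data from a candidate domain and average the Shannon entropy of its softmax outputs; by the paper's observation, higher average entropy suggests a larger domain gap. The model and loader below are assumed, user-supplied PyTorch objects:

```python
import torch
import torch.nn.functional as F

def mean_prediction_entropy(model, loader, device="cpu"):
    """Average Shannon entropy (in nats) of a source-trained
    classifier's softmax outputs over a candidate domain.
    Sketch only: `model` and `loader` are user-supplied."""
    model.eval()
    total, n = 0.0, 0
    with torch.no_grad():
        for x, _ in loader:
            probs = F.softmax(model(x.to(device)), dim=1)
            ent = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
            total += ent.sum().item()
            n += x.size(0)
    return total / n

# To choose among candidate source domains, score each source-trained
# model on the unlabeled target set and prefer the lowest mean entropy.
```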

1 citation