
Showing papers on "Entropy (information theory)" published in 2006


Journal ArticleDOI
TL;DR: The paper deals with the f-divergences of Csiszar, which generalize the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence; basic properties of f-divergences, including relations to the decision errors, are proved in a new manner that replaces the classical Jensen inequality.
Abstract: The paper deals with the f-divergences of Csiszar generalizing the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence. All basic properties of f-divergences including relations to the decision errors are proved in a new manner replacing the classical Jensen inequality by a new generalized Taylor expansion of convex functions. Some new properties are proved too, e.g., relations to the statistical sufficiency and deficiency. The generalized Taylor expansion also shows very easily that all f-divergences are average statistical informations (differences between prior and posterior Bayes errors) mutually differing only in the weights imposed on various prior distributions. The statistical information introduced by De Groot and the classical information of Shannon are shown to be extremal cases corresponding to alpha=0 and alpha=1 in the class of the so-called Arimoto alpha-informations introduced in this paper for 0
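
For reference, the family in question, in one standard normalization (conventions for constants vary across authors): for a convex function f with f(1) = 0,

```latex
D_f(P\,\|\,Q) \;=\; \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\mathrm{d}Q .
```

Taking f(t) = t log t recovers Kullback's discrimination information, f(t) = |t − 1|/2 the total variation distance, f(t) = (√t − 1)² the Hellinger divergence, and f(t) = (t − 1)² the Pearson divergence.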

641 citations


Journal ArticleDOI
TL;DR: A new generalized correlation measure is developed that includes the information of both the distribution and that of the time structure of a stochastic process.
Abstract: With an abundance of tools based on kernel methods and information-theoretic learning, a void still exists in incorporating both the time structure and the statistical distribution of a time series in the same functional measure. In this paper, a new generalized correlation measure is developed that includes the information of both the distribution and the time structure of a stochastic process. It is shown how this measure can be interpreted from a kernel-methods as well as from an information-theoretic-learning point of view, demonstrating some relevant properties. To underscore the effectiveness of the new measure, a simple blind equalization problem is considered using a coded signal.
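
The measure described here is commonly known as correntropy. As a rough illustration, a sample estimator with a Gaussian kernel might look as follows; the kernel choice, the kernel size sigma, and the normalization are assumptions of this sketch, not prescriptions from the paper:

```python
import numpy as np

def correntropy(x, max_lag, sigma=1.0):
    """Sample estimate of the generalized correlation V[m] = E[k(x_t, x_{t-m})]
    with a Gaussian kernel k. A sketch under assumed conventions; the paper's
    estimator may differ in normalization and kernel-size selection."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    v = np.empty(max_lag + 1)
    for m in range(max_lag + 1):
        d = x[m:] - x[:n - m]                     # lag-m differences
        v[m] = np.mean(np.exp(-d**2 / (2 * sigma**2)))
    return v

# toy usage: correntropy of a noisy sinusoid
t = np.arange(1000)
signal = np.sin(0.1 * t) + 0.1 * np.random.randn(1000)
print(correntropy(signal, max_lag=5, sigma=0.5))
```

Unlike the ordinary autocorrelation, the kernel expectation is sensitive to the full distribution of the lagged differences, not just their second moment, which is the property the paper exploits.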

395 citations


Book
01 Jan 2006
TL;DR: In this article, the authors link entropy to estimation of distribution algorithms and propose a parallel island model for the quadratic assignment problem and a hybrid Cooperative Search Evolutionary Algorithm.
Abstract: Linking Entropy to Estimation of Distribution Algorithms.- Entropy-based Convergence Measurement in Discrete Estimation of Distribution Algorithms.- Real-coded Bayesian Optimization Algorithm.- The CMA Evolution Strategy: A Comparing Review.- Estimation of Distribution Programming: EDA-based Approach to Program Generation.- Multi-objective Optimization with the Naive MIDEA.- A Parallel Island Model for Estimation of Distribution Algorithms.- GA-EDA: A New Hybrid Cooperative Search Evolutionary Algorithm.- Bayesian Classifiers in Optimization: An EDA-like Approach.- Feature Ranking Using an EDA-based Wrapper Approach.- Learning Linguistic Fuzzy Rules by Using Estimation of Distribution Algorithms as the Search Engine in the COR Methodology.- Estimation of Distribution Algorithm with 2-opt Local Search for the Quadratic Assignment Problem.

365 citations


Journal ArticleDOI
TL;DR: Alternative sequential and nonsequential versions of robust control theory imply identical robust decision rules that are dynamically consistent in a useful sense.

354 citations


Journal ArticleDOI
TL;DR: A novel unsupervised, information-theoretic, adaptive filter that improves the predictability of pixel intensities from their neighborhoods by decreasing their joint entropy and can thereby restore a wide spectrum of images.
Abstract: Image restoration is an important and widely studied problem in computer vision and image processing. Various image filtering strategies have been effective, but invariably make strong assumptions about the properties of the signal and/or degradation. Hence, these methods lack the generality to be easily applied to new applications or diverse image collections. This paper describes a novel unsupervised, information-theoretic, adaptive filter (UINTA) that improves the predictability of pixel intensities from their neighborhoods by decreasing their joint entropy. In this way, UINTA automatically discovers the statistical properties of the signal and can thereby restore a wide spectrum of images. The paper describes the formulation to minimize the joint entropy measure and presents several important practical considerations in estimating neighborhood statistics. It presents a series of results on both real and synthetic data along with comparisons with state-of-the-art techniques, including novel applications to medical image processing.

298 citations


Journal ArticleDOI
TL;DR: This paper introduces the concepts of information entropy, rough entropy, knowledge granulation, and granularity measure in incomplete information systems, gives their important properties, and establishes the relationships among these concepts.
Abstract: Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances that are characterized by vagueness and uncertainty. Rough set theory uses a table called an information system, and knowledge is defined as classifications of an information system. In this paper, we introduce the concepts of information entropy, rough entropy, knowledge granulation and granularity measure in incomplete information systems, give their important properties, and establish the relationships among these concepts. The relationship between the information entropy E(A) and the knowledge granulation GK(A) of knowledge A can be expressed as E(A) + GK(A) = 1, and the relationship between the granularity measure G(A) and the rough entropy E_r(A) of knowledge A can be expressed as G(A) + E_r(A) = log2|U|. The conclusions in Liang and Shi (2004) are special instances of the results in this paper. Furthermore, two inequalities, −log2 GK(A) ≤ G(A) and E_r(A) ≤ log2(|U|(1 − E(A))), about the measures GK, G, E and...
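
In display form, the relationships stated above read

```latex
E(A) + GK(A) = 1, \qquad G(A) + E_r(A) = \log_2 |U| , \\[4pt]
-\log_2 GK(A) \le G(A), \qquad E_r(A) \le \log_2\!\bigl(|U|\,(1 - E(A))\bigr),
```

where |U| denotes the cardinality of the universe U of the information system.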

281 citations


Journal ArticleDOI
TL;DR: Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillation.
Abstract: We describe a method based on the principle of entropy maximization to identify the gene interaction network with the highest probability of giving rise to experimentally observed transcript profiles. In its simplest form, the method yields the pairwise gene interaction network, but it can also be extended to deduce higher-order interactions. Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillations. The success of the present approach in extracting meaningful genetic connections suggests that the maximum entropy principle is a useful concept for understanding living systems, as it is for other complex, nonequilibrium systems.
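
In its pairwise form, entropy maximization subject to observed first- and second-order statistics yields a Boltzmann-like distribution. A generic statement, with notation assumed here for illustration rather than taken from the authors:

```latex
P(\mathbf{x}) = \frac{1}{Z}\,\exp\Bigl(\sum_i h_i x_i + \sum_{i<j} J_{ij}\, x_i x_j\Bigr),
```

where the fields h_i and couplings J_ij are Lagrange multipliers fixed by matching the model's first and second moments to the transcript data, and a nonzero J_ij is read as a pairwise gene interaction.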

273 citations


Journal ArticleDOI
TL;DR: It is shown that the various algorithms used for the calculation of entropy and complexity actually measure different properties of the signal, and ShEn tends to increase while the other tested measures decrease with deepening sedation.
Abstract: Entropy and complexity of the electroencephalogram (EEG) have recently been proposed as measures of depth of anesthesia and sedation. Using surrogate data of predefined spectrum and probability distribution, we show that the various algorithms used for the calculation of entropy and complexity actually measure different properties of the signal. The tested methods, Shannon entropy (ShEn), spectral entropy, approximate entropy (ApEn), Lempel-Ziv complexity (LZC), and Higuchi fractal dimension (HFD), are then applied to the EEG signal recorded during sedation in the intensive care unit (ICU). It is shown that the applied measures behave in a different manner when compared with the clinical depth-of-sedation measure, the Ramsay score: ShEn tends to increase while the other tested measures decrease with deepening sedation. ApEn, LZC, and HFD are highly sensitive to the presence of high-frequency components in the EEG signal.
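
Of the measures compared, spectral entropy is the most compact to state: the Shannon entropy of the power spectrum treated as a probability distribution. A minimal sketch assuming a plain FFT periodogram (published EEG implementations add windowing and band selection):

```python
import numpy as np

def spectral_entropy(x, normalize=True):
    """Shannon entropy of the normalized power spectrum of x.
    A sketch; real EEG pipelines typically window the signal and
    restrict the sum to a frequency band of interest."""
    psd = np.abs(np.fft.rfft(x))**2
    p = psd / psd.sum()              # spectrum as a probability mass function
    p = p[p > 0]
    h = -np.sum(p * np.log2(p))
    if normalize:                    # scale to [0, 1] by the maximum log2(bins)
        h /= np.log2(len(psd))
    return h
```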

251 citations


Journal ArticleDOI
TL;DR: A theory about fuzzy probabilistic approximation spaces is proposed in this paper, which combines three types of uncertainty: probability, fuzziness, and roughness into a rough set model.
Abstract: Rough set theory has proven to be an efficient tool for modeling and reasoning with uncertain information. By introducing probability into fuzzy approximation spaces, a theory of fuzzy probabilistic approximation spaces is proposed in this paper, which combines three types of uncertainty, probability, fuzziness, and roughness, into a single rough set model. We introduce Shannon's entropy to measure the information quantity implied in a Pawlak approximation space, and then present a novel representation of Shannon's entropy with a relation matrix. Based on the modified formulas, some generalizations of the entropy are proposed to calculate the information in a fuzzy approximation space and a fuzzy probabilistic approximation space, respectively. As a result, uniform representations of approximation spaces and their information measures are formed with this work.

246 citations


Journal ArticleDOI
01 Dec 2006
TL;DR: A survey and comparative analysis is conducted among several widely used methods that include Pun and Kapur's maximum entropy, Kittler and Illingworth's minimum error thresholding, Pal and Pal's entropy thresholding and Chang et al.'s relative entropy thresholded methods.
Abstract: Entropy-based image thresholding has received considerable interest in recent years. Two types of entropy are generally used as thresholding criteria: Shannon's entropy and relative entropy, also known as the Kullback–Leibler information distance. The former measures uncertainty in an information source, with an optimal threshold obtained by maximising Shannon's entropy; the latter measures the information discrepancy between two different sources, with an optimal threshold obtained by minimising relative entropy. Many thresholding methods have been developed for both criteria and reported in the literature. These two entropy-based thresholding criteria have been investigated and the relationship between entropy and relative-entropy thresholding methods has been explored. In particular, a survey and comparative analysis is conducted among several widely used methods, including Pun and Kapur's maximum entropy, Kittler and Illingworth's minimum error thresholding, Pal and Pal's entropy thresholding, and Chang et al.'s relative entropy thresholding methods. In order to objectively assess these methods, two measures, uniformity and shape, are used for performance evaluation.
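
As one concrete instance of the maximum-entropy criterion surveyed here, a minimal sketch of Kapur-style threshold selection on an 8-bit histogram; details such as histogram smoothing are omitted, and this is not a reproduction of any particular surveyed implementation:

```python
import numpy as np

def kapur_threshold(image, bins=256):
    """Maximum-entropy threshold selection in the style of Kapur's method:
    choose t maximizing the sum of the entropies of the two histogram
    segments below and above t. Assumes integer intensities in [0, bins)."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins - 1):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1          # normalized segment histograms
        h = -(q0[q0 > 0] * np.log(q0[q0 > 0])).sum() \
            - (q1[q1 > 0] * np.log(q1[q1 > 0])).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```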

223 citations


Journal ArticleDOI
TL;DR: In this article, a new Probabilistic Sensitivity Analysis (PSA) approach based on the concept of relative entropy is proposed for design under uncertainty, which can be applied both over the whole distribution of a performance response and in any interested partial range of a response distribution.
Abstract: In this paper, a new Probabilistic Sensitivity Analysis (PSA) approach based on the concept of relative entropy is proposed for design under uncertainty. The relative-entropy-based method evaluates the impact of a random variable on a design performance by measuring the divergence between two probability density functions of the performance response, obtained before and after the variation reduction of the random variable. The method can be applied both over the whole distribution of a performance response [called global response probabilistic sensitivity analysis (GRPSA)] and in any partial range of interest of a response distribution [called regional response probabilistic sensitivity analysis (RRPSA)]. Such flexibility facilitates the approach's use under various scenarios of design under uncertainty, for instance in robust design, reliability-based design, and utility optimization. The proposed method is applicable both to the prior-design stage, for variable screening before a design solution is identified, and to the post-design stage, for uncertainty reduction after an optimal design has been determined. The saddlepoint approximation approach is introduced to improve the computational efficiency of the proposed method. The method is illustrated and verified by numerical examples and industrial design cases.
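
The core computation can be sketched with Monte Carlo samples and histogram density estimates; the toy model below is hypothetical, and the paper's saddlepoint approximation replaces the crude histogram densities used here:

```python
import numpy as np

def kl_divergence(samples_p, samples_q, bins=50):
    """Histogram estimate of KL(p || q) from two sample sets."""
    lo = min(samples_p.min(), samples_q.min())
    hi = max(samples_p.max(), samples_q.max())
    p, edges = np.histogram(samples_p, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(samples_q, bins=bins, range=(lo, hi), density=True)
    w = edges[1] - edges[0]
    mask = (p > 0) & (q > 0)
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * w

# hypothetical toy model: y = x1 + 0.1*x2; sensitivity of y to x1
rng = np.random.default_rng(0)
def response(x1_sd):
    return rng.normal(0, x1_sd, 100_000) + 0.1 * rng.normal(0, 1, 100_000)

y_before = response(1.0)   # x1 at full variability
y_after = response(0.1)    # x1 variance reduced
print(kl_divergence(y_before, y_after))   # large divergence -> x1 is influential
```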

Journal ArticleDOI
26 Jun 2006
TL;DR: This paper presents two algorithms for randomly approximating the entropy in a time and space efficient manner, applicable for use on very high speed (greater than OC-48) links.
Abstract: Using entropy of traffic distributions has been shown to aid a wide variety of network monitoring applications such as anomaly detection, clustering to reveal interesting patterns, and traffic classification. However, realizing this potential benefit in practice requires accurate algorithms that can operate on high-speed links, with low CPU and memory requirements. In this paper, we investigate the problem of estimating the entropy in a streaming computation model. We give lower bounds for this problem, showing that neither approximation nor randomization alone will let us compute the entropy efficiently. We present two algorithms for randomly approximating the entropy in a time and space efficient manner, applicable for use on very high speed (greater than OC-48) links. The first algorithm for entropy estimation is inspired by the structural similarity with the seminal work of Alon et al. for estimating frequency moments, and we provide strong theoretical guarantees on the error and resource usage. Our second algorithm utilizes the observation that the performance of the streaming algorithm can be enhanced by separating the high-frequency items (or elephants) from the low-frequency items (or mice). We evaluate our algorithms on traffic traces from different deployment scenarios.
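
A minimal sketch of the AMS-style sampling idea the first algorithm builds on: sample a uniformly random stream position, count subsequent occurrences of the sampled item, and average many such unbiased estimates. This is the textbook estimator, not the paper's algorithm, which adds the elephant/mouse separation and formal error guarantees:

```python
import math
import random

def entropy_estimate(stream_fn, m, trials=1000):
    """One-pass-per-trial AMS-style estimator of the empirical entropy
    H = log m - (1/m) * sum_i f_i log f_i of a stream of length m.
    stream_fn() returns a fresh iterator over the stream. A real streaming
    algorithm runs all the trial estimators in parallel over a single pass."""
    total = 0.0
    for _ in range(trials):
        sample, r = None, 0
        for t, item in enumerate(stream_fn(), start=1):
            if random.random() < 1.0 / t:    # reservoir-sample a random position
                sample, r = item, 1
            elif item == sample:
                r += 1                       # count later occurrences of the sample
        # X = m * (r log r - (r-1) log (r-1)) is unbiased for sum_i f_i log f_i
        x = r * math.log(r) - (r - 1) * math.log(r - 1) if r > 1 else 0.0
        total += m * x
    return math.log(m) - (total / trials) / m

data = list("abracadabra" * 50)
print(entropy_estimate(lambda: iter(data), len(data)))
```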

Journal ArticleDOI
Tong Zhang1
TL;DR: In this article, an extension of ε-entropy to a KL-divergence-based complexity measure for randomized density estimation methods is proposed, which can lead to improvements of some classical results concerning the convergence of minimum description length and Bayesian posterior distributions.
Abstract: We consider an extension of ε-entropy to a KL-divergence-based complexity measure for randomized density estimation methods. Based on this extension, we develop a general information-theoretical inequality that measures the statistical complexity of some deterministic and randomized density estimators. Consequences of the new inequality will be presented. In particular, we show that this technique can lead to improvements of some classical results concerning the convergence of minimum description length and Bayesian posterior distributions. Moreover, we are able to derive clean finite-sample convergence bounds that are not obtainable using previous approaches.

Journal ArticleDOI
TL;DR: In this article, modified versions of logarithmic Sobolev inequalities in the discrete setting of finite Markov chains and graphs were studied, motivated by the rate at which the entropy of a Markov chain relative to its stationary distribution decays to zero.
Abstract: Motivated by the rate at which the entropy of an ergodic Markov chain relative to its stationary distribution decays to zero, we study modified versions of logarithmic Sobolev inequalities in the discrete setting of finite Markov chains and graphs. These inequalities turn out to be weaker than the standard log-Sobolev inequality, but stronger than the Poincaré (spectral gap) inequality. We show that, in contrast with the spectral gap, for bounded-degree expander graphs, various log-Sobolev constants go to zero with the size of the graph. We also derive a hypercontractivity formulation equivalent to our main modified log-Sobolev inequality. Along the way we survey various recent results that have been obtained in this topic by other researchers.
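
In one common normalization (constants and factors differ across papers), the three functional inequalities compared in the abstract, for a Dirichlet form E and stationary distribution π, are

```latex
\text{log-Sobolev:}\quad \rho\,\mathrm{Ent}_\pi(f^2) \le 2\,\mathcal{E}(f,f), \qquad
\text{modified log-Sobolev:}\quad \rho_0\,\mathrm{Ent}_\pi(f) \le \mathcal{E}(f,\log f), \qquad
\text{Poincar\'e:}\quad \lambda\,\mathrm{Var}_\pi(f) \le \mathcal{E}(f,f),
```

with the modified constant ρ₀ governing the exponential decay rate of the relative entropy Ent_π(H_t f) along the semigroup H_t, which is the motivation cited above.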

Journal ArticleDOI
TL;DR: The effect of finite data size is studied; analytic expressions for the LZ complexity of regular and random sequences are derived and employed to develop a normalization scheme.
Abstract: The Lempel-Ziv (LZ) complexity and its variants are popular metrics for characterizing biological signals. Proper interpretation of such analyses, however, has not been thoroughly addressed. In this letter, we study the effect of finite data size. We derive analytic expressions for the LZ complexity for regular and random sequences, and employ them to develop a normalization scheme. To gain further understanding, we compare the LZ complexity with the correlation entropy from chaos theory in the context of epileptic seizure detection from EEG data, and discuss advantages of the normalized LZ complexity over the correlation entropy.
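
For concreteness, one common variant of the LZ parsing together with the finite-size normalization c(n)·log_a(n)/n that the letter's scheme is built around; variants differ in the parsing details and in how the final incomplete phrase is counted:

```python
import math
import random

def lz76_complexity(s):
    """Number of phrases in a Lempel-Ziv (1976) style parsing of string s.
    A sketch: each phrase is extended while it still occurs in the prefix."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def normalized_lz(s, alphabet=2):
    """LZ complexity normalized by its asymptotic value n/log_a(n)
    for a random sequence over an alphabet of the given size."""
    n = len(s)
    return lz76_complexity(s) * math.log(n, alphabet) / n

rand = "".join(random.choice("01") for _ in range(1000))
print(normalized_lz("01" * 500))   # regular sequence: near 0
print(normalized_lz(rand))         # random sequence: near 1
```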

Journal ArticleDOI
TL;DR: The entropy of the degree distribution is shown to be an effective measure of a network's resilience to random failures, and the optimal design of scale-free networks against random failures is obtained.
Abstract: Many networks are characterized by highly heterogeneous distributions of links; these are called scale-free networks, and their degree distributions follow p(k) ∼ ck^(−α). We study the robustness of scale-free networks to random failures from the character of their heterogeneity. The entropy of the degree distribution can serve as an average measure of a network's heterogeneity. Optimizing the robustness of scale-free networks to random failures with the average connectivity held constant is equivalent to maximizing the entropy of the degree distribution. By examining the relationship between the entropy of the degree distribution, the scaling exponent, and the minimal connectivity, we obtain the optimal design of scale-free networks against random failures. We conclude that the entropy of the degree distribution is an effective measure of a network's resilience to random failures.
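
The heterogeneity measure itself is just the Shannon entropy of p(k). A small sketch (excluding isolated nodes is an assumption of this illustration, not the paper's convention):

```python
import numpy as np

def degree_distribution_entropy(degrees):
    """Shannon entropy of a network's degree distribution p(k)."""
    counts = np.bincount(np.asarray(degrees))[1:]   # drop degree 0 (assumed)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

# toy comparison: a homogeneous vs. a heterogeneous degree sequence
homog = [4] * 1000
heterog = np.random.default_rng(1).zipf(2.5, 1000)
print(degree_distribution_entropy(homog))    # 0: no heterogeneity
print(degree_distribution_entropy(heterog))  # > 0: heterogeneous
```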

Journal ArticleDOI
TL;DR: It is proved that even in this simple case, the optimization problem is NP-hard, and some efficient, scalable, and distributed heuristic approximation algorithms are proposed for solving this problem and the total transmission cost can be significantly improved over direct transmission or the shortest path tree.
Abstract: We consider the problem of correlated data gathering by a network with a sink node and a tree-based communication structure, where the goal is to minimize the total cost of transporting the information collected by the nodes to the sink node. For source coding of correlated data, we consider a joint entropy-based coding model with explicit communication, where coding is simple but optimizing the transmission structure is difficult. We first formulate the optimization problem in the general case and then study a network setting where the entropy conditioning at nodes does not depend on the amount of side information, but only on its availability. We prove that even in this simple case, the optimization problem is NP-hard. We propose some efficient, scalable, and distributed heuristic approximation algorithms for solving this problem and show by numerical simulations that the total transmission cost can be significantly improved over direct transmission or the shortest path tree. We also present an approximation algorithm that provides a tree transmission structure with total cost within a constant factor of the optimal.

Journal ArticleDOI
TL;DR: Experimental results have confirmed that the proposed metric outperforms the standard MI metric by correlating better with the subjective quality of fused images.
Abstract: A novel image fusion performance metric using mutual information is proposed. The metric is based on Tsallis entropy, which is a one-parameter generalisation of Shannon entropy. Experimental results have confirmed that the proposed metric outperforms the standard MI metric by correlating better with the subjective quality of fused images.
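
The one-parameter generalization referred to is, for parameter q ≠ 1,

```latex
S_q = \frac{1}{q-1}\Bigl(1 - \sum_i p_i^{\,q}\Bigr), \qquad
\lim_{q \to 1} S_q = -\sum_i p_i \log p_i ,
```

with the fusion metric obtained by building the mutual-information computation from Tsallis rather than Shannon entropies; the precise form of the metric and the choice of q are specified in the paper.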

Journal ArticleDOI
TL;DR: The results show that the ApEn index significantly distinguishes suffering from normal fetuses between the 30th and the 35th week of gestation and the MSE entropy values are reliable indicators of the fetal distress associated with the presence of a pathological condition at birth.
Abstract: This paper considers the multiscale entropy (MSE) approach for estimating the regularity of time series at different scales. Sample entropy (SampEn) and approximate entropy (ApEn) are evaluated in MSE analysis on simulated data to enhance the main features of both estimators. We applied the approximate entropy and the sample entropy estimators to fetal heart rate signals on both single and multiple scales for an early identification of fetal sufferance antepartum. Our results show that the ApEn index significantly distinguishes suffering from normal fetuses between the 30th and the 35th week of gestation. Furthermore, our data shows that the MSE entropy values are reliable indicators of the fetal distress associated with the presence of a pathological condition at birth.
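
For orientation, a plain quadratic-time sketch of sample entropy and the coarse-graining step used in MSE; the tolerance and embedding defaults are assumptions, and clinical implementations are more careful about template counts and short records:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): -log of the conditional probability that sequences
    matching for m points (within tolerance r) also match for m+1 points.
    r is taken as a fraction of the series' standard deviation."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def match_pairs(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)  # Chebyshev distance
        return ((d <= r).sum() - len(t)) / 2.0                 # pairs, no self-matches
    return -np.log(match_pairs(m + 1) / match_pairs(m))

def multiscale_entropy(x, scales=range(1, 6), m=2, r_frac=0.2):
    """MSE: SampEn of the coarse-grained series at each scale.
    (Implementations differ on whether r is fixed from the original series.)"""
    x = np.asarray(x, dtype=float)
    out = []
    for s in scales:
        n = (len(x) // s) * s
        coarse = x[:n].reshape(-1, s).mean(axis=1)   # nonoverlapping window means
        out.append(sample_entropy(coarse, m, r_frac))
    return out
```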

Journal ArticleDOI
TL;DR: This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels.
Abstract: This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels.
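
For reference, the inequality and the Gaussian-channel identity the proof rests on (scalar case shown; the correspondence treats the vector case): for independent X and Y with densities, where h denotes differential entropy,

```latex
N(X + Y) \ge N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2h(X)} , \\[4pt]
\frac{\mathrm{d}}{\mathrm{d}\gamma}\, I\bigl(X;\,\sqrt{\gamma}\,X + Z\bigr)
= \tfrac{1}{2}\,\mathrm{mmse}(X,\gamma), \qquad Z \sim \mathcal{N}(0,1)\ \text{independent of}\ X.
```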

Journal ArticleDOI
TL;DR: It is shown that quadratic diversity can be interpreted as the expected conflict among the species of a given assemblage and can be related to the Shannon entropy through a generalized version of the Tsallis parametric entropy.

Journal ArticleDOI
TL;DR: This work derives an algorithm to compute the entropy of matchings of a given size, i.e., the leading order of the logarithm of the number of solutions, together with an analytic result for the entropy in regular and Erdős–Rényi random graph ensembles.
Abstract: We study matchings on sparse random graphs by means of the cavity method. We first show how the method reproduces several known results about maximum and perfect matchings in regular and Erdős–Rényi random graphs. Our main new result is the computation of the entropy, i.e., the leading order of the logarithm of the number of solutions, of matchings with a given size. We derive both an algorithm to compute this entropy for an arbitrary graph with a girth that diverges in the large-size limit, and an analytic result for the entropy in regular and Erdős–Rényi random graph ensembles.

Journal ArticleDOI
TL;DR: A thresholding technique based on two-dimensional Tsallis–Havrda–Charvát entropy is presented, and its effectiveness is demonstrated on real-world and synthetic images.

Proceedings ArticleDOI
21 May 2006
TL;DR: The main novelty comes in a bootstrap procedure which allows the Challenge-Response mechanism for detecting "entropy concentration" of [4] to be used with sources of less and less entropy, using recursive calls to itself.
Abstract: The main result of this paper is an explicit disperser for two independent sources on n bits, each of entropy k = n^(o(1)). Put differently, setting N = 2^n and K = 2^k, we construct explicit N × N Boolean matrices for which no K × K submatrix is monochromatic. Viewed as adjacency matrices of bipartite graphs, this gives an explicit construction of K-Ramsey bipartite graphs of size N. This greatly improves the previous bound of k = o(n) of Barak, Kindler, Shaltiel, Sudakov and Wigderson [4]. It also significantly improves the 25-year record of k = O(√n) on the special case of Ramsey graphs, due to Frankl and Wilson [9]. The construction uses (besides "classical" extractor ideas) almost all of the machinery developed in the last couple of years for extraction from independent sources, including: Bourgain's extractor for 2 independent sources of some entropy rate < 1/2 [18]; Rao's extractor for 2 independent block-sources of entropy n^(Ω(1)) [17]; and the "Challenge-Response" mechanism for detecting "entropy concentration" of [4]. The main novelty comes in a bootstrap procedure which allows the Challenge-Response mechanism of [4] to be used with sources of less and less entropy, using recursive calls to itself. Subtleties arise since the success of this mechanism depends on restricting the given sources, and so recursion constantly changes the original sources. These are resolved via a new construct, in between a disperser and an extractor, which behaves like an extractor on sufficiently large subsources of the given ones. This version is only an extended abstract; please see the full version, available on the authors' homepages, for more details.

Journal ArticleDOI
TL;DR: The inverse problem of describing power spectra consistent with second-order statistics is discussed; this problem has been the main motivation behind the present work.
Abstract: Entropy-like functionals on operator algebras have been studied since the pioneering work of von Neumann, Umegaki, Lindblad, and Lieb. The best known are the von Neumann entropy I(ρ) := −trace(ρ log ρ) and a generalization of the Kullback–Leibler distance S(ρ‖σ) := trace(ρ log ρ − ρ log σ), referred to as quantum relative entropy and used to quantify the distance between states of a quantum system. The purpose of this paper is to explore I and S as regularizing functionals in seeking solutions to multivariable and multidimensional moment problems. It will be shown that extrema can be effectively constructed via a suitable homotopy. The homotopy approach leads naturally to a further generalization and a description of all the solutions to such moment problems. This is accomplished by a renormalization of a Riemannian metric induced by the entropy functionals. As an application, we discuss the inverse problem of describing power spectra which are consistent with second-order statistics, which has been the main motivation behind the present work.

BookDOI
01 Jan 2006
TL;DR: Reports on Models of Write-Efficient Memories with Localized Errors and Defects and Problems in Network Coding.
Abstract: Rudolf Ahlswede - From 60 to 66.- Information Theory and Some Friendly Neighbors - Ein Wunschkonzert.- Probabilistic Models.- Identification for Sources.- On Identification.- Identification and Prediction.- Watermarking Identification Codes with Related Topics on Common Randomness.- Notes on Conditions for Successive Refinement of Information.- Coding for the Multiple-Access Adder Channel.- Bounds of E-Capacity for Multiple-Access Channel with Random Parameter.- Huge Size Codes for Identification Via a Multiple Access Channel Under a Word-Length Constraint.- Codes with the Identifiable Parent Property and the Multiple-Access Channel.- Cryptology - Pseudo Random Sequences.- Transmission, Identification and Common Randomness Capacities for Wire-Tape Channels with Secure Feedback from the Decoder.- A Simplified Method for Computing the Key Equivocation for Additive-Like Instantaneous Block Encipherers.- Secrecy Systems for Identification Via Channels with Additive-Like Instantaneous Block Encipherer.- Large Families of Pseudorandom Sequences of k Symbols and Their Complexity - Part I.- Large Families of Pseudorandom Sequences of k Symbols and Their Complexity - Part II.- On a Fast Version of a Pseudorandom Generator.- On Pseudorandom Sequences and Their Application.- Authorship Attribution of Texts: A Review.- Quantum Models.- Raum-Zeit und Quantenphysik - Ein Geburtstagsständchen für Hans-Jürgen Treder.- Quantum Information Transfer from One System to Another One.- On Rank Two Channels.- Universal Sets of Quantum Information Processing Primitives and Their Optimal Use.- An Upper Bound on the Rate of Information Transfer by Grover's Oracle.- A Strong Converse Theorem for Quantum Multiple Access Channels.- Identification Via Quantum Channels in the Presence of Prior Correlation and Feedback.- Additive Number Theory and the Ring of Quantum Integers.- The Proper Fiducial Argument.- On Sequential Discrimination Between Close Markov Chains.- Estimating with Randomized Encoding the Joint Empirical Distribution in a Correlated Source.- On Logarithmically Asymptotically Optimal Hypothesis Testing for Arbitrarily Varying Sources with Side Information.- On Logarithmically Asymptotically Optimal Testing of Hypotheses and Identification.- Correlation Inequalities in Function Spaces.- Lower Bounds for Divergence in the Central Limit Theorem.- Information Measures - Error Concepts - Performance Criteria.- Identification Entropy.- Optimal Information Measures for Weakly Chaotic Dynamical Systems.- Report on Models of Write-Efficient Memories with Localized Errors and Defects.- Percolation on a k-Ary Tree.- On Concepts of Performance Parameters for Channels.- Appendix: On Common Information and Related Characteristics of Correlated Information Sources.- Search - Sorting - Ordering - Planning.- Q-Ary Ulam-Rényi Game with Constrained Lies.- Search with Noisy and Delayed Responses.- A Kraft-Type Inequality for d-Delay Binary Search Codes.- Threshold Group Testing.- A Fast Suffix-Sorting Algorithm.- Monotonicity Checking.- Algorithmic Motion Planning: The Randomized Approach.- Language Evolution - Pattern Discovery - Reconstructions.- Information Theoretic Models in Language Evolution.- Zipf's Law, Hyperbolic Distributions and Entropy Loss.- Bridging Lossy and Lossless Compression by Motif Pattern Discovery.- Reverse-Complement Similarity Codes.- On Some Applications of Information Indices in Chemical Graph Theory.- Largest Graphs of Diameter 2 and Maximum Degree 6.- Network Coding.- An Outside Opinion.- Problems in Network Coding and Error Correcting Codes Appended by a Draft Version of S. Riis "Utilising Public Information in Network Coding".- Combinatorial Models.- On the Thinnest Coverings of Spheres and Ellipsoids with Balls in Hamming and Euclidean Spaces.- Appendix: On Set Coverings in Cartesian Product Spaces.- Testing Sets for 1-Perfect Code.- On Partitions of a Rectangle into Rectangles with Restricted Number of Cross Sections.- On Attractive and Friendly Sets in Sequence Spaces.- Remarks on an Edge Isoperimetric Problem.- Appendix: On Edge-Isoperimetric Theorems for Uniform Hypergraphs.- Appendix: Solution of Burnashev's Problem and a Sharpening of the Erdős/Ko/Rado Theorem.- Realization of Intensity Modulated Radiation Fields Using Multileaf Collimators.- Sparse Asymmetric Connectors in Communication Networks.- Problem Section.- Finding the Identification Capacity of the AVC if Randomization in the Encoding Is Excluded.- Intersection Graphs of Rectangles and Segments.- Cutoff Rate Enhancement.- Some Problems in Organic Coding Theory.- Generalized Anticodes in Hamming Spaces.- Two Problems from Coding Theory.- Private Capacity of Broadcast Channels.- A Short Survey on Upper and Lower Bounds for Multidimensional Zero Sums.- Binary Linear Codes That Are Optimal for Error Correction.- Capacity Problem of Trapdoor Channel.- Hotlink Assignment on the Web.- The Rigidity of Hamming Spaces.- A Conjecture in Finite Fields.- Multiparty Computations in Non-private Environments.- Some Mathematical Problems Related to Quantum Hypothesis Testing.- Designs and Perfect Codes.

Journal ArticleDOI
TL;DR: A new algorithm is presented for autofocus in synthetic aperture radar imaging that applies more widely than the minimum-entropy algorithms with a fixed-order polynomial model.
Abstract: A new algorithm is presented for autofocus in synthetic aperture radar imaging. Entropy is used to measure the focus quality of the image, and better focus corresponds to smaller entropy. The phase response of the focus filter is modeled as a specially designed polynomial, and the coefficients of this polynomial are adjusted in sequence to minimize the entropy of the image. Because the order of this polynomial is adaptive, this algorithm applies more widely than the minimum-entropy algorithms with a fixed-order polynomial model.
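
A greatly simplified sketch of the approach described; the polynomial basis, the grid search, and the fixed maximum order are assumptions of this illustration, whereas the paper adapts the polynomial order and optimizes the coefficients more carefully:

```python
import numpy as np

def image_entropy(img):
    """Entropy of the normalized image intensity; better focus -> smaller entropy."""
    p = np.abs(img) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def min_entropy_autofocus(img, max_order=5, grid=np.linspace(-np.pi, np.pi, 41)):
    """Greedy minimum-entropy autofocus sketch for a complex 2-D image
    (azimuth x range): model the azimuth phase error as a polynomial and
    tune each coefficient in sequence to minimize image entropy."""
    n = img.shape[0]
    f = np.fft.fftfreq(n)                  # normalized azimuth frequency
    spec = np.fft.fft(img, axis=0)         # back to azimuth-frequency domain
    coeffs = np.zeros(max_order + 1)

    def focused(trial):
        phase = sum(trial[j] * f ** j for j in range(2, max_order + 1))
        return np.fft.ifft(spec * np.exp(-1j * phase)[:, None], axis=0)

    for k in range(2, max_order + 1):      # orders 0-1 only shift the image
        def entropy_at(c, k=k):
            trial = coeffs.copy()
            trial[k] = c
            return image_entropy(focused(trial))
        coeffs[k] = min(grid, key=entropy_at)
    return focused(coeffs)
```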

Book ChapterDOI
25 Sep 2006
TL;DR: A novel information-theoretic measure of spatiotemporal coordination in a modular robotic system is presented, and it is used as a fitness function in evolving the system.
Abstract: In this paper we present a novel information-theoretic measure of spatiotemporal coordination in a modular robotic system, and use it as a fitness function in evolving the system. This approach exemplifies a new methodology formalizing co-evolution in multi-agent adaptive systems: information-driven evolutionary design. The methodology attempts to link together different aspects of information transfer involved in adaptive systems, and suggests approximating direct task-specific fitness functions with intrinsic selection pressures. In particular, the information-theoretic measure of coordination employed in this work estimates the generalized correlation entropy K2 and the generalized excess entropy E2 computed over a multivariate time series of actuators' states. The simulated modular robotic system evolved according to the new measure exhibits regular locomotion and performs well in challenging terrains.

Proceedings ArticleDOI
22 Jan 2006
TL;DR: In this article, the authors give an algorithm for estimating the entropy of a distribution in a data stream setting in polylogarithmic space and yields an asymptotic constant factor approximation scheme.
Abstract: In most algorithmic applications that compare two distributions, information-theoretic distances are more natural than standard l_p norms. In this paper we design streaming and sublinear-time property testing algorithms for entropy and various information-theoretic distances. Batu et al. posed the problem of property testing with respect to the Jensen-Shannon distance. We present optimal algorithms for estimating bounded, symmetric f-divergences (including the Jensen-Shannon divergence and the Hellinger distance) between distributions in various property testing frameworks. Along the way, we close a (log n)/H gap between the upper and lower bounds for estimating entropy H, yielding an optimal algorithm over all values of the entropy. In a data stream setting (sublinear space), we give the first algorithm for estimating the entropy of a distribution. Our algorithm runs in polylogarithmic space and yields an asymptotic constant-factor approximation scheme. An integral part of the algorithm is an interesting use of an F0 (the number of distinct elements in a set) estimation algorithm; we also provide other results along the space/time/approximation tradeoff curve. Our results have interesting structural implications that connect sublinear-time and space-constrained algorithms. The mediating model is the random-order streaming model, which assumes the input is a random permutation of a multiset and was first considered by Munro and Paterson in 1980. We show that any property testing algorithm in the combined oracle model for calculating a permutation-invariant function can be simulated in the random-order model in a single pass. This addresses a question raised by Feigenbaum et al. regarding the relationship between property testing and stream algorithms. Further, we give a polylog-space PTAS for estimating the entropy of a one-pass random-order stream. This bound cannot be achieved in the combined oracle (generalized property testing) model.

Journal ArticleDOI
TL;DR: A distributed active-network-based algorithm is proposed that exploits this complexity property to correlate arbitrary traffic flows in the network and detect possible denial-of-service attacks, and results show that DDoS attacks can be detected in a manner that is not sensitive to legitimate background traffic.
Abstract: This paper describes an approach to detecting distributed denial of service (DDoS) attacks that is based on fundamentals of Information Theory, specifically Kolmogorov Complexity. A theorem derived using principles of Kolmogorov Complexity states that the joint complexity measure of random strings is lower than the sum of the complexities of the individual strings when the strings exhibit some correlation. Furthermore, the joint complexity measure varies inversely with the amount of correlation. We propose a distributed active network-based algorithm that exploits this property to correlate arbitrary traffic flows in the network to detect possible denial-of-service attacks. One of the strengths of this algorithm is that it does not require special filtering rules and hence it can be used to detect any type of DDoS attack. We implement and investigate the performance of the algorithm in an active network. Our results show that DDoS attacks can be detected in a manner that is not sensitive to legitimate background traffic.
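
The inequality that drives detection, C(xy) ⪅ C(x) + C(y) with near-equality for uncorrelated strings, can be illustrated with any real compressor standing in for Kolmogorov complexity. The traffic byte strings below are hypothetical, and the paper's algorithm runs distributed across active-network nodes rather than over concatenated byte strings:

```python
import zlib

def c(data: bytes) -> int:
    """Compressed size as a practical stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def correlation_score(x: bytes, y: bytes) -> float:
    """Positive and large when the joint complexity falls well below the sum
    of the individual complexities, i.e. when the two flows are correlated."""
    return (c(x) + c(y) - c(x + y)) / min(c(x), c(y))

attack_a = b"GET /index.html" * 200       # two flows sharing structure
attack_b = b"GET /index.html" * 200
benign = bytes(range(256)) * 12           # hypothetical unrelated traffic
print(correlation_score(attack_a, attack_b))  # high: likely coordinated
print(correlation_score(attack_a, benign))    # low: independent content
```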