Journal ArticleDOI

Local measures of information storage in complex distributed computation

TLDR
Active information storage is introduced, quantifying the information storage component that is directly in use in the computation of the next state of a process, and the local entropy rate is demonstrated to be a useful spatiotemporal filter for information transfer structure.
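As a rough illustration of the measure summarized above (a plug-in sketch, not the authors' estimator), local active information storage for a discrete time series can be estimated as the pointwise log-ratio between the next state's probability conditioned on its length-`k` past and its unconditioned probability; the function name and default `k` here are illustrative choices:

```python
import math
from collections import Counter

def local_active_info_storage(series, k=2):
    """Local active information storage a(n) = log2[ p(x_n | past_k) / p(x_n) ],
    estimated by simple plug-in counts over a discrete-valued series.
    Illustrative sketch only; assumes enough data for the counts to be stable."""
    pasts = [tuple(series[i - k:i]) for i in range(k, len(series))]
    nexts = series[k:]
    n = len(nexts)
    count_next = Counter(nexts)                 # p(x_n)
    count_past = Counter(pasts)                 # p(past_k)
    count_joint = Counter(zip(pasts, nexts))    # p(past_k, x_n)
    return [
        math.log2((count_joint[(p, x)] / count_past[p]) / (count_next[x] / n))
        for p, x in zip(pasts, nexts)
    ]
```

Averaging the local values recovers the (average) active information storage; for a strictly periodic binary series, each local value approaches 1 bit, since the past fully predicts the next state.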
About
This article was published in Information Sciences on 2012-11-15 and has received 165 citations to date. The article focuses on the topics: Information transfer & Information theory.


Citations
Journal ArticleDOI

Local Shannon entropy measure with statistical tests for image randomness

TL;DR: The proposed local Shannon entropy measure overcomes several weaknesses of the conventional global Shannon entropy measure, including unfair randomness comparisons between images of different sizes, failure to discern image randomness before and after image shuffling, and possibly inaccurate scores for synthesized images.
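The core idea summarized above can be sketched in a few lines: rather than scoring an image's randomness with one global histogram, compute Shannon entropy over many local blocks and aggregate. This is a minimal illustration of that idea (block size, sampling, and aggregation here are illustrative choices, not the paper's exact procedure):

```python
import math
from collections import Counter

def local_shannon_entropy(image, block_size=8):
    """Mean Shannon entropy (bits) over non-overlapping blocks of a 2-D
    grayscale image given as a list of lists. A sketch of the 'local'
    measure idea: score randomness block by block, then aggregate."""
    h, w = len(image), len(image[0])
    entropies = []
    for r in range(0, h - block_size + 1, block_size):
        for c in range(0, w - block_size + 1, block_size):
            pixels = [image[r + i][c + j]
                      for i in range(block_size) for j in range(block_size)]
            total = len(pixels)
            counts = Counter(pixels)
            entropies.append(-sum((v / total) * math.log2(v / total)
                                  for v in counts.values()))
    return sum(entropies) / len(entropies)
```

A constant image scores 0 bits per block, while a block split evenly between two intensities scores exactly 1 bit, so the local score discriminates structure that a single global histogram can miss.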
Journal ArticleDOI

Measuring Information-Transfer Delays

TL;DR: A robust estimator for neuronal interaction delays rooted in an information-theoretic framework is proposed, which allows a model-free exploration of interactions and shows the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops.
Journal ArticleDOI

JIDT: An information-theoretic toolkit for studying the dynamics of complex systems

TL;DR: The Java Information Dynamics Toolkit (JIDT) is introduced, a Google Code project which provides a standalone (GNU GPL v3 licensed) open-source implementation for empirical estimation of information-theoretic measures from time-series data.
Journal ArticleDOI

Local active information storage as a tool to understand distributed neural information processing

TL;DR: It is suggested that LAIS, when measured on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat, will be a useful quantity to test theories of cortical function, such as predictive coding.
References
Book

Elements of information theory

TL;DR: The author examines the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.
Book

Information Theory, Inference and Learning Algorithms

TL;DR: A fun and exciting textbook on the mathematics underpinning the most dynamic areas of modern science and engineering.
Book

Networks of the Brain

TL;DR: Olaf Sporns describes how the integrative nature of brain function can be illuminated from a complex network perspective and describes new links between network anatomy and function and investigates how networks shape complex brain dynamics and enable adaptive neural computation.
Journal ArticleDOI

A measure for brain complexity: relating functional segregation and integration in the nervous system

TL;DR: A measure, called neural complexity (CN), that captures the interplay between functional segregation and functional integration in brains of higher vertebrates and may prove useful in analyzing complexity in other biological domains such as gene regulation and embryogenesis.
Related Papers (5)