Journal Article

Entropy and information: A multidisciplinary overview

TLDR
The concept of entropy, from the second law of thermodynamics, has been used by numerous writers on information theory, including as a metaphor.
Abstract
The concept of entropy, from the second law of thermodynamics, has been used by numerous writers on information theory. Basic relationships between entropy, order, information, and meaning have been observed by writers in disciplines as diverse as biology, economics, information science, the arts, and religion. This article, while not attempting comprehensive treatment, cites representative extensions of the concept, including its use as metaphor.


Citations
Book Chapter

Mind the Gap: Transitions Between Concepts of Information in Varied Domains

TL;DR: With information physics gaining general acceptance, and biology gaining the status of an information science, it seems rational to look for links, relationships, analogies and even helpful metaphors between them and the library/information sciences.
Journal Article

Meanings of information: The assumptions and research consequences of three foundational LIS theories

TL;DR: This article addresses the question “what is information?” by comparing the meaning of the term “information” and epistemological assumptions of three theories in library and information science: the “Shannon-Weaver model,” Brookes’ interpretation of Popper's World 3, and the Data-Information-Knowledge-Wisdom model.
Journal Article

“A few exciting words”: Information and entropy revisited

TL;DR: A review is presented of the relation between information and entropy, focusing on two main issues: the similarity of the formal definitions of physical entropy and of information, according to statistical mechanics and information theory; and the possible subjectivity of entropy considered as missing information.
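As an aside not drawn from the review itself: the formal similarity it discusses is that the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory share one functional form, differing only in Boltzmann's constant and the base of the logarithm.

  S = -k_B \sum_i p_i \ln p_i          % Gibbs entropy (statistical mechanics)
  H = -\sum_i p_i \log_2 p_i           % Shannon entropy (information theory, in bits)

Here p_i is the probability of the i-th microstate or message; this identity of form is what underwrites the reading of entropy as missing information.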
Journal Article

The Entropy Universe

TL;DR: The Entropy Universe, as discussed by the authors, is a review of the many variants of entropy applied to time series in different scientific fields, such as general physics, information theory, chaos theory, data mining, and mathematical linguistics.
References
Journal Article

Gödel, Escher, Bach: An Eternal Golden Braid

TL;DR: In this paper, the author applies Gödel's seminal contribution to modern mathematics to the study of the human mind and the development of artificial intelligence, including artificial neural networks.
Journal Article

Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen (On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings)

TL;DR: In this paper, the author investigates the conditions under which it appears possible to construct a perpetual motion machine of the second kind if an intelligent being is allowed to intervene in a thermodynamic system.
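For orientation, a standard textbook figure rather than a quotation from the paper: in the single-molecule engine Szilard analyzes, one binary measurement lets the intervening being extract at most k_B T ln 2 of work per cycle, so the measurement itself must carry a compensating entropy cost if the second law is to survive.

  W_{\max} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\,\mathrm{J} \qquad (T = 300\,\mathrm{K})

The same quantity later reappears as Landauer's bound on the minimum energy dissipated in erasing one bit of information.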
Journal Article

Relevance: a review of and a framework for the thinking on the notion in information science

TL;DR: Information science emerged as the third subject, along with logic and philosophy, to deal with relevance, an elusive, human notion that is traced to the problems of scientific communication.
Journal Article

Time, structure, and fluctuations.

TL;DR: It is shown that nonequilibrium may become a source of order and that irreversible processes may lead to a new type of dynamic state of matter called "dissipative structures"; the thermodynamic theory of such structures is outlined.
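A compact way to state the reasoning summarized above, using Prigogine's standard decomposition rather than anything quoted here: the entropy change of an open system splits into an exchange term and a production term, and the second law constrains only the latter.

  dS = d_e S + d_i S, \qquad d_i S \ge 0    % d_e S: entropy exchanged with the surroundings; d_i S: internal production

Because d_e S can be negative, a system driven far from equilibrium can export entropy and maintain or increase internal order, which is how dissipative structures can arise without violating the second law.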
Journal Article

Energy and information

TL;DR: The historical development of information and communications theory is traced and the relation of energy and information is analyzed in depth.