SciSpace

Information algebra

About: Information algebra is a research topic. Over its lifetime, 342 publications have been published within this topic, receiving 28,991 citations.


Papers
Journal Article
TL;DR: The Mathematical Theory of Communication was originally published as a paper on communication theory more than fifty years ago; republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings.
Abstract: Scientific knowledge grows at a phenomenal pace, but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.

15,525 citations

Book
01 Jan 1990
TL;DR: This book is an updated version of the information theory classic, first published in 1990, with expanded treatment of stationary or sliding-block codes and their relations to traditional block codes and discussion of results from ergodic theory relevant to information theory.
Abstract: This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition:

- Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
- Expanded discussion of results from ergodic theory relevant to information theory
- Expanded treatment of B-processes (processes formed by stationary coding of memoryless sources)
- New material on trading off information and distortion, including the Marton inequality
- New material on the properties of optimal and asymptotically optimal source codes
- New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

1,810 citations

Journal Article
TL;DR: The paper reviews research on users' information behavior and points to findings that enable the system designer to put the design process in the wider context of the user in the organization.
Abstract: Introduction

Until recently the computer science and information systems communities have equated 'information requirements' of users with the way users behave in relation to the systems available. In other words, investigations into information requirements were concerned almost entirely with how a user navigated a given system and what he or she could do with the data (rather than information) made available by information systems. This is now beginning to change as ethnographic methods are introduced into the requirements definition stage of systems design, and Beyer and Holtzblatt (1998) have shown the benefits. However, even when such methods are employed, the designers appear to be asking, "How is this person using the system?" rather than seeking to determine what the individual's (or the organization's) information needs may be and how information seeking behavior relates to other, task-oriented behavior. In fact, a concern with what information is needed has been the province not of information systems as a discipline, but of information science and, before that, librarianship. To these fields we can add consumer behavior research, marketing, psychology, health communication research, and a number of other disciplines that take the user as the focus of interest, rather than the system. The aim of this paper is to review some of this research and to point to findings that enable the system designer to put the design process in the wider context of the user in the organization.

Some Definitions

Some definitions are needed before we go further. In this paper, four terms are used: information behavior, information seeking behavior, information searching behavior and information use behavior. They are defined as follows:

Information Behavior is the totality of human behavior in relation to sources and channels of information, including both active and passive information seeking, and information use. Thus, it includes face-to-face communication with others, as well as the passive reception of information as in, for example, watching TV advertisements, without any intention to act on the information given.

Information Seeking Behavior is the purposive seeking for information as a consequence of a need to satisfy some goal. In the course of seeking, the individual may interact with manual information systems (such as a newspaper or a library), or with computer-based systems (such as the World Wide Web).

Information Searching Behavior is the 'micro-level' of behavior employed by the searcher in interacting with information systems of all kinds. It consists of all the interactions with the system, whether at the level of human computer interaction (for example, use of the mouse and clicks on links) or at the intellectual level (for example, adopting a Boolean search strategy or determining the criteria for deciding which of two books selected from adjacent places on a library shelf is most useful), which will also involve mental acts, such as judging the relevance of data or information retrieved.

Information Use Behavior consists of the physical and mental acts involved in incorporating the information found into the person's existing knowledge base. It may involve, therefore, physical acts such as marking sections in a text to note their importance or significance, as well as mental acts that involve, for example, comparison of new information with existing knowledge.

In all of the above definitions data is subsumed under information, that is, data may or may not be information depending upon the state of understanding of the information user. A datum such as "ħ = h/2π = 6.58×10⁻²⁵ GeV·s = 1.05×10⁻³⁴ J·s" does not inform me because I have no framework of understanding in which to incorporate the datum. …
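Wilson's example datum can be verified with a quick unit conversion. The following is an illustrative sketch, not part of the paper; the constant values are the standard SI/CODATA figures:

```python
import math

# Planck constant and the GeV-to-joule conversion (exact SI definitions).
h = 6.62607015e-34          # Planck constant, J*s
GEV_IN_J = 1.602176634e-10  # 1 GeV expressed in joules

hbar_Js = h / (2 * math.pi)     # hbar = h / (2*pi), in J*s
hbar_GeVs = hbar_Js / GEV_IN_J  # the same quantity in GeV*s

# Both values agree with the quoted datum to three significant figures:
# about 1.05e-34 J*s and 6.58e-25 GeV*s.
print(f"hbar = {hbar_GeVs:.3e} GeV*s = {hbar_Js:.3e} J*s")
```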

1,392 citations

Book
Gregory J. Chaitin
30 Oct 1987
TL;DR: This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory.
Abstract: This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that a program whose bits are chosen by coin flipping produces a given output. During the past few years the definitions of algorithmic information theory have been reformulated. The basic features of the new formalism are presented here and certain results of R. M. Solovay are reported.

994 citations

Book
15 Dec 2004
TL;DR: This tutorial covers applications of information theory concepts in statistics in the finite alphabet setting, and provides an introduction to the theory of universal coding and to statistical inference via the minimum description length principle motivated by that theory.
Abstract: This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often with a geometric flavor as an analogue of squared Euclidean distance, as in the concepts of I-projection, I-radius and I-centroid. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an "information geometry" background. Also, an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory.
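The relative entropy at the center of the tutorial has a simple finite-alphabet form, D(p‖q) = Σ p(x) log(p(x)/q(x)). A minimal sketch (the function name is my own, not from the book):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) in nats for finite-alphabet distributions.

    Terms with p(x) = 0 contribute nothing; D is +inf when q(x) = 0
    at some x with p(x) > 0 (absolute continuity fails).
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log(pi / qi)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # differs from kl_divergence(q, p): D is asymmetric
print(kl_divergence(q, p))
```

D is nonnegative and asymmetric, which is why the tutorial treats it as an analogue of squared Euclidean distance (as in I-projections) rather than as a metric.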

852 citations


Network Information

Related Topics (5)

  Topic                             Papers    Citations   Related
  Information system                107.5K    1.8M        65%
  Web page                          50.3K     975.1K      64%
  Metadata                          43.9K     642.7K      63%
  Information technology            53.9K     894.1K      62%
  Ontology (information science)    57K       869.1K      62%
Performance Metrics

No. of papers in the topic in previous years:

  Year    Papers
  2021    6
  2020    1
  2018    2
  2017    12
  2016    13
  2015    15