Journal ArticleDOI

Capturing the Intangible Concept of Information

TLDR
The purpose of this article is to discuss the intricacies of quantifying information in some statistical problems and to develop a general appreciation for the meanings of information functions rather than their mathematical use.
Abstract
The purpose of this article is to discuss the intricacies of quantifying information in some statistical problems. The aim is to develop a general appreciation for the meanings of information functions rather than their mathematical use. This theme integrates fundamental aspects of the contributions of Kullback, Lindley, and Jaynes and bridges chaos to probability modeling. A synopsis of information-theoretic statistics is presented in the form of a pyramid with Shannon at the vertex and a triangular base that signifies three distinct variants of quantifying information: discrimination information (Kullback), mutual information (Lindley), and maximum entropy information (Jaynes). Examples of capturing information by the maximum entropy (ME) method are discussed. It is shown that the ME approach produces a general class of logit models capable of capturing various forms of sample and nonsample information. Diagnostics for quantifying information captured by the ME logit models are given, and decom...
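The abstract's claim that the maximum entropy (ME) approach produces logit models can be illustrated numerically: maximizing entropy over a finite outcome set subject to a moment constraint yields exponential (softmax/logit) probabilities. The sketch below is a hedged illustration with hypothetical feature values and a hypothetical target mean, not the article's own example; the Lagrange multiplier is found by bisection on the monotone map from multiplier to expected feature value.

```python
import numpy as np

# Hedged sketch: the maximum-entropy distribution over K outcomes subject to
# one moment constraint E[f] = target has the logit (softmax) form
# p_k ∝ exp(lam * f_k). We solve for lam by bisection.
f = np.array([0.0, 1.0, 2.0, 3.0])    # hypothetical feature values
target = 1.2                          # hypothetical constrained mean of f

def softmax_mean(lam):
    p = np.exp(lam * f)
    p /= p.sum()
    return p, p @ f                   # ME probabilities and their mean of f

lo, hi = -50.0, 50.0                  # E[f] is increasing in lam, so bisect
for _ in range(200):
    mid = 0.5 * (lo + hi)
    _, m = softmax_mean(mid)
    if m < target:
        lo = mid
    else:
        hi = mid

p, m = softmax_mean(0.5 * (lo + hi))
print(np.round(p, 4), round(m, 4))
```

Among all distributions on these four outcomes with mean 1.2, this softmax solution has maximal Shannon entropy; that is the sense in which ME estimation "produces a general class of logit models."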


Citations
Journal ArticleDOI

AIC model selection and multimodel inference in behavioral ecology: some background, observations, and comparisons

TL;DR: The information-theoretic (I-T) approaches to valid inference are outlined, including a review of some simple methods for making formal inference from all the hypotheses in the model set (multimodel inference).
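The I-T workflow this paper reviews reduces to a short computation: each model's AIC is -2 log L + 2k, ΔAIC values are taken relative to the best model, and Akaike weights give the relative support for each hypothesis in the set. A minimal sketch with hypothetical log-likelihoods and parameter counts:

```python
import numpy as np

# Hedged sketch of AIC-based multimodel inference; the log-likelihoods (logL)
# and parameter counts (k) below are hypothetical.
logL = np.array([-120.3, -118.9, -118.2])  # maximized log-likelihoods
k    = np.array([2, 3, 5])                 # numbers of estimated parameters

aic   = -2 * logL + 2 * k                  # Akaike information criterion
delta = aic - aic.min()                    # ΔAIC relative to the best model
w     = np.exp(-0.5 * delta)
w    /= w.sum()                            # Akaike weights (relative support)
print(np.round(aic, 1), np.round(w, 3))
```

The weights sum to one and can be carried forward to model-averaged estimates, which is the "multimodel inference" step the abstract refers to.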
Journal ArticleDOI

Finding the Number of Clusters in a Dataset

TL;DR: A simple, yet powerful nonparametric method for choosing the number of clusters based on distortion, a quantity that measures the average distance, per dimension, between each observation and its closest cluster center, is developed.
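The distortion idea in this abstract can be sketched directly: compute the average per-dimension squared distance to the nearest cluster center for a range of K, transform it as d_K^(-p/2), and pick the K at which this transformed distortion jumps most sharply. The example below uses synthetic data with three well-separated clusters (an assumption for illustration, not the paper's data) and SciPy's k-means.

```python
import numpy as np
from scipy.cluster.vq import kmeans, vq

rng = np.random.default_rng(0)
# Hypothetical data: three well-separated 2-D Gaussian clusters.
X = np.vstack([rng.normal(c, 0.2, size=(100, 2)) for c in [(0, 0), (5, 0), (0, 5)]])
p = X.shape[1]                             # dimension

# "Jump" sketch: distortion d_K = average per-dimension squared distance to
# the nearest center; d_K^(-p/2) rises sharply at the true cluster count.
d = []
for K in range(1, 7):
    centers, _ = kmeans(X, K, seed=1)      # best of 20 k-means runs
    _, dists = vq(X, centers)              # distance to nearest center
    d.append(np.mean(dists ** 2) / p)

jumps = np.diff(np.array(d) ** (-p / 2))
best_K = int(np.argmax(jumps)) + 2         # diff over K=1..6 indexes K=2 onward
print(best_K)
```

On clearly separated clusters the largest jump lands at the true count; the paper's contribution is the transformation exponent and its theoretical justification, which this sketch only gestures at.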
Journal ArticleDOI

The Influence of Task Complexity on Consumer Choice: A Latent Class Model of Decision Strategy Switching

TL;DR: This paper introduces decision strategy selection, within a maintained compensatory framework, into aggregate choice models via latent classes that arise from task complexity, and demonstrates that, within an experimental choice task, the model reflects changing aggregate preferences as choice complexity changes and as the task progresses.
Journal ArticleDOI

Choice Environment, Market Complexity, and Consumer Behavior: A Theoretical and Empirical Approach for Incorporating Decision Complexity into Models of Consumer Choice

TL;DR: In this article, a theoretical model is presented that simultaneously considers task complexity, effort applied by the consumer, ability to choose, and choice context, and a measure of task complexity based on the random utility framework is constructed.
Journal ArticleDOI

Global sensitivity measures from given data

TL;DR: A design for estimating global sensitivity indices from given data at minimum computational cost is introduced and applied to identify the key drivers of uncertainty in a complex computer code, developed at the National Aeronautics and Space Administration, that assesses the risk of lunar space missions.
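The "given data" idea can be sketched with a simple binning estimator of first-order sensitivity indices, S_i ≈ Var(E[Y|X_i]) / Var(Y): partition the sample of each input into quantile bins and compare the variance of the conditional output means with the total output variance. This is a hedged illustration on the standard Ishigami test function, not the paper's own estimator or application; the sample size and bin count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 20000, 20                       # sample size and bin count (assumptions)

# A single given sample of inputs and outputs (Ishigami test function).
X = rng.uniform(-np.pi, np.pi, size=(N, 3))
Y = (np.sin(X[:, 0]) + 7 * np.sin(X[:, 1]) ** 2
     + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0]))

def first_order_index(x, y, bins=M):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y) by binning x into quantile bins."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

S = [first_order_index(X[:, i], Y) for i in range(3)]
print(np.round(S, 3))
```

No fresh model runs are needed beyond the one existing sample, which is the computational appeal of given-data estimation; for the Ishigami function the analytic first-order indices are roughly 0.314, 0.442, and 0.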
References
Journal ArticleDOI

A mathematical theory of communication

TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Book

Elements of information theory

TL;DR: The authors examine the role of entropy, inequality, and randomness in the design and construction of codes.
Proceedings Article

Information Theory and an Extension of the Maximum Likelihood Principle

H. Akaike
TL;DR: The classical maximum likelihood principle can be considered a method of asymptotic realization of an optimum estimate with respect to a very general information-theoretic criterion, providing answers to many practical problems of statistical model fitting.
Book ChapterDOI

Information Theory and an Extension of the Maximum Likelihood Principle

TL;DR: In this paper, it is shown that the classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion.