Proceedings ArticleDOI

Learning and the language of thought

TL;DR
The paper describes the Probabilistic Language of Thought approach, which brings logic and probability together into compositional representations with probabilistic meaning, formalized as stochastic lambda calculus.
Abstract
Logic and probability are key themes of cognitive science that have long had an uneasy coexistence. I will describe the Probabilistic Language of Thought approach, which brings them together into compositional representations with probabilistic meaning, formalized as stochastic lambda calculus. I will then show how this general framework is realized in the probabilistic programming language Church.
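In Church, a generative model is an ordinary stochastic function, and inference means conditioning that function on observations. The following is a minimal Python sketch of that semantics rather than Church itself: `flip` mirrors Church's flip primitive, `rejection_query` crudely mirrors its rejection-query form, and the coin example is an illustration of mine, not one from the paper.

```python
import random

def flip(p=0.5):
    """Bernoulli draw, analogous to Church's `flip` primitive."""
    return random.random() < p

def rejection_query(model, condition, n=1000):
    """Crude analog of Church's rejection-query: sample traces from
    the generative model, keep only those satisfying the condition."""
    accepted = []
    while len(accepted) < n:
        trace = model()
        if condition(trace):
            accepted.append(trace)
    return accepted

def coin_model():
    w = random.random()                         # uniform prior on coin weight
    return {"w": w, "obs": [flip(w), flip(w)]}  # two flips of that coin

# Condition on both flips coming up heads; the surviving weights
# approximate the posterior over w.
posterior = rejection_query(coin_model, lambda t: all(t["obs"]))
print(sum(t["w"] for t in posterior) / len(posterior))
```

Running this prints an estimate near 0.75, the posterior mean of a uniform-prior coin weight after two observed heads; the point is that the model is a compositional program and conditioning gives it probabilistic meaning.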


Citations
Journal ArticleDOI

Building machines that learn and think like people.

TL;DR: A review of recent progress in cognitive science suggests that truly human-like learning and thinking machines will have to reach beyond current engineering trends in both what they learn and how they learn it.
Journal ArticleDOI

The logical primitives of thought: Empirical foundations for compositional cognitive models.

TL;DR: This work shows how different sets of LOT primitives, embedded in a psychologically realistic approximate Bayesian inference framework, systematically predict distinct learning curves in rule-based concept learning experiments, demonstrating how specific LOT theories can be distinguished empirically.
Posted Content

DreamCoder: Growing generalizable, interpretable knowledge with wake-sleep Bayesian program learning

TL;DR: DreamCoder, a system that learns to solve problems by writing programs, is presented; it builds expertise by creating programming languages for expressing domain concepts, together with neural networks that guide the search for programs within those languages.
Journal ArticleDOI

Grammatical morphology as a source of early number word meanings

TL;DR: Although exposure to counting routines is important for learning number word meanings, hearing number words used outside of these routines, in the quantificational structures of language, may also be highly important in early acquisition.
Journal ArticleDOI

Holistic Reinforcement Learning: The Role of Structure and Attention.

TL;DR: This work proposes an integration of Bayesian cognitive models with reinforcement learning, in which structured knowledge learned via approximate Bayesian inference acts as a source of selective attention, biasing reinforcement learning towards the relevant dimensions of the environment.
References
Book

The Theory of Parsing, Translation, and Compiling

TL;DR: The authors hope that the algorithms and concepts presented in this book will survive the next generation of computers and programming languages, and that at least some of them will be applicable to fields other than compiler writing.
Journal ArticleDOI

Trainable grammars for speech recognition

TL;DR: This paper presents a generalization of Baum-Welch-style training algorithms to certain denumerable-state hidden Markov processes, permitting automatic training of the stochastic analog of an arbitrary context-free grammar.
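The stochastic analog of a context-free grammar is a probabilistic CFG, and training of this kind rests on "inside" probabilities computed over spans of the input. The following is a hedged sketch of that computation, assuming a grammar in Chomsky normal form; all names are illustrative, and this is not the paper's exact algorithm.

```python
from collections import defaultdict

def inside(words, lexical, binary, symbols):
    """Inside probabilities for a PCFG in Chomsky normal form.
    lexical[(A, w)] = P(A -> w); binary[(A, B, C)] = P(A -> B C)."""
    n = len(words)
    beta = defaultdict(float)  # beta[(i, j, A)] = P(A =>* words[i..j])
    for i, w in enumerate(words):
        for A in symbols:
            beta[(i, i, A)] = lexical.get((A, w), 0.0)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for (A, B, C), p in binary.items():
                for k in range(i, j):  # split point between the two children
                    beta[(i, j, A)] += p * beta[(i, k, B)] * beta[(k + 1, j, C)]
    return beta

# Toy grammar: S -> N V, N -> "time", V -> "flies".
lexical = {("N", "time"): 1.0, ("V", "flies"): 1.0}
binary = {("S", "N", "V"): 1.0}
beta = inside(["time", "flies"], lexical, binary, {"S", "N", "V"})
print(beta[(0, 1, "S")])  # 1.0: probability the grammar derives the sentence
```

In inside-outside training these quantities are combined with the corresponding "outside" probabilities to re-estimate rule weights, in direct analogy with the forward-backward pass for hidden Markov models.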
Book

Grammatical Inference: Learning Automata and Grammars

TL;DR: The author describes techniques and algorithms for learning from text, from an informant, or through interaction with the environment; the objects learned are automata, grammars, rewriting systems, pattern languages, or transducers.
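A common starting point for the state-merging learners surveyed in this literature is the prefix-tree acceptor built from positive examples. The sketch below is my own illustration of that construction, not code from the book.

```python
def build_pta(positive):
    """Build a prefix-tree acceptor from positive example strings.
    Returns (transitions, accepting): transitions maps (state, symbol)
    to a state; state 0 is the root (the empty prefix)."""
    transitions = {}
    accepting = set()
    next_state = 1
    for word in positive:
        state = 0
        for symbol in word:
            if (state, symbol) not in transitions:
                transitions[(state, symbol)] = next_state
                next_state += 1
            state = transitions[(state, symbol)]
        accepting.add(state)
    return transitions, accepting

trans, acc = build_pta(["ab", "abb", "a"])
print(trans)  # shared prefixes map to shared states
print(acc)    # states reached at the end of each example
```

State-merging algorithms then generalize by collapsing states of this acceptor, subject to consistency checks against the data.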
Journal ArticleDOI

A Blend of Different Tastes: The Language of Coffeemakers

TL;DR: A grammar that describes a language of coffeemakers is presented and shown to generate a large class of coffeemakers currently on the market, as well as new designs that could be introduced to consumers.

Two Experiments on Learning Probabilistic Dependency Grammars from Corpora

TL;DR: This work presents a scheme for learning probabilistic dependency grammars from positive training examples plus constraints on rules, and reports the results of two experiments in which the constraints were minimal; the first experiment was unsuccessful.