Open Access Book
The Soar Cognitive Architecture
TL;DR
This book offers the definitive presentation of Soar from theoretical and practical perspectives, providing comprehensive descriptions of its fundamental aspects and new components, proposing requirements for general cognitive architectures, and explicitly evaluating how well Soar meets those requirements.
Abstract
In development for thirty years, Soar is a general cognitive architecture that integrates knowledge-intensive reasoning, reactive execution, hierarchical reasoning, planning, and learning from experience, with the goal of creating a general computational system that has the same cognitive abilities as humans. In contrast, most AI systems are designed to solve only one type of problem, such as playing chess, searching the Internet, or scheduling aircraft departures. Soar is both a software system for agent development and a theory of what computational structures are necessary to support human-level agents. Over the years, both the software system and the theory have evolved. This book offers the definitive presentation of Soar from theoretical and practical perspectives, providing comprehensive descriptions of its fundamental aspects and new components. The current version of Soar features major extensions, adding reinforcement learning, semantic memory, episodic memory, mental imagery, and an appraisal-based model of emotion. This book describes the details of Soar's component memories and processes and offers demonstrations of individual components, components working in combination, and real-world applications. Beyond these functional considerations, the book also proposes requirements for general cognitive architectures and explicitly evaluates how well Soar meets those requirements.
Citations
Journal Article
Machine learning & artificial intelligence in the quantum domain: a review of recent progress.
TL;DR: In this article, the authors describe the main ideas, recent developments and progress in a broad spectrum of research investigating ML and AI in the quantum domain, and discuss the fundamental issue of quantum generalizations of learning and AI concepts.
Journal Article
Nengo: a Python tool for building large-scale functional brain models.
Trevor Bekolay, James Bergstra, Eric Hunsberger, Travis DeWolf, Terrence C. Stewart, Daniel Rasmussen, Xuan Choo, Aaron R. Voelker, Chris Eliasmith, et al.
TL;DR: Nengo 2.0 is described, which is implemented in Python and uses simple and extendable syntax, simulates a benchmark model on the scale of Spaun 50 times faster than Nengo 1.4, and has a flexible mechanism for collecting simulation results.
Journal Article
The many faces of working memory and short-term storage.
TL;DR: This paper delineates nine previously used definitions of working memory, explains how additional definitions may emerge from combinations of these nine, and illustrates the potential advantages of clarity about definitions of working memory and short-term storage.
Posted Content
Machine learning & artificial intelligence in the quantum domain
Vedran Dunjko, Hans J. Briegel, et al.
TL;DR: The main ideas, recent developments and progress are described in a broad spectrum of research investigating ML and AI in the quantum domain, investigating how results and techniques from one field can be used to solve the problems of the other.
Journal Article
40 years of cognitive architectures: core cognitive abilities and practical applications
Iuliia Kotseruba, John K. Tsotsos, et al.
TL;DR: This survey describes a variety of methods and ideas that have been tried for modeling human cognitive abilities, along with their relative success, and identifies which aspects of cognitive behavior need further research into their mechanistic counterparts, which can in turn inform how cognitive science might progress.
References
Book
Soar: an architecture for general intelligence
TL;DR: This book presents Soar, an implemented proposal for a foundation for a system capable of general intelligent behavior, and describes its organizational principles, the system as currently implemented, and demonstrations of its capabilities.
Journal Article
Lipschitzian optimization without the Lipschitz constant
TL;DR: In this article, the Lipschitz constant is viewed as a weighting parameter that indicates how much emphasis to place on global versus local search; this view accounts for the fast convergence of the new algorithm on the test functions.
Journal Article
Toward memory-based reasoning
Craig Stanfill, David L. Waltz, et al.
TL;DR: The intensive use of memory to recall specific episodes from the past, rather than rules, should be the foundation of machine reasoning.