Book

The Handbook of Brain Theory and Neural Networks

01 Jan 2007
TL;DR: The second edition of the handbook surveys brain theory and neural networks in almost 300 cross-referenced articles, organized around two questions: how does the brain work, and how can we build intelligent machines?
Abstract: From the Publisher: Dramatically updating and extending the first edition, published in 1995, the second edition of The Handbook of Brain Theory and Neural Networks presents the enormous progress made in recent years in the many subfields related to the two great questions: How does the brain work? And how can we build intelligent machines? Once again, the heart of the book is a set of almost 300 articles covering the whole spectrum of topics in brain theory and neural networks. The first two parts of the book, prepared by Michael Arbib, are designed to help readers orient themselves in this wealth of material. Part I provides general background on brain modeling and on both biological and artificial neural networks. Part II consists of "Road Maps" to help readers steer through articles in part III on specific topics of interest. The articles in part III are written so as to be accessible to readers of diverse backgrounds. They are cross-referenced and provide lists of pointers to Road Maps, background material, and related reading. The second edition greatly increases the coverage of models of fundamental neurobiology, cognitive neuroscience, and neural network approaches to language. It contains 287 articles, compared to the 266 in the first edition. Articles on topics from the first edition have been updated by the original authors or written anew by new authors, and there are 106 articles on new topics.
Citations
Journal ArticleDOI
TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, and reviews deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.

14,635 citations

Journal ArticleDOI
TL;DR: The major concepts and results recently achieved in the study of the structure and dynamics of complex networks are reviewed, and the relevant applications of these ideas in many different disciplines are summarized, ranging from nonlinear science to biology, from statistical mechanics to medicine and engineering.

9,441 citations

Journal ArticleDOI
08 Mar 2001 - Nature
TL;DR: This work aims to understand how an enormous network of interacting dynamical systems — be they neurons, power stations or lasers — will behave collectively, given their individual dynamics and coupling architecture.
Abstract: The study of networks pervades all of science, from neurobiology to statistical physics. The most basic issues are structural: how does one characterize the wiring diagram of a food web or the Internet or the metabolic network of the bacterium Escherichia coli? Are there any unifying principles underlying their topology? From the perspective of nonlinear dynamics, we would also like to understand how an enormous network of interacting dynamical systems-be they neurons, power stations or lasers-will behave collectively, given their individual dynamics and coupling architecture. Researchers are only now beginning to unravel the structure and dynamics of complex networks.

7,665 citations
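The question this abstract poses, how a network of coupled dynamical units behaves collectively given individual dynamics and coupling architecture, can be made concrete with a small numerical sketch. The example below is illustrative only and is not from the paper; the ring topology, coupling strength, frequency spread, and step size are arbitrary choices. It integrates Kuramoto-style phase oscillators coupled along the edges of a ring graph and reports the standard synchronization order parameter.

```python
# Hypothetical illustration: Kuramoto phase oscillators coupled on a ring graph.
# Not from the paper; topology, coupling K, and frequencies are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n = 50                                  # number of oscillators (nodes)
omega = rng.normal(0.0, 0.5, size=n)    # natural frequencies
K = 1.5                                 # coupling strength
dt, steps = 0.01, 5000

# Adjacency matrix of a ring: each node is coupled to its two neighbours.
A = np.zeros((n, n))
for i in range(n):
    A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1.0

theta = rng.uniform(0, 2 * np.pi, size=n)
for _ in range(steps):
    # d(theta_i)/dt = omega_i + K * sum_j A_ij * sin(theta_j - theta_i)
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + K * coupling)

# Order parameter r in [0, 1]: 1 means full phase synchrony.
r = np.abs(np.exp(1j * theta).mean())
print(f"synchronization order parameter r = {r:.3f}")
```

Varying K or the wiring (for example, replacing the ring with a denser random graph) changes how readily the population synchronizes, which is the structure-dynamics interplay the review highlights.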

Book
01 Jan 1996
TL;DR: Professor Ripley brings together two crucial ideas in pattern recognition, statistical methods and machine learning via neural networks, in this self-contained account.
Abstract: From the Publisher: Pattern recognition has long been studied in relation to many different (and mainly unrelated) applications, such as remote sensing, computer vision, space research, and medical imaging. In this book Professor Ripley brings together two crucial ideas in pattern recognition: statistical methods and machine learning via neural networks. Unifying principles are brought to the fore, and the author gives an overview of the state of the subject. Many examples are included to illustrate real problems in pattern recognition and how to overcome them. This is a self-contained account, ideal both as an introduction for non-specialist readers and as a handbook for the more expert reader.

5,632 citations

Book ChapterDOI
04 Jul 2014

4,238 citations

References
Journal ArticleDOI
TL;DR: An analogy between images and statistical mechanics systems is made, and the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, creating a highly parallel "relaxation" algorithm for MAP estimation.
Abstract: We make an analogy between images and statistical mechanics systems. Pixel gray levels and the presence and orientation of edges are viewed as states of atoms or molecules in a lattice-like physical system. The assignment of an energy function in the physical system determines its Gibbs distribution. Because of the Gibbs distribution-Markov random field (MRF) equivalence, this assignment also determines an MRF image model. The energy function is a more convenient and natural mechanism for embodying picture attributes than are the local characteristics of the MRF. For a range of degradation mechanisms, including blurring, nonlinear deformations, and multiplicative or additive noise, the posterior distribution is an MRF with a structure akin to the image model. By the analogy, the posterior distribution defines another (imaginary) physical system. Gradual temperature reduction in the physical system isolates low energy states ("annealing"), or what is the same thing, the most probable states under the Gibbs distribution. The analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations. The result is a highly parallel "relaxation" algorithm for MAP estimation. We establish convergence properties of the algorithm and we experiment with some simple pictures, for which good restorations are obtained at low signal-to-noise ratios.

18,761 citations
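A rough sketch of the kind of procedure the abstract describes, under my own simplifications rather than the authors' formulation: a binary image is restored by resampling pixels from their conditional Gibbs distribution under an Ising-style MRF prior plus a data term, while the temperature is lowered on a geometric schedule so the chain settles toward a low-energy, MAP-like configuration. The weights beta and lam, the annealing schedule, and the test image are arbitrary illustrative choices.

```python
# Minimal sketch of MRF-based restoration by annealed Gibbs sampling,
# in the spirit of Geman & Geman (1984); beta, lam, and the annealing
# schedule are arbitrary illustrative choices, not the paper's values.
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth binary image (a filled square) and a noisy observation of it.
truth = np.zeros((32, 32), dtype=int)
truth[8:24, 8:24] = 1
noisy = np.where(rng.random(truth.shape) < 0.2, 1 - truth, truth)

beta, lam = 1.0, 1.5          # smoothness (prior) and data-fidelity weights
x = noisy.copy()
H, W = x.shape

def local_energy(img, i, j, val):
    """Energy contribution of setting pixel (i, j) to val."""
    e = lam * (val != noisy[i, j])                  # data term
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < H and 0 <= nj < W:
            e += beta * (val != img[ni, nj])        # pairwise smoothness term
    return e

for sweep in range(30):
    T = 3.0 * 0.9 ** sweep                           # geometric annealing schedule
    for i in range(H):
        for j in range(W):
            e0 = local_energy(x, i, j, 0)
            e1 = local_energy(x, i, j, 1)
            p1 = 1.0 / (1.0 + np.exp((e1 - e0) / T)) # Gibbs conditional P(x_ij = 1)
            x[i, j] = int(rng.random() < p1)

print("noisy error rate:   ", (noisy != truth).mean())
print("restored error rate:", (x != truth).mean())
```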

Journal ArticleDOI
TL;DR: The orderly variation of cell discharge with the direction of movement and the fact that cells related to only one of the eight directions of movement tested were rarely observed indicate that movements in a particular direction are not subserved by motor cortical cells uniquely related to that movement.
Abstract: The activity of single cells in the motor cortex was recorded while monkeys made arm movements in eight directions (at 45-degree intervals) in a two-dimensional apparatus. These movements started from the same point and were of the same amplitude. The activity of 606 cells related to proximal arm movements was examined in the task; 323 of the 606 cells were active in that task and were studied in detail. The frequency of discharge of 241 of the 323 cells (74.6%) varied in an orderly fashion with the direction of movement. Discharge was most intense with movements in a preferred direction and was reduced gradually when movements were made in directions farther and farther away from the preferred one. This resulted in a bell-shaped directional tuning curve. These relations were observed for cell discharge during the reaction time, the movement time, and the period that preceded the earliest changes in the electromyographic activity (approximately 80 msec before movement onset). In about 75% of the 241 directionally tuned cells, the frequency of discharge, D, was a sinusoidal function of the direction of movement, theta: D = b0 + b1 sin(theta) + b2 cos(theta), or, in terms of the preferred direction theta0: D = b0 + c1 cos(theta - theta0), where b0, b1, b2, and c1 are regression coefficients. Preferred directions differed for different cells so that the tuning curves partially overlapped. The orderly variation of cell discharge with the direction of movement and the fact that cells related to only one of the eight directions of movement tested were rarely observed indicate that movements in a particular direction are not subserved by motor cortical cells uniquely related to that movement. It is suggested, instead, that a movement trajectory in a desired direction might be generated by the cooperation of cells with overlapping tuning curves. The nature of this hypothetical population code for movement direction remains to be elucidated.

2,049 citations
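The cosine-tuning relation quoted in the abstract lends itself to a short numerical illustration. The sketch below is hypothetical (the cell count, baseline rate, modulation depth, and Poisson noise model are made-up parameters): it simulates a population of cosine-tuned cells and decodes movement direction with the population-vector rule, summing each cell's preferred-direction unit vector weighted by its baseline-subtracted rate.

```python
# Hypothetical sketch of cosine directional tuning and population-vector
# decoding in the spirit of Georgopoulos et al.; all parameters
# (cell count, baseline, modulation depth, noise) are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_cells = 200
preferred = rng.uniform(0, 2 * np.pi, size=n_cells)   # preferred directions theta0
b0, c1 = 20.0, 15.0                                    # baseline and modulation (spikes/s)

def rates(direction):
    """Mean rates D = b0 + c1 * cos(theta - theta0), with Poisson spiking noise."""
    mean = b0 + c1 * np.cos(direction - preferred)
    return rng.poisson(np.clip(mean, 0, None))

def population_vector(r):
    """Sum of preferred-direction unit vectors weighted by (rate - baseline)."""
    w = r - b0
    vx = np.sum(w * np.cos(preferred))
    vy = np.sum(w * np.sin(preferred))
    return np.arctan2(vy, vx) % (2 * np.pi)

true_dir = np.deg2rad(135.0)
decoded = population_vector(rates(true_dir))
print(f"true direction    : {np.rad2deg(true_dir):.1f} deg")
print(f"decoded direction : {np.rad2deg(decoded):.1f} deg")
```

With a few hundred broadly tuned cells, the decoded angle lands close to the true direction even though no single cell is selective for it, which is the population-coding point the abstract makes.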


"The Handbook of Brain Theory and Ne..." refers methods in this paper

  • ...It was also possible to show that these cells formed an interpretable population code that could sometimes identify individual faces using the population vector technique (Georgopoulos et al., 1982; see REACHING: CODING IN MOTOR CORTEX)....

    [...]

Journal ArticleDOI
TL;DR: There are several lines of evidence suggesting that a possible site for further processing of visual information and perhaps even for storage of such information might, in the monkey, be inferotemporal cortex, the cortex on the inferior convexity of the temporal lobe.
Abstract: IN THE LAST DECADE, considerable progress has been made in understanding the physiology of one of the most fundamental aspects of human experience: perception of the visual world. It is now clear that the retina and visual pathways do not simply transmit a mosaic of light and dark to some central sensorium. Rather, even at the retinal level, specific features of visual stimuli are detected and their presence communicated to the next level. In cats and monkeys, the geniculostriate visual system consists of a series of converging and diverging connections such that at each successive tier of processing mechanism, single neurons respond to increasingly more specific visual stimuli falling on an increasingly wider area of the retina (19-Z). How far does this analytical-synthetic process continue whereby individual cells have more and more specific trigger features? Are there regions of the brain beyond striate and prestriate cortex where this processing of visual information is carried further? If so, how far and in what way? Are there cells that are concerned with the storage of visual information as well as its analysis? There are several lines of evidence suggesting that a possible site for further processing of visual information and perhaps even for storage of such information might, in the monkey, be inferotemporal cortex, the cortex on the inferior convexity of the temporal lobe. First, this area receives afferents from prestriate cortex which itself processes visual information received from

1,449 citations


"The Handbook of Brain Theory and Ne..." refers background or methods in this paper

  • ...Cells’ preferences in IT are often difficult to account for by reference to simple stimulus features, such as orientation, motion, position, or color, and they appear to lie in the domain of shape (Gross, Rocha-Miranda, and Bender, 1972; Tanaka et al., 1991)....

    [...]

  • ...One, which has been widely employed (Gross et al., 1972) but which has recently been applied as systematically as possible by Tanaka and his colleagues (Fujita et al., 1992; Tanaka et al., 1991), has been to try to determine preferred features of cells by simplifying the stimuli that excite them....

    [...]

01 Jan 2011
TL;DR: A reference on the numerical analysis of spectral methods for partial differential equations, covering their theory and applications in areas such as fluid dynamics and incompressible viscous flow.

1,425 citations
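Since this reference collects texts on the numerical analysis of spectral methods for differential equations, a minimal example of the core idea may be useful: a smooth periodic function is differentiated by multiplying its Fourier coefficients by ik, which is spectrally accurate on a periodic grid. The test function and grid size below are arbitrary illustrative choices, not taken from the reference.

```python
# Minimal sketch of Fourier spectral differentiation on a periodic domain.
# The test function and grid size are arbitrary illustrative choices.
import numpy as np

n = 64
x = 2 * np.pi * np.arange(n) / n          # periodic grid on [0, 2*pi)
f = np.exp(np.sin(x))                     # smooth periodic test function
exact = np.cos(x) * f                     # exact derivative of exp(sin x)

k = 1j * np.fft.fftfreq(n, d=1.0 / n)     # i * integer wavenumbers for this grid
df = np.real(np.fft.ifft(k * np.fft.fft(f)))

print("max spectral-derivative error:", np.abs(df - exact).max())
```

For a smooth periodic function like this, the error is near machine precision even on a coarse grid, in contrast to the algebraic convergence of finite-difference formulas.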