Open Access Book

Neural Networks And Learning Machines

Simon Haykin
TL;DR
Refocused, revised and renamed to reflect the duality of neural networks and learning machines, this edition recognizes that the subject matter is richer when these topics are studied together.
Abstract
For graduate-level neural network courses offered in the departments of Computer Engineering, Electrical Engineering, and Computer Science. Neural Networks and Learning Machines, Third Edition is renowned for its thoroughness and readability. This well-organized and completely up-to-date text remains the most comprehensive treatment of neural networks from an engineering perspective, and it is ideal for professional engineers and research scientists. Matlab codes used for the computer experiments in the text are available for download at: http://www.pearsonhighered.com/haykin/ Refocused, revised, and renamed to reflect the duality of neural networks and learning machines, this edition recognizes that the subject matter is richer when these topics are studied together. Ideas drawn from neural networks and machine learning are hybridized to perform improved learning tasks beyond the capability of either independently.


Citations
Journal Article

State-of-the-art in artificial neural network applications: A survey

TL;DR: The study found that neural-network models such as feedforward and feedback propagation artificial neural networks perform better in their application to human problems, and it proposes feedforward and feedback propagation ANN models as a research focus based on data-analysis factors such as accuracy, processing speed, latency, fault tolerance, volume, scalability, convergence, and performance.
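As a point of reference for the feedforward models this survey discusses, the following is a minimal sketch of a single-hidden-layer feedforward forward pass in Python/NumPy; it is not taken from the paper, and the layer sizes, tanh activation, and variable names are illustrative assumptions.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Forward pass of a single-hidden-layer feedforward network."""
    h = np.tanh(W1 @ x + b1)   # hidden layer with tanh activation
    return W2 @ h + b2         # linear output layer

# Illustrative shapes: 4 inputs, 8 hidden units, 2 outputs
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)
print(forward(rng.normal(size=4), W1, b1, W2, b2))
```

A feedback (recurrent) model would additionally feed the hidden activations back into the next step; the survey compares such architectures on the criteria listed above.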
Journal Article

Investigating Critical Frequency Bands and Channels for EEG-Based Emotion Recognition with Deep Neural Networks

TL;DR: The experimental results show that neural signatures associated with different emotions do exist and that they share commonality across sessions and individuals, and the performance of deep models is compared with that of shallow models.
Proceedings Article

Machine learning with adversaries: byzantine tolerant gradient descent

TL;DR: Krum is proposed, an aggregation rule that satisfies a resilience property capturing the basic requirements for guaranteeing convergence despite f Byzantine workers, and it is argued to be the first provably Byzantine-resilient algorithm for distributed SGD.
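A minimal sketch of the Krum scoring rule as summarized above, assuming n worker gradients of which at most f are Byzantine; the function name, array shapes, and toy example are illustrative, not the paper's reference implementation.

```python
import numpy as np

def krum(grads, f):
    """Krum: return the gradient whose n - f - 2 nearest neighbours
    (in squared Euclidean distance) give the smallest total distance."""
    n = len(grads)
    k = n - f - 2                                   # neighbours counted in each score
    dists = np.sum((grads[:, None, :] - grads[None, :, :]) ** 2, axis=-1)
    scores = [np.sort(np.delete(dists[i], i))[:k].sum() for i in range(n)]
    return grads[int(np.argmin(scores))]

# Toy example: 7 workers, at most 2 Byzantine (two gradients are outliers)
rng = np.random.default_rng(0)
g = np.vstack([np.ones((5, 3)) + 0.01 * rng.normal(size=(5, 3)),
               10.0 * np.ones((2, 3))])
print(krum(g, f=2))   # selects one of the five honest gradients
```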
Journal Article

Digital Coherent Optical Receivers: Algorithms and Subsystems

TL;DR: In this article, a theoretical analysis of the dual-polarization constant modulus algorithm is presented, and the control surfaces of several different equalizer algorithms are derived, including the decision-directed, trained, and radially directed equalizers, for both polarization-division-multiplexed quadriphase-shift-keyed (PDM-QPSK) and 16-level quadrature amplitude modulation (PDM-16-QAM) signals.
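For orientation, here is a minimal single-polarization sketch of the constant modulus tap update that the dual-polarization analysis builds on; the tap count, step size, centre-spike initialization, and function name are illustrative assumptions rather than the article's butterfly-structured equalizer.

```python
import numpy as np

def cma_equalize(x, n_taps=11, mu=1e-3, R2=1.0):
    """Constant modulus algorithm: adapt FIR taps w so that the
    equalizer output y has (approximately) constant modulus R2."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                      # centre-spike initialization
    y = np.zeros(len(x) - n_taps, dtype=complex)
    for k in range(len(y)):
        xk = x[k:k + n_taps]
        y[k] = np.dot(w, xk)                  # equalizer output
        e = R2 - np.abs(y[k]) ** 2            # constant-modulus error
        w += mu * e * y[k] * np.conj(xk)      # stochastic-gradient tap update
    return y, w
```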
References
Journal Article

30 years of adaptive neural networks: perceptron, Madaline, and backpropagation

TL;DR: The history, origination, operating characteristics, and basic theory of several supervised neural-network training algorithms (including the perceptron rule, the least-mean-square algorithm, three Madaline rules, and the backpropagation technique) are described.
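As a small illustration of one of the algorithms covered, the least-mean-square (Widrow-Hoff) rule, the following Python/NumPy sketch adapts a linear unit with the instantaneous-error update; the step size, epoch count, and example data are illustrative assumptions.

```python
import numpy as np

def lms_train(X, d, mu=0.01, epochs=50):
    """Least-mean-square (Widrow-Hoff) rule: adapt weights w so that
    w @ x tracks the desired response d, using the instantaneous error
    as a one-sample estimate of the gradient of the squared error."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            e = target - w @ x    # instantaneous error
            w += mu * e * x       # LMS weight update
    return w

# Toy example: recover the linear mapping d = [2, -1] @ x
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
print(lms_train(X, X @ np.array([2.0, -1.0])))   # approximately [2, -1]
```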
Book

Neural Networks: A Systematic Introduction

Raúl Rojas
Journal Article

Perceptron-based learning algorithms

TL;DR: The heart of these algorithms is the pocket algorithm, a modification of perceptron learning that makes perceptron learning well-behaved with nonseparable training data, even if the data are noisy and contradictory.
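A minimal sketch of the pocket algorithm as described above, assuming labels in {-1, +1} and an input matrix that already includes a bias column; the function name, epoch count, and evaluate-on-every-update strategy are illustrative choices, not necessarily the paper's exact formulation.

```python
import numpy as np

def pocket_perceptron(X, y, epochs=100):
    """Pocket algorithm: run ordinary perceptron updates, but keep 'in
    the pocket' the weight vector that has classified the most training
    samples correctly so far, so a usable hypothesis survives even when
    the data are nonseparable, noisy, or contradictory."""
    w = np.zeros(X.shape[1])
    pocket_w, pocket_correct = w.copy(), 0
    for _ in range(epochs):
        for x, t in zip(X, y):
            if np.sign(w @ x) != t:
                w = w + t * x                        # perceptron update on a mistake
                correct = int(np.sum(np.sign(X @ w) == y))
                if correct > pocket_correct:         # better than the pocketed weights?
                    pocket_w, pocket_correct = w.copy(), correct
    return pocket_w
```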
Book Chapter

Neural Networks for Control

TL;DR: This paper starts by placing neural-net techniques in a general nonlinear control framework and then surveys several basic theoretical results on networks.