Journal ArticleDOI

Neural networks

TLDR
The development and evolution of different topics related to neural networks are described, showing that the field has achieved maturity and consolidation, as proven by its competitiveness in solving real-world problems.
About
This article was published in Neurocomputing on 2016-11-19 and has received 184 citations to date. It focuses on the topics: Neural modeling fields & Nervous system network models.


Citations
Journal ArticleDOI

Novel deep genetic ensemble of classifiers for arrhythmia detection using ECG signals

TL;DR: The proposed method is evaluated on 744 ECG signal segments obtained from the MIT-BIH Arrhythmia database and can be deployed in cloud computing or on mobile devices to assess cardiac health immediately with high precision.
Journal ArticleDOI

Recommendation system based on deep learning methods: a systematic review and new directions

TL;DR: This paper is the first systematic literature review (SLR) focused specifically on deep-learning-based recommendation systems (RS); it summarizes and analyzes existing studies drawn from the best-quality research publications and finds that autoencoders are the most widely exploited deep learning architectures for RS, followed by convolutional neural networks and recurrent neural networks.
Journal ArticleDOI

Deep learning approach for microarray cancer data classification

TL;DR: A deep feedforward method is developed to classify microarray cancer data into a set of classes for subsequent diagnosis, using a 7-layer deep neural network architecture whose parameters are tuned for each dataset.
Journal ArticleDOI

Supervised learning in spiking neural networks: A review of algorithms and evaluations

TL;DR: This article presents a comprehensive review of supervised learning algorithms for spiking neural networks, evaluates them qualitatively and quantitatively, proposes five qualitative performance evaluation criteria, and presents a new taxonomy of supervised learning algorithms based on these criteria.
Journal ArticleDOI

Non-iterative and Fast Deep Learning: Multilayer Extreme Learning Machines

TL;DR: This article presents a thorough review of the development of multilayer extreme learning machines (ML-ELMs), including the stacked ELM autoencoder, residual ELM, and local-receptive-field-based ELM (ELM-LRF); addresses their applications; and discusses the connection between random neural networks and conventional deep learning.
References
Book ChapterDOI

FPGA Implementations of Neural Networks – A Survey of a Decade of Progress

TL;DR: A taxonomy for classifying FPGA implementations of artificial neural networks (ANNs) is provided and different implementation techniques and design issues are discussed.
Journal ArticleDOI

Multilayer Feedforward Neural Network Based on Multi-valued Neurons (MLMVN) and a Backpropagation Learning Algorithm

TL;DR: It is shown that combining the traditional multilayer feedforward (MLF) architecture with the high functionality of multi-valued neurons (MVNs) yields a new, powerful neural network.
Journal ArticleDOI

A locally recurrent fuzzy neural network with application to the wind speed prediction using spatial correlation

TL;DR: Extensive simulation results demonstrate that the LF-DFNN models outperform other network types suggested in the literature, and that DRPE outperforms three gradient descent algorithms in training the recurrent forecast models.
Journal ArticleDOI

On numerical simulations of integrate-and-fire neural networks

TL;DR: It is shown that very small time steps are required to correctly reproduce the synchronization properties of large networks of integrate-and-fire neurons when the differential system describing their dynamics is integrated with the standard Euler or second-order Runge-Kutta algorithms.
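The time-step sensitivity summarized above can be sketched with a toy forward-Euler simulation of a single leaky integrate-and-fire neuron. This is only an illustration of the discretization effect, not the paper's experiment; all parameter values are hypothetical:

```python
def simulate_lif_euler(i_ext=1.5, dt=0.1, t_max=100.0,
                       tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Forward-Euler simulation of a leaky integrate-and-fire neuron:
    dV/dt = (-(V - v_rest) + i_ext) / tau, with spike-and-reset at v_thresh.
    Parameter values are hypothetical, chosen only for illustration.
    """
    n_steps = int(t_max / dt)
    v = v_rest
    spike_times = []
    for step in range(n_steps):
        # One Euler step of the membrane equation.
        v += dt * (-(v - v_rest) + i_ext) / tau
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A coarser time step visibly shifts the spike times: a single-neuron
# symptom of the discretization error that, at network scale, distorts
# the synchronization properties the paper analyzes.
fine = simulate_lif_euler(dt=0.01)
coarse = simulate_lif_euler(dt=1.0)
```

With these settings the coarse run fires its first spike noticeably earlier than the fine run, because a large Euler step overshoots the membrane trajectory near threshold.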
Journal ArticleDOI

Efficient Event-Driven Simulation of Large Networks of Spiking Neurons and Dynamical Synapses

TL;DR: The main impact of the new approach is a drastic reduction of the computational load incurred upon introduction of dynamic synaptic efficacies, which vary organically as a function of the activities of the pre- and postsynaptic neurons.
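The core idea behind event-driven simulation, updating a neuron's state only when a spike event arrives and using the closed-form solution of the dynamics in between, can be sketched as follows. This is a minimal toy sketch, not the cited paper's algorithm, and the class name and parameters are hypothetical:

```python
import math

class LazyLIF:
    """Event-driven leaky integrate-and-fire neuron: the membrane
    potential is advanced only when a spike arrives, using the exact
    exponential decay over the elapsed interval, so no time-stepped
    loop is needed between events. Toy sketch with hypothetical
    parameter values.
    """
    def __init__(self, tau=10.0, v_thresh=1.0):
        self.tau = tau
        self.v_thresh = v_thresh
        self.v = 0.0
        self.last_t = 0.0

    def receive_spike(self, t, weight):
        # Decay analytically over the interval since the last event,
        # then apply the incoming synaptic weight.
        self.v *= math.exp(-(t - self.last_t) / self.tau)
        self.last_t = t
        self.v += weight
        if self.v >= self.v_thresh:
            self.v = 0.0   # reset after firing
            return True    # neuron emits a spike
        return False

neuron = LazyLIF()
first = neuron.receive_spike(0.0, 0.6)   # subthreshold
second = neuron.receive_spike(1.0, 0.6)  # decayed 0.6 plus 0.6 crosses 1.0
```

Because the state update is exact between events, the cost of a simulation scales with the number of spikes rather than with the number of time steps, which is the source of the computational savings the paper reports.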