Showing papers in "Neural Networks in 2019"
TL;DR: This review critically summarizes the main challenges linked to lifelong learning for artificial learning systems and compares existing neural network approaches that alleviate, to different extents, catastrophic forgetting.
2,095 citations
TL;DR: An overview of recent advances in physical reservoir computing is provided by classifying them according to the type of the reservoir to expand its practical applications and develop next-generation machine learning systems.
959 citations
TL;DR: The emerging picture is that SNNs still lag behind ANNs in terms of accuracy, but the gap is decreasing and can even vanish on some tasks, while SNNs typically require many fewer operations and are the better candidates for processing spatio-temporal data.
756 citations
TL;DR: In this paper, the authors propose transforming the existing univariate time series classification models, the Long Short Term Memory Fully Convolutional Network (LSTM-FCN) and the Attention LSTM-FCN (ALSTM-FCN), into multivariate time series classification models by augmenting the fully convolutional block with a squeeze-and-excitation block to further improve accuracy.
509 citations
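The squeeze-and-excitation augmentation mentioned above is a small, self-contained mechanism: pool each channel over time, pass the pooled vector through two dense layers, and use the resulting per-channel gates to rescale the input. A minimal numpy sketch, where the weight matrices `w1`/`w2` (and the implied reduction ratio) are hypothetical placeholders, not the paper's exact configuration:

```python
import numpy as np

def squeeze_and_excitation(x, w1, w2):
    """Recalibrate channel activations x of shape (channels, timesteps).

    Squeeze: global average pool over time.
    Excite:  dense reduction + ReLU, dense restore + sigmoid,
             producing one gate in (0, 1) per channel.
    """
    z = x.mean(axis=1)                        # squeeze: (channels,)
    s = np.maximum(w1 @ z, 0.0)               # reduction layer + ReLU
    gates = 1.0 / (1.0 + np.exp(-(w2 @ s)))   # restore layer + sigmoid
    return x * gates[:, None]                 # rescale each channel
```

With all-zero weights the gates sit at sigmoid(0) = 0.5, so every channel is uniformly halved; trained weights instead learn which channels to emphasize.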
TL;DR: In this article, the authors provide a review of deep neural network concepts in background subtraction for novices and experts in order to analyze this success and to provide further directions.
278 citations
TL;DR: The differences between multi-task and single-incremental-task scenarios are pointed out; well-known approaches such as LWF, EWC and SI are shown to be not ideal for incremental-task scenarios, and a new approach, denoted AR1, combining architectural and regularization strategies is then specifically proposed.
224 citations
TL;DR: A correlation-based channel selection (CCS) method is proposed to select the channels that contain more correlated information, improving the classification performance of MI-based BCIs, and a novel regularized common spatial pattern (RCSP) method is used to extract effective features.
219 citations
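The channel-scoring idea behind CCS can be sketched in a few lines: score each channel by its mean absolute correlation with the other channels, averaged over trials, and keep the top-scoring ones. This is a minimal illustration of that idea under assumed conventions (the function name and the exact scoring formula are hypothetical, not the paper's precise criterion):

```python
import numpy as np

def select_channels(trials, k):
    """Rank EEG channels by mean absolute inter-channel correlation.

    trials: array of shape (n_trials, n_channels, n_samples).
    Returns the indices of the k most correlated channels.
    """
    n_ch = trials.shape[1]
    scores = np.zeros(n_ch)
    for trial in trials:
        c = np.abs(np.corrcoef(trial))                 # channel-by-channel |correlation|
        scores += (c.sum(axis=1) - 1.0) / (n_ch - 1)   # drop the self-correlation of 1
    scores /= len(trials)
    return np.argsort(scores)[::-1][:k]                # top-k channel indices
```

Channels carrying a shared signal score near 1, while an independent-noise channel scores near 0 and is dropped.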
TL;DR: In this article, the expressive power of DNNs with the ReLU activation function is compared to that of linear spline methods, and it is shown that deep networks perform better than, or only slightly worse than, the considered spline methods.
211 citations
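The comparison above rests on a concrete fact: a one-hidden-layer scalar ReLU network is exactly a piecewise-linear function, i.e. a linear spline with knots where each unit switches on. A minimal numpy sketch with hypothetical toy weights (not taken from the paper):

```python
import numpy as np

def relu_net(x, w, b, v):
    """Scalar one-hidden-layer ReLU network:
    f(x) = sum_i v_i * max(w_i * x + b_i, 0).

    As a function of x this is piecewise linear, with knots at -b_i / w_i,
    which is precisely a linear spline.
    """
    return np.maximum(np.outer(x, w) + b, 0.0) @ v
```

For example, with w = (1, 1), b = (0, -1), v = (1, -1) the network computes max(x, 0) - max(x - 1, 0): zero below 0, equal to x on [0, 1], and saturated at 1 above it, a three-piece spline with knots at 0 and 1.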
TL;DR: In this article, a comparison is presented of a large collection composed of 77 popular regression models belonging to 19 families: linear and generalized linear models, generalized additive models, least squares, projection methods, LASSO and ridge regression, Bayesian models, Gaussian processes, nearest neighbors, regression trees and rules, neural networks, bagging and boosting, deep learning, and support vector regression.
159 citations
TL;DR: Multi-Representation Adaptation Network (MRAN) is presented, which can dramatically improve the classification accuracy for cross-domain image classification and specifically aims to align the distributions of multiple representations extracted by a hybrid structure named the Inception Adaptation Module (IAM).
132 citations
TL;DR: Four different kinds of feedback controllers are designed under which the considered inertial memristor-based neural networks can realize fixed-time synchronization perfectly, and the obtained fixed-time synchronization criteria can be verified by algebraic operations.
TL;DR: Easy adaptation of the proposed technique to different convolutional structures and its efficiency demonstrate that popular deep learning models may be improved with differential convolution.
TL;DR: A novel unsupervised network parameter learning method for RVFL, named sparse pre-trained random vector functional link (SP-RVFL for short) network, which uses a sparse autoencoder with ℓ1-norm regularization to adaptively learn superior network parameters for specific learning tasks.
TL;DR: The controllers in this paper are discrete and state-dependent and can be updated under the event-based triggering condition, which is simpler than in previous results.
TL;DR: In this paper, the authors illustrate an advanced information theoretic methodology to understand the dynamics of learning and the design of autoencoders, a special type of deep learning architectures that resembles a communication channel.
TL;DR: By exploiting inequality techniques and by constructing appropriate Lyapunov functional, several sufficient conditions are obtained in the form of linear matrix inequalities (LMIs), which can be used to ascertain the passivity, output and input strict passivity of delayed RDMNNs.
TL;DR: In this paper, a new type of neural network, quaternion-valued memristive neural networks (QVMNNs), is formulated on the basis of the differential inclusion principle and the Lyapunov functional method, and a criterion of fixed-time synchronization for QVMNNs is given.
TL;DR: In this article, the authors investigate how far they can go on digit (MNIST) and object (CIFAR10) classification with biologically plausible, local learning rules in a network with one hidden layer and a single readout layer.
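A "biologically plausible, local" learning rule in this sense updates each weight using only the activities of the two units it connects, with no backpropagated error signal. A minimal sketch of one such rule, Oja's Hebbian variant, used here as a generic example rather than as the paper's specific rule:

```python
import numpy as np

def local_hebbian_step(w, x, y, lr=0.01):
    """One local update of weights w (post, pre).

    Hebbian growth y_i * x_j plus an Oja-style decay term y_i^2 * w_ij
    that keeps the weights bounded. Each entry of w changes using only
    its own pre-synaptic activity x_j and post-synaptic activity y_i.
    """
    return w + lr * (np.outer(y, x) - (y[:, None] ** 2) * w)
```

Starting from zero weights, a single step is pure Hebbian growth lr * y xᵀ; the decay term only kicks in once the weights are nonzero.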
TL;DR: A neurodynamic optimization model based on an augmented Lagrangian function is proposed, and its states are proven to be asymptotically stable at a strict local minimum in the presence of nonconvexity in the objective function or constraints.
TL;DR: The results suggest that the SW topology is essential for maintaining the echo state property, which is the appropriate neural dynamics between input and output brain regions.
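The echo state property referenced above is, in reservoir-computing practice, usually enforced by rescaling the recurrent weight matrix so its spectral radius (largest eigenvalue magnitude) is below 1, so the reservoir state asymptotically forgets its initial condition. A minimal sketch of that standard recipe (the function name is hypothetical, and the radius-below-1 rule of thumb is the generic construction, not this paper's specific small-world setup):

```python
import numpy as np

def scale_to_echo_state(w_res, target_radius=0.9):
    """Rescale a square reservoir matrix so its spectral radius equals
    target_radius (< 1), the usual rule of thumb for the echo state property."""
    radius = np.max(np.abs(np.linalg.eigvals(w_res)))
    return w_res * (target_radius / radius)
```

The same rescaling applies whatever the reservoir topology, which is what lets studies like this one vary the topology (e.g. small-world vs. random) while holding the spectral radius fixed.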
TL;DR: Li et al. as discussed by the authors proposed a two-stream fusion network (FCTSFN) for interactive image segmentation, which includes two sub-networks: a two-stream late fusion network that predicts the foreground at a reduced resolution, and a multi-scale refining network (MSRN) that refines the foreground at full resolution.
TL;DR: It is demonstrated that DCssCDBM can be extended well into a classification model, rather than serving only as a feature extractor as previously, shows better extraction of high-level representations, and achieves advanced results over several state-of-the-art methods.
TL;DR: The new analytical techniques skillfully overcome the difficulties caused by time-varying delays and cope with the uncertainties of both the Filippov solution and Markov jumping, which enables the settling time to be determined explicitly.
TL;DR: This work proposes a novel unsupervised feature representation approach that incorporates a spectral constraint strategy into adversarial autoencoders (AAE), without any prior knowledge, to obtain more discriminative representations in the hidden nodes.
TL;DR: The estimated properties, in comparison with the computational results, indicate that CNNs perform outstandingly in predicting the physical parameters of rock without conducting any time-demanding forward modeling if enough input data are provided.
TL;DR: Delayed bifurcation criteria of the developed FOQVNN are attained and the exactness and merits of the achieved analytic results are eventually substantiated by a simulation example.
TL;DR: To realize QPS and CS, linear feedback controller and adaptive controller are designed, and a novel fractional-order differential inequality is built by means of Laplace transform and properties of Mittag-Leffler function.
TL;DR: By means of a novel transformation and interval matrix approach, non-fragile estimators are designed and parameter mismatch problem is averted in a class of fractional-order memristive BAM neural networks with and without time delays for the first time.
TL;DR: In this study, an approximated solution of the graph partitioning problem is obtained by using a deterministic annealing neural network algorithm that attempts to obtain a high-quality solution by following a path of minimum points of a barrier problem as the barrier parameter is reduced from a sufficiently large positive number to 0.
TL;DR: An exponential-attenuation-based switching event-trigger (EABSET) scheme is designed to achieve the global stabilization of delayed memristive neural networks (MNNs), with both the controller and the trigger parameters designed accordingly.