
J. Connor

Researcher at University of Washington

Publications -  7
Citations -  1550

J. Connor is an academic researcher from the University of Washington. The author has contributed to research in topics: Recurrent neural network & Artificial neural network. The author has an h-index of 7 and has co-authored 7 publications receiving 1319 citations.

Papers
Journal ArticleDOI

Recurrent neural networks and robust time series prediction

TL;DR: A robust learning algorithm is proposed and applied to recurrent neural networks used as NARMA(p,q) models; these show advantages over feedforward neural networks for time series with a moving average component, and networks trained with the robust algorithm are shown to give better predictions than neural networks trained on the unfiltered time series.
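For context, a standard way to write the NARMA(p,q) form named in the abstract (a textbook formulation, not an excerpt from the paper) is:

```latex
% Nonlinear ARMA(p,q): the next value depends on p past observations and
% q past innovations (prediction errors), plus a new innovation e_t.
x_t = f\bigl(x_{t-1},\dots,x_{t-p},\; e_{t-1},\dots,e_{t-q}\bigr) + e_t,
\qquad e_t = x_t - \hat{x}_t .
```

A feedforward network fed only the lagged observations corresponds to the special case q = 0 (a nonlinear AR(p) model); the recurrent feedback is what lets the past error terms e_{t-i} enter the predictor.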
Journal ArticleDOI

A performance comparison of trained multilayer perceptrons and trained classification trees

TL;DR: It is concluded that there is not enough theoretical basis to demonstrate the clear-cut superiority of one technique over the other in power-system load forecasting, power-system security prediction, and speaker-independent vowel recognition.
Proceedings ArticleDOI

Recurrent neural networks and time series prediction

TL;DR: It is shown that feedforward networks are nonlinear autoregressive models and that recurrent networks can model a larger class of processes, including nonlinear autoregressive moving average models.
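The contrast described here (feedforward = nonlinear AR, recurrent = nonlinear ARMA) can be illustrated with a small, self-contained sketch. The toy series, parameter values, and the two hand-coded predictors below are hypothetical stand-ins for trained networks, chosen only to show why feeding past prediction errors back into the predictor helps on a series with a moving-average component:

```python
import numpy as np

# Illustrative sketch only (not the paper's code): two one-step-ahead
# predictors that differ exactly in the way the abstract describes.
# A feedforward net driven by lagged observations realizes a nonlinear
# AR(p) model; a recurrent net can also use its own past prediction
# errors, giving a nonlinear ARMA(p, q) model.

rng = np.random.default_rng(0)

def nar_predict(f, x, p):
    """Nonlinear AR(p): x_hat[t] = f(x[t-p], ..., x[t-1])."""
    x_hat = np.zeros_like(x)
    for t in range(p, len(x)):
        x_hat[t] = f(x[t - p:t])
    return x_hat

def narma_predict(f, x, p, q):
    """Nonlinear ARMA(p, q): the predictor also sees its own past
    prediction errors, which is what recurrent feedback provides."""
    x_hat = np.zeros_like(x)
    err = np.zeros_like(x)
    for t in range(max(p, q), len(x)):
        x_hat[t] = f(x[t - p:t], err[t - q:t])
        err[t] = x[t] - x_hat[t]
    return x_hat

# Hypothetical "trained" maps standing in for the networks.
f_nar = lambda lags: 0.5 * lags[-1]
f_narma = lambda lags, errs: 0.5 * lags[-1] + 0.4 * errs[-1]

# Toy ARMA(1,1) series: x[t] = 0.5*x[t-1] + e[t] + 0.4*e[t-1].
n = 2000
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + e[t] + 0.4 * e[t - 1]

mse_nar = np.mean((x[10:] - nar_predict(f_nar, x, p=1)[10:]) ** 2)
mse_narma = np.mean((x[10:] - narma_predict(f_narma, x, p=1, q=1)[10:]) ** 2)
print(f"nonlinear AR(1) one-step MSE:     {mse_nar:.3f}")
print(f"nonlinear ARMA(1,1) one-step MSE: {mse_narma:.3f}")
```

On this toy ARMA(1,1) series the predictor with error feedback recovers the moving-average term and achieves a lower one-step MSE than the purely autoregressive one, which is the qualitative point the abstract makes.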
Proceedings Article

Recurrent Networks and NARMA Modeling

TL;DR: It is shown that recurrent neural networks realize a type of nonlinear autoregressive-moving average (NARMA) model; the moving average component of such processes is not well modeled by feedforward networks or linear models, but can be modeled by recurrent networks.
Proceedings ArticleDOI

A performance comparison of trained multilayer perceptrons and trained classification trees

TL;DR: There is not enough theoretical basis for the clear-cut superiority of one technique over the other in terms of classification and prediction outside the training set, and the authors are confident that the univariate version of the trained classification trees does not perform as well as the multilayer perceptron.