
Barry Flower

Researcher at University of Sydney

Publications: 26
Citations: 843

Barry Flower is an academic researcher from the University of Sydney. The author has contributed to the research topics of very-large-scale integration (VLSI) and artificial neural networks. The author has an h-index of 11 and has co-authored 25 publications receiving 825 citations.

Papers
Journal ArticleDOI

Weight perturbation: an optimal architecture and learning technique for analog VLSI feedforward and recurrent multilayer networks

TL;DR: It is shown that gradient descent using a direct approximation of the gradient, rather than back-propagation, is more economical for parallel analog implementations and is also suitable for multilayer recurrent networks.
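The weight-perturbation idea described above can be sketched as follows: instead of back-propagating errors, each weight is nudged by a small delta and the resulting change in loss gives a finite-difference estimate of that partial derivative. A minimal sketch (the toy data, loss function, and step sizes are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def weight_perturbation_step(weights, loss_fn, lr=0.1, delta=1e-3):
    """One weight-perturbation update: estimate each partial derivative
    with a forward finite difference instead of back-propagation."""
    base_loss = loss_fn(weights)
    grad = np.zeros_like(weights)
    for i in range(weights.size):
        perturbed = weights.copy()
        perturbed.flat[i] += delta                    # nudge one weight
        grad.flat[i] = (loss_fn(perturbed) - base_loss) / delta
    return weights - lr * grad

# Toy example: fit a single linear neuron y = w * x to the target y = 2x.
xs = np.array([1.0, 2.0, 3.0])
ys = 2.0 * xs
loss = lambda w: float(np.mean((w[0] * xs - ys) ** 2))

w = np.array([0.0])
for _ in range(200):
    w = weight_perturbation_step(w, loss)
# w converges close to [2.0]
```

Because only forward passes of the network are needed, this scheme maps naturally onto analog hardware, where an exact back-propagation pass is expensive to build.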
Journal ArticleDOI

Multiresolution forecasting for futures trading using wavelet decompositions

TL;DR: This work investigates the effectiveness of a financial time-series forecasting strategy that exploits the multiresolution property of the wavelet transform, choosing short past windows as inputs to the MLPs at lower scales and long past windows at higher scales.
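The multiresolution idea can be illustrated with a Haar wavelet decomposition: the series is repeatedly split into a coarse approximation and detail coefficients, and each scale would then feed its own MLP with a scale-dependent window length. A minimal sketch (the Haar wavelet and toy series are illustrative assumptions; the paper's choice of wavelet and network setup may differ):

```python
import numpy as np

def haar_step(x):
    """One level of an orthonormal Haar decomposition: average adjacent
    pairs for the coarse approximation, difference them for the detail."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def multiresolution(x, levels):
    """Decompose x into detail coefficients at `levels` scales plus a
    final coarse approximation."""
    details = []
    for _ in range(levels):
        x, d = haar_step(x)
        details.append(d)
    return details, x

series = np.arange(16, dtype=float)        # stand-in for a price series
details, approx = multiresolution(series, 3)
# Finer scales (details[0]) would feed MLPs with short past windows;
# coarser scales (details[-1], approx) MLPs with long past windows.
```

Because the Haar filters used here are orthonormal, the decomposition preserves the energy of the series, so no information is lost when splitting it across scales.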
Patent

A neural network

TL;DR: In this paper, a neural network (1) comprises an input port (5) connected to an output port (6) by one or more paths, each of which comprises an alternating series of weights and neurons (2).
Patent

Apparatus and method for the detection and treatment of arrhythmias using a neural network

TL;DR: An apparatus and method for the detection and treatment of arrhythmias using a processor with a hierarchically arranged neural network: a first, lower level classifies individual waveforms; a second, higher level diagnoses detected arrhythmias; and a third, highest level applies therapy in response to a diagnosed arrhythmia, as discussed by the authors.
Proceedings Article

Summed Weight Neuron Perturbation: An O(N) Improvement Over Weight Perturbation

TL;DR: The algorithm presented performs gradient descent on the weight space of an artificial neural network, using finite differences to approximate the gradient, and achieves a computational complexity similar to that of node perturbation, O(N³).
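The gain over plain weight perturbation comes from perturbing many weights in a single trial rather than one at a time. This can be illustrated with a simultaneous random-sign perturbation in the style of SPSA; note this is a stand-in for the family of methods the algorithm belongs to, not the paper's exact procedure, and the toy problem, step sizes, and seed are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simultaneous_perturbation_step(w, loss_fn, lr=0.02, delta=1e-2):
    """Perturb all weights at once with random +/-delta signs; a single
    pair of forward passes then yields an estimate of the whole gradient,
    rather than one forward pass per weight as in weight perturbation."""
    signs = rng.choice([-1.0, 1.0], size=w.shape)
    l_plus = loss_fn(w + delta * signs)
    l_minus = loss_fn(w - delta * signs)
    return w - lr * (l_plus - l_minus) / (2.0 * delta) * signs

# Toy example: fit y = w0 * x + w1 to the target y = 2x + 1.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs + 1.0
loss = lambda w: float(np.mean((w[0] * xs + w[1] - ys) ** 2))

w = np.zeros(2)
for _ in range(3000):
    w = simultaneous_perturbation_step(w, loss)
# w is driven toward the minimum of the loss
```

Each step costs two forward passes regardless of the number of weights, which is the kind of per-trial saving that motivates summed-perturbation schemes.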