
Showing papers by "Amos Storkey published in 1999"


Journal ArticleDOI
TL;DR: The nature of the basins of attraction of a Hopfield network is as important as its capacity. A new learning rule is re-introduced that has a higher capacity than the Hebb rule while retaining important properties, such as incrementality and locality, which the pseudo-inverse rule lacks.
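The sketch below illustrates what an incremental, local Hopfield learning rule of this kind looks like in practice. The specific update used here follows the rule commonly attributed to Storkey (1997); that exact form is an assumption on my part rather than something stated in this summary, and the Hebb rule is included only for comparison.

```python
import numpy as np

def storkey_update(W, xi):
    """Incrementally add one bipolar pattern xi to the weight matrix W.

    A minimal sketch of an incremental, local learning rule in the spirit
    described above; the exact form follows the rule commonly attributed
    to Storkey (1997), an assumption rather than a quote from this paper.
    """
    n = xi.size
    s = W @ xi                                   # full local fields
    # h[i, j] = sum over k != i, j of W[i, k] * xi[k]
    h = s[:, None] - np.diag(W)[:, None] * xi[:, None] - W * xi[None, :]
    dW = (np.outer(xi, xi) - xi[:, None] * h.T - h * xi[None, :]) / n
    W = W + dW
    np.fill_diagonal(W, 0.0)                     # no self-connections
    return W

def hebb_update(W, xi):
    """Standard Hebb rule, for comparison: also purely local and incremental."""
    W = W + np.outer(xi, xi) / xi.size
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=20):
    """Synchronous recall dynamics from an initial state x."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1
    return x

# Usage: store a few random patterns, then recover one from a noisy probe.
rng = np.random.default_rng(0)
n = 100
patterns = [np.where(rng.random(n) < 0.5, -1, 1) for _ in range(10)]
W = np.zeros((n, n))
for p in patterns:
    W = storkey_update(W, p)
probe = patterns[0].copy()
probe[:10] *= -1                                 # flip 10 bits
print(np.mean(recall(W, probe) == patterns[0]))  # fraction of bits recovered
```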

85 citations


Proceedings ArticleDOI
01 Jan 1999
TL;DR: The use of truncated covariances is proposed; these allow speedy, memory-efficient Toeplitz inversion for high-dimensional grid-based Gaussian process predictors.
Abstract: Gaussian processes can be obtained as a limit of neural networks. Standard Gaussian process techniques use a squared exponential covariance function. Here, the use of truncated covariances is proposed. Such covariances have compact support, which speeds up matrix inversion and increases precision. Furthermore, they allow the use of speedy, memory-efficient Toeplitz inversion for high-dimensional grid-based Gaussian process predictors.
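The sketch below shows the mechanism the abstract describes: a stationary covariance on a regular grid yields a Toeplitz Gram matrix that can be solved with Levinson-type methods. It uses the Wendland C^2 compactly supported covariance as a stand-in for the paper's truncated covariance (which is not reproduced here), and the grid, lengthscale, and noise level are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def wendland_cov(d, lengthscale=1.0, variance=1.0):
    """A compactly supported (truncated) covariance: exactly zero beyond the
    lengthscale. Wendland C^2 function, used here only to illustrate compact
    support; it is not the specific truncated covariance proposed in the paper."""
    r = np.abs(d) / lengthscale
    return variance * np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

# Regular 1-D grid: a stationary covariance then yields a Toeplitz Gram
# matrix, fully specified by its first column.
n = 2000
x = np.linspace(0.0, 10.0, n)
first_col = wendland_cov(x - x[0], lengthscale=1.5)
first_col[0] += 1e-2                               # noise variance on the diagonal

y = np.sin(x) + 0.1 * np.random.default_rng(1).standard_normal(n)

# Levinson-type Toeplitz solve: quadratic time and linear memory for the
# covariance, instead of the cubic-time, quadratic-memory dense Cholesky.
alpha = solve_toeplitz(first_col, y)

# Posterior mean at a few inputs (a sketch; cross-covariances to new test
# points would be formed with the same truncated covariance).
mean = np.array([wendland_cov(x - xi, lengthscale=1.5) @ alpha for xi in x[:5]])
print(mean)
```

Compact support also makes the covariance matrix sparse off the grid, which is what the abstract refers to when it says truncation speeds up matrix inversion in general.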

32 citations