
Showing papers by "Klaus-Robert Müller published in 1995"


Proceedings Article
27 Nov 1995
TL;DR: In this article, a statistical theory for overtraining is proposed that treats realizable stochastic neural networks trained with Kullback-Leibler loss in the asymptotic case.
Abstract: A statistical theory for overtraining is proposed. The analysis treats realizable stochastic neural networks, trained with Kullback-Leibler loss in the asymptotic case. It is shown that the asymptotic gain in the generalization error is small if we perform early stopping, even if we have access to the optimal stopping time. Considering cross-validation stopping, we answer the question: in what ratio should the examples be divided into training and testing sets in order to obtain the optimum performance? In the nonasymptotic region, cross-validated early stopping always decreases the generalization error. Our large-scale simulations done on a CM5 are in good agreement with our analytical findings.
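The cross-validation stopping procedure the abstract analyzes can be illustrated with a minimal sketch. Everything here is an illustrative assumption rather than the paper's setup: a linear model with squared-error loss stands in for a stochastic network with Kullback-Leibler loss, and the function name, learning rate, and patience rule are hypothetical. The `split_ratio` parameter corresponds to the train/test division ratio the paper optimizes.

```python
import numpy as np

def early_stopping_fit(X, y, split_ratio=0.8, lr=0.1, max_epochs=500, patience=20):
    """Gradient-descent fit of a linear model with cross-validated early
    stopping: a fraction `split_ratio` of the examples trains the model,
    the rest monitors generalization; training stops once the held-out
    loss has not improved for `patience` epochs."""
    n_train = int(split_ratio * len(X))
    Xtr, ytr = X[:n_train], y[:n_train]
    Xva, yva = X[n_train:], y[n_train:]
    w = np.zeros(X.shape[1])
    best_w, best_val, since_best = w.copy(), np.inf, 0
    for _ in range(max_epochs):
        grad = Xtr.T @ (Xtr @ w - ytr) / n_train   # squared-error gradient
        w -= lr * grad
        val_loss = np.mean((Xva @ w - yva) ** 2)   # held-out "testing" loss
        if val_loss < best_val - 1e-9:
            best_val, best_w, since_best = val_loss, w.copy(), 0
        else:
            since_best += 1
            if since_best >= patience:             # stop: no longer improving
                break
    return best_w, best_val
```

Varying `split_ratio` on synthetic data is a simple way to see the trade-off the paper quantifies: a larger training share lowers estimation error but makes the stopping signal noisier.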

72 citations


Journal Article
TL;DR: The proposed unsupervised segmentation of time series is very precise and includes transients, which makes the approach promising for future applications.
Abstract: We present a framework for the unsupervised segmentation of time series. It applies to non-stationary signals originating from different dynamical systems which alternate in time, a phenomenon which appears in many natural systems. In our approach, predictors compete for data points of a given time series. We combine competition and evolutionary inertia into a learning rule. Under this learning rule the system evolves such that the predictors which finally survive unambiguously identify the underlying processes. The segmentation achieved by this method is very precise and includes transients, which makes our approach promising for future applications. Key words: neural networks, non-linear dynamics, chaos, time series analysis, prediction, competing neural networks
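The idea of predictors competing for data points can be sketched in a few lines. This is a deliberately simplified toy, not the paper's learning rule: constant-value "predictors" stand in for neural network predictors, the winner-take-all update stands in for the combination of competition and evolutionary inertia, and all names and parameters are assumptions for illustration.

```python
import numpy as np

def segment_by_competition(series, n_predictors=2, lr=0.05, epochs=30, seed=0):
    """Toy competition among predictors: each predictor is a constant-value
    model; at every time step the predictor with the smallest prediction
    error 'wins' the data point, and only the winner is nudged toward it.
    The final per-point winner indices give a segmentation of the series."""
    rng = np.random.default_rng(seed)
    preds = rng.normal(size=n_predictors)       # random initial predictors
    for _ in range(epochs):
        for x in series:
            k = int(np.argmin((preds - x) ** 2))  # competition: best wins
            preds[k] += lr * (x - preds[k])       # only the winner learns
    labels = np.array([int(np.argmin((preds - x) ** 2)) for x in series])
    return preds, labels
```

On a series that switches between two regimes, the surviving predictors settle on the two regime means and the label sequence marks the switch point, which is the flavor of segmentation the abstract describes.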

37 citations