Time Series Averaging and Denoising from a Probabilistic Perspective on Time-Elastic Kernels
TLDR
In this article, the authors reconsider the concept of time-elastic centroid for a set of time series and derive a new algorithm based on a probabilistic interpretation of kernel alignment matrices.
Abstract
In the light of regularized dynamic time warping kernels, this paper re-considers the concept of time elastic centroid for a set
of time series. We derive a new algorithm based on a probabilistic interpretation of kernel alignment matrices. This algorithm expresses
the averaging process in terms of a stochastic alignment automata. It uses an iterative agglomerative heuristic method for averaging
the aligned samples, while also averaging the times of occurrence of the aligned samples. By comparing classification accuracies for
45 heterogeneous time series datasets obtained by first-nearest-centroid/medoid classifiers, we show that: i) centroid-based
approaches significantly outperform medoid-based approaches; ii) for the considered datasets, our algorithm, which combines averaging
in the sample space and along the time axis, emerges as the most robust model for time-elastic averaging, with a
promising noise-reduction capability. We also demonstrate its benefit in an isolated gesture recognition experiment and its ability to
significantly reduce the size of training instance sets. Finally, we highlight its denoising capability using demonstrative synthetic data:
we show that it is possible to retrieve, from a few noisy instances, a signal whose components are scattered in a wide spectral band.
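The abstract describes averaging aligned sample values while also averaging their times of occurrence. As a rough illustration of that idea, here is a minimal Python sketch that aligns noisy series to a reference with standard dynamic time warping and averages both the values and the time indices mapped onto each reference step. This is a simplified deterministic analogue, not the authors' stochastic alignment automaton on kernel alignment matrices; the function names `dtw_path` and `elastic_average` are illustrative.

```python
import numpy as np

def dtw_path(a, b):
    """Alignment path between 1-D series a and b via the standard
    DTW dynamic program (squared-difference local cost)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j - 1],
                                 cost[i - 1, j],
                                 cost[i, j - 1])
    # Backtrack from (n, m) to recover the optimal alignment path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def elastic_average(series, ref):
    """For each reference time step, average the sample values AND the
    times of occurrence that the DTW paths align onto that step."""
    vals = [[] for _ in ref]
    times = [[] for _ in ref]
    for s in series:
        for i, j in dtw_path(ref, s):
            vals[i].append(s[j])
            times[i].append(j)
    avg_vals = np.array([np.mean(v) for v in vals])
    avg_times = np.array([np.mean(t) for t in times])
    return avg_vals, avg_times
```

Because several aligned samples contribute to each reference step, the averaged values attenuate independent noise, which is the intuition behind the denoising behaviour the abstract reports on synthetic data.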
Citations
Journal ArticleDOI
Proximity Forest: An effective and scalable distance-based classifier for time series
Benjamin Lucas, Ahmed Shifaz, Charlotte Pelletier, Lachlan O'Neill, Nayyar Abbas Zaidi, Bart Goethals, François Petitjean, Geoffrey I. Webb +7 more
TL;DR: Proximity Forest, as described in this paper, is an algorithm that learns accurate models from datasets with millions of time series and classifies a time series in milliseconds using ensembles of highly randomized Proximity Trees.
Posted Content
Deep learning for time series classification
TL;DR: The main objective of this thesis was to study and develop deep neural networks specifically constructed for the classification of time series data, and to carry out the first large-scale experimental study comparing the existing deep methods and positioning them against other state-of-the-art methods not based on deep learning.
Journal ArticleDOI
Exact mean computation in dynamic time warping spaces
TL;DR: An exponential-time dynamic program for computing a global minimum of the Fréchet function is proposed and an exact polynomial-time algorithm for the special case of binary time series is presented.
Journal ArticleDOI
Three Rapid Methods for Averaging GPS Segments
TL;DR: Three extremely fast and practical methods to extract the road segment by averaging GPS trajectories are introduced, which provide equal or better accuracy than the best existing methods while being very fast, and are therefore suitable for real-time processing.
Proceedings Article
Exact Mean Computation in Dynamic Time Warping Spaces.
TL;DR: In this article, an exponential-time dynamic program for computing a global minimum of the Fréchet function is proposed, which is useful for benchmarking and evaluating known heuristics.
References
Journal ArticleDOI
A tutorial on hidden Markov models and selected applications in speech recognition
TL;DR: In this paper, the authors provide an overview of the basic theory of hidden Markov models (HMMs) as originated by L.E. Baum and T. Petrie (1966) and give practical details on methods of implementation of the theory along with a description of selected applications of HMMs to distinct problems in speech recognition.
Journal Article
Statistical Comparisons of Classifiers over Multiple Data Sets
TL;DR: A set of simple, yet safe and robust non-parametric tests for statistical comparisons of classifiers is recommended: the Wilcoxon signed ranks test for comparison of two classifiers and the Friedman test with the corresponding post-hoc tests for comparisons of more classifiers over multiple data sets.
Book
Nonparametric Statistical Methods
Myles Hollander, Douglas A. Wolfe +1 more
TL;DR: An ideal text for an upper-level undergraduate or first-year graduate course, Nonparametric Statistical Methods, Second Edition is also an invaluable source for professionals who want to keep abreast of the latest developments within this dynamic branch of modern statistics.
Journal ArticleDOI
Theory of Reproducing Kernels.
TL;DR: In this paper, a short historical introduction is given to indicate the different manners in which these kernels have been used by various investigators, and to discuss the more important trends of the application of these kernels without attempting, however, a complete bibliography of the subject matter.
Book ChapterDOI
A Generalized Representer Theorem
TL;DR: The result shows that a wide range of problems have optimal solutions that live in the finite dimensional span of the training examples mapped into feature space, thus enabling us to carry out kernel algorithms independent of the (potentially infinite) dimensionality of the feature space.