Open Access · Journal Article · DOI

Functional Data Analysis

TLDR
In this article, the authors provide an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA).
Abstract
With the advance of modern technology, more and more data are being recorded continuously during a time interval or intermittently at several discrete time points. These are both examples of functional data, which has become a commonly encountered type of data. Functional data analysis (FDA) encompasses the statistical methodology for such data. Broadly interpreted, FDA deals with the analysis and theory of data that are in the form of functions. This paper provides an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA). FPCA is an important dimension reduction tool, and in sparse data situations it can be used to impute functional data that are sparsely observed. Other dimension reduction approaches are also discussed. In addition, we review another core technique, functional linear regression, as well as clustering and classification of functional data.
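
The FPCA pipeline the abstract describes (estimate the mean function, estimate the covariance surface, eigendecompose the discretized covariance operator, then compute principal component scores by numerical integration) can be sketched for densely observed curves. The simulated curves, component functions, and all variable names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)                      # common observation grid
n = 200

# Simulate curves: mean function plus two principal components with random scores.
mean_true = np.sin(2 * np.pi * t)
phi1 = np.sqrt(2) * np.cos(2 * np.pi * t)      # orthonormal in L^2[0,1]
phi2 = np.sqrt(2) * np.sin(4 * np.pi * t)
scores = rng.normal(size=(n, 2)) * np.array([2.0, 1.0])
X = mean_true + scores[:, [0]] * phi1 + scores[:, [1]] * phi2
X += rng.normal(scale=0.1, size=X.shape)       # measurement noise

mu_hat = X.mean(axis=0)                        # mean function estimate
C = np.cov(X, rowvar=False)                    # covariance surface on the grid
w = t[1] - t[0]                                # quadrature weight
evals, evecs = np.linalg.eigh(C * w)           # discretized eigenproblem
order = np.argsort(evals)[::-1]
lam = evals[order][:2]                         # leading eigenvalue estimates
phi_hat = evecs[:, order[:2]].T / np.sqrt(w)   # rescale to L^2-orthonormal

xi = (X - mu_hat) @ phi_hat.T * w              # FPC scores by quadrature
X_hat = mu_hat + xi @ phi_hat                  # rank-2 reconstruction
```

The rank-2 reconstruction illustrates the dimension-reduction role of FPCA: each curve of 50 measurements is summarized by two scores. Sparse-data imputation, as mentioned in the abstract, requires pooling observations across subjects to estimate the covariance and is not shown here.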

Citations
Posted Content

Continuous-time autoregressive moving-average processes in Hilbert space

TL;DR: In this article, the authors introduce the class of continuous-time autoregressive moving-average (CARMA) processes in Hilbert spaces and consider Lévy processes as driving noises of these processes.
Journal Article · DOI

Robust functional principal component analysis for non-Gaussian longitudinal data

TL;DR: In this paper, the authors propose a robust functional principal component analysis methodology for non-Gaussian longitudinal data that accommodates sparsity and irregularity as well as non-negligible measurement errors.
Posted Content

Inference for Sparse and Dense Functional Data with Covariate Adjustments

TL;DR: Theoretical results demonstrate that existing asymptotic normality results can lead to severely misleading inferences in finite samples; the authors propose finite-sample corrections that provide practically useful approximations for inference in sparse and dense data scenarios.
Book Chapter · DOI

Brief Review of Functional Data Analysis: A Case Study on Regional Demographic and Economic Data

TL;DR: In this paper, the authors describe the general techniques used in functional data analysis and some relevant studies performed with data from Ecuador, and carry out an exploratory analysis with FPCA, functional clustering, and PCA on data sets covering fertility, infant mortality, life expectancy, and the MPI, HDI, and GDP growth indices.
References
Journal Article · DOI

Nonlinear dimensionality reduction by locally linear embedding.

TL;DR: Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs that learns the global structure of nonlinear manifolds.
Journal Article · DOI

A global geometric framework for nonlinear dimensionality reduction.

TL;DR: An approach to dimensionality reduction that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
Journal Article · DOI

Dynamic programming algorithm optimization for spoken word recognition

TL;DR: This paper reports on an optimum dynamic programming (DP) based time-normalization algorithm for spoken word recognition, in which the warping function slope is restricted so as to improve discrimination between words in different categories.
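
The dynamic-programming recursion behind such time normalization can be sketched as follows. This minimal version uses an unconstrained symmetric step pattern and omits the slope restriction the paper proposes; the function name is illustrative.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences of scalars.

    Fills an accumulated-cost table D where D[i, j] is the cost of the best
    warping path aligning a[:i] with b[:j].
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible predecessor cells:
            # insertion, deletion, or diagonal match.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

For example, `dtw_distance([0, 0, 1], [0, 1, 1])` is 0, because warping absorbs the repeated samples even though the sequences differ pointwise.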
Journal Article · DOI

Generalized Additive Models

TL;DR: The class of generalized additive models is introduced, which replaces the linear form ∑βjXj by a sum of smooth functions ∑sj(Xj), and has the advantage of being completely automatic, i.e., no "detective work" is needed on the part of the statistician.
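
For a Gaussian response, the additive form can be fit by backfitting: each smooth component is repeatedly refit to the partial residuals left by the others. Below is a minimal sketch under stated assumptions; the cubic-polynomial stand-in smoother and the simulated data are illustrative, not the local scoring procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
# Additive truth: y = f1(x1) + f2(x2) + noise (illustrative functions).
y = x1 ** 3 + np.sin(2 * x2) + rng.normal(scale=0.05, size=n)

def smooth(x, r):
    """Stand-in smoother: cubic polynomial fit of residuals r on x."""
    return np.polyval(np.polyfit(x, r, 3), x)

alpha = y.mean()
s1 = np.zeros(n)
s2 = np.zeros(n)
for _ in range(20):                  # backfitting iterations
    s1 = smooth(x1, y - alpha - s2)  # refit f1 to partial residuals
    s1 -= s1.mean()                  # center for identifiability
    s2 = smooth(x2, y - alpha - s1)  # refit f2 to partial residuals
    s2 -= s2.mean()

fitted = alpha + s1 + s2
```

Centering each component after its update pins down the decomposition, since any constant could otherwise be shifted between the intercept and the smooth terms.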