A study of longitudinal trends in time-frequency transformations of EEG data during a learning experiment
TL;DR: Longitudinal time-frequency transformation of ERP (LTFT-ERP) is proposed to combine information from both the time and frequency domains, which offer distinct but complementary views of the underlying evoked cognitive processes, while retaining the longitudinal dynamics in the ERP waveforms.
About: This article was published in Computational Statistics & Data Analysis on 2022-03-01. It has received 1 citation to date. The article focuses on the topics: Electroencephalography & Computer science.
Citations
TL;DR: In this paper, fast multilevel functional principal component analysis (fast MFPCA) is proposed to scale up to high-dimensional functional data measured at multiple visits; it is orders of magnitude faster than, and achieves comparable estimation accuracy with, the original MFPCA.
Abstract: We introduce fast multilevel functional principal component analysis (fast MFPCA), which scales up to high dimensional functional data measured at multiple visits. The new approach is orders of magnitude faster than and achieves comparable estimation accuracy with the original MFPCA. Methods are motivated by the National Health and Nutritional Examination Survey (NHANES), which contains minute-level physical activity information of more than 10,000 participants over multiple days, with 1,440 observations per day. While MFPCA takes more than five days to analyze these data, fast MFPCA takes less than five minutes. A theoretical study of the proposed method is also provided. The associated function mfpca.face() is available in the R package refund. Supplementary materials for this article are available online.
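The core FPCA computation that both MFPCA and fast MFPCA build on can be illustrated with a minimal NumPy sketch (this is a plain single-level FPCA for intuition, not the authors' fast algorithm or the `mfpca.face()` implementation; the simulated curves and settings are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_grid = 200, 50
t = np.linspace(0, 1, n_grid)

# Simulate curves driven by two known orthogonal components plus noise.
phi1 = np.sqrt(2) * np.sin(2 * np.pi * t)
phi2 = np.sqrt(2) * np.cos(2 * np.pi * t)
scores = rng.normal(size=(n_subj, 2)) * np.array([2.0, 1.0])
X = scores @ np.vstack([phi1, phi2]) + 0.1 * rng.normal(size=(n_subj, n_grid))

# FPCA: center, estimate the covariance on the grid, eigendecompose.
Xc = X - X.mean(axis=0)
K = Xc.T @ Xc / n_subj                      # sample covariance surface
evals, evecs = np.linalg.eigh(K)
evals, evecs = evals[::-1], evecs[:, ::-1]  # sort descending

# Fraction of variance explained by the first two components.
fve = evals[:2].sum() / evals.sum()
```

The eigenvectors of the sample covariance recover the simulated components, and the leading eigenvalues dominate the spectrum; multilevel FPCA repeats this decomposition separately for the between-subject and within-subject covariance surfaces.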
2 citations
References
TL;DR: EEGLAB, as described in this paper, is a toolbox and graphic user interface for processing collections of single-trial and/or averaged EEG data of any number of channels. It supports EEG data, channel, and event information importing; data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots); preprocessing (including artifact rejection, filtering, epoch selection, and averaging); Independent Component Analysis (ICA); and time/frequency decomposition, including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling.
17,362 citations
TL;DR: In this article, a step-by-step guide to wavelet analysis is given, with examples taken from time series of the El Nino-Southern Oscillation (ENSO).
Abstract: A practical step-by-step guide to wavelet analysis is given, with examples taken from time series of the El Nino–Southern Oscillation (ENSO). The guide includes a comparison to the windowed Fourier transform, the choice of an appropriate wavelet basis function, edge effects due to finite-length time series, and the relationship between wavelet scale and Fourier frequency. New statistical significance tests for wavelet power spectra are developed by deriving theoretical wavelet spectra for white and red noise processes and using these to establish significance levels and confidence intervals. It is shown that smoothing in time or scale can be used to increase the confidence of the wavelet spectrum. Empirical formulas are given for the effect of smoothing on significance levels and confidence intervals. Extensions to wavelet analysis such as filtering, the power Hovmoller, cross-wavelet spectra, and coherence are described. The statistical significance tests are used to give a quantitative measure of change...
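The recipe the guide describes — choose a wavelet basis, convolve the signal with dilated and contracted copies of it, and relate wavelet scale to Fourier frequency — can be sketched in NumPy with a complex Morlet wavelet. This is an illustrative toy version, not the authors' code: the scale-to-period factor 1.033 is the standard Morlet (ω0 = 6) conversion, and the significance tests against white/red-noise spectra are omitted.

```python
import numpy as np

def morlet_cwt(x, scales, dt, w0=6.0):
    # Continuous wavelet transform by convolving the signal with
    # dilated/contracted copies of a complex Morlet wavelet.
    n = len(x)
    tvec = (np.arange(n) - n // 2) * dt
    W = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        psi = np.pi**-0.25 * np.exp(1j * w0 * tvec / s - (tvec / s) ** 2 / 2)
        W[i] = np.convolve(x, np.conj(psi[::-1]), mode="same") * dt / np.sqrt(s)
    return W

# A 10 Hz sine sampled at 100 Hz: wavelet power should peak near 10 Hz.
dt = 0.01
t = np.arange(0.0, 4.0, dt)
x = np.sin(2 * np.pi * 10.0 * t)
freqs = np.arange(2.0, 31.0)       # candidate frequencies in Hz
scales = 1.0 / (1.033 * freqs)     # Morlet scale <-> Fourier period (w0 = 6)
W = morlet_cwt(x, scales, dt)
power = (np.abs(W) ** 2).mean(axis=1)  # time-averaged wavelet power
```

Note the edge effects the abstract warns about: near the start and end of the record the convolution wraps onto zero-padded data, which is why a cone of influence is usually drawn on wavelet power spectra.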
12,803 citations
TL;DR: Wavelet transforms are recent mathematical techniques, based on group theory and square integrable representations, which allow one to unfold a signal, or a field, into both space and scale, and possibly directions.
Abstract: Wavelet transforms are recent mathematical techniques, based on group theory and square integrable representations, which allow one to unfold a signal, or a field, into both space and scale, and possibly directions. They use analyzing functions, called wavelets, which are localized in space. The scale decomposition is obtained by dilating or contracting the chosen analyzing wavelet before convolving it with the signal. The limited spatial support of wavelets is important because then the behavior of the signal at infinity does not play any role. Therefore the wavelet analysis or synthesis can be performed locally on the signal, as opposed to the Fourier transform which is inherently nonlocal due to the space-filling nature of the trigonometric functions. Wavelet transforms have been applied mostly to signal processing, image coding, and numerical analysis, and they are still evolving. So far there are only two complete presentations of this topic, both written in French, one for engineers (Gasquet & Witomski 1990) and the other for mathematicians (Meyer 1990a), and two conference proceedings, the first in English (Combes et al 1989), the second in French (Lemarie 1990a). In preparation are a textbook (Holschneider 1991), a course (Daubechies 1991), three conference proceedings (Meyer & Paul 1991, Beylkin et al 1991b, Farge et al 1991), and a special issue of IEEE Transactions
2,770 citations
TL;DR: A method is proposed for analyzing collections of related curves in which the individual curves are modeled as spline functions with random coefficients; this produces a low-rank, low-frequency approximation to the covariance structure, which can be estimated naturally by the EM algorithm.
Abstract: We propose a method of analyzing collections of related curves in which the individual curves are modeled as spline functions with random coefficients. The method is applicable when the individual curves are sampled at variable and irregularly spaced points. This produces a low-rank, low-frequency approximation to the covariance structure, which can be estimated naturally by the EM algorithm. Smooth curves for individual trajectories are constructed as best linear unbiased predictor (BLUP) estimates, combining data from that individual and the entire collection. This framework leads naturally to methods for examining the effects of covariates on the shapes of the curves. We use model selection techniques--Akaike information criterion (AIC), Bayesian information criterion (BIC), and cross-validation--to select the number of breakpoints for the spline approximation. We believe that the methodology we propose provides a simple, flexible, and computationally efficient means of functional data analysis.
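The BLUP idea in this abstract — an individual's smooth curve borrows strength from the entire collection — can be shown in a toy NumPy sketch. This is an illustration under assumptions (a truncated-power spline basis, a random subject-level shift, known variance components), not the paper's EM implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_obs = 30, 15
t = rng.uniform(0, 1, size=(n_subj, n_obs))   # irregularly spaced points

# Truncated-power spline basis: 1, t, (t - kappa)_+ at a few knots.
knots = np.array([0.25, 0.5, 0.75])
def basis(tt):
    cols = [np.ones_like(tt), tt] + [np.clip(tt - k, 0, None) for k in knots]
    return np.stack(cols, axis=-1)

beta = np.array([0.0, 2.0, -4.0, 4.0, -4.0])  # population spline coefficients
sigma_b, sigma_e = 1.0, 0.3
b = rng.normal(0, sigma_b, n_subj)            # random subject-level shifts
y = basis(t) @ beta + b[:, None] + rng.normal(0, sigma_e, (n_subj, n_obs))

# Stage 1: pooled least squares for the population curve.
B = basis(t.reshape(-1))
beta_hat = np.linalg.lstsq(B, y.reshape(-1), rcond=None)[0]

# Stage 2: the BLUP of each subject's shift shrinks the subject's mean
# residual toward zero by a variance-ratio factor, combining that
# subject's data with the whole collection.
resid = (y - basis(t) @ beta_hat).mean(axis=1)
shrink = sigma_b**2 / (sigma_b**2 + sigma_e**2 / n_obs)
b_blup = shrink * resid
```

With many observations per subject the shrinkage factor is near 1 (the subject's own data dominate); with sparse, noisy sampling it pulls estimates toward the population curve, which is the point of the mixed-model formulation.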
402 citations
TL;DR: Though motivated by the SHHS, the proposed methodology is generally applicable, with potential relevance to many modern scientific studies of hierarchical or longitudinal functional outcomes; notably, using MFPCA, the authors identify and quantify associations between EEG activity during sleep and adverse cardiovascular outcomes.
Abstract: The Sleep Heart Health Study (SHHS) is a comprehensive landmark study of sleep and its impacts on health outcomes. A primary metric of the SHHS is the in-home polysomnogram, which includes two electroencephalographic (EEG) channels for each subject, at two visits. The volume and importance of these data present enormous challenges for analysis. To address these challenges, we introduce multilevel functional principal component analysis (MFPCA), a novel statistical methodology designed to extract core intra- and inter-subject geometric components of multilevel functional data. Though motivated by the SHHS, the proposed methodology is generally applicable, with potential relevance to many modern scientific studies of hierarchical or longitudinal functional outcomes. Notably, using MFPCA, we identify and quantify associations between EEG activity during sleep and adverse cardiovascular outcomes.
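The intra-/inter-subject decomposition at the heart of MFPCA can be sketched with a simple method-of-moments construction (a toy NumPy illustration under simulated data, not the SHHS analysis): with two visits per subject, the cross-visit covariance isolates the between-subject component, and subtracting it from the total covariance leaves the within-subject part.

```python
import numpy as np

rng = np.random.default_rng(2)
n_subj, n_grid = 300, 40
t = np.linspace(0, 1, n_grid)

# Subject-level (between) and visit-level (within) eigenfunctions.
phi_b = np.sqrt(2) * np.sin(2 * np.pi * t)
phi_w = np.sqrt(2) * np.cos(2 * np.pi * t)
zi = rng.normal(0, 1.5, n_subj)              # shared across a subject's visits
X = np.empty((n_subj, 2, n_grid))
for j in range(2):                           # two visits per subject
    wij = rng.normal(0, 1.0, n_subj)         # visit-specific deviation
    X[:, j] = np.outer(zi, phi_b) + np.outer(wij, phi_w)
X -= X.mean(axis=(0, 1))

# Cross-visit covariance is purely between-subject, because visit-level
# deviations at different visits are independent.
K_total = sum(X[:, j].T @ X[:, j] for j in range(2)) / (2 * n_subj)
K_between = (X[:, 0].T @ X[:, 1] + X[:, 1].T @ X[:, 0]) / (2 * n_subj)
K_within = K_total - K_between

eb = np.linalg.eigh(K_between)[1][:, -1]     # top between-subject component
ew = np.linalg.eigh(K_within)[1][:, -1]      # top within-subject component
```

Eigendecomposing each level separately then yields level-specific principal components and scores, which is what lets MFPCA separate stable subject traits from visit-to-visit variation.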
288 citations