Journal ArticleDOI

EEG Signals Denoising Using Optimal Wavelet Transform Hybridized With Efficient Metaheuristic Methods

TL;DR: Five powerful metaheuristic algorithms are proposed to find the optimal WT parameters for EEG signal denoising, namely harmony search (HS), $\beta$-hill climbing ($\beta$HC), particle swarm optimization (PSO), genetic algorithm (GA), and flower pollination algorithm (FPA); interestingly, for almost all evaluation criteria, FPA achieves the best parameter configuration for WT.
Abstract: Background. The most common and successful technique for denoising non-stationary signals, such as the electroencephalogram (EEG) and electrocardiogram (ECG), is the wavelet transform (WT). The success of WT depends on the optimal configuration of its control parameters, which are often set experimentally. Fortunately, the optimality of a combination of these parameters can be measured in advance using the mean squared error (MSE) function.

Method. In this paper, five powerful metaheuristic algorithms are proposed to find the optimal WT parameters for EEG signal denoising: harmony search (HS), $\beta$-hill climbing ($\beta$HC), particle swarm optimization (PSO), genetic algorithm (GA), and flower pollination algorithm (FPA). It is worth mentioning that this is the initial investigation of using optimization methods for WT parameter configuration. The paper then examines which algorithm obtains the minimum MSE and the best WT parameter configuration.

Result. The performance of the proposed algorithms is tested using two standard EEG datasets, namely Kiern's EEG dataset and the EEG Motor Movement/Imagery dataset. The results are evaluated using five common criteria: signal-to-noise ratio (SNR), SNR improvement, mean square error (MSE), root mean square error (RMSE), and percentage root mean square difference (PRD). Interestingly, for almost all evaluation criteria, FPA achieves the best parameter configuration for WT and empowers this technique to efficiently denoise the EEG signals for almost all used datasets. To further validate the FPA results, a comparative study between the FPA results and the results of two previous studies is conducted, and the findings favor FPA.

Conclusion. The results show that the proposed methods for EEG signal denoising can produce better results than manual configurations based on an ad hoc strategy. Therefore, using metaheuristic approaches to optimize the parameters for EEG signal denoising positively affects the performance of the WT method.
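To make the setup concrete, the sketch below scores candidate WT parameter configurations (wavelet family, decomposition level, thresholding function, threshold scaling) by MSE against a clean reference signal using PyWavelets. The parameter grid, the noise-level estimate, and the exhaustive loop standing in for the metaheuristic search are illustrative assumptions, not the paper's exact search space or algorithms.

```python
# Minimal sketch: score candidate wavelet-denoising parameter configurations
# by MSE against a clean reference. The paper searches this space with
# metaheuristics (HS, BHC, PSO, GA, FPA); a grid loop stands in here.
import numpy as np
import pywt

def denoise(noisy, wavelet, level, mode, scale):
    """Wavelet-threshold denoising with a universal-style threshold."""
    coeffs = pywt.wavedec(noisy, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate from finest scale
    thr = scale * sigma * np.sqrt(2 * np.log(len(noisy)))
    coeffs[1:] = [pywt.threshold(c, thr, mode=mode) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(noisy)]

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# Hypothetical search space: wavelet family, decomposition level,
# thresholding function, and threshold rescaling factor.
space = [(w, l, m, s)
         for w in ("db4", "sym8", "coif3")
         for l in (3, 4, 5)
         for m in ("soft", "hard")
         for s in (0.5, 1.0, 1.5)]

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 8 * np.pi, 1024))      # stand-in for a clean EEG epoch
noisy = clean + 0.3 * rng.standard_normal(clean.size)

best = min(space, key=lambda p: mse(denoise(noisy, *p), clean))
print("best (wavelet, level, mode, scale):", best)
```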


Citations
Journal ArticleDOI
TL;DR: A systematic review of articles published in the last five years, aiming to help in choosing the appropriate deep neural network architecture and other hyperparameters for developing MI EEG-based BCI systems.

151 citations

Journal ArticleDOI
TL;DR: A new type of EEG classification network, the separable EEGNet (S-EEGNet), is proposed based on Hilbert–Huang transform and a separable convolutional neural network (CNN) with bilinear interpolation.
Abstract: As one of the most important research fields in the brain-computer interface (BCI) field, electroencephalogram (EEG) classification has a wide range of application values. However, traditional neural networks struggle to capture the characteristics of the EEG signal comprehensively across the time and space dimensions, which affects the accuracy of EEG classification. To address this, classification accuracy can be improved via end-to-end learning over the time and space dimensions of EEG. In this paper, a new type of EEG classification network, the separable EEGNet (S-EEGNet), is proposed based on the Hilbert-Huang transform (HHT) and a separable convolutional neural network (CNN) with bilinear interpolation. The EEG signal is transformed into a time-frequency representation by HHT, which allows the EEG signal to be better described in the frequency domain. Then, the depthwise and pointwise elements of the network are combined to extract the feature map. A displacement variable is added to the convolution layer of the separable CNN by the bilinear interpolation method, allowing free deformation of the sampling grid. The deformation depends on the local, dense, and adaptive input characteristics of the EEG data. The network can learn from the time and space dimensions of EEG signals end to end to extract features and improve the accuracy of EEG classification. To show the effectiveness of S-EEGNet, the method was tested on two different types of public EEG datasets (motor imagery classification and emotion classification). The accuracy of motor imagery classification is 77.9%, and the accuracies of emotion classification are 89.91% and 88.31%. The experimental results showed that the classification accuracy of S-EEGNet improved by 3.6%, 1.15%, and 1.33%, respectively.
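As a rough illustration of the separable-convolution idea described above, here is a minimal depthwise + pointwise block in PyTorch. The channel counts, kernel size, and input shape are invented for the example and are not S-EEGNet's actual architecture; the bilinear-interpolation deformation step is omitted.

```python
# Minimal sketch of the depthwise + pointwise ("separable") convolution pattern.
import torch
import torch.nn as nn

class SeparableConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size):
        super().__init__()
        # Depthwise: one filter per input channel (groups=in_ch).
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding="same", groups=in_ch, bias=False)
        # Pointwise: 1x1 convolution mixes channels.
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

# A batch of 8 time-frequency maps (e.g., from HHT), 16 channels, 32x128 bins.
x = torch.randn(8, 16, 32, 128)
y = SeparableConv2d(16, 32, kernel_size=(3, 3))(x)
print(y.shape)  # torch.Size([8, 32, 32, 128])
```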

35 citations


Cites methods from "EEG Signals Denoising Using Optimal..."

  • ...[6] proposed an EEG denoising method based on wavelet transform (WT), and Yahya et al....


Journal ArticleDOI
TL;DR: A new hybrid of MVO algorithm with the K-means clustering algorithm is proposed, i.e., the H-MVO algorithm, which aims at improving the global (diversification) ability of the search and finding a better cluster partition.
Abstract: Text clustering has been widely utilized with the aim of partitioning a specific document collection into different subsets using homogeneity/heterogeneity criteria. It has also become a very complicated area of research, including pattern recognition, information retrieval, and text mining. Metaheuristics are typically used as efficient approaches for the text clustering problem. The multi-verse optimizer (MVO) is a stochastic population-based algorithm. It has been recently proposed and successfully utilized to tackle many hard optimization problems. However, a recent research trend involves hybridizing two or more algorithms with the aim of obtaining a superior solution for optimization problems. In this paper, a new hybrid of the MVO algorithm with the K-means clustering algorithm is proposed, i.e., the H-MVO algorithm, with the aim of enhancing the quality of the initial candidate solutions, as well as of the best solution produced by MVO at each iteration. This hybrid algorithm aims at improving the global (diversification) ability of the search and finding a better cluster partition. The effectiveness of the proposed H-MVO was tested on five standard datasets used in the domain of data clustering, as well as six standard text datasets utilized in the domain of text document clustering, in addition to two scientific-article datasets. The experiments showed that the K-means-hybridized MVO improves the results in terms of convergence rate, accuracy, error rate, purity, entropy, recall, precision, and F-measure criteria. In general, H-MVO has outperformed, or at least proven to be highly competitive with, the original MVO algorithm, well-known optimization algorithms such as KHA, HS, PSO, GA, H-PSO, and H-GA, and clustering techniques such as K-means, K-means++, DBSCAN, agglomerative, and spectral clustering.
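The hybridization idea, seeding the metaheuristic's initial population with K-means output instead of purely random solutions, can be sketched as follows. The function name, population layout, and the choice of how many candidates to seed are assumptions for illustration, not H-MVO's actual operators.

```python
# Minimal sketch: seed part of a metaheuristic's initial population with
# K-means centroids so the search starts from a better region.
import numpy as np
from sklearn.cluster import KMeans

def initial_population(data, k, pop_size, seeded=2, rng=None):
    rng = rng or np.random.default_rng()
    n_features = data.shape[1]
    pop = []
    for i in range(pop_size):
        if i < seeded:
            # K-means-seeded candidate: centroids from a short K-means run.
            km = KMeans(n_clusters=k, n_init=5,
                        random_state=int(rng.integers(1_000_000)))
            km.fit(data)
            pop.append(km.cluster_centers_.copy())
        else:
            # Random candidate: k centroids drawn uniformly from the data range.
            lo, hi = data.min(axis=0), data.max(axis=0)
            pop.append(rng.uniform(lo, hi, size=(k, n_features)))
    return pop

data = np.random.default_rng(1).normal(size=(200, 5))
pop = initial_population(data, k=3, pop_size=10)
print(len(pop), pop[0].shape)  # 10 (3, 5)
```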

33 citations


Cites methods from "EEG Signals Denoising Using Optimal..."

  • ...Until today, random has been the standard method to create the initial population [36]....


Journal ArticleDOI
TL;DR: A new formulation for a smart home battery (SHB) is proposed for PSPSH that reduces the effect of restrictions in obtaining optimal/near-optimal solutions and yields better performance than the other compared algorithms in almost all scenarios.
Abstract: The power scheduling problem in a smart home (PSPSH) refers to the timely scheduling of smart home appliance operations under a set of restrictions and a dynamic pricing scheme(s) produced by a power supplier company (PSC). The primary objectives of PSPSH are: (I) minimizing the cost of the power consumed by home appliances, which refers to electricity bills; (II) balancing the power consumed during a time horizon, particularly at peak periods, which is known as the peak-to-average ratio; and (III) maximizing the satisfaction level of users. Several approaches have been proposed to address PSPSH optimally, including optimization-based and non-optimization-based approaches. However, the set of restrictions inhibits the approach used from obtaining the optimal solutions. In this paper, a new formulation for a smart home battery (SHB) is proposed for PSPSH that reduces the effect of restrictions in obtaining optimal/near-optimal solutions. An SHB can enhance the scheduling of smart home appliances by storing power at unsuitable periods and using the stored power at suitable periods for PSPSH objectives. PSPSH is formulated as a multi-objective optimization problem to achieve all objectives simultaneously. A robust swarm-based optimization algorithm inspired by the grey wolf lifestyle, called the grey wolf optimizer (GWO), is adapted to address PSPSH. GWO has powerful operations managed by its dynamic parameters that maintain exploration and exploitation behavior in the search space. Seven scenarios of power consumption and dynamic pricing schemes are considered in the simulation results to evaluate the proposed multi-objective PSPSH using SHB (BMO-PSPSH) approach. The performance of the proposed BMO-PSPSH approach is compared with that of 17 other state-of-the-art algorithms using their recommended datasets and with four algorithms using the proposed datasets. The proposed BMO-PSPSH approach yields better performance than the other compared algorithms in almost all scenarios.
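As a rough sketch of the kind of multi-objective fitness such a scheduler optimizes, the snippet below scores a candidate appliance schedule by a weighted sum of electricity cost, peak-to-average ratio, and a user-discomfort proxy. The weights, tariff, and discomfort measure are invented for illustration and are not the paper's BMO-PSPSH formulation.

```python
# Minimal sketch of a weighted-sum stand-in for a multi-objective PSPSH fitness.
import numpy as np

def fitness(schedule, appliances, tariff, preferred, w=(0.4, 0.3, 0.3)):
    """schedule: start slot per appliance; appliances: (duration, kW) pairs."""
    load = np.zeros(tariff.size)
    discomfort = 0.0
    for start, (dur, kw), pref in zip(schedule, appliances, preferred):
        load[start:start + dur] += kw                # accumulate the load profile
        discomfort += abs(start - pref)              # shift from preferred start
    cost = float(np.sum(load * tariff))              # electricity bill
    par = float(load.max() / load.mean())            # peak-to-average ratio
    return w[0] * cost + w[1] * par + w[2] * discomfort

tariff = np.array([0.1] * 8 + [0.3] * 8 + [0.2] * 8)  # 24 hourly prices
appliances = [(2, 1.5), (3, 0.8)]                     # (duration, kW) per appliance
print(fitness([9, 20], appliances, tariff, preferred=[8, 18]))
```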

30 citations

Journal ArticleDOI
TL;DR: This study suggests directed functional connectivity with GTA as informative features and highlights the Support Vector Machine as a suitable classifier for classifying semantic vigilance levels.
Abstract: This paper proposes two novel methods to classify semantic vigilance levels by utilizing EEG directed connectivity patterns with their corresponding graphical network measures. We estimate the directed connectivity using relative wavelet transform entropy (RWTE) and partial directed coherence (PDC), and the graphical network measures by graph theory analysis (GTA), at four frequency bands. The RWTE and PDC quantify the strength and directionality of information flow between EEG nodes. On the other hand, the GTA of the complex network measures summarizes the topological structure of the network. We then evaluate the proposed methods using machine learning classifiers. We carried out an experiment on nine subjects performing a semantic vigilance task (Stroop color word test (SCWT)) for approximately 45 minutes. Behaviorally, all subjects demonstrated vigilance decrement, as reflected by the significant increase in response time and reduced accuracy. The strength and directionality of information flow in the connectivity network by RWTE/PDC and the GTA measures significantly decrease with vigilance decrement, p < 0.05. The classification results show that the proposed methods outperform other related and competitive methods available in the literature and achieve 100% accuracy at the subject-dependent level and above 89% at the subject-independent level in each of the four frequency bands. The overall results indicate that the proposed methods of directed connectivity patterns and GTA provide a complementary aspect of functional connectivity. Our study suggests directed functional connectivity with GTA as informative features and highlights the Support Vector Machine as a suitable classifier for classifying semantic vigilance levels.
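To illustrate the GTA step, the sketch below thresholds a directed connectivity matrix and extracts a few graph measures with NetworkX. The random matrix stands in for an estimated PDC/RWTE matrix, and the particular measures and threshold are illustrative assumptions rather than the paper's feature set.

```python
# Minimal sketch: summarize a (thresholded) directed connectivity matrix
# with graph-theory measures; one feature vector per band/epoch would
# then feed a machine learning classifier.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
conn = rng.uniform(size=(19, 19))          # stand-in for a PDC matrix (19 EEG nodes)
np.fill_diagonal(conn, 0.0)

# Keep only edges above a strength threshold; PDC is directed.
G = nx.from_numpy_array(conn * (conn > 0.7), create_using=nx.DiGraph)

features = {
    "density": nx.density(G),
    "mean_in_degree": float(np.mean([d for _, d in G.in_degree()])),
    "mean_clustering": nx.average_clustering(G.to_undirected()),
}
print(features)
```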

28 citations


Additional excerpts

  • ...1∼4Hz), θ wavelet (4∼8Hz), α wavelet (8∼13Hz), and β wavelet (14∼30Hz)], as described in [51], [52]....


References
Journal ArticleDOI
13 May 1983 - Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
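A minimal simulated-annealing sketch of the statistical-mechanics analogy described above: accept any improving move, accept worsening moves with Boltzmann probability exp(-Δ/T), and cool T geometrically. The objective function and schedule parameters are illustrative.

```python
# Minimal simulated annealing on a 1-D multimodal test function.
import math
import random

def anneal(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000, seed=0):
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)           # random neighbor
        fc = f(cand)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        t *= cooling                                   # geometric cooling schedule
    return x, fx

f = lambda x: x * x + 10 * math.sin(3 * x)            # multimodal objective
print(anneal(f, x0=4.0))
```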

41,772 citations

BookDOI
01 May 1992
TL;DR: Initially applying his concepts to simply defined artificial systems with limited numbers of parameters, Holland goes on to explore their use in the study of a wide range of complex, naturally occurring processes, concentrating on systems having multiple factors that interact in nonlinear ways.
Abstract: From the Publisher: Genetic algorithms are playing an increasingly important role in studies of complex adaptive systems, ranging from adaptive agents in economic theory to the use of machine learning techniques in the design of complex devices such as aircraft turbines and integrated circuits. Adaptation in Natural and Artificial Systems is the book that initiated this field of study, presenting the theoretical foundations and exploring applications. In its most familiar form, adaptation is a biological process, whereby organisms evolve by rearranging genetic material to survive in environments confronting them. In this now classic work, Holland presents a mathematical model that allows for the nonlinearity of such complex interactions. He demonstrates the model's universality by applying it to economics, physiological psychology, game theory, and artificial intelligence and then outlines the way in which this approach modifies the traditional views of mathematical genetics. Initially applying his concepts to simply defined artificial systems with limited numbers of parameters, Holland goes on to explore their use in the study of a wide range of complex, naturally occurring processes, concentrating on systems having multiple factors that interact in nonlinear ways. Along the way he accounts for major effects of coadaptation and coevolution: the emergence of building blocks, or schemata, that are recombined and passed on to succeeding generations to provide innovations and improvements. John H. Holland is Professor of Psychology and Professor of Electrical Engineering and Computer Science at the University of Michigan. He is also Maxwell Professor at the Santa Fe Institute and is Director of the University of Michigan/Santa Fe Institute Advanced Research Program.

12,584 citations


"EEG Signals Denoising Using Optimal..." refers background in this paper

  • ...Metaheuristic algorithms are conventionally categorized into: i) evolutionary algorithms (EAs), including GA [16], harmony search (HS) [17], and genetic programming (GP) [18]; ii) swarmbased intelligence algorithms (SI), including particle swarm...


  • ...GA was developed in [16] to mimic the natural phenomenon of Darwin evolution theory....


Journal ArticleDOI
TL;DR: The authors prove two results about this type of estimator that are unprecedented in several ways: with high probability $\hat{f}^*_n$ is at least as smooth as $f$, in any of a wide variety of smoothness measures.
Abstract: Donoho and Johnstone (1994) proposed a method for reconstructing an unknown function $f$ on $[0,1]$ from noisy data $d_i = f(t_i) + \sigma z_i$, $i = 0, \ldots, n-1$, $t_i = i/n$, where the $z_i$ are independent and identically distributed standard Gaussian random variables. The reconstruction $\hat{f}^*_n$ is defined in the wavelet domain by translating all the empirical wavelet coefficients of $d$ toward 0 by an amount $\sigma \sqrt{2\log(n)/n}$. The authors prove two results about this type of estimator. [Smooth]: with high probability $\hat{f}^*_n$ is at least as smooth as $f$, in any of a wide variety of smoothness measures. [Adapt]: the estimator comes nearly as close in mean square to $f$ as any measurable estimator can come, uniformly over balls in each of two broad scales of smoothness classes. These two properties are unprecedented in several ways. The present proof of these results develops new facts about abstract statistical inference and its connection with an optimal recovery model.
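A minimal sketch of this shrinkage rule with PyWavelets: soft-threshold the empirical wavelet coefficients at the universal threshold. Note on scaling: the abstract states the threshold as $\sigma\sqrt{2\log(n)/n}$ for function-space-normalized coefficients; with an orthonormal discrete WT the per-coefficient noise level is $\sigma$, so the same rule reads $\sigma\sqrt{2\log n}$. The test signal is invented for the example.

```python
# Minimal VisuShrink-style soft thresholding with the universal threshold.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n, sigma = 1024, 0.2
t = np.arange(n) / n
f = np.sin(4 * np.pi * t) * (t > 0.3)                 # piecewise-smooth test signal
d = f + sigma * rng.standard_normal(n)                # d_i = f(t_i) + sigma * z_i

coeffs = pywt.wavedec(d, "sym8")
thr = sigma * np.sqrt(2 * np.log(n))                  # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
f_hat = pywt.waverec(coeffs, "sym8")[:n]

print("MSE:", np.mean((f_hat - f) ** 2))
```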

9,359 citations


"EEG Signals Denoising Using Optimal..." refers background in this paper

  • ...The wavelet generally provides two standard types of thresholding functions (β), namely, hard and soft thresholding [47], [62]....


  • ...e, β)), can be divided into hard and soft thresholding [47], [62]....


Journal ArticleDOI
TL;DR: In this article, the authors developed a spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients, and achieved performance within a factor $\log^2 n$ of the ideal performance of piecewise polynomial and variable-knot spline methods.
Abstract: With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable-knot spline, or variable-bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive kernels; however, it is a priori unclear whether such performance can be obtained by a procedure relying on the data alone. We describe a new principle for spatially adaptive estimation: selective wavelet reconstruction. We show that variable-knot spline fits and piecewise-polynomial fits, when equipped with an oracle to select the knots, are not dramatically more powerful than selective wavelet reconstruction with an oracle. We develop a practical spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients. RiskShrink mimics the performance of an oracle for selective wavelet reconstruction as well as it is possible to do so. A new inequality in multivariate normal decision theory, which we call the oracle inequality, shows that attained performance differs from ideal performance by at most a factor of approximately $2\log n$, where $n$ is the sample size. Moreover, no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor $\log^2 n$ of the performance of piecewise polynomial and variable-knot spline methods equipped with an oracle. In contrast, it is unknown how or if piecewise polynomial methods could be made to function this well when denied access to an oracle and forced to rely on data alone.

8,153 citations


"EEG Signals Denoising Using Optimal..." refers background or methods in this paper

  • ...e, β)), can be divided into hard and soft thresholding [47], [62]....


  • ...[47] D. L. Donoho and I. M. Johnstone, ‘‘Ideal spatial adaptation by wavelet shrinkage,’’ Biometrika, vol. 81, no. 3, pp. 425–455, Sep. 1994....


  • ...DWT was originally established in [47] as the so-called Donoho’s approach....


  • ...Donoho and Johnstone [47] calculated the threshold δ on an...


  • ...The wavelet generally provides two standard types of thresholding functions (β), namely, hard and soft thresholding [47], [62]....


01 Jan 2005

5,265 citations


Additional excerpts

  • ...optimization (PSO) [19], artificial bee colony (ABC) [20], flower pollination algorithm (FPA) [21], and iii) trajectorybased algorithm (TAs), including β-hill climbing (βHC) [14], simulating annealing (SA) [22], tabu search (TS) [23], greedy randomized adaptive search procedure (GRASP) [24], variable neighborhood search (VNS) [25], iterated local search (ILS) [26] meta-heuristic....


  • ...Metaheuristic algorithms are conventionally categorized into: i) evolutionary algorithms (EAs), including GA [16], harmony search (HS) [17], and genetic programming (GP) [18]; ii) swarmbased intelligence algorithms (SI), including particle swarm optimization (PSO) [19], artificial bee colony (ABC) [20], flower pollination algorithm (FPA) [21], and iii) trajectorybased algorithm (TAs), including β-hill climbing (βHC) [14], simulating annealing (SA) [22], tabu search (TS) [23], greedy randomized adaptive search procedure (GRASP) [24], variable neighborhood search (VNS) [25], iterated local search (ILS) [26] meta-heuristic....
