Stepwise Feature Selection by Cross Validation for EEG-based Brain Computer Interface
Citations
118 citations
Cites background from "Stepwise Feature Selection by Cross..."
...It is well known that feature selection is effective for pattern classification as shown in [14]....
[...]
46 citations
38 citations
15 citations
Cites background or methods from "Stepwise Feature Selection by Cross..."
...However, a large number of studies [30], [31], [34]–[36] have shown that LOO usually produces high variance and that its model evaluation is not as good as that of 10-fold CV [23]....
[...]
...For example, in [36], 5-fold CV was applied to improve the generalization performance of an SVM model and to guide the removal of irrelevant and redundant features in a brain-computer interface, with good results....
[...]
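The excerpts above describe scoring candidate feature subsets with k-fold cross-validation and an SVM. A minimal sketch of that general idea, using synthetic placeholder data and a hypothetical feature subset (not the cited papers' data or exact pipeline):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 24))      # placeholder: 100 trials x 24 spectral features
y = rng.integers(0, 2, size=100)    # placeholder binary class labels

subset = [0, 3, 5, 7]               # hypothetical candidate feature subset
score = cross_val_score(SVC(kernel="rbf"), X[:, subset], y, cv=5).mean()
print(f"mean 5-fold CV accuracy for subset {subset}: {score:.3f}")
```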
14 citations
Cites background or methods from "Stepwise Feature Selection by Cross..."
...In the case where K equals the number of samples available in the data, this validation method is referred to as the ‘leave-one-out’ approach (also known as jack-knifing) (Tanaka et al., 2006; Duffy and Als, 2012)....
[...]
...The generalisation performance of a classifier, using K-fold cross-validation, is estimated according to the following steps (Tanaka et al., 2006)....
[...]
...The smaller the feature set to be classified, the lower the computational burden, which in turn may lead to improved classification performance (Tanaka et al., 2006)....
[...]
...The drawback of training with large data sets is the high computational burden, which increases with sample size (Tanaka et al., 2006)....
[...]
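The leave-one-out case mentioned above is simply K-fold cross-validation with K equal to the number of samples, so each fold holds out a single sample. A hedged sketch of that special case, with synthetic placeholder data:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 8))        # placeholder: 30 samples x 8 features
y = rng.integers(0, 2, size=30)

# One held-out sample per fold, i.e. K = number of samples (jack-knifing).
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.3f}")
```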
References
47,133 citations
"Stepwise Feature Selection by Cross..." refers methods in this paper
...Information criteria such as AIC [13] and MDL [14] are also often used; these evaluate the prediction model using the maximum log likelihood calculated from the learning parameters....
[...]
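The excerpt refers to information criteria that score a model by its maximum log likelihood penalized by the number of fitted parameters. A small illustration of the AIC form of that trade-off; the log-likelihood value below is an arbitrary placeholder, not a result from the cited papers:

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike Information Criterion: 2k - 2*ln(L_max); lower is better."""
    return 2 * n_params - 2 * log_likelihood

print(aic(log_likelihood=-120.5, n_params=6))   # -> 253.0
```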
26,531 citations
"Stepwise Feature Selection by Cross..." refers methods in this paper
...Instead of a linear SVM, we suggest using kernel support vector machines (kernel SVMs) [9] as a way of improving accuracy....
[...]
...Since the choice of a kernel function has an influence on the performance of the constructed SVM classifier, it was necessary to select an appropriate kernel function for classification....
[...]
...(2) and choosing an appropriate kernel function K, suitable kernel SVMs can be constructed for a given task [12]....
[...]
...An additional way of improving the classifier will be to use wavelets or kernel feature selection for kernel SVMs....
[...]
...The proposed algorithm is based on backward stepwise selection with 5-fold cross validation and kernel SVMs....
[...]
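The last excerpt names the overall scheme: backward stepwise selection guided by 5-fold cross-validation with kernel SVMs. The sketch below shows one plausible form of that loop, greedily dropping a feature whenever its removal does not reduce CV accuracy; the stopping rule and the synthetic data are assumptions, not the paper's exact specification.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 12))      # placeholder: 120 trials x 12 features
y = rng.integers(0, 2, size=120)

def cv_score(features):
    """Mean 5-fold CV accuracy of an RBF-kernel SVM on the given feature subset."""
    return cross_val_score(SVC(kernel="rbf"), X[:, features], y, cv=5).mean()

selected = list(range(X.shape[1]))  # start from the full feature set (backward selection)
best = cv_score(selected)
improved = True
while improved and len(selected) > 1:
    improved = False
    for f in list(selected):
        candidate = [g for g in selected if g != f]
        s = cv_score(candidate)
        if s >= best:               # remove a feature if CV accuracy does not fall
            best, selected, improved = s, candidate, True
            break

print("selected features:", selected, "CV accuracy:", round(best, 3))
```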
13,736 citations
9,705 citations
"Stepwise Feature Selection by Cross..." refers methods in this paper
...The power spectral density for each electrode was estimated using the Welch periodogram [8], [6] and was divided into 12 components with a 2 Hz resolution....
[...]
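The excerpt describes the feature extraction step: per-electrode power spectral densities estimated with the Welch method and split into twelve 2 Hz components. A rough single-channel sketch; the 256 Hz sampling rate, segment length, and 0–24 Hz range are assumptions for illustration only:

```python
import numpy as np
from scipy.signal import welch

fs = 256                                             # assumed sampling rate (Hz)
eeg = np.random.default_rng(3).normal(size=fs * 4)   # placeholder 4 s single-channel EEG

freqs, psd = welch(eeg, fs=fs, nperseg=fs)           # Welch PSD, 1 Hz frequency resolution
bands = [(2 * k, 2 * (k + 1)) for k in range(12)]    # twelve 2 Hz components
features = [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]
print(len(features), "band-power features:", np.round(features, 4))
```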