Feature Selection Based on Structured Sparsity: A Comprehensive Study
References
"Feature Selection Based on Structur..." refers background in this paper
...An interesting way to cope with feature selection in the learning-by-examples framework is to resort to regularization techniques based on an l1 penalty [37], [38]....
[...]
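The l1-penalized objective alluded to here (the lasso) can be minimized with proximal gradient descent (ISTA), whose soft-thresholding step is exactly what drives coefficients to zero. A minimal numpy sketch, assuming a squared loss; `soft_threshold` and `lasso_ista` are illustrative names, not functions from the cited works:

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, iters=500):
    # Minimize 0.5 * ||X w - y||^2 + lam * ||w||_1 by proximal gradient (ISTA).
    lr = 1.0 / np.linalg.norm(X, 2) ** 2      # step size 1/L, L = ||X||_2^2
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ w - y)              # gradient of the smooth part
        w = soft_threshold(w - lr * grad, lr * lam)
    return w

# toy problem: only the first three of twenty features carry signal
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -3.0, 1.5]
y = X @ w_true + 0.01 * rng.standard_normal(100)
w = lasso_ista(X, y, lam=0.5)
```

Because soft-thresholding sets small coordinates exactly to zero, the recovered `w` is sparse: the seventeen irrelevant features end up with (near-)zero weight, which is what makes the l1 penalty a feature selector rather than just a shrinkage device.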
"Feature Selection Based on Structur..." refers methods in this paper
...To handle features with strong correlations, elastic net regularization [46] is proposed as...
[...]
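Elastic net adds a smooth l2 term to the l1 penalty; since the l2 term stays in the gradient, the same proximal scheme applies. A sketch (`elastic_net_ista` is an illustrative name), using two perfectly correlated features to show the effect the excerpt describes, the quadratic term making the weight split across correlated features unique and even:

```python
import numpy as np

def elastic_net_ista(X, y, lam1, lam2, iters=500):
    # Minimize 0.5*||X w - y||^2 + lam1*||w||_1 + 0.5*lam2*||w||^2.
    # The l2 term is smooth, so it joins the gradient; the l1 term is
    # handled by its proximal operator (soft-thresholding).
    lr = 1.0 / (np.linalg.norm(X, 2) ** 2 + lam2)   # 1 / Lipschitz constant
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ w - y) + lam2 * w
        v = w - lr * grad
        w = np.sign(v) * np.maximum(np.abs(v) - lr * lam1, 0.0)
    return w

# two identical (perfectly correlated) features plus one irrelevant feature
rng = np.random.default_rng(1)
c = rng.standard_normal(100)
X = np.column_stack([c, c, rng.standard_normal(100)])
y = 3.0 * c + 0.01 * rng.standard_normal(100)
w = elastic_net_ista(X, y, lam1=0.1, lam2=1.0)
# the l2 term splits the weight evenly: w[0] == w[1], each near 1.5
```

A pure lasso objective has no unique answer for duplicated columns (any split of the weight between them gives the same penalty); the strictly convex l2 term pins the solution to the even split, which is why elastic net handles strongly correlated features more gracefully.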
"Feature Selection Based on Structur..." refers background in this paper
...The first term of (23), arg min_{Z Z^T = I_{m×m}} tr(Z L Z^T), is exactly the same as [91], which is to find the low-dimensional embedding of each example....
[...]
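The trace minimization arg min_{Z Z^T = I} tr(Z L Z^T) has a closed-form solution: the rows of the optimal Z are the eigenvectors of the Laplacian L with the smallest eigenvalues (the Laplacian-eigenmaps construction of [91]). A small numpy illustration on a two-cluster toy graph; function names are illustrative:

```python
import numpy as np

def graph_laplacian(W):
    # Unnormalized Laplacian L = D - W of a symmetric affinity matrix.
    return np.diag(W.sum(axis=1)) - W

def spectral_embedding(W, m):
    # arg min_{Z Z^T = I} tr(Z L Z^T): the rows of the optimal Z are the
    # eigenvectors of L with the m smallest eigenvalues.
    vals, vecs = np.linalg.eigh(graph_laplacian(W))  # ascending eigenvalues
    return vecs[:, :m].T                             # Z has shape m x n

# toy graph: two triangles joined by a single weak edge
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1
Z = spectral_embedding(W, 2)
# the second row of Z (the Fiedler vector) separates the two triangles
```

The orthonormality constraint Z Z^T = I is satisfied automatically because `eigh` returns orthonormal eigenvectors, and the weakly connected clusters get opposite signs in the second embedding coordinate.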
"Feature Selection Based on Structur..." refers background in this paper
...E. l2,0-Norm Regularized/Constrained Feature Selection Sparse feature selection [71] selects features by solving a smoothed general loss function with an l2,0-norm constraint....
[...]
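The l2,0 constraint ||W||_{2,0} ≤ k bounds the number of nonzero rows of W, i.e., the number of selected features. The hard projection onto this constraint set is simple, keep the k rows of W with the largest l2 norms and zero out the rest, which is the building block the smoothed algorithms work around. A sketch (not the smoothed method of [71] itself):

```python
import numpy as np

def l20_project(W, k):
    # Projection onto {W : ||W||_{2,0} <= k}: keep the k rows of W with the
    # largest l2 norms and zero out the rest.
    keep = np.argsort(np.linalg.norm(W, axis=1))[-k:]
    out = np.zeros_like(W)
    out[keep] = W[keep]
    return out

W = np.array([[3.0, 4.0],   # row norm 5.0
              [0.1, 0.0],   # row norm 0.1
              [1.0, 1.0]])  # row norm ~1.41
P = l20_project(W, 2)       # zeroes out the weakest row (feature 1)
```

Unlike the l2,1 penalty, this constraint is non-convex, so the projection gives no global-optimality guarantee, but it lets the feature budget k be set exactly rather than indirectly through a regularization weight.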
...(22) 4) Feature Selection via Joint Embedding Learning and Sparse Regression: Instead of regressing each example to its label [26], [86], [88], the objective of joint embedding learning and sparse regression (JELSR) [89], [90] is to regress each example X_i to its low-dimensional embedding Z_i ∈ R^m, where m is the dimensionality of the embedding....
[...]
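The JELSR idea can be caricatured as a pipeline: learn the embedding Z from a graph Laplacian, regress the examples onto Z, and rank features by the row norms of the regression matrix W. The sketch below is only a structural illustration, it substitutes plain ridge regression for the paper's l2,1-penalized regression and a fixed two-step pass for the joint alternating optimization, and all names (`jelsr_sketch`, the block affinity) are assumptions:

```python
import numpy as np

def jelsr_sketch(X, W_aff, m, alpha=1e-2):
    # Step 1: spectral embedding Z from the graph Laplacian of W_aff
    # (skipping the trivial constant eigenvector).
    L = np.diag(W_aff.sum(axis=1)) - W_aff
    vals, vecs = np.linalg.eigh(L)
    Z = vecs[:, 1:m + 1].T                              # m x n
    # Step 2: regress examples onto their embedding; ridge regression here
    # stands in for the paper's l2,1-penalized sparse regression.
    d = X.shape[0]                                      # X is d x n
    W = np.linalg.solve(X @ X.T + alpha * np.eye(d), X @ Z.T)
    # Step 3: rank features by the row norms of W (largest first).
    return np.argsort(np.linalg.norm(W, axis=1))[::-1]

# n = 10 examples in two clusters: strong within-cluster affinity, weak across
n = 10
W_aff = np.full((n, n), 0.05)
W_aff[:5, :5] = 1.0
W_aff[5:, 5:] = 1.0
np.fill_diagonal(W_aff, 0.0)
# only feature 0 of X reflects the cluster structure; features 1-2 are noise
rng = np.random.default_rng(0)
labels = np.array([1.0] * 5 + [-1.0] * 5)
X = np.vstack([labels + 0.05 * rng.standard_normal(n),
               rng.standard_normal((2, n))])
ranking = jelsr_sketch(X, W_aff, m=1)   # feature 0 should rank first
```

The point of the joint formulation in [89], [90] is precisely that these steps are not run once in sequence: embedding and regression are optimized together, so the selected features and the learned embedding reinforce each other.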
...C. l2,1-Norm Regularized/Constrained Feature Selection 1) Efficient and Robust Feature Selection via Joint l2,1-Norm Minimization: Nie et al. [26] aim to learn a linear function y = x^T W + b, such that for n training examples, Y_i ≈ X_i^T W + b, i.e., min_{W,b} ‖Y_i − X_i^T W − b‖_2....
[...]
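A standard way to handle an l2,1 penalty is iteratively reweighted least squares: fix a diagonal matrix D with D_ii = 1/(2‖w^i‖_2), solve the resulting ridge-type system, and recompute D. The sketch below applies this to the least-squares variant min_W ‖X^T W − Y‖_F² + γ‖W‖_{2,1}, a simplification of the joint l2,1 objective of [26] (which also uses an l2,1 loss); the function name and data are assumptions:

```python
import numpy as np

def l21_regression(X, Y, gamma=1.0, iters=50, eps=1e-8):
    # IRLS for min_W ||X^T W - Y||_F^2 + gamma * ||W||_{2,1}:
    # with D fixed, the problem is a ridge-type linear system; D is then
    # recomputed from the new row norms of W. The eps guard avoids
    # division by zero once a row has collapsed.
    d = X.shape[0]
    W = np.linalg.solve(X @ X.T + gamma * np.eye(d), X @ Y)  # ridge warm start
    for _ in range(iters):
        D = np.diag(1.0 / (2.0 * np.linalg.norm(W, axis=1) + eps))
        W = np.linalg.solve(X @ X.T + gamma * D, X @ Y)
    return W

# d = 5 features, n = 50 examples; only feature 0 predicts the target
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 50))
Y = (X[0] + 0.05 * rng.standard_normal(50)).reshape(-1, 1)
W = l21_regression(X, Y, gamma=1.0)
norms = np.linalg.norm(W, axis=1)
# rows of W for the irrelevant features collapse toward zero
```

Because the penalty couples all columns of a row through its l2 norm, whole rows of W are driven to zero at once, so discarding features with (near-)zero row norm performs the joint selection across all outputs that the excerpt describes.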
...To solve this issue, feature selection (also known as feature ranking, subset selection, or variable selection) [1]–[7] techniques are designed to select a subset of features from the high-dimensional feature set for a compact and accurate data representation....
[...]
...6) Unsupervised Feature Selection: Maximum margin criterion (MMC) [95], [96] is a supervised subspace method, a variant of linear discriminant analysis (LDA)....
[...]