
Showing papers by "Gao Huang published in 2013"


Proceedings Article
16 Jun 2013
TL;DR: Anytime Feature Representations (AFR) is introduced, a novel algorithm that explicitly addresses the accuracy versus test-time-cost trade-off in the data representation rather than in the classifier, allowing conventional classifiers to be turned into test-time cost-sensitive anytime classifiers.
Abstract: Evaluation cost during test-time is becoming increasingly important as many real-world applications need fast evaluation (e.g. web search engines, email spam filtering) or use expensive features (e.g. medical diagnosis). We introduce Anytime Feature Representations (AFR), a novel algorithm that explicitly addresses this trade-off in the data representation rather than in the classifier. This enables us to turn conventional classifiers, in particular Support Vector Machines, into test-time cost-sensitive anytime classifiers, combining the advantages of anytime learning and large-margin classification.
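As a rough illustration of anytime prediction under test-time feature costs (a generic baseline, not the AFR algorithm itself, which learns the representation), the sketch below trains one linear SVM per feature budget, assuming features are already sorted by ascending acquisition cost; the data and budget values are synthetic:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Assumption: columns of X are ordered by ascending acquisition cost,
# so a cost budget corresponds to a prefix of the features. One linear
# SVM is trained per budget; prediction can be interrupted after any
# feature and still return the best answer affordable so far.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                 # 200 samples, 10 features
y = (X[:, :3].sum(axis=1) > 0).astype(int)     # synthetic labels

budgets = [2, 5, 10]                           # feature counts per budget
models = {b: LinearSVC(dual=False).fit(X[:, :b], y) for b in budgets}

def anytime_predict(x, budget):
    """Predict using only the features affordable within `budget`."""
    b = max(k for k in budgets if k <= budget)
    return models[b].predict(x[:b].reshape(1, -1))[0]

print(anytime_predict(X[0], budget=5))
```

Interrupting at any budget returns the prediction of the largest affordable model, which is the anytime property the abstract describes.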

18 citations


Journal ArticleDOI
TL;DR: A novel SSL algorithm based on the multiple-clusters-per-class assumption is proposed; it is efficient at handling big data sets with a large amount of unlabeled data, and its performance is guaranteed by a maximal-margin classifier.
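The TL;DR gives only the outline, so the following is a loose sketch of generic cluster-then-label semi-supervised learning under a multiple-clusters-per-class assumption, finished with a maximum-margin (linear SVM) step; it is an illustrative baseline, not the paper's algorithm, and all data and hyperparameters are made up:

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import LinearSVC

# Cluster-then-label sketch: group the data into many small clusters
# (more clusters than classes), let each cluster inherit the majority
# label of the labeled points it contains, then train a maximum-margin
# classifier on the pseudo-labeled set.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
y_true = (X[:, 0] + X[:, 1] > 0).astype(int)
labeled = rng.choice(len(X), size=50, replace=False)   # few labeled points

km = MiniBatchKMeans(n_clusters=20, n_init=3, random_state=1).fit(X)
cluster_label = {}
for c in range(20):
    members = np.intersect1d(np.where(km.labels_ == c)[0], labeled)
    if len(members):                                   # cluster has labeled points
        cluster_label[c] = np.bincount(y_true[members]).argmax()

# Keep only points whose cluster received a label, then fit the SVM.
mask = np.array([km.labels_[i] in cluster_label for i in range(len(X))])
pseudo_y = np.array([cluster_label[c] for c in km.labels_[mask]])
clf = LinearSVC(dual=False).fit(X[mask], pseudo_y)     # max-margin step
print(clf.score(X, y_true))
```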

9 citations


Journal ArticleDOI
TL;DR: This paper proposes an alternative selection procedure based on the kernelized least angle regression (LARS)–least absolute shrinkage and selection operator (LASSO) method, formulating the RBF neural network as a linear-in-the-parameters model and deriving an l1-constrained objective function for training the network.
Abstract: Model structure selection is of crucial importance in radial basis function (RBF) neural networks. Existing model structure selection algorithms are essentially forward selection or backward elimination methods that may lead to sub-optimal models. This paper proposes an alternative selection procedure based on the kernelized least angle regression (LARS)–least absolute shrinkage and selection operator (LASSO) method. By formulating the RBF neural network as a linear-in-the-parameters model, we derive an l1-constrained objective function for training the network. The proposed algorithm makes it possible to dynamically drop a previously selected regressor term that is insignificant. Furthermore, inspired by the idea of LARS, the computation of the output weights in our algorithm is greatly simplified. Since our proposed algorithm can simultaneously conduct model structure selection and parameter optimization, a network with better generalization performance is built. Computational experiments with artificial and real-world data confirm the efficacy of the proposed algorithm.
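The linear-in-the-parameters formulation maps directly onto off-the-shelf tools: treat every training point as a candidate RBF center, form the Gaussian response matrix, and fit it with an l1-constrained LARS solver (scikit-learn's LassoLars). The sketch below is a minimal illustration under assumed hyperparameters (gamma, alpha), not the paper's exact training procedure:

```python
import numpy as np
from sklearn.linear_model import LassoLars
from sklearn.metrics.pairwise import rbf_kernel

# Every training point is a candidate center; the Gaussian responses form
# the regressor matrix, and the l1 penalty selects a sparse subset of
# centers and their output weights in a single LARS pass.
rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

Phi = rbf_kernel(X, X, gamma=1.0)        # one column per candidate center
model = LassoLars(alpha=1e-3).fit(Phi, y)

selected = np.flatnonzero(model.coef_)   # centers kept by the l1 constraint
print(f"{len(selected)} of {len(X)} candidate centers selected")

# Predict at new inputs using the selected centers' weights.
X_new = np.linspace(-3, 3, 5).reshape(-1, 1)
y_hat = rbf_kernel(X_new, X, gamma=1.0) @ model.coef_ + model.intercept_
print(y_hat)
```

Because LARS traces the whole l1 regularization path, a previously selected center can leave the active set later, which mirrors the abstract's point about dynamically dropping insignificant regressor terms.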

8 citations