Journal ArticleDOI
Learnability and the Vapnik-Chervonenkis dimension
TLDR
This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
Abstract
Valiant's learnability model is extended to learning classes of concepts defined by regions in Euclidean space E^n. The methods in this paper lead to a unified treatment of some of Valiant's results, along with previous results on distribution-free convergence of certain pattern recognition algorithms. It is shown that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned. Using this parameter, the complexity and closure properties of learnable classes are analyzed, and necessary and sufficient conditions for feasible learnability are given.
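To make the combinatorial parameter concrete, here is a minimal sketch (not from the paper; function names and point sets are illustrative) that brute-forces a shattering check for axis-aligned rectangles in the plane, one of the geometric concept classes the paper's framework covers:

```python
def rectangle_labelings(points):
    """All labelings of `points` achievable by axis-aligned rectangles.

    A rectangle labels a point positive iff the point lies inside it.
    It suffices to try rectangles whose bounds come from the points'
    own coordinates: any achievable labeling is also achieved by the
    bounding box of its positive points.
    """
    xs = sorted({p[0] for p in points})
    ys = sorted({p[1] for p in points})
    labelings = set()
    for x0 in xs:
        for x1 in xs:          # x0 > x1 yields the empty rectangle
            for y0 in ys:
                for y1 in ys:
                    labelings.add(tuple(
                        x0 <= px <= x1 and y0 <= py <= y1
                        for (px, py) in points
                    ))
    return labelings

def is_shattered(points):
    """True iff axis-aligned rectangles realize all 2^|points| labelings."""
    return len(rectangle_labelings(points)) == 2 ** len(points)

# Four points in "diamond" position are shattered (VC dimension >= 4) ...
diamond = [(0, 1), (1, 0), (2, 1), (1, 2)]
print(is_shattered(diamond))  # True

# ... but adding a center point breaks shattering: any rectangle that
# contains all four diamond points must also contain the center.
print(is_shattered(diamond + [(1, 1)]))  # False
```

In this class the VC dimension is exactly 4, so by the paper's main result axis-aligned rectangles are learnable distribution-free.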
Citations
Dissertation
Noise tolerant algorithms for learning and searching
TL;DR: A general technique is developed that converts nearly all PAC learning algorithms into highly efficient PAC learning algorithms that tolerate classification noise and malicious errors; highly efficient algorithms for searching in the presence of linearly bounded errors are also developed.
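For scale, a standard result of this flavor (Angluin and Laird's sample-size bound for minimum-disagreement learning over a finite hypothesis class under classification noise rate \(\eta < 1/2\); quoted here for context, not as the dissertation's own technique) says that a sample of size

```latex
m \;\ge\; \frac{2}{\varepsilon^{2}\,(1-2\eta)^{2}}\,
          \ln\!\frac{2\,|\mathcal{H}|}{\delta}
```

suffices, so the cost of noise tolerance is roughly a factor of \(1/(1-2\eta)^{2}\) in sample complexity.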
Book ChapterDOI
Learning recursive functions from approximations
TL;DR: This work investigates algorithmic learning, in the limit, of correct programs for recursive functions f from both input/output examples of f and several interesting varieties of approximate additional (algorithmic) information about f.
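The underlying learning model can be illustrated by Gold-style identification by enumeration (a textbook sketch under simplifying assumptions, not the chapter's algorithm; the candidate list stands in for an enumeration of programs):

```python
def learn_in_the_limit(hypotheses, examples):
    """After each example, output the first hypothesis in a fixed
    enumeration consistent with all data seen so far. If the target
    appears in the enumeration, the conjectures converge to it.
    """
    seen = []
    for x, y in examples:
        seen.append((x, y))
        for h in hypotheses:
            if all(h(u) == v for u, v in seen):
                yield h  # current conjecture
                break

# Toy usage: the target f(x) = 2x is identified after two examples.
candidates = [lambda x: x, lambda x: x + 1, lambda x: 2 * x]
stream = [(0, 0), (1, 2), (3, 6)]
for conjecture in learn_in_the_limit(candidates, stream):
    print(conjecture(10))  # 10, then 20, then 20 (converged)
```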
Journal ArticleDOI
Learning Boxes in High Dimension
Amos Beimel, Eyal Kushilevitz, et al.
TL;DR: The authors' algorithm learns the class of decision trees over n variables that take values in {0, ..., l-1}, with comparison nodes, in time poly(n, t, log l), where t is the number of leaves.
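As a warm-up for the concept class in the title, here is the classic tightest-fit learner for a single axis-aligned box over {0, ..., l-1}^n (a standard sketch, not the paper's algorithm, which handles the richer class of decision trees with comparison nodes):

```python
def tightest_fit_box(positives):
    """Return the coordinate-wise bounding box of the positive
    examples: the smallest axis-aligned box consistent with them.
    """
    n = len(positives[0])
    lo = [min(p[i] for p in positives) for i in range(n)]
    hi = [max(p[i] for p in positives) for i in range(n)]
    return lo, hi

def in_box(box, x):
    lo, hi = box
    return all(l <= xi <= h for l, xi, h in zip(lo, x, hi))

# Toy usage in n = 3 dimensions.
box = tightest_fit_box([(1, 2, 3), (4, 0, 3), (2, 1, 5)])
print(box)                     # ([1, 0, 3], [4, 2, 5])
print(in_box(box, (2, 1, 4)))  # True
print(in_box(box, (5, 1, 4)))  # False
```

Because the hypothesis never extends past the target box, it can only err on positive points, which is what makes the simple sample-size analysis of this learner go through.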
Journal ArticleDOI
Improved upper bounds for probabilities of uniform deviations
TL;DR: In this paper, the authors obtain Vapnik-Chervonenkis-type upper bounds on the uniform deviation of empirical probabilities from their expectations, sharpening previously known probability inequalities.
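For reference, the classical Vapnik-Chervonenkis inequality that results of this kind sharpen bounds the probability of a uniform deviation in terms of the growth function S(C, 2m); in its textbook form (constants vary by source) it reads:

```latex
\Pr\Bigl(\sup_{C \in \mathcal{C}} \bigl|\nu_m(C) - P(C)\bigr| > \varepsilon\Bigr)
\;\le\; 4\, S(\mathcal{C}, 2m)\, e^{-m\varepsilon^{2}/8}
```

where \(\nu_m(C)\) is the empirical frequency of the event C among m independent samples drawn from P.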
Proceedings ArticleDOI
Short-Term Load Forecasting with LSTM Based Ensemble Learning
TL;DR: A Fully Connected Cascade Neural Network is incorporated for ensemble learning and trained with an enhanced Levenberg-Marquardt (LM) algorithm; its superior performance over several baseline schemes is demonstrated.
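A minimal sketch of the ensemble idea follows, using simple averaging of independently initialized LSTM forecasters; the paper's actual combiner is an FCC network trained with LM, which standard deep-learning toolkits do not provide, so the architecture and shapes below are illustrative assumptions:

```python
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """One base learner: an LSTM mapping a load-history window to a
    one-step-ahead forecast."""
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # forecast from last hidden state

torch.manual_seed(0)
ensemble = [LSTMForecaster() for _ in range(5)]

window = torch.randn(8, 24, 1)           # batch of 8 day-long (24 h) windows
with torch.no_grad():
    preds = torch.stack([m(window) for m in ensemble])
    forecast = preds.mean(dim=0)          # (8, 1) averaged forecasts
print(forecast.shape)
```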
References
Book
Computers and Intractability: A Guide to the Theory of NP-Completeness
TL;DR: D. S. Johnson's ongoing NP-completeness column provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and D. S. Johnson in their book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., San Francisco, 1979.
Book
The Art of Computer Programming
TL;DR: Knuth's multi-volume treatise on computer algorithms, covering fundamental algorithms, seminumerical algorithms, and sorting and searching, together with their rigorous mathematical analysis.
Book
Pattern classification and scene analysis
Richard O. Duda, Peter E. Hart, et al.
TL;DR: This book provides a unified, comprehensive and up-to-date treatment of both statistical and descriptive methods for pattern recognition, including Bayesian decision theory, supervised and unsupervised learning, nonparametric techniques, discriminant analysis, clustering, preprocessing of pictorial data, spatial filtering, shape description techniques, perspective transformations, projective invariants, linguistic procedures, and artificial intelligence techniques for scene analysis.