Journal ArticleDOI
Learnability and the Vapnik-Chervonenkis dimension
TLDR
This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
Abstract
Valiant's learnability model is extended to learning classes of concepts defined by regions in Euclidean space E^n. The methods in this paper lead to a unified treatment of some of Valiant's results, along with previous results on the distribution-free convergence of certain pattern recognition algorithms. It is shown that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned. Using this parameter, the complexity and closure properties of learnable classes are analyzed, and necessary and sufficient conditions are provided for feasible learnability.
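The combinatorial parameter the abstract refers to can be illustrated by brute force over a finite domain. A minimal sketch, assuming an explicitly enumerated hypothesis class; the function names and the threshold-classifier example are illustrative, not taken from the paper:

```python
from itertools import combinations

def shatters(hypotheses, points):
    """True if the hypothesis class realizes every labeling of `points`."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    """Largest set size that some subset of `domain` is shattered at."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, k)):
            d = k
        else:
            break
    return d

# Threshold classifiers on the line: h_t(x) = 1 iff x >= t.
# They shatter any single point but no pair, so the VC dimension is 1.
domain = [1, 2, 3, 4]
thresholds = [lambda x, t=t: int(x >= t) for t in [0, 1.5, 2.5, 3.5, 5]]
print(vc_dimension(thresholds, domain))  # → 1
```

By the paper's characterization, finiteness of this parameter (here, 1) is exactly what makes the class distribution-free learnable, with sample complexity growing with the dimension.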
Citations
Journal ArticleDOI
Privacy preserving classification on local differential privacy in data centers
Weibei Fan, Jing He, Mengjiao Guo, Peng Li, Zhijie Han, Ruchuan Wang +9 more
TL;DR: Experiments demonstrated that the differential-privacy-based classification algorithm proposed in this paper achieves higher iteration efficiency, better security, and feasible accuracy, and that, while ensuring availability, it offers reliable privacy protection and excellent timeliness.
Journal ArticleDOI
Approximation and Learning of Convex Superpositions
Leonid Gurvits, Pascal Koiran +1 more
TL;DR: An integral representation in the form of a “continuous neural network,” generalizing Barron's, is given, and it is shown that the existence of such an integral representation is equivalent to both L2 and L∞ approximability.
Journal Article
Active learning via perfect selective classification
Ran El-Yaniv, Yair Wiener +1 more
TL;DR: A reduction of active learning to selective classification that preserves fast rates is shown, and an exponential target-independent label-complexity speedup is derived for actively learning general (non-homogeneous) linear classifiers when the data distribution is an arbitrary high-dimensional mixture of Gaussians.
Journal ArticleDOI
A geometric approach to sample compression
TL;DR: The sample compression conjecture of Littlestone & Warmuth has remained unsolved for a quarter century; two promising ways forward are embedding maximal classes into maximum classes with at most a polynomial increase in VC dimension, and compression via operating on geometric representations.
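The conjecture summarized above asks whether every class of VC dimension d admits a compression scheme whose size depends only on d. A hedged sketch for the simplest case, threshold classifiers on the line (VC dimension 1), where a single point suffices; the helper names are illustrative, and the sample is assumed consistent with some threshold:

```python
def compress(labeled_sample):
    """Compress a threshold-consistent labeled sample to at most one point.

    For threshold classifiers h_t(x) = 1 iff x >= t, the smallest
    positively labeled point pins down a consistent hypothesis.
    """
    positives = [x for x, y in labeled_sample if y == 1]
    return [min(positives)] if positives else []

def reconstruct(compressed):
    """Rebuild a classifier from the compression set."""
    if not compressed:
        return lambda x: 0          # all-negative sample: label everything 0
    t = compressed[0]
    return lambda x: int(x >= t)

sample = [(1, 0), (2, 0), (3, 1), (4, 1)]
h = reconstruct(compress(sample))
print(all(h(x) == y for x, y in sample))  # → True
```

Here the compression size (one point) matches the VC dimension (one); the open question is whether a size-O(d) scheme exists for every class of VC dimension d.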
Journal ArticleDOI
Learning in the framework of fuzzy lattices
TL;DR: Applications are shown to one medical data set and two benchmark data sets, where the σ-FLL scheme's capacity for efficiently treating real numbers as well as lattice-ordered symbols, separately or jointly, is demonstrated.
References
Book
Computers and Intractability: A Guide to the Theory of NP-Completeness
TL;DR: This edition of a quarterly column provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and the author in their book “Computers and Intractability: A Guide to the Theory of NP-Completeness,” W. H. Freeman & Co., San Francisco, 1979.
Book
The Art of Computer Programming
Book
Pattern classification and scene analysis
Richard O. Duda, Peter E. Hart +1 more
TL;DR: In this book, a unified, comprehensive, and up-to-date treatment of both statistical and descriptive methods for pattern recognition is provided, including Bayesian decision theory, supervised and unsupervised learning, nonparametric techniques, discriminant analysis, clustering, preprocessing of pictorial data, spatial filtering, shape description techniques, perspective transformations, projective invariants, linguistic procedures, and artificial intelligence techniques for scene analysis.