# VC-Dimension of Hyperplanes over Finite Fields

19 Jul 2023

TL;DR: In this article, the authors generalize Sun's result to arbitrary dimension and improve the exponent in the case $d=3$, in which the VC-dimension of $\mathcal{H}_t(E)$ is 3.

Abstract: Let $\mathbb{F}_q^d$ be the $d$-dimensional vector space over the finite field with $q$ elements. For a subset $E\subseteq \mathbb{F}_q^d$ and a fixed nonzero $t\in \mathbb{F}_q$, let $\mathcal{H}_t(E)=\{h_y: y\in E\}$, where $h_y$ is the indicator function of the set $\{x\in E: x\cdot y=t\}$. Two of the authors, with Maxwell Sun, showed in the case $d=3$ that if $|E|\geq Cq^{\frac{11}{4}}$ and $q$ is sufficiently large, then the VC-dimension of $\mathcal{H}_t(E)$ is 3. In this paper, we generalize the result to arbitrary dimension and improve the exponent in the case $d=3$.
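To make the definitions concrete, here is a minimal brute-force sketch (not from the paper; the parameter choices and function names are illustrative) that computes the VC-dimension of $\mathcal{H}_t(E)$ for small parameters, taking $E=\mathbb{F}_q^d$ with $q=5$, $d=2$, $t=1$.

```python
# Illustrative brute force (not the paper's method): compute the VC-dimension
# of H_t(E) = {h_y : y in E}, where h_y(x) = 1 iff x . y = t, for small q, d.
from itertools import combinations, product

q, d, t = 5, 2, 1                      # small parameters chosen for illustration

E = list(product(range(q), repeat=d))  # here E is taken to be all of F_q^d

def h(y, x):
    """Indicator of the hyperplane {x in E : x . y = t} (dot product mod q)."""
    return int(sum(a * b for a, b in zip(x, y)) % q == t)

def shattered(S):
    """S is shattered if every 0/1 pattern on S is realized by some h_y."""
    patterns = {tuple(h(y, x) for x in S) for y in E}
    return len(patterns) == 2 ** len(S)

def vc_dimension():
    """Largest k such that some k-element subset of E is shattered."""
    dim, k = 0, 1
    while any(shattered(S) for S in combinations(E, k)):
        dim, k = k, k + 1
    return dim

print(vc_dimension())  # for d = 2 this prints 2: lines shatter pairs, never triples
```

For $d=2$ the class consists of affine lines avoiding the origin, and no three points can be shattered (the all-ones pattern forces collinearity, which then rules out the pattern that omits only the middle point), so the brute force returns 2; the paper's question is the analogous statement in higher dimensions for subsets $E$ that are much smaller than the whole space.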

##### References



TL;DR: This chapter reproduces the English translation by B. Seckler of the paper by Vapnik and Chervonenkis in which they gave proofs for the innovative results they had obtained in a draft form in July 1966 and announced in 1968 in their note in Soviet Mathematics Doklady.

Abstract: This chapter reproduces the English translation by B. Seckler of the paper by Vapnik and Chervonenkis in which they gave proofs for the innovative results they had obtained in a draft form in July 1966 and announced in 1968 in their note in Soviet Mathematics Doklady. The paper was first published in Russian as: Vapnik V. N. and Chervonenkis A. Ya., "On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities", Teoriya Veroyatnostei i ee Primeneniya 16(2), 264–279 (1971).

3,939 citations
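The uniform convergence in the theorem above can be illustrated with a small simulation (illustrative only; the sample space, event class, and sample sizes are chosen here, not taken from the paper): for a finite class of events, the largest gap between empirical frequency and true probability shrinks as the sample grows.

```python
# Illustrative simulation of uniform convergence of frequencies to
# probabilities over a finite class of events (in the spirit of the
# Vapnik-Chervonenkis theorem; the setup here is a made-up toy example).
import random

random.seed(0)

# Events A_k = {X < k} on the uniform sample space {0, ..., 9}.
events = [lambda x, k=k: x < k for k in range(1, 10)]
probs  = [k / 10 for k in range(1, 10)]   # true probabilities P(A_k)

def sup_deviation(n):
    """Largest |empirical frequency - probability| over all events, n samples."""
    sample = [random.randrange(10) for _ in range(n)]
    freqs = [sum(e(x) for x in sample) / n for e in events]
    return max(abs(f - p) for f, p in zip(freqs, probs))

for n in (100, 10_000, 100_000):
    print(n, sup_deviation(n))
```

The printed supremum deviation decreases toward 0 as $n$ grows; the Vapnik–Chervonenkis result characterizes when this uniform convergence holds for infinite classes of events as well.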


01 Jan 2015

TL;DR: The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way, for an advanced undergraduate or beginning graduate course.

Abstract: Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way. The book provides an extensive theoretical account of the fundamental ideas underlying machine learning and the mathematical derivations that transform these principles into practical algorithms. Following a presentation of the basics of the field, the book covers a wide array of central topics that have not been addressed by previous textbooks. These include a discussion of the computational complexity of learning and the concepts of convexity and stability; important algorithmic paradigms including stochastic gradient descent, neural networks, and structured output learning; and emerging theoretical concepts such as the PAC-Bayes approach and compression-based bounds. Designed for an advanced undergraduate or beginning graduate course, the text makes the fundamentals and algorithms of machine learning accessible to students and non-expert readers in statistics, computer science, mathematics, and engineering.

3,857 citations


01 Jan 2005

TL;DR: This chapter discusses research problems from 100 Research Problems in Discrete Geometry, also available in a facsimile edition in the World Classics in Mathematics Series, vol. 28.

Abstract: Also in facsimile edition: World Classics in Mathematics Series, vol. 28, China Science Press, Beijing, 2006. Mimeographed from 100 Research Problems in Discrete Geometry (with W. O. J. Moser). URL: http://www.math.nyu.edu/~pach/research_problems.html

866 citations


TL;DR: This paper studies the sets of distances determined by $n$ points in the plane. Published in The American Mathematical Monthly, Vol. 53, No. 5, pp. 248–250.

Abstract: Erdős, P. (1946). On Sets of Distances of n Points. The American Mathematical Monthly, Vol. 53, No. 5, pp. 248–250.

592 citations


TL;DR: In this paper, it was shown that a set of $N$ points in $\mathbb{R}^2$ has at least $cN/\log N$ distinct distances, thus obtaining the sharp exponent in a problem of Erdős.

Abstract: In this paper, we prove that a set of $N$ points in $\mathbb{R}^2$ has at least $cN/\log N$ distinct distances, thus obtaining the sharp exponent in a problem of Erdős. We follow the setup of Elekes and Sharir which, in the spirit of the Erlangen program, allows us to study the problem in the group of rigid motions of the plane. This converts the problem to one of point-line incidences in space. We introduce two new ideas in our proof. In order to control points where many lines are incident, we create a cell decomposition using the polynomial ham sandwich theorem. This creates a dichotomy: either most of the points are in the interiors of the cells, in which case we immediately get sharp results or, alternatively, the points lie on the walls of the cells, in which case they are in the zero-set of a polynomial of surprisingly low degree, and we may apply the algebraic method. In order to control points incident to only two lines, we use the flecnode polynomial of the Rev. George Salmon to conclude that most of the lines lie on a ruled surface. Then we use the geometry of ruled surfaces to complete the proof.

439 citations
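As a toy illustration of the quantity in the theorem above (not the paper's method; the grid sizes are chosen here for demonstration), one can count distinct distances of a small point set directly. The integer grid is the configuration Erdős used to show that $N$ points can determine as few as roughly $N/\sqrt{\log N}$ distinct distances, so the count below grows noticeably slower than $N$.

```python
# Illustrative brute force: count the distinct distances determined by an
# m x m integer grid (Erdős's near-extremal configuration for few distances).
from itertools import combinations, product

def distinct_distances(points):
    """Number of distinct distances among the points (compared as squares)."""
    return len({(a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
                for a, b in combinations(points, 2)})

for m in (2, 4, 8):
    grid = list(product(range(m), repeat=2))
    print(f"N = {len(grid):2d} points, {distinct_distances(grid)} distinct distances")
    # e.g. the 4x4 grid (N = 16) determines only 9 distinct distances
```

Comparing squared distances avoids floating-point square roots entirely, which keeps the distinctness test exact for integer coordinates.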