Author

Norbert Sauer

Bio: Norbert Sauer is an academic researcher from the University of Calgary. The author has contributed to research in topics: Countable set & Metric space. The author has an h-index of 22 and has co-authored 111 publications receiving 2,801 citations. Previous affiliations of Norbert Sauer include Emory University and the University of La Réunion.


Papers
Journal ArticleDOI
TL;DR: This paper answers the question in the affirmative by determining the exact upper bound: if T is a family of subsets of some infinite set S, then either for each number n there exists a set A ⊂ S with |A| = n such that |T ∩ A| = 2^n, or there exist a number N and a constant c such that |T ∩ A| ≤ |A|^c for each A ⊂ S with |A| ⩾ N.
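
This dichotomy is Sauer's lemma. As a finite illustration (the family T below is made up for this sketch and does not come from the paper), the following Python snippet computes the trace of a family on a set, checks whether the set is shattered (trace of size 2^|A|), and compares trace sizes against the polynomial bound Σ_{i ≤ d} C(|A|, i), where d is the largest size of a shattered set.

```python
from itertools import combinations
from math import comb

def trace(family, A):
    """Trace of the family on the set A: { B ∩ A : B in family }."""
    A = frozenset(A)
    return {frozenset(B) & A for B in family}

def is_shattered(family, A):
    """A is shattered iff every subset of A occurs in the trace."""
    return len(trace(family, A)) == 2 ** len(A)

# Hypothetical example family T over S = {0, ..., 4}.
T = [set(), {0}, {1}, {0, 1}, {0, 2}, {0, 1, 2, 3}]
S = range(5)

# d = largest size of a shattered subset (the VC dimension of T); here d = 2.
d = max((len(A) for r in range(len(S) + 1)
         for A in combinations(S, r) if is_shattered(T, A)), default=0)

# On the polynomial side of the dichotomy, every A obeys
# |trace(T, A)| <= sum_{i <= d} C(|A|, i), a bound of degree d in |A|.
for A in ({0, 1, 2}, set(S)):
    bound = sum(comb(len(A), i) for i in range(d + 1))
    print(len(A), len(trace(T, A)), "<=", bound)
```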

1,029 citations

Journal ArticleDOI
TL;DR: A new upper bound is obtained for φ(2, k) for all k ⩾ 3, and a new lower bound is obtained for π(n, 3), for positive integers n and k.

217 citations

Journal ArticleDOI
TL;DR: It is proved that indeed for all 4-chromatic graphs G, χ(ℒ(G)) = 3, which establishes Hedetniemi's [3] conjecture for 4-chromatic graphs.
Abstract: For any graph G and number n ≧ 1, two functions f, g from V(G) into {1, 2, ..., n} are adjacent if for all edges (a, b) of G, f(a) ≠ g(b). The graph of all such functions is the colouring graph ℒ(G) of G. We establish first that χ(G) = n + 1 implies χ(ℒ(G)) = n iff χ(G × H) = n + 1 for all graphs H with χ(H) ≧ n + 1. Then we prove that indeed for all 4-chromatic graphs G, χ(ℒ(G)) = 3, which establishes Hedetniemi's [3] conjecture for 4-chromatic graphs.
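
As a concrete sketch of the colouring-graph construction defined above (the example graph, vertex labels, and number of colours are chosen here for illustration and are not from the paper), the following Python snippet enumerates the functions V(G) → {1, ..., n} and tests the adjacency condition; for an undirected G the condition f(a) ≠ g(b) is imposed in both orientations of each edge.

```python
from itertools import product

def colouring_graph(vertices, edges, n):
    """Vertex set and adjacency test of the colouring graph of G for colours 1..n."""
    functions = [dict(zip(vertices, values))
                 for values in product(range(1, n + 1), repeat=len(vertices))]

    def adjacent(f, g):
        # f ~ g iff f(a) != g(b) for every edge (a, b); for an undirected G the
        # condition is checked in both orientations of each edge.
        return all(f[a] != g[b] and f[b] != g[a] for (a, b) in edges)

    return functions, adjacent

# Hypothetical example: G a triangle, n = 2 colours.
V = ["x", "y", "z"]
E = [("x", "y"), ("y", "z"), ("x", "z")]
funcs, adjacent = colouring_graph(V, E, 2)
pairs = sum(adjacent(f, g) for i, f in enumerate(funcs) for g in funcs[i + 1:])
print(len(funcs), "functions,", pairs, "adjacent pairs in the colouring graph")
```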

134 citations

Journal ArticleDOI
TL;DR: It is proved that if G contains no monochromatic infinite outward path, then there is an independent set S of vertices of G such that, for every vertex x not in S, there is a monochromatic directed path from x to a vertex of S.
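
A small sketch of the conclusion (the 2-coloured digraph and the set S below are made-up illustration data, not from the paper): it checks, by a colour-restricted breadth-first search, that every vertex outside S reaches S along a directed path whose arcs all carry one colour.

```python
from collections import defaultdict, deque

def monochromatic_reach(arcs, start, targets):
    """True iff some directed path from start to targets uses arcs of one colour only."""
    by_colour = defaultdict(list)
    for u, v, colour in arcs:
        by_colour[colour].append((u, v))
    for edges in by_colour.values():
        out = defaultdict(list)
        for u, v in edges:
            out[u].append(v)
        seen, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            if u in targets:
                return True
            for v in out[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
    return False

# Hypothetical 2-coloured digraph; S = {4} is independent and is reached from
# every other vertex by a monochromatic directed path.
arcs = [(1, 2, "red"), (2, 3, "red"), (2, 4, "red"), (3, 4, "blue"), (1, 4, "blue")]
S = {4}
print(all(monochromatic_reach(arcs, x, S) for x in (1, 2, 3)))   # True
```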

131 citations


Cited by
Book
Vladimir Vapnik
01 Jan 1995
TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Abstract: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?

40,147 citations

Journal ArticleDOI
TL;DR: This tutorial gives an overview of the basic ideas underlying Support Vector (SV) machines for function estimation, and includes a summary of currently used algorithms for training SV machines, covering both the quadratic programming part and advanced methods for dealing with large datasets.
Abstract: In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied to the standard SV algorithm, and discuss the aspect of regularization from a SV perspective.
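
The snippet below is not code from the tutorial; it is a minimal usage sketch of the function-estimation setting it describes, assuming scikit-learn's SVR is available. C trades flatness of the estimate against training error, epsilon sets the width of the insensitive tube, and the fit itself solves the quadratic program the tutorial discusses.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 2 * np.pi, 80)).reshape(-1, 1)   # 1-D inputs
y = np.sin(X).ravel() + 0.1 * rng.normal(size=80)             # noisy targets

# RBF-kernel SV regression: C trades flatness against training error,
# epsilon is the half-width of the insensitive tube.
model = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)
print("support vectors:", len(model.support_vectors_))
print("prediction at pi/2:", float(model.predict([[np.pi / 2]])[0]))
```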

10,696 citations

Book
01 Jan 1991
TL;DR: A particular set of problems - all dealing with “good” colorings of an underlying set of points relative to a given family of sets - is explored.
Abstract: The use of randomness is now an accepted tool in Theoretical Computer Science, but not everyone is aware of the underpinnings of this methodology in Combinatorics - particularly, in what is now called the Probabilistic Method as developed primarily by Paul Erdős over the past half century. Here I will explore a particular set of problems - all dealing with “good” colorings of an underlying set of points relative to a given family of sets. A central point will be the evolution of these problems from the purely existential proofs of Erdős to the algorithmic aspects of much interest to this audience.
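
A minimal sketch of the kind of “good colouring” argument referred to above (the point set and family below are hypothetical): for a family of m sets, each of size n, with m < 2^(n-1), a uniformly random red/blue colouring of the points leaves each set monochromatic with probability 2^(1-n), so the expected number of monochromatic sets is below 1 and a good colouring must exist; resampling until none is monochromatic finds one.

```python
import random

def good_colouring(points, family, tries=1000):
    """Random red/blue colouring of the points with no set of the family monochromatic."""
    for _ in range(tries):
        colour = {p: random.choice(("red", "blue")) for p in points}
        if all(len({colour[p] for p in A}) == 2 for A in family):
            return colour
    return None

points = range(7)
family = [{0, 1, 2}, {2, 3, 4}, {4, 5, 6}]   # m = 3 sets of size n = 3, and 3 < 2**(3-1)
print(good_colouring(points, family))
```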

6,594 citations

Book
01 Jan 2015
TL;DR: The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way in an advanced undergraduate or beginning graduate course.
Abstract: Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way. The book provides an extensive theoretical account of the fundamental ideas underlying machine learning and the mathematical derivations that transform these principles into practical algorithms. Following a presentation of the basics of the field, the book covers a wide array of central topics that have not been addressed by previous textbooks. These include a discussion of the computational complexity of learning and the concepts of convexity and stability; important algorithmic paradigms including stochastic gradient descent, neural networks, and structured output learning; and emerging theoretical concepts such as the PAC-Bayes approach and compression-based bounds. Designed for an advanced undergraduate or beginning graduate course, the text makes the fundamentals and algorithms of machine learning accessible to students and non-expert readers in statistics, computer science, mathematics, and engineering.
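
The book itself is not code; as a toy illustration of one paradigm it names, stochastic gradient descent, the sketch below fits a least-squares linear model on made-up data, taking one gradient step per randomly chosen example.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                     # made-up design matrix
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.05 * rng.normal(size=200)      # noisy linear targets

w = np.zeros(3)
eta = 0.05                                        # step size
for _ in range(20):                               # passes over the data
    for i in rng.permutation(len(X)):             # one random example per update
        grad = (X[i] @ w - y[i]) * X[i]           # gradient of 0.5 * (x_i . w - y_i)^2
        w -= eta * grad
print(np.round(w, 3))                             # should end up close to w_true
```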

3,857 citations

Book
01 Jan 2006
TL;DR: In this book, the authors provide a comprehensive treatment of the problem of predicting individual sequences using expert advice, a general framework within which many related problems can be cast and discussed, such as repeated game playing, adaptive data compression, sequential investment in the stock market, sequential pattern analysis, and several other problems.
Abstract: This important text and reference for researchers and students in machine learning, game theory, statistics and information theory offers a comprehensive treatment of the problem of predicting individual sequences. Unlike standard statistical approaches to forecasting, prediction of individual sequences does not impose any probabilistic assumption on the data-generating mechanism. Yet, prediction algorithms can be constructed that work well for all possible sequences, in the sense that their performance is always nearly as good as the best forecasting strategy in a given reference class. The central theme is the model of prediction using expert advice, a general framework within which many related problems can be cast and discussed. Repeated game playing, adaptive data compression, sequential investment in the stock market, sequential pattern analysis, and several other problems are viewed as instances of the experts' framework and analyzed from a common nonstochastic standpoint that often reveals new and intriguing connections.
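
This is not the book's code; the sketch below illustrates its central model, prediction with expert advice, via an exponentially weighted (Hedge-style) forecaster on made-up expert losses. With losses in [0, 1] and the learning rate tuned as below, the forecaster's regret against the best single expert is O(√(T log K)).

```python
import numpy as np

rng = np.random.default_rng(2)
T, K = 500, 5                           # rounds and experts (illustrative sizes)
losses = rng.uniform(size=(T, K))       # made-up per-round losses in [0, 1]
losses[:, 0] *= 0.3                     # expert 0 happens to be the best here

eta = np.sqrt(8 * np.log(K) / T)        # standard learning-rate tuning
log_w = np.zeros(K)                     # log-weights kept for numerical stability
forecaster_loss = 0.0
for t in range(T):
    p = np.exp(log_w - log_w.max())
    p /= p.sum()                        # normalise weights into a distribution
    forecaster_loss += p @ losses[t]    # expected loss of the weighted mixture
    log_w -= eta * losses[t]            # exponential weight update

best_expert_loss = losses.sum(axis=0).min()
print(round(forecaster_loss, 1), "vs best expert", round(best_expert_loss, 1))
```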

3,615 citations