Open Access Journal Article (DOI)

Using the Perceptron Algorithm to Find Consistent Hypotheses

TLDR
A simple proof is given that the perceptron learning algorithm for finding a linearly separable Boolean function consistent with a sample of such a function is not efficient.
Abstract
The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable Boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient in general.
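To make the setting concrete, here is a minimal sketch of the algorithm the abstract describes: perceptron updates run over a labelled sample until the linear threshold hypothesis is consistent with every example. The sample, names, and loop structure are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (not the authors' code): perceptron updates on a
# labelled sample of a Boolean function, repeated until the linear
# threshold hypothesis is consistent with every example.

def perceptron_consistent(sample):
    """sample: list of (x, y), x a tuple of 0/1 inputs, y in {0, 1}."""
    n = len(sample[0][0])
    w = [0.0] * n              # weights
    b = 0.0                    # bias (negated threshold)
    while True:
        updated = False
        for x, y in sample:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            prediction = 1 if activation > 0 else 0
            if prediction != y:
                # Standard perceptron update: move the hyperplane
                # toward positive examples, away from negative ones.
                step = 1 if y == 1 else -1
                w = [wi + step * xi for wi, xi in zip(w, x)]
                b += step
                updated = True
        if not updated:
            return w, b        # consistent with the whole sample

# Example: a sample of the (linearly separable) AND function.
and_sample = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(perceptron_consistent(and_sample))  # -> ([2.0, 1.0], -2.0)
```

The perceptron convergence theorem guarantees that this loop halts whenever the sample is linearly separable; the paper's point is that the number of updates before it halts can be so large that the resulting consistent-hypothesis finder is not efficient in general.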


Citations
Journal Article (DOI)

On specifying Boolean functions by labelled examples

TL;DR: A simple proof is given of the fact that, for any linearly separable Boolean function, there is exactly one set of examples of minimal cardinality which specifies the function.
Journal Article (DOI)

Classification by polynomial surfaces

TL;DR: It is shown that, for even n, at most half of all Boolean functions are realizable by a separating surface of degree ⌊n/2⌋, and the Vapnik-Chervonenkis dimension of the class of functions realized by polynomial separating surfaces of at most a given degree is computed.
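A concrete illustration (our example, not one taken from the paper): two-variable parity (XOR) is realized by no separating hyperplane, but it is realized by a polynomial separating surface of degree 2,

$$p(x_1, x_2) = x_1 + x_2 - 2x_1x_2 - \tfrac{1}{2},$$

since p is positive exactly on the inputs (0,1) and (1,0).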

Boolean Functions and Artificial Neural Networks

TL;DR: This work investigates which Boolean functions a given type of network can compute, and how extensive or expressive the class of functions so computable is.
Journal Article (DOI)

Algebraic Techniques for Constructing Minimal Weight Threshold Functions

TL;DR: It is proved that the class of linear threshold functions with polynomial-size weights can be divided into subclasses according to the degree of the polynomial coefficients, and that there exists a minimal-weight linear threshold function for any number of inputs and any weight size.

Learning multivalued multithreshold functions

TL;DR: A simple description is given of an algorithm based on a procedure suggested by Takiyama, and some open questions are raised about the effectiveness of this algorithm and about the complexity of finding consistent hypotheses for samples of multithreshold functions.
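For context, one common way to define such functions (an assumed definition here, not taken from the paper) is that a multivalued multithreshold function outputs the index of the interval, determined by an increasing sequence of thresholds, into which a weighted sum of the inputs falls. A minimal sketch:

```python
# Minimal sketch under an assumed definition (not taken from the paper):
# a multivalued multithreshold function returns the index of the interval,
# cut out by increasing thresholds, containing the weighted input sum.
from bisect import bisect_right

def multithreshold(w, thresholds, x):
    """w: weights; thresholds: increasing list t1 < ... < tk; x: inputs.
    Returns a value in {0, 1, ..., k}."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return bisect_right(thresholds, s)   # number of thresholds <= s

# Example: two thresholds give a three-valued function of two inputs.
print(multithreshold([1.0, 1.0], [0.5, 1.5], (1, 1)))  # -> 2
```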
References
Journal Article (DOI)

Learnability and the Vapnik-Chervonenkis dimension

TL;DR: This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
Journal Article (DOI)

Learning Quickly When Irrelevant Attributes Abound: A New Linear-Threshold Algorithm

TL;DR: This work presents a new linear-threshold algorithm that learns disjunctive Boolean functions, along with variants for learning other classes of Boolean functions.
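The algorithm in question is Littlestone's Winnow. Below is a minimal sketch of a Winnow-style multiplicative update learning a monotone disjunction; the promotion/demotion factor of 2, the threshold of n, the fixed number of passes, and the toy sample are all illustrative assumptions.

```python
# Minimal sketch of a Winnow-style multiplicative update (assumed
# parameters: factor 2, threshold n) learning a monotone disjunction.

def winnow(examples, n, passes=20):
    """examples: list of (x, y), x a tuple of n bits, y in {0, 1}."""
    w = [1.0] * n                      # uniform initial weights
    theta = float(n)                   # fixed threshold
    for _ in range(passes):
        for x, y in examples:
            total = sum(wi * xi for wi, xi in zip(w, x))
            prediction = 1 if total >= theta else 0
            if prediction == 0 and y == 1:
                # Promotion: double the weights of active attributes.
                w = [wi * 2 if xi else wi for wi, xi in zip(w, x)]
            elif prediction == 1 and y == 0:
                # Demotion: halve the weights of active attributes.
                w = [wi / 2 if xi else wi for wi, xi in zip(w, x)]
    return w

# Target: x1 OR x3 over n = 4 attributes (x2 and x4 are irrelevant).
examples = [((1, 0, 0, 0), 1), ((0, 1, 0, 1), 0),
            ((0, 0, 1, 0), 1), ((0, 1, 0, 0), 0)]
print(winnow(examples, 4))  # weights on x1 and x3 grow; others stay small
```

Winnow's mistake bound grows only logarithmically with the number of irrelevant attributes, which is the sense in which it learns quickly when irrelevant attributes abound.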
Book

Computational learning theory: an introduction

TL;DR: This volume is relatively self-contained, as the necessary background material from logic, probability, and complexity theory is included, and it will form an introduction to the theory of computational learning suitable for a broad spectrum of graduate students from theoretical computer science and mathematics.
Journal Article (DOI)

Linear function neurons: Structure and training

TL;DR: The similarities between the simplest mathematical representation of linear function neurons (perceptron training), a formal model of animal learning, and one mechanism of neural learning (Aplysia gill withdrawal) are pointed out.