Journal ArticleDOI

Learning and generalization characteristics of the random vector functional-link net

Yoh-Han Pao, +2 more
- 01 Apr 1994
- Vol. 6, Iss. 2, pp. 163-180
TL;DR: The learning and generalization characteristics of the random vector version of the functional-link net are explored and compared with those attainable with the GDR algorithm; it appears that ‘overtraining’ occurs for stochastic mappings.
About
This article was published in Neurocomputing on 1994-04-01. It has received 876 citations to date. The article focuses on the topics: Multivariate random variable & Generalization.
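
For readers unfamiliar with the architecture, the sketch below illustrates the idea the paper studies: input-to-enhancement weights are drawn at random and held fixed, and only the output weights are learned by a linear least-squares solve. All names, dimensions, and the ridge term are illustrative assumptions, not code from the paper.

import numpy as np

def rvfl_fit(X, Y, n_enhancement=100, ridge=1e-6, seed=0):
    # Random vector functional-link net (illustrative sketch).
    # Enhancement weights are random and never trained; only the
    # output weights beta are solved for, via ridge least squares.
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, (X.shape[1], n_enhancement))
    b = rng.uniform(-1.0, 1.0, n_enhancement)
    H = np.tanh(X @ W + b)          # random enhancement features
    D = np.hstack([X, H])           # direct input links plus enhancements
    beta = np.linalg.solve(D.T @ D + ridge * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    return np.hstack([X, np.tanh(X @ W + b)]) @ beta

Because only beta is fitted, training reduces to a single linear solve, which is the source of the speed advantage over iterative gradient-based training that the paper examines.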


Citations
Journal ArticleDOI

Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture

TL;DR: Compared with existing deep neural networks, experimental results on the MNIST database and the NYU NORB object recognition benchmark demonstrate the effectiveness and efficiency of the proposed Broad Learning System.
Journal ArticleDOI

An Insight into Extreme Learning Machines: Random Neurons, Random Features and Kernels

TL;DR: An insight into ELMs is provided in three aspects, viz. random neurons, random features, and kernels; it is shown that, in theory, ELMs (with the same kernels) tend to outperform support vector machines and their variants in both regression and classification applications, with much easier implementation.
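
As a hedged illustration of the kernel variant the abstract mentions, the sketch below implements an ELM where an RBF kernel matrix replaces the random feature map and the output weights come from one regularized linear solve; the kernel choice, parameter names, and the constant C are assumptions for illustration.

import numpy as np

def kernel_elm_fit(X, Y, gamma=1.0, C=1e3):
    # Kernel ELM sketch: no explicit hidden layer; the RBF kernel
    # matrix stands in for the random feature map (assumed kernel).
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    alpha = np.linalg.solve(K + np.eye(len(X)) / C, Y)
    return gamma, alpha

def kernel_elm_predict(Xnew, Xtrain, gamma, alpha):
    sq_n = np.sum(Xnew ** 2, axis=1)[:, None]
    sq_t = np.sum(Xtrain ** 2, axis=1)[None, :]
    K = np.exp(-gamma * (sq_n + sq_t - 2 * Xnew @ Xtrain.T))
    return K @ alpha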
Journal ArticleDOI

Stochastic choice of basis functions in adaptive function approximation and the functional-link net

TL;DR: A theoretical justification for the random vector version of the functional-link (RVFL) net is presented, based on a general approach to adaptive function approximation; the main result is that the RVFL net is a universal approximator for continuous functions on bounded finite-dimensional sets.
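
In symbols, the RVFL approximant analyzed there has the form below (notation assumed here, not copied from the paper): the hidden parameters are random and fixed, and only the output coefficients are estimated.

f_N(x) \;=\; \sum_{k=1}^{N} \beta_k \, g\!\left(w_k^{\top} x + b_k\right),
\qquad w_k,\ b_k \ \text{drawn at random and held fixed},

The universal approximation result says that, for suitable distributions of (w_k, b_k), such sums can approximate any continuous function on a bounded finite-dimensional set arbitrarily well as N grows.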
Journal ArticleDOI

Fuzziness based semi-supervised learning approach for intrusion detection system

TL;DR: A novel fuzziness-based semi-supervised learning approach is proposed that utilizes unlabeled samples, assisted by a supervised learning algorithm, to improve classifier performance for intrusion detection systems (IDSs).
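
A minimal sketch of the kind of fuzziness measure such an approach can rank unlabeled samples by is given below; the entropy-style formula and the selection strategy are assumptions for illustration, not necessarily the paper's exact definitions.

import numpy as np

def fuzziness(mu):
    # Fuzziness of a membership vector mu in (0, 1)^c: maximal when
    # every membership is 0.5, zero when memberships are crisp 0/1.
    mu = np.clip(mu, 1e-12, 1 - 1e-12)
    return -np.mean(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))

Unlabeled samples whose predicted memberships score at the low and high ends of this measure can then be added, with their predicted labels, to retrain the classifier.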
Journal ArticleDOI

What are Extreme Learning Machines? Filling the Gap Between Frank Rosenblatt’s Dream and John von Neumann’s Puzzle

TL;DR: ELM theories address the open problem that has puzzled the neural networks, machine learning, and neuroscience communities for 60 years: whether hidden nodes/neurons need to be tuned in learning. They prove that, in contrast to common knowledge and conventional neural network learning tenets, hidden nodes/neurons do not need to be iteratively tuned in wide classes of neural networks and learning models.
References
Journal ArticleDOI

Multilayer feedforward networks are universal approximators

TL;DR: It is rigorously established that standard multilayer feedforward networks with as few as one hidden layer, using arbitrary squashing functions, are capable of approximating any Borel measurable function from one finite-dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available.
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
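
Since the main article above compares RVFL training against the GDR, a compact sketch of the rule for one hidden layer and a squared-error loss follows; layer sizes, the sigmoid choice, and the learning rate are illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gdr_train(X, Y, n_hidden=8, lr=0.5, epochs=5000, seed=0):
    # Generalized delta rule: errors are propagated backwards and
    # every weight, including hidden-layer weights, is tuned iteratively.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, Y.shape[1])); b2 = np.zeros(Y.shape[1])
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)                   # forward pass
        O = sigmoid(H @ W2 + b2)
        d_out = (O - Y) * O * (1.0 - O)            # output error term
        d_hid = (d_out @ W2.T) * H * (1.0 - H)     # propagated error term
        W2 -= lr * H.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_hid; b1 -= lr * d_hid.sum(axis=0)
    return W1, b1, W2, b2

The contrast with the RVFL sketch earlier is the point of the comparison: here the hidden weights W1 are tuned on every pass, whereas the RVFL leaves them random and fixed.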
Journal ArticleDOI

Methods of Conjugate Gradients for Solving Linear Systems

TL;DR: An iterative algorithm is given for solving a system Ax = k of n linear equations in n unknowns, and it is shown that this method is a special case of a very general method that also includes Gaussian elimination.
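
A textbook sketch of the method for a symmetric positive-definite A follows; variable names and the stopping tolerance are illustrative.

import numpy as np

def conjugate_gradient(A, k, tol=1e-10):
    # Solves A x = k for symmetric positive-definite A. In exact
    # arithmetic the iteration terminates in at most n steps.
    n = len(k)
    x = np.zeros(n)
    r = k - A @ x                  # residual
    p = r.copy()                   # initial search direction
    rs = r @ r
    for _ in range(n):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact minimization along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # keep directions A-conjugate
        rs = rs_new
    return x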