Open Access · Posted Content

Generalizing from a Few Examples: A Survey on Few-Shot Learning

TLDR
A thorough survey to fully understand Few-Shot Learning (FSL), which categorizes FSL methods from three perspectives: data, which uses prior knowledge to augment the supervised experience; model, which uses prior knowledge to reduce the size of the hypothesis space; and algorithm, which uses prior knowledge to alter the search for the best hypothesis in the given hypothesis space.
Abstract
Machine learning has been highly successful in data-intensive applications but is often hampered when the data set is small. Recently, Few-Shot Learning (FSL) has been proposed to tackle this problem. Using prior knowledge, FSL can rapidly generalize to new tasks containing only a few samples with supervised information. In this paper, we conduct a thorough survey to fully understand FSL. Starting from a formal definition of FSL, we distinguish FSL from several relevant machine learning problems. We then point out that the core issue in FSL is that the empirical risk minimized is unreliable. Based on how prior knowledge can be used to handle this core issue, we categorize FSL methods from three perspectives: (i) data, which uses prior knowledge to augment the supervised experience; (ii) model, which uses prior knowledge to reduce the size of the hypothesis space; and (iii) algorithm, which uses prior knowledge to alter the search for the best hypothesis in the given hypothesis space. With this taxonomy, we review and discuss the pros and cons of each category. Promising directions, in the aspects of FSL problem setups, techniques, applications, and theories, are also proposed to provide insights for future research.
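To make the "unreliable empirical risk" point concrete, the sketch below writes out the standard expected-vs-empirical risk decomposition that this argument rests on; the symbols (h*, h*_H, h_I, and the sample count I) follow common textbook notation rather than being quoted from the paper.

% Expected risk under the true data distribution p(x, y)
R(h) = \mathbb{E}_{(x,y)\sim p}\big[\ell(h(x), y)\big]

% Empirical risk over I supervised samples; this is what is actually minimized
R_I(h) = \frac{1}{I}\sum_{i=1}^{I} \ell(h(x_i), y_i)

% Total error of the empirical minimizer h_I in hypothesis space H splits into two terms
\mathbb{E}\big[R(h_I) - R(h^{*})\big]
  = \underbrace{\mathbb{E}\big[R(h^{*}_{\mathcal{H}}) - R(h^{*})\big]}_{\text{approximation error}}
  + \underbrace{\mathbb{E}\big[R(h_I) - R(h^{*}_{\mathcal{H}})\big]}_{\text{estimation error}}

When I is very small, the estimation error can dominate, which is why the three categories attack it from different sides: data methods effectively enlarge I, model methods shrink the hypothesis space H, and algorithm methods steer the search toward a better hypothesis.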


Citations
Journal Article

Network Intrusion Detection System: A systematic study of Machine Learning and Deep Learning approaches

TL;DR: The concept of IDS is clarified and a taxonomy based on the notable ML and DL techniques adopted in designing network-based IDS (NIDS) is provided, highlighting various research challenges and outlining the future scope for research on improving ML- and DL-based NIDS.
Posted Content

COVID-19 Image Data Collection: Prospective Predictions Are the Future

TL;DR: This dataset currently contains hundreds of frontal-view X-rays and is the largest public resource for COVID-19 image and prognostic data, making it a necessary resource to develop and evaluate tools to aid in the treatment of COVID-19.
Journal Article

Informed Machine Learning -- A Taxonomy and Survey of Integrating Knowledge into Learning Systems

TL;DR: A definition and a proposed concept for informed machine learning are provided, illustrating its building blocks and distinguishing it from conventional machine learning, and a taxonomy is introduced that serves as a classification framework for informed machine learning approaches.
Journal Article

Empowering Things With Intelligence: A Survey of the Progress, Challenges, and Opportunities in Artificial Intelligence of Things

TL;DR: In this article, the authors present a comprehensive survey on AIoT to show how AI can empower the IoT to make it faster, smarter, greener, and safer, and highlight the challenges facing AIoT and some potential research opportunities.
Journal Article

The myth of generalisability in clinical research and machine learning in health care.

TL;DR: This paper argues that an emphasis on overly broad notions of generalisability as it pertains to applications of machine learning in health care can overlook situations in which machine learning might provide clinical utility.
References
Proceedings Article

Deep Residual Learning for Image Recognition

TL;DR: In this article, the authors proposed a residual learning framework to ease the training of networks that are substantially deeper than those used previously, which won the 1st place on the ILSVRC 2015 classification task.
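Since the TL;DR above only names the residual framework, here is a minimal PyTorch sketch of the idea it describes (learn a residual F(x), then add the identity shortcut back); this same-channel BasicBlock layout is illustrative and not the authors' exact configuration.

import torch.nn as nn

class BasicBlock(nn.Module):
    """Illustrative residual block: output = ReLU(F(x) + x) with an identity shortcut."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                               # shortcut connection
        out = self.relu(self.bn1(self.conv1(x)))   # first 3x3 convolution
        out = self.bn2(self.conv2(out))            # second 3x3 convolution learns the residual F(x)
        return self.relu(out + identity)           # add the input back, then activate

Stacking such blocks is what allows much deeper networks to be trained: each block only has to learn a small correction to its input rather than a full unreferenced mapping.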
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A deep convolutional neural network consisting of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax achieved state-of-the-art classification performance, as discussed by the authors.
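The TL;DR above describes the architecture in words; the short PyTorch sketch below lays it out layer by layer. The channel counts and the 256 * 6 * 6 flattened size are the commonly cited values for 224x224 inputs and should be read as illustrative rather than quoted from the paper; dropout and the original two-GPU grouping are omitted.

import torch.nn as nn

# Five convolutional layers, some followed by max-pooling, then three fully-connected
# layers ending in 1000 output logits (softmax is applied inside the cross-entropy loss).
alexnet_like = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2), nn.ReLU(inplace=True),
    nn.MaxPool2d(kernel_size=3, stride=2),
    nn.Conv2d(64, 192, kernel_size=5, padding=2), nn.ReLU(inplace=True),
    nn.MaxPool2d(kernel_size=3, stride=2),
    nn.Conv2d(192, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    nn.Conv2d(256, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    nn.MaxPool2d(kernel_size=3, stride=2),
    nn.Flatten(),
    nn.Linear(256 * 6 * 6, 4096), nn.ReLU(inplace=True),
    nn.Linear(4096, 4096), nn.ReLU(inplace=True),
    nn.Linear(4096, 1000),  # 1000-way class logits
)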
Journal Article

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
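For reference, the update equations of an LSTM cell are sketched below in the modern formulation with a forget gate; the original 1997 paper predates the forget gate, so this is an illustrative summary of the mechanism rather than an exact restatement. The cell state c_t is the "constant error carousel": its near-linear update lets error signals flow across long time lags.

f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)            % forget gate
i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)            % input gate
o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)            % output gate
\tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)     % candidate cell update
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t      % constant error carousel
h_t = o_t \odot \tanh(c_t)                           % hidden state / output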
Proceedings Article

ImageNet: A large-scale hierarchical image database

TL;DR: A new database called “ImageNet” is introduced, a large-scale ontology of images built upon the backbone of the WordNet structure, much larger in scale and diversity and much more accurate than the current image datasets.
Journal Article

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
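The TL;DR above states the lasso verbally; in symbols, with responses y_i, predictors x_{ij}, and coefficients \beta_j (standard notation, not quoted from the paper), the estimator is

\hat{\beta}^{\text{lasso}} = \arg\min_{\beta}\; \sum_{i=1}^{n}\Big(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^{2}
\quad \text{subject to} \quad \sum_{j=1}^{p} \lvert\beta_j\rvert \le t

or, equivalently, the penalized (Lagrangian) form most solvers use,

\hat{\beta}^{\text{lasso}} = \arg\min_{\beta}\; \frac{1}{2}\sum_{i=1}^{n}\Big(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^{2} + \lambda \sum_{j=1}^{p} \lvert\beta_j\rvert ,

where tightening the constraint t (or raising \lambda) drives some coefficients exactly to zero, giving simultaneous shrinkage and variable selection.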