scispace - formally typeset
Topic

Multiple kernel learning

About: Multiple kernel learning is a research topic. Over the lifetime, 1630 publications have been published within this topic receiving 56082 citations.
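The core idea behind multiple kernel learning is to replace a single hand-picked kernel with a combination of candidate kernels. As a minimal illustrative sketch (the kernel choices, weights, and function names here are assumptions for the example, not taken from any of the papers below), a convex combination of base kernel matrices is itself a valid kernel:

```python
import numpy as np

def linear_kernel(X):
    # K_lin[i, j] = <x_i, x_j>
    return X @ X.T

def rbf_kernel(X, gamma=0.5):
    # K_rbf[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def combine_kernels(kernels, weights):
    # Convex combination K = sum_m w_m * K_m with w_m >= 0, sum_m w_m = 1;
    # any such combination of PSD kernels is again PSD
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wm * Km for wm, Km in zip(w, kernels))

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
K = combine_kernels([linear_kernel(X), rbf_kernel(X)], [0.3, 0.7])
assert np.allclose(K, K.T)                      # symmetric
assert np.min(np.linalg.eigvalsh(K)) > -1e-8    # positive semidefinite
```

In a full MKL method the weights are learned jointly with the classifier rather than fixed by hand as above.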


Papers
Proceedings Article
Peng Zhou, Liang Du, Lei Shi, Hanmo Wang, Yi-Dong Shen
25 Jul 2015
TL;DR: This paper proposes a novel method for learning a robust yet low-rank kernel for clustering tasks, observing that the noise in each kernel has a specific structure, so it can make full use of that structure to clean multiple input kernels and then aggregate them into a robust, low-rank consensus kernel.
Abstract: Kernel-based methods, such as kernel k-means and kernel PCA, have been widely used in machine learning tasks. The performance of these methods critically depends on the selection of kernel functions; however, the challenge is that we usually do not know in advance what kind of kernel is suitable for the given data and task. This leads to research on multiple kernel learning, i.e. learning a consensus kernel from multiple candidate kernels. Existing multiple kernel learning methods have difficulty in dealing with noise. In this paper, we propose a novel method for learning a robust yet low-rank kernel for clustering tasks. We observe that the noise in each kernel has a specific structure, so we can make full use of that structure to clean multiple input kernels and then aggregate them into a robust, low-rank consensus kernel. The underlying optimization problem is hard to solve, and we show that it can be solved via alternating minimization, whose convergence is theoretically guaranteed. Experimental results on several benchmark data sets further demonstrate the effectiveness of our method.
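The simplest building block behind a low-rank consensus kernel is to aggregate the candidate kernels and then truncate the spectrum. The sketch below shows that baseline only (averaging plus a top-r eigendecomposition); it is not the authors' alternating-minimization method, and the kernel parameters and helper names are assumptions for illustration:

```python
import numpy as np

def rbf_kernel(X, gamma):
    # K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def lowrank_consensus(kernels, rank):
    # Average the candidate kernels, then keep only the top-`rank`
    # eigen-components to obtain a low-rank consensus kernel
    K = sum(kernels) / len(kernels)
    vals, vecs = np.linalg.eigh(K)            # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:rank]
    vals_top = np.clip(vals[top], 0.0, None)  # drop negative eigenvalues from noise
    return (vecs[:, top] * vals_top) @ vecs[:, top].T

rng = np.random.default_rng(1)
X = rng.normal(size=(12, 4))
# Three noisy candidate kernels at different bandwidths
noisy = [rbf_kernel(X, g) + 0.01 * rng.normal(size=(12, 12)) for g in (0.1, 0.5, 1.0)]
noisy = [(K + K.T) / 2 for K in noisy]        # symmetrize the injected noise
K_low = lowrank_consensus(noisy, rank=3)
assert np.linalg.matrix_rank(K_low, tol=1e-8) <= 3
```

The paper's contribution is to replace the naive average with an optimization that models the per-kernel noise structure explicitly.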

38 citations

Proceedings Article
14 Jun 2011
TL;DR: This work proposes a family of online algorithms able to tackle variants of MKL and group-LASSO, for which regret, convergence, and generalization bounds are shown.
Abstract: Training structured predictors often requires a considerable time selecting features or tweaking the kernel. Multiple kernel learning (MKL) sidesteps this issue by embedding the kernel learning into the training procedure. Despite the recent progress towards efficiency of MKL algorithms, the structured output case remains an open research front. We propose a family of online algorithms able to tackle variants of MKL and group-LASSO, for which we show regret, convergence, and generalization bounds. Experiments on handwriting recognition and dependency parsing attest the success of the approach.
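Online MKL algorithms of this flavor typically maintain a weight per kernel on the probability simplex and update it after each example. As a hedged sketch of that general pattern (an exponentiated-gradient step, not the specific algorithm in this paper; the learning rate and loss values are made up):

```python
import numpy as np

def eg_update(weights, losses, eta=0.5):
    # Exponentiated-gradient step: kernels that incur larger loss on the
    # current example are down-weighted; renormalizing keeps the weights
    # on the probability simplex
    w = weights * np.exp(-eta * np.asarray(losses, dtype=float))
    return w / w.sum()

w = np.ones(3) / 3                       # start uniform over 3 kernels
for losses in ([0.9, 0.1, 0.5], [0.8, 0.2, 0.4]):
    w = eg_update(w, losses)
assert np.isclose(w.sum(), 1.0)
assert np.argmax(w) == 1                 # the lowest-loss kernel gains weight
```

Regret bounds for this family of updates follow from standard online-convex-optimization arguments over the simplex.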

38 citations

Proceedings ArticleDOI
20 Mar 2016
TL;DR: This paper proposes a new framework that significantly reduces the complexity of deep multiple kernels, and designs its equivalent deep map network (DMN), using multi-layer explicit maps that approximate the initial DKN with a high precision.
Abstract: Deep multiple kernel learning is a powerful technique that selects and deeply combines multiple elementary kernels in order to provide the best performance on a given classification task. This technique, while particularly effective, becomes intractable when handling large-scale datasets; indeed, multiple nonlinear kernel combinations are time and memory demanding. In this paper, we propose a new framework that significantly reduces the complexity of deep multiple kernels. Given a deep kernel network (DKN), our method designs its equivalent deep map network (DMN), using multi-layer explicit maps that approximate the initial DKN with high precision. When combined with support vector machines, the design of DMN preserves high classification accuracy compared to its underlying DKN while being (at least) an order of magnitude faster. Experiments conducted on the challenging ImageCLEF2013 annotation benchmark show that the proposed DMN is indeed effective and highly efficient.
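The key trick of replacing an implicit kernel with an explicit feature map can be illustrated with random Fourier features (Rahimi and Recht), where z(x)·z(y) approximates an RBF kernel value. This is a standard single-layer example of kernel-to-map approximation, not the paper's multi-layer DMN construction; the bandwidth and feature count below are assumptions:

```python
import numpy as np

def random_fourier_features(X, gamma, D, seed=0):
    # Explicit map z(x) such that z(x) . z(y) ~= exp(-gamma * ||x - y||^2)
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))  # spectral samples
    b = rng.uniform(0.0, 2 * np.pi, size=D)                # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(2)
X = rng.normal(size=(8, 5))
gamma = 0.1
Z = random_fourier_features(X, gamma, D=5000)

# Exact RBF kernel for comparison
sq = np.sum(X**2, axis=1)
K_exact = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
err = np.max(np.abs(Z @ Z.T - K_exact))
assert err < 0.15   # approximation error shrinks as D grows, roughly O(1/sqrt(D))
```

Once the map is explicit, a linear SVM on z(x) replaces the kernel SVM, which is what yields the order-of-magnitude speedup at large scale.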

38 citations

Journal Article
TL;DR: Based on how multiple kernels are composed, the construction theories of multiple kernel methods are systematically reviewed, the learning methods for multiple kernels are analyzed along with their characteristics and disadvantages, and the respective applications are summarized from three aspects.

38 citations

Proceedings ArticleDOI
01 Sep 2012
TL;DR: This work considers the problem of retrieving weather information from a database of still images, and proposes to use multiple kernel learning to gather and select an optimal subset of image features from a certain feature pool.
Abstract: Low-cost monitoring cameras/webcams provide unique visual information. To take advantage of the vast image dataset captured by a typical webcam, we consider the problem of retrieving weather information from a database of still images. The task is to automatically label all images with different weather conditions (e.g., sunny, cloudy, and overcast), using limited human assistance. To address the drawbacks in existing weather prediction algorithms, we first apply image segmentation to the raw images to avoid disturbance of the non-sky region. Then, we propose to use multiple kernel learning to gather and select an optimal subset of image features from a certain feature pool. To further increase the recognition performance, we adopt multi-pass active learning for selecting the training set. The experimental results show that our weather recognition system achieves high performance.

38 citations


Network Information
Related Topics (5)
Convolutional neural network: 74.7K papers, 2M citations (89% related)
Deep learning: 79.8K papers, 2.1M citations (89% related)
Feature extraction: 111.8K papers, 2.1M citations (87% related)
Feature (computer vision): 128.2K papers, 1.7M citations (87% related)
Image segmentation: 79.6K papers, 1.8M citations (86% related)
Performance
Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    21
2022    44
2021    72
2020    101
2019    113
2018    114