Topic

Multiple kernel learning

About: Multiple kernel learning is a research topic. Over its lifetime, 1,630 publications have been published within this topic, receiving 56,082 citations.


Papers
Proceedings ArticleDOI
10 May 2011
TL;DR: In this article, the classification of masses using level set segmentation and multiple kernel learning is investigated; the experimental results show that level-set-based segmentation can improve the characterization of masses compared with manual rough segmentation.
Abstract: Classification of mammographic masses as malignant or benign may help radiologists reduce the biopsy rate without increasing false negatives. In this paper, we investigate the classification of masses with level set segmentation and multiple kernel learning. Starting from an initial contour provided by the radiologist, level set segmentation deforms the contour to produce the final segmentation. Morphological features are extracted from the boundary of the segmented regions. Linear discriminant analysis, support vector machines, and multiple kernel learning are investigated for the final classification. Mammography images from the DDSM database were used for the experiments. The method based on level set segmentation and morphological features achieved an accuracy of 76%. The experimental results show that level-set-based segmentation can improve the characterization of masses compared with manual rough segmentation.
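To make the classification stage concrete, the sketch below (not the authors' code) assumes the morphological features have already been extracted into a feature matrix X with benign/malignant labels y, and uses a fixed, equal-weight sum of an RBF and a linear kernel as a stand-in for a learned MKL combination, alongside a linear discriminant analysis baseline.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Placeholder data standing in for the extracted morphological features.
X, y = np.random.randn(100, 8), np.random.randint(0, 2, 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Linear discriminant analysis baseline.
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print("LDA accuracy:", lda.score(X_te, y_te))

# SVM on a combined (precomputed) kernel: K = 0.5*K_rbf + 0.5*K_lin.
# In MKL proper, the 0.5/0.5 weights would be learned jointly with the SVM.
K_tr = 0.5 * rbf_kernel(X_tr, X_tr, gamma=0.1) + 0.5 * linear_kernel(X_tr, X_tr)
K_te = 0.5 * rbf_kernel(X_te, X_tr, gamma=0.1) + 0.5 * linear_kernel(X_te, X_tr)
svm = SVC(kernel="precomputed", C=1.0).fit(K_tr, y_tr)
print("Combined-kernel SVM accuracy:", svm.score(K_te, y_te))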

3 citations

Proceedings Article
01 Jan 2018
TL;DR: An empirical comparison against hard baselines and state-of-the-art MKL methods on several real-world datasets is presented, showing the merits of the proposed algorithm, especially with respect to robustness to overfitting.
Abstract: The Multiple Kernel Learning (MKL) paradigm aims at learning the representation from data, reducing the effort devoted to the choice of the kernel's hyperparameters. Typically, the resulting kernel is obtained as the maximal-margin combination of a set of base kernels. When overly expressive base kernels are provided, the solution found by these algorithms can overfit the data. In this paper, we propose a novel MKL algorithm that takes the expressiveness of the obtained representation into account in its objective function, so that a trade-off between large margins and simple hypothesis spaces can be found. Moreover, an empirical comparison against hard baselines and state-of-the-art MKL methods on several real-world datasets is presented, showing the merits of the proposed algorithm, especially with respect to robustness to overfitting.
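As a worked illustration of the base-kernel combination such an objective searches over (a minimal sketch, not the paper's algorithm), the snippet below forms K(eta) = sum_m eta_m * K_m with the weights eta constrained to the probability simplex; the last base kernel is deliberately expressive, the kind of kernel that can cause the overfitting discussed above.

import numpy as np
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel

def combined_kernel(X, Z, eta):
    """Weighted sum of base kernels evaluated between X and Z."""
    bases = [
        linear_kernel(X, Z),
        polynomial_kernel(X, Z, degree=3),
        rbf_kernel(X, Z, gamma=5.0),   # an expressive base kernel
    ]
    return sum(w * K for w, K in zip(eta, bases))

def project_to_simplex(v):
    """Euclidean projection of v onto {eta : eta >= 0, sum(eta) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / (np.arange(len(v)) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

X = np.random.randn(20, 4)
eta = project_to_simplex(np.array([0.2, 0.5, 0.9]))  # e.g. after a gradient step
K = combined_kernel(X, X, eta)
print(eta, K.shape)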

3 citations

Proceedings ArticleDOI
TL;DR: In this article, the authors apply multiple kernel learning (MKL), a technique widely used for feature fusion in computer vision, to multi-subject learning in brain-computer interfacing; they compare the MKL method to two baseline approaches and investigate the reasons for the performance improvement.
Abstract: Combining information from different sources is a common way to improve classification accuracy in Brain-Computer Interfacing (BCI). For instance, in small-sample settings it is useful to integrate data from other subjects or sessions in order to improve the estimation quality of the spatial filters or the classifier. Since data from different subjects may show large variability, it is crucial to weight the contributions according to their importance. Many multi-subject learning algorithms determine the weighting in a separate, heuristic step, however, without ensuring that the selected weights are optimal with respect to classification. In this work we apply Multiple Kernel Learning (MKL) to this problem. MKL has been widely used for feature fusion in computer vision and makes it possible to learn the classifier and the optimal weighting simultaneously. We compare the MKL method to two baseline approaches and investigate the reasons for the performance improvement.
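A toy sketch of this kernel set-up follows (it is not the paper's pipeline): one base kernel is built per source of information, e.g. features obtained with spatial filters estimated on different subjects or sessions, and the kernels are combined with weights eta before training an SVM on the precomputed Gram matrix. In MKL proper, eta would be optimized jointly with the SVM rather than fixed to uniform values as it is here.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features, n_sources = 80, 6, 3
y = rng.integers(0, 2, n_trials)

# Hypothetical per-source feature matrices for the same trials
# (e.g. log-variance features after source-specific spatial filtering).
feats = [rng.standard_normal((n_trials, n_features)) for _ in range(n_sources)]

eta = np.full(n_sources, 1.0 / n_sources)   # fixed uniform weights in this sketch
K = sum(w * rbf_kernel(F, F, gamma=0.2) for w, F in zip(eta, feats))

clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
print("Training accuracy on toy data:", clf.score(K, y))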

3 citations

Proceedings ArticleDOI
01 Jan 2020
TL;DR: A Flexible Kernel obtained by Negotiating between Data-dependent kernel learning and Task-dependent kernel learning, termed FKNDT, is presented; it outperforms other state-of-the-art kernel-based algorithms in terms of classification accuracy on fifteen benchmark datasets.
Abstract: Kernel learning is a challenging problem that has been extensively investigated over the last decades. The performance of kernel-based methods largely depends on selecting an appropriate kernel. In the machine learning community, a fundamental problem is how to model a suitable kernel. Traditional kernels, e.g., the Gaussian kernel and the polynomial kernel, are not flexible enough to exploit the information in the given data and cannot sufficiently capture the characteristics of data similarities. To alleviate this problem, this paper presents a Flexible Kernel obtained by Negotiating between Data-dependent kernel learning and Task-dependent kernel learning, termed FKNDT. Our method learns a suitable kernel as the Hadamard product of two types of kernels: a data-dependent kernel and a set of pre-specified classical kernels serving as a task-dependent kernel. We evaluate the flexible kernel in a supervised manner via Support Vector Machines (SVM). We model the learning process as a joint optimization problem comprising data-dependent kernel matrix learning, multiple kernel learning by means of quadratic programming, and standard SVM optimization. The experimental results demonstrate that our technique yields a more effective kernel than traditional kernels and outperforms other state-of-the-art kernel-based algorithms in terms of classification accuracy on fifteen benchmark datasets.
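The kernel construction can be sketched as follows (a rough illustration under simplifying assumptions, not the authors' implementation): the task-dependent part is a weighted sum of classical kernels, the data-dependent part is approximated here by a plain RBF kernel, and the two are combined with an element-wise (Hadamard) product before being passed to an SVM with a precomputed kernel. By the Schur product theorem, the Hadamard product of two positive semidefinite kernels is again a valid kernel.

import numpy as np
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel
from sklearn.svm import SVC

def fkndt_like_kernel(X, Z, mu, gamma_data=0.1):
    """K = K_data * K_task (element-wise), with K_task a weighted sum of
    classical kernels and K_data a placeholder data-dependent kernel."""
    K_task = (mu[0] * linear_kernel(X, Z)
              + mu[1] * polynomial_kernel(X, Z, degree=2)
              + mu[2] * rbf_kernel(X, Z, gamma=1.0))
    K_data = rbf_kernel(X, Z, gamma=gamma_data)   # stand-in for the learned part
    return K_data * K_task                        # Hadamard product

X = np.random.randn(60, 5)
y = (X[:, 0] + X[:, 1] > 0).astype(int)
mu = np.array([0.3, 0.3, 0.4])   # in the paper these weights are learned via QP
K = fkndt_like_kernel(X, X, mu)
clf = SVC(kernel="precomputed").fit(K, y)
print("Gram matrix shape:", K.shape)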

3 citations

Journal ArticleDOI
01 Jan 2010
TL;DR: MKL problems can be solved efficiently by a modified projected gradient method and applied to image categorization and object detection; the framework is evaluated on the ETH-80 dataset for several multi-level image encodings for supervised and unsupervised object recognition, with competitive results.
Abstract: Multiple kernel learning (MKL) aims at simultaneously optimizing kernel weights while training the support vector machine (SVM) to obtain satisfactory classification or regression results. Recent publications and developments based on SVM have shown that using MKL can enhance the interpretability of the decision function and improve classifier performance, which motivates researchers to explore the use of a homogeneous model obtained as a linear combination of various types of kernels. In this paper, we show that MKL problems can be solved efficiently by a modified projected gradient method and applied to image categorization and object detection. The kernel is defined as a linear combination of feature histogram functions that measure the degree of similarity of partial correspondences between feature sets for discriminative classification, which allows recognition that is robust to within-class variation, pose changes, and articulation. We evaluate our proposed framework on the ETH-80 dataset for several multi-level image encodings for supervised and unsupervised object recognition and report competitive results.
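The two ingredients named above can be sketched under simplifying assumptions (this is not the paper's implementation): a histogram intersection kernel over multi-level feature histograms, combined per level with weights beta, and one illustrative gradient-style update of those weights followed by a crude projection (clip to non-negative values and renormalize) standing in for the modified projected gradient method.

import numpy as np
from sklearn.svm import SVC

def histogram_intersection(H1, H2):
    """Gram matrix of the histogram intersection kernel:
    k(h, h') = sum_j min(h_j, h'_j)."""
    return np.minimum(H1[:, None, :], H2[None, :, :]).sum(axis=-1)

rng = np.random.default_rng(1)
n, bins, levels = 50, 32, 3
y = rng.integers(0, 2, n)

# Hypothetical per-level histograms (e.g. from a multi-level image encoding).
hists = [rng.random((n, bins)) for _ in range(levels)]
hists = [H / H.sum(axis=1, keepdims=True) for H in hists]   # L1-normalize

beta = np.full(levels, 1.0 / levels)                        # kernel weights
K = sum(b * histogram_intersection(H, H) for b, H in zip(beta, hists))
clf = SVC(kernel="precomputed", C=10.0).fit(K, y)

# One illustrative weight update (placeholder gradient), then project back.
grad = rng.standard_normal(levels) * 0.1
beta = np.clip(beta - 0.5 * grad, 0.0, None)
beta /= beta.sum()
print("Updated kernel weights:", beta)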

3 citations


Network Information
Related Topics (5)
Convolutional neural network: 74.7K papers, 2M citations, 89% related
Deep learning: 79.8K papers, 2.1M citations, 89% related
Feature extraction: 111.8K papers, 2.1M citations, 87% related
Feature (computer vision): 128.2K papers, 1.7M citations, 87% related
Image segmentation: 79.6K papers, 1.8M citations, 86% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    21
2022    44
2021    72
2020    101
2019    113
2018    114