scispace - formally typeset
Topic

Multiple kernel learning

About: Multiple kernel learning is a research topic. Over its lifetime, 1,630 publications have been published on this topic, receiving 56,082 citations.
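At its core, multiple kernel learning learns a weighted (typically convex) combination of several pre-specified base kernels instead of hand-picking a single one. A minimal sketch of that combined-kernel construction, assuming RBF and polynomial base kernels with fixed illustrative weights (in real MKL the weights are learned, not fixed):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel: exp(-gamma * ||x - y||^2).
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def poly_kernel(X, Y, degree=2, c=1.0):
    # Gram matrix of an inhomogeneous polynomial kernel: (x.y + c)^degree.
    return (X @ Y.T + c) ** degree

def combined_kernel(X, Y, weights, gamma=1.0, degree=2):
    # Convex combination K = sum_m w_m K_m with w_m >= 0, sum w_m = 1 --
    # the form most MKL methods optimize over.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize to the simplex
    return w[0] * rbf_kernel(X, Y, gamma) + w[1] * poly_kernel(X, Y, degree)
```

Because each base kernel is positive semidefinite and the weights are non-negative, the combined Gram matrix is itself a valid kernel and can be passed directly to any kernel classifier.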


Papers
Proceedings ArticleDOI
14 Nov 2013
TL;DR: A novel multiple kernel learning method is proposed, based on a notion of Gaussianity evaluated via the entropy power inequality; it optimizes only the kernel combination coefficients, whereas conventional methods must optimize both the combination coefficients and the classifier parameters.
Abstract: Kernel methods have become a standard solution for many data-analysis tasks and are used extensively in signal processing, including the analysis of speech, images, time series, and DNA sequences. The main difficulty in applying kernel methods lies in designing an appropriate kernel for the specific data, and multiple kernel learning (MKL) is one of the principled approaches to this kernel design problem. In this paper, a novel multiple kernel learning method is proposed, based on a notion of Gaussianity evaluated via the entropy power inequality. The notable characteristics of the proposed method are that it exploits the entropy power inequality for kernel learning, and that it yields an MKL algorithm which optimizes only the kernel combination coefficients, whereas conventional methods must optimize both the combination coefficients and the classifier parameters. The proposed MKL algorithm is experimentally shown to achieve good classification accuracy.
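For context on the inequality the abstract invokes: the entropy power of a random variable is N(X) = exp(2h(X)) / (2&pi;e), which for a Gaussian equals its variance, and the entropy power inequality states N(X + Y) &ge; N(X) + N(Y) for independent X and Y, with equality in the Gaussian case. A small numeric sketch of that fact (this illustrates the inequality itself, not the paper's kernel-learning objective, which is not reproduced here):

```python
import numpy as np

def gaussian_entropy(var):
    # Differential entropy h(X) of a 1-D Gaussian with variance `var`.
    return 0.5 * np.log(2 * np.pi * np.e * var)

def entropy_power(h):
    # Entropy power N = exp(2h) / (2*pi*e); for a Gaussian it equals the
    # variance, so it quantifies how "Gaussian" a variable's spread is.
    return np.exp(2 * h) / (2 * np.pi * np.e)

# Entropy power inequality for independent X, Y:
#   N(X + Y) >= N(X) + N(Y), with equality for independent Gaussians.
vx, vy = 2.0, 3.0
lhs = entropy_power(gaussian_entropy(vx + vy))  # N(X+Y) when X, Y Gaussian
rhs = entropy_power(gaussian_entropy(vx)) + entropy_power(gaussian_entropy(vy))
```

Since the sum of independent Gaussians is Gaussian with variance vx + vy, the two sides coincide here; for non-Gaussian variables the left side strictly exceeds the right, which is what makes the gap usable as a Gaussianity measure.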

1 citation

Journal ArticleDOI
TL;DR: A novel algorithm termed elastic multiple kernel discriminant analysis (EMKDA) is proposed, which uses hybrid regularization to automatically learn a kernel as a linear combination of pre-specified kernel functions.
Abstract: Kernel discriminant analysis (KDA) is one of the state-of-the-art kernel-based methods for pattern classification and dimensionality reduction. It performs linear discriminant analysis in the feature space induced by a kernel function. However, the performance of KDA greatly depends on selecting the optimal kernel for the learning task of interest. In this paper, we propose a novel algorithm termed elastic multiple kernel discriminant analysis (EMKDA), which uses hybrid regularization to automatically learn a kernel as a linear combination of pre-specified kernel functions. EMKDA makes use of a mixing-norm regularization function to balance sparsity and non-sparsity of the kernel weights. A semi-infinite-programming-based algorithm is then proposed to solve EMKDA. Extensive experiments on synthetic datasets, UCI benchmark datasets, and digit and terrain databases are conducted to show the effectiveness of the proposed method.
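A mixing norm of the kind described interpolates between a sparse l1 penalty and a non-sparse squared-l2 penalty on the kernel weights, elastic-net style. A minimal sketch (the exact penalty and its trade-off parameter in EMKDA may differ; `alpha` here is an assumed interpolation parameter):

```python
import numpy as np

def mixing_norm(w, alpha=0.5):
    # Hybrid regularizer on kernel weights:
    #   alpha * ||w||_1 + (1 - alpha) * ||w||_2^2.
    # alpha = 1 recovers a sparse l1 penalty (few active kernels);
    # alpha = 0 recovers a non-sparse l2 penalty (all kernels kept).
    w = np.asarray(w, dtype=float)
    return alpha * np.abs(w).sum() + (1 - alpha) * (w ** 2).sum()
```

Tuning `alpha` lets the learner decide how aggressively to discard base kernels, which is the sparsity/non-sparsity compromise the abstract refers to.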

1 citation

Proceedings ArticleDOI
01 Sep 2016
TL;DR: This paper proposes a supervised method for computing the Laplacian graph, using the Hellinger distance to measure the similarity of data samples, and aims to obtain a classifier that minimizes the classification error while incorporating the structural information of the data.
Abstract: In supervised classification, many previous methods focus on model selection or parameter learning and do not take the structural information of the data into account. In this paper, manifold regularization is therefore used to capture that structural information. The most widely used tool for computing manifold regularization is the Laplacian graph, which is usually constructed in an unsupervised way. Since our problem is supervised classification, we propose a supervised method to compute the Laplacian graph, using the Hellinger distance to measure the similarity of data samples; the Hellinger distance gives a comprehensive evaluation of the relations between samples from four aspects: similarity, density, dimension, and direction. Traditional linear models and support vector machines need improvement when dealing with datasets that have many attributes or multiple sources, which is why adaptable and flexible multiple kernel learning has recently been proposed. Our supervised classification model uses manifold regularization to constrain the multiple kernel classifier, incorporating structural information, in the hope of obtaining a classifier that both minimizes the classification error and accounts for the structure of the whole dataset. In the experiments, we use UCI datasets to show the classification performance of the proposed model, and a synthetic dataset to validate that adding manifold regularization captures part of the structural information.
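To make the graph construction concrete: given samples represented as probability vectors, one can turn pairwise Hellinger distances into similarities and form the unnormalized graph Laplacian L = D - W. A minimal sketch under stated assumptions (the Gaussian-of-distance similarity and `sigma` are illustrative choices; the paper's supervised, label-aware construction and its four-aspect evaluation are not reproduced here):

```python
import numpy as np

def hellinger(p, q):
    # Hellinger distance between two discrete probability vectors,
    # H(p, q) = sqrt(0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2), in [0, 1].
    return np.sqrt(0.5 * ((np.sqrt(p) - np.sqrt(q)) ** 2).sum())

def laplacian_from_hellinger(P, sigma=1.0):
    # Build a similarity graph W from pairwise Hellinger distances,
    # then return the unnormalized graph Laplacian L = D - W.
    n = P.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            W[i, j] = np.exp(-hellinger(P[i], P[j]) ** 2 / sigma)
    np.fill_diagonal(W, 0.0)       # no self-loops
    L = np.diag(W.sum(axis=1)) - W  # degree matrix minus adjacency
    return L
```

The resulting L is symmetric positive semidefinite with zero row sums, so the quadratic form f.T @ L @ f penalizes classifiers whose outputs vary sharply between similar samples, which is exactly the manifold-regularization term added to the multiple kernel objective.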

1 citation

Journal ArticleDOI
TL;DR: A coupled multiple kernel learning method for supervised classification (CMKL-C) is proposed, which comprehensively involves the intra-coupling within each kernel, the inter-coupling among different kernels, and the coupling between target labels and real ones in MKL.
Abstract: Multiple kernel learning (MKL) has recently received significant attention because it can automatically fuse information embedded in multiple base kernels and then find a new kernel for classification or regression. In this paper, we propose a coupled multiple kernel learning method for supervised classification (CMKL-C), which comprehensively involves the intra-coupling within each kernel, the inter-coupling among different kernels, and the coupling between target labels and real ones in MKL. Specifically, the intra-coupling controls the class distribution in a kernel space, the inter-coupling captures the co-information of the base kernel matrices, and the last type of coupling determines whether the newly learned kernel can make a correct decision. Furthermore, we derive analytical solutions to the CMKL-C optimization problem for highly efficient learning. Experimental results on eight UCI datasets and three bioinformatics datasets demonstrate the superior performance of CMKL-C in terms of classification accuracy.

1 citation


Network Information
Related Topics (5)
Convolutional neural network: 74.7K papers, 2M citations, 89% related
Deep learning: 79.8K papers, 2.1M citations, 89% related
Feature extraction: 111.8K papers, 2.1M citations, 87% related
Feature (computer vision): 128.2K papers, 1.7M citations, 87% related
Image segmentation: 79.6K papers, 1.8M citations, 86% related
Performance
Metrics
Number of papers on the topic in previous years:

Year  Papers
2023  21
2022  44
2021  72
2020  101
2019  113
2018  114