Open Access • Journal Article • DOI

Structural Minimax Probability Machine

TLDR
This paper uses two finite mixture models to capture the structural information of the data in binary classification and proposes a structural MPM (SMPM), which can be interpreted as a large margin classifier and can be transformed into the support vector machine and the maxi–min margin machine under certain special conditions.
Abstract
Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data in binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of second-order cone programming problems. Moreover, we extend the linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that SMPM can be interpreted as a large margin classifier and can be transformed into the support vector machine and the maxi–min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
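
For context, the baseline linear MPM that SMPM builds on reduces, via the multivariate Chebyshev bound, to the second-order cone program: minimize sqrt(a'Σ1 a) + sqrt(a'Σ2 a) subject to a'(μ1 − μ2) = 1, with worst-case accuracy α = κ²/(1 + κ²) where κ = 1/(sqrt(a'Σ1 a) + sqrt(a'Σ2 a)). The sketch below illustrates that baseline only (not the proposed SMPM or its mixture-model extension), using CVXPY; the function name and the 1e-6 covariance regularization are assumptions made for the example.

```python
# Minimal sketch of the baseline linear MPM solved as a single SOCP with CVXPY.
# This is NOT the paper's SMPM; it is the classical MPM the abstract refers to.
import numpy as np
import cvxpy as cp
from scipy.linalg import sqrtm

def fit_linear_mpm(X_pos, X_neg):
    d = X_pos.shape[1]
    mu1, mu2 = X_pos.mean(axis=0), X_neg.mean(axis=0)
    S1 = np.cov(X_pos, rowvar=False) + 1e-6 * np.eye(d)  # regularization is an assumption
    S2 = np.cov(X_neg, rowvar=False) + 1e-6 * np.eye(d)
    R1, R2 = np.real(sqrtm(S1)), np.real(sqrtm(S2))

    # min ||S1^{1/2} a|| + ||S2^{1/2} a||  s.t.  a^T (mu1 - mu2) = 1
    a = cp.Variable(d)
    cp.Problem(cp.Minimize(cp.norm(R1 @ a) + cp.norm(R2 @ a)),
               [(mu1 - mu2) @ a == 1]).solve()

    a_star = a.value
    s1 = np.sqrt(a_star @ S1 @ a_star)
    s2 = np.sqrt(a_star @ S2 @ a_star)
    kappa = 1.0 / (s1 + s2)
    alpha = kappa**2 / (1.0 + kappa**2)   # worst-case accuracy bound
    b = a_star @ mu1 - kappa * s1         # decision threshold
    return a_star, b, alpha               # classify x as positive if a^T x >= b
```

The proposed SMPM replaces the single (mean, covariance) pair per class with a finite mixture of such pairs and solves a sequence of SOCPs of this kind.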


Citations
Journal Article • DOI

Quality of Protection in Cloud-Assisted Cognitive Machine-to-Machine Communications for Industrial Systems

TL;DR: This paper focuses on secure information transmission for the primary system when the secondary users (SUs) are potential eavesdroppers, and aims to jointly design power splitting and secure beamforming to maximize the secondary M2M system data rate subject to the secrecy requirement of the primary system and the ST power constraint.
Journal Article • DOI

Studying the capture of stochastic events using radar and a mobile robot

TL;DR: This paper mathematically models intrusion detection processes via a radar and a mobile sensor/robot (or a vehicular device), considers using a pair of moving sensors or robots together, and studies their detection quality to help understand the benefits of mobility coverage.
Journal Article • DOI

Defense against malicious URL spreading in micro-blog network with hub nodes

TL;DR: A defense scheme against malicious Uniform Resource Locator (URL) diffusion in micro-blog networks with hub nodes is proposed, and a comparison mechanism is added to reduce the redundancy of spreading warning messages in the networks.
Book Chapter • DOI

A New Universal Quantum Gates and Its Simulation on GPGPU

TL;DR: Results of the experiments show that Grover's search algorithm achieves a quadratic speedup when solving the search problem, which reflects the validity of the proposed gates.
Journal Article • DOI

Solving k-Barrier Coverage Problem Using Modified Gravitational Search Algorithm

TL;DR: The proposed PGSA adopts a probability-based k-barrier coverage generation strategy, integrates the exploitation ability of particle swarm optimization into the velocity update to enhance global search capability, and introduces a boundary mutation strategy for agents to increase population diversity and search accuracy.
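
The velocity-update idea mentioned above can be sketched as follows. This is a generic gravitational-search update blended with a PSO-style exploitation term, not the paper's exact PGSA; the constants, mass normalization, and fitness convention are placeholder assumptions.

```python
# Generic GSA velocity update with a PSO-style pull toward the global best.
# Placeholder sketch only; not the PGSA described in the cited paper.
import numpy as np

def hybrid_velocity_step(X, V, fitness, gbest, G=1.0, c1=0.5, c2=1.5, eps=1e-12):
    n, d = X.shape
    worst, best = fitness.max(), fitness.min()        # minimization problem assumed
    m = (fitness - worst) / (best - worst + eps)      # raw masses from fitness
    M = m / (m.sum() + eps)                           # normalized masses

    # Gravitational acceleration: randomly weighted sum of pairwise attractions
    A = np.zeros((n, d))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            R = np.linalg.norm(X[i] - X[j])
            A[i] += np.random.rand() * G * M[j] * (X[j] - X[i]) / (R + eps)

    # Random inertia (GSA) + gravitational pull + PSO-style exploitation term
    V = np.random.rand(n, d) * V + c1 * A + c2 * np.random.rand(n, d) * (gbest - X)
    return X + V, V
```
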
References
Journal Article • DOI

A tutorial on hidden Markov models and selected applications in speech recognition

TL;DR: In this paper, the authors provide an overview of the basic theory of hidden Markov models (HMMs) as originated by L.E. Baum and T. Petrie (1966) and give practical details on methods of implementation of the theory along with a description of selected applications of HMMs to distinct problems in speech recognition.
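
One of the basic problems the tutorial covers is evaluating the likelihood of an observation sequence; a compact forward-algorithm sketch in plain NumPy (variable names are ours, not the tutorial's) looks like this:

```python
# Forward algorithm for a discrete-observation HMM: computes P(O | model).
# A: transition matrix (N x N), B: emission matrix (N x M),
# pi: initial state distribution (N,), obs: sequence of observation indices.
import numpy as np

def forward_likelihood(A, B, pi, obs):
    alpha = pi * B[:, obs[0]]            # initialization
    for o_t in obs[1:]:
        alpha = (alpha @ A) * B[:, o_t]  # induction step
    return alpha.sum()                   # termination: sum over final states
```

For long sequences one would rescale alpha at each step to avoid numerical underflow, a point the tutorial also discusses.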
Journal Article • DOI

Hierarchical Grouping to Optimize an Objective Function

TL;DR: In this paper, a procedure for forming hierarchical groups of mutually exclusive subsets, each of which has members that are maximally similar with respect to specified characteristics, is suggested for use in large-scale (n > 100) studies when a precise optimal solution for a specified number of groups is not practical.
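
As a quick illustration of this grouping procedure (Ward's method), SciPy exposes it directly; the data below are random placeholders, not anything from the paper.

```python
# Hierarchical agglomerative clustering with Ward's objective via SciPy.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.random.rand(200, 4)                        # placeholder data (n = 200 observations)
Z = linkage(X, method="ward")                     # merge pairs minimizing the within-group variance increase
labels = fcluster(Z, t=5, criterion="maxclust")   # cut the hierarchy into 5 groups
```
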
Book Chapter • DOI

Neural Networks for Pattern Recognition

TL;DR: The chapter discusses two important directions of research for improving learning algorithms: dynamic node generation, as used by the cascade-correlation algorithm, and the design of learning algorithms in which the choice of parameters is not an issue.
Journal Article • DOI

Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones

TL;DR: This paper describes how to work with SeDuMi, an add-on for MATLAB, which lets you solve optimization problems with linear, quadratic and semidefiniteness constraints by exploiting sparsity.
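
SeDuMi itself is a MATLAB add-on; to keep the examples here in one language, the sketch below poses the same class of problem (a tiny semidefinite program over the cone of PSD matrices) in Python with CVXPY, purely to illustrate what "optimization over symmetric cones" means. It is not SeDuMi's API.

```python
# Tiny semidefinite program of the kind SeDuMi targets, written with CVXPY
# (not SeDuMi): minimize trace(C X) s.t. trace(X) = 1, X positive semidefinite.
import numpy as np
import cvxpy as cp

np.random.seed(0)
C = np.random.randn(4, 4)
C = (C + C.T) / 2                        # symmetric cost matrix

X = cp.Variable((4, 4), symmetric=True)
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)),
                  [cp.trace(X) == 1, X >> 0])
prob.solve()
print(prob.value)                        # equals the smallest eigenvalue of C
```
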
Journal Article • DOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.
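
The standard decomposition strategies such comparisons use as baselines (one-vs-one and one-vs-rest) are easy to reproduce with scikit-learn; the dataset and hyperparameters below are placeholders, and the paper's "all-together" formulations are not part of the standard library.

```python
# One-vs-one vs. one-vs-rest multiclass SVMs with scikit-learn; illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
base = SVC(kernel="rbf", C=1.0, gamma="scale")
for name, clf in [("one-vs-one", OneVsOneClassifier(base)),
                  ("one-vs-rest", OneVsRestClassifier(base))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```
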