Kristin P. Bennett

Researcher at Rensselaer Polytechnic Institute

Publications -  176
Citations -  10074

Kristin P. Bennett is an academic researcher from Rensselaer Polytechnic Institute. The author has contributed to research in topics: Support vector machine & Computer science. The author has an h-index of 45 and has co-authored 158 publications receiving 9250 citations. Previous affiliations of Kristin P. Bennett include University of Wisconsin-Madison & Catholic University of Leuven.

Papers
Proceedings Article

Semi-Supervised Support Vector Machines

TL;DR: A general S3VM model is proposed that minimizes both the misclassification error and the function capacity based on all the available data; the model can be converted to a mixed-integer program and then solved exactly using integer programming.
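
The mixed-integer idea can be sketched compactly: each unlabelled point gets a binary variable choosing its class, and a big-M constant activates only the margin constraint of the chosen class, so the class assignments become explicit integer decisions that a MIP solver can settle exactly. Below is a minimal sketch assuming PuLP with its bundled CBC solver; the 1-norm capacity term, the parameter values, and the variable names are illustrative choices, not the paper's exact model.

```python
# Minimal S3VM-as-MIP sketch (assumes PuLP + CBC; illustrative only).
import numpy as np
import pulp

def s3vm_mip(X_lab, y_lab, X_unl, C=1.0, M=100.0):
    X_lab, X_unl = np.asarray(X_lab, float), np.asarray(X_unl, float)
    n, p = X_lab.shape
    m = X_unl.shape[0]
    prob = pulp.LpProblem("S3VM", pulp.LpMinimize)
    w = [pulp.LpVariable(f"w{j}") for j in range(p)]
    w_abs = [pulp.LpVariable(f"wabs{j}", lowBound=0) for j in range(p)]
    b = pulp.LpVariable("b")
    xi = [pulp.LpVariable(f"xi{i}", lowBound=0) for i in range(n)]   # labelled slacks
    z = [pulp.LpVariable(f"z{k}", lowBound=0) for k in range(m)]     # unlabelled slack, class +1 branch
    t = [pulp.LpVariable(f"t{k}", lowBound=0) for k in range(m)]     # unlabelled slack, class -1 branch
    s = [pulp.LpVariable(f"s{k}", cat="Binary") for k in range(m)]   # class choice per unlabelled point
    # objective: 1-norm capacity term plus weighted total slack
    prob += pulp.lpSum(w_abs) + C * (pulp.lpSum(xi) + pulp.lpSum(z) + pulp.lpSum(t))
    # |w_j| <= w_abs_j
    for j in range(p):
        prob += w_abs[j] >= w[j]
        prob += w_abs[j] >= -w[j]
    # labelled margin constraints
    for i in range(n):
        score = pulp.lpSum(float(X_lab[i, j]) * w[j] for j in range(p)) + b
        prob += float(y_lab[i]) * score + xi[i] >= 1
    # unlabelled points: big-M activates only the constraint of the chosen class
    for k in range(m):
        score = pulp.lpSum(float(X_unl[k, j]) * w[j] for j in range(p)) + b
        prob += score + z[k] + M * (1 - s[k]) >= 1     # treated as class +1 when s[k] = 1
        prob += -score + t[k] + M * s[k] >= 1          # treated as class -1 when s[k] = 0
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    labels = [1 if sk.value() > 0.5 else -1 for sk in s]
    return np.array([v.value() for v in w]), b.value(), labels
```
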
Journal ArticleDOI

Robust linear programming discrimination of two linearly inseparable sets

TL;DR: A single linear programming formulation is proposed which generates a plane that minimizes an average sum of the errors of misclassified points belonging to two disjoint point sets in n-dimensional real space, without imposing extraneous normalization constraints that inevitably fail to handle certain cases.
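
The key device is averaging the violations over each set separately, which rules out the trivial zero-weight plane without any extra normalization constraint. A minimal sketch assuming scipy.optimize.linprog (HiGHS); the sign conventions and variable layout are illustrative, not copied from the paper.

```python
# Robust LP discrimination sketch: separate point sets A and B with a plane
# w.x = gamma, minimizing the *average* violation on each set.
import numpy as np
from scipy.optimize import linprog

def rlp_separate(A, B):
    m, d = A.shape
    k = B.shape[0]
    # variable layout: [w (d), gamma (1), y (m), z (k)], with y, z >= 0 slacks
    n_var = d + 1 + m + k
    c = np.zeros(n_var)
    c[d + 1:d + 1 + m] = 1.0 / m          # average violation on A
    c[d + 1 + m:] = 1.0 / k               # average violation on B
    # A-side: want A_i.w >= gamma + 1 - y_i, i.e. -(A_i.w) + gamma - y_i <= -1
    ub_A = np.hstack([-A, np.ones((m, 1)), -np.eye(m), np.zeros((m, k))])
    # B-side: want B_j.w <= gamma - 1 + z_j, i.e. (B_j.w) - gamma - z_j <= -1
    ub_B = np.hstack([B, -np.ones((k, 1)), np.zeros((k, m)), -np.eye(k)])
    A_ub = np.vstack([ub_A, ub_B])
    b_ub = -np.ones(m + k)
    bounds = [(None, None)] * (d + 1) + [(0, None)] * (m + k)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, gamma = res.x[:d], res.x[d]
    return w, gamma
```
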
Journal ArticleDOI

Support vector machines: hype or hallelujah?

TL;DR: An intuitive explanation of SVMs from a geometric perspective is provided, and the classification problem is used to investigate the basic concepts behind SVMs and to examine their strengths and weaknesses from a data mining perspective.
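
In the geometric picture, the separating plane w.x + b = 0 sits midway between the supporting planes w.x + b = +1 and w.x + b = -1, so the margin width is 2/||w||. A small numeric check with scikit-learn; the toy dataset and the large C value (to approximate a hard margin) are arbitrary choices for illustration.

```python
# Geometric margin check: for a linearly separable toy set, the learned
# margin 2 / ||w|| should match the visible gap between the classes.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 0.0], [2.0, 1.0]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)   # large C ~ hard margin
w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)
print("w =", w, "b =", clf.intercept_[0], "geometric margin =", margin)
# The two classes are 2 apart along the first coordinate, so the margin is ~2.
```
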
Journal Article

Dimensionality reduction via sparse support vector machines

TL;DR: The method constructs a series of sparse linear SVMs to generate linear models that can generalize well, and uses the subset of nonzero-weighted variables found by the linear models to produce a final nonlinear model.
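
A minimal sketch of the two-stage idea, assuming scikit-learn: an l1-penalized linear SVM drives most variable weights to zero, and a nonlinear (RBF) SVM is then trained on the surviving variables only. Using a single sparse SVM rather than a series of them, and the specific parameter values, are simplifications for illustration.

```python
# Dimensionality reduction via a sparse linear SVM, then a nonlinear model
# on the selected variables (sketch; parameters are arbitrary).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC, SVC

X, y = make_classification(n_samples=200, n_features=50, n_informative=5,
                           random_state=0)

# Stage 1: sparse linear SVM; the 1-norm penalty zeroes out most weights.
sparse_svm = LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=5000).fit(X, y)
selected = np.flatnonzero(np.abs(sparse_svm.coef_[0]) > 1e-8)
print("kept", len(selected), "of", X.shape[1], "variables")

# Stage 2: nonlinear model restricted to the selected variables.
nonlinear = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X[:, selected], y)
print("training accuracy:", nonlinear.score(X[:, selected], y))
```
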
Journal ArticleDOI

Linear Programming Boosting via Column Generation

TL;DR: It is proved that, for classification, minimizing the 1-norm soft margin error function directly optimizes a generalization error bound, and the resulting column-generation approach is competitive with AdaBoost in both quality and computational cost.
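
Column generation keeps the LP small: with the current dual weights over the examples, a weak learner is trained to maximize its weighted edge, and it is added as a new column only if that edge violates the current dual bound. A compact sketch assuming scipy.optimize.linprog (HiGHS) and decision stumps as the base learners; the capacity cap D, the tolerance, and the brute-force stump search are illustrative choices, not the paper's exact algorithm.

```python
# LP boosting by column generation over decision stumps (sketch).
import numpy as np
from scipy.optimize import linprog

def best_stump(X, y, u):
    """Stump (feature, threshold, sign) maximizing the weighted edge
    sum_i u_i * y_i * h(x_i)."""
    best_edge, best = -np.inf, None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                h = sign * np.where(X[:, j] <= thr, 1, -1)
                edge = float(np.sum(u * y * h))
                if edge > best_edge:
                    best_edge, best = edge, (j, thr, sign)
    return best_edge, best

def stump_predict(X, stump):
    j, thr, sign = stump
    return sign * np.where(X[:, j] <= thr, 1, -1)

def lpboost(X, y, D=0.2, max_iters=20, tol=1e-6):
    # D caps each example's dual weight; it must be at least 1/n so the
    # weights can still sum to one.
    n = len(y)
    u = np.full(n, 1.0 / n)              # dual weights over examples
    beta = 0.0                           # current dual bound on the edge
    H, stumps, a = [], [], np.zeros(0)
    for _ in range(max_iters):
        edge, stump = best_stump(X, y, u)
        if edge <= beta + tol:           # no remaining column violates the dual constraint
            break
        stumps.append(stump)
        H.append(y * stump_predict(X, stump))
        # Restricted dual LP over variables [u (n), beta]:
        #   min beta  s.t.  sum_i u_i y_i h_j(x_i) <= beta  for every column j,
        #                   sum_i u_i = 1,  0 <= u_i <= D.
        c = np.zeros(n + 1)
        c[-1] = 1.0
        A_ub = np.hstack([np.array(H), -np.ones((len(H), 1))])
        b_ub = np.zeros(len(H))
        A_eq = np.hstack([np.ones((1, n)), np.zeros((1, 1))])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, D)] * n + [(None, None)], method="highs")
        u, beta = res.x[:n], res.x[-1]
        # Hypothesis weights are the duals of the <= constraints; abs()
        # normalizes the solver's sign convention.
        a = np.abs(res.ineqlin.marginals)
    return stumps, a

def predict(X, stumps, a):
    votes = sum(ai * stump_predict(X, s) for ai, s in zip(a, stumps))
    return np.sign(votes)
```
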