
Purdy Ho

Researcher at Hewlett-Packard

Publications -  12
Citations -  1304

Purdy Ho is an academic researcher from Hewlett-Packard. The author has contributed to research in topics: Support vector machine & Kernel (statistics). The author has an h-index of 9 and has co-authored 12 publications receiving 1272 citations.

Papers
Proceedings Article

A Kullback-Leibler Divergence Based Kernel for SVM Classification in Multimedia Applications

TL;DR: This paper suggests an alternative procedure to the Fisher kernel for systematically finding kernel functions that naturally handle variable length sequence data in multimedia domains and derives a kernel distance based on the Kullback-Leibler (KL) divergence between generative models.
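The core idea can be written compactly: build a kernel by symmetrizing and exponentiating the KL divergence between the generative models fitted to two sequences. The sketch below is an illustrative formulation, not necessarily the paper's exact parameterization; the scale parameter alpha is an assumed hyperparameter.

```latex
% Sketch of a KL-divergence-based kernel between generative models p and q.
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx
\qquad
K(p, q) = \exp\!\Big( -\alpha \big[ D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, p) \big] \Big)
```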
Journal ArticleDOI

Face recognition: component-based versus global approaches

TL;DR: This paper presents a component-based method and two global methods for face recognition, evaluates them with respect to robustness against pose changes, and finds that the component-based system clearly outperforms both global systems.
Patent

Robust multi-factor authentication for secure application environments

TL;DR: This patent presents a multi-factor user authentication system in which one authentication factor is the user's speech pattern and another, a one-time passcode, is provided via a voice portal and/or browser input.
Book ChapterDOI

The Kullback-Leibler kernel as a framework for discriminant and localized representations for visual recognition

TL;DR: This chapter derives a taxonomy of kernels based on combining the KL kernel with various probabilistic representations previously proposed in the recognition literature, and shows that these kernels can significantly outperform traditional SVM solutions for recognition.
Proceedings Article

A New SVM Approach to Speaker Identification and Verification Using Probabilistic Distance Kernels

Pedro J. Moreno, +1 more
TL;DR: This work explores the use of generative speaker identification models such as Gaussian Mixture Models and derives a kernel distance based on the Kullback-Leibler (KL) divergence between generative models, which combines the best of both generative and discriminative methods.
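A minimal end-to-end sketch of the generative-plus-discriminative recipe, under simplifying assumptions: each utterance is modeled by a single diagonal-covariance Gaussian (the papers use GMMs, for which the KL divergence must be approximated), and the kernel is passed to a standard SVM as a precomputed Gram matrix. The helper names (fit_gaussian, sym_kl_kernel, gram), the alpha scale, and the scikit-learn route are illustrative choices, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC

def fit_gaussian(frames):
    # Fit a single diagonal-covariance Gaussian to a variable-length sequence of feature frames.
    return frames.mean(axis=0), frames.var(axis=0) + 1e-6  # small floor avoids zero variances

def sym_kl_kernel(g0, g1, alpha=0.1):
    # Exponentiated symmetric KL divergence between two diagonal Gaussians (mu, var).
    (mu0, var0), (mu1, var1) = g0, g1
    kl01 = 0.5 * np.sum(var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1.0 + np.log(var1 / var0))
    kl10 = 0.5 * np.sum(var1 / var0 + (mu0 - mu1) ** 2 / var0 - 1.0 + np.log(var0 / var1))
    return np.exp(-alpha * (kl01 + kl10))

def gram(models_a, models_b):
    # Gram matrix of kernel values between two lists of per-utterance models.
    return np.array([[sym_kl_kernel(a, b) for b in models_b] for a in models_a])

# Toy data: each "utterance" is a variable-length sequence of 13-dim feature frames
# (standing in for MFCCs), drawn from one of two synthetic speakers.
rng = np.random.default_rng(0)
train_seqs = [rng.normal(loc=c, size=(rng.integers(50, 100), 13)) for c in (0.0, 0.0, 2.0, 2.0)]
train_labels = [0, 0, 1, 1]
test_seqs = [rng.normal(loc=c, size=(80, 13)) for c in (0.0, 2.0)]

train_models = [fit_gaussian(s) for s in train_seqs]
test_models = [fit_gaussian(s) for s in test_seqs]

# The generative models supply the kernel; the SVM supplies the discriminative decision.
svm = SVC(kernel="precomputed")
svm.fit(gram(train_models, train_models), train_labels)
print(svm.predict(gram(test_models, train_models)))  # expected: [0 1]
```

Because the kernel handles variable-length sequences through the fitted models, the SVM itself never sees the raw frames, which is what lets this approach sidestep the fixed-length input requirement of standard kernels.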