Proceedings ArticleDOI

Optimizing SVMs for complex call classification

Patrick Haffner et al.
Vol. 1, pp. 632-635
TLDR
A global optimization process based on an optimal channel communication model that allows a combination of possibly heterogeneous binary classifiers to decrease the call-type classification error rate for AT&T's How May I Help You (HMIHY(SM)) natural dialog system by 50% is proposed.
Abstract
Large margin classifiers such as support vector machines (SVM) or Adaboost are obvious choices for natural language document or call routing. However, how to combine several binary classifiers to optimize the whole routing process and how this process scales when it involves many different decisions (or classes) is a complex problem that has only received partial answers. We propose a global optimization process based on an optimal channel communication model that allows a combination of possibly heterogeneous binary classifiers. As in Markov modeling, computational feasibility is achieved through simplifications and independence assumptions that are easy to interpret. Using this approach, we have managed to decrease the call-type classification error rate for AT&T's How May I Help You (HMIHY(SM)) natural dialog system by 50%.
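The paper's channel-model combination is not reproduced here, but the basic setting it optimizes — routing a call by combining several binary classifiers, one per call type — can be sketched generically. The following is a minimal one-vs-rest stand-in (a logistic surrogate trained by gradient descent instead of a true large-margin SVM trainer; the data and class layout are invented for illustration):

```python
import numpy as np

def train_binary(X, y, epochs=200, lr=0.1):
    """Train one binary scorer (logistic surrogate) for a single call type."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid scores
        g = p - y                               # gradient of log-loss
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def route(X, y, classes):
    """One-vs-rest combination: each call type gets its own binary scorer;
    the call type with the highest raw score wins."""
    models = {c: train_binary(X, (y == c).astype(float)) for c in classes}
    def predict(x):
        return max(models, key=lambda c: x @ models[c][0] + models[c][1])
    return predict

# Toy call-routing data: 2-D features, 3 well-separated call types
rng = np.random.default_rng(0)
centers = np.array([[0.0, 3.0], [3.0, 0.0], [-3.0, -3.0]])
X = np.vstack([centers[k] + 0.5 * rng.normal(size=(30, 2)) for k in range(3)])
y = np.repeat(np.arange(3), 30)

predict = route(X, y, classes=[0, 1, 2])
acc = np.mean([predict(x) == t for x, t in zip(X, y)])
```

The paper's contribution is precisely what this sketch leaves out: choosing how the binary scores are combined (and calibrated) so that the overall routing error, not each binary error, is minimized.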


Citations
Journal ArticleDOI

Using recurrent neural networks for slot filling in spoken language understanding

TL;DR: This paper implements and compares several important RNN architectures, including Elman, Jordan, and hybrid variants, builds these networks with the publicly available Theano neural network toolkit, and carries out experiments on the well-known airline travel information system (ATIS) benchmark.
Proceedings ArticleDOI

Multi-Domain Joint Semantic Frame Parsing Using Bi-Directional RNN-LSTM.

TL;DR: Experimental results show the power of a holistic multi-domain, multi-task modeling approach, which estimates complete semantic frames for all user utterances addressed to a conversational system, over alternative methods based on single-domain/single-task deep learning.
Journal ArticleDOI

Application of Deep Belief Networks for natural language understanding

TL;DR: The plain DBN-based model gives a call-routing classification accuracy equal to the best of the other models; however, using additional unlabeled data for DBN pre-training and combining DBN-learned features with the original features provides significant gains over SVMs, which in turn performed better than both MaxEnt and Boosting.
Book ChapterDOI

Theory and Algorithms

Proceedings Article

A joint model of intent determination and slot filling for spoken language understanding

TL;DR: A joint model is proposed based on the idea that the intent and semantic slots of a sentence are correlative, and it outperforms the state-of-the-art approaches on both tasks.
References

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book ChapterDOI

Text Categorization with Support Vector Machines: Learning with Many Relevant Features

TL;DR: This paper explores the use of Support Vector Machines for learning text classifiers from examples and analyzes the particular properties of learning with text data and identifies why SVMs are appropriate for this task.
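The text-categorization setting that reference addresses — documents as sparse bag-of-words vectors, classified by a linear model — can be illustrated with a tiny self-contained sketch. This is not the cited paper's SVM pipeline: it uses a simple tf-idf weighting and a nearest-centroid classifier as a stand-in for a trained linear SVM, and the toy call-routing documents and labels are invented:

```python
import math
from collections import Counter, defaultdict

# Toy labeled "calls" (hypothetical data, two call types)
docs = [("refund for my bill", "billing"),
        ("charge on my account", "billing"),
        ("phone line is dead", "repair"),
        ("no dial tone on my line", "repair")]

vocab = sorted({w for text, _ in docs for w in text.split()})
df = Counter(w for text, _ in docs for w in set(text.split()))
N = len(docs)

def tfidf(text):
    """Bag-of-words vector with a smoothed tf-idf weighting."""
    tf = Counter(text.split())
    return [tf[w] * math.log((N + 1) / (df[w] + 1)) for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Class centroids as a stand-in for a trained linear classifier
centroids = defaultdict(lambda: [0.0] * len(vocab))
counts = Counter(label for _, label in docs)
for text, label in docs:
    centroids[label] = [c + x / counts[label]
                        for c, x in zip(centroids[label], tfidf(text))]

def classify(text):
    """Assign the class whose centroid is most similar to the document."""
    return max(centroids, key=lambda c: cosine(centroids[c], tfidf(text)))
```

Joachims' point, which the sketch's high-dimensional sparse vectors hint at, is that text has many relevant features, so margin-based linear classifiers such as SVMs suit it particularly well.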
Posted ContentDOI

Making large scale SVM learning practical

TL;DR: SVMlight, as discussed by the authors, is an implementation of an SVM learner which addresses the problem of training with very many examples, making large-scale SVM learning practical.
Proceedings Article

Learning with Kernels
