Determinantal point processes for machine learning
Alex Kulesza, Ben Taskar
TL;DR: Determinantal Point Processes for Machine Learning provides a comprehensible introduction to DPPs, focusing on the intuitions, algorithms, and extensions that are most relevant to the machine learning community, and shows how they can be applied to real-world applications.

Abstract
Determinantal point processes (DPPs) are elegant probabilistic models of repulsion that arise in quantum physics and random matrix theory. In contrast to traditional structured models like Markov random fields, which become intractable and hard to approximate in the presence of negative correlations, DPPs offer efficient and exact algorithms for sampling, marginalization, conditioning, and other inference tasks. We provide a gentle introduction to DPPs, focusing on the intuitions, algorithms, and extensions that are most relevant to the machine learning community, and show how DPPs can be applied to real-world applications like finding diverse sets of high-quality search results, building informative summaries by selecting diverse sentences from documents, modeling non-overlapping human poses in images or video, and automatically building timelines of important news stories.
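To make the repulsion property concrete, here is a minimal sketch (not code from the monograph) of an L-ensemble DPP over three items, where the probability of a subset Y is P(Y) = det(L_Y) / det(L + I). The feature vectors and item indices below are illustrative assumptions; the point is that a subset of near-duplicate items receives much lower probability than a diverse one.

```python
import numpy as np

# Toy L-ensemble: N = 3 items with 2-d feature vectors.
# Items 0 and 1 are near-duplicates; item 2 points in a different direction.
B = np.array([[1.00, 0.00],
              [0.95, 0.31],
              [0.00, 1.00]])
L = B @ B.T  # Gram-matrix kernel: high similarity induces strong repulsion

Z = np.linalg.det(L + np.eye(3))  # normalizer: sums det(L_Y) over all subsets

def subset_prob(Y):
    """P(Y) = det(L_Y) / det(L + I), with det of the empty minor equal to 1."""
    L_Y = L[np.ix_(Y, Y)]
    return np.linalg.det(L_Y) / Z

# The diverse pair {0, 2} is far more likely than the redundant pair {0, 1}.
p_diverse = subset_prob([0, 2])
p_redundant = subset_prob([0, 1])
```

Because det(L_Y) is the squared volume spanned by the selected items' feature vectors, nearly parallel (redundant) vectors span little volume and are jointly suppressed; this is the geometric intuition behind using DPPs for diverse search results and summarization.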
Citations
Journal ArticleDOI
Recent automatic text summarization techniques: a survey
Mahak Gambhir, Vishal Gupta
TL;DR: A comprehensive survey of extractive text summarization approaches developed in the last decade is presented, along with a discussion of useful future directions that can help researchers identify areas where further work is needed.
Proceedings Article
Diverse Sequential Subset Selection for Supervised Video Summarization
TL;DR: This work proposes the sequential determinantal point process (seqDPP), a probabilistic model for diverse sequential subset selection, which heeds the inherent sequential structures in video data, thus overcoming the deficiency of the standard DPP.
Posted Content
Video Summarization with Long Short-term Memory
TL;DR: Long Short-Term Memory (LSTM), a special type of recurrent neural network, is used to model the variable-range dependencies entailed in video summarization, and summarization is further improved by reducing discrepancies in statistical properties across datasets.
Book
Kernel Mean Embedding of Distributions: A Review and Beyond
TL;DR: The kernel mean embedding (KME) as discussed by the authors can be viewed as a generalization of the original feature map of support vector machines (SVMs) and other kernel methods.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Journal ArticleDOI
A Computational Approach to Edge Detection
TL;DR: There is a natural uncertainty principle between detection and localization performance, which are the two main goals, and with this principle a single operator shape is derived which is optimal at any scale.
Proceedings ArticleDOI
Object recognition from local scale-invariant features
TL;DR: Experimental results show that robust object recognition can be achieved in cluttered partially occluded images with a computation time of under 2 seconds.
Proceedings Article
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
TL;DR: This work presents iterative parameter estimation algorithms for conditional random fields and compares the performance of the resulting models to HMMs and MEMMs on synthetic and natural-language data.