Open Access · Book Chapter · DOI

Image annotation using metric learning in semantic neighbourhoods

TLDR
2PKNN, a two-step variant of the classical K-nearest neighbour algorithm, is proposed that performs comparable to the current state-of-the-art on three challenging image annotation datasets, and shows significant improvements after metric learning.
Abstract
Automatic image annotation aims at predicting a set of textual labels for an image that describe its semantics. These are usually taken from an annotation vocabulary of few hundred labels. Because of the large vocabulary, there is a high variance in the number of images corresponding to different labels ("class-imbalance"). Additionally, due to the limitations of manual annotation, a significant number of available images are not annotated with all the relevant labels ("weak-labelling"). These two issues badly affect the performance of most of the existing image annotation models. In this work, we propose 2PKNN, a two-step variant of the classical K-nearest neighbour algorithm, that addresses these two issues in the image annotation task. The first step of 2PKNN uses "image-to-label" similarities, while the second step uses "image-to-image" similarities; thus combining the benefits of both. Since the performance of nearest-neighbour based methods greatly depends on how features are compared, we also propose a metric learning framework over 2PKNN that learns weights for multiple features as well as distances together. This is done in a large margin set-up by generalizing a well-known (single-label) classification metric learning algorithm for multi-label prediction. For scalability, we implement it by alternating between stochastic sub-gradient descent and projection steps. Extensive experiments demonstrate that, though conceptually simple, 2PKNN alone performs comparable to the current state-of-the-art on three challenging image annotation datasets, and shows significant improvements after metric learning.
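The two-step idea in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (function and variable names are my own, not the paper's): step one builds a "semantic neighbourhood" by keeping, for every label, the K1 training images carrying that label that are closest to the query (image-to-label); step two scores labels by summing distance-weighted votes from those neighbourhood images (image-to-image). It omits the paper's metric learning and exact weighting.

```python
import numpy as np

def two_pass_knn(query, images, labels_per_image, vocab, K1=3):
    """Hedged sketch of a 2PKNN-style two-step nearest-neighbour annotator.

    Step 1 (image-to-label): for each label in the vocabulary, keep the
    K1 training images annotated with that label that are closest to the
    query, forming a "semantic neighbourhood" that counters class-imbalance.
    Step 2 (image-to-image): score each label by summing similarity
    contributions from neighbourhood images that carry it.
    """
    dists = np.linalg.norm(images - query, axis=1)
    neighbourhood = set()
    for label in vocab:
        idx = [i for i, ls in enumerate(labels_per_image) if label in ls]
        idx.sort(key=lambda i: dists[i])
        neighbourhood.update(idx[:K1])      # K1 closest carriers of this label
    scores = {label: 0.0 for label in vocab}
    for i in neighbourhood:
        w = np.exp(-dists[i])               # closer images contribute more
        for label in labels_per_image[i]:
            scores[label] += w
    return sorted(scores, key=scores.get, reverse=True)
```

Because every label is guaranteed K1 representatives in step one, rare labels are not drowned out by frequent ones, which is the point of the first pass.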



Citations

IPL at ImageCLEF 2018: A kNN-based Concept Detection Approach.

TL;DR: A k-NN based concept detection algorithm was used to automatically predict multiple concepts in medical images; the visual representation of the images was based on the bag-of-visual-words and bag-of-colors models.
Journal ArticleDOI

Trigraph Regularized Collective Matrix Tri-Factorization Framework on Multiview Features for Multilabel Image Annotation

TL;DR: This paper proposes a novel image annotation framework that exploits multiple sources of information from the data to achieve better annotation performance than most state-of-the-art methods; the annotation process is formulated as an optimization problem solved by an iterative algorithm, and the correctness of the proposed method is proved mathematically.

Recent advances on supervised distance metric learning algorithms

TL;DR: Recently developed algorithms for supervised distance metric learning are reviewed, organized by whether they rely on pairwise or non-pairwise constraints; representative algorithms are introduced and their respective pros and cons are analyzed.
Journal ArticleDOI

Designing a Symmetric Classifier for Image Annotation using Multi-Layer Sparse Coding

TL;DR: A novel annotation method that employs two layers of sparse coding and performs coarse-to-fine labeling; it not only outperforms various previously proposed annotation systems, but also achieves a symmetric response in terms of precision and recall.
Journal ArticleDOI

Image tag-ranking via pairwise supervision based semi-supervised model

TL;DR: A novel model that uses images with ranked tag lists as its supervision information, exploiting both the relevance of tags to images and the structure of the ranked lists to capture the intrinsic ranking of tags.
References
Proceedings Article

Distance Metric Learning for Large Margin Nearest Neighbor Classification

TL;DR: In this article, a Mahalanobis distance metric for k-NN classification is trained with the goal that the k nearest neighbours always belong to the same class while examples from different classes are separated by a large margin.
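The Mahalanobis metric this TL;DR refers to is conventionally parameterized as M = LᵀL, so that learning L amounts to learning a linear warp of the feature space. The snippet below is only a sketch of that parameterization (the function name is hypothetical); actually learning L via the large-margin objective is what the cited paper contributes and is not shown here.

```python
import numpy as np

def mahalanobis_sq(x, y, L):
    """Squared Mahalanobis distance with M = L^T L.

    With L = identity this reduces to squared Euclidean distance;
    a learned L stretches or suppresses directions so that same-class
    neighbours end up close and differently-labelled "impostors" are
    pushed beyond a margin.
    """
    diff = L @ (x - y)          # warp the difference vector
    return float(diff @ diff)   # squared length in the warped space
```

For instance, an L that zeroes out a feature dimension makes the metric ignore that feature entirely, which is how metric learning can down-weight uninformative features.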
Journal ArticleDOI

Distance Metric Learning for Large Margin Nearest Neighbor Classification

TL;DR: This paper shows how to learn a Mahalanobis distance metric for kNN classification from labeled examples in a globally integrated manner and finds that metrics trained in this way lead to significant improvements in kNN Classification.
Proceedings ArticleDOI

Labeling images with a computer game

TL;DR: A new interactive system: a game that is fun to play yet produces valuable output, addressing the image-labeling problem by encouraging people to do the work through their desire to be entertained.
Journal ArticleDOI

Pegasos: primal estimated sub-gradient solver for SVM

TL;DR: A simple and effective stochastic sub-gradient descent algorithm for solving the optimization problem cast by Support Vector Machines, which is particularly well suited for large text classification problems, and demonstrates an order-of-magnitude speedup over previous SVM learning methods.
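The Pegasos recipe summarized here — a stochastic sub-gradient step on the hinge loss with a 1/(λt) step size, followed by a projection — is the same alternation the main paper adapts for its metric learning. A minimal sketch for a linear SVM (names and constants are my own, and this simplified single-example variant omits mini-batching and averaging):

```python
import numpy as np

def pegasos(X, y, lam=0.1, T=1000, seed=0):
    """Sketch of Pegasos-style stochastic sub-gradient descent for a
    linear SVM. X: (n, d) feature matrix; y: labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)                # sample one training example
        eta = 1.0 / (lam * t)              # decaying step size 1/(lambda*t)
        if y[i] * (X[i] @ w) < 1:          # margin violated: hinge sub-gradient
            w = (1 - eta * lam) * w + eta * y[i] * X[i]
        else:                              # only the regularizer contributes
            w = (1 - eta * lam) * w
        norm = np.linalg.norm(w)           # project onto ball of radius 1/sqrt(lam)
        if norm > 1.0 / np.sqrt(lam):
            w *= (1.0 / np.sqrt(lam)) / norm
    return w
```

The projection step bounds the iterates, which is what gives the method its runtime guarantee independent of the training-set size.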
Book ChapterDOI

Object Recognition as Machine Translation: Learning a Lexicon for a Fixed Image Vocabulary

TL;DR: This work shows how to cluster words that are individually difficult to predict into clusters that can be predicted well; for example, the distinction between "train" and "locomotive" cannot be predicted using the current set of features, but the underlying concept can.