Open Access Book Chapter

Image annotation using metric learning in semantic neighbourhoods

TL;DR
2PKNN, a two-step variant of the classical K-nearest neighbour algorithm, is proposed; it performs comparably to the current state-of-the-art on three challenging image annotation datasets and shows significant improvements after metric learning.
Abstract
Automatic image annotation aims at predicting a set of textual labels for an image that describe its semantics. These are usually taken from an annotation vocabulary of a few hundred labels. Because of the large vocabulary, there is a high variance in the number of images corresponding to different labels ("class-imbalance"). Additionally, due to the limitations of manual annotation, a significant number of available images are not annotated with all the relevant labels ("weak-labelling"). These two issues badly affect the performance of most existing image annotation models. In this work, we propose 2PKNN, a two-step variant of the classical K-nearest neighbour algorithm, that addresses these two issues in the image annotation task. The first step of 2PKNN uses "image-to-label" similarities, while the second step uses "image-to-image" similarities, thus combining the benefits of both. Since the performance of nearest-neighbour based methods greatly depends on how features are compared, we also propose a metric learning framework over 2PKNN that jointly learns weights for multiple features as well as distances. This is done in a large-margin set-up by generalizing a well-known (single-label) classification metric learning algorithm for multi-label prediction. For scalability, we implement it by alternating between stochastic sub-gradient descent and projection steps. Extensive experiments demonstrate that, though conceptually simple, 2PKNN alone performs comparably to the current state-of-the-art on three challenging image annotation datasets, and shows significant improvements after metric learning.
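To make the two-step idea concrete, the following is a minimal, illustrative sketch of a 2PKNN-style prediction, assuming feature vectors, a binary label matrix and a plain Euclidean distance; the function and variable names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def two_pass_knn_scores(x_query, X_train, Y_train, K1=5, K2=None):
    """Illustrative 2PKNN-style sketch (not the authors' code).

    X_train : (N, D) training feature vectors
    Y_train : (N, L) binary label matrix (weak labels allowed)
    K1      : per-label neighbourhood size used in the first pass
    K2      : optional cap on neighbours used in the second pass
    """
    dists = np.linalg.norm(X_train - x_query, axis=1)  # image-to-image distances
    num_labels = Y_train.shape[1]

    # Pass 1: for every label, keep its K1 closest training images. This builds a
    # semantic neighbourhood that is balanced across labels, which is how the
    # method counters class imbalance and weak labelling.
    selected = set()
    for l in range(num_labels):
        idx_l = np.flatnonzero(Y_train[:, l] == 1)
        if idx_l.size:
            selected.update(idx_l[np.argsort(dists[idx_l])[:K1]].tolist())
    selected = np.array(sorted(selected))

    # Pass 2: distance-weighted voting over the selected images only
    # (image-to-image similarities), optionally truncated to the K2 nearest.
    order = selected[np.argsort(dists[selected])]
    if K2 is not None:
        order = order[:K2]
    weights = np.exp(-dists[order])          # similarity weights
    scores = weights @ Y_train[order]        # per-label scores, shape (L,)
    return scores / (weights.sum() + 1e-12)
```

The top-scoring labels would then be assigned to the query image; the metric-learning stage described in the abstract would replace the plain Euclidean distance above with a learned, feature-weighted one.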



Citations
Proceedings Article

Image Semantic Distance Metric Learning Approach for Large-scale Automatic Image Annotation

Cong Jin et al.
TL;DR: The experimental results confirm that the proposed image semantic distance metric learning (ISDML) approach can improve the efficiency of large-scale automatic image annotation (AIA) and achieves better annotation performance than other state-of-the-art AIA approaches.
Journal Article

Deep Convolutional Neural Network with KNN Regression for Automatic Image Annotation

TL;DR: Zhang et al. propose a hybrid approach that combines the advantages of both CNNs and the conventional concept-to-image assignment approach, achieving F1 scores of 58.89% and 80.24% on the Corel 5k and MSRC v2 datasets, respectively.
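As a hedged sketch of the general recipe the TL;DR describes (CNN features fed into k-NN regression over tag vectors), the snippet below uses scikit-learn's KNeighborsRegressor; the feature-extraction step is left as an assumption and none of this is taken from the cited paper.

```python
from sklearn.neighbors import KNeighborsRegressor

def knn_tag_regression(train_feats, train_tags, test_feats, k=10):
    """Predict soft tag scores by k-NN regression over CNN features.

    train_feats : (N, D) CNN feature vectors of training images (assumed to come
                  from any pretrained network used as a fixed feature extractor)
    train_tags  : (N, L) binary tag matrix
    test_feats  : (M, D) CNN feature vectors of test images
    """
    knn = KNeighborsRegressor(n_neighbors=k, weights="distance", metric="cosine")
    knn.fit(train_feats, train_tags)
    return knn.predict(test_feats)  # (M, L) soft scores; threshold or keep top-n tags
```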
Posted Content

VSE-ens: Visual-Semantic Embeddings with Efficient Negative Sampling

TL;DR: In this article, a fast adaptive negative sampler is proposed that chooses the negative examples most likely to violate the ranking constraints, according to the latent factors of images, and that scales linearly to large datasets.
Book Chapter

Nearest Neighbor with Multi-feature Metric for Image Annotation

TL;DR: This paper proposes a novel Nearest Neighbor method based on a multi-feature distance metric that takes full advantage of different and complementary features, mitigates the semantic issues caused by intra-class variations and inter-class similarities, and improves image annotation performance.
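One common way to realise such a multi-feature metric, sketched below under the assumption that each feature channel already provides its own base distance, is a non-negative weighted combination; the weights could come from metric learning or validation, and the names here are illustrative rather than taken from the cited paper.

```python
import numpy as np

def combined_distance(feat_dists, w):
    """Weighted combination of per-feature base distances.

    feat_dists : list of F arrays, each of shape (N,), holding one feature's
                 distances between a query and N training images
                 (e.g. colour histogram, texture, SIFT bag-of-words)
    w          : (F,) feature weights, e.g. learned by metric learning
    """
    w = np.maximum(np.asarray(w, dtype=float), 0.0)  # non-negative weights keep the combination well behaved
    w = w / (w.sum() + 1e-12)                        # normalise so the overall scale stays comparable
    return sum(wi * d for wi, d in zip(w, feat_dists))
```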
Journal Article

An novel approach to extract the content retrieval with the image perception using collaborative community oriented sifting (CCOS)

TL;DR: This work proposes image-based content retrieval for a given end-user search query and introduces a standardization step that enhances the precision of conventional community-oriented sifting procedures.
References
Proceedings Article

Distance Metric Learning for Large Margin Nearest Neighbor Classification

TL;DR: In this article, a Mahalanobis distance metric for k-NN classification is trained with the goal that the k nearest neighbours always belong to the same class, while examples from different classes are separated by a large margin.
Journal Article

Distance Metric Learning for Large Margin Nearest Neighbor Classification

TL;DR: This paper shows how to learn a Mahalanobis distance metric for kNN classification from labeled examples in a globally integrated manner, and finds that metrics trained in this way lead to significant improvements in kNN classification.
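For context, the large-margin objective commonly associated with LMNN can be written roughly as below; this is standard notation summarised from memory, not a quotation from the cited paper.

```latex
\min_{M \succeq 0}\;
\sum_{j \rightsquigarrow i} d_M(x_i, x_j)
\;+\; \mu \sum_{j \rightsquigarrow i} \sum_{l} (1 - y_{il})
\bigl[\, 1 + d_M(x_i, x_j) - d_M(x_i, x_l) \,\bigr]_{+},
\qquad
d_M(x_i, x_j) = (x_i - x_j)^{\top} M \, (x_i - x_j)
```

Here $j \rightsquigarrow i$ denotes that $x_j$ is a same-class "target neighbour" of $x_i$ (pulled close by the first term), $y_{il} = 1$ iff $x_i$ and $x_l$ share a class, the hinge term pushes differently-labelled impostors at least one unit of margin farther away than the target neighbours, $\mu$ trades off the two terms, and $M \succeq 0$ keeps $d_M$ a valid pseudo-metric.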
Proceedings Article

Labeling images with a computer game

TL;DR: A new interactive system is presented: a game that is fun and produces valuable output, addressing the image-labeling problem by encouraging people to do the work through their desire to be entertained.
Journal Article

Pegasos: primal estimated sub-gradient solver for SVM

TL;DR: A simple and effective stochastic sub-gradient descent algorithm for solving the optimization problem cast by Support Vector Machines, which is particularly well suited for large text classification problems, and demonstrates an order-of-magnitude speedup over previous SVM learning methods.
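A minimal sketch of the Pegasos-style update for a linear SVM is given below, assuming labels in {-1, +1} and the single-example variant; the hyper-parameters and names are illustrative, not taken from the paper's experiments.

```python
import numpy as np

def pegasos_train(X, y, lam=0.01, T=10000, seed=0):
    """Stochastic sub-gradient descent on the regularised hinge loss.

    X   : (N, D) training samples
    y   : (N,) labels in {-1, +1}
    lam : regularisation parameter, T : number of iterations
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    w = np.zeros(D)
    for t in range(1, T + 1):
        i = rng.integers(N)                    # pick one random example
        eta = 1.0 / (lam * t)                  # decaying step size
        margin = y[i] * X[i].dot(w)            # evaluate the sub-gradient at w_t
        w *= (1.0 - eta * lam)                 # step from the L2 regulariser
        if margin < 1.0:                       # hinge loss active for this example
            w += eta * y[i] * X[i]
        # projection step: keep ||w|| <= 1/sqrt(lam)
        norm = np.linalg.norm(w)
        radius = 1.0 / np.sqrt(lam)
        if norm > radius:
            w *= radius / norm
    return w
```

The abstract's "alternating between stochastic sub-gradient descent and projection steps" follows the same pattern, with the projection applied to the metric parameters instead of w.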
Book Chapter

Object Recognition as Machine Translation: Learning a Lexicon for a Fixed Image Vocabulary

TL;DR: This work shows how to cluster words that are individually difficult to predict into clusters that can be predicted well; for example, the system cannot predict the distinction between "train" and "locomotive" using the current set of features, but it can predict the underlying concept.