Topic

Pairwise comparison

About: Pairwise comparison is a research topic. Over its lifetime, 6,804 publications have been published within this topic, receiving 174,081 citations.


Papers
Journal ArticleDOI
TL;DR: A new preference disaggregation modeling formulation for multiple criteria sorting with a set of additive value functions is introduced, and the applicability of the framework is demonstrated on data involving the classification of cities into liveability classes.

59 citations
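A minimal sketch of the additive-value-function sorting idea the TL;DR refers to, assuming linear marginal value functions and fixed class thresholds; the criteria, weights, and thresholds below are illustrative stand-ins, not quantities learned from preference information as in the paper:

```python
# Illustrative sketch: sorting alternatives with an additive value function.
# The criteria, weights, marginal value functions, and class thresholds are
# made up for demonstration; in preference disaggregation they would be
# inferred from assignment examples given by the decision maker.

def marginal_value(x, worst, best):
    """Linear marginal value function normalised to [0, 1]."""
    return max(0.0, min(1.0, (x - worst) / (best - worst)))

# Hypothetical "liveability" criteria: (worst level, best level, weight).
criteria = {
    "green_space": (0.0, 100.0, 0.40),
    "air_quality": (0.0, 100.0, 0.35),
    "transit":     (0.0, 100.0, 0.25),
}

# Ordered class thresholds on the global value U(a) in [0, 1].
thresholds = [("high", 0.66), ("medium", 0.33), ("low", 0.0)]

def global_value(alternative):
    return sum(w * marginal_value(alternative[c], lo, hi)
               for c, (lo, hi, w) in criteria.items())

def assign_class(alternative):
    u = global_value(alternative)
    for label, t in thresholds:
        if u >= t:
            return label, u

city = {"green_space": 70, "air_quality": 55, "transit": 80}
print(assign_class(city))  # prints the assigned class and global value
```

In the disaggregation setting described in the paper, the value functions and thresholds are the unknowns; the sketch only shows how a fixed additive model assigns alternatives to ordered classes.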

Proceedings ArticleDOI
10 Apr 2011
TL;DR: This work addresses pairwise and (for the first time) triple key establishment problems in wireless sensor networks (WSN) and introduces the novel concept of triple key distribution, in which a common key is established among three nodes, enabling secure passive monitoring of forwarding progress in routing tasks.
Abstract: We address pairwise and (for the first time) triple key establishment problems in wireless sensor networks (WSN). We use combinatorial designs to establish pairwise keys between nodes in a WSN. A BIBD(v, b, r, k, λ) (or t-(v, b, r, k, λ)) design can be mapped to a sensor network, where v represents the size of the key pool, b represents the maximum number of nodes that the network can support, and k represents the size of the key chain. Any pair (or t-subset) of keys occurs together in exactly λ nodes; λ = 2 and λ = 3 are used to establish unique pairwise or triple keys. Our pairwise key distribution is the first one that is fully secure (none of the links among uncompromised nodes is affected) and applicable to mobile sensor networks (as key distribution is independent of the connectivity graph), while preserving low storage, computation and communication requirements. We also use combinatorial trades to establish pairwise keys. This is the first time that trades have been applied to key management. We describe a new construction of Strong Steiner Trades. We introduce a novel concept of triple key distribution, in which a common key is established among three nodes. This allows secure passive monitoring of forwarding progress in routing tasks. We present a polynomial-based approach and a combinatorial approach (using trades) for triple key distribution.

59 citations
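A minimal sketch of the design-to-key-chain mapping described in the abstract above, using the Fano plane (a 2-(7,3,1) design) so that any two nodes share exactly one key; the paper's own constructions use λ = 2 and λ = 3 designs and combinatorial trades, which are not reproduced here, and the key pool and derivation below are illustrative:

```python
# Illustrative sketch (not the paper's exact construction): map a small
# combinatorial design to key chains. In the Fano plane, a 2-(7,3,1) design,
# any two blocks share exactly one point, so any two nodes end up holding
# exactly one common key from the pool.

import hashlib

# Blocks of the Fano plane over points {0,...,6}; each block is one node's key chain.
FANO_BLOCKS = [
    {0, 1, 2}, {0, 3, 4}, {0, 5, 6},
    {1, 3, 5}, {1, 4, 6}, {2, 3, 6}, {2, 4, 5},
]

# Hypothetical key pool: one key per point (demo values only, not secure).
key_pool = {p: hashlib.sha256(f"key-{p}".encode()).digest() for p in range(7)}

def pairwise_key(node_a, node_b):
    """Derive a shared key from the single key-pool entry both nodes hold."""
    shared = FANO_BLOCKS[node_a] & FANO_BLOCKS[node_b]
    assert len(shared) == 1            # guaranteed by the 2-(7,3,1) design
    common = shared.pop()
    return hashlib.sha256(key_pool[common] + b"|pairwise").hexdigest()

# Nodes 0 and 3 both hold key 1, so they derive the same pairwise key.
print(pairwise_key(0, 3))
```

The λ = 2 and λ = 3 designs and the trade-based constructions in the paper give unique pairwise and triple keys rather than the single shared key per pair shown here.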

Posted Content
TL;DR: This survey investigates current deep hashing algorithms, including deep supervised hashing and deep unsupervised hashing, and categorizes deep supervised hashing methods into pairwise methods, ranking-based methods, pointwise methods, and quantization methods according to how the similarities of the learned hash codes are measured.
Abstract: Nearest neighbor search aims to find the data points in a database whose distances to the query are the smallest; this is a fundamental problem in various domains, such as computer vision, recommendation systems, and machine learning. Hashing is one of the most widely used methods because of its computational and storage efficiency. With the development of deep learning, deep hashing methods show more advantages than traditional methods. In this paper, we present a comprehensive survey of deep hashing algorithms. Specifically, we categorize deep supervised hashing methods into pairwise similarity preserving, multiwise similarity preserving, implicit similarity preserving, classification-oriented preserving, and quantization, according to the manner of preserving the similarities. In addition, we introduce some other topics, such as deep unsupervised hashing and multi-modal deep hashing methods. We also present some commonly used public datasets and the schemes used to measure the performance of deep hashing algorithms. Finally, we discuss some potential research directions in the conclusion.

59 citations
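A minimal sketch of a pairwise similarity-preserving hashing loss of the kind the survey categorizes: a generic negative log-likelihood over code inner products plus a quantization penalty, in the spirit of DPSH-style methods. This is an illustrative example, not a specific algorithm from the survey, and the toy inputs stand in for a network's hash-layer output:

```python
# Illustrative sketch of a generic pairwise similarity-preserving hashing loss.
# Given relaxed binary codes u_i in [-1, 1]^K, similar pairs (s_ij = 1) are
# pushed toward large inner products and dissimilar pairs (s_ij = 0) toward
# small ones, with a quantization term pulling codes toward {-1, +1}.

import torch

def pairwise_hashing_loss(codes, labels, quant_weight=0.1):
    """codes: (N, K) tanh outputs of a hash layer; labels: (N,) class ids."""
    # Pairwise similarity matrix: 1 if same class, else 0.
    sim = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    # Half the inner product of code pairs, as in likelihood-based formulations.
    theta = codes @ codes.t() / 2.0
    # Negative log-likelihood of the observed pairwise similarities:
    # log(1 + exp(theta)) - s * theta.
    likelihood = torch.nn.functional.softplus(theta) - sim * theta
    # Quantization term pulls relaxed codes toward {-1, +1}.
    quant = (codes.abs() - 1.0).pow(2).mean()
    return likelihood.mean() + quant_weight * quant

# Toy usage with random "codes" standing in for a network's output.
codes = torch.tanh(torch.randn(8, 16))
labels = torch.randint(0, 3, (8,))
print(pairwise_hashing_loss(codes, labels))
```

Multiwise (ranking-based) and quantization methods surveyed in the paper replace the pairwise term with listwise ranking losses or codebook-based reconstruction errors.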

Journal ArticleDOI
TL;DR: In this paper, the authors extend the QWeighted algorithm for efficient pairwise multiclass voting to the multilabel setting, and evaluate the adapted algorithm on several real-world datasets.

59 citations
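A minimal sketch of plain one-vs-one weighted voting, the procedure that QWeighted accelerates by skipping classifiers that can no longer affect the winner; the pruning itself and the paper's multilabel extension (which separates relevant from irrelevant labels, e.g. via a calibrated artificial label) are not shown, and the scorer below is a hypothetical stand-in:

```python
# Illustrative sketch of pairwise (one-vs-one) weighted voting over labels.
# Each pair of labels has a binary scorer returning the probability that the
# first label "beats" the second on the given instance; labels accumulate
# weighted votes and are ranked by their totals.

from itertools import combinations

def pairwise_vote(labels, score, instance):
    """score(a, b, x) -> probability in [0, 1] that label a beats label b on x."""
    votes = {lab: 0.0 for lab in labels}
    for a, b in combinations(labels, 2):
        p = score(a, b, instance)
        votes[a] += p            # weighted vote for a
        votes[b] += 1.0 - p      # remaining weight goes to b
    return sorted(votes.items(), key=lambda kv: kv[1], reverse=True)

# Toy stand-in scorer: pretend the label "cat" is usually preferred.
def toy_score(a, b, x):
    return 0.9 if a == "cat" else (0.1 if b == "cat" else 0.5)

ranking = pairwise_vote(["cat", "dog", "bird"], toy_score, instance=None)
print(ranking)  # e.g. [('cat', 1.8), ('dog', 0.6), ('bird', 0.6)]
```

QWeighted avoids evaluating all label pairs by repeatedly querying only the classifiers involving the current least-beaten label, which is what makes the voting efficient for many labels.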

Proceedings Article
11 Jul 2009
TL;DR: Two algorithms are derived, based on manifold regularization and log-determinant divergence regularization respectively, that can simultaneously exploit label information, unlabeled examples, and the metrics derived from auxiliary data sets, and that are efficient and flexible.
Abstract: Most existing metric learning methods work by exploiting pairwise constraints over the labeled data and frequently suffer from an insufficiency of training examples. To learn a robust distance metric from few labeled examples, prior knowledge from unlabeled examples, as well as metrics previously derived from auxiliary data sets, can be useful. In this paper, we propose to leverage such auxiliary knowledge to assist distance metric learning, formulated following the regularized loss minimization principle. Two algorithms are derived, based on manifold regularization and the log-determinant divergence regularization technique respectively, which can simultaneously exploit label information (i.e., the pairwise constraints over labeled data), unlabeled examples, and the metrics derived from auxiliary data sets. The proposed methods directly manipulate the auxiliary metrics and require no raw examples from the auxiliary data sets, which makes them efficient and flexible. We conduct extensive evaluations to compare our approaches with a number of competing approaches on the face recognition task. The experimental results show that our approaches can derive reliable distance metrics from limited training examples and are thus superior in terms of accuracy and labeling effort.

59 citations
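A minimal sketch of learning a Mahalanobis metric from pairwise constraints while regularizing toward an auxiliary metric with the log-determinant divergence, one of the two regularizers mentioned above; this is plain projected gradient descent on toy data, not the paper's manifold-regularized formulation or its optimization:

```python
# Illustrative sketch (not the paper's algorithm): learn a Mahalanobis matrix M
# from must-link / cannot-link pairs while staying close to an auxiliary metric
# M0 via the log-determinant (Burg) divergence
#   D_ld(M, M0) = tr(M M0^{-1}) - log det(M M0^{-1}) - d.

import numpy as np

def logdet_grad(M, M0_inv):
    """Gradient of D_ld(M, M0) with respect to M."""
    return M0_inv - np.linalg.inv(M)

def learn_metric(X, similar, dissimilar, M0, margin=1.0, gamma=0.5,
                 lr=0.01, steps=200):
    M = M0.copy()
    M0_inv = np.linalg.inv(M0)
    for _ in range(steps):
        grad = gamma * logdet_grad(M, M0_inv)
        for i, j in similar:                         # pull similar pairs together
            diff = (X[i] - X[j])[:, None]
            grad += diff @ diff.T
        for i, j in dissimilar:                      # push dissimilar pairs apart
            diff = (X[i] - X[j])[:, None]
            if (diff.T @ M @ diff).item() < margin:  # hinge is active
                grad -= diff @ diff.T
        M -= lr * grad
        # Project back onto the positive semidefinite cone.
        w, V = np.linalg.eigh((M + M.T) / 2.0)
        M = (V * np.clip(w, 1e-6, None)) @ V.T
    return M

# Toy data: two tight clusters; the auxiliary metric M0 is simply the identity.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
M = learn_metric(X, similar=[(0, 1), (5, 6)], dissimilar=[(0, 5)], M0=np.eye(2))
print(np.round(M, 3))
```

The paper's manifold-regularized variant additionally ties the metric to the geometry of unlabeled examples, which the sketch omits.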


Network Information
Related Topics (5)
Markov chain: 51.9K papers, 1.3M citations, 81% related
Cluster analysis: 146.5K papers, 2.9M citations, 76% related
Deep learning: 79.8K papers, 2.1M citations, 75% related
Optimization problem: 96.4K papers, 2.1M citations, 74% related
Robustness (computer science): 94.7K papers, 1.6M citations, 74% related
Performance
Metrics
No. of papers in the topic in previous years:
Year    Papers
2024    1
2023    1,305
2022    2,607
2021    581
2020    554
2019    520