Open Access Journal Article (DOI)

Multidimensional scaling

TLDR
Key aspects of performing MDS are discussed, such as methods that can be used to collect similarity estimates, analytic techniques for treating proximity data, and various concerns regarding interpretation of the MDS output.
Abstract
The concept of similarity, or a sense of 'sameness' among things, is pivotal to theories in the cognitive sciences and beyond. Similarity, however, is a difficult thing to measure. Multidimensional scaling (MDS) is a tool by which researchers can obtain quantitative estimates of similarity among groups of items. More formally, MDS refers to a set of statistical techniques that are used to reduce the complexity of a data set, permitting visual appreciation of the underlying relational structures contained therein. The current paper provides an overview of MDS. We discuss key aspects of performing this technique, such as methods that can be used to collect similarity estimates, analytic techniques for treating proximity data, and various concerns regarding interpretation of the MDS output. MDS analyses of two novel data sets are also included, highlighting in step-by-step fashion how MDS is performed, and key issues that may arise during analysis. WIREs Cogn Sci 2013, 4:93-103. doi: 10.1002/wcs.1203 This article is categorized under: Psychology > Perception and Psychophysics.
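As an illustration of the analytic step described above, the following is a minimal sketch (not taken from the article) that fits a two-dimensional MDS solution to a small, made-up dissimilarity matrix with scikit-learn; the items and proximity values are hypothetical.

import numpy as np
from sklearn.manifold import MDS

# Symmetric dissimilarity (proximity) matrix for four hypothetical items:
# zeros on the diagonal, larger values mean "less similar".
dissimilarities = np.array([
    [0.0, 0.5, 0.9, 0.8],
    [0.5, 0.0, 0.7, 0.6],
    [0.9, 0.7, 0.0, 0.3],
    [0.8, 0.6, 0.3, 0.0],
])

# dissimilarity='precomputed' tells the estimator that we are supplying
# proximities directly rather than raw feature vectors.
mds = MDS(n_components=2, dissimilarity='precomputed', random_state=0)
coords = mds.fit_transform(dissimilarities)

print(coords)       # one 2-D point per item, for plotting the configuration
print(mds.stress_)  # badness-of-fit of the configuration (lower is better)

The resulting two-dimensional coordinates can then be plotted and inspected for interpretable axes or clusters, which is the interpretation step the article discusses.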



Citations
Book

Applied Predictive Modeling

Max Kuhn et al.
TL;DR: This research presents a novel and scalable approach called "Smartfitting" that automates the labor-intensive, time-consuming, and therefore expensive process of designing and implementing statistical regression models.
Proceedings Article (DOI)

LINE: Large-scale Information Network Embedding

TL;DR: The authors propose LINE, a network embedding method suitable for arbitrary types of information networks (undirected, directed, and/or weighted) that optimizes a carefully designed objective function preserving both local and global network structure.
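For orientation (standard notation reconstructed here, not quoted from this page), the first-order proximity part of LINE's objective minimizes

O_1 = -\sum_{(i,j) \in E} w_{ij} \log p_1(v_i, v_j), \qquad p_1(v_i, v_j) = \frac{1}{1 + \exp(-\mathbf{u}_i^{\top} \mathbf{u}_j)},

where \mathbf{u}_i is the embedding of vertex v_i and w_{ij} is the edge weight; an analogous second term handles second-order (neighborhood) proximity.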
Journal Article (DOI)

Deep learning for time series classification: a review

TL;DR: This article presents the most exhaustive study of DNNs for TSC to date, training 8730 deep learning models on 97 time series datasets, and provides an open-source deep learning framework to the TSC community.
Proceedings Article (DOI)

Deep Metric Learning via Lifted Structured Feature Embedding

TL;DR: In this article, the authors propose lifting the vector of pairwise distances within a batch to the matrix of pairwise distances, which enables the algorithm to learn a state-of-the-art feature embedding by optimizing a novel structured prediction objective on the lifted problem.
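As an illustrative sketch (not the paper's implementation), the "lifting" step amounts to forming the full matrix of pairwise distances within a mini-batch of embeddings, over which the structured objective is then defined; the names below are hypothetical.

import numpy as np

def pairwise_distance_matrix(embeddings):
    # embeddings: (batch_size, dim) array of feature vectors
    sq_norms = np.sum(embeddings ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * embeddings @ embeddings.T
    return np.sqrt(np.maximum(sq_dists, 0.0))  # clamp tiny negatives from round-off

batch = np.random.default_rng(0).normal(size=(8, 16))  # 8 items, 16-D embeddings
D = pairwise_distance_matrix(batch)                    # (8, 8) lifted distance matrix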
Proceedings Article (DOI)

GraRep: Learning Graph Representations with Global Structural Information

TL;DR: A novel model for learning vertex representations of weighted graphs that integrates global structural information of the graph into the learning process and significantly outperforms other state-of-the-art methods on downstream tasks.
References
Journal Article (DOI)

Features of Similarity

Amos Tversky
01 Jul 1977
TL;DR: The metric and dimensional assumptions that underlie the geometric representation of similarity are questioned on both theoretical and empirical grounds and a set of qualitative assumptions are shown to imply the contrast model, which expresses the similarity between objects as a linear combination of the measures of their common and distinctive features.
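In its standard form, the contrast model expresses this linear combination as

s(a, b) = \theta f(A \cap B) - \alpha f(A \setminus B) - \beta f(B \setminus A),

where A and B are the feature sets of objects a and b, f is a measure over feature sets, and \theta, \alpha, \beta \ge 0 weight the common and distinctive components.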
Journal Article (DOI)

Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis

TL;DR: The fundamental hypothesis is that dissimilarities and distances are monotonically related, and a quantitative, intuitively satisfying measure of goodness of fit is defined to this hypothesis.
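The goodness-of-fit measure referred to here is Kruskal's stress, commonly written as

\text{Stress} = \sqrt{\frac{\sum_{i<j} (d_{ij} - \hat{d}_{ij})^2}{\sum_{i<j} d_{ij}^2}},

where d_{ij} are the distances in the fitted configuration and \hat{d}_{ij} are the disparities obtained by monotone regression on the dissimilarities.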
Journal Article (DOI)

Nonmetric multidimensional scaling: A numerical method

TL;DR: This paper describes the numerical methods required by the nonmetric approach to multidimensional scaling; the rationale for the approach has appeared previously.
Journal Article (DOI)

Analysis of individual differences in multidimensional scaling via an n-way generalization of 'Eckart-Young' decomposition

TL;DR: In this paper, an individual differences model for multidimensional scaling is outlined in which individuals are assumed differentially to weight the several dimensions of a common "psychological space" and a corresponding method of analyzing similarities data is proposed, involving a generalization of Eckart-Young analysis to decomposition of three-way (or higher-way) tables.
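The weighted ('INDSCAL') distance model underlying this analysis is commonly written as

d_{ij}^{(k)} = \sqrt{\sum_{a=1}^{r} w_{ka} (x_{ia} - x_{ja})^2},

where x_{ia} is the coordinate of stimulus i on dimension a of the common psychological space and w_{ka} is the weight subject k places on that dimension.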