Beyond Low-Rank Representations: Orthogonal clustering basis reconstruction with optimized graph structure for multi-view spectral clustering.
Yang Wang, Lin Wu +2 more
TLDR
This paper decomposes LRR into a latent clustered orthogonal representation via low-rank matrix factorization, encoding more flexible cluster structures than LRR does over the primal data objects, and converts the problem of LRR into that of simultaneously learning an orthogonal clustered representation and an optimized local graph structure for each view.
About:
This article is published in Neural Networks. The article was published on 2018-07-01 and is currently open access. It has received 116 citations till now. The article focuses on the topics: Graph (abstract data type) & Orthogonal basis.
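The TLDR above centers on factoring a low-rank representation into an orthogonal clustering basis. As a rough illustration of that idea (not the paper's actual optimization), a minimal sketch using truncated SVD — where the function name and the choice of SVD as the factorization are my own assumptions — looks like this:

```python
import numpy as np

def orthogonal_basis_factorization(Z, k):
    """Illustrative sketch: factor a (low-rank) representation matrix Z
    as Z ~= U_k @ V_k.T with an orthonormal basis U_k (U_k.T @ U_k = I),
    via truncated SVD. k is the target rank / number of clusters."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    U_k = U[:, :k]              # orthogonal clustering basis
    V_k = Vt[:k].T * s[:k]      # coefficient factor absorbing singular values
    return U_k, V_k

# toy check: an exactly rank-3 matrix is recovered from its top-3 factors
rng = np.random.default_rng(0)
Z = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 8))
U_k, V_k = orthogonal_basis_factorization(Z, 3)
print(np.allclose(U_k @ V_k.T, Z))          # True
print(np.allclose(U_k.T @ U_k, np.eye(3)))  # True: orthonormal columns
```

The paper itself learns this factorization jointly with a per-view graph; the SVD here only demonstrates the "orthogonal basis + coefficients" decomposition in isolation.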
Citations
Journal Article (DOI)
GMC: Graph-Based Multi-View Clustering
TL;DR: The proposed general Graph-based Multi-view Clustering (GMC) takes the data graph matrices of all views and fuses them to generate a unified graph matrix, which helps partition the data points naturally into the required number of clusters.
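GMC's core move is fusing the per-view graph matrices into one unified graph. A heavily simplified stand-in for that fusion — a plain weighted average rather than GMC's learned weighting, with the function name my own — can be sketched as:

```python
import numpy as np

def fuse_view_graphs(graphs, weights=None):
    """Fuse per-view affinity graphs into one unified graph by a
    (weighted) average -- a simplified stand-in for GMC's learned fusion."""
    graphs = [np.asarray(G, dtype=float) for G in graphs]
    if weights is None:
        weights = np.ones(len(graphs)) / len(graphs)
    fused = sum(w * G for w, G in zip(weights, graphs))
    return (fused + fused.T) / 2  # keep the fused graph symmetric

# two 2-node view graphs with different edge strengths
G1 = np.array([[0.0, 1.0], [1.0, 0.0]])
G2 = np.array([[0.0, 0.5], [0.5, 0.0]])
print(fuse_view_graphs([G1, G2]))  # off-diagonal entries average to 0.75
```

GMC additionally learns the view weights and constrains the fused graph so its connected components directly give the clusters; the averaging above shows only the fusion step.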
Journal Article (DOI)
Incomplete Multiview Spectral Clustering With Adaptive Graph Learning
TL;DR: The proposed method is the first work that exploits the graph learning and spectral clustering techniques to learn the common representation for incomplete multiview clustering and achieves the best performance in comparison with some state-of-the-art methods.
Journal Article (DOI)
Learning a Joint Affinity Graph for Multiview Subspace Clustering
TL;DR: A low-rank representation model is employed to learn a shared sample representation coefficient matrix that generates the affinity graph, and diversity regularization is used to learn the optimal weights for each view, which can suppress redundancy and enhance diversity among different feature views.
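Turning a self-representation coefficient matrix into an affinity graph, as this paper does, is commonly done with the symmetrization W = (|C| + |C|ᵀ)/2 in subspace clustering. A minimal sketch of that step (the construction is standard; the function name is mine):

```python
import numpy as np

def coefficients_to_affinity(C):
    """Turn a self-representation coefficient matrix C into a symmetric
    nonnegative affinity graph via the common subspace-clustering
    construction W = (|C| + |C|.T) / 2."""
    A = np.abs(C)
    return (A + A.T) / 2

# asymmetric, signed coefficients become a symmetric affinity
C = np.array([[0.0, 0.8], [-0.2, 0.0]])
print(coefficients_to_affinity(C))  # [[0. , 0.5], [0.5, 0. ]]
```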
Journal Article (DOI)
Applications of Generative Adversarial Networks (GANs): An Updated Review
TL;DR: A comprehensive review of the crucial applications of GANs across a variety of areas is presented, along with a study of the techniques and architectures used and the contribution of each application in the real world.
Posted Content
Where-and-When to Look: Deep Siamese Attention Networks for Video-based Person Re-identification.
TL;DR: Wang et al. propose a Siamese attention architecture that jointly learns spatio-temporal video representations and their similarity metrics, enhancing discriminative capability by focusing on distinct regions when measuring similarity with another pedestrian video.
References
Proceedings Article
On Spectral Clustering: Analysis and an algorithm
TL;DR: A simple spectral clustering algorithm that can be implemented in a few lines of Matlab is presented, and tools from matrix perturbation theory are used to analyze the algorithm and give conditions under which it can be expected to do well.
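The "few lines" algorithm referenced here is the Ng–Jordan–Weiss procedure: Gaussian affinities, symmetric normalization, top-k eigenvectors, row normalization, then k-means. A Python rendering of those steps (a sketch, not the authors' Matlab; the Gaussian bandwidth and the use of SciPy's k-means are my choices):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def ng_jordan_weiss(X, k, sigma=1.0, rng_seed=0):
    """Minimal sketch of Ng-Jordan-Weiss spectral clustering."""
    # 1. Gaussian affinity matrix with zero diagonal
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    # 2. Symmetric normalization  D^{-1/2} A D^{-1/2}
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(A.sum(1), 1e-12))
    L = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # 3. Top-k eigenvectors, rows normalized to unit length
    _, v = np.linalg.eigh(L)
    Y = v[:, -k:]
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)
    # 4. k-means on the embedded rows gives the cluster labels
    _, labels = kmeans2(Y, k, minit="++", seed=rng_seed)
    return labels

# two well-separated groups of points should receive different labels
X = np.vstack([np.zeros((5, 2)), 10.0 + np.zeros((5, 2))])
labels = ng_jordan_weiss(X, 2)
print(labels[:5], labels[5:])
```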
Journal Article (DOI)
A Singular Value Thresholding Algorithm for Matrix Completion
TL;DR: This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank, and develops a framework in which one can understand these algorithms in terms of well-known Lagrange multiplier algorithms.
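The building block of the SVT algorithm summarized above is the singular value shrinkage operator — soft-thresholding the singular values of a matrix, which is the proximal operator of the (scaled) nuclear norm. A minimal sketch of just that operator (the function name is mine; the full SVT iteration around it is omitted):

```python
import numpy as np

def singular_value_threshold(Y, tau):
    """Shrinkage operator at the core of SVT: soft-threshold the singular
    values of Y by tau (the proximal operator of tau * nuclear norm)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # shrink, clipping at zero
    return (U * s_shrunk) @ Vt

# on a diagonal matrix the singular values 3, 1, 0.2 shrink to 2.5, 0.5, 0
Y = np.diag([3.0, 1.0, 0.2])
print(np.round(singular_value_threshold(Y, 0.5), 2))
```

Because small singular values are zeroed outright, repeated application inside the SVT iteration drives iterates toward low rank, which is why the method is so effective on problems whose optimal solution has low rank.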
Journal Article (DOI)
Top 10 algorithms in data mining
Xindong Wu, Vipin Kumar, J. Ross Quinlan, Joydeep Ghosh, Qiang Yang, Hiroshi Motoda, Geoffrey J. McLachlan, Angus S. K. Ng, Bing Liu, Philip S. Yu, Zhi-Hua Zhou, Michael Steinbach, David J. Hand, Dan Steinberg +13 more
TL;DR: This paper presents the top 10 data mining algorithms identified by the IEEE International Conference on Data Mining (ICDM) in December 2006: C4.5, k-Means, SVM, Apriori, EM, PageRank, AdaBoost, kNN, Naive Bayes, and CART.
Journal Article (DOI)
Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
TL;DR: It is shown that if a certain restricted isometry property holds for the linear transformation defining the constraints, the minimum-rank solution can be recovered by solving a convex optimization problem, namely, the minimization of the nuclear norm over the given affine space.