Open Access · Proceedings ArticleDOI

Deep Clustering via Joint Convolutional Autoencoder Embedding and Relative Entropy Minimization

TLDR
A new clustering model, DEeP Embedded Regularized ClusTering (DEPICT), is proposed that efficiently maps data into a discriminative embedding subspace and precisely predicts cluster assignments; experiments indicate DEPICT's superiority and faster running time in real-world clustering tasks, where no labeled data is available for hyper-parameter tuning.
Abstract
In this paper, we propose a new clustering model, called DEeP Embedded Regularized ClusTering (DEPICT), which efficiently maps data into a discriminative embedding subspace and precisely predicts cluster assignments. DEPICT generally consists of a multinomial logistic regression function stacked on top of a multi-layer convolutional autoencoder. We define a clustering objective function using relative entropy (KL divergence) minimization, regularized by a prior for the frequency of cluster assignments. An alternating strategy is then derived to optimize the objective by updating parameters and estimating cluster assignments. Furthermore, we employ the reconstruction loss functions in our autoencoder, as a data-dependent regularization term, to prevent the deep embedding function from overfitting. In order to benefit from end-to-end optimization and eliminate the necessity for layer-wise pre-training, we introduce a joint learning framework to minimize the unified clustering and reconstruction loss functions together and train all network layers simultaneously. Experimental results indicate the superiority and faster running time of DEPICT in real-world clustering tasks, where no labeled data is available for hyper-parameter tuning.
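To make the objective concrete, the NumPy sketch below illustrates a DEPICT-style loss: a relative-entropy (KL) clustering term computed against an auxiliary target distribution, a prior-frequency regularizer, and a reconstruction term from the autoencoder. This is a minimal sketch under assumptions: the target-distribution formula, the function names, and the weighting `alpha` are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def softmax(z):
    # Row-wise softmax: z is (n_samples, n_clusters).
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def depict_style_loss(logits, recon, x, alpha=1.0):
    """Hedged sketch of a DEPICT-style objective (not the paper's exact code).

    logits : (n, k) output of the softmax head stacked on the encoder
    recon  : (n, d) decoder reconstruction of the input batch
    x      : (n, d) input batch
    """
    p = softmax(logits)                        # soft cluster assignments
    # Auxiliary target distribution q: sharpen p and normalize by cluster
    # size (a common choice in deep clustering; DEPICT's exact q may differ).
    q = p / np.sqrt(p.sum(axis=0, keepdims=True))
    q = q / q.sum(axis=1, keepdims=True)
    # Relative-entropy term, written as cross-entropy (equal to KL(q || p)
    # up to a constant in q).
    kl = -np.mean(np.sum(q * np.log(p + 1e-12), axis=1))
    # Regularizer: KL between empirical cluster frequencies f and a uniform
    # prior u, discouraging degenerate assignments.
    f = q.mean(axis=0)
    u = np.full_like(f, 1.0 / f.size)
    prior = np.sum(f * np.log((f + 1e-12) / u))
    # Data-dependent regularization: autoencoder reconstruction loss.
    rec = np.mean((x - recon) ** 2)
    return kl + prior + alpha * rec
```

In the joint-learning framework described above, a loss of this shape is minimized end-to-end over all layers at once, with cluster assignments re-estimated in alternation, so no layer-wise pre-training is needed.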


Citations
Journal ArticleDOI

Deep Learning for Anomaly Detection: A Review

TL;DR: This paper presents a comprehensive survey of deep anomaly detection, organizing the methods into a taxonomy of 3 high-level categories and 11 fine-grained categories and reviewing the advancements in each.
Journal ArticleDOI

A Survey of Clustering With Deep Learning: From the Perspective of Network Architecture

TL;DR: This paper gives a systematic survey of clustering with deep learning from the perspective of network architecture and introduces the preliminary knowledge needed for a better understanding of the field.
Posted Content

Contrastive Clustering

TL;DR: A one-stage online clustering method called Contrastive Clustering (CC) is proposed, which explicitly performs instance- and cluster-level contrastive learning and remarkably outperforms 17 competitive clustering methods on six challenging image benchmarks.
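For context, instance-level contrastive learning is commonly implemented with an NT-Xent (normalized temperature-scaled cross-entropy) loss over two augmented views; the NumPy sketch below shows that generic loss. CC additionally applies contrastive learning at the cluster level, a detail this instance-level sketch does not reproduce, and the function name and temperature default are illustrative assumptions.

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """Generic NT-Xent contrastive loss over two augmented views.

    z1, z2 : (n, d) embeddings of the same n samples under two
    augmentations; (z1[i], z2[i]) are the positive pairs.
    """
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    n = z1.shape[0]
    # The positive for row i is row (i + n) mod 2n.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    logprob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(logprob[np.arange(2 * n), pos])
```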
Proceedings ArticleDOI

Deep Spectral Clustering Using Dual Autoencoder Network

TL;DR: A joint learning framework for discriminative embedding and spectral clustering is proposed, which significantly outperforms state-of-the-art clustering approaches and is more robust to noise.
Proceedings ArticleDOI

Unsupervised Domain Adaptation via Structurally Regularized Deep Clustering

TL;DR: This work proposes Structurally Regularized Deep Clustering (SRDC), which enhances target discrimination by clustering intermediate network features and enhances structural regularization by soft selection of less divergent source examples.
References
Proceedings Article

Adam: A Method for Stochastic Optimization

TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
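The Adam update itself is compact; the NumPy sketch below shows the bias-corrected moment estimates, using the commonly reported defaults (alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8).

```python
import numpy as np

def adam_step(theta, grad, m, v, t, alpha=1e-3,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; m and v are running first/second moment estimates."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (mean)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (uncentered)
    m_hat = m / (1 - beta1 ** t)                 # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

Here `t` is the 1-indexed step count; the bias correction matters most early in training, while the moment estimates are still near their zero initialization.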
Journal ArticleDOI

Gradient-based learning applied to document recognition

TL;DR: In this article, multilayer networks trained with gradient-based learning are shown to synthesize complex decision surfaces that classify high-dimensional patterns such as handwritten characters, and a graph transformer network (GTN) paradigm is proposed for globally training multi-module document recognition systems.
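As a point of reference, a convolutional classifier in the LeNet-5 spirit can be written in a few lines; the PyTorch sketch below assumes 32x32 grayscale inputs and classic LeNet-5 layer sizes, and is a hedged illustration, not the paper's exact architecture or its graph transformer machinery.

```python
import torch.nn as nn

# LeNet-style convolutional network for 32x32 grayscale character images;
# layer sizes follow the classic LeNet-5 layout, not the paper verbatim.
lenet = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),   # -> 6x14x14
    nn.Conv2d(6, 16, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),  # -> 16x5x5
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120), nn.Tanh(),
    nn.Linear(120, 84), nn.Tanh(),
    nn.Linear(84, 10),  # 10 character classes
)
```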
Journal Article

Dropout: a simple way to prevent neural networks from overfitting

TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
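Dropout is simple to state in code: during training each unit is kept with probability p and activations are rescaled so expected values match at test time. The NumPy sketch below uses the "inverted" formulation common in modern frameworks; the paper's original formulation instead scales the weights at test time.

```python
import numpy as np

def dropout(x, p_keep=0.5, training=True, rng=None):
    """Inverted dropout: zero units with prob 1 - p_keep, rescale by p_keep."""
    if not training:
        return x                        # identity at test time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) < p_keep
    return x * mask / p_keep            # rescale so E[output] == E[input]
```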
Journal ArticleDOI

Normalized cuts and image segmentation

TL;DR: This work treats image segmentation as a graph partitioning problem and proposes a novel global criterion, the normalized cut, for segmenting the graph, which measures both the total dissimilarity between the different groups and the total similarity within the groups.
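The criterion is worth writing out. For a graph with vertex set V partitioned into A and B,

```latex
\mathrm{Ncut}(A,B) \;=\; \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(A,V)} \;+\; \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(B,V)},
```

where cut(A,B) sums edge weights crossing the partition and assoc(A,V) sums all weights incident to A. Minimizing Ncut exactly is NP-hard; the paper relaxes it to the generalized eigenproblem (D - W)y = lambda * D y, with W the affinity matrix and D its degree matrix, and reads the partition off the second-smallest eigenvector.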
Journal ArticleDOI

Least squares quantization in PCM

TL;DR: In this article, the author derives necessary conditions that the quanta and associated quantization intervals of an optimum finite quantization scheme must satisfy in order to achieve minimum average quantization noise power.
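Lloyd's two optimality conditions, alternated, are the basis of what is now called k-means: each point maps to its nearest centroid, and each centroid is the mean of its assigned points. The NumPy sketch below shows the standard vector descendant of the paper's scalar PCM formulation; the function name and initialization are illustrative choices.

```python
import numpy as np

def lloyd(x, k, iters=100, rng=None):
    """Alternate Lloyd's two conditions: nearest-centroid assignment
    and centroid-as-mean update (the k-means algorithm)."""
    rng = rng or np.random.default_rng(0)
    centers = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(iters):
        # Condition 1: assign each point to its nearest centroid.
        d = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Condition 2: move each centroid to the mean of its points;
        # keep empty clusters where they are.
        new = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels
```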