Journal ArticleDOI

Local k-proximal plane clustering

TLDR
A local k-proximal plane clustering (LkPPC) method is proposed by bringing k-means into kPPC, which forces the data points to center around prototypes and thus localizes the representations of the cluster center planes.
Abstract
k-Plane clustering (kPC) and k-proximal plane clustering (kPPC) cluster data points around center planes, instead of around cluster centers as in k-means. However, the cluster center planes constructed by kPC and kPPC extend infinitely, which degrades clustering performance. In this paper, we propose a local k-proximal plane clustering (LkPPC) that brings k-means into kPPC, forcing the data points to center around some prototypes and thus localizing the representations of the cluster center planes. The contributions of our LkPPC are as follows: (1) LkPPC introduces a localized representation of each cluster center plane to avoid the confusion caused by infinite extension. (2) Different from kPPC, LkPPC constructs cluster center planes that make the data points of each cluster close to both their own center plane and their prototype, while remaining far, to some extent, from the other clusters; this leads to solving eigenvalue problems. (3) Instead of selecting the initial data points randomly, a Laplace graph strategy is established to initialize them. (4) Experimental results on several artificial and benchmark datasets show the effectiveness of LkPPC.
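As a rough illustration of the combined objective described above, a minimal sketch of an LkPPC-style assignment step might look like the following (hypothetical function and parameter names; this is not the paper's exact formulation):

```python
import numpy as np

def lkppc_assign(X, W, b, P, lam=1.0):
    """Assign each point to the cluster minimizing a combined cost:
    squared distance to the cluster's center plane w_j^T x + b_j = 0
    plus lam times the squared distance to the cluster's prototype P_j.
    Hypothetical sketch of the LkPPC idea, not the paper's exact objective."""
    n, k = X.shape[0], P.shape[0]
    cost = np.empty((n, k))
    for j in range(k):
        w = W[j]
        plane_d = (X @ w + b[j]) ** 2 / (w @ w)   # squared point-to-plane distance
        proto_d = ((X - P[j]) ** 2).sum(axis=1)   # squared point-to-prototype distance
        cost[:, j] = plane_d + lam * proto_d
    return cost.argmin(axis=1)
```

Because each point is pulled toward a prototype as well as a plane, the planes act locally rather than extending without bound.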


Citations
Journal ArticleDOI

Piecewise Linear Regression Based on Plane Clustering

TL;DR: The proposed method first partitions the data into multiple plane-centered clusters and then analytically computes the corresponding piecewise linear functions; because the pieces are generated from plane clustering, the result coincides with geometric intuition.
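The two-stage idea in this summary — cluster first, then fit a linear function per cluster — can be sketched as follows (hypothetical helper name and made-up data; the cited paper's plane-clustering stage is replaced here by given labels):

```python
import numpy as np

def piecewise_linear_fit(X, y, labels):
    """Fit one least-squares linear model per cluster.
    Returns {label: (coefficients, intercept)}.
    Hypothetical sketch of the two-stage idea: assumes cluster labels
    are already available from a plane-clustering step."""
    models = {}
    for c in np.unique(labels):
        m = labels == c
        A = np.hstack([X[m], np.ones((m.sum(), 1))])   # augment with a bias column
        sol, *_ = np.linalg.lstsq(A, y[m], rcond=None)  # least-squares fit
        models[c] = (sol[:-1], sol[-1])
    return models
```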
Journal ArticleDOI

Fuzzy semi-supervised weighted linear loss twin support vector clustering

TL;DR: The proposed formulations build a robust clustering algorithm that is insensitive to noise and outliers, achieving better clustering accuracy than other state-of-the-art plane-based clustering algorithms in comparatively less computational time.
Journal ArticleDOI

General Plane-Based Clustering With Distribution Loss

TL;DR: This paper proposes a plane-based clustering method by introducing a new loss function to capture the data distribution precisely and proves that the general model terminates in a finite number of steps at a local or weak local optimal point.
Journal ArticleDOI

Robust k-subspace discriminant clustering

TL;DR: By introducing the L1-norm discriminant and local information into each subspace, kSDC realizes robust dimensionality reduction and clustering, and its optimization problems can be solved by an effective alternating direction method of multipliers.
Journal ArticleDOI

Multiple Flat Projections for Cross-Manifold Clustering

TL;DR: This article proposes multiple flat projections clustering (MFPC) for cross-manifold clustering; the resulting series of nonconvex matrix optimization problems is solved by a proposed recursive algorithm, and a nonlinear version of MFPC is extended via kernel tricks to handle more complex cross-manifold learning situations.
References
Book

Data Mining: Concepts and Techniques

TL;DR: This book presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects, and provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.
Journal ArticleDOI

Data clustering: a review

TL;DR: An overview of pattern clustering methods from a statistical pattern recognition perspective is presented, with a goal of providing useful advice and references to fundamental concepts accessible to the broad community of clustering practitioners.
Journal Article

Statistical Comparisons of Classifiers over Multiple Data Sets

TL;DR: A set of simple, yet safe and robust non-parametric tests for statistical comparisons of classifiers is recommended: the Wilcoxon signed ranks test for comparison of two classifiers and the Friedman test with the corresponding post-hoc tests for comparisons of more classifiers over multiple data sets.
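The Friedman statistic recommended in this summary can be computed in a few lines; below is a minimal sketch without tie handling, on made-up accuracy scores (for real comparisons, average ranks over ties and apply the post-hoc tests):

```python
import numpy as np

def friedman_statistic(scores):
    """scores: (N datasets, k algorithms) matrix, higher is better.
    Ranks algorithms per dataset (1 = best) and returns the Friedman
    chi-square statistic chi2_F = 12N/(k(k+1)) * (sum_j R_j^2 - k(k+1)^2/4),
    where R_j is the mean rank of algorithm j.
    Minimal sketch: ties are not averaged, which real data would require."""
    N, k = scores.shape
    # double argsort gives ascending 0-based ranks; flip so the best score gets rank 1
    ranks = k - scores.argsort(axis=1).argsort(axis=1)
    R = ranks.mean(axis=0)
    return 12 * N / (k * (k + 1)) * (np.sum(R ** 2) - k * (k + 1) ** 2 / 4)
```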
Journal ArticleDOI

A tutorial on spectral clustering

TL;DR: In this article, the authors present the most common spectral clustering algorithms, and derive those algorithms from scratch by several different approaches, and discuss the advantages and disadvantages of these algorithms.
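The Laplacian construction at the heart of the algorithms surveyed in this tutorial can be sketched in a few lines (unnormalized variant, hypothetical helper name):

```python
import numpy as np

def spectral_embedding(A, k):
    """Build the unnormalized graph Laplacian L = D - A and return the
    eigenvectors of its k smallest eigenvalues; running k-means on the
    rows of the result completes a basic spectral clustering.
    Minimal sketch assuming a symmetric adjacency matrix A."""
    D = np.diag(A.sum(axis=1))       # degree matrix
    L = D - A                        # unnormalized Laplacian
    vals, vecs = np.linalg.eigh(L)   # symmetric eigendecomposition, ascending order
    return vecs[:, :k]
```

For a graph with k connected components, the rows of this embedding are constant on each component, which is why k-means on the rows recovers the components.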