
Showing papers on "Centroid published in 2014"


Journal Article
TL;DR: The simulation experiment with IRIS data set shows that the proposed algorithm converges faster and the value k found is close to the actual value, which proves the validity of the algorithm.
Abstract: To address the problem of excessive iterations caused by stochastically selecting initial centroids in the K-Means algorithm, a method is proposed to optimize the initial centroids by cutting the data set into k segments and selecting one point from each segment as an initial centroid for the iterative computation. A new validity function called the clustering index is defined as the sum of clustering density and clustering significance, and can be used to search for the optimal k in the interval [1, n^(1/2)]. A simulation experiment with the IRIS data set shows that the proposed algorithm converges faster and that the value of k found is close to the actual value, which demonstrates the validity of the algorithm.
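The segmentation idea above can be sketched as follows; the ordering key (point norm) and the per-segment representative (the median point) are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def segmented_init(points, k):
    """Pick one representative per segment as an initial centroid.

    Sketch of the segmentation idea: order the points (here by their
    norm), cut the ordered set into k contiguous segments, and take
    the median point of each segment as an initial centroid.
    """
    order = np.argsort(np.linalg.norm(points, axis=1))
    segments = np.array_split(order, k)
    return np.array([points[seg[len(seg) // 2]] for seg in segments])

# Toy data: two well-separated blobs in 1D (as column vectors).
pts = np.array([[0.0], [0.1], [0.2], [10.0], [10.1], [10.2]])
init = segmented_init(pts, 2)
print(init)
```

With the toy data above, each blob contributes one initial centroid, so the subsequent k-means iteration starts close to the true cluster centers.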

88 citations


Journal ArticleDOI
TL;DR: This paper introduces a cluster level index called centroid index, which is intuitive, simple to implement, fast to compute and applicable in case of model mismatch as well as to other clustering models beyond the centroid-based k-means.

85 citations


Journal ArticleDOI
TL;DR: In this article, a weak-lensing study of the cluster A520 based on Advanced Camera for Surveys (ACS) data is presented, which confirms the previous claims that a substantial amount of dark mass is present between two luminous subclusters where we observe very little light.
Abstract: We present a weak-lensing study of the cluster A520 based on Advanced Camera for Surveys (ACS) data. The excellent data quality provides a mean source density of ~109 arcmin⁻², which improves both the resolution and the significance of the mass reconstruction compared to a previous study based on Wide Field Planetary Camera 2 (WFPC2) images. We take care in removing instrumental effects such as the charge trailing due to radiation damage of the detector and the position-dependent point-spread function. This new ACS analysis confirms the previous claims that a substantial amount of dark mass is present between two luminous subclusters where we observe very little light. The centroid of the dark peak in the current ACS analysis is offset to the southwest by ~1' with respect to the centroid from the WFPC2 analysis. Interestingly, this new centroid is in better agreement with the location where the X-ray emission is strongest, and the mass-to-light ratio estimated with this centroid is much higher (813 ± 78 M☉/L_R☉) than the previous value; the aperture mass with the WFPC2 centroid provides a consistent mass. Although we cannot provide a definite explanation for the dark peak, we discuss a revised scenario, wherein dark matter with a more conventional range (σ_DM/m_DM < 1 cm² g⁻¹) of self-interacting cross-section can lead to the detection of this dark substructure. If supported by detailed numerical simulations, this hypothesis opens up the possibility that the A520 system can be used to establish a lower limit on the self-interacting cross-section of dark matter.

79 citations


Journal ArticleDOI
Zhou Zhiyong1, Jian Zheng1, Yakang Dai1, Zhe Zhou1, Shi Chen1 
11 Mar 2014-PLOS ONE
TL;DR: This paper proposes a robust non-rigid point set registration algorithm using the Student's-t mixture model and compared it with other state-of-the-art registration algorithms on both 2D and 3D data with noise and outliers, where it showed accurate results and outperformed the other algorithms.
Abstract: The Student's-t mixture model, which is heavily tailed and more robust than the Gaussian mixture model, has recently received great attention in image processing. In this paper, we propose a robust non-rigid point set registration algorithm using the Student's-t mixture model. Specifically, first, we consider the alignment of two point sets as a probability density estimation problem and treat one point set as the Student's-t mixture model centroids. Then, we fit the Student's-t mixture model centroids to the other point set, which is treated as data. Finally, we get the closed-form solutions of the registration parameters, leading to a computationally efficient registration algorithm. The proposed algorithm is especially effective for addressing the non-rigid point set registration problem when significant amounts of noise and outliers are present. Moreover, fewer registration parameters have to be set manually for our algorithm compared to the popular coherent point drift (CPD) algorithm. We have compared our algorithm with other state-of-the-art registration algorithms on both 2D and 3D data with noise and outliers, where our non-rigid registration algorithm showed accurate results and outperformed the other algorithms.

65 citations


Proceedings ArticleDOI
23 Jun 2014
TL;DR: This paper proposes an extension which projects features to fern-specific embedding spaces, which yields improved matching rates in short runtime and provides improved matching performance in comparison to the standard nearest neighbor approach.
Abstract: The prevalent approach to image-based localization is matching interest points detected in the query image to a sparse 3D point cloud representing the known world. The obtained correspondences are then used to recover a precise camera pose. The state-of-the-art in this field often ignores the availability of a set of 2D descriptors per 3D point, for example by representing each 3D point by only its centroid. In this paper we demonstrate that these sets contain useful information that can be exploited by formulating matching as a discriminative classification problem. Since memory demands and computational complexity are crucial in such a setup, we base our algorithm on the efficient and effective random fern principle. We propose an extension which projects features to fern-specific embedding spaces, which yields improved matching rates in short runtime. Experiments first show that our novel formulation provides improved matching performance in comparison to the standard nearest neighbor approach, and that we outperform related randomization methods in our localization scenario.

61 citations


Journal ArticleDOI
TL;DR: This paper presents a biometric technique for identifying a person from an iris image; the iris is first segmented from the acquired eye image using an edge detection algorithm, and the method exhibits an accuracy of 98.5%.
Abstract: This paper presents a biometric technique for identification of a person using the iris image. The iris is first segmented from the acquired image of an eye using an edge detection algorithm. The disk-shaped area of the iris is transformed into a rectangular form. Moments are then extracted from the grayscale image, yielding a feature vector of scale-, rotation-, and translation-invariant moments. Images are clustered using the k-means algorithm and centroids for each cluster are computed. An arbitrary image is assumed to belong to the cluster whose centroid is nearest to its feature vector in terms of Euclidean distance. The described model exhibits an accuracy of 98.5%.
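The final classification step, assigning an unseen feature vector to the cluster with the nearest k-means centroid by Euclidean distance, can be sketched as:

```python
import numpy as np

def nearest_centroid(feature, centroids):
    """Return the index of the centroid nearest to the feature vector
    in Euclidean distance, as in the assignment step described above.
    The example data below is illustrative."""
    dists = np.linalg.norm(centroids - feature, axis=1)
    return int(np.argmin(dists))

centroids = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
print(nearest_centroid(np.array([4.2, 4.8]), centroids))  # closest to [5, 5]
```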

60 citations


Journal ArticleDOI
TL;DR: Support Vector Machine (SVM) regression was applied to predict measurement errors for accuracy enhancement in optical scanning systems by a novel method based on power spectrum centroid calculation, which determines the energy center of an optoelectronic signal in order to improve the accuracy of optical scanning system measurements.

57 citations


Proceedings ArticleDOI
24 Feb 2014
TL;DR: This paper shows how to reduce the cost of the k-means algorithm by large factors by adapting ranked retrieval techniques, and proposes a variant of the WAND algorithm that uses the intermediate results of nearest neighbor computations to improve performance.
Abstract: The k-means clustering algorithm has a long history and a proven practical performance; however, it does not scale to clustering millions of data points into thousands of clusters in high dimensional spaces. The main computational bottleneck is the need to recompute the nearest centroid for every data point at every iteration, a prohibitive cost when the number of clusters is large. In this paper we show how to reduce the cost of the k-means algorithm by large factors by adapting ranked retrieval techniques. Using a combination of heuristics, on two real life data sets the wall clock time per iteration is reduced from 445 minutes to less than 4, and from 705 minutes to 1.4, while the clustering quality remains within 0.5% of the k-means quality. The key insight is to invert the process of point-to-centroid assignment by creating an inverted index over all the points and then using the current centroids as queries to this index to decide on cluster membership. In other words, rather than each iteration consisting of "points picking centroids", each iteration now consists of "centroids picking points". This is much more efficient, but comes at the cost of leaving some points unassigned to any centroid. We show experimentally that the number of such points is low and thus they can be separately assigned once the final centroids are decided. To speed up the computation we sparsify the centroids by pruning low weight features. Finally, to further reduce the running time and the number of unassigned points, we propose a variant of the WAND algorithm that uses the intermediate results of nearest neighbor computations to improve performance.
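The "centroids picking points" inversion can be illustrated with a toy inverted index; the data structures and scoring below are a minimal sketch, not the paper's implementation:

```python
from collections import defaultdict

# Sparse points are placed in an inverted index keyed by feature id,
# and a (sparsified) centroid is then run as a query against that
# index. All names and values here are illustrative.
points = {                       # point id -> {feature id: weight}
    "a": {0: 1.0, 2: 0.5},
    "b": {1: 0.8, 2: 0.9},
    "c": {0: 0.2, 1: 0.7},
}

index = defaultdict(list)        # feature id -> [(point id, weight)]
for pid, feats in points.items():
    for f, w in feats.items():
        index[f].append((pid, w))

def score_against_index(centroid):
    """Dot-product scores of every indexed point against a sparse
    centroid query, accumulated one posting list at a time."""
    scores = defaultdict(float)
    for f, cw in centroid.items():
        for pid, pw in index[f]:
            scores[pid] += cw * pw
    return dict(scores)

print(score_against_index({0: 1.0, 2: 1.0}))
```

Each centroid query touches only the posting lists of its nonzero features, which is what makes sparsifying the centroids pay off.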

53 citations


Journal ArticleDOI
TL;DR: A low complexity minimum distance centroid estimator is suggested to estimate the channel gain and carrier phase jointly and numerical results show that the estimator provides reliable estimation of signal centroids, enabling an accurate classification with a non-parametric likelihood function.
Abstract: In this paper, we propose a blind modulation classifier that differs from most existing classifiers. A low complexity minimum distance centroid estimator is suggested to estimate the channel gain and carrier phase jointly. The estimation is achieved by minimizing a signal-to-centroid distance. A new non-parametric likelihood function is proposed for fast classification with unknown noise variance and distribution. Numerical results show that the estimator provides reliable estimation of signal centroids, enabling an accurate classification with a non-parametric likelihood function. When different channel conditions are simulated, the proposed blind classifier achieves similar classification accuracy versus non-blind state-of-the-art classifiers while being more robust and having much lower complexity.

52 citations


Journal ArticleDOI
TL;DR: This paper presents exact computation procedures for the centroid of type-2 fuzzy sets with either triangular or Gaussian secondary membership functions, as a subsequent analytical extension of the KM type-reduction procedure dedicated to interval type-2 sets.

43 citations


Journal ArticleDOI
TL;DR: Two new hybrid weighted averaging operators for aggregating crisp and fuzzy information are proposed, some desirable properties of which are studied, and three special types of preferred centroid of triangular fuzzy numbers are defined.

Journal ArticleDOI
TL;DR: The algorithm is robust and outlier-adaptive, which does not need prior information about data such as the appropriate outlier ratio when the point sets are perturbed by outliers, and exhibits inherent statistical robustness and has an explicit interpretation.
Abstract: In this paper, a flexible probabilistic method is introduced for non-rigid point registration, motivated by the pioneering research named Coherent Point Drift (CPD). Unlike CPD, our algorithm is robust and outlier-adaptive, and does not need prior information about the data, such as the appropriate outlier ratio, when the point sets are perturbed by outliers. We consider the registration as the alignment of the data (one point set) to a set of Gaussian Mixture Model centroids (the other point set), initially formulate it as a maximum likelihood problem, and then solve the problem under the Expectation-Maximization (EM) framework. The outlier ratio is also formulated in the EM framework and is updated during the EM iteration. Moreover, we use the volume of the point set region to determine the uniform distribution for modeling the outliers. The resulting registration algorithm exhibits inherent statistical robustness and has an explicit interpretation. The experiments demonstrate that our algorithm outperforms the state-of-the-art method.

Journal ArticleDOI
TL;DR: In this article, the authors considered microlensed image centroid motions by the exotic lens models and showed that the centroid shift from the source position might move on a multiply connected curve like a bow tie, while it is known to move on an ellipse for the $n=1$ case and on an oval curve for $n=2$.
Abstract: Gravitational lens models with negative convergence inspired by modified gravity theories, exotic matter, and energy have been recently examined in such a way that a static and spherically symmetric modified spacetime metric depends on the inverse distance to the $n$th power ($n=1$ for the Schwarzschild metric, $n=2$ for the Ellis wormhole, and $n \neq 1$ for an extended spherical distribution of matter such as an isothermal sphere) in the weak-field approximation. Some of the models act like a convex lens, whereas the others are repulsive on light rays like a concave lens. The present paper considers microlensed image centroid motions by the exotic lens models. Numerical calculations show that, for large $n$ cases in the convex-type models, the centroid shift from the source position might move on a multiply connected curve like a bow tie, while it is known to move on an ellipse for the $n=1$ case and to move on an oval curve for $n=2$. The distinctive feature of the microlensed image centroid may be used for searching for (or constraining) localized exotic matter or energy with astrometric observations. It is shown also that the centroid shift trajectory for concave-type repulsive models might be elongated vertically to the source motion direction like a prolate spheroid, whereas that for convex-type models such as the Schwarzschild one is tangentially elongated like an oblate spheroid.

Journal ArticleDOI
TL;DR: The simulation result shows the ellipse centroid localization algorithm is more effective than the centroid algorithm and the weighted centroid precision algorithm.
Abstract: Location technology is becoming more and more important in wireless sensor networks. Weighted centroid localization offers a fast and simple algorithm for locating equipment in wireless sensor networks. The algorithm derives from the centroid method, in which a device estimates its position as the average of the coordinates of adjacent anchors. After an analysis of radio propagation loss models, the most appropriate log-distance distribution model is selected to simulate signal propagation. Based on the centroid algorithm and the weighted centroid algorithm, this paper proposes an ellipse centroid localization algorithm. This algorithm makes use of the ellipse's characteristics to estimate the unknown node's coordinates. The main idea of the ellipse centroid localization algorithm is a precision control factor that controls the algorithm's location precision. In the ellipse centroid localization algorithm, nodes are promoted to anchors in order to strengthen the dynamic characteristic of the anchor density. The simulation results show that the ellipse centroid localization algorithm is more effective than the centroid algorithm and the weighted centroid algorithm.
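The underlying weighted centroid step can be sketched as follows; here the weights are supplied directly rather than derived from RSSI via the log-distance path-loss model:

```python
def weighted_centroid(anchors, weights):
    """Weighted centroid localization: the unknown node is estimated
    as the weight-normalized average of neighboring anchor
    coordinates. In practice the weights would come from received
    signal strength; here they are given directly for illustration."""
    total = sum(weights)
    x = sum(w * a[0] for a, w in zip(anchors, weights)) / total
    y = sum(w * a[1] for a, w in zip(anchors, weights)) / total
    return (x, y)

# Three anchors; the largest weight pulls the estimate toward (0, 0).
est = weighted_centroid([(0, 0), (10, 0), (0, 10)], [2.0, 1.0, 1.0])
print(est)  # (2.5, 2.5)
```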

Proceedings ArticleDOI
20 May 2014
TL;DR: The proposed method is shown to be invariant to image transformations (translation, rotation, reflection and scaling) and robust to minor deformations and occlusions and is used as a classifier for plant leaf classification.
Abstract: In this paper, we use the centroid distance and the axis of least inertia method for plant leaf classification. For this purpose, the RGB (Red, Green, Blue) image is converted to a binary image. Then, the Canny operator is applied to the binary image to recognize the edges of the image before thinning the edges. After that, the boundary of the image is traced to sample the shape. Sampling helps us avoid time-consuming computations. We compute the centroid distance of these points and the distance of the sampling points from the axis of least inertia line. By selecting a fixed start point and normalizing the distances, the proposed method is shown to be invariant to image transformations (translation, rotation, reflection and scaling) and robust to minor deformations and occlusions. In this study, a probabilistic neural network (PNN) has been used as a classifier. Two public leaf datasets, the Swedish leaf dataset and the Flavia dataset, are evaluated. Experimental results demonstrate the superior performance of the proposed feature in plant leaf classification.
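The centroid distance feature can be sketched as below; normalizing by the maximum distance (for scale invariance) is one common choice and is assumed here:

```python
import math

def centroid_distance_signature(boundary):
    """Centroid distance shape signature: distance of each sampled
    boundary point from the shape centroid, normalized by the
    maximum so the signature is scale invariant. A sketch of the
    feature described above, not the authors' full pipeline."""
    cx = sum(p[0] for p in boundary) / len(boundary)
    cy = sum(p[1] for p in boundary) / len(boundary)
    d = [math.hypot(x - cx, y - cy) for x, y in boundary]
    m = max(d)
    return [v / m for v in d]

# A square's corners are all equidistant from its centroid.
sig = centroid_distance_signature([(0, 0), (2, 0), (2, 2), (0, 2)])
print(sig)  # [1.0, 1.0, 1.0, 1.0]
```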

Journal ArticleDOI
TL;DR: The convergence properties of the proposed algorithm are characterized, its sensitivity to additive noise on the relative distance measurements is investigated, and an experimental validation of the effectiveness of the implemented algorithm is provided.
Abstract: In this study, the decentralized common reference frame estimation problem for multiagent systems in the absence of any common coordinate system is investigated. Each agent is deployed in a 2-D space and can only measure the relative distance of neighboring agents and the angle of their line of sight in its local reference frame; no relative attitude measurement is available. Only asynchronous and random pairwise communications are allowed between neighboring agents. The convergence properties of the proposed algorithm are characterized, and its sensitivity to additive noise on the relative distance measurements is investigated. An experimental validation of the effectiveness of the proposed algorithm is provided.

Book ChapterDOI
21 Oct 2014
TL;DR: This work proposes a more straightforward approach based on nearest centroid classification: profiles of topic categories are extracted from the source domain and are then adapted by iterative refining steps using most similar documents in the target domain, obtaining accuracy measures better or comparable to other methods.
Abstract: In cross-domain text classification, topic labels for documents of a target domain are predicted by leveraging knowledge of labeled documents of a source domain, having equal or similar topics with possibly different words. Existing methods either adapt documents of the source domain to the target or represent both domains in a common space. These methods are mostly based on advanced statistical techniques and often require tuning of parameters in order to obtain optimal performance. We propose a more straightforward approach based on nearest centroid classification: profiles of topic categories are extracted from the source domain and are then adapted by iterative refining steps using the most similar documents in the target domain. Experiments on common benchmark datasets show that this approach, despite its simplicity, obtains accuracy measures better than or comparable to those of other methods, using fixed empirical values for its few parameters.
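The iterative refinement of category profiles can be sketched as follows, assuming cosine similarity and a fixed number of most-similar target documents per step (both illustrative choices, not the paper's exact settings):

```python
import numpy as np

def refine_centroids(source_centroids, target_docs, iters=5, top=2):
    """Sketch of iterative centroid refinement for cross-domain text
    classification: category profiles start from the source domain
    and are repeatedly replaced by the mean of the most similar
    target documents."""
    c = source_centroids.copy()
    for _ in range(iters):
        norm_c = c / np.linalg.norm(c, axis=1, keepdims=True)
        norm_d = target_docs / np.linalg.norm(target_docs, axis=1, keepdims=True)
        sims = norm_d @ norm_c.T                 # doc x category cosine similarity
        new_c = []
        for k in range(c.shape[0]):
            best = np.argsort(-sims[:, k])[:top] # most similar target docs
            new_c.append(target_docs[best].mean(axis=0))
        c = np.array(new_c)
    return c

src = np.array([[1.0, 0.0], [0.0, 1.0]])                  # two topic profiles
tgt = np.array([[0.9, 0.1], [0.8, 0.3], [0.1, 0.9], [0.2, 0.8]])
print(refine_centroids(src, tgt))
```

On this toy data the profiles converge after one step to the means of the two target-domain groups.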

Journal ArticleDOI
TL;DR: A very simple method for measuring the spatial coherence of quasi-monochromatic fields through the comparison of two measurements of the radiant intensity with and without a small obscuration at the test plane is presented.
Abstract: We present a very simple method for measuring the spatial coherence of quasi-monochromatic fields through the comparison of two measurements of the radiant intensity with and without a small obscuration at the test plane. From these measurements one can measure simultaneously the field’s coherence at all pairs of points whose centroid is the centroid of the obstacle. This method can be implemented without the need of any refractive or diffractive focusing elements.

Journal ArticleDOI
TL;DR: The centroid method for the correction of turbulence consists in computing the Karcher-Fréchet mean of the sequence of input images, and a distinguishing feature is that it can produce useful results from an arbitrarily small set of input images.
Abstract: The centroid method for the correction of turbulence consists in computing the Karcher-Fréchet mean of the sequence of input images. The direction of deformation between a pair of images is determined by the optical flow. A distinguishing feature of the centroid method is that it can produce useful results from an arbitrarily small set of input images. Source Code: The source code and an online demo are accessible at the IPOL web page of this article.

Journal ArticleDOI
TL;DR: A novel method for ranking fuzzy numbers that integrates the centroid point and the spread approaches and overcomes the limitations and weaknesses of most existing methods is proposed.
Abstract: Centroid and spread are commonly used approaches in ranking fuzzy numbers. Some experts rank fuzzy numbers using centroid or spread alone, while others tend to integrate them together. Although many methods for ranking fuzzy numbers related to both approaches have been presented, there are still limitations whereby the ranking obtained is inconsistent with human intuition. This paper proposes a novel method for ranking fuzzy numbers that integrates the centroid point and the spread approaches and overcomes the limitations and weaknesses of most existing methods. Proofs and justifications with regard to the proposed ranking method are also presented.
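For a triangular fuzzy number, the centroid component of such a ranking is easy to compute; the spread proxy below (support width) is an illustrative assumption, not the paper's exact definition:

```python
def triangular_centroid(a, b, c):
    """Centroid (x-coordinate) of a triangular fuzzy number (a, b, c):
    for a triangle with base corners at a and c on the x-axis and
    apex at b, the centroid abscissa is (a + b + c) / 3."""
    return (a + b + c) / 3.0

def spread(a, b, c):
    """Support width of the triangular fuzzy number, used here as a
    simple proxy for its spread (an illustrative choice)."""
    return c - a

# Two fuzzy numbers with the same centroid but different spreads:
# a centroid-only ranking cannot separate them, which is why the
# spread component is integrated.
print(triangular_centroid(1, 2, 3), spread(1, 2, 3))   # 2.0 2
print(triangular_centroid(0, 2, 4), spread(0, 2, 4))   # 2.0 4
```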

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the position centroid rendezvous/formation problems of multiple unicycle agents for the first time, and proposed a new output to convert the unicycle model to canonical normal form via feedback linearization approach.
Abstract: This work investigates the position centroid rendezvous/formation problems of multiple unicycle agents for the first time. By constructing a new output, the unicycle model is converted to canonical normal form via a feedback linearization approach. Then, we propose the centroid rendezvous and the centroid formation control laws, respectively, guaranteeing that all the agents globally meet at the common initial centroid location and achieve the desired geometric pattern centred at the initial centroid. The proposed control laws are distributed, smooth and time-varying, ensuring that the internal orientation states converge to fixed values and that the velocity inputs converge to zero. All the results are proved under the communication scenario of a fixed directed balanced graph with a spanning tree. Simulation results verify the effectiveness and the robustness of the proposed control schemes.

Journal ArticleDOI
TL;DR: This work proposes the use of an asymmetric dissimilarity measure in centroid-based clustering, which can be asymmetrized using the Alpha-Beta divergence, and compute the degree of asymmetry of the AB-divergence on the basis of the within-cluster variances.

Patent
04 Jun 2014
TL;DR: In this article, a method for controlling array light beam co-target collimation of a target in a loop based on light spot centroid calculation is proposed, in which the coordinates of the centroid of a light spot on an imaging plane are obtained through a light spot centroid algorithm, the tilt control quantity needed for each light beam is obtained from the quantitative relation between the tilt control of the corresponding beam's tilting mirror and the light spot movement vector, and the centroid of the light spot is moved to a target spot of an imaging device through control over the tilting mirrors.
Abstract: The invention relates to a method for controlling array light beam co-target collimation of a target in a loop based on light spot centroid calculation. The coordinates of the centroid of a light spot on an imaging plane are obtained through a light spot centroid algorithm, the tilt control quantity needed for each light beam is obtained from the quantitative relation between the tilt control of the corresponding beam's tilting mirror and the light spot movement vector, and the centroid of the light spot is moved to a target spot of an imaging device through control over the tilting mirrors. Co-target collimation of all the light beams on the target in the loop is realized through sequential control over the array light beams. There is no need to accurately describe or obtain the light path, and the control scheme is simple, convenient and easy to implement. The adopted light spot centroid serves as a performance evaluation function that is simple, fast and effective in real time. The method has extensive application prospects in satellite tracking, the directed energy technology and other fields.

Journal ArticleDOI
TL;DR: Lemoine point is proposed as the best trilateration estimator thanks to the desired property that the total distance to three triangle edges is minimized and demonstrated through simulation that LP outperforms centroid localization without additional computational load.
Abstract: The centroid of three most closely spaced intersections of constant-range loci is conventionally used as trilateration estimate without rigorous justification. In this paper, we address the quality of trilateration intersections through range scaling factors. Several triangle centres, including centroid, incentre, Lemoine point, and Fermat point, are discussed in detail. Lemoine point (LP) is proposed as the best trilateration estimator thanks to the desired property that the total distance to three triangle edges is minimized. It is demonstrated through simulation that LP outperforms centroid localization without additional computational load. In addition, severe trilateration scenarios such as two-intersection cases are considered in this paper, and enhanced trilateration algorithms are proposed.
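The Lemoine point itself is straightforward to compute from its barycentric coordinates a² : b² : c², where a, b, c are the side lengths opposite the corresponding vertices:

```python
import math

def lemoine_point(A, B, C):
    """Lemoine (symmedian) point of a triangle: the interior point
    minimizing the sum of squared distances to the three sides. In
    barycentric coordinates it is a^2 : b^2 : c^2 for side lengths
    a, b, c opposite vertices A, B, C."""
    a2 = (B[0] - C[0]) ** 2 + (B[1] - C[1]) ** 2   # |BC|^2
    b2 = (A[0] - C[0]) ** 2 + (A[1] - C[1]) ** 2   # |CA|^2
    c2 = (A[0] - B[0]) ** 2 + (A[1] - B[1]) ** 2   # |AB|^2
    s = a2 + b2 + c2
    x = (a2 * A[0] + b2 * B[0] + c2 * C[0]) / s
    y = (a2 * A[1] + b2 * B[1] + c2 * C[1]) / s
    return (x, y)

# For an equilateral triangle the Lemoine point coincides with the
# centroid, so trilateration with either estimator would agree there.
print(lemoine_point((0, 0), (2, 0), (1, math.sqrt(3))))
```

In a trilateration setting, A, B, C would be the three most closely spaced intersections of the constant-range loci.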

Journal ArticleDOI
TL;DR: The worst-case analysis proposed in this paper provides theoretical evidence that the minimum zone center of the two spheres with minimum radial separation containing the sampled spherical surface is included in a spherical neighborhood centered at the centroid with radius 2π⁻²·E_C, where E_C is the sphericity error related to the centroid, which can be determined in closed form.
Abstract: The minimum zone sphericity tolerance is derived from the ANSI and ISO standards for roundness and has extensive applications in the tribology of ball bearings, hip joints and other lubricated pairs. The worst-case analysis proposed in this paper provides theoretical evidence that the minimum zone center of the two (circumscribed and inscribed reference) spheres with minimum radial separation containing the sampled spherical surface is included in a spherical neighborhood centered at the centroid with radius 2π⁻²·E_C, where E_C is the sphericity error related to the centroid, which can be determined in closed form. Such a linear estimate (about 20% of E_C from the centroid, i.e., about one order of magnitude lower than the sphericity tolerance to be assessed) can be used to locate the sphere center with a given tolerance and as a search neighborhood for minimum zone center-based algorithms, such as metaheuristics (genetic algorithms, particle swarm optimization, etc.). The proposed upper bound has been experimentally assessed using a genetic algorithm (GA) with parameters previously optimized for roundness and extended to three dimensions, on most of the available datasets from the literature that have been tested with center-based minimum zone algorithms by different authors. The optimum dataset size on artificially generated datasets is also discussed, and it is speculated that this allows the extension of the proposed upper bound to partial (or incomplete) spherical features.

Journal ArticleDOI
TL;DR: A modified definition of the centroid of a fuzzy set is presented that avoids the computational problems associated with the usual definition and reduces to this definition when MFs are continuous and normal on some interval.
Abstract: Centroids are practically important in type-1 and type-2 fuzzy logic systems as a method of defuzzification and type reduction. However, computational problems arise when membership functions (MF) have singleton spikes. The novel thresholding aggregation operators that were described in our companion paper “ New Classes of Threshold Aggregation Functions Based Upon the Tsallis q-Exponential with Applications to Perceptual Computing” produce such MFs with spikes. Such spikes may occur when modeling concepts defined on a real-valued domain, and they are also formed in unions of fuzzy sets in which some have MFs with discrete support and others have support defined on an interval. This paper presents a modified definition of the centroid of a fuzzy set that avoids the computational problems associated with the usual definition and reduces to this definition when MFs are continuous and normal (i.e., of unit height) on some interval. We also present an enhanced Karnik–Mendel-type algorithm to compute the modified centroid of interval type-2 fuzzy sets whose MFs have spikes.
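For reference, the usual discrete centroid defuzzification that the modified definition builds on is the membership-weighted mean of the domain samples:

```python
def centroid_defuzzify(xs, mus):
    """Standard discrete centroid defuzzification:
    sum(x * mu(x)) / sum(mu(x)). A singleton spike contributes only
    through its single sample, which is the kind of sensitivity the
    modified definition described above is designed to address."""
    num = sum(x * m for x, m in zip(xs, mus))
    den = sum(mus)
    return num / den

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
mus = [0.1, 0.5, 1.0, 0.5, 0.1]      # symmetric MF centered at 2
print(centroid_defuzzify(xs, mus))
```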

Journal ArticleDOI
TL;DR: In this article, the authors analyzed the relationship between the centroids of the two opposing teams in an 11-on-11 soccer match to test the new centroid algorithm.
Abstract: New, intelligent systems have been developed recently to improve the quality of match analysis. These systems analyze the tactical behavior of the teams. However, the existing methods leave room for improvement. Thus, the main goal of this study is to refine the team centroid metric by considering all of the players on the team and the ball position. Furthermore, this study analyzes the relationship between the centroids of the two opposing teams. One 11-on-11 soccer match was analyzed to test the new centroid algorithm. The results provided strong evidence of a positive relation between the centroids of the two teams over time on the x-axis (rs = 0.781) and the y-axis (rs = 0.707). This study confirmed the results of previous studies that analyzed the relationship between team centroids. Furthermore, it was possible to prove the effectiveness of the new tactical metric and its relevance for adding information during a match.
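A team centroid including all players, optionally mixed with the ball position, can be sketched as follows; the weighting scheme is an illustrative assumption, not the paper's exact metric:

```python
def team_centroid(player_positions, ball_position=None, ball_weight=0.0):
    """Team centroid as the mean position of all players, optionally
    blended with the ball position by a given weight. Including all
    players and the ball follows the refinement described above."""
    n = len(player_positions)
    x = sum(p[0] for p in player_positions) / n
    y = sum(p[1] for p in player_positions) / n
    if ball_position is not None and ball_weight > 0:
        x = (1 - ball_weight) * x + ball_weight * ball_position[0]
        y = (1 - ball_weight) * y + ball_weight * ball_position[1]
    return (x, y)

players = [(10, 20), (30, 40), (50, 30)]
print(team_centroid(players))                          # (30.0, 30.0)
print(team_centroid(players, (60, 30), ball_weight=0.5))
```

Computed per frame for both teams, the two centroid time series are what the x- and y-axis correlations above are measured on.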

Journal ArticleDOI
TL;DR: This paper presents grid based, contour based and area based approaches for signature verification, and explains how the intersecting points and the centroids of two equal halves of the signature are calculated.
Abstract: Nowadays, signature verification is one of the most important techniques for checking the authenticity of a person. There are many security checking parameters, such as PIN codes, passwords and fingerprint checking, but signature recognition is the most popular because it is quite accurate and cost efficient, and one does not have to remember an authentication key such as a PIN code or password. The signature of a genuine signer stays almost constant, but there may be little difference between well-practiced forgeries and the genuine signature, and it is necessary to distinguish these differences. This paper presents grid based, contour based and area based approaches for signature verification. The intersecting points and the centroids of two equal halves of the signature are calculated; the centroids are then connected with a straight line, and the angles of the intersecting points with respect to the centroid-connecting line are computed. General Terms: Signature Verification, Grid based approach, Centroid based approach and Contour based approach.


Proceedings ArticleDOI
01 Dec 2014
TL;DR: Experimental results show that the proposed tracking approach detects and tracks objects moving in the scene efficiently and can also handle occlusion.
Abstract: Object motion tracking has been performed on videos using various methodologies. Existing systems are mainly based on a single feature and detect motion as a single object by tracking the movement of the selected feature. We propose a moving object tracking method that uses low-level features, such as the centroid location of the tracked object, combined with a color feature. The centroid location of each moving object in the current frame is computed using a greedy technique based on the object's locations in earlier frames. Experimental results show that the proposed tracking approach detects and tracks objects moving in the scene efficiently and can also handle occlusion.
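The greedy centroid-association step can be sketched as follows; the color feature and the occlusion handling of the full method are omitted:

```python
import math

def greedy_track(prev_centroids, new_centroids):
    """Greedy centroid association for multi-object tracking: each
    existing track claims the nearest unclaimed centroid detected in
    the new frame. A minimal sketch of the greedy step described
    above; all data below is illustrative."""
    assignments = {}
    taken = set()
    for tid, (px, py) in prev_centroids.items():
        best, best_d = None, float("inf")
        for j, (nx, ny) in enumerate(new_centroids):
            if j in taken:
                continue
            d = math.hypot(nx - px, ny - py)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            assignments[tid] = best
            taken.add(best)
    return assignments

prev = {1: (10.0, 10.0), 2: (50.0, 50.0)}     # track id -> last centroid
new = [(52.0, 49.0), (11.0, 9.0)]             # detections in current frame
print(greedy_track(prev, new))
```

Each track keeps its identity across frames as long as its object's centroid moves less than the distance to any other object.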