
Showing papers on "Centroid published in 2009"


Journal ArticleDOI
TL;DR: The powerful visualization tools of geometric morphometrics and the typically large number of shape variables give rise to a specific exploratory style of analysis, allowing the identification and quantification of previously unknown shape features.
Abstract: Geometric morphometrics is the statistical analysis of form based on Cartesian landmark coordinates. After separating shape from overall size, position, and orientation of the landmark configurations, the resulting Procrustes shape coordinates can be used for statistical analysis. Kendall shape space, the mathematical space induced by the shape coordinates, is a metric space that can be approximated locally by a Euclidean tangent space. Thus, notions of distance (similarity) between shapes or of the length and direction of developmental and evolutionary trajectories can be meaningfully assessed in this space. Results of statistical techniques that preserve these convenient properties—such as principal component analysis, multivariate regression, or partial least squares analysis—can be visualized as actual shapes or shape deformations. The Procrustes distance between a shape and its relabeled reflection is a measure of bilateral asymmetry. Shape space can be extended to form space by augmenting the shape coordinates with the natural logarithm of Centroid Size, a measure of size in geometric morphometrics that is uncorrelated with shape for small isotropic landmark variation. The thin-plate spline interpolation function is the standard tool to compute deformation grids and 3D visualizations. It is also central to the estimation of missing landmarks and to the semilandmark algorithm, which permits the inclusion of outlines and surfaces in geometric morphometric analysis. The powerful visualization tools of geometric morphometrics and the typically large number of shape variables give rise to a specific exploratory style of analysis, allowing the identification and quantification of previously unknown shape features.
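To make the core quantities concrete, here is a minimal NumPy sketch (illustrative code, not from the paper; function names and toy data are assumptions) of Centroid Size and ordinary Procrustes superimposition for two 2D landmark configurations:

```python
import numpy as np

def centroid_size(X):
    """Square root of the summed squared distances of the landmarks to their centroid."""
    return np.sqrt(((X - X.mean(axis=0)) ** 2).sum())

def procrustes_align(X, Y):
    """Superimpose Y onto X: remove position, size (unit Centroid Size), and orientation."""
    Xc = (X - X.mean(axis=0)) / centroid_size(X)
    Yc = (Y - Y.mean(axis=0)) / centroid_size(Y)
    U, _, Vt = np.linalg.svd(Xc.T @ Yc)   # optimal rotation via SVD (reflection case ignored)
    Y_aligned = Yc @ (U @ Vt).T
    # Procrustes distance: root summed squared differences of the shape coordinates
    return Y_aligned, np.sqrt(((Xc - Y_aligned) ** 2).sum())

X = np.random.rand(10, 2)                                  # 10 landmarks in 2D
Y = X @ np.array([[0.6, -0.8], [0.8, 0.6]]) * 3.0 + 5.0    # rotated, scaled, shifted copy
_, d = procrustes_align(X, Y)
print(d)   # ~0: same shape once size, position, and orientation are removed
```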

1,017 citations


Proceedings ArticleDOI
20 Jun 2009
TL;DR: It is demonstrated that Hough forests improve the results of the Hough-transform object detection significantly and achieve state-of-the-art performance for several classes and datasets.
Abstract: We present a method for the detection of instances of an object class, such as cars or pedestrians, in natural images. Similarly to some previous works, this is accomplished via generalized Hough transform, where the detections of individual object parts cast probabilistic votes for possible locations of the centroid of the whole object; the detection hypotheses then correspond to the maxima of the Hough image that accumulates the votes from all parts. However, whereas the previous methods detect object parts using generative codebooks of part appearances, we take a more discriminative approach to object part detection. Towards this end, we train a class-specific Hough forest, which is a random forest that directly maps the image patch appearance to the probabilistic vote about the possible location of the object centroid. We demonstrate that Hough forests improve the results of the Hough-transform object detection significantly and achieve state-of-the-art performance for several classes and datasets.
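As an illustration of the voting stage described above, here is a minimal NumPy sketch of accumulating centroid votes into a Hough image and reading off the detection hypothesis. In the actual method the (offset, weight) votes come from the leaves of the trained Hough forest; here they are hard-coded toy values:

```python
import numpy as np

def hough_vote(patch_centers, patch_votes, image_shape):
    """Accumulate probabilistic centroid votes from image patches into a Hough image.

    patch_centers : (N, 2) array of (y, x) patch positions
    patch_votes   : list of N lists of (dy, dx, weight) offset votes,
                    e.g. as read from the leaves of a trained Hough forest
    """
    H = np.zeros(image_shape)
    for (y, x), votes in zip(patch_centers, patch_votes):
        for dy, dx, w in votes:
            cy, cx = int(y + dy), int(x + dx)
            if 0 <= cy < image_shape[0] and 0 <= cx < image_shape[1]:
                H[cy, cx] += w
    return H

# Toy example: two patches voting for a common object centroid at (50, 60).
H = hough_vote(np.array([[40, 40], [60, 80]]),
               [[(10, 20, 0.9)], [(-10, -20, 0.8)]],
               (100, 100))
print(np.unravel_index(H.argmax(), H.shape))   # detection hypothesis: (50, 60)
```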

518 citations


Journal ArticleDOI
TL;DR: It is proved that set theoretic operations for T2 FSs can be computed using very simple alpha-plane computations that are the set theoretic operations for interval T2 (IT2) FSs.
Abstract: This paper 1) reviews the alpha-plane representation of a type-2 fuzzy set (T2 FS), which is a representation that is comparable to the alpha-cut representation of a type-1 FS (T1 FS) and is useful for both theoretical and computational studies of and for T2 FSs; 2) proves that set theoretic operations for T2 FSs can be computed using very simple alpha-plane computations that are the set theoretic operations for interval T2 (IT2) FSs; 3) reviews how the centroid of a T2 FS can be computed using alpha-plane computations that are also very simple because they can be performed using existing Karnik-Mendel algorithms that are applied to each alpha-plane; 4) shows how many theoretically based geometrical properties can be obtained about the centroid, even before the centroid is computed; 5) provides examples that show that the mean value (defuzzified value) of the centroid can often be approximated by using the centroids of only the 0 and 1 alpha-planes of a T2 FS; 6) examines a triangle quasi-T2 fuzzy logic system (Q-T2 FLS) whose secondary membership functions are triangles and for which all calculations use existing T1 or IT2 FS mathematics, and which may hence be a good next step in the hierarchy of FLSs, from T1 to IT2 to T2; and 7) compares T1, IT2, and triangle Q-T2 FLSs to forecast noise-corrupted measurements of a chaotic Mackey-Glass time series.
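Item 3 rests on the Karnik-Mendel (KM) iteration for the two centroid endpoints of an IT2 FS. The NumPy sketch below is illustrative (the toy membership functions and the convergence test are assumptions); under the alpha-plane representation, running it on every alpha-plane yields the centroid of a general T2 FS, and per item 5 the defuzzified value can often be approximated from the 0 and 1 alpha-planes alone:

```python
import numpy as np

def km_endpoint(x, lo, hi, left=True):
    """Karnik-Mendel iteration for one endpoint of the centroid of an IT2 fuzzy set.

    x      : sorted domain samples
    lo, hi : lower and upper membership grades at each sample
    left=True -> smallest centroid c_l; left=False -> largest centroid c_r
    """
    w = (lo + hi) / 2.0
    y = np.dot(x, w) / w.sum()
    while True:
        k = np.searchsorted(x, y)            # switch point: x[k-1] <= y <= x[k]
        # to minimise the weighted mean, weight small x heavily (and vice versa)
        w = np.where(np.arange(len(x)) < k,
                     hi if left else lo,
                     lo if left else hi)
        y_new = np.dot(x, w) / w.sum()
        if np.isclose(y_new, y):
            return y_new
        y = y_new

x = np.linspace(0.0, 10.0, 101)
hi = np.exp(-0.5 * ((x - 5.0) / 2.0) ** 2)   # upper membership function
lo = 0.6 * hi                                # lower membership function
c_l, c_r = km_endpoint(x, lo, hi, True), km_endpoint(x, lo, hi, False)
print(c_l, c_r)   # centroid interval [c_l, c_r], symmetric about 5 for this symmetric FOU
```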

298 citations


Journal ArticleDOI
TL;DR: It is proved that all three centroids are unique and closed-form solutions are given for the sided centroids, which are generalized means; a provably fast and efficient arbitrarily close approximation algorithm for the symmetrized centroid is designed based on its exact geometric characterization.
Abstract: In this paper, we generalize the notions of centroids (and barycenters) to the broad class of information-theoretic distortion measures called Bregman divergences. Bregman divergences form a rich and versatile family of distances that unifies quadratic Euclidean distances with various well-known statistical entropic measures. Since, besides the squared Euclidean distance, Bregman divergences are asymmetric, we consider the left-sided and right-sided centroids and the symmetrized centroids as minimizers of average Bregman distortions. We prove that all three centroids are unique and give closed-form solutions for the sided centroids, which are generalized means. Furthermore, we design a provably fast and efficient arbitrarily close approximation algorithm for the symmetrized centroid based on its exact geometric characterization. The geometric approximation algorithm requires only walking along a geodesic linking the two left/right-sided centroids. We report on our implementation for computing entropic centers of image histogram clusters and entropic centers of multivariate normal distributions, which are useful operations for processing multimedia information and retrieval. These experiments illustrate that our generic methods compare favorably with former limited ad hoc methods.
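For one member of the family, the extended Kullback-Leibler divergence with generator F(x) = Σ(x log x − x), the closed forms are easy to state: the right-sided centroid is the arithmetic mean, and the left-sided centroid is the generalized mean ∇F⁻¹(mean ∇F(xᵢ)), which here is the geometric mean. A minimal NumPy sketch (toy data; illustrative only):

```python
import numpy as np

# Bregman centroids for the (extended) Kullback-Leibler divergence,
# generator F(x) = sum(x log x - x), grad F(x) = log x, (grad F)^-1 = exp.
# Right-sided centroid argmin_c sum_i D_F(x_i, c): the arithmetic mean.
# Left-sided  centroid argmin_c sum_i D_F(c, x_i): the geometric mean here.

def kl(p, q):
    return np.sum(p * np.log(p / q) - p + q)

points = np.random.rand(5, 8) + 0.1          # 5 positive vectors (e.g. histograms)
right = points.mean(axis=0)                  # arithmetic mean
left = np.exp(np.log(points).mean(axis=0))   # geometric mean

print(sum(kl(p, right) for p in points))     # minimal average right-sided distortion
print(sum(kl(left, p) for p in points))      # minimal average left-sided distortion
```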

204 citations


Proceedings ArticleDOI
20 Apr 2009
TL;DR: A fast Class-Feature-Centroid (CFC) classifier for multi-class, single-label text categorization that consistently outperforms state-of-the-art SVM classifiers on both micro-F1 and macro-F1 scores.
Abstract: Automated text categorization is an important technique for many web applications, such as document indexing, document filtering, and cataloging web resources. Many different approaches have been proposed for the automated text categorization problem. Among them, centroid-based approaches have the advantage of short training and testing times due to their computational efficiency. As a result, centroid-based classifiers have been widely used in many web applications. However, the accuracy of centroid-based classifiers is inferior to SVM, mainly because the centroids found during construction are far from perfect locations. We design a fast Class-Feature-Centroid (CFC) classifier for multi-class, single-label text categorization. In CFC, a centroid is built from two important class distributions: an inter-class term index and an inner-class term index. CFC proposes a novel combination of these indices and employs a denormalized cosine measure to calculate the similarity score between a text vector and a centroid. Experiments on the Reuters-21578 corpus and the 20-newsgroup email collection show that CFC consistently outperforms state-of-the-art SVM classifiers on both micro-F1 and macro-F1 scores. In particular, CFC is more effective and robust than SVM when data is sparse.
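The abstract does not spell out the two indices, so the sketch below is a plausible reading rather than the paper's exact formula: an inner-class index grows with a term's document frequency inside the class, an inter-class index grows as the term occurs in fewer classes, and scoring uses a denormalized cosine that normalizes the document vector only. All names and the combination rule are assumptions:

```python
import numpy as np

def cfc_centroids(X, y, n_classes, b=1.5):
    """Build one centroid per class from two class distributions (illustrative
    combination; see the paper for the exact CFC indices).

    X : (n_docs, n_terms) binary term-occurrence matrix
    y : (n_docs,) class labels in [0, n_classes)
    """
    _, n_terms = X.shape
    centroids = np.zeros((n_classes, n_terms))
    # inter-class index: terms occurring in few classes are more discriminative
    class_presence = np.array([(X[y == c].sum(axis=0) > 0) for c in range(n_classes)])
    cf = np.maximum(class_presence.sum(axis=0), 1)
    inter = np.log(n_classes / cf)
    for c in range(n_classes):
        Xc = X[y == c]
        df = Xc.sum(axis=0) / len(Xc)        # inner-class document frequency
        centroids[c] = (b ** df) * inter     # combine the two indices
    return centroids

def classify(doc, centroids):
    """Denormalized cosine: normalize the document vector but not the centroid."""
    return np.argmax(centroids @ (doc / np.linalg.norm(doc)))

X = np.array([[1, 1, 0, 0], [1, 0, 1, 0], [0, 0, 1, 1], [0, 1, 1, 1]])
y = np.array([0, 0, 1, 1])
C = cfc_centroids(X, y, 2)
print(classify(np.array([1.0, 1.0, 0.0, 0.0]), C))   # -> 0
```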

125 citations


Journal ArticleDOI
TL;DR: A novel adaptive mesh refinement (AMR) strategy based on the moment-of-fluid (MOF) method for volume-tracking of evolving interfaces is presented; comparisons show the superior accuracy of the AMR-MOF method over other methods.

77 citations


Proceedings ArticleDOI
15 May 2009
TL;DR: A new approach to optimizing the designation of initial centroids for K-means clustering by using the farthest accumulated distance between them, inspired by the thought process of determining a set of pillars' locations in order to make a stable house or building.
Abstract: Clustering performance of the K-means algorithm relies greatly upon the correctness of the initial centroids. Usually the initial centroids for K-means clustering are determined randomly, so the algorithm may converge to the nearest local minimum rather than the global optimum. This paper proposes a new approach to optimizing the designation of initial centroids for K-means clustering. The approach is inspired by the thought process of determining a set of pillars' locations in order to make a stable house or building. We consider the placement of pillars, which should be located as far apart as possible to withstand the pressure distribution of a roof, as analogous to the placement of centroids within the data distribution. Therefore, our proposed approach designates the positions of the initial centroids by using the farthest accumulated distance between them. First, the accumulated distance metric between all data points and their grand mean is created. The first initial centroid, which has the maximum accumulated distance metric, is selected from the data points. The next initial centroids are designated by modifying the accumulated distance metric between each data point and all previous initial centroids; a data point which then has the maximum distance is selected as a new initial centroid. This iterative process repeats until all the initial centroids are designated. The approach also has a mechanism to avoid outlier data being chosen as initial centroids. The experimental results show the effectiveness of the proposed algorithm for improving the clustering results of K-means clustering.
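A minimal NumPy sketch of the designation procedure described above (the paper's outlier-avoidance mechanism is omitted here, and the toy data is an assumption):

```python
import numpy as np

def pillar_init(X, k):
    """Pick k initial centroids spread as far apart as possible, following the
    accumulated-distance ("pillar") idea.

    X : (n, d) data matrix
    """
    # accumulated distance of every point to the grand mean
    acc = np.linalg.norm(X - X.mean(axis=0), axis=1)
    idx = [int(acc.argmax())]                # first pillar: farthest from the grand mean
    while len(idx) < k:
        # add the distance to the newest pillar, then take the farthest point
        acc = acc + np.linalg.norm(X - X[idx[-1]], axis=1)
        acc[idx] = -np.inf                   # never re-pick an existing pillar
        idx.append(int(acc.argmax()))
    return X[idx]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in ([0, 0], [4, 0], [2, 3])])
print(pillar_init(X, 3))   # one seed near each of the three true clusters
```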

71 citations


Patent
02 Mar 2009
TL;DR: In this article, a method for registration of n frames of 3D point cloud data is presented, where the data points in the n frames are transformed using the global translation and rotation vectors to provide a set of n coarsely adjusted frames.
Abstract: Method (300) for registration of n frames of 3D point cloud data. Frame pairs (200i, 200j) are selected from among the n frames and sub-volumes (702) within each frame are defined. Qualifying sub-volumes are identified in which the 3D point cloud data has a blob-like structure. A location of a centroid associated with each of the blob-like objects is also determined. Correspondence points between frame pairs are determined using the locations of the centroids in corresponding sub-volumes of different frames. Thereafter, the correspondence points are used to simultaneously calculate, for all n frames, global translation and rotation vectors for registering all points in each frame. Data points in the n frames are then transformed using the global translation and rotation vectors to provide a set of n coarsely adjusted frames.
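The final step solves for rigid transforms from matched centroid correspondence points. The patent solves all n frames simultaneously; the NumPy sketch below shows only the standard pairwise least-squares solution (Kabsch/Procrustes) for one frame pair, as an illustration:

```python
import numpy as np

def rigid_from_correspondences(P, Q):
    """Least-squares rotation R and translation t with R @ p + t ≈ q, from matched
    centroid pairs (a pairwise sketch; the patent solves all n frames at once)."""
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((Q - qc).T @ (P - pc))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])   # avoid reflections
    R = U @ D @ Vt
    return R, qc - R @ pc

# Toy check: recover a known rigid motion from four matched blob centroids.
P = np.random.rand(4, 3)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
Q = P @ R_true.T + np.array([1.0, 2.0, 3.0])
R, t = rigid_from_correspondences(P, Q)
print(np.allclose(R, R_true), np.allclose(t, [1, 2, 3]))      # True True
```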

66 citations



Journal ArticleDOI
TL;DR: The CCD-based point source centroid computation (PSCC) error under background light is analyzed comprehensively in theory, numerical simulation, and experiment, and a comprehensive formula for the PSCC error caused by the diversified error sources is put forward.
Abstract: The CCD-based point source centroid computation (PSCC) error under background light is analyzed comprehensively in theory, numerical simulation, and experiment. Furthermore, a comprehensive formula for the PSCC error caused by the diversified error sources is put forward. The optimum threshold that reduces the effects of all the error sources to a minimum is selected. The best threshold level is N̄_B + 3σ_B, where N̄_B is the average value of the error sources and σ_B is the root-mean-square value of their fluctuation. The simulation and experiment results are in close agreement with the theoretical analysis.
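A minimal NumPy sketch of a thresholded point-source centroid using the recommended level N̄_B + 3σ_B estimated from a background region (the toy frame, and the choice to subtract the threshold before weighting, are assumptions):

```python
import numpy as np

def point_source_centroid(img, background):
    """Thresholded centroid of a point-source image, with the threshold set to
    mean(background) + 3 * std(background) as the analysis recommends."""
    t = background.mean() + 3.0 * background.std()
    w = np.clip(img - t, 0.0, None)           # subtract threshold, zero the rest
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (w * ys).sum() / w.sum(), (w * xs).sum() / w.sum()

# Toy frame: Gaussian spot centred at (30.0, 45.0) on a noisy background.
rng = np.random.default_rng(1)
ys, xs = np.mgrid[0:64, 0:96]
img = 100.0 * np.exp(-((ys - 30.0) ** 2 + (xs - 45.0) ** 2) / 8.0)
img += rng.normal(10.0, 2.0, img.shape)       # background light + noise
print(point_source_centroid(img, img[:8, :8]))  # ≈ (30.0, 45.0)
```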

Patent
01 Sep 2009
TL;DR: The number of seeds calculated corresponds to the number of data clusters expected in the set of multi-dimensional data points, and seed values are determined using subsample elimination techniques.
Abstract: Improved efficiencies of data mining clustering techniques are provided by preprocessing a sample set of data points taken from a complete data set to provide seeds for centroid calculations of the complete data set. Such seeds are generated by selecting a uniform sample set of data points from a set of multi-dimensional data and then seed values for the cluster determination calculation are determined using a centroid analysis on the sample set of data points. The number of seeds calculated corresponds to a number of data clusters expected in the set of multi-dimensional data points. Seed values are determined using subsample elimination techniques.

Proceedings ArticleDOI
01 Sep 2009
TL;DR: This paper proposes distributed self-deployment schemes for mobile sensors, using the Voronoi diagram and the centroid (geometric center) to design two schemes, named Centroid (centroid-based scheme) and Dual-Centroid (dual-centroid-based scheme).
Abstract: Efficient deployment of sensors is an important issue in wireless sensor networks. If we deploy sensors only at random, by hand or by carrier, some unlucky places are never covered. In this paper, we propose distributed self-deployment schemes for mobile sensors. At first, sensors are deployed randomly. They then calculate their next positions locally by utilizing the proposed schemes and move to them. The locations of the sensors are adjusted round by round so that the coverage is gradually improved. By using the Voronoi diagram and the centroid (geometric center), we design two schemes, named Centroid (centroid-based scheme) and Dual-Centroid (dual-centroid-based scheme). We also measure the performance of the proposed schemes and existing schemes, and show that the proposed schemes achieve better results.

Proceedings ArticleDOI
19 Jan 2009
TL;DR: Significant improvements can be obtained by the approach in both common centroid and 1-D symmetry placements, and this work is claimed to be the first that can handle both constraints simultaneously.
Abstract: In this paper, we present a placement method for analog circuits. We consider both common centroid and 1-D symmetry constraints, which are the two most common types of placement requirements in analog designs. The approach is based on a symmetric feasible condition on the sequence pair representation that completely covers the set of all placements satisfying the common centroid and 1-D symmetry constraints. This condition is essential for a good search process to solve the problem effectively. Symmetric placement is an important step in achieving matching of other electrical properties, such as delay and temperature variation. We have compared our results with those presented in the most up-to-date previous works. Significant improvements can be obtained by our approach in both common centroid and 1-D symmetry placements, and we are the first to handle both constraints simultaneously.

Patent
27 Feb 2009
TL;DR: In this article, a clustering method for high-dimensional data is proposed, which identifies a set of nearest neighbors of a point in a multidimensional space and determines the centroid of the set of neighbors.
Abstract: A clustering method for high-dimensional data includes identifying a set of nearest neighbors of a point in a multidimensional space and determining the centroid of that set, where the centroid is a member of the set of nearest neighbors. The method is then repeated using the neighbors identified around the computed centroid. In one embodiment, the method may terminate when the computed centroid becomes stationary over successive iterations. The resulting centroid may be returned as a mode of the data set. Points of the data set having common modes may be assigned to the same cluster.
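A minimal NumPy sketch of this mode-seeking iteration, with the centroid snapped to a member of the neighbour set as claimed; the cycle guard and toy data are assumptions added for illustration:

```python
import numpy as np

def knn_mode(X, start, k=8):
    """Follow centroids of k-nearest-neighbour sets until stationary; the centroid
    is snapped to a member of the neighbour set, as in the claimed method."""
    current, visited = start, set()
    while current not in visited:             # stops at a fixed point (or a cycle)
        visited.add(current)
        d = np.linalg.norm(X - X[current], axis=1)
        nn = np.argsort(d)[:k]                # indices of the k nearest neighbours
        c = X[nn].mean(axis=0)                # their ordinary centroid...
        current = int(nn[np.linalg.norm(X[nn] - c, axis=1).argmin()])  # ...snapped into the set
    return current                            # a mode of the data set

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.2, (40, 2)), rng.normal(3, 0.2, (40, 2))])
modes = {knn_mode(X, i) for i in range(len(X))}
print(len(modes))   # a handful of modes; points sharing a mode share a cluster
```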

Journal ArticleDOI
TL;DR: This work presents a generalization of the standard scheme, called Hermite point set surface, which allows interpolating given normal constraints in a stable way, yields an intuitive parameter for shape control, and preserves convexity in most situations.
Abstract: Point set surfaces define a (typically) manifold surface from a set of scattered points. The definition involves weighted centroids and a gradient field. The data points are interpolated if singular weight functions are used to define the centroids. While this way of deriving an interpolatory scheme appears natural, we show that it has two deficiencies: convexity of the input is not preserved, and the extension to Hermite data is numerically unstable. We present a generalization of the standard scheme that we call Hermite point set surface. It allows interpolating given normal constraints in a stable way. It also yields an intuitive parameter for shape control and preserves convexity in most situations. The analysis of derivatives also leads to a more natural way to define normals, in case they are not supplied with the point data. We conclude by comparing to similar surface definitions.

Journal ArticleDOI
TL;DR: Componentwise medians are suggested as the basis for a robust classifier that is relatively insensitive to the difficulties caused by heavy-tailed data and entails straightforward computation.
Abstract: Conventional distance-based classifiers use standard Euclidean distance, and so can suffer from excessive volatility if vector components have heavy-tailed distributions. This difficulty can be alleviated by replacing the L2 distance by its L1 counterpart. For example, the L1 version of the popular centroid classifier would allocate a new data value to the population to whose centroid it was closest in L1 terms. However, this approach can lead to inconsistency, because the centroid is defined using L2, rather than L1, distance. In particular, by mixing L1 and L2 approaches, we produce a classifier that can seriously misidentify data in cases where the means and medians of marginal distributions take different values. These difficulties motivate replacing centroids by medians. However, in the very-high-dimensional settings commonly encountered today, this can be problematic if we attempt to work with a conventional spatial median. Therefore, we suggest using componentwise medians to construct a robust classifier.
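A minimal NumPy sketch of the suggested classifier: componentwise medians per class, with allocation by L1 distance (the toy heavy-tailed data is an assumption for illustration):

```python
import numpy as np

def fit_medians(X, y, classes):
    """Componentwise median of each class: the robust replacement for the centroid."""
    return {c: np.median(X[y == c], axis=0) for c in classes}

def classify_l1(x, medians):
    """Allocate x to the class whose componentwise median is nearest in L1 distance."""
    return min(medians, key=lambda c: np.abs(x - medians[c]).sum())

# Toy data with heavy-tailed components (Cauchy noise) in high dimension.
rng = np.random.default_rng(3)
d = 200
X0 = rng.standard_cauchy((50, d))            # class 0 around 0
X1 = rng.standard_cauchy((50, d)) + 1.0      # class 1 shifted by 1 in every component
X, y = np.vstack([X0, X1]), np.repeat([0, 1], 50)
med = fit_medians(X, y, [0, 1])
print(classify_l1(rng.standard_cauchy(d) + 1.0, med))   # usually 1
```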

Journal ArticleDOI
TL;DR: A centroid measurement algorithm based on adaptive thresholding and dynamic windowing, utilizing image processing techniques for practical application of the digital SHWS in surface profile measurement, is presented; it has better precision, repeatability, and stability than other commonly used centroid methods.
Abstract: A Shack-Hartmann wavefront sensor (SHWS) splits the incident wavefront into many subsections and transforms the distorted wavefront detection into a centroid measurement. The accuracy of the centroid measurement determines the accuracy of the SHWS. Many methods have been presented to improve the accuracy of the wavefront centroid measurement. However, most of these methods are discussed from the point of view of optics, based on the assumption that the spot intensity of the SHWS has a Gaussian distribution, which is not applicable to the digital SHWS. In this paper, we present a centroid measurement algorithm based on adaptive thresholding and dynamic windowing, utilizing image processing techniques, for practical application of the digital SHWS in surface profile measurement. The method can detect the centroid of each focal spot precisely and robustly by eliminating the influence of various noises, such as diffraction of the digital SHWS, unevenness and instability of the light source, as well as deviation between the centroid of the focal spot and the center of the detection area. The experimental results demonstrate that the algorithm has better precision, repeatability, and stability compared with other commonly used centroid methods, such as the statistical averaging, thresholding, and windowing algorithms.
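A minimal NumPy sketch of the adaptive-thresholding/dynamic-windowing idea: re-centre a window on the focal spot and take a thresholded centroid inside it. The specific threshold rule below is an assumption for illustration; the paper's rule may differ:

```python
import numpy as np

def spot_centroid(img, y0, x0, half=10, iters=5):
    """Iteratively re-centre a window on a focal spot and take a thresholded
    centroid inside it (a sketch of the adaptive-threshold/dynamic-window idea)."""
    y, x = float(y0), float(x0)
    for _ in range(iters):
        ylo, xlo = int(max(y - half, 0)), int(max(x - half, 0))
        win = img[ylo:ylo + 2 * half, xlo:xlo + 2 * half]
        t = win.mean() + 0.5 * (win.max() - win.mean())   # adaptive threshold (illustrative)
        w = np.clip(win - t, 0.0, None)
        ys, xs = np.mgrid[0:win.shape[0], 0:win.shape[1]]
        y = ylo + (w * ys).sum() / w.sum()
        x = xlo + (w * xs).sum() / w.sum()
    return y, x

ys, xs = np.mgrid[0:64, 0:64]
img = np.exp(-((ys - 33.4) ** 2 + (xs - 28.7) ** 2) / 6.0)  # one Shack-Hartmann spot
print(spot_centroid(img, 30, 30))   # ≈ (33.4, 28.7)
```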

Journal ArticleDOI
TL;DR: With this method, the recorded Shack-Hartmann spots are not constrained to stay in the field of view of their lenslet, which makes it useful for the measurements of highly aberrated eyes.
Abstract: We present an algorithm to extend significantly the dynamic range of a Shack-Hartmann wavefront sensor. With this method, the recorded Shack-Hartmann spots are not constrained to stay in the field of view of their lenslet. The proposed algorithm is computationally effective, robust to a high level of noise on the measured centroid positions, and also to missing centroid values. The principle is closely related to the description of wavefronts using Zernike polynomials, which makes optimization for a given sensor and application achievable thanks to numerical simulation. These features make it useful for the measurements of highly aberrated eyes.

Journal ArticleDOI
TL;DR: A new methodology to establish the best global match of objects' contours in images is presented, and its results are compared to those obtained by the geometric modeling approach proposed by Shapiro and Brady, which is well known in this domain.
Abstract: This paper presents a new methodology to establish the best global match of objects' contours in images. The first step is the extraction of the sets of ordered points that define the objects' contours. Then, by using the curvature value of each point and its distance to the corresponding centroid, an affinity matrix is built. This matrix contains information on the cost of all possible matches between the two sets of ordered points. Then, to determine the desired one-to-one global matching, an assignment algorithm based on dynamic programming is used. This algorithm establishes the global matching of minimum global cost that preserves the circular order of the contours' points. Additionally, a methodology to estimate the similarity transformation that best aligns the matched contours is also presented. This methodology uses the matching information obtained previously, in addition to a statistical process, to estimate the parameters of the similarity transformation in question. In order to validate the proposed matching methodology, its results are compared to those obtained by the geometric modeling approach proposed by Shapiro and Brady, which is well known in this domain.

Journal ArticleDOI
TL;DR: In this article, the authors explore the use of centroid coordinates as a means to identify the "locations" of electron-proton bremsstrahlung hard X-ray sources in solar flares.
Abstract: We explore the use of centroid coordinates as a means to identify the "locations" of electron-proton bremsstrahlung hard X-ray sources in solar flares. Differences between the coordinates of the electron and photon centroids are derived and explained. For electron propagation in a collision-dominated target, with either a uniform or an exponential density profile, the position of the electron centroid can be calculated analytically. We compare these analytic forms to data from a flare event on 2002 February 20. We first spectrally invert the native photon visibility data to obtain "electron visibilities," which are in turn used to construct electron flux images at various electron energies E. Centroids of these maps are then obtained by straightforward numerical integration over the electron maps. This comparison allows us to infer the density structure in the two compact sources visible, and we discuss the (somewhat unexpected) results thus obtained.

Proceedings ArticleDOI
26 Jul 2009
TL;DR: A novel shape descriptor based on the histogram matrix of pixel-level structural features that can measure circularity, smoothness, and symmetry of shapes, and be used to recognize shapes.
Abstract: A novel shape descriptor based on the histogram matrix of pixel-level structural features is presented. First, length ratios and angles between the centroid and contour points of a shape are calculated as two structural attributes. Then, the attributes are combined to construct a new histogram matrix in the feature space statistically. The proposed shape descriptor can measure circularity, smoothness, and symmetry of shapes, and be used to recognize shapes. Experimental results demonstrate the effectiveness of our method.
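A minimal NumPy sketch of the two structural attributes and their histogram matrix (the bin counts and toy contour are assumptions):

```python
import numpy as np

def shape_histogram(contour, r_bins=8, a_bins=12):
    """Histogram matrix of two structural attributes of contour points: length
    ratio to the centroid (scale-invariant) and angle about the centroid."""
    c = contour.mean(axis=0)                      # shape centroid
    v = contour - c
    r = np.linalg.norm(v, axis=1)
    ratios = r / r.max()                          # length ratios in (0, 1]
    angles = np.arctan2(v[:, 1], v[:, 0])         # angles in [-pi, pi]
    H, _, _ = np.histogram2d(ratios, angles, bins=(r_bins, a_bins),
                             range=((0, 1), (-np.pi, np.pi)))
    return H / H.sum()                            # normalised descriptor matrix

theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
ellipse = np.c_[2.0 * np.cos(theta), np.sin(theta)]
print(shape_histogram(ellipse).shape)             # (8, 12)
```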

Journal ArticleDOI
01 Oct 2009
TL;DR: The experiments show that the devised method outperforms other well-known Fourier-based shape descriptors such as centroid distance and boundary curvature.
Abstract: This paper presents two shape descriptors which can be applied to both binary and grayscale images. The proposed algorithm utilizes gradient-based features which are extracted along the object boundaries. We use two-dimensional steerable G-Filters (IEEE Trans Pattern Anal Mach Intell 19(6):545–563, 1997) to obtain gradient information at different orientations and scales, and then aggregate the gradients into a shape signature. The signature derived from the rotated object is a circularly shifted version of the signature derived from the original object. This property is called the circular-shifting rule (Affine-invariant gradient based shape descriptor. Lecture Notes in Computer Science. International Workshop on Multimedia Content Representation, Classification and Security, pp 514–521, 2006). The shape descriptor is defined as the Fourier transform of the signature. We also provide a distance measure for the proposed descriptor by taking the circular-shifting rule into account. The performance of the proposed descriptor is evaluated over two databases: one containing digits taken from vehicle license plates, and the other containing the MPEG-7 Core Experiment and Kimia shape data sets. The experiments show that the devised method outperforms other well-known Fourier-based shape descriptors such as centroid distance and boundary curvature.

Journal Article
TL;DR: This paper presents a new fast codebook search algorithm which uses sorting and a centroid technique to find the closest codevector in the codebook, using the mean absolute error as the quality factor.
Abstract: Vector Quantization (VQ) is an efficient technique for data compression and has been successfully used in various applications. In this paper we present a new fast codebook search algorithm which uses sorting and a centroid technique to find the closest codevector in the codebook. The proposed search algorithm is faster since it reduces the number of Euclidean distance computations compared to the exhaustive search algorithm, while keeping the image quality imperceptibly close to that of the exhaustive search. We have used the mean absolute error as the quality factor since it gives a better feel for the distortion. The proposed algorithm is also compared with other codebook search algorithms from the literature, and its performance parameters, average execution time and average number of Euclidean distance computations per image training vector, are found to be considerably better than most of them.

Journal ArticleDOI
TL;DR: A novel automated protocol for tracking the movement of several animals at once, based on a multivariate morphometric approach, is presented; marked differences in the amplitude of activity phases indicate interindividual interaction.

Journal ArticleDOI
TL;DR: The connection between the results presented in this work and symmetry detection principles suggested from previous complex moment-based formulations is demonstrated, offering a unifying framework for shape orientation/symmetry detection.
Abstract: In this paper, the problem of moment-based shape orientation and symmetry classification is jointly considered. A generalization and modification of current state-of-the-art geometric moment-based functions is introduced. The properties of these functions are investigated thoroughly using Fourier series analysis and several observations and closed-form solutions are derived. We demonstrate the connection between the results presented in this work and symmetry detection principles suggested from previous complex moment-based formulations. The proposed analysis offers a unifying framework for shape orientation/symmetry detection. In the context of symmetry classification and matching, the second part of this work presents a frequency domain method, aiming at computing a robust moment-based feature set based on a true polar Fourier representation of image complex gradients and a novel periodicity detection scheme using subspace analysis. The proposed approach removes the requirement for accurate shape centroid estimation, which is the main limitation of moment-based methods, operating in the image spatial domain. The proposed framework demonstrated improved performance, compared to state-of-the-art methods.
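The standard geometric-moment orientation estimate that this line of work builds on is θ = ½·atan2(2μ₁₁, μ₂₀ − μ₀₂), computed from second-order central moments about the shape centroid. A minimal NumPy sketch (toy image; illustrative only):

```python
import numpy as np

def moment_orientation(img):
    """Principal orientation of a binary/grayscale shape from second-order
    central moments: theta = 0.5 * atan2(2*mu11, mu20 - mu02)."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m = img.sum()
    yc, xc = (img * ys).sum() / m, (img * xs).sum() / m   # shape centroid
    mu20 = (img * (xs - xc) ** 2).sum()
    mu02 = (img * (ys - yc) ** 2).sum()
    mu11 = (img * (xs - xc) * (ys - yc)).sum()
    return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)

ys, xs = np.mgrid[0:64, 0:64]
bar = ((np.abs((xs - 32) - (ys - 32)) < 4) & (np.abs(xs - 32) < 20)).astype(float)
print(np.degrees(moment_orientation(bar)))   # ≈ 45 for a bar along y = x
```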

Patent
16 Apr 2009
TL;DR: In this article, a unified template matching technique was proposed to provide an adequate matching position even in an image with a distorted pattern shape and a variation in edge intensity, where the candidates were then re-sorted based on candidate positions and correlation values newly obtained.
Abstract: The present invention provides a unified template matching technique which allows an adequate matching position to be provided even in an image with a distorted pattern shape and a variation in edge intensity. A correlation value contribution rate map is created for the vicinity of each of top candidate positions obtained by applying a centroid distance filter to a normalized correlation map resulting from template matching. A corrected intensity image is created from the correlation value contribution rate maps. Luminance correction is performed based on the corrected intensity image. Local matching is performed again on the vicinity of each candidate position. The candidates are then re-sorted based on candidate positions and correlation values newly obtained. Thus, even in an image with a distorted pattern shape and a variation in edge intensity, an adequate matching position can be provided in a unified manner.

Journal ArticleDOI
TL;DR: It was found that the orthogonal error function results in two local minima and that the outcome of the optimization depends on the choice of starting point, so the directional error function is more suitable for applications that require fully automated sphere fitting.
Abstract: Two error functions used for nonlinear least squares (LS) fitting of spheres to range data from 3-D imaging systems are discussed: the orthogonal error function and the directional error function. Both functions allow unrestricted gradient-based minimization and were tested on more than 40 data sets collected under different experimental conditions (e.g., different sphere diameters, instruments, data density, and data noise). It was found that the orthogonal error function results in two local minima and that the outcome of the optimization depends on the choice of starting point. The centroid of the data points is commonly used as the starting point for the nonlinear LS solution, but the choice of starting point is sensitive to data segmentation and, for some sparse and noisy data sets, can lead to a spurious minimum that does not correspond to the center of a real sphere. The directional error function has only one minimum; therefore, it is not sensitive to the starting point and is more suitable for applications that require fully automated sphere fitting.
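A minimal NumPy/SciPy sketch of the orthogonal error function fit, started from the data centroid as described; the directional error function, which measures the error along the instrument's line of sight, is not reproduced here. The toy data (a noisy spherical cap, as a 3-D imager would see) is an assumption:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_sphere(points):
    """Nonlinear LS sphere fit with the orthogonal error function
    e_i = ||p_i - c|| - r, started from the data centroid as is common."""
    def residuals(params):
        c, r = params[:3], params[3]
        return np.linalg.norm(points - c, axis=1) - r
    c0 = points.mean(axis=0)                       # centroid starting point
    r0 = np.linalg.norm(points - c0, axis=1).mean()
    sol = least_squares(residuals, np.r_[c0, r0])
    return sol.x[:3], sol.x[3]

rng = np.random.default_rng(4)
phi = rng.uniform(0, 0.6 * np.pi, 500)
th = rng.uniform(0, 2 * np.pi, 500)
pts = 2.0 * np.c_[np.sin(phi) * np.cos(th), np.sin(phi) * np.sin(th), np.cos(phi)]
pts += np.array([1.0, 2.0, 3.0]) + rng.normal(0, 0.01, pts.shape)
print(fit_sphere(pts))   # centre ≈ (1, 2, 3), radius ≈ 2
```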

Proceedings ArticleDOI
18 Sep 2009
TL;DR: The results shows that the algorithm can realize accurate target location when there is none of error in the information of node information and time difference of arrival time, and the effectiveness of the algorithm is better than the centroid algorithm when there are minor error.
Abstract: In order to improve the location accuracy of the centroid algorithm in binary-detection sensor networks, a cooperative target location algorithm based on time difference of arrival was proposed. This algorithm only needs the position information and the target signal arrival time information of four nodes; it is simple in principle and calculation. Besides, because this algorithm used the centroid information of the four nodes, the algorithm can tolerant some minor error including the node position error and the time difference error. Simulations were made, and the results shows that the algorithm can realize accurate target location when there is none of error in the information of node information and time difference of arrival time, and the effectiveness of the algorithm is better than the centroid algorithm when there are minor error in the information of node position and time difference of arrival time.

Book ChapterDOI
30 Sep 2009
TL;DR: In this paper, the authors present algorithms to infer movement by making use of inherent fluctuations in the received signal strengths from existing WLAN infrastructure, and evaluate the performance of the presented algorithms based on classification metrics such as recall and precision using annotated traces obtained over twelve hours.
Abstract: We present novel algorithms to infer movement by making use of inherent fluctuations in the received signal strengths from existing WLAN infrastructure. We evaluate the performance of the presented algorithms based on classification metrics such as recall and precision, using annotated traces obtained over twelve hours from different types of environments and with different access point densities. We show how common deterministic localisation algorithms such as centroid and weighted centroid can improve when a motion model is included. To our knowledge, motion models are normally used only in probabilistic algorithms, and such simple deterministic algorithms have not used a motion model in a principled manner. We also evaluate the performance of these algorithms against traces of RSSI data, with and without adding inferred mobility information.
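For reference, the deterministic baseline the paper improves on is easy to state: a (weighted) centroid of the access point positions. The RSSI-to-weight mapping below is an illustrative assumption; the paper's contribution is to gate or smooth such estimates with inferred motion:

```python
import numpy as np

def weighted_centroid(ap_positions, rssi_dbm):
    """Weighted centroid localisation: average the access point positions,
    weighted by received signal strength (stronger signal -> larger weight;
    the dBm-to-weight mapping here is a crude illustrative choice)."""
    w = 10.0 ** (np.asarray(rssi_dbm) / 20.0)
    return (np.asarray(ap_positions) * w[:, None]).sum(axis=0) / w.sum()

aps = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
print(weighted_centroid(aps, [-40.0, -70.0, -75.0]))   # pulled toward the strong AP
```

With a motion model, successive estimates like this one would only be updated (or smoothed more aggressively) when the inferred mobility state says the device is actually moving.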