Journal ArticleDOI

Line generalisation by repeated elimination of points

01 Jan 1993-Cartographic Journal (Maney Publishing)-Vol. 30, Iss: 1, pp 46-51
TL;DR: A new approach to line generalisation uses the concept of 'effective area' for progressive simplification of a line by point elimination, and offers scope for modelling cartographic lines as consisting of features within features, so that their geometric manipulation may be modified by application- and/or user-defined rules and weights.
Abstract: This paper presents a new approach to line generalisation which uses the concept of 'effective area' for progressive simplification of a line by point elimination. Two coastlines are used to compare the performance of this algorithm with that of the widely used Douglas-Peucker algorithm. The results from the area-based algorithm compare favourably with manual generalisation of the same lines. It is capable of achieving both imperceptible minimal simplifications and caricatural generalisations. By careful selection of cut-off values, it is possible to use the same algorithm for scale-dependent and scale-independent generalisations. More importantly, it offers scope for modelling cartographic lines as consisting of features within features so that their geometric manipulation may be modified by application- and/or user-defined rules and weights. The paper examines the merits and limitations of the algorithm and the opportunities it offers for further research and progress in the field of line generalisation.
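A minimal sketch of the effective-area idea, assuming the line is a list of (x, y) tuples: each interior point is weighted by the area of the triangle it forms with its neighbours, and the point with the smallest area is repeatedly eliminated until every remaining point exceeds a cut-off. The function names and the full rescan after each elimination are illustrative simplifications (an efficient implementation would use a priority queue), not code from the paper.

```python
def triangle_area(a, b, c):
    """Area of the triangle formed by three consecutive points."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) -
               (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def simplify_by_effective_area(points, min_area):
    """Repeatedly drop the interior point whose effective area is smallest,
    until every remaining interior point has area >= min_area.
    End points are never removed."""
    pts = list(points)
    while len(pts) > 2:
        # Effective area of each interior point with its current neighbours.
        areas = [triangle_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        smallest = min(range(len(areas)), key=areas.__getitem__)
        if areas[smallest] >= min_area:
            break
        del pts[smallest + 1]   # +1 because areas[] starts at the second point
    return pts

# Toy usage with an invented five-point line and cut-off value.
coast = [(0, 0), (1, 0.1), (2, 1.5), (3, 0.2), (4, 0)]
print(simplify_by_effective_area(coast, min_area=0.7))
```

Because points are removed in order of increasing effective area, the same routine yields a minimal simplification with a small cut-off and a caricature with a large one.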
Citations
Patent
17 Jun 2010
TL;DR: In this paper, the authors use tiles of stored geographic data available at some scales to generate maps having several scales, including but not limited to the scales of the stored data, organized in a tile-tree in which different tiles contain approximately the same amount of data.
Abstract: By utilizing the structure and methods of the invention when loading geographic vector data, the average time to load all geographic data needed in moving from one rendered map to the next is greatly reduced; the reduction is especially large when pan and zoom operations are the main transitions between rendered maps. The invention uses tiles of stored geographic data available at some scales to generate maps having several scales, including but not limited to the scales of the stored data. The geographical data is organized in a tile-tree in which different tiles contain approximately the same amount of data. This is accomplished by a data generation process that packs details higher in the tree for areas where the data density is low.
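The tile-tree described here groups stored features so that every tile holds roughly the same amount of data, with detail packed higher in the tree where density is low. The sketch below shows one way such a budgeted tree could be built for point features; the BUDGET constant, the quadrant split, and the take-the-first-features policy are illustrative assumptions, not details from the patent.

```python
from dataclasses import dataclass, field

BUDGET = 4  # hypothetical per-tile feature budget

@dataclass
class Tile:
    bounds: tuple                                   # (x0, y0, x1, y1)
    features: list = field(default_factory=list)    # detail kept at this level
    children: list = field(default_factory=list)    # sub-tiles, if any

def build_tile_tree(features, bounds):
    """Keep up to BUDGET point features in this tile and push the rest into
    four equal quadrants, so sparse areas stay high in the tree while dense
    areas grow deeper.  (A real builder would rank features by importance.)"""
    x0, y0, x1, y1 = bounds
    tile = Tile(bounds, features[:BUDGET])
    rest = features[BUDGET:]
    if rest:
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        buckets = [[], [], [], []]
        for fx, fy in rest:
            buckets[(1 if fx >= mx else 0) + (2 if fy >= my else 0)].append((fx, fy))
        quads = [(x0, y0, mx, my), (mx, y0, x1, my),
                 (x0, my, mx, y1), (mx, my, x1, y1)]
        for child_features, child_bounds in zip(buckets, quads):
            if child_features:
                tile.children.append(build_tile_tree(child_features, child_bounds))
    return tile
```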

243 citations

Patent
16 Mar 2001
TL;DR: In this paper, a system and method for making computer-generated maps includes a different scale factor for each road in a route and a refinement technique such as simulated annealing is used to find a solution to the target function.
Abstract: A system and method for making computer-generated maps includes a different scale factor for each road in a route. The scale factors are used to optimize the route map against a target function that considers factors such as the number of false intersections in the route and the number of roads falling below a minimum length threshold. A refinement technique such as simulated annealing is used to find a solution to the target function. Each road in the scaled map is rendered to provide a finished product having the appearance of a hand-drawn map. The finished product includes context roads that intersect the main route but are not part of the main route. Furthermore, the hand-drawn map is optimized to the characteristics of the viewport used to visualize the map.
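The optimisation loop described here, one scale factor per road refined against a target function by simulated annealing, can be illustrated with a toy sketch. The cost term below only penalises roads that fall under a minimum drawn length; the real target function also counts false intersections, and the cooling schedule and perturbation range are assumptions made for the example.

```python
import math, random

MIN_LENGTH = 1.0    # hypothetical minimum acceptable drawn road length

def cost(road_lengths, scales):
    """Toy target function: number of roads whose scaled length is too short."""
    return sum(1 for length, s in zip(road_lengths, scales) if length * s < MIN_LENGTH)

def anneal_scales(road_lengths, steps=5000, t0=1.0):
    """Simulated annealing over one scale factor per road."""
    current = [1.0] * len(road_lengths)
    cur_cost = cost(road_lengths, current)
    best, best_cost = list(current), cur_cost
    for step in range(steps):
        t = max(t0 * (1 - step / steps), 1e-9)      # linear cooling schedule
        cand = list(current)
        i = random.randrange(len(cand))
        cand[i] *= random.uniform(0.8, 1.25)        # perturb one road's scale
        c = cost(road_lengths, cand)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if c <= cur_cost or random.random() < math.exp((cur_cost - c) / t):
            current, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = list(cand), c
    return best
```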

234 citations

Journal ArticleDOI
TL;DR: This paper evaluates the performance of line simplification algorithms using two comprehensive measures of positional accuracy of the simplified line: one displacement measure and one shape distortion measure, both of which are able to consider the displacement between the original line and its simplified version.
Abstract: Many studies of line simplification methods have been developed; however, an evaluation of these methods is still an open issue. This paper aims to evaluate a diversity of automatic line simplification algorithms in terms of positional accuracy and processing time. Past research studies for the performance evaluation were centred on measuring the location difference between a line to be simplified and its simplified version. However, the original line contains positional uncertainty. This paper evaluates performance of the line simplification algorithms using two comprehensive measures of positional accuracy of the simplified line. These two measures include one displacement measure and one shape distortion measure, both of which are able to consider (a) the displacement between the original line and its simplified version, and (b) positional uncertainty of the original line.
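As an illustration of the displacement side of such an evaluation, the sketch below computes one simple measure, the mean distance from each original vertex to the simplified line. This particular formula is a common textbook-style choice, not necessarily either of the two measures proposed in the paper.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def mean_displacement(original, simplified):
    """Average distance from each vertex of the original line to the
    nearest segment of its simplified version."""
    nearest = [min(point_segment_distance(p, simplified[i], simplified[i + 1])
                   for i in range(len(simplified) - 1))
               for p in original]
    return sum(nearest) / len(nearest)
```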

121 citations


Cites background from "Line generalisation by repeated eli..."

  • ...Another example of a global routine is an area-based line generalization proposed by Visvalingam and Whyatt (1993)....

    [...]

Book ChapterDOI
16 Sep 1996

118 citations

References
Journal ArticleDOI
TL;DR: In this paper, two algorithms to reduce the number of points required to represent the line and, if desired, produce caricatures are presented and compared with the most promising methods so far suggested.
Abstract: All digitizing methods, as a general rule, record lines with far more data than is necessary for accurate graphic reproduction or for computer analysis. Two algorithms to reduce the number of points required to represent the line and, if desired, produce caricatures, are presented and compared with the most promising methods so far suggested. Line reduction will form a major part of automated generalization.
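For reference, a standard textbook formulation of the Douglas-Peucker procedure discussed throughout this page is sketched below, assuming a perpendicular-distance tolerance in the line's coordinate units; the helper names are illustrative, and production code would usually replace the recursion with an explicit stack for very long lines.

```python
import math

def perpendicular_distance(p, a, b):
    """Distance from point p to the infinite line through a and b."""
    ax, ay = a; bx, by = b; px, py = p
    if (ax, ay) == (bx, by):
        return math.hypot(px - ax, py - ay)
    return abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / math.hypot(bx - ax, by - ay)

def douglas_peucker(points, tolerance):
    """Keep the end points; recursively keep the most distant interior point
    whenever it lies further than `tolerance` from the anchor-floater line."""
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right
```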

3,749 citations


"Line generalisation by repeated eli..." refers methods in this paper

  • ...The caricatures produced by the Douglas-Peucker algorithm (Douglas and Peucker, 1973) are believed to be most successful....

    [...]

Journal ArticleDOI
TL;DR: The paper focuses on special types of lawfulness which may exist in space at a fixed time and which seem particularly relevant to processes of visual perception.
Abstract: The ideas of information theory are at present stimulating many different areas of psychological inquiry. In providing techniques for quantifying situations which have hitherto been difficult or impossible to quantify, they suggest new and more precise ways of conceptualizing these situations (see Miller [12] for a general discussion and bibliography). Events ordered in time are particularly amenable to informational analysis; thus language sequences are being extensively studied, and other sequences, such as those of music, plainly invite research. In this paper I shall indicate some of the ways in which the concepts and techniques of information theory may clarify our understanding of visual perception. When we begin to consider perception as an information-handling process, it quickly becomes clear that much of the information received by any higher organism is redundant. Sensory events are highly interdependent in both space and time: if we know at a given moment the states of a limited number of receptors (i.e., whether they are firing or not firing), we can make better-than-chance inferences with respect to the prior and subsequent states of these receptors, and also with respect to the present, prior, and subsequent states of other receptors. The preceding statement, taken in its broadest implications, is precisely equivalent to an assertion that the world as we know it is lawful. In the present discussion, however, we shall restrict our attention to special types of lawfulness which may exist in space at a fixed time, and which seem particularly relevant to processes of visual perception.

2,800 citations

Journal ArticleDOI
TL;DR: The value of visualization within one problem in cartography, namely the generalisation of lines, is demonstrated, and tools for the generation and manipulation of realistic images are shown to be of limited value within this application.
Abstract: The primary aim of this paper is to illustrate the value of visualization in cartography and to indicate that tools for the generation and manipulation of realistic images are of limited value within this application. This paper demonstrates the value of visualization within one problem in cartography, namely the generalisation of lines. It reports on the evaluation of the Douglas-Peucker algorithm for line simplification. Visualization of the simplification process and of the results suggests that the mathematical measures of performance proposed by some other researchers are inappropriate, misleading and questionable.

98 citations


"Line generalisation by repeated eli..." refers background in this paper

  • ...However, Visvalingam and Whyatt (1990) demonstrated why a fixed rank-order of critical points, no matter how they are devised, limits the scope for producing appropriate scale-related displays....

    [...]

Journal ArticleDOI
TL;DR: An efficient (linear time), computationally simple algorithm is developed that achieves a high degree of data reduction while producing a representation that is accurate for even the most complex curves.
Abstract: A planar curve may be represented by a sequence of connected line segments. Existent algorithms for reducing the number of line segments used to represent a curve are examined. An efficient (linear time), computationally simple algorithm is developed. This algorithm achieves a high degree of data reduction while producing a representation that is accurate for even the most complex curves.
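The cited linear-time method itself is not reproduced here; instead, the sketch below illustrates the general class of single-pass, corridor-style point reduction (in the spirit of the Reumann-Witkam approach), assuming a fixed perpendicular tolerance. It is offered as a point of comparison only, not as the algorithm from this reference.

```python
import math

def line_distance(p, a, b):
    """Perpendicular distance from p to the infinite line through a and b."""
    ax, ay = a; bx, by = b; px, py = p
    if (ax, ay) == (bx, by):
        return math.hypot(px - ax, py - ay)
    return abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / math.hypot(bx - ax, by - ay)

def corridor_reduce(points, tolerance):
    """Single forward pass: keep the next point only when it leaves the
    tolerance corridor around the line through the last kept point and
    its immediate successor."""
    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    i, j = 0, 1                      # corridor anchor and direction points
    for k in range(2, len(points)):
        if line_distance(points[k], points[i], points[j]) > tolerance:
            kept.append(points[k])
            i, j = k, min(k + 1, len(points) - 1)
    if kept[-1] != points[-1]:
        kept.append(points[-1])
    return kept
```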

79 citations


"Line generalisation by repeated eli..." refers methods in this paper

  • ...Whyatt (1991) also found that the method produces better results than the algorithms proposed by Roberge (1985) and Dettori and Falcidieno (1982)....

    [...]

  • ...Whyatt (1991) also found that the method produces better results than the algorithms proposed by Roberge (1985) and Dettori and Falcidieno (1982). However, as with other point-based methods, the...

    [...]

Journal ArticleDOI
TL;DR: This paper examines how effectively both models capture cartographic line width and succeed in producing generalized results, particularly for larger scale reductions.
Abstract: ‘The Theory of a Cartographic Line’ (Peucker 1975) describes width as being the essential characteristic of a cartographic line. Digital representations have tended to ignore this basic attribute, and in the context of generalization the oversight is detrimental. The theory claims that a set of enclosing bands captures the cartographic character of width and supports generalization. The Douglas algorithm, still one of the most commonly used algorithms for generalizing digital representations, uses this model. Work of the Polish mathematician Perkal provides the basis for another model of cartographic line width and a different generalization technique. This paper examines how effectively both models capture cartographic line width and succeed in producing generalized results, particularly for larger scale reductions. The two techniques are assessed by their ability to satisfy two objectives: capturing the essential and recognizable characteristics of geographic features and creating representations which ca...

27 citations