Author

J. D. Whyatt

Bio: J. D. Whyatt is an academic researcher at the University of Hull. His research covers cartographic generalization and visualization. He has an h-index of 3 and has co-authored 3 publications receiving 385 citations.

Papers
Journal ArticleDOI
TL;DR: A new approach to line generalisation that uses the concept of 'effective area' for progressive simplification of a line by point elimination, offering scope for modelling cartographic lines as features within features whose geometric manipulation may be modified by application- and/or user-defined rules and weights.
Abstract: This paper presents a new approach to line generalisation which uses the concept of 'effective area' for progressive simplification of a line by point elimination. Two coastlines are used to compare the performance of this algorithm with that of the widely used Douglas-Peucker algorithm. The results from the area-based algorithm compare favourably with manual generalisation of the same lines. It is capable of achieving both imperceptible minimal simplifications and caricatural generalisations. By careful selection of cut-off values, it is possible to use the same algorithm for scale-dependent and scale-independent generalisations. More importantly, it offers scope for modelling cartographic lines as consisting of features within features so that their geometric manipulation may be modified by application- and/or user-defined rules and weights. The paper examines the merits and limitations of the algorithm and the opportunities it offers for further research and progress in the field of line generalisation.
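The point-elimination loop described here is simple enough to sketch. Below is a minimal Python illustration of the effective-area idea, assuming 2D points as (x, y) tuples; the function names and the O(n^2) rescan are illustrative simplifications, not the paper's implementation:

```python
def triangle_area(a, b, c):
    """Area of the triangle formed by three (x, y) points."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def simplify_by_effective_area(points, min_area):
    """Progressively eliminate the interior point whose 'effective area'
    (the triangle it forms with its two current neighbours) is smallest,
    until every remaining interior point contributes at least min_area."""
    pts = list(points)
    while len(pts) > 2:
        # Effective area of each interior point with its current neighbours.
        areas = [triangle_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        smallest = min(range(len(areas)), key=areas.__getitem__)
        if areas[smallest] >= min_area:
            break  # every surviving point is significant at this cut-off
        del pts[smallest + 1]  # eliminate the least significant point
    return pts
```

Choosing min_area is the cut-off selection the abstract refers to: a small value yields an imperceptible minimal simplification, a large one a caricatural generalisation. A production version would maintain a priority queue rather than rescanning, but the cut-off logic is the same.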

305 citations

Journal ArticleDOI
TL;DR: The value of visualization is demonstrated within one problem in cartography, namely the generalisation of lines, and tools for the generation and manipulation of realistic images are shown to be of limited value within this application.
Abstract: The primary aim of this paper is to illustrate the value of visualization in cartography and to indicate that tools for the generation and manipulation of realistic images are of limited value within this application. This paper demonstrates the value of visualization within one problem in cartography, namely the generalisation of lines. It reports on the evaluation of the Douglas-Peucker algorithm for line simplification. Visualization of the simplification process and of its results suggests that the mathematical measures of performance proposed by some other researchers are inappropriate, misleading and questionable.

98 citations

Journal ArticleDOI
TL;DR: An in-depth study of a line simplification algorithm shows that computerisation introduces its own sources of variability, and suggests that it would be difficult to adapt the Douglas-Peucker algorithm to cope with digitising error without altering the method.
Abstract: Cartographic generalisation remains one of the outstanding challenges in digital cartography and Geographical Information Systems (GIS). It is generally assumed that computerisation will lead to the removal of spurious variability introduced by the subjective decisions of individual cartographers. This paper demonstrates through an in-depth study of a line simplification algorithm that computerisation introduces its own sources of variability. The algorithm, referred to as the Douglas-Peucker algorithm in cartographic literature, has been widely used in image processing, pattern recognition and GIS for some 20 years. An analysis of this algorithm and a study of some implementations in wide use identify the presence of variability resulting from the subjective decisions of software implementors. Spurious variability in software complicates the processes of evaluation and comparison of alternative algorithms for cartographic tasks. No doubt, variability in implementation could be removed by rigorous study and specification of algorithms. Such future work must address the presence of digitising error in cartographic data. Our analysis suggests that it would be difficult to adapt the Douglas-Peucker algorithm to cope with digitising error without altering the method.
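For reference, a common recursive formulation of the algorithm is sketched below in Python. The places where published implementations are known to diverge are flagged in comments (the strictness of the tolerance test and the definition of point-to-segment distance); this is one possible reading, not a canonical specification:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from p to segment a-b. Implementations differ here: some
    use perpendicular distance to the infinite line instead of clamping."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))  # clamp to the segment: a common choice
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def douglas_peucker(points, tolerance):
    """Recursive Douglas-Peucker: keep the point farthest from the
    anchor-floater segment if it exceeds the tolerance, then recurse."""
    if len(points) < 3:
        return list(points)
    dists = [point_segment_distance(p, points[0], points[-1])
             for p in points[1:-1]]
    far = max(range(len(dists)), key=dists.__getitem__)
    if dists[far] <= tolerance:  # '<' versus '<=' also varies in practice
        return [points[0], points[-1]]
    split = far + 1  # index of the farthest point in `points`
    left = douglas_peucker(points[:split + 1], tolerance)
    right = douglas_peucker(points[split:], tolerance)
    return left[:-1] + right  # drop the duplicated split point
```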

23 citations


Cited by

Patent
17 Jun 2010
TL;DR: Tiles of stored geographic data available at some scales are used to generate maps having several scales, including but not limited to the scales of the stored data; the geographic data is organized in a tile-tree in which different tiles contain approximately the same amount of data.
Abstract: When the structure and methods of the invention are used to load geographic vector data, the average time to load all the geographic data needed in moving from one rendered map to the next is greatly reduced; the reduction is especially large when pan and zoom operations are the main transitions between rendered maps. The invention uses tiles of stored geographic data available at some scales to generate maps having several scales, including but not limited to the scales of the stored data. The geographical data is organized in a tile-tree in which different tiles contain approximately the same amount of data. This is accomplished by a data generation process that packs details higher in the tree for areas where the data density is low.
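A minimal sketch of that data-generation process, assuming a quadtree of tiles and a fixed per-tile feature budget (the budget constant, helper names, and sampling rule are illustrative assumptions; the patent text does not supply code):

```python
MAX_FEATURES_PER_TILE = 256  # illustrative per-tile data budget

def contains(bounds, feature):
    """True if the feature's representative point lies inside bounds."""
    x0, y0, x1, y1 = bounds
    x, y = feature["point"]
    return x0 <= x < x1 and y0 <= y < y1

def build_tile_tree(features, bounds, depth=0, max_depth=18):
    """Subdivide a tile only while it holds more than the budget, so that
    every tile in the tree carries roughly the same amount of data."""
    tile = {"bounds": bounds, "features": [], "children": []}
    if len(features) <= MAX_FEATURES_PER_TILE or depth == max_depth:
        tile["features"] = features  # sparse area: all detail stays high up
        return tile
    # Dense area: keep a budget-sized share here for coarse scales
    # and push the remainder down into four child tiles.
    tile["features"] = features[:MAX_FEATURES_PER_TILE]
    rest = features[MAX_FEATURES_PER_TILE:]
    x0, y0, x1, y1 = bounds
    xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
    for quad in [(x0, y0, xm, ym), (xm, y0, x1, ym),
                 (x0, ym, xm, y1), (xm, ym, x1, y1)]:
        inside = [f for f in rest if contains(quad, f)]
        if inside:
            tile["children"].append(
                build_tile_tree(inside, quad, depth + 1, max_depth))
    return tile
```

Because sparse areas satisfy the budget near the root, their detail stays high in the tree, while dense areas push detail into deeper, smaller tiles; pan and zoom can then mostly reuse tiles already loaded at or above the current level.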

243 citations

Patent
16 Mar 2001
TL;DR: A system and method for making computer-generated maps assigns a different scale factor to each road in a route, and a refinement technique such as simulated annealing is used to find a solution to the target function.
Abstract: A system and method for making computer-generated maps includes a different scale factor for each road in a route. The scale factors are used to optimize the route map against a target function that considers factors such as the number of false intersections in the route and the number of roads falling below a minimum length threshold. A refinement technique such as simulated annealing is used to find a solution to the target function. Each road in the scaled map is rendered to provide a finished product having the appearance of a hand-drawn map. The finished product includes context roads that intersect the main route but are not part of the main route. Furthermore, the hand-drawn map is optimized to the characteristics of the viewport used to visualize the map.
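The refinement step is generic simulated annealing over the per-road scale factors. A hedged Python sketch follows, assuming a caller-supplied cost function that penalizes false intersections and below-threshold road lengths; the perturbation rule and cooling schedule are illustrative choices, not taken from the patent:

```python
import math
import random

def anneal_scale_factors(roads, cost, steps=10_000,
                         t_start=1.0, t_end=1e-3):
    """Optimize one scale factor per road against a target function.
    `cost(scales)` should penalize false intersections and roads
    rendered below the minimum-length threshold."""
    scales = [1.0] * len(roads)  # start from the true scale
    best, best_cost = scales[:], cost(scales)
    current_cost = best_cost
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        candidate = scales[:]
        i = random.randrange(len(candidate))
        candidate[i] *= math.exp(random.uniform(-0.2, 0.2))  # perturb one road
        c = cost(candidate)
        # Accept improvements always, regressions with Boltzmann probability.
        if c < current_cost or random.random() < math.exp((current_cost - c) / t):
            scales, current_cost = candidate, c
            if c < best_cost:
                best, best_cost = candidate[:], c
    return best
```

Accepting occasional regressions is what lets the search escape layouts that are locally optimal but still contain false intersections.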

234 citations

Journal ArticleDOI
TL;DR: A new set of algorithms for locally-adaptive line generalization based on the so-called natural principle of objective generalization is described, and their performance is compared with benchmarks based on both manual cartographic procedures and a standard method found in many geographical information systems.
Abstract: This article describes a new set of algorithms for locally-adaptive line generalization based on the so-called natural principle of objective generalization. The drawbacks of existing methods of line generalization are briefly discussed and the algorithms are described. The performance of these new methods is compared with benchmarks based on both manual cartographic procedures and a standard method found in many geographical information systems.
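The 'natural principle' is commonly associated with Li and Openshaw: detail smaller than the smallest visible size at the target scale cannot be perceived, so runs of vertices within a neighbourhood of that size collapse to a single point. The Python sketch below is one rough reading of that principle, not the authors' exact algorithms:

```python
import math

def natural_principle_simplify(points, svs):
    """Collapse each run of consecutive vertices that fits inside a
    neighbourhood of diameter `svs` (the smallest visible size at the
    target scale) into a single representative point."""
    out = []
    group = [points[0]]
    for p in points[1:]:
        if math.dist(group[0], p) < svs:
            group.append(p)  # detail still below the visible size
        else:
            # Collapse the run to its centroid and start a new run at p.
            out.append((sum(q[0] for q in group) / len(group),
                        sum(q[1] for q in group) / len(group)))
            group = [p]
    out.append((sum(q[0] for q in group) / len(group),
                sum(q[1] for q in group) / len(group)))
    out[0], out[-1] = points[0], points[-1]  # preserve the endpoints
    return out
```

The method is locally adaptive in the sense that the amount of simplification follows the local density of detail rather than a global point count.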

139 citations

Journal ArticleDOI
TL;DR: The performance of line simplification algorithms is evaluated using two comprehensive measures of the positional accuracy of the simplified line: one displacement measure and one shape distortion measure, both of which account for the displacement between the original line and its simplified version as well as the positional uncertainty of the original line.
Abstract: Many line simplification methods have been developed; however, evaluation of these methods is still an open issue. This paper aims to evaluate a diversity of automatic line simplification algorithms in terms of positional accuracy and processing time. Past evaluations centred on measuring the locational difference between a line to be simplified and its simplified version; the original line, however, itself contains positional uncertainty. This paper evaluates the performance of line simplification algorithms using two comprehensive measures of the positional accuracy of the simplified line: one displacement measure and one shape distortion measure, both of which are able to consider (a) the displacement between the original line and its simplified version, and (b) the positional uncertainty of the original line.
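As an illustration of the displacement category (not the paper's exact measure), a basic positional-accuracy metric is the mean distance from each vertex of the original line to the simplified line:

```python
import math

def _dist_to_segment(p, a, b):
    """Distance from point p to segment a-b."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    if dx == 0 and dy == 0:
        return math.dist(p, a)
    t = ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.dist(p, (a[0] + t * dx, a[1] + t * dy))

def mean_displacement(original, simplified):
    """Average distance from each original vertex to the nearest segment
    of the simplified line: a basic displacement measure."""
    total = 0.0
    for p in original:
        total += min(_dist_to_segment(p, a, b)
                     for a, b in zip(simplified, simplified[1:]))
    return total / len(original)
```

The paper's measures go further by also weighting for the positional uncertainty of the original line, rather than treating it as error-free ground truth.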

121 citations