
Showing papers on "Computational geometry published in 1983"


Journal ArticleDOI
TL;DR: In this article, the authors propose an approach based on characterizing the position and orientation of an object as a single point in a configuration space, in which each coordinate represents a degree of freedom in the position or orientation of the object.
Abstract: This paper presents algorithms for computing constraints on the position of an object due to the presence of other objects. This problem arises in applications that require choosing how to arrange or how to move objects without collisions. The approach presented here is based on characterizing the position and orientation of an object as a single point in a configuration space, in which each coordinate represents a degree of freedom in the position or orientation of the object. The configurations forbidden to this object, due to the presence of other objects, can then be characterized as regions in the configuration space, called configuration space obstacles. The paper presents algorithms for computing these configuration space obstacles when the objects are polygons or polyhedra.

1,996 citations
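For a convex robot that only translates (no rotation), the configuration-space obstacle described above is the Minkowski sum of the obstacle with the robot reflected through its reference point. The sketch below builds that sum naively from all pairwise vertex sums followed by a convex hull; it is an illustrative construction under that assumption, not the paper's algorithm, and all names in it are mine.

```python
# Illustrative sketch (not the paper's algorithm): for a convex robot that only
# translates, the configuration-space obstacle of a convex polygonal obstacle is
# the Minkowski sum of the obstacle with the robot reflected through its
# reference point. Built naively here from all vertex sums plus a convex hull.
from itertools import product

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone-chain convex hull, O(n log n)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def cspace_obstacle(obstacle, robot):
    """C-obstacle for a translating convex robot: hull of all vertex sums."""
    reflected = [(-x, -y) for (x, y) in robot]   # robot reflected at its reference point
    sums = [(bx + rx, by + ry) for (bx, by), (rx, ry) in product(obstacle, reflected)]
    return convex_hull(sums)

# A unit-square obstacle and a small triangular robot referenced at (0, 0).
print(cspace_obstacle([(0, 0), (1, 0), (1, 1), (0, 1)],
                      [(0, 0), (0.2, 0), (0, 0.2)]))
```

For convex polygons the same region can be obtained in linear time by merging the two edge sequences in slope order; the brute-force version above only makes the definition concrete.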


Journal ArticleDOI
01 Sep 1983

339 citations


Proceedings ArticleDOI
07 Nov 1983
TL;DR: A new framework for geometric computing is presented in which planar curves are formed by the motions of objects that have both position and orientation, which makes it possible to recast large parts of computational geometry, such as convexity, in a new and advantageous light.
Abstract: We present a new framework for geometric computing in which planar curves are formed by the motions of objects that have both position and orientation. Informally, consider the motion of a car: at each instant the car is positioned at some point in the plane, and oriented with its hood facing in some direction. Together, the position and the orientation specify a state. In the new framework, curves and polygons are paths in this state space satisfying the constraint that wherever the tangent of the position component of the curve is defined, the orientation must be either the same as that tangent, or the opposite. In other words, the car may move either forwards or backwards, but it may not skid sideways. To distinguish these structures from classical curves and polygons, we call them tracings. Several important concepts can be defined for tracings, some familiar from differential topology, but others apparently new. These include the notions of winding number, degree, and sweep number. A general construction, known as fiber product, is used to define various operations on tracings, including convolution and multiplication. Each fiber product involves forming all pairs of states, one from each tracing, that satisfy a certain constraint. Using these ideas, it is possible to recast large parts of computational geometry, such as convexity, in a new and advantageous light. This applies to proofs of theorems, as well as to descriptions of algorithms. Besides allowing us to recast old material, this kinetic approach has also led to a number of new algorithms that improve on the previously known bounds. For example, one algorithm computes the distance between two convex polygons in logarithmic time, while another tests if two convex polygons can disjointly fit into a third one in linear time (no rotations allowed). A final attraction of the kinetic approach is that, after appropriately extending the plane into a new manifold, the two-sided plane, we can define a formal duality between points and lines that maintains the sense of left and right. Informal notions of such a duality had previously been used in computational geometry, but now for the first time we can say in a precise sense that, for example, an algorithm for computing the convex hull of n points is also an algorithm for computing the intersection of n half-planes. It may be of interest to note that the origins of …

249 citations
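Winding number is one of the classical notions the tracings framework generalizes. As a point of reference only (this is not the paper's construction), the sketch below computes the winding number of a closed polygonal path around a query point by summing signed angles.

```python
import math

def winding_number(polygon, point):
    """Winding number of a closed polygonal path around `point`.

    Standard signed-angle summation; a sketch of the classical notion that
    the paper's tracings generalize, not the paper's own construction.
    """
    px, py = point
    total = 0.0
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i][0] - px, polygon[i][1] - py
        x2, y2 = polygon[(i + 1) % n][0] - px, polygon[(i + 1) % n][1] - py
        # Signed angle between consecutive vertex directions.
        total += math.atan2(x1 * y2 - y1 * x2, x1 * x2 + y1 * y2)
    return round(total / (2.0 * math.pi))

square = [(1, 1), (-1, 1), (-1, -1), (1, -1)]   # counter-clockwise
print(winding_number(square, (0, 0)))  # -> 1
print(winding_number(square, (3, 0)))  # -> 0
```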


Journal ArticleDOI
TL;DR: Three polygon decomposition problems are shown to be NP-hard and thus unlikely to admit efficient algorithms, and the polygonal region is permitted to contain holes.
Abstract: The inherent computational complexity of polygon decomposition problems is of theoretical interest to researchers in the field of computational geometry and of practical interest to those working in syntactic pattern recognition. Three polygon decomposition problems are shown to be NP-hard and thus unlikely to admit efficient algorithms. The problems are to find minimum decompositions of a polygonal region into (perhaps overlapping) convex, star-shaped, or spiral subsets. We permit the polygonal region to contain holes. The proofs are by transformation from Boolean three-satisfiability, a known NP-complete problem. Several open problems are discussed.

211 citations


Proceedings ArticleDOI
07 Nov 1983
TL;DR: A new formulation of the notion of duality that allows the unified treatment of a number of geometric problems is used, to solve two long-standing problems of computational geometry and to obtain a quadratic algorithm for computing the minimum-area triangle with vertices chosen among n points in the plane.
Abstract: This paper uses a new formulation of the notion of duality that allows the unified treatment of a number of geometric problems. In particular, we are able to apply our approach to solve two long-standing problems of computational geometry: one is to obtain a quadratic algorithm for computing the minimum-area triangle with vertices chosen among n points in the plane; the other is to produce an optimal algorithm for the half-plane range query problem. This problem is to preprocess n points in the plane, so that given a test half-plane, one can efficiently determine all points lying in the half-plane. We describe an optimal O(k + log n) time algorithm for answering such queries, where k is the number of points to be reported. The algorithm requires O(n) space and O(n log n) preprocessing time. Both of these results represent significant improvements over the best methods previously known. In addition, we give a number of new combinatorial results related to the computation of line arrangements.

169 citations
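The results above rest on a point-line duality. One common formulation, chosen here purely for illustration and not necessarily the paper's exact map, sends the point (a, b) to the line y = ax - b; it preserves above/below relationships, which is what allows a half-plane query over points to be restated as a query over dual lines. The small check below exercises that property.

```python
import random

# One common point-line duality (an illustrative choice, not necessarily the
# paper's exact formulation): the point p = (a, b) maps to the line y = a*x - b.
# Under this map, p lies above the dual of q exactly when q lies above the
# dual of p, which is what lets half-plane queries on points be restated as
# queries among dual lines.

def above_dual(p, q):
    """True if point p lies above the dual line of q (y = q[0]*x - q[1])."""
    return p[1] > q[0] * p[0] - q[1]

random.seed(1)
for _ in range(1000):
    p = (random.uniform(-5, 5), random.uniform(-5, 5))
    q = (random.uniform(-5, 5), random.uniform(-5, 5))
    assert above_dual(p, q) == above_dual(q, p)   # the above/below relation is preserved
print("duality check passed")
```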


Journal ArticleDOI
TL;DR: A linear time algorithm for finding the convex hull of a simple polygon that requires only one stack, instead of two, and runs more efficiently than the result of McCallum and Avis.
Abstract: In this paper we present a linear time algorithm for finding the convex hull of a simple polygon. Compared to the result of McCallum and Avis, our algorithm requires only one stack, instead of two, and runs more efficiently.

93 citations
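The paper's own linear-time procedure is not reproduced here, but the single-stack discipline it refines is the same one used by the classical Graham scan: push vertices and pop while the turn is not a left turn. The sketch below shows that discipline on a sorted point set; the caveat is that it spends O(n log n) on sorting, whereas the paper exploits the given vertex order of a simple polygon to stay linear.

```python
from math import atan2

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def graham_scan(points):
    """Classical single-stack Graham scan, O(n log n) because of the sort.

    Shown only to illustrate the one-stack discipline; the paper's algorithm
    avoids the sort by walking the vertices of a simple polygon in their given
    order, which is what brings the running time down to linear.
    """
    pivot = min(points, key=lambda p: (p[1], p[0]))          # lowest, then leftmost
    rest = sorted((p for p in points if p != pivot),
                  key=lambda p: (atan2(p[1] - pivot[1], p[0] - pivot[0]),
                                 (p[0] - pivot[0]) ** 2 + (p[1] - pivot[1]) ** 2))
    stack = [pivot]
    for p in rest:
        while len(stack) >= 2 and cross(stack[-2], stack[-1], p) <= 0:
            stack.pop()                                       # drop non-left turns
        stack.append(p)
    return stack

print(graham_scan([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))
# -> the four corners of the square; the interior point (1, 1) is discarded
```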


Proceedings ArticleDOI
07 Nov 1983
TL;DR: This work considers problems in computational geometry when every one of the input points is moving in a prescribed manner and presents and analyze efficient algorithms for a number of problems and proves lower bounds for some of them.
Abstract: We consider problems in computational geometry when every one of the input points is moving in a prescribed manner. We present and analyze efficient algorithms for a number of problems and prove lower bounds for some of them.

87 citations


Journal ArticleDOI
P.E. Allen
01 Sep 1983

74 citations


Proceedings ArticleDOI
07 Nov 1983
TL;DR: Filtering search, as introduced in this paper, preprocesses a set of objects so that those satisfying a given property with respect to a query object can be listed very effectively; the technique can be used to improve the complexity of known algorithms and to simplify their implementations.
Abstract: We introduce a new technique for solving problems of the following form: preprocess a set of objects so that those satisfying a given property with respect to a query object can be listed very effectively. Among well-known problems to fall into this category we find range query, point enclosure, intersection, near-neighbor problems, etc. The approach which we take is very general and rests on a new concept called filtering search. We show on a number of examples how it can be used to improve the complexity of known algorithms and simplify their implementations as well. In particular, filtering search allows us to improve on the worst-case complexity of the best algorithms known so far for solving the problems mentioned above.

53 citations
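A minimal way to see the filtering-search principle, in one dimension and with names of my own choosing, is interval reporting over a sorted array: a binary search locates the first candidate and a scan reports the rest, so the query costs O(log n + k) where k is the output size.

```python
import bisect

def build(points):
    """Preprocessing: just sort the one-dimensional point set."""
    return sorted(points)

def range_report(points_sorted, lo, hi):
    """Report every point in [lo, hi].

    Filtering-search flavour: O(log n) to locate the first candidate with a
    binary search, then O(k) to scan and report, so the query costs
    O(log n + k) where k is the output size. A toy illustration only, not one
    of the data structures from the paper.
    """
    i = bisect.bisect_left(points_sorted, lo)
    out = []
    while i < len(points_sorted) and points_sorted[i] <= hi:
        out.append(points_sorted[i])
        i += 1
    return out

pts = build([7, 3, 14, 1, 9, 20, 5])
print(range_report(pts, 4, 15))   # -> [5, 7, 9, 14]
```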


Journal ArticleDOI
TL;DR: In this paper, a new computational geometry for the blades and flow passages of centrifugal compressors is described and examples of its use in the design of industrial compressors are given.
Abstract: A new computational geometry for the blades and flow passages of centrifugal compressors is described and examples of its use in the design of industrial compressors are given. The method makes use of Bernstein-Bezier polynomial patches to define the geometrical shape of the flow channels. This has the following main advantages: the surfaces are defined by analytic functions which allow systematic and controlled variation of the shape and give continuous derivatives up to any required order; and the parametric form of the equations allows the blade and channel coordinates to be very simply obtained at any number of points and in any suitable distribution for use in subsequent aerodynamic and stress calculations and for manufacture. The method is particularly suitable for incorporation into a computer-aided design procedure.

43 citations
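As a generic illustration of how Bernstein-Bezier patches yield coordinates at any number of parameter points, the sketch below evaluates a tensor-product Bezier patch by de Casteljau's algorithm. The control net and parameterisation are placeholders of my own, not the blade geometry from the paper.

```python
def de_casteljau(ctrl, t):
    """Evaluate a Bezier curve with control points `ctrl` at parameter t."""
    pts = [list(p) for p in ctrl]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return pts[0]

def bezier_patch(ctrl_net, u, v):
    """Tensor-product Bezier patch: de Casteljau row-wise, then once across rows.

    `ctrl_net` is a rectangular grid of 3-D control points; the parametric form
    lets surface coordinates be generated at any (u, v) in [0, 1] x [0, 1].
    A generic sketch, not the blade parameterisation used in the paper.
    """
    row_points = [de_casteljau(row, u) for row in ctrl_net]
    return de_casteljau(row_points, v)

# A 3 x 3 net of control points for a gently curved quadratic patch.
net = [[(0, 0, 0), (1, 0, 1), (2, 0, 0)],
       [(0, 1, 1), (1, 1, 2), (2, 1, 1)],
       [(0, 2, 0), (1, 2, 1), (2, 2, 0)]]
print(bezier_patch(net, 0.5, 0.5))   # point on the surface at the patch centre
```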


Journal ArticleDOI
TL;DR: The author covers Surface form selection (Interpolation versus approximation; Representation versus design; Smoothness; Shape fidelity; Local versus global methods; and Rendering), and Interpolation surfaces.
Abstract: The surface appropriate for a given problem depends on the application; there is no universal surface form. The numerous applications of surface methods include modeling physical phenomena (e.g., combustion) and designing objects such as airplanes and cars. In addition to these 3D surfaces, there are interesting 4D "surfaces" such as temperature as a function of the three spatial variables. Because the geometric information for these problems can be located arbitrarily in 3D or 4D space, the schemes must be able to handle arbitrarily located data. The standard (and easier) approach to surfaces, using tensor products of curve methods, restricts the surface method's applicability to rectangularly "gridded" data. Two broad classes of methods suitable for solving these problems (i.e., problems for which simplifying geometric assumptions cannot be made) are (1) surface interpolants defined over triangles or tetrahedra and (2) distance-weighted interpolants. Users ordinarily want smoother surfaces than their data imply directly, so additional information must usually be created. (A notable feature of the methods shown here is that the smoothness of the surface is always greater than or equal to the smoothness of the defining data.) The author covers Surface form selection (Interpolation versus approximation; Representation versus design; Smoothness; Shape fidelity; Local versus global methods; and Rendering), and Interpolation surfaces. It is noted that triangular interpolants and distance-weighted interpolants excel as surface methods because of their smooth interpolation of arbitrarily located data.
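Of the two classes mentioned, distance-weighted interpolants are the quickest to sketch. The classical Shepard form below handles arbitrarily located data, which is the requirement stressed in the abstract; it is a minimal illustration, not one of the specific variants the article surveys.

```python
def shepard(samples, query, power=2.0):
    """Distance-weighted (Shepard) interpolation of scattered data.

    `samples` is a list of ((x, y), value) pairs at arbitrary locations;
    weights fall off as 1 / distance**power. A minimal sketch of the
    distance-weighted class of interpolants discussed in the article, not a
    specific method from it.
    """
    qx, qy = query
    num = den = 0.0
    for (x, y), value in samples:
        d2 = (x - qx) ** 2 + (y - qy) ** 2
        if d2 == 0.0:
            return value                   # interpolation: reproduce the data exactly
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

data = [((0, 0), 1.0), ((1, 0), 2.0), ((0, 1), 3.0), ((1, 1), 4.0)]
print(shepard(data, (0.5, 0.5)))   # -> 2.5, the average of the four corner values
print(shepard(data, (1, 1)))       # -> 4.0, the data value is reproduced
```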

Journal ArticleDOI
TL;DR: It is shown in this paper that the minimum distance between two finite planar sets of n points can be computed in O(n log n) worst-case running time and that this is optimal to within a constant factor.

Journal ArticleDOI
TL;DR: Experimental results indicate that one implementation of an algorithm for computing the convex hull of a finite planar set of points is superior to the other in terms of both running time and storage requirements.

Journal ArticleDOI
TL;DR: The author explores the link between probability and geometry and concludes that urn models are a powerful tool for generating discrete probability distributions, and built into these special distributions are many propitious properties essential to the blending functions of computer-aided geometric design.
Abstract: The author explores the link between probability and geometry. In the process, he shows how to exploit simple probabilistic arguments to derive many of the classical geometric properties of the parametric curves and surfaces currently in vogue in computer-aided geometric design. He also uses this probabilistic approach to introduce many new types of curves and surfaces into computer-aided geometric design, and demonstrates how probability theory can be used to simplify, unify, and generalize many well-known results. He concludes that urn models are a powerful tool for generating discrete probability distributions, and built into these special distributions are many propitious properties essential to the blending functions of computer-aided geometric design. This fact allows mathematicians to use probabilistic arguments to simplify, unify, and generalize many geometric results. He believes that this link between probability and geometry will ultimately prove beneficial to both disciplines, and expects that it will continue to be a productive area for future inspiration and research.
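The most concrete instance of this probability-geometry link is that the Bernstein blending functions of a Bezier curve are binomial probabilities, hence non-negative and summing to one. The sketch below, a generic illustration rather than the article's own derivation, checks that and uses the blending functions to evaluate a curve.

```python
from math import comb

def bernstein(n, k, t):
    """k-th Bernstein blending function of degree n, i.e. a binomial probability.

    It is the probability of k 'successes' in n trials with success probability
    t, which is why the blending functions are non-negative and sum to one,
    the properties the article traces back to urn models.
    """
    return comb(n, k) * t ** k * (1 - t) ** (n - k)

def bezier_point(ctrl, t):
    """Evaluate a Bezier curve as a Bernstein-weighted average of its control points."""
    n = len(ctrl) - 1
    x = sum(bernstein(n, k, t) * cx for k, (cx, _) in enumerate(ctrl))
    y = sum(bernstein(n, k, t) * cy for k, (_, cy) in enumerate(ctrl))
    return x, y

weights = [bernstein(3, k, 0.3) for k in range(4)]
print(sum(weights))                                  # -> 1.0 (a probability distribution)
print(bezier_point([(0, 0), (1, 2), (3, 2), (4, 0)], 0.5))
```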

Journal ArticleDOI
TL;DR: It is shown that the Voronoi dual of the set S is not even Hamiltonian, establishing that there can be no Euclidean traveling salesman cycle through S contained in it.