
Showing papers on "Polygon published in 1991"


Journal ArticleDOI
TL;DR: A method for comparing polygons that is a metric, invariant under translation, rotation, and change of scale, reasonably easy to compute, and intuitive is presented.
Abstract: A method for comparing polygons that is a metric, invariant under translation, rotation, and change of scale, reasonably easy to compute, and intuitive is presented. The method is based on the L2 distance between the turning functions of the two polygons. It works for both convex and nonconvex polygons and runs in time O(mn log mn), where m is the number of vertices in one polygon and n is the number of vertices in the other. Some examples showing that the method produces answers that are intuitively reasonable are presented.
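The turning-function representation the metric is built on is easy to sketch. The helper below is illustrative only: it computes the breakpoints of a polygon's turning function (cumulative turn angle as a function of normalized arc length), while the paper's full metric additionally minimizes the L2 distance over rotations and choice of starting vertex, which is omitted here.

```python
import math

def turning_function(poly):
    """Breakpoints of the turning function of a closed polygon, as a
    list of (arc_length_fraction, cumulative_heading) pairs.
    poly: list of (x, y) vertices in order. A sketch, not the paper's
    full comparison metric.
    """
    n = len(poly)
    edges = [(poly[(i + 1) % n][0] - poly[i][0],
              poly[(i + 1) % n][1] - poly[i][1]) for i in range(n)]
    lengths = [math.hypot(dx, dy) for dx, dy in edges]
    perimeter = sum(lengths)

    steps = []
    angle = math.atan2(edges[0][1], edges[0][0])  # heading of first edge
    s = 0.0
    for i in range(n):
        steps.append((s / perimeter, angle))
        s += lengths[i]
        nxt = edges[(i + 1) % n]
        turn = math.atan2(nxt[1], nxt[0]) - math.atan2(edges[i][1], edges[i][0])
        # normalize the exterior angle into (-pi, pi]
        while turn <= -math.pi:
            turn += 2 * math.pi
        while turn > math.pi:
            turn -= 2 * math.pi
        angle += turn
    return steps
```

For a counter-clockwise unit square the breakpoints fall at arc-length fractions 0, 0.25, 0.5, 0.75 with headings 0, π/2, π, 3π/2; scale invariance comes from normalizing by the perimeter.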

733 citations


Journal ArticleDOI
TL;DR: An automatic quadrilateral mesh generator based on the advancing front technique for triangular mesh generation is described and the robustness and versatility of the mesh generator are demonstrated by a variety of examples.
Abstract: A methodology for the automatic mesh generation of quadrilateral elements is proposed. The methodology is based on the facts that (i) a region can always be subdivided entirely into quadrilaterals if the polygon which forms the boundary of the region has an even number of sides, (ii) a quadrilateral can be formed by two triangles which share a common side. By taking advantage of the techniques developed in an existing automatic triangular mesh generator, quadrilateral meshes can be generated according to the distribution of user specified element size over the domain of interest. In particular, an automatic quadrilateral mesh generator based on the advancing front technique for triangular mesh generation is described. Techniques to improve the quality of the generated quadrilateral meshes are introduced. The robustness and versatility of the mesh generator are demonstrated by a variety of examples.
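Fact (ii), that a quadrilateral can be formed from two triangles sharing a common side, can be sketched directly. The helper below is an illustrative assumption, not the paper's generator: it merges two index triangles sharing exactly one edge into a single quad with consistent winding.

```python
def merge_triangles(t1, t2):
    """Merge two triangles (tuples of vertex indices) that share
    exactly one edge into a quadrilateral. Illustrates fact (ii)
    from the abstract; a sketch only.
    """
    shared = set(t1) & set(t2)
    if len(shared) != 2:
        raise ValueError("triangles must share exactly one edge")
    a = next(v for v in t1 if v not in shared)  # apex of t1
    b = next(v for v in t2 if v not in shared)  # apex of t2
    # walk t1 starting from its apex so the shared edge is replaced
    # by the two-vertex detour through b, preserving t1's winding
    i = t1.index(a)
    return (a, t1[(i + 1) % 3], b, t1[(i + 2) % 3])
```

With vertices 0=(0,0), 1=(1,0), 2=(0,1), 3=(1,1), merging (0,1,2) and (1,3,2) yields the quad (0,1,3,2), the unit square traversed counter-clockwise.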

252 citations


Journal ArticleDOI
TL;DR: Experimental results show that dominant points obtained by this method are less sensitive to the orientation of the boundary than other polygonal approximation algorithms in the sense that the number and the location of the dominant points along the contour remain relatively unchanged.

164 citations


Patent
08 Mar 1991
TL;DR: In this paper, a method and apparatus for providing a texture mapped perspective view for digital map systems is presented; the system includes apparatus for storing elevation data, apparatus for storing texture data, apparatus for scanning a projected view volume from the elevation data storing apparatus, apparatus for processing, apparatus for generating a plurality of planar polygons, and apparatus for rendering images.
Abstract: A method and apparatus for providing a texture mapped perspective view for digital map systems. The system includes apparatus for storing elevation data, apparatus for storing texture data, apparatus for scanning a projected view volume from the elevation data storing apparatus, apparatus for processing, apparatus for generating a plurality of planar polygons and apparatus for rendering images. The processing apparatus further includes apparatus for receiving the scanned projected view volume from the scanning apparatus, transforming the scanned projected view volume from object space to screen space, and computing surface normals at each vertex of each polygon so as to modulate texture space pixel intensity. The generating apparatus generates the plurality of planar polygons from the transformed vertices and supplies them to the rendering apparatus which then shades each of the planar polygons. In one alternate embodiment of the invention, the polygons are shaded by apparatus of the rendering apparatus assigning one color across the surface of each polygon. In yet another alternate embodiment of the invention, the rendering apparatus interpolates the intensities between the vertices of each polygon in a linear fashion as in Gouraud shading.

141 citations


Patent
15 Feb 1991
TL;DR: In this article, the alpha channel comprises either a sub-pixel mask associated with each pixel which indicates the amount and subpixel regions of coverage or a single value indicative of the percentage of coverage of a pixel.
Abstract: A scan conversion process is performed on a polygon using a single pass technique. The pixels which comprise the edges and vertices of the polygon are first determined from the vertices which define the polygon. The alpha channel comprises either a sub-pixel mask associated with each pixel which indicates the amount and sub-pixel regions of coverage or a single value indicative of the percentage of coverage of a pixel. Furthermore, a z value indicative of the depth of each pixel is maintained. The pixels between the edge pixels of the polygon are then turned on, thereby filling the polygon. The pixels which comprise the polygon are then composited with the background pixels on a per pixel basis. The depth value of each pixel of the polygon (the z value) is used to determine the compositing equations to be used to composite each pixel of the polygon to the background. The compositing equations update the color of the pixel, the z buffer value of the background pixel and the sub-pixel mask to reflect the addition of information from the compositing of the pixel of the polygon into the background pixel. Through this method high quality anti-aliased polygons may be rendered without performing the time consuming process of sorting the polygons in depth order prior to compositing.
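The single-value coverage variant of the alpha channel amounts to coverage-weighted compositing. The sketch below assumes premultiplied colors and the standard "over" operator; it is an illustration of that idea, not the patent's exact compositing equations (which also consult the z value and sub-pixel mask).

```python
def composite_over(src_color, src_cov, dst_color, dst_cov):
    """Composite a polygon fragment over a background pixel, using the
    fraction of pixel area covered as the alpha value.
    Colors are (r, g, b) tuples premultiplied by coverage.
    A sketch of standard 'over' compositing for illustration.
    """
    out_cov = src_cov + dst_cov * (1.0 - src_cov)
    out_color = tuple(s + d * (1.0 - src_cov)
                      for s, d in zip(src_color, dst_color))
    return out_color, out_cov
```

A half-covered red fragment over an opaque blue background yields a pixel that is half red, half blue, and fully covered, which is exactly the anti-aliased edge appearance the single-pass technique is after.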

140 citations


Journal ArticleDOI
TL;DR: A method to represent polygons as cyclic strings is proposed and it is shown how cyclic string-matching techniques can be used for rotation-, translation- and scale-independent polygonal shape recognition.

114 citations


Patent
14 Mar 1991
TL;DR: In this paper, a hardware implementation performs the necessary calculations for each pixel in the input polygon, which removes artifacts in the texture mapped image at a low cost and at a high speed.
Abstract: A method and apparatus for performing the majority of texture map gradient calculations once per polygon so as to increase processing speed in a graphics system. Texture values are identified for each vertex of an input polygon and are interpolated over the polygon in perspective space in order to find the corresponding values at a given pixel within the polygon. The perspective values of the vertices are linearly interpolated across the polygon to determine the value at the given pixel. The texture gradients are then calculated by defining vectors perpendicular and parallel to the horizon of the plane containing the input polygon so that the resulting components may be calculated. The resulting value is the texture gradient, which may then be used to address a texture map to determine the pre-filtered texture value for that point. A hardware implementation performs the necessary calculations for each pixel in the input polygon. The invention so arranged removes artifacts in the texture mapped image at a low cost and at a high speed.

111 citations


Patent
08 Feb 1991
TL;DR: In this article, a car-mounted navigation system uses map data of polygons defined by roads of a predetermined rank or more of significance, connecting a starting polygon containing the starting point to a destination polygon containing the destination with a chain of polygons adjoining at common sides of each pair of adjoining polygons and arranged between the starting and destination polygons.
Abstract: A car mounted navigation system uses map data of polygons defined by roads of a predetermined rank or more of significance, connects a starting polygon containing a starting point to a destination polygon containing a destination with a chain of polygons adjoining at common sides of each pair of adjoining polygons and arranged between the starting and destination polygons to compute a plurality of routes extending from the starting point to the destination polygon, each route including a combination of sides of the chain polygons, the starting polygon and the destination polygon. A car operator selects an appropriate route from the computed routes. The system requires neither a great amount of database-made map data nor entry of map data with a digitizer.

110 citations


Journal ArticleDOI
01 Apr 1991
TL;DR: In this article, the authors identify a variety of "awkward" problems in handling spatial data on a computer, including interpolation, error estimation and dynamic polygon building.
Abstract: Experience with the handling of spatial data on a computer led to the identification of a variety of “awkward” problems, including interpolation, error estimation and dynamic polygon building and e...

109 citations


Proceedings ArticleDOI
01 Sep 1991
TL;DR: The graph-theoretic formulation and solution to the gallery problem for polygons in standard form is given, and a complexity analysis is carried out, and open problems are discussed.
Abstract: Art gallery problems which have been extensively studied over the last decade ask how to station a small (minimum) set of guards in a polygon such that every point of the polygon is watched by at least one guard. The graph-theoretic formulation and solution to the gallery problem for polygons in standard form is given. A complexity analysis is carried out, and open problems are discussed.

103 citations


Patent
20 May 1991
TL;DR: In this paper, a polygon-based graphics/text separation method is comprised of two sequential processes: a raster to contour vector conversion step is used to convert a digitized bitmap into a collection of simple polygons.
Abstract: A polygon-based graphics/text separation method is comprised of two sequential processes. First a raster to contour vector conversion step is used to convert a digitized bitmap into a collection of simple polygons. Next a component classification process is used to extract six particularly defined features of each of the individual polygon-based components to enable the separation of graphics and text polygons. Graphical polygons are further classified into four subclasses. Textual polygons are grouped into polygon strings (text strings). Each string contains a sequence of segmented character contour polygons which is ready for an optical character recognition algorithm to convert them into computer understandable ASCII characters.

Journal ArticleDOI
TL;DR: In this article, a lattice model of two-dimensional vesicles is considered, in which the boundary of the vesicle is the perimeter of a self-avoiding polygon embeddable in the square lattice.
Abstract: We consider a lattice model of two-dimensional vesicles, in which the boundary of the vesicle is the perimeter of a self-avoiding polygon embeddable in the square lattice. With fixed boundary length m we incorporate an osmotic pressure difference by associating a fugacity with the area enclosed by the polygon. We derive rigorous results concerning the behaviour of the associated free energy and the form of the phase diagram. By deriving exact values of the numbers of polygons with m edges which enclose area n, and analysing the resulting series, we obtain the free energy, the phase boundary and various scaling exponents and amplitudes numerically.

Journal ArticleDOI
TL;DR: A new data structure for answering shortest path queries inside a simple polygon is described, which has the same asymptotic performance as the previously known data structure, but is significantly less complicated.

Patent
21 Mar 1991
TL;DR: In this paper, the authors present a helicopter flight simulator with improvements in visual cues and modeling, including terrain following shadows and hazing, the latter approximating a set of atmospheric conditions.
Abstract: A helicopter flight simulator having improvements in visual cues and modeling. The unique visual cues include terrain following shadows and hazing, the latter approximating a set of atmospheric conditions. The unique modeling features include a user selectable zoom, horizontal and vertical ground avoidance, and an autorotation model.

Proceedings ArticleDOI
01 Jun 1991
TL;DR: A walk of minimum length is computed within time O(n log n + k), where k is the size of the output, and it is proved that this is optimal.
Abstract: Given a simple polygon in the plane with two distinguished vertices, s and g, is it possible for two guards to simultaneously walk along the two boundary chains from s to g in such a way that they are always mutually visible? We decide this question in time O(n log n) and in linear space, where n is the number of edges of the polygon. Moreover, we compute a walk of minimum length within time O(n log n + k), where k is the size of the output, and we prove that this is optimal.

Patent
06 Dec 1991
TL;DR: In this article, the B-spline definitions of the trimming curves in the uv parameter space of each patch are converted to approximating short straight line segments, and the length of the straight line segments of the trimming curves is adjusted to compensate for less than ideal parameterization of the trimming curve functions.
Abstract: A graphics accelerator responds to commands from a computer in a graphic system by storing the definitions of nonuniform rational B-spline patches and their associated trimming curves. The graphics accelerator then produces device coordinates for trimmed polygons computed for each patch and sends these polygons to a display. Various improvements are incorporated to minimize the effects of roundoff error. The B-spline definitions of the trimming curves in the uv parameter space of each patch are converted to approximating short straight line segments. Untrimmed polygon vertices, the end points of the straight line segments and the intersections of the straight line segments with subspan boundaries corresponding to polygon edges are kept in a data structure of linked lists of vertex tables. The data structure is traversed to determine new polygon vertices for trimmed polygons. The trimming mechanism is compatible with recursive subdivision of patches to overcome practical limitations on the number of trimming curves that may be associated with each patch. The length of the straight line segments of the trimming curves is adjusted to compensate for less than ideal parameterization of the trimming curve functions. Associated with each trimming curve within a patch is information about the position of that trimming curve in the span. As each polygon for that patch is generated, those trimming curves that are clearly outside the clip limits for that polygon are excluded from consideration. This reduces the average number of trimming curves that must be processed for the patch, and increases the speed of the graphics accelerator.

Journal ArticleDOI
A. Wallin
TL;DR: An algorithm that automatically produces polygonal representations of 3-D structures within a volume from a set of cross-sectional images is presented, and can be used to describe and visualize normal as well as pathological anatomy.
Abstract: An algorithm that automatically produces polygonal representations of 3-D structures within a volume from a set of cross-sectional images is presented. The method incorporates the requirements necessary for structure analysis to go beyond plain rendering. The algorithm is fully automatic, using local voxel values to determine the connectivity of the surface. The resulting polygons are coherently ordered and connected, and no polygon occurs more than once. Each surface is complete, that is, no holes occur (except as an option, on the boundaries of the volume). The algorithm can be used to describe and visualize normal as well as pathological anatomy.

Book ChapterDOI
TL;DR: This work investigates fattening by convolving the segments or vertices with disks and attempts to approximate objects with the minimum number of line segments, or with near the minimum, by using efficient greedy algorithms.
Abstract: We study several variations on one basic approach to the task of simplifying a plane polygon or subdivision: Fatten the given object and construct an approximation inside the fattened region. We investigate fattening by convolving the segments or vertices with disks and attempt to approximate objects with the minimum number of line segments, or with near the minimum, by using efficient greedy algorithms. We also discuss additional topological constraints such as simplicity.

Patent
28 Jun 1991
TL;DR: In this paper, an adaptive shading method is utilized to generate shaded images in real-time using a series of tests to determine the order equation that is to be used to interpolate the color or intensity across the polygon between the vertices.
Abstract: An adaptive shading method is utilized to generate shaded images in real time. The technique to shade the image is determined according to the curvature of the surface, the variation of the light vector across the surface, and the variation of the eye vector across the surface. The color or intensity is first computed at each of the vertices of the polygon. A series of tests are then performed to determine the order equation that is to be used to interpolate the color or intensity across the polygon between the vertices. Using this technique, polygons having a slight or no curvature and an infinite light source (thus being the simplest form of shading), will use an extremely fast, low order equation to interpolate across the polygon. Polygons, having a high degree of curvature and/or positional light source will utilize, as necessary, a higher order equation which requires additional computation time but produces desirable shading results.
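The low-order case, linear interpolation of vertex intensities across the polygon, can be sketched with barycentric coordinates over a triangle. This is an illustrative assumption for the simplest shading path; the patent's adaptive selection among interpolation orders is not shown.

```python
def gouraud_intensity(p, tri, intensities):
    """Linearly interpolate vertex intensities at point p inside a
    triangle using barycentric coordinates -- the fast, low-order
    interpolation used for flat, distantly lit polygons. A sketch
    for illustration only.
    """
    (x1, y1), (x2, y2), (x3, y3) = tri
    x, y = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    w3 = 1.0 - w1 - w2
    i1, i2, i3 = intensities
    return w1 * i1 + w2 * i2 + w3 * i3
```

At a vertex the result equals that vertex's intensity, and at the centroid it is the average of the three, which is what a first-order interpolant should give.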

Patent
Larry J. Thayer
12 Mar 1991
TL;DR: A polygon rendering circuit for a computer color graphics system is described, consisting of an edge stepper which steps along edges of an input polygon to determine the span of the polygon along each scan line intersected by the polygon.
Abstract: A polygon rendering circuit for a computer color graphics system comprising an edge stepper which steps along edges of an input polygon to determine the span of the polygon along each scan line intersected by the polygon. The coordinate values of the edges on each scan line are determined to sub-pixel resolution such that only those pixels whose centers lie within the true polygon edges (within the span width) must be drawn. Processing efficiency is improved and bandwidth is minimized by passing only those edges of the polygon which are new to that polygon and by computing the Z values in the same manner as, and in parallel with, the X values. Improved results are also possible in accordance with the technique of the invention, since adjacent polygons compute the same edge by stepping, there can be no gaps between polygons due to round-off errors.

Journal ArticleDOI
Mark de Berg
TL;DR: In this paper, a data structure using O(n log n) storage was proposed to find a shortest path between two query points in a simple rectilinear polygon P. If both query points are vertices of P, the query time is O(1 + l), where l is the number of links.
Abstract: In this paper we study two link distance problems for rectilinear paths inside a simple rectilinear polygon P. First, we present a data structure using O(n log n) storage such that a shortest path between two query points can be computed efficiently. If both query points are vertices of P, the query time is O(1 + l), where l is the number of links. If the query points are arbitrary points inside P, then the query time becomes O(log n + l). The resulting path is not only optimal in the rectilinear link metric, but it is optimal in the L1-metric as well. Secondly, it is shown that the rectilinear link diameter of P can be computed in time O(n log n). We also give an approximation algorithm that runs in linear time. This algorithm produces a solution that differs by at most three links from the exact diameter. The solutions are based on a rectilinear version of Chazelle's polygon cutting theorem. This new theorem states that any simple rectilinear polygon can be cut into two rectilinear subpolygons of size at most 3/4 times the original size, and that such a cut segment can be found in linear time.

Patent
Paul Anthony Winser
20 Jun 1991
TL;DR: In this article, an image of objects in a three dimensional space is generated for display on a two dimensional regular pixel array by offset and span generations (OFGN, SPGN) for anti-alias filtering which causes multiple rendition of the image, with each rendition displaced by a sub-pixel offset (Nx,Ny) with respect to the previous rendition.
Abstract: An image of objects in a three dimensional space is generated for display on a two dimensional regular pixel array by offset and span generations (OFGN, SPGN) for anti-alias filtering which causes multiple rendition of the image, with each rendition displaced by a sub-pixel offset (Nx,Ny) with respect to the previous rendition. Image primitives are rendered by a scan line algorithm using a linked active polygon list (APL) and a deleted polygon list (DPL) to enable vertical offsets to be effected. The deleted polygon list stores primitives which would not be effective for a given line but for the offset to enable anti-alias filtering. These polygons would not normally be available for processing when using the scan line algorithm. Economical hardware (600) is provided for horizontal edge correction of parameters such as depth (z) and texture coordinates (u,v).

Journal ArticleDOI
TL;DR: By combining polygon scan-conversion with a dynamic screen data structure, the technique provides significant speedup in the display time of polygonal scenes that depend on BSP trees, especially in cases where the number of polygons is large.
Abstract: A technique for displaying binary space partitioning (BSP) trees that is faster than the usual back-to-front display method is presented. By combining polygon scan-conversion with a dynamic screen data structure, the technique, a front-to-back approach, provides significant speedup in the display time of polygonal scenes that depend on BSP trees, especially in cases where the number of polygons is large. This speedup is confirmed by applying the technique to randomly generated triangles.
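The front-to-back ordering itself is a simple recursive BSP traversal: at each node, visit the subtree on the eye's side of the splitting plane before the node's own polygon, then the far subtree. The 2-D sketch below shows only this traversal order; the dynamic screen data structure that makes front-to-back display pay off is omitted, and the node layout is an assumption for illustration.

```python
class BSPNode:
    """Minimal 2-D BSP node: a splitting line (a, b, c) meaning
    a*x + b*y + c = 0, a polygon label stored at the node, and
    front/back children. A sketch to demonstrate traversal order.
    """
    def __init__(self, plane, polygon, front=None, back=None):
        self.plane, self.polygon = plane, polygon
        self.front, self.back = front, back

def front_to_back(node, eye):
    """Yield polygons nearest-first relative to the eye point."""
    if node is None:
        return
    a, b, c = node.plane
    side = a * eye[0] + b * eye[1] + c
    # visit the subtree on the eye's side first, then the node, then the rest
    near, far = (node.front, node.back) if side >= 0 else (node.back, node.front)
    yield from front_to_back(near, eye)
    yield node.polygon
    yield from front_to_back(far, eye)
```

Swapping the `near`/`far` visit order turns this into the classic back-to-front painter's traversal; front-to-back is what lets the dynamic screen structure skip already-covered pixels.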

Journal ArticleDOI
TL;DR: The error polygon technique for measuring telemetry error is invalid for estimating precision of individual location estimates when >2 bearings are used; it is valid only for estimating average precision over many triangulations.
Abstract: The error polygon technique for measuring telemetry error is invalid for estimating precision of individual location estimates when >2 bearings are used; it is valid only for estimating average precision over many triangulations. When 2 bearings are used, it is valid for both individual and mean error estimates.

Book ChapterDOI
Ron Goldman
01 Jan 1991
TL;DR: In this article, the authors discuss formulas for calculating area of planar polygons and volume of polyhedra and highlight that these formulas are valid even for nonconvex polygons.
Abstract: Publisher Summary This chapter discusses formulas for calculating the area of planar polygons and the volume of polyhedra. It considers a planar polygon with vertices P0, …, Pn. There is a simple closed formula for the area of the polygon. If Pn+1 = P0 and the points P0, …, Pn lie in the xy plane, then the formula for the area of the polygon can be derived from Green's theorem. If the points lie on some arbitrary plane perpendicular to a unit vector N, then the formula for the area of the polygon is derived from Stokes' theorem. These two formulas are valid even for nonconvex polygons. The chapter then discusses a formula for the volume of a polyhedron, considering a polyhedron with planar polygonal faces S0, …, Sn. The chapter highlights that there is a simple closed formula for the volume of the polyhedron, which is derived from Gauss's theorem.
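The closed formula for the planar case is the familiar shoelace sum that Green's theorem yields; a minimal sketch:

```python
def polygon_area(pts):
    """Signed area of a planar polygon with vertices P0..Pn listed in
    order, via the closed formula A = (1/2) * sum(x_i*y_{i+1} - x_{i+1}*y_i).
    Positive for counter-clockwise order, and valid for nonconvex
    polygons too, as the chapter notes.
    """
    n = len(pts)
    s = 0.0
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]  # wrap around: P_{n+1} = P_0
        s += x0 * y1 - x1 * y0
    return 0.5 * s
```

For the unit square the formula gives 1, and for a nonconvex example such as a 4×4 square with a triangular notch cut into its top edge it gives 12, matching the area computed by decomposition.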

Patent
07 Oct 1991
TL;DR: A curved surface designing method comprises a first step of inputting edge shape data of the n sides of a curved surface, a second step of obtaining a regular n-sided polygon in a parameter space on the basis of the given edge shapes, and a third step of calculating distance parameters and blending values based on the distance parameters.
Abstract: A curved surface designing method comprises: a first step of inputting edge shape data of n sides of a curved surface, a second step of obtaining a regular n-sided polygon in a parameter space on the basis of the given data of n sides, a third step of calculating distance parameters on the basis of the regular n-sided polygon obtained by the second step, a fourth step of calculating blending values on the basis of the distance parameters, a fifth step of mapping an inner point D of the regular n-sided polygon onto the respective sides so as to calculate boundary parameters, a sixth step of calculating at least one of three-dimensional coordinate points, and, if necessary, tangent vectors, on sides which are boundaries of the curved surface on the basis of the boundary parameters, and a seventh step of calculating arbitrary points on the curved surface on the basis of the distance parameters, the blending values, the three-dimensional coordinate points of the respective sides, and, if necessary, the tangent vectors, of the respective sides.

Journal ArticleDOI
TL;DR: In this paper, the authors studied the dimensions of self-avoiding polygons on a simple cubic lattice with fixed knot type and showed that the critical exponent and leading amplitude are independent of the knot type of the polygon.
Abstract: The authors study the dimensions (mean-square radius of gyration and mean span) of self-avoiding polygons on the simple cubic lattice with fixed knot type. The approach used is a Monte Carlo algorithm which is a combination of the BFACF algorithm and the pivot algorithm, so that the polygons are studied in the grand canonical ensemble, but the autocorrelation time is not too large. They show that, although the dimensions of polygons are sensitive to knot type, the critical exponent (ν) and the leading amplitude are independent of the knot type of the polygon. The knot type influences the confluent correction to the scaling term and hence the rate of approach to the limiting behaviour.

Proceedings ArticleDOI
Sivan Toledo
01 Jun 1991
TL;DR: This work employs the parametric search technique of Megiddo, and the fixed size polygon placement algorithms developed by Leven and Sharir, to obtain an algorithm that runs in time O(knλ4(kn) log(kn) log log(kn)).
Abstract: Given a convex polygonal object P and an environment consisting of polygonal obstacles, we seek a placement for the largest copy of P that does not intersect any of the obstacles, allowing translation, rotation and scaling. We employ the parametric search technique of Megiddo [Me], and the fixed size polygon placement algorithms developed by Leven and Sharir [LS, LS1], to obtain an algorithm that runs in time O(knλ4(kn) log(kn) log log(kn)). We also present several other efficient algorithms for restricted variants of the extremal polygon containment problem, using the same ideas. These variants include: placement of the largest homothetic copies of one or two convex polygons in another convex polygon and placement of the largest similar copy of a triangle in a convex polygon.

Book ChapterDOI
14 Aug 1991
TL;DR: It is said that a polygon P is immobilized by a set of points I on its boundary if any rigid motion of P in the plane causes at least one point of I to penetrate the interior of P.
Abstract: We say that a polygon P is immobilized by a set of points I on its boundary if any rigid motion of P in the plane causes at least one point of I to penetrate the interior of P. Three immobilization points are always sufficient for a polygon with vertices in general positions, but four points are necessary for some polygons with parallel edges. An O(n log n) algorithm that finds a set of 3 points that immobilize a given polygon with vertices in general positions is suggested. The algorithm becomes linear for convex polygons. Some results are generalized for d-dimensional polytopes, where 2d points are always sufficient and sometimes necessary to immobilize. When the polytope has vertices in general position d+1 points are sufficient to immobilize.

Proceedings ArticleDOI
01 Apr 1991
TL;DR: This paper presents algorithms and a fully adaptive data structure that efficiently supports insertions, deletions, updates and geometric queries at an arbitrary location for an arbitrary scale and shows how to partition polygon corner points of different priorities such that they can be stored in the structure with practically no redundancy.
Abstract: In geographic information systems, an important goal is the maintenance of seamless, scaleless maps. The amount of detail desired on a map decreases with decreasing scale. Cartographic techniques called “generalization” define the representations of geographic objects, depending on the scale. While generalization as a whole is considered an art, simple automatic generalization techniques exist for simple geometric objects. For polygonal lines and polygons, simplification techniques assign priorities to points. A map at a desired scale is then obtained by ignoring all points of sufficiently low priority. This implies that a geometric object appears on a map only if its priority is high enough, and also that an object is represented only by those of its defining points that have sufficiently high priority. In addition to the totally free choice of the representation scale of retrieved data, a seamless access to an arbitrary map should be supported. The efficiency of retrieving a map of some area at a certain scale ideally should only depend on the amount of data retrieved. In this paper, we present algorithms and a fully adaptive data structure that efficiently supports insertions, deletions, updates and geometric queries at an arbitrary location for an arbitrary scale. Our data structure adapts to dynamically changing data in such a way that objects outside the query area or of irrelevant priority do not influence the efficiency of the query substantially. The structure is of broader interest, because it supports general proximity queries with priority threshold for non-zero size geometric objects. Furthermore, we show how to partition polygon corner points of different priorities such that they can be stored in the structure with practically no redundancy, and polygons for any priority threshold can be reassembled easily.
A performance evaluation of our implementation with topographic data reveals that a performance gain of a high factor can be achieved with the structure under realistic assumptions.