
Showing papers by "Michael T. Goodrich" published in 2001


Book
01 Jan 2001
TL;DR: A textbook covering fundamental tools, graph algorithms, Internet algorithmics, and additional topics including computational geometry and NP-completeness.
Abstract: PART I: FUNDAMENTAL TOOLS. Algorithm Analysis. Basic Data Structures. Search Trees and Skip Lists. Sorting, Sets, and Selection. Fundamental Techniques. PART II: GRAPH ALGORITHMS. Graphs. Weighted Graphs. Network Flow and Matching. PART III: INTERNET ALGORITHMICS. Text Processing. Number Theory and Cryptography. Network Algorithms. PART IV: ADDITIONAL TOPICS. Computational Geometry. NP-Completeness. Algorithmic Frameworks. Appendix: Useful Mathematical Facts. Bibliography. Index.

241 citations


Proceedings ArticleDOI
12 Jun 2001
TL;DR: Applications of the work include certificate revocation in a public key infrastructure and the publication of data collections on the Internet.
Abstract: We present the software architecture and implementation of an efficient data structure for dynamically maintaining an authenticated dictionary. The building blocks of the data structure are skip lists and one-way commutative hash functions. We also present the results of a preliminary experiment on the performance of the data structure. Applications of our work include certificate revocation in a public key infrastructure and the publication of data collections on the Internet.

200 citations
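
The core trick pairs a commutative hash, so that h(x, y) = h(y, x), with a short verification path whose folded hash must match a signed digest of the whole structure. Below is a minimal sketch of that verification step, assuming SHA-256 as the underlying hash and a plain list as the proof; the names `comm_hash` and `verify_membership` are illustrative, and the paper's actual construction derives the proof paths from a skip list.

```python
import hashlib

def comm_hash(a: bytes, b: bytes) -> bytes:
    # Commutative pairing: hash the two inputs in canonical order,
    # so comm_hash(a, b) == comm_hash(b, a).
    lo, hi = sorted((a, b))
    return hashlib.sha256(lo + hi).digest()

def verify_membership(element: bytes, proof: list[bytes], signed_digest: bytes) -> bool:
    # Fold the element through the proof sequence; a genuine proof
    # reproduces the (signed) digest of the entire dictionary.
    acc = hashlib.sha256(element).digest()
    for value in proof:
        acc = comm_hash(acc, value)
    return acc == signed_digest
```

Because the pairing is commutative, the verifier does not need to know on which side each proof value sat in the skip list, which keeps proofs short and verification a simple left-to-right fold.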


Book ChapterDOI
01 Oct 2001
TL;DR: Two data structures that can efficiently support an infrastructure for persistent authenticated dictionaries are presented, and their performance is compared.
Abstract: We introduce the notion of persistent authenticated dictionaries, that is, dictionaries where the user can make queries of the type "was element e in set S at time t?" and get authenticated answers. Applications include credential and certificate validation checking in the past (as in digital signatures for electronic contracts), digital receipts, and electronic tickets. We present two data structures that can efficiently support an infrastructure for persistent authenticated dictionaries, and we compare their performance.

130 citations
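
The query interface is the distinctive part: membership is asked about a past time, not just the present. The toy sketch below (class name hypothetical) shows that interface in the most naive way possible, keeping a full frozen snapshot per update and without any authentication; the paper's contribution is precisely to avoid such copying, and to authenticate the answers, using efficient persistent data structures.

```python
import bisect

class PersistentDict:
    """Toy persistence: one frozen snapshot per update (no authentication)."""

    def __init__(self):
        self._times = []      # update timestamps, strictly increasing
        self._snapshots = []  # frozenset snapshot after each update

    def update(self, time, elements):
        self._times.append(time)
        self._snapshots.append(frozenset(elements))

    def was_member(self, e, t):
        # "Was element e in the set at time t?" -- find the snapshot
        # in force at time t and test membership there.
        i = bisect.bisect_right(self._times, t) - 1
        return i >= 0 and e in self._snapshots[i]
```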


Patent
08 Nov 2001
TL;DR: In this article, the authors propose a dictionary database that stores information objects so that any individual object can be authenticated as belonging or not belonging to the dictionary, using a skip list data structure and commutative hash functions.
Abstract: An efficient and practical method for dynamically maintaining an authenticated dictionary uses a skip list data structure and commutative hash functions to provide a dictionary database (201) that stores information objects so that any individual object can be authenticated as belonging or not belonging to the dictionary. The authentication consists of a short sequence of values, beginning with the element, that, when hashed in order using a cryptographic commutative hash function, yields the same value as the hashed digest of the entire dictionary. Rather than hashing up a dynamic 2-3 tree, hashes are created in a skip list. Validation of the result of the authenticating step is provided if the hash of the short sequence matches a signed hash of the entire skip list.

99 citations


Proceedings Article
01 Jan 2001
TL;DR: Protocols for distributed certified e-mail use encryption to ensure both confidentiality and fairness; scenarios that support a distributed TTP are explored in the context of both off-line and online protocols.

Abstract: In this paper we present protocols for distributed certified e-mail, which use encryption to ensure both confidentiality and fairness. As with other protocols for certified e-mail, ours achieve fairness by placing trust in an external entity, referred to as the Trusted Third Party (TTP). The TTP can become a bottleneck, however, and we explore scenarios that support a distributed TTP, in the context of both off-line and online protocols. With several servers dividing the TTP responsibilities, the level of confidence placed in individual servers can be reduced without compromising the TTP's overall trust.

57 citations
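
The paper's protocols are more involved, but the idea of lowering the trust placed in any single TTP server can be illustrated with a toy n-of-n XOR secret split: each server holds one share of a key, and no proper subset of servers learns anything about it. This is only a sketch of the trust-distribution idea, with hypothetical function names, not the protocols themselves.

```python
import secrets

def split_secret(secret: bytes, n: int) -> list[bytes]:
    # n-of-n XOR sharing: the first n-1 shares are uniformly random,
    # and the last share is chosen so all n XOR back to the secret.
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = bytes(x ^ y for x, y in zip(last, s))
    return shares + [last]

def recover_secret(shares: list[bytes]) -> bytes:
    # XOR all shares together to reconstruct the secret.
    acc = bytes(len(shares[0]))
    for s in shares:
        acc = bytes(x ^ y for x, y in zip(acc, s))
    return acc
```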


Journal ArticleDOI
TL;DR: This paper presents a lower bound on the area of drawings in which edges are drawn using exactly one circular arc, and gives an algorithm for drawing n-vertex planar graphs such that the edges are sequences of two continuous circular arcs.
Abstract: In this paper we address the problem of drawing planar graphs with circular arcs while maintaining good angular resolution and small drawing area. We present a lower bound on the area of drawings in which edges are drawn using exactly one circular arc. We also give an algorithm for drawing n-vertex planar graphs such that the edges are sequences of two continuous circular arcs. The algorithm runs in O(n) time and embeds the graph on the O(n) × O(n) grid, while maintaining Θ(1/d(v)) angular resolution, where d(v) is the degree of vertex v. Since in this case we use circular arcs of infinite radius, this is also the first algorithm that simultaneously achieves good angular resolution, small area, and at most one bend per edge using straight-line segments. Finally, we show how to create drawings in which edges are smooth C¹-continuous curves, represented by a sequence of at most three circular arcs.

54 citations


Journal ArticleDOI
TL;DR: Using offset as a notion of distance, it is shown how to compute the corresponding nearest- and furthest-site Voronoi diagrams of point sites in the plane using near-optimal deterministic O(n(log n + log² m) + m)-time algorithms.

Abstract: In this paper we develop the concept of a convex polygon-offset distance function. Using offset as a notion of distance, we show how to compute the corresponding nearest- and furthest-site Voronoi diagrams of point sites in the plane. We provide near-optimal deterministic O(n(log n + log² m) + m)-time algorithms, where n is the number of points and m is the complexity of the underlying polygon, for computing compact representations of both diagrams.

33 citations
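
A quick way to see what "offset as a notion of distance" changes is to label the points of a grid by their nearest site under an arbitrary, user-supplied distance function. The brute-force sketch below (hypothetical name `nearest_site_labels`) does exactly that; it is quadratic and bears no resemblance to the paper's near-optimal algorithms, but swapping a polygon-offset distance in for `dist` shows how the diagram's shape follows the metric.

```python
def nearest_site_labels(sites, dist, xs, ys):
    # Label every grid point (x, y) with the index of its nearest site
    # under the supplied distance function. O(|grid| * |sites|).
    return [
        [min(range(len(sites)), key=lambda i: dist(sites[i], (x, y))) for x in xs]
        for y in ys
    ]

# Example: squared Euclidean distance recovers the ordinary Voronoi diagram.
euclid = lambda p, q: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
labels = nearest_site_labels([(0, 0), (5, 3)], euclid,
                             xs=range(8), ys=range(8))
```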


Journal ArticleDOI
TL;DR: A randomized algorithm for computing the trapezoidal decomposition of a simple polygon, which can be viewed as a combination of Chazelle’s algorithm and of simple nonoptimal randomized algorithms due to Clarkson et al.
Abstract: We describe a randomized algorithm for computing the trapezoidal decomposition of a simple polygon. Its expected running time is linear in the size of the polygon. By a well-known and simple linear time reduction, this implies a linear time algorithm for triangulating a simple polygon. Our algorithm is considerably simpler than Chazelle's [3] celebrated optimal deterministic algorithm. The new algorithm can be viewed as a combination of Chazelle's algorithm and of simple nonoptimal randomized algorithms due to Clarkson et al. [6], [7], [9] and to Seidel [20]. As in Chazelle's algorithm, it is indispensable to include a bottom-up preprocessing phase, in addition to the actual top-down construction. An essential new idea is the use of random sampling on subchains of the initial polygonal chain, rather than on individual edges as is normally done.

31 citations


Proceedings ArticleDOI
01 Jun 2001
TL;DR: A novel efficient and practical algorithm to compute silhouettes from a sequence of viewpoints under perspective projection based on a point-plane duality in three dimensions, which allows an efficient computation of the silhouette of a polygonal model between consecutive frames.
Abstract: Silhouettes are perceptually and geometrically salient features of geometric models. Hence a number of graphics and visualization applications need to find them to aid further processing. The efficient computation of silhouettes, especially in the context of perspective projection, is known to be difficult. This paper presents a novel, efficient, and practical algorithm to compute silhouettes from a sequence of viewpoints under perspective projection. Parallel projection is a special case of this algorithm. Our approach is based on a point-plane duality in three dimensions, which allows an efficient computation of the changes in the silhouette of a polygonal model between consecutive frames. In addition, we present several applications of our technique to problems from computer graphics and medical visualization. We also provide experimental data that show the efficiency of our approach.

30 citations
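
Under perspective projection, an edge of a polygonal model lies on the silhouette exactly when one of its two incident faces points toward the viewpoint and the other points away. The sketch below implements that standard per-edge test directly; the paper's contribution is to avoid re-testing every edge in every frame by mapping face planes through a point-plane duality and tracking only the changes between consecutive viewpoints.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sub(u, v):
    return tuple(a - b for a, b in zip(u, v))

def is_silhouette_edge(n1, n2, point_on_edge, viewpoint):
    # n1, n2: outward normals of the two faces sharing the edge.
    # The edge is on the silhouette iff one face is front-facing and
    # the other back-facing as seen from the viewpoint.
    view = sub(viewpoint, point_on_edge)
    return (dot(n1, view) > 0) != (dot(n2, view) > 0)
```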


Book ChapterDOI
08 Aug 2001
TL;DR: This paper gives a randomized online algorithm that is O(log B)-competitive against an oblivious adversary, where the bid values vary between 1 and B per item, and shows that this algorithm is optimal in the worst case and that it performs significantly better than any worst-case bounds achievable via deterministic strategies.

Abstract: In this paper we provide an algorithmic approach to the study of online auctioning. From the perspective of the seller, we formalize the auctioning problem as that of designing an algorithmic strategy that fairly maximizes the revenue earned by selling n identical items to bidders who submit bids online. We give a randomized online algorithm that is O(log B)-competitive against an oblivious adversary, where the bid values vary between 1 and B per item. We show that this algorithm is optimal in the worst case and that it performs significantly better than any worst-case bounds achievable via deterministic strategies. Additionally, we present experimental evidence to show that our algorithm outperforms conventional heuristic methods in practice. Finally, we explore ways of modifying the conventional model of online algorithms to improve the competitiveness of other types of auctioning scenarios while still maintaining fairness.

21 citations
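
A classic route to an O(log B) competitive ratio in settings like this is to commit to a random power-of-two reserve price: with bids in [1, B], the reserve 2^i for uniform i recovers, in expectation, a logarithmic fraction of the best fixed-price revenue. The sketch below is that generic strategy, with hypothetical names, and is not necessarily the paper's exact algorithm.

```python
import math
import random

def run_auction(bids, n_items, B):
    # Draw a reserve price 2^i with i uniform in {0, ..., floor(log2 B)},
    # then sell (up to n_items) to arriving bids that meet the reserve.
    reserve = 2 ** random.randint(0, int(math.log2(B)))
    revenue = sold = 0
    for bid in bids:  # bids arrive online, one at a time
        if sold < n_items and bid >= reserve:
            revenue += reserve
            sold += 1
    return revenue
```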


Patent
13 Jul 2001
TL;DR: In this paper, a methodology and system is used to facilitate the exchange of valued electronic information in a confidential, fair, and efficient manner, which relies upon one or a plurality of postal agents (servers) to provide secured online exchange of the information by arranging an efficient validation of the required signatures and information being exchanged between the sender and receiver.
Abstract: A methodology and system are used to facilitate the exchange of valued electronic information in a confidential, fair, and efficient manner. Either of two protocols can be employed; both use encryption and electronic signatures to effectively guarantee the origin and identity of the sender and receiver in the exchange of valued information, and both require a timely response from sender and receiver. The protocols rely upon one or a plurality of postal agents (servers) to provide secured online exchange of the information by arranging an efficient validation of the required signatures and information being exchanged between the sender and receiver. In the event of a breakdown in the exchange between sender and receiver, the use of a trusted third party (TTP) allows for fair and pre-agreed arbitration based upon the encrypted information and electronic signatures of the sender and receiver. The method does not require the use of the TTP unless a dispute arises.


Proceedings ArticleDOI
01 Feb 2001
TL;DR: It is argued that the foundational topics of CS7/DS&A should remain even when the course is taught in an Internet-centric manner, and that such a course will stimulate new interest and excitement in material that some students perceive as stale, boring, and purely theoretical.
Abstract: We describe an Internet-based approach for teaching important concepts in a Junior-Senior level course on the design and analysis of data structures and algorithms (traditionally called CS7 or DS&A). The main idea of this educational paradigm is twofold. First, it provides fresh motivation for fundamental algorithms and data structures that are finding new applications in the context of the Internet. Second, it provides a source for introducing new algorithms and data structures that are derived from specific Internet applications. In this paper, we suggest some key pedagogical and curriculum updates that can be made to the classic CS7/DS&A course to turn it into a course on Internet Algorithmics. We believe that such a course will stimulate new interest and excitement in material that is perceived by some students to be stale, boring, and purely theoretical. We argue that the foundational topics from CS7/DS&A should remain even when it is taught in an Internet-centric manner. This, of course, should come as no surprise to the seasoned computer scientist, who understands the value of algorithmic thinking.

Book ChapterDOI
01 Jan 2001
TL;DR: It is shown that using randomization in data structures and algorithms is safe and can be used to significantly simplify efficient solutions to various computational problems.
Abstract: We describe simplified analyses of well-known randomized algorithms for searching, sorting, and selection. The proofs we include are quite simple and can easily be made a part of a Freshman-Sophomore Introduction to Data Structures (CS2) course and a Junior-Senior level course on the design and analysis of data structures and algorithms (CS7/DS&A). We show that using randomization in data structures and algorithms is safe and can be used to significantly simplify efficient solutions to various computational problems.
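
Randomized selection is a representative example of the algorithms the chapter analyzes: a random pivot gives expected O(n) time with a proof simple enough for a CS2 course. A minimal sketch:

```python
import random

def quickselect(items, k):
    # Return the k-th smallest element (0-indexed) in expected O(n) time.
    pivot = random.choice(items)
    lo = [x for x in items if x < pivot]
    hi = [x for x in items if x > pivot]
    n_eq = len(items) - len(lo) - len(hi)  # copies equal to the pivot
    if k < len(lo):
        return quickselect(lo, k)
    if k < len(lo) + n_eq:
        return pivot
    return quickselect(hi, k - len(lo) - n_eq)
```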

ReportDOI
14 Dec 2001
TL;DR: This research project is aimed at facilitating an effective technology transfer from computational geometry to the various applied fields to which it is relevant.
Abstract: This research project is aimed at facilitating an effective technology transfer from computational geometry to the various applied fields to which it is relevant. Our technical contributions include algorithmic foundations, practical methodologies, emerging technologies, and applications.