
Michael T. Goodrich

Researcher at University of California, Irvine

Publications -  445
Citations -  14652

Michael T. Goodrich is an academic researcher at the University of California, Irvine. His research topics include planar graphs and time complexity. He has an h-index of 61 and has co-authored 430 publications receiving 14,045 citations. His previous affiliations include New York University and the Technion – Israel Institute of Technology.

Papers
Book ChapterDOI

The Galois Complexity of Graph Drawing: Why Numerical Solutions Are Ubiquitous for Force-Directed, Spectral, and Circle Packing Drawings

TL;DR: Galois theory is used to show that many variants of these problems have solutions that cannot be expressed by nested radicals or nested roots of low-degree polynomials, and thus cannot be computed exactly even in extended computational models that include such operations.
Posted Content

Balanced Circle Packings for Planar Graphs

TL;DR: In this paper, the authors studied balanced circle packings and circle contact representations for planar graphs, where the ratio of the largest circle's diameter to the smallest circle's diameter is polynomial in the number of circles.
Proceedings Article

Tracking Paths in Planar Graphs

TL;DR: In this paper, the authors considered the problem of finding the smallest subset of vertices (trackers) whose intersection with any path between two given vertices yields a unique sequence, and gave a 4-approximation algorithm for planar graphs.
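To illustrate the tracking-set notion this abstract describes, here is a minimal brute-force checker (an illustrative sketch, not the paper's 4-approximation algorithm; the graph representation and function names are assumptions): a set of trackers is valid if every simple s-t path produces a distinct subsequence of trackers.

```python
def all_simple_paths(adj, s, t, path=None):
    """Enumerate all simple s-t paths in a graph given as an adjacency dict."""
    path = path or [s]
    if s == t:
        yield path
        return
    for v in adj[s]:
        if v not in path:
            yield from all_simple_paths(adj, v, t, path + [v])

def is_tracking_set(adj, s, t, trackers):
    """True if each simple s-t path yields a unique sequence of trackers."""
    seen = {}
    for p in all_simple_paths(adj, s, t):
        sig = tuple(v for v in p if v in trackers)  # trackers in path order
        if sig in seen and seen[sig] != p:
            return False  # two distinct paths, same tracker sequence
        seen[sig] = p
    return True
```

For example, in the directed graph `{0: [1, 2], 1: [3], 2: [3], 3: []}` the two 0-3 paths are distinguished by the single tracker `{1}`, but not by the empty set. Exhaustive checking like this is exponential; the paper's contribution is an efficient approximation of the smallest such set.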
Book ChapterDOI

Parallel algorithms for evaluating sequences of set manipulation operations (preliminary version)

TL;DR: This work shows that the problem of evaluating a sequence S of set-manipulation operations is in NC for various combinations of common operations, and develops techniques for improving the time and/or processor complexity.
Posted Content

Fully De-Amortized Cuckoo Hashing for Cache-Oblivious Dictionaries and Multimaps

TL;DR: This work designs hashing-based indexing schemes for dictionaries and multimaps that achieve worst-case optimal performance for lookups and updates, with only a small or negligible probability that the data structure will require a rehash operation.
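For background, standard cuckoo hashing, which this paper builds on, achieves worst-case constant-time lookups by giving each key exactly two candidate slots. The toy sketch below (an assumption-laden illustration, not the paper's de-amortized, cache-oblivious construction) shows the two-probe lookup and the eviction chain on insert.

```python
class CuckooHashTable:
    """Toy two-table cuckoo hash: each key has one slot per table."""

    def __init__(self, size=11, max_kicks=32):
        self.size = size
        self.max_kicks = max_kicks  # eviction-chain bound before rehash
        self.tables = [[None] * size, [None] * size]

    def _slots(self, key):
        # Two (toy) hash positions; a real implementation uses
        # independent hash functions.
        return hash(('a', key)) % self.size, hash(('b', key)) % self.size

    def lookup(self, key):
        # Worst-case O(1): at most two probes, one per table.
        for t, h in zip((0, 1), self._slots(key)):
            entry = self.tables[t][h]
            if entry is not None and entry[0] == key:
                return entry[1]
        return None

    def insert(self, key, value):
        # Update in place if the key is already present.
        for t, h in zip((0, 1), self._slots(key)):
            entry = self.tables[t][h]
            if entry is not None and entry[0] == key:
                self.tables[t][h] = (key, value)
                return
        item, t = (key, value), 0
        for _ in range(self.max_kicks):
            h = self._slots(item[0])[t]
            # Place the item, evicting any occupant to the other table.
            item, self.tables[t][h] = self.tables[t][h], item
            if item is None:
                return
            t = 1 - t
        raise RuntimeError("eviction cycle: rehash needed")
```

Inserts are only amortized/expected O(1) here because an unlucky eviction chain can run long or trigger a rehash; removing that amortization (and bounding the rehash probability) is precisely what "fully de-amortized" refers to in the title.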