
Showing papers by "Michael T. Goodrich" published in 2012


Proceedings ArticleDOI
17 Jan 2012
TL;DR: This paper studies the problem of providing privacy-preserving access to an outsourced honest-but-curious data repository for a group of trusted users, and gives a method with O(log n) amortized access overhead for simulating a RAM algorithm that has a memory of size n, using a scheme that is data-oblivious with very high probability.
Abstract: Motivated by cloud computing applications, we study the problem of providing privacy-preserving access to an outsourced honest-but-curious data repository for a group of trusted users. We show how to achieve efficient privacy-preserving data access using a combination of probabilistic encryption, which directly hides data values, and stateless oblivious RAM simulation, which hides the pattern of data accesses. We give a method with O(log n) amortized access overhead for simulating a RAM algorithm that has a memory of size n, using a scheme that is data-oblivious with very high probability. We assume that the simulation has access to a private workspace of size O(n^v), for any given fixed constant v > 0, but does not maintain state in between data access requests. Our simulation makes use of pseudorandom hash functions and is based on a novel hierarchy of cuckoo hash tables that all share a common stash. The method outperforms all previous techniques for stateless clients in terms of access overhead. We also provide experimental results from a prototype implementation of our scheme, showing its practicality. In addition, we show that one can eliminate the dependence on pseudorandom hash functions in our simulation while having the overhead rise to be O(log^2 n).
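As an illustration of the building block named in the abstract, here is a minimal Python sketch of a single cuckoo hash table backed by a small stash. The table size, eviction bound, stash limit, and the use of salted SHA-256 as the two hash functions are arbitrary choices for the sketch, not parameters from the paper, and the paper's hierarchy of such tables sharing one common stash is not reproduced here.

```python
import hashlib

class CuckooWithStash:
    """Minimal cuckoo hash table backed by a small stash (illustrative only)."""

    def __init__(self, capacity=1024, max_evictions=32, stash_limit=8):
        self.capacity = capacity
        self.max_evictions = max_evictions
        self.stash_limit = stash_limit
        self.tables = [[None] * capacity, [None] * capacity]  # two hash tables
        self.stash = []  # items that could not be placed after too many evictions

    def _h(self, which, key):
        # Two "independent" hash functions derived from salted SHA-256.
        digest = hashlib.sha256(f"{which}:{key}".encode()).digest()
        return int.from_bytes(digest[:8], "big") % self.capacity

    def get(self, key):
        for which in (0, 1):
            slot = self.tables[which][self._h(which, key)]
            if slot is not None and slot[0] == key:
                return slot[1]
        for k, v in self.stash:  # the stash is always searched as well
            if k == key:
                return v
        return None

    def put(self, key, value):
        # Existing keys are not deduplicated, to keep the sketch short.
        item, which = (key, value), 0
        for _ in range(self.max_evictions):
            idx = self._h(which, item[0])
            if self.tables[which][idx] is None:
                self.tables[which][idx] = item
                return
            # Slot occupied: evict its occupant and reinsert it via the other table.
            item, self.tables[which][idx] = self.tables[which][idx], item
            which = 1 - which
        if len(self.stash) >= self.stash_limit:
            raise RuntimeError("stash overflow: a rebuild/rehash would be needed")
        self.stash.append(item)

table = CuckooWithStash()
table.put("block-17", b"ciphertext...")
assert table.get("block-17") == b"ciphertext..."
```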

249 citations


Proceedings ArticleDOI
07 Feb 2012
TL;DR: It is shown that Alice can hide both the content of her data and the pattern in which she accesses her data, with high probability, using a method that achieves O(1) amortized rounds of communication between her and Bob for each data access.
Abstract: We study oblivious storage (OS), a natural way to model privacy-preserving data outsourcing where a client, Alice, stores sensitive data at an honest-but-curious server, Bob. We show that Alice can hide both the content of her data and the pattern in which she accesses her data, with high probability, using a method that achieves O(1) amortized rounds of communication between her and Bob for each data access. We assume that Alice and Bob exchange small messages, of size O(N^{1/c}), for some constant c >= 2, in a single round, where N is the size of the data set that Alice is storing with Bob. We also assume that Alice has a private memory of size 2N^{1/c}. These assumptions model real-world cloud storage scenarios, where trade-offs occur between latency, bandwidth, and the size of the client's private memory.

94 citations


Book ChapterDOI
19 Sep 2012
TL;DR: It is shown that there are sparse maximal 1-planar graphs with only $\frac{45}{17} n + \mathcal{O}(1)$ edges, and it is proved that a maximal 1-planar rotation system of a graph uniquely determines its 1-planar embedding.
Abstract: A graph is 1-planar if it can be drawn in the plane such that each edge is crossed at most once. It is maximal 1-planar if the addition of any edge violates 1-planarity. Maximal 1-planar graphs have at most 4n−8 edges. We show that there are sparse maximal 1-planar graphs with only $\frac{45}{17} n + \mathcal{O}(1)$ edges. With a fixed rotation system there are maximal 1-planar graphs with only $\frac{7}{3} n + \mathcal{O}(1)$ edges. This is sparser than maximal planar graphs. There are no maximal 1-planar graphs with fewer than $\frac{21}{10} n - \mathcal{O}(1)$ edges, nor with fewer than $\frac{28}{13} n - \mathcal{O}(1)$ edges with a fixed rotation system. Furthermore, we prove that a maximal 1-planar rotation system of a graph uniquely determines its 1-planar embedding.

86 citations


Journal ArticleDOI
TL;DR: Lombardi drawings, introduced in this paper, represent edges as circular arcs rather than line segments or polylines and give every vertex perfect angular resolution: the edges are equally spaced around each vertex.
Abstract: We introduce the notion of Lombardi graph drawings, named after the American abstract artist Mark Lombardi. In these drawings, edges are represented as circular arcs rather than as line segments or polylines, and the vertices have perfect angular resolution: the edges are equally spaced around each vertex. We describe algorithms for finding Lombardi drawings of regular graphs, graphs of bounded degeneracy, and certain families of planar graphs.
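A minimal sketch of the two geometric ingredients behind such drawings: equally spaced edge directions around a vertex (perfect angular resolution) and the circular arc determined by its two endpoints plus a prescribed tangent direction at one of them. The function names and conventions below are illustrative, not taken from the paper.

```python
import math

def perfect_resolution_ports(degree, offset=0.0):
    """Equally spaced tangent directions (radians) around a vertex, as required
    for perfect angular resolution in a Lombardi drawing."""
    return [offset + 2 * math.pi * i / degree for i in range(degree)]

def arc_through(p, q, tangent_angle):
    """Center and radius of the circular arc that leaves p with the given
    tangent direction and ends at q. Returns None in the degenerate case
    where the tangent points straight at q (a line segment suffices)."""
    px, py = p
    qx, qy = q
    # The circle's center lies on the normal to the tangent at p ...
    nx, ny = -math.sin(tangent_angle), math.cos(tangent_angle)
    # ... and on the perpendicular bisector of the segment pq.
    mx, my = (px + qx) / 2.0, (py + qy) / 2.0
    dx, dy = qx - px, qy - py
    denom = nx * dx + ny * dy
    if abs(denom) < 1e-12:
        return None
    t = ((mx - px) * dx + (my - py) * dy) / denom
    cx, cy = px + t * nx, py + t * ny
    return (cx, cy), math.hypot(cx - px, cy - py)

# A degree-4 vertex gets ports at 0, 90, 180, and 270 degrees.
print(perfect_resolution_ports(4))
print(arc_through((0.0, 0.0), (2.0, 0.0), math.pi / 4))
```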

48 citations


Posted Content
TL;DR: Techniques for using social gravity as an additional force in force-directed layouts, together with a scaling technique, are presented, to produce drawings of trees and forests, as well as more complex social networks.
Abstract: Force-directed layout algorithms produce graph drawings by resolving a system of emulated physical forces. We present techniques for using social gravity as an additional force in force-directed layouts, together with a scaling technique, to produce drawings of trees and forests, as well as more complex social networks. Social gravity assigns mass to vertices in proportion to their network centrality, which allows vertices that are more graph-theoretically central to be visualized in physically central locations. Scaling varies the gravitational force throughout the simulation, and reduces crossings relative to unscaled gravity. In addition to providing this algorithmic framework, we apply our algorithms to social networks produced by Mark Lombardi, and we show how social gravity can be incorporated into force-directed Lombardi-style drawings.
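The following Python sketch gives one plausible reading of the scheme described above: a standard spring-and-repulsion layout with an extra gravitational pull toward the origin, weighted by degree centrality (standing in for whichever centrality measure is chosen) and scaled up over the course of the simulation. All constants are arbitrary; this is not the authors' implementation.

```python
import math
import random

def social_gravity_layout(nodes, edges, iterations=300,
                          spring_len=1.0, k_spring=0.05,
                          k_repel=0.5, k_gravity=0.02):
    """Toy force-directed layout with 'social gravity' (illustrative sketch)."""
    pos = {v: [random.uniform(-1, 1), random.uniform(-1, 1)] for v in nodes}
    degree = {v: 0 for v in nodes}
    for u, w in edges:
        degree[u] += 1
        degree[w] += 1
    max_deg = max(degree.values()) or 1
    # Mass proportional to degree centrality: central nodes feel more gravity.
    mass = {v: degree[v] / max_deg for v in nodes}

    for step in range(iterations):
        force = {v: [0.0, 0.0] for v in nodes}
        vs = list(nodes)
        # Pairwise repulsion between all vertices.
        for i, u in enumerate(vs):
            for w in vs[i + 1:]:
                dx = pos[u][0] - pos[w][0]; dy = pos[u][1] - pos[w][1]
                d2 = dx * dx + dy * dy + 1e-9
                f = k_repel / d2
                force[u][0] += f * dx; force[u][1] += f * dy
                force[w][0] -= f * dx; force[w][1] -= f * dy
        # Spring attraction along edges.
        for u, w in edges:
            dx = pos[w][0] - pos[u][0]; dy = pos[w][1] - pos[u][1]
            d = math.hypot(dx, dy) + 1e-9
            f = k_spring * (d - spring_len) / d
            force[u][0] += f * dx; force[u][1] += f * dy
            force[w][0] -= f * dx; force[w][1] -= f * dy
        # Social gravity toward the origin, ramped up over time ("scaling").
        g = k_gravity * (step + 1) / iterations
        for v in nodes:
            force[v][0] -= g * mass[v] * pos[v][0]
            force[v][1] -= g * mass[v] * pos[v][1]
        # Simple Euler step.
        for v in nodes:
            pos[v][0] += 0.1 * force[v][0]
            pos[v][1] += 0.1 * force[v][1]
    return pos

print(social_gravity_layout([0, 1, 2, 3], [(0, 1), (1, 2), (1, 3)]))
```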

33 citations


Journal ArticleDOI
01 Jun 2012
TL;DR: The concept of an authenticated web crawler is introduced, and its design and prototype implementation are presented; the system provides low communication overhead between the search engine and the user, and fast verification of the returned results by the user.
Abstract: We consider the problem of verifying the correctness and completeness of the result of a keyword search. We introduce the concept of an authenticated web crawler and present its design and prototype implementation. An authenticated web crawler is a trusted program that computes a specially-crafted signature over the web contents it visits. This signature enables (i) the verification of common Internet queries on web pages, such as conjunctive keyword searches---this guarantees that the output of a conjunctive keyword search is correct and complete; (ii) the verification of the content returned by such Internet queries---this guarantees that web data is authentic and has not been maliciously altered since the computation of the signature by the crawler. In our solution, the search engine returns a cryptographic proof of the query result. Both the proof size and the verification time are proportional only to the sizes of the query description and the query result, but do not depend on the number or sizes of the web pages over which the search is performed. As we experimentally demonstrate, the prototype implementation of our system provides a low communication overhead between the search engine and the user, and fast verification of the returned results by the user.

32 citations


Book ChapterDOI
19 Sep 2012
TL;DR: In this paper, the authors present techniques for using social gravity as an additional force in force-directed layouts, together with a scaling technique to produce drawings of trees and forests, as well as more complex social networks.
Abstract: Force-directed layout algorithms produce graph drawings by resolving a system of emulated physical forces. We present techniques for using social gravity as an additional force in force-directed layouts, together with a scaling technique, to produce drawings of trees and forests, as well as more complex social networks. Social gravity assigns mass to vertices in proportion to their network centrality, which allows vertices that are more graph-theoretically central to be visualized in physically central locations. Scaling varies the gravitational force throughout the simulation, and reduces crossings relative to unscaled gravity. In addition to providing this algorithmic framework, we apply our algorithms to social networks produced by Mark Lombardi, and we show how social gravity can be incorporated into force-directed Lombardi-style drawings.

31 citations


Journal ArticleDOI
TL;DR: In this article, the authors show how to produce a planar straight-line drawing of a combinatorially-embedded genus-g graph with the graph's canonical polygonal schema drawn as a convex polygonal external face.
Abstract: We study the classic graph drawing problem of drawing a planar graph using straight-line edges with a prescribed convex polygon as the outer face. Unlike previous algorithms for this problem, which may produce drawings with exponential area, our method produces drawings with polynomial area. In addition, we allow for collinear points on the boundary, provided such vertices do not create overlapping edges. Thus, we solve an open problem of Duncan et al., which, when combined with their work, implies that we can produce a planar straight-line drawing of a combinatorially-embedded genus-g graph with the graph's canonical polygonal schema drawn as a convex polygonal external face.
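The paper's polynomial-area algorithm is not reproduced here, but the classical Tutte barycentric method addresses the same setting (a planar straight-line drawing with a prescribed convex outer face, albeit with no area guarantee) and makes the problem concrete. The NumPy sketch below fixes the outer face on a regular convex polygon and places each interior vertex at the average of its neighbours; it assumes the input graph is 3-connected and planar, as Tutte's theorem requires.

```python
import math
import numpy as np

def tutte_drawing(n, edges, outer_face):
    """Classical Tutte barycentric drawing with the outer face fixed on a
    regular convex polygon (NOT the polynomial-area method of the paper)."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    pos = np.zeros((n, 2))
    fixed = set(outer_face)
    for i, v in enumerate(outer_face):  # prescribe the convex outer face
        ang = 2 * math.pi * i / len(outer_face)
        pos[v] = (math.cos(ang), math.sin(ang))

    inner = [v for v in range(n) if v not in fixed]
    index = {v: i for i, v in enumerate(inner)}
    A = np.zeros((len(inner), len(inner)))
    b = np.zeros((len(inner), 2))
    for v in inner:
        A[index[v], index[v]] = len(adj[v])
        for w in adj[v]:
            if w in fixed:
                b[index[v]] += pos[w]
            else:
                A[index[v], index[w]] -= 1.0
    pos[inner] = np.linalg.solve(A, b)  # interior vertex = average of neighbours
    return pos

# Square outer face with one interior hub vertex; the hub lands at the centroid.
print(tutte_drawing(5, [(0, 1), (1, 2), (2, 3), (3, 0),
                        (4, 0), (4, 1), (4, 2), (4, 3)], [0, 1, 2, 3]))
```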

30 citations


Journal ArticleDOI
TL;DR: Methods for maintaining subgraph frequencies in a dynamic graph are presented, using data structures parameterized in terms of h, the h-index of the graph, enabling a number of new applications in bioinformatics and social-network research.
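A toy sketch of the underlying problem, assuming triangles as the subgraph being counted: maintain the count under edge insertions and deletions by intersecting neighbour sets, plus a helper that computes the graph's h-index. This naive per-edge update is not the h-index-parameterized data structure of the paper; it only makes the setting concrete.

```python
class DynamicTriangleCounter:
    """Maintain the number of triangles under edge insertions and deletions."""

    def __init__(self):
        self.adj = {}        # vertex -> set of neighbours
        self.triangles = 0

    def _neighbors(self, v):
        return self.adj.setdefault(v, set())

    def insert_edge(self, u, v):
        if v in self._neighbors(u):
            return
        # Every common neighbour of u and v closes a new triangle.
        self.triangles += len(self._neighbors(u) & self._neighbors(v))
        self.adj[u].add(v)
        self.adj[v].add(u)

    def delete_edge(self, u, v):
        if v not in self._neighbors(u):
            return
        self.adj[u].discard(v)
        self.adj[v].discard(u)
        self.triangles -= len(self._neighbors(u) & self._neighbors(v))

def h_index(adj):
    """Largest h such that at least h vertices have degree >= h."""
    degs = sorted((len(nbrs) for nbrs in adj.values()), reverse=True)
    h = 0
    for i, d in enumerate(degs, start=1):
        if d >= i:
            h = i
    return h

tc = DynamicTriangleCounter()
for e in [(1, 2), (2, 3), (1, 3), (3, 4)]:
    tc.insert_edge(*e)
assert tc.triangles == 1
assert h_index(tc.adj) == 2
```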

22 citations


Book ChapterDOI
03 Dec 2012
TL;DR: This work designs hashing-based indexing schemes for dictionaries and multimaps that achieve worst-case optimal performance for lookups and updates, with minimal space overhead and sub-polynomial probability that the data structure will require a rehash operation.
Abstract: A dictionary (or map) is a key-value store that requires all keys be unique, and a multimap is a key-value store that allows for multiple values to be associated with the same key. We design hashing-based indexing schemes for dictionaries and multimaps that achieve worst-case optimal performance for lookups and updates, with minimal space overhead and sub-polynomial probability that the data structure will require a rehash operation. Our dictionary structure is designed for the Random Access Machine (RAM) model, while our multimap implementation is designed for the cache-oblivious external memory (I/O) model. The failure probabilities for our structures are sub-polynomial, which can be useful in cryptographic or data-intensive applications.

11 citations


Book ChapterDOI
19 Sep 2012
TL;DR: It is shown that a number of classic graph drawing algorithms can be efficiently implemented in such a framework where the client can maintain privacy while constructing a drawing of her graph.
Abstract: We study graph drawing in a cloud-computing context where data is stored externally and processed using a small local working storage. We show that a number of classic graph drawing algorithms can be efficiently implemented in such a framework where the client can maintain privacy while constructing a drawing of her graph.

Journal ArticleDOI
TL;DR: In this article, the authors study how much a character string Q leaks about itself whenever it engages in comparison protocols with strings provided by a querier, Bob, even if those protocols are cryptographically guaranteed to produce no information beyond the scores that assess how well Q matches the strings Bob offers, and they show that such scenarios allow Bob to play variants of the game of Mastermind with Q so as to learn the complete identity of Q.
Abstract: We study the degree to which a character string Q leaks details about itself any time it engages in comparison protocols with strings provided by a querier, Bob, even if those protocols are cryptographically guaranteed to produce no additional information other than the scores that assess the degree to which Q matches strings offered by Bob. We show that such scenarios allow Bob to play variants of the game of Mastermind with Q so as to learn the complete identity of Q. We show that there are a number of efficient implementations for Bob to employ in these Mastermind attacks, depending on the knowledge he has about the structure of Q, which show how quickly he can determine Q. Indeed, we show that Bob can discover Q using a number of rounds of test comparisons that is much smaller than the length of Q, under reasonable assumptions regarding the types of scores that are returned by the cryptographic protocols and whether he can use knowledge about the distribution that Q comes from. We also provide the results of a case study we performed on a database of mitochondrial DNA, showing the vulnerability of existing real-world DNA data to the Mastermind attack.
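A toy illustration of the weakest version of such an attack, assuming the protocol leaks an exact match-count score: the querier recovers the hidden string position by position using only score queries. The paper's strategies are far more query-efficient and handle richer score functions; the names below are illustrative, not the paper's.

```python
def match_score(secret, guess):
    """Oracle: number of positions at which the two strings agree.
    Stands in for the score leaked by a secure comparison protocol."""
    return sum(a == b for a, b in zip(secret, guess))

def mastermind_recover(score, length, alphabet):
    """Recover a hidden string of known length using only match-count scores.
    Uses at most 1 + length * (len(alphabet) - 1) queries; the paper's
    strategies need far fewer."""
    base = alphabet[0] * length
    base_score = score(base)
    recovered = []
    for i in range(length):
        found = alphabet[0]  # default: the baseline symbol at position i was correct
        for c in alphabet[1:]:
            probe = base[:i] + c + base[i + 1:]
            if score(probe) == base_score + 1:  # only the correct symbol gains a match
                found = c
                break
        recovered.append(found)
    return "".join(recovered)

# Example: recover a short DNA string from match scores alone.
secret = "GATTACA"
assert mastermind_recover(lambda g: match_score(secret, g), len(secret), "ACGT") == secret
```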

Posted Content
TL;DR: The concept of an authenticated web crawler is introduced, and the design and prototype implementation of this new concept are presented; the system gives a low communication overhead between the search engine and the user and allows for fast verification of the returned results on the user side.
Abstract: Searching accounts for one of the most frequently performed computations over the Internet as well as one of the most important applications of outsourced computing, producing results that critically affect users' decision-making behaviors. As such, verifying the integrity of Internet-based searches over vast amounts of web contents is essential. We provide the first solution to this general security problem. We introduce the concept of an authenticated web crawler and present the design and prototype implementation of this new concept. An authenticated web crawler is a trusted program that computes a special "signature" $s$ of a collection of web contents it visits. Subject to this signature, web searches can be verified to be correct with respect to the integrity of their produced results. This signature also allows the verification of complicated queries on web pages, such as conjunctive keyword searches. In our solution, along with the web pages that satisfy any given search query, the search engine also returns a cryptographic proof. This proof, together with the signature $s$, enables any user to efficiently verify that no legitimate web pages are omitted from the result computed by the search engine, and that no pages that are non-conforming with the query are included in the result. An important property of our solution is that the proof size and the verification time both depend solely on the sizes of the query description and the query result, but not on the number or sizes of the web pages over which the search is performed. Our authentication protocols are based on standard Merkle trees and the more involved bilinear-map accumulators. As we experimentally demonstrate, the prototype implementation of our system gives a low communication overhead between the search engine and the user, and allows for fast verification of the returned results on the user side.
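The bilinear-map accumulator half of the construction is beyond a short sketch, but the Merkle-tree half the abstract mentions is standard. Here is a minimal Python sketch of computing a root over a list of crawled items and producing and verifying a membership proof; the leaf encoding and the padding convention are my own choices, not the paper's.

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root hash over a list of byte-string leaves (duplicating the last
    entry at odd levels, a common convention)."""
    level = [_h(b"\x00" + leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [_h(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes on the path from leaf `index` to the root."""
    level = [_h(b"\x00" + leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        proof.append((index % 2, level[index ^ 1]))  # (am I the right child?, sibling)
        level = [_h(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        index //= 2
    return proof

def merkle_verify(root, leaf, proof):
    node = _h(b"\x00" + leaf)
    for is_right, sibling in proof:
        node = _h(b"\x01" + sibling + node) if is_right else _h(b"\x01" + node + sibling)
    return node == root

pages = [b"page-A", b"page-B", b"page-C", b"page-D", b"page-E"]
root = merkle_root(pages)
assert merkle_verify(root, pages[2], merkle_proof(pages, 2))
```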

Book ChapterDOI
19 Sep 2012
TL;DR: A new efficient data-oblivious PRAM simulation and several new data-oblivious graph-drawing algorithms are given, with application to privacy-preserving graph drawing in a cloud computing context.
Abstract: We give a new efficient data-oblivious PRAM simulation and several new data-oblivious graph-drawing algorithms with application to privacy-preserving graph-drawing in a cloud computing context.