Author

Jon Crowcroft

Bio: Jon Crowcroft is an academic researcher at the University of Cambridge. He has contributed to research in topics including the Internet and multicast, has an h-index of 87, and has co-authored 672 publications receiving 38,848 citations. Previous affiliations of Jon Crowcroft include Memorial University of Newfoundland and Information Technology University.


Papers
Proceedings ArticleDOI
13 Nov 2005
TL;DR: This paper introduces PDB, a prototype debugger based on a hierarchical, scalable architecture, explains its design and functionality, and demonstrates its usability with two case studies.
Abstract: Developing applications for parallel and distributed systems is hard due to their nondeterministic nature; developing debugging tools for such systems and applications is even harder. A number of distributed debugging tools and techniques exist; however, we believe that they lack the infrastructure to scale to large-scale distributed systems, systems with hundreds or thousands of nodes, such as grids. In this paper, we introduce PDB, our prototype debugger, which is based on a hierarchical, scalable architecture. We explain the design of PDB, highlight its functionality, and demonstrate its usability with two case studies. Before concluding, we discuss portability and extensibility issues for PDB and outline some solutions.
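The paper itself gives the actual design; purely as a rough illustration of what a hierarchical debugging architecture can look like, the sketch below fans a debug command out through a tree of proxy nodes and aggregates the replies, so a front-end talks to one root rather than thousands of processes. The DebugNode class and its method names are hypothetical and not taken from PDB.

```python
# Minimal sketch of hierarchical fan-out/aggregation (hypothetical names, not PDB's API).
class DebugNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []   # child proxies or leaf debuggers

    def run(self, command):
        """Forward a debug command down the tree and aggregate the replies."""
        if not self.children:                        # leaf: a per-process debugger
            return {self.name: f"ran {command!r}"}
        results = {}
        for child in self.children:                  # interior node: fan out, merge
            results.update(child.run(command))
        return results

leaves = [DebugNode(f"proc-{i}") for i in range(6)]
proxies = [DebugNode(f"proxy-{i}", leaves[i * 3:(i + 1) * 3]) for i in range(2)]
root = DebugNode("root", proxies)
print(root.run("break main"))
```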

17 citations

Journal ArticleDOI
TL;DR: It is discovered that although a third of movement time in the battlegrounds is spent in inter-node journeys, less than a quarter of these journeys are made in groups.
Abstract: Distributed Virtual Environment (DVE) topology management and message propagation schemes have been proposed for many years. Evaluating DVE message propagation schemes requires a variety of assumptions whose verity significantly affects results, such as details about avatar movement characteristics. We implemented two schemes for waypoint and hotspot detection, and examined their applicability for characterising avatar movement. We confirmed that waypoint detection does not yield good results for characterising human avatar movement, and gained new insight into why by rendering avatar movement as point clouds. We implemented an existing hotspot detection model, and proposed an enhancement to help overcome one limitation of cell-based hotspot detection. We were able to immediately apply this hotspot detection technique to help analyse group movement. We discovered that although a third of movement time in the battlegrounds is spent in inter-node journeys, less than a quarter of these journeys are made in groups.
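The paper's hotspot model and its proposed enhancement are not reproduced here; the sketch below only shows the basic idea of cell-based hotspot detection under simple assumptions: positions are sampled at a fixed rate (so per-cell counts approximate time spent there), and cell_size and threshold are hypothetical parameters chosen for illustration.

```python
from collections import Counter

def detect_hotspots(positions, cell_size=50.0, threshold=0.05):
    """Cell-based hotspot detection: grid the map, count samples per cell,
    and flag cells holding at least `threshold` of all samples."""
    counts = Counter((int(x // cell_size), int(y // cell_size)) for x, y in positions)
    total = sum(counts.values())
    return {cell for cell, c in counts.items() if c / total >= threshold}

# Toy trace: an avatar lingering near (100, 100), then passing through elsewhere.
trace = [(100 + i % 5, 100 + i % 3) for i in range(200)]
trace += [(i * 7.0, i * 11.0) for i in range(50)]
print(detect_hotspots(trace))   # the lingering region shows up as a hotspot cell
```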

17 citations

Posted Content
TL;DR: This work explores the controversial technique of so-called immunity passports and presents SecureABC: a decentralised, privacy-preserving protocol for issuing and verifying antibody certificates.
Abstract: COVID-19 has resulted in unprecedented social distancing policies being enforced worldwide. As governments seek to restore their economies, open workplaces and permit travel, there is a demand for technologies that may alleviate the requirement for social distancing whilst also protecting healthcare services. In this work we explore the controversial technique of so-called immunity passports and present SecureABC: a decentralised, privacy-preserving protocol for issuing and verifying antibody certificates. We consider the implications of antibody certificate systems, develop a set of risk-minimising principles and a security framework for their evaluation, and show that these may be satisfied in practice. Finally, we develop two additional protocols that minimise individual discrimination while still allowing collective transmission risk to be estimated. We use these two protocols to illustrate the utility-privacy trade-offs of antibody certificates and their alternatives.
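SecureABC adds privacy-preserving machinery beyond plain digital signatures, so the toy code below is not the paper's protocol; it is only a minimal sketch of the underlying issue-and-verify pattern for a signed certificate, using Ed25519 from the Python cryptography package. The field names and helper functions are assumptions made for illustration.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical issuer (e.g. a health authority) holds a signing key;
# verifiers only need the matching public key, so checks can run offline.
issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()

def issue_certificate(subject_id: str, antibody_result: bool, expires: str) -> dict:
    """Serialise the claim deterministically and sign it with the issuer key."""
    payload = json.dumps(
        {"subject": subject_id, "antibodies": antibody_result, "expires": expires},
        sort_keys=True,
    ).encode()
    return {"payload": payload, "signature": issuer_key.sign(payload)}

def verify_certificate(cert: dict) -> bool:
    """Accept the certificate only if the issuer's signature checks out."""
    try:
        issuer_pub.verify(cert["signature"], cert["payload"])
        return True
    except InvalidSignature:
        return False

cert = issue_certificate("user-1234", True, "2021-06-30")
print(verify_certificate(cert))   # True
```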

17 citations

Proceedings ArticleDOI
01 Nov 2016
TL;DR: This paper develops optimal and heuristic caching solutions that explicitly consider both performance and fairness, and argues that only algorithms that are fair to all parties will encourage engagement and cooperation.
Abstract: Caching is a core principle of information-centric networking (ICN). Many novel algorithms have been proposed for enabling ICN caching, many of which rely on collaborative principles, i.e. multiple caches interacting to decide what to store. Past work has assumed entirely altruistic nodes that will sacrifice their own performance for the global optimum. In this paper, we argue that this assumption is flawed. We address this problem by modelling the in-network caching problem as a Nash bargaining game. We develop optimal and heuristic caching solutions that explicitly consider both performance and fairness. We argue that only algorithms that are fair to all parties will encourage engagement and cooperation. Through extensive simulations, we show that our heuristic solution, FairCache, ensures that all collaborative caches achieve performance gains without undermining the performance of others.
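FairCache itself is not reproduced here; the toy sketch below only illustrates the Nash bargaining objective the abstract refers to: two caches choose a joint placement that maximises the product of their utility gains over a selfish disagreement point, so neither ends up worse off. The demand numbers, the REMOTE_VALUE discount and the utility model are all made-up assumptions.

```python
from itertools import product

# Toy catalogue and per-cache request counts (illustrative numbers only).
objects = ["o1", "o2", "o3"]
demand = {"A": {"o1": 10, "o2": 6, "o3": 1},
          "B": {"o1": 9,  "o2": 2, "o3": 7}}
REMOTE_VALUE = 0.5   # a hit served by the partner cache is worth less than a local hit

def utility(cache, own_obj, partner_obj):
    d = demand[cache]
    u = d[own_obj]                           # local hits
    if partner_obj != own_obj:
        u += REMOTE_VALUE * d[partner_obj]   # misses served by the partner
    return u

# Disagreement point: each cache selfishly stores its own most popular object.
selfish = {c: max(demand[c], key=demand[c].get) for c in demand}
d_A = demand["A"][selfish["A"]]
d_B = demand["B"][selfish["B"]]

# Nash bargaining solution: maximise the product of utility gains over the
# disagreement point, subject to neither cache losing out.
best, best_prod = None, -1.0
for a, b in product(objects, repeat=2):
    gain_a = utility("A", a, b) - d_A
    gain_b = utility("B", b, a) - d_B
    if gain_a >= 0 and gain_b >= 0 and gain_a * gain_b > best_prod:
        best, best_prod = (a, b), gain_a * gain_b

print("Nash bargaining placement (A, B):", best)   # the caches avoid duplicating content
```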

16 citations


Cited by
Journal ArticleDOI

08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: In this paper, Imagined Communities: Reflections on the Origin and Spread of Nationalism is discussed (History of European Ideas, Vol. 21, No. 5, pp. 721-722).

13,842 citations

Journal ArticleDOI
TL;DR: A thorough exposition of community structure, or clustering, is attempted, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists.
Abstract: The modern science of networks has brought significant advances to our understanding of complex systems. One of the most relevant features of graphs representing real systems is community structure, or clustering, i.e. the organization of vertices in clusters, with many edges joining vertices of the same cluster and comparatively few edges joining vertices of different clusters. Such clusters, or communities, can be considered as fairly independent compartments of a graph, playing a role similar to that of, e.g., the tissues or the organs in the human body. Detecting communities is of great importance in sociology, biology and computer science, disciplines where systems are often represented as graphs. This problem is very hard and not yet satisfactorily solved, despite the huge effort of a large interdisciplinary community of scientists working on it over the past few years. We attempt a thorough exposition of the topic, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists, from the discussion of crucial issues like the significance of clustering and how methods should be tested and compared against each other, to the description of applications to real networks.
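The survey covers many detection methods; as a minimal hands-on illustration of the community concept defined above, the snippet below runs a standard greedy modularity method from networkx (chosen here as an assumption for illustration, not a method singled out by the survey) on Zachary's karate club graph, a classic small social network with known community structure.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

# A small benchmark graph whose vertices split into densely connected groups.
G = nx.karate_club_graph()

# Greedy modularity maximisation: vertices in the same community share many edges,
# with comparatively few edges crossing between communities.
communities = greedy_modularity_communities(G)
print("communities found:", len(communities))
print("modularity:", round(modularity(G, communities), 3))
```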

9,057 citations

Journal ArticleDOI
TL;DR: A thorough exposition of the main elements of the clustering problem can be found in this paper, with a special focus on techniques designed by statistical physicists, from the discussion of crucial issues like the significance of clustering and how methods should be tested and compared against each other, to the description of applications to real networks.

8,432 citations