Author

Jon Crowcroft

Bio: Jon Crowcroft is an academic researcher at the University of Cambridge. He has contributed to research topics including the Internet and multicast. He has an h-index of 87 and has co-authored 672 publications receiving 38,848 citations. His previous affiliations include Memorial University of Newfoundland and Information Technology University.


Papers
01 Jan 2016
TL;DR: This seminar discussed new research directions for data centre latency control across the entire software and hardware stack, including in-network solutions, end-host solutions, and others.

Abstract: This report documents the program and the outcomes of Dagstuhl Seminar 16281, "Network Latency Control in Data Centres". The seminar explored existing and future techniques for controlling data centre latency, charting research directions in this new field of networking research. The need for a new direction is motivated by the fact that traditional networking equipment and TCP/IP stacks were designed for wide-area networks, where the goal is to maximize throughput and the control loop between end systems is measured in tens of milliseconds. Consequently, the seminar discussed new research directions for data centre latency control across the entire software and hardware stack, including in-network solutions, end-host solutions, and others.

2 citations
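
A minimal end-host illustration of the latency problem the seminar targets (this sketch is mine, not from the report; it uses only the Python standard library, and the loopback setup and message sizes are illustrative): it measures the round-trip time of small messages over TCP and reports median and 99th-percentile latency, since the tail of this distribution is what data centre latency control ultimately tries to bound.

    import socket
    import statistics
    import threading
    import time

    def echo_server(listener):
        # Accept one connection and echo everything back until EOF.
        conn, _ = listener.accept()
        with conn:
            while data := conn.recv(64):
                conn.sendall(data)

    listener = socket.create_server(("127.0.0.1", 0))  # loopback, ephemeral port
    threading.Thread(target=echo_server, args=(listener,), daemon=True).start()

    samples = []
    with socket.create_connection(listener.getsockname()) as sock:
        # TCP_NODELAY disables Nagle's algorithm, a classic end-host knob
        # for small-message latency.
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        for _ in range(5000):
            t0 = time.perf_counter()
            sock.sendall(b"x")   # small request
            sock.recv(64)        # block for the echo: one round trip
            samples.append((time.perf_counter() - t0) * 1e6)  # microseconds
    listener.close()

    q = statistics.quantiles(samples, n=100)
    print(f"p50 = {q[49]:.1f} us, p99 = {q[98]:.1f} us (tail/median {q[98]/q[49]:.1f}x)")

Even on an idle machine the p99 is typically several times the median; that gap is what the in-network and end-host techniques discussed at the seminar try to close.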

01 Sep 2018
TL;DR: The authors investigate the existence and design of efficient ecosystems, modeled as markets, in which competing data holders reach a maximum social welfare state: each holder preserves its clients' heterogeneous privacy constraints up to certain compromise levels while satisfying the demands of agencies that collect and trade client data for targeted advertising, under the assumption that some amount of inappropriate data leakage by data holders may be practically inevitable.

Abstract: In the era of mobile apps and the IoT, huge quantities of data about individuals and their activities offer a wave of opportunities for economic and societal value creation. However, the current personal data ecosystem is fragmented and inefficient. On one hand, end-users are not able to control access (whether technologically, by policy, or psychologically) to their personal data, which results in issues related to privacy, personal data ownership, transparency, and value distribution. On the other hand, this puts the burden of managing and protecting user data on apps and ad-driven entities (e.g., an ad network), at a cost of trust and regulatory accountability. In such a context, data holders (e.g., apps) may take advantage of individuals' inability to fully comprehend and anticipate the potential uses of their private information, with detrimental effects for aggregate social welfare. In this paper, we investigate the problem of the existence and design of efficient ecosystems (modeled here as markets) that aim to achieve a maximum social welfare state amongst competing data holders by preserving the heterogeneous privacy preservation constraints induced by their clients up to certain compromise levels, while at the same time satisfying the requirements of agencies (e.g., advertisers) that collect and trade client data for the purpose of targeted advertising, assuming the potential practical inevitability of some amount of inappropriate data leakage on behalf of the data holders. Using concepts from supply-function economics, we propose the first mathematically rigorous and provably optimal privacy market design paradigm that always results in unique equilibrium (i.e., stable) market states, which can be either economically efficient or inefficient depending on whether the privacy trading markets are monopolistic or oligopolistic in nature. Subsequently, we characterize in closed form the efficiency gap (if any) at market equilibrium.

2 citations
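
To make the supply-function idea concrete, here is a toy market in Python (my illustration, not the authors' model: the linear supply functions, quadratic privacy costs, and fixed data demand are all assumed for the sketch). Data holder i leaks q_i units of client data at privacy cost q_i^2 / (2 * b_i), where b_i is its tolerated compromise level; if each holder offers the supply function S_i(p) = b_i * p, the market has a unique clearing price, and the cost gap against a single holder serving the whole demand comes out in closed form.

    def competitive_equilibrium(b, D):
        """Unique clearing price and allocation when holder i offers S_i(p) = b_i * p."""
        p = D / sum(b)                  # market clearing: sum_i b_i * p = D
        q = [bi * p for bi in b]        # each holder supplies b_i * p
        cost = sum(qi ** 2 / (2 * bi) for qi, bi in zip(q, b))
        return p, q, cost

    def single_holder_cost(b, D):
        """Total privacy cost when one holder serves all demand D alone."""
        return D ** 2 / (2 * max(b))

    b = [1.0, 2.0, 3.0]                 # heterogeneous compromise levels (assumed)
    D = 6.0                             # aggregate data demand from advertisers (assumed)

    p, q, c_multi = competitive_equilibrium(b, D)
    print(f"clearing price p* = {p:.2f}, allocation = {q}")
    print(f"multi-holder privacy cost  = {c_multi:.2f}")
    print(f"single-holder privacy cost = {single_holder_cost(b, D):.2f}")

With these numbers the market spreads leakage in proportion to each holder's compromise level (total cost 3.0), while concentrating all supply in one holder costs 6.0: a toy version of the kind of equilibrium efficiency gap the paper characterizes in closed form.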

Journal Article
22 Jun 2010
TL;DR: The author's thesis is that the "Future Internet" is about as relevant as Anthropogenic Global Warming (AGW), in the way both are being used to support various inappropriate activities.

Abstract: There are so many initiatives to look at the Internet's future (http://www.future-internet.eu/, http://www.nets-find.net/, and similar programs in pretty much every other geo-political arena) that anyone would think there was some tremendous threat, like global warming, about to bring about its immediate demise, and that this would bring civilisation crashing down around our ears. The Internet has a great future behind it, of course. However, my thesis is that the Future Internet is about as relevant as Anthropogenic Global Warming (AGW), in the way it is being used to support various inappropriate activities. Remember that the start of all this was not the exhaustion of IPv4 address space, or the incredibly slow convergence time of BGP routes, or the problem of scaling router memory for FIBs. It was the US research community reacting to a minor (as in parochial) temporary problem of funding in communications, due to a slowdown within NSF and differing agendas within DARPA. It is not necessary to invoke all the hype and hysteria; it is both necessary and sufficient to talk about sustainable energy (see, for example, David MacKay's Without Hot Air book at http://www.withouthotair.com/) and good technical communications research, development, deployment and operations. To continue the analogy between FI and AGW, what we really do not need is yet more climatologists with dodgy data curation methodologies (or ethnographers studying Internet governance). What we do need is some solid engineering to address a number of problems the Internet has. However, this is in fact happening, and would not stop happening if the entire Future Internet flagship was kidnapped by aliens. "We don't need no" government agency issuing top-down diktats about what to do when. It won't work, and it will be a massive waste of time, energy and other resources; i.e., like AGW, it will be a load of hot air :) On the other hand, there are a number of deeper lessons from the Internet architecture which might prove useful in other domains, and in the bulk of this opinion piece I give examples of these, applying the Postel and end-to-end principles to transport, energy, and government information/services.

2 citations

Journal Article
TL;DR: The seven papers in this special section focus on recent advances in the growing research field of television.
Abstract: The seven papers in this special section focus on recent advances in the growing research field of television.

2 citations


Cited by
Journal Article

08 Dec 2001 - BMJ
TL;DR: The author reflects on the ethereal nature of i, the square root of minus one: an odd beast when first met at school, an intruder hovering on the edge of reality, whose sense of the surreal only intensified over the years.

Abstract: There is, I think, something ethereal about i, the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time: an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations
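
For context on why i nonetheless pervades mathematics that does describe the real world, a standard pair of identities (my addition, not drawn from the article) makes the point: the defining relation of i, and Euler's formula linking it to the oscillations that model waves, circuits and quantum phases.

    % Not from the article: the defining property of i and Euler's formula.
    \[
      i^2 = -1, \qquad e^{i\theta} = \cos\theta + i\,\sin\theta
    \]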

Journal Article
TL;DR: This entry refers to Benedict Anderson's Imagined Communities: Reflections on the Origin and Spread of Nationalism, as discussed in History of European Ideas, Vol. 21, No. 5, pp. 721-722.

13,842 citations

Journal Article
TL;DR: A thorough exposition of community structure, or clustering, is attempted, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists.
Abstract: The modern science of networks has brought significant advances to our understanding of complex systems. One of the most relevant features of graphs representing real systems is community structure, or clustering, i.e., the organization of vertices in clusters, with many edges joining vertices of the same cluster and comparatively few edges joining vertices of different clusters. Such clusters, or communities, can be considered as fairly independent compartments of a graph, playing a similar role to, e.g., the tissues or the organs in the human body. Detecting communities is of great importance in sociology, biology and computer science, disciplines where systems are often represented as graphs. This problem is very hard and not yet satisfactorily solved, despite the huge effort of a large interdisciplinary community of scientists working on it over the past few years. We will attempt a thorough exposition of the topic, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists, from the discussion of crucial issues like the significance of clustering and how methods should be tested and compared against each other, to the description of applications to real networks.

9,057 citations
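
The within-cluster/between-cluster picture in this abstract translates directly into code. A minimal sketch, assuming the NetworkX library is available (the example graph and the choice of a greedy modularity heuristic are illustrative; the survey covers many method families):

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    G = nx.karate_club_graph()                      # classic small social network
    communities = greedy_modularity_communities(G)  # greedy modularity maximisation
    print(f"found {len(communities)} communities")
    for i, c in enumerate(communities):
        print(f"  community {i}: {sorted(c)}")
    # Modularity Q compares within-community edge density against a random baseline.
    print(f"modularity Q = {modularity(G, communities):.3f}")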

Journal Article
TL;DR: This paper offers a thorough exposition of the main elements of the clustering problem, with a special focus on techniques designed by statistical physicists, covering crucial issues such as the significance of clustering, how methods should be tested and compared against each other, and applications to real networks.

8,432 citations