Author

Jon Crowcroft

Bio: Jon Crowcroft is an academic researcher at the University of Cambridge. He has contributed to research on topics including the Internet and multicast, and has an h-index of 87, having co-authored 672 publications receiving 38,848 citations. Previous affiliations of Jon Crowcroft include Memorial University of Newfoundland and Information Technology University.


Papers
Proceedings ArticleDOI
03 Dec 2003
TL;DR: Mayhem is a middleware-style approach to building virtual-environment systems; it relies heavily on component design methodology coupled with a simple layered approach to system architectures, and requires a common base infrastructure, the Java Adaptive Dynamic Environment (JADE).
Abstract: The Internet, as an accessible network with global connectivity, presents itself as the likely solution for the supporting infrastructure of cyberspace. In contrast, the field of virtual environments (VE) is fragmented with a wide proliferation of different systems, as a result of the current development trend. However, these divergent systems present a significant overlap of functionality that represents a waste of development resources and a detriment to innovation. This paper proposes Mayhem as a new approach to the design and development of VE systems. Our approach relies heavily on component design methodology coupled with a simple layered approach to system architectures. The combination of the two promotes the construction of systems from smaller building blocks, which may have been developed by different sources, each specializing in the supported functionality. Although Mayhem promotes the trend of middleware, it requires a common base infrastructure - the Java Adaptive Dynamic Environment (JADE). To demonstrate how to build a VE system using Mayhem, the paper describes a simple prototype of a VE and some of its key building blocks.
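The component-plus-layers idea can be sketched in a few lines. This is a hypothetical illustration, not the actual Mayhem or JADE API; all class, layer, and service names below are invented:

```python
# Hypothetical sketch (NOT the real Mayhem/JADE API): a VE system assembled
# from small components, possibly developed by different sources, arranged
# into simple layers.

class Component:
    """A building block exposing one named piece of functionality."""
    def __init__(self, name, provides):
        self.name = name
        self.provides = provides

class Layer:
    """One tier of the layered architecture (e.g. network, scene)."""
    def __init__(self, name):
        self.name = name
        self.components = []
    def add(self, component):
        self.components.append(component)
        return self  # allow chaining when assembling a layer

class VESystem:
    """A virtual environment built by stacking layers of components."""
    def __init__(self, layers):
        self.layers = layers
    def services(self):
        return [c.provides for layer in self.layers for c in layer.components]

# Assemble a toy VE from components contributed by different "sources".
network = Layer("network").add(Component("multicast", "state-sync"))
scene = Layer("scene").add(Component("octree", "spatial-index"))
ve = VESystem([network, scene])
print(ve.services())
```

The point of the sketch is that each building block specializes in one piece of functionality, and the layered composition yields the complete system.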

3 citations

Proceedings ArticleDOI
26 Aug 2019
TL;DR: This paper shows how access to host physical memory is achieved and discusses why, on some platforms, this is not a vulnerability but rather a powerful tool for securing data acquisition when the host is not trusted to perform the acquisition.
Abstract: Modern malware is complex, stealthy, and employs anti-forensic techniques to evade detection. To detect malware, data must first be collected so that the malware's behaviour can be analysed. However, when both the malware and the detecting system run in the same domain (the CPU), it is questionable whether the acquired data has been tampered with. Hardware-based techniques, such as acquiring data out-of-band using a PCIe device, allow for data acquisition that is deemed trusted when the acquisition method does not rely on any data present in host memory. Unfortunately, in systems with an Input-Output Memory Management Unit (IOMMU), peripheral devices' accesses to host memory go through a stage of translation by the IOMMU. The translation tables reside in the host's memory and are therefore subject to malware control, hence not trustworthy. In this paper we present a method that acquires data reliably without depending on data residing in host memory, even when the IOMMU is being used to restrict devices. We show how access to host physical memory is achieved and discuss why, on some platforms, this is not a vulnerability but rather a powerful tool for securing data acquisition when the host is not trusted to perform the acquisition.
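The trust problem with host-resident translation tables can be illustrated with a toy model. This is a simplified sketch (a single-level, page-granular table with invented names), not the paper's actual mechanism:

```python
# Toy model of why an IOMMU translation table living in host memory cannot
# be trusted for forensic acquisition: malware that controls host memory can
# redirect a device's reads to a forged page. All names here are invented.

PAGE = 4096

def iommu_read(dev_addr, translation_table, phys_mem):
    """A device read, translated page-by-page via a host-resident table."""
    page, offset = divmod(dev_addr, PAGE)
    phys_page = translation_table[page]   # this mapping sits in host memory
    return phys_mem[phys_page * PAGE + offset]

phys_mem = bytearray(4 * PAGE)
phys_mem[2 * PAGE] = 0xEE                 # "evidence" byte in physical page 2

honest_table = {0: 2}                     # device page 0 -> physical page 2
tampered_table = {0: 3}                   # malware redirects to a clean decoy page

print(hex(iommu_read(0, honest_table, phys_mem)))    # the real evidence byte
print(hex(iommu_read(0, tampered_table, phys_mem)))  # the acquisition sees forged data
```

The sketch shows only the trust gap, not the paper's remedy; the paper's contribution is acquiring data without depending on such host-resident state at all.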

3 citations

Journal ArticleDOI
08 Jun 2020
TL;DR: This article shows, somewhat surprisingly, that following a cyber-attack, the effects of the network interconnection topology and of a wide range of loss distributions on the probability of a cyber-blackout, and on the increase in total service-related monetary losses across all organizations, are mostly very small.
Abstract: Service liability interconnections among globally networked IT- and IoT-driven service organizations create potential channels for cascading service disruptions worth billions of dollars, due to modern cyber-crimes such as DDoS, APT, and ransomware attacks. A natural question that arises in this context is: what is the likelihood of a cyber-blackout? Here, the latter term denotes the probability that all (or a major subset of) organizations in a service chain become dysfunctional in a certain manner due to a cyber-attack at some or all points in the chain. The answer to this question has major implications for risk management businesses such as cyber-insurance when it comes to designing policies by risk-averse insurers for providing coverage to clients in the aftermath of such catastrophic network events. In this article, we investigate this question in general as a function of service chain networks and different cyber-loss distribution types. We show, somewhat surprisingly (and discuss the potential practical implications), that following a cyber-attack, the effects of (a) the network interconnection topology and (b) a wide range of loss distributions on the probability of a cyber-blackout and on the increase in total service-related monetary losses across all organizations are mostly very small. The primary rationale behind these results is the degree of heterogeneity in the revenue base among organizations and the Increasing Failure Rate property of popular (i.i.d./non-i.i.d.) loss distributions, i.e., log-concave cyber-loss distributions. The results will enable risk-averse cyber-risk managers to safely infer the impact of cyber-attacks in a worst-case, network- and distribution-oblivious setting.
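The flavour of the question can be sketched with a small Monte Carlo estimate. This is a hedged illustration, not the paper's model: exponential losses stand in for a log-concave (IFR) loss distribution, "dysfunctional" is simplified to "loss exceeds revenue base", and all parameter values are invented:

```python
# Hedged sketch (not the paper's model): estimate the probability that every
# organization in a service chain suffers a cyber-loss exceeding its revenue
# base, under i.i.d. exponential losses (a log-concave / IFR distribution).

import random

def blackout_probability(revenues, mean_loss, trials=100_000, seed=0):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # one i.i.d. exponential loss per organization in the chain
        if all(rng.expovariate(1 / mean_loss) > r for r in revenues):
            hits += 1
    return hits / trials

homogeneous = [1.0] * 5                      # five identical organizations
heterogeneous = [0.2, 0.5, 1.0, 2.0, 4.0]    # heterogeneous revenue base

p_chain = blackout_probability(homogeneous, mean_loss=1.0)
p_hetero = blackout_probability(heterogeneous, mean_loss=1.0)
print(p_chain, p_hetero)  # revenue heterogeneity drives the probability down
```

Even this toy version reflects one of the paper's stated drivers: heterogeneity in the revenue base makes a full blackout far less likely, since the largest organization rarely fails.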

3 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it seemed an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i, the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time: an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …
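For what it's worth, the beast the essay muses on is tame in software: any language with complex arithmetic treats i as an ordinary number. A minimal Python illustration:

```python
# Python writes the imaginary unit i as 1j; it behaves as ordinary arithmetic.
i = 1j
print(i * i)                 # i squared really is minus one
print((2 + 3j) * (2 - 3j))   # multiplying by the conjugate lands back on the reals
```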

33,785 citations

Journal ArticleDOI
TL;DR: A discussion of Benedict Anderson's Imagined Communities: Reflections on the Origin and Spread of Nationalism, reviewed in History of European Ideas, Vol. 21, No. 5, pp. 721-722.

13,842 citations

Journal ArticleDOI
TL;DR: A thorough exposition of community structure, or clustering, is attempted, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists.
Abstract: The modern science of networks has brought significant advances to our understanding of complex systems. One of the most relevant features of graphs representing real systems is community structure, or clustering, i.e. the organization of vertices in clusters, with many edges joining vertices of the same cluster and comparatively few edges joining vertices of different clusters. Such clusters, or communities, can be considered as fairly independent compartments of a graph, playing a role similar to that of, e.g., the tissues or organs in the human body. Detecting communities is of great importance in sociology, biology and computer science, disciplines where systems are often represented as graphs. This problem is very hard and not yet satisfactorily solved, despite the huge effort of a large interdisciplinary community of scientists working on it over the past few years. We will attempt a thorough exposition of the topic, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists, from the discussion of crucial issues like the significance of clustering and how methods should be tested and compared against each other, to the description of applications to real networks.
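One family of techniques in this area can be sketched concretely: label propagation, where each vertex repeatedly adopts the label most common among its neighbours until labels stabilise. This is a simplified illustration (deterministic tie-break, invented toy graph), not any specific method from the survey:

```python
# Minimal label-propagation sketch for community detection: vertices start in
# their own community and repeatedly adopt the most common neighbour label.

from collections import Counter

def label_propagation(adj, rounds=50):
    labels = {v: v for v in adj}          # one community per vertex to start
    for _ in range(rounds):
        changed = False
        for v in sorted(adj):
            counts = Counter(labels[u] for u in adj[v])
            # most common neighbour label; ties broken by the larger label
            best = max(counts, key=lambda l: (counts[l], l))
            if labels[v] != best:
                labels[v], changed = best, True
        if not changed:                   # labels stabilised
            break
    return labels

# Toy graph: two 4-cliques {0,1,2,3} and {4,5,6,7} joined by one bridge 3-4,
# i.e. many edges inside each cluster and a single edge between them.
adj = {
    0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
    4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6],
}
print(label_propagation(adj))
```

On this graph the labels settle into exactly the two cliques, matching the informal definition above: many edges within a community, few between.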

9,057 citations

Journal ArticleDOI
TL;DR: A thorough exposition of the main elements of the clustering problem can be found in this paper, with a special focus on techniques designed by statistical physicists, covering crucial issues such as the significance of clustering, how methods should be tested and compared against each other, and applications to real networks.

8,432 citations