scispace - formally typeset

Showing papers by "Jon Crowcroft published in 2020"


Journal ArticleDOI
02 Sep 2020
TL;DR: This paper attempts to systematise the various COVID-19 research activities leveraging data science, where data science is defined broadly to encompass the various methods and tools that can be used to store, process, and extract insights from data.
Abstract: COVID-19, an infectious disease caused by the SARS-CoV-2 virus, was declared a pandemic by the World Health Organisation (WHO) in March 2020. By mid-August 2020, more than 21 million people had tested positive worldwide. Infections have been growing rapidly and tremendous efforts are being made to fight the disease. In this paper, we attempt to systematise the various COVID-19 research activities leveraging data science, where we define data science broadly to encompass the various methods and tools—including those from artificial intelligence (AI), machine learning (ML), statistics, modeling, simulation, and data visualization—that can be used to store, process, and extract insights from data. In addition to reviewing the rapidly growing body of recent research, we survey public datasets and repositories that can be used for further work to track COVID-19 spread and mitigation strategies. As part of this, we present a bibliometric analysis of the papers produced in this short span of time. Finally, building on these insights, we highlight common challenges and pitfalls observed across the surveyed works. We also created a live resource repository at https://github.com/Data-Science-and-COVID-19/Leveraging-Data-Science-To-Combat-COVID-19-A-Comprehensive-Review that we intend to keep updated with the latest resources including new papers and datasets.

187 citations


Posted Content
TL;DR: This survey article provides a comprehensive introduction to edge intelligence and its application areas and presents a systematic classification of the state of the solutions by examining research results and observations for each of the four components.
Abstract: Edge intelligence refers to a set of connected systems and devices that perform data collection, caching, processing, and analysis close to where the data is captured, using artificial intelligence. The aim of edge intelligence is to enhance the quality and speed of data processing and to protect the privacy and security of the data. Although it has emerged only recently, spanning the period from 2011 to now, this field of research has shown explosive growth over the past five years. In this paper, we present a thorough and comprehensive survey of the literature surrounding edge intelligence. We first identify four fundamental components of edge intelligence, namely edge caching, edge training, edge inference, and edge offloading, based on theoretical and practical results pertaining to proposed and deployed systems. We then aim for a systematic classification of the state of the solutions by examining research results and observations for each of the four components and present a taxonomy that includes practical problems, adopted techniques, and application goals. For each category, we elaborate, compare and analyse the literature from the perspectives of adopted techniques, objectives, performance, advantages and drawbacks, etc. This survey article provides a comprehensive introduction to edge intelligence and its application areas. In addition, we summarise the development of the emerging research field and the current state of the art, and discuss the important open issues and possible theoretical and technical solutions.

65 citations


Posted Content
TL;DR: The first solution builds on current "message based" methods and the second leverages ideas from secret sharing and additively homomorphic encryption to protect users privacy.
Abstract: Contact tracing is being widely employed to combat the spread of COVID-19. Many apps have been developed that allow tracing to be done automatically based on location and interaction data generated by users. There are concerns, however, regarding the privacy and security of users' data when using these apps. These concerns are paramount for users who contract the virus, as they are generally required to release all their data. Motivated by the need to protect users' privacy, we propose two solutions to this problem. Our first solution builds on current "message based" methods and our second leverages ideas from secret sharing and additively homomorphic encryption.
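The abstract does not spell out the second construction, but its core idea can be illustrated with plain additive secret sharing, where each user's value is split across non-colluding servers so that only the aggregate is ever reconstructed. This is a minimal sketch under that assumption; the paper's actual protocol additionally uses additively homomorphic encryption, which is omitted here.

```python
import secrets

P = 2**61 - 1  # public prime modulus; shares live in Z_P

def share(value, n):
    """Split `value` into n additive shares that sum to value mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares; any n-1 of them alone reveal nothing."""
    return sum(shares) % P

def aggregate(all_shares):
    """Each of n servers sums the shares it holds (one per user);
    reconstructing the server sums reveals only the total."""
    n = len(all_shares[0])
    server_sums = [sum(user[j] for user in all_shares) % P for j in range(n)]
    return reconstruct(server_sums)
```

Here no single server ever sees a user's raw value, yet the sum across users (e.g., a count of reported exposures) is exactly recoverable.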

54 citations


Posted ContentDOI
24 Mar 2020
TL;DR: In this article, a regression model is developed to model the relationship between the infection rate of Covid-19 (number of confirmed cases/population at the city level) and outdoor air pollution at the city level, after taking into account confounding factors such as meteorology, inter-city movements, demographics, and co-morbidity and co-infection rates.
Abstract: Background: Covid-19 was first reported in Wuhan, China in Dec 2019. Since then, it has been transmitted rapidly in China and the rest of the world. While Covid-19 transmission rate has been declining in China, it is increasing exponentially in Europe and America. Although there are numerous studies examining Covid-19 infection, including an archived paper looking into the meteorological effect, the role of outdoor air pollution has yet to be explored rigorously. It has been shown that air pollution will weaken the immune system, and increase the rate of respiratory virus infection. We postulate that outdoor air pollution concentrations will have a negative effect on Covid-19 infections in China, whilst lockdowns, characterized by strong social distancing and home isolation measures, will help to moderate such negative effect. Methods: We will collect the number of daily confirmed Covid-19 cases in 31 provincial capital cities in China during the period of 1 Dec 2019 to 20 Mar 2020 (from a popular Chinese online platform which aggregates all cases reported by the Chinese national/provincial health authorities). We will also collect daily air pollution and meteorology data at the city-level (from the Chinese National Environmental Monitoring Center and the US National Climatic Data Center), daily inter-city migration flows and intra-city movements (from Baidu). City-level demographics including age distribution and gender, education, and median household income can be obtained from the statistical yearbooks. City-level co-morbidity indicators including rates of chronic disease and co-infection can be obtained from related research articles. 
A regression model is developed to model the relationship between the infection rate of Covid-19 (number of confirmed cases/population at the city level) and outdoor air pollution at the city level, after taking into account confounding factors such as meteorology, inter- and intra-city movements, demographics, and co-morbidity and co-infection rates. In particular, we shall study how air pollution affects infection rates across different cities, including Wuhan. Our model will also study how air pollution would affect infection rates in Wuhan before and after the lockdown. Expected findings: We expect there to be a correlation between the Covid-19 infection rate and outdoor air pollution. We also expect that reduced intra-city movement after the lockdowns in Wuhan and the rest of China will play an important role in reducing the infection rate. Interpretation: The infection rate is growing exponentially in major cities worldwide. We expect that the Covid-19 infection rate is related to the air pollution concentration, and is strongly dependent on inter- and intra-city movements. To reduce the infection rate, the international community may deploy effective air pollution reduction plans and social distancing policies.

44 citations


Proceedings ArticleDOI
27 Apr 2020
TL;DR: This work deploys CoLearn on resource-constrained devices in a lab environment to demonstrate an asynchronous participation mechanism for IoT devices in machine learning model training using a publish/subscribe architecture and a mechanism for reducing the attack surface in FL architecture by allowing only IoT MUD-compliant devices to participate in the training phases.
Abstract: Edge computing and Federated Learning (FL) can work in tandem to address issues related to privacy and collaborative distributed learning in untrusted IoT environments. However, deployment of FL in resource-constrained IoT devices faces challenges, including asynchronous participation of such devices in training and the need to prevent malicious devices from participating. To address these challenges, we present CoLearn, which builds on the open-source Manufacturer Usage Description (MUD) implementation osMUD and the FL framework PySyft. We deploy CoLearn on resource-constrained devices in a lab environment to demonstrate (i) an asynchronous participation mechanism for IoT devices in machine learning model training using a publish/subscribe architecture, (ii) a mechanism for reducing the attack surface in the FL architecture by allowing only MUD-compliant IoT devices to participate in the training phases, and (iii) a trade-off between communication bandwidth usage, training time and device temperature (thermal fatigue).

40 citations


Proceedings Article
01 Jan 2020
TL;DR: Numerical simulations show that the proposed federated, asynchronous, and $(\varepsilon, \delta)$-differentially private algorithm for PCA in the memory-limited setting exhibits performance that closely matches or outperforms traditional non-federated algorithms, and in the absence of communication latency, it exhibits attractive horizontal scalability.
Abstract: We present a federated, asynchronous, and $(\varepsilon, \delta)$-differentially private algorithm for PCA in the memory-limited setting. Our algorithm incrementally computes local model updates using a streaming procedure and adaptively estimates its $r$ leading principal components when only $\mathcal{O}(dr)$ memory is available with $d$ being the dimensionality of the data. We guarantee differential privacy via an input-perturbation scheme in which the covariance matrix of a dataset $\mathbf{X} \in \mathbb{R}^{d \times n}$ is perturbed with a non-symmetric random Gaussian matrix with variance in $\mathcal{O}\left(\left(\frac{d}{n}\right)^2 \log d \right)$, thus improving upon the state-of-the-art. Furthermore, contrary to previous federated or distributed algorithms for PCA, our algorithm is also invariant to permutations in the incoming data, which provides robustness against straggler or failed nodes. Numerical simulations show that, while using limited memory, our algorithm exhibits performance that closely matches or outperforms traditional non-federated algorithms, and in the absence of communication latency, it exhibits attractive horizontal scalability.
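As a rough illustration of the input-perturbation scheme described above, the sketch below perturbs the empirical covariance with a non-symmetric Gaussian matrix whose variance follows the quoted O((d/n)^2 log d) scaling, then takes the top-r eigenvectors. The constant factor, the symmetrisation step, and the batch (non-streaming) computation are simplifications of my own; the paper's algorithm is incremental, memory-limited, and federated.

```python
import numpy as np

def private_pca(X, r, noise_scale=1.0, rng=None):
    """Input-perturbation sketch of differentially private PCA.

    X is d x n (one column per sample). The covariance is perturbed
    with a non-symmetric Gaussian matrix whose variance follows the
    O((d/n)^2 log d) scaling quoted in the abstract; noise_scale is
    an illustrative constant, not the paper's calibrated value."""
    rng = rng or np.random.default_rng(0)
    d, n = X.shape
    cov = X @ X.T / n
    sigma2 = noise_scale * (d / n) ** 2 * np.log(d)
    noise = rng.normal(0.0, np.sqrt(sigma2), size=(d, d))
    # symmetrise before eigendecomposition (eigh expects symmetric input)
    perturbed = (cov + noise + (cov + noise).T) / 2
    vals, vecs = np.linalg.eigh(perturbed)   # eigenvalues ascending
    return vecs[:, ::-1][:, :r]              # top-r principal directions
```

With n much larger than d the noise variance is tiny, so the recovered subspace stays close to the non-private one, which is the point of the (d/n)^2 scaling.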

28 citations


Journal ArticleDOI
TL;DR: The main features of Bluetooth Mesh and 6BLEMesh are presented, and their performance characteristics and trade-offs are investigated.
Abstract: Bluetooth Low Energy (BLE) mesh networking is an emerging technology domain that promises an important role in the Internet of Things. Significant market opportunities for BLE mesh networking have motivated the recent development of two different BLE mesh networking standards, Bluetooth Mesh and 6BLEMesh, produced by the Bluetooth SIG and IETF, respectively. These two standards follow different technical approaches. In this article, we present the main features of Bluetooth Mesh and 6BLEMesh, and investigate their performance characteristics and trade-offs.

24 citations


Posted Content
TL;DR: This work explores the controversial technique of so-called immunity passports and presents SecureABC: a decentralised, privacy-preserving protocol for issuing and verifying antibody certificates.
Abstract: COVID-19 has resulted in unprecedented social distancing policies being enforced worldwide. As governments seek to restore their economies, open workplaces and permit travel there is a demand for technologies that may alleviate the requirement for social distancing whilst also protecting healthcare services. In this work we explore the controversial technique of so-called immunity passports and present SecureABC: a decentralised, privacy-preserving protocol for issuing and verifying antibody certificates. We consider the implications of antibody certificate systems, develop a set of risk-minimising principles and a security framework for their evaluation, and show that these may be satisfied in practice. Finally, we also develop two additional protocols that minimise individual discrimination but which still allow for collective transmission risk to be estimated. We use these two protocols to illustrate the utility-privacy trade-offs of antibody certificates and their alternatives.

17 citations


Journal ArticleDOI
TL;DR: This paper proposes network-assisted device-to-device (D2D) communication in licensed and unlicensed spectrum interoperable networks, to improve D2D users’ throughput while alleviating the spectrum scarcity issue of cellular networks.
Abstract: In this paper, we propose network-assisted device-to-device (D2D) communication in licensed and unlicensed spectrum interoperable networks, to improve D2D users’ throughput while alleviating the spectrum scarcity issue of cellular networks. The idea of licensed and unlicensed spectrum interoperability is based on the findings of the IEEE 1932.1 working group. Conventionally, D2D users were only able to communicate by using either cellular or non-cellular networks, and no interoperability mechanism was available. The proposed scheme brings many benefits, including but not limited to higher D2D users’ throughput, alleviation of the spectrum scarcity issue of cellular networks, and better network management. However, ensuring quality-of-service (QoS) in this dynamic environment is a challenging task. To this end, we analyze the QoS using a well-known analytical tool, "Effective Capacity (EC)", for eNodeB-assisted as well as WiFi-assisted D2D communication. Moreover, we also examine the impact of neighboring cells’ load and of a full-duplex transceiver at the eNodeB and WiFi access point on the EC of D2D users. Simulation results show that the EC increases with a decrease in neighboring cells’ load and decreases when more stringent QoS constraints are imposed. Results also show that the maximum sustainable source rate at the transmitter’s queue increases with an increase in the maximum allowed packet delay but converges to a maximum value soon after that.
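For an i.i.d. per-slot service process, Effective Capacity, the analytical tool named above, reduces to EC(theta) = -(1/theta) * log E[exp(-theta * R)], where R is the per-slot service rate and theta is the QoS exponent (larger theta means stricter delay constraints). The sketch below evaluates only this textbook definition, not the paper's eNodeB/WiFi model; the two-rate service process is an invented example.

```python
import numpy as np

def effective_capacity(rates, theta):
    """EC(theta) = -(1/theta) * log E[exp(-theta * R)] for an i.i.d.
    per-slot service process with equally likely rates.
    As theta -> 0 this tends to the mean rate; as theta grows it
    approaches the worst-case rate, penalising variability."""
    rates = np.asarray(rates, dtype=float)
    return -np.log(np.mean(np.exp(-theta * rates))) / theta

# Example: slots alternate between two link qualities (2.0 and 4.0 units)
rates = [2.0, 4.0]
loose = effective_capacity(rates, 0.01)  # lax QoS: close to the mean, 3.0
strict = effective_capacity(rates, 5.0)  # strict QoS: close to the worst case, 2.0
```

This reproduces the qualitative finding quoted in the abstract: EC decreases as the QoS constraint (theta) becomes more stringent.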

16 citations


Journal ArticleDOI
TL;DR: Kabbinale et al. propose blockchain mechanisms for economically sustainable wireless mesh networks.
Abstract: This is the peer reviewed version of the following article: Kabbinale, AR, Dimogerontakis, E, Selimi, M, et al. Blockchain for economically sustainable wireless mesh networks. Concurrency Computat Pract Exper. 2020; 32:e5349, which has been published in final form at https://doi.org/10.1002/cpe.5349. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.

14 citations


Posted Content
TL;DR: The authors summarize a wide survey of the state of the art in network science and epidemiology conducted at the workshop, present a series of perspectives on the subject, and indicate where they believe fruitful areas for future research are to be found.
Abstract: On May 28th and 29th, a two-day workshop was held virtually, facilitated by the Beyond Center at ASU and Moogsoft Inc. The aim was to bring together leading scientists with an interest in Network Science and Epidemiology to attempt to inform public policy in response to the COVID-19 pandemic. Epidemics are at their core a process that progresses dynamically upon a network, and are a key area of study in Network Science. In the course of the workshop a wide survey of the state of the subject was conducted. We summarize in this paper a series of perspectives on the subject, and where the authors believe fruitful areas for future research are to be found.

Journal ArticleDOI
01 Feb 2020
TL;DR: This work proposes a novel proactive load balancing scheme that learns users' mobility and demands statistics jointly to proactively cache future contents during their stay at lightly loaded cells, which results in quality of experience maximization and load minimization.
Abstract: Funding information: Punjab Higher Education Commission (PHEC), Lahore, Pakistan; National Science Foundation, Grant/Award Numbers: 1619346, 1559483, 1718956, and 1730650. The evolution of cellular networks into dynamic, dense, and heterogeneous networks has introduced new challenges for cell resource optimization, especially in imbalanced traffic load regions. Numerous load balancing schemes have been proposed to tackle this issue; however, they operate in a reactive manner that confines their ability to meet top-notch quality-of-experience demands. To address this challenge, we propose a novel proactive load balancing scheme. Our framework learns users' mobility and demand statistics jointly to proactively cache future contents during their stay at lightly loaded cells, which results in quality-of-experience maximization and load minimization. System-level simulations are performed and compared with state-of-the-art reactive schemes.

Journal ArticleDOI
TL;DR: A design of regulated efficient/bounded inefficient economic mechanisms for oligopoly data trading markets using a novel preference function bidding approach on a simplified sellers-broker market is proposed.
Abstract: In the modern era of mobile apps (the era of surveillance capitalism, as termed by Shoshana Zuboff), huge quantities of surveillance data about consumers and their activities offer a wave of opportunities for economic and societal value creation. In-app advertising, a multi-billion dollar industry, is an essential part of the current digital ecosystem driven by free mobile applications, where the ecosystem entities usually comprise consumer apps, their clients (consumers), ad-networks, and advertisers. Sensitive consumer information is often being sold downstream in this ecosystem without the knowledge of consumers, and in many cases to their annoyance. While this practice may, in some cases, result in long-term benefits for the consumers, it can result in serious information privacy breaches of very significant impact (e.g., breach of genetic data) in the short term. The question we raise through this paper is: is it economically feasible to trade consumer personal information with their formal consent (permission), and in return provide them incentives (monetary or otherwise)? In view of (a) the behavioral assumption that humans are `compromising' beings and have privacy preferences, (b) privacy as a good not having strict boundaries, and (c) the practical inevitability of inappropriate data leakage by data holders downstream in the data-release supply-chain, we propose a design of regulated efficient/bounded-inefficient economic mechanisms for oligopoly data trading markets, using a novel preference function bidding approach on a simplified sellers-broker market. Our methodology preserves the heterogeneous privacy preservation constraints (at a grouped consumer, i.e., app, level) up to certain compromise levels, and at the same time satisfies the information demand (via the broker) of agencies (e.g., advertising organizations) that collect client data for the purpose of targeted behavioral advertising.

Posted Content
TL;DR: It is argued that there is a large class of smart contract applications where on-TTP smart contracts are a better alternative than on-chain smart contracts and that the inclusion of a TTP instead of a blockchain to host the smart contract simplifies the problems and offers pragmatic solutions.
Abstract: The hype about Bitcoin has overrated the potential of smart contracts deployed on-blockchains (on-chains) and underrated the potential of smart contracts deployed on-Trusted Third Parties (on-TTPs). As a result, current research and development in this field is focused mainly on smart contract applications that use on-chain smart contracts. We argue that there is a large class of smart contract applications where on-TTP smart contracts are a better alternative. The problem with on-chain smart contracts is that the fully decentralised model and indelible append-only data model followed by blockchains introduces several engineering problems that are hard to solve. In these situations, the inclusion of a TTP (assuming that the application can tolerate its inconveniences) instead of a blockchain to host the smart contract simplifies the problems and offers pragmatic solutions. The intention and contribution of this paper is to shed some light on this issue. We use a hypothetical use case of a car insurance application to illustrate technical problems that are easier to solve with on-TTP smart contracts than with on-chain smart contracts.

Posted Content
TL;DR: A MUD-capable network integrating the authors' User Policy Server (UPS) provides network administrators and end users an opportunity to interact with MUD components through a user-friendly interface; the work also presents a comprehensive survey of the challenges.
Abstract: Due to the advancement of IoT devices in both domestic and industrial environments, the need to incorporate a mechanism to build accountability in the IoT ecosystem is paramount. In the last few years, various initiatives have been started in this direction, addressing many socio-technical concerns and challenges to build an accountable system. The solution that has received a lot of attention in both industry and academia is the Manufacturer Usage Description (MUD) specification. It gives IoT device manufacturers the possibility to describe the communications needed by each device to work properly. MUD implementation is challenging not only due to the diversity of IoT devices and manufacturers/operators/regulators, but also due to the incremental integration of MUD-based flow control into the already existing Internet infrastructure. To provide a better understanding of these challenges, in this work we explore and investigate the prototypes of three implementations proposed by different research teams and organisations, useful for the community to understand which features are implemented by the existing technologies. Considering that there exist some behaviours which can only be defined by local policy, we propose a MUD-capable network integrating our User Policy Server (UPS). The UPS provides network administrators and end users an opportunity to interact with MUD components through a user-friendly interface. Hence, we present a comprehensive survey of the challenges.

Posted Content
TL;DR: It is shown that health tokens could mitigate immunity-based discrimination whilst still presenting a viable mechanism for estimating the collective transmission risk posed by small groups of users, and can be useful in a number of identity-free contexts.
Abstract: In the fight against Covid-19, many governments and businesses are in the process of evaluating, trialling and even implementing so-called immunity passports. Also known as antibody or health certificates, there is a clear demand for any technology that could allow people to return to work and other crowded places without placing others at risk. One of the major criticisms of such systems is that they could be misused to unfairly discriminate against those without immunity, allowing the formation of an `immuno-privileged' class of people. In this work we are motivated to explore an alternative technical solution that is non-discriminatory by design. In particular we propose health tokens -- randomised health certificates which, using methods from differential privacy, allow individual test results to be randomised whilst still allowing useful aggregate risk estimates to be calculated. We show that health tokens could mitigate immunity-based discrimination whilst still presenting a viable mechanism for estimating the collective transmission risk posed by small groups of users. We evaluate the viability of our approach in the context of identity-free and identity-binding use cases and then consider a number of possible attacks. Our experimental results show that for groups of size 500 or more, the error associated with our method can be as low as 0.03 on average, and thus the aggregated results can be useful in a number of identity-free contexts. Finally, we present the results of our open-source prototype, which demonstrates the practicality of our solution.
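The randomisation idea can be sketched with classic randomized response, in which each test result is reported truthfully only with some probability and the aggregate positive rate is recovered by inverting the known bias. The mechanism and the p_truth parameter below are illustrative assumptions, not the paper's exact design.

```python
import random

def randomise(result, p_truth=0.7, rng=random):
    """Report the true binary test result with probability p_truth,
    otherwise report a fair coin flip. An individual report therefore
    carries plausible deniability."""
    return result if rng.random() < p_truth else rng.random() < 0.5

def estimate_positive_rate(reports, p_truth=0.7):
    """Unbiased aggregate estimate: since
    E[report] = p_truth * q + (1 - p_truth) / 2,
    solve for the true positive rate q."""
    mean = sum(reports) / len(reports)
    return (mean - (1 - p_truth) / 2) / p_truth
```

For a group of a few hundred users the estimation error shrinks with group size, which is the trade-off the abstract quantifies (average error around 0.03 for groups of 500 or more, under the paper's own parameters).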

Journal ArticleDOI
TL;DR: In this paper, an analytical model based on ON-OFF queueing networks under exponential and general service time distributions is proposed to evaluate the performance of SDVNs and takes into account the effect of mobility such as, hand overs, node turning ON/OFF, node going temporary out of coverage, and intermittent connections.
Abstract: We have recently witnessed a number of new software-defined paradigms of VANET, in what is referred to as Software-Defined Vehicle Networks (SDVN). In order to evaluate the performance of these new proposals and architectures, analytical and simulation models are needed. In this paper, we propose an analytical model based on ON-OFF queueing networks under exponential and general service time distributions. The model can be used to evaluate the performance of SDVNs and takes into account the effects of mobility such as handovers, nodes turning ON/OFF, nodes going temporarily out of coverage, and intermittent connections. This mobility effect was modelled as a queueing station with exponentially random ON-OFF service times, where traffic arrives according to a Poisson random process during the exponentially random ON period and gets served exponentially. During the OFF period, however, traffic gets served exponentially but at lower rates. We studied the ON-OFF queueing behaviour extensively for both finite-size and infinite-size queues. Three hypothetical SDVN scenarios were considered, taking into account the effect of mobility and the large number of connected nodes. Results were cross-validated with those obtained by a simulation model. These tools are valuable for researchers interested in getting quantitative answers for their SDVN architectures.

Journal ArticleDOI
TL;DR: The evaluation results show that the Multimodal RTO reduces the RTO versus RTT misalignment of the TCP RTO algorithm by an average factor of up to 5, and reduces latency in the presence of losses by up to 2 orders of magnitude, while operating safely.
Abstract: Low-power wide-area networks (LPWANs) are experiencing high momentum as an inexpensive solution for enabling the Internet-of-Things (IoT) applications. Recent Internet connectivity support developments are expected to further fuel the adoption of LPWAN. However, the latter present challenges to the Internet protocols. Remarkably, many LPWAN scenarios exhibit a multimodal round-trip time (RTT) distribution, which deviates from the common Internet RTT characteristics. This leads to a significant mismatch between retransmission timeout (RTO), computed by the standard transmission control protocol (TCP) or alternative experimental RTO algorithms, and RTT. In this article, we present the Multimodal RTO algorithm, which is able to self-adapt to the current RTT mode and produce suitable RTO values. The evaluation results show that the Multimodal RTO reduces the RTO versus RTT misalignment of the TCP RTO algorithm by an average factor of up to 5, and reduces latency in the presence of losses by up to 2 orders of magnitude, while operating safely. The Multimodal RTO is currently being considered by the IETF as a candidate mechanism for standardization.
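The abstract does not give the algorithm's internals, but the benefit of tracking RTT modes separately can be seen in a toy sketch that keeps one Jacobson-style (SRTT, RTTVAR) estimator per mode and computes the RTO from the mode matching the latest sample. The fixed two-mode split and all parameters below are illustrative assumptions, not the paper's design.

```python
class MultimodalRTO:
    """Toy per-mode RTO estimator. A single shared Jacobson estimator
    fed a bimodal RTT stream inflates RTTVAR and hence the RTO for
    fast-mode packets; keeping one (SRTT, RTTVAR) pair per mode avoids
    that. The real Multimodal RTO algorithm differs in detail."""

    def __init__(self, split=5.0, k=4, min_rto=0.2):
        self.split = split      # fixed threshold separating the two modes (s)
        self.k = k              # RTO = SRTT + k * RTTVAR, as in classic TCP
        self.min_rto = min_rto  # floor on the RTO (s)
        self.modes = {}         # mode id -> [srtt, rttvar]

    def _mode(self, rtt):
        return 0 if rtt < self.split else 1

    def update(self, rtt):
        """Feed one RTT sample; return the RTO for that sample's mode."""
        m = self._mode(rtt)
        if m not in self.modes:
            self.modes[m] = [rtt, rtt / 2]  # RFC-style initialisation
        else:
            srtt, rttvar = self.modes[m]
            rttvar = 0.75 * rttvar + 0.25 * abs(srtt - rtt)
            srtt = 0.875 * srtt + 0.125 * rtt
            self.modes[m] = [srtt, rttvar]
        srtt, rttvar = self.modes[m]
        return max(self.min_rto, srtt + self.k * rttvar)
```

With RTTs alternating between 0.5 s and 50 s, a shared estimator would see a huge variance and set a multi-minute RTO even for fast-mode packets; the per-mode estimator keeps the fast-mode RTO near the fast-mode RTT, which is the misalignment reduction the abstract reports.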

Posted Content
TL;DR: It is argued that most of the pieces of technology needed for building a barter system are now available, including blockchains, smart contracts, cryptography, secure multiparty computations and fair exchange protocols, however, additional research is needed to refine and integrate the pieces together.
Abstract: We suggest the re-introduction of bartering to create a cryptocurrencyless, currencyless, and moneyless economy segment. We contend that a barter economy would benefit enterprises, individuals, governments and societies. For instance, the availability of an online peer-to-peer barter marketplace would convert ordinary individuals into potential traders of both tangible and digital items and services. For example, they will be able to barter files and data that they collect. Equally motivating, they will be able to barter and re-introduce to the economy items that they no longer need, such as books, garden tools, and bikes, which are normally kept and wasted in garages and sheds. We argue that most of the pieces of technology needed for building a barter system are now available, including blockchains, smart contracts, cryptography, secure multiparty computation and fair exchange protocols. However, additional research is needed to refine and integrate the pieces together. We discuss potential research directions.

Posted ContentDOI
26 May 2020-medRxiv
TL;DR: This study is the first to rigorously explore the effects of outdoor air pollutant concentrations, meteorological conditions and their interactions, and lockdown interventions on Covid-19 infection in China, and finds that the PM2.5 concentration eight days prior has the strongest predictive power for COVID-19 infection.
Abstract: COVID-19 infection, first reported in Wuhan, China in December 2019, has become a global pandemic, causing significant infections and mortality in Italy, the UK, the US, and other parts of the world. Based on the statistics reported by Johns Hopkins University, 4.7M people worldwide and 84,054 people in China had been confirmed positive and infected with COVID-19, as of 18 May 2020. Motivated by previous studies which show that exposure to air pollutants may increase the risk of influenza infection, our study examines whether such exposure will also affect Covid-19 infection. To the best of our knowledge, we are the first group in the world to rigorously explore the effects of outdoor air pollutant concentrations, meteorological conditions and their interactions, and lockdown interventions, on Covid-19 infection in China. Since the number of confirmed cases is likely to be under-reported due to the lack of testing capacity, the change in confirmed case definition, and undiscovered and unreported asymptomatic cases, we use the rate of change in the daily number of confirmed infection cases instead as our dependent variable. Even if the number of reported infections is under-reported, the rate of change will still accurately reflect the relative change in infection, provided that the trend of under-reporting remains the same. In addition, the rate of change in daily infection cases can be distorted by government-imposed public health interventions, including the lockdown policy, inter-city and intra-city mobility, and changes in testing capacity and case definition. Hence, the effects of the lockdown policy, inter-city and intra-city mobility, and changes in testing capacity and case definition are all taken into account in our statistical modelling. Furthermore, we adopt generalized linear regression models covering both Negative Binomial Regression and Poisson Regression.
These two regression models, when combined with different time-lags (to reflect the COVID-19 incubation period and delay due to official confirmation) in air pollutant exposure (PM2.5), are used to fit the COVID-19 infection model. Our statistical study has shown that higher PM2.5 concentration is significantly correlated with a higher rate of change in the daily number of confirmed infection cases in Wuhan, China (p
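The modelling step can be sketched on synthetic data as a Poisson regression (log link) of daily case counts on PM2.5 lagged by eight days, fitted by Newton-Raphson. This is a minimal illustration with made-up data; the study's actual models include the confounders listed above and a Negative Binomial variant, both omitted here.

```python
import numpy as np

def poisson_fit(X, y, iters=30):
    """Poisson GLM with log link, fitted by Newton-Raphson (IRLS):
    beta += (X^T W X)^{-1} X^T (y - mu), with mu = exp(X beta), W = mu."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))
    return beta

# Synthetic check: case counts driven by PM2.5 eight days earlier
rng = np.random.default_rng(0)
pm25 = rng.uniform(20, 120, size=208)       # daily PM2.5 (ug/m^3), invented
lagged = pm25[:-8] / 100.0                  # exposure eight days before each day
X = np.column_stack([np.ones_like(lagged), lagged])
y = rng.poisson(np.exp(0.5 + 1.2 * lagged)) # true coefficients (0.5, 1.2)
beta = poisson_fit(X, y)                    # beta should recover roughly (0.5, 1.2)
```

A positive fitted slope on the lagged-PM2.5 column is the synthetic analogue of the correlation the study reports between PM2.5 concentration and the rate of change in confirmed cases.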

Journal ArticleDOI
08 Jun 2020
TL;DR: This article shows, somewhat surprisingly, that, following a cyber-attack, the effects of a network interconnection topology and of a wide range of loss distributions on the probability of a cyber-blackout and on the increase in total service-related monetary losses across all organizations are mostly very small.
Abstract: Service liability interconnections among globally networked IT- and IoT-driven service organizations create potential channels for cascading service disruptions worth billions of dollars, due to modern cyber-crimes such as DDoS, APT, and ransomware attacks. A natural question that arises in this context is: what is the likelihood of a cyber-blackout?, where the latter term is defined as the probability that all (or a major subset of) organizations in a service chain become dysfunctional in a certain manner due to a cyber-attack at some or all points in the chain. The answer to this question has major implications for risk management businesses such as cyber-insurance when it comes to designing policies by risk-averse insurers for providing coverage to clients in the aftermath of such catastrophic network events. In this article, we investigate this question in general as a function of service chain networks and different cyber-loss distribution types. We show, somewhat surprisingly (and discuss the potential practical implications), that, following a cyber-attack, the effects of (a) the network interconnection topology and (b) a wide range of loss distributions on the probability of a cyber-blackout and on the increase in total service-related monetary losses across all organizations are mostly very small. These results are primarily attributed to the degree of heterogeneity in the revenue base among organizations and to the increasing failure rate (IFR) property of popular (i.i.d./non-i.i.d.) loss distributions, i.e., log-concave cyber-loss distributions. They will enable risk-averse cyber-risk managers to safely infer the impact of cyber-attacks in a worst-case, network- and distribution-oblivious setting.
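The intuition behind the cyber-blackout result can be conveyed with a deliberately simplified toy calculation (not the paper's model): if a blackout requires every organization in the chain to be knocked out simultaneously, and losses follow a log-concave, increasing-failure-rate distribution such as the exponential, the joint probability collapses geometrically with chain length, largely regardless of topology. The buffer values below are invented for illustration.

```python
import math

# Toy model: a cyber-attack inflicts i.i.d. exponential losses of mean 1
# (a log-concave, increasing-failure-rate distribution) on each of the
# organizations in a service chain. A "cyber-blackout" requires *every*
# organization's loss to exceed its own revenue buffer r_i, so by
# independence  P(blackout) = prod_i P(L_i > r_i) = prod_i exp(-r_i).

def blackout_prob(buffers):
    return math.prod(math.exp(-r) for r in buffers)

homogeneous = [1.0] * 10                             # identical buffers
heterogeneous = [0.2 * (i + 1) for i in range(10)]   # heterogeneous revenue base

p_hom = blackout_prob(homogeneous)
p_het = blackout_prob(heterogeneous)
```

Both probabilities are already tiny for a ten-organization chain, which echoes the paper's point that the fast tail decay of log-concave losses, rather than the wiring of the chain, drives the blackout probability.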

Journal ArticleDOI
16 Oct 2020
TL;DR: In this article, the authors argue that data of strategic individuals with heterogeneous privacy valuations in a distributed online social network (e.g., Facebook) will be underpriced, if traded in a monopoly buyer setting, and will lead to diminishing utilitarian welfare.
Abstract: This letter argues that data of strategic individuals with heterogeneous privacy valuations in a distributed online social network (e.g., Facebook) will be under-priced if traded in a monopoly buyer setting, and will lead to diminishing utilitarian welfare. This result, for a certain family of online community data trading problems, stands in stark contrast to a popular information economics intuition that increased amounts of end-user data signals in a data market improve its efficiency. Our proposed theory paves the way for a future (counter-intuitive) analysis of data trading oligopoly markets for online social networks (OSNs).
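A stylized numerical sketch of the under-pricing intuition (assuming a textbook single-price monopsony with uniform privacy valuations, not the letter's actual formulation): the profit-maximizing price excludes individuals with high privacy valuations, so realized utilitarian welfare falls short of the efficient benchmark.

```python
import random

random.seed(1)

W = 1.0  # assumed value the buyer derives from each individual's data
# Heterogeneous privacy valuations, one per individual.
valuations = [random.uniform(0.0, 1.0) for _ in range(1000)]

def buyer_profit(p):
    # An individual sells iff the price at least covers their valuation.
    return (W - p) * sum(1 for v in valuations if v <= p)

def welfare(p):
    # Utilitarian welfare: total surplus from the trades that happen.
    return sum(W - v for v in valuations if v <= p)

# The monopsonist posts the single price maximizing its own profit,
# searched here over a coarse grid.
best_p = max((k / 1000.0 for k in range(1001)), key=buyer_profit)

w_monopoly = welfare(best_p)
w_efficient = welfare(W)   # benchmark: every welfare-improving trade occurs
```

In this toy instance the monopsony price sits near 0.5, roughly half the trades that would be efficient never happen, and welfare is strictly below the efficient benchmark.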

Proceedings ArticleDOI
Kun Chen, Rongpeng Li, Jon Crowcroft, Zhifeng Zhao, Honggang Zhang
21 Sep 2020
TL;DR: This work proposes a decentralized collaboration method named "stigmergy" for network-assisted MAS, exploiting digital pheromones (DP) as an indirect medium of communication and utilizing deep reinforcement learning (DRL) on top.
Abstract: A multi-agent system (MAS) needs to mobilize multiple simple agents to complete complex tasks. However, it is difficult to coherently coordinate distributed agents using only limited local information. In this demo, we propose a decentralized collaboration method named "stigmergy" for network-assisted MAS, exploiting digital pheromones (DP) as an indirect medium of communication and utilizing deep reinforcement learning (DRL) on top. Correspondingly, we implement an experimental platform on which KHEPERA IV robots form specific target shapes in a decentralized manner. Experimental results demonstrate the effectiveness and efficiency of the proposed method. Our platform can be conveniently extended to investigate the impact of network factors (e.g., latency, data rate).
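A minimal sketch of pheromone-based stigmergic coordination (a hypothetical grid-world abstraction, not the KHEPERA IV platform, and without the DRL component): a target cell deposits digital pheromone, the environment evaporates and diffuses it, and agents coordinate indirectly by climbing the local pheromone gradient.

```python
SIZE = 10
EVAP, DIFF, DEPOSIT = 0.1, 0.05, 1.0   # assumed stigmergy constants
TARGET = (7, 7)                        # cell that attracts the swarm
agents = [(0, 0), (9, 0), (0, 9)]

field = [[0.0] * SIZE for _ in range(SIZE)]

def neighbours(i, j):
    return [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= i + di < SIZE and 0 <= j + dj < SIZE]

def step():
    global field, agents
    # 1. Deposit: the target marks the shared environment with pheromone.
    field[TARGET[0]][TARGET[1]] += DEPOSIT
    # 2. Evaporation and diffusion spread a distance-decaying signal.
    new = [[0.0] * SIZE for _ in range(SIZE)]
    for i in range(SIZE):
        for j in range(SIZE):
            nbrs = neighbours(i, j)
            new[i][j] += (1 - EVAP) * (1 - DIFF * len(nbrs)) * field[i][j]
            for ni, nj in nbrs:
                new[ni][nj] += (1 - EVAP) * DIFF * field[i][j]
    field = new
    # 3. Each agent greedily climbs the local pheromone gradient:
    #    no direct agent-to-agent messages, only the shared medium.
    agents = [max([(i, j)] + neighbours(i, j),
                  key=lambda c: field[c[0]][c[1]])
              for (i, j) in agents]

start = list(agents)
for _ in range(200):
    step()
```

Each agent reads only the pheromone values in its own neighbourhood, so the agents converge on the target without any direct communication between them.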

Journal ArticleDOI
10 Apr 2020
TL;DR: The stages of digital technology readiness are viewed through the lens of three contemporary and widely discussed examples, namely distributed ledger technology, machine learning, and the internet of things, to clarify when an old technology is really just being re-branded, when there is something genuinely new and useful, and whether there may be over-claiming.
Abstract: The stages of digital technology readiness are viewed through the lens of three contemporary and widely discussed examples, namely distributed ledger technology, machine learning, and the internet of things. I use these examples to clarify when there is really just an old technology being re-branded, when there is something genuinely new and useful, and whether there may be over-claiming.

Posted Content
TL;DR: This paper focuses on enhancing the standardized capabilities of the CIoT RRC layer by designing and implementing a new architecture that accommodates RRC slicing and an intelligent controller, implemented on the open-source software platform OpenAirInterface.
Abstract: The cellular internet of things (CIoT) has become an important branch catering to various applications of IoT devices. Within CIoT, the radio resource control (RRC) layer is responsible for fundamental functionalities such as connection control and bearer establishment in the radio access network (RAN). The emergence of various IoT scenarios and diversified service requirements has made both RAN slicing and intelligent control imperative requirements in the RRC layer. This paper focuses on enhancing the standardized capabilities of the CIoT RRC layer by designing and implementing a new architecture that accommodates RRC slicing and an intelligent controller. The architecture aims to realize the functionality of creating, modifying, and deleting slices in the RRC layer, while the intelligent controller is added to smartly satisfy the varied and dynamic service requirements of different IoT devices. The proposed architecture is further implemented on the open-source software platform OpenAirInterface (OAI), on top of which the effectiveness of RRC slicing is validated and a proof-of-concept case adopting reinforcement learning to dynamically tune discontinuous reception (DRX) parameters is presented. Simulation results demonstrate the effectiveness of the proposed intelligent RRC slicing architecture.
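A toy sketch of the proof-of-concept idea of using reinforcement learning to tune DRX parameters. The candidate cycle lengths, traffic states, and reward shape below are invented for illustration; the paper's actual agent and simulator are not reproduced here.

```python
import random

random.seed(3)

CYCLES = [32, 256]            # candidate DRX cycle lengths in ms (assumed)
TRAFFIC = ["low", "high"]     # toy traffic-intensity states

def reward(traffic, cycle):
    # Toy trade-off: short cycles waste energy on frequent wake-ups;
    # long cycles add paging latency, which is costly under heavy traffic.
    energy_cost = 32.0 / cycle
    latency_cost = (cycle / 256.0) * (2.0 if traffic == "high" else 0.2)
    return -(energy_cost + latency_cost) + random.gauss(0.0, 0.05)

Q = {(s, a): 0.0 for s in TRAFFIC for a in CYCLES}
ALPHA, EPS = 0.1, 0.2

for _ in range(5000):
    s = random.choice(TRAFFIC)
    if random.random() < EPS:                          # epsilon-greedy exploration
        a = random.choice(CYCLES)
    else:
        a = max(CYCLES, key=lambda c: Q[(s, c)])
    # One-step (bandit-style) Q update: each decision epoch is independent.
    Q[(s, a)] += ALPHA * (reward(s, a) - Q[(s, a)])

policy = {s: max(CYCLES, key=lambda c: Q[(s, c)]) for s in TRAFFIC}
```

With this reward shape, the learned policy picks the long, energy-saving cycle under light traffic and the short, low-latency cycle under heavy traffic, which is the kind of dynamic DRX adaptation the abstract describes.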

Proceedings ArticleDOI
14 Dec 2020
TL;DR: In this article, the authors analyze whether traditional cyber-risk spreading is a sustainable risk management practice, and under what conditions, for the quite conservative scenario in which proportions of i.i.d. catastrophic cyber-risks of a significantly heavy-tailed nature are aggregated by a cyber-risk manager.
Abstract: IoT-driven smart cities are popular service-networked ecosystems whose proper functioning depends heavily on digitally secure and reliable supply chain relationships. However, the naivety of current security efforts by the concerned parties to protect IoT devices poses tough challenges to scalable and expanding cyber-risk management markets for IoT societies in the aftermath of a systemic cyber-catastrophe. As firms increasingly turn to cyber-insurance for reliable risk management, and insurers turn to reinsurance for their own risk management, questions arise as to how modern-day cyber-risks aggregate and accumulate, and whether reinsurance is a feasible model for reliable catastrophic risk management and transfer in smart cities. In this introductory effort, we analyze (a) whether traditional cyber-risk spreading is a sustainable risk management practice, and (b) under what conditions, for the quite conservative scenario in which proportions of i.i.d. catastrophic cyber-risks of a significantly heavy-tailed nature are aggregated by a cyber-risk manager.
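The aggregation question can be illustrated with a small Monte Carlo sketch (an assumed toy model, not the paper's analysis): for Pareto losses with tail index below 1, the 95% value-at-risk of a pooled per-risk share exceeds that of a standalone risk, so spreading catastrophic heavy-tailed cyber-risks across a pool can be counterproductive.

```python
import random

random.seed(4)

ALPHA = 0.8        # Pareto tail index < 1: catastrophically heavy-tailed
N, SIMS, Q = 10, 100_000, 0.95

def pareto_loss():
    # Inverse-CDF sampling from a Pareto(ALPHA) distribution with x_min = 1.
    return random.random() ** (-1.0 / ALPHA)

def value_at_risk(samples, q):
    # Empirical q-quantile of the simulated losses.
    return sorted(samples)[int(q * len(samples))]

# A standalone risk versus an equal share of a pool of N i.i.d. risks.
standalone = [pareto_loss() for _ in range(SIMS)]
pooled_share = [sum(pareto_loss() for _ in range(N)) / N for _ in range(SIMS)]

v_standalone = value_at_risk(standalone, Q)
v_pooled = value_at_risk(pooled_share, Q)
```

Here `v_pooled` comes out larger than `v_standalone`: with such heavy tails, value-at-risk is superadditive and diversification across the pool increases, rather than reduces, each participant's tail risk.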

Posted Content
TL;DR: It is argued that data of strategic individuals with heterogeneous privacy valuations in a distributed online social network (e.g., Facebook) will be under-priced, if traded in a monopoly buyer setting, and will lead to diminishing utilitarian welfare.
Abstract: This paper argues that data of strategic individuals with heterogeneous privacy valuations in a distributed online social network (e.g., Facebook) will be under-priced if traded in a monopoly buyer setting, and will lead to diminishing utilitarian welfare. This result, for a certain family of online community data trading problems, stands in stark contrast to a popular information economics intuition that increased amounts of end-user data signals in a data market improve its efficiency. Our proposed theory paves the way for a future (counter-intuitive) analysis of data trading oligopoly markets for online social networks (OSNs).