Journal Article

Preventable H5N1 avian influenza epidemics in the British poultry industry network exhibit characteristic scales.

06 Apr 2010 - Journal of the Royal Society Interface (The Royal Society) - Vol. 7, Iss. 45, pp. 695-701
TL;DR: H5N1 avian influenza transmission probabilities and containment strategies, here modelled on the British poultry industry network, show that infection dynamics can additionally express characteristic scales, and hotspots can make more effective inoculation targets.
Abstract: Epidemics are frequently simulated on redundantly wired contact networks, which have many more links between sites than are minimally required to connect all. Consequently, the modelled pathogen can travel numerous alternative routes, complicating effective containment strategies. These networks have moreover been found to exhibit ‘scale-free’ properties and percolation, suggesting resilience to damage. However, realistic H5N1 avian influenza transmission probabilities and containment strategies, here modelled on the British poultry industry network, show that infection dynamics can additionally express characteristic scales. These system-preferred scales constitute small areas within an observed power law distribution that exhibit a lesser slope than the power law itself, indicating a slightly increased relative likelihood. These characteristic scales are here produced by a network-pervading intranet of so-called hotspot sites that propagate large epidemics below the percolation threshold. This intranet is, however, extremely vulnerable; targeted inoculation of a mere 3–6% (depending on incorporated biosecurity measures) of the British poultry industry network prevents large and moderate H5N1 outbreaks completely, offering an order of magnitude improvement over previously advocated strategies affecting the most highly connected ‘hub’ sites. In other words, hotspots and hubs are separate functional entities that do not necessarily coincide, and hotspots can make more effective inoculation targets. Given the ubiquity and relevance of networks (epidemics, Internet, power grids, protein interaction), recognition of this spreading regime elsewhere would suggest a similar disproportionate sensitivity to such surgical interventions.
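As an illustration of the kind of intervention experiment summarized above, the sketch below inoculates (removes) a small fraction of nodes and then measures the largest outbreak reachable by a bond-percolation-style epidemic. It is not the authors' poultry-network model: the synthetic Barabási-Albert graph, the per-contact transmission probability of 0.4 and the degree-based (hub) targeting are illustrative assumptions, and identifying the paper's "hotspots" would additionally require the epidemic simulations described in the article.

```python
# Toy sketch (not the paper's model): inoculate a small fraction of nodes and
# measure the largest epidemic reachable via bond percolation on the contacts.
import random
import networkx as nx

def largest_outbreak(G, transmission_prob, trials=200, seed=1):
    """Keep each contact (edge) independently with the transmission probability
    and return the mean size of the largest connected cluster over many trials."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(trials):
        H = nx.Graph()
        H.add_nodes_from(G)
        H.add_edges_from(e for e in G.edges if rng.random() < transmission_prob)
        sizes.append(len(max(nx.connected_components(H), key=len)))
    return sum(sizes) / len(sizes)

G = nx.barabasi_albert_graph(2000, 2, seed=0)    # synthetic scale-free network
p = 0.4                                          # assumed transmission probability

# Inoculate 5% of sites, targeted by degree (the 'hub' strategy the paper compares against).
n_remove = int(0.05 * G.number_of_nodes())
hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:n_remove]
G_hub = G.copy()
G_hub.remove_nodes_from(node for node, _ in hubs)

print("largest outbreak, no inoculation :", largest_outbreak(G, p))
print("largest outbreak, 5% hubs removed:", largest_outbreak(G_hub, p))
```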


Citations
Dissertation
12 Nov 2010
TL;DR: The junctions of mathematical and computer modeling of infectious disease epidemics, the basis of such research, and the communication of results are explored; a study of sexual networks shows that the sexually active Swedish population is, with high probability, largely connected in a giant component, making it highly conducive to the spread of sexually transmitted pathogens.
Abstract: This thesis explores the junctions of mathematical and computer modeling of infectious disease epidemics, the basis of such research and the communication of results. With increasing frequency we turn to computers and software for any type of research problem encountered. Computer modeling is a blessing with many hidden trapdoors. Skipping mathematical modeling and resorting to code immediately is ill-advised. Validity, uncertainty, bugs and old mathematical truths must all be taken under careful consideration. The same duality is present in the communication of the results from computer models to the public, to decision makers and to peers. These topics are discussed in the context of four contributing papers. The first paper describes a computer model of an infectious disease epidemic in Sweden. Using Swedish travel data we were able to demonstrate a way of successfully restricting travel to delay the spread of disease. The second paper discusses a known fallacy common to many epidemic models, often overlooked when mathematical models are simulated on computers. It is demonstrated that it must also be considered with more complex models. The model in Paper I is used to exemplify the problem. The third study takes the parsimony considerations of the first two papers to another level, proposing static models for use in epidemic modeling. Understanding, an elusive but essential component of all models and especially of computer models, is thereby improved. The fourth study explores the epidemiology of sexual networks. Using survey datasets we show that with high probability, the sexually active population is largely connected, in a so-called giant component, rendering the Swedish population an ideal biotope for sexually transmitted pathogens.
LIST OF PUBLICATIONS:
I. Camitz, M., F. Liljeros. The effect of travel restrictions on the spread of a moderately contagious disease. BMC Med 2006, 4:32.
II. Camitz, M., A. Svensson. The effect of time distribution shape on a complex epidemic model. Bull Math Biol 2009, 71(8):1902-13.
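The giant-component finding of the fourth study can be mimicked with a configuration-model sketch: draw a partner-count (degree) sequence from a heavy-tailed distribution loosely standing in for survey data, wire stubs at random, and measure the fraction of the population in the largest connected component. The Zipf distribution, its exponent and cutoff, and the population size below are illustrative assumptions, not the thesis's survey data.

```python
# Sketch: is a population with a heavy-tailed partner distribution "largely
# connected"? Build a configuration-model graph and measure the giant component.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n = 10_000

# Illustrative partner-count distribution (truncated Zipf), NOT real survey data.
degrees = np.minimum(rng.zipf(a=2.5, size=n), 100)
if degrees.sum() % 2:            # configuration model needs an even number of stubs
    degrees[0] += 1

G = nx.configuration_model(degrees.tolist(), seed=0)
G = nx.Graph(G)                  # collapse parallel edges
G.remove_edges_from(nx.selfloop_edges(G))

giant = max(nx.connected_components(G), key=len)
print(f"fraction of the population in the giant component: {len(giant) / n:.2f}")
```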

8 citations

Dissertation
01 Nov 2014
TL;DR: Evidence confirming and demonstrating the importance of understanding the tail behaviour of proposals in importance sampling is presented, and a new algorithm, the Kernel Metropolis-Hastings (KMH), is proposed as an approximate algorithm for low-dimensional marginal inference in situations where the GIMH algorithm fails because of sticking.
Abstract: Statistical inference and model choice for partially observed epidemics provide a variety of challenges, both practical and theoretical. This thesis studies some related aspects of models for epidemics and their inference. The use of the matrix exponential to facilitate exact calculations in the General Stochastic Epidemic (GSE) is demonstrated, most usefully in providing the exact marginal likelihood when infection times are unobserved. The bipartite graph epidemic is defined and shown to be a flexible framework which encompasses many existing models. It also provides a way in which a deeper understanding of the relation between existing models could be obtained. The Indian buffet epidemic is introduced as a non-parametric approach to modelling unknown heterogeneous contact structures in epidemics. Inference for the Indian buffet epidemic is a challenging problem; some progress has been made, but the algorithms which have been studied do not yet scale to the size of problem at which significant differences from the GSE are apparent. Evidence confirming and demonstrating the importance of understanding the tail behaviour of proposals in importance sampling is presented. The adverse impact of heavy-tailed proposals on the Grouped Independence Metropolis-Hastings (GIMH) and Monte Carlo within Metropolis (MCWM) algorithms is demonstrated. A new algorithm, the Kernel Metropolis-Hastings (KMH), is proposed as an approximate algorithm for low-dimensional marginal inference in situations where the GIMH algorithm fails because of sticking. The KMH is demonstrated on a challenging 2-d problem.
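The matrix-exponential calculation mentioned above can be sketched for a small General Stochastic Epidemic (a Markovian SIR model): enumerate the states (s, i), assemble the generator Q from the infection rate βsi/N and recovery rate γi, and obtain exact state probabilities at time t as p(0)·exp(Qt). Population size, rates and the time horizon below are arbitrary illustration values, and this shows only the forward calculation, not the thesis's likelihood machinery.

```python
# Sketch: exact state probabilities of a small General Stochastic Epidemic
# (Markovian SIR) via the matrix exponential of its generator.
import numpy as np
from scipy.linalg import expm

N, beta, gamma, t = 10, 2.0, 1.0, 5.0         # illustrative parameters

states = [(s, i) for s in range(N + 1) for i in range(N + 1 - s)]   # (susceptible, infective)
index = {st: k for k, st in enumerate(states)}

Q = np.zeros((len(states), len(states)))
for (s, i), k in index.items():
    if s > 0 and i > 0:                       # infection: (s, i) -> (s-1, i+1)
        rate = beta * s * i / N
        Q[k, index[(s - 1, i + 1)]] += rate
        Q[k, k] -= rate
    if i > 0:                                 # recovery: (s, i) -> (s, i-1)
        Q[k, index[(s, i - 1)]] += gamma * i
        Q[k, k] -= gamma * i

p0 = np.zeros(len(states))
p0[index[(N - 1, 1)]] = 1.0                   # one initial infective
pt = p0 @ expm(Q * t)                         # exact distribution at time t

# States with i == 0 are absorbing; as t grows their probabilities converge to
# the final-size distribution (final size = N - s individuals ever infected).
absorbed = [(s, k) for (s, i), k in index.items() if i == 0]
print("P(epidemic over by time t):", round(sum(pt[k] for _, k in absorbed), 4))
for s, k in sorted(absorbed, reverse=True):
    if pt[k] > 1e-4:
        print(f"  P(final size = {N - s} and epidemic over) = {pt[k]:.4f}")
```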

4 citations

Posted Content
13 Aug 2021 - medRxiv
TL;DR: Analyzing the predicted effect of vaccination into an ongoing COVID-19 outbreak, it is found that adaptive combinations of targeted vaccination and non-pharmaceutical interventions (NPIs) are required to reach population immunity.
Abstract: Reaching population immunity against COVID-19 is proving difficult even in countries with high vaccination levels. We demonstrate that this is in part due to heterogeneity and stochasticity resulting from community-specific human-human interaction and infection networks. We address this challenge by community-specific simulation of adaptive strategies. Analyzing the predicted effect of vaccination into an ongoing COVID-19 outbreak, we find that adaptive combinations of targeted vaccination and non-pharmaceutical interventions (NPIs) are required to reach population immunity. Importantly, the threshold for population immunity is not a unique number but strategy- and community-dependent. Furthermore, the dynamics of COVID-19 outbreaks are highly community-specific: in some communities vaccinating highly interactive people diminishes the risk of an infection wave, while vaccinating the elderly reduces fatalities when vaccination coverage is low due to supply or hesitancy. Similarly, while risk groups should be vaccinated first to minimize fatalities, optimality branching is observed with increasing population immunity. Bimodality emerges as the infection network gains complexity over time, which entails that NPIs generally need to be longer and stricter. Thus, we analyze and quantify the requirement for NPIs dependent on the chosen vaccination strategy. We validate our simulation platform on real-world epidemiological data and demonstrate that it can predict pathways to population immunity for diverse communities world-wide challenged by limited vaccination.
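For orientation on the abstract's point that the population-immunity threshold is not a single number: under homogeneous mixing the classical threshold is 1 - 1/R0, and the sketch below merely evaluates that baseline for a few illustrative reproduction numbers. Community-specific contact heterogeneity, targeted vaccination and NPIs, which are the subject of this paper, shift the real threshold away from this value.

```python
# Baseline only: the homogeneous-mixing herd-immunity threshold 1 - 1/R0.
# Community-specific heterogeneity (this paper's subject) moves the true,
# strategy-dependent threshold away from this single number.
for R0 in (1.5, 2.5, 4.0, 6.0):          # illustrative reproduction numbers
    print(f"R0 = {R0:>3}: homogeneous-mixing threshold = {1 - 1 / R0:.0%}")
```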

1 citation

Dissertation
01 Jun 2015
TL;DR: This thesis considers various stochastic models of disease propagation which utilise the concept of a finite contact (social) network and considers the probability that a large scale outbreak will occur when a single infected individual is introduced to a susceptible population.
Abstract: Mathematical models for the spread of infectious diseases between living organisms are crucial to humanity's endeavour to understand and control its environment. The threat posed by communicable diseases is great. For example, the 1918 flu pandemic resulted in the deaths of over 50 million people and the HIV/AIDS pandemic is still under way with 2.3 million new cases in 2012. Mathematical models allow us to make predictions about the likelihood, impact and time scale of possible epidemics, and to devise effective intervention programmes, e.g. mass vaccination. This thesis considers various stochastic models of disease propagation which utilise the concept of a finite contact (social) network. For such models, we investigate ways in which important information can be extracted without a full mathematical `solution' (often unavailable) or numerous time consuming simulations (often inefficient and uninformative). For example, we consider the probability that a large scale outbreak will occur when a single infected individual is introduced to a susceptible population, and the expected number of infected individuals at time t. Although we focus on the context of epidemiology, the models under investigation are elementary and will be applicable to other domains, such as the spread of computer viruses, the spread of ideas, chemical reactions, and interacting particle systems.
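The "probability that a large scale outbreak will occur" from a single introduced case is commonly approximated by a branching process: the extinction probability q is the smallest fixed point of the offspring probability generating function, and the major-outbreak probability is 1 - q. The sketch below assumes Poisson-distributed secondary cases with mean R0; that offspring law and the R0 values are illustrative assumptions rather than this thesis's specific network models.

```python
# Branching-process approximation: probability of a major outbreak from one
# introduction, assuming Poisson(R0) secondary cases, so q solves q = exp(R0*(q - 1)).
import math

def major_outbreak_prob(R0, iterations=1000):
    q = 0.0                                  # iterate q <- g(q) upward from 0
    for _ in range(iterations):
        q = math.exp(R0 * (q - 1.0))
    return 1.0 - q                           # 0 whenever R0 <= 1

for R0 in (0.8, 1.5, 2.5, 4.0):
    print(f"R0 = {R0}: P(major outbreak) ~ {major_outbreak_prob(R0):.3f}")
```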

Cites background from "Preventable H5N1 avian influenza ep..."

  • ...This network is the largest (5,119 node) strongly connected component of a network which was generated from simulations on a complex model of the spread of H5N1 avian influenza through the British poultry flock (Sharkey et al., 2008; Jonkers et al., 2010)....


References
Journal Article
TL;DR: In this paper, a simple model based on growth and preferential attachment was proposed; it reproduces the power-law degree distribution of real networks and captures the evolution of networks, not just their static topology.
Abstract: The emergence of order in natural systems is a constant source of inspiration for both physical and biological sciences. While the spatial order characterizing, for example, crystals has been the basis of many advances in contemporary physics, most complex systems in nature do not offer such a high degree of order. Many of these systems form complex networks whose nodes are the elements of the system and edges represent the interactions between them. Traditionally complex networks have been described by the random graph theory founded in 1959 by Paul Erdős and Alfréd Rényi. One of the defining features of random graphs is that they are statistically homogeneous, and their degree distribution (characterizing the spread in the number of edges starting from a node) is a Poisson distribution. In contrast, recent empirical studies, including the work of our group, indicate that the topology of real networks is much richer than that of random graphs. In particular, the degree distribution of real networks is a power law, indicating a heterogeneous topology in which the majority of the nodes have a small degree, but there is a significant fraction of highly connected nodes that play an important role in the connectivity of the network. The scale-free topology of real networks has very important consequences for their functioning. For example, we have discovered that scale-free networks are extremely resilient to the random disruption of their nodes. On the other hand, the selective removal of the nodes with highest degree induces a rapid breakdown of the network into isolated subparts that cannot communicate with each other. The non-trivial scaling of the degree distribution of real networks is also an indication of their assembly and evolution. Indeed, our modeling studies have shown us that there are general principles governing the evolution of networks. Most networks start from a small seed and grow by the addition of new nodes which attach to the nodes already in the system. This process obeys preferential attachment: the new nodes are more likely to connect to nodes with already high degree. We have proposed a simple model based on these two principles which was able to reproduce the power-law degree distribution of real networks. Perhaps even more importantly, this model paved the way to a new paradigm of network modeling, trying to capture the evolution of networks, not just their static topology.
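A minimal sketch of the growth-plus-preferential-attachment mechanism described above, using networkx's built-in Barabási-Albert generator: compared with a random graph of the same size, the grown network develops hubs whose degree far exceeds the mean. Network sizes and the attachment parameter are arbitrary.

```python
# Sketch: growth + preferential attachment (Barabasi-Albert) yields a heavy-tailed
# degree distribution, unlike the Poisson-like degrees of an equivalent random graph.
import networkx as nx

n, m = 10_000, 2
G_ba = nx.barabasi_albert_graph(n, m, seed=42)                   # preferential attachment
G_er = nx.gnm_random_graph(n, G_ba.number_of_edges(), seed=42)   # same node/edge counts

for name, G in (("Barabasi-Albert", G_ba), ("random graph", G_er)):
    degs = [d for _, d in G.degree]
    print(f"{name:16s} mean degree {sum(degs) / len(degs):.1f}, max degree {max(degs)}")
# The grown network shows a few very highly connected hubs (max degree >> mean);
# the random graph does not.
```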

18,415 citations

Journal Article
27 Jul 2000 - Nature
TL;DR: It is found that scale-free networks, which include the World-Wide Web, the Internet, social networks and cells, display an unexpected degree of robustness to random failures, the ability of their nodes to communicate being unaffected even by unrealistically high failure rates, yet are extremely vulnerable to the targeted removal of the most connected nodes.
Abstract: Many complex systems display a surprising degree of tolerance against errors. For example, relatively simple organisms grow, persist and reproduce despite drastic pharmaceutical or environmental interventions, an error tolerance attributed to the robustness of the underlying metabolic network. Complex communication networks display a surprising degree of robustness: although key components regularly malfunction, local failures rarely lead to the loss of the global information-carrying ability of the network. The stability of these and other complex systems is often attributed to the redundant wiring of the functional web defined by the systems' components. Here we demonstrate that error tolerance is not shared by all redundant systems: it is displayed only by a class of inhomogeneously wired networks, called scale-free networks, which include the World-Wide Web, the Internet, social networks and cells. We find that such networks display an unexpected degree of robustness, the ability of their nodes to communicate being unaffected even by unrealistically high failure rates. However, error tolerance comes at a high price in that these networks are extremely vulnerable to attacks (that is, to the selection and removal of a few nodes that play a vital role in maintaining the network's connectivity). Such error tolerance and attack vulnerability are generic properties of communication networks.
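The robustness/fragility contrast can be reproduced in a few lines: remove the same number of nodes from a scale-free graph either at random ("errors") or in decreasing order of degree ("attack") and compare how the largest connected component shrinks. A minimal sketch with assumed graph size and removal fraction, not the paper's exact procedure.

```python
# Sketch: random failures vs targeted attack on a synthetic scale-free network.
import random
import networkx as nx

def giant_fraction(G):
    """Fraction of the surviving nodes that sit in the largest connected component."""
    return len(max(nx.connected_components(G), key=len)) / G.number_of_nodes()

G = nx.barabasi_albert_graph(5000, 2, seed=1)
k = int(0.05 * G.number_of_nodes())              # remove 5% of nodes

rng = random.Random(1)
G_err = G.copy()
G_err.remove_nodes_from(rng.sample(list(G.nodes), k))            # random failures

G_att = G.copy()
hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:k]
G_att.remove_nodes_from(node for node, _ in hubs)                # targeted attack on hubs

print("giant component after random failures:", round(giant_fraction(G_err), 3))
print("giant component after targeted attack:", round(giant_fraction(G_att), 3))
```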

7,697 citations


"Preventable H5N1 avian influenza ep..." refers background in this paper

  • ...…heterogeneous, redundantly wired networks that feature many possible routes between most network nodes, e.g. Internet, social and ecological systems, protein interaction networks, and which exhibit power law statistics (Rhodes & Anderson 1996; Albert et al. 2000; Albert & Barabasi 2002; May 2006)....


  • ...Others offer hope in identifying the most highly connected sites (hubs) as the most vulnerable part of such systems (Albert et al. 2000; Callaway et al. 2000; May & Lloyd 2001; Song et al. 2005; Jeger et al. 2007)....


  • ...Following existing advice (Albert et al. 2000; Song et al. 2005; Jeger et al. 2007; Dent et al. 2008), we focused first on hubs; all other sites we call peripherals....



Journal Article
09 Jun 2005 - Nature
TL;DR: After defining a set of new characteristic quantities for the statistics of communities, this work applies an efficient technique for exploring overlapping communities on a large scale and finds that overlaps are significant, and the distributions introduced reveal universal features of networks.
Abstract: Many complex systems in nature and society can be described in terms of networks capturing the intricate web of connections among the units they are made of. A key question is how to interpret the global organization of such networks as the coexistence of their structural subunits (communities) associated with more highly interconnected parts. Identifying these a priori unknown building blocks (such as functionally related proteins, industrial sectors and groups of people) is crucial to the understanding of the structural and functional properties of networks. The existing deterministic methods used for large networks find separated communities, whereas most of the actual networks are made of highly overlapping cohesive groups of nodes. Here we introduce an approach to analysing the main statistical features of the interwoven sets of overlapping communities that makes a step towards uncovering the modular structure of complex systems. After defining a set of new characteristic quantities for the statistics of communities, we apply an efficient technique for exploring overlapping communities on a large scale. We find that overlaps are significant, and the distributions we introduce reveal universal features of networks. Our studies of collaboration, word-association and protein interaction graphs show that the web of communities has non-trivial correlations and specific scaling properties.
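The overlapping communities analysed in this work are found with the k-clique (clique percolation) technique, for which networkx ships an implementation; the toy sketch below shows two 3-clique communities that overlap in a single shared node. The toy graph and the choice k = 3 are arbitrary.

```python
# Sketch: overlapping communities via k-clique (clique percolation) detection,
# using the implementation available in networkx.
import networkx as nx
from networkx.algorithms.community import k_clique_communities

# Toy graph: two triangles sharing node 3, plus a dangling edge.
G = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (3, 5), (5, 6)])

communities = list(k_clique_communities(G, 3))   # communities built from adjacent 3-cliques
print(communities)   # two communities, {1, 2, 3} and {3, 4, 5}; node 3 belongs to both
```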

5,217 citations


"Preventable H5N1 avian influenza ep..." refers background in this paper

  • ...Instead, hotspots tend to associate spatially (over short distances) to span large areas in thin strands, and, crucially, share many mutual contacts with other hotspots that extend this overlap network-wide (Newman 2003; Palla et al. 2005)....


Journal Article
TL;DR: This paper studies percolation on graphs with completely general degree distribution, giving exact solutions for a variety of cases, including site percolation, bond percolation, and models in which occupation probabilities depend on vertex degree.
Abstract: Recent work on the Internet, social networks, and the power grid has addressed the resilience of these networks to either random or targeted deletion of network nodes or links. Such deletions include, for example, the failure of Internet routers or power transmission lines. Percolation models on random graphs provide a simple representation of this process but have typically been limited to graphs with Poisson degree distribution at their vertices. Such graphs are quite unlike real-world networks, which often possess power-law or other highly skewed degree distributions. In this paper we study percolation on graphs with completely general degree distribution, giving exact solutions for a variety of cases, including site percolation, bond percolation, and models in which occupation probabilities depend on vertex degree. We discuss the application of our theory to the understanding of network resilience.
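For uniformly random site or bond percolation on a configuration-model network, the generating-function condition for a giant cluster reduces to a statement about the first two moments of the degree distribution: the critical occupation probability is q_c = <k> / (<k^2> - <k>). The sketch below evaluates this for an illustrative truncated power-law degree distribution; the distribution, exponents and cutoff are assumptions, and the paper's full treatment also covers degree-dependent occupation probabilities.

```python
# Sketch: critical occupation probability q_c = <k> / (<k^2> - <k>) for uniform
# percolation on a configuration-model network, evaluated for an illustrative
# truncated power-law degree distribution.
import numpy as np

def critical_occupation(p_k):
    """p_k[k] = probability that a node has degree k."""
    k = np.arange(len(p_k))
    mean_k = (k * p_k).sum()
    mean_k2 = (k ** 2 * p_k).sum()
    return mean_k / (mean_k2 - mean_k)

for exponent in (2.5, 3.0, 3.5):
    k = np.arange(1, 1001, dtype=float)          # degrees 1..1000 (cutoff is assumed)
    weights = k ** (-exponent)
    p_k = np.concatenate(([0.0], weights / weights.sum()))   # prepend P(degree = 0) = 0
    print(f"exponent {exponent}: q_c ~ {critical_occupation(p_k):.4f}")
```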

2,298 citations


"Preventable H5N1 avian influenza ep..." refers background in this paper

  • ...Others offer hope in identifying the most highly connected sites (hubs) as the most vulnerable part of such systems (Albert et al. 2000; Callaway et al. 2000; May & Lloyd 2001; Song et al. 2005; Jeger et al. 2007)....


Journal Article
27 Jan 2005 - Nature
TL;DR: A power-law relation is identified between the number of boxes needed to cover the network and the size of the box, defining a finite self-similar exponent to explain the scale-free nature of complex networks and suggest a common self-organization dynamics.
Abstract: Complex networks have been studied extensively owing to their relevance to many real systems such as the world-wide web, the Internet, energy landscapes and biological and social networks. A large number of real networks are referred to as 'scale-free' because they show a power-law distribution of the number of links per node. However, it is widely believed that complex networks are not invariant or self-similar under a length-scale transformation. This conclusion originates from the 'small-world' property of these networks, which implies that the number of nodes increases exponentially with the 'diameter' of the network, rather than the power-law relation expected for a self-similar structure. Here we analyse a variety of real complex networks and find that, on the contrary, they consist of self-repeating patterns on all length scales. This result is achieved by the application of a renormalization procedure that coarse-grains the system into boxes containing nodes within a given 'size'. We identify a power-law relation between the number of boxes needed to cover the network and the size of the box, defining a finite self-similar exponent. These fundamental properties help to explain the scale-free nature of complex networks and suggest a common self-organization dynamics.
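The renormalization described above can be roughly imitated with a greedy "burning" box covering: repeatedly pick an uncovered node and assign every node within graph distance r of it to one box, then examine how the number of boxes N_B falls with the box size. The sketch below is a crude greedy approximation on an arbitrary synthetic graph, not the paper's optimized covering of real networks.

```python
# Sketch: crude greedy ("burning") box covering. N_B(r) is the number of boxes of
# radius r needed to cover the graph; self-similar networks show a power-law
# relation between N_B and the box size.
import math
import networkx as nx

def greedy_box_count(G, radius):
    uncovered = set(G)
    boxes = 0
    for seed in list(G):
        if seed not in uncovered:
            continue
        ball = nx.single_source_shortest_path_length(G, seed, cutoff=radius)
        uncovered.difference_update(ball)        # everything within `radius` joins this box
        boxes += 1
    return boxes

G = nx.barabasi_albert_graph(2000, 2, seed=3)    # arbitrary synthetic test graph
for r in (1, 2, 3, 4):
    nb = greedy_box_count(G, r)
    print(f"box radius {r}: N_B = {nb} (log N_B = {math.log(nb):.2f})")
# Plotting log N_B against log(box size) and fitting a line would give the
# self-similar exponent discussed in the paper.
```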

1,303 citations


"Preventable H5N1 avian influenza ep..." refers background in this paper

  • ...Others offer hope in identifying the most highly connected sites (hubs) as the most vulnerable part of such systems (Albert et al. 2000; Callaway et al. 2000; May & Lloyd 2001; Song et al. 2005; Jeger et al. 2007)....


  • ...Following existing advice (Albert et al. 2000; Song et al. 2005; Jeger et al. 2007; Dent et al. 2008), we focused first on hubs; all other sites we call peripherals....

