
Showing papers by "Ben S. Cooper published in 2006"


Journal ArticleDOI
TL;DR: Stochastic models of the international spread of influenza based on extensions of coupled epidemic transmission models are developed and show that under most scenarios restrictions on air travel are likely to be of surprisingly little value in delaying epidemics, unless almost all travel ceases very soon after epidemics are detected.
Abstract: Background: The recent emergence of hypervirulent subtypes of avian influenza has underlined the potentially devastating effects of pandemic influenza. Were such a virus to acquire the ability to spread efficiently between humans, control would almost certainly be hampered by limited vaccine supplies unless global spread could be substantially delayed. Moreover, the large increases that have occurred in international air travel might be expected to lead to more rapid global dissemination than in previous pandemics. Methods and Findings: To evaluate the potential of local control measures and travel restrictions to impede global dissemination, we developed stochastic models of the international spread of influenza based on extensions of coupled epidemic transmission models. These models have been shown to be capable of accurately forecasting local and global spread of epidemic and pandemic influenza. We show that under most scenarios restrictions on air travel are likely to be of surprisingly little value in delaying epidemics, unless almost all travel ceases very soon after epidemics are detected. Conclusions: Interventions to reduce local transmission of influenza are likely to be more effective at reducing the rate of global spread and less vulnerable to implementation delays than air travel restrictions. Nevertheless, under the most plausible scenarios, achievable delays are small compared with the time needed to accumulate substantial vaccine stocks.

329 citations
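
The model class described above couples local epidemic dynamics between cities through air travel and asks how much scaling that travel down after detection delays global spread. Below is a minimal two-city sketch in that spirit; it is not the authors' model, and the population sizes, transmission and recovery rates, travel rates, and detection thresholds are all assumed values chosen purely for illustration.

```python
# A minimal two-city sketch (not the authors' model): stochastic SIR dynamics
# in each city, coupled by an air-travel matrix that is scaled down once the
# source epidemic is detected. All parameter values are assumptions.
import numpy as np

def arrival_day(travel_reduction=0.0, days=365, seed=0):
    """Day on which the epidemic becomes established in the second city."""
    rng = np.random.default_rng(seed)
    N = np.array([5_000_000, 5_000_000])           # city populations (assumed)
    I = np.array([10, 0])                          # seed cases in city 0
    S = N - I
    beta, gamma = 0.5, 1 / 3                       # per-day rates (assumed)
    travel = np.array([[0.0, 2e-4], [2e-4, 0.0]])  # daily per-capita travel (assumed)
    detected = False
    for day in range(days):
        if not detected and I[0] > 100:            # epidemic detected in source city
            detected = True
        rate = travel * (1 - travel_reduction) if detected else travel
        # chain-binomial daily update for infections and recoveries
        new_inf = rng.binomial(S, 1 - np.exp(-beta * I / N))
        new_rec = rng.binomial(I, 1 - np.exp(-gamma))
        S -= new_inf
        I += new_inf - new_rec
        # infected travellers imported from the other city (population change ignored)
        I += rng.poisson(rate @ I)
        if I[1] > 100:
            return day
    return None                                    # never established within `days`

for cut in (0.0, 0.90, 0.99):
    print(f"{cut:.0%} travel reduction -> arrival day {arrival_day(cut)}")
```

The qualitative behaviour to look for is the paper's point: because imports scale with the exponentially growing number of cases at the source, anything short of a near-total travel cut shifts the arrival day only modestly.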


Journal ArticleDOI
TL;DR: A mathematical model of smallpox transmission is used, for the first time, to account for the spread of the disease over a wide area; the analysis focuses on the island associated not only with the most notorious deliberate release of the virus but also with the development of the technique that ultimately led to its eradication.
Abstract: The second-to-last case of smallpox was diagnosed on August 24, 1978, when Janet Parker, a photographer at the University of Birmingham Medical School (Birmingham, U.K.), was admitted to the hospital. She had been infected when a virus escaped from the school’s smallpox laboratory. On September 6th, Professor Henry Bedson, who was responsible for the laboratory, killed himself. Five days later, Janet died in the hospital. Although many people were exposed to her before the diagnosis, Janet infected only her mother (who survived). The incident shows that fears of the accidental release of smallpox are certainly justified. The variola virus is now officially stored at two locations (one in Russia and one in the United States), and there are concerns that covert stocks may also exist. The events of September 11th, 2001, and the deliberate releases of anthrax in the weeks that followed, led many to believe that the chance of a bioterrorist attack with variola had previously been underestimated. In the event of the deliberate or accidental release of smallpox, what should be done? Would isolation of cases suffice to control an outbreak? Would it also be necessary to trace and vaccinate contacts of cases? Under what circumstances would mass vaccination be justified? In this issue of PNAS, Riley and Ferguson use a mathematical model of smallpox transmission to answer these questions (1). Although theirs is not the first simulation model to address such questions, nor even the first explicitly spatial analysis, it is the first carefully parameterized and repeatable study to account for the spread of the disease over a wide area. Fittingly, the analysis focuses on the island associated with not only the most notorious deliberate release of the virus (the British army infamously gave smallpox-contaminated blankets to Native Americans during Pontiac’s Rebellion in 1763) but also, thanks to Edward Jenner a few years later, with the development of the technique that ultimately led to its eradication.

20 citations
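
The control questions posed above (would isolation suffice, or is tracing and vaccination of contacts also needed?) can be made concrete with a far simpler caricature than the spatial model under discussion: a branching process in which case isolation reduces onward transmission and ring vaccination of traced contacts blocks a fraction of the remaining infections. The sketch below is only that caricature, not the Riley and Ferguson model, and every parameter value is an assumption.

```python
# An illustrative branching-process sketch, not the Riley and Ferguson spatial
# model: each case produces a Poisson number of potential secondary cases,
# isolation removes part of the transmission, and traced contacts may be
# vaccinated in time to block infection. All parameter values are assumptions.
import numpy as np

def outbreak_size(R0=5.0, isolation_effect=0.5, trace_coverage=0.7,
                  vaccine_efficacy=0.95, max_cases=5000, rng=None):
    """Simulate one outbreak and return the total number of cases."""
    rng = rng or np.random.default_rng()
    cases, active = 1, 1
    while active and cases < max_cases:
        active -= 1
        r_eff = R0 * (1 - isolation_effect)        # isolation shortens infectiousness
        contacts = rng.poisson(r_eff)
        # ring vaccination of traced contacts blocks a fraction of infections
        blocked = rng.binomial(contacts, trace_coverage * vaccine_efficacy)
        offspring = int(contacts - blocked)
        cases += offspring
        active += offspring
    return cases

rng = np.random.default_rng(1)
sizes = np.array([outbreak_size(rng=rng) for _ in range(2000)])
print("mean outbreak size:", sizes.mean())
print("P(more than 100 cases):", (sizes > 100).mean())
```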


Journal ArticleDOI
TL;DR: The data processing model for the CDF experiment reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into datasets of specialised physics interests.
Abstract: The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into datasets of specialised physics interests. The design of the processing control system makes strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers with data movement managed by the CDF data handling system to a multi-petaByte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day). It can be readily scaled by increasing CPU and data-handling capacity as required.

5 citations
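
As a concrete illustration of the splitting and bookkeeping described above (a sketch, not the CDF production code): events from a raw stream are routed into per-trigger physics datasets, and a bookkeeping record counts how many events from each input file landed in each dataset. The trigger and dataset names below are hypothetical, and the final line simply checks that the quoted sustained rate of 35 MByte/sec is consistent with 3 TByte/day.

```python
# An illustrative sketch of the splitting step, not the CDF production code:
# events carry the triggers they fired and are routed into per-trigger physics
# datasets, while a bookkeeping record counts events per (input file, dataset).
from collections import defaultdict

TRIGGER_TO_DATASET = {                 # hypothetical trigger-to-dataset mapping
    "HIGH_PT_ELECTRON": "electron_dataset",
    "HIGH_PT_MUON": "muon_dataset",
    "JET_100": "qcd_jet_dataset",
}

def split_stream(events, input_file):
    """Route each event to every dataset whose trigger it fired."""
    datasets = defaultdict(list)
    bookkeeping = defaultdict(int)     # (input file, dataset) -> event count
    for event in events:
        for trigger in event["triggers"]:
            dataset = TRIGGER_TO_DATASET.get(trigger)
            if dataset is None:
                continue               # trigger not assigned to any physics dataset
            datasets[dataset].append(event["id"])
            bookkeeping[(input_file, dataset)] += 1
    return datasets, bookkeeping

events = [
    {"id": 1, "triggers": ["HIGH_PT_ELECTRON"]},
    {"id": 2, "triggers": ["JET_100", "HIGH_PT_MUON"]},   # one event, two datasets
]
datasets, records = split_stream(events, "raw_stream_b.0001")
print(dict(datasets))
print(dict(records))

# sanity check of the quoted rates: 35 MByte/sec sustained over a day
print(35e6 * 86400 / 1e12, "TByte/day")   # ~3.0, matching "3 TByte/day"
```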



