Open access · Journal Article · DOI: 10.1109/TEVC.2021.3063217

From Prediction to Prescription: Evolutionary Optimization of Nonpharmaceutical Interventions in the COVID-19 Pandemic

02 Mar 2021 · IEEE Transactions on Evolutionary Computation (IEEE) · Vol. 25, Iss. 2, pp. 386-401
Abstract: Several models have been developed to predict how the COVID-19 pandemic spreads, and how it could be contained with nonpharmaceutical interventions, such as social distancing restrictions and school and business closures. This article demonstrates how evolutionary AI can be used to facilitate the next step, i.e., determining the most effective intervention strategies automatically. Through evolutionary surrogate-assisted prescription, it is possible to generate a large number of candidate strategies and evaluate them with predictive models. In principle, strategies can be customized for different countries and locales, balancing the need to contain the pandemic against the need to minimize economic impact. Early experiments suggest that workplace and school restrictions are the most important and need to be designed carefully. They also demonstrate that the results of lifting restrictions can be unreliable, and suggest creative ways in which restrictions can be implemented softly, e.g., by alternating them over time. As more data becomes available, the approach can be increasingly useful in dealing with COVID-19 as well as possible future pandemics.
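As a rough illustration (not the paper's implementation), the surrogate-assisted prescription loop can be sketched as an evolutionary search over NPI stringency vectors. The `surrogate` function below is a deliberately toy stand-in for a trained predictive model, and all names, weights, and operators are illustrative assumptions:

```python
import random

# Toy surrogate: maps an NPI stringency vector (one value in [0, 1] per
# intervention category) to (predicted cases, economic cost). A real system
# would plug in a trained predictive model here.
def surrogate(strategy):
    cases = sum(1.0 - s for s in strategy)   # more stringency -> fewer cases
    cost = sum(strategy)                     # more stringency -> higher cost
    return cases, cost

def fitness(strategy, case_weight=0.7):
    """Scalarized objective: weighted sum of cases and cost (lower is better)."""
    cases, cost = surrogate(strategy)
    return case_weight * cases + (1.0 - case_weight) * cost

def evolve(n_categories=8, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_categories)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                # keep the better half as parents
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2.0 for x, y in zip(a, b)]   # averaging crossover
            i = rng.randrange(n_categories)
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0.0, 0.1)))  # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```

With the toy surrogate above, evolution pushes stringency upward because the case weight (0.7) outweighs the cost weight; a real surrogate would produce richer tradeoffs.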


Citations

9 results found


Open access · Journal Article · DOI: 10.1016/J.CHAOS.2020.110338
Mohammad-H. Tayarani N.
Abstract: Colloquially known as coronavirus, the Severe Acute Respiratory Syndrome CoronaVirus 2 (SARS-CoV-2), which causes CoronaVirus Disease 2019 (COVID-19), has become a matter of grave concern for every country around the world. The rapid growth of the pandemic has wreaked havoc and prompted the need for immediate reactions to curb its effects. To manage the problem, research in many areas of science has begun studying the issue. Artificial Intelligence is among the areas of science that have found great application in tackling the problem in many aspects. Here, we provide an overview of the applications of AI in a variety of fields, including diagnosis of the disease via different types of tests and symptoms, monitoring patients, identifying the severity of a patient's condition, processing COVID-19-related imaging tests, epidemiology, pharmaceutical studies, etc. The aim of this paper is to perform a comprehensive survey of the applications of AI in battling the difficulties the outbreak has caused, covering every way AI approaches have been employed and all the research up to the writing of this paper. We organize the works so that the overall picture is comprehensible. Such a picture, although full of details, is very helpful in understanding where AI sits in the current pandemonium. We conclude the paper with ideas on how the problems can be tackled in a better way and provide some suggestions for future work.


57 Citations


Book Chapter · DOI: 10.1007/978-3-030-86514-6_24
10 Sep 2021
Abstract: In this paper, we describe the deep learning-based COVID-19 cases predictor and the Pareto-optimal Non-Pharmaceutical Intervention (NPI) prescriptor developed by the winning team of the 500k XPRIZE Pandemic Response Challenge, a four-month global competition organized by the XPRIZE Foundation. The competition aimed at developing data-driven AI models to predict COVID-19 infection rates and to prescribe NPI Plans that governments, business leaders and organizations could implement to minimize harm when reopening their economies. In addition to the validation performed by XPRIZE with real data, the winning models were validated in a real-world scenario thanks to an ongoing collaboration with the Valencian Government in Spain. We believe that this experience contributes to the necessary transition to more evidence-driven policy-making, particularly during a pandemic.


2 Citations


Open access · Journal Article · DOI: 10.3390/IJERPH18189555
Arielle Kaim, Tuvia Gering, Amiram Moshaiov, +1 more
Abstract: Lessons learnt from the initial stages of the COVID-19 outbreak indicate the need for a more coordinated economic and public health response. While social distancing has been shown to be effective as a non-pharmaceutical intervention (NPI) measure to mitigate the spread of COVID-19, the economic costs have been substantial. Insights combining epidemiological and economic data provide new theoretical predictions that can be used to better understand the health-economy tradeoffs. This literature review aims to elucidate perspectives to assist policy implementation related to the management of the ongoing and impending outbreaks regarding the Health Economic Dilemma (HED). This review unveiled the need for information-based decision-support systems that combine pandemic spread modelling and control with economic models. It is expected that the current review will not only support policy makers but will also provide researchers developing related decision-support systems with comprehensive information on the various aspects of the HED.

... read more


1 Citation


Open access · Posted Content
Reza Sameni
Abstract: A model-based signal processing framework is proposed for pandemic trend forecasting and control by using non-pharmaceutical interventions (NPI) at regional and country levels worldwide. The control objective is to prescribe quantifiable NPI strategies at different levels of stringency, which balance between human factors (such as new cases and death rates) and the cost of intervention per region/country. Due to the significant differences in infrastructures and priorities of regions and countries, strategists are given the flexibility to weight between different NPIs, and to select the desired balance between the human factor and overall NPI cost. The proposed framework is based on a finite-horizon optimal control (FHOC) formulation of the bi-objective problem, and the FHOC is numerically solved by using an ad hoc extended Kalman filtering/smoothing framework. The algorithm enables strategists to select the desired balance between the human factor and NPI cost with a set of weights and parameters. The parameters of the model are partially selected by epidemiological facts from COVID-19 studies, and partially trained by using machine learning techniques. The developed algorithm is applied to real global data from the Oxford COVID-19 Government Response Tracker project, which has categorized and quantified the regional responses to the pandemic for more than 300 countries and regions worldwide since January 2020. This dataset has been used for NPI-based prediction and prescription during the XPRIZE Pandemic Response Challenge. The source code developed for the proposed method is provided online.
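The bi-objective tradeoff this abstract describes (human factors versus NPI cost over a planning horizon) can be illustrated with a toy finite-horizon search. This is not the paper's EKF-based solver; the linear case-response model, cost scale, and weights below are invented purely for illustration:

```python
from itertools import product

# Toy finite-horizon planner (NOT the paper's EKF-based solver): choose one
# stringency level u per step to minimize a weighted sum of a hypothetical
# linear case response and the intervention cost.
def plan(horizon=4, levels=(0.0, 0.5, 1.0), w_human=0.7, cases0=100.0):
    best_plan, best_cost = None, float("inf")
    for npi in product(levels, repeat=horizon):   # exhaustive over 3^4 plans
        cases, total = cases0, 0.0
        for u in npi:
            cases *= 1.2 - 0.5 * u                # assumed: stringency slows growth
            total += w_human * cases + (1.0 - w_human) * 100.0 * u
        if total < best_cost:
            best_plan, best_cost = npi, total
    return best_plan, best_cost

best_plan, best_cost = plan()
```

Changing `w_human` shifts the chosen plan along the human-factor/cost tradeoff, which is the same lever the abstract's weights and parameters expose to strategists.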


1 Citation


Open access · Journal Article · DOI: 10.1007/S42979-021-00540-9
Risto Miikkulainen
23 Mar 2021
Abstract: The main power of artificial intelligence is not in modeling what we already know, but in creating solutions that are new. Such solutions exist in extremely large, high-dimensional, and complex search spaces. Population-based search techniques, i.e. variants of evolutionary computation, are well suited to finding them. These techniques make it possible to find creative solutions to practical problems in the real world, making creative AI through evolutionary computation the likely “next deep learning.”



References

57 results found


Open access · Proceedings Article
Diederik P. Kingma, Jimmy Ba
01 Jan 2015
Abstract: We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has low memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and problems with very noisy and/or sparse gradients. The hyper-parameters have intuitive interpretations and typically require little tuning. Some connections to related algorithms, by which Adam was inspired, are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results demonstrate that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we discuss AdaMax, a variant of Adam based on the infinity norm.
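Adam's update rule is compact enough to sketch directly. The hyper-parameter defaults below follow the paper; the toy quadratic objective and the learning rate used in the demo loop are illustrative assumptions:

```python
import math

def adam_step(params, grads, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update; the moment estimates m and v are updated in place."""
    out = []
    for i, (p, g) in enumerate(zip(params, grads)):
        m[i] = b1 * m[i] + (1.0 - b1) * g       # first-moment (mean) estimate
        v[i] = b2 * v[i] + (1.0 - b2) * g * g   # second-moment (uncentered variance)
        m_hat = m[i] / (1.0 - b1 ** t)          # bias correction (t starts at 1)
        v_hat = v[i] / (1.0 - b2 ** t)
        out.append(p - lr * m_hat / (math.sqrt(v_hat) + eps))
    return out

# Demo: minimize the toy objective f(x) = x^2 (gradient 2x), starting at x = 1.
x, m, v = [1.0], [0.0], [0.0]
for t in range(1, 501):
    x = adam_step(x, [2.0 * x[0]], m, v, t, lr=0.05)
```

The per-parameter division by the square root of the second-moment estimate is what makes the step size invariant to diagonal rescaling of the gradients.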



78,539 Citations


Journal Article · DOI: 10.1162/NECO.1997.9.8.1735
Sepp Hochreiter, Jürgen Schmidhuber
01 Nov 1997 · Neural Computation
Abstract: Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
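The multiplicative gating the abstract describes can be sketched for a single scalar cell. The weights below are arbitrary illustrative values, not trained parameters:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM step for a scalar cell; W maps gate name -> (w_x, w_h, bias)."""
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])    # input gate
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])    # forget gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])    # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])  # candidate value
    c = f * c_prev + i * g      # constant-error-carousel state update
    h = o * math.tanh(c)        # gated hidden output
    return h, c

W = {k: (0.5, 0.5, 0.0) for k in ("i", "f", "o", "g")}  # arbitrary demo weights
h, c = 0.0, 0.0
for x in [1.0, 0.0, -1.0]:
    h, c = lstm_step(x, h, c, W)
```

The additive update `c = f * c_prev + i * g` is the constant error carousel: when the forget gate stays near 1, gradients can flow through `c` across many time steps without decaying.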


49,735 Citations


Open access · Journal Article
Abstract: Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems. This package focuses on bringing machine learning to non-specialists using a general-purpose high-level language. Emphasis is put on ease of use, performance, documentation, and API consistency. It has minimal dependencies and is distributed under the simplified BSD license, encouraging its use in both academic and commercial settings. Source code, binaries, and documentation can be downloaded from http://scikit-learn.sourceforge.net.
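The consistent estimator API the abstract emphasizes looks like this in practice (assuming scikit-learn is installed; the choice of `LogisticRegression` and the iris dataset is illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Every scikit-learn estimator follows the same fit/predict/score API.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=200).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)   # mean accuracy on held-out data
```

Swapping `LogisticRegression` for any other classifier leaves the rest of the code unchanged, which is the API-consistency point the abstract makes.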


33,540 Citations


Journal Article · DOI: 10.1109/4235.996017
Abstract: Multi-objective evolutionary algorithms (MOEAs) that use non-dominated sorting and sharing have been criticized mainly for: (1) their O(MN^3) computational complexity (where M is the number of objectives and N is the population size); (2) their non-elitism approach; and (3) the need to specify a sharing parameter. In this paper, we suggest a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates all of the above three difficulties. Specifically, a fast non-dominated sorting approach with O(MN^2) computational complexity is presented. Also, a selection operator is presented that creates a mating pool by combining the parent and offspring populations and selecting the best N solutions (with respect to fitness and spread). Simulation results on difficult test problems show that NSGA-II is able, for most problems, to find a much better spread of solutions and better convergence near the true Pareto-optimal front compared to the Pareto-archived evolution strategy and the strength-Pareto evolutionary algorithm - two other elitist MOEAs that pay special attention to creating a diverse Pareto-optimal front. Moreover, we modify the definition of dominance in order to solve constrained multi-objective problems efficiently. Simulation results of the constrained NSGA-II on a number of test problems, including a five-objective, seven-constraint nonlinear problem, are compared with another constrained multi-objective optimizer, and the much better performance of NSGA-II is observed.
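The fast non-dominated sorting procedure can be sketched directly (minimization convention; the sample points are illustrative):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at
    least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(points):
    """Partition solution indices into Pareto fronts, as in NSGA-II."""
    n = len(points)
    dominated_by = [[] for _ in range(n)]   # solutions each point dominates
    dom_count = [0] * n                     # how many points dominate each point
    fronts = [[]]
    for p in range(n):
        for q in range(n):
            if dominates(points[p], points[q]):
                dominated_by[p].append(q)
            elif dominates(points[q], points[p]):
                dom_count[p] += 1
        if dom_count[p] == 0:               # undominated -> first front
            fronts[0].append(p)
    i = 0
    while fronts[i]:
        nxt = []
        for p in fronts[i]:
            for q in dominated_by[p]:
                dom_count[q] -= 1           # peel off the current front
                if dom_count[q] == 0:
                    nxt.append(q)
        fronts.append(nxt)
        i += 1
    return fronts[:-1]

pts = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
fronts = fast_nondominated_sort(pts)
```

The pairwise dominance pass costs O(MN^2) comparisons, and the front-peeling loop then assigns every solution a rank without re-scanning the whole population.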



30,928 Citations


Open access · Journal Article · DOI: 10.1098/RSPA.1927.0118
Abstract: (1) One of the most striking features in the study of epidemics is the difficulty of finding a causal factor which appears to be adequate to account for the magnitude of the frequent epidemics of disease which visit almost every population. It was with a view to obtaining more insight regarding the effects of the various factors which govern the spread of contagious epidemics that the present investigation was undertaken. Reference may here be made to the work of Ross and Hudson (1915-17) in which the same problem is attacked. The problem is here carried to a further stage, and it is considered from a point of view which is in one sense more general. The problem may be summarised as follows: One (or more) infected person is introduced into a community of individuals, more or less susceptible to the disease in question. The disease spreads from the affected to the unaffected by contact infection. Each infected person runs through the course of his sickness, and finally is removed from the number of those who are sick, by recovery or by death. The chances of recovery or death vary from day to day during the course of his illness. The chances that the affected may convey infection to the unaffected are likewise dependent upon the stage of the sickness. As the epidemic spreads, the number of unaffected members of the community becomes reduced. Since the course of an epidemic is short compared with the life of an individual, the population may be considered as remaining constant, except in as far as it is modified by deaths due to the epidemic disease itself. In the course of time the epidemic may come to an end. One of the most important problems in epidemiology is to ascertain whether this termination occurs only when no susceptible individuals are left, or whether the interplay of the various factors of infectivity, recovery and mortality, may result in termination, whilst many susceptible individuals are still present in the unaffected population.
It is difficult to treat this problem in its most general aspect. In the present communication discussion will be limited to the case in which all members of the community are initially equally susceptible to the disease, and it will be further assumed that complete immunity is conferred by a single infection.
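The dynamics described above became the classical SIR model, which can be sketched with a forward-Euler integration; the parameter values below (beta = 0.3, gamma = 0.1, i.e., a basic reproduction number of 3) are illustrative, and the run demonstrates the abstract's key question: the epidemic terminates while susceptibles remain.

```python
def sir_step(s, i, r, beta, gamma, dt=0.1):
    """One forward-Euler step of the Kermack-McKendrick SIR equations."""
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return s + ds * dt, i + di * dt, r + dr * dt

def simulate(beta=0.3, gamma=0.1, i0=1e-3, steps=2000):
    """Simulate an epidemic with R0 = beta/gamma = 3 in a normalized population."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = simulate()
# With R0 = 3 the epidemic dies out while a few percent of the population is
# still susceptible, answering the question posed in the abstract.
```

Because `ds + di + dr = 0` at every step, the total population is conserved exactly by the scheme, matching the constant-population assumption in the text.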



7,409 Citations


Performance Metrics
No. of citations received by the paper in previous years:

Year | Citations
2021 | 9