Author

Richard Allmendinger

Bio: Richard Allmendinger is an academic researcher at the University of Manchester. The author has contributed to research in topics: Evolutionary algorithm & Evolutionary computation. The author has an h-index of 12 and has co-authored 49 publications receiving 428 citations. Previous affiliations of Richard Allmendinger include University College London & Karlsruhe Institute of Technology.

Papers
Journal ArticleDOI
TL;DR: Emerging complexity-related topics in surrogate-assisted multicriteria optimization that may not be prevalent in nonsurrogate-assisted single-objective optimization are discussed and motivated using several real-world problems in which the authors were involved.
Abstract: Complexity in solving real-world multicriteria optimization problems often stems from the fact that complex, expensive, and/or time-consuming simulation tools or physical experiments are used to evaluate solutions to a problem. In such settings, it is common to use efficient computational models, often known as surrogates or metamodels, to approximate the outcome (objective or constraint function value) of a simulation or physical experiment. The presence of multiple objective functions poses an additional layer of complexity for surrogate-assisted optimization. For example, complexities may relate to the appropriate selection of metamodels for the individual objective functions, extensive training time of surrogate models, or the optimal use of many-core computers to approximate efficiently multiple objectives simultaneously. Thinking out of the box, complexity can also be shifted from approximating the individual objective functions to approximating the entire Pareto front. This leads to further complexities, namely, how to validate statistically and apply the techniques developed to real-world problems. In this paper, we discuss emerging complexity-related topics in surrogate-assisted multicriteria optimization that may not be prevalent in nonsurrogate-assisted single-objective optimization. These complexities are motivated using several real-world problems in which the authors were involved. We then discuss several promising future research directions and prospective solutions to tackle emerging complexities in surrogate-assisted multicriteria optimization. Finally, we provide insights from an industrial point of view into how surrogate-assisted multicriteria optimization techniques can be developed and applied within a collaborative business environment to tackle real-world problems.

65 citations
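A minimal sketch of the per-objective surrogate modelling the abstract refers to, assuming Gaussian-process surrogates (scikit-learn) and two hypothetical placeholder objectives standing in for expensive simulations; this illustrates the general idea only and is not the authors' method:

```python
# Sketch only: one Gaussian-process surrogate per objective for a bi-objective problem.
# f1 and f2 are hypothetical placeholders for expensive simulations or experiments.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f1(x):  # placeholder for expensive objective 1 (minimise)
    return np.sum((x - 0.3) ** 2, axis=1)

def f2(x):  # placeholder for expensive objective 2 (minimise)
    return np.sum((x - 0.7) ** 2, axis=1)

rng = np.random.default_rng(0)
X = rng.random((20, 3))                      # 20 already-evaluated designs in [0, 1]^3
Y = np.column_stack([f1(X), f2(X)])          # their true (expensive) objective values

# One surrogate per objective, one of the design choices the paper discusses.
surrogates = [
    GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, Y[:, i])
    for i in range(Y.shape[1])
]

# Screen many candidates cheaply with the surrogates before any real evaluation.
candidates = rng.random((1000, 3))
predictions = np.column_stack([gp.predict(candidates) for gp in surrogates])

def non_dominated(F):
    # Mask of points not dominated by any other point (minimisation).
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        others = np.delete(F, i, axis=0)
        keep[i] = not np.any(np.all(others <= F[i], axis=1) & np.any(others < F[i], axis=1))
    return keep

front = candidates[non_dominated(predictions)]
print(f"{len(front)} surrogate-predicted non-dominated candidates out of {len(candidates)}")
```

In a full surrogate-assisted loop, the most promising of these candidates would then be evaluated with the real simulator or experiment and the surrogates refitted.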

Journal ArticleDOI
TL;DR: This work used a multi-objective evolutionary algorithm to optimize reagent combinations from a dynamic chemical library of 33 compounds with established or predicted targets in the regulatory network controlling IL-1β expression, providing a powerful and general approach to the discovery of novel combinations of pharmacological agents with potentially greater therapeutic indices than those of single drugs.
Abstract: The control of biochemical fluxes is distributed, and to perturb complex intracellular networks effectively it is often necessary to modulate several steps simultaneously. However, the number of possible permutations leads to a combinatorial explosion in the number of experiments that would have to be performed in a complete analysis. We used a multiobjective evolutionary algorithm to optimize reagent combinations from a dynamic chemical library of 33 compounds with established or predicted targets in the regulatory network controlling IL-1β expression. The evolutionary algorithm converged on excellent solutions within 11 generations, during which we studied just 550 combinations out of the potential search space of ~9 billion. The top five reagents with the greatest contribution to combinatorial effects throughout the evolutionary algorithm were then optimized pairwise. A p38 MAPK inhibitor together with either an inhibitor of IκB kinase or a chelator of poorly liganded iron yielded synergistic inhibition of macrophage IL-1β expression. Evolutionary searches provide a powerful and general approach to the discovery of new combinations of pharmacological agents with therapeutic indices potentially greater than those of single drugs.

63 citations
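A minimal sketch of an evolutionary search over compound combinations in the spirit of the study above; the 33-compound binary encoding matches the abstract, but the scoring function below is a purely hypothetical stand-in for the wet-lab IL-1β assay, and the selection scheme is a generic one rather than the authors' algorithm:

```python
# Sketch only: evolutionary search over binary combination vectors of 33 compounds.
# The scoring function is a hypothetical stand-in for the wet-lab IL-1beta assay
# used in the study; it is not the authors' fitness function.
import numpy as np

rng = np.random.default_rng(1)
N_COMPOUNDS = 33
POP_SIZE, GENERATIONS = 50, 11
MUTATION_RATE = 1.0 / N_COMPOUNDS

hidden_effect = rng.normal(size=N_COMPOUNDS)   # illustrative per-compound effects

def objectives(combo):
    # Objective 1 (minimise): surrogate for residual IL-1beta expression.
    # Objective 2 (minimise): number of compounds in the combination.
    expression = 1.0 / (1.0 + np.exp(combo @ hidden_effect))
    return expression, int(combo.sum())

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

population = rng.integers(0, 2, size=(POP_SIZE, N_COMPOUNDS))
for gen in range(GENERATIONS):
    # Bit-flip mutation produces offspring; parents and offspring then compete.
    offspring = population ^ (rng.random(population.shape) < MUTATION_RATE)
    pool = np.vstack([population, offspring])
    scores = [objectives(ind) for ind in pool]
    # Rank by how many other solutions dominate each one (0 = non-dominated).
    ranks = [sum(dominates(other, s) for other in scores) for s in scores]
    population = pool[np.argsort(ranks)[:POP_SIZE]]

best = min((objectives(ind) for ind in population), key=lambda s: s[0])
print("best surrogate expression score:", round(best[0], 3), "using", best[1], "compounds")
```

With 33 compounds the full combination space has 2^33 (roughly 9 billion) members, which is why the abstract stresses that only 550 combinations were tested over 11 generations.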

Journal ArticleDOI
TL;DR: A framework consisting of an LCA and economic analysis, combined with a sensitivity analysis of manufacturing process parameters and a production scale-up study, is presented to contribute toward the design of more cost-efficient, robust and environmentally-friendly manufacturing processes for monoclonal antibodies (mAbs).
Abstract: Life-cycle assessment (LCA) is an environmental assessment tool that quantifies the environmental impact associated with a product or a process (e.g., water consumption, energy requirements, and solid waste generation). While LCA is a standard approach in many commercial industries, its application has not been exploited widely in the bioprocessing sector. To contribute toward the design of more cost-efficient, robust and environmentally-friendly manufacturing processes for monoclonal antibodies (mAbs), a framework consisting of an LCA and economic analysis combined with a sensitivity analysis of manufacturing process parameters and a production scale-up study is presented. The efficiency of the framework is demonstrated using a comparative study of the two most commonly used upstream configurations for mAb manufacture, namely fed-batch (FB) and perfusion-based processes. Results obtained by the framework are presented using a range of visualization tools, and indicate that a standard perfusion process (with a pooling duration of 4 days) has a similar cost of goods to a FB process but a larger environmental footprint because it consumed 35% more water, demanded 17% more energy, and emitted 17% more CO2 than the FB process. Water consumption was the most important impact category, especially when scaling up the processes, as energy was required to produce process water and water-for-injection, while CO2 was emitted from energy generation. The sensitivity analysis revealed that the perfusion process can be made more environmentally-friendly than the FB process if the pooling duration is extended to 8 days. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1324-1335, 2016.

49 citations
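For orientation, a small sketch of how the relative results quoted in the abstract can be read; only the percentage differences (+35% water, +17% energy, +17% CO2 for the 4-day perfusion process) come from the abstract, while the fed-batch baseline values of 1.0 per category are purely illustrative placeholders:

```python
# Sketch only: reading the relative footprint results reported in the abstract.
# The fed-batch (FB) baselines of 1.0 are arbitrary placeholders; only the
# percentage differences come from the abstract.
fb_baseline = {"water": 1.0, "energy": 1.0, "CO2": 1.0}            # per gram of mAb, arbitrary units
perfusion_increase = {"water": 0.35, "energy": 0.17, "CO2": 0.17}  # relative increase vs FB

for category, base in fb_baseline.items():
    perfusion = base * (1.0 + perfusion_increase[category])
    print(f"{category}: FB = {base:.2f}, perfusion (4-day pooling) = {perfusion:.2f}")
```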

Journal ArticleDOI
TL;DR: This work provides a general problem definition and suitable notation for describing algorithm schemes that can use different evaluation budgets for each objective, and proposes three schemes for the bi-objective version of the problem, including methods that interleave the evaluations of different objectives.

35 citations
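A minimal sketch of the budget asymmetry the TL;DR describes, assuming a cheap objective f1 and an expensive objective f2 with a tenth of the budget; the interleaving rule below (refresh f2 only every k-th evaluation and otherwise reuse the last known value) is a generic illustration, not one of the three schemes proposed in the paper:

```python
# Sketch only: interleaving evaluations of a cheap objective f1 and an expensive
# objective f2 that has a much smaller evaluation budget.
import numpy as np

rng = np.random.default_rng(2)

def f1(x):                      # cheap objective: evaluated at every step
    return float(np.sum(x ** 2))

def f2(x):                      # expensive objective: strictly limited budget
    return float(np.sum((x - 1.0) ** 2))

BUDGET_F1, BUDGET_F2 = 1000, 100
ratio = BUDGET_F1 // BUDGET_F2          # evaluate f2 once per `ratio` steps

x = rng.random(5)
current_f1, used_f1, used_f2 = f1(x), 1, 0
last_f2 = None
archive = []                            # (solution, f1 value, most recent f2 estimate)

while used_f1 < BUDGET_F1:
    candidate = np.clip(x + rng.normal(scale=0.1, size=5), 0.0, 1.0)
    y1 = f1(candidate)
    used_f1 += 1
    if used_f1 % ratio == 0 and used_f2 < BUDGET_F2:
        last_f2 = f2(candidate)         # refresh the expensive objective only occasionally
        used_f2 += 1
    archive.append((candidate, y1, last_f2))
    if y1 <= current_f1:                # simple hill climb on the cheap objective
        x, current_f1 = candidate, y1

print(f"f1 evaluations: {used_f1}, f2 evaluations: {used_f2}, archive size: {len(archive)}")
```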

Proceedings ArticleDOI
01 Jul 2017
TL;DR: A new EGO-based algorithm is introduced which tries to overcome common issues with Kriging optimization algorithms, mainly: early stagnation, problems with multiple active constraints and frequent crashes.
Abstract: Real-world optimization problems are often subject to several constraints which are expensive to evaluate in terms of cost or time. Although a lot of effort is devoted to making use of surrogate models for expensive optimization tasks, not many strong surrogate-assisted algorithms can address challenging constrained problems. Efficient Global Optimization (EGO) is a Kriging-based surrogate-assisted algorithm. It was originally proposed to address unconstrained problems and was later modified to solve constrained problems. However, these types of algorithms still suffer from several issues, mainly: (1) early stagnation, (2) problems with multiple active constraints, and (3) frequent crashes. In this work, we introduce a new EGO-based algorithm which tries to overcome these common issues with Kriging optimization algorithms. We apply the proposed algorithm to problems with dimension d ≤ 4 from the G-function suite [16] and to an airfoil shape example.

34 citations
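A minimal sketch of one step of a Kriging-based (EGO-style) constrained search, using expected improvement multiplied by the probability of feasibility; the objective, constraint, and candidate-screening strategy are hypothetical illustrations of the general approach the abstract builds on, not the new algorithm the paper introduces:

```python
# Sketch only: one constrained EGO-style iteration with Gaussian-process surrogates.
# Acquisition = expected improvement (EI) x probability of feasibility (PoF).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                 # hypothetical expensive objective (minimise)
    return np.sum((x - 0.25) ** 2, axis=1)

def constraint(x):                # hypothetical expensive constraint, feasible if <= 0
    return 0.4 - np.sum(x, axis=1)

rng = np.random.default_rng(3)
X = rng.random((15, 2))                           # initial design in [0, 1]^2
y_obj, y_con = objective(X), constraint(X)

gp_obj = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y_obj)
gp_con = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y_con)

feasible = y_con <= 0
best = y_obj[feasible].min() if feasible.any() else y_obj.min()

candidates = rng.random((5000, 2))
mu, sigma = gp_obj.predict(candidates, return_std=True)
mu_c, sigma_c = gp_con.predict(candidates, return_std=True)

sigma = np.maximum(sigma, 1e-12)
z = (best - mu) / sigma
ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)        # expected improvement
pof = norm.cdf((0.0 - mu_c) / np.maximum(sigma_c, 1e-12))   # probability of feasibility

next_x = candidates[np.argmax(ei * pof)]
print("next point to evaluate with the expensive simulator:", np.round(next_x, 3))
```

In a full run, the chosen point would be evaluated with the real objective and constraint, appended to the design, and the surrogates refitted until the budget is exhausted.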


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI

6,278 citations

Journal ArticleDOI
TL;DR: In this paper, the authors provide guidelines for QSAR development, validation, and application, which are summarized in best practices for building rigorously validated and externally predictive quantitative structure-activity relationship models.
Abstract: Quantitative structure–activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this paper, we discuss (i) the development and evolution of QSAR; (ii) the current trends, unsolved problems, and pressing challenges; and (iii) several novel and emerging applications of QSAR modeling. Throughout this discussion, we provide guidelines for QSAR development, validation, and application, which are summarized in best practices for building rigorously validated and externally predictive QSAR models. We hope that this Perspective will help communications between computational and experimental chemists toward collaborative development and use of QSAR models. We also believe that the guidelines presented here will help journal editors and reviewers apply more stringent scientific standards to manuscripts reporting new QSAR studies.

1,314 citations
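A minimal sketch of the validation discipline the Perspective advocates, internal cross-validation plus a held-out external test set; the descriptors, activities, and random-forest model below are arbitrary placeholders rather than a recommendation from the paper:

```python
# Sketch only: skeleton of a QSAR workflow with internal and external validation.
# Descriptors and activities are random placeholders, not real chemical data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score, train_test_split

rng = np.random.default_rng(4)
X = rng.random((200, 50))                                      # 200 compounds x 50 descriptors
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=200)     # synthetic "activity"

# External validation set is held out BEFORE any model building.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)

# Internal validation: cross-validated R^2 on the training set only.
cv_r2 = cross_val_score(model, X_train, y_train, cv=5, scoring="r2")
model.fit(X_train, y_train)

# External validation: predictive R^2 on compounds never seen during training.
external_r2 = model.score(X_test, y_test)
print(f"internal 5-fold R^2: {cv_r2.mean():.2f} +/- {cv_r2.std():.2f}")
print(f"external R^2:        {external_r2:.2f}")
```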

Journal ArticleDOI
TL;DR: It is shown how network techniques can help in the identification of single-target, edgetic, multi-target and allo-network drug target candidates; an optimized protocol of network-aided drug development is suggested, and a list of systems-level hallmarks of drug quality is provided.

806 citations