Author

Graham Kendall

Bio: Graham Kendall is an academic researcher from the University of Nottingham Malaysia Campus. He has contributed to research on topics including Heuristics and Heuristic, has an h-index of 60, and has co-authored 292 publications receiving 13,452 citations. Previous affiliations of Graham Kendall include Universities UK and the University of Bradford.


Papers
Journal ArticleDOI
TL;DR: A critical discussion of the scientific literature on hyper-heuristics including their origin and intellectual roots, a detailed account of the main types of approaches, and an overview of some related areas are presented.
Abstract: Hyper-heuristics comprise a set of approaches that are motivated (at least in part) by the goal of automating the design of heuristic methods to solve hard computational search problems. An underlying strategic research challenge is to develop more generally applicable search methodologies. The term hyper-heuristic is relatively new; it was first used in 2000 to describe heuristics to choose heuristics in the context of combinatorial optimisation. However, the idea of automating the design of heuristics is not new; it can be traced back to the 1960s. The definition of hyper-heuristics has been recently extended to refer to a search method or learning mechanism for selecting or generating heuristics to solve computational search problems. Two main hyper-heuristic categories can be considered: heuristic selection and heuristic generation. The distinguishing feature of hyper-heuristics is that they operate on a search space of heuristics (or heuristic components) rather than directly on the search space of solutions to the underlying problem that is being addressed. This paper presents a critical discussion of the scientific literature on hyper-heuristics including their origin and intellectual roots, a detailed account of the main types of approaches, and an overview of some related areas. Current research trends and directions for future research are also discussed.
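
To make the distinction between the two categories concrete, the sketch below (not code from the paper; the low-level move operators, OneMax objective and credit scheme are illustrative assumptions) shows a selection hyper-heuristic: the search chooses among low-level heuristics, i.e. it operates on a space of heuristics rather than directly on the space of solutions.

```python
# Illustrative sketch only (not code from the paper): a selection hyper-heuristic
# that searches over a space of low-level heuristics rather than directly over
# solutions. The move operators, OneMax objective and credit scheme are assumed.
import random

def flip_one(bits):   # low-level heuristic 1: flip a single random bit
    i = random.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]

def flip_two(bits):   # low-level heuristic 2: flip two random bits
    return flip_one(flip_one(bits))

def swap(bits):       # low-level heuristic 3: swap two random positions
    i, j = random.sample(range(len(bits)), 2)
    out = bits[:]
    out[i], out[j] = out[j], out[i]
    return out

def onemax(bits):     # toy objective: maximise the number of ones
    return sum(bits)

def selection_hyper_heuristic(heuristics, evaluate, n=64, iters=2000):
    solution = [random.randint(0, 1) for _ in range(n)]
    scores = {h: 1.0 for h in heuristics}              # credit for each heuristic
    for _ in range(iters):
        # Choose a heuristic in proportion to its accumulated credit.
        h = random.choices(heuristics, weights=[scores[x] for x in heuristics])[0]
        candidate = h(solution)
        if evaluate(candidate) >= evaluate(solution):  # accept non-worsening moves
            solution = candidate
            scores[h] += 1.0                           # reward the chosen heuristic
        else:
            scores[h] = max(0.1, scores[h] - 0.5)      # penalise, but keep it selectable
    return solution

best = selection_hyper_heuristic([flip_one, flip_two, swap], onemax)
print(onemax(best))
```

A generation hyper-heuristic would instead construct new heuristics from components (for example, by composing or parameterising the move operators), but the defining property is the same: the search space is a space of heuristics.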

1,023 citations

BookDOI
01 Jan 2003
TL;DR: This work extends the application of CPSO to the dynamic problem by considering a bi-modal parabolic environment of high spatial and temporal severity, and suggests that charged swarms perform best in the extreme cases, but neutral swarms are better optimizers in milder environments.
Abstract: Charged particle swarm optimization (CPSO) is well suited to the dynamic search problem since inter-particle repulsion maintains population diversity and good tracking can be achieved with a simple algorithm. This work extends the application of CPSO to the dynamic problem by considering a bi-modal parabolic environment of high spatial and temporal severity. Two types of charged swarms and an adapted neutral swarm are compared for a number of different dynamic environments which include extreme ‘needle-in-the-haystack’ cases. The results suggest that charged swarms perform best in the extreme cases, but neutral swarms are better optimizers in milder environments.
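
As a rough sketch of the mechanism described above, a charged particle's velocity update adds a Coulomb-style inter-particle repulsion term to the standard PSO update, which is what maintains population diversity. The charge magnitude, core/limit radii and inertia/acceleration constants below are assumptions for illustration, not the settings used in the study.

```python
# Illustrative sketch of a charged-particle velocity update in the spirit of CPSO.
# The charge magnitude, core/limit radii and inertia/acceleration constants are
# assumptions for illustration, not the settings used in the study.
import numpy as np

def repulsion(xi, positions, charge=16.0, r_core=1.0, r_limit=30.0):
    """Coulomb-like inter-particle repulsion that maintains swarm diversity."""
    force = np.zeros_like(xi)
    for xj in positions:
        diff = xi - xj
        dist = np.linalg.norm(diff)
        if 0 < dist < r_limit:                        # ignore self and far-away particles
            dist = max(dist, r_core)                  # clamp to avoid singularities
            force += (charge * charge / dist**3) * diff
    return force

def cpso_velocity(v, x, p_best, g_best, positions,
                  w=0.729, c1=1.494, c2=1.494, rng=np.random.default_rng()):
    """Standard PSO velocity update plus the repulsion term for charged particles."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    return (w * v
            + c1 * r1 * (p_best - x)
            + c2 * r2 * (g_best - x)
            + repulsion(x, positions))
```

Neutral particles simply omit the repulsion term; the abstract's comparison of charged and neutral swarms comes down to whether this diversity-preserving force is present.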

873 citations

Book ChapterDOI
01 Jan 2003
TL;DR: This chapter introduces and overviews an emerging methodology in search and optimisation called hyperheuristics, which aims to raise the level of generality at which optimisation systems can operate and will lead to more general systems that are able to handle a wide range of problem domains.
Abstract: This chapter introduces and overviews an emerging methodology in search and optimisation. One of the key aims of these new approaches, which have been termed hyper-heuristics, is to raise the level of generality at which optimisation systems can operate. An objective is that hyper-heuristics will lead to more general systems able to handle a wide range of problem domains, in contrast to current meta-heuristic technology, which tends to be customised to a particular problem or a narrow class of problems. Hyper-heuristics are broadly concerned with intelligently choosing the right heuristic or algorithm in a given situation. Of course, a hyper-heuristic can be (and often is) a (meta-)heuristic, and it can operate on (meta-)heuristics. In a certain sense, a hyper-heuristic works at a higher level than the typical application of meta-heuristics to optimisation problems, i.e., a hyper-heuristic could be thought of as a (meta-)heuristic which operates on lower-level (meta-)heuristics. This chapter introduces the idea, gives a brief history of this emerging area, and reviews some of the latest work published in the field.
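
A minimal sketch of this layering, assuming two toy lower-level procedures (a hill-climbing step and a simulated-annealing-style step) and a trivial alternation rule, neither of which comes from the chapter: the hyper-heuristic treats the lower-level (meta-)heuristics as black boxes and only decides which one to invoke.

```python
# Illustrative sketch only: a hyper-heuristic operating on lower-level
# (meta-)heuristics treated as black boxes. The two low-level procedures and
# the simple alternation rule are assumptions, not methods from the chapter.
import math
import random

def hill_climb_step(solution, evaluate, neighbour):
    candidate = neighbour(solution)
    return candidate if evaluate(candidate) >= evaluate(solution) else solution

def annealing_step(solution, evaluate, neighbour, temperature=1.0):
    candidate = neighbour(solution)
    delta = evaluate(candidate) - evaluate(solution)
    if delta >= 0 or random.random() < math.exp(delta / temperature):
        return candidate
    return solution

def hyper_heuristic(solution, evaluate, neighbour, iters=1000):
    low_level = [hill_climb_step, annealing_step]
    for t in range(iters):
        # Trivially alternate between the two; a real hyper-heuristic would
        # learn which lower-level heuristic to invoke in a given situation.
        step = low_level[t % len(low_level)]
        solution = step(solution, evaluate, neighbour)
    return solution

# Toy usage: maximise the number of ones in a bit string.
def flip_random_bit(bits):
    i = random.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]

best = hyper_heuristic([random.randint(0, 1) for _ in range(32)], sum, flip_random_bit)
print(sum(best))
```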

691 citations

Book
30 Oct 2013
TL;DR: The first edition of Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques was originally put together to offer a basic introduction to the various search and optimization techniques that students might need to use during their research, and this new edition continues this tradition.
Abstract: The first edition of Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques was originally put together to offer a basic introduction to the various search and optimization techniques that students might need to use during their research, and this new edition continues that tradition. Search Methodologies has been expanded and brought completely up to date, including new chapters covering scatter search, GRASP, and very large neighborhood search. The chapter authors are drawn from across Computer Science and Operations Research and include some of the world's leading authorities in their field. The book provides useful guidelines for implementing the methods and frameworks described and offers valuable tutorials to students and researchers in the field.

"As I embarked on the pleasant journey of reading through the chapters of this book, I became convinced that this is one of the best sources of introductory material on the search methodologies topic to be found. The book's subtitle, Introductory Tutorials in Optimization and Decision Support Techniques, aptly describes its aim, and the editors and contributors to this volume have achieved this aim with remarkable success. The chapters in this book are exemplary in giving useful guidelines for implementing the methods and frameworks described." (Fred Glover, Leeds School of Business, University of Colorado Boulder, USA)

"[The book] aims to present a series of well written tutorials by the leading experts in their fields. Moreover, it does this by covering practically the whole possible range of topics in the discipline. It enables students and practitioners to study and appreciate the beauty and the power of some of the computational search techniques that are able to effectively navigate through search spaces that are sometimes inconceivably large. I am convinced that this second edition will build on the success of the first edition and that it will prove to be just as popular." (Jacek Blazewicz, Institute of Computing Science, Poznan University of Technology and Institute of Bioorganic Chemistry, Polish Academy of Sciences)

602 citations

Journal ArticleDOI
TL;DR: It is demonstrated that this tabu-search hyperheuristic is an easily re-usable method which can produce solutions of at least acceptable quality across a variety of problems and instances and is fundamentally more general than state-of-the-art problem-specific techniques.
Abstract: Hyperheuristics can be defined as heuristics that choose between heuristics in order to solve a given optimisation problem. The main motivation behind the development of such approaches is the goal of developing automated scheduling methods which are not restricted to one problem. In this paper we report an investigation of a hyperheuristic approach and evaluate it on various instances of two distinct timetabling and rostering problems. In the framework of our hyperheuristic approach, heuristics compete using rules based on the principles of reinforcement learning. A tabu list of heuristics is also maintained, which prevents certain heuristics from being chosen at certain times during the search. We demonstrate that this tabu-search hyperheuristic is an easily re-usable method which can produce solutions of at least acceptable quality across a variety of problems and instances. In effect, the proposed method is capable of producing solutions that are competitive with those obtained using state-of-the-art problem-specific techniques for the problems studied here, but is fundamentally more general than those techniques.
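
The following sketch is in the spirit of the approach described above, not the authors' implementation: low-level heuristics compete through simple reinforcement-style scores, and a tabu list of heuristics temporarily excludes recently punished ones. The score updates, tabu tenure and greedy choice rule are assumptions.

```python
# Sketch in the spirit of the approach described above (not the authors'
# implementation): low-level heuristics compete via reinforcement-style scores,
# and a tabu list of heuristics temporarily excludes recently punished ones.
# The score updates, tabu tenure and greedy choice rule are assumptions.
import random
from collections import deque

def tabu_hyper_heuristic(heuristics, evaluate, initial, iters=1000, tenure=3):
    solution = initial
    scores = {h: 0 for h in heuristics}            # reinforcement-style credit
    tabu = deque(maxlen=tenure)                    # recently excluded heuristics
    for _ in range(iters):
        allowed = [h for h in heuristics if h not in tabu] or list(heuristics)
        # Greedily pick the highest-scoring non-tabu heuristic, breaking ties at random.
        h = max(allowed, key=lambda x: (scores[x], random.random()))
        candidate = h(solution)
        if evaluate(candidate) > evaluate(solution):
            solution = candidate
            scores[h] += 1                         # reward an improving heuristic
        else:
            scores[h] -= 1                         # punish it and make it tabu for a while
            tabu.append(h)
    return solution
```

For timetabling or rostering, the low-level heuristics passed in would be problem-specific moves (for instance, reassigning an event or a shift); the hyper-heuristic layer itself only sees their scores, which is what makes the method re-usable across problems.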

551 citations


Cited by
Journal ArticleDOI
TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, as well as indirect search for short programs encoding deep and large networks.

14,635 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. (Margaret J. Robertson)

13,333 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
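
A minimal sketch of the input-to-output mapping idea, assuming a toy "spam filtering" feature representation and a 1-nearest-neighbour rule (neither is prescribed by the article): labelled examples are enough for a learner to induce a classification rule for new inputs.

```python
# Minimal sketch of learning a mapping from inputs to outputs from labelled
# examples. The toy "spam" features and the 1-nearest-neighbour rule are
# illustrative assumptions, not methods discussed in the article.
def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_neighbour_predict(train, x):
    """train: list of (feature_vector, label) pairs; returns the label of the closest example."""
    return min(train, key=lambda example: squared_distance(example[0], x))[1]

# Features: (count of suspicious words, count of exclamation marks).
train = [((5, 4), "spam"), ((6, 7), "spam"), ((0, 1), "ham"), ((1, 0), "ham")]
print(nearest_neighbour_predict(train, (4, 3)))   # -> spam
print(nearest_neighbour_predict(train, (0, 0)))   # -> ham
```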

13,246 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions and linear models for regression and classification are covered in this book, along with a discussion of combining models in the context of machine learning and classification.
Abstract: Contents: Probability Distributions; Linear Models for Regression; Linear Models for Classification; Neural Networks; Kernel Methods; Sparse Kernel Machines; Graphical Models; Mixture Models and EM; Approximate Inference; Sampling Methods; Continuous Latent Variables; Sequential Data; Combining Models.

10,141 citations

Book
01 Jan 1995
TL;DR: In this book, Nonaka and Takeuchi argue that Japanese firms are successful precisely because they are innovative: they create new knowledge and use it to produce successful products and technologies. The book reveals how Japanese companies translate tacit knowledge into explicit knowledge.
Abstract: How has Japan become a major economic power, a world leader in the automotive and electronics industries? What is the secret of their success? The consensus has been that, though the Japanese are not particularly innovative, they are exceptionally skilful at imitation, at improving products that already exist. But now two leading Japanese business experts, Ikujiro Nonaka and Hirotaka Takeuchi, turn this conventional wisdom on its head: Japanese firms are successful, they contend, precisely because they are innovative, because they create new knowledge and use it to produce successful products and technologies. Examining case studies drawn from such firms as Honda, Canon, Matsushita, NEC, 3M, GE, and the U.S. Marines, this book reveals how Japanese companies translate tacit knowledge into explicit knowledge and use it to produce new processes, products, and services.

7,448 citations