Author

Michael R. Fellows

Other affiliations: Durham University, University of Idaho, Newcastle University
Bio: Michael R. Fellows is an academic researcher from the University of Bergen. The author has contributed to research in topics including parameterized complexity and vertex cover, has an h-index of 67, and has co-authored 311 publications receiving 18,287 citations. Previous affiliations of Michael R. Fellows include Durham University and the University of Idaho.


Papers
Book
06 Nov 1998
TL;DR: An approach to complexity theory that offers a means of analysing algorithms in terms of their tractability, introducing readers to new classes of algorithms that can be analysed more precisely than was previously possible.
Abstract: An approach to complexity theory which offers a means of analysing algorithms in terms of their tractability. The authors consider problems in terms of parameterized languages, taking "k-slices" of a language, thus introducing readers to new classes of algorithms which may be analysed more precisely than was the case until now. The book is as self-contained as possible and includes a great deal of background material. As a result, computer scientists, mathematicians, and graduate students interested in the design and analysis of algorithms will find much of interest.
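To make the fixed-parameter viewpoint concrete, here is a minimal sketch (my own illustration, not code from the book) of the classic bounded-search-tree algorithm for k-VERTEX COVER, a standard first example of fixed-parameter tractability; the function name and edge-list representation are assumptions of the example.

    def vertex_cover_branching(edges, k):
        """Decide whether the graph given as an edge list has a vertex cover of size <= k.

        Bounded-search-tree idea: every cover must contain an endpoint of each
        edge, so branch on the two endpoints of some uncovered edge. The search
        tree has depth at most k, giving roughly O(2^k * m) time: polynomial on
        every fixed "k-slice" of the problem.
        """
        if not edges:
            return True
        if k == 0:
            return False
        u, v = edges[0]
        remaining_u = [(a, b) for (a, b) in edges if u not in (a, b)]  # put u in the cover
        remaining_v = [(a, b) for (a, b) in edges if v not in (a, b)]  # put v in the cover
        return vertex_cover_branching(remaining_u, k - 1) or vertex_cover_branching(remaining_v, k - 1)

    # A 4-cycle has a vertex cover of size 2 but none of size 1.
    cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
    assert vertex_cover_branching(cycle, 2)
    assert not vertex_cover_branching(cycle, 1)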

3,651 citations

Book
06 Dec 2013
TL;DR: This comprehensive and self-contained textbook presents an accessible overview of the state of the art of multivariate algorithmics and complexity, enabling the reader who masters the complexity issues under discussion to use the positive and negative toolkits in their own research.
Abstract: This comprehensive and self-contained textbook presents an accessible overview of the state of the art of multivariate algorithmics and complexity. Increasingly, multivariate algorithmics is having significant practical impact in many application domains, with even more developments on the horizon. The text describes how the multivariate framework allows an extended dialog with a problem, enabling the reader who masters the complexity issues under discussion to use the positive and negative toolkits in their own research. Features: describes many of the standard algorithmic techniques available for establishing parametric tractability; reviews the classical hardness classes; explores the various limitations and relaxations of the methods; showcases the powerful new lower bound techniques; examines various different algorithmic solutions to the same problems, highlighting the insights to be gained from each approach; demonstrates how complexity methods and ideas have evolved over the past 25 years.
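As a hedged illustration of the "positive toolkit" of data reduction that such a textbook treats, the sketch below shows the classical Buss-style kernelization for k-VERTEX COVER; the code and names are my own simplification, not material from the book.

    def buss_kernel(edges, k):
        """Buss-style kernelization for k-VERTEX COVER (simplified sketch).

        Rule 1: a vertex of degree greater than k must belong to every cover of
        size at most k, so take it and decrease k. Rule 2: once all degrees are
        at most k, a yes-instance can have at most k*k edges. Returns a reduced
        (edges, k) instance, or None if the input is recognised as a no-instance.
        """
        edges = {tuple(e) for e in edges}
        reduced = True
        while reduced and k >= 0:
            reduced = False
            degree = {}
            for u, v in edges:
                degree[u] = degree.get(u, 0) + 1
                degree[v] = degree.get(v, 0) + 1
            for vertex, d in degree.items():
                if d > k:
                    edges = {(u, v) for (u, v) in edges if vertex not in (u, v)}
                    k -= 1
                    reduced = True
                    break
        if k < 0 or len(edges) > k * k:
            return None
        return edges, k

    # A star with 5 leaves and k = 1 reduces to the empty kernel (take the centre).
    star = [(0, i) for i in range(1, 6)]
    print(buss_kernel(star, 1))   # (set(), 0)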

1,435 citations

Journal ArticleDOI
TL;DR: Using the notion of distillation algorithms, a generic lower-bound engine is developed that shows that a variety of FPT problems, fulfilling certain criteria, cannot have polynomial kernels unless the polynomial hierarchy collapses.
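For orientation (my own hedged summary of the standard definitions, not the paper's exact wording), the objects involved can be written as follows:

    A kernelization for a parameterized problem $L \subseteq \Sigma^* \times \mathbb{N}$
    is a polynomial-time map $(x, k) \mapsto (x', k')$ such that
    $(x, k) \in L \iff (x', k') \in L$ and $|x'|, k' \le g(k)$ for some computable $g$;
    the kernel is polynomial when $g$ is a polynomial. The distillation-based
    lower bounds say, roughly, that for suitable compositional NP-hard problems a
    polynomial kernel would imply $\mathrm{coNP} \subseteq \mathrm{NP/poly}$, and
    hence a collapse of the polynomial hierarchy.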

671 citations

Journal ArticleDOI
TL;DR: This work shows that INDEPENDENT SET is complete for W[1]. The W hierarchy of parameterized problems had previously been defined, with complete problems identified for the classes W[t] for t ⩾ 2.

659 citations

Journal ArticleDOI
TL;DR: This paper establishes the main results of a completeness program which addresses the apparent fixed-parameter intractability of many parameterized problems and gives a compendium of currently known hardness results.
Abstract: For many fixed-parameter problems that are trivially solvable in polynomial time, such as ($k$-)DOMINATING SET, essentially no better algorithm is presently known than the one which tries all possible solutions. Other problems, such as ($k$-)FEEDBACK VERTEX SET, exhibit fixed-parameter tractability: for each fixed $k$ the problem is solvable in time bounded by a polynomial of degree $c$, where $c$ is a constant independent of $k$. We establish the main results of a completeness program which addresses the apparent fixed-parameter intractability of many parameterized problems. In particular, we define a hierarchy of classes of parameterized problems $FPT \subseteq W[1] \subseteq W[2] \subseteq \cdots \subseteq W[SAT] \subseteq W[P]$ and identify natural complete problems for $W[t]$ for $t \geq 2$. (In other papers we have shown many problems complete for $W[1]$.) DOMINATING SET is shown to be complete for $W[2]$, and thus is not fixed-parameter tractable unless INDEPENDENT SET, CLIQUE, IRREDUNDANT SET and many other natural problems in $W[2]$ are also fixed-parameter tractable. We also give a compendium of currently known hardness results as an appendix.
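The "try all possible solutions" baseline mentioned in the abstract is easy to state; the sketch below (my own illustration, with assumed function and variable names) spells out the trivial brute force for k-DOMINATING SET, whose running time has the parameter in the exponent.

    from itertools import combinations

    def has_dominating_set(adj, k):
        """Brute force for k-DOMINATING SET: try every vertex subset of size k.

        `adj` maps each vertex to the set of its neighbours. A set D dominates
        the graph when every vertex is in D or adjacent to a vertex of D. The
        running time is about C(n, k) * n, i.e. n^{O(k)}: polynomial for each
        fixed k, but with k in the exponent, which is the behaviour the W
        hierarchy is meant to explain.
        """
        vertices = list(adj)
        for candidate in combinations(vertices, k):
            dominated = set(candidate)
            for v in candidate:
                dominated |= adj[v]
            if dominated == set(vertices):
                return True
        return False

    # A path on 5 vertices is dominated by {1, 3} but by no single vertex.
    path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
    assert has_dominating_set(path, 2)
    assert not has_dominating_set(path, 1)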

497 citations


Cited by
Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
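The mail-filtering example in the fourth category can be made concrete with a toy sketch (my own illustration, not code from the article): a tiny word-frequency filter learned from messages the user kept versus messages the user rejected.

    import math
    from collections import Counter

    def train_filter(kept, rejected):
        """Learn per-word log-odds from example messages (a toy, Naive-Bayes-style filter)."""
        kept_counts, rejected_counts = Counter(), Counter()
        for message in kept:
            kept_counts.update(message.lower().split())
        for message in rejected:
            rejected_counts.update(message.lower().split())
        vocabulary = set(kept_counts) | set(rejected_counts)
        kept_total = sum(kept_counts.values()) + len(vocabulary)
        rejected_total = sum(rejected_counts.values()) + len(vocabulary)
        return {word: math.log((rejected_counts[word] + 1) / rejected_total)
                      - math.log((kept_counts[word] + 1) / kept_total)
                for word in vocabulary}

    def looks_unwanted(model, message):
        """Flag the message if its words are, on balance, more typical of rejected mail."""
        return sum(model.get(word, 0.0) for word in message.lower().split()) > 0

    model = train_filter(kept=["project meeting notes attached"],
                         rejected=["win a free prize now", "free prize offer"])
    print(looks_unwanted(model, "claim your free prize"))    # True
    print(looks_unwanted(model, "notes from the meeting"))   # False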

13,246 citations

01 Apr 1997
TL;DR: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind, with emphasis on the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity.
Abstract: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind. The emphasis is on the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity. Topics covered include an introduction to the concepts in cryptography, attacks against cryptographic systems, key use and handling, random bit generation, encryption modes, and message authentication codes. Recommendations on algorithms and further reading are given at the end of the paper. This paper should enable the reader to build, understand and evaluate system descriptions and designs based on the cryptographic components described in the paper.
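One of the listed topics, message authentication codes, can be illustrated with a short sketch using Python's standard library (my own example, not taken from the paper):

    import hashlib
    import hmac
    import os

    # A shared secret key; in practice it would come from the key-handling
    # procedures the paper discusses, not be generated ad hoc like this.
    key = os.urandom(32)

    def tag(message: bytes) -> bytes:
        """Compute an HMAC-SHA256 tag, providing integrity and authenticity."""
        return hmac.new(key, message, hashlib.sha256).digest()

    def verify(message: bytes, received_tag: bytes) -> bool:
        """Constant-time comparison so timing does not leak information about the tag."""
        return hmac.compare_digest(tag(message), received_tag)

    message = b"transfer 100 to account 42"
    mac = tag(message)
    print(verify(message, mac))                          # True
    print(verify(b"transfer 999 to account 42", mac))    # False: message was altered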

2,188 citations

Book
01 Jan 2006
TL;DR: This book discusses fixed-parameter algorithms, parameterized complexity theory, and selected case studies, along with the algorithmic techniques used in this area.
Abstract:
PART I: FOUNDATIONS. 1. Introduction to Fixed-Parameter Algorithms; 2. Preliminaries and Agreements; 3. Parameterized Complexity Theory - A Primer; 4. Vertex Cover - An Illustrative Example; 5. The Art of Problem Parameterization; 6. Summary and Concluding Remarks.
PART II: ALGORITHMIC METHODS. 7. Data Reduction and Problem Kernels; 8. Depth-Bounded Search Trees; 9. Dynamic Programming; 10. Tree Decompositions of Graphs; 11. Further Advanced Techniques; 12. Summary and Concluding Remarks.
PART III: SOME THEORY, SOME CASE STUDIES. 13. Parameterized Complexity Theory; 14. Connections to Approximation Algorithms; 15. Selected Case Studies; 16. Zukunftsmusik.
References. Index.

1,730 citations

Journal ArticleDOI
TL;DR: For every constant $k$ there is a linear-time algorithm that determines whether the treewidth of a graph $G$ is at most $k$ and, if so, finds a tree-decomposition of $G$ with treewidth at most $k$; consequently, every minor-closed class of graphs that does not contain all planar graphs has a linear-time recognition algorithm.
Abstract: In this paper, we give for constant $k$ a linear-time algorithm that, given a graph $G=(V,E)$, determines whether the treewidth of $G$ is at most $k$ and, if so, finds a tree-decomposition of $G$ with treewidth at most $k$. A consequence is that every minor-closed class of graphs that does not contain all planar graphs has a linear-time recognition algorithm. Another consequence is that a similar result holds when we look instead for path-decompositions with pathwidth at most some constant $k$.
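Bodlaender's linear-time algorithm itself is intricate, but the quantity it decides is easy to approximate from above; the sketch below (my own illustration, not the paper's algorithm) uses the simple min-degree elimination heuristic, which yields an upper bound on treewidth via an elimination ordering.

    def treewidth_upper_bound(adj):
        """Min-degree elimination heuristic: an upper bound on treewidth.

        Repeatedly eliminate a vertex of minimum degree, turning its current
        neighbourhood into a clique; that neighbourhood is the bag the vertex
        would contribute to a tree-decomposition. The largest neighbourhood seen
        bounds the width. This is only a heuristic upper bound, not the paper's
        linear-time exact algorithm for fixed k.
        """
        adj = {v: set(ns) for v, ns in adj.items()}
        width = 0
        while adj:
            v = min(adj, key=lambda u: len(adj[u]))
            neighbours = adj.pop(v)
            width = max(width, len(neighbours))
            for u in neighbours:
                adj[u].discard(v)
                adj[u] |= neighbours - {u}
        return width

    # A 4-cycle has treewidth 2; a path has treewidth 1.
    print(treewidth_upper_bound({0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}))  # 2
    print(treewidth_upper_bound({0: {1}, 1: {0, 2}, 2: {1}}))                   # 1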

1,666 citations