Author

Rolf Niedermeier

Bio: Rolf Niedermeier is an academic researcher from Technical University of Berlin. The author has contributed to research in topics: Parameterized complexity & Vertex (geometry). The author has an h-index of 53 and has co-authored 465 publications receiving 12,338 citations. Previous affiliations of Rolf Niedermeier include AGH University of Science and Technology & Ben-Gurion University of the Negev.


Papers
Book
01 Jan 2006
TL;DR: This book introduces fixed-parameter algorithms and parameterized complexity theory, presents the core algorithmic methods (data reduction and problem kernels, depth-bounded search trees, dynamic programming, and tree decompositions), and concludes with selected case studies.
Abstract:
PART I: FOUNDATIONS
1. Introduction to Fixed-Parameter Algorithms
2. Preliminaries and Agreements
3. Parameterized Complexity Theory - A Primer
4. Vertex Cover - An Illustrative Example
5. The Art of Problem Parameterization
6. Summary and Concluding Remarks
PART II: ALGORITHMIC METHODS
7. Data Reduction and Problem Kernels
8. Depth-Bounded Search Trees
9. Dynamic Programming
10. Tree Decompositions of Graphs
11. Further Advanced Techniques
12. Summary and Concluding Remarks
PART III: SOME THEORY, SOME CASE STUDIES
13. Parameterized Complexity Theory
14. Connections to Approximation Algorithms
15. Selected Case Studies
16. Zukunftsmusik
References
Index
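As an illustration of the book's running example (Vertex Cover, Chapters 4 and 8), here is a minimal Python sketch of a depth-bounded search tree; the function name and the edge-set representation are assumptions of this sketch, not code from the book.

```python
# Minimal sketch of a depth-bounded search tree for Vertex Cover.
# Edges are frozensets of two vertices; names and representation are illustrative.

def vertex_cover_branch(edges, k):
    """Return a vertex cover of size <= k as a set, or None if none exists."""
    if not edges:
        return set()              # nothing left to cover
    if k == 0:
        return None               # edges remain but the budget is exhausted
    u, v = next(iter(edges))      # pick any uncovered edge {u, v}
    for chosen in (u, v):         # at least one endpoint lies in every cover
        rest = {e for e in edges if chosen not in e}
        cover = vertex_cover_branch(rest, k - 1)
        if cover is not None:
            return cover | {chosen}
    return None                   # neither branch succeeded

if __name__ == "__main__":
    # A 5-cycle has vertex cover number 3.
    cycle = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]}
    print(vertex_cover_branch(cycle, 2))  # None
    print(vertex_cover_branch(cycle, 3))  # e.g. {0, 1, 3}
```

The search tree has depth at most k and at most 2^k leaves, which is the source of the f(k) * poly(n) running time characteristic of fixed-parameter algorithms.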

1,730 citations

Journal ArticleDOI
TL;DR: A brief survey that presents data reduction and problem kernelization as a promising research field for algorithm and complexity theory.
Abstract: To solve NP-hard problems, polynomial-time preprocessing is a natural and promising approach. Preprocessing is based on data reduction techniques that take a problem's input instance and try to perform a reduction to a smaller, equivalent problem kernel. Problem kernelization is a methodology that is rooted in parameterized computational complexity. In this brief survey, we present data reduction and problem kernelization as a promising research field for algorithm and complexity theory.
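As a concrete illustration of such a data reduction (the classical two-rule kernelization for Vertex Cover, the textbook example; this is not code from the survey, and the dict-of-sets graph representation is an assumption of the sketch):

```python
# Illustration of data reduction / problem kernelization on Vertex Cover.

def vertex_cover_kernel(adj, k):
    """Reduce (G, k) to an equivalent smaller instance.

    adj: dict vertex -> set of neighbours.  Returns (adj', k', forced), where
    'forced' holds vertices that belong to every cover of size <= k, or None
    if the instance is already recognized as a no-instance.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}
    forced = set()
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if not adj[v]:              # Rule 1: isolated vertices never help
                del adj[v]
                changed = True
            elif len(adj[v]) > k:       # Rule 2: degree > k forces v into the cover
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                forced.add(v)
                k -= 1
                changed = True
                if k < 0:
                    return None
    # After the rules, every degree is at most k, so a yes-instance has at
    # most k*k edges; more remaining edges certify a no-instance.
    if sum(len(nbrs) for nbrs in adj.values()) // 2 > k * k:
        return None
    return adj, k, forced
```

Exhaustively applying the two rules either settles the instance outright or leaves an equivalent instance whose size is bounded by a function of k alone, i.e., a problem kernel.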

406 citations

Journal ArticleDOI
TL;DR: An algorithm is presented that constructively produces a solution to the k-DOMINATING SET problem for planar graphs in time $O(c^{\sqrt{k}}\, n)$, where $c = 4^{6\sqrt{34}}$ and k is the size of the dominating set.
Abstract: We present an algorithm that constructively produces a solution to the k-DOMINATING SET problem for planar graphs in time $O(c^{\sqrt{k}}\, n)$, where $c = 4^{6\sqrt{34}}$. To obtain this result, we show that the treewidth of a planar graph with domination number $\gamma(G)$ is $O(\sqrt{\gamma(G)})$, and that such a tree decomposition can be found in $O(\sqrt{\gamma(G)}\, n)$ time. The same technique can be used to show that the k-FACE COVER problem (find a size-k set of faces that cover all vertices of a given plane graph) can be solved in $O(c_1^{\sqrt{k}}\, n)$ time, where $c_1 = 3^{36\sqrt{34}}$ and k is the size of the face cover set. Similar results can be obtained in the planar case for some variants of k-DOMINATING SET, e.g., k-INDEPENDENT DOMINATING SET and k-WEIGHTED DOMINATING SET.
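The running time composes roughly as sketched below; the treewidth constant is inferred from the stated value of c, and $O(4^{\mathrm{tw}} n)$ is the standard dynamic program for DOMINATING SET on a tree decomposition of width tw, so this is a reading of the bound rather than the paper's exact analysis.

```latex
% Sketch of how the bound composes (constants inferred from the abstract).
\begin{align*}
  \mathrm{tw}(G) &\le 6\sqrt{34}\,\sqrt{\gamma(G)} + O(1), \\
  T(n) &= O\!\bigl(4^{\mathrm{tw}(G)}\, n\bigr)
        = O\!\bigl(4^{\,6\sqrt{34}\,\sqrt{k}}\, n\bigr)
        = O\!\bigl(c^{\sqrt{k}}\, n\bigr), \qquad c = 4^{6\sqrt{34}}.
\end{align*}
```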

291 citations

Journal ArticleDOI
TL;DR: In this article, it was shown that Dominating Set restricted to planar graphs has a problem kernel of linear size, achieved by two simple and easy-to-implement reduction rules.
Abstract: Dealing with the NP-complete Dominating Set problem on graphs, we demonstrate the power of data reduction by preprocessing from a theoretical as well as a practical side. In particular, we prove that Dominating Set restricted to planar graphs has a so-called problem kernel of linear size, achieved by two simple and easy-to-implement reduction rules. Moreover, having implemented our reduction rules, first experiments indicate the impressive practical potential of these rules. Thus, this work seems to open up a new and promising way to cope with one of the most important problems in graph theory and combinatorial optimization.
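For illustration, here is a Python sketch of a "private neighbourhood" rule in the spirit of the single-vertex reduction described above; to stay short it uses an annotated formulation with explicit forced/dominated sets rather than the paper's own construction, and the data structures are assumptions of the sketch.

```python
# Sketch of a single-vertex reduction rule for DOMINATING SET (annotated variant,
# not the paper's exact rule).  adj: dict vertex -> set of neighbours.

def single_vertex_rule(adj, forced, dominated):
    """Apply the rule once for every vertex; return True if the graph changed."""
    changed = False
    for v in list(adj):
        if v not in adj:
            continue                                   # deleted earlier in this pass
        nv = set(adj[v])
        closed = nv | {v}
        n1 = {u for u in nv if adj[u] - closed}        # neighbours seeing outside N[v]
        n2 = {u for u in nv - n1 if adj[u] & n1}       # neighbours seeing only N1
        n3 = nv - n1 - n2                              # neighbours seeing neither
        # Every dominator of a vertex in N3 lies inside N[v], and each candidate
        # w in N2 | N3 satisfies N[w] <= N[v], so an optimal solution may as well
        # take v itself.
        if n3 - dominated:
            forced.add(v)                              # commit v to the dominating set
            dominated.update(closed)
            for u in n2 | n3:                          # dominated and redundant as dominators
                for w in adj[u]:
                    adj[w].discard(u)
                del adj[u]
            changed = True
    return changed

# Typical use: iterate to exhaustion, then solve the smaller remaining instance.
#   while single_vertex_rule(adj, forced, dominated):
#       pass
```

A minimum dominating set of the original graph then has size |forced| plus the optimum of the remaining annotated instance; the paper's rules operate directly on the graph and keep the instance a plain Dominating Set instance.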

251 citations

Journal ArticleDOI
TL;DR: It is shown that the NP-complete Feedback Vertex Set problem, which asks for a smallest set of vertices whose removal from a graph destroys all cycles, is deterministically solvable in $O(c^k \cdot m)$ time, where k is the size of the sought vertex set and m the number of edges.
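To make the problem statement concrete, here is a small brute-force checker; it only illustrates what a feedback vertex set is and is not the paper's $O(c^k \cdot m)$ fixed-parameter algorithm.

```python
# Brute-force illustration of FEEDBACK VERTEX SET (exponential in the number of
# vertices; for illustration only).

from itertools import combinations

def is_forest(vertices, edges):
    """Union-find acyclicity test for the graph induced on 'vertices'."""
    parent = {v: v for v in vertices}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for u, v in edges:
        if u in parent and v in parent:
            ru, rv = find(u), find(v)
            if ru == rv:
                return False                # this edge closes a cycle
            parent[ru] = rv
    return True

def has_feedback_vertex_set(vertices, edges, k):
    """True iff deleting some set of at most k vertices destroys all cycles."""
    vertices = list(vertices)
    return any(
        is_forest(set(vertices) - set(s), edges)
        for r in range(k + 1)
        for s in combinations(vertices, r)
    )

if __name__ == "__main__":
    # Two triangles sharing vertex 0: deleting vertex 0 suffices.
    V = range(5)
    E = [(0, 1), (1, 2), (2, 0), (0, 3), (3, 4), (4, 0)]
    print(has_feedback_vertex_set(V, E, 1))  # True
    print(has_feedback_vertex_set(V, E, 0))  # False
```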

206 citations


Cited by
01 Jan 2006
TL;DR: The Standards provide a framework for sound and ethical test use, noting the effectiveness of high-quality instruments in situations where their use is supported by validation data.
Abstract: Educational and psychological testing and assessment are among the most important contributions of behavioral science to our society, providing fundamental and significant improvements over earlier practices. Although it cannot be claimed that all tests are sufficiently refined, nor that all testing is prudent and useful, a large body of information points to the effectiveness of high-quality instruments in situations where their use is supported by validation data. Proper use of tests can lead to better decisions about individuals and programs than would otherwise be the case, and can also point the way toward broader and fairer access to education and employment. However, poor use of tests can cause considerable harm to test takers and other participants in decision making based on test data. The aim of the Standards is to promote sound and ethical use of tests and to provide a basis for evaluating the quality of testing practices. The purpose of publishing the Standards is to establish criteria for the evaluation of tests, testing practices, and the effects of test use. Although the evaluation of the suitability of a test or its application should rest primarily on professional judgment, the Standards provide a framework to ensure that all relevant issues are addressed. It would be desirable for all authors, sponsors, publishers, and users of professional tests to adopt the Standards and to encourage others to do so as well.

3,905 citations

01 Jul 2004
TL;DR: In this article, the authors describe a center developed to address state-of-the-art research, create innovative educational programs, and support technology transfer of commercially viable results, assisting the Army Research Laboratory in developing the next-generation Future Combat System in the telecommunications sector, with assured prevention of perceived threats and Non-Line-of-Sight/Beyond-Line-of-Sight lethal support.
Abstract: PURPOSE OF THE CENTER: To develop a center that addresses state-of-the-art research, creates innovative educational programs, and supports technology transfer of commercially viable results, assisting the Army Research Laboratory in developing the next-generation Future Combat System in the telecommunications sector, with assured prevention of perceived threats and Non-Line-of-Sight/Beyond-Line-of-Sight lethal support.

1,713 citations

01 Jan 1995
TL;DR: In this paper, the authors present a bibliometric analysis of the journal Scientometrics for 1979-1991, covering paper contents, sections, authors, and editorial board membership, to identify the focus, centers of activity, and development trends of scientometric research.
Abstract: This paper presents a bibliometric analysis of the international scientometrics journal Scientometrics for 1979-1991, covering the contents of its research papers, its sections, its authors and their countries, and its editorial board members and their countries. The analysis reveals the focus of scientometric research, its centers of activity, and its development trends, and illustrates the role of leading scholars in developing scientometrics as an emerging discipline.

1,636 citations

Book
27 Jul 2015
TL;DR: This comprehensive textbook presents a clean and coherent account of the most fundamental tools and techniques in Parameterized Algorithms and is a self-contained guide to the area, providing a toolbox of algorithmic techniques.
Abstract: This comprehensive textbook presents a clean and coherent account of the most fundamental tools and techniques in Parameterized Algorithms and is a self-contained guide to the area. The book covers many of the recent developments of the field, including application of important separators, branching based on linear programming, Cut & Count to obtain faster algorithms on tree decompositions, algorithms based on representative families of matroids, and use of the Strong Exponential Time Hypothesis. A number of older results are revisited and explained in a modern and didactic way. The book provides a toolbox of algorithmic techniques. Part I is an overview of basic techniques, each chapter discussing a certain algorithmic paradigm. The material covered in this part can be used for an introductory course on fixed-parameter tractability. Part II discusses more advanced and specialized algorithmic ideas, bringing the reader to the cutting edge of current research. Part III presents complexity results and lower bounds, giving negative evidence by way of W[1]-hardness, the Exponential Time Hypothesis, and kernelization lower bounds. All the results and concepts are introduced at a level accessible to graduate students and advanced undergraduate students. Every chapter is accompanied by exercises, many with hints, while the bibliographic notes point to original publications and related work.
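For orientation, the negative evidence mentioned for Part III typically takes the following standard forms; these are paraphrased from the general literature, not quoted from the book.

```latex
% Typical shape of the lower bounds referred to above (standard statements).
\begin{itemize}
  \item ETH: $n$-variable 3-SAT admits no $2^{o(n)}$-time algorithm.
  \item Under ETH, \textsc{Clique} admits no $f(k)\cdot n^{o(k)}$-time algorithm
        for any computable $f$; being W[1]-hard, it is also unlikely to be
        fixed-parameter tractable.
  \item Unless $\mathrm{NP} \subseteq \mathrm{coNP/poly}$, certain parameterized
        problems admit no kernel of polynomial size.
\end{itemize}
```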

1,544 citations