Journal ArticleDOI

The Problem of Simplifying Truth Functions

01 Oct 1952 - American Mathematical Monthly (Taylor & Francis) - Vol. 59, Iss: 8, pp. 521-531
TL;DR: The Problem of Simplifying Truth Functions presents a systematic procedure for reducing a truth function to a simplest equivalent normal-form expression by repeatedly merging terms that differ in a single variable.
Abstract: (1952). The Problem of Simplifying Truth Functions. The American Mathematical Monthly: Vol. 59, No. 8, pp. 521-531.
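The TL;DR above only gestures at the method; its core step, repeatedly combining terms that differ in exactly one variable until no further combination is possible, is compact enough to sketch. The Python below is a minimal illustration of that reduction (what the later literature calls prime implicant generation), not Quine's original tabular presentation; the (bits, mask) encoding is our own.

```python
from itertools import combinations

def prime_implicants(minterms):
    """Quine-style reduction: repeatedly merge pairs of terms that
    differ in exactly one variable until nothing merges further.

    A term is a (bits, mask) pair: `mask` has 1-bits where a variable
    has been eliminated ("don't care" position) and `bits` holds the
    values of the remaining variables.  Terms that never merge are
    the prime implicants.
    """
    terms = {(m, 0) for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for t1, t2 in combinations(sorted(terms), 2):
            (b1, m1), (b2, m2) = t1, t2
            diff = b1 ^ b2
            # Merge only terms with identical masks differing in one bit.
            if m1 == m2 and diff and diff & (diff - 1) == 0:
                merged.add((b1 & ~diff, m1 | diff))
                used.update({t1, t2})
        primes |= terms - used   # unmerged terms are prime
        terms = merged
    return primes

def show(term, n_vars):
    """Render a term as a string such as '1-0' ('-' = eliminated)."""
    bits, mask = term
    return ''.join('-' if mask >> i & 1 else str(bits >> i & 1)
                   for i in reversed(range(n_vars)))

# f(a,b,c) with minterms {0,1,2,5,6,7}: every minterm merges once but
# no four-cell groups form, leaving six two-cell prime implicants.
for p in sorted(prime_implicants({0, 1, 2, 5, 6, 7})):
    print(show(p, 3))
```

Note that generating the prime implicants is only half the problem; choosing a cheapest subset of them that still covers the function is the covering step taken up by later work cited below.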
Citations
Book
27 Jul 2015
TL;DR: This comprehensive textbook presents a clean and coherent account of the most fundamental tools and techniques in Parameterized Algorithms and is a self-contained guide to the area, providing a toolbox of algorithmic techniques.
Abstract: This comprehensive textbook presents a clean and coherent account of the most fundamental tools and techniques in Parameterized Algorithms and is a self-contained guide to the area. The book covers many of the recent developments of the field, including application of important separators, branching based on linear programming, Cut & Count to obtain faster algorithms on tree decompositions, algorithms based on representative families of matroids, and use of the Strong Exponential Time Hypothesis. A number of older results are revisited and explained in a modern and didactic way. The book provides a toolbox of algorithmic techniques. Part I is an overview of basic techniques, each chapter discussing a certain algorithmic paradigm. The material covered in this part can be used for an introductory course on fixed-parameter tractability. Part II discusses more advanced and specialized algorithmic ideas, bringing the reader to the cutting edge of current research. Part III presents complexity results and lower bounds, giving negative evidence by way of W[1]-hardness, the Exponential Time Hypothesis, and kernelization lower bounds. All the results and concepts are introduced at a level accessible to graduate students and advanced undergraduate students. Every chapter is accompanied by exercises, many with hints, while the bibliographic notes point to original publications and related work.
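The blurb names the paradigms rather than showing them; as a concrete taste of the branching (bounded-search-tree) technique from Part I, here is the classic O(2^k · |E|) algorithm for Vertex Cover, sketched in Python. The example is ours, not taken from the book.

```python
def vertex_cover(edges, k):
    """Bounded search tree for Vertex Cover, the introductory example
    of the branching paradigm: any cover must contain an endpoint of
    every edge, so branch on the two endpoints of some uncovered edge.
    Runs in O(2^k * |E|) time -- fixed-parameter tractable in k.
    """
    if not edges:
        return True                   # nothing left to cover
    if k == 0:
        return False                  # edges remain, budget exhausted
    u, v = edges[0]
    without_u = [(a, b) for (a, b) in edges if u not in (a, b)]
    without_v = [(a, b) for (a, b) in edges if v not in (a, b)]
    return vertex_cover(without_u, k - 1) or vertex_cover(without_v, k - 1)

# A 5-cycle needs three vertices in any cover.
c5 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(vertex_cover(c5, 2))  # False
print(vertex_cover(c5, 3))  # True
```

The running time depends exponentially only on the parameter k, not on the graph size, which is the defining feature of fixed-parameter tractability.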

1,544 citations


Cites background from "The Problem of Simplifying Truth Functions"

  • ...The history of preprocessing, such as that of applying reduction rules to simplify truth functions, can be traced back to the origins of computer science—the 1950 work of Quine [391]....


Book
01 Jan 2010
TL;DR: Theory is made easier to understand with 200 illustrative examples, and students can test their understanding with over 350 end-of-chapter review questions.
Abstract: Understand the structure, behavior, and limitations of logic machines with this thoroughly updated third edition. Many new topics are included, such as CMOS gates, logic synthesis, logic design for emerging nanotechnologies, digital system testing, and asynchronous circuit design, to bring students up-to-speed with modern developments. The intuitive examples and minimal formalism of the previous edition are retained, giving students a text that is logical and easy to follow, yet rigorous. Kohavi and Jha begin with the basics, and then cover combinational logic design and testing, before moving on to more advanced topics in finite-state machine design and testing. Theory is made easier to understand with 200 illustrative examples, and students can test their understanding with over 350 end-of-chapter review questions.

1,315 citations

Journal ArticleDOI
TL;DR: A systematic procedure is presented for writing a Boolean function as a minimum sum of products and specific attention is given to terms which can be included in the function solely for the designer's convenience.
Abstract: A systematic procedure is presented for writing a Boolean function as a minimum sum of products. This procedure is a simplification and extension of the method presented by W. V. Quine. Specific attention is given to terms which can be included in the function solely for the designer's convenience.
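The covering step this abstract alludes to, selecting essential prime implicants while don't-care terms ("included solely for the designer's convenience") impose no covering obligation, can be sketched as follows. The encoding and helper names are ours, for illustration, not the paper's tabular method.

```python
def covers(term, minterm):
    """True if the prime implicant `term` = (bits, mask) covers `minterm`."""
    bits, mask = term
    return (minterm & ~mask) == bits

def essential_primes(primes, required):
    """A prime implicant is essential when it is the only one covering
    some required minterm.  Don't-care minterms never appear in
    `required`: they may be used while forming the primes, but they
    impose no covering obligation -- the convenience the abstract
    refers to.
    """
    essential = set()
    for m in required:
        covering = [p for p in primes if covers(p, m)]
        if len(covering) == 1:
            essential.add(covering[0])
    return essential

# f(a,b,c): on-set {0, 1, 5}, don't-care {7}.  Over on-set + don't-cares
# the prime implicants are 00- = (0, 1), -01 = (1, 4) and 1-1 = (5, 2);
# the don't-care 7 is what makes 1-1 an implicant at all.
primes = {(0, 1), (1, 4), (5, 2)}
print(essential_primes(primes, {0, 1, 5}))  # {(0, 1)}: only 00- covers 0
```

After the essential prime 00- is chosen, the one remaining required minterm (5) can be covered by either of the other two primes, which is where the minimization proper begins.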

1,103 citations

Journal ArticleDOI
TL;DR: The theoretical background is developed here for employing data dependence to convert FORTRAN programs to parallel form and transformations that use dependence to uncover additional parallelism are discussed.
Abstract: The recent success of vector computers such as the Cray-1 and array processors such as those manufactured by Floating Point Systems has increased interest in making vector operations available to the FORTRAN programmer. The FORTRAN standards committee is currently considering a successor to FORTRAN 77, usually called FORTRAN 8x, that will permit the programmer to explicitly specify vector and array operations.

Although FORTRAN 8x will make it convenient to specify explicit vector operations in new programs, it does little for existing code. In order to benefit from the power of vector hardware, existing programs will need to be rewritten in some language (presumably FORTRAN 8x) that permits the explicit specification of vector operations. One way to avoid a massive manual recoding effort is to provide a translator that discovers the parallelism implicit in a FORTRAN program and automatically rewrites that program in FORTRAN 8x.

Such a translation from FORTRAN to FORTRAN 8x is not straightforward because FORTRAN DO loops are not always semantically equivalent to the corresponding FORTRAN 8x parallel operation. The semantic difference between these two constructs is precisely captured by the concept of dependence. A translation from FORTRAN to FORTRAN 8x preserves the semantics of the original program if it preserves the dependences in that program.

The theoretical background is developed here for employing data dependence to convert FORTRAN programs to parallel form. Dependence is defined and characterized in terms of the conditions that give rise to it; accurate tests to determine dependence are presented; and transformations that use dependence to uncover additional parallelism are discussed.
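One family of "accurate tests to determine dependence" in this literature is purely arithmetic. The sketch below shows the classic GCD test, offered as an illustration of the idea rather than a reconstruction of this paper's specific tests: it decides whether two affine subscripts can ever refer to the same array element.

```python
from math import gcd

def gcd_test(a1, b1, a2, b2):
    """GCD dependence test for a loop of the shape

        DO i = ...
          X(a1*i + b1) = ...   (write)
          ...  = X(a2*i + b2)  (read)

    A dependence requires integer solutions of a1*i + b1 = a2*j + b2,
    which exist only if gcd(a1, a2) divides (b2 - b1).  Returns False
    when dependence is provably impossible (the loop is safe to run
    as a vector operation), True when a dependence may exist.
    Strides a1, a2 are assumed nonzero.
    """
    return (b2 - b1) % gcd(a1, a2) == 0

# X(2i) written, X(2i + 1) read: even vs. odd elements, no dependence.
print(gcd_test(2, 0, 2, 1))   # False -> vectorizable
# X(2i) written, X(2i + 4) read: gcd 2 divides 4, dependence possible.
print(gcd_test(2, 0, 2, 4))   # True  -> needs further analysis
```

The test is conservative in one direction only: a False answer is a proof of independence, while a True answer merely fails to rule a dependence out.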

780 citations

Proceedings ArticleDOI
24 Jan 1983
TL;DR: This paper presents a method for systematically converting control dependences to data dependences by eliminating goto statements and introducing logical variables to control the execution of statements in the program.
Abstract: Program analysis methods, especially those which support automatic vectorization, are based on the concept of interstatement dependence, where a dependence holds between two statements when one of the statements computes values needed by the other. Powerful program transformation systems that convert sequential programs to a form more suitable for vector or parallel machines have been developed using this concept [AllK 82, KKLW 80].

The dependence analysis in these systems is based on data dependence. In the presence of complex control flow, data dependence is not sufficient to transform programs because of the introduction of control dependences. A control dependence exists between two statements when the execution of one statement can prevent the execution of the other. Control dependences do not fit conveniently into dependence-based program translators.

One solution is to convert all control dependences to data dependences by eliminating goto statements and introducing logical variables to control the execution of statements in the program. In this scheme, action statements are converted to IF statements. The variables in the conditional expression of an IF statement can be viewed as inputs to the statement being controlled. The result is that control dependences between statements become explicit data dependences expressed through the definitions and uses of the controlling logical variables.

This paper presents a method for systematically converting control dependences to data dependences in this fashion. The algorithms presented here have been implemented in PFC, an experimental vectorizer written at Rice University.
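The guard-variable scheme the abstract describes can be mimicked outside FORTRAN. Below is a minimal Python sketch, our own example rather than PFC's algorithm, in which the branch disappears and the condition instead flows into the computation as ordinary data.

```python
def loop_with_branch(a, b):
    """Original form: a control dependence, via an explicit branch."""
    out = list(b)
    for i in range(len(a)):
        if a[i] > 0:            # S1: the test
            out[i] = a[i] * 2   # S2: executes only when S1's test holds
    return out

def loop_if_converted(a, b):
    """If-converted form: the branch is gone.  A logical guard array
    carries the condition, and S2 becomes an unconditional guarded
    merge whose inputs include the guard -- the control dependence is
    now a data dependence on `guard`.
    """
    guard = [x > 0 for x in a]          # one logical variable per iteration
    return [x * 2 if g else y           # element-wise select, no control flow
            for g, x, y in zip(guard, a, b)]

a = [3, -1, 4, -5]
b = [0, 0, 0, 0]
print(loop_with_branch(a, b))    # [6, 0, 8, 0]
print(loop_if_converted(a, b))   # [6, 0, 8, 0]
```

Once the condition is data, the converted loop is a pure element-wise map and can be handed to the same dependence-based vectorizer as any branch-free loop.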

616 citations