
Showing papers on "Decision tree model" published in 1976


Journal ArticleDOI
TL;DR: Several properties of the graph-theoretic complexity measure are proved which show, for example, that complexity is independent of physical size and depends only on the decision structure of a program.
Abstract: This paper describes a graph-theoretic complexity measure and illustrates how it can be used to manage and control program complexity. The paper first explains how the graph-theory concepts apply and gives an intuitive explanation of the graph concepts in programming terms. The control graphs of several actual Fortran programs are then presented to illustrate the correlation between intuitive complexity and the graph-theoretic complexity. Several properties of the graph-theoretic complexity are then proved which show, for example, that complexity is independent of physical size (adding or subtracting functional statements leaves complexity unchanged) and complexity depends only on the decision structure of a program.
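The measure described here is the cyclomatic number v(G) = E - N + 2P of the program's control graph (E edges, N nodes, P connected components). Below is a minimal sketch of that computation using a hypothetical if/else control graph; the graph and names are illustrative and not taken from the paper.

```python
# Sketch: cyclomatic complexity v(G) = E - N + 2P for a control-flow graph
# given as an edge list; P is the number of connected components (1 here).
def cyclomatic(edges, p=1):
    nodes = {n for e in edges for n in e}
    return len(edges) - len(nodes) + 2 * p

# A tiny if/else control graph: entry -> cond -> (then | else) -> exit
g = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
     ("then", "exit"), ("else", "exit")]
print(cyclomatic(g))   # 2: a single decision gives complexity 2

# Adding a purely sequential statement leaves complexity unchanged,
# illustrating independence from physical size.
g2 = [("entry", "stmt"), ("stmt", "cond")] + g[1:]
print(cyclomatic(g2))  # still 2
```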

5,097 citations


Proceedings ArticleDOI
13 Oct 1976
TL;DR: In this article, a graph-theoretic complexity measure for managing and controlling program complexity is presented, and it is shown that complexity is independent of physical size and depends only on the decision structure of a program.
Abstract: This paper describes a graph-theoretic complexity measure and illustrates how it can be used to manage and control program complexity. The paper first explains how the graph-theory concepts apply and gives an intuitive explanation of the graph concepts in programming terms. The control graphs of several actual FORTRAN programs are then presented to illustrate the correlation between intuitive complexity and the graph-theoretic complexity. Several properties of the graph-theoretic complexity are then proved which show, for example, that complexity is independent of physical size (adding or subtracting functional statements leaves complexity unchanged) and complexity depends only on the decision structure of a program. The issue of using non-structured control flow is also discussed. A characterization of non-structured control graphs is given and a method of measuring the "structuredness" of a program is developed. The relationship between structure and reducibility is illustrated with several examples. The last section of the paper deals with a testing methodology used in conjunction with the complexity measure; a testing strategy is defined which requires that a program either admit a certain minimal level of testing or be structurally reduced.
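The testing strategy rests on reading v(G) as the number of linearly independent paths, which bounds the number of basis test cases and can be far smaller than the number of distinct execution paths. The sketch below illustrates that gap on a hypothetical two-decision acyclic control graph; it is an interpretation of the idea, not code from the paper.

```python
# Sketch: v(G) (basis paths) versus the total number of execution paths
# in a small acyclic control graph.
from functools import lru_cache

def cyclomatic(edges, p=1):
    nodes = {n for e in edges for n in e}
    return len(edges) - len(nodes) + 2 * p

def total_paths(edges, src, dst):
    succ = {}
    for a, b in edges:
        succ.setdefault(a, []).append(b)
    @lru_cache(maxsize=None)
    def walk(n):
        if n == dst:
            return 1
        return sum(walk(m) for m in succ.get(n, []))
    return walk(src)

# Two if/else decisions in sequence.
g = [("in", "c1"), ("c1", "a"), ("c1", "b"), ("a", "c2"), ("b", "c2"),
     ("c2", "x"), ("c2", "y"), ("x", "out"), ("y", "out")]
print(cyclomatic(g), total_paths(g, "in", "out"))  # 3 basis paths, 4 total
```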

282 citations


Book
01 Jan 1976

39 citations


Journal ArticleDOI
TL;DR: A new algorithm for efficiently generating the minimal cut-sets of a fault tree containing repeated basic events; when programmed, it substantially reduces both execution time and storage requirements over the classical technique.
Abstract: This paper presents a new algorithm for efficiently generating the minimal cut-sets of a fault tree containing repetitions of basic events. The algorithm is easily performed by hand and, when programmed, substantially reduces both execution time and storage requirements over the classical technique. The savings are accomplished by recognizing and recursively reducing the influence of the repetitive events. The theoretical basis of the algorithm is presented, and examples from the recent literature are used to demonstrate its efficiency. Finally, the computational complexity is discussed and rules are presented for simplifying the tree before the computations begin.
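For context, the classical baseline the authors improve on is top-down expansion of the top event into cut sets followed by a minimization pass that discards non-minimal sets. The sketch below shows that baseline on a hypothetical fault tree; the gate table and event names are illustrative, and the paper's recursive treatment of repeated events is not reproduced here.

```python
# Sketch: classical top-down expansion of a fault tree into cut sets,
# followed by minimization (removing any cut set containing another).
# Each gate maps to (gate type, list of inputs); inputs are other gate
# names or basic-event names.
GATES = {
    "TOP": ("OR",  ["G1", "G2"]),
    "G1":  ("AND", ["A", "B"]),
    "G2":  ("AND", ["A", "C"]),
}

def cut_sets(gates, top="TOP"):
    sets = [frozenset([top])]
    changed = True
    while changed:
        changed = False
        new_sets = []
        for s in sets:
            gate = next((g for g in s if g in gates), None)
            if gate is None:              # only basic events remain
                new_sets.append(s)
                continue
            changed = True
            kind, inputs = gates[gate]
            rest = s - {gate}
            if kind == "AND":             # all inputs occur together
                new_sets.append(rest | set(inputs))
            else:                         # OR: any single input suffices
                new_sets.extend(rest | {i} for i in inputs)
        sets = new_sets
    return sets

def minimal(sets):
    return [s for s in sets if not any(t < s for t in sets)]

print(sorted(map(sorted, minimal(cut_sets(GATES)))))  # [['A', 'B'], ['A', 'C']]
```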

15 citations


Journal ArticleDOI
TL;DR: This paper considers the reduction in algorithmic complexity that can be achieved by permitting approximate answers to computational problems, and shows that partial sorting of N items, insisting on matching any nonzero fraction of the terms with their correct successors, requires O(N log N) comparisons.
Abstract: This paper considers the reduction in algorithmic complexity that can be achieved by permitting approximate answers to computational problems. It is shown that Shannon's rate-distortion function could, under quite general conditions, provide lower bounds on the mean complexity of inexact computations. As practical examples of this approach, we show that partial sorting of N items, insisting on matching any nonzero fraction of the terms with their correct successors, requires O(N log N) comparisons. On the other hand, partial sorting in linear time is feasible (and necessary) if one permits any finite fraction of pairs to remain out of order. It is also shown that any error tolerance below 50 percent can neither reduce the state complexity of binary N-sequences from the zero-error value of O(N) nor reduce the combinational complexity of N-variable Boolean functions from the zero-error level of O(2^N / N).
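The linear-time feasibility claim can be illustrated with limited-depth partitioning around medians: after d levels only pairs inside the same block can be out of order, so at most roughly a 2^-d fraction of all pairs remains unordered while the work grows only linearly in N for fixed d. The sketch below is an illustration of that idea, not the paper's construction; it selects medians with sorted() for brevity, whereas the linear-time argument assumes a linear-time selection routine.

```python
# Sketch: approximate sorting by limited-depth median partitioning, plus a
# check of the fraction of pairs left out of order.
import random

def approx_sort(a, depth):
    if depth == 0 or len(a) <= 1:
        return a                      # leave the block unsorted
    m = sorted(a)[len(a) // 2]        # stand-in for linear-time selection
    lo = [x for x in a if x < m]
    hi = [x for x in a if x >= m]
    return approx_sort(lo, depth - 1) + approx_sort(hi, depth - 1)

def unordered_fraction(a):
    n = len(a)
    inv = sum(a[i] > a[j] for i in range(n) for j in range(i + 1, n))
    return inv / (n * (n - 1) / 2)

data = [random.random() for _ in range(2000)]
for d in (1, 2, 4, 6):
    # The unordered fraction shrinks roughly like 2**-d as depth grows.
    print(d, round(unordered_fraction(approx_sort(data, d)), 4))
```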

13 citations