Book

Digital Systems Testing and Testable Design

TL;DR: The new edition of Breuer-Friedman's Diagnosis and Reliable Design of Digital Systems offers comprehensive and state-of-the-art treatment of both testing and testable design.
Abstract: For many years, Breuer-Friedman's Diagnosis and Reliable Design of Digital Systems was the most widely used textbook in digital system testing and testable design. Now, Computer Science Press makes available a new and greatly expanded edition. Incorporating a significant amount of new material related to recently developed technologies, the new edition offers comprehensive and state-of-the-art treatment of both testing and testable design.
Citations
Proceedings ArticleDOI
04 Mar 1999
TL;DR: Experimental evidence suggests that the size of computed test sets can often be reduced by using set covering models and algorithms, and a noteworthy empirical conclusion is that it may be preferable not to use fault simulation when the final objective is test set compaction.
Abstract: Test set compaction is a fundamental problem in digital system testing. In recent years, many competitive solutions have been proposed, most of which are based on heuristic approaches. This paper studies the application of set covering models to the compaction of test sets, which can be used with any heuristic test set compaction procedure. For this purpose, recent and highly effective set covering algorithms are used. Experimental evidence suggests that the size of computed test sets can often be reduced by using set covering models and algorithms. Moreover, a noteworthy empirical conclusion is that it may be preferable not to use fault simulation when the final objective is test set compaction.
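
The set covering view is easy to make concrete: each test detects a set of faults, and compaction asks for a small subset of tests whose union still detects every fault. The sketch below shows the classical greedy approximation on hypothetical fault/test data; the paper itself applies more sophisticated exact and heuristic set covering algorithms.

```python
# Greedy set covering applied to test set compaction: repeatedly pick the
# test that detects the most still-uncovered faults until every fault is
# covered. The fault/test data below is illustrative, not from the paper.

def compact_tests(coverage: dict[str, set[str]]) -> list[str]:
    """coverage maps each test name to the set of faults it detects."""
    uncovered = set().union(*coverage.values())   # all detectable faults
    selected = []
    while uncovered:
        # Choose the test covering the largest number of remaining faults.
        best = max(coverage, key=lambda t: len(coverage[t] & uncovered))
        selected.append(best)
        uncovered -= coverage[best]
    return selected

coverage = {
    "t1": {"f1", "f2", "f3"},
    "t2": {"f3", "f4"},
    "t3": {"f1", "f4", "f5"},
    "t4": {"f2", "f5"},
}
print(compact_tests(coverage))   # -> ['t1', 't3'], which covers f1..f5
```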

26 citations


Cites background from "Digital Systems Testing and Testabl..."

  • ...In recent years, many competitive solutions have been proposed, most of which are based on heuristic approaches [1] [5] [6] [8] [10] [11] [12] [13] [14] [15]....

Proceedings ArticleDOI
28 Apr 1996
TL;DR: This work proposes the information flow model, which focuses on test information in the fault dictionary to provide more accurate diagnostics than nearest neighbor classification with fault dictionaries for resolving inexact signature matches.
Abstract: Using nearest neighbor classification with fault dictionaries to resolve inexact signature matches in digital circuit diagnosis is inadequate. Nearest neighbor focuses on the possible diagnoses rather than on the tests. Our alternative, the information flow model, focuses on test information in the fault dictionary to provide more accurate diagnostics.
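
For context, the nearest neighbor baseline the paper argues against can be sketched in a few lines: the observed pass/fail signature is compared against every dictionary entry, and the fault with the smallest Hamming distance is reported. As the excerpts below note, such an inexact match is not guaranteed to be the correct diagnosis. The dictionary contents here are hypothetical.

```python
# Nearest-neighbor lookup in a fault dictionary: each fault maps to the
# pass/fail signature (one bit per test) it would produce; an observed
# signature is matched to the closest stored one by Hamming distance.
# The dictionary contents are illustrative only.

def hamming(a: str, b: str) -> int:
    return sum(x != y for x, y in zip(a, b))

def nearest_fault(dictionary: dict[str, str], observed: str) -> str:
    return min(dictionary, key=lambda f: hamming(dictionary[f], observed))

fault_dict = {
    "g1 stuck-at-0": "10110",
    "g2 stuck-at-1": "01100",
    "g7 stuck-at-0": "10011",
}
print(nearest_fault(fault_dict, "00011"))  # -> 'g7 stuck-at-0' (distance 1)
```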

26 citations


Cites background or methods from "Digital Systems Testing and Testabl..."

  • ...These ambiguity groups, which were taken from Abramovici et al. (1990), are listed in Table 1....

  • ...For illustration purposes, we use a simple digital circuit (Abramovici, Breuer, and Friedman, 1990)....

  • ...Finally, Abramovici et al. (1990) note that the nearest neighbor approach to matching inexact patterns in the dictionary, while effective in many cases, “is not guaranteed to produce the correct diagnosis.”...

Proceedings ArticleDOI
16 Apr 2007
TL;DR: This work proposes a departure from conventional debugging techniques by introducing abstraction and refinement during error localization; under this new framework, existing debugging techniques can handle large designs with long counter-examples yet remain run-time and memory efficient.
Abstract: Verification is a major bottleneck in the VLSI design flow, with the tasks of error detection, error localization, and error correction consuming up to 70% of the overall design effort. This work proposes a departure from conventional debugging techniques by introducing abstraction and refinement during error localization. Under this new framework, existing debugging techniques can handle large designs with long counter-examples yet remain run-time and memory efficient. Experiments on benchmark and industrial designs confirm the effectiveness of the proposed framework and encourage further development of abstraction and refinement methodologies for existing debugging techniques.
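
The abstraction and refinement idea reduces to a generic control loop: debug a coarse model first, and enlarge it only when the abstraction cannot explain the failure. The toy below is a schematic of that loop under strong simplifying assumptions (a gate list stands in for the design, an oracle stands in for a real debugger); it is not the paper's actual algorithm.

```python
# Schematic abstraction-refinement loop for error localization. The
# "design" is just a list of gate names, and explains_failure is a
# hypothetical oracle standing in for a real debugging engine; the
# control flow (coarse model first, refine on demand) is the point.

def localize(gates, explains_failure, window=2):
    """Return suspect gates, debugging over growing abstractions."""
    k = window
    while True:
        abstraction = gates[:k]                       # coarse model: first k gates
        suspects = [g for g in abstraction if explains_failure(g)]
        if suspects or k >= len(gates):               # found, or full design tried
            return suspects
        k = min(2 * k, len(gates))                    # refine: enlarge the model

gates = ["g1", "g2", "g3", "g4", "g5", "g6"]
print(localize(gates, explains_failure=lambda g: g == "g5"))  # -> ['g5']
```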

26 citations


Cites background or methods from "Digital Systems Testing and Testabl..."

  • ...Given a set of vectors V for which a circuit (or netlist) C demonstrates an incorrect behavior, the objective of design debugging is to find the gates that may be responsible for this incorrect behavior [5]....

  • ...Currently, automated design debugging approaches are based on simulation, symbolic, or constraint satisfaction techniques [5], [6], [7]....

BookDOI
01 Jan 2003
TL;DR: This paper analyzes word-frequency and domain family size distributions across various genomes, and the distribution of potential "hot-spots" for segmental duplications in the human genome, and hypothesizes and tests that such a pattern is the result of a generic dominating evolution mechanism, "evolution by duplication".
Abstract: Is there a unifying principle in biology that can be deciphered from the structures of all the present day genomes? Can one hope to determine the general structure and organization of cellular information, as well as elucidate the underlying evolutionary dynamics, by systematically exploring various statistical characteristics of the genomic and proteomic data at many levels? A large-scale software and hardware system, Valis, developed by the NYU bioinformatics group, aims to do just that. We analyze word-frequency and domain family size distributions across various genomes, and the distribution of potential "hot-spots" for segmental duplications in the human genome. We hypothesize and test, by computational analysis, that such a pattern is the result of a generic dominating evolution mechanism, "evolution by duplication", originally suggested by Susumu Ohno. We examine what implications these duplications may have in determining the translocations of male-biased genes from sex chromosomes, the genome structure of sex chromosomes, copy-number fluctuations (amplifications of oncogenes and hemizygous or homozygous deletions of tumor suppressor genes) in cancer genomes, etc. We examine, through our explorations with Valis, how important a role information technology is likely to play in elucidating biology at all levels.

Performance Analysis of Blue Gene/L Using Parallel Discrete Event Simulation (Ed Upchurch, Paul L. Springer, Maciej Brodowicz, Sharon Brunett, and T.D. Gottschalk; Center for Advanced Computing Research, California Institute of Technology)
Abstract: High performance computers currently under construction, such as IBM's Blue Gene/L, consisting of large numbers (64K) of low cost processing elements with relatively small local memories (256MB) connected via relatively low bandwidth (0.375 Bytes/FLOP) low cost interconnection networks, promise exceptional cost-performance for some scientific applications. Due to the large number of processing elements and adaptive routing networks in such systems, performance analysis of meaningful application kernels requires innovative methods. This paper describes a method that combines application analysis, tracing, and parallel discrete event simulation to provide early performance prediction. Specifically, results of performance analysis of a Lennard-Jones Spatial (LJS) Decomposition molecular dynamics benchmark code for Blue Gene/L are given.
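
For readers unfamiliar with the technique, parallel discrete event simulation builds on a simple sequential core: a priority queue of timestamped events processed in time order, with handlers free to schedule future events. A minimal single-engine sketch (illustrative only; a parallel simulator runs many such engines and synchronizes their clocks) might look like:

```python
# Minimal discrete-event simulation core: events are (time, seq, handler)
# tuples kept in a heap and processed in timestamp order; handlers may
# schedule further events. Illustrative sketch, not the paper's tool.
import heapq
import itertools

class Simulator:
    def __init__(self):
        self.queue = []
        self.counter = itertools.count()   # tie-breaker for equal timestamps
        self.now = 0.0

    def schedule(self, delay, handler):
        heapq.heappush(self.queue, (self.now + delay, next(self.counter), handler))

    def run(self, until):
        while self.queue and self.queue[0][0] <= until:
            self.now, _, handler = heapq.heappop(self.queue)
            handler(self)

def ping(sim):
    print(f"t={sim.now:.1f}: message hop")
    sim.schedule(1.5, ping)               # model a fixed network latency

sim = Simulator()
sim.schedule(0.0, ping)
sim.run(until=5.0)                        # prints hops at t=0.0, 1.5, 3.0, 4.5
```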

26 citations


Cites background or methods or result from "Digital Systems Testing and Testabl..."

  • ...In order to avoid the wakeup of empty entries placed in turned on banks, we assume that the wakeup is gated in each individual entry that is empty [8]....

  • ...As in previous works [8][6], no compaction mechanism for the issue queue has been assumed, since compaction results in a significant amount of extra energy consumption....

  • ...It has been reported in previous studies of web servers [3], proxy servers [8] and World Cup ’98 Characterization [2] that mean and median document transfer sizes are quite small, fewer than 13K and 3K bytes respectively....

  • ...We compare this scheme with the approach in [8], and show that the proposed technique provides significant advantages....

  • ...Section 3 describes the proposed technique and the mechanism proposed in [8], which is used for comparison purposes....

Journal ArticleDOI
TL;DR: This paper proposes an approach for automatic extraction of word-level model constraints from the behavioral verilog HDL description and demonstrates the effectiveness of the approach by automatically generating the constraint models for an exclusive-shared-invalid multiprocessor cache coherency model and the 16-bit DLX-architecture from their respective Verilog-based behavioral models.
Abstract: With the emergence of complex high-performance microprocessors, functional test generation has become a crucial design step. Constraint-based test generation is a well-studied directed behavioral level functional test generation paradigm. The paradigm involves conversion of a given circuit model into a set of constraints and employing constraint solvers to generate tests for it. However, automatic extraction of constraints from a given behavioral hardware design language (HDL) model remained a challenging open problem. This paper proposes an approach for automatic extraction of word-level model constraints from the behavioral verilog HDL description. The scenarios to be tested are also expressed as constraints. The model and the scenario constraints are solved together using an integer solver to arrive at the necessary functional test. The effectiveness of the approach is demonstrated by automatically generating the constraint models for: 1) an exclusive-shared-invalid multiprocessor cache coherency model and 2) the 16-bit DLX-architecture, from their respective Verilog-based behavioral models. Experimental results that generate test vectors for high level scenarios like pipeline hazards, cache miss, etc., spanning over multiple time-frames are presented.
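
The flavor of this paradigm is easy to demonstrate on a toy word-level model: the circuit's behavior becomes constraints over word-level variables, the test scenario becomes one more constraint, and a solver returns the stimulus. The sketch below substitutes the z3 SMT solver (assuming the z3-solver Python package) for the paper's integer solver, and the two-stage datapath is hypothetical:

```python
# Toy constraint-based test generation: a word-level model of a small
# datapath (an adder feeding a comparator) is expressed as constraints,
# the scenario "the comparator fires" is added as one more constraint,
# and the solver produces concrete input values. Hypothetical datapath;
# z3 stands in for the integer solver used in the paper.
from z3 import BitVec, Bool, Solver, sat

a, b = BitVec("a", 16), BitVec("b", 16)
total = BitVec("total", 16)
overflow_flag = Bool("overflow_flag")

solver = Solver()
solver.add(total == a + b)                    # model constraint: adder stage
solver.add(overflow_flag == (total > 1000))   # model constraint: comparator stage
solver.add(overflow_flag)                     # scenario constraint: make it fire

if solver.check() == sat:
    m = solver.model()
    print("test vector:", m[a], m[b])         # concrete stimulus for the scenario
```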

26 citations


Cites background or methods from "Digital Systems Testing and Testabl..."

  • ...3) Modeling Sequential Circuits: Every sequential circuit can be represented by the conventional Huffman model [3].... (see the sketch after this list)

  • ...Previous work reported in the literature for the DBFTG may be broadly classified into two categories, namely, the conventional automatic test pattern generation (ATPG) [3]-based approaches and the constraint-based approaches....

  • ...Classical ATPG methods [3] work at gate-level representations of the design and hence exhibit less scalability with increasing design size....

  • ...The behavior of a sequential circuit over time frames can be modeled as a combinational circuit using the conventional time-frame expansion approach, which unrolls the combinational part of the circuit once per time frame [3].... (see the sketch after this list)

  • ...The previous technique is efficient only for circuit models in which: 1) the data and control paths are separate and 2) design for testability (DFT) [3] support is available....

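As referenced in the excerpts above, both the Huffman model and time-frame expansion are concise enough to sketch: a sequential circuit is a combinational block computing (output, next state) from (input, present state), closed by a state register, and unrolling that block once per input vector turns sequential behavior into a combinational function of the input sequence. The serial parity checker below is an illustrative stand-in, not a circuit from the paper.

```python
# Huffman model: a sequential circuit is one combinational block plus a
# state register feeding back into it. Time-frame expansion unrolls the
# block once per input vector so ATPG can treat the whole sequence as a
# single combinational circuit. Example block: a serial parity checker.

def comb(x: int, s: int) -> tuple[int, int]:
    """Combinational block: output and next state are both s XOR x."""
    nxt = s ^ x
    return nxt, nxt                       # (primary output, next state)

def time_frame_expand(inputs: list[int], s0: int = 0) -> list[int]:
    """Unroll comb() once per frame; state becomes a wire between frames."""
    outputs, s = [], s0
    for x in inputs:                      # one copy of comb() per time frame
        y, s = comb(x, s)
        outputs.append(y)
    return outputs

print(time_frame_expand([1, 0, 1, 1]))    # -> [1, 1, 0, 1], running parity
```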