Proceedings ArticleDOI

Accelerating functional simulation for processor based designs

R. Klein1, T. Piekarz1
20 Jul 2005-pp 323-328
TL;DR: It is shown how existing software code bases can be used to reduce the time to develop and execute verification tests, and how these techniques can be applied to both unit and system level verification.
Abstract: Design verification is taking an increasing proportion of the design cycle of system-on-chip (SoC) designs. According to the 2004 IC/ASIC Functional Verification Study (2005), designers spend up to 70% of their time developing and running tests to verify the functionality of their systems. Running regression suites against the design can take several years of CPU time to complete. In this paper we show how existing software code bases can be used to reduce the time to develop and execute verification tests. These techniques can be applied to both unit and system level verification. As demonstrated by various hardware/software co-verification tools, such as those described in R. Klein (1996) and M. Stanbro (1998), the overall load on a simulation can be reduced by eliminating code and data references from the set of bus cycles generated by the processor model. These same techniques can be applied by hardware designers and verification engineers to use firmware, hardware diagnostics, and other software as a basis for creating functional verification tests. This software is often available from prior versions of the design or from other groups on the design team. Simulation run-times can be reduced by partitioning the processor's memory space between the logic simulation and the processor model.
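
As a rough sketch of the memory-partitioning idea described above (with hypothetical names and a made-up bus-cycle callback, not the paper's actual tooling), the snippet below routes each processor memory access either to a fast host-side memory image or to a simulated bus cycle, depending on the region of the address map it falls in:

```python
# Rough sketch of address-map partitioning, with hypothetical names:
# regions marked as "optimized" are serviced from a host-side memory image, so
# their code/data fetches never become bus cycles in the logic simulator; only
# accesses to hardware-owned regions are forwarded to the simulated bus.

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class Region:
    base: int
    size: int
    optimized: bool          # True: serviced by the processor model, not the HDL sim

    def contains(self, addr: int) -> bool:
        return self.base <= addr < self.base + self.size

@dataclass
class PartitionedMemoryMap:
    regions: List[Region]
    bus_cycle: Callable[[str, int, Optional[int]], int]    # callback into the logic simulator
    host_mem: Dict[int, int] = field(default_factory=dict)  # sparse host-side image
    cycles_saved: int = 0

    def _region(self, addr: int) -> Optional[Region]:
        return next((r for r in self.regions if r.contains(addr)), None)

    def read(self, addr: int) -> int:
        r = self._region(addr)
        if r is not None and r.optimized:
            self.cycles_saved += 1                 # avoided a simulated bus cycle
            return self.host_mem.get(addr, 0)
        return self.bus_cycle("read", addr, None)  # full-fidelity simulated access

    def write(self, addr: int, data: int) -> None:
        r = self._region(addr)
        if r is not None and r.optimized:
            self.cycles_saved += 1
            self.host_mem[addr] = data
        else:
            self.bus_cycle("write", addr, data)
```

Firmware, diagnostics, or other reused software can then fetch instructions and data from the host-side image at essentially zero simulation cost, while accesses to peripheral and register regions still exercise the design under test in the logic simulator.
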
Citations
Proceedings ArticleDOI
20 May 2007
TL;DR: A method is presented, which identifies possible faulty regions in a combinational circuit, based on its input/output behavior and independent of a fault model, and shows the effectiveness of the approach through experiments with benchmark and industrial circuits.
Abstract: Diagnosis is essential in modern chip production to increase yield, and debug constitutes a major part in the pre-silicon development process. For recent process technologies, defect mechanisms are increasingly complex, and continuous efforts are made to model these defects by using sophisticated fault models. Traditional static approaches for debug and diagnosis with a simplified fault model are more and more limited. In this paper, a method is presented, which identifies possible faulty regions in a combinational circuit, based on its input/output behavior and independent of a fault model. The new adaptive, statistical approach combines a flexible and powerful effect-cause pattern analysis algorithm with high-resolution ATPG. We show the effectiveness of the approach through experiments with benchmark and industrial circuits.

92 citations


Cites background from "Accelerating functional simulation ..."

  • ...Estimates today are that more than 70% of the total design time is on verification [1], [2]....

Journal ArticleDOI
TL;DR: A method is presented, which identifies possible faulty regions in a combinational circuit, based on its input/output behavior and independent of a fault model, and shows the effectiveness of the approach through experiments with benchmark and industrial circuits.
Abstract: Diagnosis is essential in modern chip production to increase yield, and debug constitutes a major part in the pre-silicon development process. For recent process technologies, defect mechanisms are increasingly complex, and continuous efforts are made to model these defects by using sophisticated fault models. Traditional static approaches for debug and diagnosis with a simplified fault model are more and more limited. In this paper, a method is presented, which identifies possible faulty regions in a combinational circuit, based on its input/output behavior and independent of a fault model. The new adaptive, statistical approach is named POINTER for `Partially Overlapping Impact couNTER' and combines a flexible and powerful effect-cause pattern analysis algorithm with high-resolution ATPG. We show the effectiveness of the approach through experiments with benchmark and industrial circuits. In addition, even without additional patterns this analysis method provides good resolution for volume diagnosis, too.
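
The counting flavor of such an effect-cause analysis can be illustrated with the hedged sketch below: each candidate fault site is scored by how well forcing it to a faulty value reproduces the tester-observed failing outputs. This is only an illustration of the ranking idea; it does not reproduce the published POINTER counters, its pattern analysis, or its ATPG interaction, and all names are assumed.

```python
# Simplified sketch of ranking candidate fault sites by how well forcing each
# site to a faulty value reproduces the observed input/output behavior.
# `simulate` and the data layout are hypothetical; the published POINTER
# counters and its ATPG interaction are not reproduced here.

from typing import Callable, Dict, List, Set, Tuple

Pattern = Tuple[int, ...]          # one input vector
Response = Set[str]                # names of failing outputs for that pattern

def score_candidates(
    patterns: List[Pattern],
    observed: List[Response],                            # tester-observed failing outputs
    candidates: List[str],                               # suspect signal names
    simulate: Callable[[Pattern, str, int], Response],   # failing outputs with site forced to 0/1
) -> Dict[str, int]:
    scores: Dict[str, int] = {}
    for site in candidates:
        score = 0
        for pattern, obs in zip(patterns, observed):
            # Try both faulty polarities; keep the better explanation.
            best = max(
                _match(simulate(pattern, site, 0), obs),
                _match(simulate(pattern, site, 1), obs),
            )
            score += best
        scores[site] = score
    return scores

def _match(predicted: Response, observed: Response) -> int:
    """Reward failing outputs that are both predicted and observed,
    penalize predicted failures that were not seen and vice versa."""
    return len(predicted & observed) - len(predicted ^ observed)
```
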

74 citations

Book ChapterDOI
01 Jan 2010
TL;DR: This chapter establishes a generalized fault modeling technique and notation, describes and classify existing models, and investigates the properties of a fault model independent diagnosis technique.
Abstract: To cope with the numerous defect mechanisms in nanoelectronic technology, more and more complex fault models have been introduced. Each model comes with its own properties and algorithms for test generation and logic diagnosis. In diagnosis, however, the defect mechanisms of a failing device are not known in advance, and algorithms that assume a specific fault model may fail. Therefore, diagnosis techniques have been proposed that relax fault assumptions or even work without any fault model. In this chapter, we establish a generalized fault modeling technique and notation. Based on this notation, we describe and classify existing models and investigate the properties of a fault model independent diagnosis technique.
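
One common way to picture such a generalized notation is as a conditional fault: a victim signal, a faulty value, and an activation condition over the circuit's current values, with classical models appearing as special cases. The sketch below only illustrates that idea under assumed names; it is not the chapter's actual notation.

```python
# Illustrative sketch of a conditional ("generalized") fault: a victim line takes
# a faulty value whenever an activation condition over the circuit's current
# values holds. Classic models then appear as special cases. Names are assumed.

from dataclasses import dataclass
from typing import Callable, Dict

Values = Dict[str, int]   # current logic values by signal name

@dataclass
class ConditionalFault:
    victim: str
    faulty_value: int
    condition: Callable[[Values], bool]   # when the fault is activated

    def apply(self, values: Values) -> Values:
        if self.condition(values):
            return {**values, self.victim: self.faulty_value}
        return values

# Stuck-at-0 on line "y": always active.
stuck_at_0 = ConditionalFault("y", 0, lambda v: True)

# A bridge-like fault: "y" is pulled to 0 only while aggressor "a" is 0.
bridge_like = ConditionalFault("y", 0, lambda v: v["a"] == 0)
```
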

23 citations

Proceedings ArticleDOI
17 Jan 2006
TL;DR: An overview is given of the common aspects of design, verification and test of integrated circuits with millions of gates, and of how these tasks interact.
Abstract: Design, verification and test of integrated circuits with millions of gates put strong requirements on design time, test volume, test application time, test speed and diagnostic resolution. In this paper, an overview is given on the common aspects of these tasks and how they interact. Diagnosis techniques may be used after manufacturing, for chip characterization and field return analysis, and even for rapid prototyping.

5 citations


Cites background from "Accelerating functional simulation ..."

  • ...Other approaches [2], [7], [8], [9], [10], [11] have considered accelerating the test benches, by mapping them partially or fully into hardware, as a means to improve the efficiency of test benches and speedup the verification process....

  • ...of the total design time is on verification [1], [2]....

01 Jan 2008
TL;DR: A method for describing microprocessors as a special case of digital systems is explained and modeling faults with High-Level Decision Diagrams (HLDD) is presented.
Abstract: Automated test generation for digital systems encompasses three activities: selecting a description method, developing a fault model, and generating tests to detect the faults covered by the fault model. The efficiency of test generation (quality, speed) depends strongly on the description method and fault models. As the complexity of digital systems continues to increase, gate-level test generation methods have become obsolete, and high-level methods are promising alternatives. In this paper, a method for describing microprocessors as a special case of digital systems is explained, and modeling faults with High-Level Decision Diagrams (HLDD) is presented. HLDDs serve as a basis for a general theory of test generation for mixed-level representations of systems, much as Boolean algebra does for the logic level. HLDDs can represent systems uniformly at the logic level, at the high level, or simultaneously at both levels. The fault model on HLDDs is a generalization of the classical gate-level stuck-at fault model to higher levels: the latter was defined for Boolean expressions, whereas the former is defined for nodes in HLDDs, which have a more general interpretation.
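
A minimal sketch of the flavor of this model is given below: a diagram node tests a variable, edges select successor subgraphs or terminal expressions, and a node-level fault forces one fixed edge choice, generalizing the stuck-at idea. The exact HLDD definitions and fault classes of the paper are not reproduced; the structures and names here are illustrative only.

```python
# Minimal sketch of evaluating a decision-diagram-style model and of a node-level
# fault as "the node always takes one fixed branch". This only illustrates the
# flavor of the model; the paper's exact HLDD definitions are not reproduced.

from dataclasses import dataclass
from typing import Callable, Dict, Optional, Union

State = Dict[str, int]

@dataclass
class Node:
    label: str                                   # variable tested at this node
    branches: Dict[int, "Term"]                  # value of `label` -> successor
    forced_branch: Optional[int] = None          # injected fault: fixed edge choice

Term = Union[Node, Callable[[State], int], int]  # subgraph, expression, or constant

def evaluate(term: Term, state: State) -> int:
    """Follow the path selected by the current state down to a terminal."""
    while isinstance(term, Node):
        key = term.forced_branch if term.forced_branch is not None else state[term.label]
        term = term.branches[key]
    return term(state) if callable(term) else term

# Tiny diagram for a register-transfer target: if sel == 1 take a + b, else a.
diagram = Node("sel", {0: lambda s: s["a"], 1: lambda s: (s["a"] + s["b"]) & 0xFF})

state = {"sel": 1, "a": 5, "b": 7}
print(evaluate(diagram, state))          # fault-free behavior: 12

diagram.forced_branch = 0                # node fault: the sel==0 edge is always taken
print(evaluate(diagram, state))          # faulty behavior: 5
```
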

4 citations


Cites background or methods from "Accelerating functional simulation ..."

  • ...In one or another way, this idea is implemented in different high-level fault models for different classes of digital systems....

  • ...The main idea of the high-level fault modeling is to obtain an incorrect version of the system from the high-level description by introducing a fault into the description....

References
Journal ArticleDOI
01 Jan 1998
TL;DR: Integrated circuits will lead to such wonders as home computers or at least terminals connected to a central computer, automatic controls for automobiles, and personal portable communications equipment as mentioned in this paper. But the biggest potential lies in the production of large systems.
Abstract: The future of integrated electronics is the future of electronics itself. The advantages of integration will bring about a proliferation of electronics, pushing this science into many new areas. Integrated circuits will lead to such wonders as home computers—or at least terminals connected to a central computer—automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today. But the biggest potential lies in the production of large systems. In telephone communications, integrated circuits in digital filters will separate channels on multiplex equipment. Integrated circuits will also switch telephone circuits and perform data processing. Computers will be more powerful, and will be organized in completely different ways. For example, memories built of integrated electronics may be distributed throughout the machine instead of being concentrated in a central unit. In addition, the improved reliability made possible by integrated circuits will allow the construction of larger processing units. Machines similar to those in existence today will be built at lower costs and with faster turnaround.

9,647 citations

Proceedings ArticleDOI
22 Jul 2003
TL;DR: A novel approach for functional coverage analysis automation supported by a tool that automates the coverage analysis at the functional level and is able to provide a quantitative evaluation of test suites developed to exercise the functionality defined in an executable specification.
Abstract: This paper presents a novel approach for functional coverage analysis automation. It is well known that functional verification is a real bottleneck in any digital design development. Consequently, it is necessary to develop new methodologies to increase the quality of functional verification. A metric that measures functional coverage is specific to each design and depends on its functional requirements. Hence, we propose a methodology supported by a tool that automates coverage analysis at the functional level. Our tool takes as input a standard executable specification and generates test bench components aimed at performing a functional coverage analysis on a specific design. We use functional metrics as parameters in our tool and apply these metrics to an executable specification. Using our methodology, we are able to provide a quantitative evaluation of test suites developed to exercise the functionality defined in an executable specification. The application of these test suites to an RTL design improves error detection through a better exploration of the design. It also increases the degree of confidence in the design.
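
The bookkeeping such a flow ultimately relies on can be sketched as follows: coverage items are named predicates (here written by hand rather than generated from an executable specification), observed transactions are sampled against them, and the test suite receives a quantitative score. The transaction layout and all names are assumptions, not the tool's interface.

```python
# Rough sketch of functional-coverage bookkeeping: coverage items are named
# predicates over observed transactions, each test's behavior is sampled against
# them, and the suite gets a quantitative score. The spec-driven generation of
# these items described in the abstract is not shown; names are hypothetical.

from typing import Callable, Dict

Transaction = Dict[str, int]                 # one observed operation/response

class FunctionalCoverage:
    def __init__(self, items: Dict[str, Callable[[Transaction], bool]]):
        self.items = items
        self.hits = {name: 0 for name in items}

    def sample(self, txn: Transaction) -> None:
        for name, predicate in self.items.items():
            if predicate(txn):
                self.hits[name] += 1

    def report(self) -> float:
        covered = sum(1 for name in self.items if self.hits[name] > 0)
        return 100.0 * covered / len(self.items)

# Hypothetical coverage items for a FIFO-like block.
cov = FunctionalCoverage({
    "write_when_full": lambda t: t["op"] == 1 and t["full"] == 1,
    "read_when_empty": lambda t: t["op"] == 0 and t["empty"] == 1,
    "simultaneous_rw": lambda t: t["op"] == 2,
})

for txn in [{"op": 1, "full": 1, "empty": 0}, {"op": 2, "full": 0, "empty": 0}]:
    cov.sample(txn)
print(f"functional coverage: {cov.report():.0f}%")   # 2 of 3 items hit -> 67%
```
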

11 citations

Journal ArticleDOI
01 Feb 2000
TL;DR: A novel method that combines constraints and input biasing in automatic bit-vector generation for block-level functional verification of digital designs, which is implemented in a tool called SimGen, and the application of SimGen to a set of commercial design blocks is discussed.
Abstract: Constraining and biasing are frequently used techniques to enhance the quality of randomized vector generation. In this paper, we present a novel method that combines constraints and input biasing in automatic bit-vector generation for block-level functional verification of digital designs, implemented in a tool called SimGen. Vector generation in SimGen is confined to a legal input space that is defined by constraints symbolically represented as Binary Decision Diagrams (BDDs). A constraint involving state variables in the design defines a state-dependent legal input space. Input biasing can also depend on the state of the design. The effects of constraints and input biasing are combined to form what we call the constrained probabilities of input vectors. An algorithm is developed to efficiently generate input vectors on-the-fly during simulation. The vector generation is a one-pass process, i.e., no backtracking or retry is needed. We also describe methods of minimizing the constraint BDDs in an effort to reduce the simulation-time overhead of SimGen, and we discuss the application of SimGen to a set of commercial design blocks.
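
A simplified illustration of the one-pass idea is sketched below: every node of a (here, hand-built) constraint BDD is weighted by the bias-weighted fraction of input assignments satisfying it, and a vector is drawn by a single walk from the root, choosing each variable with its constrained probability, so no backtracking or retry is needed. State-dependent constraints, BDD minimization, and SimGen's real interfaces are omitted, and all names are illustrative.

```python
# Simplified illustration of constraint-plus-bias vector generation in one pass:
# each node of a hand-built constraint BDD is weighted by the bias-weighted
# fraction of assignments that satisfy it, and a vector is drawn by walking from
# the root, so no backtracking is needed. State-dependent constraints and
# SimGen's real data structures/interfaces are not reproduced here.

import random

# A node is ("var", index, low_child, high_child); terminals are True / False.
# Variable indices are assumed to increase along every root-to-terminal path.

def weight(node, bias):
    """Bias-weighted probability that a random input satisfies this sub-BDD
    (variables skipped by the reduced BDD contribute a factor of 1)."""
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    _, i, lo, hi = node
    return (1.0 - bias[i]) * weight(lo, bias) + bias[i] * weight(hi, bias)

def generate_vector(root, bias, rng=random):
    """Draw one legal input vector according to the constrained probabilities."""
    assert weight(root, bias) > 0.0, "constraint must be satisfiable"
    vec = [None] * len(bias)
    node = root
    while node is not True:
        _, i, lo, hi = node
        p1 = bias[i] * weight(hi, bias)            # mass of legal vectors with x_i = 1
        p0 = (1.0 - bias[i]) * weight(lo, bias)    # mass of legal vectors with x_i = 0
        take_one = rng.random() < p1 / (p1 + p0)
        vec[i] = 1 if take_one else 0
        node = hi if take_one else lo
    for i, v in enumerate(vec):
        if v is None:                              # unconstrained on this path
            vec[i] = 1 if rng.random() < bias[i] else 0
    return vec

# Example constraint (x0 OR x1), with x0 biased toward 0 and x1 unbiased.
root = ("var", 0, ("var", 1, False, True), True)
print(generate_vector(root, bias=[0.2, 0.5]))
```
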

4 citations


"Accelerating functional simulation ..." refers methods in this paper

  • ...So the designer is still left with the problem of creating stimulus for the design. Designers can use software running on a processor as a part of the stimulus for a verification simulation....