
Showing papers on "Design for testing published in 2007"


Proceedings ArticleDOI
03 Jun 2007
TL;DR: An open source, variation aware Process Design Kit (PDK), based on Scalable CMOS design rules, down to 45 nm, for use in VLSI research, education and small businesses is discussed.
Abstract: This paper discusses an open source, variation-aware Process Design Kit (PDK), based on Scalable CMOS design rules, down to 45 nm, for use in VLSI research, education, and small businesses. This kit includes all the necessary layout design rules and extraction command decks to capture layout-dependent systematic variation and perform statistical circuit analysis. The kit also includes a standard cell and pad library with the necessary support files to enable full-chip place and route and verification for System-on-Chip designs. Test chips created with this PDK are designed so that they can be fabricated by fabrication facilities, allowing validation of the design rules so that they may be used in future multi-project runs and design contests.

376 citations


Journal ArticleDOI
TL;DR: This paper proposes a technique, called Lock & Key, to neutralize the potential for scan-based side-channel attacks by providing a flexible security strategy to modern designs without significant changes to scan test practices.
Abstract: Traditionally, the only standard method of testing that has consistently provided high fault coverage has been scan test, due to the high controllability and high observability this technique provides. The scan chains used in scan test not only allow test engineers to control and observe a chip, but these properties also allow the scan architecture to be used as a means to breach chip security. In this paper, we propose a technique, called Lock & Key, to neutralize the potential for scan-based side-channel attacks. It is very difficult to implement an all-inclusive security strategy, but by knowing the attacker, a suitable strategy can be devised. The Lock & Key technique provides a flexible security strategy to modern designs without significant changes to scan test practices. Using this technique, the scan chains are divided into smaller subchains. With the inclusion of a test security controller, access to the subchains is randomized for unauthorized users. Random access reduces repeatability and predictability, making reverse engineering more difficult. Without proper authorization, an attacker would need to unveil several layers of security before gaining proper access to the scan chain in order to exploit it. The proposed Lock & Key technique is design independent while maintaining a relatively low area overhead.

164 citations
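The subchain-scrambling mechanism described in the Lock & Key abstract above can be sketched as a toy software model. The function names, the even subchain split, and the seeded PRNG standing in for an on-chip random source are all illustrative assumptions, not the paper's hardware design:

```python
import random

def scan_out(chain, subchain_count, authorized, seed=1234):
    """Toy Lock & Key model: split the scan chain into subchains and
    scramble the subchain read-out order for unauthorized users.
    (Illustrative sketch; the paper uses an on-chip test security
    controller, not software.)"""
    size = len(chain) // subchain_count
    subchains = [chain[i * size:(i + 1) * size] for i in range(subchain_count)]
    if authorized:
        order = list(range(subchain_count))   # predictable design order
    else:
        rng = random.Random(seed)             # stand-in for an LFSR
        order = rng.sample(range(subchain_count), subchain_count)
    out = []
    for idx in order:
        out.extend(subchains[idx])
    return out

cells = list(range(16))                       # 16 scan cells, 4 subchains
print(scan_out(cells, 4, authorized=True))    # cells come out in design order
print(scan_out(cells, 4, authorized=False))   # subchain order scrambled
```

An attacker reading the scrambled stream still sees every cell value, but without knowing the subchain order cannot map values back to specific flip-flops, which is what degrades the repeatability and predictability that reverse engineering relies on.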


Book
20 Nov 2007
TL;DR: This book is a comprehensive guide to new VLSI Testing and Design-for-Testability techniques that will allow students, researchers, DFT practitioners, and VLSI designers to quickly master System-on-Chip Test architectures, for test debug and diagnosis of digital, memory, and analog/mixed-signal designs.
Abstract: Modern electronics testing has a legacy of more than 40 years. The introduction of new technologies, especially nanometer technologies with 90nm or smaller geometry, has allowed the semiconductor industry to keep pace with the increased performance-capacity demands from consumers. As a result, semiconductor test costs have been growing steadily and typically amount to 40% of today's overall product cost. This book is a comprehensive guide to new VLSI Testing and Design-for-Testability techniques that will allow students, researchers, DFT practitioners, and VLSI designers to quickly master System-on-Chip Test architectures, for test debug and diagnosis of digital, memory, and analog/mixed-signal designs. KEY FEATURES:
* Emphasizes VLSI Test principles and Design for Testability architectures, with numerous illustrations/examples.
* Most up-to-date coverage available, including Fault Tolerance, Low-Power Testing, Defect and Error Tolerance, Network-on-Chip (NOC) Testing, Software-Based Self-Testing, FPGA Testing, MEMS Testing, and System-In-Package (SIP) Testing, which are not yet available in any testing book.
* Covers the entire spectrum of VLSI testing and DFT architectures, from digital and analog, to memory circuits, and fault diagnosis and self-repair from digital to memory circuits.
* Discusses future nanotechnology test trends and challenges facing the nanometer design era; promising nanotechnology test techniques, including Quantum-Dots, Cellular Automata, Carbon-Nanotubes, and Hybrid Semiconductor/Nanowire/Molecular Computing.
* Practical problems at the end of each chapter for students.

151 citations


Journal ArticleDOI
TL;DR: The described method uses the ambiguity set concept and evolutionary computation to determine the optimal set of analog test points; the efficiency of the technique is compared with a method based on an entropy index.
Abstract: A new approach to optimal analog test point selection is proposed. The described method uses the ambiguity set concept and evolutionary computation to determine the optimal set of analog test points. After a brief introduction to analog testing and genetic algorithms, the proposed strategy is explained. The presented evolutionary approach is illustrated by a practical example of an analog circuit and by a series of hypothetical circuits. The efficiency of the technique is compared with a method based on an entropy index, and the obtained results are discussed.

127 citations
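The ambiguity set concept used in the paper above can be illustrated with a small, hypothetical fault dictionary: faults that produce the same value class at every chosen test point fall into the same ambiguity set, and the goal is the smallest point set that isolates every fault. The exhaustive search below is a toy stand-in for the paper's evolutionary computation, and all fault and test-point names are made up:

```python
from itertools import combinations

# Hypothetical fault dictionary: for each candidate test point, the
# value class observed under each fault f1..f4.  (Made-up data; the
# paper searches real circuit responses with a genetic algorithm.)
FAULTS = ["f1", "f2", "f3", "f4"]
SIGNATURES = {
    "tp1": [0, 0, 1, 1],
    "tp2": [0, 1, 0, 1],
    "tp3": [1, 1, 0, 0],
}

def ambiguity_sets(points):
    """Group faults that are indistinguishable at the chosen points."""
    groups = {}
    for i, fault in enumerate(FAULTS):
        key = tuple(SIGNATURES[p][i] for p in points)
        groups.setdefault(key, []).append(fault)
    return list(groups.values())

def minimal_test_points():
    """Smallest point set whose ambiguity sets are all singletons,
    i.e. every fault is uniquely isolated (exhaustive toy search)."""
    names = list(SIGNATURES)
    for k in range(1, len(names) + 1):
        for points in combinations(names, k):
            if all(len(g) == 1 for g in ambiguity_sets(points)):
                return list(points)
    return names

print(minimal_test_points())  # → ['tp1', 'tp2']
```

On real circuits the candidate set is far too large for exhaustive search, which is why the paper turns to evolutionary computation.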


Proceedings ArticleDOI
01 Oct 2007
TL;DR: The design for pre-bond testability enables the structural test necessary to continue 3D integration for microprocessors beyond a few layers, and is implemented at a minimal cost.
Abstract: Die stacking is a promising new technology that enables integration of devices in the third dimension. Recent research thrusts in 3D-integrated microprocessor design have demonstrated significant improvements in both power consumption and performance. However, this technology is currently being held back due to the lack of test technology. Because processor functionality is partitioned across different silicon die layers, only partial circuitry exists on each layer pre-bond. In current 3D manufacturing, layers in the die stack are simply bonded together to form the complete processor; no testing is performed at the pre-bond stage. Such a strategy leads to an exponential decay in the yield of the final product and places an economic limit on the number of die that can be stacked. To overcome this limit, pre-bond test is a necessity. In this paper, we present a technique to enable pre-bond test in each layer. Further, we address several issues with integrating this new test hardware into the final design. Finally, we use a sample 3D floorplan based on the Alpha 21264 to show that our technique can be implemented at a minimal cost (0.2% area overhead). Our design for pre-bond testability enables the structural test necessary to continue 3D integration for microprocessors beyond a few layers.

126 citations


Proceedings ArticleDOI
06 May 2007
TL;DR: This paper presents VIm-Scan: a low-overhead scan design methodology that maintains all the advantages of traditional scan-based testing yet prevents secret key extraction through the scan-out process.
Abstract: Scan-based DFT enhances the testability of a system by making its internal nodes more observable and controllable. However, in the case of a secure chip, the scan chain increases its vulnerability to attack, where an attacker can extract secret information by scanning out the states of internal nodes. This paper presents VIm-Scan: a low-overhead scan design methodology that maintains all the advantages of traditional scan-based testing yet prevents secret key extraction through the scan-out process. Experimental results show that the proposed approach entails significantly lower design overhead (~5× fewer additional gates) with comparable or better protection against attack than existing techniques.

103 citations


Proceedings ArticleDOI
Srivaths Ravi1
01 Oct 2007
TL;DR: Concerns and challenges in power-aware test are highlighted, various practices drawn from both academia and industry are surveyed, and critical gaps that need to be addressed in the future are pointed out.
Abstract: Power-aware test is increasingly becoming a major manufacturing test consideration due to the problems of increased power dissipation in various test modes as well as test implications that arise due to the usage of various low-power design technologies in devices today. Several challenges emerge for test engineers and test tool developers, including (and not restricted to) understanding of various concerns associated with power-aware test, development of power-aware design-for-test (DFT), automatic test pattern generation (ATPG) techniques, and test power analysis flows, evaluation of their efficacy and ensuring easy/rapid deployment. This paper highlights concerns and challenges in power-aware test, surveys various practices drawn from both academia and industry, and points out critical gaps that need to be addressed in the future.

98 citations


Book
04 Dec 2007
TL;DR: In this paper, the authors present a comprehensive guide to new VLSI testing and design-for-Testability techniques that will allow students, researchers, DFT practitioners, and VLSI designers to quickly master System-on-Chip Test architectures, for test debug and diagnosis of digital, memory, and analog/mixed-signal designs.
Abstract: Modern electronics testing has a legacy of more than 40 years. The introduction of new technologies, especially nanometer technologies with 90nm or smaller geometry, has allowed the semiconductor industry to keep pace with the increased performance-capacity demands from consumers. As a result, semiconductor test costs have been growing steadily and typically amount to 40 per cent of today's overall product cost. This book is a comprehensive guide to new VLSI Testing and Design-for-Testability techniques that will allow students, researchers, DFT practitioners, and VLSI designers to quickly master System-on-Chip Test architectures, for test debug and diagnosis of digital, memory, and analog/mixed-signal designs. It emphasizes VLSI Test principles and Design for Testability architectures, with numerous illustrations/examples. It has the most up-to-date coverage available, including Fault Tolerance, Low-Power Testing, Defect and Error Tolerance, Network-on-Chip (NOC) Testing, Software-Based Self-Testing, FPGA Testing, MEMS Testing, and System-In-Package (SIP) Testing, which are not yet available in any testing book. It covers the entire spectrum of VLSI testing and DFT architectures, from digital and analog, to memory circuits, and fault diagnosis and self-repair from digital to memory circuits. It discusses future nanotechnology test trends and challenges facing the nanometer design era; promising nanotechnology test techniques, including Quantum-Dots, Cellular Automata, Carbon-Nanotubes, and Hybrid Semiconductor/Nanowire/Molecular Computing. It includes practical problems at the end of each chapter for students.

97 citations


Proceedings ArticleDOI
01 Oct 2007
TL;DR: The results show that the MAP3D and VIA3D approaches require no changes to 2D scan chain algorithms, while OPT3D achieves the best wire length reduction for the scan chain design.
Abstract: Scan chains are widely used to improve the testability of IC designs. In traditional 2D IC designs, various design techniques for the construction of scan chains have been proposed to facilitate DFT (Design-For-Test). Recently, three-dimensional (3D) technologies have been proposed as a promising solution to continue technology scaling. In this paper, we study scan chain construction for 3D ICs, examining the impact of 3D technologies on scan chain ordering. Three different 3D scan chain design approaches (namely, VIA3D, MAP3D, and OPT3D) are proposed and compared, with experimental results for the ISCAS89 benchmark circuits. The advantages as well as disadvantages of each approach are discussed. The results show that the MAP3D and VIA3D approaches require no changes to 2D scan chain algorithms, while OPT3D achieves the best wire length reduction for the scan chain design. The average scan chain wire length of six ISCAS89 benchmarks obtained from OPT3D shows a 46.0% reduction compared to the 2D scan chain design. To the best of our knowledge, this is the first study on scan chain design for 3D integrated circuits.

68 citations
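Scan chain ordering in general reduces to stitching scan cells into a short route over their placed locations. The 2D nearest-neighbor heuristic below is a hedged sketch of that wire-length objective only, not the paper's VIA3D/MAP3D/OPT3D algorithms, and the cell coordinates are hypothetical:

```python
import math

def chain_length(order, coords):
    """Total stitched wire length of a scan chain visiting cells in order."""
    return sum(math.dist(coords[a], coords[b])
               for a, b in zip(order, order[1:]))

def nearest_neighbor_order(coords, start):
    """Greedy scan stitching: always hop to the closest unvisited cell.
    (A toy 2D heuristic to illustrate the objective; real tools and the
    paper's 3D approaches are far more involved.)"""
    remaining = set(coords) - {start}
    order = [start]
    while remaining:
        here = coords[order[-1]]
        nxt = min(remaining, key=lambda c: math.dist(here, coords[c]))
        order.append(nxt)
        remaining.remove(nxt)
    return order

# Hypothetical placed scan cells (x, y).
cells = {"a": (0, 0), "b": (1, 0), "c": (2, 0), "d": (0.5, 1)}
order = nearest_neighbor_order(cells, "a")
print(order)                                 # → ['a', 'b', 'c', 'd']
print(round(chain_length(order, cells), 2))  # → 3.8
```

In the 3D setting the distance function additionally has to account for through-silicon vias between die layers, which is where the three proposed approaches differ.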


Proceedings ArticleDOI
01 Oct 2007
TL;DR: This work presents a combinational scan compression method that preserves the low-impact advantages of traditional scan compression, while also allowing any number and distribution of Xs with virtually no loss of test quality.
Abstract: Traditional scan and, more recently, scan compression are increasingly accepted for reducing test cost and improving quality in ever more complex designs. Combinational scan compression techniques are attractive for their low impact on area, timing and design flow, but are best suited for designs with a limited number of unknowns (Xs). However, recent design performance and cost tradeoffs create a much higher density of Xs than previously expected. We present a combinational scan compression method that preserves the low-impact advantages, while also allowing any number and distribution of Xs with virtually no loss of test quality. Results on industrial designs with a varied density of Xs demonstrate consistent data and test time compressions with negligible impact on all design parameters.

63 citations


Proceedings ArticleDOI
04 Jun 2007
TL;DR: A novel low power test scheme integrated with the embedded deterministic test environment significantly reduces switching rates in scan chains with minimal hardware modification, with clear results for industrial circuits.
Abstract: The paper presents a novel low power test scheme integrated with the embedded deterministic test environment. It significantly reduces switching rates in scan chains with minimal hardware modification. Experimental results obtained for industrial circuits clearly indicate that switching activity can be reduced up to 150 times along with improved compression ratios.

Proceedings ArticleDOI
06 May 2007
TL;DR: The authors present a scan compression method designed for minimal impact in all aspects: area overhead, timing, and design flow, easily adopted on top of existing scan designs and fully integrated in the scan synthesis and test generation flows.
Abstract: Scan is widely accepted as the basis for reducing test cost and improving quality; however, its effectiveness is compromised by increasingly complex designs and fault models that can result in high scan data volume and application time. The authors present a scan compression method designed for minimal impact in all aspects: area overhead, timing, and design flow. Easily adopted on top of existing scan designs, the method is fully integrated in the scan synthesis and test generation flows. Data and test time compressions of over 10× were obtained on industrial designs with negligible overhead and no impact on schedule.

Journal ArticleDOI
TL;DR: Oscillation-based built-in self-test (OBIST) methodology for testing analog components in mixed-signal circuits is implemented in this paper and is utilized for on-chip generation of oscillatory responses corresponding to the analog-circuit components.
Abstract: This paper aims to develop an approach to test analog and mixed-signal embedded-core-based system-on-chips (SOCs) with built-in hardware. In particular, oscillation-based built-in self-test (OBIST) methodology for testing analog components in mixed-signal circuits is implemented in this paper. The proposed OBIST structure is utilized for on-chip generation of oscillatory responses corresponding to the analog-circuit components. A major advantage of the OBIST method is that it does not require stimulus generators or complex response analyzers, which makes it suitable for testing analog circuits in mixed-signal SOC environments. Extensive simulation results on sample analog and mixed-signal benchmark circuits and other circuits described by netlists in HSPICE format are provided to demonstrate the feasibility, usefulness, and relevance of the proposed implementations.
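The OBIST go/no-go decision can be reduced to a frequency-tolerance check: the circuit under test is reconfigured as an oscillator, and a fault is flagged when its oscillation frequency drifts outside a band around nominal. The sketch below is a generic illustration of that idea; the 5% tolerance is an assumption, not a figure from the paper:

```python
def obist_pass(measured_hz, nominal_hz, tol=0.05):
    """OBIST pass/fail check: the circuit under test, reconfigured as
    an oscillator, passes when its frequency stays within a tolerance
    band around nominal.  (Generic sketch; tolerance is assumed.)"""
    return abs(measured_hz - nominal_hz) <= tol * nominal_hz

print(obist_pass(1020.0, 1000.0))  # → True  (2% deviation: within band)
print(obist_pass(900.0, 1000.0))   # → False (10% deviation: fault flagged)
```

Because the decision only needs a frequency counter, no analog stimulus generator or complex response analyzer is required on chip, which is the advantage the abstract highlights.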

Proceedings ArticleDOI
06 May 2007
TL;DR: The authors show that glitching activity on nodes must be considered in order to correctly handle constraints on instantaneous peak power and include a power profiler that can analyze a pattern source for violations and a PODEM-based pattern generation engine for generating power-safe patterns.
Abstract: Excessive dynamic voltage drop in the power supply rails during test mode is known to result in false failures and impact yield when testing devices that use low-cost wire-bond packages. Identifying and debugging such test failures is a complex and effort-intensive process, especially when scan compression is involved. From a design cycle-time viewpoint, it is best to avoid this problem by generating "power-safe" scan patterns. The generation of power-safe patterns must take into consideration the DFT architecture, physical design, and timing and power constraints. In this paper, the authors propose such a framework and show experimental results on some benchmark circuits. The framework can address a non-uniform power grid and region-based power constraints. The authors show that glitching activity on nodes must be considered in order to correctly handle constraints on instantaneous peak power. The framework includes a power profiler that can analyze a pattern source for violations and a PODEM-based pattern generation engine for generating power-safe patterns.

Proceedings ArticleDOI
06 May 2007
TL;DR: A novel low power test scheme integrated with the embedded deterministic test environment reduces switching rates in scan chains with no hardware modification and results obtained indicate that switching activity can be reduced up to 23 times.
Abstract: This paper presents a novel low power test scheme integrated with the embedded deterministic test environment. It reduces switching rates in scan chains with no hardware modification. Experimental results obtained for industrial circuits indicate that switching activity can be reduced up to 23 times.

Proceedings ArticleDOI
20 May 2007
TL;DR: This embedded tutorial introduces the topic of low power test and it overviews the basic techniques and some recent advancements in this field.
Abstract: Excessive power during test affects the reliability of digital integrated circuits, test throughput and manufacturing yield. Numerous low power test methods have been investigated over the past decade and new power-aware automatic test pattern generation, design-for-test and test planning techniques have emerged. This embedded tutorial introduces the topic of low power test and it overviews the basic techniques and some recent advancements in this field.
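A basic quantity behind the low-power test techniques surveyed above is the weighted transitions count (WTC), a common proxy for scan shift power: a transition entering the chain early is shifted through more cells and therefore toggles more flip-flops. A minimal sketch of this textbook metric (one common formulation, not taken from the tutorial itself):

```python
def weighted_transitions(pattern):
    """Weighted Transitions Count (WTC) of a scan-in vector: a
    transition between bits i and i+1 travels through (n-1-i) scan
    cells while shifting, so earlier transitions cost more toggles."""
    n = len(pattern)
    return sum(n - 1 - i
               for i in range(n - 1)
               if pattern[i] != pattern[i + 1])

print(weighted_transitions("0101"))  # → 6  (worst case: 3 + 2 + 1)
print(weighted_transitions("0011"))  # → 2  (single mid-chain transition)
print(weighted_transitions("0000"))  # → 0  (no switching while shifting)
```

Low-power ATPG and X-filling techniques effectively minimize this kind of metric over the generated pattern set, trading it off against compression ratio and pattern count.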

Proceedings ArticleDOI
16 Apr 2007
TL;DR: An overview of the most important design features of the new full-custom embedded ripple-through FIFO module is given and its test and design-for-test approach is described.
Abstract: Embedded First-In First-Out (FIFO) memories are increasingly used in many IC designs. We have created a new full-custom embedded ripple-through FIFO module with asynchronous read and write clocks. The implementation is based on a micropipeline architecture and is at least a factor two smaller than SRAM-based and standard-cell-based counterparts. This paper gives an overview of the most important design features of the new FIFO module and describes its test and design-for-test approach.

Proceedings ArticleDOI
01 Oct 2007
TL;DR: The proposed test generation algorithm creates a complete test set that guarantees each defective scan cell has unique failing behavior and is extended to handle multiple failing scan chains and designs with embedded scan compression logic.
Abstract: In this paper, we present a test generation algorithm to improve scan chain failure diagnosis resolution. The proposed test generation algorithm creates a complete test set that guarantees each defective scan cell has unique failing behavior. This algorithm handles stuck-at fault and timing fault models. Problems and solutions that may happen in practical usage are discussed. We further extend the test generation algorithm to handle multiple failing scan chains and designs with embedded scan compression logic. Experimental results show the effectiveness of the proposed diagnostic test generation algorithm.

Proceedings ArticleDOI
16 Apr 2007
TL;DR: The proposed DfT solution for modern serial-link transceivers addresses the testability and observability issues of the transceiver for both characterization and production testing, and achieves significantly higher fault coverage with lower hardware requirements than the conventional approach.
Abstract: This paper describes a DfT solution for modern serial-link transceivers. We first summarize the architectures of the Crosstalk Canceller and the Equalizer used in advanced transceivers to which the proposed solution can be applied. The solution addresses the testability and observability issues of the transceiver for both characterization and production testing. Without using sophisticated test instrument settings, the proposed solution can test the clock and data recovery circuit and characterize the decision-feedback equalizer in the receiver. Our experiments demonstrate that the proposed method achieves significantly higher fault coverage with lower hardware requirements than the conventional approach of probing the eye-opening of the signals inside the transceiver.

Proceedings ArticleDOI
01 Oct 2007
TL;DR: The P1687 IJTAG standard working group was created to investigate formalizing and standardizing this use of the 1149.1 TAP and TAP controller, and after a little more than a year's worth of effort has produced a hardware architecture proposal that is currently being used as a strawman to investigate and develop the description and protocol language effort.
Abstract: In recent times, the 1149.1 TAP and TAP controller have begun to play a more important role in accessing embedded logic that is not specifically limited to 1149.1's scope as a board-test standard. This logic, referred to by the generic term instruments, includes manufacturing test and design-for-test (DFT) logic; design-for-debug/diagnosis (DFD) logic; design-for-yield (DFY) logic and monitors; and in-system verification logic (such as hardware assertions). The P1687 IJTAG standard working group was created to investigate formalizing and standardizing this use of the 1149.1 controller, and after a little more than a year's worth of effort it has produced a hardware architecture proposal that is currently being used as a strawman to investigate and develop the description and protocol language effort.

Patent
04 Sep 2007
TL;DR: In this paper, a testability solution for testing power switches is presented, where a comparator (230) having a first input coupled to a reference signal source (215) and having a second input coupled with a node (225) between the one or more switches (115) and the functional block (130) is used to evaluate the behaviour of the switches based on the reference signal and a signal from the node.
Abstract: An integrated circuit (200) comprises a functional block (130) conductively coupled to a supply rail (110) via one or more switches (115). The IC further comprises selection means (220) responsive to a test enable signal for activating the one or more switches (115) in a test mode of the IC and evaluation means such as a comparator (230) having a first input coupled to a reference signal source (215) and having a second input coupled to a node (225) between the one or more switches (115) and the functional block (130) for evaluating the behaviour of the one or more switches (115) based on the reference signal and a signal from the node (225). Thus, the present invention provides a design for testability solution for testing power switches.

Proceedings ArticleDOI
16 Apr 2007
TL;DR: In this article, a self-test scheme for analog and mixed-signal devices based on die-level process monitoring is proposed, which provides a reliable method for early identification of excessive process parameter variations in production tests, allowing faulty circuits to be quickly discarded.
Abstract: This paper reports a new built-in self-test scheme for analog and mixed-signal devices based on die-level process monitoring. The objective of this test is not to replace traditional specification-based tests, but to provide a reliable method for early identification of excessive process parameter variations in production tests that allows faulty circuits to be quickly discarded. Additionally, the possibility of on-chip process deviation monitoring provides valuable information, which is used to guide the test and to allow the estimation of selected performance figures. The information obtained through guiding and monitoring process variations is re-used and supplements the circuit calibration.

Proceedings ArticleDOI
R. Molyneaux1, T. Ziaja1, Hong Kim1, S. Aryani1, Sungbae Hwang1, A. Hsieh1 
01 Oct 2007
TL;DR: The Niagara2 System-on-Chip is Sun Microsystems' latest processor in the Eco-sensitive CoolThreads line of multi-threaded servers; this DFT survey introduces the RAWWCas memory test, a Hybrid Flop Design, and a fast, efficient bitmapping architecture called DMO.
Abstract: The Niagara2 System-on-Chip is Sun Microsystems' latest processor in the Eco-sensitive CoolThreads line of multi-threaded servers. This DFT survey of the Niagara2 chip introduces the RAWWCas memory test, a Hybrid Flop Design, and a fast, efficient bitmapping architecture called DMO. It also showcases some excellent DFT results for this challenging system-on-chip design project.

Proceedings ArticleDOI
01 Jan 2007
TL;DR: A new approach is proposed that both qualifies and improves the functional verification process of VHDL descriptions; a heuristic is used to automatically improve IP validation data.
Abstract: The level of confidence in a VHDL description directly depends on the quality of its verification. This quality can be evaluated by mutation-based testing, but improving it requires tremendous effort. In this paper, we propose a new approach that both qualifies and improves the functional verification process. First, we qualify test cases using mutation testing metrics: faults are injected in the design under verification (DUV) (making DUV mutants) to check the capacity of the test cases to detect these mutants. Then, a heuristic is used to automatically improve IP validation data. Experimental results obtained on RTL descriptions from the ITC'99 benchmark show how efficient our approach is.

Journal ArticleDOI
TL;DR: A satisfiability (SAT)-based framework for automatically generating test programs that target gate-level stuck-at faults in microprocessors is presented and it is shown that adding design for testability (DFT) elements is equivalent to modifying these clauses such that the unsatisfiable segment becomes satisfiable.
Abstract: In this paper, we present a satisfiability (SAT)-based framework for automatically generating test programs that target gate-level stuck-at faults in microprocessors. The microarchitectural description of a processor is first translated into a unified register-transfer level (RTL) circuit description, called assignment decision diagram (ADD), for test analysis. Test generation involves extraction of justification/propagation paths in the unified circuit representation from an embedded module's input-output (I/O) ports to primary I/O ports, abstraction of RTL modules in the justification/propagation paths, and translation of these paths into Boolean clauses in conjunctive normal form (CNF). Additional clauses are added that capture precomputed test vectors/responses at the embedded module's I/O ports. An SAT solver is then invoked to find valid paths that justify the precomputed vectors to primary input ports and propagate the good/faulty responses to primary output ports. Since the ADD is derived directly from a microarchitectural description, the generated test sequences correspond to a test program. If a given SAT instance is not satisfiable, then Boolean implications (also known as the unsatisfiable segment) that are responsible for unsatisfiability are efficiently and accurately identified. We show that adding design for testability (DFT) elements is equivalent to modifying these clauses such that the unsatisfiable segment becomes satisfiable. Test generation at the RTL also imposes a large number of initial conditions that need to be satisfied for successful detection of targeted stuck-at faults. We demonstrate that application of the Boolean constraint propagation (BCP) engine in SAT solvers propagates these conditions leading to significant pruning of the sequential search space which in turn leads to a reduction in test generation time. 
Experimental results demonstrate an 11.1X speedup in test generation time for test generation at the RTL over a state-of-the-art gate-level sequential generator called MIX, at comparable fault coverages. An unsatisfiability-based DFT approach at the RTL improves this fault coverage to near 100% and incurs very low area overhead (3.1%). Unlike previous approaches that either generate a test program consisting of random instruction sequences or assume the existence of test program templates, the proposed approach constructs test programs in a deterministic fashion from the microarchitectural description of a processor.

Journal ArticleDOI
TL;DR: A new two-pass hybrid method is proposed to design an efficient scan tree architecture based on approximate compatibility that outperforms the earlier methods in reducing test application time significantly without degrading fault coverage.
Abstract: Tree-based scan path architectures have recently been suggested for reducing test application time or test data volume in today's high-density very large scale integrated circuits. However, these techniques strongly rely on the existence of a large number of compatible sets of flip-flops under the given test set and therefore may not be suitable for a highly compact test set generated by an efficient automatic test pattern generator tool. Tree-based architectures also suffer from loss of fault coverage while achieving a significant reduction ratio for test time or data. In this paper, to circumvent this problem, a new two-pass hybrid method is proposed to design an efficient scan tree architecture based on approximate compatibility. The method is particularly suitable for a highly compact test set having fewer don't cares and low compatibility. Finally, to reduce the volume of scan-out data, test responses shifted out from the leaf nodes of the scan tree are compacted by a space compactor, which is designed specially for the proposed scan tree architecture. The compactor uses an XOR tree, and its overhead is low. The design thus offers a solution to both test data and response compaction. Experimental results on various benchmark circuits demonstrate that the proposed algorithm outperforms the earlier methods in reducing test application time significantly without degrading fault coverage.
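The compatibility notion behind scan trees can be sketched in a few lines: two flip-flops are compatible if no test pattern assigns them conflicting specified values, so they can share one branch of the tree and load the same scanned-in bit. The greedy grouping below is a toy stand-in for the paper's two-pass hybrid method (which uses approximate compatibility), and the test set is hypothetical:

```python
def compatible(col_a, col_b):
    """Two flip-flops are compatible if no pattern gives them
    conflicting specified values ('x' = don't care, matches anything)."""
    return all(a == b or "x" in (a, b) for a, b in zip(col_a, col_b))

def greedy_groups(columns):
    """Greedily merge flip-flops into mutually compatible groups; each
    group could sit at one depth of a scan tree and share a scanned-in
    bit.  (Toy sketch, not the paper's two-pass hybrid method.)"""
    groups = []
    for name, col in columns.items():
        for g in groups:
            if all(compatible(col, columns[m]) for m in g):
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

# Hypothetical test set: per-flip-flop values across three patterns.
test_set = {"ff0": "01x", "ff1": "0x0", "ff2": "11x", "ff3": "x1x"}
print(greedy_groups(test_set))  # → [['ff0', 'ff1', 'ff3'], ['ff2']]
```

The fewer groups, the wider the scan tree and the shorter the shift time, which is exactly why a highly compact test set with few don't-cares (and hence low compatibility) is the hard case the paper targets.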

Proceedings ArticleDOI
06 May 2007
TL;DR: A novel DFT technique based on multimode Illinois scan architecture (MILS) for low pin count test that simultaneously reduces test data volume and test application time is presented.
Abstract: The authors present a novel DFT technique based on multimode Illinois scan architecture (MILS) for low pin count test that simultaneously reduces test data volume and test application time. By using the proposed technique, significant savings in test data volume and testing time can be obtained without modifying the clock tree of the design and with a very small combinational area overhead. Experimental results for two large industrial circuits show that test data volume and test application time reductions on the order of 100× can be achieved in all cases with less than 1% area overhead over ILS.

Journal ArticleDOI
TL;DR: A design-for-digital-testability (DfDT) switched-capacitor circuit structure for testing Sigma-Delta modulators with digital stimuli is presented to reduce the overall testing cost and achieves many advantages including lower cost, higher fault coverage, higher measurement accuracy, and the capability of performing at-speed tests.
Abstract: A design-for-digital-testability (DfDT) switched-capacitor circuit structure for testing Sigma-Delta modulators with digital stimuli is presented to reduce the overall testing cost. In the test mode, the DfDT circuits are reconfigured as a one-bit digital-to-charge converter to accept a repetitively applied Sigma-Delta modulated bit-stream as its stimulus. The single-bit characteristic ensures that the generated stimulus is nonlinearity free. In addition, the proposed DfDT structure reuses most of the analog components in the test mode and keeps the same loads for the operational amplifiers as if they were in the normal mode. It thereby achieves many advantages including lower cost, higher fault coverage, higher measurement accuracy, and the capability of performing at-speed tests. A second-order Sigma-Delta modulator was designed and fabricated to demonstrate the effectiveness of the DfDT structure. Our experimental results show that the digital test is able to measure a harmonic distortion lower than -106 dBFS. Meanwhile, the dynamic range measured with the digital stimulus is as high as 84.4 dB at an over-sampling ratio of 128. The proposed DfDT scheme can be easily applied to other types of Sigma-Delta modulators, making them also digitally testable.
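The digital stimulus described above is a one-bit Sigma-Delta modulated bit-stream, whose single-bit nature makes it free of amplitude nonlinearity. As a minimal software sketch of how such a stimulus could be generated (a generic first-order modulator; the actual stimulus design in the paper may differ), consider:

```python
import math

def sigma_delta_bitstream(samples):
    """First-order software Sigma-Delta modulator: encode samples in
    [-1, 1] as a one-bit stream whose low-frequency average tracks the
    input. Because the output takes only two levels, it carries no
    amplitude nonlinearity of its own."""
    integrator = 0.0
    bits = []
    for x in samples:
        out = 1 if integrator >= 0 else -1   # 1-bit quantizer
        integrator += x - out                # error feedback
        bits.append(1 if out > 0 else 0)
    return bits

# Hypothetical test tone at an over-sampling ratio of 128
n = 256
tone = [0.5 * math.sin(2 * math.pi * k / 128) for k in range(n)]
stream = sigma_delta_bitstream(tone)
# Over whole periods the duty cycle approaches (1 + mean(input)) / 2 = 0.5
print(sum(stream) / n)
```

In the paper's test mode, such a pre-computed bit-stream is applied repetitively to the one-bit digital-to-charge converter, so the only analog accuracy requirement falls on the two reference levels.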

Proceedings ArticleDOI
06 Jan 2007
TL;DR: This paper presents a low-cost solution for implementing LOS tests by adding a small amount of logic in each flip-flop to align the slow scan enable signal to the clock edge, supporting full LOS and LOC testing.
Abstract: Scan-based delay testing is currently mostly implemented using launch-on-capture (LOC) delay tests. Launch-on-shift (LOS) tests are generally more effective, achieving higher fault coverage with significantly fewer test vectors, but require a fast scan enable, which is not supported by most designs. The paper presents a low-cost solution for implementing LOS tests by adding a small amount of logic (six transistors) in each flip-flop to align the slow scan enable signal to the clock edge. The new design can support full LOS and LOC testing, achieving an average TDF coverage of 92.67% in this combined mode for the ISCAS89 benchmarks. Adding a second slow global scan enable signal also allows mixed LOC/LOS tests, which can further increase coverage up to 94.86% on average for the ISCAS89 benchmarks.
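The LOS/LOC distinction above comes down to how the second (launch) vector of a delay-test pair is produced. A small sketch (Python; the next-state function is a hypothetical stand-in for the circuit's functional logic) makes the contrast concrete:

```python
def los_launch(scan_vector):
    """Launch-on-shift: the launch vector is the scanned-in vector
    shifted one more position, so scan enable must switch to functional
    mode within one clock cycle ('x' marks the freshly shifted-in bit)."""
    return ['x'] + scan_vector[:-1]

def loc_launch(scan_vector, next_state_fn):
    """Launch-on-capture: the launch vector is the circuit's functional
    next state after applying the initial vector, so no fast scan
    enable is needed."""
    return next_state_fn(scan_vector)

v1 = [1, 0, 1, 1]
print(los_launch(v1))                                # ['x', 1, 0, 1]
# Hypothetical next-state function: each flip-flop captures its inverse
print(loc_launch(v1, lambda s: [1 - b for b in s]))  # [0, 1, 0, 0]
```

Because the ATPG tool fully controls the LOS launch vector (any one-bit shift of the scan state), LOS reaches more transition faults than LOC, whose launch vector is constrained to be a reachable functional next state; this is what motivates the fast, edge-aligned scan enable in the paper.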

Proceedings ArticleDOI
29 Aug 2007
TL;DR: A new approach is proposed that both qualifies and improves the functional verification process of VHDL descriptions, and a heuristic is used to automatically improve IP validation data.
Abstract: The level of confidence in a VHDL description directly depends on the quality of its verification. This quality can be evaluated by mutation-based testing, but improving it requires tremendous effort. In this paper, we propose a new approach that both qualifies and improves the functional verification process. First, we qualify test cases using mutation testing metrics: faults are injected into the design under verification (DUV), creating mutants of the DUV, to check the capacity of the test cases to detect these mutants. Then, a heuristic is used to automatically improve the IP validation data. Experimental results obtained on RTL descriptions from the ITC'99 benchmark demonstrate the efficiency of our approach.
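The qualification step described above follows the standard mutation-testing loop: generate mutants of the design, run the test cases against each, and score the test set by the fraction of mutants it distinguishes from the original. A minimal sketch (Python, with a toy combinational function standing in for a VHDL DUV; all names hypothetical):

```python
# Design under verification: a toy combinational function,
# standing in for a VHDL DUV (hypothetical example).
def duv(a, b):
    return (a and b) or (not a)

# Mutants: single-operator substitutions of the DUV
mutants = [
    lambda a, b: (a or b) or (not a),    # and -> or
    lambda a, b: (a and b) and (not a),  # or -> and
    lambda a, b: (a and b) or a,         # dropped negation
]

def mutation_score(tests):
    """Fraction of mutants 'killed', i.e. distinguished from the DUV
    by at least one test case. A low score flags weak validation data."""
    killed = sum(
        1 for m in mutants
        if any(bool(m(a, b)) != bool(duv(a, b)) for a, b in tests)
    )
    return killed / len(mutants)

weak_tests = [(1, 1)]
strong_tests = [(1, 1), (1, 0), (0, 0)]
print(mutation_score(weak_tests))    # only 1 of 3 mutants killed
print(mutation_score(strong_tests))  # 1.0: all mutants killed
```

The paper's heuristic would then operate on the surviving (live) mutants, generating additional stimuli until the score stops improving.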