Statistical timing for parametric yield prediction of digital integrated circuits
References
Monte Carlo Methods
The Greatest of a Finite Set of Random Variables
Frequently Asked Questions (14)
Q2. How can the authors reduce the computational burden of finding the ellipsoid?
Path filtering, as discussed in the section on path filtering, can reduce the computational burden of finding the ellipsoid.
Q3. What is the reason for the fraction count(i)/2^n?
The justification for this step is that, for a small enough parallelepiped, the fraction count(i)/2^n approximates the fraction of the time the given path is the worst-slack path when points are sampled within the parallelepiped.
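The vertex-counting idea above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the path slack models and the function name `criticality_fractions` are assumptions for the example, with each path's slack taken to be an arbitrary function of the parameter point.

```python
from itertools import product

def criticality_fractions(paths, lower, upper):
    """For one parallelepiped, approximate each path's criticality as
    count(i) / 2^n: the fraction of the 2^n vertices at which path i
    has the worst (smallest) slack.

    `paths` is a list of slack functions f(point) -> float (hypothetical
    delay models); `lower`/`upper` are the box's corner coordinates.
    """
    n = len(lower)
    count = [0] * len(paths)
    for bits in product((0, 1), repeat=n):
        # Enumerate all 2^n vertices of the axis-aligned parallelepiped.
        vertex = tuple(lower[d] if b == 0 else upper[d]
                       for d, b in enumerate(bits))
        slacks = [f(vertex) for f in paths]
        count[slacks.index(min(slacks))] += 1  # worst-slack path wins
    return [c / 2**n for c in count]
```

For example, with two linear slack models on the unit box, the fractions report which path dominates at more vertices:

```python
fracs = criticality_fractions(
    [lambda p: 1.0 - 0.5 * p[0] - 0.1 * p[1],
     lambda p: 0.8 - 0.1 * p[0] - 0.5 * p[1]],
    (0.0, 0.0), (1.0, 1.0))
# → [0.25, 0.75]
```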
Q4. What is the common method for determining the criticality of a parallelepiped?
In the parallelepiped method, at the lowest level of recursion, the authors determine, for each parallelepiped, the paths for which that parallelepiped is infeasible.
Q5. What is the criticality of the parallelepiped method?
Since the parallelepiped method becomes infeasible at higher dimensions, the authors cannot use the method above to determine the criticality for each path at a given performance.
Q6. What is the function that returns the vertex of the parallelepiped that has the lowest coordinate in each dimension?
In the algorithm, lowerLeft represents a function that returns the vertex of the parallelepiped that has the lowest coordinate in each dimension.
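A helper with the behavior described for lowerLeft could look like the sketch below; the Python name `lower_left` and the vertex-list representation are assumptions for illustration, not the paper's code.

```python
def lower_left(vertices):
    """Return the vertex of an axis-aligned parallelepiped that has the
    lowest coordinate in each dimension (a sketch of the lowerLeft helper
    described above; `vertices` is an iterable of coordinate tuples)."""
    # Take the per-dimension minimum across all vertices.
    return tuple(min(coords) for coords in zip(*vertices))
```

For an axis-aligned box this per-dimension minimum is itself one of the box's vertices, e.g. `lower_left([(1, 4), (3, 2), (1, 2), (3, 4)])` gives `(1, 2)`.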
Q7. What is the way to solve the problem of the binding probability method?
One avenue of future work is to apply the path-filtering algorithm as a preprocessing step for both the parallelepiped and the binding probability methods.
Q8. What is the method for estimating the true probability distribution curve of a circuit?
The method provides a guaranteed lower bound on the true probability distribution curve of circuit delay, together with a "useful" upper bound on that curve.
Q9. What is the purpose of the ellipsoid filtering?
This purpose is served by the way in which the authors perform path filtering: all important directions are accounted for, ensuring that the ellipsoid is "boxed in" on all sides and is therefore fairly representative of the real ellipsoid.
Q10. What are the anticipated applications of these methods?
Although the results are shown here with only two sources of environmental variation, the anticipated applications of these methods are to solve the problem of timing circuits with multiple voltage islands and to take manufacturing variations into account.
Q11. What is the method for calculating the yield of a parallelepiped?
With the above observation, the region of integration in the parameter space is recursively subdivided into progressively smaller parallelepipeds until the authors find parallelepipeds all of whose vertices are feasible.
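The recursive subdivision can be sketched as below. This is an illustrative lower-bound computation under assumptions: `is_feasible` is a hypothetical per-point timing check, `depth` caps the recursion, and the early return relies on the first-order (linear) delay models, for which all vertices feasible implies the whole box is feasible.

```python
from itertools import product

def feasible_volume(is_feasible, lower, upper, depth):
    """Recursively subdivide the parameter box, accumulating the volume of
    parallelepipeds all of whose vertices are feasible. A sketch of the
    subdivision idea: mixed boxes are bisected in every dimension until
    `depth` is exhausted, so the result is a lower bound on the feasible
    volume (undecided boxes contribute nothing)."""
    n = len(lower)
    vertices = [tuple(lower[d] if b == 0 else upper[d] for d, b in enumerate(bits))
                for bits in product((0, 1), repeat=n)]
    if all(is_feasible(v) for v in vertices):
        vol = 1.0
        for lo, hi in zip(lower, upper):
            vol *= hi - lo
        return vol
    if depth == 0:
        return 0.0  # conservative: undecided box counted as infeasible
    mid = tuple((lo + hi) / 2 for lo, hi in zip(lower, upper))
    total = 0.0
    for bits in product((0, 1), repeat=n):
        # Each child box spans [lower, mid] or [mid, upper] per dimension.
        sub_lo = tuple(lower[d] if b == 0 else mid[d] for d, b in enumerate(bits))
        sub_hi = tuple(mid[d] if b == 0 else upper[d] for d, b in enumerate(bits))
        total += feasible_volume(is_feasible, sub_lo, sub_hi, depth - 1)
    return total
```

For a linear constraint such as `p[0] + p[1] <= 1` on the unit square (true feasible volume 0.5), the bound approaches 0.5 from below as `depth` grows.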
Q12. What is the motivation behind the binding probability method?
The extreme efficiency of the binding probability method is motivating new research into handling skewed distributions in this method.
Q13. What is the difference between block-based and path-based methods?
Block-based methods have linear complexity and are amenable to incremental processing, as noted by [2] and [7], while path-based methods are more accurate in that they better take into account the correlations due to reconvergent fan-out and spatial correlation.
Q14. What is the difference between the Monte Carlo and the parallelepiped methods?
To first order, the Monte Carlo and binding probability methods are unaffected by the number of parameters, whereas the ellipsoid method has a polynomial dependence and the parallelepiped method has an exponential dependence, which dominates the run time above six dimensions.
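The dimension independence of Monte Carlo can be seen in a minimal sketch: per-sample cost grows only linearly with the number of parameters, and the estimator's accuracy depends on the sample count rather than the dimension, unlike the 2^n vertex enumeration of the parallelepiped method. The uniform sampling and the function name `mc_yield` are assumptions for illustration only.

```python
import random

def mc_yield(is_feasible, n_dims, n_samples, seed=0):
    """Monte Carlo yield estimate: sample parameter points uniformly in the
    unit hypercube and return the fraction that meets timing. A sketch;
    `is_feasible` is a hypothetical per-point timing check."""
    rng = random.Random(seed)
    hits = sum(
        is_feasible([rng.random() for _ in range(n_dims)])
        for _ in range(n_samples)
    )
    return hits / n_samples
```

For instance, the constraint `sum(p) <= n_dims / 2` has a true yield of 0.5 in any dimension by symmetry, and the estimate stays near 0.5 whether `n_dims` is 2 or 6, at the same sample budget.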