Proceedings ArticleDOI

Hierarchical statistical characterization of mixed-signal circuits using behavioral modeling

TL;DR: A methodology for hierarchical statistical circuit characterization that does not rely on circuit-level Monte Carlo simulation is presented; it permits the statistical characterization of large analog and mixed-signal systems.
Abstract: A methodology for hierarchical statistical circuit characterization which does not rely upon circuit-level Monte Carlo simulation is presented. The methodology uses principal component analysis, response surface methodology, and statistics to directly calculate the statistical distributions of higher-level parameters from the distributions of lower-level parameters. We have used the methodology to characterize a folded cascode operational amplifier and a phase-locked loop. This methodology permits the statistical characterization of large analog and mixed-signal systems, many of which are extremely time-consuming or impossible to characterize using existing methods.
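The non-Monte-Carlo idea can be sketched as follows, under assumptions not taken from the paper: suppose a quadratic response surface y = c0 + bᵀz + zᵀAz has been fitted for a higher-level parameter in terms of PCA-whitened (independent standard normal) lower-level parameters z. The mean and variance of y then follow in closed form (E[y] = c0 + tr(A), Var[y] = bᵀb + 2 tr(A²) for symmetric A), with no circuit-level sampling. All coefficients below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadratic response surface over PCA-whitened parameters z,
# z ~ N(0, I): y = c0 + b.z + z'Az. Coefficients are illustrative only.
c0 = 2.0
b = np.array([0.5, -0.3, 0.2])
A = np.array([[0.10, 0.02, 0.00],       # symmetric by construction
              [0.02, -0.05, 0.01],
              [0.00, 0.01, 0.08]])

# Closed-form moments of y -- no circuit-level Monte Carlo needed.
mean_y = c0 + np.trace(A)               # E[z'Az] = tr(A)
var_y = b @ b + 2.0 * np.trace(A @ A)   # Var = b'b + 2 tr(A^2)

# Cheap sanity check by sampling the surrogate (not the circuit).
z = rng.standard_normal((200_000, 3))
y = c0 + z @ b + np.einsum("ni,ij,nj->n", z, A, z)
```

The same moment-propagation step can be repeated level by level, which is the hierarchical part of the methodology.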


Citations
Journal ArticleDOI
TL;DR: The comparison with Monte Carlo simulations performed by neglecting the effect of mismatch confirmed that local device variations play a crucial role in the design and sizing of the clock distribution network.
Abstract: In this paper, we analyze the impact of process variations on the clock skew of VLSI circuits designed in deep submicrometer technologies. With smaller feature size, the utilization of a dense buffering scheme has been proposed in order to realize efficient and noise-immune clock distribution networks. However, the local variance of MOSFET electrical parameters, such as V/sub T/ and I/sub DSS/, increases with scaling of device dimensions, thus causing large intradie variability of the timing properties of clock buffers. As a consequence, we expect process variations to be a significant source of clock skew in deep submicrometer technologies. In order to accurately verify this hypothesis, we applied advanced statistical simulation techniques and accurate mismatch measurement data in order to thoroughly characterize the impact of intradie variations on industrial clock distribution networks. The comparison with Monte Carlo simulations performed by neglecting the effect of mismatch confirmed that local device variations play a crucial role in the design and sizing of the clock distribution network.
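A minimal Monte Carlo sketch of the point made above, with made-up numbers: an interdie (global) delay shift moves all clock paths on a die together and cancels out of the skew, while intradie mismatch does not, so neglecting mismatch drastically underestimates skew.

```python
import numpy as np

rng = np.random.default_rng(1)
n_leaves, n_dies = 64, 5_000   # clock sinks per die, Monte Carlo dies
t_nom = 100.0                  # ps, nominal buffered-path delay (invented)
sigma_inter = 5.0              # ps, interdie (global) variation
sigma_intra = 2.0              # ps, intradie mismatch per path

def mean_skew(sigma_local):
    # Arrival time = nominal + per-die global shift + per-path mismatch.
    global_shift = rng.normal(0.0, sigma_inter, size=(n_dies, 1))
    local = rng.normal(0.0, sigma_local, size=(n_dies, n_leaves))
    arrival = t_nom + global_shift + local
    return float((arrival.max(axis=1) - arrival.min(axis=1)).mean())

skew_with_mismatch = mean_skew(sigma_intra)
skew_without_mismatch = mean_skew(0.0)   # a global shift alone cancels
```

With mismatch neglected the simulated skew collapses to zero, mirroring the paper's observation that mismatch-free Monte Carlo misses the dominant skew mechanism.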

65 citations

Journal ArticleDOI
TL;DR: A method that is capable of handling process variations to evaluate analog/RF test measurements at the design stage and provides a general framework to compare alternative test solutions that are continuously being proposed toward reducing the high cost of specification-based tests is presented.
Abstract: We present a method that is capable of handling process variations to evaluate analog/RF test measurements at the design stage. The method can readily be used to estimate test metrics, such as parametric test escape and yield loss, with parts per million accuracy, and to fix test limits that satisfy specific tradeoffs between test metrics of interest. Furthermore, it provides a general framework to compare alternative test solutions that are continuously being proposed toward reducing the high cost of specification-based tests. The key idea of the method is to build a statistical model of the circuit under test and the test measurements using nonparametric density estimation. Thereafter, the statistical model can be simulated very fast to generate an arbitrarily large volume of new data. The method is demonstrated for a previously proposed built-in self-test measurement for low-noise amplifiers. The result indicates that the new synthetic data have the same structure as data generated by a computationally intensive brute-force Monte Carlo circuit simulation.
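The core idea (a nonparametric density model that can be resampled cheaply to count rare test escapes) can be sketched with a hand-rolled kernel-density resampler. The "gain"/"BIST" variables, spec limit, and test limit below are all invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented training set: a spec'd performance (LNA gain) and an indirect
# BIST measurement, correlated through a shared process variable.
n = 2_000
process = rng.standard_normal(n)
gain = 15.0 + 1.0 * process + 0.2 * rng.standard_normal(n)
bist = 0.8 * process + 0.1 * rng.standard_normal(n)
data = np.vstack([gain, bist])                  # shape (2, n)

# Kernel-density resampling: pick a training point, add Gaussian kernel
# noise (bandwidth via Scott's rule). This emulates nonparametric density
# estimation followed by fast synthetic-data generation.
h = n ** (-1.0 / 6.0)                           # Scott's factor for d = 2
kernel_cov = h ** 2 * np.cov(data)
m = 500_000
idx = rng.integers(0, n, size=m)
synth = data[:, idx] + rng.multivariate_normal(np.zeros(2), kernel_cov, size=m).T
g, t = synth

spec_ok = g >= 13.0      # truly meets spec (invented limit)
test_ok = t >= -1.6      # passes the indirect test (invented limit)
test_escape_ppm = 1e6 * np.mean(test_ok & ~spec_ok)
yield_loss_ppm = 1e6 * np.mean(spec_ok & ~test_ok)
```

Because the synthetic population is arbitrarily large, ppm-level metrics can be estimated without additional circuit simulation, which is the point of the method.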

61 citations

Proceedings ArticleDOI
Xin Li1, Wangyang Zhang1, Fa Wang1, Shupeng Sun1, Chenjie Gu2 
05 Nov 2012
TL;DR: This paper proposes a novel Bayesian model fusion (BMF) technique for efficient parametric yield estimation that achieves up to 3.75× runtime speedup over the traditional kernel estimation method.
Abstract: Parametric yield estimation is one of the most critical-yet-challenging tasks for designing and verifying nanoscale analog and mixed-signal circuits. In this paper, we propose a novel Bayesian model fusion (BMF) technique for efficient parametric yield estimation. Our key idea is to borrow the simulation data from an early stage (e.g., schematic-level simulation) to efficiently estimate the performance distributions at a late stage (e.g., post-layout simulation). BMF statistically models the correlation between early-stage and late-stage performance distributions by Bayesian inference. In addition, a convex optimization is formulated to solve the unknown late-stage performance distributions both accurately and robustly. Several circuit examples designed in a commercial 32 nm CMOS process demonstrate that the proposed BMF technique achieves up to 3.75× runtime speedup over the traditional kernel estimation method.
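The paper's convex-optimization formulation does not fit in a few lines, but the "borrowing" idea can be conveyed with a toy conjugate normal-normal Bayesian update: many cheap early-stage (schematic) simulations form the prior, a handful of expensive late-stage (post-layout) simulations form the likelihood, and the posterior fuses both. This is a simplified stand-in for BMF, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Early stage: many cheap schematic-level simulations define the prior.
early = rng.normal(1.00, 0.05, size=2_000)    # invented performance data
prior_mean, prior_var = early.mean(), early.var()

# Late stage: only a few expensive post-layout simulations are affordable.
late = rng.normal(0.95, 0.05, size=10)
noise_var = 0.05 ** 2                         # assumed known spread

# Conjugate normal-normal update: precisions add, means are
# precision-weighted. The posterior fuses early- and late-stage data.
post_var = 1.0 / (1.0 / prior_var + len(late) / noise_var)
post_mean = post_var * (prior_mean / prior_var + late.sum() / noise_var)
```

Even with only ten late-stage samples, the posterior uncertainty is far smaller than either source alone would give, which is the runtime advantage BMF exploits.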

39 citations

Journal ArticleDOI
TL;DR: In this article, the authors propose to construct physically consistent closed-form density functions by two monotone interpolation schemes; then, by exploiting the special forms of the obtained density functions, they determine the generalized polynomial-chaos basis functions and the Gauss quadrature rules required by a stochastic spectral simulator.
Abstract: Stochastic spectral methods are efficient techniques for uncertainty quantification. Recently they have shown excellent performance in the statistical analysis of integrated circuits. In stochastic spectral methods, one needs to determine a set of orthonormal polynomials and a proper numerical quadrature rule. The former are used as the basis functions in a generalized polynomial chaos expansion. The latter is used to compute the integrals involved in stochastic spectral methods. Obtaining such information requires knowing the density function of the random input a-priori. However, individual system components are often described by surrogate models rather than density functions. In order to apply stochastic spectral methods in hierarchical uncertainty quantification, we first propose to construct physically consistent closed-form density functions by two monotone interpolation schemes. Then, by exploiting the special forms of the obtained density functions, we determine the generalized polynomial-chaos basis functions and the Gauss quadrature rules that are required by a stochastic spectral simulator. The effectiveness of our proposed algorithm is verified by both synthetic and practical circuit examples.
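The basis-construction step can be sketched numerically: given a closed-form density (below, an invented Beta-shaped PDF standing in for one recovered by monotone interpolation from a surrogate), Gram-Schmidt on monomials under the density-weighted inner product yields the orthonormal generalized-polynomial-chaos basis.

```python
import numpy as np

# Invented closed-form density on [0, 1], standing in for a PDF obtained
# by monotone interpolation from a surrogate model.
x = np.linspace(0.0, 1.0, 4001)
dx = x[1] - x[0]
pdf = x * (1.0 - x) ** 2          # Beta(2, 3)-shaped, unnormalized
pdf /= pdf.sum() * dx             # normalize so the density integrates to 1

def inner(f, g):
    # Density-weighted inner product <f, g> = integral of f*g*pdf (Riemann).
    return float(np.sum(f * g * pdf) * dx)

# Gram-Schmidt on monomials 1, x, x^2 -> orthonormal gPC basis functions.
basis = []
for k in range(3):
    p = x ** k
    for q in basis:
        p = p - inner(p, q) * q
    basis.append(p / np.sqrt(inner(p, p)))
```

The paper goes further and derives Gauss quadrature rules from the same density; the orthonormalization above is only the first ingredient.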

38 citations

Journal ArticleDOI
TL;DR: Variance analysis is used to estimate how random device parameter variation affects the behavior of analog integrated circuits as discussed by the authors; it is very effective if the random parameter deviations can be assumed to be normally distributed and statistically independent and if the nonlinear dependence of the circuit characteristics can be linearized around the nominal (mean) parameter values.
Abstract: Variance analysis is used to estimate how random device parameter variation affects the behavior of analog integrated circuits. This method is very effective if the random parameter deviations can be assumed to be normally distributed and statistically independent and if the nonlinear dependence of the circuit characteristics can be linearized around the nominal (mean) parameter values. It is shown under which conditions the nonlinear dependencies of the system characteristics on the parameters have to be taken into account and how this can improve the accuracy of statistical analysis. This is illustrated with two examples: a transconductance amplifier and an analog filter.
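A worked toy example of the linearization caveat: for y = exp(p) with p ~ N(0, σ²) (an invented stand-in for a circuit characteristic depending nonlinearly on a device parameter), the linearized variance σ² is accurate for small σ but increasingly understates the exact lognormal variance as σ grows.

```python
import numpy as np

rng = np.random.default_rng(5)

def variances(sigma, n=1_000_000):
    # Toy nonlinear characteristic y = exp(p), p ~ N(0, sigma^2).
    var_linear = sigma ** 2                                      # f'(0)^2 sigma^2
    var_exact = np.exp(sigma ** 2) * (np.exp(sigma ** 2) - 1.0)  # lognormal
    var_mc = np.exp(rng.normal(0.0, sigma, n)).var()             # Monte Carlo
    return var_linear, var_exact, var_mc

small = variances(0.1)   # linearization is accurate here
large = variances(0.5)   # nonlinear terms matter here
```

This mirrors the abstract's point: linearized variance analysis works when deviations are small, and higher-order terms must be retained when they are not.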

33 citations

References
Book
29 Aug 1995
TL;DR: Using a practical approach, this book discusses two-level factorial and fractional factorial designs, several aspects of empirical modeling with regression techniques, focusing on response surface methodology, mixture experiments and robust design techniques.
Abstract: From the Publisher: Using a practical approach, it discusses two-level factorial and fractional factorial designs, several aspects of empirical modeling with regression techniques, focusing on response surface methodology, mixture experiments, and robust design techniques. Features numerous authentic application examples and problems. Illustrates how computers can be a useful aid in problem solving. Includes a disk containing computer programs for a response surface methodology simulation exercise and for mixture experiments.

10,104 citations


"Hierarchical statistical characterization of mixed-signal circuits using behavioral modeling" refers methods in this paper

  • ...The non-Monte Carlo techniques described in this paper utilize response surface methodology (RSM) [6]....


Journal ArticleDOI

3,788 citations


"Hierarchical statistical characterization of mixed-signal circuits using behavioral modeling" refers methods in this paper

  • ...The most widely used technique for performing statistical characterization is Monte Carlo analysis [1, 2]....


Book
13 Mar 1991
TL;DR: In this paper, the authors present a directory of Symbols and Definitions for PCA, as well as some classic examples of PCA applications, such as: linear models, regression PCA of predictor variables, and analysis of variance PCA for Response Variables.
Abstract: Preface. Introduction. 1. Getting Started. 2. PCA with More Than Two Variables. 3. Scaling of Data. 4. Inferential Procedures. 5. Putting It All Together: Hearing Loss I. 6. Operations with Group Data. 7. Vector Interpretation I: Simplifications and Inferential Techniques. 8. Vector Interpretation II: Rotation. 9. A Case History: Hearing Loss II. 10. Singular Value Decomposition: Multidimensional Scaling I. 11. Distance Models: Multidimensional Scaling II. 12. Linear Models I: Regression PCA of Predictor Variables. 13. Linear Models II: Analysis of Variance PCA of Response Variables. 14. Other Applications of PCA. 15. Flatland: Special Procedures for Two Dimensions. 16. Odds and Ends. 17. What Is Factor Analysis Anyhow? 18. Other Competitors. Conclusion. Appendix A: Matrix Properties. Appendix B: Matrix Algebra Associated with Principal Component Analysis. Appendix C: Computational Methods. Appendix D: A Directory of Symbols and Definitions for PCA. Appendix E: Some Classic Examples. Appendix F: Data Sets Used in This Book. Appendix G: Tables. Bibliography. Author Index. Subject Index.

3,534 citations

Book
01 Jan 1971

3,429 citations


"Hierarchical statistical characterization of mixed-signal circuits using behavioral modeling" refers background or methods in this paper

  • ...Eq. (6) is used to compute cov(yi, yj) [13]....


  • ...Note that for any given coefficients in a quadratic equation, A is uniquely determined [13]....


Book
01 Jan 1964
TL;DR: The general nature of Monte Carlo methods is described in this book, which gives a short résumé of statistical terms and covers random, pseudorandom, and quasirandom numbers.
Abstract: 1. The general nature of Monte Carlo methods. 2. Short résumé of statistical terms. 3. Random, pseudorandom, and quasirandom numbers. 4. Direct simulation. 5. General principles of the Monte Carlo method. 6. Conditional Monte Carlo. 7. Solution of linear operator equations. 8. Radiation shielding and reactor criticality. 9. Problems in statistical mechanics. 10. Long polymer molecules. 11. Percolation processes. 12. Multivariable problems. References.

3,226 citations