Author

Jiayong Le

Other affiliations: Synopsys, University of Michigan
Bio: Jiayong Le is an academic researcher from Carnegie Mellon University. The author has contributed to research in topics including parametric statistics and Monte Carlo methods. The author has an h-index of 10 and has co-authored 14 publications receiving 663 citations. Previous affiliations of Jiayong Le include Synopsys and the University of Michigan.

Papers
Proceedings ArticleDOI
07 Jun 2004
TL;DR: An efficient block-based statistical static timing analysis algorithm that can account for correlations from process parameters and re-converging paths and can also accommodate dominant interconnect coupling effects to provide an accurate compilation of statistical timing information is presented.
Abstract: Current technology trends have led to the growing impact of both inter-die and intra-die process variations on circuit performance. While it is imperative to model parameter variations for sub-100nm technologies to produce an upper bound prediction on timing, it is equally important to consider the correlation of these variations for the bound to be useful. In this paper we present an efficient block-based statistical static timing analysis algorithm that can account for correlations from process parameters and re-converging paths. The algorithm can also accommodate dominant interconnect coupling effects to provide an accurate compilation of statistical timing information. The generality and efficiency of the proposed algorithm are obtained from a novel simplification technique that is derived from statistical independence theory and principal component analysis (PCA) methods. The technique significantly reduces the cost of mean, variance, and covariance computation for a set of correlated random variables.
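
A block-based SSTA engine of this kind is typically built around a first-order "canonical form": each arrival time is a mean plus sensitivities to independent standard-normal principal components, so PCA decorrelates the process parameters once and the add/max operators stay cheap while correlation (including re-convergence) is tracked through the shared components. Below is a minimal sketch of that canonical-form arithmetic using Clark's approximation for the max operator; the class names, numbers, and two-component example are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.stats import norm

class ArrivalTime:
    """First-order canonical form: d = mean + sum_i a[i] * z_i,
    where the z_i are independent N(0,1) principal components
    obtained from PCA of the correlated process parameters."""
    def __init__(self, mean, a):
        self.mean = float(mean)
        self.a = np.asarray(a, dtype=float)

    @property
    def sigma(self):
        return np.linalg.norm(self.a)

def add(x, gate):
    # Propagation through a gate: sensitivities add component-wise.
    return ArrivalTime(x.mean + gate.mean, x.a + gate.a)

def stat_max(x, y):
    """Clark's approximation: fit a Gaussian in canonical form to
    max(x, y). Correlation falls out of the shared components."""
    cov = float(np.dot(x.a, y.a))
    theta = np.sqrt(x.sigma**2 + y.sigma**2 - 2.0 * cov)
    if theta == 0.0:                      # identically distributed inputs
        return ArrivalTime(x.mean, x.a)
    alpha = (x.mean - y.mean) / theta
    T = norm.cdf(alpha)                   # "tightness": P(x > y)
    mean = x.mean * T + y.mean * (1 - T) + theta * norm.pdf(alpha)
    a = T * x.a + (1 - T) * y.a           # blended sensitivities
    return ArrivalTime(mean, a)

# Two re-converging paths sharing the first principal component:
p1 = ArrivalTime(100.0, [3.0, 1.0])
p2 = ArrivalTime(98.0,  [3.0, -2.0])
m = stat_max(p1, p2)
print(f"max: mean={m.mean:.2f}, sigma={m.sigma:.2f}")
```

Because both inputs carry sensitivities to the same principal components, their covariance is recovered from a dot product rather than stored explicitly, which is where the cost reduction mentioned in the abstract comes from.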

138 citations

Proceedings ArticleDOI
07 Nov 2004
TL;DR: An asymptotic probability extraction method, APEX, for estimating the unknown random distribution when using nonlinear response surface modeling, which uses a binomial moment evaluation to efficiently compute the high order moments of the unknown distribution and applies moment matching to approximate the characteristic function of the random circuit performance by an efficient rational function.
Abstract: While process variations are becoming more significant with each new IC technology generation, they are often modeled via linear regression models so that the resulting performance variations can be captured via normal distributions. Nonlinear (e.g., quadratic) response surface models can be utilized to capture larger scale process variations; however, such models result in non-normal distributions for circuit performance which are difficult to capture since the distribution model is unknown. In this paper we propose an asymptotic probability extraction method, APEX, for estimating the unknown random distribution when using nonlinear response surface modeling. APEX first uses a binomial moment evaluation to efficiently compute the high order moments of the unknown distribution, and then applies moment matching to approximate the characteristic function of the random circuit performance by an efficient rational function. A simple statistical timing example and an analog circuit example demonstrate that APEX can provide better accuracy than Monte Carlo simulation with 10^4 samples and achieve orders of magnitude more efficiency. We also show the error incurred by the popular normal modeling assumption using standard IC technologies.
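
To see why the normal assumption breaks down for nonlinear models, note that a quadratic response surface y = b·z + zᵀAz of Gaussian parameters z has nonzero skewness. A quick sampled check with made-up coefficients (a sketch for illustration, not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadratic response surface y = b.z + z'Az, z ~ N(0, I)
b = np.array([1.0, 0.5])
A = np.array([[0.4, 0.1],
              [0.1, 0.2]])

z = rng.standard_normal((200_000, 2))
y = z @ b + np.einsum('ni,ij,nj->n', z, A, z)

# Standardized central moments: a normal fit would give skew 0, kurtosis 3.
yc = y - y.mean()
skew = (yc**3).mean() / y.std()**3
kurt = (yc**4).mean() / y.std()**4
print(f"skewness = {skew:.2f} (normal: 0), kurtosis = {kurt:.2f} (normal: 3)")
```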

110 citations

Journal ArticleDOI
TL;DR: APEX begins by efficiently computing the high-order moments of the unknown distribution and then applies moment matching to approximate the characteristic function of the random distribution by an efficient rational function; it is proven that such a moment-matching approach is asymptotically convergent when applied to quadratic response surface models.
Abstract: While process variations are becoming more significant with each new IC technology generation, they are often modeled via linear regression models so that the resulting performance variations can be captured via normal distributions. Nonlinear response surface models (e.g., quadratic polynomials) can be utilized to capture larger scale process variations; however, such models result in nonnormal distributions for circuit performance. These performance distributions are difficult to capture efficiently since the distribution model is unknown. In this paper, an asymptotic-probability-extraction (APEX) method for estimating the unknown random distribution when using a nonlinear response surface modeling is proposed. APEX begins by efficiently computing the high-order moments of the unknown distribution and then applies moment matching to approximate the characteristic function of the random distribution by an efficient rational function. It is proven that such a moment-matching approach is asymptotically convergent when applied to quadratic response surface models. In addition, a number of novel algorithms and methods, including binomial moment evaluation, PDF/CDF shifting, nonlinear companding and reverse evaluation, are proposed to improve the computation efficiency and/or approximation accuracy. Several circuit examples from both digital and analog applications demonstrate that APEX can provide better accuracy than a Monte Carlo simulation with 10^4 samples and achieve up to 10x more efficiency. The error incurred by the popular normal modeling assumption for several circuit examples designed in standard IC technologies is also shown.
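
The moment-matching step can be illustrated with a Prony-style fit on the moment sequence: the Taylor coefficients of the Laplace transform satisfy a linear recurrence whenever the PDF is a finite sum of exponentials, so a small Hankel solve yields the poles and a Vandermonde solve yields the residues of the rational approximation. The sketch below assumes a nonnegative random variable and tests on an exponential mixture whose raw moments are known in closed form; it conveys the flavor of the rational characteristic-function approximation, not APEX's binomial-moment algorithm.

```python
import math
import numpy as np

def moment_match_pdf(raw_moments, q):
    """Fit f(t) ~ sum_i r_i * exp(p_i * t) on t >= 0 from the raw
    moments E[T^k], k = 0..2q-1, via Prony's method applied to the
    Taylor coefficients of the Laplace transform F(s) = E[exp(-sT)]."""
    mu = np.array([(-1.0)**k * raw_moments[k] / math.factorial(k)
                   for k in range(2 * q)])
    # Recurrence mu[k+q] = sum_j d[j] * mu[k+j]  ->  Hankel solve.
    H = np.array([mu[k:k + q] for k in range(q)])
    d = np.linalg.solve(H, mu[q:2 * q])
    lam = np.roots(np.concatenate(([1.0], -d[::-1])))   # lam_i = 1/p_i
    V = np.vander(lam, q, increasing=True).T             # V[k,i] = lam_i^k
    c = np.linalg.solve(V, mu[:q])                       # amplitudes
    p = 1.0 / lam                                        # poles of F(s)
    r = -c * p                                           # residues
    return p, r

# Test on a two-term exponential mixture (moments known analytically):
w, a = np.array([0.7, 0.3]), np.array([2.0, 0.5])
moments = [math.factorial(k) * np.sum(w / a**k) for k in range(4)]
p, r = moment_match_pdf(moments, q=2)
print("poles:", np.sort(p.real))      # expect [-2.0, -0.5]
print("residues:", np.sort(r.real))   # expect [0.15, 1.4]
```

With exact moments of a true exponential mixture the fit is exact; for a general distribution the recovered sum of exponentials plays the role of the rational approximation to the characteristic function described in the abstract.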

84 citations

Proceedings ArticleDOI
31 May 2005
TL;DR: It is demonstrated why the traditional concepts of slack and critical path become ineffective under large-scale variations, and a novel sensitivity-based metric is proposed to assess the "criticality" of each path and/or arc in the statistical timing graph.
Abstract: The large-scale process and environmental variations for today's nanoscale ICs are requiring statistical approaches for timing analysis and optimization. Significant research has been recently focused on developing new statistical timing analysis algorithms, but often without consideration for how one should interpret the statistical timing results for optimization. In this paper we demonstrate why the traditional concepts of slack and critical path become ineffective under large-scale variations, and we propose a novel sensitivity-based metric to assess the "criticality" of each path and/or arc in the statistical timing graph. We define the statistical sensitivities for both paths and arcs, and theoretically prove that our path sensitivity is equivalent to the probability that a path is critical, and our arc sensitivity is equivalent to the probability that an arc sits on the critical path. An efficient algorithm with incremental analysis capability is described for fast sensitivity computation that has a linear runtime complexity in circuit size. The efficacy of the proposed sensitivity analysis is demonstrated on both standard benchmark circuits and large industry examples.
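
The stated equivalence between path sensitivity and the probability of being critical can be sanity-checked by brute force: sample the correlated path delays, record which path attains the maximum in each sample, and read off empirical criticality probabilities. A hypothetical three-path example (all numbers invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical path delays: nominal means plus a shared (global)
# variation and an independent (local) variation per path.
means = np.array([10.0, 9.8, 9.0])
shared = rng.standard_normal(n)
private = rng.standard_normal((n, 3))
delays = means + 0.4 * shared[:, None] + 0.5 * private

# Criticality probability: how often each path is the longest.
crit = np.bincount(delays.argmax(axis=1), minlength=3) / n
print("P(path is critical):", np.round(crit, 3))
# Deterministically only path 0 (largest mean) looks critical, yet under
# variation paths 1 and 2 are critical in a nontrivial fraction of samples,
# which is why single-path slack is a poor optimization guide here.
```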

76 citations

Book
08 Aug 2007
TL;DR: Various statistical methodologies that have been recently developed to model, analyze, and optimize performance variations at both transistor level and system level are reviewed.
Abstract: As IC technologies scale to finer feature sizes, it becomes increasingly difficult to control the relative process variations. The increasing fluctuations in manufacturing processes have introduced unavoidable and significant uncertainty in circuit performance; hence ensuring manufacturability has been identified as one of the top priorities of today's IC design problems. In this paper, we review various statistical methodologies that have been recently developed to model, analyze, and optimize performance variations at both transistor level and system level. The following topics will be discussed in detail: sources of process variations, variation characterization and modeling, Monte Carlo analysis, response surface modeling, statistical timing and leakage analysis, probability distribution extraction, parametric yield estimation and robust IC optimization. These techniques provide the necessary CAD infrastructure that facilitates the bold move from deterministic, corner-based IC design toward statistical and probabilistic design.
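
As a concrete instance of the Monte Carlo / response-surface / parametric-yield flow the text surveys, one can sample the process parameters, evaluate a fitted response surface, and report the fraction of samples that meet the specification. A toy sketch in which the linear model, covariance, and spec limit are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical linear response surface for a path delay, fitted offline:
# delay = nominal + s . dp, with dp ~ N(0, Sigma) the process deltas.
nominal = 1.00                        # ns
s = np.array([0.08, 0.05, 0.03])      # sensitivities to 3 parameters
Sigma = np.array([[1.0, 0.3, 0.0],
                  [0.3, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])

dp = rng.multivariate_normal(np.zeros(3), Sigma, size=500_000)
delay = nominal + dp @ s

spec = 1.15                           # ns timing budget
yield_est = np.mean(delay <= spec)
print(f"parametric yield estimate: {yield_est:.4f}")
```

In a real flow the response surface would be fitted from SPICE or STA runs and the covariance extracted from process characterization; the Monte Carlo loop itself stays this simple, which is why response surface modeling is the workhorse the review describes.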

70 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

01 Jan 2008
TL;DR: The recent developments in SSTA are reviewed: first the underlying models and assumptions are discussed, then the major approaches are surveyed, and finally its remaining key challenges are discussed.
Abstract: Static-timing analysis (STA) has been one of the most pervasive and successful analysis engines in the design of digital circuits for the last 20 years. However, in recent years, the increased loss of predictability in semiconductor devices has raised concern over the ability of STA to effectively model statistical variations. This has resulted in extensive research in the so-called statistical STA (SSTA), which marks a significant departure from the traditional STA framework. In this paper, we review the recent developments in SSTA. We first discuss its underlying models and assumptions, then survey the major approaches, and close by discussing its remaining key challenges.

344 citations

Journal ArticleDOI
TL;DR: In this paper, the authors review the recent developments in statistical static-timing analysis (SSTA) and discuss its underlying models and assumptions, then survey the major approaches, and close by discussing its remaining key challenges.
Abstract: Static-timing analysis (STA) has been one of the most pervasive and successful analysis engines in the design of digital circuits for the last 20 years. However, in recent years, the increased loss of predictability in semiconductor devices has raised concern over the ability of STA to effectively model statistical variations. This has resulted in extensive research in the so-called statistical STA (SSTA), which marks a significant departure from the traditional STA framework. In this paper, we review the recent developments in SSTA. We first discuss its underlying models and assumptions, then survey the major approaches, and close by discussing its remaining key challenges.

341 citations