Author

P. Gopalakrishnan

Bio: P. Gopalakrishnan is an academic researcher from Carnegie Mellon University. The author has contributed to research in topics: Logic synthesis & Integrated circuit design. The author has an h-index of 9 and has co-authored 12 publications receiving 843 citations.

Papers
Proceedings ArticleDOI
02 Jun 2003
TL;DR: Discusses the trade-offs to consider when determining how much regularity a particular IC or application can afford, and proposes a Via-Patterned Gate Array as one such example.
Abstract: While advances in semiconductor technologies have pushed achievable scale and performance to phenomenal limits for ICs, nanoscale physical realities dictate IC production based on what we can afford. We believe that IC design and manufacturing can be made more affordable, and reliable, by removing some design and implementation flexibility and enforcing new forms of design regularity. This paper discusses some of the trade-offs to consider for determination of how much regularity a particular IC or application can afford. A Via Patterned Gate Array is proposed as one such example that trades performance for cost by way of new forms of design regularity.

268 citations

Proceedings ArticleDOI
16 Feb 2004
TL;DR: A new, more granular, via-patterned heterogeneous logic block architecture is presented and compared to a less granular LUT-based heterogeneous PLB; results show higher performance and more effective packing of the logic functions due to the increased granularity.
Abstract: Driven by the economics of design and manufacturing nanoscale integrated circuits, an emphasis is being placed on developing new, regular logic fabrics that leverage the regularity and programmability of FPGAs, yet deliver a level of performance and density close to ASICs. One example of such a fabric is the Via-Patterned Gate Array (VPGA) of Pileggi et al. (2002), which employs ASIC-style global routing on top of an array of patternable logic blocks (PLBs). Previous works (Koorapaty et al., 2003; Koorapaty, 2003; Pileggi et al., 2003) showed that by employing even limited heterogeneity for the VPGA logic blocks, namely combining a 3-LUT with two 3-input NAND gates, one can achieve performance comparable to that provided by standard cells. Since the area cost of such heterogeneity is far less than for FPGAs, we can explore new configurations of via-configurable logic blocks that offer greater heterogeneity and granularity to achieve even higher performance. In this paper, we present a new, more granular, via-patterned heterogeneous logic block architecture and compare it to a less granular LUT-based heterogeneous PLB. Our results show higher performance and more effective packing of the logic functions due to increased granularity.
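
To make the logic resources concrete, here is a small illustrative Python model of such a heterogeneous PLB: one 3-input LUT plus 3-input NAND gates. The class names and truth-table encoding are hypothetical conveniences, not taken from the VPGA papers.

    # Illustrative model of a heterogeneous patternable logic block (PLB):
    # a 3-input LUT plus 3-input NAND gates. Names/encoding are hypothetical.
    class ThreeLUT:
        """A 3-input lookup table: 8 configuration bits define the function."""
        def __init__(self, truth_table_bits):
            assert 0 <= truth_table_bits < 256
            self.bits = truth_table_bits  # bit i = output for input pattern i

        def eval(self, a, b, c):
            index = (a << 2) | (b << 1) | c
            return (self.bits >> index) & 1

    def nand3(a, b, c):
        """A 3-input NAND gate, the 'hard' logic in the heterogeneous PLB."""
        return 0 if (a and b and c) else 1

    # Configure the LUT as 3-input XOR (parity) and check all 8 patterns.
    xor3 = ThreeLUT(0b10010110)
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                assert xor3.eval(a, b, c) == (a ^ b ^ c)
    print(nand3(1, 1, 1), nand3(1, 0, 1))  # -> 0 1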

172 citations

Proceedings ArticleDOI
07 Nov 2004
TL;DR: APEX, an asymptotic probability extraction method, estimates the unknown random distribution that arises from nonlinear response surface modeling: it uses a binomial moment evaluation to efficiently compute the high-order moments of the unknown distribution, then applies moment matching to approximate the characteristic function of the random circuit performance by an efficient rational function.
Abstract: While process variations are becoming more significant with each new IC technology generation, they are often modeled via linear regression models so that the resulting performance variations can be captured via normal distributions. Nonlinear (e.g., quadratic) response surface models can be utilized to capture larger-scale process variations; however, such models result in non-normal distributions for circuit performance which are difficult to capture since the distribution model is unknown. In this paper we propose an asymptotic probability extraction method, APEX, for estimating the unknown random distribution when using nonlinear response surface modeling. APEX first uses a binomial moment evaluation to efficiently compute the high-order moments of the unknown distribution, and then applies moment matching to approximate the characteristic function of the random circuit performance by an efficient rational function. A simple statistical timing example and an analog circuit example demonstrate that APEX can provide better accuracy than Monte Carlo simulation with 10^4 samples and achieve orders of magnitude more efficiency. We also show the error incurred by the popular normal modeling assumption using standard IC technologies.
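
The core premise, that a quadratic response surface turns Gaussian parameter variations into a skewed, non-normal performance distribution, is easy to see numerically. The Python sketch below uses made-up coefficients purely for illustration; nothing in it comes from the APEX paper itself.

    # Why a quadratic response surface model yields a non-normal
    # performance distribution (illustrative coefficients only).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal((100_000, 2))      # normalized process parameters

    # Linear model: performance stays Gaussian.
    f_lin = 1.0 + 0.3 * x[:, 0] - 0.2 * x[:, 1]

    # Quadratic model: the x^T A x term skews the distribution.
    A = np.array([[0.15, 0.05],
                  [0.05, 0.10]])
    f_quad = f_lin + np.einsum('ni,ij,nj->n', x, A, x)

    for name, f in (("linear", f_lin), ("quadratic", f_quad)):
        skew = np.mean((f - f.mean()) ** 3) / f.std() ** 3
        print(f"{name:9s} mean={f.mean():+.3f} std={f.std():.3f} skew={skew:+.3f}")
    # The quadratic model shows clear skew, so a normal fit misses its tails.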

110 citations

Journal ArticleDOI
TL;DR: APEX begins by efficiently computing the high-order moments of the unknown distribution and then applies moment matching to approximate the characteristic function of the random distribution by an efficient rational function; this moment-matching approach is proven to be asymptotically convergent when applied to quadratic response surface models.
Abstract: While process variations are becoming more significant with each new IC technology generation, they are often modeled via linear regression models so that the resulting performance variations can be captured via normal distributions. Nonlinear response surface models (e.g., quadratic polynomials) can be utilized to capture larger-scale process variations; however, such models result in non-normal distributions for circuit performance. These performance distributions are difficult to capture efficiently since the distribution model is unknown. In this paper, an asymptotic-probability-extraction (APEX) method for estimating the unknown random distribution when using nonlinear response surface modeling is proposed. APEX begins by efficiently computing the high-order moments of the unknown distribution and then applies moment matching to approximate the characteristic function of the random distribution by an efficient rational function. It is proven that such a moment-matching approach is asymptotically convergent when applied to quadratic response surface models. In addition, a number of novel algorithms and methods, including binomial moment evaluation, PDF/CDF shifting, nonlinear companding, and reverse evaluation, are proposed to improve the computation efficiency and/or approximation accuracy. Several circuit examples from both digital and analog applications demonstrate that APEX can provide better accuracy than a Monte Carlo simulation with 10^4 samples and achieve up to 10× more efficiency. The error incurred by the popular normal modeling assumption is also shown for several circuit examples designed in standard IC technologies.
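
The moment-matching step has the same flavor as classic AWE-style model order reduction: match the first 2q moments of the unknown PDF with a sum of q exponentials, pdf(t) ≈ Σ r_i·exp(p_i·t) for t ≥ 0, whose Laplace transform is the rational function Σ r_i/(s − p_i). The sketch below is a textbook reconstruction of that step only, not the authors' implementation; APEX adds binomial moment evaluation, PDF/CDF shifting, and the other refinements listed above.

    # AWE-style moment matching: fit q poles/residues to the first 2q
    # raw moments of a distribution on t >= 0. Textbook reconstruction,
    # not the APEX authors' code.
    from math import factorial
    import numpy as np

    def match_moments(raw_moments, q):
        """raw_moments[k] = E[T^k] for k = 0..2q-1."""
        # Taylor coefficients of the Laplace transform H(s) = sum_k m_k s^k.
        m = np.array([(-1) ** k * raw_moments[k] / factorial(k)
                      for k in range(2 * q)])
        # Hankel system gives the characteristic polynomial of 1/p_i.
        H = np.array([[m[i + j] for j in range(q)] for i in range(q)])
        c = np.linalg.solve(H, -m[q:2 * q])
        lam = np.roots(np.concatenate(([1.0], c[::-1])))   # lam_i = 1/p_i
        p = 1.0 / lam
        # Residues from m_k = -sum_i r_i / p_i^(k+1), k = 0..q-1.
        V = np.array([[-1.0 / p_i ** (k + 1) for p_i in p] for k in range(q)])
        r = np.linalg.solve(V, m[:q])
        return p, r        # pdf(t) ~ sum_i r[i] * exp(p[i] * t), t >= 0

    # Sanity check: Exp(1) has E[T^k] = k!, and a single pole recovers
    # it exactly: p = -1, r = 1.
    print(match_moments([1.0, 1.0], q=1))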

84 citations

Proceedings ArticleDOI
07 Nov 2004
TL;DR: ROAD extracts accurate posynomial performance models via transistor-level simulation and optimizes the circuit by geometric programming; it sets up all design constraints to include large-scale process variations, facilitating the tradeoff between yield and performance.
Abstract: We propose a robust analog design tool (ROAD) for post-tuning analog/RF circuits. Starting from an initial design derived from hand analysis or analog circuit synthesis based on simplified models, ROAD extracts accurate posynomial performance models via transistor-level simulation and optimizes the circuit by geometric programming. Importantly, ROAD sets up all design constraints to include large-scale process variations to facilitate the tradeoff between yield and performance. A novel convex formulation of the robust design problem is utilized to improve the optimization efficiency and to produce a solution that is superior to other local tuning methods. In addition, a novel projection-based approach for posynomial fitting is used to facilitate scaling to large problem sizes. A new implicit power iteration algorithm is proposed to find the optimal projection space and extract the posynomial coefficients with robust convergence. The efficacy of ROAD is demonstrated on several circuit examples.
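
Posynomial models are what make geometric programming applicable here. As a much simpler cousin of ROAD's projection-based posynomial fitting, the sketch below fits a single monomial performance model by least squares in log space, exploiting that log(c·w^a·l^b) is linear in (log c, a, b). The data and variable names are synthetic, and this does not reproduce ROAD's implicit power iteration.

    # Monomial fitting in log space: log(c * w^a * l^b) is linear in
    # (log c, a, b). Synthetic data; not ROAD's projection-based method.
    import numpy as np

    rng = np.random.default_rng(1)
    w = rng.uniform(1.0, 10.0, 200)     # e.g., device widths
    l = rng.uniform(0.1, 1.0, 200)      # e.g., channel lengths
    perf = 2.5 * w**0.8 * l**-1.2 * np.exp(0.02 * rng.standard_normal(200))

    X = np.column_stack([np.ones_like(w), np.log(w), np.log(l)])
    coef, *_ = np.linalg.lstsq(X, np.log(perf), rcond=None)
    c, a, b = np.exp(coef[0]), coef[1], coef[2]
    print(f"perf ~ {c:.2f} * w^{a:.2f} * l^{b:.2f}")  # ~ 2.50 * w^0.80 * l^-1.20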

70 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: This tutorial paper collects together in one place the basic background material needed to do GP modeling, shows how to recognize functions and problems compatible with GP, and shows how to approximate functions or data in a form compatible with GP.
Abstract: A geometric program (GP) is a type of mathematical optimization problem characterized by objective and constraint functions that have a special form. Recently developed solution methods can solve even large-scale GPs extremely efficiently and reliably; at the same time a number of practical problems, particularly in circuit design, have been found to be equivalent to (or well approximated by) GPs. Putting these two together, we get effective solutions for the practical problems. The basic approach in GP modeling is to attempt to express a practical problem, such as an engineering analysis or design problem, in GP format. In the best case, this formulation is exact; when this is not possible, we settle for an approximate formulation. This tutorial paper collects together in one place the basic background material needed to do GP modeling. We start with the basic definitions and facts, and some methods used to transform problems into GP format. We show how to recognize functions and problems compatible with GP, and how to approximate functions or data in a form compatible with GP (when this is possible). We give some simple and representative examples, and also describe some common extensions of GP, along with methods for solving (or approximately solving) them.
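
As a concrete taste of the GP format, the small Python example below solves a toy GP with CVXPY, whose gp=True mode handles the log-log-convex transformation automatically. The specific objective and constraints are arbitrary illustrations, not taken from the tutorial.

    # A toy geometric program solved with CVXPY's gp=True mode:
    # maximize a monomial subject to posynomial/monomial constraints.
    import cvxpy as cp

    x = cp.Variable(pos=True)
    y = cp.Variable(pos=True)
    z = cp.Variable(pos=True)

    constraints = [
        4 * x * y * z + 2 * x * z <= 10,   # posynomial <= constant
        x <= 2 * y,                        # monomial <= monomial
        y <= 2 * x,
        z >= 1,
    ]
    problem = cp.Problem(cp.Maximize(x * y * z), constraints)
    problem.solve(gp=True)
    print(problem.value, x.value, y.value, z.value)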

1,215 citations

01 Jan 2008
TL;DR: Reviews the recent developments in SSTA: first the underlying models and assumptions are discussed, then the major approaches are surveyed, and the remaining key challenges are outlined.
Abstract: Static-timing analysis (STA) has been one of the most pervasive and successful analysis engines in the design of digital circuits for the last 20 years. However, in recent years, the increased loss of predictability in semiconductor devices has raised concern over the ability of STA to effectively model statistical variations. This has resulted in extensive research in the so-called statistical STA (SSTA), which marks a significant departure from the traditional STA framework. In this paper, we review the recent developments in SSTA. We first discuss its underlying models and assumptions, then survey the major approaches, and close by discussing its remaining key challenges.

344 citations

Journal ArticleDOI
TL;DR: In this paper, the authors review the recent developments in statistical static-timing analysis (SSTA) and discuss its underlying models and assumptions, then survey the major approaches, and close by discussing its remaining key challenges.
Abstract: Static-timing analysis (STA) has been one of the most pervasive and successful analysis engines in the design of digital circuits for the last 20 years. However, in recent years, the increased loss of predictability in semiconductor devices has raised concern over the ability of STA to effectively model statistical variations. This has resulted in extensive research in the so-called statistical STA (SSTA), which marks a significant departure from the traditional STA framework. In this paper, we review the recent developments in SSTA. We first discuss its underlying models and assumptions, then survey the major approaches, and close by discussing its remaining key challenges.
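
A central operator in many of the SSTA approaches surveyed here is the statistical max of two arrival times. One common building block (though not the only one in the literature) is Clark's 1961 moment-matching approximation, sketched below in Python; the delay numbers in the example are invented.

    # Clark's approximation: re-approximate max(X, Y) of two (possibly
    # correlated) Gaussian arrival times as a Gaussian by matching its
    # first two moments (C. E. Clark, 1961). Illustrative only.
    from math import erf, exp, pi, sqrt

    def phi(x):   # standard normal PDF
        return exp(-x * x / 2) / sqrt(2 * pi)

    def Phi(x):   # standard normal CDF
        return 0.5 * (1 + erf(x / sqrt(2)))

    def clark_max(mu1, s1, mu2, s2, rho=0.0):
        """Mean and stddev of max(X, Y); assumes the inputs are not identical."""
        a = sqrt(s1**2 + s2**2 - 2 * rho * s1 * s2)
        alpha = (mu1 - mu2) / a
        mean = mu1 * Phi(alpha) + mu2 * Phi(-alpha) + a * phi(alpha)
        second = ((mu1**2 + s1**2) * Phi(alpha)
                  + (mu2**2 + s2**2) * Phi(-alpha)
                  + (mu1 + mu2) * a * phi(alpha))
        return mean, sqrt(second - mean**2)

    # Two converging paths with ~100ps nominal delay and 10-12ps sigma:
    print(clark_max(100.0, 10.0, 98.0, 12.0, rho=0.3))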

341 citations

Proceedings ArticleDOI
09 Nov 2003
TL;DR: Key aspects include better integration with analysis and manufacturing interfaces, as well as cost-benefit tradeoffs for "regular" layout structures that are likely beyond 90 nm, cost optimizations for low-volume production, and the role of robust and/or stochastic optimization in PD.
Abstract: Ultra-deep-submicron manufacturability impacts physical design (PD) through complex layout rules and large guard-bands for process variability; this creates new requirements for manufacturing-aware PD technologies. The first part of this tutorial reviews PD complications and methodology changes - notably in the detailed routing arena - that arise from subwavelength lithography and deep-submicron manufacturing (antennas, metal planarization, and mask-wafer mismatch). Process variations and their sources are taxonomized for modeling and simulation. A framework of design for cost and value is described. The second part covers yield-constrained optimizations in PD, especially "beyond corners" approaches that escape today's pessimistic or even incorrect corner-based approaches. Statistical timing and noise analyses enable optimization of parametric yield and reliability. Yield-aware cell libraries and "analog" design rules (as opposed to "digital", 0/1 rules) can help designers explore yield-cost tradeoffs, especially for low-volume parts. We then examine performance-impact-limited fill insertion, which goes beyond mere capacitance rules. Modeling, objectives, and filling strategies are discussed. Finally, we discuss current and near-term prospects for the overall design-to-manufacturing PD methodology. Key aspects include better integration with analysis and manufacturing interfaces, as well as cost-benefit tradeoffs for "regular" layout structures that are likely beyond 90 nm, cost optimizations for low-volume production, and the role of robust and/or stochastic optimization in PD.

274 citations