Author

Krishnendu Chakrabarty

Other affiliations: Huawei, Wake Forest University, Freescale Semiconductor
Bio: Krishnendu Chakrabarty is an academic researcher from Duke University. The author has contributed to research in topics: Biochip & Automatic test pattern generation. The author has an h-index of 79 and has co-authored 996 publications receiving 27,583 citations. Previous affiliations of Krishnendu Chakrabarty include Huawei & Wake Forest University.


Papers
Proceedings ArticleDOI
28 Apr 2002
TL;DR: This work presents a new approach for wrapper/TAM co-optimization based on generalized rectangle packing, also referred to as two-dimensional packing, which allows us to decrease testing time by reducing the mismatch between a core's test data needs and the width of the TAM to which it is assigned.
Abstract: The testing time for a system-on-chip (SOC) is determined to a large extent by the design of test wrappers and the test access mechanism (TAM). Wrapper/TAM co-optimization is therefore necessary for minimizing SOC testing time. We recently proposed an exact technique for co-optimization based on a combination of integer linear programming (ILP) and exhaustive enumeration. However, this approach is computationally expensive for large SOCs, and it is limited to fixed-width test buses. We present a new approach for wrapper/TAM co-optimization based on generalized rectangle packing, also referred to as two-dimensional packing. This approach allows us to decrease testing time by reducing the mismatch between a core's test data needs and the width of the TAM to which it is assigned. We apply our co-optimization technique to an academic benchmark SOC and three industrial SOCs. Compared to the ILP-based technique, we obtain lower or comparable testing times for two out of the three industrial SOCs. Moreover, we obtain more than two orders of magnitude decrease in the CPU time needed for wrapper/TAM co-design.

182 citations
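
The rectangle-packing formulation above maps naturally onto code: each core's test becomes a rectangle whose width is its assigned TAM width and whose height is the resulting test time, and the goal is to pack the rectangles into a strip of total TAM width so the overall height (testing time) is minimized. The following is a minimal greedy sketch of that idea, not the paper's algorithm; the inverse relation between TAM width and test time and all names (schedule_cores, test_bits) are illustrative assumptions.

```python
# Minimal sketch of wrapper/TAM co-optimization as rectangle packing.
# Assumptions (illustrative, not from the paper): a core tested on a
# w-bit TAM takes roughly ceil(test_bits / w) cycles, and the TAM is a
# strip of `total_width` wires. The greedy earliest-finish rule below
# stands in for the paper's generalized rectangle-packing method.

import math

def schedule_cores(test_bits, total_width):
    """Place each core's test rectangle (TAM width x test time) on the TAM.

    test_bits: per-core test data volumes in bits (hypothetical inputs).
    total_width: total TAM width available on the SOC.
    Returns (makespan, placements); each placement is
    (core, tam_offset, width, start_time, finish_time).
    """
    free_at = [0] * total_width          # time each TAM wire becomes free
    placements = []
    # Pack the most data-hungry cores first (a common packing heuristic).
    for core in sorted(range(len(test_bits)), key=lambda c: -test_bits[c]):
        best = None
        for w in range(1, total_width + 1):          # candidate widths
            time = math.ceil(test_bits[core] / w)
            for off in range(total_width - w + 1):   # candidate positions
                start = max(free_at[off:off + w])
                if best is None or start + time < best[0]:
                    best = (start + time, off, w, start)
        finish, off, w, start = best
        for wire in range(off, off + w):
            free_at[wire] = finish
        placements.append((core, off, w, start, finish))
    return max(free_at), placements

makespan, plan = schedule_cores([5000, 3000, 2000, 800], total_width=16)
print(makespan, plan)
```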

Journal ArticleDOI
TL;DR: A rigorous analysis is presented to show that the proposed TRP technique reduces testing time compared to a conventional scan-based scheme, and improves upon prior work on run-length coding by showing that test sets that minimize switching activity during scan shifting can be more efficiently compressed using alternating run-length codes.
Abstract: We present a test resource partitioning (TRP) technique that simultaneously reduces test data volume, test application time, and scan power. The proposed approach is based on the use of alternating run-length codes for test data compression. We present a formal analysis of the amount of data compression obtained using alternating run-length codes. We show that a careful mapping of the don't-cares in precomputed test sets to 1's and 0's leads to significant savings in peak and average power, without requiring either a slower scan clock or blocking logic in the scan cells. We present a rigorous analysis to show that the proposed TRP technique reduces testing time compared to a conventional scan-based scheme. We also improve upon prior work on run-length coding by showing that test sets that minimize switching activity during scan shifting can be more efficiently compressed using alternating run-length codes. Experimental results for the larger ISCAS89 benchmarks and an IBM production circuit show that reduced test data volume, reduced test application time, and low-power scan testing can indeed be achieved in all cases.

177 citations
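
To make the coding scheme concrete, here is a toy alternating run-length encoder and decoder. It illustrates only the core idea, that 0-runs and 1-runs alternate so the run values never need to be stored; the paper's actual codeword format and its don't-care mapping are not reproduced, and the function names are illustrative.

```python
# Toy sketch of alternating run-length coding for a scan test vector.
# Assumption (not the paper's exact codeword format): we store only the
# run lengths, alternating 0-run, 1-run, 0-run, ..., always starting
# with a 0-run (possibly of length 0 when the vector begins with 1s).

def arl_encode(bits):
    """Encode a bit string as alternating run lengths, starting with 0s."""
    runs, expect, i = [], "0", 0
    while i < len(bits):
        n = 0
        while i < len(bits) and bits[i] == expect:
            n += 1
            i += 1
        runs.append(n)                  # may be 0 when a run is absent
        expect = "1" if expect == "0" else "0"
    return runs

def arl_decode(runs):
    """Invert arl_encode."""
    out, value = [], "0"
    for n in runs:
        out.append(value * n)
        value = "1" if value == "0" else "0"
    return "".join(out)

v = "0000110000001111"
assert arl_decode(arl_encode(v)) == v
print(arl_encode(v))   # [4, 2, 6, 4]
```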

Journal ArticleDOI
TL;DR: This work proposes a system design methodology that attempts to apply classical high-level synthesis techniques to the design of digital microfluidic biochips and develops an optimal scheduling strategy based on integer linear programming and two heuristic techniques that scale well for large problem instances.
Abstract: Microfluidic biochips offer a promising platform for massively parallel DNA analysis, automated drug discovery, and real-time biomolecular recognition. Current techniques for full-custom design of droplet-based “digital” biochips do not scale well for concurrent assays and for next-generation system-on-chip (SOC) designs that are expected to include microfluidic components. We propose a system design methodology that attempts to apply classical high-level synthesis techniques to the design of digital microfluidic biochips. We focus here on the problem of scheduling bioassay functions under resource constraints. We first develop an optimal scheduling strategy based on integer linear programming. However, because the scheduling problem is NP-complete, we also develop two heuristic techniques that scale well for large problem instances. A clinical diagnostic procedure, namely multiplexed in-vitro diagnostics on human physiological fluids, is first used to illustrate and evaluate the proposed method. Next, the synthesis approach is applied to a protein assay, which serves as a more complex bioassay application. The proposed synthesis approach is expected to reduce human effort and design cycle time, and it will facilitate the integration of microfluidic components with microelectronic components in next-generation SOCs.

172 citations
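
The scheduling problem described above, bioassay operations competing for a fixed set of on-chip resources under precedence constraints, can be sketched with a generic list-scheduling heuristic. This is not the paper's ILP model or its specific heuristics; the operation names, resource types, and longest-operation-first rule below are illustrative assumptions.

```python
# Generic list-scheduling sketch for bioassay operations under resource
# constraints. Each operation occupies one unit of a resource type
# (e.g. mixer, detector) for a fixed duration. Assumes dependencies
# form a DAG and every resource type has at least one unit.

def list_schedule(ops, deps, capacity):
    """ops: {name: (resource_type, duration)}
    deps: {name: set of predecessor names}
    capacity: {resource_type: number of units on the chip}
    Returns {name: (start, end)} built by a greedy earliest-start rule."""
    done, busy, t, sched = set(), {r: [] for r in capacity}, 0, {}
    while len(done) < len(ops):
        for r in busy:  # free units whose operations finished by time t
            busy[r] = [end for end in busy[r] if end > t]
        ready = [o for o in ops
                 if o not in sched and deps.get(o, set()) <= done]
        for o in sorted(ready, key=lambda o: -ops[o][1]):  # longest first
            r, d = ops[o]
            if len(busy[r]) < capacity[r]:
                sched[o] = (t, t + d)
                busy[r].append(t + d)
        # advance time to the next completion event
        t = min(end for (_, end) in sched.values() if end > t)
        done = {o for o, (_, end) in sched.items() if end <= t}
    return sched

ops = {"mix1": ("mixer", 3), "mix2": ("mixer", 2), "det1": ("detector", 4)}
deps = {"det1": {"mix1", "mix2"}}
print(list_schedule(ops, deps, {"mixer": 1, "detector": 1}))
```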

Journal ArticleDOI
TL;DR: This paper investigates the dependence of the system behavior on its parameters, presenting the bifurcation phenomena and a mapping of the parameter space; this knowledge is vital for designing practical circuits.
Abstract: The DC-DC buck power converter, a widely used chopper circuit, exhibits subharmonics and chaos if current feedback is used. This paper investigates the dependence of the system behavior on its parameters. The bifurcation phenomena and a mapping of the parameter space have been presented. This knowledge is vital for designing practical circuits.

167 citations
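
A crude way to reproduce the kind of parameter-space mapping described above is to integrate the standard piecewise-linear model of a voltage-controlled buck converter and strobe the state once per ramp period: a period-1 orbit yields one sampled value, subharmonics yield a few, and chaos yields many. The model structure below is standard in this literature, but the parameter values and names are illustrative assumptions, not taken from the paper.

```python
# Numerical sketch of the piecewise-linear buck-converter model commonly
# used in the chaos literature: voltage-mode control, where the switch
# conducts while the amplified error A*(v - Vref) is below a ramp.
# Parameter values are illustrative assumptions, not the paper's.

L, C, R = 20e-3, 47e-6, 22.0          # inductor (H), capacitor (F), load (ohm)
A, Vref = 8.4, 11.3                   # feedback gain, reference voltage (V)
VL, VU, T = 3.8, 8.2, 400e-6          # ramp lower/upper levels (V), period (s)
STEPS = 400                           # Euler steps per ramp period
dt = T / STEPS

def strobe(E, periods=300, keep=50):
    """Integrate `periods` ramp cycles at input voltage E; return the last
    `keep` once-per-period samples of the capacitor voltage v."""
    i, v, samples = 0.5, 12.0, []
    for _ in range(periods):
        for k in range(STEPS):
            ramp = VL + (VU - VL) * (k * dt / T)
            u = 1.0 if A * (v - Vref) < ramp else 0.0   # switch state
            di = (u * E - v) / L
            dv = (i - v / R) / C
            i, v = i + dt * di, v + dt * dv
        samples.append(v)
    return samples[-keep:]

# Sweep the input voltage and count distinct strobed values: 1 suggests a
# period-1 orbit, 2 a subharmonic, many suggests chaos.
for E in [22.0, 25.0, 28.0, 31.0, 34.0]:
    vs = strobe(E)
    print(f"E={E:4.1f}  distinct sampled v: {len({round(x, 3) for x in vs})}")
```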

Journal ArticleDOI
TL;DR: This paper reviews recent developments in DSNs from four aspects: network structure, data processing paradigms, sensor fusion algorithms with emphasis on fault-tolerant algorithm design, and optimal sensor deployment strategies.
Abstract: Advances in sensor technology and computer networks have enabled distributed sensor networks (DSNs) to evolve from small clusters of large sensors to large swarms of micro-sensors, from fixed sensor nodes to mobile nodes, from wired communications to wireless communications, from static network topology to dynamically changing topology. However, these technological advances have also brought new challenges to processing large amounts of data in a bandwidth-limited, power-constrained, unstable, and dynamic environment. This paper reviews recent developments in DSNs from four aspects: network structure, data processing paradigms, sensor fusion algorithms with emphasis on fault-tolerant algorithm design, and optimal sensor deployment strategies.

166 citations
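
One classic example of the fault-tolerant fusion algorithms such surveys cover is Marzullo-style interval intersection: each sensor reports an interval believed to contain the true value, and when at most f sensors are faulty, any region covered by at least N - f intervals is a safe fused estimate. The sketch below is a generic illustration of that scheme, not an algorithm taken from the paper; names are illustrative.

```python
# Sketch of Marzullo-style fault-tolerant interval fusion: find the
# smallest interval covered by at least N - f of the N sensor intervals,
# tolerating up to f faulty sensors.

def fuse_intervals(intervals, f):
    """intervals: list of (lo, hi) sensor readings; f: max faulty sensors.
    Returns (lo, hi) spanning all points covered by >= N - f intervals,
    or None if no such region exists."""
    n, need = len(intervals), len(intervals) - f
    events = sorted([(lo, +1) for lo, _ in intervals] +
                    [(hi, -1) for _, hi in intervals],
                    key=lambda e: (e[0], -e[1]))  # opens before closes
    depth, lo, hi = 0, None, None
    for x, d in events:
        prev = depth
        depth += d
        if prev < need <= depth and lo is None:
            lo = x               # first point where coverage reaches N - f
        if prev >= need > depth:
            hi = x               # last point where coverage drops below N - f
    return (lo, hi) if lo is not None else None

# Three agreeing sensors and one faulty outlier, tolerating f = 1:
print(fuse_intervals([(4.7, 5.3), (4.9, 5.4), (4.6, 5.2), (9.0, 9.5)], f=1))
# -> (4.9, 5.2)
```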


Cited by
Journal ArticleDOI

08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it seemed an odd beast at first, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions and linear models for regression and classification are presented, along with neural networks, kernel methods, graphical models, sampling methods, and a discussion of combining models in the context of machine learning.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations
