Book

Computational Aspects of VLSI

01 Jan 1984
About: This book was published on 1984-01-01 and is currently open access. It has received 862 citations to date. It focuses on the topic of very-large-scale integration (VLSI).
Citations
Journal ArticleDOI

520 citations


Cites background from "Computational Aspects of VLSI"

  • ...Thus, if our three-dimensional memory is interpreted as a neural network, the number of interconnections required would probably be prohibitive (Ullman, 1984)....

    [...]

Journal ArticleDOI
TL;DR: In this paper, a divide-and-conquer framework for VLSI graph layout is introduced, which is used to design regular and configurable layouts, to assemble large networks of processors using restructurable chips, and to configure networks around faulty processors.
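The divide-and-conquer idea summarized above can be made concrete with a small sketch (an illustration of the general technique under assumed helper names such as layout and bisect, not the paper's actual framework): recursively bisect the node set, lay out each half in its own region, then place the halves side by side, leaving a routing channel sized by the number of cut edges.

# Toy divide-and-conquer placement in the spirit of the framework above.
# Hypothetical sketch: bisect() is a naive even split standing in for a
# proper min-cut bisection heuristic (e.g. Kernighan-Lin).

def bisect(nodes):
    """Split the node list into two halves of (nearly) equal size."""
    half = len(nodes) // 2
    return nodes[:half], nodes[half:]

def layout(graph, nodes=None, horizontal=True):
    """Return {node: (x, y)} positions by recursive bisection, alternating
    the split direction so the result spreads over two dimensions.

    graph maps each node to the set of its neighbours.
    """
    if nodes is None:
        nodes = sorted(graph)
    if len(nodes) == 1:
        return {nodes[0]: (0, 0)}
    left, right = bisect(nodes)
    pos_left = layout(graph, left, not horizontal)
    pos_right = layout(graph, right, not horizontal)
    # Edges crossing the cut set the width of the routing channel left
    # between the two sub-layouts when they are combined.
    right_set = set(right)
    channel = max(1, sum(1 for u in left for v in graph[u] if v in right_set))
    if horizontal:
        offset = 1 + max(x for x, _ in pos_left.values()) + channel
        moved = {v: (x + offset, y) for v, (x, y) in pos_right.items()}
    else:
        offset = 1 + max(y for _, y in pos_left.values()) + channel
        moved = {v: (x, y + offset) for v, (x, y) in pos_right.items()}
    return {**pos_left, **moved}

# Example: lay out a 4-cycle a-b-c-d-a.
g = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c", "a"}}
print(layout(g))  # {'a': (0, 0), 'b': (0, 2), 'c': (3, 0), 'd': (3, 2)}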

476 citations

Journal ArticleDOI
TL;DR: In this paper, the authors examine the physical limits of the process of computing and find that the limits are based solely on fundamental physical principles, not on whatever technology we may currently be using.
Abstract: What constraints govern the physical process of computing? Is a minimum amount of energy required, for example, per logic step? There seems to be no minimum, but some other questions are open. A computation, whether it is performed by electronic machinery, on an abacus or in a biological system such as the brain, is a physical process. It is subject to the same questions that apply to other physical processes: How much energy must be expended to perform a particular computation? How long must it take? How large must the computing device be? In other words, what are the physical limits of the process of computation? So far it has been easier to ask these questions than to answer them. To the extent that we have found limits, they are terribly far away from the real limits of modern technology. We cannot profess, therefore, to be guiding the technologist or the engineer. What we are doing is really more fundamental. We are looking for general laws that must govern all information processing, no matter how it is accomplished. Any limits we find must be based solely on fundamental physical principles, not on whatever technology we may currently be using. There are precedents for this kind of fundamental examination. In the 1940's Claude E. Shannon of the Bell Telephone Laboratories found there are limits on the amount of information that can be transmitted through a noisy channel; these limits apply no matter how the message is encoded into a signal. Shannon's work represents the birth of modern information science. Earlier, in the mid- and late 19th century, physicists attempting to determine the fundamental limits on the efficiency of steam engines had created the science of thermodynamics. In about 1960 one of us (Landauer) and John at IBM began attempting to apply the same type of analysis to the process of computing. Since the mid-1970's a growing number of other workers at other institutions have entered this field. In our analysis of the physical limits of computation we use the term "information" in the technical sense of information theory. In this sense information is destroyed whenever two previously distinct situations become indistinguishable. In physical systems without friction, information can never be destroyed; whenever information is destroyed, some amount of energy must be dissipated (converted into heat). As an example, imagine two easily distinguishable physical situations, such as a …
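The statement that destroying information forces energy dissipation is Landauer's principle; as a brief worked bound (a standard textbook formulation, not quoted from this abstract), erasing a single bit at temperature T releases at least

\[
% Landauer bound for erasing one bit at temperature T (k_B is Boltzmann's constant)
E_{\min} \;=\; k_B T \ln 2
\;\approx\; \bigl(1.38\times 10^{-23}\,\mathrm{J\,K^{-1}}\bigr)\,(300\,\mathrm{K})\,(0.693)
\;\approx\; 2.9\times 10^{-21}\,\mathrm{J} \;\approx\; 0.018\,\mathrm{eV}
\quad\text{at room temperature.}
\]

This is many orders of magnitude below the energy dissipated per logic operation in practical hardware, which is consistent with the abstract's remark that the known limits are terribly far from the limits of current technology.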

344 citations

Book
21 Dec 2011
TL;DR: This is the second volume of a systematic two-volume presentation of the various areas of research in the field of structural complexity. It is addressed to graduate students and researchers, assumes knowledge of the topics treated in the first volume, and is otherwise nearly self-contained.
Abstract: This is the second volume of a systematic two-volume presentation of the various areas of research in the field of structural complexity. The mathematical theory of computation has developed into a broad and rich discipline within which the theory of algorithmic complexity can be approached from several points of view. This volume is addressed to graduate students and researchers and assumes knowledge of the topics treated in the first volume but is otherwise nearly self-contained. Topics covered include vector machines, parallel computation, alternation, uniform circuit complexity, isomorphism, biimmunity and complexity cores, relativization and positive relativization, the low and high hierarchies, Kolmogorov complexity and probability classes. Numerous exercises and references are given.

330 citations

Book
01 Jan 1998
TL;DR: In Models of Computation, John Savage re-examines theoretical computer science, offering a fresh approach that gives priority to resource tradeoffs and complexity classifications over the structure of machines and their relationships to languages.
Abstract: From the Publisher: "Your book fills the gap which all of us felt existed too long. Congratulations on this excellent contribution to our field." --Jan van Leeuwen, Utrecht University. "This is an impressive book. The subject has been thoroughly researched and carefully presented. All the machine models central to the modern theory of computation are covered in depth; many for the first time in textbook form. Readers will learn a great deal from the wealth of interesting material presented." --Andrew C. Yao, Professor of Computer Science, Princeton University. "Models of Computation is an excellent new book that thoroughly covers the theory of computation, including significant recent material, and presents it all with insightful new approaches. This long-awaited book will serve as a milestone for the theory community." --Akira Maruoka, Professor of Information Sciences, Tohoku University. "This is computer science." --Elliot Winard, Student, Brown University. In Models of Computation: Exploring the Power of Computing, John Savage re-examines theoretical computer science, offering a fresh approach that gives priority to resource tradeoffs and complexity classifications over the structure of machines and their relationships to languages. This viewpoint reflects a pedagogy motivated by the growing importance of computational models that are more realistic than the abstract ones studied in the 1950s, '60s and early '70s. Assuming only some background in computer organization, Models of Computation uses circuits to simulate machines with memory, thereby making possible an early discussion of P-complete and NP-complete problems. Circuits are also used to demonstrate that tradeoffs between parameters of computation, such as space and time, regulate all computations by machines with memory. Full coverage of formal languages and automata is included, along with a substantive treatment of computability. Topics such as space-time tradeoffs, memory hierarchies, parallel computation, and circuit complexity are integrated throughout the text, with an emphasis on finite problems and concrete computational models. FEATURES: Includes introductory material for a first course on theoretical computer science. Builds on computer organization to provide an early introduction to P-complete and NP-complete problems. Includes a concise, modern presentation of regular, context-free and phrase-structure grammars, parsing, finite automata, pushdown automata, and computability. Includes an extensive, modern coverage of complexity classes. Provides an introduction to the advanced topics of space-time tradeoffs, memory hierarchies, parallel computation, the VLSI model, and circuit complexity, with parallelism integrated throughout. Contains over 200 figures and over 400 exercises along with an extensive bibliography. ** Instructor's materials are available from your sales rep. If you do not know your local sales representative, please call 1-800-552-2499 for assistance, or use the Addison Wesley Longman rep-locator at ...
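As a toy illustration of the "circuits simulate machines with memory" viewpoint mentioned above (a sketch of the standard unrolling construction, not the book's own development), the next-state logic of a tiny sequential machine can be copied once per time step to obtain a purely combinational circuit:

# Sketch: simulate a machine with memory by a combinational circuit.
# The machine is a 1-bit toggle register: state' = state XOR input.
# Unrolling T steps yields a circuit with T XOR gates and no memory elements.

def next_state_gate(state_bit, input_bit):
    """One copy of the machine's next-state logic as a Boolean gate."""
    return state_bit ^ input_bit  # XOR gate

def unrolled_circuit(inputs, initial_state=0):
    """Combinational circuit obtained by unrolling the machine over
    len(inputs) time steps; returns the wire value after each step."""
    wires = []
    state = initial_state
    for bit in inputs:
        state = next_state_gate(state, bit)  # step t's copy of the logic
        wires.append(state)
    return wires

# The unrolled circuit computes exactly what the sequential machine would:
assert unrolled_circuit([1, 0, 1, 1]) == [1, 1, 0, 1]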

311 citations


Cites background from "Computational Aspects of VLSI"

  • ...Ullman [339] summarized the status of the field around 1984 and Lengauer [193] addressed the VLSI layout problem....

    [...]