
Showing papers from "University of Maryland, College Park" published in 1988


Journal Article
TL;DR: How the field of computer (and robot) vision has evolved, particularly over the past 20 years, is described, and its central methodological paradigms are introduced.

3,112 citations


Journal ArticleDOI
TL;DR: It is shown how various well-known asymptotic power laws in S(q) are obtained from the above theory, and the theory is compared with experimental results on x-ray scattering from a polished Pyrex glass surface.
Abstract: The scattering of x rays and neutrons from rough surfaces is calculated. It is split into specular reflection and diffuse scattering terms. These are calculated in the first Born approximation, and explicit expressions are given for surfaces whose roughness can be described as self-affine over finite length scales. Expressions are also given for scattering from liquid surfaces, where it is shown that ``specular'' reflections only exist by virtue of a finite length cutoff to the mean-square height fluctuations. Expressions are also given for the scattering from randomly oriented surfaces, as studied in a typical small-angle scattering experiment. It is shown how various well-known asymptotic power laws in S(q) are obtained from the above theory. The distorted-wave Born approximation is next used to treat the case where the scattering is large (e.g., near the critical angle for total external reflection), and its limits of validity are discussed. Finally, the theory is compared with experimental results on x-ray scattering from a polished Pyrex glass surface.

2,031 citations


Proceedings ArticleDOI
01 May 1988
TL;DR: Frequent and sophisticated PC users rated MDA more satisfying, powerful and flexible than CLS, and future applications of the QUIS on computers are discussed.
Abstract: This study is a part of a research effort to develop the Questionnaire for User Interface Satisfaction (QUIS). Participants, 150 PC user group members, rated familiar software products. Two pairs of software categories were compared: 1) software that was liked and disliked, and 2) a standard command line system (CLS) and a menu driven application (MDA). The reliability of the questionnaire was high, Cronbach's alpha=.94. The overall reaction ratings yielded significantly higher ratings for liked software and MDA over disliked software and a CLS, respectively. Frequent and sophisticated PC users rated MDA more satisfying, powerful and flexible than CLS. Future applications of the QUIS on computers are discussed.
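The reliability statistic quoted above, Cronbach's alpha, is straightforward to compute from an item-by-respondent score matrix. The sketch below uses invented ratings data, not the QUIS study's, and shows the standard formula:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# The ratings matrix here is made up for illustration only.
def cronbach_alpha(scores):
    """scores: list of respondent rows, each a list of item ratings."""
    k = len(scores[0])  # number of items

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

ratings = [  # 4 respondents x 3 items, invented data
    [7, 8, 7],
    [5, 5, 6],
    [9, 9, 8],
    [3, 4, 4],
]
print(round(cronbach_alpha(ratings), 3))
```

A high alpha, as in the study's .94, indicates that the questionnaire items vary together rather than independently.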

1,456 citations


Journal ArticleDOI
TL;DR: The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment), and the first in a series of TAME system prototypes has been developed.
Abstract: Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

1,351 citations


Journal ArticleDOI
TL;DR: The development and psychometric evaluation of a structured, 45-minute Quality of Life Interview for the chronically mentally ill is described, which has satisfactory reliability and validity.

1,043 citations


Journal ArticleDOI
TL;DR: In this article, the normative implications of competition among local jurisdictions to attract new industry and income are explored within a neoclassical framework, where local officials set two policy variables, a tax (or subsidy) rate on mobile capital and a standard for local environmental quality, to induce more capital to enter the jurisdiction in order to raise wages.

938 citations


Journal ArticleDOI
TL;DR: The paper presents and analyzes algorithms for multiple-query optimization; experimental results show that multiple-query processing algorithms may reduce execution cost considerably.
Abstract: Some recently proposed extensions to relational database systems, as well as to deductive database systems, require support for multiple-query processing. For example, in a database system enhanced with inference capabilities, a simple query involving a rule with multiple definitions may expand to more than one actual query that has to be run over the database. It is an interesting problem then to come up with algorithms that process these queries together instead of one query at a time. The main motivation for performing such an interquery optimization lies in the fact that queries may share common data. We examine the problem of multiple-query optimization in this paper. The first major contribution of the paper is a systematic look at the problem, along with the presentation and analysis of algorithms that can be used for multiple-query optimization. The second contribution lies in the presentation of experimental results. Our results show that using multiple-query processing algorithms may reduce execution cost considerably.

872 citations


Journal ArticleDOI
TL;DR: The concept and measurement of commitment to goals, a key aspect of goal-setting theory, are discussed in this paper. The strength of the relationship between commitment and performance is asserted to depend on the amount of variance in commitment.
Abstract: The concept and measurement of commitment to goals, a key aspect of goal-setting theory, are discussed. The strength of the relationship between commitment and performance is asserted to depend on the amount of variance in commitment. Three major categories of determinants of commitment are discussed: external factors (authority, peer influence, external rewards), interactive factors (participation and competition), and internal factors (expectancy, internal rewards). Applications of these ideas are made and new research directions are suggested.

810 citations


Journal ArticleDOI
TL;DR: A pair of correlated light quanta of 532-nm wavelength with the same linear polarization but divergent directions of propagation was produced by nonlinear optical parametric down conversion and observed a violation of Bell's inequality by 3 standard deviations.
Abstract: A pair of correlated light quanta of 532-nm wavelength with the same linear polarization but divergent directions of propagation was produced by nonlinear optical parametric down conversion. Each light quantum was converted to a definite polarization eigenstate and was reflected by a turning mirror to superpose with the other at a beam splitter. For coincident detection at separated detectors, polarization correlations of the Einstein-Podolsky-Rosen-Bohm type were observed. We also observed a violation of Bell's inequality by 3 standard deviations.

759 citations


Book
01 Aug 1988
TL;DR: This book covers negation in logic programming and stratified databases, fundamental issues in deductive databases and their implementation, and unification and logic programs, among other topics.
Abstract: Introduction, by J. Minker
Part I - Negation and Stratified Databases
Chapter 1 Negation in Logic Programming, by J.C. Shepherdson
Chapter 2 Towards a Theory of Declarative Knowledge, by K.R. Apt, H.A. Blair, and A. Walker
Chapter 3 Negation as Failure Using Tight Derivations for General Logic Programs, by A. Van Gelder
Chapter 4 On the Declarative Semantics of Logic Programs with Negation, by V. Lifschitz
Chapter 5 On the Declarative Semantics of Deductive Databases and Logic Programs, by T.C. Przymusinski
Chapter 6 On Domain Independent Databases, by R.W. Topor and E.A. Sonenberg
Part II - Fundamental Issues in Deductive Databases and Implementation
Chapter 7 Foundations of Semantic Query Optimization for Deductive Databases, by U.S. Chakravarthy, J. Grant, and J. Minker
Chapter 8 Intelligent Query Answering in Rule Based Systems, by T. Imielinski
Chapter 9 A Theorem-Proving Approach to Database Integrity, by F. Sadri and R. Kowalski
Chapter 10 A Logic-based Language for Database Updates, by S. Manchanda and D.S. Warren
Chapter 11 Compiling the GCWA in Indefinite Deductive Databases, by L. Henschen and H. Park
Chapter 12 Performance Evaluation of Data Intensive Logic Programs, by F. Bancilhon and R. Ramakrishnan
Chapter 13 A Superjoin Algorithm for Deductive Databases, by J.A. Thom, K. Ramamohanarao, and L. Naish
Part III - Unification and Logic Programs
Chapter 14 Logic Programming and Parallel Complexity, by P.C. Kanellakis
Chapter 15 Unification Revisited, by J-L Lassez, M.J. Maher, and K. Marriott
Chapter 16 Equivalences of Logic Programs, by M.J. Maher
Chapter 17 Optimizing Datalog Programs, by Y. Sagiv
Chapter 18 Converting AND-Control to OR-Control by Program Transformation, by M.H. van Emden and P. Szeredi
Authors, Referees, Author Index, Subject Index

707 citations



Journal ArticleDOI
TL;DR: In this paper, the authors survey the recent British sociological literature on the professions, arguing that the realities examined in that literature lead to the conclusion that the professions will ...
Abstract: In 1983 Richard Hall viewed the sociological study of the professions as near death. However, had Hall examined the recent British literature he would have come to a very different conclusion. Our survey shows that this is a very active area of research and theorizing and that there are important lessons in it for American students of the professions. First, unlike the American literature, work in Britain has not been dominated by fruitless efforts to find the characteristics that differentiate professions from other occupations. Second, the British literature contains four distinctive characteristics that differentiate it from the American literature. They are a focus on inter- and intraprofessional conflicts, the relationship between the professions and the polity, the link between the professions and social stratification, and theoretical roots in the classic ideas of observers such as Marx and Weber. Third, the realities examined in the British literature lead us to conclude that the professions will ...

Journal ArticleDOI
TL;DR: Kemeny's rule as discussed by the authors is the unique social welfare function that satisfies a variant of independence of irrelevant alternatives together with several other standard properties, and is the most likely ranking of the alternatives.
Abstract: Condorcet's criterion states that an alternative that defeats every other by a simple majority is the socially optimal choice. Condorcet argued that if the object of voting is to determine the "best" decision for society but voters sometimes make mistakes in their judgments, then the majority alternative (if it exists) is statistically most likely to be the best choice. Strictly speaking, this claim is not true; in some situations Borda's rule gives a sharper estimate of the best alternative. Nevertheless, Condorcet did propose a novel and statistically correct rule for finding the most likely ranking of the alternatives. This procedure, which is sometimes known as "Kemeny's rule," is the unique social welfare function that satisfies a variant of independence of irrelevant alternatives together with several other standard properties.
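Kemeny's rule can be stated operationally: choose the ranking that minimizes total pairwise disagreement (Kendall tau distance) with the voters' ballots. A brute-force sketch follows; the candidate names and ballots are invented for illustration:

```python
# Kemeny's rule by exhaustive search: score every possible ranking by its
# total Kendall tau distance to the ballots, and keep the minimizer.
from itertools import permutations

def kendall_tau(r1, r2):
    """Number of candidate pairs on which two rankings disagree."""
    pos1 = {c: i for i, c in enumerate(r1)}
    pos2 = {c: i for i, c in enumerate(r2)}
    cands = list(r1)
    return sum(
        1
        for i in range(len(cands))
        for j in range(i + 1, len(cands))
        if (pos1[cands[i]] - pos1[cands[j]]) * (pos2[cands[i]] - pos2[cands[j]]) < 0
    )

def kemeny(ballots):
    """Brute force over all rankings; exponential, fine for tiny examples."""
    cands = ballots[0]
    return min(
        permutations(cands),
        key=lambda r: sum(kendall_tau(r, b) for b in ballots),
    )

ballots = [("A", "B", "C"), ("A", "B", "C"), ("B", "C", "A")]
print(kemeny(ballots))  # the ranking two of the three voters share wins
```

Finding the Kemeny ranking is NP-hard in general, so the exhaustive search above is only viable for a handful of alternatives.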

Journal ArticleDOI
TL;DR: In 1987, the International Satellite Land Surface Climatology Project (ISLSCP) conducted a major field experiment aimed at developing better methods for relating satellite remote sensing to biological and physical parameter and exchange process data obtained during simultaneous ground truth measurements.
Abstract: In 1987, the International Satellite Land Surface Climatology Project (ISLSCP) will conduct a major field experiment aimed at the development of superior methods for relating satellite remote sensing to biological and physical parameter and exchange process data obtained during simultaneous ground truth measurements. This, the First ISLSCP Field Experiment, will attempt to arrive at a better understanding of the role of the land surface in the behavior of the global climate system, as required by next-generation global circulation models and general biospheric models. The field experiment will be conducted in the Konza Prairie of Kansas.

Book ChapterDOI
TL;DR: The term "specific inhibitor" has been applied to these types of compounds when they are used to probe the functions of mixed populations of microorganisms, providing powerful experimental tools for investigating the activity and function of certain types of microorganisms in natural samples.
Abstract: The above statement, although meant to be tongue in cheek, contains an essential truism: all work with inhibitors is inherently suspect. This fact has been known by biochemists for some time. However, use of chemical inhibitors of enzymic systems and membranes continues to be a common approach taken toward unraveling the biochemistry and biophysics of plants, animals, and microorganisms. Various types of “broad-spectrum” biochemical inhibitors (e.g., poisons, respiratory inhibitors, and uncouplers) have been employed by ecologists for many years in order to demonstrate the active participation of microbes in chemical reactions occurring in natural samples (e.g., soils, sediments, and water). In recent years, considerable advances have been made in our understanding of the biochemistry of microorganisms of biogeochemical interest. Concurrent with these advances have been the discoveries of novel types of compounds that will block the metabolism of one particular group of microbes, but have little disruptive effect on other physiological types. Thus, the term “specific inhibitor” has been applied to these types of compounds when they are used to probe the functions of mixed populations of microorganisms. These substances provide powerful experimental tools for investigating the activity and function of certain types of microorganisms in natural samples.

Journal ArticleDOI
TL;DR: A user-centered framework for information-seeking is presented that has been used in evaluating two hypertext systems and is applied to key design issues related to information retrieval in hypertext systems.
Abstract: The authors discuss the role of information retrieval, interface design, and cognitive science in hypertext research. They present a user-centered framework for information-seeking that has been used in evaluating two hypertext systems. They apply the framework to key design issues related to information retrieval in hypertext systems.

Journal ArticleDOI
TL;DR: Diffusion in a spatially rough one-dimensional potential is treated by analysis of the mean first passage time, and a general expression is found for the effective diffusion coefficient, which can become very small at low temperatures.
Abstract: Diffusion in a spatially rough one-dimensional potential is treated by analysis of the mean first passage time. A general expression is found for the effective diffusion coefficient, which can become very small at low temperatures.
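The general expression is not reproduced in this listing; for reference, the widely cited result for diffusion in a rough one-dimensional potential takes the following form (a reconstruction of the standard statement, not a quotation of the paper's exact notation):

```latex
% Effective diffusion coefficient in a rough potential U(x) at temperature T.
% Angle brackets denote spatial averages over the roughness; the Gaussian
% special case below assumes roughness of rms amplitude \varepsilon.
D^{*} \;=\; \frac{D_0}{\left\langle e^{\,U/k_B T} \right\rangle
                      \left\langle e^{\,-U/k_B T} \right\rangle},
\qquad
D^{*}_{\text{Gaussian}} \;=\; D_0\, e^{-(\varepsilon/k_B T)^2}
```

Both averaged factors grow rapidly as T decreases, which is why the effective diffusion coefficient becomes very small at low temperatures.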

Proceedings ArticleDOI
01 May 1988
TL;DR: The results showed that a touch strategy providing continuous feedback until a selection was confirmed had fewer errors than other touch strategies, with implications for touch screens containing small, densely-packed targets.
Abstract: A study comparing the speed, accuracy, and user satisfaction of three different touch screen strategies was performed. The purpose of the experiment was to evaluate the merits of the more intricate touch strategies that are possible on touch screens that return a continuous stream of touch data. The results showed that a touch strategy providing continuous feedback until a selection was confirmed had fewer errors than other touch strategies. The implications of the results for touch screens containing small, densely-packed targets were discussed.

Journal ArticleDOI
TL;DR: In this paper, the authors measured the distributions of turbidity, nutrients, and phytoplankton across the salinity gradients of three estuaries: Chesapeake Bay, Delaware Bay, and the Hudson River estuary.
Abstract: Estuaries receive continuous inputs of nutrients from their freshwater sources, but the fate of the inputs is poorly known. In order to document nutrient removal from the water column by phytoplankton, we measured the distributions of turbidity, nutrients, and phytoplankton across the salinity gradients of three estuaries: Chesapeake Bay, Delaware Bay, and the Hudson river estuary. Mixing diagrams were used to distinguish between conservative and non-conservative behavior; i.e. between loss from the water column and export to the estuarine plume on the shelf. In Chesapeake and Delaware Bays, we frequently observed a turbidity maximum in the oligohaline region, a chlorophyll maximum in clearer waters seaward of the turbidity maximum, and a nutrient-depleted zone at the highest salinities. In the Hudson River estuary, mixing diagrams were dominated by lateral waste inputs from New York City, and nutrient removal could not be estimated. In Chesapeake Bay, there was consistent removal of total N, nitrate, phosphate, and silicate from the water column, whereas in Delaware Bay, total N, ammonium, total P, and phosphate were removed. Total N and P removal in the Chesapeake and Delaware are estimated as ca. 50%, except for TP in the Chesapeake, which appeared to be conservative. Phytoplankton accumulation was associated with inorganic nutrient removal, suggesting that phytoplankton uptake was a major process responsible for nutrient removal. In the high salinity zone near and in the shelf plume, an index of nutrient limitation suggested no limitation in the Hudson, slight or no limitation in the Delaware, and widespread limitation in the Chesapeake, especially for P. These observations and information from the literature are summarized as a conceptual model of the chemical and biological structure of estuaries.

Journal ArticleDOI
TL;DR: In this article, the probability of latent class membership is functionally related to concomitant variables with known distribution, and a general procedure for imposing linear constraints on the parameter estimates is introduced.
Abstract: This article introduces and illustrates a new type of latent-class model in which the probability of latent-class membership is functionally related to concomitant variables with known distribution. The function (or so-called submodel) may be logistic, exponential, or another suitable form. Concomitant-variable models supplement latent-class models incorporating grouping by providing more parsimonious representations of data for some cases. Also, concomitant-variable models are useful when grouping models involve a greater number of parameters than can be meaningfully fit to existing data sets. Although parameter estimates may be calculated using standard iterative procedures such as the Newton-Raphson method, sample analyses presented here employ a derivative-free approach known as the simplex method. A general procedure for imposing linear constraints on the parameter estimates is introduced. A data set involving arithmetic test items in a mastery testing context is used to illustrate fitting a...

Journal ArticleDOI
TL;DR: In this paper, the authors consider the deformation space of flat G-bundles over a closed surface S and show that the invariants lie in the cohomology group H²(S; π₁(G)) ≅ π₁(G) (using the orientation on S).
Abstract: Since π is a finitely generated group, the space Hom(π, G) is a real analytic variety whenever G is a connected Lie group, and is a real affine algebraic variety whenever G is a linear algebraic group over R [3, 18, 27, 32]. The group G acts on Hom(π, G) by conjugation and the orbit space will be denoted by Hom(π, G)/G. Geometrically, the G-orbits on Hom(π, G) parametrize equivalence classes of flat principal G-bundles over S and the space Hom(π, G)/G is the deformation space of flat G-bundles over S. The characteristic classes of G-bundles determine invariants of representations π → G. When π is the fundamental group of a closed surface and G is a connected Lie group, the only invariants lie in the cohomology group H²(S; π₁(G)) ≅ π₁(G) (using the orientation on S). There is an obstruction map

Journal ArticleDOI
TL;DR: This paper treats a system of self-interacting bosons described by λφ⁴ scalar fields in flat space, adopting the closed-time-path (CTP) functional formalism and a two-particle irreducible (2PI) representation for the effective action.
Abstract: This is the first of a series of papers which describe the functional-integral approach to the study of the statistical and kinetic properties of nonequilibrium quantum fields in flat and curved spacetimes. In this paper we treat a system of self-interacting bosons described by λφ⁴ scalar fields in flat space. We adopt the closed-time-path (CTP or "in-in") functional formalism and use a two-particle irreducible (2PI) representation for the effective action. These formalisms allow for a full account of the dynamics of quantum fields, and put the correlation functions on an equal footing with the mean fields. By assuming a thermal distribution we recover the real-time finite-temperature theory as a special case. By requiring the CTP effective action to be stationary with respect to variations of the correlation functions we obtain an infinite set of coupled equations which is the quantum-field-theoretical generalization of the Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy. Truncation of this series leads to dissipative characteristics in the subsystem. In this context we discuss the nature of dissipation in interacting quantum fields. To one-loop order in a perturbative expansion of the CTP effective action, the 2PI formalism yields results equivalent to the leading 1/N expansion for an O(N)-symmetric scalar field. To higher-loop order we introduce a two-time approximation to separate the quantum-field effects of radiative correction and renormalization from the statistical-kinetic effects of collisions and relaxation. In the weak-coupling quasiuniform limit, the system of nonequilibrium quantum fields can subscribe to a kinetic theory description wherein the propagators are represented in terms of relativistic Wigner distribution functions. From a two-loop calculation we derive the Boltzmann equation for the distribution function and the gap equation for the effective mass of the quasiparticles.
One can define an entropy function for the quantum gas of quasiparticles which satisfies the H theorem. We also calculate the limits to the validity of the binary collision approximation from a three-loop analysis. The theoretical framework established here can be generalized to nonconstant background fields and for curved spacetimes.

Journal ArticleDOI
TL;DR: Children aged 7 to 11 years visiting their primary care pediatrician for a wide range of reasons were studied to determine the one-year prevalence of DSM-III disorders and the risk factors associated with them.
Abstract: Children aged 7 to 11 years visiting their primary care pediatrician for a wide range of reasons were studied to determine the one-year prevalence of DSM-III disorders and the risk factors associated with them. Parents completing the Child Behavior Checklist about their children identified problems that placed 24.7% of 789 children in the clinical range. Detailed psychiatric interviews with 300 parents and children, using the Diagnostic Interview Schedule for Children, yielded a one-year weighted prevalence of one or more DSM-III disorders of 22.0% +/- 3.4%, combining diagnoses based on either the child or the parent interview.

Journal ArticleDOI
TL;DR: In this article, it was shown that there exists a neighborhood of ρ in ℜ(Γ, G) which is analytically equivalent to a cone defined by homogeneous quadratic equations.
Abstract: Let Γ be the fundamental group of a compact Kähler manifold M and let G be a real algebraic Lie group. Let ℜ(Γ, G) denote the variety of representations Γ → G. Under various conditions on ρ ∈ ℜ(Γ, G) it is shown that there exists a neighborhood of ρ in ℜ(Γ, G) which is analytically equivalent to a cone defined by homogeneous quadratic equations. Furthermore this cone may be identified with the quadratic cone in the space Z¹(Γ, g_Adρ) of Lie algebra-valued 1-cocycles on Γ comprising cocycles u such that the cohomology class of the cup/Lie product square [u, u] is zero in H²(Γ, g_Adρ). We prove that ℜ(Γ, G) is quadratic at ρ if either (i) G is compact, (ii) ρ is the monodromy of a variation of Hodge structure over M, or (iii) G is the group of automorphisms of a Hermitian symmetric space X and the associated flat X-bundle over M possesses a holomorphic section. Examples are given where singularities of ℜ(Γ, G) are not quadratic, and are quadratic but not reduced. These results can be applied to construct deformations of discrete subgroups of Lie groups.

Proceedings ArticleDOI
01 May 1988
TL;DR: Pie menus place items along the circumference of a circle at equal radial distances from the center; they gain over traditional linear menus by reducing target seek time and lowering error rates, fixing the distance factor and increasing the target size in Fitts's Law.
Abstract: Menus are largely formatted in a linear fashion, listing items from the top to bottom of the screen or window. Pull-down menus are a common example of this format. Bitmapped computer displays, however, allow greater freedom in the placement, font, and general presentation of menus. A pie menu is a format where the items are placed along the circumference of a circle at equal radial distances from the center. Pie menus gain over traditional linear menus by reducing target seek time, lowering error rates by fixing the distance factor and increasing the target size in Fitts's Law, and minimizing the drift distance after target selection; in general, they are subjectively equivalent to the linear style.
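The Fitts's Law argument can be made concrete: the index of difficulty log2(D/W + 1) falls when the distance D to the target is fixed and short and the effective target width W is large, as for a pie slice. The menu geometry below is invented for illustration and is not from the study:

```python
# Fitts's Law comparison of a deep linear menu item vs. a pie menu slice.
# All pixel values are hypothetical; only the shape of the comparison matters.
from math import log2

def fitts_id(distance, width):
    """Shannon formulation of the index of difficulty, in bits."""
    return log2(distance / width + 1)

# Linear menu: the 8th item sits 8 rows down, each row 20 px tall.
linear = fitts_id(distance=8 * 20, width=20)
# Pie menu: every slice center is ~60 px from the cursor, with an
# effective target ~80 px across thanks to the wide wedge.
pie = fitts_id(distance=60, width=80)
print(round(linear, 2), round(pie, 2))
```

Because every pie slice is equidistant from the menu's center, the distance term is both small and uniform across items, which is the "fixing the distance factor" advantage the abstract describes.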

Journal ArticleDOI
05 May 1988-Nature
TL;DR: NASA's Coastal Zone Color Scanner and drifting buoys revealed that the discharge of the Amazon is carried offshore around a retroflection of the North Brazil Current and into the North Equatorial Countercurrent towards Africa between June and January each year.
Abstract: New information obtained with NASA's Coastal Zone Color Scanner and with drifting buoys reveals that the discharge of the Amazon is carried offshore around a retroflection of the North Brazil Current and into the North Equatorial Countercurrent towards Africa between June and January each year. From about February to May, the countercurrent and the retroflection weaken or vanish, and Amazon water flows northwestward toward the Caribbean Sea.

Journal ArticleDOI
TL;DR: The data indicate that the mechanical force type and severity of the pelvic fracture are the keys to the expected organ injury pattern, resuscitation needs, and mortality.
Abstract: Three hundred forty-three multiple trauma patients with major pelvic ring disruption were studied and subdivided into four major groups by mechanism of injury: antero-posterior compression (APC), lateral compression (LC), vertical shear (VS), and combined mechanical injury (CMI). Acetabular fractures which did not disrupt the pelvic ring were excluded. The mode of injury was: MVA, 57.4%; motorcycle, 9.3%; fall, 9.3%; pedestrian, 17.8%; crush, 3.8%. The LC and APC groups were divided into Grades 1-3 of increasing severity. The pattern of organ injury: including brain, lung, liver, spleen, bowel, bladder, pelvic vascular injury (PVASI), retroperitoneal hematoma (RPH) and complications: circulatory shock, sepsis, ARDS, abnormal physiology, and 24-hr total fluid volume administration were all evaluated as a function of mortality (M). As LC grade increased from 1 to 3 there was increased % incidence of PVASI, RPH, shock, and 24-hr volume needs. However, the large incidence of brain, lung, and upper abdominal visceral injuries as causes of death in Grade 1 and 2 fell in LC3, with limitation of the LC3 injury pattern to the pelvis. As APC grade increased from 1 to 3 there was increased % injury to spleen, liver, bowel, PVASI with RPH, shock, sepsis, and ARDS, and large increases in volume needs, with important incidence of brain and lung injuries in all grades. Organ injury patterns and % M associated with vertical shear were similar to those with severe grades of APC, but CMI had an associated organ injury pattern similar to lower grades of APC and LC fractures. The pattern of injury in APC3 was correlated with the greatest 24-hour fluid requirements and with a rise in mortality as the APC grade rose. However, there were major differences in the causes of death in LC vs. APC injuries, with brain injury compounded by shock being significant contributors in LC. 
In contrast, in APC there were significant influences of shock, sepsis, and ARDS related to the massive torso forces delivered in APC, with large volume losses from visceral organs and pelvis of greater influence in APC, but brain injury was not a significant cause of death. These data indicate that the mechanical force type and severity of the pelvic fracture are the keys to the expected organ injury pattern, resuscitation needs, and mortality.

Journal ArticleDOI
TL;DR: In this article, the need for buffering agents in dairy cow diets is shown to be a function of salivary buffer secretion, feedstuff buffering capacity, and feed acidity; where buffer flow from saliva is inadequate, added dietary buffers may be justified.

Journal ArticleDOI
TL;DR: The DELIGHT.SPICE tool, a union of the DELIGHT interactive optimization-based computer-aided-design system and the SPICE circuit analysis program, is presented; redesigns using the tool have yielded substantial improvement in circuit performance.
Abstract: DELIGHT.SPICE is the union of the DELIGHT interactive optimization-based computer-aided-design system and the SPICE circuit analysis program. With the DELIGHT.SPICE tool, circuit designers can take advantage of recent powerful optimization algorithms and a methodology that emphasizes designer intuition and man-machine interaction. Designer and computer are complementary in adjusting parameters of electronic circuits automatically to improve their performance criteria and to study complex tradeoffs between multiple competing objectives, while simultaneously satisfying multiple-constraint specifications. The optimization runs much more efficiently than previously because the SPICE program used has been enhanced to perform DC, AC, and transient sensitivity computation. Industrial analog and digital circuits have been redesigned using this tool, yielding substantial improvement in circuit performance.

Journal ArticleDOI
TL;DR: The capacity of the AVC is determined with constraints on the transmitted codewords as well as on the channel state sequences, and it is demonstrated that it may be positive but less than the corresponding random code capacity.
Abstract: A well-known result of R. Ahlswede (1970) asserts that the deterministic code capacity of an arbitrarily varying channel (AVC), under the average-error-probability criterion, either equals its random code capacity or else is zero. A necessary and sufficient condition is identified for deciding between these alternatives, namely, the capacity is zero if and only if the AVC is symmetrizable. The capacity of the AVC is determined with constraints on the transmitted codewords as well as on the channel state sequences, and it is demonstrated that it may be positive but less than the corresponding random code capacity. A special case of the results resolves a weakened version of a fundamental problem of coding theory.