
Showing papers by "University of Maryland, College Park" published in 1990



Journal ArticleDOI
TL;DR: Rates of biodegradation depend greatly on the composition, state, and concentration of the oil or hydrocarbons, with dispersion and emulsification enhancing rates in aquatic systems and absorption by soil particulates being the key feature of terrestrial ecosystems.

2,450 citations


Journal ArticleDOI
TL;DR: In this article, the effect of group invariance on the stability of solitary waves was studied and applications were given to bound states and traveling wave solutions of nonlinear wave equations, where the authors considered an abstract Hamiltonian system which is invariant under a group of operators.

1,557 citations


Journal ArticleDOI
TL;DR: In this paper, a number of families of nonmonotonic consequence relations, defined in the style of Gentzen [13], are studied from both proof-theoretic and semantic points of view.

1,452 citations


Journal ArticleDOI
TL;DR: Skip lists as mentioned in this paper are data structures that use probabilistic balancing rather than strictly enforced balancing, and the algorithms for insertion and deletion in skip lists are much simpler and significantly faster than equivalent algorithms for balanced trees.
Abstract: Skip lists are data structures that use probabilistic balancing rather than strictly enforced balancing. As a result, the algorithms for insertion and deletion in skip lists are much simpler and significantly faster than equivalent algorithms for balanced trees.

1,113 citations
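The probabilistic balancing the abstract describes can be illustrated with a minimal sketch: each inserted node is promoted to higher levels with probability 1/2, up to a fixed cap. The class names and parameters below are illustrative, not taken from the paper.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # next node at each level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # promotion probability per level

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        update = [None] * (self.MAX_LEVEL + 1)
        x = self.head
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x  # last node before the insertion point at level i
        lvl = self._random_level()
        if lvl > self.level:
            for i in range(self.level + 1, lvl + 1):
                update[i] = self.head
            self.level = lvl
        node = Node(key, lvl)
        for i in range(lvl + 1):
            node.forward[i] = update[i].forward[i]
            update[i].forward[i] = node

    def search(self, key):
        x = self.head
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
        x = x.forward[0]
        return x is not None and x.key == key
```

Because the level structure is random rather than rebalanced, insertion needs no rotations, which is the source of the simplicity claimed in the abstract.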


Journal ArticleDOI
TL;DR: This chapter shows how the theory of stochastic dynamical systems can be used to characterize the equilibria that are most likely to be selected when the evolutionary process is subject to small persistent perturbations, said to be stochastically stable.

1,004 citations


Posted Content
TL;DR: The authors analyze the origins of this tax haven activity and its implications for the US and foreign governments, showing that American companies report extraordinarily high profit rates on both their real and their financial investments in tax havens.
Abstract: The offshore tax haven affiliates of American corporations account for more than a quarter of US foreign investment, and nearly a third of the foreign profits of US firms. This paper analyzes the origins of this tax haven activity and its implications for the US and foreign governments. Based on the behavior of US firms in 1982, it appears that American companies report extraordinarily high profit rates on both their real and their financial investments in tax havens. We calculate from this behavior that the tax rate that maximizes tax revenue for a typical haven is around 6%. The revenue implications for the US are more complicated, since tax havens may ultimately enhance the ability of the US government to tax the foreign earnings of American companies.

962 citations


Journal ArticleDOI
16 Mar 1990-Science
TL;DR: A coupled numerical model of the global atmosphere and biosphere has been used to assess the effects of Amazon deforestation on the regional and global climate; there was a significant increase in surface temperature and a decrease in evapotranspiration and precipitation over Amazonia.
Abstract: A coupled numerical model of the global atmosphere and biosphere has been used to assess the effects of Amazon deforestation on the regional and global climate. When the tropical forests in the model were replaced by degraded grass (pasture), there was a significant increase in surface temperature and a decrease in evapotranspiration and precipitation over Amazonia. In the simulation, the length of the dry season also increased; such an increase could make reestablishment of the tropical forests after massive deforestation particularly difficult.

924 citations


Proceedings ArticleDOI
05 Dec 1990
TL;DR: The stability of a queuing network with interdependent servers is considered, and the problem of scheduling server activation under the constraints imposed by the dependency among them is studied.
Abstract: The stability of a queuing network with interdependent servers is considered. The dependency among the servers is described by defining subsets of servers that can be activated simultaneously. Packet radio networks provide the motivation for the consideration of these systems. The problem of scheduling server activation under the constraints imposed by the dependency among them is studied. The stability region of the network under a specific scheduling policy is the set of vectors of arrival rates in the queues of the system for which the stochastic process of the queue lengths is ergodic. The 'supremum' (i.e., union) of the stability regions over all the policies is characterized. Then a scheduling policy is specified, the stability region of which is equal to the union of the stability regions over all the policies. Finally, the behavior of the network for arrival rates outside the stability region is studied.

910 citations
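The flavor of the policy described in the abstract can be illustrated with a toy simulation: in each slot, activate the feasible server set with the largest total backlog. The max-backlog rule and the Bernoulli arrival model below are simplifying assumptions for the sketch, not the paper's exact construction.

```python
import random

def max_weight_schedule(queues, activation_sets):
    """Pick the feasible activation set with the largest total backlog."""
    return max(activation_sets, key=lambda s: sum(queues[i] for i in s))

def simulate(arrival_rates, activation_sets, slots=5000, seed=0):
    """Toy queueing network: Bernoulli arrivals, one departure per slot
    from each queue whose server is in the chosen activation set."""
    rng = random.Random(seed)
    n = len(arrival_rates)
    queues = [0] * n
    for _ in range(slots):
        for i in range(n):
            if rng.random() < arrival_rates[i]:
                queues[i] += 1
        chosen = max_weight_schedule(queues, activation_sets)
        for i in chosen:
            if queues[i] > 0:
                queues[i] -= 1
    return queues
```

For two mutually conflicting servers (only one can be active per slot) with total load below one, the backlogs remain small, consistent with the arrival-rate vector lying inside the stability region.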


Journal ArticleDOI
TL;DR: The high performance cycle model as discussed by the authors combines aspects of the following theories: goal setting, expectancy, social-cognitive, attribution, job characteristics, equity, and turnover-commitment.
Abstract: After decades of research it is now possible to offer a coherent, data-based theory of work motivation and job satisfaction. The present model combines aspects of the following theories: goal setting, expectancy, social-cognitive, attribution, job characteristics, equity, and turnover-commitment. The resulting model is called the high performance cycle. It begins with organizational members being faced with high challenge or difficult goals. If high challenge is accompanied by high expectancy of success or self-efficacy, high performance results, given that there is: commitment to the goals, feedback, adequate ability, and low situational constraints. High performance is achieved through four mechanisms: direction of attention and action, effort, persistence, and the development of task strategies and plans. High performance, if rewarding, leads to job satisfaction, which in turn facilitates commitment to the organization and its goals. The model has implications for leadership, self-management, and education.

885 citations


01 Dec 1990
TL;DR: In this article, a new eddy viscosity model is presented which alleviates many of the drawbacks of the existing subgrid-scale stress models, such as the inability to represent correctly with a single universal constant different turbulent fields in rotating or sheared flows, near solid walls, or in transitional regimes.
Abstract: One major drawback of the eddy viscosity subgrid-scale stress models used in large-eddy simulations is their inability to represent correctly with a single universal constant different turbulent fields in rotating or sheared flows, near solid walls, or in transitional regimes. In the present work, a new eddy viscosity model is presented which alleviates many of these drawbacks. The model coefficient is computed dynamically as the calculation progresses rather than input a priori. The model is based on an algebraic identity (Germano 1990) between the subgrid-scale stresses at two different filtered levels and the resolved turbulent stresses. The subgrid-scale stresses obtained using the proposed model vanish in laminar flow and at a solid boundary, and have the correct asymptotic behavior in the near-wall region of a turbulent boundary layer. The results of large-eddy simulations of transitional and turbulent channel flow that use the proposed model are in good agreement with the direct simulation data.
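The dynamic procedure the abstract describes can be summarized with the Germano identity. The following is a schematic of the standard formulation (overbars denote the grid filter, hats the test filter); the exact contraction and averaging used to evaluate the coefficient vary between the original formulation and later least-squares variants.

```latex
% Smagorinsky closure for the trace-free subgrid stress at the grid filter:
\tau_{ij} - \tfrac{\delta_{ij}}{3}\,\tau_{kk}
  = -2\,C\,\bar{\Delta}^{2}\,|\bar{S}|\,\bar{S}_{ij}
% Germano identity: the resolved stress between the two filter levels
% equals the difference of the modeled stresses,
\mathcal{L}_{ij} \;=\; T_{ij} - \widehat{\tau}_{ij}
  \;=\; \widehat{\bar{u}_i \bar{u}_j} - \widehat{\bar{u}}_i\,\widehat{\bar{u}}_j .
% Substituting the closure at both filter levels gives
% \mathcal{L}_{ij}^{a} \approx C\,M_{ij}, with
M_{ij} = 2\,\bar{\Delta}^{2}\,\widehat{|\bar{S}|\,\bar{S}_{ij}}
       - 2\,\widehat{\Delta}^{2}\,|\widehat{\bar{S}}|\,\widehat{\bar{S}}_{ij},
% and the coefficient follows by contracting with \bar{S}_{ij} and
% averaging over homogeneous directions:
C = \frac{\langle \mathcal{L}_{ij}\,\bar{S}_{ij}\rangle}
         {\langle M_{ij}\,\bar{S}_{ij}\rangle}.
```

Because C is evaluated from the resolved field at every step, it can vanish in laminar regions and near walls, which is the asymptotic behavior claimed in the abstract.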


Journal ArticleDOI
TL;DR: The backpropagation algorithm is applied to model the dynamic response of pH in a CSTR and is shown to be able to pick up more of the nonlinear characteristics of the CSTR.

01 Jan 1990
TL;DR: Condorcet solved this problem using a form of maximum-likelihood estimation; the procedure he derived can also be justified from a modern axiomatic perspective, as mentioned in this paper.
Abstract: Condorcet thought that the purpose of a vote is to make the choice that is best for society. On this view, there is one choice that is objectively best, another that comes second, and so on. In devising a voting rule, the objective should be to choose the alternative that is most probably the best. Condorcet solved this problem using a form of maximum-likelihood estimation. The procedure he derived can also be justified from a modern axiomatic perspective: it is the only social choice function that satisfies a variant of the independence axiom together with several other classical properties.
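Condorcet's maximum-likelihood procedure coincides with what is now called the Kemeny rule: choose the ranking that minimizes total pairwise disagreement with the ballots. A brute-force sketch (exponential in the number of candidates; the function names are illustrative):

```python
from itertools import permutations

def kemeny_ranking(ballots):
    """Return the ranking minimizing total pairwise disagreements with
    the ballots (Condorcet's maximum-likelihood rule / Kemeny rule)."""
    candidates = ballots[0]

    def disagreements(ranking):
        pos = {c: i for i, c in enumerate(ranking)}
        total = 0
        for b in ballots:
            bpos = {c: i for i, c in enumerate(b)}
            for i, a in enumerate(candidates):
                for c in candidates[i + 1:]:
                    # count each pair the ballot orders differently
                    if (pos[a] < pos[c]) != (bpos[a] < bpos[c]):
                        total += 1
        return total

    return min(permutations(candidates), key=disagreements)
```

With ballots (A,B,C), (A,B,C), (B,C,A), the pairwise majorities are A over B, B over C, and A over C, so the rule returns the ranking (A, B, C).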

Journal ArticleDOI
TL;DR: In this article, empirical and theoretical investigations of rural credit markets, in the framework of the imperfect information paradigm, are presented. The authors show how this framework is useful not only in explaining puzzling features of these markets, but also in providing a policy perspective to assess the successes and failures of specific interventions.
Abstract: Rural credit markets have been at the center of policy intervention in developing countries over the past forty years. Many governments, supported by multilateral and bilateral aid agencies, have devoted considerable resources to supplying cheap credit to farmers in a myriad of institutional settings. The results of many of these interventions have been disappointing, and one explanation for this must be that they were based on an inadequate understanding of the workings of rural credit markets. The articles in this symposium issue are devoted to empirical and theoretical investigations of rural credit markets, in the framework of the imperfect information paradigm. The authors show how this framework is useful not only in explaining puzzling features of these markets, but also in providing a policy perspective to assess the successes and failures of specific interventions.

Journal ArticleDOI
TL;DR: In this article, the geometric structures of configurations with a pair of type A and B nulls permit reconnection across the null-null lines; these are the field lines which join the two nulls.
Abstract: The present investigation of three-dimensional reconnection of magnetic fields with nulls and of fields with closed lines gives attention to the geometry of the former, with a view to their gamma-line and Sigma-surface structures. The geometric structures of configurations with a pair of type A and B nulls permit reconnection across the null-null lines; these are the field lines which join the two nulls. Also noted is the case of magnetostatic reconnection, in which the magnetic field is time-independent and the electrostatic potential is constant along field lines.

Journal ArticleDOI
TL;DR: In this paper, a traveling wave is characterized by its time-invariant profile and its translation at constant speed in a single spatial dimension; of fundamental importance is its stability relative to perturbations in the initial conditions for solutions of the full partial differential equations.
Abstract: Travelling waves are special solutions of partial differential equations in one space variable. They are characterized by their time-invariant profile; indeed, as solutions they evolve by translating at constant speed in the one spatial dimension. These solutions are often the centerpiece of a physical system, as they represent the transport of information in a single direction. It is of fundamental importance for a given travelling wave to determine its stability relative to perturbations in the initial conditions for solutions of the full partial differential equations. Stable solutions are the most physically realistic, since the external world provides enough perturbations that we can only expect to see waves which will dampen out these perturbations.

Journal ArticleDOI
TL;DR: It is concluded that the channel-optimized vector quantizer design algorithm, if used carefully, can result in a fairly robust system with no additional delay.
Abstract: Several issues related to vector quantization for noisy channels are discussed. An algorithm based on simulated annealing is developed for assigning binary codewords to the vector quantizer code-vectors. It is shown that this algorithm could result in dramatic performance improvements as compared to randomly selected codewords. A modification of the simulated annealing algorithm for binary codeword assignment is developed for the case where the bits in the codeword are subjected to unequal error probabilities (resulting from unequal levels of error protection). An algorithm for the design of an optimal vector quantizer for a noisy channel is briefly discussed, and its robustness under channel mismatch conditions is studied. Numerical results for a stationary first-order Gauss-Markov source and a binary symmetric channel are provided. It is concluded that the channel-optimized vector quantizer design algorithm, if used carefully, can result in a fairly robust system with no additional delay. The case in which the communication channel is nonstationary (as in mobile radio channels) is studied, and some preliminary ideas for quantizer design are presented.
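A minimal sketch of simulated-annealing index assignment of the kind the abstract describes. The cost model (only single-bit errors, with per-bit crossover probability p) and the cooling schedule are illustrative simplifications, not the paper's exact formulation.

```python
import math
import random

def hamming(a, b):
    return bin(a ^ b).count("1")

def channel_cost(perm, dist, p):
    """Expected channel distortion when codevector i, sent with binary
    index perm[i], is decoded as j after a single-bit error."""
    n = len(perm)
    cost = 0.0
    for i in range(n):
        for j in range(n):
            if i != j and hamming(perm[i], perm[j]) == 1:
                cost += p * dist[i][j]
    return cost

def anneal_assignment(dist, p=0.01, steps=2000, seed=1):
    """Simulated annealing over index permutations, swapping two
    assignments per step and keeping the best permutation seen."""
    rng = random.Random(seed)
    n = len(dist)
    perm = list(range(n))
    cost = channel_cost(perm, dist, p)
    best, best_cost = perm[:], cost
    T = 1.0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        perm[i], perm[j] = perm[j], perm[i]
        new = channel_cost(perm, dist, p)
        if new <= cost or rng.random() < math.exp((cost - new) / T):
            cost = new
            if cost < best_cost:
                best, best_cost = perm[:], cost
        else:
            perm[i], perm[j] = perm[j], perm[i]  # revert the swap
        T *= 0.999  # geometric cooling
    return best, best_cost
```

Because the best assignment seen is tracked, the returned cost is never worse than that of the starting (natural binary) assignment.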

Journal ArticleDOI
TL;DR: Qualitative comparisons between the predictions of the model and previously reported experimental findings indicate that the model reproduces the major features of a maximum-height squat jump, including limb-segmental angular displacements, vertical and horizontal ground reaction forces, sequence of muscular activity, overall jump height, and final lift-off time.

Journal ArticleDOI
TL;DR: In this article, a comprehensive study of the problem of scheduling broadcast transmissions in a multihop, mobile packet radio network is provided that is based on throughput optimization subject to freedom from interference.
Abstract: A comprehensive study of the problem of scheduling broadcast transmissions in a multihop, mobile packet radio network is provided that is based on throughput optimization subject to freedom from interference. It is shown that the problem is NP complete. A centralized algorithm that runs in polynomial time and results in efficient (maximal) schedules is proposed. A distributed algorithm that achieves the same schedules is then proposed. The algorithm results in a maximal broadcasting zone in every slot.
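A greedy construction conveys the flavor of a maximal (not maximum, since the optimization problem is NP-complete) conflict-free transmission set. The pairwise conflict representation below is an assumption for the sketch, not the paper's network model.

```python
def greedy_maximal_schedule(nodes, conflicts):
    """Greedily build a maximal set of mutually non-conflicting
    transmitters: no remaining node can be added without interference."""
    scheduled = set()
    for v in nodes:
        if all((v, u) not in conflicts and (u, v) not in conflicts
               for u in scheduled):
            scheduled.add(v)
    return scheduled
```

For nodes 1-4 with conflicts (1,2) and (3,4), the greedy pass schedules {1, 3}; neither 2 nor 4 can be added, so the schedule is maximal.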

Journal ArticleDOI
29 Nov 1990-Nature
TL;DR: Results show that deleterious mutations are generated at a sufficiently high rate to advance Muller's ratchet in an RNA virus and that beneficial, backward and compensatory mutations cannot stop theRatchet in the observed range of fitness decrease.
Abstract: WHY sex exists remains an unsolved problem in biology1–3. If mutations are on the average deleterious, a high mutation rate can account for the evolution of sex4. One form of this mutational hypothesis is Muller's ratchet5,6. If the mutation rate is high, mutation-free individuals become rare and they can be lost by genetic drift in small populations. In asexual populations, as Muller5 noted, the loss is irreversible and the load of deleterious mutations increases in a ratchet-like manner with the successive loss of the least-mutated individuals. Sex can be advantageous because it increases the fitness of sexual populations by re-creating mutation-free individuals from mutated individuals and stops (or slows) Muller's ratchet. Although Muller's ratchet is an appealing hypothesis, it has been investigated and documented experimentally in only one group of organisms—ciliated protozoa2. I initiated a study to examine the role of Muller's ratchet on the evolution of sex in RNA viruses and report here a significant decrease in fitness due to Muller's ratchet in 20 lineages of the RNA bacteriophage Φ6. These results show that deleterious mutations are generated at a sufficiently high rate to advance Muller's ratchet in an RNA virus and that beneficial, backward and compensatory mutations cannot stop the ratchet in the observed range of fitness decrease.
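The ratchet mechanism can be seen in a toy Wright-Fisher simulation; the parameters and the per-generation Bernoulli mutation step below are illustrative, and the paper's result is experimental (20 lineages of phage Φ6), not simulated.

```python
import random

def ratchet_sim(pop_size=50, mu=0.5, s=0.05, generations=100, seed=42):
    """Toy asexual Wright-Fisher model of Muller's ratchet: each genome
    carries a count of deleterious mutations, fitness is (1 - s)**count,
    and each offspring gains a new mutation with probability mu."""
    rng = random.Random(seed)
    pop = [0] * pop_size  # mutation counts per individual
    min_history = []
    for _ in range(generations):
        weights = [(1 - s) ** k for k in pop]
        pop = rng.choices(pop, weights=weights, k=pop_size)  # selection + drift
        pop = [k + (1 if rng.random() < mu else 0) for k in pop]  # mutation
        min_history.append(min(pop))
    return min_history
```

With no back mutation and no recombination, the least-loaded class can be lost but never restored, so the minimum mutation count is non-decreasing: each loss is one irreversible click of the ratchet.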

Journal ArticleDOI
TL;DR: It is argued for a new approach to solving data management system problems, called multidatabase or federated systems, which make databases interoperable, that is, usable without a globally integrated schema.
Abstract: Database systems were a solution to the problem of shared access to heterogeneous files created by multiple autonomous applications in a centralized environment. To make data usage easier, the files were replaced by a globally integrated database. To a large extent, the idea was successful, and many databases are now accessible through local and long-haul networks. Unavoidably, users now need shared access to multiple autonomous databases. The question is what the corresponding methodology should be. Should one reapply the database approach to create globally integrated distributed database systems or should a new approach be introduced? We argue for a new approach to solving such data management system problems, called multidatabase or federated systems. These systems make databases interoperable, that is, usable without a globally integrated schema. They preserve the autonomy of each database yet support shared access. Systems of this type will be of major importance in the future. This paper first discusses why this is the case. Then, it presents methodologies for their design. It further shows that major commercial relational database systems are evolving toward multidatabase systems. The paper discusses their capabilities and limitations, presents and discusses a set of prototypes, and, finally, presents some current research issues.

Book
26 Jun 1990
TL;DR: This book presents computational models for diagnostic problem solving based on parsimonious covering theory and its probabilistic causal extension, including a parallel processing approach to diagnostic problem solving.
Abstract: Contents: Abduction and Diagnostic Inference.- Computational Models for Diagnostic Problem Solving.- Basics of Parsimonious Covering Theory.- Probabilistic Causal Model.- Diagnostic Strategies in the Probabilistic Causal Model.- Causal Chaining.- Parallel Processing for Diagnostic Problem-Solving.- Conclusion.- Bibliography.- Index.

Posted Content
TL;DR: The authors presented a model in which economic crises have positive effects on welfare and showed that periods of very high inflation create the incentive for the resolution of social conflict and thus facilitate the introduction of economic reforms and the achievement of higher levels of welfare.
Abstract: This paper presents a model in which economic crises have positive effects on welfare. Periods of very high inflation create the incentive for the resolution of social conflict and thus facilitate the introduction of economic reforms and the achievement of higher levels of welfare. Policies to reduce the cost of inflation, such as indexation, raise inflation and delay the adoption of reforms, but have no effect on expected social welfare.

Journal ArticleDOI
TL;DR: In this article, a 2-yr period was characterized by a midsummer maximum in NH4+ efflux to the overlying water and a May peak in NO3- removal from water by sediments.
Abstract: Contemporaneous measurements are reported for nitrification, denitrification, and net sediment-water fluxes of NH4+ and NO3- in the mesohaline region of Chesapeake Bay. Seasonal cycles over a 2-yr period were characterized by a midsummer maximum in NH4+ efflux to the overlying water and a May peak in NO3- removal from water by sediments. Coherent temporal patterns for nitrification and denitrification were observed, with relatively high values in spring and fall and virtual elimination of both processes in summer. Indirect measurements indicate that nitrification was limited by the shallow O2 penetration (<1 mm) here compared to reports for other marine sediments (2-6 mm). In addition, a strong positive correlation between the two processes suggested that denitrification was generally controlled by nitrification. Comparisons of NO3- fluxes and net nitrification rates (nitrification minus NO3- reduction to NH4+) revealed that measurements of denitrification with the acetylene block method systematically underestimated actual rates. Rates of N2 loss in denitrification were similar to NH4+ recycling fluxes to the overlying water in spring and fall, but in summer negligible denitrification contributed to enhanced NH4+ recycling. These results suggest that inhibition of denitrification in eutrophic estuaries such as Chesapeake Bay may reinforce the effects of nutrient enrichment by allowing increased rates of NH4+ recycling.

Journal ArticleDOI
TL;DR: The purpose of this paper is to provide a more precise definition of information overload than previously found in the literature, essential to designing usable information systems.
Abstract: The purpose of this paper is to provide a more precise definition of information overload than previously found in the literature. Such a definition is essential to designing usable information systems. Drawing upon work in organization theory, information overload is defined as occurring when the information processing demands on an individual's time to perform interactions and internal calculations exceed the supply or capacity of time available for such processing. Cited empirical evidence demonstrates the reasonableness of the proposed definition.

Journal ArticleDOI
TL;DR: This paper consolidates the major results of these papers emphasizing the techniques and their applicability for optimizing relational queries and indicates how semantic query optimization techniques can be extended to databases that support recursion and integrity constraints that contain disjunction, negation, and recursion.
Abstract: The purpose of semantic query optimization is to use semantic knowledge (e.g., integrity constraints) for transforming a query into a form that may be answered more efficiently than the original version. In several previous papers we described and proved the correctness of a method for semantic query optimization in deductive databases couched in first-order logic. This paper consolidates the major results of these papers emphasizing the techniques and their applicability for optimizing relational queries. Additionally, we show how this method subsumes and generalizes earlier work on semantic query optimization. We also indicate how semantic query optimization techniques can be extended to databases that support recursion and integrity constraints that contain disjunction, negation, and recursion.
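As a toy illustration of the kind of transformation the abstract describes, an integrity constraint can render a query restriction redundant, letting the optimizer drop it. The predicate representation and the single implication rule below are assumptions for the sketch, not the paper's logic-based framework.

```python
# Predicates are (attribute, operator, value) triples; only the
# '>' comparison is handled in this toy implication check.
def implies(constraint, pred):
    """Does a constraint like (salary > 50000) guarantee a query
    predicate like (salary > 40000)?"""
    (a1, op1, v1), (a2, op2, v2) = constraint, pred
    return a1 == a2 and op1 == op2 == ">" and v1 >= v2

def optimize(query_preds, constraints):
    """Semantic transformation: drop any query predicate that is
    already guaranteed by an integrity constraint."""
    return [p for p in query_preds
            if not any(implies(c, p) for c in constraints)]
```

Given the constraint that every salary exceeds 50000, a query asking for salary > 40000 in the sales department reduces to the department test alone, so the transformed query can be answered without evaluating the salary restriction.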



Journal ArticleDOI
TL;DR: Using surface marine wind and sea surface temperature data from the period 1950-1987, together with sea surface temperatures and sea level pressure data from several stations in the Pacific, this paper identified two dominant time scales of El Niño-Southern Oscillation (ENSO) variability.