
Showing papers by "Mitre Corporation published in 1991"


Journal ArticleDOI
TL;DR: The results indicate that the colored noise Kalman filters provide a significant gain in signal-to-noise ratio (SNR), a visible improvement in the sound spectrogram, and an audible improvement in output speech quality, none of which are available with white-noise-assumption Kalman and Wiener filters.
Abstract: Scalar and vector Kalman filters are implemented for filtering speech contaminated by additive white noise or colored noise, and an iterative signal and parameter estimator which can be used for both noise types is presented. Particular emphasis is placed on the removal of colored noise, such as helicopter noise, by using state-of-the-art colored-noise-assumption Kalman filters. The results indicate that the colored noise Kalman filters provide a significant gain in signal-to-noise ratio (SNR), a visible improvement in the sound spectrogram, and an audible improvement in output speech quality, none of which are available with white-noise-assumption Kalman and Wiener filters. When the filter is used as a prefilter for linear predictive coding, the coded output speech quality and intelligibility are enhanced in comparison to direct coding of the noisy speech.
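The scalar white-noise case lends itself to a compact illustration. The sketch below is a generic textbook Kalman recursion run on a synthetic AR(1) signal, not the authors' implementation; all parameter values are invented for the demo.

```python
import random

def scalar_kalman(measurements, a, q, r, x0=0.0, p0=1.0):
    """1-D Kalman filter for x_k = a*x_{k-1} + w_k (var q), observed as
    z_k = x_k + v_k (var r). Returns the filtered state estimates."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        x, p = a * x, a * a * p + q            # time update (predict)
        k = p / (p + r)                        # Kalman gain
        x, p = x + k * (z - x), (1.0 - k) * p  # measurement update
        estimates.append(x)
    return estimates

# Synthetic demo: an AR(1) "speech" signal in additive white noise.
random.seed(0)
a, q, r = 0.95, 0.1, 1.0
signal, x = [], 0.0
for _ in range(2000):
    x = a * x + random.gauss(0.0, q ** 0.5)
    signal.append(x)
noisy = [s + random.gauss(0.0, r ** 0.5) for s in signal]
filtered = scalar_kalman(noisy, a, q, r)

def mse(est):
    return sum((e - s) ** 2 for e, s in zip(est, signal)) / len(signal)
```

With the model matched to the data, the filtered mean squared error falls well below the raw measurement noise variance; the colored-noise filters studied in the paper augment the state with a shaping model of the noise rather than assuming it white.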

302 citations


Book ChapterDOI
08 Apr 1991
TL;DR: In this article, a general method for a secret broadcasting scheme based on k-out-of-n secret sharing is proposed, in which a single transmitter wishes to broadcast a secret to some subset of its listeners.
Abstract: A single transmitter wishes to broadcast a secret to some subset of his listeners. He does not wish to perform, for each of the intended recipients, a separate encryption either of the secret or of a single key with which to protect the secret. A general method for such a secret broadcasting scheme is proposed. It is based on "k out of n" secret sharing. An example using polynomial interpolation is presented, as well as a related vector formulation.
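The underlying "k out of n" mechanism can be sketched with Shamir-style polynomial interpolation over a prime field. This is a minimal illustration of the primitive only; the modulus and parameters are arbitrary choices, and the paper's broadcasting protocol built on top of it is not reproduced.

```python
import random

P = 2 ** 127 - 1  # a Mersenne prime; the field choice here is illustrative

def make_shares(secret, k, n, rng=random):
    """Split `secret` into n shares, any k of which recover it: evaluate a
    random degree-(k-1) polynomial with constant term `secret` at x = 1..n."""
    coeffs = [secret % P] + [rng.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, e, P) for e, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any k of the n shares reconstruct the polynomial's constant term; fewer than k reveal nothing about it, which is what lets one broadcast to a chosen subset without per-recipient encryption.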

295 citations


Journal ArticleDOI
TL;DR: An automated tool called the Requirements Apprentice (RA) which assists a human analyst in the creation and modification of software requirements is presented, which develops a coherent internal representation of a requirement from an initial set of disorganized imprecise statements.
Abstract: An automated tool called the Requirements Apprentice (RA) which assists a human analyst in the creation and modification of software requirements is presented. Unlike most other requirements analysis tools, which start from a formal description language, the focus of the RA is on the transition between informal and formal specifications. The RA supports the earliest phases of creating a requirement, in which ambiguity, contradiction, and incompleteness are inevitable. From an artificial intelligence perspective, the central problem the RA faces is one of knowledge acquisition. The RA develops a coherent internal representation of a requirement from an initial set of disorganized imprecise statements. To do so, the RA relies on a variety of techniques, including dependency-directed reasoning, hybrid knowledge representations and the reuse of common forms (cliches). An annotated transcript showing an interaction with a working version of the RA is given.

280 citations


Patent
27 Sep 1991
TL;DR: In this paper, the authors present an integrated architecture for an extended multilevel secure database management system, which processes security constraints to control certain unauthorized inferences through logical deduction upon queries by users.
Abstract: Apparatus for an integrated architecture for an extended multilevel secure database management system. The multilevel secure database management system processes security constraints to control certain unauthorized inferences through logical deduction upon queries by users and is implemented when the database is queried through the database management system, when the database is updated through the database management system, and when the database is designed using a database design tool.

257 citations


Patent
16 Dec 1991
TL;DR: In this article, a multilevel secure database management system based on a multilevel logic programming system is presented. The system makes deductions, gives complete answers to queries, and prevents certain unauthorized inferences.
Abstract: Apparatus for designing a multilevel secure database management system based on a multilevel logic programming system. The apparatus includes a multilevel knowledge base which has a multilevel database in which data are classified at different security levels. The multilevel knowledge base also includes schema, which describe the data in the database, and rules, which are used to deduce new data. Also included are integrity constraints, which are constraints enforced on the data, and security constraints, which are rules that assign security levels to the data. The system further includes users cleared to the different security levels for querying the multilevel database, and a multilevel logic programming system is provided for accessing the multilevel knowledge base for processing queries and for processing the integrity and security constraints. The multilevel database management system makes deductions and gives complete answers to queries and prevents certain unauthorized inferences.

137 citations


Journal ArticleDOI
TL;DR: A network design algorithm is described that uses a set of deterministic connectivity measures which result in topologically survivable network designs that also meet processing and performance requirements.
Abstract: The authors describe a network design algorithm that uses a set of deterministic connectivity measures which result in topologically survivable network designs that also meet processing and performance requirements. The authors briefly describe some applicable graph theoretic concepts and recently developed connectivity measures. They describe systematic procedures for improving the topological survivability of a network, and the overall network design process. A design example is presented.

64 citations


01 Sep 1991
TL;DR: Initial results are reported from a multi-year, interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces to achieve more effective human-computer interaction for systems with real time fault management capabilities.
Abstract: Initial results are reported from a multi-year, interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. The objective is to achieve more effective human-computer interaction (HCI) for systems with real time fault management capabilities. Intelligent fault management systems within NASA were evaluated for insight into the design of systems with complex HCI. Preliminary results include: (1) a description of real time fault management in aerospace domains; (2) recommendations and examples for improving intelligent systems design and user interface design; (3) identification of issues requiring further research; and (4) recommendations for a development methodology integrating HCI design into intelligent system design.

48 citations


Proceedings Article
24 Aug 1991
TL;DR: The architecture draws from the ideas of universal plans and subsumption's layered control, producing reaction plans that exploit low-level competences as operators; it exhibits robust task execution, has high-level goal representations, and maintains consistent semantics between agent states and the environment.
Abstract: This paper describes an agent architecture and its implementation for situated robot control in field environments. The architecture draws from the ideas of universal plans and subsumption's layered control, producing reaction plans that exploit low-level competences as operators. The architecture has been implemented in an extended version of the GAPPS/Rex situated automata programming language. This language produces synchronous virtual circuits which have been shown to have formal epistemic properties. The resulting architecture exhibits robust task execution, has high-level goal representations, and maintains consistent semantics between agent states and the environment. Ongoing experiments using the architecture with two land mobile robots and one undersea mobile robot are described. The robots perform their tasks robustly during normal changes in the task environments.

48 citations


Patent
14 Feb 1991
TL;DR: A programmable decoder that provides both error and erasure decoding for all Reed-Solomon, primitive BCH, non-primitive BCH and binary BCH codes of any rate over any field is described in this paper.
Abstract: A programmable decoder that provides both error and erasure decoding for all Reed-Solomon, primitive BCH, non-primitive BCH, and binary BCH codes of any rate over any field is disclosed. The user can specify decoding parameters including the code block-length, the code-generator polynomial, and the field-generator polynomial. The basic architecture, less the small overhead for programmability, is also recommended for fixed-code applications. The decoding processor of the decoder includes systolic arrays implementing a syndrome calculator, a key equation solver, a Chien search, a recursive extender, and an inverse transform. The number of cells required for each of the five functions is on the order of the error correction capability t. The systolic arrays can be fabricated on a single VLSI microchip that is itself systolic. Each of the individual systolic arrays can be extended by arraying microchips together, so that any desired error correction capability can be attained by using multiple systolic microchips with a single controller.
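As a flavor of the first decoding stage, syndrome computation over GF(2^8) can be written in a few lines. This is a generic serial formulation with an assumed field polynomial, not the patent's systolic-array design (which, as the abstract notes, makes the field and generator polynomials user-programmable).

```python
# GF(2^8) tables for the primitive polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11d),
# a common choice; the actual field polynomial is a decoder parameter.
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def syndromes(received, t):
    """S_i = r(alpha^i) for i = 1..2t. All syndromes are zero when `received`
    is a codeword (up to undetectable error patterns); otherwise they feed
    the key equation solver."""
    out = []
    for i in range(1, 2 * t + 1):
        s = 0
        for j, c in enumerate(received):
            s ^= gf_mul(c, EXP[(i * j) % 255])  # c * alpha^(i*j)
        out.append(s)
    return out
```

A single error of magnitude e at position j produces S_i = e * alpha^(i*j), which is exactly the structure the key equation solver and Chien search stages exploit.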

47 citations


Journal ArticleDOI
W.J. Hendricks1
TL;DR: The array factors and the statistical properties for two types of random arrays, namely, the totally random and the binned random arrays are compared in this paper, with particular emphasis on the depressed near-in sidelobe behavior that is observed for binned arrays.
Abstract: The array factors and the statistical properties for two types of random arrays, namely, the totally random and the binned random arrays, are compared. In totally random arrays, the array elements are distributed independently across a common aperture according to some common probability distribution function. In binned arrays, the aperture is divided into nonoverlapping bins of equal length and array elements are distributed independently, one per bin, according to some probability distribution across each of the bins. Significant differences exist in the resulting array factors and underlying statistical properties for the two types of random arrays. These differences are delineated, with particular emphasis upon the depressed near-in sidelobe behavior that is observed for the binned arrays. Tables and plots that illustrate the differences are included, and several issues pertaining to implementation are noted.
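The two placement schemes and their common array factor are easy to state concretely. The sketch below uses invented aperture and element counts; it only sets up the comparison, not the paper's statistical analysis.

```python
import cmath
import random

def array_factor(positions, u):
    """|AF(u)| = |sum_n exp(j*2*pi*x_n*u)| for element positions x_n in
    wavelengths and direction variable u = sin(theta)."""
    return abs(sum(cmath.exp(2j * cmath.pi * x * u) for x in positions))

def totally_random(n, aperture, rng):
    """All elements drawn independently and uniformly over the aperture."""
    return sorted(rng.uniform(0.0, aperture) for _ in range(n))

def binned_random(n, aperture, rng):
    """One element per equal-length bin: same uniform marginal density,
    but constrained spacing, which depresses the near-in sidelobes."""
    w = aperture / n
    return [i * w + rng.uniform(0.0, w) for i in range(n)]

rng = random.Random(1)
n, aperture = 32, 16.0
tr = totally_random(n, aperture, rng)
br = binned_random(n, aperture, rng)
```

Both placements give the full mainbeam peak AF(0) = N; the difference the paper delineates is in the sidelobe statistics away from u = 0, where the binned constraint suppresses the near-in lobes on average.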

46 citations


Book ChapterDOI
01 Jun 1991
TL;DR: A theoretical framework for adding assignments and dynamic data to functional languages without violating their semantic properties is proposed, and a new form of abstraction called observer is designed to encapsulate state-oriented computation from the remaining purely applicative computation.
Abstract: We propose a theoretical framework for adding assignments and dynamic data to functional languages without violating their semantic properties. This differs from semifunctional languages like Scheme and ML in that values of expressions remain static and side-effect-free. A new form of abstraction called observer is designed to encapsulate state-oriented computation from the remaining purely applicative computation. The type system ensures that observers are combined linearly, allowing an implementation in terms of a global store. The utility of this extension is in manipulating shared dynamic data embedded in data structures. Evaluation of well-typed programs is Church-Rosser. Thus, programs produce the same results whether an eager or lazy evaluation order is used (assuming termination). A simple, sound logic permits reasoning about well-typed programs. The benefits of this work include greater expressive power and efficiency (compared to applicative languages), while retaining simplicity of reasoning.

Journal ArticleDOI
TL;DR: A theorem characterizing fractional Brownian motion by the covariance structure of its wavelet transform is established, and whether there are alternate Gaussian processes whose wavelet transforms have a natural covariance structure is examined.
Abstract: A theorem characterizing fractional Brownian motion by the covariance structure of its wavelet transform is established. The authors examine whether there are alternate Gaussian processes whose wavelet transforms have a natural covariance structure. In addition, the authors examine if there are any Gaussian processes whose wavelet transform is stationary with respect to the affine group (i.e. the statistics of the wavelet transform do not depend on translations and dilations of the process).

Proceedings ArticleDOI
04 Nov 1991
TL;DR: In this article, a technique for producing signals whose energy is concentrated in a given region of the time-frequency plane is examined, where the degree to which a particular signal is concentrated is measured by integrating its timefrequency distribution over the given region.
Abstract: A technique for producing signals whose energy is concentrated in a given region of the time-frequency plane is examined. The degree to which a particular signal is concentrated is measured by integrating its time-frequency distribution over the given region. This technique, using the Wigner distribution, has recently been used as a framework for time-varying filtering. The associated subspace projection operator is studied. Estimates for the eigenvalue decay and the smoothness and decay of the eigenfunctions are presented.

Journal ArticleDOI
R.L. Fante1
TL;DR: In this article, an ideal two-element array that uses bandwidth partitioning in both the main and auxiliary channels, with an Mth-order adaptive finite impulse response filter in each subband of the auxiliary channel is studied.
Abstract: It has been demonstrated that specular or diffuse jammer multipath can be canceled to a desired level by using an adaptive array that combines bandwidth partitioning with tapped delay lines. Such hybrid systems are studied. In particular, the author studies an ideal two-element array that uses bandwidth partitioning in both the main and auxiliary channels, with an Mth-order adaptive finite impulse response filter in each subband of the auxiliary. The ability of this system to cancel specular, moderately diffuse, and diffuse multipath is studied. The combinations of bandwidth partitioning and filter order that can achieve a specified jammer cancellation level are discussed.
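A single-band analogue of the adaptive FIR element can be sketched with a standard LMS canceller. This is a simplified stand-in for the author's subband hybrid; the channel model, step size, and filter order below are invented for the demo.

```python
import math

def lms_cancel(main, aux, taps, mu):
    """LMS adaptive FIR canceller: adapt a `taps`-weight filter on the
    auxiliary (jammer reference) channel so its output matches the jammer
    component of the main channel; return the residual after subtraction."""
    w = [0.0] * taps
    buf = [0.0] * taps
    residual = []
    for d, x in zip(main, aux):
        buf = [x] + buf[:-1]                      # reference delay line
        y = sum(wi * bi for wi, bi in zip(w, buf))
        e = d - y                                 # cancellation residual
        w = [wi + 2.0 * mu * e * bi for wi, bi in zip(w, buf)]
        residual.append(e)
    return residual

# Demo: a narrowband jammer reaching the main channel via a 2-tap multipath.
n = 4000
jam = [math.sin(0.3 * k) for k in range(n)]
main = [0.8 * jam[k] + 0.3 * jam[k - 1] for k in range(n)]  # k=0 wraps; harmless
res = lms_cancel(main, jam, taps=4, mu=0.01)

def power(x):
    return sum(v * v for v in x) / len(x)
```

Once adapted, the residual power drops far below the jammer power; the paper's question is how to split this work between subbands (bandwidth partitioning) and taps (filter order M) when the multipath is diffuse rather than specular.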

Book ChapterDOI
01 Jan 1991
TL;DR: This paper describes some straightforward binary encodings for attribute-based instance spaces that give classifier systems the ability to represent ordinal and nominal attributes as expressively as most symbolic machine learning systems, without sacrificing the building blocks required by the genetic algorithm.
Abstract: Legitimate concerns have been raised about the expressive adequacy of the classifier language. This paper shows that many of those concerns stem from the inadequacies of the binary encodings typically used with classifier systems, not the classifier language per se. In particular, we describe some straightforward binary encodings for attribute-based instance spaces. These encodings give classifier systems the ability to represent ordinal and nominal attributes as expressively as most symbolic machine learning systems, without sacrificing the building blocks required by the genetic algorithm.
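One plausible shape for such encodings, sketched below, represents a nominal attribute with one bit per value and an ordinal attribute with a contiguous range of bits. This is an illustrative guess at the style of encoding, not necessarily the paper's exact scheme.

```python
def encode_nominal(values, allowed):
    """Nominal attribute: one bit per value; a set bit means that value
    satisfies the condition. All ones is the 'don't care' condition."""
    return [1 if v in allowed else 0 for v in values]

def encode_ordinal(n_levels, low, high):
    """Ordinal attribute: set the bits of the contiguous range [low, high],
    so a condition expresses a 'between' relation whose building blocks
    survive crossover."""
    return [1 if low <= i <= high else 0 for i in range(n_levels)]

def matches(condition_bits, values, observed):
    """Does an observed attribute value satisfy an encoded condition?"""
    return condition_bits[values.index(observed)] == 1

colors = ["red", "green", "blue"]
cond = encode_nominal(colors, {"red", "blue"})  # matches red OR blue
```

Encodings like these let a single classifier condition express arbitrary value subsets and ranges, matching the expressiveness of symbolic attribute tests while remaining plain bit strings for the genetic algorithm.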

Journal ArticleDOI
TL;DR: Embryos from rabbits injected with LV at 24 hours after MTX exhibited either typical MTX-induced lesions or a sequence of reparative events similar to those described for the 16 and 20 hour LV-treated embryos, presumably related directly to its mechanism of developmental toxicity.
Abstract: Methotrexate (MTX) is lethal or teratogenic to embryos of all species tested. New Zealand white rabbit embryos are relatively resistant to the embryolethal effects of MTX. However, when pregnant does were injected iv with 19.2 mg MTX/kg on gestational day 12, virtually all surviving fetuses exhibited multiple malformations of the head, limbs, and trunk. MTX is a structural analogue of folic acid that competitively inhibits dihydrofolate reductase, thereby preventing formation of folinic acid and essentially stopping one carbon metabolism. One carbon metabolism is important in the synthesis of methionine, histidine, glycine, and purine bases that are required for the de novo synthesis of DNA. Presumably these metabolic effects of MTX relate directly to its mechanism of developmental toxicity. An ameliorative treatment has been tested utilizing i.v. injection of pregnant rabbits with leucovorin (LV), a close structural analogue of folinic acid (the product of the inhibited enzyme), at various times after MTX exposure. When LV was injected at times up to 24 hours after MTX fewer malformed fetuses resulted and the incidence of specific malformations was reduced. When given at times up to 20 hours after MTX administration, LV virtually eliminated the grossly apparent effects of MTX at term. In the forelimb bud, MTX increased the extracellular space surrounding limb bud mesenchymal cells within 8-10 hours; this process continued through 16 hours and remained unabated by 24 hours. Mesenchymal cell nuclei became hyperchromatic and pyknotic during this time period. By 24 hours, a moderate amount of cellular debris was observed in the mesenchymal compartment of limb buds from approximately one-third of the embryos examined. Endothelial cell nuclei of the limb bud vasculature did not exhibit the histopathological alterations observed in the mesenchymal cells. Limb buds from embryos injected with LV at times up to 6 hours after MTX were histologically normal. 
When LV treatment was delayed until 16 or 20 hours after MTX, mesenchymal nuclei regained normal appearance within 2 hours of treatment; further, the abnormally large extracellular space began to decrease during the next 4 hours. Cellular debris was not a prominent feature of limb buds from LV-treated embryos examined at any time. Embryos from rabbits injected with LV at 24 hours after MTX exhibited either typical MTX-induced lesions or a sequence of reparative events similar to those described for the 16 and 20 hour LV-treated embryos.(ABSTRACT TRUNCATED AT 400 WORDS)

Journal ArticleDOI
TL;DR: Peat cores from three aquatic environments (freshwater, brackish and marine) were analyzed for organic, pyritic and sulfatic sulfur contents and isotope ratios as mentioned in this paper.

Proceedings ArticleDOI
01 Dec 1991
TL;DR: The authors present some performance results for the Moving Time Window parallel simulation control protocol: a scheduling paradigm for parallel discrete-event simulation that supports both optimistic and conservative event execution models via a time window.
Abstract: The authors present some performance results for the Moving Time Window (MTW) parallel simulation control protocol: a scheduling paradigm for parallel discrete-event simulation. MTW supports both optimistic and conservative event execution models via a time window. The time window constrains simulation object asynchrony by temporally bounding the difference in local simulation time between objects. MTW also provides a hierarchy of synchronization alternatives aimed at reducing simulation execution time, while maintaining temporal integrity. The authors describe the MTW paradigm in some detail, proposing several initial hypotheses regarding MTW performance, and present experimental evidence to support these hypotheses.
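The window rule itself is compact. The sketch below is a toy driver with invented clocks and event spacings, showing only how the window bounds asynchrony between simulation objects; the protocol's actual scheduler and synchronization hierarchy are not reproduced.

```python
def eligible(local_times, window):
    """Moving Time Window rule: an object may execute its next event only if
    its local simulation time is within `window` of the global minimum,
    bounding the asynchrony between simulation objects."""
    t_min = min(local_times)
    return [i for i, t in enumerate(local_times) if t - t_min <= window]

def run_until(local_times, spacings, window, horizon):
    """Toy driver: each eligible object advances its clock by its fixed
    event spacing until every clock reaches `horizon`."""
    times = list(local_times)
    while min(times) < horizon:
        for i in eligible(times, window):
            times[i] += spacings[i]
    return times
```

A fast object that reaches the leading edge of the window simply stalls until the slowest object catches up, which is how the window trades optimism against rollback risk.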

Proceedings Article
14 Jul 1991
TL;DR: This paper describes a system which processes local sensor data in such a way as to allow efficient, reactive local navigation, and reports experiments with this system, both in simulation and with a real robot operating in natural terrain.
Abstract: In order to navigate autonomously, most robot systems are provided with some sort of global terrain map. To make storage practical, these maps usually have a high-level symbolic representation of the terrain. The robot's symbolic map is then used to plan a local path. This paper describes a system which uses the reverse (and perhaps more natural) process. This system processes local sensor data in such a way as to allow efficient, reactive local navigation. A byproduct of this navigation process is an abstraction of the terrain information which forms a global symbolic terrain map of the terrain through which the robot has passed. Since this map is in the same format as that used by the local navigation system, the map is easy for the system to use, augment, or correct. Compared with the data from which the maps are created, the maps are very space efficient, and can be modified, or used for navigation in real-time. Experiments with this system, both in simulation, and with a real robot operating in natural terrain, are described.

Journal ArticleDOI
TL;DR: An architecture for a distributed database management system which operates in a limited heterogeneous environment is described and techniques for query processing and transaction management are discussed.

Proceedings ArticleDOI
08 Jul 1991
TL;DR: An extension to error backpropagation that allows the nodes in a neural network to encode state information in an autoregressive 'memory' gives such networks the ability to learn to recognize sequences and context-sensitive patterns.
Abstract: Describes an extension to error backpropagation that allows the nodes in a neural network to encode state information in an autoregressive 'memory'. This neural model gives such networks the ability to learn to recognize sequences and context-sensitive patterns. Building upon the work of A. Wieland (1990) concerning nodes with a single feedback connection, the authors generalize the method to n feedback connections and address stability issues. The learning algorithm is derived, and a few applications are presented.
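The forward pass of such a node can be sketched as follows. This is a hypothetical minimal version with fixed weights and n = 2 feedback connections; the learning rule derived in the paper is not reproduced.

```python
import math

def ar_node(inputs, w, feedback):
    """Node with an autoregressive 'memory':
    y_t = tanh(w*x_t + sum_i a_i * y_{t-i}),
    where `feedback` holds the n autoregressive coefficients a_1..a_n
    applied to the node's own past outputs."""
    history = [0.0] * len(feedback)
    outputs = []
    for x in inputs:
        y = math.tanh(w * x + sum(a * h for a, h in zip(feedback, history)))
        history = [y] + history[:-1]  # shift the node's own past outputs
        outputs.append(y)
    return outputs
```

With all feedback coefficients zero the node degenerates to an ordinary memoryless unit; nonzero coefficients make its response depend on the input sequence, which is what lets the network recognize context-sensitive patterns. Stability of the recursion depends on the feedback coefficients, the issue the paper addresses for general n.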

Book ChapterDOI
01 Jan 1991
TL;DR: A reformulation of the genetic algorithm is proposed that makes it appropriate to any representation that can be cast in a formal grammar, and concentrates on the modifications required to make the space of legal structures closed under the crossover operator.
Abstract: High-level syntactically-based representations pose problems for applying the GA because it is hard to construct crossover operators that always result in legal offspring. This paper proposes a reformulation of the genetic algorithm that makes it appropriate to any representation that can be cast in a formal grammar. This reformulation is consistent with recent reinterpretations of GA foundations in set-theoretic terms, and concentrates on the modifications required to make the space of legal structures closed under the crossover operator. The analysis places no restriction on the form of the grammars.
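The key closure property, namely that crossover swaps subtrees deriving from the same nonterminal so offspring remain legal, can be sketched on explicit derivation trees. The tree encoding and toy grammar below are invented for illustration and are not the paper's formalism.

```python
import random

def subtrees(tree, symbol, path=()):
    """Yield the path to every subtree rooted at nonterminal `symbol`."""
    sym, children = tree
    if sym == symbol:
        yield path
    for i, child in enumerate(children):
        yield from subtrees(child, symbol, path + (i,))

def get(tree, path):
    for i in path:
        tree = tree[1][i]
    return tree

def replace(tree, path, new):
    if not path:
        return new
    sym, children = tree
    i = path[0]
    return (sym, children[:i] + [replace(children[i], path[1:], new)]
            + children[i + 1:])

def grammar_crossover(p1, p2, symbol, rng):
    """Swap random subtrees rooted at the same nonterminal, so both offspring
    are still legal derivations of the grammar (closure under crossover)."""
    path1 = rng.choice(list(subtrees(p1, symbol)))
    path2 = rng.choice(list(subtrees(p2, symbol)))
    return (replace(p1, path1, get(p2, path2)),
            replace(p2, path2, get(p1, path1)))

def count(tree, symbol):
    sym, children = tree
    return (sym == symbol) + sum(count(c, symbol) for c in children)

# Derivations of E -> (E + E) | x | y, written as (symbol, [children]).
p1 = ("E", [("x", [])])
p2 = ("E", [("E", [("y", [])]), ("+", []), ("E", [("x", [])])])
c1, c2 = grammar_crossover(p1, p2, "E", random.Random(7))
```

Because only like-labeled subtrees are exchanged, every offspring is still a derivation of the grammar, which is the closure requirement the paper builds its reformulation around.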

Book ChapterDOI
01 Mar 1991
TL;DR: An automated method for transforming dense, uniformly sampled data grids to an irregular triangular mesh that represents a piecewise planar approximation to the sampled data, derived from a Delaunay triangulation is presented.
Abstract: Interactive visualization of three dimensional data requires construction of a geometric model for rendering by a graphics processor. We present an automated method for transforming dense, uniformly sampled data grids to an irregular triangular mesh that represents a piecewise planar approximation to the sampled data. The mesh vertices comprise surface-specific points, which characterize important surface features. We obtain surface-specific points by a novel application of linear and non-linear filters, and thresholding. We define a procedure for constructing a triangulation, derived from a Delaunay triangulation, that conforms to the sampled data. In our example application, modeling a terrain surface over a large area, an 80% reduction in polygons maintains an acceptable fit. This method also extends to the tessellation of images. Applications include scientific visualization and construction of virtual environments.
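The surface-specific point selection step can be approximated with a discrete Laplacian filter and a threshold. This is a simplified stand-in; the paper's actual combination of linear and non-linear filters is not specified here.

```python
def surface_points(grid, threshold):
    """Mark candidate mesh vertices: cells where the magnitude of a 5-point
    discrete Laplacian (a curvature proxy) exceeds `threshold`. Peaks, pits
    and ridges of the sampled surface respond strongly; flat areas do not."""
    rows, cols = len(grid), len(grid[0])
    points = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            lap = (grid[i - 1][j] + grid[i + 1][j] +
                   grid[i][j - 1] + grid[i][j + 1] - 4 * grid[i][j])
            if abs(lap) > threshold:
                points.append((i, j))
    return points

# A flat terrain with a single spike: only the spike and the cross
# neighbours whose Laplacian it perturbs should be selected.
terrain = [[0.0] * 7 for _ in range(7)]
terrain[3][3] = 10.0
picked = surface_points(terrain, threshold=5.0)
```

The selected points then serve as vertices for the constrained Delaunay triangulation, so flat regions contribute almost no polygons, which is where the reported 80% reduction comes from.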

Journal ArticleDOI
TL;DR: Sufficient conditions are given for the code of a difference set to be embedded into a duadic code, generalizing some well-known results on projective planes; the result is related to Wilbrink's Theorem.

Journal ArticleDOI
TL;DR: In this article, an architecture for a scanning reflector antenna that supports wide-angle scanning, multioctave tunable bandwidths, transmit and receive operation, and monopulse processing is presented.
Abstract: An architecture for a scanning reflector antenna that supports wide-angle scanning, multioctave tunable bandwidths, transmit and receive operation, and monopulse processing is presented. It consists of a parabolic dish with an offset phased array. The array is actually a feed-through lens fed from a small feed array in the back of the lens. Beam steering is accomplished by using the feed array to illuminate clusters of elements within the lens that correspond to far-field beam positions. The evolution of the architecture and a computer model is described. Simulated and measured performance results are reported. Off-axis scanning of 10 degrees to 15 degrees appears to be possible while still supporting gain magnifications of four to five.

Proceedings ArticleDOI
18 Jun 1991
TL;DR: The paper describes nonmonotonic typed multilevel logic (NTML) for multilevel database applications and discusses techniques for query evaluation and integrity checking.
Abstract: For pt.I see Proc. 4th Computer Security Foundations, Franconia, USA (1991). In pt.I the author described a logic called nonmonotonic typed multilevel logic (NTML) for multilevel database applications, along with various approaches to viewing multilevel databases through NTML. In this paper he continues the discussion of the applications of NTML. In particular, the use of NTML as a programming language, issues in handling negative information in multilevel databases, and approaches for integrity checking in multilevel database systems are described. This work on NTML will be of significance to multilevel data/knowledge base applications in the same way logic programming has been to the development of data/knowledge base applications.

Proceedings ArticleDOI
30 Apr 1991
TL;DR: PEC augments a rectangular 2-D mesh by adding four longer connections to each node to form meshes between processors that are separated by 2^k hops on the near-neighbor mesh.
Abstract: Describes packed exponential connections (PEC), a novel interconnection network that is being used to implement a fine grain configurable hardware massively parallel computer. PEC augments a rectangular 2-D mesh by adding four longer connections to each node. These longer connections form meshes between processors that are separated by 2^k hops on the near-neighbor mesh. The distribution of these meshes provides regular and predictable long-distance connectivity between regions of the near-neighbor mesh. PEC is efficient in use of wire area, and is able to implement both mesh and non-mesh data-transfer patterns.

Proceedings ArticleDOI
01 Jun 1991
TL;DR: An algorithm is presented to find the minimum number of total delay buffer stages necessary to synchronize a pipelined system, and it is shown that the problem can be recast in terms of the classical minimum cost network flow problem.
Abstract: When designing a pipelined digital system, delay buffers (often implemented as shift registers) are usually introduced into the system in order to synchronize the various signals impinging on each processing element, that is, to ensure that all inputs to a processing block arrive at precisely the same time. Automatic techniques for finding the lengths of such buffers, and their proper points of insertion in the system, have been proposed; they are usually based on graph-theoretic approaches [1-3]. However, the solution to this synchronization problem is not unique, so there exist many combinations of buffer location and length that can produce overall synchronization in a typical pipelined network. Obviously, it would be beneficial to determine the minimum number of total delay buffer stages necessary to synchronize a pipelined system, so that the system hardware cost and complexity can be reduced. In this paper, we present an algorithm to solve this buffer minimization problem. We will show that it can be recast in terms of the classical minimum cost network flow problem. Hence, the time complexity of our algorithm is polynomial rather than exponential as for the algorithm reported in [4]. Our technique is applicable to systems containing feedback loops, but in the interest of brevity in this article we will treat only the most common case in which our underlying system graphs are acyclic. The algorithm has been used in a silicon compiler design environment described in [5].
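A feasible (though not necessarily minimal) buffering can be obtained from a simple longest-path schedule, which makes the problem concrete; the paper's contribution is recasting the minimization itself as a min-cost network flow. The block names and latencies below are invented.

```python
def insert_buffers(edges, latency):
    """Compute per-edge delay-buffer lengths that synchronize an acyclic
    pipelined system: every input to a block arrives at the same time.
    `edges` must be listed in topological order; `latency[u]` is the
    processing delay of block u. This ASAP schedule is feasible but not
    guaranteed minimal; minimizing total buffer stages is the min-cost
    network flow formulation described in the paper."""
    arrival = {}
    for u, v in edges:
        arrival.setdefault(u, 0)
        au = arrival[u] + latency[u]
        arrival[v] = max(arrival.get(v, 0), au)
    buffers = {(u, v): arrival[v] - (arrival[u] + latency[u])
               for u, v in edges}
    return arrival, buffers

# Diamond: block d receives a fast path (via b) and a slow path (via c),
# so the fast path needs buffer stages to balance the arrival times.
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]
latency = {"a": 1, "b": 2, "c": 5, "d": 1}
arrival, buffers = insert_buffers(edges, latency)
```

Here the b-to-d edge needs 3 buffer stages to match the slower path through c; because buffers can also be pushed to other edges without breaking synchronization, the total count is what the min-cost flow formulation optimizes.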

Patent
24 Oct 1991
TL;DR: In this article, a very high-speed, high-resolution, low-noise, subranging analog-to-digital (A/D) converter architecture is described, which employs several sample-and-hold circuits in parallel.
Abstract: A very high-speed, high-resolution, low-noise, subranging analog-to-digital (A/D) converter architecture is described. It employs several sample-and-hold circuits in parallel. Also, the second-stage fine quantization flash A/D converter circuit of the conventional subranging A/D converter is replaced with a hybrid subranging converter of higher resolution and linearity. This is shown to achieve a very high dynamic range by minimizing noise due to the sample-and-hold and fine quantization circuits. This architecture permits construction of 16 to 18 bit 10 MHz A/D converters applicable for airborne radar systems.

Proceedings ArticleDOI
P.W. Mallet1
02 Dec 1991
TL;DR: The paper presents a list of considerations for applying a commercial off-the-shelf disk encryptor to an environment where hostile overrun is a significant threat.
Abstract: The paper presents a list of considerations for applying a commercial off-the-shelf disk encryptor to an environment where hostile overrun is a significant threat. The considerations include: how the encryption device is configured and interfaced to the workstation, host, or server; encryption key management including key entry, changeover, and quick destruct; and long term off-line storage.