
Showing papers by "Mitre Corporation" published in 1993


Patent
12 Oct 1993
TL;DR: In this paper, a system for authenticating and authorizing a user to access services on a heterogeneous computer network is described, which includes at least one workstation and one authorization server connected to each other through a network.
Abstract: A system for authenticating and authorizing a user to access services on a heterogeneous computer network. The system includes at least one workstation and one authorization server connected to each other through a network. A user couples a personally protectable coprocessor (smart card) to the workstation by means of a bidirectional communications channel. The coprocessor is adapted to receive signals including first encrypted authentication information and to decrypt the first encrypted authentication information using a preselected first key. The coprocessor is further adapted to assemble and encrypt second authentication information using a preselected second key and to transmit the second encrypted authentication information to the workstation. The workstation then communicates the information onto the network whereby the user is authenticated to access the networked computer or service.
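The two-key challenge-response flow in the patent abstract can be sketched as follows; the toy stream cipher, the key sizes, and the `user-id` payload are illustrative stand-ins, not the patent's actual algorithms:

```python
import hashlib, os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR with a SHA-256-derived keystream.
    # Stands in for the patent's unspecified encryption algorithm.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Hypothetical keys: the first decrypts the server's challenge,
# the second encrypts the card's response.
first_key = os.urandom(32)
second_key = os.urandom(32)

# Authorization server sends an encrypted challenge to the card.
challenge = os.urandom(16)
encrypted_challenge = keystream_xor(first_key, challenge)

# Smart card decrypts the first authentication information ...
recovered = keystream_xor(first_key, encrypted_challenge)
assert recovered == challenge

# ... then assembles and encrypts second authentication information.
response = hashlib.sha256(recovered + b"user-id").digest()
encrypted_response = keystream_xor(second_key, response)

# Server, holding the second key, verifies the card's response.
expected = hashlib.sha256(challenge + b"user-id").digest()
assert keystream_xor(second_key, encrypted_response) == expected
print("authenticated")
```

The workstation itself never needs either key; it only relays the encrypted blobs, which is what lets the card remain the "personally protectable" trust anchor.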

233 citations


Journal ArticleDOI
TL;DR: IMPS is an interactive mathematical proof system intended as a general-purpose tool for formulating and applying mathematics in a familiar fashion and provides some support for modeling applications in computer science.
Abstract: IMPS is an interactive mathematical proof system intended as a general-purpose tool for formulating and applying mathematics in a familiar fashion. The logic of IMPS is based on a version of simple type theory with partial functions and subtypes. Mathematical specification and inference are performed relative to axiomatic theories, which can be related to one another via inclusion and theory interpretation. IMPS provides relatively large primitive inference steps to facilitate human control of the deductive process and human comprehension of the resulting proofs. An initial theory library containing over a thousand repeatable proofs covers significant portions of logic, algebra, and analysis and provides some support for modeling applications in computer science.

175 citations


Book ChapterDOI
01 Jan 1993
TL;DR: While this study focuses on the feasibility, validity, and segregated contribution of exclusively continuous OASR, future highly robust recognition systems should combine optical and acoustic information with syntactic, semantic and pragmatic aids.
Abstract: This study describes the design and implementation of a novel continuous speech recognizer that uses optical information from the oral-cavity shadow of a speaker. The system uses hidden Markov models (HMMs) trained to discriminate optical information and achieves a recognition rate of 25.3 percent on 150 test sentences. This is the first system to accomplish continuous optical automatic speech recognition (OASR). This level of performance--without the use of syntactical, semantic, or any other contextual guide to the recognition process--indicates that OASR may be used as a major supplement for robust multi-modal recognition in noisy environments. Additionally, new features important for OASR were discovered, and novel approaches to vector quantization, training, and clustering were utilized. This study contains three major components. First, it hypothesizes 35 static and dynamic optical features to characterize the speaker's oral-cavity shadow. Using the corresponding correlation matrix and a principal component analysis, the study discarded 22 oral-cavity features. The remaining 13 oral-cavity features are mostly dynamic features, unlike the static features used by previous researchers. Second, the study merged phonemes that appear optically similar on the speaker's oral-cavity region into visemes. The visemes were objectively analyzed and discriminated using HMM and clustering algorithms. Most significantly, the visemes for the speaker, obtained through computation, are consistent with the phoneme-to-viseme mapping discussed by most lipreading experts. This similarity, in a sense, verifies the selection of oral-cavity features. Third, the study trained the HMMs to recognize, without a grammar, a set of sentences having a perplexity of 150, using visemes, trisemes (triplets of visemes), and generalized trisemes (clustered trisemes).
The system achieved recognition rates of 2 percent, 12.7 percent, and 25.3 percent using, respectively, viseme HMMs, triseme HMMs, and generalized triseme HMMs. The study concludes that the methodologies used in this investigation demonstrate the need for further research on continuous OASR and on the integration of optical information with other recognition methods. While this study focuses on the feasibility, validity, and segregated contribution of exclusively continuous OASR, future highly robust recognition systems should combine optical and acoustic information with syntactic, semantic and pragmatic aids.
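The feature-reduction step (correlation matrix plus principal component analysis, 35 features down to 13) can be sketched on synthetic data; the random data, the variance threshold, and the loading-based ranking below are illustrative assumptions, not the study's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the study's 35 static/dynamic oral-cavity
# features measured over frames of the mouth-region shadow.
n_frames, n_features = 500, 35
X = rng.normal(size=(n_frames, n_features))
# Make the last 15 features near-redundant so PCA can discard some.
X[:, 20:] = X[:, :15] @ rng.normal(size=(15, 15)) * 0.5 \
            + 0.01 * rng.normal(size=(n_frames, 15))

# Correlation matrix, then PCA via eigendecomposition.
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]           # descending eigenvalues
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep enough components to explain 99% of the variance, then keep
# the 13 features with the largest total loadings on them
# (mirroring the paper's reduction from 35 to 13 features).
k = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.99)) + 1
loadings = np.abs(eigvecs[:, :k]).sum(axis=1)
kept = np.argsort(loadings)[::-1][:13]
print(sorted(kept.tolist()))
```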

94 citations


Journal ArticleDOI
TL;DR: Two proximate time-optimal servomechanisms, PTO53 and PTO53τ, are proposed for type-3 and type-2 third-order plants, respectively, and it is shown that the control parameters can be adjusted to accommodate, more or less, disturbances and unmodeled dynamics in the system.
Abstract: The problem of proximate time-optimal control of third-order systems is considered. Two proximate time-optimal servomechanisms, PTO53 and PTO53τ, are proposed for type-3 and type-2 third-order plants, respectively. Theorems stating sufficient conditions on each system's control design parameters to ensure global stability are given, and it is shown that the control parameters can be adjusted to accommodate, more or less, disturbances and unmodeled dynamics in the system. The approach used to develop the controllers is that of constructing a 'slab' in three-dimensional state space that describes the 'switching' structure of the control. The technique relies on three-dimensional phase-space analysis, which is rarely applied to systems of order three or higher. Simulation and experimental results demonstrate the performance of the algorithms developed.

91 citations


Journal ArticleDOI
TL;DR: A national traffic flow management (TFM) strategy is proposed to reduce the congestion and delay in the National Airspace System (NAS) caused by traffic growth and changes in traffic patterns.
Abstract: Traffic growth and changes in traffic patterns have caused increasing congestion and delay in the National Airspace System (NAS). A national traffic flow management (TFM) strategy that reduces both...

72 citations


Book ChapterDOI
01 Jan 1993
TL;DR: This paper reviews some well known results in mathematical genetics that use probability distributions to characterize the effects of recombination on multiple loci in the absence of selection and uses this characterization to quantify certain inductive biases associated with crossover operators.
Abstract: Though genetic algorithms are loosely based on the principles of genetic variation and natural selection, the theory of mathematical genetics has not played a large role in most analyses of genetic algorithms. This paper reviews some well known results in mathematical genetics that use probability distributions to characterize the effects of recombination on multiple loci in the absence of selection. The relevance of this characterization to genetic algorithm research is illustrated by using it to quantify certain inductive biases associated with crossover operators. The potential significance of this work for the theory of genetic algorithms is discussed.
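One classical result of the kind the paper reviews is that recombination alone drives two-locus linkage disequilibrium D toward zero geometrically, D_t = (1 - r)^t D_0 for recombination rate r. A minimal simulation (toy two-locus population and an invented recombination step) illustrates the decay:

```python
import random

random.seed(1)

# Two-locus toy model: track disequilibrium D = P(AB) - P(A)P(B)
# under repeated recombination with rate r and no selection.
def step(pop, r):
    random.shuffle(pop)
    out = []
    for i in range(0, len(pop), 2):
        (a1, b1), (a2, b2) = pop[i], pop[i + 1]
        if random.random() < r:          # crossover between the loci
            out += [(a1, b2), (a2, b1)]
        else:
            out += [(a1, b1), (a2, b2)]
    return out

def disequilibrium(pop):
    n = len(pop)
    pAB = sum(1 for a, b in pop if a and b) / n
    pA = sum(a for a, _ in pop) / n
    pB = sum(b for _, b in pop) / n
    return pAB - pA * pB

# Start fully associated: half the population is (1,1), half (0,0).
pop = [(1, 1)] * 500 + [(0, 0)] * 500
d0 = disequilibrium(pop)                 # 0.25 initially
for t in range(10):
    pop = step(pop, r=0.5)
print(round(d0, 3), round(disequilibrium(pop), 3))
```

After ten generations at r = 0.5, D has essentially vanished (up to sampling noise), which is the sense in which crossover biases the population toward independence across loci.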

67 citations


Journal ArticleDOI
TL;DR: A version of simple type theory, called PF*, in which functions may be partial and types may have subtypes is presented, and it is proved that the axiomatic system is complete with respect to the general models semantics.

53 citations


Journal ArticleDOI
TL;DR: The assumptions underlying current regulatory practices for environmental chemicals are not applicable to the medicinal use of chloral hydrate and a threshold model is appropriate, and possible modifications in its use are suggested.
Abstract: Objective. Current federal regulations of potentially carcinogenic environmental chemicals are based on the assumption that risks for humans can be extrapolated from the effects of chronic high-dose exposure of rodents. It is assumed that all chemicals induce cancer by a genotoxic mechanism (direct interaction with DNA) and that humans metabolize chemicals by the same pathways as the test rodents. Trichloroethylene, a former medicine, is now regulated because of rodent studies. Its major metabolite, chloral hydrate, widely used as a sedative in both adults and children, is in danger of being banned by comparable studies. This paper assesses the safety of chloral hydrate. Design. Analysis of the literature regarding the metabolic, toxicologic, and epidemiologic data on trichloroethylene and chloral hydrate. Results. The dose-response relationship for carcinogenesis of chloral hydrate and other chemicals in its metabolic breakdown pathway is nonlinear in rodents: very high doses given chronically, sufficient to cause cellular necrosis, are necessary for induction of malignancies. In addition, epidemiologic data on people exposed to substantial amounts of trichloroethylene (which is metabolized to chloral hydrate) show no increase in mortality or cancers. Conclusions. The assumptions underlying current regulatory practices for environmental chemicals are not applicable to the medicinal use of chloral hydrate. Instead, a threshold model is appropriate. The data do not suggest the need to ban chloral hydrate as a medicine; however, possible modifications in its use are suggested.

51 citations


Book ChapterDOI
23 Sep 1993
TL;DR: Although the method is based on a nonclassical version of simple type theory, it is intended as a guide for theory interpretation in classical simple type theories as well as in predicate logics with partial functions.
Abstract: Theory interpretation is a logical technique for relating one axiomatic theory to another with important applications in mathematics and computer science as well as in logic itself. This paper presents a method for theory interpretation in a version of simple type theory, called lutins, which admits partial functions and subtypes. The method is patterned on the standard approach to theory interpretation in first-order logic. Although the method is based on a nonclassical version of simple type theory, it is intended as a guide for theory interpretation in classical simple type theories as well as in predicate logics with partial functions.

50 citations


DeSesso JM
01 Sep 1993
TL;DR: The monkey offers a more appropriate model for studying the toxic effects of inhaled substances on the nasal passages and extrapolating the findings to humans; the rat, which is very different from humans, is a poor model.
Abstract: While nasal cancer is relatively rare among the general population, workers in the nickel refining, leather manufacturing, and furniture building industries exhibit increased incidences of nasal cancer. To investigate the causes of nasal cancer and to design ameliorative strategies, an appropriate animal model for the human upper respiratory regions is required. The present report describes, compares, and assesses the anatomy and physiology of the nasal passages and upper airways of humans, rats, and monkeys for the purpose of determining a relevant animal model in which to investigate potential causes of nasal cancer. Based on the mode of breathing, overall geometry of the nasal passages, relative nasal surface areas, proportions of nasal surfaces lined by various epithelia, mucociliary clearance patterns, and inspiratory airflow routes, the rat, which is very different from humans, is a poor model. In contrast, the monkey exhibits many similarities to humans. Although the monkey does differ from humans in that it exhibits a more rapid respiratory rate, smaller minute and tidal volumes, larger medial turbinate, and a vestibular wing that creates an anterior vortex during inspiration, it offers a more appropriate model for studying the toxic effects of inhaled substances on the nasal passages and extrapolating the findings to humans.

50 citations


Journal ArticleDOI
TL;DR: The authors construct new and improved sonar sequences by applying rotation, multiplication, and shearing transformations to Costas sequence constructions.
Abstract: The authors construct new and improved sonar sequences by applying rotation, multiplication, and shearing transformations to Costas sequence constructions. A catalog of the best known sonar sequences with up to 100 symbols is given.
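The distinct-differences property and one of the transformations can be sketched in code; the Welch-construction parameters (p = 11, g = 2) and the unreduced shearing variant below are illustrative choices, not entries from the paper's catalog:

```python
def is_sonar(seq):
    # A sequence is a sonar sequence if, for every shift h, the
    # differences seq[i+h] - seq[i] are pairwise distinct.
    n = len(seq)
    for h in range(1, n):
        diffs = [seq[i + h] - seq[i] for i in range(n - h)]
        if len(diffs) != len(set(diffs)):
            return False
    return True

# Welch construction of a Costas sequence: successive powers of a
# primitive root g modulo a prime p (here p = 11, g = 2).
p, g = 11, 2
welch = [pow(g, i, p) for i in range(1, p)]
assert is_sonar(welch)

# Shearing: add a multiple of the index to each symbol. For fixed
# shift h this adds the same constant to every difference, so the
# distinct-differences property is preserved.
sheared = [v + 3 * i for i, v in enumerate(welch)]
assert is_sonar(sheared)
print(welch)  # → [2, 4, 8, 5, 10, 9, 7, 3, 6, 1]
```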

Journal ArticleDOI
TL;DR: Fundamental notions in computing exception propagation are discussed and an analysis tool is described that has proved to be effective in detecting inconsistencies in the exception-handling code of Ada applications.
Abstract: Since the signature of an Ada subprogram does not specify the set of exceptions that the subprogram can propagate, computing the set of exceptions that a subprogram may encounter is not a trivial task. This is a source of error in large Ada systems: for example, a subprogram may not be prepared to handle an exception propagated from another subprogram several layers lower in the call-tree. In a large system, the number of paths in exceptional processing is so great that it is unlikely that testing will uncover all errors in inter-procedural exception handling. Nor are compilers or code inspections likely to locate all such errors. Exception handling is an area where static analysis has a high potential payoff for systems with high reliability requirements. We discuss fundamental notions in computing exception propagation and describe an analysis tool that has proved to be effective in detecting inconsistencies in the exception-handling code of Ada applications.
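The propagation computation the abstract describes amounts to a fixpoint over the call graph: a subprogram propagates what it raises plus what its callees propagate, minus what it handles. A minimal sketch with an invented three-subprogram example:

```python
# Toy static analysis of inter-procedural exception propagation.
# Subprogram names, raised/handled sets, and the call graph are
# invented for illustration.
calls = {                      # caller -> callees
    "main": ["update"],
    "update": ["read_record"],
    "read_record": [],
}
raises = {                     # exceptions raised locally
    "main": set(),
    "update": set(),
    "read_record": {"Data_Error"},
}
handles = {                    # exceptions handled, not re-propagated
    "main": set(),
    "update": set(),           # forgot to handle Data_Error!
    "read_record": set(),
}

def propagated(calls, raises, handles):
    out = {f: set() for f in calls}
    changed = True
    while changed:             # iterate to a fixpoint
        changed = False
        for f in calls:
            new = set(raises[f])
            for g in calls[f]:
                new |= out[g]
            new -= handles[f]
            if new != out[f]:
                out[f] = new
                changed = True
    return out

result = propagated(calls, raises, handles)
print(result["main"])  # Data_Error reaches the top undetected
```

The interesting output is exactly the inconsistency the paper's tool targets: an exception raised several layers down that no intermediate subprogram is prepared to handle.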

Journal ArticleDOI
TL;DR: The transport theory model is used to illustrate why DVDDs are best able to support fast presentation from arbitrary directions and the technology underlying various DVDDs is described.
Abstract: Direct volume display devices (DVDDs), which display 3D volumes and surfaces in a volume by providing depth rather than depth cues, are discussed. The transport theory model is used to illustrate why DVDDs are best able to support fast presentation from arbitrary directions. The technology underlying various DVDDs is described. Specifically, the design and operation of the OmniView rotating-screen DVDD are examined. The air-traffic-control/air-tactics-analysis, satellite orbit mechanics, and time-critical target prosecution applications of DVDDs are also discussed.

Journal ArticleDOI
TL;DR: In this article, a new class of finite-dimensional reproducing kernel spaces of m × 1 vector-valued analytic functions on a fairly general domain Ω+ is introduced. The reproducing kernels of these spaces have a special form based on an m × m matrix-valued function Θ that is J-unitary on the boundary of Ω+.

Proceedings ArticleDOI
21 May 1993
TL;DR: The authors describe the lessons learned in extending the capabilities of a reverse engineering tool to analyze both an additional dialect of the language it was initially built to parse and a new embedded assembly language.
Abstract: The authors describe the lessons learned in extending the capabilities of a reverse engineering tool to analyze both an additional dialect of the language it was initially built to parse and a new embedded assembly language. The effort involved in this extension provides data to support the assertion that reverse engineering tools should create a clean separation between parsing the source code and analyzing it. A language-independent modeling approach is discussed that allows achieving this separation. Additional advantages that accrue by maintaining this separation, such as multiple language support and support for design recovery, are discussed.

Patent
05 Aug 1993
TL;DR: The Fast Fourier Transform (FFT) processor includes a plurality of pipelined, functionally identical stages, each stage adapted to perform a portion of an FFT operation on a block of data.
Abstract: The Fast Fourier Transform (FFT) processor includes a plurality of pipelined, functionally identical stages, each stage adapted to perform a portion of an FFT operation on a block of data. The output of the last stage of the processor is the high-precision Fast Fourier Transform of the data block. Support functions are included at each stage. Thus, each stage includes a computational element and a buffer memory interface. Each stage also includes apparatus for coefficient generation.
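The stage structure can be illustrated with a software radix-2 decimation-in-time pipeline, in which each functionally identical stage performs one layer of butterflies; this is a textbook FFT sketch, not the patent's hardware design:

```python
import cmath

def fft_stage(data, span):
    # One radix-2 stage: butterflies over pairs `span` apart.
    n = len(data)
    out = list(data)
    for start in range(0, n, 2 * span):
        for k in range(span):
            w = cmath.exp(-2j * cmath.pi * k / (2 * span))  # twiddle
            a, b = out[start + k], out[start + k + span]
            out[start + k] = a + w * b
            out[start + k + span] = a - w * b
    return out

def bit_reverse(data):
    # Input reordering required before the decimation-in-time stages.
    n = len(data)
    bits = n.bit_length() - 1
    return [data[int(format(i, f"0{bits}b")[::-1], 2)] for i in range(n)]

def fft(data):
    # The pipeline: bit reversal, then log2(n) identical stages.
    out = bit_reverse(data)
    span = 1
    while span < len(data):
        out = fft_stage(out, span)
        span *= 2
    return out

x = [1, 2, 3, 4, 0, 0, 0, 0]
X = fft(x)
# Cross-check against the direct DFT definition.
n = len(x)
direct = [sum(x[t] * cmath.exp(-2j * cmath.pi * t * k / n)
              for t in range(n)) for k in range(n)]
assert all(abs(a - b) < 1e-9 for a, b in zip(X, direct))
print("stages agree with direct DFT")
```

In the patent's arrangement each `fft_stage` call corresponds to one hardware stage with its own computational element, buffer memory interface, and twiddle-coefficient generator, with data streaming from stage to stage.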

Journal ArticleDOI
01 Dec 1993
TL;DR: This paper describes the design and prototype development of an Inference Controller for an MLS/DBMS that functions during query processing and describes some extensions to the inference controller so that an integrated solution can be provided to the problem.
Abstract: The Inference Problem compromises database systems that are usually considered to be secure. Here, users pose sets of queries and infer unauthorized information from the responses that they obtain. An Inference Controller is a device that prevents and/or detects security violations via inference. We are particularly interested in the inference problem which occurs in a multilevel operating environment. In such an environment, the users are cleared at different security levels and they access a multilevel database where the data is classified at different sensitivity levels. A multilevel secure database management system (MLS/DBMS) manages a multilevel database where its users cannot access data to which they are not authorized. However, providing a solution to the inference problem, where users issue multiple requests and consequently infer unauthorized knowledge, is beyond the capability of currently available MLS/DBMSs. This paper describes the design and prototype development of an Inference Controller for an MLS/DBMS that functions during query processing. To our knowledge this is the first such inference controller prototype to be developed. We also describe some extensions to the inference controller so that an integrated solution can be provided to the problem.
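A minimal sketch of query-time inference control: the controller tracks what a user has already learned and denies a response when it, combined with prior responses and a known inference rule, would reveal data above the user's clearance. The security labels, the inference rule, and the API below are invented for illustration:

```python
# Invented inference rule: knowing both a ship's location and its
# cargo lets a user deduce the (more highly classified) mission.
RULES = {frozenset({"ship_location", "ship_cargo"}): "mission"}
LABELS = {"ship_location": 1, "ship_cargo": 1, "mission": 3}

class InferenceController:
    def __init__(self, clearance):
        self.clearance = clearance
        self.released = set()     # history of this user's responses

    def query(self, item):
        if LABELS[item] > self.clearance:
            return "DENIED: classified"
        tentative = self.released | {item}
        for premises, conclusion in RULES.items():
            if premises <= tentative and LABELS[conclusion] > self.clearance:
                return "DENIED: would enable inference of " + conclusion
        self.released.add(item)
        return "RELEASED"

ic = InferenceController(clearance=1)
print(ic.query("ship_location"))  # RELEASED
print(ic.query("ship_cargo"))     # DENIED: would enable inference of mission
```

Each response in isolation is authorized at the user's level; only the history-aware check catches the violation, which is why the paper argues the controller must function during query processing.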

Journal ArticleDOI
TL;DR: Borders are proved by cutting a Voronoi polyhedron into cones, one for each of its faces, and the sum of all the cone volume bounds is minimized when there are 13 faces each of solid angle 4π/13.
Abstract: It is shown that a packing of unit spheres in three-dimensional Euclidean space can have density at most 0.773055..., and that a Voronoi polyhedron defined by such a packing must have volume at least 5.41848... These bounds are superior to the best bounds previously published [5] (0.77836 and 5.382, respectively), but are inferior to the tight bounds of 0.7404... and 5.550... claimed by Hsiang [2]. Our bounds are proved by cutting a Voronoi polyhedron into cones, one for each of its faces. A lower bound is established on the volume of each cone as a function of its solid angle. Convexity arguments then show that the sum of all the cone volume bounds is minimized when there are 13 faces each of solid angle 4π/13.

Journal ArticleDOI
TL;DR: The ambitious data and information system being built by the US National Aeronautics and Space Administration (NASA), with assistance from other agencies, is described, which will manage the mountains of EOS and Earth-science satellite data and also handle the harvesting and sharing of information.
Abstract: The problem of making a flood of remote sensing data readily accessible to a growing international community of global-change researchers and policy makers is discussed. The ambitious data and information system being built by the US National Aeronautics and Space Administration (NASA), with assistance from other agencies, is described. Called the Earth Observing System Data and Information System (EOSDIS), it will manage the mountains of EOS and Earth-science satellite data and will also handle the harvesting and sharing of information, the dissemination of ideas, and the establishment of a community of collaborators that will be professionally close-knit but geographically dispersed both in the United States and elsewhere.

Proceedings ArticleDOI
02 Jun 1993
TL;DR: This paper extends a multitarget tracking algorithm for use in multisensor tracking situations and shows how filtering can be handled in multisensor JPDA (MSJPDA) without leading to an exponential increase in filtering complexity.
Abstract: In this paper we extend a multitarget tracking algorithm for use in multisensor tracking situations. The algorithm we consider is Joint Probabilistic Data Association (JPDA). JPDA is extended to handle an arbitrary number of sensors under the assumption that the sensor measurement errors are independent across sensors. We also show how filtering can be handled in multisensor JPDA (MSJPDA) without leading to an exponential increase in filtering complexity. Simulation results are presented comparing the performance of the MSJPDA with another multisensor fusion algorithm and with the single-sensor JPDA algorithm.
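The key point, that measurements from independent sensors can be folded in one at a time without combinatorial blow-up in the filtering step, can be shown in the simplest setting (scalar state, single target, no data association); all numbers below are invented:

```python
# Scalar illustration of sequential multisensor updating: processing
# sensors one at a time gives exactly the same posterior as fusing
# them all at once, so filtering cost grows linearly (not
# exponentially) with the number of sensors.
def kalman_update(x, P, z, R):
    # Standard scalar Kalman measurement update.
    K = P / (P + R)
    return x + K * (z - x), (1 - K) * P

x, P = 0.0, 10.0                                  # prior mean, variance
sensors = [(1.2, 0.5), (0.8, 1.0), (1.1, 0.25)]   # (measurement, noise var)

# Sequential processing: one Kalman update per sensor.
xs, Ps = x, P
for z, R in sensors:
    xs, Ps = kalman_update(xs, Ps, z, R)

# Batch (information-form) processing of all sensors at once.
info = 1 / P + sum(1 / R for _, R in sensors)
mean = (x / P + sum(z / R for z, R in sensors)) / info
assert abs(xs - mean) < 1e-9 and abs(Ps - 1 / info) < 1e-9
print(round(xs, 4), round(Ps, 4))
```

In MSJPDA the per-sensor step is a JPDA update rather than a plain Kalman update, but the same sequential structure is what avoids enumerating joint association hypotheses across all sensors simultaneously.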

Journal ArticleDOI
TL;DR: The data suggest that it is possible to increase substantially the allowable trichloroethylene in drinking water without increasing health hazards, and it is concluded that the assumptions underlying current regulations are not applicable to TCE.

Proceedings ArticleDOI
13 May 1993
TL;DR: The paper addresses issues that must be investigated in order to design and develop a multilevel secure database management system for real-time applications.
Abstract: Database systems for real-time applications must satisfy timing constraints associated with transactions, in addition to maintaining data consistency. In addition to real-time requirements, security is usually required in many applications, because sensitive information must be safeguarded. Multilevel security requirements introduce a new dimension to transaction processing in real-time database systems. The paper addresses issues that must be investigated in order to design and develop a multilevel secure database management system for real-time applications.

Proceedings ArticleDOI
01 Dec 1993
TL;DR: Digital signatures are especially applicable to interpretations of contracts and statute of frauds law, and may be used to provide assurances in distributed and networked computer environments where electronic transactions require a high degree of trust.
Abstract: Digital Signature (DS) technology may be employed to produce legally enforceable signatures in Electronic Data Interchange (EDI) among computer users within the same general guidelines and requirements as those developed for handwritten signatures on paper. Digital signature technology promises assurance at least equal to that of written signatures. From a legal standpoint, this assurance remains to be tested in the evidentiary process. Business policies for organizational use of this technology are being created as the use of digital signature technology is adopted. Standard industry practice serves to create and document a legal precedent. Digital signatures are especially applicable to interpretations of contracts and statute of frauds law. Digital signatures may be used to provide assurances in distributed and networked computer environments where electronic transactions require a high degree of trust.
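A toy RSA signature (textbook key sizes, far too small for real use) illustrates the sign/verify mechanics such legal assurances rest on:

```python
import hashlib

# Tiny illustrative RSA key; real EDI signatures use keys of
# thousands of bits generated by a vetted library.
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent (Python 3.8+)

def sign(message: bytes) -> int:
    # Hash the message, then apply the private exponent.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, sig: int) -> bool:
    # Anyone holding the public key (n, e) can check the signature.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h

msg = b"purchase order #1"   # stand-in for an EDI transaction
s = sign(msg)
assert verify(msg, s)
print("signature verified")
```

A tampered message or forged signature would fail verification with overwhelming probability, which is what gives the signature its evidentiary value: only the private-key holder could have produced it.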

Proceedings ArticleDOI
21 May 1993
TL;DR: An approach that combines process modeling with process assessments is presented; a model of a large software maintenance process led to process improvements that might otherwise have been missed.
Abstract: The authors present an approach that combines process modeling with process assessments. They use the structured analysis and design technique (SADT) modeling notation (D.A. Marca and C.L. McGowan, 1988). The DoD CIM Initiative has standardized on a subset of SADT, called IDEF0, to model business processes. A SADT (IDEF0) model was created of a large software maintenance process and the model led to process improvements that might have been missed otherwise. This model-based process assessment approach is described as a process in its own right.

Journal ArticleDOI
Young C. Lee1
TL;DR: This paper analyzes GPS RAIM availability with relatively simple GPS augmentations in the form of barometric altimeter aiding and clock coasting and presents the technical analyses used by the Satellite Operational Implementation Team to support its recommendations.
Abstract: In late 1991, the FAA formed a group called the Satellite Operational Implementation Team to accelerate the introduction of satellite navigation and communications into the National Airspace System (NAS). Since then, the team has been addressing major technical and operational issues that need to be resolved before GPS is used in instrument flight rules (IFR) operations in the NAS. One of the most critical criteria for the operational approval of near-term use of GPS in the NAS is availability of receiver autonomous integrity monitoring (RAIM) detection and identification functions. To improve RAIM availability, team members suggested that relatively simple GPS augmentations in the form of barometric altimeter aiding and clock coasting be considered. This paper analyzes GPS RAIM availability with these augmentations and presents the technical analyses used by the team to support its recommendations.

Proceedings ArticleDOI
24 May 1993
TL;DR: A model of multilevel atomicity is offered that defines varying degrees of atomicity and recognizes that lower security level operations within a transaction must be able to commit or abort independently of higher security level operations.
Abstract: Data management applications that use multilevel database management system (DBMS) capabilities have the requirement to read and write objects at multiple levels within the bounds of a multilevel transaction. The authors define a new notion of atomicity that is meaningful within the constraints of the multilevel environment. They offer a model of multilevel atomicity that defines varying degrees of atomicity and recognizes that lower security level operations within a transaction must be able to commit or abort independently of higher security level operations. Execution graphs are provided as a tool for analyzing atomicity requirements in conjunction with internal semantic interdependencies among the operations of a transaction, and rules are proved for determining the greatest degree of atomicity that can be attained for a given multilevel transaction. Several alternative transaction management algorithms that can be used to preserve multilevel atomicity are presented.

Proceedings ArticleDOI
S. Dhar1
02 Jun 1993
TL;DR: This paper addresses the correction of registration errors due to biases in the measurements when multiple sensors are used for tracking targets, applying an augmented Kalman filter to simultaneously estimate the sensor biases and the state vector of track parameters.
Abstract: This paper addresses the correction of registration errors due to the presence of biases in the measurements when multiple sensors are used for tracking targets. The problem is formulated as one of recursively estimating the biases, and is addressed by applying an augmented Kalman filter that simultaneously estimates the sensor biases and the state vector (of track parameters), which would otherwise be estimated under the assumption that no biases are present. Simulation results are presented comparing the performance of the recursive algorithm with that of a generalized least squares method, which is applicable when targets with known positions are available for measurement by the sensors. Results are also presented for a two-sensor case when the extended Kalman filter is used as the tracking filter.
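The augmented-state idea can be sketched for a scalar position plus a constant sensor bias; the scenario below (one biased sensor, one reference sensor, the noise levels) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Augment the state with the bias b, so the filter estimates
# [position, bias] jointly instead of assuming b = 0.
true_pos, true_bias = 5.0, 1.5

H1 = np.array([[1.0, 1.0]])   # biased sensor:    z = pos + bias + v
H2 = np.array([[1.0, 0.0]])   # reference sensor: z = pos + v
R = np.array([[0.2]])

x = np.zeros(2)               # initial estimate of [position, bias]
P = np.eye(2) * 100.0         # large initial uncertainty

def update(x, P, z, H, R):
    # Standard linear Kalman measurement update.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for _ in range(200):
    z1 = true_pos + true_bias + rng.normal(scale=0.2)
    z2 = true_pos + rng.normal(scale=0.2)
    x, P = update(x, P, np.array([z1]), H1, R)
    x, P = update(x, P, np.array([z2]), H2, R)

print(np.round(x, 2))   # converges near [5.0, 1.5]
```

The reference sensor makes the pair (position, bias) observable; with a single biased sensor alone the two components could not be separated, which mirrors why registration correction is inherently a multisensor problem.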

Journal ArticleDOI
TL;DR: The Standard Clock methodology is extended to real-time discrete event dynamic systems, where analytically complex features such as job priorities and interrupt mechanisms can be handled in an efficient way for the parallel exploration of the performance surface.

Journal ArticleDOI
W.J. Kerins1
TL;DR: In this article, the effectiveness of off-board RF ECM is analyzed, and specific observations are made about its effectiveness when used to defend friendly aircraft against modern RF missile systems.
Abstract: Recent advances in RF missile systems have reduced the effectiveness of conventional on-board electronic countermeasures (ECM) to defend friendly aircraft. To provide the much needed defense, off-board ECM techniques are being developed. The effectiveness of off-board RF ECM is analyzed, and specific observations are made about the effectiveness of such ECM when used to defend friendly aircraft.

Proceedings ArticleDOI
01 Feb 1993
TL;DR: It is found that current HMDs fall short in important characteristics, such as resolution, field of view, and physical attributes, and a system designer should be able to provide a suitable immersive virtual environment with the inclusion of appropriate graphics rendering techniques and virtual environment input/output devices.
Abstract: Head mounted displays (HMD) provide one means of displaying virtual environments. This paper assesses the state of HMD technology, with respect to the Virtual Reality (VR) goal of creating an environment which matches the user's experience with the real world. We find that current HMDs fall short in important characteristics, such as resolution, field of view, and physical attributes. A system designer, familiar with these limitations, should be able to provide a suitable immersive virtual environment with the inclusion of appropriate graphics rendering techniques and virtual environment input/output devices.