
Showing papers by "Mitre Corporation" published in 1996


Book ChapterDOI
25 Sep 1996
TL;DR: A unique aspect of the architecture is a “state appraisal” mechanism that protects users and hosts from attacks via state modifications and that provides users with flexible control over the authority of their agents.
Abstract: Mobile agents are processes which can autonomously migrate to new hosts. Despite its many practical benefits, mobile agent technology results in significant new security threats from malicious agents and hosts. The primary added complication is that, as an agent traverses multiple hosts that are trusted to different degrees, its state can change in ways that adversely impact its functionality. In this paper, we discuss achievable security goals for mobile agents, and we propose an architecture to achieve these goals. The architecture models the trust relations between the principals of mobile agent systems. A unique aspect of the architecture is a “state appraisal” mechanism that protects users and hosts from attacks via state modifications and that provides users with flexible control over the authority of their agents.

243 citations


01 Jan 1996
TL;DR: This paper investigates new threats from malicious agents and hosts in mobile agent technology and develops a set of achievable security requirements for mobile agent systems.
Abstract: Mobile agents are processes which can autonomously migrate to new hosts. Despite its many practical benefits, mobile agent technology results in significant new security threats from malicious agents and hosts. The primary added complication is that, as an agent traverses multiple hosts that are trusted to different degrees, its state can change in ways that adversely impact its functionality. In this paper, we investigate these new threats and develop a set of achievable security requirements for mobile agent systems.

232 citations


Proceedings ArticleDOI
01 Nov 1996
TL;DR: The results of the performance tests indicate that the SCPS‐TP extensions yield significant improvements in throughput over unmodified TCP on error‐prone links and significantly improve performance over links with highly asymmetric data rates.
Abstract: The space communication environment and mobile and wireless communication environments show many similarities when observed from the perspective of a transport protocol. Both types of environments exhibit loss caused by data corruption and link outage, in addition to congestion‐related loss. The constraints imposed by the two environments are also similar – power, weight, and physical volume of equipment are scarce resources. Finally, it is not uncommon for communication channel data rates to be severely limited and highly asymmetric. We are working on solutions to these types of problems for space communication environments, and we believe that these solutions may be applicable to the mobile and wireless community. As part of our work, we have defined and implemented the Space Communications Protocol Standards‐Transport Protocol (SCPS‐TP), a set of extensions to TCP that address the problems that we have identified. The results of our performance tests, both in the laboratory and on actual satellites, indicate that the SCPS‐TP extensions yield significant improvements in throughput over unmodified TCP on error‐prone links. Additionally, the SCPS modifications significantly improve performance over links with highly asymmetric data rates.

135 citations


Journal ArticleDOI
01 Jan 1996
TL;DR: In this paper, it was shown that there are no nonzero solutions to lattice-type generalizations of the refinement equation to the Heisenberg group, and that for each finite collection of time-frequency shifts, the set of all functions f ∈ L2(R) whose corresponding time-frequency translates are linearly independent is an open, dense subset of L2(R).
Abstract: The refinement equation φ(t) = ∑_{k=N₁}^{N₂} c_k φ(2t − k) plays a key role in wavelet theory and in subdivision schemes in approximation theory. Viewed as an expression of linear dependence among the time-scale translates |a|^{1/2} φ(at − b) of φ ∈ L2(R), it is natural to ask if there exist similar dependencies among the time-frequency translates e^{2πibt} f(t + a) of f ∈ L2(R). In other words, what is the effect of replacing the group representation of L2(R) induced by the affine group with the corresponding representation induced by the Heisenberg group? This paper proves that there are no nonzero solutions to lattice-type generalizations of the refinement equation to the Heisenberg group. Moreover, it is proved that for each arbitrary finite collection {(a_k, b_k)}, k = 1, …, N, the set of all functions f ∈ L2(R) such that {e^{2πib_k t} f(t + a_k)}, k = 1, …, N, is linearly independent is an open, dense subset of L2(R). It is conjectured that this set is all of L2(R) \ {0}.
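For readability, the two relations contrasted in the abstract are restated below in display form. The finite index bound N is written explicitly here (it appears only as a stray superscript in the extracted text); otherwise the notation follows the abstract.

```latex
% Refinement equation: a linear dependence among the time-scale (affine) translates of phi
\varphi(t) \;=\; \sum_{k=N_1}^{N_2} c_k\, \varphi(2t - k), \qquad \varphi \in L^2(\mathbb{R}).

% Heisenberg-group analogue considered in the paper: a finite dependence among the
% time-frequency translates of f. The paper proves there are no nonzero solutions in
% the lattice-type case and conjectures there are none in general.
\sum_{k=1}^{N} c_k\, e^{2\pi i b_k t}\, f(t + a_k) \;=\; 0 \quad \text{a.e.}
```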

81 citations


Journal ArticleDOI
R.K. Saha1
TL;DR: In this paper, an analysis of a kinematic state vector fusion algorithm when tracks are obtained from dissimilar sensors is described, and it is shown that the performance of such a track-to-track fusion algorithm can be improved if the cross-correlation matrix between candidate tracks is positive.
Abstract: An analysis is described of a kinematic state vector fusion algorithm when tracks are obtained from dissimilar sensors. For the sake of simplicity, it is assumed that two dissimilar sensors are equipped with nonidentical two-dimensional optimal linear Kalman filters. It is shown that the performance of such a track-to-track fusion algorithm can be improved if the cross-correlation matrix between candidate tracks is positive. This cross-correlation is introduced by noise associated with target maneuver that is common to the tracking filters in both sensors and is often neglected. An expression for the steady state cross-correlation matrix in closed form is derived and conditions for positivity of the cross-correlation matrix are obtained. The effect of positivity on performance of kinematic track-to-track fusion is also discussed.
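As a rough numerical illustration (not the paper's closed-form steady-state derivation), the sketch below applies the standard track-to-track fusion rule that accounts for the cross-covariance between the two track errors; setting the cross-covariance to zero recovers the naive independent fusion that the abstract notes is often assumed. The state dimensions and numbers are made up.

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2, P12):
    """Fuse two track estimates of the same target state.

    Track-to-track fusion with cross-covariance P12 between the two
    tracking errors (P12 = 0 gives the usual 'independent tracks' fusion).
    """
    S = P1 + P2 - P12 - P12.T          # covariance of the track difference
    K = (P1 - P12) @ np.linalg.inv(S)
    x_fused = x1 + K @ (x2 - x1)
    P_fused = P1 - K @ (P1 - P12.T)
    return x_fused, P_fused

# Illustrative (position, velocity) example with a positive cross-covariance,
# e.g. induced by target-maneuver process noise common to both trackers.
x1 = np.array([100.0, 10.0]);  P1 = np.diag([25.0, 4.0])
x2 = np.array([103.0,  9.5]);  P2 = np.diag([36.0, 9.0])
P12 = np.array([[5.0, 0.0], [0.0, 1.0]])   # assumed, for illustration only

xf, Pf = fuse_tracks(x1, P1, x2, P2, P12)
print(xf, np.trace(Pf))
```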

80 citations


Proceedings ArticleDOI
01 Jul 1996
TL;DR: The time management component of the HLA that defines the means by which individual simulations (called federates) advance through time is described, to support interoperability among federates using different local time management mechanisms.
Abstract: Recently, a considerable amount of effort in the U.S. Department of Defense has been devoted to defining the High Level Architecture (HLA) for distributed simulations. This paper describes the time management component of the HLA that defines the means by which individual simulations (called federates) advance through time. Time management includes synchronization mechanisms to ensure event ordering when this is needed. The principal challenge of the time management structure is to support interoperability among federates using different local time management mechanisms such as that used in DIS, conservative and optimistic mechanisms developed in the parallel simulation community, and real-time hardware-in-the-loop simulations.
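As a rough illustration of the conservative side of such a time-management service, the toy coordinator below grants a federate's time-advance request only up to the minimum of the other federates' logical time plus lookahead, a much-simplified lower-bound-on-time-stamp rule. The class and method names are invented for illustration; this is not the HLA interface specification.

```python
class Federate:
    def __init__(self, name, logical_time=0.0, lookahead=1.0):
        self.name = name
        self.logical_time = logical_time
        self.lookahead = lookahead      # promise: no future events earlier than time + lookahead
        self.pending_request = None     # time this federate has asked to advance to

class TimeManager:
    """Toy conservative time-advance coordinator (simplified LBTS-style rule)."""

    def __init__(self, federates):
        self.federates = federates

    def lbts_for(self, fed):
        # Lower bound on the time stamp of any event this federate could still receive.
        return min(f.logical_time + f.lookahead for f in self.federates if f is not fed)

    def request_advance(self, fed, t):
        fed.pending_request = t
        self.try_grants()

    def try_grants(self):
        for fed in self.federates:
            if fed.pending_request is None:
                continue
            grant = min(fed.pending_request, self.lbts_for(fed))
            if grant > fed.logical_time:
                fed.logical_time = grant
                if grant >= fed.pending_request:
                    fed.pending_request = None      # request fully satisfied
                print(f"{fed.name} advanced to {grant:.1f}")

# Two federates with different lookaheads: A asks for 5.0 but can only be
# granted up to B's logical time + lookahead = 0.5 until B also advances.
a, b = Federate("A", 0.0, 2.0), Federate("B", 0.0, 0.5)
tm = TimeManager([a, b])
tm.request_advance(a, 5.0)
```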

78 citations


Journal ArticleDOI
TL;DR: By monitoring gene expression using quantitative stereology, the stages of hepatocarcinogenesis can be analyzed and quantified in sufficient detail so that the animal data can be utilized in biomathematical modeling to develop more accurate models for estimation of human cancer risks.
Abstract: A well characterized model of multistage carcinogenesis is that of hepatocarcinogenesis in the rat. The histopathology as well as the cell and molecular biology of the stages of initiation, promotion, and progression have been elucidated to varying degrees in this system. Putatively single initiated hepatocytes are identified by their expression of the ubiquitous marker of hepatocarcinogenesis, glutathione-S-transferase π (GSTP). 0.5–1.0 × 10^6 GSTP-positive "initiated" hepatocytes developed within 14 days after initiation with a subcarcinogenic dose of diethylnitrosamine (DEN). Approximately 1% of these cells develop clonally into altered hepatic foci (AHF) in animals administered promoting agents, such as phenobarbital, chronically for 4-8 mo. Hepatocytes within AHF during the stage of promotion exhibit normal diploid karyotypes but various phenotypes depending on the chemical nature of the promoting agent. Continued administration of the promoting agent results in the infrequent development of hepatocel...

76 citations


Journal ArticleDOI
TL;DR: A prototype device for the entry of fingerprints that uses a waveguide hologram as part of the scheme for illuminating the finger is described, enabling reduction in the size, weight, and energy consumption of a fingerprint entry device (or live scan device).
Abstract: The recording and entry of fingerprints into the local and national databases is increasingly relying on optics to simplify and speed up this process. A prototype device for the entry of fingerprints that uses a waveguide hologram as part of the scheme for illuminating the finger is described. The use of the waveguide hologram enables reduction in the size, weight, and energy consumption of a fingerprint entry device (or live scan device). The entry device then becomes highly portable and thus useful in many office and field applications. Coupled with electronic or optical processing and storage along with telephone or radio transmission of the captured fingerprints, rapid identification of individuals becomes a realizable goal. The components required for the prototype holographic fingerprint entry device (HoloFED), including the light source, the illumination waveguide hologram, the imaging system, and the storage and processing system, are discussed. Examples of fingerprints captured are shown. The trade-offs necessary for the implementation of the prototype are object-to-image distance, optical efficiency, weight, and cost. © 1996 Society of Photo-Optical Instrumentation Engineers.

67 citations


Proceedings ArticleDOI
17 Sep 1996
TL;DR: It is hoped that proponents of different analysis techniques will offer algorithms for compiling this language into whatever form they require, to go a long way toward ensuring that the assumptions made by different techniques, as well as the analysis results, are comparable.
Abstract: CAPSL is a formal language for expressing authentication and key-exchange protocols. It is intended to capture enough of the abstract features of these protocols to perform an analysis for protocol failures. The impetus for such a language grew out of project work in protocol analysis. A common protocol specification language seems necessary to bridge the gap between the typical informal presentations of protocols given in papers and the precise characterizations required to conduct formal analysis. It is hoped that proponents of different analysis techniques will offer algorithms for compiling this language into whatever form they require. Doing so will go a long way toward ensuring that the assumptions made by different techniques, as well as the analysis results, are comparable. Since Denning and Sacco published a replay attack on the Needham-Schroeder protocol in 1981, it has been well known that protocols for exchanging cryptographic keys over data networks can be vulnerable to message modification attacks. The abundance of flaws in published protocols led to the development of formal techniques for their security analysis. The proposed techniques, as represented by some of the earlier papers on the subject, include the use of goal-directed state search tools implemented in Prolog, the application of general purpose specification and verification tools, a specially-designed logic of belief, and the application of a model-checking tool for CSP specifications. It has become evident that it was difficult for analysts other than the developers of the various techniques to apply them. One reason for this difficulty is the fact that the protocols had to be re-specified for each technique, and it was not easy to transform the published description of the protocol into the required formal system. Some tool developers began work on translators or compilers that would perform the transformation automatically. The input to any such translator still requires a formally-defined language, but it can be made similar to the message-oriented protocol descriptions that are typically published. Besides our initial work on CAPSL for the Interrogator at MITRE, there were independent efforts by Steve Brai and Gavin Lowe, with a similar language, CASPER, for the application of FDR using a CSP model-checking approach. The idea of having a single common protocol specification language that could be used as the input format for any formal analysis technique was first presented at the 1996 Isaac Newton Institute Programme on Computer Security, Cryptology, and Coding Theory. The design of CAPSL is still in progress. Current documentation for the language, and discussions on design alternatives and extensions, may be found at the CAPSL home page on the World-Wide Web, at the URL http://www.mitre.org/research/capsl.
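CAPSL syntax itself is not reproduced in this abstract, so the sketch below only illustrates the general idea of a message-oriented protocol description that a translator could compile for different analysis back ends. The data layout, field names, and goals are invented for illustration (a toy three-message, Needham-Schroeder-style public-key exchange); this is not actual CAPSL.

```python
# A message-oriented description of a simplified public-key exchange, written
# as plain data so that different back ends could, in principle, translate it
# into their own analysis formalisms.
PROTOCOL = {
    "name": "NSPK-toy",
    "roles": ["A", "B"],
    "nonces": {"Na": "A", "Nb": "B"},          # nonce -> originating role
    "messages": [
        # (step, sender, receiver, payload encrypted under receiver's public key)
        (1, "A", "B", {"enc_for": "B", "fields": ["Na", "A"]}),
        (2, "B", "A", {"enc_for": "A", "fields": ["Na", "Nb"]}),
        (3, "A", "B", {"enc_for": "B", "fields": ["Nb"]}),
    ],
    "goals": [
        ("secrecy", "Nb"),
        ("authenticates", "B", "A"),           # B authenticates A at the end of the run
    ],
}

def to_informal_notation(spec):
    """Render the spec in the 'A -> B : {Na, A}pk(B)' style typically published."""
    lines = []
    for step, sender, receiver, payload in spec["messages"]:
        body = ", ".join(payload["fields"])
        lines.append(f"{step}. {sender} -> {receiver} : {{{body}}}pk({payload['enc_for']})")
    return "\n".join(lines)

print(to_informal_notation(PROTOCOL))
```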

66 citations



Proceedings Article
04 Aug 1996
TL;DR: The experiments demonstrate the importance of a generalization hierarchy and the promise of combining natural language processing techniques with machine learning (ML) to address an information retrieval (IR) problem.
Abstract: As more information becomes available electronically, tools for finding information of interest to users becomes increasingly important. The goal of the research described here is to build a system for generating comprehensible user profiles that accurately capture user interest with minimum user interaction. The research described here focuses on the importance of a suitable generalization hierarchy and representation for learning profiles which are predictively accurate and comprehensible. In our experiments we evaluated both traditional features based on weighted term vectors as well as subject features corresponding to categories which could be drawn from a thesaurus. Our experiments, conducted in the context of a content-based profiling system for on-line newspapers on the World Wide Web (the IDD News Browser), demonstrate the importance of a generalization hierarchy and the promise of combining natural language processing techniques with machine learning (ML) to address an information retrieval (IR) problem.
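A minimal sketch of the kind of feature generalization the abstract describes: raw terms are mapped up a small thesaurus hierarchy into subject categories before a simple, readable profile is learned. The hierarchy, the articles, and the count-based profile rule are all illustrative assumptions, not the IDD News Browser's actual representation or learners.

```python
from collections import Counter

# Toy thesaurus: term -> subject category (one level of generalization).
THESAURUS = {
    "touchdown": "sports", "quarterback": "sports", "playoffs": "sports",
    "senate": "politics", "ballot": "politics", "treaty": "politics",
    "earnings": "business", "merger": "business", "stocks": "business",
}

def generalize(tokens):
    """Replace raw terms with thesaurus categories where possible."""
    return [THESAURUS.get(t, t) for t in tokens]

def learn_profile(labeled_articles, min_votes=2):
    """Comprehensible profile: features appearing in >= min_votes interesting articles."""
    votes = Counter()
    for tokens, interesting in labeled_articles:
        if interesting:
            votes.update(set(generalize(tokens)))
    return {feat for feat, n in votes.items() if n >= min_votes}

def predict(profile, tokens):
    return any(f in profile for f in generalize(tokens))

training = [
    ("quarterback threw a late touchdown".split(), True),
    ("playoffs start next week".split(), True),
    ("senate debates new treaty".split(), False),
]
profile = learn_profile(training)
print(profile)                                             # {'sports'}
print(predict(profile, "touchdown in overtime".split()))   # True
```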

Journal ArticleDOI
J.L. Leva1
TL;DR: In this article, the system of pseudo-range equations is shown to be equivalent to a system of two linear equations together with a range-difference equation and a pseudo-range equation, and the user's position is represented as the intersection of two planes and a hyperbola branch of revolution.
Abstract: In the four satellite Global Positioning System (GPS) problem, the system of pseudo-range equations is shown to be equivalent to a system of two linear equations together with a range-difference equation and a pseudo-range equation. The formulation represents the user's position as the intersection of two planes and a hyperbola branch of revolution. The formulation is three-dimensional and includes almost all degenerate and special case geometries. It provides geometric insight into the characteristics of the solutions and resolves existence and uniqueness questions regarding solution of the pseudo-range equations.
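For concreteness, the conventional four-satellite pseudo-range system is restated below; the notation (x the user position, s_i the satellite positions, b the receiver clock offset expressed as a distance) is assumed here, not taken from the paper. The paper's contribution, per the abstract, is rearranging this system so that two of the constraints become planes, giving the plane-plane-hyperbola-branch intersection picture.

```latex
% Pseudo-range measurement to satellite i:
\rho_i \;=\; \lVert \mathbf{x} - \mathbf{s}_i \rVert + b, \qquad i = 1,\dots,4 .

% Differencing two measurements removes the clock term and constrains x to one
% branch of a hyperboloid of revolution about the line joining the two satellites:
\rho_i - \rho_j \;=\; \lVert \mathbf{x} - \mathbf{s}_i \rVert - \lVert \mathbf{x} - \mathbf{s}_j \rVert .
```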

Proceedings ArticleDOI
Stark1
04 Nov 1996
TL;DR: Attributes of both the software maintenance process and the resulting product were measured to direct management and engineering attention toward improvement areas, track the improvement over time, and help make choices among alternatives.
Abstract: Software maintenance is central to the mission of many organizations. Thus, it is natural for managers to characterize and measure those aspects of products and processes that seem to affect the cost, schedule, quality and functionality of software maintenance delivery. This paper answers basic questions about software maintenance for a single organization and discusses some of the decisions made based on the answers. Attributes of both the software maintenance process and the resulting product were measured to direct management and engineering attention toward improvement areas, track the improvement over time, and help make choices among alternatives.

Book ChapterDOI
01 Jan 1996
TL;DR: This work focuses on supporting software maintenance/evolution activities through architectural recovery tools that are based on reverse engineering technology, and extracts architecture-level descriptions linked to the source code fragments that implement architectural features.
Abstract: Recovery of higher level design information and the ability to create dynamic software documentation is crucial to supporting a number of program understanding activities. Software maintainers look for standard software architectural structures (e.g., interfaces, interprocess communication, layers, objects) that the code developers had employed. Our goals center on supporting software maintenance/evolution activities through architectural recovery tools that are based on reverse engineering technology. Our tools start with existing source code and extract architecture-level descriptions linked to the source code fragments that implement architectural features. Recognizers (individual source code query modules used to analyze the target program) are used to locate architectural features in the source code. We also report on representation and organization issues for the set of recognizers that are central to our approach.

Proceedings ArticleDOI
S. Bayer1
03 Oct 1996
TL;DR: This research is part of the work to establish the infrastructure to create Web hosted versions of prototype multimodal interfaces, both intelligent and otherwise, and discusses the approach to several aspects of this goal.
Abstract: We describe work in progress at the MITRE Corporation on embedding speech enabled interfaces in Web browsers. This research is part of our work to establish the infrastructure to create Web hosted versions of prototype multimodal interfaces, both intelligent and otherwise. Like many others, we believe that the Web is the best potential delivery and distribution vehicle for complex software applications, and that the functionality of these Web hosted applications should match the functionality available in standalone applications. We discuss our approach to several aspects of this goal.

Journal ArticleDOI
TL;DR: A history of the development of the FDE requirements, algorithms, and test procedures, as well as a summary of the main issues addressed by the GPS Integrity (FDE) Working Group are provided.
Abstract: Working Group 5 of RTCA Special Committee (SC)-159 has spent the past several years working on the problem of GPS fault detection and exclusion (FDE) for a primary-means navigation system that can support oceanic en route through nonprecision approach modes of flight. The results of this working group's activities have been incorporated into RTCA/DO-229, Minimum Operational Performance Standards (MOPS) for GPS/WAAS Airborne Equipment, and will also serve as a baseline for the GPS/GLONASS and GPS/INS MOPS. The purpose of this paper is to provide a history of the development of the FDE requirements, algorithms, and test procedures, as well as a summary of the main issues addressed by the GPS Integrity (FDE) Working Group.
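The MOPS FDE algorithms themselves are not given in this listing; the sketch below shows only a generic snapshot fault-detection test of the kind discussed in the RAIM literature: a least-squares position solution followed by a residual sum-of-squares statistic compared against a detection threshold. The geometry, noise level, and threshold value are illustrative assumptions, not the DO-229 requirements.

```python
import numpy as np

def detect_fault(G, rho, sigma=1.0, threshold=9.0):
    """Generic snapshot residual test (illustrative, not the DO-229 FDE algorithm).

    G   : n x 4 linearized geometry matrix (unit line-of-sight vectors + clock column)
    rho : n-vector of linearized pseudo-range residuals (measured - predicted)
    """
    x_hat, *_ = np.linalg.lstsq(G, rho, rcond=None)   # least-squares navigation solution
    r = rho - G @ x_hat                               # post-fit residuals
    # Normalized sum of squared errors; with n measurements and 4 unknowns it is
    # chi-square with n - 4 degrees of freedom under the no-fault hypothesis.
    sse = float(r @ r) / sigma**2
    return sse > threshold, sse

# Six-satellite example with an injected bias on one pseudo-range.
rng = np.random.default_rng(0)
los = rng.normal(size=(6, 3))
los /= np.linalg.norm(los, axis=1, keepdims=True)     # unit line-of-sight vectors
G = np.hstack([los, np.ones((6, 1))])                 # clock-bias column
rho = G @ np.zeros(4) + rng.normal(scale=1.0, size=6)
rho[2] += 30.0                                        # simulated faulty satellite
print(detect_fault(G, rho))                           # typically (True, large statistic)
```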

Proceedings ArticleDOI
05 Aug 1996
TL;DR: This work presents a novel approach to parsing phrase grammars based on Eric Brill's notion of rule sequences, which has somewhat less power than a finite-state machine, and yet achieves high accuracy on standard phrase parsing tasks.
Abstract: We present a novel approach to parsing phrase grammars based on Eric Brill's notion of rule sequences. The basic framework we describe has somewhat less power than a finite-state machine, and yet achieves high accuracy on standard phrase parsing tasks. The rule language is simple, which makes it easy to write rules. Further, this simplicity enables the automatic acquisition of phrase-parsing rules through an error-reduction strategy.
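A minimal sketch in the spirit of rule-sequence phrase parsing: an ordered list of context rules is applied left to right over part-of-speech tags, with later rules correcting the output of earlier ones. The tag set, the rules, and the chunk-label scheme here are invented for illustration and are not the paper's rule language.

```python
# Each rule fires on a (previous tag, current tag) context and proposes a chunk action.
# Rules are applied in a fixed sequence; later rules override earlier ones, in the
# error-driven spirit borrowed from Brill-style rule sequences.
RULES = [
    # (description, condition on (prev_tag, tag), action)
    ("start NP at a determiner",         lambda p, t: t == "DT",                       "OPEN"),
    ("start NP at a noun after a verb",  lambda p, t: t == "NN" and p == "VB",         "OPEN"),
    ("noun continues NP after det/adj",  lambda p, t: t == "NN" and p in ("DT", "JJ"), "INSIDE"),
    ("adjective inside an open NP",      lambda p, t: t == "JJ" and p == "DT",         "INSIDE"),
]

def chunk(tagged):
    """tagged: list of (word, tag); returns list of (word, tag, chunk_label)."""
    out, prev = [], None
    for word, tag in tagged:
        label = "O"
        for _, cond, action in RULES:
            if cond(prev, tag):
                label = "B-NP" if action == "OPEN" else "I-NP"
        out.append((word, tag, label))
        prev = tag
    return out

sent = [("the", "DT"), ("quick", "JJ"), ("fox", "NN"), ("saw", "VB"), ("dogs", "NN")]
for w, t, c in chunk(sent):
    print(f"{w:6s} {t:3s} {c}")      # brackets [the quick fox] saw [dogs]
```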

Proceedings ArticleDOI
TL;DR: Processing results of representative MCARM data files are presented to demonstrate that the adaptive processing can help to detect targets in a nonhomogeneous environment.
Abstract: Rome Laboratory's (RL) Multichannel Airborne Radar Measurements (MCARM) program has collected clutter and target data employing an L-band airborne phased array radar testbed. The data collection is at the output of an electronically steered active array mounted on a BAC1-11 aircraft. The MCARM array has 16 columns, each consisting of two four-element subarrays. Each subarray has its own output or is combined into a single output per column with up to 24 outputs for the array. Space-time adaptive processing (STAP) techniques simultaneously combine the signals from the elements of an array antenna and pulses of a radar waveform to suppress interference and provide target detection. To obtain adequate clutter power with the limited power-aperture product of the array for STAP analysis, the transmit mainlobe power can be focused in the receive beam sidelobe region or the transmit beam can be spoiled for broader angular coverage. In addition, the data is collected at different platform altitudes and radar waveforms over different terrain. The performance improvement against Doppler-spread clutter that can be achieved with STAP using the L-band data is demonstrated. We present processing results of representative MCARM data files to demonstrate that the adaptive processing can help to detect targets in a nonhomogeneous environment. The limitations of the number of independent clutter samples in estimating the covariance matrix and its impact on target cancellation are examined by processing files with real and synthetic targets.
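A compact sketch of the core STAP computation the abstract alludes to: a sample covariance matrix estimated from secondary (training) range cells and the adaptive weight vector w = R⁻¹s / (sᴴR⁻¹s) for a space-time steering vector s (sample matrix inversion with a little diagonal loading). The array size, pulse count, and data below are synthetic placeholders, not MCARM parameters.

```python
import numpy as np

def stap_weights(training_snapshots, steering):
    """Sample-matrix-inversion STAP weights.

    training_snapshots : K x (N*M) complex matrix of secondary-cell snapshots
                         (N spatial channels, M pulses, stacked space-time).
    steering           : (N*M,) complex space-time steering vector.
    """
    K, NM = training_snapshots.shape
    R_hat = training_snapshots.conj().T @ training_snapshots / K        # sample covariance
    R_hat += 1e-3 * np.trace(R_hat).real / NM * np.eye(NM)              # diagonal loading
    w_un = np.linalg.solve(R_hat, steering)
    return w_un / (steering.conj() @ w_un)                              # unit gain on target

# Synthetic example: 8 spatial channels x 16 pulses, 3*N*M training snapshots of noise.
N, M = 8, 16
rng = np.random.default_rng(1)
K = 3 * N * M
snapshots = (rng.normal(size=(K, N * M)) + 1j * rng.normal(size=(K, N * M))) / np.sqrt(2)
angle, doppler = 0.1, 0.25                      # normalized spatial / Doppler frequencies (assumed)
a = np.exp(2j * np.pi * angle * np.arange(N))   # spatial steering
b = np.exp(2j * np.pi * doppler * np.arange(M)) # temporal (Doppler) steering
s = np.kron(b, a)                               # space-time steering vector
w = stap_weights(snapshots, s)
print(abs(w.conj() @ s))                        # ~1.0: unit response in the target direction
```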

Patent
15 Mar 1996
TL;DR: In this paper, a horizontal miss distance filter system is provided for inhibiting resolution alert messages from an air traffic alert and collision avoidance system (210) to a pilot's display.
Abstract: A horizontal miss distance filter system (220) is provided for inhibiting resolution alert messages from an air traffic alert and collision avoidance system (210) to a pilot's display (230). The horizontal miss distance filter employs a parabolic range tracker (10) to derive a range acceleration estimate (11) utilized to discriminate intruder aircraft (110) having non-zero horizontal miss distances. The horizontal miss distance calculated from the range data provided by the parabolic range tracker is compared with a bearing-based horizontal miss distance provided by a bearing-based tracker (22). The smaller of the two calculated horizontal miss distances defines a projected horizontal miss distance which is compared with a threshold value. Any resolution alert for intruder aircraft whose projected horizontal miss distance is greater than the threshold will be inhibited unless it is determined that the encounter involves a maneuver of one of the aircraft. As many as five maneuver detectors (50, 52, 56, 58 and 64) may be employed to assess whether the encounter involves a maneuver. If any of the maneuver detectors establish the occurrence of a maneuver, then a resolution alert provided from the TCAS system (210) will not be inhibited.
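A toy numerical sketch of the idea: fit a parabola to recent range measurements to estimate range acceleration, convert that to a projected miss distance, and inhibit the alert when the projection exceeds a threshold and no maneuver is detected. The fit length, threshold value, and the constant-velocity miss-distance identity used here (r·r̈ = v² − ṙ², m² = r̈·r³/v²) are illustrative assumptions, not the patented filter.

```python
import numpy as np

def range_acceleration(times, ranges):
    """Parabolic (quadratic) fit to recent range measurements.
    Returns (r, r_dot, r_ddot) evaluated at the latest sample time."""
    c2, c1, c0 = np.polyfit(times - times[-1], ranges, 2)   # shift so t = 0 is 'now'
    return c0, c1, 2.0 * c2

def projected_miss_distance(r, r_dot, r_ddot):
    """For a constant-velocity encounter, v^2 = r_dot^2 + r*r_ddot and
    the horizontal miss distance m satisfies m^2 = r_ddot * r^3 / v^2."""
    v_sq = r_dot**2 + r * r_ddot
    if v_sq <= 0.0 or r_ddot <= 0.0:
        return 0.0                      # degenerate geometry: do not inhibit
    return np.sqrt(r_ddot * r**3 / v_sq)

def inhibit_alert(times, ranges, threshold_m=1200.0, maneuver_detected=False):
    r, r_dot, r_ddot = range_acceleration(np.asarray(times), np.asarray(ranges))
    return (not maneuver_detected) and projected_miss_distance(r, r_dot, r_ddot) > threshold_m

# Synthetic constant-velocity encounter: 2000 m miss distance, 150 m/s closure,
# closest approach at t = 40 s; samples taken at t = 0..9 s.
t = np.arange(0.0, 10.0, 1.0)
true_r = np.sqrt(2000.0**2 + (150.0 * (t - 40.0))**2)
print(inhibit_alert(t, true_r))         # True: projected miss well above 1200 m
```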

Journal ArticleDOI
TL;DR: This work compares and illustrates the ordinal ranking methods devised by Borda, Bernardo, Cook and Seiford, Kohler, and Arrow and Raynaud, and shows whether each method places the Condorcet winner in first place, ranks the alternatives according to theCondorcet order, and satisfies two principles of sequential independence.
Abstract: Given multiple criteria and multiple alternatives, the goal is to aggregate the criteria information and obtain an overall ranking of alternatives. An ordinal ranking method requires only that the rank order of the alternatives be known for each criterion. We compare and illustrate the ordinal ranking methods devised by Borda, Bernardo, Cook and Seiford, Kohler, and Arrow and Raynaud. We show whether each method places the Condorcet winner (if it exists) in first place, ranks the alternatives according to the Condorcet order (if it exists), and satisfies two principles of sequential independence. We also consider the application of these methods to cost and operational effectiveness analyses (COEAs). © 1996 John Wiley & Sons, Inc.
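As a small worked illustration of two of the notions compared in the paper, the sketch below computes a Borda aggregate ranking over a toy set of criterion rankings and checks for a Condorcet winner. The alternatives and rankings are made up; the other methods (Bernardo, Cook and Seiford, Kohler, Arrow and Raynaud) are not shown.

```python
# Each criterion contributes a ranking (best first) of the same alternatives.
rankings = [
    ["A", "B", "C", "D"],
    ["B", "A", "C", "D"],
    ["C", "A", "B", "D"],
]

def borda(rankings):
    """Borda count: an alternative gets n-1 points for 1st place, n-2 for 2nd, ..."""
    n = len(rankings[0])
    scores = {alt: 0 for alt in rankings[0]}
    for r in rankings:
        for pos, alt in enumerate(r):
            scores[alt] += n - 1 - pos
    return sorted(scores, key=scores.get, reverse=True), scores

def condorcet_winner(rankings):
    """An alternative that beats every other in pairwise majority, if one exists."""
    alts = rankings[0]
    def beats(x, y):
        wins = sum(r.index(x) < r.index(y) for r in rankings)
        return wins > len(rankings) / 2
    for x in alts:
        if all(beats(x, y) for y in alts if y != x):
            return x
    return None

order, scores = borda(rankings)
print("Borda order:", order, scores)            # A (7), B (6), C (5), D (0)
print("Condorcet winner:", condorcet_winner(rankings))   # A
```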

Proceedings ArticleDOI
03 Jan 1996
TL;DR: The Aggregate Level Simulation Protocol (ALSP), under the auspices of ADS, provides a mechanism for the integration of existing simulation models to support training via theater-level simulation exercises.
Abstract: The venerable problem-solving technique of simulation finds itself in the midst of a revolution. Where once it was regarded as a "technique of last resort" for systems analysis, today simulation is widely applied to support myriad purposes, including: training, interaction, visualization, hardware testing and decision support in real-time. Advanced distributed simulation (ADS) is the US Department of Defense (DoD) nomenclature used to describe the cooperative utilization of physically distributed simulations toward a common objective. The Aggregate Level Simulation Protocol (ALSP), under the auspices of ADS, provides a mechanism for the integration of existing simulation models to support training via theater-level simulation exercises. Consisting of a collection of infrastructure software and protocols for both inter-model communication through a common interface and time advance using a conservative Chandy-Misra based algorithm, the ALSP has supported an evolving "confederation of models" since 1992. A review of the history and design of ALSP is presented and serves to outline directions for future investigation.

Journal ArticleDOI
TL;DR: It is demonstrated that PB selectively promotes initiated cells with reduced levels of TGFβ types I-III receptors, suggesting a mechanistic role for TGFβ in PB-induced liver tumor promotion.
Abstract: Phenobarbital (PB) is a potent tumor promoter in rodent liver. In this study we investigated whether PB selectively promotes a population of initiated cells with reduced levels of transforming growth factor-β (TGFβ) receptors types I, II and III. Liver tumors were induced in male Fischer F344 rats by diethylnitrosamine (DEN). Following induction the animals were divided into PB-treated (DEN/PB) and untreated groups (DEN). After 3 months of treatment half of the PB-treated rats were removed from PB for the final month (DEN/PB/OFF). At 4 months, the livers from rats in the three treatment groups were removed, tumors excised and frozen with matched surrounding normal tissue. The mRNA levels for the TGFβ receptors types I-III were significantly decreased in tumor tissue from DEN/PB rats when compared with surrounding normal liver tissue or tumors from age-matched untreated controls. In tumors from DEN/PB/OFF rats the TGFβ receptor types I-III were also significantly reduced compared with controls and not different to tumors from DEN/PB rats. There was no difference in the mRNA levels for the TGFβ receptors in tumors from rats exposed to DEN alone, when compared with the surrounding normal tissue. These results demonstrate that PB selectively promotes initiated cells with reduced levels of TGFβ types I-III receptors and suggest a mechanistic role for TGFβ in PB-induced liver tumor promotion. Approximately 60% of the chemicals determined by the National Toxicology Program to be carcinogenic in rats and mice give rise to liver tumors. Some of these carcinogens, however, are either only weakly genotoxic or have been found to cause no detectable genetic damage. Rather, they appear to function as tumor promoting agents. These agents include chemicals to which humans are exposed, e.g. contraceptive steroids (1,2), tamoxifen (3), benzodiazepine compounds (4), dioxin (5), and phenobarbital (PB) (6). Although there is no definitive mechanism for the carcinogenic activity of these diverse agents, one hypothesis involves reduced responsiveness to negative growth signals, especially the potent mitoinhibitor transforming growth factor-β (TGFβ) (7). In mammals TGFβ exists as three highly homologous isoforms, TGFβ1, TGFβ2 and TGFβ3 (8). These structurally

Journal ArticleDOI
TL;DR: An archive of satellite and aircraft photographs of the western Sudan showed no long-term (1943-1994) trends in the abundance of trees despite several decades of recent drought in this region, as mentioned in this paper.
Abstract: An archive of satellite and aircraft photographs of the western Sudan showed no long-term (1943-1994) trends in the abundance of trees despite several decades of recent drought in this region. These data extend the extant historical record of vegetation change in the African Sahel, where recent fluctuations in vegetation greenness have been monitored with the NOAA Advanced Very High Resolution Radiometer since 1980. Despite substantial population turnover, woody vegetation is not yet indicative of the recent climate changes in this region.

Proceedings ArticleDOI
13 May 1996
Abstract: The suitability of ultra-wideband ground-penetrating radar as a tool for the detection of buried metallic mines is explored in this paper. The analysis centers around a 200-800 MHz, dual-polarized ground penetrating radar (GPR) designed and built by SRI International. The analysis consisted of fusing the images from the dual polarizations into a single image to enhance the target objects and suppress clutter. Results are shown for several variations of a Mahalanobis-based fusion technique, and "soft decision" minefield detection results based upon Monte Carlo statistical techniques are also presented. Although relatively few scenes were analyzed, these results show that the dual-polarized GPR is potentially very effective at finding buried mines and minefields.
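A minimal numpy sketch of one plausible reading of "Mahalanobis-based fusion": treat the two co-registered polarization values at each pixel as a 2-vector and score each pixel by its squared Mahalanobis distance from the background (clutter) statistics. The synthetic data, global background estimate, and threshold are invented; the paper's specific fusion variants are not described in this abstract.

```python
import numpy as np

def mahalanobis_fusion(img_a, img_b):
    """Fuse two co-registered polarization images into one anomaly-score image."""
    stack = np.stack([img_a.ravel(), img_b.ravel()])      # 2 x Npixels
    mu = stack.mean(axis=1, keepdims=True)                # global background mean
    cov_inv = np.linalg.inv(np.cov(stack))                # 2 x 2 background covariance
    d = stack - mu
    scores = np.einsum('ik,ij,jk->k', d, cov_inv, d)      # squared Mahalanobis distance per pixel
    return scores.reshape(img_a.shape)

# Synthetic 64x64 correlated-clutter scene with one buried-object-like anomaly.
rng = np.random.default_rng(2)
hh = rng.normal(0.0, 1.0, (64, 64))
vv = 0.6 * hh + rng.normal(0.0, 0.8, (64, 64))
hh[30:34, 30:34] += 6.0                                   # anomaly, stronger in one polarization
vv[30:34, 30:34] += 4.0

fused = mahalanobis_fusion(hh, vv)
hits = fused > 16.0                                       # illustrative threshold
print(hits[30:34, 30:34].sum(), "of 16 anomaly pixels flagged;",
      hits.sum() - hits[30:34, 30:34].sum(), "false alarms")
```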

01 Jan 1996
TL;DR: The goal of the research described here is to build a system for generating comprehensible user profiles that accurately capture user interest with minimum user interaction, and demonstrates the importance of a generalization hierarchy in high predictive accuracy, precision and recall, and stability of learning.
Abstract: As more information becomes available electronically, tools for finding information of interest to users become increasingly important. Building tools for assisting users in finding relevant information is often complicated by the difficulty in articulating user interest in a form that can be used for searching. The goal of the research described here is to build a system for generating comprehensible user profiles that accurately capture user interest with minimum user interaction. Machine learning methods offer a promising approach to solving this problem. The research described here focuses on the importance of a suitable generalization hierarchy and representation for learning profiles which are predictively accurate and comprehensible. In our experiments using AQ15c and C4.5 we evaluated both traditional features based on weighted term vectors as well as subject features corresponding to categories which could be drawn from a thesaurus. Our experiments, conducted in the context of a content-based profiling system for on-line newspapers on the World Wide Web (the IDD News Browser), demonstrate the importance of a generalization hierarchy in obtaining high predictive accuracy, precision and recall, and stability of learning.

01 Jan 1996
TL;DR: In this paper, the authors provide an overview of transaction processing needs and solutions in conventional DBMSs as background, explains the constraints introduced by multilevel security, and then describes the results of research in multi-level secure transaction processing.
Abstract: Since 1990, transaction processing in multilevel secure database management systems (DBMSs) has been receiving a great deal of attention from the database research community. Transaction processing in these systems requires modification of conventional scheduling algorithms and commit protocols. These modifications are necessary because preserving the usual transaction properties when transactions are executing at different security levels often conflicts with the enforcement of the security policy. Considerable effort has been devoted to the development of efficient, secure algorithms for the major types of secure DBMS architectures: kernelized, replicated, and distributed. An additional problem that arises uniquely in multilevel secure DBMSs is that of secure, correct execution when data at multiple security levels must be written within one transaction. Significant progress has been made in a number of these areas, and a few of the techniques have been incorporated into commercial trusted DBMS products. However, many open problems remain to be explored. This paper reviews the achievements to date in transaction processing for multilevel secure DBMSs. The paper provides an overview of transaction processing needs and solutions in conventional DBMSs as background, explains the constraints introduced by multilevel security, and then describes the results of research in multilevel secure transaction processing. Research results and limitations in concurrency control, multilevel transaction management, and secure commit protocols are summarized. Finally, important new areas are identified for secure transaction processing research.

Book ChapterDOI
10 Jun 1996
TL;DR: This paper briefly introduces the method of developing and testing a practical software architecture method and describes the experiences with its “alpha” and “beta” applications to two U.S. Army management information systems.
Abstract: Software architecture has come to be recognized as a discipline distinct from software design. Over the past five years, we have been developing and testing a practical software architecture method at the MITRE Software Center. The method begins with an initial statement of system goals, the purchaser's vision for the system, and needs, an abstraction of the system's requirements. Multiple views of the system are then developed, to address specific architectural concerns. Each view is defined in terms of components, connections and constraints and validated against the needs. This paper briefly introduces the method and describes our experiences with its “alpha” and “beta” applications to two U.S. Army management information systems.

Journal ArticleDOI
TL;DR: A new technique for determining at design time, the utilization bound for a specific set of hard real-time periodic tasks with known periods, when scheduled by any arbitrary fixed priority algorithm is developed.
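The paper's own bound-derivation technique is not given in this listing, so the sketch below shows the standard design-time check such work builds on: exact response-time analysis for a fixed-priority periodic task set, alongside its utilization. The task set and priority order (rate monotonic) are illustrative assumptions.

```python
import math

def response_time_analysis(tasks):
    """Exact schedulability test for fixed-priority periodic tasks.

    tasks: list of (C, T) = (worst-case execution time, period), highest priority
    first; deadlines are assumed equal to periods.
    Returns worst-case response times, or None if some task misses its deadline.
    """
    responses = []
    for i, (C_i, T_i) in enumerate(tasks):
        R = C_i
        while True:
            interference = sum(math.ceil(R / T_j) * C_j for C_j, T_j in tasks[:i])
            R_next = C_i + interference
            if R_next > T_i:
                return None                      # deadline miss
            if R_next == R:
                responses.append(R)
                break
            R = R_next
    return responses

# Illustrative task set (C, T) in rate-monotonic priority order.
tasks = [(1.0, 4.0), (2.0, 6.0), (3.0, 13.0)]
util = sum(C / T for C, T in tasks)
print(f"utilization = {util:.3f}")               # ~0.814, above the generic Liu-Layland
                                                 # bound for 3 tasks (~0.780), yet schedulable
print("worst-case response times:", response_time_analysis(tasks))   # [1.0, 3.0, 10.0]
```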

Journal ArticleDOI
01 Feb 1996
TL;DR: An optimistic replica control algorithm is presented that ensures that the authorization table at any given site evolves consistently with respect to other sites, and shows how a site can prune its authorization log by the use of a matrix that records how current remaining sites in the system are.
Abstract: We consider the propagation of authorizations in distributed database systems. We present an optimistic replica control algorithm that ensures that the authorization table at any given site evolves consistently with respect to other sites. The motivation for using optimistic replica control to maintain authorizations is that site and communication failures do not needlessly delay authorization changes. In addition, the semantics of the authorization operations we employ can be exploited to resolve transient inconsistencies without the expense of an undo-redo mechanism. Instead, we give efficient, direct algorithms whereby a site scans its log of authorization requests and updates its authorization table correspondingly. From the system perspective, any inconsistencies in the authorization table replicas maintained at different sites are transient and are eliminated by further communication. We show how a site can prune its authorization log by the use of a matrix that records how current the remaining sites in the system are.
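The abstract does not spell out the pruning rule, so the sketch below only illustrates the general matrix-based idea it describes, in the spirit of classic replicated-log techniques: a log record of an authorization operation can be discarded once the "how current" matrix shows that every site has already received it. The record format, matrix layout, and operations are invented for illustration.

```python
# Sketch of matrix-based log pruning for replicated authorization tables.
# matrix[i][j] is site i's knowledge of the latest timestamp it has seen from
# operations originating at site j; exchanging these rows during gossip lets a
# site decide which log records every other site is guaranteed to hold already.

def prunable(record, matrix, sites):
    origin, ts, _op = record
    # Safe to discard once every site is known to have received the record.
    return all(matrix[s][origin] >= ts for s in sites)

def prune_log(log, matrix, sites):
    return [rec for rec in log if not prunable(rec, matrix, sites)]

sites = ["s1", "s2", "s3"]
log = [
    ("s1", 4, "grant(alice, read, T)"),
    ("s2", 7, "revoke(bob, write, T)"),
]
matrix = {
    "s1": {"s1": 9, "s2": 7, "s3": 2},
    "s2": {"s1": 4, "s2": 9, "s3": 2},
    "s3": {"s1": 4, "s2": 5, "s3": 6},
}
print(prune_log(log, matrix, sites))
# The grant from s1 at time 4 has reached every site and is pruned; the revoke
# from s2 at time 7 has not yet reached s3, so it is kept in the log.
```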

Book ChapterDOI
03 Aug 1996
TL;DR: Imps, an Interactive Mathematical Proof System, is intended to provide mechanical support for traditional mathematical techniques and styles of practice and is equally well-suited for applications in mathematics education and in the development of high assurance hardware and software.
Abstract: imps, an Interactive Mathematical Proof System, is intended to provide mechanical support for traditional mathematical techniques and styles of practice. The system consists of a library of axiomatic theories and a collection of tools for exploring and extending the mathematics embodied in the theory library. One of the chief tools is a facility for developing formal proofs. imps is equally well-suited for applications in mathematics education and in the development of high assurance hardware and software. The imps system is available without fee (under the terms of a public license) at the ftp site math.harvard.edu and at the following Web pages: