
Showing papers by "Mitre Corporation" published in 1994


Journal ArticleDOI
TL;DR: A theory of semantic values as a unit of exchange that facilitates semantic interoperability between heterogeneous information systems is provided and it is shown how semantic values can either be stored explicitly or be defined by environments.
Abstract: Large organizations need to exchange information among many separately developed systems. In order for this exchange to be useful, the individual systems must agree on the meaning of their exchanged data. That is, the organization must ensure semantic interoperability. This paper provides a theory of semantic values as a unit of exchange that facilitates semantic interoperability between heterogeneous information systems. We show how semantic values can either be stored explicitly or be defined by environments. A system architecture is presented that allows autonomous components to share semantic values. The key component in this architecture is called the context mediator, whose job is to identify and construct the semantic values being sent, to determine when the exchange is meaningful, and to convert the semantic values to the form required by the receiver. Our theory is then applied to the relational model. We provide an interpretation of standard SQL queries in which context conversions and manipulations are transparent to the user. We also introduce an extension of SQL, called Context-SQL (C-SQL), in which the context of a semantic value can be explicitly accessed and updated. Finally, we describe the implementation of a prototype context mediator for a relational C-SQL system.
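
The context mediator idea above, converting semantic values between the sender's and receiver's conventions, can be illustrated with a small sketch. The Python below is only a toy model of the concept (the class, property names, and conversion rule are hypothetical, not the paper's C-SQL implementation): a semantic value carries its context of properties, and a conversion routine plays the mediator's role of translating it into the receiver's context.

class SemanticValue:
    """A value plus the context (properties) that give it meaning."""
    def __init__(self, value, context):
        self.value = value
        self.context = context  # e.g. {"currency": "USD", "scale": 1000}

def convert(sv, target_context, fx_rates):
    """Toy context-mediator step: re-express a semantic value in the receiver's context."""
    value = sv.value * sv.context.get("scale", 1)            # undo the sender's scaling
    if sv.context["currency"] != target_context["currency"]:
        value *= fx_rates[(sv.context["currency"], target_context["currency"])]
    value /= target_context.get("scale", 1)                  # apply the receiver's scaling
    return SemanticValue(value, dict(target_context))

# Example: 5 (thousands of USD) exchanged with a system expecting single units of JPY.
sender = SemanticValue(5, {"currency": "USD", "scale": 1000})
received = convert(sender, {"currency": "JPY", "scale": 1}, {("USD", "JPY"): 110.0})
print(received.value)   # 550000.0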

371 citations


Journal ArticleDOI
TL;DR: In this paper, two approaches for describing the time-domain performance of an antenna are described: one uses the transfer function, a function which describes the amplitude and phase of the response over the entire frequency spectrum, and the other uses time-domain parameters, such as efficiency, energy pattern, receiving area, etc.
Abstract: Frequency-domain concepts and terminology are commonly used to describe antennas. These are very satisfactory for a CW or narrowband application. However, their validity is questionable for an instantaneous wideband excitation. Time-domain and/or wideband analyses can provide more insight and more effective terminology. Two approaches for this time-domain analysis have been described. The more complete one uses the transfer function, a function which describes the amplitude and phase of the response over the entire frequency spectrum. While this is useful for evaluating the overall response of a system, it may not be practical when trying to characterize an antenna's performance, and trying to compare it with that of other antennas. A more convenient and descriptive approach uses time-domain parameters, such as efficiency, energy pattern, receiving area, etc., with the constraint that the reference or excitation signal is known. The utility of both approaches, for describing the time-domain performance, was demonstrated for antennas which are both small and large, in comparison to the length of the reference signal. The approaches have also been used for other antennas, such as arrays, where they also could be applied to measure the effects of mutual impedance, for a wide-bandwidth signal. The time-domain ground-plane antenna range, on which these measurements were made, is suitable for symmetric antennas. However, the approach can be readily adapted to asymmetric antennas, without a ground plane, by using suitable reference antennas.
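
The transfer-function approach mentioned above amounts to taking the ratio of the received waveform's spectrum to that of the reference excitation. A minimal numerical sketch of that step is given below; it assumes uniformly sampled, equal-length waveforms and omits the windowing, gating, and calibration a real time-domain range would require.

import numpy as np

def transfer_function(reference, received, dt, eps=1e-12):
    """Estimate H(f) = Y(f) / X(f) from sampled time-domain waveforms (simplified sketch)."""
    X = np.fft.rfft(reference)                  # spectrum of the excitation
    Y = np.fft.rfft(received)                   # spectrum of the antenna response
    freqs = np.fft.rfftfreq(len(reference), dt)
    H = Y / (X + eps)                           # amplitude and phase over the band
    return freqs, H

# Time-domain parameters such as the energy pattern can then be formed from |H(f)|^2
# integrated over the band occupied by the reference signal.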

329 citations


Journal ArticleDOI
TL;DR: Three experimental methods have been developed to help apply formal methods to the security verification of cryptographic protocols of the sort used for key distribution and authentication, and all combine algebraic with state-transition approaches.
Abstract: Three experimental methods have been developed to help apply formal methods to the security verification of cryptographic protocols of the sort used for key distribution and authentication. Two of these methods are based on Prolog programs, and one is based on a general-purpose specification and verification system. All three combine algebraic with state-transition approaches. For purposes of comparison, they were used to analyze the same example protocol with a known flaw.

252 citations



Proceedings ArticleDOI
K.H. Kim
29 Jun 1994
TL;DR: The linear optimal fused estimate is a convex combination of remote estimates with weights given by the estimation confidences (covariances), and the covariance-based algorithm is most applicable where the track estimate is generated by a Kalman filter-based tracking system.
Abstract: This paper describes techniques for track level fusion of surveillance data that are applicable to existing and near term tactical surveillance systems. The linear optimal fused estimate is a convex combination of remote estimates with weights being the estimation confidences (covariances). The covariance-based algorithm is most applicable where the track estimate is generated by a Kalman filter-based tracking system. When track covariance is not available, such as in α-β tracking systems, an estimated covariance can be used for track fusion. In addition, track fusion also requires accounting for the cross covariance between tracks. Various approaches to estimating the auto covariances and the cross covariances are examined, and the performance is evaluated through computer simulations.
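
For concreteness, the convex combination of two track estimates weighted by their covariances can be sketched as below. This is only the simple fusion rule with the cross-covariance between the tracks neglected; handling (or estimating) that cross term, as the paper discusses, is what distinguishes a practical implementation.

import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    """Covariance-weighted convex combination of two track estimates (cross-covariance ignored)."""
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P_fused = np.linalg.inv(P1_inv + P2_inv)          # fused uncertainty
    x_fused = P_fused @ (P1_inv @ x1 + P2_inv @ x2)   # weights sum to the identity
    return x_fused, P_fused

# Example: two 2-state (position, velocity) estimates of the same target from different sensors.
x_f, P_f = fuse_tracks(np.array([10.0, 1.0]), np.diag([4.0, 1.0]),
                       np.array([11.0, 0.8]), np.diag([2.0, 2.0]))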

143 citations


Journal ArticleDOI
TL;DR: This paper proposes a belief-based semantics for secure databases, which provides a semantics for databases that can “lie” about the state of the world, or about their knowledge about the state of the world, in order to preserve security.
Abstract: The addition of stringent security specifications to the list of requirements for an application poses many new problems in DBMS design and implementation, as well as database design, use, and maintenance. Tight security requirements, such as those that result in silent masking or withholding of true information from a user or the introduction of false information into query answers, also raise fundamental questions about the meaning of the database and the semantics of accompanying query languages. In this paper, we propose a belief-based semantics for secure databases, which provides a semantics for databases that can “lie” about the state of the world, or about their knowledge about the state of the world, in order to preserve security. This kind of semantics can be used as a helpful retrofit for the proposals for a “multilevel secure” database model (a particularly stringent form of security), and may be useful for less restrictive security policies as well. We also propose a family of query languages for multilevel secure relational database applications, and base the semantics of those languages on our semantics for secure databases. Our query languages are free of the semantic problems associated with use of ordinary SQL in a multilevel secure context, and should be easy for users to understand and employ.

104 citations


Journal ArticleDOI
TL;DR: In this paper, the authors analyzed data from an experiment where two females and two males talked to each other in all possible pairs; a total of 60 minutes of conversation (six dyads) was transcribed.
Abstract: This article describes a preliminary experiment looking at possible differences in how females and males interact in conversation. The article analyzes data from an experiment where two females and two males talked to each other in all possible pairs; a total of 60 minutes of conversation (six dyads) was transcribed. The goal was to isolate quantifiable entities related to controlling or directing the conversation. We looked at issues such as who talked how much, how fluently or confidently – and how the two people in the conversation interacted in terms of interruptions and indications of support, agreement, or disagreement. The findings from such a small sample are only relevant to suggest hypotheses for further research. However, we noticed a number of interesting differences: the female speakers used more 1st person pronouns and fewer 3rd person references than the male speakers; the female speakers used mm hmm at a much higher frequency than the male speakers; the female speakers also interrupted each other more; and the female/female conversation seemed more fluent than the other conversations, as measured by number of disfluencies and number of affirmative transitions upon speaker change. All of these differences suggest that this area is a fruitful one for further investigation. (Conversational analysis, gender differences)
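
The quantities tallied in the study (first-person pronouns, backchannels such as mm hmm, amount of talk) are straightforward to count once a transcript is tokenized. The sketch below is a hypothetical illustration of such a tally; the word lists are made up for the example and are not the study's coding scheme.

from collections import Counter

FIRST_PERSON = {"i", "me", "my", "mine", "we", "us", "our"}   # illustrative list only
BACKCHANNELS = {"mm", "hmm", "mhm", "uh-huh", "yeah"}         # illustrative list only

def tally_features(utterances):
    """Count per-speaker features in a transcript given as (speaker, text) pairs."""
    counts = Counter()
    for speaker, text in utterances:
        counts[(speaker, "words")] += len(text.split())
        for token in text.lower().split():
            token = token.strip(".,?!")
            if token in FIRST_PERSON:
                counts[(speaker, "1st_person")] += 1
            if token in BACKCHANNELS:
                counts[(speaker, "backchannel")] += 1
    return counts

print(tally_features([("A", "I think we agreed"), ("B", "mm hmm, yeah")]))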

88 citations


Journal ArticleDOI
TL;DR: Pregnant New Zealand white rabbits were injected subcutaneously on gestational day 12 with a teratogenic dose of HU in the presence or absence of 550 mg/kg of D-mannitol (Man), a specific scavenger of hydroxyl free radicals, and the teratologic effects of HU were ameliorated by Man as evidenced by decreased incidences of the expected limb malformations.
Abstract: Hydroxyurea (HU) is a potent mammalian teratogen. Within 2–4 hours after maternal injection, HU causes 1) a rapid episode of embryonic cell death and 2) profound inhibition of embryonic DNA synthesis. A variety of antioxidants delays the onset of embryonic cell death and reduces the incidence of birth defects. Antioxidants do not block the inhibition of DNA synthesis, indicating that early embryonic cell death is not caused by inhibited DNA synthesis. We have suggested that some HU molecules may react within the embryo to produce H2O2 and subsequent free radicals, including the very reactive hydroxyl free radical. The free radicals could cause the early cell death; antioxidants are believed to terminate the aberrant free radical reactions resulting in lessened developmental toxicity. To investigate whether hydroxyl free radicals cause the early episode of cell death, pregnant New Zealand white rabbits were injected subcutaneously on gestational day 12 with a teratogenic dose of HU (650 mg/kg) in the presence or absence of 550 mg/kg of D–mannitol (Man), a specific scavenger of hydroxyl free radicals. Osmotic control rabbits received HU plus 550 mg/kg of xylose (Xyl, a nonactive aldose). At term, the teratologic effects of HU were ameliorated by Man as evidenced by decreased incidences of the expected limb malformations. Xyl exerted no demonstrable effect on HU teratogenesis. Histological examination of limb buds at 3–8 hours after maternal injection, showed that Man delayed the onset of HU–induced cell death by as much as 4 hours. Xyl had no effect. That Man acts within the embryo was shown by performing intracoelomic injections on alternate implantation sites with Man, Xyl, or saline followed by subcutaneous injection of the pregnant doe with HU. Embryos were harvested 3–8 hours later. Limb buds from saline– and Xyl–injected embryos exhibited the typical pattern of widespread HU–induced cell death at 3–4 hours, whereas Man–injected embryos did not exhibit cell death until 5–8 hours. These results are consistent with those reported for antioxidant–mediated amelioration of HU–induced developmental toxicity and with the hypothesis that hydroxyl free radicals are the proximate reactive species in HU–induced early embryonic cell death. © 1994 Wiley-Liss, Inc.

81 citations


24 Jun 1994
TL;DR: The overall workshop goals were to define those digital library parameters which especially influence issues of access to, retrieval of, and interaction with information.
Abstract: The overall workshop goals were: 1) to define those digital library parameters which especially influence issues of access to, retrieval of, and interaction with information; 2) to identify key problems which must be solved to make digital library service an effective reality; 3) to identify a general structure or framework for integrating research and solutions; and 4) to propose and encourage specific, high-priority research directions within such a framework.

67 citations


Journal ArticleDOI
TL;DR: Results showed that the cognitive walkthrough method identifies issues almost exclusively within the action specification stage, while guidelines covered more stages, and all the techniques could be improved in assessing semantic distance and addressing all stages on the evaluation side of the HCI activity cycle.

66 citations


Journal ArticleDOI
TL;DR: Results show that all of the ionospheric algorithms used in this paper (grid-based, least-squares, and spherical harmonics) provide roughly equivalent performance.
Abstract: The Federal Aviation Administration (FAA) Satellite Program Office is developing a GPS Wide-Area Augmentation System (WAAS) to support a precision approach capability down to or near the lowest Category I (CAT I) decision height (DH) of 200 ft. In one of the candidate architectures under development, a vector of corrections is sent to the user via geostationary communications satellites (e.g., Inmarsat). This correction vector includes components for ionospheric, clock, and ephemeris corrections. The purpose of this paper is to evaluate the performance of the grid-based algorithms and other real-time ionospheric algorithms that could be implemented at the ground ionospheric reference stations, as well as at the airborne receiver. Results show that all of the ionospheric algorithms used in this paper (grid-based, least-squares, and spherical harmonics) provide roughly equivalent performance. Based on an extensive data collection program, the error in estimating ionospheric delay is derived. An analysis of WAAS accuracy performance is also presented.
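
In the grid-based approach, the airborne receiver interpolates broadcast vertical delays at the surrounding ionospheric grid points to each satellite's ionospheric pierce point. The sketch below shows only the bilinear-interpolation step inside one grid cell, with a made-up cell structure; it omits the obliquity factor, grid-point availability flags, and the actual WAAS grid geometry.

def interpolate_grid_delay(lat, lon, cell):
    """Bilinear interpolation of vertical ionospheric delay within one grid cell.

    `cell` is a hypothetical structure (lat0, lon0, dlat, dlon, d), where d[i][j]
    is the broadcast delay at the corner (lat0 + i*dlat, lon0 + j*dlon)."""
    lat0, lon0, dlat, dlon, d = cell
    x = (lon - lon0) / dlon        # fractional position across the cell in longitude
    y = (lat - lat0) / dlat        # fractional position across the cell in latitude
    return ((1 - x) * (1 - y) * d[0][0] + x * (1 - y) * d[0][1]
            + (1 - x) * y * d[1][0] + x * y * d[1][1])

# Example: pierce point at the center of a 5 x 5 degree cell.
cell = (35.0, -120.0, 5.0, 5.0, [[2.0, 2.4], [2.2, 2.8]])
print(interpolate_grid_delay(37.5, -117.5, cell))   # 2.35 (meters of vertical delay)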

Proceedings ArticleDOI
M.A. Rood
17 Apr 1994
TL;DR: This paper emphasizes linkages between the EA and information systems development by depicting the information systems viewpoint and shows how an EA relates to both an information systems architecture (ISA) and an information architecture (IA).
Abstract: Although the concept of an enterprise architecture (EA) has not been well defined and agreed upon, EAs are being developed to support information system development and enterprise reengineering. Most EAs differ in content and nature, and most are incomplete because they represent only data and process aspects of the enterprise. This paper defines an EA. Basic EA concepts are presented. The purpose and utility of an EA and its place in the information system environment are discussed. To facilitate understanding, a generic EA is provided. Through the generic model, each major component of an EA is defined and the component's purpose and use are presented. The generic EA described here will be used to develop enterprise-specific EAs. This paper emphasizes linkages between the EA and information systems development by depicting the information systems viewpoint. To delineate how an EA is useful in the information technology (IT) world, this paper shows how an EA relates to both an information systems architecture (ISA) and an information architecture (IA).

Journal ArticleDOI
TL;DR: The metrics effort within NASA's Mission Operations Directorate has helped managers and engineers better understand their processes and products and the toolkit helps ensure consistent data collection across projects and increases the number and types of analysis options available to project personnel.
Abstract: The amount of code in NASA systems has continued to grow over the past 30 years. This growth brings with it the increased risk of system failure caused by software. Thus, managing the risks inherent in software development and maintenance is becoming a highly visible and important field. The metrics effort within NASA's Mission Operations Directorate has helped managers and engineers better understand their processes and products. The toolkit helps ensure consistent data collection across projects and increases the number and types of analysis options available to project personnel. The decisions made on the basis of metrics analysis have helped project engineers make decisions about project and mission readiness by removing the inherent optimism of "engineering judgment".

Journal ArticleDOI
TL;DR: The main contribution lies in illustrating how theory was adapted to a practical system, and how the consistency and power of a design system can be increased by use of theory.
Abstract: We describe the tools and theory of a comprehensive system for database design, and show how they work together to support multiple conceptual and logical design processes. The Database Design and Evaluation Workbench (DDEW) system uses a rigorous, information-content-preserving approach to schema transformation, but combines it with heuristics, guess work, and user interactions. The main contribution lies in illustrating how theory was adapted to a practical system, and how the consistency and power of a design system can be increased by use of theory. First, we explain why a design system needs multiple data models, and how implementation over a unified underlying model reduces redundancy and inconsistency. Second, we present a core set of small but fundamental algorithms that rearrange a schema without changing its information content. From these reusable components, we easily built larger tools and transformations that were still formally justified. Third, we describe heuristic tools that attempt to improve a schema, often by adding missing information. In these tools, unreliable techniques such as normalization and relationship inference are bolstered by system-guided user interactions to remove errors. We present a rigorous criterion for identifying unnecessary relationships, and discuss an interactive view integrator. Last, we examine the relevance of database theory to building these practically motivated tools and contrast the paradigms of system builders with those of theoreticians.
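
One example of the small, information-content-preserving rearrangements described above is replacing a many-to-many relationship with an associative entity and two one-to-many relationships. The toy sketch below performs that transformation on a simplified dictionary encoding of a schema; the encoding is my own stand-in for illustration, not DDEW's internal representation.

def replace_many_to_many(schema, rel_name):
    """Rewrite an M:N relationship as an associative entity plus two 1:N relationships.

    The set of entity pairs that can be related (the information content) is unchanged."""
    rel = schema["relationships"].pop(rel_name)
    assert rel["cardinality"] == "M:N"
    assoc = rel_name + "_link"
    schema["entities"][assoc] = {"attributes": rel.get("attributes", [])}
    for end in rel["ends"]:
        schema["relationships"][assoc + "_" + end] = {"ends": [end, assoc], "cardinality": "1:N"}
    return schema

schema = {"entities": {"student": {}, "course": {}},
          "relationships": {"enrolls": {"ends": ["student", "course"],
                                        "cardinality": "M:N", "attributes": ["grade"]}}}
replace_many_to_many(schema, "enrolls")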

Journal ArticleDOI
TL;DR: In this article, an iterative solution that accounts for multiple scattering up to second-order is proposed for bistatic specular scattering from a cylinder and sphere pair is discussed and the results are compared with numerical computations based on the method of moments.
Abstract: In the paper, the problem of electromagnetic scattering from two adjacent particles is considered and an iterative solution that accounts for multiple scattering up to second-order is proposed. The first-order solution can easily be obtained by calculating the scattered field of isolated particles when illuminated by a plane-wave. To get the second-order solution, the scattered field from one of the particles, with nonuniform phase, amplitude, and polarization, is considered as the illuminating wave for the other particle and vice versa. In the work, the second-order scattered field is derived analytically using a novel technique based on the reciprocity theorem. Specifically, the analytical solution for bistatic specular scattering from a cylinder and sphere pair is discussed and the results are compared with numerical computations based on the method of moments.
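
In outline, the iteration described above can be written compactly as follows; the operator notation is shorthand chosen here for illustration, not the paper's.

E_i^{(1)} = S_i\{E^{\mathrm{inc}}\}, \qquad i = 1, 2
E_1^{(2)} = S_1\{E_2^{(1)}\}, \qquad E_2^{(2)} = S_2\{E_1^{(1)}\}
E^{\mathrm{scat}} \approx E_1^{(1)} + E_2^{(1)} + E_1^{(2)} + E_2^{(2)}

Here S_i\{\cdot\} denotes the scattering operator of particle i in isolation; the paper evaluates the second-order terms analytically via the reciprocity theorem rather than by explicitly re-radiating the nonuniform first-order fields.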

Proceedings ArticleDOI
11 Dec 1994
TL;DR: The evolution of the AIS from the initial prototype to the present is described, emphasizing the discovery of new requirements and how they were accommodated.
Abstract: The aggregate level simulation protocol (ALSP) concept was initiated by ARPA in January 1990, the first laboratory demonstration took place in January 1991, and the first fielding in support of a major military exercise took place in July 1992. Since then, the ALSP confederation of models has grown from the original two members to six. In support of this growing confederation, the ALSP Infrastructure Software (AIS) has evolved from its fundamental functionality to the current focus on improved confederation management and performance. This paper describes the evolution of the AIS from the initial prototype to the present, emphasizing the discovery of new requirements and how they were accommodated.

Patent
Hai V. Tran
15 Feb 1994
TL;DR: In this paper, a protocol for mixed voice and data access to a synchronous broadcast communications channel (116) is provided, which requires that a user determine whether a time slot is available.
Abstract: A protocol for mixed voice and data access to a synchronous broadcast communications channel (116) is provided. Transmission on the broadcast communications channel (116) is by means of a plurality of time division frames, each such frame being defined by a plurality of time slots (210). The protocol requires that a user determine whether a time slot is available. If a slot is available, a user transmits a preamble (212) on the broadcast communications channel and then substantially simultaneously monitors the channel for determining whether a collision of the preamble has occurred. If a collision has occurred with a second user who has a higher priority, the first user reattempts to acquire an available time slot after a time delay, the time delay being equivalent to a random number of time slots. If on the other hand, the collision was with a second user of equal priority, both users will reattempt acquisition of available time slots after respective random time delays. The preambles of users of different priority are transmitted using a non-interfering code or modulation frequency, thereby allowing the higher priority user to continue transmission of the remaining fields which make up that user's information packet. Another key feature of the protocol is the use of the preamble (212) to identify voice users that are in silence periods, transmitting no data. Under these circumstances, the protocol permits data users to utilize such identified time slots when they are encountered, thereby increasing the throughput of the data communications system.
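
The slot-acquisition logic described in the patent can be sketched roughly as follows. The channel and user interfaces, function names, and backoff bounds are placeholders of mine for illustration; only the control flow mirrors the protocol described above.

import random

def acquire_slot(channel, user, max_attempts=10, backoff_slots=8):
    """Rough sketch: find an available slot, send a preamble while monitoring for
    collisions, and back off a random number of slots if an equal- or higher-priority
    user collided; a higher-priority user keeps the slot and continues transmitting."""
    for _ in range(max_attempts):
        slot = channel.next_available_slot()
        if slot is None:
            channel.wait_slots(1)                 # no free slot yet; try again next slot
            continue
        channel.send_preamble(slot, user)         # preambles of different priorities use
        other = channel.detect_collision(slot)    # non-interfering codes or frequencies
        if other is None:
            return slot                           # slot acquired; send the rest of the packet
        if other.priority >= user.priority:
            channel.wait_slots(random.randint(1, backoff_slots))   # random backoff, then retry
        else:
            return slot                           # this user outranks the collider and continues
    return None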

Journal ArticleDOI
TL;DR: The authors prove that if a particular rank sequence stabilizes to a value strictly less than the common row size of the defining block matrices, then this value equals the number of signals.
Abstract: The authors present a simple method for determining the number of signals impinging on a uniform linear array that is applicable even in the extreme case of fully correlated signals. This technique uses what they term modified rank sequences, which is a modification of the construction implicit in the matrix decomposition method of Di (1981). They prove that if a particular rank sequence stabilizes (the last two terms of the sequence are equal) to a value strictly less than the common row size of the defining block matrices, then this value equals the number of signals, provided that the number of signals has not exceeded a Bresler-Macovski (1986) type bound. Using the above characterization of stability, they formulate an algorithm that either determines the number of signals or indicates that the resolution capability of the algorithm has been exceeded. They also provide theorems that show that under certain conditions, a rank sequence can stabilize to a value strictly less than the number of signals. This result allows them to find simple counterexamples to all of the existing rank sequence methods.
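
The stabilization test itself is simple once the rank sequence has been computed: the estimate is accepted only when the last two terms agree and that common value is strictly below the common row size of the block matrices. A sketch of just that test (the construction of the rank sequence via Di's matrix decomposition is assumed to be done elsewhere and is not reproduced here):

def count_signals(rank_sequence, row_size):
    """Return the estimated number of signals, or None if the rank sequence has not
    stabilized strictly below row_size (i.e. the method's resolution is exceeded)."""
    if len(rank_sequence) < 2:
        return None
    prev, last = rank_sequence[-2], rank_sequence[-1]
    if last == prev and last < row_size:
        return last
    return None

print(count_signals([5, 4, 3, 3], row_size=6))   # -> 3
print(count_signals([5, 4, 4], row_size=4))      # -> None (not strictly below row_size)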

Proceedings ArticleDOI
01 Jul 1994
TL;DR: A new scalable approach to generalized proximity detection for moving objects in a logically correct parallel discrete-event simulation, designed and tested using the object-oriented Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) operating system.
Abstract: Generalized proximity detection for moving objects in a logically correct parallel discrete-event simulation is an interesting and fundamentally challenging problem. Determining who can see whom in a manner that is fully scalable in terms of CPU usage, number of messages, and memory requirements is highly non-trivial. A new scalable approach has been developed to solve this problem. This algorithm, called The Distribution List, has been designed and tested using the object-oriented Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) operating system. Preliminary results show that the Distribution List algorithm achieves excellent parallel performance.

Patent
01 Feb 1994
TL;DR: In this article, a communication system for transmitting data to mobile receivers utilizing a subcarrier within a commercial FM channel of a radio station is described, where the data transmitted is first encoded in encoder (112), utilizing a forward error correction code.
Abstract: A communication system (100) is provided for transmitting data to mobile receivers utilizing a subcarrier within a commercial FM channel of a radio station (55). The data transmitted is first encoded in encoder (112), utilizing a forward error correction code. The sequence of the encoded data is altered in interleaver (116) and subdivided into a plurality of subframes in framing and synchronization circuit (120), which also adds channel state bits to each subframe. The framed data is modulated onto the subcarrier in the differential quadrature phase shift keying modulator (130), the output of which is coupled to the FM modulator (52) of radio station transmitter (50). The transmitted radio frequency signals may be received by a vehicle antenna (12) for coupling to the vehicle's FM receiver (80). The modulated subcarrier is recovered from the FM demodulator (84) of the receiver (80), the modulated subcarrier being demodulated to recover the encoded digital data therefrom. The channel state bits included with the data are extracted from the digital data and utilized to form a data reliability factor for each bit of the encoded data. The data reliability factors thus obtained are associated with each bit of the data in a deinterleaver (360). Deinterleaver (360) provides each data bit in proper sequence, with its associated data reliability factor, to a decoder (370). The decoded digital data is provided to a vehicle traffic computer (90) for processing and presentation of traffic information to a user on a display (92).
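
The interleaving/deinterleaving step mentioned above is commonly realized as a block interleaver that writes code bits row-wise and reads them column-wise, so that a burst of channel errors is dispersed before decoding. The sketch below is that generic scheme, not the patent's specific interleaver structure or dimensions.

def block_interleave(bits, rows, cols):
    """Write bits row-wise into a rows x cols array and read them out column-wise."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(bits, rows, cols):
    """Inverse operation performed in the receiver before decoding."""
    assert len(bits) == rows * cols
    out = [None] * (rows * cols)
    for i, b in enumerate(bits):
        c, r = divmod(i, rows)
        out[r * cols + c] = b
    return out

data = list(range(12))
assert block_deinterleave(block_interleave(data, 3, 4), 3, 4) == data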

Journal ArticleDOI
TL;DR: Rat liver tumors initiated with N-nitrosodiethylamine followed by promotion with phenobarbital were examined for expression of transforming growth factor-beta type I, II and III receptors and it was demonstrated that all three TGF beta receptors are expressed in both normal and malignant hepatic tissues.
Abstract: Rat liver tumors initiated with N-nitrosodiethylamine (DEN) followed by promotion with phenobarbital (PB) were examined for expression of transforming growth factor-beta (TGF beta) type I, II and III receptors. RNase protection and TGF beta 1 affinity labeling assays were used to determine TGF beta receptor steady-state mRNA and protein levels, respectively. We have demonstrated that all three TGF beta receptors are expressed in both normal and malignant hepatic tissues. Long-term PB administration did not alter TGF beta receptor mRNA or protein levels in normal liver. However, type I, II and III TGF beta receptor mRNA and protein levels were decreased by approximately 50% in the DEN-initiated/PB-promoted liver tumors as compared to the receptor levels in normal liver tissue surrounding the tumors. In contrast, TGF beta receptor mRNA and protein levels were unchanged in liver tumors initiated with DEN but not PB-promoted. These data demonstrate that PB promotes the formation of a tumor phenotype that is characterized by a significantly reduced number of TGF beta type I, II and III receptors. This suggests that the down-regulation of TGF beta receptors in PB-promoted hepatic tumors may provide a selective growth advantage to the tumor cells by reducing the ability of TGF beta to inhibit their growth.

Journal ArticleDOI
TL;DR: It is suggested that failure of peripheral apoptosis of CD4+ cells allows self-reactive helper T cells to persist and drive autoantibody production, which leads to down-regulation of CD8 and persistence as CD4-, CD8- T cells which contribute to the lymphadenopathy.

Journal ArticleDOI
TL;DR: The authors present the probability generating function (PGF) for joint and marginal buffer occupancy distributions of statistical time division multiplexing systems in this class and discuss inversion of the PGF using discrete Fourier transforms, and a simple technique for obtaining moments of the queue length distribution.
Abstract: The queueing behavior of many communication systems is well modeled by a queueing system in which time is slotted, and the number of entities that arrive during a slot is dependent upon the state of a discrete time, discrete state Markov chain. Techniques for analyzing such systems have appeared in the literature from time to time, but distributions have been presented in only rare instances. In the paper, the authors present the probability generating function (PGF) for joint and marginal buffer occupancy distributions of statistical time division multiplexing systems in this class. They discuss inversion of the PGF using discrete Fourier transforms, and also discuss a simple technique for obtaining moments of the queue length distribution. Numerical results, including queue length distributions for some special cases, are presented.
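
The DFT-based inversion mentioned above works by evaluating the PGF at the N-th roots of unity and applying a discrete inverse transform. A minimal numerical sketch, valid when the distribution's mass beyond N - 1 is negligible (otherwise the result is aliased):

import numpy as np

def invert_pgf(pgf, N):
    """Recover p_0..p_{N-1} from a probability generating function P(z) by sampling
    it at the N-th roots of unity and inverting with a DFT."""
    z = np.exp(2j * np.pi * np.arange(N) / N)   # N-th roots of unity
    values = np.array([pgf(zk) for zk in z])
    probs = np.fft.fft(values).real / N         # forward FFT kernel matches the inversion sum
    return probs

# Example: geometric-type PGF P(z) = (1 - a) / (1 - a z) with a = 0.5.
p = invert_pgf(lambda z: 0.5 / (1 - 0.5 * z), 32)
print(p[:4])   # approximately 0.5, 0.25, 0.125, 0.0625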

Proceedings ArticleDOI
18 Apr 1994
TL;DR: The role of software architecture (which reflects high-level implementation constraints) in requirements engineering is clarified by providing perspectives on relevant issues.
Abstract: The role of software architecture (which reflects high-level implementation constraints) in requirements engineering is clarified by providing perspectives on relevant issues, including the following: is requirements engineering merely a front end to the software development process that is concerned only with problem definition? Is software architecture an application-specific, high-level design of a system (for example, "an object-oriented system with a specified object hierarchy")? What is the relationship between the problem definition and the solution structure? What is the relationship between the roles of requirements engineer, software architect, and application domain specialist?

Proceedings ArticleDOI
08 May 1994
TL;DR: A supervisory control is derived that provides an optimal distribution of work among the robots while accounting for the robots' nonholonomic constraints.
Abstract: Coordinated motion of a group of mobile robots bearing a common load is investigated. By extending the results of research with similar multi-robot material handling systems, a supervisory control is derived that provides an optimal distribution of work among the robots while accounting for the robots' nonholonomic constraints.

Journal ArticleDOI
TL;DR: A multilevel secure federated database system is defined and issues on heterogeneity, autonomy, security policy and architecture are discussed.

Journal ArticleDOI
TL;DR: It is found that lupus-prone NZB, BXSB and MRL strains have a marked increase in expression of Mpmv RNA in their thymuses while bone marrow expression did not differ from normal strains, demonstrating mutations in the NZB endogenous retroviruses which could alter expression.
Abstract: GOURLEY, M.F., KISCH, W.J., MOJCIK, C.F., KING, L.B., KRIEG, A.M. and STEINBERG, A.D. Role of Endogenous Retroviruses in Autoimmune Diseases. Tohoku J. Exp. Med., 1994, 173 (1), 105-114 - Retroviruses have been implicated in the pathogenesis of murine and human lupus; however, many positive findings have been followed by alternative explanations. Initial findings implicating xenotropic retroviruses were subsequently invalidated. The first solid demonstration that endogenous retroviruses mediate disease was the study of SL/Ni mice. Here budding ecotropic retroviral particles from arterial smooth muscle cells caused an antibody response to the particles with subsequent complement deposition. Our laboratory has focused on derangements in endogenous MCF retroviral expression. We found that lupus-prone NZB, BXSB and MRL strains have a marked increase in expression of Mpmv RNA in their thymuses while bone marrow expression did not differ from normal strains. Sequence analysis demonstrated mutations in the NZB endogenous retroviruses which could alter expression. A phosphorothioate antisense oligonucleotide to the initiation sequence of Mpmv caused lymphocyte activation in vivo in normal mice, providing further evidence for in vivo effects of Mpmv and potential for pathological abnormalities in lupus-prone strains.

Journal ArticleDOI
R. Rifkin
TL;DR: The analysis shows that the OS-based algorithm is quite robust against both interference and clutter edges, and a method is suggested for improving performance at clutter inhomogeneities for short-range targets.
Abstract: Recent interest has focused on order statistic-based (OS-based) algorithms for calculating radar detection thresholds. Previous analyses of these algorithms are extended to determine closed-form approximations for the signal-to-clutter ratio required to achieve a particular probability of detection in clutter environments whose amplitude statistics are modeled by the Weibull distribution, and where the clutter dominates receiver noise. Performance is evaluated in both homogeneous and inhomogeneous clutter. The analysis shows that the OS-based algorithm is quite robust against both interference and clutter edges. A method is suggested for improving performance at clutter inhomogeneities for short-range targets.
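
In an order-statistic CFAR detector of the kind analyzed, the threshold for each cell under test is a scaled order statistic of the surrounding reference cells. The sketch below is generic: the rank and scale factor are placeholders, whereas in practice they would be chosen from the desired false-alarm probability and the Weibull clutter parameters, which is precisely what the closed-form approximations above address.

import numpy as np

def os_cfar_detect(power, guard=2, half_window=8, rank=12, alpha=8.0):
    """Order-statistic CFAR sketch: compare each cell against alpha times the
    rank-th order statistic of its leading and lagging reference cells."""
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(half_window + guard, n - half_window - guard):
        ref = np.concatenate((power[i - guard - half_window:i - guard],
                              power[i + guard + 1:i + guard + half_window + 1]))
        threshold = alpha * np.sort(ref)[rank - 1]   # rank-th smallest reference sample
        detections[i] = power[i] > threshold
    return detections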

Journal ArticleDOI
TL;DR: In this patient population, a high FMC value appears to reflect cumulative clinical lupus disease activity, involving both intensity and duration of past active disease.
Abstract: Objective. To determine the clinical features that contribute to an increased frequency of mutant T cells (FMC) in patients with systemic lupus erythematosus (SLE). Methods. During in vivo T cell division, there are errors in replication which give rise to mutations throughout the genome. An estimate of such mutations may be obtained by focusing on mutations in the hprt gene, which can be screened by assessing relative growth of T cell clones in the presence and absence of 6-thioguanine. In this study, peripheral blood T cell clones from 47 patients with SLE were assessed, and the frequency of mutant T cells (FMC) determined. An attempt was made to correlate the FMC with disease measures. Results. Patients with SLE had a spectrum of FMC values, ranging from normal to almost 1,000 times normal. Total duration of active disease (rs = 0.94), past highest disease activity index (rs = 0.80), and number of lupus flares (rs = 0.76) correlated most strongly (P < 0.0001) with FMC by Spearman's rank order analysis. In contrast, current disease activity index and current anti-DNA level did not correlate with FMC. Similar correlations between FMC and cumulative past lupus disease activity were found by linear regression analysis (rp = 0.89 for the correlation between the natural logarithm of FMC and cumulative duration of active disease). By both statistical tests, therapy was found to be only a minor contributor to FMC. Conclusion. In our patient population, a high FMC value appears to reflect cumulative clinical lupus disease activity, involving both intensity and duration of past active disease.

Journal ArticleDOI
TL;DR: A model-based framework is presented that provides for both software metrics research and the application of research results to software project assessment and a number of analytical methodologies are discussed to develop models predicting software quality factors.