
Showing papers by "Mitre Corporation" published in 1999


Proceedings ArticleDOI
Joseph Mitola
15 Nov 1999
TL;DR: This paper characterizes the potential contributions of cognitive radio to spectrum pooling and outlines an initial framework for formal radio-etiquette protocols.
Abstract: Wireless multimedia applications require significant bandwidth, some of which will be provided by third-generation (3G) services. Even with substantial investment in 3G infrastructure, the radio spectrum allocated to 3G will be limited. Cognitive radio offers a mechanism for the flexible pooling of radio spectrum using a new class of protocols called formal radio etiquettes. This approach could expand the bandwidth available for conventional uses (e.g. police, fire and rescue) and extend the spatial coverage of 3G in a novel way. Cognitive radio is a particular extension of software radio that employs model-based reasoning about users, multimedia content, and communications context. This paper characterizes the potential contributions of cognitive radio to spectrum pooling and outlines an initial framework for formal radio-etiquette protocols.

1,331 citations


Journal ArticleDOI
TL;DR: A method of forming synthetic aperture radar images of moving targets without using any specific knowledge of the target motion is presented, using a unique processing kernel that involves a one-dimensional interpolation of the deramped phase history which is called keystone formatting.
Abstract: A method of forming synthetic aperture radar (SAR) images of moving targets without using any specific knowledge of the target motion is presented. The new method uses a unique processing kernel that involves a one-dimensional interpolation of the deramped phase history which we call keystone formatting. This preprocessing simultaneously eliminates the effects of linear range migration for all moving targets regardless of their unknown velocity. Step two of the moving target imaging technique involves a two-dimensional focusing of the movers to remove residual quadratic range migration errors. The third and last step removes cubic and higher order defocusing terms. This imaging technique is demonstrated using SAR data collected as part of DARPA's Moving Target Exploitation (MTE) program.
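To make the keystone step concrete, the sketch below (Python/NumPy) shows a minimal version of the interpolation on a deramped phase history laid out as fast-time frequency × slow-time pulse samples; the function name, the linear interpolator, and the parameterization are illustrative assumptions, not the paper's implementation.

    import numpy as np

    def keystone_format(phase_history, freq_offsets, fc, slow_time):
        """Minimal keystone-formatting sketch (assumed layout, not the paper's code).

        phase_history : complex array, shape (n_freq, n_pulse), deramped data
        freq_offsets  : fast-time frequency offsets f in Hz, length n_freq
        fc            : radar center frequency in Hz
        slow_time     : increasing slow-time sample instants in s, length n_pulse
        """
        out = np.empty_like(phase_history)
        for i, f in enumerate(freq_offsets):
            # Re-interpolate this frequency bin at the warped instants
            # t' = (fc / (fc + f)) * t, which makes linear range walk
            # independent of the (unknown) target velocity.
            t_query = (fc / (fc + f)) * slow_time
            row = phase_history[i]
            out[i] = (np.interp(t_query, slow_time, row.real)
                      + 1j * np.interp(t_query, slow_time, row.imag))
        return out

After this preprocessing, steps two and three of the technique (quadratic focusing, then removal of higher-order terms) would operate on the keystone-formatted data.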

695 citations


Journal ArticleDOI
TL;DR: The approach is distinguished from other work by the simplicity of the model, the precision of the results it produces, and the ease of developing intelligible and reliable proofs even without automated support.
Abstract: A strand is a sequence of events; it represents either an execution by a legitimate party in a security protocol or else a sequence of actions by a penetrator. A strand space is a collection of strands, equipped with a graph structure generated by causal interaction. In this framework, protocol correctness claims may be expressed in terms of the connections between strands of different kinds. Preparing for a first example, the Needham-Schroeder-Lowe protocol, we prove a lemma that gives a bound on the abilities of the penetrator in any protocol. Our analysis of the example gives a detailed view of the conditions under which it achieves authentication and protects the secrecy of the values exchanged. We also use our proof methods to explain why the original Needham-Schroeder protocol fails. Before turning to a second example, we introduce ideals as a method to prove additional bounds on the abilities of the penetrator. We can then prove a number of correctness properties of the Otway-Rees protocol, and we clarify its limitations. We believe that our approach is distinguished from other work by the simplicity of the model, the precision of the results it produces, and the ease of developing intelligible and reliable proofs even without automated support.
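As a purely illustrative aid, the sketch below encodes the basic vocabulary in Python, treating message terms as opaque strings: a strand is a sequence of signed (+ transmit / - receive) events, a strand space is a collection of strands, and communication edges pair a transmission with a reception of the same term. The class and variable names are assumptions; none of the proof machinery (bundles, ideals, penetrator bounds) is represented.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Event:
        sign: str   # '+' for transmission, '-' for reception
        term: str   # the message term, kept abstract here

    # Needham-Schroeder-Lowe roles written as strands (illustrative notation).
    initiator = [Event('+', '{Na, A}Kb'), Event('-', '{Na, Nb, B}Ka'), Event('+', '{Nb}Kb')]
    responder = [Event('-', '{Na, A}Kb'), Event('+', '{Na, Nb, B}Ka'), Event('-', '{Nb}Kb')]
    strand_space = [initiator, responder]

    # Communication edges: a '+' event linked to a '-' event with the same term
    # on a different strand -- the graph structure generated by causal interaction.
    edges = [(si, i, ti, j)
             for si, s in enumerate(strand_space)
             for i, e1 in enumerate(s) if e1.sign == '+'
             for ti, t in enumerate(strand_space) if ti != si
             for j, e2 in enumerate(t)
             if e2.sign == '-' and e1.term == e2.term]
    print(len(edges))   # -> 3 send/receive pairs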

574 citations


Journal ArticleDOI
Joseph Mitola
TL;DR: Analysis of the topological properties of the software radio architecture yields a layered distributed virtual machine reference model and a set of architecture design principles that may be useful in defining interfaces among hardware, middleware, and higher level software components that are needed for cost-effective software reuse.
Abstract: As the software radio makes its transition from research to practice, it becomes increasingly important to establish provable properties of the software radio architecture on which product developers and service providers can base technology insertion decisions. Establishing provable properties requires a mathematical perspective on the software radio architecture. This paper contributes to that perspective by critically reviewing the fundamental concept of the software radio, using mathematical models to characterize this rapidly emerging technology in the context of similar technologies like programmable digital radios. The software radio delivers dynamically defined services through programmable processing capacity that has the mathematical structure of the Turing machine. The bounded recursive functions, a subset of the total recursive functions, are shown to be the largest class of Turing-computable functions for which software radios exhibit provable stability in plug-and-play scenarios. Understanding the topological properties of the software radio architecture promotes plug-and-play applications and cost-effective reuse. Analysis of these topological properties yields a layered distributed virtual machine reference model and a set of architecture design principles for the software radio. These criteria may be useful in defining interfaces among hardware, middleware, and higher level software components that are needed for cost-effective software reuse.

386 citations


Proceedings ArticleDOI
08 Jun 1999
TL;DR: The TIPSTER Text Summarization Evaluation (SUMMAC) has established definitively that automatic text summarization is very effective in relevance assessment tasks.
Abstract: The TIPSTER Text Summarization Evaluation (SUMMAC) has established definitively that automatic text summarization is very effective in relevance assessment tasks. Summaries as short as 17% of full text length sped up decision-making by almost a factor of 2 with no statistically significant degradation in F-score accuracy. SUMMAC has also introduced a new intrinsic method for automated evaluation of informative summaries.

254 citations


Proceedings ArticleDOI
20 Jun 1999
TL;DR: Initial work on Deep Read, an automated reading comprehension system that accepts arbitrary text input (a story) and answers questions about it is described, with a baseline system that retrieves the sentence containing the answer 30--40% of the time.
Abstract: This paper describes initial work on Deep Read, an automated reading comprehension system that accepts arbitrary text input (a story) and answers questions about it. We have acquired a corpus of 60 development and 60 test stories of 3rd to 6th grade material; each story is followed by short-answer questions (an answer key was also provided). We used these to construct and evaluate a baseline system that uses pattern matching (bag-of-words) techniques augmented with additional automated linguistic processing (stemming, name identification, semantic class identification, and pronoun resolution). This simple system retrieves the sentence containing the answer 30--40% of the time.
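A minimal sketch of the bag-of-words retrieval baseline is shown below; regex tokenization and a crude suffix-stripping step stand in for the paper's fuller linguistic processing (stemming, name identification, semantic class identification, pronoun resolution), and the story, question, and function names are invented for illustration.

    import re

    def tokens(text):
        """Lowercased word tokens with crude suffix stripping (a stand-in stemmer)."""
        words = re.findall(r"[a-z']+", text.lower())
        return {re.sub(r"(ing|ed|es|s)$", "", w) for w in words}

    def best_sentence(story_sentences, question):
        """Bag-of-words baseline: return the sentence sharing the most tokens
        with the question."""
        q = tokens(question)
        return max(story_sentences, key=lambda s: len(tokens(s) & q))

    story = [
        "Maria planted a small garden behind her house.",
        "Every morning she watered the tomatoes before school.",
        "By August the plants were taller than the fence.",
    ]
    print(best_sentence(story, "What did Maria water every morning?"))
    # -> "Every morning she watered the tomatoes before school."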

247 citations


Journal ArticleDOI
TL;DR: In this article, the results of eight VLBA observations at 5 GHz spanning 3 yr have yielded a measured trigonometric parallax for Sco X-1 of 0.00036 ± 0.00004 arcsec; hence, its distance is 2.8±0.3 kpc.
Abstract: The results of eight VLBA observations at 5 GHz, spanning 3 yr, have yielded a measured trigonometric parallax for Sco X-1 of 0.00036 ± 0.00004 arcsec; hence, its distance is 2.8±0.3 kpc. This is the most precise parallax measured to date. Although our measured distance is 40% farther away than previous estimates based on X-ray luminosity, our Rossi X-Ray Timing Explorer observations, with a measured luminosity of 2.3×10³⁸ ergs s⁻¹, and determined distance continue to support the hypothesis that Z-source low-mass X-ray binary systems, like Sco X-1, radiate at the Eddington luminosity at a particular point in their X-ray color-color diagram.
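The quoted distance follows directly from inverting the parallax (d [pc] = 1 / p [arcsec]); the quick check below reproduces the 2.8 ± 0.3 kpc figure using first-order error propagation.

    # Distance from trigonometric parallax: d [pc] = 1 / p [arcsec].
    p = 0.00036           # measured parallax, arcsec
    sigma_p = 0.00004     # 1-sigma parallax uncertainty, arcsec

    d_pc = 1.0 / p                     # ~2778 pc
    sigma_d = d_pc * (sigma_p / p)     # first-order propagation, ~310 pc

    print(f"d = {d_pc / 1e3:.1f} +/- {sigma_d / 1e3:.1f} kpc")   # -> d = 2.8 +/- 0.3 kpc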

157 citations


Journal ArticleDOI
J. Mitola
TL;DR: The concepts, architecture, technology challenges, and economics of the continuing productization and globalization of software radio are reviewed.
Abstract: The software radio has emerged from the third-generation strategy for affordable, ubiquitous, global communications. This article reviews the concepts, architecture, technology challenges, and economics of the continuing productization and globalization of software radio.

142 citations


Journal ArticleDOI
16 May 1999
TL;DR: A novel VLSI architecture that removes narrow-band signals from the wide-bandwidth GPS spectrum; the interference suppression technique employed is frequency-domain excision.
Abstract: In recent years, we have witnessed the rapid adoption of the Department of Defense's Global Positioning System (GPS) for navigation in a number of military and civilian applications. Unfortunately, the low-power GPS signal is susceptible to interference. This paper presents a novel VLSI architecture that removes narrow-band signals from the wide-bandwidth GPS spectrum. The interference suppression technique employed is frequency-domain excision. The single-chip frequency-domain excisor transforms the received signal (GPS signal+noise+interference) to the frequency domain, computes signal statistics to determine an excision threshold, removes all spectral energy exceeding that threshold, and restores the remaining signal (GPS signal+noise) to the temporal domain. The heart of this VLSI implementation is an on-chip 256-point fast Fourier transform processor that operates at 40 million complex samples per second. It processes 12-bit (for each I and Q) sampled complex data. The 1.57 million-transistor chip was fabricated in 0.5-µm CMOS triple-metal technology and is fully functional.
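The sketch below illustrates frequency-domain excision on a single 256-sample block in NumPy; the mean-plus-k-sigma threshold rule and all names are assumptions standing in for the chip's actual signal-statistics computation.

    import numpy as np

    def excise_block(x, k=3.0):
        """Transform a block to the frequency domain, zero bins whose magnitude
        exceeds a statistics-based threshold, and transform back."""
        X = np.fft.fft(x)
        mag = np.abs(X)
        threshold = mag.mean() + k * mag.std()   # illustrative excision threshold
        X[mag > threshold] = 0.0                 # remove narrow-band energy
        return np.fft.ifft(X)

    # Example: weak spread-spectrum signal proxy + noise + a strong narrow-band tone.
    rng = np.random.default_rng(0)
    n = np.arange(256)
    block = (0.1 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))  # noise
             + 0.05 * np.sign(rng.standard_normal(256))                        # signal proxy
             + 5.0 * np.exp(2j * np.pi * 0.2 * n))                             # jammer
    cleaned = excise_block(block)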

138 citations


Book ChapterDOI
15 Sep 1999
TL;DR: An evaluation against a manually categorized ground truth news corpus shows the TopCat technique is effective in identifying topics in collections of news articles.
Abstract: TopCat (Topic Categories) is a technique for identifying topics that recur in articles in a text corpus. Natural language processing techniques are used to identify key entities in individual articles, allowing us to represent an article as a set of items. This allows us to view the problem in a database/data mining context: Identifying related groups of items. This paper presents a novel method for identifying related items based on “traditional” data mining techniques. Frequent itemsets are generated from the groups of items, followed by clusters formed with a hypergraph partitioning scheme. We present an evaluation against a manually-categorized “ground truth” news corpus showing this technique is effective in identifying topics in collections of news articles.
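As a toy illustration of the first stage, the sketch below counts frequent entity pairs (the simplest frequent itemsets) across articles already reduced to sets of key entities; the entities, threshold, and function name are invented, and the subsequent hypergraph-partitioning step is not shown.

    from collections import Counter
    from itertools import combinations

    def frequent_pairs(articles, min_support=2):
        """Count co-occurring entity pairs and keep those meeting min_support."""
        counts = Counter()
        for entities in articles:
            for pair in combinations(sorted(entities), 2):
                counts[pair] += 1
        return {pair: n for pair, n in counts.items() if n >= min_support}

    articles = [
        {"NATO", "Kosovo", "Milosevic"},
        {"NATO", "Kosovo", "refugees"},
        {"Federal Reserve", "interest rates"},
        {"Kosovo", "Milosevic", "peace talks"},
    ]
    print(frequent_pairs(articles))
    # -> {('Kosovo', 'Milosevic'): 2, ('Kosovo', 'NATO'): 2}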

134 citations


Proceedings Article
22 Aug 1999
TL;DR: Using information retrieval, information extraction, and collaborative filtering techniques, these systems are able to enhance corporate knowledge management by overcoming traditional problems of knowledge acquisition and maintenance and associated (human and financial) costs.
Abstract: In this paper we describe two systems designed to connect users to distributed, continuously changing experts and their knowledge. Using information retrieval, information extraction, and collaborative filtering techniques, these systems are able to enhance corporate knowledge management by overcoming traditional problems of knowledge acquisition and maintenance and associated (human and financial) costs. We describe the purpose of these two systems, how they work, and current deployment in a global corporate environment to enable end users to directly discover experts and their knowledge.

Journal ArticleDOI
TL;DR: The program history is traced starting with the Army's Stand Off Target Acquisition System as it evolved through the Small Aerostat Surveillance System and the Assault Breaker/Pave Mover programs into the currently fielded Joint Surveillance and Target Attack System [JointSTARS], which more than proved its worth in the 1991 Gulf War.
Abstract: The concept of airborne surveillance of enemy ground forces with a Ground Moving Target Indicator [now called GMTI] radar capable of detecting moving ground vehicles and helicopters was proposed in 1968 and resulted in a DoD program to realize its potential. This article traces the program history starting with the Army's Stand Off Target Acquisition System [SOTAS] as it evolved through the Small Aerostat Surveillance System [SASS] and the Assault Breaker/Pave Mover programs into the currently fielded Joint Surveillance and Target Attack System [JointSTARS], which, in prototype form, more than proved its worth in the 1991 Gulf War. New developments and trends in GMTI radars are also discussed together with other potential platforms.

Proceedings ArticleDOI
01 Dec 1999
TL;DR: This work considers the issue of composability as a design principle for simulation and describes a few of the complexities introduced through composability that might tend to offset the benefits of component based modeling on a large scale.
Abstract: We consider the issue of composability as a design principle for simulation. While component based modeling is believed to potentially reduce the complexities of the modeling task, we describe a few of the complexities introduced through composability. We observe that these complexities might tend to offset the benefits of component based modeling on a large scale.

Proceedings ArticleDOI
16 May 1999
TL;DR: A proactive, position-based routing protocol that provides an alternative, simplified way of localizing routing information overhead, without having to resort to complex, multiple-tier hierarchical routing organization schemes is proposed.
Abstract: We propose a proactive, position-based routing protocol that provides an alternative, simplified way of localizing routing information overhead, without having to resort to complex, multiple-tier hierarchical routing organization schemes. This is achieved by integrating the functions of routing and mobility management via the use of geographic position, and the generalization of the routing zone concept. The proposed protocol controls routing overhead generation and propagation by making the overhead generation rate and propagation distance directly proportional to the amount of change in a node's geographic position. In our protocol, a set of geographic routing zones is defined for each node, where the purpose of the ith routing zone is to contain propagation of position updates advertising position differentials equal to the radius of the (i-1)th routing zone. Finally, we show through simulation that the proposed routing protocol is a bandwidth-efficient routing mechanism that can be applied across large scale networks.
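A small sketch of the update-scoping rule is given below, assuming concentric zones with increasing radii around each node and that a position update advertising a differential of at least the (i-1)th radius is propagated out to the ith zone; the zone radii, units, and names are illustrative, not taken from the protocol.

    import math

    def propagation_radius(displacement, zone_radii):
        """Given how far a node has moved, return how far its position update
        should propagate (zone_radii must be increasing)."""
        radius = zone_radii[0]              # small moves stay in the innermost zone
        for i in range(1, len(zone_radii)):
            if displacement >= zone_radii[i - 1]:
                radius = zone_radii[i]      # big enough move: widen the flood scope
        return radius

    zones = [100.0, 400.0, 1600.0]          # metres, illustrative
    old_pos, new_pos = (0.0, 0.0), (250.0, 120.0)
    d = math.dist(old_pos, new_pos)         # ~277 m of movement
    print(propagation_radius(d, zones))     # -> 400.0 (update scoped to the 2nd zone)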

Proceedings ArticleDOI
28 Jun 1999
TL;DR: This paper identifies a simple and easily verified characteristic of protocols, and shows that the Otway-Rees protocol remains correct even when used in combination with other protocols that have this characteristic.
Abstract: Strand space analysis is a method for stating and proving correctness properties for cryptographic protocols. In this paper we apply the same method to the related problem of mixed protocols, and show that a protocol can remain correct even when used in combination with a range of other protocols. We illustrate the method with the familiar Otway-Rees protocol. We identify a simple and easily verified characteristic of protocols, and show that the Otway-Rees protocol remains correct even when used in combination with other protocols that have this characteristic. We also illustrate this method on the Neuman-Stubblebine protocol. This protocol has two parts, an authentication protocol (I) in which a key distribution center creates and distributes a Kerberos-like key, and a reauthentication protocol (II) in which a client resubmits a ticket containing that key. The re-authentication protocol II is known to be flawed. We show that in the presence of protocol II, there are also attacks against protocol I. We then define a variant of protocol II, and prove an authentication property of I that holds even in combination with the modified II.

Journal ArticleDOI
TL;DR: In this article, exploratory data analysis helps one to understand the sources, frequency, and types of changes being made, and a regression model helps managers communicate the cost and schedule effects of changing requirements to clients and other release stakeholders.
Abstract: Requirements are the foundation of the software release process. They provide the basis for estimating costs and schedules, as well as developing design and testing specifications. When requirements have been agreed on by both clients and maintenance management, then adding to, deleting from, or modifying those existing requirements during the execution of the software maintenance process impacts the maintenance cost, schedule, and quality of the resulting product. The basic problem is not the changing in itself, but rather the inadequate approaches for dealing with changes in a way that minimizes and communicates the impact to all stakeholders. Using data collected from one organization on 44 software releases spanning seven products, this paper presents two quantitative techniques for dealing with requirements change in a maintenance environment. First, exploratory data analysis helps one to understand the sources, frequency, and types of changes being made. Second, a regression model helps managers communicate the cost and schedule effects of changing requirements to clients and other release stakeholders. These two techniques can help an organization provide a focus for management action during the software maintenance process. Copyright © 1999 John Wiley & Sons, Ltd.
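The sketch below shows the flavor of such a regression model on hypothetical per-release data (number of requirements changes versus delivery effort); the variables, data, and simple linear form are assumptions for illustration, not the model fitted in the paper.

    import numpy as np

    # Hypothetical per-release data: requirements changes vs. effort (person-days).
    changes = np.array([2, 5, 8, 12, 15, 20, 24, 30])
    effort  = np.array([40, 55, 70, 95, 110, 140, 160, 195])

    # Ordinary least-squares fit: effort ~ b0 + b1 * changes.
    b1, b0 = np.polyfit(changes, effort, deg=1)
    print(f"each additional requirements change costs about {b1:.1f} person-days")

    # Communicating the impact of a proposed scope change to stakeholders:
    proposed_changes = 10
    print(f"{proposed_changes} changes -> roughly {b0 + b1 * proposed_changes:.0f} person-days")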

Book ChapterDOI
12 Apr 1999
TL;DR: This tracker was designed to evaluate application-specific Quality of Service (QoS) metrics to quantify its tracking services in a dynamic environment and to derive scheduling parameters directly from these QoS metrics to control tracker behavior.
Abstract: This paper describes a United States Air Force Advanced Technology Demonstration (ATD) that applied value-based scheduling to produce an adaptive, distributed tracking component appropriate for consideration by the Airborne Warning and Control System (AWACS) program. This tracker was designed to evaluate application-specific Quality of Service (QoS) metrics to quantify its tracking services in a dynamic environment and to derive scheduling parameters directly from these QoS metrics to control tracker behavior. The prototype tracker was implemented on the MK7 operating system, which provided native value-based processor scheduling and a distributed thread programming abstraction. The prototype updates all of the tracked-object records when the system is not overloaded, and gracefully degrades when it is. The prototype has performed extremely well during demonstrations to AWACS operators and tracking system designers. Quantitative results are presented.
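The sketch below gives a toy flavor of value-based scheduling: each ready activity carries a QoS-derived value and an estimated remaining compute time, and the scheduler runs the activity with the greatest value density, so lower-value work degrades first under overload. The value-density policy, names, and numbers are assumptions, not the MK7 scheduler's actual algorithm.

    from dataclasses import dataclass

    @dataclass
    class Activity:
        name: str
        value: float          # QoS-derived importance of completing this work
        remaining_ms: float   # estimated compute time still needed

    def pick_next(ready):
        """Value-based choice: run the activity with the highest value density."""
        return max(ready, key=lambda a: a.value / a.remaining_ms)

    ready = [
        Activity("update high-threat track", value=90.0, remaining_ms=3.0),
        Activity("update routine track",     value=10.0, remaining_ms=2.0),
        Activity("refresh display overlay",  value=5.0,  remaining_ms=1.0),
    ]
    print(pick_next(ready).name)   # -> update high-threat track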

Proceedings ArticleDOI
20 Jun 1999
TL;DR: This paper describes a program which revises a draft text by aggregating together descriptions of discourse entities, in addition to deleting extraneous information, which exploits statistical parsing and robust coreference detection.
Abstract: This paper describes a program which revises a draft text by aggregating together descriptions of discourse entities, in addition to deleting extraneous information. In contrast to knowledge-rich sentence aggregation approaches explored in the past, this approach exploits statistical parsing and robust coreference detection. In an evaluation involving revision of topic-related summaries using informativeness measures from the TIPSTER SUMMAC evaluation, the results show gains in informativeness without compromising readability.

Book ChapterDOI
01 Jun 1999
TL;DR: A new type of user model and a new kind of expert model are described and it is shown how these models can be used to individualize the selection of instructional topics.
Abstract: We describe a new kind of user model and a new kind of expert model and show how these models can be used to individualize the selection of instructional topics. The new user model is based on observing the individual’s behavior in a natural environment over a long period of time, while the new expert model is based on pooling the knowledge of numerous individuals. Individualized instructional topics are selected by comparing an individual’s knowledge to the pooled knowledge of her peers.
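An illustrative sketch of the topic-selection step follows, assuming both the individual's observed knowledge and the pooled peer knowledge are represented as per-topic scores in [0, 1]: the topics where the individual lags the pool the most are proposed for instruction. The representation, topic names, and scores are assumptions, not the paper's models.

    def select_topics(user_model, pooled_expert_model, n=2):
        """Rank topics by how far the individual lags the pooled peer knowledge."""
        gaps = {topic: pooled_expert_model[topic] - user_model.get(topic, 0.0)
                for topic in pooled_expert_model}
        return sorted(gaps, key=gaps.get, reverse=True)[:n]

    pooled = {"macros": 0.9, "charts": 0.7, "pivot tables": 0.8, "printing": 0.95}
    user   = {"macros": 0.2, "charts": 0.6, "printing": 0.9}
    print(select_topics(user, pooled))   # -> ['pivot tables', 'macros']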

Proceedings ArticleDOI
31 Oct 1999
TL;DR: One protocol, WIRP, performed well in both densely connected and sparser networks over the scenarios evaluated, while others did better in only one type (e.g., TORA in dense networks, LS in sparse ones); several remaining issues and areas for continued research are identified.
Abstract: This paper describes a simulation-based evaluation of several unicast routing protocols tailored specifically for mobile ad hoc networks. Four protocols were evaluated: the wireless Internet routing protocol (WIRP), a link state (LS) algorithm with constrained LS updates, a distance vector variant of WIRP, and the temporally ordered routing algorithm (TORA). The goal was to determine how well these routing protocols worked in specific tactical conditions, supporting a given mix of traffic. Factors varied included tactical scenario, network size, and loading. Tactical scenario included different connectivities (e.g., "dense" versus "sparse") and link fluctuation rates (e.g., "high" versus "low"). Metrics collected included average end-to-end delay per application, path length, total efficiency, and fraction of user messages received. We found that certain protocols performed better in densely connected networks than in sparser networks (e.g., TORA), while some performed better in sparser networks (e.g., LS). One protocol, WIRP, performed well in both types of networks over the scenarios evaluated. Several remaining issues and areas for continued research are identified.

Journal ArticleDOI
Paul Sass
TL;DR: The components and characteristics of the Army's legacy communications networks are surveyed, to illustrate the directions currently being taken for accomplishing this digitization, to describe the areas in which the civilian and military systems differ, and to define a glide path for convergence of the two technologies in support of the military's increasing appetite for information.
Abstract: In striving to meet the increasing demands for timely delivery of multimedia information to the warfighter of the 21st Century, the US Army is undergoing a gradual evolution from its “legacy” communications networks to a flexible internetwork architecture based solidly on the underlying communications protocols and technology of the commercial Internet. The framework for this new digitized battlefield, as described in the DoD's Joint Technical Architecture (JTA), is taken from the civilian telecommunications infrastructure which, in many cases, differs appreciably from the rigors of the battlefield environment. The purpose of this paper is to survey the components and characteristics of the Army's legacy communications networks, to illustrate the directions currently being taken for accomplishing this digitization, to describe the areas in which the civilian and military systems differ, and to define a glide path for convergence of the two technologies in support of the military's increasing appetite for information.

Journal ArticleDOI
M. R. Dellomo
TL;DR: This paper considers the feasibility of using a neural network to perform fault detection on vibration measurements given by accelerometer data and the results obtained are presented.
Abstract: One of the most dangerous problems that can occur in both military and civilian helicopters is the failure of the main gearbox. Currently, the principal method of controlling gearbox failure is to regularly overhaul the complete system. This paper considers the feasibility of using a neural network to perform fault detection on vibration measurements given by accelerometer data. The details and results obtained from studying the neural network approach are presented. Some of the elementary underlying physics will be discussed along with the preprocessing necessary for analysis. Several networks were investigated for detection and classification of the gearbox faults. The performance of each network will be presented. Finally, the network weights will be related back to the underlying physics of the problem.
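The sketch below illustrates the overall shape of such a detector: synthetic accelerometer traces are reduced to spectral band-energy features and fed to a small feed-forward network (scikit-learn's MLPClassifier). The synthetic data, feature choice, and network size are assumptions for illustration only, not the networks or preprocessing studied in the paper.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)

    def band_energies(signal, n_bands=8):
        """Crude feature vector: normalized energy in equal-width FFT bands."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        e = np.array([band.sum() for band in np.array_split(spectrum, n_bands)])
        return e / e.sum()

    def synthetic_record(faulty):
        """Simulated accelerometer trace: gear-mesh tone plus noise;
        a fault adds extra harmonic content."""
        t = np.arange(1024) / 1024.0
        x = np.sin(2 * np.pi * 60 * t) + 0.3 * rng.standard_normal(t.size)
        if faulty:
            x = x + 0.8 * np.sin(2 * np.pi * 180 * t)
        return x

    X = np.array([band_energies(synthetic_record(label)) for label in [0, 1] * 50])
    y = np.array([0, 1] * 50)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, y)
    print(clf.predict([band_energies(synthetic_record(faulty=True))]))   # expected: [1]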

Journal ArticleDOI
TL;DR: Information warfare defense must consider the whole process of attack, response, and recovery; prevention is just one phase, and recovery, though often neglected, is an equally important phase of information warfare defense.
Abstract: Recent exploits by hackers have drawn attention to the importance of defending against potential information warfare. Defense and civil institutions rely so heavily on their information systems and networks that attacks that disable them could be devastating. Yet, as hacker attacks have demonstrated, protective mechanisms are fallible. Features and services that must be in place to carry out needed, legitimate functions can be abused by being used in unexpected ways to provide an avenue of attack. Further, an attacker who penetrates one system can use its relationships with other systems on the network to compromise them as well. Experiences of actual attacks have led to the recognition of the need to detect and react to attacks that succeed in breaching a system's protective mechanisms. It is of course necessary to take steps to prevent attacks from succeeding. At the same time, however, it is important to recognize that not all attacks can be averted at the outset. Attacks that succeed to some degree are unavoidable, and comprehensive support for identifying and responding to attacks is required [1]. Information warfare defense must consider the whole process of attack, response, and recovery. This requires a recognition of the multiple phases of the information warfare process. Prevention is just one phase; we explain others and then focus on the oft-neglected recovery phase. Prevention and detection receive most of the attention, but recovery is an equally important phase of information warfare defense.

Journal ArticleDOI
TL;DR: The Evaluation Working Group in the Defense Advanced Research Projects Agency (DARPA) Intelligent Collaboration and Visualization (IC&V) program has developed a methodology for evaluating collaborative systems, which consists of a framework for classification of CSCW systems, metrics and measures related to the various components in the framework, and a scenario-based evaluation approach.
Abstract: The Evaluation Working Group (EWG) in the Defense Advanced Research Projects Agency (DARPA) Intelligent Collaboration and Visualization (IC&V) program has developed a methodology for evaluating collaborative systems. This methodology consists of a framework for classification of CSCW (Computer Supported Cooperative Work) systems, metrics and measures related to the various components in the framework, and a scenario-based evaluation approach. This paper describes the components of this methodology. Two case studies of evaluations based on this methodology are also described.

Proceedings ArticleDOI
S. Boykin, A. Merlino
07 Jun 1999
TL;DR: Techniques and results for automatically segmenting multimedia sources using simple cues within and across multimedia streams, with specific emphasis on broadcast news are discussed.
Abstract: We discuss techniques and results for automatically segmenting multimedia sources, with specific emphasis on broadcast news. We describe techniques that detect news events (e.g., the start and stop of stories or advertisements) using simple cues within and across multimedia streams (e.g., audio, video, text). In addition, we discuss a process for an evaluation and modification of our model using precision and recall metrics. Guided by our findings, we isolate multimedia cues that we consider critical in news story and advertisement segmentation. We also discuss ideas for improving segmentation beyond the methods described. Throughout the paper, we reference MITRE's Broadcast News Navigator (BNN) used to perform event segmentation.
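The sketch below shows one simple way to score boundary detection with precision and recall, counting a detected boundary as correct when it falls within a tolerance window of an unmatched reference boundary; the tolerance, timestamps, and scoring convention are assumptions, not the evaluation protocol used for BNN.

    def precision_recall(hypothesis, reference, tol=1.0):
        """Score hypothesized segment boundaries (times in seconds) against a
        reference; each reference boundary may be matched at most once."""
        unmatched = list(reference)
        hits = 0
        for h in hypothesis:
            match = next((r for r in unmatched if abs(h - r) <= tol), None)
            if match is not None:
                unmatched.remove(match)
                hits += 1
        precision = hits / len(hypothesis) if hypothesis else 0.0
        recall = hits / len(reference) if reference else 0.0
        return precision, recall

    reference_starts = [0.0, 62.5, 131.0, 190.2]   # true story starts (s)
    detected_starts = [0.3, 61.8, 150.0]           # detected boundaries (s)
    print(precision_recall(detected_starts, reference_starts))   # -> (0.666..., 0.5)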

Book
30 Nov 1999
TL;DR: The paper provides an overview of transaction processing needs and solutions in conventional DBMSs as background, explains the constraints introduced by multilevel security, and describes the results of research in multilevel secure transaction processing, including results and limitations in concurrency control, multilevel transaction management, and secure commit protocols.
Abstract: Since 1990, transaction processing in multilevel secure database management systems (DBMSs) has been receiving a great deal of attention from the security community. Transaction processing in these systems requires modification of conventional scheduling algorithms and commit protocols. These modifications are necessary because preserving the usual transaction properties when transactions are executing at different security levels often conflicts with the enforcement of the security policy. Considerable effort has been devoted to the development of efficient, secure algorithms for the major types of secure DBMS architectures: kernelized, replicated, and distributed. An additional problem that arises uniquely in multilevel secure DBMSs is that of secure, correct execution when data at multiple security levels must be written within one transaction. Significant progress has been made in a number of these areas, and a few of the techniques have been incorporated into commercial trusted DBMS products. However, many open problems remain to be explored. This paper reviews the achievements to date in transaction processing for multilevel secure DBMSs. The paper provides an overview of transaction processing needs and solutions in conventional DBMSs as background, explains the constraints introduced by multilevel security, and then describes the results of research in multilevel secure transaction processing. Research results and limitations in concurrency control, multilevel transaction management, and secure commit protocols are summarized. Finally, important new areas are identified for secure transaction processing research.

Journal ArticleDOI
TL;DR: In this paper, a new algorithm is given for the two-dimensional translational containment problem: find translations for k polygons which place them inside a polygonal container without overlapping.

Journal ArticleDOI
R.L. Fante
TL;DR: A procedure is developed to simultaneously synthesize sum and difference patterns for space-time adaptive processing (STAP) in such a way that a specified monopulse slope after adaptation is achieved.
Abstract: A procedure is developed to simultaneously synthesize sum and difference patterns for space-time adaptive processing (STAP) in such a way that a specified monopulse slope after adaptation is achieved.

Proceedings ArticleDOI
24 Oct 1999
TL;DR: A new procedure is developed that has better performance for large estimation errors and, when used to initialize the Weiss and Friedlander (1991) MUSIC-based iterative technique, yields significant improvement over existing techniques for both small and large errors.
Abstract: Self-calibration algorithms estimate both source directions-of-arrival (DOAs) and perturbed array response vector parameters, such as sensor locations. Calibration errors are usually assumed to be small and a first order approximation to the perturbed array response vector is often used to simplify the estimation procedure. In this paper, we develop a new procedure that does not rely on the small error assumption. It has better performance for large estimation errors, and when used to initialize the Weiss and Friedlander (1991) MUSIC-based iterative technique, we see significant improvement over existing techniques for both small and large errors.

Proceedings ArticleDOI
01 Dec 1999
TL;DR: The article addresses problems involving the size and complexity of models, verification, validation and accreditation, the modeling methodological and model execution implications of parallel and distributed simulation, and random number generation and execution efficiency improvements through quasi-Monte Carlo, and variance reduction.
Abstract: The future directions of simulation research are analysed. The formulation of such a vision could provide valuable guidance and assistance with respect to decisions involving the generation and allocation of future research funding. The article addresses problems involving: (1) the size and complexity of models; (2) verification, validation and accreditation; (3) the modeling methodological and model execution implications of parallel and distributed simulation; (4) the centrality of modeling to the discipline of computer science; and (5) random number generation and execution efficiency improvements through quasi-Monte Carlo, and variance reduction.