
Showing papers in "The Journal of Defense Modeling and Simulation: Applications, Methodology, Technology in 2013"


Journal ArticleDOI
TL;DR: An introductory overview of the GM-VV basic principles, concepts, methodology components and their interrelationships is provided, and some results and lessons learned from several technology demonstration programs of the Dutch Ministry of Defence are illustrated.
Abstract: The Generic Methodology for Verification and Validation (GM-VV) is a generic and comprehensive methodology for structuring, organizing and managing the verification and validation (V&V) of modelling and simulation (M&S) assets. The GM-VV is an emerging recommended practice within the Simulation Interoperability Standards Organization (SISO). The GM-VV provides a technical framework to efficiently develop arguments to justify why M&S assets are acceptable or unacceptable for a specific intended use. This argumentation supports M&S stakeholders in their acceptance decision-making process regarding the development, application and reuse of such M&S assets. The GM-VV technical framework assures that during the execution of the V&V work the decisions, actions, information and evidence underlying such acceptance arguments will be traceable, reproducible, transparent and documented. Since the GM-VV is a generic (i.e. abstract) methodology, it must be tailored to fit the specific V&V needs of a M&S organization, project or application domain. Therefore, V&V practitioners must incorporate specific V&V techniques within the generic architectural template offered by the GM-VV in order to properly assess the M&S assets under review. The first part of this paper provides an introductory overview of the GM-VV basic principles, concepts, methodology components and their interrelationships. The second part of the paper focuses on how the GM-VV may be tailored for a specific simulation application. This effort is illustrated with some results and lessons learned from several technology demonstration programs of the Dutch Ministry of Defence.

34 citations


Journal ArticleDOI
TL;DR: In this paper, physical fitness data and rushing times were collected, as rushing is a battlefield task influenced by physical fitness, and two scenarios, a helicopter extraction of a squad and rushing for cover in an attempt to throw a grenade, were implemented in agent-based simulations to demonstrate the effect of rushing speed upon the outcome of a tactical infantry scenario.
Abstract: While physical fitness is generally accepted to influence the outcome on the battlefield, it is currently not incorporated into tactical infantry simulations. Infantry soldiers are modeled with equal physical capabilities representing the average of soldiers on the field. However, humans have varying physical capabilities. This research asked the question ‘Does modeling human physical capabilities have an impact upon the tactical success of operations in a simulation?’ Physical fitness data and rushing times were collected, as rushing is a battlefield task influenced by physical fitness. Two scenarios, a helicopter extraction of a squad and rushing for cover in an attempt to throw a grenade, were implemented in agent-based simulations to demonstrate the effect of rushing speed upon the outcome of a tactical infantry scenario. These scenarios used experimentally obtained rushing velocities as input and were compared to real-world scenarios to ensure plausibility. In both simulations rushing speed significa...

19 citations
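The contrast the abstract draws, identical average-speed soldiers versus soldiers with varying physical capabilities, can be illustrated with a small Monte Carlo sketch. All numbers here (distance, squad size, mean speed, spread) are hypothetical assumptions, not values from the paper:

```python
import random
import statistics

random.seed(7)

DISTANCE_M = 50.0   # hypothetical rush distance to the extraction point
MEAN_SPEED = 3.0    # m/s, assumed squad-average rushing speed
SPEED_SD = 0.5      # m/s, assumed spread in individual fitness

def squad_extraction_time(speeds):
    """Extraction completes only when the slowest soldier arrives."""
    return max(DISTANCE_M / s for s in speeds)

# Homogeneous model: every soldier rushes at the squad mean.
uniform_time = squad_extraction_time([MEAN_SPEED] * 9)

# Heterogeneous model: each soldier's speed drawn individually.
trials = [squad_extraction_time(
              [max(1.0, random.gauss(MEAN_SPEED, SPEED_SD)) for _ in range(9)])
          for _ in range(10_000)]

print(f"homogeneous squad: {uniform_time:.1f} s")
print(f"heterogeneous squad (mean slowest-arrival): {statistics.mean(trials):.1f} s")
```

Because the squad waits on its slowest member, the heterogeneous model systematically yields longer extraction times than the averaged model, which is the kind of tactical difference the paper investigates.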


Journal ArticleDOI
TL;DR: The line of pixels used by ILRX shows an advantage over RX and IRX in that it appears to mitigate the deleterious effects of correlation due to the spatial proximity of the pixels, while the iterative adaptation taken from IRX simultaneously eliminates outliers, allowing ILRX an advantage over LRX.
Abstract: This research involves simulating remote sensing conditions using previously collected hyperspectral imagery (HSI) data. The Reed–Xiaoli (RX) anomaly detector is well-known for its unsupervised ability to detect anomalies in hyperspectral images. However, the RX detector assumes uncorrelated and homogeneous data, both of which are not inherent in HSI data. To address this difficulty, we propose a new method termed linear RX (LRX). Whereas RX places a test pixel at the center of a moving window, LRX employs a line of pixels above and below the test pixel. In this paper, we contrast the performance of LRX, a variant of LRX called iterative linear RX (ILRX), the recently introduced iterative RX (IRX) algorithm, and the support vector data description (SVDD) algorithm, a promising new HSI anomaly detector. Through experimentation, the line of pixels used by ILRX shows an advantage over RX and IRX in that it appears to mitigate the deleterious effects of correlation due to the spatial proximity of the pixels; ...

14 citations
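The RX detector underlying LRX and ILRX scores each pixel by its Mahalanobis distance from background statistics. A minimal global-RX sketch on synthetic data (not the windowed, line-based, or iterative variants the paper studies) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "hyperspectral" cube: 40x40 pixels, 5 spectral bands of background.
H, W, B = 40, 40, 5
cube = rng.multivariate_normal(np.zeros(B), np.eye(B), size=(H, W))
cube[20, 20] += 8.0  # implant one spectrally anomalous pixel

def rx_scores(img):
    """Global RX: Mahalanobis distance of each pixel to the scene statistics."""
    X = img.reshape(-1, img.shape[-1])
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    # Quadratic form d_i^T cov_inv d_i for every pixel i.
    return np.einsum('ij,jk,ik->i', d, cov_inv, d).reshape(img.shape[:2])

scores = rx_scores(cube)
peak = tuple(int(i) for i in np.unravel_index(scores.argmax(), scores.shape))
print("highest anomaly score at pixel", peak)
```

LRX replaces the global statistics with statistics from a line of pixels above and below the test pixel, and ILRX additionally re-estimates them iteratively after discarding outliers.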


Journal ArticleDOI
TL;DR: This paper examines simulation uncertainties using a paradigm that facilitates consideration of all potential sources of simulation uncertainty, indicates the potential magnitude of uncertainties for simulation results in various areas, and encourages a realistic and comprehensive appreciation of simulation uncertainty.
Abstract: Drawing appropriate conclusions from simulation results requires a correct understanding of the accuracy and context of those results. Simulation communities often assess simulation results without...

13 citations


Journal ArticleDOI
TL;DR: In this article, a possible solution is presented that is based on the notion of bad models, the concept of plausibility, and the method of simulation-based weak point analysis.
Abstract: ‘Data farming’ is based on the idea that simulation models run thousands of times can provide insights into the possible consequences of different options. However, the validity of the models used for data farming, especially in the context of HSCB (human, social, cultural and behavioural) modelling for decision-making and future studies, is at least questionable. This paper first reflects on the epistemological aspects of this predicament in order to illustrate its fundamental severity. Then, a possible solution is presented that is based on the notion of ‘bad models’, the concept of plausibility, and the method of simulation-based weak point analysis. The approach can be complemented by interactive war gaming. Such a systematic approach appears more defensible than most attempts to use HSCB models for affirmative purposes, and is methodologically easier to implement since it solely requires focusing on the validation of empirically amenable micro-processes.

11 citations


Journal ArticleDOI
TL;DR: The simulation model combines elements of traditional differential equation force-on-force modeling with modern social science modeling of networks, PSYOPs, and coalition cooperation to build a framework that can inform various levels of military decision makers in order to understand and improve COIN strategy.
Abstract: We model insurgency and counter-insurgency (COIN) operations with a large-scale system of differential equations and a dynamically changing coalition network. We use these structures to analyze the components of leadership, promotion, recruitment, financial resources, operational techniques, network communications, coalition cooperation, logistics, security, intelligence, infrastructure development, humanitarian aid, and psychological warfare, with the goal of informing today's decision makers of the options available in COIN tactics, operations, and strategy. In modern conflicts, techniques of asymmetric warfare wreak havoc on the inflexible, regardless of technological or numerical advantage. In order to be more effective, the US military must improve its COIN capabilities and flexibility to match the adaptability and rapid time-scales of insurgent networks and terror cells. Our simulation model combines elements of traditional differential equation force-on-force modeling with modern social science modeling of networks, PSYOPs, and coalition cooperation to build a framework that can inform various levels of military decision makers in order to understand and improve COIN strategy. We show a test scenario of eight stages of COIN operation to demonstrate how the model behaves and how it could be used to decide on effective COIN resources and strategies.

10 citations
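For readers unfamiliar with differential-equation force-on-force modeling, the classical Lanchester aimed-fire equations are the usual starting point. The sketch below is a generic illustration only; the paper's system is far larger and couples many additional state variables (recruitment, finances, coalition networks, and so on):

```python
# Minimal Lanchester aimed-fire model, integrated with explicit Euler.
# All force sizes and effectiveness coefficients are illustrative assumptions.
def lanchester(blue, red, b_eff, r_eff, dt=0.01, t_end=50.0):
    t = 0.0
    while t < t_end and blue > 0 and red > 0:
        d_blue = -r_eff * red    # blue attrition caused by red fire
        d_red = -b_eff * blue    # red attrition caused by blue fire
        blue = max(0.0, blue + dt * d_blue)
        red = max(0.0, red + dt * d_red)
        t += dt
    return blue, red

# Square law: the larger force prevails despite lower per-shooter effectiveness.
blue, red = lanchester(blue=200.0, red=100.0, b_eff=0.02, r_eff=0.05)
print(f"survivors  blue: {blue:.0f}  red: {red:.0f}")
```

In the square law, combat power scales with the square of force size times effectiveness (here 0.02·200² = 800 for blue versus 0.05·100² = 500 for red), so the numerically superior side wins the exchange.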


Journal ArticleDOI
TL;DR: The proposed concept of integrated executable architecture enables a dynamic combination of the formerly separated and static areas of business processes, system design and resource modelling, which provides time-based, dynamic visualisation of system behaviour by combining business process simulation, system simulation and synthetic environments.
Abstract: This paper presents a framework for integrated executable architectures, which is the integration of architecture modelling tools and simulation tools. This framework allows the use of executable architectures during the conceptual analysis and design phases in order to address system complexity at an early stage of the development. The proposed concept of integrated executable architecture enables a dynamic combination of the formerly separated and static areas of business processes, system design and resource modelling. It provides time-based, dynamic visualisation of system behaviour by combining business process simulation, system simulation and synthetic environments. This allows an early understanding of emergent behaviour, time-dependent behaviour and performance estimation of the architecture.

9 citations


Journal ArticleDOI
TL;DR: BASE-IT (Behavioral Analysis and Synthesis for Intelligent Training), a system in development that aims to automate capture of training data and their analysis, performance evaluation, and AAR report generation, is described.
Abstract: The training objective for urban warfare includes acquisition and perfection of a set of diverse skills in support of kinetic and non-kinetic operations. The US Marines (USMC) employ long-duration acted scenarios with verbal training feedback provided sporadically throughout the training session and at the end in the form of an after-action review (AAR). An inherent characteristic of training ranges for urban warfare is their high level of physical occlusion, which causes many performances to go unseen by the group of instructors who oversee the training. We describe BASE-IT (Behavioral Analysis and Synthesis for Intelligent Training), a system in development that aims to automate capture of training data and their analysis, performance evaluation, and AAR report generation. The goal of this effort is to greatly increase the amount of observed behavior and improve the quality of the AAR. The system observes training with stationary cameras and personal tracking devices. It t...

9 citations


Journal ArticleDOI
TL;DR: In this paper, the authors outline statistical methods used to analyze the behavior signatures that are hidden deep within data on terrorist attacks, and they have the potential to allow military comman...
Abstract: In this paper we outline statistical methods used to analyze the behavior signatures that are hidden deep within data on terrorist attacks. These methods have the potential to allow military comman...

8 citations


Journal ArticleDOI
TL;DR: In this paper, the authors conducted a DOTMLPF (doctrine, organization, training, materiel, leadership and education, personnel, facilities) assessment to determine gaps in the current force structure and solutions for future force design.
Abstract: This study illustrates a new approach to conducting capabilities-based analysis by assessing the requirements and capabilities of Army aeromedical evacuation units. We conducted a DOTMLPF (doctrine, organization, training, materiel, leadership and education, personnel, facilities) assessment to determine gaps in the current force structure and solutions for future force design. Specifically, this study tackles the following research questions. RQ1: What are the gaps in medical evacuation mission execution for current operations and operations involving geographically dispersed units? RQ2: What capabilities might mitigate these gaps by examining the design characteristics of DOTMLPF? Our research design involved primary collection of data from senior aviation and medical aviation leaders using structured and unstructured survey questions. Using a mixed-method approach, we addressed RQ1 using quantitative methods and RQ2 through qualitative analysis. The results of our study determined the current organizational probl...

8 citations


Journal ArticleDOI
TL;DR: Agent-based modeling implemented by the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT®) software was used to conduct computer-based simulation modeling to develop options for redeployment of existing maritime law enforcement resources, deployment of new resources, and optimal use of geographic terrain.
Abstract: United States ports must be prepared for the threat of a small-vessel attack using weapons of mass destruction (WMD). To reduce the risk of such an attack, modeling was conducted at the Savannah Ri...

Journal ArticleDOI
TL;DR: In this article, field experiments were used to evaluate three different indirect fire models: the cookie cutter and the Carleton damage functions and a simplified physical model for fragmenting ammunition. Data fr...
Abstract: Field experiments were used to evaluate three different indirect fire models: the cookie cutter and the Carleton damage functions and a simplified physical model for fragmenting ammunition. Data fr...
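The two damage functions named in the abstract are standard in indirect-fire modeling: the cookie cutter is all-or-nothing within a lethal radius, while the Carleton function decays smoothly with miss distance. The parameterization below is one commonly cited form, shown as an assumption rather than the paper's exact formulation:

```python
import math

def cookie_cutter(x, y, lethal_radius):
    """All-or-nothing damage: full effect inside the radius, none outside."""
    return 1.0 if math.hypot(x, y) <= lethal_radius else 0.0

def carleton(x, y, rx, ry):
    """Carleton damage function (one common parameterization, assumed here):
    damage probability decays exponentially with squared miss distance,
    with separate scale parameters rx, ry along each axis."""
    return math.exp(-(x / rx) ** 2 - (y / ry) ** 2)

# Compare the two models over increasing miss distance along one axis.
for miss in (0.0, 5.0, 10.0, 20.0):
    print(f"miss {miss:4.1f} m  cookie: {cookie_cutter(miss, 0.0, 10.0):.1f}  "
          f"carleton: {carleton(miss, 0.0, 10.0, 10.0):.2f}")
```

The field-experiment comparison in the paper essentially asks which of these shapes (or a fragmentation-physics model) best matches observed effects against real targets.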

Journal ArticleDOI
TL;DR: Two risk-analytic approaches for allocating operating funding among defence organization activities, termed priority method and knapsack method, are developed and applied in the context of the Department of National Defence in Canada.
Abstract: We develop two risk-analytic approaches for allocating operating funding among defence organization activities. In one, termed the priority method, activities are put in rank order and as many high-priority activities as possible are undertaken while ensuring that the budget holder’s probability of overspending his budget is acceptably small. In the second, termed the knapsack method, there are two kinds of activities: must-do activities and optional activities. Optional activities are selected using a nonlinear integer program that maximizes the value of the optional activities while keeping the probability of overspending sufficiently low. Both approaches are applied in the context of the Department of National Defence in Canada.
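The priority method can be sketched directly from the abstract's description: fund activities in rank order while the probability of overspending stays acceptably small. The independent-normal cost model, the stop-at-first-failure rule, and all figures below are illustrative assumptions, not the paper's formulation:

```python
from statistics import NormalDist

BUDGET = 100.0
MAX_OVERSPEND_PROB = 0.10  # budget holder's risk tolerance (assumed)

# (name, mean cost, cost std dev), already in priority rank order (all assumed).
activities = [("A", 30, 4), ("B", 25, 6), ("C", 20, 5), ("D", 18, 5), ("E", 15, 3)]

funded, mean_sum, var_sum = [], 0.0, 0.0
for name, mu, sigma in activities:
    m, v = mean_sum + mu, var_sum + sigma ** 2
    # P(total cost > budget) under independent, normally distributed costs.
    p_over = 1.0 - NormalDist(m, v ** 0.5).cdf(BUDGET)
    if p_over <= MAX_OVERSPEND_PROB:
        funded.append(name)
        mean_sum, var_sum = m, v
    else:
        break  # priority method: stop at the first unaffordable activity

p_final = 1.0 - NormalDist(mean_sum, var_sum ** 0.5).cdf(BUDGET)
print(funded, f"P(overspend) = {p_final:.3f}")
```

The knapsack method replaces this greedy rank-order scan with a nonlinear integer program over the optional activities, maximizing value subject to the same overspend-probability constraint.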

Journal ArticleDOI
TL;DR: The requirements and a design overview for an easily extensible, customizable and integrable, High Level Architecture (HLA)-based Tactical Environment Application Framework are provided and an Application Framework that realizes the defined system is explained.
Abstract: Using Tactical Environment Simulations as part of simulation systems or real systems increases the reality and quality of the applications, reduces development time, and increases interoperability and reusability of systems. Tactical Environment Simulations are utilized by integrating them into applications that can have a wide variety of requirements, so they are generally customized to meet the specific requirements of those applications. Unlike most Commercial-Off-The-Shelf (COTS) tools, which are used directly, Tactical Environment Simulations are often provided as Application Frameworks which contain readily available simulation models, middleware, extension points and a tool set to ease customization and integration. In this paper, the requirements and a design overview for an easily extensible, customizable and integrable, High Level Architecture (HLA)-based Tactical Environment Application Framework are provided. After defining the requirements and giving an overview of design, an Application Fr...

Journal ArticleDOI
TL;DR: The concept of visual analytics is explored as an enabler to facilitate rapid, defensible and superior decision making by coupling analytical reasoning with the exceptional human capability to rapidly internalize and understand visual data.
Abstract: The foundational concept of Network Enabled Capability relies on effective, timely information sharing. This information is used in analysis, trade and scenario studies, and ultimately decision making. In this paper, the concept of visual analytics is explored as an enabler to facilitate rapid, defensible and superior decision making. By coupling analytical reasoning with the exceptional human capability to rapidly internalize and understand visual data, visual analytics allows individual and collaborative decision making to occur in the face of vast and disparate data, time pressures and uncertainty. An example visual analytics framework is presented in the form of a decision-making environment centered on the Lockheed C-5A and C-5M aircraft. This environment allows rapid trade studies to be conducted on design, logistics and capability within the aircraft’s operational roles. Through this example, the use of a visual analytics decision-making environment within a military environment is demonstrated.

Journal ArticleDOI
TL;DR: In this article, the authors focus on the search aspect of STA for the unaided human eye, provide empirical results, and recommend modeling approaches that improve the representation of unaided search in military models and simulations.
Abstract: Representation of search and target acquisition (STA) in military models and simulations arguably abstracts the most critical aspects of combat. This research focuses on the search aspect of STA for the unaided human eye. It is intuitive that an individual's environmental characteristics and interpretation of the environment in the context of all comprehended information, commonly summarized as their situational awareness (SA), influence attention and search. Current simulation models use a primitive sweeping search method that devotes an unbiased amount of time to every area in an entity's field of regard and neglects the effects of SA. The goal of this research is to provide empirical results and recommend modeling approaches that improve the representation of unaided search in military models and simulations. The major contributions towards this goal include novel empirical results from two incremental eye-tracking experiments, analysis and modeling of the eye-tracking data to illustrate the effect of the environment and SA on search, and a recommended model for unaided search for high-fidelity combat simulation models. The results of this work support soldier search models driven by metrics that summarize the threat based on environmental characteristics and contextual information.

Journal ArticleDOI
TL;DR: A network controller is developed that uses outbound router queue size predictions to optimize computer networks, on the premise that intelligent agents can use such predictions to form context-aware, cognitive processes for managing network communication.
Abstract: While the routing and congestion control algorithms in use today are often adequate for networks with relatively static topology and relatively lax quality of service (QoS) requirements, these algorithms may not be sufficient for military networks where a strict level of QoS is required in order to achieve mission objectives. Current technology limits a network’s ability to adapt to changes and interactions, and this often results in sub-optimal performance. This article develops a network controller that uses outbound router queue size predictions to optimize computer networks. These queue size predictions are made possible through the use of Kalman filters to detect network congestion. The premise is that intelligent agents can use such predictions to form context-aware, cognitive processes for managing network communication. The system shows great promise when modeled and simulated using the NS2 network simulation platform.
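A scalar Kalman filter over a random-walk queue model is enough to illustrate the queue-size estimation idea; the noise variances and measurement samples below are assumptions, and the article's controller logic and NS2 integration are not represented:

```python
# Scalar Kalman filter tracking an outbound queue length from noisy samples.
# Illustrative sketch only; q and r (noise variances) are assumed values.
def kalman_track(measurements, q=1.0, r=4.0):
    """q: process noise variance, r: measurement noise variance."""
    x, p = measurements[0], 1.0      # state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                       # predict step: random-walk queue model
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # update with the new noisy sample
        p *= (1 - k)
        estimates.append(x)
    return estimates

samples = [2, 3, 2, 4, 6, 9, 13, 18]  # queue building toward congestion
est = kalman_track(samples)
print(f"latest filtered queue estimate: {est[-1]:.1f}")
```

A controller would compare such filtered estimates (or their one-step predictions) against a congestion threshold to trigger routing or rate decisions before the queue overflows.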

Journal ArticleDOI
TL;DR: This paper applies a relatively new methodology, Random Forests, to the problem of predicting culpability and compares it to some of the more frequently used statistical classification techniques, including multinomial logistic regression and naïve Bayesian classification.
Abstract: Recently, researchers have become interested in the issue of assessing culpability for terrorist attacks when no one group claims or multiple groups claim responsibility. Several new methods have b...

Journal ArticleDOI
TL;DR: The newly created agent-based software ABSNEC (Agent-Based System for Network Enabled Capabilities) is described, highlighting some of its salient features: the ability to represent human factors in the analysis of battle outcomes in network operations, and the ability to represent realistic force structures with tiered C2 architectures.
Abstract: This paper presents an approach to understanding network-enabled operations using agent-based simulations. We describe the newly created agent-based software ABSNEC (Agent-Based System for Network Enabled Capabilities), highlighting some of its salient features: the ability to represent human factors towards the analysis of battle outcomes in network operations, and the ability to represent realistic force structures with tiered C2 architectures. We provide affirmative results of three validation techniques to date on the model. Finally, we demonstrate the utilization of ABSNEC to acquire meaningful insights for analysis through two examples: a study on the interrelationship between fratricide, human factors, and situation awareness; and the generation of alternative combat strategies for a military engagement.

Journal ArticleDOI
TL;DR: This article proposes a novel approach that addresses network-centric terrain database generation and provides distributed, bi-directional, incremental updates.
Abstract: A great deal has changed since 1997 when Schiavone first described the generic pipeline generation process for building an environmental database for military models and simulations. Environment representations have gone from cartoonish graphics to near real-life renderings possessing properties of physics and intelligence where appropriate. Within this article we discuss several contemporary and past terrain database generation processes, as well as challenges faced in generating terrain databases for military models and simulations today. These challenges include decentralized, distributed modification, frequent and incremental updates, and the need for a shared level of validity by users with heterogeneous needs. We propose a novel approach that addresses network-centric terrain database generation and provides distributed, bi-directional, incremental updates, addressing many of these needs. We also developed, tested, and analyzed a prototype architecture and two types of feature modification applied...

Journal ArticleDOI
TL;DR: The US Department of Defense (DoD) requires all models and simulations that it manages, develops, and/or uses to be verified, validated, and accredited as discussed by the authors, which is critical to irregular warfare (IW) modeling.
Abstract: The US Department of Defense (DoD) requires all models and simulations that it manages, develops, and/or uses to be verified, validated, and accredited. Critical to irregular warfare (IW) modeling ...

Journal ArticleDOI
TL;DR: In this article, the authors present a modeling and simulation approach that clearly increases the efficacy of training and education efforts for peace support operations, and discuss the decision to use a computer simulation to support the training event and provide an overview of the methodology for planning and executing the game.
Abstract: We present a modeling and simulation approach that clearly increases the efficacy of training and education efforts for peace support operations. Our discussion involves how a computer simulation, the Peace Support Operations Model, is integrated into a training and education venue in Kyrgyzstan for a "Game for Peace." On September 12–23, 2011, members of NATO's Partnership for Peace Training and Education Centers collaborated to instruct a United Nations' Peacekeeping Operations course at the Kyrgyz Separate Rifle Battalion in Bujum, Kyrgyzstan. Phase II of the course was also conducted on October 17–21, 2011, for members of the Peacekeeping Brigade of the Kazakhstan Army (KAZBRIG) in Almaty, Kazakhstan. Although such courses are a mainstay in NATO support in preparing member nations for peace support operations, the application of a computer simulation is unique. We relate the decision to use a computer simulation to support the training event and provide an overview of the methodology for planning and executing the game. Insights from the game about training and educating future peacekeepers and lessons for using computer simulations are instructive for future efforts and mark the way to leverage the advantages of computer simulations.

Journal ArticleDOI
TL;DR: Analysis of observed data trends from a broad range of DE experiments performed within capability development programs for the United Kingdom and Australian Governments over the period of 2001–2010 is presented.
Abstract: Defense Experimentation (DE) using modeling and simulation (M&S) is increasingly being adopted as a means to better understand complex defense capability problems. This is being given added impetus by the amplified focus on Network Enabled Capability (NEC) and the rising use of advanced information and communications technology within military operations. This paper presents analysis of observed data trends from a broad range of DE experiments performed within capability development programs for the United Kingdom and Australian Governments over the period 2001–2010. A range of variables were tracked concerning the experiment’s nature, the DE method employed, the M&S technology utilized and the human resources used across the experiment life cycle. Time and effort results are presented here, broken down by DE method and life-cycle phase. The paper also analyses where reuse took place in the experiment life cycle, and how time and effort were affected by the number of problem-owner and provider stakeholders involved. The insights yielded are expected to help DE planners improve estimation and scheduling of human resources. In turn, this is intended to facilitate delivery of more effective NEC concept development and experimentation.

Journal ArticleDOI
TL;DR: The proposed approach to predict IED emplacements with Bayesian inference has been implemented and the results of testing are consistent with a group of actual incidents.
Abstract: In this paper we introduce an approach to predict emplacements of improvised explosive devices (IEDs). With a brief review of studies and technology in related areas, this paper identifies and analyzes various factors in a great number of IED/terrorist attacks, and then categorizes the factors/features based on locations and time. By combining the results of analysis with other significant factors, such as casualties, numbers injured, population density, and international impact, this paper proposes an approach to predict IED emplacements with Bayesian inference. The proposed approach has been implemented and the results of testing are consistent with a group of actual incidents.
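A naive Bayes sketch over toy location and time features conveys the flavor of Bayesian inference for emplacement prediction; the incident records and feature set below are entirely synthetic and far simpler than the factors the paper analyzes:

```python
from collections import Counter

# Toy incident records: (location_type, time_of_day, was an IED emplaced?).
# Entirely synthetic; stands in for the paper's historical attack factors.
incidents = [
    ("checkpoint", "dawn", True), ("checkpoint", "dusk", True),
    ("market", "noon", True), ("checkpoint", "dawn", True),
    ("market", "dawn", False), ("bridge", "noon", False),
    ("bridge", "dusk", False), ("market", "dusk", False),
]

def posterior_ied(loc, time):
    """Naive Bayes with Laplace smoothing: P(IED | loc, time)."""
    scores = {}
    for label in (True, False):
        rows = [r for r in incidents if r[2] == label]
        prior = len(rows) / len(incidents)
        loc_counts = Counter(r[0] for r in rows)
        time_counts = Counter(r[1] for r in rows)
        p_loc = (loc_counts[loc] + 1) / (len(rows) + 3)     # 3 location types
        p_time = (time_counts[time] + 1) / (len(rows) + 3)  # 3 time bins
        scores[label] = prior * p_loc * p_time
    return scores[True] / sum(scores.values())

print(f"P(IED | checkpoint, dawn) = {posterior_ied('checkpoint', 'dawn'):.2f}")
print(f"P(IED | bridge, dusk)     = {posterior_ied('bridge', 'dusk'):.2f}")
```

Scoring candidate sites this way and ranking them by posterior probability is the basic pattern; the paper's approach additionally weights factors such as casualties, population density, and international impact.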

Journal ArticleDOI
TL;DR: The call for papers for this special issue was devoted to novel or emerging VVA conceptual approaches, processes and techniques which contribute to efforts guaranteeing a required level of quality and correctness of M&S design and development, of utility and validity of M&S applications for specific purposes, as well as of credibility of M&S results.
Abstract: Rapid innovations in computer and communication technologies result in significant advancements and increasing applications in various fields of military and nonmilitary modeling and simulation (M&S): development and application of new modeling methods and simulation techniques, a wide and increasing range of M&S application domains, and – last but most important – a permanently increasing complexity of models and simulation applications. In addition to the benefits of these innovations and advances, increasing risks arising from model complexity have to be considered: the quality of model development and the credibility of simulation results. Especially in the context of the increasing application of M&S techniques like component-based modeling, agent-based simulation, and distributed and collaborative simulation, efficient quality control and credibility assurance measures have to be introduced and applied to avoid incorrect, erroneous, inaccurate or inappropriate model development and application. Therefore, effective verification, validation and acceptance test methods and supporting tools applicable during all phases of an M&S life cycle are urgently required and have to be applied. Verification, validation and accreditation (VVA) is key in making models and their subsequent simulations applicable for trustworthy training, analyses and decision support in defense, safety or emergency response modeling, and a proxy for hardware test and evaluation (T&E). VVA is especially important due to the increasing complexity of the models and simulations being applied, but also due to the contemporary emphasis on cost savings and the increased interest in virtual tool sets in all aspects of analyses, training and technology development, deployment and maintenance.
The call for papers for this special issue was therefore devoted to novel or emerging VVA conceptual approaches, processes and techniques which contribute to efforts guaranteeing a required level of quality and correctness of M&S design and development, of utility and validity of M&S applications for specific purposes, as well as of credibility of M&S results. The call also asked for project-specific applications of VVA processes and techniques and for experiences gained by applying VVA and its benefits. Accordingly, the first three articles of this special issue address some general but important considerations and methodological approaches with respect to the quality of M&S life cycles, while the latter three articles report on project-specific applications of VVA and the resulting experiences. The first article, authored by Roza, Voogd and Sebalj, reports on a generic methodology for verification of models, simulations and data which – due to its generic approach – can be instantiated according to the specifics of the intended fields of M&S application, such as training, analysis or decision support. In addition, this approach includes concepts for tailoring VVA activities at different levels according to specific restrictions or constraints given by an M&S standard, an institution, or those specified for a project. The tremendously important issue of uncertainty representation in models and its effect on the interpretation of simulation results is addressed in the article by Pace. This article reports on potential sources of uncertainty which might significantly influence simulation results and leave room for inappropriate or even false conclusions on which decisions might be based. The paper refers to a paradigm the author already proposed in 2009, which differentiates between seven different aspects of uncertainty, such as simulation purpose, simulation concept to design, or simulation inputs and use.
Taking into account all seven uncertainty aspects allows comprehensive consideration of

Journal ArticleDOI
TL;DR: The objective of this work is to produce mobility models that are able to describe tactical mobility in military applications of MANETs, and to show that these models are accurate compared to synthetic traces, and that when used to evaluate network protocols, they provide different conclusions than when using generic mobility models.
Abstract: Mobility management is a key aspect of designing and evaluating protocols for Mobile Ad Hoc Networks (MANETs). The high mobility of nodes in a MANET constantly causes the network topology to change. Mobility patterns of nodes have a direct effect on fundamental network characteristics, such as path length, neighborhood size, and link stability. Consequently, the network performance is strongly affected by the nature of mobility patterns. While evaluating protocols for a specific MANET application, it becomes imperative to use a mobility model that is able to capture the movement of nodes in an accurate manner. The objective of this work is to produce mobility models that are able to describe tactical mobility in military applications of MANETs. We provide models of four tactical scenarios, show that these models are accurate compared to synthetic traces, and that when used to evaluate network protocols, they provide different conclusions than when using generic mobility models.
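The paper's tactical mobility models are not reproduced in the abstract, but the mechanics it describes (node movement driving neighborhood size and link stability) can be illustrated with the classic generic Random Waypoint model that such tactical models are compared against. This is a minimal sketch, not the authors' models; the area size, speed range, and radio range are hypothetical illustration values.

```python
import math
import random

def random_waypoint_trace(steps, area=1000.0, speed=(1.0, 5.0), seed=0):
    """Positions of one node under the classic Random Waypoint model:
    repeatedly pick a uniform random destination in a square area and
    move toward it at a uniformly chosen speed (meters per time step)."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, area), rng.uniform(0, area)
    dest = (rng.uniform(0, area), rng.uniform(0, area))
    v = rng.uniform(*speed)
    trace = []
    for _ in range(steps):
        dx, dy = dest[0] - x, dest[1] - y
        d = math.hypot(dx, dy)
        if d <= v:  # reached the waypoint: pick a new one and a new speed
            x, y = dest
            dest = (rng.uniform(0, area), rng.uniform(0, area))
            v = rng.uniform(*speed)
        else:
            x, y = x + v * dx / d, y + v * dy / d
        trace.append((x, y))
    return trace

def mean_neighbors(traces, radio_range):
    """Average neighborhood size (nodes within radio range) over time."""
    steps = len(traces[0])
    total = 0
    for t in range(steps):
        for i, a in enumerate(traces):
            for j, b in enumerate(traces):
                if i != j and math.hypot(a[t][0] - b[t][0],
                                         a[t][1] - b[t][1]) <= radio_range:
                    total += 1
    return total / (steps * len(traces))

traces = [random_waypoint_trace(200, seed=s) for s in range(10)]
avg_neighbors = mean_neighbors(traces, radio_range=250.0)
```

Swapping in a different mobility model changes `avg_neighbors` and similar metrics, which is the paper's point: protocol conclusions drawn under a generic model need not hold under tactical movement.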

Journal ArticleDOI
TL;DR: Key aspects of the verification performed on US Army Dugway Proving Ground (DPG) WeatherServer (WXS) are described and the unique role that WXS serves in efficiently providing real-time meteorological data to support Test and Evaluation (T&E) exercises is discussed.
Abstract: Key aspects of the verification performed on the US Army Dugway Proving Ground (DPG) WeatherServer (WXS) are described. WXS is a Test and Training Enabling Architecture (TENA)-based modeling and simulation application that distributes three-dimensional meteorological data over time to participants in a distributed test event or joint exercise environment. The verification features an iterative process as an effective measure to reduce time and costs while still allowing a comprehensive review. In addition, the unique role that WXS serves in efficiently providing real-time meteorological data to support Test and Evaluation (T&E) exercises is discussed. The verification process demonstrated the fidelity of the WXS output and the correctness of the coded technical algorithms (e.g., altitude-to-pressure conversions). The iterative process involved cycles of testing and improved software builds that continued until a build of WXS was developed that functioned as designed. Insights obtained during the iterative process applied ...
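The WXS code itself is not public, but the altitude-to-pressure conversion the abstract cites as a verified algorithm is, in the standard case, the ICAO Standard Atmosphere (ISA) barometric formula. The sketch below shows that conversion together with the kind of reference-point and round-trip checks a verification cycle might run; the tolerances are illustrative assumptions, not WXS acceptance criteria.

```python
import math

def isa_pressure(alt_m):
    """Pressure (Pa) at altitude alt_m under the ICAO Standard Atmosphere,
    valid in the troposphere (0 to 11 km)."""
    P0, T0, L = 101325.0, 288.15, 0.0065        # Pa, K, K/m lapse rate
    g, M, R = 9.80665, 0.0289644, 8.31446       # m/s^2, kg/mol, J/(mol K)
    return P0 * (1.0 - L * alt_m / T0) ** (g * M / (R * L))

def pressure_altitude(p_pa):
    """Inverse conversion: altitude (m) from pressure, same layer."""
    P0, T0, L = 101325.0, 288.15, 0.0065
    g, M, R = 9.80665, 0.0289644, 8.31446
    return (T0 / L) * (1.0 - (p_pa / P0) ** (R * L / (g * M)))

# Verification-style checks: a sea-level reference point, the standard
# tropopause value (~22 632 Pa at 11 km), and a round trip through both
# conversions.
assert abs(isa_pressure(0.0) - 101325.0) < 1e-6
assert abs(isa_pressure(11000.0) - 22632.0) < 50.0
for h in (0.0, 1500.0, 5000.0, 10000.0):
    assert abs(pressure_altitude(isa_pressure(h)) - h) < 1e-3
```

Checks like these, rerun against every improved build, are the essence of the iterative test-and-rebuild cycle the abstract describes.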

Journal ArticleDOI
TL;DR: In this article, a methodology, a mathematical model, and resource allocation heuristics for static Cordon and Search of a village (a.k.a. village search) are proposed.
Abstract: In the contemporary military environment, making decisions on how to best utilize resources to accomplish a mission with a set of specified constraints is difficult. A Cordon and Search of a village (a.k.a. village search) is an example of such a mission. Leaders must plan the mission, assigning assets (e.g. soldiers, robots, unmanned aerial vehicles, military working dogs) to accomplish the given task in accordance with orders from higher headquarters. Computer tools can assist these leaders in making decisions, and do so in a manner that will ensure the chosen solution is within mission constraints and is robust against uncertainty in environmental parameters. Currently, no such tools exist at the tactical or operational level to assist decision makers in their planning process and, as a result, individual experience and simplistic data tables are the only tools available. Using robustness concepts, this paper proposes a methodology, a mathematical model, and resource allocation heuristics for static pl...


Journal ArticleDOI
TL;DR: This work introduces a method for rapidly estimating radio path loss with Feed-Forward Neural Networks (FFNNs), in which not only path-loss models but also map topology is implicitly encoded in the network.
Abstract: Radio path-loss prediction is an important but computationally expensive component of wireless communications simulation. Models may require significant computation to reach a solution or require that information about the environment between transceivers be collected as model inputs, which may also be computationally expensive. Despite the complexity of the underlying model that generates a path-loss solution, the resulting function is not necessarily complex, and there may be ample opportunity for compression. We introduce a method for rapidly estimating radio path loss with Feed-Forward Neural Networks (FFNNs), in which not only path-loss models but also map topology is implicitly encoded in the network. Since FFNN simulation is amenable to Single Instruction Multiple Data architecture, additional performance can be gained by implementing a trained model in a parallel manner with a Graphical Processing Unit (GPU), such as those found on modern video cards. We first describe the properties of the traini...
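As a rough illustration of the core idea (compressing an expensive path-loss computation into a cheap feed-forward network) the sketch below trains a tiny one-hidden-layer FFNN in pure Python to reproduce the textbook log-distance path-loss model. This is an assumption-laden toy, not the paper's method: their networks also encode map topology between transceivers and are evaluated in parallel on GPUs, neither of which is attempted here.

```python
import math
import random

def path_loss_db(d_m, pl0=40.0, n=3.0, d0=1.0):
    """Textbook log-distance path loss (dB): stand-in for an expensive model."""
    return pl0 + 10.0 * n * math.log10(d_m / d0)

# Training set: log10(distance) features and path-loss targets, both scaled
# to [0, 1] so plain gradient descent stays well behaved.
rng = random.Random(1)
xs = [rng.uniform(0.0, 3.0) for _ in range(64)]   # log10(d), d in 1..1000 m
ys = [path_loss_db(10.0 ** x) for x in xs]
y_lo, y_hi = min(ys), max(ys)
xn = [x / 3.0 for x in xs]
yn = [(y - y_lo) / (y_hi - y_lo) for y in ys]

H, lr = 8, 0.3                                    # hidden width, step size
w1 = [rng.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [rng.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    """1 -> H tanh -> 1 linear feed-forward pass."""
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

def mse():
    return sum((forward(x)[0] - y) ** 2 for x, y in zip(xn, yn)) / len(xn)

loss_before = mse()
for _ in range(4000):                             # full-batch gradient descent
    gw1, gb1, gw2, gb2 = [0.0] * H, [0.0] * H, [0.0] * H, 0.0
    for x, y in zip(xn, yn):
        out, h = forward(x)
        e = 2.0 * (out - y) / len(xn)             # dLoss/dOutput for this sample
        gb2 += e
        for j in range(H):
            gw2[j] += e * h[j]
            dpre = e * w2[j] * (1.0 - h[j] ** 2)  # back through tanh
            gw1[j] += dpre * x
            gb1[j] += dpre
    for j in range(H):
        w1[j] -= lr * gw1[j]
        b1[j] -= lr * gb1[j]
        w2[j] -= lr * gw2[j]
    b2 -= lr * gb2
loss_after = mse()
```

Once trained, estimating path loss costs only the `forward` pass, a handful of multiply-adds that map naturally onto the SIMD/GPU execution the abstract describes.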