Showing papers by "Mitre Corporation" published in 2011


Proceedings Article
27 Jul 2011
TL;DR: The construction of a large, multilingual dataset labeled with gender is described, and statistical models for determining the gender of uncharacterized Twitter users are investigated, exploring several different classifier types.
Abstract: Accurate prediction of demographic attributes from social media and other informal online content is valuable for marketing, personalization, and legal investigation. This paper describes the construction of a large, multilingual dataset labeled with gender, and investigates statistical models for determining the gender of uncharacterized Twitter users. We explore several different classifier types on this dataset. We show the degree to which classifier accuracy varies based on tweet volumes as well as when various kinds of profile metadata are included in the models. We also perform a large-scale human assessment using Amazon Mechanical Turk. Our methods significantly out-perform both baseline models and almost all humans on the same task.
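As an illustration of the kind of classifier explored in this work, the sketch below trains a character n-gram model to predict gender from tweet text. It is a hypothetical minimal example under stated assumptions: the data, labels, and model choice are placeholders, not the authors' system.

```python
# Hypothetical sketch of a tweet-text gender classifier; not the paper's actual system.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Placeholder training data: tweet text (profile metadata such as screen name
# could be concatenated in as well) paired with a gender label.
texts = ["just watched the game with my brother!",
         "loving my new nail polish color"]
labels = ["male", "female"]

# Character n-grams are a common choice for multilingual text because they
# avoid language-specific tokenization.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(1, 4), min_df=1),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)
print(model.predict(["off to the salon, then brunch with the girls"]))
```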

632 citations


Journal ArticleDOI
10 Feb 2011-Nature
TL;DR: An architecture is developed to integrate the programmable nanowire FETs and define a logic tile consisting of two interconnected arrays with 496 functional configurable FET nodes in an area of ∼960 μm², representing a significant advance in the complexity and functionality of nanoelectronic circuits built from the bottom up, with a tiled architecture that could be cascaded to realize fully integrated nanoprocessors with computing, memory and addressing capabilities.
Abstract: A nanoprocessor constructed from intrinsically nanometre-scale building blocks is an essential component for controlling memory, nanosensors and other functions proposed for nanosystems assembled from the bottom up. Important steps towards this goal over the past fifteen years include the realization of simple logic gates with individually assembled semiconductor nanowires and carbon nanotubes, but with only 16 devices or fewer and a single function for each circuit. Recently, logic circuits also have been demonstrated that use two or three elements of a one-dimensional memristor array, although such passive devices without gain are difficult to cascade. These circuits fall short of the requirements for a scalable, multifunctional nanoprocessor owing to challenges in materials, assembly and architecture on the nanoscale. Here we describe the design, fabrication and use of programmable and scalable logic tiles for nanoprocessors that surmount these hurdles. The tiles were built from programmable, non-volatile nanowire transistor arrays. Ge/Si core/shell nanowires coupled to designed dielectric shells yielded single-nanowire, non-volatile field-effect transistors (FETs) with uniform, programmable threshold voltages and the capability to drive cascaded elements. We developed an architecture to integrate the programmable nanowire FETs and define a logic tile consisting of two interconnected arrays with 496 functional configurable FET nodes in an area of ∼960 μm². The logic tile was programmed and operated first as a full adder with a maximal voltage gain of ten and input-output voltage matching. Then we showed that the same logic tile can be reprogrammed and used to demonstrate full-subtractor, multiplexer, demultiplexer and clocked D-latch functions. These results represent a significant advance in the complexity and functionality of nanoelectronic circuits built from the bottom up with a tiled architecture that could be cascaded to realize fully integrated nanoprocessors with computing, memory and addressing capabilities.
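For readers unfamiliar with the functions the tile was programmed to perform, the snippet below spells out the standard one-bit full-adder and full-subtractor truth functions. It is purely illustrative of the logic and says nothing about the nanowire hardware itself.

```python
# Illustrative truth functions only; this says nothing about the nanowire implementation.
def full_adder(a, b, cin):
    """One-bit full adder: returns (sum, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def full_subtractor(a, b, bin_):
    """One-bit full subtractor for a - b - bin_: returns (difference, borrow_out)."""
    d = a ^ b ^ bin_
    bout = ((~a & (b | bin_)) | (b & bin_)) & 1
    return d, bout

# Exhaustive check against integer arithmetic over all eight input combinations.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            total = a + b + c
            assert full_adder(a, b, c) == (total & 1, total >> 1)
            diff = a - b - c
            assert full_subtractor(a, b, c) == (diff & 1, 1 if diff < 0 else 0)
print("full adder and full subtractor verified for all inputs")
```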

520 citations


Journal ArticleDOI
TL;DR: This issue of JAMIA focuses on natural language processing (NLP) techniques for clinical-text information extraction and on shared tasks such as the i2b2/VA Challenge, a shared-task evaluation co-sponsored by the Veterans Administration for the last two years.

272 citations


Journal ArticleDOI
TL;DR: Virtualized platforms, which are increasingly well supported on stock hardware, provide a natural basis for an attestation architecture guided by five central principles.
Abstract: Remote attestation is the activity of making a claim about properties of a target by supplying evidence to an appraiser over a network. We identify five central principles to guide development of attestation systems. We argue that (i) attestation must be able to deliver temporally fresh evidence; (ii) comprehensive information about the target should be accessible; (iii) the target, or its owner, should be able to constrain disclosure of information about the target; (iv) attestation claims should have explicit semantics to allow decisions to be derived from several claims; and (v) the underlying attestation mechanism must be trustworthy. We illustrate how to acquire evidence from a running system, and how to transport it via protocols to remote appraisers. We propose an architecture for attestation guided by these principles. Virtualized platforms, which are increasingly well supported on stock hardware, provide a natural basis for our attestation architecture.

208 citations


Journal ArticleDOI
TL;DR: A short history of the GSC is provided, along with an overview of its current range of activities, and a call is made for the scientific community to join forces to improve the quality and quantity of contextual information about the authors' public collections of genomes, metagenomes, and marker gene sequences.
Abstract: A vast and rich body of information has grown up as a result of the world's enthusiasm for 'omics technologies. Finding ways to describe and make available this information that maximise its usefulness has become a major effort across the 'omics world. At the heart of this effort is the Genomic Standards Consortium (GSC), an open-membership organization that drives community-based standardization activities. Here we provide a short history of the GSC, provide an overview of its range of current activities, and make a call for the scientific community to join forces to improve the quality and quantity of contextual information about our public collections of genomes, metagenomes, and marker gene sequences.

190 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a comparison between the cost-effectiveness analysis currently used for aviation environmental policy decision-making and an illustrative cost-benefit analysis, and show that different conclusions may be drawn about the same policy options depending on whether benefits and interdependencies are estimated in terms of health and welfare impacts or in terms of changes in NOx emissions inventories.

138 citations


Proceedings ArticleDOI
04 Apr 2011
TL;DR: This paper builds on and extends U.S. Department of Defense published guidance on systems engineering (SE) of systems of systems (SoS) by developing and presenting a view of SoS SE that translates the SoS SE core elements, their interrelationships, and SoS decision-making artifacts and information from a “trapeze” model to a more familiar and intuitive time-sequenced “wave” model representation.
Abstract: This paper builds on and extends U.S. Department of Defense published guidance on systems engineering (SE) of systems of systems (SoS) by developing and presenting a view of SoS SE that translates the SoS SE core elements, their interrelationships, and SoS decision-making artifacts and information from a “trapeze” model to a more familiar and intuitive time-sequenced “wave” model representation. The information is thus rendered in a form more readily usable by SoS SE practitioners in the field and one that corresponds with incremental development approaches that are the norm for SoS capability evolution. The paper describes and motivates the development of the wave model, discusses its key characteristics, and provides examples of SoS efforts that reflect this view of SoS SE. Finally, the paper describes how the information critical to successful SoS SE is created, where it fits into the wave model, how it evolves over time, and in which artifacts the information is normally contained.

126 citations


Journal ArticleDOI
TL;DR: It is concluded that the best performing systems for GN, PPI-ACT and PPI-IMT in realistic settings are not sufficient for fully automatic use, underscoring the importance of interactive systems.
Abstract: Background: The overall goal of the BioCreative Workshops is to promote the development of text mining and text processing tools which are useful to the communities of researchers and database curators in the biological sciences. To this end BioCreative I was held in 2004, BioCreative II in 2007, and BioCreative II.5 in 2009. Each of these workshops involved human-annotated test data for several basic tasks in text mining applied to the biomedical literature. Participants in the workshops were invited to compete in the tasks by constructing software systems to perform the tasks automatically and were given scores based on their performance. The results of these workshops have benefited the community in several ways. They have 1) provided evidence for the most effective methods currently available to solve specific problems; 2) revealed the current state of the art for performance on those problems; and 3) provided gold standard data and results on that data by which future advances can be gauged. This special issue contains overview papers for the three tasks of BioCreative III. Results: The BioCreative III Workshop was held in September of 2010 and continued the tradition of a challenge evaluation on several tasks judged basic to effective text mining in biology, including a gene normalization (GN) task and two protein-protein interaction (PPI) tasks. In total the Workshop involved the work of twenty-three teams. Thirteen teams participated in the GN task, which required the assignment of EntrezGene IDs to all named genes in full text papers without any species information being provided to a system. Ten teams participated in the PPI article classification task (ACT) requiring a system to classify and rank a PubMed® record as belonging to an article either having or not having “PPI relevant” information. Eight teams participated in the PPI interaction method task (IMT) where systems were given full text documents and were required to extract the experimental methods used to establish PPIs and a text segment supporting each such method. Gold standard data was compiled for each of these tasks and participants competed in developing systems to perform the tasks automatically. BioCreative III also introduced a new interactive task (IAT), run as a demonstration task. The goal was to develop an interactive system to facilitate a user’s annotation of the unique database identifiers for all the genes appearing in an article. This task included ranking genes by importance (based preferably on the amount of described experimental information regarding genes). There was also an optional task to assist the user in finding the most relevant articles about a given gene. For BioCreative III, a user advisory group (UAG) was assembled and played an important role 1) in producing some of the gold standard annotations for the GN task, 2) in critiquing IAT systems, and 3) in providing guidance for a future more rigorous evaluation of IAT systems. Six teams participated in the IAT demonstration task and received feedback on their systems from the UAG. Besides innovations in the GN and PPI tasks making them more realistic and practical and the introduction of the IAT task, discussions were begun on community data standards to promote interoperability and on user requirements and evaluation metrics to address utility and usability of systems.

112 citations


Journal ArticleDOI
TL;DR: The InterActive Task (IAT) was introduced to address the utility and usability of text mining tools for real-life biocuration tasks and provides the first steps toward the definition of metrics and functional requirements that are necessary for designing a formal evaluation of interactive curation systems in the BioCreative IV challenge.
Abstract: The BioCreative challenge evaluation is a community-wide effort for evaluating text mining and information extraction systems applied to the biological domain. The biocurator community, as an active user of biomedical literature, provides a diverse and engaged end user group for text mining tools. Earlier BioCreative challenges involved many text mining teams in developing basic capabilities relevant to biological curation, but they did not address the issues of system usage, insertion into the workflow and adoption by curators. Thus in BioCreative III (BC-III), the InterActive Task (IAT) was introduced to address the utility and usability of text mining tools for real-life biocuration tasks. To support the aims of the IAT in BC-III, involvement of both developers and end users was solicited, and the development of a user interface to address the tasks interactively was requested. A User Advisory Group (UAG) actively participated in the IAT design and assessment. The task focused on gene normalization (identifying gene mentions in the article and linking these genes to standard database identifiers), gene ranking based on the overall importance of each gene mentioned in the article, and gene-oriented document retrieval (identifying full text papers relevant to a selected gene). Six systems participated and all processed and displayed the same set of articles. The articles were selected based on content known to be problematic for curation, such as ambiguity of gene names, coverage of multiple genes and species, or introduction of a new gene name. Members of the UAG curated three articles for training and assessment purposes, and each member was assigned a system to review. A questionnaire related to the interface usability and task performance (as measured by precision and recall) was answered after systems were used to curate articles. Although the limited number of articles analyzed and users involved in the IAT experiment precluded rigorous quantitative analysis of the results, a qualitative analysis provided valuable insight into some of the problems encountered by users when using the systems. The overall assessment indicates that the system usability features appealed to most users, but the system performance was suboptimal (mainly due to low accuracy in gene normalization). Some of the issues included failure of species identification and gene name ambiguity in the gene normalization task leading to an extensive list of gene identifiers to review, which, in some cases, did not contain the relevant genes. The document retrieval suffered from the same shortfalls. The UAG favored achieving high performance (measured by precision and recall), but strongly recommended the addition of features that facilitate the identification of correct gene and its identifier, such as contextual information to assist in disambiguation. The IAT was an informative exercise that advanced the dialog between curators and developers and increased the appreciation of challenges faced by each group. A major conclusion was that the intended users should be actively involved in every phase of software development, and this will be strongly encouraged in future tasks. The IAT Task provides the first steps toward the definition of metrics and functional requirements that are necessary for designing a formal evaluation of interactive curation systems in the BioCreative IV challenge.

95 citations


Proceedings ArticleDOI
21 Aug 2011
TL;DR: By combining simple but effective indexing and disk block accessing techniques, a sequential algorithm iOrca is developed that is up to an order of magnitude faster than the state-of-the-art.
Abstract: The problem of distance-based outlier detection is difficult to solve efficiently in very large datasets because of potential quadratic time complexity. We address this problem and develop sequential and distributed algorithms that are significantly more efficient than state-of-the-art methods while still guaranteeing the same outliers. By combining simple but effective indexing and disk block accessing techniques, we have developed a sequential algorithm iOrca that is up to an order-of-magnitude faster than the state-of-the-art. The indexing scheme is based on sorting the data points in order of increasing distance from a fixed reference point and then accessing those points based on this sorted order. To speed up the basic outlier detection technique, we develop two distributed algorithms (DOoR and iDOoR) for modern distributed multi-core clusters of machines, connected on a ring topology. The first algorithm passes data blocks from each machine around the ring, incrementally updating the nearest neighbors of the points passed. By maintaining a cutoff threshold, it is able to prune a large number of points in a distributed fashion. The second distributed algorithm extends this basic idea with the indexing scheme discussed earlier. In our experiments, both distributed algorithms exhibit significant improvements compared to the state-of-the-art distributed method [13].
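A rough sketch of the two ingredients the abstract describes, processing points in order of distance from a fixed reference point and pruning with a running cutoff on the k-nearest-neighbor outlier score, is given below. It is an illustration of the idea under simplifying assumptions, not the iOrca implementation.

```python
# Illustrative sketch of distance-based outlier detection with a reference-point
# ordering and cutoff pruning, in the spirit of ORCA/iOrca; not the authors' code.
import numpy as np

def top_outliers(X, k=5, n_outliers=5):
    """Rank points by distance to their k-th nearest neighbor, pruning early."""
    X = np.asarray(X, dtype=float)
    ref = X.mean(axis=0)                                   # fixed reference point
    order = np.argsort(np.linalg.norm(X - ref, axis=1))    # index: sort by distance to ref
    X = X[order]
    n = len(X)

    top = []        # (score, original_index) pairs for the current top outliers
    cutoff = 0.0    # smallest score among the current top outliers
    for i in range(n):
        knn = np.full(k, np.inf)   # running k smallest distances for point i
        # Visit other points outward from i in the sorted order, so spatially
        # close points tend to be seen first and the bound tightens quickly.
        neighbors = sorted((j for j in range(n) if j != i), key=lambda j: abs(j - i))
        pruned = False
        for j in neighbors:
            d = np.linalg.norm(X[i] - X[j])
            if d < knn.max():
                knn[knn.argmax()] = d
                # The k-NN distance only shrinks as more points are seen, so once
                # it falls below the cutoff this point cannot be a top outlier.
                if len(top) == n_outliers and knn.max() < cutoff:
                    pruned = True
                    break
        if pruned:
            continue
        top.append((float(knn.max()), int(order[i])))
        top.sort(reverse=True)
        top = top[:n_outliers]
        if len(top) == n_outliers:
            cutoff = top[-1][0]
    return top

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(size=(300, 2)), rng.normal(6.0, 0.5, size=(5, 2))])
print(top_outliers(data, k=5, n_outliers=5))
```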

89 citations


Journal ArticleDOI
01 Jul 2011-Science
TL;DR: In this article, a simple dynamical pattern that may be used to estimate the escalation rate and timing of fatal attacks is uncovered; the time difference between fatal attacks by insurgent groups within individual provinces in both Afghanistan and Iraq, and by terrorist groups operating worldwide, gives a potent indicator of the later pace of lethal activity.
Abstract: In military planning, it is important to be able to estimate not only the number of fatalities but how often attacks that result in fatalities will take place. We uncovered a simple dynamical pattern that may be used to estimate the escalation rate and timing of fatal attacks. The time difference between fatal attacks by insurgent groups within individual provinces in both Afghanistan and Iraq, and by terrorist groups operating worldwide, gives a potent indicator of the later pace of lethal activity.
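The escalation pattern alluded to is usually written as a progress-curve power law over the intervals between successive fatal attacks; the form below is a hedged sketch of that relationship with illustrative notation, not an equation copied from the paper.

```latex
% Interval between the n-th and (n+1)-th fatal attack attributed to a given group or province
\tau_n = \tau_1\, n^{-b}
```

Here \tau_1 is the interval between the first two fatal attacks and b is an escalation exponent; on this reading, the early intervals act as the "potent indicator" of the later pace of lethal activity.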

Patent
21 Jan 2011
TL;DR: A user controls one or more unmanned vehicles with a smart phone, which receives a video stream from the vehicle and displays the vehicle's controls over the video.
Abstract: Described are systems and methods, including computer program products, for controlling an unmanned vehicle. A user controls one or more unmanned vehicles with a smart phone. The smart phone receives a video stream from the unmanned vehicles and displays the controls for the unmanned vehicle over the video. The smart phone and the unmanned vehicle communicate wirelessly.

Journal ArticleDOI
TL;DR: This paper conducts a series of measurements on a large commercial chat network and proposes a classification system that accurately distinguishes chat bots from human users, shown experimentally to be highly effective.
Abstract: The abuse of chat services by automated programs, known as chat bots, poses a serious threat to Internet users. Chat bots target popular chat networks to distribute spam and malware. In this paper, we first conduct a series of measurements on a large commercial chat network. Our measurements capture a total of 16 different types of chat bots ranging from simple to advanced. Moreover, we observe that human behavior is more complex than bot behavior. Based on the measurement study, we propose a classification system to accurately distinguish chat bots from human users. The proposed classification system consists of two components: 1) an entropy-based classifier; and 2) a Bayesian-based classifier. The two classifiers complement each other in chat bot detection. The entropy-based classifier is more accurate at detecting unknown chat bots, whereas the Bayesian-based classifier is faster at detecting known chat bots. Our experimental evaluation shows that the proposed classification system is highly effective in differentiating bots from humans.
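To make the entropy-based component concrete, the sketch below scores a chat account by the entropy of its inter-message timing and message-size distributions; bots tend to behave more regularly (lower entropy) than humans. This is a hypothetical illustration, not the paper's classifier, and the features, bin ranges, and threshold are arbitrary choices.

```python
# Hypothetical entropy-based bot/human scoring; not the paper's classifier.
import numpy as np

def empirical_entropy(values, bins=16, value_range=None):
    """Shannon entropy (bits) of a histogram of the observed values."""
    counts, _ = np.histogram(values, bins=bins, range=value_range)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def looks_like_bot(inter_msg_seconds, msg_sizes, threshold=1.5):
    """Treat low combined entropy (highly regular behavior) as bot-like."""
    h = (empirical_entropy(inter_msg_seconds, value_range=(0, 120))
         + empirical_entropy(msg_sizes, value_range=(0, 200)))
    return h < threshold

rng = np.random.default_rng(1)
human_gaps = rng.exponential(30.0, size=200)         # bursty, irregular timing
human_sizes = rng.integers(5, 120, size=200)         # varied message lengths
bot_gaps = 10.0 + rng.normal(0.0, 0.1, size=200)     # near-periodic posting
bot_sizes = np.full(200, 42)                         # fixed-length spam messages

print("bot:", looks_like_bot(bot_gaps, bot_sizes))        # expected True
print("human:", looks_like_bot(human_gaps, human_sizes))  # expected False
```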

Proceedings ArticleDOI
Scott Musman, Mike Tanner, Aaron Temin, Evan Elsaesser, Lewis Loren
04 Apr 2011
TL;DR: This paper describes how to evaluate the impact of a cyber attack on a mission by computing impact as the changes to mission measures of effectiveness, based on the reported effects of a known or suspected attack on one or more parts of the information technology supporting the mission.
Abstract: This paper describes how to evaluate the impact of a cyber attack on a mission. We accomplish this by computing impact as the changes to mission measures of effectiveness, based on the reported effects of a known or suspected attack on one or more parts of the information technology (IT) supporting the mission. Our previous papers have described our goals for computing mission impact and the choices of the techniques we use for modeling missions, IT, and cyber attacks. This paper focuses on how we compute the impact of cyber attacks on IT processes and information. These computations will improve decision-making when under cyber attack by providing accurate and detailed assessments of the impact of those attacks. Although the focus of our work has been on the calculation of cyber mission impacts during mission execution, we have also demonstrated how our representations and computations can be used for performing cyber risk analysis and crown jewels analysis.

Journal ArticleDOI
TL;DR: Simulated spectra were obtained and showed good agreement with experimentally measured spectra using a corona ionization source; the reduced mobilities for TNT and RDX obtained with corona ionization were 1.53 and 1.46 cm²/(V s), respectively, in good agreement with literature values.
Abstract: Ion mobility spectrometry (IMS) has become the most widely used technology for trace explosives detection. A key task in designing IMS systems is to balance the explosives detection performance with size, weight, cost, and safety of the instrument. Commercial instruments are, by and large, equipped with radioactive 63Ni ionization sources which pose inherent problems for transportation, safety, and waste disposal regulation. An alternative to a radioactive source is a corona discharge ionization source, which offers the benefits of simplicity, stability, and sensitivity without the regulatory problems. An IMS system was designed and built based on modeling and simulation with the goal to achieve a lightweight modular design that offered high performance for the detection of trace explosives using a corona ionization source. Modeling and simulations were used to investigate design alternatives and optimize parameters. Simulated spectra were obtained for 2,4,6-trinitrotoluene (TNT) and cyclo-1,3,5-trimethyl...
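For context on the reduced mobilities quoted above, IMS results are conventionally normalized to standard temperature and pressure; the relation below is the textbook definition of reduced mobility, not a result specific to this paper.

```latex
K_0 \;=\; K \,\frac{273.15\,\mathrm{K}}{T}\,\frac{P}{760\,\mathrm{Torr}},
\qquad
K \;=\; \frac{L^2}{V\, t_d}
```

where K is the measured mobility, L the drift length, V the drift voltage, t_d the drift time, and T and P the drift-gas temperature and pressure.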

Journal ArticleDOI
25 Mar 2011-PLOS ONE
TL;DR: The power of whole-genome and modern systems-level approaches to characterize microbial lineages to develop and validate forensic markers for strain discrimination and reveal signatures of deliberate adaptation is demonstrated.
Abstract: Background: Despite the decades-long use of Bacillus atrophaeus var. globigii (BG) as a simulant for biological warfare (BW) agents, knowledge of its genome composition is limited. Furthermore, the ability to differentiate signatures of deliberate adaptation and selection from natural variation is lacking for most bacterial agents. We characterized a lineage of BG with a long history of use as a simulant for BW operations, focusing on classical bacteriological markers, metabolic profiling and whole-genome shotgun sequencing (WGS). Results: Archival strains and two “present day” type strains were compared to simulant strains on different laboratory media. Several of the samples produced multiple colony morphotypes that differed from that of an archival isolate. To trace the microevolutionary history of these isolates, we obtained WGS data for several archival and present-day strains and morphotypes. Bacillus-wide phylogenetic analysis identified B. subtilis as the nearest neighbor to B. atrophaeus. The genome of B. atrophaeus is, on average, 86% identical to B. subtilis on the nucleotide level. WGS of variants revealed that several strains were mixed but highly related populations and uncovered a progressive accumulation of mutations among the “military” isolates. Metabolic profiling and microscopic examination of bacterial cultures revealed enhanced growth of “military” isolates on lactate-containing media, and showed that the “military” strains exhibited a hypersporulating phenotype. Conclusions: Our analysis revealed the genomic and phenotypic signatures of strain adaptation and deliberate selection for traits that were desirable in a simulant organism. Together, these results demonstrate the power of whole-genome and modern systems-level approaches to characterize microbial lineages to develop and validate forensic markers for strain discrimination and reveal signatures of deliberate adaptation.

Journal ArticleDOI
TL;DR: It is shown how an annotation-based NLP architecture implementing this idea can be realised and that it successfully performs on a corpus of authentic learner answers to reading comprehension questions, and a general exchange format is defined for such exercise data.
Abstract: Contextualised, meaning-based interaction in the foreign language is widely recognised as crucial for second language acquisition. Correspondingly, current exercises in foreign language teaching generally require students to manipulate both form and meaning. For intelligent language tutoring systems to support such activities, they thus must be able to evaluate the appropriateness of the meaning of a learner response for a given exercise. We discuss such a content-assessment approach, focusing on reading comprehension exercises. We pursue the idea that a range of simultaneously available representations at different levels of complexity and linguistic abstraction provide a good empirical basis for content assessment. We show how an annotation-based NLP architecture implementing this idea can be realised and that it successfully performs on a corpus of authentic learner answers to reading comprehension questions. To support comparison and sustainable development on content assessment, we also define a general exchange format for such exercise data.

Journal ArticleDOI
TL;DR: An sdAb isolated against Staphylococcus aureus enterotoxin B (SEB) was found to have a high affinity and an extraordinarily high Tm, and could still refold to recover activity after heat denaturation, making this sdAb a good candidate for use in antibody-based toxin detection technologies.
Abstract: Camelids and sharks possess a unique subclass of antibodies comprised of only heavy chains. The antigen binding fragments of these unique antibodies can be cloned and expressed as single domain antibodies (sdAbs). The ability of these small antigen-binding molecules to refold after heating to achieve their original structure, as well as their diminutive size, makes them attractive candidates for diagnostic assays. Here we describe the isolation of an sdAb against Staphylococcus aureus enterotoxin B (SEB). The clone, A3, was found to have high affinity (Kd = 75 pM) and good specificity for SEB, showing no cross-reactivity to related molecules such as Staphylococcal enterotoxin A (SEA), Staphylococcal enterotoxin D (SED), and Shiga toxin. Most remarkably, this anti-SEB sdAb had an extremely high Tm of 85°C and an ability to refold after heating to 95°C. The sharp Tm determined by circular dichroism was found to contrast with the gradual decrease observed in intrinsic fluorescence. We demonstrated the utility of this sdAb as a capture and detector molecule in Luminex-based assays providing limits of detection (LODs) of at least 64 pg/mL. The anti-SEB sdAb A3 was found to have a high affinity and an extraordinarily high Tm and could still refold to recover activity after heat denaturation. This combination of heat resilience and strong, specific binding makes this sdAb a good candidate for use in antibody-based toxin detection technologies.

Journal ArticleDOI
TL;DR: Using semantic attributes of concepts and information about document structure as features for statistical classification of assertions is a good way to leverage rule-based and statistical techniques in this task.

Journal ArticleDOI
TL;DR: The model provides SNR loss values for GNSS signals in the presence of both additive white Gaussian noise and interference, provided that the interference can be accurately modeled as a non-white, Gaussian wide sense stationary process.
Abstract: Global Navigation Satellite System (GNSS) receivers suffer signal-to-noise ratio (SNR) losses due to bandlimiting, quantization, and sampling. This paper presents an analytical model for GNSS receiver losses applicable to a wide variety of hardware configurations. The model addresses digitization of the received signal by a uniform quantizer with an arbitrary (even or odd) integer number of output levels. The model provides SNR loss values for GNSS signals in the presence of both additive white Gaussian noise and interference, provided that the interference can be accurately modeled as a non-white, Gaussian wide sense stationary process.
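As a classical point of reference for the losses such a model quantifies (a standard result, not taken from this paper), an ideal one-bit hard limiter operating on a signal in white Gaussian noise degrades the post-correlation SNR by a factor of

```latex
L_{1\text{-}\mathrm{bit}} \;=\; \frac{2}{\pi} \;\approx\; -1.96\ \mathrm{dB}
```

whereas multi-bit quantizers with well-chosen thresholds typically reduce the loss to a few tenths of a dB; the model described here generalizes such results to arbitrary numbers of quantizer levels, finite precorrelation bandwidth and sampling, and non-white Gaussian interference.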


Proceedings ArticleDOI
19 Dec 2011
TL;DR: Actionable architectural and operational recommendations to address the advanced cyber threat and to enable mission assurance for critical operations can create transformational improvements by helping to reverse adversary advantage, minimize exploit impact to essential operations, increase adversary cost and uncertainty, and act as a deterrent.
Abstract: Our national security and critical infrastructure sectors have become increasingly dependent on commercial information systems and technologies whose pedigree is uncertain given the globalization of the supply chain. Furthermore, these system architectures are brittle and fail or are compromised when subjected to ever-increasingly advanced and adaptive cyber attacks, resulting in failed, disrupted or compromised mission operations. While we must continue to raise the bar to protect mission critical systems from these threats by implementing best security practices, the current philosophy of trying to keep the adversaries out, or the assumption that they will be detected if they get through the first line of defense, is no longer valid. Given the sophistication, adaptiveness, and persistence of cyber threats, we can no longer assume that we can completely defend against intruders and must change our mindset to assume some degree of adversary success and be prepared to “fight through” cyber attacks to ensure mission success even in a degraded or contested environment. This paper will focus on actionable architectural and operational recommendations to address the advanced cyber threat and to enable mission assurance for critical operations. These recommendations can create transformational improvements by helping to reverse adversary advantage, minimize exploit impact to essential operations, increase adversary cost and uncertainty, and act as a deterrent. These approaches go well beyond traditional information assurance, disaster recovery and survivability techniques. The approaches and strategies to be discussed include creative applications of trust technologies and advanced detection capabilities in conjunction with combination of techniques using diversity, redundancy, isolation and containment, least privilege, moving target defense, randomization and unpredictability, deception, and adaptive management and response.

Journal ArticleDOI
TL;DR: A multidimensional measure of display clutter was developed for advanced head-up displays incorporating enhanced and synthetic vision; models of flight performance based on the clutter score and workload ratings were also developed, but with less predictive power.
Abstract: INTRODUCTION: This study was conducted to: develop a multidimensional measure of display clutter for advanced head-up displays (HUDs) incorporating enhanced and synthetic vision; assess the influence of HUD configuration on perceptions of display clutter, workload, and flight performance; model clutter scores in terms of visual display properties; and model flight performance in terms of subjective and objective clutter indices. METHODS: In a flight simulator, 18 pilots with different levels of flight experience flew approaches divided into three segments. Three HUD configuration sets were presented under two levels of flight workload. Pilot ratings of overall display clutter, its underlying dimensions, and mental workload were recorded along with flight performance measures. Display image analysis software was used to measure visual properties of the HUDs. RESULTS: The multidimensional measure of clutter showed internal consistency with overall perceived clutter. Calculated clutter scores were sensitive to HUD configurations and in agreement with a priori display classifications. There was a trend for the extremes of display clutter to cause higher workload and less stable performance due to cognitive complexity and a lack of information for high and low clutter displays, respectively. Multiple linear regression models of perceived clutter were developed based on HUD visual properties with predictive utility. Models of flight performance based on the clutter score and workload ratings were also developed, but with less predictive power. DISCUSSION: Measures and models of display clutter are expected to be applicable to the evaluation of a range of display concepts.

Proceedings ArticleDOI
07 May 2011
TL;DR: This research examines the deployment of an online innovation management platform to execute an annual research and development proposal competition over two cycles of usage and suggests strategies for monitoring and measuring the effectiveness of social media's impact on an existing innovation process within the context of a business strategy.
Abstract: Incorporating social media into the Enterprise is a key opportunity as well as a critical challenge facing many organizations today. Paramount in decision-making about social media implementation is the question of 'value'. Our research examines the deployment of an online innovation management platform to execute an annual research and development proposal competition over two cycles of usage. Our findings suggest strategies for monitoring and measuring the effectiveness of social media's impact on an existing innovation process within the context of a business strategy.

Proceedings ArticleDOI
20 Sep 2011
TL;DR: In this article, an operational concept and corresponding framework for flow contingency management, a component of strategic traffic flow management in the Next Generation Air Transportation System, is presented to address the lack of information, simulation, and evaluation capabilities provided to decision makers in today's strategic planning process.
Abstract: This paper presents an operational concept and corresponding framework for flow contingency management, a component of strategic traffic flow management in the Next Generation Air Transportation System. The concept and framework described in this paper aim to address the lack of information, and simulation and evaluation capabilities provided to decision makers in today’s strategic planning process. Specifically, the proposed concept explicitly models the uncertainties present at longer look-ahead times and provides quantitative analysis tools to evaluate the impact of proposed congestion-mitigation actions. This paper develops the overall concept and defines the associated modeling framework which specifies the flow of information throughout the decision making process. An example weather and traffic situation, taken from historic data, is simulated to illustrate the concept.

Journal ArticleDOI
31 May 2011-PLOS ONE
TL;DR: A transition state model for VX hydrolysis (VXts) in water was created using quantum mechanical/molecular mechanical simulations, and the analysis showed that only conformations in which the attacking hydroxyl group of VXts is coordinated by the sidechain oxygen of D269 have a significant correlation with experimental results.
Abstract: Human Serum paraoxonase 1 (HuPON1) is an enzyme that has been shown to hydrolyze a variety of chemicals including the nerve agent VX. While wildtype HuPON1 does not exhibit sufficient activity against VX to be used as an in vivo countermeasure, it has been suggested that increasing HuPON1's organophosphorous hydrolase activity by one or two orders of magnitude would make the enzyme suitable for this purpose. The binding interaction between HuPON1 and VX has recently been modeled, but the mechanism for VX hydrolysis is still unknown. In this study, we created a transition state model for VX hydrolysis (VXts) in water using quantum mechanical/molecular mechanical simulations, and docked the transition state model to 22 experimentally characterized HuPON1 variants using AutoDock Vina. The HuPON1-VXts complexes were grouped by reaction mechanism using a novel clustering procedure. The average Vina interaction energies for different clusters were compared to the experimentally determined activities of HuPON1 variants to determine which computational procedures best predict how well HuPON1 variants will hydrolyze VX. The analysis showed that only conformations which have the attacking hydroxyl group of VXts coordinated by the sidechain oxygen of D269 have a significant correlation with experimental results. The results from this study can be used for further characterization of how HuPON1 hydrolyzes VX and design of HuPON1 variants with increased activity against VX.

Journal ArticleDOI
TL;DR: A formal model of collaboration which addresses confidentiality concerns is proposed and it is shown that it is PSPACE-complete to schedule a plan leading from a given initial state to a desired goal state while simultaneously deciding compliance with respect to the agents’ policies.
Abstract: Collaboration among organizations or individuals is common. While these participants are often unwilling to share all their information with each other, some information sharing is unavoidable when achieving a common goal. The need to share information and the desire to keep it confidential are two competing notions which affect the outcome of a collaboration. This paper proposes a formal model of collaboration which addresses confidentiality concerns. We draw on the notion of a plan which originates in the AI literature. We use data confidentiality policies to assess confidentiality in transition systems whose actions have an equal number of predicates in their pre- and post-conditions. Under two natural notions of policy compliance, we show that it is PSPACE-complete to schedule a plan leading from a given initial state to a desired goal state while simultaneously deciding compliance with respect to the agents' policies.

Proceedings ArticleDOI
08 Dec 2011
TL;DR: A novel approach is presented which builds an adaptive taxi-out prediction model based on a historical traffic flow database generated using the ASDE-X data; results show significant improvement in taxi-out predictions as compared to predictions from FAA's Enhanced Traffic Management System (ETMS).
Abstract: Flights incur a large percentage of delay on the ground during the departure process; however, predicting the taxi-out time is difficult due to uncertainties associated with the factors influencing it, such as airport surface traffic, downstream traffic restrictions, runway configuration, weather, and human causes. Airport Surface Detection Equipment, Model X (ASDE-X) surveillance data provides high resolution coverage of aircraft surface movement which can be leveraged to address this problem. This paper presents a novel approach which builds an adaptive taxi-out prediction model based on a historical traffic flow database generated using the ASDE-X data. The model correlates taxi-out time and taxi-out delay to a set of explanatory variables such as aircraft queue position, distance to the runway, arrival rates, departure rates and weather. Two prediction models are developed. One treats aircraft movement from starting location to the runway threshold uniformly while the other models aircraft time to get to the runway queue different from the wait time experienced by the aircraft while in the runway queue. The models are evaluated using data from New York's John F Kennedy (JFK) airport during the summer of 2010. Results show significant improvement in taxi-out predictions as compared to predictions from FAA's Enhanced Traffic Management System (ETMS).
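A minimal sketch of the kind of regression described, relating taxi-out time to surface-traffic explanatory variables, appears below; the feature names, synthetic data, and model choice are placeholders under stated assumptions and not the authors' model.

```python
# Hypothetical sketch of a taxi-out time regression; not the paper's model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
n = 500
# Placeholder explanatory variables, loosely matching those listed above.
queue_position = rng.integers(1, 30, n)            # aircraft ahead in the departure queue
distance_to_runway_m = rng.uniform(500, 4000, n)
arrival_rate = rng.integers(20, 60, n)             # arrivals per hour
departure_rate = rng.integers(20, 60, n)
weather_impact = rng.integers(0, 2, n)             # 0 = VMC, 1 = IMC (toy encoding)

X = np.column_stack([queue_position, distance_to_runway_m,
                     arrival_rate, departure_rate, weather_impact])
# Synthetic taxi-out time (minutes), dominated by queue position in this toy example.
y = (8 + 1.2 * queue_position + 0.002 * distance_to_runway_m
     + 0.05 * arrival_rate + 4 * weather_impact + rng.normal(0, 2, n))

model = GradientBoostingRegressor().fit(X, y)
print("predicted taxi-out (min):", model.predict([[12, 2500, 45, 38, 1]])[0].round(1))
```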

Proceedings ArticleDOI
13 Feb 2011
TL;DR: Taking inspiration from the biomechanics of the human hand, a dynamically resizing, ergonomic, and multi-touch controller (the DREAM Controller) is created that allows chording and simultaneous multi-handed interaction anywhere that the user wishes to place his or her hands.
Abstract: Controlling the movements of mobile robots, including driving the robot through the world and panning the robot's cameras, typically requires many physical joysticks, buttons, and switches. Operators will often employ a technique called "chording" to cope with this situation. Much like a piano player, the operator will simultaneously actuate multiple joysticks and switches with his or her hands to create a combination of complimentary movements. However, these controls are in fixed locations and unable to be reprogrammed easily. Using a Microsoft Surface multi-touch table, we have designed an interface that allows chording and simultaneous multi-handed interaction anywhere that the user wishes to place his or her hands. Taking inspiration from the biomechanics of the human hand, we have created a dynamically resizing, ergonomic, and multi-touch controller (the DREAM Controller). This paper presents the design and testing of this controller with an iRobot ATRV-JR robot.

Proceedings ArticleDOI
06 Sep 2011
TL;DR: This work describes how a multi-organizational provenance store that collects provenance from heterogeneous systems addresses problems of an open world, where the data usage is not determined in advance and can take place across many systems and organizations.
Abstract: It can be difficult to fully understand the result of integrating information from diverse sources. When all the information comes from a single organization, there is a collective knowledge about where it came from and whether it can be trusted. Unfortunately, once information from multiple organizations is integrated, there is no longer a shared knowledge of the data and its quality. It is often impossible to view and judge the information from a different organization; when errors occur, notification does not always reach all users of the data. We describe how a multi-organizational provenance store that collects provenance from heterogeneous systems addresses these problems. Unlike most provenance systems, we cope with an open world, where the data usage is not determined in advance and can take place across many systems and organizations.