
Showing papers on "Vulnerability (computing) published in 2010"


Journal ArticleDOI
TL;DR: A taxonomy identifying and analyzing attacks against machine learning systems is presented, showing how these classes influence the costs for the attacker and defender, and a formal structure defining their interaction is given.
Abstract: Machine learning's ability to rapidly evolve to changing and complex situations has helped it become a fundamental tool for computer security. That adaptability is also a vulnerability: attackers can exploit machine learning systems. We present a taxonomy identifying and analyzing attacks against machine learning systems. We show how these classes influence the costs for the attacker and defender, and we give a formal structure defining their interaction. We use our framework to survey and analyze the literature of attacks against machine learning systems. We also illustrate our taxonomy by showing how it can guide attacks against SpamBayes, a popular statistical spam filter. Finally, we discuss how our taxonomy suggests new lines of defenses.
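
The abstract does not spell out the taxonomy's axes; as an illustration only, the sketch below encodes one common three-axis framing (influence, security violation, specificity) as Python enums. The axis names and the example classification are assumptions, not taken from this listing.

```python
from enum import Enum
from dataclasses import dataclass

class Influence(Enum):
    CAUSATIVE = "attacker alters the training data"
    EXPLORATORY = "attacker only probes the deployed model"

class SecurityViolation(Enum):
    INTEGRITY = "false negatives: hostile input slips through"
    AVAILABILITY = "false positives: legitimate input is blocked"

class Specificity(Enum):
    TARGETED = "aimed at one particular instance"
    INDISCRIMINATE = "aimed at a broad class of instances"

@dataclass
class Attack:
    name: str
    influence: Influence
    violation: SecurityViolation
    specificity: Specificity

# Example: a dictionary-poisoning attack on a statistical spam filter
# would be classified roughly as follows.
spam_poisoning = Attack(
    name="dictionary poisoning of a spam filter",
    influence=Influence.CAUSATIVE,
    violation=SecurityViolation.AVAILABILITY,
    specificity=Specificity.INDISCRIMINATE,
)
print(spam_poisoning)
```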

811 citations


Proceedings ArticleDOI
16 May 2010
TL;DR: TaintScope, an automatic fuzzing system using dynamic taint analysis and symbolic execution techniques, can accurately locate the checksum checks in programs and dramatically improve the effectiveness of fuzz testing.
Abstract: Fuzz testing has proven successful in finding security vulnerabilities in large programs. However, traditional fuzz testing tools have a well-known common drawback: they are ineffective if most generated malformed inputs are rejected in the early stage of program running, especially when target programs employ checksum mechanisms to verify the integrity of inputs. In this paper, we present TaintScope, an automatic fuzzing system using dynamic taint analysis and symbolic execution techniques, to tackle the above problem. TaintScope has several novel contributions: 1) TaintScope is the first checksum-aware fuzzing tool to the best of our knowledge. It can identify checksum fields in input instances, accurately locate checksum-based integrity checks by using branch profiling techniques, and bypass such checks via control flow alteration. 2) TaintScope is a directed fuzzing tool working at the x86 binary level (on both Linux and Windows). Based on fine-grained dynamic taint tracing, TaintScope identifies which bytes in a well-formed input are used in security-sensitive operations (e.g., invoking system/library calls) and then focuses on modifying such bytes. Thus, generated inputs are more likely to trigger potential vulnerabilities. 3) TaintScope is fully automatic, from checksum detection and directed fuzzing to the repair of crashed samples. It can fix checksum values in generated inputs using combined concrete and symbolic execution techniques. We evaluate TaintScope on a number of large real-world applications. Experimental results show that TaintScope can accurately locate the checksum checks in programs and dramatically improve the effectiveness of fuzz testing. TaintScope has already found 27 previously unknown vulnerabilities in several widely used applications, including Adobe Acrobat, Google Picasa, Microsoft Paint, and ImageMagick. Most of these severe vulnerabilities have been confirmed by Secunia and oCERT and assigned CVE identifiers (such as CVE-2009-1882, CVE-2009-2688). Corresponding patches from vendors have been released or are in progress based on our reports.
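
As a rough illustration of the checksum problem TaintScope addresses (not the tool's actual algorithm), the sketch below mutates only the taint-identified "hot" bytes of an input and then rewrites a hypothetical trailing CRC32 field so the malformed input still passes the integrity check. The file layout, helper names, and offsets are assumptions.

```python
import random
import zlib

def fuzz_with_checksum_fixup(data: bytes, hot_offsets, checksum_offset: int) -> bytes:
    """Mutate 'hot' bytes, then repair a hypothetical trailing CRC32 field.

    Assumed layout: the 4 bytes at `checksum_offset` hold a CRC32 over everything
    before them. Real formats differ; this only illustrates why a fuzzer must be
    checksum-aware to get past early integrity checks.
    """
    buf = bytearray(data)
    # 1) Directed mutation: touch only bytes that flow into sensitive operations.
    for off in hot_offsets:
        buf[off] = random.randrange(256)
    # 2) Checksum repair: recompute the integrity field so the early sanity check
    #    in the target program does not reject the input outright.
    crc = zlib.crc32(bytes(buf[:checksum_offset])) & 0xFFFFFFFF
    buf[checksum_offset:checksum_offset + 4] = crc.to_bytes(4, "little")
    return bytes(buf)

# Usage: mutate bytes 8..15 of a seed whose CRC32 sits in the last 4 bytes.
seed = bytes(range(64))
mutated = fuzz_with_checksum_fixup(seed, hot_offsets=range(8, 16),
                                   checksum_offset=len(seed) - 4)
```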

363 citations


Journal ArticleDOI
01 Jul 2010
TL;DR: A supervisory control and data acquisition security framework with the following four major components is proposed: (1) real-time monitoring; (2) anomaly detection; (3) impact analysis; and (4) mitigation strategies; an attack-tree-based methodology for impact analysis is developed.
Abstract: Disruption of electric power operations can be catastrophic for national security and the economy. Due to the complexity of widely dispersed assets and the interdependencies among computer, communication, and power infrastructures, the requirement to meet security and quality compliance on operations is a challenging issue. In recent years, the North American Electric Reliability Corporation (NERC) established a cybersecurity standard that requires utilities' compliance on cybersecurity of control systems. This standard identifies several cyber-related vulnerabilities that exist in control systems and recommends several remedial actions (e.g., best practices). In this paper, a comprehensive survey on cybersecurity of critical infrastructures is reported. A supervisory control and data acquisition security framework with the following four major components is proposed: (1) real-time monitoring; (2) anomaly detection; (3) impact analysis; and (4) mitigation strategies. In addition, an attack-tree-based methodology for impact analysis is developed. The attack-tree formulation based on power system control networks is used to evaluate system-, scenario-, and leaf-level vulnerabilities by identifying the system's adversary objectives. The leaf-level vulnerability evaluation, which involves port auditing or password strength evaluation, is fundamental to the methodology. The measure of vulnerabilities in the power system control framework is determined based on existing cybersecurity conditions, and then the vulnerability indices are evaluated.
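
A minimal sketch of how an attack-tree evaluation of this kind can roll leaf-level vulnerabilities (e.g., from port audits or password-strength checks) up to scenario- and system-level indices. The OR/AND combination rules, independence assumption, and numbers are illustrative assumptions, not the paper's exact formulation.

```python
from math import prod

def or_combine(ps):
    """A goal with OR-children succeeds if any child succeeds (independence assumed)."""
    return 1.0 - prod(1.0 - p for p in ps)

def and_combine(ps):
    """A goal with AND-children succeeds only if all children succeed (independence assumed)."""
    return prod(ps)

# Leaf vulnerabilities, e.g. derived from port auditing / password-strength evaluation.
leaves = {"open_telnet_port": 0.30, "weak_scada_password": 0.20, "unpatched_hmi": 0.10}

# Scenario: reach the control network (needs the open port AND a weak password),
# OR exploit the unpatched HMI directly.
scenario_remote_login = and_combine([leaves["open_telnet_port"],
                                     leaves["weak_scada_password"]])
system_vulnerability = or_combine([scenario_remote_login, leaves["unpatched_hmi"]])
print(f"scenario index = {scenario_remote_login:.3f}, system index = {system_vulnerability:.3f}")
```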

355 citations


Journal ArticleDOI
TL;DR: The modelling approach considers structural properties, as employed in graph theory, as well as functional properties to increase its fidelity and usefulness, and it is concluded that the proposed modelling approach is promising and suitable in the context of vulnerability analyses of interdependent systems.

342 citations


Journal ArticleDOI
28 Sep 2010-Chaos
TL;DR: It is concluded that evaluating vulnerability in power networks using purely topological metrics can be misleading, and the vulnerability metrics for individual simulations show only a mild correlation.
Abstract: In order to identify the extent to which results from topological graph models are useful for modeling vulnerability in electricity infrastructure, we measure the susceptibility of power networks to random failures and directed attacks using three measures of vulnerability: characteristic path lengths, connectivity loss, and blackout sizes. The first two are purely topological metrics. The blackout size calculation results from a model of cascading failure in power networks. Testing the response of 40 areas within the Eastern U.S. power grid and a standard IEEE test case to a variety of attack/failure vectors indicates that directed attacks result in larger failures using all three vulnerability measures, but the attack-vectors that appear to cause the most damage depend on the measure chosen. While the topological metrics and the power grid model show some similar trends, the vulnerability metrics for individual simulations show only a mild correlation. We conclude that evaluating vulnerability in power networks using purely topological metrics can be misleading.
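
The two purely topological measures named here are standard graph quantities. A small sketch with networkx (the graph, attack order, and the pair-counting variant of connectivity loss are toy assumptions, not the paper's grid data or exact definitions) shows how they might be tracked as nodes are removed.

```python
import networkx as nx

def characteristic_path_length(g: nx.Graph) -> float:
    """Average shortest-path length over the largest connected component."""
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return nx.average_shortest_path_length(giant)

def connectivity_loss(original: nx.Graph, damaged: nx.Graph) -> float:
    """Fraction of connected node pairs lost after damage (one simple variant)."""
    def connected_pairs(g):
        return sum(len(c) * (len(c) - 1) for c in nx.connected_components(g)) / 2
    return 1.0 - connected_pairs(damaged) / connected_pairs(original)

# Toy example: a degree-ordered "directed attack" on a random network.
g = nx.erdos_renyi_graph(100, 0.05, seed=1)
attacked = g.copy()
for node, _ in sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:10]:
    attacked.remove_node(node)

print("path length before/after:",
      characteristic_path_length(g), characteristic_path_length(attacked))
print("connectivity loss:", connectivity_loss(g, attacked))
```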

313 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined the use of bilevel programming to analyse the vulnerability of power systems under multiple contingencies, and compared two solution approaches for the resulting mixed-integer non-linear bilevel programs.
Abstract: This study examines the use of bilevel programming to analyse the vulnerability of power systems under multiple contingencies. One of the main purposes of this study is to explain the state of the art of the subject matter. A minimum vulnerability model and a maximum vulnerability model are presented and discussed. In both models, the upper-level optimisation determines a set of simultaneous outages in the transmission network whereas the lower-level optimisation models the reaction of the system operator against the outages identified in the upper level. The system operator reacts by minimising the system load shed through an optimal operation of the power system. Two solution approaches for the resulting mixed-integer non-linear bilevel programs are analysed and compared. Both methodologies are based on the equivalent transformation of the lower-level problem into a set of constraints, so that the original bilevel programs become single-level optimisation problems. The first approach is based on the application of Karush-Kuhn-Tucker optimality conditions whereas the second procedure relies on duality theory. This study shows that both approaches are essentially equivalent from a rigorous mathematical viewpoint; however, the second method is more suitable for off-the-shelf branch-and-cut software, as corroborated by numerical simulations.
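
A compact sketch of the kind of max-min (maximum vulnerability) bilevel structure described here, under the usual DC load-shedding assumptions; the symbols are generic placeholders rather than the paper's exact notation.

```latex
% Maximum-vulnerability model (sketch): the attacker (upper level) removes up to K lines
% to maximise the load shed that the operator (lower level) is forced to accept.
\max_{\delta \in \{0,1\}^{|L|},\;\sum_{\ell} \delta_\ell \le K}\;
  \min_{\Delta P,\,\theta,\,f}\; \sum_{d \in D} \Delta P_d
\quad \text{s.t.} \quad
  f_\ell = (1 - \delta_\ell)\, b_\ell \bigl(\theta_{i(\ell)} - \theta_{j(\ell)}\bigr),\quad
  |f_\ell| \le \bar{f}_\ell,\quad
  0 \le \Delta P_d \le P_d^{\mathrm{dem}},\quad
  \text{nodal power balance at every bus.}
```

Because the inner problem is a linear program once the outage vector is fixed, it can be replaced by its Karush-Kuhn-Tucker conditions or by its dual, which is the single-level transformation the abstract compares.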

212 citations


Journal ArticleDOI
TL;DR: In this article, the authors measure the susceptibility of power networks to random failures and directed attacks using three measures of vulnerability: characteristic path lengths, connectivity loss and blackout sizes, and conclude that evaluating vulnerability in power networks using purely topological metrics can be misleading.
Abstract: In order to identify the extent to which results from topological graph models are useful for modeling vulnerability in electricity infrastructure, we measure the susceptibility of power networks to random failures and directed attacks using three measures of vulnerability: characteristic path lengths, connectivity loss and blackout sizes. The first two are purely topological metrics. The blackout size calculation results from a model of cascading failure in power networks. Testing the response of 40 areas within the Eastern US power grid and a standard IEEE test case to a variety of attack/failure vectors indicates that directed attacks result in larger failures using all three vulnerability measures, but the attack vectors that appear to cause the most damage depend on the measure chosen. While our topological and power grid model results show some trends that are similar, there is only a mild correlation between the vulnerability measures for individual simulations. We conclude that evaluating vulnerability in power networks using purely topological metrics can be misleading.

206 citations


Patent
29 Jul 2010
TL;DR: In this paper, a protocol parser is invoked to parse a data message, which is then matched against a plurality of vulnerability signatures in parallel using a candidate selection algorithm, and an unwanted network intrusion is detected based on the outcome of the matching.
Abstract: Systems, methods, and apparatus are provided for vulnerability signature based Network Intrusion Detection and/or Prevention which achieves high throughput comparable to that of the state-of-the-art regex-based systems while offering improved accuracy. A candidate selection algorithm efficiently matches thousands of vulnerability signatures simultaneously using a small amount of memory. A parsing transition state machine achieves fast protocol parsing. Certain examples provide a computer-implemented method for network intrusion detection. The method includes capturing a data message and invoking a protocol parser to parse the data message. The method also includes matching the parsed data message against a plurality of vulnerability signatures in parallel using a candidate selection algorithm and detecting an unwanted network intrusion based on an outcome of the matching.
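
A toy sketch of the general idea of matching parsed protocol fields against vulnerability signatures (predicates over fields) rather than raw-byte regexes. The field names, signature format, and placeholder CVE labels are assumptions, not the patented parsing state machine or candidate selection algorithm.

```python
# Each vulnerability signature is a set of predicates over parsed protocol fields.
SIGNATURES = {
    "CVE-XXXX-A (long filename overflow)":
        {"method": lambda v: v == "GET", "uri_len": lambda v: v > 1024},
    "CVE-XXXX-B (oversized chunk header)":
        {"transfer_encoding": lambda v: v == "chunked", "chunk_size": lambda v: v > 2**20},
}

def parse_http(message: str) -> dict:
    """Very small stand-in for a protocol parsing state machine."""
    method, uri, _ = message.split(maxsplit=2)
    return {"method": method, "uri_len": len(uri),
            "transfer_encoding": "identity", "chunk_size": 0}

def match(fields: dict):
    """Return every signature whose predicates are all satisfied by the parsed fields."""
    hits = []
    for name, predicates in SIGNATURES.items():
        if all(field in fields and check(fields[field]) for field, check in predicates.items()):
            hits.append(name)
    return hits

print(match(parse_http("GET /" + "A" * 2000 + " HTTP/1.1")))
```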

158 citations


Journal ArticleDOI
TL;DR: A hybrid approach for structural vulnerability analysis of power transmission networks is proposed, in which a DC power flow model with hidden failures is embedded into the traditional error and attack tolerance methodology to form a new scheme for power grids vulnerability assessment and modeling.
Abstract: Power grids have been studied as a typical example of real-world complex networks. Different from previous methods, this paper proposes a hybrid approach for structural vulnerability analysis of power transmission networks, in which a DC power flow model with hidden failures is embedded into the traditional error and attack tolerance methodology to form a new scheme for power grid vulnerability assessment and modeling. The new approach embodies some important characteristics of power transmission networks. Furthermore, the simulation on the standard IEEE 118-bus system demonstrates that a critical region might exist: when the power grid operates in that region, it is vulnerable to both random and intentional attacks. Finally, a brief theoretical analysis is presented to explain the new phenomena.
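
The DC power flow with hidden failures is beyond a few lines, but the "error and attack tolerance" half of the hybrid scheme can be sketched: compare random node removal with a targeted (degree-ordered) attack and watch the surviving giant component. The stand-in topology and the number of removals are toy assumptions (the paper uses the IEEE 118-bus system).

```python
import random
import networkx as nx

N = 118  # stand-in size; the real study uses the IEEE 118-bus system
g = nx.barabasi_albert_graph(N, 2, seed=7)

def giant_fraction(h: nx.Graph) -> float:
    """Fraction of the original nodes remaining in the largest connected component."""
    return max(len(c) for c in nx.connected_components(h)) / N

def remove_and_measure(order, steps=20) -> float:
    """Remove `steps` nodes in the given order and report the surviving giant component."""
    h = g.copy()
    for node in order[:steps]:
        h.remove_node(node)
    return giant_fraction(h)

random_order = random.Random(7).sample(list(g.nodes), g.number_of_nodes())
attack_order = [n for n, _ in sorted(g.degree, key=lambda kv: kv[1], reverse=True)]

print("after random failures :", remove_and_measure(random_order))
print("after targeted attack :", remove_and_measure(attack_order))
```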

145 citations


Journal ArticleDOI
TL;DR: This paper proposes a new model, the E-Awareness Model (E-AM), in which home users can be forced to acquaint themselves with the risks involved in venturing into cyberspace, and thereby offers a way to improve information security awareness among home users.

141 citations


Journal ArticleDOI
TL;DR: A unique data set compiled from the Computer Emergency Response Team/Coordination Center (CERT) and SecurityFocus suggests that disclosure accelerates patch release and that vendors are more responsive to more severe vulnerabilities.
Abstract: A key aspect of better and more secure software is timely patch release by software vendors for the vulnerabilities in their products. Software vulnerability disclosure, which refers to the publication of vulnerability information, has generated intense debate. An important consideration in this debate is the behavior of software vendors. How quickly do vendors patch vulnerabilities and how does disclosure affect patch release time? This paper compiles a unique data set from the Computer Emergency Response Team/Coordination Center (CERT) and SecurityFocus to answer this question. Our results suggest that disclosure accelerates patch release. The instantaneous probability of releasing the patch rises by nearly two and a half times because of disclosure. Open source vendors release patches more quickly than closed source vendors. Vendors are more responsive to more severe vulnerabilities. We also find that vendors respond more slowly to vulnerabilities not disclosed by CERT. We verify our results by using another publicly available data set and find that results are consistent. We also show how our estimates can aid policy makers in their decision making.
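
The "instantaneous probability" language corresponds to a hazard model. As a hypothetical stand-in (column names, values, and the single-covariate specification are invented, not the paper's data or full model), a minimal sketch with the lifelines library shows how such a hazard ratio could be estimated; an exp(coef) near 2.5 for `disclosed` would match the "nearly two and a half times" finding.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical records: days until patch, whether a patch was observed,
# and whether the vulnerability was publicly disclosed.
df = pd.DataFrame({
    "days_to_patch": [10, 45, 120, 30, 200, 15, 60, 90],
    "patched":       [1, 1, 0, 1, 0, 1, 1, 1],
    "disclosed":     [1, 0, 0, 1, 0, 1, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df[["days_to_patch", "patched", "disclosed"]],
        duration_col="days_to_patch", event_col="patched")
# exp(coef) for `disclosed` is the hazard ratio of patch release given disclosure.
print(cph.summary[["coef", "exp(coef)"]])
```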

Journal Article
TL;DR: In this paper, some security threats and challenges faced by WSNs are discussed.
Abstract: Wireless sensor networks have become a growing area of research and development due to the tremendous number of applications that can greatly benefit from such systems. This has led to the development of tiny, cheap, disposable and self-contained battery-powered computers, known as sensor nodes or "motes", which can accept input from an attached sensor, process this input data and transmit the results wirelessly to the transit network. Despite making such sensor networks possible, the very wireless nature of the sensors presents a number of security threats when they are deployed for certain applications such as military operations and surveillance. The security problem stems from the wireless nature of the sensor networks and the constrained resources of the wireless sensor nodes, which means that security architectures used for traditional wireless networks are not viable. Furthermore, wireless sensor networks have an additional vulnerability because nodes are often placed in a hostile or dangerous environment where they are not physically protected. In this paper we discuss some security threats and challenges faced by WSNs.

Proceedings ArticleDOI
22 Mar 2010
TL;DR: A conceptual layered framework for protecting power grid automation systems against cyber attacks is proposed and desirable performance in terms of modularity, scalability, extendibility, and manageability is taken into account.
Abstract: Like any other industry sector, the electrical power industry is facing challenges involved with the increasing demand for interconnected system operations and control under the restructured electrical industry, due to deregulation of the electrical market and the trend towards the Smart Grid. This moves automation networks from outdated, proprietary, closed networks to the more current arena of Information Technology (IT). However, while the industry gains all of the cost and performance benefits of IT, it acquires existing IT security challenges as well. The power grid automation network has inherent security risks because the systems and applications in the network were not originally designed for the general IT environment. In this paper, we propose a conceptual layered framework for protecting power grid automation systems against cyber attacks. The following factors are taken into account: 1) integration with existing, legacy systems in a non-intrusive fashion; 2) desirable performance in terms of modularity, scalability, extendibility, and manageability; 3) alignment with the “Roadmap to Secure Control Systems in the Energy Sector” [12] and the future intelligent power delivery systems [2,3,11]. The on-site test result of the system prototype is briefly presented as well.

Proceedings ArticleDOI
01 Nov 2010
TL;DR: In this study of the 39,393 unique CVEs until the end of 2009, the following trends are identified, given here in the form of a weather forecast: PHP: declining, with occasional SQL injection; Buffer Overflows: flattening out after decline.
Abstract: We study the vulnerability reports in the Common Vulnerability and Exposures (CVE) database by using topic models on their description texts to find prevalent vulnerability types and new trends semi-automatically. In our study of the 39,393 unique CVEs until the end of 2009, we identify the following trends, given here in the form of a weather forecast: PHP: declining, with occasional SQL injection. Buffer Overflows: flattening out after decline. Format Strings: in steep decline. SQL Injection and XSS: remaining strong, and rising. Cross-Site Request Forgery: a sleeping giant perhaps, stirring. Application Servers: rising steeply.
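
A minimal sketch of the kind of pipeline described here: fitting a topic model to CVE description texts with scikit-learn. The toy descriptions, topic count, and vectorizer settings are assumptions, not the paper's configuration; trends would come from tracking topic weights per year rather than from this snippet.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-ins for CVE description texts.
cve_descriptions = [
    "SQL injection in login.php allows remote attackers to execute arbitrary SQL commands",
    "Buffer overflow in the image parser allows remote code execution via a crafted file",
    "Cross-site scripting (XSS) in the search parameter allows injection of arbitrary web script",
    "Format string vulnerability in the logging routine allows code execution",
    "Cross-site request forgery (CSRF) allows remote attackers to change admin settings",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(cve_descriptions)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(X)

# Print the top words per topic as a rough label for each prevalent vulnerability type.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```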

Proceedings ArticleDOI
04 Nov 2010
TL;DR: The combination of commands that will cause meters to interrupt the supply, of applets and software upgrades that run in the meters, and of cryptographic keys that are used to authenticate these commands and software changes, creates a new strategic vulnerability, which is discussed in this paper.
Abstract: We're about to acquire a significant new cyber-vulnerability. The world's energy utilities are starting to install hundreds of millions of 'smart meters' which contain a remote off switch. Its main purpose is to ensure that customers who default on their payments can be switched remotely to a prepay tariff; secondary purposes include supporting interruptible tariffs and implementing rolling power cuts at times of supply shortage. The off switch creates information security problems of a kind, and on a scale, that the energy companies have not had to face before. From the viewpoint of a cyber attacker - whether a hostile government agency, a terrorist organisation or even a militant environmental group - the ideal attack on a target country is to interrupt its citizens' electricity supply. This is the cyber equivalent of a nuclear strike; when electricity stops, then pretty soon everything else does too. Until now, the only plausible ways to do that involved attacks on critical generation, transmission and distribution assets, which are increasingly well defended. Smart meters change the game. The combination of commands that will cause meters to interrupt the supply, of applets and software upgrades that run in the meters, and of cryptographic keys that are used to authenticate these commands and software changes, creates a new strategic vulnerability, which we discuss in this paper.

Journal ArticleDOI
TL;DR: This paper presents an overview of the most important types of risk maps that can be distinguished, using examples from the scientific literature: contamination maps, exposure maps, hazard maps, vulnerability maps and 'true' risk maps.

Journal ArticleDOI
Tyler Moore1
TL;DR: The various economic challenges plaguing cybersecurity are outlined in greater detail - misaligned incentives, information asymmetries and externalities - and the regulatory options that are available to overcome these barriers in the cybersecurity context are discussed.

Journal ArticleDOI
TL;DR: The CVSS Risk Level Estimation Model estimates a security risk level from vulnerability information as a combination of frequency and impact estimates derived from the CVSS, organised as a Bayesian Belief Network (BBN) topology, which allows not only the use of CVSS-based estimates but also the combination of disparate information sources.

Proceedings ArticleDOI
20 Sep 2010
TL;DR: The evaluation for 60 vulnerabilities on 176 releases of 119 open-source software systems shows that SecureSync is able to detect recurring vulnerabilities with high accuracy and to identify 90 releases having potentially vulnerable code that are not reported or fixed yet, even in mature systems.
Abstract: Software security vulnerabilities are discovered on an almost daily basis and have caused substantial damage. Aiming at supporting early detection and resolution for them, we have conducted an empirical study on thousands of vulnerabilities and found that many of them are recurring due to software reuse. Based on the knowledge gained from the study, we developed SecureSync, an automatic tool to detect recurring software vulnerabilities on the systems that reuse source code or libraries. The core of SecureSync includes two techniques to represent and compute the similarity of vulnerable code across different systems. The evaluation for 60 vulnerabilities on 176 releases of 119 open-source software systems shows that SecureSync is able to detect recurring vulnerabilities with high accuracy and to identify 90 releases having potentially vulnerable code that are not reported or fixed yet, even in mature systems. A couple of cases were actually confirmed by their developers.
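
The abstract does not detail the two similarity techniques; as a hypothetical stand-in only, the sketch below scores how close a code fragment is to a known vulnerable fragment using token-set Jaccard similarity and flags candidates above a threshold. The tokenizer, the example fragment, and the threshold are assumptions, not SecureSync's representation.

```python
import re

def tokens(code: str) -> set:
    """Crude lexical tokenizer: identifiers, numbers, and single-character operators."""
    return set(re.findall(r"[A-Za-z_]\w*|\d+|[^\sA-Za-z_0-9]", code))

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

KNOWN_VULNERABLE = "if (len > buf_size) memcpy(buf, src, len);"   # hypothetical fragment

def looks_recurring(candidate: str, threshold: float = 0.6) -> bool:
    """Flag code that is suspiciously similar to a known vulnerable fragment."""
    return jaccard(tokens(KNOWN_VULNERABLE), tokens(candidate)) >= threshold

# A lightly renamed copy of the vulnerable code (e.g. in a reused library) is still flagged.
print(looks_recurring("if (length > buf_size) memcpy(buf, source, length);"))
```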

Proceedings ArticleDOI
05 Jan 2010
TL;DR: The findings show that computer self-efficacy and response efficacy both positively affect the backing up of data, while perceived security vulnerability and perceived security threat both negatively affect the backing up of data.
Abstract: This study uses Protection Motivation Theory (PMT) as a theoretical framework to empirically test why people back up data on their personal computers. The theory was tested using 112 surveys collected using both paper and online data sources. The findings show that computer self-efficacy and response efficacy both positively affect the backing up of data, while perceived security vulnerability and perceived security threat both negatively affect the backing up of data. The results and implications of these findings suggest further research is necessary to fully understand the relationship between security threats and protective behaviors.

Journal ArticleDOI
TL;DR: The attack graph and multiple criteria decision-making (MCDM) are introduced to address the difficulties of security assessment, namely the security analysis of the power control process and the quantification of the security degree of each control step.
Abstract: The security assessment is a key function that should be performed in advance of any security deployment. Since experience of cyber attacks in power control systems is still limited, a complete methodology of security assessment for communication networks of power control systems is needed. According to past research, the difficulties of security assessment include the security analysis of the power control process and the security degree of each control step. Therefore, the attack graph and multiple criteria decision-making (MCDM) are introduced to deal with these difficulties. The overall security assessment is decomposed into two parts. One is the security analysis model for power control systems using the attack graph, including the definition of basic concepts, the construction algorithm, the vulnerability function of each control step, and connection-model-based system vulnerability calculation. The other focuses on the quantification of the security degree of each control step: a hybrid MCDM approach integrating an analytic hierarchy process (AHP) and a technique for order preference by similarity to ideal solution (TOPSIS) is proposed to value the vulnerability factors derived from the security analysis model. Finally, an instance communication network of a power control system is modeled to test the validity of the security assessment. The result supports the usefulness of the security assessment.
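
A compact sketch of the TOPSIS step (ranking alternatives by closeness to an ideal solution). The criteria matrix, the AHP-style weights, and the criteria themselves are invented placeholders, not the paper's data.

```python
import numpy as np

def topsis(scores: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Rank alternatives (rows) over criteria (columns) by closeness to the ideal.

    benefit[j] is True if larger is better for criterion j, False if smaller is better.
    """
    norm = scores / np.linalg.norm(scores, axis=0)          # vector-normalise each column
    v = norm * weights                                       # apply AHP-derived weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))  # best value per criterion
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))   # worst value per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                           # closeness coefficient per row

# Three control steps scored on (exposure, exploit difficulty, impact) - invented numbers.
scores = np.array([[0.8, 0.3, 0.9],
                   [0.4, 0.7, 0.5],
                   [0.6, 0.5, 0.7]])
weights = np.array([0.5, 0.2, 0.3])        # e.g. derived from an AHP pairwise comparison
benefit = np.array([True, False, True])    # higher exposure/impact = more vulnerable
print(topsis(scores, weights, benefit))    # higher closeness = more vulnerable control step
```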

Journal ArticleDOI
01 Jun 2010
TL;DR: The study disproves the popular belief that minutiae templates are non-reversible and raises a key vulnerability issue in the use of non-encrypted standard templates.
Abstract: This work reports a vulnerability evaluation of a highly competitive ISO matcher to direct attacks carried out with fake fingers generated from ISO templates. Experiments are carried out on a fingerprint database acquired in a real-life scenario and show that the evaluated system is highly vulnerable to the proposed attack scheme, granting access in over 75% of the attempts (for a high-security operating point). Thus, the study disproves the popular belief that minutiae templates are non-reversible and raises a key vulnerability issue in the use of non-encrypted standard templates. (This article is an extended version of Galbally et al., 2008, which was awarded the IBM Best Student Paper Award in the track of Biometrics at ICPR 2008.)

Book
01 Jan 2010
TL;DR: This book discusses RFID Privacy, Election Verifiability in Electronic Voting Protocols, and Bayesian Nash Equilibria for Network Security Games with Limited Information.
Abstract: RFID and Privacy.- A New Framework for RFID Privacy.- Readers Behaving Badly.- Privacy-Preserving, Taxable Bank Accounts.- Formal Analysis of Privacy for Vehicular Mix-Zones.- Software Security.- IntPatch: Automatically Fix Integer-Overflow-to-Buffer-Overflow Vulnerability at Compile-Time.- A Theory of Runtime Enforcement, with Results.- Enforcing Secure Object Initialization in Java.- Flexible Scheduler-Independent Security.- Cryptographic Protocols.- Secure Multiparty Linear Programming Using Fixed-Point Arithmetic.- A Certifying Compiler for Zero-Knowledge Proofs of Knowledge Based on Σ-Protocols.- Short Generic Transformation to Strongly Unforgeable Signature in the Standard Model.- DR@FT: Efficient Remote Attestation Framework for Dynamic Systems.- Traffic Analysis.- Website Fingerprinting and Identification Using Ordered Feature Sequences.- Web Browser History Detection as a Real-World Privacy Threat.- On the Secrecy of Spread-Spectrum Flow Watermarks.- Traffic Analysis against Low-Latency Anonymity Networks Using Available Bandwidth Estimation.- End-User Security.- A Hierarchical Adaptive Probabilistic Approach for Zero Hour Phish Detection.- Kamouflage: Loss-Resistant Password Management.- Formal Analysis.- Sequential Protocol Composition in Maude-NPA.- Verifying Security Property of Peer-to-Peer Systems Using CSP.- Modeling and Analyzing Security in the Presence of Compromising Adversaries.- On Bounding Problems of Quantitative Information Flow.- E-voting and Broadcast.- On E-Vote Integrity in the Case of Malicious Voter Computers.- Election Verifiability in Electronic Voting Protocols.- Pretty Good Democracy for More Expressive Voting Schemes.- Efficient Multi-dimensional Key Management in Broadcast Services.- Authentication, Access Control, Authorization and Attestation.- Caught in the Maze of Security Standards.- User-Role Reachability Analysis of Evolving Administrative Role Based Access Control.- An Authorization Framework Resilient to Policy Evaluation Failures.- Optimistic Fair Exchange with Multiple Arbiters.- Anonymity and Unlinkability.- Speaker Recognition in Encrypted Voice Streams.- Evaluating Adversarial Partitions.- Providing Mobile Users' Anonymity in Hybrid Networks.- Complexity of Anonymity for Security Protocols.- Network Security and Economics.- k-Zero Day Safety: Measuring the Security Risk of Networks against Unknown Attacks.- Are Security Experts Useful? Bayesian Nash Equilibria for Network Security Games with Limited Information.- RatFish: A File Sharing Protocol Provably Secure against Rational Users.- A Service Dependency Model for Cost-Sensitive Intrusion Response.- Secure Update, DOS and Intrusion Detection.- Secure Code Update for Embedded Devices via Proofs of Secure Erasure.- D(e|i)aling with VoIP: Robust Prevention of DIAL Attacks.- Low-Cost Client Puzzles Based on Modular Exponentiation.- Expressive, Efficient and Obfuscation Resilient Behavior Based IDS.

Proceedings ArticleDOI
01 Oct 2010
TL;DR: This paper models an attack as a disk around its epicenter, provides efficient algorithms to find vulnerable points within the network under various metrics, and develops algorithms that identify potential points where an attack is likely to cause significant damage.
Abstract: Telecommunications networks heavily rely on the physical infrastructure and are therefore vulnerable to natural disasters, such as earthquakes or floods, as well as to physical attacks, such as an Electromagnetic Pulse (EMP) attack. Large-scale disasters are likely to destroy network equipment and to severely affect interdependent systems such as the power grid. In turn, long-term outage of the power grid might cause additional failures to the telecommunication network. In this paper, we model an attack as a disk around its epicenter, and provide efficient algorithms to find vulnerable points within the network, under various metrics. In addition, we consider the case in which multiple disasters happen simultaneously and provide an approximation algorithm to find the points which cause the most significant destruction. Finally, since a network element does not always fail, even when it is close to the attack's epicenter, we consider a simple probabilistic model in which the probability of a network element failure is given. Under this model, we tackle the cases of single and multiple attacks and develop algorithms that identify potential points where an attack is likely to cause significant damage.
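
A brute-force sketch of the simplest version of the problem: searching a grid of candidate epicenters for the disk of radius r that destroys the most nodes. The node coordinates, radius, and grid resolution are toy assumptions, and the paper gives efficient algorithms rather than the exhaustive search used here.

```python
import math
import random

random.seed(0)
nodes = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(200)]
RADIUS = 10.0

def destroyed(center, r=RADIUS):
    """Nodes falling inside a disk attack centred at `center`."""
    cx, cy = center
    return [p for p in nodes if math.hypot(p[0] - cx, p[1] - cy) <= r]

def worst_epicenter(step=2.0):
    """Exhaustive grid search for the most damaging epicenter (illustration only)."""
    best_center, best_hit = None, -1
    x = 0.0
    while x <= 100.0:
        y = 0.0
        while y <= 100.0:
            hit = len(destroyed((x, y)))
            if hit > best_hit:
                best_center, best_hit = (x, y), hit
            y += step
        x += step
    return best_center, best_hit

center, hit = worst_epicenter()
print(f"worst-case epicenter {center} destroys {hit} nodes")
```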

Proceedings ArticleDOI
19 Jun 2010
TL;DR: The Hardware Vulnerability Factor (HVF) is introduced and analyzed to quantify the vulnerability of hardware and it is demonstrated that this technique can estimate AVF at runtime with an average absolute error of less than 3%.
Abstract: Fault tolerance is now a primary design constraint for all major microprocessors. One step in determining a processor's compliance to its failure rate target is measuring the Architectural Vulnerability Factor (AVF) of each on-chip structure. The AVF of a hardware structure is the probability that a fault in the structure will affect the output of a program. While AVF generates meaningful insight into system behavior, it cannot quantify the vulnerability of an individual system component (hardware, user program, etc.), limiting the amount of insight that can be generated. To address this, prior work has introduced the Program Vulnerability Factor (PVF) to quantify the vulnerability of software. In this paper, we introduce and analyze the Hardware Vulnerability Factor (HVF) to quantify the vulnerability of hardware. HVF has three concrete benefits which we examine in this paper. First, HVF analysis can provide insight to hardware designers beyond that gained from AVF analysis alone. Second, separating AVF analysis into HVF and PVF steps can accelerate the AVF measurement process. Finally, HVF measurement enables runtime AVF estimation that combines compile-time PVF estimates with runtime HVF measurements. A key benefit of this technique is that it allows software developers to influence the runtime AVF estimates. We demonstrate that this technique can estimate AVF at runtime with an average absolute error of less than 3%.
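
A rough sketch of the intuition behind combining the two factors (the general idea, not necessarily the paper's exact estimator): the runtime AVF estimate for a hardware structure multiplies how often its bits hold architecturally live state (HVF, measured at runtime) by how likely that architectural state is to matter to the program's output (PVF, estimated at compile time).

```latex
\mathrm{AVF}_{\text{structure}} \;\approx\; \mathrm{HVF}_{\text{structure}} \times \mathrm{PVF}_{\text{program}},
\qquad \text{e.g. } 0.40 \times 0.30 = 0.12 \;\;(\text{a } 12\%\ \text{runtime AVF estimate}).
```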

Patent
04 Oct 2010
TL;DR: In this article, the authors propose a fully distributed multi-factor authentication scheme that leverages the wide penetration of mobile phones in modern economies as the basis for confirming the identity of the presenter of a payment card or other credentials.
Abstract: The invention described here provides a fully-distributed solution to the problem of confirming the identity of the presenter of a payment card or other credentials, using multiple factors to authenticate the presenter. The invention leverages the wide penetration of mobile phones in modern economies as the basis for the distributed multi-factor authentication. For additional confidence levels, biometric data can be incrementally included as part of the multi-factor authentication. The loss of any one of the multiple authentication factors does not compromise the integrity of the system or the individual, and there is no single point of vulnerability for attack or theft. The invention is fully backwards compatible with current payment card systems and can be extended to almost any situation where the identity of the presenter of credentials needs to be authenticated prior to allowing the individual access to the protected services, systems, or locations. This allows for incremental adoption across a wide range of current and future systems.

Journal ArticleDOI
TL;DR: In this paper, a risk-based approach for the transmission network expansion problem under deliberate outages is presented, where the risk associated with this uncertainty is explicitly addressed in the proposed model.
Abstract: This paper presents a risk-based approach for the transmission network expansion problem under deliberate outages. Malicious attacks expose network planners to a new challenge: how to expand and reinforce the transmission network so that the vulnerability against intentional attacks is mitigated while meeting budgetary limits. Within this framework network planners face the nonrandom uncertainty of deliberate outages. Unlike in previous approaches, the risk associated with this uncertainty is explicitly addressed in the proposed model. Risk characterization is implemented through the minimax weighted regret paradigm. The resulting mixed-integer nonlinear programming formulation is transformed into an equivalent mixed-integer linear programming problem for which efficient commercial solvers are available. Numerical results illustrate the performance of the proposed methodology. The risk-based expansion plans are compared with those achieved by a previously reported risk-neutral model. An out-of-sample assessment is carried out to show the advantages of the risk-based model over the risk-neutral approach. In addition, the tradeoff between risk mitigation and cost minimization is analyzed.
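
A brief sketch of the minimax weighted regret idea used here for risk characterisation, in generic notation rather than the paper's exact model: for each expansion plan x and each plausible attack scenario s, the regret is the gap to the best plan for that scenario, and the planner picks the plan that minimises the worst weighted regret. C(x, s) stands for, e.g., load shed plus investment cost, and w_s is a scenario weight; both are placeholders.

```latex
R(x,s) \;=\; C(x,s) \;-\; \min_{x'} C(x',s),
\qquad
x^{*} \;=\; \arg\min_{x}\; \max_{s \in S}\; w_{s}\, R(x,s).
```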

Proceedings ArticleDOI
15 Sep 2010
TL;DR: This paper describes the system and adversary characterization data that are collected as input for the executable model and describes the simulation algorithms for adversary attack behavior and the computation for the probability that an attack attempt is successful.
Abstract: To provide insight on system security and aid decision-makers, we propose the ADversary VIew Security Evaluation (ADVISE) method to quantitatively evaluate the strength of a system's security. Our approach is to create an executable state-based security model of a system. The security model is initialized with information characterizing the system and the adversaries attacking the system. The model then simulates the attack behavior of the adversaries to produce a quantitative assessment of system security strength. This paper describes the system and adversary characterization data that are collected as input for the executable model. This paper also describes the simulation algorithms for adversary attack behavior and the computation for the probability that an attack attempt is successful. A simple case study illustrates how to analyze system security using the ADVISE method. A tool is currently under development to facilitate automatic model generation and simulation. The ADVISE method aggregates security-relevant information about a system and its adversaries to produce a quantitative security analysis useful for holistic system security decisions.
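
A toy Monte Carlo sketch in the spirit of simulating adversary attack behaviour and estimating the probability that an attack attempt succeeds. The attack steps, success probabilities, and detection probabilities are invented placeholders, not ADVISE model parameters.

```python
import random

# Hypothetical multi-step attack path: (step name, P[step succeeds], P[attempt detected]).
ATTACK_PATH = [
    ("phish operator credentials", 0.40, 0.20),
    ("pivot into control network", 0.60, 0.30),
    ("issue malicious breaker command", 0.70, 0.50),
]

def one_attempt(rng: random.Random) -> bool:
    """Simulate one attempt; it fails if any step is detected or fails outright."""
    for _, p_success, p_detect in ATTACK_PATH:
        if rng.random() < p_detect:      # defender notices and stops the attempt
            return False
        if rng.random() >= p_success:    # the step itself fails
            return False
    return True

def estimate_success(trials: int = 100_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    return sum(one_attempt(rng) for _ in range(trials)) / trials

print(f"estimated probability of a successful attack: {estimate_success():.3f}")
```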

Book ChapterDOI
01 Jan 2010
TL;DR: In this article, the authors examined the role of the major actors within the security ecosystem and the processes they participate in, and the paths vulnerability data take through the ecosystem and examined the impact of each of these on security risk.
Abstract: The security of information technology and computer networks is affected by a wide variety of actors and processes which together make up a security ecosystem; here we examine this ecosystem, consolidating many aspects of security that have hitherto been discussed only separately. First, we analyze the roles of the major actors within this ecosystem, the processes they participate in, the paths vulnerability data take through the ecosystem, and the impact of each of these on security risk. Then, based on a quantitative examination of 27,000 vulnerabilities disclosed over the past decade and taken from publicly available data sources, we quantify the systematic gap between exploit and patch availability. We provide the first examination of the impact and the risks associated with this gap on the ecosystem as a whole. Our analysis provides a metric for the success of the “responsible disclosure” process. We measure the prevalence of the commercial markets for vulnerability information and highlight the role of security information providers (SIP), which function as the “free press” of the ecosystem.
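
A minimal pandas sketch of the gap metric discussed here - days between exploit availability and patch availability per vulnerability - on an invented five-row data set; the column names and dates are assumptions.

```python
import pandas as pd

# Hypothetical disclosure timeline data (dates are invented).
df = pd.DataFrame({
    "cve":          ["A", "B", "C", "D", "E"],
    "exploit_date": pd.to_datetime(["2009-01-10", "2009-02-01", "2009-03-15",
                                    "2009-05-02", "2009-06-20"]),
    "patch_date":   pd.to_datetime(["2009-01-25", "2009-01-20", "2009-04-10",
                                    "2009-05-01", "2009-08-01"]),
})

# Positive gap: exploit available before the patch (an exposure window);
# negative gap: the patch shipped before a public exploit appeared.
df["gap_days"] = (df["patch_date"] - df["exploit_date"]).dt.days
print(df[["cve", "gap_days"]])
print("median exposure window (days):", df["gap_days"].median())
print("share with exploit before patch:", (df["gap_days"] > 0).mean())
```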

Journal ArticleDOI
TL;DR: This work studies the impact of network security vulnerability and supply chain integration on firms' incentives to invest in information security, and finds that even though an increase in either the degree of network vulnerability or the degree of supply chain integration increases the security risk, they have different impacts on firms' incentives to invest in security.
Abstract: Recent supply chain reengineering efforts have focused on integrating firms' production, inventory and replenishment activities with the help of communication networks. While communication networks and supply chain integration facilitate optimization of traditional supply chain functions, they also exacerbate the information security risk: communication networks propagate security breaches from one firm to another, and supply chain integration causes a breach at one firm to affect other firms in the supply chain. We study the impact of network security vulnerability and supply chain integration on firms' incentives to invest in information security. We find that even though an increase in either the degree of network vulnerability or the degree of supply chain integration increases the security risk, they have different impacts on firms' incentives to invest in security. If the degree of supply chain integration is low, then an increase in network vulnerability induces firms to reduce, rather than increase, their security investments. A sufficiently high degree of supply chain integration alters the impact of network vulnerability into one in which firms have an incentive to increase their investments when the network vulnerability is higher. Though an increase in the degree of supply chain integration enhances firms' incentives to invest in security, private provisioning for security always results in a less than socially optimal security level. A liability mechanism that makes the responsible party partially compensate for the other party's loss induces each firm to invest at the socially optimal level. If firms choose the degree of integration, in addition to security investment, then firms may choose a higher degree of integration when they decide individually than when they decide jointly, suggesting an even greater security risk to the supply chain.