
Showing papers on "Vulnerability (computing) published in 1997"


Proceedings ArticleDOI
08 Dec 1997
TL;DR: This paper uses short-term private-key/public-key key pairs to reduce the magnitude of the vulnerability created by long-term private-key compromise in e-mail security systems.
Abstract: Current e-mail security systems base their security on the secrecy of the long-term private key. If this private key is ever compromised, an attacker can decrypt any messages-past, present or future-encrypted with the corresponding public key. The system described in this paper uses short-term private-key/public-key key pairs to reduce the magnitude of this vulnerability.

28 citations
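A minimal sketch of the idea in the abstract above: rotate short-lived keys so that compromising one key exposes only the messages of its own validity window, not past or future traffic. The class, the XOR "cipher", and the epoch scheme are illustrative stand-ins, not the paper's actual public-key cryptography:

```python
import secrets
from collections import defaultdict

class ShortTermKeyStore:
    """Toy model: each epoch has its own short-term key, so an attacker
    who learns one key can read only that epoch's messages. The XOR
    'encryption' is a placeholder for real public-key operations."""

    def __init__(self):
        self.epoch = 0
        self.keys = {0: secrets.token_bytes(16)}   # epoch -> short-term key
        self.mailbox = defaultdict(list)           # epoch -> ciphertexts

    def rotate(self):
        """Retire the current key pair and generate a fresh one."""
        self.epoch += 1
        self.keys[self.epoch] = secrets.token_bytes(16)

    def encrypt(self, message: bytes) -> None:
        key = self.keys[self.epoch]
        ct = bytes(m ^ k for m, k in zip(message, key))  # toy cipher
        self.mailbox[self.epoch].append(ct)

    def compromise(self, epoch: int) -> list:
        """What an attacker recovers from one leaked short-term key:
        only the messages sent during that epoch."""
        key = self.keys[epoch]
        return [bytes(c ^ k for c, k in zip(ct, key))
                for ct in self.mailbox[epoch]]

store = ShortTermKeyStore()
store.encrypt(b"january report")
store.rotate()
store.encrypt(b"february report")

# Compromising epoch 0 reveals only January's traffic.
leaked = store.compromise(0)
```

With a single long-term key, the same compromise would expose every message ever encrypted to that key; rotation bounds the damage to one window.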


Journal ArticleDOI
TL;DR: Types of computer crime and security attacks are discussed and a classification of security crackers is presented; without solid investigative techniques, the stability of today's military and commercial institutions, and of the cybermarkets envisioned for the Internet, is called into question.
Abstract: In 1996 the U.S. Federal Computer Incident Response Capability (FedCIRC) reported more than 2500 incidents, defined as adverse events in a computer system or networks caused by a failure of a security mechanism, or an attempted or threatened breach of these mechanisms. The Federal Bureau of Investigation's National Computer Crimes Squad, Washington, D.C., estimates that less than 15 percent of all computer crimes are even detected, and only 10 percent of those are reported. Without solidly built investigative techniques, which would contribute to a public perception of safety, the very stability of today's military and commercial institutions, not to mention the cybermarkets that are envisioned for the Internet, is called into question. The paper discusses types of computer crime and security attacks. It also presents a classification of the types of security crackers.

16 citations


Journal ArticleDOI
TL;DR: A knowledge-based system was developed for damage assessment and vulnerability analysis of structures subjected to cyclones; knowledge acquired through questionnaire feedback from experts and engineers forms part of its knowledge base.

15 citations


Journal Article
TL;DR: A real quantification of the risk and a spatial distribution of this measure are found very useful in proposing an objective negotiation allowing real land-use management and taking into account flood risk and socially acceptable risk.
Abstract: Past years' damage by floods in France and Europe has shown that we have still much work to do to cope with this problem. To do so, it seems that the conceptualization of risk by dividing it between a socio-economic dimension (vulnerability) and a hydrological-hydraulic dimension (hazard) is a good way of investigation. Moreover, recent hydrological models named flow-duration-frequency models allow us to propose a real quantification of these two parameters of risks that are vulnerability and hazard. We find a real quantification of the risk and a spatial distribution of this measure very useful in proposing an objective negotiation allowing real land-use management and taking into account flood risk and socially acceptable risk. Representative maps, such as those proposed by the Inondabilite model, can be proposed to decision makers in order to help them use hydrologic and hydraulic results in a more efficient way. These new concepts and methods should improve risk mitigation and lead to a more acceptable risk level in potential flood areas.

14 citations


01 Jan 1997
TL;DR: In this paper, the authors propose a deterministic approach to evaluate the vulnerability of a communication network with respect to the possible disruption of some of its components, in the case of no available statistical information about the dependability properties of the network components.
Abstract: We consider the problem of evaluating the behavior of a communication network in the face of the possible disruption of some of its components. We are interested in the case when there is no available statistical information about the dependability properties of the network components. Instead of working with reliability metrics in a stochastic context, we analyze vulnerability measures in a deterministic framework. This approach allows us to propose a solution to other classes of problems (not easily handled in reliability theory). For instance, we can consider the problem of evaluating the capacity of a network to resist external attacks. We can also address the problem of quantifying the network's ability to satisfy some capacity constraints in transporting information. In the paper, we propose a definition of vulnerability allowing the numerical evaluation of these aspects of a communication system. We show that it verifies some intuitively desirable properties, which is not the case for previously proposed means of vulnerability analysis. Last, we discuss the algorithmic issues related to the evaluation of the proposed metric.

8 citations
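The deterministic viewpoint in the abstract above can be illustrated with one common vulnerability-style measure: the fewest nodes an attacker must disable to cut a source off from a destination. This brute-force sketch is a generic illustration of such a measure, not the specific metric the paper defines:

```python
from itertools import combinations

def connected(adj, src, dst, removed):
    """BFS-style reachability with the `removed` nodes deleted."""
    if src in removed or dst in removed:
        return False
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        if u == dst:
            return True
        for v in adj.get(u, ()):
            if v not in seen and v not in removed:
                seen.add(v)
                stack.append(v)
    return False

def vulnerability(adj, src, dst):
    """Smallest number of intermediate nodes whose removal disconnects
    src from dst (exhaustive search; fine for small networks).
    A low value means the network is highly vulnerable."""
    inner = [v for v in adj if v not in (src, dst)]
    for k in range(len(inner) + 1):
        for cut in combinations(inner, k):
            if not connected(adj, src, dst, set(cut)):
                return k
    return float("inf")  # src and dst cannot be separated

# Two node-disjoint paths between A and D: an attacker must hit 2 nodes.
net = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(vulnerability(net, "A", "D"))  # 2
```

Note how the measure needs no failure probabilities at all, which is the paper's point: it remains usable when no statistical dependability data exists, e.g. for deliberate attacks.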


Journal ArticleDOI
TL;DR: The Business Intelligence Collection ModelSM described in this paper can help ensure CI/counterintelligence integration by defining what needs to be protected, for how long, and from whom; assessing the rival's CI collection capabilities; testing and managing your firm's vulnerability to the adversary's collection methods; developing and implementing aggressive countermeasures (including disinformation); analyzing the results; and integrating this information with data from the CI side of the process.
Abstract: Typically, there is little linkage between a firm's intelligence collection and security functions. Nevertheless, counterintelligence measures often are relegated to security personnel, if they're thought about at all. But while security seeks to protect a firm's assets by a combination of policies, procedures and practices, counterintelligence, properly understood, aims to engage and neutralize a competitor's collection efforts through a variety of imaginative, flexible, and active measures. An organized counterintelligence approach, such as the Business Intelligence Collection ModelSM described here, can help ensure CI/counterintelligence integration. This process involves defining what needs to be protected, for how long, and from whom; assessing the rival's CI collection capabilities; testing and managing your firm's vulnerability to the rival's collection methods; developing and implementing aggressive countermeasures (including disinformation); analyzing the results; and integrating this information with data from the CI side of the process, thereby providing management with a more complete intelligence picture of the marketplace (revealing what the competition is trying to collect, for instance, is often an early indicator of where they're headed as a company). A case study is provided. © 1997 John Wiley & Sons, Inc.

7 citations


ReportDOI
01 Feb 1997
TL;DR: This note discusses survival probabilities for multiple-weapon missile attacks and derives analytic and approximate expressions for average survival probabilities; the exponential approximation is sufficiently accurate for rough calculations and well suited to analytic optimization.
Abstract: This note discusses the survival probabilities for multiple weapons missile attacks. It derives analytic and approximate expressions for average survival probabilities. Both are readily computable. The exponential approximation is sufficiently useful for rough calculations and is well suited to analytic optimizations.

6 citations
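The exponential approximation the note refers to can be shown in its simplest form: for n independent shots each killing with probability p, exact survival is (1 - p)^n, approximated by exp(-np). The functions and parameters below are illustrative; the note's own expressions cover averaged multi-weapon attacks:

```python
import math

def survival_exact(n_weapons, p_kill):
    """Probability a target survives n independent shots,
    each killing with probability p_kill."""
    return (1.0 - p_kill) ** n_weapons

def survival_exponential(n_weapons, p_kill):
    """Exponential approximation exp(-n*p): convenient for rough
    calculations and analytic optimization, as the note observes."""
    return math.exp(-n_weapons * p_kill)

for n in (1, 5, 10):
    exact = survival_exact(n, 0.2)
    approx = survival_exponential(n, 0.2)
    print(f"n={n:2d}  exact={exact:.3f}  exp-approx={approx:.3f}")
```

Since 1 - p < e^(-p) for p > 0, the exponential form always slightly overestimates survival, so it errs on the conservative side when used to size an attack.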


ReportDOI
01 Aug 1997
TL;DR: In this paper, the authors argue that the current US grand strategy for national security is obsolete because: (1) it is based upon industrial age threats and defenses that have limited information age applicability; (2) It fails to defend against structured information attacks threatening US centers of gravity; and (3) It is still reliant upon DOD as sole provider of national defense.
Abstract: The Information Age brings enormous benefit to the United States; however, US dependence upon technology results in a new strategic threat aimed at the information systems that control key aspects of our military, economic, and political power. New strategic threat: overwhelming US conventional military might suggests that future competitors may embrace grand strategies that avoid directly attacking US defense forces and focus on undermining our national will to fight by exploiting our reliance upon information systems, present technological vulnerability, and the democratic method of governing. This threat would be most effective in situations where US force application is discretionary and the desirability of its employment is not clear-cut. Though it will never equate to the strategic threat of physical occupation by conventional military forces, it is a potent coercive policy weapon. We believe the current US grand strategy for national security is obsolete because: (1) It is based upon industrial age threats and defenses that have limited information age applicability. (2) It fails to defend against structured information attacks threatening US centers of gravity. (3) It is still reliant upon DOD as sole provider of national defense.

6 citations


01 Feb 1997
TL;DR: It appears that many of these seals can be dramatically improved with minor, low-cost modifications to either the seal or the use protocol.
Abstract: Computer systems, electronic communications, digital data, and computer storage media are often highly vulnerable to physical tampering. Tamper-indicating devices, also called security seals, are widely used to detect physical tampering or unauthorized access. We studied 94 different security seals, both passive and electronic, developed either commercially or by the US government. Most of these seals are in wide-spread use, including for critical applications. We learned how to defeat all 94 seals using rapid, inexpensive, low-tech methods. Cost was not a good predictor of seal security. It appears to us that many of these seals can be dramatically improved with minor, low-cost modifications to either the seal or the use protocol.

6 citations


04 Apr 1997
TL;DR: The security challenges small states face in the evolving new world order are analyzed and viable security options for small states in general and Nepal in particular are suggested.
Abstract: The post cold war period is marked by a new multi-dimensional strategic environment giving new focus to international relations and security of small states. Though the US is the only superpower, the world is moving to multipolarity and interdependence where regional powers and international systems have an increasingly powerful role. In such an environment small states are finding themselves even more vulnerable. This paper analyzes the security challenges small states face in the evolving new world order and suggests viable security options for small states in general and Nepal in particular. It analyzes the special characteristics of small states and their vulnerability to both traditional and new forms of threats. It relates national interests with world order and makes an in depth study of the security systems of balance of power and collective security from the perspective of a small state. It analyzes Nepal's regional and internal security environment as well as her historical setting and national interests. The paper then applies the concepts of security systems in the context of Nepal to determine viable security options.

4 citations


Book
01 Jan 1997
TL;DR: This work motivates and introduces the concept of a reusable security infrastructure which will be built using a small set of proven security technology primitives and will have a single set of administrative processes, policies, databases and user keys, and describes the Yaksha security system which is an example of such an infrastructure.
Abstract: In this work we first motivate and introduce the concept of a reusable security infrastructure. Such an infrastructure will be built using a small set of proven security technology primitives and will have a single set of administrative processes, policies, databases and user keys. This single infrastructure, once implemented, will provide multiple security functions such as authentication, digital signatures, key exchange and key escrow by protocol variations. We believe that such reusable security infrastructures are the only cost effective way of implementing security on large public networks like the Internet, or within large organizations. Next we describe the Yaksha security system which is an example of such an infrastructure. Built using an RSA variant as a building block, the system can be used for digital signatures, key exchange and key escrow. It can also be used for authentication, and several authentication protocols are feasible within the infrastructure. We choose to describe an authentication protocol which is an extension of Kerberos. Significantly, it appears that breaking the Yaksha system is equivalent to breaking RSA. The Yaksha system achieves more than just reuse, it provides significant improvements over the state of the art. Its method of achieving digital signatures allows for short user private keys, and provides real time revocation of compromised keys. The extension of Kerberos implemented using the infrastructure removes the vulnerability to catastrophic failure and dictionary attacks inherent in the original Kerberos specification. The method of key escrow Yaksha provides does not require an authority to ever learn a user's long term private secrets and can be used for applications ranging from telephony to e-mail to file storage. Passwords are an important part of any security infrastructure, and we overview and point to some of our results on how to build strong password systems. 
Finally, we note that the fundamental primitives in the Yaksha infrastructure are powerful, and consequently a Yaksha infrastructure can be extended and reused in a myriad of ways.
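The abstract describes Yaksha as built from an RSA variant, with short user keys and real-time revocation. A common way to realize such properties, shown here as a hedged toy with demo-sized primes and illustrative names rather than the actual Yaksha protocol, is to split the RSA private exponent multiplicatively between the user and a server, so that signing requires both shares and deleting the server's share revokes the key immediately:

```python
# Toy illustration of a multiplicative RSA key split (hypothetical
# parameters, not the published Yaksha specification): the private
# exponent d is split so that d_user * d_server == d (mod lambda(n)).
import math, secrets

p, q = 1009, 1013                  # demo-sized primes; real keys are large
n = p * q
lam = math.lcm(p - 1, q - 1)       # Carmichael function of n
e = 65537
d = pow(e, -1, lam)                # full private exponent

# Split d: pick a random invertible user share, derive the server share.
while True:
    d_user = secrets.randbelow(lam)
    if math.gcd(d_user, lam) == 1:
        break
d_server = (d * pow(d_user, -1, lam)) % lam

msg = 4242
partial = pow(msg, d_user, n)      # user signs with its share...
sig = pow(partial, d_server, n)    # ...server completes the signature
assert pow(sig, e, n) == msg       # verifies under the ordinary public key
```

Because neither share alone is the full exponent, compromising the user's key is not catastrophic, and the server can refuse to complete signatures the moment a key is reported compromised, matching the real-time revocation property the abstract claims.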

Journal ArticleDOI
TL;DR: This paper examines INs and highlights their vulnerability to network delay attacks; in particular, it discusses the nature of real-time communications and the main approach that has been developed to determine whether received data is being sent in real time or not.

Proceedings ArticleDOI
27 Jul 1997
TL;DR: VISART is a government-developed risk management software package supporting defining risk posture and choosing optimal countermeasures and an algorithm generalizing "expected value" to nonmutually exclusive events is used to aggregate risks.
Abstract: VISART is a government-developed risk management software package supporting defining risk posture and choosing optimal countermeasures. Vulnerability and threat estimates are inputs. The indirect effect of information disclosure is modeled. An algorithm generalizing "expected value" to nonmutually exclusive events is used to aggregate risks from any number of potential undesirable events.
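The abstract's point about generalizing "expected value" to non-mutually-exclusive events can be sketched as follows. The function and inputs are illustrative assumptions, not VISART's actual algorithm or data model: when independent undesirable events can co-occur, their probabilities cannot simply be summed, though per-event expected losses still add by linearity:

```python
from math import prod

def aggregate_risk(events):
    """events: list of (probability, loss) pairs for independent,
    non-mutually-exclusive undesirable events (illustrative inputs).
    Returns the probability that at least one event occurs, and the
    total expected loss."""
    # Union probability: 1 - P(no event occurs). Summing the p_i
    # would overcount the overlaps.
    p_any = 1.0 - prod(1.0 - p for p, _ in events)
    # Expected loss adds linearly even when events overlap.
    expected_loss = sum(p * loss for p, loss in events)
    return p_any, expected_loss

events = [(0.10, 50_000), (0.05, 200_000), (0.10, 50_000)]
p_any, loss = aggregate_risk(events)
print(f"P(at least one incident) = {p_any:.4f}")  # below sum of p_i (0.25)
print(f"Expected loss            = {loss:,.0f}")
```

Such an aggregate lets countermeasures be compared by how much they reduce the combined figure, rather than event by event.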

Book ChapterDOI
TL;DR: A new definition of vulnerability is proposed allowing the numerical evaluation of a communication system and it is shown that it verifies some intuitively desirable properties, which is not the case of previously proposed means of vulnerability analysis.
Abstract: We consider the problem of evaluating the behavior of a communication network in the face of the possible disruption of some of its components, when there is no statistical information about the dependability properties of the latter. Instead of working with reliability metrics in a stochastic context, we analyze vulnerability measures in a deterministic framework. In the paper we propose a new definition of vulnerability allowing the numerical evaluation of a communication system. We show that it verifies some intuitively desirable properties, which is not the case for previously proposed means of vulnerability analysis. Last, we discuss the algorithmic issues related to the evaluation of the proposed metrics.

03 Apr 1997
TL;DR: The civilian information infrastructure is the most vulnerable point in US national security; the civil information infrastructure that represents the social and economic fabric of the nation is the Achilles heel of the national defense.
Abstract: The civilian information infrastructure is the most vulnerable point in our national security. Adversaries are fully capable of exploiting the U.S. information infrastructure and associated technologies to destroy our economic and national security. Exhaustive dependence on economic, industrial, military, and communications technology presents a perilous mix of blessings and risks to the Nation. The explosive growth and increasing dependence on information systems is phenomenal. The Executive Branch acclaims a knowledge-based global system that includes electronic commerce, health care, research communities, education systems, and a virtual electronic government. Disrupt this vast labyrinth of information and the result is national paralysis. Attackers from cyberspace have the advantages of anonymity, legal ambiguity, easily available weapons systems, attack speed, and nonlinear gains for their efforts. Attacks may come from myriad sources and by numerous means. An electronic Pearl Harbor is possible. The threat is real and actively growing. The military, commercial and economic sectors are technologically inseparable. The civil information infrastructure that represents the social and economic fabric of the nation is the Achilles heel of the national defense.

Book ChapterDOI
01 Jan 1997
TL;DR: This paper proposes that requirements for information security should be integrated in the development process in an early phase, so that information security will become an integral part of the system.
Abstract: As organisations become aware of their vulnerability to threats to their information and telecommunication systems, this often results in the ad-hoc addition of safeguards to those systems. This causes operational problems, because information security requirements were never an issue during the development of these systems. In this paper we propose that requirements for information security should be integrated in the development process in an early phase. The benefit of the integration is that information security will become an integral part of the system. We discuss the complications and some preliminary guidelines, assuming that system developers are used to a ‘traditional’ development process.


ReportDOI
01 Apr 1997
TL;DR: The performance of first-generation STAWs is so impressive - and the advances in supporting sensor and computational technology so rapid - that doctrine addressing this threat must be developed now.
Abstract: New Smart Top Attack Weapons (STAWs) are rapidly emerging in the research world and entering the battlefield, exposing our soldiers to a new threat. We need to develop doctrine, tactics, and a training program to tell our soldiers how they can reduce their vulnerability to this new family of antitank weapons. The performance of the first-generation STAWs is so impressive - and the advances in STAW-supporting sensor and computational technology so rapid - that we must act now to develop a doctrine which addresses this threat. To delay would virtually ensure that American soldiers will face STAW systems without the training and knowledge necessary to operate effectively in the STAW environment.

Book ChapterDOI
01 Jan 1997
TL;DR: The general Cascade vulnerability problem is presented, the basic properties of the most important detection algorithms are described, and a brief comparative analysis is conducted.
Abstract: The Cascade Vulnerability Problem is a potential problem which must be faced when using the interconnected accredited system approach of the Trusted Network Interpretation. In this paper, we present the general Cascade vulnerability problem, describe the basic properties of the most important detection algorithms, and conduct a brief comparative analysis.
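A hedged sketch of the basic path-detection idea behind cascade checking, using a simplified criterion of my own rather than any of the published detection algorithms the chapter compares: each accredited system handles a contiguous range of classification levels, links join systems at a shared level, and a cascade exists when a chain of systems lets data drift across a wider level span than any single system on the chain is accredited to separate:

```python
from collections import deque

def has_cascade(systems, links):
    """systems: {name: (lo, hi)} accredited level range (ints).
    links: [(sys_a, sys_b, level)] connections at a shared level.
    Returns True if some multi-system path spans more levels than
    the systems at its endpoints handle individually (simplified
    criterion for illustration)."""
    adj = {}
    def edge(u, v):
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Nodes are (system, level); inside a system all its levels connect.
    for s, (lo, hi) in systems.items():
        for lvl in range(lo, hi):
            edge((s, lvl), (s, lvl + 1))
    for a, b, lvl in links:
        edge((a, lvl), (b, lvl))
    max_span = {s: hi - lo for s, (lo, hi) in systems.items()}
    # BFS from every node; flag any reachable node whose level gap
    # exceeds what either endpoint system spans on its own.
    for start in adj:
        seen, q = {start}, deque([start])
        while q:
            u = q.popleft()
            if abs(u[1] - start[1]) > max(max_span[start[0]],
                                          max_span[u[0]]):
                return True
            for v in adj.get(u, ()):
                if v not in seen:
                    seen.add(v)
                    q.append(v)
    return False

# Two systems each bridging one level, linked at level 1: the chain
# spans levels 0..2, more than either system alone handles.
systems = {"A": (1, 2), "B": (0, 1)}
links = [("A", "B", 1)]
print(has_cascade(systems, links))  # True
```

The real detection algorithms the chapter surveys differ in how they model assurance levels and in their complexity; this sketch only conveys why the problem reduces to path finding over an interconnection graph.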

Book ChapterDOI
01 Jan 1997
TL;DR: This paper focuses on events and failures causing degraded operational modes or traffic restrictions, such as reduced speed, single-track operation, or a full stop in traffic, rather than on safety-related aspects.
Abstract: This paper describes the work carried out in connection with a reliability and vulnerability study of the Signalling System at “Gardermobanen” - a highspeed rail link between the centre of Oslo and the Main Airport in Oslo to be built and operated by NSB Gardermobanen - in order to ensure that the specified success criteria and quality goals are met. Focus in this paper is put on events and failures causing degraded operational modes or traffic restrictions such as reduced speed, single track operation, full stop in traffic, etc., and not on safety related aspects. Failure statistics from the Norwegian State Railways (NSB) show that the signalling system is one of the largest contributors to train delays (30–40%). The main results of a reliability and vulnerability analysis of the Signalling System at Gardermobanen carried out at an early stage of the development project to assist the purchaser (NSB Gardermobanen) as well as the supplier of the system (Siemens) are given together with an outline of the proposed reliability management programme for the further work. Our experience with respect to this type of work is presented together with a discussion of how similar reliability management programmes could be improved.