A Graph-Based System for Network-Vulnerability Analysis

Laura Painton Swiler and Cynthia Phillips
Sandia National Laboratories
Albuquerque, NM 87185

Abstract
This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low "effort" cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc.

The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is "matched" with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under contract DE-AC04-94AL85000.

1. Introduction
Military, government, commercial, and civilian operations all depend upon the security and availability of computer systems and networks. In October 1997, the Presidential Commission on Critical Infrastructure recommended increasing spending to a $1B level during the next seven years. The Commission recommended that this money be heavily focused on cyber-security research, including vulnerability assessment, risk management, intrusion detection, and information assurance technologies (Commission Report, Oct. 1997). In this paper, we describe a systematic analysis approach that can be used by persons with limited expertise in risk assessment, vulnerability analysis, and computer security to (1) examine how an adversary might be able to exploit identified weaknesses in order to perform undesirable activities, and (2) assess the universe of undesirable activities that an adversary could accomplish given that they were able to enter the network using an identified weakness.

Ideally, a network-vulnerability risk-analysis system should be able to model the dynamic aspects of the network (e.g., virtual topology changing), multiple levels of attacker ability, dynamic behavior of a single attacker (e.g., learning), multiple simultaneous events or multiple attacks, user access controls, and time-dependent, ordered sequences of attacks. Intrusion-detection systems have attempted to monitor abnormal patterns of system usage (such as suspicious configuration information changes) to detect security violations (Denning, 1985; Lunt, 1993). Our system would be complementary to an intrusion detection system. If an administrator does not want to pay the full cost (development cost or system-performance hit) of all possible intrusion-detection strategies, our system could suggest cost-effective subsets which focus on the most vulnerable system components.

Probabilistic Risk Assessment (PRA) techniques such as fault-tree and event-tree analysis provide systematic methods for examining how individual faults can either propagate into or be exploited to cause unwanted effects on systems. For example, in a fault tree a negative consequence, such as the compromise of a file server, is the root of the tree. Each possible event that can lead directly to this compromise (e.g., an attacker gaining root privileges on the machine) becomes a child of the root. Similarly, each child is broken into a complete list of all events which can directly lead to it, and so on. Wyss, Schriner, and Gaylor (Wyss et al.) have used PRA techniques to investigate network performance. Their fault tree modeled a loss of network connectivity, specifically the "all terminal connectivity" problem. Physical security and vital-area analyses have also successfully used PRA techniques (Stack and Hill, 1984). Since PRA methods can measure the importance of particular components to overall risk, it seems that they could provide insights for the design of networks more inherently resistant to known attack methods. These methods, however, have limited effectiveness in the analysis of computer networks because they cannot model multiple attacker attempts, time dependencies, or access controls. In addition, fault trees do not model cycles (such as an attacker starting at one machine, hopping to two others, returning to the original host, and starting in another direction at a higher privilege level). Methods such as influence diagrams and event trees suffer from the same limitations as fault trees.

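To make the fault-tree style of analysis concrete, the following minimal sketch (in Python, with hypothetical events and probabilities not taken from the paper) builds the file-server example as an OR/AND tree and computes the top-event probability under an independence assumption; the tree structure itself is what prevents expressing the cyclic attack described above.

    # A minimal fault-tree sketch: the root is the unwanted consequence, children are
    # events that can directly cause it, combined by OR/AND gates. Events and
    # probabilities below are hypothetical.
    from dataclasses import dataclass
    from typing import List, Union

    @dataclass
    class BasicEvent:
        name: str
        p: float  # assumed probability of the basic event

    @dataclass
    class Gate:
        kind: str                                   # "OR" or "AND"
        children: List[Union["Gate", BasicEvent]]

    def probability(node) -> float:
        """Top-event probability under an independence assumption."""
        if isinstance(node, BasicEvent):
            return node.p
        ps = [probability(c) for c in node.children]
        out = 1.0
        if node.kind == "AND":
            for p in ps:
                out *= p
            return out
        for p in ps:                                # OR gate: 1 - prod(1 - p_i)
            out *= (1.0 - p)
        return 1.0 - out

    # Root: "file server compromised"; children are events that lead directly to it.
    tree = Gate("OR", [
        BasicEvent("attacker gains root on file server", 0.05),
        Gate("AND", [
            BasicEvent("backup host compromised", 0.10),
            BasicEvent("trust relationship exploited", 0.30),
        ]),
    ])

    print(round(probability(tree), 4))              # ~0.0785
    # Note the structure is a tree: it cannot express the cycle described above
    # (hop to other machines and return to the original host at higher privilege).
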
The major advance of our method over other computer-security-risk methods is that it considers the physical network topology in conjunction with the set of attacks. Thus, it goes beyond currently available scanning tools such as SATAN (Security Administrator Tool for Analyzing Networks), which check a "laundry list" of services or conditions that are enabled on a particular machine. For example, SATAN checks for the following vulnerabilities on UNIX-based systems:

1. Are NFS file systems exported to unprivileged programs?
2. Are NFS file systems exported to arbitrary hosts?
3. Is X server access control disabled?
4. Is there a writable anonymous FTP home directory?
5. Is there an insecure version of sendmail in use?
...

but gives no indication of how these items lead to system compromise. All the vulnerabilities SATAN finds are well known and have either bulletins and/or patches from an incident response team or a vendor. SATAN is a useful network-analysis tool and can provide a system administrator with a set of items to patch or fix. However, it cannot identify paths of attacks, alternative network configurations that would be more robust, or linked attacks in which a combined sequence of attacks would do more harm than an individual attack, and it does not help the system administrator set security priorities.

Our approach to modeling network risks is based on an attack graph. Each node in the graph represents a possible attack state. A node will usually be some combination of physical machine(s), user access level, and effects of the attack so far, such as placement of trojan horses or modification of access control. Edges represent a change of state caused by a single action taken by the attacker (including normal user transitions if they have gained access to a normal user's account) or actions taken by an unwitting assistant (such as the execution of a trojan horse). Attack graphs will be presented in more detail in Sections 2 and 3.

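As a concrete illustration (a sketch only, not the authors' implementation; all field names are invented for the example), the node and edge structure described above might be represented as follows:

    # Attack-graph elements: a node bundles machine class, access level, and attack
    # effects so far; an edge is one attacker (or unwitting-assistant) action with a weight.
    from dataclasses import dataclass, field
    from typing import Dict, FrozenSet, List

    @dataclass(frozen=True)
    class AttackState:
        machine_class: str                      # e.g. "internal UNIX workstation"
        access_level: str                       # e.g. "none", "user", "root"
        effects: FrozenSet[str] = frozenset()   # e.g. {"trojan planted on host A"}

    @dataclass
    class AttackEdge:
        action: str                             # the atomic attack step taken
        src: AttackState
        dst: AttackState
        weights: Dict[str, float] = field(default_factory=dict)  # e.g. {"effort": 3.0}

    class AttackGraph:
        def __init__(self) -> None:
            self.edges: Dict[AttackState, List[AttackEdge]] = {}

        def add_edge(self, edge: AttackEdge) -> None:
            self.edges.setdefault(edge.src, []).append(edge)

    # Example usage with hypothetical states:
    g = AttackGraph()
    outside = AttackState("external host", "none")
    user = AttackState("internal UNIX workstation", "user")
    g.add_edge(AttackEdge("exploit insecure sendmail", outside, user, {"effort": 2.0}))
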
The attack graph is automatically generated given three types of input: attack templates, a configuration file, and an attacker profile. Attack templates represent generic (known or hypothesized) attacks, including conditions, such as operating system version, which must hold for the attack to be possible. The configuration file gives detailed information about the specific system to be analyzed, including the topology of the network and configuration of particular network elements such as workstations, printers, or routers. The attacker profile contains information about the assumed attacker's capabilities, such as the possession of an automated toolkit or a sniffer, as well as skill level. The attack graph is a customization of the generic attack templates to the attacker profile and the network specified in the configuration file. Though attack templates represent pieces of known attacks or hypothesized methods of moving from one state to another, their combinations can lead to descriptions of new attacks. That is, any path in the attack graph represents an attack, though it could be cobbled together from many known attacks.

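The following sketch suggests how that matching step might look; the schema (service names, version numbers, skill levels) is invented for illustration and is not the paper's actual template format.

    # Match generic attack templates against the configuration file and attacker
    # profile; templates whose preconditions hold are instantiated as graph edges.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    Config = Dict[str, object]      # one machine's configuration entry (illustrative)
    Profile = Dict[str, object]     # assumed attacker capabilities (illustrative)

    @dataclass
    class AttackTemplate:
        name: str
        precondition: Callable[[Config, Profile], bool]
        required_access: str        # access level needed before the atomic step
        gained_access: str          # access level after a successful step
        base_effort: float

    templates = [
        AttackTemplate(
            name="insecure sendmail exploit",
            precondition=lambda cfg, att: "sendmail" in cfg.get("services", [])
                                          and cfg.get("sendmail_version", 9.9) < 8.7
                                          and att.get("skill", 0) >= 2,
            required_access="none",
            gained_access="user",
            base_effort=2.0,
        ),
    ]

    def instantiate(templates: List[AttackTemplate],
                    machines: Dict[str, Config],
                    attacker: Profile) -> List[tuple]:
        """Return (machine, template name, effort) for every template that applies."""
        edges = []
        for host, cfg in machines.items():
            for t in templates:
                if t.precondition(cfg, attacker):
                    edges.append((host, t.name, t.base_effort))
        return edges

    machines = {"hostA": {"services": ["sendmail"], "sendmail_version": 8.6}}
    attacker = {"skill": 3, "toolkit": True}
    print(instantiate(templates, machines, attacker))
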
Each edge has a weight representing a success probability or a cost to an attacker (edges with zero probability are generally omitted). This weight is a function of configuration and attacker profile. Furthermore, each node can have local "overwrites" of these files representing effects of previous attacker actions on configuration (e.g., severed network connections, or changes to file-access privileges) or acquired attacker knowledge (learning). In Section 2 we discuss possible ways to estimate edge weights.

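A minimal sketch (hypothetical field names and weight function, not the paper's format) of how a per-node overwrite could be applied before computing an edge weight from the configuration and attacker profile:

    # Apply node-local "overwrites" to the global configuration, then compute an
    # edge weight from the effective configuration and the attacker profile.
    from typing import Dict

    def effective_config(base_config: Dict[str, dict],
                         overwrites: Dict[str, dict]) -> Dict[str, dict]:
        """Merge node-local overwrites on top of the global configuration file."""
        merged = {h: dict(cfg) for h, cfg in base_config.items()}
        for host, delta in overwrites.items():
            merged.setdefault(host, {}).update(delta)
        return merged

    def edge_weight(base_effort: float, attacker: Dict[str, object],
                    cfg: Dict[str, object]) -> float:
        """Illustrative weight function: effort drops with an automated toolkit and
        rises if the target host has extra protections enabled."""
        effort = base_effort
        if attacker.get("toolkit"):
            effort *= 0.5
        if cfg.get("tcp_wrappers"):
            effort += 1.0
        return effort

    base = {"hostA": {"tcp_wrappers": False}}
    node_overwrites = {"hostA": {"tcp_wrappers": True}}   # effect of an earlier action
    cfg = effective_config(base, node_overwrites)["hostA"]
    print(edge_weight(2.0, {"toolkit": True}, cfg))       # 2.0
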
A short path in the attack graph represents a low-cost attack. Since edge weights will only be estimates, we consider the set of all near-optimal paths. If the edge weights are reasonably accurate, this set as a group represents the most vulnerable parts of the network. If one can assume independence of success probabilities, the same (shortest-path) algorithms can find paths with high success probability. By having multiple weights on each edge, one can represent potentially-conflicting criteria (e.g., the attacker wishes to minimize both cost and probability of detection).

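For instance, under the independence assumption, maximizing the product of edge success probabilities along a path is equivalent to minimizing the sum of -log(p), so an ordinary shortest-path algorithm applies. The sketch below (hypothetical graph and probabilities, not from the paper) does this with Dijkstra's algorithm:

    # Most-likely attack path via Dijkstra on -log(probability) edge costs.
    import heapq
    import math
    from typing import Dict, List, Tuple

    Graph = Dict[str, List[Tuple[str, float]]]   # node -> [(neighbor, success probability)]

    def most_likely_path(graph: Graph, start: str, goal: str) -> Tuple[float, List[str]]:
        dist = {start: 0.0}
        prev: Dict[str, str] = {}
        heap = [(0.0, start)]
        seen = set()
        while heap:
            d, u = heapq.heappop(heap)
            if u in seen:
                continue
            seen.add(u)
            if u == goal:
                break
            for v, p in graph.get(u, []):
                if p <= 0.0:
                    continue                     # zero-probability edges are omitted
                nd = d + (-math.log(p))          # convert probability to additive cost
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    prev[v] = u
                    heapq.heappush(heap, (nd, v))
        path = [goal]
        while path[-1] != start:
            path.append(prev[path[-1]])
        path.reverse()
        return math.exp(-dist[goal]), path       # (success probability, attack path)

    graph = {
        "outside:none":     [("workstation:user", 0.4), ("ftp-server:user", 0.2)],
        "workstation:user": [("file-server:root", 0.1)],
        "ftp-server:user":  [("file-server:root", 0.5)],
    }
    print(most_likely_path(graph, "outside:none", "file-server:root"))
    # approximately (0.1, ['outside:none', 'ftp-server:user', 'file-server:root'])
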
This system can answer "what-if" questions regarding security effects of configuration changes such as topology changes or installation of intrusion-detection systems. It can indicate which attacks are possible only from highly-skilled, well-funded attackers, and which can be achieved with lower levels of effort. A business owner might decide it is acceptable to allow a relatively high probability of network penetration by a "national-scale" effort, but will tolerate only a small probability of attack from an "average" attacker. Government sites, which are attacked with much higher frequency [1], may need exceptionally low probability of success for a particular attacker level in order to expect few penetrations, and they may be more willing to pay the cost for that level of security. Finally, this system can simulate dynamic attacks and use the results to test intrusion-detection systems. These analysis methods, as well as possible ways to calculate cost-effective defense strategies, are explained in more detail in Section 4.

The remainder of the paper is organized as follows. Section 2 gives a more detailed description of attack templates, the configuration file, and attacker profile. Section 3 discusses attack-graph generation. Section 4 presents analysis methods. Section 5 provides some concluding remarks. Appendix A lists some implementation details associated with generating the attack graph. Appendix B gives a detailed example applied to a test network we have built.

[1] The Defense Information Systems Agency reports that the Department of Defense is attacked 250,000 times a year. Los Alamos National Laboratory is attacked daily, with 22 proven outsider intrusions in the last five months. From "Security Measures," Albuquerque Journal, March 24, 1998, pp. B1-B2.

References

Cherkassky, B. V., A. V. Goldberg, and T. Radzik. "Shortest Paths Algorithms: Theory and Experimental Evaluation." Proceedings of the ACM-SIAM Symposium on Discrete Algorithms (SODA), 1994.

Denning, D. E. "An Intrusion-Detection Model." IEEE Transactions on Software Engineering, SE-13(2), 1987.

Garey, M. R., and D. S. Johnson. Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman, San Francisco, 1979.