
Showing papers by "Barry O'Sullivan" published in 2006


Book ChapterDOI
01 Jan 2006
TL;DR: This paper explores how the traditional assurance measures that are used in the network multilevel security model can be re-interpreted and generalised to provide the basis of a framework for reasoning about the quality of protection provided by a secure system configuration.
Abstract: Constraining how information may flow within a system is at the heart of many protection mechanisms and many security policies have direct interpretations in terms of information flow and multilevel security style controls. However, while conceptually simple, multilevel security controls have been difficult to achieve in practice. In this paper we explore how the traditional assurance measures that are used in the network multilevel security model can be re-interpreted and generalised to provide the basis of a framework for reasoning about the quality of protection provided by a secure system configuration.
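As background for readers unfamiliar with multilevel security controls, the classical "no read up, no write down" rules over ordered levels can be sketched in a few lines. This is the textbook Bell-LaPadula reading of such controls, not the paper's quality-of-protection framework, and the level labels are hypothetical:

```python
# Classical MLS flow rules over linearly ordered levels (hypothetical labels).
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    """No read up: a subject may only read objects at or below its level."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    """No write down: a subject may only write objects at or above its level."""
    return LEVELS[subject_level] <= LEVELS[object_level]

assert can_read("secret", "confidential")
assert not can_read("confidential", "secret")
assert can_write("confidential", "secret")
assert not can_write("secret", "confidential")
```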

28 citations


Proceedings Article
16 Jul 2006
TL;DR: A new SAT-based version space algorithm for acquiring constraint networks from examples of solutions and non-solutions of a target problem is proposed.
Abstract: Constraint programming is a commonly used technology for solving complex combinatorial problems. However, users of this technology need significant expertise in order to model their problems appropriately. We propose a basis for addressing this problem: a new SAT-based version space algorithm for acquiring constraint networks from examples of solutions and non-solutions of a target problem. An important advantage of the algorithm is the ease with which domain-specific knowledge can be exploited.
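To make the version-space idea concrete, here is a minimal illustrative sketch, not the authors' implementation: each candidate constraint in the bias becomes a Boolean, a positive example excludes every candidate it violates, and a negative example yields a clause requiring at least one violated candidate. The bias and examples below are invented:

```python
from itertools import product

# Candidate constraints (the "bias") over variables x, y.
bias = {
    "x<y":  lambda a: a["x"] < a["y"],
    "x!=y": lambda a: a["x"] != a["y"],
    "x>y":  lambda a: a["x"] > a["y"],
}
clauses = []  # each clause: a set of ("pos"/"not", candidate-name) literals

def add_example(assignment, is_solution):
    violated = {name for name, check in bias.items() if not check(assignment)}
    if is_solution:
        # A solution may not be rejected: exclude every violated candidate.
        clauses.extend({("not", name)} for name in violated)
    else:
        # A non-solution must be rejected by at least one violated candidate.
        clauses.append({("pos", name) for name in violated})

def satisfies(network):
    return all(any((kind == "pos") == (name in network) for kind, name in clause)
               for clause in clauses)

add_example({"x": 1, "y": 2}, is_solution=True)   # rules out "x>y"
add_example({"x": 2, "y": 2}, is_solution=False)  # "x<y" or "x!=y" must be in the network

names = list(bias)
version_space = []
for bits in product([0, 1], repeat=len(names)):
    network = {n for n, b in zip(names, bits) if b}
    if satisfies(network):
        version_space.append(sorted(network))
print(version_space)  # every candidate network consistent with the examples
```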

26 citations


Journal ArticleDOI
TL;DR: It is demonstrated that heavy-tailed behaviour can be eliminated from particular classes of random problems by carefully selecting the search heuristics, even when using chronological backtrack search.
Abstract: The heavy-tailed phenomenon that characterises the runtime distributions of backtrack search procedures has received considerable attention over the past few years. Some have conjectured that heavy-tailed behaviour is largely due to the characteristics of the algorithm used. Others have conjectured that problem structure is a significant contributor. In this paper we attempt to explore the former hypothesis, namely we study how variable and value ordering heuristics impact the heavy-tailedness of runtime distributions of backtrack search procedures. We demonstrate that heavy-tailed behaviour can be eliminated from particular classes of random problems by carefully selecting the search heuristics, even when using chronological backtrack search. We also show that combinations of good search heuristics can eliminate heavy tails from quasigroups with holes of order 10 and 20, and give some insights into why this is the case. These results motivate a more detailed analysis of the effects that variable and value orderings can have on heavy-tailedness. We show how combinations of variable and value ordering heuristics can result in a runtime distribution being inherently heavy-tailed. Specifically, we show that even if we were to use an oracle to refute insoluble subtrees optimally, for some combinations of heuristics we would still observe heavy-tailed behaviour. Finally, we study the distributions of refutation sizes found using different combinations of heuristics and gain some further insights into what characteristics tend to give rise to heavy-tailed behaviour.
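A common diagnostic for heavy-tailedness, sketched below as generic methodology rather than the paper's exact analysis, is to fit the slope of the empirical survival function P(X > x) on log-log axes; an approximately linear tail is the usual signature of power-law decay:

```python
import math

# Fit the log-log slope of the empirical survival function over the largest
# runtimes. A slope of -a with 0 < a < 2 suggests heavy-tailed behaviour
# (infinite variance). Assumes enough samples that the tail window is non-empty.
def tail_slope(runtimes, tail_fraction=0.2):
    xs = sorted(runtimes)
    n = len(xs)
    points = []
    for i in range(int(n * (1 - tail_fraction)), n - 1):
        survival = 1.0 - (i + 1) / n  # empirical P(X > xs[i])
        if xs[i] > 0 and survival > 0:
            points.append((math.log(xs[i]), math.log(survival)))
    # Least-squares slope over the tail points.
    mx = sum(x for x, _ in points) / len(points)
    my = sum(y for _, y in points) / len(points)
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den  # roughly -a for a Pareto-like tail
```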

22 citations


01 Jan 2006
TL;DR: It is demonstrated how nogoods can be propagated using a known algorithm for achieving generalised arc consistency and the experimental results demonstrate the utility of this approach.
Abstract: Nogood recording is a well-known technique for reducing the thrashing encountered by tree search algorithms. One of the most significant disadvantages of nogood recording has been its prohibitive space complexity. In this paper we attempt to mitigate this by using an automaton to compactly represent a set of nogoods. We demonstrate how nogoods can be propagated using a known algorithm for achieving generalised arc consistency. Our experimental results on a number of benchmark problems demonstrate the utility of our approach.
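One way to picture the compact representation is a trie acting as an automaton over nogoods listed under a fixed variable order. This is a simplification: real nogoods mention arbitrary subsets of variables, and the paper propagates the automaton with a generalised arc consistency algorithm rather than by lookup:

```python
# A trie as a compact automaton over a set of nogoods (value tuples under a
# fixed variable order). Illustrative only.
class NogoodTrie:
    def __init__(self):
        self.root = {}

    def add(self, nogood):
        node = self.root
        for value in nogood:
            node = node.setdefault(value, {})
        node["$end"] = True  # marks a complete (minimal) nogood

    def blocks(self, assignment):
        """True if some stored nogood is a prefix of this assignment."""
        node = self.root
        for value in assignment:
            if "$end" in node:
                return True
            if value not in node:
                return False
            node = node[value]
        return "$end" in node

trie = NogoodTrie()
trie.add((1, 2))  # nogood: x1=1, x2=2
trie.add((3,))    # nogood: x1=3
assert trie.blocks((1, 2, 0))
assert trie.blocks((3, 1))
assert not trie.blocks((1, 3))
```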

13 citations


Proceedings Article
22 May 2006
TL;DR: This paper shows that constraints can also be used to guide the search process by actively proposing the next choice point to be branched on, and shows that search effort can be reduced significantly.
Abstract: Constraint satisfaction problems are traditionally solved using some form of backtrack search that propagates constraints after each decision is made. The efficiency of search relies heavily on the use of good variable and value ordering heuristics. In this paper we show that constraints can also be used to guide the search process by actively proposing the next choice point to be branched on. We show that search effort can be reduced significantly.
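The core loop can be pictured as below; the propose interface, the scoring, and the fallback heuristic are invented for illustration and are not the authors' design:

```python
# Constraint-guided branching: each constraint may volunteer a decision, and
# search branches on the proposal it deems most urgent. Illustrative sketch.
def choose_decision(constraints, domains):
    proposals = []
    for c in constraints:
        proposal = c.propose(domains)  # hypothetical: (score, variable, value) or None
        if proposal is not None:
            proposals.append(proposal)
    if not proposals:
        # Fall back to a standard heuristic such as smallest domain first.
        var = min((v for v in domains if len(domains[v]) > 1),
                  key=lambda v: len(domains[v]))
        return var, min(domains[var])
    score, var, val = max(proposals)  # highest-priority proposal wins
    return var, val
```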

7 citations


Book ChapterDOI
23 Oct 2006
TL;DR: The cascade problem in self-configuring networks is described: when individual network components that are securely configured are connected together (in an apparently secure manner), a configuration cascade can occur resulting in a mis-configured network.
Abstract: The challenge for autonomic network management is the provision of future network management systems that have the characteristics of self-management, self-configuration, self-protection and self-healing, in accordance with the high level objectives of the enterprise or human end-user. This paper proposes an abstract model for network configuration that is intended to help understand fundamental underlying issues in self-configuration. We describe the cascade problem in self-configuring networks: when individual network components that are securely configured are connected together (in an apparently secure manner), a configuration cascade can occur resulting in a mis-configured network. This has implications for the design of self-configuring systems and we discuss how a soft constraint-based framework can provide a solution.
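A toy version of the cascade check illustrates the point: two components, each accredited for an adjacent pair of levels, are linked, and the composed flow spans a wider range than either component's accreditation. The pairwise model below is a simplification of a hypothetical form (real cascade analysis considers whole paths through the network):

```python
LEVELS = ["C", "S", "TS"]  # confidential < secret < top secret
rank = {level: i for i, level in enumerate(LEVELS)}

# Components with (lowest, highest) accredited levels, and links between them.
components = {"A": ("C", "S"), "B": ("S", "TS")}
links = [("A", "B")]

def cascades(components, links):
    for a, b in links:
        lo = min(rank[components[a][0]], rank[components[b][0]])
        hi = max(rank[components[a][1]], rank[components[b][1]])
        widest = max(rank[c[1]] - rank[c[0]] for c in (components[a], components[b]))
        if hi - lo > widest:
            yield (a, b)  # the link creates a flow neither component was accredited for

print(list(cascades(components, links)))  # [('A', 'B')]
```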

5 citations


Journal ArticleDOI
TL;DR: This special issue, arising from CP 2005, is dedicated to Eugene C. Freuder, whose career has been an inspiration to many constraints researchers.
Abstract: The Eleventh International Conference on the Principles and Practice of Constraint Programming (CP 2005) was held in Sitges (Barcelona), Spain, October 1–5, 2005. We are pleased to dedicate this special issue, arising from CP 2005, to Eugene C. Freuder. At the conference a session was devoted to honouring Gene on the occasion of his 60th birthday. Several guests gave short statements of both a professional and personal nature, while others sent messages that were read out on their behalf. In addition to ourselves, Professor David Waltz (Columbia University, USA) was present to deliver a very personal perspective on Gene's impact on him and on the field of constraints. Other statements were read from Professor Boi Faltings (Swiss Federal Institute of Technology, Switzerland), Professor Alan Mackworth (University of British Columbia, Canada) and Professor Patrick Winston (Massachusetts Institute of Technology, USA). In this issue we have included editorials from both David Waltz and Alan Mackworth. A highlight of the session was the presentation to Gene of the first Research Excellence Award of the Association for Constraint Programming for "a program of pioneering and sustained research in constraint programming of consistent high quality yielding many substantial and significant results." Professor Francesca Rossi (University of Padova, Italy), the current President of the Association for Constraint Programming, presented the award. There is no more deserving recipient of this award than Gene, whose career has been an inspiration to many constraints researchers. Gene received his B.A. in Mathematics from Harvard in 1967 and his Ph.D. in Computer Science from the Massachusetts Institute of Technology in 1975 under the supervision of Professor Patrick Winston, and spent most of his academic career…

4 citations


Book ChapterDOI
25 Sep 2006
TL;DR: This work considers the problem of relaxing an instance of the Quantified CSP when it is unsatisfiable and proposes several novel forms of problem relaxations and presents an algorithm for generating conflict-based explanations of inconsistency.
Abstract: The Quantified CSP (QCSP) is a generalisation of the classical CSP in which some of the variables are universally quantified [3]. We consider the problem of relaxing an instance of the QCSP when it is unsatisfiable. We propose several novel forms of problem relaxations for the QCSP and present an algorithm for generating conflict-based explanations of inconsistency. Our motivation comes from problems in supply-chain management and conformant planning.
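To make the definitions concrete, here is a tiny brute-force illustration (an invented instance, not the authors' algorithm) of an unsatisfiable QCSP and of one relaxation form, shrinking a universal domain:

```python
# A tiny QCSP: forall y in D_y, exists x in D_x such that x > y.
# Brute force is exponential in general; this is purely illustrative.
def satisfiable(D_x, D_y):
    return all(any(x > y for x in D_x) for y in D_y)

D_x, D_y = {1, 2, 3}, {1, 2, 3}
print(satisfiable(D_x, D_y))        # False: no x in D_x exceeds y = 3
print(satisfiable(D_x, D_y - {3}))  # True: relaxing the universal domain restores satisfiability
```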

4 citations


Book ChapterDOI
25 Sep 2006
TL;DR: An in-depth empirical study of runtime distributions associated with a continuum of problem formulations for QWH-10 with 90% holes, ranging from a model specified entirely in terms of binary inequality constraints to models specified using an increasing number of AllDifferent global constraints.
Abstract: We perform an in-depth empirical study of runtime distributions associated with a continuum of problem formulations for QWH-10 with 90% holes, ranging from a model specified entirely in terms of binary inequality constraints to models specified using an increasing number of AllDifferent global constraints [4]. For each model we study a variety of variable and value ordering heuristics. We compare their runtime distributions against runtime distributions where any mistakes made in search are refuted optimally [2], and make the following observations:
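The two endpoints of the modelling continuum are easy to exhibit for a single row of the square (a sketch with invented variable names); the continuum is interesting because the global constraint admits stronger, matching-based GAC propagation than its binary decomposition:

```python
from itertools import combinations

# One row of a small Latin square, modelled two ways.
row = ["x0", "x1", "x2", "x3"]

# Endpoint 1: a clique of binary inequality constraints (6 constraints).
binary_model = [(a, "!=", b) for a, b in combinations(row, 2)]

# Endpoint 2: a single AllDifferent global constraint.
global_model = [("alldifferent", tuple(row))]
```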

3 citations


Proceedings Article
16 Jul 2006
TL;DR: It is shown that approximate compilation is an effective means of generating the highest-valued environments, while obtaining a representation whose size can be tailored to any embedded application, and a graceful way to trade off space requirements against the completeness of the environment space.
Abstract: The use of embedded technology has become widespread. Many complex engineered systems comprise embedded features to perform self-diagnosis or self-reconfiguration. These features require fast response times in order to be useful in domains where embedded systems are typically deployed. Researchers often advocate the use of compilation-based approaches to store the set of environments (resp. solutions) to a diagnosis (resp. reconfiguration) problem in some compact representation. However, the size of a compiled representation may be exponential in the treewidth of the problem. In this paper we propose a novel method for compiling the most preferred environments in order to reduce the large space requirements of our compiled representation. We show that approximate compilation is an effective means of generating the highest-valued environments, while obtaining a representation whose size can be tailored to any embedded application. The method also provides a graceful way to trade off space requirements against the completeness of our coverage of the environment space.
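One simple instance of trading size for coverage, shown below purely as illustration (not the authors' compilation method), is to retain only the k highest-valued environments; the valuation function here is invented:

```python
import heapq

def compile_top_k(environments, value, k):
    """Retain the k highest-valued environments from an enumerable space."""
    return heapq.nlargest(k, environments, key=value)

envs = [frozenset(s) for s in ([], ["a"], ["b"], ["a", "b"])]
value = lambda e: 3 * ("a" in e) + 1 * ("b" in e)  # hypothetical valuation
print(compile_top_k(envs, value, k=2))  # the two most preferred environments
```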

Book ChapterDOI
26 Jun 2006
TL;DR: A class of variable ordering heuristics that exploit the clustered structure of the constraint network to inform search and can be used in conjunction with nogood learning to develop efficient solvers that can exploit propagation based on either forward checking or maintaining arc-consistency algorithms.
Abstract: In this paper we present a novel approach to solving Constraint Satisfaction Problems whose constraint graphs are highly clustered and whose graph of clusters is close to being acyclic. Such graphs are encountered in many real-world application domains such as configuration, diagnosis, model-based reasoning and scheduling. We present a class of variable ordering heuristics that exploit the clustered structure of the constraint network to inform search. We show how these heuristics can be used in conjunction with nogood learning to develop efficient solvers that can exploit propagation based on either forward checking or maintaining arc-consistency algorithms. Experimental results show that maintaining arc-consistency alone is not competitive with our approach, even if nogood learning and a well-known variable ordering are incorporated. Only by using our cluster-based heuristics can large problems be solved efficiently. The poor performance of maintaining arc-consistency is somewhat surprising, but quite easy to explain.
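A minimal sketch of the cluster-by-cluster ordering idea, assuming the cluster graph is given; the traversal and data layout are invented for illustration and are not the authors' heuristics:

```python
# Instantiate all variables of a cluster before moving to an adjacent
# cluster, walking the (near-acyclic) graph of clusters depth-first.
def cluster_order(clusters, adjacency, root):
    """clusters: name -> list of variables; adjacency: name -> neighbour names."""
    order, seen = [], set()

    def visit(c):
        seen.add(c)
        order.extend(clusters[c])  # finish this cluster first
        for nxt in adjacency.get(c, []):
            if nxt not in seen:
                visit(nxt)

    visit(root)
    return order

clusters = {"c1": ["x1", "x2"], "c2": ["x3"], "c3": ["x4", "x5"]}
adjacency = {"c1": ["c2", "c3"], "c2": [], "c3": []}
print(cluster_order(clusters, adjacency, "c1"))  # ['x1', 'x2', 'x3', 'x4', 'x5']
```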

Book ChapterDOI
25 Sep 2006
TL;DR: Search effort is typically measured in terms of the number of backtracks, constraint checks, or nodes in the search tree, but measures such as the number of incorrect decisions have also been proposed.
Abstract: Search effort is typically measured in terms of the number of backtracks, constraint checks, or nodes in the search tree, but measures such as the number of incorrect decisions have also been proposed. Comparisons based on mean and median effort are common. However, other researchers focus on studying runtime distributions, where one can observe a (non-)heavy-tailed distribution under certain conditions [2, 3].
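The standard counters are easy to attach to a toy backtrack search; the sketch below is illustrative only, and exact measure definitions vary across studies:

```python
# Effort counters (nodes, constraint checks, backtracks) on a toy search.
def backtrack(domains, constraints, stats, assignment=None):
    assignment = assignment if assignment is not None else {}
    stats["nodes"] += 1
    if len(assignment) == len(domains):
        return dict(assignment)
    var = next(v for v in domains if v not in assignment)
    for val in domains[var]:
        assignment[var] = val
        stats["checks"] += len(constraints)
        if all(c(assignment) for c in constraints):
            result = backtrack(domains, constraints, stats, assignment)
            if result is not None:
                return result
        del assignment[var]
        stats["backtracks"] += 1  # a value was retracted
    return None

stats = {"nodes": 0, "checks": 0, "backtracks": 0}
domains = {"x": [1, 2], "y": [1, 2]}
constraints = [lambda a: not ("x" in a and "y" in a) or a["x"] != a["y"]]
print(backtrack(domains, constraints, stats), stats)
```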