
Showing papers by "Axel Legay" published in 2014


Journal ArticleDOI
TL;DR: The classic linear-time–branching-time spectrum is extended to a quantitative setting, parametrized by trace distance; a general transfer principle is proved, and the resulting system distances are shown to be mutually topologically inequivalent.

59 citations


Journal ArticleDOI
TL;DR: This work addresses the state explosion problem of SPL model checking by applying the principles of symbolic model checking to FTS-based verification of SPLs and proves that fSMV and FTSs are expressively equivalent.

51 citations


Book ChapterDOI
03 Nov 2014
TL;DR: This paper provides efficient methods for computing reachability strategies that both ensure worst-case time bounds and provide (near-)minimal expected cost.
Abstract: (Priced) timed games are two-player quantitative games involving an environment assumed to be completely antagonistic. Classical analysis consists in the synthesis of strategies ensuring safety, time-bounded or cost-bounded reachability objectives. Assuming a randomized environment, the (priced) timed game essentially defines an infinite-state Markov (reward) decision process. In this setting the objective is classically to find a strategy that will minimize the expected reachability cost, but with no guarantees on worst-case behaviour. In this paper, we provide efficient methods for computing reachability strategies that will both ensure worst-case time bounds and provide (near-)minimal expected cost. Our method extends the synthesis algorithms of the synthesis tool Uppaal-Tiga with suitably adapted reinforcement learning techniques, which yield improvements of several orders of magnitude over previously known automated methods.
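As a rough illustration of the paper's two-phase idea (all names and callbacks below are hypothetical stand-ins, not Uppaal-Tiga's API): first compute the set of actions permitted by a worst-case-safe strategy, then run Q-learning restricted to that set, so the learned strategy keeps the time bound while driving the expected cost towards the minimum.

```python
import random
from collections import defaultdict

# Sketch only: safe_actions(s) stands for the strategy set synthesized by the
# timed-game step; step/cost/goal abstract the priced timed game's semantics.
def q_learn_safe(states, safe_actions, step, cost, goal,
                 episodes=10_000, alpha=0.1, gamma=1.0, eps=0.1):
    Q = defaultdict(float)
    for _ in range(episodes):
        s = random.choice(states)
        while s not in goal:
            acts = safe_actions(s)          # never leave the worst-case-safe set
            a = (random.choice(acts) if random.random() < eps
                 else min(acts, key=lambda x: Q[(s, x)]))
            s2 = step(s, a)                 # sampled successor (randomized environment)
            best_next = 0.0 if s2 in goal else min(Q[(s2, b)] for b in safe_actions(s2))
            Q[(s, a)] += alpha * (cost(s, a) + gamma * best_next - Q[(s, a)])
            s = s2
    return Q        # the greedy policy over Q approximates the minimal expected cost
```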

50 citations


Proceedings ArticleDOI
22 Jan 2014
TL;DR: This work explores how ideas of statistical testing, based on a usage model (a Markov chain), can be used to extract configurations of interest according to the likelihood of their executions, in Software Product Lines (SPLs).
Abstract: Software Product Lines (SPLs) are inherently difficult to test due to the combinatorial explosion of the number of products to consider. To reduce the number of products to test, sampling techniques such as combinatorial interaction testing have been proposed. They usually start from a feature model and apply a coverage criterion (e.g. pairwise feature interaction or dissimilarity) to generate tractable, fault-finding lists of configurations to be tested. Prioritization can also be used to sort/generate such lists, optimizing coverage criteria or weights assigned to features. However, current sampling/prioritization techniques barely take product behaviour into account. We explore how ideas of statistical testing, based on a usage model (a Markov chain), can be used to extract configurations of interest according to the likelihood of their executions. These executions are gathered in featured transition systems, a compact representation of SPL behaviour. We discuss possible scenarios and give a prioritization procedure validated on web-based learning management software.
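A minimal sketch of the statistical-testing idea (the usage model and configurations below are invented toy data): sample executions from the usage-model Markov chain, then rank configurations by how many of the likely executions they can actually run.

```python
import random

usage_model = {                      # state -> [(next_state, action, probability)]
    "s0": [("s1", "login", 0.9), ("s0", "browse", 0.1)],
    "s1": [("s1", "browse", 0.7), ("s0", "logout", 0.3)],
}

def sample_trace(start="s0", length=20):
    """Random walk through the usage model, returning the action sequence."""
    s, trace = start, []
    for _ in range(length):
        succs = usage_model[s]
        r, acc = random.random(), 0.0
        nxt, act = succs[-1][0], succs[-1][1]    # fallback guards against rounding
        for n, a, p in succs:
            acc += p
            if r < acc:
                nxt, act = n, a
                break
        s = nxt
        trace.append(act)
    return trace

def score(config_actions, traces):
    """Prioritize a configuration by how many sampled executions it supports."""
    return sum(all(a in config_actions for a in t) for t in traces)

traces = [sample_trace() for _ in range(1000)]
print(score({"login", "browse", "logout"}, traces))
```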

44 citations


Book ChapterDOI
01 Sep 2014
TL;DR: In this paper, the authors present the basis of scalable verification for Markov decision processes (MDPs), using an \(\mathcal {O}(1)\) memory representation of history-dependent schedulers.
Abstract: Markov decision processes (MDPs) are useful to model concurrent process optimisation problems, but verifying them with numerical methods is often intractable. Existing approximative approaches do not scale well and are limited to memoryless schedulers. Here we present the basis of scalable verification for MDPs, using an \(\mathcal {O}(1)\) memory representation of history-dependent schedulers. We thus facilitate scalable learning techniques and the use of massively parallel verification.
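The core trick, as we read the abstract, is that a history-dependent scheduler need not be stored explicitly: it can be identified with an integer seed, and the action at any history is obtained by hashing the seed together with the history. A sketch (the hash choice and encoding are our assumptions):

```python
import hashlib

def scheduled_action(seed, history, enabled_actions):
    """Resolve nondeterminism deterministically from hash(seed, history): the
    same seed always induces the same scheduler, yet only the constant-size
    seed is ever stored -- hence the O(1) memory representation."""
    digest = hashlib.sha256(repr((seed, tuple(history))).encode()).digest()
    return enabled_actions[int.from_bytes(digest[:8], "big") % len(enabled_actions)]
```

Sampling schedulers then reduces to sampling seeds, which is what makes massively parallel simulation straightforward.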

34 citations


Book ChapterDOI
22 Sep 2014
TL;DR: A formal definition of the projection on traces given a property to verify is proposed, conditions ensuring the correct preservation of the property on the abstract model are provided, and the approach is validated on Herman's self-stabilising protocol.
Abstract: This paper investigates the combined use of abstraction and probabilistic learning as a means to enhance statistical model checking performance. We are given a property (or a list of properties) for verification on a (large) stochastic system. We project on a set of traces generated from the original system, and learn a (small) abstract model from the projected traces, which contain only those labels that are relevant to the property to be verified. Then, we model-check the property on the reduced, abstract model instead of the large, original system. In this paper, we propose a formal definition of the projection on traces given a property to verify. We also provide conditions ensuring the correct preservation of the property on the abstract model. We validate our approach on Herman's self-stabilising protocol. Our experimental results show that (a) the size of the abstract model and the verification time are drastically reduced, and that (b) the probability of satisfaction of the property being verified is correctly estimated by statistical model checking on the abstract model with respect to the concrete system.
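The projection step itself is simple; a toy sketch (labels and traces invented) that keeps only the symbols the property can observe before handing the projected traces to a model learner:

```python
def project(trace, relevant_labels):
    """Drop every symbol of the trace that the property cannot observe."""
    return [sym for sym in trace if sym in relevant_labels]

traces = [["a", "x", "b", "x", "a"], ["x", "b", "b"]]
relevant = {"a", "b"}                 # labels occurring in the property
projected = [project(t, relevant) for t in traces]
print(projected)                      # [['a', 'b', 'a'], ['b', 'b']]
```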

32 citations


Book ChapterDOI
08 Oct 2014
TL;DR: This work lays the foundations of a model-based testing approach to SPLs, defines several FTS-aware coverage criteria, and reports on experience combining FTSs with usage-based testing for configurable websites.
Abstract: A Featured Transition System (FTS) is a mathematical structure that represents the behaviour of a software product line in a concise way. The combination of the well-known transition systems approach to formal behavioural modelling with feature expressions was pivotal to the design of efficient verification approaches. Such approaches avoid considering each product's behaviour independently, often leading to exponential savings. Building on this successful structure, we lay the foundations of a model-based testing approach to SPLs. We define several FTS-aware coverage criteria and report on our experience combining FTSs with usage-based testing for configurable websites.

30 citations


Book ChapterDOI
08 Oct 2014
TL;DR: This work describes how importance splitting may be used with SMC to overcome the challenge posed by rare properties: a logical property is decomposed into nested properties whose probabilities are easier to estimate.
Abstract: Statistical model checking avoids the intractable growth of states associated with numerical model checking by estimating the probability of a property from simulations. Rare properties pose a challenge because the relative error of the estimate is unbounded. In [13] we describe how importance splitting may be used with SMC to overcome this problem. The basic idea is to decompose a logical property into nested properties whose probabilities are easier to estimate. To improve performance it is desirable to decompose the property into many equi-probable levels, but logical decomposition alone may be too coarse.
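A toy fixed-level importance-splitting estimator, under assumed callbacks (`simulate` can restart from a given prefix; `score` returns the highest level a path reaches): the rare-event probability is estimated as the product of the much larger conditional level-passing probabilities.

```python
import random

def importance_split(simulate, score, levels, n=1000):
    """Estimate P(final level) as the product of conditional probabilities."""
    paths, estimate = [simulate(None) for _ in range(n)], 1.0
    for level in levels:
        passed = [p for p in paths if score(p) >= level]
        if not passed:
            return 0.0                       # no path reached this level
        estimate *= len(passed) / len(paths)
        # Rebuild the budget by restarting from randomly chosen passing paths.
        paths = [simulate(random.choice(passed)) for _ in range(n)]
    return estimate
```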

28 citations


Book ChapterDOI
05 Apr 2014
TL;DR: This paper presents the first known (to the best of the authors' knowledge) automatic model merging and differencing operators supported by a formal semantic theory guaranteeing that they are semantically sound.
Abstract: Class diagrams are among the most popular modeling languages in industrial use. In a model-driven development process, class diagrams evolve, so it is important to be able to assess differences between revisions, as well as to propagate differences using suitable merge operations. Existing differencing and merging methods are mainly syntactic, concentrating on edit operations applied to model elements, or they are based on sampling: enumerating some examples of instances which characterize the difference between two diagrams. This paper presents the first known (to the best of our knowledge) automatic model merging and differencing operators supported by a formal semantic theory guaranteeing that they are semantically sound. All instances of the merge of a model and its difference with another model are automatically instances of the second model. The differences we synthesize are represented using class diagram notation (not edits, or instances), which allows creation of a simple yet flexible algebra for diffing and merging. It also allows presenting changes comprehensively, in a notation already known to users.

27 citations


Journal ArticleDOI
01 Sep 2014
TL;DR: It is shown how to use Interval Markov Chains to model abstractions of deterministic systems with confidential data, and how these models can be used to synthesize an implementation maximizing entropy.
Abstract: The channel capacity of a deterministic system with confidential data is an upper bound on the amount of bits of data an attacker can learn from the system. We encode all possible attacks to a system using a probabilistic specification, an Interval Markov Chain. Then the channel capacity computation reduces to finding a model of a specification with highest entropy. Entropy maximization for probabilistic process specifications has not been studied before, even though it is well known in Bayesian inference for discrete distributions. We give a characterization of global entropy of a process as a reward function, a polynomial algorithm to verify the existence of a system maximizing entropy among those respecting a specification, a procedure for the maximization of reward functions over Interval Markov Chains and its application to synthesize an implementation maximizing entropy. We show how to use Interval Markov Chains to model abstractions of deterministic systems with confidential data, and use the above results to compute their channel capacity. These results are a foundation for ongoing work on computing channel capacity for abstractions of programs derived from code.
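One standard way to write the reduction of global entropy to a reward, consistent with the abstract's description (our formulation, not necessarily the paper's exact one): each state contributes its local entropy \(L(s)\), accumulated over the expected number of visits \(\xi(s)\).

```latex
L(s) = -\sum_{t} P(s,t)\,\log_2 P(s,t),
\qquad
H = \sum_{s} \xi(s)\,L(s)
```

Maximizing \(H\) over all transition matrices \(P\) respecting the interval constraints of the IMC then yields the channel capacity bound.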

23 citations


Proceedings ArticleDOI
11 Nov 2014
TL;DR: CEGAR (Counterexample-Guided Abstraction Refinement) is applied, and new forms of abstraction designed specifically for SPLs are shown to hold the potential to achieve large reductions in verification time.
Abstract: The model-checking problem for Software Product Lines (SPLs) is harder than for single systems: variability constitutes a new source of complexity that exacerbates the state-explosion problem. Abstraction techniques have successfully alleviated state explosion in single-system models. However, they need to be adapted to SPLs, to take into account the set of variants that produce a counterexample. In this paper, we apply CEGAR (Counterexample-Guided Abstraction Refinement) and we design new forms of abstraction specifically for SPLs. We carry out experiments to evaluate the efficiency of our new abstractions. The results show that our abstractions, combined with an appropriate refinement strategy, hold the potential to achieve large reductions in verification time, although they sometimes perform worse. We discuss in which cases a given abstraction should be used.
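For orientation, the generic CEGAR loop that the paper adapts to SPLs looks as follows (all callbacks are placeholders; the paper's actual contribution, the SPL-specific abstractions and refinement strategies, is not shown):

```python
def cegar(model, prop, abstract, check, is_spurious, refine):
    """Abstract, check, and refine until the property is decided."""
    alpha = abstract(model)
    while True:
        ok, cex = check(alpha, prop)
        if ok:
            return "property holds for all products"
        if not is_spurious(model, cex):
            return ("violated", cex)   # cex identifies the violating variants
        alpha = refine(alpha, cex)     # eliminate the spurious behaviour
```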

Proceedings ArticleDOI
11 Nov 2014
TL;DR: This work model a defective artifact as a transition system with a specific feature selected and consider it as a member of a mutant family and shows how to evaluate a test suite against the set of all candidate defects by using mutant families.
Abstract: Mutation testing is an effective technique for either improving or generating fault-finding test suites. It creates defective or incorrect program artifacts of the program under test and evaluates the ability of test suites to reveal them. Despite being effective, mutation is costly since it requires assessing the test cases with a large number of defective artifacts. Even worse, some of these artifacts are behaviourally "equivalent" to the original one and hence, they unnecessarily increase the testing effort. We adopt a variability perspective on mutation analysis. We model a defective artifact as a transition system with a specific feature selected and consider it as a member of a mutant family. The mutant family is encoded as a Featured Transition System, a compact formalism initially dedicated to model-checking of software product lines. We show how to evaluate a test suite against the set of all candidate defects by using mutant families. We can evaluate all the considered defects at the same time and isolate some equivalent mutants. We can also assist the test generation process and efficiently consider higher-order mutants.
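A toy sketch of evaluating one test against a whole mutant family in a single pass (the data structures are our invention; the paper encodes the family as a Featured Transition System): each altered transition is guarded by the mutant feature that introduces it, and the run collects every mutant whose output diverges from the expected one.

```python
def run_family(fts, test, initial):
    """fts: (state, input) -> [(next_state, output, mutants)], where `mutants`
    is the set of mutant features enabling that transition (empty = original).
    Returns the mutants revealed (killed) by the test."""
    frontier = {(initial, frozenset())}      # (state, mutants assumed active)
    killed = set()
    for inp, expected in test:
        nxt = set()
        for state, active in frontier:
            for succ, out, mutants in fts.get((state, inp), []):
                variant = active | mutants
                if out != expected:
                    killed |= variant        # these variants misbehave here
                else:
                    nxt.add((succ, variant))
        frontier = nxt
    return killed
```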

Proceedings ArticleDOI
10 Feb 2014
TL;DR: The statistical model checking engine of Uppaal has been extended to the setting of dynamic networks of hybrid systems and quantified MTL, allowing for natural modelling of concepts such as multiple threads found in various programming paradigms, as well as the dynamic evolution of biological systems.
Abstract: In this paper we present a modelling formalism for dynamic networks of stochastic hybrid automata. In particular, our formalism is based on primitives for the dynamic creation and termination of hybrid automata components during the execution of a system. In this way we allow for natural modelling of concepts such as multiple threads found in various programming paradigms, as well as the dynamic evolution of biological systems. We provide a natural stochastic semantics of the modelling formalism based on repeated output races between the dynamically evolving components of a system. As specification language we present a quantified extension of Metric Temporal Logic (MTL). As a main contribution of this paper, the statistical model checking engine of Uppaal has been extended to the setting of dynamic networks of hybrid systems and quantified MTL. We demonstrate the usefulness of the extended formalisms in an analysis of a dynamic version of the well-known Train Gate example, as well as in natural monitoring of an MTL formula, where observations may lead to dynamic creation of monitors for sub-formulas.
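A toy rendering of the race-based semantics (component structure invented for illustration): each component draws a delay from its own distribution, the earliest wins the output race, and its reaction may dynamically create or terminate components.

```python
import random

def simulate(components, horizon=10.0):
    t, log = 0.0, []
    while t < horizon and components:
        delays = [(c["delay"](), c) for c in components]
        d, winner = min(delays, key=lambda x: x[0])    # output race
        t += d
        log.append((round(t, 3), winner["name"]))
        components.remove(winner)
        components.extend(winner["react"]())           # dynamic creation/termination
    return log

def worker(i):
    # A worker outputs once, then terminates (reacts with no successors).
    return {"name": f"w{i}", "delay": lambda: random.expovariate(1.0),
            "react": lambda: []}

spawner = {"name": "spawn", "delay": lambda: random.expovariate(0.5),
           "react": lambda: [spawner, worker(random.randrange(100))]}

print(simulate([spawner]))
```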

Journal ArticleDOI
TL;DR: A tropical analogue of Fourier-Motzkin elimination is developed from which geometrical properties of tropical polyhedra are derived and it is proved that the redundant inequalities produced when performing successive elimination steps can be dynamically deleted by reduction to mean payoff game problems.
Abstract: We introduce a generalization of tropical polyhedra able to express both strict and non-strict inequalities. Such inequalities are handled by means of a semiring of germs (encoding infinitesimal perturbations). We develop a tropical analogue of Fourier-Motzkin elimination from which we derive geometrical properties of these polyhedra. In particular, we show that they coincide with the tropically convex union of (non-necessarily closed) cells that are convex both classically and tropically. We also prove that the redundant inequalities produced when performing successive elimination steps can be dynamically deleted by reduction to mean payoff game problems. As a complement, we provide a coarser (polynomial time) deletion procedure which is enough to arrive at a simply exponential bound for the total execution time. These algorithms are illustrated by an application to real-time systems (reachability analysis of timed automata).

BookDOI
01 Jan 2014
TL;DR: This work proposes a formal framework for studying information flow security in component-based systems and introduces two kinds of non-interference properties together with sufficient conditions that ensure them and simplify their automated verification.
Abstract: This paper proposes a formal framework for studying information flow security in component-based systems. The security policy is defined and verified from the early steps of the system design. Two kinds of non-interference properties are formally introduced, and for both of them sufficient conditions that ensure and simplify the automated verification are proposed. The verification is compositional: first local, by checking the behavior of every atomic component, and then global, by checking the inter-component communication and coordination. The potential benefits are illustrated on a concrete case study about constructing secure heterogeneous distributed systems.

Journal ArticleDOI
TL;DR: This paper proposes a new theory of quantitative specifications that generalizes the notions of step-wise refinement and compositional design operations from the Boolean to an arbitrary quantitative setting.
Abstract: This paper proposes a new theory of quantitative specifications. It generalizes the notions of step-wise refinement and compositional design operations from the Boolean to an arbitrary quantitative setting. Through a large number of examples, it is shown that this general approach permits the unification of many interesting quantitative approaches to system design.

Proceedings ArticleDOI
20 Nov 2014
TL;DR: This paper introduces a systematic method for building stochastic abstract performance models using statistical inference and model calibration, and proposes statistical model checking as the performance evaluation technique for the obtained models.
Abstract: Performance and functional correctness are key for the successful design of modern embedded systems. Both aspects must be considered early in the design process to enable well-founded decision making towards the final implementation. Nonetheless, building abstract system-level models that faithfully capture performance information alongside functional behavior is a challenging task. In contrast to functional aspects, performance details are rarely available during early design phases and no clear method is known to characterize them. Moreover, once such system-level models are built they are inherently complex, as they usually mix software models, hardware architecture constraints and environment abstractions. Their analysis using traditional performance evaluation methods is reaching its limits, and the need for more scalable and accurate techniques is becoming urgent. In this paper, we introduce a systematic method for building stochastic abstract performance models using statistical inference and model calibration, and we propose statistical model checking as the performance evaluation technique for the obtained models. We applied our method to a real-life case study and were able to verify different timing properties.
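A minimal sketch of the calibration step (the measurements and the normality assumption are ours, purely for illustration): fit a distribution to profiled execution times and plug the fitted sampler into the abstract model that SMC will analyse.

```python
import random
import statistics

measurements = [0.92, 1.10, 1.05, 0.98, 1.21, 1.03]   # made-up profiling data

mu = statistics.mean(measurements)
sigma = statistics.stdev(measurements)

def calibrated_delay():
    """Stochastic timing annotation for the abstract model's computation step."""
    return max(0.0, random.gauss(mu, sigma))

# The abstract model calls calibrated_delay() wherever the concrete segment
# would execute; SMC then estimates e.g. P(end-to-end latency < bound).
```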

Book ChapterDOI
08 Oct 2014
TL;DR: The main contribution of the presented work is the metamodel-based domain-specific construction of the corresponding code generators for the verification tools Uppaal, Spin, Plasma-lab, and Prism.
Abstract: In this paper we discuss an elaborate case study utilizing the domain-specific development of code generators within the Cinco meta tooling suite. Cinco is a framework that allows for the automatic generation of a wide range of graphical modeling tools from an abstract high-level specification. The presented case study makes use of Cinco to rapidly construct custom graphical interfaces for multi-faceted, concurrent systems, comprising non-functional properties like time, probability, data, and costs. The point of this approach is to provide user communities and their favorite tools with graphical interfaces tailored to their specific needs. This will be illustrated by generating graphical interfaces for timed automata (TA), probabilistic timed automata (PTA), Markov decision processes (MDP) and simple labeled transition systems (LTS). The main contribution of the presented work, however, is the metamodel-based domain-specific construction of the corresponding code generators for the verification tools Uppaal, Spin, Plasma-lab, and Prism.

Book ChapterDOI
17 Sep 2014
TL;DR: In this article, a new notion of structural refinement for the modal nu-calculus is introduced, which is a sound abstraction of logical implication, and it is shown that these two specification formalisms are structurally equivalent.
Abstract: We introduce a new notion of structural refinement, a sound abstraction of logical implication, for the modal nu-calculus. Using new translations between the modal nu-calculus and disjunctive modal transition systems, we show that these two specification formalisms are structurally equivalent.

Journal ArticleDOI
TL;DR: This work considers a fixed perturbation, studies the robustness of timed specifications with respect to the operators of the theory, and synthesizes robust strategies in timed games to address the problem of robust implementations in timed specification theories.

Proceedings ArticleDOI
01 Sep 2014
TL;DR: The main functionalities of the PLASMA statistical model checking platform developed at Inria are surveyed.
Abstract: This paper surveys the main functionalities of the PLASMA statistical model checking platform developed at Inria.

Journal ArticleDOI
TL;DR: In this article, a difference operator for stochastic systems whose specifications are represented by APAs is proposed, where the goal is to produce a specification APA that represents all witness PAs of this failure.
Abstract: This paper studies a difference operator for stochastic systems whose specifications are represented by Abstract Probabilistic Automata (APAs). In case refinement fails between two specifications, the target of this operator is to produce a specification APA that represents all witness PAs of this failure. Our contribution is an algorithm that approximates the difference of two APAs with arbitrary precision. Our technique relies on new quantitative notions of distances between APAs, used to assess convergence of the approximations, as well as on an in-depth inspection of the refinement relation for APAs. The procedure is effective and no more complex to implement than refinement checking.

Journal ArticleDOI
TL;DR: This work proposes Modal Specifications with Data (MSDs), the first modal specification theory with explicit representation of data, which also serves as a new abstraction-based formalism for transition systems with data.

Proceedings ArticleDOI
TL;DR: A systematic mapping process was defined and applied to assess the importance of flattening in state machine formalisms; 30 publications were finally selected, but it appeared that flattening is rarely the sole focus of the publications and that the care devoted to the description and validation of flattening techniques varies greatly.
Abstract: State machine formalisms equipped with hierarchy and parallelism allow complex system behaviours to be modelled compactly. Such models can then be transformed into executable code or inputs for model-based testing and verification techniques. Generated artifacts are mostly flat descriptions of system behaviour. Flattening is thus an essential step of these transformations. To assess the importance of flattening, we have defined and applied a systematic mapping process, and 30 publications were finally selected. However, it appeared that flattening is rarely the sole focus of the publications and that the care devoted to the description and validation of flattening techniques varies greatly. Preliminary assessment of associated tool support indicated limited tool availability and scalability on challenging models. We see this initial investigation as a first step towards generic flattening techniques and scalable tool support, cornerstones of reliable model-based behavioural development.
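To make the flattening step concrete, here is a toy flattening of parallel regions (hierarchy handling omitted): the flat state space is the product of the regions' states, with transitions interleaved.

```python
from itertools import product

def flatten(regions):
    """regions: list of dicts mapping state -> [(event, next_state)]."""
    states = list(product(*[r.keys() for r in regions]))
    flat = {s: [] for s in states}
    for s in states:
        for i, region in enumerate(regions):
            for event, nxt in region[s[i]]:
                flat[s].append((event, s[:i] + (nxt,) + s[i + 1:]))
    return flat

r1 = {"idle": [("start", "busy")], "busy": [("done", "idle")]}
r2 = {"off": [("on", "on")], "on": [("off", "off")]}
print(len(flatten([r1, r2])))   # 4 flat states: the product of 2 x 2 regions
```

The exponential growth visible even in this toy product is precisely why the scalability of flattening tools matters.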

Book ChapterDOI
08 Oct 2014
TL;DR: A frequency-domain metric is proposed to judge the a priori performance of an abstraction, and an a posteriori indicator is provided to aid the construction of abstractions optimised for critical properties.
Abstract: Monte Carlo simulations may be used to efficiently estimate critical properties of complex evolving systems but are nevertheless computationally intensive. Hence, when only part of a system is new or modified it seems wasteful to re-simulate the parts that have not changed. It also seems unnecessary to perform many simulations of parts of a system whose behaviour does not vary significantly. To increase the efficiency of designing and testing complex evolving systems we present simulation techniques to allow such a system to be verified against behaviour-preserving statistical abstractions of its environment. We propose a frequency domain metric to judge the a priori performance of an abstraction and provide an a posteriori indicator to aid construction of abstractions optimised for critical properties.
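As a guess at the flavour of such a frequency-domain metric (not the paper's exact definition), one could compare the normalised power spectra of traces produced by a component and by its statistical abstraction:

```python
import numpy as np

def spectral_distance(trace_a, trace_b):
    """L2 distance between the normalised power spectra of two traces."""
    pa = np.abs(np.fft.rfft(trace_a)) ** 2
    pb = np.abs(np.fft.rfft(trace_b)) ** 2
    return float(np.linalg.norm(pa / pa.sum() - pb / pb.sum()))

t = np.linspace(0, 10, 512)
real = np.sin(2 * np.pi * t) + 0.1 * np.random.randn(t.size)
abstr = np.sin(2 * np.pi * t) + 0.1 * np.random.randn(t.size)
print(spectral_distance(real, abstr))   # small value: behaviour is preserved
```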

Book ChapterDOI
14 Oct 2014
TL;DR: This article proposes a new similarity measure between texts which, contrary to current state-of-the-art approaches, takes a global view of the texts to be compared; a tool computing this textual distance was implemented and experiments were conducted on several corpora of texts.
Abstract: We propose a new similarity measure between texts which, contrary to the current state-of-the-art approaches, takes a global view of the texts to be compared. We have implemented a tool to compute our textual distance and conducted experiments on several corpora of texts. The experiments show that our method can reliably identify different global types of texts.

Book ChapterDOI
06 Apr 2014
TL;DR: This work surveys extensions of modal transition systems to specification theories for probabilistic and timed systems.
Abstract: We survey extensions of modal transition systems to specification theories for probabilistic and timed systems.

Proceedings ArticleDOI
15 Dec 2014
TL;DR: By modeling systems with Markov chains, this work measures the effectiveness of attacks on non-terminating systems, providing characterizations and algorithms that define meaningful measures of security for such systems and compute them when possible.
Abstract: In recent years, quantitative security techniques have been providing effective measures of the security of a system against an attacker. Such techniques usually assume that the system produces a finite amount of observations based on a finite amount of secret bits and terminates, and the attack is based on these observations. By modeling systems with Markov chains, we are able to measure the effectiveness of attacks on non-terminating systems. Such systems do not necessarily produce a finite amount of output and are not necessarily based on a finite amount of secret bits. We provide characterizations and algorithms to define meaningful measures of security for non-terminating systems, and to compute them when possible. We also study the bounded versions of the problems, and show examples of non-terminating programs and how their effectiveness in protecting their secret can be measured.

Book ChapterDOI
09 Sep 2014
TL;DR: This work provides a framework for compositional and iterative design and verification of systems with quantitative information, such as rewards, time or energy, based on disjunctive modal transition systems and shows how to compute the results of standard operations on the systems, including the quotient, which has not been previously considered for quantitative non-deterministic systems.
Abstract: We provide a framework for compositional and iterative design and verification of systems with quantitative information, such as rewards, time or energy. It is based on disjunctive modal transition systems where we allow actions to bear various types of quantitative information. Throughout the design process the actions can be further refined and the information made more precise. We show how to compute the results of standard operations on the systems, including the quotient (residual), which has not been previously considered for quantitative non-deterministic systems. Our quantitative framework has close connections to the modal nu-calculus and is compositional with respect to general notions of distances between systems and the standard operations.