
Showing papers presented at "Dagstuhl Seminar Proceedings in 2009"


Proceedings Article
01 Jan 2009
TL;DR: In this paper, the authors construct public-key cryptosystems that are secure assuming the worst-case hardness of approximating the shortest vector problem on lattices, via a classical reduction from variants of the shortest vector problem to the learning with errors (LWE) problem.
Abstract: We construct public-key cryptosystems that are secure assuming the *worst-case* hardness of approximating the shortest vector problem on lattices. Prior cryptosystems with worst-case connections (e.g., the Ajtai-Dwork system) were based either on a *special case* of the shortest vector problem, or on the conjectured hardness of lattice problems for *quantum* algorithms. Our main technical innovation is a reduction from certain variants of the shortest vector problem to corresponding versions of the "learning with errors" (LWE) problem; previously, only a quantum reduction of this kind was known. In addition, we construct new cryptosystems based on LWE, including a very natural chosen ciphertext-secure system that has a much simpler description and tighter underlying worst-case approximation factor than prior constructions.
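A toy, self-contained sketch of the LWE-based public-key idea (Regev-style bit encryption) may help fix intuitions; the parameters below are deliberately tiny and insecure, and the sketch illustrates the paradigm rather than the construction in the paper.

```python
# Toy Regev-style encryption from LWE -- illustrative only, parameters are
# far too small to be secure.
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 16, 64, 257          # dimension, number of samples, modulus (prime)

def keygen():
    s = rng.integers(0, q, size=n)                    # secret key
    A = rng.integers(0, q, size=(m, n))               # public random matrix
    e = rng.integers(-1, 2, size=m)                   # small noise
    b = (A @ s + e) % q                               # LWE samples
    return (A, b), s

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, size=m)                    # random subset selector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q                  # encode bit in high half
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - u @ s) % q                               # ~ noise, or ~ noise + q/2
    return int(min(d, q - d) > q // 4)                # closer to q/2 -> bit 1

pk, sk = keygen()
for bit in (0, 1):
    assert decrypt(sk, encrypt(pk, bit)) == bit
print("toy LWE encryption round-trips both bits")
```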

298 citations


Proceedings Article
01 Jan 2009
TL;DR: Although sequential parameter optimization relies on enhanced statistical techniques such as design and analysis of computer experiments, it can be performed algorithmically and essentially requires only the specification of the relevant algorithm's parameters.
Abstract: We provide a comprehensive, effective and very efficient methodology for the design and experimental analysis of algorithms. We rely on modern statistical techniques for tuning and understanding algorithms from an experimental perspective. Therefore, we make use of the sequential parameter optimization (SPO) method that has been successfully applied as a tuning procedure to numerous heuristics for practical and theoretical optimization problems. Two case studies, which illustrate the applicability of SPO to algorithm tuning and model selection, are presented.
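To make the tuning loop concrete, here is a minimal sketch in the spirit of sequential parameter optimization: evaluate a few settings, fit a cheap surrogate, and use it to propose the next setting. The target function, the quadratic surrogate and the budgets are illustrative assumptions, not the SPO toolbox itself.

```python
# Sketch of a sequential-parameter-optimization-style loop for one numeric
# algorithm parameter.  The "performance" function, the quadratic surrogate
# and the budgets are placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(1)

def performance(p):
    """Stand-in for running the tuned algorithm with parameter p (noisy)."""
    return (p - 0.3) ** 2 + 0.05 * rng.standard_normal()

# Initial design: a handful of parameter settings spread over the range.
params = list(np.linspace(0.0, 1.0, 5))
scores = [performance(p) for p in params]

for _ in range(10):                       # sequential improvement steps
    # Fit a cheap surrogate model (quadratic) to the observations so far.
    coeffs = np.polyfit(params, scores, deg=2)
    # Propose the next setting: minimize the surrogate on a fine grid.
    grid = np.linspace(0.0, 1.0, 201)
    candidate = grid[np.argmin(np.polyval(coeffs, grid))]
    # Evaluate the real (expensive, noisy) target and add the observation.
    params.append(candidate)
    scores.append(performance(candidate))

best = params[int(np.argmin(scores))]
print(f"best parameter found: {best:.3f}")
```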

273 citations


Proceedings Article
01 Jan 2009
TL;DR: Grøstl as mentioned in this paper is a SHA-3 candidate with a compression function built from two fixed, large, distinct permutations, which are used to give strong statements about the resistance of Grøstl against large classes of cryptanalytic attacks.
Abstract: Grøstl is a SHA-3 candidate proposal. Grøstl is an iterated hash function with a compression function built from two fixed, large, distinct permutations. The design of Grøstl is transparent and based on principles very different from those used in the SHA-family. The two permutations are constructed using the wide trail design strategy, which makes it possible to give strong statements about the resistance of Grøstl against large classes of cryptanalytic attacks. Moreover, if these permutations are assumed to be ideal, there is a proof for the security of the hash function. Grøstl is a byte-oriented SP-network which borrows components from the AES. The S-box used is identical to the one used in the block cipher AES and the diffusion layers are constructed in a similar manner to those of the AES. As a consequence there is a very strong confusion and diffusion in Grøstl.
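The Grøstl compression function has the form f(h, m) = P(h ⊕ m) ⊕ Q(m) ⊕ h. The sketch below shows only this mode of operation, with throw-away placeholder maps standing in for the real AES-like permutations P and Q.

```python
# Structure of the Grostl compression function f(h, m) = P(h ^ m) ^ Q(m) ^ h,
# with throw-away placeholders standing in for the real AES-like P and Q
# (this sketch shows only the mode, not the actual primitives).
import hashlib

BLOCK = 64  # bytes per chaining value / message block in this toy version

def toy_perm(tag: bytes, x: bytes) -> bytes:
    """Placeholder fixed mixing map (NOT the real permutation P or Q)."""
    return hashlib.shake_256(tag + x).digest(BLOCK)

def P(x): return toy_perm(b"P", x)
def Q(x): return toy_perm(b"Q", x)

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def compress(h: bytes, m: bytes) -> bytes:
    return xor(xor(P(xor(h, m)), Q(m)), h)

h = bytes(BLOCK)                      # all-zero initial chaining value (toy)
for block in (b"a" * BLOCK, b"b" * BLOCK):
    h = compress(h, block)
print(h.hex()[:32])
```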

184 citations


Proceedings Article
01 Jan 2009
TL;DR: ADOL-C as mentioned in this paper is a C++ package that facilitates the evaluation of first and higher derivatives of vector functions that are defined by computer programs written in C or C++.
Abstract: The C++ package ADOL-C described in this paper facilitates the evaluation of first and higher derivatives of vector functions that are defined by computer programs written in C or C++. The numerical values of derivative vectors are obtained free of truncation errors, at a cost of typically a small multiple of the run time and a fixed small multiple of the random access memory required by the given function evaluation program. Derivative matrices are obtained by columns, by rows or in sparse format. This tutorial describes the source code modification required for the application of ADOL-C, the most frequently used drivers to evaluate derivatives and some recent developments.
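ADOL-C drives differentiation through operator overloading on its C++ adouble type. As a language-agnostic sketch of that mechanism (not of the ADOL-C API), here is a minimal forward-mode dual-number example.

```python
# Minimal forward-mode automatic differentiation via operator overloading --
# the same mechanism ADOL-C uses with its C++ adouble type, shown here as a
# self-contained Python toy (this is not the ADOL-C API).
import math

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def sin(x: "Dual") -> "Dual":
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def f(x, y):
    return x * y + sin(x)          # example function defined by a program

# Derivative of f with respect to x at (x, y) = (1.5, 2.0): seed dx = 1.
x, y = Dual(1.5, 1.0), Dual(2.0, 0.0)
out = f(x, y)
print(out.value, out.deriv)        # deriv == y + cos(x) = 2 + cos(1.5)
```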

148 citations


Book ChapterDOI
14 Oct 2009
TL;DR: This paper presents a simplified version of a programming language that is designed to implement norm-based artefacts, and a logic is presented that can be used to specify and verify properties of programs developed in this language.
Abstract: Multi-agent systems are viewed as consisting of individual agents whose behaviors are regulated by an organization artefact. This paper presents a simplified version of a programming language that is designed to implement norm-based artefacts. Such artefacts are specified in terms of norms being enforced by monitoring, regimenting and sanctioning mechanisms. The syntax and operational semantics of the programming language are introduced and discussed. A logic is presented that can be used to specify and verify properties of programs developed in this language.
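The following toy sketch illustrates the monitoring-and-sanctioning idea behind such norm-based artefacts; the rule format and state representation are invented for illustration and are not the programming language defined in the paper.

```python
# Toy norm-based "organisation artefact": norms are monitored against the
# observable state of the multi-agent system and sanctions are imposed on
# violation.  The rule format below is invented for illustration only.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Norm:
    name: str
    violated_when: Callable[[dict], bool]   # predicate over one agent's facts
    sanction: dict                          # state update imposed on violator

@dataclass
class Artefact:
    norms: list
    state: dict = field(default_factory=dict)

    def monitor(self):
        for norm in self.norms:
            for agent, facts in self.state.items():
                if norm.violated_when(facts):
                    facts.update(norm.sanction)      # sanctioning
                    print(f"{agent} violated {norm.name}; sanction applied")

speed_norm = Norm("no_speeding",
                  violated_when=lambda facts: facts.get("speed", 0) > 50,
                  sanction={"fine": 100})

org = Artefact(norms=[speed_norm],
               state={"agent1": {"speed": 70}, "agent2": {"speed": 40}})
org.monitor()
print(org.state)
```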

112 citations


Book ChapterDOI
01 Jan 2009
TL;DR: In this article, the main strands of requirements research over the past two decades are reviewed and persistent and new challenges in the design of software intensive systems are identified, including business process focus, system transparency, integration focus, distributed requirements, layered requirements, criticality of information architectures, increased deployment of COTS and software components, design fluidity and interdependent complexity.
Abstract: Requirements have remained one of the grand challenges in the design of software intensive systems. In this paper we review the main strands of requirements research over the past two decades and identify persistent and new challenges. Based on a field study that involved interviews of over 30 leading IT professionals involved in large and complex software design and implementation initiatives, we review the current state-of-the-art in the practice of design requirements management. We observe significant progress in the deployment of modeling methods, tools, risk-driven design, and user involvement. We note nine emerging themes and challenges in the requirement management arena: 1) business process focus, 2) systems transparency, 3) integration focus, 4) distributed requirements, 5) layered requirements, 6) criticality of information architectures, 7) increased deployment of COTS and software components, 8) design fluidity and 9) interdependent complexity. Several research challenges and new avenues for research are noted in the discovery, specification, and validation of requirements in light of these requirements features.

88 citations


Proceedings Article
Andreas Wächter1
01 Jan 2009
TL;DR: Ipopt as discussed by the authors is an open-source software package for large-scale nonlinear optimization; this tutorial gives a short introduction that should allow the reader to install and test the package on a UNIX-like system and to run simple examples in a short period of time.
Abstract: Ipopt is an open-source software package for large-scale nonlinear optimization. This tutorial gives a short introduction that should allow the reader to install and test the package on a UNIX-like system, and to run simple examples in a short period of time.
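To show the kind of problem Ipopt targets, here is the small Hock-Schittkowski 71 test problem (the example used in the Ipopt documentation), stated in Python and solved with SciPy's SLSQP purely as a readily available stand-in solver; the tutorial itself of course drives Ipopt.

```python
# The kind of constrained nonlinear problem Ipopt targets, illustrated on the
# small Hock-Schittkowski 71 test problem.  SciPy's SLSQP is used here only
# as a stand-in solver; the tutorial itself runs Ipopt.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return x[0] * x[3] * (x[0] + x[1] + x[2]) + x[2]

constraints = [
    {"type": "ineq", "fun": lambda x: x[0] * x[1] * x[2] * x[3] - 25.0},
    {"type": "eq",   "fun": lambda x: np.sum(x ** 2) - 40.0},
]
bounds = [(1.0, 5.0)] * 4
x0 = np.array([1.0, 5.0, 5.0, 1.0])

result = minimize(objective, x0, method="SLSQP",
                  bounds=bounds, constraints=constraints)
print(result.x, result.fun)   # optimum is near (1.0, 4.74, 3.82, 1.38)
```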

65 citations


Proceedings Article
Jan Broersen1
01 Jan 2009
TL;DR: This paper shows that several of these categories of mens rea can be formalized in a complete stit logic featuring operators for actions taking effect in next states, operators for S5-knowledge and operators for SDL-type obligation.
Abstract: Most juridical systems contain the principle that an act is only unlawful if the agent conducting the act has a `guilty mind' (`mens rea'). Different law systems distinguish different modes of mens rea. For instance, American law distinguishes between `knowingly' performing a criminal act, `recklessness', `strict liability', etc. I will show we can formalize several of these categories. The formalism I use is a complete stit-logic featuring operators for stit-actions taking effect in `next' states, S5-knowledge operators and SDL-type obligation operators. The different modes of `mens rea' correspond to the violation conditions of different types of obligation definable in the logic.

60 citations


Proceedings Article
01 Jan 2009
TL;DR: The cryptographic hash function Lane is proposed as a candidate for the SHA-3 competition organised by NIST and aims to be secure, easy to understand, elegant and flexible in implementation.
Abstract: We propose the cryptographic hash function Lane as a candidate for the SHA-3 competition organised by NIST. Lane is an iterated hash function supporting multiple digest sizes. Components of the AES block cipher are reused as building blocks. Lane aims to be secure, easy to understand, elegant and flexible in implementation.

48 citations


Proceedings Article
01 Jan 2009
TL;DR: Ten guidelines for the use of normative systems in computer science are introduced and discussed, because norms are used to coordinate, organize, guide, regulate or control interaction among distributed autonomous systems.
Abstract: In this paper we introduce and discuss ten guidelines for the use of normative systems in computer science. We adopt a multiagent systems perspective, because norms are used to coordinate, organize, guide, regulate or control interaction among distributed autonomous systems. The first six guidelines are derived from the computer science literature. From the so-called 'normchange' definition of the first workshop on normative multiagent systems in 2005 we derive the guidelines to motivate which definition of normative multiagent system is used, to make explicit why norms are a kind of (soft) constraints deserving special analysis, and to explain why and how norms can be changed at runtime. From the so-called 'mechanism design' definition of the second workshop on normative multiagent systems in 2007 we derive the guidelines to discuss the use and role of norms as a mechanism in a game-theoretic setting, clarify the role of norms in the multiagent system, and to relate the notion of "norm" to the legal, social, or moral literature. The remaining four guidelines follow from the philosophical literature: use norms also to resolve dilemmas, and in general to coordinate, organize, guide, regulate or control interaction among agents, distinguish norms from obligations, prohibitions and permissions, use the deontic paradoxes only to illustrate the normative multiagent system, and consider regulative norms in relation to other kinds of norms and other social-cognitive computer science concepts.

43 citations


Proceedings Article
01 Jan 2009
TL;DR: The most significant tools in Zoltan (dynamic partitioning, graph coloring and ordering) are described, along with how to obtain, build, and use Zoltan in parallel applications.
Abstract: The Zoltan library is a toolkit of parallel combinatorial algorithms for unstructured and/or adaptive computations. In this paper, we briefly describe the most significant tools in Zoltan: dynamic partitioning, graph coloring and ordering. We also describe how to obtain, build, and use Zoltan in parallel applications.

Proceedings Article
01 Jan 2009
TL;DR: This work gives a Θ(n(k + log m log k) log n) time randomised algorithm which finds the correct answer with high probability and develops an approach based on k-selectors that runs in Θ(nk polylog m) time.
Abstract: We present solutions for the k-mismatch pattern matching problem with don't cares. Given a text t of length n and a pattern p of length m with don't care symbols and a bound k, our algorithms find all the places that the pattern matches the text with at most k mismatches. We first give a Θ(n(k + log m log k) log n) time randomised algorithm which finds the correct answer with high probability. We then present a new deterministic Θ(nk^2 log^2 m) time solution that uses tools originally developed for group testing. Taking our derandomisation approach further we develop an approach based on k-selectors that runs in Θ(nk polylog m) time. Further, in each case the location of the mismatches at each alignment is also given at no extra cost.
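For reference, a brute-force checker pins down the problem statement (quadratic time, nothing like the bounds above); the wildcard symbol and example strings below are chosen here purely for illustration.

```python
# Brute-force reference for k-mismatch pattern matching with don't cares:
# O(n*m) time, only meant to pin down the problem the paper solves much faster.
WILDCARD = "?"

def k_mismatch_occurrences(text: str, pattern: str, k: int):
    n, m = len(text), len(pattern)
    hits = []
    for i in range(n - m + 1):
        mismatches = [j for j in range(m)
                      if pattern[j] != WILDCARD
                      and text[i + j] != WILDCARD
                      and text[i + j] != pattern[j]]
        if len(mismatches) <= k:
            hits.append((i, mismatches))   # alignment and mismatch positions
    return hits

print(k_mismatch_occurrences("abracadabra", "a?ra", k=1))
# [(0, []), (7, [])]
```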

Proceedings Article
01 Jan 2009
TL;DR: In this paper, a study of information flow during epidemic spread in such dynamic human networks, a topic which shares many issues with network-based epidemiology, is presented, where the authors explore hub nodes extracted from real world connections and show their influence on the epidemic to demonstrate the characteristics of information propagation.
Abstract: The emergence of Delay Tolerant Networks (DTNs) has culminated in a new generation of wireless networking. New communication paradigms, which use dynamic interconnectedness as people encounter each other opportunistically, lead towards a world where digital traffic flows more easily. We focus on human-to-human communication in environments that exhibit the characteristics of social networks. This paper describes our study of information flow during epidemic spread in such dynamic human networks, a topic which shares many issues with network-based epidemiology. We explore hub nodes extracted from real world connectivity traces and show their influence on the epidemic to demonstrate the characteristics of information propagation.
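A toy sketch of the underlying idea, on an invented contact graph with a deterministic SI spreading model: hub nodes are extracted by degree, and seeding the spread at a hub is compared with seeding at a peripheral node.

```python
# Toy illustration of the paper's theme: extract hub nodes from a contact
# graph and see how strongly seeding an epidemic/message at a hub speeds up
# the spread.  The graph and the spreading model (deterministic SI on a
# static graph) are simplifications chosen for readability.
contacts = {                         # undirected contact graph (toy data)
    "hub": {"a", "b", "c", "d", "e"},
    "a": {"hub", "b"}, "b": {"hub", "a"}, "c": {"hub", "f"},
    "d": {"hub"}, "e": {"hub"}, "f": {"c", "g"}, "g": {"f"},
}

def hubs(graph, top=1):
    """Rank nodes by degree -- the simplest notion of a 'hub'."""
    return sorted(graph, key=lambda v: len(graph[v]), reverse=True)[:top]

def rounds_to_full_spread(graph, seed):
    """Deterministic SI spread: every contact transmits once per round."""
    infected, frontier, rounds = {seed}, {seed}, 0
    while len(infected) < len(graph):
        frontier = {v for u in frontier for v in graph[u]} - infected
        infected |= frontier
        rounds += 1
    return rounds

print("hub nodes:", hubs(contacts))
print("rounds when seeded at hub:", rounds_to_full_spread(contacts, "hub"))
print("rounds when seeded at g  :", rounds_to_full_spread(contacts, "g"))
```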

Proceedings Article
01 Jan 2009
TL;DR: The aim of this paper is to demonstrate that the use of PCE may help and speed up the optimization process if compared to standard approaches such as Monte Carlo and Latin Hypercube sampling.
Abstract: Robust design optimization is a modeling methodology, combined with a suite of computational tools, which is aimed to solve problems where some kind of uncertainty occurs in the data or in the model. This paper explores robust optimization complexity in the multiobjective case, describing a new approach by means of Polynomial Chaos expansions (PCE). The aim of this paper is to demonstrate that the use of PCE may help and speed up the optimization process if compared to standard approaches such as Monte Carlo and Latin Hypercube sampling.
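A one-dimensional sketch of the PCE mechanics: project a model response onto probabilists' Hermite polynomials via Gauss-Hermite quadrature, read mean and variance off the coefficients, and compare with plain Monte Carlo. The response function and expansion order are illustrative choices, not taken from the paper.

```python
# 1-D sketch of polynomial chaos: expand a model response g(X), X ~ N(0,1),
# in probabilists' Hermite polynomials.  Mean and variance then come directly
# from the PCE coefficients, using far fewer model evaluations than Monte
# Carlo.  The response g and the expansion order are illustrative choices.
import math
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(2)

def g(x):                      # "expensive" model response (placeholder)
    return np.exp(0.3 * x) + 0.5 * x ** 2

order = 6
nodes, weights = He.hermegauss(16)            # quadrature for weight e^(-x^2/2)
gvals = g(nodes)                              # 16 model evaluations in total
norm = math.sqrt(2.0 * math.pi)

# Spectral projection: c_k = E[g(X) He_k(X)] / k!
coeffs = [np.sum(weights * gvals * He.hermeval(nodes, np.eye(order + 1)[k]))
          / (norm * math.factorial(k)) for k in range(order + 1)]

pce_mean = coeffs[0]
pce_var = sum(c ** 2 * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)

samples = g(rng.standard_normal(200_000))     # Monte Carlo reference
print(f"PCE  mean={pce_mean:.5f}  var={pce_var:.5f}   (16 model runs)")
print(f"MC   mean={samples.mean():.5f}  var={samples.var():.5f}  (200000 runs)")
```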

Book ChapterDOI
04 Feb 2009
TL;DR: A general framework for argumentation-based resolution of conflicts amongst desires and norms is proposed, and arguments for and against compliance are arguments justifying rewards and punishments exacted by `enforcing' agents.
Abstract: Norms represent what ought to be done, and their fulfillment can be seen as benefiting the overall system, society or organisation. However, individual agent goals (desire) may conflict with system norms. If a decision to comply with a norm is determined exclusively by an agent or, conversely, if norms are rigidly enforced, then system performance may be degraded, and individual agent goals may be inappropriately obstructed. To prevent such deleterious effects we propose a general framework for argumentation-based resolution of conflicts amongst desires and norms. In this framework, arguments for and against compliance are arguments justifying rewards, respectively punishments, exacted by `enforcing' agents. The arguments are evaluated in a recent extension to Dung's abstract argumentation framework, in order that the agents can engage in metalevel argumentation as to whether the rewards and punishments have the required motivational force. We provide an example instantiation of the framework based on a logic programming formalism.

Proceedings Article
01 Feb 2009
TL;DR: This tutorial will show how to add a simple genetic algorithm routine to the graph bipartitioning methods of Scotch, and describe its visible objects and data structures.
Abstract: The design of the Scotch library for static mapping, graph partitioning and sparse matrix ordering is highly modular, so as to allow users and potential contributors to tweak it and easily add new static mapping, graph bipartitioning, vertex separation or graph ordering methods to match their particular needs. The purpose of this tutorial is twofold. It will start with a description of the interface of Scotch, presenting its visible objects and data structures. Then, we will step into the API mirror and have a look at the inside: the internal representation of graphs, mappings and orderings, and the basic sequential and parallel building blocks, such as graph induction and graph coarsening, which can be re-used by third-party software. As an example, we will show how to add a simple genetic algorithm routine to the graph bipartitioning methods.
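As a standalone illustration of the kind of routine the tutorial plugs into Scotch, here is a tiny genetic algorithm for graph bipartitioning on a random toy graph; none of this touches the Scotch API itself, and the fitness function and GA settings are made-up examples.

```python
# A tiny genetic algorithm for graph bipartitioning, the kind of routine the
# tutorial shows how to plug into Scotch.  This is a standalone toy (random
# small graph, naive fitness), not code against the Scotch API.
import random

random.seed(3)
n_vertices = 16
edges = [(i, j) for i in range(n_vertices) for j in range(i + 1, n_vertices)
         if random.random() < 0.25]

def fitness(part):
    """Edge cut plus a penalty for unbalanced parts (lower is better)."""
    cut = sum(1 for i, j in edges if part[i] != part[j])
    imbalance = abs(sum(part) - n_vertices // 2)
    return cut + 4 * imbalance

def crossover(a, b):
    point = random.randrange(1, n_vertices)
    return a[:point] + b[point:]

def mutate(part, rate=0.05):
    return [1 - bit if random.random() < rate else bit for bit in part]

population = [[random.randint(0, 1) for _ in range(n_vertices)]
              for _ in range(30)]
for _ in range(200):                               # generations
    population.sort(key=fitness)
    parents = population[:10]                      # truncation selection
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(20)]

best = min(population, key=fitness)
print("best bipartition:", best, "cut+penalty =", fitness(best))
```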

Proceedings Article
01 Jan 2009
TL;DR: The role of the belt in the light of differential trails and the relative advantages of a block mode hash function compared to a stream mode one are explained.
Abstract: In this paper, we explain the design choices of Panama (8) and RadioGatún (1), which lead to Keccak (3). After a brief recall of Panama, RadioGatún and the trail backtracking cost, we focus on three important aspects. First, we explain the role of the belt in the light of differential trails. Second, we discuss the relative advantages of a block mode hash function compared to a stream mode one. Finally, we point out why Panama and RadioGatún are not sponge functions (2) and why their design philosophy differs from that of Keccak.
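Since the discussion turns on block-mode versus stream-mode designs and on sponge functions, the following skeleton shows the generic absorb/squeeze structure of a sponge; the state sizes are arbitrary and a throw-away toy map stands in for a real permutation such as Keccak-f.

```python
# Skeleton of a sponge function (absorb then squeeze), the construction the
# abstract says Keccak follows and Panama/RadioGatun do not.  The "permutation"
# below is a throw-away toy, not Keccak-f.
import hashlib

RATE, CAPACITY = 8, 24                    # bytes; state = rate + capacity
STATE = RATE + CAPACITY

def toy_permutation(state: bytes) -> bytes:
    """Placeholder for the fixed state permutation (NOT Keccak-f)."""
    return hashlib.shake_256(b"perm" + state).digest(STATE)

def sponge_hash(message: bytes, out_len: int) -> bytes:
    # Simple padding up to a whole number of rate-sized blocks.
    padded = message + b"\x01"
    padded += b"\x00" * (-len(padded) % RATE)

    state = bytes(STATE)
    for i in range(0, len(padded), RATE):             # absorbing phase
        block = padded[i:i + RATE] + bytes(CAPACITY)
        state = toy_permutation(bytes(a ^ b for a, b in zip(state, block)))

    out = b""
    while len(out) < out_len:                         # squeezing phase
        out += state[:RATE]
        state = toy_permutation(state)
    return out[:out_len]

print(sponge_hash(b"hello world", 16).hex())
```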

Proceedings Article
01 Jan 2009
TL;DR: SuiteSparseQR is a sparse multifrontal QR factorization algorithm that obtains a substantial fraction of the theoretical peak performance of a multicore computer.
Abstract: SuiteSparseQR is a sparse multifrontal QR factorization algorithm. Dense matrix methods within each frontal matrix enable the method to obtain high performance on multicore architectures. Parallelism across different frontal matrices is handled with Intel's Threading Building Blocks library. Rank-detection is performed within each frontal matrix using Heath's method, which does not require column pivoting. The resulting sparse QR factorization obtains a substantial fraction of the theoretical peak performance of a multicore computer.

Proceedings Article
20 Mar 2009
TL;DR: Based on the simulation works on norms, a life-cycle model for norms is proposed and different mechanisms used by researchers to study norm creation, spreading, enforcement, and emergence are discussed.
Abstract: In multi-agent systems, software agents are modelled to possess characteristics and behaviour borrowed from human societies. Norms are expectations of behaviours of the agents in a society. Norms can be established in a society in different ways. In human societies, there are several types of norms such as moral norms, social norms and legal norms (laws). In artificial agent societies, the designers can impose these norms on the agents. Being autonomous, agents might not always follow the norms. Monitoring and controlling mechanisms should be in place to enforce norms. As the agents are autonomous, they themselves can evolve new norms while adapting to changing needs. In order to design and develop robust artificial agent societies, it is important to understand different approaches proposed by researchers by which norms can spread and emerge within agent societies. This paper makes two contributions to the study of norms. Firstly, based on the simulation works on norms, we propose a life-cycle model for norms. Secondly, we discuss different mechanisms used by researchers to study norm creation, spreading, enforcement and emergence.

Proceedings Article
01 Jan 2009
TL;DR: In this paper, the authors propose a logic-based formalism for describing both the semantics of normative specifications and the semantics of compliance checking procedures, together with a model for checking whether agents' behaviour is compliant with the rules regulating them.
Abstract: The import of the notion of institution in the design of MASs requires developing formal and efficient methods for modeling the interaction between agents' behaviour and normative systems. This paper discusses how to check whether agents' behaviour is compliant with the rules regulating them. The key point of our approach is that compliance is a relationship between two sets of specifications: the specifications for executing a process and the specifications regulating it. We propose a logic-based formalism for describing both the semantics of normative specifications and the semantics of compliance checking procedures.

Book ChapterDOI
01 Jan 2009
TL;DR: The problems and solutions underlying these next-generation intelligent search engines are reviewed and examples of the power of this new search paradigm are given.
Abstract: With the ever increasing size of scientific literature, finding relevant documents and answering questions has become even more of a challenge. Recently, ontologies—hierarchical, controlled vocabularies—have been introduced to annotate genomic data. They can also improve the question and answering and the selection of relevant documents in the literature search. Search engines such as GoPubMed.org use ontological background knowledge to give an overview over large query results and to answer questions. We review the problems and solutions underlying these next-generation intelligent search engines and give examples of the power of this new search paradigm.

Proceedings Article
01 Jan 2009
TL;DR: Experimental results comparing the behavior of the MutantXL algorithm to the F_4 algorithm on HFE and randomly generated multivariate systems show that MutantXL is faster and uses less memory than Magma's implementation of F_4.
Abstract: MutantXL is an algorithm for solving systems of polynomial equations that was proposed at SCC 2008 and improved in PQC 2008. This article gives an overview over the MutantXL algorithm. It also presents experimental results comparing the behavior of the MutantXL algorithm to the F_4 algorithm on HFE and randomly generated multivariate systems. In both cases MutantXL is faster and uses less memory than Magma's implementation of F_4.

Proceedings Article
01 Jan 2009
TL;DR: In this article, Gentry, Peikert, and Vaikuntanathan presented two efficient blind signature schemes based on hard worst-case lattice problems, which are provably secure in the random oracle model and unconditionally blind.
Abstract: Motivated by the need to have secure blind signatures even in the presence of quantum computers, we present two efficient blind signature schemes based on hard worst-case lattice problems. Both schemes are provably secure in the random oracle model and unconditionally blind. The first scheme is based on preimage samplable functions that were introduced at STOC 2008 by Gentry, Peikert, and Vaikuntanathan. The scheme is stateful and runs in 3 moves. The second scheme builds upon the PKC 2008 identification scheme of Lyubashevsky. It is stateless, has 4 moves, and its security is based on the hardness of worst-case problems in ideal lattices.

Proceedings Article
01 Jan 2009
TL;DR: This paper presents a generally applicable taxonomy for ensuring compliance that can be consulted for analyzing, comparing and developing enforcement strategies and that will hopefully stimulate research in this area.
Abstract: With the ongoing evolution from closed to open distributed systems and the lifting of the assumption that agents acting in such a system do not pursue own goals and act in the best interest of the society, new problems arise. One of them is that compliance cannot be assumed necessarily and consequently trust issues arise. One way of tackling this problem is by regulating the behavior of the agents with the help of institutions. However, for institutions to function effectively their compliance needs to be ensured. Using a utility computing scenario as sample application, this paper presents a generally applicable taxonomy for ensuring compliance that can be consulted for analyzing, comparing and developing enforcement strategies and that will hopefully stimulate research in this area.

Proceedings Article
01 Jan 2009
TL;DR: The goal of this Dagstuhl seminar was to discuss this important paradigm shift in an interdisciplinary international community of key researchers, to investigate innovative research methodologies and to deepen the scientific understanding of this topic, which is highly relevant for the economic success of future mobile and fixed communication services.
Abstract: From May 05 to May 08, 2009, the Dagstuhl Seminar 09192 "From Quality of Service to Quality of Experience" was held in Schloss Dagstuhl – Leibniz Center for Informatics. The notion of Quality of Service has served as a central research topic in communication networks for more than a decade, however usually starting from a rather technical view on service quality. Therefore, recently the notion of Quality of Experience has emerged, redirecting the focus towards the end user and trying to quantify her subjective experience gained from using a service. The goal of this Dagstuhl seminar was to discuss this important paradigm shift in an interdisciplinary international community of key researchers, to investigate innovative research methodologies and to deepen the scientific understanding of this topic, which is highly relevant for the economic success of future mobile and fixed communication services.

Proceedings Article
01 Jan 2009
TL;DR: Seven phrases are proposed around which computational creativity researchers could rally, drawn from twelve years of immersion in the field, during which the author's automated mathematician and automated painter have created artefacts that the author believes are of real value to society.
Abstract: I understand that simulating creative processes by computer can enhance our understanding of creativity in humans. I also understand that there is more need than ever for software to help people to be more efficient in creative jobs. And I know that computational creativity research can be of great value in both these areas. However, I'm really only interested in the intellectual challenge of enabling nuts and bolts machines - bits and bytes computers - to create artefacts of real cultural value to society. Such behaviour used to be thought of as divinely inspired, no less than a gift from the Gods. This is why it is a worthy challenge for me to bet my career against. Building a truly computationally creative machine is as much a societal as a technical challenge, and it will need computational creativity researchers to come together in consensus about certain aspects of their field. To this end, I have written here seven phrases around which we could rally (or about which we could debate - which may also be healthy). I present the ideas from which the phrases emerged with little argumentation, in the tradition of a position paper. They are drawn from twelve years of immersion in the field of computational creativity during which I've written an automated mathematician (HR) and an automated painter (The Painting Fool), and they have created artefacts which I believe are of real value to society.

Proceedings Article
01 Jan 2009
TL;DR: For a particular class of supermodular cost cooperative games that arises from a scheduling problem, it is shown that the Shapley value (which, in this case, is computable in polynomial time) is in the least core, while computing the least core value is NP-hard.
Abstract: We apply ideas from cooperative game theory to study situations in which a set of agents face supermodular costs. These situations appear in a variety of scheduling contexts, as well as in some settings related to facility location and network design. Intuitively, cooperation amongst rational agents who face supermodular costs is unlikely. However, in circumstances where the failure to cooperate may lead to negative externalities, one might be interested in methods of encouraging cooperation. The least core value of a cooperative game is the minimum penalty we need to charge a coalition for acting independently that encourages cooperation by ensuring the existence of an efficient and stable cost allocation. The set of all such cost allocations is called the least core. In this work, we study the computational complexity and approximability of computing the least core value of supermodular cost cooperative games. We show that computing the least core value of supermodular cost cooperative games is strongly NP-hard, and build a framework to approximate the least core value of these games using oracles that approximately determine maximally violated constraints. This framework yields a (3 + ε)-approximation algorithm for computing the least core value of supermodular cost cooperative games. As a by-product, we show how to compute accompanying approximate least core cost allocations for these games. We also apply our approximation framework to obtain better results for two particular classes of supermodular cost cooperative games that arise from scheduling and matroid optimization.
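To make the least core definition concrete, the small worked example below computes the least core value of a toy 3-player supermodular cost game (coalition cost |S|^2, numbers made up) by linear programming; it does not use the paper's approximation framework, which targets exactly the cases where this exponentially large LP is intractable.

```python
# Least core value of a tiny 3-player cost game, computed exactly by LP with
# scipy.optimize.linprog.  The cost c(S) = |S|^2 is supermodular and made up;
# the paper's point is that exact computation is NP-hard for such games in
# general, hence the approximation framework.
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

players = (0, 1, 2)
cost = {frozenset(S): c for S, c in {
    (0,): 1, (1,): 1, (2,): 1,
    (0, 1): 4, (0, 2): 4, (1, 2): 4,
    (0, 1, 2): 9,
}.items()}

# Variables: (x_0, x_1, x_2, z).  Minimize z subject to
#   x(S) - z <= c(S) for every nonempty proper coalition S, and x(N) = c(N).
A_ub, b_ub = [], []
for size in (1, 2):
    for S in combinations(players, size):
        A_ub.append([1.0 if i in S else 0.0 for i in players] + [-1.0])
        b_ub.append(cost[frozenset(S)])

A_eq = [[1.0, 1.0, 1.0, 0.0]]
b_eq = [cost[frozenset(players)]]
c_obj = [0.0, 0.0, 0.0, 1.0]

res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * 4, method="highs")
x, z = res.x[:3], res.x[3]
print("least core value z* =", round(z, 4), "allocation x =", np.round(x, 4))
```

For these numbers the optimal value is z* = 2, attained for instance by the symmetric allocation (3, 3, 3).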

Proceedings Article
01 Jan 2009
TL;DR: In this article, a new implicit formulation for shift scheduling problems, using context-free grammars to model regulation in the composition of shifts, is presented, and an integer programming model allowing the same set of shifts as Dantzig's set covering model is presented.
Abstract: We present a new implicit formulation for shift scheduling problems, using context-free grammars to model regulation in the composition of shifts. From the grammar, we generate an integer programming (IP) model allowing the same set of shifts as Dantzig's set covering model. When solved by a state-of-the-art IP solver on problems allowing a small number of shifts, our model, the set covering formulation and a typical implicit model from the literature yield comparable solving times. Moreover, on instances where many shifts are allowed, our model is superior and can encode a wider variety of constraints. Among others, multi-activity cases, which cannot be modeled by existing implicit formulations, can easily be captured with grammars.
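A toy sketch of the modelling idea: a small context-free grammar describes admissible shifts (two work stretches separated by a break), and enumerating its language yields the set of shifts that a set-covering or implicit IP model would then select from. The grammar and the length choices are invented for illustration, not taken from the paper's test cases.

```python
# Toy illustration of modelling shift composition with a context-free grammar:
# enumerate the words of a small grammar over {w, b} (work, break) to obtain
# the set of admissible shifts an IP model would then choose among.
GRAMMAR = {
    "S": [["W", "b", "W"]],            # a shift = work block, break, work block
    "W": [["w", "w"], ["w", "w", "w"], ["w", "w", "w", "w"]],
}

def expand(symbols):
    """All terminal strings derivable from the sequence `symbols`."""
    if not symbols:
        yield ""
        return
    head, rest = symbols[0], symbols[1:]
    if head in GRAMMAR:                               # non-terminal: expand it
        for production in GRAMMAR[head]:
            yield from expand(production + rest)
    else:                                             # terminal: keep it
        for tail in expand(rest):
            yield head + tail

shifts = sorted(set(expand(["S"])))
print(shifts)   # ['wwbww', 'wwbwww', ..., 'wwwwbwwww']  (9 admissible shifts)
```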

Proceedings Article
01 Jan 2009
TL;DR: This paper discusses how composition can be applied to RESTful services in order to foster their reuse and identifies a number of challenges for current service composition languages and technologies.
Abstract: Composition is one of the central tenets of service oriented computing. This paper discusses how composition can be applied to RESTful services in order to foster their reuse. Given the specific constraints of the REST architectural style, a number of challenges for current service composition languages and technologies are identified to point out future research directions.

Proceedings Article
01 Jan 2009
TL;DR: In this paper, the authors describe the main characteristics of disaster supply chains and highlight the particular issues that are faced when managing these supply chains, and illustrate how Operations Research tools can be used to make better decisions, taking debris management operations as an example.
Abstract: Disasters recently received the attention of the Operations Research community due to the great potential of improving disaster related operations through the use of analytical tools, and the impact on people that this implies. In this introductory article, we describe the main characteristics of disaster supply chains, and we highlight the particular issues that are faced when managing these supply chains. We illustrate how Operations Research tools can be used to make better decisions, taking debris management operations as an example, and discuss potential general research directions in this area.