
Showing papers presented at "Dagstuhl Seminar Proceedings in 2007"


Proceedings Article
01 Jan 2007
TL;DR: Cosley et al. as mentioned in this paper show that despite the popularity of online communities, many such communities fail due to nonparticipation and under-contribution, and that 50% of social, hobby, and work mailing lists had no traffic over a 122-day period.

Abstract: With the increasing popularity of the Internet, information technology is changing the way we interact, entertain, communicate and consume. Concurrently, traditional social forums, such as the League of Women Voters, the United Way, or the monthly bridge club, have seen a decrease (Putnam 2000). Supporting thousands of online communities, the Internet poses an opportunity to create new social capital to replace what is lost by the decline of bowling leagues and fraternal societies. In online communities, groups of people meet to share information, discuss mutual interests, play games and carry out business. Users of communities such as SourceForge (http://sourceforge.net/) and Wikipedia (http://www.wikipedia.org/) contribute information goods, which are typically shared as public goods. However, despite the popularity of online communities, many such communities fail due to nonparticipation and under-contribution. For example, Butler (2001) found that 50% of social, hobby, and work mailing lists had no traffic over a 122-day period. Under-contribution is a problem even in active and successful online communities. For example, in MovieLens (http://www.movielens.org), an online movie recommendation website that invites users to rate movies and, in return, makes personalized recommendations and predictions for movies the user has not already rated, under-contribution is common. More than 22% of the movies listed on the site have fewer than 40 ratings, so few that the software cannot make accurate predictions about which users would like these movies (Cosley, Ludford

344 citations


Proceedings Article
01 Jan 2007
TL;DR: In this article, the authors discuss the notion of software redundancy, cloning, duplication, and similarity, and describe various categorizations of clone types, empirical studies on the root causes for cloning, current opinions and wisdom of consequences of cloning, ways to remove, to avoid, and to detect them, empirical evaluations of existing automatic clone detector performance and their fitness for a particular purpose, benchmarks for clone detector evaluations, presentation issues, and last but not least application of clone detection in other related fields.
Abstract: Summary. This report summarizes my overview talk on software clone detection research. It first discusses the notion of software redundancy, cloning, duplication, and similarity. Then, it describes various categorizations of clone types, empirical studies on the root causes for cloning, current opinions and wisdom of consequences of cloning, empirical studies on the evolution of clones, ways to remove, to avoid, and to detect them, empirical evaluations of existing automatic clone detector performance (such as recall, precision, time and space consumption) and their fitness for a particular purpose, benchmarks for clone detector evaluations, presentation issues, and last but not least the application of clone detection in other related fields. After each summary of a subarea, I list open research questions.

283 citations


Proceedings Article
01 Jan 2007
TL;DR: The mCRL2 language, as discussed by the authors, is a specification language that can be used to specify and analyse the behaviour of distributed systems; it is the successor of the mCRL specification language.
Abstract: We introduce mCRL2, a specification language that can be used to specify and analyse the behaviour of distributed systems. This language is the successor of the mCRL specification language. The mCRL2 language extends a timed basic process algebra with the possibility to define and use abstract data types. The mCRL2 data language features predefined and higher-order data types. The process algebraic part of mCRL2 allows a faithful translation of coloured Petri nets and component based systems: we have introduced multiactions and we have separated communication and parallelism.

175 citations


Proceedings Article
01 Jan 2007
TL;DR: In this paper, the authors introduce Normative Temporal Logic (NTL), a generalisation of the well-known branching-time temporal logic CTL, in which path quantifiers are replaced by indexed deontic operators.

Abstract: We introduce Normative Temporal Logic (NTL), a logic for reasoning about normative systems. NTL is a generalisation of the well-known branching-time temporal logic CTL, in which the path quantifiers A ("on all paths ...") and E ("on some path ...") are replaced by the indexed deontic operators O_s and P_s, where, for example, O_s phi means "phi is obligatory in the context of normative system s". After defining the logic, we give a sound and complete axiomatisation, and discuss the logic's relationship to standard deontic logics. We present a symbolic representation language for models and normative systems, and identify four different model checking problems, corresponding to whether or not a model is represented symbolically or explicitly, and whether or not we are given an interpretation for the normative systems named in formulae to be checked. We show that the complexity of model checking varies from P-complete up to EXPTIME-hard for these variations.

87 citations


Proceedings Article
01 Jan 2007
TL;DR: It is shown that with an appropriate modification, the Sinkhorn-Knopp algorithm is a natural candidate for computing the measure on enormous data sets.
Abstract: As long as a square nonnegative matrix $A$ contains sufficient nonzero elements, the Sinkhorn-Knopp algorithm can be used to balance the matrix, that is, to find a diagonal scaling of $A$ that is doubly stochastic. We relate balancing to problems in traffic flow and describe how balancing algorithms can be used to give a two sided measure of nodes in a graph. We show that with an appropriate modification, the Sinkhorn-Knopp algorithm is a natural candidate for computing the measure on enormous data sets.
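The alternating row/column rescaling at the heart of the Sinkhorn-Knopp algorithm can be sketched in a few lines. This is a minimal illustration of matrix balancing in general, not the paper's modified algorithm for enormous data sets; the function name and tolerance are my own choices, and the sketch assumes a strictly positive matrix so that convergence is guaranteed.

```python
import numpy as np

def sinkhorn_knopp(A, iters=1000, tol=1e-9):
    """Find diagonal scalings r, c such that diag(r) @ A @ diag(c)
    is (approximately) doubly stochastic. Assumes A is strictly
    positive, which guarantees convergence."""
    A = np.asarray(A, dtype=float)
    r = np.ones(A.shape[0])
    c = np.ones(A.shape[1])
    for _ in range(iters):
        r = 1.0 / (A @ c)        # rescale so row sums become 1
        c = 1.0 / (A.T @ r)      # rescale so column sums become 1
        S = A * np.outer(r, c)
        if np.allclose(S.sum(axis=0), 1, atol=tol) and np.allclose(S.sum(axis=1), 1, atol=tol):
            break
    return A * np.outer(r, c), r, c

S, r, c = sinkhorn_knopp(np.array([[1.0, 2.0],
                                   [3.0, 4.0]]))
# all row and column sums of S are now close to 1
```

The diagonal scalings r and c are what the balancing produces; the paper's "two sided measure" of graph nodes is derived from such scalings, not shown here.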

79 citations


Proceedings Article
01 Jan 2007
TL;DR: In this paper, the authors discuss ten philosophical problems in deontic logic: how to formally represent norms, when a set of norms may be termed coherent, how to deal with normative conflicts, how contrary-to-duty obligations can be appropriately modeled, how dyadic operators may be redefined to relate to sets of norms instead of preference relations between possible worlds, how various concepts of permission can be accommodated, how meaning postulates and counts-as conditionals can be taken into account, and how sets of norms may be revised and merged.

Abstract: The paper discusses ten philosophical problems in deontic logic: how to formally represent norms, when a set of norms may be termed 'coherent', how to deal with normative conflicts, how contrary-to-duty obligations can be appropriately modeled, how dyadic deontic operators may be redefined to relate to sets of norms instead of preference relations between possible worlds, how various concepts of permission can be accommodated, how meaning postulates and counts-as conditionals can be taken into account, and how sets of norms may be revised and merged. The problems are discussed from the viewpoint of input/output logic as developed by van der Torre & Makinson. We argue that norms, not ideality, should take the central position in deontic semantics, and that a semantics that represents norms, as input/output logic does, provides helpful tools for analyzing, clarifying and solving the problems of deontic logic.

70 citations


Proceedings Article
01 Jan 2007
TL;DR: The main scientific aim of the EMIL project is to construct a simulator for exploring and experimenting with norm innovation, and the authors endeavour to point out that social dynamics in societies of intelligent agents are necessarily bi-directional, which adds complexity to the emergence processes.

Abstract: In this paper we will present the EMIL project, "EMergence In the Loop: Simulating the two-way dynamics of norm innovation", a three-year project funded by the European Commission (Sixth Framework Programme - Information Society and Technologies) in the framework of the initiative "Simulating Emergent Properties in Complex Systems". The EMIL project intends to contribute to the study of social complex systems by modelling norm innovation as a phenomenon implying interrelationships among multiple levels. It shall endeavour to point out that social dynamics in societies of intelligent agents are necessarily bi-directional, which adds complexity to the emergence processes. The micro-macro link will be modelled and observed in the emergence of properties at the macro-level and their immergence into the micro-level units. The main scientific aim of the EMIL project is to construct a simulator for exploring and experimenting with norm innovation.

65 citations


Proceedings Article
01 Jan 2007
TL;DR: No finite protocol, even an unbounded one, can guarantee an envy-free division of a cake among three or more players if each player is to receive a single connected piece, as shown in this paper.
Abstract: No finite protocol (even if unbounded) can guarantee an envy-free division of a cake among three or more players, if each player is to receive a single connected piece.

64 citations


Proceedings Article
01 Jan 2007
TL;DR: This paper provides a brief introduction to the issue of measuring similarity between malicious programs, and how evolution is known to occur in the area, and uses this review to try to draw lines that connect research in software engineering to problems in anti-malware research.

Abstract: In software engineering contexts software may be compared for similarity in order to detect duplicate code that indicates poor design, and to reconstruct evolution history. Malicious software, being nothing other than a particular type of software, can also be compared for similarity in order to detect commonalities and evolution history. This paper provides a brief introduction to the issue of measuring similarity between malicious programs, and how evolution is known to occur in the area. It then uses this review to try to draw lines that connect research in software engineering (e.g., on "clone detection") to problems in anti-malware research.

60 citations


Journal ArticleDOI
01 Sep 2007
TL;DR: It is shown that a Simple Stochastic Game (SSG) can be formulated as an LP-type problem, and the first strongly subexponential solution for SSGs is obtained (a strongly subexponential algorithm has only been known for binary SSGs [L]).
Abstract: We show that a Simple Stochastic Game (SSG) can be formulated as an LP-type problem. Using this formulation, and the known algorithm of Sharir and Welzl [SW] for LP-type problems, we obtain the first strongly subexponential solution for SSGs (a strongly subexponential algorithm has only been known for binary SSGs [L]). Using known reductions between various games, we achieve the first strongly subexponential solutions for Discounted and Mean Payoff Games. We also give alternative simple proofs for the best known upper bounds for Parity Games and binary SSGs. To the best of our knowledge, the LP-type framework has been used so far only in order to yield linear or close to linear time algorithms for various problems in computational geometry and location theory. Our approach demonstrates the applicability of the LP-type framework in other fields, and for achieving subexponential algorithms.

56 citations


Proceedings Article
01 Jan 2007
TL;DR: In this paper, the authors introduce the research issues related to, and a definition of, normative multi-agent systems.

Abstract: This article introduces the research issues related to, and a definition of, normative multiagent systems.

Proceedings Article
01 Jan 2007
TL;DR: In this article, a typology of social collectives, including collection of agents, knowledge community, intentional collective, and intentional normative collective, is proposed, based on the formal-ontological paradigm of Constructive Descriptions and Situations.
Abstract: Based on the formal-ontological paradigm of Constructive Descriptions and Situations, we propose a definition of social collectives that includes social agents, plans, norms, and the conceptual relations between them. We also propose a typology of social collectives, including collection of agents, knowledge community, intentional collective, and intentional normative collective. Our ontology, represented as a first-order theory, provides the expressivity to talk about the contexts (social, informational, circumstantial, and epistemic) in which collectives make and produce sense.

Proceedings Article
01 Jan 2007
TL;DR: HDR is defined as the retrieval of relevant historic documents for a modern query; the approach is to treat the historic and modern languages as different languages, and to use cross-language information retrieval (CLIR) techniques to translate one language into the other.

Abstract: Our cultural heritage, as preserved in libraries, archives and museums, is made up of documents written many centuries ago. Large-scale digitization initiatives, like DigiCULT, make these documents available to non-expert users through digital libraries and vertical search engines. For a user, querying a historic document collection may be a disappointing experience. Natural languages evolve over time, changing in pronunciation and spelling, and new words are introduced continuously, while older words may disappear out of everyday use. For these reasons, queries involving modern words may not be very effective for retrieving documents that contain many historic terms. Although reading a 300-year-old document might not be problematic because the words are still recognizable, the changes in vocabulary and spelling can make it difficult to use a search engine to find relevant documents. To illustrate this, consider the following example from our collection of 17th century Dutch law texts. Looking for information on the tasks of a lawyer (modern Dutch: advocaat) in these texts, the modern spelling will not lead you to documents containing the 17th century Dutch spelling variant advocaet. Since spelling rules were not introduced until the 19th century, 17th century Dutch spelling is inconsistent. Being based mainly on pronunciation, words were often spelled in several different variants, which poses a problem for standard retrieval engines. We therefore define Historic Document Retrieval (HDR) as the retrieval of relevant historic documents for a modern query. Our approach to this problem is to treat the historic and modern languages as different languages, and use cross-language information retrieval (CLIR) techniques to translate one language into the other.
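As a toy illustration of matching a modern query term against historic spelling variants (not the CLIR machinery the abstract describes), a simple character-level similarity already ranks the abstract's 17th-century variant first. The candidate word list below is invented for the example.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Character-level similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

query = "advocaat"  # modern Dutch spelling, as in the abstract's example
# hypothetical historic terms from a 17th-century collection
candidates = ["advocaet", "wetboeck", "coopman"]
ranked = sorted(candidates, key=lambda w: similarity(query, w), reverse=True)
print(ranked[0])  # the historic variant "advocaet" ranks first
```

Real systems for this task use more robust machinery (phonetic keys, learned rewrite rules, or the CLIR translation approach the authors propose), but fuzzy string matching conveys the core difficulty: the modern and historic forms differ by only a character or two.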

Proceedings Article
01 Jan 2007
TL;DR: This paper introduces a construct for the definition of norms in the design of artificial institutions, expressed in terms of roles and event times, which, when certain activating events take place, generates commitments for the agents playing certain roles.
Abstract: In this paper we investigate two important and related aspects of the formalization of open interaction systems: how to specify norms, and how to enforce them by means of sanctions. The problem of specifying the sanctions associated with the violation of norms is crucial in an open system because, given that the compliance of autonomous agents to obligations and prohibitions cannot be taken for granted, norm enforcement is necessary to constrain the possible evolutions of the system, thus obtaining a degree of predictability that makes it rational for agents to interact with the system. In our model, norms are specified declaratively. When certain events take place, norms become active and generate pending commitments for the agents playing certain roles. Norms also specify the sanctions associated with their violation. In the paper, we analyze the concept of sanction in detail and propose a mechanism through which sanctions can be applied.

Proceedings Article
01 Jan 2007
TL;DR: In this article, the authors survey the role of digital historical documents and report on their own work in the area with a focus on special matching strategies that help to relate modern language keywords with old variants.
Abstract: With the renewed interest in historical documents, the insight has grown that electronic access to these texts raises many specific problems. In the first part of the paper we survey the present role of digital historical documents. After collecting central facts and observations on historical language change, we comment on the difficulties that result for retrieval and data mining on historical texts. In the second part of the paper we report on our own work in the area, with a focus on special matching strategies that help to relate modern-language keywords to old variants. The basis of our studies is a collection of documents from the Early New High German period. These texts come with a very rich spectrum of word variants and spelling variations.

Proceedings Article
01 Jan 2007
TL;DR: The construction employs "combinatorial" hashing as an underlying building block (like Universal Hashing for cryptographic message authentication by Wegman and Carter) and runs at rate 1, thus improving on a similar rate-1/2 approach by Hirose (FSE 2006).

Abstract: This paper proposes a construction for collision resistant $2n$-bit hash functions, based on $n$-bit block ciphers with $2n$-bit keys. The construction is analysed in the ideal cipher model; for $n=128$ an adversary would need roughly $2^{122}$ units of time to find a collision. The construction employs "combinatorial" hashing as an underlying building block (like Universal Hashing for cryptographic message authentication by Wegman and Carter). The construction runs at rate 1, thus improving on a similar rate-1/2 approach by Hirose (FSE 2006).

Proceedings Article
01 Jan 2007
TL;DR: In this article, the authors propose a new mechanism that combines a restrictive license auction with royalty licensing, which is more profitable than standard license auctions, auctioning royalty contracts, fixed-fee licensing, pure royalty licensing and two-part tariffs.
Abstract: This paper revisits the licensing of a non-drastic process innovation by an outside innovator to a Cournot oligopoly. We propose a new mechanism that combines a restrictive license auction with royalty licensing. This mechanism is more profitable than standard license auctions, auctioning royalty contracts, fixed-fee licensing, pure royalty licensing, and two-part tariffs. The key features are that royalty contracts are auctioned and that losers of the auction are granted the option to sign a royalty contract. Remarkably, combining royalties for winners and losers makes the integer constraint concerning the number of licenses irrelevant.

Proceedings Article
01 Dec 2007
TL;DR: A normative organization system composed of a normative organization modeling language, MOISE Inst, used to define the normative organization of a MAS, accompanied by SYNAI, a normative organization implementation architecture which is itself regulated with an explicit normative organization specification.

Abstract: In recent years, social and organizational aspects of agency have become a major issue in multi-agent systems research. Recent applications of MAS reinforce the need to use these aspects in order to ensure some social order within these systems. Tools to control and regulate the overall functioning of the system are needed in order to enforce global laws on the autonomous agents operating in it. This paper presents a normative organization system composed of a normative organization modeling language, MOISE Inst, used to define the normative organization of a MAS, accompanied by SYNAI, a normative organization implementation architecture which is itself regulated with an explicit normative organization specification.

Proceedings Article
01 Jan 2007
TL;DR: In this article, the semantics of Parikh's axiom (P) for relevance-sensitive belief revision are studied in the form of constraints on system-of-spheres.
Abstract: Parikh's axiom (P) for relevance-sensitive belief revision is studied. Sound and complete semantics for axiom (P) are provided in the form of constraints on systems of spheres.

Proceedings Article
01 Jan 2007
TL;DR: The paper is intended to be a starting point for a more comprehensive analysis of the subject of similarity in programs, which is critical to understand if progress is to be made in fields such as clone detection.
Abstract: An overview of the concept of program similarity is presented. It divides similarity into two types, syntactic and semantic, and provides a review of eight categories of methods that may be used to measure program similarity. A summary of some applications of these methods is included. The paper is intended to be a starting point for a more comprehensive analysis of the subject of similarity in programs, which is critical to understand if progress is to be made in fields such as clone detection.
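One of the simplest syntactic measures in the space this paper surveys is set overlap of lexical tokens. A hypothetical minimal version (regex tokenizer, Jaccard index) might look like this; it is an illustration of the general idea, not any specific method from the paper's eight categories.

```python
import re

def token_jaccard(code_a, code_b):
    """Jaccard similarity over the sets of word-like tokens
    (identifiers and keywords) in two code fragments."""
    tokens = lambda s: set(re.findall(r"[A-Za-z_]\w*", s))
    a, b = tokens(code_a), tokens(code_b)
    return len(a & b) / len(a | b) if a | b else 1.0

f1 = "def add(x, y):\n    return x + y"
f2 = "def add(a, b):\n    return a + b"
score = token_jaccard(f1, f2)  # 3 shared tokens out of 7 distinct
```

Note how a pure token-set measure is fooled by renamed variables, which is exactly why the paper distinguishes syntactic from semantic similarity.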

Proceedings Article
01 Jan 2007
TL;DR: The authors explain the raison d'être and basic ideas of their game-theoretic approach to normative multi-agent systems, sketching the central elements with pointers to other publications for detailed developments.

Abstract: We explain the raison d'être and basic ideas of our game-theoretic approach to normative multiagent systems, sketching the central elements with pointers to other publications for detailed developments.

Proceedings Article
01 Jan 2007
TL;DR: In this paper, the authors present a formal normative framework for agent-based systems that facilitates their implementation, and consider the perspective of individual agents and what they might need to effectively reason about the society in which they participate.
Abstract: One of the key issues in the computational representation of open societies relates to the introduction of norms that help to cope with the heterogeneity, the autonomy and the diversity of interests among their members. Research regarding this issue presents two omissions. One is the lack of a canonical model of norms that facilitates their implementation, and that allows us to describe the processes of reasoning about norms. The other refers to considering, in the model of normative multi-agent systems, the perspective of individual agents and what they might need to effectively reason about the society in which they participate. Both are the concerns of this paper, and the main objective is to present a formal normative framework for agent-based systems that facilitates their implementation.

Proceedings Article
01 Jan 2007
TL;DR: A survey of the use of homotopy methods in game theory, explaining how the main ideas can be extended to compute equilibria in extensive-form and dynamic games, and how homotopies can be used to compute all Nash equilibria.
Abstract: This paper presents a survey of the use of homotopy methods in game theory. Homotopies allow for a robust computation of game-theoretic equilibria and their refinements. Homotopies are also suitable to compute equilibria that are selected by various selection theories. We present the relevant techniques underlying homotopy algorithms. We give detailed expositions of the Lemke-Howson algorithm and the van den Elzen-Talman algorithm to compute Nash equilibria in 2-person games, and the Herings-van den Elzen, Herings-Peeters, and McKelvey-Palfrey algorithms to compute Nash equilibria in general $n$-person games. We explain how the main ideas can be extended to compute equilibria in extensive form and dynamic games, and how homotopies can be used to compute all Nash equilibria.

Proceedings Article
Martin Becker
01 Jan 2007
TL;DR: This paper characterizes the Ambient Assisted Living domain, briefly presents relevant software architecture trends, especially applicable styles and patterns, and discusses promising software technology already available to solve the problems.

Abstract: Driven by the ongoing demographic, structural, and social changes in all modern, industrialized countries, there is huge interest these days in IT-based equipment and services that enable independent living for people with specific needs. Despite promising concepts, approaches and technology, such systems are still more a vision than a reality. In order to pave the way towards a common understanding of the problem and of overall software solution approaches, this paper (i) characterizes the Ambient Assisted Living domain, (ii) briefly presents relevant software architecture trends, especially applicable styles and patterns, and (iii) discusses promising software technology already available to solve the problems.

Proceedings Article
19 Sep 2007
TL;DR: In this article, the authors present a Service-Oriented Architecture (SOA)-based architecture framework, which is designed to be close to industry standards, especially the Service Component Architecture (SCA).

Abstract: We present a Service-Oriented Architecture (SOA)-based architecture framework. The architecture framework is designed to be close to industry standards, especially the Service Component Architecture (SCA). The framework is language independent, and the building blocks of each system, activities and data, are first-class citizens. We present a meta model of the architecture framework and discuss its concepts in detail. Through the framework, concepts of an SOA such as wiring, correlation and instantiation can be clarified.

Proceedings Article
01 Jan 2007
TL;DR: A small study, performed at an international workshop, elicited judgments and discussions from world experts regarding what characteristics define a code clone; less than half of the clone candidates judged had 80% agreement amongst the judges.

Abstract: An objective definition of what a code clone is currently eludes the field. A small study was performed at an international workshop to elicit judgments and discussions from world experts regarding what characteristics define a code clone. Less than half of the clone candidates judged had 80% agreement amongst the judges. Judges appeared to differ primarily in their criteria for judgment rather than their interpretation of the clone candidates. In subsequent open discussion the judges provided several reasons for their judgments. The study casts additional doubt on the reliability of experimental results in the field when the full criterion for clone judgment is not spelled out.

Proceedings Article
01 Jan 2007
TL;DR: This work considers one-round games between a classical verifier and two provers who share entanglement and shows that when the constraints enforced by the verifier are ‘unique’ constraints, the value of the game can be well approximated by a semidefinite program.
Abstract: We consider one-round games between a classical verifier and two provers who share entanglement. We show that when the constraints enforced by the verifier are ‘unique’ constraints (i.e., permutations), the value of the game can be well approximated by a semidefinite program. Essentially the only algorithm known previously was for the special case of binary answers, as follows from the work of Tsirelson in 1980. Among other things, our result implies that the variant of the unique games conjecture where we allow the provers to share entanglement is false. Our proof is based on a novel ‘quantum rounding technique’, showing how to take a solution to an SDP and transform it to a strategy for entangled provers.

Proceedings Article
01 Jan 2007
TL;DR: In this article, the authors propose the use of global constraints to bias the design of global patterns according to the expectations of the user, so that a global pattern can be oriented towards the search for exceptions or a clustering.

Abstract: It is well known that local patterns are at the core of a lot of knowledge which may be discovered from data. Nevertheless, the use of local patterns is limited by their huge number and computational costs. Several approaches (e.g., condensed representations, pattern set discovery) aim at grouping or synthesizing local patterns to provide a global view of the data. A global pattern is a pattern which is a set or a synthesis of local patterns coming from the data. In this paper, we propose the idea of global constraints to write queries addressing global patterns. A key point is the ability to bias the design of global patterns according to the expectations of the user. For instance, a global pattern can be oriented towards the search for exceptions or a clustering. This requires writing queries that take such biases into account. Open issues are to design a generic framework to express powerful global constraints and solvers to mine them. We think that global constraints are a promising way to discover relevant global patterns.

Proceedings Article
01 Jan 2007
TL;DR: Issues of relevance in designing high-level programming languages dedicated to reactivity on the Web are investigated and twelve theses on features desirable for a language of reactive rules tuned to programming Web and Semantic Web applications are presented.
Abstract: Reactivity, the ability to detect and react to events, is an essential functionality in many information systems. In particular, Web systems such as online marketplaces, adaptive (e.g., recommender) sys- tems, and Web services, react to events such as Web page updates or data posted to a server. This article investigates issues of relevance in designing high-level programming languages dedicated to reactivity on the Web. It presents twelve theses on features desirable for a language of reactive rules tuned to programming Web and Semantic Web applications.

Proceedings Article
01 Jan 2007
TL;DR: In this article, a mathematical analysis of PageRank derivatives is presented, and it is shown that for real-world graphs values of $\alpha$ close to $1$ do not give a more meaningful ranking.

Abstract: PageRank is defined as the stationary state of a Markov chain. The chain is obtained by perturbing the transition matrix induced by a web graph with a damping factor $\alpha$ that spreads uniformly part of the rank. The choice of $\alpha$ is eminently empirical, and in most cases the original suggestion $\alpha=0.85$ by Brin and Page is still used. In this paper, we give a mathematical analysis of PageRank when $\alpha$ changes. In particular, we show that, contrary to popular belief, for real-world graphs values of $\alpha$ close to $1$ do not give a more meaningful ranking. Then, we give closed-form formulae for PageRank derivatives of any order, and by proving that the $k$-th iteration of the Power Method gives exactly the PageRank value obtained using a Maclaurin polynomial of degree $k$, we show how to obtain an approximation of the derivatives. Finally, we view PageRank as a linear operator acting on the preference vector and show a tight connection between iterated computation and derivation.
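The dependence on the damping factor is easy to experiment with using the standard power method the abstract refers to. This is a generic textbook sketch, not the paper's derivative analysis; the three-node graph is invented for illustration, and dangling nodes (zero out-degree) are sent uniformly to all pages.

```python
import numpy as np

def pagerank(adj, alpha=0.85, iters=200):
    """Power-method PageRank: x <- alpha * x P + (1 - alpha) * v,
    where v is the uniform preference vector."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    # row-normalize the adjacency matrix; dangling nodes jump uniformly
    P = np.where(out > 0, adj / np.where(out == 0, 1, out), 1.0 / n)
    v = np.full(n, 1.0 / n)
    x = v.copy()
    for _ in range(iters):
        x = alpha * (x @ P) + (1 - alpha) * v
    return x

adj = np.array([[0, 1, 1],
                [0, 0, 1],
                [1, 0, 0]], dtype=float)
x = pagerank(adj, alpha=0.85)  # a probability vector: entries sum to 1
```

Rerunning with alpha closer to 1 shifts more weight onto the link structure and less onto the uniform preference vector, which is the trade-off whose effect on ranking quality the paper analyses.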