
Showing papers in "Lecture Notes in Computer Science in 2007"


Journal Article
TL;DR: In this paper, the authors describe an ultra-lightweight block cipher, PRESENT, designed for extremely constrained environments such as RFID tags and sensor networks, for which the AES is not suitable.
Abstract: With the establishment of the AES the need for new block ciphers has been greatly diminished; for almost all block cipher applications the AES is an excellent and preferred choice. However, despite recent implementation advances, the AES is not suitable for extremely constrained environments such as RFID tags and sensor networks. In this paper we describe an ultra-lightweight block cipher, PRESENT. Both security and hardware efficiency have been equally important during the design of the cipher and at 1570 GE, the hardware requirements for PRESENT are competitive with today's leading compact stream ciphers.

1,750 citations
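The SP-network structure of PRESENT described above can be sketched in a few lines. The model below uses the cipher's published 4-bit S-box and bit permutation, but the round keys are supplied directly: the 80-bit key schedule is omitted, so this is an illustrative sketch of the round structure, not a test-vector-compatible implementation.

```python
# PRESENT-like SP-network: 64-bit block, 31 rounds + final key whitening.
# Round keys are taken as given; the real 80/128-bit key schedule is omitted.

SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]
INV_SBOX = [SBOX.index(i) for i in range(16)]

# Bit permutation: bit i moves to position (16*i) mod 63; bit 63 stays put.
P = [(16 * i) % 63 if i != 63 else 63 for i in range(64)]
INV_P = [P.index(i) for i in range(64)]

def sbox_layer(state, box):
    out = 0
    for nib in range(16):                      # 16 parallel 4-bit S-boxes
        out |= box[(state >> (4 * nib)) & 0xF] << (4 * nib)
    return out

def perm_layer(state, perm):
    out = 0
    for i in range(64):                        # move bit i to perm[i]
        out |= ((state >> i) & 1) << perm[i]
    return out

def encrypt(block, round_keys):
    state = block
    for k in round_keys[:-1]:                  # 31 full rounds
        state ^= k
        state = sbox_layer(state, SBOX)
        state = perm_layer(state, P)
    return state ^ round_keys[-1]              # final whitening key

def decrypt(block, round_keys):
    state = block ^ round_keys[-1]
    for k in reversed(round_keys[:-1]):        # undo rounds in reverse order
        state = perm_layer(state, INV_P)
        state = sbox_layer(state, INV_SBOX)
        state ^= k
    return state
```

A real implementation would derive the 32 round keys from the 80-bit key; here any list of 32 64-bit values round-trips correctly.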


BookDOI
TL;DR: This work discusses an Efficient Protocol for Secure Two-Party Computation in the Presence of Malicious Adversaries, and a Fast and Key-Efficient Reduction of Chosen-Ciphertext to Known-Plaintext Security.
Abstract: Chosen-Prefix Collisions for MD5 and Colliding X.509 Certificates for Different Identities.- Non-trivial Black-Box Combiners for Collision-Resistant Hash-Functions Don't Exist.- The Collision Intractability of MDC-2 in the Ideal-Cipher Model.- An Efficient Protocol for Secure Two-Party Computation in the Presence of Malicious Adversaries.- Revisiting the Efficiency of Malicious Two-Party Computation.- Efficient Two-Party Secure Computation on Committed Inputs.- Universally Composable Multi-party Computation Using Tamper-Proof Hardware.- Generic and Practical Resettable Zero-Knowledge in the Bare Public-Key Model.- Instance-Dependent Verifiable Random Functions and Their Application to Simultaneous Resettability.- Conditional Computational Entropy, or Toward Separating Pseudoentropy from Compressibility.- Zero Knowledge and Soundness Are Symmetric.- Mesh Signatures.- The Power of Proofs-of-Possession: Securing Multiparty Signatures against Rogue-Key Attacks.- Batch Verification of Short Signatures.- Cryptanalysis of SFLASH with Slightly Modified Parameters.- Differential Cryptanalysis of the Stream Ciphers Py, Py6 and Pypy.- Secure Computation from Random Error Correcting Codes.- Round-Efficient Secure Computation in Point-to-Point Networks.- Atomic Secure Multi-party Multiplication with Low Communication.- Cryptanalysis of the Sidelnikov Cryptosystem.- Toward a Rigorous Variation of Coppersmith's Algorithm on Three Variables.- An L(1/3 + ε) Algorithm for the Discrete Logarithm Problem for Low Degree Curves.- General Ad Hoc Encryption from Exponent Inversion IBE.- Non-interactive Proofs for Integer Multiplication.- Ate Pairing on Hyperelliptic Curves.- Ideal Multipartite Secret Sharing Schemes.- Non-wafer-Scale Sieving Hardware for the NFS: Another Attempt to Cope with 1024-Bit.- Divisible E-Cash Systems Can Be Truly Anonymous.- A Fast and Key-Efficient Reduction of Chosen-Ciphertext to Known-Plaintext Security.- Range Extension for Weak PRFs; The Good, the Bad, and the Ugly.- Feistel Networks Made Public, and Applications.- Oblivious-Transfer Amplification.- Simulatable Adaptive Oblivious Transfer.

261 citations


BookDOI
TL;DR: The main difference from well-known routing techniques is that the optimal LSP need not be the shortest path, as it is in typical on-line routing.
Abstract: In the context of dynamic bandwidth allocation, QoS path provisioning for coexisting and aggregated traffic can be a very important element of resource management. All traffic in a DiffServ/MPLS network is distributed among LSPs (Label Switching Paths). Packets are classified into FECs (Forwarding Equivalence Classes) and can be routed in relation to their CoS (Class of Service). In the process of resource management we look for the optimal LSP, taking care of concurrent flows traversing the network simultaneously. For load-balancing and congestion-avoidance purposes, LSP creation can be done off-line, possibly during the negotiation process. The main difference from well-known routing techniques is that the optimal LSP need not be the shortest path, as it is in typical on-line routing (e.g. with the OSPF protocol).

224 citations


Journal Article
TL;DR: An open problem is presented: the definition of a new notion of security that covers attacks like the ones presented here, but not more.
Abstract: We present two block cipher distinguishers in a setting where the attacker knows the key. One is a distinguisher for AES reduced to seven rounds. The second is a distinguisher for a class of Feistel ciphers with seven rounds. This setting is quite different from traditional settings. We present an open problem: the definition of a new notion of security that covers attacks like the ones we present here, but not more.

199 citations


BookDOI
TL;DR: Second International Conference, RSEISP 2014, Held as Part of JRS 2014, Granada and Madrid, Spain, July 9-13, 2014.
Abstract: Second International Conference, RSEISP 2014, Held as Part of JRS 2014, Granada and Madrid, Spain, July 9-13, 2014. Proceedings

191 citations



Journal Article
TL;DR: This paper proposes two modifications to the usual way of applying F-Race that make it suitable for tuning tasks with a very large number of initial candidate parameter settings and allow a significant reduction of the number of function evaluations without any major loss in solution quality.
Abstract: Finding appropriate values for the parameters of an algorithm is a challenging, important, and time-consuming task. While typically parameters are tuned by hand, recent studies have shown that automatic tuning procedures can effectively handle this task and often find better parameter settings. F-Race has been proposed specifically for this purpose and it has proven to be very effective in a number of cases. F-Race is a racing algorithm that starts by considering a number of candidate parameter settings and eliminates inferior ones as soon as enough statistical evidence arises against them. In this paper, we propose two modifications to the usual way of applying F-Race that, on the one hand, make it suitable for tuning tasks with a very large number of initial candidate parameter settings and, on the other hand, allow a significant reduction of the number of function evaluations without any major loss in solution quality. We evaluate the proposed modifications on a number of stochastic local search algorithms and we show their effectiveness.

183 citations
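The racing idea behind F-Race can be sketched as follows. Note this is a simplified stand-in: real F-Race eliminates candidates with the Friedman test, whereas the sketch below uses a plain mean-rank threshold (`margin`), and the function names are ours, not the paper's.

```python
import random

def race(candidates, evaluate, instances, min_instances=3, margin=1.5):
    """Race candidate parameter settings over a stream of instances,
    eliminating those whose mean rank trails the best by more than `margin`.
    (Simplified stand-in for the Friedman-test elimination in F-Race.)"""
    alive = list(candidates)
    ranks = {c: [] for c in alive}
    for step, inst in enumerate(instances, 1):
        scores = {c: evaluate(c, inst) for c in alive}   # lower is better
        ordered = sorted(alive, key=lambda c: scores[c])
        for r, c in enumerate(ordered):                  # record this round's ranks
            ranks[c].append(r)
        if step >= min_instances and len(alive) > 1:
            mean = {c: sum(ranks[c]) / len(ranks[c]) for c in alive}
            best = min(mean.values())
            alive = [c for c in alive if mean[c] - best <= margin]
    return alive
```

Racing a noisy quadratic objective, e.g. `evaluate = lambda c, inst: (c - 3) ** 2 + random.gauss(0, 0.1)` over candidates `range(7)`, quickly eliminates all but the settings near the optimum `c = 3`.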


BookDOI
TL;DR: This work experiments with a new influence function that uses the degree of fuzzy proximity of key terms in a document to compute the relevance of the document to the query in a Boolean information retrieval method.
Abstract: We experiment with a new influence function in our information retrieval method that uses the degree of fuzzy proximity of key terms in a document to compute the relevance of the document to the query. The model is based on the idea that the closer the query terms in a document are to each other, the more relevant the document. Our model handles Boolean queries but, contrary to the traditional extensions of the basic Boolean information retrieval model, does not use a proximity operator explicitly. A single parameter makes it possible to control the proximity degree required. To improve our system we use a stemming algorithm before indexing, we take a specific influence function and we merge fuzzy proximity result lists built with different widths of the influence function. We explain how we construct the queries and report the results of our experiments in the ad-hoc monolingual French task of the CLEF 2006 evaluation campaign.

181 citations
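The fuzzy-proximity idea can be sketched with a triangular influence function: each query-term occurrence radiates influence that decays with distance, and a conjunctive query scores a document by the pointwise minimum of the terms' influences. This is our simplified reading of the model (the paper's exact influence shapes and list-merging strategy are not reproduced, and both function names are ours).

```python
def term_influence(positions, x, k):
    """Triangular influence at position x from the nearest occurrence:
    1 at an occurrence, decaying linearly to 0 at distance k."""
    if not positions:
        return 0.0
    return max(max(0.0, 1.0 - abs(x - p) / k) for p in positions)

def proximity_score(doc_tokens, query_terms, k=5):
    """Score of an AND query: sum over positions of the minimum influence
    among the query terms, so close co-occurrences score highest."""
    occ = {t: [i for i, w in enumerate(doc_tokens) if w == t]
           for t in query_terms}
    return sum(min(term_influence(occ[t], x, k) for t in query_terms)
               for x in range(len(doc_tokens)))
```

With this scoring, a document where the query terms are adjacent outranks one where they are far apart, even though both contain all the terms.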


Journal Article
TL;DR: In this article, the connection between formal errors (such as deadlocks) and a set of metrics that capture various structural and behavioral aspects of a process model is discussed, and a comprehensive validation based on an extensive sample of EPC process models from practice is provided.
Abstract: Business process models play an important role for the management, design, and improvement of process organizations and process-aware information systems. Despite the extensive application of process modeling in practice, there are hardly empirical results available on quality aspects of process models. This paper aims to advance the understanding of this matter by analyzing the connection between formal errors (such as deadlocks) and a set of metrics that capture various structural and behavioral aspects of a process model. In particular, we discuss the theoretical connection between errors and metrics, and provide a comprehensive validation based on an extensive sample of EPC process models from practice. Furthermore, we investigate the capability of the metrics to predict errors in a second independent sample of models. The high explanatory power of the metrics has considerable consequences for the design of future modeling guidelines and modeling tools.

147 citations


Journal Article
TL;DR: This paper presents the realisation, using a Service Oriented Architecture, of an approach for dynamic, flexible and extensible exception handling in workflows, based not on proprietary frameworks, but on accepted ideas of how people actually work.
Abstract: This paper presents the realisation, using a Service Oriented Architecture, of an approach for dynamic, flexible and extensible exception handling in workflows, based not on proprietary frameworks, but on accepted ideas of how people actually work. The resultant service implements a detailed taxonomy of workflow exception patterns to provide an extensible repertoire of self-contained exception-handling processes called exlets, which may be applied at the task, case or specification levels. When an exception occurs at runtime, an exlet is dynamically selected from the repertoire depending on the context of the exception and of the particular work instance. Both expected and unexpected exceptions are catered for in real time, so that 'manual handling' is avoided.

124 citations


Journal Article
TL;DR: In this article, anisotropic structures of the facial images are captured effectively by the proposed approach using an elongated neighborhood distribution, called the elongated LBP (ELBP), and a new feature, called Average Maximum Distance Gradient Magnitude (AMDGM), is proposed.
Abstract: In this paper, we propose a new face recognition approach based on local binary patterns (LBP). The proposed approach has the following novel contributions. (i) As compared with the conventional LBP, anisotropic structures of the facial images can be captured effectively by the proposed approach using an elongated neighborhood distribution, which is called the elongated LBP (ELBP). (ii) A new feature, called Average Maximum Distance Gradient Magnitude (AMDGM), is proposed. AMDGM embeds the gray level difference information between the reference pixel and neighboring pixels in each ELBP pattern. (iii) It is found that the ELBP and AMDGM features complement each other well. The proposed method is evaluated by performing face recognition experiments on two databases: ORL and FERET. The proposed method is compared with two widely used face recognition approaches. Furthermore, to test the robustness of the proposed method under the condition that the resolution level of the input images is low, we also conduct additional face recognition experiments on the two databases by reducing the resolution of the input facial images. The experimental results show that the proposed method gives the highest recognition accuracy in both normal and low image resolution conditions.
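The elongated-neighborhood idea can be sketched as an LBP code sampled on an ellipse instead of a circle. The sampling scheme below (8 nearest-neighbor samples on a rotated ellipse) is our simplification of the ELBP construction, not the authors' exact formulation.

```python
import math

def elongated_lbp(img, y, x, a=3, b=1, points=8, theta=0.0):
    """ELBP-style code at (y, x): threshold `points` samples taken on an
    ellipse with semi-axes a (long) and b (short), rotated by theta,
    against the center pixel. Anisotropic structure is captured because
    the long axis spans a wider neighborhood than the short axis."""
    center = img[y][x]
    code = 0
    for k in range(points):
        ang = 2 * math.pi * k / points
        dx = a * math.cos(ang)
        dy = b * math.sin(ang)
        # rotate the ellipse by theta, then snap to the nearest pixel
        sx = x + round(dx * math.cos(theta) - dy * math.sin(theta))
        sy = y + round(dx * math.sin(theta) + dy * math.cos(theta))
        if img[sy][sx] >= center:
            code |= 1 << k
    return code
```

On a flat patch every sample ties with the center, giving the all-ones code; a bright isolated center gives the all-zeros code.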

BookDOI
TL;DR: In this article, Petri nets without tokens are used to generate Petri Net state spaces for Reconfigurable Component Systems (RCSs) and a Compositional Method for the Synthesis of Asynchronous Communication Mechanisms is presented.
Abstract: Invited Papers.- Petri Nets, Discrete Physics, and Distributed Quantum Computation.- Autonomous Distributed System and Its Realization by Multi Agent Nets.- Petri Nets Without Tokens.- Toward Specifications for Reconfigurable Component Systems.- Generating Petri Net State Spaces.- Full Papers.- Markov Decision Petri Net and Markov Decision Well-Formed Net Formalisms.- Comparison of the Expressiveness of Arc, Place and Transition Time Petri Nets.- Improving Static Variable Orders Via Invariants.- Independence of Net Transformations and Token Firing in Reconfigurable Place/Transition Systems.- From Many Places to Few: Automatic Abstraction Refinement for Petri Nets.- A Compositional Method for the Synthesis of Asynchronous Communication Mechanisms.- History-Dependent Petri Nets.- Complete Process Semantics for Inhibitor Nets.- Behaviour-Preserving Transition Insertions in Unfolding Prefixes.- Combining Decomposition and Unfolding for STG Synthesis.- Object Nets for Mobility.- Web Service Orchestration with Super-Dual Object Nets.- Synthesis of Elementary Net Systems with Context Arcs and Localities.- Nets with Tokens Which Carry Data.- Operating Guidelines for Finite-State Services.- Theory of Regions for the Synthesis of Inhibitor Nets from Scenarios.- Utilizing Fuzzy Petri Net for Choreography Based Semantic Web Services Discovery.- Formal Models for Multicast Traffic in Network on Chip Architectures with Compositional High-Level Petri Nets.- Name Creation vs. Replication in Petri Net Systems.- Modelling the Datagram Congestion Control Protocol's Connection Management and Synchronization Procedures.- The ComBack Method - Extending Hash Compaction with Backtracking.- Computing Minimal Elements of Upward-Closed Sets for Petri Nets.- Tool Papers.- ProM 4.0: Comprehensive Support for Real Process Analysis.- dmcG: A Distributed Symbolic Model Checker Based on GreatSPN.- Workcraft: A Static Data Flow Structure Editing, Visualisation and Analysis Tool.

Journal Article
TL;DR: A framework for identifying characteristics of Business Process Orientation is defined and a valid tool for measuring the degree of BusinessProcess Orientation (BPO) of an organisation is provided based on empirical research in 30 international organisations.
Abstract: Processes are the core of organisations. Business Process Management (BPM) argues organisations can gain competitive advantage by improving and innovating their processes through a holistic process-oriented view. An organisation can be more or less process-oriented depending on their experience in applying process thinking for better results. The aim of this paper is to define a framework for identifying characteristics of Business Process Orientation and to provide a valid tool for measuring the degree of Business Process Orientation (BPO) of an organisation based on empirical research in 30 international organisations. A holistic view on integrated process management and change is taken as a starting point.

BookDOI
TL;DR: The qualitative evaluation method, which included a series of interviews and focus groups held with older adults, and healthcare and design professionals, is discussed together with key findings.
Abstract: This paper details the evaluation of human modelling software, which provides visual access to dynamic biomechanical data on older adult mobility to a new audience of professionals and lay people without training in biomechanics. An overview of the process of creating the visualisation software is provided, including a discussion of the benefits over existing approaches. The qualitative evaluation method, which included a series of interviews and focus groups held with older adults, and healthcare and design professionals, is discussed together with key findings. The findings are illustrated with examples of new dialogues about specific mobility issues impacting on healthcare and design planning which were facilitated by the data visualisations.

BookDOI
TL;DR: In this paper, the authors considered a temporally homogeneous, one-dimensional diffusion process X(t) defined over I = (-b, a), with infinitesimal parameters depending on the sign of X(T).
Abstract: For a, b > 0, we consider a temporally homogeneous, one-dimensional diffusion process X(t) defined over I = (-b, a), with infinitesimal parameters depending on the sign of X(t). We suppose that, when X(t) reaches the position 0, it is reflected rightward to delta with probability p > 0 and leftward to -delta with probability 1 - p, where delta > 0. We present a method to find approximate formulae for the mean exit time from the interval (-b, a) and for the probability of exit through the right end a, generalizing the results of Lefebvre [1], which hold in the limit delta -> 0, for asymmetric Brownian motion with drift.
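For background, away from the reflection point the two quantities approximated above satisfy standard ordinary differential equations (a textbook fact, not the paper's approximation method): for a diffusion with drift \(\mu(x)\) and infinitesimal variance \(\sigma^2(x)\), the mean exit time \(T(x)\) from \((-b, a)\) and the probability \(u(x)\) of exit through the right end \(a\) solve

```latex
\frac{\sigma^2(x)}{2}\,T''(x) + \mu(x)\,T'(x) = -1, \qquad T(-b) = T(a) = 0,
\\[4pt]
\frac{\sigma^2(x)}{2}\,u''(x) + \mu(x)\,u'(x) = 0, \qquad u(-b) = 0,\ u(a) = 1.
```

The sign-dependent parameters and the probabilistic reflection at 0 are what force the approximate, piecewise treatment in the paper.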

Journal Article
TL;DR: In this paper, the authors present a framework (SALT - Semantically Annotated LATEX) that allows the author to create metadata concurrently, i.e. while in the process of writing a document.
Abstract: Machine-understandable data constitutes the foundation for the Semantic Web. This paper presents a viable way for authoring and annotating Semantic Documents on the desktop. In our approach, the PDF file format is the container for document semantics, being able to store both the content and the related metadata in a single file. To achieve this, we provide a framework (SALT - Semantically Annotated LATEX) that extends the LATEX writing environment and supports the creation of metadata for scientific publications. SALT allows the author to create metadata concurrently, i.e. while in the process of writing a document. We discuss some of the requirements which have to be met when developing such support for creating semantic documents. In addition, we describe a usage scenario to show the feasibility and benefit of our approach.

Book ChapterDOI
TL;DR: The aspects of personalization, taken as a general technique for the customization of services to the user, which have been successfully employed in e-commerce Web sites are described.
Abstract: This chapter is about personalization and adaptation in electronic commerce (e-commerce) applications. In the first part, we briefly introduce the challenges posed by e-commerce and we discuss how personalization strategies can help companies to face such challenges. Then, we describe the aspects of personalization, taken as a general technique for the customization of services to the user, which have been successfully employed in e-commerce Web sites. To conclude, we present some emerging trends and we discuss future perspectives.

Journal Article
TL;DR: In this paper, the authors presented an algorithm to extract the secret key from white-box DES implementations, which is a differential attack on obfuscated rounds, and works regardless of the shielding external encodings that are applied.
Abstract: At DRM 2002, Chow et al. [4] presented a method for implementing the DES block cipher such that it becomes hard to extract the embedded secret key in a white-box attack context. In such a context, an attacker has full access to the implementation and its execution environment. In order to provide an extra level of security, an implementation shielded with external encodings was introduced by Chow et al. and improved by Link and Neumann [10]. In this paper, we present an algorithm to extract the secret key from such white-box DES implementations. The cryptanalysis is a differential attack on obfuscated rounds, and works regardless of the shielding external encodings that are applied. The cryptanalysis has an average time complexity of 2^14 and a negligible space complexity.

Journal Article
TL;DR: The aims of the competition were twofold: first, to collect challenging benchmark problems, and second, to provide a platform to assess a broad variety of Answer Set Programming systems.
Abstract: This paper gives a summary of the First Answer Set Programming System Competition that was held in conjunction with the Ninth International Conference on Logic Programming and Nonmonotonic Reasoning. The aims of the competition were twofold: first, to collect challenging benchmark problems, and second, to provide a platform to assess a broad variety of Answer Set Programming systems. The competition was inspired by similar events in neighboring fields, where regular benchmarking has been a major factor behind improvements in the developed systems and their ability to address practical applications.

Journal Article
TL;DR: In this article, the authors survey different techniques for fast collision search in SHA-1 and similar hash functions and propose a simple but effective method to facilitate comparison, and give complexity estimates and performance measurements of this new and improved collision search method.
Abstract: The diversity of methods for fast collision search in SHA-1 and similar hash functions makes a comparison of them difficult. The literature is at times very vague on this issue, which makes comparison even harder. In situations where differences in estimates of attack complexity of a small factor might influence short-term recommendations of standardization bodies, uncertainties and ambiguities in the literature amounting to a similar order of magnitude are unhelpful. We survey different techniques and propose a simple but effective method to facilitate comparison. In a case study, we consider a newly developed attack on 70-step SHA-1, and give complexity estimates and performance measurements of this new and improved collision search method.

Book ChapterDOI
TL;DR: The starting point of the work is the event B development of the IEEE 1394 leader election protocol; from standard documents, it is derived temporal requirements to solve the contention problem and a method for introducing time constraints using a pattern is proposed.
Abstract: Distributed applications are based on algorithms which should be able to deal with time constraints. It is mandatory to express time constraints in (mathematical) models and the current work intends to integrate time constraints in the modelling process based on event B models and refinement. The starting point of our work is the event B development of the IEEE 1394 leader election protocol; from standard documents, we derive temporal requirements to solve the contention problem and we propose a method for introducing time constraints using a pattern. The pattern captures time constraints in a generic event B development and it is applied to the IEEE 1394 case study.


BookDOI
TL;DR: In this article, the authors apply harmony data smoothing learning on a weighted kernel density model to obtain a sparse density estimator and empirically compare this method with the least squares cross-validation (LSCV) method for the classical kernel density estimators.
Abstract: In this paper we apply harmony data smoothing learning to a weighted kernel density model to obtain a sparse density estimator. We empirically compare this method with the least squares cross-validation (LSCV) method for the classical kernel density estimator. The most remarkable result of our study is that the harmony data smoothing learning method outperforms the LSCV method in most cases, and that the support vectors selected by the harmony data smoothing learning method are located in the regions of locally highest density of the sample.

Book ChapterDOI
TL;DR: This work introduces a novel stochastic local search algorithm for the vertex cover problem and evaluates its performance on the commonly used DIMACS benchmarks for the related clique problem, finding that the approach is competitive with the current best stochastic local search algorithms for finding cliques.
Abstract: We introduce a novel stochastic local search algorithm for the vertex cover problem. Compared to current exhaustive search techniques, our algorithm achieves excellent performance on a suite of problems drawn from the field of biology. We also evaluate our performance on the commonly used DIMACS benchmarks for the related clique problem, finding that our approach is competitive with the current best stochastic local search algorithm for finding cliques. On three very large problem instances, our algorithm establishes new records in solution quality.
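A minimal sketch of the stochastic-local-search idea for vertex cover (in the spirit of such algorithms, not the authors' exact method): keep a candidate set of fixed size and repair uncovered edges by random swaps. All names and the swap rule are our illustrative choices.

```python
import random

def is_cover(edges, cover):
    return all(u in cover or v in cover for u, v in edges)

def sls_vertex_cover(vertices, edges, target, max_steps=10000, seed=1):
    """Stochastic local search: maintain a candidate cover of size `target`;
    while some edge is uncovered, move one of its endpoints into the cover
    and evict a random vertex, until a valid cover is found."""
    rng = random.Random(seed)
    cover = set(rng.sample(sorted(vertices), target))
    for _ in range(max_steps):
        uncovered = [(u, v) for u, v in edges
                     if u not in cover and v not in cover]
        if not uncovered:
            return cover                       # valid cover of size `target`
        u, v = rng.choice(uncovered)           # pick an uncovered edge
        enter = rng.choice((u, v))             # one endpoint enters the cover
        leave = rng.choice(sorted(cover))      # a random vertex leaves
        cover.discard(leave)
        cover.add(enter)
    return None                                # no cover found within budget
```

To find a minimum cover one would run this with decreasing `target` until the search fails; the real algorithm adds smarter vertex-selection heuristics.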

Journal Article
TL;DR: In this article, the authors present the fastest known implementation of a modular multiplication for a 160-bit standard compliant elliptic curve (secp160r1) for 8-bit micro controller which are typically used in WSNs.
Abstract: In this article we present the fastest known implementation of modular multiplication for a 160-bit standard compliant elliptic curve (secp160r1) on 8-bit microcontrollers, which are typically used in WSNs. The major part (77%) of the processing time for an elliptic curve operation such as ECDSA or EC Diffie-Hellman is spent on modular multiplication. We present an optimized arithmetic algorithm which significantly speeds up ECC schemes. The reduced processing time also yields a significantly lower energy consumption of ECC schemes. With our implementation results we can show that a 160-bit modular multiplication can be performed in 0.39 ms on an 8-bit AVR processor clocked at 7.37 MHz. This brings the vision of asymmetric cryptography in the field of WSNs, with all its benefits for key distribution and authentication, a step closer to reality.
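The secp160r1 prime has the special form p = 2^160 - 2^31 - 1, which is what makes fast reduction after multiplication possible. The sketch below shows the folding identity in plain Python; the paper's implementation performs the same reduction on 8-bit limbs in optimized AVR code, so this is only the arithmetic idea, not their algorithm.

```python
P160 = 2**160 - 2**31 - 1   # the secp160r1 field prime

def reduce_secp160r1(x):
    """Fast reduction modulo p = 2^160 - 2^31 - 1, using the congruence
    2^160 ≡ 2^31 + 1 (mod p): fold the high part back into the low part."""
    while x >= 2**160:
        hi, lo = x >> 160, x & (2**160 - 1)
        x = lo + (hi << 31) + hi        # hi * 2^160 ≡ hi * (2^31 + 1)
    while x >= P160:                    # at most one final subtraction
        x -= P160
    return x

def mulmod(a, b):
    """Field multiplication: schoolbook product, then special-form reduction."""
    return reduce_secp160r1(a * b)
```

The folding replaces a general division by a shift and two additions, which is why reduction is nearly free compared with the 160x160-bit multiplication itself.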

Journal Article
TL;DR: In this article, the authors proposed an improved distance bounding protocol for noisy channels that offers a substantial reduction (about 50%) in the number of communication rounds compared to the Hancke and Kuhn protocol.
Abstract: Location information can be used to enhance mutual entity authentication protocols in wireless ad-hoc networks. More specifically, distance bounding protocols have been introduced by Brands and Chaum at Eurocrypt'93 to preclude distance fraud and mafia fraud attacks, in which a local impersonator exploits a remote honest user. Hancke and Kuhn have proposed a solution to cope with noisy channels. This paper presents an improved distance bounding protocol for noisy channels that offers a substantial reduction (about 50%) in the number of communication rounds compared to the Hancke and Kuhn protocol. The main idea is to use binary codes to correct bit errors occurring during the fast bit exchanges. Our protocol is perfectly suitable to be employed in low-cost, noisy wireless environments.
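The fast bit-exchange phase that distance bounding relies on can be sketched as a toy Hancke-Kuhn style simulation: both parties derive two response registers from a shared key and nonces, then the prover answers one-bit challenges from one register or the other. Here plain sha256 over the concatenated values stands in for the PRF, and the paper's error-correcting-code improvement is not modeled, only a simple error-tolerance threshold.

```python
import hashlib
import secrets

def derive_registers(key, nonce_v, nonce_p, n=32):
    """Derive the two n-bit response registers R0, R1.
    (sha256 of key || nonces is a stand-in for the protocol's PRF.)"""
    digest = hashlib.sha256(key + nonce_v + nonce_p).digest()
    bits = ''.join(f'{byte:08b}' for byte in digest)
    return bits[:n], bits[n:2 * n]

def run_protocol(key, n=32, max_errors=0, flip=None):
    """Simulate n fast rounds; `flip` is a set of round indices whose
    responses are corrupted, modeling channel noise."""
    nonce_v, nonce_p = secrets.token_bytes(8), secrets.token_bytes(8)
    r0_v, r1_v = derive_registers(key, nonce_v, nonce_p, n)  # verifier's copy
    r0_p, r1_p = derive_registers(key, nonce_v, nonce_p, n)  # prover's copy
    errors = 0
    for i in range(n):
        c = secrets.randbelow(2)                 # fast-phase challenge bit
        resp = (r1_p if c else r0_p)[i]          # prover answers from R_c
        if flip and i in flip:                   # simulate a noisy bit
            resp = '1' if resp == '0' else '0'
        if resp != (r1_v if c else r0_v)[i]:
            errors += 1
    return errors <= max_errors
```

Tolerating a few errors keeps honest provers from failing on noisy channels; the paper's contribution is doing this with roughly half as many rounds by correcting, rather than merely tolerating, bit errors.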

Journal Article
TL;DR: This paper focuses on and discusses the literature on formal, top-down and pre-existent organizations, stressing the different aspects that may be considered to program them, and presents some challenges for future research, considering particularly the openness feature of those agents' organizations.
Abstract: In recent years, social and organizational aspects of agency have become a major issue in multi-agent systems research. Recent applications of MAS enforce the need of using such aspects in order to ensure some social order within these systems. However, there is still a lack of comprehensive views of the diverse concepts, models and approaches related to agents' organizations. Moreover, most designers have doubts about how to put these concepts into practice, i.e., how to program them. In this paper we focus on and discuss the literature on formal, top-down and pre-existent organizations by stressing the different aspects that may be considered to program them. Finally, we present some challenges for future research, considering particularly the openness feature of those agents' organizations.

BookDOI
TL;DR: This volume continues the special issue on Early Aspects, focusing on the mapping of Early Aspects across the software lifecycle, and includes a special section on Aspects and Software Evolution.
Abstract: Volume IV of Transactions on Aspect-Oriented Software Development continues the special issue on Early Aspects from volume III. The special issue was guest edited by Joao Araujo and Elisa Baniassad and handled by one of the co-editors-in-chief, Mehmet Aksit. The papers in volume III discussed topics pertaining to analysis, visualisation, conflict identification and composition of Early Aspects. The papers in this volume focus on mapping of Early Aspects across the software lifecycle. Complementing this focus on aspect mapping is a special section on Aspects and Software Evolution guest edited by Walter Cazzola, Shigeru Chiba and Gunter Saake—the co-editor-in-chief handling this issue was Awais Rashid.

Journal Article
TL;DR: In this paper, a method for creating scheduling heuristics for parallel proportional machine scheduling environment and arbitrary performance criteria is presented, where genetic programming is used to synthesize the priority function which, coupled with an appropriate meta-algorithm for a given environment, forms the priority scheduling heuristic.
Abstract: In this paper we present a method for creating scheduling heuristics for a parallel proportional machine scheduling environment and arbitrary performance criteria. Genetic programming is used to synthesize the priority function which, coupled with an appropriate meta-algorithm for a given environment, forms the priority scheduling heuristic. We show that the procedures derived in this way can perform similarly to or better than existing algorithms. Additionally, this approach may be particularly useful for those combinations of scheduling environment and criteria for which there are no adequate scheduling algorithms.
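The "meta-algorithm plus priority function" split can be sketched as list scheduling with a pluggable rule. The WSPT rule at the end is just one example of the kind of priority function genetic programming would evolve; the job-dictionary layout and names are our illustrative choices, not the paper's.

```python
import heapq

def list_schedule(jobs, machines, priority):
    """List-scheduling meta-algorithm for parallel machines: whenever a
    machine becomes free, dispatch the waiting job with the highest
    priority value. `priority(job, time)` is the pluggable component."""
    free = [(0.0, m) for m in range(machines)]  # (time machine frees up, id)
    heapq.heapify(free)
    pending = list(jobs)                        # each job: {'id', 'p', 'w'}
    completion = {}
    while pending:
        t, m = heapq.heappop(free)              # earliest available machine
        job = max(pending, key=lambda j: priority(j, t))
        pending.remove(job)
        completion[job['id']] = t + job['p']
        heapq.heappush(free, (t + job['p'], m))
    return completion

# Example priority rule: weighted shortest processing time (WSPT).
wspt = lambda j, t: j['w'] / j['p']
```

Swapping in a different `priority` function changes the heuristic without touching the meta-algorithm, which is exactly the degree of freedom the evolutionary search exploits.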

Journal Article
TL;DR: In this paper, it was shown that none of eleven existing iterations preserves more than three of the seven security notions; in particular, only a single iteration preserves Pre, and none preserves Sec, aSec, or aPre.
Abstract: Nearly all modern hash functions are constructed by iterating a compression function. At FSE’04, Rogaway and Shrimpton [RS04] formalized seven security notions for hash functions: collision resistance (Coll) and three variants of second-preimage resistance (Sec, aSec, eSec) and preimage resistance (Pre, aPre, ePre). The main contribution of this paper is in determining, by proof or counterexample, which of these seven notions is preserved by each of eleven existing iterations. Our study points out that none of them preserves more than three notions from [RS04]. In particular, only a single iteration preserves Pre, and none preserves Sec, aSec, or aPre. The latter two notions are particularly relevant for practice, because they do not rely on the problematic assumption that practical compression functions be chosen uniformly from a family. In view of this poor state of affairs, even the mere existence of seven-property-preserving iterations seems uncertain. As a second contribution, we propose the new Random-Oracle XOR (ROX) iteration that is the first to provably preserve all seven notions, but that, quite controversially, uses a random oracle in the iteration. The compression function itself is not modeled as a random oracle though. Rather, ROX uses an auxiliary small-input random oracle (typically 170 bits) that is called only a logarithmic number of times.
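The iterated construction the paper analyzes can be sketched as plain strengthened Merkle-Damgård over a stand-in compression function (sha256 of the chaining value and block here, purely for illustration; the paper's results are generic in the choice of f, and the block size is an arbitrary toy choice).

```python
import hashlib

BLOCK = 32  # bytes per message block (toy choice)

def compress(chain, block):
    """Toy compression function f(h, m): sha256 of chaining value || block.
    A stand-in; the iterations studied in the paper are generic in f."""
    return hashlib.sha256(chain + block).digest()

def md_hash(message, iv=b'\x00' * 32):
    """Strengthened Merkle-Damgard: pad with 0x80, zeros, and the 64-bit
    message bit length, then iterate the compression function over
    fixed-size blocks starting from the IV."""
    length = (8 * len(message)).to_bytes(8, 'big')
    padded = message + b'\x80'
    padded += b'\x00' * ((-len(padded) - 8) % BLOCK) + length
    chain = iv
    for i in range(0, len(padded), BLOCK):
        chain = compress(chain, padded[i:i + BLOCK])
    return chain
```

The length padding ("MD strengthening") is what makes collision resistance of f carry over to the iteration; the paper's point is that most of the other six notions do not carry over this easily.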