
Showing papers by "Wolfgang Emmerich published in 2003"


Journal ArticleDOI
TL;DR: CARISMA, a mobile computing middleware which exploits the principle of reflection to enhance the construction of adaptive and context-aware mobile applications, is described and a method by which policy conflicts can be handled is demonstrated.
Abstract: Mobile devices, such as mobile phones and personal digital assistants, have gained wide-spread popularity. These devices will increasingly be networked, thus enabling the construction of distributed applications that have to adapt to changes in context, such as variations in network bandwidth, battery power, connectivity, reachability of services and hosts, etc. In this paper, we describe CARISMA, a mobile computing middleware which exploits the principle of reflection to enhance the construction of adaptive and context-aware mobile applications. The middleware provides software engineers with primitives to describe how context changes should be handled using policies. These policies may conflict. We classify the different types of conflicts that may arise in mobile computing and argue that conflicts cannot be resolved statically at the time applications are designed, but, rather, need to be resolved at execution time. We demonstrate a method by which policy conflicts can be handled; this method uses a microeconomic approach that relies on a particular type of sealed-bid auction. We describe how this method is implemented in the CARISMA middleware architecture and sketch a distributed context-aware application for mobile devices to illustrate how the method works in practice. We show, by way of a systematic performance evaluation, that conflict resolution does not imply undue overheads, before comparing our research to related work and concluding the paper.

524 citations
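The auction-based conflict resolution described in the abstract can be illustrated with a minimal sealed-bid mechanism. This is a hypothetical sketch, not CARISMA's actual API: applications privately bid utility values for candidate policies, and the policy with the greatest aggregate bid is applied.

```python
# Hypothetical sketch of sealed-bid conflict resolution (names are
# illustrative, not CARISMA's actual API). Each application privately
# bids a utility for the policy it prefers; the middleware applies the
# policy with the highest aggregate bid.
from collections import defaultdict

def resolve(bids):
    """bids: list of (application, policy, utility) sealed bids."""
    totals = defaultdict(float)
    for application, policy, utility in bids:
        totals[policy] += utility
    return max(totals, key=totals.get)

bids = [("viewer", "compress", 0.8),
        ("sync", "plaintext", 0.6),
        ("logger", "compress", 0.3)]
print(resolve(bids))  # "compress": aggregate utility 1.1 beats 0.6
```

The paper's mechanism is richer, using a particular type of sealed-bid auction over application-specific utility functions; the sketch only conveys the shape of the idea.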


Proceedings ArticleDOI
28 May 2003
TL;DR: This work investigates end-to-end quality of service (QoS) and highlights that QoS provision has multiple facets and requires complex agreements between network services, storage services and middleware services, and introduces SLAng, a language for defining Service Level Agreements (SLAs) that accommodates these needs.
Abstract: Application or web services are increasingly being used across organisational boundaries. Moreover, new services are being introduced at the network and storage level. Languages to specify interfaces for such services have been researched and transferred into industrial practice. We investigate end-to-end quality of service (QoS) and highlight that QoS provision has multiple facets and requires complex agreements between network services, storage services and middleware services. We introduce SLAng, a language for defining Service Level Agreements (SLAs) that accommodates these needs. We illustrate how SLAng is used to specify QoS in a case study that uses a web services specification to support the processing of images across multiple domains and we evaluate our language based on it.

293 citations
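SLAng itself is a language for writing SLAs, not a monitor; still, a tiny check of one hypothetical SLA clause shows the kind of end-to-end QoS obligation such an agreement expresses. The 95%/200 ms figures below are invented for illustration.

```python
# Checks one hypothetical SLA clause: "95% of requests complete within
# 200 ms". The clause and numbers are invented; SLAng specifies such
# obligations declaratively rather than in code.
def sla_met(latencies_ms, threshold_ms=200, quantile=0.95):
    within = sum(1 for l in latencies_ms if l <= threshold_ms)
    return within / len(latencies_ms) >= quantile

measured = [100] * 19 + [250]   # 19 of 20 requests within threshold
print(sla_met(measured))        # True: 0.95 >= 0.95
```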


Proceedings ArticleDOI
03 May 2003
TL;DR: This paper presents a repair framework for inconsistent distributed documents, a new method for generating interactive repairs from full first order logic formulae that constrain these documents, and presents a full implementation of the components in this framework.
Abstract: Comprehensive consistency management requires a strong mechanism for repair once inconsistencies have been detected. In this paper we present a repair framework for inconsistent distributed documents. The core piece of the framework is a new method for generating interactive repairs from full first order logic formulae that constrain these documents. We present a full implementation of the components in our repair framework, as well as their application to the UML and related heterogeneous documents such as EJB deployment descriptors. We describe how our approach can be used as an infrastructure for building higher-level, domain specific frameworks and provide an overview of related work in the database and software development environment community.

206 citations
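The core idea, generating interactive repairs from a violated constraint, can be sketched for the simplest constraint shape, "every element of A has a counterpart in B". The repair actions and names below are illustrative, not the framework's actual interface.

```python
# Illustrative sketch: for a violated constraint "every element of A has
# a counterpart in B", offer the two obvious interactive repairs per
# violation: delete the offending element, or add the missing counterpart.
def repairs_for(a_elems, b_elems):
    actions = []
    for a in a_elems:
        if a not in b_elems:
            actions.append(("delete", "A", a))
            actions.append(("add", "B", a))
    return actions

print(repairs_for(["CartBean", "OrderBean"], {"CartBean"}))
# [('delete', 'A', 'OrderBean'), ('add', 'B', 'OrderBean')]
```

The framework in the paper derives such repair menus systematically from arbitrary full first-order formulae, not just this one pattern.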


Journal ArticleDOI
TL;DR: Xlinkit as discussed by the authors is a lightweight framework for consistency checking that leverages standard Internet technologies for managing the consistency of heterogeneous, distributed software engineering documents, which is central to the development of large and complex systems.
Abstract: The problem of managing the consistency of heterogeneous, distributed software engineering documents is central to the development of large and complex systems. We show how this problem can be addressed using xlinkit, a lightweight framework for consistency checking that leverages standard Internet technologies. xlinkit provides flexibility, strong diagnostics, and support for distribution and document heterogeneity. We use xlinkit in a comprehensive case study that demonstrates how design, implementation and deployment information of an Enterprise JavaBeans system can be checked for consistency, and rechecked incrementally when changes are made.

164 citations
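A consistency rule of the kind the case study checks, such as every bean declared in an EJB deployment descriptor having an implementation class, can be sketched as follows. xlinkit expresses such rules in an XML rule language over linked documents; the Python below only mimics the semantics, and the data is invented.

```python
# Mimics the semantics of a xlinkit-style rule (not its XML syntax):
# every bean in the deployment descriptor must name an existing class.
# Violations are reported as links between the inconsistent elements,
# which is the basis of xlinkit's diagnostics.
def check_rule(descriptor_beans, implemented_classes):
    return [(bean, cls) for bean, cls in descriptor_beans.items()
            if cls not in implemented_classes]

beans = {"Cart": "shop.CartBean", "Order": "shop.OrderBean"}
classes = {"shop.CartBean"}
print(check_rule(beans, classes))  # [('Order', 'shop.OrderBean')]
```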


Proceedings Article
01 May 2003
TL;DR: This work investigates end-to-end quality of service (QoS) and highlights that QoS provision has multiple facets and requires complex agreements between network services, storage services and middleware services, and introduces SLAng, a language for defining Service Level Agreements (SLAs) that accommodates these needs.
Abstract: Application or web services are increasingly being used across organisational boundaries. Moreover, new services are being introduced at the network and storage level. Languages to specify interfaces for such services have been researched and transferred into industrial practice. We investigate end-to-end quality of service (QoS) and highlight that QoS provision has multiple facets and requires complex agreements between network services, storage services and middleware services. We introduce SLAng, a language for defining Service Level Agreements (SLAs) that accommodates these needs. We illustrate how SLAng is used to specify QoS in a case study that uses a web services specification to support the processing of images across multiple domains and we evaluate our language based on it.

51 citations


Proceedings ArticleDOI
14 Jul 2003
TL;DR: This work defines architectural stability and formulates the problem of evaluating software architectures for stability and evolution, drawing attention to the use of Architecture Description Languages (ADLs) for supporting the evaluation of software architectures in general and for architectural stability specifically.
Abstract: Summary form only given. We survey seminal work on software architecture evaluation methods. We then look at an emerging class of methods that explicitly addresses evaluating software architectures for stability and evolution. We define architectural stability and formulate the problem of evaluating software architectures for stability and evolution. We draw attention to the use of Architecture Description Languages (ADLs) for supporting the evaluation of software architectures in general and for architectural stability specifically.

42 citations


01 Jan 2003
TL;DR: A novel model is contributed that exploits options theory to predict architectural stability and provides insights on the evolution of the software system based on valuing the extent to which an architecture can endure a set of likely evolutionary changes.
Abstract: Architectural stability refers to the extent an architecture is flexible enough to endure evolutionary changes in stakeholders' requirements and the environment. We assume that the primary goal of software architecture is to guide the system's evolution. We contribute a novel model that exploits options theory to predict architectural stability. The model is predictive: it provides "insights" on the evolution of the software system based on valuing the extent to which an architecture can endure a set of likely evolutionary changes. The model builds on Black and Scholes' Nobel Prize-winning financial options theory to value that extent. We show how we derived the model: the analogy and assumptions made to reach the model, its formulation, and possible interpretations. We refer to this model as ArchOptions.

39 citations
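The valuation core the abstract names, the Black-Scholes formula for a European call, is standard and reproduced below; how ArchOptions maps architectural quantities onto the option parameters is the model's contribution and is not shown here.

```python
# Standard Black-Scholes value of a European call option; ArchOptions
# builds its stability valuation on this formula (the mapping from
# architecture to option parameters is the paper's analogy, not shown).
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """S: spot, K: strike, r: risk-free rate, sigma: volatility, T: maturity."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# At-the-money call, 5% rate, 20% volatility, one year to expiry:
print(round(black_scholes_call(100, 100, 0.05, 0.2, 1.0), 2))  # 10.45
```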


Journal ArticleDOI
TL;DR: The particular motivation for performance analysis in the domain of Enterprise Information Systems (EISs) is described and it is argued that the Model Driven Architecture (MDA) is a suitable framework for integrating formal analysis techniques with engineering methods appropriate to the domain.

29 citations


Book ChapterDOI
TL;DR: satin, a middleware system that allows the flexible use of logical mobility techniques by applications running on mobile hosts connected to very different networks, is designed and implemented.
Abstract: An increasing number of applications is being written for mobile hosts, such as laptop computers, mobile phones, PDAs, etc. These applications are usually monolithic, featuring very limited interoperability and context-awareness, and are usually difficult to deploy and update. Application engineers have to deal with a very dynamic set of environments that these applications are in contact with, and it is becoming increasingly difficult to design an application that will be able to cater to all the user's needs in those environments. This new setting forces a shift from design-time to run-time effort in developing software systems. To solve these problems and to allow a new class of ubiquitous and adaptable applications to be built, we have designed and implemented satin, a middleware system that allows the flexible use of logical mobility techniques by applications running on mobile hosts which are connected to very different networks. In this paper we describe our approach and show how satin can be used to deploy and update applications on mobile devices easily and efficiently.

18 citations


Proceedings ArticleDOI
06 Oct 2003
TL;DR: This work presents an approach to managing formal models using model driven architecture technologies that deliver analysis techniques through integration with the design tools and repositories that practitioners use, and relies on standards to permit deployment in multiple tools.
Abstract: We present an approach to managing formal models using model driven architecture (MDA) technologies that deliver analysis techniques through integration with the design tools and repositories that practitioners use. Expert modeling knowledge is captured in domain-specific languages and meta-model constraints. These are represented using UML (Unified Modeling Language) and collocated with designs and analysis models, providing a flexible and visible approach to managing semantic associations. The approach relies on standards to permit deployment in multiple tools. We demonstrate our approach with an example in which queuing-network models are associated with UML design models to predict average case performance.

18 citations
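The example above associates queuing-network models with UML designs; the simplest such prediction, assuming a single M/M/1 queue, is the classic response-time formula below. The rates are invented for illustration.

```python
# Mean response time of an M/M/1 queue, W = 1/(mu - lambda): the simplest
# queuing-network prediction that could be associated with a UML design
# element. Rates here are invented for illustration.
def mm1_response_time(arrival_rate, service_rate):
    assert arrival_rate < service_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

# 8 requests/s arriving at a server that handles 10 requests/s:
print(mm1_response_time(8.0, 10.0))  # 0.5 seconds on average
```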


Book ChapterDOI
22 Sep 2003
TL;DR: This work exploits the fact that modern object and component middleware offer only a small number of underlying synchronisation primitives and threading policies to reduce the state space that needs to be model checked by exploiting middleware characteristics.
Abstract: Distributed systems are increasingly built using distributed object or component middleware. The dynamic behaviour of those distributed systems is influenced by the particular combination of middleware synchronisation and threading primitives used for communication amongst distributed objects. A designer may accidentally choose combinations that cause a distributed application to enter undesirable states or violate liveness properties. We exploit the fact that modern object and component middleware offer only a small number of underlying synchronisation primitives and threading policies. For each of these we define a UML stereotype and a formal process algebra specification of the stereotype semantics. We devise a means to specify safety and liveness properties in UML and again map those to process algebra safety and liveness properties. We can thus apply model checking techniques to verify that a given design does indeed meet the desired properties. We propose how to reduce the state space that needs to be model checked by exploiting middleware characteristics. We finally show how model checking results can be related back to the input UML models. In this way we can hide the formalism and the model checking process entirely from UML designers, which we regard as critical for the industrial exploitation of this research.
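A toy version of the failure these checks target: two single-threaded objects that call each other synchronously can both end up blocked. The three-state abstraction below is invented for illustration; the paper does this properly with process-algebra semantics and a model checker.

```python
# Toy state-space search for the deadlock the paper targets: with a
# single-threaded policy, an object blocked in a synchronous call
# cannot serve incoming requests, so two mutual callers deadlock.
from itertools import product

STATES = ("idle", "calling", "serving")  # invented three-state abstraction

def deadlocked(a, b):
    # Both objects blocked waiting for a reply neither can send.
    return a == "calling" and b == "calling"

deadlocks = [s for s in product(STATES, STATES) if deadlocked(*s)]
print(deadlocks)  # [('calling', 'calling')]
```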

Book ChapterDOI
15 Dec 2003
TL;DR: This paper proposes the notion of Electronic Service Management System (ESMS), a framework for modelling and implementing electronic services, substantiated by a workflow-oriented architecture defined in accordance with the OMG’s Model-driven Architecture (MDA) principles.
Abstract: Mainly in the wake of the Web Service initiative, electronic services are emerging as a reference model for business information technology systems. Individual applications retain core functions and technology base, but integration becomes crucial. A business service derives from the coordination of different business capabilities. The related electronic service derives from the integration of the different applications sustaining such capabilities. The effective realisation of an electronic service requires explicit modelling and active management of the relations between business capabilities and technical infrastructure. In this paper, we propose the notion of Electronic Service Management System (ESMS) as a framework for modelling and implementing electronic services. The notion of ESMS is substantiated by a workflow-oriented architecture, which we mainly derive from the experience of HP Service Composer and the DySCo (Dynamic Service Composer) research prototype. The architecture is defined in accordance with the OMG’s Model-driven Architecture (MDA) principles.

Proceedings Article
01 Oct 2003
TL;DR: This work presents an approach to managing formal models using Model Driven Architecture technologies that delivers analysis techniques through integration with the design tools and repositories that practitioners use and relies on standards to permit deployment in multiple tools.
Abstract: We present an approach to managing formal models using Model Driven Architecture (MDA) technologies that delivers analysis techniques through integration with the design tools and repositories that practitioners use. Expert modelling knowledge is captured in domain-specific languages and meta-model constraints. These are represented using UML and colocated with designs and analysis models, providing a flexible and visible approach to managing semantic associations. The approach relies on standards to permit deployment in multiple tools. We demonstrate our approach with an example in which queuing-network models are associated with UML design models to predict average case performance.

Journal Article
TL;DR: A taxonomy for the classes of consistency constraints that occur in this domain is proposed and how xlinkit, a generic technology for managing the consistency of distributed documents, can be used to specify consistency constraints and detect transaction inconsistencies is presented.
Abstract: Financial institutions are increasingly using XML as a de facto standard to represent and exchange information about their products and services. Their aim is to process transactions quickly, cost-effectively, and with minimal human intervention. Due to the nature of the financial industry, inconsistencies inevitably appear throughout the lifetime of a financial transaction and their resolution introduces cost and time overheads. We give an overview of requirements for inconsistency detection in our particular domain of interest: the over-the-counter (OTC) financial derivatives sector. We propose a taxonomy for the classes of consistency constraints that occur in this domain and present how xlinkit, a generic technology for managing the consistency of distributed documents, can be used to specify consistency constraints and detect transaction inconsistencies. We present the result of an evaluation where xlinkit has been used to specify the evaluation rules for version 1.0 of the Financial Products Markup Language (FpML). The results of that evaluation were so encouraging that they have led the FpML Steering Committee to consider xlinkit as the standard for specifying validation constraints throughout.

Journal ArticleDOI
TL;DR: This project intends to make experimental methodologies more accessible to researchers by using programmable networking techniques and by building a management system for a network testbed.
Abstract: The way in which research groups evaluate router software (QoS and routing components, for example) seems to be restricted to methodologies using mathematical modelling and simulation techniques. We believe that an experimental methodology is rarely used as the deployment of custom routing software to a testbed comprising multiple routers is a non-trivial task that is beyond the scope of most network research projects. This project intends to make experimental methodologies more accessible to researchers by using programmable networking techniques and by building a management system for a network testbed.

01 Jan 2003
TL;DR: In this article, the authors present a taxonomy for the classes of consistency constraints that occur in the over-the-counter (OTC) financial derivatives sector and present how xlinkit, a generic technology for managing the consistency of distributed documents, can be used to specify consistency constraints and detect transaction inconsistencies.
Abstract: Financial institutions are increasingly using XML as a de-facto standard to represent and exchange information about their products and services. Their aim is to process transactions quickly, cost-effectively, and with minimal human intervention. Due to the nature of the financial industry, inconsistencies inevitably appear throughout the lifetime of a financial transaction and their resolution introduces cost and time overheads. We give an overview of requirements for inconsistency detection in our particular domain of interest: the over-the-counter (OTC) financial derivatives sector. We propose a taxonomy for the classes of consistency constraints that occur in this domain and present how xlinkit, a generic technology for managing the consistency of distributed documents, can be used to specify consistency constraints and detect transaction inconsistencies. We present the result of an evaluation where xlinkit has been used to specify the evaluation rules for version 1.0 of the Financial Products Markup Language (FpML). The results of that evaluation were so encouraging that they have led the FpML Steering Committee to consider xlinkit as the standard for specifying validation constraints throughout.

01 Jan 2003
TL;DR: A case study of the experience re-engineering a scientific application using the Open Grid Services Architecture, a new specification for developing Grid applications using web service technologies such as WSDL and SOAP, and a computational workflow service that enables users to distribute and manage parts of the computational process across different clusters and administrative domains.
Abstract: We present a case study of our experience re-engineering a scientific application using the Open Grid Services Architecture (OGSA), a new specification for developing Grid applications using web service technologies such as WSDL and SOAP. During the last decade, UCL's Chemistry department has developed a computational approach for predicting the crystal structures of small molecules. However, each search involves running large iterations of computationally expensive calculations and currently takes a few months to perform. Making use of early implementations of the OGSA specification, we have wrapped the Fortran binaries into OGSI-compliant service interfaces to expose the existing scientific application as a set of loosely coupled web services. We show how the OGSA implementation facilitates the distribution of such applications across a large network, radically improving performance of the system through parallel CPU capacity, coordinated resource management and automation of the computational process. We discuss the difficulties that we encountered turning Fortran executables into OGSA services and delivering a robust, scalable system. One unusual aspect of our approach is the way we transfer input and output data for the Fortran codes. Instead of employing a file transfer service, we transform the XML-encoded data in the SOAP message to native file format, where possible using XSLT stylesheets. We also discuss a computational workflow service that enables users to distribute and manage parts of the computational process across different clusters and administrative domains. We examine how our experience re-engineering the polymorph prediction application led to this approach and to what extent our efforts have succeeded.
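The data-handling idea, transforming XML carried in the SOAP message directly into the Fortran codes' native input format, can be sketched as below. The element names and fixed-column layout are invented; the paper uses XSLT stylesheets where possible rather than hand-written code.

```python
# Sketch of XML-to-native conversion (element names and the fixed-column
# layout are invented; the paper uses XSLT stylesheets where possible).
import xml.etree.ElementTree as ET

xml_input = ("<molecule>"
             "<atom x='0.0' y='0.5' z='1.0'/>"
             "<atom x='1.0' y='0.0' z='0.5'/>"
             "</molecule>")

def to_native(xml_text):
    """Emit one fixed-width coordinate line per atom, Fortran-style."""
    root = ET.fromstring(xml_text)
    return "\n".join(
        "%8.3f%8.3f%8.3f" % tuple(float(a.get(k)) for k in ("x", "y", "z"))
        for a in root.findall("atom"))

print(to_native(xml_input))
```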


Book ChapterDOI
TL;DR: This paper proposes a definition for compatibility between versions of XML languages that takes this additional need for consistency into account and argues that the problem can become tractable using heuristic methods if the two languages are related in a version history.
Abstract: Individual organisations as well as industry consortia are currently defining application and domain-specific languages using the eXtended Markup Language (XML) standard of the World Wide Web Consortium (W3C). The paper shows that XML languages differ in significant aspects from generic software engineering artifacts and that they therefore require a specific approach to version and configuration management. When an XML language evolves, consistency between the language and its instance documents needs to be preserved in addition to the internal consistency of the language itself. We propose a definition for compatibility between versions of XML languages that takes this additional need into account. Compatibility between XML languages in general is undecidable. We argue that the problem can become tractable using heuristic methods if the two languages are related in a version history. We propose to evaluate the method by using different versions of the Financial products Markup Language (FpML), in the definition of which we participate.
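The compatibility notion can be sketched as follows: a new language version is backwards-compatible if every instance document valid under the old version stays valid under the new one. Since this is undecidable in general, the sketch reduces "validity" to a required-element check over sample instances; the schemas and field names are invented.

```python
# Heuristic sketch of XML-language compatibility: check that sample
# instances valid under the old version remain valid under the new one.
# "Validity" is reduced to a required-element check; names are invented.
def valid(doc, schema):
    return schema["required"] <= doc.keys()

old = {"required": {"trade", "party"}}
new = {"required": {"trade", "party", "timestamp"}}  # adds a mandatory field

samples = [{"trade": 1, "party": 2},
           {"trade": 1, "party": 2, "notes": 3}]
compatible = all(valid(d, new) for d in samples if valid(d, old))
print(compatible)  # False: old instances lack the new mandatory element
```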

Journal ArticleDOI
01 Jan 2003
TL;DR: This special section of the Automated Software Engineering Journal includes two examples of how XML and related technologies are influencing the construction of software engineering tools, environments, and software development processes.
Abstract: The increasing use of XML and related technologies in many branches of computer science is also affecting the way software engineering tools are built. XML is becoming a standard in many software engineering applications and its use in the area of tool integration, code re-engineering, application interaction, web and databases integration, is spreading. XML is an abstract and flexible meta language that can be used and adapted to express many kinds of semi-structured data. Furthermore, many different technologies and tools have been developed around XML in order to support parsing, verification, validation, merging, comparing, checking consistency, and linking. Even network communication has been affected with the development of standards, such as the Simple Object Access Protocol (SOAP). Software engineering seems to have been positively influenced by the existence of these notations and tools. This special section of the Automated Software Engineering Journal includes two examples of how XML and related technologies are influencing the construction of software engineering tools, environments, and software development processes. XML facilitates the structuring of information and the developed technologies simplify the access to this information considerably. Many software development tools are increasingly taking advantage of the potential of XML. Furthermore, XML leverages a long software engineering tradition of work on graph based and abstract syntax tree approaches. The exploitation of XML is affecting the development of distributed software architectures, web-related systems, mobile computing applications, and middleware that provides new degrees of flexibility in terms of integration, security, and interoperability. Structuring information in XML and the ability to exchange data in that format opens the door to the exploitation of new strategies and approaches. Research in this field is now seeing new developments in the areas of component integration, reflection, heterogeneity, and security, among others.



01 Jan 2003
TL;DR: This tutorial examines the behaviour of the crystal polymorph prediction system operating in a Grid environment and considers how effectively the implementation exploits available resources.
Abstract: Summary and conclusion: As a conclusion to this tutorial, we will share lessons learnt from our experience developing with OGSA and highlight the limitations as well as the benefits of deploying OGSA middleware. In particular, we will examine the behaviour of the crystal polymorph prediction system operating in a Grid environment and consider how effectively the implementation exploits available resources. We also discuss practical and political considerations that arise from “real world” Grid environments, where technical arguments are often compromised by the needs and preferences of different users, organizations and domain administrators. Conduct of tutorial: The tutorial will consist mostly of a talk supported by a PowerPoint presentation. We also intend to illustrate some concepts with live software demonstrations: the first outlining the process of creating a simple OGSA service using the Globus Toolkit 3.0 (GT3), from simple interface description through to stub generation and wrapping an implementation using the delegation model; the second demonstrating the crystal polymorph application running in a simulated Grid environment. This will enable participants to experience Grid middleware from a developer’s perspective and lend some reality to the concepts and mechanisms the tutorial covers.