
Showing papers on "Database-centric architecture published in 2007"


Journal ArticleDOI
TL;DR: Five industrial software architecture design methods are compared, and it is found that the approaches have a lot in common and more or less match an "ideal" pattern that can be used for further method comparisons.

292 citations


Journal ArticleDOI
TL;DR: A review of six cognitive architectures, namely Soar, ACT-R, ICARUS, BDI, the subsumption architecture and CLARION, pointing to promising directions towards generic and scalable architectures with close analogy to human brains.
Abstract: This article aims to present an account of the state of the art research in the field of integrated cognitive architectures by providing a review of six cognitive architectures, namely Soar, ACT-R, ICARUS, BDI, the subsumption architecture and CLARION. We conduct a detailed functional comparison by looking at a wide range of cognitive components, including perception, memory, goal representation, planning, problem solving, reasoning, learning, and relevance to neurobiology. In addition, we study the range of benchmarks and applications that these architectures have been applied to. Although no single cognitive architecture has provided a full solution at the level of human intelligence, important design principles have emerged, pointing to promising directions towards generic and scalable architectures with close analogy to human brains.

103 citations


Proceedings ArticleDOI
21 Mar 2007
TL;DR: A state of the art on software architecture reconstruction is presented: while a plethora of approaches and techniques support architecture reconstruction, the approaches are often difficult to compare.
Abstract: To maintain and understand large applications, it is crucial to know their architecture. The first problem is that unlike classes and packages, architecture is not explicitly represented in the code. The second problem is that successful applications evolve over time, so their architecture inevitably drifts. Reconstructing the architecture and checking whether it is still valid is therefore an important aid. While there is a plethora of approaches and techniques supporting architecture reconstruction, there is no comprehensive state of the art and it is often difficult to compare the approaches. This article presents a state of the art on software architecture reconstruction approaches.

83 citations


Proceedings ArticleDOI
01 May 2007
TL;DR: In this article, the authors proposed a privacy-enhanced data-centric sensor network (pDCS) which offers different levels of data privacy based on different cryptographic keys, and proposed several query optimization techniques based on Euclidean Steiner Tree and Keyed Bloom Filter.
Abstract: The demand for efficient data dissemination/access techniques to find the relevant data from within a sensor network has led to the development of data-centric sensor networks (DCS), where the sensor data, in contrast to the sensor nodes, are named based on attributes such as event type or geographic location. However, saving data inside a network also creates security problems due to the lack of tamper-resistance of the sensor nodes and the unattended nature of the sensor network. For example, an attacker may simply locate and compromise the node storing the event of his interest. To address these security problems, we present pDCS, a privacy-enhanced DCS network which offers different levels of data privacy based on different cryptographic keys. In addition, we propose several query optimization techniques based on Euclidean Steiner Tree and Keyed Bloom Filter to minimize the query overhead while providing certain query privacy. Finally, detailed analysis and simulations show that the Keyed Bloom Filter scheme can significantly reduce the message overhead with the same level of query delay and maintain a very high level of query privacy.
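The Keyed Bloom Filter idea in the abstract above can be sketched as follows: membership queries are encoded through hash functions keyed with a shared secret, so nodes without the key can route and store the filter without learning which events it encodes. This is only an illustrative sketch (class name, parameters, and the event-string format are invented here, not taken from the paper's implementation):

```python
import hashlib

class KeyedBloomFilter:
    """Bloom filter whose hash functions are keyed with a shared secret,
    so only key holders can construct or test meaningful entries."""

    def __init__(self, key: bytes, size: int = 1024, num_hashes: int = 4):
        self.key = key
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, item: str):
        # Derive num_hashes bit positions from keyed SHA-256 digests;
        # without `key`, the positions are unpredictable.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(
                self.key + i.to_bytes(1, "big") + item.encode()
            ).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item: str):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item: str) -> bool:
        # False positives are possible; false negatives are not.
        return all(self.bits[pos] for pos in self._positions(item))

f = KeyedBloomFilter(key=b"shared-secret")
f.add("event:intrusion@cell42")
print(f.might_contain("event:intrusion@cell42"))  # True
print(f.might_contain("event:fire@cell7"))        # almost certainly False
```

An intermediate node that sees only the bit array cannot invert the keyed hashes to recover the stored event names, which is the privacy property the scheme relies on.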

75 citations


Journal ArticleDOI
TL;DR: It is argued that the proposed architecture permits the interconnection of almost any kind of government body and that it establishes a common ground upon which new standardization levels can be built.

75 citations


Book ChapterDOI
24 Sep 2007
TL;DR: A concern-driven measurement framework for assessing architecture modularity is proposed, encompassing a mechanism for documenting architectural concerns and a suite of concern-oriented architecture metrics.
Abstract: Much of the complexity of software architecture design is derived from the inadequate modularization of key broadly-scoped concerns, such as exception handling, distribution, and persistence. However, conventional architecture metrics are not sensitive to the driving architectural concerns, thereby leading to a number of false positives and false negatives in the design assessment process. Therefore, there is a need for assessment techniques that support a more effective identification of early design modularity anomalies relative to crosscutting concerns. In this context, this paper proposes a concern-driven measurement framework for assessing architecture modularity. It encompasses a mechanism for documenting architectural concerns, and a suite of concern-oriented architecture metrics. We evaluated the usefulness of the proposed framework while comparing the modularity of architecture design alternatives in three different case studies.
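A typical concern-oriented metric of the kind such a framework proposes is concern diffusion: the number of architectural components a documented concern cuts across. A minimal sketch, with invented component and concern names (not the paper's actual metric suite):

```python
# Mapping from architectural component to the concerns it realizes,
# as recorded by a concern-documentation mechanism.
components = {
    "OrderService":   {"business_logic", "exception_handling"},
    "PaymentService": {"business_logic", "exception_handling", "distribution"},
    "AuditLog":       {"persistence"},
    "Replicator":     {"distribution"},
}

def concern_diffusion(components: dict, concern: str) -> int:
    """Number of components the concern touches; a high value for a
    broadly-scoped concern signals poor modularization of that concern."""
    return sum(concern in concerns for concerns in components.values())

print(concern_diffusion(components, "exception_handling"))  # 2
print(concern_diffusion(components, "persistence"))         # 1
```

Unlike conventional coupling or size metrics, this count is computed relative to a named concern, which is what makes the assessment sensitive to crosscutting.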

70 citations


Proceedings ArticleDOI
26 Mar 2007
TL;DR: It is demonstrated that successful optimization for C64-like large-scale multi-core architectures requires a careful analysis that can identify certain domain-specific features of a target application and match them well with some key multi- core architecture features.
Abstract: The rapid revolution in microprocessor chip architecture due to multicore technology is presenting unprecedented challenges to application developers as well as system software designers: how to best exploit the parallelism potential of such multi-core architectures? In this paper, we report an in-depth study of such challenges based on our experience of optimizing the fast Fourier transform (FFT) on the IBM Cyclops-64 chip architecture - a large-scale multi-core chip architecture consisting of 160 thread units, associated memory banks, and an interconnection network that connects them together in a shared-memory organization. We demonstrate how multi-core architectures like the C64 can be used to achieve a high performance implementation of FFT in both the 1D and 2D cases. We analyze the optimization challenges and opportunities, including problem decomposition, load balancing, work distribution, and data reuse, together with the exploitation of C64 architecture features such as the multi-level memory hierarchy and large register files. Furthermore, the experience gained during the hand-tuned optimization process has provided valuable guidance in our compiler optimization design and implementation. The main contributions of this paper include: 1) our study demonstrates that successful optimization for C64-like large-scale multi-core architectures requires a careful analysis that can identify certain domain-specific features of a target application (e.g. FFT) and match them well with some key multi-core architecture features; 2) our hand-tuned optimization process provided quantitative evidence on the importance of each optimization identified in 1); 3) automatic optimization by our compiler, the design and implementation of which is guided by the feedback from 1) and 2), shows excellent results that are often comparable to the results derived from our time-consuming hand-tuned code.

58 citations


01 May 2007
TL;DR: This paper draws from a variety of research results to illustrate how formal approaches to software architecture can lead to enhancements in software quality, including improved clarity of design, support for analysis, and assurance that implementations conform to their intended architecture.
Abstract: Over the past 15 years there has been increasing recognition that careful attention to the design of a system's software architecture is critical to satisfying its requirements for quality attributes such as performance, security, and dependability. As a consequence, during this period the field of software architecture has matured significantly. However, current practices of software architecture rely on relatively informal methods, limiting the potential for fully exploiting architectural designs to gain insight and improve the quality of the resulting system. In this paper we draw from a variety of research results to illustrate how formal approaches to software architecture can lead to enhancements in software quality, including improved clarity of design, support for analysis, and assurance that implementations conform to their intended architecture.

46 citations


Journal ArticleDOI
28 Feb 2007
TL;DR: An architectural framework along with a set of middleware elements, facilitating the integration of perceptual components, sensors, actuators, and context-modeling scripts, comprising sophisticated ubiquitous computing applications in smart spaces is presented.
Abstract: We present an architectural framework along with a set of middleware elements, facilitating the integration of perceptual components, sensors, actuators, and context-modeling scripts, comprising sophisticated ubiquitous computing applications in smart spaces. The architecture puts special emphasis on the integration of perceptual components contributed by a variety of technology providers, which has not been adequately addressed in legacy architectures. Moreover, the introduced architecture allows for intelligent discovery and management of resources. Along with the description of this breadboard architecture, we present its non-functional features and assess its performance. We also outline a rich set of practical prototype pervasive services that have been built, based on this architecture. These services emphasize providing non-obtrusive human-centric assistance (e.g., memory aids, meeting recordings, pertinent information) in the scope of meetings, lectures, and presentations. Experiences from building these services demonstrate the benefits of the introduced architecture.

45 citations


Journal ArticleDOI
TL;DR: The System Architecture and Model Extraction Technique (SAMEtech) described here overcomes a weakness of previous work with "angio traces" in two ways: it only requires standard trace formats and it uses a simpler algorithm which scales up linearly for very large traces.

44 citations


Proceedings ArticleDOI
05 Nov 2007
TL;DR: This paper presents a prototype that implements the derivation of the product-specific architecture as a model transformation described in the Atlas Transformation Language (ATL), and describes how this activity can be automated using a model-driven approach.
Abstract: Product Derivation is one of the central activities in Software Product Lines (SPL). One of the main challenges of the process of product derivation is dealing with complexity, which is caused by the large number of artifacts and dependencies between them. Another major challenge is maximizing development efficiency and reducing time-to-market, while at the same time producing high quality products. One approach to overcome these challenges is to automate the derivation process. To this end, this paper focuses on one particular activity of the derivation process, the derivation of the product-specific architecture, and describes how this activity can be automated using a model-driven approach. The approach derives the product-specific architecture by selectively copying elements from the product-line architecture. The decision as to which elements are included in the derived architecture is based on a product-specific feature configuration. We present a prototype that implements the derivation as a model transformation described in the Atlas Transformation Language (ATL). We conclude with a short overview of related work and directions for future research.
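The selective-copy derivation described above reduces to a filter over the product-line model: an element is copied into the product architecture only when the features it depends on appear in the product's feature configuration. A minimal sketch of that decision rule (element and feature names are invented for illustration; the paper's actual ATL transformation is not reproduced here):

```python
# Each product-line element is annotated with the features that require it;
# an empty set means the element belongs to every product.
product_line = {
    "CoreEngine":     set(),
    "PaymentGateway": {"payments"},
    "FraudChecker":   {"payments", "fraud_detection"},
    "ReportModule":   {"reporting"},
}

def derive_architecture(elements: dict, configuration: set) -> list:
    """Copy an element into the product-specific architecture only when
    every feature it depends on is selected in the configuration."""
    return [name for name, required in elements.items()
            if required <= configuration]

print(derive_architecture(product_line, {"payments"}))
# ['CoreEngine', 'PaymentGateway'] -- FraudChecker also needs fraud_detection
```

In the model-driven setting, the same rule is expressed declaratively as a transformation over model elements rather than as a dictionary filter, but the feature-configuration-driven inclusion decision is the core of it.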

Journal ArticleDOI
TL;DR: The new method, labeled Holistic Product Line Architecture Assessment (HoPLAA), uses a holistic approach that focuses on risks and quality attribute tradeoffs - not only for the common product line architecture, but for the individual product architectures as well.
Abstract: The success of architecture-centric development of software product lines is critically dependent upon the availability of suitable architecture assessment methods. While a number of architecture assessment methods are available and some of them have been widely used in the process of evaluating single product architectures, none of them is equipped to deal with the main challenges of product line development. In this paper we present an adaptation of the Architecture Tradeoff Analysis Method (ATAM) for the task of assessing product line architectures. The new method, labeled Holistic Product Line Architecture Assessment (HoPLAA), uses a holistic approach that focuses on risks and quality attribute tradeoffs - not only for the common product line architecture, but for the individual product architectures as well. In addition, it prescribes a qualitative analytical treatment of variation points using scenarios. The use of the new method is illustrated through a case study.

Proceedings ArticleDOI
20 May 2007
TL;DR: An effort to identify the major concepts in software architecture that can go into meta knowledge, including generic architecture knowledge, through two different techniques.
Abstract: Knowledge management of any domain requires controlled vocabularies, taxonomies, thesauri, ontologies, concept maps and other such artifacts. This paper describes an effort to identify the major concepts in software architecture that can go into such meta knowledge. The concept terms are identified through two different techniques: (1) manually, through the back-of-the-book indexes of some of the major texts in software architecture, and (2) through a semi-automatic technique, by parsing Wikipedia pages. Only generic architecture knowledge is considered. Apart from identifying the important concepts of software architecture, we could also see gaps in the software architecture content in the Wikipedia.

Journal ArticleDOI
TL;DR: Six types of architecture are presented, beginning with the observation-driven Markov decision process as Type-1; each successive type becomes more complete with respect to the necessary functions of autonomous mental development.

13 Feb 2007
TL;DR: This paper proposes an adapted layered architecture for the languages of the Semantic Web based on fundamental aspects for layered architectures in information systems.
Abstract: Tim Berners-Lee proposed a layered architecture for the languages of the Semantic Web in 2000 and suggested an adapted architecture in 2003. We evaluated the architecture according to a list of layered architecture characteristics, and in this paper, based on the aforementioned evaluation, we propose an adapted layered architecture for the languages of the Semantic Web based on fundamental aspects of layered architectures in information systems.

Book ChapterDOI
24 Sep 2007
TL;DR: This work proposes to model architectures using the i* framework, a goal-oriented modelling language that allows representing the functional and non-functional requirements of an architecture using actors and dependencies instead of components and connectors.
Abstract: There is a recognized gap between requirements and architectures. There is also evidence that architecture evaluation, when done at the early phases of the development lifecycle, is an effective way to ensure the quality attributes of the final system. As quality attributes may be satisfied to a different extent by different alternative architectural solutions, an exploration and evaluation of alternatives is often needed. In order to address this issue at the requirements level, we propose to model architectures using the i* framework, a goal-oriented modelling language that allows representing the functional and non-functional requirements of an architecture using actors and dependencies instead of components and connectors. Once the architectures are modelled, we propose guidelines for the generation of alternative architectures based upon existing architectural patterns, and for the definition of structural metrics for the evaluation of the resulting alternative models. The applicability of the approach is shown with the Home Service Robot case study.

Journal ArticleDOI
Grady Booch1
TL;DR: A software development process that swirls around the growth of a software-intensive system's architecture has considerable material value and an architecture-first approach appears to be a reflection of sound development practices.
Abstract: Architecture is an artifact that's governed throughout the software life cycle - from conception through development to deployment and finally evolution, then to adaptation, assimilation, replacement, or abandonment. Similarly, the architect, either as an individual, a role, or a team, lovingly crafts, grows, and governs that architecture as it emerges from the thousands of individual design decisions of which it's composed. In this sense, an architecture-first approach appears to be a reflection of sound development practices. Now, strict agilists might counter that an architecture-first approach is undesirable because we should allow a system's architecture to emerge over time. More than just a reflection, however, a software development process that swirls around the growth of a software-intensive system's architecture has considerable material value.

Journal ArticleDOI
TL;DR: An approach for performance analysis of layered, service-oriented architecture models is proposed, which consists of two phases: a "top-down" propagation of workload parameters, and a "bottom-up" propagation of performance or cost measures.
Abstract: In this article we address the integration of functional models with non-functional models in the context of service-oriented enterprise architecture. Starting from the observation that current approaches to model-driven development have a strong focus on functionality, we argue the necessity of including non-functional aspects as early as possible in the service design process. We distinguish two modelling spaces, the design space and the analysis space, which can be integrated by means of model transformations. Quantitative results obtained in the analysis space, using special-purpose analysis techniques, can be related back to the design models by means of a reverse transformation. This provides a framework for incorporating non-functional analysis into methodological support for e-service development. While, for detailed design models, performance analysis is more or less covered by existing techniques, there is still a gap at the architectural overview level. Therefore, we propose an approach for performance analysis of layered, service-oriented architecture models, which consists of two phases: a "top-down" propagation of workload parameters, and a "bottom-up" propagation of performance or cost measures. By means of an example, we demonstrate the application of the approach and show that a seamless integration with detailed performance analysis methods (e.g., queueing analysis) can be achieved.
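The two-phase idea in the abstract above can be made concrete with a toy layered model: workloads flow downward (each layer multiplies the request rate by how many calls it makes to the layer below), and costs flow upward (each layer's cost per request is its local service time plus the cost of its downstream calls). The layer names and numbers below are invented for illustration, not taken from the paper:

```python
# Layers listed top to bottom; each entry is
# (name, calls made to the layer below per incoming request,
#  local service time per request in ms).
layers = [
    ("business_process",   2.0, 5.0),
    ("application_service", 3.0, 2.0),
    ("infrastructure",      1.0, 1.0),
]

def analyse(layers, top_arrival_rate):
    # Phase 1: top-down propagation of workload (requests/sec per layer).
    rates, rate = [], top_arrival_rate
    for _, calls_below, _ in layers:
        rates.append(rate)
        rate *= calls_below
    # Phase 2: bottom-up propagation of cost (ms per request at each layer).
    costs, cost_below = [], 0.0
    for _, calls_below, service_time in reversed(layers):
        cost = service_time + calls_below * cost_below
        costs.append(cost)
        cost_below = cost
    return rates, list(reversed(costs))

rates, costs = analyse(layers, top_arrival_rate=10.0)
print(rates)  # [10.0, 20.0, 60.0]
print(costs)  # [15.0, 5.0, 1.0]
```

The per-layer rates from phase 1 are exactly the inputs a detailed method such as queueing analysis would need, which is how the architectural-overview analysis can hand off to finer-grained techniques.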

Journal ArticleDOI
Grady Booch1
TL;DR: The architecture of a software-intensive system is largely irrelevant to its end users, yet testers can use a system's architecture to devise tests that are relevant to the particular texture of that implementation.
Abstract: The architecture of a software-intensive system is largely irrelevant to its end users. Far more important to these stakeholders is the system's behavior, exhibited by raw, naked, running code. Most interesting system tests should be based on the use cases that are identified incrementally over the system's life cycle, the same use cases that the system's architects used to guide their design decisions. Testers can conduct other system tests only after the system's architecture is crisp. Just as analysts use a system's architecture as scaffolding along which to climb and examine the details of every edge, so too can testers use a system's architecture to devise tests that are relevant to the particular texture of that implementation.

Book ChapterDOI
24 Sep 2007
TL;DR: The ArchWare-ADL, a formal, well-founded architecture description language, based on the higher-order typed π-calculus, which consists of a set of layers to address the requirements of active architectures, is designed and constructed.
Abstract: The term co-evolution describes the symbiotic relationship between dynamically changing business environments and the software that supports them. Business changes create pressures on the software to evolve, and at the same time technology changes create pressures on the business to evolve. More generally, we are concerned with systems where it is neither economically nor technologically feasible to suspend the operation of the system while it is being evolved. Typically these are long-lived systems in which dynamic co-evolution, whereby a system evolves as part of its own execution in reaction to both predicted and emergent events, is the only feasible option for change. Examples of such systems include continuously running business process models, sensor nets, grid applications, self-adapting/tuning systems, routing systems, control systems, autonomic systems, and pervasive computing applications. Active architectures address both the structural and behavioural requirements of dynamic co-evolving software by modelling software architecture as part of the on-going computation, thereby allowing evolution during execution and formal checking that desired system properties are preserved through evolution. This invited paper presents results on active architectures from the Compliant System Architecture and ArchWare projects. We have designed and constructed the ArchWare-ADL, a formal, well-founded architecture description language, based on the higher-order typed π-calculus, which consists of a set of layers to address the requirements of active architectures. The ArchWare-ADL design principles, concepts and formal notations are presented together with its sophisticated reflective technologies for supporting active architectures and thereby dynamic co-evolution.

Proceedings ArticleDOI
05 Nov 2007
TL;DR: This paper explores the suitability of the i* goal-oriented approach for representing software architectures, checks its properties against those expected of Architecture Description Languages, and defines criteria for resolving the unfulfilled aspects of representing architectures.
Abstract: In order to work at the software architecture level, specification languages and analysis techniques are needed. There exist many proposals that serve that purpose, but few of them address architecture and requirements altogether, leaving a gap between both disciplines. Goal-oriented approaches are suitable for bridging this gap because they allow representing architecture-related concepts (components, nodes, files, etc.) and more abstract concepts (goals, non-functional requirements, etc.) by using the same constructs. In this paper we explore the suitability of the i* goal-oriented approach for representing software architectures. For doing so, we check its properties against the ones suitable for Architecture Description Languages and we define some criteria for solving the unfulfilled aspects in representing the architectures. This paper assumes basic notions on i*.

Proceedings ArticleDOI
20 May 2007
TL;DR: This ongoing research describes an approach for product line synthesis architecture, where design decisions are introduced to promote its reuse, and explores extensibility ideas from software product lines to show how architectures can be extended on the basis of design decisions.
Abstract: Software architectures represent the design of a system for describing its main relevant parts. Recently, recording and documenting architectural design decisions has attracted the attention of the software architecture community. Design decisions are an important piece of the architecting process that must be explicitly documented, but there is little evidence of successful reuse of this architectural knowledge. This work focuses on the reuse of design decisions in order to customize architectures. Specifically, we explore extensibility ideas from software product lines to show how architectures can be extended on the basis of design decisions. The documentation of synthesis architectures, and particularly their reuse, has so far received little attention. This ongoing research describes an approach for product line synthesis architecture, where design decisions are introduced to promote its reuse.

Journal ArticleDOI
Michael von der Beeck1
TL;DR: A tight integration of both architecture levels—on the conceptual and on the tool level—with related development phases such as requirements engineering, behaviour modeling, code generation as well as version and configuration management resulting in a seamless overall development process is presented.
Abstract: This paper presents a modeling approach for the development of software for electronic control units in the automotive domain. The approach supports the development of two related architecture models in the overall development process: the logical architecture provides a graphical, quite abstract representation of a typically large set of automotive functions. On this abstraction level no design decisions are taken. The technical architecture provides a software and a hardware representation in separate views: the software architecture describes the software realization of functions as software components, whereas the hardware architecture models hardware entities, on which the software components are deployed. Logical as well as technical architectures only model structural information, but no behavioural information. A tight integration of both architecture levels—on the conceptual and on the tool level—with related development phases such as requirements engineering, behaviour modeling, code generation as well as version and configuration management, resulting in a seamless overall development process, is presented. This architecture modeling approach has been developed within a safety-relevant project at BMW Group. Positive as well as negative experiences with the application of this approach are described.

Proceedings ArticleDOI
11 Mar 2007
TL;DR: Novel compiler techniques are developed in order to generate high-quality code for the reduced-cost accelerators and prevent performance loss to the extent possible, and show that the increase in the total number of execution cycles in reduced-interconnect accelerators is less than 1% of the fully-connected accelerator.
Abstract: The demand for high performance has driven acyclic computation accelerators into extensive use in modern embedded and desktop architectures. Accelerators that are ideal from a software perspective are difficult or impossible to integrate in many modern architectures, though, due to area and timing requirements. This reality is coupled with the observation that many application domains under-utilize accelerator hardware, because of the narrow data they operate on and the nature of their computation. In this work, we take advantage of these facts to design accelerators capable of executing in modern architectures by narrowing datapath width and reducing interconnect. Novel compiler techniques are developed in order to generate high-quality code for the reduced-cost accelerators and prevent performance loss to the extent possible. First, data width profiling is used to statistically determine how wide program data will be at run time. This information is used by the subgraph mapping algorithm to optimally select subgraphs for execution on targeted narrow accelerators. Overall, our data-centric compilation techniques achieve on average 6.5%, and up to 12%, speedup over previous subgraph mapping algorithms for 8-bit accelerators. We also show that, with appropriate compiler support, the increase in the total number of execution cycles in reduced-interconnect accelerators is less than 1% relative to the fully-connected accelerator.
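Data width profiling as described in the abstract above amounts to recording, for each program value observed at run time, how many bits it actually needs, so the compiler can judge whether an operation fits a narrow accelerator datapath. A toy illustration of the statistic being gathered (not the paper's compiler infrastructure; the sample values are invented):

```python
from collections import Counter

def observed_width(value: int) -> int:
    """Minimum number of bits needed to represent a runtime value."""
    return max(1, value.bit_length())

def profile_widths(samples):
    """Histogram of observed bit widths, as a width profiler would collect."""
    return Counter(observed_width(v) for v in samples)

# Hypothetical runtime values of one program variable across executions.
samples = [3, 200, 17, 255, 64, 5, 130, 99]
hist = profile_widths(samples)
max_width = max(hist)
print(hist)
print(f"fits an 8-bit accelerator datapath: {max_width <= 8}")  # True here
```

A subgraph mapper would consult such histograms per value: subgraphs whose values statistically stay within the narrow width are mapped to the accelerator, while the rest fall back to the general-purpose datapath.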

01 Jan 2007
TL;DR: A novel approach to cognitive architecture exploration in which multiple cognitive architectures are integrated in their entirety, to significantly increase the application breadth and utility of cognitive architectures generally; the resulting architecture favors a breadth-first rather than depth-first approach to cognitive modeling.
Abstract: The paper describes a novel approach to cognitive architecture exploration in which multiple cognitive architectures are integrated in their entirety. The goal is to increase significantly the application breadth and utility of cognitive architectures generally. The resulting architecture favors a breadth-first rather than depth-first approach to cognitive modeling by focusing on matching the broad power of human cognition rather than any specific data set. It uses human cognition as a functional blueprint for meeting the requirements for general intelligence. For example, a chief design principle is inspired by the power of human perception and memory to reduce the effective complexity of problem solving. Such complexity reduction is reflected in an emphasis on integrating subsymbolic and statistical mechanisms with symbolic ones. The architecture realizes a “cognitive pyramid” in which the scale and complexity of a problem is successively reduced via three computational layers: Proto-cognition (information filtering and clustering), Micro-cognition (memory retrieval modulated by expertise) and Macro-cognition (knowledge-based reasoning). The consequence of this design is that knowledge-based reasoning is used primarily for non-routine, novel situations; more familiar situations are handled by experience-based memory retrieval. Filtering and clustering improve overall scalability by reducing the elements to be considered by higher levels. The paper describes the design of the architecture, two prototype explorations, and evaluation and limitations.

Journal ArticleDOI
TL;DR: A method for analyzing architecture potential on the basis of dependencies between quality attributes is presented and applied and an explicit representation and correlation of such dependencies provides decision support for architectural concerns.
Abstract: The share of software in embedded systems has been growing steadily in recent years. Thus, software architecture as well as its evaluation have become important parts of the development of embedded systems, to describe, assess, and assure sound architecture as a basis for high-quality systems. Furthermore, design space exploration can be based on architecture evaluation. To achieve an efficient exploration process, architectural decisions need to be taken into account as part of the architecture. In this paper, a method for analyzing architecture potential on the basis of dependencies between quality attributes is presented and applied. An explicit representation and correlation of such dependencies provides decision support for architectural concerns. Not only can suboptimal decisions be avoided, but valuable options are also highlighted. Besides the quality of an architecture, knowledge of how to achieve and even improve the quality can be analyzed. The latter is the concern of the architecture potential analysis presented in this paper. Furthermore, architectural decisions can be documented and will be traceable and justifiable with respect to the development rationale. The ongoing development process can then be based on dependable and well documented architectural decisions. The predictability of change impacts is increased. Thus, time and costs can be saved by avoiding suboptimal changes.

Proceedings ArticleDOI
20 May 2007
TL;DR: This tutorial describes various concepts and approaches to manage the architecture knowledge from both management and technical perspectives, and discusses various approaches to characterize architecture knowledge based on the requirements of a particular domain.
Abstract: Capturing the technical knowledge, contextual information, and rationale surrounding the design decisions underpinning system architectures can greatly improve the software development process. If not managed, this critical knowledge is implicitly embedded in the architecture, becoming tacit knowledge which erodes as personnel on the project change. Moreover, the unavailability of architecture knowledge precludes organizations from growing their architectural capabilities. In this tutorial, we highlight the benefits and challenges in managing software architecture knowledge. We discuss various approaches to characterize architecture knowledge based on the requirements of a particular domain. We describe various concepts and approaches to manage the architecture knowledge from both management and technical perspectives. We also demonstrate the utility of captured knowledge to support software architecture activities with a case study covering the use of architecture knowledge management techniques and tools in an industrial project.

Proceedings ArticleDOI
20 May 2007
TL;DR: A software architectural style for large networks is proposed, based on a formal mathematical study of crystal growth, that will exhibit properties of discreetness, fault-tolerance, and scalability.
Abstract: Large networks, such as the Internet, pose an ideal medium for solving computationally intensive problems, such as NP-complete problems, yet no well-scaling architecture for Internet-sized systems exists. I propose a software architectural style for large networks, based on a formal mathematical study of crystal growth, that will exhibit properties of (1) discreetness (nodes on the network cannot learn the algorithm or input of the computation), (2) fault-tolerance (malicious, faulty, and unstable nodes cannot break the computation), and (3) scalability (communication among the nodes does not increase with network or problem size). I plan to evaluate the style both theoretically and empirically for these three properties.

Proceedings ArticleDOI
26 Nov 2007
TL;DR: Diapason, a formal framework that supports SOA design, checking, execution, and evolution, is presented, addressing in particular service-based architecture maintenance and evolution.
Abstract: Web services are often employed to create widely distributed, evolvable applications from existing components that constitute a service-based software system. Service-Oriented Architectures promote loose coupling, service distribution, dynamicity, and agility, and introduce new engineering issues. As the services involved in a SOA are remote and autonomous, the SOA designer does not control them and unpredictable behaviour can occur. Service orchestration is therefore a key issue for meeting expectations and reaching objectives. Thus, Service-Oriented Architectures have to be designed, analyzed, and deployed with rigor in order to be fully useful and quality-aware. Orchestration languages (BPEL4WS, BPML, etc.) fall short in several respects due to their lack of formalization and expressiveness, particularly when addressing service-based architecture maintenance and evolution. This paper presents Diapason, a formal framework that allows us to formally support SOA design, checking, execution, and evolution.

Proceedings ArticleDOI
08 Oct 2007
TL;DR: This work considers the challenge of enhancing sensor networks for surveillance and global security with increased distributed data processing capabilities, including multi-sensor fusion, data aggregation or mining, and rule-based alert generation and proposes a novel architecture that will enable the creation of more resilient and complex monitoring applications.
Abstract: We consider the challenge of enhancing sensor networks for surveillance and global security with increased distributed data processing capabilities, including multi-sensor fusion, data aggregation or mining, and rule-based alert generation. We advocate a novel architecture that will enable the creation of more resilient and complex monitoring applications, and we exemplify its benefits in a chemical accident scenario. The architecture introduces new processing nodes in the field and derives the requirements for the software they will run. We propose the use of a service-oriented architecture (SOA) to program and deploy the data processing applications. We analyze existing and ongoing work within the Web Services community and conclude that it is possible to implement the architecture with an appropriate combination of COTS (commercial off-the-shelf) software components. We conclude with our plans to move forward in this direction and validate the approach on a hardware and software testbed.
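The fusion/aggregation/alert chain that such an in-field processing node would run can be sketched as follows. The functions, thresholds, and readings are illustrative assumptions for the chemical-accident scenario, not the paper's implementation.

```python
# Hypothetical sketch of an in-field processing node: fuse readings from
# several sensors, aggregate over a time window, and apply a rule to raise
# an alert. Names and thresholds are invented for illustration.

from statistics import mean

def fuse(readings):
    """Multi-sensor fusion: average concurrent gas-concentration readings."""
    return mean(readings)

def aggregate(window):
    """Data aggregation: reduce a time window of fused values to one figure."""
    return max(window)

def alert_rule(value, threshold=50.0):
    """Rule-based alert generation for a chemical-accident scenario."""
    return "ALERT: possible chemical leak" if value > threshold else "ok"

# Three sensors report ppm levels over four time steps.
samples = [
    [12.0, 11.5, 12.2],
    [14.1, 13.8, 14.0],
    [61.0, 59.5, 60.2],   # spike seen by all three sensors
    [15.0, 14.9, 15.3],
]
window = [fuse(step) for step in samples]
print(alert_rule(aggregate(window)))
# ALERT: possible chemical leak
```

In the SOA framing of the paper, each of these three stages would be exposed as a service, so the pipeline can be recomposed or redeployed without reprogramming the sensor nodes themselves.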