Open Access Journal Article

Quality of Service for Workflows and Web Service Processes

TL;DR: In this article, the authors present a predictive QoS model that makes it possible to compute the quality of service (QoS) for workflows automatically based on atomic task QoS attributes.
About
This article was published in the Journal of Web Semantics on 2004-04-01 and is currently open access. It has received 807 citations to date. The article focuses on the topics: Mobile QoS & Workflow management system.


Citations
Proceedings Article

An approach for QoS-aware service composition based on genetic algorithms

TL;DR: Genetic algorithms, while slower than integer programming, represent a more scalable choice and are better suited to handling generic QoS attributes.
Journal Article

A Taxonomy of Workflow Management Systems for Grid Computing

TL;DR: In this article, a taxonomy that characterizes and classifies various approaches for building and executing workflows on Grids is proposed, highlighting the design and engineering similarities and differences of the state of the art in Grid workflow systems and identifying the areas that need further research.
Proceedings Article

Combining global optimization with local selection for efficient QoS-aware service composition

TL;DR: This paper proposes a solution that combines global optimization with local selection techniques to benefit from the advantages of both worlds and significantly outperforms existing solutions in terms of computation time while achieving close-to-optimal results.
Proceedings Article

Constraint driven Web service composition in METEOR-S

TL;DR: This work presents a constraint-driven Web service composition tool in METEOR-S, which allows process designers to bind Web services to an abstract process based on business and process constraints and to generate an executable process.
Proceedings Article

Selecting skyline services for QoS-based web service composition

TL;DR: This paper proposes an approach based on the notion of skyline to effectively and efficiently select services for composition, reducing the number of candidate services to be considered, and discusses how a provider can improve its service to become more competitive and increase its potential of being included in composite applications.
References
Journal Article

Basic Local Alignment Search Tool

TL;DR: A new approach to rapid sequence comparison, basic local alignment search tool (BLAST), directly approximates alignments that optimize a measure of local similarity, the maximal segment pair (MSP) score.
Journal Article

Improved tools for biological sequence comparison.

TL;DR: Three computer programs for comparisons of protein and DNA sequences can be used to search sequence data bases, evaluate similarity scores, and identify periodic structures based on local sequence similarity.
Proceedings Article

DAML-S: semantic markup for web services

TL;DR: The overall structure of the ontology, the service profile for advertising services, and the process model for the detailed description of the operation of services are described, and DAML-S is compared with several industry efforts to define standards for characterizing services on the Web.
Book Chapter

On Non-Functional Requirements in Software Engineering

TL;DR: This chapter reviews the state of the art on the treatment of non-functional requirements (hereafter, NFRs), while providing some prospects for future directions.
Book

Competing Against Time: How Time-Based Competition is Reshaping Global Markets

TL;DR: Time consumption, like cost, is quantifiable and therefore manageable; it can be used as a strategic weapon to improve a company's competitiveness by reducing, if not eliminating, delays and by using response-time advantages to attract profitable customers.
Frequently Asked Questions (12)
Q1. What are the contributions mentioned in the paper "Quality of service for workflows and web service processes"?

In this paper, the authors present a predictive QoS model that makes it possible to compute the quality of service for workflows automatically based on atomic task QoS attributes. The authors also present the implementation of their QoS model for the METEOR workflow system. The authors describe the components that have been changed or added, and discuss how they interact to enable the management of QoS. 
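
As an illustration of what computing workflow QoS from atomic task attributes can look like, here is a minimal Python sketch for the simplest case, a purely sequential workflow, where execution times and costs add up and reliabilities multiply. The class and function names are illustrative assumptions, not the METEOR implementation described in the paper.

```python
# Minimal sketch: composing workflow QoS from atomic task QoS attributes
# for a purely sequential workflow. Names (Task, sequential_qos) are
# illustrative, not taken from the METEOR code base.
from dataclasses import dataclass

@dataclass
class Task:
    time: float         # expected execution time
    cost: float         # expected cost
    reliability: float  # probability of successful completion (0..1)

def sequential_qos(tasks):
    """Aggregate QoS of tasks executed one after another:
    times and costs add up, reliabilities multiply."""
    total_time = sum(t.time for t in tasks)
    total_cost = sum(t.cost for t in tasks)
    total_reliability = 1.0
    for t in tasks:
        total_reliability *= t.reliability
    return Task(total_time, total_cost, total_reliability)

workflow = [Task(10.0, 5.0, 0.99), Task(30.0, 2.5, 0.95), Task(5.0, 1.0, 0.999)]
print(sequential_qos(workflow))  # Task(time=45.0, cost=8.5, reliability≈0.9396)
```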

One of the advantages of using Web services is to enable easier and greater interoperability and integration among systems and applications. 

As workflow systems carry out more complex and mission-critical applications, QoS analysis serves to ensure that each application meets user requirements. 

One of the primary goals of using a database system loosely coupled with the workflow system is to enable different tools to be used to analyze QoS, such as project management and statistical tools. 

The second class, the distributional class, corresponds to the specification of a distribution function that statistically describes task behavior at runtime.
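
A minimal sketch of how a distributional-class specification might be represented and consumed by a simulator, assuming a task's duration is described by a normal distribution; the dictionary layout and parameter values are illustrative, not taken from the paper.

```python
# Sketch of a distributional-class QoS specification: a task's duration is
# described by a distribution rather than a single estimate, so a simulator
# can sample it on each run. The normal distribution and its parameters are
# illustrative assumptions.
import random

task_time_spec = {"distribution": "normal", "mean": 12.0, "std_dev": 2.5}

def sample_task_time(spec):
    if spec["distribution"] == "normal":
        return max(0.0, random.gauss(spec["mean"], spec["std_dev"]))
    raise ValueError(f"unsupported distribution: {spec['distribution']}")

simulated_times = [sample_task_time(task_time_spec) for _ in range(1000)]
print(sum(simulated_times) / len(simulated_times))  # close to the specified mean
```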

Their approach is based on continuous-time Markov chains and Markov reward models to predict the performance, availability, and performability of a WfMS under a given load. 

The re-computation of QoS estimates for tasks and for transition probabilities is based on runtime data from past workflow executions stored in the database log.
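
One plausible re-estimation scheme is sketched below, under the assumption that logged task durations are available as a simple list; the blending weight is an illustrative parameter, not the paper's exact re-computation formula.

```python
# Sketch of QoS re-computation from the database log: blend the designer's
# initial estimate with the average observed in past executions, giving
# more weight to runtime evidence. The weight value is an assumption.
def reestimate_task_time(designer_estimate, logged_durations, runtime_weight=0.8):
    if not logged_durations:
        return designer_estimate
    observed_mean = sum(logged_durations) / len(logged_durations)
    return (1 - runtime_weight) * designer_estimate + runtime_weight * observed_mean

print(reestimate_task_time(10.0, [12.1, 11.4, 13.0]))  # pulled toward the observed mean
```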

The values specified in the basic class are typically employed by mathematical methods in order to compute workflow QoS metrics, while the distributional class information is used by simulation systems to compute workflow QoS (Chandrasekaran, Silver et al.).

It is expected that as the workflow system executes more instances, the reliability of the DNA Sequencing workflow will decrease.
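
The drift becomes clear with a log-based reliability estimate of the form below (the notation is ours, not necessarily the paper's exact formulation): each additional failed instance recorded in the log pushes the estimate down.

    R(t) = 1 - #failed(t) / #executed(t)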

A task t remains in the pre-init state as long as its task scheduler is waiting for another transition to be enabled in order to place the task into an initial state. 
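
A small sketch of that life cycle, assuming a handful of states beyond the two named in the text; the extra states and the guard function are illustrative, not the exact METEOR task structure.

```python
# Sketch of the task life cycle described above. PRE_INIT and INITIAL mirror
# the text; the remaining states and the transition guard are assumptions.
from enum import Enum, auto

class TaskState(Enum):
    PRE_INIT = auto()   # waiting for an enabling transition
    INITIAL = auto()    # scheduled, ready to execute
    EXECUTING = auto()
    DONE = auto()
    FAILED = auto()

def try_enable(state, transition_enabled):
    """Move a task out of pre-init only once its enabling transition fires."""
    if state is TaskState.PRE_INIT and transition_enabled:
        return TaskState.INITIAL
    return state
```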

When instances of a workflow w have already been executed, the data used to re-compute the probabilities come from the initial designer specifications for workflow w, from other executed instances of workflow w, and, if available, from the instance of workflow w for which the authors wish to predict the QoS.
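
A hedged sketch of how those three data sources could be blended into a single transition-probability estimate; the fixed weights and the handling of missing sources are illustrative assumptions, not the paper's actual re-computation rule.

```python
# Sketch of transition-probability re-computation from the three data sources
# named above. The weights are assumptions; in practice they could depend on
# how many instances have been logged.
def reestimate_transition_probability(designer_p, other_instances_p, current_instance_p,
                                      weights=(0.2, 0.5, 0.3)):
    w_designer, w_others, w_current = weights
    estimates = [(w_designer, designer_p), (w_others, other_instances_p),
                 (w_current, current_instance_p)]
    # Ignore sources that are not available (None), renormalising the weights.
    available = [(w, p) for w, p in estimates if p is not None]
    total_w = sum(w for w, _ in available)
    return sum(w * p for w, p in available) / total_w

print(reestimate_transition_probability(0.7, 0.65, None))  # ≈0.664
```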

Task cost (C) is the cost incurred when a task t is executed; it can be broken down into two major components: enactment cost and realization cost.
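
In symbols, the decomposition reads

    C(t) = C_enactment(t) + C_realization(t)

where, roughly, the enactment cost covers the workflow system's own overhead (such as scheduling and monitoring of instances) and the realization cost covers the execution of the task's actual work; the subscript names are ours.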