Topic

Workflow

About: Workflow is a research topic. Over its lifetime, 31,996 publications have been published within this topic, receiving 498,339 citations.


Papers
Journal ArticleDOI
TL;DR: Taxonomies of the cloud workflow scheduling problem and its techniques are proposed based on an analytical review, identifying the aspects and classifications unique to workflow scheduling in the cloud environment in three categories: scheduling process, task, and resource.

92 citations

Proceedings ArticleDOI
01 Nov 2006
TL;DR: This paper uses AI planning over object models to automatically generate workflows that bring the data center from its current state to the desired state, and discusses optimizations to Partial Order Planning algorithms for the provisioning domain.
Abstract: Today's enterprise data centers support thousands of mission-critical business applications composed of multiple distributed heterogeneous components. Application components exhibit complex dependencies on the configuration of multiple data center network, middleware, and related application resources. Applications are also associated with extended life-cycles, migrating from development to testing, staging, and production environments, with frequent roll-backs. Maintaining end-to-end data center operational integrity and quality requires careful planning of (1) application deployment design, (2) resource selection, (3) provisioning operation selection, parameterization, and ordering, and (4) provisioning operation execution. Current data center management products are focused on workflow-based automation of the deployment processes. Workflows are of limited value because they hard-code many aspects of the process, and are thus sensitive to topology changes. An emerging and promising class of model-based tools is providing new methods for designing detailed deployment topologies based on a set of requirements and constraints. In this paper, we describe an approach to bridging the gap between generated "desired state" models and the elemental procedural provisioning operations supported by data center resources. In our approach, we represent the current and desired state of the data center using object models. We use AI planning to automatically generate workflows that bring the data center from its current state to the desired state. We discuss our optimizations to Partial Order Planning algorithms for the provisioning domain. We validated our approach by developing and integrating a prototype with a state-of-the-art provisioning product. We also present initial results of a performance study.

92 citations
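The abstract above frames provisioning as a planning problem: operations have preconditions and effects, and a planner orders them to move the data center from its current state to a desired state. As a rough illustration only, the sketch below chains operations by their preconditions and effects; it is a simplified forward-chaining stand-in for the paper's Partial Order Planning approach, and every class and operation name in it is hypothetical.

```python
# Minimal illustrative sketch of state-based provisioning planning.
# All names (Operation, provision_plan, the example operations) are
# hypothetical; this is NOT the paper's Partial Order Planning implementation.

from dataclasses import dataclass

@dataclass(frozen=True)
class Operation:
    name: str
    preconditions: frozenset   # facts that must hold before the operation runs
    effects: frozenset         # facts that hold after the operation runs

def provision_plan(current_state, desired_state, operations):
    """Greedily chain operations whose preconditions are satisfied until
    the desired state is reached (a stand-in for a real POP algorithm)."""
    state, plan = set(current_state), []
    while not desired_state <= state:
        applicable = [op for op in operations
                      if op.preconditions <= state and not op.effects <= state]
        if not applicable:
            raise RuntimeError("no applicable operation; goal unreachable")
        op = applicable[0]
        plan.append(op.name)
        state |= op.effects
    return plan

# Hypothetical data-center provisioning operations.
ops = [
    Operation("install_os", frozenset({"server_allocated"}), frozenset({"os_installed"})),
    Operation("install_middleware", frozenset({"os_installed"}), frozenset({"middleware_ready"})),
    Operation("deploy_app", frozenset({"middleware_ready"}), frozenset({"app_deployed"})),
]

print(provision_plan({"server_allocated"}, {"app_deployed"}, ops))
# ['install_os', 'install_middleware', 'deploy_app']
```

A real Partial Order Planner keeps the ordering constraints partial rather than committing to them greedily, which is part of what makes the generated workflows robust to topology changes.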

Journal ArticleDOI
TL;DR: A novel directional and non-local-convergent particle swarm optimization (DNCPSO) is proposed that employs a non-linear inertia weight together with selection and mutation operations in a directional search process; it can dramatically reduce makespan and cost and obtain a good compromise solution.

92 citations
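The TL;DR above names one concrete ingredient of DNCPSO, a non-linear inertia weight. The sketch below is a generic particle swarm optimizer with a quadratically decaying inertia weight and a toy objective standing in for a makespan/cost model; the directional search, selection, and mutation operators that distinguish DNCPSO are not reproduced here.

```python
# Illustrative sketch of particle swarm optimization with a non-linear
# inertia weight. The objective function and all parameter values are
# toy examples, not the paper's scheduling model.

import random

def nonlinear_inertia(t, t_max, w_max=0.9, w_min=0.4):
    # Quadratically decaying inertia weight (one common non-linear schedule).
    return w_min + (w_max - w_min) * (1 - t / t_max) ** 2

def pso(objective, dim, n_particles=20, iters=100, c1=2.0, c2=2.0):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)[:]
    for t in range(iters):
        w = nonlinear_inertia(t, iters)
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

# Toy objective: sphere function.
print(pso(lambda x: sum(v * v for v in x), dim=3))
```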

Journal ArticleDOI
TL;DR: This paper describes two e-science infrastructures: Science and Engineering Applications Grid (SEAGrid) and molecular modeling and parametrization (ParamChem), which share a similar three-tier computational infrastructure that consists of a front-end client, a middleware web services layer, and a remote HPC computational layer.

92 citations
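The three-tier layout described above (front-end client, middleware web services, remote HPC layer) follows a common pattern. As a minimal sketch only, assuming a Flask-style middleware with a hypothetical /jobs endpoint (not the actual SEAGrid or ParamChem interfaces), the middleware tier might look like this:

```python
# Minimal sketch of the three-tier pattern: a front-end client posts a job
# description to a middleware web service, which hands it to the HPC tier.
# Endpoint path, job fields, and the use of Flask are illustrative assumptions.

from flask import Flask, request, jsonify

app = Flask(__name__)

def submit_to_hpc(job):
    # Middleware -> HPC tier: a real deployment would submit to a remote
    # batch scheduler; here we just fabricate a job id for illustration.
    return {"job_id": "demo-001", "application": job["application"]}

@app.route("/jobs", methods=["POST"])
def create_job():
    # Client -> middleware tier: accept a job description over HTTP.
    job = request.get_json()
    return jsonify(submit_to_hpc(job)), 201

if __name__ == "__main__":
    app.run(port=8080)
```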

Journal ArticleDOI
TL;DR: This paper examines the various factors that affect the completion time of a fine-granularity astronomy workflow using Condor as the workflow execution engine, and shows that changing the system parameters that influence these factors and restructuring the workflow can drastically reduce the completion time of this class of workflows.
Abstract: Large-scale applications can be expressed as a set of tasks with data dependencies between them, also known as application workflows. Due to the scale and data processing requirements of these applications, they require Grid computing and storage resources. So far, the focus has been on developing easy-to-use interfaces for composing these workflows and finding an optimal mapping of tasks in the workflow to the Grid resources in order to minimize the completion time of the application. After this mapping is done, a workflow execution engine is required to run the workflow over the mapped resources. In this paper, we show that the performance of the workflow execution engine in executing the workflow can also be a critical factor in determining the workflow completion time. Using Condor as the workflow execution engine, we examine the various factors that affect the completion time of a fine-granularity astronomy workflow. We show that changing the system parameters that influence these factors and restructuring the workflow can drastically reduce the completion time of this class of workflows. We also examine the effect of the optimizations developed for the astronomy application on a coarser-granularity biology application. We were able to reduce the completion time of the Montage and the Tomography application workflows by 90% and 50%, respectively.

92 citations
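The Montage result above hinges on restructuring a fine-granularity workflow so that the execution engine pays per-task scheduling overhead less often. The sketch below shows the underlying idea on a toy DAG: topologically order the tasks, then cluster them into fewer batches. The task names and the naive clustering rule are hypothetical, not the paper's method.

```python
# Illustrative sketch: represent a workflow as a DAG of task dependencies,
# order it topologically, and cluster small tasks into larger batches so an
# execution engine such as Condor schedules fewer, coarser jobs.

from collections import defaultdict, deque

def topological_order(deps):
    """deps maps task -> set of tasks it depends on."""
    indegree = {t: len(d) for t, d in deps.items()}
    children = defaultdict(list)
    for t, d in deps.items():
        for parent in d:
            children[parent].append(t)
    ready = deque(t for t, n in indegree.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for c in children[t]:
            indegree[c] -= 1
            if indegree[c] == 0:
                ready.append(c)
    return order

def cluster(order, size):
    """Group consecutive tasks in topological order into batches of `size`."""
    return [order[i:i + size] for i in range(0, len(order), size)]

# Toy Montage-like workflow: many independent projections feeding one merge.
deps = {f"project_{i}": set() for i in range(6)}
deps["merge"] = {f"project_{i}" for i in range(6)}

print(cluster(topological_order(deps), size=3))
# [['project_0', 'project_1', 'project_2'],
#  ['project_3', 'project_4', 'project_5'], ['merge']]
```

In practice, clustering must also respect data dependencies, so tasks are usually grouped only within a level of the DAG rather than across consecutive positions in an arbitrary topological order.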


Network Information
Related Topics (5)
Software: 130.5K papers, 2M citations, 89% related
Information system: 107.5K papers, 1.8M citations, 84% related
The Internet: 213.2K papers, 3.8M citations, 82% related
Deep learning: 79.8K papers, 2.1M citations, 82% related
Cluster analysis: 146.5K papers, 2.9M citations, 81% related
Performance Metrics
No. of papers in the topic in previous years:
Year: Papers
2024: 1
2023: 4,414
2022: 9,010
2021: 1,461
2020: 1,579
2019: 1,702