
Showing papers on "Workflow published in 2004"


Journal ArticleDOI
TL;DR: In this article, the object-oriented image analysis software eCognition is used to integrate remote sensing imagery and GIS for mapping, environmental monitoring, disaster management, and civil and military intelligence.
Abstract: Remote sensing from airborne and spaceborne platforms provides valuable data for mapping, environmental monitoring, disaster management and civil and military intelligence. However, to exploit the full value of these data, the appropriate information has to be extracted and presented in a standard format so it can be imported into geo-information systems, thus enabling efficient decision processes. The object-oriented approach can contribute to powerful automatic and semi-automatic analysis for most remote sensing applications. Synergetic use with pixel-based or statistical signal processing methods exploits the rich information content. Here, we explain principal strategies of object-oriented analysis, discuss how the combination with fuzzy methods allows expert knowledge to be implemented, and describe a representative example of the proposed workflow from remote sensing imagery to GIS. The strategies are demonstrated using the first object-oriented image analysis software on the market, eCognition, which provides an appropriate link between remote sensing imagery and GIS.

2,539 citations


Journal ArticleDOI
TL;DR: A new algorithm is presented to extract a process model from a so-called "workflow log" containing information about the workflow process as it is actually being executed and represent it in terms of a Petri net.
Abstract: Contemporary workflow management systems are driven by explicit process models, i.e., a completely specified workflow design is required in order to enact a given workflow process. Creating a workflow design is a complicated time-consuming process and, typically, there are discrepancies between the actual workflow processes and the processes as perceived by the management. Therefore, we have developed techniques for discovering workflow models. The starting point for such techniques is a so-called "workflow log" containing information about the workflow process as it is actually being executed. We present a new algorithm to extract a process model from such a log and represent it in terms of a Petri net. However, we also demonstrate that it is not possible to discover arbitrary workflow processes. We explore a class of workflow processes that can be discovered. We show that the α-algorithm can successfully mine any workflow represented by a so-called SWF-net.

1,953 citations
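To make the idea concrete, here is a minimal Python sketch (an illustration of the underlying concepts, not the paper's implementation) of the log-based ordering relations the α-algorithm builds on: a directly-follows relation extracted from log traces, from which causal and parallel relations are derived.

```python
def ordering_relations(traces):
    """Derive basic alpha-algorithm ordering relations from a workflow log.

    traces: list of activity sequences, one per case.
    a -> b (causal) holds if a is directly followed by b but never the
    reverse; a || b (parallel) holds if both directions occur in the log.
    """
    follows = {(tr[i], tr[i + 1]) for tr in traces for i in range(len(tr) - 1)}
    causal = {(a, b) for (a, b) in follows if (b, a) not in follows}
    parallel = {(a, b) for (a, b) in follows if (b, a) in follows}
    return causal, parallel

# Two cases over tasks A..D; B and C are interleaved, so the log implies
# A -> B, A -> C, B -> D, C -> D, and B || C.
log = [["A", "B", "C", "D"], ["A", "C", "B", "D"]]
causal, parallel = ordering_relations(log)
```

In the full algorithm, these relations are then used to construct the places and arcs of the resulting Petri net.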


Journal ArticleDOI
TL;DR: The Taverna project has developed a tool for the composition and enactment of bioinformatics workflows for the life sciences community; workflows are written in a new language called Scufl, whereby each step within a workflow represents one atomic task.
Abstract: Motivation: In silico experiments in bioinformatics involve the co-ordinated use of computational tools and information repositories. A growing number of these resources are being made available with programmatic access in the form of Web services. Bioinformatics scientists will need to orchestrate these Web services in workflows as part of their analyses. Results: The Taverna project has developed a tool for the composition and enactment of bioinformatics workflows for the life sciences community. The tool includes a workbench application which provides a graphical user interface for the composition of workflows. These workflows are written in a new language called the simple conceptual unified flow language (Scufl), whereby each step within a workflow represents one atomic task. Two examples are used to illustrate the ease with which in silico experiments can be represented as Scufl workflows using the workbench application. Availability: The Taverna workflow system is available as open source and can be downloaded with example Scufl workflows from http://taverna.sourceforge.net

1,709 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a predictive QoS model that makes it possible to compute the quality of service (QoS) for workflows automatically based on atomic task QoS attributes.

807 citations


Proceedings ArticleDOI
21 Jun 2004
TL;DR: The Kepler scientific workflow system provides domain scientists with an easy-to-use yet powerful system for capturing scientific workflows (SWFs), a formalization of the ad-hoc process that a scientist may go through to get from raw data to publishable results.
Abstract: Most scientists conduct analyses and run models in several different software and hardware environments, mentally coordinating the export and import of data from one environment to another. The Kepler scientific workflow system provides domain scientists with an easy-to-use yet powerful system for capturing scientific workflows (SWFs). SWFs are a formalization of the ad-hoc process that a scientist may go through to get from raw data to publishable results. Kepler attempts to streamline the workflow creation and execution process so that scientists can design, execute, monitor, re-run, and communicate analytical procedures repeatedly with minimal effort. Kepler is unique in that it seamlessly combines high-level workflow design with execution and runtime interaction, access to local and remote data, and local and remote service invocation. SWFs are superficially similar to business process workflows but have several challenges not present in the business workflow scenario. For example, they often operate on large, complex and heterogeneous data, can be computationally intensive and produce complex derived data products that may be archived for use in reparameterized runs or other workflows. Moreover, unlike business workflows, SWFs are often dataflow-oriented as witnessed by a number of recent academic systems (e.g., DiscoveryNet, Taverna and Triana) and commercial systems (Scitegic/Pipeline-Pilot, Inforsense). In a sense, SWFs are often closer to signal-processing and data streaming applications than they are to control-oriented business workflow applications.

746 citations


Book ChapterDOI
07 Nov 2004
TL;DR: The experience in applying KAoS services to ensure policy compliance for Semantic Web Services workflow composition and enactment is described and how this work has uncovered requirements for increasing the expressivity of policy beyond what can be done with description logic is described.
Abstract: In this paper we describe our experience in applying KAoS services to ensure policy compliance for Semantic Web Services workflow composition and enactment. We are developing these capabilities within the context of two applications: Coalition Search and Rescue (CoSAR-TS) and Semantic Firewall (SFW). We describe how this work has uncovered requirements for increasing the expressivity of policy beyond what can be done with description logic (e.g., role-value-maps), and how we are extending our representation and reasoning mechanisms in a carefully controlled manner to that end. Since KAoS employs OWL for policy representation, it fits naturally with the use of OWL-S workflow descriptions generated by the AIAI I-X planning system in the CoSAR-TS application. The advanced reasoning mechanisms of KAoS are based on the JTP inference engine and enable the analysis of classes and instances of processes from a policy perspective. As the result of analysis, KAoS concludes whether a particular workflow step is allowed by policy and whether the performance of this step would incur additional policy-generated obligations. Issues in the representation of processes within OWL-S are described. Besides what is done during workflow composition, aspects of policy compliance can be checked at runtime when a workflow is enacted. We illustrate these capabilities through two application examples. Finally, we outline plans for future work.

636 citations


Journal ArticleDOI
01 May 2004
TL;DR: The article presents a synopsis of the major HRI issues in reducing the number of humans it takes to control a robot, maintaining performance with geographically distributed teams with intermittent communications, and encouraging acceptance within the existing social structure.
Abstract: Rescue robotics has been suggested by a recent DARPA/NSF study as an application domain for the research in human-robot integration (HRI). This paper provides a short tutorial on how robots are currently used in urban search and rescue (USAR) and discusses the HRI issues encountered over the past eight years. A domain theory of the search activity is formulated. The domain theory consists of two parts: 1) a workflow model identifying the major tasks, actions, and roles in robot-assisted search (e.g., a workflow model) and 2) a general information flow model of how data from the robot is fused by various team members into information and knowledge. The information flow model also captures the types of situation awareness needed by each agent in the rescue robot system. The article presents a synopsis of the major HRI issues in reducing the number of humans it takes to control a robot, maintaining performance with geographically distributed teams with intermittent communications, and encouraging acceptance within the existing social structure.

593 citations


Book ChapterDOI
TL;DR: The Pegasus system that can map complex workflows onto the Grid and takes an abstract description of a workflow and finds the appropriate data and Grid resources to execute the workflow is described.
Abstract: In this paper we describe the Pegasus system that can map complex workflows onto the Grid. Pegasus takes an abstract description of a workflow and finds the appropriate data and Grid resources to execute the workflow. Pegasus is being released as part of the GriPhyN Virtual Data Toolkit and has been used in a variety of applications ranging from astronomy and biology to gravitational-wave science and high-energy physics. A deferred planning mode of Pegasus is also introduced.

544 citations


Journal ArticleDOI
01 Jul 2004
TL;DR: This survey systematically classifies different approaches in the area of adaptive workflows and discusses their strengths and limitations with respect to typical problems related to dynamic WF change.
Abstract: The capability to dynamically adapt in-progress workflows (WF) is an essential requirement for any workflow management system (WfMS). This fact has been recognized by the WF community for a long time and different approaches in the area of adaptive workflows have been developed so far. This survey systematically classifies these approaches and discusses their strengths and limitations with respect to typical problems related to dynamic WF change. Based on this classification, we present important criteria for the correct adaptation of running workflows and analyze how existing approaches satisfy them. Furthermore, we provide a detailed comparison of these approaches and sketch important further issues related to dynamic change.

467 citations


Proceedings ArticleDOI
20 Sep 2004
TL;DR: A mechanism is introduced that determines the QoS of a Web service composition by aggregating the QoS dimensions of the individual services, building upon abstract composition patterns derived from van der Aalst et al.'s comprehensive collection of workflow patterns.
Abstract: Contributions in the field of Web services have identified that (a) finding matches between semantic descriptions of advertised and requested services and (b) nonfunctional characteristics - the quality of service (QoS) - are the most crucial criteria for composition of Web services. A mechanism is introduced that determines the QoS of a Web service composition by aggregating the QoS dimensions of the individual services. This makes it possible to verify whether a set of services selected for composition satisfies the QoS requirements for the whole composition. The aggregation performed builds upon abstract composition patterns, which represent basic structural elements of a composition, like sequence, loop, or parallel execution. This work focuses on workflow management environments. We define composition patterns that are derived from van der Aalst et al.'s comprehensive collection of workflow patterns. The resulting aggregation schema supports the same structural elements as found in workflows. Furthermore, the aggregation of several QoS dimensions is discussed.

400 citations
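As a rough illustration of pattern-based aggregation (the encoding and QoS dimensions below are illustrative choices, not taken from the paper): response time typically adds up along a sequence and is dominated by the slowest branch of a parallel AND-split/join, while reliability multiplies in both cases.

```python
def aggregate(node):
    """Aggregate (response_time, reliability) bottom-up over composition patterns.

    node is either a leaf ("task", time, reliability) or a composite
    ("seq"|"par", [children]). Sequence: times add, reliabilities multiply.
    Parallel: the slowest branch dominates time; all branches must succeed.
    """
    kind = node[0]
    if kind == "task":
        _, t, r = node
        return t, r
    parts = [aggregate(child) for child in node[1]]
    reliability = 1.0
    for _, r in parts:
        reliability *= r
    times = [t for t, _ in parts]
    return (sum(times) if kind == "seq" else max(times)), reliability

# One task followed by two parallel branches.
wf = ("seq", [("task", 2.0, 0.99),
              ("par", [("task", 3.0, 0.95), ("task", 5.0, 0.98)])])
time, reliability = aggregate(wf)  # time = 2.0 + max(3.0, 5.0) = 7.0
```

Loops (the third pattern the paper mentions) would additionally weight the body's QoS by the expected number of iterations.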


Book ChapterDOI
01 Jan 2004
TL;DR: In this paper, the impact of e-business on supply chain integration can be described along the dimensions of information integration, synchronized planning, coordinated workflow, and new business models, and significant value can be created by e-Business enabled supply-chain integration.
Abstract: e-Business has emerged as a key enabler to drive supply chain integration. Businesses can use the Internet to gain global visibility across their extended network of trading partners and help them respond quickly to changing customer demand captured over the Internet. The impact of e-business on supply chain integration can be described along the dimensions of information integration, synchronized planning, coordinated workflow, and new business models. As a result, many of the core supply chain principles and concepts can now be put into practice much more effectively using e-business. Significant value can be created by e-business enabled supply chain integration.

Journal ArticleDOI
01 Nov 2004
TL;DR: AGENTWORK as discussed by the authors is a workflow management system supporting automated workflow adaptations in a comprehensive way, which uses temporal estimates to determine which remaining parts of running workflows are affected by an exception and is able to predictively perform suitable adaptations.
Abstract: Current workflow management systems still lack support for dynamic and automatic workflow adaptations. However, this functionality is a major requirement for next-generation workflow systems to provide sufficient flexibility to cope with unexpected failure events. We present the concepts and implementation of AGENTWORK, a workflow management system supporting automated workflow adaptations in a comprehensive way. A rule-based approach is followed to specify exceptions and necessary workflow adaptations. AGENTWORK uses temporal estimates to determine which remaining parts of running workflows are affected by an exception and is able to predictively perform suitable adaptations. This helps to ensure that necessary adaptations are performed in time with minimal user interaction, which is especially valuable in complex applications such as medical treatments.

Journal Article
TL;DR: Proceedings of the Cooperative Information Systems (CoopIS) and Ontologies, DataBases, and Applications of Semantics (ODBASE) 2004 international conferences, spanning workflow/process/Web services, database management and transactions, schema integration, agents, events, P2P and collaboration, trust/security/contracts, knowledge extraction, the Semantic Web in practice, ontologies and information retrieval, and information integration.
Abstract: Cooperative Information Systems (CoopIS) 2004 International Conference- CoopIS 2004 International Conference (International Conference on Cooperative Information Systems) PC Co-chairs' Message- Keynote- Business Process Optimization- Workflow/Process/Web Services, I- Discovering Workflow Transactional Behavior from Event-Based Log- A Flexible Mediation Process for Large Distributed Information Systems- Exception Handling Through a Workflow- Workflow/Process/Web Services, II- A Flexible and Composite Schema Matching Algorithm- Analysis, Transformation, and Improvements of ebXML Choreographies Based on Workflow Patterns- The Notion of Business Process Revisited- Workflow/Process/Web Services, III- Disjoint and Overlapping Process Changes: Challenges, Solutions, Applications- Untangling Unstructured Cyclic Flows - A Solution Based on Continuations- Making Workflow Models Sound Using Petri Net Controller Synthesis- Database Management/Transaction- Concurrent Undo Operations in Collaborative Environments Using Operational Transformation- Refresco: Improving Query Performance Through Freshness Control in a Database Cluster- Automated Supervision of Data Production - Managing the Creation of Statistical Reports on Periodic Data- Schema Integration/Agents- Deriving Sub-schema Similarities from Semantically Heterogeneous XML Sources- Supporting Similarity Operations Based on Approximate String Matching on the Web- Managing Semantic Compensation in a Multi-agent System- Modelling with Ubiquitous Agents a Web-Based Information System Accessed Through Mobile Devices- Events- A Meta-service for Event Notification- Classification and Analysis of Distributed Event Filtering Algorithms- P2P/Collaboration- A Collaborative Model for Agricultural Supply Chains- FairNet - How to Counter Free Riding in Peer-to-Peer Data Structures- Supporting Collaborative Layouting in Word Processing- A Reliable Content-Based Routing Protocol over Structured Peer-to-Peer Networks- 
Applications, I- Covering Your Back: Intelligent Virtual Agents in Humanitarian Missions Providing Mutual Support- Dynamic Modelling of Demand Driven Value Networks- An E-marketplace for Auctions and Negotiations in the Constructions Sector- Applications, II- Managing Changes to Engineering Products Through the Co-ordination of Human and Technical Activities- Towards Automatic Deployment in eHome Systems: Description Language and Tool Support- A Prototype of a Context-Based Architecture for Intelligent Home Environments- Trust/Security/Contracts- Trust-Aware Collaborative Filtering for Recommender Systems- Service Graphs for Building Trust- Detecting Violators of Multi-party Contracts- Potpourri- Leadership Maintenance in Group-Based Location Management Scheme- TLS: A Tree-Based DHT Lookup Service for Highly Dynamic Networks- Minimizing the Network Distance in Distributed Web Crawling- Ontologies, DataBases, and Applications of Semantics (ODBASE) 2004 International Conference- ODBASE 2004 International Conference (Ontologies, DataBases, and Applications of Semantics) PC Co-chairs' Message- Keynote- Helping People (and Machines) Understanding Each Other: The Role of Formal Ontology- Knowledge Extraction- Automatic Initiation of an Ontology- Knowledge Extraction from Classification Schemas- Semantic Web in Practice- Generation and Management of a Medical Ontology in a Semantic Web Retrieval System- Semantic Web Based Content Enrichment and Knowledge Reuse in E-science- The Role of Foundational Ontologies in Manufacturing Domain Applications- Intellectual Property Rights Management Using a Semantic Web Information System- Ontologies and IR- Intelligent Retrieval of Digital Resources by Exploiting Their Semantic Context- The Chrysostom Knowledge Base: An Ontology of Historical Interactions- Text Simplification for Information-Seeking Applications- Information Integration- Integration of Integrity Constraints in Federated Schemata Based on Tight Constraining- Modal 
Query Language for Databases with Partial Orders- Composing Mappings Between Schemas Using a Reference Ontology- Assisting Ontology Integration with Existing Thesauri

01 Jan 2004
TL;DR: A series of Workflow Resource Patterns are described that aim to capture the various ways in which resources are represented and utilized in workflows and are used as the basis for a detailed comparison of a number of commercially available workflow management systems and business process modelling languages.
Abstract: Workflow systems seek to provide an implementation vehicle for complex, recurring business processes. Notwithstanding this common objective, there are a variety of distinct features offered by commercial workflow management systems. These differences result in significant variations in the ability of distinct tools to represent and implement the plethora of requirements that may arise in contemporary business processes. Many of these requirements recur quite frequently during the requirements analysis activity for workflow systems and abstractions of these requirements serve as a useful means of identifying the key components of workflow languages. Previous work has identified a number of Workflow Control Patterns and Workflow Data Patterns, which characterize the range of control flow and data constructs that might be encountered when modelling and analysing workflows. In this paper, we describe a series of Workflow Resource Patterns that aim to capture the various ways in which resources are represented and utilized in workflows. By delineating these Patterns in a form that is independent of specific workflow technologies and modelling languages, we are able to provide a comprehensive treatment of the resource perspective and we subsequently use these Patterns as the basis for a detailed comparison of a number of commercially available workflow management systems and business process modelling languages.

Book ChapterDOI
17 Jun 2004
TL;DR: This paper introduces the approach, defines metrics, and presents a tool to mine social networks from event logs, combining concepts from workflow management and social network analysis.
Abstract: Increasingly, information systems log historic information in a systematic way. Workflow management systems, but also ERP, CRM, SCM, and B2B systems often provide a so-called "event log", i.e., a log recording the execution of activities. Unfortunately, the information in these event logs is rarely used to analyze the underlying processes. Process mining aims at improving this by providing techniques and tools for discovering process, control, data, organizational, and social structures from event logs. This paper focuses on mining social networks. This is possible because event logs typically record information about the users executing the activities recorded in the log. To do this, we combine concepts from workflow management and social network analysis. This paper introduces the approach, defines metrics, and presents a tool to mine social networks from event logs.
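One of the simplest metrics of this kind is "handover of work": how often one performer's activity is directly followed by another performer's activity within the same case. A toy Python sketch of the idea (illustrative only, not the paper's tool):

```python
from collections import Counter

def handover_of_work(cases):
    """Count how often work passes directly from one performer to another.

    cases: list of per-case event lists, each event an (activity, performer)
    pair in execution order. Returns a Counter over (from_user, to_user)
    pairs; the counts define the weighted edges of a social network.
    """
    handovers = Counter()
    for events in cases:
        for (_, u1), (_, u2) in zip(events, events[1:]):
            if u1 != u2:
                handovers[(u1, u2)] += 1
    return handovers

log = [
    [("register", "ann"), ("check", "bob"), ("decide", "carol")],
    [("register", "ann"), ("check", "bob"), ("decide", "bob")],
]
net = handover_of_work(log)  # ann->bob twice, bob->carol once
```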

Patent
27 May 2004
TL;DR: In this paper, a set of policy rules are established in connection with these service level objectives, and an update of the configuation of the storage network, such as a provisioning of storage resources for the application, is performed according to a workflow that implements the policy rules, which allows the service-level objectives of the application to be automatically satisfied by the new provisioning.
Abstract: Policy based management of storage resources in a storage network. Service level objectives are associated with storage resource requestors such as applications. A set of policy rules is established in connection with these service level objectives. An update of the configuration of the storage network, such as a provisioning of storage resources for the application, is performed according to a workflow that implements the policy rules, which allows the service level objectives of the application to be automatically satisfied by the new provisioning. Metrics are used to ensure that service level objectives continue to be met.
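The core idea of matching service-level objectives against resource capabilities can be sketched in a few lines (the pool names and metrics below are hypothetical, not from the patent):

```python
def provision(pools, objectives):
    """Pick the first storage pool that satisfies every service-level objective.

    pools: list of (name, capabilities), capabilities a dict of metric -> value.
    objectives: dict of metric -> required minimum.
    Returns the chosen pool name, or None if no pool qualifies.
    """
    for name, caps in pools:
        if all(caps.get(metric, 0) >= needed
               for metric, needed in objectives.items()):
            return name
    return None

pools = [("bronze", {"iops": 500, "replicas": 1}),
         ("gold",   {"iops": 5000, "replicas": 3})]
slo = {"iops": 2000, "replicas": 2}
choice = provision(pools, slo)  # "gold"
```

In the patent's framing, such a rule evaluation would be one step of a larger provisioning workflow, with metrics monitored afterwards to verify the objectives continue to hold.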

Proceedings ArticleDOI
05 Jan 2004
TL;DR: This paper discusses the design principles, functionality, and application of the proposed GridAnt workflow manager, an extensible client-side workflow management system, called GridAnt, developed.
Abstract: Process management is an extremely important concept in both business and scientific communities. Several workflow management tools have been proposed in recent years offering advanced functionality in various domains. In the business world, workflow vendors offer commercial and customized solutions targeting specific users. In the scientific world, several open-source workflow management tools are freely available. However, they are directed toward service aggregation rather than distributed process management. Little consideration is given to the needs of the client in terms of mapping the process flow of the client. In the grid community it is essential that grid users have such a tool available, enabling them to orchestrate complex workflows on the fly without substantial help from the service providers. At the same time it is important that the grid user not be burdened with the intricacies of the workflow system. With the perspective of the grid user in mind, an extensible client-side workflow management system, called GridAnt, has been developed. This paper discusses the design principles, functionality, and application of the proposed GridAnt workflow manager.

Proceedings Article
01 Jan 2004
TL;DR: This paper identifies and justifies the importance of data modelling in overall workflows specification and verification, and illustrates and defines several potential data flow problems that, if not detected prior to workflow deployment may prevent the process from correct execution, execute process on inconsistent data or even lead to process suspension.
Abstract: A complete workflow specification requires careful integration of many different process characteristics. Decisions must be made as to the definitions of individual activities, their scope, the order of execution that maintains the overall business process logic, the rules governing the discipline of work list scheduling to performers, identification of time constraints and more. The goal of this paper is to address an important issue in workflow modelling and specification, which is data flow: its modelling, specification and validation. Researchers have neglected this dimension of process analysis for some time, mainly focusing on structural considerations with limited verification checks. In this paper, we identify and justify the importance of data modelling in overall workflow specification and verification. We illustrate and define several potential data flow problems that, if not detected prior to workflow deployment, may prevent the process from executing correctly, cause it to execute on inconsistent data, or even lead to process suspension. A discussion on essential requirements of the workflow data model in order to support data validation is also given.
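One such anomaly, missing data (an item read before any activity has produced it), can be sketched as a simple static check over a sequential activity ordering. The representation and activity names below are illustrative, not the paper's formalism:

```python
def missing_data(activities):
    """Detect the 'missing data' anomaly in a sequential workflow spec.

    activities: ordered list of (name, inputs, outputs). Any data item an
    activity reads before some earlier activity has written it is flagged.
    """
    available, problems = set(), []
    for name, inputs, outputs in activities:
        for item in inputs:
            if item not in available:
                problems.append((name, item))
        available.update(outputs)
    return problems

spec = [
    ("receive_order", [], ["order"]),
    ("check_credit",  ["order", "credit_report"], ["approval"]),  # credit_report never produced
    ("ship",          ["order", "approval"], []),
]
issues = missing_data(spec)  # [("check_credit", "credit_report")]
```

A full validator would also have to handle branching and parallel paths, where an item may be produced on some but not all routes to an activity.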

Journal ArticleDOI
TL;DR: In this paper, the authors give an overview of the organizational aspects of workflow technology in the context of the workflow life cycle, provide a review of existing work, and develop guidelines for the design of a workflow-enabled organization, which can be used by both workflow vendors and users.
Abstract: Business processes automation requires the specification of process structures as well as the definition of resources involved in the execution of these processes. While the modeling of business processes and workflows is well researched, the link between the organizational elements and process activities is less well understood, and current developments in the web services choreography area completely neglect the organizational aspect of workflow applications. The purpose of this paper is to give an overview of the organizational aspects of workflow technology in the context of the workflow life cycle, to provide a review of existing work, and to develop guidelines for the design of a workflow-enabled organization, which can be used by both workflow vendors and users.

Posted Content
01 Jan 2004
TL;DR: This book offers a comprehensive introduction to workflow management, with an overview of workflow terminology and organization and detailed coverage of workflow modeling with Petri nets, which facilitate communication between designers and users; it includes case studies, review exercises, and a glossary.
Abstract: This book offers a comprehensive introduction to workflow management, the management of business processes with information technology. By defining, analyzing, and redesigning an organization's resources and operations, workflow management systems ensure that the right information reaches the right person or computer application at the right time. The book provides a basic overview of workflow terminology and organization, as well as detailed coverage of workflow modeling with Petri nets. Because Petri nets make definitions easier to understand for nonexperts, they facilitate communication between designers and users. The book includes a chapter of case studies, review exercises, and a glossary. A special Web site developed by the authors, www.workflowcourse.com, features animation, interactive examples, lecture materials, exercises and solutions, relevant links, and other valuable resources for the classroom.
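The core mechanics of the Petri-net view of workflows (tokens in places, transitions that consume and produce them) can be sketched in a few lines of Python. This is a toy illustration of the general firing rule, not the book's notation:

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds a token."""
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# A two-step workflow net: start -(register)-> p1 -(archive)-> end
register = (["start"], ["p1"])
archive = (["p1"], ["end"])
m = {"start": 1}
m = fire(m, register)
m = fire(m, archive)  # the token reaches the sink place: the case completes
```

In workflow nets, a case is a token travelling from the unique source place to the unique sink place, and properties such as soundness are checked on this token game.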

Proceedings ArticleDOI
06 Jun 2004
TL;DR: An extension to the Triana PSE is presented to facilitate graphical Web service discovery, composition and invocation; Triana is part of the GridLab and GridOneD projects and is used in the GEO 600 project.
Abstract: Service composition refers to the aggregation of services to build complex applications to achieve client requirements. It is an important challenge to make it possible for users to construct complex workflows transparently, thereby insulating them from the complexity of interacting with numerous heterogeneous services. We present an extension to the Triana PSE to facilitate graphical Web service discovery, composition and invocation. Our framework has several novel features which distinguish it from other work in this area. First, users can graphically create complex service compositions. Second, Triana allows the user to share the composite service as a BPEL4WS graph or expose it as a service in a one-click manner. Third, Triana allows the user to easily carry out "what-if" analysis by altering existing workflows. Fourth, Triana allows the user to record provenance data for a workflow. Finally, our framework allows the user to execute the composed graph on a Grid or P2P network. Triana is a part of the GridLab and GridOneD projects and is used in the GEO 600 project.

Journal ArticleDOI
TL;DR: The paper presents a general and comprehensive correctness criterion for ensuring compliance of in-progress WF instances with a modified WF schema, and which rules and which information are needed at mininum for satisfying this criterion.
Abstract: Process-oriented support of collaborative work is an important challenge today. At first glance, Workflow Management Systems (WfMS) seem to be very suitable tools for realizing team-work processes. However, such processes have to be frequently adapted, e.g., due to process optimizations or when process goals change. Unfortunately, runtime adaptability still seems to be an unsolvable problem for almost all existing WfMS. Usually, process changes can be accomplished by modifying a corresponding (graphical) workflow (WF) schema. Especially for long-running processes, however, it is extremely important that such changes can be propagated to already running WF instances as well, but without causing inconsistencies and errors. The paper presents a general and comprehensive correctness criterion for ensuring compliance of in-progress WF instances with a modified WF schema. For different kinds of WF schema changes, it is precisely stated which rules and which information are needed at minimum for satisfying this criterion.
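The replay intuition behind such a compliance criterion can be sketched as follows (the schema encoding is a deliberately simplified assumption, not the paper's formalism): a running instance remains compliant with a modified schema only if its execution history could also have been produced under that schema.

```python
def compliant(schema, start, history):
    """Replay-based compliance check, in miniature.

    schema: map from each activity to the set of activities that may
    directly follow it; start: the set of admissible first activities.
    Returns True iff the executed history is a valid prefix under schema.
    """
    allowed = start
    for activity in history:
        if activity not in allowed:
            return False
        allowed = schema.get(activity, set())
    return True

old_history = ["A", "B"]
# Modified schema: activity C was inserted between A and B.
new_schema = {"A": {"C"}, "C": {"B"}, "B": set()}
ok = compliant(new_schema, {"A"}, old_history)  # False: B no longer directly follows A
```

An instance that had only executed ["A"] so far would still be compliant and could migrate to the new schema; the one above could not.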

Proceedings ArticleDOI
01 Oct 2004
TL;DR: A new code partitioning algorithm is given to partition a BPEL program represented as a program dependence graph, with the goal of minimizing communication costs and maximizing the throughput of multiple concurrent instances of the input program.
Abstract: Distributed enterprise applications today are increasingly being built from services available over the web. A unit of functionality in this framework is a web service, a software application that exposes a set of "typed" connections that can be accessed over the web using standard protocols. These units can then be composed into a composite web service. BPEL (Business Process Execution Language) is a high-level distributed programming language for creating composite web services. Although a BPEL program invokes services distributed over several servers, the orchestration of these services is typically under centralized control. Because performance and throughput are major concerns in enterprise applications, it is important to remove the inefficiencies introduced by the centralized control. In a distributed, or decentralized orchestration, the BPEL program is partitioned into independent sub-programs that interact with each other without any centralized control. Decentralization can increase parallelism and reduce the amount of network traffic required for an application. This paper presents a technique to partition a composite web service written as a single BPEL program into an equivalent set of decentralized processes. It gives a new code partitioning algorithm to partition a BPEL program represented as a program dependence graph, with the goal of minimizing communication costs and maximizing the throughput of multiple concurrent instances of the input program. In contrast, much of the past work on dependence-based partitioning and scheduling seeks to minimize the completion time of a single instance of a program running in isolation. The paper also gives a cost model to estimate the throughput of a given code partition.

Book ChapterDOI
07 Jun 2004
TL;DR: This paper describes the implementation of a system supporting YAWL (Yet Another Workflow Language), and presents the architecture and functionality of the system and zoom into the control-flow, data, and operational perspectives.
Abstract: This paper describes the implementation of a system supporting YAWL (Yet Another Workflow Language). YAWL is based on a rigorous analysis of existing workflow management systems and related standards using a comprehensive set of workflow patterns. This analysis shows that contemporary workflow systems, relevant standards (e.g. XPDL, BPML, BPEL4WS), and theoretical models such as Petri nets have problems supporting essential patterns. This inspired the development of YAWL by taking Petri nets as a starting point and introducing mechanisms that provide direct support for the workflow patterns identified. As a proof of concept we have developed a workflow management system supporting YAWL. In this paper, we present the architecture and functionality of the system and zoom into the control-flow, data, and operational perspectives.
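The token semantics YAWL builds on can be illustrated with a minimal Petri-net interpreter (a sketch of ordinary Petri-net firing, not YAWL's implementation; the workflow and place names are invented):

```python
# Minimal Petri net: transitions consume tokens from input places and
# produce tokens in output places.

class PetriNet:
    def __init__(self, transitions, marking):
        # transitions: name -> (input places, output places)
        self.transitions = transitions
        self.marking = dict(marking)  # place -> token count

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking.get(p, 0) > 0 for p in ins)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"{t} is not enabled")
        ins, outs = self.transitions[t]
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] = self.marking.get(p, 0) + 1

# AND-split / AND-join workflow pattern: 'split' forks two parallel
# branches; 'join' cannot fire until both branches have completed.
net = PetriNet(
    transitions={
        "split": (["start"], ["b1", "b2"]),
        "taskA": (["b1"], ["d1"]),
        "taskB": (["b2"], ["d2"]),
        "join":  (["d1", "d2"], ["end"]),
    },
    marking={"start": 1},
)
net.fire("split")
assert not net.enabled("join")        # both branches still pending
net.fire("taskA"); net.fire("taskB")
net.fire("join")
assert net.marking["end"] == 1
```

Patterns such as multi-merge or cancellation regions, which motivated YAWL's extensions, are exactly the ones plain nets like this struggle to express directly.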

Journal ArticleDOI
TL;DR: This work describes a process to systematize and make explicit the translation of document-based knowledge into workflow-integrated clinical decision support systems via the Guideline Elements Model.

Journal ArticleDOI
01 Oct 2004
TL;DR: A Petri-Net-based state transition approach that binds states of private workflow tasks to their adjacent workflow view-task and develops a CWA for view-based cross-organisational workflow execution.
Abstract: Interconnecting business processes across systems and organisations is considered to provide significant benefits, such as greater process transparency, higher degrees of integration, facilitation of communication, and consequently higher throughput in a given time interval. However, achieving these benefits requires tackling constraints; in the context of this paper these are the privacy requirements of the involved workflows and their mutual dependencies. Workflow views are a promising conceptual approach to the issue of privacy; however, this approach requires addressing the interdependencies between a workflow view and its adjacent private workflow. In this paper we focus on three aspects concerning the support for execution of cross-organisational workflows that have been modelled with a workflow view approach: (i) communication between the entities of a view-based workflow model, (ii) their impact on an extended workflow engine, and (iii) the design of a cross-organisational workflow architecture (CWA). We consider communication aspects in terms of state dependencies and control flow dependencies. We propose tightly coupling private workflow and workflow view via state dependencies, while loosely coupling workflow views via control flow dependencies. We introduce a Petri-Net-based state transition approach that binds states of private workflow tasks to their adjacent workflow view-task. On the basis of these communication aspects we develop a CWA for view-based cross-organisational workflow execution. Its concepts are valid for mediated and unmediated interactions and are independent of any particular technology. The concepts are demonstrated by a scenario, run by two extended workflow management systems.
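The state-dependency idea can be sketched as follows (an illustration of tight state coupling only, not the paper's Petri-net formalism; all task and state names are invented): a view task mirrors the progress of the private tasks it abstracts, so a partner organisation observes only the view.

```python
# A view task aggregates the states of its bound private tasks.

class ViewTask:
    def __init__(self, name):
        self.name = name
        self.bound = []           # private tasks abstracted by this view task
        self.state = "ready"

    def notify(self):
        states = [t.state for t in self.bound]
        if states and all(s == "completed" for s in states):
            self.state = "completed"
        elif any(s in ("running", "completed") for s in states):
            self.state = "running"

class PrivateTask:
    def __init__(self, name, view_task):
        self.name, self.state = name, "ready"
        self.view_task = view_task
        view_task.bound.append(self)

    def set_state(self, new_state):
        self.state = new_state
        self.view_task.notify()   # state dependency: view mirrors progress

view = ViewTask("ProcessOrder")          # what the partner organisation sees
check = PrivateTask("CheckStock", view)  # private tasks stay hidden
ship = PrivateTask("ShipGoods", view)

check.set_state("running")
assert view.state == "running"
check.set_state("completed")
assert view.state == "running"           # ship not done yet
ship.set_state("completed")
assert view.state == "completed"
```

Control-flow dependencies between organisations would then be expressed only against view tasks, which is the loose-coupling half of the proposal.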

Journal ArticleDOI
01 Jul 2004
TL;DR: This paper proposes a novel framework to support workflow modeling and design by adapting workflow cases from a repository of process models, based on a structured workflow lifecycle and leverages recent advances in model management and case-based reasoning techniques.
Abstract: In order to support efficient workflow design, recent commercial workflow systems are providing templates of common business processes. These templates, called cases, can be modified individually or collectively into a new workflow to meet the business specification. However, little research has been done on how to manage workflow models, including issues such as model storage, model retrieval, model reuse and assembly. In this paper, we propose a novel framework to support workflow modeling and design by adapting workflow cases from a repository of process models. Our approach to workflow model management is based on a structured workflow lifecycle and leverages recent advances in model management and case-based reasoning techniques. Our contributions include a conceptual model of workflow cases, a similarity flooding algorithm for workflow case retrieval, and a domain-independent AI planning approach to workflow case composition. We illustrate the workflow model management framework with a prototype system called Case-Oriented Design Assistant for Workflow Modeling (CODAW).
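The general idea behind similarity flooding (after Melnik et al.) can be sketched on two toy workflow graphs (a simplified illustration, not CODAW's implementation; the node names and seed pair are invented): similarity between a pair of nodes propagates to pairs of their successors.

```python
# Simplified similarity flooding: node-pair similarities spread along
# matching edges of two directed workflow graphs, then are normalized.

def similarity_flooding(g1, g2, seed, iters=10):
    """g1, g2: dicts node -> list of successor nodes.
    seed: initial similarity for known (node1, node2) anchor pairs."""
    sim = dict(seed)
    for _ in range(iters):
        new = dict(seed)
        for (a, b), s in sim.items():
            for a2 in g1.get(a, []):        # propagate to successor pairs
                for b2 in g2.get(b, []):
                    new[(a2, b2)] = new.get((a2, b2), 0.0) + s
        norm = max(new.values())
        sim = {k: v / norm for k, v in new.items()}
    return sim

# Two toy workflows: receive -> approve -> ship  vs  recv -> check -> send
g1 = {"receive": ["approve"], "approve": ["ship"]}
g2 = {"recv": ["check"], "check": ["send"]}
seed = {("receive", "recv"): 1.0}           # known anchor pair
sim = similarity_flooding(g1, g2, seed)
# Similarity floods down both chains, aligning approve/check and ship/send.
assert sim[("approve", "check")] > 0
assert sim[("ship", "send")] > 0
```

A retrieval component would rank stored cases by the aggregate similarity of their best node alignment against the query model.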

Journal ArticleDOI
TL;DR: A new approach to the automatic execution of business processes using event-condition-action (ECA) rules that can be automatically triggered by an active database is proposed.
Abstract: Changes in recent business environments have created the necessity for more efficient and effective business process management. A workflow management system is software that assists in defining business processes and automatically controls their execution. We propose a new approach to the automatic execution of business processes using event-condition-action (ECA) rules that can be automatically triggered by an active database. First, we propose the concept of blocks that can classify process flows into several patterns. A block is a minimal unit that can specify the behaviors represented in a process model. An algorithm is developed to detect blocks from a process definition network and transform it into a hierarchical tree model. The behaviors in each block type are modeled using ACTA formalism. This provides a theoretical basis from which ECA rules are identified. The proposed ECA rule-based approach shows that it is possible to execute the workflow using the active capability of the database without users' intervention. The operation of the proposed methods is illustrated through an example process.
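The ECA triggering style can be illustrated with a minimal in-memory rule engine (a sketch only; in the paper the rules are derived from block detection and fired by an active database, and all event and task names here are invented):

```python
# Each rule is (event, condition, action): on a matching event whose
# condition holds against the context, the action runs the next task.

class ECAEngine:
    def __init__(self):
        self.rules = []           # list of (event, condition, action)
        self.log = []             # tasks started, for inspection

    def on(self, event, condition, action):
        self.rules.append((event, condition, action))

    def raise_event(self, event, ctx):
        for ev, cond, act in self.rules:
            if ev == event and cond(ctx):
                act(self, ctx)

engine = ECAEngine()
# Exclusive-choice block: after 'check_credit' completes, route to
# 'ship' on approval, otherwise to 'reject'.
engine.on("check_credit.done",
          lambda ctx: ctx["approved"],
          lambda e, ctx: e.log.append("ship"))
engine.on("check_credit.done",
          lambda ctx: not ctx["approved"],
          lambda e, ctx: e.log.append("reject"))

engine.raise_event("check_credit.done", {"approved": True})
assert engine.log == ["ship"]
```

With an active database, the event would instead be a row update on a workflow-state table and the action an SQL trigger body, removing the need for a polling engine.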

Journal ArticleDOI
Lin Liu1, Eric Yu1
01 Apr 2004
TL;DR: The combined use of a goal-oriented requirements language (GRL) and a scenario-oriented notation Use Case Maps (UCM) for representing design knowledge of information systems is proposed.
Abstract: In order to design a better information system, a designer would like to have notations to visualize how design experts' know-how can be applied according to one's specific social and technology situation. We propose the combined use of a goal-oriented requirements language (GRL) and a scenario-oriented notation Use Case Maps (UCM) for representing design knowledge of information systems. Goal-oriented modelling is used throughout the requirements and design process. In GRL, goals are used to depict business objectives and system requirements, both functional and non-functional. Tasks are used to represent different ways for achieving goals. Means-ends reasoning is used to explore alternative solutions and their operationalizations into implementable system constructs. Social context is modelled in terms of dependency relationships among agents and roles. Scenarios expressed in UCM are used to describe elaborated business processes or workflow. The complementary use of goal-oriented modelling with GRL and scenario modelling with UCM is illustrated with an example of designing a web-based training system.
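The means-ends reasoning described above can be caricatured in a few lines (an illustration of the general idea only, not GRL's actual notation or semantics; the goal and task names are invented): a goal is satisfied if some alternative task achieving it is adopted and all of that task's subgoals are in turn satisfied.

```python
# means: goal -> list of (task, subgoals) alternatives ("means" to the goal).
# adopted: the set of tasks the designer has chosen.

def satisfied(goal, means, adopted):
    for task, subgoals in means.get(goal, []):
        if task in adopted and all(satisfied(g, means, adopted)
                                   for g in subgoals):
            return True
    return False

# Goal: deliver training; alternatives: web-based or classroom delivery.
means = {
    "DeliverTraining": [("WebBased", ["HostContent"]),
                        ("Classroom", ["BookRoom"])],
    "HostContent": [("UseLMS", [])],
    "BookRoom": [("ReserveVenue", [])],
}
assert satisfied("DeliverTraining", means, {"WebBased", "UseLMS"})
assert not satisfied("DeliverTraining", means, {"WebBased"})  # subgoal unmet
```

GRL additionally weighs softgoals (non-functional qualities) and agent dependencies when comparing alternatives, which a boolean check like this cannot capture.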

Journal ArticleDOI
TL;DR: Pegasus, an AI planning system integrated into the grid environment, takes a user's highly specified desired results, generates valid workflows that take available resources into account, and submits those workflows for execution on the grid.
Abstract: A key challenge for grid computing is creating large-scale, end-to-end scientific applications that draw from pools of specialized scientific components to derive elaborate new results. We develop Pegasus, an AI planning system integrated into the grid environment, which takes a user's highly specified desired results, generates valid workflows that take available resources into account, and submits the workflows for execution on the grid. We also begin to extend it into a more distributed and knowledge-rich architecture.
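The planning idea can be sketched with a toy forward-chaining planner (illustrative only, not Pegasus's planner; the component and data-product names are invented): each component declares the data products it needs and produces, and the planner chains components until the requested product is derivable.

```python
# components: list of (name, input products, output products).
# Returns an ordered workflow of component names, or None if the goal
# product cannot be derived from what is available.

def plan(components, available, goal):
    workflow, have = [], set(available)
    progress = True
    while goal not in have and progress:
        progress = False
        for name, ins, outs in components:
            if name not in workflow and set(ins) <= have:
                workflow.append(name)
                have |= set(outs)
                progress = True
    return workflow if goal in have else None

comps = [
    ("extract",   ["raw_data"],   ["events"]),
    ("calibrate", ["events"],     ["calibrated"]),
    ("analyze",   ["calibrated"], ["result"]),
]
wf = plan(comps, available=["raw_data"], goal="result")
assert wf == ["extract", "calibrate", "analyze"]
```

Pegasus additionally maps such abstract workflows onto concrete grid resources and can reuse intermediate data products that already exist, so the generated plan may be much shorter than the full derivation chain.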