
Showing papers on "Process modeling published in 2017"


Journal ArticleDOI
TL;DR: This survey draws up a systematic inventory of approaches to customizable process modeling and provides a comparative evaluation with the aim of identifying common and differentiating modeling features, providing criteria for selecting among multiple approaches, and identifying gaps in the state of the art.
Abstract: It is common for organizations to maintain multiple variants of a given business process, such as multiple sales processes for different products or multiple bookkeeping processes for different countries. Conventional business process modeling languages do not explicitly support the representation of such families of process variants. This gap triggered significant research efforts over the past decade, leading to an array of approaches to business process variability modeling. In general, each of these approaches extends a conventional process modeling language with constructs to capture customizable process models. A customizable process model represents a family of process variants in such a way that a model of each variant can be derived by adding or deleting fragments according to customization options or according to a domain model. This survey draws up a systematic inventory of approaches to customizable process modeling and provides a comparative evaluation with the aim of identifying common and differentiating modeling features, providing criteria for selecting among multiple approaches, and identifying gaps in the state of the art. The survey reveals an abundance of customizable process modeling languages, which contrasts with a relative scarcity of available tool support and empirical comparative evaluations.
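
The derivation mechanism described above can be illustrated with a toy sketch of customization-by-restriction: a base model carries optional fragments guarded by configuration options, and a variant is derived by keeping or deleting fragments. The flat list structure and all names are illustrative assumptions, far simpler than the surveyed languages (e.g., C-EPC or Provop).

```python
# Toy customizable model: (activity, configuration option or None if mandatory).
base_model = [("register order", None),
              ("check credit", "b2b_customers_only"),
              ("ship goods", None),
              ("issue invoice", None)]

def derive_variant(model, options):
    """Keep a fragment if it is mandatory or its option is enabled."""
    return [act for act, opt in model if opt is None or options.get(opt, False)]

print(derive_variant(base_model, {"b2b_customers_only": False}))
print(derive_variant(base_model, {"b2b_customers_only": True}))
```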

358 citations


Journal ArticleDOI
01 Aug 2017
TL;DR: This paper describes an application of deep learning with recurrent neural networks to the problem of predicting the next event in a business process, and shows results that surpass the state-of-the-art in prediction precision.
Abstract: Predicting business process behaviour is an important aspect of business process management. Motivated by research in natural language processing, this paper describes an application of deep learning with recurrent neural networks to the problem of predicting the next event in a business process. This is both a novel method in process prediction, which has largely relied on explicit process models, and also a novel application of deep learning methods. The approach is evaluated on two real datasets and our results surpass the state-of-the-art in prediction precision.
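
The general idea can be sketched with a minimal next-event predictor in PyTorch. This is not the authors' implementation: the toy log, activity names, and all hyperparameters below are invented for illustration.

```python
import torch
import torch.nn as nn

# Toy event log: each trace is a sequence of activity labels.
log = [["register", "check", "approve", "pay"],
       ["register", "check", "reject"],
       ["register", "check", "approve", "pay"]]

vocab = sorted({a for t in log for a in t})
idx = {a: i for i, a in enumerate(vocab)}

# Build (prefix -> next event) training pairs, left-padded to a fixed length.
MAXLEN = max(len(t) for t in log)
X, y = [], []
for trace in log:
    for i in range(1, len(trace)):
        prefix = [idx[a] + 1 for a in trace[:i]]   # index 0 reserved for padding
        X.append([0] * (MAXLEN - len(prefix)) + prefix)
        y.append(idx[trace[i]])
X, y = torch.tensor(X), torch.tensor(y)

class NextEventLSTM(nn.Module):
    def __init__(self, n_act, emb=8, hidden=16):
        super().__init__()
        self.emb = nn.Embedding(n_act + 1, emb, padding_idx=0)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_act)

    def forward(self, x):
        h, _ = self.lstm(self.emb(x))
        return self.out(h[:, -1, :])               # predict from last time step

model = NextEventLSTM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Most likely next event after a running prefix.
prefix = torch.tensor([[0] * (MAXLEN - 2) + [idx["register"] + 1, idx["check"] + 1]])
print(vocab[model(prefix).argmax().item()])
```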

319 citations


Journal ArticleDOI
TL;DR: The proposed technique is evaluated in detail and it is shown that its application in conjunction with certain existing process discovery algorithms significantly improves the quality of the discovered process models and that it scales well to large datasets.
Abstract: In the era of “big data”, one of the key challenges is to analyze large amounts of data collected in meaningful and scalable ways. The field of process mining is concerned with the analysis of data of a particular nature, namely data that results from the execution of business processes. The analysis of such data can be negatively influenced by the presence of outliers, which reflect infrequent behavior or “noise”. In process discovery, where the objective is to automatically extract a process model from the data, this may result in rarely travelled pathways that clutter the process model. This paper presents an automated technique for removing infrequent behavior from event logs. The proposed technique is evaluated in detail, and it is shown that its application in conjunction with certain existing process discovery algorithms significantly improves the quality of the discovered process models and that it scales well to large datasets.
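
A stripped-down version of the idea can be sketched as follows. The actual technique in the paper is considerably more refined (it builds an automaton of frequent behavior and removes individual events); this sketch simply drops whole traces that use a rare directly-follows pair. The toy log and threshold are illustrative.

```python
from collections import Counter

log = [["a", "b", "c", "d"]] * 40 + [["a", "c", "b", "d"]] * 40 + [["a", "x", "d"]]

# 1. Count directly-follows relations over the whole log.
df = Counter((t[i], t[i + 1]) for t in log for i in range(len(t) - 1))

# 2. Keep only pairs whose frequency reaches a fraction of the most frequent pair.
threshold = 0.05 * max(df.values())
frequent = {pair for pair, n in df.items() if n >= threshold}

# 3. Drop traces that use an infrequent directly-follows pair.
filtered = [t for t in log
            if all((t[i], t[i + 1]) in frequent for i in range(len(t) - 1))]
print(len(log), "->", len(filtered), "traces")
```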

139 citations


Journal ArticleDOI
TL;DR: In this article, the uniqueness of seminal parametric design concepts and their impact on models of parametric design thinking (PDT) are examined through a review of key texts and theoretical concepts, from early cognitive models up to current models.

136 citations


Journal ArticleDOI
TL;DR: The simulation results showed that the proposed model represented the timestamp data acquired by IoT and captured the entire production process, thus enabling the determination of real-time performance indicators.
Abstract: To cope with large fluctuations in the demand of a commodity, a manufacturing system must be able to react rapidly. This requirement can be supported by performance measurement. Although manufacturing companies have used information systems to manage performance, capturing real-time data that depicts real situations has been difficult. The recent development and application of the Internet of Things (IoT) has enabled the resolution of this problem. To demonstrate the functionality of IoT, we developed an IoT-based performance model consistent with the ISA-95 and ISO 22400 standards, which define manufacturing processes and performance indicator formulas. The development comprised three steps: (1) selection of the Key Performance Indicators of the Overall Equipment Effectiveness (OEE), and the development of an IoT-based production performance model, (2) implementation of the IoT-based architecture and performance measurement process using Business Process Modelling and...
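
For context, the OEE indicator referenced above decomposes as availability × performance × quality (the common ISO 22400-style formulation). A minimal sketch of computing it from IoT timestamp data follows; the event structure and field names are assumptions for illustration, not taken from the paper.

```python
from datetime import datetime

# Hypothetical IoT-sourced production events (fields are invented for this sketch).
events = [
    {"machine": "M1", "start": datetime(2017, 5, 1, 8, 0),
     "end": datetime(2017, 5, 1, 14, 0), "units": 300, "good_units": 285,
     "ideal_cycle_time_s": 60.0},
]
planned_production_time_s = 8 * 3600   # planned busy time for the shift

for e in events:
    run_time_s = (e["end"] - e["start"]).total_seconds()
    availability = run_time_s / planned_production_time_s
    performance = (e["ideal_cycle_time_s"] * e["units"]) / run_time_s
    quality = e["good_units"] / e["units"]
    oee = availability * performance * quality
    print(f'{e["machine"]}: OEE = {oee:.2%}')
```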

133 citations


Posted Content
TL;DR: In this paper, a systematic review and comparative evaluation of automated process discovery methods, using an open-source benchmark and covering twelve publicly-available real-life event logs, twelve proprietary real life event logs and nine quality metrics, is presented.
Abstract: Process mining allows analysts to exploit logs of historical executions of business processes to extract insights regarding the actual performance of these processes. One of the most widely studied process mining operations is automated process discovery. An automated process discovery method takes as input an event log, and produces as output a business process model that captures the control-flow relations between tasks that are observed in or implied by the event log. Various automated process discovery methods have been proposed in the past two decades, striking different tradeoffs between scalability, accuracy and complexity of the resulting models. However, these methods have been evaluated in an ad-hoc manner, employing different datasets, experimental setups, evaluation measures and baselines, often leading to incomparable conclusions and sometimes unreproducible results due to the use of closed datasets. This article provides a systematic review and comparative evaluation of automated process discovery methods, using an open-source benchmark and covering twelve publicly-available real-life event logs, twelve proprietary real-life event logs, and nine quality metrics. The results highlight gaps and unexplored tradeoffs in the field, including the lack of scalability of some methods and a strong divergence in their performance with respect to the different quality metrics used.

127 citations


Journal ArticleDOI
TL;DR: In this article, opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are reviewed, with a major focus on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses.
Abstract: The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality-by-testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and a better understanding of how critical process parameters affect the critical quality attributes. Implementing process analytical technology in the bio-production process enables moving from quality by testing to a more flexible quality-by-design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and, subsequently, real-time product quality control. In this review, opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is placed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions, and powerful software environments appear to be prerequisites for quality-by-control realization.
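
To make the model predictive control precondition concrete, here is a minimal receding-horizon sketch on a toy scalar process model. The dynamics, parameters, and setpoint are invented stand-ins, not a real cell culture model.

```python
import numpy as np
from scipy.optimize import minimize

def step(x, u, mu=0.04, k=0.5):
    """Toy process model: growth plus feed effect, one time step."""
    return x + mu * x + k * u

def predict(x0, u_seq):
    x, traj = x0, []
    for u in u_seq:
        x = step(x, u)
        traj.append(x)
    return np.array(traj)

def mpc_action(x0, setpoint, horizon=5):
    # Track the setpoint over the horizon, with a mild penalty on feed effort.
    cost = lambda u: np.sum((predict(x0, u) - setpoint) ** 2) + 0.1 * np.sum(u ** 2)
    res = minimize(cost, x0=np.zeros(horizon), bounds=[(0.0, 1.0)] * horizon)
    return res.x[0]                   # apply only the first move (receding horizon)

x, setpoint = 1.0, 5.0
for t in range(30):
    u = mpc_action(x, setpoint)
    x = step(x, u)                    # the "plant"; here plant == model
print(f"final state: {x:.2f} (setpoint {setpoint})")
```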

113 citations


Journal ArticleDOI
02 Jan 2017
TL;DR: The article provides information systems researchers with an overview of the empirical state of the art of process model comprehension and provides recommendations for new research questions to be addressed and methods to be used in future experiments.
Abstract: Visual process models are meant to facilitate comprehension of business processes. However, in practice, process models can be difficult to understand. The main goal of this article is to clarify the sources of cognitive effort in comprehending process models. The article undertakes a comprehensive descriptive review of empirical and theoretical work in order to categorize and summarize systematically existing findings on the factors that influence comprehension of visual process models. Methodologically, the article builds on a review of forty empirical studies that measure objective comprehension of process models, seven studies that measure subjective comprehension and user preferences, and thirty-two articles that discuss the factors that influence the comprehension of process models. The article provides information systems researchers with an overview of the empirical state of the art of process model comprehension and provides recommendations for new research questions to be addressed and methods to be used in future experiments.

111 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe and justify robust control flow conversion algorithms that provide the basis for more advanced BPMN-based discovery and conformance checking, allowing low-level models such as Petri nets, causal nets and process trees to be represented in terms of BPMN.
Abstract: Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring of processes within PAIS. These approaches include process mining techniques used to discover process models from event logs, find log and model deviations, and analyze performance characteristics of processes. The representational bias (a way to model processes) plays an important role in process mining. The BPMN 2.0 (Business Process Model and Notation) standard is widely used and makes it possible to build conventional and understandable process models. In addition to the flat control flow perspective, subprocesses, data flows, and resources can be integrated within one BPMN diagram. This makes BPMN very attractive for both process miners and business users, since the control flow perspective can be integrated with data and resource perspectives discovered from event logs. In this paper, we describe and justify robust control flow conversion algorithms, which provide the basis for more advanced BPMN-based discovery and conformance checking algorithms. Thus, on the basis of these conversion algorithms, low-level models (such as Petri nets, causal nets and process trees) discovered from event logs using existing approaches can be represented in terms of BPMN. Moreover, we establish behavioral relations between Petri nets and BPMN models and use them to adopt existing conformance checking and performance analysis techniques in order to visualize conformance and performance information within a BPMN diagram. We believe that the results presented in this paper can be used for a wide variety of BPMN mining and conformance checking algorithms. We also provide metrics for the processes discovered before and after the conversion to BPMN structures. Cases in which the conversion algorithms produce more compact or more complicated BPMN models in comparison with the initial models are identified.
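
The core of such a conversion can be hinted at with a deliberately simplified sketch: labeled transitions become BPMN tasks, a place with several outgoing transitions becomes an exclusive (XOR) gateway, and a transition with several output places becomes a parallel (AND) gateway. The paper's algorithms handle many more cases (soundness, subprocesses, data and resource perspectives); the data structures and the net below are illustrative only.

```python
def petri_to_bpmn(place_out, trans_out):
    """place_out: place -> output transitions; trans_out: transition -> output places."""
    nodes, edges = {}, []
    for t in trans_out:
        nodes[t] = "task"
    for p, succs in place_out.items():
        if len(succs) > 1:
            nodes[p] = "xor_split"            # choice between transitions
    for t, outs in trans_out.items():
        if len(outs) > 1:
            nodes[t + "_fork"] = "and_split"  # concurrent token production

    # Sequence flows: task -> (gateway ->) successor tasks.
    for t, out_places in trans_out.items():
        src = t + "_fork" if len(out_places) > 1 else t
        if src != t:
            edges.append((t, src))
        for p in out_places:
            succs = place_out.get(p, set())
            if len(succs) > 1:                # route through the XOR gateway
                edges.append((src, p))
                edges.extend((p, s) for s in succs)
            else:
                edges.extend((src, s) for s in succs)
    return nodes, edges

place_out = {"p0": {"register"}, "p1": {"approve", "reject"}, "p2": set()}
trans_out = {"register": {"p1"}, "approve": {"p2"}, "reject": {"p2"}}
nodes, edges = petri_to_bpmn(place_out, trans_out)
print(nodes)
print(edges)
```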

84 citations


Journal ArticleDOI
TL;DR: Simulation results on the plant-wide Tennessee Eastman process show that the distributed Bayesian network approach is feasible for modeling large-scale processes and provides informative multi-level reference results for further diagnosis and isolation.

72 citations


Journal ArticleDOI
TL;DR: This work proposes an approach to automatically identify inconsistencies between a process model and a corresponding textual description and demonstrates the applicability of the approach on real-life data.

Journal ArticleDOI
TL;DR: A new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework, is proposed, which produces a process map and supplementary table, which identify work system barriers and facilitators.

Journal ArticleDOI
01 Aug 2017
TL;DR: In this article, the authors propose a framework for devising process querying methods, i.e., techniques for the (automated) management of repositories of designed and executed processes, as well as models that describe relationships between processes.
Abstract: The volume of process-related data is growing rapidly: more and more business operations are being supported and monitored by information systems. Industry 4.0 and the corresponding industrial Internet of Things are about to generate new waves of process-related data, next to the abundance of event data already present in enterprise systems. However, organizations often fail to convert such data into strategic and tactical intelligence. This is due to the lack of dedicated technologies that are tailored to effectively manage the information on processes encoded in process models and process execution records. Process-related information is a core organizational asset which requires dedicated analytics to unlock its full potential. This paper proposes a framework for devising process querying methods, i.e., techniques for the (automated) management of repositories of designed and executed processes, as well as models that describe relationships between processes. The framework is composed of generic components that can be configured to create a range of process querying methods. The motivation for the framework stems from use cases in the field of Business Process Management. The design of the framework is informed by and validated via a systematic literature review. The framework structures the state of the art and points to gaps in existing research. Process querying methods need to address these gaps to better support strategic decision-making and provide the next generation of Business Intelligence platforms.

Journal ArticleDOI
TL;DR: An effective modeling of the cataract intervention is possible using the combination of BPM and ACM, which gives the possibility to depict complex processes with complex decisions and allows a significant advantage for modeling perioperative processes.
Abstract: Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing the highly flexible and variable medical processes in sufficient detail. We combined two modeling systems, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes. We used the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN). First, we explain how CMMN, DMN and BPMN could be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to expand the system with more complex situations that might appear during an intervention. An effective modeling of the cataract intervention is possible using the combination of BPM and ACM. The combination makes it possible to depict complex processes with complex decisions and offers a significant advantage for modeling perioperative processes.

Journal ArticleDOI
TL;DR: This paper provides a novel event window-based encoding and generates a set of decision rules for the run-time prediction of process indicators according to event log properties; the rules can be interpreted by users to extract further insight into the business processes while keeping a high level of accuracy.
Abstract: Predictive monitoring of business processes is a challenging topic of process mining which is concerned with the prediction of process indicators of running process instances. The main value of predictive monitoring is to provide information in order to take proactive and corrective actions to improve process performance and mitigate risks in real time. In this paper, we present an approach for predictive monitoring based on the use of evolutionary algorithms. Our method provides a novel event window-based encoding and generates a set of decision rules for the run-time prediction of process indicators according to event log properties. These rules can be interpreted by users to extract further insight into the business processes while keeping a high level of accuracy. Furthermore, a full software stack has been developed, consisting of a tool to support the training phase and a framework that enables the integration of run-time predictions with business process management systems. The obtained results show the validity of our proposal for two large real-life datasets: BPI Challenge 2013 and the IT Department of the Andalusian Health Service (SAS).
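
The event window-based encoding can be sketched roughly as follows: the last N events of a running trace are flattened into a fixed-length feature row on which rules can then be learned. The attribute names, window size, and padding scheme are assumptions for illustration; the paper's encoding and its evolutionary rule learner are more elaborate.

```python
WINDOW = 3

def encode(trace, outcome, window=WINDOW):
    """trace: list of (activity, duration_h) events; pads short prefixes."""
    tail = trace[-window:]
    pad = [("<none>", 0.0)] * (window - len(tail))
    feats = {}
    for i, (act, dur) in enumerate(pad + tail):
        feats[f"act_{i}"] = act
        feats[f"dur_{i}"] = dur
    feats["label"] = outcome
    return feats

row = encode([("register", 0.2), ("check", 4.0), ("approve", 1.0), ("pay", 0.1)],
             outcome="on_time")
print(row)
```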

Journal ArticleDOI
TL;DR: This work investigates the insights and understanding which can be deduced from predictive process models for the product quality of a monoclonal antibody based on designed high‐throughput cell culture experiments performed at milliliter (ambr‐15®) scale.
Abstract: This work investigates the insights and understanding which can be deduced from predictive process models for the product quality of a monoclonal antibody based on designed high-throughput cell culture experiments performed at milliliter (ambr-15®) scale. The investigated process conditions include various media supplements as well as pH and temperature shifts applied during the process. First, principal component analysis (PCA) is used to show the strong correlation characteristics among the product quality attributes including aggregates, fragments, charge variants, and glycans. Then, partial least squares regression (PLS1 and PLS2) is applied to predict the product quality variables based on process information (one by one or simultaneously). The comparison of those two modeling techniques shows that a single (PLS2) model is capable of revealing the interrelationship of the process characteristics to the large set of product quality variables. In order to show the dynamic evolution of the process predictability, separate models are defined at different time points, showing that several product quality attributes are mainly driven by the media composition and, hence, can be decently predicted from early on in the process, while others are strongly affected by process parameter changes during the process. Finally, by coupling the PLS2 models with a genetic algorithm, the model performance can be further improved and, most importantly, the interpretation of the large-dimensioned process-product interrelationship can be significantly simplified. The generally applicable toolset presented in this case study provides a solid basis for decision making and process optimization throughout process development. © 2017 American Institute of Chemical Engineers. Biotechnol. Prog., 33:1368-1380, 2017.
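
A scikit-learn sketch of the modeling workflow (PCA on the quality attributes, then one PLS2 model predicting all of them simultaneously) is shown below, with random stand-in data in place of the designed experiments.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(48, 6))                 # process conditions (pH shift, temp, feeds...)
B = rng.normal(size=(6, 4))
Y = X @ B + 0.1 * rng.normal(size=(48, 4))   # quality attributes (aggregates, glycans...)

# PCA on the quality attributes reveals their correlation structure.
pca = PCA(n_components=2).fit(Y)
print("explained variance:", pca.explained_variance_ratio_)

# One PLS2 model predicts all quality attributes simultaneously.
pls2 = PLSRegression(n_components=3).fit(X, Y)
print("R^2 on training data:", pls2.score(X, Y))
```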

Journal ArticleDOI
TL;DR: A system that utilizes process recommendation technology to help design new business processes from scratch in an efficient and accurate way; experimental results show that the proposed approaches outperform state-of-the-art alternatives in terms of accuracy and efficiency.
Abstract: This paper presents a system that utilizes process recommendation technology to help design new business processes from scratch in an efficient and accurate way. The proposed system consists of two phases: 1) offline mining and 2) online recommendation. In the first phase, it mines relations among activity nodes from existing processes in the repository, and then stores the extracted relations as patterns in a database. In the second phase, it compares the new process under construction with the pre-mined patterns and recommends proper activity nodes of the best-matching patterns to help build the new process. Specifically, there are three different online recommendation strategies in this system. Experiments on both real and synthetic datasets are conducted to compare the proposed approaches with other state-of-the-art ones, and the results show that the proposed approaches outperform them in terms of accuracy and efficiency.
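
The two phases can be sketched in a few lines: offline, successor frequencies are mined from a repository (represented here simply as activity sequences); online, the most frequent successors of the last placed activity are recommended. The real system mines richer structural patterns and offers three distinct recommendation strategies; the repository below is illustrative.

```python
from collections import Counter, defaultdict

repository = [["register", "check", "approve", "pay"],
              ["register", "check", "reject"],
              ["register", "check", "approve", "archive"]]

# Offline phase: store successor frequencies per activity.
patterns = defaultdict(Counter)
for seq in repository:
    for a, b in zip(seq, seq[1:]):
        patterns[a][b] += 1

# Online phase: recommend top-k candidate next activities.
def recommend(last_activity, k=2):
    return [act for act, _ in patterns[last_activity].most_common(k)]

print(recommend("check"))   # -> ['approve', 'reject']
```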

Proceedings ArticleDOI
01 Jul 2017
TL;DR: A multi-stage deep learning approach is proposed which formulates the next business process event prediction problem as a classification problem and applies deep feedforward multilayer neural networks after extracting features with feature hashing and deep stacked autoencoders.
Abstract: The ability to proactively monitor business processes is one of the main differentiators for firms to remain competitive. Process execution logs generated by Process Aware Information Systems (PAIS) help to make various business process specific predictions. This enables a proactive situational awareness related to the execution of business processes. The goal of the approach proposed in the current paper is to predict the next business process event, considering the past activities in the running process instance, based on the execution log data from previously completed process instances. By predicting the business process events, companies can initiate timely interventions to address undesired deviations from the desired workflow. In our study, we propose a multi-stage deep learning approach which formulates the next business process event prediction problem as a classification problem and applies deep feedforward multilayer neural networks after extracting features with feature hashing and deep stacked autoencoders. The experiments conducted on a variety of business process log datasets reveal that the proposed multi-stage deep learning approach provides promising results. The results are compared against existing deep recurrent neural networks and other approaches as well.
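
A compact sketch of the classification formulation follows: prefixes of running cases are hashed into fixed-size sparse vectors and fed to a feedforward network. The stacked-autoencoder pretraining stage of the paper is omitted for brevity, and the toy data and sizes are illustrative assumptions.

```python
from sklearn.feature_extraction import FeatureHasher
from sklearn.neural_network import MLPClassifier

prefixes = [["register"], ["register", "check"],
            ["register"], ["register", "check"]]
next_events = ["check", "approve", "check", "reject"]

# Hash each prefix (as position-tagged tokens) into a fixed-size vector.
hasher = FeatureHasher(n_features=32, input_type="string")
X = hasher.transform([[f"{i}:{a}" for i, a in enumerate(p)] for p in prefixes])

# Feedforward network predicting the next event from the hashed prefix.
clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
clf.fit(X, next_events)
print(clf.predict(hasher.transform([["0:register", "1:check"]])))
```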

Journal ArticleDOI
TL;DR: In this paper, a process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters.
Abstract: A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. For demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
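
As a rough restatement of the construction (the exact definition and weighting scheme are in the paper; this is the generic model-averaged, Sobol'-style form): write $\Delta$ for the model output and $A_K = (M_K, \theta_K)$ for the model-parameter pair of process $K$. A first-order, variance-based process sensitivity index then takes the form

$$ PS_K = \frac{\operatorname{Var}_{A_K}\!\big(\operatorname{E}_{A_{\sim K}}[\,\Delta \mid A_K\,]\big)}{\operatorname{Var}(\Delta)}, $$

where $A_{\sim K}$ collects the models and parameters of all other processes, and the variance over $A_K$ expands across the competing models $M_{K,m}$ of process $K$ via the law of total variance, weighted by model probabilities $P(M_{K,m})$:

$$ \operatorname{Var}_{A_K}(g) = \sum_{m} P(M_{K,m})\, \operatorname{Var}_{\theta_K}\!\big(g \mid M_{K,m}\big) + \operatorname{Var}_{M_K}\!\big(\operatorname{E}_{\theta_K}[\,g \mid M_{K,m}\,]\big), $$

so that both parametric and process-model uncertainty contribute to the index.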

Proceedings ArticleDOI
01 Jun 2017
TL;DR: A semantic framework for developing IoT-aware business processes is proposed, which (i) formalizes IoT resource descriptions with respect to the Internet of Things Architecture (IoT-A) reference model in the context of business processes, (ii) formalizes IoT properties and allocation rules for optimal resource management, and (iii) resolves resource conflicts based on strategies.
Abstract: The proliferation of connected devices, wherein sensors, actuators and tags (such as Radio-Frequency Identification (RFID)) are able to seamlessly communicate with their environment and with each other to share information or to perform some actions, has created the Internet of Things (IoT) ecosystem. These devices expose their functionality via standard services and application programming interfaces (APIs). They are considered to be one of the key technology enablers to foster the vision of a smart world, comprising smart objects, smart supply chain management, smart manufacturing (Industry 4.0), and smart buildings, to name a few. In fact, today these IoT devices continuously take part in various business processes that are being executed within the boundaries of the same enterprise or in different enterprises. Thus, there is an evident need to model these processes that are associated with IoT resources in a formal and unambiguous manner. However, in the context of business processes, there is a lack of formalized and explicit descriptions for IoT resources, hampering their efficient modeling and management. To bridge this gap, we propose a semantic framework for developing IoT-aware business processes as follows: (i) formalizing IoT resource descriptions w.r.t. the Internet of Things Architecture (IoT-A) reference model in the context of business processes, (ii) formalizing IoT properties and allocation rules for optimal resource management, and (iii) resolving resource conflicts based on strategies. To illustrate the feasibility of our framework, we evaluated our semantic model for coverage of concepts in the IoT-A reference model, along with the development of a proof-of-concept tool for integrating the IoT resources and our semantic model during the process modeling phase.

Journal ArticleDOI
TL;DR: It is proposed that model-based methods be applied throughout the lifecycle of a biopharmaceutical process, starting with the set-up of a process model, which is used for monitoring and control of process parameters, and ending with continuous and iterative process improvement via data mining techniques.
Abstract: Model-based methods are increasingly used in all areas of biopharmaceutical process technology. They can be applied in the field of experimental design, process characterization, process design, monitoring and control. Benefits of these methods are lower experimental effort, process transparency, clear rationality behind decisions and increased process robustness. The possibility of applying methods adopted from different scientific domains accelerates this trend further. In addition, model-based methods can help to implement regulatory requirements as suggested by recent Quality by Design and validation initiatives. The aim of this review is to give an overview of the state of the art of model-based methods, their applications, further challenges and possible solutions in the biopharmaceutical process life cycle. Today, despite these advantages, the potential of model-based methods is still not fully exhausted in bioprocess technology. This is due to a lack of (i) acceptance of the users, (ii) user-friendly tools provided by existing methods, (iii) implementation in existing process control systems and (iv) clear workflows to set up specific process models. We propose that model-based methods be applied throughout the lifecycle of a biopharmaceutical process, starting with the set-up of a process model, which is used for monitoring and control of process parameters, and ending with continuous and iterative process improvement via data mining techniques.


Journal ArticleDOI
TL;DR: An overview of the state of the art regarding business process model similarity measures is provided, analyzing which similarity measures exist, how they are characterized, and what kinds of calculations are typically applied to determine similarity values.
Abstract: Business process models play an important role in today’s enterprises, hence, model repositories may contain hundreds of models. These models are, for example, reused during process modeling activities or utilized to check the conformance of processes with legal regulations. With respect to the amount of models, such applications benefit from or even require detailed insights into the correspondences between process models or between process models’ nodes. Therefore, various process similarity and matching measures have been proposed during the past few years. This article provides an overview of the state-of-the-art regarding business process model similarity measures and aims at analyzing which similarity measures exist, how they are characterized, and what kind of calculations are typically applied to determine similarity values. Finally, the analysis of 123 similarity measures results in the suggestions to conduct further comparative analyses of similarity measures, to investigate the integration of human input into similarity measurement, and to further analyze the requirements of similarity measurement usage scenarios as future research opportunities.
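
One common family of similarity calculations can be sketched as label-based node matching: node labels of two models are compared with a string similarity, greedily matched, and the matching weight is normalized into a model-level score. Real measures in the survey also weigh structure and behavior; the models below and the use of difflib are illustrative choices.

```python
from difflib import SequenceMatcher

model_a = ["Register order", "Check stock", "Approve order"]
model_b = ["Order registration", "Stock check", "Reject order"]

def label_sim(x, y):
    return SequenceMatcher(None, x.lower(), y.lower()).ratio()

# Greedy best-first matching over all label pairs.
pairs = sorted(((label_sim(a, b), a, b) for a in model_a for b in model_b),
               reverse=True)
matched_a, matched_b, score = set(), set(), 0.0
for s, a, b in pairs:
    if a not in matched_a and b not in matched_b:
        matched_a.add(a); matched_b.add(b); score += s
similarity = score / max(len(model_a), len(model_b))
print(f"model similarity: {similarity:.2f}")
```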

Journal ArticleDOI
01 Dec 2017
TL;DR: A business process assessment framework is proposed that focuses on the process redesign lifecycle phase and is tightly coupled with process mining as an operational framework for calculating indicators that assess whether, and to what extent, process redesign best practices have been applied.
Abstract: The management of business processes in modern times is rapidly shifting towards being evidence-based. Business process evaluation indicators tend to focus on process performance only, neglecting the definition of indicators to evaluate other concerns of interest in different phases of the business process lifecycle. Moreover, they usually do not discuss specifically which data must be collected to calculate indicators and whether collecting these data is feasible or not. This paper proposes a business process assessment framework focused on the process redesign lifecycle phase and tightly coupled with process mining as an operational framework to calculate indicators. The framework includes process performance indicators and indicators to assess whether process redesign best practices have been applied and to what extent. Both sets of indicators can be calculated using standard process mining functionality. This, implicitly, also defines what data must be collected during process execution to enable their calculation. The framework is evaluated through case studies and a thorough comparison against other approaches in the literature. This paper presents a methodology to assess the effects accrued by BPR initiatives. The methodology provides evaluation measures to identify implementation and assess improvements. The methodology defines how evaluation measures are calculated using process mining. A couple of case studies demonstrate the applicability of the proposed methodology.

Book ChapterDOI
23 Oct 2017
TL;DR: In this article, a log is converted into a deterministic automaton in a lossless manner, and the input process model is converted to another minimal automaton, and a minimal error-correcting synchronized product of both automata is calculated using an A* heuristic.
Abstract: Given a process model representing the expected behavior of a business process and an event log recording its actual execution, the problem of business process conformance checking is that of detecting and describing the differences between the process model and the log. A desirable feature is to produce a minimal yet complete set of behavioral differences. Existing conformance checking techniques that achieve these properties do not scale up to real-life process models and logs. This paper presents an approach that addresses this shortcoming by exploiting automata-based techniques. A log is converted into a deterministic automaton in a lossless manner, the input process model is converted into another minimal automaton, and a minimal error-correcting synchronized product of both automata is calculated using an A* heuristic. The resulting automaton is used to extract alignments between traces of the model and traces of the log, or statements describing behavior observed in the log but not captured in the model. An evaluation on synthetic and real-life models and logs shows that the proposed approach outperforms a state-of-the-art method for complete conformance checking.
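
The first step, folding a log into a deterministic automaton without loss of information, can be sketched as a prefix automaton over the traces (the paper additionally shares suffixes, yielding a DAFSA, and the A*-based synchronized product is not shown here). The toy log is illustrative.

```python
log = [["a", "b", "c"], ["a", "b", "d"], ["a", "c"]]

states = {0: {}}               # state id -> {activity: successor state}
accepting, next_id = set(), 1
for trace in log:
    s = 0
    for act in trace:
        if act not in states[s]:
            states[s][act] = next_id
            states[next_id] = {}
            next_id += 1
        s = states[s][act]
    accepting.add(s)           # trace ends here

print(states)                  # deterministic: one successor per (state, activity)
print("accepting:", accepting)
```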

Journal ArticleDOI
TL;DR: By taking advantage of the structural and behavioral features of process models, an efficient approach which leverages effective heuristics and trace replaying to significantly reduce the overall search space for seeking the optimal alignment is presented.
Abstract: The aligning of event logs with process models is of great significance for process mining to enable conformance checking, process enhancement, performance analysis, and trace repairing. Since process models are increasingly complex and event logs may deviate from process models by exhibiting redundant, missing, and dislocated events, it is challenging to determine the optimal alignment for each event sequence in the log, as this problem is NP-hard. Existing approaches utilize the cost-based A* algorithm to address this problem. However, scalability is often not considered, which is especially important when dealing with industrial-sized problems. In this paper, by taking advantage of the structural and behavioral features of process models, we present an efficient approach which leverages effective heuristics and trace replaying to significantly reduce the overall search space for seeking the optimal alignment. We employ real-world business processes and their traces to evaluate the proposed approach. Experimental results demonstrate that our approach works well in most cases, and that it outperforms the state-of-the-art approach by up to 5 orders of magnitude in runtime efficiency.
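
The baseline that such heuristics accelerate, cost-based alignment via best-first search, can be sketched as follows: the model is a small automaton, synchronous moves are free, and log-only or model-only moves cost 1. A trivial admissible heuristic (zero) is used here, which degenerates to Dijkstra; the paper's speedups come precisely from stronger heuristics and trace replaying. Model and trace are illustrative.

```python
import heapq

model = {0: {"a": 1}, 1: {"b": 2, "c": 2}, 2: {"d": 3}, 3: {}}
final_states = {3}
trace = ["a", "c", "c", "d"]           # one extra "c" w.r.t. the model

def align(trace):
    heap = [(0.0, 0, 0, ())]           # cost, model state, trace position, moves
    seen = set()
    while heap:
        cost, s, i, moves = heapq.heappop(heap)
        if (s, i) in seen:
            continue
        seen.add((s, i))
        if i == len(trace) and s in final_states:
            return cost, moves
        if i < len(trace) and trace[i] in model[s]:          # synchronous move
            heapq.heappush(heap, (cost, model[s][trace[i]], i + 1,
                                  moves + ((trace[i], trace[i]),)))
        for a, s2 in model[s].items():                       # model-only move
            heapq.heappush(heap, (cost + 1, s2, i, moves + ((">>", a),)))
        if i < len(trace):                                   # log-only move
            heapq.heappush(heap, (cost + 1, s, i + 1, moves + ((trace[i], ">>"),)))

cost, moves = align(trace)
print("alignment cost:", cost, "moves:", moves)
```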

Journal ArticleDOI
TL;DR: In this paper, a process reengineering ontology-based knowledge map methodology (PROM) is proposed to reduce the failure ratio, solve BPR problems, and overcome their difficulties.

Proceedings ArticleDOI
30 Mar 2017
TL;DR: A method to detect process drifts by performing statistical tests on graph metrics calculated from discovered process models; using process models also makes it possible to answer the question of which changes were made to the process.
Abstract: Work in organisations is often structured into business processes, implemented using process-aware information systems (PAISs). These systems aim to ensure that employees perform work in a certain way, executing tasks in a specified order. However, the execution strategy may change over time, leading to expected and unexpected changes in the overall process. Especially the unexpected changes may manifest without notice, which can have a big impact on performance, costs, and compliance. Thus, it is important to detect these hidden changes early in order to prevent monetary consequences. Traditional process mining techniques are unable to identify these execution changes because they usually generalise without considering time as an extra dimension, and assume stable processes. Most algorithms produce only a single process model, reflecting the behaviour of the complete analysis scope. Small changes cannot be identified as they only occur in a small part of the event log. This paper proposes a method to detect process drifts by performing statistical tests on graph metrics calculated from discovered process models. Using process models additionally makes it possible to gather details about the structure of the drift and to answer the question of which changes were made to the process.
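
The detection scheme can be sketched as follows: discover a model per window of traces (here reduced to the number of directly-follows edges as the graph metric) and run a statistical test between two populations of window metrics. The window size, the choice of test, and the synthetic drifting log are assumptions for illustration.

```python
from scipy.stats import mannwhitneyu

def window_metric(traces):
    """Graph metric of the discovered model: distinct directly-follows edges."""
    edges = {(t[i], t[i + 1]) for t in traces for i in range(len(t) - 1)}
    return len(edges)

# Synthetic log with a drift after trace 200: activity "x" enters the process.
log = [["a", "b", "c"]] * 200 + [["a", "x", "c"], ["a", "x", "b", "c"]] * 100

WINDOW = 20
metrics = [window_metric(log[i:i + WINDOW]) for i in range(0, len(log), WINDOW)]
half = len(metrics) // 2
stat, p = mannwhitneyu(metrics[:half], metrics[half:])
print("p-value:", p, "-> drift detected" if p < 0.05 else "-> no drift")
```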

Book ChapterDOI
21 Jun 2017
TL;DR: Models of processes can be analyzed to test whether a given process complies with specifications; for these capabilities, Process Mining is gaining importance and attention in healthcare.
Abstract: Process Mining is an emerging discipline investigating tasks related to the automated identification of process models from real-world data (Process Discovery). The analysis of such models can provide useful insights to domain experts. In addition, models of processes can be used to test whether a given process complies with specifications (Conformance Checking). For these capabilities, Process Mining is gaining importance and attention in healthcare.

Journal Article
TL;DR: A method for checking the conformance between an event log capturing the actual execution of a business process and a model capturing its expected or normative execution, relying on a unified representation of process models and event logs based on a well-known model of concurrency.
Abstract: This article presents a method for checking the conformance between an event log capturing the actual execution of a business process, and a model capturing its expected or normative execution. Given a business process model and an event log, the method returns a set of statements in natural language describing the behavior allowed by the process model but not observed in the log and vice versa. The method relies on a unified representation of process models and event logs based on a well-known model of concurrency, namely event structures. Specifically, the problem of conformance checking is approached by folding the input event log into an event structure, unfolding the process model into another event structure, and comparing the two event structures via an error-correcting synchronized product. Each behavioral difference detected in the synchronized product is then verbalized as a natural language statement. An empirical evaluation shows that the proposed method scales up to real-life datasets while producing more concise and higher-level difference descriptions than state-of-the-art conformance checking methods.