scispace - formally typeset
Author

Arya Adriansyah

Bio: Arya Adriansyah is an academic researcher from Eindhoven University of Technology. The author has contributed to research in the topics of process mining and process modeling. The author has an h-index of 14 and has co-authored 22 publications receiving 2,767 citations.

Papers
Book ChapterDOI
Wil M. P. van der Aalst, Arya Adriansyah, Ana Karla Alves de Medeiros, Franco Arcieri, Thomas Baier, Tobias Blickle, Jagadeesh Chandra Bose, Peter van den Brand, Ronald Brandtjen, Joos C. A. M. Buijs, Andrea Burattin, Josep Carmona, Malu Castellanos, Jan Claes, Jonathan Cook, Nicola Costantini, Francisco Curbera, Ernesto Damiani, Massimiliano de Leoni, Pavlos Delias, Boudewijn F. van Dongen, Marlon Dumas, Schahram Dustdar, Dirk Fahland, Diogo R. Ferreira, Walid Gaaloul, Frank van Geffen, Sukriti Goel, Christian W. Günther, Antonella Guzzo, Paul Harmon, Arthur H. M. ter Hofstede, John Hoogland, Jon Espen Ingvaldsen, Koki Kato, Rudolf Kuhn, Akhil Kumar, Marcello La Rosa, Fabrizio Maria Maggi, Donato Malerba, Ronny Mans, Alberto Manuel, Martin McCreesh, Paola Mello, Jan Mendling, Marco Montali, Hamid Reza Motahari-Nezhad, Michael zur Muehlen, Jorge Munoz-Gama, Luigi Pontieri, Joel Ribeiro, Anne Rozinat, Hugo Seguel Pérez, Ricardo Seguel Pérez, Marcos Sepúlveda, Jim Sinur, Pnina Soffer, Minseok Song, Alessandro Sperduti, Giovanni Stilo, Casper Stoel, Keith D. Swenson, Maurizio Talamo, Wei Tan, Christopher Turner, Jan Vanthienen, George Varvaressos, Eric Verbeek, Marc Verdonk, Roberto Vigo, Jianmin Wang, Barbara Weber, Matthias Weidlich, Ton Weijters, Lijie Wen, Michael Westergaard, Moe Thandar Wynn
01 Jan 2012
TL;DR: This manifesto hopes to serve as a guide for software developers, scientists, consultants, business managers, and end-users to increase the maturity of process mining as a new tool to improve the design, control, and support of operational business processes.
Abstract: Process mining techniques are able to extract knowledge from event logs commonly available in today’s information systems. These techniques provide new means to discover, monitor, and improve processes in a variety of application domains. There are two main drivers for the growing interest in process mining. On the one hand, more and more events are being recorded, thus, providing detailed information about the history of processes. On the other hand, there is a need to improve and support business processes in competitive and rapidly changing environments. This manifesto is created by the IEEE Task Force on Process Mining and aims to promote the topic of process mining. Moreover, by defining a set of guiding principles and listing important challenges, this manifesto hopes to serve as a guide for software developers, scientists, consultants, business managers, and end-users. The goal is to increase the maturity of process mining as a new tool to improve the (re)design, control, and support of operational business processes.

1,135 citations

Journal ArticleDOI
TL;DR: In this paper, the authors focus on the importance of maintaining a proper alignment between event logs and process models and elaborate on the realization of such alignments and their application to conformance checking and performance analysis.
Abstract: Process mining techniques use event data to discover process models, to check the conformance of predefined process models, and to extend such models with information about bottlenecks, decisions, and resource usage. These techniques are driven by observed events rather than hand-made models. Event logs are used to learn and enrich process models. By replaying history using the model, it is possible to establish a precise relationship between events and model elements. This relationship can be used to check conformance and to analyze performance. For example, it is possible to diagnose deviations from the modeled behavior. The severity of each deviation can be quantified. Moreover, the relationship established during replay and the timestamps in the event log can be combined to show bottlenecks. These examples illustrate the importance of maintaining a proper alignment between event log and process model. Therefore, we elaborate on the realization of such alignments and their application to conformance checking and performance analysis. © 2012 Wiley Periodicals, Inc.

632 citations

Proceedings ArticleDOI
29 Aug 2011
TL;DR: A robust replay analysis technique is presented that is able to measure the conformance of an event log for a given process model and quantifies conformance and provides intuitive diagnostics (skipped and inserted activities).
Abstract: The growing complexity of processes in many organizations stimulates the adoption of business process analysis techniques. Typically, such techniques are based on process models and assume that the operational processes in reality conform to these models. However, experience shows that reality often deviates from hand-made models. Therefore, the problem of checking to what extent the operational process conforms to the process model is important for process management, process improvement, and compliance. In this paper, we present a robust replay analysis technique that is able to measure the conformance of an event log for a given process model. The approach quantifies conformance and provides intuitive diagnostics (skipped and inserted activities). Our technique has been implemented in the ProM 6 framework. Comparative evaluations show that the approach overcomes many of the limitations of existing conformance checking techniques.
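The replay idea above can be illustrated with a deliberately simplified sketch: the paper aligns a log against a Petri net using cost-based search, but if the model is reduced to a single allowed trace, finding the cheapest mix of synchronous moves, skipped activities (model moves) and inserted activities (log moves) becomes ordinary edit-distance dynamic programming. The names `align` and `fitness` are illustrative, not from the paper.

```python
# Simplified sketch: model = one allowed trace, unit cost for every
# skipped or inserted activity, synchronous moves are free.

def align(trace, model_trace):
    """Return the minimum alignment cost between a log trace and a
    (single-trace) model: each inserted activity (log move) and each
    skipped activity (model move) costs 1."""
    n, m = len(trace), len(model_trace)
    # dp[i][j] = min cost of aligning trace[:i] with model_trace[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i          # only log moves remain
    for j in range(1, m + 1):
        dp[0][j] = j          # only model moves remain
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sync = dp[i-1][j-1] if trace[i-1] == model_trace[j-1] else float("inf")
            dp[i][j] = min(sync,             # synchronous move (free)
                           dp[i-1][j] + 1,   # log move: inserted activity
                           dp[i][j-1] + 1)   # model move: skipped activity
    return dp[n][m]

def fitness(trace, model_trace):
    """Normalize cost to [0, 1]: 1.0 for a perfectly conforming trace."""
    worst = len(trace) + len(model_trace)    # every move asynchronous
    return 1.0 - align(trace, model_trace) / worst if worst else 1.0
```

For example, `align(["a", "b", "d"], ["a", "b", "c", "d"])` returns cost 1 (one skipped activity, "c"), giving a fitness of 1 - 1/7 ≈ 0.86.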

376 citations

DOI
01 Jan 2014
TL;DR: In this article, a memory-efficient technique to compute alignments between event logs and process models has been developed, where low-level deviations, i.e., observed activities that are not allowed according to the model and the other way around, are explicitly identified.
Abstract: Aligning Observed and Modeled Behavior The availability of process models and event logs is rapidly increasing as more and more business processes are supported by IT. On the one hand, most organizations make substantial efforts to document their processes, while on the other hand, these processes leave footprints in their information systems. Although it is possible to extract event logs from today’s systems, the relation between event logs and process models is often identified using heuristics that may yield misleading insights. In this thesis, techniques to align event logs and process models are explored. Based on the obtained alignments, various analysis techniques are developed. The techniques are evaluated against both artificial and real-life process models and event logs. A memory-efficient technique to compute alignments between event logs and process models has been developed. Given an event log and a process model, low-level deviations, i.e., observed activities that are not allowed according to the model and the other way around, are explicitly identified. The technique can also be used to identify high-level deviations such as swapped and replaced activities. Our technique is applied to problems occurring in different domains. Unlike earlier approaches, alignment-based conformance checking techniques are shown to be robust against peculiarities of process models, such as duplicate and invisible tasks. Alignmentbased conformance metrics, such as fitness and precision, are shown to be more intuitive and can deal with multiple level of noise in event logs. Various visualizations of alignments provide powerful diagnostics to identify the context of frequently occurring deviations between process executions and prescribed process models. Applying data mining techniques to alignments yields root causes of deviations between the observed behavior in an event log and the modeled behavior in a process model. 
Alignments also improve the robustness of performance measurements based on event logs and process models, even if the logs are deviating from the models. From a computational point of view, computing alignments is extremely expensive. However, the obtained results indicate that alignments not only provide a theoretically solid basis for analysis based on both models and process executions, but are also able to handle problems of real-life complexity.

214 citations

Book ChapterDOI
03 Sep 2012
TL;DR: An approach is proposed to measure the precision of a process model with respect to an event log; unlike earlier approaches, it first aligns model and log, making it more robust even in the case of deviations.
Abstract: Most organizations have process models describing how cases need to be handled. In fact, legislation and standardization (cf. the Sarbanes-Oxley Act, the Basel II Accord, and the ISO 9000 family of standards) are forcing organizations to document their processes. These processes are often not enforced by information systems. However, torrents of event data are recorded by today’s information systems. These recorded events reflect how processes are really executed. Often reality deviates from the modeled behavior. Therefore, measuring the extent to which process executions conform to a predefined process model is increasingly important. In this paper, we propose an approach to measure the precision of a process model with respect to an event log. Unlike earlier approaches, we first align model and log, thus making our approach more robust, even in case of deviations. The approach has been implemented in the ProM 6 tool and evaluated using both artificial and real-life cases.
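As a rough illustration of the precision idea (not the paper's alignment-based algorithm): replay the log, and at each reached prefix compare what the model allows next with what the log actually does next; model behavior that is never observed ("escaping" behavior) lowers precision. The function names and the simple averaging scheme below are assumptions made for the sketch.

```python
# Hedged sketch of log-vs-model precision: a model that allows much more
# behavior than the log ever exhibits scores low.
from collections import defaultdict

def observed_next(log):
    """Map each trace prefix seen in the log to the set of activities
    that directly follow it somewhere in the log."""
    nxt = defaultdict(set)
    for trace in log:
        for i, act in enumerate(trace):
            nxt[tuple(trace[:i])].add(act)
    return nxt

def precision(log, model_enabled):
    """model_enabled(prefix) -> set of activities the model allows next.
    Returns the average fraction of model-enabled activities that were
    actually observed in the log after each prefix."""
    nxt = observed_next(log)
    ratios = []
    for prefix, seen in nxt.items():
        enabled = model_enabled(prefix)
        if enabled:
            ratios.append(len(seen & enabled) / len(enabled))
    return sum(ratios) / len(ratios) if ratios else 1.0
```

On a toy model a → (b | c) → d, a log that only ever takes the b branch leaves the c branch as escaping behavior, so precision drops below 1; a log exercising both branches scores 1.0.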

144 citations


Cited by
Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, handwriting recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules.
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).

13,246 citations

01 Jan 2002

9,314 citations

Journal ArticleDOI
TL;DR: The practical relevance of BPM and the rapid developments of the last decade justify a comprehensive survey providing an overview of the state-of-the-art in BPM.
Abstract: Business Process Management (BPM) research resulted in a plethora of methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. This survey aims to structure these results and provide an overview of the state-of-the-art in BPM. In BPM the concept of a process model is fundamental. Process models may be used to configure information systems, but may also be used to analyze, understand, and improve the processes they describe. Hence, the introduction of BPM technology has both managerial and technical ramifications and may enable significant productivity improvements, cost savings, and flow-time reductions. The practical relevance of BPM and rapid developments over the last decade justify a comprehensive survey.

739 citations

Journal ArticleDOI
TL;DR: In this paper, the authors focus on the importance of maintaining a proper alignment between event logs and process models and elaborate on the realization of such alignments and their application to conformance checking and performance analysis.
Abstract: Process mining techniques use event data to discover process models, to check the conformance of predefined process models, and to extend such models with information about bottlenecks, decisions, and resource usage. These techniques are driven by observed events rather than hand-made models. Event logs are used to learn and enrich process models. By replaying history using the model, it is possible to establish a precise relationship between events and model elements. This relationship can be used to check conformance and to analyze performance. For example, it is possible to diagnose deviations from the modeled behavior. The severity of each deviation can be quantified. Moreover, the relationship established during replay and the timestamps in the event log can be combined to show bottlenecks. These examples illustrate the importance of maintaining a proper alignment between event log and process model. Therefore, we elaborate on the realization of such alignments and their application to conformance checking and performance analysis. © 2012 Wiley Periodicals, Inc.

632 citations

Proceedings ArticleDOI
11 Apr 2011
TL;DR: A new process representation language is presented, together with an accompanying process mining algorithm, that yields easy-to-understand process models even in the presence of non-trivial constructs, low-structured domains, and noise.
Abstract: One of the aims of process mining is to retrieve a process model from a given event log. However, current techniques have problems when mining processes that contain non-trivial constructs, processes that have a low degree of structure, and/or event logs that contain noise. To overcome these problems, a new process representation language is presented in combination with an accompanying process mining algorithm. The most significant property of the new representation language is the way the semantics of splits and joins are represented, namely by using so-called split/join frequency tables. This results in easy-to-understand process models even in the case of non-trivial constructs, low-structured domains, and the presence of noise. This paper explains the new process representation language and how the mining algorithm works. The algorithm is implemented as a plug-in in the ProM framework. An illustrative example with noise and a real-life log of a complex and low-structured process are used to explicate the presented approach.
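A minimal sketch of the kind of frequency information such a representation is built from: the paper records full split/join frequency tables per activity, while the simpler directly-follows table below only counts pairwise successions, but it conveys the underlying idea. The function name is illustrative, not from the paper.

```python
# Sketch only: count how often each activity is directly followed by
# another across all traces of an event log; split/join frequency tables
# are derived from this kind of succession information.
from collections import Counter

def directly_follows(log):
    """Return a Counter mapping (a, b) to the number of times activity a
    is directly followed by activity b in the log."""
    table = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            table[(a, b)] += 1
    return table
```

For the log `[["a", "b", "c"], ["a", "c"]]` the table records a→b, b→c, and a→c once each; a split table for "a" would then be derived from how these successors co-occur across traces.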

474 citations