
Showing papers on "Process modeling published in 2020"


Journal ArticleDOI
TL;DR: A comprehensive review of the design and modeling aspects of key aerospace composites manufacturing techniques using the X-ray computed tomography (XCT) procedure is presented in this paper, with a view to adopting several modeling approaches ranging from mesoscale to macroscale models for composites processing.

81 citations


Journal ArticleDOI
TL;DR: Challenges in modeling due to specific assumptions and limitations will be highlighted to provide a useful basis for researchers and end-users for further process modeling of biomass gasification in Aspen Plus.
Abstract: Aspen Plus has become one of the most common process simulation tools for both academia and industrial applications. In the last decade, the number of papers on Aspen Plus modeling of biomass gasification has significantly increased. This review focuses on recent developments and studies on modeling biomass gasification in Aspen Plus, including key aspects such as tar formation and model validation. Accordingly, challenges in modeling due to specific assumptions and limitations will be highlighted to provide a useful basis for researchers and end-users for further process modeling of biomass gasification in Aspen Plus.

59 citations


Journal ArticleDOI
TL;DR: The genetic algorithm (GA) was employed to reduce the dimensionality of the input variables, simplify the network structure, and handle the dynamic characteristics of process data, thereby improving the predictive accuracy and reliability of process monitoring.

54 citations


Journal ArticleDOI
TL;DR: The interactive serious gaming application was devised to encourage public engagement, facilitate communication and positive relationships between watershed communities, and make the decision process more attractive and transparent for the stakeholders.

44 citations


Journal ArticleDOI
TL;DR: A dynamic regularized latent variable regression (DrLVR) algorithm is proposed for dynamic data modeling and monitoring that aims to maximize the projection of quality variables on the dynamic latent spaces of the process variables.

42 citations


Journal ArticleDOI
TL;DR: A systematic literature review based on 24 papers rigorously selected from four popular search engines in 2018 is provided to assess the state of goal-oriented process mining, highlighting that the use of process mining in association with goals does not yet have a coherent line of research, whereas intention mining shows a meaningful trace of research.
Abstract: Process mining helps infer valuable insights about business processes using event logs, whereas goal modeling focuses on the representation and analysis of competing goals of stakeholders and systems. Although there are clear benefits in mining the goals of existing processes, goal-oriented approaches that consider logs during model construction are still rare. Process mining techniques, when generalizing large instance-level data into process models, can be considered as a data-driven complement to use case/scenario elicitation. Requirements engineers can exploit process mining techniques to find new system or process requirements in order to align current practices and desired ones. This paper provides a systematic literature review, based on 24 papers rigorously selected from four popular search engines in 2018, to assess the state of goal-oriented process mining. Through two research questions, the review highlights that the use of process mining in association with goals does not yet have a coherent line of research, whereas intention mining (where goal models are mined) shows a meaningful trace of research. Research about performance indicators measuring goals associated with process mining is also sparse. Although the number of publications in process mining and goal modeling is trending up, goal mining and goal-oriented process mining remain modest research areas. Yet, synergetic effects achievable by combining goals and process mining can potentially augment the precision, rationality and interpretability of mined models and eventually improve opportunities to satisfy system stakeholders.

40 citations


Journal ArticleDOI
TL;DR: It is demonstrated that if the size and porosity of adsorbent pellets are optimized, the efficiency and productivity of the process can be substantially improved, and a universal dimensionless metric is proposed here for the first time.
Abstract: Performance-based screening of porous materials for CO2 capture and gas separation requires development of multiscale simulation workflows where physiochemical characteristics of adsorbents are obtained from molecular simulations, while separation performance of materials is evaluated at the process level by comparing overall energy efficiency and productivity in a particular process configuration. Practical implementation of these workflows requires: (a) accurate calculation of various material properties some of which are poorly estimated so far (e.g. specific heat capacity), (b) consistent treatment of the process variables that cannot be calculated from molecular simulations but are crucial for process modelling (e.g. pellet size and porosity), (c) improving computational efficiency of the workflows by reducing the search space in process optimization. In this study, we focus on four representative materials in the context of the vacuum swing adsorption process for carbon capture to probe these issues. We report on several observations with important implications for the theoretically achievable process efficiency, the computational efficiency of the multiscale workflows and on the consistency of materials rankings. We demonstrate that if size and porosity of adsorbent pellets are optimized, efficiency and productivity of the process can be substantially improved. We show the maximum performance of a material achievable in a particular process depends on a complex combination of both intrinsic material properties and process variables. This is evident from the ranking of the materials being different for a process with optimizable pellet size and porosity, compared to the reference case where these two properties are fixed. Analysis of the cycles on the Pareto fronts reveals common patterns for these variables for all the materials under consideration. We demonstrate that this observation reflects some optimum balance in the competition between diffusive processes into the pellet and convection flow processes across the bed. We attempt to capture this balance in a universal dimensionless metric which is explicitly proposed here for the first time. Application of such universal metrics could be very important in improving the efficiency of the optimization algorithms by narrowing down the multidimensional search space.

36 citations
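
The universal dimensionless metric itself is not given in the abstract. As a rough, hedged illustration of the diffusion-versus-convection balance it refers to, the sketch below compares a characteristic intra-pellet diffusion time with a bed-scale residence time; the function names, parameter values, and the simple ratio are hypothetical and are not the authors' metric.

```python
def diffusion_time(pellet_radius_m, effective_diffusivity_m2_s):
    """Characteristic time for diffusion into a spherical pellet, ~ R^2 / D_e."""
    return pellet_radius_m ** 2 / effective_diffusivity_m2_s

def convection_time(bed_length_m, interstitial_velocity_m_s):
    """Characteristic residence time of gas flowing through the bed, ~ L / v."""
    return bed_length_m / interstitial_velocity_m_s

# Hypothetical operating point for a vacuum swing adsorption column
t_diff = diffusion_time(pellet_radius_m=1.0e-3, effective_diffusivity_m2_s=5.0e-7)
t_conv = convection_time(bed_length_m=1.0, interstitial_velocity_m_s=0.1)

# A ratio of this kind expresses the diffusion/convection competition the abstract
# mentions; values near unity mean neither mechanism dominates.
print(f"t_diff = {t_diff:.1f} s, t_conv = {t_conv:.1f} s, ratio = {t_diff / t_conv:.2f}")
```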


Journal ArticleDOI
22 May 2020
TL;DR: A hybrid framework involving machine learning-assisted process modeling and optimization for controlling the melt pool geometry during the build process is developed and validated using experimental observations and demonstrates that a model-based optimization can be significantly accelerated using tools of machine learning in a data-driven setting.
Abstract: Metal additive manufacturing (AM) works on the principle of consolidating feedstock material in layers towards the fabrication of complex objects through localized melting and resolidification using high-power energy sources. Powder bed fusion and directed energy deposition are two widespread metal AM processes that are currently in use. During layer-by-layer fabrication, as the components continue to gain thermal energy, the melt pool geometry undergoes substantial changes if the process parameters are not appropriately adjusted on-the-fly. Although control of melt pool geometry via feedback or feedforward methods is a possibility, the time needed for changes in process parameters to translate into adjustments in melt pool geometry is of critical concern. A second option is to implement multi-physics simulation models that can provide estimates of temporal process parameter evolution. However, such models are computationally near intractable when they are coupled with an optimization framework for finding process parameters that can retain the desired melt pool geometry as a function of time. To address these challenges, a hybrid framework involving machine learning-assisted process modeling and optimization for controlling the melt pool geometry during the build process is developed and validated using experimental observations. A widely used 3D analytical model capable of predicting the thermal distribution in a moving melt pool is implemented and, thereafter, a nonparametric Bayesian, namely, Gaussian Process (GP), model is used for the prediction of time-dependent melt pool geometry (e.g., dimensions) at different values of the process parameters with excellent accuracy along with uncertainty quantification at the prediction points. Finally, a surrogate-assisted statistical learning and optimization architecture involving GP-based modeling and Bayesian Optimization (BO) is employed for predicting the optimal set of process parameters as the scan progresses to keep the melt pool dimensions at desired values. The results demonstrate that a model-based optimization can be significantly accelerated using tools of machine learning in a data-driven setting and reliable a priori estimates of process parameter evolution can be generated to obtain desired melt pool dimensions for the entire build process.

35 citations
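
As a minimal sketch of the surrogate-assisted loop described above (GP modeling plus Bayesian optimization), the following snippet uses scikit-learn and an expected-improvement acquisition; the melt_pool_width placeholder, the target width, and the parameter ranges are hypothetical stand-ins, not the paper's analytical model or settings.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def melt_pool_width(power_w, speed_mm_s):
    """Placeholder for the 3D analytical thermal model (hypothetical formula)."""
    return 0.4 * power_w / (speed_mm_s + 50.0)

TARGET_WIDTH_MM = 0.5  # hypothetical desired melt-pool width

def objective(x):
    """Squared deviation of the predicted width from the desired width."""
    return (melt_pool_width(x[0], x[1]) - TARGET_WIDTH_MM) ** 2

LOW, HIGH = [150.0, 200.0], [400.0, 1200.0]  # (laser power W, scan speed mm/s) bounds, made up

X = rng.uniform(LOW, HIGH, size=(8, 2))      # initial design
y = np.array([objective(x) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):                          # Bayesian optimization iterations
    gp.fit(X, y)
    cand = rng.uniform(LOW, HIGH, size=(500, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement (minimization)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("Best (power W, speed mm/s):", X[np.argmin(y)], "objective:", y.min())
```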


Journal ArticleDOI
TL;DR: This work establishes the shape deviation generator (SDG), a novel data-analytical framework based on a convolution formulation, to model 3-D shape formation in the AM process, and develops an engineering-informed machine-learning framework that facilitates learning from AM data to build models for geometric shape accuracy prediction and control.
Abstract: The 3-D shape accuracy is a critical performance measure for products built via additive manufacturing (AM). With advances in computing and increased accessibility of AM product data, machine learning for AM (ML4AM) has become a viable strategy for enhancing 3-D printing performance. A proper description of the 3-D shape formation through the layer-by-layer fabrication process is critical to ML4AM, such as feature selection and AM process modeling. The physics-based modeling and simulation approaches present voxel-level description of an object formation from points to lines, lines to surfaces, and surfaces to 3-D shapes. However, this computationally intensive modeling framework does not provide a clear structure for machine learning of AM data. Significant progress has been made to model and predict the shape accuracy of planar objects under data analytical frameworks. In order to predict, learn, and compensate for 3-D shape deviations using shape measurement data, we propose a shape deviation generator (SDG) under a novel convolution formulation to facilitate the learning and prediction of 3-D printing accuracy. The shape deviation representation, individual layer input function, and layer-to-layer transfer function for the convolution modeling framework are proposed and derived. A deconvolution problem for identifying the transfer function is formulated to capture the interlayer interaction and error accumulation effects in the layer-by-layer fabrication processes. Physics-informed sequential model estimation is developed to fully establish the SDG models. The Gaussian process regression is adopted to capture spatial correlations. The printed 2-D and 3-D shapes via a stereolithography (SLA) process are used to demonstrate the proposed modeling framework and derive new process insights for AM processes. Note to Practitioners —With advances in computing and increased availability of product data, machine learning for additive manufacturing (ML4AM) has become a viable strategy for enhancing 3-D printing accuracy. This work establishes the shape deviation generator (SDG) as a novel data analytical framework through a convolution formulation to model the 3-D shape formation in the AM process. This new engineering-informed machine-learning framework will facilitate the learning of AM data to establish models for geometric shape accuracy prediction and control.

34 citations
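
The convolution idea behind the SDG can be illustrated with a toy discrete convolution of a per-layer input with a layer-to-layer transfer function; the decaying transfer function and layer inputs below are made up for illustration and do not reproduce the paper's estimated model.

```python
import numpy as np

n_layers = 50
layer_input = 0.02 * np.ones(n_layers)  # per-layer deviation input in mm (made up)
transfer = 0.6 ** np.arange(n_layers)   # geometrically decaying layer-to-layer transfer (made up)

# Convolution: deviation[k] = sum_j transfer[k - j] * layer_input[j], capturing
# inter-layer interaction and error accumulation across the build.
deviation = np.convolve(layer_input, transfer)[:n_layers]

print("Predicted cumulative deviation at the top layer: %.4f mm" % deviation[-1])
```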


Journal ArticleDOI
TL;DR: A review of the dynamic modeling approaches and the control strategies for the main process sections of biomass thermochemical conversion, either via gasification or through direct combustion, was undertaken.

33 citations


Proceedings ArticleDOI
TL;DR: In this paper, a set of conformance checking techniques described in 37 scholarly publications is presented, along with a concept matrix that highlights the dimensions where extant research concentrates and where blind spots exist.
Abstract: Conformance checking is a set of process mining functions that compare process instances with a given process model. It identifies deviations between the process instances' actual behaviour ("as-is") and the modelled behaviour ("to-be"). Especially in the context of analyzing compliance in organizations, it is currently gaining momentum -- e.g. for auditors. Researchers have proposed a variety of conformance checking techniques that are geared towards certain process model notations or specific applications such as process model evaluation. This article reviews a set of conformance checking techniques described in 37 scholarly publications. It classifies the techniques along the dimensions "modelling language", "algorithm type", "quality metric", and "perspective" using a concept matrix so that the techniques can be better accessed by practitioners and researchers. The matrix highlights the dimensions where extant research concentrates and where blind spots exist. For instance, process miners often use declarative process modelling languages, but applications in conformance checking are rare. Likewise, process mining can investigate process roles or process metrics such as duration, but conformance checking techniques focus narrowly on analyzing control-flow. Future research may construct techniques that support these neglected approaches to conformance checking.
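
As a small, self-contained illustration of the general idea (not of any specific technique surveyed), the snippet below compares the directly-follows footprint observed in a toy event log with the footprint permitted by a toy model, reporting the deviating pairs.

```python
def directly_follows(traces):
    """Collect all (a, b) pairs where activity b directly follows activity a."""
    pairs = set()
    for trace in traces:
        pairs.update(zip(trace, trace[1:]))
    return pairs

# "As-is" behaviour recorded in a toy event log
log_traces = [("register", "check", "approve"), ("register", "approve")]

# "To-be" behaviour permitted by a toy process model
model_pairs = {("register", "check"), ("check", "approve")}

deviations = directly_follows(log_traces) - model_pairs
print("Deviating directly-follows pairs:", deviations)  # {('register', 'approve')}
```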

Proceedings ArticleDOI
19 Oct 2020
TL;DR: An updated process model for engineering software product lines, based on recent literature as well as experiences with industrial partners, is presented, which offers contemporary guidance for product-line engineers developing and evolving platforms, and inspires researchers to build novel methods and tools aligned with current practice.
Abstract: Process models for software product-line engineering focus on proactive adoption scenarios---that is, building product-line platforms from scratch. They comprise the two phases domain engineering (building a product-line platform) and application engineering (building individual variants), each of which defines various development activities. Established more than two decades ago, these process models are still the de-facto standard for steering the engineering of platforms and variants. However, observations from industrial and open-source practice indicate that the separation between domain and application engineering, with their respective activities, does not fully reflect reality. For instance, organizations rarely build platforms from scratch, but start with developing individual variants that are re-engineered into a platform when the need arises. Organizations also appear to evolve platforms by evolving individual variants, and they use contemporary development activities aligned with technical advances. Recognizing this discrepancy, we present an updated process model for engineering software product lines. We employ a method for constructing process theories, building on recent literature as well as our experiences with industrial partners to identify development activities and the orders in which these are performed. Based on these activities, we synthesize and discuss the new process model, called promote-pl. Also, we explain its relation to modern software-engineering practices, such as continuous integration, model-driven engineering, or simulation testing. We hope that our work offers contemporary guidance for product-line engineers developing and evolving platforms, and inspires researchers to build novel methods and tools aligned with current practice.

Journal ArticleDOI
TL;DR: This work proposes to use design of experiment studies in combination with hybrid modeling for process characterization, in order to understand the impact of temporal deviations and production dynamics and to provide a better understanding of the process variations that stem from the biological subsystem.
Abstract: Upstream bioprocess characterization and optimization are time and resource-intensive tasks. Regularly in the biopharmaceutical industry, statistical design of experiments (DoE) in combination with response surface models (RSMs) are used, neglecting the process trajectories and dynamics. Generating process understanding with time-resolved, dynamic process models allows to understand the impact of temporal deviations, production dynamics, and provides a better understanding of the process variations that stem from the biological subsystem. The authors propose to use DoE studies in combination with hybrid modeling for process characterization. This approach is showcased on Escherichia coli fed-batch cultivations at the 20L scale, evaluating the impact of three critical process parameters. The performance of a hybrid model is compared to a pure data-driven model and the widely adopted RSM of the process endpoints. Further, the performance of the time-resolved models to simultaneously predict biomass and titer is evaluated. The superior behavior of the hybrid model compared to the pure black-box approaches for process characterization is presented. The evaluation considers important criteria, such as the prediction accuracy of the biomass and titer endpoints as well as the time-resolved trajectories. This showcases the high potential of hybrid models for soft-sensing and model predictive control.
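
A minimal sketch of a hybrid model in the sense used above is shown below: a mechanistic biomass/substrate balance whose specific growth rate comes from a data-driven sub-model (here a trivial stand-in for a trained regressor). All kinetics, yields, and operating values are hypothetical and do not reproduce the E. coli case study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def learned_growth_rate(substrate, temperature):
    """Stand-in for the data-driven part of the hybrid model (hypothetical kinetics)."""
    return 0.4 * substrate / (substrate + 0.5) * np.exp(-0.01 * (temperature - 37.0) ** 2)

def fed_batch_rhs(t, y, feed_rate, temperature):
    biomass, substrate = y
    mu = learned_growth_rate(max(substrate, 0.0), temperature)  # data-driven kinetics
    d_biomass = mu * biomass                                    # mechanistic biomass balance
    d_substrate = -mu * biomass / 0.5 + feed_rate               # yield 0.5 gX/gS (made up)
    return [d_biomass, d_substrate]

sol = solve_ivp(fed_batch_rhs, t_span=(0.0, 20.0), y0=[0.1, 10.0],
                args=(0.2, 37.0))  # feed rate in g/(L h) and temperature in degC, both hypothetical
print("Predicted biomass after 20 h: %.2f g/L" % sol.y[0, -1])
```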

Journal ArticleDOI
TL;DR: Various nonlinear latent variable models based on autoencoder (AE) are developed in order to extract deeper nonlinear features from process data and provide a deep generative structure for nonlinear process monitoring and quality prediction.

Journal ArticleDOI
TL;DR: A hybrid predictive model based on transition systems and statistical regression is proposed; the model is “product-oriented”, tailored to better predict online cycle-times in industrial environments, and optimized by a linear programming model.

Journal ArticleDOI
TL;DR: This paper reviewed 40 proposed DBFI process models for RDBMS in the literature to offer up-to-date and comprehensive background knowledge on existingDBFI process model research, their associated challenges, issues for newcomers, and potential solutions for addressing such issues.
Abstract: Database Forensic Investigation (DBFI) involves the identification, collection, preservation, reconstruction, analysis, and reporting of database incidents. However, it is a heterogeneous, complex, and ambiguous field due to the variety and multidimensional nature of database systems. A small number of DBFI process models have been proposed to solve specific database scenarios using different investigation processes, concepts, activities, and tasks as surveyed in this paper. Specifically, we reviewed 40 proposed DBFI process models for RDBMS in the literature to offer up-to-date and comprehensive background knowledge on existing DBFI process model research, their associated challenges, issues for newcomers, and potential solutions for addressing such issues. This paper highlights three common limitations of the DBFI domain, which are: 1) redundant and irrelevant investigation processes; 2) redundant and irrelevant investigation concepts and terminologies; and 3) a lack of unified models to manage, share, and reuse DBFI knowledge. Also, this paper suggests three solutions for the discovered limitations, which are: 1) propose generic DBFI process/model for the DBFI field; 2) develop a semantic metamodeling language to structure, manage, organize, share, and reuse DBFI knowledge; and 3) develop a repository to store and retrieve DBFI field knowledge.

Journal ArticleDOI
TL;DR: This paper presents an integrated platform that brings interoperability to several simulation components; it expands the process modeling tool Papyrus to allow it to communicate with external components through both distributed simulation and cosimulation standards.
Abstract: In order to control manufacturing systems, managers need risk and performance evaluation methods and simulation tools. However, these simulation techniques must evolve towards being multiperformance, multiactor, and multisimulation tools, and this requires interoperability between those distributed components. This paper presents an integrated platform that brings interoperability to several simulation components. This work expands the process modeling tool Papyrus to allow it to communicate with external components through both distributed simulation and cosimulation standards. The distributed modeling and simulation framework (DMSF) platform takes its environment into consideration in order to evaluate the sustainability of the system while integrating external heterogeneous components. For instance, a DMSF connection with external IoT devices has been implemented. Moreover, the orchestration of different smart manufacturing components and services is achieved through configurable business models. As a result, an automotive industry case study has successfully been tested to demonstrate the sustainability of smart supply chains and manufacturing factories, allowing better connectivity with their real environments.

Journal ArticleDOI
TL;DR: This work combined existing concepts and ontologies to define a “Hybrid Business Process Representation” (HBPR), and conducted a Systematic Literature Review to identify and investigate the characteristics of HBPRs combining imperative and declarative languages or artifacts.

Journal ArticleDOI
TL;DR: A novel robust Bayesian network is proposed for process modeling with low-quality data since unreliable data can cause model parameters to deviate from the real distributions and make network structures unable to characterize the true causalities.

Journal ArticleDOI
TL;DR: A model-based workflow is described to computationally evaluate the bioprocess dynamics during process transfer and scale-up, providing a knowledge-driven evaluation tool for bioprocess development.

Journal ArticleDOI
TL;DR: This work presents a novel process design optimization framework for additive manufacturing by integrating physics-informed computational simulation models with experimental observations to optimize the process parameters such as extrusion temperature, extrusion velocity, and layer thickness in the fused filament fabrication (FFF) AM process.
Abstract: This work presents a novel process design optimization framework for additive manufacturing (AM) by integrating physics-informed computational simulation models with experimental observations. The proposed framework is implemented to optimize the process parameters such as extrusion temperature, extrusion velocity, and layer thickness in the fused filament fabrication (FFF) AM process, in order to reduce the variability in the geometry of the manufactured part. A coupled thermo-mechanical model is first developed to simulate the FFF process. The temperature history obtained from the heat transfer analysis is then used as input for the mechanical deformation analysis to predict the dimensional inaccuracy of the additively manufactured part. The simulation model is then corrected based on experimental observations through Bayesian calibration of the model discrepancy to make it more accurately represent the actual manufacturing process. Based on the corrected prediction model, a robustness-based design optimization problem is formulated to optimize the process parameters, while accounting for multiple sources of uncertainty in the manufacturing process, process models, and measurements. Physical experiments are conducted to verify the effectiveness of the proposed optimization framework.
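
The discrepancy-correction step described above can be sketched, in simplified form, as fitting a data-driven term to the residuals between experiments and the simulation and adding it to the simulator's prediction; the placeholder shrinkage model, the observations, and the GP residual fit below are illustrative assumptions, not the paper's Bayesian calibration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def simulated_shrinkage(extrusion_temp_c):
    """Placeholder for the thermo-mechanical simulation output (hypothetical)."""
    return 0.002 * (extrusion_temp_c - 190.0)

# Hypothetical experimental observations of dimensional error (mm) at four temperatures
temps = np.array([[200.0], [210.0], [220.0], [230.0]])
observed = np.array([0.025, 0.046, 0.070, 0.094])

# Fit a data-driven discrepancy term to the simulation residuals
residuals = observed - simulated_shrinkage(temps).ravel()
discrepancy = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(temps, residuals)

def corrected_model(extrusion_temp_c):
    """Simulation plus learned discrepancy, as used inside the design optimization."""
    x = np.atleast_2d(extrusion_temp_c)
    return simulated_shrinkage(x).ravel() + discrepancy.predict(x)

print("Corrected dimensional-error prediction at 215 degC:", corrected_model(215.0)[0])
```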

Journal ArticleDOI
TL;DR: This work introduces a novel method, called f-HMD, which aims at scalable hybrid model discovery in a cloud computing environment and returns hybrid process models to bridge the gap between formal and informal models.
Abstract: Process descriptions are used to create products and deliver services. To improve processes and services, the first step is to learn a process model. Process discovery is such a technique, which can automatically extract process models from event logs. Although various discovery techniques have been proposed, they focus on either constructing formal models, which are very powerful but complex, or creating informal models, which are intuitive but lack semantics. In this work, we introduce a novel method that returns hybrid process models to bridge this gap. Moreover, to cope with today's big event logs, we propose an efficient method, called f-HMD, which aims at scalable hybrid model discovery in a cloud computing environment. We present the detailed implementation of our approach over the Spark framework, and our experimental results demonstrate that the proposed method is efficient and scalable.
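
The f-HMD algorithm itself is not reproduced here; the snippet below only illustrates the kind of scalable, Spark-based event-log processing the abstract refers to, counting directly-follows relations across traces in parallel on a toy log.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("df-count-sketch").getOrCreate()

traces = [                                   # toy event log, one tuple per trace
    ("register", "check", "approve"),
    ("register", "approve"),
    ("register", "check", "reject"),
]

pairs = (
    spark.sparkContext.parallelize(traces)
    .flatMap(lambda t: list(zip(t, t[1:])))  # directly-follows pairs per trace
    .map(lambda p: (p, 1))
    .reduceByKey(lambda a, b: a + b)         # aggregate counts across partitions
)

print(pairs.collect())
spark.stop()
```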

Journal ArticleDOI
TL;DR: An approach and meta-model are proposed for developing company-specific ontologies, which can be used for modelling the portfolio of components, processes, and equipment and their interrelations as a basis for co-evolution.
Abstract: Changeable manufacturing and platform-based co-development of products and manufacturing systems are becoming increasingly relevant for industrial manufacturing in order to respond to volatile mark...

Book ChapterDOI
TL;DR: In this chapter, the requirements that process models must fulfil in order to be used for process optimization and finally in Digital Twins are described. Different types of models, including mechanistic as well as compartmentalized models, are outlined, and their application in Digital Twins and for process optimization is explained.
Abstract: A future-oriented approach is the application of Digital Twins for process development, optimization and finally during manufacturing. Digital Twins are detailed virtual representations of bioprocesses with predictive capabilities. In biotechnology, Digital Twins can be used to monitor processes and to provide data for process control and optimization. Central and crucial components of Digital Twins are mathematical process models, which are capable of describing and predicting cultivations with high fidelity. Detailed mechanistic models in particular are suitable for both use in Digital Twins and for the development of process control strategies. In this chapter, the requirements that process models must fulfil in order to be used for process optimization and finally in Digital Twins will be described. Different types of models, including mechanistic as well as compartmentalized models, are outlined and their application in Digital Twins and for process optimization is explained. Finally, a structured, compartmentalized process model, which was specifically designed for process optimization and has already been used in Digital Twins, is highlighted.
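
A hedged sketch of the predictive use of a mechanistic model inside a Digital Twin: the virtual process is re-initialized from the latest online measurement and simulated forward to forecast the remaining cultivation. The Monod parameters and measured state below are hypothetical placeholders, not the structured, compartmentalized model highlighted in the chapter.

```python
MU_MAX, K_S, Y_XS = 0.5, 0.2, 0.5  # 1/h, g/L, gX/gS (made-up Monod parameters)

def forecast(biomass, substrate, hours, dt=0.05):
    """Explicit-Euler forecast of a simple Monod batch model from the current state."""
    for _ in range(int(hours / dt)):
        mu = MU_MAX * substrate / (K_S + substrate)
        growth = mu * biomass * dt
        biomass += growth
        substrate = max(substrate - growth / Y_XS, 0.0)
    return biomass, substrate

# Latest measured state pushed to the twin (hypothetical values)
x_now, s_now = 3.2, 4.5
x_pred, s_pred = forecast(x_now, s_now, hours=8.0)
print(f"Forecast in 8 h: biomass {x_pred:.1f} g/L, substrate {s_pred:.2f} g/L")
```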

Journal ArticleDOI
TL;DR: The paper presents and compares two approaches to the problem of discovering constraints that involve two activities of the process such that each of these two activities is associated with a condition that must hold when the activity occurs.

Journal ArticleDOI
TL;DR: The simulation results of a nonlinear chemical process network example demonstrate the effective closed-loop control performance when the process is operated under the decentralized MPCs using the independently-trained recurrent neural network models, as well as the improved computational efficiency compared to the closed-loop simulation of a centralized MPC system.
Abstract: This work focuses on the design of decentralized model predictive control (MPC) systems for nonlinear processes, where the nonlinear process is broken down into multiple, yet coupled subsystems and the dynamic behavior of each subsystem is described by a machine learning model. One decentralized MPC is designed and used to control each subsystem while accounting for the interactions between subsystems through feedback of the entire process state. The closed-loop stability of the overall nonlinear process network and the performance properties of the decentralized model predictive control system using machine-learning prediction models are analyzed. More specifically, multiple recurrent neural network models suited for each different subsystem need to be trained with a sufficiently small modeling error from their respective actual nonlinear process models to ensure closed-loop stability. These recurrent neural network models are subsequently used as the prediction model in decentralized Lyapunov-based MPCs to achieve efficient real-time computation time while ensuring closed-loop state boundedness and convergence to the origin. The simulation results of a nonlinear chemical process network example demonstrate the effective closed-loop control performance when the process is operated under the decentralized MPCs using the independently-trained recurrent neural network models, as well as the improved computational efficiency compared to the closed-loop simulation of a centralized MPC system.
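
A schematic sketch of the decentralized structure described above: one controller per subsystem, each using a surrogate model of its own subsystem together with feedback of the full process state, and each picking the input that minimizes a one-step predicted cost over a small candidate set. The toy plant, surrogate, and cost are hypothetical; the paper's recurrent neural network models and Lyapunov-based constraints are not reproduced.

```python
import numpy as np

# Toy coupled two-subsystem plant: each subsystem's state is weakly affected by the other.
A = np.array([[0.90, 0.10],
              [0.05, 0.85]])
B = np.eye(2)

def surrogate(subsystem, full_state, u):
    """Stand-in for the trained recurrent-network prediction of one subsystem's next state."""
    return float(A[subsystem] @ full_state + B[subsystem, subsystem] * u)

def decentralized_mpc(subsystem, full_state, candidates=np.linspace(-1.0, 1.0, 41)):
    """Pick the input minimizing a one-step predicted cost for this subsystem only."""
    costs = [surrogate(subsystem, full_state, u) ** 2 + 0.1 * u ** 2 for u in candidates]
    return candidates[int(np.argmin(costs))]

x = np.array([2.0, -1.5])                                      # initial process state
for step in range(10):
    u = np.array([decentralized_mpc(i, x) for i in range(2)])  # two independent controllers
    x = A @ x + B @ u                                          # plant update with both inputs
print("State after 10 steps:", x)
```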

Journal ArticleDOI
TL;DR: A new framework is proposed for understanding, modelling and software engineering in construction information activities; it provides a theoretical and conceptual basis for designing, planning, creating, monitoring and evaluating construction-related online services that include a strong social component and use social media services.
Abstract: This paper introduces a new framework for understanding, modelling and software engineering in construction information activities. The current framework is based on understanding that products are...

Journal ArticleDOI
TL;DR: This paper proves that the information and the footprints preserved in the abstractions suffice to unambiguously rediscover the exact process model from a finite event log and outlines the implications for process mining techniques.
Abstract: Process mining aims at obtaining information about processes by analysing their past executions in event logs, event streams, or databases. Discovering a process model from a finite amount of event data thereby has to correctly infer infinitely many unseen behaviours. To this end, many process discovery techniques leverage abstractions on the finite event data to infer and preserve behavioural information of the underlying process. However, the fundamental information-preserving properties of these abstractions are not well understood yet. In this paper, we study the information-preserving properties of the “directly follows” abstraction and its limitations. We overcome these by proposing and studying two new abstractions which preserve even more information in the form of finite graphs. We then show how and characterize when process behaviour can be unambiguously recovered through characteristic footprints in these abstractions. Our characterization defines large classes of practically relevant processes covering various complex process patterns. We prove that the information and the footprints preserved in the abstractions suffice to unambiguously rediscover the exact process model from a finite event log. Furthermore, we show that all three abstractions are relevant in practice to infer process models from event logs and outline the implications for process mining techniques.
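
The information loss of the plain directly-follows abstraction that the paper analyses can be seen in a few lines: two logs with different trace sets can yield exactly the same directly-follows relation, so the abstraction alone cannot distinguish them. The toy traces below are made up.

```python
def dfg(traces):
    """Directly-follows relation over traces, with artificial start/end activities."""
    edges = set()
    for trace in traces:
        edges.update(zip(("start",) + trace, trace + ("end",)))
    return edges

log_1 = [("a", "b", "b", "c")]                        # one trace with b repeated twice
log_2 = [("a", "b", "c"), ("a", "b", "b", "b", "c")]  # different traces, different behaviour

# Both logs induce the identical directly-follows abstraction:
print(dfg(log_1) == dfg(log_2))  # True
```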

Book ChapterDOI
08 Jun 2020
TL;DR: This paper formalizes the PME task into the multi-grained text classification problem, and proposes a hierarchical neural network to effectively model and extract multi-grained information without manually-defined procedural features.
Abstract: Process model extraction (PME) is a recently emerged interdiscipline between natural language processing (NLP) and business process management (BPM), which aims to extract process models from textual descriptions. Previous process extractors heavily depend on manual features and ignore the potential relations between clues of different text granularities. In this paper, we formalize the PME task into the multi-grained text classification problem, and propose a hierarchical neural network to effectively model and extract multi-grained information without manually-defined procedural features. Under this structure, we accordingly propose the coarse-to-fine (grained) learning mechanism, training multi-grained tasks in coarse-to-fine grained order to share the high-level knowledge for the low-level tasks. To evaluate our approach, we construct two multi-grained datasets from two different domains and conduct extensive experiments from different dimensions. The experimental results demonstrate that our approach outperforms the state-of-the-art methods with statistical significance and further investigations demonstrate its effectiveness.
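
A schematic PyTorch sketch of a shared encoder with a coarse-grained and a fine-grained classification head, in the spirit of the multi-grained setup described above; the dimensions, label counts, and the decaying coarse-task weight standing in for the coarse-to-fine schedule are hypothetical, and the paper's architecture is not reproduced.

```python
import torch
import torch.nn as nn

class MultiGrainedClassifier(nn.Module):
    """Shared sentence encoder with a coarse-grained and a fine-grained head."""
    def __init__(self, vocab_size=5000, emb_dim=64, hidden=128, n_coarse=2, n_fine=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden, batch_first=True)  # shared encoder
        self.coarse_head = nn.Linear(hidden, n_coarse)            # e.g. sentence type
        self.fine_head = nn.Linear(hidden, n_fine)                # e.g. finer semantics

    def forward(self, token_ids):
        _, h = self.encoder(self.embed(token_ids))
        h = h.squeeze(0)
        return self.coarse_head(h), self.fine_head(h)

model = MultiGrainedClassifier()
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy batch of 8 token sequences with random coarse and fine labels
tokens = torch.randint(0, 5000, (8, 20))
coarse_y = torch.randint(0, 2, (8,))
fine_y = torch.randint(0, 6, (8,))

# Coarse-to-fine idea in miniature: emphasize the coarse task early so the shared
# encoder learns high-level cues first, then shift the weight to the fine task.
for epoch in range(10):
    w = max(0.0, 1.0 - epoch / 5.0)  # decaying coarse-task weight (made-up schedule)
    coarse_logits, fine_logits = model(tokens)
    loss = w * loss_fn(coarse_logits, coarse_y) + (1.0 - w) * loss_fn(fine_logits, fine_y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```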

Journal ArticleDOI
01 Apr 2020
TL;DR: The results indicate that it is best to present the model in a ‘flattened’ form and in the ‘paper’ format in order to optimally understand a BPMN model, and that the model reader’s business process modeling competency is an important factor in process model comprehension.
Abstract: Many factors influence the creation of business process models which are understandable for a target audience. Understandability of process models becomes more critical when size and complexity of the models increase. Using vertical modularization to decompose such models hierarchically into modules is considered to improve their understandability. To investigate this assumption, two experiments were conducted. The experiments involved 2 large-scale real-life business process models that were modeled using BPMN v2.0 (Business Process Model and Notation) in the form of collaboration diagrams. Each process was modeled in 3 modularity forms: fully-flattened, flattened where activities are clustered using BPMN groups, and modularized using separately viewed BPMN sub-processes. The objective was to investigate if and how different forms of modularity representation (used for vertical modularization) in BPMN collaboration diagrams influence the understandability of process models. In addition to the forms of modularity representation, the presentation medium (paper vs computer) and model reader’s level of business process modeling competency were investigated as factors that potentially influence model comprehension. 60 business practitioners from a large organization and 140 graduate students participated in our experiments. The results indicate that, when these three modularity representations are considered, it is best to present the model in a ‘flattened’ form (with or without the use of groups) and in the ‘paper’ format in order to optimally understand a BPMN model. The results also show that the model reader’s business process modeling competency is an important factor of process model comprehension.