
Showing papers in "Journal of Mathematics in Industry in 2020"


Journal ArticleDOI
TL;DR: In this paper, an extended SEIR (susceptible-exposed-infectious-recovered) model and continuous-time optimal control theory are used to compute the optimal non-pharmaceutical intervention strategy for the case that a vaccine is never found and complete containment (eradication of the epidemic) is impossible.
Abstract: When effective medical treatment and vaccination are not available, non-pharmaceutical interventions such as social distancing, home quarantine and far-reaching shutdown of public life are the only available strategies to prevent the spread of epidemics. Based on an extended SEIR (susceptible-exposed-infectious-recovered) model and continuous-time optimal control theory, we compute the optimal non-pharmaceutical intervention strategy for the case that a vaccine is never found and complete containment (eradication of the epidemic) is impossible. In this case, the optimal control must meet competing requirements: First, the minimization of disease-related deaths, and, second, the establishment of a sufficient degree of natural immunity at the end of the measures, in order to exclude a second wave. Moreover, the socio-economic costs of the intervention shall be kept at a minimum. The numerically computed optimal control strategy is a single-intervention scenario that goes beyond heuristically motivated interventions and simple “flattening of the curve”. Careful analysis of the computed control strategy reveals, however, that the obtained solution is in fact a tightrope walk close to the stability boundary of the system, where socio-economic costs and the risk of a new outbreak must be constantly balanced against one another. The model system is calibrated to reproduce the initial exponential growth phase of the COVID-19 pandemic in Germany.
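
To make the model class concrete, the sketch below integrates a plain SEIR system with SciPy. It is a minimal illustration only: the paper's extensions (disease-related deaths, socio-economic cost terms and the continuous-time optimal control computation) are not reproduced, and all parameter values and initial conditions are assumptions, not the calibrated German values.

```python
# Minimal SEIR forward simulation (illustrative sketch, not the paper's extended model).
import numpy as np
from scipy.integrate import solve_ivp

def seir_rhs(t, y, beta, sigma, gamma):
    S, E, I, R = y
    N = S + E + I + R
    dS = -beta * S * I / N             # new exposures
    dE = beta * S * I / N - sigma * E  # exposed become infectious after 1/sigma days
    dI = sigma * E - gamma * I         # infectious recover after 1/gamma days
    dR = gamma * I
    return [dS, dE, dI, dR]

# Illustrative parameters and initial state (assumed, not the paper's calibration).
beta, sigma, gamma = 0.4, 1 / 5.5, 1 / 7.0
y0 = [83e6 - 100, 50, 50, 0]
t_eval = np.linspace(0, 365, 366)
sol = solve_ivp(seir_rhs, (0, 365), y0, args=(beta, sigma, gamma), t_eval=t_eval, rtol=1e-8)
print("peak number of infectious individuals:", int(sol.y[2].max()))
```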

75 citations


Journal ArticleDOI
TL;DR: It is shown that an improved case detection rate plays a decisive role in reducing the effective reproduction number, and that there is still much room for improving personal protection measures to compensate for the strict social distancing measures.
Abstract: Public health interventions have been implemented to mitigate the spread of coronavirus disease 2019 (COVID-19) in Ontario, Canada; however, the quantification of their effectiveness remains to be done and is important to determine if some of the social distancing measures can be relaxed without resulting in a second wave. We aim to equip local public health decision- and policy-makers with mathematical model-based quantification of implemented public health measures and estimation of the trend of COVID-19 in Ontario to inform future actions in terms of outbreak control and de-escalation of social distancing. Our estimates confirm that (1) social distancing measures have helped mitigate transmission by reducing the daily infection contact rate, but the disease transmission probability per contact remains as high as 0.145 and the case detection rate was so low that the effective reproduction number remained above the threshold for disease control until the closure of non-essential business in the Province; (2) improvement in the case detection rate and the closure of non-essential business resulted in a further reduction of the effective reproduction number to below the threshold. We predict the number of confirmed cases under different control efficacies, including combinations of further reducing contact rates and the transmission probability per contact. We show that an improved case detection rate plays a decisive role in reducing the effective reproduction number, and that there is still much room for improving personal protection measures to compensate for the strict social distancing measures.
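
The structural argument can be followed with a back-of-envelope calculation, writing the effective reproduction number as (effective daily contacts) × (transmission probability per contact) × (infectious period) × (fraction of transmission not removed by case detection and isolation). The sketch below uses p = 0.145 from the abstract; the contact rates, infectious period and detection fractions are illustrative assumptions, and this simplified product form is not the paper's fitted model.

```python
# Back-of-envelope effective reproduction number (illustrative, not the fitted model).
def r_eff(contacts_per_day, p_transmit, infectious_days, detection_fraction):
    # detected cases are assumed to be isolated immediately and stop transmitting
    return contacts_per_day * p_transmit * infectious_days * (1.0 - detection_fraction)

print(r_eff(2.0, 0.145, 7.0, 0.1))   # weak detection: roughly 1.8, above the threshold 1
print(r_eff(1.0, 0.145, 7.0, 0.5))   # distancing plus better detection: roughly 0.5
```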

62 citations


Journal ArticleDOI
TL;DR: The model is consistent with the well-known relevance of quarantine, shows the dramatic role of care homes, and accounts for the increase in the death toll when spatial movements are not constrained.
Abstract: We present an epidemic model capable of describing key features of the Covid-19 pandemic. While capturing several qualitative properties of the spread of the virus, it allows the computation of the basic reproduction number, the number of deaths due to the virus and various other statistics. Numerical integrations are used to illustrate the adherence of the evolutions described by the model to specific well-known real features of the present pandemic. In particular, the model is consistent with the well-known relevance of quarantine, shows the dramatic role of care homes and accounts for the increase in the death toll when spatial movements are not constrained.

38 citations


Journal ArticleDOI
TL;DR: This paper first identifies the equipment composition of a mixed electrical equipment group using a load decision tree algorithm, and then uses Particle Swarm Optimization (PSO) to solve a 0–1 programming model for equipment state recognition.
Abstract: To address non-intrusive load monitoring and decomposition (NILMD) from the two aspects of load identification and load decomposition, this paper first analyzes and identifies the equipment composition of a mixed electrical equipment group using a load decision tree algorithm built on the load characteristics in the database. Then, a 0–1 programming model for equipment status identification is established and solved with Particle Swarm Optimization (PSO), identifying the operating state of each device in the equipment group. Finally, a simulation experiment is carried out on part of the data from Question A of the 6th “Teddy Cup” data mining challenge competition.
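
The state-identification step can be pictured as a small 0-1 matching problem: choose the on/off vector whose summed rated power best explains the aggregate measurement. The paper formulates this as a 0-1 programming model solved with PSO; the sketch below instead uses exhaustive search, which is only feasible for a handful of appliances, and the appliance names, ratings and reading are invented for illustration.

```python
# Toy 0-1 state identification by exhaustive search (the paper uses PSO on a full model).
from itertools import product

ratings = {"fridge": 150.0, "kettle": 2000.0, "tv": 90.0, "washer": 500.0}  # watts (assumed)
measured = 2140.0                                                           # aggregate reading

names = list(ratings)
best = min(product([0, 1], repeat=len(names)),
           key=lambda x: abs(sum(xi * ratings[n] for xi, n in zip(x, names)) - measured))
print({n: state for n, state in zip(names, best)})  # -> fridge and kettle on
```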

25 citations


Journal ArticleDOI
TL;DR: A new mathematical model for intraday electricity trading involving both renewable and conventional generation is presented; it allows market data, e.g. for half-spread and immediate price impact, to be incorporated.
Abstract: As an extension of (Progress in industrial mathematics at ECMI 2018, pp. 469–475, 2019), this paper is concerned with a new mathematical model for intraday electricity trading involving both renewable and conventional generation. The model allows market data, e.g. for the half-spread and immediate price impact, to be incorporated. The optimal trading and generation strategy of an agent is derived as the viscosity solution of a second-order Hamilton–Jacobi–Bellman (HJB) equation for which no closed-form solution can be given. We construct a numerical approximation allowing us to use continuous input data. Numerical results for a portfolio consisting of three conventional units and wind power are provided.

24 citations


Journal ArticleDOI
TL;DR: An extended SEIRD-model is presented to describe the disease dynamics in Germany and an additional parameter to capture the influence of unidentified cases is also included in the model.
Abstract: Since the end of 2019, an outbreak of a new strain of coronavirus, called SARS-CoV-2, has been reported from China and later from other parts of the world. Since January 21, the World Health Organization (WHO) has reported daily data on confirmed cases and deaths from both China and other countries (www.who.int/emergencies/diseases/novel-coronavirus-2019/situation-reports). The Johns Hopkins University (github.com/CSSEGISandData/COVID-19/blob/master/csse_COVID_19_data/csse_COVID_19_time_series/time_series_COVID19_confirmed_global.csv) collects those data from various sources worldwide on a daily basis. For Germany, the Robert-Koch-Institute (RKI) also issues daily reports on the current number of infections and infection-related fatal cases (www.rki.de/DE/Content/InfAZ/N/Neuartiges_Coronavirus/Situationsberichte/Gesamt.html). However, due to delays in the data collection, the data from the RKI always lag behind those reported by Johns Hopkins. In this work we present an extended SEIRD model to describe the disease dynamics in Germany. The parameter values are identified by matching the model output to the officially reported cases. An additional parameter to capture the influence of unidentified cases is also included in the model.
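
The calibration idea of matching model output to officially reported cases can be sketched as a least-squares fit of a transmission parameter. The code below uses a plain SEIRD system and synthetic data, so it only illustrates the procedure; it is not the authors' extended model with the unidentified-case parameter, and all values are assumptions.

```python
# Illustrative SEIRD calibration on synthetic data (not the paper's extended model).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def seird(t, y, beta, sigma=1/5.5, gamma=1/7.0, mu=0.01):
    S, E, I, R, D, C = y                      # C tracks cumulative reported infections
    N = S + E + I + R
    new_inf = sigma * E
    return [-beta*S*I/N, beta*S*I/N - sigma*E, new_inf - (gamma + mu)*I,
            gamma*I, mu*I, new_inf]

def cumulative_cases(beta, t, y0):
    sol = solve_ivp(seird, (t[0], t[-1]), y0, t_eval=t, args=(beta,), rtol=1e-8)
    return sol.y[5]

t = np.arange(0.0, 30.0)
y0 = [83e6, 200.0, 100.0, 0.0, 0.0, 100.0]    # assumed initial state
rng = np.random.default_rng(1)
observed = cumulative_cases(0.55, t, y0) * rng.normal(1.0, 0.03, t.size)  # synthetic "reports"

fit = least_squares(lambda b: cumulative_cases(b[0], t, y0) - observed, x0=[0.3])
print("identified transmission rate:", round(fit.x[0], 3))  # close to the synthetic 0.55
```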

21 citations


Journal ArticleDOI
TL;DR: A novel methodology is developed that integrates social contact patterns derived from empirical data with a disease transmission model, enabling the use of age-stratified incidence data to infer age-specific susceptibility, daily contact mixing patterns in workplace, household, school and community settings, and the transmission acquired in these settings under different physical distancing measures.
Abstract: Social contact mixing plays a critical role in influencing the transmission routes of infectious diseases. Moreover, quantifying social contact mixing patterns and their variations in a rapidly evolving pandemic intervened by changing public health measures is key for retroactive evaluation and proactive assessment of the effectiveness of different age- and setting-specific interventions. Contact mixing patterns have been used to inform COVID-19 pandemic public health decision-making; but a rigorously justified methodology to identify setting-specific contact mixing patterns and their variations in a rapidly developing pandemic, which can be informed by readily available data, is in great demand and has not yet been established. Here we fill in this critical gap by developing and utilizing a novel methodology, integrating social contact patterns derived from empirical data with a disease transmission model, that enables the usage of age-stratified incidence data to infer age-specific susceptibility, daily contact mixing patterns in workplace, household, school and community settings; and transmission acquired in these settings under different physical distancing measures. We demonstrated the utility of this methodology by performing an analysis of the COVID-19 epidemic in Ontario, Canada. We quantified the age- and setting (household, workplace, community, and school)-specific mixing patterns and their evolution during the escalation of public health interventions in Ontario, Canada. We estimated a reduction in the average individual contact rate from 12.27 to 6.58 contacts per day, with an increase in household contacts, following the implementation of control measures. We also estimated increasing trends by age in both the susceptibility to infection by SARS-CoV-2 and the proportion of symptomatic individuals diagnosed. Inferring the age- and setting-specific social contact mixing and key age-stratified epidemiological parameters, in the presence of evolving control measures, is critical to inform decision- and policy-making for the current COVID-19 pandemic.
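
One way to see how setting-specific contact matrices enter such a transmission model is through a next-generation matrix whose spectral radius gives the reproduction number. The sketch below assumes three age groups of equal size; the contact matrix, susceptibility profile, transmission probability and infectious period are illustrative assumptions, and the paper's inference procedure is not reproduced.

```python
# Reproduction number from an age-structured contact matrix (illustrative assumptions).
import numpy as np

contacts = np.array([[8.0, 3.0, 1.0],        # daily contacts between three age groups
                     [3.0, 10.0, 2.0],
                     [1.0, 2.0, 4.0]])
susceptibility = np.array([0.5, 1.0, 1.3])   # relative susceptibility by age (assumed)
p_transmit, infectious_days = 0.03, 7.0      # assumed

# Next-generation matrix for equally sized groups: scale contact rows by susceptibility.
K = susceptibility[:, None] * contacts * p_transmit * infectious_days
r0 = max(abs(np.linalg.eigvals(K)))
print("reproduction number (spectral radius of K):", round(r0, 2))
```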

12 citations


Journal ArticleDOI
TL;DR: In this paper, the authors extend the model used for simulating LITT to account for vaporization using two different approaches and compare the results obtained with the measurements from the original study.
Abstract: Laser-induced thermotherapy (LITT) is a minimally invasive method causing tumor destruction due to heat ablation and coagulative effects. Computer simulations can play an important role to assist physicians with the planning and monitoring of the treatment. Our recent study with ex-vivo porcine livers has shown that the vaporization of the water in the tissue must be taken into account when modeling LITT. We extend the model used for simulating LITT to account for vaporization using two different approaches. Results obtained with these new models are then compared with the measurements from the original study.

11 citations


Journal ArticleDOI
TL;DR: In this article, a new permeability-porosity relation based on percolation theory is derived and compared with the Kozeny-Carman relation, which states that flow through the pores is possible at a certain location as long as the porosity is larger than zero at that location in the aquifer.
Abstract: Water injection in the aquifer induces deformations in the soil. These mechanical deformations give rise to a change in porosity and permeability, which results in non-linearity of the mathematical problem. Assuming that the deformations are very small, the model provided by Biot’s theory of linear poroelasticity is used to determine the local displacement of the skeleton of a porous medium, as well as the fluid flow through the pores. In this continuum scale model, the Kozeny–Carman equation is commonly used to determine the permeability of the porous medium from the porosity. The Kozeny–Carman relation states that flow through the pores is possible at a certain location as long as the porosity is larger than zero at this location in the aquifer. However, from network models it is known that percolation thresholds exist, indicating that the permeability will be equal to zero if the porosity becomes smaller than these thresholds. In this paper, the relationship between permeability and porosity is investigated. A new permeability-porosity relation, based on the percolation theory, is derived and compared with the Kozeny–Carman relation. The strongest feature of the new approach is related to its capability to give a good description of the permeability in case of low porosities. However, with this network-inspired approach small values of the permeability are more likely to occur. Since we show that the solution of Biot’s model converges to the solution of a saddle point problem for small time steps and low permeability, we need stabilisation in the finite element approximation.
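
The qualitative difference between the two relations can be illustrated in a few lines: the Kozeny-Carman permeability vanishes only when the porosity itself vanishes, whereas a percolation-type relation switches off below a threshold porosity. The constants, exponent and threshold used below are illustrative assumptions, not the relation derived in the paper.

```python
# Kozeny-Carman vs. percolation-threshold permeability (illustrative constants).
import numpy as np

def k_kozeny_carman(phi, c=1e-10):
    # positive for any positive porosity
    return c * phi**3 / (1.0 - phi)**2

def k_percolation(phi, phi_c=0.05, tau=2.0, c=1e-10):
    # exactly zero below the percolation threshold phi_c
    return np.where(phi > phi_c, c * (phi - phi_c)**tau, 0.0)

phi = np.array([0.01, 0.05, 0.10, 0.30])
print(k_kozeny_carman(phi))
print(k_percolation(phi))
```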

10 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used PDE-constrained optimization to identify the blood perfusion rate from MR thermometry data obtained during the treatment with laser-induced thermotherapy (LITT).
Abstract: Using PDE-constrained optimization, we introduce a parameter identification approach which can identify the blood perfusion rate from MR thermometry data obtained during treatment with laser-induced thermotherapy (LITT). The blood perfusion rate, i.e., the cooling effect induced by blood vessels, can be identified during the first stage of the treatment. This information can then be used by a simulation to monitor and predict the ongoing treatment. The approach is tested with synthetic measurements with and without artificial noise as input data.

10 citations


Journal ArticleDOI
TL;DR: In this article, a quasi-stationary state model is used to understand the distribution of the heat in a steel plate and the changes in the solid phases of the steel and into liquid phase during the flame cutting process.
Abstract: The goal of this work is to describe in detail a quasi-stationary state model which can be used to understand in depth the distribution of heat in a steel plate and the changes among the solid phases of the steel and into the liquid phase during the flame cutting process. We use a 3D model similar to previous work by Thiebaud (J Mater Process Technol 214(2):304–310, 2014) and expand it to consider phase changes, in particular austenite formation and melting of the material. Experimental data is used to validate the model and study its capabilities. Parameters defining the shape of the volumetric heat source and the power density are calibrated to achieve good agreement with temperature measurements. Similarities and differences with other models from the literature are discussed.

Journal ArticleDOI
TL;DR: In this paper, a space mapping-based approximation of the stochastic control problem by solutions of the deterministic one is proposed, in combination with the receding horizon control technique, which yields a reliable and fast numerical scheme.
Abstract: Control of stochastic interacting particle systems is a non-trivial task due to the high dimensionality of the problem and the lack of fast algorithms. Here, we propose a space mapping-based approximation of the stochastic control problem by solutions of the deterministic one. In combination with the receding horizon control technique this yields a reliable and fast numerical scheme for the closed loop control of stochastic interacting particle systems. As a numerical example we consider the herding of sheep with dogs. The numerical results underline the feasibility of our approach and further show stabilizing behaviour of the closed loop control.

Journal ArticleDOI
TL;DR: Performance measures motivated by risk measures from finance are introduced, leading to a simulation-based optimization framework for the production planning of a stochastic production network model in which the capacities of machines may change randomly.
Abstract: This paper is concerned with a simulation study for a stochastic production network model, where the capacities of machines may change randomly. We introduce performance measures motivated by risk measures from finance, leading to a simulation-based optimization framework for production planning. The same measures are used to investigate the scenario in which capacities are tied to workers who are randomly unavailable. This corresponds to the study of a workforce planning problem in an uncertain environment.
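
The kind of performance measure meant here can be illustrated with a toy Monte Carlo computation of value-at-risk and conditional value-at-risk of the production shortfall of a two-machine line with randomly failing capacities. The network model and the planning optimization of the paper are not reproduced; all figures below are assumptions.

```python
# Toy risk measures (VaR/CVaR) of production shortfall under random capacities.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
cap_m1 = rng.choice([100.0, 60.0], size=n, p=[0.90, 0.10])  # machine 1: nominal vs. degraded
cap_m2 = rng.choice([90.0, 40.0], size=n, p=[0.85, 0.15])   # machine 2
throughput = np.minimum(cap_m1, cap_m2)                     # serial line: bottleneck decides
shortfall = np.maximum(0.0, 85.0 - throughput)              # demand of 85 units (assumed)

alpha = 0.95
var = np.quantile(shortfall, alpha)                         # value-at-risk
cvar = shortfall[shortfall >= var].mean()                   # conditional value-at-risk
print(f"mean shortfall {shortfall.mean():.2f}, VaR {var:.1f}, CVaR {cvar:.1f}")
```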

Journal ArticleDOI
TL;DR: In this paper, a recent asymptotic model for solidification shrinkage-induced macrosegregation in the continuous casting of binary alloys is extended for the purposes of understanding the link between solute segregation and centreline shrinkage porosity, a defect that commonly occurs in the continuously casting of steel.
Abstract: A recent asymptotic model for solidification shrinkage-induced macrosegregation in the continuous casting of binary alloys is extended for the purposes of understanding the link between solute segregation and centreline shrinkage porosity, a defect that commonly occurs in the continuous casting of steel. In particular, the analysis elucidates the relationship between microsegregation, mushy-zone permeability, heat transfer and centreline pressure, yielding an inequality that constitutes a criterion for whether or not centreline porosity will form. The possibilities for developing this approach to take account of gas porosity and the implementation of mechanical soft reduction to reduce macrosegregation and shrinkage porosity are also discussed.

Journal ArticleDOI
TL;DR: A method to post-process raw thermograms based on the computation of topological derivatives which will produce much sharper images than the original thermograms, giving a good approximation of the shape, position and number of defects without the need of an iterative process.
Abstract: This paper deals with active time-harmonic infrared thermography applied to the detection of defects inside thin plates. We propose a method to post-process raw thermograms based on the computation of topological derivatives, which produces much sharper images (namely, images in which the contrast is strongly enhanced) than the original thermograms. The reconstruction algorithm does not need information about the number of defects, nor about their size or position. A collection of numerical experiments illustrates that the algorithm is highly robust against measurement errors in the thermograms, giving a good approximation of the shape, position and number of defects without the need for an iterative process.

Journal ArticleDOI
TL;DR: In this paper, a simple compartmental model is proposed that includes key features of VZV such as latency, reactivation of the virus as zoster, and exogenous boosting of immunity.
Abstract: Vaccines against varicella-zoster virus (VZV) are under introduction in Hungary into the routine vaccination schedule; hence it is important to understand the current transmission dynamics and to estimate the key parameters of the disease. Mathematical models can be greatly useful in advising public health policy decision making by comparing predictions for different scenarios. First we consider a simple compartmental model that includes key features of VZV such as latency and reactivation of the virus as zoster, and exogenous boosting of immunity. After deriving the basic reproduction number $R_{0}$, the model is analysed mathematically and the threshold dynamics is proven: if $R_{0}\leq 1$ then the virus will be eradicated, while if $R_{0}>1$ then an endemic equilibrium exists and the virus uniformly persists in the population. Then we extend the model to include seasonality, and fit it to monthly incidence data from Hungary. It is shown that besides the seasonality, the disease dynamics has intrinsic multi-annual periodicity. We also investigate the sensitivity of the model outputs to the system parameters and the underreporting ratio, and provide estimates for $R_{0}$.

Journal ArticleDOI
TL;DR: This work developed a Python demonstrator for pricing total valuation adjustment (XVA) based on the stochastic grid bundling method (SGBM) and explores the potential of developing a simple yet highly efficient code with SGBM by incorporating CUDA Python into the program.
Abstract: In this work, we developed a Python demonstrator for pricing total valuation adjustment (XVA) based on the stochastic grid bundling method (SGBM). XVA is an advanced risk management concept which became relevant after the recent financial crisis. This work is a follow-up to Chau and Oosterlee (Int J Comput Math 96(11):2272–2301, 2019), in which we extended SGBM to the numerical solution of backward stochastic differential equations (BSDEs). The motivation for this work is twofold. On the application side, by focusing on a particular financial application of BSDEs, we can show the potential of using SGBM on a real-world risk management problem. On the implementation side, we explore the potential of developing a simple yet highly efficient code with SGBM by incorporating CUDA Python into our program.

Journal ArticleDOI
TL;DR: In this article, a case study regarding inventory models for acquiring liquefied petroleum gas (LPG) cylinders is presented, and three new inventory models, which account for the return of LPG cylinders, are proposed.
Abstract: This paper addresses a case study regarding inventory models for acquiring liquefied petroleum gas (LPG) cylinders. This is an industrial challenge that was proposed at a European Study Group with Industry by a Portuguese energy company, for which the LPG cylinder is the main asset of its LPG business. Due to the importance of this asset, an acquisition plan must be defined in order to determine the number of LPG cylinders to acquire, and when to acquire them, in order to optimize the investment. As cylinders are returned and refilled, the reverse logistic flows of these assets must be considered. As the classical inventory models are not suitable for this case study, three new inventory models, which account for the return of LPG cylinders, are proposed in this work. The first proposed model considers deterministic constant demand and continuous returns of LPG cylinders, with discrete replenishment from the supplier. The second model is similar, but for the case when the returned cylinders cover the demand. A third model is also proposed, considering that both the demand and the returns are stochastic in nature and the replenishment from the supplier is discrete. The three models address different scenarios that the company is either currently facing or expects to occur in the near future.
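
The effect of returns on acquisition planning can be illustrated with a simple back-of-envelope computation: when a fixed fraction of cylinders comes back and is refilled, only the net demand has to be covered by new acquisitions, and an EOQ-type lot size can be computed for that net rate. This is an illustration only, with invented figures, and does not correspond to any of the three models proposed in the paper.

```python
# Back-of-envelope acquisition sizing with returns (illustrative, not the paper's models).
import math

demand_per_year = 120_000     # cylinders demanded per year (assumed)
return_fraction = 0.85        # share of cylinders returned and refilled (assumed)
order_cost = 5_000.0          # fixed cost per acquisition from the supplier (assumed)
holding_cost = 2.0            # holding cost per cylinder per year (assumed)

net_demand = demand_per_year * (1.0 - return_fraction)
q_opt = math.sqrt(2.0 * order_cost * net_demand / holding_cost)
print(f"net demand {net_demand:.0f} cylinders/year, acquire about {q_opt:.0f} per order")
```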

Journal ArticleDOI
TL;DR: An efficient and reliable method for stochastic yield estimation using Gaussian process regression, which gives not only an approximation of the function value, but also an error indicator that can be used to decide whether a sample point should be reevaluated or not.
Abstract: In this paper an efficient and reliable method for stochastic yield estimation is presented. Since one main challenge of uncertainty quantification is the computational feasibility, we propose a hybrid approach where most of the Monte Carlo sample points are evaluated with a surrogate model, and only a few sample points are reevaluated with the original high fidelity model. Gaussian process regression is a non-intrusive method which is used to build the surrogate model. Without many prerequisites, this gives us not only an approximation of the function value, but also an error indicator that we can use to decide whether a sample point should be reevaluated or not. For two benchmark problems, a dielectric waveguide and a lowpass filter, the proposed methods outperform classic approaches.
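
The hybrid idea can be sketched with scikit-learn: evaluate the Monte Carlo samples with a Gaussian-process surrogate and re-evaluate with the expensive model only where the GP cannot classify the sample safely against the performance threshold. The toy high-fidelity function, the threshold and the two-sigma rule below are assumptions, not the paper's waveguide or filter benchmarks.

```python
# Hybrid GP-surrogate yield estimation (illustrative sketch).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def high_fidelity(x):                       # stand-in for an expensive field solver
    return np.sin(3.0 * x).ravel() + 0.1 * x.ravel() ** 2

rng = np.random.default_rng(0)
x_train = rng.uniform(-2.0, 2.0, 20).reshape(-1, 1)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(x_train, high_fidelity(x_train))

x_mc = rng.normal(0.0, 1.0, 5000).reshape(-1, 1)      # uncertain design parameter
mean, std = gp.predict(x_mc, return_std=True)
threshold = 0.5                                       # spec: performance must stay below 0.5
unsure = np.abs(mean - threshold) < 2.0 * std         # GP cannot classify these safely
values = mean.copy()
values[unsure] = high_fidelity(x_mc[unsure])          # re-evaluate only the unsure samples
print("yield estimate:", np.mean(values < threshold),
      "| re-evaluated:", int(unsure.sum()), "of", len(x_mc))
```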

Journal ArticleDOI
TL;DR: This work presents an integrated framework to find optimal process parameters for a laser-based material accumulation process (thermal upsetting) using a combination of meta-heuristic optimization models and finite element simulations, and introduces a new coupled numerical 3d finite element method.
Abstract: Common goals of modern production processes are precision and efficiency. Typically, they are conflicting and cannot be optimized at the same time. Multi-objective optimization methods are able to compute a set of good parameters, from which a decision maker can make a choice for practical situations. For complex processes, the use of physical experiments and/or extensive process simulations can be too costly or even unfeasible, so the use of surrogate models based on few simulations is a good alternative. In this work, we present an integrated framework to find optimal process parameters for a laser-based material accumulation process (thermal upsetting) using a combination of meta-heuristic optimization models and finite element simulations. In order to effectively simulate the coupled system of heat equation with solid-liquid phase transitions and melt flow with capillary free surface in three space dimensions for a wide range of process parameters, we introduce a new coupled numerical 3d finite element method. We use a multi-objective optimization method based on surrogate models. Thus, with only a few direct simulations necessary, we are able to select Pareto sets of process parameters which can be used to optimize three or six different performance measures.

Journal ArticleDOI
TL;DR: A mechanism is proposed that explains the approximately linear growth of the Covid-19 world total cases as well as the slow linear decrease of the daily new cases observed, on average, in the USA and Italy.
Abstract: We propose a mechanism explaining the approximately linear growth of the Covid-19 world total cases as well as the slow linear decrease of the daily new cases (and daily deaths) observed, on average, in the USA and Italy. In our explanation, we regard a given population (the whole world or a single nation) as composed of many sub-clusters which, after lockdown, evolve essentially independently. The interaction is modeled by the fact that the outbreak time of the epidemic in a sub-cluster is a random variable with a probability density slowly varying in time. The explanation is independent of the law according to which the epidemic evolves in a single sub-cluster.
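
The mechanism can be reproduced qualitatively in a few lines: summing saturating sub-cluster curves whose outbreak times are drawn from a slowly varying (here uniform) density yields an approximately constant number of daily new cases, i.e. linear growth of the total. The logistic curve shape and all parameters below are assumptions; as the abstract notes, the argument does not depend on the law within a single sub-cluster.

```python
# Linear total growth from independently evolving sub-clusters (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 300.0, 301)

def cluster_curve(t, t0, size=1000.0, rate=0.2):
    # any saturating curve works; a logistic is used here for convenience
    return size / (1.0 + np.exp(-rate * (t - t0)))

outbreak_times = rng.uniform(0.0, 250.0, size=500)   # slowly varying outbreak-time density
total = sum(cluster_curve(t, t0) for t0 in outbreak_times)
daily = np.diff(total)
print("daily new cases at day 50 and day 200:", round(daily[50], 1), round(daily[200], 1))
```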

Journal ArticleDOI
TL;DR: The multi-constraint, dual-target aircraft route planning model can plan the flight path of the aircraft intuitively and in a timely manner, which confirms the effectiveness of the model.
Abstract: As a core technology in the field of aircraft, route planning has attracted much attention. However, due to the complexity of the structure and the performance constraints of the aircraft, existing route planning algorithms do not generalize well and therefore cannot be used in complex environments. In this paper, a multi-constraint, dual-target aircraft route planning model was established, accounting for the real-time requirements of the flight, the dynamic changes of the flight environment over time, the accuracy requirements on positioning errors within the safety area, and the minimum-turning-radius constraints. Based on directed-graph and dynamic-programming ideas, model simulation and validation were carried out with the data of Question F in the 16th Graduate Mathematical Modeling Contest. The results showed that the optimal path length obtained for data set 1 was 124.61 km with 11 corrections and a solution time of 2.3768 seconds, while the optimal path length obtained for data set 2 was 110.00 km with 12 corrections and a solution time of 0.0168 seconds. The multi-constraint, dual-target aircraft route planning model can plan the flight path of the aircraft intuitively and in a timely manner, which confirms the effectiveness of the model.

Journal ArticleDOI
TL;DR: A new nonmonotone adaptive trust region line search method for solving unconstrained optimization problems is proposed, together with a modified trust region ratio that achieves more reasonable consistency between the accurate model and the approximate model.
Abstract: This paper proposes a new nonmonotone adaptive trust region line search method for solving unconstrained optimization problems and presents a modified trust region ratio, which achieves more reasonable consistency between the accurate model and the approximate model. The approximation of the Hessian matrix is updated by the modified BFGS formula. The trust region radius adopts a new adaptive strategy to avoid additional computational costs at each iteration. The global convergence and superlinear convergence of the method are preserved under suitable conditions. Finally, the numerical results show that the proposed method is very efficient.

Journal ArticleDOI
TL;DR: A policy-based Reinforcement Learning ansatz using neural networks is presented in which the neural network is replaced by an ODE, based on a recently discussed interpretation of neural networks, and the resulting infinite-dimensional optimization problem is transformed into an optimization problem similar to well-known optimal control problems.
Abstract: In this contribution, we start with a policy-based Reinforcement Learning ansatz using neural networks. The underlying Markov Decision Process consists of a transition probability representing the dynamical system and a policy realized by a neural network mapping the current state to the parameters of a distribution, from which the next control can be sampled. In this setting, the neural network is replaced by an ODE, based on a recently discussed interpretation of neural networks. The resulting infinite-dimensional optimization problem is transformed into an optimization problem similar to the well-known optimal control problems. Afterwards, the necessary optimality conditions are established, and from these a new numerical algorithm is derived. The operating principle is shown with two examples. It is first applied to a simple example, where a moving point is steered through an obstacle course to a desired end position in a 2D plane. The second example shows the applicability to more complex problems: there, the aim is to control the fingertip of a human arm model with five degrees of freedom and 29 Hill-type muscle models to a desired end position.

Journal ArticleDOI
TL;DR: Wieland et al. as discussed by the authors developed a one-two-dimensional fiber model and presented a problem-tailored numerical solution strategy to solve the problem of dry spinning of multiple fibers simultaneously.
Abstract: The dry spinning of fibers can be described by three-dimensional multi-phase flow models that contain key effects like solvent evaporation and fiber-air interaction. Since the direct numerical simulation of the three-dimensional models is in general not possible, dimensionally reduced models are deduced. We recently developed a one-two-dimensional fiber model and presented a problem-tailored numerical solution strategy in Wieland et al. (J Comput Phys 384:326–348, 2019). However, in view of industrial setups with multiple fibers spun simultaneously, the numerical schemes must be accelerated to achieve feasible simulation times. The bottleneck is the computation of the radial concentration and temperature profiles as well as their cross-sectionally averaged values. In this paper we address this issue and develop efficient numerical algorithms.

Journal ArticleDOI
TL;DR: The dynamic programming idea was applied to carry out flight path planning in the flight space, and a double-objective model that jointly optimizes path length and the number of corrections is established.
Abstract: Positioning errors of an aircraft accumulate during flight, and beyond a certain extent this accumulation may lead to mission failure. A track correction method based on directed graph search is proposed, and the dynamic programming idea is applied to solve the path optimization problem. First, the correction points in the flight space are preprocessed, and the planning over these correction points is transformed into a graph theory problem. This effectively addresses the problems that traditional methods do not adapt well to dynamic changes in the flight space and that their computational complexity is too high. Dynamic programming is then used to carry out flight path planning in the flight space, and a double-objective model that jointly optimizes path length and the number of corrections is established. Finally, the directed-graph and dynamic-programming ideas are used to solve the problem, and a visual analysis of the aircraft track is realized. The simulation results show that the established double-objective model can compute an optimal track with relatively few corrections and a short length.
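
The graph-theoretic core can be sketched as a lexicographic shortest-path search over a directed graph of correction points: each edge carries a length, each intermediate node costs one correction, and the two objectives are ranked with fewest corrections first and shortest length second. The toy graph below is invented for illustration and is unrelated to the contest data sets.

```python
# Double-objective track search over a toy directed graph of correction points.
import heapq

# node -> list of (successor, edge length in km); invented example data
graph = {
    "A":  [("P1", 30.0), ("P2", 45.0)],
    "P1": [("P3", 40.0), ("B", 95.0)],
    "P2": [("P3", 25.0), ("B", 70.0)],
    "P3": [("B", 35.0)],
    "B":  [],
}

def best_track(start, goal):
    # states are (corrections, length, node, path); the heap orders them lexicographically
    queue = [(0, 0.0, start, [start])]
    settled = set()
    while queue:
        corrections, length, node, path = heapq.heappop(queue)
        if node == goal:
            return corrections, length, path
        if node in settled:
            continue
        settled.add(node)
        for nxt, dist in graph[node]:
            extra = 0 if nxt == goal else 1   # every intermediate node is one correction
            heapq.heappush(queue, (corrections + extra, length + dist, nxt, path + [nxt]))
    return None

print(best_track("A", "B"))   # -> (1, 115.0, ['A', 'P2', 'B'])
```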

Journal ArticleDOI
TL;DR: From a mathematical point of view, the key idea is to use the concept of stress majorization to minimize a stress function over the positions of the nodes in the graph.
Abstract: The visualization of conveyor systems in the sense of a connected graph is a challenging problem. Starting from communication data provided by the IT system, graph drawing techniques are applied to generate an appealing layout of the conveyor system. From a mathematical point of view, the key idea is to use the concept of stress majorization to minimize a stress function over the positions of the nodes in the graph. In contrast to the existing literature, we have to take care of special features inspired by the real-world problem.
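
Stress majorization is available off the shelf, e.g. through scikit-learn's SMACOF-based metric MDS applied to graph distances. The sketch below lays out a small invented conveyor-style graph from its shortest-path distances; the special real-world constraints mentioned above are not modeled.

```python
# Graph layout by stress majorization (SMACOF via scikit-learn MDS) on a toy conveyor graph.
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.manifold import MDS

nodes = ["infeed", "scanner", "sorter", "chute1", "chute2", "outfeed"]
edges = [(0, 1), (1, 2), (2, 3), (2, 4), (3, 5)]          # invented conveyor topology

adjacency = np.zeros((len(nodes), len(nodes)))
for a, b in edges:
    adjacency[a, b] = adjacency[b, a] = 1.0

distances = shortest_path(adjacency, directed=False, unweighted=True)
layout = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distances)
for name, (x, y) in zip(nodes, layout):
    print(f"{name:8s} ({x:6.2f}, {y:6.2f})")
```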