
Showing papers on "Process modeling published in 1989"


Patent
28 Dec 1989
TL;DR: The Connected Development Process of Four Dimensional Cognitive Modeling (CDPM) system as discussed by the authors is a microprocessor-manipulated program which extracts the data inherent in the cognitive process leading to the spoken or written word and converts that data into business models capable of defining the interrelationship and functions of a business.
Abstract: A microprocessor-manipulated program which extracts the data inherent in the cognitive process leading to the spoken or written word and converts that data into business models capable of defining the interrelationship and functions of a business. The program models the business and the data thus generated is used to produce application software program code capable of controlling and/or performing all functions of the business. The system springs from The Connected Development Process of Four Dimensional Cognitive Modeling using the four basic linguistic entities of PROCESS and its attendant adjuncts of DATA, CONTROL and SUPPORT.

332 citations


Journal ArticleDOI
01 Oct 1989
TL;DR: In this article, Artificial Neural Networks (ANNs) have been used to model weld-bead geometry in terms of the equipment parameters selected to produce the weld, and the performance of neural networks for modeling is evaluated using actual welding data.
Abstract: Artificial neural networks have been studied to determine their applicability to modeling and control of physical processes. Some basic concepts relating to neural networks and how they can be used to model weld-bead geometry in terms of the equipment parameters selected to produce the weld are explained. Approaches to utilizing neural networks in process control are discussed. The need for modeling transient as well as static characteristics of physical systems for closed-loop control is pointed out, and an approach to achieving this is presented. The performance of neural networks for modeling is evaluated using actual welding data. It is concluded that the accuracy of neural network modeling is fully comparable with the accuracy achieved by more traditional modeling schemes.
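As an illustration of the modeling task described (equipment parameters in, bead geometry out), a minimal single-hidden-layer network can be sketched as follows; the welding data, network size and parameter names here are invented stand-ins, not the paper's setup:

```python
import numpy as np

# Toy stand-in for welding data: [current (A), voltage (V), travel speed (mm/s)]
# -> [bead width (mm), penetration (mm)].  Values are synthetic, for illustration only.
rng = np.random.default_rng(0)
X = rng.uniform([150, 20, 3], [300, 35, 10], size=(200, 3))
Y = np.column_stack([0.02 * X[:, 0] + 0.1 * X[:, 1] - 0.3 * X[:, 2],
                     0.01 * X[:, 0] - 0.05 * X[:, 2]])

# Normalise inputs and outputs so a plain gradient-descent MLP trains stably.
Xn = (X - X.mean(0)) / X.std(0)
Yn = (Y - Y.mean(0)) / Y.std(0)

# One hidden layer with tanh units, linear output layer, trained by backpropagation.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)
lr = 0.05
for epoch in range(2000):
    H = np.tanh(Xn @ W1 + b1)          # hidden activations
    P = H @ W2 + b2                    # predicted (normalised) bead geometry
    E = P - Yn                         # prediction error
    gW2 = H.T @ E / len(Xn); gb2 = E.mean(0)
    dH = (E @ W2.T) * (1 - H ** 2)
    gW1 = Xn.T @ dH / len(Xn); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final training MSE:", float((E ** 2).mean()))
```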

180 citations


Proceedings ArticleDOI
15 May 1989
TL;DR: The principles of entity process models are outlined and ways in which they can help to address some of the problems with more conventional approaches to modeling software processes are suggested.
Abstract: A defined software process is needed to provide organizations with a consistent framework for performing their work and improving the way they do it. An overall framework for modeling simplifies the task of producing process models, permits them to be tailored to individual needs, and facilitates process evolution. This paper outlines the principles of entity process models and suggests ways in which they can help to address some of the problems with more conventional approaches to modeling software processes.

159 citations


Proceedings ArticleDOI
15 May 1989
TL;DR: The generic model supports the representation of design artifacts, steps and heuristics and is applied to model episodes from a pedagogical design process: the design of a lift system controller.
Abstract: A paradigm for representing process information is described. It consists of a simple generic model and a specialization mechanism for customizing the model for any design method. The generic model supports the representation of design artifacts, steps and heuristics. A specialization of the generic model for JSD is illustrated in detail. The JSD model is then applied to model episodes from a pedagogical design process: the design of a lift system controller.

104 citations


Proceedings ArticleDOI
03 Jan 1989
TL;DR: A view of the primary objectives of software process modeling, which formed the basis of the approach used, is set forth; the usefulness of the model is evaluated, and general lessons are drawn from the modeling effort.
Abstract: Experiences in applying a specific modeling approach and technology to a portion of a software support process used by the US Air Force are related. The modeling approach is discussed in the context of examples drawn from the model developed. A view of the primary objectives of software process modeling, which formed the basis of the approach used, is set forth. The usefulness of the model is evaluated, and general lessons are drawn from the modeling effort.

80 citations


Book
01 Oct 1989
TL;DR: 'Computational Methods for Process Simulation' develops the methods needed for the simulation of real processes to be found in the process industries, and stresses the engineering fundamentals used in developing process models.
Abstract: From the Publisher: In the chemical industry, large, realistic non-linear problems are routinely solved with the aid of computer simulation. This has a number of benefits, including easy assessment of the economic desirability of a project, convenient investigation of the effects of changes to system variables, and finally the introduction of mathematical rigour into the design process and inherent assumptions that may not have been there before. 'Computational Methods for Process Simulation' develops the methods needed for the simulation of real processes to be found in the process industries. It also stresses the engineering fundamentals used in developing process models. Steady-state and dynamic systems are considered, for both spatially lumped and spatially distributed problems. It develops analytical and numerical computational techniques for algebraic, ordinary and partial differential equations, and makes use of computer software routines that are widely available. Dedicated software examples are available via the Internet. Contents: Introduction; Development of Mass, Energy and Momentum Balances; Steady-State Lumped Systems; Unsteady-State Lumped Systems; Reaction-Kinetic Systems; Vapour-Liquid Equilibrium Operations; Microscopic Balances; Solution of Split Boundary-Value Problems; Solution of Partial Differential Equations; Nomenclature; Appendices (Analytical Solutions to Ordinary Differential Equations; IMSL Routines); Index.
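The "unsteady-state lumped systems" material is typified by a single dynamic balance integrated in time; a minimal sketch of that kind of calculation (a stirred-tank energy balance with explicit Euler and made-up parameter values, not an example taken from the book) is:

```python
# Unsteady-state lumped energy balance for a stirred tank:
#   V * rho * cp * dT/dt = q * rho * cp * (T_in - T) + Q_heater
# All numbers below are illustrative, not taken from the book.
V, rho, cp = 2.0, 1000.0, 4180.0      # m^3, kg/m^3, J/(kg K)
q, T_in, Q = 0.01, 300.0, 50_000.0    # m^3/s, K, W

def dTdt(T):
    return (q * rho * cp * (T_in - T) + Q) / (V * rho * cp)

T, dt = 290.0, 5.0                    # initial temperature (K), time step (s)
for step in range(int(3600 / dt)):    # one hour of simulated time
    T += dt * dTdt(T)

# Analytical steady state for comparison: T_in + Q / (q * rho * cp)
print(f"T after 1 h: {T:.2f} K, steady state: {T_in + Q/(q*rho*cp):.2f} K")
```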

60 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a review of the current state of the art in thermal plasma processing, with an emphasis on process control and optimization.
Abstract: Thermal plasma processing of materials is a rapidly growing area of research. The commercialization of these processes, however, has been limited by the lack of fundamental understanding of how the various processes work. Research has historically focused on developing models of fluid flow and heat transfer to particles injected into either DC arc or RF plasma jets. These models in the past have simplified boundary conditions to meet computational limitations. Recent advances in models have now been made, allowing evaluations of more of the plasma process variables. Supersonic flow modeling in a DC jet and modeling of the effects of particle loading (particulate feed rate) have been accomplished and are reviewed here. Materials processing using thermal plasmas has been separated into the categories of synthesis, melting, and deposition, and is discussed in view of the processing effects on the resultant material structures. Process modeling leading to process understanding is reviewed with an emphasis on process control and optimization. Commercialization of plasma processes requires controls and process transducers which result from experimentation and process models. Approaches to develop process controls from the current technical base are presented.
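In their simplest lumped form, the particle heat-transfer models referred to above reduce to one energy balance on an injected particle; the rough sketch below uses lumped-capacitance heating of a sphere with invented property values and ignores radiation, melting and the plasma-specific corrections the reviewed models actually include:

```python
import math

# Lumped-capacitance heating of a spherical particle injected into a hot plasma jet:
#   m * cp * dT/dt = h * A * (T_gas - T)
# Property values below are placeholders, not from the reviewed models.
d = 50e-6                      # particle diameter (m)
rho, cp = 4000.0, 700.0        # particle density (kg/m^3) and heat capacity (J/kg K)
h = 5000.0                     # convective heat-transfer coefficient (W/m^2 K)
T_gas, T = 10000.0, 300.0      # plasma gas temperature and initial particle temperature (K)

A = math.pi * d ** 2           # surface area
m = rho * math.pi * d ** 3 / 6 # particle mass

dt, t = 1e-6, 0.0
while T < 2000.0:              # heat until an (arbitrary) target temperature
    T += dt * h * A * (T_gas - T) / (m * cp)
    t += dt
print(f"time to reach 2000 K: {t * 1e3:.3f} ms")
```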

39 citations


DOI
01 Jan 1989
TL;DR: This work has defined a kernel, called MARVEL, for such an architecture and implemented several successive versions of the kernel and several small environments using the kernel.
Abstract: We have been working for several years on rule-based process modeling and the implementation of such models as part of the foundation for software development environments. We have defined a kernel, called MARVEL, for such an architecture and implemented several successive versions of the kernel and several small environments using the kernel. We have evaluated our results to date, and discovered several significant flaws and delineated several important open problems. Although the details are specific to rule-based process modeling, we believe that our insights will be valuable to other researchers and developers contemplating process modeling mechanisms.
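MARVEL's rules tie activities to conditions on objects in the environment's database; the abstract does not reproduce the rule language, so the following is only a generic forward-chaining sketch of how such a rule-based process model can fire activities when their preconditions hold (the rule names, attributes and semantics here are invented, not MARVEL's):

```python
# A generic rule-based process-model sketch (not MARVEL's actual syntax or semantics):
# each rule has a precondition on object attributes, an activity, and an effect.
objects = {"mod.c": {"compiled": False, "tested": False}}

rules = [
    {"name": "compile",
     "pre":  lambda o: not o["compiled"],
     "act":  lambda n: print(f"compiling {n}"),
     "post": {"compiled": True}},
    {"name": "test",
     "pre":  lambda o: o["compiled"] and not o["tested"],
     "act":  lambda n: print(f"testing {n}"),
     "post": {"tested": True}},
]

# Forward chaining: keep firing applicable rules until no rule applies.
changed = True
while changed:
    changed = False
    for name, state in objects.items():
        for rule in rules:
            if rule["pre"](state):
                rule["act"](name)
                state.update(rule["post"])
                changed = True
```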

33 citations


Proceedings ArticleDOI
Barry Boehm, Frank C. Belz
10 Oct 1989
TL;DR: A good many of the problems on software projects arise from mismatches between the process model used by the project and the project's real-world process drivers: budget, schedule, available commercial off-the-shelf (COTS) software, customer standards, user mission or technology uncertainties, etc.
Abstract: A good many of the problems on software projects arise from mismatches between the process model used by the project and the project's real-world process drivers: budget, schedule, available commercial off-the-shelf (COTS) software, customer standards, user mission or technology uncertainties, etc. The primary process modeling approach to date for avoiding these mismatches has been to try to develop the perfect process model: one which will work well for any combination of process drivers.

29 citations


Journal ArticleDOI
TL;DR: A computer program, SUPREM 3.5, has been developed to simulate processes used to manufacture ion-implanted GaAs integrated circuits, and the present version of the simulator models each process individually, as well as integrating the various processes and models together.
Abstract: A computer program, SUPREM 3.5, has been developed to simulate processes used to manufacture ion-implanted GaAs integrated circuits. The processes modelled in the present version of the simulator include ion implantation, diffusion, and activation. The simulator includes a routine to calculate the threshold voltage of a MESFET device based on the simulated processing results and on substrate properties. The models used for each dopant and process are based on physical mechanisms when possible, but empirical parameter fitting is sometimes used. Pearson IV distributions are used for the implantation modeling, concentration-independent diffusivities are used for n-type dopants, concentration-dependent diffusivities are used for p-type dopants, and concentration-dependent activation efficiencies are used for all dopants. The simulator models each process individually, as well as integrating the various processes and models together.
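The as-implanted profiles in the simulator are Pearson IV distributions; a much simpler stand-in often used for illustration is a Gaussian profile defined by projected range and straggle. The sketch below uses that Gaussian simplification with invented moments, so it only indicates the shape of the calculation, not SUPREM 3.5's actual models:

```python
import math

def gaussian_implant_profile(x, dose, Rp, dRp):
    """Implanted concentration (cm^-3) at depth x (cm) for a Gaussian approximation
    of the as-implanted profile; SUPREM 3.5 itself uses Pearson IV distributions."""
    return dose / (math.sqrt(2 * math.pi) * dRp) * math.exp(-(x - Rp) ** 2 / (2 * dRp ** 2))

# Illustrative (not fitted) dose and moments for a shallow implant into GaAs.
dose = 5e12                   # cm^-2
Rp, dRp = 0.12e-4, 0.05e-4    # projected range and straggle (cm)

for depth_um in (0.0, 0.05, 0.12, 0.2, 0.3):
    c = gaussian_implant_profile(depth_um * 1e-4, dose, Rp, dRp)
    print(f"depth {depth_um:.2f} um: {c:.3e} cm^-3")
```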

28 citations


Journal ArticleDOI
TL;DR: In this article, a new approach to compensate for process/model mismatch errors, based upon the Generic Model Control (GMC) algorithm, is presented. The approach is applicable to both linear and nonlinear model-based algorithms.
Abstract: Process model-based control algorithms, which employ a process model directly in the controller, have been shown to produce good control performance and robust behaviour despite process modelling errors. However, when the process/model mismatch is large, the closed-loop response, while still better than responses obtained with conventional controllers, will be degraded. This paper presents a new approach to compensate for process/model mismatch errors, based upon the Generic Model Control (GMC) algorithm. The approach is applicable to both linear and nonlinear model-based algorithms. Simulation results are presented to illustrate the efficiency of the approach.
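Generic Model Control, as usually formulated, specifies a desired output trajectory dy/dt = K1(ysp - y) + K2*integral(ysp - y)dt and solves the embedded process model for the input that realises it. The sketch below applies that idea to a first-order process whose gain is deliberately mis-specified in the controller's model, to show the kind of process/model mismatch the paper addresses; the paper's compensation scheme itself is not reproduced, and all numbers are invented:

```python
# "Plant": dy/dt = (-y + Kp * u) / tau, with the true gain unknown to the controller.
Kp_true, Kp_model, tau = 2.0, 1.5, 10.0     # deliberate gain mismatch
K1, K2 = 0.5, 0.02                          # GMC tuning constants
y_sp = 1.0

y, integral, dt = 0.0, 0.0, 0.1
for k in range(int(200 / dt)):
    # GMC: desired dy/dt from the reference trajectory...
    integral += (y_sp - y) * dt
    dydt_des = K1 * (y_sp - y) + K2 * integral
    # ...then invert the (mismatched) model dy/dt = (-y + Kp_model*u)/tau for u.
    u = (tau * dydt_des + y) / Kp_model
    # The plant responds with the true gain.
    y += dt * (-y + Kp_true * u) / tau

print(f"output after 200 s: {y:.3f} (setpoint {y_sp})")
```

Despite the wrong gain in the controller's model, the integral term in the GMC trajectory still drives the offset to zero, which is the baseline robustness the abstract refers to before its mismatch-compensation extension.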

Proceedings ArticleDOI
04 Dec 1989
TL;DR: A process model to portray the dynamics of Information Systems Development is presented, along with suggestions for using the model to identify the relevant scenario and thereby improve the management of the ISD process.
Abstract: A process model to portray the dynamics of Information Systems Development (ISD) is presented. The model, while primarily rooted in earlier process models, also incorporates two key contextual factors - the perceived threat to users and the relative power of the users and the systems group - from past factor studies. The model is then used to generate four scenarios across the ISD process. These are co-operative, user-dominated, MIS-dominated, and conflict. The scenarios are illustrated with summaries from recent case studies in ISD. This indicates the effectiveness of the model. The paper concludes with suggestions for using the model to identify the relevant scenario and thereby improve the management of the ISD process.

Journal ArticleDOI
TL;DR: An efficient method is presented for the design of controllers for integrating and runaway processes based on model matching in the frequency domain to achieve low-order, easily implementable cascade controllers in a unity-output-feedback configuration.
Abstract: This paper presents an efficient method for the design of controllers for integrating and runaway processes. The method is based on model matching in the frequency domain. The presence of open-loop instability as well as pure time delay in the process models makes the design task challenging for these classes of processes. The goal is to achieve low-order, easily implementable cascade controllers in a unity-output-feedback configuration. It is shown that the central problem is in the selection of appropriate reference models. Several key constraints are developed which relate a given process model to a class of reference models for achieving total stability. Typical design examples are presented to clearly illustrate the various mathematical techniques.
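For a unity-output-feedback loop, the model-matching algebra behind such a design can be stated compactly; the relations below are only the generic ones, with an integrating-plus-delay process quoted as an illustration, and do not reproduce the paper's reference-model constraints or design examples:

```latex
% Closed loop with process G_p(s) and cascade controller G_c(s) in unity output feedback:
%   T(s) = G_c(s) G_p(s) / (1 + G_c(s) G_p(s)).
% Matching T(s) to a chosen reference model M(s) gives the controller
\[
  G_c(s) \;=\; \frac{M(s)}{G_p(s)\,\bigl(1 - M(s)\bigr)} .
\]
% For an integrating process with delay, e.g. G_p(s) = K e^{-\theta s}/s, the reference
% model M(s) must retain the delay and be chosen so that no unstable or non-causal
% cancellations occur between G_c and G_p; the paper's key constraints govern exactly
% this selection of admissible reference models.
```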

Proceedings ArticleDOI
10 Oct 1989
TL;DR: This paper is intended to convey the flavor of the approach to software process modeling through the presentation of an example model fragment to illustrate some of the key aspects of the modeling approach.
Abstract: This paper is intended to convey the flavor of our approach to software process modeling through the presentation of an example model fragment. Unfortunately, in the very limited space available for this paper, it is only possible to illustrate some of the key aspects of our modeling approach. The interested reader is referred to [4,6,7] for more comprehensive discussions and examples of our models.


01 Oct 1989
TL;DR: In this paper, a 2-stage D-Optimal design with 24 experiments was used to refine and calibrate physically-derived LPCVD models for the specific furnace, and statistical models were built for two equipment responses, namely the polysilicon deposition rate and the built-in wafer stress.
Abstract: The systematic methodology presented here aims at building and calibrating equipment-specific process models. These models describe the nominal equipment response and also the inherent process variations. D-Optimal experiment design, combined with statistical modeling, is employed for the characterization and representation of the equipment and its associated processes. The methodology has been successfully applied for modeling a furnace used for low pressure chemical vapor deposition (LPCVD) of undoped polysilicon. A 2-stage D-Optimal design with 24 experiments was used to refine and calibrate physically-derived LPCVD models for the specific furnace. Statistical models have been built for two equipment responses, namely the polysilicon deposition rate and the built-in wafer stress. These models agree well with the experimental data, and account correctly for the observed variations.
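D-optimal design chooses experiment settings that maximise the determinant of the information matrix X'X for the assumed model form. The sketch below does this with a simple greedy augmentation over a candidate grid for a two-factor quadratic model; the factors, model form and candidate set are illustrative, not the 24-run LPCVD design reported in the paper:

```python
import itertools
import numpy as np

def model_row(t, p):
    """Quadratic response-surface terms in two coded factors (illustrative model form)."""
    return np.array([1.0, t, p, t * p, t * t, p * p])

# Candidate settings: a 5 x 5 grid of coded factor levels in [-1, 1].
levels = np.linspace(-1, 1, 5)
candidates = [model_row(t, p) for t, p in itertools.product(levels, levels)]

# Non-singular starting design (corner points, centre, one axial point).
design = [model_row(t, p) for t, p in [(-1, -1), (-1, 1), (1, -1), (1, 1), (0, 0), (1, 0)]]

# Greedy (Wynn-type) augmentation: add the candidate that most increases det(X'X).
n_runs = 12
for _ in range(n_runs - len(design)):
    X = np.array(design)
    M = X.T @ X
    best = max(candidates, key=lambda r: np.linalg.det(M + np.outer(r, r)))
    design.append(best)

X = np.array(design)
print("runs:", len(design), "  log det(X'X):", float(np.log(np.linalg.det(X.T @ X))))
```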

Journal ArticleDOI
TL;DR: In this paper, the concept of assembly-oriented design is explained in some detail, and various known methods of assembly-oriented design and their limits are discussed, as well as other possibilities of integration.

01 May 1989
TL;DR: A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application.
Abstract: Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.

Proceedings ArticleDOI
10 Oct 1989
TL;DR: In seeking to maximise opportunity for achieving common understanding and real progress on specific issues, the announcement for the 5th Process Workshop adopts a somewhat narrow view of the role of Process Models.
Abstract: In seeking to maximise opportunity for achieving common understanding and real progress on specific issues, the announcement for the 5th Process Workshop adopts a somewhat narrow view of the role of Process Models. By so focussing the discussion, the fact that desirable characteristics of models will vary with the purpose for which they are to be used, the benefit one hopes to derive from their development or use and the value attached to the benefit to be gained from such activity has had to be disregarded. Yet process models can serve many purposes as summarised, for example, in the introduction to the Proceedings of the 4th workshop [SPW88] and also on the attached table. Though the roles included in that table are neither newly identified nor controversial they are listed so that they may be kept in the back of our minds as the Workshop discussion progresses.
The individual importance of these roles can clearly not be ordered or even quantified. Their relative significance will depend on the goals of the work during which they are developed or used. The primary motivation underlying my work with process models over the past years has been the search for a better understanding of the software development and evolution process. This has led to conclusions which are, perhaps, self evident to many Computer Scientists, [DEM79], [FET88], [VAR79, 89] (as extreme examples) and others [BON77]. They are, however, not widely understood by the general public and, more importantly, by those involved in the definition and acquisition of computer systems for specific applications. In view of the increasing dependence of mankind on computers and, hence, on software it appears important to bring these, explicitly, into the open and to also examine their implications in the narrower confines of the topics discussed in the process workshop series.
In the first instance my studies concentrated on an examination of models of program evolution in recognition of the fact that understanding and controlling that phenomenon demanded understanding and control of the programming process [LEH69, 85a]. This investigation led directly to the SPE classification scheme in which E-type programs, in particular, are defined as programs that implement applications or solve problems from and in the real world. In developing this schema process models played a fundamental role [LEH80b]. These models had no direct or immediate relationship to development practice but, nevertheless, led to the insight that is reflected in the LST process model [LEH84] that later formed the conceptual underpinning of a working environment [LEH85b].
The work led to the view that an E-Type program is a model of a model of a … model of an application in the real world [LEH80b]. This abstract total-process model was enriched by Turski's view [TUR81] which regarded successive model pairs as a theory and a model of that theory or, equally, as a specification and an implementation of that specification. At the First Process Workshop [SPW84] Turski's interpretation of the “three-legged” model was, in fact, to see both the real world application (concept) and the final implementation as models of a specification that forms the bridge between concept and realisation. Equally, the source and target representations at the core of each step of the LST process, for example, may be viewed as a theory and a model of that theory.
From Turski's view it follows that each implementation is Gödel incomplete.
Its properties, including functional properties, cannot be completely known from within the system. By their involvement in the development process and through system usage humans become an integral part of the system. It is their insights, viewpoints, theories, algorithms, definitions, formalisations, reactions, interpretations and so on that drive and direct the abstraction, refinement and evolution processes. They determine the degree of satisfaction that the final solution system yields. Hence it may be observed that for any software system implementing a solution to a real world problem, modelling some aspect of the real world, there exists a degree of Gödel-type uncertainty about its properties [LEH89]. The definition, specification and development process must seek to limit this uncertainty so that it is of no practical significance, reflecting only abstract representational incompleteness.
This is not the only type of uncertainty related to program behaviour under execution relative to the operational environment. A primary property of E-type programs is the fact that installation and operation of a system change the application and its operational domain. Implicitly if not explicitly, the system includes a model of itself. Also the acts of conceiving, developing, installing, using and adapting a software system change understanding of the application and its needs, methods of solution, the relative value of specific features of system functionality, opportunities for enhancement and human perception of all these. This leads to declining satisfaction with the system unless the system is adapted to the new reality. Because of the many feedback paths the system displays Heisenberg-type uncertainty [LEH77]. The more precise the knowledge of the application, of system properties and of their respective domains the less satisfaction does the solution system deliver in terms of what are, at the time of usage, the system properties perceived to be desirable. Mismatch between system properties and human needs and desires cannot be removed, except momentarily. Development, adaptation and evolution processes and their management are key to minimisation of the consequences of this inherent uncertainty.
There is also a third type of uncertainty. The domain of an E-type application is, in general, unbounded and, effectively, continuous. The solution system is finite and discrete. The process of deriving one from the other involves a variety of abstractions on the basis of assumptions about the application, its domain, perceived needs and opportunities, human responses to real world and solution system events, computational algorithms, theories about all these and so on. Some assumptions will be explicitly stated, others will be implicit. All will be built into the final system. But the real world is essentially dynamic, forever changing. Even without feedback effects as discussed above, exogenous changes in the real world will change the facts on which the assumption set is based. However careful and complete the control on the validity of assumptions is at the time they are built into the system some, at least, will, at best, be less than fully valid when the program is executed, or better, when the results of execution are applied. That is when, to be fully satisfactory, a program needs to be correct. Initial correctness at the time of implementation is merely a means to an end.
The assumption set must be maintained correct without significant delay by appropriate changes to program or documentation texts, an impossible task even if assumptions could be precisely pinpointed. Pragmatic uncertainty in system behaviour is inevitable.
This analysis leads to the following Uncertainty Principle for Computer Application: The outcome of software system operation in the real world is inherently uncertain, with the precise area of uncertainty also not knowable.
This position paper is not the place to explore this assertion in detail. Some aspects have been discussed elsewhere [LEH77, 89]. More immediately the principle throws a new light on the expectations to be associated with Software Engineering, the system and software development process and the models constructed to represent that process. One may have many views of the process. The new one that emerges from the above discussion is of software engineering in general and the software development and evolution (maintenance) process in particular as the means to minimise uncertainty and the consequences of uncertainty, and to maintain user satisfaction. Project models must reflect this responsibility.

Proceedings ArticleDOI
R. W. Phillips
03 Jan 1989
TL;DR: The SCA is thought to solve many of the process modeling problems associated with conventional languages and methods and integrates object-oriented programming, a common repository for the logical view of data, and logic programming.
Abstract: The key concepts and requirements for a process mechanism and the state change architecture (SCA) protocols are presented. The SCA is thought to solve many of the process modeling problems associated with conventional languages and methods. It integrates object-oriented programming, a common repository for the logical view of data, and logic programming.

Proceedings ArticleDOI
10 Oct 1989
TL;DR: An overview of OPM is given, a process modeling environment through which process model templates are designed, instantiated, and tracked as they execute, and the way OPM supports model instantiation, resource management, and cooperative work among a team of developers is stressed.
Abstract: This document is an attempt to give an overview of OPM, a process modeling environment. By an environment we mean a user interface through which process model templates are designed, instantiated, and tracked as they execute. The two novel features of OPM that we want to stress here are: (i) the way OPM uses conventional CASE tool graphical notations for describing process templates, and (ii) the way OPM supports model instantiation, resource management, and cooperative work among a team of developers.
We begin with some basic definitions. A process is a collection of activities that will be carried out over time. An activity is a basic unit of work and it minimally contains a name, and some attributes such as a duration or some resources. Often the activities are ordered in some way, but not necessarily. If an ordering is available, then we can talk about an activity's predecessors and successors. When a process is being carried out (instantiated), typically more than one activity is going on at the same time. The result of an activity may be a product or simply a confirmation that it has concluded. An activity can invoke other processes. In fact, there is no logical reason to distinguish between an activity and a process, so in the future we will use these terms interchangeably.
There are several basic features of processes that we assume. In particular: there are several ways a process may be started, e.g. by human intervention, by the arrival of a message, or by the completion of some product; an action by a computer may end a process and start a new one; a process may require access to resources before it can begin; sometimes the time for a process can be reliably estimated and sometimes it cannot; processes must have the ability to look at (access) directories, files, test file attributes, express conditions, and invoke programs.
Figure 1 contains a description of how debugging takes place within some hypothetical organization. A database of bug reports is collected and there is an available pool of programmers to work on them. Each instantiation of the debugging process will assign a bug report to a programmer for fixing. When the programmer is done, he submits his fix to the QA (quality assurance) group who confirm that his fix is correct. If not, it is returned for further work. If so, then a report is written and submitted to the manager.
From this example we can draw conclusions that extend our earlier notions about processes and their descriptions. The use of dataflow diagram notation is suitable for describing this process. Rectangles are used to represent resources and ovals are used to represent processes. Arrows indicate process flow and they may carry data. Other examples we have done indicate that a wide variety of processes can be adequately described by this natural notation. To refine further the debugging example, we assume that when the programmer submits the corrected code to the quality assurance group, a new process is started. That process is shown in Figure 2. From this elaboration we conclude that processes may have lower levels which contain process descriptions. Therefore we see that a process description should be thought of as a hierarchical object.
Another observation from this example is that there may be several people all following this process description at the same time. Therefore we see the need to view the process description as a template and talk about its instantiations.
When the template of Figure 1 is instantiated we interpret this to mean that a new bug has been assigned to a programmer for repair. Observe that when a single programmer is selected to fix a single bug, that instantiation of the debugging process may give rise to multiple instantiations of the working on bug subprocess. Therefore we see that when a process is executing, subprocess templates may be instantiated multiple times.
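The template-and-instantiation view described above can be made concrete with a small data structure: a template is a named activity with resources and sub-templates, and instantiating it creates fresh activity instances, possibly several per sub-template. The Python sketch below is only an illustration of that idea; the class names and the debugging data are invented, and OPM's dataflow notation and execution machinery are not reproduced:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessTemplate:
    """A hierarchical process description: an activity with resources and sub-templates."""
    name: str
    resources: List[str] = field(default_factory=list)
    subprocesses: List["ProcessTemplate"] = field(default_factory=list)

@dataclass
class ProcessInstance:
    template: ProcessTemplate
    state: str = "ready"                  # ready -> active -> done
    children: List["ProcessInstance"] = field(default_factory=list)

def instantiate(template: ProcessTemplate, copies: int = 1) -> List[ProcessInstance]:
    """Create instances of a template; sub-templates may be instantiated multiple times."""
    instances = []
    for _ in range(copies):
        inst = ProcessInstance(template)
        for sub in template.subprocesses:
            inst.children.extend(instantiate(sub))
        instances.append(inst)
    return instances

# Invented debugging-process template, loosely following the narrative around Figure 1.
fix_bug = ProcessTemplate("work on bug", resources=["programmer"])
qa_check = ProcessTemplate("QA review", resources=["QA group"])
debugging = ProcessTemplate("debugging", resources=["bug database"],
                            subprocesses=[fix_bug, qa_check])

# Two bug reports assigned at the same time -> two concurrent instantiations.
active = instantiate(debugging, copies=2)
print(len(active), "debugging instances,",
      sum(len(i.children) for i in active), "child activities")
```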

Proceedings ArticleDOI
10 Oct 1989
TL;DR: It is mandatory to experiment with executable process models in order to obtain user feedback, to identify requirements on architectural components to support such models, and to investigate the impact of automation in the process itself.
Abstract: The primary thesis of this position paper is that it is mandatory to experiment with executable process models in order to obtain user feedback, to identify requirements on architectural components to support such models, and to investigate the impact of automation in the process itself. This paper briefly describes three generations of investigations with respect to the formalizing, modeling, and encoding of software life-cycle processes. Providing for executable processes has been one of the most important goals of this work.

Patent
19 Jan 1989
TL;DR: In this paper, a data sampling part 2 samples N pieces of input/output data on a process 1; these sampled data are supplied to a parameter deciding part 3, and the parameters of plural process models are obtained by a nonlinear optimizing method.
Abstract: PURPOSE: To realize the optimum control of a process by selecting the process model having the maximum post-probability (posterior probability) among plural models and identifying the selected model as the optimum process model. CONSTITUTION: A data sampling part 2 samples N pieces of input/output data on a process 1. These sampled data are supplied to a parameter deciding part 3, and the parameters of plural process models are obtained by a nonlinear optimizing method. Then a post-probability arithmetic part 4 obtains a probability density function based on the dispersed and observed values obtained from the parameter estimate decided by the part 3. Bayes' theorem is then applied to this probability density function to obtain the post-probability of each process model. Thus optimum process control is attained by identifying the process model that has the maximum post-probability.
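The selection rule amounts to Bayesian model comparison: fit each candidate process model, evaluate its posterior probability from the residuals via Bayes' theorem, and keep the model with the largest posterior. The sketch below illustrates that rule for two simple candidate models under a Gaussian noise assumption and equal priors; the candidate models, data and noise model are invented for illustration and are not the patent's procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sampled input/output data from an (unknown to the identifier) process.
u = rng.uniform(-1, 1, 100)
y = 0.8 * u + 0.3 * u ** 2 + rng.normal(0, 0.05, 100)

def gaussian_log_likelihood(residuals):
    """Maximised Gaussian log likelihood given the residuals of a fitted model."""
    sigma2 = residuals.var()
    n = len(residuals)
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

# Candidate process models: linear and quadratic in u (illustrative choices only).
candidates = {
    "linear":    np.column_stack([np.ones_like(u), u]),
    "quadratic": np.column_stack([np.ones_like(u), u, u ** 2]),
}

log_post = {}
for name, X in candidates.items():
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)   # parameter estimation step
    resid = y - X @ theta
    # With equal priors, the (unnormalised) log posterior is just the log likelihood;
    # a fuller treatment would also penalise model complexity via the marginal likelihood.
    log_post[name] = gaussian_log_likelihood(resid)

best = max(log_post, key=log_post.get)
print("log posteriors:", {k: round(v, 1) for k, v in log_post.items()})
print("selected process model:", best)
```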

01 Jan 1989
TL;DR: A knowledge-based system architecture for malfunction diagnosis and sensor validation in chemical process plants is discussed and a novel sensor validation strategy is developed that streamlines the validation by leveraging functional and operational knowledge about the sensors.
Abstract: A knowledge-based system architecture for malfunction diagnosis and sensor validation in chemical process plants is discussed. The sensor validation is performed during diagnosis to identify sensor failures and determine alternate data for failed sensors such that correct symptomatic data are provided for the diagnosis of malfunctioning process equipment. The architecture consists of three distinct problem-solving modules and effectively integrates qualitative and quantitative knowledge for sensor validation and diagnosis. The diagnostic module involves a malfunction hierarchy which represents malfunction hypotheses in its nodes. A systematic functional decomposition strategy is developed to help identify process subsystems in levels of detail ranging from general functional systems to specific equipment components. It facilitates the construction of a malfunction hierarchy from the process flowsheet. Functional decomposition also identifies control system (Category I) sensors which need to be represented in the malfunction hierarchy so that their failures are explicitly considered in the diagnosis as the possible causes of undesirable operating conditions. For non-control system (Category II) sensors, their failures are considered to ensure that correct symptomatic information is used for diagnosis and process monitoring. For both sensor categories, four different modes of sensor failure are considered, which include mechanical failures as well as biases. Recognizing the characteristics of chemical plants and the validation problem, a novel sensor validation strategy is developed. The strategy streamlines the validation by leveraging functional and operational knowledge about the sensors. The knowledge for sensor validation, including the past operational sensor reliability ranking and the methods for determining alternate values for the sensors, is organized in the validation module. The knowledge representation is flexible so that any applicable methods for validation, ranging from qualitative estimates to quantitative process models, can be incorporated. The validation algorithm is integrated into malfunction diagnosis such that the diagnosis specifies which sensors need to be validated. Finally, numerical process data are stored in the data abstraction module, along with the various parameters for qualitative data abstraction and independent data checks. The computational architecture is demonstrated with a working prototype system developed for a dynamically simulated chemical process.
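The flow described (a malfunction hierarchy driving diagnosis, with sensor validation supplying usable symptomatic data) can be outlined as a small tree-plus-check structure; the sketch below is a generic illustration with invented tag names, ranges and rules, not the prototype system from this work:

```python
# Generic sketch of diagnosis over a malfunction hierarchy with sensor validation
# (all tag names, ranges and rules are invented; this is not the thesis prototype).
readings = {"T101": 425.0, "F201": -3.0, "P301": 2.1}       # raw sensor readings
plausible = {"T101": (300, 500), "F201": (0, 50), "P301": (1, 5)}
alternate = {"F201": lambda r: 0.6 * r["P301"] * 10}        # crude model-based estimate

def validate(readings):
    """Replace implausible readings with alternate estimates so diagnosis sees usable
    symptomatic data; the thesis integrates far richer validation knowledge than this."""
    valid, failed = dict(readings), []
    for tag, (lo, hi) in plausible.items():
        if not lo <= readings[tag] <= hi:
            failed.append(tag)
            valid[tag] = alternate[tag](readings)
    return valid, failed

# Malfunction hierarchy: general hypotheses at the top, specific equipment below.
hierarchy = ("plant upset", lambda s: True, [
    ("feed subsystem failure", lambda s: s["F201"] < 10, [
        ("feed pump failure", lambda s: s["F201"] < 1, []),
    ]),
    ("reactor overheating", lambda s: s["T101"] > 480, []),
])

def diagnose(node, s, path=()):
    name, test, children = node
    if not test(s):
        return []
    path = path + (name,)
    leaves = [leaf for child in children for leaf in diagnose(child, s, path)]
    return leaves or [path]

valid, failed = validate(readings)
print("suspect sensors:", failed)
print("diagnoses:", diagnose(hierarchy, valid))
```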


Journal ArticleDOI
Sulekh C. Jain
01 Feb 1989-JOM
TL;DR: The need to control manufacturing processes is forcing manufacturing technologists to seek a scientific description of manufacturing processes and the response of raw materials, equipment and semi-finished products to these processes in order to simulate the entire manufacturing process, and thus obtain near-optimal solutions for producing the desired product qualities.
Abstract: The term "process" is broadly defined as the means by which a manufacturing organization converts raw materials into a finished product. This includes people and equipment operating together in a managed and planned system. In addition to the efficient use of labor and production equipment, an ideal process turns out a product with a 100% yield and no waste of raw materials. Typical real processes produce products with variations in characteristics, suffer from equipment downtime, and waste raw materials. Real processes also are sensitive to human behavior, machine performance, and the characteristics of raw materials. Often, attempts to increase the throughput of a process lead to an increase in the sensitivity to such variances. The traditional approach to maintain product quality within acceptable limits has been increasingly intensive product inspection to remove unacceptable products from the production line. Of course, such inspection is costly and time-consuming. In today's environment, which places an emphasis on total quality control (TQC), the trend is to couple process design and control together. An ideal, properly controlled process will yield only good products. Complete process control obviates the need for product inspection (or at least limits it to first article verification or occasional spot checks). The need to control manufacturing processes is forcing manufacturing technologists to seek a scientific description of manufacturing processes and the response of raw materials, equipment and semi-finished products to these processes in order to simulate the entire manufacturing process, and thus obtain near-optimal solutions for producing the desired product qualities. This is process modeling. The objective is to achieve efficient manufacturing processes without the need for the costly trial-and-error search for viable manufacturing processes. Analyses of this type rely on the

Journal Article
TL;DR: In this article, the use of spreadsheets for chemical engineering calculations is discussed; the advantages and disadvantages of spreadsheets are addressed, and several examples are illustrated, including process optimization, reaction rate equations, and operating cost calculations.
Abstract: This article discusses the use of spreadsheets for chemical engineering calculations. The advantages and disadvantages of spreadsheets are addressed. The author focuses on the application of spreadsheets for flowsheets and illustrates several examples, including process optimization, reaction rate equations, and operating cost calculations.

Journal ArticleDOI
01 Dec 1989-Energy
TL;DR: In this article, a full-scale process model of an integrated steel mill is developed and tested, with particular focus on the analysis of energy problems; the steel-making process is represented by a set of linear relations.
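Representing the mill by a set of linear relations means each stage's material and energy use is proportional to its throughput, so the plant reduces to a linear system that can be solved directly; the toy sketch below uses invented coefficients and a drastically reduced flowsheet, not the paper's full-scale model:

```python
import numpy as np

# Toy linear process model of a (very) reduced steel route, with invented coefficients.
# Stages: blast furnace -> basic oxygen furnace -> rolling; one linear relation per stage.
# Unknowns: x = [hot metal, crude steel, rolled product] in t/h.
#   crude steel   = 0.95 * hot metal + scrap (fixed 20 t/h)
#   rolled product = 0.97 * crude steel
#   demand:         rolled product = 300 t/h
A = np.array([[-0.95, 1.0, 0.0],
              [ 0.0, -0.97, 1.0],
              [ 0.0,  0.0,  1.0]])
b = np.array([20.0, 0.0, 300.0])
x = np.linalg.solve(A, b)

# Specific energy use per stage (GJ/t), again illustrative only.
energy = np.array([13.0, 0.8, 2.5]) @ x
print("hot metal, crude steel, rolled product (t/h):", np.round(x, 1))
print("total energy use (GJ/h):", round(float(energy), 1))
```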

Journal Article
TL;DR: The benefits of using an object-oriented approach to modelling are discussed; its basic elements include modularization, model encapsulation, hierarchical submodel decomposition, model parameterization and inheritance.
Abstract: In this paper we discuss the benefits of using an object-oriented approach to modelling. Models and modelling are gaining more and more interest in the process industry due to the development of knowledge-based systems. The basic elements in object-oriented modelling methodology are modularization, model encapsulation, hierarchical submodel decomposition, model parameterization and inheritance. Models have an internal structure of model components, like terminals, parameters and behaviour descriptions. The model behaviour can be described with equations or as a structure of submodels. Models and model components are represented as objects in single-inheritance object class hierarchies. Chemical processes and control systems can be described with the same basic concepts. Tasks that are facilitated in object-oriented modelling are model reuse, model development, model refinement and model maintenance. They are all of major importance for multi-purpose process modelling.
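The elements listed (terminals, parameters, behaviour descriptions, submodel decomposition, inheritance) map naturally onto classes; the sketch below is only a schematic of that correspondence in Python, with invented class and component names rather than the paper's actual modelling language:

```python
class Model:
    """Base model object: terminals connect it to other models, parameters configure it,
    and behaviour is either equations or a structure of submodels."""
    def __init__(self, name, **parameters):
        self.name = name
        self.parameters = parameters
        self.terminals = {}          # terminal name -> connected model
        self.submodels = []          # hierarchical submodel decomposition

    def add_submodel(self, model):
        self.submodels.append(model)
        return model

    def equations(self):
        """Behaviour description; leaf models override this, composite models
        collect the equations of their submodels."""
        return [eq for m in self.submodels for eq in m.equations()]

class Tank(Model):                   # inheritance: a Tank is a specialised Model
    def equations(self):
        return [f"{self.name}: A*dh/dt = q_in - q_out  (A={self.parameters['area']})"]

class Valve(Model):
    def equations(self):
        return [f"{self.name}: q = Cv*sqrt(dp)  (Cv={self.parameters['cv']})"]

# Model reuse and refinement: build a plant section out of parameterised components.
section = Model("buffer section")
tank = section.add_submodel(Tank("T1", area=2.0))
valve = section.add_submodel(Valve("V1", cv=0.8))
tank.terminals["outlet"] = valve     # connect terminals

for eq in section.equations():
    print(eq)
```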

Proceedings ArticleDOI
10 Oct 1989
TL;DR: For a process model to be accepted by a variety of practitioners, it has to be flexible, in the sense that it can be easily and naturally adapted to diverse software development scenes.
Abstract: For a process model to be accepted by a variety of practitioners, it has to be flexible, in the sense that it can be easily and naturally adapted to diverse software development scenes. It may also help, though it is not essential, if the model is derived evolutionarily, in the sense that it accepts the basic tenets of prevalent software practices with only essential modifications. We have to avoid, in this respect, mistaking an idealised software process for an actual one. Actual development processes are neither linear, hierarchical, nor organised in any other neat way, both microscopically and macroscopically. The development phases and stages used in common parlance do not depict actual development scenes; such scenes consist of lots of feedback, implicit assumptions about products not yet seen, related yet asynchronous activities, and dynamic revisions and modifications. Most of all, maintenance phases are bogus phases. Software maintenance involves all the activities of the initial development process, writ small. Problems that are said to be particular to maintenance phases, moreover, arise in initial phases also.