
Showing papers on "Process modeling" published in 1991


Book
01 Feb 1991
TL;DR: In this article, the authors introduce Process Control and Automation, and present a survey of the main aspects of operational functions in process and production control, as well as an overview of the relationships between them.
Abstract: Part I. General Introduction. What is Process Control and Automation? Enterprise Functions and Organization. Operational Functions in Process and Production Control. Aspects of Operational Functions. Part II. Desired Operation: Momentary Optimization. Forms of Optimizing Process Operation. Optimizing the Operation of Continuous Process Units. Optimal Plant Operation. Desired Operation of a Production Site. Part III. Desired Operation: Dynamic Operation. Total Run Optimization. Optimal Intermode Operations. Optimum Operation Processes. Flexible Recipes and Recipe Improvement. Part IV. Process Control. Regulatory Control Principles. Regulatory Control Structures. Regulatory Control Issues. Quality Control and Statistical Process Control. Taste Control and Efficiency. Sequence and End Point Control. Part V. Process Supervision. Process State Assessment. Off-Normal Handling. Part VI. Internal Integration. Jobs and Work Organization. Work Places. Human/Automation System-Interaction. Process Models. Hardware and Software Infrastructure. Part VII. Automating. Process Automation Plan. Process Automation Projects. Cooperation in Automation Plans and Projects. Part VIII. External Integration. Logistic Control. Plant Design and Process Control. Interactions between Process Automation and Plant Maintenance. Process and Production Information Systems. Epilogue. Appendices. Author and Organization Index. Subject Index.

69 citations


Journal ArticleDOI
TL;DR: An adaptive learning architecture for modeling manufacturing processes involving several control variables is described and it is shown that, by employing the generalization ability of neural networks in the synthesis algorithm, new recipes can be produced for the LPCVD process.
Abstract: An adaptive learning architecture for modeling manufacturing processes involving several control variables is described. The use of this architecture for process modeling and recipe synthesis for deposition rate, stress, and film thickness in low-pressure chemical vapor deposition (LPCVD) of undoped polysilicon is discussed. In this architecture the model for a process is generated by combining the qualitative knowledge of human experts, captured in the form of influence diagrams, and the learning abilities of neural networks for extracting the quantitative knowledge that relates the parameters of a process. To evaluate the merits of this methodology, the accuracy of these new models is compared to that of more conventional models generated by the use of first principles and/or statistical regression analysis. The models generated by the integration of influence diagrams and neural networks are shown to have half the error or less, even though given only half as much information in creating the models. Furthermore, it is shown that, by employing the generalization ability of neural networks in the synthesis algorithm, new recipes can be produced for the process. Two such recipes are generated for the LPCVD process. One is a zero-stress polysilicon film recipe; the second is a uniform deposition rate recipe which is based on the use of a nonuniform temperature distribution during deposition.
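
To make the quantitative half of such an architecture concrete, the sketch below fits a small one-hidden-layer network to synthetic (temperature, pressure, silane-flow) to deposition-rate data. It is not the authors' model: the variable names, the toy target function and the network size are assumptions chosen only to illustrate how a neural network can stand in for a regression-style process model.

```python
# A minimal sketch (not the paper's code): one hidden layer of tanh units
# trained by gradient descent on synthetic LPCVD-like data.  The input
# variables and target function are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "experiments": normalised control settings in [0, 1].
X = rng.uniform(size=(200, 3))          # columns: temperature, pressure, silane flow
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=200)

n_hidden = 8
W1 = rng.normal(scale=0.5, size=(3, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=n_hidden)
b2 = 0.0
lr = 0.05

for _ in range(3000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2                  # predicted deposition rate
    err = pred - y
    # Backpropagate the squared-error loss.
    gW2 = h.T @ err / len(y)
    gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h**2)
    gW1 = X.T @ gh / len(y)
    gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("RMS error:", np.sqrt(np.mean(err**2)))
```

Recipe synthesis in the paper goes further: the fitted model is searched in the reverse direction (for example, for input settings predicted to give zero film stress), guided by the influence-diagram structure, which this sketch omits.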

54 citations


Book ChapterDOI
01 May 1991
TL;DR: This paper proposes a way of integrating process and rule based approaches in information systems development with incremental specifications where details are successively added until they arrive at a specification from which executable code can be automatically generated.
Abstract: In recent years, the time aspect in information systems development has been addressed by several researchers [2], [8], [6]. Organisations are dynamic by nature and thus the importance of modelling time explicitly in systems engineering approaches is crucial. This paper proposes a way of integrating process and rule-based approaches in information systems development. Both static and dynamic aspects, including the temporal dimension, can be described. We envisage an approach with incremental specifications where details are successively added until we arrive at a specification from which executable code can be automatically generated. The output from this process (i.e., a set of rules) should be compatible with a rule manager which controls the execution of the system. A prototype has been developed to demonstrate the feasibility of this approach and is briefly described.

47 citations


Journal ArticleDOI
TL;DR: In this article, models for wear in which wear is a continuous increasing stochastic process are set up, and optimal control problems for these models are posed and explicitly solved in one case.
Abstract: Models for wear in which wear is a continuous increasing stochastic process are set up. Optimal control problems for these models are posed and explicitly solved in one case.
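
As an illustration of what such a model looks like, the sketch below simulates wear as a gamma process (continuous, increasing, with independent increments) and attaches a simple control question: at which observed wear level should the item be replaced, given periodic inspections and a higher cost for failure? All numerical values are assumptions for illustration, not the paper's model or solution.

```python
# A minimal sketch: gamma-process wear plus a crude Monte-Carlo search for a
# replacement threshold.  Costs, rates and the inspection interval are assumed.
import numpy as np

rng = np.random.default_rng(1)

def simulate_wear(horizon=200, shape_per_step=0.5, scale=0.1):
    """Cumulative wear path: independent gamma increments, hence nondecreasing."""
    return np.cumsum(rng.gamma(shape_per_step, scale, size=horizon))

def first_crossing(path, level):
    idx = np.nonzero(path >= level)[0]
    return int(idx[0]) if idx.size else None

def cost_of_policy(threshold, n_paths=1000, failure_level=8.0,
                   inspect_every=10, replace_cost=1.0, failure_cost=5.0):
    """Average cost per unit time when wear is inspected every `inspect_every`
    steps and the item is replaced at the first inspection exceeding `threshold`."""
    total = 0.0
    for _ in range(n_paths):
        path = simulate_wear()
        t_fail = first_crossing(path, failure_level)
        t_repl = next((t for t in range(inspect_every - 1, len(path), inspect_every)
                       if path[t] >= threshold), len(path) - 1)
        if t_fail is not None and t_fail <= t_repl:
            total += failure_cost / (t_fail + 1)   # failed before we replaced
        else:
            total += replace_cost / (t_repl + 1)   # planned replacement
    return total / n_paths

# A low threshold replaces too often; a high one risks failure between inspections.
best = min(np.arange(2.0, 8.0, 0.5), key=cost_of_policy)
print("best replacement threshold (simulated):", best)
```

The paper treats this kind of question analytically; the simulation here only shows the trade-off that the threshold controls.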

45 citations


Journal ArticleDOI
C. Di Massimo1, Mark J. Willis1, Gary Montague1, M.T. Tham1, A.J. Morris1 
TL;DR: In this paper, the authors use neural network-based process models for estimation in two large-scale industrial fermentation systems: a fed-batch penicillin fermentation and a continuous mycelial fermentation.
Abstract: Artificial neural networks are made up of highly interconnected layers of simple ‘neuron-like’ nodes. The neurons act as non-linear processing elements within the network. An attractive property of artificial neural networks is that, given the appropriate network topology, they are capable of learning and characterising non-linear functional relationships. Furthermore, the structure of the resulting neural network based process model may be considered generic, in the sense that little prior process knowledge is required in its determination. The methodology therefore provides a cost-efficient and reliable process modelling technique. One area where such a technique could be useful is biotechnological systems. Here, for example, the use of a process model within an estimation scheme has long been considered an effective means of overcoming inherent on-line measurement problems. However, the development of an accurate process model is extremely time consuming and often results in a model of limited applicability. Artificial neural networks could therefore prove to be a useful model building tool when striving to improve bioprocess operability. Two large scale industrial fermentation systems have been considered as test cases: a fed-batch penicillin fermentation and a continuous mycelial fermentation. Both systems serve to demonstrate the utility, flexibility and potential of the artificial neural network approach to process modelling.

43 citations


Journal ArticleDOI
TL;DR: The central part of this paper describes the nine-step Prism methodology for building and tailoring process models and gives several scenarios to support this description.
Abstract: The Prism model of engineering processes and an architecture which captures this model in its various components are described. The architecture has been designed to hold a product software process description, the life-cycle of which is supported by an explicit representation of a higher-level (or meta) process description. The central part of this paper describes the nine-step Prism methodology for building and tailoring process models and gives several scenarios to support this description. In Prism, process models are built using a hybrid process modeling language that is based on a high-level Petri net formalism and rules. An important observation is that this environment should be seen as an infrastructure for carrying out the more difficult task of creating sound process models.
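
As a toy illustration of the hybrid "Petri net plus rules" idea (this is not Prism's process modeling language), the sketch below defines a minimal place/transition net whose transitions also carry rule-like guards; the process steps, place names and the guard condition are assumptions.

```python
# A minimal sketch of a place/transition net with guarded transitions.
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    inputs: list                          # places that must hold a token
    outputs: list                         # places that receive a token on firing
    guard: callable = lambda ctx: True    # attached "rule"

@dataclass
class Net:
    marking: dict                         # place -> token count
    transitions: list = field(default_factory=list)

    def enabled(self, t, ctx):
        return all(self.marking.get(p, 0) > 0 for p in t.inputs) and t.guard(ctx)

    def fire(self, t, ctx):
        assert self.enabled(t, ctx), f"{t.name} is not enabled"
        for p in t.inputs:
            self.marking[p] -= 1
        for p in t.outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A toy development process: code must be reviewed before release, and the
# "release" transition carries a rule requiring all review comments resolved.
net = Net(marking={"coded": 1})
review  = Transition("review",  ["coded"],    ["reviewed"])
release = Transition("release", ["reviewed"], ["released"],
                     guard=lambda ctx: ctx["open_comments"] == 0)
net.transitions = [review, release]

ctx = {"open_comments": 2}
net.fire(review, ctx)
print(net.enabled(release, ctx))   # False: the rule blocks release
ctx["open_comments"] = 0
net.fire(release, ctx)
print(net.marking)                 # {'coded': 0, 'reviewed': 0, 'released': 1}
```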

40 citations


Proceedings ArticleDOI
16 Sep 1991
TL;DR: In this paper, a method of performing the cost analysis by taking into account the process yield at each step of the process sequence and how the yield at different steps impacts the overall cost of the module is presented.
Abstract: The design for manufacturing (DFM) concept has created a significant effort in the cost analysis of the overall product from the beginning of die fabrication to the fully assembled and tested module as a means to evaluate different options in packaging technologies, assembly and test processes, and equipment selection. A method of performing the cost analysis by taking into account the process yield at each step of the process sequence and how the yield at different steps impacts the overall cost of the module is presented. Most of the discussion pertains to the module assembly portion of the electronic assembly (assembling a finished package to a module), but the methodology could be applied to any process sequence. The methodology of modeling the process is outlined, and how to use the outputs from the process models to evaluate the different constituents of the module cost is addressed.
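
The core of such a yield-aware cost roll-up can be shown in a few lines. In the sketch below, every unit entering a step incurs that step's cost, but only the step's yield fraction survives, so scrap at late steps carries all the cost accumulated earlier. The step names, costs and yields are illustrative assumptions, not data from the paper.

```python
# A minimal sketch of a step-by-step yield/cost roll-up for a module assembly flow.
steps = [                     # (name, cost added per unit entering the step, step yield)
    ("die attach",     0.40, 0.99),
    ("wire bond",      0.60, 0.97),
    ("encapsulation",  0.30, 0.995),
    ("module test",    0.50, 0.92),
]

units_in = 1000.0             # units started
cost_so_far = 0.0             # total money spent on the whole lot
good = units_in

for name, unit_cost, step_yield in steps:
    cost_so_far += good * unit_cost      # every unit entering the step is processed
    good *= step_yield                   # but only a fraction survives it
    print(f"{name:15s} surviving={good:7.1f}  cumulative cost/good unit="
          f"{cost_so_far / good:.3f}")

print(f"\noverall yield = {good / units_in:.3f}, "
      f"cost per good module = {cost_so_far / good:.3f}")
```

This is why the same yield loss is far more expensive at module test than at die attach, which is the kind of comparison the methodology is meant to support.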

27 citations


Proceedings ArticleDOI
21 Oct 1991
TL;DR: This approach is based on modeling the design methodologies and classifying their components; it is demonstrated by using it to compare JSD and Booch's Object Oriented Design (BOOD), and the results show that process modeling is valuable as a powerful tool in the analysis of software development approaches.
Abstract: A number of software design methodologies have been developed and compared over the past 20 years. A good comparison would aid in codifying, enhancing and integrating these design methodologies. However, the existing comparisons are based largely upon the experiences of practitioners and the understandings of the authors. Consequently, these comparisons tend to be subjective and to be affected by application domains. It is the purpose of this paper to introduce a systematic approach to objectively compare design methodologies. Our approach is based on modeling the design methodologies and classifying their components. Modeling the design methodologies entails decomposing the methodologies into components. The classification of the components illustrates which components address similar design issues and/or have similar structures. Similar components can be identified and may be further modeled to aid in understanding more precisely their similarities and differences. The models of the methodologies are also used as the bases for conjectures and conclusions about the differences between the methodologies. In this paper we demonstrate this approach by using it to compare JSD [Jacks83] and Booch's Object Oriented Design (BOOD) [Booch86]. The results of this comparison also demonstrate that process modeling [Oster87, Kelln88] is valuable as a powerful tool in the analysis of software development approaches.

25 citations



Proceedings ArticleDOI
Dewayne E. Perry1
15 Oct 1991
TL;DR: The general goals in Interact and Intermediate are to support goal-directed process modeling in such a way as to maximize concurrency of activities and to minimize control of the human element in the process.
Abstract: Modeling such a process requires design decisions about the granularity of the process activities, about the degree of prescription defined within those activities, and about the policies that govern the triggering and termination of the activities and that define the nature of interaction and communication among the programmers evolving the system. Whatever choices are made for these issues, the resulting process is of necessity one with concurrent, independent, and asynchronous activities - that is, there will be multiple and different activities acting on the multiple and different states of the product. The critical issue is that of coordinating and synchronizing these independent activities. The general goals in Interact and Intermediate are to support goal-directed process modeling in such a way as to maximize concurrency of activities and to minimize control of the human element in the process. To do this, I have separated the model specification from the enaction - that is, I have separated the modeling from the support, and in doing so have separated the (mostly) static aspects of process modeling from the (mostly) dynamic aspects of model enactment. Interact provides facilities for defining objects, policies, and activities: object definitions are used to model both the product and the project; policy definitions are used to model various facts and relationships about both the product and the project as well as to define synchronization (or interaction) abstractions; activity definitions are used to model the process activities that transform the product and project from one state to another. Activities are defined in terms of the activating policies, the defined goals and resulting obligations. Where desired, the process designer may bind the user to a particular implementation of the activity by supplying some structure to what is normally considered a primitive entity. Object declarations include type definitions, type instances, and object definitions. Types and type instances enable the process designer to define the appropriate abstractions that are necessary for the model and to define the values for those abstractions. Objects have types and may assume the values defined for those types. For example, the model of the software artifact serves as the coordinating object for the various activities that transform the product from one state to another; the model of the project defines the objects by which non-product related communications take place. Of paramount importance, however, are the policy definitions. They define the relationships among objects in several ways. First, policies may be primitive. These serve as base abstractions which are asserted as results of activities. Second, policies may encapsulate logical expressions that relate (in various ways) base

22 citations


Proceedings ArticleDOI
15 Oct 1991
TL;DR: The Process Virtual Machine is a process kernel designed to support multiple process formalisms by providing an internal formalism in which to define process meta-models.
Abstract: The Process Virtual Machine is a process kernel designed to support multiple process formalisms by providing an internal formalism in which to define process meta-models; a common process infrastructure to support the definition, instantiation, and enactment of process models expressed in those meta-models; generic support for distribution, persistence, and object management; and independence from the object manager being used to describe the product(s) produced. It is being discussed and designed within the Process Working Group of DARPA’s ProtoTech project.

Proceedings ArticleDOI
08 Jan 1991
TL;DR: The authors present a characterization of the different dimensions of a CSCW system, describe the proposed process programming language and evaluate the expressibility of the language by demonstrating, with examples, how it can be used to generate CSCW systems with different characteristics.
Abstract: An architecture is given for computer supported cooperative work (CSCW) systems that maintains and executes an explicit model of a project-specific process to provide customized support for enforcing the team coordination policies that are represented in the process model. One focus of this research has been the design of a process programming language for building executable process models. The language provides features for data, activity and user role modelling. A process program executed by a generic virtual machine is the basis of a customized CSCW system. The authors present a characterization of the different dimensions of a CSCW system, describe the proposed process programming language and evaluate the expressibility of the language by demonstrating, with examples, how it can be used to generate CSCW systems with different characteristics.
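
The sketch below illustrates, in miniature, what executing an explicit process model for coordination can mean: activities are bound to roles and ordering constraints, and a generic engine refuses actions the model does not permit. It is not the paper's process programming language; the roles, activities and states are assumptions.

```python
# A minimal sketch of a generic engine enacting an explicit coordination model.
class ProcessModel:
    def __init__(self):
        # activity -> (role allowed to perform it, document state it produces)
        self.activities = {
            "submit_draft": ("author",   "drafted"),
            "review":       ("reviewer", "reviewed"),
            "publish":      ("editor",   "published"),
        }
        # activity -> document state required before it may run
        self.preconditions = {"review": "drafted", "publish": "reviewed"}

class Engine:
    def __init__(self, model):
        self.model = model
        self.state = "new"

    def perform(self, user_role, activity):
        role, result = self.model.activities[activity]
        if user_role != role:
            return f"refused: {activity} is reserved for role '{role}'"
        if self.model.preconditions.get(activity, self.state) != self.state:
            return f"refused: {activity} requires state '{self.model.preconditions[activity]}'"
        self.state = result
        return f"ok: document is now '{self.state}'"

engine = Engine(ProcessModel())
print(engine.perform("author",   "submit_draft"))   # ok
print(engine.perform("author",   "publish"))        # refused: wrong role
print(engine.perform("reviewer", "review"))         # ok
print(engine.perform("editor",   "publish"))        # ok
```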

Journal ArticleDOI
TL;DR: A component-connector-based description for a design object finds a friendly host in an object-oriented programming language because of its simplicity, and provides a knowledge framework with function, behaviour and structure.
Abstract: A component-connector-based description for a design object, because of its simplicity, finds a friendly host in an object-oriented programming language. The description not only matches a top-down design process but also provides a knowledge framework with function, behaviour and structure. Thus a tiny but versatile assembly browser is initially implemented in Smalltalk/V through a multigraph data structure.


Journal ArticleDOI
TL;DR: The relationship between the ERM and DFD techniques is examined and the use of logical access mapping (LAM) as a method of synthesizing the product of these two data modelling activities is proposed.
Abstract: This paper begins by describing a generally accepted view of the contemporary information systems development process. This process assumes the direct derivation of a prototype from data models which have themselves been derived from a functionally decomposed representation of reality. The data models are constructed using entity-relationship modelling (ERM) and data flow diagram (DFD) techniques. The paper also considers the use of Petri net representations as structured process models of reality. These representations are a means of structuring the functional analysis of observed situations for which an information system design is required. The paper then suggests a refinement to the model just described. It examines the relationship between the ERM and DFD techniques and proposes the use of logical access mapping (LAM) as a method of synthesizing the product of these two data modelling activities. An example is used to illustrate the derivation of ERM and DFD from Petri nets. Also an example of the refined process model is given. ‘Weak links’ in this model are identified and further work towards establishing a method for formally proving data models is proposed.

Journal ArticleDOI
TL;DR: This work proposes an extension of a qualitative modeling system, known as functional modeling, which captures goal-oriented activities explicitly, and shows how they may be used to support intelligent automation and fault management.

Proceedings ArticleDOI
11 Sep 1991
TL;DR: The software reusability-based development process and reusable objects modeling are discussed, and reusability has been largely improved by a software specification adjustment method, called differential specification, and the direct transformation capability offered in the environment.
Abstract: The software reusability-based development process and reusable objects modeling are discussed. The supporting tools environment, which allows both the developer and user to carry out requirement definitions, specification, and implementation in a reusable way, is described. Some quantitative evaluations are given about how productivity and quality have been improved by using this environment, based on the results of a number of case studies made at development sites. Reusability has been largely improved by a software specification adjustment method, called differential specification, and the direct transformation capability offered in the environment.

Journal ArticleDOI
TL;DR: In this article, the authors present a number of numerical process models of the infra-red reflow soldering process, the major joining process for surface mount assemblies, and investigate the key process and design variables and their effect on joint quality in the final assembly.


Journal ArticleDOI
01 Jan 1991-JOM
TL;DR: In this article, an intelligent process control for the induction-coupled plasma deposition (ICPD) process is presented, which consists of integration of processing knowledge, process models, process sensors and control technology.
Abstract: Induction-coupled plasma deposition (ICPD) is currently a laboratory process relying on operator expertise rather than automated control. Intelligent process control for the ICPD process, which consists of integration of processing knowledge, process models, process sensors and control technology, offers the opportunity to provide closed-loop control as well as intelligent supervisory control to accelerate process development and bring the ICPD process to full-scale production. Intelligent process control for ICPD offers benefits such as higher material quality, control of the matrix micro-structure, cost reduction, higher process yield, and shorter development and production cycle times.

Proceedings ArticleDOI
15 Oct 1991
TL;DR: In the first session of the workshop, an experimental format using two principal techniques was adopted: the debate was structured using Issues, Positions, and Arguments, following IBIS techniques, and the discussion was recorded on a board visible to all participants.
Abstract: In the first session of the workshop we adopted an experimental format using two principal techniques. The first was that we attempted to structure the debate using Issues, Positions, and Arguments. This approach was inspired by the IBIS work done at MCC by Conklin and Potts on design rationale capture (see Potts [POTT89]). The second was an attempt to record the discussion on a board visible to all participants. Our use of the IBIS techniques was based on Potts’ definitions. An issue poses a question about some focus of concern. Our issues were of the form “What are the strengths and weaknesses of X?” A position is a candidate response to an issue. Examples include “X is good for Y”, or “X has property Y which is undesirable”. An argument may support or object to a position. Arguments supply rationale or compare two positions. Four classes of formalism were presented:

Book ChapterDOI
01 Jan 1991
TL;DR: A new approach to the modelling of discrete dynamic systems, with three outstanding features, is presented: it draws on language-philosophical research on communication, it supports real conceptual modelling of processes, and its core metamodel and modelling techniques are formally defined.
Abstract: A new approach to the modelling of discrete dynamic systems, which has three outstanding features, is presented. The first is that it draws on the language-philosophical research of communication. This leads to the incorporation of the pragmatic function of messages in systems analysis, in addition to the semantic contents, which is already common practice. The second feature is that it supports real conceptual modelling of processes, by which we mean modelling at a significantly higher level of abstraction than the current highest one, which is the logical level. The third feature concerns the formal definition of the core metamodel and the applied modelling techniques. The metamodel used is an automaton, which can be viewed as a generalization as well as an extension of the finite state machine. The focus in this paper is on the dynamic perspective and on the structure perspective. The modelling capabilities of the approach are demonstrated by using an example from the area of business applications.

Proceedings ArticleDOI
08 Jan 1991
TL;DR: Discusses an active, or symbiotic, decision support system developed for the National Aeronautics and Space Administration called the Generic Operations Simulation Technique (GOST), which combines parametric modeling techniques (regression) with discrete-event simulation to cost and plan future space programs.
Abstract: Discusses an active, or symbiotic, decision support system developed for the National Aeronautics and Space Administration called the Generic Operations Simulation Technique (GOST). GOST combines parametric modeling techniques (regression) with discrete-event simulation to cost and plan future space programs. GOST differs from previous process modeling approaches by providing an 'artificially intelligent modeling expert' and an 'artificially intelligent domain expert' for assisting the user in developing and analyzing process models. An object oriented knowledge representation system provides the foundation for the GOST environment. A taxonomy of Computer Directed Process Managers, implemented within the representation system, controls the appropriate processes responsible for delivering active modeling support.

Patent
17 May 1991
TL;DR: In this patent, a project model combining a product model and process models is used to realize unified management of building-production information and to raise the grade and integration of building production, while maintaining the independence of the product model of the buildings and coping flexibly with changes in transaction contents and organization in production transactions, changes in the system, etc.
Abstract: PURPOSE: To realize unified management of information on building production and to raise the grade and integration of building production, to maintain the independence of a product model of the buildings, and to cope flexibly with alterations of transaction contents and organization in production transactions, alterations of the system, etc. CONSTITUTION: The system is equipped with a project model 1, which is constituted by combining a product model, in which the product is defined, with process models, in which the transactions regarding the product are defined, and by describing the inside of the process models as objects of hierarchic structure viewed along the transactions; interfaces 6 and 7 between the project model 1, other systems 2 and 3, and a database 4; and a user interface 5. A method of information reference which takes necessary information out of the data of the product model, together with the individual information required to carry on the respective transactions in the process models, application software relating to the actual transactions, etc., is embedded as values of slots.

Journal ArticleDOI
TL;DR: A new approach to the design and implementation of an intelligent process monitoring and diagnosis system shell called IPCS (Intelligent Process Control System) is described, which has undergone field tests and was in experimental use from May 1989 to July 1990.

Journal Article
Rossi Ja1
TL;DR: The use of computers to model dynamic systems, the theory that supports simulation modeling as a research methodology, and the issues of model validation are described.
Abstract: In the nursing literature, the term computer simulation refers exclusively to an educational tool that requires the user to respond to simulated events and to engage in the decision-making process. However, computer simulations are frequently constructed to model dynamic systems (systems that change with time) and to randomly simulate real life events in an effort to study complex problems. In this context, computer simulation is a research methodology and is used in a wide variety of disciplines including the behavioral sciences and business management. Nurses will inevitably encounter research studies where computer simulation has been employed. It is also very likely that this methodology will be used as a nursing research tool in the near future. This article introduces and describes the use of computers to model dynamic systems, the theory that supports simulation modeling as a research methodology, and the issues of model validation. Included are examples of four different research problems from diverse fields of study where computer simulation has been applied.

Proceedings ArticleDOI
01 Aug 1991
TL;DR: EXCON is a software package developed to control and define the modeling of experiments; in this case it is used to integrate various individual modeling systems, in a user-definable manner, to simulate the semiconductor manufacturing process.
Abstract: EXCON is a software package developed to control and define the modeling of experiments. In this case, EXCON is used to integrate various individual modeling systems, in a user-definable manner, to simulate the semiconductor manufacturing process. These numerous systems model specific aspects of the integrated circuit fabrication process. Each can be a large complex software program requiring many system resources to reliably emulate the physical processes, in many cases at the atomic level, in an analytical manner. There are many different program data formats and user interfaces within the modeling systems used. EXCON addresses the automatic insertion of configuration and process data, the conversion of data formats between modeling systems, and the sequence of model execution. EXCON also has a mechanism to re-run the sequence of models with variations in one or several configuration or data parameters, thereby creating an environment to do controlled experiments. EXCON assists in the visualization of the data in the experimental data sequence. EXCON allows for coarse-grained parallelism by connecting processes with an interprocess and inter-machine communication mechanism, thereby allowing for concurrent execution of processes on multiple machines. System performance enhancement is done with an incremental directed graph analysis technique. EXCON will, when appropriate, transparently convert file formats between modeling systems.
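
The chaining and incremental re-execution idea can be sketched as a small directed graph of stages with fingerprint-based caching: when a configuration parameter changes, only the stages downstream of the change are re-run. This is not EXCON itself; the stage names and toy "models" are assumptions.

```python
# A minimal sketch of an incremental modeling pipeline over a directed graph.
import hashlib, json

def fingerprint(obj):
    return hashlib.sha1(json.dumps(obj, sort_keys=True).encode()).hexdigest()

class Pipeline:
    def __init__(self):
        self.stages = {}     # name -> (function, list of upstream stage names)
        self.cache = {}      # name -> (input fingerprint, output)

    def add(self, name, fn, deps=()):
        self.stages[name] = (fn, list(deps))

    def run(self, name, params):
        fn, deps = self.stages[name]
        inputs = {d: self.run(d, params) for d in deps}          # recurse upstream
        key = fingerprint({"params": params.get(name, {}), "inputs": inputs})
        if name in self.cache and self.cache[name][0] == key:
            return self.cache[name][1]                           # unchanged: reuse
        print(f"re-running {name}")
        out = fn(params.get(name, {}), inputs)
        self.cache[name] = (key, out)
        return out

# Toy stages standing in for, e.g., implant / diffusion / device simulators.
p = Pipeline()
p.add("implant",   lambda prm, ins: {"dose": prm["dose"]})
p.add("diffusion", lambda prm, ins: {"junction": ins["implant"]["dose"] * prm["time"]},
      deps=["implant"])
p.add("device",    lambda prm, ins: {"vt": 0.3 + 1e-15 * ins["diffusion"]["junction"]},
      deps=["diffusion"])

params = {"implant": {"dose": 1e15}, "diffusion": {"time": 2.0}}
p.run("device", params)                       # runs all three stages
params["diffusion"]["time"] = 2.5
p.run("device", params)                       # re-runs only diffusion and device
```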

Journal Article
TL;DR: In this paper, the authors show how the design of heat-treating processes can be improved and made more efficient through numerical simulation.
Abstract: Design of heat-treating processes can be improved and made more efficient through numerical simulation

Book ChapterDOI
01 Jan 1991
TL;DR: The objective of IMPPACT is to develop and demonstrate a new generation of integrated modelling systems for product design and process planning including machine control data generation.
Abstract: The objective of IMPPACT is to develop and demonstrate a new generation of integrated modelling systems for product design and process planning, including machine control data generation. The main goals are to improve the efficiency of software vendors and users by making software easier to integrate and by enhancing the functionality of existing CIM modules. Limitations of integration will be overcome by a new conceptual approach of features used in all stages of the manufacturing process.

Journal ArticleDOI
TL;DR: This paper describes an active, or symbiotic, decision support system developed for the National Aeronautics and Space Administration called the Generic Operations Simulation Technique (GOST).
Abstract: This paper describes an active, or symbiotic, decision support system developed for the National Aeronautics and Space Administration called the Generic Operations Simulation Technique (GOST). GOST combines parametric modeling techniques (regression) with discrete-event simulation to cost and plan future space programs. GOST differs from previous process modeling approaches by providing an “artificially intelligent modeling expert” and an “artificially intelligent domain expert” for assisting the user in developing and analyzing process models. An object oriented knowledge representation system provides the foundation for the GOST environment. A taxonomy of Computer Directed Process Managers, implemented within the representation system, controls the appropriate processes responsible for delivering active modeling support.
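
The combination described for GOST can be sketched as a parametric cost-estimating relationship feeding a small discrete-event simulation of operations. The regression form, task mix and service times below are illustrative assumptions, not GOST's models or data.

```python
# A minimal sketch: a regression-style cost estimate plus a queueing-style
# discrete-event simulation of payload processing.  All numbers are assumed.
import heapq, random

random.seed(0)

def parametric_cost(mass_kg, complexity):
    """Toy cost-estimating relationship: cost = a * mass^b * c^complexity."""
    return 1.2 * mass_kg ** 0.8 * 1.5 ** complexity

def simulate_processing(payloads, n_bays=2):
    """Discrete-event simulation: payloads queue for a limited number of processing bays."""
    events = [(0.0, i) for i in range(len(payloads))]   # (arrival time, payload id)
    heapq.heapify(events)
    bay_free = [0.0] * n_bays
    finish = {}
    while events:
        t, i = heapq.heappop(events)
        bay = min(range(n_bays), key=lambda b: bay_free[b])
        start = max(t, bay_free[bay])
        service = random.expovariate(1.0 / payloads[i]["proc_days"])
        bay_free[bay] = start + service
        finish[i] = start + service
    return max(finish.values())                          # campaign makespan in days

payloads = [{"mass": 900,  "complexity": 2, "proc_days": 20},
            {"mass": 1500, "complexity": 3, "proc_days": 35},
            {"mass": 400,  "complexity": 1, "proc_days": 10}]

hardware_cost = sum(parametric_cost(p["mass"], p["complexity"]) for p in payloads)
makespan = simulate_processing(payloads)
print(f"estimated hardware cost: {hardware_cost:,.0f} (arbitrary units)")
print(f"simulated processing makespan: {makespan:.1f} days")
```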