
Showing papers on "System integration published in 1987"


Journal ArticleDOI
TL;DR: In this paper, the authors define integration as "joining together to make one" and show that integration actually means reintegration, because the process of designing a manufacturing system starts with a set of products and an entire factory which is already integrated.

36 citations


Book
01 Jan 1987
TL;DR: In this book, the authors present chapters on catalyst design, process heat exchanger design, energy-efficient separation, process integration, process synthesis, computer-aided design (CAD) and process flowsheeting systems, integration of process design and process control, safety in process and plant design, design bases for synthetic fuel plants, process and plant design for food and biochemical production, and chemical engineering in electronic materials processing.
Abstract: Catalyst Design: Selected Topics and Examples; Process Heat Exchanger Design: Qualitative Factors in Selection and Application; Energy Efficient Separation Process Design; Process Integration; Process Synthesis: A Morphological View; Process Synthesis: Some Simple and Practical Developments; Computer-Aided Design (CAD) Advance in Process Flowsheeting Systems; Integration of Process Design and Process Control; Safety in Process and Plant Design; Developing Design Bases for Synthetic Fuel Plants; Process and Plant Design for Food and Biochemical Production; Chemical Engineering in Electronic Materials Processing: Semiconductor Reaction and Reactor Engineering.

26 citations



Book ChapterDOI
01 Jan 1987
TL;DR: In this article, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer enabling him to modify the system design so as to improve its performance.
Abstract: Interactions among engineering disciplines and subsystems in engineering system design are surveyed and specific instances of such interactions are described. Examination of the interactions shows that a traditional design process, in which the numerical values of major design variables are decided consecutively, is likely to lead to a suboptimal design. Supporting numerical examples are a glider and a space antenna. Under an alternative approach introduced, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer, enabling him to modify the system design so as to improve its performance. Examples of a framework structure and an airliner wing illustrate that approach.
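As a rough illustration of the concurrent-generation idea in this abstract (not the paper's glider, antenna, framework, or wing examples), the Python sketch below evaluates two hypothetical subsystem metrics and their finite-difference sensitivities in parallel, then combines them into one weighted gradient step on the shared design variables, standing in for the system designer's modification of the design. The subsystem formulas, weights, and step size are invented for illustration.

# A minimal sketch, not the paper's method: two hypothetical subsystem analyses
# (a mass-like and a drag-like metric) are evaluated concurrently, each returning
# its value and finite-difference sensitivities with respect to the shared design
# variables; the "system designer" then takes one weighted gradient step.
from concurrent.futures import ProcessPoolExecutor

def structures(x):            # hypothetical structures subsystem: mass-like metric
    span, chord = x
    return 0.5 * span**2 * chord

def aerodynamics(x):          # hypothetical aerodynamics subsystem: drag-like metric
    span, chord = x
    return 4.0 / (span * chord) + 0.1 * chord

def value_and_gradient(f, x, h=1e-6):
    """Evaluate a subsystem metric and its finite-difference sensitivities."""
    f0 = f(x)
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        grad.append((f(xp) - f0) / h)
    return f0, grad

if __name__ == "__main__":
    x = [10.0, 1.5]                      # shared design variables (span, chord)
    weights = [0.3, 0.7]                 # system-level trade-off weights (assumed)
    with ProcessPoolExecutor() as pool:  # subsystem analyses run concurrently
        futures = [pool.submit(value_and_gradient, f, x)
                   for f in (structures, aerodynamics)]
        results = [fut.result() for fut in futures]
    # The system designer combines the sensitivity data and modifies the design.
    step = 0.05
    total_grad = [sum(w * g[i] for w, (_, g) in zip(weights, results))
                  for i in range(len(x))]
    x_new = [xi - step * gi for xi, gi in zip(x, total_grad)]
    print("old design:", x, "new design:", x_new)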

15 citations


Journal ArticleDOI
TL;DR: In this article, the authors considered two integration issues through different implementation phases: viability demonstration of the DSM system, and user participation in the formulation of the system, which was deemed necessary since acceptance of a poorly understood system is difficult and rejection of a system which does not fulfill the user needs is almost guaranteed.
Abstract: The integration of a Demand Side Management (DSM) system into a utility begins as soon as the first DSM pilot program is conceived and continues through implementation until the complete system is installed. For its DSM implementation, Florida Power & Light Company has considered two integration issues through the different implementation phases: viability demonstration of the DSM system, and user participation in the formulation of the system. At any time, DSM should address the viability concerns of customer acceptance and customer demand reduction levels; these two concerns must be examined throughout the life of the project. The second integration issue, user participation, was deemed necessary since acceptance of a poorly understood system is difficult and rejection of a system which does not fulfill user needs is almost guaranteed. Proper consideration of the viability and user participation issues will greatly enhance DSM system integration into the utility.

14 citations


Proceedings ArticleDOI
01 Jan 1987
TL;DR: A software tool which automatically generates FORTRAN routines for tabular data lookups, the language used to develop a simulation model, and the structures for passing information into a simulation are discussed.
Abstract: A framework to build simulation models for aircraft dynamic systems integration is described. The objective of the framework is increased simulation model fidelity and reduced time required to develop and modify these models. The equations of motion for an elastic aircraft and their impact on the framework are discussed in broad terms. A software tool which automatically generates FORTRAN routines for tabular data lookups, the language used to develop a simulation model, and the structures for passing information into a simulation are discussed. A simulation variable nomenclature is presented. The framework has been applied to build an open-loop F/A-18 simulation model. This example model is used to illustrate model reduction issues. Current deficiencies in the framework are identified as areas for future research.
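The abstract does not describe the generator tool's actual interface or output, so the following Python sketch only illustrates the idea of automatically emitting FORTRAN table-lookup routines; the routine name, table values, and FORTRAN 77 style are assumptions made purely for illustration.

# A minimal sketch of auto-generating a table-lookup routine (the actual tool and
# its output format are not described in the abstract): given breakpoints and
# values, emit FORTRAN source for a bounded 1-D linear-interpolation lookup.
def generate_fortran_lookup(name, breakpoints, values):
    """Emit FORTRAN 77-style source for a 1-D linear-interpolation lookup."""
    n = len(breakpoints)
    bp = ", ".join(f"{b:.6E}" for b in breakpoints)
    vl = ", ".join(f"{v:.6E}" for v in values)
    return f"""      REAL FUNCTION {name}(X)
      REAL X, XT({n}), YT({n})
      INTEGER I
      DATA XT /{bp}/
      DATA YT /{vl}/
      IF (X .LE. XT(1)) THEN
        {name} = YT(1)
        RETURN
      END IF
      DO 10 I = 2, {n}
        IF (X .LE. XT(I)) THEN
        {name} = YT(I-1)+(YT(I)-YT(I-1))*(X-XT(I-1))/(XT(I)-XT(I-1))
        RETURN
        END IF
   10 CONTINUE
      {name} = YT({n})
      RETURN
      END
"""

if __name__ == "__main__":
    # Hypothetical aerodynamic coefficient table: alpha (deg) vs. lift coefficient
    print(generate_fortran_lookup("CLTAB", [0.0, 5.0, 10.0, 15.0],
                                  [0.0, 0.45, 0.85, 1.10]))

A generated routine such as the hypothetical CLTAB above could then be compiled into a simulation model wherever a tabular coefficient lookup is needed.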

11 citations


Proceedings ArticleDOI
24 Sep 1987
TL;DR: This paper addresses methods for high and low level multi-sensor integration based on maintaining consistent labelings of features detected in different sensor domains and implementation in a concurrent computing environment.
Abstract: This paper addresses methods for high and low level multi-sensor integration based on maintaining consistent labelings of features detected in different sensor domains. Implementation in a concurrent computing environment is discussed. Keywords: Multi-Sensor Integration, Sensor Fusion, Consistent Labeling, Markov Random Field, Concurrent Computing, Hypercube, Simulated Annealing.
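The paper's Markov random field formulation and feature models are not given in this abstract, so the following Python sketch only illustrates the general pattern of consistent labeling by simulated annealing: per-sensor label evidence plus a penalty for disagreeing labels on features matched across sensor domains. All features, labels, scores, and annealing parameters are invented.

# A minimal sketch of consistent labeling by simulated annealing; the unary
# evidence, cross-sensor matches, and penalty below are made up for illustration.
import math
import random

LABELS = ["road", "building", "vegetation"]

# Hypothetical per-sensor confidence that a detected feature carries a label.
unary = {
    ("visual", 0): {"road": 0.7, "building": 0.2, "vegetation": 0.1},
    ("radar", 0):  {"road": 0.6, "building": 0.3, "vegetation": 0.1},
    ("visual", 1): {"road": 0.1, "building": 0.8, "vegetation": 0.1},
    ("radar", 1):  {"road": 0.2, "building": 0.5, "vegetation": 0.3},
}
# Pairs of features believed to correspond across sensor domains.
matches = [(("visual", 0), ("radar", 0)), (("visual", 1), ("radar", 1))]

def energy(labeling, penalty=2.0):
    """Low energy = strong per-sensor evidence + agreement on matched features."""
    e = -sum(math.log(unary[f][labeling[f]] + 1e-9) for f in labeling)
    e += sum(penalty for a, b in matches if labeling[a] != labeling[b])
    return e

def anneal(steps=2000, t0=2.0, cooling=0.999):
    labeling = {f: random.choice(LABELS) for f in unary}
    current = energy(labeling)
    temperature = t0
    for _ in range(steps):
        f = random.choice(list(labeling))          # propose relabeling one feature
        proposal = dict(labeling)
        proposal[f] = random.choice(LABELS)
        delta = energy(proposal) - current
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            labeling, current = proposal, current + delta
        temperature *= cooling                      # gradual cooling schedule
    return labeling

if __name__ == "__main__":
    print(anneal())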

11 citations



Proceedings ArticleDOI
01 Mar 1987
TL;DR: This paper takes advantage of WLISP, an object-oriented, knowledge-based user interface construction kit that contains a large number of tools and intelligent support systems and shows benefits and problems of separating the interface from the application system.
Abstract: A contemporary software engineering problem is to adapt an existing piece of software to a new interface technology. We describe the process of integrating a software engineering tool with a window-based direct manipulation interface. Four stages of this process are described - from a simple integration to a fully integrated system in which it becomes hard to separate properties of the tool from those of the interface. We take advantage of WLISP, an object-oriented, knowledge-based user interface construction kit that contains a large number of tools and intelligent support systems. The object-oriented architecture of WLISP is well suited for making use of existing interface components and tailoring them to the specific needs of the application. This is illustrated by describing the implementation of two of the four integration stages in some detail. Thereby we show benefits and problems of separating the interface from the application system.
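WLISP itself is not shown in the abstract, so the following Python sketch only illustrates the underlying separation of application and interface in an object-oriented style: the tool publishes results to any registered interface object and knows nothing about windows, so the interface can be swapped without touching the tool. The class and method names are hypothetical.

# A minimal sketch of separating a tool from its interface; not WLISP's design.
class CrossReferenceTool:
    """Stand-in 'application': a software engineering tool with no UI knowledge."""
    def __init__(self):
        self.listeners = []
    def add_listener(self, listener):
        self.listeners.append(listener)
    def analyze(self, source):
        symbols = sorted(set(source.split()))     # trivial stand-in analysis
        for listener in self.listeners:
            listener.show(symbols)                # report results, however displayed

class WindowInterface:
    """Stand-in 'interface': could be replaced without touching the tool."""
    def show(self, symbols):
        print("symbols:", ", ".join(symbols))

if __name__ == "__main__":
    tool = CrossReferenceTool()
    tool.add_listener(WindowInterface())
    tool.analyze("defun foo bar foo")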

9 citations


Proceedings ArticleDOI
01 Jan 1987
TL;DR: In this article, the authors examine the augmentation of terrestrial utility algorithmic decision aids for power dispatching for Space Station use, using expert systems to direct power demand analyses and to integrate the results into operational decisions.
Abstract: Space Station electrical power management must be accomplished autonomously in order to decrease both airborne and ground support costs. Attention is presently given to the augmentation of terrestrial utility algorithmic decision aids for power dispatching for space station use, using expert systems to direct power demand analyses and the integration of results into operational decisions. Functions to be thus managed encompass power scheduling, energy allocation, failure cause diagnoses, goal proposal and plan preparation, consequence evaluation, and execution plan selection. The operating states of the system are normal, preventive, emergency, and restorative.
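As a minimal, hedged sketch of the rule-based flavor of such a system (the paper's actual expert system, knowledge base, and telemetry fields are not described in the abstract), the following Python fragment classifies readings into the four operating states named above and maps each state to a dispatch-style action; the thresholds and field names are invented.

# A minimal rule-based sketch, not the paper's expert system: classify the power
# system operating state from simple load/health indicators and propose an action.
def classify_state(reading):
    """Map illustrative indicators onto the four operating states."""
    if reading["fault_detected"]:
        return "restorative" if reading["load_shed"] else "emergency"
    if reading["demand_kw"] > 0.9 * reading["capacity_kw"]:   # assumed threshold
        return "preventive"
    return "normal"

ACTIONS = {
    "normal":      "follow nominal power schedule",
    "preventive":  "defer low-priority loads and re-run demand analysis",
    "emergency":   "shed non-critical loads and diagnose failure cause",
    "restorative": "prepare and evaluate a restoration plan before re-energizing",
}

if __name__ == "__main__":
    reading = {"demand_kw": 68.0, "capacity_kw": 75.0,
               "fault_detected": False, "load_shed": False}
    state = classify_state(reading)
    print(state, "->", ACTIONS[state])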

8 citations


Journal ArticleDOI
TL;DR: The architecture, the design concept, and the network organization for NTT's broadband switching system field trials are described, along with a study of the broadband switching system based on a close look at the results obtained from the field trials.
Abstract: The architecture, the design concept, and the network organization for NTT's broadband switching system field trials are described. The features of this system are: a variety of connection-type services such as reservation-based and asymmetric connections, the installation of small-size remote concentrating switches, and system integration using various subscriber transmission media such as optical fiber, satellite, and radio. This paper also presents the system architecture along with a study of the broadband switching system through a close look at the results obtained from the field trials.

01 Oct 1987
TL;DR: A knowledge-based system for performing real-time monitoring and analysis of telemetry data from the NASA Hubble Space Telescope is described; it is being used to monitor test cases produced by the Bass Telemetry System in the Hardware/Software Integration Facility at Lockheed Missile and Space Co. in Sunnyvale, California.
Abstract: This paper describes a knowledge-based system for performing real-time monitoring and analysis of telemetry data from the NASA Hubble Space Telescope (HST). In order to handle asynchronous inputs and perform in real time, the system consists of three or more separate processes, which run concurrently and communicate via a message passing scheme. The data management process gathers, compresses, and scales the incoming telemetry data before sending it to the other tasks. The inferencing process uses the incoming data to perform a real-time analysis of the state and health of the Space Telescope. The I/O process receives telemetry monitors from the data management process, updates its graphical displays in real time, and acts as the interface to the console operator. The three processes may run on the same or different computers. This system is currently under development and is being used to monitor test cases produced by the Bass Telemetry System in the Hardware/Software Integration Facility at Lockheed Missile and Space Co. in Sunnyvale, California.
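A minimal sketch of that three-process, message-passing structure is shown below using Python's multiprocessing queues; the telemetry frames, scaling, and alarm rule are invented stand-ins, and real telemetry decommutation and graphical displays are not modeled.

# A minimal sketch of the three-process pipeline described above:
# data management -> inferencing -> I/O, connected by message-passing queues.
import multiprocessing as mp

def data_management(raw, to_inference, to_io):
    for frame in raw:                        # gather, scale, and forward telemetry
        scaled = {k: v * 0.1 for k, v in frame.items()}
        to_inference.put(scaled)
        to_io.put(("monitor", scaled))
    to_inference.put(None)                   # end-of-data sentinel
    to_io.put(None)

def inferencing(to_inference, to_io):
    while (frame := to_inference.get()) is not None:
        # Invented stand-in for real-time health analysis rules.
        status = "ALARM" if frame.get("battery_temp", 0) > 30 else "OK"
        to_io.put(("status", status))
    to_io.put(None)                          # tell the I/O process we are done

def io_process(to_io):
    done = 0
    while done < 2:                          # expect one sentinel per producer
        msg = to_io.get()
        if msg is None:
            done += 1
        else:
            print(msg)                       # stand-in for real-time displays

if __name__ == "__main__":
    raw = [{"battery_temp": 250, "bus_volts": 280},
           {"battery_temp": 320, "bus_volts": 279}]
    q_inf, q_io = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=data_management, args=(raw, q_inf, q_io)),
             mp.Process(target=inferencing, args=(q_inf, q_io)),
             mp.Process(target=io_process, args=(q_io,))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()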

Journal ArticleDOI
01 Mar 1987
TL;DR: The simulation facilities in the NASA Lyndon B. Johnson Space Center (JSC) Engineering Directorate used in support of Space Shuttle development and operations have been upgraded throughout the 16-year program and reflect near current technology in both computer systems and simulation software design as discussed by the authors.
Abstract: Real-time, man-in-the-loop simulation has served an important function in the NASA manned-space-flight program by providing the means to evaluate systems design and integrated systems performance on the ground in a simulated flight environment. The Space Shuttle Program has relied heavily on simulation throughout all phases of development and into Space Shuttle operations. The simulation facilities in the NASA Lyndon B. Johnson Space Center (JSC) Engineering Directorate used in support of Space Shuttle development and operations have been upgraded throughout the 16-year program and reflect near current technology in both computer systems and simulation software design. Operations of the JSC engineering simulation facility, the Systems Engineering Simulator, were expanded in 1985 to support Orbiter-related Space Station design activities such as Orbiter docking scenarios and the use of the Orbiter remote manipulator system in Space Station berthing scenarios. Development of a new Space Station simulation designed to provide long-term support to the Space Station Program is well under way. A description of the two Engineering Directorate simulation facilities, the Systems Engineering Simulator and the Shuttle Avionics Integration Laboratory, is presented. The function of each in support of the Space Shuttle Program is discussed, with emphasis on functions applicable to Space Station. The function of the Systems Engineering Simulator in Space Station development is described. Finally, a comprehensive and detailed description of the new Space Station simulation under development on the System Engineering Simulator is presented.

Proceedings ArticleDOI
01 Dec 1987
TL;DR: This paper describes an evolution approach that seeks to avoid the pitfalls of both phased refinement and pure evolutionary enhancement, such as poorly structured software, problems in error handling and project management, and errors introduced by modifications.
Abstract: Traditionally there have been two alternative strategies for software development: phased refinement or evolutionary enhancement. In phased refinement, all system functionality is specified in the first step of development, and subsequent implementation phases add prescribed design details. This is the standard for formalized methodologies, such as the waterfall model underlying DoD standards. The evolution model, conversely, assumes that system functionality cannot be specified correctly initially, and it provides for incremental build-up of functionality. The phased refinement approach is criticized for its high cost of maintenance, for poor motivation of system developers doing abstract tasks in early phases of development, and for complication of system integration. The evolutionary approach is criticized for producing poorly structured software, which can lead to problems in error handling, project management, and errors from modifications. This paper describes an evolution approach which seeks to avoid these pitfalls.


Proceedings ArticleDOI
01 Jun 1987
TL;DR: This model is based on the International Standards Organization (ISO) layered model for Open Systems Interconnection (OSI) and the requirements used to develop the model are presented, and the various elements of the model described.
Abstract: This paper presents a model for integrated communications within the Space Station Information System (SSIS). The SSIS is generally defined as the integrated set of space and ground information systems and networks which will provide required data services to the Space Station flight crew, ground operations personnel, and customer communities. This model is based on the International Standards Organization (ISO) layered model for Open Systems Interconnection (OSI). The requirements used to develop the model are presented, and the various elements of the model described.
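The SSIS-specific protocols are not given in the abstract, so the following Python sketch only illustrates the layered encapsulation principle of the OSI reference model on which the SSIS communications model is based: each layer wraps the payload with its own header on the way down and strips it on the way up. The header format is an invented simplification, not the SSIS design.

# A minimal sketch of OSI-style layered encapsulation; headers are illustrative.
OSI_LAYERS = ["application", "presentation", "session",
              "transport", "network", "data link", "physical"]

def encapsulate(payload: bytes) -> bytes:
    """Wrap the payload with a simple header at each layer, top to bottom."""
    frame = payload
    for layer in OSI_LAYERS:
        frame = f"[{layer}]".encode() + frame   # lower layers end up outermost
    return frame

def decapsulate(frame: bytes) -> bytes:
    """Strip the headers in reverse order on the receiving side."""
    for layer in reversed(OSI_LAYERS):
        header = f"[{layer}]".encode()
        assert frame.startswith(header), f"missing {layer} header"
        frame = frame[len(header):]
    return frame

if __name__ == "__main__":
    message = b"telemetry request"
    wire = encapsulate(message)
    print(wire)
    assert decapsulate(wire) == message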

Journal Article
TL;DR: Making the transition from a manual to a truly integrated system is a gradual process, but will realize long-term benefits such as increased system efficiency, improved patient care, and better strategic business support.
Abstract: Healthcare organizations' changing priorities have created needs for new information systems, as well as a demand for better integration of existing systems. Ten years ago, data processing concentrated on general and patient accounting, payroll, and accounts payable. Today, automated systems have to be able to gather data from several different systems to help in the added areas of patient care, cost management, and marketing. Making the transition from a manual to a truly integrated system is a gradual process, but will realize long-term benefits such as increased system efficiency, improved patient care, and better strategic business support. Before taking this step, healthcare executives need to know the typical problems that may occur with integration and what factors to consider before developing an integrated system.



01 Jul 1987
TL;DR: Graphics simulation activities at the Mission Planning and Analysis Division (MPAD) of NASA's Johnson Space Center are focusing on the evaluation of a wide variety of graphical analyses within the context of present and future space operations.
Abstract: Creation of an interactive display environment can expose issues in system design and operation not apparent from nongraphics development approaches. Large amounts of information can be presented in a short period of time. Processes can be simulated and observed before committing resources. In addition, changes in the economics of computing have enabled broader graphics usage beyond traditional engineering and design into integrated telerobotics and Artificial Intelligence (AI) applications. The highly integrated nature of space operations often tends to rely upon visually intensive man-machine communication to ensure success. Graphics simulation activities at the Mission Planning and Analysis Division (MPAD) of NASA's Johnson Space Center are focusing on the evaluation of a wide variety of graphical analyses within the context of present and future space operations. Several telerobotics and AI application studies utilizing graphical simulation are described. The presentation includes portions of videotape illustrating technology developments involving: (1) coordinated manned maneuvering unit and remote manipulator system operations, (2) a helmet mounted display system, and (3) an automated rendezvous application utilizing expert system and voice input/output technology.

Journal ArticleDOI
TL;DR: Pacific Gas and Electric Company (PGandE), as discussed by the authors, has developed a demonstration real-time pricing program which has long-term promise for large industrial customers, focusing on five key operational considerations - Hardware Development, Costing and Rate Design, Marketing, Customer Operations and Billing, and Analysis.
Abstract: Pacific Gas and Electric Company (PGandE) has developed a demonstration real-time pricing program which has long-term promise for large industrial customers. The demonstration is designed to focus on five key operational considerations - Hardware Development, Costing and Rate Design, Marketing, Customer Operations and Billing, and Analysis. A particularly important aspect of PGandE's success with real-time pricing has been the use of small working groups, drawn from functional departments, which focus on each systems integration issue. PGandE's experience thus far clearly demonstrates the importance of systems integration to successfully implementing new rate schedules such as real-time pricing.



Patent
02 Apr 1987
TL;DR: In this article, a file integration processing part generates a category number-file correspondence table by making old files and old programs correspond to categories in an influence point table (or selection influence table) on the basis of an old file information table/old program information table.
Abstract: PURPOSE: To facilitate system integration processing and to improve reliability by generating, for each file, the items relating to the conversion of files and programs that is carried out for each local system, when plural local systems having the same program and file formats are integrated into one system. CONSTITUTION: A file integration processing part 140 generates a category number-file correspondence table 143 by making old files and old programs correspond to categories in an influence point table 141 (or selection influence table 142), on the basis of an old file information table/old program information table 21, which indicates the characteristics of the files and programs to be integrated into the system, and the integration conditions 22 of individual users. Consequently, even if there are many object files and programs present when the systems are integrated, those whose conversion patterns are of the same kind are handled as one group on the basis of the categories of the influence point table.
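A minimal sketch of the grouping idea, assuming invented file records and an invented category rule in place of the patent's influence point table, might look like the following: files whose layout characteristics map to the same category number are converted as one group.

# A minimal sketch of grouping files by conversion-pattern category; the file
# records and the category rule are invented, not the patent's tables.
from collections import defaultdict

# Hypothetical old-file descriptions from two local systems being merged.
old_files = [
    {"name": "ORDERS_A",   "record_length": 120, "key_field": "ORDER_NO"},
    {"name": "ORDERS_B",   "record_length": 120, "key_field": "ORDER_NO"},
    {"name": "CUSTOMER_A", "record_length": 200, "key_field": "CUST_ID"},
    {"name": "CUSTOMER_B", "record_length": 180, "key_field": "CUST_ID"},
]

def category_number(f):
    """Stand-in for the category number-file correspondence table: files with the
    same layout characteristics receive the same conversion category."""
    return (f["record_length"], f["key_field"])

def build_groups(files):
    groups = defaultdict(list)
    for f in files:
        groups[category_number(f)].append(f["name"])
    return groups

if __name__ == "__main__":
    for category, members in build_groups(old_files).items():
        print(category, "->", members)    # one conversion procedure per group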


Proceedings ArticleDOI
24 Apr 1987
TL;DR: In this article, the authors review the interrelationships between some critical issues in lens design and the integration of electro-optical systems and highlight others that may not be as well known.
Abstract: The purpose of this paper is to review the inter-relationships between some critical issues in lens design and the integration of electro-optical systems. It is meant to be of use to the systems engineer responsible for integrating the various disciplines (e.g. optical, thermal, etc.) that make up a full system. It is assumed that the reader has some optical knowledge but is not expert in lens design. The intent is to describe the function of the optical designer and the contribution of optical design programs, to refresh the reader's memory of some important optical issues and relationships, and highlight others that may not be as well known. Representative references are included which the reader can consult to get more detailed information.

Proceedings Article
23 Aug 1987
TL;DR: This work has implemented a generic integration tool that has been demonstrated to significantly shorten the design cycle, and is studying its application to life-cycle engineering.
Abstract: Life-cycle engineering is the integration of the design process, which addresses primarily the system's performance, with analysis of the design's other attributes, including reliability, maintainability, life-cycle cost, and manufacturability. The advent of symbolic approaches to design integration reveals the requirement and opportunity of using higher-level analysis to select the optimum design. This opportunity is present only when human interaction is reduced sufficiently to permit several complete design and analysis cycles to take place. We have implemented a generic integration tool that has been demonstrated to significantly shorten the design cycle, and are studying its application to life-cycle engineering. This logic-based tool regards the requested attributes as a set of uninstantiated variables and invokes external computational programs, recursively if necessary, to achieve a proof consisting of the computed variable bindings.
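The tool's actual rule base and external programs are not described in the abstract, so the following Python sketch only illustrates the stated mechanism in miniature: a requested attribute is treated as an unbound variable and resolved by invoking a computation, recursively resolving that computation's own inputs first, until a set of variable bindings is produced. The attribute names and formulas are invented.

# A minimal backward-chaining sketch of recursive attribute resolution;
# the rules below are illustrative, not the tool's actual analyses.
RULES = {
    # attribute: (inputs it depends on, routine that computes it)
    "unit_cost":      (["part_count", "labor_hours"],
                       lambda parts, hours: 12.0 * parts + 40.0 * hours),
    "reliability":    (["part_count"],
                       lambda parts: 0.999 ** parts),
    "lifecycle_cost": (["unit_cost", "reliability"],
                       lambda cost, rel: cost + 5000.0 * (1.0 - rel)),
}

def resolve(attribute, bindings):
    """Return a binding for `attribute`, invoking rules recursively as needed."""
    if attribute in bindings:                     # already instantiated
        return bindings[attribute]
    inputs, compute = RULES[attribute]            # fails loudly if no rule exists
    values = [resolve(name, bindings) for name in inputs]
    bindings[attribute] = compute(*values)
    return bindings[attribute]

if __name__ == "__main__":
    design = {"part_count": 150, "labor_hours": 12.0}   # known design variables
    resolve("lifecycle_cost", design)
    print(design)                                 # the "proof": computed bindings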


Journal ArticleDOI
Robert M. Mattison
01 Sep 1987
TL;DR: The author proposes that this problem definition process can be greatly facilitated by a better understanding of the business's organizational environment and its relationship to the computer environment.
Abstract: Many computer systems today have evolved into hybrids of assorted subsystems. This assortment can include systems of disparate technologies (batch, data base ...), hardware configurations, and different chronological ages. As managers try to 'forge' these hybrids into homogeneous, integrated systems, they are discovering that integration means more than simply the physical compatibility of its component parts. Standardized protocols, gateway machines, and translation strategies can help, but after all of these 'physical' incompatibilities are resolved, a much larger problem, the problem of informational and organizational integration, becomes apparent. When the practitioner of large BUSINESS systems design attempts to apply readily available theory to these issues, he finds that his challenge is not only to 'solve' data management problems, but to 'uncover' what the real problems are when applied against the backdrop of business organizations and their often contradictory informational needs. The author proposes that this problem definition process can be greatly facilitated by a better understanding of the business's organizational environment and its relationship to the computer environment. By examining the organization itself as an information gathering/distributing system, the individuals' needs for information are described in terms of information requirements. Computer system organization, data storage, and access methods are then evaluated in terms of their ability to meet those informational needs efficiently. Finally, an approach to large system integration based upon these informational requirements is proposed.

Journal ArticleDOI
Mike Sage
TL;DR: The specific tools required to integrate Ada with embedded microprocessor hardware configurations are outlined, the trade-offs between standard board designs and customized architectures are analyzed, and how an Ada development workstation, Multibox, can support hardware/software integration is described.