
Showing papers on "Application software published in 1979"


Journal ArticleDOI
TL;DR: A linear programming method for security dispatch and emergency control calculations on large power systems is presented, which works directly with the normal power-system variables and limits, and incorporates the usual sparse matrix techniques.
Abstract: A linear programming (LP) method for security dispatch and emergency control calculations on large power systems is presented. The method is reliable, fast, flexible, easy to program, and requires little computer storage. It works directly with the normal power-system variables and limits, and incorporates the usual sparse matrix techniques. An important feature of the method is that it handles multi-segment generator cost curves neatly and efficiently.
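As a hedged illustration of the kind of LP formulation involved, the sketch below dispatches two hypothetical generators with multi-segment (piecewise-linear) cost curves against a single load-balance constraint; the generator data, segment costs, and use of SciPy are assumptions for the example, not details from the paper.

# Minimal sketch (not the paper's algorithm): economic dispatch with
# multi-segment generator cost curves posed as a linear program.
# Generator data and costs below are invented for illustration.
from scipy.optimize import linprog

# Each generator: list of (segment_capacity_MW, marginal_cost_per_MWh),
# with marginal costs increasing so the LP fills cheap segments first.
generators = {
    "G1": [(50, 10.0), (30, 14.0), (20, 20.0)],
    "G2": [(40, 12.0), (40, 16.0)],
}
load_mw = 150.0

costs, bounds = [], []
for segments in generators.values():
    for capacity, marginal_cost in segments:
        costs.append(marginal_cost)      # objective coefficient per segment
        bounds.append((0.0, capacity))   # segment output limited to its capacity

# Single equality constraint: all segment outputs must sum to the load.
res = linprog(c=costs, A_eq=[[1.0] * len(costs)], b_eq=[load_mw],
              bounds=bounds, method="highs")
print("total cost:", res.fun)
print("segment dispatch:", res.x)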

396 citations


Journal ArticleDOI
01 Feb 1979

82 citations


Proceedings ArticleDOI
06 Nov 1979
TL;DR: In this paper, measures for estimating the stability of a program and the modules of which the program is composed are presented, and an algorithm for computing these stability measures is given.
Abstract: Software maintenance has been the dominant factor contributing to the high cost of software. In this paper, the software maintenance process and the important software quality attributes that affect the maintenance effort are discussed. Among these quality attributes, the stability of a program, which indicates the resistance to the potential ripple effect that the program would have when it is modified, is an important one. Measures for estimating the stability of a program and the modules of which the program is composed are presented, and an algorithm for computing these stability measures is given. Application of these measures during the maintenance phase is discussed along with an example. Further research efforts involving validation of the stability measures, application of these measures during the design phase, and restructuring based on these measures are also discussed.
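As a loose illustration of the ripple-effect idea (not the stability measures actually defined in the paper), the sketch below treats a program as a module dependency graph and scores a module as less stable the more modules a change to it can reach; the module names and the scoring formula are invented.

# Illustrative sketch only: the paper defines specific stability measures,
# which are not reproduced here. This shows the underlying idea: the more
# modules a change can ripple to, the less stable the changed module.
from collections import deque

# Hypothetical module dependency graph: edges point from a module to the
# modules that depend on (and may be affected by) it.
affects = {
    "parser":    ["typecheck", "codegen"],
    "typecheck": ["codegen"],
    "codegen":   ["linker"],
    "linker":    [],
    "report":    [],
}

def ripple_set(module):
    """Modules potentially affected (transitively) by a change to `module`."""
    seen, queue = set(), deque(affects[module])
    while queue:
        m = queue.popleft()
        if m not in seen:
            seen.add(m)
            queue.extend(affects[m])
    return seen

def stability(module):
    """Crude stability score in (0, 1]: fewer ripple targets means more stable."""
    return 1.0 / (1.0 + len(ripple_set(module)))

for m in affects:
    print(m, round(stability(m), 2), sorted(ripple_set(m)))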

75 citations


Proceedings ArticleDOI
25 Jun 1979
TL;DR: Although design languages have been in existence since the early 1960's, only in the last few years has there been a concerted effort to bring them into the design process as a useful tool.
Abstract: Although design languages have been in existence since the early 1960's, only in the last few years has there been a concerted effort to bring them into the design process as a useful tool. The major applications of design languages are:
1. Description of the behavior and/or structure of a system as a means for accurately communicating design details between designers and users.
2. As the input to a system-level simulator.
3. As the input to an automatic hardware compiler.
4. As the input to a formal verification system.
Several attempts have been made to integrate hardware and software design. Examples are the LOGOS system [Ro76] developed at Case Western Reserve University and the SARA system developed at the University of California at Los Angeles [Es77].

54 citations


Proceedings ArticleDOI
01 Jan 1979
TL;DR: Some novel algorithms for pattern recognition research and a framework for efficient development, maintenance, and sharing of interactive software among several users and diverse application areas are described.

Abstract: This paper, in two parts, describes some novel algorithms for pattern recognition research and a framework for efficient development, maintenance, and sharing of interactive software among several users and diverse application areas. This modular interactive software system (MISS) forms the basis of a general-purpose image analysis and pattern recognition research system (IPS) implemented in the Macdonald Stewart Biomedical Image Processing Laboratory at McGill University. The first part of the paper discusses the algorithms and some preliminary results. Two algorithms are singled out. The first is an interactive approach to nonparametric feature selection via two-dimensional mapping of the multidimensional minimal spanning tree of the features in pattern space. Some preliminary results of the performance of the algorithm, in the automatic mode, applied to feature selection for cervical cell classification, are presented. The second algorithm is an exact procedure for condensing the training data, in the nearest neighbor decision rule, which yields a minimal set of points that implements precisely the original nearest neighbor decision boundary. The second part of the paper describes the MISS and IPS software systems. The MISS software implementation framework ensures software compatibility and sharing among many individuals and diverse applications, provides a safeguard against software loss, and supports an extendable high-level interactive language with on-line documentation. MISS language support includes a BASE LANGUAGE interpreter (implementing a variant of FORTRAN) plus an EXTENDED LANGUAGE interpreter that facilitates the addition of new groups of language statements. Each group of statements is associated with a particular function, application area, or programmer. All IPS software has been implemented within the MISS framework. The present IPS implementation includes over 300 EXTENDED LANGUAGE statements in twenty groups facilitating such functions as: image acquisition and display, simulation of a hardware image processor, data management, image manipulation and filtering, graphics, image segmentation and feature extraction, feature selection, classification, and classification performance measurement. The overall design philosophy of the MISS and IPS software systems and the ease with which new software can be added and documented are described.
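The condensing idea can be illustrated with a short sketch. This follows the classic Hart-style condensed nearest-neighbor heuristic rather than the exact minimal procedure the paper describes; the data points, labels, and function names are invented for the example.

# Sketch of Hart-style condensing for the 1-NN rule; illustrative only, since
# the paper describes an exact procedure yielding a provably minimal set,
# which this simple heuristic does not guarantee.
import math

def nearest_label(point, prototypes):
    return min(prototypes, key=lambda p: math.dist(point, p[0]))[1]

def condense(training):
    """training: list of ((x, y), label). Returns a reduced prototype list that
    classifies every training point the same way 1-NN on the full set does."""
    prototypes = [training[0]]
    changed = True
    while changed:
        changed = False
        for point, label in training:
            if nearest_label(point, prototypes) != label:
                prototypes.append((point, label))
                changed = True
    return prototypes

data = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((5, 6), "b"), ((2.4, 2.4), "a")]
print(condense(data))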

25 citations


Journal ArticleDOI
TL;DR: This research applies to: a) the setting of threshold values of complexity in software production in order to avoid undue difficulty with program debugging; b) the use of complexity as an index for allocating resources during the test phase of software development; c) the use of complexity for developing test strategies and the selection of test data.

Abstract: Several research studies have shown a strong relationship between program complexity, as measured by the structural properties of a program, and its error properties, as measured by the number and types of errors and by error detection and correction times. This research applies to: a) the setting of threshold values of complexity in software production in order to avoid undue difficulty with program debugging; b) the use of complexity as an index for allocating resources during the test phase of software development; c) the use of complexity for developing test strategies and the selection of test data. Application c) uses the directed-graph representation of a program and its complexity measures to decompose the program into its basic constructs. The identification of the constructs serves to identify a) the components of the program which must be tested, and b) the test data which are needed to exercise these components. Directed-graph properties which apply to program development and testing are defined; examples of the application of graph properties for program development and testing are given; the results of program complexity and error measurements are presented; and a procedure for complexity measurement and its use in programming and testing is summarized.
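The paper's own complexity measures are not reproduced here, but as a hedged example of a directed-graph complexity metric, the sketch below computes McCabe's cyclomatic number v(G) = E - N + 2P for a small hypothetical control-flow graph; the graph and routine structure are invented.

# Sketch: one widely used directed-graph complexity measure, McCabe's
# cyclomatic number v(G) = E - N + 2P, where E is the number of arcs, N the
# number of nodes, and P the number of connected components.
def cyclomatic_complexity(edges, num_components=1):
    """edges: list of (src, dst) arcs of a program control-flow graph."""
    nodes = {n for e in edges for n in e}
    return len(edges) - len(nodes) + 2 * num_components

# Hypothetical control-flow graph of a routine with one if/else and one loop.
cfg = [("entry", "if"), ("if", "then"), ("if", "else"),
       ("then", "loop"), ("else", "loop"),
       ("loop", "loop"), ("loop", "exit")]
print(cyclomatic_complexity(cfg))  # -> 3: two decision points plus one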

21 citations


Journal ArticleDOI
W.E. Fischer
TL;DR: PHIDAS is an application-independent database management system used together with a device-independent software package for graphic I/O within the integrated CAD/CAM system PHILIKON, which covers the detailing of sheet-metal parts, tool design, and NC programming.

Abstract: PHIDAS is an application-independent database management system used together with a device-independent software package for graphic I/O within the integrated CAD/CAM system PHILIKON, which covers the detailing of sheet-metal parts, tool design, and NC programming. All modules are linked together via the database, which acts as the centre of a growing system. The architecture of PHIDAS is in accordance with the ANSI three-schema concept, having a conceptual schema and an external subschema based on the network model of CODASYL, and an internal schema for the physical tuning which is particularly suited for engineering databases. The system is explained as it is used in PHILIKON.
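As a rough illustration of the CODASYL network-model building block PHIDAS is based on, the sketch below models a "set" as an owner record linked to an ordered list of member records; the record and set names are invented and nothing here is taken from the PHIDAS implementation.

# Illustrative sketch only: a CODASYL-style set links one owner record to
# many member records, as in the network data model.
class Record:
    def __init__(self, **fields):
        self.fields = fields
        self.members = {}          # set name -> ordered list of member records

    def connect(self, set_name, member):
        """Attach a member record to this owner under the named set."""
        self.members.setdefault(set_name, []).append(member)
        member.fields["owner_" + set_name] = self

# A sheet-metal part owning the geometry elements that describe it.
part = Record(name="bracket-17")
for i in range(3):
    part.connect("PART-GEOMETRY", Record(kind="line", seq=i))

print([m.fields["seq"] for m in part.members["PART-GEOMETRY"]])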

21 citations


Proceedings ArticleDOI
15 May 1979
TL;DR: The particular array processor which was studied performs network solutions several times faster than general purpose large mainframe digital computers and is available at a fraction of the cost.
Abstract: The application of an array processor to solution of power system network equations is discussed and results from its performance evaluation are presented. The particular array processor which was studied performs network solutions several times faster than general purpose large mainframe digital computers and is available at a fraction of the cost. It has an immediate application for decreasing the computer costs and improving the responsiveness of programs which are used in power system planning and operations.
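As a hedged sketch of the kind of computation such an array processor accelerates, the example below factorizes a tiny sparse nodal admittance matrix once and reuses the factors for repeated solves; the matrix values are invented and SciPy merely stands in for the array-processor library.

# Sketch of the repeated network solution Y * V = I using a sparse LU
# factorization; values are invented for illustration.
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

# Tiny 3-bus nodal admittance matrix (strictly diagonally dominant, hence
# nonsingular); a real Y-bus would be much larger and sparser.
y_bus = csc_matrix(np.array([[10.5, -5.0, -5.0],
                             [-5.0,  9.2, -4.0],
                             [-5.0, -4.0,  9.3]]))
injections = np.array([1.0, 0.5, -1.5])

lu = splu(y_bus)                 # factorize once ...
voltages = lu.solve(injections)  # ... then repeat solves cheaply
print(voltages)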

20 citations


Journal ArticleDOI
TL;DR: This tutorial analyzes developments in computer network architectures from a top-down design viewpoint—starting with user interface requirements, then developing a structure to realize that interface.
Abstract: This tutorial analyzes developments in computer network architectures from a top-down design viewpoint—starting with user interface requirements, then developing a structure to realize that interface.

19 citations


Journal ArticleDOI
Joel
TL;DR: Circuit switching systems predate computers, and computer engineers may learn something from a look at their architecture.
Abstract: Circuit switching systems predate computers. Partly for that reason, computer engineers may learn something from a look at their architecture.

16 citations


Journal ArticleDOI
TL;DR: The articles in this issue of Computer analyze programming methodologies and tools developed to reduce the cost of producing software from a human factors viewpoint, using the tools of human factors engineering to determine if the use of a particular aid increases programming performance or the quality of the resulting software.
Abstract: Software engineering is an effort to reduce the cost of producing software by raising programming productivity and lowering maintenance effort and to increase the benefits of software by extending application areas and improving service. Human factors considerations can have a major impact on the software development process and the quality of the software produced. High-level languages have increased programmer productivity by removing from the programmer the burden of remembering what values are in what register and placing it on a software system which is much more suited to managing detail. Other programming methodologies and tools have been developed to reduce the cost of producing software: design methodologies, structured programming, chief programmer teams, disciplined coding conventions, etc. The efficacy of these software aids has been demonstrated mostly by case studies. The articles in this issue of Computer analyze some of these methodologies and tools from a human factors viewpoint, using the tools of human factors engineering to determine if the use of a particular aid increases programming performance or the quality of the resulting software.

Journal ArticleDOI
TL;DR: Some computer architectures are inherently more efficient than others, and three parameters help system specifiers and designers pick the best, independent of hardware.
Abstract: Some computer architectures are inherently more efficient than others, and three parameters help system specifiers and designers pick the best, independent of hardware. The winners are likely to become military standards.


Proceedings ArticleDOI
15 May 1979
TL;DR: An evaluation of an array processor, the AP-120B manufactured by Floating Point Systems, Inc., for solving transient stability and similar problems is given in this paper. But the evaluation is limited to a single host computer.
Abstract: An evaluation is given of an array processor, the AP-120B manufactured by Floating Point Systems, Inc., for solving transient stability and similar problems. The paper describes the salient characteristics of the array processor, explains the evaluation procedure, and discusses results. With the particular host computer used, a 20 to 1 speedup in stability simulation can be obtained.

Journal ArticleDOI
F. Matos
01 Sep 1979

Journal ArticleDOI
TL;DR: This paper discusses and compares several synthesis methods for electricity distribution networks, assessing the validity of the feasibility constraints used in the various methods and special reference is made to heuristics based on group-transfer concepts.
Abstract: This paper discusses and compares several synthesis methods for electricity distribution networks, assessing the validity of the feasibility constraints used in the various methods. Established methods, together with some original ideas, are explained; special reference is made to heuristics based on group-transfer concepts. The algorithms have been applied to data extracted from case studies to simulate a green-field synthesis of network configuration, so that the results are comparable with existing network configurations. The results were produced using an interactive computing system.
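The group-transfer heuristics themselves are not reproduced here; as a very rough starting point for a green-field synthesis, the sketch below connects hypothetical load points to a source with a minimum spanning tree (Prim's algorithm), after which feasibility constraints such as thermal limits and voltage drop would still have to be checked. The site coordinates are invented.

# Naive illustration only: a spanning-tree skeleton for a green-field
# distribution network, not any of the paper's synthesis methods.
import math

def mst_prim(points, source=0):
    """Return tree edges connecting all load points to the source (Prim)."""
    in_tree, edges = {source}, []
    while len(in_tree) < len(points):
        best = min(((i, j) for i in in_tree for j in range(len(points))
                    if j not in in_tree),
                   key=lambda e: math.dist(points[e[0]], points[e[1]]))
        edges.append(best)
        in_tree.add(best[1])
    return edges

# Hypothetical substation (index 0) and load points, coordinates in km.
sites = [(0, 0), (1, 2), (2, 1), (4, 3), (3, 0)]
print(mst_prim(sites))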

Journal ArticleDOI
TL;DR: A control and data acquisition system has been prepared for the PIGMI program's prototype accelerator that features a central minicomputer connected to a distributed array of microprocessor based local control points via fast parallel links.
Abstract: A control and data acquisition system has been prepared for the PIGMI program's prototype accelerator. The system configuration features a central minicomputer connected to a distributed array of microprocessor based local control points via fast parallel links. A powerful microprocessor based operator's console provides dynamic control of all accelerator parameters while displaying in real time data taken at the local control points. In addition to centralized control, all basic control functions can be executed locally if necessary. While a special operating system was written for the micros, the central computer system primarily runs under its own commercially supplied real-time multitasking system. All applications programs are written in FORTRAN. Details of the control philosophy, the hardware and software are discussed.
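As a purely illustrative sketch of the configuration described (not the PIGMI software, which runs under a commercial real-time executive with applications in FORTRAN), the example below shows a central program polling stand-in local control points and collecting readings for display; all names and parameters are invented.

# Sketch: central computer polling microprocessor-based local control points
# and gathering readings for an operator's console display.
import random, time

class LocalControlPoint:
    """Stand-in for a microprocessor station reached over a fast parallel link."""
    def __init__(self, name):
        self.name = name
    def read_parameters(self):
        return {"vacuum": random.random(), "rf_level": random.random()}

stations = [LocalControlPoint(f"LCP-{i}") for i in range(3)]

def poll_once():
    """One polling pass: snapshot of every station's parameters."""
    return {s.name: s.read_parameters() for s in stations}

for _ in range(2):          # a real system would loop continuously
    print(poll_once())
    time.sleep(0.1)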

01 Sep 1979
TL;DR: This report is a survey of nonprocedural communication between users and application software in interactive data-processing systems; it describes the main features of interactive systems and classifies the potential users of application software.
Abstract: This report is a survey of nonprocedural communication between users and application software in interactive data-processing systems. It includes a description of the main features of interactive systems, a classification of the potential users of application software, and a definition of the nonprocedural interface. Nonprocedural languages are classified into a number of broad groups and illustrated with examples. Finally, future trends in user-computer interfaces and possible developments in manager-oriented languages are discussed.

Journal ArticleDOI
TL;DR: Three different structures of microcomputer systems consisting of two parallel, diversified hardware units are described, with failure detection capability as the goal of the described system architecture.
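As a minimal sketch of the failure-detection idea behind such structures, the example below runs the same computation on two deliberately different implementations and raises a fault when their results disagree; the functions and the tolerance are invented, not taken from the paper.

# Sketch: diversified duplication for failure detection by result comparison.
def unit_a(x):
    return 2.0 * x + 1.0          # "diverse" implementation A

def unit_b(x):
    return x + x + 1.0            # "diverse" implementation B of the same function

def checked_output(x, tol=1e-9):
    a, b = unit_a(x), unit_b(x)
    if abs(a - b) > tol:
        raise RuntimeError("disagreement between units: possible hardware fault")
    return a

print(checked_output(3.0))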

Journal ArticleDOI
TL;DR: A general description is given of a recently installed general-purpose laboratory computer system currently in use at the University of Pennsylvania tandem Van de Graaff laboratory; the system employs a dual-processor concept.
Abstract: A general description of a recently installed general-purpose laboratory computer system currently in use at the University of Pennsylvania tandem Van de Graaff laboratory is given. The system employs a dual-processor concept: a PDP-11/55 serves as the core of a multi-user system, while real-time data acquisition is handled by a microprogrammed branch driver (MBD). Applications to the acquisition and analysis of data from nuclear physics experiments are discussed.

Journal ArticleDOI
Bork
TL;DR: The examples and methods described here have all been used in large introductory courses and are designed to be integrated with other educational materials.
Abstract: For successful classroom use, simulations must be integrated with other educational materials. The examples and methods described here have all been used in large introductory courses.

Journal ArticleDOI
TL;DR: Intelligent microprocessor based interfaces are used to relieve the main control computers of time consuming tasks requiring real-time response, and applications include RF phase-shifters and attenuators, and motor-driven variacs.
Abstract: Intelligent microprocessor-based interfaces are used to relieve the main control computers of time-consuming tasks requiring real-time response. The basic system consists of a Zilog Z80 CPU, CESR control system interface, read-only and read-write memory, and input/output address decode circuitry. One application employs inexpensive potentiometers to provide a cost-effective operator-machine interface with excellent response. It uses a 64-channel analog-to-digital converter to scan 30 two-gang potentiometers, calculating the change in position of each knob 60 times a second. A second application uses multiprogramming techniques to achieve separate position setting of ten motor-controlled devices with adaptive feedback. The controller can accept high-level commands to let the microprocessor guide the device to its destination, or low-level commands to let the main computer retain complete control of the device. Applications include RF phase-shifters and attenuators, and motor-driven variacs.
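A hedged sketch of the knob-scanning application is given below: it reads a bank of potentiometer channels through a stand-in ADC function and reports how far each knob has moved since the previous scan, which is the per-pass computation in the 60-times-a-second loop. The channel count, ADC resolution, and function names are assumptions for the example.

# Sketch: one scan pass over potentiometer channels, producing per-knob deltas.
import random

NUM_CHANNELS = 60                      # 30 two-gang potentiometers

def adc_read(channel):
    """Stand-in for the multi-channel ADC; returns an assumed 12-bit reading."""
    return random.randint(0, 4095)

previous = [adc_read(ch) for ch in range(NUM_CHANNELS)]

def scan_once():
    """One pass of the scan loop: return how far each knob moved."""
    global previous
    current = [adc_read(ch) for ch in range(NUM_CHANNELS)]
    deltas = [c - p for c, p in zip(current, previous)]
    previous = current
    return deltas

print(scan_once()[:5])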

Journal ArticleDOI
13 Aug 1979
TL;DR: Though the code-production system was developed to support a particular benchmarking approach, it should also be useful in other modeling situations and might be of interest in any field where readability, reliability, ease of maintenance, and economy of programming effort are considered important.
Abstract: The author has recently developed a new methodology of benchmarking, which is being applied to a procurement in which (a) a single integrated interactive application is to span a distributed configuration of computing hardware, (b) the configuration is unknown when the benchmark is being developed, and (c) the application software will be written after the benchmark has been run. The buyer prepares a simulation model of the intended application in the form of programs that will run on the hardware being benchmarked. Each competing vendor is expected to tune the performance of this model to the hardware configuration that he has proposed, so he will require several versions of the model. This presents the buyer with a formidable software-production problem, which is further complicated by a requirement for extreme flexibility and reliability. The paper addresses the software-production problem and describes its solution. The solution was to develop an automated code-production system based on two principal design features. First, the model and its translator are both written in the same language; secondly, the common language is selected on the basis of readability and extensibility. The paper examines why this approach to the code-production problem was successful. Though the code-production system was developed to support a particular benchmarking approach, it should also be useful in other modeling situations. Indeed it might be of interest in any field where readability, reliability, ease of maintenance, and economy of programming effort are considered important.
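As a hedged miniature of the code-production idea (shown in Python rather than the language used in the paper), the sketch below expands a parameterized model template into a concrete, runnable program for each vendor configuration; the template contents and configuration parameters are invented.

# Sketch: template-driven production of per-configuration versions of a model.
from string import Template

MODEL_TEMPLATE = Template("""
def transaction_mix():
    # $vendor configuration: $terminals terminals, think time $think_time s
    return {"terminals": $terminals, "think_time": $think_time}
""")

configurations = {
    "vendor_a": {"vendor": "vendor_a", "terminals": 40, "think_time": 3.0},
    "vendor_b": {"vendor": "vendor_b", "terminals": 25, "think_time": 5.0},
}

for name, params in configurations.items():
    source = MODEL_TEMPLATE.substitute(params)
    module_namespace = {}
    exec(source, module_namespace)       # build the generated model version
    print(name, module_namespace["transaction_mix"]())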

Journal ArticleDOI
Victor J. Maggioli
TL;DR: The capability of the special-purpose computer called the programmable controller (PC) has been increased to the point where it has to be considered a viable alternative when selecting a controller to perform operational logic functions in chemical control applications.
Abstract: The capability of the special-purpose computer called the programmable controller (PC) has been increased to the point where it has to be considered a viable alternative when selecting a controller to perform operational logic functions in chemical control applications. Guidelines used in the application of PC's in process control are described. The advent of the PC in chemical process applications provides new challenges for the manufacturer, application engineer, and user. The position the PC presently holds in the control family hierarchy (e.g., relay sequencers, logic, microcomputers, direct digital control, minicomputers, etc.) is defined.

Journal ArticleDOI
30 May 1979
TL;DR: Several techniques for implementing generalized software packages in APL are considered, including data-driven code generation, and the use of software templates, an APL macrolanguage, and an APL macroprocessor.
Abstract: Several techniques for implementing generalized software packages in APL are considered. One technique in particular, data-driven code generation, is expanded upon. When this technique is employed, application software is written in template form. These templates are then compiled into APL objects when an individual application is configured. Some areas which are investigated are the use of software templates, an APL macrolanguage, and an APL macroprocessor. Specific examples relevant to data base management systems are presented.
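As a rough analogue of data-driven code generation (shown in Python rather than APL), the sketch below lets a record-layout description drive the generation of accessor functions when an application is configured; the layout, field names, and generated function names are invented.

# Sketch: a data description drives generation of accessor code at
# configuration time, loosely mirroring templates compiled into objects.
record_layout = {"name": 0, "quantity": 1, "price": 2}   # invented example

generated = {}
for field, index in record_layout.items():
    source = f"def get_{field}(record): return record[{index}]"
    exec(source, generated)               # instantiate the template

row = ("widget", 4, 9.95)
print(generated["get_name"](row), generated["get_price"](row))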

Journal ArticleDOI
TL;DR: The Basic Experimental Switching System (BESS) is designed as a functionally distributed control switching system accommodating intelligent terminals.

Abstract: Microprocessor application to a switching system, especially to telephone terminals, is expected to change switching system architecture. From this viewpoint, the Basic Experimental Switching System (BESS) is designed as a functionally distributed control switching system accommodating intelligent terminals. This paper describes BESS hardware and software structures. BESS has the following characteristics. 1) Microprocessor-controlled terminals can store and forward dialed digits and can exchange data with one another; almost all services can be realized merely by a program stored in each terminal itself. 2) A functionally distributed controller of only three untailored microprocessors can handle 100 basic calls. 3) Program flexibility is achieved by separating the exchange program into distinct algorithm, data management, and network control parts.
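A minimal sketch of characteristic 1) is given below: a terminal collects dialed digits locally and forwards the complete number to the exchange. The class names and the assumed seven-digit number length are invented, not taken from BESS.

# Sketch: store-and-forward digit collection in an intelligent terminal.
class Exchange:
    def setup_call(self, number):
        print("setting up call to", number)

class IntelligentTerminal:
    def __init__(self, exchange):
        self.exchange = exchange
        self.digits = []

    def key_press(self, digit):
        self.digits.append(digit)
        if len(self.digits) == 7:                 # assumed local number length
            self.exchange.setup_call("".join(self.digits))
            self.digits.clear()

term = IntelligentTerminal(Exchange())
for d in "5551234":
    term.key_press(d)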

Proceedings ArticleDOI
06 Nov 1979
TL;DR: The influence of estimation errors for system parameters on the resulting system fault performance is examined, and the results are applied to the problem of error mode testing: finding the underlying error structure of the system.
Abstract: Inference techniques are applied to computing systems to improve the allocation of resources for fault tolerant performance. Using a general model for such systems, the influence of estimation errors for system parameters on the resulting system fault performance is examined. These results are then applied to the problem of error mode testing: finding the underlying error structure of the system. Simulation is used to illustrate the properties discussed.
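As a toy illustration of how estimation error can degrade fault-tolerant resource allocation (not the paper's model), the sketch below perturbs component failure-probability estimates, allocates a fixed pool of spares to the components that appear worst, and measures the resulting system unreliability against the true probabilities; all numbers are invented.

# Toy Monte-Carlo sketch: estimation error versus resulting fault performance.
import random

TRUE_FAILURE = [0.02, 0.10, 0.05, 0.20]   # invented per-component probabilities
SPARES = 2

def system_unreliability(assigned_spares):
    # A component with s spares fails only if the primary and all spares fail.
    prob_ok = 1.0
    for p, s in zip(TRUE_FAILURE, assigned_spares):
        prob_ok *= 1.0 - p ** (1 + s)
    return 1.0 - prob_ok

def allocate(noise):
    """Give one spare each to the components that *appear* worst under noisy estimates."""
    estimates = [p * random.uniform(1 - noise, 1 + noise) for p in TRUE_FAILURE]
    worst = sorted(range(len(estimates)), key=lambda i: -estimates[i])[:SPARES]
    return [1 if i in worst else 0 for i in range(len(TRUE_FAILURE))]

for noise in (0.0, 0.5, 1.0):
    runs = [system_unreliability(allocate(noise)) for _ in range(2000)]
    print(f"noise {noise:.1f}: mean unreliability {sum(runs) / len(runs):.4f}")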

Journal ArticleDOI
TL;DR: The basic principles of program generators with respect to their use in process control systems are surveyed; of particular interest is enabling the end-user to "describe" as well as to maintain his application himself.

Proceedings ArticleDOI
06 Nov 1979
TL;DR: This work explores three methods of combining the systems: adding file access to an existing data base, adding data base access to an existing file system, and completely integrating file management and data base management into a single data management system.
Abstract: Existing computer systems frequently provide two I/O systems: a (language) file system and a data base management system. Generally, little interaction between the systems is available to applications, in the sense that data accessed through one cannot be accessed through the other. We explore three methods of combining the systems: adding file access to an existing data base, adding data base access to an existing file system, and completely integrating file management and data base management into a single data management system. Motivations for all three methods are supplied. Implications for existing or newly designed software components are explored.
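As a hedged sketch of the first method, adding file access to an existing data base, the example below wraps a database table in a sequential, file-like read interface; sqlite3 stands in for the data base management system and the table layout is invented.

# Sketch: sequential, read-only "file" view over rows of a database table.
import sqlite3

class DbFile:
    def __init__(self, connection, table):
        # Table name is fixed at construction in this sketch.
        self.cursor = connection.execute(f"SELECT line FROM {table} ORDER BY rowid")

    def readline(self):
        row = self.cursor.fetchone()
        return row[0] if row else ""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE report (line TEXT)")
conn.executemany("INSERT INTO report VALUES (?)", [("first line",), ("second line",)])

f = DbFile(conn, "report")
print(f.readline())
print(f.readline())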

Proceedings ArticleDOI
06 Nov 1979
TL;DR: The design philosophy of two software systems, the Modular Interactive Software System MISS and IPS, is discussed, and some of the requirements these software systems are designed to meet are examined.
Abstract: In this part of the paper, the design philosophy of two software systems is discussed. The first, the Modular Interactive Software System MISS [1], is a general-purpose interactive software system implementation framework within which purpose-specific interactive software systems can be implemented. In addition to providing a framework within which to implement interactive software systems, the MISS system includes a software kernel which supports a high-level interactive language plus several utility programs which facilitate the addition and documentation of new software modules implementing new statements in the interactive language. The second software system described here is a general-purpose image analysis and pattern recognition research system, IPS [1,2]. The IPS system is written within the MISS framework and consists of close to 300 extensions to the base-level interactive language supported by the MISS system. Prior to a more detailed description of MISS and IPS, let us examine some of the requirements these software systems are designed to meet.
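As a hedged miniature of the MISS idea (names and statements invented), the sketch below shows a small interpreter with a base statement set and a registration hook through which a new group of statements, such as an image-processing group, can be added without changing the kernel.

# Sketch: an extensible interactive-language kernel with pluggable statement groups.
class Interpreter:
    def __init__(self):
        self.statements = {"PRINT": lambda arg: print(arg)}   # base language

    def register_group(self, group):
        """Add a group of statements in the spirit of an EXTENDED LANGUAGE group."""
        self.statements.update(group)

    def execute(self, line):
        verb, _, arg = line.partition(" ")
        self.statements[verb](arg)

image_group = {
    "THRESHOLD": lambda arg: print("thresholding at", arg),
    "DISPLAY":   lambda arg: print("displaying", arg),
}

interp = Interpreter()
interp.register_group(image_group)
interp.execute("PRINT hello")
interp.execute("THRESHOLD 128")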