
Showing papers on "Application software published in 1983"


Journal ArticleDOI
TL;DR: Large, high-speed OCR systems designed to displace several hundred data entry specialists are gradually giving way to devices that can be cost-justified even in organizations employing only a few data entry personnel.
Abstract: Eyes are everywhere, scrutinizing documents, blood slides, even pizza crusts. With this potpourri of applications, scanners may become as commonplace as today's photographic camera. Optical scanners, or scanning digitizers, are used to convert a picture into an array of numbers representing the positional distribution of optical density within the picture. The relation between the computer-readable output of the optical scanner and the original image is critical to automatic pattern recognition and digital image processing. Optical scanners can be thought of as eyes for computers. They are used in optical character recognition; computer-aided design and drafting; biomedical applications; geographic data processing, including remote sensing; facsimile communications; printing and publishing; and experimental physical science. (See center insert on applications.) Commercially, the most widespread application of optical scanners is optical character recognition. Virtually all earlier products were based on mechanical scanners, but the current trend is solid-state transducers. At the same time, the implementation of recognition algorithms is shifting from special-purpose, hard-wired logic to microprocessor-based software. Consequently, large, high-speed OCR systems designed to displace several hundred data entry specialists are gradually giving way to devices that can be cost-justified even in organizations employing only a few data entry personnel. OCR input devices may eventually be attached to even the smallest computer systems, complementing the standard keyboard for alphanumeric entry. To recognize a single character, we need only examine a 30 x 30 picture element array; the exact spatial relation between successive characters is unimportant. Optical scanners designed for character recognition are, therefore, generally not suitable for image input. Image scanners are, however, finding their way into OCR systems designed for less restrictive applications, such as the processing of documents containing gray-scale illustrations, line drawings, and a mix of type fonts. Engineering applications require the digitization of line drawings with predominantly straight-line segments, such as detail, assembly, structural, and architectural drawings; logic, circuit, and wiring diagrams; printed circuit, wafer, and chip layouts; and utility maps (in-plant and cross-country cabling, piping and pipelines, conduits, ducts, transmission lines). Ink on mylar has sufficient contrast to allow use of a scanner with only binary (black and white) amplitude output, but variations in reflectance make pencil drawings difficult to scan. Many engineering applications require a scanner capable of handling a wide range of drawing sizes and a mix of line and alphabetic information. A typical floor plan for a telephone company central …
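
As a rough illustration of the character-recognition case above, the sketch below treats a character as a 30 x 30 binary picture-element array and matches it against stored templates. The threshold, the template set, and the nearest-template rule are illustrative assumptions, not the recognition algorithms discussed in the article.

```python
# Minimal sketch: recognizing a character from a 30 x 30 binary pixel array
# by nearest-template matching. Thresholds and templates are illustrative only.
import numpy as np

def binarize(gray_scan: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Reduce a gray-scale scan to the binary (black/white) output the article
    describes for high-contrast originals such as ink on mylar."""
    return (gray_scan < threshold).astype(np.uint8)   # 1 = dark (ink), 0 = background

def recognize(cell: np.ndarray, templates: dict) -> str:
    """Match one 30 x 30 character cell against stored 30 x 30 templates and
    return the label whose template differs in the fewest pixels."""
    assert cell.shape == (30, 30)
    return min(templates, key=lambda ch: np.count_nonzero(cell != templates[ch]))

# Usage: real templates would be derived from scanned font samples.
templates = {"O": np.zeros((30, 30), np.uint8), "I": np.zeros((30, 30), np.uint8)}
templates["O"][5:25, 5:25] = 1      # crude filled blob standing in for 'O'
templates["I"][5:25, 13:17] = 1     # vertical bar standing in for 'I'
print(recognize(templates["I"], templates))   # -> 'I'
```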

28 citations


Journal ArticleDOI
TL;DR: Does conventional human factors wisdom meet the needs of a state-of-the-art computer graphics display system?
Abstract: Does conventional human factors wisdom meet the needs of a state-of-the-art computer graphics display system? This case study suggests that human factors lags behind computer technology.

18 citations


Proceedings ArticleDOI
Y. Wu
14 Apr 1983
TL;DR: This is the first operational signal processing software development system of its kind which enables the system designer to program in signal processing graph notations and will lead to the interface to VLSI cell libraries for direct mask generation.
Abstract: A signal processing problem of moderate complexity must be segmented into "smaller" algorithms, intimately linked, to be executed by processing resources. Problems arising from such partitioning and interconnection of these constituent parts are not well understood. The problem is further complicated by the parallel execution of the elementary operations (primitives), which requires the formation of queues and synchronization of each queue with respect to each operation. A software development system has been completed to program and to design signal processor systems for underwater acoustics applications. This is the first operational signal processing software development system of its kind which enables the system designer to program in signal processing graph notations. The implied hierarchical architecture will lead to an interface to VLSI cell libraries for direct mask generation. A complete computer aided design approach to closely-coupled signal processing systems is described.
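
The queue-and-synchronization issue can be pictured with a toy dataflow graph in which each primitive fires only when every one of its input queues holds a block of data. This is a minimal sketch of the general idea, not the authors' graph-notation system; the node structure and the scheduler are invented for illustration.

```python
# Hypothetical two-node signal-flow graph with queues between primitives.
from collections import deque

class Node:
    def __init__(self, name, func, inputs, output):
        self.name, self.func, self.inputs, self.output = name, func, inputs, output
    def ready(self):
        return all(q for q in self.inputs)          # every input queue non-empty
    def fire(self):
        block = [q.popleft() for q in self.inputs]  # consume one block per input
        self.output.append(self.func(*block))

q_in, q_mid, q_out = deque(), deque(), deque()
graph = [Node("scale", lambda x: [2 * v for v in x], [q_in], q_mid),
         Node("sum",   lambda x: sum(x),             [q_mid], q_out)]

q_in.append([1.0, 2.0, 3.0])                        # one block of samples arrives
while any(n.ready() for n in graph):                # simple data-driven scheduler
    for n in graph:
        if n.ready():
            n.fire()
print(q_out.popleft())                              # -> 12.0
```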

17 citations


Journal ArticleDOI
TL;DR: The use of computers in a fermentation pilot plant is described from a practical point of view and the main schedule task is described in detail to demonstrate the software structure.
Abstract: The use of computers in a fermentation pilot plant is described from a practical point of view. The aim is not the application of a computer to a single special process but a general application of computers to prepare and present data for the subsequent analysis of fermentation processes. The hardware is normally bought from computer manufacturers, but some additional installations are useful. Application software has to be developed by the biotechnologist; therefore the software structure is the most important part of the computer application. Data storage is divided into three parts: short-time memory, long-time storage, and a medium memory mainly used in process analysis and process control. Four types of programs are used: main schedule tasks, low-priority on-line tasks, a sense-switch routine, and different off-line programs. A table of all programs is presented, and the main schedule task is described in detail to demonstrate the software structure.
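
The three-level data storage and the role of the main schedule task can be sketched in generic Python; this is not the plant's actual software, and the one-"minute" condensation interval, the averaging step, and all names are assumptions.

```python
# Sketch of the layered storage: a short-time ring buffer of raw samples, a medium
# store of condensed values for on-line analysis/control, and long-time archival.
from collections import deque
from statistics import mean

short_time = deque(maxlen=60)              # last 60 raw samples (e.g., one per second)
medium     = []                            # condensed values used by on-line tasks
long_time  = open("archive.log", "a")      # crude stand-in for long-time storage

def main_schedule_task(sample: float) -> None:
    """Called once per measurement cycle by a (hypothetical) scheduler."""
    short_time.append(sample)
    if len(short_time) == short_time.maxlen:     # one 'minute' of data collected
        condensed = mean(short_time)
        medium.append(condensed)                 # available to process-control tasks
        long_time.write(f"{condensed:.3f}\n")    # archived for off-line programs
        short_time.clear()

for t in range(180):                             # simulate three cycles
    main_schedule_task(30.0 + 0.01 * t)          # e.g., temperature readings
long_time.close()
print(len(medium), "condensed values stored")    # -> 3 condensed values stored
```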

13 citations


Journal ArticleDOI
TL;DR: The first generation of powerful personal computers for scientific applications is now on the market, and these computers have several advantages over conventional computers: very fast graphics, uniform response time, and a visually-oriented user environment that increases user productivity dramatically.
Abstract: Another revolution is occurring in computer hardware that will affect computer users in the power industry. The first generation of powerful personal computers for scientific applications is now on the market. These computers have several advantages over conventional computers: very fast graphics, uniform response time, and a visually-oriented user environment that increases user productivity dramatically. These advantages are the result of having a very powerful processor devoted to a single user's needs.

11 citations


Proceedings ArticleDOI
F. W. Day
27 Jun 1983
TL;DR: This methodology can effectively assist personnel during the analysis, engineering, design, implementation and management phases of the development of large and complex Computer Aided Design Systems.
Abstract: This paper describes a system methodology, Computer Aided Software Engineering (CASE) as applied to a Bell Laboratories Computer Aided Design System (BELLCAD). This methodology can effectively assist personnel during the analysis, engineering, design, implementation and management phases of the development of large and complex Computer Aided Design Systems.

11 citations


Journal ArticleDOI
TL;DR: The key goal in this area is to make software reusable; short-term tasks involve building reusable software, while long-term solutions focus on very high level languages (VHLLs) and knowledge-based systems.
Abstract: The key goal in this area is to make software reusable. Short-term tasks involve building reusable software, while long-term solutions focus on very high level languages (VHLLs) and knowledge-based systems.

11 citations


23 Aug 1983
TL;DR: This report contains the abstract interface specifications for all the facilities provided to users by this module; it serves as development and maintenance documentation for the SCR software design and is also intended as a model for other people interested in applying the abstract interface approach on other software projects.
Abstract: This report describes the programmer interface to a set of avionics-oriented abstract data types implemented in software. The Application Data Types module is part of NRL's Software Cost Reduction (SCR) project, which aims to demonstrate the feasibility of applying advanced software engineering techniques to complex real-time systems in order to simplify maintenance. The Application Data Types module allows operations on data independent of the representation. In the case of numeric abstract types, which represent physical quantities such as speed or distance, arithmetic operations may be performed independent of the units of physical measure. This allows the rest of the application software to remain unchanged even when representation decisions about these data change. This report contains the abstract interface specifications for all the facilities provided to users by this module. It serves as development and maintenance documentation for the SCR software design, and it is also intended as a model for other people interested in applying the abstract interface approach on other software projects.
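
The units-independent numeric abstract type can be sketched as follows; the internal unit, the operation names, and the conversion factors are illustrative, not the NRL module's interface.

```python
# Sketch: a numeric abstract type for a physical quantity whose users never see
# the internal representation or units (here, metres per second is an assumption).
class Speed:
    __slots__ = ("_mps",)                  # hidden representation
    def __init__(self, metres_per_second: float):
        self._mps = float(metres_per_second)

    # Constructors in whatever units the caller works in
    @classmethod
    def from_knots(cls, knots: float) -> "Speed":
        return cls(knots * 0.514444)
    @classmethod
    def from_feet_per_second(cls, fps: float) -> "Speed":
        return cls(fps * 0.3048)

    # Arithmetic independent of units of physical measure
    def __add__(self, other: "Speed") -> "Speed":
        return Speed(self._mps + other._mps)

    # Accessors; changing the internal unit would not affect application code
    def in_knots(self) -> float:
        return self._mps / 0.514444

airspeed = Speed.from_knots(250) + Speed.from_feet_per_second(30)
print(round(airspeed.in_knots(), 1))       # -> 267.8
```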

8 citations


Journal ArticleDOI
TL;DR: The computer methods that are common to both areas of power system analysis and electronic circuit computer aided design are described and the differences in application are highlighted.
Abstract: The computational algorithms utilized in power system analysis have more than just a minor overlap with those used in electronic circuit computer aided design. This paper describes the computer methods that are common to both areas and highlights the differences in application through brief examples. Recognizing this commonality has stimulated the exchange of useful techniques in both areas and has the potential of fostering new approaches to electric network analysis through the interchange of ideas.
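
The shared computational core the paper refers to is largely the repeated assembly and solution of sparse nodal equations of the form G·v = i; the tiny resistive example below only illustrates that common step and is not taken from the paper.

```python
# Two nodes, three conductances (siemens): g1 node1-ground, g2 node1-node2, g3 node2-ground.
import numpy as np

g1, g2, g3 = 1.0, 0.5, 2.0
G = np.array([[g1 + g2, -g2],
              [-g2,      g2 + g3]])       # nodal conductance matrix
i = np.array([1.0, 0.0])                  # 1 A injected into node 1

v = np.linalg.solve(G, i)                 # production codes use sparse LU with reordering
print(np.round(v, 4))                     # node voltages -> [0.7143 0.1429]
```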

8 citations


Journal ArticleDOI
F. Mintzer, K. Davies, A. Peled, F. Ris
TL;DR: The real-time signal processor (RSP) has a programmable signal processing architecture that was created to provide a quick and cost-effective way to implement a broad range of signal processing applications.
Abstract: The real-time signal processor (RSP) has a programmable signal processing architecture that was created to provide a quick and cost-effective way to implement a broad range of signal processing applications. Other objectives were that the RSP be easy to program, suitable for LSI implementation, and conveniently connectable into distributed systems. It was also intended that the RSP would be able to capitalize on the reduced computational complexity (RCC) algorithms in order to achieve increased performance. In this paper, the RSP architecture is described. The ways in which the objectives have influenced the architecture are discussed. The software support, designed to simplify the task of software generation for the RSP, is also described. Finally, the implementation of the RSP is presented.

8 citations


Journal ArticleDOI
TL;DR: The role of microcomputers in the analytical laboratory is growing rapidly as applications software becomes more generally available; the implications of computer access to networked data bases are also discussed.

Journal ArticleDOI
TL;DR: Most potentiometers (~20) will disappear, so that control functions will be taken over by digital information stored in an electronic memory, allowing electronic adjustments on the factory floor by computers and eliminating the mechanical calibrations now done manually.
Abstract: Also, most potentiometers (~20) will disappear, so that control functions will be taken over by digital information stored in an electronic memory. This will permit electronic adjustments on the factory floor by computers, eliminating mechanical calibrations now done manually. Once sold, such sets will continuously adjust themselves to compensate for aging components such as the picture tube and deflection components, providing much better long-term stability and greater reliability.

Journal ArticleDOI
TL;DR: Working models interfaced to microcomputers provide an inexpensive way of simulating real-world situations.
Abstract: Working models interfaced to microcomputers provide an inexpensive way of simulating real-world situations.

Journal ArticleDOI
TL;DR: The Data System portion of the LAMPF Control System - which handles more than 12,000 widely different devices - allows greater uniformity, flexibility, maintainability, and hardware independence than before.
Abstract: As part of the upgrade of the Los Alamos Meson Physics Facility (LAMPF) control computer, the software providing the interface between application programs and the accelerator hardware was redesigned. This new Data System design - which handles more than 12,000 widely different devices - allows greater uniformity, flexibility, maintainability, and hardware independence than before. The Data System portion of the LAMPF Control System is described in this paper.
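
A hedged sketch of what such a hardware-independent device interface looks like to an application program: devices are addressed by name, and a data-access layer dispatches to the appropriate handler. The device name, method names, and handler below are invented, not the LAMPF interface.

```python
# Sketch of a uniform, name-based data-access layer over widely different devices.
from typing import Callable, Dict

class DataSystem:
    def __init__(self) -> None:
        self._readers: Dict[str, Callable[[], float]] = {}
        self._writers: Dict[str, Callable[[float], None]] = {}

    def register(self, name: str, reader: Callable[[], float],
                 writer: Callable[[float], None]) -> None:
        self._readers[name], self._writers[name] = reader, writer

    def read(self, name: str) -> float:        # one uniform call for every device
        return self._readers[name]()
    def set(self, name: str, value: float) -> None:
        self._writers[name](value)

# A fake power-supply channel standing in for real accelerator hardware.
setting = {"value": 0.0}
ds = DataSystem()
ds.register("QUAD01:CURRENT",
            reader=lambda: setting["value"],
            writer=lambda v: setting.update(value=v))

ds.set("QUAD01:CURRENT", 42.5)                 # application code never sees the hardware
print(ds.read("QUAD01:CURRENT"))               # -> 42.5
```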

Proceedings Article
01 Jan 1983
TL;DR: A Multibus-based parallel processor simulation system intended to serve as a vehicle for gaining hands-on experience, testing system and application software, and evaluating parallel processor performance during development of a larger system based on the horizontal/vertical-bus interprocessor communication mechanism is described.
Abstract: A Multibus-based parallel processor simulation system is described. The system is intended to serve as a vehicle for gaining hands-on experience, testing system and application software, and evaluating parallel processor performance during development of a larger system based on the horizontal/vertical-bus interprocessor communication mechanism. The prototype system consists of up to seven Intel iSBC 86/12A single-board computers which serve as processing elements, a multiple transmission controller (MTC) designed to support system operation, and an Intel Model 225 Microcomputer Development System which serves as the user interface and input/output processor. All components are interconnected by a Multibus/IEEE 796 bus. An important characteristic of the system is that it provides a mechanism for a processing element to broadcast data to other selected processing elements. This parallel transfer capability is provided through the design of the MTC and a minor modification to the iSBC 86/12A board. The operation of the MTC, the basic hardware-level operation of the system, and pertinent details about the iSBC 86/12A and the Multibus are described.
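
The selective-broadcast mechanism can be pictured as follows: the sender supplies a data block and a destination mask, and the controller delivers the block to every selected processing element's inbox in one operation. Only the seven-PE count comes from the paper; the rest is a software simplification.

```python
# Sketch of MTC-style selective broadcast among seven processing elements.
from queue import Queue

NUM_PES = 7
inboxes = [Queue() for _ in range(NUM_PES)]

def broadcast(source: int, data: bytes, select_mask: int) -> None:
    """Deliver `data` to every PE whose bit is set in `select_mask` (except the source)."""
    for pe in range(NUM_PES):
        if pe != source and (select_mask >> pe) & 1:
            inboxes[pe].put(data)

broadcast(source=0, data=b"row of matrix", select_mask=0b0011110)   # PEs 1-4
print([not inboxes[pe].empty() for pe in range(NUM_PES)])
# -> [False, True, True, True, True, False, False]
```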

Journal ArticleDOI
01 Jul 1983

Proceedings ArticleDOI
22 Jun 1983
TL;DR: EASY5, developed at Boeing, lets the analyst prepare a single model that supports both nonlinear simulation and linearized analysis; through EPRI's SysteMMS library it has been applied to fossil and nuclear power plants, and its linearized analysis methods can be used to verify the model itself before design work begins.
Abstract: EASY5 is the result of a development effort of more than 10 years' duration. Analysts at Boeing perceived the need for a single software tool to support both nonlinear simulation and linearized analysis of dynamic systems. Prior to the development of EASY5 it was necessary to build and maintain two models of a dynamic system: a linearized model used for controls analysis and design, and a nonlinear model used to verify the linearized design. Use of EASY5 allows the analyst to prepare a single model of the system to be analyzed. Not only does this reduce the time required to build and maintain models, but it allows the linearized analysis methods of EASY5 to be applied to verification of the model itself before design work begins. One of the major applications of EASY5 has been in support of EPRI's SysteMMS, a modular modeling system for simulation and analysis of fossil and nuclear power plants. Under the sponsorship of EPRI, Boeing Computer Services (BCS) developed a library of models which facilitates the use of EASY5 as part of SysteMMS. This paper presents an overview of the capabilities of EASY5, a brief description of the EASY5/SysteMMS library, and an example of the application of EASY5 to designing a controller for a deaerator.
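
The single-model idea can be illustrated by numerically linearizing the same nonlinear state function that would be integrated for simulation; the first-order tank model and the perturbation step below are assumptions, not EASY5's internals.

```python
# One nonlinear model f(x, u) serves both simulation and linearized control design.
import numpy as np

def f(x: np.ndarray, u: np.ndarray) -> np.ndarray:
    """Nonlinear model: tank level with outflow proportional to sqrt(level)."""
    return np.array([u[0] - 0.5 * np.sqrt(max(x[0], 1e-9))])

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians A = df/dx, B = df/du at an operating point."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n)); B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

x0, u0 = np.array([4.0]), np.array([1.0])   # steady state: 0.5*sqrt(4) equals the inflow
A, B = linearize(f, x0, u0)
print(np.round(A, 4), np.round(B, 4))       # A ~ [[-0.125]], B ~ [[1.]]
```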

Journal ArticleDOI
TL;DR: This paper will provide a functional description of the overall energy control system configuration and the software provided; major emphasis will be placed on the advanced applications which are being provided as a part of the system.
Abstract: TransAlta Utilities, the largest investor-owned electric utility in Canada, is implementing an advanced energy control system to be located in Calgary, Alberta. This energy control system (ECS) consists of four large-scale processors configured as a main applications computer (MAC) and a security applications computer (SAC), with the remaining two processors in a dual redundant mode with the MAC and SAC. The system includes MMI equipment and advanced SCADA interfacing hardware which supports the sequence of events (SOE) function. The energy control system is supplied by the Leeds and Northrup Company and will replace three existing systems which were designed and constructed by TransAlta Utilities. The new system will centralize the electric system monitoring, control, and security assessment functions in the Calgary facility and allow the expansion capability needed in the rapidly growing province. This paper will provide a functional description of the overall energy control system configuration and the software provided. Major emphasis will be placed on the advanced applications which are being provided as a part of the system. These advanced applications have many unique "state of the art" features which give the power system operator vastly improved capabilities in controlling and analyzing the system.

Journal ArticleDOI
TL;DR: In this article, the authors report on quantitative performance enhancements obtained by practical application of vertical migration techniques in a hierarchically layered software/firmware/hardware system, the Burroughs B 1726.

Proceedings ArticleDOI
27 Jun 1983
TL;DR: Computer Aided Programming is to software manufacturing what CAD/CAM is to conventional manufacturing and CAP provides an intelligent software development environment in which the computer handles the bulk of the work required to build and to maintain code.
Abstract: A software manufacturing paradigm is described which deals effectively with many issues long associated with software development and maintenance. In essence, Computer Aided Programming is to software manufacturing what CAD/CAM is to conventional manufacturing. A reusable component approach to automated software assembly, CAP provides an intelligent software development environment in which the computer handles the bulk of the work required to build and to maintain code. CAP builds a program from specific goal-oriented instructions supplied by the user. Custom code is automatically spliced with reusable CAP computer code to create customized application programs.

Journal ArticleDOI
TL;DR: This tutorial, directed toward nonspecialists, surveys the entire range of software applications for communications satellites and touches on topics addressed in greater depth by companion articles in this issue.
Abstract: Software plays an important role in the development and operation of communications satellite systems. Software applications include analysis and design, manufacturing and testing of system components, and controlling and monitoring of system operation. This tutorial, directed toward nonspecialists, surveys the entire range of software applications for communications satellites and touches on topics addressed in greater depth by companion articles in this issue.

Proceedings ArticleDOI
01 Jan 1983
TL;DR: This paper outlines the basic components of distributed systems, such as processors, operating systems, languages, and interprocessor communications, and the design decisions which affect bus loading, reliability, design flexibility, concurrent task scheduling, and ease of software development.
Abstract: Microprocessor technology is appearing more often in all forms of offshore instrumentation and systems. Typically, this instrumentation must operate in real time and meet strict throughput and environmental specifications. As real-time, offshore processing requirements place heavier demands on single-CPU processing centers, the benefits of distributing the workload amongst several CPUs become increasingly attractive. An incompletely developed distributed architecture, however, may not provide the expected level of performance. A clear understanding of design tradeoffs and implementation considerations is required to ensure a successful implementation. This paper outlines the basic components of distributed systems, such as processors, operating systems, languages, and interprocessor communications, and the design decisions which affect bus loading, reliability, design flexibility, concurrent task scheduling, and ease of software development. Two specific application examples, both using a distributed computer system, are presented: one in the navigation field and the other an untethered, unmanned vehicle control system.

Journal ArticleDOI
TL;DR: Comparisons of code written for standard serial machines to the same code modified for MIDAS will be examined and performance results discussed.
Abstract: All programs currently running on serial computers require some degree of modification when moved to parallel processors. This is true whether the architectural parallelism is manifested at the instruction level, such as in array processors or the CRAY, or achieved via multiple processors, as is the case in the MIDAS system. In either case the degree to which the program exploits the architecture can significantly affect the processing speed. Some guidelines for application programming for the MIDAS system are discussed. Important programming considerations include the separation of serial and parallel elements of a program (such as program initialization), data input mechanisms (including hardware preprocessing), and output mechanisms. Comparisons of code written for standard serial machines to the same code modified for MIDAS will be examined and performance results discussed.
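
The guideline about separating serial and parallel program elements can be sketched with an ordinary process pool: initialization and output stay serial, and only the independent per-event kernel is replicated. The event kernel and calibration data are invented; MIDAS itself is a multiple-processor system, not a Python environment.

```python
# Serial initialization -> parallel per-event processing -> serial output.
from multiprocessing import Pool

def initialize():                       # serial: run once, e.g. read calibration tables
    return {"gain": 2.0, "offset": 1.0}

def process_event(args):                # parallel: one independent event per processor
    event, cal = args
    return cal["gain"] * event + cal["offset"]

def main():
    cal = initialize()                                  # serial section
    events = list(range(1000))                          # data input stage
    with Pool(processes=4) as pool:                     # parallel section
        results = pool.map(process_event, [(e, cal) for e in events])
    print(sum(results))                                 # serial output/summary -> 1000000.0

if __name__ == "__main__":
    main()
```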

Journal ArticleDOI
TL;DR: An account is given of the four main stages in a development and implementation programme: defining the requirements of the application; matching these to the vision system; developing the application software; integrating and installing the system.

Journal ArticleDOI
TL;DR: A number of applications and specially designed computerized instruments for neurophysiology, in which the programming language Modula was used, are described and the strengths and limitations of Modula are evaluated.
Abstract: The history of the development of digital computer systems for use in medical care and biological laboratory experiments is reviewed, with special emphasis on programming languages. The relevance to this application of techniques first used in the design of operating systems for simultaneous multiple use of large computer systems and in performing concurrent real-time tasks is demonstrated. A number of applications and specially designed computerized instruments for neurophysiology, in which the programming language Modula was used, are described. The strengths and limitations of Modula are evaluated. The essential parallelism of laboratory and clinical monitoring tasks would seem to promote the use of the emerging technology of multitasking and multiprocessor languages and systems.
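
The parallelism of monitoring tasks the authors describe can be sketched with ordinary threads (in Python here rather than Modula): each channel is monitored by its own concurrent task posting to a shared queue. Channel names and sampling periods are illustrative.

```python
# Concurrent monitoring tasks feeding a shared queue drained by a logging step.
import threading, queue, time, random

readings: "queue.Queue" = queue.Queue()
stop = threading.Event()

def monitor(channel: str, period_s: float) -> None:
    """One concurrent monitoring task per physiological channel."""
    while not stop.is_set():
        readings.put((channel, random.gauss(0.0, 1.0)))   # stand-in for an A/D read
        time.sleep(period_s)

threads = [threading.Thread(target=monitor, args=(ch, p), daemon=True)
           for ch, p in [("ECG", 0.01), ("EEG", 0.02), ("blood_pressure", 0.1)]]
for t in threads:
    t.start()

time.sleep(0.3)                          # let the tasks run briefly
stop.set()
n = 0
while not readings.empty():              # the 'logging' task, run serially here
    channel, value = readings.get()      # a real logger would time-stamp and store these
    n += 1
print(f"drained {n} readings from {len(threads)} concurrent monitoring tasks")
```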

Journal Article
TL;DR: The traumatic experience of selecting and installing an in-house computer system can be eased by this guide through the forest of alternatives and sales claims confronting the prospective buyer.
Abstract: The traumatic experience of selecting and installing an in-house computer system can be eased by this guide through the forest of alternatives and sales claims confronting the prospective buyer. The authors have summarized problems common in the selection/installation process, and perhaps more important, they have detailed specific questions which should be asked in the decision-making process. The various selection criteria discussed include application software suitability, operating system software, equipment and its maintenance, ease of conversion, vendor experience and responsiveness, and cost considerations. By carefully investigating these factors as they relate to the many computer systems available, the buyer can reach a logical decision and anticipate a successful installation.

Journal ArticleDOI
01 Aug 1983

Proceedings ArticleDOI
12 Dec 1983
TL;DR: A description of an integrated System Design/Static and Dynamic Modeling (SD/SDM) tool set is presented and a data base organization is used for the representation of the architecture, load, application software and the operating system of a distributed system.
Abstract: The general problem of the representation and performance-oriented modeling of the behavior of distributed computer systems is discussed. A description of an integrated System Design/Static and Dynamic Modeling (SD/SDM) tool set is presented. In SD/SDM, a data base organization is used for the representation of the architecture, load, application software and the operating system of a distributed system. Modeling is performed statically or dynamically through a generic simulation model that does not require recompilation. Applications of SD/SDM in actual projects are discussed briefly.

Book ChapterDOI
01 Jan 1983
TL;DR: In this paper, a practical architecture based on local-area network technology which supports this incorporation of redundancy is proposed, and a set of classification criteria for interprocess communication primitives is given and the applicability of various proposals which have appeared in the literature is analyzed.
Abstract: In order to realize the potential for reliability and maintainability offered by distributed computer control systems in real-time process control applications, it is essential to have flexible methods for incorporating redundancy. A practical architecture based on local-area network technology which supports this incorporation of redundancy is proposed. A set of classification criteria for interprocess communication primitives is given and the applicability of the various proposals which have appeared in the literature is analysed. An approach based on the validity time of messages is outlined. This concept permits a unified treatment of message transfer protocols and transparent redundancy in the application software.
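
A minimal sketch of the validity-time concept: every message carries the time until which its contents are valid, and a receiver discards anything expired or already seen, which treats late delivery and redundant transmission uniformly. The field names and the duplicate-suppression rule are assumptions added for illustration.

```python
# Messages with a validity time; stale or duplicate copies are simply discarded.
import time
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Message:
    sensor_id: str
    value: float
    seq: int            # sequence number assigned by the producing task
    valid_until: float  # absolute time after which the contents are useless

class Receiver:
    def __init__(self) -> None:
        self.latest = {}                   # highest sequence number accepted per sensor

    def accept(self, msg: Message, now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        if now > msg.valid_until:          # expired: discard, whichever channel delivered it
            return False
        if self.latest.get(msg.sensor_id, -1) >= msg.seq:
            return False                   # duplicate copy from a redundant sender
        self.latest[msg.sensor_id] = msg.seq
        return True

rx = Receiver()
t0 = time.time()
m = Message("pressure_1", 3.2, seq=7, valid_until=t0 + 0.05)
print(rx.accept(m, now=t0))        # -> True  (fresh message)
print(rx.accept(m, now=t0))        # -> False (redundant duplicate)
print(rx.accept(m, now=t0 + 0.1))  # -> False (validity time expired)
```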

Proceedings ArticleDOI
26 Oct 1983
TL;DR: For functions involving typical hospital users such as physicians and nurses, the use of the virtual circuit function for direct user terminal connectivity is limited to situations involving only the most sophisticated users, such as programmers and analysts.
Abstract: The virtual circuit service is a common feature of local area communications networks (LACNs). This feature provides the user of a network with the functionality of direct connectivity to any host computer on the network. At the University of California, San Francisco Hospital a microcomputer based LACN using fiberoptic communications media has been in use for over two years. We have limited our use of the virtual circuit function for direct user terminal connectivity to situations involving only the most sophisticated users, such as programmers and analysts. For functions involving typical hospital users such as physicians and nurses, we chose a more "user friendly" approach by interposing host application software between the user and other host computers on the network.