
Showing papers on "Software published in 1977"


Book
01 Jan 1977
TL;DR: The eighth edition of this text features more coverage of programmable logic devices (PLDs), and topics that apply to digital signal processing (DSP), a very rapidly advancing technology in electronics, have been expanded and improved.
Abstract: From the Publisher: NEW! The eighth edition features more coverage of programmable logic devices (PLDs). This technology is rapidly replacing the use of conventional small- and medium-scale ICs in modern digital systems. Interspersed throughout the text where appropriate, this PLD coverage offers students an alternative means of implementing digital logic circuits, from the simplest gates to complex systems. NEW! Each text is packaged with two free CD-ROMs. The first CD-ROM contains the entire library of Texas Instruments Logic Data Sheets, including all TTL series, CMOS, and bus interface parts. The second CD-ROM contains: Circuits from the text rendered in both Electronics Workbench™ and CircuitMaker ® software programs. Students with access to Electronics Workbench software can open and work interactively with the Electronics Workbench circuit files to increase their understanding of concepts and to prepare for laboratory activities. Free CircuitMaker Student Version software is included on the CD-ROM, enabling students to access the CircuitMaker files. A limited-compile demonstration version of the PAL Expert CUPL language compiler from Logical Devices, Inc. UPDATED! Topics that apply to digital signal processing (DSP), a very rapidly advancing technology in electronics, have been expanded and improved. UPDATED! Digital logic technology coverage and terms often encountered in personal computer literature have been updated and improved. UPDATED! Students have free access to the text's Companion Website at ...

144 citations


Journal ArticleDOI
TL;DR: The Requirements Statement Language (RSL) as discussed by the authors is a flow-oriented language for the expression of software requirements, and the Requirements Engineering and Validation System (REVS) is a software package which includes a translator for RSL, a data base for maintaining the description of system requirements and a collection of tools to analyze the information in the data base.
Abstract: The development of system requirements has been recognized as one of the major problems in the process of developing data processing system software. We have developed a computer-aided system for maintaining and analyzing such requirements. This system includes the Requirements Statement Language (RSL), a flow-oriented language for the expression of software requirements, and the Requirements Engineering and Validation System (REVS), a software package which includes a translator for RSL, a data base for maintaining the description of system requirements, and a collection of tools to analyze the information in the data base. The system emphasizes a balance between the use of the creativity of human thought processes and the rigor and thoroughness of computer analysis. To maintain this balance, two key design principles, extensibility and disciplined thinking, were followed throughout the system. Both the language and the software are easily user-extended, but adequate locks are placed on extensions, and limitations are imposed on use, so that discipline is augmented rather than decreased.

132 citations


Journal ArticleDOI
TL;DR: The Software Development System (SDS) is a methodology addressing the problems involved in the development of software for ballistic missile defense systems, large real-time, automated systems with a requirement for high reliability.
Abstract: This paper contains a discussion of the Software Development System (SDS), a methodology addressing the problems involved in the development of software for ballistic missile defense systems. These are large real-time, automated systems with a requirement for high reliability. The SDS is a broad approach attacking problems arising in requirements generation, software design, coding, and testing. The approach is highly requirements oriented and has resulted in the formulation of structuring concepts, a requirements statement language, process design language, and support software to be used throughout the development cycle. This methodology represents a significant advance in software technology for the development of software for a class of systems such as BMD. The support software has been implemented and is undergoing evaluation.

106 citations


Journal ArticleDOI
TL;DR: This paper describes four major aspects of software management: development statistics, development process, development objectives, and software maintenance.
Abstract: This paper describes four major aspects of software management: development statistics, development process, development objectives, and software maintenance. The control of both large and small software projects is included in the analysis.

87 citations


Journal ArticleDOI
20 Jul 1977
TL;DR: An interactive computer graphics method has been developed for the rapid generation of arbitrarily shaped three-dimensional surfaces that is a synthesis of spline theory and algorithms, an interactive means for man-machine communication, and software for static or dynamic graphics display.
Abstract: An interactive computer graphics method has been developed for the rapid generation of arbitrarily shaped three-dimensional surfaces. The method is a synthesis of spline theory and algorithms, an interactive means for man-machine communication, and software for static or dynamic graphics display. The basic technique employed is a modified lofting method in which sectional curves are represented by uniform B-splines and the surface is interpolated between sections by Cardinal splines. Among the features of this method are algorithms which enable interactive modification of the B-spline representation of the sectional curves. At all stages of the process, the spatial information is graphically displayed to the user. Complex surfaces can be created by the combination of a number of shapes that have been separately generated and automatically joined. The system has been successfully interfaced to a variety of analytical routines for structural, medical and graphical applications.
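A minimal Python sketch of the lofting scheme just described, with illustrative names of our own (uniform_cubic_bspline, cardinal, surface_point are not the paper's): points on the sectional curves come from uniform cubic B-spline segments, and the surface is blended between sections with a Cardinal (Hermite-type) spline.

    import numpy as np

    def uniform_cubic_bspline(p0, p1, p2, p3, t):
        """Point on one uniform cubic B-spline segment, t in [0, 1]."""
        b0 = (1 - t) ** 3 / 6.0
        b1 = (3 * t**3 - 6 * t**2 + 4) / 6.0
        b2 = (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0
        b3 = t**3 / 6.0
        return b0 * p0 + b1 * p1 + b2 * p2 + b3 * p3

    def cardinal(p0, p1, p2, p3, s, c=0.5):
        """Cardinal-spline interpolation between p1 and p2, s in [0, 1]."""
        m1, m2 = c * (p2 - p0), c * (p3 - p1)   # tangents from neighbours
        h00 = 2 * s**3 - 3 * s**2 + 1           # cubic Hermite basis
        h10 = s**3 - 2 * s**2 + s
        h01 = -2 * s**3 + 3 * s**2
        h11 = s**3 - s**2
        return h00 * p1 + h10 * m1 + h01 * p2 + h11 * m2

    def surface_point(sections, u, v, j=1):
        """Lofted surface point: evaluate parameter u on four adjacent
        sectional curves (each a 4x3 array of control points here, so
        1 <= j <= len(sections) - 3), then interpolate between
        sections j and j+1 at parameter v."""
        q = [uniform_cubic_bspline(*sec, u) for sec in sections[j - 1:j + 3]]
        return cardinal(*q, v)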

84 citations


Journal ArticleDOI
TL;DR: The relational operations provided by the RAP hardware are described, and a representative approach to providing the same relational operations with conventional software and hardware is devised.
Abstract: An associative processor called RAP has been designed to provide hardware support for the use and manipulation of databases. RAP is particularly suited for supporting relational databases. In this paper, the relational operations provided by the RAP hardware are described, and a representative approach to providing the same relational operations with conventional software and hardware is devised. Analytic models are constructed for RAP and the conventional system. The execution times of several of the operations are shown to be vastly improved with RAP for large relations.

82 citations


Book ChapterDOI
01 Jan 1977
TL;DR: A flexible and economic multi-grid data structure is presented, together with the software developed to support grid manipulations in general, and MLAT programming in particular.
Abstract: A survey is given of solution techniques which simultaneously use a sequence of increasingly finer discretizations (finite-difference or finite-element equations) for the same continuous problem. The solution process acts, on one hand, as a fast solver, and provides, on the other hand, for very efficient, nearly optimal discretizations. A flexible and economic multi-grid data structure is presented, together with the software developed to support grid manipulations in general, and MLAT programming in particular.
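As a concrete illustration of the multi-level idea (not the paper's data structure or MLAT software), a recursive V-cycle for the one-dimensional model problem -u'' = f, sketched in Python under the usual weighted-Jacobi/full-weighting/linear-interpolation choices:

    import numpy as np

    def smooth(u, f, h, sweeps=3, w=2/3):
        """Weighted-Jacobi relaxation for -u'' = f, u(0) = u(1) = 0."""
        for _ in range(sweeps):
            u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h*h*f[1:-1])
        return u

    def residual(u, f, h):
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] - (2*u[1:-1] - u[:-2] - u[2:]) / (h*h)
        return r

    def restrict(r):
        """Full weighting of the fine-grid residual onto the coarse grid."""
        rc = np.zeros((len(r) - 1) // 2 + 1)
        rc[1:-1] = 0.25 * (r[1:-2:2] + 2*r[2:-1:2] + r[3::2])
        return rc

    def prolong(ec, n_fine):
        """Linear interpolation of the coarse-grid correction."""
        e = np.zeros(n_fine)
        e[::2] = ec
        e[1::2] = 0.5 * (ec[:-1] + ec[1:])
        return e

    def v_cycle(u, f, h):
        if len(u) <= 3:                          # coarsest grid: solve exactly
            u[1:-1] = 0.5 * (u[:-2] + u[2:] + h*h*f[1:-1])
            return u
        u = smooth(u, f, h)                      # pre-smooth
        ec = v_cycle(np.zeros((len(u) - 1) // 2 + 1),
                     restrict(residual(u, f, h)), 2*h)
        u += prolong(ec, len(u))                 # coarse-grid correction
        return smooth(u, f, h)                   # post-smooth

    n = 64; h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    f = np.pi**2 * np.sin(np.pi * x)             # exact solution: sin(pi x)
    u = np.zeros(n + 1)
    for _ in range(10):
        u = v_cycle(u, f, h)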

75 citations


Journal ArticleDOI
TL;DR: An experimental project was undertaken to modify an existing ground PN modem, and a software implementation of the digital tracking algorithms was selected in which an HP-2100A minicomputer controls carrier frequency and PN code phase via digital phase shifters.
Abstract: To optimize the threshold of a pseudonoise (PN) spread spectrum modem for use over an aircraft/satellite communications link at SHF, the effects of Doppler must be taken into account. Reconstitution of carrier phase by a Costas loop to coherently demodulate the PSK data and also the delay-lock error voltage has typically been the practice in PN modems intended for ground applications. To accommodate the platform dynamics, the Costas loop must have a relatively wide bandwidth, and this implies a significant threshold degradation. An alternate implementation employs a noncoherent carrier tracking loop which maintains frequency lock rather than phase lock. Now, the delay-lock error voltage is noncoherently demodulated. For the airborne application, analysis and simulations show this implementation will extend the receiver's tracking threshold significantly (up to 6 dB) for the worst case dynamics profile. An experimental project was undertaken to modify an existing ground PN modem (AN/USC-28, ADM version) for flight test. A software implementation of the digital tracking algorithms was selected in which an HP-2100A minicomputer controls carrier frequency and PN code phase via digital phase shifters. The Costas demodulator for extracting PSK data resides entirely in software, and is completely segregated from PN tracking. In laboratory testing of the receiver with simulated dynamics and in actual flight tests, the demonstrated performance was found to approach closely the goals established by the analyses and simulations.

74 citations


Journal ArticleDOI
Tom Love1
TL;DR: The major findings were that programs with simplified control flow were easier for the computer science students to understand as measured by their ability to reconstruct the programs.
Abstract: A within-subjects experimental design was used to test the effect of two variables on program understanding. The independent variables were complexity of control flow and paragraphing of the source code. Understanding was measured by having the subjects memorize the code for a fixed time and reconstruct the code verbatim. Also, some subjects were asked to describe the function of the program after completing their reconstruction. The two groups of subjects for the experiment were students from an introductory programming class and from a graduate class in programming languages. The major findings were that paragraphing of the source had no effect for either group of subjects but that programs with simplified control flow were easier for the computer science students to understand as measured by their ability to reconstruct the programs. The dependent variable, rated accuracy of their descriptions of the programs' functions, did not differ as a function of either independent variable. The paper concludes with a description of the utility of this experimental approach relative to improving the reliability of software and a discussion of the importance of these findings.

52 citations


ReportDOI
03 Jun 1977
TL;DR: This report describes a procedure for designing computer systems that are developed specifically to be a component of a more complex system; its aim is to reduce maintenance costs by means of a software organization that insulates most of the programs from changes in the interface.
Abstract: This report describes a procedure for designing computer systems that are developed specifically to be a component of a more complex system. Two significant characteristics of such design problems are the following: the computer system interface is determined by factors outside the control of the computer system designer, and the specifications of that interface are likely to change throughout the life cycle of the system. The purpose of the procedure described in this report is to reduce maintenance costs by means of a software organization that insulates most of the programs from changes in the interface. The procedure is based on the systematic compilation of an assumption list. The assumption list describes those aspects of the interface that future users and other knowledgeable persons consider essential and therefore stable. Other aspects of the interface are ignored. An abstract interface is designed on the basis of this assumption list. A specification of the abstract interface is used to procure the major components of the system. This report explains the principles behind the procedure and illustrates its use. The success of the procedure is primarily limited by the ability of designers and future users to compile an accurate list of assumptions. A side benefit of the procedure is simpler, better structured software. Successful application of the procedure should result in both increased reliability and reduced life-cycle costs.
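The insulation idea can be sketched in modern terms (a hypothetical example with invented device details; the report's own examples are not reproduced here): programs are written against an abstract interface that records only what the assumption list calls stable, and one small module per actual device absorbs interface changes.

    from abc import ABC, abstractmethod

    class AltitudeSensor(ABC):
        """Abstract interface: only the assumed-stable aspects (an
        altitude in metres and a validity flag), nothing about any
        device's word layout, units, or update timing."""

        @abstractmethod
        def altitude_m(self) -> float: ...

        @abstractmethod
        def reading_valid(self) -> bool: ...

    class RadarAltimeterMk1(AltitudeSensor):
        """One concrete device behind the interface. If the hardware
        changes, only this module is rewritten; programs written
        against AltitudeSensor are insulated."""

        def __init__(self, port):
            self.port = port                        # hypothetical I/O port object

        def altitude_m(self) -> float:
            return self.port.read_word() * 0.3048   # this device reports feet

        def reading_valid(self) -> bool:
            return self.port.status_bit(3) == 0     # 0 = data good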

52 citations


Journal ArticleDOI
TL;DR: This article explores the present and future uses and capabilities of computers, examining both computer hardware and software.
Abstract: Explores the present and future uses and capabilities of computers and examines computer hardware and software.

Proceedings ArticleDOI
01 Jan 1977
TL;DR: The system, called the Associative File Processor (AFP), utilizes a conventional minicomputer for control, off-the-shelf high density disks for storage, a special purpose parallel search module as a text term detector, and query and retrieval software.
Abstract: This paper describes an approach to solving a major problem in the information processing sciences—that of searching very large (5-50 billion characters) data bases of unstructured free-text for random queries within a reasonable time and at an affordable price. The need by information specialists and knowledge workers for large, fast, low-cost text and document retrieval systems is growing rapidly. Conventional approaches to the problem have usually depended upon expensive, general purpose computers, upon special pre-processing of the textual data (e.g. file inverting, indexing, abstracting, etc.), and upon elaborate, costly software. The resulting retrieval systems often cost hundreds of dollars per query, and the full scanning of an uninverted, unstructured billion-byte textual data base could take hours of computer services. However, in spite of these restrictions, such full text search systems have proved useful and even indispensable for many applications. Computer technology of the late 1960's and the 1970's, in both hardware and software (e.g., minicomputers, low-cost, high density disk storage, “chip” electronics, natural language query systems, etc.), has made it practical to build special purpose, low-cost text retrieval systems. Such a system has been built, tested, and is now in a production stage. The system, called the Associative File Processor (AFP), utilizes a conventional minicomputer (DEC's PDP-11/45) for control, off-the-shelf high density disks for storage, a special purpose parallel search module as a text term detector, and query and retrieval software. The AFP is currently being field tested at two sites. Full text, parallel searches on un-preprocessed textual data bases are being performed at an effective matching rate of 4 billion bytes per second (8K byte key memory times 500 Kbyte/second data stream). Estimated costs are 10 to 25 cents per query for a one billion byte data base. The costs per query and the time for searching increase in a linear fashion as the data base grows. A basic architecture for the AFP is described and an implemented version is discussed. A more powerful term detector module is also under development. This system is designed around a finite state automaton algorithm.
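The paper names a finite state automaton algorithm as the basis of the term detector. A software sketch of that idea in the Aho-Corasick style (one pass over the data stream matches every query term at once; naming and structure are ours, not the AFP's):

    from collections import deque

    def build_automaton(terms):
        """Finite-state matching automaton: goto trie, failure links,
        and per-state output sets."""
        goto, fail, out = [{}], [0], [set()]
        for term in terms:
            s = 0
            for ch in term:
                if ch not in goto[s]:
                    goto.append({}); fail.append(0); out.append(set())
                    goto[s][ch] = len(goto) - 1
                s = goto[s][ch]
            out[s].add(term)
        q = deque(goto[0].values())          # breadth-first failure links
        while q:
            s = q.popleft()
            for ch, t in goto[s].items():
                q.append(t)
                f = fail[s]
                while f and ch not in goto[f]:
                    f = fail[f]
                fail[t] = goto[f].get(ch, 0)
                out[t] |= out[fail[t]]       # inherit suffix matches
        return goto, fail, out

    def scan(stream, automaton):
        """Run characters through the automaton; yield (end_pos, term)."""
        goto, fail, out = automaton
        s = 0
        for i, ch in enumerate(stream):
            while s and ch not in goto[s]:
                s = fail[s]
            s = goto[s].get(ch, 0)
            for term in out[s]:
                yield i, term

    # e.g. list(scan("the shell she sells", build_automaton(["he", "she", "sell"])))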

Journal ArticleDOI
TL;DR: The NASCAP (NASA Charging Analyzer Program) as mentioned in this paper was developed by Systems, Science and Software under contract to NASA-LeRC to simulate the charging of a complex spacecraft in geosynchronous orbit.
Abstract: A computer code, NASCAP (NASA Charging Analyzer Program), has been developed by Systems, Science and Software under contract to NASA-LeRC to simulate the charging of a complex spacecraft in geosynchronous orbit. The capabilities of the NASCAP code include a fully three-dimensional solution of Poisson's equation about an object having considerable geometrical and material complexity, particle tracking, shadowing in sunlight, calculation of secondary emission, backscatter and photoemission, and graphical output. A model calculation shows how the NASCAP code may be used to improve our understanding of the spacecraft-plasma interaction.

Patent
28 Jul 1977
TL;DR: In this paper, a channel-to-channel adapter is proposed for interconnecting two or more digital computers or digital data processors. The assignment of device addresses for processor use and the direction of data transfer are agreed to among the software systems executing on the interconnected processors; the adapter itself has no view of these conventions.
Abstract: A high performance channel-to-channel adapter for interconnecting two or more digital computers or digital data processors. Multiple input/output device addresses are recognized by the channel-to-channel adapter. The channel-to-channel adapter makes the proper processor-to-processor connection by matching device addresses. In particular, it interconnects for data transfer purposes the two processors for which the same device address has been received. The assignment of device addresses for processor use and the direction of data transfer are by conventions agreed to among the software systems executing on the interconnected processors. The channel-to-channel adapter does not have a view of these conventions. In the more general case, two device addresses are assigned by software convention to each processor-to-processor link, one address being used to transfer data in one direction and the other address being used to transfer data in the opposite direction.

01 Jan 1977
TL;DR: The pitfalls encountered when solving GP problems and some proposed remedies are discussed in detail and a numerical comparison of some of the more promising recently developed computer codes for geometric programming on a specially chosen set of GP test problems is given.
Abstract: This paper attempts to consolidate over 15 years of attempts at designing algorithms for geometric programming (GP) and its extensions. The pitfalls encountered when solving GPs and some proposed remedies are discussed in detail. A comprehensive summary of published software for the solution of GP problems is included. Also included is a numerical comparison of some of the more promising recently developed computer codes for geometric programming on a specially chosen set of GP test problems. The relative performance of these codes is measured in terms of their robustness as well as speed of computation. The performance of some general nonlinear programming (NLP) codes on the same set of test problems is also given and compared with the results for the GP codes. The paper concludes with some suggestions for future research.
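For reference, the problem class these codes address is the standard (primal) posynomial program; in the usual notation (ours, not necessarily this paper's):

    \begin{aligned}
    \min_{x > 0}\quad & g_0(x)\\
    \text{subject to}\quad & g_k(x) \le 1, \qquad k = 1,\dots,m,\\
    \text{where each } g_k \text{ is a posynomial:}\quad
    & g_k(x) \;=\; \sum_{i} c_{ki}\,\prod_{j=1}^{n} x_j^{a_{kij}},
    \qquad c_{ki} > 0 .
    \end{aligned}

The classical route to solving such problems is the substitution x_j = e^{y_j}, under which every g_k becomes log-convex and the problem becomes a convex program.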

Book
01 Jan 1977
TL;DR: This edition correlates closely with popular chip trainers and includes added coverage of the Intel 8088 16-bit microprocessors and includes a student version of the TASM cross-assembler software program.
Abstract: Striking an ideally balanced approach, this text introduces students to microprocessor fundamentals by using a pedagogical SAP (Simple-As-Possible) model computer. The text then relates these fundamentals to three real-world examples: Intel's 8085, Motorola's 6800, and the 6502 chip used by Apple Computers. Instructors can focus on just one of these popular microprocessors, or include the features of others. This edition correlates closely with popular chip trainers and includes added coverage of the Intel 8088 16-bit microprocessors. It also includes a student version of the TASM cross-assembler software program. Experiments for Digital Computer Electronics are prepared expressly for this Third Edition, containing hardware and software experiments that allow students to expand upon the topics covered in the text through hands-on exercises. An Instructor's Guide containing answers to chapter questions and experiment results is also offered.

Journal ArticleDOI
TL;DR: It is shown that a functional high-level language signal processing program can easily be modified so as to produce a similar program which, when executed, automatically generates another program containing precomputed algorithm sequencing and data access information.
Abstract: Optimal use of high-speed programmable digital signal processors generally demands familiarity with machine architectural features and hence, production of programs whose structure reflects and exploits those features. In contrast, it is apparent that little effort has been made to develop programming techniques which fully realize the signal processing computational capability of standard minicomputers. In this paper, it is shown that a functional high-level language signal processing program can easily be modified so as to produce a similar program which, when executed, automatically generates another program containing precomputed algorithm sequencing and data access information. The generated program will then utilize central processor arithmetic and logical capability only for data-dependent computation. In this way, instructions normally associated with computation for program sequencing/control or data access are eliminated, and all benefits of increased algorithm complexity for reduction of data-dependent arithmetic computation are in fact realized as decreased program execution time. Examples are given of Fortran programs which generate Fortran FFT subroutines and, for completeness, assembly language realizations of the Pfeifer/Blankinship autocorrelation algorithm. Results demonstrate that, using this technique, standard minicomputers may execute digital signal processing algorithms faster than peripheral processors which normally require standard minicomputers as host processors.
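The paper's examples are Fortran programs that generate Fortran FFT subroutines; a minimal Python sketch of the same generate-then-execute idea, here for the simpler autocorrelation case and with invented names, is:

    def generate_autocorr(n, lags, name="autocorr_gen"):
        """Emit source for a fixed-size autocorrelation routine in which
        all loop control and indexing are precomputed at generation
        time; only data-dependent multiply-adds remain at run time."""
        lines = [f"def {name}(x):"]
        for k in range(lags):
            terms = " + ".join(f"x[{i}]*x[{i + k}]" for i in range(n - k))
            lines.append(f"    r{k} = {terms}")
        lines.append("    return [" + ", ".join(f"r{k}" for k in range(lags)) + "]")
        return "\n".join(lines)

    src = generate_autocorr(n=8, lags=3)
    namespace = {}
    exec(src, namespace)                        # "compile" the generated routine
    r = namespace["autocorr_gen"]([1.0] * 8)    # -> [8.0, 7.0, 6.0]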

Proceedings ArticleDOI
26 Sep 1977
TL;DR: In this article, the effect of unwanted stray radiation in optical systems is discussed and the basic power transfer equation is written as the product of three factors: the surface BRDF, the projected solid angle, and the power on the source objects.
Abstract: Qualitative and quantitative analyses of the effect of unwanted stray radiation in optical systems are discussed. The basic power transfer equation is written as the product of three factors: the surface BRDF, the projected solid angle, and the power on the source objects. The minimization of any one of these factors along the most significant paths of scatter will reduce unwanted radiation reaching the image. BRDF measurements and their use in selecting suitable coatings are emphasized, along with the effect of altering the baffle configurations, edge scatter, and tolerancing requirements. Optical design aspects, such as stops, field stops, Lyot stops, and obscurations, are discussed in suppressing stray radiation for various systems. The performance and testing of scaled models is related to the expected system performance. The major software tools, GUERAP and APART stray radiation analysis programs, are discussed. The approach presented is to develop qualitative concepts that can be easily grasped by the optical and mechanical designers. The approach should help stimulate stray radiation rejection ideas that can be incorporated during initial design evaluations.
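In the usual stray-light notation (assumed here, not taken from the paper), the three-factor power transfer relation reads:

    P_c \;=\; P_s \cdot \mathrm{BRDF}(\theta_i,\phi_i;\,\theta_s,\phi_s)
    \cdot \Omega_s \cos\theta_s ,

where P_s is the power incident on the scattering (source) surface, the BRDF characterizes that surface, and \Omega_s \cos\theta_s is the projected solid angle subtended by the collecting aperture as seen from that surface. Reducing any one factor along a significant scatter path reduces the unwanted power P_c reaching the image, which is the design lever the paper exploits.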

Journal ArticleDOI
TL;DR: This paper is a state-of-the-art review of software for combined continuous/discrete systems simulation with the following 18 languages or packages discussed.
Abstract: This paper is a state-of-the-art review of software for combined continuous/discrete systems simulation. The following 18 languages or packages are discussed:

Proceedings Article
06 Oct 1977
TL;DR: Key advances in information technology, both hardware and software, are described and special attention is devoted to three approaches to the development of more specialized information processing architectures: firmware enhancement, "intelligent" controllers, and minicomputer "back-end" processors.
Abstract: Demands for more effective information management, coupled with advances in computer hardware and software technology, have resulted in the emergence of the information utility concept, whereby database computers specialized for information storage and processing can serve as information nodes. Such database computers can provide high-performance and high-reliability information management services to both conventional and "personal" computers. In this paper key advances in information technology, both hardware and software, are described. Special attention is devoted to three approaches to the development of more specialized information processing architectures: (1) firmware enhancement, (2) "intelligent" controllers, and (3) minicomputer "back-end" processors. These approaches are preliminary to the development of truly specialized high-performance, high-reliability database machine architectures. The DBC and INFOPLEX functional modular database machine architectures are presented as examples.

ReportDOI
01 Dec 1977
TL;DR: MINOS is a Fortran system for solving large-scale linearly constrained optimization problems and the System Manual gives an overview of the system, the programming conventions used, data structures, tolerances, and error conditions.
Abstract: MINOS is a Fortran system for solving large-scale linearly constrained optimization problems. The System Manual gives an overview of the system, the programming conventions used, data structures, tolerances, and error conditions. Details are given of a practical implementation for maintaining a sparse LU factorization. The reduced-gradient approach for handling a nonlinear objective function has been described elsewhere; further implementation details are included here. The System Manual should facilitate interfacing of MINOS with other optimization software.

Journal ArticleDOI
TL;DR: The subprogram module is an enhancement to CSSL-based simulation software and relies on its host software for the specification and solution of the dynamic equations of the system under study.
Abstract: This paper describes a subprogram module which simplifies the implementation of parameter optimization studies in continuous dynamic systems. The module is an enhancement to CSSL-based simulation software and relies on its host software for the specification and solution of the dynamic equations of the system under study. The module includes several different optimization algorithms, a feature which can be of great value in difficult problems. It has output documentation routines and is organized so as to be transparent to the user.

Proceedings ArticleDOI
A. Douaud1, P. Eyzat1
01 Feb 1977
TL;DR: A versatile data acquisition system for engine pressure-time history that utilizes a 12-bit sample-and-hold A/D converter in conjunction with a mini-computer and software implemented is described.
Abstract: A versatile data acquisition system for engine pressure-time history is described. The on-line system utilizes a 12-bit sample-and-hold A/D converter in conjunction with a mini-computer. The computer controls this acquisition process and performs data processing to generate the desired results. The time base for the system is generated by a shaft-mounted disk and photoelectric sensors. On-line selection based on predefined criteria of particular interest is software implemented. Statistical data are available, such as the standard deviation and the histogram of maximum pressure. Teletype print, XY plot and punched paper tape are standard outputs. Among the wide variety of potential applications of this system, some actual examples in such areas as cyclic variation, knock, friction losses and heat transfer are given.
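A sketch of the per-cycle statistics such a system reports (array names and shapes are assumed for illustration, not taken from the paper):

    import numpy as np

    def peak_pressure_stats(cycles, bins=20):
        """cycles: one row of pressure samples per engine cycle.
        Returns the mean, cycle-to-cycle standard deviation, and
        histogram of the per-cycle maximum pressures."""
        p_max = cycles.max(axis=1)               # peak pressure of each cycle
        mean = p_max.mean()
        std = p_max.std(ddof=1)                  # cyclic variation measure
        hist, edges = np.histogram(p_max, bins=bins)
        return mean, std, hist, edges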

ReportDOI
01 May 1977
TL;DR: Results are presented which give the prediction equations obtained and a discussion of the predictability of errors and error rate in each sample, and recommendations for further research are also provided.
Abstract: This report presents and discusses the results obtained for statistical predictions of programming errors using multiple linear regression analysis. Programming errors were predicted from linear combinations of program characteristics and programmer variables. Each of the program characteristics variables was considered to be a measure of the program's complexity and structure. Two distinct data samples comprising 783 programs with approximately 297,000 source instructions written for command and control software applications were analyzed. Background data on both samples is provided, including discussions related to each sample's software development environment, testing conditions, predictor variables, definition of programming errors, and general data characteristics. Results are presented which give the prediction equations obtained and a discussion of the predictability of errors and error rate in each sample. Conclusions of the study and recommendations for further research are also provided. (Author)
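The underlying computation is ordinary least squares; a sketch with illustrative variable names (not the report's):

    import numpy as np

    def fit_error_model(X, errors):
        """Least-squares fit of programming-error counts on program
        characteristics (columns of X, e.g. size, branch count,
        programmer experience); returns intercept plus coefficients."""
        A = np.column_stack([np.ones(len(X)), X])   # add intercept column
        coef, *_ = np.linalg.lstsq(A, errors, rcond=None)
        return coef

    def predict_errors(coef, X):
        return np.column_stack([np.ones(len(X)), X]) @ coef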

Proceedings ArticleDOI
01 Dec 1977
TL;DR: In this article, the transmission zero problem is investigated through application of QZ-type algorithms for the nonsymmetric generalized eigenvalue problem, which use unitary similarities to efficiently reduce the problem to one where the zeros may be determined in a useful, accurate, and dependable manner.
Abstract: The computation of various "system zeros" is investigated through application of QZ-type algorithms for the nonsymmetric generalized eigenvalue problem. Such algorithms use unitary similarities to efficiently reduce the problem to one where the zeros may be determined in a useful, accurate, and dependable manner. Recent reliable and sophisticated analysis and software (specifically, EISPACK) developed by numerical linear algebra specialists is used. EISPACK, moreover, is widely available and can be applied directly to the transmission zero problem. Examples and timing estimates are given and the associated generalized eigenvector problem is noted with its application to the computation of supremal (A,B)-invariant and controllability subspaces.
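A modern sketch of the computation (scipy's generalized eigensolver is the LAPACK descendant of the EISPACK QZ routines the paper used): the transmission zeros of a square system (A, B, C, D) are the finite generalized eigenvalues of the Rosenbrock pencil.

    import numpy as np
    from scipy.linalg import eig

    def transmission_zeros(A, B, C, D):
        """Finite z with det([[zI - A, -B], [C, D]]) = 0, posed as the
        generalized eigenvalue problem det(M - z*N) = 0 with singular N."""
        n, m = A.shape[0], B.shape[1]
        assert C.shape[0] == m, "square system (equal inputs/outputs) assumed"
        M = np.block([[A, B], [C, D]])
        N = np.block([[np.eye(n), np.zeros((n, m))],
                      [np.zeros((m, n + m))]])
        z = eig(M, N, right=False)          # QZ under the hood
        return z[np.isfinite(z)]            # drop the infinite eigenvalues

    # e.g. a SISO system with transfer function (z + 1)/((z + 1)(z + 2)):
    A = np.array([[0., 1.], [-2., -3.]]); B = np.array([[0.], [1.]])
    C = np.array([[1., 1.]]);             D = np.array([[0.]])
    print(transmission_zeros(A, B, C, D))   # approximately [-1.]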

Book ChapterDOI
01 Jan 1977
TL;DR: The methodology of machining tests for data collection, the software performing spectral analysis, and the results of the first series of tests are described and discussed, and further developments in this research field are proposed.
Abstract: The paper reports on experimental research aimed at investigating the mutual relationships between tool wear and the power spectrum of the flexural vibrations of the tool, during cutting, in turning operations. The methodology of the machining tests for data collection, the software performing the spectral analysis, and the results of the first series of tests are described and discussed. Further developments in this research field are then proposed.

Journal ArticleDOI
TL;DR: This approach, called "top-down modular design," attempts to minimize the numerous iterations of the development cycle in the general process of constructing simulation programs.
Abstract: One of the more critical problems in computing science today is the rapidly increasing cost of developing and maintaining software for new automated data systems. New software development is generally a standardized process whereby software evolves from an idea to a useful system operating on a computer. The traditional model for a software development project includes feasibility study, requirements analysis, system design, program design, coding, testing, documentation, and implementation. Program design, coding, and testing are relatively well defined activities, but they are rarely straightforward, involving many iterations among the phases and the activities within the phases; these iterations are a result of the knowledge gained as the system is being generated. We will describe here a different approach to the software development process. This approach, called "top-down modular design," attempts to minimize the numerous iterations of the development cycle. The basic philosophy, similar to that of structured programming, has already been applied to a variety of applications but has not yet been utilized in the general process of constructing simulation programs.

Journal ArticleDOI
TL;DR: As organizations become more dependent on computers, they become more sensitive to computer system failures and increasing emphasis is being placed by users and vendors on the reliability of the total system, and particularly the system software.
Abstract: As organizations become more dependent on computers, they become more sensitive to computer system failures. The importance of computer reliability in real-time control systems (e.g., communications systems [8,15] and traffic control systems) has been recognized for some time. Many computer users are now becoming aware that they accomplish more on systems which seldom crash because of malfunctions than on systems which run very rapidly (and correctly) between frequent crashes. Consequently, increasing emphasis is being placed by users and vendors on the reliability of the total system, and particularly the system software. A notable example of this is the emphasis placed by IBM on reliability concerns in the design and implementation of OS/VS2 Release 2 [35,36].

01 Jan 1977
TL;DR: The purpose of this article is to examine the research developments in software for numerical computation, and to attempt to separate software research from numerical computation research, which is not easy as the two are intimately intertwined.
Abstract: The purpose of this article is to examine the research developments in software for numerical computation. Research and development of numerical methods is not intended to be discussed, for two reasons. First, a reasonable survey of the research in numerical methods would require a book. The COSERS report [Rice et al, 1977] on Numerical Computation does such a survey in about 100 printed pages, and even so the discussion of many important fields (never mind topics) is limited to a few paragraphs. Second, the present book is focused on software and thus it is natural to attempt to separate software research from numerical computation research. This, of course, is not easy, as the two are intimately intertwined. We want to define numerical computation rather precisely so as to distinguish it from business data processing, symbolic processing (such as compilers) and general utilities (such as file manipulation systems or job schedulers). We have the following definition: Numerical computation involves real numbers with procedures at a mathematical level of trigonometry, college algebra, linear algebra or higher. Some people use a somewhat narrower definition which restricts the term to computation in the physical sciences, and a few people even think of numerical computation as research and development computation (as opposed to production) in science. There are two principal sources of the problems in numerical computation: mathematical models of the physical world and the optimization of models of the organizational world. The scope and range of the sources and the associated software is illustrated by the following list:
1. Simulation of the effects of multiple explosions. The software is a very complex program of perhaps 20,000 Fortran statements. It is specially tailored to this problem and may have taken several years to implement. The program requires all the memory and many hours of time on the largest and fastest computers.
2. Optimization of feed mixtures for a chicken farmer. This is standard software of modest length (500-2000 statements) even with an interface for a naive user. It might take substantial time to execute on a small computer.
3. Analysis of the structural vibration of a vehicle. The software is similar to that of example 1. One might also use NASTRAN (see II.G.3) with only a few months for an implementation. More computer time and memory would be used by this approach.
4. Simple linear regression on demographic data (e.g. age or income). This is …

Journal ArticleDOI
TL;DR: The SAO laser satellite-ranging systems, located in Brazil, Peru, Australia, and Arizona, have been in operation for more than five years and have provided ranging data at accuracy levels of a meter or better as discussed by the authors.
Abstract: The four SAO laser satellite-ranging systems, located in Brazil, Peru, Australia, and Arizona, have been in operation for more than five years and have provided ranging data at accuracy levels of a meter or better. The paper examines system hardware (laser transmitter, the electronics, mount, photoreceiver, minicomputer, and station timing) and software (prediction program, calibration programs, and data handling and quick-look programs) and also considers calibration, station operation, and system performance.