
Showing papers on "Software development" published in 1976


Journal ArticleDOI
TL;DR: In this article, the authors compare sequential development with stepwise refinement and specification of information hiding modules, and demonstrate that the two methods are based on the same concepts but bring complementary advantages.
Abstract: Program families are defined (analogously to hardware families) as sets of programs whose common properties are so extensive that it is advantageous to study the common properties of the programs before analyzing individual members. The assumption that, if one is to develop a set of similar programs over a period of time, one should consider the set as a whole while developing the first, is discussed. Three approaches to the development are considered: a conventional approach called "sequential development," "stepwise refinement," and "specification of information hiding modules." A more detailed comparison of the latter two methods is then made. By means of several examples it is demonstrated that the two methods are based on the same concepts but bring complementary advantages.
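To make the "information hiding module" idea concrete, here is a minimal sketch in modern Python (an invented example, not Parnas's own notation): the module exposes only its operations, so different family members can change the hidden representation without affecting client code.

    # Hypothetical sketch: an "address list" module that hides its storage format.
    # Clients use only insert/lookup; family members may replace the dict with a
    # sorted file, a database, etc., without changing any client code.
    class AddressList:
        def __init__(self):
            self._entries = {}          # hidden representation (a design secret)

        def insert(self, name, address):
            self._entries[name] = address

        def lookup(self, name):
            return self._entries.get(name)

    book = AddressList()
    book.insert("Parnas", "Darmstadt")
    print(book.lookup("Parnas"))        # -> Darmstadt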

851 citations


Proceedings ArticleDOI
13 Oct 1976
TL;DR: The study reported in this paper provides for the first time a clear, well-defined framework for assessing the often slippery issues associated with software quality, via the consistent and mutually supportive sets of definitions, distinctions, guidelines, and experiences cited.
Abstract: The study reported in this paper establishes a conceptual framework and some key initial results in the analysis of the characteristics of software quality. Its main results and conclusions are:
• Explicit attention to characteristics of software quality can lead to significant savings in software life-cycle costs.
• The current software state-of-the-art imposes specific limitations on our ability to automatically and quantitatively evaluate the quality of software.
• A definitive hierarchy of well-defined, well-differentiated characteristics of software quality is developed. Its higher-level structure reflects the actual uses to which software quality evaluation would be put; its lower-level characteristics are closely correlated with actual software metric evaluations which can be performed.
• A large number of software quality-evaluation metrics have been defined, classified, and evaluated with respect to their potential benefits, quantifiability, and ease of automation.
• Particular software life-cycle activities have been identified which have significant leverage on software quality.
Most importantly, we believe that the study reported in this paper provides for the first time a clear, well-defined framework for assessing the often slippery issues associated with software quality, via the consistent and mutually supportive sets of definitions, distinctions, guidelines, and experiences cited. This framework is certainly not complete, but it has been brought to a point sufficient to serve as a viable basis for future refinements and extensions.
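The hierarchy-plus-metrics structure can be illustrated with a small, hedged sketch in Python (characteristic names and scores are invented, not the paper's actual hierarchy): leaf metrics that can be measured directly are rolled up into higher-level quality characteristics.

    # Hypothetical sketch: a two-level quality hierarchy whose leaf metrics
    # (values in [0, 1]) are averaged up into higher-level characteristics.
    quality_tree = {
        "maintainability": {"consistency": 0.8, "self-descriptiveness": 0.6},
        "reliability":     {"error tolerance": 0.7, "accuracy": 0.9},
    }

    def score(characteristic):
        metrics = quality_tree[characteristic]
        return sum(metrics.values()) / len(metrics)

    for c in quality_tree:
        print(f"{c}: {score(c):.2f}")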

739 citations


Book
22 Sep 1976
TL;DR: Software development, software testing, structured programming, composite design, language design, proofs of program correctness, and mathematical reliability models are covered, in an informal style for anyone whose work is affected by the unreliability of software.
Abstract: Deals constructively with recognized software problems. Focuses on the unreliability of computer programs and offers state-of-the-art solutions. Covers software development, software testing, structured programming, composite design, language design, proofs of program correctness, and mathematical reliability models. Written in an informal style for anyone whose work is affected by the unreliability of software. Examples illustrate the key ideas, and over 180 references are included.

245 citations


Journal ArticleDOI
TL;DR: This survey focuses on two system structuring concepts that support security: small protection domains and extended-type objects and explains one approach toward implementing these concepts thoroughly and efficiently.
Abstract: Security has become an important and challenging goal in the design of computer systems. This survey focuses on two system structuring concepts that support security: small protection domains and extended-type objects. These two concepts are especially promising because they also support reliable software by encouraging and enforcing highly modular software structures--in both systems software and in applications programs. Small protection domains allow each subunit or module of a program to be executed in a restricted environment that can prevent unanticipated or undesirable actions by that module. Extended-type objects provide a vehicle for data abstraction by allowing objects of new types to be manipulated in terms of operations that are natural for these objects. This provides a way to extend system protection features so that protection can be enforced in terms of applications-oriented operations on objects. This survey also explains one approach toward implementing these concepts thoroughly and efficiently--an approach based on the concept of capabilities incorporated into the addressing structure of the computer. Capability-based addressing is seen as a practical way to support future requirements for security and reliable software without sacrificing requirements for performance, flexibility, and sharing.
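A hedged sketch of how the two concepts fit together (invented names; not the addressing mechanism the survey describes): an extended-type object is manipulated only through its defined operations, and a capability pairs that object with the subset of operations a small protection domain is allowed to invoke.

    # Hypothetical sketch: capabilities as tokens pairing an object with the
    # operations a protection domain may invoke on it.
    class MailboxType:                       # an "extended-type" object
        def __init__(self):
            self._messages = []
        def send(self, msg):
            self._messages.append(msg)
        def receive(self):
            return self._messages.pop(0) if self._messages else None

    class Capability:
        def __init__(self, obj, rights):
            self._obj, self._rights = obj, frozenset(rights)
        def invoke(self, op, *args):
            if op not in self._rights:
                raise PermissionError(f"capability lacks right: {op}")
            return getattr(self._obj, op)(*args)

    box = MailboxType()
    sender_cap = Capability(box, {"send"})   # a small protection domain's view of box
    sender_cap.invoke("send", "hello")
    # sender_cap.invoke("receive") would raise PermissionError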

155 citations


Proceedings Article
01 Jan 1976
TL;DR: A computer-aided system for maintaining and analyzing system requirements is described; it includes the Requirements Statement Language (RSL), a flow-oriented language for the expression of software requirements, and the Requirements Engineering and Validation System (REVS), a software package which includes a translator for RSL.

144 citations


Journal ArticleDOI
TL;DR: The formal methodology of Higher Order Software (HOS), specifically aimed toward large-scale multiprogrammed/multiprocessor systems, is dedicated to systems reliability.
Abstract: The key to software reliability is to design, develop, and manage software with a formalized methodology which can be used by computer scientists and applications engineers to describe and communicate interfaces between systems. These interfaces include: software to software; software to other systems; software to management; as well as discipline to discipline within the complete software development process. The formal methodology of Higher Order Software (HOS), specifically aimed toward large-scale multiprogrammed/multiprocessor systems, is dedicated to systems reliability. With six axioms as the basis, a given system and all of its interfaces are defined as if they formed one complete and consistent computable system. Some of the derived theorems provide for: reconfiguration of real-time multiprogrammed processes, communication between functions, and prevention of data and timing conflicts.

108 citations


Proceedings ArticleDOI
13 Oct 1976
TL;DR: The Software Development System (SDS) as mentioned in this paper is a methodology addressing the problems involved in the development of software for Ballistic Missile Defense (BMD) systems; it is a broad approach attacking problems arising in requirements generation, software design, coding, and testing.
Abstract: This paper presents a discussion of the Software Development System (SDS), a methodology addressing the problems involved in the development of software for Ballistic Missile Defense systems. These are large, real-time, automated systems with a requirement for high reliability. The SDS is a broad approach attacking problems arising in requirements generation, software design, coding, and testing. The approach is highly requirements oriented and has resulted in the formulation of structuring concepts, a requirements statement language, process design language, and support software to be used throughout the development cycle. This methodology represents a significant advance in software technology for the development of software for a class of systems such as BMD. The support software has been implemented and is undergoing evaluation.

107 citations


Proceedings ArticleDOI
07 Jun 1976
TL;DR: An analysis of the software life-cycle is performed to determine where in the cycle the application of quality assurance techniques would be most beneficial, and a variety of approaches to increasing software quality are reviewed.
Abstract: This paper presents an examination into the economics of software quality assurance. An analysis of the software life-cycle is performed to determine where in the cycle the application of quality assurance techniques would be most beneficial. The number and types of errors occurring at various phases of the software life-cycle are estimated. A variety of approaches to increasing software quality (including Structured Programming, Top Down Design, Programmer Management Techniques and Automated Tools) are reviewed and their potential impact on quality and costs is examined.
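The underlying economics can be illustrated with a toy calculation (all figures are invented, not the paper's estimates): when the cost of repairing an error grows with the phase in which it is found, shifting detection earlier in the life-cycle reduces total cost.

    # Hypothetical figures: errors found per phase and relative repair cost per error.
    errors_found   = {"requirements": 10, "design": 20, "coding": 40, "operation": 30}
    cost_per_error = {"requirements": 1,  "design": 3,  "coding": 10, "operation": 60}

    total = sum(errors_found[p] * cost_per_error[p] for p in errors_found)
    print("relative life-cycle repair cost:", total)

    # Shifting 20 of the errors caught in operation back to design-time reviews:
    shifted = dict(errors_found, design=40, operation=10)
    print("with earlier detection:", sum(shifted[p] * cost_per_error[p] for p in shifted))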

75 citations


Book
01 Jan 1976
TL;DR: Software Tools in Pascal is written to teach how to write good Pascal programs that make good tools, and all programmers, professional and student, will find the book invaluable as a source of proven, useful programs for reading and study.
Abstract: From the Publisher: With the same style and clarity that characterized their highly acclaimed The Elements of Programming Style and Software Tools, the authors have written Software Tools in Pascal to teach how to write good Pascal programs that make good tools. The programs contained in the book are not artificial, but are actual tools that have proved valuable in the production of other programs. Structured programming and top-down design are emphasized and applied to every program, as are principles of sound design, testing, efficiency, and portability. All of the programs are complete and have been tested directly from the text. The programs are available in machine-readable form from Addison-Wesley. Software Tools in Pascal is ideal for use in a software engineering course, for a second course in programming, or as a supplement in any programming course. All programmers, professional and student, will find the book invaluable as a source of proven, useful programs for reading and study. Numerous exercises are provided to test comprehension and to extend the concepts presented in the book.

68 citations


Proceedings ArticleDOI
13 Oct 1976
TL;DR: In this paper, the authors describe work toward a computer-surveyed, intuition-controlled programming support system (CIPS), a project under way at the Technical University of Munich Informatics Institute.
Abstract: Programming is studied as an evolutionary process (starting with the problem description and ending with some computer program), which is carried out in a sequence of transformation steps. The nature of these steps at several levels is illustrated by typical examples. The aim is to arrive at a computer-surveyed, intuition-controlled programming support system (CIPS). Some aspects of this project, which is under way at the Technical University of Munich Informatics Institute, are discussed.
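One transformation step of the general kind described can be sketched as follows (a hedged Python illustration, not the project's own notation): a recursive, problem-level description is replaced by an equivalent iterative program, with the equivalence spot-checked on a few inputs.

    # Hypothetical example of a correctness-preserving transformation step:
    # a recursive specification of factorial is replaced by an iterative version.
    def factorial_spec(n):                  # problem-level description
        return 1 if n == 0 else n * factorial_spec(n - 1)

    def factorial_program(n):               # transformed, loop-based program
        result = 1
        for i in range(2, n + 1):
            result *= i
        return result

    assert all(factorial_spec(n) == factorial_program(n) for n in range(10))
    print(factorial_program(6))             # -> 720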

64 citations



Proceedings ArticleDOI
13 Oct 1976
TL;DR: This paper argues for five essential elements of any software engineering curriculum: computer science, management science, communication skills, problem solving, and design methodology.
Abstract: Software engineering involves the application of principles of computer science, management science, and other fields to the design and construction of software systems. Education in software engineering is fundamentally different from education in computer science, management science, or other constituent fields, even though it shares a large common area of concern. As we move toward the development of coordinated software engineering curricula, it is mandatory that we identify principles, not just random collections of techniques, on which to build them. Our research, teaching, and practical experience lead us to argue for five essential elements of any software engineering curriculum: computer science, management science, communication skills, problem solving, and design methodology. This paper will discuss these areas, illustrate their current application in courses, and indicate their implications for curriculum development.

Proceedings ArticleDOI
13 Oct 1976
TL;DR: In this paper, the authors provide quantitative evidence, from widely different environments, of the existence and nature of the evolutionary process of large scale, widely used programs such as operating systems, and some implications for software engineering and for project planners and managers are noted.
Abstract: Large scale, widely used programs such as operating systems are never complete. They undergo a continuing evolutionary cycle of maintenance, augmentation and restructuring to keep pace with evolving usage and implementation technologies. The paper provides quantitative evidence, from widely different environments, of the existence and nature of this evolutionary process. Interpretations and possible significance of some of the observed phenomena are discussed. Some implications for software engineering and for project planners and managers are noted.

Book ChapterDOI
01 Jan 1976
TL;DR: This paper concentrates on the third step from the viewpoint of a numerical analyst working on software for elementary and special functions.
Abstract: There are three distinct steps in the development of a numerical computer program: the development of theoretical methods to perform the desired computation, the development of practical computational algorithms utilizing one or more theoretical methods, and the implementation of these practical algorithms in documented computer software. This paper concentrates on the third step from the viewpoint of a numerical analyst working on software for elementary and special functions.
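A hedged sketch of the third step for one elementary function (illustrative only; the reduction scheme and term count are not taken from the paper): an algorithm for exp(x) using argument reduction and a truncated series is turned into checkable code and compared against a library routine.

    import math

    # Illustrative only: compute exp(x) by reducing x to x = k*ln2 + r with |r| small,
    # evaluating a short Taylor series for exp(r), and scaling by 2**k.
    def exp_approx(x, terms=12):
        k = round(x / math.log(2))
        r = x - k * math.log(2)
        series = sum(r**n / math.factorial(n) for n in range(terms))
        return math.ldexp(series, k)        # series * 2**k

    for x in (-5.0, 0.1, 3.7):
        print(x, exp_approx(x), math.exp(x))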

Proceedings ArticleDOI
13 Oct 1976
TL;DR: The Automated Testing and Load Analysis System (ATLAS) formalizes a concept of model-referenced testing for large software systems and has been successfully employed in testing over 40,000 instructions of Bell Laboratories large No. 4 ESS software package.
Abstract: The Automated Testing and Load Analysis System (ATLAS) formalizes a concept of model-referenced testing for large software systems. A directed graph model of the software under test, describing the sequential stimulus-response behavior of the software system, forms the basis of the approach. The objective of ATLAS is to certify the software under test against the model. This objective is met by components of ATLAS that automatically identify, generate, apply, and verify the set of tests required to establish that the software has correctly realized the model. The system has been successfully employed in testing over 40,000 instructions of Bell Laboratories' large No. 4 ESS software package. Usage data and experience from this application and a critique of the approach are given.
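The model-referenced idea can be sketched as follows (the graph, stimuli, and "software under test" are invented, not the No. 4 ESS model): a directed graph of stimulus/response behavior is walked to generate test sequences, and the implementation's responses are verified against the model at every step.

    # Hypothetical model: states, and edges labelled stimulus -> (expected response, next state).
    model = {
        "idle":    {"off_hook": ("dial_tone", "dialing")},
        "dialing": {"digit":    ("silence",   "dialing"),
                    "on_hook":  ("none",      "idle")},
    }

    def implementation(state, stimulus):     # stand-in for the software under test
        response, nxt = model[state][stimulus]
        return response, nxt                 # a faulty build would diverge here

    def run_tests(start="idle", depth=3):
        frontier = [(start, [])]
        for _ in range(depth):
            nxt_frontier = []
            for state, path in frontier:
                for stim, (expected, nxt) in model[state].items():
                    got, got_state = implementation(state, stim)
                    assert (got, got_state) == (expected, nxt), f"divergence after {path + [stim]}"
                    nxt_frontier.append((nxt, path + [stim]))
            frontier = nxt_frontier
        print("all generated sequences verified")

    run_tests()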

Proceedings ArticleDOI
13 Oct 1976
TL;DR: A flexible framework, using a System Monitor, to design error-resistant software is presented, followed by a discussion of the strategies to handle errors in the module, program, and system levels.
Abstract: This paper presents a flexible framework, using a System Monitor, to design error-resistant software. The System Monitor contains the code and data for error detection, error containment and recovery at the module level, program level, and system level. It contains five types of components: the Internal Process Supervisor, the External Process Supervisor, the Interaction Supervisor, the System Monitor Kernel, and the Maintenance Program. The functions of each component are discussed, followed by a discussion of the strategies to handle errors at the module, program, and system levels.
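A hedged, greatly simplified sketch of the framework's intent (the structure here does not follow the paper's five components): a monitor runs each module under supervision, contains a detected error at the module level, and applies a program-level recovery action.

    # Hypothetical sketch: module-level containment with program-level fallback.
    class SystemMonitor:
        def __init__(self):
            self.log = []

        def run_module(self, name, module, fallback, *args):
            try:
                result = module(*args)
                if result is None:                    # crude acceptance check
                    raise ValueError("module produced no result")
                return result
            except Exception as err:                  # error detection + containment
                self.log.append((name, repr(err)))
                return fallback(*args)                # program-level recovery

    def flaky_parser(text):
        return None if not text else text.split()

    monitor = SystemMonitor()
    print(monitor.run_module("parser", flaky_parser, lambda text: [], ""))   # -> []
    print(monitor.log)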

Journal ArticleDOI
TL;DR: This paper examines the structure both of subroutine libraries for use with some base language and of complete programming languages, and outlines the advantages and disadvantages of each, along with facilities that should be present in any software package.
Abstract: This paper describes some software packages and programming systems for computer graphics applications, in the process considering software features for both passive and interactive graphics. It examines the structure both of subroutine libraries for use with some base language and of complete programming languages, and outlines the advantages and disadvantages of each, along with facilities that should be present in any software package.

Proceedings ArticleDOI
07 Jun 1976
TL;DR: An approach to the problem of increased program execution time caused by the incorporation of validation and recovery procedures is introduced, and a model system architecture tailored for efficient execution of failure-tolerant parallel programs is described.
Abstract: The state of the art in software validation, as well as the continuing growth of the size and complexity of software subsystems, makes the extra costs paid for software error tolerance more than justified. A program in which software redundancy is incorporated, i.e., a program in which procedures for run-time validation and recovery are explicitly specified, is generally called a failure-tolerant program. One problem in failure-tolerant programming, which can be particularly serious in real-time computing environments, is the increase in program execution time due to the incorporation of validation and recovery procedures. This paper introduces an approach to the solution, called failure-tolerant parallel programming. The essence of this approach is to maximally overlap main-stream computation with the redundant computation oriented toward validation and recovery. Subsequently, a model system architecture tailored for efficient execution of failure-tolerant parallel programs is described. It is of a highly general and modular nature and contains a novel memory subsystem named the duplex memory. Directions for further research on program structuring and on expansion of the model architecture are also indicated.
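The overlap idea can be sketched with ordinary threads (a hedged illustration; the paper's duplex-memory architecture is not modeled): the main-stream computation and a redundant validation computation run concurrently, and recovery is invoked only if the acceptance test fails.

    import threading

    # Hypothetical example: main computation and redundant check run in parallel;
    # a recovery action is taken only when validation rejects the result.
    def main_computation(data, out):
        out["sum"] = sum(data)                       # primary result

    def validation(data, out):
        out["check"] = 0
        for x in data:                               # independent, slower re-computation
            out["check"] += x

    data, out = list(range(1000)), {}
    threads = [threading.Thread(target=f, args=(data, out))
               for f in (main_computation, validation)]
    for t in threads: t.start()
    for t in threads: t.join()

    if out["sum"] != out["check"]:                   # acceptance test
        out["sum"] = out["check"]                    # recovery: adopt the redundant result
    print(out["sum"])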

Proceedings ArticleDOI
13 Oct 1976
TL;DR: The importance of verifying systems specifications before commencing any software design is described and a technique for accomplishing this objective is delineated.
Abstract: Specifications provide the fundamental link to make the transition between the concept and definition phases of the system development cycle. Straightforward, unambiguous specifications are required to ensure successful results and at the same time minimize cost overruns during the development cycle. Many of the problems currently being addressed by software engineers have their origins in the frequently inconsistent and incomplete nature of system specifications.

The U.S. Army Ballistic Missile Defense Advanced Technology Center (BMDATC) is currently studying several advanced software development technologies. BMDATC's efforts are directed toward identifying and resolving the fundamental problems that plague the software community: excessive costs, unrealistic or inappropriate schedules, and inadequate performance. A primary category of the BMDATC program is Data Processing System Engineering Research. This research employs an advanced engineering approach to the generation, verification, and unambiguous communication of a complete and consistent set of system requirements. The key elements of this technology are: (1) a mathematically rigorous decomposition technology that effectively translates system requirements into a traceable graphic representation; (2) a usable System Specification Language (SSL) that supports simulation and specification generation; (3) a set of software tools that aid in the development, verification, and configuration control of the decomposed requirements; and (4) a management approach that supports the designed-in quality of the developed specification.

Definitive specifications are of primordial importance to the development process in that they are both the springboard for the design process and the yardstick of the test procedures. This paper describes the importance of verifying systems specifications before commencing any software design and delineates a technique for accomplishing this objective.
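Element (3), tool support for verifying a decomposed requirement set, can be illustrated with a toy check (requirement identifiers and rules are invented for illustration): every derived requirement must trace to an existing parent and be allocated to some component.

    # Hypothetical requirement records: id -> (parent id or None, allocated component or None)
    requirements = {
        "R1":   (None, None),          # top-level system requirement
        "R1.1": ("R1", "tracker"),
        "R1.2": ("R1", None),          # decomposed but not yet allocated
        "R2.1": ("R2", "radar"),       # traces to a parent that does not exist
    }

    def check(reqs):
        problems = []
        for rid, (parent, component) in reqs.items():
            if parent is not None and parent not in reqs:
                problems.append(f"{rid}: dangling trace to {parent}")
            if parent is not None and component is None:
                problems.append(f"{rid}: not allocated to any component")
        return problems

    for p in check(requirements):
        print(p)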

Proceedings ArticleDOI
01 Feb 1976
TL;DR: A course has been developed that provides a rich experience in software design, including the critical aspects of group work and programming and documentation style.
Abstract: Although the importance of providing realistic educational experiences involving the design of software systems has been recognized in many undergraduate curricula, it is difficult to do so consistently. Within the constraints of an existing curriculum and a small amount of class time, a course has been developed that provides a rich experience in software design, including the critical aspects of group work and programming and documentation style. The course, its goals, and its main features are described and analyzed. Experience with the course is reported, and the problem of evaluating such a course is discussed.

Journal ArticleDOI
TL;DR: Measurement is a fundamental technique of any discipline; in computer engineering, measurement needs can be divided into three categories.
Abstract: Measurement is a fundamental technique of any discipline or science. In computer engineering, measurement needs can be divided into three categories:

Journal ArticleDOI
B. Kline1, M. Maerz, P. Rosenfeld
01 Jun 1976
TL;DR: In-circuit emulation, a major breakthrough in microcomputer development systems, has provided the ability to integrate hardware and software development during all phases of the development cycle, so the software designer can work with the prototype hardware as it is being designed by the hardware engineer.
Abstract: For years, software and hardware development for microcomputer-based products was accomplished by two segregated development efforts. This approach resulted in wasted effort and delays due to inconsistencies between hardware specifications and software implementation at the prototype level. In-circuit emulation, a major breakthrough in microcomputer development systems, has provided the ability to integrate hardware and software development during all phases of the development cycle. The software designer can now work with the prototype hardware as it is being designed by the hardware engineer. In addition, the hardware designer is now able to construct his hardware while working with the actual design software, facilitating debugging as hardware development progresses. For the first time, powerful microcomputer development system debug aids can be applied in the user environment.

Journal ArticleDOI
01 Jun 1976
TL;DR: The various kinds of software support tools that are commercially available for programming microprocessors are described, and an itemized list of assembler features which support good program design is presented.
Abstract: The software tools used to develop a microcomputer-based product can have a substantial effect on the development costs, development time cycle, and reliability of that product. The power and the appropriateness of the commercially available software tools for programming microprocessors vary significantly. Hence, in addition to evaluating the appropriateness of a microprocessor for a particular product, the systems designer must also evaluate the software tools available for that microprocessor. This paper describes the various kinds of software support tools that are commercially available for programming microprocessors. The functions and features of the tool categories of editors, assemblers and compilers, loaders, simulators, and debuggers are discussed. Tradeoffs between cross-computer and resident tools are presented. Finally, an itemized list of assembler features which support good program design is presented, and six commercially available microprocessor assemblers (Fairchild's F8 assembler, Intel's 8080 assembler, Motorola's 6800 assembler, National's IMP-16 assembler, Rockwell's PPS-8 assembler, and Signetics' 2650 assembler) are compared on the extent to which they provide these features.
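One of the assembler features that supports good program design, symbolic labels resolved by a two-pass scheme, can be sketched as follows (a toy assembler for an invented three-instruction machine, not any of the six assemblers compared).

    # Hypothetical two-pass assembler for a toy machine: pass 1 collects label
    # addresses, pass 2 emits (opcode, operand) pairs with labels resolved.
    OPCODES = {"LOAD": 1, "JMP": 2, "HALT": 3}

    source = [
        "start: LOAD 5",
        "        JMP end",
        "        LOAD 7",
        "end:   HALT",
    ]

    def assemble(lines):
        symbols, stripped = {}, []
        for addr, line in enumerate(lines):             # pass 1: build symbol table
            if ":" in line:
                label, line = line.split(":", 1)
                symbols[label.strip()] = addr
            stripped.append(line.split())
        program = []
        for mnemonic, *operand in stripped:             # pass 2: resolve and emit
            arg = operand[0] if operand else "0"
            value = symbols.get(arg)
            program.append((OPCODES[mnemonic], value if value is not None else int(arg)))
        return program

    print(assemble(source))      # -> [(1, 5), (2, 3), (1, 7), (3, 0)]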

Proceedings ArticleDOI
13 Oct 1976
TL;DR: The SAFEGUARD System represents the development of one of the largest, most complex software systems ever undertaken, including real time applications, support, and hardware installation and maintenance.
Abstract: The SAFEGUARD System represents the development of one of the largest, most complex software systems ever undertaken. Various types of software were developed, including real time applications, support, and hardware installation and maintenance. Two million instructions were developed at a cost of approximately five thousand staff years of effort. The objective of this paper is to document the staff resources utilized in this development. The actual development rates for the different types of software and the various factors affecting those rates are analyzed. Software productivity is shown to be a function of the type of software - logical, algorithmic, man machine, etc. Emphasis is placed upon total project productivity. The allocation of staff resources for the systems engineering, design, code and unit test, and integration activities, which had an approximate percentage distribution of 20, 20, 17, 43, is analyzed and characterized.
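The reported distribution converts directly into absolute staffing figures (the 5,000 staff-year total and the percentages are the paper's; the arithmetic below is only an illustration).

    # Allocation of ~5,000 staff years across the activities reported for SAFEGUARD.
    total_staff_years = 5000
    distribution = {"systems engineering": 20, "design": 20,
                    "code and unit test": 17, "integration": 43}   # percent

    for activity, pct in distribution.items():
        print(f"{activity}: {total_staff_years * pct / 100:.0f} staff years")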

Proceedings ArticleDOI
13 Oct 1976
TL;DR: The software design process is discussed from an engineering point of view in terms of evolving a system architecture independently of implementation considerations; functional completeness, quality, machine independence, and performance completeness of a design are used as criteria for engineering design decisions.
Abstract: The software design process is discussed from an engineering point of view. Initially, a distinction is made between software design and program design. Software design is then described in terms of evolving a system architecture independently of implementation considerations. Computation structures are introduced as a means of modeling the dynamic behavior of a software architecture. Functional completeness, quality, machine independence, and performance completeness of a design are then used as criteria for engineering design decisions. Finally, the basic elements of a system to support the software design process are described.

Proceedings ArticleDOI
13 Oct 1976
TL;DR: This paper summarizes current research at RI aimed at developing secure operating systems and verifying certain critical properties of these systems, and shows that proofs of design properties can be relatively straightforward when the design is specified in a suitable formal specification language.
Abstract: This paper summarizes current research at RI aimed at developing secure operating systems and verifying certain critical properties of these systems. It is seen that proofs of design properties can be relatively straightforward when the design is specified in a suitable formal specification language. These proofs demonstrate the correspondence between the desired properties and a specification of the system design. Various on-line tools aid considerably in this process. In addition, correctness proofs for implementations of such systems are now feasible, because of both various theoretical advances and the use of supporting tools.
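The flavor of checking a design property against a specification can be conveyed by a toy example (the specification, the property, and the bounded exhaustive check are all invented; real proofs of the kind described would be symbolic rather than enumerative): a mutual-exclusion property is verified over every state reachable in a small design specification.

    # Hypothetical specification of a tiny resource manager: the state is the set of
    # holders of an exclusive lock; the operations are acquire(p) and release(p).
    def acquire(state, p):
        return state | {p} if not state else state       # refuse if already held

    def release(state, p):
        return state - {p}

    # Desired design property: the lock is never held by more than one process.
    def check_invariant(max_depth=4, procs=("a", "b")):
        frontier = [frozenset()]
        for _ in range(max_depth):                       # bounded exhaustive exploration
            nxt = []
            for state in frontier:
                for p in procs:
                    for op in (acquire, release):
                        new = frozenset(op(state, p))
                        assert len(new) <= 1, f"mutual exclusion violated in {new}"
                        nxt.append(new)
            frontier = nxt
        return True

    print("mutual exclusion holds for all explored states:", check_invariant())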

Journal ArticleDOI
TL;DR: These standards constrain code to be developed in a “structured” form for both data and control structures and have proved extremely valuable in practice and reduced the cost and time to produce and maintain large software systems that have been deployed in live multiple customer environments.
Abstract: A sample set of Cobol programming standards is offered. These standards constrain code to be developed in a “structured” form for both data and control structures. They do not require syntax beyond the existing Cobol language and in fact utilize a typical limited subset of the 1974 ANS Cobol standard. These standards have proved extremely valuable in practice and have reduced the cost and time to produce and maintain large software systems that have been deployed in live multiple customer environments.

Journal ArticleDOI
TL;DR: This paper discusses the requirements of programmers working in varying environments in relation to software engineering, structured programming, and program verification.
Abstract: This paper discusses the requirements of programmers working in varying environments in relation to software engineering, structured programming, and program verification.

Proceedings ArticleDOI
20 Oct 1976
TL;DR: A rationale for the use of computer assistance in seminars, curriculum development, and other educational activity is given, concluding with a checklist intended for the potential organizer of a conference.
Abstract: A series of studies at the University of Michigan has explored information systems as a basis for learning environments (Zinn, 1974). Currently staff at the Center for Research on Learning and Teaching (CRLT) is looking at educational uses of computer-based conferencing, for example, computer-based seminars and computer-assisted curriculum development. Departments trying other uses, for example, computer-based committee work and computer-aided proposal preparation, find the basic software developed at CRLT to be applicable. However, some of the procedures described here are modified for the various purposes.

A paper now in preparation by the same authors reports the history of computer-based conferencing at the University of Michigan. In brief, the activity did not become practical until the spring of 1975 when the CONFER I program (in Fortran) was completed by Robert Parnes. The conferencing software did not receive much use outside the circle of people interested in the software development until CONFER II became operational in the fall. A year of experience provides clear indication of the scope of conferencing applications and the resources needed for effective use.

The first section of this note gives a rationale for the use of computer assistance in seminars, curriculum development and other educational activity. It concludes with a checklist intended for the potential organizer of a conference. The second section provides data on phase one of a study of conferencing applied in seminars and individual study. The third section discusses costs, time commitments and benefits. A fourth section describes implications of computer-based educational communications for the University of Michigan.

Proceedings ArticleDOI
13 Oct 1976
TL;DR: The SEF is easily transferable and can be used with various hardware/operating system configurations, where it provides a host-independent software development system; in such a role it provides the software developer with standard facilities across a variety of host systems.
Abstract: The Software Engineering Facility (SEF) is a system for software engineering which is specifically designed to support the development of well-engineered software. However, it is not an operating system. Unlike operating systems such as OS/370, EXEC 8, and others, the SEF is not meant to support the execution of applications programs, just as the ordinary operating systems are not intended to specifically support the development of well-engineered applications programs. The SEF, in fact, will run under and use the facilities of such operating systems. It, then, is easily transferable and can be used with various hardware/operating system configurations where it will provide a host-independent software development system. In such a role it will provide to the software developer standard facilities across a variety of host systems.

The SEF provides the central support for an integrated collection of subsystems, and the subsystems provide appropriate facilities for all phases of the software development process, from requirements definition through maintenance and enhancement. Each such subsystem contributes in a cohesive way towards the cost-effective development of well-engineered software.

The design of the SEF was determined only after a careful analysis of the software development process and a thorough study of recent efforts in the field of software engineering.
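The host-independence claim can be sketched as a portability layer (the interface and the two back ends are invented for illustration; this is not the SEF's actual design): the facility offers one set of developer-facing operations and maps them onto whatever the underlying host system provides.

    import os, tempfile

    # Hypothetical portability layer: the same save/load interface is offered
    # regardless of which host back end is configured underneath.
    class HostBackend:
        def write(self, name, text): raise NotImplementedError
        def read(self, name): raise NotImplementedError

    class LocalFileBackend(HostBackend):
        def __init__(self, root=None):
            self.root = root or tempfile.mkdtemp()
        def write(self, name, text):
            with open(os.path.join(self.root, name), "w") as f:
                f.write(text)
        def read(self, name):
            with open(os.path.join(self.root, name)) as f:
                return f.read()

    class InMemoryBackend(HostBackend):
        def __init__(self): self.store = {}
        def write(self, name, text): self.store[name] = text
        def read(self, name): return self.store[name]

    class SoftwareEngineeringFacility:
        def __init__(self, backend): self.backend = backend
        def save_design_document(self, name, text): self.backend.write(name, text)
        def load_design_document(self, name): return self.backend.read(name)

    for backend in (LocalFileBackend(), InMemoryBackend()):
        sef = SoftwareEngineeringFacility(backend)
        sef.save_design_document("spec.txt", "module A hides its data format")
        print(sef.load_design_document("spec.txt"))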