
Showing papers in "IEEE Computer in 1976"


Journal ArticleDOI
TL;DR: Three types of simplified models for the system workload are presented and the probabilistic models can be validated with respect to the real workload and are easy to use in a performance evaluation study.
Abstract: A major problem in the evaluation of the performance of a multiprogrammed computer system is the development of an accurate description of its normal workload. This paper formulates the workload characterization problem for a computing environment and presents three types of simplified models for the system workload. The probabilistic models of the workload presented here can be validated with respect to the real workload and are easy to use in a performance evaluation study. The results of a study of the workloads on the Univac 1108 computer at the Computer Science Center of the University of Maryland are also presented.
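The paper's own workload models are not reproduced in this listing. As a hedged illustration of the general idea of a probabilistic workload model, the sketch below draws synthetic jobs from exponential demand distributions; the job attributes and parameter values are invented for the example and are not the Univac 1108 data.

```python
import random

# Hypothetical fitted parameters -- illustrative only, not the paper's data.
MEAN_CPU_SECONDS = 4.0      # mean CPU demand per job
MEAN_IO_REQUESTS = 25.0     # mean I/O-request count per job

def sample_job(rng):
    """Draw one synthetic job from exponential demand distributions."""
    return {
        "cpu_seconds": rng.expovariate(1.0 / MEAN_CPU_SECONDS),
        "io_requests": int(rng.expovariate(1.0 / MEAN_IO_REQUESTS)),
    }

def synthetic_workload(n_jobs, seed=1):
    """Generate a reproducible synthetic workload of n_jobs jobs."""
    rng = random.Random(seed)
    return [sample_job(rng) for _ in range(n_jobs)]

jobs = synthetic_workload(1000)
avg_cpu = sum(j["cpu_seconds"] for j in jobs) / len(jobs)
print(f"mean CPU demand of synthetic jobs: {avg_cpu:.2f} s")
```

A model like this can be validated against the real workload by comparing the sampled distributions with measured job statistics, which is the kind of check the abstract describes.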

80 citations


Journal ArticleDOI
J. Rodriguez-Rosell1
TL;DR: With very few exceptions the reference strings that have been measured characterize virtual memory utilization, reflecting the fact that the motivating force behind these research activities is the desire to understand the behavior of virtual memory paging systems.
Abstract: During the past several years a considerable amount of effort has gone into the measurement, analysis, and modeling of program behavior. Most of the work either assumes the existence of a reference string, which is then used in various ways, or attempts to produce a model for the process by which such reference strings are generated. With very few exceptions, the reference strings that have been measured characterize virtual memory utilization, reflecting the fact that the motivating force behind these research activities is the desire to understand the behavior of virtual memory paging systems.
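As a small illustration of how a measured reference string is typically used, the sketch below computes a working set: the distinct pages touched in a recent window of references. The reference string here is invented, and the working-set computation is a standard technique rather than anything taken from this particular paper.

```python
def working_set(reference_string, t, window):
    """Working set at time t: the distinct pages referenced in the
    last `window` references ending at position t (0-indexed)."""
    start = max(0, t - window + 1)
    return set(reference_string[start:t + 1])

# An invented page-reference string for illustration.
refs = [1, 2, 3, 2, 1, 4, 4, 1, 2, 5]
print(working_set(refs, 9, 4))  # distinct pages in the last 4 references
```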

60 citations



Journal ArticleDOI
TL;DR: Program behavior studies may be useful in designing new programs and new virtual memory systems that are capable of levels of performance higher than those currently achievable.
Abstract: The practical objective of program behavior studies is to enhance program and system performance. On the one hand, the knowledge resulting from these studies may be useful in designing new programs and new virtual memory systems that are capable of levels of performance higher than those currently achievable. On the other hand, such knowledge may often be employed to increase the performance of existing programs and systems.

54 citations


Journal ArticleDOI
L. Levine1, W. Meyers
TL;DR: Memory system organization is compatible with a wide variety of low-cost fault detection and correction techniques that go a long way toward compensating for otherwise error-prone systems.
Abstract: Although continuing cost and performance improvements of the new bipolar and MOS RAM devices are providing strong incentives for their greatly expanded use in mainframe memory and other storage applications, these components have not yet reached the degree of reliability required for large memory systems. Fortunately, however, memory system organization is compatible with a wide variety of low-cost fault detection and correction techniques [6,10,11] that go a long way toward compensating for otherwise error-prone systems.
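The specific techniques the abstract cites are not reproduced in this listing. As one textbook member of the class of low-cost correction codes it refers to, here is a minimal Hamming(7,4) sketch: three parity bits protect four data bits, and the syndrome of a received word directly names the position of a single bit error.

```python
def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword.
    Positions 1..7; even-parity bits occupy positions 1, 2, and 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(cw):
    """Return (corrected codeword, error position), 0 meaning no error."""
    c = list(cw)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3     # syndrome = 1-based error position
    if pos:
        c[pos - 1] ^= 1            # flip the erroneous bit
    return c, pos

cw = hamming74_encode([1, 0, 1, 1])
bad = list(cw)
bad[4] ^= 1                        # inject a single-bit memory error
fixed, pos = hamming74_correct(bad)
print(f"error at position {pos}; corrected: {fixed == cw}")
```

Real memory systems of the era typically used wider codes over whole words, but the syndrome-decoding principle is the same.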

45 citations


Journal ArticleDOI
TL;DR: To analyze or simulate an operating/computer system, one must construct a model of the programs executing within the system, and many recent analyses, particularly queueing models, have used mathematically convenient program models, which may cause the analysis to be unrepresentative of the real world.
Abstract: To analyze or simulate an operating/computer system, one must construct a model of the programs executing within the system. Many recent analyses, particularly queueing models, have used mathematically convenient program models. For example, in one popular model the times between page faults are assumed to be exponentially distributed and independent. However, such a program model is inaccurate, which may cause the analysis to be unrepresentative of the real world. Simulation models of systems, on the other hand, have relied largely on traces of actual programs. Such traces are undoubtedly more accurate than simple mathematical models, but they have several drawbacks. They are expensive to generate, they may not be truly representative of typical programs, and they may contain more detail than is necessary for accurate system modeling. Moreover, it is difficult to extrapolate their behavior to other similar programs.
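The "mathematically convenient" model the abstract criticizes, i.i.d. exponentially distributed times between page faults, can be sketched in a few lines; the parameter values are illustrative, not measurements.

```python
import random

def exponential_fault_times(mean_interval, horizon, seed=0):
    """Generate page-fault instants under the simple program model the
    text describes: i.i.d. exponentially distributed inter-fault times
    (equivalently, a Poisson fault process)."""
    rng = random.Random(seed)
    t, faults = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mean_interval)
        if t > horizon:
            return faults
        faults.append(t)

faults = exponential_fault_times(mean_interval=2.0, horizon=1000.0)
print(f"{len(faults)} faults over the horizon")
```

The abstract's point is precisely that real programs do not behave this way: fault times from an actual trace show locality and phase behavior that this memoryless process cannot capture.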

40 citations


Journal ArticleDOI
TL;DR: This paper is concerned with paging systems, that is, systems for which the blocks of contiguous locations are of equal size and the occurrence of a reference to a page that is currently not in main memory is called a page fault.
Abstract: Virtual memory is one of the major concepts that has evolved in computer architecture over the last decade. It has had a great impact on the design of new computer systems since it was first introduced by the designers of the Atlas computer in 1962. A virtual memory is usually divided into blocks of contiguous locations to allow an efficient mapping of the logical addresses into the physical address space. In this paper, we are concerned with paging systems, that is, systems for which the blocks of contiguous locations are of equal size. The memory system consists of two levels: main memory and auxiliary memory. The occurrence of a reference to a page that is currently not in main memory is called a page fault. A page fault results in the interruption of the program and the transfer of the referenced page from auxiliary to main memory.
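The page-fault mechanism described above can be illustrated with a small simulator. The LRU replacement policy used here is a standard choice for the example and is not taken from this paper; the reference string is invented.

```python
from collections import OrderedDict

def count_page_faults(reference_string, frames):
    """Simulate a main memory of `frames` page frames under LRU
    replacement, counting page faults (references to pages that are
    not currently resident in main memory)."""
    resident = OrderedDict()   # page -> None, ordered by recency of use
    faults = 0
    for page in reference_string:
        if page in resident:
            resident.move_to_end(page)        # hit: refresh recency
        else:
            faults += 1                       # page fault
            if len(resident) == frames:
                resident.popitem(last=False)  # evict least recently used
            resident[page] = None
    return faults

refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
print(count_page_faults(refs, frames=3))
```

Each fault counted here corresponds to the interruption and auxiliary-to-main-memory transfer the abstract describes, so the fault count is a direct proxy for paging overhead.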

36 citations


Journal ArticleDOI
TL;DR: This bibliography attempts to compile all articles, books, conference papers, and technical reports about computer graphics and man-machine interaction that have been published in English from 1970 to 1975.
Abstract: Computer graphics, interactive techniques, and image processing are among the developments in the constantly evolving computer science field that impact the potential user ever more rapidly. This bibliography attempts to compile all articles, books, conference papers, and technical reports about computer graphics and man-machine interaction that have been published in English from 1970 to 1975. Because the literature pertaining to computer graphics and man-machine interaction is immense, this bibliography will no doubt be incomplete. Suggestions and contributions for future supplements to the bibliography should be sent to the compiler.

35 citations


Journal ArticleDOI
C. Adams1
TL;DR: If there was a line of continuity throughout the various sessions of the Computer Elements Technical Committee meeting last June, it was that the authors are entering a period of technology consolidation.
Abstract: If there was a line of continuity throughout the various sessions of the Computer Elements Technical Committee meeting last June in Vail, Colorado, it was that we are entering a period of technology consolidation. Advances foreseen in the near future are predicted to be extensions of areas of application utilizing existing technologies. The stated need for higher speed or more powerful technologies was only obvious at this meeting by its absence.

34 citations


Journal ArticleDOI
TL;DR: A number of different new technologies and devices have been developed that close the "access gap" between the two dissimilar technologies mentioned above.
Abstract: Until recently, electronically addressable devices such as ferrite core, plated wire, and semiconductor memories, and electromechanically addressable devices such as magnetic tapes, disks, and drums were the few technologies from which a computer system designer could build a memory system. A number of different new technologies and devices have been developed that close the "access gap" [14] between the two dissimilar technologies mentioned above. These include charge-coupled devices (CCD's) [2], bubble memories [4], electron beam addressed memories (EBAM) [17], and domain tip propagation (DOT) [16]. Other technologies, like CMOS [1] and integrated injection logic (I2L) [9], compete directly with the existing technologies. Table 1 (see p. 46) shows the possibility of a six-level hierarchy and some cost and performance projections for these technologies.

27 citations


Journal ArticleDOI
G.M. White1
TL;DR: This paper explains the nature of some of these advances and provides an introduction to the state of the art of automatic speech recognition.
Abstract: Research toward mechanical recognition of speech is laying the foundation for significant advances in pattern recognition and artificial intelligence. This paper explains the nature of some of these advances and provides an introduction to the state of the art of automatic speech recognition.

Journal ArticleDOI
TL;DR: The 1970's have witnessed two dramatic innovations in the application of computers to image processing in medicine: computerized tomography (CT), a system where data gathered from an x-ray scanner are fed into a computer to produce cross-sectional images of the human body, and white blood cell differentiation (WBCD).
Abstract: The 1970's have witnessed two dramatic innovations in the application of computers to image processing in medicine: computerized tomography (CT), a system where data gathered from an x-ray scanner are fed into a computer to produce cross-sectional images of the human body, and white blood cell differentiation (WBCD), a system that uses a television camera and a computer to replace the human eye and brain in the sophisticated task of visually observing and classifying human white blood cells through the microscope.

Journal ArticleDOI
TL;DR: This paper surveys and summarizes the major contributions to the theory and practice of testable logic design, and discusses the design of easily testable combinational, sequential, and iterative networks.
Abstract: This paper surveys and summarizes the major contributions to the theory and practice of testable logic design. The first part, dealing with the theoretical procedures, discusses the design of easily testable combinational, sequential, and iterative networks, illustrating major techniques with common running examples. The second part comments on the more practical aspects such as board layout, test point siting, and other facilities for easing the problems associated with testing.

Journal ArticleDOI
TL;DR: One of the primary functions of an operating system is to distribute the resources under its control among the users of the system in such a way as to achieve installation standards of performance (including service).
Abstract: One of the primary functions of an operating system is to distribute the resources under its control among the users of the system in such a way as to achieve installation standards of performance (including service).

Journal ArticleDOI
TL;DR: The development of techniques for the computer analysis of pictures and scenes began over 20 years ago and some of the major applications areas are automation (robot vision), cytology, radiology, high-energy physics, remote sensing, and document processing.
Abstract: The development of techniques for the computer analysis of pictures and scenes began over 20 years ago. Most of the work in this field has been application-oriented; some of the major applications areas are automation (robot vision), cytology, radiology, high-energy physics, remote sensing, and document processing (character recognition). However, many of the techniques and algorithms, even if developed for a particular application, are also applicable in other areas.

Journal ArticleDOI
TL;DR: Two basic types of satellites are described and techniques for best exploiting their capabilities are outlined.
Abstract: Satellite graphics terminals, which include a micro or minicomputer to perform interactive processing, present a number of advantages–not the least of which are accessibility and responsiveness. This paper describes two basic types of satellites and outlines techniques for best exploiting their capabilities.

Journal ArticleDOI
TL;DR: Despite mixed results across fields, pattern recognition has continued to be a fascinating field of research and development in its own right, as evidenced by the growing list of academic and industrial activities relating to pattern recognition work.
Abstract: Pattern recognition is not a new subject. Its principles and methodologies have influenced the course of technological development for many years in almost every knowledge-based field. To some fields, pattern recognition is a major tool for problem- solving, capable of producing dramatic results. To others, it has been accompanied by failure and disappointment. Despite the mixed results, however, pattern recognition has continued to be a fascinating field of research and development in its own right. This is evidenced by the growing list of activities, academic and industrial alike, relating to pattern recognition work.

Journal ArticleDOI
TL;DR: This paper examines the structure both of subroutine libraries for use with some base language and of complete programming languages, and outlines the advantages and disadvantages of each, along with facilities that should be present in any software package.
Abstract: This paper describes some software packages and programming systems for computer graphics applications, in the process considering software features for both passive and interactive graphics. It examines the structure both of subroutine libraries for use with some base language and of complete programming languages, and outlines the advantages and disadvantages of each, along with facilities that should be present in any software package.

Journal ArticleDOI
TL;DR: This paper primarily concentrates on the memory applications of CCD's, which are finding wide-ranging applications in three major areas: imaging, memory, and analog signal processing.
Abstract: Since their invention in 1970 by Boyle and Smith [1,2], charge-coupled devices (CCD's) have evolved rapidly and are finding wide-ranging applications in three major areas: imaging, memory, and analog signal processing. In this paper, we will primarily concentrate on the memory applications of CCD's.

Journal ArticleDOI
TL;DR: The user of a modern electronic hand calculator needs no knowledge of the works inside the box, and a modern high-level language system should present to its users an equally consistent environment, completely defined in terms of the syntax and semantics of the source language.
Abstract: No application programmer writes machine-language programs–i.e., strings of ones and zeroes. That primitive pursuit has long been reserved for those few who create the very first modules of a software system for new hardware. Instead, programmers make use of a wide spectrum of symbolic programming languages, ranging from assembly code to high-level languages such as Fortran, Cobol, and the Algol family. Every programming language has semantics which define some abstract machine. For the assembly-language programmer this machine bears a great resemblance to the actual hardware on which the program will be interpreted, but even here the programmer will frequently use system-defined subroutines or macros which represent extensions of the base hardware facilities. The high-level language programmer's abstract machine reflects the control mechanisms and data structures characteristic of the language. The Fortran programmer, for example, can think in terms of multidimensional array structures, DO loops, subprogram facilities, and so on. In principle he need never be concerned with the manner in which his abstract Fortran machine is to be realized by a particular hardware and software system. The user of a modern electronic hand calculator needs no knowledge of the works inside the box, and a modern high-level language system should present to its users an equally consistent environment, completely defined in terms of the syntax and semantics of the source language.

Journal ArticleDOI
A.R. Ward1
TL;DR: The bibliography below attempts to list most of the books, journal articles, conference papers, and technical reports in English and foreign languages gathered since the publication of the compiler's bibliography on the same topic in Computer, July 1974.
Abstract: Worldwide interest in LSI microprocessors and microcomputers has sparked a continuing growth of publications on all aspects of the subject. The bibliography below attempts to list most of the books, journal articles, conference papers, and technical reports in English and foreign languages, gathered since the publication of the compiler's bibliography on the same topic in Computer, July 1974. References from that bibliography are not repeated here. The cutoff date for inclusion in the list was October 15, 1975. Citations are grouped by year of publication, then listed alphabetically by author. Generally, one- or two-page "product features" are omitted, as are patents and specific device or system manuals.

Journal ArticleDOI
TL;DR: By increasing chip complexity above a few hundred gates, generally referred to as large-scale integration, the capability of economically producing new devices and systems for new applications is being realized.
Abstract: In its constant endeavor to increase the number of functions per semiconductor device, the semiconductor industry has focused continual effort on developing new semiconductor processes and photolithographic technologies and on compressing smaller geometries onto increasingly larger silicon chip areas. The result is that the number of functions (components per chip) is approximately doubling every year while the cost per function is decreasing. A byproduct of these improvements is increased system reliability. By increasing chip complexity above a few hundred gates, generally referred to as large-scale integration, the capability of economically producing new devices and systems for new applications is being realized.

Journal ArticleDOI
R.N. Noyce1
TL;DR: When the IEEE Computer Society was founded twenty-five years ago, the transistor was a laboratory curiosity and operating computers were assembled from relays or vacuum tubes; today, a single integrated circuit far surpasses the capability of those early computers, and further progress seems inevitable.
Abstract: When the IEEE Computer Society was founded twenty-five years ago, the transistor was a laboratory curiosity, and operating computers were assembled from relays or vacuum tubes. Today, a single integrated circuit far surpasses the capability of those early computers, and further progress seems inevitable. The development of semiconductor devices has depended upon a synergism with computers. This is particularly true for integrated circuits, whose development was motivated by computer applications. With each advance in components, the computers resulting from their use reached a wider market, motivating further advances in semiconductor technology.

Journal ArticleDOI
J.W. Atwood1
TL;DR: As general-purpose computing systems become larger and more complex, it has become uneconomical to restrict their use to a single user or program, which has introduced the necessity of finding ways to safely share them.
Abstract: As general-purpose computing systems become larger and more complex, it has become uneconomical to restrict their use to a single user or program. This has introduced the necessity of finding ways to safely share them.

Journal ArticleDOI
TL;DR: Measurement is a fundamental technique of any discipline or science; in computer engineering, measurement needs can be divided into three categories.
Abstract: Measurement is a fundamental technique of any discipline or science. In computer engineering, measurement needs can be divided into three categories:

Journal ArticleDOI
TL;DR: Matching the instructions and their representations to the distributions of usage can save 75% of the space taken by contemporary machine representations, and eliminate all forms of overflow from machine-language.
Abstract: Matching the instructions and their representations to the distributions of usage can save 75% of the space taken by contemporary machine representations. The gain in space may be accompanied by a reduction in execution time due to more efficient use of data paths. Variable-length codes can also eliminate all forms of overflow from machine-language, and greatly reduce the probability of overflow in data.
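The idea of matching code lengths to usage frequencies can be illustrated with a Huffman code over an opcode frequency table. The opcodes and frequencies below are invented for the example, and Huffman coding stands in for whatever specific encoding the paper uses; the point is only that frequent operations earn short codes, pulling the average length well under the fixed-length cost.

```python
import heapq

def huffman_code_lengths(freqs):
    """Return Huffman code lengths (in bits) for symbols with the
    given frequencies, via repeated merging of the two rarest nodes."""
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)           # unique key so dicts are never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**c1, **c2}.items()}
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Hypothetical opcode usage frequencies -- illustrative only.
freqs = {"LOAD": 40, "STORE": 25, "ADD": 15, "BRANCH": 10,
         "MUL": 5, "DIV": 5}

lengths = huffman_code_lengths(freqs)
total = sum(freqs.values())
avg_bits = sum(freqs[s] * lengths[s] for s in freqs) / total
fixed_bits = 3   # six opcodes need 3 bits each under fixed-length coding
print(f"average opcode length: {avg_bits:.2f} bits vs {fixed_bits} fixed")
```

With this (made-up) distribution the average opcode costs 2.25 bits against 3 fixed bits, a 25% saving on the opcode field alone; skewed real-world usage distributions are what make the paper's much larger whole-instruction savings plausible.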

Journal ArticleDOI
TL;DR: A cross section of computer professionals, including most of the chairmen of the Computer Society's technical committees and several other society members, was interviewed with the objective of summarizing their assessment of the major developments in computer technology.
Abstract: Genesis of survey. During the past six months, a cross section of computer professionals, including most of the chairmen of the Computer Society's technical committees and several other society members, was interviewed with the objective of summarizing their assessment of the major developments in computer technology. (The names of those consulted are listed on p. 75.) Attention was concentrated on the principal recent developments in the computer field and on the major problems that stand in the way of future progress.

Journal ArticleDOI
E.D. Carlson1
TL;DR: Comparison of existing graphics terminal alternatives in terms of the requirements indicates that 18 of these 30 requirements are not being adequately met today.
Abstract: If new applications of computer graphics are to be developed and used, the graphics terminal must meet the requirements of users and applications. Studies conducted at the IBM Research Laboratory in San Jose have resulted in the identification of 30 specific requirements for the display screen, the associated hardware for input, output, processing and storage, the interaction rates, the terminal packaging, and the support of the applications programmer. Comparison of existing graphics terminal alternatives in terms of the requirements indicates that 18 of these 30 requirements are not being adequately met today.

Journal ArticleDOI
TL;DR: With the availability of microprocessors, it is now possible to resolve this dilemma: a reasonable amount of intelligence can be added to the terminal at a very reasonable cost.
Abstract: Graphic terminals allow on-line interaction between the user and his program, thereby enabling him to alter the displayed picture according to his requirements. However, a major problem with the use of a graphic terminal is the cost associated with it. There is no doubt that low-cost graphic terminals are available, but their utility is very limited since they have no processing capability (or intelligence) and must depend on a host processor for attention, while at the same time putting additional strain on the resources of the host machine. This naturally degrades the response time, depending on the processor load at that time. On the other hand, graphic terminals which have local processing capability are very expensive. With the availability of microprocessors, it is now possible to resolve this dilemma: a reasonable amount of intelligence can be added to the terminal at a very reasonable cost. Such a terminal has been developed at the University of Ottawa.

Journal ArticleDOI
TL;DR: "Virtual memory" is a computing term which has come into increasing use in recent years, but its use often causes controversy and misunderstanding, for it is used to mean different things by different people.
Abstract: "Virtual memory" is a computing term which has come into increasing use in recent years. Unfortunately, like other new expressions, its use often causes controversy and misunderstanding, for it is used to mean different things by different people. Not long ago when one major computer vendor announced the introduction of the new technique of 'virtual storage,' other manufacturers complained that they had been doing the same thing for years under a different name (see Figure 1).