
Showing papers in "IEEE Computer in 1973"


Journal ArticleDOI
TL;DR: A cache-based computer system interposes a fast, small memory between the processor and main memory; compared with paging systems, the cache is managed by hardware, deals with smaller blocks, provides a smaller ratio of memory access times, and holds the processor idle during block transfers from main memory rather than switching to another task.
Abstract: A cache-based computer system employs a fast, small memory (the "cache") interposed between the usual processor and main memory. At any given time the cache contains as much as possible of the instructions and data the processor needs; as new information is needed, it is brought from main memory to cache, displacing old information. The processor tends to operate with a memory of cache speed but with main-memory cost-per-bit. This configuration has analogies with other systems employing memory hierarchies, such as "paging" or "virtual memory" systems. In contrast with the latter, a cache is managed by hardware algorithms, deals with smaller blocks of data (32 bytes, for example, rather than 4096), provides a smaller ratio of memory access times (5:1 rather than 1000:1), and, because of the last factor, holds the processor idle while blocks of data are being transferred from main memory to cache rather than switching to another task. These are important differences, and may suffice to make the cache-based system cost effective in many situations where paging is not.
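The hardware-managed, block-oriented behavior the abstract describes can be made concrete with a toy model. Below is a minimal sketch in Python (an illustration, not from the paper) of a direct-mapped cache using the paper's example 32-byte blocks; the line count, total cache size, and address-to-line mapping are assumptions chosen for simplicity.

    # Illustrative sketch (assumed parameters): a direct-mapped cache with
    # 32-byte blocks. On a miss, the incoming block displaces the line's
    # old contents, as the abstract describes.

    BLOCK_SIZE = 32        # bytes per block (the paper's example size)
    NUM_LINES = 128        # assumed line count: 128 * 32 bytes = 4 KB cache

    class DirectMappedCache:
        def __init__(self):
            # Each line remembers the tag of the memory block it currently
            # holds, or None if the line is empty.
            self.tags = [None] * NUM_LINES

        def access(self, address: int) -> bool:
            """Return True on a hit, False on a miss (block fetched on miss)."""
            block = address // BLOCK_SIZE      # which memory block is touched
            line = block % NUM_LINES           # the one line this block may use
            tag = block // NUM_LINES           # distinguishes blocks sharing a line
            if self.tags[line] == tag:
                return True                    # hit: served at cache speed
            # Miss: here the processor would sit idle while the 32-byte block
            # is transferred from main memory, displacing the old contents.
            self.tags[line] = tag
            return False

    cache = DirectMappedCache()
    accesses = list(range(0, 8192, 4))                # sequential word accesses
    hits = sum(cache.access(a) for a in accesses)
    print(f"hit ratio: {hits / len(accesses):.2f}")   # 0.88 on this scan

On this sequential scan, seven of every eight word accesses hit; that locality is what lets the processor run near cache speed while paying main-memory cost-per-bit.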

85 citations


Journal ArticleDOI
TL;DR: An RCA Labs team project begun in 1968 is described, whose general goal was to predict the performance of new system architectures being considered within RCA for future computers; the cache-system or slave-memory idea was the principal subject.
Abstract: We describe in this paper an RCA Labs team project begun in 1968, with the general goal to predict the performance of new system architectures being considered within RCA for future computers. The cache-system or slave memory idea was the principal subject. However, many other problems have been studied, and the data base that was collected has been found useful in analysis as well as prediction of performance, and in obtaining fundamental data for modelling purposes.

24 citations


Journal ArticleDOI
TL;DR: The results of integrating the computer into the day-to-day operation of the Cardiac Surgical Intensive Care Unit of the University of Alabama Hospital are described.
Abstract: The application of technology and system engineering techniques to the delivery of health services has made possible the successful implementation of a computer-based system in the clinical care of patients during the crucial early hours following heart surgery. Our hospital was faced with continually increasing numbers of candidates for cardiac operations without proportionate growth in the availability of skilled, highly motivated registered nurses qualified to care for these critically ill patients. Cardiac surgeons responsible for the patient's welfare hypothesized that despite greater patient loads the quality of care could be maintained, economically, with the existing nursing staff, if automation could be successfully employed to aid the physicians and the nurses with their care-related duties. In this paper we describe the results of integrating the computer into the day-to-day operation of the Cardiac Surgical Intensive Care Unit of the University of Alabama Hospital. Since July of 1967 this automated care system has performed record-keeping duties and has been used clinically in the observation and treatment of 2750 patients (as of 2-28-73) following operation for correction of various acquired and congenital heart defects.

21 citations


Journal ArticleDOI
TL;DR: In the late 1960's, not many people had direct experience in designing or working with supercomputers, but at least there was little confusion over what the term “supercomputer” meant.
Abstract: In the late 1960's, not many people had direct experience in designing or working with supercomputers, but at least there was little confusion over what the term “supercomputer” meant. In those days the main criterion for qualifying as a supercomputer was the number of instructions handled per unit time. Today, we know a good deal more about this category of machine, but it's getting harder and harder to define the term — mainly because of its growth in number and variety. The proliferation of processors has created a need for an orderly, workable scheme of classifications.

20 citations


Journal ArticleDOI
TL;DR: A workshop on privacy and protection in operating systems in Princeton, New Jersey, from June 12-14, 1972 was sponsored by the IEEE Committee on Operating Systems.
Abstract: The IEEE Committee on Operating Systems sponsored a workshop on privacy and protection in operating systems in Princeton, New Jersey, from June 12-14, 1972. Thirty-two people interested in operating system protection met at the Nassau Inn to discuss various problems and their possible solutions. The workshop was organized by Dr. R. Stockton Gaines of the Institute for Defense Analyses, Princeton. He and Professor Peter Denning, Princeton University, acted as session chairmen.

20 citations


Journal ArticleDOI
TL;DR: A workshop on the architecture and application of digital modules was held on June 7–8, 1973, at Carnegie-Mellon University, and participants were invited from computer manufacturers, semiconductor manufacturers, and universities.
Abstract: The report summarizes the discussion of a workshop on the Architecture and Application of Digital Modules that was held on June 7-8, 1973, at Carnegie-Mellon University. The purpose of the workshop was to identify the major influences that continuing advancements in semiconductor technology will have on the next generation of digital systems. The workshop, and this report, can be approximately partitioned into three main topics: discussion of current register-transfer level module sets and what can be learned from their development and use; the state of semiconductor technology and its current trends; and finally, discussion of current efforts to define or build computer structures that may become prototypes of the next generation of digital systems.

20 citations


Journal ArticleDOI
TL;DR: The importance of computer networking as a powerful national force is now being recognized both in the public and private sectors and there is extensive continuing research and development underway that promises greater efficiencies and capabilities than those realized on a wide scale today.
Abstract: The importance of computer networking as a powerful national force is now being recognized both in the public and private sectors [1]. The necessary computer and communications technologies are rapidly evolving and, while a large number of networks are fully operational, there is extensive continuing research and development underway that promises greater efficiencies and capabilities than those realized on a wide scale today.

13 citations


Journal ArticleDOI
TL;DR: The applicability of existing measurement tools to certain problems is somewhat limited: on one hand the system was not specifically designed to be measurable, and on the other hand a tool designed for a whole class of systems, configurations, or versions may have sacrificed measurement power on a particular system to its generality.
Abstract: Computer designers and users have felt the need for monitoring the behavior of their systems since the earliest days of the computer era. However, only very few and recent systems have been adequately instrumented by their designers during the design stage (see for example reference 1). This situation has led to the development of hardware and software measurement tools having the common property of requiring relatively minor or no modifications of the system to be monitored. The applicability of these tools in certain measurement problems is somewhat limited, since on one hand the system was not specifically designed to be measurable, and on the other hand the tool was designed to be applicable to a class of systems, configurations, or versions, and its measurement power with respect to a particular system may have been sacrificed to its generality.

12 citations


Journal ArticleDOI
TL;DR: The following article is a condensation of a COSINE task force report; the editing process may have introduced emphases and perspectives that are not held by the authors of the report.
Abstract: The following article is a condensation of a COSINE task force report. The editing process required to condense the original report for these pages necessarily involves arbitrary judgements, and may have introduced emphases and perspectives that are not held by the authors of the report. All such differences are unintentional. Interested readers should obtain copies of the full report, which are available without charge from: Commission on Education, National Academy of Engineering, 2101 Constitution Avenue, N.W., Washington, D.C. 20418.

10 citations


Journal ArticleDOI
TL;DR: When operational networks are being designed, full consideration must be given to the legal, economic, social, and management factors as well as those that are purely technical.
Abstract: Computer networks and their communications support no longer present major technical problems. Today, many organizations are planning major networks in a very matter-of-fact way, often giving little or no consideration to non-technical issues during the design phase. When operational networks are being designed, full consideration must be given to the legal, economic, social, and management factors as well as those that are purely technical.

10 citations


Journal ArticleDOI
TL;DR: The minicomputer's compact size and low cost have permitted the development of dedicated systems to meet specialized data processing requirements ranging from communications control, instrument data acquisition, and small business accounting to hotel reservations management and operating racetrack tote boards.
Abstract: Since its commercial introduction in 1965, the minicomputer has revolutionized data processing. Its compact size and low cost have permitted the development of dedicated systems to meet specialized data processing requirements ranging from communications control, instrument data acquisition, and small business accounting to hotel reservations management and operating racetrack tote boards.

Journal ArticleDOI
TL;DR: Since the development of the transistor 25 years ago, the digital computer has changed greatly in size, speed, capability and cost, and clinical medicine has also progressed.
Abstract: Since the development of the transistor 25 years ago, the digital computer has changed greatly in size, speed, capability and cost. During the same period of time, clinical medicine has also progressed. Cardiac surgery has become commonplace, preventive medicine and mass health screening are gaining in popularity, and diagnostic tests such as the electrocardiogram and the chemical analysis of blood and urine are being carried out in ever increasing numbers. These are just a few of the many advances in medical science which have greatly increased the quantity of information that must be collected, sorted, analyzed, indexed, stored and otherwise manipulated.

Journal ArticleDOI
TL;DR: Technical feasibility of computer-communication networks has already been demonstrated for general purpose computers, and evidence is now being presented to show that they are economically viable as well.
Abstract: Technical feasibility of computer-communication networks has already been demonstrated [3, 5, 8, 14] for general purpose computers, and evidence is currently being presented to show that they are economically viable as well [8, 14].

Journal ArticleDOI
TL;DR: An evaluation of the growth of computer engineering in electrical engineering departments during the last seven years and of the impact of COSINE Committee activities was accomplished through a questionnaire survey of electrical engineering department chairmen.
Abstract: The COSINE Committee of the Commission on Education of the National Academy of Engineering was organized in September, 1965 to help electrical engineering departments develop educational programs in computer engineering and to redesign other courses to use digital computers. The COSINE Committee disbanded in late 1972. A final activity of the COSINE Committee was an evaluation of the growth of computer engineering in electrical engineering departments during the last seven years and of the impact of COSINE Committee activities. The evaluation was accomplished through a questionnaire survey of electrical engineering department chairmen. This paper reports the results of that survey.

Journal ArticleDOI
TL;DR: The ISP (Instruction Set Processor) notation was developed for a text to describe precisely the programming level of a computer in terms of its memory, instruction format, data types, data operations, and the interpretation of a specific instruction set.
Abstract: The ISP (for Instruction Set Processor) notation was developed for a text [Bell & Newell, 1971] to describe precisely the programming level of a computer in terms of its memory, instruction format, data types, data operations, and the interpretation of a specific instruction set.
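ISP itself is a declarative notation, but the ingredients the abstract lists can be sketched procedurally. The following Python fragment models a made-up three-instruction accumulator machine (invented here for illustration; it is not Bell & Newell's notation or any real machine) to show how memory, an instruction format, data operations, and an interpretation loop fit together.

    # Hypothetical accumulator machine illustrating the components an ISP
    # description names: memory, instruction format, data operations, and
    # the interpretation of a specific instruction set.

    M = [0] * 256            # Memory: 256 words
    PC, ACC = 0, 0           # processor state: program counter, accumulator

    def decode(word):
        """Instruction format: high byte = opcode, low byte = address field."""
        return word >> 8, word & 0xFF

    def step():
        """Interpret one instruction: fetch, decode, execute."""
        global PC, ACC
        op, addr = decode(M[PC])
        PC += 1
        if op == 1:          # LOAD:  ACC <- M[addr]
            ACC = M[addr]
        elif op == 2:        # ADD:   ACC <- ACC + M[addr]  (a data operation)
            ACC += M[addr]
        elif op == 3:        # STORE: M[addr] <- ACC
            M[addr] = ACC

    # Tiny program: M[102] <- M[100] + M[101]
    M[100], M[101] = 3, 4
    M[0] = (1 << 8) | 100    # LOAD 100
    M[1] = (2 << 8) | 101    # ADD 101
    M[2] = (3 << 8) | 102    # STORE 102
    for _ in range(3):
        step()
    print(M[102])            # prints 7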

Journal ArticleDOI
TL;DR: Data communications, as the industry uses the term, involves the equipment or services used in the flow of information to or from a computer center, where a computer must have a part in processing the information transmitted.
Abstract: Data communications [1], as our industry uses the term, involves the equipment or services used in the flow of information to or from a computer center, where a computer must have a part in processing the information transmitted. It is not simply the transmission of data from one source to another.

Journal ArticleDOI
TL;DR: Multiphasic health testing programs using computers are becoming generally accepted in the United States as a result of the increasing demand by the public for low-cost periodic health examinations and the availability of automated multiphasichealth testing procedures and computerized data processing.
Abstract: Multiphasic health testing programs using computers are becoming generally accepted in the United States. They have developed as a result of the increasing demand by the public for low-cost periodic health examinations and the availability of automated multiphasic health testing procedures and computerized data processing.

Journal ArticleDOI
TL;DR: The transferral of research into patient care is the most difficult and time-consuming process known to physicians; do we face a repetition of similar delays for electrocardiographic automation?
Abstract: The transferral of research into patient care is the most difficult and time-consuming process known to physicians. The electrocardiogram was discovered in 1887, put into experimental use in Europe in the early 1900's and in the United States 10 or so years later. But it was not until the '20s and '30s that the electrocardiogram came into wide general use. Do we face a repetition of similar delays for electrocardiographic automation?

Journal ArticleDOI
TL;DR: The telecommunications technological environment is characterized by rapidly changing cost-benefit relationships among system components, due primarily to the cost decline of digital logic.
Abstract: State-of-the-art design of communication networks is characterized by the ubiquitous appearance of computer technology. All significant message-switching systems designed in the last ten years have used stored program processor controlled switching. Sophisticated terminals are also using small computers, and are otherwise dominated by the use of digital electronics. The telecommunications technological environment is characterized by rapidly changing cost-benefit relationships among system components, due primarily to the cost decline of digital logic.

Journal ArticleDOI
TL;DR: “The law is very much concerned with continuity and predictability; consequently it inevitably seeks to draw analogies between a new innovation and existing doctrine.”
Abstract: “The law is very much concerned with continuity and predictability; consequently it inevitably seeks to draw analogies between a new innovation and existing doctrine.”

Journal ArticleDOI
Albert S. Hoagland
TL;DR: The product category defined as mass storage includes disk, tape, and card-like recording structures; the author foresees that at a growing number of installations all data will be available on-line under systems control.
Abstract: Mass storage as a functional need in computer systems is continually increasing in importance with the growing trend to interactive terminal-oriented systems, serving to store a system's data base and resident programming systems. The associated capacity, plus the ever expanding magnitude of such information, far exceeds the range where "electronic" memory is economically competitive. Included in the product category defined as mass storage are disk, tape, and card-like recording structures. We foresee that at a growing number of installations all data will be available on-line under systems control.

Journal ArticleDOI
TL;DR: Being asked to present the Keynote Address at COMPCON 73 is a much-coveted and signal honor.
Abstract: I consider being asked to present the Keynote Address at COMPCON 73 a much-coveted and signal honor. For this, I want to thank Sid Fernbach and the COMPCON 73 Committee.

Journal ArticleDOI
TL;DR: The objective of this paper is to present some of the techniques employed in order to enhance the effectiveness of data transfer rates and availability in main memory as needed by the processor(s).
Abstract: Two of the major factors affecting the efficiency of a computing system are the Data Transfer Rate between processor(s) and main memory and the Availability of Data in main memory as needed by the processor(s). The objective of this paper is to present some of the techniques employed in order to enhance the effectiveness of these factors. Both factors are dependent upon the cycle time of the main memory and the memory management system. Furthermore, in the case of multiprocessing environments, one other architectural feature of the main memory arrangement that becomes equally important to data transfer rates and availability is the concept of Modular Memory Organization where the memory is organized into a number of independent memory modules that can be accessed independently by individual processors.
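The Modular Memory Organization idea can be sketched briefly. The fragment below (an illustration with assumed parameters, not the paper's design) uses low-order interleaving, so consecutive word addresses fall in consecutive modules; simultaneous requests proceed in parallel unless they land in the same module, in which case they must be serialized.

    # Illustrative sketch: low-order interleaving across independent modules.
    NUM_MODULES = 4          # assumed module count

    def module_of(address: int) -> int:
        """Consecutive addresses map to consecutive modules."""
        return address % NUM_MODULES

    def conflicts(requests):
        """Group simultaneous requests by module; >1 in a module conflicts."""
        by_module = {}
        for addr in requests:
            by_module.setdefault(module_of(addr), []).append(addr)
        return {m: addrs for m, addrs in by_module.items() if len(addrs) > 1}

    # Two processors issuing requests in the same memory cycle:
    print(conflicts([100, 101]))   # {} - different modules, both proceed
    print(conflicts([100, 104]))   # {0: [100, 104]} - same module, serialized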

Journal ArticleDOI
TL;DR: Methods for common-sense reasoning about actions and their effects, when implemented as computer programs, are called automatic problem solvers, and are the subject of this paper.
Abstract: A major theme in artificial intelligence research has been the quest for methods for common-sense reasoning about actions and their effects. Such methods, when implemented as computer programs, are called automatic problem solvers, and are the subject of this paper.
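The flavor of such systems can be conveyed with a toy planner. Below is a minimal STRIPS-style sketch in Python, with a two-action domain invented for illustration; it stands in for this class of automatic problem solvers generally, not for any specific system the paper surveys. States are sets of facts; each action carries preconditions, an add list, and a delete list; planning is breadth-first search to a state satisfying the goal.

    # Minimal STRIPS-style problem solver (illustrative, assumed domain).
    from collections import deque

    # action name -> (preconditions, add list, delete list)
    ACTIONS = {
        "pickup":  ({"hand-empty", "on-table"}, {"holding"}, {"hand-empty", "on-table"}),
        "putdown": ({"holding"}, {"hand-empty", "on-table"}, {"holding"}),
    }

    def plan(initial, goal):
        """Return a list of action names reaching a state containing `goal`."""
        frontier = deque([(frozenset(initial), [])])
        seen = {frozenset(initial)}
        while frontier:
            state, steps = frontier.popleft()
            if goal <= state:
                return steps
            for name, (pre, add, delete) in ACTIONS.items():
                if pre <= state:                      # action applicable?
                    nxt = frozenset((state - delete) | add)
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append((nxt, steps + [name]))
        return None                                   # no plan found

    print(plan({"hand-empty", "on-table"}, {"holding"}))   # ['pickup']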

Journal ArticleDOI
TL;DR: This paper defines some of Jewel's needs for data collection and data communications and some of its approaches to satisfying these needs and describes the company to the extent necessary to make their applications understandable.
Abstract: In this paper we attempt to define some of Jewel's needs for data collection and data communications and some of our approaches to satisfying these needs. In order to make this discussion meaningful, we first describe the company to the extent necessary to make our applications understandable. Then we discuss our data processing applications to indicate how we have solved some difficult problems.

Journal ArticleDOI
TL;DR: It seems that the second generation systems will be characterized by an attempt to incorporate the developments of large scale integrated circuits and the “computer-on-a-chip” into modular systems.
Abstract: This is an important time in the history of modular computer systems. The first generation has been developed and its products are in daily use; now the developers of modular computer systems are taking the first steps toward the second generation. The first generation systems started with the fixed-plus-variable computer system proposed by Estrin in 1960 [1], progressed through the developments of macromodules and Register Transfer Modules, and have reached a point of proliferation with the development of systems at MIT, the University of Delaware, and the University of Washington. (A brief description of these systems, with references, is contained in the paper by Fuller and Siewiorek.) It seems that the second generation systems will be characterized by an attempt to incorporate the developments of large scale integrated circuits and the “computer-on-a-chip” into modular systems.

Journal ArticleDOI
TL;DR: The operating system may be the most complex program ever written for the machine, as well as the one most frequently run, yet, with a few notable exceptions, computer hardware is still not being designed with sufficient regard for the special requirements of supervisory software.
Abstract: Historically, the interaction between operating systems and computer architecture has largely been very one-directional. Only recently have engineers begun to more fully appreciate that traditional hardware cost-performance tradeoffs are not entirely appropriate for stored-program machines. The operating system may be the most complex program ever written for the machine, as well as the one most frequently run. Yet, with a few notable exceptions, computer hardware is still not being designed with sufficient regard for the special requirements of supervisory software. Systems programmers are usually faced with creating operating systems which will survive and flourish in architectural environments ranging from passive to inhospitable. As with Samuel Johnson's upright dog, the surprise is not that it's done well, but that it's done at all.

Journal ArticleDOI
TL;DR: “Mr. Computer” has joined the staff of the New York State Education Department's Division for Handicapped Children.
Abstract: “Mr. Computer” has joined the staff of the New York State Education Department's Division for Handicapped Children.

Journal ArticleDOI
TL;DR: In almost every area of computing, it is easy to imagine ways in which language understanding could make computers more accessible, not only for those who use them now, but for a much wider range of people without special training.
Abstract: One of the most active areas of Artificial Intelligence research today is the understanding of natural language. From the earliest days of computing, people have been intrigued by the idea of communicating easily with computers. In the past few years, new techniques and larger systems have combined to open new possibilities for expressing ideas to computers in natural language. Question answering systems are approaching a level useful in real applications areas. Advanced computing applications like automatic programming will base their interaction with the user on natural language capabilities. In almost every area of computing, it is easy to imagine ways in which language understanding could make computers more accessible, not only for those who use them now, but for a much wider range of people without special training.

Journal ArticleDOI
TL;DR: Unlike the other modules described in this issue, Register Transfer Modules (RTM's) have been produced and marketed commercially (by Digital Equipment Corporation) and have an interest all their own.
Abstract: Unlike the other modules described in this issue, Register Transfer Modules (RTM's) have been produced and marketed commercially (by Digital Equipment Corporation). This gives RTM's an interest all their own; it is one thing to conceive of a set of register transfer level digital modules, but quite another to market them.