
Showing papers on "Data management published in 1976"


Journal ArticleDOI
TL;DR: An analytic model, based upon knowledge of data item lengths, transportation costs, and retrieval patterns, is developed to assist an analyst with this assignment problem.
Abstract: It is possible to significantly reduce the average cost of information retrieval from a large shared database by partitioning data items stored within each record into a primary and a secondary record segment. An analytic model, based upon knowledge of data item lengths, transportation costs, and retrieval patterns, is developed to assist an analyst with this assignment problem. The model is generally applicable to environments in which a database resides in secondary storage, and is useful for both uniprogramming and multiprogramming systems. A computationally tractable record design algorithm has been implemented as a Fortran program and applied to numerous problems. Realistic examples are presented which demonstrate a potential for reducing total system cost by more than 65 percent.
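The core idea of the paper, splitting each record into a primary segment that is always read and a secondary segment fetched only on demand, can be sketched as a toy expected-cost comparison. All item names, lengths, probabilities, and cost constants below are hypothetical illustrations, not the paper's model:

```python
# Toy sketch of primary/secondary record segmentation (hypothetical data).
# Each item has a length (bytes) and a retrieval probability; placing an item
# in the secondary segment saves primary-transfer cost but incurs an extra
# access whenever that item is actually requested.

def expected_cost(assignment, items, transport_per_byte, secondary_access_cost):
    """Expected retrieval cost per record under a primary/secondary split."""
    primary_len = sum(items[i]["len"] for i in assignment["primary"])
    cost = primary_len * transport_per_byte  # primary segment always read
    for i in assignment["secondary"]:
        # secondary segment fetched only when one of its items is needed
        cost += items[i]["p"] * (secondary_access_cost + items[i]["len"] * transport_per_byte)
    return cost

items = {
    "name":    {"len": 30,  "p": 0.90},
    "address": {"len": 80,  "p": 0.10},
    "history": {"len": 400, "p": 0.02},
}

all_primary = {"primary": list(items), "secondary": []}
split = {"primary": ["name"], "secondary": ["address", "history"]}

c1 = expected_cost(all_primary, items, transport_per_byte=1.0, secondary_access_cost=50.0)
c2 = expected_cost(split, items, transport_per_byte=1.0, secondary_access_cost=50.0)
print(c1, c2)  # splitting wins when long, rarely used items dominate the record
```

With these invented numbers the split cuts expected cost by roughly an order of magnitude, which is the mechanism behind the paper's reported savings.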

189 citations


Journal ArticleDOI
TL;DR: A behavioral study utilizing an attitudinal model explored the relationships between management information system users' perceptions of their computer system, perceived variables exogenous to the system, and the impact of external factors on these perceptions.
Abstract: A behavioral study utilizing an attitudinal model explored the relationships between management information system users' perceptions of their computer system, perceived variables exogenous to the ...

181 citations


01 Jun 1976
TL;DR: The basis in the Bell-La Padula model and the extensions required are narratively described and general notions of protection and the relational approach to data management are considered.
Abstract: A mathematical model of a data management system embodying a military security policy is presented. Its basis in the Bell-La Padula model and the extensions required are narratively described. General notions of protection and the relational approach to data management are considered. A full, formal description of the model is included. (Author)

27 citations



01 Sep 1976
TL;DR: This thesis develops a methodology for monitoring the developing pattern of access to a data base and for choosing near-optimal physical data base organizations based on the evidenced mode of use and considers the problem of adaptively selecting the set of secondary indices to be maintained in an integrated relational data base.
Abstract: The development of large integrated data bases that support a variety of applications in an enterprise promises to be one of the most important data processing activities of the next decade. The effective utilization of such data bases depends on the ability of data base management systems to cope with the evolution of data base applications. In this thesis, we attempt to develop a methodology for monitoring the developing pattern of access to a data base and for choosing near-optimal physical data base organizations based on the evidenced mode of use. More specifically, we consider the problem of adaptively selecting the set of secondary indices to be maintained in an integrated relational data base. Stress is placed on the acquisition of an accurate usage model and on the precise estimation of data base characteristics, through the use of access monitoring and the application of forecasting and smoothing techniques. The cost model used to evaluate proposed index sets is realistic and flexible enough to incorporate the overhead costs of index maintenance, creation, and storage. A heuristic algorithm is developed for the selection of a near-optimal index set without an exhaustive enumeration of all possibilities.
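The flavor of such a heuristic can be sketched as a greedy selection that repeatedly adds the candidate index with the largest positive net benefit. The candidate names and cost figures below are invented for illustration; the thesis's actual cost model (maintenance, creation, storage overheads derived from monitored usage) is richer:

```python
# Hypothetical greedy sketch of secondary-index selection: keep adding the
# index whose estimated query savings most exceed its maintenance + storage
# overhead, and stop when no candidate has a positive net benefit.

def select_indices(candidates):
    """Greedy near-optimal index set without exhaustive enumeration."""
    chosen = []
    remaining = dict(candidates)
    while remaining:
        name, net = max(
            ((n, c["savings"] - c["overhead"]) for n, c in remaining.items()),
            key=lambda t: t[1],
        )
        if net <= 0:
            break  # every remaining index costs more than it saves
        chosen.append(name)
        del remaining[name]
    return chosen

candidates = {
    "idx_dept":   {"savings": 120.0, "overhead": 40.0},  # frequently queried column
    "idx_salary": {"savings": 15.0,  "overhead": 30.0},  # rarely queried, costly updates
    "idx_name":   {"savings": 60.0,  "overhead": 25.0},
}
print(select_indices(candidates))  # ['idx_dept', 'idx_name']
```

The greedy pass avoids enumerating all 2^n index subsets, at the price of possible sub-optimality when index benefits interact.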

15 citations



Proceedings ArticleDOI
20 Oct 1976
TL;DR: The results of this study provide considerable insight into the problem of conversion to a data base management system, and suggest guidelines for the evaluation of any proposed data base conversions.
Abstract: This paper presents a methodology for, and an evaluation of the feasibility of converting a typical data processing system to a data base management system. This methodology is applied to a particular system. The data base management system under evaluation uses a back-end mini-computer to perform the data management functions. The evaluation is made in terms of changes in system resources, program requirements, and human factors. The results of this study provide considerable insight into the problem of conversion to a data base management system, and suggest guidelines for the evaluation of any proposed data base conversions.

12 citations


Book
01 Jan 1976

12 citations


Proceedings ArticleDOI
20 Oct 1976
TL;DR: The construction of a performance model or prototype of a proposed data base management system (DBMS) application is described, showing the use of the methodology to “tune” important DBMS, Operating System and application program parameters.
Abstract: This paper describes the construction of a performance model or prototype of a proposed data base management system (DBMS) application by the application of certain techniques from the performance measurement and data base management disciplines. Performance measurement is conducted by using a “drive” workload that imitates the “real” workload with reasonable fidelity. The “drive” workload is constructed in terms of data base variable primitives, which are “inverted” to performance variable primitives in order to construct a resource-based drive workload. The application of this drive workload to a simulation model is illustrated, showing the use of the methodology to “tune” important DBMS, operating system, and application program parameters.

11 citations


Proceedings ArticleDOI
13 Oct 1976
TL;DR: The software engineering techniques that were used in the management and control of the programming team and efforts are discussed and a single experience with HSDMS is related to its possible impact on software engineering of database management systems in general and to data secure systems in particular.
Abstract: HSDMS (Highly Secure Data Management System) is a secure, on-line and multi-user experimental database management system developed on the Digital Equipment Corporation's PDP-10 computer system. It is a vehicle for testing new facilities and applications of data management and access control. Furthermore, the development of HSDMS itself was aimed from the outset to serve as an exercise in software engineering management and control. In the first part of the paper, the software engineering techniques that were used in the management and control of the programming team and its efforts are discussed. The discussion centers on the application of known concepts such as the chief programmer team, structured programming, and composite design to the programming of the database management system. In the second part of the paper, system goals and capabilities are presented. The goals for HSDMS were to achieve data independence, efficient storage and access, an effective user interface, and secure access control. The ability of HSDMS to meet these goals results from the utilization and integration of a number of design concepts and implementation approaches. For each system goal, the concepts and approaches that have provided HSDMS with the capabilities to achieve it are shown. In the final part of the paper, we attempt to relate this single experience with HSDMS to its possible impact on the software engineering of database management systems in general and of data-secure systems in particular.

10 citations


Proceedings Article
08 Sep 1976
TL;DR: A deductive processor design is presented that incorporates new techniques for selecting, from large collections of mostly irrelevant general assertions and specific facts, the small number needed for deriving an answer to a particular query.
Abstract: This paper examines some of the problems and issues involved in designing a practical deductive inference processor to augment a data management system, as well as some of the benefits that can be expected from such an augmentation. A deductive processor design is presented that incorporates new techniques for selecting, from large collections of mostly irrelevant general assertions and specific facts, the small number needed for deriving an answer to a particular query.
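The selection technique the abstract describes, isolating the few assertions relevant to a query from a large, mostly irrelevant collection, can be illustrated with a toy backward-reachability filter over rule predicates. The rule format and predicate names below are invented for illustration and are not the paper's design:

```python
# Toy sketch of "relevance selection" before deduction: keep only rules and
# facts whose predicates can transitively contribute to the query predicate.
# Rules are (head, [body predicates]) pairs, e.g. the first entry encodes
# grandparent(X,Z) <- parent(X,Y), parent(Y,Z).

rules = [
    ("grandparent", ["parent", "parent"]),
    ("ancestor", ["parent"]),
    ("ancestor", ["parent", "ancestor"]),
    ("sibling", ["parent"]),   # irrelevant to a grandparent query
]

def relevant_predicates(query, rules):
    """Predicates transitively needed to answer `query` (backward reachability)."""
    needed, frontier = {query}, [query]
    while frontier:
        head = frontier.pop()
        for h, body in rules:
            if h == head:
                for p in body:
                    if p not in needed:
                        needed.add(p)
                        frontier.append(p)
    return needed

print(relevant_predicates("grandparent", rules))  # {'grandparent', 'parent'}
```

Only facts and rules over the surviving predicates would then be handed to the deductive processor, keeping the search space small.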

Book
01 Jun 1976
TL;DR: Book: “Database Management Systems: Understanding and Applying Database Technology”.

Journal ArticleDOI
01 Oct 1976-Infor
TL;DR: The Educational Data Base System (EDBS) as discussed by the authors is a data base management system that is implemented in APL and that uses APL as a host language; it is designed as an educational tool.
Abstract: This paper describes the Educational Data Base System (EDBS). EDBS is a data base management system that is implemented in APL and that uses APL as a host language. It is an educational tool design...

28 May 1976
TL;DR: Knowledge (as opposed to data or information) is proposed as a basic resource of an enterprise and a logical system design for possible implementation is suggested along with a discussion of essential managerial steps which should be taken to achieve corporate knowledge management.
Abstract: Knowledge (as opposed to data or information) is proposed as a basic resource of an enterprise. The differences and interdependencies among data, information, and knowledge are discussed along with some ideas on what constitutes effective knowledge management for an enterprise. Three views of data management (Structural, Functional, and Physical) are put forward as necessary to the proper understanding of the data management activity of an enterprise. The Structural View is used to outline potential problems in knowledge management and to list several areas where further research and development are needed. Finally, a logical system design for possible implementation is suggested along with a discussion of essential managerial steps which should be taken to achieve corporate knowledge management. (Author)

Journal ArticleDOI
TL;DR: A security system for the GPLAN planning system that handles both data dependent and data independent security as well as “threat monitoring” is proposed.

Journal ArticleDOI
TL;DR: The needs for permanently changing the logical and physical structure of a medical database during the development of a health information system have initiated the project of implementing a DATA MANAGER.

Journal ArticleDOI
17 Jan 1976
TL;DR: A computer architecture is proposed which uses a hardware implemented descriptor system to provide facilities for explicit data typing, memory relocation, and access protection at the data element level, and offers significant speed improvement for languages using complex data types.
Abstract: A computer architecture is proposed which uses a hardware implemented descriptor system to provide facilities for explicit data typing, memory relocation, and access protection at the data element level. The problem of having to fetch descriptors for every operand access is overcome by storing descriptors in small fast memories of various types. The resulting machine runs simple languages (such as FORTRAN) as fast as conventional architectures, and offers significant speed improvement for languages using complex data types (such as data management systems). The cost of the descriptor storage hardware is shown to be modest, so this architecture would be suitable for machines as small as a large minicomputer.
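The descriptor mechanism the abstract proposes in hardware can be illustrated in software: every operand access is mediated by a descriptor carrying base, length, type, and access rights, so mis-typed, out-of-bounds, or unauthorized accesses are trapped. This sketch is a hypothetical illustration of the concept, not the proposed architecture:

```python
# Software illustration (hypothetical) of descriptor-mediated access.
# A descriptor records where a data element lives, how long it is, what type
# it holds, and whether it may be written; all accesses go through it.

class Descriptor:
    def __init__(self, base, length, typ, writable):
        self.base, self.length = base, length
        self.typ, self.writable = typ, writable

memory = [0] * 64  # flat "main memory"

def load(desc, offset, expected_type):
    if desc.typ != expected_type:
        raise TypeError("type mismatch trapped by descriptor")
    if not 0 <= offset < desc.length:
        raise IndexError("bounds violation trapped by descriptor")
    return memory[desc.base + offset]

def store(desc, offset, value):
    if not desc.writable:
        raise PermissionError("write to read-only element trapped")
    if not 0 <= offset < desc.length:
        raise IndexError("bounds violation trapped by descriptor")
    memory[desc.base + offset] = value

d = Descriptor(base=8, length=4, typ="int", writable=True)
store(d, 2, 99)
print(load(d, 2, "int"))  # 99
```

Done in software this check costs a call per access; the paper's point is that caching descriptors in small fast memories makes the same protection nearly free in hardware.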


Journal ArticleDOI
01 Apr 1976
TL;DR: The techniques employed in the MERLIN database to store and retrieve bibliographic data are described briefly, with particular attention paid to the necessity of accurately maintaining the intellectually established relationships between bibliographic data elements.
Abstract: This document describes, briefly, the techniques employed in the MERLIN database to store and retrieve bibliographic data. An explanation of the choice of value-related storage is given, with an indication of the merits of this approach both for data management (cataloguing) and retrieval (searching). Particular attention is paid to the necessity of accurately maintaining the intellectually established relationships between bibliographic data elements, and to the economic use of storage in a publicly shared data base.

Book ChapterDOI
01 Jan 1976
TL;DR: The goal is to put into the hands of engineers and scientists who use computer systems some computer science technology which is not yet widely available in contemporary software systems and/or programming language systems.
Abstract: This article deals primarily with the data structures of scientific computing and their representation and management. The coverage is at a fundamental although not elementary level. Emphasis is placed on applicable concepts and techniques. The goal is to put into the hands of engineers and scientists who use computer systems some computer science technology which is not yet widely available in contemporary software systems and/or programming language systems. The topics covered include fundamentals of abstract data structure, data representation and storage mapping functions for data structures, program control of executable storage, and an illustration of a data management system. Procedures for imbedding desired or needed data structuring and data management technology into the programming systems generally used for scientific computing are suggested.
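One of the topics listed, storage mapping functions for data structures, can be illustrated with the classic row-major mapping of a two-dimensional array onto linear storage. This is a textbook sketch of the concept, not an excerpt from the chapter:

```python
# Row-major storage mapping function: the address of A[i][j] in a flat
# buffer of ncols-wide rows is i * ncols + j.

def row_major_offset(i, j, ncols):
    return i * ncols + j

# Store a 3x4 matrix in a flat buffer and read it back through the mapping.
nrows, ncols = 3, 4
flat = [0] * (nrows * ncols)
for i in range(nrows):
    for j in range(ncols):
        flat[row_major_offset(i, j, ncols)] = 10 * i + j

print(flat[row_major_offset(2, 3, ncols)])  # 23
```

The same idea generalizes to column-major order and to higher-rank arrays, which is why such mapping functions are fundamental to scientific computing.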

Journal ArticleDOI
TL;DR: A discussion is presented describing what extensions were required to make the GPLAN system more adaptable to the planning process, including extending the network query language so that it would be “relatively complete”.
Abstract: This paper looks at the application of the Generalized Planning System (GPLAN) to a large scale water pollution problem. A discussion of the GPLAN system is presented along with a discussion of its application to the Federal Water Pollution Control Act Amendments of 1972. A discussion is then presented describing what extensions were required to make the GPLAN system more adaptable to the planning process. This included extending the network query language so that it would be “relatively complete”.


Book
01 Jan 1976
TL;DR: Book: “Management Standards for Developing Information Systems”.

Book ChapterDOI
01 Jan 1976
TL;DR: Some important CAD capabilities required to support aerospace design in the areas of interactive support, information management, and computer hardware advances are discussed.
Abstract: A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

01 May 1976
TL;DR: The design of a program for detailed simulation modelling of generalized data management systems is described, meant to facilitate experimentation with execution-time tactics such as core management, disk scheduling, prefetching, and others.
Abstract: The design of a program for detailed simulation modelling of generalized data management systems is described. The input to the program is a collection of process descriptions ('jobs') as sequences of block-oriented operations in a virtual data machine. The program accomplishes the binding onto a real hardware configuration, simulates the operation of a real multiprogrammed computer, and collects performance data. The design is meant to facilitate experimentation with execution-time tactics such as core management, disk scheduling, prefetching, and others. The framework accommodates a variety of computer architectures.
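One execution-time tactic such a simulator could compare is disk scheduling. As a toy stand-in for that kind of experiment, the sketch below contrasts total seek distance under first-come-first-served and shortest-seek-time-first policies on the classic textbook request queue (not an example from the report itself):

```python
# Toy comparison of two disk-scheduling tactics by total head movement.

def fcfs_seek(start, requests):
    """First-come-first-served: service requests in arrival order."""
    pos, total = start, 0
    for r in requests:
        total += abs(r - pos)
        pos = r
    return total

def sstf_seek(start, requests):
    """Shortest-seek-time-first: always service the nearest pending request."""
    pos, total, pending = start, 0, list(requests)
    while pending:
        nxt = min(pending, key=lambda r: abs(r - pos))
        total += abs(nxt - pos)
        pos = nxt
        pending.remove(nxt)
    return total

reqs = [98, 183, 37, 122, 14, 124, 65, 67]  # classic textbook request queue
print(fcfs_seek(53, reqs), sstf_seek(53, reqs))  # 640 236
```

A full simulator would embed such policies in a multiprogrammed machine model and measure throughput rather than raw seek distance, but the comparison shape is the same.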


01 Mar 1976
TL;DR: The first experiments for a computer conference management information system at the National Aeronautics and Space Administration were reported in this paper, where six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.
Abstract: Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.


31 Dec 1976
TL;DR: An assessment of the needs of a group of potential users of satellite remotely sensed data (state, regional, and local agencies) involved in natural resources management in five states, and alternative data management systems to satisfy these needs are outlined in this article.
Abstract: An assessment was made of the needs of a group of potential users of satellite remotely sensed data (state, regional, and local agencies) involved in natural resources management in five states, and alternative data management systems to satisfy these needs are outlined. Tasks described include: (1) a comprehensive data needs analysis of state and local users; (2) the design of remote sensing-derivable information products that serve priority state and local data needs; (3) a cost and performance analysis of alternative processing centers for producing these products; (4) an assessment of the impacts of policy, regulation and government structure on implementing large-scale use of remote sensing technology in this community of users; and (5) the elaboration of alternative institutional arrangements for operational Earth Observation Data Management Systems (EODMS). It is concluded that an operational EODMS will be of most use to state, regional, and local agencies if it provides a full range of information services -- from raw data acquisition to interpretation and dissemination of final information products.