
Showing papers on "Data management published in 1975"


01 Nov 1975
TL;DR: The DMS achieves its security by mapping its data base into the security structure provided by the operating system, with the result that the DMS need contain no security enforcement code.
Abstract: This report describes the design of a Secure Data Management System (DMS) that is to operate on a Secure MULTICS Operating System Kernel. The DMS achieves its security by mapping its data base into the security structure provided by the operating system, with the result that the DMS need contain no security enforcement code. The logical view chosen for the DMS is the relational view of data. (Author)
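
The security-by-mapping idea lends itself to a small illustration. Below is a minimal Python sketch, assuming a hypothetical kernel interface and a read-down rule in the spirit of a MULTICS-style kernel; the paper's actual design is not reproduced, and all names and levels are invented.

```python
# Hypothetical sketch: a relation is split into per-level segments, and all
# enforcement is delegated to a kernel-style reference monitor, so the DMS
# itself contains no security checks. Names and rules are illustrative only.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

class Kernel:
    """Stand-in for the operating-system security kernel."""
    def may_read(self, subject_level, segment_level):
        # Simple-security style rule: read down, never up.
        return LEVELS[subject_level] >= LEVELS[segment_level]

class Relation:
    def __init__(self, kernel):
        self.kernel = kernel
        self.segments = {level: [] for level in LEVELS}   # one segment per level

    def insert(self, tuple_, level):
        self.segments[level].append(tuple_)

    def select(self, subject_level):
        # The DMS only asks the kernel; it enforces nothing itself.
        return [t for level, seg in self.segments.items()
                if self.kernel.may_read(subject_level, level)
                for t in seg]

db = Relation(Kernel())
db.insert(("alice", "payroll"), "secret")
db.insert(("bob", "cafeteria"), "unclassified")
print(db.select("confidential"))   # -> [('bob', 'cafeteria')]
```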

58 citations



Proceedings ArticleDOI
22 Sep 1975
TL;DR: This research is devoted to the development of a methodology to automate and optimize the design of DBTG schema structures, using analytic modelling and optimization techniques.
Abstract: The production of an appropriate CODASYL Data Base Task Group (DBTG) Data Description Language (DDL) schema for a given data management application is a significant design problem. This research is devoted to the development of a methodology to automate and optimize the design of DBTG schema structures, using analytic modelling and optimization techniques. Given an implementation independent description of the data management requirements, it is possible to produce a schema configuration which is optimized with respect to logical record access rate, subject to storage and feasibility constraints, within a selected class of schemas. The storage/access rate trade-off is expressible as an integer program, which can be mapped into a network traversal problem with a known dynamic programming solution.
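
The storage/access trade-off can be pictured with a toy dynamic program. The Python sketch below is illustrative only (the paper's actual integer program and network formulation are not reproduced): one structuring option is chosen per record type so as to minimize total access cost under a storage budget, and all numbers are invented.

```python
# Toy DP over (stage, storage used): pick one structuring option per record
# type, minimizing access cost subject to a page budget.
import math

# options[i] = list of (storage_pages, access_cost) for record type i -- made-up
options = [
    [(10, 5.0), (25, 2.0)],           # e.g. sequential vs. indexed
    [(8, 7.0), (20, 3.5), (30, 1.5)],
    [(12, 4.0), (22, 2.5)],
]
BUDGET = 55

best = {0: 0.0}                        # best[s] = min access cost using s pages
for opts in options:
    nxt = {}
    for used, cost in best.items():
        for pages, access in opts:
            s = used + pages
            if s <= BUDGET:
                nxt[s] = min(nxt.get(s, math.inf), cost + access)
    best = nxt

print(min(best.values()))              # cheapest feasible configuration's cost
```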

38 citations


Proceedings Article
03 Sep 1975
TL;DR: TORUS is a natural language understanding system which serves as a front end to a data base management system in order to facilitate communication with a casual user and uses a semantic network for understanding each input statement and for deciding what information to output in response.
Abstract: This paper describes TORUS, a natural language understanding system which serves as a front end to a data base management system in order to facilitate communication with a casual user. The system uses a semantic network for "understanding" each input statement and for deciding what information to output in response. The semantic network stores general knowledge about the problem domain, in this case "student files" and the educational process at the University of Toronto, along with specific information obtained during the dialogue with the user. A number of associated functions make it possible to integrate the meaning of an input statement into the semantic network, and to select a portion of the semantic network which stores information that must be output. A first version of TORUS has been implemented and is currently being tested.
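
As a rough illustration of the integrate/select pattern described above, here is a toy triple-store sketch in Python; it is a drastic simplification, and the names and facts are invented, not taken from TORUS.

```python
# Toy semantic-network sketch: labelled triples between concept nodes.
# integrate() folds a parsed statement into the network; select() picks the
# fragment connected to a node, from which a response could be generated.
net = set()

def integrate(fact):
    net.add(fact)                     # fact = (node, relation, node)

def select(node):
    return {t for t in net if node in (t[0], t[2])}

integrate(("Smith", "enrolled-in", "CSC100"))
integrate(("CSC100", "taught-by", "Brown"))
integrate(("Smith", "standing", "year-2"))
print(select("CSC100"))               # both facts mentioning the course
```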

34 citations


Proceedings ArticleDOI
14 May 1975
TL;DR: This paper describes an ongoing project addressing the problem of converting and transferring data bases among disparate data management systems; basic conversion types are restricted to field-to-field mappings, and semantic analysis determines which combinations of conversion types are permissible.
Abstract: In this paper we describe an ongoing project which is addressing the problem of converting and transferring data bases among disparate data management systems (DMSs). The difficulties in converting a data base from one DMS to another stem from the fact that data base structures are system and application dependent. As a result, data base structures embed constraints of three types: (1) logical-level constraints, such as hierarchies, networks, size and type of fields; (2) storage-level constraints, such as inversion capabilities, access paths, and indexing organization; and (3) physical-level constraints, such as physical devices and block/record structures.

The approach taken by this project is based on the concept that the data conversion process can depend basically on conversion at the logical level only. Conversion at this level can be achieved by using existing query and generate capabilities of DMSs to move data from their physical representation to the logical level and vice versa. Detailed descriptions of the system components and the languages supporting them are given.

An important aspect of the work is in the area of semantics of logical data conversion. We choose to restrict the basic conversion types to field-to-field mappings. Then, semantic analysis determines which combinations of conversion types are permissible. This approach allows for conversion tools that are powerful yet simple to specify.

Finally, some observations on the implications of data conversion needs for the design of data management systems are suggested.
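
To make the field-to-field restriction concrete, here is a minimal Python sketch, assuming invented field names and a simple type-compatibility check standing in for the paper's semantic analysis.

```python
# Hypothetical sketch of logical-level conversion restricted to
# field-to-field mappings; permissible() is a crude stand-in for the
# semantic analysis the paper describes.

source_records = [
    {"EMP_NAME": "Jones", "EMP_NO": "00417", "DEPT": "Rolls"},
]

# mapping: target field -> (source field, converter)
mapping = {
    "name":   ("EMP_NAME", str),
    "number": ("EMP_NO",   int),      # string digits -> integer
    "dept":   ("DEPT",     str),
}

def permissible(record, mapping):
    # Semantic check: every source field exists and converts cleanly.
    for src, conv in mapping.values():
        if src not in record:
            return False
        try:
            conv(record[src])
        except (TypeError, ValueError):
            return False
    return True

def convert(records, mapping):
    for r in records:
        if not permissible(r, mapping):
            raise ValueError(f"impermissible mapping for {r}")
        yield {tgt: conv(r[src]) for tgt, (src, conv) in mapping.items()}

print(list(convert(source_records, mapping)))
```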

32 citations


Proceedings ArticleDOI
19 May 1975
TL;DR: The Datacomputer is a large-scale data management and storage utility for use by a network of computers designed to provide facilities for data sharing among dissimilar machines, rapid access to large on-line files, storage economy through shared use of a trillion-bit store, and improved access control.
Abstract: The Datacomputer is a large-scale data management and storage utility for use by a network of computers. The system is designed to provide facilities for data sharing among dissimilar machines, rapid access to large on-line files, storage economy through shared use of a trillion-bit store, and improved access control.

29 citations


Journal ArticleDOI
Suresh K. Jain1
TL;DR: In this article, the authors describe the design, working, performance, costs and benefits of a computerized scheduling and management information system that has been developed for a large, generalized m/n machine shop which had previously been a bottleneck facility in the Cast Roll Manufacturing complex of Bethlehem Steel Corporation.
Abstract: This paper describes the design, working, performance, costs and benefits of a computerized scheduling and management information system that has been developed for a large, generalized m/n machine shop which had previously been a bottleneck facility in the Cast Roll Manufacturing complex of Bethlehem Steel Corporation. A flexible discrete-event simulation model, at the heart of the computer system, generates two types of schedules: Planning Schedules for making long- and medium-term planning and operating decisions, and Production Schedules for sequencing approximately 1,000 rolls on 40 machines on a day-to-day basis. A number of supporting programs perform data management, file management, and report-generation functions. Data-collection functions are deliberately performed manually to avoid the high costs associated with automatic data-collection equipment normally used in large computer systems. The computer programs are run in a time-sharing environment. A cathode ray tube and a medium-speed printer ...
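
The simulation kernel of such a system can be suggested in a few lines of Python. The sketch below is a minimal discrete-event skeleton, assuming an invented earliest-free-machine dispatch rule and made-up job data; it stands in for, and does not reproduce, the Bethlehem model.

```python
# Minimal discrete-event scheduling sketch: dispatch jobs to machines,
# then replay completion events in time order from a priority queue.
import heapq

jobs = [("roll-%03d" % i, 2.0 + (i % 3)) for i in range(6)]   # (name, hours)
machines = [0.0, 0.0]             # next-free time of each of two machines
events = []                        # heap of (completion_time, machine, job)

for name, hours in jobs:           # dispatch each job to the earliest-free machine
    m = min(range(len(machines)), key=lambda i: machines[i])
    heapq.heappush(events, (machines[m] + hours, m, name))
    machines[m] += hours

while events:                      # replay completions in time order
    t, m, name = heapq.heappop(events)
    print(f"{t:5.1f}h  machine {m}  finished {name}")
```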

21 citations


Proceedings ArticleDOI
22 Sep 1975
TL;DR: A data structuring scheme for authorization purposes is presented that provides a powerful and flexible way of defining access control rules and allows convenient evaluation of access.
Abstract: A data structuring scheme for authorization purposes is presented, that: (i) provides a powerful and flexible way of defining access control rules; (ii) allows convenient evaluation of access. Algorithms for definition and evaluation of access are given, and possible applications are discussed.
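
A possible shape for such a structure, as a hedged Python sketch (the rule format and wildcard convention here are invented, not the paper's):

```python
# Illustrative sketch only: access rules structured as
# (subject, object, operations) triples, with '*' matching any subject.

rules = [
    ("clerk",     "student-file", {"read"}),
    ("registrar", "student-file", {"read", "update"}),
    ("*",         "catalog",      {"read"}),
]

def access_allowed(subject, obj, operation):
    # Evaluation: scan for any rule whose subject and object match and
    # whose operation set contains the requested operation.
    return any(operation in ops
               for s, o, ops in rules
               if s in ("*", subject) and o == obj)

print(access_allowed("clerk", "student-file", "update"))   # False
print(access_allowed("anyone", "catalog", "read"))         # True
```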

21 citations


Proceedings ArticleDOI
14 May 1975
TL;DR: In this article, the current status of the Study Group on Data Base Management Systems in the United States is discussed, and a set of requirements for effective data base management systems is presented.
Abstract: This paper is a report on the current (1975 May) status of the Study Group on Data Base Management Systems in the United States. While the official purpose of this Study Group is an investigation of standardization potential in the area of data base management systems, an important by-product of the work of the Group has been the development of a set of requirements for effective data base management systems. As no existing or proposed implementation of a data base management system satisfies these requirements, nor comprehends more than a fraction of the concepts involved, it is appropriate to explicate these ideas in the present forum.

20 citations


Journal ArticleDOI
TL;DR: A conceptual model of the decision-making process in fisheries management is presented in conjunction with applications of computer and systems analysis to this process.
Abstract: A conceptual model of the decision-making process in fisheries management is presented in conjunction with applications of computer and systems analysis to this process. One of the most difficult problems to solve is selecting objectives to be used for management evaluation. An objective function based on some or all of the components of yield, species, size desirability, and environmental quality is needed. Systems analysis and computer technology in data processing and simulation may be used in many situations to evaluate decision alternatives as an aid in developing management strategies.
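
As a worked illustration of such an objective function, the Python sketch below weights the four components the abstract names; the weights, alternatives, and scores are all invented.

```python
# Hedged sketch: weighted objective over yield, species mix, size
# desirability, and environmental quality, used to compare two invented
# management alternatives.

WEIGHTS = {"yield": 0.4, "species": 0.2, "size": 0.2, "environment": 0.2}

alternatives = {
    "quota-A": {"yield": 0.9, "species": 0.5, "size": 0.6, "environment": 0.4},
    "quota-B": {"yield": 0.6, "species": 0.8, "size": 0.7, "environment": 0.8},
}

def objective(scores):
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

best = max(alternatives, key=lambda a: objective(alternatives[a]))
print(best, round(objective(alternatives[best]), 2))   # -> quota-B 0.7
```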

17 citations


01 Sep 1975
TL;DR: This paper presents a meta-modelling system that automates the labor-intensive, time-consuming, and therefore expensive process of manually cataloging and calculating speed limit values for individual vehicles.
Abstract: For more information about this work, see the Center for Transportation Research Library catalog: http://library.ctr.utexas.edu/dbtw-wpd/query/id/32100

Proceedings ArticleDOI
14 May 1975
TL;DR: The importance of DDLs in computer-aided design (CAD) is described, and users of CAD systems are compared with users of business data processing systems; they are shown to have radically different skills, to view data in different ways, and to perform different operations upon data.
Abstract: Data Description Languages (DDLs) usually are discussed in terms of business data processing applications. This paper describes the importance of DDLs in computer-aided design (CAD). Users of CAD systems are compared with users of business data processing systems, and are shown to have radically different skills, view data in different ways, and perform different operations upon data. Users of CAD systems are concerned not so much with frequent update or casual interrogation as with powerful and flexible representation of interconnections and mathematical constraints among components. The implications of CAD requirements for the relational and network models are discussed.
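
To suggest what "interconnections and mathematical constraints among components" can look like as data, here is a toy Python sketch; the circuit, values, and constraint are invented.

```python
# Toy illustration of CAD-flavoured data: components, an interconnection,
# and a mathematical constraint checked over the network.

components = {"R1": 100.0, "R2": 220.0}          # name -> resistance (ohms)
connections = [("R1", "R2")]                      # series interconnection

def series_resistance(names):
    return sum(components[n] for n in names)

# Constraint among components: total series resistance must equal 320 ohms.
assert abs(series_resistance(["R1", "R2"]) - 320.0) < 1e-9
print("constraint satisfied")
```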


01 Oct 1975
TL;DR: The CODASYL Data Base Task Group (DBTG) approach to data management is described and then evaluated in terms of CASD, including a capability for describing new special-purpose sets in a sub-schema.
Abstract: Computer-aided ship design (CASD) presents unusual requirements for data management: multiple views and working files must be described and processed. The CODASYL Data Base Task Group (DBTG) approach to data management is described and then evaluated in terms of CASD. Enhancements are recommended, including a capability for describing new special-purpose sets in a sub-schema, a capability for specifying a subset of a hierarchical structure as part of a user's view in a sub-schema, and a capability for defining procedures in the sub-schema to allow the user to conveniently follow a linear path through a complex structure. (Author)


Journal ArticleDOI
TL;DR: In this article, the authors delineate three fundamental classes of activity and associated personnel: (1) management science and scientists, (2) management technology and technologists, (3) management practice and managers.
Abstract: The transfer of management science into management practice is examined. Starting with the TIMS definition of management science, we delineate three fundamental classes of activity and associated personnel: (1) management science and scientists, (2) management technology and technologists, (3) management practice and managers. The inter-group communication flows which are necessary for the transfer of management science into management practice are then developed. The examination of management science utilization problems leads us to the hypothesis that management scientists have only commented upon, rather than studied, the process of management science application. Here a structure for such a study is developed.

01 Feb 1975
TL;DR: The goal of this thesis is reduction of these data management tasks by developing and applying a practical theory of data structure.
Abstract: Data management programmers are finding their jobs are getting tougher because of the gradual replacement of sequential data bases by network data bases. In addition, there is a new job called 'Data Administrator' for handling the data structure problems associated with network data bases. The goal of this thesis is the reduction of these data management tasks by developing and applying a practical theory of data structure. To ensure the practical flavor of this research, the Data Base Task Group (DBTG) report has been selected as the specification of the data management system in which the applications function.


Journal ArticleDOI
TL;DR: The concept of computer maps as a data management and resource optimization technique in pest management is presented and a specific example is the use of the widely available computer mapping program SYMAP in the pest management program for the cereal leaf beetle, Oulema melanopus.
Abstract: The concept of computer maps as a data management and resource optimization technique in pest management is presented. A specific example is the use of the widely available computer mapping program SYMAP in the pest management program for the cereal leaf beetle, Oulema melanopus (L.).
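
The flavor of a line-printer density map of the SYMAP kind can be shown in a few lines of Python; the grid counts and symbol ramp below are invented.

```python
# Sketch of the line-printer density-map idea: grid counts of trap catches
# are rendered as darker characters for higher densities.

counts = [
    [0, 1, 3, 9],
    [1, 4, 7, 5],
    [0, 2, 6, 2],
]
RAMP = " .:*#"          # blank = none ... '#' = heaviest infestation

peak = max(max(row) for row in counts)
for row in counts:
    print("".join(RAMP[min(len(RAMP) - 1, c * (len(RAMP) - 1) // peak)]
                  for c in row))
```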

Journal ArticleDOI
TL;DR: One approach to engineering data base management is presented; its major advantage is that the integration of independently developed modules is possible and extremely effective, as demonstrated in the development of the FINITE system.
Abstract: One approach to engineering data base management has been presented herein. The problem is a complex one and has no apparent correct or incorrect solution. Various techniques have been tried by other investigators. Each generation brings new ideas and hopefully better approaches to engineering data management. FILES is a relatively new system that incorporates ideas from predecessor systems and includes features that were not present in the earlier systems. Among these are flexibility, a separate data definition language, a distinct data management compiler, data management execution system, and machine independence. Its use through POLO grammars is extremely simple. Grammar references to information which resides in data bases are very similar to FORTRAN data references. If one had to point to the one major advantage of using a system such as FILES, it would clearly be that the integration of independently developed modules is possible and apparently extremely effective as has been demonstrated in the development of the FINITE system.


Proceedings ArticleDOI
22 Sep 1975
TL;DR: Two experiments were performed by the Sperry Corporate Research Center to obtain estimates of the percentage of CP time which could be offloaded from the host to a dedicated data management computer, or DDM.
Abstract: There is growing concern among users of data base management systems regarding the cost and performance of such systems. As is typical of highly complex, generalized software, data management systems can be expensive to run, particularly in terms of processing time. A possible solution to this problem is to offload the data management functions into an inexpensive, dedicated minicomputer, called a dedicated data management computer, or DDM. This "backend" approach to the problem has been investigated by Bell Telephone Laboratories [1], using a large DBTG-based system. Their results indicated that such an approach is feasible, though no performance figures have been published. In an effort to obtain more detailed data on the performance implications of a DDM system, two experiments were performed by the Sperry Corporate Research Center. The first experiment was designed to obtain estimates of the percentage of CP time which could be offloaded from the host to a DDM. The second experiment, a prototype implementation of a DDM configuration using a standard minicomputer, was designed to demonstrate the credibility of such a system.
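
The first experiment's kind of estimate reduces to simple arithmetic, sketched below with invented timings (not Sperry's figures).

```python
# Back-of-envelope sketch: what fraction of host CP time could move to a
# dedicated data management computer (DDM)? Timings are invented.

total_cp_seconds = 120.0       # host CP time for the measured workload
dms_cp_seconds = 78.0          # portion spent inside data management code

offloadable = dms_cp_seconds / total_cp_seconds
print(f"offloadable CP time: {offloadable:.0%}")   # -> 65%
```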

Proceedings ArticleDOI
Eric D. Carlson1
22 Sep 1975
TL;DR: This paper describes a method, called data extraction, for interfacing large data bases with interactive problem solving systems; in addition to an I/O interface, it provides interactive data description and presentation functions.
Abstract: Interactive problem solving involves user-machine dialogues to solve unstructured problems, such as selecting marketing strategies and planning mass transit routes. A characteristic of most interactive problem solving systems is that they operate on data bases that are special purpose subsets derived from a large data base. The problem solving system must include code for interfacing with this data base, or the data must be converted to the formats required by the problem solving system. Effective interactive problem solving requires a data management system which provides flexibility in accessing a large data base and fast performance for the problem solving system. This paper describes a method, called data extraction, for interfacing large data bases with interactive problem solving systems. Figure 1 shows the basic components in the interface. A large data base management system is used to maintain and protect a set of data files. Data extraction is used to aggregate and subset from this set to provide information for problem solving. In addition to an I/O interface, data extraction provides interactive data description and presentation functions. For more details on the functional requirements of the data extraction components see [2].
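
A minimal Python sketch of the subset-then-aggregate step described above; the field names, data, and the helper `extract` are invented for illustration.

```python
# Subset a large file by a predicate, then aggregate it into the small
# special-purpose data set a problem solving system would use.

trips = [
    {"route": 12, "mode": "bus",  "riders": 340},
    {"route": 12, "mode": "bus",  "riders": 410},
    {"route": 7,  "mode": "rail", "riders": 900},
]

def extract(records, predicate, key, value):
    """Subset by predicate, then total the value field per key."""
    totals = {}
    for r in records:
        if predicate(r):
            totals[r[key]] = totals.get(r[key], 0) + r[value]
    return totals

print(extract(trips, lambda r: r["mode"] == "bus", "route", "riders"))
# -> {12: 750}
```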

01 Jan 1975
TL;DR: This dissertation presents a systematic methodology for making configuration decisions for large data bases, illustrated by its application to the configuration of a large data base: the 1970 Census of Population and Housing.
Abstract: This dissertation presents a systematic methodology for making configuration decisions for large data bases. For each phase of the methodology, informal and operational decision aids are provided. The primary design tool described is an interactive Data Base Configuration Model (DBCM). This model was developed to aid the data base designer in evaluating and comparing the cost and performance of alternative configurations. The methodology is illustrated by its application to the configuration of a large data base: the 1970 Census of Population and Housing.

Proceedings ArticleDOI
22 Sep 1975
TL;DR: The implementation of a methodology for data base conversion that operates on the logical structure of data bases, using the existing query and generate capabilities of systems to provide a simple, intuitive language for a user who wishes to convert an existing data base.
Abstract: In a previous paper [1], we described a methodology for data base conversion that operates on the logical structure of data bases, using the existing query and generate capabilities of systems (typically data management systems). In this paper, we describe the implementation of such a converter in terms of its components and the internal data structures used. In [1] we discussed other approaches which were suggested in the literature and pointed out the advantages we see in our approach. The major advantage is complete independence from the internal organizational features of data bases, such as access paths and inversion and indexing mechanisms. As a result, we can provide a simple, intuitive language for a user who wishes to convert an existing data base.

Journal ArticleDOI
TL;DR: This paper describes the maintenance and enhancement of the Computer Sciences Teleprocessing System (CSTS) and its component processors, including the design and documentation of test programs and the organization of test sets.
Abstract: This paper describes the maintenance and enhancement of the Computer Sciences Teleprocessing System (CSTS) and its component processors. CSTS is the system offered by the INFONET Division of Computer Sciences Corporation to provide nationwide conversational and batch teleprocessing service. In the first section, the organization of the project personnel and major activities of project departments are described. The second section describes the process by which functional enhancements and error corrections are implemented. The final section describes testing techniques and procedures used during implementation.

The project staff is organized into implementation departments and service departments. Implementation departments develop new features of the system, corrections to errors, and improvements in system operation, especially in reduction of software overhead. Implementation departments are organized by system software functions such as operating system (device control, task management, and file management), language processors and data management systems, communications software, and applications. The service departments are system integration, performance analysis, product management and system test. The functions performed by these departments are described. Also described are the system evolution conferences and the periodic review of enhancements by a Change Advisory Board.

During the development of a new version of the system, effective tracking of the status of changed modules is essential. The data base and process used to track new features, error corrections and changed component modules are described. The flow of implemented changes through system integration and system test is delineated, as is the development and verification of change documentation for users, administrators and operations personnel.

The final section of the paper describes the design and documentation of test programs and the organization of test sets. Effective test operations are achieved by using self-checking tests as well as automated test operation and verification. Tests for compliance with functional specifications and for conformity to internal design are described.
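
The self-checking idea mentioned in the final section can be sketched in a few lines of Python; the function under test and the cases are stand-ins, not CSTS code.

```python
# Self-checking test sketch: each case carries its own expected result and
# verifies itself, so large test sets can run unattended.

def tested_function(x):
    return x * 2                       # stand-in for a component under test

CASES = [(0, 0), (3, 6), (-2, -4)]     # (input, expected)

failures = [(x, want, tested_function(x))
            for x, want in CASES if tested_function(x) != want]
print("PASS" if not failures else f"FAIL: {failures}")
```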

Journal ArticleDOI
TL;DR: In this article, the authors outline a computerized management information system that could be made available to every manager of a modern dairy operation, providing the agricultural manager with accurate, timely information in a form that is most useful for today's management decisions.


01 Apr 1975
TL;DR: The state-of-the-art is described in 16 areas related to data management and resource sharing on computer networks.
Abstract: The state-of-the-art is described in 16 areas related to data management and resource sharing on computer networks. The data management topics covered include data organization (structures and access techniques), optimization (hashing, clustering, partitioning, and compression), data languages (for structure definition and query), and file allocation on a network. The topics covered relating to the network and systems environment include communications, resource allocation, measurement and evaluation, network front ends, and security. The topics relating to network applications cover general user support and network management.
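
One of the optimization topics listed, hashing, can be illustrated with a toy hashed file in Python; the bucket count and record are invented.

```python
# Toy hashed-file lookup with chained overflow: records are placed in a
# bucket chosen by hashing the key, and collisions share a chain.

BUCKETS = 8
file_ = [[] for _ in range(BUCKETS)]        # each bucket is an overflow chain

def store(key, record):
    file_[hash(key) % BUCKETS].append((key, record))

def fetch(key):
    return next((rec for k, rec in file_[hash(key) % BUCKETS] if k == key),
                None)

store("EMP00417", {"name": "Jones"})
print(fetch("EMP00417"))                     # -> {'name': 'Jones'}
```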

01 Dec 1975
TL;DR: The intent of this study has been to determine the requirements of a back-end processor including both hardware and software and to analyze a variety of commercially available mini-computers to see how they meet these requirements.
Abstract: This report will present the results of a study to determine the most promising mini-computer processor using current hardware technology for implementing the Integrated Database Management System (IDMS), ultimately leading to a commercially viable back-end database management system. The intent of this study has been to determine the requirements of a back-end processor, including both hardware and software, and to analyze a variety of commercially available mini-computers to see how they meet these requirements. Certain data collected from hardware manufacturers concerning both their hardware and their software is documented in this study, together with a comparison of their products. It must be noted that this study is the result of a mutual interest in back-end database management systems by Cullinane Corporation and various government agencies, and reflects a desire on the part of Cullinane Corporation and others to investigate the technical and economic feasibility of back-end database management systems.