Showing papers on "Data management published in 1980"


Journal ArticleDOI
TL;DR: This paper presents an overview of the SDD-1 design and its solutions to the problems raised by distributing data.
Abstract: The declining cost of computer hardware and the increasing data processing needs of geographically dispersed organizations have led to substantial interest in distributed data management. SDD-1 is a distributed database management system currently being developed by Computer Corporation of America. Users interact with SDD-1 precisely as if it were a nondistributed database system because SDD-1 handles all issues arising from the distribution of data. These issues include distributed concurrency control, distributed query processing, resiliency to component failure, and distributed directory management. This paper presents an overview of the SDD-1 design and its solutions to the above problems. This paper is the first of a series of companion papers on SDD-1 (Bernstein and Shipman [2], Bernstein et al. [4], and Hammer and Shipman [14]).
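
The transparency claim above (users interact as if the database were nondistributed) can be pictured with a minimal, hypothetical sketch: a coordinator hides which site holds which fragment of a relation. All names below are invented for illustration and are not SDD-1's actual interfaces.

```python
# Hypothetical sketch of location transparency: the caller issues one
# query; the coordinator fans it out to every site holding a fragment
# and merges the results.

class Site:
    """One node holding a horizontal fragment of a relation."""
    def __init__(self, name, rows):
        self.name = name
        self.rows = rows  # list of dicts

    def select(self, predicate):
        return [r for r in self.rows if predicate(r)]

class Coordinator:
    """Hides distribution from the user, as the abstract says SDD-1 does."""
    def __init__(self, fragment_map):
        self.fragment_map = fragment_map  # relation name -> list of Sites

    def select(self, relation, predicate):
        results = []
        for site in self.fragment_map[relation]:
            # In a real distributed DBMS this would be a network call.
            results.extend(site.select(predicate))
        return results

boston = Site("boston", [{"emp": "a", "dept": 10}])
denver = Site("denver", [{"emp": "b", "dept": 20}])
db = Coordinator({"employees": [boston, denver]})
print(db.select("employees", lambda r: r["dept"] == 20))  # caller never names a site
```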

253 citations


Journal ArticleDOI
TL;DR: An overview of the SDMS concept is presented, and its implementation in a prototype system for retrieving information from both a symbolic database management system and an optical videodisk is described.
Abstract: Spatial data management is a technique for organizing and retrieving information by positioning it in a graphical data space (GDS). This graphical data space is viewed through a color raster-scan display which enables users to traverse the GDS surface or zoom into the image to obtain greater detail. In contrast to conventional database management systems, in which users access data by asking questions in a formal query language, a spatial data management system (SDMS) presents the information graphically in a form that seems to encourage browsing and to require less prior knowledge of the contents and organization of the database. This paper presents an overview of the SDMS concept and describes its implementation in a prototype system for retrieving information from both a symbolic database management system and an optical videodisk.
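
To make the zoom-for-detail idea concrete, here is a small hypothetical sketch: items in the graphical data space carry a minimum zoom level, so zooming in reveals progressively more detail within the same viewport. The data and function names are invented, not taken from SDMS.

```python
# Items carry a position in the graphical data space (GDS) and the
# minimum zoom level at which they become visible.
ITEMS = [
    {"pos": (10, 20), "min_zoom": 1, "label": "Fleet"},
    {"pos": (12, 21), "min_zoom": 2, "label": "Ship: Nimitz"},
    {"pos": (12, 21), "min_zoom": 3, "label": "Ship: Nimitz, crew roster"},
]

def visible(viewport, zoom):
    """Return items inside the viewport whose detail level is unlocked
    at the current zoom; zooming in reveals progressively more."""
    (x0, y0), (x1, y1) = viewport
    return [it["label"] for it in ITEMS
            if x0 <= it["pos"][0] <= x1 and y0 <= it["pos"][1] <= y1
            and zoom >= it["min_zoom"]]

print(visible(((0, 0), (50, 50)), zoom=1))  # coarse overview
print(visible(((0, 0), (50, 50)), zoom=3))  # zoomed-in view shows detail
```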

223 citations


01 Oct 1980
TL;DR: A broad-brush description of the basic goals and philosophy of a research program at SRI International aimed at developing the technology needed to support systems that can be tutored in English about new subject areas, and that can thereafter aid the user in filing and retrieving information and in conveniently applying to the new subject area other computer software.
Abstract: This report presents a broad-brush description of the basic goals and philosophy of a research program at SRI International (SRI) aimed at developing the technology needed to support systems that can be tutored in English about new subject areas, and that can thereafter aid the user in filing and retrieving information and in conveniently applying to the new subject area other computer software, such as data-base management systems (DBMS), planners, schedulers, report generators, and simulators. These systems, which the authors call Knowledge Learning and Using Systems (KLAUS), are intended to act as brokers between the user's needs, as expressed in the user's terms, and the resources available in a rich computational environment. In a nutshell, the core concept of a KLAUS is that of an interactive system preprogrammed with essential skills for readily learning the concepts and vocabulary of new subject domains, and with expertise for applying acquired knowledge in problem-solving situations. A KLAUS is tutored about new domains in English (perhaps also using tables, menus, and domain-specific formalisms). While being taught, a KLAUS does not play a passive role, but actively looks for gaps and inconsistencies in its knowledge, asking its tutor pointed clarification questions. In this manner, the KLAUS aids the user in formalizing, organizing, and clarifying his or her ideas. After tutoring, a KLAUS can aid its tutor and other users in performing tasks that require combining knowledge of the new domain with knowledge of how to use sophisticated computer systems.

154 citations


01 Sep 1980
TL;DR: This user's guide is intended for the person who knows how to log in to the host operating system and how to enter and edit a line of text; it explains how to use D-LADDER on a demonstration basis.
Abstract: D-LADDER (DIAMOND-based Language Access to Distributed Data with Error Recovery) is a computer system designed to provide answers to questions posed at the terminal in a subset of natural language regarding a distributed data base of naval command and control information. The system accepts natural-language questions about the data. For each question D-LADDER plans a sequence of appropriate queries to the data base management system, determines on which machines the queries are to be processed, establishes links to those machines over the ARPANET, monitors the processing of the queries and recovers from certain errors in execution, and prepares a relevant answer to the original question. This user's guide is intended for the person who knows how to log in to the host operating system, as well as how to enter and edit a line of text. It does not explain how D-LADDER works, but rather how to use it on a demonstration basis.
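
The processing sequence described above (plan queries, route each one to the machine holding the data, assemble an answer) can be compressed into a rough sketch. Every name and function body below is a stand-in; the real system parsed English and queried remote DBMSs over the ARPANET, with monitoring and error recovery elided here.

```python
# Stand-in pipeline: English question -> planned formal queries ->
# dispatch to the machine holding each relation -> assembled answer.

HOSTS = {"ships": "host-a", "ports": "host-b"}  # invented placement table

def plan(question):
    # A real planner maps English to formal queries; this stub is fixed.
    return [("ships", "home_port = 'Norfolk'")]

def execute_on(host, relation, query):
    # Stub for a remote query over the network.
    return f"(rows of {relation} from {host} where {query})"

def answer(question):
    return [execute_on(HOSTS[rel], rel, q) for rel, q in plan(question)]

print(answer("Which ships are based in Norfolk?"))
```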

63 citations


Proceedings ArticleDOI
14 May 1980
TL;DR: This paper defines an expert and indicates how it would be added to one existing data base system and suggests two appropriate mechanisms, namely hypothetical data bases (HDB's) and experts.
Abstract: This paper is concerned with adding knowledge to a data base management system and suggests two appropriate mechanisms, namely hypothetical data bases (HDB's) and experts. Herein we indicate the need for HDB's and define the extensions that are needed to a data base system to support HDB's. In addition, we suggest that the notion of "experts" is an appropriate way to add semantic knowledge to a data base system. Unlike most other proposals which extend an underlying data model to capture more meaning, our proposal does not require extensions to the schema. Moreover, the DBMS does not even have to know how an expert functions. In this paper we define an expert and indicate how it would be added to one existing data base system.
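
A rough illustration of the "expert" mechanism as the abstract describes it: semantic knowledge attached to a relation as an opaque callable, requiring no schema extension and no DBMS knowledge of the expert's internals. The registration API below is invented for illustration.

```python
class Table:
    def __init__(self):
        self.rows = []
        self.experts = []  # opaque callables consulted on insert

    def add_expert(self, fn):
        # The DBMS stores the expert without knowing how it functions.
        self.experts.append(fn)

    def insert(self, row):
        for expert in self.experts:
            row = expert(row)  # an expert may veto, correct, or enrich
        self.rows.append(row)

salaries = Table()

def salary_expert(row):
    # Encodes knowledge the schema does not: reject implausible salaries.
    if row["salary"] > 1_000_000:
        raise ValueError("implausible salary rejected by expert")
    return row

salaries.add_expert(salary_expert)
salaries.insert({"emp": "a", "salary": 30_000})        # accepted
# salaries.insert({"emp": "b", "salary": 9_999_999})   # would be rejected
```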

55 citations


Proceedings ArticleDOI
19 May 1980
TL;DR: It is the authors' contention that the designers of a database machine must choose a focus for their machine which strongly influences other design decisions, and the reasons for this choice are presented.
Abstract: The Intelligent Database Machine (IDM), manufactured by Britton-Lee Inc., is a back-end processor and storage system that contains a complete data management system. It includes specialized hardware to perform data management functions. It is our contention that the designers of a database machine must choose a focus for their machine which strongly influences other design decisions. The IDM is a low-cost, high-performance machine designed to support "midrange" users. This paper presents the reasons for this choice, and the resultant design issues.

44 citations


Journal ArticleDOI
TL;DR: Experiences with geochemical exploration methods in the U.S. National Uranium Resource Evaluation Program and the Canadian Uranium Reconnaissance Program are drawn upon to review concepts and standards for data management and analysis.

29 citations


Journal ArticleDOI
TL;DR: General operational requirements of data management systems are presented, the notion of activity levels is introduced, and the evolution of data management techniques in the context of computerized structural analysis is briefly reviewed.

24 citations


Journal ArticleDOI
01 Sep 1980
TL;DR: Some characteristics of the data management function for a DSS are developed, based on data management functions in some existing management tools as well as general developments within database theory.
Abstract: In describing a DSS architecture it is common to include a database as a component in the system. However, the DSS literature pays little attention to the data management function. Emphasis is placed on model aspects and user-system interface. In this paper we shall develop some characteristics of the data management function for a DSS. They are based on data management functions in some existing management tools as well as general developments within database theory. Finally, a particular feature of data management, data extraction, is treated. Data extraction enables loading of data into a DSS from data files external to the DSS.
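
The data-extraction feature mentioned at the end is essentially an extract-and-rename load step from a file external to the DSS. A minimal sketch, with an invented file layout and column mapping:

```python
import csv
import io

def extract(source, column_map):
    """Turn an external CSV into DSS-internal records, renaming the
    external headers to the DSS model's variable names and dropping
    columns the DSS does not use."""
    return [{column_map[k]: v for k, v in row.items() if k in column_map}
            for row in csv.DictReader(source)]

# An in-memory stand-in for a data file external to the DSS.
external_file = io.StringIO("REGION,AMT,JUNK\nnorth,120,x\nsouth,95,y\n")
print(extract(external_file, {"REGION": "region", "AMT": "amount"}))
# -> [{'region': 'north', 'amount': '120'}, {'region': 'south', 'amount': '95'}]
```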

20 citations


Book
01 Jul 1980

19 citations


Book ChapterDOI
01 Jan 1980
TL;DR: The management software of EIDES includes a set of executive routines that free each primitive image processing routine designed for in-core use from data management work, assuring transportability between systems.
Abstract: The image database EIDES was developed at the Electrotechnical Laboratory. It contains a considerable number of standard images for experimental studies on pattern recognition and image processing. The file structure of EIDES is based on the ‘Standard Format for Digital Images in Japan.’ This paper describes the management software of EIDES. Many additional subroutines exist for format conversion or for access to image data in users' image processing programs. In particular, a set of executive routines can be used for manipulating large images stored in EIDES. These routines free each primitive image processing routine designed for in-core use from data management work, which assures transportability between systems.
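
One way to picture the executive routines: the executive fetches manageable strips of an image too large for core and hands them to a primitive routine written for in-core arrays, which therefore needs no data-management logic of its own. The strip scheme and names below are invented for illustration.

```python
def threshold(strip, level):
    """Primitive in-core routine: knows nothing about files or strips."""
    return [[1 if px >= level else 0 for px in row] for row in strip]

def run_by_strips(image, strip_height, primitive, **kw):
    """Executive routine: fetches manageable strips and reassembles
    the result, doing all the data management on the primitive's behalf."""
    out = []
    for top in range(0, len(image), strip_height):
        out.extend(primitive(image[top:top + strip_height], **kw))
    return out

image = [[x * y % 7 for x in range(6)] for y in range(6)]  # stand-in image
print(run_by_strips(image, strip_height=2, primitive=threshold, level=3))
```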



Journal ArticleDOI
TL;DR: Data management and communication have improved, allowing nurses more time for direct patient care, and the teaching of residents and nurses has been facilitated, minimizing disparities arising from their diverse experience.
Abstract: To solve the problem of data management, a digital computer was introduced in this ICU in 1977. Data are manually entered at bedside alphanumeric keyboards; two beds are directly connected to the computer. The system was especially designed to work in the 11-bed ICU; its functions are: (1) admission, discharge, and transfer data of patients; (2) management of doctors' and nurses' notes in free-text form; (3) management of the problem-oriented record; (4) management of physical and biochemical variables, medical disorders, and fluid balance; and (5) diagnostic and therapeutic decision-making. Since 1977, the authors have computerized over 2600 patients and now conclude: (1) data management and communication have improved, allowing nurses more time for direct patient care; (2) teaching of the residents and nurses has been facilitated, minimizing disparities arising from their diverse experience; (3) it has contributed to the development of protocols for many of the procedures; and (4) it has led to a more systematic approach to patient care. The assistance of a professional computer programmer and continuous maintenance of the software are essential.

Proceedings ArticleDOI
14 May 1980
TL;DR: In this article, the Design and Access Path Data Models are presented to form an integrated framework for logical and physical database design in a heterogeneous database environment, where a physical design is specified in terms of general properties of access paths, independent of implementation details.
Abstract: Design and Access Path Data Models are presented to form an integrated framework for logical and physical database design in a heterogeneous database environment. This paper focuses on the physical design process. First, a physical design is specified in terms of general properties of access paths, independent of implementation details. Then, a design is realized by mapping the specification into the storage structures of a particular database system. Algorithms for assigning the properties to logical access paths and for realizing a CODASYL 78 DBTG schema are given.
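
The two-step process above (specify implementation-neutral access-path properties, then realize them for one target system) might look like the following sketch. The property names and the mapping table are invented; a CODASYL 78 DBTG realization would target set types and record keys rather than the index statements shown here.

```python
# Step 1: a physical design stated as general access-path properties,
# independent of any particular database system.
logical_paths = [
    {"from": "dept", "to": "employee", "property": "clustered"},
    {"from": "employee", "to": "employee", "property": "direct-key"},
]

# Step 2: one realization per target system; this mapping targets a
# relational system purely for illustration.
RELATIONAL = {
    "clustered": "CREATE CLUSTERED INDEX",
    "direct-key": "CREATE UNIQUE INDEX",
}

def realize(paths, mapping):
    """Map the neutral specification into one system's storage structures."""
    return [f'{mapping[p["property"]]} ON {p["to"]}' for p in paths]

for stmt in realize(logical_paths, RELATIONAL):
    print(stmt)
```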

Journal ArticleDOI
TL;DR: An overview of the objectives, capabilities, status, and availability of the Consistent System is given.
Abstract: The Consistent System (CS) is an interactive computer system for researchers in the behavioral and policy sciences and in fields with similar requirements for data management and statistical analysis. The researcher is not expected to be a programmer. The system offers a wide range of facilities and permits the user to combine them in novel ways. In particular, tools for statistical analysis may be used in combination with a powerful relational subsystem for data base management. This paper gives an overview of the objectives, capabilities, status, and availability of the system.


Journal Article
TL;DR: Several levels of automation in photogrammetry are described, ranging from computer control of comparators and analog stereoplotters, through analytical stereoplotters, to orthophoto printers and fully automated correlation equipment.
Abstract: All functional aspects of photogrammetry, i.e., triangulation/mensuration, elevation data extraction, planimetric data extraction, and rectification and orthorectification, are now being automated to some degree. Automation extends from computer control of comparators and analog stereoplotters, through analytical stereoplotters, to orthophoto printers and fully automated correlation equipment. The drive towards automation has been triggered not only by the continuing necessity to reduce costs, but also by the need to generate new products (e.g., DTM, land use, etc.) and to utilize other than conventional mapping photography. These needs have led, in turn, to the necessity for data editing and data management systems. The trend in the future will be towards the ever-increasing use of digital image processing technology. Examples of several levels of automation in photogrammetry are described.


Journal ArticleDOI
TL;DR: The system described is an in-house, on-line interactive system capable of being used by most agency personnel after minimal training; the basic concepts associated with a microcomputer-based information system are also presented.
Abstract: Microcomputer technology is reaching the point where it can be used on a cost-effective basis by a small human service agency and even the private practitioner. This article describes a microcomputer-based information system which addresses basic data management needs. It presents the basic concepts associated with a microcomputer-based information system, the choices which must be made in developing a system, and the limitations of the system. The system described is an in-house, on-line interactive system capable of being used by most agency personnel after minimal training.


Patent
10 Apr 1980
TL;DR: The proposed system reliably generates all sales management data by generating it in the input processing unit, on the basis of untransmitted sales data held in the external storage device, when the transmission line between the input processing unit and the center machine is faulty.
Abstract: PURPOSE: To reliably generate all sales management data by generating it in the input processing unit, on the basis of untransmitted sales data held in the external storage device, when the transmission line between the input processing unit and a center machine is faulty. CONSTITUTION: Sales data is entered and processed in input processing unit 1 and transmitted to center machine 2, where it is processed according to prescribed data management rules to generate sales management data. External storage device 4 is mounted on unit 1; when sales data cannot be transmitted to machine 2, it is stored in device 4, and the untransmitted sales data is read from device 4 and transmitted to machine 2 once transmission line 3 is restored. If line 3 is not restored, sales management data is generated in unit 1 on the basis of the untransmitted sales data in device 4. As a result, machine 2 can be used effectively, and sales management data can be generated reliably even if an anomaly occurs in transmission line 3 between unit 1 and machine 2.
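
The store-and-forward fallback the patent describes can be sketched as follows. All class and method names are invented, and the external storage device is reduced to an in-memory buffer.

```python
class Line:
    """Transmission line between the input unit and the center machine."""
    def __init__(self):
        self.up = True
        self.received = []   # what the center machine has seen

    def send(self, sale):
        self.received.append(sale)

class InputUnit:
    def __init__(self, line):
        self.line = line
        self.buffer = []     # stands in for external storage device 4

    def record_sale(self, sale):
        if self.line.up:
            self.flush()               # retransmit anything buffered first
            self.line.send(sale)
        else:
            self.buffer.append(sale)   # line faulty: store, don't lose

    def flush(self):
        while self.buffer and self.line.up:
            self.line.send(self.buffer.pop(0))

    def local_report(self):
        # Line still down: build management data from buffered sales.
        return sum(s["amount"] for s in self.buffer)

line = Line()
pos = InputUnit(line)
pos.record_sale({"amount": 5})
line.up = False
pos.record_sale({"amount": 7})     # buffered instead of lost
print(pos.local_report())          # report generated locally -> 7
line.up = True
pos.record_sale({"amount": 2})     # buffered data retransmitted, then new sale
print(line.received)               # center machine saw all three sales
```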

01 Dec 1980
TL;DR: The Data from Aeromechanics Test and Analytics - Management and Analysis Package (DATAMAP) was designed and programmed as a computer software tool for data management and processing of large, time-based data bases with particular attention to rotorcraft-related analyses.
Abstract: The Data from Aeromechanics Test and Analytics - Management and Analysis Package (DATAMAP) was designed and programmed as a computer software tool for data management and processing of large, time-based data bases. Particular attention is given to rotorcraft-related analyses. The package will process data stored in two basic formats. The first format is that used for the Operational Loads Survey (OLS) test data base and anticipated for use in planned flight test programs. The second format is more general; it accommodates various data structures common to analytical data bases. This particular input capability is demonstrated by an interface between the Rotorcraft Flight Simulation Program (C81) and DATAMAP. The package transfers selected data to a large, direct access disc file and maintains the data on a semi-permanent basis. Data are retrieved from this file, processed, and displayed interactively or in batch. Plot output is generated on a Tektronix 4014 or an incremental plotter (e.g., Calcomp). A small sample of available processing options includes amplitude spectrum, harmonic analysis, digital filtering, auto-spectral density, frequency response function, acoustic analyses, and blade static pressure and normal force coefficient derivations. This program will accommodate data from multiple sensors simultaneously for processing of functions with two geometric independent variables (e.g., chord and radius). The package is written entirely in FORTRAN. Package specifications require nonstandard FORTRAN coding to be used, but the package has been made as easily transportable as possible.
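
As one example of the processing options listed, harmonic analysis of a time-based signal can be sketched generically: compute the amplitude of the first few harmonics of a once-per-revolution signal. The sketch below uses NumPy's FFT on a synthetic signal and illustrates the operation only; it is not DATAMAP's FORTRAN implementation.

```python
import numpy as np

n, revs = 256, 8                           # samples and rotor revolutions
t = np.linspace(0, revs, n, endpoint=False)
# Synthetic signal: amplitude 2.0 at 1/rev plus 0.5 at 3/rev.
signal = 2.0 * np.sin(2 * np.pi * t) + 0.5 * np.sin(2 * np.pi * 3 * t)

spectrum = np.fft.rfft(signal) / (n / 2)   # scaled so peaks read as amplitudes
per_rev = np.fft.rfftfreq(n, d=revs / n)   # frequency axis in cycles per rev
for h in (1, 2, 3):
    amp = abs(spectrum[np.argmin(abs(per_rev - h))])
    print(f"{h}/rev amplitude ~ {amp:.2f}")   # ~2.00, ~0.00, ~0.50
```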

01 Sep 1980
TL;DR: An overview of the capabilities and software architecture of the IPAD information processor (IPIP), a state-of-the-art data base management system that satisfies engineering requirements not addressed by present-day commercial systems, is presented.
Abstract: An overview of the capabilities and software architecture of the IPAD information processor (IPIP) is presented. IPIP is a state-of-the-art data base management system that satisfies engineering requirements not addressed by present-day commercial systems. It also significantly advances a number of capabilities that are offered commercially. IPIP capabilities range from support for multiple schemas and data models to support for distributed processing, configuration control, and data inventory management. IPIP exploits semantic commonality in features offered in various forms at different user interfaces in today's commercial systems. An integrated software architecture supports all user interfaces: programming languages, interactive data manipulation, and schema languages. This approach promotes simplicity and compactness in software and permits features to be offered symmetrically across all appropriate user interfaces.
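
The "semantic commonality" point (one shared operation surfaced symmetrically through several user interfaces) can be pictured with a toy sketch. Every interface and name below is invented for illustration; IPIP's actual languages and interfaces are not shown.

```python
import ast

def _core_insert(store, relation, row):
    """The single shared kernel operation behind every interface."""
    store.setdefault(relation, []).append(row)

def api_insert(store, relation, **fields):
    """Programming-language interface."""
    _core_insert(store, relation, fields)

def interactive_insert(store, command):
    """Interactive interface; demo-only parsing of a literal dict."""
    relation, payload = command.split(" ", 1)
    _core_insert(store, relation, ast.literal_eval(payload))

store = {}
api_insert(store, "parts", no=1, name="strut")
interactive_insert(store, "parts {'no': 2, 'name': 'rib'}")
print(store)  # both interfaces produce identical effects via one kernel op
```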