
Showing papers on "Data warehouse published in 1990"


Patent
22 Oct 1990
TL;DR: In this paper, a database management system is presented that has a dictionary for maintaining a list of the sets of data associated with each of a plurality of user application programs; database access time is decreased by forming a data model of the logical relationships in that list, generating static access modules, creating a secondary data model in a pre-built machine-executable format and storing it, and, in the execution stage, accessing the secondary data model and manipulating data values through it.
Abstract: A database management system having a dictionary for maintaining a list of sets of data associated with each of a plurality of user application programs. Database access time is decreased by forming a data model of the logical relationship of said list, generating static access modules, creating a secondary data model in a pre-built machine-executable format and storing that model, and, in the execution stage, accessing the secondary data model and manipulating data values using it.
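
The core idea, building the access path once ahead of time and only executing it thereafter, can be illustrated with a minimal sketch. Everything below (the record layout, the builder function) is hypothetical, not the patent's actual mechanism:

```python
# Minimal sketch of the patent's idea: compile an access path once into a
# static "access module", then reuse it at execution time instead of
# re-deriving the plan on every call. All names here are hypothetical.

RECORDS = [
    {"dept": "sales", "name": "Alice", "salary": 50000},
    {"dept": "eng", "name": "Bob", "salary": 60000},
]

def build_access_module(field, value):
    """Analogous to the 'secondary data model in a pre-built executable
    format': the predicate is fixed ahead of time, so execution only
    evaluates it, with no per-query parsing or planning."""
    def access(records):
        return [r for r in records if r[field] == value]
    return access

# Built once (the "static access module")...
find_eng = build_access_module("dept", "eng")

# ...and invoked repeatedly in the execution stage.
print(find_eng(RECORDS))  # [{'dept': 'eng', 'name': 'Bob', 'salary': 60000}]
```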

8 citations


Proceedings ArticleDOI
01 Aug 1990
TL;DR: There is a need for a data (base) model that possesses elements of a modern database management system but is oriented toward scientific data sets and applications; it must be easy to use, support large disk-based data sets, and accommodate scientific data structures.
Abstract: The common data format (CDF) is described in terms of the applications it supports for database management in visualization systems. The CDF is a self-describing data abstraction technique for the storage and manipulation of multidimensional data based on block structures. The discipline-independent approach is designed to manage, manipulate, archive, display, and analyze data, and can be applied to heterogeneous equipment communicating different data structures over networks. An improved CDF version incorporates hyperplane access, allowing random aggregate access to subdimensional blocks within a multidimensional variable. The visualization pipeline, which controls the flow of data and permits the visualization of different classes of data representation techniques, is also discussed. The system is found to accommodate a large variety of scientific data structures and large disk-based data sets.
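
Hyperplane access, reading one subdimensional block of a multidimensional variable in a single aggregate operation, can be sketched as follows; this illustrates the access pattern only and is not the CDF library's actual API:

```python
# Sketch of CDF-style "hyperplane access": reading one subdimensional
# block out of a multidimensional variable without touching the rest.

def hyperplane(data, axis, index):
    """Return the (n-1)-dimensional slab of a nested-list 2-D array
    `data` at position `index` along `axis` (0 or 1)."""
    if axis == 0:
        return data[index]                 # one whole row
    return [row[index] for row in data]    # one whole column

# A 3x4 "variable": 3 records, 4 values each.
var = [[10, 11, 12, 13],
       [20, 21, 22, 23],
       [30, 31, 32, 33]]

print(hyperplane(var, axis=0, index=1))  # [20, 21, 22, 23]
print(hyperplane(var, axis=1, index=2))  # [12, 22, 32]
```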

5 citations



Patent
14 Nov 1990
TL;DR: In this article, the authors propose to eliminate useless conversion processing in applications where no conversion of codes is desirable, by preparing defining information that indicates which node system (the node's own or another's) should control the data, and performing only the necessary code conversion by reference to that defining information.
Abstract: PURPOSE: To eliminate useless conversion processing in applications where no conversion of codes is desirable, by preparing defining information that indicates which node system (the node's own or another's) controls the data, and performing only the necessary code conversion by reference to that defining information. CONSTITUTION: A data base 21 stores data belonging to its own node system or to another node system. Defining information, defined in advance by the user, is registered in a defining-information memory 2b in correspondence with the data stored in the base 21. The necessary code conversion is then carried out, by reference to the registered information, whenever data are read, written, transferred, or received. In this way, useless conversion processing is eliminated in applications where code conversion is undesirable, such as when a host is used as a data warehouse in a microcomputer-to-mainframe connection system.
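
A minimal sketch of the scheme, under the assumption that the "codes" in question are character encodings such as the host's EBCDIC versus a workstation's ASCII; the table and function names are hypothetical:

```python
# The defining information records which code system owns each data set,
# so conversion runs only when source and destination actually differ.
# All names here are hypothetical.

DEFINING_INFO = {
    "orders": "cp500",   # data controlled by the host (EBCDIC) node
    "notes": "ascii",    # data controlled by another (ASCII) node
}

def read_data(name, raw: bytes, local_code: str = "ascii") -> str:
    stored_code = DEFINING_INFO[name]
    if stored_code == local_code:
        # Codes already match: the useless conversion step is skipped.
        return raw.decode(local_code)
    # Codes differ: perform the one necessary conversion.
    return raw.decode(stored_code)

ebcdic_bytes = "HELLO".encode("cp500")
print(read_data("orders", ebcdic_bytes))        # HELLO (converted)
print(read_data("notes", b"plain ascii text"))  # no conversion needed
```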

3 citations


Journal ArticleDOI
TL;DR: The hypothesis is that mass storage data management can be implemented successfully only on the basis of highly “intelligent” metadata management services: services that would allow database administrators and users to manipulate, update, and access data and related information and knowledge in a logical manner, yet would be powerful enough to support the performance needs of a large mass store system.
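
One way to picture such a service layer is a catalog that lets users address data sets logically while the system resolves physical placement; the sketch below is a hypothetical illustration of that idea, not the paper's design:

```python
# Hypothetical metadata catalog for a mass store: users name data sets
# logically; the service resolves physical placement and attributes.

CATALOG = {
    "climate/run42": {
        "volume": "tape-0173", "offset": 5_242_880,
        "format": "ieee-float32", "owner": "dba",
    },
}

def locate(logical_name):
    """Resolve a logical data-set name to its physical description."""
    meta = CATALOG[logical_name]
    return meta["volume"], meta["offset"]

def retag(logical_name, **attrs):
    """Metadata updates touch only the catalog, never the mass store."""
    CATALOG[logical_name].update(attrs)

print(locate("climate/run42"))       # ('tape-0173', 5242880)
retag("climate/run42", owner="ops")  # logical update, no data movement
```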

3 citations


01 Mar 1990
TL;DR: This document provides general information about system development requirements, describes the data elements required by OADSS in detail, and discusses data collection responsibilities; a data modeling approach was used to detail the logical organization of OADSS data elements in normalized form.
Abstract: This data requirements analysis was completed as part of the Life Cycle Management (LCM) process for development of an Officer Assignment Decision Support System (OADSS). This document provides general information about system development requirements, describes the data elements required by OADSS in detail, and discusses data collection responsibilities. A data modeling approach was used to detail the logical organization of OADSS data elements in a normalized (i.e., non-redundant) form. This process was carried out by applying an entity-level, or top-down, approach and included identifying data entities, their keys, and the other data elements they describe. A complete data model diagram for the OADSS data bases is provided, along with descriptions of the 19 major data entities. Data elements are identified as static system data, dynamic input data, dynamic output data, or internally generated data. The organizational, operational, and developmental impact of OADSS data base collection, maintenance, and utilization is also discussed. It is recommended that a Project Management Plan (PMP) be completed as the next phase in the development of OADSS. Keywords: Officer assignment, Decision support system, Automated information, Military personnel, Marine Corps.
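
The entity-level modeling step, giving each entity a key plus the non-key elements it describes so that no fact is stored twice, can be sketched as below. The entities and fields are invented for illustration and are not taken from the OADSS model:

```python
# Illustrative entity-level data model in the style the abstract
# describes: entities, their keys, and the elements they describe.
from dataclasses import dataclass

@dataclass(frozen=True)
class Officer:            # key: ssn
    ssn: str
    name: str
    grade: str

@dataclass(frozen=True)
class Billet:             # key: billet_id
    billet_id: str
    title: str
    location: str

@dataclass(frozen=True)
class Assignment:         # key: (ssn, billet_id); links the two entities
    ssn: str
    billet_id: str
    report_date: str

# Normalization means officer attributes live only on Officer; an
# Assignment row carries keys, not repeated names or grades.
a = Assignment(ssn="123-45-6789", billet_id="B-07", report_date="1990-06-01")
```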

1 citation


01 Sep 1990
TL;DR: The operations of such a DBMS are described, its general design is discussed, and the detailed design and implementation of the retrieval operation are outlined.
Abstract: Current conventional Database Management Systems (DBMS) manage only alphanumeric data. However, data to be stored in the future is expected to include some multimedia form, such as images, graphics, sounds or signals. The structure and the semantics of media data, and the operations on that data, are complex. It is not clear what requirements a DBMS must meet to manage this kind of data, what is needed in the data model to support it, or what the user interface for such a system should be. The goal of the Multimedia Database Management System project in the computer science department of the Naval Postgraduate School is to build into a Database Management System (DBMS) the capability to manage multimedia data as well as formatted data, and to define operations on multimedia data. This thesis, focusing only on the media data of image and sound, first describes the operations of such a system, then discusses its general design, and finally outlines the detailed design and implementation of the retrieval operation.
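
One plausible shape for such a retrieval operation, offered only as a hypothetical sketch rather than the thesis's design, is to select on the formatted attributes first and then fetch the associated media values by key:

```python
# Hypothetical split: formatted (alphanumeric) attributes live in
# ordinary relations, while media values are opaque blobs fetched by key.

FORMATTED = {  # relation: media_id -> descriptive attributes
    "img-001": {"kind": "image", "subject": "harbor", "width": 640},
}
MEDIA_STORE = {  # blob store: media_id -> raw bytes
    "img-001": b"\x89PNG...",
}

def retrieve(predicate):
    """Retrieval: select on formatted attributes, then fetch the media."""
    return [
        (mid, attrs, MEDIA_STORE[mid])
        for mid, attrs in FORMATTED.items()
        if predicate(attrs)
    ]

hits = retrieve(lambda a: a["kind"] == "image" and a["subject"] == "harbor")
print([mid for mid, _, _ in hits])  # ['img-001']
```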

1 citation


Journal ArticleDOI
TL;DR: This paper discusses some of the problems in deriving suitable data models and structures because of the diverse sources and applications of the data and puts forward some pragmatic solutions.
Abstract: The term ‘data model’ is used to describe the conceptual view of how data which purports to model reality is arranged in a computer system. A ‘data structure’ is the logical view, and a ‘file structure’ is the actual physical arrangement of the data. Spatial data, as used in geographic information systems, gives rise to particular problems in deriving suitable data models and structures because of the diverse sources and applications of the data. This paper discusses some of these problems and puts forward some pragmatic solutions.
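
The paper's three-level distinction can be made concrete with a small hypothetical example: the data model says "a road is a named line feature", the data structure realizes it as a coordinate list with attributes, and the file structure is whichever byte layout the data is actually serialized into:

```python
# Hypothetical illustration of the three levels for spatial data.
# Data model (conceptual): "a road is a named line feature".
# Data structure (logical): a list of (x, y) vertices plus attributes.
# File structure (physical): one serialized byte layout among many.
import json, struct

road = {"name": "A40", "vertices": [(0.0, 0.0), (1.5, 2.0), (3.0, 2.5)]}

# One possible file structure: a compact binary record...
binary = struct.pack("<I", len(road["vertices"]))
for x, y in road["vertices"]:
    binary += struct.pack("<dd", x, y)

# ...or an entirely different one for the same data structure.
text = json.dumps(road)

print(len(binary), "bytes binary;", len(text), "bytes text")
```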