
Showing papers on "Data warehouse" published in 1991


Patent
Tadashi Tenma, Kichizo Akashi, Tetsuo Kusuzaki, Mitsuo Sudo, Takayuki Ishii
28 Oct 1991
TL;DR: A data management system for shop management, such as profit management, must collect and analyze a large volume of diverse data; the proposed system places purpose-specific databases and an analysis database alongside the central database, so that analysis results suited to each purpose are easily obtained and the system configuration is simplified.
Abstract: A data management system used for shop management, such as profit management of a shop, must collect and analyze a large amount of various kinds of data items. In such a system, in addition to a database storing all of the data, a database is provided for each data-utilization purpose together with a database for data analysis, so that processing is database-oriented. This makes it easy to obtain an analysis result suited to each purpose and simplifies the system configuration.
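
The layout described here, one central store plus a separate database per utilization purpose, can be sketched roughly as below. The table names, columns, and the use of SQLite are illustrative assumptions for the sketch, not part of the patent.

```python
import sqlite3

# Central database holding all collected data (names are illustrative).
central = sqlite3.connect("central.db")
central.execute("""
    CREATE TABLE IF NOT EXISTS sales (
        item_id    TEXT,
        shop_id    TEXT,
        sold_at    TEXT,
        unit_price REAL,
        quantity   INTEGER,
        unit_cost  REAL
    )
""")

def refresh_profit_db(central_conn, path="profit_analysis.db"):
    """Rebuild one purpose-specific database (here: profit analysis)
    from the central store, mirroring the database-per-purpose layout."""
    analysis = sqlite3.connect(path)
    analysis.execute("DROP TABLE IF EXISTS profit_by_item")
    analysis.execute("""
        CREATE TABLE profit_by_item (
            item_id TEXT PRIMARY KEY,
            revenue REAL,
            cost    REAL,
            profit  REAL
        )
    """)
    rows = central_conn.execute("""
        SELECT item_id,
               SUM(unit_price * quantity),
               SUM(unit_cost * quantity)
        FROM sales
        GROUP BY item_id
    """)
    for item_id, revenue, cost in rows:
        analysis.execute(
            "INSERT INTO profit_by_item VALUES (?, ?, ?, ?)",
            (item_id, revenue, cost, revenue - cost),
        )
    analysis.commit()
    return analysis
```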

58 citations


Book ChapterDOI
Laura M. Haas, William F. Cody
28 Aug 1991
TL;DR: This paper shows how EDBMS technology can support GIS applications through powerful data modeling and data management functions that are analogous to, and integrated with, those employed for managing traditional data.
Abstract: A major focus of recent database research has been extensible database management systems (EDBMS). The goal of these systems is to accommodate the increasing amount of digitized data that differs in structure, size, and processing needs from traditional transaction data. Geographic Information Systems (GIS), for example, have developed effective and specialized data representation and analysis techniques for spatial data that must be supported by any underlying DBMS. In this paper we show how EDBMS technology can support GIS applications through powerful data modeling and data management functions that are analogous to, and integrated with, those employed for managing traditional data.
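
As a rough illustration of the extensibility idea (not code from the paper), the sketch below treats a spatial attribute and a user-defined bounding-box predicate as query elements usable alongside ordinary attribute filters; the Parcel type and its field names are invented for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Parcel:
    """A 'traditional' record extended with a spatial attribute."""
    owner: str
    assessed_value: float
    boundary: List[Tuple[float, float]]  # polygon vertices (x, y)

def bounding_box(poly):
    xs = [x for x, _ in poly]
    ys = [y for _, y in poly]
    return min(xs), min(ys), max(xs), max(ys)

def overlaps(poly, query_box):
    """A user-defined spatial predicate, analogous to the functions an
    EDBMS would let applications register and use in queries."""
    minx, miny, maxx, maxy = bounding_box(poly)
    qminx, qminy, qmaxx, qmaxy = query_box
    return not (maxx < qminx or minx > qmaxx or maxy < qminy or miny > qmaxy)

# A query mixing the spatial predicate with a traditional attribute filter.
parcels = [
    Parcel("Abe", 120000.0, [(0, 0), (2, 0), (2, 2), (0, 2)]),
    Parcel("Bea", 95000.0, [(5, 5), (7, 5), (7, 8), (5, 8)]),
]
hits = [p for p in parcels
        if overlaps(p.boundary, (1, 1, 3, 3)) and p.assessed_value > 100000]
```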

45 citations


01 Nov 1991
TL;DR: A browse capability for space and Earth science data is needed so that scientists can check the appropriateness and quality of particular data sets before obtaining the full data set(s) for detailed analysis.
Abstract: Soon after space and Earth science data is collected, it is stored in one or more archival facilities for later retrieval and analysis. Since the purpose of the archival process is to keep an accurate and complete record of the data, any data compression used in an archival system must be lossless and must protect against propagation of errors in the storage media. A browse capability for space and Earth science data is needed to enable scientists to check the appropriateness and quality of particular data sets before obtaining the full data set(s) for detailed analysis. Browse data produced for these purposes could be used to facilitate the retrieval of data from an archival facility. Quick-look data is data obtained directly from the sensor, either for previewing the data or for an application that requires very timely analysis of the space or Earth science data. The two main differences between the data compression techniques appropriate to the browse and quick-look cases are that quick-look compression can be more specifically tailored, and that it must be limited in complexity by the relatively limited computational power available on space platforms.
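
A minimal sketch of the archival requirement stated above: lossless compression applied per block, with a checksum on each block so that a storage error is detected and confined rather than propagated. The use of zlib and the block size are assumptions made for the example, not choices from the report.

```python
import struct
import zlib

BLOCK_SIZE = 64 * 1024  # illustrative block size

def archive_blocks(data: bytes):
    """Compress data losslessly in independent blocks; each block carries
    its own CRC so corruption is detected and confined to that block."""
    blocks = []
    for i in range(0, len(data), BLOCK_SIZE):
        chunk = data[i:i + BLOCK_SIZE]
        comp = zlib.compress(chunk, level=9)
        blocks.append(struct.pack(">II", len(comp), zlib.crc32(chunk)) + comp)
    return blocks

def restore_blocks(blocks):
    out = []
    for blk in blocks:
        length, crc = struct.unpack(">II", blk[:8])
        chunk = zlib.decompress(blk[8:8 + length])
        if zlib.crc32(chunk) != crc:
            raise ValueError("block corrupted; damage limited to this block")
        out.append(chunk)
    return b"".join(out)
```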

3 citations


Journal Article
TL;DR: This paper describes strategies and tactics being employed to successfully manage three critical data base management functions in US commercial E and P and downstream data bases: integrity, integration, and interface.
Abstract: Improved cost effectiveness and enhanced productivity are the primary benefits to be gained from the management and application of large petroleum data bases during the 1990s. Achievement of these benefits depends on three critical data base management functions: integrity, integration, and interface. This paper describes strategies and tactics that are being employed to successfully manage these functions in US commercial E and P and downstream data bases. Data base integrity is the foundation for successful data base utilization, and rising user expectations demand high data base quality standards. Key activities include (1) assigning unique identifiers and standard data codes, (2) verifying accurate identification and location, and (3) compiling complete and accurate technical data. Integration is the hot data base topic for the 1990s, and both data and software must be addressed to realize cost-effective systems. This function's activities include (1) assigning common standard codes to data bases, (2) integrating physical data in master data bases, (3) capturing and indexing graphics files through scanned-image technology, and (4) developing data base connectivity and standard data models. Establishing a satisfactory interface between data bases and users is required to achieve productivity. User interface with data is facilitated with GIS spatial data management systems, and vendors are cooperating to interface data with workstations and applications. Potential benefits of high-quality, integrated data bases and a GIS user interface are illustrated for case histories in the Austin Chalk and Texas Gulf Coast.
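
The integrity activities listed above (unique identifiers, standard codes, verified locations) can be sketched as a simple ingest check; the field names, the code table, and the UUID-based identifier below are hypothetical stand-ins for the example, not the paper's actual scheme.

```python
import uuid

# Illustrative standard-code table; a real E and P data base would use
# industry code sets (e.g. API well numbers), not this stand-in.
VALID_STATE_CODES = {"TX", "LA", "OK", "NM"}

def ingest_well_record(record: dict, master: dict) -> str:
    """Assign a unique identifier and enforce basic integrity checks
    before a record enters the master data base."""
    if record.get("state") not in VALID_STATE_CODES:
        raise ValueError(f"non-standard state code: {record.get('state')!r}")
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is None or lon is None or not (-90 <= lat <= 90 and -180 <= lon <= 180):
        raise ValueError("missing or implausible location")
    well_id = str(uuid.uuid4())          # unique identifier
    master[well_id] = dict(record)       # integrate into the master store
    return well_id

master_db = {}
well_id = ingest_well_record(
    {"state": "TX", "latitude": 30.1, "longitude": -97.7, "operator": "Acme"},
    master_db,
)
```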

1 citation