
Showing papers on "Data warehouse published in 1993"


Book
01 Jan 1993
TL;DR: Techniques necessary to use DEC's Rdb technology to face the challenge of the '90s are described--how to store data to meet an organization's decision support needs and provide users with a common, simple, transparent access to information regardless of where it resides.
Abstract: From the Publisher: Describes techniques necessary to use DEC's Rdb technology to face the challenge of the '90s--how to store data to meet an organization's decision support needs and provide users with common, simple, transparent access to information regardless of where it resides. Contains detailed discussions and analysis of major issues, including why a split between operational and informational databases is necessary and how to accomplish it, the lack of data credibility, the integration of DSS data, and how Rdb technology fits with DSS needs.
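
The book's central argument, the split between operational and informational databases, can be illustrated with a small sketch. The following is a minimal example of the general pattern, not the book's Rdb-specific method; the `orders` table and `sales_summary` schema are hypothetical.

```python
# Minimal sketch of the operational/informational split: periodically
# extract from the operational store, integrate, and load into a separate
# decision-support store. Table and column names are hypothetical.
import sqlite3

def refresh_informational(op_db: str, info_db: str) -> None:
    src = sqlite3.connect(op_db)
    dst = sqlite3.connect(info_db)
    dst.execute(
        """CREATE TABLE IF NOT EXISTS sales_summary (
               day TEXT, region TEXT, total REAL,
               PRIMARY KEY (day, region))"""
    )
    # Integrate: summarize transaction-level operational data into the
    # subject-oriented form that decision-support queries need.
    rows = src.execute(
        """SELECT date(ts), region, SUM(amount)
           FROM orders GROUP BY date(ts), region"""
    ).fetchall()
    dst.executemany(
        "INSERT OR REPLACE INTO sales_summary VALUES (?, ?, ?)", rows
    )
    dst.commit()
```

The point of the split is that decision-support queries run against the integrated summary store without contending with operational transactions.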

77 citations


01 Apr 1993

41 citations


Proceedings ArticleDOI
01 Jun 1993
TL;DR: This paper discusses a new approach to database management systems that is better suited to a wide class of new applications, such as scientific, hypermedia, and financial applications; these applications require that raw data be mapped in complex ways to an evolving schema.
Abstract: This paper discusses a new approach to database management systems that is better suited to a wide class of new applications such as scientific, hypermedia, and financial applications. These applications are characterized by their need to store large amounts of raw, unstructured data. Our premise is that, in these situations, database systems need a way to store data without imposing a schema, and a way to provide a schema incrementally as we process the data. This requires that the raw data be mapped in complex ways to an evolving schema.
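
A toy sketch of the store-raw-now, add-schema-later idea described above; this illustrates the general technique only, not the paper's system, and all names are made up.

```python
# Raw records are stored untyped; a schema is supplied incrementally as
# mapping functions applied at read time rather than imposed on write.
import json
from typing import Callable, Iterator

class RawStore:
    def __init__(self) -> None:
        self._lines: list[str] = []

    def append(self, raw: dict) -> None:
        self._lines.append(json.dumps(raw))   # no schema imposed on write

    def scan(self, mapper: Callable[[dict], dict]) -> Iterator[dict]:
        for line in self._lines:
            yield mapper(json.loads(line))    # schema applied on read

store = RawStore()
store.append({"t": "1993-06-01", "v": "42.5"})
# Later, once a schema is known, map the raw fields to typed columns.
typed = list(store.scan(lambda r: {"date": r["t"], "value": float(r["v"])}))
```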

27 citations


Book ChapterDOI
26 Oct 1993
TL;DR: This paper presents a scientific data abstraction at the conceptual level: a schema model for scientific data that allows scientific data to be stored and manipulated in a uniform way, independent of the implementation data model, and that provides an operational definition for metadata.
Abstract: The absence of a uniform and comprehensive representation for complex scientific data makes the adaptation of database technology to multidisciplinary research projects difficult. In this paper, we clarify the taxonomy of data representations required for scientific database systems. Then, based on our proposed scientific database environment, we present a scientific data abstraction at the conceptual level, a schema model for scientific data. This schema model allows us to store and manipulate scientific data in a uniform way independent of the implementation data model. We believe that more information has to be maintained as metadata for scientific data analysis than in statistical and commercial databases. Clearly, metadata constitutes an important part of our schema model. As part of the schema model, we provide an operational definition for metadata. This definition enables us to focus on the complex relationship between data and metadata.
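
As a rough illustration of treating metadata as part of the schema itself, the sketch below attaches descriptive metadata (units, provenance) to each attribute of a relation. The class and field names are assumptions, not the paper's schema model.

```python
# Each attribute carries descriptive metadata alongside its type, so the
# schema itself records what scientific analysis needs to know.
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    dtype: type
    metadata: dict = field(default_factory=dict)  # e.g. units, instrument

@dataclass
class ScientificRelation:
    attributes: list[Attribute]
    tuples: list[tuple]

temperature = Attribute("temp", float, {"units": "K", "sensor": "TIROS-N"})
pressure = Attribute("pres", float, {"units": "hPa"})
rel = ScientificRelation([temperature, pressure], [(287.4, 1013.2)])
```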

15 citations


01 Jan 1993
TL;DR: In the 1990 world that the TPC-D development effort was born into, decision support was a minor-league game; the knowledge base, the experience level, and the interest in decision support, both in and out of the TPC Council, were in their infancy.
Abstract: In the 1990 world that the TPC-D development effort was born into, decision support was a minor league game. The term “data warehouse” had not yet been coined, and many of the parallel technologies taking aim at decision support were still on the drawing board. Global competition had not yet forced corporations into serious self-examination, nor had it taught them to respect and cherish their data. So the knowledge base, the experience level, and the interest in decision support, both in and out of the TPC Council, were in their infancy.

14 citations


Proceedings ArticleDOI
01 Jan 1993
TL;DR: The emerging Standard for The Exchange of Product Model Data (STEP), being developed in the International Organization for Standardization (ISO), addresses this need by providing information models, called application protocols, which clearly and unambiguously describe data.
Abstract: The problem of sharing data has many facets. The need to share data across multiple enterprises, different hardware platforms, different data storage paradigms and systems, and a variety of network architectures is growing. The emerging Standard for The Exchange of Product Model Data (STEP), being developed in the International Organization for Standardization (ISO), addresses this need by providing information models, called application protocols, which clearly and unambiguously describe data. The validity of these information models is essential for success in sharing data in a highly automated engineering environment.
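
STEP application protocols are written in the EXPRESS modeling language; purely as a loose analogy, the sketch below shows a declarative entity definition checked against candidate data. It is not EXPRESS and not an actual STEP application protocol.

```python
# A declarative information model: entity definitions describe data
# unambiguously, and records are validated against them. The schema
# contents here are illustrative only.
SCHEMA = {"point": {"x": float, "y": float, "z": float}}

def conforms(entity: str, record: dict) -> bool:
    spec = SCHEMA[entity]
    return set(record) == set(spec) and all(
        isinstance(record[k], t) for k, t in spec.items()
    )

assert conforms("point", {"x": 1.0, "y": 2.0, "z": 0.0})
```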

6 citations


Proceedings ArticleDOI
22 Oct 1993
TL;DR: This paper defines a data schema for picture painting so that the client can store, retrieve, and process the data in the database efficiently; the schema is based on an extended relational data model with a hierarchical structure that follows the painting process.
Abstract: In the service process of a software system, a client's intention is satisfied through cooperative communication between the client and the server. To describe the specification of the system, this communication must be represented explicitly as an interaction; this is called an interactive specification. In the picture painting process, the cooperative communication between client and server takes place through queries over the picture data in the database. Picture data carries much more information than numerical or character data. This paper defines a data schema for picture painting so that the client can store, retrieve, and process the data in the database efficiently. The schema is based on an extended relational data model with a hierarchical structure that follows the painting process. Two types of schema are described in this paper: the denotation schema of the interactive specification, and the data schema for the picture painting process. Both follow the same descriptive approach.
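
As a loose sketch only (the paper's actual schema is not reproduced here), a hierarchical, extended-relational layout for picture data might have each level reference its parent, mirroring the stages of the painting process; the table names below are hypothetical.

```python
# Hierarchical relational layout: picture -> layer -> stroke, each level
# keyed to its parent, so queries can follow the painting process.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE picture (pic_id    INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE layer   (layer_id  INTEGER PRIMARY KEY,
                      pic_id    INTEGER REFERENCES picture(pic_id));
CREATE TABLE stroke  (stroke_id INTEGER PRIMARY KEY,
                      layer_id  INTEGER REFERENCES layer(layer_id),
                      points    BLOB);
""")
```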

3 citations


Proceedings Article
01 Jan 1993
TL;DR: MR-CDF is a system for managing multi-resolution scientific data sets, an extension of the popular CDF (Common Data Format) system that provides a simple functional interface to client programs for storage and retrieval of data.
Abstract: MR-CDF is a system for managing multi-resolution scientific data sets. It is an extension of the popular CDF (Common Data Format) system. MR-CDF provides a simple functional interface to client programs for storage and retrieval of data. Data is stored so that low resolution versions of the data can be provided quickly. Higher resolutions are also available, but not as quickly. By managing data with MR-CDF, an application can be relieved of the low-level details of data management, and can easily trade data resolution for improved access time.
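
A toy multi-resolution store in the spirit of the description above; this is not MR-CDF's actual interface, only a sketch of how precomputed coarse versions let a caller trade resolution for access time.

```python
# Coarse versions of an array are precomputed so low-resolution reads are
# cheap; callers choose a level to trade resolution for access time.
import numpy as np

class MultiResStore:
    def __init__(self, data: np.ndarray, levels: int = 3) -> None:
        self.pyramid = [data]
        for _ in range(levels - 1):
            data = data[::2]              # keep every other sample
            self.pyramid.append(data)

    def read(self, level: int) -> np.ndarray:
        """level 0 = full resolution; higher = coarser but faster."""
        return self.pyramid[min(level, len(self.pyramid) - 1)]

store = MultiResStore(np.arange(1_000_000, dtype=np.float32))
preview = store.read(2)   # quick, coarse version
full = store.read(0)      # full resolution when needed
```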

2 citations


01 Mar 1993
TL;DR: A standard is proposed describing a portable format for electronic exchange of data in the physical sciences; it allows the user to make her own choices regarding strategic tradeoffs to achieve the performance desired in her local environment.
Abstract: A standard is proposed describing a portable format for electronic exchange of data in the physical sciences. Writing scientific data in a standard format has three basic advantages: portability; the ability to use metadata to aid in interpretation of the data (understandability); and reusability. An improperly formulated standard format tends towards four disadvantages: (1) it can be inflexible and fail to allow the user to express his data as needed; (2) reading and writing such datasets can involve high overhead in computing time and storage space; (3) the format may be accessible only on certain machines using certain languages; and (4) under some circumstances it may be uncertain whether a given dataset actually conforms to the standard. A format was designed which enhances these advantages and lessens the disadvantages. The fundamental approach is to allow the user to make her own choices regarding strategic tradeoffs to achieve the performance desired in her local environment. The choices made are encoded in a specific and portable way in a set of records. A fully detailed description and specification of the format is given, and examples are used to illustrate various concepts. Implementation is discussed.
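
The self-describing-record idea can be sketched as a dataset written with a portable metadata header, so a reader needs no outside knowledge to interpret the values. The byte layout below is illustrative, not the proposed standard.

```python
# A dataset file = length-prefixed metadata header (portable JSON) followed
# by big-endian doubles; the metadata aids interpretation and reuse.
import json, struct

def write_dataset(path: str, meta: dict, values: list[float]) -> None:
    header = json.dumps(meta).encode()
    with open(path, "wb") as f:
        f.write(struct.pack(">I", len(header)))   # header length, big-endian
        f.write(header)                           # metadata: units, shape...
        f.write(struct.pack(f">{len(values)}d", *values))

def read_dataset(path: str) -> tuple[dict, list[float]]:
    with open(path, "rb") as f:
        (hlen,) = struct.unpack(">I", f.read(4))
        meta = json.loads(f.read(hlen))
        raw = f.read()
        return meta, list(struct.unpack(f">{len(raw) // 8}d", raw))

write_dataset("obs.dat", {"units": "K", "var": "temp"}, [287.4, 288.1])
meta, values = read_dataset("obs.dat")
```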

2 citations


Proceedings ArticleDOI
04 Aug 1993
TL;DR: An architecture for an object-oriented semantic data flow system (OSDF) is presented; it concentrates on scientific data representation and access in a data flow system, with a Scientific Data Model (SDM) put forward to model scientific data sets.
Abstract: At present, most visualization tools use traditional flat sequential files for handling scientific data. This results in inefficient and ineffective storage, access, and use of large, complex data sets, especially in applications such as scientific visualization. The available data models for scientific visualization, including CDF, netCDF, and HDF, support only some common data types; they do not consider the relationships among data, which are important for revealing the insight within a scientific data set. In this paper, we present an architecture for an object-oriented semantic data flow system (OSDF). OSDF concentrates on scientific data representation and access in a data flow system. A Scientific Data Model (SDM) is put forward to model scientific data sets. In SDM, the semantics of data are described by two constructs: the Association, which describes relationships among data, and the Data Constructor, which builds new data types. All data objects of an application are stored in an object base, where data are organized and accessed by their semantics through a supplied interface. To attain the best visualization effect, rules and principles for selecting visualization techniques are integrated into the data objects.
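
A minimal sketch of the SDM notions named above, where data objects carry named associations to other objects; the class and method names are assumptions, not OSDF's API.

```python
# Data objects carry named associations to other objects, so data can be
# organized and accessed by its semantics rather than by file position.
class DataObject:
    def __init__(self, name: str, value=None) -> None:
        self.name = name
        self.value = value
        self.assoc: dict[str, list["DataObject"]] = {}

    def associate(self, relation: str, other: "DataObject") -> None:
        self.assoc.setdefault(relation, []).append(other)

grid = DataObject("grid", value=[[0.0] * 64] * 64)
temp = DataObject("temperature")
temp.associate("defined_on", grid)   # the semantics drive later access
```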

2 citations


Proceedings ArticleDOI
O. Graf, M. Jones, F. Sisco
26 Apr 1993
TL;DR: The prototype system demonstrates an approach to the issues of data ingestion, data restructuring, physical data models, the relationship of file structure to data system performance, data product generation, data transfer to remote users, data subset extraction, data browsing, and user interface.
Abstract: Approaches to the issues of data ingestion, data restructuring, physical data models, the relationship of file structure to data system performance, data product generation, data transfer to remote users, data subset extraction, data browsing, and user interface have been examined. The High Performance Data System architecture provides an environment for bringing together the technologies of mass storage, large-bandwidth data networks, high-performance data processing, and intelligent data access. The prototype system demonstrates an approach to these issues. In addition, the design process has defined some important requirements for the mass storage file system, such as logical grouping of files, aggregate file writes, and multiple dynamic storage device hierarchies.
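
One of the requirements the design surfaced, aggregate file writes, amounts to bundling many small files into a single large sequential write to the mass store. A minimal sketch using the standard tarfile module (the function name is made up):

```python
# Aggregate write: group many small files into one archive so the mass
# storage system sees a single large sequential write.
import tarfile

def aggregate_write(archive: str, members: list[str]) -> None:
    with tarfile.open(archive, "w") as tar:
        for path in members:
            tar.add(path)   # one logical group, one sequential stream
```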


Journal ArticleDOI
TL;DR: Mechanisms based on the DBV and DBV graph concepts proposed in EDBMS/2 give it powerful version-management capabilities and allow it to run efficiently.
Abstract: Efficient data management is crucial to the success of a CAD/CAM system. Traditional database systems, designed to deal only with regular, structured data, cannot efficiently manage design data. In this paper, we present a data manager called EDBMS/2, which has been developed by our laboratory for engineering support applications. EDBMS/2 has a data model that combines features of both relational and semantic models and offers flexible capabilities for modeling engineering data, such as variable-length data processing, integrated management of structured and unstructured data, and composite object handling. Mechanisms based on the DBV and DBV graph concepts proposed in EDBMS/2 give it powerful version-management capabilities and allow it to run efficiently. To date, EDBMS/2 has been used successfully in the EDCADS (integrated Electronic Devices CAD environment) project and as a lower-level support layer for developing an object-oriented DBMS for mechanical engineering.
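
The DBV graph is EDBMS/2's own mechanism and is not specified in the abstract; the sketch below shows only a generic design-version graph, in which each version records its parents so alternatives and derivation history can be traversed.

```python
# Generic version graph for design objects: deriving a new version links
# it back to its parent(s), preserving alternatives and history.
class Version:
    def __init__(self, obj_id: str, data, parents=()) -> None:
        self.obj_id = obj_id
        self.data = data
        self.parents = list(parents)

    def derive(self, new_data) -> "Version":
        return Version(self.obj_id, new_data, parents=[self])

v1 = Version("gear-17", {"teeth": 24})
v2 = v1.derive({"teeth": 28})        # a new design alternative
assert v2.parents[0] is v1           # derivation history is traversable
```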


Proceedings ArticleDOI
P. Labourdette
24 May 1993
TL;DR: The author focuses on the end-user computing services (EUCS) point of view, including the satisfaction of user needs as well as the organizational and technical approach to this nonapplication environment.
Abstract: The business data warehouse approach at the IBM Corbeil-Essonnes plant is described. Both concept and implementation aspects are discussed. The author focuses on the end-user computing services (EUCS) point of view, including the satisfaction of user needs as well as the organizational and technical approach to this nonapplication environment.