
Showing papers on "Data management published in 1993"


Journal ArticleDOI
TL;DR: This paper describes how both the domain and the information sources are modeled, shows how a query at the domain level is mapped into a set of queries to individual information sources, and presents algorithms for automatically improving the efficiency of queries using knowledge about both the Domain and the Information sources.
Abstract: With the current explosion of data, retrieving and integrating information from various sources is a critical problem. Work in multidatabase systems has begun to address this problem, but it has primarily focused on methods for communicating between databases and requires significant effort for each new database added to the system. This paper describes a more general approach that exploits a semantic model of a problem domain to integrate the information from various information sources. The information sources handled include both databases and knowledge bases, and other information sources (e.g. programs) could potentially be incorporated into the system. This paper describes how both the domain and the information sources are modeled, shows how a query at the domain level is mapped into a set of queries to individual information sources, and presents algorithms for automatically improving the efficiency of queries using knowledge about both the domain and the information sources. This work is implemented in a system called SIMS and has been tested in a transportation planning domain using nine Oracle databases and a Loom knowledge base.
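The domain-to-source query mapping described above can be sketched roughly as follows. This is a toy illustration only: the source registry, attribute names, and greedy partitioning rule are invented, and SIMS's actual planning is driven by a semantic (Loom) model rather than a flat attribute table.

```python
# Hypothetical registry of which domain attributes each information
# source can answer (SIMS derives such coverage from its domain model).
SOURCES = {
    "ports_db": {"port", "country", "depth"},
    "ships_db": {"ship", "capacity", "home_port"},
}

def plan(query_attrs):
    """Partition the requested domain attributes across the sources
    that cover them, yielding one sub-query per source."""
    subqueries = {}
    for attr in query_attrs:
        for source, attrs in SOURCES.items():
            if attr in attrs:
                subqueries.setdefault(source, set()).add(attr)
                break
        else:
            raise ValueError(f"no source covers attribute {attr!r}")
    return subqueries

# {'port', 'depth'} is routed to ports_db, {'capacity'} to ships_db
subqueries = plan({"port", "depth", "capacity"})
```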

506 citations


Journal ArticleDOI
TL;DR: The generally high quality of the Manitoba registry file and the hospital claims is supported by comparisons with other data sources, and some of the research possibilities associated with population registries and administrative data are outlined.
Abstract: In this article the organization and accuracy of the population registry and administrative data base in Manitoba, Canada are discussed. The overall data management strategy and a framework for analyzing the accuracy of such data are presented. The generally high quality of the Manitoba registry file (necessary to track individuals over time) and the hospital claims is supported by comparisons with other data sources. Hospital claims' main quality problems concern the reliability of certain secondary diagnoses and the level of aggregation necessary for reasonable agreement with other data collection methods (such as chart reviews). Finally, some of the research possibilities associated with population registries and administrative data are outlined.

279 citations


Journal ArticleDOI
01 Mar 1993
TL;DR: New research problems include management of location dependent data, wireless data broadcasting, disconnection management and energy efficient data access in mobile computing.
Abstract: Mobile computing is an emerging computing paradigm. Data management in this paradigm poses many challenging problems to the database community. In this paper we identify these new challenges and investigate their technical significance. New research problems include management of location-dependent data, wireless data broadcasting, disconnection management, and energy-efficient data access.

182 citations



Patent
Makoto Mita1
16 Feb 1993
TL;DR: In this article, a data management system is presented that comprises a management device with a table formation device for forming a management table indicating the respective conditions of a right of use for each of a plurality of data, a use authorization determination device for determining, by reference to the management table, whether a request for the right of use for one of the data is authorized, and a use allocation device for granting the right of use for a subset of the data on the basis of a determination by the use authorization determination device.
Abstract: A data management system comprising a management device, wherein the management device comprises a plurality of data, a table formation device for forming a management table indicating respective conditions of a right of use for each one of the plurality of data, a use authorization determination device for determining whether a request for the right of use for one of the plurality of data is authorized by reference to the management table, and a use allocation device for granting the right of use for one of the plurality of data on the basis of a determination by the use authorization determination device. The data management system also includes a plurality of information processing devices, each one of the plurality comprising a use authorization request device for requesting from the management device a grant of the right of use for one of the plurality of data, and a data storage device for storing at least one of the plurality of data upon transfer of the one of the plurality of data from the management device, and a connection cable between the management device and the plurality of information processing devices, whereby a network is formed and whereby data is transferred.
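As a rough illustration only (not the patented implementation), the management table and use-authorization logic in the claim might look like the following sketch; the class and method names are hypothetical.

```python
class ManagementDevice:
    """Toy management device: tracks the right of use for each datum."""

    def __init__(self, data_ids):
        # management table: datum id -> current holder (None = available)
        self.table = {d: None for d in data_ids}

    def request_use(self, datum, requester):
        """Grant the right of use iff the datum is free or already held
        by the same requester; otherwise refuse."""
        holder = self.table.get(datum)
        if holder in (None, requester):
            self.table[datum] = requester
            return True
        return False

    def release(self, datum, requester):
        """Return the right of use so another device may acquire it."""
        if self.table.get(datum) == requester:
            self.table[datum] = None

mgr = ManagementDevice(["d1", "d2"])
assert mgr.request_use("d1", "node-A")       # granted
assert not mgr.request_use("d1", "node-B")   # refused while held
mgr.release("d1", "node-A")
assert mgr.request_use("d1", "node-B")       # granted after release
```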

124 citations


Proceedings ArticleDOI
Ken Brodlie1, Andrew Poon1, Helen Wright1, Lesley Brankin1, Greg Banecki1 
25 Oct 1993
TL;DR: An architecture is proposed in which tools for computation and visualization can be embedded in a framework which assists in the management of the problem solving process and has an integral data management facility which allows an audit trail of the experiments to be recorded.
Abstract: Visualization has proved an efficient tool in the understanding of large data sets in computational science and engineering. There is growing interest today in the development of problem solving environments which integrate both visualization and the computational process which generates the data. The GRASPARC project has looked at some of the issues involved in creating such an environment. An architecture is proposed in which tools for computation and visualization can be embedded in a framework which assists in the management of the problem solving process. This framework has an integral data management facility which allows an audit trail of the experiments to be recorded. This design therefore allows not only steering but also backtracking and more complicated problem solving strategies. A number of demonstrator case studies have been implemented.

111 citations


Journal ArticleDOI
01 Jan 1993
TL;DR: In this paper, the authors examine three central topics in model management: model base structure and its correspondence to network and relational data base structures, model base processing and the application of artificial intelligence to model interfacing, integration, construction, and interpretation.
Abstract: During the past fifteen years, model management has grown from a few suggestions that data management be enlarged to include decision models to an established but still growing field of study. We examine three central topics in model management. The first is model base structure and its correspondence to network and relational data base structures. The second is model base processing and the application of artificial intelligence to model interfacing, integration, construction, and interpretation. The third is the organizational environment of model management systems and the contribution of model management systems to organizational intelligence.

110 citations


Journal ArticleDOI
TL;DR: It is argued that the main challenge of management standardization is to develop conventions to support integrated management of heterogeneous networks.
Abstract: Management systems are responsible for monitoring, interpreting, and controlling network operations. Management platform workstations query device data, or obtain event modifications through management protocols. The management platform supports tools to display the data graphically, interpret it, and control operations. It is argued that the main challenge of management standardization is to develop conventions to support integrated management of heterogeneous networks. Platform-centered management requires a few standards. First, access by platforms to multivendor devices must be unified through a standard management protocol. Second, the structure of the agent's management databases, manipulated by the protocol, must be standardized. Together, these standards permit a platform to access and manipulate managed information at multivendor device agents. The OSI and Internet management models developed to standardize both areas are discussed.

105 citations



Journal ArticleDOI
TL;DR: This work approaches the problem of key management in a modular and hierarchical manner and discusses key management security requirements, deals with generic key management concepts and design criteria, and describes key management services and building blocks, as well as key management facilities, key management units, and their interrelationship.
Abstract: Security services based on cryptographic mechanisms assume keys to be distributed prior to secure communications. The secure management of these keys is one of the most critical elements when integrating cryptographic functions into a system, since any security concept will be ineffective if the key management is weak. This work approaches the problem of key management in a modular and hierarchical manner. It discusses key management security requirements, deals with generic key management concepts and design criteria, describes key management services and building blocks, as well as key management facilities, key management units, and their interrelationship.

93 citations


Journal ArticleDOI
TL;DR: Several essential aspects of weather observing and the management of weather data are discussed as related to improving knowledge of climate variations and change in the surface boundary layer and the resultant consequences for socioeconomic and biogeophysical systems.
Abstract: Several essential aspects of weather observing and the management of weather data are discussed as related to improving knowledge of climate variations and change in the surface boundary layer and the resultant consequences for socioeconomic and biogeophysical systems. The issues include long-term homogeneous time series of routine weather observations; time- and space-scale resolution of datasets derived from the observations; information about observing systems, data collection systems, and data reduction algorithms; and the enhancement of weather observing systems to serve as climate observing systems. Although much has been learned from existing weather networks and methods of data management, the system is far from perfect. There are several vital areas that have not received adequate attention. Particular improvements are needed in the interaction between network designers and climatologists; operational analyses that focus on detecting and documenting outliers and time-dependent biases wit...

Book ChapterDOI
Moira C. Norrie1
15 Dec 1993
TL;DR: Database programming in object-oriented systems can be supported by combining data modelling and programming technologies such that a data model supports the management of collections of objects where those objects are as specified by the underlying object- oriented programming language.
Abstract: Database programming in object-oriented systems can be supported by combining data modelling and programming technologies such that a data model supports the management of collections of objects where those objects are as specified by the underlying object-oriented programming language. This approach is the basis of the object data management services (ODMS) of the Comandos system. The ODMS data model provides constructs for the representation of both entities and their relationships and further supports rich classification structures. To complement the structural model, there is an operational model based on an algebra over collections of objects.

Journal Article
TL;DR: This report documents and presents a top-level design and implementation plan for geographic information systems (GISs) for transportation based on an assessment of the current state of the art of GIS for transportation, and a projection of technological developments through the next five to ten years.
Abstract: This report documents and presents a top-level design and implementation plan for geographic information systems (GISs) for transportation. The basis for the design and implementation plan has been first an assessment of the current state of the art of GIS for transportation (GIS-T) through interviews with DOTs and MPOs and through a survey of GIS vendors, and second a projection of technological developments through the next five to ten years. A GIS-T may be thought of as a union of a GIS and a transportation information system, with enhancements to the GIS software and to the transportation data. A central significance of GIS-T technology is in its potential to serve as the long-sought data and systems integrator for transportation agencies. Given that so many transportation data are or can be geographically referenced, the GIS-T enabled and managed concept of location provides a basis for integrating databases and information systems across almost all transportation agency applications. In order to realize the greatest benefits of GIS-T, DOTs should develop agency-wide strategic plans that comprehend not only GIS technology adoption and application, but also concurrent adoption and application of open-systems standards and of a wide range of imminent complementary technologies, from computer networking and distributed computing, through new data storage media and database architectures, through computer-aided software engineering, to computer-based graphics and computer-aided design. The plans should address a full range of application scales, because GIS has the potential to become ubiquitous throughout all functional areas of transportation agencies. The recommended approach is top down for system design, then bottom up for application development. A GIS-T server-net architecture with computational and data management labor divided among different kinds of servers is recommended. 
(Fifteen kinds of servers are suggested as a plausible first iteration for the required design.) Implementation of the server net can be incremental, with a conceptual architecture in place as an organizing principle before complete physical realization is feasible. In the same way, the concept of location can serve as a conceptual integrator for data schemas before the GIS-enabled and -managed spatial databases required for actual integration are fully available and while they are being incrementally constructed.

Proceedings Article
24 Aug 1993
TL;DR: A user interface paradigm for database management systems that is motivated by scientific visualization applications is presented, which includes a “boxes and arrows” notation for database access and a flight simulator model of movement through information space.
Abstract: We present a user interface paradigm for database management systems that is motivated by scientific visualization applications. Our graphical user interface includes a “boxes and arrows” notation for database access and a flight simulator model of movement through information space. We also provide means to specify a hierarchy of abstracts of data of different types and resolutions, so that a “zoom” capability can be supported. The underlying DBMS support for this system is described and includes the compilation of query plans into megaplans, new algorithms for data buffering, and provisions for a guaranteed rate of data delivery. The current state of the Tioga implementation is also described.

Patent
17 Sep 1993
TL;DR: In this paper, a method for restructuring input data having a prespecified input data structure with a data management engine, such as a translation engine, which is dynamically configured by the input data, is presented.
Abstract: A method for restructuring input data having a prespecified input data structure with a data management engine, such as a translation engine, which is dynamically configured by the input data to conform the input data to an output data structure. When the data management engine takes the form of a translation engine, capability exists to translate data to and from various electronic data interchange (EDI) formats. Because the data management engine is dynamically configured by the input data, the translation process is independent of the specific details of EDI protocol and capable of coordinating multiple data sources/destinations without requiring any recoding of the engine. Initially, data protocol utilized in the data interchange system is ascertained. Then, the protocol data related to the data protocol and protocol processing instruction data is initialized. Next, the initialized protocol data and the initialized protocol instruction processing data is linked to a core program which takes the form of a translator in the EDI system to form the translation engine. Finally, the input data is converted into an output data structure with the translation engine wherein the input data dynamically configures the translation engine.
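A heavily simplified sketch of a translation engine that is configured by its own input, in the spirit of the claim above: the input names its protocol, and that name selects the conversion rules linked to the core program. The protocol identifiers and record layouts here are invented; real EDI formats are far richer.

```python
# Hypothetical protocol registry: protocol id -> (input field order, separator).
PROTOCOLS = {
    "FMT-A": (["id", "name", "qty"], "|"),
    "FMT-B": (["qty", "id", "name"], ";"),
}

def translate(raw):
    """The first line of the input names its own protocol; the engine
    configures itself from that line, then emits one fixed output
    structure regardless of the input protocol."""
    header, payload = raw.split("\n", 1)
    fields, sep = PROTOCOLS[header]
    values = payload.strip().split(sep)
    return dict(zip(fields, values))

out = translate("FMT-B\n3;X17;widget")
# -> {'qty': '3', 'id': 'X17', 'name': 'widget'}
```

Because the rules live in data rather than code, adding a new format means adding a registry entry, not recoding the engine, which is the property the abstract emphasizes.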

Journal ArticleDOI
01 Feb 1993
TL;DR: It is argued that an overall object-oriented approach can significantly contribute towards the integration of model management, data management, software engineering, and artificial intelligence.
Abstract: Through various studies, a number of model management (MM) issues have been addressed in the literature. There is a need to consolidate the various proposals and the different interpretations of the notion of a model. Towards this end, this paper proposes an object-oriented framework which provides a unifying context for MM research and DSS development. The framework coherently integrates Geoffrion's structured modeling together with Muhanna and Pick's systems approach, thereby offering a methodology for both modeling-in-the-small as well as modeling-in-the-large. Further, we argue that an overall object-oriented approach can significantly contribute towards the integration of model management, data management, software engineering, and artificial intelligence.

Proceedings ArticleDOI
M.J. Maullo1, S.B. Calo1
14 Apr 1993
TL;DR: An approach is suggested for the transformation of policy statements into executable process decision functions; and, an architecture is outlined to help organize the elements of policy in a manageable way.
Abstract: This paper deals with the role of policies in the management of large complex systems. An approach is suggested for the transformation of policy statements into executable process decision functions, and an architecture is outlined to help organize the elements of policy in a manageable way. To illustrate the concepts, an example is given from information processing that conveys more concretely the manner in which this architecture can be applied to policy management. Tools are introduced to deploy a policy management system.


Proceedings ArticleDOI
19 Apr 1993
TL;DR: The typical three-level architecture approach for supporting data management applications and previous work on the translation of extended entity-relationships schemas into relational database management system schemas are reviewed.
Abstract: A query language called the Concise Object Query Language (COQL) is described. COQL is unique in its conciseness, in its support of inheritance, and in the capabilities it provides for defining application-specific structures. The COQL-to-SQL translation, its implementation on top of a commercial relational DBMS, and the ways in which COQL can be used for constructing application-specific views for scientific applications are discussed. The typical three-level architecture approach for supporting data management applications and previous work on the translation of extended entity-relationship schemas into relational database management system schemas are reviewed.

Patent
23 Mar 1993
TL;DR: In this paper, a data management method of managing shared data which is shared by a plurality of processes and data inherent in a process which exists during execution of one particular process and disappears when the process is finished is presented.
Abstract: This data management method manages shared data, which is shared by a plurality of processes, and data inherent in a process, which exists during execution of one particular process and disappears when the process is finished. When a process fetches shared data from a database into a memory, whether the shared data requires data inherent in the process is checked. If that necessity is determined, the inherent data of the process is determined and stored in the memory. A pointer to the inherent data of the process, as stored in the memory, is then stored into the fetched shared data in accordance with attributes inherent in the process requiring the inherent data.
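The fetch-and-attach step of the method might be sketched as follows; the field names, the callback, and the use of a dict copy to stand in for process memory are all invented for illustration.

```python
def fetch_shared(shared_record, process_id, make_inherent):
    """Copy a shared record into a process's memory; when the record
    requires process-inherent data, materialize that data for this
    process and store a reference to it inside the fetched copy."""
    copy = dict(shared_record)                 # fetched into process memory
    if copy.get("needs_inherent"):
        inherent = make_inherent(process_id)   # determined per process
        copy["inherent_ptr"] = inherent        # pointer stored in the copy
    return copy

rec = {"name": "config", "needs_inherent": True}
local = fetch_shared(rec, "p1", lambda pid: {"owner": pid, "scratch": []})
assert local["inherent_ptr"]["owner"] == "p1"
assert "inherent_ptr" not in rec   # the shared record itself is unchanged
```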

01 Jan 1993
TL;DR: In this paper, the authors discuss the foundations of management, the evolution of management the external environment managerial decision-making, planning and strategy, ethics corporate social responsibility managing in our natural environment international management managing new ventures.
Abstract: Part 1 Foundations of management: managers and organizations, the evolution of management, the external environment, managerial decision making. Part 2 Planning and strategy: planning and strategic management, ethics, corporate social responsibility, managing in our natural environment, international management, managing new ventures. Part 3 Organizing and staffing: organization structure, the responsive organization, human resources management, managing the diverse work force. Part 4 Leading: leadership, motivating for performance, managing teams, communicating. Part 5 Control and change: organizational control, operations management, managing technology and innovation, becoming world class.

Proceedings Article
24 Aug 1993
TL;DR: This paper presents a framework for capturing and managing scientific data derivation histories as implemented in the Gaea scientific database management system and proposes to extend current semantic modeling and object-oriented technology with special constructs: concepts, processes, and tasks.
Abstract: One important aspect of scientific data management is metadata management. Metadata is information about data (e.g., content, source, processing applied, precision). One kind of metadata which needs special attention is the data derivation information, i.e., how data are generated. In our application domain of geographical information systems (GIS) and global change research, we view scientific objects according to three different extents: spatial, temporal, and derivation. While the spatial and temporal extents have been studied and formal semantics for those extents proposed, derivation semantics have been ignored. This paper presents a framework for capturing and managing scientific data derivation histories as implemented in the Gaea scientific database management system. We focus on how Gaea handles metadata and propose to extend current semantic modeling and object-oriented technology with special constructs: concepts, processes, and tasks. Concepts are used to capture entity sets with imprecise definitions. A process captures the derivation procedure of a specific scientific object class, while a task is the instance representing the derivation of a scientific data object. We believe that this framework, useful for GIS and global change studies, generalizes well to other scientific fields.
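The process/task distinction can be illustrated with a small sketch. The class names follow the abstract's terminology, but the fields and the NDVI example are invented, and Gaea's actual constructs are embedded in a full semantic data model.

```python
from dataclasses import dataclass

@dataclass
class Process:
    """Derivation procedure for a class of scientific objects."""
    name: str
    inputs: list          # input object classes
    output_class: str

@dataclass
class Task:
    """One recorded derivation: an instance of a Process applied
    to concrete data objects, forming the derivation history."""
    process: Process
    input_ids: list
    output_id: str

# Hypothetical example: deriving a vegetation-index grid from two bands.
ndvi = Process("compute_ndvi", ["red_band", "nir_band"], "ndvi_grid")
task = Task(ndvi, ["img-001-red", "img-001-nir"], "ndvi-001")
```

Chaining such tasks backward from any output object yields the audit trail of how that object was generated, which is the derivation extent the paper argues has been ignored.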

Proceedings ArticleDOI
01 Dec 1993
TL;DR: This paper describes the application of workflow management systems in an Italian bank which copes with events in the daily appearance of overdrafts on current accounts which have to be managed by the agency director and the branch’s staff.
Abstract: This paper describes the application of workflow management systems in an Italian bank. Under a changing competitive and financial situation, the bank had to react by redesigning its market-oriented business processes. Customer-related credit processes have been analyzed using a method based on a client/supplier model. The credit management process was the target for a workflow-based reporting system. The system copes with events in the daily appearance of overdrafts on current accounts, which have to be managed by the agency director and the branch's staff. The reporting system developed is part of a global change from a centralized to a distributed credit management information system based on a client/server architecture. The functional architecture for workflow management technology defines how to integrate the different functional modules (message handling, data management and document management), and in particular, mainframe EDP with end user computing.

Patent
19 May 1993
TL;DR: In this article, the system analyzes this request to acquire the format ID of the CCP data requested by the application and passes the searched CCP data to the application, and ends processing.
Abstract: CCP data as common data, which can be utilized by a plurality of users or applications, are stored in a CCP buffer in units of data types (format IDs). More specifically, the CCP buffer stores one CCP data per format ID. Upon reception of a request for CCP data from an application, the system analyzes this request to acquire the format ID of the CCP data requested by the application. The system searches CCP data having a format ID coinciding with the requested format ID from data management information of CCP data. The system passes the searched CCP data to the application, and ends processing. In this manner, CCP data can be stored in correspondence with a plurality of data types, and required CCP data can be easily selected by using its format ID. Therefore, operability with respect to common data can be improved, and work efficiency of users can be improved.
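The one-datum-per-format-ID lookup described above reduces to a keyed buffer. This sketch uses invented names and treats the format ID as a plain dictionary key, which is far simpler than the patented system.

```python
class CCPBuffer:
    """Toy buffer holding one shared CCP datum per format ID."""

    def __init__(self):
        self._by_format = {}   # format ID -> the single shared datum

    def store(self, format_id, data):
        # One CCP datum per format ID: a later store replaces the earlier.
        self._by_format[format_id] = data

    def request(self, format_id):
        """Return the shared datum whose format ID matches the request,
        or None when no datum of that type is registered."""
        return self._by_format.get(format_id)

buf = CCPBuffer()
buf.store("fmt-42", {"palette": [0, 1, 2]})
assert buf.request("fmt-42") == {"palette": [0, 1, 2]}
assert buf.request("fmt-99") is None
```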

Patent
20 Apr 1993
TL;DR: An image data management system as discussed by the authors includes an image input device for taking in information represented on paper as image data, and an image data input output apparatus for processing image data inputted from the image input device and sending processed image data onto the communication circuit network.
Abstract: An image data management system includes a communication circuit network, and an image data input output apparatus and an image data registration apparatus connected to this communication circuit network. The image data input output apparatus includes an image input device for taking in information represented on paper as image data, a management information input device for reading image data management information from a portable memory device, the image data management information being needed to register and manage image data inputted from the image input device, an input device for inputting command information given by a user, and an input output processing device for processing image data inputted from the image input device and sending processed image data onto the communication circuit network, on the basis of the command information inputted from the input device and the image data management information read from the management information input device. The image data registration apparatus includes a memory device for registering image data sent from the image data input output apparatus, and an image data registering device for taking in image data on the communication circuit network, providing image data thus taken in with search information needed at the time of searching the image data, and registering the image data with the search information attached thereto in the memory device.

Book ChapterDOI
13 Oct 1993
TL;DR: The ways in which storage system architectures must change in order to provide constrained-latency storage access (CLSA) on continuous media, taking into account operating system and network support as well as database management are examined.
Abstract: Data storage systems are being called on to manage continuous media data types, such as digital audio and video. There is a demand by applications for “constrained-latency storage access” (CLSA) to such data: precisely scheduled delivery of data streams. We believe that anticipated quantitative improvements in processor and storage-device performance will not be sufficient for current data management architectures to meet CLSA requirements. The need for high-volume (but high-latency) storage devices, high-bandwidth access and predictable throughput rates means that standard latency-masking techniques, such as buffering, are inadequate for the service demands of these applications. We examine the ways in which storage system architectures must change in order to provide CLSA on continuous media, taking into account operating system and network support as well as database management. Particular points we cover include: changes in the form of requests and responses at the application-database and database-OS interfaces; new kinds of abstractions and data independence that data management systems will need to supply, such as quality-of-service requests and mapping of domain events to OS events; effects of CLSA demands on query optimization, planning and evaluation, including the need for accurate resource estimates and detailed schedules; and new information requirements for the database system, such as better characterizations of storage subsystem performance and application patterns.

Journal Article
TL;DR: The desktop EMS is viewed as the sole interface between the experimental scientist and the data, and management tools that can be tailored by a typical scientific team to effectively manage their unique experimental environment are developed.
Abstract: Traditionally, the scale and scope of an experimental study was determined by the ability of the research team to generate data. Over the past few years, we have been experiencing an unprecedented increase in the ability of small teams of experimental scientists to generate data. This has led to a shift in the balance between the different components of an experimental study. Today, it is not uncommon to find a study whose scale and scope have been determined by the ability of the team to manage the data rather than to generate it. Whether the discipline is experimental computer science [4], genetics, earth and space sciences, soil sciences, or high-energy physics, scientists are faced in their daily work with an experiment and data management bottleneck. Unfortunately, an experimental scientist can not find ready off-the-shelf management tools that offer both the functionality required by a scientific environment and an interface that feels natural and intuitive to the non-expert. While no special expertise is needed to manage a collection of images stored as files on a PC or as pictures in a paper notebook, existing database systems (DBMSs) that offer the desired functionality require expertise that most teams of experimental scientists do not have and can not afford to pay for. This poses a challenge to the database community to develop management tools that can be tailored by a typical scientific team to effectively manage their unique experimental environment. To address this challenge, we have undertaken an effort to develop a desktop Experiment Management System (EMS) [2]. We view the desktop EMS as the sole interface between the experimental scientist and the data

Book ChapterDOI
30 Aug 1993
TL;DR: Fox (Finding Objects of eXperiments) is the declarative query language for Moose (Modeling Objects of Scientific Experiments), an object-oriented data model at the core of a scientific experiment management system (EMS) being developed at Wisconsin.
Abstract: Fox (Finding Objects of eXperiments) is the declarative query language for Moose (Modeling Objects Of Scientific Experiments), an object-oriented data model at the core of a scientific experiment management system (EMS) being developed at Wisconsin. The goal of the EMS is to support scientists in managing their experimental studies and the data that are generated from them.

Book
01 Aug 1993
TL;DR: In this paper, the authors present a set of strategies for the management of quality, learning, flexibility, and forecasting in a manufacturing, service, and operations management environment, including: 1. Operations Management, Systems and the Environment 2. Productivity, Learning, Flexibility and Forecasting 3. Corporate, Market and Financial Strategies 4. Operations Strategy 5. Manufacturing, Service and Operations Management 6. The Management of Quality 7. Materials Management Systems 8. Technology Strategy 9. Human Resource Development
Abstract: 1. Operations Management, Systems and the Environment 2. Productivity, Learning, Flexibility and Forecasting 3. Corporate, Market and Financial Strategies 4. Operations Strategy 5. Manufacturing, Service and Operations Management 6. The Management of Quality 7. Materials Management Systems 8. Technology Strategy 9. Human Resource Development

Journal ArticleDOI
TL;DR: Major issues researchers should consider in choosing a computer or a manual QDMS include availability and accessibility, comfort, appropriateness, efficiency, thoroughness and contextualization.
Abstract: A primary issue in conducting qualitative research is the time required for data analysis. Qualitative research can be costly, since data analysis is generally labour intensive and our time factors into money. There is, unfortunately, no magic formula for hastening the conceptual tasks associated with qualitative analysis, yet effective qualitative data management systems (QDMS) expedite the mechanical tasks, those tasks associated with storing and retrieving qualitative data. Rapid and smooth data management increases the time one can allot to data analysis. Although computer QDMS are increasingly recommended for their time-saving potential in relation to data management, some significant issues associated with the adoption of a computer versus a manual QDMS have not yet been fully explored. The purpose of this paper is to present major issues researchers should consider in choosing a computer or a manual QDMS. These issues include availability and accessibility, comfort, appropriateness, efficiency, thoroughness and contextualization.