
Showing papers on "Data management published in 1995"


Proceedings Article
01 Jan 1995
TL;DR: The use of repetitive broadcast as a way of augmenting the memory hierarchy of clients in an asymmetric communication environment and several "pure" cache management policies are examined.
Abstract: This paper proposes the use of repetitive broadcast as a way of augmenting the memory hierarchy of clients in an asymmetric communication environment. We describe a new technique called "Broadcast Disks" for structuring the broadcast in a way that provides improved performance for non-uniformly accessed data. The Broadcast Disk superimposes multiple disks spinning at different speeds on a single broadcast channel, in effect creating an arbitrarily fine-grained memory hierarchy. In addition to proposing and defining the mechanism, a main result of this work is that exploiting the potential of the broadcast structure requires a re-evaluation of basic cache management policies. We examine several "pure" cache management policies and develop implementable approximations to them. Previously appeared in Proceedings of the ACM SIGMOD International Conference on the Management of Data, 1995, pp. 199-210.
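The schedule generation behind the Broadcast Disk mechanism can be sketched in a few lines: each disk is split into chunks in inverse proportion to its relative frequency, and the chunks are interleaved on the single channel. The page contents and frequencies below are illustrative assumptions, not data from the paper.

```python
# A minimal sketch of Broadcast Disk schedule generation: faster "disks"
# recur more often on the single broadcast channel. Illustrative toy data.
from math import lcm

def broadcast_schedule(disks, rel_freq):
    """disks: list of page lists (hottest first); rel_freq: ints, one per disk.
    Disk i is split into max_chunks/rel_freq[i] chunks; chunks are then
    interleaved so disk i repeats rel_freq[i] times per major cycle."""
    max_chunks = lcm(*rel_freq)
    chunked = []
    for pages, f in zip(disks, rel_freq):
        n = max_chunks // f                 # number of chunks for this disk
        size = -(-len(pages) // n)          # ceiling division
        chunked.append([pages[k*size:(k+1)*size] for k in range(n)])
    schedule = []
    for i in range(max_chunks):             # one major broadcast cycle
        for chunks in chunked:
            schedule.extend(chunks[i % len(chunks)])
    return schedule

# Hot page A on a fast disk, colder pages B and C on a slow one:
print(broadcast_schedule([["A"], ["B", "C"]], [2, 1]))
# -> ['A', 'B', 'A', 'C'] : A is broadcast twice per cycle
```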

459 citations


Journal ArticleDOI
TL;DR: This work indicates how one can achieve enhanced access to data and knowledge by using descriptions in languages for schema design and integration, queries, answers, updates, rules, and constraints.
Abstract: Description logics and reasoners, which are descendants of the KL-ONE language, have been studied in depth in artificial intelligence. After a brief introduction, we survey their application to the problems of information management, using the framework of an abstract information server equipped with several operations, each involving one or more languages. Specifically, we indicate how one can achieve enhanced access to data and knowledge by using descriptions in languages for schema design and integration, queries, answers, updates, rules, and constraints.
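As a hypothetical illustration of the kind of description the survey refers to, a description-logic axiom can serve both as a schema-level definition and as something a reasoner answers instance queries against. The concept and role names here are invented for illustration, not taken from the paper.

```latex
% Invented example: a terminological definition (TBox) plus assertions
% (ABox) from which a reasoner derives the classification of an instance.
\[
\mathit{ManagedTable} \equiv \mathit{Table} \sqcap \exists\,\mathit{hasOwner}.\mathit{Department}
\]
\[
\mathit{Table}(\mathit{employees}),\;
\mathit{hasOwner}(\mathit{employees},\mathit{hr}),\;
\mathit{Department}(\mathit{hr})
\;\models\; \mathit{ManagedTable}(\mathit{employees})
\]
```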

321 citations


Journal ArticleDOI
TL;DR: This paper overviews the Object-Protocol Model (OPM) and a suite of data management tools based on OPM, a data model that allows specifying database structures and queries in terms of objects and protocols specific to scientific applications.

154 citations


Book ChapterDOI
13 Dec 1995
TL;DR: A convergence between workflow management and databases is occurring, where workflow management systems need to be more integrated with data management technology, particularly as it concerns the access to external databases.
Abstract: Workflow management is emerging as a challenging area for databases, stressing database technology beyond its current capabilities. Workflow management systems need to be more integrated with data management technology, in particular where access to external databases is concerned. Thus, a convergence between workflow management and databases is occurring.

137 citations


Patent
10 Jan 1995
TL;DR: A smart telecommunications supervisor management workstation with a monitoring system provides real-time operation statistics and a graphical representation of system operation; a standardized graphical user interface and a mouse-driven, point-and-click interface minimize keyboard entry.
Abstract: A smart telecommunications supervisor management workstation with monitoring system provides real-time operation statistics and graphical representation of system operation in real time. A standardized graphic user interface and a mouse-driven, point-and-click, user-friendly interface minimize keyboard entry. The management workstation also generates reports using off-the-shelf spreadsheet packages and incorporates data management functions via a highly refined user interface. The management workstation comprises an integrated system for generating alerts based on user-defined criteria for database information. Color-coded or shaded monochrome displays provide ease of viewing. Extensive use of icons allows quick identification and selection of management control functions.

132 citations


Journal ArticleDOI
TL;DR: The quality assurance and monitoring program is an integral and continuing part of study operations, and its goals can be met only when the coordinating center staff understand data quality goals and are up to date with all phases of data management and reporting.

95 citations


Journal ArticleDOI
TL;DR: This paper describes the theoretical framework used in interpreting data on runoff and soil loss from field experiments to yield information on soil erodibility; the framework has been implemented as computer programs used in the field experiments.

94 citations


Patent
Shimamura Nobuaki
25 May 1995
TL;DR: In this paper, resource management processes are linked to each other in accordance with parent-child relationships between the respective resource management groups to form a resource management tree, and resources necessary for a newly generated resource management group are distributed from resources owned by its parent resource management group.
Abstract: With a time-sharing-oriented operating system, resource management groups are hierarchically formed including a plurality of processes. Each resource management group includes a resource management process for managing resources allocated to its group, at least one process which is a descendant of the resource management process and not included in other resource management groups, and a resource management block for storing information on the resources managed by its own group. Resource management processes are linked to each other in accordance with parent-child relationships between the respective resource management groups to form a resource management tree. At the generation of a new resource management group, resources necessary for the newly generated resource management group are distributed from resources owned by a parent resource management group of the newly generated resource management group in accordance with the resource management tree.
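The distribution step the abstract describes can be sketched as a small tree of groups, where creating a child group draws its resources out of the parent's pool and links the two. Resource names and quantities are illustrative assumptions, not taken from the patent.

```python
# A sketch of hierarchical resource-management groups: creating a group
# transfers resources from its parent's pool, forming a tree of groups.
class ResourceGroup:
    def __init__(self, name, resources, parent=None):
        self.name, self.parent, self.children = name, parent, []
        self.resources = resources          # pool managed by this group

    def create_child(self, name, request):
        # Distribute requested resources to the new group from this pool.
        for r, amount in request.items():
            if self.resources.get(r, 0) < amount:
                raise ValueError(f"{self.name}: not enough {r}")
            self.resources[r] -= amount
        child = ResourceGroup(name, dict(request), parent=self)
        self.children.append(child)         # parent-child link in the tree
        return child

root = ResourceGroup("root", {"memory_mb": 1024, "cpu_slots": 8})
job = root.create_child("batch_job", {"memory_mb": 256, "cpu_slots": 2})
print(root.resources)   # -> {'memory_mb': 768, 'cpu_slots': 6}
```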

89 citations


01 Apr 1995
TL;DR: The EROP is an automated database of selected elements from the Economic Report of the President that allows the user to graph economic data over time to aid in determining trends and relationships.
Abstract: The Economic Report of the President Data Analysis Model (EROP) is an automated database of selected elements from the Economic Report of the President. This report is published annually by the Government Printing Office. The model uses the LOTUS spreadsheet program and is macro-driven based on button selections made by the user. The model allows the user to select various functions to perform on the data such as graphing and regression analysis. The primary value of the model is that it allows the user to graph economic data over time to aid in determining trends and relationships. The user can also perform regression analysis on data elements, create and graph new data elements based on simple mathematical relationships developed by the user, and maintain and update the database based on newly published reports. The model was developed to create a simple, automated process that allows ICAF faculty and students to analyze economic data quickly and easily.

85 citations


Book
01 Oct 1995
TL;DR: In this book, the authors present a knowledge-based approach to process planning for machining, grounded in knowledge representations and reasoning systems, along with an object-oriented knowledge-based inspection process planner.
Abstract: Preface. Introduction. Knowledge representations and reasoning systems. Knowledge-based systems approach to process planning. Feature-based modelling for process planning. Knowledge-based process planning for machining. Object-oriented knowledge-based inspection process planner. Knowledge-based assembly planning. Next generation intelligent manufacturing systems. Index.

77 citations


Proceedings ArticleDOI
02 Dec 1995
TL;DR: It is shown that dynamic reallocation can improve system throughput by a factor of two and a half for wide area networks and that individual site load must be taken into consideration when reallocating data; a simple policy is provided that incorporates load in the reallocation decision.
Abstract: Traditionally, allocation of data in distributed database management systems has been determined by off-line analysis and optimization. This technique works well for static database access patterns, but is often inadequate for frequently changing workloads. In this paper we address how to dynamically reallocate data for partitionable distributed databases with changing access patterns. Rather than relying on complicated and expensive optimization algorithms, a simple heuristic is presented and shown, via an implementation study, to improve system throughput for a LAN-based system. Based on artificial wide area network delays, we show that dynamic reallocation can improve system throughput by a factor of two and a half for wide area networks. We also show that individual site load must be taken into consideration when reallocating data, and provide a simple policy that incorporates load in the reallocation decision.
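A hedged sketch of the kind of load-aware policy the paper argues for: move a fragment toward the site issuing the most accesses, but skip the move if that site is already overloaded. The load cap and the bookkeeping here are illustrative assumptions, not the paper's exact heuristic.

```python
# Toy load-aware dynamic reallocation decision for one data fragment.
def choose_site(access_counts, site_load, current, load_cap=0.8):
    """access_counts: {site: accesses to this fragment};
    site_load: {site: utilization in [0, 1]}; current: owning site."""
    best = max(access_counts, key=access_counts.get)
    if best != current and site_load.get(best, 0.0) < load_cap:
        return best       # reallocate toward the dominant accessor
    return current        # stay put: no clear winner, or target too loaded

print(choose_site({"A": 120, "B": 40}, {"A": 0.95, "B": 0.3}, current="B"))
# -> 'B' : site A dominates accesses but is over the load cap, so no move
```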

Journal ArticleDOI
TL;DR: This paper has developed algorithms for partitioning the original datasets into “clusters” based on analysis of data access patterns and storage device characteristics, and designed enhancements to current storage server protocols to permit control over physical placement of data on storage devices.

Journal ArticleDOI
TL;DR: Factor analysis of an author cocitation frequency matrix derived from a database file consisting of 15,030 cited reference records taken from 692 citing articles uncovered seven informal clusters of decision support systems (DSS) research subspecialties and reference disciplines.
Abstract: This study applies factor analysis to an author cocitation frequency matrix derived from a database file consisting of a total of 15,030 cited reference records taken from 692 citing articles. Seven informal clusters of decision support systems (DSS) research subspecialties and reference disciplines were uncovered. Four of them represent DSS research subspecialties: foundations, group DSS, model/data management, and individual differences. Three other conceptual groupings define the reference disciplines of DSS: organizational science, multiple criteria decision making, and artificial intelligence. DSS is a very young academic field and is still growing. DSS has just entered the era of growth after 20 years of research. During the 1990s, DSS research will be further grounded in a diverse set of reference disciplines. Furthermore, DSS is in the active process of solidifying its domain and demarcating its reference disciplines. A DSS theory is likely to emerge in the near future in some area of DSS research, such as model management.
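The method is straightforward to sketch: factor-analyze the author cocitation matrix and group authors by their dominant factor, so that authors loading on the same factor form a subspecialty. The 4x4 matrix and author names below are fabricated toys, not the study's 15,030-record data.

```python
# Toy author cocitation analysis: authors sharing a dominant factor
# form a cluster (research subspecialty or reference discipline).
import numpy as np
from sklearn.decomposition import FactorAnalysis

authors = ["Keen", "Sprague", "DeSanctis", "Simon"]
cocite = np.array([[0, 9, 2, 1],      # symmetric cocitation counts
                   [9, 0, 3, 1],
                   [2, 3, 0, 8],
                   [1, 1, 8, 0]], dtype=float)

fa = FactorAnalysis(n_components=2, random_state=0).fit(cocite)
loadings = fa.components_              # shape: (factors, authors)
for name, col in zip(authors, loadings.T):
    print(name, "-> factor", int(np.argmax(np.abs(col))))
```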

Book
16 Mar 1995
TL;DR: This book traces the evolution toward total quality management, covering managers and organizations today, international competition, managerial ethics and corporate social responsibility, strategic management for customer value, organizing for total quality, behavioral processes, and control and systems improvement.
Abstract: Part I: Management Today:. 1. Managers and Organizations Today. 2. The Evolution Toward Total Quality Management. 3. International Competition and Management. 4. Managerial Ethics and Corporate Social Responsibility. Part II: Planning and Strategic Management for Customer Value:. 5. Decision Making, Problem Solving and Continuous Improvement. 6. Corporate and Competitive Strategy. 7. Customer Value Strategy. Part III: Organizing for Total Quality:. 8. Organizational Structure and Design. 9. Managing Human Resource Systems. 10. Organizational Culture and Change. Part IV: Behavioral Processes:. 11. Leadership. 12. Motivation. 13. Communication. 14. Teams and Groups. Part V: Control and Systems Improvement:. 15. Statistical and Other Quality Tools for Systems Improvement. 16. Management Information Systems. 17. Managing Technology and Technological Change. 18. Operations Management for Control and Improvement.

Journal ArticleDOI
TL;DR: The planning process for marine protected areas in Belize, Central America, adopts geographic information system (GIS) technology to integrate data from a variety of sources as discussed by the authors, and illustrates the many advantages of GIS over conventional approaches to mapping and data management.
Abstract: The planning process for marine protected areas in Belize, Central America, adopts geographic information system (GIS) technology to integrate data from a variety of sources. This article describes the importance of GIS in assisting institutional cooperation and illustrates the many advantages of GIS over conventional approaches to mapping and data management. GIS is a powerful analytical tool that has allowed the improvement of inadequate ground control and base mapping through the incorporation of differential global positioning systems technology. Subsequent resource mapping has been carried out through the application of SPOT Panchromatic and other remotely sensed imagery, field survey, and existing GIS data. In Belize, GIS products are increasingly being used as the primary source for management plans. The technology will play a key role in the future development of management initiatives for the country's coastal zone.

Book
08 Aug 1995
TL;DR: In this book, the authors cover planning and budgeting, hiring and training of interviewers, participant acquisition and retention, supervision of interviewers, data management, and quality control and research ethics.
Abstract: Introduction. Planning and Budget. Hiring. Training of Interviewers. Participant Acquisition and Retention. Supervision of Interviewers. Data Management. Quality Control and Research Ethics.

Patent
08 Nov 1995
TL;DR: In this article, a data management system is connected to a Signaling System 7 telecommunications network (SS#7), which includes a database library that classifies the problem and provides prioritized resolutions, based on past historical events.
Abstract: A data management system is connected to a Signaling System 7 telecommunications network (SS#7). Network elements and other data sources of the SS#7 network provide alarm data to the management system when problems occur with the network. The data management system includes a database library that classifies the problem and provides prioritized resolutions, based on past historical events. The library continues to expand with data each time an alarm is resolved. Data from the network elements are also stored for generation of reports, such as those dealing with configuration, performance, faults, and security.
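The self-expanding library the abstract describes can be sketched as a history of resolutions keyed by alarm class, ranked by how often each resolution worked before. The alarm fields, classification key, and resolutions below are illustrative assumptions, not the patent's design.

```python
# Toy alarm library: grows with each resolved alarm and returns
# resolutions prioritized by past success for that alarm class.
from collections import Counter, defaultdict

class AlarmLibrary:
    def __init__(self):
        self.history = defaultdict(Counter)   # alarm class -> resolution counts

    def classify(self, alarm):
        return (alarm["element"], alarm["code"])   # simple classification key

    def prioritized_resolutions(self, alarm):
        counts = self.history[self.classify(alarm)]
        return [r for r, _ in counts.most_common()]

    def record_resolution(self, alarm, resolution):
        self.history[self.classify(alarm)][resolution] += 1   # library grows

lib = AlarmLibrary()
lib.record_resolution({"element": "STP-1", "code": "LINK_FAIL"}, "reset link")
lib.record_resolution({"element": "STP-1", "code": "LINK_FAIL"}, "reset link")
lib.record_resolution({"element": "STP-1", "code": "LINK_FAIL"}, "swap card")
print(lib.prioritized_resolutions({"element": "STP-1", "code": "LINK_FAIL"}))
# -> ['reset link', 'swap card']
```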


Journal ArticleDOI
TL;DR: Systems with graphical data presentation have advantages over systems presenting data mainly in numeric format, and communication between clinicians, nurses, computer scientists, and PDMS vendors must be enhanced to achieve the common goal: useful and practical data management systems in ICUs.
Abstract: Computerized Patient Data Management Systems (PDMS) have been developed for handling the enormous increase in data collection in ICUs. This study tries to evaluate the functionality of such systems installed in Europe. Criteria reflecting usefulness and practicality formed the basis of a questionnaire to be answered accurately by the vendors. We then examined functions provided and their implementation in European ICUs. Next, an "Information Delivery Test" evaluated variations in performance, taking questions arising from daily routine work and measuring the time of information delivery. The ICUs were located in Vienna (Austria), Antwerp (Belgium), Dortmund (Germany), and Kuopio (Finland). 5 PDMS were selected on the basis of our inclusion criteria: commercial availability with at least one installation in Europe, bedside-based design, realization of international standards, and a prescribed minimum of functionality. The "Table of Functions" shows an overview of functions and their implementation. "System Analyses" indicates predominant differences in properties and functions found between the systems. Results of the "Information Delivery Tests" are shown in the graphic charts. Systems with graphical data presentation have advantages over systems presenting data mainly in numeric format. The time has come to form a medical establishment powerful enough to set standards and thus communicate with the industrial partners as well as with hospital management responsible for planning, purchasing, and implementing PDMS. Overall, communication between clinicians, nurses, computer scientists, and PDMS vendors must be enhanced to achieve the common goal: useful and practical data management systems in ICUs.

Journal ArticleDOI
TL;DR: This paper proposes criteria that would facilitate characterizing, evaluating, and comparing heterogeneous molecular biology database systems and proposes a methodology for evaluating these systems.
Abstract: Molecular biology data are distributed among multiple databases. Although containing related data, these databases are often isolated and are characterized by various degrees of heterogeneity: they usually represent different views (schemas) of the scientific domain and are implemented using different data management systems. Currently, several systems support managing data in heterogeneous molecular biology databases. Lack of clear criteria for characterizing such systems precludes comprehensive evaluations of these systems or determining their relationships in terms of shared goals and facilities. In this paper, we propose criteria that would facilitate characterizing, evaluating, and comparing heterogeneous molecular biology database systems. Key words: characterization criteria, heterogeneous database systems, molecular biology databases

Patent
28 Jun 1995
TL;DR: In this paper, the authors present an apparatus for adaptable performance evaluation of an application including queries by analytical resolution of a data base, and operating on an information processing system having a given architecture, including a library for knowledge of the specific environment of the data base.
Abstract: An apparatus for adaptable performance evaluation of an application including queries by analytical resolution of a data base, and operating on an information processing system having a given architecture, including a library for knowledge of the specific environment of a data base. The knowledge library, in an environment specification language, includes an architecture library for modeling hardware architectures; a system library modeling the operational and transactional systems supported by the hardware architectures; an access and operation method library modeling the algorithms used by the data management system; and a library of data base profiles collecting knowledge on the data base layout and statistics on the user application. A local optimizer uses a performance evaluator to evaluate the application and select an optimal plan for the execution thereof using the information in the knowledge library including information on the given architecture.
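The four sub-libraries the patent enumerates can be sketched as a simple structure consulted by a toy cost estimator. The field names and the cost formula below are illustrative assumptions, not the patent's evaluator.

```python
# Toy knowledge library: architecture models, system parameters, access
# method costs, and database profiles, consulted to estimate a plan's cost.
from dataclasses import dataclass

@dataclass
class KnowledgeLibrary:
    architecture: dict     # hardware models, e.g. {"disk_ms_per_page": 8.0}
    system: dict           # operational/transactional system parameters
    access_methods: dict   # cost models for the DBMS's algorithms
    db_profile: dict       # layout and statistics on the user application

def estimate_cost(lib, table, selectivity):
    """Crude I/O cost estimate: pages touched times per-page disk time."""
    pages = lib.db_profile[table]["pages"] * selectivity
    return pages * lib.architecture["disk_ms_per_page"]

lib = KnowledgeLibrary({"disk_ms_per_page": 8.0}, {}, {},
                       {"orders": {"pages": 5000}})
print(estimate_cost(lib, "orders", 0.1))   # -> 4000.0 ms, a toy estimate
```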

Book
03 Oct 1995
TL;DR: This book explores the design of advanced multimedia systems in depth - the characteristics, design challenges, emerging technologies, methodologies, and implementation techniques - and presents a number of new concepts and methodologies.
Abstract: Informative as well as tutorial, this book explores the design of advanced multimedia systems in depth - the characteristics, design challenges, emerging technologies, methodologies, and implementation techniques. Using coded modules to illustrate design aspects, it covers the underlying data management system, specialized hardware and software, and an advanced user interface - and presents a number of new concepts and methodologies.

Journal ArticleDOI
01 Dec 1995
TL;DR: A wireless client/server computing architecture will be discussed for the delivery of PISA, and data management issues such as transactional services and cache consistency will be examined under this architecture.
Abstract: We are witnessing a profound change in the global information infrastructure that has the potential to fundamentally impact many facets of our life. An important aspect of the evolving infrastructure is the seamless, ubiquitous wireless connectivity which engenders continuous interactions between people and interconnected computers. A challenging area of future ubiquitous wireless computing is the area of providing mobile users with integrated Personal Information Services and Applications (PISA). In this paper, a wireless client/server computing architecture will be discussed for the delivery of PISA. Data management issues such as transactional services and cache consistency will be examined under this architecture.

Patent
03 May 1995
TL;DR: In this paper, a generic database (GDB) having a central control system (CU) and generic data modules (GM) is described, with the CU providing all the functionality to carry out access control and data maintenance in cooperation with the central controller system, and the generic modules allowing a user-individual form of the user data.
Abstract: A data management system should relieve the user of data management tasks as much as possible and nevertheless be flexible with respect to user-individual forms of the user data. This aim is achieved according to the invention by a generic database (GDB) having a central control system (CU) and generic data modules (GM), the generic data modules already containing all the functionality to carry out access control and data maintenance in cooperation with the central control system, and the generic data modules nevertheless allowing a user-individual form of the user data.

Journal ArticleDOI
TL;DR: During the hantavirus outbreak, computer technology became part of the problem; it initially prevented good data management and may have hindered some of the laboratory and epidemiologic efforts to control the outbreak.
Abstract: Data Management Issues for Emerging Diseases. Since 1976, when Legionnaires' disease affected attendees at the American Legion Convention in Philadelphia (1), the scope of public health has expanded. During the 1976 outbreak investigation, public attention was drawn to news accounts of the increasing numbers of cases and deaths as well as to speculation about the disease's causes and prevention. After the outbreak, public health officials contended with volumes of information, including clinical data, epidemiologic survey results, and records of specimens collected from patients and the environment. This information was managed on mainframe computers.

In 1980, a cluster of cases of unrecognized illness, primarily affecting young women, created a data management situation similar to that surrounding the Legionnaires' disease outbreak. A major epidemiologic investigation, which included examining a multitude of laboratory specimens and analyzing volumes of data, was undertaken by a large team of federal, state, and local public health officials, as well as numerous academic institutions and private industries. The problems with establishing databases and implementing a data management system for toxic shock syndrome (2) were essentially the same as the data management problems of Legionnaires' disease, except that computer technology had crept forward slightly in public health offices.

During the spring of 1993, a cluster of cases of another unknown illness, eventually attributed to hantavirus (3), occurred in the southwestern United States. The reaction to this unknown disease by public health officials reflected a startling fact: even though the epidemiologic and laboratory methods for curtailing the outbreak were in place, a consistent data management strategy had not been established. Ad hoc databases built by outbreak investigators for a multitude of purposes began to bog down the investigation. Cases were recorded in multiple databases that did not recognize duplicate reports of cases. Updates of data about cases were done in some, but not all, databases. Laboratory data about specimens from patients were not linked to other clinical and epidemiologic data about a patient. No single database was available with well-edited, complete data about all the cases. Parallel, fragmented data management efforts evolved in at least 15 locations, with no coordinated mechanism to integrate them into one system.

Introducing a single system for data management in the midst of the hantavirus outbreak involved more than the data management issues encountered in the earlier outbreaks. Previously, computer technology was viewed as a solution that, although somewhat cumbersome, enabled officials to move from data management by hand to electronic management. However, during the hantavirus outbreak, computer technology became part of the problem; it initially prevented good data management and may have hindered some of the laboratory and epidemiologic efforts to control the outbreak. Data were essentially locked into various databases and could not be adequately analyzed or merged with data in other databases. In some instances, this peculiar circumstance caused investigators to perform analyses by hand using printouts from electronic databases or to enter data again into other systems.

In recent years, legal considerations, such as the Privacy Act enacted in 1974 and the Freedom of Information Act enacted in 1966 (4,5), have also complicated data management. These acts, in their efforts to protect individual privacy and ensure availability of data, have in some cases constrained public health responses to emergency situations and subsequent surveillance efforts by enforcing strict database design and handling requirements.

Proceedings ArticleDOI
02 Dec 1995
TL;DR: The modeling requirements for a virtual enterprise are described and it is shown how a global, mediated VE conceptual model can be constructed at build-time and be used by a knowledge base management system (KBMS) to provide run-time support for the operation of a virtual enterprise.
Abstract: The main objective of a virtual enterprise (VE) is to allow a number of organizations to rapidly develop a working environment to manage a collection of resources contributed by the organizations toward the attainment of some common goals. One of the key requirements of a virtual enterprise is to develop an information infrastructure to support the interoperability of distributed and heterogeneous systems for controlling and conducting the business of the virtual enterprise. In order to achieve the objective and to meet this requirement, it is necessary to model all things of interest to a virtual enterprise such as data, human and hardware resources, organizational structures, business constraints, production processes, and activities in work management. Additionally, a system is needed to manage the meta-information and the shared data and to provide both build-time and run-time services to the heterogeneous systems to achieve their interoperability. In this paper, we describe the modeling requirements for a virtual enterprise and show how a global, mediated VE conceptual model can be constructed at build-time and be used by a knowledge base management system (KBMS) to provide run-time support for the operation of a virtual enterprise. The KBMS differs from the traditional database management system (DBMS) in that it provides not only the traditional database management services (such as persistent object management, transaction management, etc.), but also a set of knowledge base management services. Most notably, the KBMS provides a request/event monitoring service which monitors the invocation of methods and automatically triggers the processing of rules by a rule processing service when certain methods are invoked. We also describe how we apply the KBMS technology in the R&D efforts of a project called the National Industrial Information Infrastructure Protocols (NIIIP) to provide rule-based interoperability among heterogeneous systems. Acknowledgement: This work is supported by the Advanced Research Projects Agency under ARPA Order #B76100. It is a part of the R&D effort of the NIIIP Consortium. The ideas and techniques presented here are those of the authors and do not necessarily represent the opinion of other NIIIP Consortium members.
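The request/event monitoring idea, where method invocations automatically trigger rule processing, can be sketched with a wrapper around methods. The method, rule, and registry below are invented illustrations, not the NIIIP implementation.

```python
# Toy event monitoring service: invoking a wrapped method fires any rules
# registered for it, in the spirit of the KBMS's rule processing service.
rules = {}   # method name -> list of rule callbacks

def on_invoke(method):
    def wrapper(*args, **kwargs):
        result = method(*args, **kwargs)
        for rule in rules.get(method.__name__, []):   # fire attached rules
            rule(*args, **kwargs)
        return result
    return wrapper

class Order:
    @on_invoke
    def place(self, qty):
        print(f"order placed: {qty}")

rules["place"] = [lambda self, qty: qty > 100 and print("rule: notify partner")]
Order().place(150)   # the method runs, then the monitor fires the rule
```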

Journal Article
TL;DR: In this paper, the authors introduce the concept of precision farming and discuss the role of geographic information systems (GIS) as a centralized data management and analysis tool for agricultural applications, arguing that GIS is essential to such a management paradigm.
Abstract: Agricultural ecosystems are inherently variable entities. To manage spatial variability, modern farmers are looking for advanced technological solutions. Management strategies incorporating remote sensing, the Global Positioning System (GPS), and variable rate treatment (VRT) offer the possibility of positioning inputs exactly in order to optimize farm returns and minimize chemical inputs and environmental hazards. The use of geographic information systems (GIS) is essential for such a management paradigm. This paper introduces the concept of precision farming and discusses the role of GIS as a centralized data management and analysis tool. Results from a survey of the use of GIS in precision farming are included in order to determine strengths and weaknesses of the technology and to provide impetus for improvements in GIS to support agricultural applications.


Patent
Tadashi Okamoto
06 Apr 1995
TL;DR: In this paper, the authors present a video-on-demand system, consisting of an image data storage device, including a plurality of recording media, which stores image data for a pluralityof programs divided into blocks which are distributed among all of the recording media.
Abstract: The present invention is a video-on-demand system comprising an image data storage device, an image data management device, and an image data transmission device. The storage device includes a plurality of recording media and stores image data for a plurality of programs, divided into a plurality of blocks which are distributed among all of the recording media. The management device maintains image data management tables in which management information for the image data is stored: a title table corresponding the title of each program with its starting block identification information, a block table corresponding the identification information of each block with its starting frame identification information, and a frame table corresponding the identification information of each frame with position information showing at which position on which recording medium the frame is stored. When there is a request from outside the device for information about one of the programs, the management device consults these tables and transmits the necessary information separately for each block. The transmission device requests from the management device the information necessary for transmission of a program requested by a user, receives that information, retrieves the image block data separately for each block from the recording media, and transmits the image block data to the user.
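The three-table lookup chain the abstract describes (title to starting block, block to starting frame, frame to medium and position) can be sketched directly. The toy tables below, including the next-block links, are illustrative assumptions about how the chain is traversed.

```python
# Toy versions of the patent's three management tables, with one program
# whose blocks are spread across two recording media.
title_table = {"MOVIE-1": "B0"}                         # title -> first block
block_table = {"B0": ("F0", "B1"), "B1": ("F2", None)}  # block -> (start frame, next block)
frame_table = {"F0": ("disk0", 0), "F1": ("disk0", 1),  # frame -> (medium, position)
               "F2": ("disk1", 0), "F3": ("disk1", 1)}

def locate_blocks(title):
    """Yield (block, medium, position) for each block of a program."""
    block = title_table[title]
    while block is not None:
        start_frame, nxt = block_table[block]
        medium, pos = frame_table[start_frame]
        yield block, medium, pos
        block = nxt

print(list(locate_blocks("MOVIE-1")))
# -> [('B0', 'disk0', 0), ('B1', 'disk1', 0)] : blocks on different media
```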

Book ChapterDOI
01 Jan 1995
TL;DR: It is shown that by viewing the network as a conceptual global database, the six management functionalities can be performed in a declarative fashion through the specification of management functionalities as data manipulation statements.
Abstract: The purpose of a network management system is to provide smooth functioning of a large heterogeneous network through monitoring and controlling of network behavior. ISO/OSI has defined six management functionalities that aid in the overall management of a network: configuration, fault, performance, security, directory, and accounting management. These management functionalities provide tools for the overall graceful functioning of the network on both a day-to-day and a long-term basis. All of the functionalities entail dealing with huge volumes of data, so network management is, in a sense, management of data, just as a DBMS is used to manage data. It is precisely our purpose in this paper to show that, by viewing the network as a conceptual global database, the six management functionalities can be performed in a declarative fashion through the specification of management functionalities as data manipulation statements.
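The paper's thesis lends itself to a direct sketch: load network state into a relational view and express management functionalities as queries. The schema, data, and the two example queries (fault and performance management) are illustrative assumptions, not the paper's own schema.

```python
# Toy "network as a conceptual global database": management tasks become
# declarative data manipulation statements instead of device-by-device polling.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE links(src TEXT, dst TEXT, status TEXT, util REAL)")
db.executemany("INSERT INTO links VALUES (?, ?, ?, ?)",
               [("r1", "r2", "up", 0.42), ("r2", "r3", "down", 0.0),
                ("r1", "r3", "up", 0.93)])

# Fault management as a query over the conceptual global database:
faults = db.execute("SELECT src, dst FROM links WHERE status = 'down'").fetchall()
print(faults)   # -> [('r2', 'r3')]

# Performance management in the same declarative style:
hot = db.execute("SELECT src, dst FROM links WHERE util > 0.9").fetchall()
print(hot)      # -> [('r1', 'r3')]
```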