
Showing papers on "Data management published in 1991"


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the relationship between organizational quality context and actual and ideal quality management using data collected from 152 managers from 77 business units of 20 manufacturing and service companies. Managers' perceptions of ideal and actual quality management were measured in terms of eight critical factors, including product/service design, training, employee relations, and top management leadership.
Abstract: While the quality literature abounds with prescriptions for how quality should be managed, no one has proposed an organization-theory explanation for how quality is managed in organizations. This paper proposes a system-structural model of quality management that relates organizational quality context, actual quality management, ideal quality management, and quality performance. The relationships between organizational quality context and actual and ideal quality management are investigated using data collected from 152 managers from 77 business units of 20 manufacturing and service companies. A previously reported instrument is used to measure managers' perceptions of ideal and actual quality management in terms of eight critical factors including product/service design, training, employee relations, and top management leadership. Several measures are used to characterize organizational quality context including company type, company size, degree of competition, and corporate support for quality. The results indicate that organizational quality context influences managers' perceptions of both ideal and actual quality management. This suggests that knowledge of organizational quality context is useful for explaining and predicting quality management practice. Important contextual variables are corporate support for quality, past quality performance, managerial knowledge, and the extent of external quality demands.

354 citations


Journal ArticleDOI
TL;DR: Requirements imposed on both the object data model and object management by the support of complex objects are outlined and object-oriented models are compared with semantic, relational, and Codasyl models.
Abstract: Requirements imposed on both the object data model and object management by the support of complex objects are outlined. The basic concepts of an object-oriented data model are discussed. They are objects and object identifiers, aggregation, classes and instantiation mechanisms, metaclasses, and inheritance. Object-oriented models are compared with semantic, relational, and Codasyl models. Object-oriented query languages and query processing are considered. Some operational aspects of data management in object-oriented systems are examined. Schema evolution is discussed.

257 citations


Proceedings ArticleDOI
03 Apr 1991
TL;DR: It is shown that many query processing algorithms can manipulate compressed data just as well as decompressed data, and that processing compressed data can speed query processing by a factor much larger than the compression factor.
Abstract: Data compression is widely used in data management to save storage space and network bandwidth. The authors outline the performance improvements that can be achieved by exploiting data compression in query processing. The novel idea is to leave data in compressed state as long as possible, and to only uncompress data when absolutely necessary. They show that many query processing algorithms can manipulate compressed data just as well as decompressed data, and that processing compressed data can speed query processing by a factor much larger than the compression factor.
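As a hedged illustration of the "keep data compressed as long as possible" idea, the sketch below evaluates an equality selection directly on dictionary-encoded codes and decompresses only the matching rows; the encoding scheme and all names are assumed for illustration and are not taken from the paper.

```python
# Minimal sketch (not the paper's implementation): dictionary compression with a
# selection predicate evaluated on the compressed codes only.

def dictionary_encode(values):
    """Map each distinct value to a small integer code."""
    dictionary = {}
    codes = []
    for v in values:
        codes.append(dictionary.setdefault(v, len(dictionary)))
    decode = {code: value for value, code in dictionary.items()}
    return codes, dictionary, decode

def select_equals(codes, dictionary, constant):
    """Evaluate 'column = constant' without decompressing any row.

    The constant is compressed once; each row then costs a single
    integer comparison.
    """
    target = dictionary.get(constant)
    if target is None:          # constant never occurs: empty result immediately
        return []
    return [i for i, c in enumerate(codes) if c == target]

if __name__ == "__main__":
    column = ["red", "blue", "red", "green", "blue", "red"]
    codes, dictionary, decode = dictionary_encode(column)
    hits = select_equals(codes, dictionary, "red")
    print(hits)                              # row positions: [0, 2, 5]
    print([decode[codes[i]] for i in hits])  # decompress only the result rows
```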

193 citations


Book
31 Jul 1991
TL;DR: This book explains how GIS encourages the integrated activity of multiple departments to solve problems more comprehensively and systematically, while dramatically reducing the redundancies and inconsistencies that occur daily in research and industry.
Abstract: Geographic information systems (GIS) enable governments, public agencies and utilities, scientists and commercial organizations to access and integrate data from several unrelated databases at once to share information more easily. Now a group of leading GIS experts offer the first comprehensive implementation guide to geographic information management technology for mapping, storing, acquiring and manipulating spatial data. Designed to advance the understanding and implementation of this growing technology, it explains how GIS encourages the integrated activity of multiple departments to solve problems more comprehensively and systematically, while dramatically reducing the redundancies and inconsistencies that occur daily in research and industry. This book reaches out to novices by clearly answering the most frequently asked questions about the technology, defusing misconceptions and calming typical user anxieties along the way. While some knowledge of computer and data processing concepts is helpful, it is not essential to an understanding of the book - a glossary provides full explanations of basic computer terms and concepts. At the same time it serves veteran GIS practitioners by thoroughly addressing such central concerns as database development, maintenance and access, cost-benefit analysis, selecting and upgrading hardware and software packages, minimizing the "cultural" impact of database sharing, and dealing with corporate and legal issues. This book should be of interest to earth scientists, computer scientists, and information technologists.

190 citations


Journal ArticleDOI
TL;DR: It is shown how NuMon, a seismological analysis system for monitoring compliance with nuclear test-ban treaties, is managed within Meta, a set of tools that solves some longstanding problems of distributed applications.
Abstract: The issues of managing distributed applications are discussed, and a set of tools, the Meta system, that solves some longstanding problems is presented. The Meta model of a distributed application is described. To make the discussion concrete, it is shown how NuMon, a seismological analysis system for monitoring compliance with nuclear test-ban treaties, is managed within the Meta framework. The three steps entailed in using Meta are described. First, the programmer instruments the application and its environment with sensors and actuators. The programmer then describes the application structure using the object-oriented data modeling facilities of the authors' high-level control language, Lomita. Finally, the programmer writes a control program referencing the data model. Meta's performance and real-time behavior are examined.

127 citations


ReportDOI
01 Jun 1991
TL;DR: In this article, the authors describe a natural language system, START, which analyzes English text and automatically transforms it into an appropriate representation, the knowledge base, which incorporates the information found in the text.
Abstract: This paper describes a natural language system, START. The system analyzes English text and automatically transforms it into an appropriate representation, the knowledge base, which incorporates the information found in the text. The user gains access to information stored in the knowledge base by querying it in English. The system analyzes the query and decides through a matching process what information in the knowledge base is relevant to the question. Then it retrieves this information and formulates its response in English.
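The following toy sketch illustrates only the overall pipeline the abstract describes (analyzed text stored as assertions, queries matched against them); the parsing step, the assertion format, and the function names are assumptions and do not reproduce START's internal representation.

```python
# Toy sketch of the text -> knowledge base -> query-matching flow; parsing of
# English sentences and questions is assumed to have happened already.

knowledge_base = set()

def assimilate(subject, relation, obj):
    """Store one assertion extracted from analyzed text."""
    knowledge_base.add((subject, relation, obj))

def answer(subject, relation):
    """Match a parsed question against stored assertions and phrase a reply."""
    matches = [o for s, r, o in knowledge_base if s == subject and r == relation]
    if not matches:
        return "I don't know."
    return f"{subject} {relation} " + " and ".join(matches) + "."

# Assertion extracted from an (already parsed) input sentence:
assimilate("the satellite", "transmits", "telemetry data")
print(answer("the satellite", "transmits"))   # the satellite transmits telemetry data.
```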

74 citations


Book ChapterDOI
TL;DR: The chapter discusses examples of scientific visualization for facilitating CS&E research, specifically in the fields of planetary sciences, molecular modeling, mathematics, and medical imaging.
Abstract: Computational science and engineering (CS&E) researchers rely upon a host of high-volume data sources in order to conduct their research. However, they are deluged by the flood of data generated. Using an exclusively numerical format, the human brain cannot interpret gigabytes of data each day, so much information now goes to waste. It is impossible for users to ever quantitatively examine more than a tiny fraction of a solution; that is, it is impossible to investigate the qualitative global nature of numeric solutions. Therefore, the ability to visualize complex computations and simulations is absolutely essential to ensure the integrity of analyses, to provoke insights, and to communicate those insights to others. The chapter focuses on visualization, a method of computing that gives visual form to complex data and enables researchers to analyze those data to uncover new information. The growing importance of CS&E, especially with supercomputer capabilities, is creating a commensurate need for more sophisticated visual representations of natural phenomena across time. This requires the development of new tool sets for image generation, visual communication, and analysis. The chapter discusses examples of scientific visualization for facilitating CS&E research, specifically in the fields of planetary sciences, molecular modeling, mathematics, and medical imaging. The chapter also highlights the current limitations and bottlenecks in visualization technology, focusing on software limitations, data management limitations, hardware limitations, educational limitations, and communication and publication limitations.

63 citations


Patent
28 May 1991
TL;DR: A smart telecommunications supervisor management workstation with a monitoring system provides real-time operation statistics and graphical representation of system operation; a standardized graphic user interface and a mouse-driven, point-and-click interface minimize keyboard entry.
Abstract: A smart telecommunications supervisor management workstation with a monitoring system provides real-time operation statistics and graphical representation of system operation. A standardized graphic user interface and a mouse-driven, point-and-click, user-friendly interface minimize keyboard entry. The management workstation also generates reports using off-the-shelf spreadsheet packages and incorporates data management functions via a highly refined user interface. The management workstation comprises an integrated system for generating alerts based on user-defined criteria for database information. Color-coded or shaded monochrome displays provide ease of viewing. Extensive use of icons allows quick identification and selection of management control functions.

62 citations


Patent
Tadashi Tenma, Kichizo Akashi, Tetsuo Kusuzaki, Mitsuo Sudo, Takayuki Ishii
28 Oct 1991
TL;DR: In this article, a data management system for shop management tasks such as profit management is described, in which purpose-specific databases and an analysis database are provided in addition to the database storing all data.
Abstract: A data management system supporting shop management tasks such as profit management must collect and analyze a large amount and variety of data items. In such a data management system, in addition to a database storing all data, a database associated with each data-utilization purpose and a database for data analysis are provided so as to conduct database-oriented processing, which enables an analysis result suitable for each purpose to be attained easily and which simplifies the system configuration.

58 citations


01 Jan 1991
TL;DR: This paper presents the object management facilities being designed into a next-generation data manager, POSTGRES, which is unique in that it does not invent a new data model for support of objects but chooses instead to extend the relational model with a powerful abstract data typing capability and procedures as full-fledged database objects.
Abstract: This paper presents the object management facilities being designed into a next-generation data manager, POSTGRES. This system is unique in that it does not invent a new data model for support of objects but chooses instead to extend the relational model with a powerful abstract data typing capability and procedures as full-fledged data base objects. The reasons to remain with the relational model are indicated in this paper along with the POSTGRES relational extensions.

50 citations


Journal ArticleDOI
TL;DR: This approach utilizes MACCS-3D's capability of handling data specific for atoms and atom pairs, so that various biological, computational, and spectroscopic data can be merged, allowing scientists from different disciplines to access and use this information more efficiently.
Abstract: In the past decade, the scientific community has realized the value of three-dimensional (3D) structural information and '3D searching' has started to become an important new methodology for computer-aided drug design. During this time, molecular modeling information generated from various sources has proliferated due to the growing availability of software and hardware and the increasing use of crystallographic and spectroscopic techniques. This information needs to be organized to allow for its effective storage and retrieval. This paper presents an approach to address this problem with a recently introduced program, MACCS-3D. In particular, this approach utilizes MACCS-3D's capability of handling data specific for atoms and atom pairs. With this software, various biological, computational, and spectroscopic data can be merged, allowing scientists from different disciplines to access and use this information more efficiently.


Proceedings ArticleDOI
07 May 1991
TL;DR: The use of data structures stored in linked lists and processed through pointers is described; the pointers and linked lists compact the data storage and reduce algorithm execution times.
Abstract: Electric power distribution circuit analysis programs must efficiently manage a large quantity of system and equipment data. Utility engineers now wish to use integrated software packages with several functions that work efficiently and share data. The use of data structures stored in linked lists and processed through pointers is described. The pointers and linked lists compact the data storage and reduce algorithm execution times. Various algorithms can share data management and graphics functions, decreasing the cost of enhancing and maintaining the code. The approach has been applied to a software package that integrates graphics, data management, analysis, and design algorithms. Several graduate students have developed algorithms as thesis projects, working with the same linked-list data structures.
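A minimal sketch of the linked-list idea, written in Python for brevity (the original package used pointer-based structures in a compiled language); the node fields and the load-summation routine are illustrative, not the paper's code.

```python
# Illustrative sketch: one shared linked structure for a radial distribution feeder
# that several analysis routines could traverse through the same "pointers".

class CircuitNode:
    """One bus/load point in a radial distribution feeder."""
    def __init__(self, name, load_kw=0.0):
        self.name = name
        self.load_kw = load_kw
        self.next = None          # next node on the same feeder section
        self.first_child = None   # head of the list of downstream branches

def total_load(node):
    """Sum load by following the same links any other algorithm would share."""
    if node is None:
        return 0.0
    return node.load_kw + total_load(node.first_child) + total_load(node.next)

substation = CircuitNode("sub")
feeder_a = CircuitNode("A", 120.0)
feeder_b = CircuitNode("B", 80.0)
substation.first_child = feeder_a   # A hangs off the substation
feeder_a.next = feeder_b            # B follows A on the same section
print(total_load(substation))       # 200.0
```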

Journal ArticleDOI
TL;DR: The International Organization for Standardization has drafted the Remote Database Access Standard to provide a protocol that can be universally accepted, implemented, and validated, and thereby provide users with a single, well-defined interface for heterogeneous environments.
Abstract: As an enterprise begins tying heterogeneous systems into a single information system, it encounters the difficult task of interfacing different platforms running different operating systems, network protocols, and data management systems. Bringing all these elements together into a coherent information resource can require substantial effort and might result in a system that needs the interfaces reworked every time a new platform or product is introduced into the system. To address these issues, the International Organization for Standardization has drafted the Remote Database Access Standard to provide a protocol that can be universally accepted, implemented, and validated, and thereby provide users with a single, well-defined interface for heterogeneous environments. Based on a client-server architecture, RDA uses well-defined Open Systems Interconnect services as the basis for the RDA services.

Book ChapterDOI
TL;DR: The role of the NIST National PDES Testbed, providing technical leadership and a testing-based foundation for the development of STEP, is described; the goal is a complete, unambiguous, computer-readable definition of the physical and functional characteristics of a product throughout its life cycle.
Abstract: Concurrent engineering involves the integration of people, systems, and information into a responsive, efficient system. Integration of computerized systems allows additional benefits: automatic knowledge capture during development and lifetime management of a product, and automatic exchange of that knowledge among different computer systems. Critical enablers are product data standards and enterprise integration frameworks. A pioneering assault on the complex technical challenges is associated with the emerging international Standard for the Exchange of Product Model Data (STEP). Surpassing previous standards efforts in scope, the goal is a complete, unambiguous, computer-readable definition of the physical and functional characteristics of a product throughout its life cycle. U.S. government agencies, industrial firms, and standards organizations are cooperating in a program, Product Data Exchange using STEP (PDES), to develop and implement STEP in a shared-database environment. PDES will lead to higher, integrated levels of automation based upon information standards and frameworks. U.S. manufacturers will benefit from concurrent engineering without sacrificing the historical strengths and traditions of individuality, initiative, and intellectual property rights. Concurrent engineering, through information technology and standards, represents the power of a new industrial revolution. The role of the NIST National PDES Testbed, providing technical leadership and a testing-based foundation for the development of STEP, is described.

Journal ArticleDOI
01 Sep 1991
TL;DR: An extension of the method that was used for building and using the network-based knowledge system SUPER is proposed to fully utilize the benefits of this approach in the domain of diagnosing distributed, dynamically evolving processes.
Abstract: An extension of the method that was used for building and using the network-based knowledge system SUPER is proposed to fully utilize the benefits of this approach in the domain of diagnosing distributed, dynamically evolving processes. In the scope of distributed sensor networks, several issues are addressed concerning software architectures, strategies, and properties for efficient sensor data management: (1) how to build links efficiently between the elements of each network-based knowledge base, (2) how to maintain consistency of the whole structure and manage the constraints of domain-dependent variables corresponding to sensor data, (3) how to manage and update the database efficiently at run time in order to maintain its consistency and to satisfy a high level of response-time performance, and (4) how to propagate the solutions given by a qualitative knowledge base into a knowledge base utilizing sensor data whenever the sensors are out of order. The answers given are based on extending and generalizing the principles that have been defined for SUPER.

Proceedings ArticleDOI
12 May 1991
TL;DR: A patient data management system developed for use in the intensive care unit (ICU) of the Montreal Children's Hospital is described, and the relational database design is expected to solve the query shortcomings of the previous data management structure, as well as offer support for security and concurrency.
Abstract: A patient data management system (PDMS) developed for use in the intensive care unit (ICU) of the Montreal Children's Hospital is described. The PDMS acquires real-time patient data from a network of physiological bedside monitors, and facilitates the review and interpretation of this data by presenting it as graphical trends, charts, and plots on a color video display. The data management structure integrates varied data types and provides database support for different applications, while preserving the real-time acquisition of network data. This structure is based primarily on OS/2 Extended Edition relational database. The relational database design is expected to solve the query shortcomings of the previous data management structure, as well as offer support for security and concurrency.

Book ChapterDOI
01 Feb 1991
TL;DR: This paper proposes to have multimedia data accompanied by natural language descriptions that will be used for content search of these data, and a parser is used to interpret the descriptions and to match them semantically with queries.
Abstract: Advanced applications frequently require the management of multimedia data like text, images, graphics, sound, etc. While it is practical for today's computers to store these types of data, managing them requires the search of them based on their contents. Database management systems should be extended to organize these new types of data and to enable content search. However, the complexity of the contents and their semantics makes content search of these data a very difficult problem. This paper proposes to have multimedia data accompanied by natural language descriptions that will be used for content search of these data. A parser is used to interpret the descriptions and to later match them semantically with queries. Implications and difficulties of such an approach are discussed.
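A small sketch of the proposal's storage side, with plain keyword overlap standing in for the semantic matching that the paper's parser performs; all identifiers and the sample records are assumptions.

```python
# Illustrative sketch: multimedia objects stored with natural-language descriptions,
# searched by matching a query against those descriptions (keyword overlap here,
# semantic matching via a parser in the paper).

media_store = [
    {"id": "img-001", "type": "image",
     "description": "a red sailboat near a lighthouse at sunset"},
    {"id": "snd-017", "type": "sound",
     "description": "rain falling on a tin roof"},
]

def content_search(query, store):
    """Rank stored media by how many query words their descriptions share."""
    query_terms = set(query.lower().split())
    ranked = []
    for item in store:
        overlap = query_terms & set(item["description"].lower().split())
        if overlap:
            ranked.append((len(overlap), item["id"]))
    return [obj_id for _, obj_id in sorted(ranked, reverse=True)]

print(content_search("sailboat at sunset", media_store))   # ['img-001']
```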

Journal Article
TL;DR: An automated and computerized intensive care unit flowsheet and patient chart can reduce nonnursing work and improve the quality, quantity, and recall of clinical information.

Proceedings Article
03 Sep 1991
TL;DR: An algorithm is developed to select derived relations for materialization so that the overall cost of processing the inference rules is minimized while satisfying requirements on query response time.
Abstract: Managing data in large rule systems is a critical issue, and DBMS systems are being extended to support rule-based, data-intensive decision making such as in expert system applications. We suggest selectively materializing the rule-generated data in relations so that rule-based decisions can be made incrementally and automatically when the collected data are updated. An algorithm is developed to select derived relations for materialization so that the overall cost of processing the inference rules is minimized while satisfying requirements on query response time.
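A hedged sketch of the selection problem only; the cost model and the simple per-relation rule below are illustrative and are not the algorithm developed in the paper.

```python
# Illustrative sketch of choosing which derived relations to materialize, under an
# assumed (not the paper's) cost model: materialize when the response-time limit
# forces it, or when maintenance is cheaper than repeated derivation.

def choose_materializations(relations, response_time_limit):
    """relations: name -> {"derive_cost": cost to recompute per query,
                           "maintain_cost": cost to keep it materialized,
                           "query_time": response time if NOT materialized}."""
    chosen = set()
    for name, r in relations.items():
        if r["query_time"] > response_time_limit:
            chosen.add(name)                      # needed to meet response time
        elif r["maintain_cost"] < r["derive_cost"]:
            chosen.add(name)                      # cheaper to keep materialized
    return chosen

rules = {
    "high_risk_accounts": {"derive_cost": 40, "maintain_cost": 5, "query_time": 12},
    "daily_summary":      {"derive_cost": 8,  "maintain_cost": 9, "query_time": 2},
}
print(choose_materializations(rules, response_time_limit=5))  # {'high_risk_accounts'}
```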

Patent
08 Apr 1991
TL;DR: In this article, the authors propose to notify a user of the fact and content of an update, using a mail function, at the time data is updated; a data editing part that edits data stored in a data storage device, a message transmission part (mail transmission part) that transmits an arbitrary message through a network, and a data management device are provided.
Abstract: PURPOSE: To notify a user of the fact and content of an update, using a mail function, at the time data is updated. CONSTITUTION: A data editing part 31, which edits data stored in a data storage device 11, a message transmission part (mail transmission part 35), which transmits an arbitrary message through a network 10, and a data management device 12 are provided. When the data is updated in the data editing part 31, a prescribed update-notification message is prepared by comparing the data before and after the update and extracting the updated part, and this message is transmitted to the workstations of the other users. A user working from the data stored in the data storage device can thus recognize on the spot an update of the data executed by another workstation. COPYRIGHT: (C)1992, JPO&Japio
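A rough sketch of the constitution described above: diff the data before and after an edit, keep only the changed part, and hand it to a stubbed-out mail function; the names and the diff format are assumptions, not the patent's implementation.

```python
# Illustrative sketch: extract the updated part by comparing data before and after
# an edit, then "send" it to other users (the mail transport is only printed here).

import difflib

def build_update_message(before_lines, after_lines):
    """Compare the data before and after the update and keep only the changed part."""
    diff = difflib.unified_diff(before_lines, after_lines, lineterm="")
    return "\n".join(diff)

def notify_users(message, recipients):
    """Stand-in for the mail transmission part."""
    for user in recipients:
        print(f"to {user}:\n{message}\n")

before = ["part A qty 10", "part B qty 4"]
after = ["part A qty 12", "part B qty 4"]
notify_users(build_update_message(before, after), ["workstation-2", "workstation-3"])
```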

01 Jan 1991
TL;DR: The paper discusses research in the Intelligent Data Management project at the NASA/Goddard Space Flight Center, with emphasis on recent improvements in low-level feature detection algorithms for performing real-time characterization of images using neural networks.
Abstract: The paper discusses research in the Intelligent Data Management project at the NASA/Goddard Space Flight Center, with emphasis on recent improvements in low-level feature detection algorithms for performing real-time characterization of images using neural networks. Images, including MSS and TM data, are characterized using neural networks, with the neural network output interpreted by an expert system for subsequent archiving in an object-oriented database. The data show the applicability of this approach to different arrangements of low-level remote sensing channels. The technique works well when the neural network is trained on data similar to the data used for testing.

Journal ArticleDOI
TL;DR: PIF/Gestalt is presented as a test implementation of the toolkit which provides C and Common Lisp language interfaces, implemented on a database for use in a CAD/CIM (computer integrated manufacturing) system for semiconductor process design and fabrication.
Abstract: A formal object-oriented approach to the data structuring and data management of semiconductor wafer structure and device information is presented. The profile interchange format (PIF) is extended beyond a file format 'intersite' version in order to enhance the storage and access of profile information, the communication of profile information between cooperating tools, and the integration and portability of technology CAD (computer-aided design) tools. An intertool PIF toolkit is a programmatic interface to profile information, consisting of a library of objects for the storage and manipulation of data by technology CAD tools. PIF/Gestalt is presented as a test implementation of the toolkit which provides C and Common Lisp language interfaces, implemented on a database for use in a CAD/CIM (computer-integrated manufacturing) system for semiconductor process design and fabrication.

Journal ArticleDOI
TL;DR: An information-based approach to integrate the parallel and serial functions that occur in a typical design and manufacturing environment is presented; the end result is a plantwide database system that integrates the information requirements of multiple groups and disciplines.
Abstract: This paper presents an information-based approach to integrate the parallel and serial functions that occur in a typical design and manufacturing environment. EMTRIS (Engineering and Manufacturing Technical Relational Information System), a relational database application designed using ORACLE [1], was developed to cater to the engineering data needs of practicing designers and engineers working in a large industrial setting. The design and manufacturing activities within a large computer manufacturing plant provided a realistic case-study environment for implementing some of the strategies discussed here. A description of the methodology that was employed in designing and implementing EMTRIS, and of the numerous issues that arise while undertaking such endeavors, is presented. Strategies to coordinate the data transfer between different groups of individuals, as well as mechanisms to ensure data consistency in such environments, are discussed in this context. Various techniques used in documenting the data flows and the interrelationships between different forms of data are also reviewed. The end result is a plantwide database system that integrates the information requirements of multiple groups and disciplines.

Journal ArticleDOI
TL;DR: The software is a FORTRAN-77 based program written on a VAX 6420 which can access a commercial database management utility, DATATRIEVE (Digital Equipment Corporation, Maynard, MA), and is able to perform several useful data management manipulations.

Journal ArticleDOI
TL;DR: A conceptual database structure that can serve as a basis for storing and retrieving multitemporal remote sensing images within a geographical information system (GIS) is presented and a time domain is introduced into the database management system for handling the image historical information and analyzing the remote sensing data through time.
Abstract: A conceptual database structure that can serve as a basis for storing and retrieving multitemporal remote sensing images within a geographical information system (GIS) is presented. The concept of time representation is described. The characteristics of a relational database structure are discussed. Then a time domain is introduced into the database management system (DBMS) for handling the image historical information and analyzing the remote sensing data through time. To overcome the weakness of current databases, a multitemporal database structure is presented by extending relational algebra into a GIS remote sensing image-management system.
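One way to picture the time domain added to the relational structure is sketched below: image-metadata tuples carry an acquisition date, and a "time slice" selection plus a latest-per-region projection operate on it. The operators and sample data are illustrative and are not the paper's extended relational algebra.

```python
# Illustrative sketch: temporal operations over image-metadata tuples
# (scene_id, sensor, region, acquisition_date); sensor names are examples only.

from datetime import date

images = [
    ("s1", "TM",  "delta", date(1988, 6, 1)),
    ("s2", "TM",  "delta", date(1990, 6, 3)),
    ("s3", "MSS", "delta", date(1991, 5, 30)),
]

def time_slice(relation, start, end):
    """Select tuples whose acquisition time falls in [start, end]."""
    return [t for t in relation if start <= t[3] <= end]

def latest_per_region(relation):
    """Keep the most recent scene of each region (useful for change analysis)."""
    latest = {}
    for t in relation:
        if t[2] not in latest or t[3] > latest[t[2]][3]:
            latest[t[2]] = t
    return list(latest.values())

print(time_slice(images, date(1990, 1, 1), date(1991, 12, 31)))  # s2 and s3
print(latest_per_region(images))                                  # s3
```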

Patent
07 Jun 1991
TL;DR: In this paper, a data management mechanism for a processor system provides for management of data with minimum data transfer between processes executing work requests, where data is described with descriptor elements in the work requests which indicate the location and length of segments of the data.
Abstract: A data management mechanism for a processor system provides for management of data with minimum data transfer between processes executing work requests. Each process has storage areas for storing data associated with work requests. The data is described with descriptor elements in the work requests which indicate the location and length of segments of the data. Data is transferred to a process only if it is required for execution of a work request. Further work requests can be generated by a process executing a work request which reference the data without the process actually receiving the data. The segments of data may reside in storage areas of different processors, with the descriptor elements of a work request defining a logical data stream.
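A hedged sketch of work requests that carry descriptor elements (owner, location, length) instead of the data itself, so data moves only when a request actually needs it; the structures and the two-step example are illustrative, not the patent's implementation.

```python
# Illustrative sketch: descriptors point at data segments in a process's storage
# area; a request can be forwarded by reference, and the bytes are fetched only
# when an operation finally needs them.

from dataclasses import dataclass
from typing import List

storage = {"proc-1": bytearray(b"ORDER#42|CUSTOMER=ACME|ITEMS=3")}  # per-process storage area

@dataclass
class Descriptor:
    owner: str      # which process's storage area holds the segment
    offset: int     # location of the segment
    length: int     # length of the segment

@dataclass
class WorkRequest:
    operation: str
    descriptors: List[Descriptor]   # a logical data stream, possibly spanning processes

def execute(request):
    if request.operation == "forward":
        # The data is only referenced, never copied: the new request reuses
        # the same descriptors.
        return WorkRequest("summarize", request.descriptors)
    if request.operation == "summarize":
        # Only now is the data actually needed, so transfer it.
        stream = b"".join(
            bytes(storage[d.owner][d.offset:d.offset + d.length]) for d in request.descriptors
        )
        return stream.decode()

first = WorkRequest("forward", [Descriptor("proc-1", 0, 8), Descriptor("proc-1", 9, 13)])
second = execute(first)     # forwarded without touching the bytes
print(execute(second))      # ORDER#42CUSTOMER=ACME
```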

Proceedings ArticleDOI
H.P. Tschirky
27 Oct 1991
TL;DR: In this article, the authors present the contours of an integrative concept of technology management, which includes management issues on the normative, strategic, and operational levels of general management.
Abstract: It is pointed out that current approaches to technology management express the need to manage technology systematically, from both strategic and operational perspectives. However, they do not make a conceptual link with general management issues, a link which constitutes an indispensable prerequisite for the adequate acceptance of technology-related decision factors. The author presents the contours of an integrative concept of technology management. Its main features include management issues on the normative, strategic, and operational levels of general management. In addition, it allows for the dynamic development of enterprises. A user surface of the concept which enables management to give an enterprise-specific diagnosis of the state and corresponding requirements of technology management is discussed.

Journal ArticleDOI
Lloyd A. Treinish, C. Goettsche
TL;DR: A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences.
Abstract: Critical to the understanding of data is the ability to provide pictorial or visual representations of those data, particularly in support of correlative data analysis. Despite the advancement of visualization techniques for scientific data over the last several years, there are still significant problems in bringing today's hardware and software technology into the hands of the typical scientist. For example, there are other computer science domains outside of computer graphics, such as data management, that are required to make visualization effective. Well-defined, flexible mechanisms for data access and management must be combined with rendering algorithms, data transformation, etc., to form a generic visualization pipeline. A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences. Different classes of data representation techniques must be used within such a framework, which can range from simple, static two- and three-dimensional line plots to animation, surface rendering, and volumetric imaging. Static examples of actual data analyses illustrate the importance of an effective pipeline in a data visualization system.
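A minimal sketch of the generic pipeline idea (data access, then transformation, then rendering) as composable stages; the stage names are assumptions, and the final stage merely reports what a renderer would draw.

```python
# Illustrative sketch of a visualization pipeline built from interchangeable stages:
# data management feeds transformation, which feeds rendering.

def data_access(source):
    """Data-management stage: return the raw values from some source."""
    return source()

def transform(values, scale=1.0):
    """Transformation stage: e.g., unit conversion or resampling."""
    return [v * scale for v in values]

def render(values):
    """Rendering-stage stand-in: report what would be drawn."""
    return f"drawing {len(values)} samples, range {min(values):.1f}..{max(values):.1f}"

def pipeline(source, stages):
    data = data_access(source)
    for stage in stages:
        data = stage(data)
    return data

print(pipeline(lambda: [270.0, 275.5, 268.2],
               [lambda v: transform(v, scale=1.0), render]))
```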

Journal ArticleDOI
TL;DR: In this paper, the authors describe the quality approach to management in government and contrast it with the classic, often more bureaucratic, approach, and place particular focus on Project Pacer Share, a government total quality management effort underway at McClellan Air Force Base where, using quality tools and techniques, major reforms in the United States Civil Service System and human resource management are being tested.
Abstract: This article describes the quality approach to management in government and contrasts it with the classic, often more bureaucratic, approach. It places particular focus on Project Pacer Share, a government total quality management effort underway at McClellan Air Force Base where, using quality tools and techniques, major reforms in the United States Civil Service System and human resource management are being tested. Specific human resource management initiatives that need to be undertaken to support organizationwide quality performance are presented.