
Showing papers on "Data management published in 1998"


Journal ArticleDOI
TL;DR: The purpose of this TDQM methodology is to deliver high-quality information products (IP) to information consumers and to facilitate the implementation of an organization’s overall data quality policy formally expressed by top management.
Abstract: The field of information quality (IQ) has experienced significant advances during its relatively brief history. Today, researchers and practitioners alike have moved beyond establishing information quality as an important field to resolving IQ problems—problems ranging from IQ definition, measurement, analysis, and improvement to tools, methods, and processes. However, theoretically-grounded methodologies for Total Data Quality Management (TDQM) are still lacking. Based on cumulated research efforts, this article presents such a methodology for addressing these problems. The purpose of this TDQM methodology is to deliver high-quality information products (IP) to information consumers. It aims to facilitate the implementation of an organization’s overall data quality policy formally expressed by top management [10].
Richard Y. Wang

886 citations


Proceedings Article
01 Jul 1998
TL;DR: The goal of the research described here is to automatically create a computer understandable world wide knowledge base whose content mirrors that of the World Wide Web, and several machine learning algorithms for this task are described.
Abstract: The World Wide Web is a vast source of information accessible to computers, but understandable only to humans. The goal of the research described here is to automatically create a computer understandable world wide knowledge base whose content mirrors that of the World Wide Web. Such a knowledge base would enable much more effective retrieval of Web information, and promote new uses of the Web to support knowledge-based inference and problem solving. Our approach is to develop a trainable information extraction system that takes two inputs: an ontology defining the classes and relations of interest, and a set of training data consisting of labeled regions of hypertext representing instances of these classes and relations. Given these inputs, the system learns to extract information from other pages and hyperlinks on the Web. This paper describes our general approach, several machine learning algorithms for this task, and promising initial results with a prototype system.
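The paper's trainable extractor takes two inputs: an ontology of classes and a set of labeled training pages. The machine learning algorithms themselves are not reproduced here, but the basic ontology-guided classification step can be sketched in Python. The class names, training snippets, and the simple overlap-scoring rule below are illustrative assumptions, not the paper's actual method:

```python
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(labeled_pages):
    """labeled_pages: list of (class_name, page_text) pairs.
    Builds a per-class token frequency table from the training data."""
    models = {}
    for cls, text in labeled_pages:
        models.setdefault(cls, Counter()).update(tokenize(text))
    return models

def classify(models, text):
    """Assign a page to the ontology class whose training vocabulary
    overlaps most with the page's tokens (normalized by class size)."""
    tokens = tokenize(text)
    def score(cls):
        freq = models[cls]
        return sum(freq[t] for t in tokens) / sum(freq.values())
    return max(models, key=score)

# Hypothetical ontology classes with tiny labeled training regions:
training = [
    ("faculty", "professor teaching research publications office hours"),
    ("course",  "syllabus lecture homework exam grading schedule"),
]
models = train(training)
print(classify(models, "lecture notes and homework for the exam"))  # → course
```

A real system of this kind would learn from labeled hypertext regions and hyperlink structure, not just bags of words; the sketch only shows the shape of the train-then-extract loop.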

766 citations


Journal ArticleDOI
TL;DR: The SSW system minimizes the learning curve when doing research away from the home institution or when correlating results from multiple experiments, and provides a consistent look and feel at co-investigator institutions.
Abstract: The SolarSoftWare (SSW) system is a set of integrated software libraries, databases and system utilities which provide a common programming and data analysis environment for solar physics. Primarily an IDL based system, SSW is a collection of common data management and analysis routines derived from the Yohkoh and SOHO missions, the Solar Data Analysis Center, the astronomy libraries and other packages. The SSW environment is designed to provide a consistent look and feel at co-investigator institutions and facilitate sharing and exchange of data. The SSW system minimizes the learning curve when doing research away from the home institution or when correlating results from multiple experiments.

692 citations


Book
01 Dec 1998
TL;DR: This chapter discusses the evolution of computer methods for the Handling of Spatial Data, and the role that modelling systems thinking and GIS have in this evolution.
Abstract: 1. What is GIS? 2. Concepts of Space 3. The Evolution of Computer Methods for the Handling of Spatial Data 4. Modelling Systems Thinking and GIS 5. Spatial Data Models 6. Attribute Data Management 7. Data Encoding and Manipulation 8. Data Analysis 9. Data Output 10. Data Quality Issues 11. Organisational Issues 12. Project Design

534 citations


Patent
24 Jun 1998
TL;DR: In this paper, the authors describe a design control system suitable for use in the design of integrated circuits and other elements of manufacture that have many parts which must be developed in a concurrent engineering environment, with inputs provided by users and/or systems located anywhere in the world. The system provides a set of control information for coordinating movement of the design information through development and release, while dynamically tracking the status of elements of the bills of materials. The integrated and coordinated activity control system uses a repository that can be implemented as a database (relational, object oriented, etc.) or as a flat file system.
Abstract: A design control system suitable for use in connection with the design of integrated circuits and other elements of manufacture having many parts which need to be developed in a concurrent engineering environment with inputs provided by users and or systems which may be located anywhere in the world providing a set of control information for coordinating movement of the design information through development and to release while providing dynamic tracking of the status of elements of the bills of materials in an integrated and coordinated activity control system utilizing a repository which can be implemented in the form of a database (relational, object oriented, etc.) or using a flat file system. Once a model is created and/or identified by control information design libraries hold the actual pieces of the design under control of the system without limit to the number of libraries, and providing for tracking and hierarchical designs which are allowed to traverse through multiple libraries. Data Managers become part of the design team, and libraries are programmable to meet the needs of the design group they service.

318 citations


Proceedings ArticleDOI
23 Feb 1998
TL;DR: The WebOQL system is presented, which supports a general class of data restructuring operations in the context of the Web and synthesizes ideas from query languages for the Web, for semistructured data and for Website restructuring.
Abstract: The widespread use of the Web has originated several new data management problems, such as extracting data from Web pages and making databases accessible from Web browsers, and has renewed the interest in problems that had appeared before in other contexts, such as querying graphs, semistructured data and structured documents. Several systems and languages have been proposed for solving each of these Web data management problems, but none of these systems addresses all the problems from a unified perspective. Many of these problems essentially amount to data restructuring: we have information represented according to a certain structure and we want to construct another representation of (part of it) using a different structure. We present the WebOQL system, which supports a general class of data restructuring operations in the context of the Web. WebOQL synthesizes ideas from query languages for the Web, for semistructured data and for Website restructuring.
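WebOQL's actual query syntax is not shown in the abstract; as a rough illustration of what "data restructuring" means here, the following Python sketch takes the same information and rebuilds it under a different structure. The site names, fields, and data are invented for illustration:

```python
# Hypothetical extracted web data, grouped by the site it came from:
pages = [
    {"site": "db.example.edu", "papers": [
        {"title": "Querying Graphs", "topic": "semistructured"},
        {"title": "Web Wrappers",    "topic": "extraction"},
    ]},
    {"site": "ir.example.org", "papers": [
        {"title": "Link Analysis",   "topic": "semistructured"},
    ]},
]

# Restructure: the same information, regrouped by topic instead of by site.
by_topic = {}
for page in pages:
    for paper in page["papers"]:
        by_topic.setdefault(paper["topic"], []).append(
            {"title": paper["title"], "site": page["site"]})

print(sorted(by_topic))                  # → ['extraction', 'semistructured']
print(len(by_topic["semistructured"]))   # → 2
```

WebOQL expresses such transformations declaratively over "hypertrees" rather than with explicit loops; the sketch only conveys the input/output relationship.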

296 citations


Patent
25 Sep 1998
TL;DR: An Intranet/Internet/Web-based data management tool that provides a common GUI enabling the requesting, customizing, scheduling and viewing of various types of priced call detail data reports pertaining to a customer's usage of telecommunications services is presented in this article.
Abstract: An Intranet/Internet/Web-based data management tool that provides a common GUI enabling the requesting, customizing, scheduling and viewing of various types of priced call detail data reports pertaining to a customer's usage of telecommunications services. The Web-based reporting system tool comprises a novel Web-based, client-server application integrated with an operational data management/storage infrastructure that enables customers to access their own relevant data information timely, rapidly and accurately through the GUI client interface. The data management system infrastructure is designed to enable the secure initiation, acquisition, and presentation of telecommunications priced call detail data reports to customer workstations implementing a web browser.

213 citations


Patent
24 Feb 1998
TL;DR: In this paper, a user interface and data management procedures for the efficient display, manipulation and analysis of multi-attributed data or data amenable to multidimensional display and management are presented.
Abstract: This invention discloses a user interface and data management procedures for the efficient display, manipulation and analysis of multi-attributed data or data amenable to multidimensional display, manipulation and management. The invention is centered on the construction and use of data carrousels comprising one or more n-gons, where each n-gon can be a layered n-gon or a solid, or each side of each n-gon can be a single face of an embedded n-gon.

203 citations



Patent
10 Aug 1998
TL;DR: A real-time data management system which uses data generated at different rates, by multiple heterogeneous incompatible data sources, is presented in this paper; one embodiment is an airport surface traffic adviser that facilitates information sharing and improves taxi queuing.
Abstract: A real-time data management system which uses data generated at different rates, by multiple heterogeneous incompatible data sources. In one embodiment, the invention is as an airport surface traffic data management system (traffic adviser) that electronically interconnects air traffic control, airline, and airport operations user communities to facilitate information sharing and improve taxi queuing. The system uses an expert system to fuse data from a variety of airline, airport operations, ramp control, and air traffic control sources, in order to establish, predict, and update reference data values for every aircraft surface operation.

156 citations



Journal ArticleDOI
TL;DR: The role of advanced communications and computing technologies, coupled with analytic procedures and models, is discussed in this paper, emphasizing the need for both pre- and post-event strategies and policies.
Abstract: Successful emergency management requires a better understanding of events with potentially disastrous consequences, a comprehensive, holistic view of managing such events, and the effective use of technology. This guest editorial for this special issue of this TRANSACTIONS provides the raison d'être for a new field of emergency management and engineering. It provides a systems view of emergency management, emphasizing the need for both pre- and post-event strategies and policies. The role of advanced communications and computing technologies, coupled with analytic procedures and models, is discussed. This paper concludes with the recognition of the need for emergency managers to be able to utilize these technologies.

Patent
02 Oct 1998
TL;DR: In this article, a data management system user interface allows users to enter, store, retrieve, and display multiple, related groups of information in a single document by loading document data into a separate template (403) which defines various fields, and the interface determines the fields that should be displayed based on user supplied information.
Abstract: A data management system user interface allows users to enter, store, retrieve, and display multiple, related groups of information in a single document. The interface loads document data (402) into a separate template (403) which defines various fields, and the interface determines the fields that should be displayed based on user supplied information. The user enters an unlimited amount of data into each field, creating a free-flowing document. The user can create groups of entries for each field. The interface also contains a data validation and error correction feature that provides automatic correction, and allows the user to save a draft document with a list of errors for future correction.

01 Jan 1998
TL;DR: In this paper, the authors present an industry-academe collaborative study to account for existing knowledge management facilities, approaches and technology; explore alternative responsibility and role sharing scenarios; define a broad system architecture; and propose strategies to promote the framework across SAP, its clients and partners.
Abstract: Strategic alliances between ERP software vendors, their implementation partners and clients, for knowledge sharing and integrated knowledge management across the ERP life-cycle, hold promise for leveraging scarce expertise and human resources, thereby streamlining implementation and promoting growth of the market. SAP's significant market presence offers a unique opportunity to encourage such a tripartite ERP knowledge management framework. This industry-academe collaborative study will account for existing knowledge management facilities, approaches and technology; explore alternative responsibility and role sharing scenarios; define a broad system architecture; and propose strategies to promote the framework across SAP, its clients and partners.

Patent
25 Sep 1998
TL;DR: The Intranet/Internet/Web-based data management tool as discussed by the authors provides a common GUI for requesting, customizing, scheduling and viewing of various types of unpriced call detail data reports pertaining to a customer's telecommunications network traffic.
Abstract: An Intranet/Internet/Web-based data management tool (17) that provides a common GUI (207) enabling the requesting, customizing, scheduling and viewing of various types of unpriced call detail data reports pertaining to a customer's telecommunications network traffic (22). The Intranet/Internet/Web-based (17) reporting system application comprises a novel Web-based, client-server application that enables customers to access their own relevant data information timely, rapidly and accurately through a client GUI. Data are acquired periodically from the customer's telecommunications network (22) at a user-specified frequency, configured to meet real-time traffic reporting requirements (34). The system infrastructure provided enables secure initiation, acquisition, and presentation of unpriced call detail and statistical data reports to customers.

01 Jan 1998
TL;DR: This paper presents four views of knowledge: access to information, repositories of information, sets of rules, and knowing/understanding.
Abstract: This paper presents four views of knowledge: access to information, repositories of information, sets of rules, and knowing/understanding. Examples are given of how these definitions are both enabling and constraining the application of information technology to the area of knowledge management, and a call is made for multiple views of knowledge so that future systems may achieve greater success.

Patent
04 Jun 1998
TL;DR: In this article, a data management system and method that enables acquisition, integration and management of real-time data generated at different rates, by multiple, heterogeneous incompatible data sources is presented.
Abstract: A data management system and method that enables acquisition, integration and management of real-time data generated at different rates, by multiple, heterogeneous incompatible data sources. The system achieves this functionality by using an expert system to fuse data from a variety of airline, airport operations, ramp control, and air traffic control tower sources, to establish and update reference data values for every aircraft surface operation. The system may be configured as a real-time airport surface traffic management system (TMS) that electronically interconnects air traffic control, airline, and airport operations data to facilitate information sharing and improve taxi queuing. In the TMS operational mode, empirical data show substantial benefits in ramp operations for airlines, reducing departure taxi times by about one minute per aircraft in operational use, translating into $12 to $15 million per year in savings to airlines at the Atlanta, Georgia airport. The data management system and method may also be used for scheduling the movement of multiple vehicles in other applications, such as marine vessels in harbors and ports, trucks or railroad cars in ports or shipping yards, and railroad cars in switching yards. Finally, the data management system and method may be used for managing containers at a shipping dock, stock on a factory floor or in a warehouse, or as a training tool for improving situational awareness of FAA tower controllers, ramp and airport operators, or commercial airline personnel in airfield surface operations.
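The patent's core mechanism is rule-based fusion of updates from heterogeneous sources into shared reference values. A minimal Python sketch of that idea follows; the source names, priority ordering, and the "higher-priority source wins per field" rule are illustrative assumptions, not the patent's actual expert-system rules:

```python
# Hypothetical source priorities: the tower outranks ramp control,
# which outranks airline operations data.
SOURCE_PRIORITY = {"atc_tower": 3, "ramp_control": 2, "airline_ops": 1}

def fuse(reference, updates):
    """updates: list of (source, field, value) tuples.
    For each field, a higher-priority source overrides lower-priority
    ones; the fused reference record is updated in place."""
    best = {}  # field -> priority of the source currently holding it
    for source, field, value in updates:
        p = SOURCE_PRIORITY[source]
        if p >= best.get(field, 0):
            reference[field] = value
            best[field] = p
    return reference

ref = {"flight": "NW123"}
fuse(ref, [
    ("airline_ops",  "gate", "B7"),
    ("ramp_control", "gate", "B9"),      # higher priority: overrides B7
    ("airline_ops",  "status", "taxiing"),
])
print(ref["gate"])    # → B9
print(ref["status"])  # → taxiing
```

A real traffic adviser would also weigh recency and data quality, and would predict values when no source reports; the sketch shows only the per-field override structure.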


Journal ArticleDOI
TL;DR: The architecture of an engineering data management (EDM) system is described, which consists of an integrated product database and six STEP-compatible data models constructed to demonstrate the integrability of the EDM system using a common data modeling format.
Abstract: In an iterative design process, there is a large amount of engineering data to be processed. Well-managed engineering data can ensure the competitiveness of companies in the market. It has been recognized that a product data model is the basis for establishing an engineering database. To fully support the complete product data representation in its life cycle, an international product data representation and exchange standard, STEP, is applied to model the representation of a product. In this paper, the architecture of an engineering data management (EDM) system is described, which consists of an integrated product database. Six STEP-compatible data models are constructed to demonstrate the integrability of the EDM system using a common data modeling format. These data models are product definition, product structure, shape representation, engineering change, approval, and production scheduling. They are defined according to the integrated resources of STEP/ISO 10303 (Parts 41–44), which support a complete product information representation and a standard data format. Thus, application systems, such as CAD/CAM and MRP systems, can interact with the EDM system by accessing the database based on the STEP data exchange standard.
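To illustrate how STEP-style integrated data models can be mirrored as application classes, here is a Python sketch using three of the six models named above. The attribute names are illustrative assumptions, not the actual ISO 10303 Part 41–44 entity definitions:

```python
from dataclasses import dataclass, field

@dataclass
class ProductDefinition:
    """Identifies a product and its description (cf. STEP product definition)."""
    product_id: str
    description: str

@dataclass
class ProductStructure:
    """Parent/child assembly relationships (a bill-of-materials fragment)."""
    parent: str
    children: list = field(default_factory=list)

@dataclass
class EngineeringChange:
    """A change request against a product, pending approval."""
    change_id: str
    affected_product: str
    approved: bool = False

# Different application systems (e.g. CAD/CAM, MRP) would share these
# models through the integrated product database:
engine = ProductDefinition("P-100", "turbine assembly")
bom = ProductStructure(parent="P-100", children=["P-101", "P-102"])
ec = EngineeringChange("EC-7", affected_product="P-101")

print(len(bom.children))  # → 2
```

The point of the common format is that a CAD system writing `ProductStructure` records and an MRP system reading them agree on one schema instead of pairwise converters.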

Journal ArticleDOI
Bin Srinidhi
TL;DR: Notes that to be effectively implemented, quality management has to be aligned with strategy and properly co-ordinated and suggests that the congruence management framework can help make quality management more effective.
Abstract: Notes that to be effectively implemented, quality management has to be aligned with strategy and properly co-ordinated. Develops a systems framework entitled congruence management business architecture. Notes that under this architecture an activity is the core entity for change and that every quality or related initiative will change, eliminate or create activities. Considers various quality management mechanisms under this architecture and considers various barriers. Suggests that the congruence management framework can help make quality management more effective.

Journal ArticleDOI
TL;DR: The article reiterates the three aspects and points out the advantages offered by this network management paradigm developed as part of OSI standards, and discusses the semantics of the various operations and the parameters associated with each operation.
Abstract: Data communications standards to allow exchange of information between two application processes in different heterogeneous computing environments have been developed by International Standards groups. With the development of these standards, the need for managing the communications protocols was realized as part of both the Internet and OSI standards suites. This article addresses the network management paradigm developed as part of OSI standards. The OSI network management application includes three different aspects: categories of network management, a protocol that specifies the structure for transferring network management information, and information models that define resource-specific management information for the specific management functions. These three aspects will be described in this article. Network management functions are grouped into five categories: configuration, fault, performance, security, and accounting. The resource is managed to accomplish these functions. These five categories have been used not only in OSI network management but also in specifying the management functions for telecommunications networks. These five categories are briefly discussed in the paper. The protocol structure for OSI network management is defined as an application service element known as CMISE. Regardless of the resource being managed, the protocol defines a basic set of operations applicable to network management. The article discusses the semantics of the various operations and the parameters associated with each operation. Using the structure defined by the protocol, for the various management functions, information is modeled to represent the managed resource. Object-oriented principles are used in defining information models. An introduction to these principles is provided. The management information exchanged is a combination of the three aspects.
As part of OSI network management, information models to represent communication entities have been developed. An example is shown to illustrate the exchanged message for a management function. The article reiterates the three aspects and points out the advantages offered by this network management paradigm.
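The key property described above is that CMISE defines one small set of generic operations (get, set, create, delete, action, event report) that apply uniformly to any managed resource. A minimal Python sketch of that uniformity follows; the attribute names and resources are illustrative, not taken from an actual OSI information model:

```python
class ManagedObject:
    """A generic managed object: any resource is managed through the
    same small operation set, regardless of what the resource is."""

    def __init__(self, **attrs):
        self._attrs = dict(attrs)

    def m_get(self, name):
        """Analogue of the CMISE get operation on one attribute."""
        return self._attrs[name]

    def m_set(self, name, value):
        """Analogue of the CMISE set operation on one attribute."""
        self._attrs[name] = value

# Two very different resources, managed through identical operations:
interface = ManagedObject(adminState="unlocked", ifSpeed=10_000_000)
log = ManagedObject(maxSize=4096, discriminator="eventType = fault")

interface.m_set("adminState", "locked")
print(interface.m_get("adminState"))  # → locked
print(log.m_get("maxSize"))           # → 4096
```

In the real protocol the operations carry structured parameters (object class, object instance, scope, filter) and the attribute definitions come from the information model, not from ad hoc keyword arguments.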

Journal ArticleDOI
J.P. Thompson
TL;DR: The substance of the initiative is provided by the WBEM process and information architectures that support a scalable and heterogeneous management structure.
Abstract: This article is a description of the Web-Based Enterprise Management initiative. It discusses both the substance of the initiative itself, in the form of the overall architectural approach implied by WBEM, and the industry context that gave rise to WBEM. The industry context is the current state of management systems and software, standardization efforts, and the actual systems being managed. These provided the major influences in shaping the evolution of the WBEM initiative. The substance of the initiative is provided by the WBEM process and information architectures that support a scalable and heterogeneous management structure. The various aspects of the process architecture are covered, including the object manager, object providers, schema management, and protocols required for communication among the various components of the architecture. An overview of the information architecture is provided in the form of a summary of the Desktop Management Taskforce's Common Information Model schema implemented by WBEM.

Patent
10 Feb 1998
TL;DR: In this article, a business operation management system has a data bank which includes a rule database for registering data relative to rules of how to proceed with work, including business plans and new product development plans.
Abstract: A business operation management system has a data bank which includes a rule database for registering data relative to rules of how to proceed with work, including data relative to business operation management manuals prescribed with respect to respective themes including business plans and new product development plans, a project management database for registering management data relative to details of each of the themes, a change management database for registering contents of changes in the management data, a problem management database for registering data indicative of rejected confirmed results, a technical report management database for registering solutions to problems and data indicative of accepted confirmed results, a reference classification management database for registering instruction contents, and a know-how management database for registering know-hows extracted from problems and confirmed results.

Journal ArticleDOI
TL;DR: An architecture is developed for a general-purpose framework for hypermedia collaboration environments that support purposeful work by orchestrated teams; it reuses object-oriented data management for application-specific hyperbase organization, along with workflow enactment and cooperative transactions as built-in services originally developed for the Oz non-hypermedia environment.
Abstract: We have developed an architecture for a general-purpose framework for hypermedia collaboration environments that support purposeful work by orchestrated teams. The hypermedia represents all plausible multimedia artifacts concerned with the collaborative task(s) at hand that can be placed or generated on-line, from application-specific materials (e.g., source code, chip layouts, blueprints) to formal documentation to digital library resources to informal email and chat transcripts. The framework capabilities support both internal (WWW-style hypertext) and external (non-WWW open hypertext link server) links among these artifacts, which can be added incrementally as useful connections are discovered; project-specific intelligent hypermedia search and browsing; automated construction of artifacts and hyperlinks according to the semantics of the group and individual tasks and the overall workflow among the tasks; application of arbitrary tools to the artifacts; and collaborative work for geographically dispersed teams connected by the Internet and/or an intranet/extranet. We also present a general architecture for a WWW-based distributed tool launching service compatible with our collaboration environment framework. We describe our prototype realization of the framework in OzWeb. It reuses object-oriented data management for application-specific hyperbase organization, and workflow enactment and cooperative transactions as built-in services, which were originally developed for the Oz non-hypermedia environment. The tool service is implemented by the generic Rivendell component, which has been integrated into OzWeb as an example “foreign” (i.e., add-on) service. Rivendell could alternatively be employed in a stand-alone manner. We have several months' experience using an OzWeb hypermedia collaboration environment for our own continuing software development work on the system.

Journal ArticleDOI
Graham Pervan
TL;DR: A survey of Australasia's largest organizations was conducted to identify which issues were perceived by their chief executive officers (CEOs) as being important, problematic and critical over the next 3–5 years, and the most critical issues were revealed to be a mix of technology management issues.
Abstract: As part of a research programme on key information systems (IS) management issues, a survey of Australasia's largest organizations was conducted to identify which issues were perceived by their chief executive officers (CEOs) as being important, problematic and critical over the next 3-5 years. The results reported are based on a moderate response rate (though perhaps reasonable for the target group) but formal testing showed an absence of non-response bias. The most critical issues were revealed to be a mix of technology management issues (managing and measuring the effectiveness of the information technology (IT) infrastructure, and disaster recovery), strategic management issues (business process redesign, competitive advantage, and information architecture), people and support management issues (organizational learning, and executive and decision support) and systems development and data management issues (effective use of the data resource and effectiveness of software development). This reflects the...

01 Jan 1998
TL;DR: In this article, the authors carried out a field study on the knowledge management initiative of a multi-national financial institution and, based on an initial analysis of their data, proposed a preliminary framework that incorporates several critical success factors for knowledge management.
Abstract: More and more organizations are now trying to attain competitive superiority through better management and use of their knowledge assets. While some initial success stories on knowledge management initiatives have been reported, no coherent framework on knowledge management has yet been proposed. Using the grounded theory approach, we carried out a field study on the knowledge management initiative of a multi-national financial institution and, based on an initial analysis of our data, propose a preliminary framework that incorporates several critical success factors for knowledge management.

Introduction

In a dynamic and competitive business environment characterized by unpredictable changes, organizations need to be able to continually adapt or adjust their structures, processes, domains, and goals to remain viable (Huber, 1991). This quest for competitive superiority also implies that organizations need to become increasingly effective in using their existing knowledge base for strategic benefit, and in creating and acquiring new knowledge to broaden their knowledge base (Sanchez and Heene, 1997). In a volatile environment, this creation and acquisition of new knowledge must outpace the rate at which current organizational knowledge is becoming obsolete. As we move from an information era into a knowledge era, competitive superiority is increasingly being derived from intellectual assets. To obtain greater value from their intellectual assets, organizations need to manage knowledge generation, transfer, and use among their various functional or business units.

Organizational Learning and Knowledge Management

Organizational learning literature provides us with a basis for studying knowledge management. Duncan and Weiss (1991) argue that organizational knowledge is critical for the effective operation and adaptation of organizations and it is through learning that organizations develop this knowledge.
The current body of organization learning studies focuses specifically on the processes through which organizational knowledge grows and changes. However, little research has been done on how organizational units possessing knowledge and organizational units needing knowledge can seek each other out quickly and with a high likelihood (Huber, 1991). Toward this direction, we need to investigate how organizational memory can serve as a repository of organizational knowledge so as to facilitate the internal search for knowledge by organizational units. Knowledge management has emerged as a key theme among recent research efforts aimed at understanding how to better use organizational memory. At present, such research efforts are still in an infancy stage where the literature comprises mainly articles in magazines, articles on websites, or internal articles produced by consultancy firms. These articles provide some insights, based on anecdotal evidence, that suggest how we may be able to obtain more value from organizational knowledge bases. There have also been several descriptive case studies that describe knowledge management initiatives at prominent organizations, such as British Petroleum, Dow Chemical, Hewlett Packard, Skandia Assurance, and Texas Instruments. These success stories have prompted scholars to recognize the need for knowledge management in organizations. As a result, scholars began to identify key challenges and issues involved in knowledge management and to seek out critical success factors for effective knowledge management. For example, a critical challenge confronting most knowledge management is how to transform the deep-rooted organizational culture and individual belief of “knowledge is power” into “knowledge sharing is power”. To overcome such challenges, all knowledge management initiatives need to be undertaken with a long-term perspective, with the support from top management and the cooperation of organizational units. 
Two interesting observations are apparent from the few success stories on knowledge management. First, successful knowledge management initiatives typically begin with the recognition of a need to accelerate knowledge transfer and access. In the case of British Petroleum, the use of virtual teamwork through video-conferences to solve critical operational problems had led to faster knowledge transfer, which results in significant time and cost savings. Secondly, to facilitate knowledge management, organizations have begun to create new roles such as Chief Knowledge Officer or Director of Intellectual Capital (see Table 1).

Book ChapterDOI
09 Nov 1998
TL;DR: The proliferation of the Internet and intranets, the development of wireless and satellite networks, and the availability of asymmetric, high-bandwidth links to the home, have fueled a wide range of new dissemination-based applications.
Abstract: The proliferation of the Internet and intranets, the development of wireless and satellite networks, and the availability of asymmetric, high-bandwidth links to the home have fueled the development of a wide range of new “dissemination-based” applications. These applications involve the timely distribution of data to a large set of consumers, and include stock and sports tickers, traffic information systems, electronic personalized newspapers, and entertainment delivery. Dissemination-oriented applications have special characteristics that render traditional client-server data management approaches ineffective. These include tremendous scale, a high degree of overlap in user data needs, and asymmetric data flow from sources to consumers.
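Dissemination-based delivery exploits the overlap in user needs: an item is pushed once per subscriber from a single source read, rather than fetched by each client on demand. A minimal publish/subscribe sketch of that flow (topic names and data are illustrative):

```python
from collections import defaultdict

class Disseminator:
    """Pushes each published item to every subscriber of its topic."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of inboxes

    def subscribe(self, topic, inbox):
        self.subscribers[topic].append(inbox)

    def publish(self, topic, item):
        # One source-side event fans out to all interested consumers --
        # the asymmetric, overlap-heavy flow described in the abstract.
        for inbox in self.subscribers[topic]:
            inbox.append(item)

ticker = Disseminator()
alice, bob = [], []
ticker.subscribe("stocks", alice)
ticker.subscribe("stocks", bob)
ticker.publish("stocks", ("ACME", 42.0))
print(alice == bob == [("ACME", 42.0)])  # → True
```

Real dissemination systems of the kind surveyed (e.g. broadcast-based ones) go further, scheduling repeated transmissions so late-joining clients also receive popular items; the sketch shows only the push-once fan-out.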

Book
01 Sep 1998
TL;DR: In this paper, the authors provide an analysis of knowledge management and its role in the enterprise and discuss E-mail, groupware and push technology and explain how these technologies can be used to capture, store and analyze corporate knowledge.
Abstract: This report provides an analysis of knowledge management and its role in the enterprise. It examines concepts and theories of knowledge management and discusses E-mail, groupware and push technology and explains how these technologies can be used to capture, store and analyze corporate knowledge.

Journal ArticleDOI
TL;DR: The paper addresses two problems of realistic workloads: skewed access frequencies to the records, and evolving access patterns where previously cold records may become hot and vice versa. The proposed method automatically chooses the appropriate granularity for dynamic data migrations.
Abstract: Networks of workstations are an emerging architectural paradigm for high-performance parallel and distributed systems. Exploiting networks of workstations for massive data management poses exciting challenges. We consider here the problem of managing record-structured data in such an environment. For example, managing collections of HTML documents on a cluster of WWW servers is an important application for which our approach provides support. The records are accessed by a dynamically growing set of clients based on a search key (e.g., a URL). To scale up the throughput of client accesses with approximately constant response time, the records and thus also their access load are dynamically redistributed across a growing set of workstations. The paper addresses two problems of realistic workloads: skewed access frequencies to the records and evolving access patterns where previously cold records may become hot and vice versa. Our solution incorporates load tracking at different levels of granularity and automatically chooses the appropriate granularity for dynamic data migrations. Experimental results based on a detailed simulation model show that our method is indeed successful in providing scalable cost/performance and explicitly controlling its level.
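The core idea, tracking access frequencies and migrating hot records to balance load across workstations, can be sketched as follows. The single-record migration policy and the rebalancing rule below are illustrative simplifications, not the paper's actual multi-granularity algorithm:

```python
from collections import Counter

class Cluster:
    """Record-structured data spread over workstations, with per-record
    access tracking and a simple hot-record migration step."""

    def __init__(self, n_nodes):
        self.nodes = [dict() for _ in range(n_nodes)]  # key -> record
        self.heat = Counter()                          # key -> access count

    def node_of(self, key):
        for i, node in enumerate(self.nodes):
            if key in node:
                return i
        raise KeyError(key)

    def access(self, key):
        """A client lookup by search key (e.g. a URL); also tracks load."""
        self.heat[key] += 1
        return self.nodes[self.node_of(key)][key]

    def rebalance(self):
        """Move the hottest record from the most-loaded node to the
        least-loaded one (one migration step, coarsest granularity)."""
        loads = [sum(self.heat[k] for k in node) for node in self.nodes]
        src = loads.index(max(loads))
        dst = loads.index(min(loads))
        if src != dst and self.nodes[src]:
            hot = max(self.nodes[src], key=lambda k: self.heat[k])
            self.nodes[dst][hot] = self.nodes[src].pop(hot)

cluster = Cluster(2)
cluster.nodes[0].update({"/a.html": "<html>a</html>", "/b.html": "<html>b</html>"})
for _ in range(10):
    cluster.access("/a.html")   # "/a.html" becomes hot
cluster.access("/b.html")
cluster.rebalance()
print(cluster.node_of("/a.html"))  # → 1 (migrated off the loaded node)
```

The paper's contribution is precisely what this sketch omits: tracking load at several granularities and choosing the right one per migration, so that both skew and drifting hot/cold patterns are handled with bounded overhead.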