
Showing papers on "Data management published in 1994"




Journal ArticleDOI
TL;DR: An emergent theory of quality management is proposed and linked to the literature, and a set of reliable and valid scales is developed that may be used by other researchers for hypothesis testing and by practitioners for assessing quality management practices in their plants and for internal and external benchmarking.

1,982 citations


Book
01 Jan 1994
TL;DR: A textbook on management in educational organizations, covering theories of education management, leadership, strategy, policy and planning, organizational structure and culture, the management of learning and the curriculum, monitoring, review and quality assurance, women in education management, human resource management, motivation, communication, accountability and responsiveness, resource management, models of decision making in resource allocation, and marketing and external relations, as discussed by the authors.
Abstract: Management in educational organizations * theories of education management * leadership * strategy, policy and planning * organizational structure and culture * the management of learning * the curriculum and the school * monitoring, review and quality assurance * women in education management * managing human resources * motivation * communication * accountability and responsiveness * managing resources * models of decision making in resource allocation * marketing and external relations

184 citations



Book ChapterDOI
02 May 1994
TL;DR: It is demonstrated that providing index or hashing based access to the data transmitted over wireless is very important for extending battery life and can result in very significant improvement in battery utilization.
Abstract: Organizing massive amount of information on communication channels is a new challenge to the data management and telecommunication communities. In this paper, we consider wireless data broadcasting as a way of disseminating information to a massive number of battery powered palmtops. We show that different physical requirements of the wireless digital medium make the problem of organizing wireless broadcast data different from data organization on the disk. We demonstrate that providing index or hashing based access to the data transmitted over wireless is very important for extending battery life and can result in very significant improvement in battery utilization. We describe two methods (Hashing and Flexible Indexing) for organizing and accessing broadcast data in such a way that two basic parameters: tuning time, which affects battery life, and access time (waiting time for data) are minimized.
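The tuning-time versus access-time trade-off described above can be illustrated with a toy broadcast cycle. This is a hypothetical single-index layout, not the paper's Hashing or Flexible Indexing methods: the client reads one index bucket, powers the receiver down, and wakes only when the target bucket arrives.

```python
# Toy broadcast cycle: bucket 0 is an index mapping keys to data positions.
# Tuning time (buckets actually read, which drains the battery) stays at 2
# even when access time (buckets waited for) is long.

def make_cycle(records):
    index = {key: pos + 1 for pos, (key, _) in enumerate(records)}
    return [index] + [val for _, val in records]  # bucket 0 is the index

def fetch(cycle, key):
    tuned = 1                      # read the index bucket
    pos = cycle[0][key]            # doze until bucket `pos` comes around
    tuned += 1                     # wake up and read the data bucket
    return cycle[pos], tuned, pos  # value, tuning time, access time

cycle = make_cycle([("a", 10), ("b", 20), ("c", 30)])
value, tuning, access = fetch(cycle, "c")
```

Without the index, the client would have to stay tuned and scan every bucket until the key appeared, so tuning time would grow with the cycle length.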

162 citations


PatentDOI
TL;DR: Embodied as a hotel reservation and data management network, the system supports not only the storage and exchange of basic room-inventory/reservation data between and among the central and remote locations, but also such advanced data management features as a single depleting inventory for both the central and remote databases.

137 citations



Patent
11 Apr 1994
TL;DR: Computer-based systems and methods for managing data as discussed by the authors take advantage of a unique model which increases speed and flexibility; eliminates the need for a complex data manipulation language, data or application dependent software, and separate structuring tools such as pointers, lists, and indexes; and automatically creates among the data relationships which may or may not have been apparent to a user or the designer.
Abstract: Computer-based systems and methods for managing data. These systems and methods take advantage of a unique model which: increases speed and flexibility; eliminates the need for a complex data manipulation language, data or application dependent software, and separate structuring tools such as pointers, lists, and indexes; and automatically creates among the data relationships which may or may not have been apparent to a user or the designer. Salient, unique features of the systems and methods are their capabilities for: efficiently creating a composite view of the data; importing data from and exporting data to external databases; and otherwise providing significant and advantageous, additional control over the creation, maintenance, and display of the data making up a database constructed in accordance with the principles described in parent application Ser. No. 07/621,834 filed 4 Dec. 1990.

113 citations


Journal ArticleDOI
K. Pearsall1, B. Raines1
TL;DR: The Customer Quality Analysis (CQA) system as mentioned in this paper is a data management system to combine raw material quality data, in-process statistical control data, and the resulting customer satisfaction data in a manufacturing environment.
Abstract: A business need existed for a data management system to combine raw material quality data, in-process statistical control data, and the resulting customer satisfaction data in a manufacturing environment. No system existed that analyzed these three sets of data to optimize customer satisfaction. Therefore, an expert system was developed to fill this need. It includes a relational database for correlating data about the manufacturing process and customer needs. The Customer Quality Analysis (CQA) system has been implemented as a pilot program within IBM. The program that initiated the development of this data management system is highlighted, then specific materials applications are discussed. The authors conclude with suggestions for possible extensions of this process into other manufacturing process areas.

112 citations


Patent
14 Jul 1994
TL;DR: In this paper, a data management system is presented consisting of a centrally located host computer, at least one trading partner output data file that communicates with the host computer, and a data processing means resident within the host computer that provides a visual representation of the data following processing from the management transaction database, the management customer database, and the output data file.
Abstract: The present invention consists of a data management system having a centrally located host computer, at least one trading partner output data file that communicates with the host computer, a data management processing means resident within the host computer and in communication with the trading partner's output data file, at least one management transaction database in communication with the processing means, at least one management customer database in communication with the data management processing means and at least one data output means within the host computer that provides visual representation of the data following processing from the management transaction database, the management customer database, and the output data file.

108 citations


Proceedings ArticleDOI
14 Feb 1994
TL;DR: The design of Mariposa is described, an experimental distributed data management system that provides high performance in an environment of high data mobility and heterogeneous host capabilities and a general, flexible platform for the development of new algorithms for distributed query optimization, storage management, and scalable data storage structures.
Abstract: We describe the design of Mariposa, an experimental distributed data management system that provides high performance in an environment of high data mobility and heterogeneous host capabilities. The Mariposa design unifies the approaches taken by distributed file systems and distributed databases. In addition, Mariposa provides a general, flexible platform for the development of new algorithms for distributed query optimization, storage management, and scalable data storage structures. This flexibility is primarily due to a unique rule-based design that permits autonomous, local-knowledge decisions to be made regarding data placement, query execution location, and storage management.
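The rule-based, local-knowledge decision style described above can be sketched roughly as a first-match rule table; the rule contents and fragment attributes here are invented for illustration and are not Mariposa's actual rules.

```python
# Hypothetical first-match rule table in the spirit of a rule-based design:
# each rule inspects local knowledge about a data fragment and proposes an
# action for placement or query execution.

RULES = [
    (lambda f: f["access_rate"] > 100 and not f["local"], "migrate-here"),
    (lambda f: f["size_mb"] > 500, "execute-remotely"),   # ship query, not data
    (lambda f: True, "execute-locally"),                  # default
]

def decide(fragment):
    for predicate, action in RULES:
        if predicate(fragment):
            return action

hot = {"access_rate": 250, "local": False, "size_mb": 40}
big = {"access_rate": 5, "local": True, "size_mb": 900}
```

The point of such a design is that policy lives in the rule table, so new placement or optimization strategies can be tried by editing rules rather than the engine.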

Patent
Kaoru Nakabayashi1, Akira Mochida1
18 May 1994
TL;DR: In this article, a data management system for management of name cards is presented, which allows quick access to a desired name card through a visually hierarchical display of data in the form of a card box list, a target list including a plurality of target data, and individual name card data.
Abstract: The present invention provides a data management system realizing simple, clear, and easily understandable display and operation. The data management system is applicable, for example, to management of name cards. In management of name cards, name card data (101) includes a plurality of name card records (100). A card summary record (110, 112) prepared as title data includes a card ID and part of item data included in each name card record (100). Each summarized card data (111, 113) includes a plurality of card summary records (110, 112) collected according to a predetermined condition, and is placed into a certain card box (120, 121). Each card box has a characteristic name such as `business` or `private`. Visually hierarchical display of data in the form of a card box list, a target list including a plurality of target data, and individual name card data helps the user clearly understand the whole data display and data retrieval. This allows quick access to a desired name card. The target list is data having a fixed length and used on a memory, thereby realizing high-speed data retrieval and display. The system of the invention further includes a data maintenance mechanism and an auto-dialing mechanism.

Patent
Manabe Toshihiko1
13 Sep 1994
TL;DR: In this article, different shared data identifiers are assigned to different threads, and different locks are set up for different shared data identifiers, so as to enable the detection of accesses that violate the locks on the shared data, for each thread separately.
Abstract: A shared data management scheme capable of manipulating the shared data by the multi-threading without requiring the explicit programming of the lock in the program. In this scheme, different shared data identifiers are assigned to different threads, and different locks are set up for different shared data identifiers, so as to enable the detection of an access in violation to the locks among the accesses to the shared data for each thread separately. Preferably, the shared data required in an execution of each thread are mapped to a region exclusively allocated to each thread in a virtual space, in units of pages, a page protection of a memory management unit is set up with respect to each page mapping the shared data, and a lock for the shared data mapped to each page is automatically set up in response to an occurrence of an exception due to the page protection of the memory management unit caused by an access to the shared data mapped to each page from each thread.
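The externally visible behaviour of the scheme can be sketched as follows. The real mechanism relies on MMU page protection and exceptions; here a dictionary of lock owners stands in for the protected pages, and all names are invented.

```python
class SharedDataManager:
    """Dictionary stand-in for the MMU-protected pages of the scheme."""
    def __init__(self):
        self.locks = {}                      # shared-data identifier -> owner thread

    def access(self, ident, thread_id):
        owner = self.locks.get(ident)
        if owner is None:
            self.locks[ident] = thread_id    # first touch: the "page fault"
            return "locked"                  # sets the lock automatically
        if owner != thread_id:
            raise PermissionError(f"thread {thread_id} violates lock on {ident!r}")
        return "ok"

mgr = SharedDataManager()
first = mgr.access("counter", thread_id=1)   # lock set on first access, no explicit code
again = mgr.access("counter", thread_id=1)   # owner may keep accessing
```

As in the patent's description, the program never locks explicitly: the first access sets the lock, and a conflicting access from another thread is detected as a violation.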

Book
01 Oct 1994
TL;DR: Any database system user interested in the latest technologies, particularly users with large amounts of complex data to manage, as well as students, designers, and implementors of such systems, will find this book packed with useful information for their particular needs.
Abstract: From the Publisher: This book provides a comprehensive, self-contained, and up-to-date introduction to rapidly emerging database systems and technologies. It incorporates a wealth of information accumulated by the author in designing and evaluating new database systems; defines object data management terms and categorizes a range of new systems; examines without hype or bias the performance and functionality of particular new products and approaches; covers real applications and their requirements; and now includes a valuable overview of industry standards. Since its initial publication, a number of books have appeared with "object-oriented databases" in the title. Cattell's work, however, remains the most thorough and most balanced coverage of the new technology, and it is now the most current, as well. His book discusses a much wider range of database approaches, including extended relational systems and object-oriented systems. It also provides deeper insight into the implementation and architecture of these systems. Any database system user interested in the latest technologies, particularly users with large amounts of complex data to manage, as well as students, designers, and implementors of such systems, will find this book packed with useful information for their particular needs.

Proceedings ArticleDOI
28 Sep 1994
TL;DR: A data model of a specialized time series management system (TSMS) which accounts for these needs is proposed, centered around an object-oriented kernel that offers the classes and value types needed for the target applications.
Abstract: The analysis of time series is a central issue in economic research and many other scientific applications. However, the data management functionality for this field is not provided by general-purpose DBMSs. Therefore, we propose a data model of a specialized time series management system (TSMS) which accounts for these needs. The model is centered around an object-oriented kernel that offers the classes and value types needed for the target applications. The model provides base classes for multivariate time series and for groups as a means to hierarchically partition the time series space. The system offers a computationally complete data manipulation language including capabilities to query time series and groups. An elaborate array model is supported to account for the functional needs of statistical computations. Furthermore, a customizable calendar system providing a variety of predefined calendars is included.
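A rough sketch of the model's core notions, with invented class names: a multivariate time series and groups that hierarchically partition the time series space.

```python
# Minimal sketch (invented names, not the TSMS's actual classes): a time
# series holds calendar-tagged observations; a group nests series or other
# groups to partition the time series space hierarchically.

class TimeSeries:
    def __init__(self, name, variables):
        self.name, self.variables, self.rows = name, variables, []
    def append(self, period, *values):
        self.rows.append((period, values))
    def window(self, start, end):
        return [r for r in self.rows if start <= r[0] <= end]

class Group:
    def __init__(self, name):
        self.name, self.members = name, []   # members: TimeSeries or Group
    def add(self, member):
        self.members.append(member)
    def series(self):
        out = []
        for m in self.members:
            out.extend(m.series() if isinstance(m, Group) else [m])
        return out

gdp = TimeSeries("gdp", ["value"])
gdp.append("1994Q1", 100.0)
gdp.append("1994Q2", 101.5)
europe = Group("europe"); europe.add(gdp)
world = Group("world"); world.add(europe)
```

Queries over a group (here, collecting all series beneath it) are what the paper's data manipulation language would express declaratively.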

Proceedings ArticleDOI
15 May 1994
TL;DR: A system which enables querying with the facility of acquiring the contents of multiple types of data is presented, and Domain Knowledge accommodated in the system gives the information on how the system views the target multimedia data for content-based retrieval, attaining application independence as well as media independence.
Abstract: In accordance with the progress of the data management ability of computers, databases have become able to integrate various types of data in an application domain, and are now called multimedia databases. Unlike conventional databases managing only textual and numerical data, multimedia databases are required to evaluate properties of the data specified in a query. In this paper, a system which enables querying with the facility of acquiring the contents of multiple types of data is presented. Domain Knowledge accommodated in the system gives the information on how the system views the target multimedia data for content-based retrieval, attaining application independence as well as media independence.

Book
01 Jan 1994
TL;DR: This field-tested guide provides experienced data modelers, architects, and engineers with hands-on guidance from two noted data management experts and offers detailed guidance to establishing a continuous quality evaluation program that's easy to implement and follow.
Abstract: A Straightforward, No-Nonsense Guide to Building the Most Accurate, Complete, and Useful Data Models Possible. How do I know if my data model is accurate? When is a model really complete? Is it possible for a model to be both technically perfect and of no use to an organization, and what can I do to avoid that problem? This book provides answers to these and other crucial data modeling questions. While there are plenty of books that describe the characteristics of finished high-quality data models, only The Data Modeling Handbook gets down to the nitty-gritty of actually building one. Packed with real-world examples, annotated diagrams, and a wealth of rules and best practices, this field-tested guide provides experienced data modelers, architects, and engineers with hands-on guidance from two noted data management experts. * The only book offering clear, straightforward rules and guidelines for judging model accuracy and completeness * Presents all rules in several notations, including IDEF1X, Martin, Chen, and Finkelstein * Compares and contrasts the most popular modeling styles and demonstrates how great models can be built using any type of notation * Explains how to use an organization's plans, policies, objectives, and strategies to build accurate, complete, and useful models * Offers detailed guidance to establishing a continuous quality evaluation program that's easy to implement and follow * Packed with real-world examples and annotated diagrams illustrating each point covered * Describes how to use CASE tools most effectively to build high-quality models

Patent
04 Nov 1994
TL;DR: In this article, an object oriented database is utilized to model a complex process since it is easily extended to include tables of transactions for each of the many process steps in a complex operation, and the database is accessed through the Desktop Management Interface (DMI) with individual DMI commands issued to get or set each individual entry.
Abstract: Instrumentation logic is provided to map object oriented protocols to efficient data management protocols to provide direct, keyed access to multiple data entries. An object oriented database is utilized to model a complex process since it is easily extended to include tables of transactions for each of the many process steps in a complex operation. The database is accessed through the Desktop Management Interface (DMI) with individual DMI commands issued to get or set each individual entry. An application requiring access to many entries would require detailed knowledge of the database and would need to generate many DMI commands. For such an application, instrumentation logic is provided and is accessed by the application through a normal DMI command. The instrumentation then generates all of the successive DMI commands needed to access multiple entries. The invention is extended by utilizing an additional database management system such as DB2, which provides efficient query/response access to large databases. In such a case, the invoked instrumentation logic issues a query to obtain the requested data. In both cases, the instrumentation returns the data utilizing the normal DMI interface.

Book
01 Jan 1994
TL;DR: The foundations of network and system management are outlined, along with ODP as a basis for integrated application management and the technical possibilities for implementing enterprise management.
Abstract: Preface Part I: Foundations *1 Foundations of communication systems and distributed systems Architectures * Protocols * Resources *2 Foundations of network management and system management Network management from the provider's point of view * Management dimensions * Differentiation between network and system management Part II: Management Architectures *3 Need for and requirements of integrated management *4 Submodels of a network management architecture Information model * Organizational model * Communication model * Functional model *5 OSI network management Overview * OSI information model * OSI organizational model * OSI communication model * OSI functional model * Assessment, open questions *6 Internet management Internet Activities Board * Internet management architecture model * Internet information model * Simple network management protocol * Internet MIBs * SNMPv2 * Comparison of ISO and IAB approaches to management *7 Other manufacturer-independent architectures IEEE LAN/MAN management * ITU-T telecommunications management network (TMN) * OSF distributed management environment (DME) * ISO open distributed processing (ODP) * Other manufacturer-independent activities *8 Gateways between management architectures Protocol gateways * Interface gateways * MIB gateways *9 Manufacturer-specific network management architectures and approaches to the migration to open architectures SNA network management * TRANSVIEW (Siemens Nixdorf) * Other manufacturer approaches Part III: Management Tools *10 Properties and classes of tools Introduction * Criteria for classifying management tools * A general classification scheme *11 Isolated management tools Test sets * Protocol analysers * Tools from the internet area * Documentation systems * Trouble ticket systems *12 Management platforms Architecture of management platforms * Infrastructure of a management platform * User interface * Basic applications * Selection criteria *13 Development tools for management MIB tools * Tools for designing user interfaces * Development environments for implementing management applications *14 Integration of management tools into platforms Forms of integration * User-interface integration * Integration via a proxy agent * Data integration * Total integration Part IV: Tried and Tested Management Scenarios *15 Component management Introduction to the management scenario * Management of LAN components * Management of WAN components *16 System management Management areas from the user's viewpoint * Management areas from the administrator's viewpoint * Integration of system management Part V: Future Management Scenarios, Open Questions *17 Application management Introduction to the management scenario * ODP as the basis for integrated application management * Existing solutions *18 Enterprise management Introduction to the management scenario * Positioning the management problem in the enterprise context * Technical possibilities for implementing corporate management * View of an enterprise management architecture *19 Open questions in network and system management Information model * Organizational model * Functional model * Integration of system management * Abbreviations * Bibliography * Index

Journal ArticleDOI
TL;DR: This paper shows how to compute approximations of window sets defined by Gannon, Jalby, and Gallivan, which allows derivation of a global strategy of data management for local memories which may be combined efficiently with various parallelization and/or vectorization optimizations.
Abstract: One major point in loop restructuring for data locality optimization is the choice and the evaluation of data locality criteria. In this paper we show how to compute approximations of window sets defined by Gannon, Jalby, and Gallivan. The window associated with an iteration i describes the "active" portion of an array: elements that have already been referenced before iteration i and that will be referenced after iteration i. Such a notion is extremely useful for data localization because it identifies the portions of arrays that are worth keeping in local memory because they are going to be referenced later. The computation of these window approximations can be performed symbolically at compile time and generates a simple geometrical shape that simplifies the management of the data transfers. This strategy allows derivation of a global strategy of data management for local memories which may be combined efficiently with various parallelization and/or vectorization optimizations. Indeed, the effects of loop transformations fit naturally into the geometrical framework we use for the calculations. The determination of window approximations is studied both from a theoretical and a computational point of view, and examples of applications are given.
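A worked example of the window notion (computed by brute force here, not by the paper's symbolic method): for a loop whose body reads A[i] and A[i-2], element A[k] is touched at iterations k and k+2, so the window at iteration i is the constant-size strip {A[i-2], A[i-1]}.

```python
# Window at iteration i for the loop
#   for i in range(n): B[i] = A[i] + A[i-2]
# -- elements of A referenced strictly before iteration i and referenced
# again at or after iteration i; these are worth keeping in local memory.

def window(i, n):
    touched_at = lambda k: {k, k + 2}                 # iterations touching A[k]
    return {k for k in range(n)
            if min(touched_at(k)) < i and max(touched_at(k)) >= i}

w = window(5, 10)                                     # the strip {3, 4}
```

The brute-force set always equals the closed-form strip {i-2, i-1}, illustrating why such windows reduce to simple geometrical shapes computable at compile time.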

Journal ArticleDOI
TL;DR: Considers the security aspects of communication between two management processes operating in different management domains, and identifies two major risks: the security of information exchanged during the management association, and control of access to the management information base (MIB).
Abstract: Considers the security aspects of communication between two management processes operating in different management domains; identifies two major risks: the security of information exchanged during the management association, and control of access to the management information base (MIB); and enumerates the various threats that must be guarded against and possible methods of attack. Security techniques, including symmetric and public key cryptosystems, are employed in the design of a method of achieving a secure management association. A scheme of authorization control for MIB access is developed. The management of an open system's network resources takes place in the context of a management association. The resources themselves are controlled by an agent process which presents a view of these resources to the outside world as a number of managed objects, each of which contains a number of attributes. The collection of objects presented to the outside world by the agent is known as the MIB. A manager process regulates the operation of the managed resources by engaging in a management association with the agent and instructing it to carry out simple operations on elements of the MIB. Within a single management domain where all processing nodes and network links are under the control of the same administration, security is not such a critical issue. However, when the management association takes place across the boundary between two separate management domains, and make use of public data networks, security issues must be considered in greater detail.
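The two risks named above can be sketched generically: a symmetric-key integrity check on management PDUs, and an access-control list for MIB operations. This uses a standard HMAC construction and an invented ACL, not the specific scheme designed in the paper.

```python
import hmac
import hashlib

# Risk 1: integrity of information exchanged during the association,
# protected here with a generic HMAC under a shared symmetric key.
KEY = b"shared-association-key"          # agreed when the association opens

def protect(pdu: bytes) -> bytes:
    return hmac.new(KEY, pdu, hashlib.sha256).digest()

def verify(pdu: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(protect(pdu), tag)

# Risk 2: control of access to the MIB, sketched as an invented ACL
# mapping manager principals to permitted operations.
ACL = {"manager-A": {"read"}, "manager-B": {"read", "write"}}

def authorized(principal: str, op: str) -> bool:
    return op in ACL.get(principal, set())

pdu = b"SET ifAdminStatus=down"
tag = protect(pdu)
```

A tampered PDU fails verification, and a manager without the corresponding ACL entry is refused the MIB operation.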

Book ChapterDOI
02 May 1994
TL;DR: This work has taken an existing information retrieval system (INQUERY) and substituted a persistent object store (Mneme) for the portion of the custom data management system that manages an inverted file index, resulting in an improvement in performance and significant opportunities for the information retrieval system to take advantage of the standard data management services provided by the persistent object store.
Abstract: Full-text information retrieval systems have unusual and challenging data management requirements. Attempts have been made to satisfy these requirements using traditional (e.g., relational) database management systems. Those attempts, however, have produced rather discouraging results. Instead, information retrieval systems typically use custom data management facilities that require significant development effort and usually do not provide all of the services available from a standard database management system. Advanced data management systems, such as object-oriented database management systems and persistent object stores, offer a reasonable alternative to the two previous approaches. We have taken an existing information retrieval system (INQUERY) and substituted a persistent object store (Mneme) for the portion of the custom data management system that manages an inverted file index. The result is an improvement in performance and significant opportunities for the information retrieval system to take advantage of the standard data management services provided by the persistent object store. We describe our implementation, present performance results on a variety of document collections, and discuss the advantages of using a persistent object store to support information retrieval.
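A minimal sketch of the data structure in question, an inverted file index held in an object store; here an in-memory dict stands in for the store, and the Mneme/INQUERY integration itself is far more involved.

```python
# Inverted file index: term -> postings list of (doc_id, position).
# The `store` argument plays the role of the persistent object store.

class InvertedIndex:
    def __init__(self, store):
        self.store = store

    def add(self, doc_id, text):
        for pos, term in enumerate(text.lower().split()):
            self.store.setdefault(term, []).append((doc_id, pos))

    def postings(self, term):
        return self.store.get(term.lower(), [])

store = {}                      # stand-in for a persistent object store
idx = InvertedIndex(store)
idx.add(1, "data management for information retrieval")
idx.add(2, "persistent object store for data")
```

Substituting a real persistent store for `store` is exactly the kind of swap the paper describes: the retrieval code keeps its interface while gaining the store's standard data management services.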

31 Aug 1994
TL;DR: The central concept here is the Parametrized Graphics Object: an interface is built up from graphics objects whose properties are functions of data in the Data Manager. The scope of these tools extends to many other application domains.
Abstract: Computational Steering is the ultimate goal of interactive simulation: researchers change parameters of their simulation and immediately receive feedback on the effect. We present a general and flexible environment for computational steering. Within this environment a researcher can easily develop user interfaces and 2-D visualizations of his simulation. Direct manipulation is supported, and the required changes of the simulation are minimal. The architecture of the environment is based on a Data Manager that takes care of centralized data storage and event notification, and satellites that produce and visualize data. One of these satellites is a graphics tool to define a user interface interactively and to visualize the data. The central concept here is the Parametrized Graphics Object: an interface is built up from graphics objects whose properties are functions of data in the Data Manager. The scope of these tools is not limited to computational steering, but extends to many other application domains.
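The Parametrized Graphics Object idea can be sketched as a visual property bound to a Data Manager variable through event notification; all class and variable names here are invented for illustration.

```python
# Centralized data storage with event notification: satellites subscribe to
# variables, and a graphics object recomputes a visual property whenever the
# variable it depends on changes.

class DataManager:
    def __init__(self):
        self.values, self.subscribers = {}, {}

    def write(self, name, value):
        self.values[name] = value
        for callback in self.subscribers.get(name, []):
            callback(value)                  # event notification

    def subscribe(self, name, callback):
        self.subscribers.setdefault(name, []).append(callback)

class ParametrizedCircle:
    """Radius is a function of a simulation variable in the Data Manager."""
    def __init__(self, dm, variable, scale=2.0):
        self.radius = 0.0
        dm.subscribe(variable, lambda v: setattr(self, "radius", scale * v))

dm = DataManager()
circle = ParametrizedCircle(dm, "temperature")
dm.write("temperature", 10.0)   # steering step: the circle updates immediately
```

Direct manipulation runs the same loop in reverse: dragging the circle would write a new value back into the Data Manager, which notifies the simulation.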

BookDOI
01 Feb 1994
TL;DR: The first comprehensive volume on the subject of clinical data management, this book contains concise, well-researched information covering all aspects of data management from handling early phase I studies in volunteers to the presentation of final reports for regulatory purposes.
Abstract: From the Publisher: The first comprehensive volume on the subject of clinical data management, this book contains concise, well-researched information covering all aspects of data management from handling early phase I studies in volunteers to the presentation of final reports for regulatory purposes. In addition, however, other areas fundamental to the success of any data management group are covered, such as study set-up, quality assurance, regulatory requirements and the impact of clinical research in general, and with all the subject matter an international approach has been taken. This easy-to-use text has been written by contributors who are known for their extensive experience in industry and who have been drawn from multinational companies worldwide. It will be of interest to anyone in the field of clinical data management, within the pharmaceutical industry, or associated industries and to all biomedical professionals working in clinical research.

Report SeriesDOI
TL;DR: This document provides a full description of the GREEN model, lists all the model equations, provides a data dictionary to link the equation variables with the variables in the code, explains details which are traditionally bypassed in technical papers, and provides an explanation of the data base and the data management part of the code.
Abstract: This document provides a full description of the GREEN model. It is intended to accompany the GREEN code, i.e. the implementation of the model, and to enable the user to understand the links between the theoretical framework of the model and its practical implementation. The document lists all the model equations, provides a data dictionary to link the equation variables with the variables in the code, explains details which are traditionally bypassed in technical papers, and provides an explanation of the data base and the data management part of the code. The document is organised as follows. Following a non-technical overview of the model in Part I, Part II presents the structure of the model with a complete description of the equations, the variables, and parameters which are part of the GREEN model. Part III explains the data management in GREEN ...

Journal ArticleDOI
01 Feb 1994
TL;DR: This paper presents an authorization model that substantially extends and revises that model defined for object-oriented (and semantic) database systems and concerns the support for content-dependent authorization, which was not provided in [20].
Abstract: Authorization is an important functionality that every data management system should provide. An authorization mechanism allows different access rights on data items to be selectively assigned to users. Authorization models and mechanisms have been widely investigated in the framework of traditional database systems. The extension of those models and mechanisms to advanced data management systems is quite complex, because those systems are characterized by data models with a larger number of semantic constructs than traditional models, like the relational one. A first authorization model defined for object-oriented (and semantic) database systems has been presented in [20]. In this paper we present an authorization model that substantially extends and revises that model. The most significant extension concerns the support for content-dependent authorization, which was not provided in [20]. Content-dependent authorization is very important in providing an authorization mechanism able to directly support authorization policies of application environments. Moreover, it is a crucial functionality in environments where data objects frequently change their status. In addition, the model presented here differs from the model defined in [20] in that new authorization types are introduced and a finer control of versions is provided. Finally, authorization administration of objects is considered.
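Content-dependent authorization as described above can be sketched as rules whose validity depends on a predicate over the object's state, so that a subject's rights change when the object's status changes; the rule contents here are invented.

```python
# Each authorization is (subject, right, predicate over object attributes);
# an access is allowed only if some matching rule's predicate holds on the
# object's current content.

AUTHORIZATIONS = [
    ("clerk",   "read",  lambda o: o["status"] == "released"),
    ("auditor", "read",  lambda o: True),                      # unconditional
    ("clerk",   "write", lambda o: o["status"] == "draft"),
]

def allowed(subject, right, obj):
    return any(pred(obj) for s, r, pred in AUTHORIZATIONS
               if s == subject and r == right)

doc = {"status": "draft"}
```

Because the predicate is evaluated at access time, the same rule set yields different effective rights before and after the document's status changes, which is the behaviour the model targets for objects that frequently change state.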

Journal ArticleDOI
TL;DR: This paper describes aspects of a software prototype MMS, called SYMMS, which is designed to provide for model sharing, reusability and integration, and also provides support for data management in the context of modeling.

Journal ArticleDOI
TL;DR: This book traces the history of quality assurance and Total Quality in the United States healthcare system, presents the principles, tools, and techniques of Total Quality, and closes with a case study of Total Quality within the VA Healthcare System.
Abstract: PART I: HISTORY and OVERVIEW
1. Introduction and Historical Background: The Early Days of Quality Assurance; History of TQ in Manufacturing; The Influence of Government on Quality in Healthcare; Quality Assurance Departments; Transitions
2. The United States Healthcare System and Quality of Care: The First Two Hundred Fifty Years (1600-1850); The Next Hundred Years (1850-1950); 1950 to the 1990s
PART II: PRINCIPLES, TOOLS, and TECHNIQUES of TOTAL QUALITY
3. Principles of Total Quality in Healthcare Organizations. Selected Articles: Sounding Board: Continuous Improvement as an Ideal in Health Care; The Case for Using Industrial Quality Management Science in Health Care Organizations; Total Quality Management for Physicians: Translating the New Paradigm; Organizationwide Quality Improvement in Health Care; Articles Misconstrue Joint Commission's Position on Quality Improvement; A New Look for Quality in Home Care
4. The Implementation of Total Quality: Executive Level Commitment; Transformation of the Culture; Planning Quality; Organizing Quality; Evaluating Quality. Selected Article: How to Start a Direct Patient Care Team
5. Data Management for Total Quality: Transformation of Data to Information; Data versus Information; Data Reliability; Data Validity; Sensitivity and Specificity; Data Collection, Display, and Analysis; Tools for Identifying, Collecting, and Displaying Data; Tools for Improving and Monitoring Quality
PART III: TOTAL QUALITY ENVIRONMENT
6. Cost and Healthcare Quality. Selected Article: Accounting for the Cost of Quality
7. The Law, Ethics, and Total Quality: The Role of Laws and the Legal System; The Role of Ethics; The Interplay between Laws and Ethics; Legislative Impetus; Judicial Involvement in Determining Quality; Ethics and the Elevation of Quality
8. Total Quality and Management Philosophies: Organizational Cultures; The Use of Power in Organizations; TQ and Corporate Culture: Finding a Fit; Transforming Organizations to a TQ Philosophy and Culture; Managing the Process through Change
PART IV: STRATEGIC DIRECTIONS
9. Integration of Total Quality and Quality Assurance: Quality Assurance Activities in Healthcare; Total Quality; Transition to Total Quality: Manufacturing and Health Service Sectors; Comparison of QA and TQ; Integration of QA and TQ
10. Outcome Management and TQ: Background of Quality Outcome; What is Outcome and What is Outcome Management?; Measurement of Outcomes; Questions about Outcome Measurements; Research on Outcome and Its Management; Outcomes Management and Total Quality
11. Research and Total Quality: Quality Management Terminology; Background and Definitions; Research-Based Practice; Social Science Research; Guidelines and Databases; Research on TQ; Instrumentation Research; Epidemiological Studies; Cost Studies; Demonstration Projects; Research Utilization; Research Agenda
PART V: CASE STUDY
12. Total Quality within the VA Healthcare System: A Case Study
Annotated Bibliography; Index

Proceedings Article
31 Jul 1994
TL;DR: In this article, the authors present a two-level architecture for a data mining environment consisting of a mining tool and a parallel DBMS server, where the mining tool organizes and controls the search process, while the DBMS provides optimal response times for the few query types being used by the tool.
Abstract: One of the main obstacles in applying data mining techniques to large, real-world databases is the lack of efficient data management. In this paper, we present the design and implementation of an effective two-level architecture for a data mining environment. It consists of a mining tool and a parallel DBMS server. The mining tool organizes and controls the search process, while the DBMS provides optimal response times for the few query types being used by the tool. Key elements of our architecture are its use of fast and simple database operations, its re-use of results obtained by previous queries, its maximal use of main memory to keep the database hot-set resident, and its parallel computation of queries. Apart from a clear separation of responsibilities, we show that this architecture leads to competitive performance on large data sets. Moreover, this architecture provides a flexible experimentation platform for further studies in optimization of repetitive database queries and quality-driven rule discovery schemes.
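One key element of such an architecture, the re-use of results obtained by previous queries, can be sketched as a memoising layer in the mining tool: the DBMS server is contacted only for queries not seen before. Everything below is an illustrative toy; `toy_server` and the query syntax stand in for a real DBMS interface.

```python
# Hedged sketch of query-result re-use in a two-level mining architecture:
# the mining tool caches results keyed by query text, so repeated queries
# generated during the search never reach the (possibly parallel) server.

class QueryCache:
    def __init__(self, run_on_server):
        self._run = run_on_server   # function: query text -> result
        self._cache = {}            # query text -> cached result
        self.server_calls = 0       # how often the server was contacted

    def execute(self, query: str):
        if query not in self._cache:
            self.server_calls += 1
            self._cache[query] = self._run(query)
        return self._cache[query]

# Toy "server": count rows matching a condition in an in-memory table.
table = [{"age": a} for a in (21, 35, 35, 60)]

def toy_server(query: str):
    # Accepts only queries of the hypothetical form "COUNT age=<n>".
    value = int(query.split("=")[1])
    return sum(1 for row in table if row["age"] == value)

cache = QueryCache(toy_server)
cache.execute("COUNT age=35")   # goes to the server
cache.execute("COUNT age=35")   # answered from the cache
print(cache.server_calls)       # 1
```

In the paper's setting the cache would hold the main-memory "hot set", while a real parallel DBMS replaces `toy_server`; the division of labour (tool controls the search, server answers a few simple query types fast) stays the same.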

01 Jan 1994
TL;DR: This work elaborates the functional requirements that construction document management (CDM) systems should fulfil and describes a four-dimensional CDM modelling space: a product dimension (relation to the product model and product decomposition), a time dimension (document life cycle), an organisation dimension, and a presentation dimension.
Abstract: Within the context of the construction industry Electronic Document Management (EDM) systems offer the means for rapidly achieving a shallow level of integration by providing process integration and information management on a coarse, document sized level of detail. A question which so far has received little attention is how construction document management (CDM) can provide a smooth transition toward computer-integrated construction, based on full-grown building product data models. An important research question for researchers developing a theoretical basis for CIC is to define hybrid conceptual models which synthesise these two levels of data management and describe accurately in an application independent way the CIC processes where a part of the information is managed by CDM systems and a part using product and project models. As a pre-stage to the definition of formal models of the information manipulated by CDM systems we elaborate the functional requirements that such systems should fulfil. Next we describe a four dimensional CDM modelling space - product dimension (relation to product model and product decomposition), time dimension (document life cycle), organisation dimension and finally presentation dimension. In the end a document classification table which includes generic document properties sorted according to the four dimensions described above is presented and directions for further research are indicated.
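The four-dimensional modelling space can be pictured as a record type classifying each document along the product, time, organisation, and presentation dimensions. The sketch below is a hypothetical rendering; the field values are invented examples, not entries from the paper's classification table.

```python
# Illustrative data structure for the four-dimensional CDM modelling
# space: every managed document carries one value per dimension.

from dataclasses import dataclass

@dataclass
class DocumentClassification:
    product: str        # product dimension: link to product model / decomposition
    life_cycle: str     # time dimension: stage in the document life cycle
    organisation: str   # organisation dimension: responsible party
    presentation: str   # presentation dimension: medium or format

doc = DocumentClassification(
    product="load-bearing frame",
    life_cycle="draft",
    organisation="structural engineer",
    presentation="CAD drawing",
)
print(doc.life_cycle)  # draft
```

Such a record makes the hybrid CDM/product-model queries the paper motivates expressible directly, e.g. "all approved drawings for this product part, by this organisation".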