scispace - formally typeset
Topic

Data mart

About: Data mart is a research topic. Over the lifetime, 559 publications have been published within this topic receiving 8550 citations.


Papers
01 Jan 2008
TL;DR: A design research project was undertaken to demonstrate that an Access-based data mart could successfully streamline this report-generating process; the project also demonstrated the need to eliminate excessive detail and deliver highly summarized reports.
Abstract: Hospitals and medical centers participate in a physician profiling process. This process is important to ensure that physicians are providing safe care and to comply with regulations. One medical center was struggling with the ongoing generation of physician performance reports that were an important part of the profiling process. A design research project was undertaken to demonstrate that an Access-based data mart could successfully streamline this report-generating process. The research also demonstrated the need to eliminate excessive detail and deliver highly summarized reports. In addition, the research provided thorough documentation of the entire data mart development approach. This documentation can serve as a resource for future research and for other medical centers that might be struggling to manage the profiling report requirements.
Journal ArticleDOI
30 Apr 2020
TL;DR: In this article, a corporate data warehouse model summarizing a bank's profit, loss, and balance is presented, built with the bottom-up methodology at the enterprise level; this approach allows high flexibility and user friendliness because it is driven by the information needs of the individual business department.
Abstract: Data warehouse (DWH) and enterprise business intelligence solutions frequently used by companies blend reporting, analysis, and data mining services through rich visual components, providing meaningful, easy-to-interpret information for decision makers. This study summarizes a bank's profit, loss, and balance in a corporate data warehouse model built with the bottom-up methodology at the enterprise level. Building a data mart bottom-up allows high flexibility and user friendliness because it is driven by the information needs of the individual business department (here, finance). The methodology was also preferred because the fundamental concept of dimensional modelling, the star schema, is supported by the data modelling architecture of Oracle OBIEE 11g. One of the main pillars of a bank's pricing policy is controlling the profit and loss of its branches. After this concept was applied, corporate memory matured and reporting no longer depended on particular individuals. In addition, communication and information sharing within the finance department increased, personal productivity rose, a cost advantage was achieved, and with the widespread use of structured data, users' confidence in business intelligence solutions grew through the new data mart.
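The star schema at the core of the dimensional model described above can be sketched in a few lines. The following is a minimal illustration using Python's built-in sqlite3 module; all table names, column names, and figures are invented for this sketch (the paper builds its model in Oracle OBIEE 11g, not SQLite):

```python
import sqlite3

# Minimal star schema sketch for branch profit/loss reporting
# (table/column names and data are illustrative, not from the paper).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_branch (branch_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_date   (date_id   INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_pnl   (branch_id INTEGER REFERENCES dim_branch,
                         date_id   INTEGER REFERENCES dim_date,
                         profit    REAL, loss REAL);
""")
cur.executemany("INSERT INTO dim_branch VALUES (?,?,?)",
                [(1, "Central", "East"), (2, "Harbor", "West")])
cur.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(1, 2020, 1), (2, 2020, 2)])
cur.executemany("INSERT INTO fact_pnl VALUES (?,?,?,?)",
                [(1, 1, 100.0, 20.0), (1, 2, 120.0, 30.0), (2, 1, 80.0, 10.0)])

# Typical data-mart query: net result per branch across all periods.
rows = cur.execute("""
    SELECT b.name, SUM(f.profit - f.loss) AS net
    FROM fact_pnl f JOIN dim_branch b USING (branch_id)
    GROUP BY b.name ORDER BY b.name
""").fetchall()
print(rows)  # [('Central', 170.0), ('Harbor', 70.0)]
```

The central fact table holds measures (profit, loss) and foreign keys only; each dimension table carries the descriptive attributes that reports group and filter by.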
Book ChapterDOI
01 Jan 2016
TL;DR: This paper details the business process analysis and modeling stage of designing an automated solution for banking, focusing on the management of the lending activities business process, and presents the resulting business process model as well as the entity model.
Abstract: In this paper we detail the business process analysis and modeling stage of designing an automated solution for banking, focusing on the management of the lending activities business process. We present the resulting business process model as well as the entity model. The latter is the base upon which we design a data mart needed for implementing a scoring algorithm and data mining algorithms for profiling clients. The data mart is part of a departmental data warehouse, which ensures better flexibility in the analysis of current activity.
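A scoring algorithm run over data-mart rows, as mentioned above, can be as simple as a weighted scorecard. The sketch below is purely hypothetical: the features, weights, and cutoff are invented for illustration, since the paper does not disclose its scoring model:

```python
# Hypothetical weighted scorecard over client rows from a data mart
# (features, weights, and the 0.5 cutoff are invented for illustration).
WEIGHTS = {"income": 0.5, "years_employed": 0.3, "prior_defaults": -0.8}

def score(client):
    """Weighted sum of normalized client attributes."""
    return sum(WEIGHTS[k] * client[k] for k in WEIGHTS)

clients = [
    {"id": "A", "income": 0.9, "years_employed": 0.6, "prior_defaults": 0.0},
    {"id": "B", "income": 0.4, "years_employed": 0.2, "prior_defaults": 1.0},
]
# Profile clients into approve/review buckets using a cutoff score.
profiles = {c["id"]: ("approve" if score(c) >= 0.5 else "review")
            for c in clients}
print(profiles)  # {'A': 'approve', 'B': 'review'}
```

In practice the attributes would be read from the data mart's dimension and fact tables rather than hard-coded, and the weights would be fitted from historical outcomes.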
Proceedings ArticleDOI
18 Jun 2007
TL;DR: All of the algorithms developed in this paper were tested using a dataset from the 10-ton dump truck of the family of medium tactical vehicles (FMTV), with up to 35 sensors measuring everything from GPS position to engine speed to strain rate.
Abstract: VISION (versatile information system - integrated, on-line) represents a comprehensive, holistic, top-down approach to information collection, management, and ultimate transformation into knowledge. VISION employs a Web-based approach that combines a modular instrumentation suite and a digital library (VDLS) with modern communications technology. VDLS is a Web-based collaborative knowledge management system that incorporates associated data marts and additional test information sources such as reports, analyses, and documents. One of these data marts contains engineering performance data. A Web-based online analytical processing (OLAP) toolbox has been developed to allow querying of this data mart via a Java graphical user interface (GUI). Currently, the OLAP toolbox contains functions to perform metadata searches, view a Global Positioning System (GPS) map of the location of the item under test, view time series traces of all the parameters, generate custom plots of any parameter, or download raw data files. The work described in this paper adds functions to this toolbox that allow test engineers, data analysts, or mechanical engineers to quickly find the data of interest. Tools were developed that use wavelet transforms for both data validation and data de-noising. Other tools were created that use Google Earth to plot GPS coordinates as markers, where each marker contains a balloon of information (e.g., time, date, latitude, longitude, speed, direction). In addition, Google Earth was used in a spatial data mining application where the GPS coordinates of extreme values (i.e., values exceeding a given threshold) are plotted along the track. All of the algorithms developed in this paper were tested using a dataset from the 10-ton dump truck of the family of medium tactical vehicles (FMTV). Up to 35 sensors were used in these tests, measuring everything from GPS position to engine speed to strain rate.
The tests were conducted at the Aberdeen Test Center on a variety of courses covering everything from smooth paved surfaces to block gravel to dirt.
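The wavelet-based de-noising described above can be illustrated with a single-level Haar transform plus soft thresholding of the detail coefficients. The wavelet family and threshold rule here are assumptions for the sketch; the paper does not specify which it used:

```python
import math

# Sketch of wavelet de-noising: single-level Haar transform with
# soft thresholding of detail coefficients. (Haar and soft thresholding
# are assumptions; the paper does not name its wavelet or threshold rule.)
def haar_denoise(signal, threshold):
    n = len(signal) - len(signal) % 2          # drop an odd trailing sample
    avgs = [(signal[i] + signal[i + 1]) / 2 for i in range(0, n, 2)]
    dets = [(signal[i] - signal[i + 1]) / 2 for i in range(0, n, 2)]
    # Soft threshold: shrink small detail coefficients to zero.
    dets = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in dets]
    out = []
    for a, d in zip(avgs, dets):               # inverse Haar transform
        out.extend((a + d, a - d))
    return out

# Small wiggles are flattened while the large step is preserved.
noisy = [1.0, 1.2, 1.0, 0.8, 5.0, 5.1, 5.0, 4.9]
print(haar_denoise(noisy, 0.5))
```

Production tools would typically use a multi-level decomposition with a smoother wavelet and a data-driven threshold, but the shrink-the-details idea is the same.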
Proceedings ArticleDOI
01 Jan 2010
TL;DR: This paper analyzes the development of information systems and past studies on CPFR, then examines a development strategy for information systems based on CPFR, in order to provide technical support for the use of CPFR.
Abstract: As companies develop, they need more and more help from information systems, and some of those needs cannot be met because of the systems' limited functions. CPFR is a model that can be used in information system development to strengthen those functions. This paper analyzes the development of information systems and past studies on CPFR, then examines a development strategy for information systems based on CPFR, in order to provide technical support for the use of CPFR. It provides a separate model for each of the nine steps in the CPFR procedure and analyzes how key components, such as data mining, data marts, and the data warehouse, work in the systems.

Network Information
Related Topics (5)
Information system
107.5K papers, 1.8M citations
77% related
The Internet
213.2K papers, 3.8M citations
72% related
Scheduling (computing)
78.6K papers, 1.3M citations
72% related
Cloud computing
156.4K papers, 1.9M citations
71% related
Software
130.5K papers, 2M citations
70% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2021  13
2020  20
2019  26
2018  23
2017  26
2016  27