
Showing papers on "Business intelligence published in 2002"


Proceedings ArticleDOI
03 Jun 2002
TL;DR: The paper explores techniques for executing SQL queries over encrypted data and presents an algebraic framework for splitting a query so that computation at the client site is minimized.
Abstract: Rapid advances in networking and Internet technologies have fueled the emergence of the "software as a service" model for enterprise computing. Successful examples of commercially viable software services include rent-a-spreadsheet, electronic mail services, general storage services, and disaster protection services. The "Database as a Service" model gives users the power to create, store, modify, and retrieve data from anywhere in the world, as long as they have access to the Internet. It introduces several challenges, an important one being data privacy. It is in this context that we specifically address the issue of data privacy. There are two main privacy issues. First, the owner of the data needs to be assured that the data stored on the service-provider site is protected against data thefts from outsiders. Second, data needs to be protected even from the service providers, if the providers themselves cannot be trusted. In this paper, we focus on the second challenge. Specifically, we explore techniques to execute SQL queries over encrypted data. Our strategy is to process as much of the query as possible at the service provider's site, without having to decrypt the data. Decryption and the remainder of the query processing are performed at the client site. The paper explores an algebraic framework to split the query so as to minimize the computation at the client site. Results of experiments validating our approach are also presented.

1,351 citations
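To make the query-splitting idea above concrete, here is a minimal, illustrative sketch in Python: the client stores encrypted rows at the provider together with a coarse bucket label for the filter attribute, the server filters only at bucket granularity, and the client decrypts the candidates and applies the exact predicate. The bucket width, table layout, and helper names are assumptions for the demo, not the paper's actual scheme, and the example uses the third-party cryptography package.

```python
# A minimal sketch of the query-splitting idea described above: the server
# stores encrypted rows plus a coarse bucket label for the filter attribute,
# evaluates a range predicate only at bucket granularity, and the client
# decrypts the candidates and applies the exact predicate.
# Bucket boundaries and helper names are illustrative assumptions.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()
fernet = Fernet(key)

def bucket(salary: int) -> int:
    """Map a salary to a coarse bucket id (assumed 25k-wide buckets)."""
    return salary // 25_000

# --- client side: encrypt rows before shipping them to the provider ---
rows = [{"name": "alice", "salary": 61_000}, {"name": "bob", "salary": 48_000},
        {"name": "carol", "salary": 95_000}]
server_table = [
    {"etuple": fernet.encrypt(json.dumps(r).encode()), "salary_bucket": bucket(r["salary"])}
    for r in rows
]

# --- server side: coarse filtering over encrypted data only ---
def server_filter(table, low: int, high: int):
    """Return every encrypted row whose bucket *might* satisfy low <= salary <= high."""
    lo_b, hi_b = bucket(low), bucket(high)
    return [t["etuple"] for t in table if lo_b <= t["salary_bucket"] <= hi_b]

# --- client side: decrypt the candidates and apply the exact predicate ---
candidates = server_filter(server_table, 50_000, 90_000)
result = [r for r in (json.loads(fernet.decrypt(c)) for c in candidates)
          if 50_000 <= r["salary"] <= 90_000]
print(result)   # carol passes the coarse server filter but is removed at the client
```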


Proceedings ArticleDOI
07 Aug 2002
TL;DR: A novel paradigm for data management in which a third party service provider hosts "database as a service", providing its customers with seamless mechanisms to create, store, and access their databases at the host site is explored.
Abstract: We explore a novel paradigm for data management in which a third party service provider hosts "database as a service", providing its customers with seamless mechanisms to create, store, and access their databases at the host site. Such a model alleviates the need for organizations to purchase expensive hardware and software, deal with software upgrades, and hire professionals for administrative and maintenance tasks which are taken over by the service provider. We have developed and deployed a database service on the Internet, called NetDB2, which is in constant use. In a sense, a data management model supported by NetDB2 provides an effective mechanism for organizations to purchase data management as a service, thereby freeing them to concentrate on their core businesses. Among the primary challenges introduced by "database as a service" are the additional overhead of remote access to data, an infrastructure to guarantee data privacy, and user interface design for such a service. These issues are investigated. We identify data privacy as a particularly vital problem and propose alternative solutions based on data encryption. The paper is meant as a challenge for the database community to explore a rich set of research issues that arise in developing such a service.

707 citations


Journal ArticleDOI
TL;DR: In this article, the authors address the following issues: why mindset matters, what a global mindset is, the value of a global mindset, and finally, what companies can do to cultivate a global mindset.
Abstract: Executive Overview The economic landscape of the world is changing rapidly and becoming increasingly global. For virtually every medium-sized to large company in developed as well as developing economies, market opportunities, critical resources, cutting-edge ideas, and competitors lurk not just around the corner in the home market but increasingly in distant and often little-understood regions of the world as well. How successful a company is at exploiting emerging opportunities and tackling their accompanying challenges depends crucially on how intelligent it is at observing and interpreting the dynamic world in which it operates. Creating a global mindset is one of the central ingredients required for building such intelligence. In this article, we address the following issues: why mindset matters, what a global mindset is, the value of a global mindset, and finally, what companies can do to cultivate a global mindset.

586 citations


Patent
11 Oct 2002
TL;DR: Systems and methods for collecting, processing, analyzing, and visualizing business information, including business intelligence, data visualization, data warehousing, and data mining; real-time monitoring of web site interactions allows users to modify and fine-tune their websites to maximize the value realized.
Abstract: Systems and methods for processing and reporting information and data, such as business information, and more particularly, to systems, software, hardware, products, and processes for use by businesses, individuals and other organizations to collect, process, distribute, analyze and visualize information, including, but not limited to, business intelligence, data visualization, data warehousing, and data mining. Real-time monitoring of web site interactions allows users to modify and fine-tune their websites to maximize value realized.

498 citations


01 Jan 2002
TL;DR: This paper is concerned with detecting behavioural fraud through the analysis of longitudinal data, and discusses two methods for unsupervised fraud detection in credit data and applies them to some real data sets.
Abstract: Credit card fraud falls broadly into two categories: behavioural fraud and application fraud. Application fraud occurs when individuals obtain new credit cards from issuing companies using false personal information and then spend as much as possible in a short space of time. However, most credit card fraud is behavioural and occurs when details of legitimate cards have been obtained fraudulently and sales are made on a 'Cardholder Not Present' basis. These sales include telephone sales and e-commerce transactions where only the card details are required. In this paper, we are concerned with detecting behavioural fraud through the analysis of longitudinal data. These data usually consist of credit card transactions over time, but can include other variables, both static and longitudinal. Statistical methods for fraud detection are often classification (supervised) methods that discriminate between known fraudulent and non-fraudulent transactions; however, these methods rely on accurate identification of fraudulent transactions in historical databases – information that is often in short supply or non-existent. We are particularly interested in unsupervised methods that do not use this information but instead detect changes in behaviour or unusual transactions. We discuss two methods for unsupervised fraud detection in credit data in this paper and apply them to some real data sets. Peer group analysis is a new tool for monitoring behaviour over time in data mining situations. In particular, the tool detects individual accounts that begin to behave in a way distinct from accounts to which they had previously been similar. Each account is selected as a target account and is compared with all other accounts in the database, using either external comparison criteria or internal criteria summarizing earlier behaviour patterns of each account. Based on this comparison, a peer group of accounts most similar to the target account is chosen. The behaviour of the peer group is then summarized at each subsequent time point, and the behaviour of the target account compared with the summary of its peer group. Those target accounts exhibiting behaviour most different from their peer group summary behaviour are flagged as meriting closer investigation. Break point analysis is a tool that identifies changes in spending behaviour based on the transaction information in a single account. Recent transactions are compared with previous spending behaviour to detect features such as rapid spending and an increase in the level of spending, features that would not necessarily be captured by outlier detection. Introduction In the fight against fraud, actions fall under two broad categories: fraud prevention and fraud detection. Fraud prevention describes measures to stop fraud occurring in the first place. These include PINs for bankcards, Internet security systems for credit card transactions and passwords on telephone bank accounts. In contrast, fraud detection involves identifying fraud as quickly as possible once it has been perpetrated. We apply fraud detection once fraud prevention has failed, using detection methods continuously, as we will usually be unaware that fraud prevention has failed. In this article we are concerned solely with fraud detection. Fraud detection must evolve continuously. Once criminals realise that a certain mode of fraudulent behaviour can be detected, they will adapt their strategies and try others. 
Of course, new criminals are also attempting to commit fraud and many of these will not be aware of the fraud detection methods that have been successful in the past, and will adopt strategies that lead to identifiable frauds. This means that the earlier detection tools need to be applied as well as the latest developments. Statistical fraud detection methods may be ‘supervised’ or ‘unsupervised’. In supervised methods, models are trained to discriminate between fraudulent and non-fraudulent behaviour, so that new observations can be assigned to classes so as to optimise some measure of classification performance. Of course, this requires one to be confident about the true classes of the original data used to build the models; uncertainty is introduced when legitimate transactions are mistakenly reported as fraud or when fraudulent observations are not identified as such. Supervised methods require that we have examples of both classes, and they can only be used to detect frauds of a type that have previously occurred. These methods also suffer from the problem of unbalanced class sizes: in fraud detection problems, the legitimate transactions generally far outnumber the fraudulent ones and this imbalance can cause misspecification of models. Brause et al (1999) say that, in their database of credit card transactions, ‘the probability of fraud is very low (0.2%) and has been lowered in a preprocessing step by a conventional fraud detecting system down to 0.1%.’ Hassibi (2000) remarks ‘Out of some 12 billion transactions made annually, approximately 10 million – or one out of every 1200 transactions – turn out to be fraudulent.’ In contrast, unsupervised methods simply seek those accounts, customers, etc. whose behaviour is ‘unusual’. We model a baseline distribution that represents normal behaviour and then attempt to detect observations that show greatest departure from this norm. These can then be examined more closely. Outliers are a basic form of nonstandard observation that can be used for fraud detection. This leads us to note the fundamental point that we can seldom be certain, by statistical analysis alone, that a fraud has been perpetrated. Rather, the analysis should be regarded as alerting us to the fact that an observation is anomalous, or more likely to be fraudulent than others – so that it can then be investigated in more detail. One can think of the objective of the statistical analysis as being to return a suspicion score (where we will regard a higher score as more suspicious than a lower one). The higher the score is, then the more unusual is the observation, or the more like previously fraudulent values it is. The fact that there are many different ways in which fraud can be perpetrated, and many different scenarios in which it can occur, means that there are many different ways of computing suspicion scores. We can compute suspicion scores for each account in the database, and these scores can be updated as time progresses. By ordering accounts according to their suspicion score, we can focus attention on those with the highest scores, or on those that exhibit a sudden increase in suspicion score. If we have a limited budget, so that we can only afford to investigate a certain number of accounts or records, we can concentrate investigation on those thought to be most likely to be fraudulent. 
Credit Card Fraud Credit card fraud is perpetrated in various ways but can be broadly categorised as application, ‘missing in post’, stolen/lost card, counterfeit card and ‘cardholder not present’ fraud. Application fraud arises when individuals obtain new credit cards from issuing companies using false personal information; application fraud totalled £10.2 million in 2000 (Source: APACS) and is the only type of fraud that actually declined between 1999 and 2000. ‘Missing in post’ (£17.3m in 2000) describes the interception of credit cards in the post by fraudsters before they reach the cardholder. Stolen or lost cards accounted for £98.9 million in fraud in 2000, but the greatest percentage increases between 1999 and 2000 were in counterfeit card fraud (£50.3m to £102.8m) and ‘cardholder not present’ (i.e. postal, phone, internet transactions) fraud (£29.3m to £56.8m). To commit these last two types of fraud it is necessary to obtain the details of the card without the cardholder’s knowledge. This is done in various ways, including employees using an unauthorised ‘swiper’ that downloads the encoded information onto a laptop computer and hackers obtaining credit card details by intrusion into companies’ computer networks. A counterfeit card is then made, or the card details simply used for phone, postal or Internet transactions. Supervised methods to detect fraudulent transactions can be used to discriminate between those accounts or transactions known to be fraudulent and those known (or at least presumed) to be legitimate. For example, traditional credit scorecards (Hand and Henley, 1997) are used to detect customers who are likely to default, and the reasons for this may include fraud. Such scorecards are based on the details given on the application forms, and perhaps also on other details, such as bureau information. Classification techniques, such as statistical discriminant analysis and neural networks, can be used to discriminate between fraudulent and non-fraudulent transactions to give transactions a suspicion score. However, information about fraudulent transactions may not be available and in these cases we apply unsupervised methods to attempt to detect fraud. These methods are scarce in the literature and are less popular than supervised methods in practice as suspicion scores reflect a propensity to act anomalously when compared with previous behaviour. This is different to suspicion scores obtained using supervised techniques, which are guided to reflect a propensity to commit fraud in a manner already previously discovered. The idea behind suspicion scores from unsupervised methods is that unusual behaviour or transactions can often be indicators of fraud. An advantage of using unsupervised methods over supervised methods is that previously undiscovered types of fraud may be detected. Supervised methods are only trained to discriminate between legitimate transactions and previously known fraud. Unsupervised methods and their application to fraud detection As we mentioned above, the emphasis on fraud detection methodology is with supervised techniques. In particular, neural networks

369 citations
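The break point analysis described in the abstract above can be illustrated with a short, hedged sketch: recent transaction amounts on a single account are compared with the account's earlier spending, and a large standardized difference raises the suspicion score. The window sizes, threshold, and data are invented for the example and are not the authors' exact method.

```python
# Illustrative sketch of break point analysis on a single account:
# compare the most recent transactions against earlier spending and
# flag the account when recent behaviour departs sharply from the past.
# Window sizes and the suspicion threshold are assumptions for the demo.
from statistics import mean, stdev

def suspicion_score(amounts: list[float], recent_n: int = 5) -> float:
    """Standardised difference between recent and historical mean spend."""
    history, recent = amounts[:-recent_n], amounts[-recent_n:]
    if len(history) < 2:
        return 0.0
    spread = stdev(history) or 1e-9          # guard against zero variance
    return (mean(recent) - mean(history)) / spread

# A normally modest spender whose last few transactions jump sharply.
transactions = [22.0, 35.5, 18.0, 40.0, 27.5, 31.0, 24.0, 19.5,
                210.0, 185.0, 240.0, 199.0, 260.0]
score = suspicion_score(transactions)
print(f"suspicion score = {score:.1f}")
if score > 3.0:                              # assumed alerting threshold
    print("flag account for closer investigation")
```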


Journal ArticleDOI
TL;DR: This paper describes a particular approach based on an OLAP (on-line analytical processing) model enhanced with text analysis, and describes two tools that have been developed to explore this approach.
Abstract: Enterprise executives understand that timely, accurate knowledge can mean improved business performance. Two technologies have been central in improving the quantitative and qualitative value of the knowledge available to decision makers: business intelligence and knowledge management. Business intelligence has applied the functionality, scalability, and reliability of modern database management systems to build ever-larger data warehouses, and to utilize data mining techniques to extract business advantage from the vast amount of available enterprise data. Knowledge management technologies, while less mature than business intelligence technologies, are now capable of combining today's content management systems and the Web with vastly improved searching and text mining capabilities to derive more value from the explosion of textual information. We believe that these systems will blend over time, borrowing techniques from each other and inspiring new approaches that can analyze data and text together, seamlessly. We call this blended technology BIKM. In this paper, we describe some of the current business problems that require analysis of both text and data, and some of the technical challenges posed by these problems. We describe a particular approach based on an OLAP (on-line analytical processing) model enhanced with text analysis, and describe two tools that we have developed to explore this approach--eClassifier performs text analysis, and Sapient integrates data and text through an OLAP-style interaction model. Finally, we discuss some new research that we are pursuing to enhance this approach.

283 citations
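A rough sketch of the blended "BIKM" idea, assuming a naive keyword classifier in place of eClassifier and a plain dictionary roll-up in place of Sapient's OLAP model: free-text comments are assigned a topic, and the topic then acts as one more dimension over which a numeric measure is aggregated. The keyword rules, records, and topic names are invented.

```python
# Toy sketch of analyzing data and text together: classify free-text comments
# into topics, then use the topic as an OLAP dimension next to a numeric
# measure (revenue).  Keyword rules, records, and topics are made up.
from collections import defaultdict

TOPIC_KEYWORDS = {
    "billing": {"invoice", "charge", "refund"},
    "support": {"crash", "error", "slow"},
}

def classify(comment: str) -> str:
    words = set(comment.lower().split())
    for topic, keywords in TOPIC_KEYWORDS.items():
        if words & keywords:
            return topic
    return "other"

records = [
    {"region": "EMEA", "revenue": 1200.0, "comment": "Refund requested after a duplicate charge"},
    {"region": "EMEA", "revenue":  800.0, "comment": "Dashboard is slow during month end"},
    {"region": "APAC", "revenue": 1500.0, "comment": "Invoice arrived late again"},
    {"region": "APAC", "revenue":  950.0, "comment": "Everything works fine"},
]

# Roll up revenue by (region, topic), i.e. an OLAP-style cross-tab where one
# dimension is derived from unstructured text rather than stored data.
cube = defaultdict(float)
for r in records:
    cube[(r["region"], classify(r["comment"]))] += r["revenue"]

for (region, topic), revenue in sorted(cube.items()):
    print(f"{region:5} {topic:8} {revenue:8.1f}")
```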


01 Jan 2002
TL;DR: The legal basis of this concept consists of the Constitution of the Russian Federation, the Federal laws, other legislative acts of Russian Federation that regulate the activity of Federal bodies of state power in foreign policy, generally recognized principles and norms of international law, and international treaties as mentioned in this paper.
Abstract: The legal basis of this concept consists of the Constitution of the Russian Federation, the Federal laws, other legislative acts of the Russian Federation that regulate the activity of Federal bodies of state power in foreign policy, generally recognized principles and norms of international law, and international treaties of the Russian Federation, as well as the Concept of National Security of the Russian Federation that was approved by Decree No. 24 of the President of the Russian Federation on January 10, 2000.

222 citations



Journal ArticleDOI
TL;DR: Kenneth Waltz's "neorealism" marked a major split from traditional realism, which henceforth became known as "classical" realism; since then, especially during the last decade, new variants such as "structural realism" and "offensive realism" have proliferated.
Abstract: Great Power Politics. New York: W.W. Norton, 2001. More than fifty years have passed since Hans Morgenthau introduced "realism" as an approach to the study of international relations. Since then, the approach has withstood not only a steady assault from such external quarters as liberal institutionalism, the democratic peace school, and "constructivism" but also a marked divisive tendency. Splinter groups have emerged, each waving an identifying adjective to herald some new variant or emphasis. The first of these came in the late 1970s, when Kenneth Waltz's "neorealism" marked a major split from Morgenthau's traditional realism, which henceforth became known as "classical" realism. Since then, especially during the last decade, new variants and new tags have proliferated. The field of international relations now has at least two varieties of "structural realism," probably three kinds of "offensive realism," Mearsheimer's World—Offensive Realism and the Struggle for Security, Glenn H. Snyder

117 citations



Book ChapterDOI
04 Sep 2002
TL;DR: In this article, the authors present an architecture of an ETL environment for real-time data warehouses, which supports a continual near realtime data propagation, taking full advantage of existing J2EE (Java 2 Platform, Enterprise Edition) technology.
Abstract: The amount of information available to large-scale enterprises is growing rapidly. While operational systems are designed to meet well-specified (short) response time requirements, the focus of data warehouses is generally the strategic analysis of business data integrated from heterogeneous source systems. The decision making process in traditional data warehouse environments is often delayed because data cannot be propagated from the source system to the data warehouse in time. A real-time data warehouse aims at decreasing the time it takes to make business decisions and tries to attain zero latency between the cause and effect of a business decision. In this paper we present an architecture of an ETL environment for real-time data warehouses, which supports a continual near real-time data propagation. The architecture takes full advantage of existing J2EE (Java 2 Platform, Enterprise Edition) technology and enables the implementation of a distributed, scalable, near real-time ETL environment. Instead of using vendor proprietary ETL (extraction, transformation, loading) solutions, which are often hard to scale and often do not support an optimization of allocated time frames for data extracts, we propose in our approach ETLets (spoken "et-lets") and Enterprise Java Beans (EJB) for the ETL processing tasks.
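A minimal sketch of the continual, near-real-time propagation idea, in Python rather than the J2EE/ETLet architecture the paper actually proposes: change records from an operational source arrive on a queue and are transformed and appended to the warehouse as they come in, instead of waiting for a nightly batch. The queue, record layout, and transform are stand-in assumptions.

```python
# Minimal sketch of continual, near-real-time propagation: change records
# from an operational source are consumed from a queue, transformed, and
# appended to the warehouse as they arrive instead of in a nightly batch.
# The queue, record layout, and transform are stand-in assumptions.
import queue
import threading
import time

change_queue: "queue.Queue[dict | None]" = queue.Queue()
warehouse_facts: list[dict] = []          # stands in for the warehouse fact table

def transform(change: dict) -> dict:
    """Tiny transformation step: normalise currency and stamp the load time."""
    return {"order_id": change["order_id"],
            "amount_eur": round(change["amount"] * change.get("fx_to_eur", 1.0), 2),
            "loaded_at": time.time()}

def etl_worker() -> None:
    """Runs continuously; each queued change is propagated moments after it arrives."""
    while True:
        change = change_queue.get()
        try:
            if change is None:            # shutdown signal for this demo
                break
            warehouse_facts.append(transform(change))
        finally:
            change_queue.task_done()

threading.Thread(target=etl_worker, daemon=True).start()

# The operational system emits change records as transactions happen.
change_queue.put({"order_id": 1, "amount": 120.0, "fx_to_eur": 0.92})
change_queue.put({"order_id": 2, "amount": 80.0})
change_queue.put(None)
change_queue.join()                       # wait until everything queued has been loaded

print(warehouse_facts)
```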

Journal ArticleDOI
TL;DR: An insight into the issue of ambiguous weak signs is given through a new strategic business intelligence system called PUZZLE, which shows that the individual cognitive process appears heuristic when interpreting weak signs.
Abstract: Business intelligence (BI) is a strategic approach for systematically targeting, tracking, communicating and transforming relevant weak signs into actionable information on which strategic decision-making is based. Despite the increasing importance of BI, there is little underlying theoretical work that can directly guide the interpretation of ambiguous weak signs. This paper gives an insight into the issue through a new strategic business intelligence system called PUZZLE. We describe this system and validate it by designing a prototype, testing the system using in-depth interviews, and holding learning sessions in order to further knowledge about BI. The main results from tests show that: interpreting weak signs is potentially important for senior managers, consultants, and researchers; interpretation can be achieved gradually by bringing the weak signs together using a tracking form based upon the concept of actor/theme/weak signs/enrichment/links; interpreting weak signs is a complex process of establishing links between the weak signs. Final results show that the individual cognitive process appears heuristic when interpreting weak signs. Implications for strategic management practice and research are addressed.
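The tracking form the abstract mentions (actor/theme/weak signs/enrichment/links) can be pictured as a small data structure for gradually connecting fragments of information. The sketch below is only an illustration of that concept; the field names and sample entries are invented, not taken from PUZZLE.

```python
# Sketch of a weak-sign tracking form in the spirit of the
# actor / theme / weak signs / enrichment / links concept.  The dataclass
# fields and the sample entries are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class WeakSign:
    sign_id: int
    actor: str                  # who the sign is about (competitor, supplier, ...)
    theme: str                  # strategic theme the sign relates to
    text: str                   # the raw fragment of information
    enrichment: str = ""        # analyst's added context or interpretation
    links: set[int] = field(default_factory=set)   # ids of related signs

def link(a: WeakSign, b: WeakSign) -> None:
    """Record that two weak signs have been interpreted as connected."""
    a.links.add(b.sign_id)
    b.links.add(a.sign_id)

s1 = WeakSign(1, "Competitor X", "capacity", "Job ads for 40 plant operators in Lyon")
s2 = WeakSign(2, "Competitor X", "capacity", "Permit request filed for site extension",
              enrichment="Same industrial zone as their current plant")
link(s1, s2)   # bringing the fragments together is where the interpretation emerges

for s in (s1, s2):
    print(f"[{s.sign_id}] {s.actor} / {s.theme}: {s.text} -> linked to {sorted(s.links)}")
```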

Journal ArticleDOI
TL;DR: A simplified model is developed for identifying and classifying the functions and features in corporate portal software, which can also serve as a benchmarking tool for evaluating portal capabilities.
Abstract: The enterprise portal is a new type of information system that can help companies and their employees to manage, share, and use previously disparate information. More than 60 vendors offer corporate portal solutions. With so many vendors, selecting the right one can be a difficult task. The primary objective of this research is to identify and evaluate the functions and features in enterprise portal products. In particular, this study develops a simplified model that can be used for identifying and classifying the functions and features in corporate portal software. The results of this study may be useful to information technology managers, educators, and students involved in knowledge management, business intelligence, information systems resources management, and data management. System developers, software engineers, project managers, financial managers, and data architects can use the functions and features identified in this study as benchmarking tools for evaluating portal capabilities.



Proceedings ArticleDOI
12 Dec 2002
TL;DR: A set of criteria for evaluating and selecting Web resources as external data sources of a data warehouse and how to screen Web data sources using multi-criteria decision making (MCDM) methods are developed and discussed.
Abstract: A company's local data is often insufficient for analyzing market trends and making reasonable business plans. Decision making must also be based on information from suppliers, partners and competitors. Systematically integrating suitable external data from the Web into a data warehouse is a meaningful solution and will benefit the enterprise. However, the autonomy and dynamics of the Web make the task of selecting relevant and qualified external data from the Web challenging. We develop a set of criteria for evaluating and selecting Web resources as external data sources of a data warehouse and discuss how to screen Web data sources using multi-criteria decision making (MCDM) methods. The final decision with respect to selecting Web sources is sensitive to critical factors, i.e., the criterion weight and performance score of alternatives in terms of each criterion. We analyzed the sensitivity of the final rank of alternatives in terms of critical factors in order to gain an insight into the stability of our final decision. The comparison of several MCDM approaches for Web source screening is also presented.
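One way to make the screening step concrete is a weighted-sum score over the evaluation criteria, followed by a crude sensitivity check on the weights. The paper compares several MCDM methods, of which a weighted sum is only the simplest; the criteria, weights, and performance scores below are invented for illustration.

```python
# Weighted-sum screening of candidate Web data sources plus a crude
# sensitivity check on the criterion weights.  Criteria, weights, and the
# performance scores are invented; a weighted sum is only the simplest of
# the MCDM methods the paper considers.
CRITERIA = ["relevance", "timeliness", "credibility", "accessibility"]
WEIGHTS  = {"relevance": 0.4, "timeliness": 0.2, "credibility": 0.3, "accessibility": 0.1}

SOURCES = {                                   # performance scores on a 0..1 scale
    "supplier portal":    {"relevance": 0.9, "timeliness": 0.6, "credibility": 0.8, "accessibility": 0.7},
    "industry news site": {"relevance": 0.7, "timeliness": 0.9, "credibility": 0.6, "accessibility": 0.9},
    "public forum":       {"relevance": 0.6, "timeliness": 0.8, "credibility": 0.3, "accessibility": 0.9},
}

def rank(weights: dict) -> list[tuple[str, float]]:
    scores = {name: sum(weights[c] * perf[c] for c in CRITERIA)
              for name, perf in SOURCES.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print("baseline ranking:", rank(WEIGHTS))

# Sensitivity: shift weight from relevance to timeliness and see whether the
# final ranking (and hence the screening decision) changes.
perturbed = dict(WEIGHTS, relevance=0.25, timeliness=0.35)
print("perturbed ranking:", rank(perturbed))
```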


Patent
19 Feb 2002
TL;DR: In this article, the authors present a business intelligence monitor method and system, leveraging the functionality of an existing business information reporting infrastructure, which includes the step of authoring a monitor document derived from a determined methodology by creating one or more intelligence indicators, establishing thresholds for the created intelligence indicators and selecting status definitions for the established thresholds.
Abstract: The present invention is directed to a business intelligence monitor method and system. The method, leveraging the functionality of an existing business information-reporting infrastructure, includes the step of authoring a monitor document derived from a determined methodology by creating one or more intelligence indicators, establishing thresholds for the one or more created intelligence indicators, and selecting status definitions for the established thresholds. The method further includes the step of building a report guided by the authored monitor document by retrieving data from a data source for each of the one or more intelligence indicators, assigning statuses to each of the one or more intelligence indicators based on the retrieved data, and generating a report incorporating the one or more intelligence indicators, retrieved data, and assigned statuses. The method further includes the step of publishing the built report by parsing one or more selected instruction templates to obtain publishing instructions and publishing predicated on the obtained publishing instructions.
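A hedged sketch of the monitor-document idea from the patent: each intelligence indicator carries thresholds and status definitions, and building the report means fetching a value for each indicator and assigning it a status. The indicator names, thresholds, and retrieved values are invented examples, not the patented implementation.

```python
# Sketch of the monitor-document idea: each intelligence indicator carries
# thresholds and status definitions; building the "report" means fetching a
# value for each indicator and assigning a status from its thresholds.
# Indicator names, thresholds, and the data source are invented examples.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    warn_at: float        # threshold separating "on track" from "warning"
    alert_at: float       # threshold separating "warning" from "alert"
    higher_is_worse: bool = True

    def status(self, value: float) -> str:
        # Negate both value and thresholds when lower values are the bad ones,
        # so the same comparison logic works in either direction.
        if self.higher_is_worse:
            v, warn, alert = value, self.warn_at, self.alert_at
        else:
            v, warn, alert = -value, -self.warn_at, -self.alert_at
        if v >= alert:
            return "alert"
        return "warning" if v >= warn else "on track"

monitor_document = [
    Indicator("customer churn rate (%)", warn_at=2.0, alert_at=4.0),
    Indicator("weekly revenue (k$)", warn_at=90.0, alert_at=75.0, higher_is_worse=False),
]

retrieved = {"customer churn rate (%)": 2.7, "weekly revenue (k$)": 72.0}   # from a data source

report = [(ind.name, retrieved[ind.name], ind.status(retrieved[ind.name]))
          for ind in monitor_document]
for name, value, status in report:
    print(f"{name}: value={value} status={status}")
```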


Book
01 Jan 2002
TL;DR: This book discusses the evolution of Financial Business Intelligence, the power of Business Intelligence tools, and the challenges faced in evaluating, selecting, and implementing a Business Intelligence System.
Abstract: Foreword. Introduction. Part One: Evolution of Financial Business Intelligence. 1. History and Future of Business Intelligence. History of BI. Trends. 2. Leveraging the Power of Business Intelligence Tools. New Breed of Business Intelligence Tools. 3. Why Consider a Financial BI Tool in Your Organization? Defining the Financial Datawarehouse. What Is Your Company's Business Intelligence Readiness? Part Two: BI Technology. 4. Platforms and Differences in Technology. Financial and Nonfinancially Focused Tools. Who Are the Players? Unix versus Microsoft as a Platform Choice. 5. Performance: Getting Information on Time. Fast Enough for Your Business. Periodicity Considerations. Data Storage Methodology (ROLAP, MOLAP, or HOLAP?). 6. Building the Datawarehouse/Mart. Important Success Factors. Using Financial Data Sources. Staging Data for Financial Analysis. 7. Front-end Analytic Tools. Specialized BI Tools. Real-Time Analysis: Myth or Reality (Technology Perspective). Excel Add-Ins. Traditional Report Writer versus OLAP-Based Analysis Tools. 8. Security. Role-Based Access. Internet Security: Only as Good as Your Barriers. 9. The Internet's Impact on Business Intelligence. Everyone's a Player. What Is a Portal? How Do I Deliver BI Information for the Internet? Part Three: Software Evaluation and Selection. 10. Selecting a Business Intelligence Solution. Create a Plan. Using a Software Selection Company. 11. Software Evaluation: Factors to Consider. Expected Use Now and in the Future. Getting the Rest of the Company to Buy in to the Process. Cost/Benefit Analysis. Return on Investment Analysis. Features and Flexibility. Compatibility with Existing Software. Ease of Use. Software Stability. Vendor-Related Items. Working with an Implementation Partner. How to Select: Summary. 12. Outsourcing: The New Alternative. How It Works. When You Should Consider an Application Service Provider. Selection Criteria for an Application Service Provider. Ensuring Continuous Success. 13. Buyer's Guide. Query and Reporting Systems. Decision Support Systems. OLAP. Enterprise Information Portals. Datawarehouse Software. Extraction, Transformation, and Loading Tools Vendors. eLearning Tools Vendors. Part Four: Implementing a Business Intelligence System. 14. Project Planning. Business Intelligence Project Dos. Business Intelligence Project Don'ts. Drafting and Executing the Project Plan. Sample Project Plan. 15. Datawarehouse or Data Mart? 16. Multidimensional Model Definition. Defining Dimension Hierarchies and Dimension Types. Multidimensional Schemas: Star and Snowflake. 17. Model Maintenance. Periodicity Considerations. Slowly Changing Dimensions. More Help in Maintaining Dimensions. 18. Financial Data Modeling. Data Collection. Extending Your Financial Vision! Balance Sheet. 19. Survey of Datawarehouse Users. The Survey. Analysis of Responses. Appendix A: Sample RFP. Appendix B: Software Candidate Evaluation and Rating Sheet. Appendix C: Sample License Agreement. Appendix D: Sample Confidentiality and Nondisclosure Agreement. (Sales/Demo Process). Appendix E: Sample Support Plan/Agreement. Appendix F: Sample Project Plan. Appendix G: Sample Consulting Agreement. Appendix H: Vendor Addresses. Appendix I: References and Further Reading. Glossary. Index.

Patent
01 Mar 2002
TL;DR: The Electronic Marketplace Solution (EMS) as discussed by the authors is an Internet portal which assists Independent Convenience Retailers through the provision of a single system to sell, buy and pay for goods/services and manage their business more effectively.
Abstract: Broadly, the invention provides an Electronic Marketplace Solution (EMS) which may be embodied, in part or in whole, as a method, system or computer readable medium of instructions. The invention provides a business to business Internet portal serving companies operating within the convenience marketplace. In one embodiment, the invention is an Internet portal which assists: Independent Convenience Retailers through the provision of a single system to sell, buy and pay for goods/services and manage their business more effectively; Organised Convenience Groups, by providing them access to improved network management, the ability to ensure compliance and the ability to reduce costs; FMCG Manufacturers in the areas of secondary supply chain efficiencies, by improved business intelligence and direct access to Convenience Retailers; Wholesalers/Logistics Providers, for reduced costs and improved efficiency and enabling them to expand their customer base; and/or Service Providers, by enabling them to more efficiently reach a large target market.

Journal ArticleDOI
TL;DR: In this article, the authors examine various issues raised by the idea of CRM and how it has affected traditional views on marketing, focusing on how information technology in the form of the Internet and business intelligence solutions have enabled large businesses to focus on the customer as well as on their products and sales levels.
Abstract: This paper examines various issues raised by the idea of Customer Relationship Management (CRM) and how it has affected traditional views on marketing. It uses three case studies from European telecommunications companies to illustrate the points made, focusing on how information technology in the form of the Internet and business intelligence solutions has enabled large businesses to focus on the customer as well as on their products and sales levels. CRM is an attitude that needs to pervade the company, but it needs a solid foundation of knowledge of customers. This knowledge comes not only from customer-facing employees but also from the vast amounts of data collected by companies today. It is the technological infrastructure that allows this knowledge to be distilled from data about customers and their interactions with the company, facilitates better business decisions, and encourages customer loyalty and retention.

Journal ArticleDOI
TL;DR: Data warehousing concepts are brought to life through a case study of Harrah’s Entertainment, a firm that became a leader in the gaming industry with its CRM business strategy supported by data warehousing.
Abstract: Data warehousing is a strategic business and IT initiative in many organizations today. Data warehouses can be developed in two alternative ways -- the data mart and the enterprisewide data warehouse strategies -- and each has advantages and disadvantages. To create a data warehouse, data must be extracted from source systems, transformed, and loaded to an appropriate data store. Depending on the business requirements, either relational or multidimensional database technology can be used for the data stores. To provide a multidimensional view of the data using a relational database, a star schema data model is used. Online analytical processing can be performed on both kinds of database technology. Metadata about the data in the warehouse is important for IT and end users. A variety of data access tools and applications can be used with a data warehouse – SQL queries, management reporting systems, managed query environments, DSS/EIS, enterprise intelligence portals, data mining, and customer relationship management. A data warehouse can be used to support a variety of users – executives, managers, analysts, operational personnel, customers, and suppliers. Data warehousing concepts are brought to life through a case study of Harrah’s Entertainment, a firm that became a leader in the gaming industry with its CRM business strategy supported by data warehousing.
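The star schema and roll-up described above can be shown end to end with nothing but the Python standard library: a tiny fact table keyed to two dimension tables in SQLite, queried with the join-and-group-by aggregation an OLAP tool would generate. The table layout and figures are invented; the gaming-themed categories are only a nod to the Harrah's case study.

```python
# Tiny star schema in SQLite (stdlib): one fact table keyed to two dimension
# tables, queried with the kind of join-and-group-by rollup an OLAP tool
# would generate.  Table layout and data are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);

    INSERT INTO dim_date    VALUES (1, 2002, 1), (2, 2002, 2);
    INSERT INTO dim_product VALUES (10, 'slots'), (11, 'dining');
    INSERT INTO fact_sales  VALUES (1, 10, 500.0), (1, 11, 120.0),
                                   (2, 10, 640.0), (2, 11, 150.0);
""")

# OLAP-style rollup: total sales by month and product category.
rollup = con.execute("""
    SELECT d.year, d.month, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d    ON d.date_id = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.year, d.month, p.category
    ORDER BY d.year, d.month, p.category
""").fetchall()

for year, month, category, total in rollup:
    print(year, month, category, total)
```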

Book
01 Jan 2002
TL;DR: Described as the most detailed business resource ever published, Business will be a one-stop reference covering virtually every aspect of the world of business and aimed at everyone who works; it will also be a source of inspiration and insight, with original essays from more than 150 world-renowned business thinkers, leaders, academics, and practitioners.
Abstract: "Could there be a business intelligence-a set of abilities that distinguish those truly outstanding in the world of commerce? Could business intelligence be the mark of outstanding individual performers, as well as the building block of the best-performing companies?" From the Introduction by Daniel Goleman. A landmark in reference publishing, Business will be to global commerce what Britannica is to general knowledge. The most detailed business resource ever published, Business will be a one-stop reference covering virtually every aspect of the world of business, and aimed at everyone who works. }The gold standard of business information for the twenty-first century, Business will also be a source of inspiration and insight, with original essays from more than 150 world-renowned business thinkers, leaders, academics, and practitioners such as Charles Handy, Warren Bennis, Jim Collins, Thomas Petzinger, Jr. , Peter L. Bernstein, and John Seely Brown. Unprecedented in scope (2.5 million words, 2,200 pages), Business covers all significant intellectual, practical, and factual areas in the field of management. A major feature of Business is an authoritative world almanac featuring 26 industry sector surveys and profiles of 150 countries and all U.S. states. Lively biographies of the management thinkers who have shaped the world as we know it-from Adam Smith to Peter Drucker, from Henry Ford to Este Lauder-will inform and entertain, while digests of the 70 most influential business books ever published are certain to spark debate. In addition, Business includes a comprehensive dictionary, an anthology of quotations, and an extensive source section covering nearly 200 topics from accounting to team building.Guided by an eminent team of editors and advisors, Business equips everyone who works with the tools necessary to learn from yesterday and prepare for tomorrow. Business is the first-ever comprehensive resource on business and management. A remarkable compendium of lively essays, inspiring biographies, and impressive source materials, Business features the following:Original Best-Practice Essays from 150 of Today's Thought LeadersFascinating Profiles of Top Management Thinkers and PioneersA Management Library-The 70 Most Important Business Books of All TimeIndispensable Management Checklists and Action ListsFirst-Class World Business AlmanacComprehensive Dictionary of 5,000 Business TermsBusiness Information Sources -Where to Go from HereVisit www ultimatebusinessresource.com }

Book
22 Jan 2002
TL;DR: Clickstream Data Warehousing is a great read for the serious data warehouse designer grappling with clickstream data, and combines engineering knowledge of the clickstream with state-of-the-art dimensional data warehouse design techniques to produce a very useful book.
Abstract: From the Publisher: "Clickstream Data Warehousing is a great read for the serious data warehouse designer grappling with clickstream data. With a clear style, the authors explain the intricacies of this important source of customer behavior data. They combine engineering knowledge of the clickstream with state-of-the-art dimensional data warehouse design techniques to produce a very useful book." –Ralph Kimball, author of The Data Warehouse Toolkit and The Data Warehouse Lifecycle Toolkit. The Web is an incredibly rich source of business intelligence, and many enterprises are scrambling to build data warehouses that capture the knowledge contained in the clickstream data from their Web sites. By analyzing the user behavior patterns contained in these clickstream data warehouses, savvy businesses can expand their markets, improve customer relationships, reduce costs, streamline operations, strengthen their Web sites, and hone their business strategies. Whether you come from an e-business, Web architecture, or data warehouse background, this book gives you the integrated perspective necessary to create a successful clickstream data warehouse. The first part of the book explains everything you need to know about the Web technology and IT infrastructure required to build a clickstream data warehouse. The second part of the book walks you through the process of designing and implementing a clickstream data warehouse, including: planning, staffing, and managing the project; designing your clickstream data warehouse schema using the innovative meta-schema design template; picking the appropriate data warehouse software and storage subsystems to support your clickstream data warehouse; building the extract, transformation, and load (ETL) mechanism; and delivering data to end users for analysis. The companion Web site features additional reference material, an interactive question and answer forum, additional articles and information on data warehousing topics, and links to related Web sites. Wiley Computer Publishing: Timely, Practical, Reliable. Author Biography: Mark Sweiger is President and Principal of Clickstream Consulting, a boutique consultancy specializing in clickstream data warehouses and data warehouse education. He is a noted author on data warehousing and is a speaker at The Data Warehousing Institute. Mark R. Madsen is Vice President and Principal of Clickstream Consulting. He has held various high-level positions within IT organizations and technology vendors and has extensive experience in data warehousing and supply chain management. Jimmy Langston is a Senior Consultant at Clickstream Consulting, where he advises clients on optimal data warehouse architectures, implementation, multidimensional database design, database configuration, and performance tuning. Howard Lombard is Chief Architect at Gazelle Consulting, Inc., a consultancy specializing in the design and implementation of large-scale data warehousing and business intelligence solutions.
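As a toy illustration of the extract/transform step for clickstream data, the sketch below parses Web server log lines in Common Log Format into flat records that could feed a clickstream fact table. The regular expression and field names are generic assumptions and have nothing to do with the book's meta-schema design template.

```python
# Toy extract/transform step for clickstream data: parse Web server log
# lines (Common Log Format) into flat records that could be loaded into a
# clickstream fact table.  The regex and field names are illustrative only.
import re
from datetime import datetime
from typing import Optional

LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse(line: str) -> Optional[dict]:
    m = LOG_LINE.match(line)
    if not m:
        return None                      # malformed lines would go to a reject file
    ts = datetime.strptime(m["ts"], "%d/%b/%Y:%H:%M:%S %z")
    return {"host": m["host"], "path": m["path"], "status": int(m["status"]),
            "bytes": 0 if m["bytes"] == "-" else int(m["bytes"]),
            "date_key": ts.strftime("%Y%m%d"), "hour": ts.hour}

sample = [
    '192.0.2.7 - - [14/Nov/2002:10:32:01 +0000] "GET /products/index.html HTTP/1.0" 200 5120',
    '192.0.2.9 - - [14/Nov/2002:10:32:04 +0000] "GET /checkout HTTP/1.0" 404 -',
]
facts = [r for r in map(parse, sample) if r is not None]
for fact in facts:
    print(fact)
```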

01 Jan 2002
TL;DR: McConnell and Becker present a review of the role of media in the democratic process and provide a theoretical perspective from which hypotheses can be derived and used to reexamine historical cases as well as to make predictions about the future.
Abstract: The Role of Media in Democratization by Patrick J. McConnell and Lee B. Becker A diverse and growing body of research and writing on the role of media in democratic development exists in the literatures of political science, mass communication, economics and sociology, as well as other fields. Unfortunately, little has been done to integrate this work. As a result, there is at present no consensus on the role the media play in the democratic process. This paper provides an integration of that literature. This review indicates that the process of democratization does not always move in a single direction. Countries move toward democracy in starts and stops, with regression at least somewhat common. The literature indicates that there are four distinct phases that a country or territory goes through on the path to becoming a stable democracy. These four stages of societal development can be labeled pretransition, transition, consolidation and stable (or mature). The pretransition stage focuses on societal conditions under the old regime, while the transition stage is that historical moment when the previous regime no longer holds political power. A state becomes consolidated when the ideals of democracy are accepted and adhered to, and then is considered stable when democracy functions over a period of time. This approach suggests that media tend to be most supportive of democracy in the early, often euphoric, period after the previous regime has fallen, when journalists as well as other citizens are enjoying newfound freedoms. As the transition process moves toward consolidation, the media as well as the public can become more cynical, particularly in the face of continued political wrangling and the financial pressures of a market economy. The media in a stable democracy are considered the principal institutions from which members of the public can better understand their society. Ideally, the media contribute to the public sphere by providing citizens with information about their world, by fostering debate about various issues and by encouraging informed decisions about available courses of action. The media are also a site of contestation in which diverse positions are advanced, significant opinions are heard, interests and inner-workings are exposed, and input is received. These all contribute to public debate. The media are also expected to act as “watchdogs” on government and industry. This paper summarizes and integrates the existing literature. It identifies consistencies and inconsistencies in the literatures and offers possible explanations of those inconsistencies. The outcome is a theoretical perspective from which hypotheses can be derived. These hypotheses can be used to reexamine historical cases as well as make predictions about the future.



Book
01 Apr 2002
TL;DR: This book describes business intelligence, demonstrates how it can improve profits by helping businesses outmaneuver their competitors, and discusses the Microsoft platform for business intelligence.
Abstract: From the Publisher: This title provides decision makers with the information they need as they adopt a comprehensive business intelligence solution that helps them react to change and anticipate customer needs more quickly. It describes business intelligence and demonstrates how it can improve profits by helping businesses outmaneuver their competitors. It also discusses the Microsoft® platform for business intelligence. The book includes case studies plus a glossary of business-intelligence-related terms. Provides business decision-makers with the information they need as they consider adopting a comprehensive business-intelligence solution. It's the only book that discusses not only business intelligence in general but also the comprehensive Microsoft business-intelligence solution. Key Book Benefits: * Create a comprehensive business-intelligence strategy for your organization. * Build competitive advantage using the Microsoft platform * Customize your business intelligence through advanced querying, reporting, and analysis capabilities.