
Showing papers in "Information Systems Research in 1996"


Journal ArticleDOI
TL;DR: A large-scale custom software effort, the Worm Community System (WCS), a collaborative system designed for a geographically dispersed community of geneticists, is analyzed, using Bateson's model of levels of learning to analyze the levels of infrastructural complexity involved in system access and designer-user communication.
Abstract: We analyze a large-scale custom software effort, the Worm Community System (WCS), a collaborative system designed for a geographically dispersed community of geneticists. There were complex challenges in creating this infrastructural tool, ranging from simple lack of resources to complex organizational and intellectual communication failures and tradeoffs. Despite high user satisfaction with the system and interface, and extensive user needs assessment, feedback, and analysis, many users experienced difficulties in signing on and use. The study was conducted during a time of unprecedented growth in the Internet and its utilities (1991–1994), and many respondents turned to the World Wide Web for their information exchange. Using Bateson's model of levels of learning, we analyze the levels of infrastructural complexity involved in system access and designer-user communication. We analyze the connection between systems development aimed at supporting specific forms of collaborative knowledge work, local orga...

2,297 citations


Journal ArticleDOI
TL;DR: A perspective on organizational transformation is outlined which proposes change as endemic to the practice of organizing and hence as enacted through the situated practices of organizational actors as they improvise, innovate, and adjust their work routines over time.
Abstract: In this paper, I outline a perspective on organizational transformation which proposes change as endemic to the practice of organizing and hence as enacted through the situated practices of organizational actors as they improvise, innovate, and adjust their work routines over time. I ground this perspective in an empirical study which examined the use of a new information technology within one organization over a two-year period. In this organization, a series of subtle but nonetheless significant changes were enacted over time as organizational actors appropriated the new technology into their work practices, and then experimented with local innovations, responded to unanticipated breakdowns and contingencies, initiated opportunistic shifts in structure and coordination mechanisms, and improvised various procedural, cognitive, and normative variations to accommodate their evolving use of the technology. These findings provide the empirical basis for a practice-based perspective on organizational transfor...

2,031 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe an empirical study of the relative importance of top management support and external IS expertise on IS effectiveness in 114 small businesses, and show that top management support is not as important as effective external IS expertise in small business IS implementation.
Abstract: Top management support is a key recurrent factor critical for effective information systems (IS) implementation. However, the role of top management support may not be as critical as external IS expertise, in the form of consultants and vendors, in small business IS implementation due to the unique characteristics of small businesses. This paper describes an empirical study of the relative importance of top management support and external IS expertise on IS effectiveness in 114 small businesses. Partial least squares (PLS) was used for statistical testing. The results show that top management support is not as important as effective external IS expertise in small business IS implementation. While top management support is essential for IS effectiveness, high quality external IS expertise is even more critical for small businesses operating in an environment of resource poverty. These findings call for more research efforts to be directed at selecting and engaging high quality external IS expertise for IS ...

554 citations


Journal ArticleDOI
TL;DR: The benefits and costs of allowing diversity to reign in the Information Systems discipline are considered and a structure is proposed that is hoped will facilitate discourse on the benefits and cost of diversity and on the role that diversity should now play in the IS discipline.
Abstract: Three types of diversity have been prominent in the Information Systems discipline for over a decade: (a) diversity in the problems addressed; (b) diversity in the theoretical foundations and reference disciplines used to account for IS phenomena; and (c) diversity in the methods used to collect, analyze, and interpret data. History has played a major part in encouraging IS researchers to use diversity as a means of countering criticisms of their discipline and increasing their research rigor and productivity. In particular, frequent recourse to reference disciplines has underpinned much of the research that has been undertaken since the early 1980s. There are now signs, however, that the level of diversity that currently exists in IS research may be problematic. In this paper, we consider some of the benefits and costs of allowing diversity to reign in the IS discipline. We also propose a structure that we hope will facilitate discourse on the benefits and costs of diversity and on the role that diversity should now play in the IS discipline.

439 citations


Journal ArticleDOI
TL;DR: The evidence of diversity in information systems (IS) research is confirmed and the ways in which diversity both threatens and advances the field of IS are identified.
Abstract: This paper confirms the evidence of diversity in information systems (IS) research and identifies the ways in which diversity both threatens and advances the field of IS. While advocating diversity within the field of IS, the paper also discusses the responsibilities that must be assumed by IS researchers. Responsibilities include a “disciplined methodological pluralism” Landry and Banville [Landry, M., C. Banville. 1992. A disciplined methodological pluralism for MIS research. Accounting, Management and Information Technologies 2(2), April–June, 77–97.] in which researchers clearly justify their research aims, theories, and methods. Responsibilities also include researchers' commitment to collaborative ideals.

412 citations


Journal ArticleDOI
TL;DR: A comparative case study was designed to assess the consequences of implementing a particular geographic information system (GIS) in two neighboring county government organizations and reported radically different experiences with, and consequences of, the GIS technology.
Abstract: A comparative case study was designed to assess the consequences of implementing a particular geographic information system (GIS) in two neighboring county government organizations. Respondents reported radically different experiences with, and consequences of, the GIS technology. In North County, participants considered GIS to be responsible for transforming the way that work was accomplished and for changing patterns of communication among departments. In South County, the same GIS technology was implemented with little social consequence. These divergent outcomes are associated with differences in four specific processes related to the implementation of the GIS in the two organizations: initiation, transition, deployment, and spread of knowledge. In North County, implementation was initiated by an influential group of users (geographers) who positioned the technology as a shared resource that built upon existing competencies. A distributed configuration was deployed in North County, and conceptual know...

296 citations


Journal ArticleDOI
TL;DR: Four measures of consumer welfare are estimated, including Marshallian surplus, exact surplus based on compensated (Hicksian) demand curves, a non-parametric estimate, and a value based on the theory of index numbers, implying that the value created for consumers from spending on IT is about three times as large as the amount paid to producers of IT equipment.
Abstract: Over the past two decades, American businesses have invested heavily in information technology (IT) hardware. Managers often buy IT to enhance customer value in ways that are poorly measured by conventional output statistics. Furthermore, because of competition, firms may be unable to capture the full benefits of the value they create. This undermines researchers' attempts to determine IT value by estimating its contribution to industry productivity or to company profits and revenues. An alternative approach estimates the consumers' surplus from IT investments by integrating the area under the demand curve for IT. This methodology does not directly address the question of whether managers and consumers are purchasing the optimal quantity of IT, but rather assumes their revealed willingness-to-pay for IT accurately reflects their valuations. Using data from the U.S. Bureau of Economic Analysis, we estimate four measures of consumers' surplus, including Marshallian surplus, exact surplus based on compensated (Hicksian) demand curves, a “nonparametric” estimate, and a value based on the theory of index numbers. Interestingly, all four estimates indicate that in our base year of 1987, IT spending generated approximately $50 billion to $70 billion in net value in the United States and increased economic growth by about 0.3% per year. According to our estimates, which are likely to be conservative, IT investments generate approximately three times their cost in value for consumers.

232 citations
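The surplus estimation above reduces, in the Marshallian case, to integrating the quantity demanded over all prices above the observed price. A minimal sketch of that calculation, using an invented constant-elasticity demand curve rather than the paper's BEA data:

```python
# Hedged sketch: Marshallian consumer surplus as the area under a demand
# curve above the observed price. The demand form and parameters are
# illustrative assumptions, not the paper's estimates.

def demand(price, a=100.0, elasticity=2.0):
    """Constant-elasticity demand: quantity falls as price^(-elasticity)."""
    return a * price ** (-elasticity)

def marshallian_surplus(p_obs, p_max=1000.0, steps=100_000):
    """Trapezoidal approximation of the integral of q(p) dp over [p_obs, p_max]."""
    h = (p_max - p_obs) / steps
    total = 0.5 * (demand(p_obs) + demand(p_max))
    for i in range(1, steps):
        total += demand(p_obs + i * h)
    return total * h

# Analytically, with elasticity 2 and a = 100, the surplus above a price of 1
# is 100 * (1 - 1/p_max), so the estimate should land just below 100.
print(marshallian_surplus(1.0))
```

The same integral taken over a compensated (Hicksian) demand curve would give the exact-surplus analogue; only the demand function changes.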


Journal ArticleDOI
TL;DR: It is argued that organizational payoff is maximized when several factors relating to IT, decision authority, business processes and incentives are changed in a coordinated manner in the right directions by the right magnitude to move toward an ideal design configuration.
Abstract: Advances in new Information Technologies (IT) and changes in the business environment such as globalization and competitive pressure have prompted organizations to embark on reengineering projects involving significant investments in IT and business process redesign. However, the evidence of payoff from such investments can be classified as mixed at best, a problem we partly attribute to the absence of a strong theoretical foundation to assess and analyze reengineering projects. We seek to apply complementarity theory and a business value modeling approach to address some questions involving what, when, and how much to reengineer. Complementarity theory is based on the notion that the value of having more of one factor increases by having more of another complementary factor. Further, related developments in the optimization of “supermodular” functions provide a useful way to maximize net benefits by exploiting complementary relationships between variables of interest. Combining this theory with a multi-level business value model showing relationships between key performance measures and their drivers, we argue that organizational payoff is maximized when several factors relating to IT, decision authority, business processes and incentives are changed in a coordinated manner in the right directions by the right magnitude to move toward an ideal design configuration. Our analysis further shows that when a complementary reengineering variable is left unchanged either due to myopic vision or self-interest, the organization will not be able to obtain the full benefits of reengineering due to smaller optimal changes in the other variables. We also show that by increasing the cost of changing the levels of design variables, unfavorable pre-existing conditions (e.g., too much heterogeneity in the computing environment) can lead to reengineering changes of smaller magnitude than in a setting with favorable conditions.

221 citations
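The complementarity notion invoked here is commonly formalized as supermodularity, i.e. "increasing differences": the payoff gain from raising one design variable is larger when a complementary variable is also set higher. A toy check of that property, with a payoff function invented purely for illustration (not the paper's model):

```python
# Hedged sketch of complementarity as "increasing differences".
# The payoff function below is an illustrative assumption: the positive
# cross term makes IT investment and process redesign complements.

def payoff(it_investment, process_redesign):
    """Toy supermodular payoff; the 4*x*y cross term creates complementarity."""
    return 3 * it_investment + 2 * process_redesign + 4 * it_investment * process_redesign

def has_increasing_differences(f, xs, ys):
    """Check f(x2, y2) - f(x, y2) >= f(x2, y) - f(x, y) for all x2 > x, y2 > y."""
    for x in xs:
        for x2 in xs:
            if x2 <= x:
                continue
            for y in ys:
                for y2 in ys:
                    if y2 <= y:
                        continue
                    if f(x2, y2) - f(x, y2) < f(x2, y) - f(x, y):
                        return False
    return True

levels = [0, 1, 2, 3]
print(has_increasing_differences(payoff, levels, levels))  # True for this payoff
```

Holding one variable fixed while raising the other forgoes the cross term, which is the formal sense in which partial reengineering yields less than the full benefit.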


Journal ArticleDOI
TL;DR: A cognitive learning perspective is used to develop and test a model of the relationship between information acquisition and learning in the executive support systems (ESS) context and proposes two types of learning: mental model maintenance in which new information fits into existing mental models and confirms them and mental model building in which mental models are changed to accommodate new information.
Abstract: A cognitive learning perspective is used to develop and test a model of the relationship between information acquisition and learning in the executive support systems (ESS) context. The model proposes two types of learning: mental model maintenance in which new information fits into existing mental models and confirms them; and mental model building in which mental models are changed to accommodate new information. It also proposes that information acquisition objectives determine the type of learning that is possible. When ESS are used to answer specific questions or solve well-defined problems, they help to fine-tune operations and verify assumptions—in other words, they help to maintain current mental models. However, ESS may be able to challenge fundamental assumptions and help to build new mental models if executives scan through them to help formulate problems and foster creativity. Thirty-six interviews with executive ESS users at seven organizations and a survey of 361 users at 18 additional organ...

206 citations


Journal ArticleDOI
TL;DR: The role of KBS explanations is discussed to provide an understanding of both the specific factors that influence explanation use and the consequences of such use.
Abstract: Ever since MYCIN introduced the idea of computer-based explanations to the artificial intelligence community, it has come to be taken for granted that all knowledge-based systems (KBS) need to provide explanations. While this widely-held belief has led to much research on the generation and implementation of various kinds of explanations, there has been no theoretical basis to justify the use of explanations by KBS users. This paper discusses the role of KBS explanations to provide an understanding of both the specific factors that influence explanation use and the consequences of such use. The first part of the paper proposes a model based on cognitive learning theories to identify the reasons for the provision of KBS explanations from the perspective of facilitating user learning. Using the feedforward and feedback operators of cognitive learning, the paper develops strategies for providing KBS explanations and classifies the various types of explanations found in current KBS applications. The second part of the paper presents a two-part framework to investigate empirically the use of KBS explanations. The first part of the framework focuses on the potential factors that influence the explanation seeking behavior of KBS users, including user expertise, the types of explanations provided, and the level of user agreement with the KBS. The second part of the framework explores the potential effects of the use of KBS explanations and specifically considers four distinct categories of potential effects: explanation use behavior, learning, perceptions, and judgmental decision making.

202 citations


Journal ArticleDOI
TL;DR: This research empirically tests propositions and hypotheses for a specific instantiation of Adaptive Structuration Theory and supported the proposition that appropriation mediators can increase the faithful use of structured decision techniques and that faithful use can improve decision quality.
Abstract: Structured decision techniques have been a mainstay of prescriptive decision theory for decades. Group Support Systems (GSSs) automate many of the features found in decision techniques, yet groups often choose to ignore both the technique and the technology in favor of more familiar decision processes. This research empirically tests propositions and hypotheses for a specific instantiation of Adaptive Structuration Theory. A controlled laboratory experiment tests the ability of three appropriation mediators (facilitation, GSS configuration, and training) to directively affect group decision making through guidance and restrictiveness. The experiment used a hidden-profile task and structured decision technique which directed group members to reach a decision by identifying the problem, choosing criteria, and selecting a solution. The results supported the proposition that appropriation mediators can increase the faithful use of structured decision techniques and that faithful use can improve decision quality.

Journal ArticleDOI
TL;DR: In both experiments, groups using the decomposed process generated 60% more ideas, and this paper attributes these differences to the ability of time constraints to increase the rate of idea generation, and the ability of problem decomposition to refocus members' attention more evenly across the entire problem.
Abstract: One aspect of brainstorming that has received little research attention is how the brainstorming problem should be presented to the group, whether as one all-encompassing question or as a series of separate questions each focusing on one aspect of the problem. This paper reports the results of two experiments in which subjects (MBAs in the first, senior executives in the second) electronically brainstormed on intact problems (where all parts of the problem were presented simultaneously) or on decomposed problems (where three subcategories of the problem were sequentially posed to the groups). In both experiments, groups using the decomposed process generated 60% more ideas. We attribute these differences to the ability of time constraints to increase the rate of idea generation, and the ability of problem decomposition to refocus members' attention more evenly across the entire problem.

Journal ArticleDOI
TL;DR: Biased discussion was found to occur to a greater degree when communication mode was computer-mediated, and the group members were not in conflict prior to the discussion.
Abstract: One advantage of groups is that they have access to a larger pool of expertise and knowledge than individual group members. However, groups are sometimes ineffective at exchanging information. This tendency has been called biased discussion. The present study examines the effects of communication mode (face-to-face vs. computer-mediated) and prediscussion information distribution characteristics on biased discussion. Biased discussion was found to occur to a greater degree when communication mode was computer-mediated and the group members were not in conflict prior to the discussion.

Journal ArticleDOI
TL;DR: In this article, the authors compared traditional and nontraditional training techniques with regard to computer related training and found that the use of hands-on training methods, especially behavior modeling, resulted in superior retention of knowledge, transfer of learning, and end-user satisfaction.
Abstract: This study compares traditional and nontraditional training techniques with regard to computer-related training. Its purpose was to determine which training methods could best be utilized in computer-related training to maximize a trainee's retention of material and transfer of learning. A field experiment was conducted using two hundred members of an active-duty U.S. Naval Construction Battalion as subjects. Evaluation of trainees included a pre-training screening, a post-training evaluation immediately after training, and a follow-up session four weeks after the post-training session, utilizing previously validated instruments. Training treatments included instruction (lecture), exploration (independent study), and a nontraditional technique---behavior modeling (an enhanced combination of the other two methods). Performance outcomes were operationalized using hands-on task performance and comprehension of the computer system as dependent variables. End-user satisfaction with the computer system was also measured. Two covariates, cognitive ability and system use, were also introduced into the study. The use of hands-on training methods, especially behavior modeling, resulted in superior retention of knowledge, transfer of learning, and end-user satisfaction. Cognitive ability failed to be a good predictor of trainee success, but a connection was established between training methodology, system use, and end-user satisfaction.

Journal ArticleDOI
TL;DR: Relevant calibration, decision making, and DSS literatures are synthesized and related behavioral theories are borrowed to identify the properties of expressiveness, visibility, and inquirability as requisite components of the DSS design theory for user calibration.
Abstract: A theory is proposed for designing decision support systems (DSS) so that the confidence a decision maker has in a decision made using the aid equals the quality of that decision. The DSS design theory for user calibration prescribes properties of a DSS needed for users to achieve perfect calibration. Relevant calibration, decision making, and DSS literatures are synthesized; and related behavioral theories are borrowed to identify the properties of expressiveness, visibility, and inquirability as requisite components of the DSS design theory for user calibration.

Journal ArticleDOI
TL;DR: This paper examines the use of the cellular telephone in police agencies as an example of ‘low tech’ innovation in information technology and draws on qualitative data, including interviews, focus group discussions, and first-hand observations in American police agencies to illustrate the impact of cellular phones on the social organization of police work in the early 1990s.
Abstract: This paper examines the use of the cellular telephone in police agencies as an example of ‘low tech’ innovation in information technology. It draws on qualitative data, including interviews, focus group discussions, and first-hand observations in American police agencies to illustrate the impact of cellular phones on the social organization of police work in the early 1990s. Dramaturgical analysis—the study of the selective use of messages to communicate to an audience—frames the study (Goffman [Goffman, E. 1959. The Presentation of Self in Everyday Life. Doubleday, New York.], Burke [Burke, K. 1962. A Grammar of Motives and a Rhetoric of Motives. Meridian, Cleveland, OH.]). Dramaturgy reveals how the emergent meanings of information technology arising from changes in communication and symbolization shape work processes and authority. Significant differences in response to and use of the technology are discovered, and are best understood as consistent with the impressions members of the organizati...

Journal ArticleDOI
TL;DR: A multi-trial free-recall experiment was conducted with database designers who had been trained primarily in a binary conceptual schema design methodology and who, throughout their training, had been admonished to eschew any distinction between entities and attributes.
Abstract: A longstanding debate in the data modeling literature pertains to whether the grammars used to generate conceptual schemas should sustain a distinction between entities and attributes. The grammars used to generate entity-relationship diagrams and object-oriented conceptual models, for example, provide separate constructs for representing entities and attributes. The grammars used to generate binary data models, however, provide only a single construct for representing both entities and attributes. To sharpen the focus of the debate, a multi-trial free-recall experiment was conducted with database designers who had been trained primarily in a binary conceptual schema design methodology. In the experiment, the designers were first shown conceptual schema diagrams based on a binary model. The designers were then asked to recall the diagrams. Throughout their training as designers, they had been admonished to eschew any distinction between entities and attributes. Moreover, the diagrams they were shown in th...

Journal ArticleDOI
TL;DR: It is suggested that the low trading volumes on many off-exchange systems do not result from traders' inability to break away from established trading floors, but that improved designs for IT-based trading mechanisms are needed; when these are available, they are likely to win significant trading volume from established exchanges.
Abstract: Reasons for the mixed reactions to today's electronic off-exchange trading systems are examined, and regulatory implications are explored. Information technology (IT) could provide more automated markets, which have lower costs. Yet for an electronic trading system to form a liquid and widely used market, a sufficient number of traders would need to make a transition away from established trading venues and to this alternative way of trading. This transition may not actually occur for a variety of reasons. Two tests are performed of the feasibility and the desirability of transitions to new markets. In the first test, traders in a series of economic experiments demonstrate an ability to make a transition and develop a critical mass of trading activity in a newly opened market. In the second test, simulation is used to compare the floor-based specialist auction in place in most U.S. stock exchanges today to a disintermediated alternative employing screen-based order matching. The results indicate that redu...

Journal ArticleDOI
TL;DR: It is shown that people often possess the appropriate decision rules but are unable to apply them correctly because they have an ineffective causal mental representation induced by the problem context.
Abstract: Many biases have been observed in probabilistic reasoning, hindering the ability to follow normative rules in decision-making contexts involving uncertainty. One systematic error people make is to neglect base rates in situations where prior beliefs in a hypothesis should be taken into account when new evidence is obtained. Incomplete explanations for the phenomenon have impeded the development of effective debiasing procedures or tools to support decision making in this area. In this research, we show that the main reason behind these judgment errors is the causal representation induced by the problem context. In two experiments we demonstrate that people often possess the appropriate decision rules but are unable to apply them correctly because they have an ineffective causal mental representation. We also show how this mental representation may be modified when a graph is used instead of a problem narrative. This new understanding should contribute to the design of better decision aids to overcome this...
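The normative rule respondents fail to apply here is Bayes' theorem, which weights new diagnostic evidence by the prior base rate. A small sketch using the classic "cab problem" numbers common in this literature (an illustrative assumption, not the paper's experimental materials):

```python
# Hedged sketch of base-rate incorporation via Bayes' rule.
# The scenario and numbers are the standard cab-problem illustration,
# not taken from the paper's experiments.

def posterior(base_rate, hit_rate, false_alarm_rate):
    """P(hypothesis | evidence) combining prior base rate with evidence."""
    p_evidence = hit_rate * base_rate + false_alarm_rate * (1 - base_rate)
    return hit_rate * base_rate / p_evidence

# 15% of cabs are blue; a witness identifies cab colors correctly 80% of
# the time (so misidentifies 20% of the time). Given "witness says blue":
p = posterior(base_rate=0.15, hit_rate=0.80, false_alarm_rate=0.20)
# Neglecting the base rate suggests ~0.80; the normative answer is ~0.41.
print(round(p, 3))
```

The paper's point is that people often know this rule yet fail to apply it when the problem narrative induces the wrong causal representation; a graphical presentation of the same quantities can repair the mental model.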

Journal ArticleDOI
TL;DR: The changes in information technology within the census over a period of more than a century are explored, and a contrast is drawn with the U.S. census---which mechanized in 1890---on the adoption of new technology.
Abstract: The first British census was taken in 1801 and was processed by a handful of clerks in a tiny office. By the mid-1800s, the census had evolved into an elaborate Victorian data-processing operation involving over a hundred clerks, each of whom had a specialized information-processing role. In 1911 the census was mechanized and the routine data processing was taken over by punched-card machines. This paper explores the changes in information technology within the census over a period of more than a century, and the resulting organizational changes. A contrast is drawn with the U.S. census—which mechanized in 1890—on the adoption of new technology.

Journal ArticleDOI
TL;DR: Strong similarities between post-industrialization, which has been attributed to the use of Information Technology (IT) and an information-based economy; proto-industrialization, which was a goods-based manufacturing economy with little IT; and flexible specialization, a form of workplace organization common to the early industrial period and surviving in some areas today, cast doubt on the argument that the causal link between technology and the organization of work is a simple or direct one.
Abstract: The mid-twentieth century was marked by the dominance of large, stable, centralized business; the late twentieth century is marked by a downsizing and disaggregation of the firm. This manuscript investigates this change and considers some possible causes for it using historical analysis. In order to explore the causes of the post-industrial organization of work, we compare it to the history of the industrial organization of work and to the early or proto-industrial system of artisanal and “putting out” manufacturing. We identify strong similarities between post-industrialization, which has been attributed to the use of Information Technology (IT) and an information-based economy; proto-industrialization, which was a goods-based manufacturing economy with little IT; and flexible specialization, a form of workplace organization common to the early industrial period and surviving in some areas today. These similarities cast doubt on the argument that the causal link between technology and the organization of...

Journal ArticleDOI
TL;DR: This paper proposes the integration of several technologies that might help the modeler gain insights from the analysis of multiple model instances, and reports on preliminary tests of a prototype built using the architecture proposed.
Abstract: After building and validating a decision support model, the decision maker frequently solves different instances of the model, often many times. That is, by changing various input parameters and rerunning different model instances, the decision maker develops insights into the workings and tradeoffs of the complex system represented by the model. The purpose of this paper is to explore inductive model analysis as a means of enhancing the decision maker's capabilities to develop insights into the business environment represented by the model. The justification and foundation for inductive model analysis is based on three distinct literatures: (1) the cognitive science theory of learning literature, (2) the decision support system literature, and (3) the model management system literature. We also propose the integration of several technologies that might help the modeler gain insights from the analysis of multiple model instances. Then we report on preliminary tests of a prototype built using the architecture proposed in this paper. The paper concludes with a discussion of several research questions. Much of the previous MIS/DSS and management science research has focused on model formulation and solution. This paper posits that it is time to give more attention to enhancing model analysis.

Journal ArticleDOI
TL;DR: The DLS as a rule-learning technique is described and the resulting computational performance is presented, with definitive computational benefits clearly demonstrated to show the efficacy of using the DLS.
Abstract: This report is concerned with a rule learning system called the Distributed Learning System (DLS). Its objective is two-fold: First, as the main contribution, the DLS as a rule-learning technique is described and the resulting computational performance is presented, with definitive computational benefits clearly demonstrated to show the efficacy of using the DLS. Second, the important parameters of the DLS are identified to show the characteristics of the Group Problem Solving (GPS) strategy as implemented in the DLS. On one hand this helps us pinpoint the critical designs of the DLS for effective rule learning; on the other hand this analysis can provide insight into the use of GPS as a more general rule-learning strategy.

Journal ArticleDOI
TL;DR: An integrated and comprehensive framework for decision support that shows how case snippets can be retrieved, adapted, and synthesized to generate multiple design solutions, whose consistency is enforced through a dynamic constraint management mechanism.
Abstract: This paper presents an integrated and comprehensive framework for decision support. A model integrating case-based reasoning with constraint posting and multicriteria decision making is proposed for providing effective and efficient assistance in solving routine design problems. The model is developed based on an analysis of the knowledge acquired from experts in engineering design, and is subsequently operationalized as a computer-based design assistant called IDEA. IDEA employs constraint posting to initially bound the design space and to maintain consistency of the design solutions. Case-based reasoning allows IDEA to generate new designs by retrieving, adapting, and composing from similar cases in memory. Finally, IDEA optimizes multiple objectives to identify a set of pareto-optimal designs. By organizing computer memory as a collection of cases and case snippets, and by adapting and synthesizing those cases and snippets---using techniques similar to those employed by design experts---IDEA provides valuable design assistance. In addition to providing a framework for decision support, the research makes specific contributions to case-based design. It shows how case snippets can be retrieved, adapted, and synthesized to generate multiple design solutions, whose consistency is enforced through a dynamic constraint management mechanism. The concepts and techniques developed for performing dynamic adaptation during composition from case snippets, and for maintaining an evolving solution space (one that shrinks and expands over time), contribute to the state of the art in case-based design.
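The retrieve-adapt-check loop the abstract describes can be reduced to a minimal sketch. Everything concrete here is invented for illustration (the case features, the distance measure, the rescaling adaptation, and the single budget constraint); IDEA's actual case representation and constraint mechanism are far richer.

```python
# Hypothetical sketch of a case-based-reasoning loop: retrieve the most
# similar stored case, adapt it to the target spec, enforce a constraint.
cases = [
    {"length": 10, "width": 4, "cost": 120},
    {"length": 12, "width": 5, "cost": 150},
    {"length": 8,  "width": 3, "cost": 90},
]

def retrieve(target, cases):
    """Return the stored case closest to the target specification."""
    return min(cases, key=lambda c: abs(c["length"] - target["length"])
                                  + abs(c["width"] - target["width"]))

def adapt(case, target):
    """Adapt the retrieved case: take target dimensions, rescale cost by area."""
    scale = (target["length"] * target["width"]) / (case["length"] * case["width"])
    return {**target, "cost": round(case["cost"] * scale)}

def satisfies_constraints(design, max_cost=200):
    """Constraint posting, reduced here to a single budget check."""
    return design["cost"] <= max_cost

target = {"length": 11, "width": 5}
design = adapt(retrieve(target, cases), target)
assert satisfies_constraints(design)
```

In IDEA the constraint set is dynamic (constraints are posted and retracted as the solution evolves), which is what the single fixed `max_cost` check here deliberately omits.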

Journal ArticleDOI
TL;DR: It is shown that simple marginal capacity cost pricing is often optimal in the absence of private user information, and it outperforms cost recovery and profit center pricing methods.
Abstract: This paper extends the analysis of the long-run pricing and capacity decision problem for shared computer services by Dewan and Mendelson (1990) and makes two further contributions. First, we show that simple marginal capacity cost pricing is often optimal in the absence of private user information, and it outperforms cost recovery and profit center pricing methods. Second, we provide insights into the implications of declining computing costs on the tradeoff between capacity costs and user time. In equilibrium, expected user delay costs are bounded by capacity costs due to the substitution of cheaper information processing capacity for valuable user time.

Journal ArticleDOI
TL;DR: This article analyzes the adoption and use of the fax machine to illustrate how information technology has altered the organizing process and conduct of electoral and legislative politics in Texas.
Abstract: This article analyzes the adoption and use of the fax machine to illustrate how information technology has altered the organizing process and conduct of electoral and legislative politics in Texas. The major groups studied are election campaigns, political parties, the legislature, and lobbyists. Faxing allows campaigns to do more, stay more informed, and disseminate information far more quickly and accurately than before. Consequently, the political process has accelerated significantly compared with five or ten years ago. Most obviously, the cycle for political news has shrunk from days to hours or minutes, forcing campaigns to be more organized and responsive. Faxing also enables organizations to generate a unified political theme statewide easily and quickly, creating the semblance, if not the reality, of a grassroots movement. Faxing gives centralized organizations the ability to appear decentralized and decentralized organizations the ability to act in a coordinated manner. Furthermore, faxing has i...

Journal ArticleDOI
TL;DR: Through a series of example queries and updates, this paper illustrates the differences between these two approaches and demonstrates that the temporally grouped approach more adequately captures the semantics of historical data.
Abstract: Numerous proposals for extending the relational data model to incorporate the temporal dimension of data have appeared over the past decade. It has long been known that these proposals have adopted one of two basic approaches to the incorporation of time into the extended relational model. Recent work formally contrasted the expressive power of these two approaches, termed temporally ungrouped and temporally grouped, and demonstrated that the temporally grouped models are more expressive. In the temporally ungrouped models, the temporal dimension is added through the addition of some number of distinguished attributes to the schema of each relation, and each tuple is “stamped” with temporal values for these attributes. By contrast, in temporally grouped models the temporal dimension is added to the types of values that serve as the domain of each ordinary attribute, and the application's schema is left intact. The recent appearance of TSQL2, a temporal extension to the SQL-92 standard based upon the temporally ungrouped paradigm, makes it likely that commercial DBMSs will be extended to support time in this weaker way. Thus the distinction between these two approaches---and its impact on the day-to-day user of a DBMS---is of increasing relevance to the database practitioner and the database user community. In this paper we address this issue from the practical perspective of such a user. Through a series of example queries and updates, we illustrate the differences between these two approaches and demonstrate that the temporally grouped approach more adequately captures the semantics of historical data.
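The contrast between the two paradigms can be made concrete with one small history. This is an illustrative sketch, not the paper's notation: the employee name and salary figures are invented, and Python dictionaries stand in for relational tuples and for time-indexed attribute values.

```python
# Illustrative sketch: the same salary history in the two styles contrasted
# in the abstract.

# Temporally ungrouped: distinguished timestamp attributes are added to the
# schema, so one employee's history is scattered across several stamped tuples.
ungrouped = [
    {"name": "Ava", "salary": 50_000, "start": 1990, "end": 1992},
    {"name": "Ava", "salary": 55_000, "start": 1992, "end": 1995},
]

# Temporally grouped: the schema keeps its ordinary attributes, but each
# attribute value is itself a function from time to a value, so the whole
# history of one entity stays together in a single grouped "tuple".
grouped = {
    "name":   {y: "Ava" for y in range(1990, 1995)},
    "salary": {1990: 50_000, 1991: 50_000,
               1992: 55_000, 1993: 55_000, 1994: 55_000},
}

def salary_at(year):
    """Query the grouped representation directly by year."""
    return grouped["salary"][year]
```

A query like `salary_at(1993)` reads one grouped value; against the ungrouped form, the same question must first locate the tuple whose `[start, end)` interval covers 1993, which is the extra burden on the user that the paper's example queries illustrate.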