
Showing papers in "Information Systems Research in 2009"


Journal ArticleDOI
TL;DR: An extended deterrence theory model is presented that combines work from criminology, social psychology, and information systems and suggests that user awareness of security countermeasures directly influences the perceived certainty and severity of organizational sanctions associated with IS misuse, which leads to reduced IS misuse intention.
Abstract: Intentional insider misuse of information systems resources (i.e., IS misuse) represents a significant threat to organizations. For example, industry statistics suggest that between 50% and 75% of security incidents originate from within an organization. Because of the large number of misuse incidents, it has become important to understand how to reduce such behavior. General deterrence theory suggests that certain controls can serve as deterrent mechanisms by increasing the perceived threat of punishment for IS misuse. This paper presents an extended deterrence theory model that combines work from criminology, social psychology, and information systems. The model posits that user awareness of security countermeasures directly influences the perceived certainty and severity of organizational sanctions associated with IS misuse, which leads to reduced IS misuse intention. The model is then tested on 269 computer users from eight different companies. The results suggest that three practices deter IS misuse: user awareness of security policies; security education, training, and awareness (SETA) programs; and computer monitoring. The results also suggest that perceived severity of sanctions is more effective in reducing IS misuse than certainty of sanctions. Further, there is evidence that the impact of sanction perceptions varies based on one's level of morality. Implications for the research and practice of IS security are discussed.

1,070 citations


Journal ArticleDOI
TL;DR: The results indicate that trust directly and indirectly affects a consumer's purchase decision in combination with perceived risk and perceived benefit, and also that trust has a longer term impact on consumer e-loyalty through satisfaction.
Abstract: Trust and satisfaction are essential ingredients for successful business relationships in business-to-consumer electronic commerce. Yet there is little research on trust and satisfaction in e-commerce that takes a longitudinal approach. Drawing on three primary bodies of literature, the theory of reasoned action, the extended valence framework, and expectation-confirmation theory, this study synthesizes a model of consumer trust and satisfaction in the context of e-commerce. The model considers not only how consumers formulate their prepurchase decisions, but also how they form their long-term relationships with the same website vendor by comparing their prepurchase expectations to their actual purchase outcome. The results indicate that trust directly and indirectly affects a consumer's purchase decision in combination with perceived risk and perceived benefit, and also that trust has a longer term impact on consumer e-loyalty through satisfaction. Thus, this study extends our understanding of consumer Internet transaction behavior as a three-fold (prepurchase, purchase, and postpurchase) process, and it recognizes the crucial, multiple roles that trust plays in this process. Implications for theory and practice as well as limitations and future directions are discussed.

809 citations


Journal ArticleDOI
TL;DR: This study develops a definition and formative taxonomy of agility in an ISD context, based on a structured literature review of agility across a number of disciplines, including manufacturing and management where the concept originated, matured, and has been applied and tested thoroughly over time.
Abstract: Awareness and use of agile methods have grown rapidly among the information systems development (ISD) community in recent years. Like most previous methods, the development and promotion of these methods have been almost entirely driven by practitioners and consultants, with little participation from the research community during the early stages of evolution. While these methods are now the focus of more and more research efforts, most studies are still based on XP, Scrum, and other industry-driven foundations, with little or no conceptual studies of ISD agility in existence. As a result, this study proposes that there are a number of significant conceptual shortcomings with agile methods and the associated literature in their current state, including a lack of clarity, theoretical glue, and parsimony; limited applicability; and naivety regarding the evolution of the concept of agility in fields outside systems development. Furthermore, this has significant implications for practitioners, rendering agile method comparison and many other activities very difficult, especially in instances such as distributed development and large teams that are not conducive to many of the commercial agile methods. This study develops a definition and formative taxonomy of agility in an ISD context, based on a structured literature review of agility across a number of disciplines, including manufacturing and management, where the concept originated, matured, and has been applied and tested thoroughly over time. The application of the taxonomy in practice is then demonstrated through a series of thought trials conducted in a large multinational organization. The intention is that the definition and taxonomy can then be used as a starting point to study ISD method agility regardless of whether the method is XP or Scrum, agile or traditional, complete or fragmented, out-of-the-box or in-house, used as is or tailored to suit the project context.

753 citations


Journal ArticleDOI
TL;DR: Although many participants had the urge to buy impulsively regardless of website quality, the likelihood and magnitude of this behavior were directly influenced by varying the quality of task-relevant and mood-relevant cues.
Abstract: With the proliferation of e-commerce, there is growing evidence that online impulse buying is occurring, yet relatively few researchers have studied this phenomenon. This paper reports on two studies that examine how variations in a website influence online impulse buying. The results reveal some relevant insights about this phenomenon. Specifically, although many participants had the urge to buy impulsively regardless of website quality, the likelihood and magnitude of this behavior were directly influenced by varying the quality of task-relevant and mood-relevant cues. Task-relevant cues include characteristics, such as navigability, that help in the attainment of the online consumer's shopping goal. Conversely, mood-relevant cues refer to characteristics, such as visual appeal, that affect the degree to which a user enjoys browsing a website but that do not directly support a particular shopping goal. The implications of the results for both future research and the design of human-computer interfaces are discussed.

611 citations


Journal ArticleDOI
TL;DR: Arguing that the most effective control modes are those that provide teams with autonomy in determining the methods for achieving project objectives, the authors propose a model of the interaction between control modes, agile methodology use, and requirements change.
Abstract: In this paper, we draw on control theory to understand the conditions under which the use of agile practices is most effective in improving software project quality. Although agile development methodologies offer the potential of improving software development outcomes, limited research has examined how project managers can structure the software development environment to maximize the benefits of agile methodology use during a project. As a result, project managers have little guidance on how to manage teams who are using agile methodologies. Arguing that the most effective control modes are those that provide teams with autonomy in determining the methods for achieving project objectives, we propose hypotheses related to the interaction between control modes, agile methodology use, and requirements change. We test the model in a field study of 862 software developers in 110 teams. The model explains substantial variance in four objective measures of project quality---bug severity, component complexity, coordinative complexity, and dynamic complexity. Results largely support our hypotheses, highlighting the interplay between project control, agile methodology use, and requirements change. The findings contribute to extant literature by integrating control theory into the growing literature on agile methodology use and by identifying specific contingencies affecting the efficacy of different control modes. We discuss the theoretical and practical implications of our results.

368 citations


Journal ArticleDOI
TL;DR: A nomological network is developed in which shared understanding between the CIO and TMT about the role of IS in the organization is posited to be a proximal antecedent of the intellectual dimension of IS strategic alignment.
Abstract: Alignment of information systems (IS) strategy with business strategy is a top concern of both the chief information officer (CIO) and the top management team (TMT) of organizations. Even though researchers and key decision makers in organizations recognize the importance of IS strategic alignment, they often struggle to understand how this alignment is created. In this paper, we develop a nomological network in which shared understanding between the CIO and TMT about the role of IS in the organization (which represents the social dimension of IS strategic alignment) is posited to be a proximal antecedent of the intellectual dimension of IS strategic alignment. We further posit that shared language, shared domain knowledge manifest in the CIO's business knowledge and the TMT's strategic IS knowledge, systems of knowing (structural and social), and CIO-TMT experiential similarity are important determinants of this shared understanding. Data were collected from 243 matched CIO-TMT pairs. Results largely support the proposed nomological network. Specifically, shared understanding between the CIO and TMT is a significant antecedent of IS strategic alignment. Furthermore, shared language, shared domain knowledge, and structural systems of knowing influence the development of shared understanding between the CIO and the TMT. Contrary to expectations and to findings of prior research, social systems of knowing, representing informal social interactions between the CIO and TMT, and experiential similarity did not have a significant effect on shared understanding.

360 citations


Journal ArticleDOI
TL;DR: A conceptual model that links three IT-related resources (backend integration, managerial skills, and partner support) to firm performance improvement is developed, proposing a moderating effect of competition on the resource-performance relationships.
Abstract: In this study, we seek to better understand the value of information technology (IT) in supply chain contexts. Grounded in the resource-based theory in conjunction with transaction cost economics, we develop a conceptual model that links three IT-related resources (backend integration, managerial skills, and partner support) to firm performance improvement. The model differs from previous studies by proposing a moderating effect of competition on the resource-performance relationships. Using data from 743 manufacturing firms, our analysis indicates a significant contribution of IT to supply chains, which is generated through development of the digitally enabled integration capability and manifested at the process level along the supply chain. The technological resource alone, however, does not hold the answer to IT value creation. In fact, managerial skills, which enable adaptations of supply chain processes and corporate strategy to accommodate the use of IT, are shown to play the strongest role in IT value creation. Furthermore, backend integration and managerial skills are found to be more valuable in more competitive environments. While commodity-like resources have diminishing value under competition, the effects of integrational and managerial resources become even stronger. Overall, our results shed light on the key drivers of IT-enabled supply chains and provide insights into how competition shapes IT value.

317 citations


Journal ArticleDOI
TL;DR: Time pacing, self-management with discipline, and routinization of exploration are among the agile enablers found in the case studies, while event pacing, centralized management, and lack of resources allocated to exploration are found to be inhibitors of agility.
Abstract: Despite the popularity of agile methods in software development and their increasing adoption by organizations, there is debate about what agility is and how it is achieved. The debate suffers from a lack of understanding of agile concepts and how agile software development is practiced. This paper develops a framework for the organization of agile software development that identifies enablers and inhibitors of agility and the emergent capabilities of agile teams. The work is grounded in complex adaptive systems (CAS) and draws on three principles of coevolving systems: match coevolutionary change rate, maximize self-organizing, and synchronize exploitation and exploration. These principles are used to study the processes of two software development teams, one a team using eXtreme Programming (XP) and the other a team using a more traditional, waterfall-based development cycle. From the cases, a framework for the organization of agile software development is developed. Time pacing, self-management with discipline, and routinization of exploration are among the agile enablers found in the case studies, while event pacing, centralized management, and lack of resources allocated to exploration are found to be inhibitors of agility. Emergent capabilities of agile teams that are identified from the research include coevolution of business value, sustainable working with rhythm, sharing and team learning, and collective mindfulness.

313 citations


Journal ArticleDOI
TL;DR: A deeper understanding of agility is provided through an intensive study of the distributed ISD experience in TECHCOM, revealing that agility should be viewed as a multifaceted concept having three dimensions: resource, process, and linkage.
Abstract: Agility is increasingly being seen as an essential element underlying the effectiveness of globally distributed information systems development (ISD) teams today. However, for a variety of reasons, such teams are often unable to develop and enact agility in dealing with changing situations. This paper seeks to provide a deeper understanding of agility through an intensive study of the distributed ISD experience in TECHCOM, an organization widely recognized for its excellence in IT development and use. The study reveals that agility should be viewed as a multifaceted concept having three dimensions: resource, process, and linkage. Resource agility is based on the distributed development team's access to necessary human and technological resources. Process agility pertains to the agility that originates in the team's systems development method guiding the project, its environmental scanning and sense-making routines to anticipate possible crises, and its work practices enabling collaboration across time zones. Linkage agility arises from the nature of interactional relationships within the distributed team and with relevant project stakeholders, and is composed of cultural and communicative elements. The paper highlights some of the difficulties in developing agility in distributed ISD settings, provides actionable tactics, and suggests contingencies wherein different facets of agility may become more (or less) critical.

275 citations


Journal ArticleDOI
TL;DR: Through a grounded approach using interviews, observations, and secondary data, a model of the information security compromise process is advanced from the perspective of the attacked organization, and the implications for the emerging research stream on information security in the information systems literature are discussed.
Abstract: No longer the exclusive domain of technology experts, information security is now a management issue. Through a grounded approach using interviews, observations, and secondary data, we advance a model of the information security compromise process from the perspective of the attacked organization. We distinguish between deliberate and opportunistic paths of compromise through the Internet, labeled choice and chance, and include the role of countermeasures, the Internet presence of the firm, and the attractiveness of the firm for information security compromise. Further, using one year of alert data from intrusion detection devices, we find empirical support for the key contributions of the model. We discuss the implications of the model for the emerging research stream on information security in the information systems literature.

247 citations


Journal ArticleDOI
TL;DR: It is found that widespread Internet use among people who live in proximity has a direct effect on an individual's propensity to go online, and strong evidence of peer effects is provided, suggesting that individual Internet use is influenced by local patterns of usage.
Abstract: Given the increasingly important role of the Internet in education, healthcare, and other essential services, it is important that we develop an understanding of the “digital divide.” Despite the widespread diffusion of the Web and related technologies, pockets remain where the Internet is used sparingly, if at all. There are large geographic variations, as well as variations across ethnic and racial lines. Prior research suggests that individual, household, and regional differences are responsible for this disparity. We argue for an alternative explanation: Individual choice is subject to social influence (“peer effects”) that emanates from geographic proximity; this influence is the cause of the excess variation. We test this assertion with empirical analysis of a data set compiled from a number of sources. We find, first, that widespread Internet use among people who live in proximity has a direct effect on an individual's propensity to go online. Using data on residential segregation, we test the proposition that the Internet usage patterns of people who live in more ethnically isolated regions will more closely resemble usage patterns of their ethnic group. Finally, we examine the moderating impact of housing density and directly measured social interactions on the relationship between Internet use and peer effects. Results are consistent across analyses and provide strong evidence of peer effects, suggesting that individual Internet use is influenced by local patterns of usage. Implications for public policy and the diffusion of the Internet are discussed.

Journal ArticleDOI
TL;DR: The authors develop and empirically test a relational model of coordination delay and find that temporal boundaries are more difficult to cross with communication technologies than spatial boundaries.
Abstract: In globally distributed projects, members have to deal with spatial boundaries (different cities) and temporal boundaries (different work hours) because other members are often in cities within and across time zones. For pairs of members with spatial boundaries and no temporal boundaries (those in different cities with overlapping work hours), synchronous communication technologies such as the telephone, instant messaging (IM), and Web conferencing provide a means for real-time interaction. However, for pairs of members with spatial and temporal boundaries (those in different cities with nonoverlapping work hours), asynchronous communication technologies, such as e-mail, provide a way to interact intermittently. Using survey data from 675 project members (representing 5,674 pairs of members) across 108 projects in a multinational semiconductor firm, we develop and empirically test a relational model of coordination delay. In our model, the likelihood of delay for pairs of members is a function of the spatial and temporal boundaries that separate them, as well as the communication technologies they use to coordinate their work. As expected, greater use of synchronous web conferencing reduces coordination delay for pairs of members in different cities with overlapping work hours relative to pairs of members with nonoverlapping work hours. Unexpectedly, greater use of asynchronous e-mail does not reduce coordination delay for pairs of members in different cities with nonoverlapping work hours, but rather reduces coordination delay for those with overlapping work hours. We discuss the implications of our findings that temporal boundaries are more difficult to cross with communication technologies than spatial boundaries.

Journal ArticleDOI
TL;DR: This research examines how fit and appropriation (from the Fit Appropriation Model) influence performance over time, and shows that fit can predict team performance soon after technology adoption, but initial assessments of fit are temporary as teams innovate and adapt.
Abstract: Prior research on technology and team performance concludes that the fit of the technology to tasks influences team performance. It also suggests that the way teams appropriate technology influences performance. This research examines how fit and appropriation (from the Fit Appropriation Model) influence performance over time. Initially, the results show that fit better predicted performance; teams using poor-fitting technology performed worse than teams with better fitting technology. However, over a short time period (two days in this study), this initial fit no longer predicted performance; performance of teams using better fitting technology remained constant while teams using poor-fitting technology innovated and adapted, improving performance. There are two key findings from this study. First, fit can predict team performance soon after technology adoption, but initial assessments of fit are temporary as teams innovate and adapt; thus, our current theoretical models of fitting technology to a task likely will not be useful beyond the first use. Second, teams should understand how to better adapt existing technology and work structures. Because our current theories of task-technology fit failed to predict performance beyond the first use of technology, we believe that this calls for a reconsideration of what fit means for teams using technology.

Journal ArticleDOI
TL;DR: A comprehensive coding scheme is developed to capture contract provisions across four major dimensions: monitoring, dispute resolution, property rights protection, and contingency provisions; the effects of transaction and relational characteristics on the specific contractual provisions, as well as on overall contract extensiveness, are then hypothesized.
Abstract: Outsourcing of information technology (IT) services has received much attention in the information systems (IS) literature. However, considerably less attention has been paid to actual contract structures used in IT outsourcing (ITO). Examining contract structures yields important insights into how the contracting parties structure the governance provisions and the factors or transaction risks that influence them. Based on insights from prior literature, from practicing legal experts, and through in-depth content analysis of actual contracts, we develop a comprehensive coding scheme to capture contract provisions across four major dimensions: monitoring, dispute resolution, property rights protection, and contingency provisions. We then develop an empirical data set describing the contract structures across these distinct dimensions, using a sample of 112 ITO contracts from the Securities and Exchange Commission (SEC) database from 1993 to 2003. Drawing on transaction cost, agency, and relational exchange theories, we hypothesize the effects of transaction and relational characteristics on the specific contractual provisions, as well as on overall contract extensiveness. Furthermore, we examine how these associations vary under conditions of fixed price and time and materials pricing structures. The results provide good support for the main hypotheses of the study and yield interesting insights about contractual governance of ITO arrangements.

Journal ArticleDOI
TL;DR: The results demonstrate that flexibility may be needed when the starting conditions are uncertain and that effective control in these situations requires use of traditional controls plus a new type of control the authors term emergent outcome control.
Abstract: When should software development teams have the flexibility to modify their directions and how do we balance that flexibility with controls essential to produce acceptable outcomes? We use dynamic capabilities theory and an extension of control theory to understand these questions. This work is examined in a case study. Our results demonstrate that flexibility may be needed when the starting conditions are uncertain and that effective control in these situations requires use of traditional controls plus a new type of control we term emergent outcome control.

Journal ArticleDOI
TL;DR: Flexibility is one of the twin primary points of focus for this special issue on flexible and globally distributed development; evolutionary theory suggests that success and survival are not the preserve of the strongest nor the most intelligent; rather, the ability to adapt to changing circumstances is the key trait.
Abstract: Process flexibility and globally distributed development are two major current trends in software and information systems development (ISD). The quest for flexibility is very much evident in the recent development and increasing acceptance of various agile methods, such as eXtreme Programming (Beck and Andres 2005) and Scrum (Schwaber and Beedle 2002). Agile development methods are examples of apparently major success stories that seem to have run counter to the prevailing wisdom in information systems (IS) and software engineering. However, rather than being antimethod, agile approaches operate on the principle of "just enough method." The quest for flexibility is also apparent in the currently increasing interest in striking a balance between the rigor of traditional approaches and the need for adaptation of those approaches to suit particular development situations. Although suitable methods may exist, developers struggle in practice when selecting methods and tailoring them to suit their needs. Certainly, agile methods are not exempt from this problem as they too need to be flexibly tailored to the development context at hand (Fitzgerald et al. 2006a). Distributed development recognizes that, more and more, ISD takes place in globally distributed settings. This is perhaps most evident in the many cases of offshoring and outsourcing of software development to low-cost countries (King and Torkzadeh 2008). Distributed development places new demands on the development process through the increased complexity related to communication, coordination, cooperation, control, and culture, as well as to technology and tools. Interestingly, many of the difficulties faced in globally distributed ISD are the same issues surfaced by agile methods and development flexibility in general. It is something of an irony that the special issue before us appears on the bicentenary of Darwin's birth. Evolutionary theory suggests that success and survival are not the preserve of the strongest nor the most intelligent. Rather, the ability to adapt to changing circumstances is the key trait. Flexibility, one of the twin primary points of focus for this special issue, addresses this trait directly. A further parallel is that Darwin's theory of evolution was best exemplified by differences across different spatial locations. This is also inherent in the second focal point of the special issue's dual focus: distributed development.

Journal ArticleDOI
TL;DR: A middle-range theory of how governance-knowledge fit shapes ISD performance by influencing the effective exercise of these decision rights during the development process is developed.
Abstract: This study addresses the theoretically underexplored question of how fit between project governance configurations and the knowledge of specialized information technology (IT) and client departments influences information systems development (ISD) performance. It conceptualizes project governance configurations using two classes of project decision rights---decision control rights and decision management rights. The paper then develops a middle-range theory of how governance-knowledge fit shapes ISD performance by influencing the effective exercise of these decision rights during the development process. Further, the two dimensions of ISD performance---efficiency and effectiveness---are shaped by different classes of project decision rights. Data from 89 projects in 89 firms strongly support the proposed ideas. Implications for theory and practice are also discussed.

Journal ArticleDOI
TL;DR: A potential outcomes framework for estimating causal effects is presented, its application is illustrated in the context of a phenomenon that is also of substantive interest to IS researchers, and directions are provided for moving from establishing association to assessing causation.
Abstract: Despite the importance of causal analysis in building a valid knowledge base and in answering managerial questions, the issue of causality rarely receives the attention it deserves in information systems (IS) and management research that uses observational data. In this paper, we discuss a potential outcomes framework for estimating causal effects and illustrate the application of the framework in the context of a phenomenon that is also of substantive interest to IS researchers. We use a matching technique based on propensity scores to estimate the causal effect of an MBA on information technology (IT) professionals' salary in the United States. We demonstrate the utility of this counterfactual or potential outcomes--based framework in providing an estimate of the sensitivity of the estimated causal effects to selection on unobservables. We also discuss issues related to the heterogeneity of treatment effects that typically do not receive as much attention in alternative methods of estimation, and show how the potential outcomes approach can provide several new insights into who benefits the most from the interventions and treatments that are likely to be of interest to IS researchers. We discuss the usefulness of the matching technique in IS and management research and provide directions to move from establishing association to assessing causation.
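
For readers unfamiliar with the technique, here is a minimal sketch of propensity-score matching as described in the abstract: model the probability of treatment from observed covariates, match each treated unit to its nearest control on that score, and average the outcome differences. The column names (mba, salary, and the covariates) are hypothetical illustrations, not the paper's data.

```python
# Illustrative propensity-score matching sketch; variable names are
# hypothetical, not taken from the paper's data set.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def att_via_psm(df: pd.DataFrame, treatment: str, outcome: str, covariates: list) -> float:
    """Estimate the average treatment effect on the treated (ATT)
    by 1-nearest-neighbor matching on the propensity score."""
    # 1. Model the probability of treatment given observed covariates.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    df = df.assign(pscore=ps_model.predict_proba(df[covariates])[:, 1])

    treated = df[df[treatment] == 1]
    control = df[df[treatment] == 0]

    # 2. For each treated unit, find the control unit with the closest score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched = control.iloc[idx.ravel()]

    # 3. ATT = mean outcome difference across matched pairs.
    return float(treated[outcome].mean() - matched[outcome].mean())

# Hypothetical usage: effect of an MBA on IT professionals' salary.
# df = pd.read_csv("it_salary_survey.csv")
# print(att_via_psm(df, "mba", "salary", ["experience", "age", "education"]))
```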

Journal ArticleDOI
TL;DR: It is argued that a DSS must be designed to induce an alignment of a decision maker's mental model with the decision model embedded in the DSS, and it is shown that two types of DSS feedback, in tandem, induce decision makers to align their mental models with the decision model, a process called deep learning.
Abstract: Model-based decision support systems (DSS) improve performance in many contexts that are data-rich, uncertain, and require repetitive decisions. But such DSS are often not designed to help users understand and internalize the underlying factors driving DSS recommendations. Users then feel uncertain about DSS recommendations, leading them to possibly avoid using the system. We argue that a DSS must be designed to induce an alignment of a decision maker's mental model with the decision model embedded in the DSS. Such an alignment requires effort from the decision maker and guidance from the DSS. We experimentally evaluate two DSS design characteristics that facilitate such alignment: (i) feedback on the upside potential for performance improvement and (ii) feedback on corrective actions to improve decisions. We show that, in tandem, these two types of DSS feedback induce decision makers to align their mental models with the decision model, a process we call deep learning, whereas individually these two types of feedback have little effect on deep learning. We also show that deep learning, in turn, improves user evaluations of the DSS. We discuss how our findings could lead to DSS design improvements and better returns on DSS investments.

Journal ArticleDOI
TL;DR: A model of what the authors call trans-situated learning that is supported by the local universality of an information infrastructure whose use becomes embedded with other infrastructures is proposed.
Abstract: This paper investigates the practice-based learning dynamics that emerge among peers who share occupational practices but do not necessarily work with each other or even know each other because of geographical or organizational distance. To do so, it draws on the literatures on situated learning, networks of practice, and information infrastructures, and on insights from a longitudinal case study of the implementation of a Web-based information system used by people working in the field of environmental health. The system was deeply involved in the transformations of local practices as well as relationships between peers. Based on a dialogue between existing literatures and observations from the case study, this research extends the practice-based perspective on learning to the computer-mediated context of a network of practice. To that effect, it proposes a model of what we call trans-situated learning that is supported by the local universality of an information infrastructure whose use becomes embedded with other infrastructures.

Journal ArticleDOI
TL;DR: Examining the moderating impact of two measures of competitive environment, demand uncertainty and industry concentration, on the relationship between IT and vertical integration suggests that firms' choices of the level of VI and IT investment, under different levels of demand uncertainty and industry concentration, are rational.
Abstract: The information systems (IS) literature suggests that by lowering coordination costs, information technology (IT) will lead to an overall shift towards more use of markets. Empirical work in this area provides evidence that IT is associated with a decrease in vertical integration (VI). Economy-wide data, however, suggests that over the last 25 years the average level of VI has, in fact, increased. This paper studies this empirical anomaly by explicating the moderating impact of two measures of competitive environment, demand uncertainty and industry concentration, on the relationship between IT and VI. We examine firms included in the 1995 to 1997 InformationWeek 500 and the COMPUSTAT database. Consistent with the IS literature, the analysis suggests that IT is associated with a decrease in VI when demand uncertainty is high or industry concentration is low. However, contrary to the IS literature, IT is found to be associated with an increase in VI when industry concentration is high or demand uncertainty is low. Furthermore, as demand uncertainty increases, less vertically integrated firms invest more in IT, while as industry concentration increases, more vertically integrated firms invest more in IT. The analysis also suggests that firms' choices of the level of VI and IT investment, under different levels of demand uncertainty and industry concentration, are rational. When demand uncertainty is high or industry concentration is low, an increase in VI may increase coordination and production costs; thus, less VI is rational. However, when industry concentration is high or demand uncertainty is low, an increase in VI may decrease coordination and production costs; thus, firms choose more VI in such industries. The implications for research and practice are discussed.
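
The moderation logic described above is typically tested with interaction terms: regress vertical integration on IT investment, the environment measures, and their products. The sketch below shows that setup; the column names are hypothetical, and the paper's actual econometric specification may differ.

```python
# Minimal moderation-regression sketch, assuming hypothetical columns
# vi, it_invest, demand_unc, concentration in a DataFrame.
import pandas as pd
import statsmodels.formula.api as smf

def fit_moderation_model(df: pd.DataFrame):
    # A negative it_invest:demand_unc coefficient would mean IT is
    # associated with *less* vertical integration as uncertainty rises;
    # a positive it_invest:concentration coefficient, with *more* VI
    # in concentrated industries.
    model = smf.ols(
        "vi ~ it_invest + demand_unc + concentration"
        " + it_invest:demand_unc + it_invest:concentration",
        data=df,
    )
    return model.fit()

# results = fit_moderation_model(df)
# print(results.summary())
```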

Journal ArticleDOI
TL;DR: In this article, the authors study configuration of and interaction between a firewall and intrusion detection systems (IDS) and find that the optimal configuration of an IDS does not change whether it is deployed alone or together with a firewall.
Abstract: Proper configuration of security technologies is critical to balance the needs for access and protection of information. The common practice of using a layered security architecture that has multiple technologies amplifies the need for proper configuration because the configuration decision about one security technology has ramifications for the configuration decisions about others. Furthermore, security technologies rely on each other for their operations, thereby affecting each other's contribution. In this paper we study configuration of and interaction between a firewall and intrusion detection systems (IDS). We show that deploying a technology, whether it is the firewall or the IDS, could hurt the firm if the configuration is not optimized for the firm's environment. A more serious consequence of deploying the two technologies with suboptimal configurations is that even if the firm could benefit when each is deployed alone, the firm could be hurt by deploying both. Configuring the IDS and the firewall optimally eliminates the conflict between them, ensuring that if the firm benefits from deploying each of these technologies when deployed alone, it will always benefit from deploying both. When optimally configured, we find that these technologies complement or substitute each other. Furthermore, we find that while the optimal configuration of an IDS does not change whether it is deployed alone or together with a firewall, the optimal configuration of a firewall has a lower detection rate (i.e., allowing more access) when it is deployed with an IDS than when deployed alone. Our results highlight the complex interactions between firewall and IDS technologies when they are used together in a security architecture, and, hence, the need for proper configuration to benefit from these technologies.
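
To make the configuration tradeoff concrete, here is a deliberately simplified cost model, not the paper's analytical model: raising the IDS detection rate catches more attacks but triggers more false alarms, so an interior detection rate can minimize expected cost. The parameter values and the stylized ROC curve are assumptions made up for this sketch.

```python
# Toy expected-cost model for choosing an IDS detection rate.
# All parameters are illustrative assumptions, not from the paper.
import numpy as np

MALICIOUS_RATE = 0.05   # fraction of access attempts that are attacks
COST_INTRUSION = 100.0  # damage from an undetected attack
COST_FALSE_ALARM = 2.0  # cost of investigating a false positive

def false_positive_rate(detection_rate: float) -> float:
    # Stylized concave ROC curve: FPR = TPR**3, so FPR <= TPR.
    return detection_rate ** 3

def expected_cost(detection_rate: float) -> float:
    missed = MALICIOUS_RATE * (1 - detection_rate) * COST_INTRUSION
    alarms = (1 - MALICIOUS_RATE) * false_positive_rate(detection_rate) * COST_FALSE_ALARM
    return missed + alarms

grid = np.linspace(0, 1, 1001)
best = grid[np.argmin([expected_cost(d) for d in grid])]
print(f"cost-minimizing detection rate ~ {best:.2f}")
```

The paper's point about interaction effects can be read against this picture: adding a firewall changes the traffic mix the IDS sees, which shifts the cost-minimizing configuration of each device.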

Journal ArticleDOI
TL;DR: An augmented form of the Cobb-Douglas production function is developed to separate and measure different productivity-enhancing effects of IT, and indicates structural differences in the role of IT in production between industries that are IT-intensive and those that are not.
Abstract: Many studies measure the value of information technology (IT) by focusing on how much value is added rather than on the mechanisms that drive value addition. We argue that value from IT arises not only directly through changes in the factor input mix but also indirectly through IT-enabled augmentation of non-IT inputs and changes in the underlying production technology. We develop an augmented form of the Cobb-Douglas production function to separate and measure different productivity-enhancing effects of IT. Using industry-level data from the manufacturing sector, we find evidence that both direct and indirect effects of IT are significant. Partitioning industries into IT-intensive and non-IT-intensive, we find that the indirect effects of IT predominate in the IT-intensive sector. In contrast, the direct effects of IT predominate in the non-IT-intensive sector. These results indicate structural differences in the role of IT in production between industries that are IT-intensive and those that are not. The implication for decision-makers is that for IT-intensive industries the gains from IT come primarily through indirect effects such as the augmentation of non-IT capital and labor.
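
As an illustration of what such an augmentation can look like (the paper's exact specification may differ), one plausible form lets IT capital $K_{IT}$ enter directly as a factor input and indirectly by augmenting non-IT capital $K$ and labor $L$ and by shifting total factor productivity:

```latex
% Illustrative augmented Cobb-Douglas form; an assumption for
% exposition, not necessarily the paper's exact specification.
\[
  Q \;=\; \underbrace{A\,e^{\theta K_{IT}}}_{\text{TFP shift}}
      \bigl(\underbrace{e^{\alpha_K K_{IT}}}_{\text{capital aug.}} K\bigr)^{\beta_K}
      \bigl(\underbrace{e^{\alpha_L K_{IT}}}_{\text{labor aug.}} L\bigr)^{\beta_L}
      \underbrace{K_{IT}^{\beta_T}}_{\text{direct effect}}
\]
```

Taking logs yields a linear estimating equation in which $\beta_T$ captures the direct effect of IT while $\theta$, $\alpha_K$, and $\alpha_L$ capture the indirect, augmentation-based effects.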

Journal ArticleDOI
TL;DR: A basic contingency framework is developed, one that models the benefit/cost economics described in narratives about the transition from craft to industrial production of physical products and shows that there remain many opportunities for information systems research to have a major impact on practice in this area.
Abstract: In recent years, flexibility has emerged as a divisive issue in discussions about the appropriate design of processes for making software. Partisans in both research and practice argue for and against plan-based (allegedly inflexible) and agile (allegedly too flexible) approaches. The stakes in this debate are high; questions raised about plan-based approaches undermine longstanding claims that those approaches, when realized, represent maturity of practice. In this commentary, we call for research programs that will move beyond partisan disagreement to a more nuanced discussion, one that takes into account both benefits and costs of flexibility. Key to such programs will be the development of a robust contingency framework for deciding when (in what conditions) plan-based and agile methods should be used. We develop a basic contingency framework in this paper, one that models the benefit/cost economics described in narratives about the transition from craft to industrial production of physical products. We use this framework to demonstrate the power of even a simple model to help us accomplish three objectives: (1) to refocus discussions about the appropriate design of software development processes, concentrating on when to use particular approaches and how they might be usefully combined; (2) to suggest and guide a trajectory of research that can support and enrich this discussion; and (3) to suggest a technology-based explanation for the emergence of agile development at this point in history. Although we are not the first to argue in favor of a contingency perspective, we show that there remain many opportunities for information systems (IS) research to have a major impact on practice in this area.

Journal ArticleDOI
TL;DR: Three selected linear price ICA formats are compared based on allocative efficiency and revenue distribution using different bidding strategies and bidder valuations and it is found that ICA designs with linear prices performed very well for different valuation models even in cases of high synergies among the valuations.
Abstract: Iterative combinatorial auctions (ICAs) are IT-based economic mechanisms where bidders submit bundle bids in a sequence and an auctioneer computes allocations and ask prices in each auction round. The literature in this field provides equilibrium analysis for ICAs with nonlinear personalized prices under strong assumptions on bidders' strategies. Linear pricing has performed very well in the lab and in the field. In this paper, we compare three selected linear price ICA formats based on allocative efficiency and revenue distribution using different bidding strategies and bidder valuations. The goal of this research is to benchmark different ICA formats and design and analyze new auction rules for auctions with pseudodual linear prices. The multi-item and discrete nature of linear price iterative combinatorial auctions and the complex price calculation schemes defy much of the traditional game theoretical analysis in this field. Computational methods can be of great help in exploring potential auction designs and analyzing the virtues of various design options. In our simulations, we found that ICA designs with linear prices performed very well for different valuation models even in cases of high synergies among the valuations. There were, however, significant differences in efficiency and in the revenue distributions of the three ICA formats. Heuristic bidding strategies using only a few of the best bundles also led to high levels of efficiency. We have also identified a number of auction rules for ask price calculation and auction termination that have been shown to perform very well in the simulations.
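
As a toy illustration of the round structure these formats share (bundle bids, winner determination, then linear ask-price updates), consider the sketch below. The pseudodual price calculations studied in the paper are considerably more sophisticated; this naive increment rule and exhaustive winner search are assumptions viable only for tiny instances.

```python
# Toy ICA round: bundle bids, exhaustive winner determination, and a
# naive linear ask-price update. Illustrative only; not the paper's
# pseudodual pricing scheme.
from itertools import combinations

def winner_determination(bids):
    """Find the revenue-maximizing set of non-overlapping bundle bids
    by exhaustive search (fine for toy instances only)."""
    best, best_rev = [], 0.0
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            items = [i for _, bundle, _ in combo for i in bundle]
            if len(items) == len(set(items)):          # no item sold twice
                rev = sum(amount for _, _, amount in combo)
                if rev > best_rev:
                    best, best_rev = list(combo), rev
    return best

def update_linear_prices(prices, bids, winners, increment=1.0):
    """Raise the ask price of every item in a losing bundle bid so that
    losing bidders must bid higher in the next round."""
    for bid in bids:
        if bid not in winners:
            _, bundle, _ = bid
            for item in bundle:
                prices[item] += increment
    return prices

bids = [("b1", ("A", "B"), 10.0), ("b2", ("B",), 6.0), ("b3", ("A",), 5.0)]
winners = winner_determination(bids)           # b2 + b3: revenue 11 > 10
prices = update_linear_prices({"A": 0.0, "B": 0.0}, bids, winners)
print(winners, prices)
```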

Journal ArticleDOI
TL;DR: This paper presents analytical, computational, and empirical analyses of strategies for intelligent bid formulation in online auctions, focusing on a weighted-average ascending price auction mechanism that is designed to provide opaque feedback information to bidders and presents a challenge in formulating appropriate bids.
Abstract: This paper presents analytical, computational, and empirical analyses of strategies for intelligent bid formulation in online auctions. We present results related to a weighted-average ascending price auction mechanism that is designed to provide opaque feedback information to bidders and presents a challenge in formulating appropriate bids. Using limited information provided by the mechanism, we design strategies for software agents to make bids intelligently. In particular, we derive analytical results for the important characteristics of the auction, which allow estimation of the key parameters; we then use these theoretical results to design several bidding strategies. We demonstrate the validity of the designed strategies using a discrete event simulation model that resembles the mechanisms used in treasury bill auctions, business-to-consumer (B2C) auctions, and auctions for environmental emission allowances. In addition, using the data generated by the simulation model, we show that intelligent strategies can provide a high probability of winning an auction without significant loss in surplus.
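
To give a sense of the opaque-feedback setting, the sketch below posts only the quantity-weighted average of the provisionally winning bids for k identical units, and a naive agent bids just above that signal, capped at its valuation. This is a stylized illustration of the setting under assumed parameters, not the paper's mechanism or its derived strategies.

```python
# Stylized weighted-average feedback for a multi-unit ascending auction.
# Illustrative assumptions throughout; not the paper's exact mechanism.

def weighted_average_feedback(bids, k):
    """bids: list of (price, quantity). Returns the quantity-weighted
    average price of the k highest-priced provisionally winning units."""
    units = sorted(
        (price for price, qty in bids for _ in range(qty)), reverse=True
    )[:k]
    return sum(units) / len(units)

def naive_bid(feedback, valuation, step=0.5):
    """Bid just above the opaque feedback signal, capped at valuation."""
    return min(valuation, feedback + step)

bids = [(10.0, 2), (8.0, 3), (6.0, 5)]
signal = weighted_average_feedback(bids, k=4)    # (10+10+8+8)/4 = 9.0
print(signal, naive_bid(signal, valuation=9.2))  # 9.0 9.2
```

The difficulty the paper addresses is visible even here: the signal reveals only an average, so an agent must estimate the underlying bid distribution before it can bid intelligently.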

Journal ArticleDOI
TL;DR: The results indicate that the availability of an OSS/FS alternative has little impact on willingness to pay for Microsoft Office; however, piracy controls significantly increase willingness to pay for Microsoft Office, even in the presence of OSS/FS alternatives.
Abstract: Competition from open source software and free software (OSS/FS) alternatives is causing proprietary software producers to reevaluate product strategies. OSS/FS alternatives complicate an already complex information goods market plagued by piracy concerns. Although producer perspectives on software pricing and piracy controls have been addressed extensively, consumers' perspective and willingness to pay for commercial software is not very well understood. This paper empirically determines willingness to pay for a leading commercial software application (Microsoft Office) in the presence of an OSS/FS alternative. A contingent valuation approach is used to elicit willingness to pay for the application. The research design employs a 2 × 2 × 2 experiment to investigate the impact of preventive control, deterrence control, and OSS/FS alternative. The results indicate that the availability of an OSS/FS alternative has little impact on willingness to pay for Microsoft Office. However, piracy controls significantly increase willingness to pay for Microsoft Office, even in the presence of OSS/FS alternatives.

Journal ArticleDOI
TL;DR: The results of this study indicate that, when constructing logical data models, data modelers should consider tradeoffs between parsimony and expressiveness.
Abstract: Data models provide a map of the components of an information system. Prior research has indicated that more expressive conceptual data models (despite their increased size) result in better performance for problem solving tasks. An initial experiment using logical data models indicated that more expressive logical data models also enhanced end-user performance for information retrieval tasks. However, the principles of parsimony and bounded rationality imply that, past some point, increases in size lead to a level of complexity that results in impaired performance. The results of this study support these principles. For a logical data model of increased but still modest size, users composing queries for the more expressive logical data model did not perform as well as users composing queries for the corresponding less expressive but more parsimonious logical data model. These results indicate that, when constructing logical data models, data modelers should consider tradeoffs between parsimony and expressiveness.

Journal ArticleDOI
TL;DR: An analytical model is presented and evaluated for the effectiveness of a proposed dynamic priority-based pricing scheme vis-a-vis a baseline fixed-price, single-quality-level SLA, with the objective of maximizing the organizational welfare of the participants.
Abstract: A key feature of service-oriented models of information technology is the promise of prespecified quality levels enforceable via service level agreements (SLAs). This poses difficult management problems when considerable variability exists in user preferences and service demand within any organization. Because variance in expectations impacts service levels, effective pricing and resource allocation mechanisms are needed to deliver services at the promised quality level. In this paper, we propose a mechanism for SLA formulation that is responsive to demand fluctuations and user preference variance, with the objective of maximizing the organizational welfare of the participants. This formulation features a dynamic priority-based price-penalty scheme targeted to individual users. An analytical model is presented and evaluated for the effectiveness of the proposed dynamic priority-based pricing scheme vis-a-vis a baseline fixed-price, single-quality-level SLA. Simulations using data from an existing SLA are used to provide evidence that the proposed dynamic pricing scheme is likely to be more effective than a fixed-price approach from a system welfare perspective.
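
A minimal sketch of a price-penalty scheme of the kind described above: users who declare a higher priority pay more per request but receive a larger penalty credit when the promised service level is missed. The parameter values and the linear form are assumptions for illustration, not the paper's analytical model.

```python
# Toy priority-based price-penalty scheme; parameters are hypothetical.
BASE_PRICE = 1.00      # price per request at priority 1
PRICE_SLOPE = 0.50     # extra charge per priority level
PENALTY_SLOPE = 0.75   # penalty credit per priority level on an SLA miss

def request_charge(priority: int, sla_met: bool) -> float:
    """Net charge to the user for one request: priority-scaled price
    minus a priority-scaled penalty credit when the SLA is missed."""
    price = BASE_PRICE + PRICE_SLOPE * (priority - 1)
    penalty = 0.0 if sla_met else PENALTY_SLOPE * priority
    return price - penalty

# A high-priority user pays more when served on time, but a miss is
# costly to the provider, which pushes the scheduler to honor priorities.
print(request_charge(priority=3, sla_met=True))   # 2.0
print(request_charge(priority=3, sla_met=False))  # -0.25
```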

Journal ArticleDOI
TL;DR: This work builds on a quality framework proposed in prior work and develops quality profiles for the results of the primitive relational operations Difference and Union, which have nuances that make both the classification of the resulting records and the estimation of the different classes quite difficult to address.
Abstract: Information derived from relational databases is routinely used for decision making. However, little thought is usually given to the quality of the source data, its impact on the quality of the derived information, and how this in turn affects decisions. To assess quality, one needs a framework that defines relevant metrics that constitute the quality profile of a relation and provides mechanisms for their evaluation. We build on a quality framework proposed in prior work and develop quality profiles for the results of the primitive relational operations Difference and Union. These operations have nuances that make both the classification of the resulting records and the estimation of the different classes quite difficult to address, and very different from those for other operations. We first determine how tuples appearing in the results of these operations should be classified as accurate, inaccurate, or mismember, and when tuples that should appear in the result do not (called incomplete). Although estimating the cardinalities of these subsets directly is difficult, we resolve this by decomposing the problem into a sequence of drawing processes, each of which follows a hypergeometric distribution. Finally, we discuss how decisions would be influenced based on the resulting quality profiles.
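
The decomposition idea lends itself to a small numeric illustration: treat each stage of the evaluation as a draw without replacement, which follows a hypergeometric distribution, and chain the expected counts. The numbers below are made up for illustration; the paper's actual decomposition for Difference and Union involves several such chained stages.

```python
# Expected class cardinality from one hypergeometric drawing stage;
# the counts here are illustrative assumptions.
from scipy.stats import hypergeom

N = 1000   # records in the source relation
K = 120    # of which are, say, inaccurate (the "marked" records)
n = 200    # records drawn into the operation's intermediate result

# Expected number of inaccurate records among the n drawn:
rv = hypergeom(M=N, n=K, N=n)
print(rv.mean())   # = n * K / N = 24.0

# Chaining one such draw per stage of the Difference/Union evaluation
# yields expected class cardinalities for the final quality profile.
```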