
Showing papers in "Management Information Systems Quarterly in 2011"


Journal ArticleDOI
TL;DR: This article integrates new and existing techniques into a comprehensive set of recommendations that give researchers in MIS and the behavioral sciences a framework for developing valid measures, noting that scale development and construct validation remain challenging activities.
Abstract: Despite the fact that validating the measures of constructs is critical to building cumulative knowledge in MIS and the behavioral sciences, the process of scale development and validation continues to be a challenging activity. Undoubtedly, part of the problem is that many of the scale development procedures advocated in the literature are limited by the fact that they (1) fail to adequately discuss how to develop appropriate conceptual definitions of the focal construct, (2) often fail to properly specify the measurement model that relates the latent construct to its indicators, and (3) underutilize techniques that provide evidence that the set of items used to represent the focal construct actually measures what it purports to measure. Therefore, the purpose of the present paper is to integrate new and existing techniques into a comprehensive set of recommendations that can be used to give researchers in MIS and the behavioral sciences a framework for developing valid measures. First, we briefly elaborate upon some of the limitations of current scale development practices. Following this, we discuss each of the steps in the scale development process while paying particular attention to the differences that are required when one is attempting to develop scales for constructs with formative indicators as opposed to constructs with reflective indicators. Finally, we discuss several things that should be done after the initial development of a scale to examine its generalizability and to enhance its usefulness.

1,783 citations


Journal ArticleDOI
TL;DR: An interdisciplinary review of privacy-related research is provided in order to enable a more cohesive treatment and recommends that researchers be alert to an overarching macro model that is referred to as APCO (Antecedents → Privacy Concerns → Outcomes).
Abstract: To date, many important threads of information privacy research have developed, but these threads have not been woven together into a cohesive fabric. This paper provides an interdisciplinary review of privacy-related research in order to enable a more cohesive treatment. With a sample of 320 privacy articles and 128 books and book sections, we classify previous literature in two ways: (1) using an ethics-based nomenclature of normative, purely descriptive, and empirically descriptive, and (2) based on their level of analysis: individual, group, organizational, and societal. Based upon our analyses via these two classification approaches, we identify three major areas in which previous research contributions reside: the conceptualization of information privacy, the relationship between information privacy and other constructs, and the contextual nature of these relationships. As we consider these major areas, we draw three overarching conclusions. First, there are many theoretical developments in the body of normative and purely descriptive studies that have not been addressed in empirical research on privacy. Rigorous studies that either trace processes associated with, or test implied assertions from, these value-laden arguments could add great value. Second, some of the levels of analysis have received less attention in certain contexts than have others in the research to date. Future empirical studies--both positivist and interpretive--could profitably be targeted to these under-researched levels of analysis. Third, positivist empirical studies will add the greatest value if they focus on antecedents to privacy concerns and on actual outcomes. In that light, we recommend that researchers be alert to an overarching macro model that we term APCO (Antecedents → Privacy Concerns → Outcomes).

1,595 citations


Journal ArticleDOI
TL;DR: Action design research (ADR) reflects the premise that IT artifacts are ensembles shaped by the organizational context during development and use and conceptualizes the research process as containing the inseparable and inherently interwoven activities of building the IT artifact, intervening in the organization, and evaluating it concurrently.
Abstract: Design research (DR) positions information technology artifacts at the core of the Information Systems discipline. However, dominant DR thinking takes a technological view of the IT artifact, paying scant attention to its shaping by the organizational context. Consequently, existing DR methods focus on building the artifact and relegate evaluation to a subsequent and separate phase. They value technological rigor at the cost of organizational relevance, and fail to recognize that the artifact emerges from interaction with the organizational context even when its initial design is guided by the researchers' intent. We propose action design research (ADR) as a new DR method to address this problem. ADR reflects the premise that IT artifacts are ensembles shaped by the organizational context during development and use. The method conceptualizes the research process as containing the inseparable and inherently interwoven activities of building the IT artifact, intervening in the organization, and evaluating it concurrently. The essay describes the stages of ADR and associated principles that encapsulate its underlying beliefs and values. We illustrate ADR through a case of competence management at Volvo IT.

1,538 citations


Journal ArticleDOI
TL;DR: The research model proposes that certain technology characteristics--usability (usefulness, complexity, and reliability), intrusiveness (presenteeism, anonymity), and dynamism (pace of change)--are related to stressors (work overload, role ambiguity, invasion of privacy, work-home conflict, and job insecurity); intrusive technology characteristics are found to be the dominant predictors of these stressors.
Abstract: With the proliferation and ubiquity of information and communication technologies (ICTs), it is becoming imperative for individuals to constantly engage with these technologies in order to get work accomplished. Academic literature, popular press, and anecdotal evidence suggest that ICTs are responsible for increased stress levels in individuals (known as technostress). However, despite the influence of stress on health costs and productivity, it is not very clear which characteristics of ICTs create stress. We draw from IS and stress research to build and test a model of technostress. The person-environment fit model is used as a theoretical lens. The research model proposes that certain technology characteristics--like usability (usefulness, complexity, and reliability), intrusiveness (presenteeism, anonymity), and dynamism (pace of change)--are related to stressors (work overload, role ambiguity, invasion of privacy, work-home conflict, and job insecurity). Field data from 661 working professionals was obtained and analyzed. The results clearly suggest the prevalence of technostress and the hypotheses from the model are generally supported. Work overload and role ambiguity are found to be the two most dominant stressors, whereas intrusive technology characteristics are found to be the dominant predictors of stressors. The results open up new avenues for research by highlighting the incidence of technostress in organizations and possible interventions to alleviate it.

1,167 citations


Journal ArticleDOI
TL;DR: A critical analysis of the literature reveals that information privacy is a multilevel concept but is rarely studied as such; the paper calls for information privacy research to use a broader diversity of sampling populations and for more design and action research--which can result in IT artifacts for the protection or control of information privacy--to be published in journal articles.
Abstract: Information privacy refers to the desire of individuals to control or have some influence over data about themselves. Advances in information technology have raised concerns about information privacy and its impacts, and have motivated Information Systems researchers to explore information privacy issues, including technical solutions to address these concerns. In this paper, we inform researchers about the current state of information privacy research in IS through a critical analysis of the IS literature that considers information privacy as a key construct. The review of the literature reveals that information privacy is a multilevel concept, but rarely studied as such. We also find that information privacy research has been heavily reliant on student-based and USA-centric samples, which results in findings of limited generalizability. Information privacy research focuses on explanation and prediction types of theoretical contributions, with few studies in journal articles focusing on design and action contributions. We recommend that future research should consider different levels of analysis as well as multilevel effects of information privacy. We illustrate this with a multilevel framework for information privacy concerns. We call for research on information privacy to use a broader diversity of sampling populations, and for more design and action information privacy research to be published in journal articles that can result in IT artifacts for protection or control of information privacy.

1,068 citations



Journal ArticleDOI
TL;DR: In this article, an imbrication metaphor--describing how human and material agencies become interwoven--is used to suggest how a human agency approach to technology can usefully incorporate notions of material agency into its explanations of organizational change.
Abstract: Employees in many contemporary organizations work with flexible routines and flexible technologies. When those employees find that they are unable to achieve their goals in the current environment, how do they decide whether they should change the composition of their routines or the materiality of the technologies with which they work? The perspective advanced in this paper suggests that the answer to this question depends on how human and material agencies--the basic building blocks common to both routines and technologies--are imbricated. Imbrication of human and material agencies creates infrastructure in the form of routines and technologies that people use to carry out their work. Routine or technological infrastructure used at any given moment is the result of previous imbrications of human and material agencies. People draw on this infrastructure to construct a perception that a technology either constrains their ability to achieve their goals, or that the technology affords the possibility of achieving new goals. The case of a computer simulation technology for automotive design, used to illustrate this framework, suggests that perceptions of constraint lead people to change their technologies while perceptions of affordance lead people to change their routines. This imbrication metaphor is used to suggest how a human agency approach to technology can usefully incorporate notions of material agency into its explanations of organizational change.

998 citations


Journal ArticleDOI
TL;DR: The premise that organizations need to develop superior firm-wide IT capability to successfully manage their IT resources to realize agility is developed, and a possible resolution to the contradictory effect of IT on agility is suggested.
Abstract: Information technology is generally considered an enabler of a firm's agility. A typical premise is that greater IT investment enables a firm to be more agile. However, it is not uncommon that IT can also hinder and sometimes even impede organizational agility. We propose and theorize this frequently observed but understudied IT-agility contradiction by which IT may enable or impede agility. We develop the premise that organizations need to develop superior firm-wide IT capability to successfully manage their IT resources to realize agility. We refine the conceptualization and measurement of IT capability as a latent construct reflected in its three dimensions: IT infrastructure capability, IT business spanning capability, and IT proactive stance. We also conceptualize two types of organizational agility: market capitalizing agility and operational adjustment agility. We then conduct a matched-pair field survey of business and information systems executives in 128 organizations to empirically examine the link between a firm's IT capability and agility. Business executives responded to measurement scales of the two types of agility and organizational context variables, and IS executives responded to measurement scales of IT capabilities and IS context variables. The results show a significant positive relationship between IT capability and the two types of organizational agility. We also find a significant positive joint effect of IT capability and IT spending on operational adjustment agility but not on market capitalizing agility. The findings suggest a possible resolution to the contradictory effect of IT on agility: while more IT spending does not lead to greater agility, spending it in such a way as to enhance and foster IT capabilities does. Our study provides initial empirical evidence to better understand essential IT capabilities and their relationship with organizational agility. Our findings provide a number of useful implications for research and managerial practices.

913 citations


Journal ArticleDOI
TL;DR: This research extends and integrates the literature on strategic IT alignment and organizational agility at a time when both alignment and agility are recognized as critical and concurrent organizational goals.
Abstract: Strategic information technology alignment remains a top priority for business and IT executives. Yet with a recent rise in environmental volatility, firms are asking how to be more agile in identifying and responding to market-based threats and opportunities. Whether alignment helps or hurts agility is an unresolved issue. This paper presents a variety of arguments from the literature that alternately predict a positive or negative relationship between alignment and agility. This relationship is then tested using a model in which agility mediates the link between alignment and firm performance under varying conditions of IT infrastructure flexibility and environmental volatility. Using data from a matched survey of IT and business executives in 241 firms, we uncover a positive and significant link between alignment and agility and between agility and firm performance. We also show that the effect of alignment on performance is fully mediated by agility, that environmental volatility positively moderates the link between agility and firm performance, and that agility has a greater impact on firm performance in more volatile markets. While IT infrastructure flexibility does not moderate the link between alignment and agility, except in a volatile environment, we reveal that IT infrastructure flexibility has a positive and significant main effect on agility. In fact, the effect of IT infrastructure flexibility on agility is as strong as the effect of alignment on agility. This research extends and integrates the literature on strategic IT alignment and organizational agility at a time when both alignment and agility are recognized as critical and concurrent organizational goals.

905 citations


Journal Article
TL;DR: Among the several approaches proposed for estimating interactions in latent variable models in CBSEM, Bollen and Paxton (1998) and Little et al. (2006) describe procedures that do not require special-purpose software, and so are accessible to all CBSEM users.
Abstract: Among the several approaches proposed for estimating interactions in latent variable models in CBSEM, Bollen and Paxton (1998) and Little et al. (2006) describe procedures that do not require special-purpose software, and so are accessible to all CBSEM users. Bollen and Paxton described an approach using two-stage least squares (2SLS), an analytical approach used in regression to overcome estimation problems that would confound OLS. 2SLS associates each predictor in a regression model with an instrumental variable. In the first stage, each predictor is regressed on its instrument; in the second stage, the ultimate dependent variable is regressed on the expected or predicted portion of each predictor from the first stage.

817 citations
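The abstract above describes the two-stage procedure only in general terms. As an illustration (not taken from the article), the following Python sketch applies that logic to simulated data; the variable names (instrument z, predictor x, outcome y) and the data-generating setup are hypothetical assumptions.

```python
# A minimal sketch of the two-stage least squares (2SLS) idea described above,
# using simulated data and plain NumPy; not the authors' procedure.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulate an endogenous predictor: x shares a disturbance u with y,
# which would bias ordinary least squares (OLS).
z = rng.normal(size=n)                         # instrumental variable
u = rng.normal(size=n)                         # confounding disturbance
x = 0.8 * z + 0.6 * u + rng.normal(size=n)
y = 1.5 * x + 0.9 * u + rng.normal(size=n)     # true coefficient on x is 1.5

def ols(X, y):
    """Return OLS coefficients for X with an intercept column added."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Stage 1: regress the predictor on its instrument and keep the fitted part.
b1 = ols(z, x)
x_hat = b1[0] + b1[1] * z

# Stage 2: regress the ultimate dependent variable on the predicted predictor.
b_2sls = ols(x_hat, y)
b_ols = ols(x, y)

print(f"OLS estimate of effect of x:  {b_ols[1]:.2f}  (biased upward)")
print(f"2SLS estimate of effect of x: {b_2sls[1]:.2f} (close to the true 1.5)")
```

Because the first-stage fitted values are purged of the confounding disturbance, the second-stage estimate lands near the true coefficient, whereas plain OLS on the raw predictor overstates it.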


Journal ArticleDOI
TL;DR: It is found that information management capability plays an important role in developing other firm capabilities for customer management, process management, and performance management, which in turn favorably influence customer, financial, human resources, and organizational effectiveness measures of firm performance.
Abstract: How do information technology capabilities contribute to firm performance? This study develops a conceptual model linking IT-enabled information management capability with three important organizational capabilities (customer management capability, process management capability, and performance management capability). We argue that these three capabilities mediate the relationship between information management capability and firm performance. We use a rare archival data set from a conglomerate business group that had adopted a model of performance excellence for organizational transformation based on the Baldrige criteria. This data set contains actual scores from high quality assessments of firms and intraorganizational units of the conglomerate, and hence provides unobtrusive measures of the key constructs to validate our conceptual model. We find that information management capability plays an important role in developing other firm capabilities for customer management, process management, and performance management. In turn, these capabilities favorably influence customer, financial, human resources, and organizational effectiveness measures of firm performance. Among key managerial implications, senior leaders must focus on creating necessary conditions for developing IT infrastructure and information management capability because they play a foundational role in building other capabilities for improved firm performance. The Baldrige model also needs some changes to more explicitly acknowledge the role and importance of information management capability so that senior leaders know where to begin in their journey toward business excellence.

Journal ArticleDOI
TL;DR: The results indicate that website quality influences consumers' perceptions of product quality, which subsequently affects online purchase intentions, and signal credibility strengthens the relationship between website quality and product quality perceptions for a high quality website.
Abstract: An electronic commerce marketing channel is fully mediated by information technology, stripping away much of a product's physical informational cues, and creating information asymmetries (i.e., limited information). These asymmetries may impede consumers' ability to effectively assess certain types of products, thus creating challenges for online sellers. Signaling theory provides a framework for understanding how extrinsic cues--signals--can be used by sellers to convey product quality information to consumers, reducing uncertainty and facilitating a purchase or exchange. This research proposes a model to investigate website quality as a potential signal of product quality and consider the moderating effects of product information asymmetries and signal credibility. Three experiments are reported that examine the efficacy of signaling theory as a basis for predicting online consumer behavior with an experience good. The results indicate that website quality influences consumers' perceptions of product quality, which subsequently affects online purchase intentions. Additionally, website quality was found to have a greater influence on perceived product quality when consumers had higher information asymmetries. Likewise, signal credibility was found to strengthen the relationship between website quality and product quality perceptions for a high quality website. Implications for future research and website design are examined.

Journal ArticleDOI
TL;DR: The research reported here undertook a three-phase approach of selecting, analyzing, and synthesizing relevant literature to develop a holistic, transdisciplinary, integrative framework for IT-enabled business transformation.
Abstract: The quality and future of human existence are directly related to the condition of our natural environment, but we are damaging the environment. Scientific evidence has mounted a compelling case that human behavior is responsible for deterioration in the Earth's natural environment, with the rate of deterioration predicted to increase in the future. Acknowledging this evidence, the governments of 192 countries have formally agreed to take action to resolve problems with the climate system, one of the most highly stressed parts of the natural environment. While the intention is clear, the question of how best to proceed is not. The research reported here undertook a three-phase approach of selecting, analyzing, and synthesizing relevant literature to develop a holistic, transdisciplinary, integrative framework for IT-enabled business transformation. The focus on business transformation is because business is recognized as being a critical contributor to meeting the challenges of environmental sustainability due to its potential capacity for innovation and change--locally, nationally, and globally. This article also serves as a resource base for researchers to begin to undertake significant information systems and multidisciplinary work toward the goal of environmental sustainability. Through selection and analysis of illustrative examples of current work from 12 academic disciplines across 6 core categories, the framework addresses the key issues of uncertainty: (1) What is meant by environmental sustainability? (2) What are its major challenges? (3) What is being done about these challenges? (4) What needs to be done?


Journal ArticleDOI
TL;DR: In this paper, the authors highlight the need to integrate predictive analytics into information systems research and show several concrete ways in which this goal can be accomplished, including new theory generation, measurement development, comparison of competing theories, improvement of existing models, relevance assessment, and assessment of the predictability of empirical phenomena.
Abstract: This research essay highlights the need to integrate predictive analytics into information systems research and shows several concrete ways in which this goal can be accomplished. Predictive analytics include empirical methods (statistical and other) that generate data predictions as well as methods for assessing predictive power. Predictive analytics not only assist in creating practically useful models, they also play an important role alongside explanatory modeling in theory building and theory testing. We describe six roles for predictive analytics: new theory generation, measurement development, comparison of competing theories, improvement of existing models, relevance assessment, and assessment of the predictability of empirical phenomena. Despite the importance of predictive analytics, we find that they are rare in the empirical IS literature. Extant IS literature relies nearly exclusively on explanatory statistical modeling, where statistical inference is used to test and evaluate the explanatory power of underlying causal models, and predictive power is assumed to follow automatically from the explanatory model. However, explanatory power does not imply predictive power and thus predictive analytics are necessary for assessing predictive power and for building empirical models that predict well. To show that predictive analytics and explanatory statistical modeling are fundamentally disparate, we show that they are different in each step of the modeling process. These differences translate into different final models, so that a pure explanatory statistical model is best tuned for testing causal hypotheses and a pure predictive model is best in terms of predictive power. We convert a well-known explanatory paper on TAM to a predictive context to illustrate these differences and show how predictive analytics can add theoretical and practical value to IS research.
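As a minimal, hypothetical illustration of the distinction the essay draws (not an analysis from the essay itself), the sketch below contrasts an in-sample, explanatory-style fit statistic with an out-of-sample assessment of predictive power on held-out data; the simulated data and scikit-learn usage are illustrative assumptions only.

```python
# Explanatory fit is judged on the data used for estimation; predictive power
# is judged on new, held-out observations. The gap between the two is the point.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n, p = 120, 15
X = rng.normal(size=(n, p))              # many candidate predictors
y = 2.0 * X[:, 0] + rng.normal(size=n)   # only the first one truly matters

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.4, random_state=1)

model = LinearRegression().fit(X_train, y_train)

# "Explanatory" assessment: fit on the estimation sample.
in_sample_r2 = r2_score(y_train, model.predict(X_train))

# "Predictive" assessment: accuracy on observations the model never saw.
out_of_sample_r2 = r2_score(y_test, model.predict(X_test))

print(f"In-sample R^2:     {in_sample_r2:.2f}")      # optimistic
print(f"Out-of-sample R^2: {out_of_sample_r2:.2f}")  # typically noticeably lower
```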

Journal ArticleDOI
TL;DR: In this article, the authors examined how user cognition and ultimately usage intentions toward an information technology are distorted by addiction to the technology, and found that addiction to online auctions augments user perceptions of enjoyment, usefulness, and ease of use attributed to the information technology, which in turn influence usage intentions.
Abstract: Technology addiction is a relatively new mental condition that has not yet been well integrated into mainstream MIS models. This study bridges this gap and incorporates technology addiction into technology use processes in the context of online auctions. It examines how user cognition and ultimately usage intentions toward an information technology are distorted by addiction to the technology. The findings from two empirical studies of 132 and 223 eBay users, using three different operationalizations of addiction, indicate that the level of online auction addiction distorts the way the IT artifact is perceived. Informing a range of cognition-modification processes, addiction to online auctions augments user perceptions of enjoyment, usefulness, and ease of use attributed to the technology, which in turn influence usage intentions. Overall, consistent with behavioral addiction models, the findings indicate that users' levels of online auction addiction influence their reasoned IT usage decisions by altering users' belief systems. The formation of maladaptive perceptions is driven by a combination of memory-, learning-, and bias-based cognition-modification processes. Implications of the findings are discussed.

Journal ArticleDOI
TL;DR: It is found that flow mediates the impacts of technological and spatial environments on intention to purchase virtual products.
Abstract: Although research on three-dimensional virtual environments abounds, little is known about the social and business aspects of virtual worlds. Given the emergence of large-scale social virtual worlds, such as Second Life, and the dramatic growth in sales of virtual goods, it is important to understand the dynamics that govern the purchase of virtual goods in virtual worlds. Employing the stimulus-organism-response (S-O-R) framework, we investigate how technological (interactivity and sociability) and spatial (density and stability) environments in virtual worlds influence the participants' virtual experiences (telepresence, social presence, and flow), and how experiences subsequently affect their response (intention to purchase virtual goods). The results of our survey of 354 Second Life residents indicate that interactivity, which enhances the interaction with objects, has a significant positive impact on telepresence and flow. Also, sociability, which fosters interactions with participants, is significantly associated with social presence, although no such significant impact was observed on flow. Furthermore, both density and stability are found to significantly influence participants' virtual experiences; stability helps users to develop strong social bonds, thereby increasing both social presence and flow. However, contrary to our prediction of curvilinear patterns, density is linearly associated with flow and social presence. Interestingly, the results exhibit two opposing effects of density: while it reduces the extent of flow, density increases the amount of social presence. Since social presence is found to increase flow, the net impact of density on flow depends heavily on the relative strength of the associations involving these three constructs. Finally, we find that flow mediates the impacts of technological and spatial environments on intention to purchase virtual products. We conclude the paper with a discussion of the theoretical and practical contributions of our findings.

Journal ArticleDOI
TL;DR: The nature of the critical research perspective is examined, its significance is clarified, and its major discourses are reviewed, recognizing that its mission and methods cannot be captured by a fixed set of criteria once and for all.
Abstract: While criteria or principles for conducting positivist and interpretive research have been widely discussed in the IS research literature, criteria or principles for critical research are lacking. Therefore, the purpose of this paper is to propose a set of principles for the conduct of critical research in information systems. We examine the nature of the critical research perspective, clarify its significance, and review its major discourses, recognizing that its mission and methods cannot be captured by a fixed set of criteria once and for all, particularly as multiple approaches are still in the process of defining their identity. However, we suggest it is possible to formulate a set of principles capturing some of the commonalities of those approaches that have so far become most visible in the IS research literature. The usefulness of the principles is illustrated by analyzing three critical field studies in information systems. We hope that this paper will further reflection and debate on the important subject of grounding critical research methodology.

Journal ArticleDOI
TL;DR: A 20-month action research project was conducted to study the user's experience and to identify design principles for virtual co-creation systems; the study reveals how to design co-creation systems and enriches research on co-creation to fit the virtual world context.
Abstract: Emerging virtual worlds, such as the prominent Second Life, offer unprecedented opportunities for companies to collaborate with co-creating users. However, pioneering corporate co-creation systems fail to attract a satisfying level of participation and engagement. The experience users have with the co-creation system is the key to making virtual places a vibrant source of great connections, creativity, and co-creation. While prior research on co-creation serves as a foundation for this work, it does not provide adequate guidance on how to design co-creation systems in virtual worlds. To address this shortcoming, a 20-month action research project was conducted to study the user's experience and to identify design principles for virtual co-creation systems. In two action research cycles, a virtual co-creation system called Ideation Quest was created, deployed, evaluated, and improved. The study reveals how to design co-creation systems and enriches research on co-creation to fit the virtual world context. Practitioners receive a helpful framework to leverage virtual worlds for co-creation.

Journal ArticleDOI
TL;DR: The related topics of measurement and construct validity are summarized and discussed, with particular focus on formative and reflective indicators and common method bias, and, where relevant, a number of allied issues are considered.
Abstract: Despite renewed interest and many advances in methodology in recent years, information systems and organizational researchers face confusing and inconsistent guidance on how to choose amongst, implement, and interpret findings from the use of different measurement procedures. In this article, the related topics of measurement and construct validity are summarized and discussed, with particular focus on formative and reflective indicators and common method bias, and, where relevant, a number of allied issues are considered. The perspective taken is an eclectic and holistic one and attempts to address conceptual and philosophical essentials, raise salient questions, and pose plausible solutions to critical measurement dilemmas occurring in the managerial, behavioral, and social sciences.


Journal ArticleDOI
Paul A. Pavlou
TL;DR: This paper evaluates the current state of the IS literature on information privacy (where are we now?) and identifies promising research directions for advancing IS research on information privacy (where should we go?).
Abstract: While information privacy has been studied in multiple disciplines over the years, the advent of the information age has both elevated the importance of privacy in theory and practice, and increased the relevance of information privacy literature for Information Systems, which has taken a leading role in the theoretical and practical study of information privacy. There is an impressive body of literature on information privacy in IS, and the two Theory and Review articles in this issue of MIS Quarterly review this literature. By integrating these two articles, this paper evaluates the current state of the IS literature on information privacy (where are we now?) and identifies promising research directions for advancing IS research on information privacy (where should we go?). Additional thoughts on further expanding the information privacy research in IS by drawing on related disciplines to enable a multidisciplinary study of information privacy are discussed.

Journal ArticleDOI
TL;DR: In this article, the authors examined the effect of using two-dimensional versus three-dimensional virtual world environments on telepresence, enjoyment, brand equity, and behavioral intention, and found that the 3D virtual world environment produces both positive and negative effects on brand equity when compared to the 2D environment.
Abstract: This research uses theories of flow, telepresence, positive emotions, and brand equity to examine the effect of using two-dimensional versus three-dimensional virtual world environments on telepresence, enjoyment, brand equity, and behavioral intention. The findings suggest that the 3D virtual world environment produces both positive and negative effects on brand equity when compared to the 2D environment. The positive effect of the 3D virtual world environment on brand equity occurs through telepresence, a specific aspect of flow, as well as enjoyment. The negative effect on brand equity can be explained using distraction-conflict theory in which attentional conflicts faced by users of a highly interactive and rich medium resulted in distractions from attending to the brand. Brand equity, in turn, has a positive effect on behavioral intention. The results suggest that although the 3D virtual world environment has the potential to increase brand equity by offering an immersive and enjoyable virtual product experience, the rich environment can also be a distraction. Therefore, developers of virtual world branding sites need to take into account limitations in the information processing capacity and attention span of users when designing their sites in order to avoid cognitive overload, which can lead to users being distracted from branding information. This paper not only provides a theoretical foundation for explaining users' experience with 2D versus 3D virtual world branding sites, but also provides insights to practitioners for designing 3D virtual world sites to enhance brand equity and intentions through user engagement.

Journal ArticleDOI
TL;DR: The aim is to provide practicing IS researchers with an understanding of key issues and potential problems associated with formatively measured constructs within a covariance-based modeling framework and encourage them to consider using CSA in their future research endeavors.
Abstract: Formatively measured constructs have been increasingly used in information systems research. With few exceptions, however, extant studies have been relying on the partial least squares (PLS) approach to specify and estimate structural models involving constructs measured with formative indicators. This paper highlights the benefits of employing covariance structure analysis (CSA) when investigating such models and illustrates its application with the LISREL program. The aim is to provide practicing IS researchers with an understanding of key issues and potential problems associated with formatively measured constructs within a covariance-based modeling framework and encourage them to consider using CSA in their future research endeavors.

Journal Article
Ning Nan
TL;DR: A theoretical framework drawing on the concepts and the analytical tool of complex adaptive systems (CAS) theory is built that encodes a bottom-up IT use process into three interrelated elements: agents that consist of the basic entities of actions in an IT use process, interactions that refer to the mutually adaptive behaviors of agents, and an environment that represents the social organizational contexts of IT use.
Abstract: Although information systems researchers have long recognized the possibility for collective-level information technology use patterns and outcomes to emerge from individual-level IT use behaviors, few have explored the key properties and mechanisms involved in this bottom-up IT use process. This paper seeks to build a theoretical framework drawing on the concepts and the analytical tool of complex adaptive systems (CAS) theory. The paper presents a CAS model of IT use that encodes a bottom-up IT use process into three interrelated elements: agents that consist of the basic entities of actions in an IT use process, interactions that refer to the mutually adaptive behaviors of agents, and an environment that represents the social organizational contexts of IT use. Agent-based modeling is introduced as the analytical tool for computationally representing and examining the CAS model of IT use. The operationability of the CAS model and the analytical tool are demonstrated through a theory-building exercise translating an interpretive case study of IT use to a specific version of the CAS model. While Orlikowski (1996) raised questions regarding the impacts of employee learning, IT flexibility, and workplace rigidity on IT-based organization transformation, the CAS model indicates that these factors in individual-level actions do not have a direct causal linkage with organizational-level IT use patterns and outcomes. This theory-building exercise manifests the intriguing nature of the bottom-up IT use process: collective-level IT use patterns and outcomes are the logical and yet often unintended or unforeseeable consequences of individual-level behaviors. The CAS model of IT use offers opportunities for expanding the theoretical and methodological scope of the IT use literature.
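To make the three CAS elements named above concrete, here is a toy agent-based sketch. The decision rule, peer structure, and parameter values are deliberately simplified assumptions and are not taken from the paper: agents hold individual use propensities, interactions expose each agent to a few peers, and the environment supplies a uniform organizational nudge.

```python
# A toy agent-based model: agents (basic entities of action), interactions
# (mutual adaptation to peers), and an environment (organizational context).
import random

random.seed(7)

N_AGENTS = 100   # agents
STEPS = 30
PEERS = 5        # interactions: each agent adapts to a few randomly drawn peers
SUPPORT = 0.1    # environment: organizational support nudging use

# Each agent starts with an individual propensity to use the IT; a few adopt early.
propensity = [random.random() for _ in range(N_AGENTS)]
using = [p > 0.8 for p in propensity]

for step in range(STEPS):
    new_state = []
    for i in range(N_AGENTS):
        peers = random.sample(range(N_AGENTS), PEERS)
        peer_share = sum(using[j] for j in peers) / PEERS
        # Mutual adaptation: an agent uses the IT when its own propensity plus
        # peer influence plus environmental support crosses a threshold.
        score = 0.5 * propensity[i] + 0.4 * peer_share + SUPPORT
        new_state.append(score > 0.5)
    using = new_state

adoption = sum(using) / N_AGENTS
print(f"Collective-level use after {STEPS} steps: {adoption:.0%} of agents")
```

Even this toy model reflects the paper's central point: the collective adoption level that emerges after a few steps is not written into any single agent's rule, but is the often unforeseeable consequence of individual-level behaviors interacting over time.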

Journal ArticleDOI
TL;DR: A typology of product-related deceptive information practices is presented that illustrates the various ways in which online merchants can deceive consumers via e-commerce product websites and serves as a foundation for further theoretical and empirical investigations.
Abstract: With the advent of e-commerce, the potential of new Internet technologies to mislead or deceive consumers has increased considerably. This paper extends prior classifications of deception and presents a typology of product-related deceptive information practices that illustrates the various ways in which online merchants can deceive consumers via e-commerce product websites. The typology can be readily used as educational material to promote consumer awareness of deception in e-commerce and as input to establish benchmarks for good business practices for online companies. In addition, the paper develops an integrative model and a set of theory-based propositions addressing why consumers are deceived by the various types of deceptive information practices and what factors contribute to consumer success (or failure) in detecting such deceptions. The model not only enhances our conceptual understanding of the phenomenon of product-based deception and its outcomes in e-commerce but also serves as a foundation for further theoretical and empirical investigations. Moreover, a better understanding of the factors contributing to or inhibiting deception detection can also help government agencies and consumer organizations design more effective solutions to fight online deception.

Journal ArticleDOI
TL;DR: Results of this analysis suggest that system capability shortcomings, limited availability of system support, and low levels of technical integration were key determinants of increased intentions to replace an existing system.
Abstract: Limited attention has been directed toward examining post-adoption stages of the information system life cycle. In particular, the final stages of this life cycle have been largely ignored despite the fact that most systems eventually reach the end of their useful life. This oversight is somewhat surprising given that end-of-life decisions can have significant implications for user effectiveness, the value extracted from IS investments, and organizational performance. Given this apparent gap, a multi-method empirical study was undertaken to improve our understanding of organizational level information system discontinuance. Research commenced with the development of a broad theoretical framework consistent with the technology-organization-environment (TOE) paradigm. The resulting framework was then used to guide a series of semi-structured interviews with organizational decision makers in an effort to inductively identify salient influences on the formation of IS discontinuance intentions. A set of research hypotheses were formulated based on the understanding obtained during these interviews and subsequently tested via a random survey of senior IS decision makers at U.S. and Canadian organizations. Data obtained from the survey responses was analyzed using partial least squares (PLS). Results of this analysis suggest that system capability shortcomings, limited availability of system support, and low levels of technical integration were key determinants of increased intentions to replace an existing system. Notably, investments in existing systems did not appear to significantly undermine organizational replacement intentions despite support for this possibility from both theory and our semi-structured interviews.

Journal ArticleDOI
TL;DR: This paper provides an overview of this topic by contrasting ways of assessing the validity of effect and causal indicators in structural equation models (SEMs). It also draws a distinction between composite (formative) indicators and causal indicators and argues that validity is most relevant to the latter.
Abstract: Although the literature on alternatives to effect indicators is growing, there has been little attention given to evaluating causal and composite (formative) indicators. This paper provides an overview of this topic by contrasting ways of assessing the validity of effect and causal indicators in structural equation models (SEMs). It also draws a distinction between composite (formative) indicators and causal indicators and argues that validity is most relevant to the latter. Sound validity assessment of indicators is dependent on having an adequate overall model fit and on the relative stability of the parameter estimates for the latent variable and indicators as they appear in different models. If the overall fit and stability of estimates are adequate, then a researcher can assess validity using the unstandardized and standardized validity coefficients and the unique validity variance estimate. With multiple causal indicators or with effect indicators influenced by multiple latent variables, collinearity diagnostics are useful. These results are illustrated with a number of correctly and incorrectly specified hypothetical models.
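As one concrete, hypothetical illustration of the collinearity diagnostics mentioned above, the following Python sketch computes variance inflation factors (VIFs) for three simulated causal indicators; the data and indicator names are assumptions for illustration, not values or procedures from the paper.

```python
# VIF for each indicator: regress it on the remaining indicators and compute
# 1 / (1 - R^2). A high VIF flags an indicator that is nearly redundant.
import numpy as np

rng = np.random.default_rng(3)
n = 300

# Three simulated causal indicators; x2 is nearly a combination of x0 and x1,
# so it should show an elevated VIF.
x0 = rng.normal(size=n)
x1 = rng.normal(size=n)
x2 = 0.7 * x0 + 0.7 * x1 + 0.2 * rng.normal(size=n)
X = np.column_stack([x0, x1, x2])

def vif(X, j):
    """VIF for column j, from regressing it on the other columns plus an intercept."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    others = np.column_stack([np.ones(len(others)), others])
    beta, *_ = np.linalg.lstsq(others, y, rcond=None)
    resid = y - others @ beta
    r2 = 1 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

for j in range(X.shape[1]):
    print(f"Indicator x{j}: VIF = {vif(X, j):.1f}")
```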

Journal ArticleDOI
TL;DR: Higher quality PPRs are associated with greater value derived by consumers from the online product brokering activity in terms of higher decision making quality, which is positively associated with repurchase intention.
Abstract: Recent research has acknowledged the key role of information technology in helping build stronger and more enduring customer relationships. Personalized product recommendations (PPRs) adapted to individual customers' preferences and tastes are one IT-enabled strategy that has been widely adopted by online retailers to enhance customers' shopping experience. Although many online retailers have implemented PPRs on their electronic storefronts to improve customer retention, empirical evidence for the effects of PPRs on retention is sparse, and the limited anecdotal evidence is contradictory. We draw upon the household production function model in the consumer economics literature to develop a theoretical framework that explains the mechanisms through which PPRs influence customer store loyalty in electronic markets. We suggest that retailer learning that occurs as a result of customer knowledge obtained to enable personalization influences the efficiency of the online product brokering activity. Data collected from a two-phase lab experiment with 253 student subjects where the quality of PPRs was manipulated are used to empirically test the predictions of the theoretical model. Empirical analyses of the data indicate that retailer learning reflected in higher quality PPRs is associated with lower product screening cost, but higher product evaluation cost. We further find that higher quality PPRs are associated with greater value derived by consumers from the online product brokering activity in terms of higher decision making quality, which is positively associated with repurchase intention. The paper presents the implications, limitations, and contributions of this study along with areas for future research.