
Showing papers in "Journal of Management Information Systems in 2003"


Journal ArticleDOI
TL;DR: This paper discusses many of the important IS success research contributions of the last decade, focusing especially on research efforts that apply, validate, challenge, and propose enhancements to the original model.
Abstract: Ten years ago, we presented the DeLone and McLean Information Systems (IS) Success Model as a framework and model for measuring the complex dependent variable in IS research. In this paper, we discuss many of the important IS success research contributions of the last decade, focusing especially on research efforts that apply, validate, challenge, and propose enhancements to our original model. Based on our evaluation of those contributions, we propose minor refinements to the model and propose an updated DeLone and McLean IS Success Model. We discuss the utility of the updated model for measuring e-commerce system success. Finally, we make a series of recommendations regarding current and future measurement of IS success.

9,544 citations


Journal ArticleDOI
TL;DR: A research model that interconnects knowledge management factors and focuses on knowledge creation processes such as socialization, externalization, combination, and internalization to establish credibility between knowledge creation and performance is developed.
Abstract: Knowledge is recognized as an important weapon for sustaining competitive advantage, and many companies are beginning to manage organizational knowledge. Researchers have investigated knowledge management factors such as enablers, processes, and performance. However, most current empirical research has explored the relationships between these factors in isolation. To fill this gap, this paper develops a research model that interconnects knowledge management factors. The model includes seven enablers: collaboration, trust, learning, centralization, formalization, T-shaped skills, and information technology support. The emphasis is on knowledge creation processes such as socialization, externalization, combination, and internalization. To establish credibility between knowledge creation and performance, organizational creativity is incorporated into the model. Surveys collected from 58 firms were analyzed to test the model. The results confirmed the impact of trust on knowledge creation. Information technology support had a positive impact on knowledge combination only. Organizational creativity was found to be critical for improving performance; neglecting ideas can undermine a business. The results may be used as a stepping stone for further empirical research and can help formulate robust strategies that involve trade-offs between knowledge management enablers.

2,036 citations


Journal ArticleDOI
TL;DR: The thinkLet concept is proposed, a codified packet of facilitation skill that can be applied by practitioners to achieve predictable, repeatable patterns of collaboration, such as divergence or convergence, which may become a sine qua non for organizations to effectively support virtual work teams.
Abstract: Field research and laboratory experiments suggest that, under certain circumstances, people using group support systems (GSS) can be significantly more productive than people who do not use them. Yet, despite their demonstrated potential, GSS have been slow to diffuse across organizations. Drawing on the Technology Transition Model, the paper argues that the high conceptual load of GSS (i.e., understanding of the intended effect of GSS functionality) encourages organizations to employ expert facilitators to wield the technology on behalf of others. Economic and political factors militate against facilitators remaining long term in GSS facilities that focus on supporting nonroutine, ad hoc projects. This especially hampers scaling GSS technology to support distributed collaboration. An alternative and sustainable way for organizations to derive value from GSS lies in an approach called "collaboration engineering": the development of repeatable collaborative processes that are conducted by practitioners themselves. To enable the development of such processes, this paper proposes the thinkLet concept, a codified packet of facilitation skill that can be applied by practitioners to achieve predictable, repeatable patterns of collaboration, such as divergence or convergence. A thinkLet specifies the facilitator's choices and actions in terms of the GSS tool used, the configuration of this tool, and scripted prompts to accomplish a pattern of collaboration in a group. Using thinkLets as building blocks, facilitators can develop and transfer repeatable collaborative processes to practitioners. Given the limited availability of expert facilitators, collaboration engineering with thinkLets may become a sine qua non for organizations to effectively support virtual work teams.
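The abstract's description suggests a natural data structure: a thinkLet bundles a GSS tool, its configuration, and scripted prompts under a named pattern of collaboration. The sketch below is a hypothetical Python encoding of that idea (not from the paper or any GSS product); the example thinkLet and all its field values are invented for illustration.

```python
from dataclasses import dataclass, field

# Hypothetical encoding of a thinkLet per the abstract's definition:
# the GSS tool used, its configuration, and scripted prompts that
# together produce a named pattern of collaboration.
@dataclass
class ThinkLet:
    name: str
    pattern: str                 # e.g., "divergence" or "convergence"
    tool: str                    # GSS tool the facilitator uses
    configuration: dict = field(default_factory=dict)
    scripted_prompts: list = field(default_factory=list)

# An invented example of a divergence thinkLet.
free_brainstorm = ThinkLet(
    name="FreeBrainstorm",
    pattern="divergence",
    tool="electronic brainstorming",
    configuration={"anonymous": True, "pages": "shuffled"},
    scripted_prompts=["Add one idea, then pass your page back to the group."],
)
```

Codifying facilitation choices in a structure like this is what would make them transferable building blocks rather than tacit expert skill.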

570 citations


Journal ArticleDOI
TL;DR: The development and empirical validation of a model of software piracy by individuals in the workplace indicates that individual attitudes, subjective norms, and perceived behavioral control are significant precursors to the intention to illegally copy software.
Abstract: Theft of software and other intellectual property has become one of the most visible problems in computing today. This paper details the development and empirical validation of a model of software piracy by individuals in the workplace. The model was developed from the results of prior research into software piracy, and the reference disciplines of the theory of planned behavior, expected utility theory, and deterrence theory. A survey of 201 respondents was used to test the model. The results indicate that individual attitudes, subjective norms, and perceived behavioral control are significant precursors to the intention to illegally copy software. In addition, punishment severity, punishment certainty, and software cost have direct effects on the individual's attitude toward software piracy, whereas punishment certainty has a significant effect on perceived behavioral control. Consequently, strategies to reduce software piracy should focus on these factors. The results add to a growing stream of information systems research into illegal software copying behavior and have significant implications for organizations and industry groups aiming to reduce software piracy.

509 citations


Journal ArticleDOI
TL;DR: An experimental survey is used to test a model that includes a number of factors such as trust mechanisms, "system trust," and vendor reputation to suggest that one trust mechanism, vendor guarantees, has a direct influence on system trust.
Abstract: It has been argued that the buyer's trust of the vendor is a critical precursor to a transactional relationship in an e-commerce environment. This study uses an experimental survey to test a model that includes a number of factors such as trust mechanisms, "system trust," and vendor reputation. The results suggest that one trust mechanism, vendor guarantees, has a direct influence on system trust. Further, within e-commerce situations, system trust plays an important role in the nomological network by directly affecting trust in vendors and indirectly affecting attitudes and intentions to purchase. These results held for firms both with and without an established reputation. The results demonstrate the importance of interventions such as self-reported vendor guarantees that affect system trust in enabling successful e-commerce outcomes.

392 citations


Journal ArticleDOI
TL;DR: It was found that richer media can have significantly positive impacts on decision quality when participants' task-relevant knowledge is high and effects of participant deception can be mitigated by employing richer media.
Abstract: Employing media richness theory, a model is developed to open the black box surrounding the impact of computer-mediated communication systems on decision quality. The effects on decision quality of two important communication system factors, cue multiplicity and feedback immediacy, are examined in light of three important mediating constructs: social perceptions, message clarity, and ability to evaluate others. A laboratory experiment examining two tasks and employing face-to-face, electronic meeting, electronic conferencing, and electronic mail communication systems is used to assess the model's validity. Results provide consistent support for the research model as well as media richness theory. Richer media facilitate social perceptions (total socio-emotional communication and positive socio-emotional climate) and perceived ability to evaluate others' deception and expertise. Leaner media (electronic mail and electronic conferencing) facilitate communication clarity when participants have less task-relevant knowledge. The impacts of these mediating constructs on decision quality were found to depend on the levels of participant expertise and deception. In general, it was found that richer media can have significantly positive impacts on decision quality when participants' task-relevant knowledge is high. Moreover, effects of participant deception can be mitigated by employing richer media.

372 citations


Journal ArticleDOI
TL;DR: Results from a field study within a large organization indicate that developers' intentions are directly influenced by their perceptions of usefulness, social pressure, compatibility, and organizational mandate, and it is suggested that an organizational mandate is not sufficient to guarantee use of the methodology in a sustained manner.
Abstract: Seeking to improve software development, many organizations attempt to deploy formalized methodologies. This typically entails substantial behavioral change by software developers away from previous informal practices toward conformance with the methodology. Developers' resistance to such change often results in failure to fully deploy and realize the benefits of the methodology. The present research draws upon theories of intention formation and innovation diffusion to advance knowledge about why developers accept or resist following methodologies. Results from a field study within a large organization indicate that developers' intentions are directly influenced by their perceptions of usefulness, social pressure, compatibility, and organizational mandate. This pattern of intention determinants is quite different from that typically observed in studies of information technology tool adoption, revealing several key differences between the domains of tool versus methodology adoption. Specifically, although organizational mandate had a significant effect on intentions, the strength of its direct influence was the lowest among the four significant constructs, and usefulness, compatibility, and social pressure all influenced intentions directly, above and beyond the effects of organizational mandate. The findings suggest, contrary to popular belief, that an organizational mandate is not sufficient to guarantee use of the methodology in a sustained manner.

274 citations


Journal ArticleDOI
TL;DR: It is found that temporal coordination per se is not the driver of performance; rather, it is the influence of coordination on interaction behaviors that affects performance.
Abstract: In this study, we explore the nature of team interaction and the role of temporal coordination in asynchronously communicating global virtual project teams (GVPT). Drawing on Time, Interaction, and Performance (TIP) theory, we consider how and why virtual team behavior is temporally patterned in complex ways. We report on the results of an experiment consisting of 35 virtual project teams comprised of 175 members residing in the United States and Japan. Through content and cluster analysis, we identify distinct patterns of interaction and examine how these patterns are associated with differential levels of GVPT performance. We also explore the role of temporal coordination mechanisms as a means to synchronize temporal patterns in GVPTs. Our results suggest that successful enactment of temporal coordination mechanisms is associated with higher performance. However, we found that temporal coordination per se is not the driver of performance; rather, it is the influence of coordination on interaction behaviors that affects performance.

259 citations


Journal ArticleDOI
TL;DR: The DeLone and McLean Information Systems Success Model is presented as a framework and model for measuring the complex dependent variable in IS research.
Abstract: Ten years ago, we presented the DeLone and McLean Information Systems (IS) Success Model as a framework and model for measuring the complex dependent variable in IS research. In this paper, we discuss many of the important IS success research contributions of the last decade.

252 citations


Journal ArticleDOI
TL;DR: It is suggested that it is important to appropriately measure the boundary of interest to the study, assess and control for other influential boundaries within and across teams, and distinguish the effects of each boundary on each team outcome of interest.
Abstract: Numerous methodological issues arise when studying teams that span multiple boundaries. The main purpose of this paper is to raise awareness about the challenges of conducting field research on teams in global firms. Based on field research across multiple firms (software development, product development, financial services, and high technology), we outline five types of boundaries that we encountered in our field research (geographical, functional, temporal, identity, and organizational) and discuss methodological issues in distinguishing the effects of one boundary where multiple boundaries exist. We suggest that it is important to: (1) appropriately measure the boundary of interest to the study, (2) assess and control for other influential boundaries within and across teams, and (3) distinguish the effects of each boundary on each team outcome of interest. Only through careful attention to methodology can we properly assess the effects of team boundaries and appreciate their research and practical implications for designing and using information systems to support collaborative work.

251 citations


Journal ArticleDOI
TL;DR: It is found that work roles and the mode of knowledge do matter and data collectors with why-knowledge about the data production process contribute to producing better quality data.
Abstract: Knowledge about work processes is a prerequisite for performing work. We investigate whether a certain mode of knowledge, knowing-why, affects work performance and whether the knowledge held by different work roles matters for work performance. We operationalize these questions in the specific domain of data production processes and data quality. We analyze responses from three roles within data production processes, data collectors, data custodians, and data consumers, to investigate the effects of different knowledge modes held by different work roles on data quality. We find that work roles and the mode of knowledge do matter. Specifically, data collectors with why-knowledge about the data production process contribute to producing better quality data. Overall, knowledge of data collectors is more critical than that of data custodians.

Journal ArticleDOI
TL;DR: This study is the first to identify the steps a virtual team leader undertakes when building relationships with virtual team members, and it shows very clearly that the leaders considered it essential to build some level of personal relationship with their virtual team members before commencing a virtual working relationship.
Abstract: This paper seeks to add to the nascent research literature on virtual teams and virtual team leadership by investigating the issues facing virtual team leaders as they implement and lead virtual teams. In particular, the way in which leaders develop relationships with their virtual team members is explored. A research framework involving action learning was instituted, with data collection and analysis based on grounded theory approaches. In all, seven virtual team leaders from a variety of New Zealand organizations took part in the study. The data showed very clearly that the leaders considered it essential to build some level of personal relationship with their virtual team members before commencing a virtual working relationship. A unifying framework of three interrelated theoretical steps, which illustrates how a virtual leader builds relationships with virtual team members, is introduced. These three steps are assessing conditions, targeting level of relationship, and creating strategies. This study is the first to identify the steps a virtual team leader undertakes when building relationships with virtual team members. The implications for virtual team practice and research are discussed.

Journal ArticleDOI
TL;DR: This paper presents a method named product-based workflow design (PBWD), which takes the product specification and three design criteria as a starting point, after which formal models and techniques are used to derive a favorable new design of the workflow process.
Abstract: In manufacturing, the interaction between the design of a product and the process to manufacture this product is studied in detail. Consider, for example, material requirements planning (MRP) as part of current enterprise resource planning (ERP) systems, which is mainly driven by the bill of material (BOM). For information-intensive products such as insurances, and many other services, the workflow process typically evolves or is redesigned without careful consideration of the structure and characteristics of the product. In this paper, we present a method named product-based workflow design (PBWD). PBWD takes the product specification and three design criteria as a starting point, after which formal models and techniques are used to derive a favorable new design of the workflow process. The ExSpect tool is used to support PBWD. Finally, using a real case study, we demonstrate that a full evaluation of the search space for a workflow design may be feasible depending on the chosen design criteria and the specific nature of the product specifications.
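The core idea of deriving a process design from the product structure can be illustrated with a small sketch. This is not the paper's PBWD method or the ExSpect tool; it merely shows, with an invented insurance product, how a product specification expressed as data dependencies (analogous to a bill of material) yields a feasible task ordering.

```python
from graphlib import TopologicalSorter

# Hypothetical product specification for an insurance decision:
# each information element maps to the elements it is derived from.
product_spec = {
    "premium": {"risk_class", "coverage"},
    "risk_class": {"applicant_age", "claim_history"},
    "coverage": set(),
    "applicant_age": set(),
    "claim_history": set(),
}

def derive_task_order(spec):
    """Return one feasible processing order: each element is
    produced only after every element it depends on."""
    return list(TopologicalSorter(spec).static_order())

order = derive_task_order(product_spec)
```

A real PBWD design would additionally weigh design criteria (such as cost and quality) over the alternative orderings the specification allows; this sketch only demonstrates that the product structure, not an inherited process, drives the design.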

Journal ArticleDOI
TL;DR: The goal here is to conceptualize the IS success antecedents (ISSA) area of research through surveying, synthesizing, and explicating the work in the domain through a combination of qualitative and quantitative research methods.
Abstract: Research in the information systems (IS) field has often been characterized as fragmented. This paper builds on a belief that for the field to move forward and have an impact on practitioners and other academic fields, the existing work must be examined and systematized. It is particularly important to systematize research on the factors that underlie success of organizational IS. The goal here is to conceptualize the IS success antecedents (ISSA) area of research through surveying, synthesizing, and explicating the work in the domain. Using a combination of qualitative and quantitative research methods, a taxonomy of 12 general categories is created, and existing research within each category is examined. Important lacunae in the direction of work have been determined. It is found that little work has been conducted on the macro-level independent variables, the most difficult variables to assess, although these variables may be the most important to understanding the ultimate value of IS to organizations. Similarly, ISSA research on success variables of consequence to organizations was found severely lacking. Variable analysis research on organizational-level success variables was found to be literally nonexistent in the IS field, whereas research in the organizational studies field was found to provide useful directions for IS researchers. The specifics of the 12 taxonomy areas are analyzed and directions for research in each of them provided. Thus, researchers and practitioners are directed toward available research and receive suggestions for future work to bring ISSA research toward an organized and cohesive future.

Journal ArticleDOI
TL;DR: The results suggest that performance is enhanced by establishing uniform performance criteria across projects while giving each project team the authority to make decisions with respect to methods (decentralization of methods). However, standardization of methods across all projects and decentralization of performance criteria were both not significantly related to performance.
Abstract: The performance of firms in the software industry depends considerably on the quality of their software development processes. Managing software development is a challenging task, as management controls need to impose discipline and coordinate action to ensure goals are met while simultaneously incorporating autonomy to motivate software developers to be innovative and produce quality work. How should such firms manage software development projects so that their development processes are flexible and predictable--resulting in products that meet quality goals and that are delivered within budget and on time? The management literature suggests two approaches to control: the process approach and the structure approach. The process approach recommends control of activities through specifying methods (behavior control) and through specifying performance criteria (outcome control). In contrast, the structure approach recommends control through centrally devised standards for activities (standardization) and by the delegation of authority for decision-making (decentralization). This study synthesizes these two approaches to suggest that formal managerial control is exerted through a matrix of control comprising four modes: standardization of methods, standardization of performance criteria, decentralization of methods, and decentralization of performance criteria. We test the association of the modes of control with performance in a sample of 56 firms in the software industry in the United States. The results suggest that performance is enhanced by establishing uniform performance criteria across projects (standardization of performance criteria) while giving each project team the authority to make decisions with respect to methods (decentralization of methods). 
However, standardization of methods across all projects and decentralization of performance criteria by delegating the authority to make decisions about performance criteria to project teams were both not significantly related to performance. The matrix of control and its relationship to performance has theoretical and practical implications for managing software development. This model of control is also likely to be useful in other knowledge-work-intensive settings.

Journal ArticleDOI
TL;DR: A practical procedure for data gathering and analysis is defined to uncover and model CSC in the firm and to generate ideas for important IS projects to help managers to consider a wider range of development ideas.
Abstract: We extend critical success factors (CSF) methodology to facilitate participation by many people within and around the organization for information systems (IS) planning. The resulting new methodology, called "critical success chains" (CSC), extends CSF to explicitly model the relationships between IS attributes, CSF, and organizational goals. Its use is expected to help managers to (1) consider a wider range of development ideas, (2) better balance important strategic, tactical, and operational systems in the development portfolio, (3) consider the full range of options to accomplish desired objectives, and (4) better optimize the allocation of resources for maintenance and small systems. We trace the development of CSF and make the case for extending it. In two case studies, one at Rutgers University and another at Digia, Inc., we demonstrate the use of CSC in planning. At Rutgers, we use CSC to observe employees' preferences for new systems features, to model the reasons why they think that the features are important to the firm, and to generate strategic IS project proposal ideas. At Digia, we use CSC to generate ideas for new financial services applications based on mobile communications technology for which Digia would be a part of the value chain. From our experience in the case studies, we define a practical procedure for data gathering and analysis to uncover and model CSC in the firm and to generate ideas for important IS projects.

Journal ArticleDOI
TL;DR: A causal model of meeting satisfaction derived from goal setting theory is presented and results of analysis using structural equation modeling indicate support for the model's integrity across both GSS and FTF groups.
Abstract: Collaborative technologies such as group support systems (GSS) are often developed to improve the effectiveness and efficiency of teams; however, the satisfaction users have with the processes and outcomes of the teamwork itself often determines the ultimate adoption and sustained use of collaborative technologies. Much of the research on teamwork has focused on meetings in particular and, consequently, satisfaction with the process and outcomes of meetings, referred to collectively as meeting satisfaction. Research on meeting satisfaction in GSS-supported groups has been equivocal, indicating the need for advancement in our theoretical understanding of the construct. To that end, this paper presents a causal model of meeting satisfaction derived from goal setting theory. The model is tested with an empirical study consisting of 15 GSS groups and 11 face-to-face (FTF) groups engaged in the "lost at sea" task. The results of analysis using structural equation modeling indicate support for the model's integrity across both GSS and FTF groups. Implications for researchers and practitioners are discussed, including how the model can be used to improve future research on the use of collaborative technology to support teamwork.

Journal ArticleDOI
TL;DR: This work presents a theoretical model in which usage of a collaborative system intervenes between teamwork quality and team performance for tasks that are supported by the system and not for unsupported tasks.
Abstract: Although team-based work systems are pervasive in the workplace, the use of collaborative systems designed to facilitate and support ongoing teamwork is a relatively recent development. An understanding of how teams embrace and use such collaborative systems - and the relationship of that usage to teamwork quality and team performance - is critical for organizational success. We present a theoretical model in which usage of a collaborative system intervenes between teamwork quality and team performance for tasks that are supported by the system. We empirically validate the model in a setting where established teams voluntarily used a collaborative system over a four-month period to perform tasks with measurable outcomes. Our principal finding is that collaborative system use intervenes between teamwork quality and performance for tasks supported by the system but not for unsupported tasks.

Journal ArticleDOI
TL;DR: An IS methodology for analysis of Internet-based QD consisting of three steps: elicitation; reduction through IS-facilitated selection, coding, and clustering; and visualization to provide at-a-glance understanding, finding that qualitative data analysis (QDA) accurately reflected film popularity.
Abstract: The volume of qualitative data (QD) available via the Internet is growing at an increasing pace, and firms are anxious to extract and understand users' thought processes, wants and needs, attitudes, and purchase intentions contained therein. An information systems (IS) methodology to meaningfully analyze this vast resource of QD could provide useful information, knowledge, or wisdom firms could use for a number of purposes, including new product development and quality improvement, target marketing, accurate "user-focused" profiling, and future sales prediction. In this paper, we present an IS methodology for analysis of Internet-based QD consisting of three steps: elicitation; reduction through IS-facilitated selection, coding, and clustering; and visualization to provide at-a-glance understanding. Outcomes include information (relationships), knowledge (patterns), and wisdom (principles) explained through visualizations and drill-down capabilities. We first present the generic methodology and then discuss an example that employs it to analyze free-form comments from potential consumers who viewed soon-to-be-released film trailers, illustrating how the methodology and tools can provide rich and meaningful affective, cognitive, contextual, and evaluative information, knowledge, and wisdom. The example revealed that qualitative data analysis (QDA) accurately reflected film popularity. QDA also provided a predictive measure of the relative magnitude of film popularity between the most popular film and the least popular one, based on actual first-week box office sales. The methodology and tools used in this preliminary study illustrate that value can be derived from analysis of Internet-based QD and suggest that further research in this area is warranted.
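As a rough illustration of the "reduction" step (selection, coding, clustering), the sketch below codes invented free-form comments against hypothetical keyword lists and clusters them by affect. The paper's actual tools and coding scheme are more sophisticated; everything here (keywords, comments, labels) is made up for the example.

```python
# Hypothetical coding scheme: keyword lists standing in for a
# proper affective coding step.
POSITIVE = {"great", "love", "amazing", "must-see"}
NEGATIVE = {"boring", "awful", "skip", "dull"}

def code_comment(text):
    """Assign an affective code to one comment based on keyword hits."""
    words = {w.strip(".,!?") for w in text.lower().split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def cluster(comments):
    """Group coded comments into clusters, one per affective code."""
    clusters = {"positive": [], "negative": [], "neutral": []}
    for comment in comments:
        clusters[code_comment(comment)].append(comment)
    return clusters

# Invented trailer comments standing in for elicited QD.
clusters = cluster([
    "Looks amazing, a must-see!",
    "Seems boring and dull",
    "Opens nationwide in June",
])
```

Counting the relative sizes of such clusters per film is the kind of aggregate that could then be visualized and compared against box office outcomes.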

Journal ArticleDOI
TL;DR: This paper examines how information systems (IS) executives respond to what has been termed organizing visions for IT, grand ideas for applying IT, the presence of which is typically announced by much "buzz" and hyperbole.
Abstract: Making sense of new information technology (IT) and the many buzzwords associated with it is by no means an easy task for executives. Yet doing so is crucial to making good innovation decisions. This paper examines how information systems (IS) executives respond to what has been termed organizing visions for IT, grand ideas for applying IT, the presence of which is typically announced by much "buzz" and hyperbole. Developed and promulgated in the wider interorganizational community, organizing visions play a central role in driving the innovation adoption and diffusion process. Familiar and recent examples include electronic commerce, data warehousing, and enterprise systems. A key aspect of an organizing vision is that it has a career. That is, even as it helps shape how IS managers think about the future of application and practice in their field, the organizing vision undertakes its own struggle to achieve ascendancy in the community. The present research explores this struggle, specifically probing how IS executives respond to visions that are in different career stages. Employing field interviews and a survey, the study identifies four dimensions of executive response focusing on a vision's interpretability, plausibility, importance, and discontinuity. Taking a comparative approach, the study offers several grounded conjectures concerning the career dynamics of organizing visions. For the IS executive, the findings help point the way to a more proactive, systematic, and critical stance toward innovations that can place the executive in a better position to make informed adoption decisions.

Journal ArticleDOI
TL;DR: The author has published more than 60 scholarly works on the theoretical foundations of collaboration and has applied his findings to the development and deployment of collaborative technology to enhance team productivity, team creativity, and team satisfaction.
Abstract: The author is Research Coordinator at the Center for the Management of Information at the University of Arizona and Associate Professor of Collaboration Engineering at Delft University of Technology in the Netherlands. He is also Director of Research and Development for GroupSystems.com. As a researcher, he has published more than 60 scholarly works on the theoretical foundations of collaboration, and he applies his findings to the development and deployment of collaborative technology to enhance team productivity, team creativity, and team satisfaction. His work on organizational transition to collaborative technology led to new insights about how to conceive of and deploy group support systems so as to create self-sustaining and growing communities of users. He received his Ph.D. from the University of Arizona in 1994.

Journal ArticleDOI
TL;DR: A model that can help companies to evaluate data currency, accuracy, and completeness in software architectures with different degrees of integration across channels and functionalities is provided and validated through simulation based on empirical data on financial information systems.
Abstract: Modern organizations offer services through multiple channels, such as branches, ATMs, telephones, and Internet sites, and are supported by multifunctional software architectures. Different functional modules share data, which are typically stored in multiple local databases. Functional modules are usually not integrated across channels, as channels are implemented at different times within independent software projects and are subject to varying requirements of availability and performance. This lack of channel and functional integration raises data quality problems that can impact the quality of the products and services of an organization. In particular, in complex systems in which data are managed in multiple databases, timeliness is critical. This paper focuses on time-related factors of data quality and provides a model that can help companies to evaluate data currency, accuracy, and completeness in software architectures with different degrees of integration across channels and functionalities. The model is validated through simulation based on empirical data on financial information systems. Results indicate how architectural choices on the degree of data integration have a varying impact on currency, accuracy, and completeness depending on the type of financial institution and on customer profiles.
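The paper's full model is not reproduced in the abstract, but its central notion of data currency across partially integrated channels can be illustrated with a minimal sketch (all record names and refresh times below are hypothetical): currency is simply the age of a record, and it diverges across local databases that are synchronized on different channel schedules.

```python
from datetime import datetime, timedelta

def currency(last_update: datetime, now: datetime) -> timedelta:
    """Currency of a record: elapsed time since its last refresh."""
    return now - last_update

# Two local databases hold copies of the same customer record but are
# refreshed on different channel schedules (hypothetical values).
now = datetime(2003, 6, 1, 12, 0)
branch_copy = datetime(2003, 6, 1, 9, 0)   # branch channel, intraday refresh
web_copy = datetime(2003, 5, 31, 12, 0)    # Internet channel, nightly batch

print(currency(branch_copy, now))  # 3:00:00
print(currency(web_copy, now))     # 1 day, 0:00:00
```

In a fully integrated architecture the two copies would collapse into one, making their currency identical; the model described in the abstract quantifies how such architectural choices shift currency, accuracy, and completeness.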

Journal ArticleDOI
TL;DR: The proposed framework calls for trainers to be continuously engaged with users and help refine their knowledge during the process of appropriation of the collaborative application, and suggests that theoretical foundations rooted in collective learning be adopted to guide training research in collaborative applications.
Abstract: Researchers have emphasized that existing training strategies must be modified in order to adequately prepare users to employ collaborative applications. We utilize findings from the vast amount of training research conducted thus far and point to some problems that might occur when existing strategies are applied to train users of collaborative applications. We test our ideas by conducting a longitudinal field study of a collaborative work flow application. As proposed in a recent knowledge-level framework, our findings indicate that training programs must not solely focus on developing users' system proficiency skills but must also educate users about the business processes that the collaborative application will support. This additional knowledge will enable users to deal with technology-induced changes in the business processes due to the deployment of the collaborative application. Furthermore, we find that training programs should sensitize users to the interdependencies that exist among their tasks and make them aware of the collective consequences of their individual actions. We also found that users have to engage in collective problem solving efforts and continuously learn new knowledge during the process of appropriation of the collaborative application. We propose a training framework that integrates these ideas to prepare users to make effective use of collaborative applications. The proposed framework calls for trainers to be continuously engaged with users and help refine their knowledge during the process of appropriation. We suggest that theoretical foundations rooted in collective learning be adopted to guide training research in collaborative applications.

Journal ArticleDOI
TL;DR: This study examines the potential applications of the rational expectations hypothesis (REH) and adaptive learning theory in IT investment and adoption decision-making, and demonstrates the efficacy of this theoretical perspective for characterizing the business value expectations formation process in IT adoption.
Abstract: This study examines the potential applications of the rational expectations hypothesis (REH) and adaptive learning theory in IT investment and adoption decision-making. Despite the fact that rationality is commonly assumed in economic analyses, the REH's assumptions make it a unique theory and allow us to offer new perspectives on IS/IT adoption and investment decision-making. Our application of these theoretical perspectives to the IT adoption context--the first time in the IS literature to our knowledge that REH has been used to examine the mechanism for business value expectations formation--will allow us to treat the investment and adoption issues using a perspective that is based on a longer time horizon. Such settings require managers, as economic agents, to form a set of expectations about the values of various variables related to the business value of IT. Rational expectations and adaptive learning assume that decision-makers are able to utilize all available decision-relevant information efficiently and can learn the true value of a prospective investment over time. We present a number of propositions that characterize this perspective, and discuss some illustrative examples that demonstrate the efficacy of the theoretical perspective that we present to characterize the business value expectations formation process in IT adoption.
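The adaptive-learning mechanism described above is, at its core, an error-correction update: the decision-maker revises an expectation toward each new observation of IT business value. A minimal sketch under assumed numbers (the gain and payoff values are hypothetical, not from the paper):

```python
def adaptive_expectation(prior: float, observed: float, gain: float) -> float:
    """Error-correction update: E_t = E_{t-1} + g * (v_t - E_{t-1})."""
    return prior + gain * (observed - prior)

# A manager's expected annual payoff from an IT investment, revised as
# realized values come in; with 0 < gain <= 1 the expectation converges
# to the true value, as adaptive learning assumes.
expectation = 100.0   # initial belief about business value
true_value = 140.0    # value the data eventually reveal
for _ in range(20):
    expectation = adaptive_expectation(expectation, true_value, gain=0.3)
print(round(expectation, 2))  # converges toward 140.0
```

Under rational expectations the manager would use all available information at once; the adaptive variant sketched here reaches the same point only gradually, which is why the paper's longer time horizon matters.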

Journal ArticleDOI
TL;DR: A conceptual model of data quality problem solving is built that finds that experienced practitioners solve data quality problems by reflecting on and explicating knowledge about contexts embedded in, or missing from, data.
Abstract: Motivated by the growing importance of data quality in data-intensive, global business environments and by burgeoning data quality activities, this study builds a conceptual model of data quality problem solving. The study analyzes data quality activities at five organizations via a five-year longitudinal study. The study finds that experienced practitioners solve data quality problems by reflecting on and explicating knowledge about contexts embedded in, or missing from, data. Specifically, these individuals investigate how data problems are framed, analyzed, and resolved throughout the entire information discourse. Their discourse on contexts of data, therefore, connects otherwise separately managed data processes, that is, collection, storage, and use. Practitioners' context-reflective mode of problem solving plays a pivotal role in crafting data quality rules. These practitioners break old rules and revise actionable dominant logic embedded in work routines as a strategy for crafting rules in data quality problem solving.

Journal ArticleDOI
TL;DR: It is submitted that firms are better off making efficiency-enhancing IT investments if the market in which they operate is more price sensitive than quality sensitive, and the effect of IT investments on productivity is contingent on market sensitivities to changes in the price and quality.
Abstract: Over the past two decades, numerous empirical studies have been conducted on the contribution of information technology (IT) to productivity and other measures of firm performance. However, few theoretical studies have attempted to explain the contingencies under which IT investments may or may not be valuable to a firm in a competitive market. This research proposes a duopoly competition model to study the impacts of IT investments on firm performance and productivity. We show that the extent to which a profit-maximizing firm benefits from IT investments is a function of, among other things, market sensitivities to the price and quality of the products and services offered by the firm and its competitor. We demonstrate that, under duopolistic competition, the effects of IT investments are not as deterministic as under monopolistic competition. We further show that the effects of IT investments on productivity, in a duopoly market, are contingent on market sensitivities to changes in the price and quality of products and services offered by the firm and its competitor, as well as on fixed and overhead costs being sufficiently large in relation to market size--an important condition in a monopoly market. In particular, price sensitivity has a positive effect on the impact of IT investments on productivity, whereas quality sensitivity has a negative effect. We submit that firms are better off making efficiency-enhancing IT investments if the market in which they operate is more price sensitive than quality sensitive.

Journal ArticleDOI
TL;DR: This paper formulates an analytical model to study the implications of maintaining different or incompatible technology standards in DVD and other optical disc players on global pricing and piracy of movie discs, and finds that maintaining separate technology standards is critical when there is piracy.
Abstract: Even though bandwidth on the Internet is limited, compression technologies have made online music piracy a foremost problem in intellectual copyright protection. However, due to the significantly larger sizes of video files, movies are still largely pirated by duplicating DVDs, VCDs, and other physical media. In the case of DVDs, movie studios have historically maintained different technology codes or formats across various regions of the world, primarily to control the timing of theatrical releases in those parts of the world. This paper formulates an analytical model to study the implications of maintaining different or incompatible technology standards in DVD and other optical disc players on global pricing and piracy of movie discs. Our formulation develops two distinct piracy types, namely, regional and global piracy, signifying whether consumers will pirate movies released for their own region or those meant for other regions. Our results show that maintaining separate technology standards is critical when there is piracy, as losses from global piracy can be higher than when only regional piracy exists. Further, we observe that piracy is not a victimless crime: not only do producers suffer losses, but consumers in regions with high willingness to pay for quality also stand to lose. In addition, we find that increasing homogeneity in consumer preferences for quality across regions may not be beneficial to digital product vendors unless there is also uniformity in copyright protection laws. We conclude with recommendations for research and practice for movie studios, as well as for producers of other goods that depend on copyright protection, such as books and pharmaceuticals.

Journal ArticleDOI
TL;DR: In this paper, the authors outline a technical approach to a corporate householding knowledge processor (CHKP) to solve a particularly important type of householding problem -entity aggregation.
Abstract: Advances in corporate householding are needed to address certain categories of data quality problems caused by data misinterpretation. In this paper, we first summarize some of these data quality problems and our more recent results from studying corporate householding applications and knowledge exploration. Then we outline a technical approach to a corporate householding knowledge processor (CHKP) to solve a particularly important type of corporate householding problem--entity aggregation. We illustrate the operation of the CHKP by using a motivational example in account consolidation. Our CHKP design and implementation uses and expands on the COntext INterchange (COIN) technology to manage and process corporate householding knowledge.
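The entity-aggregation problem behind account consolidation can be pictured as rolling balances up an ownership hierarchy to each ultimate parent. The following is an illustrative sketch only (entity names and balances are hypothetical, and the CHKP resolves such hierarchies with COIN context knowledge rather than a hard-coded table):

```python
# Hypothetical corporate household: who is owned by whom (None = top parent).
parent = {"AcmeEU": "AcmeCorp", "AcmeUS": "AcmeCorp", "AcmeCorp": None}
accounts = {"AcmeEU": 1000, "AcmeUS": 2500, "AcmeCorp": 400}

def ultimate_parent(entity: str) -> str:
    """Walk the ownership chain up to the top-level corporate parent."""
    while parent[entity] is not None:
        entity = parent[entity]
    return entity

# Entity aggregation: consolidate every account under its ultimate parent.
consolidated = {}
for entity, balance in accounts.items():
    root = ultimate_parent(entity)
    consolidated[root] = consolidated.get(root, 0) + balance

print(consolidated)  # {'AcmeCorp': 3900}
```

The hard part in practice, which the CHKP targets, is that the `parent` relation itself is context-dependent: whether a subsidiary, branch, or joint venture counts as "the same" company varies with the purpose of the aggregation.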

Journal ArticleDOI
TL;DR: The study's participants made attributions that were significantly more accurate than chance guessing, and factors that had a positive influence on attribution accuracy include evaluative tone of comments and amount of prior communication received from other group members.
Abstract: This study examines whether technically "anonymous" comments entered by participants during group support system (GSS) brainstorming sessions are, in fact, unidentifiable. Hypotheses are developed and tested about the influences of comment length, comment evaluative tone, duration of group membership, and prior communication among group members on the accuracy of attributions they made about the identity of the authors of these technically anonymous comments. Data on prior communication and group history for each of the 32 small groups were collected before participants began using a GSS for brainstorming. Immediately after the session, each member was asked to attribute authorship to a sample of the session's anonymous comments (comment authorship was known to the researchers). The study's participants made attributions that were significantly more accurate than chance guessing. Factors that had a positive influence on attribution accuracy include the evaluative tone of comments (especially humorous comments) and the amount of prior communication received from other group members. Vividness of comment tone and comment length were not significantly correlated with attribution accuracy. Although the attributions of anonymous comments were more accurate than expected by chance, most of the attributions were incorrect. Implications and consequences of both accurate and inaccurate attribution are discussed, along with suggestions for future research.
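Testing whether attribution accuracy beats chance is, in essence, a one-sided binomial test. A sketch under assumed numbers (the group size, trial count, and accuracy below are hypothetical, not the study's data):

```python
from math import comb

def p_at_least(successes: int, trials: int, p_chance: float) -> float:
    """One-sided P(X >= successes) for X ~ Binomial(trials, p_chance)."""
    return sum(
        comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
        for k in range(successes, trials + 1)
    )

# In a 5-person group, guessing among the 4 other members gives a
# chance rate of 0.25; suppose 40 of 100 attributions were correct.
p_value = p_at_least(40, 100, 0.25)
print(p_value < 0.01)  # True: accuracy significantly above chance
```

Note that "significantly above chance" and "mostly correct" are different claims: an accuracy of 40 percent soundly rejects the chance rate of 25 percent in this hypothetical, yet 60 percent of attributions are still wrong, mirroring the study's finding.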

Journal ArticleDOI
TL;DR: This research presents a novel and scalable approach to knowledge management called "Smart Knowledge Management," which automates the labor-intensive and therefore time-consuming and expensive process of manually cataloging and storing knowledge.
Abstract: Knowledge is recognized as an important weapon for sustaining competitive advantage and many companies are beginning to manage organizational knowledge. Researchers have investigated knowledge mana...