
Showing papers in "Journal of the Association for Information Systems in 2006"


Journal ArticleDOI
TL;DR: A method is described for evaluating ideas along four dimensions (novelty, workability, relevance, and specificity), with two measurable sub-dimensions identified for each of the four main dimensions.
Abstract: Researchers and practitioners have an abiding interest in improving tools and methods to support idea generation. In studies that go beyond merely enumerating ideas, researchers typically select one or more of the following three constructs, which are often operationalized as the dependent variable(s): 1) idea quality, 2) idea novelty, which is sometimes referred to as rarity or unusualness, and 3) idea creativity. It has been chronically problematic to compare findings across studies because these evaluation constructs have been variously defined and sampled in different ways. For example, some researchers term an idea ‘creative’ if it is novel, while others consider an idea to be creative only if it is also applicable, effective, and implementable. This paper examines 90 studies on creativity and idea generation. Within the creativity studies considered here, the novelty of ideas was always measured, but in some cases the ideas had to meet additional requirements to be considered creative. Some studies that examined idea quality also assessed novelty, while others measured different quality attributes, such as effectiveness and implementability, instead. This paper describes a method for evaluating ideas with regard to four dimensions—novelty, workability, relevance, and specificity—and identifies two measurable sub-dimensions for each of the four main dimensions. An action-research approach was used to develop ordinal scales anchored by clearly differentiable descriptions for each sub-dimension.
Confirmatory factor analysis revealed high loadings among the sub-dimensions that comprise each dimension as well as high discriminant validity between dimensions. Application of this method resulted in high inter-rater reliability even when the method was applied by different raters to different problems and to ideas produced by both manual methods and group support systems (GSS).
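The inter-rater reliability the authors report can be illustrated with a chance-corrected agreement statistic such as Cohen's kappa. This is a minimal sketch, not the paper's actual computation, and the rater scores are invented for illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from each rater's marginals.
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum((ca[k] / n) * (cb[k] / n) for k in set(ca) | set(cb))
    return (po - pe) / (1 - pe)

# Two raters scoring the same four ideas on a 1-3 ordinal novelty scale
# (hypothetical scores, not data from the study).
print(cohens_kappa([1, 1, 1, 2], [1, 1, 2, 2]))  # 0.5
```

Kappa of 1.0 indicates perfect agreement; values near 0 indicate agreement no better than chance.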

416 citations


Journal ArticleDOI
TL;DR: Presents the response rates reported in six well-regarded IS journals, summarizes how nonresponse error was estimated and handled in published IS research, and calculates its impact on confidence intervals.
Abstract: We believe IS researchers can and should do a better job of improving (assuring) the validity of their findings by minimizing nonresponse error. To demonstrate that there is, in fact, a problem, we first present the response rates reported in six well-regarded IS journals and summarize how nonresponse error was estimated and handled in published IS research. To illustrate how nonresponse error may bias findings in IS research, we calculate its impact on confidence intervals. After demonstrating the impact of nonresponse on research findings, we discuss three post hoc remedies and three preventative measures for the IS researcher to consider. The paper concludes with a general discussion about nonresponse and its implications for IS research practice. In our delimitations section, we suggest directions for further exploring external validity.
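The intuition behind nonresponse error can be sketched with worst-case bounds: when only part of a sample responds, the unknown answers of nonrespondents widen the range of plausible population values far beyond the sampling-based confidence interval. This is an illustrative sketch (Manski-style bounds for a proportion), not the paper's calculation:

```python
def nonresponse_bounds(observed_prop, response_rate):
    """Worst-case bounds on a population proportion when nonrespondents'
    answers are unknown: they could all be 0 or all be 1."""
    lower = observed_prop * response_rate
    upper = observed_prop * response_rate + (1 - response_rate)
    return lower, upper

# 60% of respondents answer "yes", but only 30% of the sample responded:
# the population proportion could lie anywhere in roughly [0.18, 0.88].
low, high = nonresponse_bounds(0.6, 0.3)
```

The width of this interval, 1 minus the response rate, shows directly why low response rates undermine the validity of survey findings.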

374 citations


Journal ArticleDOI
TL;DR: Extends previous models of e-commerce adoption by specifically assessing the impact that consumers' concerns for information privacy (CFIP) have on their willingness to engage in online transactions; results indicate that merchant familiarity does not moderate the relationship between CFIP and risk perceptions or between CFIP and trust.
Abstract: Although electronic commerce experts often cite privacy concerns as barriers to consumer electronic commerce, there is a lack of understanding about how these privacy concerns impact consumers' willingness to conduct transactions online. Therefore, the goal of this study is to extend previous models of e-commerce adoption by specifically assessing the impact that consumers' concerns for information privacy (CFIP) have on their willingness to engage in online transactions. To investigate this, we conducted surveys focusing on consumers’ willingness to transact with a well-known and a less well-known Web merchant. Results of the study indicate that concern for information privacy affects risk perceptions, trust, and willingness to transact for a well-known merchant, but not for a less well-known merchant. In addition, the results indicate that merchant familiarity does not moderate the relationship between CFIP and risk perceptions or CFIP and trust. Implications for researchers and practitioners are discussed.

352 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate the causal relationships between two constructs, perceived enjoyment (PE) and perceived ease of use (PEOU), within the nomological net of user technology acceptance.
Abstract: Identifying causal relationships is an important aspect of scientific inquiry. Causal relationships help us to infer, predict, and plan. This research investigates the causal relationships between two constructs, perceived enjoyment (PE) and perceived ease of use (PEOU), within the nomological net of user technology acceptance. PE has been theorized and empirically validated as either an antecedent or a consequence of PEOU. We believe that there are two reasons that account for this ambiguity: the conceptual coupling of PE and PEOU, and the limitations of covariance-based statistical methods. Accordingly, we approach this inconsistency by providing more theoretical reasoning and employing an alternative statistical method, namely Cohen’s path analysis. Specifically, as suggested by previous research on the difference between utilitarian and hedonic systems, we propose the conditional dominance of causal directions. Empirical results from two studies using different technologies and user samples support the theoretical claim that the PE→PEOU causal direction outweighs the PEOU→PE direction for utilitarian systems. This research makes both theoretical and methodological contributions. The approach applied in this research can be generalized to study causal relationships between conceptually coupled variables, which otherwise may be overlooked by confirmatory methods. We encourage researchers to pay attention to causal directions in addition to causal connectedness.

278 citations



Journal ArticleDOI
TL;DR: This paper advances six design theories (the conservative-deontological, liberal-intuitive, prima-facie, virtue, utilitarian and universalizability theories) and outlines the use of their distinctive application principles in guiding the application of IS security policies.
Abstract: The unpredictability of the business environment drives organizations to make rapid business decisions with little preparation. Exploiting sudden business opportunities may require a temporary violation of predefined information systems (IS) security policies. Existing research on IS security policies pays little attention to how such exceptional situations should be handled. We argue that normative theories from philosophy offer insights on how such situations can be resolved. Accordingly, this paper advances six design theories (the conservative-deontological, liberal-intuitive, prima-facie, virtue, utilitarian and universalizability theories) and outlines the use of their distinctive application principles in guiding the application of IS security policies. Based on the testable design product hypotheses of the six design theories, we derive a theoretical model to explain the influence of the different normative theories on the “success” of IS security policies and guidelines.

156 citations


Journal ArticleDOI
TL;DR: What it means to be an informed IS researcher is considered by focusing attention on theory adaptation in IS research, and four recommendations for theory adaptation are offered.
Abstract: In this paper we consider what it means to be an informed IS researcher by focusing attention on theory adaptation in IS research. The basic question we seek to address is: “When one borrows theory from another discipline, what are the issues that one must consider?” After examining the role of theory in IS research, we focus on escalation theory applied to IS projects as an example. In doing so, we seek to generate increased awareness of the issues that one might consider when adapting theories from other domains to research in our field. This increased awareness may then translate to a more informed use of theories in IS. Following a self-reflexive tale of how and why escalation theory was adopted to IS research, we offer four recommendations for theory adaptation: (1) consider the fit between selected theory and phenomenon of interest, (2) consider the theory’s historical context, (3) consider how the theory impacts the choice of research method, and (4) consider the contribution of theorizing to cumulative theory.

152 citations



Journal ArticleDOI
TL;DR: A quantitative analysis of over 72,600 citations spread across 1406 IS articles in 16 journals over the period 1990-2003 reveals a distinct trend toward a cumulative tradition, a changing mix of reference disciplines, and a two-way relationship between IS and some of the more mature disciplines.
Abstract: For the past two decades notions of “cumulative tradition” and “reference disciplines” have been a significant part of the introspective debates on the IS field. We provide an exploratory test of these notions using sociometric analysis. In doing so, we extend the work of Culnan and Swanson, originally carried out about 25 years ago. By using the concept of a “work point” and “reference points” to identify where an IS article is published and the extent to which it draws from or contributes to other disciplines, we can position research in the IS field. First, a quantitative analysis of over 72,600 citations spread across 1,406 IS articles in 16 journals over the period 1990-2003 reveals a distinct trend toward a cumulative tradition, a changing mix of reference disciplines, and a two-way relationship between IS and some of the more mature disciplines. Second, post-hoc content analysis provides a glimpse of how IS work is being utilized by other disciplines. Overall, our analysis indicates that IS is taking up a more socio-technical persona, building upon its own knowledge base, and repaying its debts by contributing to other disciplines. We interpret the movement towards building a cumulative tradition, and informing work in other disciplines, as positive, as we strive toward being part of an intellectual network and establishing centrality in areas that matter to us most.
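The core tallying step of such a citation analysis — computing each reference discipline's share of an article set's citations — can be sketched in a few lines. The discipline labels here are invented for illustration, not the study's data:

```python
from collections import Counter

# Hypothetical tags: each citation in a set of IS articles labeled with
# the discipline of the cited work.
citations = ["IS", "IS", "IS", "Management", "Computer Science", "Economics"]

counts = Counter(citations)
total = sum(counts.values())
# Share of all citations going to each reference discipline.
shares = {discipline: count / total for discipline, count in counts.items()}
print(shares["IS"])  # 0.5
```

Tracking such shares year by year is one simple way to observe a growing cumulative tradition (a rising within-IS share) or a changing mix of reference disciplines.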

128 citations


Journal ArticleDOI
TL;DR: This paper seeks to clarify the multi-dimensional notion of flexibility in outsourcing by examining robustness, modifiability, new capability, and ease of exit, and develops a framework to classify existing practices in managing outsourcing flexibility.
Abstract: In recent years, outsourcing has gained considerable management attention. However, the benefits of outsourcing are not without concessions. One major risk is losing the flexibility to change the extent, nature, or scope of the outsourced business services, and such flexibility is strategically imperative in today’s dynamic business environment. This paper seeks to clarify the multi-dimensional notion of flexibility in outsourcing by examining robustness, modifiability, new capability, and ease of exit. Adapting from Evans (1991), we also develop a framework to classify existing practices in managing outsourcing flexibility. We go beyond contractual provision to surface a portfolio of pre-emptive, protective, exploitive, and corrective maneuvers. These strategic maneuvers map well to traditional notions in coordination theory, both in advanced structuring through loose coupling and dependency diversification, and in dynamic adjustment through proactive sensing and reactive adapting. We put forward a set of propositions hypothesizing the relationships between the various strategic maneuvers and the different dimensions of outsourcing flexibility, and discuss the moderating impact of such maneuvers on outsourcing success. We hope the greater conceptual clarity will not only contribute to the effectiveness of outsourcing management but also spawn a new research agenda on outsourcing flexibility.

127 citations


Journal ArticleDOI
TL;DR: In this article, the main dangers of offshoring are the loss of possibly-important business skills and reliance on remote suppliers who face risks that are unfamiliar to the client firm.
Abstract: Offshore provision of IS/IT related services has been growing rapidly in recent years and seems firmly set to continue. This trend is fueled by the many advantages of offshore service procurement; however, there are dangers in this practice. Furthermore, offshoring requires adaptation of the IS function and IS management. This, in turn, suggests the need for modifications of IS curricula in order to prepare graduates for the new environment. The advantages of offshoring are those of outsourcing in general: cost savings and allowing the organization to focus on its core activities. The main dangers include loss of possibly-important business skills and reliance on remote suppliers who face risks that are unfamiliar to the client firm. The loss of jobs due to offshoring also introduces political considerations. Offshore IS activities are generally the responsibility of an organization's CIO. This management responsibility requires awareness of cultural and legal differences and of risks associated with offshoring and outsourcing in general. Offshoring also has an effect on job opportunities for graduates of IS programs.


Journal ArticleDOI
TL;DR: A survey of IS project membership provides the data, which indicates that partnering significantly relates to higher user support, less residual risk, and better project performance.
Abstract: Information system software development projects suffer from a high failure rate. One of many obstacles faced by project managers is non-supportive users, those not actively sharing in development responsibilities. The coordination activity of early partnering has been proposed in the literature to promote collaboration and enhance user support. The extent of partnering is considered in a model that relates partnering to the risks of user non-support and eventual project success. The model is developed from contingency theory, with residual performance risk as an intermediary variable. A survey of IS project membership provides the data, which indicates that partnering significantly relates to higher user support, less residual risk, and better project performance. Researchers may use variations on the model to examine other barriers to success and the techniques applied to lower the barriers. Practitioners should consider applying partnering techniques to improve software development project performance.

Journal ArticleDOI
TL;DR: A unified theoretical framework is developed to explain the role IT plays in affecting market information, transparency, and market structure, suggesting that IT alone does not explain the move to transparent electronic markets; enhanced electronic representation of products and competitive and institutional forces also play an important role.
Abstract: With the advent of the Internet, we have seen existing markets transform and new ones emerge. We contribute to the understanding of this phenomenon by developing a unified theory about the role that IT plays in affecting market information, transparency and market structure. In particular, we introduce a new theoretical framework which uncovers the process and the forces that, together with IT, facilitate or inhibit the emerging dominance of transparent electronic markets. Transparent electronic markets offer unbiased, complete, and accurate market information. Our effort to develop a unified theoretical framework begins with a thorough assessment of the prior literature. It also uses an inductive approach involving the case study method, in which we contrast and compare the forces that have led the air travel and financial securities markets to become increasingly transparent. Building on the electronic markets and electronic hierarchies research of Malone, Yates and Benjamin (1987), our findings suggest that IT alone does not explain a move to transparent electronic markets. Instead, we argue that enhanced electronic representation of products, and competitive and institutional forces have also played an important role in the process by which most sellers have come to favor transparent markets.

Journal ArticleDOI
TL;DR: In this paper, an exploratory study of model variations was conducted, in which empirically obtained model variations were qualitatively analyzed and classified into variation types, and two ontology-based modeling frameworks were evaluated in order to evaluate their potential contribution to a reduction in variations.
Abstract: Conceptual models are aimed at providing formal representations of a domain. They are mainly used for the purpose of understanding and communicating about requirements for information systems. Conceptual modeling has acquired a large body of research dealing with the semantics of modeling constructs, with the goal to make models better vehicles for understanding and communication. However, it is commonly known that different people construct different models of a given domain although all may be similarly adequate. The premise of this paper is that variations in models reflect vagueness in the criteria for deciding how to map reality into modeling constructs. Exploring model variations as such can contribute to research that deals with the semantics of modeling constructs. This paper reports an exploratory study in which empirically obtained model variations were qualitatively analyzed and classified into variation types. In light of the identified variation types, we analyzed two ontology-based modeling frameworks in order to evaluate their potential contribution to a reduction in variations. Our analysis suggests that such frameworks may contribute to more conclusive modeling decision making, thus reducing variations. However, since there is no complete consistency between the two frameworks, in order to reduce variations, a single framework should be systematically applied.

Journal ArticleDOI
TL;DR: It is shown that it is possible to match the social attributes of a technological artifact with those of the user, and the specific ways in which technology design can manifest social attributes are described.
Abstract: This research proposes that technological artifacts are perceived as social actors, and that users can attribute personality and behavioral traits to them. These formed perceptions interact with the user’s own characteristics to construct an evaluation of the similarity between the user and the technological artifact. Such perceptions of similarity are important because individuals tend to more positively evaluate others, in this case technological artifacts, to whom they are more similar. Using an automated shopping assistant as one type of technological artifact, we investigate two types of perceived similarity between the customer and the artifact: perceived personality similarity and perceived behavioral similarity. We then investigate how design characteristics drive a customer’s perceptions of these similarities and, importantly, the bases for those design characteristics. Decisional guidance and speech act theory provide the basis for personality manifestation, while normative versus heuristic-based decision rules provide the basis for behavioral manifestation. We apply these design bases in an experiment. The results demonstrate that IT design characteristics can be used to manifest desired personalities and behaviors in a technological artifact. Moreover, these manifestations of personality and behavior interact with the customer’s own personality and behaviors to create matching perceptions of personality and behavioral similarity between the customer and the artifact. This study emphasizes the need to consider technological artifacts as social actors and describes the specific ways in which technology design can manifest social attributes. In doing so, we show that it is possible to match the social attributes of a technological artifact with those of the user.

Journal ArticleDOI
TL;DR: Design theory is used to develop a SIS design theory framework that defines six requirements for SIS design methods, and it is shown how known SIS design methods fail to satisfy these requirements.
Abstract: Many alternative methods for designing secure information systems (SIS) have been proposed to ensure system security. However, within all the literature on SIS methods, there exists little theoretically grounded work that addresses the fundamental requirements and goals of SIS design. This paper first uses design theory to develop a SIS design theory framework that defines six requirements for SIS design methods, and second, shows how known SIS design methods fail to satisfy these requirements. Third, the paper describes a SIS design method that does address these requirements and reports two empirical studies that demonstrate the validity of the proposed framework.

Journal ArticleDOI
TL;DR: While the two sets of authors agree on certain specific empirical results and a number of critical concepts, such as the basic point that the field is evolving into a mature discipline, they disagree on whether IS is influencing other fields.
Abstract: While the two sets of authors agree on certain, specific empirical results and a number of critical concepts, such as the basic point that the field is evolving into a mature discipline, they disagree on whether IS is influencing other fields. What is fascinating to scholars intrigued by the domain of scientometrics—that is, the study of the scientific process itself—is that the groups could have come to such dramatically different conclusions using large, similar datasets of roughly 70,000 articles.

Journal ArticleDOI
TL;DR: This study proposes that collaborative exchange and integration of explicit knowledge across phases of the development process positively influence the performance of systems development and suggests that process formalization moderates the performance effects of the knowledge integration factors.
Abstract: Systems development processes have received significant negative publicity due to failed projects, often at large costs, and performance issues that continue to plague IS managers. This study complements existing systems development research by proposing a knowledge management perspective for managing tacit and explicit knowledge in the systems development process. Specifically, it proposes that collaborative exchange and integration of explicit knowledge across phases of the development process positively influence the performance of systems development. It also suggests that process formalization not only directly impacts development performance but also moderates the performance effects of the knowledge integration factors. Data for the empirical study were collected from 60 organizations that are part of a user group for one of the world’s largest software development tool vendors.

Journal ArticleDOI
TL;DR: It is found that education level is a significant predictor of one’s likelihood to make mistakes, suggesting that existing social inequalities translate into differences in online behavior.
Abstract: A refined approach to digital inequality requires that in addition to looking at differences in access statistics we also must examine differences among Internet users. People encounter numerous hurdles during their online information-seeking behavior. In this paper, I focus on the likelihood that Internet users will make spelling or typographical mistakes during their online activities. Information seeking on the Web often requires users to type text into forms. Users sometimes make mistakes, which can hinder their browsing efficiency because they may get detoured to irrelevant sources or encounter errors. I draw on data collected from in-person observations with a diverse sample of 100 Internet users to see what explains their tendency to make spelling and typographical mistakes and the frequency with which they make such errors. I find that education level is a significant predictor of one’s likelihood to make mistakes, suggesting that existing social inequalities translate into differences in online behavior.

Journal ArticleDOI
TL;DR: A more fine-grained analysis of plagiarism is needed, in order to distinguish copying that is harmful to the intellectual process from that which is important to it.
Abstract: The unattributed incorporation of the work of others into an academic publication is widely regarded as seriously inappropriate behavior. Yet imitation is fundamental to many things that people do, even in academic disciplines. This paper examines the range of activities in which academics engage, including a detailed study of the authoring of textbooks. It concludes that a more fine-grained analysis of plagiarism is needed, in order to distinguish copying that is harmful to the intellectual process from that which is important to it.

Journal ArticleDOI
TL;DR: A conceptual model with insights obtained from literatures on the technology acceptance model (TAM), the economics of intermediation, and transaction cost analysis (TCA) posits that infomediaries offer two major types of utilitarian benefits to online customers: namely, perceived efficiency and perceived effectiveness.
Abstract: The emergence of infomediaries — which allow online consumers to search for, and provide comparisons among, many online retailers — is a prominent trend in ecommerce. However, little research has been done on consumer reactions to this new ecommerce tool. To explain why and how online shoppers adopt a new infomediary website, this study proposes a conceptual model with insights obtained from literatures on the technology acceptance model (TAM), the economics of intermediation, and transaction cost analysis (TCA). Infomediaries provide powerful search capabilities to online shoppers to provide them with a list of potential retailers (efficiency benefits), and then provide information to aid in selecting from this list of retailers (effectiveness benefits). Accordingly, the proposed model posits that infomediaries offer two major types of utilitarian benefits to online customers: namely, perceived efficiency and perceived effectiveness. In addition, the model predicts that one’s willingness to adopt an infomediary is a function of his/her evaluation of the two types of utilitarian benefits of using the infomediary, which are in turn determined by the subjective interpretation of his/her e-commerce transaction environment. The model was tested using data collected from an online questionnaire administered to 367 online shoppers. Online shoppers’ intention to use the infomediary was found to be a function of the two types of utilitarian benefits and perceived ease of use. In addition, our findings suggest that online shoppers who are low on asset specificity (e.g., consumers who have not made a high

Journal ArticleDOI
Ron Weber1
TL;DR: The logic-of-the-core arguments made by Lyytinen and King (2004) are examined and their validity evaluated, in contrast to nature-of-the-discipline commentaries, which are based on idiosyncratic views that are difficult to either justify or refute.
Abstract: Papers published about the need for a theoretical core in the information systems (IS) discipline can be characterized as either nature-of-the-discipline commentaries or logic-of-the-core commentaries. The former articulate the authors' views on those phenomena that research in the IS discipline ought to investigate. The latter scrutinize some of the logic that underlies arguments made by those who either support or reject the need for a theoretical core. Unfortunately, nature-of-the-discipline commentaries are unlikely to help clarify or resolve fundamental issues that underpin the debate. Too often they are based on idiosyncratic views that are difficult to either justify or refute. Logic-of-the-core commentaries, however, lay bare the arguments made by the protagonists so they can be evaluated. In this paper, I examine the logic-of-the-core arguments made by Lyytinen and King (2004) and evaluate their validity.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the effect of ontologically clearer logical models compared to traditional logical models on query performance and found that the benefits of ontological clarity at the conceptual level may translate into similar benefits when querying ontologically clear logical models.
Abstract: End users respond to stakeholders' information requests by using query tools to retrieve information from their organizations' data stores. The structure of these data stores impacts end users' performance, e.g., the accuracy of their responses. Ontologically clearer conceptual models have been shown to facilitate better problem solving within real-world application domains. If, however, ontologically clearer conceptual models are directly transformed into implementation (logical) data models, the differences in the number of entities and relationships may cause cognitive issues for end users that are likely to affect their query performance. This paper reports the results of an experiment that investigated the effect on query performance of more traditional logical models compared to ontologically clearer logical models. Results indicate that end users of the ontologically clearer implementation made fewer semantic errors overall. Thus, the benefits of ontological clarity at the conceptual level may translate into similar benefits when querying ontologically clearer logical models. Unfortunately, an examination of the specific types of errors that were made indicated that the benefits are not clear cut. While the removal of optional attributes and relationships led to an overall reduction in the number of errors, closer analyses show that some types of errors (involving projection and restriction) decreased as expected, while other types of errors (involving joins) increased.

Journal ArticleDOI
TL;DR: The results indicate that the virtual and physical facilitating conditions of a public computer are determinants of e-commerce use in a public environment, and the user’s need for privacy moderates these relationships.
Abstract: Organizations and governments continue to advance toward using electronic means to interact with their customers. However, the use of this medium presents an access-to-service issue for people across the digital divide who do not have private Internet access from their homes. Publicly-available computers connected to the Internet are an important and expanding source of Internet access for consumers. Still, we do not know if people are willing to engage in e-commerce transactions in such environments. We expand the Facilitating Conditions construct of Triandis’ (1980) modified theory of reasoned action to develop a model of transactional Web site use in public environments that incorporates the physical and virtual computer environments associated with publicly accessible computers, moderated by the individual’s need for privacy. The model was tested in public libraries, and the results indicate that the virtual and physical facilitating conditions of a public computer are determinants of e-commerce use in a public environment, and the user’s need for privacy moderates these relationships.

Journal ArticleDOI
TL;DR: The practical problems in identifying a theoretical core are examined, the ontological connection between identity and legitimacy is clarified, and suggestions are offered for improving the workability of efforts to legitimize the IS field.
Abstract: We respond to Ron Weber’s commentaries regarding the necessity of a theoretical core in achieving academic legitimacy for the IS field. We examine the practical problems in identifying a theoretical core, clarify the ontological connection between identity and legitimacy, acknowledge mistakes in our earlier formulation criticizing the necessity of theory in legitimation, and attempt a synthesis between our views and those of Weber. The paper concludes with suggestions for improving the workability of efforts to improve the legitimacy of the IS field.

Journal ArticleDOI
TL;DR: A framework based on this Evolutionary Information-Processing Theory is developed to aid practitioners in IS design and offers promise in the better design of IT to improve knowledge creation performance.
Abstract: Information Systems (IS) research on knowledge creation has not adequately accounted for the evolutionary nature of knowledge. Research limitations also exist in depicting the roles of information in the knowledge creation process. These two problems present difficulties for practitioners when attempting to successfully implement Information Technology (IT) to facilitate knowledge creation. Based on a problem-solving paradigm, this research analyzes knowledge creation from both the evolutionary and information-processing perspectives. The resultant theory outlines a process whereby tentative knowledge is generated from varied existing knowledge and applied to a problem, producing information to test the extent to which the problem can be solved. An iterative process continues until the tentative knowledge with the highest potential to solve the problem is found, yielding the information to best meet the goal. This process is further embedded in an organization-wide problem-solving hierarchy where new knowledge is developed via the integration of knowledge elements of sub-problems. By incorporating the evolutionary nature of knowledge, this research provides a deeper understanding of the knowledge creation process and the key determinants of its success. More importantly, by clearly specifying the roles of information in the process, it offers promise in the better design of IT to improve knowledge creation performance. We develop a framework based on this Evolutionary Information-Processing Theory to aid practitioners in IS design.
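The generate-and-test cycle the abstract outlines can be made concrete with a small sketch. This is an illustrative reading of the theory, not the authors' formalization: candidate "tentative knowledge" is produced by recombining existing knowledge elements, each candidate is applied to the problem, and the resulting information (a fitness score) determines which candidate is retained.

```python
from itertools import combinations

def knowledge_creation_loop(existing_knowledge, evaluate):
    """Illustrative sketch of the evolutionary generate-and-test process:
    form tentative knowledge by recombining existing elements, apply each
    candidate to the problem, and keep the one whose feedback information
    indicates the problem is best solved."""
    best_candidate, best_score = None, float("-inf")
    for candidate in combinations(existing_knowledge, 2):
        # Applying the candidate produces information: the extent to
        # which this tentative knowledge solves the problem.
        score = evaluate(candidate)
        if score > best_score:
            best_candidate, best_score = candidate, score
    return best_candidate, best_score
```

In the theory, `evaluate` stands in for applying tentative knowledge to a (sub-)problem; in an organization-wide hierarchy, the retained solutions to sub-problems would themselves become knowledge elements for the parent problem.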

Journal ArticleDOI
TL;DR: In the authors' pooled time-series models of 58 developing nations over the 1995-2000 time period, it is found that both structural conduciveness and globalization shape the distribution and growth of Internet usage.
Abstract: The growing perception that the Internet is becoming an engine of global economic and social change has inspired both governments and intergovernmental agencies to accelerate the diffusion of the Internet around the globe via multimillion dollar programs and initiatives. Unfortunately, few empirical studies guide these initiatives. The purpose of this research is to investigate the causes that drive Internet capacity, with special emphasis on diffusion theory. Global diffusion of IT requires some degree of structural conduciveness (similarities between developed and developing countries in economic, political, and social structures) as well as contact with developed countries. In our pooled time-series models of 58 developing nations over the 1995-2000 time period, we find that both structural conduciveness (i.e., teledensity, service economies, political openness, and global urban share) and globalization (i.e., aid share, tourist share, foreign investment share, and trade share) shape the distribution and growth of Internet usage.
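The pooled time-series design described above can be sketched as a pooled OLS regression: country-year observations are stacked into one sample and a single coefficient vector is estimated. The sketch below uses synthetic data and illustrative variable names (`teledensity`, `trade_share`); it is not the authors' dataset or specification.

```python
import numpy as np

# Hypothetical sketch of a pooled cross-section/time-series regression.
# 58 developing nations observed over 1995-2000, stacked into one sample.
rng = np.random.default_rng(42)
n_countries, n_years = 58, 6
n = n_countries * n_years

# Illustrative regressors: one structural-conduciveness proxy and one
# globalization proxy; true coefficients are chosen for the simulation.
teledensity = rng.uniform(0, 30, n)
trade_share = rng.uniform(0, 80, n)
internet_use = 0.5 + 0.12 * teledensity + 0.03 * trade_share + rng.normal(0, 0.5, n)

# Pooled OLS: all country-years share one set of coefficients.
X = np.column_stack([np.ones(n), teledensity, trade_share])
beta, *_ = np.linalg.lstsq(X, internet_use, rcond=None)
```

A published analysis of this kind would typically add year dummies or country effects and robust standard errors; the point here is only the pooling of panel observations into a single estimation.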

Journal ArticleDOI
TL;DR: A critique of Wade et al.’s analysis, assessing differing interpretations of "reference discipline," the consequences of including highly related disciplines in citation analysis, and the sensitivity of the results to the inclusion and exclusion of certain journals.
Abstract: Two articles published in this issue (Wade et al. and ours) reach contrasting conclusions, through similar analyses, on whether Information Systems, as a field, is evolving toward a reference discipline. In this article, we provide a critique of Wade et al. We first assess our different interpretations of reference discipline, and then discuss the consequences of including highly related disciplines in citation analysis. Finally, we illustrate the sensitivity of Wade et al.’s results to the inclusion and exclusion of certain journals. We also consider potential interpretations of second-degree citations. It is hoped that the arguments presented here reconcile the differences as we collectively advance thinking on the state of IS as a reference discipline.

Journal ArticleDOI
TL;DR: The findings are negative and the mood of the paper is humbling and critical, in contrast to the positive conclusions and upbeat tone of Grover et al.
Abstract: Readers could be confused after reading our paper (Wade et al.) and the Grover et al. paper—both in this issue—one right after the other. Both papers examine a similar topic using a similar methodology on a similar dataset over a similar time period. Yet, we come to very different conclusions. Grover et al.’s conclusions are positive and its tone is congratulatory and upbeat. By contrast, our findings are negative, and the mood of our paper is humbling and critical. After reading Grover et al. you may feel like reaching for a glass of champagne, while after reading ours, you are more likely to reach for an aspirin!