
Showing papers in "Journal of the Association for Information Systems in 2007"


Journal ArticleDOI
TL;DR: In this article, the author presents a critique of the technology acceptance model (TAM), points to specific remedies in each case, and presents a model intended to provide a foundation for a paradigm shift.
Abstract: This article presents a critique of a number of shortcomings with the technology acceptance model (TAM) and points to specific remedies in each case. In addition, I present a model for the purposes of providing a foundation for a paradigm shift. The model consists first of a decision making core (goal desire → goal intention → action desire → action intention) that is grounded in basic decision making variables/processes of a universal nature. The decision core also contains a mechanism for self-regulation that moderates the effects of desires on intentions. Second, added to the decision making core are a number of causes and effects of decisions and self-regulatory reasoning, with the aim of introducing potential contingent, contextual nuances for understanding decision making. Many of the causal variables here are contained within TAM or its extensions; also considered are new variables grounded in emotional, group/social/cultural, and goal-directed behavior research.

1,775 citations


Journal ArticleDOI
TL;DR: The present commentary discusses concerns about the intense focus on TAM, speculates on the possible contributions to the current state of affairs, and makes several suggestions to alleviate the problems associated with TAM and to advance IT adoption research to the next stage.
Abstract: The Technology Acceptance Model (TAM) is one of the most influential theories in Information Systems. However, despite the model's significant contributions, the intense focus on TAM has diverted researchers' attention away from other important research issues and has created an illusion of progress in knowledge accumulation. Furthermore, the independent attempts by several researchers to expand TAM in order to adapt it to constantly changing IT environments have led to a state of theoretical chaos and confusion in which it is not clear which version of the many iterations of TAM is the commonly accepted one. The present commentary discusses these concerns, speculates on the possible contributions to the current state of affairs, and makes several suggestions to alleviate the problems associated with TAM and to advance IT adoption research to the next stage.

1,275 citations


Journal ArticleDOI
TL;DR: This essay aims to extend the work of Walls, Widmeyer and El Sawy (1992) on the specification of information systems design theories (ISDT), drawing on other streams of thought on design research and theory to provide a basis for a more systematic and usable formulation of these theories.
Abstract: Design work and design knowledge in Information Systems (IS) is important for both research and practice. Yet there has been comparatively little critical attention paid to the problem of specifying design theory so that it can be communicated, justified, and developed cumulatively. In this essay we focus on the structural components or anatomy of design theories in IS as a special class of theory. In doing so, we aim to extend the work of Walls, Widmeyer and El Sawy (1992) on the specification of information systems design theories (ISDT), drawing on other streams of thought on design research and theory to provide a basis for a more systematic and usable formulation of these theories. We identify eight separate components of design theories: (1) purpose and scope, (2) constructs, (3) principles of form and function, (4) artifact mutability, (5) testable propositions, (6) justificatory knowledge (kernel theories), (7) principles of implementation, and (8) an expository instantiation. This specification includes components missing in the Walls et al. adaptation of Dubin (1978) and Simon (1969) and also addresses explicitly problems associated with the role of instantiations and the specification of design theories for methodologies and interventions as well as for products and applications. The essay is significant as the unambiguous establishment of design knowledge as theory gives a sounder base for arguments for the rigor and legitimacy of IS as an applied discipline and for its continuing progress. A craft can proceed with the copying of one example of a design artifact by one artisan after another. A discipline cannot.

1,272 citations


Journal ArticleDOI
TL;DR: This paper compares the progress in the area of technology adoption with two widely-researched streams in psychology and organizational behavior, the theory of planned behavior and job satisfaction, and concludes that there has been excellent progress in technology adoption research.
Abstract: Research on individual-level technology adoption is one of the most mature streams of information systems (IS) research. In this paper, we compare the progress in the area of technology adoption with two widely-researched streams in psychology and organizational behavior: theory of planned behavior and job satisfaction. In addition to gauging the progress in technology adoption research, this allows us to identify some fruitful areas for future research. Based on our comparison, we conclude that there has been excellent progress in technology adoption research. However, as a next step, we call for research focused on interventions, contingencies, and alternative theoretical perspectives (relative to the largely social psychology-based technology adoption research). Also, we believe it would be important to use the comparisons discussed here as a basis to develop a framework-driven set of future research directions to guide further work in this area.

654 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present the results of a study of user behavioral intention toward protective technologies based on the theory of planned behavior, finding that awareness of the threats posed by negative technologies is a strong predictor of user behavioral intention toward the use of protective technologies.
Abstract: While there is a rich body of literature on user acceptance of technologies with positive outcomes, little is known about user behavior toward what we call protective technologies: information technologies that protect data and systems from disturbances such as viruses, unauthorized access, disruptions, spyware, and others. In this paper, we present the results of a study of user behavioral intention toward protective technologies based on the framework of the theory of planned behavior. We find that awareness of the threats posed by negative technologies is a strong predictor of user behavioral intention toward the use of protective technologies. More interestingly, in the presence of awareness, the influence of subjective norm on individual behavioral intention is weaker among basic technology users but stronger among advanced technology users. Furthermore, while our results are consistent with many of the previously established relationships in the context of positive technologies, we find that the determinants "perceived ease of use" and "computer self-efficacy" are no longer significant in the context of protective technologies. We believe that this result highlights the most significant difference between positive technologies and protective technologies: while the former are used for their designed utilities, for which usefulness and ease of use have a significant impact, the latter are used out of fear of negative consequences, for which awareness becomes a key determinant. We discuss the theoretical and practical implications of these findings. The findings of this study extend the theory of planned behavior to the context of protective technologies and shed insights on designing effective information security policies, practices, and protective technologies for organizations and society.

360 citations



Journal ArticleDOI
TL;DR: The results suggest that the common practices of instrument validation and reuse of long-standing instruments to measure CSE may not be the most effective approach to the study of the construct.
Abstract: This paper reports an empirical study intended to provide detailed comparisons amongst and between the varieties of available measures of computer self-efficacy (CSE). Our purpose is to ascertain their relative abilities to isolate the CSE construct from other related constructs and to capture variance in performance attributed to changes in CSE level. In addition, we investigate the importance of ensuring the measure being used is sufficiently aligned with the task domain of interest. Finally, we explore the stability of CSE measures as they relate to the current state of evolution within the computing domain. Marakas, Yi, and Johnson (1998) proposed a framework for the construction of instruments intended to measure the CSE construct that we have adopted as a basis for this series of investigations. To that end, we advance and test a set of hypotheses derived from the Marakas et al. (1998) framework. Results of the analyses support the need for adherence to the tenets of the proposed framework as well as provide evidence that CSE measures suffer from degradation of their explanatory power over time. Further, this study brings forth the importance of appropriately validating measures of CSE using approaches intended for a formative rather than a reflective construct. These results suggest that the common practices of instrument validation and reuse of long-standing instruments to measure CSE may not be the most effective approach to the study of the construct. Implications for future research are discussed.

304 citations


Journal ArticleDOI
TL;DR: Analysis of cross-sectional data collected from 293 IT managers generally corroborates the hypothesized relationships, showing that the technical and behavioral capabilities of IT personnel have a positive effect on infrastructure capabilities and that the effect of infrastructure capabilities on IT-dependent strategic agility is direct, as well as mediated by IT-dependent system and information agility.
Abstract: This study develops a research model of how the technical, behavioral, and business capabilities of IT personnel are associated with IT infrastructure capabilities, and how the latter are associated with IT-dependent organizational agility, which is conceptualized as comprising IT-dependent system, information, and strategic agility. Analysis of cross-sectional data collected from 293 IT managers generally corroborates the hypothesized relationships, showing that the technical and behavioral capabilities of IT personnel have a positive effect on infrastructure capabilities. The analysis also provides evidence that the effect of infrastructure capabilities on IT-dependent strategic agility is direct, as well as mediated by IT-dependent system and information agility. The validity of the findings is strengthened by demonstrating that the hypothesized research model fits the data better than two alternative theoretically-anchored models describing different relationships between the same constructs. This study advances understanding of the interrelationships between two major subsets of IT capabilities, and their relationships with the agility afforded by IT.

274 citations


Journal ArticleDOI
TL;DR: In response to Benbasat and Barki's characterization of TAM as unassailable, the authors argue that common methods bias has never been well tested and that TAM linkages may in fact be methodological artifacts.
Abstract: Benbasat and Barki (2007) argue that TAM has been both a blessing and a curse for the IS field, and they detail reasons why this is the case. Our response to their critique is to highlight areas of agreement, disagree with one of their assertions, and extend their thinking along another, related line. Specifically, we agree that some TAM constructs, namely perceived usefulness and system usage, need to be more closely examined in order to break up the "black box" portrayal of these concepts. With respect to Benbasat and Barki's characterization of TAM as unassailable, our view is that common methods bias has never been well tested and that TAM linkages may in fact be methodological artifacts. Finally, it is argued that the field desperately needs more parsimony in TAM models and that meta-analysis is one good way of achieving this goal.

265 citations


Journal ArticleDOI
TL;DR: The study highlights the importance of design boundary objects in multi-stakeholder designs and stresses the need to formulate sociology-based design theories on how knowledge is produced and consumed in complex SAD tasks.
Abstract: Traditionally, Systems Analysis and Design (SAD) research has focused on ways of working and ways of modeling. Design ecology – the task, organizational and political context surrounding design – is less well understood. In particular, relationships between design routines and products within ecologies have not received sufficient attention. In this paper, we theorize about design product and ecology relationships and deliberate on how design products – viewed as boundary objects – bridge functional knowledge and stakeholder power gaps across different social worlds. We identify four essential features of design boundary objects: capability to promote shared representation, capability to transform design knowledge, capability to mobilize for action, and capability to legitimize design knowledge. We show how these features help align, integrate, and transform heterogeneous technical and domain knowledge across social worlds as well as mobilize, coordinate, and align stakeholder power. We illustrate through an ethnography of a large aerospace laboratory how two design artifacts – early proto-architectures and project plans – shared these four features to coalesce design processes and propel successful movement of designs across social worlds. These artifacts resolved uncertainty associated with functional requirements and garnered political momentum to choose among design solutions. Altogether, the study highlights the importance of design boundary objects in multi-stakeholder designs and stresses the need to formulate sociology-based design theories on how knowledge is produced and consumed in complex SAD tasks.

171 citations


Journal ArticleDOI
TL;DR: It is suggested that these studies implicitly construe "IT acceptance" as simply the relationship between antecedent factors, such as perceived usefulness and ease of use, and the particular type of intention connected to the amount of IT usage.
Abstract: In the past two decades the Technology Acceptance Model (TAM) has successfully catalyzed a large number of studies related to IT usage or intentions toward that usage. However, we argue that the focus of these studies has been on a narrow aspect of usage (typically, extent or frequency of use). Moreover, we suggest that these studies implicitly construe "IT acceptance" as simply the relationship between antecedent factors, such as perceived usefulness and ease of use, and that particular type of intention connected to the amount of IT usage. Rather than continuing to search for additional antecedents or contexts that moderate this particular mode of use, we suggest a reflexive pause regarding the notion of IT acceptance itself. Specifically, we encourage broadening our understanding of IT acceptance toward a wider constellation of behavioral usage and its psychological counterparts. Other aspects of usage behavior or post hoc usage evaluation, such as infusion, routinization, substantive use, exploitive usage, or faithfulness of appropriation, have recently emerged and will likely require or involve other psychological notions of acceptance (Sundaram et al., forthcoming; Jones et al., 2002; Jasperson et al., 2005; Burton-Jones and Straub, 2006; Chin et al., 1997). The call for this expansion is only made more salient by recent studies indicating that the traditional TAM antecedents do not necessarily relate to these other forms of usage (Jones et al., 2002) and, furthermore, that these alternative notions of usage, such as routinization or infusion, may have a stronger connection to performance outcomes (Sundaram et al., forthcoming).


Journal ArticleDOI
TL;DR: It is argued that pre-implementation user participation can be problematic, so that post-implementation involvement will be more effective in garnering user interest and assistance.
Abstract: User participation during software projects has long been considered a prerequisite for system success, and yet these initiatives continue to be rife with troubles. This is particularly true of enterprise software such as ERP and CRM, which, in spite of its popularity, is difficult to implement and is prone to user resistance. This, then, begs the question of why these enterprise systems run into problems even when garnering user participation. One response may be to question the importance of participation per se; a more considered response is likely to be one that emphasizes the need to more closely explore the relationship between participation and the system in use. To this end, we adopt a cross-case comparison to analyze the role of user participation during two ES projects. Through the theoretical lens of 'situated learning', we argue that pre-implementation user participation can be problematic, so that post-implementation involvement will be more effective in garnering user interest and assistance.

Journal ArticleDOI
TL;DR: This research contributes to the clarification of the role of theory in design science, expands the concept of "possibilities for action" to IS design, and proposes a design theory of a class of information systems for testing and refinement.
Abstract: Tailorable technologies are a class of information systems designed with the intention that users modify and redesign the technology in the context of use. Tailorable technologies support user goals, intentions, metaphor, and use patterns in the selection and integration of technology functions in the creation of new and unique information systems. We propose a theory of tailorable technology design and identify principles necessary for the initial design. Following a Kantian style of inquiry, we identified four definitional characteristics of tailorable technology: a dual design perspective, user engagement, recognizable environments, and component architectures. From these characteristics, we propose nine design principles that will support the phenomenon of tailoring. Through a year-long case study, we refined and evidenced the principles, finding that designers of tailorable technologies build environments in which users can both interact and engage with the technology, supporting the proposed design principles. The findings highlight a distinction between a reflective environment, where users recognize and imagine uses for the technology, and an active environment in which users tailor the technology in accordance with the imagined uses. This research contributes to the clarification of the role of theory in design science, expands the concept of "possibilities for action" to IS design, and proposes a design theory of a class of information systems for testing and refinement.

Journal ArticleDOI
TL;DR: An extension of Mumford's ideas about the benefits and process of participation is presented, based on an analysis of recent citizen engagement initiatives, and the extent to which e-government reflects the principles she espoused is examined.
Abstract: Enid Mumford championed an ethical, socio-technical, and participatory approach to the design of ICT systems. In this paper, we focus on the development of e-government as an example of such a system. First, we present an extension of Mumford's ideas about the benefits and process of participation, based on an analysis of recent citizen engagement initiatives. We then examine the extent to which e-government reflects the principles she espoused. The evidence collated indicates that e-government development is currently characterised by a technocentric approach with minimal engagement of citizens. We discuss the implications arising from this analysis, and explore the benefits that governments could achieve from adoption of a socio-technical, participatory approach to e-government development. The crucial enabling role of capacity building is highlighted. Providing citizens with the necessary skills and capabilities to engage effectively offers the key to the successful development of systems such as e-government, which impact our lives in the 21st-century Information Society.

Journal ArticleDOI
TL;DR: The model integrates the literature on requirements and software development; sets the scene for future research; and, finally, proposes how practitioners can manage risks in requirements development projects.
Abstract: Drawing upon the requirements and software development literature, the present study proposes an integrative contingency model for requirements development. Based on 116 quality journal articles, we analyze requirements development risks, requirements development techniques, and heuristics for how they are effectively related. Subsequently, we synthesize the insights from the identified literature into a model for requirements development that relates patterns of risk resolution to archetypical risk profiles. The model integrates the literature on requirements and software development; sets the scene for future research; and, finally, proposes how practitioners can manage risks in requirements development projects.

Journal ArticleDOI
TL;DR: This paper provides a reconceptualization and refinement of the PCI constructs and an extended theoretical model of their influence on users' behavior, offering a more complete picture of the influence of the PCIs.
Abstract: Individual adoption and use of technology remains a critical concern for both managers and professionals. Despite the widespread integration of technology into work and organizations, there remain many opportunities for individuals to either extend or limit their use of IT at work. This paper extends work on the Perceived Characteristics of Innovating (PCI), as defined by Moore and Benbasat in 1991. Building on studies over the past ten years as well as on additional empirical research, we provide two contributions – a reconceptualization and refinement of the PCI constructs, and an extended theoretical model of their influence on users’ behavior. The construct refinements aim to provide greater theoretical clarity and to address challenges in the measurement of the constructs. The extended theoretical model provides a more complete picture of the influence of the PCIs, by considering the complex web of relationships among them in addition to their potential direct effects on usage.


Journal ArticleDOI
TL;DR: It is shown that firms are somewhat misaligned in the early post-merger period and come into alignment only two to three years after the merger; other factors, such as acquirer-target power struggles, prior merger experience, and overarching synergy goals, drove much of the initial integration decision making.
Abstract: This paper focuses on IS integration decisions made during mergers and acquisitions through a strategic-alignment lens. The objectives of this study are to: (1) examine business-IS alignment as reflected in IS integration decisions in a merger context, and (2) identify factors that shape IS integration decisions in a merger context. We study these issues in three oil and gas mergers, from pre-merger announcement to three to four years after merger announcement. Our contributions are threefold. We show that firms are somewhat misaligned in the early post-merger period, and come into alignment only two to three years after the merger. We find that business-IS alignment was a minor concern for the new organizations in the pre-merger and early post-merger phases. Other factors, such as acquirer-target power struggles, prior merger experience, and overarching synergy goals, drove much of the initial integration decision making. Only late in the post-merger phase do the merged organizations revisit their systems to bring them into alignment with business needs.

Journal ArticleDOI
TL;DR: Criteria for determining whether TAM is scientific are reviewed in light of post-positivistic debates about the nature of science, and Popper's principle of demarcation is applied to determine whether a theory such as TAM is falsifiable.
Abstract: This paper reflects upon the technology acceptance model (TAM) from the perspective of the post-positivistic philosophy of science. I explore what it is to know, what a theory is, and what it means to be scientific in the context of TAM. In particular, I review criteria for determining whether TAM is scientific or not in light of post-positivistic debates about the nature of science. For this purpose, I apply Popper's principle of demarcation, which determines whether a theory such as TAM is falsifiable, and the logical connection argument, to show that connections between actions and intentions cannot be subjected to empirical testing in the way that connections between chemical entities can. I also draw on Kuhn's notion of scientific revolutions to observe the degree to which TAM has become normal science. Finally, I review TAM from the Lakatosian perspective of scientific research programs to evaluate whether the program is advancing or declining. My main objective is not to provide a conclusive evaluation of TAM as a research program or a paradigm, but to open the philosophical foundations of TAM to scrutiny so that it can be evaluated not only within the validation rules followed by its proponents, but also by applying a set of well-known criteria established in the post-positivistic views of science.


Journal ArticleDOI
TL;DR: It is suggested that ontologies and conceptual schemas belong to two different epistemic levels: ontologies should deal with general assumptions concerning the explanatory invariants of a domain, those that provide a framework enabling the understanding and explanation of data across domains.
Abstract: In the traditional systems modeling approach, the modeler is required to capture a user's view of some domain in a formal conceptual schema. The designer's conceptualization may or may not match with the user's conceptualization. One of the reasons for these conflicts is the lack of an initial agreement among users and modelers concerning the concepts belonging to the domain. Such an agreement could be facilitated by means of an ontology. If the ontology is previously constructed and formalized so that it can be shared by the modeler and the user in the development process, such conflicts would be less likely to happen. Following up on that, a number of investigators have suggested that those working on information systems should make use of commonly held, formally defined ontologies that would constrain and direct the design, development, and use of information systems, thus avoiding the above-mentioned difficulties. Whether ontologies represent a significant advance from the more traditional conceptual schemas has been challenged by some researchers. We review and summarize some major themes of this complex discussion. While recognizing the commonalities and historical continuities between conceptual schemas and ontologies, we think that there is an important emerging distinction which should not be obscured, but should guide future developments. In particular, we propose that the notions of conceptual schemas and ontologies be distinguished so as to play essentially different roles for the developers and users of information systems. We first suggest that ontologies and conceptual schemas belong to two different epistemic levels. They have different objects and are created with different objectives. Our proposal is that ontologies should deal with general assumptions concerning the explanatory invariants of a domain, those that provide a framework enabling understanding and explanation of data across all domains inviting explanation and understanding. Conceptual schemas, on the other hand, should address the relation between such general explanatory categories and the facts that exemplify them in a particular domain (e.g., the contents of the database). In contrast to ontologies, conceptual schemas would involve specification of the meaning of the explanatory categories for a particular domain as well as the consequent dimensions of possible variation among the relevant data of a given domain. Accordingly, the conceptual schema makes possible both the intelligibility and the measurement of those facts of a particular domain. The proposed distinction between ontologies and conceptual schemas makes possible a natural decomposition of information systems in terms of two necessary but complementary epistemic functions: identification of an invariant

Journal ArticleDOI
TL;DR: Extant process modeling techniques address different aspects of processes, such as activity sequencing, resource allocation, and organizational responsibilities, but do not deal with important aspects of process design such as process goals.
Abstract: Extant process modeling techniques address different aspects of processes, such as activity sequencing, resource allocation, and organizational responsibilities. These techniques are usually based on graphic notation and are driven by practice rather than by theoretical foundations. The lack of theoretical principles hinders the ability to ascertain the “correctness” of a process model. A few techniques (notably Petri Nets) are formalized and apply verification mechanisms (mostly for activity sequencing and concurrency). However, these techniques do not deal with important aspects of process design such as process goals.
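
To make concrete the kind of formal verification this abstract attributes to Petri nets, the sketch below implements the basic firing rule on which sequencing and concurrency checks are built. It is a minimal illustration under my own reading, not taken from the paper; the place and transition names are invented.

```python
# Minimal Petri net sketch: a transition is enabled iff every input place
# holds enough tokens; firing consumes input tokens and produces output tokens.
from collections import Counter

def enabled(marking: Counter, inputs: dict) -> bool:
    """True iff every input place carries at least the required tokens."""
    return all(marking[p] >= n for p, n in inputs.items())

def fire(marking: Counter, inputs: dict, outputs: dict) -> Counter:
    """Fire one transition: consume input tokens, produce output tokens."""
    assert enabled(marking, inputs), "transition not enabled"
    m = Counter(marking)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] += n
    return +m  # unary plus drops places whose token count reached zero

# A hypothetical 'approve' step: needs a token in 'reviewed', yields 'approved'.
m0 = Counter({"reviewed": 1})
m1 = fire(m0, inputs={"reviewed": 1}, outputs={"approved": 1})
print(m1)  # Counter({'approved': 1}); sequencing/concurrency checks build on this rule
```

Correctness checks for activity sequencing amount to exploring which markings are reachable through repeated applications of this rule; the paper's point is that such mechanisms say nothing about whether a process achieves its goals.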

Journal ArticleDOI
TL;DR: It is concluded that recent studies in the IS journal quality stream are credible and that these journal quality measures provide appropriate indicators of relative journal quality.
Abstract: In this study we investigated the measurement validity of the findings in the IS journal quality stream over the past ten years. Our evaluation applied a series of validation tests to the metrics presented in these studies using data from multiple sources. The results of our tests for content, convergent, and discriminant validity, as well as those for parallel-form, test-retest, and item-to-total reliability, were highly supportive. From these findings, we conclude that recent studies in the IS journal quality stream are credible. As such, these IS journal quality measures provide appropriate indicators of relative journal quality. This conclusion is important for both academic administrators and scientometric researchers, the latter of whom depend on journal quality measures in the evaluation of published IS research.

Journal ArticleDOI
TL;DR: A theoretical model is proposed in the context of MDS that connects user satisfaction with contribution to QoL (a new outcome variable for mobile computing) in a range of life domains; the model was tested through three empirical studies conducted in Korea.
Abstract: The rapid spread of technological innovations like mobile data services (MDS) has made mobile computing a fact of everyday life for many people. Therefore, we need to understand the contribution of mobile computing to overall quality of life (QoL). Employing the satisfaction hierarchy model and bottom-up spillover theory, this study proposes a theoretical model in the context of MDS that connects user satisfaction (a traditional outcome variable of IT) with contribution to QoL (a new outcome variable for mobile computing) in a range of life domains. The validity of the proposed model and outcome variable was tested through three empirical studies conducted in Korea. User satisfaction with MDS was found to affect the contribution of MDS to QoL in eleven life domains, and these contributions in turn influenced the overall contribution of MDS to QoL. The paper ends with a discussion of the study’s implications and limitations.

Journal ArticleDOI
TL;DR: Drawing on the socio-technical approach and in particular on Enid Mumford and her ETHICS methodology, the paper will contribute to the CCSR's aim of embedding ethics in mainstream computing / IS work.
Abstract: Drawing on the socio-technical approach, and in particular on Enid Mumford and her ETHICS methodology, the paper will contribute to the CCSR's aim of embedding ethics in mainstream computing / IS work. The Journal of the AIS is the official research outlet of the Association for Information Systems. It is among the "basket" of the six best IS journals (http://home.aisnet.org/). It is furthermore available to all members of the AIS, the leading professional body for IS scholars, which will lead to a wide readership.


Journal ArticleDOI
TL;DR: This paper details the development and application of the TOVE ISO 9000 Micro-Theory, a model of ISO 9000 developed using ontologies for quality management (measurement, traceability, and quality management system ontologies), and demonstrates that when enterprise models are developed using ontologies, they can be leveraged to support business analytics problems, in particular compliance evaluation, and are sharable.
Abstract: Sharing data between organizations is challenging because it is difficult to ensure that those consuming the data accurately interpret it. The promise of the next-generation WWW, the semantic Web, is that semantics about shared data will be represented in ontologies and available for automatic and accurate machine processing of data. Thus, there is inter-organizational business value in developing applications that have ontology-based enterprise models at their core. In an ontology-based enterprise model, business rules and definitions are represented as formal axioms, which are applied to enterprise facts to automatically infer facts not explicitly represented. If the proposition to be inferred is a requirement from, say, ISO 9000 or Sarbanes-Oxley, inference constitutes a model-based proof of compliance. In this paper, we detail the development and application of the TOVE ISO 9000 Micro-Theory, a model of ISO 9000 developed using ontologies for quality management (measurement, traceability, and quality management system ontologies). In so doing, we demonstrate that when enterprise models are developed using ontologies, they can be leveraged to support business analytics problems (in particular, compliance evaluation) and are sharable. Key words: enterprise modeling, ontologies, quality management, ISO 9000, regulatory requirements. Volume 8, Issue 2, Article 2, pp. 105-128, February 2007.
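
The abstract's central mechanism, formal axioms applied to enterprise facts until a required proposition is (or is not) derivable, is essentially forward-chaining inference. The sketch below is a toy reconstruction under that reading, not the TOVE ontologies themselves; the predicates, rule, and facts are invented for illustration.

```python
# Toy axiom-based compliance check: facts are tuples, rules derive new facts,
# and a requirement "holds" iff it appears in the deductive closure.
facts = {
    ("calibrated", "gauge-7"),
    ("measured_with", "batch-12", "gauge-7"),
    ("recorded", "batch-12"),
}

def rule_traceable(fs):
    """Hypothetical traceability axiom: a recorded batch measured with a
    calibrated gauge is traceable."""
    derived = set()
    for fact in fs:
        if fact[0] == "measured_with":
            _, batch, gauge = fact
            if ("calibrated", gauge) in fs and ("recorded", batch) in fs:
                derived.add(("traceable", batch))
    return derived

def forward_chain(fs, rules):
    """Apply all rules to a fixed point; the result is the closure of the facts."""
    fs = set(fs)
    while True:
        new = set().union(*(r(fs) for r in rules)) - fs
        if not new:
            return fs
        fs |= new

closure = forward_chain(facts, [rule_traceable])
# Membership of the required proposition in the closure plays the role of the
# paper's model-based proof of compliance.
print(("traceable", "batch-12") in closure)  # True
```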

Journal ArticleDOI
TL;DR: Zhang et al. study not only the personal identities but also the social identities of criminals during the matching process, showing that combining social features with personal features can improve the performance of criminal identity matching.
Abstract: Complex problems like drug crimes often involve a large number of variables interacting with each other. A complex problem may be solved by breaking it into parts (i.e., sub-problems), which can be tackled more easily. The identity matching problem, for example, is a part of the problem of drug and other types of crimes. It is often encountered during crime investigations when a single criminal is represented by multiple identity records in law enforcement databases. Because of the discrepancies among these records, a single criminal may appear to be different people. Following Enid Mumford's three-stage problem solving framework, we design a new method to address the problem of criminal identity matching for fighting drug-related crimes. Traditionally, the complexity of criminal identity matching was reduced by treating criminals as isolated individuals who maintain certain personal identities. In this research, we recognize the intrinsic complexity of the problem and treat criminals as interrelated rather than isolated individuals. In other words, we take into consideration the social relationships between criminals during the matching process. We study not only the personal identities but also the social identities of criminals. Evaluation results were quite encouraging and showed that combining social features with personal features could improve the performance of criminal identity matching. In particular, the social features become more useful when data contain many missing values for personal attributes.
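
The paper's key move, matching identity records on social as well as personal features, can be illustrated with a small scoring function. This is a hypothetical sketch of the idea, not the authors' algorithm; the attribute names, similarity measures, and weighting scheme are all assumptions, including the choice to up-weight social evidence when personal attributes are missing (the situation where the paper found social features most useful).

```python
# Hypothetical sketch: score a candidate match between two identity records by
# blending personal-attribute similarity with social (associate-overlap)
# similarity, leaning more on social evidence when personal data is missing.
from difflib import SequenceMatcher

FIELDS = ("name", "dob", "address")  # assumed personal attributes

def personal_similarity(rec_a: dict, rec_b: dict) -> float:
    """Average string similarity over personal attributes present in both records."""
    shared = [f for f in FIELDS if rec_a.get(f) and rec_b.get(f)]
    if not shared:
        return 0.0
    return sum(SequenceMatcher(None, rec_a[f], rec_b[f]).ratio() for f in shared) / len(shared)

def social_similarity(assoc_a: set, assoc_b: set) -> float:
    """Jaccard overlap of each record's known associates."""
    if not assoc_a or not assoc_b:
        return 0.0
    return len(assoc_a & assoc_b) / len(assoc_a | assoc_b)

def match_score(rec_a, rec_b, assoc_a, assoc_b, w_social=0.4):
    """Weighted blend; up-weight social features per missing personal attribute."""
    missing = sum(1 for f in FIELDS if not rec_a.get(f) or not rec_b.get(f))
    w = min(0.9, w_social + 0.2 * missing)
    return (1 - w) * personal_similarity(rec_a, rec_b) + w * social_similarity(assoc_a, assoc_b)

# Two records plausibly describing one person: a missing DOB, shared associates.
r1 = {"name": "John Q. Smith", "dob": "1970-03-02", "address": "12 Elm St"}
r2 = {"name": "Jon Smith", "dob": None, "address": "12 Elm Street"}
print(round(match_score(r1, r2, {"A", "B", "C"}, {"B", "C", "D"}), 3))  # score in [0, 1]
```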

Journal ArticleDOI
TL;DR: In this article, an agent-mediated knowledge-in-motion (KiM) model is proposed to capture the specificity of agents as key players binding knowledge creation and knowledge application.
Abstract: With millions invested in knowledge management (KM), researchers and organizations are constantly investigating how firms can best organize their KM processes to reap instrumental benefits. Yet, most KM research, apart from being fragmented, overemphasizes knowledge creation and draws little attention to key intermediaries in the KM process. The paper captures the specificity of agents as key players binding knowledge creation and knowledge application. Specifically, the paper introduces a conceptual process model that views knowledge management as an agent-mediated series of knowledge transformations, envisioned as the agent-mediated knowledge-in-motion model. The proposed agent-mediated knowledge-in-motion (KiM) model embodies the cycle of knowledge creation and reuse. By tying agent-based research to knowledge creation and application, the paper describes how organizations can strategically employ human and software agents to enhance the creation, transfer, application, and dissemination of knowledge. In the process, the paper highlights specific roles and attributes of various agents in the KM process. Using the organization as the primary unit of analysis, the scope of the discussion surrounds the conceptualization of an agent-mediated knowledge management process where data is transformed into information, information into knowledge, knowledge into creativity, creativity into innovation, and, finally, the diffusion of innovation back into data, thus tying together a cycle of knowledge transitions from creation to reuse.