
Showing papers in "ACM Sigmis Database in 1998"


Journal Article•DOI•
TL;DR: The results of this study, examining the adoption of an expert system, indeed support the notion that developer responsiveness strongly influenced both PU and PEOU, but only indirectly affected actual behavior --- IS use --- in accordance with the predictions of SET.
Abstract: The Technology Acceptance Model (TAM) suggests that the perceived usefulness (PU) and the perceived ease of use (PEOU) of an information system (IS) are major determinants of its use. Previous research has demonstrated the validity of this model across a wide variety of IS types. However, prior research has not identified antecedents of PU and there has been only limited research on the antecedents of PEOU. Consequently, research has provided little guidance to IS managers on methods to increase use by augmenting PU and PEOU. Viewing IS development as an instance of Social Exchange Theory (SET), this study proposes that IS managers can influence both the PU and the PEOU of an IS through a constructive social exchange with the user. One means of building and maintaining a constructive social exchange is through developer responsiveness. The results of this study, examining the adoption of an expert system, indeed support this notion. Specifically, developer responsiveness strongly influenced both PU and PEOU, but only indirectly affected actual behavior --- IS use --- in accordance with the predictions of SET. An extension of TAM based on SET is presented and the implications of this extended model are discussed from both a managerial and theoretical perspective.

294 citations


Journal Article•DOI•
TL;DR: Three types of information systems personnel were compared to the general population based on responses to a standardized personality test, and the personality profiles of analysts and managers differed widely from that of programmers, but not from one another.
Abstract: Three types of information systems personnel (programmers, systems analysts, and project managers) were compared to the general population based on responses to a standardized personality test. The IS professionals, in aggregate, exceeded population norms for nearly all of the relevant scales, confirming much prior research. However, the personality profiles of analysts and managers differed widely from that of programmers, but not from one another. Managers and analysts were found to be conservative, logical, analytical, diligent, and ambitious, with strong leadership tendencies and high self-confidence and self-esteem. They were also found to be more sociable and creative than in past research. The differences identified between traditional programmers and systems analysts and managers indicate the importance of studying, managing, and recruiting these groups differently. The findings offer insight into how to retain, promote, and manage IS personnel effectively.

87 citations


Journal Article•DOI•
TL;DR: In The Software Practitioner column, "Through a glass, darkly" is a biblical description of the imperfect way in which we see the world around us.
Abstract: The Software Practitioner. "Through a glass, darkly" is a biblical description of the imperfect way in which we see the world around us.

61 citations


Journal Article•DOI•
TL;DR: A data mart is a smaller version of a data warehouse that supports the narrower requirements of a single business unit; an operational data store applies the data warehouse approach to provide clean data for transaction processing systems.
Abstract: Many large organizations have developed data warehouses to support decision making. The data in a warehouse are subject oriented, integrated, time variant, and nonvolatile. A data warehouse contains five types of data: current detail data, older detail data, lightly summarized data, highly summarized data, and metadata. The architecture of a data warehouse includes a backend process (the extraction of data from source systems), the warehouse, and the front-end use (the accessing of data from the warehouse). A data mart is a smaller version of a data warehouse that supports the narrower set of requirements of a single business unit. Data marts should be developed in an integrated manner in order to avoid repeating the "silos of information" problem. An operational data store is a database for transaction processing systems that uses the data warehouse approach to provide clean data. Data warehousing is constantly changing, with the associated opportunities for practice and research, such as the potential for knowledge management using the warehouse.
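The summarization levels the abstract enumerates can be sketched in a few lines. This is a hypothetical illustration (the data, function names, and grouping key are not from the paper): detail records are rolled up once into lightly summarized data and again into highly summarized data.

```python
from datetime import date

# Hypothetical sales records standing in for "current detail data".
detail = [
    {"store": "A", "day": date(1998, 1, 1), "amount": 100.0},
    {"store": "A", "day": date(1998, 1, 2), "amount": 150.0},
    {"store": "B", "day": date(1998, 1, 1), "amount": 80.0},
]

def lightly_summarize(rows):
    """Aggregate detail data by store: one summarization level up."""
    totals = {}
    for r in rows:
        totals[r["store"]] = totals.get(r["store"], 0.0) + r["amount"]
    return totals

def highly_summarize(rows):
    """Aggregate across all stores: the top summarization level."""
    return sum(r["amount"] for r in rows)

light = lightly_summarize(detail)   # per-store totals
high = highly_summarize(detail)     # one grand total
```

Older detail data and metadata would sit alongside these levels; queries against the summaries avoid rescanning the detail.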

43 citations


Journal Article•DOI•
TL;DR: Do IS professionals differ significantly from students in terms of their perceptions about ethical issues?
Abstract: As information technology evolves, it continues to raise new ethical challenges. In recognition of this, the business and academic communities have focused increased attention on ethics. Professional codes of ethics have been enacted by the ACM, AITP, and other computing organizations to provide guidance to information systems (IS) professionals in resolving ethical dilemmas. In addition, the IS'95 model curriculum and the American Assembly of Collegiate Schools of Business (AACSB) guidelines for business education both recognize the importance of ethics in business educational programs. This paper explores an important question that has been neglected by previous research: do IS professionals differ significantly from students in terms of their perceptions about ethical issues? Two studies were conducted and they revealed a number of ethical decision-making differences between professionals and students. This result, along with an additional finding that participants showed little consensus about most ethical scenarios, suggests that ethical decision making is often difficult and that both students and professionals can benefit from ethical training and education. The findings also have important implications for IS research.

32 citations


Journal Article•DOI•
Sandeep Purao•
TL;DR: A web-based system, APSARA, is described, developed in Java, that implements a pattern retrieval and synthesis methodology that uses natural language processing and automated reasoning heuristics to create object-oriented designs based upon simple requirements descriptions.
Abstract: Although a number of libraries of patterns have been developed for reuse, there is no mechanism for supporting automated design of object-oriented systems by the intelligent reuse of patterns from such libraries. We describe a web-based system, APSARA, the purpose of which is to create object-oriented designs based upon simple requirements descriptions. The system, developed in Java, implements a pattern retrieval and synthesis methodology that uses natural language processing and automated reasoning heuristics. The system is tested on multiple cases from different domains. The results are reported using metrics defined in the spirit of familiar measures such as recall, precision, coverage and spuriousness. These initial tests suggest that this is a feasible approach for the reuse of patterns in object-oriented design. The testing also reveals specific areas of concern and suggests a number of avenues for extending this research.
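The abstract reports results "in the spirit of familiar measures such as recall, precision, coverage and spuriousness." A minimal sketch of such set-overlap metrics, with hypothetical element names and formulas that may differ from the paper's own definitions:

```python
def design_metrics(generated, reference):
    """Compare a generated set of design elements against a reference
    design. Illustrative recall/precision-style measures; the paper's
    exact definitions may differ."""
    generated, reference = set(generated), set(reference)
    hits = generated & reference
    recall = len(hits) / len(reference) if reference else 0.0
    precision = len(hits) / len(generated) if generated else 0.0
    # "Spuriousness": fraction of generated elements absent from the reference.
    spuriousness = 1.0 - precision
    return recall, precision, spuriousness

# Example: objects proposed by a pattern-synthesis step vs. an expert design.
r, p, s = design_metrics({"Order", "Customer", "Logger"},
                         {"Order", "Customer", "Invoice"})
# recall 2/3, precision 2/3, spuriousness ~1/3
```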

31 citations


Journal Article•DOI•
TL;DR: An iterative process of application design incorporates the design of the entire application as well as its components; an iterative comparison and refinement of the two versions of the application diagram ensures a better final application.
Abstract: The proliferation of intranets and extranets, as well as the vast expansion of the World Wide Web (WWW) and electronic commerce, indicates the need for a structured hypermedia design methodology that will guide the design, development, and maintenance of large multimedia and hypermedia information systems and collaborative systems. The Relationship Management Methodology (RMM) is a well-known hypermedia design methodology. In this paper we provide an extension to it that enhances the design process. We present an iterative process of application design that incorporates the design of the entire application as well as its components. The process includes the design of an application diagram in a top-down fashion, the design of the components or building blocks using the construct of an m-slice, and the regeneration of the application diagram in a bottom-up fashion. An iterative comparison and refinement of the two versions of the application diagram ensures a better final application.

25 citations


Journal Article•DOI•
TL;DR: This paper describes a course that addresses the gap between the information systems integration skills that employers desire and those that universities teach, and relies on an extensive organizational simulation as the primary pedagogical method.
Abstract: Integrating technologies and applications to provide better access to, and sharing of, corporate data and to coordinate enterprise-wide tasks and processes is a critical means to adding business value through information technology. Consequently, potential employers seek information systems professionals whose skills focus on the integration of information technologies, information resources, and business strategy. However, these companies also perceive that universities are not providing graduates the necessary integration skills. This paper describes a course, Business Systems Integration, that addresses the gap between the information systems integration skills that employers desire and those that universities teach. The course approaches information systems integration from three perspectives: 1) integrating information technologies, 2) integrating the content of the MIS curriculum, and 3) integrating organizations via cross-functional business processes. Attaining a practical level of knowledge about systems integration requires a sufficiently complex, real-world environment; thus the course relies on an extensive organizational simulation as the primary pedagogical method. That simulation is described in this paper.

22 citations


Journal Article•DOI•
TL;DR: This study indicates that it is possible to improve performance considerably by selecting the right combinations of data delivery techniques and system characteristics, and proposes solutions for improving workflow efficiency.
Abstract: The Internet is providing worldwide connectivity for business organizations and has created a much greater opportunity for people to collaborate remotely in an automated workflow setting. However, the speed of delivery of workflow data on the Internet can be unpredictable due to variability in traffic and limited bandwidth. Further, the issue of data management becomes more complex when the system consists of various clients with different system and networking settings. In this paper, we investigate several data management problems in distributed workflow management on the Internet and propose solutions for improving workflow efficiency. More specifically, we explore research issues with respect to system architectures and various data delivery techniques. Our study indicates that it is possible to improve performance considerably by selecting the right combinations of data delivery techniques and system characteristics.

15 citations


Journal Article•DOI•
TL;DR: An architecture and a prototype for developing, delivering, and maintaining expert systems on the World Wide Web and allowing experts and users to experiment with real-time enhancements of knowledge bases are described.
Abstract: A convergence of Internet and fuzzy logic technologies provides an opportunity for experts and end users to collaborate in developing, refining, and testing knowledge-based systems. Internet technology removes geographical and time-based restraints, and fuzzy rule bases are easier to understand and maintain. This paper describes an architecture and a prototype for developing, delivering, and maintaining expert systems on the World Wide Web. The system's collaboration components allowed experts to monitor user consultations remotely, view summaries of responses, and trace rule inference chains. Experts and users participated in real-time chat sessions or posted questions on extended discussion lists. The system allowed experts and users to experiment with real-time enhancements of knowledge bases. Fuzzy rules resulted in semantically richer knowledge bases that flexibly handled complex and uncertain knowledge. A fuzzy inference engine supported hedges and partial matching to assist users in applying knowledge and exploring Web-based data.
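The hedges and partial matching the abstract mentions are standard fuzzy-logic devices, sketched below with a hypothetical membership function (the linguistic variable and thresholds are illustrative, not taken from the prototype):

```python
def tall(height_cm):
    """Hypothetical fuzzy membership function for 'tall'."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30.0

def very(mu):
    """Classic concentration hedge: squaring sharpens the fuzzy set."""
    return mu ** 2

def somewhat(mu):
    """Classic dilation hedge: a square root broadens the fuzzy set."""
    return mu ** 0.5

# Partial matching: a rule fires to the degree its premise is satisfied,
# rather than all-or-nothing as in a crisp rule base.
mu = tall(175)                # 0.5
fire_very = very(mu)          # 0.25
fire_somewhat = somewhat(mu)  # ~0.707
```

A conclusion's strength then tracks the firing degree, which is what makes fuzzy rule bases tolerant of uncertain inputs.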

15 citations


Journal Article•DOI•
TL;DR: This paper introduces a model that facilitates incremental maintenance of single-class-based object-oriented views by employing the deferred update mode that has proved to be more suitable for object- oriented databases in general.
Abstract: A database management system should support views to facilitate filtering of information in order to have only necessary and required information available to users with minimal delay. Although much research effort has concentrated on views within the conventional relational model, much more is required when object-oriented models are considered. However, supporting views is only a first step; system performance must also be improved by maintaining views incrementally instead of recomputing a view from scratch each time it is accessed. In this paper, we introduce a model that facilitates incremental maintenance of single-class-based object-oriented views by employing the deferred update mode, which has proved to be more suitable for object-oriented databases in general. For that purpose, we categorize classes into base and brother classes, corresponding to classes originally present in the database and those introduced as views, respectively. To each class, we add a modification list that keeps related modifications during different intervals. An interval starts with the creation or update of a view and ends with the creation or update of another view. A modification list is empty as long as no views depend on its class. Further, we introduce algorithms that locate modifications done on related classes while computing a given view incrementally. Finally, we give a theoretical justification showing that, in general, the introduced algorithms perform much better than computing from scratch each time a view is accessed.
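The modification-list scheme the abstract outlines can be sketched as follows. This is a simplified illustration under assumptions of my own (class names, the predicate, and the single-operation log are hypothetical, not the paper's design):

```python
class BaseClass:
    """A base class that logs modifications once some view depends on it."""
    def __init__(self):
        self.objects = {}
        self.mod_list = []          # stays empty while no views depend on us
        self.has_dependents = False

    def put(self, oid, value):
        self.objects[oid] = value
        if self.has_dependents:
            self.mod_list.append(("put", oid, value))

class View:
    """A 'brother' class: a filtered copy of the base, refreshed lazily."""
    def __init__(self, base, predicate):
        self.base, self.predicate = base, predicate
        base.has_dependents = True
        self.data = {k: v for k, v in base.objects.items() if predicate(v)}
        self.cursor = len(base.mod_list)   # marks the start of the next interval

    def refresh(self):
        # Deferred update: apply only modifications logged since the last
        # refresh, instead of recomputing the view from scratch.
        for op, oid, value in self.base.mod_list[self.cursor:]:
            if self.predicate(value):
                self.data[oid] = value
            else:
                self.data.pop(oid, None)
        self.cursor = len(self.base.mod_list)

emp = BaseClass()
emp.put(1, {"salary": 40000})
high_paid = View(emp, lambda v: v["salary"] > 50000)
emp.put(2, {"salary": 60000})   # logged in the current interval, not yet applied
high_paid.refresh()             # incremental: only the new entry is examined
```

The `cursor` plays the role of the interval boundary: each refresh closes one interval and opens the next.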

Journal Article•DOI•
TL;DR: This study introduced a new IRD strategy construct that focused on the extent of interaction between users and developers that differentiated between perceptual and evidential outcomes, between process and product outcomes, and between users' and developers' points of view.
Abstract: A contingency model for system development was subjected to several conceptual and operational adjustments and empirical tests. According to the model, there should be a degree of fit between development project uncertainty and the strategy for determining information requirements, ranging from accepting initial requirements statements to experimenting with prototypes to discover requirements. This study introduced a new IRD strategy construct that focused on the extent of interaction between users and developers. The study also differentiated between perceptual and evidential outcomes, between process and product outcomes, and between users' and developers' points of view. The hypotheses predicted that the degree of fit between project uncertainty and the IRD strategy would account for perceptual and evidential project outcomes from both the user's and developer's points of view. Results indicated that support was found only when relying on the new extent-of-interaction construct. From the developers' standpoint, the degree of fit appeared to be related to perceptual assessments of the process and product. From the user's standpoint, the degree of fit was only related to perceptual assessments of the process. There was no support for the hypothesized impact of fit on evidential outcomes.

Journal Article•DOI•
TL;DR: The results indicate that programs developed using the object-oriented environment needed fewer lines of code, had less volume, required less effort, were less complex, and were more maintainable than those developed using the traditional 4GL environments.
Abstract: This paper describes an exploratory case study that investigates the claim that the object-oriented approach to information systems development reduces development effort and improves maintainability. Software metrics for development effort, complexity, and maintainability are compared across three functionally similar 4GL database development environments, one object-oriented and two traditional (relational). The results indicate that programs developed using the object-oriented environment needed fewer lines of code, had less volume, required less effort, were less complex, and were more maintainable than those developed using the traditional 4GL environments.
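"Volume" and "effort" in this line of work are usually Halstead software-science measures computed from operator and operand counts. The abstract does not give its formulas, so the sketch below uses the standard Halstead definitions with made-up counts for two functionally similar programs:

```python
import math

def halstead(n1, n2, N1, N2):
    """Standard Halstead measures from distinct operators (n1), distinct
    operands (n2), total operators (N1), and total operands (N2)."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume
    return volume, effort

# Hypothetical counts: a terser object-oriented program vs. a 4GL program.
v_oo, e_oo = halstead(n1=10, n2=15, N1=40, N2=60)
v_4gl, e_4gl = halstead(n1=14, n2=25, N1=90, N2=140)
# Lower volume and effort are read as less development effort.
```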

Journal Article•DOI•
TL;DR: Assessment and Control of Software Risks presents a medical handbook approach to software projects, covering a standard nine topics for each potential medical problem, including how to identify it, and methods of controlling it.
Abstract: There is a book in the marketplace that addresses precisely such questions, but in a most unusual way. In that book, Assessment and Control of Software Risks (Yourdon Press, 1994), Capers Jones presents a medical handbook approach to software projects. In medicine, because of the commonality of many kinds of diseases and problems, there exists a handbook that provides doctors with rudimentary diagnoses and cures for those problems. That handbook covers a standard nine topics for each potential medical problem, including how to identify it and methods of controlling it. It is easy to see why the Control of Communicable Diseases in Man is an indispensable part of the medical professional's library.

Journal Article•DOI•
TL;DR: The paper describes the design and implementation of a general purpose scalable simulation environment (SimDS) for designing and evaluating the performance of distributed transaction processing systems.
Abstract: Design of a distributed transaction processing system is a complex process. The paper describes the design and implementation of a general purpose scalable simulation environment (SimDS) for designing and evaluating the performance of distributed transaction processing systems. SimDS is a distributed simulation system where each data server is implemented as a separate process that communicates with each other through user datagram protocol. The paper describes the features of SimDS including various design policies that can be modeled by the system. It also discusses different design issues that one needs to consider in the implementation of a distributed simulation environment. The paper concludes with some test examples where SimDS was used to simulate different configurations of a real-time transaction processing system.

Journal Article•DOI•
TL;DR: This paper presents DataIndexes, a family of design strategies for data warehouses to support OnLine Analytical Processing (OLAP), including two simple DataIndexes: the Basic DataIndex (BDI), which can be used for any attribute, and the Join DataIndex (JDI), which is used for foreign-key attributes.
Abstract: In this paper we present DataIndexes, a family of design strategies for data warehouses to support OnLine Analytical Processing (OLAP). As the name implies, DataIndexes are both a storage structure for the warehoused relational data and an indexing scheme to provide fast access to that data. We present two simple DataIndexes: the Basic DataIndex (BDI), which can be used for any attribute, and the Join DataIndex (JDI), which is used for foreign-key attributes. Either structure can be shown to significantly improve query response times in most cases. And, since they serve as indexes as well as the store of base data, DataIndexes actually define a physical design strategy for a data warehouse where the indexing, for all intents and purposes, comes for "free."
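The index-doubles-as-storage idea can be illustrated with a toy columnar layout. The class names and internals below are my own simplification, not the paper's structures:

```python
class BasicDataIndex:
    """BDI sketch: the column itself, stored positionally, doubles as the
    index, so no separate copy of the base data is needed."""
    def __init__(self, values):
        self.values = list(values)   # position i holds row i's value

    def find(self, value):
        """Return row positions matching a value (a linear scan here; a
        real BDI exploits the positional layout far more efficiently)."""
        return [i for i, v in enumerate(self.values) if v == value]

class JoinDataIndex:
    """JDI sketch: for a foreign-key column, store the *position* of the
    matching row in the referenced table, turning the join into a lookup."""
    def __init__(self, fk_values, pk_values):
        pos = {pk: i for i, pk in enumerate(pk_values)}
        self.target_rows = [pos[fk] for fk in fk_values]

# A fact table's customer_id column joined to a customer dimension.
customer_pk = [101, 102, 103]
sales_fk = BasicDataIndex([102, 101, 102])
jdi = JoinDataIndex(sales_fk.values, customer_pk)
# Each sale now points straight at its customer row, with no key comparison.
```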

Journal Article•DOI•
TL;DR: An object logic programming language is proposed that captures all of the basic object-oriented concepts in a standard logic programming environment and is extended by redefining the representation of objects in a more formal way and by adding inheritance through an extended unification similar to the one proposed by Ait-Kaci and Nasr (1986).
Abstract: In this paper, an object logic programming language is proposed that captures all of the basic object-oriented concepts in a standard logic programming environment. This paper combines and extends two previously proposed models, namely Conery's technique (1988), which uses first-order logic to model objects including all of the basic object-oriented concepts except inheritance, and the LOGIN language of Ait-Kaci and Nasr (1986), which embeds inheritance into unification using typed logic. In this paper, Conery's proposal is extended by redefining the representation of objects in a more formal way and by adding inheritance through an extended unification similar to the one proposed by Ait-Kaci and Nasr (1986). We have also changed the type concept of Ait-Kaci and Nasr (1986) by adding types directly to the first-order terms rather than using terms as types. The extended logic language, LPL++, has all the basic object-oriented concepts, namely objects, instances, classes, methods, and inheritance, with the availability of update operations on objects in methods. A prototype interpreter has also been developed for the language LPL++.
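Embedding inheritance into unification means two typed terms unify when their types are comparable in the hierarchy, yielding the more specific type. A toy sketch of that idea, simplified to single inheritance (the type names are invented, and LOGIN-style systems handle full lattices with greatest lower bounds, which this does not):

```python
# Hypothetical type hierarchy: child -> parent (single inheritance only).
PARENTS = {"workstudy": "student", "student": "person",
           "employee": "person", "person": "top"}

def ancestors(t):
    """The chain of types from t up to the root."""
    chain = [t]
    while t in PARENTS:
        t = PARENTS[t]
        chain.append(t)
    return chain

def unify_types(t1, t2):
    """Typed unification step: succeed with the more specific type when
    one subsumes the other; incomparable types fail to unify."""
    if t2 in ancestors(t1):
        return t1
    if t1 in ancestors(t2):
        return t2
    return None

# unify_types("workstudy", "person") -> "workstudy"
# unify_types("student", "employee") -> None (siblings do not unify here)
```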