
Showing papers in "Ibm Systems Journal in 1993"


Journal ArticleDOI
TL;DR: A model for conceptualizing and directing the emerging area of strategic management of information technology is developed in terms of four fundamental domains of strategic choice: business strategy, information technology strategy, organizational infrastructure and processes, and information technology infrastructure and processes--each with its own underlying dimensions.
Abstract: It is clear that even though information technology (I/T) has evolved from its traditional orientation of administrative support toward a more strategic role within an organization, there is still a glaring lack of fundamental frameworks within which to understand the potential of I/T for tomorrow's organizations. In this paper, we develop a model for conceptualizing and directing the emerging area of strategic management of information technology. This model, termed the Strategic Alignment Model, is defined in terms of four fundamental domains of strategic choice: business strategy, information technology strategy, organizational infrastructure and processes, and information technology infrastructure and processes--each with its own underlying dimensions. We illustrate the power of this model in terms of two fundamental characteristics of strategic management: strategic fit (the interrelationships between external and internal components) and functional integration (integration between business and functional domains). More specifically, we derive four perspectives for guiding management practice in this important area.

3,343 citations


Journal ArticleDOI
TL;DR: This paper looks at why it may not be sufficient to work on any one of these areas in isolation or to only harmonize business strategy and information technology, and how the Strategic Alignment Model is applied.
Abstract: The strategic use of information technology (I/T) is now and has been a fundamental issue for every business. In essence, I/T can alter the basic nature of an industry. The effective and efficient utilization of information technology requires the alignment of the I/T strategies with the business strategies, something that was not done successfully in the past with traditional approaches. New methods and approaches are now available. The strategic alignment framework applies the Strategic Alignment Model to reflect the view that business success depends on the linkage of business strategy, information technology strategy, organizational infrastructure and processes, and I/T infrastructure and processes. In this paper, we look at why it may not be sufficient to work on any one of these areas in isolation or to only harmonize business strategy and information technology. One reason is that, often, too much attention is placed on technology, rather than business, management, and organizational issues. The objective is to build an organizational structure and set of business processes that reflect the interdependence of enterprise strategy and information technology capabilities. The attention paid to the linkage of information technology to the enterprise can significantly affect the competitiveness and efficiency of the business. The essential issue is how information technology can enable the achievement of competitive and strategic advantage for the enterprise.

603 citations


Journal ArticleDOI
TL;DR: The firm-wide strategy-formation processes of the banks, rather than their information systems (I/S) methodology, were central to the alignment of business and information strategies.
Abstract: An empirical study that explored business and information strategy alignment in the information-intensive and competitive Australian banking industry is featured in this paper. The aim of the study was to identify organizational practices that contribute to and enhance such alignment. Multiple sources of information were used to collect data about business and information strategies from the major firms dominating Australian banking. Sources included written and interview-based information, strategic planning documentation, and annual reports. Evidence was sought for the alignment of business and information strategies through the use of information and information technology that provided a comparative advantage to an organization over its competitors. The firm-wide strategy-formation processes of the banks, rather than their information systems (I/S) methodology, were central to the alignment of business and information strategies. The interdependence of firm-wide processes and I/S factors is emphasized in a strategic alignment model that summarizes the findings of the study. The paper concludes with a discussion of the management implications and requirements for action in both firm-wide strategy and I/S areas. The results of this study in the banking industry are pertinent to other industries where information technology and systems are playing an increasingly strategic role.

338 citations


Journal ArticleDOI
TL;DR: This paper presents a framework for senior executives to use in order to lead the deployment of information technology (I/T) without having to know how it is managed and to ensure the fusion of business processes, people, and technology.
Abstract: When every leading firm in an industry has access to the same information technology resource, the management difference determines competitive advantage or disadvantage. The management challenge is to make sure that business processes, people, and technology are meshed, instead of being dealt with as separate elements in planning and implementation. This paper presents a framework for senior executives to use in order to lead the deployment of information technology (I/T) without having to know how it is managed and to ensure the fusion of business processes, people, and technology. The "fusion map" approach, which focuses on the steps that precede and enable strategy, has been applied in a number of companies. Factors are identified that make I/T a frequent destabilizer of basic logistics in an industry.

210 citations


Journal ArticleDOI
TL;DR: A framework of strategic control is presented that guides the planning and execution of investments in information technology for business transformation, seeking increased understanding and influence.
Abstract: The strategic role of information systems in "extending" the enterprise is examined. A number of issues emerge as essential considerations in the strategic alignment of the investment in information technology and business strategy. Information technologies transform organizational boundaries, interorganizational relations, and marketplace competitive and cooperative practice. The paper presents a framework of strategic control that guides the planning and execution of these investments in information technology for business transformation, seeking increased understanding and influence. Emerging information technologies change the limits of what is possible in the leverage of strategic control through transformation of boundaries, relations, and markets.

208 citations


Journal ArticleDOI
TL;DR: It is suggested that firms that successfully integrate an information technology (I/T) strategy with their business strategies do so by focusing on the information itself, rather than on technology, as the real carrier of value and source of competitive advantage.
Abstract: This paper suggests that firms that successfully integrate an information technology (I/T) strategy with their business strategies do so by focusing on the information itself, rather than on technology, as the real carrier of value and source of competitive advantage. A primary mechanism by which a firm becomes an information-intensive firm is the implementation of a procedure for measuring the value of its information assets V(I). This paper presents a methodology for measuring the value of information in the firm. It then describes the application of that methodology in an actual case study and discusses some consequences of being able to compare organizations with respect to their relative levels of information intensity.

192 citations


Journal ArticleDOI
TL;DR: This paper provides a "lens" through which managers can assess their firm's current competitive position, build a vision for where they must be in the future, and craft a transformation strategy to turn that future vision into reality.
Abstract: The old competitive strategies of invention and mass production no longer work in an increasingly turbulent business environment. Successful firms are implementing the new competitive strategies of continuous improvement (constant process improvement) and mass customization (a dynamic flow of goods and services via a stable set of processes). This paper provides a "lens" through which managers can assess their firm's current competitive position, build a vision for where they must be in the future, and craft a transformation strategy to turn that future vision into reality.

169 citations


Journal ArticleDOI
TL;DR: This paper introduces software reuse concepts and examines the cost-benefit trade-offs of software reuse investments, and provides a set of metrics used by IBM to accurately reflect the effort saved by reuse.
Abstract: To remain competitive, software development organizations must reduce cycle time and cost, while at the same time adding function and improving quality. One potential solution lies in software reuse. Because software reuse is not free, we must weigh the potential benefits against the expenditures of time and resources required to identify and integrate reusable software into products. We first introduce software reuse concepts and examine the cost-benefit trade-offs of software reuse investments. We then provide a set of metrics used by IBM to accurately reflect the effort saved by reuse. We define reuse metrics that distinguish the savings and benefits from those already gained through accepted software engineering techniques. When used with the return-on-investment (ROI) model described in this paper, these metrics can effectively establish a sound business justification for reuse and can help assess the success of organizational reuse programs.
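The cost-benefit trade-off the abstract describes can be sketched in a few lines. This is an illustrative sketch only: the `rcr` parameter (relative cost of reuse, i.e., the fraction of from-scratch cost spent finding and integrating a reusable part) and both function names are assumptions, not the paper's actual metrics or ROI model.

```python
def reuse_savings(reused_loc, cost_per_loc, rcr=0.2):
    """Effort saved by reusing code instead of writing it new.

    rcr: relative cost of reuse -- the fraction of from-scratch cost
    still incurred to identify and integrate the reusable part.
    """
    full_cost = reused_loc * cost_per_loc         # cost if written from scratch
    reuse_cost = reused_loc * cost_per_loc * rcr  # cost to find/integrate instead
    return full_cost - reuse_cost

def reuse_roi(savings, investment):
    """Return on the up-front investment in creating reusable parts."""
    return (savings - investment) / investment

# Hypothetical numbers: reusing 1,000 LOC at $10/LOC with rcr=0.2 saves $8,000;
# against a $4,000 investment in the reusable parts, ROI is 100 percent.
savings = reuse_savings(1_000, 10)
roi = reuse_roi(savings, 4_000)
```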

166 citations


Journal ArticleDOI
Martin L. Griss1
TL;DR: At Hewlett-Packard Co., a multifaceted corporate reuse program is initiated to help introduce the best practices of systematic reuse into the company, complemented by multidisciplinary research to investigate and develop better methods for domain-specific, reuse-based software engineering.
Abstract: Systematic software reuse is a key business strategy that software managers can employ to dramatically improve their software development processes, to decrease time-to-market and costs, and to improve product quality. Effective reuse requires much more than just code and library technology. We have learned that careful consideration must be given to people, process, and technology. One approach to the systematic integration of these three elements is the concept of the software factory. At Hewlett-Packard Co., we have initiated a multifaceted corporate reuse program to help introduce the best practices of systematic reuse into the company, complemented by multidisciplinary research to investigate and develop better methods for domain-specific, reuse-based software engineering. This essay discusses our experiences. Key aspects include domain-specific kits, business modeling, organization design, and technology infrastructure for a flexible software factory.

162 citations


Journal ArticleDOI
TL;DR: This three-phase transformation process starts with structured automation and re-engineering efforts, builds on new infrastructure and capabilities to enhance and extend the original business, and then redefines it to create new businesses.
Abstract: New information-technology-based capabilities make it possible to achieve systematic and dramatic gains in business performance. Reengineering offers one method to access these gains, but a broader process of business transformation explored in this paper can give enterprises a greater range of benefits. This three-phase transformation process starts with structured automation and re-engineering efforts, builds on new infrastructure and capabilities to enhance and extend the original business, and then redefines it to create new businesses.

154 citations


Journal ArticleDOI
A. L. Scherr1
TL;DR: The methodology builds upon traditional approaches to business process definition by adding the dimension of people's accountabilities--their roles, relationships, and agreements--providing unique insights into customer satisfaction, employee empowerment, and quality.
Abstract: This paper presents a methodology for analyzing and designing the processes that an enterprise uses to conduct its business. The methodology builds upon traditional approaches to business process definition by adding the dimension of people's accountabilities: their roles, relationships, and agreements. The approach presented allows for unique insights into customer satisfaction, employee empowerment, and quality. It also provides a basis for spanning the concerns of both business people and information technologists responsible for providing business process automation.

Journal ArticleDOI
TL;DR: Information technology (I/T) solutions are explored that drive firms toward making economic decisions based on worldwide distributed knowledge, focusing on global business drivers that identify where a firm can benefit most from the management and application of the technology.
Abstract: The alignment of worldwide computer-based information systems and integrated business strategies is critical to the success of multinational firms in a highly competitive global market. In this paper, information technology (I/T) solutions are explored that drive firms toward making economic decisions based on worldwide distributed knowledge. These solutions focus on a number of entities (or global business drivers) that identify where a firm can benefit most from the management and application of the technology. A variety of approaches for overcoming the barriers and risks of applying this technology are also discussed.


Journal ArticleDOI
Charlene Walrad, Eric Moss1
TL;DR: The relationships between productivity, quality, and measurement are described, classes of measures are identified, and "dominant measures" are grouped according to the maturity levels defined by the Software Engineering Institute's Capability Maturity Model for Software.
Abstract: Application development quality and productivity have been identified as being among the top ten concerns of information systems (I/S) executives in both 1991 and 1992. This paper discusses the role of measurement in pursuit of I/S application development quality and productivity. The relationships between productivity, quality, and measurement are described, classes of measures are identified, and "dominant measures" are grouped according to the maturity levels defined by the Software Engineering Institute's Capability Maturity Model for Software. Also discussed are the organizational and cultural issues associated with instituting a measurement process.

Journal ArticleDOI
TL;DR: The system includes feature extraction based on expert-assisted feature selection, spatial feature measurement, feature and shape representation, feature information compression and organization, search procedures, and pattern-matching techniques, and it uses novel data structures to represent the extracted information.
Abstract: This paper describes a system for content-based retrieval of facial images from an image database. The system includes feature extraction based on expert-assisted feature selection, spatial feature measurement, feature and shape representation, feature information compression and organization, search procedures, and pattern-matching techniques. The system uses novel data structures to represent the extracted information. These structures include attributed graphs for representing local features and their relationships, n-tuple of mixed mode data, and highly compressed feature codes. For the retrieval phase, a knowledge-directed search technique that uses a hypothesis refinement approach extracts specific features for candidate identification and retrieval. The overall system, the components, and the methodology are described. The system has been implemented on an IBM Personal System/2® running Operating System/2®. Examples demonstrating the performance of the system are included.

Journal ArticleDOI
Peter Stecher1
TL;DR: The motivation, structure, and possible uses of the Retail Application Architecture™ (RAA™), a set of generic enterprise models for companies in the retail and wholesale distribution industry, are described.
Abstract: An industry application architecture is a framework for integrating applications and databases and can also be used for analyzing and re-engineering the business of an enterprise as a whole, provided it is structured correctly. This paper describes the motivation, structure, and possible uses of the Retail Application Architecture™ (RAA™). The core of RAA is a set of generic enterprise models for companies in the retail and wholesale distribution industry. RAA is oriented as much to the business expert as to the information systems (I/S) department. The goal of RAA is to contribute to the task of building sound business systems in a more efficient and effective manner.


Journal ArticleDOI
TL;DR: This work presents a classic example of object-based systems development using box structures and shows how the box structure usage hierarchy allows stepwise refinement of the system design with referential transparency and verification at every step.
Abstract: Box structures provide a rigorous and systematic process for performing systems development with objects. Box structures represent data abstractions as objects in three system views and combine the advantages of structured development with the advantages of object orientation. As data abstractions become more complex, the box structure usage hierarchy allows stepwise refinement of the system design with referential transparency and verification at every step. An integrated development environment based on box structures supports flexible object-based systems development patterns. We present a classic example of object-based systems development using box structures.

Journal ArticleDOI
Michael Wasmund1
TL;DR: A practical application of the Critical Success Factors method on reuse technology insertion into the software development process and results are described, concluding with lessons learned and recommendations for similar efforts.
Abstract: Software reuse is one of several technologies that can improve quality and effectiveness of software development. The introduction of a reuse infrastructure within an existing organization and the associated modification of employee behavior and processes is a complex interdisciplinary task. The structuring and monitoring of several coordinated activities is required in order to be successful. This paper describes a practical application of the Critical Success Factors method on reuse technology insertion into the software development process. The Critical Success Factors method has proved to be a useful means for the introduction of software reuse concepts. Application of the method and results are discussed in detail, concluding with lessons learned and recommendations for similar efforts.

Journal ArticleDOI
Donald Hough1
TL;DR: The result of using Rapid Delivery is an enhanced ability to build applications that better support the enterprise through a continuous stream of delivered requirements, a reduction in the possibility of project failure, and a diminished likelihood of runaway projects.
Abstract: From a historical vantage point, large application development projects are frequently at risk of failure. Applications are typically developed using a monolithic development approach. Monolithic approaches generally feature business-user-defined requirements that are incorporated in the application but not evident until the resulting application has been implemented. To effectively produce new information systems, innovative methods must be utilized. This paper provides information about one of these, Rapid Delivery--a method for developing applications that can evolve over time. To fully understand the principles of Rapid Delivery, a discussion is included that illuminates a three-dimensional application model and its variations. The application model helps in understanding application segmentation, a technique used in Rapid Delivery to break applications into a variety of functional capabilities. After the development of each application segment has been completed, it is implemented to provide immediate benefit to the enterprise; each application segment is added to the evolving application and its ever-expanding capabilities. The result of using Rapid Delivery is an enhanced ability to build applications that better support the enterprise through a continuous stream of delivered requirements, a reduction in the possibility of project failure, and a diminished likelihood of runaway projects.

Journal ArticleDOI
P. V. Norden1
TL;DR: Some characteristics of the modeling process are illustrated, and the applicability of quantitative techniques to strategic alignment opportunities, such as current pressures to reduce the "cycle time" of many enterprise functions, is explored.
Abstract: There is increasing evidence in both the business and technical literature that the operations and strategy processes of many organizations have been aided materially by visualization and modeling techniques. Application of quantitative methods has progressed from relatively well-structured operations to the more speculative aspects of strategy and policy formation. In retrospect, however, the most valuable contribution of modeling has been greater insight: a clearer understanding of the situations and prospects at hand that the mere act of model formulation often provided the planner. This paper illustrates some characteristics of the modeling process, and explores the applicability of quantitative techniques to strategic alignment opportunities, such as current pressures to reduce the "cycle time" of many enterprise functions.

Journal ArticleDOI
TL;DR: The work group concluded that the reference designs described herein represent the best working assumption about "where customers are going" with distributed application designs and should give those who have not yet begun to exploit distributed systems a starting point and considerations for their design work.
Abstract: This paper is based on the findings and conclusions of a client/server work group that was commissioned in 1991 to report IBM's technical strategy for client/server computing. Although there are countless variations for designing applications and interconnecting components in a distributed environment, there seems to be a finite number of variations that represent what a large majority of customers want to build. The intent of the work group was to explore the possibility of defining a set of application "reference designs," which would represent the distributed designs that customers are building today or want to build in the near future. This paper documents the customer scenarios, the reference designs that represent them, and the requirements that were generated for the underlying system software. The work group concluded that the reference designs described herein represent our best working assumption about "where customers are going" with distributed application designs. The discussion should give those who have not yet begun to exploit distributed systems a starting point and considerations for their design work.

Journal ArticleDOI
Henry M. Gladney1
TL;DR: What differentiates DocSS among digital library projects is its approach to data distribution over wide area networks, its client-server approach to the heterogeneous environment, and its synergism with other components of evolving open systems.
Abstract: Digital storage and communications are becoming cost effective for massive collections of document images with access not only for nearby users but also for those who are hundreds of miles from their libraries. The Document Storage Subsystem (DocSS) provides generic library services such as searching, storage, and retrieval of document pages and sharing of objects with appropriate data security and integrity safeguards. A library session has three components: a manager of remote catalogs, a set of managers of large-object stores, and a manager of cache services. DocSS supports all kinds of page data--text, pictures, spreadsheets, graphics, programs--and can be extended to audio and video data. Document models can be built as DocSS applications; the paper describes a folder manager as an example. What differentiates DocSS among digital library projects is its approach to data distribution over wide area networks, its client-server approach to the heterogeneous environment, and its synergism with other components of evolving open systems.

Journal ArticleDOI
Kristine D. Saracelli1, Kurt Bandat1
TL;DR: The requirements for managing the AD process are discussed and the need for automated assistance for these management activities is established, and considerations for an automated system to manage the process are presented.
Abstract: Over the years, the field of application development (AD) has evolved from that of an art form to being more of a science, hence the emergence of concepts such as information engineering. In engineering and scientific fields, the value of process definition and management has long been known. This paper discusses the requirements for managing the AD process and establishes the need for automated assistance for these management activities. Considerations for an automated system to manage the process are presented, and the benefits to be realized by such an implementation are then discussed.

Journal ArticleDOI
Klaus Wothke1
TL;DR: A system is described that automatically generates phonetic transcriptions for German orthographic words and has an excellent linguistic performance: more than 99 percent of the segmented words obtain a correct segmentation in the first step, and more than 98% of the words receive a correct phonetic transcription in the second step.
Abstract: A system is described that automatically generates phonetic transcriptions for German orthographic words. The entire generative process consists of two main steps. In the first step, the system segments the words into their morphs, or prefixes, stems, and suffixes. This segmentation is very important for the transcription of German words, because the pronunciation of the letters depends also on their morphological environment. In the second step, the system transcribes the morphologically segmented words. Several transcriptions can be generated per word, thus permitting the system to take pronunciation variants into account. This feature results from the application area of the system, which is the provision of phonetic reference units for an automatic large-vocabulary speech recognition system. Statistical evaluations show that the transcription system has an excellent linguistic performance: more than 99 percent of the segmented words obtain a correct segmentation in the first step, and more than 98 percent of the words receive a correct phonetic transcription in the second step.
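The two-step pipeline described above (morphological segmentation, then per-morph transcription) can be sketched with a toy example. Everything here is an illustrative assumption: the morph tables, the two letter rules, and the function names are invented for the sketch; the real system's lexicon and rule set are far larger and are not reproduced in the abstract.

```python
# Toy morph tables (illustrative; the real system's lexicon is much larger).
PREFIXES = ["ver", "ge", "ein"]
SUFFIXES = ["ung", "en", "er"]

def segment(word):
    """Step 1: split a word into (prefix, stem, suffix) by greedy matching."""
    prefix = next((p for p in PREFIXES if word.startswith(p)), "")
    rest = word[len(prefix):]
    suffix = next((s for s in SUFFIXES
                   if rest.endswith(s) and len(rest) > len(s)), "")
    stem = rest[:len(rest) - len(suffix)] if suffix else rest
    return prefix, stem, suffix

def transcribe_morph(morph):
    """Step 2 (toy): letter-to-phone rules applied per morph, so a rule
    like final devoicing can be conditioned on the morph boundary."""
    phones = []
    for i, ch in enumerate(morph):
        if ch == "d" and i == len(morph) - 1:
            phones.append("t")   # final devoicing at a morph boundary
        elif ch == "w":
            phones.append("v")   # German <w> is pronounced [v]
        else:
            phones.append(ch)
    return "".join(phones)

def transcribe(word):
    return " ".join(transcribe_morph(m) for m in segment(word) if m)
```

Segmenting first matters because, as the abstract notes, pronunciation depends on the morphological environment: a stem-final "d" devoices even when a suffix letter follows it in the orthographic word.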

Journal ArticleDOI
D. Bauer1
TL;DR: The experience showed that it is not enough to use an object-oriented language to make reuse happen, but high-quality class libraries are essential to get the desired productivity improvements.
Abstract: During the creation and the establishment of the parts center, our group discovered what language features and what infrastructure were needed to enable the building and the distribution of reusable components. Many of the problems we had to solve were caused by the inadequate support of reuse through the language we initially used. Significant progress was achieved when object-oriented languages like C++, which support reuse inherently, became available. Our experience showed that it is not enough to use an object-oriented language to make reuse happen, but high-quality class libraries are essential to get the desired productivity improvements.

Journal ArticleDOI
J. R. Tirso1, H. Gregorius1
TL;DR: In this article, the authors argue that it is better to obtain code from libraries that contain reusable parts than to write new code, as long as measurement of productivity is not made in the standard form of lines of code per unit of effort.
Abstract: For years, programmers have believed that productivity gains could be achieved if programs could be reused. The logic is that it is better to obtain code from libraries that contain reusable parts than to write new code. One can easily see the productivity gains from not writing code, as long as measurement of productivity is not made in the standard form of lines of code per unit of effort. Building systems from existing proven parts can also lead to increases in quality.
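The measurement caveat in this abstract can be made concrete with a small illustration. The numbers are hypothetical, not from the paper: the point is only that "lines of code per unit of effort" scores a reuse-heavy project lower even when it delivers the same function sooner.

```python
def loc_productivity(new_loc_written, effort_days):
    """The standard (naive) metric: new lines written per day of effort."""
    return new_loc_written / effort_days

# Hypothetical numbers: the same delivered function either way.
scratch = loc_productivity(10_000, 100)  # all 10k lines written new
reuse = loc_productivity(2_000, 30)      # 2k new lines plus 8k reused lines

# The reuse project finishes in less than a third of the time, yet the
# naive metric scores it lower (about 67 vs. 100 LOC per day), which is
# why reuse programs need metrics that credit the lines *not* written.
```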

Journal ArticleDOI
Bruce McNutt1
TL;DR: A case study, based on data collected at a large Multiple Virtual Storage installation, is used to investigate the potential types and amounts of memory use by individual files, both in storage control cache and in processor buffers, and to develop broad guidelines for how best to deploy an overall memory budget.
Abstract: I/O subsystem configurations are dictated by the storage and I/O requirements of the specific applications that use the disk hardware. Treating the latter requirement as a given, however, draws a boundary at the channel interface that is not well-suited to the capabilities of the Enterprise Systems Architecture (ESA). This architecture allows hardware expenditures in the I/O subsystem to be managed, while at the same time improving transaction response time and system throughput capability, by a strategy of processor buffering coupled with storage control cache. The key is to control the aggregate time per transaction spent waiting for physical disk motion. This paper investigates how to think about and accomplish such an objective. A case study, based on data collected at a large Multiple Virtual Storage installation, is used to investigate the potential types and amounts of memory use by individual files, both in storage control cache and in processor buffers. The mechanism of interaction between the two memory types is then examined and modeled so as to develop broad guidelines for how best to deploy an overall memory budget. These guidelines tend to contradict the usual metrics of storage control cache effectiveness, underscoring the need for an adjustment in pre-ESA paradigms.

Journal ArticleDOI
K. P. Yglesias1
TL;DR: The software life cycle produces several types of reusable information such as customer information, product development information, and process information, which benefits from the use of common tools and centrally coordinated standards and terminology.
Abstract: Organizations that place a high value on their information can best leverage and maintain that information if they apply the same infrastructure and techniques used for reusable software. The software life cycle produces several types of reusable information such as customer information, product development information, and process information. Just as software reuse benefits from structured programming practices, information reuse benefits from the use of common tools, centrally coordinated standards and terminology, and development practices consistent with good writing and design.