
Showing papers in "IEEE Software in 1998"


Journal ArticleDOI
TL;DR: It was found that while many companies express interest in Jacobson's use case approach, actual scenario usage often falls outside what is described in textbooks and standard methodologies, and that users face significant scenario management problems not yet adequately addressed in theory or practice.
Abstract: Scenario-based approaches are becoming ubiquitous in systems analysis and design but remain vague in definition and scope. A survey of current practices indicates we must offer better means for structuring, managing, and developing their use in diverse contexts. The European Esprit project Crews (Cooperative Requirements Engineering with Scenarios) is seeking a deeper understanding of scenario diversity, necessary to improve methodological and tool support for scenario-based requirements engineering. The project follows a two-pronged strategy to gain this understanding. First, following the "3 dimensions" requirements engineering framework developed in the precursor Nature project (K. Pohl, 1994), the researchers developed a scenario classification framework based on a comprehensive survey of scenario literature in requirements engineering, human-computer interaction, and other fields. They used the framework to classify 11 prominent scenario-based approaches. Second, to complement this research framework, they investigated scenario applications in industrial projects through site visits with scenario user projects. The article focuses on these site visits. It was found that while many companies express interest in Jacobson's use case approach, actual scenario usage often falls outside what is described in textbooks and standard methodologies. Users therefore face significant scenario management problems not yet adequately addressed in theory or practice, and are demanding solutions to these problems.

414 citations


Journal ArticleDOI
TL;DR: The article describes how to perform domain engineering by identifying the commonalities and variabilities within a family of products, using examples dealing with reuse libraries, design patterns, and programming language design.
Abstract: The article describes how to perform domain engineering by identifying the commonalities and variabilities within a family of products. Through interesting examples dealing with reuse libraries, design patterns, and programming language design, the authors suggest a systematic scope, commonality, and variability (SCV) approach to formal analysis. Their SCV analysis has been an integral part of the FAST (Family-oriented Abstraction, Specification, and Translation) technology applied to over 25 domains at Lucent Technologies.
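
To make SCV analysis concrete, the sketch below records a scope, commonalities, and variabilities for a small hypothetical product family; the domain and field names are illustrative and are not drawn from the FAST toolset.

```python
# A minimal sketch of recording an SCV (scope, commonality, variability)
# analysis for a hypothetical family of event-logging components.
from dataclasses import dataclass, field

@dataclass
class Variability:
    name: str          # the decision that distinguishes family members
    choices: list      # the allowed bindings ("parameters of variation")
    default: object    # binding assumed when a member does not decide

@dataclass
class DomainModel:
    scope: str                       # which products are in the family
    commonalities: list              # assumptions true of every member
    variabilities: list = field(default_factory=list)

logging_family = DomainModel(
    scope="Event loggers embedded in our retail terminal products",
    commonalities=[
        "Every logger timestamps each event",
        "Every logger writes records in arrival order",
    ],
    variabilities=[
        Variability("output_device", ["file", "serial_port", "network"], "file"),
        Variability("max_record_bytes", [128, 256, 1024], 256),
    ],
)

# A concrete family member is just a set of bindings for the variabilities.
member = {"output_device": "network", "max_record_bytes": 1024}
```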

404 citations


Journal ArticleDOI
TL;DR: This report presents key discussion points from a workshop on CBSE and provides a useful synthesis of participants' diverse perspectives and experiences.
Abstract: As organizations adopt component-based software engineering, it becomes essential to clearly define its characteristics, advantages and organizational implications. This report presents key discussion points from a workshop on CBSE and provides a useful synthesis of participants' diverse perspectives and experiences.

368 citations


Journal ArticleDOI
Elaine J. Weyuker
TL;DR: The author emphasizes the need to closely examine a problematic aspect of component reuse: the necessity and potential expense of validating components in their new environments.
Abstract: Components designed for reuse are expected to lower costs and shorten the development life cycle, but this may not prove so simple. The author emphasizes the need to closely examine a problematic aspect of component reuse: the necessity and potential expense of validating components in their new environments.
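
As an illustration of the concern (our construction, not an example from the article), the sketch below shows a reused component whose implicit assumption held in its original environment but fails in a new one; only a test written against the new environment's conditions exposes the mismatch.

```python
# Illustrative only: a reused component carries an implicit assumption
# from its original environment, so it must be revalidated when deployed
# in a new one.

def average_response_ms(samples):
    """Component reused from System A, where samples were never empty."""
    return sum(samples) / len(samples)   # hidden assumption: len(samples) > 0

def test_in_new_environment():
    # System B can legitimately produce an empty sample window, so the
    # component's original test suite is insufficient here.
    try:
        average_response_ms([])
    except ZeroDivisionError:
        return "revalidation caught an environment mismatch"
    return "ok"

print(test_in_new_environment())
```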

314 citations


Journal ArticleDOI
TL;DR: A model for matching COTS product features with user requirements is proposed, along with lessons learned that extend state-of-the-art requirements acquisition techniques to the component-based software engineering process.
Abstract: Commercial off-the-shelf software can save development time and money if you can find a package that meets your customer's needs. To support requirements acquisition for selecting COTS products, we propose a model for matching product features with user requirements, a method we used recently to select a complex COTS software system that had to comply with over 130 customer requirements. The lessons we learned from that experience refined our design of PORE (procurement-oriented requirements engineering), a template-based method for requirements acquisition. We report 11 of these lessons, with particular focus on the typical problems that arose and solutions to avoid them in the future. These solutions, we believe, extend state-of-the-art requirements acquisition techniques to the component-based software engineering process.
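
One common way to operationalize this kind of feature-requirement matching is a weighted compliance score. The sketch below illustrates that general idea with hypothetical products and weights; it is not PORE's actual template mechanism.

```python
# Scoring COTS candidates against weighted customer requirements.
# The data and scoring scheme are illustrative, not PORE's templates.

requirements = {         # requirement id -> weight (customer priority)
    "R1 multi-currency": 3,
    "R2 audit trail":    2,
    "R3 LDAP login":     1,
}

# compliance: 1.0 = fully met, 0.5 = met with workaround, 0.0 = unmet
candidates = {
    "Product A": {"R1 multi-currency": 1.0, "R2 audit trail": 0.5, "R3 LDAP login": 0.0},
    "Product B": {"R1 multi-currency": 0.5, "R2 audit trail": 1.0, "R3 LDAP login": 1.0},
}

def score(product):
    features = candidates[product]
    return sum(weight * features.get(req, 0.0)
               for req, weight in requirements.items())

best = max(candidates, key=score)
print(best, score(best))   # Product A scores 4.0, Product B 4.5 -> Product B
```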

300 citations


Journal ArticleDOI
TL;DR: The paper discusses the elements of the GQM approach and fitting GQM to a measurement program.
Abstract: Schlumberger RPS (Retail Petroleum Systems) integrated the Goal/Question/Metric approach into their existing measurement programs to improve their program performance. Key to their success was the use of feedback sessions as a forum to analyze and interpret measurement data. The paper discusses the elements of the GQM approach and fitting GQM to a measurement program.
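
For readers unfamiliar with GQM, the sketch below shows the basic goal-question-metric hierarchy and how feedback sessions traverse it; the goal, questions, and metrics are hypothetical examples, not Schlumberger RPS's actual program.

```python
# A minimal sketch of a Goal/Question/Metric hierarchy.
# The goal, questions, and metrics are hypothetical examples.

gqm = {
    "goal": "Reduce the number of defects found after release",
    "questions": {
        "Where are defects introduced?": [
            "defects per phase (requirements, design, code)",
        ],
        "How effective are our reviews?": [
            "defects found per review hour",
            "percentage of defects caught before test",
        ],
    },
}

# Feedback sessions walk the tree bottom-up: the team interprets each
# metric only in the context of the question and goal it serves.
for question, metrics in gqm["questions"].items():
    print(question, "->", ", ".join(metrics))
```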

143 citations


Journal ArticleDOI
TL;DR: The paper discusses the three principles that drive the DCA approach and considers the impact of DCA on an organization.
Abstract: Defect causal analysis offers a simple, low-cost method for systematically improving the quality of software produced by a team, project, or organization. DCA takes advantage of one of the most widely available types of quality information, the software problem report. This information drives a team-based technique for defect causal analysis. The analysis leads to process changes that help prevent defects and ensure their early detection. The paper discusses the three principles that drive the DCA approach. It considers the impact of DCA on an organization.
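
The tallying step that typically opens a DCA session can be sketched as below; the cause categories and counts are hypothetical, not data from the article.

```python
# Illustrative sketch: defect causal analysis often starts by tallying
# problem reports into cause categories and focusing process changes
# on the dominant ones.
from collections import Counter

# cause category drawn from each software problem report (hypothetical)
problem_reports = [
    "ambiguous requirement", "coding slip", "ambiguous requirement",
    "interface misunderstanding", "ambiguous requirement", "coding slip",
]

by_cause = Counter(problem_reports)
for cause, count in by_cause.most_common():
    print(f"{cause}: {count}")
# The team then proposes a preventive process change for the top cause,
# e.g., adding a requirements walkthrough before design begins.
```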

140 citations


Journal ArticleDOI
Ho-Won Jung
TL;DR: The author created a variant of the "knapsack" approach that reduces the complexity of earlier approaches; he presents two case studies to illustrate its application.
Abstract: When creating a software system, developers are often faced with a long list of requirements and a limited budget. The article gives developers a method to balance the cost and value of the requirements, and then implement the most cost-effective set. The author created a variant of the "knapsack" approach that reduces the complexity of earlier approaches; he presents two case studies to illustrate its application.
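
The underlying optimization is the classic 0/1 knapsack problem. The sketch below uses the textbook dynamic-programming formulation with hypothetical requirement data; it is not the author's reduced-complexity variant.

```python
# Standard 0/1 knapsack dynamic program applied to requirements selection.

requirements = [          # (name, cost, value to the customer)
    ("online help", 4, 7),
    ("audit log",   3, 5),
    ("csv export",  2, 4),
    ("themes",      5, 6),
]
budget = 9

# best[c] = maximum total value achievable with cost budget c
best = [0] * (budget + 1)
chosen = [[] for _ in range(budget + 1)]
for name, cost, value in requirements:
    for c in range(budget, cost - 1, -1):   # iterate downward: 0/1, not unbounded
        if best[c - cost] + value > best[c]:
            best[c] = best[c - cost] + value
            chosen[c] = chosen[c - cost] + [name]

print(best[budget], chosen[budget])
# -> 16 ['online help', 'audit log', 'csv export']
```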

128 citations


Journal ArticleDOI
TL;DR: The software development and maintenance processes, which the author prefers to unify and call software evolution, are key to managing computerization and, in his view, to our survival in this computer age.
Abstract: In his essay, Ed Yourdon expresses, justifies, and leaves unresolved two well-founded questions: What is the future of software? What does the future hold for the software professional? His prognosis is evasive, incomplete, and unsatisfying: the future will be good for some, not so for others. Given Yourdon's extensive experience in the real world of computer usage, as proven by the problems he has observed, it is easy to see why he feels that software's future is uncertain. But he does not point to a solution to this uncertainty, nor does he indicate what can be done to achieve the best possible outcome for software professionals. More importantly, Yourdon's analysis does not indicate what should be done to ensure the security, well-being, and survival of society, which depends increasingly on software. For more than a decade now, there have been those in the software engineering community who have accepted that the need to continually change and evolve software is a fact, one addressed through activity that is planned, executed, and controlled by humans. Thus, the software development and maintenance processes, which I prefer to unify and call software evolution, are key to managing computerization. In my view, it is key to our survival in this computer age.

127 citations


Journal ArticleDOI
TL;DR: Traditional software usability methods can help us design more understandable and more useful application program interfaces (APIs); they also give us the information we need to write good API reference documentation.
Abstract: Traditional software usability methods can help us design more understandable and more useful application program interfaces (APIs). They also give us information we need to write good API reference documentation-before we invest in either programmers or writers and before evolving a large body of code or content.

118 citations


Journal ArticleDOI
TL;DR: It is shown that interrupting this process can significantly reduce a developer's efficiency and can even contribute to project delays.
Abstract: Software development is a highly abstract process that requires intense concentration. The authors show that interrupting this process can significantly reduce a developer's efficiency and can even contribute to project delays.

Journal ArticleDOI
TL;DR: The author's Web-based Agile Software Engineering Environment answers the challenges of today's development process with a time-based life cycle that provides just-in-time process management.
Abstract: The pressure to be first to market with a new product has accelerated the development process. Today's process models require agility, defined as the ability to operate in real time and to adapt quickly to changing requirements and conditions. Further, the global, distributed nature of many projects-often developed now over the Internet and intranets-demands flexible, integrated tools. The author's Web-based Agile Software Engineering Environment answers these challenges with a time-based life cycle that provides just-in-time process management.

Journal ArticleDOI
TL;DR: This work explores how automated tools might support the dynamic modeling phase of object oriented software development by using the Object Modeling Technique as a guideline and notational basis.
Abstract: We explore how automated tools might support the dynamic modeling phase of object oriented software development. We use the Object Modeling Technique as a guideline and notational basis, but in principle our approach is not tied to any particular OO methodology. We assume, however, that dynamic modeling exploits scenarios (as in OMT) describing examples of using the system being developed. Our techniques can easily be adopted for various scenario representations, such as sequence diagrams or collaboration diagrams in UML.
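
The core idea, merging example event traces into one state-transition table, can be sketched as follows; this is a deliberate simplification for illustration, not the authors' actual synthesis algorithm.

```python
# Simplified sketch of scenario-based dynamic modeling: merge example
# event traces for one object into a single state-transition table.

scenarios = [   # each scenario: sequence of events received by the object
    ["insert_card", "enter_pin", "withdraw", "eject_card"],
    ["insert_card", "enter_pin", "check_balance", "eject_card"],
]

transitions = {}                  # (state, event) -> next state
state_of_prefix = {(): "idle"}    # shared scenario prefixes share states
for trace in scenarios:
    prefix = ()
    for event in trace:
        src = state_of_prefix[prefix]
        prefix += (event,)
        dst = state_of_prefix.setdefault(prefix, f"s{len(state_of_prefix)}")
        transitions[(src, event)] = dst

for (src, event), dst in transitions.items():
    print(f"{src} --{event}--> {dst}")
```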

Journal ArticleDOI
TL;DR: The author asserts that any paradigm that decomposes a system into large numbers of small components, as frequently occurs in both OO and conventional systems, is fundamentally wrong, so we should expect no particular benefits to accrue to an OO system over a non-OO system.
Abstract: Is object orientation an imperfect paradigm for reliable coding? Worse, does it focus on the wrong part of the life cycle? The author thinks so and explains why. Given that corrective-maintenance costs already dominate the software life cycle and look set to increase significantly, the author argues that reliability in the form of reducing such costs is the most important software improvement goal. Yet, the results are not promising when we review recent corrective-maintenance data for big systems in general and for OO systems, in this case written in C++. The author asserts that any paradigm capable of decomposing a system into large numbers of small components (as frequently occurs in both OO and conventional systems) is fundamentally wrong. Thus, because both paradigms suffer from this flaw, we should expect no particular benefits to accrue to an OO system over a non-OO system. Further, a detailed comparison of OO programming and the human thought processes involved in short- and long-term memory suggests that OO aligns with human thinking limitations indifferently at best. In the case studies described, OO is no more than a different paradigm, and emphatically not a better one, although it is not possible to apportion blame between the OO paradigm itself and its C++ implementation.

Journal ArticleDOI
TL;DR: Advice is given on taking the COTS option and on the management decisions that must be made; because arriving last to market can be fatal, any approach that carves days or weeks from a development schedule is worth considering.
Abstract: A new trend in software commerce is emerging: generic software components, also called commercial off the shelf components, that contain fixed functionality. COTS components can be incorporated into other systems still under development so that the developing system and the generic components form a single functional entity. The role of COTS components is to help new software systems reach consumers more quickly and cheaply. Because arriving last to market spells sudden death in the software industry, any approach that carves days or weeks from a development schedule is worth considering. The article gives advice on taking the COTS option and the management decisions that have to be made.

Journal ArticleDOI
TL;DR: Nine software leaders weigh in on whether the old tried-and-true principles of software engineering make any sense on the completely new playing field of the Internet.
Abstract: Do the old tried and true principles of software engineering make any sense on the completely new playing field of the Internet? See what nine software leaders have to say.

Journal ArticleDOI
TL;DR: The trick is to perform the minimum amount of work necessary to determine that the project should be cancelled.
Abstract: Is a cancelled project a bad project? After surveying about 8,000 IT projects, the Standish Group reported that about 30 percent of all projects were cancelled ("Charting the Seas of Information Technology", 1994). Capers Jones reports that the average cancelled project in the US is about a year behind schedule and has consumed 200 percent of its expected budget by the time it's cancelled (Assessment and Control of Software Risks, Yourdon Press, 1994). Jones estimates that work on cancelled projects comprises about 15 percent of total US software efforts, amounting to as much as $14 billion per year in 1993 dollars. In spite of these grim statistics, cancelling a project is, in itself, neither good nor bad. Cancelling a project later than necessary is bad. The trick is to perform the minimum amount of work necessary to determine that the project should be cancelled.

Journal ArticleDOI
TL;DR: The author argues that to reap the benefits of component-based development (reduced time to market, more user choice, and lower costs), we must rethink our software maintenance strategies.
Abstract: As we continue to move toward component-based software engineering, software development will become more like traditional manufacturing: developers will code less and design and integrate more. The author argues that to reap the benefits of component-based development (reduced time to market, more user choice, and lower costs), we must rethink our software maintenance strategies. He gives a wide-ranging overview of the maintenance challenges raised by component-based development.

Journal ArticleDOI
TL;DR: The architect's role in building design offers a fruitful analogy for software professionals seeking to reform software practice; it helps focus attention on usefulness, usability, user needs and practices, and other technical and nontechnical aspects of good software design.
Abstract: Some software designers have recently turned for inspiration to the process of building design to improve development practices and increase software's usefulness and effectiveness. Architects' education revolves around the studio course, which promotes: project based work on complex and open ended problems; very rapid iteration of design solutions; frequent formal and informal critique; consideration of heterogeneous issues; the use of precedent and thinking about the whole; the creative use of constraints; and the central importance of design media. M. Kapor (1991) suggested that software practitioners needed to "rethink the fundamentals of how software is made" and proposed the architect's role in building design as a fruitful analogy for software professionals seeking to reform software practice. This analogy helps us focus on usefulness, usability, user needs and practices, and other technical and nontechnical aspects of good software design. It highlights concerns about people's lives and work practices and how people "inhabit" systems. Several authors have explored similarities and differences between software design and building design, including some who have pursued the software implications of architect Christopher Alexander's design patterns.

Journal ArticleDOI
TL;DR: A state-of-the-practice overview and some of the latest work in CBSE, as well as industry forecasts from two leaders in the field, are presented.
Abstract: Components are "hot," and changing at a fast pace. Here are industry forecasts from two leaders in the field, as well as their choices for inclusion in this focus: a state-of-the-practice overview and some of the latest work in CBSE.

Journal ArticleDOI
TL;DR: The PSP focuses on defect reduction and estimation improvement as the two primary goals of personal process improvement and, through individual collection and analysis of personal data, shows how individuals can implement empirically guided software process improvement.
Abstract: In 1995, Watts Humphrey introduced the Personal Software Process in his book, A Discipline for Software Engineering (Addison Wesley Longman, Reading, Mass.). Programmers who use the PSP gather measurements related to their own work products and the process by which they were developed, then use these measures to drive changes to their development behavior. The PSP focuses on defect reduction and estimation improvement as the two primary goals of personal process improvement. Through individual collection and analysis of personal data, the PSP shows how individuals can implement empirically guided software process improvement. The full PSP curriculum leads practitioners through a sequence of seven personal processes. The first and simplest PSP process, PSP0, requires that practitioners track time and defect data using a Time Recording Log and Defect Recording Log, then fill out a detailed Project Summary Report. Later processes become more complicated, introducing size and time estimation, scheduling, and quality management practices such as defect density prediction and cost-of-quality analyses. After almost three years of teaching and using the PSP, we have experienced its educational benefits. As researchers, however, we have also uncovered evidence of certain limitations. We believe that awareness of these limitations can help improve appropriate adoption and evaluation of the method by industrial and academic practitioners.
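
A rough illustration of the data PSP0 asks a practitioner to record appears below; the field names and summary are simplified stand-ins for the actual PSP forms.

```python
# A minimal sketch of PSP0-style data collection. Field names are
# illustrative; the actual PSP forms are more detailed.
from dataclasses import dataclass

@dataclass
class TimeEntry:
    phase: str        # e.g. "design", "code", "test"
    minutes: int

@dataclass
class DefectEntry:
    phase_injected: str
    phase_removed: str
    fix_minutes: int

time_log = [TimeEntry("design", 45), TimeEntry("code", 120), TimeEntry("test", 60)]
defect_log = [DefectEntry("code", "test", 25), DefectEntry("design", "test", 40)]

# Project Summary: totals that later PSP processes use for estimation
total_minutes = sum(e.minutes for e in time_log)
defects_in_test = sum(1 for d in defect_log if d.phase_removed == "test")
print(total_minutes, defects_in_test)   # 225 minutes, 2 defects found in test
```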

Journal ArticleDOI
TL;DR: It is suggested that properly framed software licensing agreements provide the vehicle for ensuring that developers are fairly compensated for the time and resources invested in making and using software components.
Abstract: For developers to feel it worthwhile to invest in software components, both licensors and licensees must be fairly compensated for the time and resources invested in making and using such components. The authors suggest that properly framed software licensing agreements provide the vehicle for ensuring such compensation.

Journal ArticleDOI
TL;DR: The author discusses the creation of new models and considers: software process improvement; software testing; software design methods; software inspection; risk management; and software metrics.
Abstract: The author outlines several sets of software engineering and management practices that, in his experience, are still not being routinely applied across the industry. He discusses the creation of new models and considers: software process improvement; software testing; software design methods; software inspection; risk management; and software metrics.

Journal ArticleDOI
TL;DR: The author asserts that Boca provides a solution to the challenge of segregating core business information from the technological specifics of its home system while allowing different applications to exchange data across the enterprise.
Abstract: The Common Business Object Framework aims at coming up with a set of system-level abstractions and interoperability standards to simplify and standardize business systems development. The author asserts that Boca provides a solution to the challenge of segregating core business information from the technological specifics of its home system while allowing different applications to exchange data across the enterprise.

Journal ArticleDOI
TL;DR: The Best Practices Framework of the Software Program Manager's Network outlines some solutions to software development failure, including managing the development process with more attention to what might go wrong.
Abstract: The evidence is voluminous, consistent, and incontrovertible. It applies to corporate, government agency, and military software development. Quite simply, the software we build does not meet our customers' needs: those of us who build large software programs fail miserably, 90 percent of the time, to deliver what customers want, when they want it, at the agreed-upon price. We fail to adequately manage the software development process; user-developer communication breaks down; the requirements control process breaks down; and we end up with runaway requirements, budgets, schedules, and "death march" projects. The Best Practices Framework of the Software Program Manager's Network outlines some solutions. First, we must identify what can go wrong. Precedents give ample hints regarding risks. We need to manage the development process with more attention, particularly to what might go wrong. Second, we must manage the most fundamental part of our task: defining our goal. We fail to use requirements management to surface errors or problems early, to baseline and track changes, and to improve user-developer communication.

Journal ArticleDOI
TL;DR: Lehman's notion of program types can be used to describe a system in terms of the way it relates to the environment in which it operates, characterized by the way the software interacts with that environment and by the degree to which the environment and the underlying problem can change.
Abstract: M. Lehman's (Proc. IEEE, vol.68, no.9, p.1060-76, 1980) notion of programs can be used to describe a system in terms of the way it relates to the environment in which it operates. Unlike programs handled in the abstract, the real world contains uncertainties and concepts we do not understand completely. The more dependent a system is on the real world for its requirements, the more likely it is to change. Systems are described as S-systems (formally defined by and derivable from a Specification), P-systems (based on a Practical abstraction of the problem) and E-systems (Embedded in the real world and changing as the world does), characterized by the way the software interacts with its environment and by the degree to which the environment and underlying problem can change. Using these ideas, we can design our systems to be flexible, and plan our maintenance releases and new versions so that we understand and control our software, rather than merely react when problems arise.
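
The taxonomy lends itself to a compact summary in code; the example classifications below follow Lehman's commonly cited illustrations (a sorting routine, a chess player, a payroll system), though the snippet itself is ours, not the article's.

```python
# Lehman's three system types, with commonly cited example classifications.
from enum import Enum

class SystemType(Enum):
    S = "S-system: formally defined by and derivable from a specification"
    P = "P-system: based on a practical abstraction of the problem"
    E = "E-system: embedded in the real world and changing as it does"

examples = {
    "routine that sorts integers": SystemType.S,  # the spec fully defines it
    "chess-playing program": SystemType.P,        # exact solution intractable, so
                                                  # we solve a practical abstraction
    "payroll system": SystemType.E,               # tax law and business rules
                                                  # keep changing around it
}

for system, kind in examples.items():
    print(f"{system}: {kind.value}")
```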

Journal ArticleDOI
TL;DR: The author reveals that estimation of effort is key to managing OO development projects, documents a few rules of thumb for doing so, and recommends collecting metrics throughout the project to continuously revalidate assumptions and project performance.
Abstract: The author reveals that estimation of effort is key to managing OO development projects, then documents a few rules of thumb for doing this. One lesson he's learned from using these rules is that estimates for a whole project are not reliable if estimates for subcomponents are simply added up. As the project size increases, so does the overhead for communication and general interaction. He goes beyond effort metrics and recommends that one collect metrics throughout the project and use them for continuous revalidation of assumptions and project performance. If one accepts that "what you can't measure, you can't manage", this is good advice.
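
The author's specific rules of thumb appear in the article itself; the sketch below only illustrates the stated lesson that subcomponent estimates cannot simply be summed, using a hypothetical diseconomy-of-scale exponent.

```python
# Illustrates why whole-project effort exceeds the sum of subcomponent
# estimates as communication overhead grows. The exponent is a
# hypothetical placeholder, not one of the author's rules of thumb.

subcomponent_weeks = [8, 5, 12, 6]        # per-component estimates
naive_total = sum(subcomponent_weeks)     # 31 weeks: too optimistic

DISECONOMY = 1.1                          # >1 models growing interaction cost
adjusted_total = naive_total ** DISECONOMY

print(f"naive: {naive_total} weeks, adjusted: {adjusted_total:.0f} weeks")
# naive: 31 weeks, adjusted: 44 weeks
```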

Journal ArticleDOI
TL;DR: The paper discusses the advantages of Linux, considers massively parallel software development, and discusses open source development.
Abstract: Developed by Torvalds in 1991 from Minix, a simplified version of Unix intended for educational use, Linux has grown from a mere 10,000-line kernel to a full-featured OS with more than 1.5 million lines of code. Once thought the exclusive province of hobbyists, students, and hard-core programmers, Linux has quietly, unassumingly slipped into the software mainstream. The paper discusses the advantages of Linux and considers massively parallel software development. It also discusses open source development.

Journal ArticleDOI
TL;DR: Taking this bottom-up approach, the author presents some tricks that practitioners down in the trenches can use to win upper management's approval of and support for SPI.
Abstract: Most everyone in our field acknowledges that software process improvement cannot succeed unless management is committed to implementing it. Fortunately, the move to SPI can be a two-way street, initiated as easily from the bottom up as from the top down. Taking this bottom-up approach, I present some tricks that practitioners down in the trenches can use to win upper management's approval of and support for SPI.

Journal ArticleDOI
TL;DR: Slowly, we are bridging the gap between requirements engineering research and practice; the gap is still large, but we have a few more practice-validated methods and tools in our pockets, and the bridge building continues.
Abstract: Slowly, we are bridging the gap between requirements engineering research and practice. The gap is still large, but we have a few more practice-validated methods and tools in our pockets, and the bridge building continues.