Showing papers in "Journal of Systems and Software in 1990"
••
TL;DR: Dynamic slicing introduced in this paper differs from the original static slicing in that it is defined on the basis of a computation and allows us to treat array elements and fields in dynamic records as individual variables, which leads to a further reduction of the slice size.
245 citations
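The key idea above can be illustrated with a tiny backward pass over one execution trace. This is a minimal sketch under assumed, hypothetical trace and statement formats (not the paper's actual notation), and it covers only data dependences, not control dependences; note how an array element such as "a[0]" is treated as a variable distinct from "a[1]", which is what shrinks the slice.

```python
# A minimal dynamic-slicing sketch: each executed step records the statement
# id, the variable it defines, and the variables it uses.  Walking the trace
# backwards from the slicing criterion keeps only the statements whose
# definitions actually reached a use in this particular execution.

def dynamic_slice(trace, criterion_vars):
    """trace: list of (stmt_id, defined_var, used_vars); returns stmt ids."""
    relevant = set(criterion_vars)     # variables still awaiting a definition
    in_slice = set()
    for stmt, defined, uses in reversed(trace):
        if defined in relevant:
            in_slice.add(stmt)
            relevant.discard(defined)  # this definition satisfies the demand
            relevant.update(uses)      # ...and creates demand for its inputs
    return in_slice

# Execution of:  1: x = 1;  2: a[0] = x;  3: a[1] = 5;  4: y = a[0]
trace = [(1, "x", []), (2, "a[0]", ["x"]), (3, "a[1]", []), (4, "y", ["a[0]"])]
print(sorted(dynamic_slice(trace, ["y"])))  # [1, 2, 4]: statement 3 drops out
```

Because "a[1]" never reaches the criterion variable `y` in this run, statement 3 is excluded, whereas a static slicer treating the whole array `a` as one variable would have to keep it.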
••
TL;DR: The need for a formal specification language for real-time applications and for a support environment providing tools for reasoning about formal specifications is motivated and TRIO, a logic-based specification language, is introduced.
241 citations
••
TL;DR: Although the algorithm does not support compaction of the whole data space, it efficiently supports partial compaction such as array relocation, and reduces the execution overhead to a great extent.
193 citations
••
TL;DR: A study of two Ada projects shows that software defects can be predicted in the development phase, and measures of package size proved to be the best predictors of defect densities.
109 citations
••
TL;DR: This work believes that measurement theory provides an appropriate basis for defining measures of the supposedly key internal attributes of software and shows how it is used to define a measure of coupling.
109 citations
••
TL;DR: The Grubstake Group is dedicated to producing an environment in which software measures can be used with confidence by software managers and programmers, which in turn requires a formal and rigorous foundation for software measurement.
96 citations
••
86 citations
••
TL;DR: The notion of a single metric, called relative complexity, is developed: it assigns each program in a program set a single value that orders the programs by their complexity, and it may serve as a leading indicator of which programs will require large amounts of system resources during the development and maintenance phases.
71 citations
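The single-number idea can be sketched concretely. The published technique weights principal components of the metric data; the simplified version below (an assumption for illustration, not the paper's formula) just sums z-scores of each raw metric, which already yields one value per program and hence a complexity ordering of the program set.

```python
# Simplified "relative complexity": map each program's raw metric vector to
# a single number by summing the standardized (z-score) values per metric.
from statistics import mean, pstdev

def relative_complexity(metric_table):
    """metric_table: {program: [m1, m2, ...]}; returns {program: score}."""
    progs = list(metric_table)
    n_metrics = len(next(iter(metric_table.values())))
    scores = {p: 0.0 for p in progs}
    for j in range(n_metrics):
        col = [metric_table[p][j] for p in progs]
        mu, sigma = mean(col), pstdev(col)
        for p in progs:
            scores[p] += (metric_table[p][j] - mu) / sigma if sigma else 0.0
    return scores

# Three programs measured on (lines of code, cyclomatic number):
table = {"p1": [100, 5], "p2": [300, 12], "p3": [50, 3]}
scores = relative_complexity(table)
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # ['p2', 'p1', 'p3']: p2 ranks as the most complex
```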
••
TL;DR: This framework is proposed as a conceptual basis for the teaching material to be produced by the “METKIT” (Metrics Educational ToolKIT) ESPRIT project and gives rise to a systematic classification of the subject matter of Software Measurement.
63 citations
••
TL;DR: This study compares the accuracy and the complexity of trees resulting from five techniques for partitioning metric data values and indicates that distribution-sensitive partition techniques that use only relatively few partitions, such as the least weight subsequence techniques LWS-3 and LWS-5, can increase accuracy and decrease complexity in classification trees.
59 citations
••
TL;DR: A set of metrics for which data is to be collected and analyzed is suggested, based on a process maturity framework developed at the Software Engineering Institute; the metrics correspond to the maturity level of the development process.
••
TL;DR: The results of experiments conducted over the past three years are presented to empirically validate extensions made to enable function point theory to handle scientific and real-time systems.
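At its core, function point counting multiplies counts of system components by fixed weights. The sketch below uses the standard Albrecht/IFPUG "average" weights for the five classic component types as an illustration; it does not reproduce the paper's scientific and real-time extensions, which are what the experiments validate.

```python
# Unadjusted function point count: sum of (component count x average weight)
# for the five classic component types (Albrecht/IFPUG average weights).
WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4, "files": 10, "interfaces": 7}

def unadjusted_function_points(counts):
    """counts: {component_type: number_of_components}."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

# A small system: 3 external inputs, 2 outputs, 1 internal logical file.
print(unadjusted_function_points({"inputs": 3, "outputs": 2, "files": 1}))  # 32
```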
••
TL;DR: An application-specific metric is proposed that represents the structure of a program as a variable-length profile; the reporting of the profile and the derivation of complexity measures from it are discussed.
••
TL;DR: The structural model introduces a further dimension of structural complexity and uses it to give a thorough treatment of the problem of determining the composite structural complexity of software packages.
••
TL;DR: This tool can be used as the basis for an environment to help organize the sentences and phrases of a natural language problem description to aid the requirements analyst in the extraction of requirements.
••
TL;DR: The System for Testing And Debugging (STAD) is an experimental installation for investigating the use of data-flow patterns in a program for testing and debugging; it supports the chain, U-context, and L-context testing strategies.
••
TL;DR: Why it is inadequate to assess the accuracy of (new) estimation tools simply on the basis of how accurately they replicate old projects, and why raw historical project results do not necessarily constitute the most “preferred” and reliable benchmark for future estimation are demonstrated.
••
TL;DR: The algorithms proposed here increase system performance through load balancing, based on the maximum process queue length and the maximum amount of CPU time of active processes on each host.
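A threshold scheme of the kind the summary describes can be sketched as follows. The queue-length cap, host names, and placement rule here are illustrative assumptions, not the paper's actual algorithms or parameters: an arriving process goes to the least-loaded host whose queue is still under the cap.

```python
# Threshold-based load balancing sketch: each host caps its process queue
# length, and an arriving process is placed on the least-loaded host that
# is still under the cap.
MAX_QUEUE = 4  # illustrative per-host queue-length threshold

def place_process(queues):
    """queues: {host: current_queue_length}; returns chosen host or None."""
    candidates = [h for h, q in queues.items() if q < MAX_QUEUE]
    if not candidates:
        return None                                  # every host is saturated
    return min(candidates, key=lambda h: queues[h])  # least-loaded host wins

queues = {"hostA": 4, "hostB": 2, "hostC": 3}
print(place_process(queues))  # hostA is full, so the process goes to hostB
```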
••
TL;DR: A statistical procedure for validating conventional metrics based on Halstead's Software Science and McCabe's Cyclomatic Complexity is presented and a methodology to analyze large software projects with a single set of metrics is proposed.
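For context on one of the metric families named above: McCabe's cyclomatic number for a control-flow graph is V(G) = E - N + 2P (E edges, N nodes, P connected components), and for a single routine it equals the number of decision points plus one. The rough token count below approximates that for Python source; it is a sketch, not a validated measurement tool, and the keyword list is an assumption.

```python
# Approximate McCabe cyclomatic complexity as (decision keywords) + 1.
import re

DECISIONS = r"\b(if|elif|for|while|and|or)\b"

def cyclomatic_complexity(source):
    return len(re.findall(DECISIONS, source)) + 1

src = """
def classify(x):
    if x > 0 and x < 10:
        return "small"
    elif x >= 10:
        return "large"
    return "non-positive"
"""
print(cyclomatic_complexity(src))  # 4: 'if', 'and', 'elif' -> 3 decisions + 1
```

A real validation study would compute the metric from the parsed control-flow graph rather than from tokens, but the counting rule is the same.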
••
TL;DR: Three graphical methods for examining distributions are presented, along with two well-known families of transformations of code metrics: Tukey's ladder and the Box-Cox transformation.
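The Box-Cox family is indexed by a single parameter lambda: the transform is (x^lambda - 1)/lambda for lambda != 0 and log(x) at lambda = 0 (its limit). Code metrics are typically right-skewed, and lambda < 1 pulls the long tail in. A minimal sketch, with illustrative module sizes:

```python
# Box-Cox power transformation for positive data.
import math

def box_cox(x, lam):
    if x <= 0:
        raise ValueError("Box-Cox requires positive data")
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

sizes = [10, 100, 1000, 10000]  # e.g. module sizes in lines of code
print([round(box_cox(s, 0), 2) for s in sizes])  # [2.3, 4.61, 6.91, 9.21]
```

At lambda = 0 this is exactly the log transform, the most common rung of Tukey's ladder for right-skewed size data.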
••
TL;DR: This work proposes an object-oriented model in which relationships play a central role; the model is used to describe the structure of complex objects in terms of their component parts and to express more general constraints.
••
TL;DR: The results of the analysis show that programs with incorrect output used more GOTOs than did programs with correct output, and that programs containing at least one GOTO statement had significantly more “bad performs” than did the GOTO-less programs.
••
TL;DR: There is a trend toward losing sight of the product, application, and technology for which the software is being developed, creating a gap between software and product development; bridging that gap requires effective leadership and project management within the software design teams.
••
TL;DR: Two different fault-tolerant distributed mutual exclusion algorithms are presented that can establish mutual exclusion even when member nodes fail, and their relative performance is evaluated.
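The fault tolerance in voting-style mutual exclusion rests on one invariant: any two majorities of the node set intersect, so two nodes can never both collect enough permissions. The sketch below shows only that generic majority-quorum check, an assumption for illustration; the paper's two specific algorithms are not reproduced here.

```python
# Majority-quorum entry check for distributed mutual exclusion: a node may
# enter its critical section only after a strict majority of the N nodes
# grant permission, so the system tolerates failure of any minority.

def can_enter(grants, total_nodes):
    """grants: set of node ids that replied OK; majority means entry."""
    return len(grants) > total_nodes // 2

N = 5
print(can_enter({1, 2, 3}, N))  # True: 3 of 5 is a majority
print(can_enter({1, 2}, N))     # False: 2 of 5 could permit two entrants
```

With N = 5, up to two nodes may fail and the remaining three can still form a quorum.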
••
TL;DR: This paper presents an experiment that introduces a nondisruptive method for integrating metrics into a large-scale commercial software development environment.
••
TL;DR: A new time-cost analysis technique is presented that deletes all of the dependent flows and reduces the computation to a single-node graph; this technique is found to be more efficient than traditional approaches.
••
TL;DR: This article discusses means to carry these phases out, using executable assertions, which are derived by solving algebraic equations in the Tarski calculus of relations.
••
TL;DR: The concept of the justifiable complexity of a module is introduced, many examples of justifiably overcomplex modules are given, and suggestions for improvement of the tool are made.
••
TL;DR: An object-oriented framework is presented for the development of an integrated environment in which a collection of independent heterogeneous data repositories are merged to form a loosely coupled data base environment; this creates the illusion of a single integrated data base that can be queried in a uniform and consistent manner.
••
TL;DR: How the measures can be applied by all buyers and vendors, challenges facing the application of quality measurements in the future, and how these challenges can be successfully approached and overcome are discussed.