
Showing papers in "Information & Software Technology in 1988"


Journal ArticleDOI
PS Loo, WK Tsai
TL;DR: It is shown that random testing works well on several kinds of programs, but not all, and the conditions under which random testing yields reasonable test results are determined.
Abstract: This paper examines random testing as advocated by Duran and Ntafos. They suggest that random testing has its value. However, the assumptions on which their simulations are based may be too weak for a fair comparison between random testing and partition testing. This paper shows that random testing works well on several kinds of programs, but not all. It also determines under what conditions one can use random testing to get reasonable test results. Several simulations with new assumptions and experimental studies have been carried out to evaluate random testing. The results suggest that random testing works well on error-prone programs and on programs whose expected outputs can easily be known. For it to be most effective, random testing should be used at an early stage of program testing.

75 citations
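The kind of comparison the paper runs can be mimicked with a minimal random-testing harness. This is an illustrative sketch, not the authors' simulation; `random_test`, `program`, `oracle`, and `input_gen` are hypothetical names.

```python
import random

def random_test(program, oracle, input_gen, trials=1000, seed=0):
    """Run `program` on randomly generated inputs, compare each result
    against a trusted `oracle`, and return the inputs that failed."""
    rng = random.Random(seed)
    return [x for x in (input_gen(rng) for _ in range(trials))
            if program(x) != oracle(x)]

# An error-prone routine with an easily known expected output -- the
# situation in which the paper finds random testing works well.
buggy_abs = lambda x: x if x > 0 else -x - (x == 0)  # wrong only at x == 0
failures = random_test(buggy_abs, abs, lambda rng: rng.randint(-10, 10))
```

With 1000 trials over a 21-value domain the faulty point is almost certain to be drawn, which illustrates why random testing pays off early in testing, while failure-causing inputs are still common.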



Journal ArticleDOI
TL;DR: The structured systems analysis and design method (SSADM) is the standard structured method used for computer projects in UK government departments and is also being adopted as a standard by various other bodies.
Abstract: The structured systems analysis and design method (SSADM) is the standard structured method used for computer projects in UK government departments. It is also being adopted as a standard by various other bodies. Responsibility for SSADM belongs to the Central Computer and Telecommunications Agency (CCTA) of HM Treasury, although support and training may be acquired through commercial organizations. SSADM has certain underlying principles and consists of six stages, which are broken down into a number of steps and tasks. Within this framework, a number of structured techniques are used and documents produced. SSADM may be compared with methods that use some of the same techniques. Two other methods (Yourdon and Arthur Young's Information Engineering) are briefly described and some comparisons drawn with SSADM.

42 citations


Journal ArticleDOI
TL;DR: This paper describes a practical approach to building KBSs and, in particular, how the development of hybrid systems, containing both conventional and knowledge-based components, can be managed effectively.
Abstract: The evolution of a software development life cycle (SDLC) by the data processing (DP) community was principally an attempt to ensure that software systems could be built which were accurate, reliable and maintainable. If knowledge-based systems (KBSs) are to become an established technology within the DP industry an SDLC for their development will also have to be evolved which can ensure that the standards expected in commercial and industrial software are met by future KBSs. This paper describes a practical approach to building KBSs and, in particular, how the development of hybrid systems, containing both conventional and knowledge-based components, can be managed effectively.

33 citations



Journal ArticleDOI
TL;DR: Integrity analysis principles are applicable to quality assurance, EDP audit, systems design, operations, and management, as well as to the design of data dictionaries, database management systems, and generalized software for data inspection.
Abstract: Data continue to remain a neglected component of electronic data processing (EDP) systems and are largely unsupported by research. As a result, most files contain defective data, and the extent of this condition is not measurable due to a lack of rigorous inspection techniques. Integrity analysis is a formal methodology for inspecting stored data in a systematic manner, and for quantifying data integrity. Problems in both data and systems are detected by automated data constraints. Result reliability of the inspection processes is ensured by algorithms that provide control and auditability whenever data already identified as defective must proceed through subsequent constraints. Integrity analysis principles are applicable to quality assurance, EDP audit, systems design, operations, and management, as well as to the design of data dictionaries, database management systems, and generalized software for data inspection.

30 citations
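The notion of automated data constraints, together with a quantified integrity figure, can be sketched as follows. The field names and rules are invented for illustration and are not drawn from the paper.

```python
# Each constraint is a named predicate over one record; a record that
# violates any predicate is counted as defective.
constraints = [
    ("age_in_range", lambda r: 0 <= r["age"] <= 120),
    ("salary_nonneg", lambda r: r["salary"] >= 0),
]

def inspect(records, constraints):
    """Apply every constraint to every record; return (index, failed
    constraint names) for each defective record."""
    report = []
    for i, rec in enumerate(records):
        failed = [name for name, check in constraints if not check(rec)]
        if failed:
            report.append((i, failed))
    return report

records = [{"age": 34, "salary": 20000}, {"age": -5, "salary": -1}]
report = inspect(records, constraints)
integrity = 1 - len(report) / len(records)  # crude record-level integrity measure
```

The same pattern extends naturally to the paper's setting: the constraint list would live in a data dictionary, and the inspection run would log defective records for controlled reprocessing.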


Journal ArticleDOI
TL;DR: The most well known and some of the most promising metrics are reviewed and the more promising but relatively unexplored fields of design and specification metrics are outlined.
Abstract: The study of quantitative aspects of software engineering can be divided into measurements that relate to the process of engineering software and those that relate to the software engineering product. This paper deals with the latter. Some of the most well known and some of the most promising metrics are reviewed. Well known metrics include the code metrics of Halstead and McCabe; however, code metrics suffer from a number of inherent limitations. The more promising but relatively unexplored fields of design and specification metrics are outlined. The paper concludes with some speculative remarks concerning the future of product metrics.

24 citations


Journal ArticleDOI
TL;DR: Methodologies are being continually improved in an incremental fashion, but the final part of this paper suggests that IS methodologies might take a very different direction in the future.
Abstract: Early computer applications were implemented without the aid of an explicit information systems (IS) methodology. In the 1970s the need for a more formal methodology was recognized, and the methodologies of this decade emphasized the importance of documentation standards and good training for systems analysts. This was certainly an improvement. However, there were still a number of problems and the first part of this paper highlights lessons that can be drawn from this early experience with IS methodologies. The 1980s have witnessed a growth in the number of IS methodologies. Some are becoming widely used. However, this increase in the numbers of methodologies has caused much confusion. The second part of this paper overviews some of these themes in IS methodologies. Methodologies are being continually improved in an incremental fashion, but the final part of this paper suggests that IS methodologies might take a very different direction in the future.

21 citations


Journal ArticleDOI
TL;DR: The aims of software testing are introduced, followed by a description of static and dynamic analysis and of functional and structural testing strategies, which are used to provide a taxonomy of testing techniques.
Abstract: Despite advances in formal methods of specification and improved software creation tools, there is no guarantee that the software produced meets its functional requirements. There is a need for some form of software testing. The paper introduces the aims of software testing. This is followed by a description of static and dynamic analysis and of functional and structural testing strategies. These ideas are used to provide a taxonomy of testing techniques. Each technique is briefly described.

18 citations


Journal ArticleDOI
TL;DR: The relationship to conventional data modelling techniques is explored, and recent extensions to JSD modelling to handle awkward cases are described.
Abstract: The JSD method starts with a modelling phase in which the subject matter of the system being built is described in terms of sequential processes. The basic modelling technique is summarized. The relationship to conventional data modelling techniques is explored. Recent extensions to JSD modelling to handle awkward cases are described. Examples are drawn from a variety of application domains.

16 citations


Journal ArticleDOI
TL;DR: An algorithm for maintaining consistency and improving the performance of databases with replicated data in distributed real-time systems is presented; the semantic information of read-only transactions is exploited for improved efficiency, and a multiversion technique is used to increase the degree of concurrency.
Abstract: Considerable research effort has been devoted to the problem of developing techniques for achieving high availability of critical data in distributed real-time systems. One approach is to use replication. Replicated data is stored redundantly at multiple sites so that it can be used even if some of the copies are not available due to failures. This paper presents an algorithm for maintaining consistency and improving the performance of databases with replicated data in distributed real-time systems. The semantic information of read-only transactions is exploited for improved efficiency, and a multiversion technique is used to increase the degree of concurrency. Related issues, including version management and consistency of the states seen by transactions, are discussed.
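The multiversion idea the abstract mentions can be sketched in a toy single-site form; this is an illustration of the general technique, not the paper's distributed algorithm, and all names are made up.

```python
class MultiversionStore:
    """Writers append timestamped versions; a read-only transaction reads
    the latest version at or before its start timestamp, so it sees a
    consistent snapshot without blocking concurrent writers."""
    def __init__(self):
        self.versions = {}  # key -> list of (commit_ts, value), in append order
        self.clock = 0

    def write(self, key, value):
        self.clock += 1                       # commit timestamp
        self.versions.setdefault(key, []).append((self.clock, value))
        return self.clock

    def begin_readonly(self):
        return self.clock                     # snapshot timestamp for a reader

    def snapshot_read(self, key, start_ts):
        # Scan versions newest-first for the last one committed by start_ts.
        for ts, value in reversed(self.versions.get(key, [])):
            if ts <= start_ts:
                return value
        return None

store = MultiversionStore()
store.write("x", 1)
ts = store.begin_readonly()         # reader starts here
store.write("x", 2)                 # concurrent update creates a new version
old = store.snapshot_read("x", ts)  # reader still sees the value 1
new = store.snapshot_read("x", store.begin_readonly())  # later reader sees 2
```

Because read-only transactions never wait for writers, exploiting their semantics in this way raises concurrency, which is the property the paper builds on.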

Journal ArticleDOI
Mark Woodman
TL;DR: This tutorial introduces the notation and describes how it has been used and how it should be used.
Abstract: During the past two decades many informal methods for requirements analysis and specification have been proposed. The majority of these claim to be ‘structured’ and have a graphical notation as a central component. The simplicity of some of these notations has made them popular with analysts and acceptable to their customers. However, vagueness in early descriptions of the syntax and semantics of the notations has allowed an unhelpful degree of flexibility in their application and interpretation, and has diminished their usefulness. Dataflow diagrams, as promoted by the Yourdon organization, have suffered in this way. This tutorial introduces the notation and describes how it has been used and how it should be used.

Journal ArticleDOI
TL;DR: A software engineering course which is part of the computer science honours degree at Southampton University is described, which has objectives and problems which are typical of such courses in an academic environment.
Abstract: A software engineering course which is part of the computer science honours degree at Southampton University is described. The course has objectives and problems which are typical of such courses in an academic environment. Ideas are being adopted from software engineering courses in industry to extend and improve the education of students. Industrial training is also borrowing from the academic view of software engineering. A strong link is now developing through increased technology transfer in this area, and benefits are being noted.

Journal ArticleDOI
TL;DR: A measurement-oriented approach to software product assurance that helps managers to achieve the level of confidence and control necessary to ensure successful project completion is described.
Abstract: This paper describes a measurement-oriented approach to software product assurance that helps managers to achieve the level of confidence and control necessary to ensure successful project completion. The paper covers three major topics. First, it explains some important views of software quality: satisfaction of requirements, conformance to standards, and technical excellence. Second, the paper outlines the project planning, configuration management, quality assurance, and software measurement activities that compose software product assurance. Third, the paper explains some numerical techniques for setting quality objectives and assessing performance. It distinguishes between implementation process control and product quality control. While quantitative quality techniques are new to most software enterprises, they have been successfully applied to other manufacturing processes for many years.


Journal ArticleDOI
TL;DR: A system called Difead (Dictionary interface for expert systems and databases) is described that uses metadata stored in an active DDS to couple expert systems and databases.
Abstract: Data as a resource in information processing environments is well known, but only recently has the computing community recognized the role of metadata, i.e. data about data. In particular, Data Dictionary Systems (DDSs), whose main purpose is metadata management, are seen as essential for successful data processing. The artificial intelligence community recognizes this in its use of metaknowledge. A system called Difead (Dictionary interface for expert systems and databases) is described that uses metadata stored in an active DDS to couple expert systems and databases. The prototype is being evaluated on an application that couples two medical expert systems with a relational clinical database.

Journal ArticleDOI
TL;DR: The popular DSS packages on the market are analysed, the role of expert systems in decision support is considered, and a general overview of the current and future use of decision support systems (DSSs) in many types of business is offered.
Abstract: Decision support systems (DSSs) are used in many types of business. However, which type is suitable for a specific purpose? How do expert systems help DSSs? This paper analyses the popular DSS packages on the market and offers a general overview of their use now and in the future.

Journal ArticleDOI
TL;DR: A taxonomy of the area of overlap between the fields of artificial intelligence (AI) and software engineering (SE) is developed and related to other major attempts to address the interaction between these two fields.
Abstract: This paper is a broad-based review of the area of overlap between the fields of artificial intelligence (AI) and software engineering (SE). A taxonomy of this overlap area is developed and related to other major attempts to address the interaction between these two fields. Each of the three major subareas — AI-based support environments; AI mechanisms and techniques in practical software; and software engineering tools and techniques in AI software — is described and illustrated with representative examples. Finally, it is noted that the area of overlap is changing and thus any current attempt to map out the possibilities is to be viewed as speculation.

Journal ArticleDOI
TL;DR: A prototype expert system is described that advises end-users in the selection of a suitable development environment for small- to medium-scale expert system applications and suggests improvements, including a feature list for classifying expert system shells.
Abstract: A prototype expert system is described that advises end-users in the selection of a suitable development environment for small- to medium-scale expert system applications. The system, running on a microcomputer, assesses the suitability of an application as a whole for expert systems techniques and recommends the five ‘best’ products from its knowledge base of 42 products. The paper takes a case-study approach, using the system itself as a recursive discussion example, applying some of the concepts involved in building an expert system. It describes the operation of the system and suggests improvements; it includes a feature list for classifying expert system shells.


Journal ArticleDOI
TL;DR: The Emeraude portable common tool environment is presented, addressing in particular issues on process control, object management and use of the object management system, distribution over a network of workstations, and the user interface.
Abstract: This paper presents the Emeraude portable common tool environment. It begins by explaining the approach followed in the design and construction of software development environments, and the Emeraude product itself. It continues with a general overview of the numerous related developments, with respect to standardization efforts, products and associated research and development. There follows a technical presentation of the architecture, addressing in particular issues of process control, object management and use of the object management system, distribution over a network of workstations, and the user interface. The paper assumes a general knowledge of PCTE given in an overview paper in Information and Software Technology, Vol 29 No 8.

Journal ArticleDOI
TL;DR: The paper outlines advantages and disadvantages of two popular approaches to system development (conceptual modelling and prototyping) and suggests that they are not mutually exclusive methods, but, in fact, complementary.
Abstract: This paper is the outcome of a series of discussions and workshops held by the British Computer Society's (BCS) Database Specialist Group in Scotland. The paper outlines advantages and disadvantages of two popular approaches to system development (conceptual modelling and prototyping) and suggests that they are not mutually exclusive methods, but, in fact, complementary.

Journal ArticleDOI
TL;DR: The various solutions to some problems of dynamic dataflow analysis are compared, and new solutions are proposed for some of these problems.
Abstract: This paper compares the various solutions to some problems of dynamic dataflow analysis and proposes new solutions to some of them. It summarizes experiences of implementing and using several dynamic dataflow analysis systems. Some proposals about future directions and the essential features of automated dynamic dataflow analysis systems are also discussed.
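Dynamic dataflow analysis of the kind surveyed instruments a program's execution and flags data-flow anomalies as they occur. A minimal sketch of the underlying state machine (an illustration of the general technique, not any of the surveyed systems):

```python
# States a variable moves through at run time; transitions that violate
# the expected define-then-use pattern are recorded as anomalies.
UNDEF, DEF, REF = "undefined", "defined", "referenced"

class DataflowMonitor:
    def __init__(self):
        self.state = {}
        self.anomalies = []

    def define(self, var):
        if self.state.get(var, UNDEF) == DEF:
            self.anomalies.append(("dd", var))  # defined twice, never used
        self.state[var] = DEF

    def reference(self, var):
        if self.state.get(var, UNDEF) == UNDEF:
            self.anomalies.append(("ur", var))  # used before being defined
        else:
            self.state[var] = REF

m = DataflowMonitor()
m.reference("x")              # x read before any assignment -> "ur" anomaly
m.define("y"); m.define("y")  # y assigned twice with no use -> "dd" anomaly
```

An instrumented program would call `define` and `reference` at each assignment and use, so anomalies are reported for the paths actually executed rather than for all syntactic paths.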

Journal ArticleDOI
TL;DR: The Z notation was used to write the specifications; it has proved apt for the purpose, enabling one to write succinct specifications in terms of a basic definition of a flowgraph.
Abstract: During the course of research into software metrication a formal specification was required for a system to process program flowgraphs. This note gives specifications for some of the objects in the system. The Z notation was used to write the specifications; it has proved apt for the purpose, enabling one to write succinct specifications in terms of a basic definition of a flowgraph.
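The underlying object can be paraphrased set-theoretically. This is a hedged sketch of what a flowgraph specification typically fixes, not the note's actual Z schemas.

```latex
% A flowgraph as a rooted digraph with distinguished start and stop nodes:
G = (N, E, s, t), \qquad E \subseteq N \times N, \qquad s, t \in N
% with every node reachable from the start node s, and the stop node t
% reachable from every node.
```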

Journal ArticleDOI
P. F. Gibbins
TL;DR: The benefits of using formal methods are examined, as is the problem of making formal methods more widely used; the relation to other approaches to software development, such as rapid prototyping, is also considered.
Abstract: Examples of the use of formal methods are given. The benefits of using formal methods are examined, as well as the problem of making formal methods more widely used. The relation between using formal methods and using other approaches to software development, like rapid prototyping, is also considered.

Journal ArticleDOI
TL;DR: A software tool called attribute grammar based theorem prover (AGBTP) is proposed, which can be used both as a processor of attribute grammars and as a theorem prover; it combines procedural and declarative characteristics using a very high level language, i.e. the attribute grammars' language, together with user-defined semantic functions in the host language.
Abstract: In this paper a software tool called attribute grammar based theorem prover (AGBTP) is proposed, which can be used both as a processor of attribute grammars and as a theorem prover. Hence, attribute grammars' applications from the area of software engineering as well as theorem proving applications from the area of knowledge engineering can be addressed using the same tool. The main advantages of the proposed tool are, first, that it combines procedural and declarative characteristics, using a very high level language (the attribute grammars' language) together with user-defined semantic functions in the host language; second, that full theorem proving capabilities are obtained through an extended parser, which implements the model elimination procedure.

Journal ArticleDOI
TL;DR: The application of system analysis tools for designing knowledge-based expert systems is presented and illustrates the application of the tools with a simplified example drawn from the oil and gas exploration business.
Abstract: Knowledge-based expert systems incorporate human expert knowledge into computer systems. Today, the wide availability of expert system shells allows the knowledge engineer to implement specific rules for a desired application. The final design of the artificial intelligence system should be the outcome of a detailed, organized study. In this paper the application of system analysis tools for designing knowledge-based expert systems is presented. The paper illustrates the application of the tools with a simplified example drawn from the oil and gas exploration business. The use of a systematic approach in designing expert systems should help the knowledge engineer clearly identify the facts and rules representative of the acquired human knowledge.

Journal ArticleDOI
TL;DR: The computational algorithms are expressed as attribute grammars and they are translated into a new form that specifies the dependencies among the various required operations, suitable for parallel execution by dataflow computers.
Abstract: The use of attribute grammars as a tool for extracting the inherent parallelism of computational algorithms is described. The computational algorithms are expressed as attribute grammars and they are translated into a new form that specifies the dependencies among the various required operations. This form is suitable for parallel execution by dataflow computers.

Journal ArticleDOI
Lieuwe Sytse de Jong
TL;DR: An introduction to the engineering of expert systems is given that is related to a traditional software engineering approach; examples originate from a project aimed at the construction of an expert system for fault diagnosis and repair of electronic devices.
Abstract: This paper gives an introduction to the engineering of expert systems. A methodology is proposed that is related to a traditional software engineering approach. Examples originate from a project aimed at the construction of an expert system for fault diagnosis and repair of electronic devices.

Journal ArticleDOI
TL;DR: This paper reviews and classifies various buffering algorithms for relational DBMSs and proposes a new classification criterion based on a judgment of how the knowledge of query reference behaviour is reflected in buffering policies.
Abstract: Buffer management is an essential component of database management systems (DBMSs). This paper reviews and classifies various buffering algorithms for relational DBMSs. A new classification criterion is proposed. It is based on a judgment of how the knowledge of query reference behaviour is reflected in buffering policies. The main characteristics of each strategy are identified and compared to evaluate its potential for database applications. Pseudocode is used to outline each algorithm in a Pascal-like notation. Operating system support and related issues are also discussed.
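As an example of the kind of policy such a survey classifies, here is a minimal LRU buffer pool, written in Python rather than the paper's Pascal-like pseudocode; the names are illustrative.

```python
from collections import OrderedDict

class LRUBuffer:
    """Minimal LRU buffer pool: fetching a page promotes it to
    most-recently-used; on a miss with a full pool, the
    least-recently-used page is evicted."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # page_id -> page data, LRU first
        self.hits = self.misses = 0

    def fetch(self, page_id, read_from_disk):
        if page_id in self.pages:
            self.hits += 1
            self.pages.move_to_end(page_id)     # promote to MRU position
        else:
            self.misses += 1
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)  # evict the LRU page
            self.pages[page_id] = read_from_disk(page_id)
        return self.pages[page_id]

buf = LRUBuffer(capacity=2)
disk = lambda pid: f"page-{pid}"
for pid in (1, 2, 1, 3, 2):  # page 2 is evicted by 3, so refetching it misses
    buf.fetch(pid, disk)
```

LRU ignores query reference behaviour entirely, which is precisely why the survey's criterion, how much a policy knows about the query's access pattern, separates it from query-aware strategies.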