
Showing papers on "Test data" published in 1976


Journal ArticleDOI
TL;DR: A system that attempts to generate test data for programs written in ANSI Fortran by symbolically executing a given path and creating a set of constraints on the program's input variables; the resulting symbolic representation facilitates error detection and can also aid assertion generation and automatic program documentation.
Abstract: This paper describes a system that attempts to generate test data for programs written in ANSI Fortran. Given a path, the system symbolically executes the path and creates a set of constraints on the program's input variables. If the set of constraints is linear, linear programming techniques are employed to obtain a solution. A solution to the set of constraints is test data that will drive execution down the given path. If it can be determined that the set of constraints is inconsistent, then the given path is shown to be nonexecutable. To increase the chance of detecting some of the more common programming errors, artificial constraints are temporarily created that simulate error conditions and then an attempt is made to solve each augmented set of constraints. A symbolic representation of the program's output variables in terms of the program's input variables is also created. The symbolic representation is in a human readable form that facilitates error detection as well as being a possible aid in assertion generation and automatic program documentation.

801 citations
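
The core of the approach above is solving the path's constraint system to obtain concrete inputs. Below is a minimal sketch of that constraint-solving step for a hypothetical path whose constraints are linear, using scipy.optimize.linprog rather than anything from the original Fortran system; the constraints and variable names are invented for illustration.

```python
# Minimal sketch: solving a linear path-constraint system to obtain test data.
# Hypothetical example, not the paper's system. Assumes the path constraints
# have already been collected by symbolic execution as linear inequalities
# A_ub @ x <= b_ub over the input variables x.
import numpy as np
from scipy.optimize import linprog

# Path condition for a hypothetical path: x0 + x1 <= 10, x0 - x1 >= 2, x0 >= 0
A_ub = np.array([[1.0, 1.0],    # x0 + x1 <= 10
                 [-1.0, 1.0]])  # -(x0 - x1) <= -2, i.e. x0 - x1 >= 2
b_ub = np.array([10.0, -2.0])
bounds = [(0, None), (None, None)]  # x0 >= 0, x1 unrestricted

# Any feasible point will do, so minimize a trivial objective.
res = linprog(c=[0.0, 0.0], A_ub=A_ub, b_ub=b_ub, bounds=bounds)

if res.success:
    print("test data driving execution down the path:", res.x)
else:
    print("constraints inconsistent -> path is non-executable")
```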


Journal ArticleDOI
TL;DR: It is proved that an effective testing strategy which is reliable for all programs cannot be constructed, and a method for analyzing the reliability of path testing is introduced.
Abstract: A set of test data T for a program P is reliable if it reveals that P contains an error whenever P is incorrect. If a set of tests T is reliable and P produces the correct output for each element of T then P is a correct program. Test data generation strategies are procedures for generating sets of test data. A testing strategy is reliable for a program P if it produces a reliable set of test data for P. It is proved that an effective testing strategy which is reliable for all programs cannot be constructed. A description of the path analysis testing strategy is presented. In the path analysis strategy data are generated which cause different paths in a program to be executed. A method for analyzing the reliability of path testing is introduced. The method is used to characterize certain classes of programs and program errors for which the path analysis strategy is reliable. Examples of published incorrect programs are included.

412 citations
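
The notion of a reliable test set can be made concrete with a toy example (hypothetical, not from the paper): an incorrect program, its specification, and two candidate test sets, only one of which reveals the error.

```python
# Toy illustration of test-set reliability. The incorrect program below fails
# only for negative inputs, so a test set is reliable for it exactly when it
# reveals that error.

def buggy_abs(x):        # incorrect program P: wrong for x < 0
    return x

def correct_abs(x):      # specification
    return x if x >= 0 else -x

def reveals_error(tests):
    return any(buggy_abs(t) != correct_abs(t) for t in tests)

T1 = [0, 1, 5]    # not reliable for P: P passes every test, yet P is incorrect
T2 = [0, 1, -3]   # reliable for P: it exposes the error

print(reveals_error(T1))  # False
print(reveals_error(T2))  # True
```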


Journal ArticleDOI
TL;DR: In this paper, a test data generator for Fortran programs is described, and a new approach for resolving array reference ambiguities and a procedure for generating test inputs satisfying input constraints are proposed.
Abstract: Software validation through testing will continue to be a very important tool for ensuring correctness of large scale software systems. Automation of testing tools can greatly enhance their power and reduce testing cost. In this paper, techniques for automated test data generation are discussed. Given a program graph, a set of paths are identified to satisfy some given testing criteria. When a path or a program segment is specified, symbolic execution is used for generating input constraints which define a set of inputs for executing this path or segment. Problems encountered in symbolic execution are discussed. A new approach for resolving array reference ambiguities and a procedure for generating test inputs satisfying input constraints are proposed. References to arrays are recorded in a table during symbolic execution, and ambiguities are resolved when test data are generated to evaluate the subscript expressions. The implementation of a test data generator for Fortran programs incorporating these techniques is also described.

332 citations
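
A rough sketch of the array-reference idea described above, reconstructed for illustration rather than taken from the authors' implementation: symbolic subscript expressions are recorded in a table during symbolic execution, and the question of which element each reference touches is resolved only once concrete test data are available.

```python
# Sketch of recording symbolic array references and resolving them later.
# The table contents and test data are invented for illustration.
import sympy as sp

i, j = sp.symbols("i j", integer=True)

# Table of symbolic array references collected along a path: (array, subscript)
reference_table = [("A", i + 1), ("A", 2 * j), ("B", i - j)]

def resolve(references, test_data):
    """Evaluate each recorded subscript expression under concrete test data."""
    return [(name, int(expr.subs(test_data))) for name, expr in references]

test_data = {i: 3, j: 2}
print(resolve(reference_table, test_data))
# [('A', 4), ('A', 4), ('B', 1)] -> the two A references alias for this input
```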


Journal ArticleDOI
TL;DR: Two examples, a matrix factorization subroutine and a sorting method, illustrate the types of data generation problems that can be successfully treated with numerical maximization techniques.
Abstract: For numerical programs, or more generally for programs with floating-point data, it may be that large savings of time and storage are made possible by using numerical maximization methods instead of symbolic execution to generate test data. Two examples, a matrix factorization subroutine and a sorting method, illustrate the types of data generation problems that can be successfully treated with such maximization techniques.

252 citations
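
A minimal sketch of the numerical-maximization idea, with an invented branch condition and scipy.optimize.minimize standing in for whatever optimizer the paper used: the objective measures how far an input is from satisfying the target branch, and driving it to zero yields test data.

```python
# Sketch of generating floating-point test data by numerical optimization
# rather than symbolic execution (illustrative only; the paper's examples were
# a matrix factorization routine and a sorting method).
import numpy as np
from scipy.optimize import minimize

def branch_distance(x):
    """How far the input is from satisfying the target branch x0**2 + x1 > 4.
    Zero (after clipping) means the branch condition is met."""
    return max(0.0, 4.0 - (x[0] ** 2 + x[1]))

res = minimize(branch_distance, x0=np.array([0.0, 0.0]), method="Nelder-Mead")

if branch_distance(res.x) == 0.0:
    print("test data exercising the branch:", res.x)
else:
    print("optimizer did not reach the branch from this starting point")
```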


Journal ArticleDOI
TL;DR: In this paper, a mathematical model for predicting the dynamic response of the H. B. Robinson pressurized water reactor plant was formulated and compared with results from measurements made during full-power operation of the plant.
Abstract: A mathematical model for predicting the dynamic response of the H. B. Robinson pressurized water reactor plant was formulated and compared with results from measurements made during full-power operation of the plant. The model was based on the basic conservation laws for neutrons, mass, and energy; design data from the safety analysis report were used to evaluate the necessary coefficients. The model included representations for point kinetics, core heat transfer, piping, pressurizer, and the steam generator. The experiment involved perturbations in control rod position and main steam valve opening. Periodic binary input signals and step inputs were used. Theoretical and experimental frequency responses were obtained from the model and the test data. The comparison showed that the model was capable of good predictions for reactivity perturbations and fair predictions for steam valve perturbations. A method was also demonstrated for using the test data for at-power determination of the differential...

82 citations
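
The frequency-response comparison mentioned above can be illustrated generically: given input and output test records, an empirical frequency response is H(f) = Pxy(f)/Pxx(f). The sketch below uses synthetic data and a stand-in second-order plant; it is not the reactor model or the Robinson plant data.

```python
# Simplified illustration of estimating a frequency response from test data.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 10.0                               # sample rate, Hz (arbitrary)
t = np.arange(0, 600, 1 / fs)

u = signal.square(2 * np.pi * 0.02 * t)           # periodic binary test input
b, a = signal.butter(2, 0.05)                     # stand-in "plant" dynamics
y = signal.lfilter(b, a, u) + 0.05 * rng.standard_normal(t.size)

f, Pxy = signal.csd(u, y, fs=fs, nperseg=2048)    # cross-spectral density
_, Pxx = signal.welch(u, fs=fs, nperseg=2048)     # input power spectral density
H = Pxy / Pxx                                     # empirical frequency response

print("gain at lowest nonzero frequency:", abs(H[1]))
```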


Journal ArticleDOI
TL;DR: This work presents a Bayesian procedure that allows using test data gathered both at the component level of a multicomponent system and at the system level.
Abstract: On occasion, reliability analysts have test data gathered both at the component level of a multicomponent system and at the system level. When the system test data provide no information on component performance, classical statistical techniques in all but trivial cases do not allow using both sets of data. We present a Bayesian procedure that does allow using both sets. The procedure for attribute data makes use of a lemma that relates the moments of the prior and posterior distributions of reliability to the test data. The procedure for variables data assumes the time to failure distribution of each component is exponential.

48 citations
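
The paper's procedure rests on a lemma relating prior and posterior moments; the sketch below shows a simpler Monte Carlo version of the same idea for attribute (pass/fail) data, assuming Beta(1,1) component priors and a hypothetical two-component series system.

```python
# Monte Carlo sketch of combining component-level and system-level pass/fail
# (attribute) test data. This is NOT the paper's moment-based lemma; it is a
# simpler importance-weighting scheme under assumed Beta(1,1) priors.
import numpy as np

rng = np.random.default_rng(1)

# Component test data: (successes, trials)
comp_data = [(18, 20), (45, 50)]
# System test data (system works iff both components work): successes, trials
sys_s, sys_n = 9, 10

M = 200_000
# Sample component reliabilities from their component-data posteriors.
samples = np.column_stack([
    rng.beta(1 + s, 1 + (n - s), size=M) for s, n in comp_data
])
r_sys = samples.prod(axis=1)                      # series-system reliability

# Weight each sample by the system-level binomial likelihood.
w = r_sys ** sys_s * (1.0 - r_sys) ** (sys_n - sys_s)
posterior_mean = np.sum(w * r_sys) / np.sum(w)

print("posterior mean system reliability using both data sets:", posterior_mean)
```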


Patent
13 Feb 1976
TL;DR: In this article, the authors present a method and apparatus for testing a business machine of the type having a keyboard for entering data into the machine and means for storing the entered data.
Abstract: A method and apparatus for testing a business machine of the type having a keyboard for entering data into the machine and includes means for storing the entered data. The method and test apparatus are directed to carrying out diagnostic routines on the business machine for determining the defective area of the machine. After the test apparatus is attached to the business machine, the data stored in the machine is transferred and stored in the test apparatus. A programmed control sequence of diagnostic routines is then carried out by the test apparatus with each routine terminating in the operation of an indicator member located on the test apparatus indicating whether a fault exists in the portion of the machine being tested. A control key on the test apparatus allows the operator to initiate a new diagnostic routine. At the completion of the diagnostic routines, the data stored in the test apparatus is transferred back to the business machine. To insure that the data originally stored in the machine was not altered or lost in the test operation, the business machine is operated prior to and after the operation of the test apparatus to print out on a pair of record members the data stored in the machine. The printed data on the record members is compared to determine the validity of the data now stored in the business machine.

39 citations


ReportDOI
01 Jan 1976
TL;DR: In this paper, a study was conducted to evaluate the 300,000-byte version of the C-81 AGAJ74 helicopter simulation program's capability for prediction of performance, rotor dynamic loads, and stability for soft-in-plane hingeless rotor helicopters.

Abstract: A study was conducted to evaluate the 300,000-byte version of the C-81 AGAJ74 helicopter simulation program's capability for prediction of performance, rotor dynamic loads, and stability for soft-in-plane hingeless rotor helicopters. Available test data were compiled for the BO-105 single-rotor helicopter to provide a basis for evaluation of computer program analytical results. Results indicated good correlation for trim and performance, and reasonable correlation for main rotor alternating flap bending moments. Poorer correlation was obtained for main rotor chord and shaft bending moments. Poor agreement was obtained for response to control inputs in hover and at 100 knots; this may have been due to selection of too large a numerical integration interval. Approximately the same damping was indicated by test and analysis for aeroelastic stability. Attempts to compare C-81 results for control power and stability derivatives with analytical results from Boeing Vertol's Y-92 computer program were not successful. Significant differences were attributed to restraint of blade flapping in C-81 during these computations.

35 citations


Proceedings ArticleDOI
01 Jan 1976
TL;DR: This paper introduces a technique whereby test data can be used in proving program correctness, and in addition to simplifying certification of correctness, this method simplifies the process of providing specifications for a program.
Abstract: Proofs of program correctness tend to be long and tedious whereas testing, though useful in detecting errors, usually does not guarantee correctness. This paper introduces a technique whereby test data can be used in proving program correctness. In addition to simplifying certification of correctness, this method simplifies the process of providing specifications for a program. The applicability of this technique to procedures, recursive programs, and modular programs is demonstrated.

26 citations


Journal ArticleDOI
TL;DR: In this article, a curve fitting technique was developed for estimating the fracture toughness from invalid test data, which is versatile enough to be applicable to any geometry and loading system provided that a function for the collapse stress is known.

19 citations


Proceedings ArticleDOI
13 Oct 1976
TL;DR: Techniques for automated test data generation are discussed and a new approach for resolving array reference ambiguities and a procedure for generating test inputs satisfying input constraints are proposed.
Abstract: Software validation through testing will continue to be a very important tool for ensuring correctness of large scale software systems. Automation of testing tools can greatly enhance their power and reduce testing cost. In this paper, techniques for automated test data generation are discussed. Given a program graph, a set of paths are identified to satisfy some given testing criteria. When a path or a program segment is specified, symbolic execution is used for generating input constraints which define a set of inputs for executing this path or segment. Problems encountered in symbolic execution are discussed. A new approach for resolving array reference ambiguities and a procedure for generating test inputs satisfying input constraints are proposed. Array references are recorded in a table during symbolic execution and ambiguities are resolved when test data are generated to evaluate the subscript expressions. The implementation of a test data generator for FORTRAN programs incorporating these techniques is also described.

Journal ArticleDOI
TL;DR: In this article, an easily implemented method is developed for calculating, from information obtained from a possibly censored life test, an approximate warranty period before which the kth failure occurs in a production lot of given size with some small specified probability.
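
A generic reconstruction of this kind of calculation, not necessarily the paper's exact procedure: assuming exponential lifetimes with a rate estimated from a censored test, solve for the time by which the kth failure in a lot of n units occurs with a small specified probability.

```python
# Sketch of an approximate warranty-period calculation under assumed
# exponential lifetimes with rate estimated as failures / total time on test.
# All numbers below are hypothetical.
import math
from scipy.stats import binom
from scipy.optimize import brentq

failures, total_time_on_test = 4, 8000.0     # hypothetical life-test summary
lam = failures / total_time_on_test          # exponential rate estimate

n, k, alpha = 500, 3, 0.05                   # lot size, failure index, probability

def prob_kth_failure_by(t):
    p = 1.0 - math.exp(-lam * t)             # P(a single unit fails by t)
    return binom.sf(k - 1, n, p)             # P(at least k of n fail by t)

# Warranty period: the time t at which P(k-th failure by t) equals alpha.
t_w = brentq(lambda t: prob_kth_failure_by(t) - alpha, 1e-9, total_time_on_test)
print("approximate warranty period:", t_w)
```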

Journal ArticleDOI
TL;DR: In this article, the authors describe the construction, operation and performance of an instrumented plate-projectile (IPP) impact tester, and the characteristics of several impact test methods in current use.
Abstract: Characteristics of several impact test methods in current use are discussed. The construction, operation and performance of an instrumented plate-projectile (IPP) impact tester are described. Force-strain curves, energies to any point in the test and failure modes may be inferred from the test data. Also, by inspection of the tested plate, one can see if there are cracks, blanching (development of microvoids) or ductile deformation. This inspection distinguishes between local and catastrophic failure, an important distinction in selection of materials for specific uses. There is reasonable correlation between values from notched Izod and IPP tests, with the latter somewhat more sensitive. Data illustrate the distinction between good and poor fiber-to-matrix coupling in reinforced polypropylene. Although there is some sacrifice of impact strength with improved fiber-matrix coupling, this is more than counteracted by the improvement in static strength and creep resistance. The data show that coupled glass-reinforced impact grade polypropylene exhibits an interesting balance of static to impact properties.

08 Apr 1976
TL;DR: A set of mathematical models and computer programs have been developed to characterize multipath propagation in an airport environment to provide a firm technical basis for assessing the performance of candidate Microwave Landing Systems (MLS) in realistic airport environments.
Abstract: A set of mathematical models and computer programs have been developed to characterize multipath propagation in an airport environment. When combined with system mathematical models, these models are intended to provide a firm technical basis for assessing the performance of candidate Microwave Landing Systems (MLS) in realistic airport environments. The two paramount issues in developing these models have been (1) validation based on actual field test data and (2) computer running time. The obstacles modeled include buildings and aircraft, as well as the ground, which can cause both specular reflections and diffuse scattering. In addition, the shadowing effects due to runway humps, aircraft, and buildings approaching the line of sight between transmitter and receiver are included. Computational procedures are presented for obtaining the salient multipath parameters, i.e., relative magnitude, phase, directional angles, Doppler frequency, and time delay. Computer programs have been written for these algorithms using the Fortran programming language, with structured programming methods, such as Iftran, employed whenever possible. A presentation is given of computer validation data for the computational procedures. A comparison of these computer validation results with experimental field data demonstrates good agreement in all cases of interest. The computer running time for these computer programs is quite reasonable, e.g., it takes about five times longer than actual flight time to run a model of a typical airport environment on an IBM 370 model 168.
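
As a small illustration of the "salient multipath parameters", the sketch below computes the excess delay and relative carrier phase of a single specular ground reflection with the image method over flat ground; the geometry and the assumed C-band carrier frequency are invented, and the report's building, aircraft, and shadowing models are not represented.

```python
# Minimal geometric illustration of two multipath parameters (relative delay
# and carrier phase) for one specular ground reflection via the image method.
import math

C = 299_792_458.0            # speed of light, m/s
f_c = 5.06e9                 # assumed C-band MLS carrier frequency, Hz

h_t, h_r, d = 3.0, 300.0, 8000.0   # transmitter height, receiver height, ground range (m)

direct = math.hypot(d, h_r - h_t)
reflected = math.hypot(d, h_r + h_t)       # path via the image of the transmitter

delta = reflected - direct                 # excess path length, m
delay = delta / C                          # relative time delay, s
phase = (2 * math.pi * f_c * delta / C) % (2 * math.pi)  # relative carrier phase

print(f"excess delay = {delay * 1e9:.2f} ns, relative phase = {phase:.2f} rad")
```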

Proceedings ArticleDOI
07 Jun 1976
TL;DR: A symbolic evaluation system called DISSECT is described which can be used to analyze FORTRAN programs and the potential use of systems like DISSect as the basic software certification tool in the software development process is discussed.
Abstract: Symbolic evaluation techniques can be used to determine the cumulative effects of a program's calculations on the branching predicates and output variables in the program. If the evaluation techniques are carefully and selectively applied, they can be used to generate revealing symbolic representations of the computations carried out by the paths in a program, and of the systems of predicates that describe the input data that causes program paths to be executed. A symbolic evaluation system called DISSECT is described which can be used to analyze FORTRAN programs. The system includes a sophisticated command language that allows the user to selectively apply symbolic evaluation techniques to different program paths and subpaths. The command language allows the user to carry out different levels of symbolic testing of a program and to construct systems of predicates that can be used to automate the generation of numeric test data. Experiments with the system which illustrate its advantages and limitations are included. DISSECT can be used to carry out a systematic, documented reliability analysis of a program. The paper concludes with a discussion of the potential use of systems like DISSECT as the basic software certification tool in the software development process.
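
A toy version of what such symbolic evaluation produces for one path, written with sympy rather than anything from DISSECT: the accumulated path condition over the input variables and a symbolic expression for the output variable.

```python
# Illustrative symbolic evaluation of one path through a tiny program,
# a stand-in for the kind of output a symbolic evaluation system produces.
import sympy as sp

x, y = sp.symbols("x y")

# Program under analysis (pseudocode):
#   t = 2*x + y
#   if t > 10:  out = t - 10      <- path 1
#   else:       out = 0           <- path 2

# Symbolically execute path 1.
t = 2 * x + y                       # effect of the assignment
path_condition = sp.Gt(t, 10)       # branch predicate on this path
output_expr = sp.simplify(t - 10)   # symbolic representation of the output

print("path condition:", path_condition)   # 2*x + y > 10
print("output variable:", output_expr)     # 2*x + y - 10

# The predicate system can then be used to check or generate numeric test data.
candidate = {x: 6, y: 0}
print("candidate executes this path:", bool(path_condition.subs(candidate)))
```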

Journal ArticleDOI
TL;DR: A plotting procedure is described in this paper, which predicts local buckling and general instability loads non-destructively during testing, and its relationship to the Southwell procedure is discussed, as well as its application to combined load testing of beaded and tubular panels, and to proof testing of externally pressurized domes.
Abstract: A plotting procedure is described which predicts local buckling and general instability loads nondestructively during testing. The method is shown to have a simple theoretical basis, and its relationship to the Southwell procedure is discussed. Typical forms of test data plots are cataloged and described, and procedures for application of the method are outlined. The procedure has been used extensively over the past two years with good results. Examples showing applications to combined load testing of beaded and tubular panels, and to proof testing of externally pressurized domes, are described and test data are given. Experience regarding instrumentation, procedural details, and accuracy of the method is discussed.
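
The paper relates its procedure to the Southwell method; for orientation, here is the classical Southwell-plot computation on made-up load/deflection data (this is not the paper's own plotting procedure).

```python
# Classical Southwell-plot computation: plot deflection/load against
# deflection; the inverse of the slope estimates the critical (buckling) load.
# Data below are made up for illustration.
import numpy as np

P = np.array([10.0, 20.0, 30.0, 40.0, 45.0])        # applied loads
delta = np.array([0.11, 0.25, 0.45, 0.82, 1.15])    # measured deflections

slope, intercept = np.polyfit(delta, delta / P, 1)
P_critical = 1.0 / slope

print("predicted buckling load (nondestructive):", P_critical)
```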

Journal ArticleDOI
TL;DR: A post-sample diagnostic test for judging the temporal stability of the Box-Jenkins time series models has been developed and its application has been demonstrated in the analysis of a time series consisting of the monthly demand for in-place telephone services in Australia.
Abstract: A post-sample diagnostic test for judging the temporal stability of the Box-Jenkins time series models has been developed. The proposed test is based on the stochastic properties of the errors of the forecasts, at different leads, made from the same origin. Its application has been demonstrated in the analysis of a time series consisting of the monthly demand for in-place telephone services in Australia. An important alternative use of the test has also been indicated.
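
A simplified stand-in for the idea of examining post-sample forecast errors at several leads from one origin, using a known AR(1) model and synthetic data; the paper's diagnostic also accounts for the correlation between errors at different leads, which is not reproduced here.

```python
# Post-sample forecasts at several leads from a single origin for a known
# AR(1); each error is standardized by its own theoretical forecast-error sd.
import numpy as np

rng = np.random.default_rng(2)
phi, sigma = 0.7, 1.0

n, n_post = 200, 12                      # fitting sample and post-sample sizes
e = rng.normal(0, sigma, n + n_post)
z = np.zeros(n + n_post)
for t in range(1, n + n_post):
    z[t] = phi * z[t - 1] + e[t]

origin = z[n - 1]
leads = np.arange(1, n_post + 1)
forecasts = origin * phi ** leads        # AR(1) l-step-ahead forecasts
errors = z[n:] - forecasts

# Theoretical forecast-error variance at lead l: sigma^2 (1 - phi^(2l)) / (1 - phi^2)
sd = sigma * np.sqrt((1 - phi ** (2 * leads)) / (1 - phi ** 2))
print("standardized post-sample forecast errors:", np.round(errors / sd, 2))
```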

01 Oct 1976
TL;DR: In this paper, a systematic study of predictive techniques for earth penetration is reviewed, and an industrial proprietary computer code, which appears to be the most flexible and extensible existing model currently available, is used to examine controlled experimental data generated at AFATL.
Abstract: A systematic study of predictive techniques for earth penetration has been reviewed. Based upon this study, an industrial proprietary computer code, which appears to be the most flexible and extensible existing model currently available, has been used to examine controlled experimental data generated at AFATL. Results of this data analysis indicate that use of the code requires some judgment in proper selection of input parameters. In addition, test data generated at AFATL indicate that the drag coefficient in predictive soil penetration equations is velocity dependent over some impact regime and that soil viscosity is an important parameter in this region.

Journal ArticleDOI
TL;DR: Optimal filtering of simulated inertial navigation system test data is used to evaluate alternate laboratory and flight test techniques, which are intended to determine the value of each significant source of navigation error.
Abstract: Optimal filtering of simulated inertial navigation system (INS) test data is used to evaluate alternate laboratory and flight test techniques, which are intended to determine the value of each significant source of navigation error. Tests of both gimbaled and strapdown systems are evaluated. The major problem preventing more accurate determination of the dozens of sources of error in an INS is the high correlation between the contributions of many of the sources of error. Laboratory test sequences and flight test trajectories are presented that reduce these correlations and improve the observability of the individual sources of error. Parametric studies include the effects of flight duration and distance, multidirectional flights versus straight out-and-out flights, frequency and direction of maneuvers, and supersonic flights versus subsonic flights. The effects of the range instrumentation (reference system) accuracy and measurement frequency are demonstrated.
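
A toy Kalman-filter example of the observability point made above, with a hypothetical two-bias error model rather than the paper's INS simulation: during "straight flight" the measurement sees only the sum of two error sources, and a "maneuver" that changes the mixing lets the filter separate them.

```python
# Toy Kalman filter illustrating why maneuvers improve the observability of
# individual error sources. The two-bias model and numbers are invented.
import numpy as np

rng = np.random.default_rng(3)
x_true = np.array([0.5, -0.3])          # two constant error sources (biases)

P = np.eye(2) * 10.0                    # initial covariance
x_hat = np.zeros(2)
R = 0.05 ** 2                           # measurement noise variance

for k in range(200):
    # Straight flight for the first half, a maneuver afterwards.
    H = np.array([[1.0, 1.0]]) if k < 100 else np.array([[1.0, -0.5]])
    z = H @ x_true + rng.normal(0, 0.05)

    # Standard Kalman update (state is constant, so no prediction step needed).
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x_hat = x_hat + (K * (z - H @ x_hat)).ravel()
    P = (np.eye(2) - K @ H) @ P

    if k == 99:
        print("estimate after straight flight only:", np.round(x_hat, 3))

print("estimate after the maneuver segment:  ", np.round(x_hat, 3))
```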


Patent
26 Oct 1976
TL;DR: In this paper, the test data obtained at the testing stations and marked by this information are stored in the data processing unit and further processed, and the measured results are stored in the storage field of the data processor (10) which is reserved by virtue of the serial number.
Abstract: The test data collector is applicable to objects that are tested while moving. Each tested object is provided with a serial number and tested for function. Scannable information stored on a support is allocated to the serial number before testing, for marking purposes. This information is stored in a data processing unit, and the test data obtained at the testing stations and marked by this information are stored in the data processing unit and further processed. When an object arrives at, say, station (4), the examiner inserts the key (20) into the scanning system (E), which recognizes the serial number. The measured results are stored in the storage field of the data processor (10) that is reserved by virtue of the serial number.

Proceedings ArticleDOI
13 Oct 1976
TL;DR: The methods used for the reliability proof of a computerized reactor protection system are discussed in this paper and include the use of an automatic test system for static and dynamic analysis of the software and automatic test data generation.
Abstract: In safety-oriented applications, the software has to fulfil certain stringent reliability requirements. In order to determine the reliability of the software, a variety of different methods can be used. The methods used for the reliability proof of a computerized reactor protection system are discussed in this paper. In addition to the constructive approach with structured programming, defensive programming and other guidelines also concerning the operating system, the analytical approach is taken. This includes the use of an automatic test system for static and dynamic analysis of the software and automatic test data generation. Finally, a system test is conducted, where test data are produced according to the process and the results are compared with the results of a simulation model.

Journal ArticleDOI
TL;DR: In this paper, a set of hyperbolae with coefficients chosen to give best least squares representations of data, and adsorption isotherms obtained by analytical differentiation, were used as test data and the Frumkin and Flory-Huggins equations as test equations.
Abstract: Evaluation of a surface equation of state generally involves an evaluation of its ability to represent raw experimental data, or information derived from such data, after the parameters of the equation have been adjusted to optimize this representation, followed by an evaluation of the physical reasonableness of the optimizing parameters in terms of the physical model on which the equation of state is based. This process is analyzed critically for the adsorption of polar organic compounds at the mercury-electrolytic solution interface using high precision electrocapillary data as test data and the Frumkin and Flory-Huggins equations as test equations. In one approach, π vs ln α data were represented by a set of hyperbolae with coefficients chosen to give best least squares representations of data, and adsorption isotherms obtained by analytical differentiation; parameters for the isotherms were selected to give best least squares fits to these data. Parametrizations depending on high surface pressure limiting tangents to π vs ln α plots, and on the intercept and slope of low surface pressure ln(π/α) vs π plots, were also investigated. Parameter sets obtained by different methods of parametrization agreed only moderately well, reflecting sensitivities to different portions of the experimental data. When all Flory-Huggins parameters were freely adjusted, best fits resulted when the water co-area was taken as larger than the organic compound co-area, a physically unrealistic result.

Journal ArticleDOI
TL;DR: A computer program, based on the Hantush inflection method and designed for “desk top” computers is presented, which assumes a leaky, isotropic, homogeneous aquifer of infinite areal extent.
Abstract: A computer program, based on the Hantush inflection method and designed for “desk top” computers is presented. The method assumes a leaky, isotropic, homogeneous aquifer of infinite areal extent. The language employed is BASIC, an interactive language used on the Wang Model 2200 programmable calculator. The program can be easily adapted to FORTRAN IV for use on larger machines.

01 Nov 1976
TL;DR: In this article, a full-scale crash testing program was presented to generate experimental test data on recent intermediate size automobiles in the areas of damage susceptibility, crashworthiness and repairability and to demonstrate the capability of existing simulation models for predicting the dynamic responses of the vehicles and occupants.
Abstract: The objectives of the program were to generate experimental test data on recent intermediate size automobiles in the areas of damage susceptibility, crashworthiness and repairability and to demonstrate the capability of existing simulation models for predicting the dynamic responses of the vehicles and occupants. The full-scale crash testing program included frontal barrier and car-to-car front-to-side and front-to-rear impacts in 22 tests of 1973 and 1974 models of Plymouth Satellite and Ford Torino vehicles. The vehicle structure and occupant computer models are briefly described and comparisons of simulated and actual crash test results are presented. The methodology of static crush tests that were performed to obtain data on the force-deflection properties of the major vehicle structural components for input to the vehicle response models is also briefly described.

Book ChapterDOI
TL;DR: In this article, computer control of a biaxial mechanical test using hydraulic power under closed-loop servocontrol is described, together with computer assistance in real-time data reduction for deformation and fracture studies.
Abstract: The inadequacy of uniaxial mechanical testing data as a basis for predicting multiaxial deformation and fracture is well known. For optimal biaxial tests, computer control of simultaneous variation of loadings using hydraulic power under closed loop servocontrol is essential. Computer assistance in real time data reduction from a biaxial mechanical test is economical. Physical features of computer hardware, software, and a hydraulic biaxial system are described along with sample results illustrating cross effects in fatigue and yield surface calculations. Significance of the graphical output based on thin tube specimens of 1-in. (2.54-cm) nominal diameter, schedule 80, commercial grade polyvinyl chloride pipes, and the limitations of the present system are discussed.

Journal ArticleDOI
TL;DR: The computer program PALMAGFISHERANAL was written in WATFIV to analyze paleomagnetic directions, and has been extended to assess whether the population of directions is distributed randomly, which is useful in both research and teaching.
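
For orientation, the quantities such a program reports are the standard Fisher (1953) statistics for a set of directions; the sketch below computes them for made-up declination/inclination data and is not the original WATFIV code.

```python
# Standard Fisher (1953) statistics for a set of paleomagnetic directions
# (declination/inclination in degrees). Directions below are invented.
import numpy as np

dec = np.radians([10.0, 15.0, 8.0, 12.0, 20.0, 5.0])
inc = np.radians([45.0, 50.0, 42.0, 47.0, 55.0, 40.0])

# Unit vectors (x north, y east, z down).
x = np.cos(inc) * np.cos(dec)
y = np.cos(inc) * np.sin(dec)
z = np.sin(inc)

N = len(dec)
R = np.sqrt(x.sum() ** 2 + y.sum() ** 2 + z.sum() ** 2)   # resultant length

mean_dec = np.degrees(np.arctan2(y.sum(), x.sum()))
mean_inc = np.degrees(np.arcsin(z.sum() / R))
k = (N - 1) / (N - R)                                     # precision parameter
alpha95 = np.degrees(np.arccos(
    1 - (N - R) / R * ((1 / 0.05) ** (1 / (N - 1)) - 1)))

print(f"mean direction: D={mean_dec:.1f}, I={mean_inc:.1f}")
print(f"k={k:.1f}, alpha95={alpha95:.1f} deg")
```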

Journal ArticleDOI
TL;DR: In this article, the results of locked-cycle grinding tests are analyzed and compared with computer simulation data of the steady-state characteristics of the corresponding closed-circuit continuous grinding system.
Abstract: Though the locked-cycle test is often used for estimating the steady-state characteristics of a closed-circuit grinding system, few of these test results, especially when classification is not perfect, have been analyzed from the viewpoint of comminution kinetics. Moreover, the case in which the cut-size of the classifier is adjusted so that a prerequisite condition on the size distribution of the fine products is satisfied has not been investigated experimentally. In this paper, the results of locked-cycle grinding tests including the above mentioned cases are analyzed and compared with the computer simulation data of steady-state characteristics of the corresponding closed-circuit continuous grinding system. The factors examined are grinding time and the ratio of partition towards the coarser products of the classifier. The analysis shows that the steady-state characteristics of a closed-circuit continuous grinding system can be estimated with reasonably high accuracy from the test data of corresponding locked-cycle grinding tests.

Journal ArticleDOI
W. Ham
TL;DR: The general problem of reducing data available in the integrated circuit environment is addressed and examples of data reduction and presentation techniques are given where relation between performance and identification can be seen.
Abstract: The general problem of reducing data available in the integrated circuit environment is addressed. One can consider the circuit to be essentially composed of identification and performance data. Opportunities exist at every level of manufacturing and testing to aid in the final interpretation of the results. Without disturbing the technology or actual circuit design, some general methods for improving testing techniques are discussed. Examples of data reduction and presentation techniques are given where relation between performance and identification can be seen.

Journal ArticleDOI
TL;DR: In this article, a modified two-surface approach, which treats each component as a separate source and uses grids located close to the machine, was found to produce more accurate and confident results.
Abstract: The "two-surface" correction technique, as it is outlined in ASME Performance Test Code No. 36, is utilized in the determination of large steam turbine-generator sound power level (PWL). Test data obtained from using this method and other PWL calculation methods are presented. The two-surface method did not yield consistent results in the indoor environments encountered and had only limited application in turbine-generator PWL calculations. A modified two-surface approach was also tested. This method, which treats each component as a separate source and uses grids located close to the machine, was found to produce more accurate and confident results.
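
For orientation, the sketch below shows only the basic surface-measurement step that the compared PWL calculation methods share (energy-averaging the measured sound pressure levels and adding the surface-area term); the ASME PTC 36 two-surface environmental correction itself is not reproduced, and all values are made up.

```python
# Basic surface-measurement step of a sound power level (PWL) calculation.
# Readings and surface area are invented; no environmental correction applied.
import math

spl_readings_db = [92.1, 93.4, 91.8, 94.0, 92.7, 93.1]  # dB re 20 uPa
surface_area_m2 = 120.0                                  # measurement surface, m^2

# Energy (not arithmetic) average of the sound pressure levels.
lp_avg = 10 * math.log10(sum(10 ** (lp / 10) for lp in spl_readings_db)
                         / len(spl_readings_db))

# Sound power level re 1 pW, with reference surface S0 = 1 m^2.
pwl = lp_avg + 10 * math.log10(surface_area_m2 / 1.0)

print(f"surface-averaged SPL = {lp_avg:.1f} dB, PWL = {pwl:.1f} dB")
```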