scispace - formally typeset

Test harness

About: Test harness is a research topic. Over its lifetime, 2,834 publications have been published within this topic, receiving 50,365 citations.


Papers
Book
28 Oct 1999
TL;DR: This book presents models, patterns, and tools for object-oriented testing, covering topics such as how to develop a decision table, a tester's guide to the UML, test harness design, and assertion tools for post-development testing.
Abstract: List of Figures. List of Tables. List of Procedures. Foreword. Preface. Acknowledgments. I. PRELIMINARIES. 1. A Small Challenge. 2. How to Use This Book. Reader Guidance. Conventions. FAQs for Object-oriented Testing. Test Process. 3. Testing: A Brief Introduction. What Is Software Testing? Definitions. The Limits of Testing. What Can Testing Accomplish? Bibliographic Notes. 4. With the Necessary Changes: Testing and Object-oriented Software. The Dismal Science of Software Testing. Side Effects of the Paradigm. Language-specific Hazards. Coverage Models for Object-oriented Testing. An OO Testing Manifesto. Bibliographic Notes. II. MODELS. 5. Test Models. Test Design and Test Models. Bibliographic Notes. 6. Combinational Models. How Combinational Models Support Testing. How to Develop a Decision Table. Deriving the Logic Function. Decision Table Validation. Test Generation. Choosing a Combinational Test Strategy. Bibliographic Notes. 7. State Machines. Motivation. The Basic Model. The FREE State Model. State-based Test Design. Bibliographic Notes. 8. A Tester's Guide to the UML. Introduction. General-purpose Elements. Use Case Diagram. Class Diagram. Sequence Diagram. Activity Diagram. Statechart Diagram. Collaboration Diagram. Component Diagram. Deployment Diagram. Graphs, Relations, and Testing. Bibliographic Notes. III. PATTERNS. 9. Results-oriented Test Strategy. Results-oriented Testing. Test Design Patterns. Test Design Template. Documenting Test Cases, Suites, and Plans. Bibliographic Notes. 10. Classes. Class Test and Integration. Preliminaries. Method Scope Test Design Patterns. Category-Partition. Combinational Function Test. Recursive Function Test. Polymorphic Message Test. Class Scope Test Design Patterns. Invariant Boundaries. Nonmodal Class Test. Quasi-modal Class Test. Modal Class Test. Flattened Class Scope Test Design Patterns. Polymorphic Server Test. Modal Hierarchy Test. Bibliographic Notes. 11. Reusable Components. Testing and Reuse. 
Test Design Patterns. Abstract Class Test. Generic Class Test. New Framework Test. Popular Framework Test. Bibliographic Notes. 12. Subsystems. Subsystems. Subsystem Test Design Patterns. Class Association Test. Round-trip Scenario Test. Controlled Exception Test. Mode Machine Test. Bibliographic Notes. 13. Integration. Integration in Object-oriented Development. Integration Patterns. Subsystem/System Scope. Big Bang Integration. Bottom-up Integration. Top-down Integration. Collaboration Integration. Backbone Integration. Layer Integration. Client/Server Integration. Distributed Services Integration. High-frequency Integration. Bibliographic Notes. 14. Application Systems. Testing Application Systems. Test Design Patterns. Extended Use Case Test. Covered in CRUD. Allocate Tests by Profile. Implementation-specific Capabilities. Post-development Testing. Note on Testing Performance Objectives. Bibliographic Notes. 15. Regression Testing. Preliminaries. Test Patterns. Retest All. Retest Risky Use Cases. Retest by Profile. Retest Changed Code. Retest Within Firewall. Bibliographic Notes. IV. TOOLS. 16. Test Automation. Why Testing Must Be Automated. Limitations and Caveats. 17. Assertions. Introduction. Implementation-based Assertions. Responsibility-based Assertions. Implementation. The Percolation Pattern. Deployment. Limitations and Caveats. Some Assertion Tools. Bibliographic Notes. 18. Oracles. Introduction. Oracle Patterns. Comparators. Bibliographic Notes. 19. Test Harness Design. How to Develop a Test Harness. Test Case Patterns. Test Case/Test Suite Method. Test Case/Test Suite Class. Catch All Exceptions. Test Control Patterns. Server Stub. Server Proxy. Driver Patterns. TestDriver Superclass. Percolate the Object Under Test. Symmetric Driver. Subclass Driver. Private Access Driver. Test Control Interface. Drone. Built-in Test Driver. Test Execution Patterns. Command Line Test Bundle. Incremental Testing Framework. Fresh Objects. A Test Implementation Syntax. 
Bibliographic Notes. Appendix. BigFoot's Tootsie: A Case Study. Requirements. OOA/D for Capability-driven Testing. Implementation. Glossary. References. Index.
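The harness patterns named in Chapter 19 (Test Case/Test Suite Class, Catch All Exceptions) can be sketched in a few lines. The following is a hypothetical Python illustration of those two patterns, not Binder's own code; all class and function names are invented for the example.

```python
# Sketch of the Test Case/Test Suite Class pattern: each TestCase wraps one
# check, and a TestSuite aggregates cases. The suite catches all exceptions
# ("Catch All Exceptions" pattern) so one failing case cannot stop the run.

class TestCase:
    def __init__(self, name, check):
        self.name = name
        self.check = check  # zero-argument callable; raises on failure

    def run(self):
        try:
            self.check()
            return (self.name, "pass")
        except Exception as exc:
            return (self.name, f"fail: {exc}")

class TestSuite:
    def __init__(self):
        self.cases = []

    def add(self, case):
        self.cases.append(case)

    def run(self):
        # Every case runs, regardless of earlier failures.
        return [case.run() for case in self.cases]

# Usage: one passing and one deliberately failing case.
def check_abs():
    assert abs(-3) == 3

def check_broken():
    assert 1 + 1 == 3

suite = TestSuite()
suite.add(TestCase("abs", check_abs))
suite.add(TestCase("broken", check_broken))
results = suite.run()
```

Real frameworks such as Python's `unittest` or Java's JUnit are direct descendants of this pattern, with fixtures and reporting layered on top.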

1,164 citations

Journal ArticleDOI
TL;DR: A method for creating functional test suites has been developed in which a test engineer analyzes the system specification, writes a series of formal test specifications, and then uses a generator tool to produce test descriptions from which test scripts are written.
Abstract: A method for creating functional test suites has been developed in which a test engineer analyzes the system specification, writes a series of formal test specifications, and then uses a generator tool to produce test descriptions from which test scripts are written. The advantages of this method are that the tester can easily modify the test specification when necessary, and can control the complexity and number of the tests by annotating the tests specification with constraints.
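The workflow described above can be mimicked in miniature: treat the formal test specification as a set of parameter categories, annotate it with a constraint, and let a generator enumerate test descriptions. This is an illustrative sketch of the idea only; the spec fields and constraint are invented, and the paper's actual tool and specification language differ.

```python
from itertools import product

# A tiny "formal test specification": each parameter lists its categories.
spec = {
    "file_size": ["empty", "small", "large"],
    "permissions": ["readable", "unreadable"],
}

# Constraint annotation (illustrative): prune the combination of a large
# file with unreadable permissions, reducing the number of tests generated.
def valid(desc):
    return not (desc["file_size"] == "large"
                and desc["permissions"] == "unreadable")

def generate(spec, constraint):
    """Enumerate test descriptions satisfying the constraint."""
    keys = list(spec)
    for values in product(*(spec[k] for k in keys)):
        desc = dict(zip(keys, values))
        if constraint(desc):
            yield desc

descriptions = list(generate(spec, valid))
# 3 * 2 = 6 raw combinations; the constraint prunes one, leaving 5.
```

Tightening or loosening the constraint is how the tester controls the number and complexity of generated tests without rewriting them by hand.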

827 citations

Journal ArticleDOI
TL;DR: A technique to select a representative set of test cases from a test suite that provides the same coverage as the entire test suite by identifying, and then eliminating, the redundant and obsolete test cases in the test suite is presented.
Abstract: This paper presents a technique to select a representative set of test cases from a test suite that provides the same coverage as the entire test suite. This selection is performed by identifying, and then eliminating, the redundant and obsolete test cases in the test suite. The representative set replaces the original test suite and thus, potentially produces a smaller test suite. The representative set can also be used to identify those test cases that should be rerun to test the program after it has been changed. Our technique is independent of the testing methodology and only requires an association between a testing requirement and the test cases that satisfy the requirement. We illustrate the technique using the data flow testing methodology. The reduction that is possible with our technique is illustrated by experimental results.
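The required association between testing requirements and test cases can be illustrated with a simple greedy reduction. Note this greedy set-cover heuristic is a common stand-in, not necessarily the paper's exact algorithm, and the test and requirement names are invented.

```python
# Greedy test-suite reduction: repeatedly keep the test case covering the
# most still-unsatisfied requirements, until every requirement covered by
# the original suite is covered by the representative set.

def reduce_suite(coverage):
    """coverage maps test name -> set of requirements it satisfies."""
    remaining = set().union(*coverage.values())
    representative = []
    while remaining:
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        if not coverage[best] & remaining:
            break  # nothing left to gain
        representative.append(best)
        remaining -= coverage[best]
    return representative

# Example association: t4 alone satisfies all three requirements, so the
# other tests are redundant and can be eliminated.
cov = {
    "t1": {"r1", "r2"},
    "t2": {"r2", "r3"},
    "t3": {"r3"},
    "t4": {"r1", "r2", "r3"},
}
rep = reduce_suite(cov)
```

Because only the requirement-to-test association is consulted, the same reduction works whether the requirements are data-flow associations, branches, or use cases, which is the methodology-independence the paper emphasizes.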

630 citations

Proceedings ArticleDOI
20 Oct 2007
TL;DR: RANDOOP generates unit tests for Java code using feedback-directed random test generation, and provides an annotation-based interface for specifying configuration parameters that affect RANDOOP's behavior and output.
Abstract: RANDOOP for Java generates unit tests for Java code using feedback-directed random test generation. Below we describe RANDOOP's input, output, and test generation algorithm. We also give an overview of RANDOOP's annotation-based interface for specifying configuration parameters that affect RANDOOP's behavior and output.
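The feedback-directed loop can be sketched abstractly: generate a random call, execute it immediately, and use the outcome as feedback, discarding illegal sequences, reusing normal results as future inputs, and recording failures as error-revealing tests. This toy Python sketch is only in the spirit of RANDOOP, which targets Java method sequences; the operations and pool here are invented for illustration.

```python
import random

# Feedback-directed random generation, miniaturized: each iteration picks a
# random operation and a random previously produced value, runs it, and
# feeds the outcome back into generation.

random.seed(0)  # reproducible run

operations = [
    ("neg", lambda x: -x),
    ("half", lambda x: x // 2),
    ("inv", lambda x: 1 // x),   # raises ZeroDivisionError on 0
]

pool = [0, 1, 7]   # seed values, analogous to RANDOOP's initial sequences
errors = []        # error-revealing inputs found along the way

for _ in range(50):
    name, op = random.choice(operations)
    arg = random.choice(pool)
    try:
        result = op(arg)
    except ZeroDivisionError:
        errors.append((name, arg))   # feedback: record failure, don't extend
    else:
        pool.append(result)          # feedback: reuse as a future input
```

The feedback step is what distinguishes this from pure random testing: only sequences that executed normally are extended, so generation effort concentrates on legal, deepening call sequences.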

438 citations

Journal ArticleDOI
TL;DR: The experiments have shown that the chaining approach may significantly improve the chances of finding test data as compared to the existing methods of automated test data generation.
Abstract: Software testing is very labor intensive and expensive and accounts for a significant portion of software system development cost. If the testing process could be automated, the cost of developing software could be significantly reduced. Test data generation in program testing is the process of identifying a set of test data that satisfies a selected testing criterion, such as statement coverage and branch coverage. In this article we present a chaining approach for automated software test data generation which builds on the current theory of execution-oriented test data generation. In the chaining approach, test data are derived based on the actual execution of the program under test. For many programs, the execution of the selected statement may require prior execution of some other statements. The existing methods of test data generation may not efficiently generate test data for these types of programs because they only use control flow information of a program during the search process. The chaining approach uses data dependence analysis to guide the search process, i.e., data dependence analysis automatically identifies statements that affect the execution of the selected statement. The chaining approach uses these statements to form a sequence of statements that is to be executed prior to the execution of the selected statement. The experiments have shown that the chaining approach may significantly improve the chances of finding test data as compared to the existing methods of automated test data generation.
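The execution-oriented foundation the chaining approach builds on can be shown in a few lines: search for an input that drives execution down a selected branch by minimizing a "branch distance" measured on actual runs. This sketch shows only that underlying search (simple hill climbing on an invented example program); the chaining refinement itself, using data-dependence analysis to sequence prerequisite statements, is omitted.

```python
# Execution-oriented test data generation: find an input x that makes the
# selected branch (flag == 0) execute, by minimizing a branch-distance
# function evaluated on real executions.

def program(x):
    flag = x * 2 - 10
    return "target" if flag == 0 else "other"

def branch_distance(x):
    # Distance to satisfying `flag == 0`; zero means the branch is taken.
    return abs(x * 2 - 10)

def search(start, steps=100):
    x = start
    for _ in range(steps):
        if branch_distance(x) == 0:
            return x
        # Move to whichever neighbor reduces the distance (hill climbing).
        x = min((x - 1, x + 1), key=branch_distance)
    return None  # budget exhausted without covering the branch

test_input = search(start=40)
# x = 5 zeroes the distance, so program(test_input) takes the target branch
```

When the target statement's outcome depends on variables set elsewhere, pure control-flow-guided search like this stalls; the chaining approach's contribution is to identify those defining statements via data dependence and search for inputs that execute them first.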

389 citations


Network Information
Related Topics (5)
Software construction: 36.2K papers, 743.8K citations (83% related)
Software development: 73.8K papers, 1.4M citations (82% related)
Software system: 50.7K papers, 935K citations (81% related)
Web service: 57.6K papers, 989K citations (77% related)
Software: 130.5K papers, 2M citations (76% related)
Performance Metrics
No. of papers in the topic in previous years:

Year  Papers
2023  11
2022  42
2021  4
2020  11
2019  16
2018  15