Showing papers on "Verification and validation of computer simulation models" published in 2000


BookDOI
01 May 2000
TL;DR: This book discusses behavioral versus RTL thinking, high-level modeling, and the choice between VHDL and Verilog for functional verification.
Abstract: About the Cover. Foreword. Preface. Why This Book Is Important. What This Book Is About. What Prior Knowledge You Should Have. Reading Paths. Choosing a Language: VHDL vs. Verilog. Hardware Verification Languages. And the Winner is... For More Information. Acknowledgements. 1: What is Verification? What is a Testbench? The Importance of Verification. Reconvergence Model. The Human Factor. What Is Being Verified? Functional Verification Approaches. Testing Versus Verification. Design and Verification Reuse. The Cost of Verification. Summary. 2: Verification Tools. Linting Tools. Simulators. Verification Intellectual Property. Waveform Viewers. Code Coverage. Functional Coverage. Verification Languages. Assertions. Revision Control. Issue Tracking. Metrics. Summary. 3: The Verification Plan. The Role of the Verification Plan. Levels of Verification. Verification Strategies. From Specification to Features. Directed Testbenches Approach. Coverage-Driven Random-Based Approach. Summary. 4: High-Level Modeling. Behavioral versus RTL Thinking. You Gotta Have Style! Structure of Behavioral Code. Data Abstraction. Object-Oriented Programming. Aspect-Oriented Programming. The Parallel Simulation Engine. Race Conditions. Verilog Portability Issues. Summary. 5: Stimulus and Response. Reference Signals. Simple Stimulus. Simple Output. Complex Stimulus. Bus-Functional Models. Response Monitors. Transaction-Level Interface. Summary. 6: Architecting Testbenches. Test Harness. VHDL Test Harness. Design Configuration. Self-Checking Testbenches. Directed Stimulus. Random Stimulus. Summary. 7: Simulation Management. Behavioral Models. Pass or Fail? Managing Simulations. Regression. Summary. Appendix A: Coding Guidelines. Directory Structure. General Coding Guidelines. Naming Guidelines. HDL Coding Guidelines. Appendix B: Glossary. Afterwords. Index.

374 citations


Proceedings ArticleDOI
10 Dec 2000
TL;DR: Several issues regarding the complexity of simulation models are discussed, summarizing the findings in this area so far, and calling attention to this area that, despite its importance, appears to remain at the bottom of simulation research agendas.
Abstract: Nowadays the size and complexity of models is growing more and more, forcing modelers to face some problems that they were not accustomed to. Before trying to study ways to deal with complex models, a more important and primary question to explore is, is there any means to avoid the generation of complex models? The primary purpose of this paper is to discuss several issues regarding the complexity of simulation models, summarizing the findings in this area so far, and calling attention to this area that, despite its importance, appears to remain at the bottom of simulation research agendas.

157 citations


Proceedings ArticleDOI
10 Dec 2000
TL;DR: The paper discusses verification, validation, and accreditation of simulation models and the different approaches to deciding model validity.
Abstract: The paper discusses verification, validation, and accreditation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process is discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; a recommended procedure is presented; and accreditation is briefly discussed.

105 citations
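One operational-validity technique of the kind the paper defines is comparing model output with observed system data via a statistical test. Below is a minimal, hypothetical sketch using a two-sample t-test on mean throughput; the data, the output measure, and the choice of test are illustrative assumptions, not taken from the paper:

```python
# Hypothetical operational-validity check: compare simulated vs. observed
# mean output with a two-sample t-test (illustrative data, not from the paper).
from scipy import stats

observed_throughput = [41.2, 39.8, 42.5, 40.1, 41.9, 38.7]   # real-system runs
simulated_throughput = [40.5, 42.1, 39.9, 41.3, 40.8, 41.6]  # model replications

t_stat, p_value = stats.ttest_ind(observed_throughput, simulated_throughput)

# Failing to reject H0 (equal means) does not prove validity; it only
# fails to invalidate the model for this particular output measure.
alpha = 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
print("model not invalidated" if p_value > alpha else "investigate discrepancy")
```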


Journal ArticleDOI
TL;DR: In this article, an inverse simulation of a simplified version of the slalom maneuver in the ADS-33D helicopter handling qualities specification is presented, both when the required trajectory is prescribed explicitly and when it is defined indirectly through geometric and dynamic constraints.
Abstract: An inverse simulation methodology based on numerical optimization is presented. The methodology is applied to a simplified version of the slalom maneuver in the ADS-33D helicopter handling qualities specification. The inverse simulation is formulated as an optimization problem with trajectory and dynamic constraints, pilot inputs as design variables, and an objective function that depends on the specific problem being solved. A maximum speed solution is described. The results show that numerical optimization is a reliable and flexible tool for inverse simulation, both when the required trajectory is prescribed explicitly and when it is defined indirectly through geometric and dynamic constraints. When the trajectory is defined indirectly, there is not a single acceptable trajectory, but rather an entire family with noticeable differences in the helicopter dynamics and in the required pilot inputs. Even when the trajectory is prescribed explicitly, multiple solutions exist. For handling qualities studies, the multiple solutions may provide an indication of the amount of scatter in pilot ratings to be expected for a given aircraft and a given maneuver. However, if the inverse simulation is used for simulation validation, then additional constraints may have to be placed on the solution to make it unique.

56 citations
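To make the formulation concrete, here is a heavily simplified sketch of inverse simulation as constrained optimization: pilot inputs are the design variables and gate positions act as trajectory constraints. The point-mass dynamics, gate values, and smoothness objective are all illustrative assumptions, not the paper's helicopter model:

```python
# Minimal sketch of inverse simulation as constrained optimization.
# Toy lateral point-mass dynamics stand in for the helicopter model.
import numpy as np
from scipy.optimize import minimize

DT, N = 0.5, 20                      # step size, number of input samples

def simulate(u):
    """Integrate toy lateral dynamics y'' = u - 0.3*y' and return y(t)."""
    y, v, traj = 0.0, 0.0, []
    for ui in u:
        v += DT * (ui - 0.3 * v)
        y += DT * v
        traj.append(y)
    return np.array(traj)

# Slalom-like gates: required lateral offsets at given time indices.
gates = {5: 1.0, 10: -1.0, 15: 1.0}

def gate_violation(u):
    traj = simulate(u)
    return np.array([traj[k] - target for k, target in gates.items()])

res = minimize(
    lambda u: np.sum(np.square(u)),          # prefer small, smooth inputs
    x0=np.zeros(N),
    constraints={"type": "eq", "fun": gate_violation},
    method="SLSQP",
)
print("converged:", res.success, " max |input|:", np.abs(res.x).max().round(3))
```

Defining the trajectory only through the gate constraints, as here, admits a whole family of solutions; the quadratic objective is one way of picking a unique member of that family, echoing the paper's remark on uniqueness.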


Proceedings ArticleDOI
10 Dec 2000
TL;DR: This paper discusses verification, validation, and accreditation of simulation models and the different approaches to deciding model validity.
Abstract: This paper discusses verification, validation, and accreditation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process is discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; a recommended procedure is presented; and accreditation is briefly discussed.

51 citations


Journal ArticleDOI
TL;DR: There is great confusion over the meaning of the terms validation and verification as they apply to intelligent systems; this paper clarifies the two terms and how several researchers are implementing them, and its second part details some techniques that can be used to perform the verification and validation of systems.
Abstract: Researchers and practitioners in the field of expert systems all generally agree that to be useful, any fielded intelligent system must be adequately verified and validated. But what does this mean in concrete terms? What exactly is verification? What exactly is validation? How are they different? Many authors have attempted to define these terms and, as a result, several interpretations have surfaced. It is our opinion that there is great confusion as to what these terms mean, how they are different, and how they are implemented. This paper, therefore, has two aims—to clarify the meaning of the terms validation and verification as they apply to intelligent systems, and to describe how several researchers are implementing these. The second part of the paper, therefore, details some techniques that can be used to perform the verification and validation of systems. Also discussed is the role of testing as part of the above-mentioned processes.

50 citations


Journal ArticleDOI
TL;DR: A methodological framework for the validation of predictive simulations that synthesizes concepts from the fields of Conventional Model Verification and Validation, Testing of Scientific Theories, Design of Experiments, Error/Sensitivity, and Statistical Exploratory Data Analysis is presented.

47 citations


Proceedings ArticleDOI
10 Dec 2000
TL;DR: This work provides guidance in developing and executing a comprehensive and detailed verification, validation and accreditation (VV&A) plan throughout the entire M&S application development life cycle.
Abstract: A comprehensive and detailed verification, validation and accreditation (VV&A) plan and its proper execution are crucially important for the successful accreditation of a modeling and simulation (M&S) application. We provide guidance in developing and executing such a plan throughout the entire M&S application development life cycle. The U.S. Department of Defense (DoD) is the largest sponsor and user of modeling and simulation applications in the world. DoD uses many different types of M&S applications, consisting of a combination of software, hardware, and humanware, under diverse objectives including acquisition, analysis and training.

46 citations


Proceedings ArticleDOI
10 Dec 2000
TL;DR: It appears that application domains of simulation models affect what topics need verification, validation, and accreditation research.
Abstract: Six simulation professionals present their views on the directions that they believe that verification, validation, and accreditation research should take. Two of the six are active verification, validation, and accreditation researchers from academia, two develop industry simulation models, and two work in verification, validation, and accreditation of military simulation models. A number of areas and topics for research in verification, validation, and accreditation are identified. It appears that application domains of simulation models affect what topics need verification, validation, and accreditation research.

38 citations


Journal ArticleDOI
TL;DR: The significance of simulation experiments is investigated, the standard methodology used in the development of a simulation model is explained, and a detailed case study using SIMSCRIPT II.5 simulation language is presented.

32 citations


Proceedings ArticleDOI
10 Dec 2000
TL;DR: The Unified Modeling Language (UML) is used to specify simulation models and it is shown that, similar to the "Unified Process" in software engineering, such a methodology forms a sound base for developing complex simulation models.
Abstract: Designing complex simulation models is a task essentially associated with software engineering. In this paper, the Unified Modeling Language (UML) is used to specify simulation models. It is shown that, similar to the "Unified Process" in software engineering, such a methodology forms a sound base for developing complex simulation models. An example is provided to illustrate how this approach supports the design process.

Proceedings ArticleDOI
10 Dec 2000
TL;DR: A refined V&V process is introduced, a conceptual approach for subphase-wise organization of V&V activities is presented, and a hierarchical presentation of V&V results is shown which addresses the different people involved in the use or accreditation of simulation models.
Abstract: Model verification, validation and accreditation (VV&A) is as complex as developing a modeling and simulation (M&S) application itself. For the purpose of structuring both verification and validation (V&V) activities and V&V results, we introduce a refined V&V process. After identification of the major influence factors on applicable V&V, a conceptual approach for subphase-wise organization of V&V activities is presented. Finally a hierarchical presentation of V&V results is shown which addresses different people involved in use or in accreditation of simulation models.

Journal ArticleDOI
TL;DR: In this paper, two case studies of model validation and verification of large and complex space structures are presented, one focusing on the mated Russian Mir Space Station and United States Space Shuttle and the other involving the International Space Station P6 truss segment tested in the launch configuration.
Abstract: In this paper two case studies of model validation and verification of large and complex space structures are presented. The first case study focuses on experience gained in performing model validation and verification of the mated Russian Mir Space Station and United States Space Shuttle. This study makes use of dynamic test data taken during the STS-81 flight associated with the Mir Structural Dynamics Experiment. The second case study involves the International Space Station P6 truss segment tested in the launch configuration. In both cases, a newly developed capability within UAI/NASTRAN is used to perform the model validation and verification. Key technical issues raised in these studies provide useful insight for future pretest planning and model validation and verification efforts.

Proceedings ArticleDOI
10 Dec 2000
TL;DR: An integrated approach to VV&A from a system perspective is presented and the relationships between the M&S resources in an integrated V&V program are identified.
Abstract: In an M&S-based systems acquisition, computer simulation is used throughout the development process not just as an analysis tool but also as a development tool. In general, development of a system capability using M&S-based systems development will result in multiple models or simulations to meet specific needs. The verification, validation and accreditation (VV&A) of each of these tools is integral to M&S development. Integrating verification and validation (V&V) activities with M&S development and then integrating the VV&A activities for all of the M&S resources that support a program provides a cost-effective approach to ensure the necessary confidence in M&S results within the time and resources available. This paper presents such an integrated approach to VV&A from a system perspective and identifies the relationships between the M&S resources in an integrated V&V program.

Journal Article
TL;DR: APL's leading role in DoD's VV&A activities is discussed and the Laboratory's contribution to advances in VV&A methodology is described.
Abstract: Amazing advances in software, computer, and network technology occurred during the 1990s. These advances contributed significantly to the increasing impact of models and simulations on numerous scientific and defense-related domains including medical diagnosis, system acquisition and development, planning and analysis, and training. A growing emphasis on processes that ensure that computer models and simulations perform correctly and that appropriate confidence is placed in their results has accompanied this increased use of models and simulations. Verification, validation, and accreditation (VV&A) serve as the cornerstones of simulation correctness and credibility. This article discusses APL's leading role in DoD's VV&A activities and describes the Laboratory's contribution to advances in VV&A methodology. (Keywords: Credibility, Model, Simulation.)

Proceedings ArticleDOI
10 Dec 2000
TL;DR: In this article, the authors present approaches for both identifying and replacing these candidate variables, which can reduce the subcomponent model complexity by eliminating, grouping, or estimating model parameters or variables at a less detailed level without grossly affecting the simulation results.
Abstract: Today's industrial and defense communities are increasingly reliant on the use of simulation to reduce cost. At times, due to their stove-piped nature, these simulations themselves have resulted in a waste of both time and money with regard to future simulation development. Current trends address this problem by promoting the development of simulation infrastructures that are scalable, portable, and interoperable over a variety of paradigms. These infrastructures, such as HLA and SPEEDES, address cost issues by providing simulation infrastructures that promote model re-use by managing model interactions across diverse paradigms, improving scenario development, and allowing for a scalable distributed simulation capability. While these modern simulation infrastructures address many cost-related issues, they do not fully address issues related to model re-use. Simulations that utilize model reuse may result in large complex system models comprised of a diverse set of subsystem component models covering varying amounts of detail and fidelity. Often, a complex simulation that re-uses high fidelity subcomponent models may result in a more detailed system model than the simulation objective requires. Simulating such a system model results in a waste of simulation time with respect to addressing the simulation goals. These simulation costs, however, can be reduced through the use of abstract modeling techniques. These techniques can reduce the subcomponent model complexity by eliminating, grouping, or estimating model parameters or variables at a less detailed level without grossly affecting the simulation results. Key issues in the abstraction process involve identifying the variables or parameters that can be abstracted away for a given simulation objective and applying the proper abstraction technique to replace those parameters. This paper presents approaches for both identifying and replacing these candidate variables.
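As a toy illustration of the "grouping" technique named above, the sketch below lumps three per-stage delay parameters of a hypothetical detailed model into one aggregate parameter and checks that the output statistic of interest is preserved (all numbers are invented, not from the paper):

```python
# Hedged sketch of parameter grouping: a three-stage serial delay line is
# replaced by one lumped delay whose value is estimated from the detailed
# model (illustrative numbers only).
import random

random.seed(1)
stage_means = [0.8, 1.5, 0.7]          # detailed model: per-stage mean delays

def detailed_delay():
    return sum(random.expovariate(1.0 / m) for m in stage_means)

def abstract_delay(lumped_mean):
    return random.expovariate(1.0 / lumped_mean)

# Estimate the lumped parameter from the detailed model, then check that
# the abstraction does not grossly affect the output statistic of interest.
n = 20000
lumped = sum(detailed_delay() for _ in range(n)) / n
mean_abstract = sum(abstract_delay(lumped) for _ in range(n)) / n
print(f"detailed mean ~ {lumped:.3f}, abstracted mean ~ {mean_abstract:.3f}")
```

Note that the lumped exponential preserves the mean but not the variance of the detailed delay, which is exactly the kind of trade-off the abstraction process must weigh against the simulation objective.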

01 Oct 2000
TL;DR: An overview of the modeling and validation of a complex engineering simulation performed at the Los Alamos National Laboratory is presented, which represents the highly transient response of an assembly with complex joints subjected to an impulsive load.
Abstract: An overview of the modeling and validation of a complex engineering simulation performed at the Los Alamos National Laboratory is presented. The application discussed represents the highly transient response of an assembly with complex joints subjected to an impulsive load. The primary sources of nonlinearity were the contact mechanics. Several tests were conducted to assess the degree of experimental uncertainty, the variability of the geometry of the test article and its assembly procedures, and to provide reference data for model validation. After presenting the experiment and the corresponding numerical simulation, several issues of model validation are addressed. They include data reduction, feature extraction, design of computer experiments, statistical effects analysis, and model updating. It is shown how these tools can help the analyst gain confidence regarding the predictive quality of the simulation.

Proceedings ArticleDOI
Pär Klingstam, B.-G. Olsson
01 Dec 2000
TL;DR: In this paper, the authors describe how discrete event simulation should be used as a tool for continuous process verification in industrial system development and describe a specification of the working procedures to be used in each life cycle phase of a development project, as well as a definition of areas where efforts are needed in the future.
Abstract: The purpose of this paper is to describe how discrete event simulation should be used as a tool for continuous process verification in industrial system development. Results include a specification of the working procedures to be used in each life cycle phase of a development project, as well as a definition of the areas where efforts are needed in the future. The approach assures continuous verification of the processes, which will lead to better decisions early on. Better decisions imply reduction in time and costs as well as systems with high quality. In conclusion, using simulation techniques for continuous process verification makes us more likely to develop an optimal industrial solution.

Proceedings ArticleDOI
Tomas Berling, Per Runeson
03 Apr 2000
TL;DR: It is concluded that the factorial design is a considerable support for the validation planning, enabling efficient system performance validation, and the cost reduction was in this case 40%, compared with the previously used method.
Abstract: System verification and validation are performed to secure that a system fulfils the different quality aspects, such as reliability, functionality, maintainability, performance and user friendliness. Since the verification and validation activities take a large share of the total project budget, efficiency and effectiveness are key issues. In this paper a method is presented for application of the well-known technique for experimental planning, factorial design, to validation of system performance. The proposed method is applied in this case study to the validation of the effectiveness of a radar system. It is concluded that the factorial design is a considerable support for the validation planning, enabling efficient system performance validation. The cost reduction was in this case 40%, compared with the previously used method. At the same time more information of the effects of factors was gained.
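A minimal sketch of the underlying technique, a 2^k full factorial design with main-effect estimation, is shown below; the factor names and toy response function are invented stand-ins for the radar-system study:

```python
# Sketch of 2^k factorial design for validation planning: enumerate all
# level combinations for three hypothetical factors and estimate main
# effects from a toy response (illustrative, not the paper's radar data).
from itertools import product

factors = ["pulse_rate", "target_speed", "clutter"]        # hypothetical
design = list(product([-1, +1], repeat=len(factors)))      # 2^3 = 8 runs

def run_system(levels):                  # stand-in for one validation run
    pr, ts, cl = levels
    return 50 + 4 * pr - 2 * ts + 1 * cl + 0.5 * pr * ts   # toy response

responses = [run_system(run) for run in design]

# Main effect of factor j: mean response at +1 minus mean at -1.
for j, name in enumerate(factors):
    hi = [r for run, r in zip(design, responses) if run[j] == +1]
    lo = [r for run, r in zip(design, responses) if run[j] == -1]
    print(f"{name:12s} effect = {sum(hi)/len(hi) - sum(lo)/len(lo):+.2f}")
```

Because every run contributes to every effect estimate, far fewer runs are needed than with one-factor-at-a-time testing, which is the source of the cost reduction the paper reports.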

Journal ArticleDOI
TL;DR: The validation results illustrate that the proposed multistage validation procedure can account for the complexity of the validation task and its conclusions.
Abstract: A multistage validation framework that accounts for the realistic nature of traffic simulation output data is proposed. The framework consists of conceptual validation and operational validation. The operational validation comprises a qualitative approach, which involves static and animated Turing tests, and a quantitative approach, which involves three levels of statistical tests. Particularly in the third-level statistical test, the autocorrelation and nonstationary nature of traffic simulation output data is emphasized, its implications on validation methods are explored, and a univariate nonseasonal autoregressive-integrated-moving-average (ARIMA) modeling approach is proposed. Finally, numerical results for an actual freeway network are presented. The validation results illustrate that the proposed multistage validation procedure can account for the complexity of the validation task and its conclusions.
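The core of the third-level test, fitting an ARIMA model so that autocorrelated output is not treated as i.i.d. samples, might look roughly like this sketch (the synthetic AR(1) "speed" series and the model order are assumptions; statsmodels is used for the fit):

```python
# Hedged sketch: fit an ARIMA model to autocorrelated simulation output so
# comparisons do not assume i.i.d. samples. Series and order are illustrative.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic autocorrelated "simulated speeds": AR(1) around 60 km/h.
speeds = [60.0]
for _ in range(299):
    speeds.append(60 + 0.8 * (speeds[-1] - 60) + rng.normal(0, 2))

fit = ARIMA(np.array(speeds), order=(1, 0, 0)).fit()
print(fit.params)          # constant, AR coefficient, innovation variance

# Comparing fitted ARIMA structures (orders and coefficients) of field data
# and simulation output tests dynamic, not just average, agreement.
```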

Proceedings ArticleDOI
17 Sep 2000
TL;DR: It is asserted that often a novice user will model a system in a different manner (semantically equivalent, but less efficient for the verification tool) than an expert user would, and that some of these inefficient modeling choices can be easily detected at the source-code level.
Abstract: A major obstacle to widespread acceptance of formal verification is the difficulty in using the tools effectively. Although learning the basic syntax and operation of a formal verification tool may be easy, expert users are often able to accomplish a verification task while a novice user encounters time-out or space-out attempting the same task. In this paper, we assert that often a novice user will model a system in a different manner (semantically equivalent, but less efficient for the verification tool) than an expert user would, that some of these inefficient modeling choices can be easily detected at the source-code level, and that a robust verification tool should identify these inefficiencies and optimize them, thereby helping to close the gap between novice and expert users. To test our hypothesis, we propose some possible optimizations for the Murφ verification system, implement the simplest of these, and compare the results on a variety of examples written by both experts and novices (the Murφ distribution examples, a set of cache coherence protocol models, and a portion of the IEEE 1394 FireWire protocol). The results support our assertion: a nontrivial fraction of the Murφ models written by novice users were significantly accelerated by the very simple optimization. Our findings strongly support further research in this area.
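The following toy sketch (not Murφ itself, and not the paper's optimization) illustrates why semantically equivalent models can differ sharply in cost for an explicit-state tool: canonicalizing an unordered collection of process states, here by sorting, collapses symmetric states and shrinks the reachable state space.

```python
# Toy explicit-state reachability with two state encodings. Encoding an
# unordered bag of processes as a sorted tuple (canonical) instead of an
# ordered tuple collapses symmetric states.

def successors(state):
    # Each of 3 processes cycles idle -> trying -> critical -> idle.
    nxt = {"idle": "trying", "trying": "critical", "critical": "idle"}
    for i, s in enumerate(state):
        yield state[:i] + (nxt[s],) + state[i + 1:]

def explore(canonicalize):
    seen, frontier = set(), [("idle",) * 3]
    while frontier:
        state = frontier.pop()
        key = canonicalize(state)
        if key in seen:
            continue
        seen.add(key)
        frontier.extend(successors(state))
    return len(seen)

print("ordered encoding  :", explore(lambda s: s))                  # 27 states
print("canonical (sorted):", explore(lambda s: tuple(sorted(s))))   # 10 states
```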

Proceedings ArticleDOI
08 Nov 2000
TL;DR: A study of the coverage behavior of VHDL models is presented; the resulting statistical behavior is compared to that assumed by some commonly used software reliability models, showing the inappropriateness of applying the existing models to behavioral model verification.
Abstract: During behavioral model verification, it is important to determine the stopping point for the current test strategy and for moving to a different test strategy. It has been shown that the location of the stopping point is highly dependent on the statistical model one chooses to describe the coverage behavior during the verification process. This paper presents a study on the coverage behavior of VHDL models. The resulting statistical behavior is compared to the statistical behavior assumed by some commonly used models for software reliability, and the comparison shows the inappropriateness of applying the existing models to behavioral model verification.
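A crude version of the stopping-point decision can be phrased as a rule on marginal coverage gain; the coverage numbers and thresholds below are invented for illustration and are not the statistical models studied in the paper:

```python
# Hedged sketch of the stopping-point question: track cumulative coverage as
# test batches run, and switch strategy once marginal gain stays small.
coverage = [0.12, 0.23, 0.31, 0.38, 0.43, 0.47, 0.50, 0.52, 0.53,
            0.535, 0.538, 0.540]        # cumulative coverage per test batch

WINDOW, EPSILON = 3, 0.02               # look-back size, min useful gain

def stopping_point(cov, window, eps):
    for i in range(window, len(cov)):
        if cov[i] - cov[i - window] < eps:
            return i                    # strategy has saturated here
    return None

i = stopping_point(coverage, WINDOW, EPSILON)
print(f"switch test strategy after batch {i}, coverage = {coverage[i]:.3f}")
```

The paper's point is that where this saturation appears to occur depends strongly on the statistical model fitted to the coverage curve, so a fixed-threshold rule like this one is only a starting point.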

Proceedings ArticleDOI
04 Jan 2000
TL;DR: This paper describes the knowledge-based integration infrastructure called "wrappings" and some anomaly detection algorithms originally developed for verification and validation of knowledge-based systems, and shows how systems organized using wrappings lend themselves to evaluation studies, both offline and online.
Abstract: It is well known that complex systems are difficult to design, implement, and analyze. Component-level verification has improved to the point that we can expect to produce formal or nearly formal verification analyses of all components of a complex system. What remains are the system-level verifications, which we believe can be improved by our approach to system development. In earlier papers, we defined the wrapping integration infrastructure (C. Landauer and K.L. Bellman, 1996; 1998; 1999), which shows how a little bit of knowledge about the uses of the different resources goes a very long way towards using them properly, identifying anomalies and even monitoring their behavior. We first describe our knowledge-based integration infrastructure, called "wrappings", then we describe some anomaly detection algorithms originally developed for verification and validation of knowledge-based systems, and finally, we show how systems organized using wrappings lend themselves to evaluation studies, both offline and online.

01 Jan 2000
TL;DR: An effort in Japan to standardize traffic simulation models is introduced, including verification of models' functions concerning vehicle generation, bottleneck capacity at simple road sections, and drivers' route choice behavior.
Abstract: This paper first introduces an effort in Japan to standardize traffic simulation models. The basic idea of the standardization is to assess how well existing models reproduce traffic conditions, through verification and validation. Verification implies qualifying tests using virtual data sets in order to make the connection between the simulation model and traffic-engineering theory clear, while validation means an evaluation process with real-world data. Following the general introduction, the verification process is detailed, with its philosophy and basic test configurations to verify models' functions concerning: 1) vehicle generation, 2) bottleneck capacity at simple road sections, 3) capacity of merging/diverging areas, 4) traffic jam growth/shrinkage with propagation of shock waves, 5) capacity of left/right turns at an intersection, and 6) drivers' route choice behavior. For the covering abstract see ITRD E114174.
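As a sketch of how one such qualifying test (item 2, bottleneck capacity at a simple road section) might be automated, the code below drives a stand-in simulator with demand above capacity and checks the discharge rate against theory; a real test would call the traffic model under study, and all numbers are illustrative:

```python
# Verification test sketch: offered demand exceeds bottleneck capacity, so
# the simulated discharge rate should match the theoretical capacity.
DEMAND_VPH = 2400          # offered demand, vehicles/hour
CAPACITY_VPH = 1800        # theoretical bottleneck capacity
TOLERANCE = 0.05           # accept within 5% of theory

def simulate_bottleneck(demand_vph, capacity_vph, hours=1.0):
    served = min(demand_vph, capacity_vph) * hours
    return served            # a real model would queue and discharge vehicles

throughput = simulate_bottleneck(DEMAND_VPH, CAPACITY_VPH)
error = abs(throughput - CAPACITY_VPH) / CAPACITY_VPH
print(f"discharge = {throughput:.0f} vph, relative error = {error:.1%}")
assert error <= TOLERANCE, "model fails the bottleneck-capacity check"
```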

01 Jan 2000
TL;DR: In this article, a verification and validation plan for performing structural dynamics analyses on a class of complex mechanical systems is presented, which is designed for systems with upwards of 10 million equations running on massively parallel computers.
Abstract: In this paper we summarize a verification and validation plan for performing structural dynamics analyses on a class of complex mechanical systems. The plan addresses verification and validation for a new structural dynamics code, SALINAS, which is designed for systems with upwards of 10 million equations running on massively parallel computers. This plan defines a hierarchy of validation cases, building from separable physics to fully-coupled effects. Separable physics validation is directed towards generic elements of the physics of structural dynamic systems as implemented within the code, although those generic elements which are targeted for validation are motivated by the class of applications of the code. The purpose of the fully-coupled system validation cases is to ensure that the integration of the different components and submodels leads to predictive system models. The plan also defines code verification activities such as regression testing, code review, and mesh refinement studies to demonstrate correct software implementation and numerical properties. Finally, it establishes use protocols such as peer review, error estimation, and the use of uncertainty quantification, to ensure that the code is used properly and that the modeling simplifications are consistent with the validation database.
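One of the code verification activities mentioned, a mesh refinement study, is commonly reduced to computing an observed order of accuracy from solutions on three grids; the sketch below uses illustrative numbers and a Richardson-style estimate (a generic technique, not SALINAS-specific code):

```python
# Mesh-refinement code verification sketch: estimate the observed order of
# accuracy p from three uniformly refined meshes. The f values are invented.
import math

r = 2.0                       # grid refinement ratio between meshes
f_coarse, f_medium, f_fine = 1.0480, 1.0120, 1.0030   # some scalar response

p = math.log(abs(f_medium - f_coarse) / abs(f_fine - f_medium)) / math.log(r)
print(f"observed order of accuracy p = {p:.2f}")   # compare to formal order

# Richardson extrapolation of the grid-converged value under the observed order:
f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)
print(f"extrapolated solution = {f_exact:.4f}")
```

If the observed order falls well below the discretization scheme's formal order, that flags a likely implementation or convergence problem, which is precisely what such studies are meant to demonstrate.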

Journal ArticleDOI
01 Oct 2000
TL;DR: An approach is proposed that exploits information on the organisation of the knowledge involved in the task and on the utilisation of this knowledge for solving the task; this approach, referred to as model-based verification, is described, and a case study on the verification of classical planning systems is presented.
Abstract: As knowledge-based systems become a standard in software development, the interest in verification and validation techniques to ensure their reliability has grown. For this purpose the verification of knowledge bases is fundamental. Numerous references in the literature involve verification by detection of implementation-dependent anomalies in the knowledge base, such as inconsistency, incompleteness and redundancy. This approach has several limitations. The main one is the lack of significance of these anomalies in relation to the task of the system. To overcome this problem, an approach is proposed that exploits information on the organisation of the knowledge involved in the task, and on the utilisation of this knowledge for solving the task. This approach, referred to as model-based verification, is described, and a case study on the verification of classical planning systems is presented.
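The implementation-dependent anomalies the paper lists, such as redundancy and inconsistency, are easy to sketch on a toy rule base (the rules below are hypothetical, not from the paper's planning case study):

```python
# Anomaly-based checks on a toy rule base: flag redundant rules (same
# antecedent, same consequent) and inconsistent rules (same antecedent,
# different consequents). Rules are hypothetical.
rules = [
    ({"fever", "cough"}, "flu"),
    ({"cough", "fever"}, "flu"),        # redundant: duplicate of rule 0
    ({"fever", "rash"}, "measles"),
    ({"fever", "rash"}, "allergy"),     # inconsistent with rule 2
]

for i in range(len(rules)):
    for j in range(i + 1, len(rules)):
        (ante_i, cons_i), (ante_j, cons_j) = rules[i], rules[j]
        if ante_i == ante_j and cons_i == cons_j:
            print(f"redundancy: rules {i} and {j}")
        elif ante_i == ante_j and cons_i != cons_j:
            print(f"inconsistency: rules {i} and {j}")
```

The paper's criticism is that such checks say nothing about whether the flagged anomalies matter for the system's task, which is what motivates its model-based alternative.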

Proceedings ArticleDOI
12 Jul 2000
TL;DR: In this paper, the authors describe infrared scene generation and validation activities at the U.S. Army Aviation and Missile Command's (AMCOM) Dual-Mode Hardware-in-the-Loop (HWIL) Simulation.
Abstract: This paper describes infrared (IR) scene generation and validation activities at the U.S. Army Aviation and Missile Command's (AMCOM) Dual-Mode Hardware-in-the-Loop (HWIL) Simulation. The HWIL simulation validation results are based on comparison of infrared seeker data collected in the HWIL simulation to infrared seeker data collected during captive flight tests (CFTs). Use of CFT data allows a simulation developer to quantify not only the radiometric fidelity of the simulation inputs, but also the effects that any limitations of the inputs may have on simulation validity with respect to a particular seeker and its algorithms. Validation of this type of simulation is a complex process and all aspects of the validation are covered. Topics include real-time IR signature modeling and validation, simulation output verification, projected energy verification, and total end-to-end simulation validation. Also included are descriptions of the different types of CFT scenarios necessary for simulation validation and the comparison methodologies used for each case.

Journal ArticleDOI
TL;DR: The various known verification techniques that address the so-called "state explosion problem" of finite state transition systems are discussed.

Proceedings Article
22 May 2000
TL;DR: The aim of this paper is to exemplify a validation approach based on a clear and thoroughly formal theory, using the validation toolkit TIC; the results are presented for the sake of illustrating the underlying formal concepts in use.
Abstract: In the problem area of evaluating complex software systems, there are two distinguished areas of research, development, and application, identified by the two buzzwords validation and verification, respectively. From the perspective adopted by the authors (cf. (O'Keefe & O'Leary 1993), e.g.), verification is usually more formally based and, thus, can be supported by formal reasoning tools like theorem provers, for instance. The scope of verification approaches is limited by the difficulty of finding a sufficiently complete formalisation to build upon. In paramount realistic problem domains, validation seems to be more appropriate, although it is less stringent in character and, therefore, validation results are often less definite. The aim of this paper is to exemplify a validation approach based on a clear and thoroughly formal theory. In this way, validation and verification should be brought closer to each other, for the benefit of a concerted action towards dependable software systems. To allow for precise and sufficiently clear results, the authors have selected the application domain of algorithms and systems for learning formal languages. By means of the validation toolkit TIC, some series of validation experiments have been performed. The results are presented for the sake of illustrating the underlying formal concepts in use. Comparing the validity of one learning approach to the invalidity of another one can be seen as an interesting result in its own right.