Author

Anca I. Vermesan

Bio: Anca I. Vermesan is an academic researcher from DNV GL. The author has contributed to research in the topics of software verification and knowledge-based systems, has an h-index of 5, and has co-authored 9 publications receiving 196 citations.

Papers
09 Jun 1999
TL;DR: Validation, Verification and Integrity in Knowledge and Data Base Systems: Future Directions; Intelligent Data and Knowledge Analysis and Verification: Towards a Taxonomy of Specific Problems.
Abstract: KBS First Prototype V&V Process Plan as a Way to Produce Reliable Requirements.- On Principles of Knowledge Validation.- Progressive Instantiation for the Logical Validation of Nonmonotonic KBs.- Computer Algebra based Verification and Knowledge Extraction in RBS. Application to Medical Fitness Criteria.- A Knowledge Based Tool to Validate and Verify an Aion Knowledge Base.- Constraints for Validation of Conceptual Graphs.- PRONTO - Ontology-based Evaluation of Knowledge Based Systems.- Formal Methods for the Engineering and Certification of Safety-Critical Knowledge Based Systems.- Design Pattern for Safety-Critical Knowledge-Based Systems.- Organising Knowledge Refinement Operators.- Validation and Refinement versus Revision.- Illustrating Knowledge Base Restructuring and Verification in a Real World Application.- Incorporating Backtracking Search with Knowledge Refinement.- Verification and Validation of a Multistrategy Knowledge-Based System.- Validation and Verification of Knowledge-Based Systems for Power System Control Centres.- A Priori Verification of Product Models in Mechanical Design.- Verification of Business Processes for a Correspondence Handling Center Using CCS.- User Participation-based Software Certification.- Verification and Validation in Support for Software Certification Methods.- Validation, Verification and Integrity in Knowledge and Data Base Systems: Future Directions.- Intelligent Data and Knowledge Analysis and Verification: Towards a Taxonomy of Specific Problems.- Ontology-based Verification and Validation of Federated Database Systems.- Applicability of Conventional Software Verification and Validation to Knowledge Based Components: A Qualitative Assessment.

69 citations

BookDOI
01 Jan 1999
TL;DR: This article aims to describe the set of V&V process activities that can be applied to a KBS core, that is, the essential components of a V&V Plan.
Abstract: Keywords: Knowledge-Based Systems; Validation & Verification; Software Reliability; Process Methods; Requirements. The quality assessment process is responsible for assuring several properties of software products and processes. One of these is software reliability, perhaps the concept closest to the more general notion of quality. These quality processes should be integrated with product development, and they depend mainly on two issues: the development methodology and the type of product. Knowledge-Based Systems (KBSs) are products whose construction presents special difficulties for the application of the Validation and Verification (V&V) process. The knowledge they model is often incomplete, imprecise, inconsistent and uncertain, and is handled through heuristics whose validation is difficult to carry out without the systematic application of test cases. As a result, KBS development often follows a prototype-based life cycle aimed at building a consistent core that can be modelled conceptually and formally. This article describes the set of V&V process activities that can be applied to a KBS core, that is, the essential components of a V&V Plan. Although this Plan is often left undefined, it is essential for assuring the reliability of the system requirements that result from the prototyping phase. This paper was written with the support and financing of the Spanish Council for Research and Technology (CICYT) [Project TIC96-0883-CE].

41 citations

Book
01 Jan 1999
TL;DR: Contributions include KBS First Prototype V&V Process Plan as a Way to Produce Reliable Requirements; On Principles of Knowledge Validation; Progressive Instantiation for the Logical Validation of Nonmonotonic KBs; and Computer Algebra based Verification and Knowledge Extraction in RBS.
Abstract: KBS First Prototype V&V Process Plan as a Way to Produce Reliable Requirements.- On Principles of Knowledge Validation.- Progressive Instantiation for the Logical Validation of Nonmonotonic KBs.- Computer Algebra based Verification and Knowledge Extraction in RBS. Application to Medical Fitness Criteria.- A Knowledge Based Tool to Validate and Verify an Aion Knowledge Base.- Constraints for Validation of Conceptual Graphs.- PRONTO - Ontology-based Evaluation of Knowledge Based Systems.- Formal Methods for the Engineering and Certification of Safety-Critical Knowledge Based Systems.- Design Pattern for Safety-Critical Knowledge-Based Systems.- Organising Knowledge Refinement Operators.- Validation and Refinement versus Revision.- Illustrating Knowledge Base Restructuring and Verification in a Real World Application.- Incorporating Backtracking Search with Knowledge Refinement.- Verification and Validation of a Multistrategy Knowledge-Based System.- Validation and Verification of Knowledge-Based Systems for Power System Control Centres.- A Priori Verification of Product Models in Mechanical Design.- Verification of Business Processes for a Correspondence Handling Center Using CCS.- User Participation-based Software Certification.- Verification and Validation in Support for Software Certification Methods.- Validation, Verification and Integrity in Knowledge and Data Base Systems: Future Directions.- Intelligent Data and Knowledge Analysis and Verification: Towards a Taxonomy of Specific Problems.- Ontology-based Verification and Validation of Federated Database Systems.- Applicability of Conventional Software Verification and Validation to Knowledge Based Components: A Qualitative Assessment.

38 citations

Journal ArticleDOI
TL;DR: Research in V&V of KBSs has emerged as a distinct field only in the last decade, and is intended to address issues associated with quality and safety aspects of KBSs, and to provide such applications with the same degree of dependability as conventional applications.
Abstract: Knowledge-Based (KB) technology is being applied to complex problem solving and safety and business critical tasks in many application domains. Concerns have naturally arisen as to the dependability of Knowledge-Based Systems (KBS). As with any software, attention to quality and safety must be paid throughout development of a KBS, and rigorous Verification and Validation (V&V) techniques must be employed. Research in V&V of KBSs has emerged as a distinct field only in the last decade, and is intended to address issues associated with quality and safety aspects of KBSs, and to provide such applications with the same degree of dependability as conventional applications. In recent years, V&V of KBSs has been the topic of annual workshops associated with the main AI conferences, such as AAAI, IJCAI and ECAI.

19 citations

Journal ArticleDOI
TL;DR: Comparisons between verification and validation as found in traditional software engineering and knowledge‐based systems are made, pointing out what is special about the latter as compared with the former.
Abstract: Verification and validation are terms that have been used for several years in software engineering, and there are now many verification and validation techniques, plus considerable experience and expertise in using them. However, applying these techniques to knowledge-based systems is not straightforward. The essential differences between conventional systems and knowledge-based systems suggest that these techniques must be expanded and adapted, but new techniques are also needed. This article has two major goals: first, it makes some comparisons between verification and validation as found in traditional software engineering and knowledge-based systems, pointing out what is special about the latter as compared with the former; second, it provides a framework for a discussion of the various European work on verification and validation of knowledge-based systems. The perspective put forward in this article allows for a vast amount of work to be surveyed and analysed beyond the implementation level, by differentiating the symbol level and the knowledge level within a knowledge-based system.

18 citations


Cited by
Book
01 Oct 2004
TL;DR: This collection surveys the most recent advances in the field and charts directions for future research, discussing topics that include distributed data mining algorithms for new application areas, several aspects of next-generation data mining systems and applications, and detection of recurrent patterns in digital media.
Abstract: Data mining, or knowledge discovery, has become an indispensable technology for businesses and researchers in many fields. Drawing on work in such areas as statistics, machine learning, pattern recognition, databases, and high performance computing, data mining extracts useful information from the large data sets now available to industry and science. This collection surveys the most recent advances in the field and charts directions for future research.The first part looks at pervasive, distributed, and stream data mining, discussing topics that include distributed data mining algorithms for new application areas, several aspects of next-generation data mining systems and applications, and detection of recurrent patterns in digital media. The second part considers data mining, counter-terrorism, and privacy concerns, examining such topics as biosurveillance, marshalling evidence through data mining, and link discovery. The third part looks at scientific data mining; topics include mining temporally-varying phenomena, data sets using graphs, and spatial data mining. The last part considers web, semantics, and data mining, examining advances in text mining algorithms and software, semantic webs, and other subjects.

152 citations

Journal ArticleDOI
TL;DR: In this article, the authors provide perspectives on issues and problems that impact the verification and validation (V&V) of KBSs and provide an overview of different techniques and tools that have been developed for performing V&V activities.
Abstract: Knowledge-based systems (KBSs) are being used in many applications areas where their failures can be costly because of losses in services, property or even life. To ensure their reliability and dependability, it is therefore important that these systems are verified and validated before they are deployed. This paper provides perspectives on issues and problems that impact the verification and validation (V&V) of KBSs. Some of the reasons why V&V of KBSs is difficult are presented. The paper also provides an overview of different techniques and tools that have been developed for performing V&V activities. Finally, some of the research issues that are relevant for future work in this field are discussed.

72 citations

Journal ArticleDOI
TL;DR: A new approach to the design and development of complex rule-based systems for control and decision support, based on ALSV(FD) logic, is described, which ensures high density and transparency of visual knowledge representation.
Abstract: This paper describes a new approach, the HeKatE methodology, to the design and development of complex rule-based systems for control and decision support. The main paradigm for rule representation, namely, eXtended Tabular Trees (XTT), ensures high density and transparency of visual knowledge representation. Contrary to traditional, flat rule-based systems, the XTT approach is focused on groups of similar rules rather than on single rules. Such groups form decision tables which are connected into a network for inference. Efficient inference is assured as only the rules necessary for achieving the goal, identified by the context of inference and partial order among tables, are fired. In the paper a new version of the language, XTT2, is presented. It is based on ALSV(FD) logic, also described in the paper. Another distinctive feature of the presented approach is a top-down design methodology based on successive refinement of the project. It starts with Attribute Relationship Diagram (ARD) development. Such a diagram represents relationships between system variables. Based on the ARD scheme, XTT tables and links between them are generated. The tables are filled with expert-provided constraints on values of the attributes. The code for rule representation is generated in a human-readable representation called HMR and interpreted with a provided inference engine called HeaRT. A set of software tools supporting the visual design and development stages is described in brief.
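The decision-table idea the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not code from the paper: the table structure, attribute names, and values below are invented, and a real XTT2 table would be linked to other tables in an inference network. Each row constrains attributes to subsets of their finite domains, loosely mirroring the ALSV(FD) style of rule conditions.

```python
# Hypothetical sketch of an XTT-style decision table. Rules sharing the same
# attributes are grouped into one table; a row fires when each attribute's
# value falls within that row's allowed set.

def eval_table(table, facts):
    """Return the decision of the first row whose constraints all hold."""
    for conditions, decision in table:
        if all(facts.get(attr) in allowed
               for attr, allowed in conditions.items()):
            return decision
    return None  # no row matched; control could pass to a linked table

# A toy thermostat table: two rows over the attributes 'temp' and 'mode'.
thermostat = [
    ({"temp": {"low"}, "mode": {"auto", "manual"}}, "heat_on"),
    ({"temp": {"ok", "high"}, "mode": {"auto"}}, "heat_off"),
]

print(eval_table(thermostat, {"temp": "low", "mode": "auto"}))  # heat_on
```

Grouping rules this way is what gives the approach its claimed density: one table replaces many flat rules that would otherwise repeat the same attribute tests.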

70 citations

Journal ArticleDOI
01 Jun 2002
TL;DR: A complete methodology for the validation of rule-based expert systems is presented as a five-step process that has two central themes: a minimal set of test inputs that adequately cover the domain represented in the knowledge base; and a Turing Test-like methodology that evaluates the system's responses to the test inputs and compares them to the responses of human experts.
Abstract: We describe a complete methodology for the validation of rule-based expert systems. This methodology is presented as a five-step process that has two central themes: 1) the creation of a minimal set of test inputs that adequately covers the domain represented in the knowledge base; and 2) a Turing Test-like methodology that evaluates the system's responses to the test inputs and compares them to the responses of human experts. The development of a minimal set of test inputs takes into consideration various criteria, both user-defined and domain-specific. These criteria are used to reduce the potentially very large set of test inputs to one that is practical, keeping in mind the nature and purpose of the developed system. The Turing Test-like evaluation methodology makes use of only one panel of experts to both evaluate each set of test cases and compare the results with those of the expert system, as well as with those of the other experts. The hypothesis being presented is that much can be learned about the experts themselves by having them anonymously evaluate each other's responses to the same test inputs. Thus, we are better able to determine the validity of an expert system. Depending on its purpose, we introduce various ways to express validity as well as a technique to use the validity assessment for the refinement of the rule base. Lastly, we describe a partial implementation of the test input minimization process on a small but nontrivial expert system. The effectiveness of the technique was evaluated by seeding errors into the expert system, generating the appropriate set of test inputs and determining whether the errors could be detected by the suggested methodology.
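The error-seeding evaluation mentioned at the end of the abstract can be illustrated with a toy rule base. This sketch is an assumption-laden paraphrase of the general idea, not the paper's implementation: the rules, the fault model (corrupting one rule's conclusion), and the detection criterion (any test input on which outputs diverge) are all invented for illustration.

```python
import copy

def seed_error(rules, rule_id, wrong_conclusion):
    """Return a copy of the rule base with one conclusion deliberately corrupted."""
    mutated = copy.deepcopy(rules)
    mutated[rule_id] = (mutated[rule_id][0], wrong_conclusion)
    return mutated

def run(rules, case):
    """Fire the first rule whose condition holds for the given input case."""
    for condition, conclusion in rules.values():
        if condition(case):
            return conclusion
    return None

def detects(original, mutated, test_inputs):
    """A seeded error is detected if some test input yields differing outputs."""
    return any(run(original, t) != run(mutated, t) for t in test_inputs)

# Toy rule base: each rule is (condition predicate, conclusion).
rules = {
    "r1": (lambda c: c["fever"] and c["cough"], "flu"),
    "r2": (lambda c: c["fever"] and not c["cough"], "other"),
}
tests = [{"fever": True, "cough": True}, {"fever": True, "cough": False}]

bad = seed_error(rules, "r1", "other")
print(detects(rules, bad, tests))  # True: the test set exposes the seeded fault
```

If `detects` returned False for some seeded fault, that would flag a coverage gap in the minimized test set, which is exactly what this style of evaluation is meant to measure.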

64 citations

Journal ArticleDOI
TL;DR: A novel, consistent, three‐phase methodology incorporating conceptual, logical, and physical design is outlined, and tools supporting the complete design and development process are presented.
Abstract: Rule-based systems (RBSs) constitute a powerful technology for declarative encoding and automated processing of large bodies of knowledge. A typical RBS consists of a knowledge base containing facts and production rules, and an inference engine managing the reasoning process. Despite their simple conceptual scheme, the design and development of an RBS often turn out to be an unexpectedly complex task. This paper presents an overview of issues concerning the design and development of such systems. Differences between RBSs and classical software are exemplified, and design and implementation issues are analyzed. A novel, consistent, three-phase methodology incorporating conceptual, logical, and physical design is outlined. Moreover, tools supporting the complete design and development process are presented. © 2011 John Wiley & Sons, Inc. WIREs Data Mining Knowl Discov 2011 1 117-137 DOI: 10.1002/widm.11
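The fact/production-rule/inference-engine scheme described above can be sketched as a naive forward-chaining loop. This is a minimal illustration under invented assumptions (the facts and rules below are toy examples, and real engines use indexed matching such as Rete rather than this fixpoint scan):

```python
# A minimal forward-chaining inference engine: facts are strings, and each
# production rule is (set of premise facts, derived conclusion fact).

def forward_chain(facts, rules):
    """Fire rules until no new facts can be derived (naive fixpoint)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    ({"bird"}, "has_wings"),
    ({"has_wings", "healthy"}, "can_fly"),
]
print(sorted(forward_chain({"bird", "healthy"}, rules)))
# ['bird', 'can_fly', 'has_wings', 'healthy']
```

Even this tiny loop hints at why RBS development gets complex: correctness depends on global properties of the rule set (termination, consistency, coverage) rather than on any single rule, which is what motivates the structured design methodologies surveyed in the paper.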

53 citations