Book Chapter

Constructive Induction Using a Non-Greedy Strategy for Feature Selection

TLDR
A method for feature construction and selection that finds a minimal set of conjunctive features appropriate for the classification task and achieves higher classification accuracy.
Abstract
We present a method for feature construction and selection that finds a minimal set of conjunctive features that are appropriate to perform the classification task. For problems where this bias is appropriate, the method outperforms other constructive induction algorithms and is able to achieve higher classification accuracy. The application of the method in the search for minimal multi-level boolean expressions is presented and analyzed with the help of some examples.
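To make the idea concrete, the following is a minimal sketch of a non-greedy (size-ordered, exhaustive) search for a smallest feature subset that is consistent with the class labels. The data layout, function names, and the use of subset consistency as the selection criterion are illustrative assumptions for this sketch, not the authors' exact algorithm (which additionally constructs conjunctive features).

# Minimal sketch, assuming examples are (feature_tuple, label) pairs of binary
# features. "Non-greedy" is illustrated as a breadth-first search over subsets
# by increasing size; no feature is committed to before all smaller subsets
# have been ruled out. Not the paper's algorithm, just the general idea.
from itertools import combinations

def is_consistent(examples, subset):
    # A subset is consistent if no two examples agree on all selected
    # features but carry different class labels.
    seen = {}
    for features, label in examples:
        key = tuple(features[i] for i in subset)
        if key in seen and seen[key] != label:
            return False
        seen[key] = label
    return True

def minimal_consistent_subset(examples):
    # Enumerate subsets in order of increasing size and return the first
    # (hence smallest) one that is consistent with the labels.
    n = len(examples[0][0])
    for size in range(n + 1):
        for subset in combinations(range(n), size):
            if is_consistent(examples, subset):
                return subset
    return tuple(range(n))

# XOR-like toy concept: each single feature carries zero information about the
# label, so greedy single-feature ranking gives no guidance, while the
# size-ordered search finds the minimal pair of features {0, 1}.
data = [((0, 0, 0), 0), ((0, 1, 0), 1), ((1, 0, 1), 1), ((1, 1, 1), 0)]
print(minimal_consistent_subset(data))  # -> (0, 1)

The exhaustive enumeration makes the bias explicit: minimality is guaranteed, at the cost of a search that grows exponentially with the number of features, which is why such consistency-driven subset searches are usually paired with pruning or heuristic cut-offs in practice.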


Citations
Journal Article

Feature Selection for Classification

TL;DR: This survey identifies the future research areas in feature selection, introduces newcomers to this field, and paves the way for practitioners who search for suitable methods for solving domain-specific real-world applications.
Journal Article

Toward integrating feature selection algorithms for classification and clustering

TL;DR: With the categorizing framework, the efforts toward building an integrated system for intelligent feature selection are continued, and an illustrative example is presented to show how existing feature selection algorithms can be integrated into a meta-algorithm that can take advantage of individual algorithms.
Journal Article

Consistency-based search in feature selection

TL;DR: An empirical study is conducted to examine the pros and cons of these search methods, give some guidelines on choosing a search method, and compare the classifier error rates before and after feature selection.
Proceedings Article

Feature selection algorithms: a survey and experimental evaluation

TL;DR: This work assesses the performance of several fundamental algorithms from the literature in a controlled scenario, taking into account the amount of relevance, irrelevance, and redundancy in sample data sets; a scoring measure ranks the algorithms.
Journal Article

A Fast Clustering-Based Feature Subset Selection Algorithm for High-Dimensional Data

TL;DR: The results demonstrate that FAST not only produces smaller subsets of features but also improves the performance of the four types of classifiers.
References
Book

Computers and Intractability: A Guide to the Theory of NP-Completeness

TL;DR: The second edition of a quarterly column provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and D. S. Johnson in the book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., San Francisco, 1979.
Journal Article

Induction of Decision Trees

J. R. Quinlan - 25 Mar 1986
TL;DR: This paper describes an approach to synthesizing decision trees that has been used in a variety of systems, presents one such system, ID3, in detail, and discusses a reported shortcoming of the basic algorithm.
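As a brief illustration of the attribute-selection step that ID3-style tree induction relies on, the sketch below computes information gain on a toy dataset; the dataset, attribute names, and helper functions are hypothetical, and this is not Quinlan's implementation.

# Minimal sketch of ID3's attribute-selection criterion (information gain),
# assuming examples are (attribute_dict, label) pairs. Illustrative only.
from collections import Counter
from math import log2

def entropy(labels):
    # Shannon entropy of a list of class labels.
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(examples, attribute):
    # Entropy of the labels minus the weighted entropy after splitting
    # on the given attribute.
    labels = [label for _, label in examples]
    gain = entropy(labels)
    values = {attrs[attribute] for attrs, _ in examples}
    for value in values:
        subset = [label for attrs, label in examples if attrs[attribute] == value]
        gain -= (len(subset) / len(examples)) * entropy(subset)
    return gain

# ID3 grows the tree by repeatedly splitting on the highest-gain attribute.
examples = [({"outlook": "sunny", "windy": False}, "no"),
            ({"outlook": "sunny", "windy": True}, "no"),
            ({"outlook": "overcast", "windy": False}, "yes"),
            ({"outlook": "rain", "windy": True}, "no"),
            ({"outlook": "rain", "windy": False}, "yes")]
best = max(["outlook", "windy"], key=lambda a: information_gain(examples, a))
print(best)  # -> "outlook" for this toy split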
Book

Logic Minimization Algorithms for VLSI Synthesis

TL;DR: ESPRESSO-IIAPL is described as an extension of ESPRESSO-IIC, aimed at improving the efficiency of tautology checking and reducing the number of blocks and covers.
Proceedings Article

The multi-purpose incremental learning system AQ15 and its testing application to three medical domains

TL;DR: It is demonstrated that by applying the proposed method of cover truncation and analogical matching, called TRUNC, one may drastically decrease the complexity of the knowledge base without affecting its performance accuracy.