Open Access Journal Article

A new distance-weighted k-nearest neighbor classifier

TLDR
The experimental results demonstrate that the proposed DWKNN is robust to different choices of k to some degree, and yields good performance with a larger optimal k, compared to the other state-of-the-art KNN-based methods.
Abstract
In this paper, we develop a novel distance-weighted k-nearest neighbor rule (DWKNN) using a dual distance-weighted function. The proposed DWKNN is motivated by the sensitivity of the k-nearest neighbor rule (KNN) to the choice of the neighborhood size k, with the aim of improving classification performance. The experimental results on twelve real data sets demonstrate that our proposed classifier is robust to different choices of k to some degree, and yields good performance with a larger optimal k, compared to the other state-of-the-art KNN-based methods.
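
For orientation, the sketch below shows generic distance-weighted k-NN voting in Python with NumPy. The abstract does not reproduce the dual distance-weighted function used by DWKNN, so simple inverse-distance weights stand in for it as a placeholder assumption; the function name dw_knn_predict and its parameters are illustrative only.

```python
# Minimal sketch of distance-weighted k-NN voting (NumPy arrays assumed).
# The paper's dual distance-weighted function is not given in the abstract,
# so simple inverse-distance weights are used here as a placeholder.
import numpy as np

def dw_knn_predict(X_train, y_train, x_query, k=5, eps=1e-12):
    # Euclidean distance from the query to every training sample.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k nearest neighbors.
    nn_idx = np.argsort(dists)[:k]
    # Placeholder weighting: closer neighbors get larger votes.
    weights = 1.0 / (dists[nn_idx] + eps)
    # Sum the weighted votes per class and return the highest-scoring class.
    votes = {}
    for idx, w in zip(nn_idx, weights):
        label = y_train[idx]
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)
```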


Citations

Application of K-Nearest Neighbor (KNN) Approach for Predicting Economic Events: Theoretical Background

TL;DR: In this article, the k-Nearest Neighbor (kNN) classification method was used for economic forecasting in Iran, and the results showed that kNN is more capable than the other methods considered.
Journal Article

A generalized mean distance-based k-nearest neighbor classifier

TL;DR: The experimental results demonstrate that the proposed GMDKNN performs better and is less sensitive to k, making it a promising method for pattern recognition in expert and intelligent systems.
Journal Article

Explainable artificial intelligence for breast cancer: A visual case-based reasoning approach.

TL;DR: This paper proposes a case-based reasoning (CBR) method that can be executed automatically as an algorithm and presented visually in a user interface for visual explanation or visual reasoning, and shows that the qualitative method achieves classification accuracy comparable to k-Nearest Neighbors algorithms while being more explainable.
Journal Article

Breast Cancer Prediction using varying Parameters of Machine Learning Models

TL;DR: Six supervised machine learning algorithms are presented: k-Nearest Neighbors, Logistic Regression, Decision Tree, Random Forest, Support Vector Machine with a radial basis function kernel, and Adam Gradient Descent Learning.
Journal Article

Improved pseudo nearest neighbor classification

TL;DR: The comprehensive experimental results suggest that the proposed LMPNN classifier is a promising algorithm for pattern recognition.
References
Journal Article

Nearest neighbor pattern classification

TL;DR: The nearest neighbor decision rule assigns to an unclassified sample point the classification of the nearest of a set of previously classified points; in this sense, it may be said that half the classification information in an infinite sample set is contained in the nearest neighbor.
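
For context, the "half the classification information" phrasing reflects this paper's well-known asymptotic bound: in the two-class case with unlimited samples, the nearest neighbor risk R satisfies R* <= R <= 2R*(1 - R*) <= 2R*, where R* is the Bayes risk, so the single nearest neighbor is never worse than twice the optimal error rate.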
Journal Article

Discriminatory Analysis - Nonparametric Discrimination: Consistency Properties

TL;DR: In this paper, the discrimination problem is defined as follows: the random variable Z, of observed value z, is distributed over some space (say, p-dimensional) either according to distribution F or according to distribution G. The problem is to decide, on the basis of z, which of the two distributions Z has.
Journal Article

The Distance-Weighted k-Nearest-Neighbor Rule

TL;DR: A classification rule is described that makes use of a neighbor weighting function to assign a class to an unclassified sample.
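
As a point of reference, the weighting commonly attributed to this rule assigns the j-th nearest neighbor the weight w_j = (d_k - d_j) / (d_k - d_1) when d_k != d_1, and w_j = 1 otherwise, where d_1 and d_k are the distances to the nearest and k-th nearest neighbors; the class with the largest weighted vote wins.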
Journal Article

A local mean-based nonparametric classifier

TL;DR: A simple nonparametric classifier based on local mean vectors is proposed and compared with the 1-NN, k-NN, Euclidean distance, Parzen, and artificial neural network (ANN) classifiers in terms of the error rate on unknown patterns.
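
The local mean idea can be illustrated with the short sketch below, assuming Euclidean distance and NumPy array inputs; the function name local_mean_predict is made up for this example, and the per-class neighborhood size k is a free parameter.

```python
# Sketch of a local mean-based nonparametric classifier: the query is assigned
# to the class whose local mean vector (mean of that class's k nearest training
# samples) lies closest to the query. NumPy arrays are assumed for the inputs.
import numpy as np

def local_mean_predict(X_train, y_train, x_query, k=5):
    best_class, best_dist = None, np.inf
    for cls in np.unique(y_train):
        X_c = X_train[y_train == cls]                     # samples of this class
        d = np.linalg.norm(X_c - x_query, axis=1)         # distances to the query
        local_mean = X_c[np.argsort(d)[:k]].mean(axis=0)  # mean of k nearest samples
        dist = np.linalg.norm(local_mean - x_query)       # distance to the local mean
        if dist < best_dist:
            best_class, best_dist = cls, dist
    return best_class
```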