Proceedings ArticleDOI

Development of efficient image quarrying technique for Mammographic image classification to detect breast cancer with supervised learning algorithm

01 Dec 2013 - pp. 1-7
TL;DR: It can be concluded that the supervised learning algorithms give fast and accurate classification and work as an efficient tool for classifying breast cancer cells.
Abstract: Breast cancer is one of the most prevalent tumours in women, and its incidence is increasing worldwide. Mammography is the screening technique used to detect breast cancer at a very early stage. In this paper, two kinds of classifiers, Support Vector Machine (SVM) and Linear Discriminant Analysis (LDA), are used to analyze mammographic images. Both classification methods use image pre-processing based on wavelet decomposition and image enhancement. The results are verified on 322 mammogram images of size 1024×1024 in PGM format. They show that the proposed algorithm can classify the images with a good performance rate of 98%. It can be concluded that the supervised learning algorithms give fast and accurate classification and work as an efficient tool for classifying breast cancer cells.
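The pipeline summarised above can be illustrated with a minimal sketch: decompose each mammogram with a 2-D wavelet transform, summarise the subbands into a feature vector, and feed the features to an SVM and an LDA classifier. The wavelet family (db4), the energy/standard-deviation features, the synthetic images, and the classifier settings below are assumptions for illustration, not details taken from the paper.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def wavelet_features(image, wavelet="db4", level=2):
    """Mean absolute value and standard deviation of each wavelet subband."""
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
    feats = [np.mean(np.abs(coeffs[0])), np.std(coeffs[0])]  # approximation band
    for detail_level in coeffs[1:]:
        for band in detail_level:          # horizontal, vertical, diagonal details
            feats += [np.mean(np.abs(band)), np.std(band)]
    return np.array(feats)


# Synthetic stand-ins for the 1024x1024 PGM mammograms and their labels
# (smaller random images keep the demo fast).
rng = np.random.default_rng(0)
images = rng.random((40, 128, 128))
labels = rng.integers(0, 2, size=40)       # 0 = normal, 1 = abnormal

X = np.vstack([wavelet_features(img) for img in images])

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("LDA", LinearDiscriminantAnalysis())]:
    scores = cross_val_score(clf, X, labels, cv=5)
    print(f"{name}: mean cross-validated accuracy = {scores.mean():.2f}")
```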
Citations
Proceedings ArticleDOI
13 Mar 2018
TL;DR: The aim of this work is to compare and explain how an ANN and a logistic regression algorithm provide a better solution when combined with ensemble machine learning algorithms for diagnosing breast cancer, even when the number of variables is reduced.
Abstract: According to the Breast Cancer Institute (BCI), breast cancer is one of the most dangerous diseases affecting women in the world. According to clinical experts, detecting this cancer at its first stage helps save lives. Cancer.net offers individualized guides for more than 120 types of cancer and related hereditary syndromes. Machine learning techniques are most commonly used for detecting breast cancer. In this paper we propose an adaptive ensemble voting method for diagnosing breast cancer using the Wisconsin Breast Cancer database. The aim of this work is to compare and explain how an ANN and a logistic regression algorithm provide a better solution when combined with ensemble machine learning algorithms for diagnosing breast cancer, even when the number of variables is reduced. We use the Wisconsin Diagnostic Breast Cancer dataset. Compared to related work from the literature, the ANN approach with the logistic algorithm achieves 98.50% accuracy, higher than the other machine learning algorithms.
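A minimal sketch of the kind of voting ensemble described above, pairing a small neural network with logistic regression on the Wisconsin Diagnostic Breast Cancer data as shipped with scikit-learn. The network size, the feature scaling, and the soft-voting choice are assumptions for illustration; the cited paper's adaptive ensemble configuration and its 98.50% figure are not reproduced here.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Two base learners: a small ANN and a logistic regression, each behind a scaler.
ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000,
                                  random_state=0))
logreg = make_pipeline(StandardScaler(),
                       LogisticRegression(max_iter=2000))

# Soft voting averages the predicted class probabilities of both learners.
ensemble = VotingClassifier(estimators=[("ann", ann), ("logreg", logreg)],
                            voting="soft")
scores = cross_val_score(ensemble, X, y, cv=5)
print(f"Soft-voting ensemble accuracy: {scores.mean():.3f}")
```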

58 citations

Journal ArticleDOI
TL;DR: This paper proposes an automatic process to locate the optic disc and the optic cup in a retinal image and to calculate the CDR value, an important factor used to identify glaucoma.
Abstract: The two major diseases that affect the human eye are glaucoma and diabetic retinopathy. Both diseases cause permanent eye damage, and about 50% of vision can be lost before they are identified. To date, early detection of these vision-threatening diseases remains challenging; they can be identified only after they have damaged more than 25% of the eye. Glaucoma is caused by an increase in intraocular pressure that damages the retina. The inner surface of the retina is viewed through a camera called the fundus imager, which records a view of the patient's retina. This procedure makes it possible to identify the damage caused by these diseases at an early stage. Detecting the optic disc and cup in a fundus image and calculating the CDR value is a method used to identify glaucoma in its early stages. In this paper we propose an automatic process to locate the optic disc and the optic cup in the retinal image. The optic disc is located at the centre of the eye's nervous system and is the brightest region of the fundus image. The cup, which is the inner part of the optic disc, is obtained by further segmenting the fundus image. For better results, different color planes of the same fundus image are used. The optic cup-to-disc ratio (CDR) is an important factor used to identify glaucoma: glaucoma patients have a high CDR value, mainly due to the increase in cup diameter, which can also be related to thinning of the nerve rim.
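A minimal sketch of the final CDR computation, assuming the disc and cup have already been segmented into binary masks as described above. The vertical-diameter definition, the toy circular masks, and the function names are illustrative assumptions.

```python
import numpy as np


def vertical_diameter(mask):
    """Largest vertical extent (in pixels) of a binary mask."""
    rows = np.any(mask, axis=1)
    idx = np.flatnonzero(rows)
    return 0 if idx.size == 0 else idx[-1] - idx[0] + 1


def cup_to_disc_ratio(disc_mask, cup_mask):
    disc_d = vertical_diameter(disc_mask)
    cup_d = vertical_diameter(cup_mask)
    return cup_d / disc_d if disc_d else float("nan")


# Toy circular masks standing in for segmented fundus regions.
yy, xx = np.mgrid[:200, :200]
disc = (yy - 100) ** 2 + (xx - 100) ** 2 <= 60 ** 2
cup = (yy - 100) ** 2 + (xx - 100) ** 2 <= 40 ** 2

# Higher CDR values are associated with glaucoma, per the abstract above.
print(f"CDR = {cup_to_disc_ratio(disc, cup):.2f}")
```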

7 citations


Cites background from "Development of efficient image quar..."

  • ...Different studies have used hybrid level set and morphological operation for optic disc segmentation and blood vessel detection followed with SVM classifiers for OC segmentation [23-30]....


01 Jan 2018
TL;DR: The method proposed in this research is a Probabilistic Neural Network, and it can be concluded that the proposed method has an accuracy of 90%.
Abstract: Breast cancer is cancer of the breast tissue. It can be lethal when diagnosis and treatment are delayed, so early examination is needed to find out whether the breasts are still normal or whether there are abnormalities. One of the early tests that can be done is mammography. The image produced by mammography is checked manually by a doctor; a method is therefore needed to classify breast images from mammography automatically. The method proposed in this research is a Probabilistic Neural Network. A mammography image of the breast is used as the input to the method. The pre-processing step consists of contrast-limited adaptive histogram equalization, morphological black-hat filtering, an elliptical morphological operation, and background exclusion. The next step is image segmentation, which generates a binary image using thresholding. Feature extraction then produces invariant-moment values. The final step is classification, which determines whether the image is normal, benign, or malignant. After testing, it can be concluded that the proposed method has an accuracy of 90%.
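A minimal sketch of the pre-processing and feature-extraction stages named above (CLAHE, morphological black-hat with an elliptical kernel, thresholding, and Hu invariant moments) using OpenCV. The kernel sizes, the Otsu thresholding choice, the crude background handling, and the synthetic input are assumptions; the Probabilistic Neural Network classifier itself is not shown.

```python
import cv2
import numpy as np


def extract_invariant_moments(gray):
    # Contrast-limited adaptive histogram equalization.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)

    # Morphological black-hat with an elliptical structuring element
    # highlights small dark structures against the brighter background.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    blackhat = cv2.morphologyEx(enhanced, cv2.MORPH_BLACKHAT, kernel)

    # Crude background handling followed by Otsu thresholding
    # (a stand-in for the paper's background-exclusion step).
    cleaned = cv2.subtract(enhanced, blackhat)
    _, binary = cv2.threshold(cleaned, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Hu's seven invariant moments of the binary image.
    return cv2.HuMoments(cv2.moments(binary)).flatten()


# Synthetic 8-bit image standing in for a mammogram.
demo = (np.random.default_rng(0).random((256, 256)) * 255).astype(np.uint8)
print(extract_invariant_moments(demo))
```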

1 citation

Journal ArticleDOI
TL;DR: The results show that the proposed method increases the classification accuracy and improves the quality of the image classification, which is the ultimate aim of this proposal.

1 citation

References
01 Jan 1998
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Abstract: A comprehensive look at learning and generalization theory. The statistical theory of learning and generalization concerns the problem of choosing desired functions on the basis of empirical data. Highly applicable to a variety of computer science and robotics fields, this book offers lucid coverage of the theory as a whole. Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.

26,531 citations


"Development of efficient image quar..." refers background in this paper

  • ...In a Philippine study [1-2], mammogram screening was done on 151,198 women....


Journal ArticleDOI
TL;DR: This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis, as examples for successful kernel-based learning methods.
Abstract: This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis, as examples for successful kernel-based learning methods. We first give a short background about Vapnik-Chervonenkis theory and kernel feature spaces and then proceed to kernel based learning in supervised and unsupervised scenarios including practical and algorithmic considerations. We illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition and DNA analysis.
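As a small illustration of the kernel-based learning methods surveyed above, the sketch below chains kernel PCA (a nonlinear feature map) with a support vector machine on a toy two-class problem. The dataset, kernel parameters, and pipeline are illustrative assumptions rather than anything taken from the tutorial itself.

```python
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy non-linearly separable data.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Kernel PCA projects the data into a nonlinear feature space; an RBF-kernel
# SVM then separates the classes there.
model = make_pipeline(KernelPCA(n_components=2, kernel="rbf", gamma=5.0),
                      SVC(kernel="rbf"))
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")
```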

3,566 citations

Journal ArticleDOI
T. Netsch, H.-O. Peitgen
TL;DR: The basic method for automated detection of microcalcifications in digitized mammograms is significantly improved by consideration of the statistical variation of the estimated contrast, which is the result of the complex noise characteristic of the mammograms.
Abstract: A method is described for the automated detection of microcalcifications in digitized mammograms. The method is based on the Laplacian scale-space representation of the mammogram only. First, possible locations of microcalcifications are identified as local maxima in the filtered image on a range of scales. For each finding, the size and local contrast is estimated, based on the Laplacian response denoted as the scale-space signature. A finding is marked as a microcalcification if the estimated contrast is larger than a predefined threshold which depends on the size of the finding. It is shown that the signature has a characteristic peak, revealing the corresponding image features. This peak can be robustly determined. The basic method is significantly improved by consideration of the statistical variation of the estimated contrast, which is the result of the complex noise characteristic of the mammograms. The method is evaluated with the Nijmegen database and compared to other methods using these mammograms. Results are presented as the free-response receiver operating characteristic (FROC) performance. At a rate of one false positive cluster per image the method reaches a sensitivity of 0.84, which is comparable to the best results achieved so far.
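A minimal sketch of the multi-scale idea summarised above: filter the image with a Laplacian-of-Gaussian at several scales, take local maxima of the response as candidate microcalcifications, and keep candidates whose response exceeds a contrast threshold. The scales, thresholds, neighbourhood size, and toy image are arbitrary assumptions, and the size-dependent threshold and statistical contrast correction of the cited method are not reproduced.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, maximum_filter


def detect_candidates(image, sigmas=(1.0, 2.0, 4.0), contrast_thresh=0.1):
    candidates = []
    for sigma in sigmas:
        # Negative LoG so bright blobs give positive responses; the sigma**2
        # factor makes responses comparable across scales.
        response = -(sigma ** 2) * gaussian_laplace(image, sigma=sigma)
        local_max = response == maximum_filter(response, size=5)
        ys, xs = np.nonzero(local_max & (response > contrast_thresh))
        candidates += [(y, x, sigma, response[y, x]) for y, x in zip(ys, xs)]
    return candidates


# Toy image: two small bright Gaussian blobs on a noisy background.
rng = np.random.default_rng(1)
img = rng.normal(0.0, 0.02, size=(128, 128))
yy, xx = np.mgrid[:128, :128]
for cy, cx in [(30, 40), (80, 90)]:
    img += 0.5 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 2.0 ** 2))

print(f"{len(detect_candidates(img))} candidate detections")
```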

153 citations

Journal ArticleDOI
TL;DR: It is shown that a biorthogonal spline wavelet closely approximates the prewhitening matched filter for detecting Gaussian objects in Markov noise and two contrasting applications of the wavelets-based object recovery algorithm are presented.
Abstract: We show that a biorthogonal spline wavelet closely approximates the prewhitening matched filter for detecting Gaussian objects in Markov noise. The filterbank implementation of the wavelet transform acts as a hierarchy of such detectors operating at discrete object scales. If the object to be detected is Gaussian and its scale happens to coincide with one of those computed by the wavelet transform, and if the background noise is truly Markov, then optimum detection is realized by thresholding the appropriate subband image. In reality, the Gaussian may be a rather coarse approximation of the object, and the background noise may deviate from the Markov assumption. In this case, we may view the wavelet decomposition as a means for computing an orthogonal feature set for input to a classifier. We use a supervised linear classifier applied to feature vectors comprised of samples taken from the subbands of an N-octave, undecimated wavelet transform. The resulting map of test statistic values indicates the presence and location of objects. The object itself is reconstructed by using the test statistic to emphasize wavelet subbands, followed by computing the inverse wavelet transform. We show two contrasting applications of the wavelets-based object recovery algorithm. For detecting microcalcifications in digitized mammograms, the object and noise models closely match the real image data, and the multiscale matched filter paradigm is highly appropriate. The second application, extracting ship outlines in noisy forward-looking infrared images, is presented as a case where good results are achieved despite the data models being less well matched to the assumptions of the algorithm.
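A minimal sketch of the feature construction described above: an undecimated (stationary) 2-D wavelet transform with a biorthogonal spline wavelet, whose per-pixel subband values form feature vectors for a supervised linear classifier, producing a map of test-statistic values. The specific wavelet ("bior2.2"), the use of LDA as the linear classifier, and the toy image and labels are assumptions for illustration.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def swt_feature_stack(image, wavelet="bior2.2", level=2):
    """Per-pixel feature vectors from an undecimated wavelet transform."""
    coeffs = pywt.swt2(image, wavelet=wavelet, level=level)
    bands = []
    for approx, (h, v, d) in coeffs:       # keep every subband at every level
        bands += [approx, h, v, d]
    return np.stack(bands, axis=-1).reshape(-1, len(bands))


# Toy image and per-pixel labels (1 inside a small bright square, else 0).
rng = np.random.default_rng(0)
img = rng.normal(0, 0.05, (64, 64))
img[28:36, 28:36] += 1.0
labels = np.zeros((64, 64), dtype=int)
labels[28:36, 28:36] = 1

X = swt_feature_stack(img)
clf = LinearDiscriminantAnalysis().fit(X, labels.ravel())

# The classifier output over all pixels is the map of test-statistic values.
test_statistic = clf.decision_function(X).reshape(img.shape)
print("Peak of the test-statistic map:",
      np.unravel_index(test_statistic.argmax(), img.shape))
```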

144 citations


"Development of efficient image quar..." refers background in this paper

  • ...If detected early, it can be treated early, and the mortality rate of breast cancer can be reduced [7]....


Journal ArticleDOI
TL;DR: Exhaustive experimental investigation demonstrates that FDT-SVM is favorably compared with six existing methods, including traditional multiclass SVMs and SVM-based binary hierarchical trees.
Abstract: A novel fuzzy decision tree is proposed in this paper (the FDT-support vector machine (SVM) classifier), where the node discriminations are implemented via binary SVMs. The tree structure is determined via a class grouping algorithm, which forms the groups of classes to be separated at each internal node, based on the degree of fuzzy confusion between the classes. In addition, effective feature selection is incorporated within the tree building process, selecting suitable feature subsets required for the node discriminations individually. FDT-SVM exhibits a number of attractive merits such as enhanced classification accuracy, interpretable hierarchy, and low model complexity. Furthermore, it provides hierarchical image segmentation and has reasonably low computational and data storage demands. Our approach is tested on two different tasks: natural forest classification using a QuickBird multispectral image and urban classification using hyperspectral data. Exhaustive experimental investigation demonstrates that FDT-SVM is favorably compared with six existing methods, including traditional multiclass SVMs and SVM-based binary hierarchical trees. Comparative analysis is carried out in terms of testing rates, architecture complexity, and computational times required for the operative phase.
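A heavily simplified sketch of the "binary SVMs at the internal tree nodes" idea described above: classes are split into two groups at the root node, and each group is then resolved by its own SVM. The fuzzy class-grouping algorithm and per-node feature selection of FDT-SVM are not reproduced; the hand-picked grouping, the Iris data, and the two-level tree are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Hand-picked class grouping (FDT-SVM derives this from fuzzy class confusion).
group_a, group_b = {0}, {1, 2}

# Root node: a binary SVM deciding between the two groups.
root = SVC().fit(X_tr, np.isin(y_tr, list(group_b)).astype(int))

# Leaf node: an SVM resolving the classes inside the multi-class group.
in_b = np.isin(y_tr, list(group_b))
leaf_b = SVC().fit(X_tr[in_b], y_tr[in_b])


def predict(x):
    x = x.reshape(1, -1)
    if root.predict(x)[0] == 0:            # routed to group A (a single class)
        return next(iter(group_a))
    return leaf_b.predict(x)[0]            # resolved within group B


preds = np.array([predict(x) for x in X_te])
print(f"Tree-of-SVMs accuracy: {(preds == y_te).mean():.2f}")
```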

98 citations


"Development of efficient image quar..." refers background in this paper

  • ...If pre-processing aims to correct some degradation in the image, the nature of a priori information is important [8-9]....
