Author

Wei Yuke

Bio: Wei Yuke is an academic researcher from Guangdong University of Technology. The author has contributed to research in topics: Algorithm design & Rough set. The author has an h-index of 2, having co-authored 11 publications that have received 14 citations.

Papers
Journal Article
TL;DR: An adaptive segmentation algorithm is proposed that efficiently segments tongue images, including images whose background and object boundaries are not distinct.
Abstract: Tongue diagnosis is an important topic in Traditional Chinese Medicine (TCM), and correct segmentation of the tongue body is a prerequisite for extracting diagnostic information from tongue features. The main current method is threshold-based, but it cannot handle tongue images with low contrast. This paper proposes an adaptive segmentation algorithm, implemented on a VC++ development platform, that segments tongue images efficiently. The image is first divided into several sub-blocks; an iterative procedure then computes a threshold for each sub-block, and the image is segmented according to these local thresholds. Experimental results show that the algorithm segments well even tongue images whose background and object boundaries are not distinct.
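As a rough illustration of the block-wise scheme described in the abstract (divide the image into sub-blocks, compute an iterative threshold per block, then binarize locally), the following Python/NumPy sketch shows one plausible implementation; the block size, the isodata-style iteration, and the function names are assumptions, not the authors' VC++ code.

import numpy as np

def iterative_threshold(block, eps=0.5):
    # Isodata-style iteration: start from the block mean and move the
    # threshold to the midpoint of the two class means until it settles.
    t = block.mean()
    while True:
        lo, hi = block[block <= t], block[block > t]
        if lo.size == 0 or hi.size == 0:
            return t
        t_new = 0.5 * (lo.mean() + hi.mean())
        if abs(t_new - t) < eps:
            return t_new
        t = t_new

def adaptive_segment(img, block=64):
    # Threshold each sub-block with its own locally computed threshold.
    mask = np.zeros_like(img, dtype=bool)
    for r in range(0, img.shape[0], block):
        for c in range(0, img.shape[1], block):
            sub = img[r:r + block, c:c + block]
            mask[r:r + block, c:c + block] = sub > iterative_threshold(sub)
    return mask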

6 citations

Proceedings ArticleDOI
04 Jul 2009
TL;DR: The algorithm avoids over-segmentation and speeds up segmentation through fuzzy grid division, and has been applied to tongue image segmentation in Traditional Chinese Medicine (TCM).
Abstract: The paper studies a new theory of fuzzy rough sets and presents a method to segment tongue images, introducing a Fuzzy Rough Clustering Based on Grid algorithm built on that theory. The algorithm extracts condensation points using fuzzy rough set theory, quarters the data space layer by layer, and softens the edges of dense blocks by drawing condensation points at their borders. The algorithm has been applied to tongue image segmentation in Traditional Chinese Medicine (TCM). The application results indicate that the algorithm avoids over-segmentation and speeds up segmentation through fuzzy grid division.
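The abstract sketches the clustering procedure only at a high level. The toy Python sketch below illustrates just the recursive grid-quartering idea (split the space into four sub-cells layer by layer and keep a representative point for each sufficiently dense cell); the fuzzy rough set machinery for choosing condensation points and softening block edges is not reproduced, and all names and thresholds are assumptions.

import numpy as np

def quarter(points, bounds, depth, min_pts, cells):
    # Recursively split the 2-D bounding box into four sub-cells,
    # keeping a representative point for each dense cell at maximum depth.
    (x0, y0), (x1, y1) = bounds
    inside = points[(points[:, 0] >= x0) & (points[:, 0] < x1) &
                    (points[:, 1] >= y0) & (points[:, 1] < y1)]
    if len(inside) < min_pts:
        return
    if depth == 0:
        cells.append(inside.mean(axis=0))   # stand-in for a "condensation point"
        return
    xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
    for sub in (((x0, y0), (xm, ym)), ((xm, y0), (x1, ym)),
                ((x0, ym), (xm, y1)), ((xm, ym), (x1, y1))):
        quarter(points, sub, depth - 1, min_pts, cells)

pts = np.random.rand(500, 2)
centers = []
quarter(pts, ((0.0, 0.0), (1.0, 1.0)), depth=3, min_pts=10, cells=centers)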

4 citations

Proceedings ArticleDOI
19 May 2009
TL;DR: The algorithm improves the speed, reliability and accuracy of TCM tongue diagnosis and also meets the requirements of intelligence and digitization.
Abstract: The paper studies a new theory of fuzzy rough sets and presents a method to approximately estimate objects within a range, introducing a Fuzzy Rough Clustering Based on Grid algorithm built on that theory. The algorithm extracts condensation points using fuzzy rough set theory, quarters the data space layer by layer, and softens the edges of dense blocks by drawing condensation points at their borders. The tongue diagnosis system is large and complex; its data volume is great and its data clusters are uncertain. The algorithm has been applied to rule mining in a Traditional Chinese Medicine (TCM) tongue diagnosis system. The application results indicate that the algorithm speeds up clustering through fuzzy grid division and saves considerable time compared with traditional fuzzy clustering algorithms. It improves the speed, reliability and accuracy of TCM tongue diagnosis and also meets the requirements of intelligence and digitization.

2 citations

Journal Article
TL;DR: Comparison results show that the regularized P-M equation is the best method for denoising tongue images: it is fast, gives good denoising results, and preserves edges best.
Abstract: When denoising a tongue image, details such as edges and veins are easily lost. To address this problem, this paper proposes a new method for denoising tongue images based on partial differential equations (PDE). Median filtering, Gaussian filtering, the P-M equation, the regularized P-M equation, and a coupled shock-filtering and complex-diffusion model are applied to noisy tongue images. The comparison shows that the regularized P-M equation is the best method for denoising tongue images: it is fast, gives good denoising results, and preserves edges best.
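Since the comparison favours the regularized P-M equation, a brief Python/NumPy sketch of that filter may help: the conductivity is driven by gradients of a Gaussian-smoothed copy of the image (the regularization), and the image is evolved by explicit diffusion steps. The step size, edge-stopping constant, and iteration count are illustrative choices, not the paper's settings.

import numpy as np
from scipy.ndimage import gaussian_filter

def regularized_pm(img, n_iter=20, k=0.05, dt=0.2, sigma=1.0):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Regularization: gradients of a Gaussian-smoothed copy drive the
        # conductivity; plain P-M would use gradients of u itself.
        gx, gy = np.gradient(gaussian_filter(u, sigma))
        c = 1.0 / (1.0 + (gx ** 2 + gy ** 2) / k ** 2)   # edge-stopping function
        ux, uy = np.gradient(u)
        # Explicit Euler step on div(c * grad u).
        u += dt * (np.gradient(c * ux, axis=0) + np.gradient(c * uy, axis=1))
    return u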

1 citation

Proceedings ArticleDOI
16 Jul 2008
TL;DR: Results on complex TCM diagnosis and inference show that the intelligent inference model studied in this paper can be applied effectively to TCM differentiation of symptoms and signs.
Abstract: This paper studies an intelligent inference model embedding a parallel competitive neural network and implements a simulation of the diagnosis and inference process of traditional Chinese medicine (TCM) experts by integrating fuzzy neural network (FNN) technology, image processing and recognition, data mining, and extension theory. Results on complex TCM diagnosis and inference show that the intelligent inference model studied in this paper can be applied effectively to TCM differentiation of symptoms and signs.

1 citation


Cited by
Journal ArticleDOI
01 Jul 2014
TL;DR: In this paper, previous rough-set-based image segmentation methods are summarized in detail and categorized, showing that rough sets provide a stable and effective framework for image segmentation.
Abstract: In the domain of image processing, image segmentation has become one of the key operations involved in most image-based tasks. Image segmentation refers to the process of partitioning an image. Like several other image processing operations, image segmentation runs into problems when the segmentation task becomes more complicated. Previous work has shown that rough set theory can help overcome such complications: it supports fast convergence and helps avoid the local minima problem, thereby enhancing the performance of EM and yielding better results. During rough-set-theoretic rule generation, each band is individualized using fuzzy-correlation-based gray-level thresholding, so the use of rough sets in image segmentation can be very useful. In this paper, previous rough-set-based image segmentation methods are summarized in detail and categorized accordingly. Rough-set-based image segmentation provides a stable and effective framework for image segmentation.

77 citations

Proceedings ArticleDOI
15 Apr 2018
TL;DR: DeepTongue, an end-to-end trainable tongue image segmentation method based on a deep convolutional neural network with a ResNet backbone, segments the tongue with a single forward pass and no preprocessing.
Abstract: Accurate tongue image segmentation helps produce correct automatic tongue diagnosis results, but traditional methods do not give satisfying results in most cases. This paper proposes an end-to-end trainable tongue image segmentation method using a deep convolutional neural network based on ResNet. The proposed method, named DeepTongue, segments the tongue with a single forward pass and no preprocessing, and places no restrictions on the illumination or size of the tongue images. Experimental results show that DeepTongue improves segmentation accuracy by a noticeable margin and is much faster than existing tongue image segmentation methods.
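The abstract does not specify the network beyond "ResNet-based and end-to-end trainable", so the PyTorch sketch below only illustrates the general pattern of such a model: a ResNet encoder, a 1x1 classifier, and bilinear upsampling back to the input resolution to produce a per-pixel tongue mask. It is not the DeepTongue architecture, and all names are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18

class TongueSegNet(nn.Module):
    # ResNet-18 encoder, 1x1 classifier, bilinear upsampling to a pixel mask.
    def __init__(self):
        super().__init__()
        backbone = resnet18()
        # Keep everything up to the last residual stage; drop avgpool/fc.
        self.encoder = nn.Sequential(*list(backbone.children())[:-2])
        self.classifier = nn.Conv2d(512, 1, kernel_size=1)  # tongue vs. background

    def forward(self, x):
        feats = self.encoder(x)             # (N, 512, H/32, W/32)
        logits = self.classifier(feats)     # (N, 1, H/32, W/32)
        return F.interpolate(logits, size=x.shape[2:], mode="bilinear",
                             align_corners=False)

mask = torch.sigmoid(TongueSegNet()(torch.randn(1, 3, 224, 224)))  # (1, 1, 224, 224)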

31 citations

Proceedings ArticleDOI
06 May 2016
TL;DR: Experiments show that the DC_Apriori algorithm is clearly superior to the Apriori algorithm and the matrix-based MC_Apriori algorithm, both at small support thresholds and on dense databases with large numbers of transactions and items.
Abstract: The Apriori algorithm is a classical association rule mining algorithm, but it scans the database repeatedly and generates a large number of candidate sets. To solve these problems, an improved DC_Apriori algorithm is proposed. It restructures the storage of the database and improves the joining of frequent itemsets: k-frequent itemsets are generated by joining the 1-frequent itemsets with the (k-1)-frequent itemsets, which greatly reduces the number of join operations, obtains the frequent itemsets directly with a single pruning operation, avoids unnecessary invalid candidate sets, greatly reduces the number of database scans, and improves the efficiency of frequent itemset generation. Experiments show that the DC_Apriori algorithm is clearly superior to the Apriori algorithm and the matrix-based MC_Apriori algorithm: both at small support thresholds and on dense databases with large numbers of transactions and items, the running time of DC_Apriori to obtain the same result is significantly lower.
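The key idea quoted in the abstract, generating k-frequent itemsets by joining the (k-1)-frequent itemsets with the 1-frequent itemsets rather than by the classic self-join, can be sketched in a few lines of Python; the support computation and names below are illustrative and omit the paper's restructured storage and pruning details.

def frequent_itemsets(transactions, min_support):
    transactions = [set(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        # Fraction of transactions containing every item of the itemset.
        return sum(itemset <= t for t in transactions) / n

    items = {i for t in transactions for i in t}
    l1 = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    frequent, prev = list(l1), l1
    while prev:
        # Join each (k-1)-frequent itemset with the 1-frequent itemsets.
        candidates = {p | s for p in prev for s in l1 if not s <= p}
        prev = [c for c in candidates if support(c) >= min_support]
        frequent += prev
    return frequent

print(frequent_itemsets([["a", "b", "c"], ["a", "b"], ["a", "c"], ["b", "c"]], 0.5))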

18 citations

Book ChapterDOI
01 Jan 2015
TL;DR: This chapter describes the state of the art in combinations of fuzzy and rough sets, dividing it into three parts.
Abstract: Fuzzy sets and rough sets are uncertainty models proposed to treat different aspects of uncertainty, so it is natural to combine them to build more powerful mathematical tools for treating problems under uncertainty. In this chapter, we describe the state of the art in combinations of fuzzy and rough sets, dividing it into three parts.

16 citations

Journal ArticleDOI
TL;DR: A novel method is proposed that applies multiobjective greedy rules and fuses color and spatial information to extract the tongue image accurately.
Abstract: Tongue images with coating carry important clinical diagnostic meaning, but traditional tongue extraction methods cannot handle tongues with thick coating. This paper proposes a novel method that applies multiobjective greedy rules and fuses color and spatial information to extract the tongue image accurately. A comparative study of several contemporary tongue extraction methods is also carried out in terms of accuracy and efficiency. The experimental results show that the geodesic active contour is quite slow and not accurate; the other three methods achieve fairly good segmentation results except for tongues with thick coating; our method achieves good segmentation results for all types of tongue images; and its efficiency is acceptable for quantitative examination of tongue images.

9 citations