Author

Taweethong Koanantakool

Bio: Taweethong Koanantakool is an academic researcher from the Thailand Ministry of Public Health. The author has contributed to research on image segmentation and burn injury, has an h-index of 1, and has co-authored 1 publication receiving 38 citations.

Papers
Proceedings ArticleDOI
22 Mar 2012
TL;DR: This work develops an automatic system that provides a first assessment of burn injury from burn color images, combining burn-wound segmentation with identification of the degree of burn.
Abstract: When a burn injury occurs, the most important step is to treat it immediately by identifying the degree of the burn, which can only be diagnosed by specialists. However, burn-trauma specialists are still scarce at some local hospitals, so an automatic system able to help evaluate burns would be extremely beneficial to them. The aim of this work is to develop an automatic system that provides a first assessment of burn injuries from burn color images. The method consists of two parts: burn image segmentation and degree-of-burn identification. Segmentation employs the Cr-transformation, the Luv-transformation, and fuzzy c-means clustering to separate the burn wound area from healthy skin; mathematical morphology is then applied to reduce segmentation errors. Segmentation performance is evaluated by the positive predictive value (PPV) and the sensitivity (S). Degree-of-burn identification uses the h-transformation and texture analysis to extract feature vectors, and a support vector machine (SVM) identifies the degree of burn; the classification results are compared with those of Bayes and k-nearest-neighbor classifiers. The experimental results show that the proposed segmentation algorithm yields good results on burn color images, with PPV and S of about 0.92 and 0.84, respectively. For degree-of-burn identification, the SVM gives the best results: 89.29% correct classification on the validation sets of 4-fold cross-validation and 75.33% correct classification in a blind test.

46 citations
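A minimal sketch of the segmentation pipeline described above: Cr-channel extraction, fuzzy c-means clustering into wound vs. healthy skin, then morphological cleanup. This is an illustration under assumptions, not the authors' implementation; it uses the scikit-fuzzy, OpenCV, and scikit-image packages and clusters only the Cr channel for brevity (the paper also uses the Luv transform).

```python
# Sketch: Cr transform + fuzzy c-means + mathematical morphology.
# Illustrative only; thresholds and structuring-element sizes are guesses.
import numpy as np
import cv2
import skfuzzy as fuzz
from skimage import morphology

def segment_burn(rgb_image):
    # The Cr channel of YCrCb emphasizes reddish (wound-like) regions.
    ycrcb = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2YCrCb)
    cr = ycrcb[:, :, 1].astype(np.float64)

    # Fuzzy c-means with two clusters: wound vs. healthy skin.
    data = cr.reshape(1, -1)  # skfuzzy expects (n_features, n_samples)
    cntr, u, *_ = fuzz.cluster.cmeans(data, c=2, m=2.0,
                                      error=1e-4, maxiter=100, seed=0)

    # Take the cluster with the higher Cr centroid as the wound.
    wound = int(np.argmax(cntr[:, 0]))
    mask = (np.argmax(u, axis=0) == wound).reshape(cr.shape)

    # Mathematical morphology to reduce segmentation errors.
    mask = morphology.binary_opening(mask, morphology.disk(3))
    mask = morphology.binary_closing(mask, morphology.disk(3))
    return mask
```

The reported metrics are presumably the standard per-pixel definitions against a ground-truth mask: PPV = TP / (TP + FP) and S = TP / (TP + FN).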


Cited by
Journal ArticleDOI
TL;DR: A novel approach uses support vector machines (SVMs) to determine wound boundaries on foot ulcer images captured with an image capture box, which provides controlled lighting and range; the method is efficient enough for smartphone-based image analysis.
Abstract: The standard chronic wound assessment method, based on visual examination, is potentially inaccurate and represents a significant clinical workload. Hence, computer-based systems providing quantitative wound assessment may be valuable for accurately monitoring wound healing status, with the wound area being the aspect best suited to automated analysis. Here, we present a novel approach that uses support vector machines (SVMs) to determine the wound boundaries on foot ulcer images captured with an image capture box, which provides controlled lighting and range. After superpixel segmentation, a cascaded two-stage classifier operates as follows: in the first stage, a set of k binary SVM classifiers is trained and applied to different subsets of the entire training image dataset, and incorrectly classified instances are collected; in the second stage, another binary SVM classifier is trained on the incorrectly classified set. Various color and texture descriptors are extracted from the superpixels and used as input at each stage of classifier training. Specifically, color and bag-of-words representations of dense local scale-invariant feature transform (SIFT) features serve as descriptors for ruling out irrelevant regions, while color and wavelet-based features serve as descriptors for distinguishing healthy tissue from wound regions. Finally, the detected wound boundary is refined with the conditional random field method. We have implemented the wound classification on a Nexus 5 smartphone platform, except for training, which was done offline. Results are compared with other classifiers and show that our approach provides high global performance rates (average sensitivity = 73.3%, specificity = 94.6%) and is sufficiently efficient for smartphone-based image analysis.

109 citations
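A rough sketch of the superpixel-plus-SVM idea, assuming scikit-image and scikit-learn. The paper's descriptors (dense SIFT bag-of-words, wavelet features) and its two-stage cascade are simplified here to mean Lab color per superpixel and a single classifier.

```python
# Simplified superpixel classification for wound segmentation.
# One SVM on mean-color features stands in for the paper's cascade.
import numpy as np
from skimage.segmentation import slic
from skimage.color import rgb2lab
from sklearn.svm import SVC

def superpixel_features(rgb_image, n_segments=400):
    segments = slic(rgb_image, n_segments=n_segments, compactness=10)
    lab = rgb2lab(rgb_image)
    feats = np.array([lab[segments == s].mean(axis=0)
                      for s in np.unique(segments)])
    return segments, feats

def predict_mask(clf, rgb_image):
    # Predict wound (1) vs. non-wound (0) per superpixel, paint the mask.
    segments, feats = superpixel_features(rgb_image)
    labels = clf.predict(feats)
    mask = np.zeros(segments.shape, dtype=bool)
    for seg_id, label in zip(np.unique(segments), labels):
        mask[segments == seg_id] = bool(label)
    return mask

# Training: clf = SVC(kernel="rbf", C=1.0).fit(train_feats, train_labels)
```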

Journal ArticleDOI
TL;DR: Wang et al. propose a novel convolutional framework, based on MobileNetV2 and connected-component labeling, to segment wound regions from natural images using a lightweight, less compute-intensive architecture.
Abstract: Acute and chronic wounds have varying etiologies and are an economic burden to healthcare systems around the world; the advanced wound care market is expected to exceed $22 billion by 2024. Wound care professionals rely heavily on images and image documentation for proper diagnosis and treatment, but a lack of expertise can lead to improper diagnosis of wound etiology and inaccurate wound management and documentation. Fully automatic segmentation of wound areas in natural images is an important part of the diagnosis and care protocol, since it is crucial to measure the wound area and provide quantitative parameters for treatment. Various deep learning models have achieved success in image analysis, including semantic segmentation. This manuscript proposes a novel convolutional framework based on MobileNetV2 and connected-component labeling to segment wound regions from natural images. The advantage of this model is its lightweight, less compute-intensive architecture; performance is not compromised and is comparable to deeper neural networks. We build an annotated wound image dataset consisting of 1109 foot ulcer images from 889 patients to train and test the deep learning models, and we demonstrate the effectiveness and mobility of our method through comprehensive experiments and analyses on various segmentation neural networks. The full implementation is available at https://github.com/uwm-bigdata/wound-segmentation.

80 citations
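The connected-component labeling step amounts to post-processing the network's output: threshold the probability map, label the blobs, and keep only plausible ones. A minimal sketch, assuming SciPy and a probability map `prob` from any segmentation network (the MobileNetV2-based network itself is not reproduced here):

```python
# Post-process a segmentation probability map with connected-component
# labeling: threshold, label blobs, drop those below a minimum area.
import numpy as np
from scipy import ndimage

def clean_mask(prob, threshold=0.5, min_area=200):
    mask = prob > threshold
    labeled, n = ndimage.label(mask)  # label connected foreground blobs
    if n == 0:
        return mask
    areas = ndimage.sum(mask, labeled, index=range(1, n + 1))
    keep = [i + 1 for i, a in enumerate(areas) if a >= min_area]
    return np.isin(labeled, keep)
```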

Journal ArticleDOI
TL;DR: Defines a four-dimensional probability map specific to wound characteristics, a computationally efficient method to segment wound images using that probability map, and auto-calibration of wound measurements based on the content of the image.

71 citations
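The TL;DR does not say what the four dimensions are, but the general probability-map pattern is: score each pixel on several wound-relevant features, combine the scores into one map, and threshold. A hypothetical sketch (the feature maps and weights below are invented for illustration):

```python
# Hypothetical probability-map segmentation. The paper's actual four
# dimensions are not specified in the TL;DR above.
import numpy as np

def probability_map(feature_maps, weights):
    """feature_maps: HxW arrays scaled to [0, 1]; weights sum to 1."""
    prob = np.zeros_like(feature_maps[0], dtype=np.float64)
    for fmap, w in zip(feature_maps, weights):
        prob += w * fmap
    return prob

# mask = probability_map([redness, saturation, texture, edge_score],
#                        weights=[0.4, 0.3, 0.2, 0.1]) > 0.5
```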

Journal ArticleDOI
26 Feb 2016-PLOS ONE
TL;DR: An interactive mobile phone application enables the transfer of both patient data and pictures of a wound from the point of care to a remote burns expert, who in turn provides advice back.
Abstract: BACKGROUND: Each year more than 10 million people worldwide are burned severely enough to require medical attention, with clinical outcomes noticeably worse in resource-poor settings. Expert clinical advice on acute injuries can play a determining role, and there is a need for novel approaches that allow timely access to advice. We developed an interactive mobile phone application that enables transfer of both patient data and pictures of a wound from the point of care to a remote burns expert who, in turn, provides advice back. METHODS AND RESULTS: The application is an integrated clinical decision support system comprising a mobile phone application and server software running in a cloud environment. The client application is installed on a smartphone, and structured patient data and photographs can be captured in a protocol-driven manner. The user can indicate the specific injured body surface(s) through a touchscreen interface, and an integrated calculator estimates the total body surface area affected by the burn injury. Predefined standardised care advice, including the total fluid requirement, is provided immediately by the software, and the case data are relayed to a cloud server. A text message is automatically sent to a burn expert on call, who can then access the cloud server with the smartphone app or a web browser, review the case and pictures, and respond with both structured and personalized advice to the health care professional at the point of care. CONCLUSIONS: In this article, we present the design of the smartphone and server applications alongside the type of structured patient data collected, together with the pictures taken at the point of care. We report on how the application will be introduced at the point of care and how its clinical impact will be evaluated prior to roll-out. Challenges, strengths, and limitations of the system are identified that may help realise or hinder the expected outcome: a solution for remote consultation on burns that can be integrated into routine acute clinical care and thereby promote equity in injury emergency care, a growing public health burden.

49 citations
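The abstract does not name the fluid-requirement formula the app uses; the Parkland formula is the common standard for the first 24 hours after a burn and serves here as a worked example of the kind of calculation involved:

```python
# Example fluid-requirement calculation (Parkland formula):
# 4 mL x body weight (kg) x %TBSA over 24 h, half in the first 8 h.
# Whether the app uses this exact formula is an assumption.
def parkland_fluid_ml(weight_kg: float, tbsa_percent: float) -> dict:
    total = 4.0 * weight_kg * tbsa_percent
    return {
        "total_24h_ml": total,
        "first_8h_ml": total / 2,   # given in the first 8 hours
        "next_16h_ml": total / 2,   # given over the following 16 hours
    }

# A 70 kg adult with 20% TBSA: 5600 mL total, 2800 mL in the first 8 h.
print(parkland_fluid_ml(70, 20))
```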

Journal ArticleDOI
TL;DR: The application of AI is very promising for prediction of burn depth and can therefore be a useful tool to help guide clinical decisions and the initial treatment of burn wounds.
Abstract: We present the application of deep convolutional neural networks (CNNs), a state-of-the-art artificial intelligence (AI) approach in machine learning, to automated, time-independent prediction of burn depth. Color images of four types of burn depth injured in the first few days, together with normal skin and background, were acquired with a TiVi camera and used to train and test four pretrained deep CNNs: VGG-16, GoogleNet, ResNet-50, and ResNet-101. The best 10-fold cross-validation results were obtained with ResNet-101, with average, minimum, and maximum accuracy of 81.66%, 72.06%, and 88.06%, respectively; the average accuracy, sensitivity, and specificity for the four types of burn depth are 90.54%, 74.35%, and 94.25%, respectively. Accuracy was compared with the clinical diagnosis obtained after the wound had healed. The application of AI is thus very promising for prediction of burn depth and can be a useful tool to help guide clinical decisions and the initial treatment of burn wounds.

38 citations
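Fine-tuning pretrained CNNs such as those named above follows a standard transfer-learning recipe: swap the ImageNet classifier head for one matching the burn-depth classes and retrain. A minimal sketch in PyTorch/torchvision (framework choice and class count are assumptions; the paper does not state its implementation details):

```python
# Transfer-learning sketch for burn-depth classification with a
# pretrained ResNet-101. Framework and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torchvision import models

def build_model(num_classes: int) -> nn.Module:
    model = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)
    # Replace the 1000-way ImageNet head with a burn-depth classifier.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

model = build_model(num_classes=6)  # e.g. 4 burn depths + skin + background
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
# Standard loop: for images, labels in loader:
#     optimizer.zero_grad()
#     loss = criterion(model(images), labels)
#     loss.backward(); optimizer.step()
```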