Journal ArticleDOI

Intelligent Diagnostic Prediction and Classification System for Chronic Kidney Disease.

03 Jul 2019 - Scientific Reports (Nature Publishing Group) - Vol. 9, Iss. 1, p. 9583
TL;DR: Compared with existing methods, the presented intelligent system outperformed the other methodologies, achieving a significant improvement in classification accuracy while using fewer features.
Abstract: Healthcare systems are now being equipped with advanced capabilities such as machine learning (ML), data mining and artificial intelligence in order to offer people more intelligent and expert healthcare services. This paper introduces an intelligent prediction and classification system for healthcare, namely a Density-based Feature Selection (DFS) with Ant Colony based Optimization (D-ACO) algorithm for chronic kidney disease (CKD). The proposed intelligent system eliminates irrelevant or redundant features using DFS prior to construction of the ACO-based classifier. The proposed D-ACO framework comprises three phases, namely preprocessing, Feature Selection (FS) and classification. The D-ACO algorithm is tested on a benchmark CKD dataset and its performance is investigated with respect to different evaluation factors. Compared with existing methods, the presented intelligent system outperformed the other methodologies, achieving a significant improvement in classification accuracy while using fewer features.
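For orientation, a minimal sketch of the three-phase pipeline the abstract describes (preprocessing, feature selection, classification). The abstract does not give the D-ACO details, so a mutual-information filter stands in for the density-based feature selection and a random forest stands in for the ACO-constructed classifier; load_ckd_dataset is a hypothetical helper, not part of the paper.

# Sketch only: generic preprocessing -> feature selection -> classification
# pipeline with scikit-learn stand-ins for the paper's DFS and ACO stages.
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_ckd_dataset()  # hypothetical loader for the benchmark CKD dataset

pipeline = Pipeline([
    ("impute", SimpleImputer(strategy="median")),             # phase 1: preprocessing
    ("scale", StandardScaler()),
    ("select", SelectKBest(mutual_info_classif, k=10)),       # phase 2: feature selection (DFS stand-in)
    ("classify", RandomForestClassifier(n_estimators=200)),   # phase 3: classification (ACO stand-in)
])

print(cross_val_score(pipeline, X, y, cv=10, scoring="accuracy").mean())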


Citations
Journal ArticleDOI
TL;DR: A deep learning-based automated detection and classification model for fundus DR images is proposed that offers better classification than existing models.

164 citations


Cites background from "Intelligent Diagnostic Prediction and Classification System for Chronic Kidney Disease":

  • "Some other deep learning models have also been presented in [19-23]."

Journal ArticleDOI
TL;DR: In this paper, the authors used recursive feature elimination (RFE) to choose the most important features and applied four classification algorithms to detect chronic kidney disease (CKD) in patients.
Abstract: Chronic kidney disease (CKD) is among the top 20 causes of death worldwide and affects approximately 10% of the world adult population. CKD is a disorder that disrupts normal kidney function. Due to the increasing number of people with CKD, effective prediction measures for the early diagnosis of CKD are required. The novelty of this study lies in developing a diagnostic system to detect chronic kidney disease, assisting experts in exploring preventive measures for CKD through early diagnosis using machine learning techniques. The study evaluated a dataset collected from 400 patients containing 24 features. Mean and mode imputation were used to replace the missing numerical and nominal values, respectively. To choose the most important features, Recursive Feature Elimination (RFE) was applied. Four classification algorithms were applied in this study: support vector machine (SVM), k-nearest neighbors (KNN), decision tree, and random forest. All the classification algorithms achieved promising performance, and the random forest algorithm outperformed the others, reaching 100% accuracy, precision, recall, and F1-score. CKD is a serious life-threatening disease with high rates of morbidity and mortality, so artificial intelligence techniques are of great importance in its early detection; they support experts and doctors in early diagnosis and help avoid progression to kidney failure.

81 citations
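A hedged sketch of the workflow described in the abstract above: mean/mode imputation, Recursive Feature Elimination, and a comparison of the four classifiers. The column lists and the load_ckd_frame loader are illustrative assumptions, not artifacts of the study.

# Sketch: mean imputation for numeric columns, mode imputation for nominal
# columns, RFE for feature selection, then four classifiers compared by CV.
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import OneHotEncoder
from sklearn.feature_selection import RFE
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y, numeric_cols, nominal_cols = load_ckd_frame()  # hypothetical loader

preprocess = ColumnTransformer([
    ("num", SimpleImputer(strategy="mean"), numeric_cols),               # mean for numeric values
    ("nom", Pipeline([("imp", SimpleImputer(strategy="most_frequent")),  # mode for nominal values
                      ("ohe", OneHotEncoder(handle_unknown="ignore"))]), nominal_cols),
])

classifiers = {
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "DecisionTree": DecisionTreeClassifier(),
    "RandomForest": RandomForestClassifier(n_estimators=200),
}

for name, clf in classifiers.items():
    model = Pipeline([
        ("prep", preprocess),
        ("rfe", RFE(RandomForestClassifier(n_estimators=100), n_features_to_select=10)),
        ("clf", clf),
    ])
    print(name, cross_val_score(model, X, y, cv=10, scoring="accuracy").mean())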

Journal ArticleDOI
TL;DR: The proposed deep neural model outperformed the other classifiers (Support Vector Machine (SVM), K-Nearest Neighbor (KNN), Logistic Regression, Random Forest, and Naive Bayes) by achieving 100% accuracy.
Abstract: Diabetes and high blood pressure are the primary causes of Chronic Kidney Disease (CKD). Glomerular Filtration Rate (GFR) and kidney damage markers are used by researchers around the world to identify CKD as a condition that leads to reduced renal function over time. A person with CKD has a higher chance of dying young, and doctors face the difficult task of diagnosing the different diseases linked to CKD at an early stage in order to prevent them. This research presents a novel deep learning model for the early detection and prediction of CKD. The objective is to create a deep neural network and compare its performance to that of other contemporary machine learning techniques. In the experiments, the average of the associated features was used to replace all missing values in the database. The neural network's optimal parameters were then fixed by running multiple trials. The most important features were selected by Recursive Feature Elimination (RFE): Hemoglobin, Specific Gravity, Serum Creatinine, Red Blood Cell Count, Albumin, Packed Cell Volume, and Hypertension were identified as the key features. The selected features were passed to machine learning models for classification. The proposed deep neural model outperformed the other classifiers (Support Vector Machine (SVM), K-Nearest Neighbor (KNN), Logistic Regression, Random Forest, and Naive Bayes) by achieving 100% accuracy. The proposed approach could be a useful tool for nephrologists in detecting CKD.

55 citations
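The architecture and hyperparameters of the deep neural model are not given in the abstract, so the following is only an illustrative sketch: a small Keras network over the seven RFE-selected features named above. load_selected_ckd_features is a hypothetical helper, and the layer sizes are assumptions.

# Sketch: small binary classifier over the RFE-selected CKD features.
import tensorflow as tf

FEATURES = ["hemoglobin", "specific_gravity", "serum_creatinine",
            "red_blood_cell_count", "albumin", "packed_cell_volume",
            "hypertension"]

X_train, y_train = load_selected_ckd_features(FEATURES)  # hypothetical loader

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(len(FEATURES),)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # CKD vs. not-CKD
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=100, batch_size=16, validation_split=0.2)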

Journal ArticleDOI
TL;DR: In this article, an improved algorithm named Internet of Things-based enhanced machine learning is proposed; it involves separate functions to diagnose each type of tumour and analyzes and calculates properties such as the size, shape, and location of the tumour.
Abstract: In the current age of technology, the incidence of various diseases is also on the rise. Tumours cause increasing discomfort for many patients, and patients experience different effects depending on the tumour size and type. Developments in the medical field are moving towards tools based on IoT devices, combined with machine learning techniques built on artificial intelligence. Accordingly, an improved algorithm named Internet of Things-based enhanced machine learning is proposed in this paper. Its distinguishing feature is that it involves separate functions to diagnose each type of tumour, and it analyzes and calculates properties such as the size, shape, and location of the tumour. The prospect of curing cancer depends on the stage at which it is detected, and early detection offers the best chance of a rapid cure. At saturation, the proposed Internet of Things-based enhanced machine learning model achieved 94.56% accuracy, 94.12% precision, 94.98% recall, a 95.12% F1-score, and an execution time of 1856 ms. A simulation is conducted to test the efficacy of the model, and its results show that the proposed Internet of Things-based enhanced machine learning performs better than other methods.

52 citations
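The abstract above states that the model calculates the size, shape, and location of a tumour but does not specify how. A hedged sketch of such measurements from a binary segmentation mask, using scikit-image; the segmentation step and the IoT pipeline itself are not reproduced here, and tumour_measurements is an illustrative function name.

# Sketch: size (area), shape (eccentricity) and location (centroid) of the
# largest connected region in a binary tumour mask.
import numpy as np
from skimage.measure import label, regionprops

def tumour_measurements(mask: np.ndarray, pixel_spacing_mm: float = 1.0):
    regions = regionprops(label(mask.astype(int)))
    if not regions:
        return None  # no tumour region found in the mask
    largest = max(regions, key=lambda r: r.area)
    return {
        "area_mm2": largest.area * pixel_spacing_mm ** 2,  # size
        "eccentricity": largest.eccentricity,              # shape
        "centroid_rc": largest.centroid,                    # location (row, col)
    }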

Journal ArticleDOI
TL;DR: This study proposes a new segmentation-based classification model for skin lesion diagnosis that combines the GrabCut algorithm with an Adaptive Neuro-Fuzzy Classifier (ANFC) and exhibits better identification and classification of skin cancer.
Abstract: The Internet of Medical Things (IoMT) includes interconnected sensors, wearable devices, medical devices, and clinical systems. At the same time, skin cancer is one of the most common cancers worldwide. This study proposes a new segmentation-based classification model for skin lesion diagnosis by combining the GrabCut algorithm and an Adaptive Neuro-Fuzzy Classifier (ANFC). The proposed method involves four main steps: preprocessing, segmentation, feature extraction, and classification. Initially, the preprocessing step is carried out using a top-hat filter and an inpainting technique. Then, the GrabCut algorithm is used to segment the preprocessed images. Next, feature extraction takes place using a deep learning based Inception model. Finally, the adaptive neuro-fuzzy classifier (ANFC) is applied to classify the dermoscopic images into different classes. The proposed model is evaluated on the benchmark International Skin Imaging Collaboration (ISIC) dataset and the results are examined in terms of accuracy, sensitivity and specificity. The proposed model exhibits better identification and classification of skin cancer. To examine the effectiveness of the proposed technique, an extensive comparison with earlier models is carried out. The experimental results indicate that the proposed method achieves a maximum sensitivity of 93.40%, specificity of 98.70% and accuracy of 97.91%.

47 citations
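A hedged sketch of the first three steps described in the abstract above (top-hat/inpainting preprocessing, GrabCut segmentation, Inception feature extraction), using OpenCV and Keras. The adaptive neuro-fuzzy classifier has no standard library implementation and is therefore omitted; kernel sizes, thresholds and the rectangle initialisation are assumptions, not values from the paper.

# Sketch of preprocessing, segmentation and feature extraction for a
# dermoscopic image; the ANFC classification stage is not shown.
import cv2
import numpy as np
from tensorflow.keras.applications.inception_v3 import InceptionV3, preprocess_input

def remove_artifacts(img_bgr):
    # Black-hat filtering (top-hat of the complement) highlights dark,
    # hair-like structures, which are then inpainted away.
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (9, 9))
    hat = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)
    _, mask = cv2.threshold(hat, 10, 255, cv2.THRESH_BINARY)
    return cv2.inpaint(img_bgr, mask, 3, cv2.INPAINT_TELEA)

def segment_lesion(img_bgr):
    # GrabCut initialised with a central rectangle assumed to contain the lesion.
    mask = np.zeros(img_bgr.shape[:2], np.uint8)
    h, w = mask.shape
    rect = (w // 8, h // 8, 3 * w // 4, 3 * h // 4)
    bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
    cv2.grabCut(img_bgr, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype(np.uint8)
    return img_bgr * fg[:, :, None]

def extract_features(img_bgr):
    # Deep features from an ImageNet-pretrained Inception model (2048-dim vector).
    x = cv2.resize(cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB), (299, 299)).astype(np.float32)
    model = InceptionV3(weights="imagenet", include_top=False, pooling="avg")
    return model.predict(preprocess_input(x[None, ...]))[0]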

References
Book
01 Jan 1991
TL;DR: The authors examine the role of entropy, information inequalities, and randomness in the design and construction of codes.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 
11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cramér-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation. 12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov Complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types. 17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.

45,034 citations

Book
01 Jan 1973

20,541 citations

Journal Article
Carol L. Baird
TL;DR: A randomized controlled experiment is designed to test whether access to affordable day care (in the form of subsidies, for example) would incentivize Saudi mothers to search actively for employment and to remain employed once they are hired.
Abstract: This pilot aims to better understand the market for childcare in Saudi Arabia – both the supply and demand sides – and to design a randomized controlled experiment to test whether access to affordable day care (in the form of subsidies, for example) would incentivize Saudi mothers to search actively for employment and to remain employed once they are hired. In addition, the study seeks to understand the degree to which employment early on in one's life impacts employment in later stages. The pilot will provide information on the groups of women the experiment should target, appropriate levels for the childcare subsidy, and the quality and current geographic locations of daycare sites. Expected impact: determine the effects of facilitating childcare access on Saudi women's employment. Principal investigators: Patricia Cortes (Boston University), Claudia Goldin (Harvard University), and Jennifer Peck (Swarthmore College).

9,609 citations


Additional excerpts

  • $H(W \mid A = V) = \sum_{m} P(w_m \mid A = V) \log P(w_m \mid A = V)$  (4)


Journal ArticleDOI
TL;DR: The report showed a high prevalence of CVD in CKD and that mortality due to CVD was 10 to 30 times higher in dialysis patients than in the general population; the task force recommended that patients with CKD be considered in the “highest risk group” for subsequent CVD events.
Abstract: Chronic kidney disease1 (CKD) is a worldwide public health problem. In the United States, there is a rising incidence and prevalence of kidney failure, with poor outcomes and high cost. The number of individuals with kidney failure treated by dialysis and transplantation exceeded 320 000 in 1998 and is expected to surpass 650 000 by 2010.1,2 There is an even higher prevalence of earlier stages of CKD (Table 1).1,3 Kidney failure requiring treatment with dialysis or transplantation is the most visible outcome of CKD. However, cardiovascular disease (CVD) is also frequently associated with CKD, which is important because individuals with CKD are more likely to die of CVD than to develop kidney failure,4 CVD in CKD is treatable and potentially preventable, and CKD appears to be a risk factor for CVD. In 1998, the National Kidney Foundation (NKF) Task Force on Cardiovascular Disease in Chronic Renal Disease issued a report emphasizing the high risk of CVD in CKD.5 This report showed that there was a high prevalence of CVD in CKD and that mortality due to CVD was 10 to 30 times higher in dialysis patients than in the general population (Figure 1 and Table 2).6–18 The task force recommended that patients with CKD be considered in the “highest risk group” for subsequent CVD events and that treatment recommendations based on CVD risk stratification should take into account the highest-risk status of patients with CKD. (Table 1: Stages of CKD.) Figure 1. Cardiovascular mortality defined by death due to arrhythmias, cardiomyopathy, cardiac arrest, myocardial infarction, atherosclerotic heart disease, and pulmonary edema in general population (GP; National Center for Health Statistics [NCHS] multiple cause of mortality data files International Classification of Diseases, 9th Revision [ICD 9] codes 402, 404, 410 to 414, and …

4,037 citations

Journal ArticleDOI
TL;DR: The objective is to provide a generic introduction to variable elimination, which can be applied to a wide array of machine learning problems, focusing on Filter, Wrapper and Embedded methods.

3,517 citations
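Since this last reference surveys the Filter, Wrapper and Embedded families of feature selection that the featured paper builds on, here is a small hedged illustration of the three families using scikit-learn on synthetic data; it is an example only, not code from the survey.

# Sketch: one representative of each feature-selection family.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=400, n_features=24, n_informative=8, random_state=0)

# Filter: score features independently of any downstream classifier.
filter_sel = SelectKBest(mutual_info_classif, k=8).fit(X, y)
# Wrapper: repeatedly refit a model while eliminating features (RFE).
wrapper_sel = RFE(LogisticRegression(max_iter=1000), n_features_to_select=8).fit(X, y)
# Embedded: selection happens inside training, here via an L1 penalty.
embedded_sel = SelectFromModel(LogisticRegression(penalty="l1", solver="liblinear")).fit(X, y)

for name, sel in [("filter", filter_sel), ("wrapper", wrapper_sel), ("embedded", embedded_sel)]:
    print(name, sel.get_support().nonzero()[0])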