Author

Nz Jhanjhi

Bio: Nz Jhanjhi is an academic researcher from Taylor's University. The author has contributed to research in the topics of Computer science and Artificial intelligence. The author has an h-index of 4, has co-authored 47 publications, and has received 68 citations.

Papers published on a yearly basis

Papers
Journal ArticleDOI
TL;DR: A comprehensive framework is provided to help energy researchers and practitioners better understand 5G-aided Industry 4.0 infrastructure and energy resource optimization while improving privacy; the framework is evaluated using case studies and mathematical modelling.
Abstract: 5G is expected to revolutionize every sector of life by providing high-speed interconnectivity of everything, everywhere. However, massively interconnected devices and fast data transmission will bring challenges of privacy as well as energy deficiency. In today’s fast-paced economy, almost every sector depends on energy resources, yet the energy sector still relies mainly on fossil fuels, which constitute about 80% of energy globally. This massive extraction and combustion of fossil fuels has many adverse impacts on health, the environment, and the economy. The newly emerging 5G technology has changed existing ways of life by connecting everything everywhere using IoT devices. 5G-enabled IIoT devices have transformed traditional systems into smart ones, e.g. smart cities, smart healthcare, smart industry, and smart manufacturing. However, massive I/O technologies providing D2D connections have also created privacy issues that need to be addressed. Privacy is a fundamental right of every individual, and 5G industries and organizations need to preserve it for their stability and competency; therefore, privacy at all three levels (data, identity, and location) must be maintained. Further, energy optimization is a major challenge that must be addressed to leverage the potential benefits of 5G and 5G-aided IIoT: billions of IIoT devices expected to communicate over the 5G network will consume a considerable amount of energy while energy resources are limited. To fill these gaps, we provide a comprehensive framework that helps energy researchers and practitioners better understand 5G-aided Industry 4.0 infrastructure and energy resource optimization while improving privacy. The proposed framework is evaluated using case studies and mathematical modelling.
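As a rough illustration of the kind of energy modelling such a framework motivates (not taken from the paper), the sketch below compares an always-on IIoT fleet with a duty-cycled one; the device count, power draws, and duty cycle are all hypothetical.

```python
# Illustrative toy model (not from the paper): total energy for a fleet of
# 5G-connected IIoT devices, comparing an always-on baseline with a
# duty-cycled schedule. All parameter values are hypothetical.

def fleet_energy_wh(num_devices, active_power_w, idle_power_w, duty_cycle, hours):
    """Energy in watt-hours consumed by the whole fleet over `hours`."""
    per_device_w = duty_cycle * active_power_w + (1 - duty_cycle) * idle_power_w
    return num_devices * per_device_w * hours

baseline = fleet_energy_wh(1_000_000, active_power_w=2.0, idle_power_w=0.5,
                           duty_cycle=1.0, hours=24)
optimized = fleet_energy_wh(1_000_000, active_power_w=2.0, idle_power_w=0.5,
                            duty_cycle=0.2, hours=24)
print(f"baseline: {baseline / 1e6:.1f} MWh, optimized: {optimized / 1e6:.1f} MWh, "
      f"saving: {1 - optimized / baseline:.0%}")
```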

51 citations

Journal ArticleDOI
TL;DR: In this paper, the authors propose a COVID-19 detection system with the potential to detect COVID-19 at the initial stage by applying deep learning models to patients' symptoms and chest X-ray images, obtaining an average accuracy of 78.88%, specificity of 94%, and sensitivity of 77% on a testing dataset of 800 patients' X-ray images and 800 patients' symptoms.
Abstract: Accurate diagnosis of initial-stage COVID-19 is necessary to minimize its spreading rate. Physicians most often recommend RT-PCR tests, which are invasive, time-consuming, and ineffective in reducing the spread of COVID-19. This can be mitigated by using noninvasive and fast machine learning methods trained either on labeled patients’ symptoms or on medical images. Methods trained on labeled symptoms cannot differentiate between types of pneumonia such as COVID-19, viral pneumonia, and bacterial pneumonia because of similar symptoms, i.e., cough, fever, headache, sore throat, and shortness of breath. Methods trained on labeled medical images can overcome this limitation; however, they are incapable of detecting COVID-19 in the initial stage because COVID-19 infection takes 3 to 12 days to appear on images. This research proposes a COVID-19 detection system with the potential to detect COVID-19 in the initial stage by applying deep learning models to patients’ symptoms and chest X-ray images. The proposed system obtained an average accuracy of 78.88%, specificity of 94%, and sensitivity of 77% on a testing dataset containing 800 patients’ X-ray images and 800 patients’ symptoms, outperforming existing COVID-19 detection methods.
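For readers unfamiliar with the reported metrics, here is a minimal sketch (not the authors' code) of how accuracy, specificity, and sensitivity are computed from binary COVID / non-COVID predictions:

```python
# Minimal sketch: diagnostic metrics from a binary confusion matrix.
import numpy as np

def diagnostic_metrics(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
    }

# Toy labels just to exercise the function.
print(diagnostic_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1]))
```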

45 citations

Proceedings ArticleDOI
01 Dec 2019
TL;DR: The current research briefly presents the core concepts of security and privacy issues concerning smart cities and surveys recent cyber-attacks targeting smart cities based on the current literature.
Abstract: The increasing need for and implementation of Information Communication Technologies (ICT) in urban infrastructure has led to greater attention to smart cities. Smart cities use ICTs to enhance citizens' quality of life by improving the local economy, transport systems, and transport management, thereby providing a conducive environment for building strong relationships with public authorities. The overriding goal of a smart city is to improve the quality of life and services for its citizens. Nevertheless, the introduction of ICTs has raised various privacy and security issues for smart cities and the inhabitants residing in them. The current research briefly presents the core concepts of security and privacy issues concerning smart cities and reviews recent cyber-attacks targeting smart cities based on the current literature. Further, this research identifies and elaborates numerous security weaknesses and privacy challenges across various areas of cyber security, along with the associated issues, challenges, and recommendations, in order to provide future directions.

45 citations

Book ChapterDOI
01 Jan 2021
TL;DR: Computed tomography in patients with suspected COVID-19 pneumonia uses a high-resolution approach (HRCT), and artificial intelligence applications can help categorize the illness by severity and complement the structured report.
Abstract: A new coronavirus, SARS-CoV-2, has caused disease outbreaks in many countries, and the number of cases is increasing rapidly through person-to-person transmission. Clinical characteristics of SARS-CoV-2 patients are crucial for distinguishing them from patients with other respiratory infections, and symptomatic patients may also show pulmonary lesions on imaging. Computed tomography in patients with suspected COVID-19 pneumonia uses a high-resolution approach (HRCT). Artificial intelligence applications can help categorize the illness by severity and complement the structured report, which is organized according to subjective findings, with objective and quantitative assessments of the extent of the lesions. Reported data reflect global statistics in general. Using a colour map, this method identifies ground-glass opacity during post-processing, separates it from consolidation, and expresses each as a percentage of the lung.
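A hedged sketch of this kind of quantification, assuming a CT volume in Hounsfield units, a precomputed lung mask, and illustrative HU threshold ranges that are not taken from the chapter:

```python
# Illustrative only: express ground-glass and consolidation burden as
# percentages of the lung via Hounsfield-unit thresholds. Threshold ranges
# below are assumptions, not values from the chapter.
import numpy as np

def opacity_percentages(ct_hu, lung_mask,
                        ggo_range=(-750, -300),
                        consolidation_range=(-300, 50)):
    lung = ct_hu[lung_mask]                      # HU values inside the lung mask
    ggo = np.mean((lung > ggo_range[0]) & (lung <= ggo_range[1]))
    consolidation = np.mean((lung > consolidation_range[0]) &
                            (lung <= consolidation_range[1]))
    return 100 * ggo, 100 * consolidation

# Toy "volume" and mask just to exercise the function.
ct = np.random.randint(-1000, 100, size=(64, 64, 64))
mask = np.ones_like(ct, dtype=bool)
print(opacity_percentages(ct, mask))
```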

38 citations

Journal ArticleDOI
TL;DR: The experimental results indicate that the predictive accuracy of the DRL model trained on the temporal dataset is significantly better than that of other ML models trained only on the dataset at a specific snapshot in time.
Abstract: The prediction of hidden or missing links in a criminal network, which represent possible interactions between individuals, is a significant problem. Criminal network prediction models commonly rely on Social Network Analysis (SNA) metrics and leverage machine learning (ML) techniques to enhance predictive accuracy and processing speed. The problem with classical ML techniques, such as the support vector machine (SVM), is their dependency on the availability of large datasets for training. However, recent groundbreaking advances in deep reinforcement learning (DRL) research have produced methods for training ML models on self-generated datasets; in view of this, DRL can be applied to domains with relatively small datasets, such as criminal networks. Prior to this research, few, if any, works had explored the prediction of links within criminal networks that may appear and/or disappear over time by leveraging DRL. Therefore, the primary objective of this paper is to construct a time-based link prediction model (TDRL) by leveraging DRL to train on a relatively small real-world criminal dataset that evolves over time. The experimental results indicate that the predictive accuracy of the DRL model trained on the temporal dataset is significantly better than that of other ML models trained only on the dataset at a specific snapshot in time.
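The sketch below is not the paper's DRL model; it only illustrates the temporal link-prediction setting with a classical Jaccard-coefficient baseline on two hypothetical snapshots of a small graph:

```python
# Classical baseline for temporal link prediction: score non-edges of an
# earlier snapshot and check which appear in a later snapshot. Toy graphs only.
import networkx as nx

g_t0 = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)])   # snapshot at t0
g_t1 = nx.Graph(g_t0.edges)
g_t1.add_edge(1, 3)                                          # link appearing at t1

candidates = list(nx.non_edges(g_t0))
scores = sorted(nx.jaccard_coefficient(g_t0, candidates),
                key=lambda t: t[2], reverse=True)

for u, v, score in scores:
    print(f"({u}, {v}) score={score:.2f} appeared_at_t1={g_t1.has_edge(u, v)}")
```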

38 citations


Cited by
01 Jan 2020
TL;DR: Prolonged viral shedding provides the rationale for a strategy of isolation of infected patients and optimal antiviral interventions in the future.
Abstract: Background: Since December, 2019, Wuhan, China, has experienced an outbreak of coronavirus disease 2019 (COVID-19), caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Epidemiological and clinical characteristics of patients with COVID-19 have been reported, but risk factors for mortality and a detailed clinical course of illness, including viral shedding, have not been well described. Methods: In this retrospective, multicentre cohort study, we included all adult inpatients (≥18 years old) with laboratory-confirmed COVID-19 from Jinyintan Hospital and Wuhan Pulmonary Hospital (Wuhan, China) who had been discharged or had died by Jan 31, 2020. Demographic, clinical, treatment, and laboratory data, including serial samples for viral RNA detection, were extracted from electronic medical records and compared between survivors and non-survivors. We used univariable and multivariable logistic regression methods to explore the risk factors associated with in-hospital death. Findings: 191 patients (135 from Jinyintan Hospital and 56 from Wuhan Pulmonary Hospital) were included in this study, of whom 137 were discharged and 54 died in hospital. 91 (48%) patients had a comorbidity, with hypertension being the most common (58 [30%] patients), followed by diabetes (36 [19%] patients) and coronary heart disease (15 [8%] patients). Multivariable regression showed increasing odds of in-hospital death associated with older age (odds ratio 1·10, 95% CI 1·03–1·17, per year increase; p=0·0043) and higher Sequential Organ Failure Assessment (SOFA) score (5·65, 2·61–12·23). Interpretation: The potential risk factors of older age, high SOFA score, and d-dimer greater than 1 μg/mL could help clinicians to identify patients with poor prognosis at an early stage. Prolonged viral shedding provides the rationale for a strategy of isolation of infected patients and optimal antiviral interventions in the future. Funding: Chinese Academy of Medical Sciences Innovation Fund for Medical Sciences; National Science Grant for Distinguished Young Scholars; National Key Research and Development Program of China; The Beijing Science and Technology Project; and Major Projects of National Science and Technology on New Drug Creation and Development.
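As a methodological illustration only (synthetic data, not the study's records), the kind of multivariable logistic regression used here can be sketched as follows; the variable names and coefficients are assumptions:

```python
# Synthetic example of a multivariable logistic regression relating
# in-hospital death to age, SOFA score, and a d-dimer > 1 ug/mL indicator.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 191
age = rng.normal(56, 15, n)
sofa = rng.poisson(3, n)
ddimer_gt1 = rng.integers(0, 2, n)
logit = -8 + 0.09 * age + 0.5 * sofa + 1.5 * ddimer_gt1      # made-up effects
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, sofa, ddimer_gt1]))
model = sm.Logit(death, X).fit(disp=0)
odds_ratios = np.exp(model.params[1:])                       # per-unit odds ratios
print(dict(zip(["age", "SOFA", "d-dimer>1"], odds_ratios.round(2))))
```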

4,408 citations

Journal ArticleDOI
TL;DR: A comprehensive survey on the evolution, prevention, and mitigation of ransomware in the IoT context is provided and is expected to be useful for researchers and practitioners who are involved in developing solutions for IoT security.
Abstract: The Internet of Things (IoT) architecture is the integration of real-world objects and places with the internet. This boom in technology is bringing ease to our lifestyle and making formerly impossible things possible, and IoT plays a vital role in bridging this gap easily and rapidly. IoT is changing our lifestyle and the way technologies work by bringing them together across several application areas of daily life. However, IoT faces several challenges in the form of cyber scams, one of the major ones being the likelihood of ransomware attacks. Ransomware is a malicious kind of software that restricts access to vital information in some way and demands payment to regain access to this information. Ransomware attacks are becoming more widespread daily and bring disastrous consequences, including loss of sensitive data, loss of productivity, data destruction, loss of reputation, and business downtime, which further leads to millions of dollars in daily losses. It is therefore inevitable that organizations revise their annual cybersecurity goals and implement proper resilience and recovery plans to keep the business running. However, before proceeding towards a practical solution, there is a need to synthesize the existing data and statistics about this crucial attack to make researchers and practitioners aware of it. To fill this gap, this paper provides a comprehensive survey on the evolution, prevention, and mitigation of ransomware in the IoT context. This paper differs from existing work in several dimensions: firstly, it provides deeper insights into ransomware evolution in IoT; secondly, it discusses diverse aspects of ransomware attacks on IoT, including the various types of ransomware, current research on ransomware, existing techniques to prevent and mitigate ransomware attacks in IoT, ways to deal with an affected machine, the decision about whether or not to pay the ransom, and future emerging trends of ransomware propagation in IoT; thirdly, a summary of current research is provided to show various directions of research. In sum, this detailed survey is expected to be useful for researchers and practitioners who are involved in developing solutions for IoT security.

88 citations

Journal ArticleDOI
TL;DR: The proposed task allocation algorithm, Energy-aware Task Allocation in Multi-Cloud Networks (ETAMCN), minimizes overall energy consumption and also reduces the makespan, improving average energy consumption compared with existing allocation algorithms.
Abstract: In recent years, the growth rate of cloud computing technology has increased exponentially, mainly because of its extraordinary services: expanding computation power, the possibility of massive storage, and other services delivered with maintained quality of service (QoS). Task allocation is one of the best solutions for improving different performance parameters in the cloud, but when multiple heterogeneous clouds come into the picture, the allocation problem becomes more challenging. This research work proposes a resource-based task allocation algorithm, which is implemented and analyzed to understand the improved performance of the heterogeneous multi-cloud network. The proposed task allocation algorithm, Energy-aware Task Allocation in Multi-Cloud Networks (ETAMCN), minimizes the overall energy consumption and also reduces the makespan. The results show that the makespan approximately overlaps for different tasks and does not show a significant difference. However, the average energy consumption improvement through ETAMCN is approximately 14%, 6.3%, and 2.8% compared with the random allocation algorithm, the Cloud Z-Score Normalization (CZSN) algorithm, and the multi-objective scheduling algorithm with fuzzy resource utilization (FR-MOS), respectively. An observation of the average SLA violation of ETAMCN under different scenarios is also performed.
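The following is a hedged sketch of a greedy energy-aware allocator, not the ETAMCN algorithm itself; the task lengths, cloud speeds, power figures, and the 0.5/0.5 weighting of energy against completion time are all assumptions for illustration:

```python
# Greedy energy-aware task allocation across heterogeneous clouds (illustrative).
tasks = [40, 25, 60, 10, 35, 50]      # task lengths (millions of instructions)
clouds = {                             # per-cloud speed (MIPS) and power (W)
    "cloud_a": {"speed": 10, "power": 30},
    "cloud_b": {"speed": 20, "power": 80},
    "cloud_c": {"speed": 15, "power": 45},
}

finish = {c: 0.0 for c in clouds}      # accumulated busy time per cloud (s)
energy = {c: 0.0 for c in clouds}      # accumulated energy per cloud (J)

for length in sorted(tasks, reverse=True):
    def cost(c):
        runtime = length / clouds[c]["speed"]
        # arbitrary 0.5/0.5 trade-off between energy for this task and its completion time
        return 0.5 * clouds[c]["power"] * runtime + 0.5 * (finish[c] + runtime)
    best = min(clouds, key=cost)
    runtime = length / clouds[best]["speed"]
    finish[best] += runtime
    energy[best] += clouds[best]["power"] * runtime

print("makespan:", round(max(finish.values()), 2), "s")
print("total energy:", round(sum(energy.values()), 2), "J")
```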

59 citations

Journal ArticleDOI
TL;DR: This paper presents a comprehensive review of various load balancing techniques for static, dynamic, and nature-inspired cloud environments to address data center response time and overall performance.
Abstract: Cloud computing is a robust model that allows users and organizations to purchase the services they require according to their needs. The model offers many services, such as storage, platforms for deployment, and convenient access to web services. Load balancing is a common issue in the cloud that makes it hard to maintain application performance against the quality of service (QoS) measurements and the service level agreement (SLA) document that cloud providers owe to enterprises. Cloud providers struggle to distribute the workload equally among servers. An efficient load balancing technique should utilize the resources of VMs efficiently and ensure high user satisfaction. This paper presents a comprehensive review of various load balancing techniques for static, dynamic, and nature-inspired cloud environments to address data center response time and overall performance. An analytical review of the algorithms is provided, and research gaps are identified for future work in this domain. This review also provides a graphical representation of the reviewed algorithms to highlight their operational flow. Additionally, it presents a fault-tolerant framework and explores other existing frameworks in the recent literature.
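To make the response-time concern concrete, here is a small illustrative simulation (assumptions only, not from the reviewed work) comparing round-robin and least-loaded dispatching when a batch of requests with varying service times arrives at once:

```python
# Each request's response time = work already queued on its server + its own
# service time. Service-time distribution and server count are hypothetical.
import random

random.seed(1)
requests = [random.expovariate(1.0) for _ in range(200)]   # service times (s)

def avg_response(assign, n_servers=4):
    load = [0.0] * n_servers        # outstanding work per server (s)
    total = 0.0
    for i, service in enumerate(requests):
        s = assign(i, load)
        total += load[s] + service  # wait behind the queue + own service time
        load[s] += service
    return total / len(requests)

round_robin = lambda i, load: i % len(load)
least_loaded = lambda i, load: min(range(len(load)), key=load.__getitem__)

print("round-robin  avg response:", round(avg_response(round_robin), 2), "s")
print("least-loaded avg response:", round(avg_response(least_loaded), 2), "s")
```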

54 citations

Journal ArticleDOI
TL;DR: A test case selection and prioritization framework using a design pattern is proposed to increase the fault detection rate, and it is shown that the proposed framework successfully verifies changes.
Abstract: To survive in a competitive environment, most organizations have adopted component-based software development strategies in this era of rapid technological advancement, together with proper utilization of cloud-based services, to facilitate continuous configuration, reduce complexity, and deliver systems faster for higher user satisfaction in dynamic scenarios. In cloud services, customers select services from web applications dynamically. Healthcare body sensors are commonly used for diagnosis and for monitoring patients continuously for emergency treatment; these healthcare devices are connected to mobile phones, laptops, and similar devices over a networked cloud environment in which applications change frequently. Thus, organizations rely on regression testing during changes and implementation to validate the quality and reliability of the system after alteration. However, for a large application with limited resources and frequently changing component management activities in the cloud computing environment, component-based system verification is difficult and challenging due to irrelevant and redundant test cases and faults. This study proposes a test case selection and prioritization framework that uses a design pattern to increase the fault detection rate: first, test cases are selected for frequently accessed components using the observer pattern, and secondly, test cases are prioritized by adopting several strategies. The proposed framework was validated by an experiment and compared with other techniques (previous-faults-based and random priority). The experimental results show that the proposed framework successfully verifies changes and increases the fault detection rate (i.e., more than 90%) compared with the previous-faults-based and random-priority techniques (i.e., more than 80% each).
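A minimal sketch of how a prioritized test order can be scored against the original order using the standard APFD (Average Percentage of Faults Detected) metric; the test-to-fault mapping below is hypothetical and not from the paper's experiment:

```python
# APFD for a prioritized order vs the original order, given which faults
# each test reveals. Data are made up for illustration.
fault_matrix = {                     # test id -> faults it reveals
    "t1": {"f3"}, "t2": {"f1", "f2"}, "t3": set(),
    "t4": {"f2", "f4"}, "t5": {"f5"},
}
all_faults = set().union(*fault_matrix.values())

def apfd(order):
    n, m = len(order), len(all_faults)
    first_pos = {}                   # fault -> position of first detecting test
    for pos, test in enumerate(order, start=1):
        for f in fault_matrix[test]:
            first_pos.setdefault(f, pos)
    return 1 - sum(first_pos[f] for f in all_faults) / (n * m) + 1 / (2 * n)

print("original order   :", round(apfd(["t1", "t2", "t3", "t4", "t5"]), 3))
print("prioritized order:", round(apfd(["t2", "t4", "t5", "t1", "t3"]), 3))
```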

52 citations