Author

Muhammad R. Ahmed

Bio: Muhammad R. Ahmed is an academic researcher from the University of Canberra. The author has contributed to research on the topics of wireless sensor networks and key distribution in wireless sensor networks. The author has an h-index of 9 and has co-authored 39 publications receiving 391 citations. Previous affiliations of Muhammad R. Ahmed include King Abdulaziz University and the University of Portsmouth.

Papers
Journal ArticleDOI
TL;DR: Results obtained by simulating the framework indicate that the designed network, via its various components, can achieve high QoS with reduced end-to-end latency and packet drop rate, which is essential for developing next-generation e-healthcare systems.
Abstract: Rapid developments in the fields of information and communication technology and microelectronics have allowed seamless interconnection among various devices, letting them communicate with each other. This technological integration has opened up new possibilities in many disciplines, including healthcare and well-being. With the aim of reducing healthcare costs and providing improved and reliable services, several healthcare frameworks based on the Internet of Healthcare Things (IoHT) have been developed. However, due to the critical and heterogeneous nature of healthcare data, maintaining high quality of service (QoS), in terms of faster responsiveness and data-specific complex analytics, has always been the main challenge in designing such systems. Addressing these issues, this paper proposes a five-layered heterogeneous mist-, fog-, and cloud-based IoHT framework capable of efficiently handling and routing (near-)real-time as well as offline/batch-mode data. Also, by employing software-defined networking and link adaptation-based load balancing, the framework ensures optimal resource allocation and efficient resource utilization. The results, obtained by simulating the framework, indicate that the designed network, via its various components, can achieve high QoS with reduced end-to-end latency and packet drop rate, which is essential for developing next-generation e-healthcare systems.

147 citations
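
The tiering idea above lends itself to a small illustration. The following Python sketch routes readings to a mist, fog, or cloud tier under assumed latency classes and tier names; the paper's actual five layers, SDN controller, and link-adaptation load balancing are not reproduced here.

```python
# Hypothetical sketch: latency-critical health data is served by the
# mist/fog tiers, offline/batch data is sent to the cloud tier.
from dataclasses import dataclass

@dataclass
class HealthReading:
    patient_id: str
    kind: str              # e.g. "ecg", "spo2", "activity_log"
    latency_critical: bool

def route(reading: HealthReading) -> str:
    """Pick a processing tier; stands in for the paper's SDN-based logic."""
    if reading.latency_critical:
        return "mist"      # at/near the device: immediate alerting
    if reading.kind in {"ecg", "spo2"}:
        return "fog"       # local gateway: near-real-time analytics
    return "cloud"         # data center: offline/batch analytics

print(route(HealthReading("p1", "ecg", latency_critical=True)))  # -> mist
print(route(HealthReading("p2", "activity_log", False)))         # -> cloud
```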

Journal ArticleDOI
TL;DR: It is found that protocols for UWSNs are highly selective, with the fitness of any protocol depending solely on the application and design requirements, which this review addresses.

73 citations

Journal ArticleDOI
TL;DR: This work proposes an improved-performance scheme for nanocommunication over terahertz bands in wireless BSNs, making it suitable for smart e-health applications, and demonstrates the efficiency of the proposed scheme through maximized energy utilization in both single-hop and multihop communications.
Abstract: Current developments in nanotechnology make electromagnetic communication possible at the nanoscale for applications involving body sensor networks (BSNs). This specialized branch of wireless sensor networks, drawing attention from diverse fields such as engineering, medicine, biology, physics, and computer science, has emerged as an important research area contributing to medical treatment, social welfare, and sports. The concept is based on the interaction of integrated nanoscale machines by means of wireless communications. One key hurdle for advancing nanocommunications is the lack of an apposite networking protocol to address the upcoming needs of nanonetworks. Recently, some key challenges in designing protocols for wireless nanosensor networks have been identified, such as nanonodes with extreme energy constraints, limited computational capabilities, and terahertz frequency bands with limited transmission range. This work proposes an improved-performance scheme for nanocommunication over terahertz bands in wireless BSNs, making it suitable for smart e-health applications. The scheme contains: a new energy-efficient forwarding routine for electromagnetic communication in wireless nanonetworks, consisting of hybrid clusters with centralized scheduling; a channel-behavior model taking into account the aggregated impact of molecular absorption, spreading loss, and shadowing; and an energy model for energy harvesting and consumption. The outage probability is derived for both single links and multilinks and extended to determine the outage capacity; for a multilink, it is derived using a cooperative fusion technique at a predefined fusion node. Simulated using the nano-sim simulator, the proposed model has been evaluated for energy efficiency, outage capacity, and outage probability. The results demonstrate the efficiency of the proposed scheme through maximized energy utilization in both single-hop and multihop communications; multisensor fusion at the fusion node further enhances the link quality of the transmission.

69 citations
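
For intuition on the channel and outage quantities named above, here is an illustrative Python sketch, not the paper's exact model: it assumes a free-space spreading term, a single molecular-absorption coefficient, and Rayleigh-style fading for the outage probability, with all numeric values invented.

```python
# Illustrative THz link budget and outage sketch. Spreading loss follows
# free space, (4*pi*f*d/c)^2; molecular absorption is modeled as exp(k*d)
# with an assumed absorption coefficient k.
import math

C = 3e8  # speed of light, m/s

def path_loss_db(f_hz: float, d_m: float, k_abs: float) -> float:
    spreading = (4 * math.pi * f_hz * d_m / C) ** 2
    absorption = math.exp(k_abs * d_m)
    return 10 * math.log10(spreading * absorption)

def outage_prob(mean_snr: float, snr_threshold: float) -> float:
    """Rayleigh fading: SNR is exponential, P(SNR < th) = 1 - exp(-th/mean)."""
    return 1.0 - math.exp(-snr_threshold / mean_snr)

def multilink_outage(p_single: float, n_links: int) -> float:
    """Assumes fusion succeeds unless all n independent links fail."""
    return p_single ** n_links

print(f"{path_loss_db(1e12, 0.05, k_abs=5.0):.1f} dB at 1 THz over 5 cm")
p1 = outage_prob(mean_snr=10.0, snr_threshold=3.0)
print(f"single-link outage: {p1:.3f}")
print(f"3-link fused outage: {multilink_outage(p1, 3):.5f}")
```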

Journal ArticleDOI
TL;DR: A novel pipeline for fall detection based on wearable accelerometer data is proposed; three publicly available datasets are used to validate the method, and more than 7700 cross-disciplinary time-series features are investigated for each dataset.
Abstract: Falls cause trauma or critical injury among the geriatric population and are the second leading accidental cause of post-injury mortality around the world. It is crucial to keep elderly people under supervision while ensuring proper privacy and comfort; thus, elderly fall detection and prediction using wearable/non-wearable sensors has become an active field of research. In this work, a novel pipeline for fall detection based on wearable accelerometer data is proposed. Three publicly available datasets have been used to validate the proposed method, and more than 7700 cross-disciplinary time-series features were investigated for each of the datasets. After applying a series of feature-reduction techniques (mutual information, removal of highly correlated features using the Pearson correlation coefficient, and the Boruta algorithm), we obtained the dominant features for each dataset. Different classical machine learning (ML) algorithms were utilized to detect falls based on the obtained features. For the individual datasets, the simple ML classifiers achieved very good accuracy. To show the generalization capability of the proposed pipeline, we trained it on two of the three datasets and tested on the remaining one, rotating until each of the three datasets had served as the test set. A set of 39 high-performing features was selected, and the classifiers were trained with them. In all cases, the proposed pipeline showed excellent efficiency in detecting falls. This architecture performed better than most existing works on all the publicly available datasets used, demonstrating the strength of the proposed data-analysis pipeline.

56 citations
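
As a rough illustration of the feature-reduction chain named above (mutual information, then Pearson-correlation pruning; the Boruta step is omitted for brevity), the following sketch uses synthetic data in place of the extracted time-series features, and its thresholds are invented for demonstration.

```python
# Sketch of the feature-reduction steps, on synthetic data standing in
# for the >7700 extracted time-series features.
import numpy as np
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=50, random_state=0)
X = pd.DataFrame(X, columns=[f"feat_{i}" for i in range(50)])

# 1) keep features with non-trivial mutual information with the label
mi = mutual_info_classif(X, y, random_state=0)
X = X.loc[:, mi > 0.01]

# 2) drop one of every pair of highly correlated features (|r| > 0.9)
corr = X.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
X = X.drop(columns=[c for c in upper.columns if (upper[c] > 0.9).any()])

# 3) train a classical classifier on the surviving features
clf = RandomForestClassifier(random_state=0).fit(X, y)
print(f"{X.shape[1]} features kept, train accuracy {clf.score(X, y):.2f}")
```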

Journal ArticleDOI
TL;DR: A comprehensive layer-wise survey of IoT security threats and of the AI-based security models that impede them is presented, and open challenges and future research directions for safeguarding the IoT network are addressed.
Abstract: The Internet of Things (IoT) has emerged as a technology capable of connecting heterogeneous nodes/objects, such as people, devices, and infrastructure, and it makes our daily lives simpler, safer, and more fruitful. Being part of a large network of heterogeneous devices, these nodes are typically resource-constrained and have become the weakest link for the cyber attacker. Classical encryption techniques have been employed to ensure the data security of the IoT network; however, high-level encryption techniques cannot be employed in IoT devices due to their limited resources. In addition, node security is still a challenge for network engineers. Thus, we need to explore a complete solution for IoT networks that can ensure both node and data security. Rule-based approaches and shallow and deep machine learning algorithms, branches of Artificial Intelligence (AI), can be employed as countermeasures along with the existing network security protocols. This paper presents a comprehensive layer-wise survey on IoT security threats and the AI-based security models to impede them. Finally, open challenges and future research directions are addressed for the safeguarding of the IoT network.

41 citations
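
A minimal sketch of that layered idea, combining one hypothetical rule with a shallow ML anomaly detector over invented per-flow features; it is not a model from the survey.

```python
# Rule layer + shallow ML layer over hypothetical per-flow features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# columns: packets/sec, mean payload bytes, distinct destination ports
normal_flows = rng.normal([50, 200, 3], [10, 40, 1], size=(500, 3))
detector = IsolationForest(random_state=0).fit(normal_flows)

def is_suspicious(flow: np.ndarray) -> bool:
    # rule layer: a port-scan-like burst of distinct destination ports
    if flow[2] > 100:
        return True
    # ML layer: flag flows the model scores as outliers (predict == -1)
    return detector.predict(flow.reshape(1, -1))[0] == -1

print(is_suspicious(np.array([52.0, 190.0, 3.0])))    # typical -> False
print(is_suspicious(np.array([900.0, 20.0, 150.0])))  # flood/scan -> True
```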


Cited by
Journal ArticleDOI
TL;DR: In this article, the authors divide Edge Intelligence into two categories: AI for edge (Intelligence-enabled Edge Computing, IEC), which applies popular and effective AI technologies to provide better solutions to key problems in Edge Computing, and AI on edge (Artificial Intelligence on Edge), which studies how to carry out model training and inference on the edge.
Abstract: Along with the rapid developments in communication technologies and the surge in the use of mobile devices, a brand-new computation paradigm, Edge Computing, is surging in popularity. Meanwhile, Artificial Intelligence (AI) applications are thriving with the breakthroughs in deep learning and the many improvements in hardware architectures. Billions of data bytes, generated at the network edge, put massive demands on data processing and structural optimization. Thus, there exists a strong demand to integrate Edge Computing and AI, which gives birth to Edge Intelligence. In this paper, we divide Edge Intelligence into AI for edge (Intelligence-enabled Edge Computing) and AI on edge (Artificial Intelligence on Edge). The former focuses on providing more optimal solutions to key problems in Edge Computing with the help of popular and effective AI technologies while the latter studies how to carry out the entire process of building AI models, i.e., model training and inference, on the edge. This paper provides insights into this new inter-disciplinary field from a broader perspective. It discusses the core concepts and the research road-map, which should provide the necessary background for potential future research initiatives in Edge Intelligence.

362 citations

Journal ArticleDOI
TL;DR: A survey of UWSNs is conducted covering the underwater communication channel, environmental factors, localization, media access control, routing protocols, and the effect of packet size on communication.
Abstract: Underwater Wireless Sensor Networks (UWSNs) contain several components, such as vehicles and sensors, that are deployed in a specific acoustic area to perform collaborative monitoring and data collection tasks. These networks are used interactively between different nodes and ground-based stations. Presently, UWSNs face issues and challenges regarding limited bandwidth, high propagation delay, 3D topology, media access control, routing, resource utilization, and power constraints. In the last few decades, the research community has provided different methodologies to overcome these issues and challenges; however, some of them are still open for research due to the variable characteristics of the underwater environment. In this paper, a survey of UWSNs is conducted covering the underwater communication channel, environmental factors, localization, media access control, routing protocols, and the effect of packet size on communication. We compare the presently available methodologies and discuss their pros and cons to highlight new directions of research for further improvement in underwater sensor networks.

201 citations
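
The "high propagation delay" challenge is easy to quantify: sound travels underwater at roughly 1500 m/s, about 200,000 times slower than radio. The sketch below, an illustrative stop-and-wait model rather than anything from the survey itself, also shows why packet size matters over acoustic links.

```python
# Back-of-envelope acoustic-delay numbers; all link parameters invented.
def acoustic_delay_s(distance_m: float, sound_speed: float = 1500.0) -> float:
    return distance_m / sound_speed

def stop_and_wait_throughput(packet_bits: int, rate_bps: float,
                             dist_m: float) -> float:
    """Effective throughput when each packet waits for an ACK over one hop."""
    rtt = 2 * acoustic_delay_s(dist_m)
    tx_time = packet_bits / rate_bps
    return packet_bits / (tx_time + rtt)

print(f"one-way delay over 1 km: {acoustic_delay_s(1000):.2f} s")
# larger packets amortize the round trip, which is why packet size matters:
for bits in (512, 4096, 32768):
    print(bits, f"{stop_and_wait_throughput(bits, 10_000, 1000):.0f} bps")
```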

Journal ArticleDOI
TL;DR: A taxonomy based on key enabling technologies, use cases, emerging machine learning schemes, communication technologies, networking technologies, and computing technologies is devised, and practical guidelines to cope with open research challenges are proposed.
Abstract: Internet of Everything (IoE)-based smart services are expected to gain immense popularity in the future, which raises the need for next-generation wireless networks. Although fifth-generation (5G) networks can support various IoE services, they might not be able to completely fulfill the requirements of novel applications. Sixth-generation (6G) wireless systems are envisioned to overcome 5G network limitations. In this article, we explore recent advances made toward enabling 6G systems. We devise a taxonomy based on key enabling technologies, use cases, emerging machine learning schemes, communication technologies, networking technologies, and computing technologies. Furthermore, we identify and discuss open research challenges, such as artificial-intelligence-based adaptive transceivers, intelligent wireless energy harvesting, decentralized and secure business models, intelligent cell-less architecture, and distributed security models. We propose practical guidelines, including deep Q-learning and federated learning-based transceivers, blockchain-based secure business models, homomorphic encryption, and distributed-ledger-based authentication schemes, to cope with these challenges. Finally, we outline and recommend several future directions.

185 citations
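
As one concrete reading of the "federated learning-based transceivers" guideline, the sketch below implements plain federated averaging on a toy linear model; the device counts, data, and learning rate are all invented for illustration.

```python
# Federated averaging: devices train locally, only weights are shared.
import numpy as np

def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.1) -> np.ndarray:
    """One gradient step of linear least squares on a device's own data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def fed_avg(updates: list[np.ndarray], sizes: list[int]) -> np.ndarray:
    """Server aggregates device models weighted by local dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
global_w = np.zeros(2)
for _ in range(50):            # communication rounds
    updates, sizes = [], []
    for n in (20, 35, 50):     # three devices with different data volumes
        X = rng.normal(size=(n, 2))
        y = X @ true_w
        updates.append(local_update(global_w.copy(), X, y))
        sizes.append(n)
    global_w = fed_avg(updates, sizes)
print(global_w)  # converges toward [1, -2] without pooling raw data
```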

Journal ArticleDOI
Yong Chen
TL;DR: In this paper, an intensive literature review of industrial information integration engineering (IIIE) is presented, offering an overview of IIIE's content, scope, and findings, as well as potential research opportunities.

180 citations