Author

Mohamed Hadi Habaebi

Bio: Mohamed Hadi Habaebi is an academic researcher from International Islamic University Malaysia. The author has contributed to research on topics including Wireless network and Cellular network. The author has an h-index of 15 and has co-authored 268 publications receiving 1,119 citations. Previous affiliations of Mohamed Hadi Habaebi include International Islamic University, Islamabad and Islamic University.


Papers
Journal ArticleDOI
TL;DR: This paper examines threat-mitigation approaches in IoT through an autonomic taxonomy and concludes by setting out future research directions.

132 citations

Journal ArticleDOI
TL;DR: The state-of-the-art application of Blockchain in 5G networks is reviewed, exploring how it can facilitate the enabling technologies of 5G and beyond to enable various services at the front-haul, edge and core.
Abstract: Until now, every evolution of communication standards was driven by the need to provide high-speed connectivity to the end user. However, 5G marks a radical shift from this focus, as 5G and beyond networks are being designed to be future-proof by catering to the diverse requirements of several use cases. These requirements include Ultra-Reliable Low-Latency Communications, Massive Machine-Type Communications and Enhanced Mobile Broadband. To realize such features in 5G and beyond, there is a need to rethink how current cellular networks are deployed, because designing new radio access technologies and utilizing new spectrum are not enough. Several technologies, such as software-defined networking, network function virtualization, machine learning and cloud computing, are being integrated into 5G networks to fulfil these diverse requirements. These technologies, however, give rise to several challenges associated with decentralization, transparency, interoperability, privacy and security. To address these issues, Blockchain has emerged as a potential solution due to capabilities such as transparency, data encryption, auditability, immutability and distributed architecture. In this paper, we review the state-of-the-art application of Blockchain in 5G networks and explore how it can facilitate the enabling technologies of 5G and beyond to enable various services at the front-haul, edge and core. Based on the review, we present a taxonomy of Blockchain applications in 5G networks and discuss several issues that can be solved using Blockchain integration. We then present various field trials and proofs of concept that use Blockchain to address the challenges faced in current 5G deployments. Finally, we discuss the challenges that must be addressed to realize the full potential of Blockchain in beyond-5G networks. The survey presents a broad range of ideas related to Blockchain integration in 5G and beyond networks, addressing issues such as interoperability, security, mobility, resource allocation, resource sharing and management, energy efficiency and other desirable features.
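As a concrete anchor for the immutability and auditability properties the survey leans on, the following minimal Python sketch hash-chains records so that tampering with any earlier entry invalidates every later link. It is an illustrative toy, assuming an invented spectrum-lease payload, not any deployment or protocol described in the paper.

```python
# A toy hash chain illustrating blockchain immutability/auditability;
# real 5G integrations add consensus, smart contracts and permissioning.
import hashlib
import json
import time

def make_block(prev_hash, payload):
    """Link a record to its predecessor by hashing both together."""
    body = {"prev": prev_hash, "time": time.time(), "payload": payload}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

# Hypothetical auditable log of a spectrum-sharing agreement between operators:
genesis = make_block("0" * 64, {"event": "genesis"})
lease = make_block(genesis["hash"], {"event": "lease", "band": "n78",
                                     "from": "operatorA", "to": "operatorB"})
# Any change to the genesis payload changes its hash and breaks the link:
assert lease["prev"] == genesis["hash"]
print(lease["hash"])
```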

51 citations

Journal ArticleDOI
TL;DR: Results indicate that the best CQI algorithm outperforms the other algorithms in terms of throughput, but at the expense of fairness to users suffering from bad channel conditions.
Abstract: Long-Term Evolution (LTE) is a recently evolved technology characterized by very high data rates that allow users to access the internet through their mobile phones as well as through other electronic devices. The technology is intended to support a variety of IP-based heterogeneous traffic types. Traffic scheduling plays an important role in LTE by assigning the shared resources among users in the most efficient manner. This paper discusses the performance of three scheduling algorithms, namely the Round Robin, best Channel Quality Indicator (CQI) and Proportional Fair (PF) schedulers, which represent the extreme cases in scheduling. The downlink performance of the scheduling algorithms was measured in terms of throughput and block error rate using a MATLAB-based system-level simulation. Results indicate that the best CQI algorithm outperforms the other algorithms in terms of throughput, but at the expense of fairness to users suffering from bad channel conditions. KEYWORDS: LTE; round robin; best CQI; proportional fair; scheduling; resource blocks
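The contrast among the three schedulers reduces to the per-resource-block selection metric each one maximizes. The following minimal Python sketch shows the standard textbook form of each rule; it is not the paper's MATLAB simulator, and the rate and average-throughput numbers are invented for illustration.

```python
# Textbook per-resource-block selection rules for the three schedulers.
import numpy as np

def schedule_rb(inst_rate, avg_thr, last_served, policy):
    """Pick the user index to serve on one resource block."""
    if policy == "round_robin":
        # Serve users cyclically, ignoring channel quality.
        return (last_served + 1) % len(inst_rate)
    if policy == "best_cqi":
        # Always serve the user with the best instantaneous channel.
        return int(np.argmax(inst_rate))
    if policy == "proportional_fair":
        # Trade throughput for fairness: instantaneous rate relative
        # to each user's exponentially averaged past throughput.
        return int(np.argmax(inst_rate / np.maximum(avg_thr, 1e-9)))
    raise ValueError(policy)

# Example: user 2 has the best channel, so best CQI always picks it,
# while PF favors user 1, whose rate is high relative to its average.
rates = np.array([1.0, 2.5, 6.0, 0.5])   # achievable rates this TTI (Mb/s)
avg = np.array([1.0, 1.0, 5.0, 0.4])     # long-term averages (Mb/s)
for p in ("round_robin", "best_cqi", "proportional_fair"):
    print(p, "->", schedule_rb(rates, avg, last_served=0, policy=p))
```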

41 citations

Proceedings ArticleDOI
16 Jul 2012
TL;DR: An outline of carrier aggregation is provided, including the aggregation structure, deployment scenarios, implementation, main design features and backward compatibility with legacy LTE systems.
Abstract: Long Term Evolution-Advanced (LTE-Advanced) provides considerably higher data rates than even early releases of LTE. One key enhancement is bandwidth extension through multicarrier technology to support deployment bandwidths of up to 100 MHz. In order to achieve peak data rates of up to 1 Gb/s in IMT-Advanced mobile systems, carrier aggregation technology was introduced by the 3GPP to support very-high-data-rate transmissions over wide frequency bandwidths (e.g., up to 100 MHz) in its new LTE-Advanced standards. Carrier aggregation (CA) allows scalable expansion of the effective bandwidth provided to a user terminal through simultaneous utilization of radio resources across multiple carriers. CA in LTE-Advanced is designed to support aggregation of a variety of arrangements of component carriers (CCs), including CCs of the same or different bandwidths, contiguous or non-contiguous CCs in the same frequency band, and CCs in different frequency bands. CA is supported by both formats of LTE, namely the Frequency Division Duplex (FDD) and Time Division Duplex (TDD) variants, which guarantees that both FDD LTE and TDD LTE can meet the high data-throughput requirements placed upon them. This paper provides an outline of carrier aggregation, including the aggregation structure, deployment scenarios, implementation, main design features and backward compatibility with legacy LTE systems.
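The aggregation arrangements described above come down to simple arithmetic over component-carrier bandwidths. Here is a minimal Python sketch, assuming the standard LTE channel bandwidths and the Release-10 ceiling of five component carriers (5 x 20 MHz = 100 MHz):

```python
# Carrier-aggregation arithmetic under assumed LTE-Advanced limits.
LTE_BANDWIDTHS_MHZ = {1.4, 3, 5, 10, 15, 20}  # valid per-CC bandwidths
MAX_CCS = 5                                   # Release-10 ceiling

def aggregate(ccs_mhz):
    """Validate a component-carrier arrangement and return total bandwidth."""
    if len(ccs_mhz) > MAX_CCS:
        raise ValueError("at most five component carriers may be aggregated")
    for bw in ccs_mhz:
        if bw not in LTE_BANDWIDTHS_MHZ:
            raise ValueError(f"{bw} MHz is not a standard LTE bandwidth")
    return sum(ccs_mhz)

# CCs may have the same or different bandwidths, and may be contiguous
# or non-contiguous, within one band or across bands:
print(aggregate([20, 20, 20, 20, 20]))  # 100 MHz, the maximum
print(aggregate([10, 15, 20]))          # 45 MHz across three CCs
```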

38 citations

Journal ArticleDOI
TL;DR: This paper systematically reviews related works employing thermography with AI, highlighting their contributions and drawbacks and proposing open issues for research.
Abstract: Breast cancer plays a significant role in female mortality, and researchers are actively seeking to develop methods for its early detection. Several technologies have contributed to the reduction in the mortality rate from this disease, but early detection contributes most to preventing disease spread, breast amputation and death. Thermography is a promising technology for early diagnosis, as the thermal cameras employed offer high resolution and sensitivity. The combination of Artificial Intelligence (AI) with thermal images is an effective tool for detecting early-stage breast cancer and is foreseen to provide impressive levels of predictability. This paper systematically reviews the related works employing thermography with AI, highlighting their contributions and drawbacks and proposing open issues for research. Several types of Artificial Neural Networks (ANNs) and deep learning models have been used in the literature to process thermographic images of breast cancer, such as Radial Basis Function Network (RBFN), K-Nearest Neighbors (KNN), Probabilistic Neural Network (PNN), Support Vector Machine (SVM), ResNet50, SeResNet50, V-Net, Bayes Net, Convolutional Neural Networks (CNN), Convolutional and DeConvolutional Neural Networks (C-DCNN), VGG-16, Hybrid (ResNet-50 and V-Net), ResNet101, DenseNet and InceptionV3. Previous studies were found to be limited by the varying numbers of thermal images used, drawn mostly from the DMR-IR database. In addition, analysis of the literature indicates that several factors affect the performance of the neural network used, such as the database, the optimization method, the network model and the extracted features. However, owing to the small sample sizes used, most of the studies reported classification accuracies between 80% and 100%.
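To make the modelling setup concrete, the sketch below shows the general shape of a CNN binary classifier of the kind the surveyed studies train on breast thermograms. The architecture, the 224x224 input size and the two-class output are illustrative assumptions, not a reconstruction of any specific reviewed model.

```python
# A minimal PyTorch CNN for thermogram classification (illustrative only).
import torch
import torch.nn as nn

class ThermogramCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 2),  # two assumed classes: healthy vs. suspicious
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = ThermogramCNN()
logits = model(torch.randn(1, 3, 224, 224))  # one RGB-mapped thermogram
print(logits.shape)                          # torch.Size([1, 2])
```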

37 citations


Cited by
01 Jan 2016

733 citations

Journal ArticleDOI
TL;DR: A survey of IDS research efforts for IoT is presented to identify leading trends, open issues and future research possibilities, and the IDSs proposed in the literature are classified according to the following attributes: detection method, IDS placement strategy, security threat and validation strategy.

675 citations

Book
01 Dec 1981

609 citations

Journal ArticleDOI
TL;DR: A novel intrusion detection model based on two-layer dimension reduction and a two-tier classification module, designed to detect malicious activities such as User-to-Root (U2R) and Remote-to-Local (R2L) attacks, is presented.
Abstract: With increasing reliance on Internet of Things (IoT) devices and services, the capability to detect intrusions and malicious activities within IoT networks is critical for the resilience of the network infrastructure. In this paper, we present a novel model for intrusion detection based on two-layer dimension reduction and a two-tier classification module, designed to detect malicious activities such as User-to-Root (U2R) and Remote-to-Local (R2L) attacks. The proposed model uses component analysis and linear discriminant analysis in the dimension-reduction module to reduce the high-dimensional dataset to a lower-dimensional one with fewer features. We then apply a two-tier classification module utilizing Naive Bayes and a Certainty-Factor version of K-Nearest Neighbor to identify suspicious behaviors. Experimental results on the NSL-KDD dataset show that our model outperforms previous models designed to detect U2R and R2L attacks.
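The pipeline shape is straightforward to reproduce. Below is a scikit-learn sketch, assuming PCA followed by LDA for the two reduction layers, with Naive Bayes as the first tier and plain KNN standing in for the paper's certainty-factor KNN (which has no off-the-shelf implementation); synthetic data replaces NSL-KDD, and the 0.9 confidence threshold is an invented example value.

```python
# Two-layer dimension reduction (PCA -> LDA) + two-tier classification
# (Naive Bayes, then KNN on low-confidence samples). Illustrative sketch.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for NSL-KDD: 3 classes, 40 features.
X, y = make_classification(n_samples=2000, n_features=40, n_classes=3,
                           n_informative=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Layers 1 and 2: unsupervised PCA, then supervised LDA on the PCA scores.
pca = PCA(n_components=20).fit(Xtr)
lda = LinearDiscriminantAnalysis(n_components=2).fit(pca.transform(Xtr), ytr)
Ztr = lda.transform(pca.transform(Xtr))
Zte = lda.transform(pca.transform(Xte))

# Tier 1: Naive Bayes gives a fast first-pass label.
nb = GaussianNB().fit(Ztr, ytr)
pred = nb.predict(Zte)

# Tier 2: re-examine low-confidence samples with KNN.
knn = KNeighborsClassifier(n_neighbors=5).fit(Ztr, ytr)
low_conf = nb.predict_proba(Zte).max(axis=1) < 0.9
pred[low_conf] = knn.predict(Zte[low_conf])
print("accuracy:", (pred == yte).mean())
```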

356 citations

Journal ArticleDOI
24 Aug 2018-Sensors
TL;DR: This paper presents an overview of different layered IoT architectures and of security attacks from the perspective of each layer, and suggests a new secure layered IoT architecture to overcome these issues.
Abstract: The use of the Internet continues to grow, and a new domain has developed around it: the Internet of Things (IoT). IoT enables machines and objects to communicate, compute and coordinate with each other, and it underpins the intelligence embedded in several essential features of the modern world, such as homes, hospitals, buildings, transport systems and cities. Security and privacy are among the critical issues raised by the wide application of IoT, and these issues hinder its broad adoption. In this paper, we present an overview of the different layered architectures of IoT and of security attacks from the perspective of each layer. In addition, we review mechanisms that provide solutions to these issues, together with their limitations. Furthermore, we suggest a new secure layered architecture for IoT to overcome these issues.
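As a compact illustration of viewing attacks layer by layer, the snippet below pairs the widely used three-layer IoT architecture with representative attacks; the pairings are common examples from the literature, not the paper's full catalogue.

```python
# Representative attacks per IoT layer (illustrative, not exhaustive).
LAYER_ATTACKS = {
    "perception": ["node tampering", "node capture", "RF jamming"],
    "network": ["sinkhole", "man-in-the-middle", "DDoS flooding"],
    "application": ["malicious code injection", "phishing", "data leakage"],
}

for layer, attacks in LAYER_ATTACKS.items():
    print(f"{layer:<12} -> {', '.join(attacks)}")
```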

294 citations