Author

Burak Ozpoyraz

Bio: Burak Ozpoyraz is an academic researcher from Koç University. The author has contributed to research on the topics of spectral efficiency and communication channels. The author has co-authored 2 publications.

Papers
Journal ArticleDOI
TL;DR: This article focuses its attention on four promising physical layer concepts foreseen to dominate next-generation communications, namely massive multiple-input multiple-output systems, sophisticated multi-carrier waveform designs, reconfigurable intelligent surface-empowered communications, and physical layer security.
Abstract: Deep learning (DL) has proven its unprecedented success in diverse fields such as computer vision, natural language processing, and speech recognition, owing to its strong representation ability and ease of computation. As we move forward to a thoroughly intelligent society with 6G wireless networks, new applications and use cases have been emerging with stringent requirements for next-generation wireless communications. Therefore, recent studies have focused on the potential of DL approaches in satisfying these rigorous needs and overcoming the deficiencies of existing model-based techniques. The main objective of this article is to unveil the state-of-the-art advancements in the field of DL-based physical layer methods to pave the way for fascinating applications of 6G. In particular, we have focused our attention on four promising physical layer concepts foreseen to dominate next-generation communications, namely massive multiple-input multiple-output systems, sophisticated multi-carrier waveform designs, reconfigurable intelligent surface-empowered communications, and physical layer security. We examine up-to-date developments in DL-based techniques, provide comparisons with state-of-the-art methods, and introduce a comprehensive guide for future directions. We also present an overview of the underlying concepts of DL, along with the theoretical background of well-known DL techniques. Furthermore, this article provides programming examples for a number of DL techniques and the implementation of a DL-based multiple-input multiple-output system by sharing user-friendly code snippets, which might be useful for interested readers.
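For readers interested in the kind of code snippets the article refers to, here is a minimal, illustrative sketch (not the article's own code) of a DL-based MIMO detector: a small fully connected network that recovers BPSK symbols from the received signal and the channel coefficients. The antenna counts, network sizes, SNR, and training budget are assumptions made only for demonstration.

# Minimal sketch of a DL-based MIMO detector (illustrative assumptions throughout).
import torch
import torch.nn as nn

Nt, Nr = 4, 4                                  # assumed transmit/receive antenna counts
snr_db = 10.0                                  # assumed operating SNR

class MIMODetector(nn.Module):
    def __init__(self):
        super().__init__()
        # Input: real/imag parts of the received vector y and of the channel H.
        self.net = nn.Sequential(
            nn.Linear(2 * Nr + 2 * Nr * Nt, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, Nt), nn.Tanh()        # soft estimates of the BPSK symbols
        )

    def forward(self, feats):
        return self.net(feats)

def batch(batch_size=256):
    """Random Rayleigh channels, BPSK symbols, and noisy receptions y = Hx + n."""
    x = torch.randint(0, 2, (batch_size, Nt)).float() * 2 - 1           # BPSK +/-1
    H = (torch.randn(batch_size, Nr, Nt) + 1j * torch.randn(batch_size, Nr, Nt)) / 2 ** 0.5
    noise_std = 10 ** (-snr_db / 20)
    n = noise_std * (torch.randn(batch_size, Nr) + 1j * torch.randn(batch_size, Nr)) / 2 ** 0.5
    y = torch.einsum('bij,bj->bi', H, x.cfloat()) + n
    feats = torch.cat([y.real, y.imag, H.real.flatten(1), H.imag.flatten(1)], dim=1)
    return feats, x

model = MIMODetector()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(500):                         # short illustrative training loop
    feats, x = batch()
    loss = nn.functional.mse_loss(model(feats), x)
    opt.zero_grad(); loss.backward(); opt.step()

feats, x = batch(2048)
ber = (model(feats).sign() != x).float().mean().item()
print(f"approximate BER at {snr_db} dB: {ber:.3f}")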

11 citations

Journal ArticleDOI
TL;DR: It is shown via computer simulations that the proposed scheme outperforms the existing IM-based schemes and might be a candidate for future secure communication systems.
Abstract: In this paper, we propose a physical layer security scheme that exploits a novel index modulation (IM) technique for coordinate interleaved orthogonal designs (CIOD). Utilizing the diversity gain of CIOD transmission, the proposed scheme, named CIOD-IM, provides an improved spectral efficiency by means of IM. In order to provide a satisfactory secrecy rate, we design a particular artificial noise matrix, which does not affect the performance of the legitimate receiver, while deteriorating the performance of the eavesdropper. We derive expressions of the ergodic secrecy rate and the theoretical bit error rate upper bound. In addition, we analyze the case of imperfect channel estimation by taking practical concerns into consideration. It is shown via computer simulations that the proposed scheme outperforms the existing IM-based schemes and might be a candidate for future secure communication systems.
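As a rough illustration of the artificial-noise idea the scheme relies on (not the paper's exact CIOD-IM artificial noise matrix design), the sketch below places noise in the null space of the legitimate receiver's channel, so the legitimate link is untouched while the eavesdropper's link is degraded; the antenna counts and BPSK symbols are illustrative assumptions.

# Generic null-space artificial-noise sketch (illustrative, not the CIOD-IM design).
import numpy as np

rng = np.random.default_rng(0)
Nt, Nr = 4, 2                                    # assumed antenna counts (Nt > Nr)

H_bob = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
H_eve = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)

# Orthonormal basis of the null space of the legitimate channel, via the SVD.
_, _, Vh = np.linalg.svd(H_bob)
null_basis = Vh[Nr:].conj().T                    # Nt x (Nt - Nr)

x = rng.choice([-1, 1], size=Nt) + 0j            # information symbols (BPSK, illustrative)
z = (rng.standard_normal(Nt - Nr) + 1j * rng.standard_normal(Nt - Nr)) / np.sqrt(2)
tx = x + null_basis @ z                          # transmitted vector: data plus artificial noise

print("Change in Bob's received signal due to AN:", np.linalg.norm(H_bob @ tx - H_bob @ x))  # ~0
print("Change in Eve's received signal due to AN:", np.linalg.norm(H_eve @ tx - H_eve @ x))  # > 0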

Cited by
Journal ArticleDOI
TL;DR: This study furnishes the vision for 6G, seeing the world beyond 5G with the transition to 6G assuming the lead as the future wireless communication technology, and explores the main impediments and challenges that the 5G–6G transition may face in achieving these greater ideals.
Abstract: The fifth-generation mobile network (5G), as the fundamental enabler of Industry 4.0, has facilitated digital transformation and smart manufacturing through AI and cloud computing (CC). However, beyond-5G (B5G) is viewed as a turning point that will fundamentally transform existing global trends in wireless communication practices as well as in the lives of the masses. B5G foresees a world where physical–digital confluence takes place. This study intends to see the world beyond 5G, with the transition to 6G assuming the lead as the future wireless communication technology. However, despite several developments, the dream of an era of zero latency, unprecedented internet speeds, and extraterrestrial communication has yet to become a reality. This article explores the main impediments and challenges that the 5G–6G transition may face in achieving these greater ideals. It furnishes the vision for 6G, the facilitating technology infrastructures, the challenges, and research leads towards the ultimate achievement of the "technology for humanity" objective and better service to underprivileged people.

15 citations

Journal ArticleDOI
TL;DR: This paper provides an overview of the current theoretical and application prospects of IoT, AI, cloud computing, edge computing, deep learning techniques, blockchain technologies, social networks, robots, machines, and privacy and security techniques at their intersection with the COVID-19 pandemic.
Abstract: The onset of the COVID-19 pandemic has driven redirection, as well as innovation, in many digital technologies. Even after the progression of vaccination efforts across the globe, total eradication of this pandemic still lies in the distant future due to the evolution of new variants. To proactively deal with the pandemic, healthcare service providers and caretaker organizations require new technologies, along with improvements in existing related technologies such as the Internet of Things (IoT), Artificial Intelligence (AI), and machine learning, in terms of infrastructure, efficiency, privacy, and security. This paper provides an overview of the current theoretical and application prospects of IoT, AI, cloud computing, edge computing, deep learning techniques, blockchain technologies, social networks, robots, machines, and privacy and security techniques. Considering these prospects at their intersection with the COVID-19 pandemic, we review the technologies within the broad umbrella of AI-IoT technologies in a concise classification scheme. In this review, we illustrate that AI-IoT technological applications and innovations have most impacted the field of healthcare. The essential AI-IoT technologies found for healthcare were fog computing in IoT, deep learning, and blockchain. Furthermore, we highlight several aspects of these technologies and their future impact with a novel methodology that uses techniques from image processing, machine learning, and differential system modeling.

6 citations

Journal ArticleDOI
TL;DR: In this article, the authors propose two mitigation methods, namely adversarial training and defensive distillation, against adversarial attacks on artificial intelligence-based models used in millimeter-wave (mmWave) beamforming prediction.
Abstract: The design of a security scheme for beamforming prediction is critical for next-generation wireless networks (5G, 6G, and beyond). However, there is no consensus about protecting beamforming prediction that uses deep learning algorithms in these networks. This paper presents the security vulnerabilities of deep-neural-network (DNN)-based beamforming prediction in 6G wireless networks, treating beamforming prediction as a multi-output regression problem. It is shown that the initial DNN model is vulnerable to adversarial attacks, such as the Fast Gradient Sign Method, the Basic Iterative Method, Projected Gradient Descent, and the Momentum Iterative Method, because it is sensitive to the perturbations introduced by adversarial samples of the training data. This study offers two mitigation methods, namely adversarial training and defensive distillation, against adversarial attacks on artificial intelligence-based models used in millimeter-wave (mmWave) beamforming prediction. Furthermore, the proposed scheme can be used in situations where the data are corrupted by adversarial examples in the training data. Experimental results show that the proposed methods defend the DNN models against adversarial attacks in next-generation wireless networks.
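As an illustration of the attack model discussed above, the sketch below implements the Fast Gradient Sign Method against a toy regression DNN; the network, data, and perturbation budget are assumptions for demonstration only, not the authors' beamforming model.

# FGSM against a toy regression DNN (illustrative model and data).
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 8))  # toy regressor

x = torch.randn(32, 16)              # clean inputs (stand-in for channel features)
y = torch.randn(32, 8)               # targets (stand-in for beamforming vectors)

def fgsm(model, x, y, eps=0.05):
    """Return adversarial examples x + eps * sign(grad_x loss)."""
    x_adv = x.clone().detach().requires_grad_(True)
    nn.functional.mse_loss(model(x_adv), y).backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()

x_adv = fgsm(model, x, y)
with torch.no_grad():
    clean_mse = nn.functional.mse_loss(model(x), y).item()
    adv_mse = nn.functional.mse_loss(model(x_adv), y).item()
print(f"MSE on clean inputs: {clean_mse:.4f}  MSE under FGSM: {adv_mse:.4f}")

Adversarial training, one of the two mitigations, would simply mix such x_adv samples into each training batch; defensive distillation instead retrains a second model on the first model's softened outputs.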

5 citations

Journal ArticleDOI
TL;DR: A comprehensive vulnerability analysis of deep learning-based channel estimation models, trained with a dataset obtained from MATLAB's 5G Toolbox, is carried out under adversarial attacks, and defensive distillation-based mitigation methods are proposed.
Abstract: Future wireless networks (5G and beyond), also known as Next Generation or NextG, are the vision of forthcoming cellular systems, connecting billions of devices and people together. In the last decades, cellular networks have grown dramatically with advanced telecommunication technologies for high-speed data transmission, high cell capacity, and low latency. The main goal of those technologies is to support a wide range of new applications, such as virtual reality, the metaverse, telehealth, online education, autonomous and flying vehicles, smart cities, smart grids, advanced manufacturing, and many more. The key motivation of NextG networks is to meet the high demand for those applications by improving and optimizing network functions. Artificial Intelligence (AI) has a high potential to achieve these requirements by being integrated into applications throughout all network layers. However, the security concerns of NextG network functions that use AI-based models, e.g., model poisoning, have not been investigated deeply. It is crucial to protect next-generation cellular networks against cybersecurity threats, especially adversarial attacks. Therefore, efficient mitigation techniques and secure solutions need to be designed for NextG networks that use AI-based methods. This paper presents a comprehensive vulnerability analysis of deep learning (DL)-based channel estimation models, trained with a dataset obtained from MATLAB's 5G Toolbox, under adversarial attacks, together with defensive distillation-based mitigation methods. The adversarial attacks produce faulty results by manipulating trained DL-based models for channel estimation in NextG networks, while the mitigation methods make the models more robust against adversarial attacks. This paper also presents the performance of the proposed defensive distillation mitigation method for each adversarial attack. The results indicate that the proposed mitigation method can defend the DL-based channel estimation models against adversarial attacks in NextG networks.
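The sketch below illustrates the teacher-student step that defensive distillation builds on, written here for a regression-style channel estimator; note that classic defensive distillation uses a temperature-scaled softmax for classifiers, and the synthetic data and tiny networks are assumptions, not the MATLAB 5G Toolbox dataset or the authors' models.

# Teacher-student distillation sketch for a toy channel estimator (illustrative).
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_net():
    return nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 32))

x = torch.randn(1024, 32)            # received pilot features (synthetic)
h = torch.randn(1024, 32)            # "true" channel coefficients (synthetic)

# 1) Train the teacher on the hard labels.
teacher = make_net()
opt = torch.optim.Adam(teacher.parameters(), lr=1e-3)
for _ in range(300):
    opt.zero_grad()
    nn.functional.mse_loss(teacher(x), h).backward()
    opt.step()

# 2) Train the student on the teacher's predictions rather than the hard labels;
#    this is the distillation step the mitigation method builds on.
student = make_net()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
with torch.no_grad():
    soft_targets = teacher(x)
for _ in range(300):
    opt.zero_grad()
    nn.functional.mse_loss(student(x), soft_targets).backward()
    opt.step()

print("student MSE vs. the true channel:", nn.functional.mse_loss(student(x), h).item())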

3 citations

Journal ArticleDOI
TL;DR: In this paper, the authors provide an overview of the existing ML-based mmWave/THz beam management (BM) and beam tracking techniques and highlight the key characteristics of an optimal BM and tracking framework.
Abstract: Next-generation wireless communication networks will benefit from beamforming gain to utilize higher bandwidths at millimeter wave (mmWave) and terahertz (THz) bands. For high directional gain, a beam management (BM) framework acquires and tracks optimal downlink and uplink beam pairs through an exhaustive beam scan. However, for narrower beams at higher carrier frequencies, this leads to a huge beam measurement overhead that negatively impacts beam acquisition and tracking. Moreover, the volatility of mmWave and THz channels, random user mobility patterns, and environmental changes further complicate the BM process. Consequently, machine learning (ML) algorithms that can identify and learn complex mobility patterns and track environmental dynamics have been identified as a remedy. In this article, we provide an overview of the existing ML-based mmWave/THz BM and beam tracking techniques. In particular, we highlight the key characteristics of an optimal BM and tracking framework. By surveying recent studies, we identify open research challenges and provide recommendations that can serve as a future direction for researchers in this area.
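As a rough sketch of how ML can replace an exhaustive beam sweep, the example below casts beam management as beam-index classification over a small codebook using synthetic features; the codebook size, input features, and label construction are illustrative assumptions.

# Beam management as beam-index classification (synthetic, illustrative setup).
import torch
import torch.nn as nn

torch.manual_seed(0)
num_beams = 16                                        # assumed codebook size

# Synthetic data: the "best" beam index is tied to a single angle feature here.
angle = torch.rand(4096, 1) * 2 - 1                   # normalised angle in [-1, 1]
features = torch.cat([angle, torch.randn(4096, 3)], dim=1)   # angle plus nuisance features
best_beam = ((angle.squeeze(1) + 1) / 2 * (num_beams - 1)).round().long()

model = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, num_beams))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    nn.functional.cross_entropy(model(features), best_beam).backward()
    opt.step()

top1 = (model(features).argmax(dim=1) == best_beam).float().mean().item()
print(f"top-1 beam prediction accuracy on the toy data: {top1:.2f}")

Replacing an exhaustive sweep over all num_beams directions with a single model prediction is the measurement-overhead saving the survey above discusses.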

2 citations