Journal ArticleDOI

Deep Learning Based Channel Estimation Algorithm for Fast Time-Varying MIMO-OFDM Systems

01 Mar 2020-IEEE Communications Letters (IEEE)-Vol. 24, Iss: 3, pp 572-576
TL;DR: A deep learning (DL)-based MIMO-OFDM channel estimation algorithm that, by performing offline training of the learning network, effectively adapts to the characteristics of fast time-varying channels in high-mobility scenarios.
Abstract: Channel estimation is very challenging for multiple-input and multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) systems in high mobility environments with non-stationary channel characteristics. To handle this problem, we propose a deep learning (DL)-based MIMO-OFDM channel estimation algorithm. By performing offline training of the learning network, the channel state information (CSI) generated by the training samples can be effectively utilized to adapt to the characteristics of fast time-varying channels in high-mobility scenarios. The simulation results show that the proposed DL-based algorithm is more robust in high-mobility MIMO-OFDM scenarios than the conventional algorithms.
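As a rough illustration of the offline-training / online-estimation split described in the abstract, the sketch below fits a linear learning module to map noisy least-squares (LS) pilot estimates to true channels on a simulated frequency-correlated channel. The dimensions, smoothing kernel, and noise level are illustrative assumptions, not the letter's actual network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the setup (all parameters are assumptions): 8 subcarriers,
# frequency-correlated real-valued channel, noisy LS pilot estimates as
# input, true channel as training target.
N_SC, SIGMA = 8, 0.3

def draw_channels(n):
    # Correlate neighboring subcarriers by smoothing white Gaussian taps.
    w = rng.standard_normal((n, N_SC))
    k = np.array([0.25, 0.5, 0.25])
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, w)

# Offline phase: fit a linear module W from (noisy estimate -> true channel).
H_tr = draw_channels(5000)
X_tr = H_tr + SIGMA * rng.standard_normal(H_tr.shape)
W, *_ = np.linalg.lstsq(X_tr, H_tr, rcond=None)

# Online phase: apply the trained module to unseen channel realizations.
H_te = draw_channels(2000)
X_te = H_te + SIGMA * rng.standard_normal(H_te.shape)
mse_ls = np.mean((X_te - H_te) ** 2)       # raw LS estimate
mse_dl = np.mean((X_te @ W - H_te) ** 2)   # learned estimator
```

Because the fitted module exploits the correlation across subcarriers, `mse_dl` comes out well below `mse_ls`; the letter's deep network plays the same role for far richer time-frequency structure.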
Citations
Journal ArticleDOI
TL;DR: In this article, the authors investigated the mean square error (MSE) performance of machine learning-based channel estimation in orthogonal frequency division multiplexing (OFDM) systems and derived a clear analytical relation between the size of the training data and performance.
Abstract: Recently, machine learning-based channel estimation has attracted much attention. The performance of machine learning-based estimation has been validated by simulation experiments. However, little attention has been paid to the theoretical performance analysis. In this paper, we investigate the mean square error (MSE) performance of machine learning-based estimation. Hypothesis testing is employed to analyze its MSE upper bound. Furthermore, we build a statistical model for hypothesis testing, which holds when the linear learning module with a low input dimension is used in machine learning-based channel estimation, and derive a clear analytical relation between the size of the training data and performance. Then, we simulate the machine learning-based channel estimation in orthogonal frequency division multiplexing (OFDM) systems to verify our analysis results. Finally, the design considerations for the situation where only limited training data is available are discussed. In this situation, our analysis results can be applied to assess the performance and support the design of machine learning-based channel estimation.
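The relation this paper analyzes, estimation error shrinking with training-set size toward an MSE floor, can be reproduced in a scalar toy case where a single learned gain is fitted from N training pairs. The model, noise level, and sample sizes below are illustrative assumptions, not the paper's analysis setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar toy model: observe x = h + n and learn a gain w so that w*x
# approximates h. The MMSE-optimal gain is 1/(1 + sigma^2).
SIGMA2 = 0.25

def empirical_mse(n_train, n_test=50000):
    h = rng.standard_normal(n_train)
    x = h + np.sqrt(SIGMA2) * rng.standard_normal(n_train)
    w = (x @ h) / (x @ x)            # least-squares fit of the gain
    h_t = rng.standard_normal(n_test)
    x_t = h_t + np.sqrt(SIGMA2) * rng.standard_normal(n_test)
    return np.mean((w * x_t - h_t) ** 2)

mse_floor = SIGMA2 / (1 + SIGMA2)    # MMSE of the optimal gain: 0.2
mses = {n: empirical_mse(n) for n in (10, 100, 10000)}
```

With more training pairs the fitted gain concentrates around the optimum, so the test MSE approaches (but never beats) the 0.2 floor, which is the qualitative relation between training-data size and performance that the paper derives analytically.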

17 citations

Journal ArticleDOI
TL;DR: This article develops important insights derived from the physical radio frequency (RF) channel properties and presents a comprehensive overview on the application of DL for accurately estimating channel state information (CSI) with low overhead.
Abstract: A new wave of wireless services, including virtual reality, autonomous driving and Internet of Things, is driving the design of new generations of wireless systems to deliver ultra-high data rates, massive numbers of connections and ultra-low latency. Massive multiple-input multiple-output (MIMO) is one of the critical underlying technologies that allow future wireless networks to meet these service needs. This article discusses the application of deep learning (DL) for massive MIMO channel estimation in wireless networks by integrating the underlying characteristics of channels in future high-speed cellular deployment. We develop important insights derived from the physical radio frequency (RF) channel properties and present a comprehensive overview on the application of DL for accurately estimating channel state information (CSI) with low overhead. We provide examples of successful DL application in CSI estimation for massive MIMO wireless systems and highlight several promising directions for future research.

16 citations


Cites methods from "Deep Learning Based Channel Estimat..."

  • ...In [8], a DL network has been proposed to handle the massive MIMO channel estimation problem in high-speed mobile scenarios by exploiting the temporal correlation as well as the spatial and spectral correlation....



Journal ArticleDOI
TL;DR: In this paper, a machine learning-based channel estimation for orthogonal frequency division multiplexing (OFDM) systems is proposed, in which the training of the estimator is performed online.
Abstract: In this paper, we devise a highly efficient machine learning-based channel estimation for orthogonal frequency division multiplexing (OFDM) systems, in which the training of the estimator is performed online. A simple learning module is employed for the proposed learning-based estimator. The training process is thus much faster and the required training data is reduced significantly. In addition, a training data construction approach utilizing least squares (LS) estimation results is proposed so that the training data can be collected during the data transmission. The feasibility of this novel construction approach is verified by theoretical analysis and simulations. Based on this construction approach, two alternative training data generation schemes are proposed. One scheme transmits additional block pilot symbols to create training data, while the other scheme adopts a decision-directed method and does not require extra pilot overhead. Simulation results show the robustness of the proposed channel estimation method. Furthermore, the proposed method shows better adaptation to practical imperfections compared with the conventional minimum mean-square error (MMSE) channel estimation. It outperforms the existing machine learning-based channel estimation techniques under varying channel conditions.
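A minimal sketch of this paper's central trick, using LS estimation results as training labels so data can be collected during transmission, is given below. Because the LS label noise is zero-mean and independent of the input, the fitted module comes out close to what genie-aided (true-channel) labels would give. The channel model and sizes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

N_SC, SIGMA = 8, 0.3

def draw_channels(n):
    # Frequency-correlated real-valued channel (illustrative model).
    w = rng.standard_normal((n, N_SC))
    k = np.array([0.25, 0.5, 0.25])
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, w)

def ls_estimate(h):
    # With unit-modulus pilots, the LS estimate is just h plus noise.
    return h + SIGMA * rng.standard_normal(h.shape)

# Training pairs built entirely from LS results: the input is one LS
# estimate, the label an independent LS estimate of the same channel
# (e.g. from a block pilot symbol). No true channel is used for training.
H_tr = draw_channels(5000)
X_tr, Y_tr = ls_estimate(H_tr), ls_estimate(H_tr)
W, *_ = np.linalg.lstsq(X_tr, Y_tr, rcond=None)

# Evaluation against the true channel, which the training never saw.
H_te = draw_channels(2000)
X_te = ls_estimate(H_te)
mse_ls = np.mean((X_te - H_te) ** 2)
mse_learned = np.mean((X_te @ W - H_te) ** 2)
```

Even though the labels are themselves noisy, `mse_learned` beats `mse_ls` against the true channel, which is why LS-derived training data is viable at all.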

13 citations

Journal ArticleDOI
TL;DR: This article focuses its attention on four promising physical layer concepts foreseen to dominate next-generation communications, namely massive multiple-input multiple-output systems, sophisticated multi-carrier waveform designs, reconfigurable intelligent surface-empowered communications, and physical layer security.
Abstract: Deep learning (DL) has proven its unprecedented success in diverse fields such as computer vision, natural language processing, and speech recognition by its strong representation ability and ease of computation. As we move forward to a thoroughly intelligent society with 6G wireless networks, new applications and use cases have been emerging with stringent requirements for next-generation wireless communications. Therefore, recent studies have focused on the potential of DL approaches in satisfying these rigorous needs and overcoming the deficiencies of existing model-based techniques. The main objective of this article is to unveil the state-of-the-art advancements in the field of DL-based physical layer methods to pave the way for fascinating applications of 6G. In particular, we have focused our attention on four promising physical layer concepts foreseen to dominate next-generation communications, namely massive multiple-input multiple-output systems, sophisticated multi-carrier waveform designs, reconfigurable intelligent surface-empowered communications, and physical layer security. We examine up-to-date developments in DL-based techniques, provide comparisons with state-of-the-art methods, and introduce a comprehensive guide for future directions. We also present an overview of the underlying concepts of DL, along with the theoretical background of well-known DL techniques. Furthermore, this article provides programming examples for a number of DL techniques and the implementation of a DL-based multiple-input multiple-output system by sharing user-friendly code snippets, which might be useful for interested readers.

11 citations

Journal ArticleDOI
TL;DR: Experimental results demonstrate that in a realistic non-cooperative cognitive communication scenario with no prior information available, the proposed SC-MFNet outperforms traditional feature-based methods and state-of-the-art neural networks based on either constellation features or series features.
Abstract: Due to the shortage of radio spectrum in the current 5G and upcoming 6G systems, the cognitive radio (CR) technique is indispensable for spectrum management and can put the unutilized spectrum to good use. As the core technology of CR, blind modulation recognition (BMR) plays a pivotal role in improving spectral efficiency. However, BMR for MIMO-OFDM systems has received little attention. Given the success of deep learning, we propose a series-constellation multi-modal feature network (SC-MFNet) to recognize the modulation types of MIMO-OFDM subcarriers. Without any prior information, a blind signal separation algorithm is employed to reconstruct the impaired transmitted signal. Since the features of the signal series alone are insufficient, we propose a segment accumulated constellation diagram (SACD) strategy to produce distinctive constellation features. Moreover, the proposed multi-modal feature fusion network combines the advantages of series and SACD features, which are extracted by a one-dimensional convolution (Conv1DNet) branch and an improved EfficientNet branch, respectively. Experimental results demonstrate that in a realistic non-cooperative cognitive communication scenario with no prior information available, the proposed SC-MFNet outperforms traditional feature-based methods and state-of-the-art neural networks based on either constellation features or series features.
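The segment accumulated constellation diagram (SACD) can be pictured as summing per-segment 2-D I/Q histograms, so constellation clusters reinforce while noise stays diffuse. A rough sketch under assumed parameters (QPSK, 8 segments, a 32x32 image), not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(7)

# Noisy QPSK symbol stream (modulation type and noise level are assumptions).
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
sym = rng.choice(qpsk, size=4096)
sym = sym + 0.1 * (rng.standard_normal(4096) + 1j * rng.standard_normal(4096))

segments = sym.reshape(8, 512)            # 8 segments of 512 symbols each
bins = np.linspace(-1.5, 1.5, 33)         # 32 x 32 image grid
sacd = np.zeros((32, 32))
for seg in segments:
    # Render each segment as an I/Q histogram and accumulate the images.
    hist, _, _ = np.histogram2d(seg.real, seg.imag, bins=[bins, bins])
    sacd += hist
```

The four QPSK clusters dominate the accumulated image `sacd`, giving the CNN branch a far stronger input than any single short segment.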

10 citations

References
Book
16 Nov 2010
TL;DR: In this article, the authors provide a comprehensive introduction to the theory and practice of wireless channel modeling, OFDM, and MIMO, using MATLAB programs to simulate the various techniques on a wireless network.
Abstract: MIMO-OFDM is a key technology for next-generation cellular communications (3GPP-LTE, Mobile WiMAX, IMT-Advanced) as well as wireless LAN (IEEE 802.11a, IEEE 802.11n), wireless PAN (MB-OFDM), and broadcasting (DAB, DVB, DMB). In MIMO-OFDM Wireless Communications with MATLAB, the authors provide a comprehensive introduction to the theory and practice of wireless channel modeling, OFDM, and MIMO, using MATLAB programs to simulate the various techniques on MIMO-OFDM systems.

  • One of the only books in the area dedicated to explaining simulation aspects
  • Covers implementation to help cement the key concepts
  • Uses materials that have been classroom-tested in numerous universities
  • Provides analytic solutions and practical examples with downloadable MATLAB code
  • Includes simulation examples based on actual industry and research projects
  • Offers presentation slides with key equations and figures for instructor use

MIMO-OFDM Wireless Communications with MATLAB is a key text for graduate students in wireless communications. Professionals and technicians in wireless communication fields, graduate students in signal processing, and senior undergraduates majoring in wireless communications will find this book a practical introduction to MIMO-OFDM techniques. Instructor materials and MATLAB code examples are available for download at www.wiley.com/go/chomimo
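The book's simulations are in MATLAB; the same core OFDM mechanics can be sketched in a few lines of NumPy: modulate via IFFT, prepend a cyclic prefix at least as long as the channel memory, and the multipath channel collapses to one complex gain per subcarrier. Parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

N, CP, L = 64, 16, 4                     # subcarriers, CP length, channel taps
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
X = rng.choice(qpsk, size=N)             # frequency-domain QPSK symbols

x = np.fft.ifft(X) * np.sqrt(N)          # OFDM time-domain symbol
x_cp = np.concatenate([x[-CP:], x])      # prepend cyclic prefix

h = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)
y = np.convolve(x_cp, h)[: CP + N]       # multipath channel (noise-free here)

Y = np.fft.fft(y[CP:]) / np.sqrt(N)      # strip CP, back to frequency domain
H = np.fft.fft(h, N)                     # channel frequency response
# With CP >= L - 1, Y equals H * X exactly: flat fading per subcarrier.
```

This per-subcarrier model Y = H * X is the starting point for every channel estimation scheme discussed on this page.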

1,413 citations


"Deep Learning Based Channel Estimat..." refers methods in this paper

  • ..., the LS with linear interpolation [5] and the LMMSE with linear interpolation [6], in high mobility MIMO-OFDM systems....


  • ...The least squares (LS) and linear minimum mean squared error (LMMSE) methods are commonly used channel estimation algorithms [5], [6], but they cannot work well in the high mobility scenario due to several drawbacks....

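For concreteness, a toy comparison of the two baselines named in the snippets above, per-subcarrier LS and LMMSE smoothing with the channel correlation matrix, might look as follows. The channel model, correlation, and SNR are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

N_SC, SIGMA2 = 16, 0.1                   # subcarriers, noise variance

def draw_channels(n):
    # Frequency-correlated complex channel (illustrative model).
    w = rng.standard_normal((n, N_SC)) + 1j * rng.standard_normal((n, N_SC))
    k = np.array([0.25, 0.5, 0.25])
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, w)

def ls_estimate(h):
    # Unit-modulus pilots: LS estimate = h + complex noise of variance SIGMA2.
    n = rng.standard_normal(h.shape) + 1j * rng.standard_normal(h.shape)
    return h + np.sqrt(SIGMA2 / 2) * n

# LMMSE filter W = R (R + sigma^2 I)^-1, with R estimated from training data.
H_train = draw_channels(4000)
R = H_train.T @ H_train.conj() / len(H_train)
W = R @ np.linalg.inv(R + SIGMA2 * np.eye(N_SC))

H = draw_channels(2000)
H_ls = ls_estimate(H)
H_lmmse = H_ls @ W.T                     # filter each LS estimate (row vector)

mse_ls = np.mean(np.abs(H_ls - H) ** 2)
mse_lmmse = np.mean(np.abs(H_lmmse - H) ** 2)
```

LMMSE trades the LS estimator's simplicity for knowledge of the channel correlation and noise statistics, which is exactly what becomes unreliable in fast time-varying channels and motivates the learned approach of the main paper.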

Journal ArticleDOI
TL;DR: The proposed deep learning-based approach to handle wireless OFDM channels in an end-to-end manner is more robust than conventional methods when fewer training pilots are used, the cyclic prefix is omitted, and nonlinear clipping noise exists.
Abstract: This letter presents our initial results in deep learning for channel estimation and signal detection in orthogonal frequency-division multiplexing (OFDM) systems. In this letter, we exploit deep learning to handle wireless OFDM channels in an end-to-end manner. Different from existing OFDM receivers that first estimate channel state information (CSI) explicitly and then detect/recover the transmitted symbols using the estimated CSI, the proposed deep learning-based approach estimates CSI implicitly and recovers the transmitted symbols directly. To address channel distortion, a deep learning model is first trained offline using the data generated from simulation based on channel statistics and then used for recovering the online transmitted data directly. From our simulation results, the deep learning based approach can address channel distortion and detect the transmitted symbols with performance comparable to the minimum mean-square error estimator. Furthermore, the deep learning-based approach is more robust than conventional methods when fewer training pilots are used, the cyclic prefix is omitted, and nonlinear clipping noise exists. In summary, deep learning is a promising tool for channel estimation and signal detection in wireless communications with complicated channel distortion and interference.
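The "estimate CSI explicitly, then detect" pipeline that this letter departs from can be sketched as a scalar flat-fading toy: a unit pilot gives an LS channel estimate, followed by zero-forcing equalization and nearest-symbol slicing. All parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

N, SIGMA = 10000, 0.1
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Fast Rayleigh fading: a fresh channel gain per symbol.
h = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
x = rng.choice(qpsk, size=N)

def noise():
    return SIGMA * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

y_pilot = h * 1.0 + noise()              # pilot symbol = 1, so LS CSI = y_pilot
y_data = h * x + noise()

def detect(csi):
    z = y_data / csi                     # zero-forcing equalization
    return qpsk[np.argmin(np.abs(z[:, None] - qpsk[None, :]), axis=1)]

ser_perfect = np.mean(detect(h) != x)        # genie CSI
ser_ls = np.mean(detect(y_pilot) != x)       # noisy LS CSI
```

Pilot noise propagates into the equalizer, so `ser_ls` exceeds `ser_perfect`; the letter's end-to-end DL receiver sidesteps the explicit estimation step altogether.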

1,357 citations


"Deep Learning Based Channel Estimat..." refers methods in this paper

  • ...In [8], a DL-based channel estimation and signal detection method was developed...


Journal ArticleDOI
TL;DR: The learned denoising-based approximate message passing (LDAMP) network is exploited and significantly outperforms state-of-the-art compressed sensing-based algorithms even when the receiver is equipped with a small number of RF chains.
Abstract: Channel estimation is very challenging when the receiver is equipped with a limited number of radio-frequency (RF) chains in beamspace millimeter-wave massive multiple-input and multiple-output systems. To solve this problem, we exploit a learned denoising-based approximate message passing (LDAMP) network. This neural network can learn channel structure and estimate channel from a large number of training data. Furthermore, we provide an analytical framework on the asymptotic performance of the channel estimator. Based on our analysis and simulation results, the LDAMP neural network significantly outperforms state-of-the-art compressed sensing-based algorithms even when the receiver is equipped with a small number of RF chains.

587 citations


"Deep Learning Based Channel Estimat..." refers background in this paper

  • ...In [7], a learned denoising-based approximate...


  • ...Recently, the deep learning (DL) method has been applied in wireless communications [7]–[17], such as channel estimation [7]–[14], channel state information (CSI) feedback [15], [16], and data detection [17], in order to achieve excellent performance....


Journal ArticleDOI
TL;DR: Simulation results corroborate that the proposed deep learning based scheme achieves better DOA estimation and channel estimation performance than conventional methods, and its robustness is verified by extensive simulations in various cases.
Abstract: The recent concept of massive multiple-input multiple-output (MIMO) can significantly improve the capacity of the communication network, and it has been regarded as a promising technology for the next-generation wireless communications. However, the fundamental challenge of existing massive MIMO systems is that high computational complexity and complicated spatial structures make it difficult to exploit the channel characteristics and sparsity of these multi-antenna systems. To address this problem, in this paper, we focus on channel estimation and direction-of-arrival (DOA) estimation, and a novel framework that integrates massive MIMO into deep learning is proposed. To realize end-to-end performance, a deep neural network (DNN) is employed to conduct offline learning and online learning procedures, which effectively learns the statistics of the wireless channel and the spatial structures in the angle domain. Concretely, the DNN is first trained by simulated data in different channel conditions with the aid of offline learning, and corresponding output data are then obtained from current input data during the online learning process. To realize super-resolution channel estimation and DOA estimation, two deep learning based algorithms are developed, in which the DOA can be estimated in the angle domain directly without additional complexity. Furthermore, simulation results corroborate that the proposed deep learning based scheme achieves better DOA estimation and channel estimation performance than conventional methods, and its robustness is verified by extensive simulations in various cases.
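The angle-domain reasoning behind the DOA branch can be made concrete without any network: on a half-wavelength uniform linear array, an FFT across the antennas maps a plane wave to a spectral peak whose bin encodes the DOA. A minimal non-learned sketch, with the antenna count, angle, and noise level as assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

M = 64                                   # ULA antennas, lambda/2 spacing
theta = 23.0                             # true DOA in degrees (assumption)
f = np.sin(np.deg2rad(theta)) / 2        # spatial frequency of the plane wave
a = np.exp(2j * np.pi * f * np.arange(M))            # steering vector
y = a + 0.05 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

spec = np.abs(np.fft.fft(y, 1024))       # zero-padded angular spectrum
f_hat = np.argmax(spec) / 1024
if f_hat > 0.5:
    f_hat -= 1.0                         # map FFT bin to [-0.5, 0.5)
theta_hat = np.rad2deg(np.arcsin(2 * f_hat))
```

The peak lands within a fraction of a degree of the true DOA; the paper's DNN learns a refined version of this angle-domain mapping together with the channel statistics.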

577 citations

Posted Content
TL;DR: In this article, a deep learning-based approach for channel estimation and signal detection in orthogonal frequency-division multiplexing (OFDM) channels is presented, which is more robust than conventional methods when fewer training pilots are used, the cyclic prefix is omitted, and nonlinear clipping noise is present.
Abstract: This article presents our initial results in deep learning for channel estimation and signal detection in orthogonal frequency-division multiplexing (OFDM). OFDM has been widely adopted in wireless broadband communications to combat frequency-selective fading in wireless channels. In this article, we take advantage of deep learning in handling wireless OFDM channels in an end-to-end approach. Different from existing OFDM receivers that first estimate CSI explicitly and then detect/recover the transmitted symbols with the estimated CSI, our deep learning based approach estimates CSI implicitly and recovers the transmitted symbols directly. To address channel distortion, a deep learning model is first trained offline using data generated from simulation based on the channel statistics and then used for recovering the online transmitted data directly. From our simulation results, the deep learning based approach is able to address channel distortions and detect the transmitted symbols with performance comparable to the minimum mean-square error (MMSE) estimator. Furthermore, the deep learning based approach is more robust than conventional methods when fewer training pilots are used, the cyclic prefix (CP) is omitted, and nonlinear clipping noise is present. In summary, deep learning is a promising tool for channel estimation and signal detection in wireless communications with complicated channel distortion and interference.

522 citations