
Ye Li

Researcher at Nantong University

Publications -  39
Citations -  346

Ye Li is an academic researcher from Nantong University. The author has contributed to research on topics including linear network coding and network packets. The author has an h-index of 7, and has co-authored 33 publications receiving 200 citations. Previous affiliations of Ye Li include Queen's University.

Papers
Journal ArticleDOI

Wireless Channel Models for Maritime Communications

TL;DR: Sparsity and location dependence are the most important and distinctive characteristics of maritime wireless channels; both are highlighted in a thorough review of existing modeling approaches and measurement campaigns.
Journal ArticleDOI

Efficient Coastal Communications with Sparse Network Coding

TL;DR: It is demonstrated through simulations that an appropriate choice of sparse codes is critical to meeting the unique requirements of coastal communication systems: batched sparse codes are suitable for relay-aided multicast, while subset-based sparse codes are preferable for D2D-enabled multicast.
Journal ArticleDOI

Wireless Information and Power Transfer in Secure Massive MIMO Downlink With Phase Noise

TL;DR: The effect of phase noise on downlink SWIPT in secure massive MIMO systems is studied; phase noise degrades the accuracy of the channel state information and in turn causes potential information leakage.
Journal ArticleDOI

Location-Based MIMO-NOMA: Multiple Access Regions and Low-Complexity User Pairing

TL;DR: Numerical results confirm the accuracy of the derived region boundaries, and simulations show that the proposed user pairing algorithm effectively improves the resource utilization rate compared to conventional multiple access and user pairing schemes.
Journal ArticleDOI

A Low-Complexity Coded Transmission Scheme Over Finite-Buffer Relay Links

TL;DR: This paper proposes a low-complexity coding scheme in which packets are encoded from sequentially formed random subsets of source packets, called batches. The scheme achieves higher effective end-to-end rates than RLNC when the cost of delivering coding coefficients is accounted for, and its sparseness also yields much lower decoding complexity.
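The core idea of batch-based encoding can be illustrated with a minimal sketch: pick a random subset (batch) of source packets, then emit coded packets as random linear combinations of that batch. The sketch below works over GF(2) (XOR combining) for simplicity; the paper's actual code construction, field size, and batch-formation rule may differ, and all function and parameter names here are illustrative.

```python
import random

def encode_batch(source_packets, batch_size, num_coded, rng):
    """Encode `num_coded` packets from one randomly chosen batch.

    source_packets: list of equal-length bytes objects.
    batch_size:     number of source packets per batch (the sparsity knob).
    num_coded:      coded packets to generate from this batch.
    rng:            a random.Random instance, for reproducibility.

    Returns a list of (batch_indices, coefficients, coded_packet) tuples;
    the indices and GF(2) coefficients stand in for the coding-coefficient
    overhead that must be delivered alongside each coded packet.
    """
    pkt_len = len(source_packets[0])
    # Sequentially formed random subset of source packets: the batch.
    batch = rng.sample(range(len(source_packets)), batch_size)
    coded = []
    for _ in range(num_coded):
        # Random GF(2) coefficient for each batch member.
        coeffs = [rng.randint(0, 1) for _ in range(batch_size)]
        if not any(coeffs):
            # Avoid the all-zero (useless) combination.
            coeffs[rng.randrange(batch_size)] = 1
        pkt = bytes(pkt_len)
        for idx, c in zip(batch, coeffs):
            if c:
                # GF(2) addition is byte-wise XOR.
                pkt = bytes(a ^ b for a, b in zip(pkt, source_packets[idx]))
        coded.append((batch, coeffs, pkt))
    return coded
```

Because each coded packet mixes only `batch_size` source packets rather than all of them, the coefficient vector that must travel with the packet is short and decoding stays sparse, which is the trade-off the paper exploits against full RLNC.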