Topic
Codebook
About: Codebook is a research topic. Over the lifetime, 8492 publications have been published within this topic receiving 115995 citations.
Papers published on a yearly basis
Papers
TL;DR: This paper develops and analyzes three limited-feedback resource allocation algorithms suitable for uplink transmission in heterogeneous wireless networks (HetNets) and reveals that the Lloyd algorithm can offer performance close to the perfect-CSI case (i.e., unquantized feedback).
Abstract: In this paper, we develop and analyze three limited-feedback resource allocation algorithms suitable for uplink transmission in heterogeneous wireless networks (HetNets). In this setup, one macro-cell shares the spectrum with a set of underlay cognitive small-cells via orthogonal frequency-division multiple access (OFDMA), where the interference from small-cells to the macro-cell must be kept below a predefined threshold. The resource allocation algorithms aim to maximize the weighted sum of instantaneous rates of all users over all cells by jointly optimizing power and subcarrier allocation under power constraints. Since the HetNet backhaul capacity is limited in practice, reducing the amount of channel state information (CSI) feedback signaling passed over the backhaul links is highly desirable. To reach this goal, we apply the Lloyd algorithm to develop a limited-feedback two-phase resource allocation scheme. In the first, offline phase, an optimal codebook for power and subcarrier allocation is designed and sent to all nodes. In the second, online phase, based on channel realizations, the appropriate codeword of the designed codebook is chosen for the transmission parameters, and the macro-cell sends only the codeword index, represented by a limited number of bits, for subcarrier and power allocation to its own users and to the small-cells. Each small-cell then informs its own users of their related codewords. The offline phase involves a mixed-integer nonconvex resource allocation problem with high computational complexity. To solve it efficiently, we apply the general iterative successive convex approximation (SCA) approach, where the nonconvex optimization problem is transformed into an approximated convex optimization problem in each iteration. The simulation results reveal that the Lloyd algorithm can offer performance close to the perfect-CSI case (i.e., unquantized feedback).
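The offline codebook design above is built on the Lloyd algorithm. A minimal sketch of the generic Lloyd iteration (nearest-neighbour partition followed by centroid update) on training vectors is shown below; this illustrates the quantizer-design idea only, not the paper's joint power/subcarrier formulation:

```python
import numpy as np

def lloyd_codebook(samples, num_codewords, iters=50):
    """Design a codebook from training vectors via the Lloyd algorithm:
    alternate a nearest-neighbour partition with a centroid update."""
    # deterministic init: spread initial codewords along the first coordinate
    order = np.argsort(samples[:, 0])
    idx = order[np.linspace(0, len(samples) - 1, num_codewords).astype(int)]
    codebook = samples[idx].copy()
    for _ in range(iters):
        # partition step: assign each sample to its nearest codeword
        d = np.linalg.norm(samples[:, None, :] - codebook[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # centroid step: move each codeword to the mean of its cell
        for k in range(num_codewords):
            cell = samples[labels == k]
            if len(cell):
                codebook[k] = cell.mean(axis=0)
    return codebook, labels
```

In the online phase only the index of the chosen codeword needs to be fed back, i.e. log2(num_codewords) bits per allocation decision.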
86 citations
TL;DR: The simulation and analytical results show that the presented SCMA codebook outperforms existing codebooks and the low-density signature, and that the proposed design is more efficient for SCMA codebooks with large size and/or high dimension.
Abstract: In this paper, a novel codebook design method for sparse code multiple access (SCMA) is proposed and an analytical framework to evaluate the bit error rate (BER) performance is developed. In particular, to meet the codebook design criteria based on the pairwise error probability, a novel codebook with large minimum Euclidean distance is designed, employing star quadrature amplitude modulation (star-QAM) signal constellations. In addition, with the aid of the phase distribution of the presented SCMA constellations, the BER of the SCMA system over the downlink Rayleigh fading channel is obtained in closed form. The simulation and analytical results show that the presented SCMA codebook outperforms existing codebooks and the low-density signature, and that the proposed design is more efficient for SCMA codebooks with large size and/or high dimension. Moreover, the derived theoretical BER results match the simulation results well, especially in the high signal-to-noise ratio regime.
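The star-QAM construction and the minimum-Euclidean-distance criterion can be illustrated directly. A hedged sketch follows; the ring counts, radii, and phase offsets below are illustrative choices, not the paper's optimised values:

```python
import numpy as np
from itertools import combinations

def star_qam(points_per_ring, ring_radii, phase_offsets):
    """Build a star-QAM constellation: concentric rings of equally
    spaced phases, each ring optionally rotated by a phase offset."""
    points = []
    for n, r, phi in zip(points_per_ring, ring_radii, phase_offsets):
        angles = phi + 2 * np.pi * np.arange(n) / n
        points.extend(r * np.exp(1j * angles))
    return np.array(points)

def min_euclidean_distance(constellation):
    """Minimum pairwise Euclidean distance -- the quantity the
    codebook design above seeks to maximise."""
    return min(abs(a - b) for a, b in combinations(constellation, 2))
```

Comparing `min_euclidean_distance` across candidate ring radii and offsets is one way to rank constellation choices under the pairwise-error-probability criterion.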
85 citations
TL;DR: A fast algorithm for vector quantising image data is proposed and shown to be effective.
Abstract: Encoding in VQ-based image coding requires a full codebook search for each input vector to find the best-matched codeword, which is a time-consuming process. A fast algorithm for vector quantising image data is proposed and shown to be effective.
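The abstract does not detail the paper's specific speed-up, but one common family of fast VQ searches is partial-distance elimination: abandon a codeword as soon as its accumulated distortion exceeds the best found so far. A sketch of that generic idea:

```python
def nearest_codeword(vector, codebook):
    """Full-search VQ encoding with partial-distance elimination:
    stop accumulating a codeword's squared error as soon as it
    exceeds the best distortion seen, since it cannot win."""
    best_idx, best_dist = 0, float("inf")
    for i, cw in enumerate(codebook):
        dist = 0.0
        for v, c in zip(vector, cw):
            dist += (v - c) ** 2
            if dist >= best_dist:   # early exit for this codeword
                break
        else:
            best_idx, best_dist = i, dist
    return best_idx, best_dist
```

The early exit preserves the exact nearest-neighbour result while skipping most of the arithmetic for clearly losing codewords.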
85 citations
TL;DR: Simulation results show that the proposed algorithm achieves significantly better performance than the conventional DS-CDMA (C- CDMA) systems and the existing chip-interleaving, linear precoding and adaptive spreading techniques.
Abstract: In this paper, we propose a novel switched-interleaving algorithm based on limited feedback for both uplink and downlink DS-CDMA systems. The proposed switched chip-interleaving DS-CDMA scheme requires cooperation between the transmitter and the receiver, and a feedback channel that sends the index of the interleaver to be used. The transmit chip-interleaver is chosen by the receiver from a codebook of interleaving matrices known to both the receiver and the transmitter, and the codebook index is sent back using a limited number of bits. In order to design the codebook, we consider a number of different chip patterns by using random interleavers, block interleavers, and a proposed frequently selected patterns (FSP) method. The best interleaving patterns are chosen by selection functions of the received signal-to-interference-plus-noise ratio (SINR) for both downlink and uplink systems. Since the selection function needs to determine the best interleaver based on the channel state information, it is necessary to reliably predict the channel state information for typical delay values. We present symbol-based and block-based linear minimum mean squared error (MMSE) receivers for interference suppression. Simulation results show that our proposed algorithm achieves significantly better performance than conventional DS-CDMA (C-CDMA) systems and the existing chip-interleaving, linear precoding, and adaptive spreading techniques.
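The receiver-side selection loop above can be sketched compactly. The SINR estimator is left as a caller-supplied function, since the paper's MMSE-based selection metric is not reproduced here; the random-interleaver codebook is one of the construction methods the abstract mentions:

```python
import numpy as np

def random_interleaver_codebook(num_patterns, length, seed=0):
    """Codebook of random chip-interleaving permutations, shared
    in advance by transmitter and receiver."""
    rng = np.random.default_rng(seed)
    return [rng.permutation(length) for _ in range(num_patterns)]

def select_interleaver(codebook, sinr_of):
    """Evaluate each interleaving pattern with a caller-supplied SINR
    estimate and return the winning index, which is all the receiver
    feeds back (log2(len(codebook)) bits)."""
    sinrs = [sinr_of(perm) for perm in codebook]
    return int(np.argmax(sinrs))
```

With a 4-entry codebook, the feedback cost is just 2 bits per selection interval, which is the point of the limited-feedback design.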
85 citations
25 Mar 2003
TL;DR: A locally adaptive partitioning algorithm is introduced that performs comparably in this application to a more expensive globally optimal one that employs dynamic programming.
Abstract: High dimensional source vectors, such as those that occur in hyperspectral imagery, are partitioned into a number of subvectors of different length and then each subvector is vector quantized (VQ) individually with an appropriate codebook. A locally adaptive partitioning algorithm is introduced that performs comparably in this application to a more expensive globally optimal one that employs dynamic programming. The VQ indices are entropy coded and used to condition the lossless or near-lossless coding of the residual error. Motivated by the need for maintaining uniform quality across all vector components, a percentage maximum absolute error distortion measure is employed. Experiments on the lossless and near-lossless compression of NASA AVIRIS images are presented. A key advantage of the approach is the use of independent small VQ codebooks that allow fast encoding and decoding.
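The partition-then-quantize step can be sketched as follows. The partition is shown fixed here for simplicity (the paper adapts it locally), and the boundaries and codebooks are illustrative:

```python
import numpy as np

def partition_and_quantize(vector, boundaries, codebooks):
    """Split a high-dimensional vector at the given boundaries and
    encode each subvector with its own small codebook, returning one
    VQ index per subvector."""
    indices = []
    pieces = np.split(vector, boundaries)
    for piece, cb in zip(pieces, codebooks):
        # nearest codeword in this subvector's own small codebook
        d = np.linalg.norm(cb - piece, axis=1)
        indices.append(int(d.argmin()))
    return indices
```

Keeping the per-subvector codebooks small is what makes both encoding and decoding fast, as the abstract notes; the indices can then be entropy coded and used to condition the residual coder.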
85 citations