# Rate-reliability-complexity tradeoff for ML and lattice decoding of full-rate codes

TL;DR: This work proves that ML and (MMSE-preprocessed) lattice decoding share the same complexity exponent in a very broad setting, one that now includes almost any DMT-optimal code and all decoding-order policies.

Abstract: Recent work in [1]-[3] quantified, in the form of a complexity exponent, the computational resources required by ML and lattice sphere decoding to achieve a given diversity-multiplexing performance. For a specific family of layered lattice designs and a specific set of decoding orderings, this complexity was shown to be exponential in the number of codeword bits and to meet a universal upper bound on complexity exponents. Those results raised the question of whether complexity reductions below the universal upper bound are feasible, for example through a proper choice of decoder (ML vs. lattice), or through a proper choice of lattice codes and decoding-ordering policies. The current work addresses this question by first showing that for almost any full-rate DMT-optimal lattice code, no decoding-ordering policy can reduce the complexity exponent of ML or lattice-based sphere decoding below the universal upper bound; that is, a lattice code drawn uniformly at random from an ensemble of DMT-optimal lattice designs will almost surely be such that no decoding-ordering policy provides exponential complexity reductions below the universal upper bound. As a byproduct, the current work proves that ML and (MMSE-preprocessed) lattice decoding share the same complexity exponent in a very broad setting, which now includes almost any (randomly drawn) DMT-optimal code and all decoding-ordering policies. Under a basic richness-of-codes assumption, this is further extended to hold, with probability one, over all full-rate codes. Under the same assumption, the result yields a meaningful rate-reliability-complexity tradeoff that holds almost surely in the random choice of the full-rate lattice design, irrespective of the decoding-ordering policy. This tradeoff can be used, for example, to describe the optimal achievable diversity gain of ML or lattice sphere decoding in the presence of limited computational resources.


##### Citations

2 citations


##### References


### "Rate-reliability-complexity tradeof..." refers background in this paper

...I. INTRODUCTION In MIMO systems, the prohibitively large computational costs of maximum likelihood (ML) decoding serve as motivation to consider different branch-and-bound algorithms [4]–[6] which can provide computational savings at the expense of a relatively small error-performance degradation....

[...]

...Combining (3) and (4) yields the equivalent system model
$$\mathbf{y} = \mathbf{M}\mathbf{s} + \mathbf{w}, \tag{5a}$$
where
$$\mathbf{M} \triangleq \rho^{\frac{1}{2} - \frac{rT}{\kappa}}\,\mathbf{H}\mathbf{G} \in \mathbb{R}^{2 n_R T \times \kappa}. \tag{5b}$$...

[...]
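To make the object whose cost the paper analyzes concrete, here is a minimal sketch of a depth-first sphere decoder over the real-valued model $\mathbf{y} = \mathbf{M}\mathbf{s} + \mathbf{w}$ of (5a). This is a generic illustration, not the paper's specific algorithm: the function name, the finite symbol alphabet `symbols`, and the pruning radius are assumptions for the sketch. The number of lattice points visited by exactly this kind of branch-and-bound search is what the complexity exponent captures at high SNR.

```python
import numpy as np

def sphere_decode(M, y, symbols, radius=np.inf):
    """Depth-first sphere decoder for y = M s + w (illustrative sketch).

    Finds s with entries in `symbols` minimizing ||y - M s||^2, pruning
    any branch whose partial metric already exceeds the best metric found.
    """
    # QR-decompose M so the metric ||y - M s||^2 = ||Q^T y - R s||^2
    # decomposes level by level over the upper-triangular R.
    Q, R = np.linalg.qr(M)
    z = Q.T @ y
    k = M.shape[1]
    best, best_metric = None, float(radius) ** 2

    def search(level, s_partial, metric):
        nonlocal best, best_metric
        if metric >= best_metric:          # prune: node lies outside the sphere
            return
        if level < 0:                      # full-length candidate reached
            best, best_metric = s_partial.copy(), metric
            return
        for sym in symbols:
            s_partial[level] = sym
            # interference from symbols already fixed at higher levels
            interf = R[level, level + 1:] @ s_partial[level + 1:]
            inc = (z[level] - R[level, level] * sym - interf) ** 2
            search(level - 1, s_partial, metric + inc)

    search(k - 1, np.zeros(k), 0.0)
    return best
```

In the noiseless case the decoder recovers the transmitted vector exactly; with noise, the count of nodes the recursion visits before termination is the run-time quantity that the complexity exponent bounds.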




