
Mazen O. Hasna

Researcher at Qatar University

Publications -  280
Citations -  8307

Mazen O. Hasna is an academic researcher from Qatar University. The author has contributed to research in the topics of Relay and Spectral efficiency. The author has an h-index of 32 and has co-authored 257 publications receiving 7,295 citations. Previous affiliations of Mazen O. Hasna include Qatar Airways and the Polytechnic University of Turin.

Papers
Proceedings ArticleDOI

A minorization-maximization algorithm for AN-based MIMOME secrecy rate maximization

TL;DR: A minorization-maximization algorithm is developed for secrecy rate maximization in a multiple-input multiple-output multiple-eavesdropper (MIMOME) wiretap channel; the algorithm is iterative in nature and guaranteed to converge to a locally optimal solution.
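To make the iteration structure concrete, below is a minimal, generic sketch of a minorization-maximization (MM) loop. The toy 1-D objective, the Lipschitz constant L, and the quadratic surrogate are illustrative assumptions chosen only to show the "maximize a tight lower-bound surrogate, then re-tighten" pattern; they are not the paper's AN-based MIMOME secrecy-rate objective or its actual surrogate.

```python
# Generic MM loop: each step maximizes a surrogate that minorizes f and
# touches it at the current iterate, so f is non-decreasing across iterations.
import numpy as np

def mm_maximize(f, surrogate_argmax, x0, tol=1e-8, max_iter=200):
    """Iterate until the objective improvement falls below tol."""
    x = x0
    for _ in range(max_iter):
        x_new = surrogate_argmax(x)
        if abs(f(x_new) - f(x)) < tol:
            return x_new
        x = x_new
    return x

# Toy 1-D objective (placeholder, NOT the secrecy-rate objective): f(x) = log(1+x) - 0.1 x^2, x >= 0
f = lambda x: np.log1p(x) - 0.1 * x**2
fprime = lambda x: 1.0 / (1.0 + x) - 0.2 * x
L = 1.2  # bound on |f''| for x >= 0, so the quadratic surrogate below minorizes f

# The minorizer g(x | x_t) = f(x_t) + f'(x_t)(x - x_t) - (L/2)(x - x_t)^2
# is tight at x_t and maximized at x_t + f'(x_t)/L, which is the MM update here.
surrogate_argmax = lambda x: x + fprime(x) / L

print(mm_maximize(f, surrogate_argmax, x0=0.0))  # converges near x ~ 1.79
```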
Proceedings ArticleDOI

Novel cooperative policy for cognitive radio networks: Stability region and delay analysis

TL;DR: This work shows that the proposed scheme significantly improves the performance of the secondary user and increases the maximum stable throughput of the primary user, compared to traditional cooperative policies that restrict the secondary user to exploiting only the periods of silence of the primary user.
Proceedings ArticleDOI

A signal combining technique based on channel shortening for cooperative sensor networks

TL;DR: This paper considers a scenario where the source node transmits its signal to the destination through multiple relays in an uncoordinated fashion, and develops a novel signal combining technique based on channel shortening which outperforms the selection combining scheme.
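As a rough illustration of what "channel shortening" means in this setting, the sketch below designs a max-SSNR shortening filter (the classic generalized-eigenvector formulation) for a long effective channel created by uncoordinated, delayed relay copies. The channel taps, window length, and delay are made-up values, and this is not claimed to be the exact combining rule developed in the paper.

```python
# Max-SSNR channel shortening: choose w so that conv(h, w) concentrates its
# energy in a short window (maximize in-window vs out-of-window energy).
import numpy as np
from scipy.linalg import eigh, toeplitz

def shortening_filter(h, filt_len, win_len, delay):
    # Convolution matrix H of shape (len(h)+filt_len-1, filt_len): H @ w = conv(h, w)
    H = toeplitz(np.r_[h, np.zeros(filt_len - 1)], np.r_[h[0], np.zeros(filt_len - 1)])
    inside = np.zeros(H.shape[0], dtype=bool)
    inside[delay:delay + win_len] = True
    A = H[inside].T @ H[inside]        # energy inside the target window
    B = H[~inside].T @ H[~inside]      # energy outside the window ("wall")
    # Dominant generalized eigenvector of (A, B) maximizes w^T A w / w^T B w
    vals, vecs = eigh(A, B + 1e-9 * np.eye(filt_len))
    return vecs[:, -1]

# Toy long channel seen at the destination when several relays add delayed copies
h = np.array([0.2, 1.0, 0.7, 0.0, 0.4, 0.3, 0.1])
w = shortening_filter(h, filt_len=8, win_len=3, delay=2)
print(np.round(np.convolve(h, w), 3))  # energy is concentrated in taps 2-4
```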
Journal ArticleDOI

Joint Optimization of Area Spectral Efficiency and Delay Over PPP Interfered Ad-Hoc Networks

TL;DR: A utility function U = ASE/delay is introduced, and the optimal ALOHA transmission probability p and SIR threshold τ that jointly maximize the ASE and minimize the local delay are derived.
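The trade-off captured by U = ASE/delay can be illustrated with a simple numerical search over (p, τ). The success-probability and local-delay expressions below are simplified textbook-style assumptions (bipolar PPP, Rayleigh fading, interferers independently re-drawn each slot), and the density, link distance, and path-loss exponent are made-up values; they are not the exact expressions or optima derived in the paper.

```python
# Grid search for the utility U = ASE / delay over ALOHA probability p and SIR threshold tau.
import numpy as np
from scipy.special import gamma

lam, r, alpha = 0.1, 1.0, 4.0          # assumed density, link distance, path-loss exponent
delta = 2.0 / alpha
C = np.pi * gamma(1 + delta) * gamma(1 - delta) * r**2

def success_prob(p, tau):              # simplified Rayleigh/PPP outage model (assumption)
    return np.exp(-lam * p * C * tau**delta)

def ase(p, tau):                       # area spectral efficiency (bps/Hz per unit area)
    return lam * p * np.log2(1.0 + tau) * success_prob(p, tau)

def local_delay(p, tau):               # mean slots until a success (geometric model)
    return 1.0 / (p * success_prob(p, tau))

ps = np.linspace(0.01, 1.0, 100)
taus = np.linspace(0.1, 20.0, 200)
P, T = np.meshgrid(ps, taus, indexing="ij")
U = ase(P, T) / local_delay(P, T)
i, j = np.unravel_index(np.argmax(U), U.shape)
print(f"best p ~ {ps[i]:.2f}, best tau ~ {taus[j]:.2f}, U ~ {U[i, j]:.4f}")
```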
Proceedings ArticleDOI

Centralized-decentralized RB allocation based on genetic algorithm and coordination over X2 interface in LTE uplink

TL;DR: A method for resource block allocation in the LTE uplink is proposed that aims to minimize inter-cell interference and maximize channel capacity; a genetic algorithm is used to obtain a solution in reasonable time.
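To show the flavour of GA-based RB allocation, here is a toy single-cell sketch. The chromosome encoding (one user index per RB), the random rate and interference-penalty tables, and the selection/crossover/mutation operators are illustrative assumptions; they are not the paper's formulation and omit the X2-based inter-cell coordination step.

```python
# Toy genetic algorithm for assigning each resource block (RB) to one user.
import numpy as np

rng = np.random.default_rng(0)
N_RB, N_UE = 12, 4                          # RBs and users per cell (assumed)
rate = rng.uniform(0.5, 3.0, (N_UE, N_RB))  # assumed per-(UE, RB) achievable rates
penalty = rng.uniform(0.0, 1.0, N_RB)       # assumed interference cost of using each RB

def fitness(chrom):
    """chrom[k] = index of the UE assigned to RB k; reward rate, penalize interference."""
    return sum(rate[chrom[k], k] - penalty[k] for k in range(N_RB))

def crossover(a, b):
    cut = rng.integers(1, N_RB)             # single-point crossover
    return np.r_[a[:cut], b[cut:]]

def mutate(chrom, p=0.1):
    m = chrom.copy()
    for k in range(N_RB):
        if rng.random() < p:
            m[k] = rng.integers(N_UE)        # reassign this RB to a random UE
    return m

pop = [rng.integers(N_UE, size=N_RB) for _ in range(30)]
for _ in range(100):                         # evolve for a fixed number of generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                       # simple truncation selection
    children = [mutate(crossover(parents[rng.integers(10)], parents[rng.integers(10)]))
                for _ in range(20)]
    pop = parents + children

best = max(pop, key=fitness)
print("best RB-to-UE assignment:", best, "fitness:", round(fitness(best), 2))
```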