Please report any queries concerning the funding data grouped in the sections named "Externally Awarded" or "Internally Disbursed" (shown on the profile page) to
your Research Finance Administrator. You can find your Research Finance Administrator at https://www.ucl.ac.uk/finance/research/rs-contacts.php by entering your department.
Please report any queries concerning the student data shown on the profile page to:
Email: portico-services@ucl.ac.uk
Help Desk: http://www.ucl.ac.uk/ras/portico/helpdesk
Publication Detail
Learning-Based Symbol Level Precoding: A Memory-Efficient Unsupervised Learning Approach
-
Publication Type: Journal article
-
Authors: Mohammad A, Masouros C, Andreopoulos Y
-
Keywords: eess.SP
-
Author URL:
-
Notes: 6 pages, 5 figures, Conference
Abstract
Symbol level precoding (SLP) has been proven to be an effective means of
managing interference in a multiuser downlink transmission while also
enhancing the received signal power. This paper proposes an unsupervised
learning-based SLP that applies to quantized deep neural networks (DNNs).
Rather than simply training a DNN in a supervised mode, our proposal unfolds a
power-minimization SLP formulation under an imperfect channel scenario using the
interior point method (IPM) proximal `log' barrier function. We use binary and
ternary quantizations to compress the DNN's weight values. The results show
substantial memory savings compared to the existing full-precision SLP-DNet,
with model compression of ~21x for the binary DNN-based SLP (RSLP-BDNet) and
~13x for the ternary DNN-based SLP (RSLP-TDNets).
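
The abstract does not reproduce the paper's proximal formulation, but the unfolding it describes builds on the standard interior point log-barrier relaxation, in which hard constraints are folded into the objective (sketched here generically, not as the authors' exact construction):

\min_{\mathbf{x}} f_0(\mathbf{x}) \ \text{s.t.}\ f_i(\mathbf{x}) \le 0
\quad\Longrightarrow\quad
\min_{\mathbf{x}} \; f_0(\mathbf{x}) - \frac{1}{t} \sum_i \log\bigl(-f_i(\mathbf{x})\bigr),

where t > 0 controls the sharpness of the barrier; in deep unfolding, each network layer typically mimics one iteration of the resulting optimization.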
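
The binary and ternary quantizers themselves are not spelled out in the abstract. The sketch below (NumPy; a BinaryConnect-style binarizer and a threshold-based ternarizer are assumed purely for illustration) shows how replacing 32-bit weights with sign bits or three-level codes produces compression of the order reported:

import numpy as np

def binarize(w):
    # BinaryConnect-style quantizer (assumed for illustration):
    # store one sign bit per weight plus a per-layer scale alpha.
    alpha = np.mean(np.abs(w))
    return alpha * np.sign(w)

def ternarize(w, t=0.7):
    # Ternary-weight-style quantizer (assumed for illustration):
    # weights below the magnitude threshold delta collapse to zero;
    # the rest map to {-alpha, +alpha}, i.e. two bits per weight.
    delta = t * np.mean(np.abs(w))
    mask = np.abs(w) > delta
    alpha = np.mean(np.abs(w[mask])) if mask.any() else 0.0
    return alpha * np.sign(w) * mask

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8)).astype(np.float32)  # stand-in layer weights
print(binarize(w))   # values in {-alpha, +alpha} (0 only where w == 0)
print(ternarize(w))  # values in {-alpha, 0, +alpha}

Storing 1-bit signs (or 2-bit ternary codes) plus a single floating-point scale per layer, instead of 32-bit weights, is what makes compression ratios on the order of the reported ~21x and ~13x attainable once per-layer overheads are counted.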