Publication Detail
An Unsupervised Learning-Based Approach for Symbol-Level-Precoding
Publication Type: Conference
Authors: Mohammad A, Masouros C, Andreopoulos Y
Publisher: IEEE
Publication date: 11/12/2021
Published proceedings: Proceedings of the 2021 IEEE Global Communications Conference (GLOBECOM)
Name of conference: 2021 IEEE Global Communications Conference (GLOBECOM)
Conference place: Madrid, Spain
Conference start date: 07/12/2021
Conference finish date: 12/12/2021
Keywords: eess.SP
Author URL:
Notes: 6 pages, 2 figures, GLOBECOM 2021 Conference
Abstract
This paper proposes an unsupervised learning-based precoding framework that trains deep neural networks (DNNs) with no target labels by unfolding an interior point method (IPM) proximal `log' barrier function. The proximal `log' barrier function is derived from the strict power-minimization formulation subject to signal-to-interference-plus-noise ratio (SINR) constraints. The proposed scheme exploits the known interference via symbol-level precoding (SLP) to minimize the transmit power and is named the strict Symbol-Level-Precoding deep network (SLP-SDNet). The results show that SLP-SDNet outperforms the conventional block-level-precoding (Conventional BLP) scheme while achieving near-optimal performance faster than the SLP optimization-based approach.
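For context, a `strict' SLP power-minimization problem of the kind the abstract refers to is commonly written, for an M-PSK symbol with phase \(\phi_k\) intended for user \(k\) with downlink channel \(\mathbf{h}_k\), SINR target \(\gamma_k\) and noise power \(\sigma^2\), roughly as below. This is a hedged sketch based on the general SLP literature, not the paper's exact formulation, and all symbols (\(\mathbf{x}\), \(\mathbf{h}_k\), \(\phi_k\), \(\gamma_k\), \(\sigma^2\), \(t\), \(\theta\)) are introduced here purely for illustration.

\[
\min_{\mathbf{x}\in\mathbb{C}^{N}} \; \|\mathbf{x}\|^{2}
\quad \text{s.t.} \quad
\Re\{\mathbf{h}_k^{H}\mathbf{x}\,e^{-j\phi_k}\} \ge \sqrt{\gamma_k\,\sigma^{2}},
\qquad
\Im\{\mathbf{h}_k^{H}\mathbf{x}\,e^{-j\phi_k}\} = 0,
\qquad k = 1,\dots,K .
\]

An unsupervised log-barrier surrogate loss of the general form the abstract describes, with DNN output \(\mathbf{x}_{\theta}\) and barrier parameter \(t > 0\), would then penalize inequality-constraint violations directly in training instead of using labelled targets:

\[
\mathcal{L}(\theta) \;=\; \|\mathbf{x}_{\theta}\|^{2}
\;-\; \frac{1}{t}\sum_{k=1}^{K}
\log\!\Big(\Re\{\mathbf{h}_k^{H}\mathbf{x}_{\theta}\,e^{-j\phi_k}\} - \sqrt{\gamma_k\,\sigma^{2}}\Big).
\]

The equality (phase-alignment) constraints would need separate handling, for example by constructing the network output to satisfy them; how the paper itself unfolds the IPM and treats these constraints is not stated on this page.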