Multi-Head Transformer Architecture with Higher Dimensional Feature Representation for Massive MIMO CSI Feedback
- Author
- Qing Chen, Aihuang Guo, and Yaodong Cui
- Subjects
- CSI feedback, massive MIMO, deep learning, Transformer, attention mechanism
- Abstract
To achieve the anticipated performance of massive multiple input multiple output (MIMO) systems in wireless communication, it is imperative that the user equipment (UE) accurately feed the channel state information (CSI) back to the base station (BS) over the uplink. To reduce the feedback overhead, an increasing number of deep learning (DL)-based networks have emerged that compress and subsequently recover the CSI. Various novel structures have been introduced, among which the Transformer architecture has enabled a new level of precision in CSI feedback. In this paper, we propose a new method, TransNet+, which builds on the Transformer-based TransNet by updating the multi-head attention layer and adopting an improved training scheme. Simulation results demonstrate that TransNet+ outperforms existing methods in recovery accuracy and achieves state-of-the-art performance.
- Published
- 2024
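The abstract describes a Transformer-based autoencoder in which the UE compresses the CSI into a short codeword and the BS recovers it with multi-head attention. The sketch below is an illustration only, not the authors' TransNet+ implementation: it assumes the 2x32x32 real/imaginary angular-delay CSI format and a codeword length of 128 commonly used in this line of work, and all class names, layer sizes, and hyperparameters are illustrative assumptions.

```python
# Hedged sketch of a Transformer-based CSI feedback autoencoder (PyTorch).
# Assumptions: 2x32x32 angular-delay CSI, codeword length 128, 2 encoder/decoder layers.
import torch
import torch.nn as nn


class TransformerCsiAutoencoder(nn.Module):
    def __init__(self, n_ant=32, n_delay=32, d_model=64, n_heads=8, codeword_len=128):
        super().__init__()
        # Treat each of the n_ant angular rows as a token; real+imag parts give 2*n_delay features.
        self.embed = nn.Linear(2 * n_delay, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               dim_feedforward=4 * d_model,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.compress = nn.Linear(n_ant * d_model, codeword_len)   # UE-side codeword

        self.expand = nn.Linear(codeword_len, n_ant * d_model)     # BS-side expansion
        dec_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               dim_feedforward=4 * d_model,
                                               batch_first=True)
        self.decoder = nn.TransformerEncoder(dec_layer, num_layers=2)
        self.out = nn.Linear(d_model, 2 * n_delay)
        self.n_ant, self.d_model = n_ant, d_model

    def forward(self, h):
        # h: (batch, 2, n_ant, n_delay), real/imag CSI in the angular-delay domain
        b = h.size(0)
        tokens = h.permute(0, 2, 1, 3).reshape(b, self.n_ant, -1)   # (b, n_ant, 2*n_delay)
        z = self.encoder(self.embed(tokens))                        # multi-head self-attention
        code = self.compress(z.reshape(b, -1))                      # compressed feedback codeword
        z_hat = self.expand(code).reshape(b, self.n_ant, self.d_model)
        h_hat = self.out(self.decoder(z_hat))                       # (b, n_ant, 2*n_delay)
        return h_hat.reshape(b, self.n_ant, 2, -1).permute(0, 2, 1, 3)


if __name__ == "__main__":
    model = TransformerCsiAutoencoder()
    csi = torch.randn(4, 2, 32, 32)
    recovered = model(csi)
    print(recovered.shape)  # torch.Size([4, 2, 32, 32])
```

In this kind of scheme, only the codeword produced by the encoder is fed back over the uplink; the decoder at the BS reconstructs the full CSI matrix, and recovery accuracy is typically reported as normalized mean squared error between the input and the reconstruction.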