
CrossMPT: Cross-attention Message-Passing Transformer for Error Correcting Codes

Authors :
Park, Seong-Joon
Kwak, Hee-Youl
Kim, Sang-Hyo
Kim, Yongjune
No, Jong-Seon
Publication Year :
2024

Abstract

Error correcting codes (ECCs) are indispensable for reliable transmission in communication systems. Recent advances in deep learning have catalyzed the exploration of neural network-based ECC decoders. Among these, transformer-based neural decoders have achieved state-of-the-art decoding performance. In this paper, we propose a novel Cross-attention Message-Passing Transformer (CrossMPT). CrossMPT iteratively updates two types of input vectors (i.e., magnitude and syndrome vectors) using two masked cross-attention blocks. The mask matrices in these cross-attention blocks are determined by the code's parity-check matrix, which delineates the relationship between the magnitude and syndrome vectors. Our experimental results show that CrossMPT significantly outperforms existing neural network-based decoders, particularly in decoding low-density parity-check codes. Notably, CrossMPT also reduces computational complexity substantially, cutting the complexity of its attention layers by over 50% compared to the original transformer-based decoder while retaining the computational complexity of the remaining layers.

Comment: 13 pages
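To make the masking idea concrete, below is a minimal, hedged sketch (not the authors' implementation) of one CrossMPT-style cross-attention iteration in PyTorch. The toy (7,4) Hamming parity-check matrix H, the embedding width d_model, the head count, and all variable names are illustrative assumptions; the paper's actual architecture, dimensions, and training setup may differ.

import torch
import torch.nn as nn

# Toy (7,4) Hamming parity-check matrix: 3 syndrome positions x 7 magnitude positions.
H = torch.tensor([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]], dtype=torch.bool)
m, n = H.shape
d_model, n_heads = 32, 4  # illustrative embedding width and head count

# For nn.MultiheadAttention, a boolean attn_mask entry of True BLOCKS attention,
# so each side may attend only across edges present in the parity-check matrix.
mask_syn_to_mag = ~H    # (m, n): syndrome queries attend to connected magnitude keys
mask_mag_to_syn = ~H.T  # (n, m): magnitude queries attend to connected syndrome keys

# Two masked cross-attention blocks, one per update direction.
attn_syn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
attn_mag = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

mag = torch.randn(1, n, d_model)  # stand-in for learned magnitude embeddings
syn = torch.randn(1, m, d_model)  # stand-in for learned syndrome embeddings

# One iteration: each representation is updated by cross-attending to the other.
syn, _ = attn_syn(syn, mag, mag, attn_mask=mask_syn_to_mag)
mag, _ = attn_mag(mag, syn, syn, attn_mask=mask_mag_to_syn)
print(mag.shape, syn.shape)  # torch.Size([1, 7, 32]) torch.Size([1, 3, 32])

In the full model one would expect these blocks to be interleaved with feed-forward layers, repeated for several iterations, and followed by a projection of the magnitude representation to bit-level outputs; those details are omitted from this sketch.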

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2405.01033
Document Type :
Working Paper