TranSFormer: Slow-Fast Transformer for Machine Translation

Authors:
Li, Bei
Jing, Yi
Tan, Xu
Xing, Zhen
Xiao, Tong
Zhu, Jingbo
Publication Year:
2023

Abstract

Learning multiscale Transformer models has been shown to be a viable approach to improving machine translation systems. Prior research has primarily focused on treating subwords as the basic units when developing such systems. However, the incorporation of fine-grained character-level features into multiscale Transformers has not yet been explored. In this work, we present a Slow-Fast two-stream learning model, referred to as TranSFormer, which uses a "slow" branch to process subword sequences and a "fast" branch to process the longer character sequences. The model is efficient: the fast branch is kept lightweight by reducing its model width, yet it still provides useful fine-grained features to the slow branch. TranSFormer shows consistent BLEU improvements (larger than 1 BLEU point) on several machine translation benchmarks.

Comment: Accepted by Findings of ACL 2023
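The abstract gives only a high-level picture of the two-stream design, so the following PyTorch sketch is an illustrative assumption rather than the paper's actual implementation: the branch widths (d_slow, d_fast), the head counts, and the cross-attention fusion from the character-level fast branch into the subword-level slow branch are guesses consistent with the description above; layer norms and feed-forward sublayers are omitted for brevity.

# Minimal sketch of a slow-fast two-branch encoder layer, assuming a
# cross-attention fusion step (the paper may fuse the branches differently).
import torch
import torch.nn as nn

class SlowFastEncoderLayer(nn.Module):
    def __init__(self, d_slow=512, d_fast=128, n_heads=8):
        super().__init__()
        # "Slow" branch: standard-width self-attention over subword tokens.
        self.slow_attn = nn.MultiheadAttention(d_slow, n_heads, batch_first=True)
        # "Fast" branch: narrow self-attention over the longer character sequence,
        # kept lightweight by the reduced model width d_fast.
        self.fast_attn = nn.MultiheadAttention(d_fast, n_heads // 2, batch_first=True)
        # Project character-level features up to d_slow so the slow branch
        # can attend to them (assumed fusion mechanism).
        self.fast_to_slow = nn.Linear(d_fast, d_slow)
        self.fuse_attn = nn.MultiheadAttention(d_slow, n_heads, batch_first=True)

    def forward(self, subword_x, char_x):
        # subword_x: (B, T_sub, d_slow); char_x: (B, T_char, d_fast), T_char > T_sub
        slow, _ = self.slow_attn(subword_x, subword_x, subword_x)
        fast, _ = self.fast_attn(char_x, char_x, char_x)
        # The slow branch queries the character-level features for
        # fine-grained cues, then adds them residually.
        fine = self.fast_to_slow(fast)
        fused, _ = self.fuse_attn(slow, fine, fine)
        return slow + fused, fast

if __name__ == "__main__":
    layer = SlowFastEncoderLayer()
    sub = torch.randn(2, 10, 512)   # 10 subword positions
    ch = torch.randn(2, 40, 128)    # 40 character positions
    out, _ = layer(sub, ch)
    print(out.shape)                # torch.Size([2, 10, 512])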

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2305.16982
Document Type:
Working Paper