
PhraseAttn: Dynamic Slot Capsule Networks for phrase representation in Neural Machine Translation

Authors :
Binh Nguyen
Long H.B. Nguyen
Dien Dinh
Binh Van Le
Source :
Journal of Intelligent & Fuzzy Systems. 42:3871-3878
Publication Year :
2022
Publisher :
IOS Press, 2022.

Abstract

Word representation plays a vital role in most Natural Language Processing systems, especially in Neural Machine Translation. It tends to capture the semantics of, and similarity between, individual words well, but struggles to represent the meaning of phrases or multi-word expressions. In this paper, we investigate a method to generate and use phrase information in a translation model. To generate phrase representations, a Primary Phrase Capsule network is first employed, and its output is then iteratively refined with a Slot Attention mechanism. Experiments on the IWSLT English-to-Vietnamese, English-to-French, and English-to-German datasets show that our proposed method consistently outperforms the baseline Transformer and attains competitive results against the scaled Transformer with half as many parameters.
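The iterative refinement step the abstract refers to follows the general Slot Attention recipe (Locatello et al., 2020): a fixed set of slots competes for input tokens via a softmax over the slot axis, and each slot is then updated with the weighted mean of the tokens assigned to it. The sketch below illustrates only that generic loop in NumPy; it is not the authors' PhraseAttn model. In particular, the plain mean-based update replaces the learned GRU/MLP update of the original mechanism, and the capsule-network front end is omitted.

```python
import numpy as np

def softmax(x, axis):
    # numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(inputs, num_slots=4, iters=3, seed=0):
    """Generic Slot Attention iteration (a sketch, not PhraseAttn itself).

    inputs: (n_tokens, d) array of token representations.
    Returns (num_slots, d) slot vectors after `iters` refinement steps.
    """
    n, d = inputs.shape
    rng = np.random.default_rng(seed)
    slots = rng.normal(size=(num_slots, d))  # randomly initialised slots
    scale = d ** -0.5                        # scaled dot-product attention
    for _ in range(iters):
        # score every token against every slot
        logits = inputs @ slots.T * scale           # (n, num_slots)
        # softmax over the *slot* axis: slots compete for each token
        attn = softmax(logits, axis=1)              # (n, num_slots)
        # normalise per slot, then take the weighted mean of its tokens
        w = attn / (attn.sum(axis=0, keepdims=True) + 1e-8)
        slots = w.T @ inputs                        # simplified update rule
    return slots
```

Used on a sentence's token embeddings, each returned slot can be read as a soft grouping of tokens, i.e. a candidate phrase representation:

```python
rng = np.random.default_rng(1)
tokens = rng.normal(size=(10, 8))      # 10 tokens, dimension 8
phrases = slot_attention(tokens, num_slots=3)  # -> shape (3, 8)
```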

Details

ISSN :
1875-8967 and 1064-1246
Volume :
42
Database :
OpenAIRE
Journal :
Journal of Intelligent & Fuzzy Systems
Accession number :
edsair.doi...........bb89f9c8248a986f12b854fb873fb611
Full Text :
https://doi.org/10.3233/jifs-212101