
Translation of Multifaceted Data without Re-Training of Machine Translation Systems

Authors :
Moon, Hyeonseok
Lee, Seungyoon
Hong, Seongtae
Lee, Seungjun
Park, Chanjun
Lim, Heuiseok
Publication Year :
2024

Abstract

Translating major-language resources to build minor-language resources has become a widely used approach. Particularly when translating complex data points composed of multiple components, it is common to translate each component separately. However, we argue that this practice often overlooks the interrelation between components within the same data point. To address this limitation, we propose a novel MT pipeline that considers the intra-data relation when applying MT to training data. In our pipeline, all components of a data point are concatenated into a single translation sequence and subsequently reconstructed into the original components after translation. We introduce a Catalyst Statement (CS) to enhance the intra-data relation and an Indicator Token (IT) to assist in decomposing the translated sequence back into its respective data components. With this approach, we achieve a considerable improvement both in translation quality and in the effectiveness of the resulting data for training. Compared with the conventional approach that translates each data component separately, our method yields better training data, improving the performance of the trained model by 2.690 points on the web page ranking (WPR) task and by 0.845 points on the question generation (QG) task in the XGLUE benchmark.

Comment: Accepted to EMNLP 2024 Findings
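As a rough illustration of the pipeline described in the abstract, the sketch below shows how the components of a single data point might be concatenated with a catalyst statement and indicator tokens, passed through an arbitrary off-the-shelf MT system, and then split back into components. The concrete marker strings (`CATALYST_STATEMENT`, `(1)`, `(2)`, ...) and the `translate` callable are placeholders for illustration only, not the paper's actual formulation.

```python
from typing import Callable, List

# Hypothetical markers; the paper's actual Catalyst Statement (CS) and
# Indicator Tokens (IT) may differ in wording and placement.
CATALYST_STATEMENT = "The following sentences are related to each other."
INDICATOR_TOKENS = ["(1)", "(2)", "(3)"]  # one per data component


def compose_sequence(components: List[str]) -> str:
    """Concatenate all components of one data point into a single
    translation sequence, prefixed by the catalyst statement and
    separated by indicator tokens."""
    parts = [CATALYST_STATEMENT]
    for token, component in zip(INDICATOR_TOKENS, components):
        parts.append(f"{token} {component}")
    return " ".join(parts)


def decompose_sequence(translated: str, n_components: int) -> List[str]:
    """Split the translated sequence back into components by locating
    the indicator tokens (assumed to survive translation unchanged)."""
    pieces = []
    for i in range(n_components):
        start = translated.index(INDICATOR_TOKENS[i]) + len(INDICATOR_TOKENS[i])
        end = (
            translated.index(INDICATOR_TOKENS[i + 1])
            if i + 1 < n_components
            else len(translated)
        )
        pieces.append(translated[start:end].strip())
    return pieces


def translate_data_point(
    components: List[str], translate: Callable[[str], str]
) -> List[str]:
    """Translate all components of a data point jointly, then recover
    the individual translated components."""
    sequence = compose_sequence(components)
    # Any off-the-shelf MT system can be plugged in here; no re-training.
    translated = translate(sequence)
    return decompose_sequence(translated, len(components))


if __name__ == "__main__":
    # Toy identity "translator" just to demonstrate the round trip.
    identity_mt = lambda text: text
    print(
        translate_data_point(
            ["What is the capital of France?", "Paris is the capital of France."],
            identity_mt,
        )
    )
```

In this sketch the decomposition relies on the indicator tokens appearing verbatim in the MT output; handling tokens that the MT system alters or drops would require a more robust matching step than shown here.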

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2404.16257
Document Type :
Working Paper
Full Text :
https://doi.org/10.18653/v1/2024.findings-emnlp.114