
A Theory of Unsupervised Translation Motivated by Understanding Animal Communication

Authors:
Goldwasser, Shafi
Gruber, David F.
Kalai, Adam Tauman
Paradise, Orr
Publication Year: 2022

Abstract

Neural networks are capable of translating between languages -- in some cases even between two languages where there is little or no access to parallel translations, in what is known as Unsupervised Machine Translation (UMT). Given this progress, it is intriguing to ask whether machine learning tools can ultimately enable understanding animal communication, particularly that of highly intelligent animals. We propose a theoretical framework for analyzing UMT when no parallel translations are available and when it cannot be assumed that the source and target corpora address related subject domains or possess similar linguistic structure. We exemplify this theory with two stylized models of language, for which our framework provides bounds on necessary sample complexity; the bounds are formally proven and experimentally verified on synthetic data. These bounds show that the error rates are inversely related to the language complexity and amount of common ground. This suggests that unsupervised translation of animal communication may be feasible if the communication system is sufficiently complex.
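The abstract does not spell out the two stylized models. As a purely illustrative sketch of the underlying idea, that shared statistical structure ("common ground") can partly substitute for parallel data, the toy Python snippet below aligns two synthetic vocabularies by matching empirical frequency ranks. Every name, parameter, and modeling choice here is hypothetical and is not drawn from the paper.

    # Toy illustration (not the paper's models): unsupervised word alignment by
    # frequency-rank matching between two corpora that share a Zipfian "common ground".
    import numpy as np

    rng = np.random.default_rng(0)
    vocab_size = 50          # words per language (hypothetical toy setting)
    n_samples = 20_000       # tokens drawn per corpus

    # Shared underlying distribution over "meanings" (the common ground).
    zipf = 1.0 / np.arange(1, vocab_size + 1)
    zipf /= zipf.sum()

    # Each language maps meanings to its own word IDs via a hidden permutation.
    perm_src = rng.permutation(vocab_size)
    perm_tgt = rng.permutation(vocab_size)

    # Draw two independent, non-parallel corpora.
    corpus_src = perm_src[rng.choice(vocab_size, size=n_samples, p=zipf)]
    corpus_tgt = perm_tgt[rng.choice(vocab_size, size=n_samples, p=zipf)]

    # Unsupervised alignment: sort each vocabulary by empirical frequency and
    # match ranks. No parallel data is used at any point.
    freq_src = np.bincount(corpus_src, minlength=vocab_size)
    freq_tgt = np.bincount(corpus_tgt, minlength=vocab_size)
    predicted = dict(zip(np.argsort(-freq_src), np.argsort(-freq_tgt)))

    # Ground-truth translation: source word -> target word with the same meaning.
    truth = dict(zip(perm_src, perm_tgt))
    accuracy = np.mean([predicted[w] == truth[w] for w in range(vocab_size)])
    print(f"rank-matching translation accuracy: {accuracy:.2f}")

In this toy setting, accuracy rises as the shared distribution separates word frequencies more cleanly and as the corpora grow, loosely mirroring the abstract's point that more common ground and a sufficiently rich communication system make unsupervised alignment easier.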

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2211.11081
Document Type: Working Paper