1. Leveraging Grammar Induction for Language Understanding and Generation
- Authors
Jushi Kai, Shengyuan Hou, Yusheng Huang, and Zhouhan Lin
- Subjects
Computer Science - Computation and Language; Computer Science - Artificial Intelligence
- Abstract
Grammar induction has made significant progress in recent years. However, it remains unclear how induced grammars can enhance practical performance on downstream tasks. In this work, we introduce an unsupervised grammar induction method for language understanding and generation. We construct a grammar parser that induces constituency structures and dependency relations and is trained jointly on downstream tasks without additional syntax annotations. The induced grammar features are then incorporated into the Transformer as a syntactic mask that guides self-attention. We evaluate and apply our method to multiple machine translation and natural language understanding tasks. Our method outperforms both the original Transformer and models enhanced with external parsers. Experimental results indicate that our method is effective in both from-scratch and pre-trained scenarios. Additionally, our research highlights the contribution of explicitly modeling the grammatical structure of text to neural network models.
- Comment
EMNLP 2024 Findings
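The abstract's idea of using induced constituency structure as a syntactic mask on self-attention can be sketched as follows. This is a minimal illustration, not the paper's actual model: it assumes induced constituents are given as half-open token spans, builds a boolean mask that permits attention only between tokens sharing a constituent, and applies it to plain scaled dot-product attention. The function names `syntactic_mask` and `masked_self_attention` are hypothetical.

```python
import numpy as np

def syntactic_mask(n, spans):
    # Hypothetical helper: mask[i, j] is True iff tokens i and j
    # co-occur in at least one induced constituent span [start, end).
    # Each token may always attend to itself.
    mask = np.eye(n, dtype=bool)
    for start, end in spans:
        mask[start:end, start:end] = True
    return mask

def masked_self_attention(q, k, v, mask):
    # Standard scaled dot-product attention with a syntactic mask:
    # positions outside the mask receive a large negative score,
    # so their softmax weight is effectively zero.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy usage: 4 tokens grouped into two induced constituents.
rng = np.random.default_rng(0)
n, d = 4, 8
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
mask = syntactic_mask(n, spans=[(0, 2), (2, 4)])
out = masked_self_attention(q, k, v, mask)
```

Here tokens 0-1 and 2-3 form separate constituents, so attention never crosses the constituent boundary; the paper's method instead derives such structure from an unsupervised parser trained jointly with the downstream task.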
- Published
2024