
Reverse That Number! Decoding Order Matters in Arithmetic Learning

Authors :
Zhang-Li, Daniel
Lin, Nianyi
Yu, Jifan
Zhang, Zheyuan
Yao, Zijun
Zhang, Xiaokang
Hou, Lei
Zhang, Jing
Li, Juanzi
Publication Year :
2024

Abstract

Recent advancements in pretraining have demonstrated that modern Large Language Models (LLMs) possess the capability to effectively learn arithmetic operations. However, despite acknowledging the significance of digit order in arithmetic computation, current methodologies predominantly rely on sequential, step-by-step approaches for teaching LLMs arithmetic, leading to the conclusion that better performance requires ever finer-grained step-by-step decomposition. Diverging from this conventional path, our work introduces a novel strategy that not only reevaluates the digit order by prioritizing output from the least significant digit but also incorporates a step-by-step methodology to substantially reduce complexity. We have developed and applied this method in a comprehensive set of experiments. Compared to the previous state-of-the-art (SOTA) method, our findings reveal an overall improvement in accuracy while requiring only a third of the tokens typically used during training. For the purpose of facilitating replication and further research, we have made our code and dataset publicly available at \url{https://anonymous.4open.science/r/RAIT-9FB7/}.
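The intuition behind least-significant-digit-first output can be illustrated with a small sketch (this is an illustrative example, not the paper's actual implementation or data format): when digits are emitted in reversed order, each output digit of a sum depends only on digits already produced plus a carry, which matches the left-to-right generation order of an autoregressive model.

```python
def reverse_digit_sum(a: int, b: int) -> str:
    """Emit the digits of a + b least-significant-digit first.

    Each emitted digit depends only on operand digits at the same
    position and the carry from digits already emitted -- never on
    digits yet to come, unlike conventional most-significant-first
    output.
    """
    da = [int(d) for d in reversed(str(a))]
    db = [int(d) for d in reversed(str(b))]
    out, carry = [], 0
    for i in range(max(len(da), len(db))):
        s = (da[i] if i < len(da) else 0) + (db[i] if i < len(db) else 0) + carry
        out.append(str(s % 10))  # current digit of the sum
        carry = s // 10          # carry propagates "forward" in output order
    if carry:
        out.append(str(carry))
    return "".join(out)  # reversed representation of the sum
```

For example, `reverse_digit_sum(17, 25)` yields `"24"`, the reversed digits of 42; reading the output back in reverse recovers the standard representation.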

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2403.05845
Document Type :
Working Paper