1. AcademicGPT: Empowering Academic Research
- Authors
Shufa Wei, Xiaolong Xu, Xianbiao Qi, Xi Yin, Jun Xia, Jingyi Ren, Peijun Tang, Yuxiang Zhong, Yihao Chen, Xiaoqin Ren, Yuxin Liang, Liankai Huang, Kai Xie, Weikang Gui, Wei Tan, Shuanglong Sun, Yongquan Hu, Qinxian Liu, Nanjin Li, Chihao Dai, Lihua Wang, Xiaohui Liu, Lei Zhang, Yutao Xie
- Subjects
Computer Science - Computation and Language
- Abstract
Large Language Models (LLMs) have demonstrated exceptional capabilities across various natural language processing tasks. Yet many of these advanced LLMs are tailored for broad, general-purpose applications. In this technical report, we introduce AcademicGPT, designed specifically to empower academic research. AcademicGPT is a continual-training model derived from LLaMA2-70B. Our training corpus consists mainly of academic papers, theses, content from academic domains, high-quality Chinese data, and other sources. While the data scale may not be extensive, AcademicGPT marks our initial venture into a domain-specific GPT tailored for the research area. We evaluate AcademicGPT on several established public benchmarks such as MMLU and CEval, as well as on specialized academic benchmarks like PubMedQA, SCIEval, and our newly created ComputerScienceQA, to demonstrate its general knowledge, Chinese language, and academic abilities. Building upon AcademicGPT's foundation model, we also developed several applications catered to the academic area, including General Academic Question Answering, AI-assisted Paper Reading, Paper Review, and AI-assisted Title and Abstract Generation.
- Comment
Technical Report. arXiv admin note: text overlap with arXiv:2310.12081, arXiv:2310.10053 by other authors
- Published
2023
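
The abstract describes continual pretraining of a general LLM (LLaMA2-70B) on an academic corpus. The sketch below is a rough illustration of what that kind of continual pretraining can look like with Hugging Face Transformers; the stand-in model, corpus file, and hyperparameters are assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch of continual pretraining of a causal LM on academic text.
# Hypothetical setup: a smaller stand-in model, a plain-text corpus, and
# illustrative hyperparameters; not the AcademicGPT training configuration.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "meta-llama/Llama-2-7b-hf"   # stand-in for LLaMA2-70B
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # Llama tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(base_model)

# Assumed corpus: plain-text academic documents, one per line in train.txt.
dataset = load_dataset("text", data_files={"train": "train.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# mlm=False gives the standard next-token (causal LM) objective.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="academic-continual",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    learning_rate=1e-5,          # low LR to limit forgetting of general ability
    num_train_epochs=1,
    bf16=True,                   # assumes bf16-capable hardware
    logging_steps=50,
)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```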