Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment
MLA
Liu, Zhili, et al. Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment. 2024. EBSCOhost, widgets.ebscohost.com/prod/customlink/proxify/proxify.php?count=1&encode=0&proxy=&find_1=&replace_1=&target=https://search.ebscohost.com/login.aspx?direct=true&site=eds-live&scope=site&db=edsarx&AN=edsarx.2405.00557&authtype=sso&custid=ns315887.
APA
Liu, Z., Gou, Y., Chen, K., Hong, L., Gao, J., Mi, F., Zhang, Y., Li, Z., Jiang, X., Liu, Q., & Kwok, J. T. (2024). Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment.
Chicago
Liu, Zhili, Yunhao Gou, Kai Chen, Lanqing Hong, Jiahui Gao, Fei Mi, Yu Zhang, et al. 2024. “Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment.” http://widgets.ebscohost.com/prod/customlink/proxify/proxify.php?count=1&encode=0&proxy=&find_1=&replace_1=&target=https://search.ebscohost.com/login.aspx?direct=true&site=eds-live&scope=site&db=edsarx&AN=edsarx.2405.00557&authtype=sso&custid=ns315887.