
Mindstorms in Natural Language-Based Societies of Mind

Authors :
Zhuge, Mingchen
Liu, Haozhe
Faccio, Francesco
Ashley, Dylan R.
Csordás, Róbert
Gopalakrishnan, Anand
Hamdi, Abdullah
Hammoud, Hasan Abed Al Kader
Herrmann, Vincent
Irie, Kazuki
Kirsch, Louis
Li, Bing
Li, Guohao
Liu, Shuming
Mai, Jinjie
Piękos, Piotr
Ramesh, Aditya
Schlag, Imanol
Shi, Weimin
Stanić, Aleksandar
Wang, Wenyi
Wang, Yuhui
Xu, Mengmeng
Fan, Deng-Ping
Ghanem, Bernard
Schmidhuber, Jürgen
Publication Year :
2023

Abstract

Both Minsky's "society of mind" and Schmidhuber's "learning to think" inspire diverse societies of large multimodal neural networks (NNs) that solve problems by interviewing each other in a "mindstorm." Recent implementations of NN-based societies of minds consist of large language models (LLMs) and other NN-based experts communicating through a natural language interface. In doing so, they overcome the limitations of single LLMs, improving multimodal zero-shot reasoning. In these natural language-based societies of mind (NLSOMs), new agents -- all communicating through the same universal symbolic language -- are easily added in a modular fashion. To demonstrate the power of NLSOMs, we assemble and experiment with several of them (having up to 129 members), leveraging mindstorms in them to solve some practical AI tasks: visual question answering, image captioning, text-to-image synthesis, 3D generation, egocentric retrieval, embodied AI, and general language-based task solving. We view this as a starting point towards much larger NLSOMs with billions of agents -- some of which may be humans. And with this emergence of great societies of heterogeneous minds, many new research questions have suddenly become paramount to the future of artificial intelligence. What should be the social structure of an NLSOM? What would be the (dis)advantages of having a monarchical rather than a democratic structure? How can principles of NN economies be used to maximize the total reward of a reinforcement learning NLSOM? In this work, we identify, discuss, and try to answer some of these questions.

Comment :
9 pages in main text + 7 pages of references + 38 pages of appendices, 14 figures in main text + 13 in appendices, 7 tables in appendices

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2305.17066
Document Type :
Working Paper