
On Meta-Prompting

Authors:
de Wynter, Adrian
Wang, Xun
Gu, Qilong
Chen, Si-Qing
Publication Year:
2023

Abstract

Certain statistical models are capable of interpreting input strings as instructions, or prompts, and carrying out tasks based on them. Many approaches to prompting and pre-training these models involve the automated generation of these prompts. We call these approaches meta-prompting, or prompting to obtain prompts. We propose a theoretical framework based on category theory to generalize and describe them. This framework is flexible enough to account for LLM stochasticity, and it allows us to obtain formal results on task agnosticity and the equivalence of various meta-prompting approaches. We experiment with meta-prompting in two active areas of model research: creativity and ideation. We find that user preference favors (p < 0.01) the prompts generated under meta-prompting, as well as their corresponding outputs, over a series of hardcoded baseline prompts that include the original task prompt. Using our framework, we argue that meta-prompting is more effective than basic prompting at generating desirable outputs.
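To make the core idea of "prompting to obtain prompts" concrete, here is a minimal Python sketch. It is not the paper's implementation: the `complete` function is a hypothetical stand-in for any text-completion model call, and the wording of the meta-prompt is illustrative only.

```python
# Minimal sketch of meta-prompting: use the model to write a prompt,
# then run that generated prompt instead of the original task prompt.

def complete(prompt: str) -> str:
    """Hypothetical model call; replace with a real LLM client."""
    raise NotImplementedError("Plug in your model of choice here.")

def meta_prompt(task_description: str) -> str:
    """Step 1: ask the model to produce a prompt for the task."""
    return complete(
        "Write an effective prompt that instructs a language model "
        f"to perform the following task:\n{task_description}"
    )

def solve(task_description: str) -> str:
    """Step 2: feed the generated prompt back to the model."""
    generated_prompt = meta_prompt(task_description)
    return complete(generated_prompt)

# Example usage (a creativity/ideation-style task):
# solve("Suggest ten novel uses for a paperclip.")
```

The abstract's comparison is between outputs obtained this way and outputs from hardcoded baseline prompts, including the original task description itself.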

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2312.06562
Document Type:
Working Paper