A Model Is Not Built By A Single Prompt: LLM-Based Domain Modeling With Question Decomposition
- Publication Year :
- 2024
Abstract
- Domain modeling, a crucial part of model-driven engineering, demands extensive domain knowledge and experience from engineers. When the system description is highly complicated, the modeling task can become particularly challenging and time-consuming. Large Language Models (LLMs) can assist by automatically generating an initial object model from the system description. Although LLMs have demonstrated remarkable code-generation ability, they still struggle with model generation from a single prompt. In real-world domain modeling, engineers usually decompose complex tasks into smaller, easily solvable sub-tasks, which keeps complexity under control and improves model quality. Inspired by this, we propose an LLM-based domain modeling approach via question decomposition that mirrors a developer's modeling process. Following conventional modeling guidelines, we divide the model generation task into several sub-tasks, i.e., class generation, association and aggregation generation, and inheritance generation. For each sub-task, we carefully design the prompt by choosing more effective query words and providing essential modeling knowledge to unlock the modeling potential of LLMs. To combine the solutions of all sub-tasks, we implement a proof-of-concept tool integrated into the standard Ecore editor that asks LLMs to generate an object model from the system description. We evaluate our approach with 20 systems from different application domains. The preliminary results show that our approach outperforms the single-prompt baseline by improving recall values and F1 scores for modeling classes, attributes, and relationships in most systems.
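- The sketch below illustrates the question-decomposition idea described in the abstract: instead of one monolithic prompt, the system description is fed through sequential prompts for classes, associations/aggregations, and inheritance, and the partial answers are combined. It is a minimal illustration only; the `ask_llm` helper, the prompt wording, and the output format are assumptions, not the authors' actual tool or prompts.

```python
# Minimal sketch of LLM-based domain modeling via question decomposition.
# ask_llm is a hypothetical stand-in for any chat-completion call; the exact
# prompts and sub-task ordering below are illustrative assumptions.

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to an LLM chat-completion endpoint."""
    raise NotImplementedError("plug in your LLM client here")

def generate_domain_model(system_description: str) -> dict:
    # Sub-task 1: extract candidate classes and their attributes.
    classes = ask_llm(
        "You are a domain modeling assistant. From the system description below, "
        "list the domain classes and their attributes, one class per line.\n\n"
        + system_description
    )
    # Sub-task 2: identify associations and aggregations between the classes.
    relationships = ask_llm(
        "Given these classes:\n" + classes
        + "\n\nand the system description:\n" + system_description
        + "\n\nlist the associations and aggregations (with multiplicities) between them."
    )
    # Sub-task 3: identify inheritance (generalization) hierarchies.
    inheritance = ask_llm(
        "Given these classes:\n" + classes
        + "\n\nlist any inheritance relationships (subclass -> superclass)."
    )
    # Combine the sub-task solutions into a single object-model draft.
    return {
        "classes": classes,
        "associations_and_aggregations": relationships,
        "inheritance": inheritance,
    }
```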
- Subjects :
- Computer Science - Software Engineering
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2410.09854
- Document Type :
- Working Paper