1. Towards Achieving Concept Completeness for Unsupervised Textual Concept Bottleneck Models
- Authors
Bhan, Milan; Choho, Yann; Moreau, Pierre; Vittaut, Jean-Noel; Chesneau, Nicolas; Lesot, Marie-Jeanne
- Subjects
Computer Science - Computation and Language
- Abstract
Textual Concept Bottleneck Models (TCBMs) are interpretable-by-design models for text classification that predict a set of salient concepts before making the final prediction. This paper proposes the Complete Textual Concept Bottleneck Model (CT-CBM), a novel TCBM generator that builds concept labels in a fully unsupervised manner using a small language model, eliminating the need for both predefined human-labeled concepts and LLM annotations. CT-CBM iteratively identifies and adds important concepts to the bottleneck layer to create a complete concept basis, and addresses downstream classification leakage through a parallel residual connection. CT-CBM achieves strong results against competing approaches, offering a promising solution for enhancing the interpretability of NLP classifiers without sacrificing performance.
- Published
2025