
PARMESAN: Parameter-Free Memory Search and Transduction for Dense Prediction Tasks

Authors:
Winter, Philip Matthias
Wimmer, Maria
Major, David
Lenis, Dimitrios
Berg, Astrid
Neubauer, Theresa
De Paolis, Gaia Romana
Novotny, Johannes
Ulonska, Sophia
Bühler, Katja
Publication Year:
2024

Abstract

This work addresses flexibility in deep learning by means of transductive reasoning. For adaptation to new data and tasks, e.g., in continual learning, existing methods typically involve tuning learnable parameters or complete re-training from scratch, rendering such approaches inflexible in practice. We argue that the notion of separating computation from memory by means of transduction can act as a stepping stone for solving these issues. We therefore propose PARMESAN (parameter-free memory search and transduction), a scalable method which leverages a memory module for solving dense prediction tasks. At inference, hidden representations in memory are searched to find corresponding patterns. In contrast to other methods that rely on continuous training of learnable parameters, PARMESAN learns via memory consolidation, simply by modifying stored contents. Our method is compatible with commonly used architectures and canonically transfers to 1D, 2D, and 3D grid-based data. The capabilities of our approach are demonstrated on the complex task of continual learning. PARMESAN learns 3-4 orders of magnitude faster than established baselines while being on par in terms of predictive performance, hardware efficiency, and knowledge retention.

Comment: preprint, 25 pages, 7 figures
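To make the mechanism described above concrete, the following Python sketch illustrates the general idea of searching a memory of hidden representations and transducing their labels, with "learning" reduced to modifying stored contents. This is a minimal illustration under assumptions, not the authors' implementation: the frozen feature extractor is stood in by a random linear projection so the script runs self-contained, and the names MemoryTransducer, consolidate, and predict are hypothetical.

    # Hypothetical sketch of memory search and transduction for a dense
    # (per-pixel) prediction task. Not the PARMESAN code; an illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    def extract_features(image, proj):
        """Stand-in for a frozen backbone: per-pixel features via a linear map."""
        h, w, c = image.shape
        return image.reshape(h * w, c) @ proj  # (h*w, d)

    class MemoryTransducer:
        def __init__(self, k=3):
            self.k = k
            self.keys = None    # stored hidden representations, shape (n, d)
            self.values = None  # corresponding per-pixel labels, shape (n,)

        def consolidate(self, feats, labels):
            """Learning = modifying stored contents; no parameter updates."""
            if self.keys is None:
                self.keys, self.values = feats, labels
            else:
                self.keys = np.vstack([self.keys, feats])
                self.values = np.concatenate([self.values, labels])

        def predict(self, feats):
            """Search memory for nearest patterns, transduce labels by vote."""
            # Pairwise squared Euclidean distances, shape (n_query, n_memory).
            d2 = ((feats[:, None, :] - self.keys[None, :, :]) ** 2).sum(-1)
            nn = np.argsort(d2, axis=1)[:, : self.k]  # k nearest neighbors
            votes = self.values[nn]                   # (n_query, k)
            # Majority vote per query pixel.
            return np.array([np.bincount(v).argmax() for v in votes])

    # Toy usage: 8x8 three-channel "images" with binary segmentation labels.
    proj = rng.normal(size=(3, 16))  # frozen projection (3 -> 16 dims)
    train_img = rng.random((8, 8, 3))
    train_lbl = (train_img[..., 0] > 0.5).astype(int).reshape(-1)

    memory = MemoryTransducer(k=3)
    memory.consolidate(extract_features(train_img, proj), train_lbl)

    test_img = rng.random((8, 8, 3))
    pred = memory.predict(extract_features(test_img, proj))
    print(pred.reshape(8, 8))

Note that consolidating a new labeled example touches no trainable parameters; it only appends rows to the memory, which is what makes this style of adaptation fast relative to gradient-based fine-tuning.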

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2403.11743
Document Type:
Working Paper