
Robust Analysis of Multi-Task Learning Efficiency: New Benchmarks on Light-Weighed Backbones and Effective Measurement of Multi-Task Learning Challenges by Feature Disentanglement

Authors:
Mao, Dayou
Chen, Yuhao
Wu, Yifan
Gilles, Maximilian
Wong, Alexander
Publication Year:
2024

Abstract

One of the main motivations of multi-task learning (MTL) is to develop neural networks capable of inferring multiple tasks simultaneously. While countless methods have been proposed in the past decade investigating robust model architectures and efficient training algorithms, there is still a lack of understanding of how these methods behave when applied to smaller feature extraction backbones, of the generalizability of the commonly used fast approximation technique that replaces parameter-level gradients with feature-level gradients, and of the challenges in MTL and how one can efficiently and effectively identify them. In this paper, we focus on these efficiency aspects of existing MTL methods. We first carry out large-scale experiments of the methods with smaller backbones, using the MetaGraspNet dataset as a new test ground. We also compare the existing methods with and without the fast gradient surrogate and empirically study the generalizability of this technique. Lastly, we propose the Feature Disentanglement measure as a novel and efficient identifier of the challenges in MTL, and propose the Ranking Similarity score as an evaluation metric for different identifiers to demonstrate the faithfulness of our method.
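For context, the "fast gradient surrogate" mentioned in the abstract refers to a widely used approximation (popularized by methods such as MGDA-UB) in which per-task gradients are computed with respect to the shared features rather than the shared backbone parameters, saving one full backward pass per task. Below is a minimal PyTorch sketch of the idea; the toy backbone, heads, and dimensions are illustrative assumptions, not the paper's actual experimental setup.

```python
# Sketch: parameter-level per-task gradients vs. the cheaper
# feature-level surrogate. All module names and sizes are hypothetical.
import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU())       # shared feature extractor
heads = nn.ModuleList([nn.Linear(64, 1) for _ in range(2)])  # one head per task

x = torch.randn(8, 32)
targets = [torch.randn(8, 1) for _ in range(2)]

features = backbone(x)  # shared representation z
losses = [nn.functional.mse_loss(h(features), t) for h, t in zip(heads, targets)]

# Parameter-level gradients: one backward pass per task through the
# entire shared backbone -- exact, but costly for large backbones.
param_grads = [
    torch.autograd.grad(loss, list(backbone.parameters()), retain_graph=True)
    for loss in losses
]

# Feature-level surrogate: gradients w.r.t. the shared features only,
# which stops backpropagation at the backbone output for each task.
feat_grads = [
    torch.autograd.grad(loss, features, retain_graph=True)[0]
    for loss in losses
]
```

In gradient-balancing MTL methods, the feature-level gradients are fed into the task-weighting step in place of the parameter-level ones, trading exactness for a single shared backward pass; the paper studies empirically how well this substitution generalizes.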

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2402.03557
Document Type:
Working Paper