
TIES-Merging: Resolving Interference When Merging Models

Authors:
Yadav, Prateek
Tam, Derek
Choshen, Leshem
Raffel, Colin
Bansal, Mohit
Publication Year:
2023

Abstract

Transfer learning - i.e., further fine-tuning a pre-trained model on a downstream task - can confer significant advantages, including improved downstream performance, faster convergence, and better sample efficiency. These advantages have led to a proliferation of task-specific fine-tuned models, which typically can only perform a single task and do not benefit from one another. Recently, model merging techniques have emerged as a solution to combine multiple task-specific models into a single multitask model without performing additional training. However, existing merging methods often ignore the interference between parameters of different models, resulting in large performance drops when merging multiple models. In this paper, we demonstrate that prior merging techniques inadvertently lose valuable information due to two major sources of interference: (a) interference due to redundant parameter values and (b) disagreement on the sign of a given parameter's values across models. To address this, we propose our method, TRIM, ELECT SIGN & MERGE (TIES-Merging), which introduces three novel steps when merging models: (1) resetting parameters that only changed a small amount during fine-tuning, (2) resolving sign conflicts, and (3) merging only the parameters that are in alignment with the final agreed-upon sign. We find that TIES-Merging outperforms several existing methods in diverse settings covering a range of modalities, domains, number of tasks, model sizes, architectures, and fine-tuning settings. We further analyze the impact of different types of interference on model parameters, and highlight the importance of resolving sign interference. Our code is available at https://github.com/prateeky2806/ties-merging

Comment: Published at NeurIPS 2023, 23 Pages, 13 Figures, 14 Tables
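
The three steps named in the abstract (trim, elect sign, disjoint merge) can be sketched in a few lines of PyTorch. The sketch below is a minimal illustration, not the official implementation from the linked repository: it assumes each fine-tuned model has been reduced to a flat "task vector" (its weights minus the pre-trained weights) stored as a 1-D torch.Tensor, and the names ties_merge and density are hypothetical.

    import torch

    def ties_merge(task_vectors, density=0.2):
        """Sketch of TIES-Merging on flat task vectors (illustrative names)."""
        # Step 1 (Trim): keep only the top-`density` fraction of each task
        # vector's entries by magnitude; reset the rest to zero.
        trimmed = []
        for tv in task_vectors:
            k = max(1, int(density * tv.numel()))
            threshold = tv.abs().kthvalue(tv.numel() - k + 1).values
            trimmed.append(torch.where(tv.abs() >= threshold, tv, torch.zeros_like(tv)))
        stacked = torch.stack(trimmed)  # shape: (num_models, num_params)

        # Step 2 (Elect Sign): per parameter, elect the sign of the summed
        # trimmed values, i.e. the sign carrying the larger total magnitude.
        elected_sign = torch.sign(stacked.sum(dim=0))

        # Step 3 (Disjoint Merge): average only the entries whose sign agrees
        # with the elected sign, ignoring zeros and sign-conflicting entries.
        agrees = torch.sign(stacked) == elected_sign.unsqueeze(0)
        counts = agrees.sum(dim=0).clamp(min=1)  # guard against division by zero
        return (stacked * agrees).sum(dim=0) / counts

The merged task vector would then typically be scaled by a factor and added back to the pre-trained weights to obtain the final multitask model.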

Details

Database:
OAIster
Publication Type:
Electronic Resource
Accession number:
edsoai.on1381633215
Document Type:
Electronic Resource