
Learning Generalizable Models for Vehicle Routing Problems via Knowledge Distillation

Authors:
Bi, Jieyi
Ma, Yining
Wang, Jiahai
Cao, Zhiguang
Chen, Jinbiao
Sun, Yuan
Chee, Yeow Meng
Publication Year:
2022
Publisher:
arXiv, 2022.

Abstract

Recent neural methods for vehicle routing problems typically train and test deep models on the same instance distribution (i.e., uniform). To tackle the consequent cross-distribution generalization concerns, we bring knowledge distillation to this field and propose an Adaptive Multi-Distribution Knowledge Distillation (AMDKD) scheme for learning more generalizable deep models. In particular, AMDKD leverages knowledge from multiple teachers trained on exemplar distributions to yield a lightweight yet generalist student model. Meanwhile, we equip AMDKD with an adaptive strategy that lets the student concentrate on difficult distributions, so as to absorb hard-to-master knowledge more effectively. Extensive experimental results show that, compared with baseline neural methods, AMDKD achieves competitive results on both unseen in-distribution and out-of-distribution instances, whether randomly synthesized or adopted from benchmark datasets (i.e., TSPLIB and CVRPLIB). Notably, AMDKD is generic and consumes less computational resources for inference.

Comment: Accepted at NeurIPS 2022
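To make the scheme concrete, below is a minimal sketch of the idea as the abstract describes it: teachers pretrained on exemplar distributions, a lightweight student, and an adaptive strategy that samples harder distributions more often. Everything here is an illustrative assumption, not the paper's implementation: the TinyPolicy model, the sample_instances generator, the KL-based distillation loss, and the EMA-based difficulty tracking are all hypothetical stand-ins.

```python
# Illustrative sketch of adaptive multi-teacher distillation; not the
# authors' AMDKD code. Models, losses, and the sampling rule are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyPolicy(nn.Module):
    """Hypothetical stand-in for a routing policy that scores candidate nodes."""
    def __init__(self, dim=2, hidden=64, n_nodes=20):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim * n_nodes, hidden), nn.ReLU(), nn.Linear(hidden, n_nodes)
        )

    def forward(self, coords):              # coords: (batch, n_nodes, dim)
        return self.net(coords.flatten(1))  # unnormalized node logits

def sample_instances(dist, batch=32, n_nodes=20):
    """Synthesize instance coordinates from an exemplar distribution (assumed)."""
    if dist == "uniform":
        return torch.rand(batch, n_nodes, 2)
    if dist == "cluster":
        centers = torch.rand(batch, 1, 2)
        return (centers + 0.05 * torch.randn(batch, n_nodes, 2)).clamp(0, 1)
    raise ValueError(dist)

dists = ["uniform", "cluster"]
teachers = {d: TinyPolicy() for d in dists}  # in practice: pretrained, one per distribution
student = TinyPolicy()                       # lightweight generalist student
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_ema = {d: 1.0 for d in dists}           # running per-distribution difficulty

for step in range(1000):
    # Adaptive strategy: sample distributions the student finds hard more often.
    weights = torch.tensor([loss_ema[d] for d in dists])
    d = dists[torch.multinomial(F.softmax(weights, dim=0), 1).item()]
    x = sample_instances(d)
    with torch.no_grad():
        t_logits = teachers[d](x)            # teacher's node scores
    s_logits = student(x)
    # Distill the teacher's node-selection distribution into the student via KL.
    loss = F.kl_div(F.log_softmax(s_logits, dim=-1),
                    F.softmax(t_logits, dim=-1), reduction="batchmean")
    opt.zero_grad(); loss.backward(); opt.step()
    # Track difficulty with an exponential moving average of the distillation loss.
    loss_ema[d] = 0.9 * loss_ema[d] + 0.1 * loss.item()
```

Under these assumptions, the softmax over per-distribution loss averages plays the role of the adaptive strategy: as the student masters one distribution, its EMA loss falls and training effort shifts toward the remaining hard-to-master distributions.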

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....b2b9895e46a94914360b39158c16ec13
Full Text:
https://doi.org/10.48550/arxiv.2210.07686