
How to Leverage Diverse Demonstrations in Offline Imitation Learning

Authors:
Yue, Sheng
Liu, Jiani
Hua, Xingyuan
Ren, Ju
Lin, Sen
Zhang, Junshan
Zhang, Yaoxue
Publication Year:
2024

Abstract

Offline Imitation Learning (IL) with imperfect demonstrations has garnered increasing attention owing to the scarcity of expert data in many real-world domains. A fundamental problem in this scenario is how to extract positive behaviors from noisy data. In general, current approaches select data based on state-action similarity to given expert demonstrations, neglecting precious information in (potentially abundant) $\textit{diverse}$ state-actions that deviate from expert ones. In this paper, we introduce a simple yet effective data selection method that identifies positive behaviors based on their resultant states -- a more informative criterion that enables explicit utilization of dynamics information and effective extraction of both expert and beneficial diverse behaviors. Further, we devise a lightweight behavior cloning algorithm capable of leveraging the expert and selected data correctly. In the experiments, we evaluate our method on a suite of complex and high-dimensional offline IL benchmarks, including continuous-control and vision-based tasks. The results demonstrate that our method achieves state-of-the-art performance, outperforming existing methods on $\textbf{20/21}$ benchmarks, typically by $\textbf{2-5x}$, while maintaining a comparable runtime to Behavior Cloning ($\texttt{BC}$).

Comment: International Conference on Machine Learning (ICML)
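The core idea of selecting behaviors by their resultant states can be illustrated with a minimal sketch. The snippet below is a hypothetical, simplified stand-in for the paper's criterion (the function name, the nearest-neighbor distance test, and the `radius` threshold are illustrative assumptions, not the authors' actual method): a noisy transition is kept if its next state lands near the set of next states reached by the expert.

```python
import numpy as np

def select_by_resultant_state(expert_next_states, noisy_transitions, radius):
    """Illustrative selection rule: keep transitions from the noisy dataset
    whose resultant (next) state lies within `radius` of some expert
    resultant state. A simplified proxy for state-based data selection;
    the paper's actual criterion may differ.
    """
    selected = []
    for state, action, next_state in noisy_transitions:
        # Nearest-neighbor distance from this transition's resultant state
        # to the expert resultant-state set.
        dist = np.min(np.linalg.norm(expert_next_states - next_state, axis=1))
        if dist <= radius:
            selected.append((state, action, next_state))
    return selected

# Toy usage: one noisy transition ends near an expert state, one does not.
expert_next = np.array([[0.0, 0.0], [1.0, 1.0]])
noisy = [
    (np.zeros(2), 0, np.array([0.1, 0.0])),   # resultant state near expert
    (np.zeros(2), 1, np.array([5.0, 5.0])),   # resultant state far away
]
kept = select_by_resultant_state(expert_next, noisy, radius=0.5)
```

The selected transitions would then be combined with the expert data for (weighted) behavior cloning, which keeps the overall pipeline close to plain `BC` in runtime.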

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2405.17476
Document Type:
Working Paper