
UniMASK: Unified Inference in Sequential Decision Problems

Publication Year :
2022

Abstract

Randomly masking and predicting word tokens has been a successful approach in pre-training language models for a variety of downstream tasks. In this work, we observe that the same idea also applies naturally to sequential decision-making, where many well-studied tasks like behavior cloning, offline reinforcement learning, inverse dynamics, and waypoint conditioning correspond to different sequence maskings over a sequence of states, actions, and returns. We introduce the UniMASK framework, which provides a unified way to specify models which can be trained on many different sequential decision-making tasks. We show that a single UniMASK model is often capable of carrying out many tasks with performance similar to or better than single-task models. Additionally, after fine-tuning, our UniMASK models consistently outperform comparable single-task models. Our code is publicly available at https://github.com/micahcarroll/uniMASK.

Comment: NeurIPS 2022 (Oral). A prior version was published at an ICML Workshop, available at arXiv:2204.13326
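
To make the abstract's central idea concrete, the sketch below illustrates (in plain Python with NumPy) how different decision-making tasks can be expressed as boolean masks over a trajectory of state, action, and return-to-go tokens. This is only an illustrative approximation of the idea described in the abstract, not the actual UniMASK implementation (which is available at the GitHub URL above); the function names, token ordering, and mask conventions here are hypothetical.

    # Illustrative sketch (not the official UniMASK code): different training
    # tasks expressed as boolean masks over a trajectory of (state, action,
    # return-to-go) tokens. True = token is observed (given to the model),
    # False = token is masked and must be predicted.
    import numpy as np

    T = 4  # hypothetical trajectory length
    KEYS = ("state", "action", "rtg")  # token types per timestep (assumed ordering)

    def empty_mask():
        # One boolean flag per token type per timestep; everything hidden by default.
        return {k: np.zeros(T, dtype=bool) for k in KEYS}

    def behavior_cloning_mask():
        # Observe all states, predict all actions; returns stay masked.
        m = empty_mask()
        m["state"][:] = True
        return m

    def offline_rl_mask():
        # Return-conditioned behavior cloning: condition on states and
        # returns-to-go, predict actions.
        m = empty_mask()
        m["state"][:] = True
        m["rtg"][:] = True
        return m

    def inverse_dynamics_mask(t):
        # Observe consecutive states s_t and s_{t+1}; predict the action a_t
        # that connects them.
        m = empty_mask()
        m["state"][[t, t + 1]] = True
        return m

    def waypoint_mask(goal_t):
        # Observe the current state and a future "waypoint" state; predict
        # the actions leading to it.
        m = empty_mask()
        m["state"][0] = True
        m["state"][goal_t] = True
        return m

    def random_mask(p_observe=0.5, rng=np.random.default_rng(0)):
        # BERT-style random masking over all tokens: the unifying training
        # objective the abstract describes.
        return {k: rng.random(T) < p_observe for k in KEYS}

    if __name__ == "__main__":
        for name, m in [("behavior cloning", behavior_cloning_mask()),
                        ("inverse dynamics (t=1)", inverse_dynamics_mask(1)),
                        ("random", random_mask())]:
            print(name, {k: v.astype(int).tolist() for k, v in m.items()})

Under this framing, a single model trained with random maskings can, at inference time, be handed any of the task-specific masks above, which is why one UniMASK model can cover several downstream tasks.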

Details

Database :
OAIster
Notes :
Carroll, Micah, Paradise, Orr, Lin, Jessy, Georgescu, Raluca, Sun, Mingfei, Bignell, David, Milani, Stephanie, Hofmann, Katja, Hausknecht, Matthew, Dragan, Anca, Devlin, Sam
Publication Type :
Electronic Resource
Accession number :
edsoai.on1381583313
Document Type :
Electronic Resource