
Accelerating Natural Language Understanding in Task-Oriented Dialog

Authors: Ahuja, Ojas; Desai, Shrey
Publication Year: 2020

Abstract

Task-oriented dialog models typically leverage complex neural architectures and large-scale, pre-trained Transformers to achieve state-of-the-art performance on popular natural language understanding benchmarks. However, these models frequently have in excess of tens of millions of parameters, making them impossible to deploy on-device where resource-efficiency is a major concern. In this work, we show that a simple convolutional model compressed with structured pruning achieves largely comparable results to BERT on ATIS and Snips, with under 100K parameters. Moreover, we perform acceleration experiments on CPUs, where we observe our multi-task model predicts intents and slots nearly 63x faster than even DistilBERT.

Comment: Accepted to ACL 2020 Workshop on NLP for Conversational AI
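As a rough illustration of the approach the abstract describes, below is a minimal sketch in PyTorch of a compact convolutional model with joint intent and slot heads, compressed with the library's built-in magnitude-based structured pruning. This is not the authors' released code: the architecture, hyperparameters, and label counts are illustrative assumptions.

    # Hypothetical sketch of the technique named in the abstract: a small
    # 1-D convolutional encoder with joint intent/slot heads, compressed by
    # structured pruning of whole filters. Not the authors' implementation;
    # all sizes and label counts below are illustrative assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune

    class ConvNLU(nn.Module):
        def __init__(self, vocab_size, n_intents, n_slots,
                     emb_dim=64, n_filters=128, kernel_size=3):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim)
            self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size, padding=1)
            self.intent_head = nn.Linear(n_filters, n_intents)  # utterance-level
            self.slot_head = nn.Linear(n_filters, n_slots)      # token-level

        def forward(self, token_ids):                # (batch, seq_len)
            x = self.emb(token_ids).transpose(1, 2)  # (batch, emb_dim, seq_len)
            h = torch.relu(self.conv(x))             # (batch, n_filters, seq_len)
            intent_logits = self.intent_head(h.max(dim=2).values)  # pooled
            slot_logits = self.slot_head(h.transpose(1, 2))        # per token
            return intent_logits, slot_logits

    model = ConvNLU(vocab_size=10_000, n_intents=7, n_slots=72)

    # Structured pruning: zero out 50% of the convolution's filters by L2
    # norm (dim=0 selects whole output channels), then strip the pruning
    # reparameterization so the zeroed weights become permanent.
    prune.ln_structured(model.conv, name="weight", amount=0.5, n=2, dim=0)
    prune.remove(model.conv, "weight")

    ids = torch.randint(0, 10_000, (2, 16))  # dummy batch of 2 utterances
    intent_logits, slot_logits = model(ids)

Note that as written the zeroed filters still occupy memory and compute; realizing CPU speedups like those the abstract reports would additionally require physically slicing the pruned channels out of the weight tensors.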

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2006.03701
Document Type: Working Paper