
Enhancing DETRs Variants through Improved Content Query and Similar Query Aggregation

Authors :
Zhang, Yingying
Shi, Chuangji
Guo, Xin
Lao, Jiangwei
Wang, Jian
Wang, Jiaotuan
Chen, Jingdong
Publication Year :
2024

Abstract

The design of the query is crucial to the performance of DETR and its variants. Each query consists of two components: a content part and a positional part. Traditionally, the content query is initialized as a zero or learnable embedding, lacking essential content information and leading to sub-optimal performance. In this paper, we introduce a novel plug-and-play module, the Self-Adaptive Content Query (SACQ), to address this limitation. The SACQ module uses features from the transformer encoder to generate content queries via self-attention pooling, allowing candidate queries to adapt to the input image. This yields a more comprehensive content prior and better focus on target objects. However, this improved concentration poses a challenge for training with Hungarian matching, which selects only a single candidate and suppresses other similar ones. To overcome this, we propose a query aggregation strategy that cooperates with SACQ: it merges similar predicted candidates from different queries, easing optimization. Our extensive experiments on the COCO dataset demonstrate the effectiveness of our proposed approaches across six DETR variants with multiple configurations, achieving an average improvement of over 1.0 AP.

Comment: 11 pages, 7 figures
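The abstract describes generating content queries by attention-pooling encoder features. As a rough illustration of what such pooling could look like (this is a minimal NumPy sketch, not the paper's implementation; the scoring projection `W_score` and all shapes are assumptions for illustration), each of the N queries attends over all encoder tokens and takes a weighted average of their features:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_pooled_queries(encoder_feats, W_score):
    """Pool encoder features into content queries (illustrative sketch).

    encoder_feats: (num_tokens, d_model) flattened encoder output.
    W_score: (d_model, num_queries) hypothetical learned projection that
             scores each token's relevance to each query slot.
    Returns: (num_queries, d_model) content queries, each an
             attention-weighted average of the encoder features.
    """
    scores = encoder_feats @ W_score       # (num_tokens, num_queries)
    attn = softmax(scores, axis=0)         # normalize over tokens per query
    return attn.T @ encoder_feats          # (num_queries, d_model)

# Toy usage with random data standing in for a real encoder output.
rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 256))        # 100 tokens, d_model = 256
W = rng.normal(size=(256, 30))             # 30 query slots (assumed)
queries = attention_pooled_queries(feats, W)
print(queries.shape)                       # (30, 256)
```

Because the pooling weights depend on the encoder features themselves, the resulting content queries vary with the input image, unlike a fixed zero or learnable embedding.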

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2405.03318
Document Type :
Working Paper