Title Conditional DETR for Fast Training Convergence
Authors Meng, Depu
Chen, Xiaokang
Fan, Zejia
Zeng, Gang
Li, Houqiang
Yuan, Yuhui
Sun, Lei
Wang, Jingdong
Affiliation University of Science and Technology of China, Hefei, China
Peking University, Beijing, China
Microsoft Research Asia, Beijing, China
Microsoft Research, Beijing, China
Issue Date 2021
Publisher 2021 IEEE/CVF International Conference on Computer Vision (ICCV 2021)
Abstract The recently developed DETR approach applies the transformer encoder-decoder architecture to object detection and achieves promising performance. In this paper, we address the critical issue of slow training convergence and present a conditional cross-attention mechanism for fast DETR training. Our approach is motivated by the observation that the cross-attention in DETR relies highly on the content embeddings for localizing the four extremities and predicting the box, which increases the need for high-quality content embeddings and thus the training difficulty. Our approach, named conditional DETR, learns a conditional spatial query from the decoder embedding for decoder multi-head cross-attention. The benefit is that, through the conditional spatial query, each cross-attention head is able to attend to a band containing a distinct region, e.g., one object extremity or a region inside the object box. This narrows down the spatial range for localizing the distinct regions for object classification and box regression, thus relaxing the dependence on the content embeddings and easing the training. Empirical results show that conditional DETR converges 6.7x faster for the backbones R50 and R101 and 10x faster for the stronger backbones DC5-R50 and DC5-R101. Code is available at https://github.com/Atten4Vis/ConditionalDETR.
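The mechanism summarized in the abstract can be sketched in a few lines of PyTorch. The sketch below is illustrative only: the module and helper names (ConditionalCrossAttention, sine_embed) and the default sizes are our own assumptions, not the repository's API; the authoritative implementation is at the GitHub link above. The idea it shows: form a conditional spatial query by transforming a sinusoidal embedding of each query's reference point with an elementwise map predicted from the decoder embedding, then concatenate content and spatial parts in queries and keys so each attention head matches appearance and location separately.

import math
import torch
import torch.nn as nn


def sine_embed(points, dim=256):
    # Sinusoidal embedding of normalized 2D reference points: (..., 2) -> (..., dim).
    half = dim // 4  # one sin/cos pair per frequency, per coordinate
    freq = 10000 ** (torch.arange(half, device=points.device) / half)
    pos = points.unsqueeze(-1) * 2 * math.pi / freq        # (..., 2, half)
    return torch.cat([pos.sin(), pos.cos()], dim=-1).flatten(-2)


class ConditionalCrossAttention(nn.Module):
    # Hypothetical sketch of conditional cross-attention: queries and keys are
    # concatenations of a content part and a spatial part per head.
    def __init__(self, d_model=256, nhead=8):
        super().__init__()
        self.nhead = nhead
        self.scale = (2 * d_model // nhead) ** -0.5
        # FFN predicting, from the decoder (content) embedding, the elementwise
        # transformation applied to the reference-point embedding.
        self.ref_transform = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )

    def forward(self, content_q, ref_points, memory, memory_pos):
        # content_q: (B, Nq, d)  decoder embeddings (content queries)
        # ref_points: (B, Nq, 2) normalized reference point per object query
        # memory: (B, L, d)      encoder output; memory_pos: (B, L, d) its positional embedding
        B, Nq, d = content_q.shape
        h = self.nhead

        # Conditional spatial query: learned transform of the reference-point embedding.
        spatial_q = self.ref_transform(content_q) * sine_embed(ref_points, d)

        def split(x):  # (B, N, d) -> (B, h, N, d // h)
            return x.view(B, -1, h, d // h).transpose(1, 2)

        # Concatenate content and spatial parts per head; values stay content-only.
        q = torch.cat([split(content_q), split(spatial_q)], dim=-1)
        k = torch.cat([split(memory), split(memory_pos)], dim=-1)
        v = split(memory)

        attn = ((q @ k.transpose(-2, -1)) * self.scale).softmax(dim=-1)
        out = attn @ v                                      # (B, h, Nq, d // h)
        return out.transpose(1, 2).reshape(B, Nq, d)

Because the per-head query and key are concatenations, each attention logit decomposes into a content dot product plus a spatial dot product; under these assumptions the spatial term is what lets a head lock onto a narrow band near, say, an object extremity, independent of content quality.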
URI http://hdl.handle.net/20.500.11897/646640
ISBN 978-1-6654-2812-5
DOI 10.1109/ICCV48922.2021.00363
Indexed EI
CPCI-S (ISTP)
Appears in Collections: Unclaimed

Files in This Work
There are no files associated with this item.

License: See PKU IR operational policies.