Title PQ-Transformer: Jointly Parsing 3D Objects and Layouts From Point Clouds
Authors Chen, Xiaoxue
Zhao, Hao
Zhou, Guyue
Zhang, Ya-Qin
Affiliation Tsinghua University, Institute for AI Industry Research (AIR), Beijing 100190, China
Peking University; Intel Labs China, Beijing 100871, China
Issue Date Apr-2022
Publisher IEEE Robotics and Automation Letters
Abstract 3D scene understanding from point clouds plays a vital role in various robotic applications. Unfortunately, current state-of-the-art methods use separate neural networks for different tasks such as object detection or room layout estimation. Such a scheme has two limitations: 1) storing and running several networks for different tasks is expensive for typical robotic platforms; 2) the intrinsic structure of the separate outputs is ignored and potentially violated. To this end, we propose the first transformer architecture that predicts 3D objects and layouts simultaneously from point cloud inputs. Unlike existing methods that estimate either layout keypoints or edges, we directly parameterize the room layout as a set of quads. As such, the proposed architecture is termed P(oint)Q(uad)-Transformer. Along with the novel quad representation, we propose a tailored physical constraint loss function that discourages object-layout interference. Quantitative and qualitative evaluations on the public benchmark ScanNet show that the proposed PQ-Transformer succeeds in jointly parsing 3D objects and layouts, running at a quasi-real-time rate (8.91 FPS) without efficiency-oriented optimization. Moreover, the new physical constraint loss improves strong baselines, and the F1-score of the room layout is significantly promoted from 37.9% to 57.9%.
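The abstract's physical constraint loss, which discourages object-layout interference, can be illustrated with a minimal sketch. The exact formulation is not given in this record, so the quad parameterization (a point on the wall plane plus an inward-facing unit normal) and the hinge-style penalty below are assumptions for illustration only, not the paper's implementation:

```python
import numpy as np

def interference_penalty(box_corners, quad_center, quad_normal):
    """Illustrative object-layout interference penalty (assumed form).

    A layout quad is modeled here as a point on its wall plane plus an
    inward-facing unit normal. Object box corners with a negative signed
    distance to the plane lie behind the wall and contribute a hinge
    penalty proportional to their penetration depth.
    """
    n = quad_normal / np.linalg.norm(quad_normal)  # ensure unit normal
    signed = (box_corners - quad_center) @ n       # signed distance per corner
    return float(np.maximum(-signed, 0.0).sum())   # penalize penetration only

# Toy example: a wall at x = 0 facing +x; one of two box corners
# penetrates the wall by 0.2 m, the other is inside the room.
corners = np.array([[0.5, 0.0, 0.0],
                    [-0.2, 0.0, 0.0]])
loss = interference_penalty(corners, np.zeros(3), np.array([1.0, 0.0, 0.0]))
# loss == 0.2 (only the penetrating corner contributes)
```

Summed over all predicted object boxes and layout quads, a term of this kind yields zero loss for physically plausible scenes and grows with penetration depth, which matches the stated goal of discouraging object-layout interference.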
URI http://hdl.handle.net/20.500.11897/637391
ISSN 2377-3766
DOI 10.1109/LRA.2022.3143224
Indexed EI
SCI(E)
Appears in Collections: Unclaimed

Files in This Work
There are no files associated with this item.

License: See PKU IR operational policies.