
3D mesh transformer: A hierarchical neural network with local shape tokens.

Authors:
Chen, Yu
Zhao, Jieyu
Huang, Lingfeng
Chen, Hao
Source:
Neurocomputing. Dec 2022, Vol. 514, p. 328-340. 13 p.
Publication Year:
2022

Abstract

Self-attention networks have revolutionized Natural Language Processing (NLP) and are making impressive strides in image analysis tasks such as image classification and object detection. Inspired by this success, we design a novel self-attention mechanism over local shapes and build a shape Transformer. We split the 3D mesh model into shape patches, which we call shape tokens, and provide polynomial fitting representations of these patches as input to the shape Transformer. Each shape token encodes local geometric information and plays a role analogous to a word token in NLP. Mesh simplification provides a hierarchical multiresolution structure, which enables feature learning with a multilayer Transformer. We treat the high-level features produced by the shape Transformer as visual tokens and propose a vector-type self-attention mechanism to construct a 3D visual Transformer. Finally, we realize a hierarchical network structure based on local shape tokens and high-level visual tokens. Experiments show that our fusion network, combining a 3D shape Transformer with explicit local shape context augmentation and a 3D visual Transformer with multi-level structural feature learning, achieves excellent performance on shape classification and part segmentation tasks.
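The pipeline the abstract describes (split the mesh into local patches, fit each patch with a polynomial to obtain a coefficient vector, then run a Transformer encoder over those "shape tokens") can be illustrated with a short sketch. The PyTorch code below is a minimal illustration under assumed names and choices: fit_patch, ShapeTokenTransformer, the quadratic fitting basis, and all dimensions are hypothetical, not identifiers or settings from the paper, and a stock Transformer encoder stands in for the authors' vector-type self-attention.

# Minimal sketch (not the authors' code) of: local mesh patches ->
# polynomial-fit coefficient vectors ("shape tokens") -> Transformer
# encoder over those tokens. All names and hyperparameters here are
# hypothetical illustrations.
import torch
import torch.nn as nn


def fit_patch(points: torch.Tensor, degree: int = 2) -> torch.Tensor:
    """Least-squares fit of a bivariate quadratic z = f(x, y) to a patch.

    points: (N, 3) vertex coordinates of one local shape patch.
    Returns the flattened coefficient vector, used here as the
    per-token geometric descriptor.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    # Design matrix of monomials up to degree 2: [1, x, y, x^2, x*y, y^2]
    A = torch.stack([torch.ones_like(x), x, y, x * x, x * y, y * y], dim=1)
    coeffs = torch.linalg.lstsq(A, z.unsqueeze(1)).solution  # (6, 1)
    return coeffs.squeeze(1)  # (6,)


class ShapeTokenTransformer(nn.Module):
    """Standard Transformer encoder over polynomial shape tokens."""

    def __init__(self, coeff_dim: int = 6, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, n_classes: int = 10):
        super().__init__()
        self.embed = nn.Linear(coeff_dim, d_model)  # token embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, n_patches, coeff_dim)
        h = self.encoder(self.embed(tokens))
        return self.head(h.mean(dim=1))  # global pooling -> class logits


# Usage: 8 patches of 32 vertices each form one "sentence" of shape tokens.
patches = torch.randn(8, 32, 3)
tokens = torch.stack([fit_patch(p) for p in patches]).unsqueeze(0)
logits = ShapeTokenTransformer()(tokens)
print(logits.shape)  # torch.Size([1, 10])

In the paper's hierarchical design, the pooled features produced at this stage would themselves become the visual tokens of a second, higher-level Transformer; the sketch stops at a single level for brevity.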

Details

Language:
English
ISSN:
0925-2312
Volume:
514
Database:
Academic Search Index
Journal:
Neurocomputing
Publication Type:
Academic Journal
Accession Number:
159844127
Full Text:
https://doi.org/10.1016/j.neucom.2022.09.138