SparseMeshCNN with Self-Attention for Segmentation of Large Meshes

Authors :
Hansen, Bjørn
Lowes, Mathias
Ørkild, Thomas
Dahl, Anders
Dahl, Vedrana
De Backer, Ole
Camara, Oscar
Paulsen, Rasmus
Ingwersen, Christian
Sørensen, Kristine
Source :
Hansen, B., Lowes, M., Ørkild, T., Dahl, A., Dahl, V., De Backer, O., Camara, O., Paulsen, R., Ingwersen, C. & Sørensen, K. 2022, 'SparseMeshCNN with Self-Attention for Segmentation of Large Meshes', Proceedings of the Northern Lights Deep Learning Workshop, vol. 3.
Publication Year :
2022

Abstract

In many clinical applications, 3D mesh models of human anatomies are important tools for visualization, diagnosis, and treatment planning. Such 3D mesh models often have a high number of vertices to capture the complex shape, and processing these large meshes on readily available graphics cards can be a challenging task. To accommodate this, we present a sparse version of MeshCNN called SparseMeshCNN, which can process meshes with more than 60,000 edges. We further show that adding non-local attention to the network can mitigate the small receptive field and improve the results. The developed methodology was applied to separate the Left Atrial Appendage (LAA) from the Left Atrium (LA) on 3D mesh models constructed from medical images, but the method is general and can be used in any mesh classification or segmentation application where memory is a concern.
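The abstract does not detail the non-local attention mechanism. As a rough sketch only, assuming a standard scaled dot-product formulation applied across per-edge features (the function and weight names below are illustrative, not taken from the paper):

```python
import numpy as np

def nonlocal_attention(x, w_q, w_k, w_v):
    """Sketch of non-local self-attention over mesh edge features.

    x: (E, C) array of features for E mesh edges.
    w_q, w_k, w_v: (C, C) projection matrices (illustrative names).
    Every edge attends to every other edge, so the receptive field
    is global rather than limited to local mesh neighborhoods.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(q.shape[1])      # (E, E) pairwise affinities
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # row-wise softmax
    return x + attn @ v                          # residual connection

rng = np.random.default_rng(0)
E, C = 128, 16                                   # edge count, feature width
x = rng.normal(size=(E, C))
w_q, w_k, w_v = (rng.normal(size=(C, C)) * 0.1 for _ in range(3))
out = nonlocal_attention(x, w_q, w_k, w_v)       # (128, 16) attended features
```

Because the attention matrix is E × E, a full non-local layer is quadratic in the number of edges, which is presumably why it complements, rather than replaces, the sparse convolutions in SparseMeshCNN.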

Details

Database :
OAIster
Journal :
Proceedings of the Northern Lights Deep Learning Workshop, vol. 3
Notes :
application/pdf, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1397136425
Document Type :
Electronic Resource