
Personalised pose estimation from single-plane moving fluoroscope images using deep convolutional neural networks

Authors :
Vogl, Florian
Schütz, Pascal
Postolka, Barbara
List, Renate
Taylor, William R.
Source :
PLoS ONE, 17(6)
Publication Year :
2022
Publisher :
ETH Zurich, 2022.

Abstract

Measuring joint kinematics is a key requirement for a plethora of biomechanical research and applications. While X-ray-based systems avoid the soft-tissue artefacts arising in skin-based measurement systems, extracting the object's pose (translation and rotation) from the X-ray images is a time-consuming and expensive task. Based on about 106'000 annotated images of knee implants, collected over the last decade with our moving fluoroscope during activities of daily living, we trained a deep-learning model to automatically estimate the 6D poses for the femoral and tibial implant components. By pretraining a single stage of our architecture using renderings of the implant geometries, our approach offers personalised predictions of the implant poses, even for unseen subjects. Our approach predicted the pose of both implant components better than about 0.75 mm (in-plane translation), 25 mm (out-of-plane translation), and 2° (all Euler-angle rotations) over 50% of the test samples. When evaluating over 90% of test samples, which included heavy occlusions and low-contrast images, translation performance was better than 1.5 mm (in-plane) and 30 mm (out-of-plane), while rotations were predicted better than 3-4°. Importantly, this approach now allows for pose estimation in a fully automated manner.
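The abstract describes a convolutional network that regresses the 6D pose (three translations plus three Euler-angle rotations) of each implant component from a single-plane fluoroscopy frame. As a rough illustration only, the sketch below shows what such a regression setup could look like in PyTorch; the class name PoseRegressor, the layer choices, the input resolution, and the L1 placeholder loss are assumptions for illustration and are not taken from the paper, whose multi-stage architecture and rendering-based pretraining are not detailed in this record.

    # Hypothetical minimal sketch: CNN regressing a 6D pose per implant
    # component (femoral, tibial) from a grayscale fluoroscopy image.
    # Not the authors' architecture; names and shapes are illustrative.
    import torch
    import torch.nn as nn

    class PoseRegressor(nn.Module):
        def __init__(self, n_components: int = 2):  # femoral + tibial
            super().__init__()
            self.n_components = n_components
            self.backbone = nn.Sequential(
                nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            # 6 outputs per component: (tx, ty, tz, rx, ry, rz)
            self.head = nn.Linear(128, 6 * n_components)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            feats = self.backbone(x).flatten(1)
            return self.head(feats).view(-1, self.n_components, 6)

    model = PoseRegressor()
    frames = torch.randn(4, 1, 256, 256)          # batch of fluoroscopy frames
    poses = model(frames)                          # shape: (4, 2, 6)
    loss = nn.functional.l1_loss(poses, torch.zeros_like(poses))  # placeholder target

In such a setup, a backbone stage could first be pretrained on synthetic renderings of the implant geometries (as the abstract indicates) before fine-tuning the full model on the annotated fluoroscopy images.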

Details

Language :
English
ISSN :
1932-6203
Database :
OpenAIRE
Journal :
PLoS ONE, 17(6)
Accession number :
edsair.doi.dedup.....2b235e03b8569d6319e904743f01da08
Full Text :
https://doi.org/10.3929/ethz-b-000557208