1. Reconstructing unseen modalities and pathology with an efficient Recurrent Inference Machine
- Author
Karkalousos, Dimitrios, Lønning, Kai, Hulst, Hanneke E., Dumoulin, Serge O., Sonke, Jan-Jakob, Vos, Frans M., and Caan, Matthan W. A.
- Subjects
Electrical Engineering and Systems Science - Image and Video Processing
- Abstract
Objective: To enable efficient learning with the Recurrent Inference Machine (RIM) for image reconstruction, without strict dependence on the training data distribution, so that unseen modalities and pathologies are still accurately recovered. Methods: In theory, the RIM learns to solve the inverse problem of accelerated-MRI reconstruction while remaining robust to variable imaging conditions. Efficiency and generalization were studied across different training datasets, as well as recurrent network units of decreasing complexity: the Gated Recurrent Unit (GRU), the Minimal Gated Unit (MGU), and the Independently Recurrent Neural Network (IndRNN), to reduce inference times. Validation was performed against Compressed Sensing (CS) and further assessed on data unseen during training. A pathology study was conducted by reconstructing simulated white-matter lesions and prospectively undersampled data from a Multiple Sclerosis patient. Results: Training on a single modality, 3T $T_1$-weighted brain data, proved sufficient to also reconstruct 7T $T_2^*$-weighted brain and 3T $T_2$-weighted knee data. The IndRNN is an efficient recurrent unit, reducing inference time by 68\% compared to CS while maintaining performance. When trained on $T_2$-weighted knee data, the RIM reconstructed lesions unseen during training more accurately than CS. Training on $T_1$-weighted brain data and on combined data slightly enhanced the signal compared to CS. Conclusion: The RIM remains efficient as its complexity decreases, which reduces inference time, while still being able to reconstruct data and pathology unseen during training.
- Comment
20 pages, 8 figures
- Published
2020
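The abstract compares recurrent units of decreasing complexity; the IndRNN it highlights replaces a full recurrent weight matrix with an element-wise recurrent weight vector, so each neuron recurs only on its own previous state. A minimal sketch of one such step — not the authors' implementation; the array shapes, ReLU activation, and function name are illustrative assumptions:

```python
import numpy as np

def indrnn_step(x, h_prev, W, u, b):
    """One IndRNN step (sketch).

    Unlike a GRU, recurrence is element-wise: `u` is a vector
    multiplied componentwise with the previous hidden state,
    so neurons are independent of each other across time.
    """
    # Input projection + independent recurrence, then ReLU.
    return np.maximum(0.0, x @ W + u * h_prev + b)

# Tiny illustrative call with hand-picked weights.
W = np.eye(2)            # input projection (identity for clarity)
u = np.ones(2)           # per-neuron recurrent weights
b = np.zeros(2)          # bias
h = indrnn_step(np.array([1.0, -2.0]), np.array([0.5, 0.5]), W, u, b)
```

The element-wise recurrence is what makes the unit cheap at inference time: the recurrent term costs O(n) per step instead of the O(n^2) matrix-vector product of a GRU.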