Intra-interventional respiratory motion estimation is an essential part of modern radiation therapy delivery and high intensity focused ultrasound systems. Treatment quality could benefit tremendously from more accurate dose delivery using realtime motion tracking based on magnetic resonance (MR) or ultrasound (US) imaging techniques. However, using images acquired during the treatment for motion estimation requires dedicated tracking approaches capable of realtime processing. In this work, we present a new intra-interventional respiratory motion estimation approach that can be used in conjunction with state-of-the-art realtime imaging modalities like MR-Linac scanners and 3D-US. Our novel approach combines GPU-accelerated image-based realtime tracking of sparsely distributed feature points via block matching with a dense patient-specific motion model in a unified optimisation framework. Using only a few feature points for image matching allows for rapid computations, while the PCA-based motion model serves as a learned motion prior that regularises the matching results to achieve the robustness of traditional registration approaches, including a correct treatment of sliding motion. Furthermore, the motion model is also employed to interpolate dense motion fields for the whole treatment region from the motion initially estimated at the feature points. In an extensive evaluation on publicly available MRI and US data sets, our approach achieves highly accurate motion predictions in realtime, with landmark errors of approx. 1 mm for MRI and approx. 2 mm for US. In addition, our approach shows substantial improvements over classical template tracking strategies often used in practice. In conclusion, our new model-based sparse-to-dense image registration approach allows for accurate and robust realtime respiratory motion tracking in image-guided interventions. Our source code is publicly available at https://github.com/mattiaspaul/realtimeDeeds.
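The core idea described above, fitting the coefficients of a PCA motion model to a handful of sparse block-matching displacements and then reconstructing a dense motion field from the model, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dimensions, the synthetic "block-matching" observations, and the regularisation weight `lam` are all assumptions chosen for the example; the real system obtains the sparse displacements from GPU-accelerated block matching on MR or US images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): a dense 3-D motion field over
# N voxels, a PCA model with K modes learned offline, S sparse feature points.
N, K, S = 5000, 4, 50

mean_field = 0.1 * rng.standard_normal(3 * N)              # PCA mean (flattened)
basis = np.linalg.qr(rng.standard_normal((3 * N, K)))[0]   # orthonormal PCA modes

# Rows of the flattened field belonging to the sparse feature points
# (three displacement components per point).
pts = rng.choice(N, S, replace=False)
rows = np.concatenate([3 * pts, 3 * pts + 1, 3 * pts + 2])

# Simulated ground truth inside the model space, plus noisy sparse
# observations standing in for the block-matching results.
c_true = rng.standard_normal(K)
dense_true = mean_field + basis @ c_true
sparse_obs = dense_true[rows] + 0.01 * rng.standard_normal(rows.size)

# Regularised least-squares fit of the model coefficients to the sparse
# displacements; lam plays the role of the learned motion prior's weight.
A = basis[rows]
lam = 1e-4
rhs = A.T @ (sparse_obs - mean_field[rows])
c_hat = np.linalg.solve(A.T @ A + lam * np.eye(K), rhs)

# Sparse-to-dense step: the fitted model interpolates a dense motion field
# for the whole treatment region from the sparse estimates.
dense_est = mean_field + basis @ c_hat
err = np.abs(dense_est - dense_true).mean()
print(f"mean absolute reconstruction error: {err:.4f}")
```

Because the dense field is constrained to the span of the learned PCA modes, a few reliable sparse matches suffice to determine the whole field, which is what makes the realtime sparse-to-dense strategy both fast and robust.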