A New Augmented Reality Assisted Image Guidance System for Cardiac Interventions
Description

The objective of this research is to develop a quantitative and intuitive augmented reality (AR) system for visualizing and planning percutaneous cardiac interventions. Cardiac intervention is an increasingly favored treatment for a wide variety of cardiovascular diseases and is typically performed under the guidance of real-time imaging, such as X-ray fluoroscopy and echocardiography. Fluoroscopy is ubiquitous during percutaneous interventions: it is the most accessible modality to operate, provides real-time imaging, and easily visualizes the radiopaque markers commonly used on transcatheter devices. However, the soft tissue of the heart is effectively transparent to fluoroscopy, so contrast agents, which transiently opacify the structure of interest, are used to visualize the position of the catheter relative to the surrounding tissue. These limitations of current imaging techniques increase the complexity of existing procedures, requiring interventionalists to infer the position of the catheter or device by mentally integrating images from multiple imaging angles based on their individual experience. Preoperative imaging modalities such as computed tomography (CT) and magnetic resonance imaging (MRI), which provide detailed anatomical information, are often displayed on separate screens or overlaid on the real-time modalities (fluoroscopy and echocardiography) to improve image-guided interventions. However, this method of displaying hybrid images adds little information and often obstructs the view of the real-time image during the procedure. Furthermore, all of these images are displayed on 2D screens, which fundamentally limits depth perception and, thus, the ability to perceive the precise position and angle of a catheter within the heart.
To address these limitations, we propose a novel method of image guidance that displays, in real time, high-resolution 3D holographic renderings of the catheter and the patient's heart using AR devices. The geometry of the patient's heart will be generated prior to the procedure via automated segmentation of a cardiac CT scan. The real-time position and orientation of the catheter will be detected by processing fluoroscopic images acquired during the intervention and updated in the AR rendering of the patient's heart. Both processes will be automated using machine learning algorithms trained on a set of manually segmented ground-truth images. Co-registration of the heart (from the CT scan) and the catheter (from the fluoroscopic images) will be performed using the spine as a universal fiducial marker, since it is stationary and visible in both imaging modalities. This novel AR-based image-guidance system will be validated by physicians using a benchtop heart model that can also serve as a training setup for interventionalists. We believe this technology could have a significant, long-term impact on the accuracy of transcatheter device delivery, facilitating the acquisition of new procedural skill sets and lowering the learning curve for existing procedures. The usefulness of the image-guidance system will be demonstrated on a transseptal puncture, a step common to many left-heart interventions. However, the guidance system is applicable to any transcatheter procedure for which a CT scan has been acquired beforehand, and thus has the potential for broad impact on minimally invasive procedures. Overall, the presented scope of work will generate libraries of segmented images, further the use of machine learning algorithms in cardiac procedures, and develop a training tool useful for physician education.
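The co-registration step described above amounts to a rigid alignment of corresponding landmarks (e.g., vertebral points on the spine) identified in both the CT and fluoroscopy coordinate frames. As a minimal sketch, assuming paired 3D landmark coordinates are already available in each frame, the standard Kabsch algorithm recovers the rotation and translation that map one frame onto the other; the function name and setup here are illustrative, not part of the proposed system.

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate the rotation R and translation t that best map the
    source landmarks onto the destination landmarks in a least-squares
    sense (Kabsch algorithm): dst_i ~= R @ src_i + t.

    src, dst: (N, 3) arrays of corresponding 3D points.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Center both point sets on their centroids.
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

In practice the spine landmarks from single-plane fluoroscopy are 2D, so a 2D/3D registration (e.g., optimizing a projection model) would replace this point-to-point step; the sketch shows only the underlying rigid-alignment idea.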
Effective start/end date: 1/01/21 → …