[Defense] A Framework for Interactive Immersion into Imaging Data using Augmented Reality
Thursday, April 14, 2022
10:30 am - 11:10 am CT
The doctoral candidate will defend his dissertation
A Framework for Interactive Immersion into Imaging Data using Augmented Reality
Image acquisition scanners produce an ever-growing amount of 3D/4D multimodal data that requires extensive image analytics and visualization of collected and generated information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has been commended as a potential enhancement. This dissertation describes a framework (FI3D) for interactive and immersive experiences using an AR interface powered by image processing and analytics. The FI3D was designed to communicate with peripherals, including imaging scanners and HMDs, and to provide computational power for data acquisition and processing. The core of the FI3D is deployed to a dedicated computational unit that performs the computationally demanding processes in real time, while the HMD serves as an I/O device. The FI3D is customizable, allowing users to integrate it with different workflows while incorporating third-party libraries.

Using the FI3D as a foundation, two applications were developed in the cardiac and urology medical domains to experiment with, test, and validate the system. First, cine MRI images were segmented using a machine learning model while an HMD simultaneously rendered the reconstructed surfaces. Second, a simulated environment for robot-assisted, MRI-guided transrectal prostate biopsies was developed, and user studies were conducted to evaluate the feasibility of AR visualization and interaction using the HoloLens HMD. Performance results showed that the system can maintain a stream of five 512 x 512 images per second and update the visual properties of the holograms at one update per 16 milliseconds. The interaction studies showed that a gaming joystick allowed more effective manipulation of a robotic structure than holographic menus or traditional input interfaces, i.e., mouse and keyboard.
The presented framework can serve as the foundation for medical applications that benefit from AR visualization, removing various technical challenges from the development pipeline. Its versatility and the immersive, interactive experiences offered by the AR interface may assist physicians with diagnosis and image-guided interventions, resulting in safer and faster procedures. This, in turn, can increase patient throughput and make healthcare more accessible to the public.
Virtual via MS Teams
Dr. Nikolaos Tsekos and Dr. Ernst Leiss, dissertation co-advisors
Faculty, students, and the general public are invited.