MULTIMODAL IMAGING, COMPUTER VISION, AND AUGMENTED REALITY FOR MEDICAL GUIDANCE

A Dissertation Presented to The Graduate Faculty of The University of Akron

In Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy

Christopher Andrew Mela

December, 2018

Dissertation Approved:
Advisor: Dr. Yang Liu
Committee Member: Dr. Brian Davis
Committee Member: Dr. Rebecca K. Willits
Committee Member: Dr. Ajay Mahajan
Committee Member: Dr. Yi Pang
Committee Member: Dr. Jiahua Zhu

Accepted:
Interim Department Chair: Dr. Rebecca K. Willits
Interim Dean of the College: Dr. Craig Menzemer
Dean of the Graduate School: Dr. Chand Midha
Date:

ABSTRACT

Surgery is one of the primary treatment options for many types of diseases. Traditional methods of surgical planning and intraoperative lesion identification rely on sight and physical palpation of the suspect region. Because these methods have low specificity, doctors have come to rely on medical imaging technologies to make diagnoses and to help plan and guide surgical procedures. Preoperative imaging technologies such as Magnetic Resonance Imaging (MRI) and X-ray Computed Tomography (CT) are well suited to diagnosis and operative planning. However, compact technologies with high specificity and resolution that are convenient for intraoperative use are needed to aid in surgical guidance. Methods including fluorescence imaging, intraoperative microscopy, and ultrasound have recently gained significant attention toward these ends.

This dissertation discusses the initial design, construction, programming, testing, and expansion of a platform technology integrating multimodal medical imaging, computer vision, and augmented reality. The platform combines a real-time, head-mounted stereoscopic fluorescence imaging system in-line with a near-eye display. The compact, lightweight assembly provides the user with a wide field-of-view, line-of-sight imaging system that simulates natural binocular vision. Additionally, an ultrasound imaging module is connected and incorporated into the display, along with a portable fiber microscopy system. Lastly, preoperative MRI/CT imaging models are incorporated into the system for intraoperative registration and display onto the surgical scene.

Novel software algorithms were developed to enhance system operations. Fluorescence detection with the wearable imaging platform was improved by incorporating an optical point tracking regime with pulsatile illumination. Optical fiducial marker identification was added to aid in ultrasound and tomographic image registration. Additionally, stereoscopic depth-of-field measurements were used toward the implementation of a video-rate fluorescence-to-color co-registration scheme.
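The pulsed-illumination idea mentioned above can be illustrated with a simple frame-differencing sketch. The example below is a hypothetical Python/OpenCV illustration, not the code used in this work; the function name fluorescence_centroids and the threshold and minimum-area values are assumptions made purely for demonstration, and it presumes 8-bit grayscale frames captured with the excitation source on and off.

    # Minimal sketch: isolate excitation-driven fluorescence by differencing
    # frames captured with the excitation source pulsed ON and OFF, then
    # report centroids of bright foci that could seed an optical tracker.
    import cv2
    import numpy as np

    def fluorescence_centroids(frame_on: np.ndarray, frame_off: np.ndarray,
                               thresh: int = 25, min_area: int = 10):
        """Return (x, y) centroids of candidate fluorescent foci."""
        # Ambient and reflected light appear in both frames; the difference
        # retains mostly the fluorescence excited by the pulsed source.
        diff = cv2.subtract(frame_on, frame_off)
        diff = cv2.GaussianBlur(diff, (5, 5), 0)  # suppress shot noise
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
        # Label 0 is the background; reject tiny speckle regions.
        return [tuple(centroids[i]) for i in range(1, n)
                if stats[i, cv2.CC_STAT_AREA] >= min_area]

Differencing on/off frame pairs, rather than thresholding a single frame, makes detection less sensitive to ambient room light, which is the practical motivation for pulsatile illumination.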
System testing was conducted on multiple fronts. Fluorescence imaging sensitivity was evaluated to determine the minimum detectable concentrations of fluorescent dye. Surgical and medical diagnostic simulations were also conducted using optical tissue phantoms to evaluate device performance relative to traditional methods and to identify areas for improvement. System resolution was analyzed both in planar spatial coordinates and in depth-of-field measurements. The performance of various augmented reality displays was tested with respect to fidelity of fluorescence identification and resolution. Lastly, the system was tested for registration accuracy. In summary, we have developed a platform integrating intraoperative multimodal imaging, computer vision, and augmented reality for guiding surgeries and other medical applications.

ACKNOWLEDGEMENTS

Special thanks to my advisor, Dr. Yang Liu, who took me on when few others would. Thanks to my lab mates, Tri Quang and Maziyar Askari, for working with me these past few years. Thanks as well to my committee members for, well, being on my committee. I'm sure reading this will be fun. Additional acknowledgements to Stephen Paterson, Visar Berki, and Drs. Forrest Bao, Vivek Nagarajan, and Narrender Reddy for their technical support at various times during my time at the University of Akron. Further gratitude to Charlotte LaBelle and Sandy Vasenda for assisting with my many and varied administrative needs, as well as to Dr. Daniel Sheffer, who accepted my application and was the first person to welcome me to Akron. Thanks to our clinical collaborators at the Cleveland Clinic, in particular Dr. Frank Papay, who made our collaborations possible. Additional thanks to Dr. Stephen Grobmyer, who kindly brought our system into surgery, and to Drs. Edward Maytin, Maria Madajka, and Eliana Duraes for collaborating with us on clinical imaging research. A big thanks to NASA, William Thompson at Glenn Research Center, and Baraquiel Reyna at Johnson, as well as to the whole NASA Space Technologies Research Fellowship team.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

CHAPTER

I. INTRODUCTION
  1.1. Imaging for Surgical Oncology
  1.2. Intraoperative Fluorescence Imaging for Surgical Interventions
    1.2.1. Fluorescent Dyes
    1.2.2. Fluorescence Imaging in Surgery
    1.2.3. Instrumentation in Fluorescence Imaging
    1.2.4. Fluorescence Imaging Systems
  1.3. Augmented Reality in Medical Imaging
  1.4. Multimodal Imaging
    1.4.1. Ultrasound
    1.4.2. Radiology
    1.4.3. MRI/CT
  1.5. Scope and Aims

II. STEREOSCOPIC IMAGING GOGGLES FOR MULTIMODAL INTRAOPERATIVE IMAGE GUIDANCE
  2.1. Introduction
  2.2. Materials and Methods
    2.2.1. Imaging System Instrumentation
    2.2.2. Image Acquisition, Processing, Registration and Display
    2.2.3. System Characterization
    2.2.4. Image-Guided Surgery in Chicken Ex Vivo
    2.2.5. Telemedicine
  2.3. Results
    2.3.1. System Characterization
    2.3.2. Image-Guided Surgeries in Chicken
    2.3.3. Telemedicine
  2.4. Discussion
    2.4.1. Stereoscopy
    2.4.2. Characterization
    2.4.3. Microscopy
    2.4.4. Ultrasound
    2.4.5. Future Work
  2.5. Conclusions
III. METHODS OF CHARACTERIZATION FOR A STEREOSCOPIC HEAD-MOUNTED FLUORESCENCE IMAGING SYSTEM
  3.1. Introduction
  3.2. Materials and Methods
    3.2.1. Optical Imaging and Display
    3.2.2. Computation
    3.2.3. Fluorescence Detection Sensitivity
    3.2.4. Fluorescence Guided Surgical Simulation
    3.2.5. Resolution Testing
    3.2.6. Display Testing
  3.3. Results
    3.3.1. Fluorescence Detection Sensitivity
    3.3.2. Fluorescence Guided Surgical Simulation
    3.3.3. Resolution Testing
    3.3.4. Display Testing
  3.4. Discussion
    3.4.1. Dark Room Study
    3.4.2. Tissue Phantom Study
    3.4.3. Display Testing
  3.5. Conclusions

IV. APPLICATION OF A DENSE FLOW OPTICAL TRACKING ALGORITHM WITH PULSED LIGHT IMAGING FOR ENHANCED FLUORESCENCE DETECTION
  4.1. Introduction
    4.1.1. Enhancing Fluorescence Imaging for Clinical Application
    4.1.2. Pulsed Light Imaging for Skin Cancer Therapy
  4.2. Materials and Methods
    4.2.1. Computation
    4.2.2. Instrumentation
    4.2.3. Illumination
    4.2.4. Fluorescent Point Tracking
    4.2.5. Fluorescence Sensitivity
    4.2.6. Fluorescent Point Tracking Accuracy
  4.3. Results
    4.3.1. Pulsed Light Imaging
    4.3.2. Fluorescent Point Tracking Accuracy
    4.3.3. Fluorescence Sensitivity
  4.4. Discussion
    4.4.1. Pulsed Light Imaging
    4.4.2. Fluorescent Point Tracking Accuracy
    4.4.3. Fluorescence Sensitivity
  4.5. Conclusions

V. MULTIMODAL IMAGING GOGGLE WITH AUGMENTED REALITY COMBINING FLUORESCENCE, ULTRASOUND AND TOMOGRAPHICAL IMAGING
  5.1. Introduction
    5.1.1. Single and Multimode Imaging
    5.1.2. Multimodal Image Registration
    5.1.3. Augmented Reality Fluorescence Imaging System with Multimode Registration
  5.2. Materials and Methods
    5.2.1. Optical