The current and future use of imaging in urological robotic surgery: A survey of the European Association of Robotic Urological Surgeons

Archie Hughes-Hallett, MRCS (1), Erik K Mayer, PhD (1), Philip Pratt, PhD (2), Alex Mottrie, PhD (3,4), Ara Darzi, FRS (1,2), Justin Vale, MS (1)

1. Department of Surgery and Cancer, Imperial College London
2. The Hamlyn Centre for Robotic Surgery, Imperial College London
3. Department of Urology, OLV Clinic, Aalst, Belgium
4. O.L.V. Vattikuti Robotic Surgery Institute, Aalst, Belgium

Corresponding Author
Erik Mayer
Department of Surgery and Cancer, Imperial College London, St Mary's Hospital Campus, London, W2 1NY
07984195642
[email protected]

No reprints will be available from the authors.

No financial support was received.

Article Category: Original Article

Word count abstract: 244
Word count manuscript text: 2,142
5 figures and 2 tables

Introduction

Since Röntgen first utilised x-rays to image the carpal bones of the human hand in 1895, medical imaging has evolved and is now able to provide a detailed representation of a patient's intracorporeal anatomy, with recent advances now allowing for 3-dimensional (3D) reconstructions. The visualisation of anatomy in 3D has been shown to improve the ability to localise structures when compared to 2D, with no change in the amount of cognitive loading [1]. This has allowed imaging to move from a largely diagnostic tool to one that can be used for both diagnosis and operative planning.

One potential interface for displaying 3D images, maximising their potential as a tool for surgical guidance, is to overlay them onto the endoscopic operative scene (augmented reality). This addresses, in part, a criticism often levelled at robotic surgery: the loss of haptic feedback. Augmented reality has the potential to mitigate this sensory loss by enhancing the surgeon's visual cues with information regarding subsurface anatomical relationships [2].

Augmented reality surgery is in its infancy for intra-abdominal procedures, due in large part to the difficulty of applying static preoperative imaging to a constantly deforming intraoperative scene [3]. There are case reports and ex-vivo studies in the literature examining the technology in minimal access prostatectomy [3–6] and partial nephrectomy [7–10], but there remains a lack of evidence determining whether surgeons feel there is a role for the technology and, if so, for what procedures they feel it would be efficacious.

This questionnaire-based study was designed to assess: firstly, the pre- and intraoperative imaging modalities utilised by robotic urologists; secondly, the current use of imaging intraoperatively for surgical planning; and finally, whether there is a desire for augmented reality amongst the robotic urological community.

Methods

Recruitment

A web-based survey instrument was designed and sent out, as part of a larger survey, to members of the EAU Robotic Urology Section (ERUS). Only independently practising robotic surgeons performing RALP, RAPN and/or robotic cystectomy were included in the analysis; surgeons exclusively performing other procedures were excluded. Respondents were offered no incentives to reply. All data collected were anonymous.

Survey design and administration

The questionnaire was created using the LimeSurvey platform (www.limesurvey.com) and hosted on their website. All responses (both complete and incomplete) were included in the analysis. The questionnaire was dynamic, with the questions displayed tailored to the respondents' previous answers.

When computing fractions or percentages, the denominator was the number of respondents answering the question; this number varies due to the dynamic nature of the questionnaire.

Survey Content

Demographics

All respondents to the survey were asked in what country they practised and what robotic urological procedures they performed; in addition, surgeons were asked to specify the number of cases they had undertaken for each procedure.

Current Imaging Practice

Procedure-specific questions in this group were displayed according to the operations the respondent performed. A summary of the questions can be seen in appendix 1. Procedure non-specific questions were also asked. Participants were asked whether they routinely used the TilePro™ function of the da Vinci console (Intuitive Surgical, Sunnyvale, USA) and whether they routinely viewed imaging intraoperatively.

Augmented Reality

Prior to answering questions in this section, participants were invited to watch a video demonstrating an augmented reality platform during Robot-Assisted Partial Nephrectomy (RAPN), performed by our group at Imperial College London. A still from this video can be seen in figure 1. Participants were then asked whether they felt augmented reality would be of use as a navigation or training tool in robotic surgery.

Once again, in this section, procedure-specific questions were displayed according to the operations the respondent performed. Only those respondents who felt augmented reality would be of use as a navigation tool were asked procedure-specific questions. These questions were asked to establish where in each procedure they felt an augmented reality environment would be of use.

Results

Demographics

Of the 239 respondents completing the survey, 117 were independently practising robotic surgeons and were therefore eligible for analysis. The majority of the surgeons had both trained (210/239, 87.9%) and worked (215/239, 90.0%) in Europe. The median (range) number of cases undertaken by those surgeons reporting their case volume was 120 (6–2000), 9 (1–120) and 30 (1–270) for RALP, robot-assisted cystectomy and RAPN respectively.

Contemporary use of imaging in robotic surgery

When enquiring about the use of imaging for surgical planning, the majority of surgeons (57%, 65/115) routinely viewed preoperative imaging intraoperatively, yet only 9% (13/137) routinely capitalised on the TilePro™ function of the console to display these images. Amongst surgeons who performed RAPN, 13.8% (9/65) reported using TilePro™ routinely.

When assessing the imaging modalities available to a surgeon in theatre, the majority of surgeons performing RALP (74%, 78/106) reported using MRI, with an additional 37% (39/106) reporting the use of CT for preoperative staging and/or planning. For surgeons performing RAPN and robot-assisted cystectomy there was more of a consensus, with 97% (68/70) and 95% (54/57) of surgeons, respectively, using CT for routine preoperative imaging (table 1).

Those surgeons performing RAPN showed the most diversity in the way they viewed preoperative images in theatre, routinely viewing images in sagittal, coronal and axial slices (table 2). The majority of these surgeons also viewed the images as 3D reconstructions (54%, 38/70).

The majority of surgeons used ultrasound intraoperatively in RAPN (51%, 35/69), with a further 25% (17/69) reporting they would use it if they had access to a 'drop-in' ultrasound probe (figure 3).

Desire for augmented reality

In all, 87% of respondents envisaged a role for augmented reality as a navigation tool in robotic surgery, and 82% (88/107) felt that there was an additional role for the technology as a training tool.

The greatest desire for augmented reality was amongst those surgeons performing RAPN, with 86% (54/63) feeling the technology would be of use. The largest group of surgeons felt it would be useful in identifying tumour location, with significant numbers also feeling it would be efficacious in tumour resection (figure 4).

When enquiring about the potential for augmented reality in Robot-Assisted Laparoscopic Prostatectomy (RALP), 79% (20/96) of respondents felt it would be of use during the procedure, with the largest group (65%, 62/96) feeling it would be helpful for nerve sparing (figure 2). The picture in cystectomy was similar, with 74% (37/50) of surgeons believing augmented reality would be of use, and with both nerve sparing and apical dissection highlighted as specific examples (40%, 20/50) (figure 5). The majority also felt that it would be useful for lymph node dissection in both RALP and robot-assisted cystectomy (55% (52/95) and 64% (32/50) respectively).

Discussion

The results from this study suggest that the contemporary robotic surgeon views imaging as an important adjunct to operative practice. The way these images are being viewed is changing; although the majority of surgeons continue to view images as two-dimensional (2D) slices, a significant minority have started to capitalise on 3D reconstructions to gain an improved appreciation of the patient's anatomy.

This study has highlighted surgeons' willingness to take the next step in the utilisation of imaging for operative planning, augmented reality, with 87% feeling it has a role to play in robotic surgery. Although there appears to be considerable desire for augmented reality, the technology itself is still in its infancy, with the limited evidence demonstrating clinical application reporting only qualitative results [3,11–13].

There are a number of significant issues that need to be overcome before augmented reality can be adopted in routine clinical practice. The first of these is registration (the process by which two images are positioned in the same coordinate system such that the locations of corresponding points align [14]). This process has been performed both manually and using automated algorithms, with varying degrees of accuracy [2,15]. The second issue pertains to the use of static preoperative imaging in a dynamic operative environment; in order for the preoperative imaging to be accurately registered it must be deformable. This problem remains as yet unresolved.

Live intraoperative imaging circumvents the problems of tissue deformation, and in RAPN 51% of surgeons reported already using intraoperative ultrasound to aid in tumour resection. Cheung and colleagues [9] have published an ex-vivo study highlighting the potential for intraoperative ultrasound in augmented reality partial nephrectomy. They reported that overlaying ultrasound onto the operative scene improved the surgeon's appreciation of the subsurface tumour anatomy; this improvement in anatomical appreciation resulted in improved resection quality over conventional ultrasound-guided resection [9]. Building on this work, the first in-vivo use of overlaid ultrasound in RAPN has recently been reported [10]. Although good subjective feedback was received from the operating surgeon, the study was limited to a single case demonstrating feasibility and as such was not able to show an outcome benefit of the technology [10].

RAPN also appears to be the area in which augmented reality would be most readily adopted, with 86% of surgeons claiming they see a use for the technology during the procedure. Within this operation there are two obvious steps for augmentation: anatomical identification (in particular vessel identification, to facilitate both routine 'full clamping' and the identification of secondary and tertiary vessels for 'selective clamping' [16]) and tumour resection. These two phases have different requirements of an augmented reality platform. The first phase, identification, requires a gross overview of the anatomy without the need for high levels of registration accuracy. Tumour resection, however, necessitates almost sub-millimetre accuracy in registration and needs the system to account for the dynamic intraoperative environment. The step of anatomical identification is amenable to the use of non-deformable 3D reconstructions of preoperative imaging, while that of image-guided tumour resection is perhaps better suited to augmentation with live imaging such as ultrasound [2,9,17].

For RALP and robot-assisted cystectomy, the steps in which surgeons felt augmented reality would be of assistance were those of neurovascular bundle preservation and apical dissection. The perceived relative efficacy of augmented reality in these steps correlates with previous examinations of augmented reality in RALP [18,19]. Although surgeon preference for utilising AR while undertaking robotic prostatectomy has been demonstrated, Thompson et al. failed to demonstrate an improvement in oncological outcomes in those patients undergoing AR RALP [19].

Both nerve sparing and apical dissection require a high level of registration accuracy and either live imaging or the deformation of preoperative imaging to match the operative scene; achieving this level of registration accuracy is made more difficult by the mobilisation of the prostate gland during the operation [18]. These problems are equally applicable to robot-assisted cystectomy. Although guidance systems for RALP have been proposed in the literature [3,4,13,18,20], none has achieved the level of accuracy required to provide assistance during nerve sparing. Additionally, there are still imaging challenges that need to be overcome. Although multiparametric MRI has been shown to improve decision making when opting for a nerve-sparing approach to RALP [21], the imaging is not yet able to reliably discern the exact location of the neurovascular bundle. This said, significant advances are being made, with novel imaging modalities on the horizon that may allow for imaging of the neurovascular bundle in the near future [22].

Limitations

The number of operations included represents a significant limitation of the study; had different index procedures been chosen, different results may have been seen. This being said, the index procedures selected were chosen because they represent the vast majority of uro-oncological robotic surgical practice, largely mitigating this shortfall.

Although the available ex-vivo evidence suggests that introducing augmented reality operating environments into surgical practice would help to improve outcomes [9,23], the in-vivo experience to date is limited to small-volume case series reporting feasibility [2,3,15]. To date no study has demonstrated an in-vivo outcome advantage of augmented reality guidance. In addition to this limitation, augmented reality has been demonstrated to increase rates of inattention blindness amongst surgeons, suggesting there is a trade-off between increasing visual information and the surgeon's ability to appreciate unexpected operative events [23].

Conclusions

This survey depicts the contemporary robotic surgeon as comfortable with the use of imaging to aid in intraoperative planning; furthermore, it highlights a significant interest amongst the urological community in augmented reality operating platforms.

Short- to medium-term development of augmented reality systems in robotic urological surgery would be best performed using RAPN as the index procedure. Not only was this the operation where surgeons saw the greatest potential benefits, but it may also be the operation where augmentation is most easily achievable, by capitalising on the respective benefits of technologies the surgeons are already using: preoperative CT for anatomical identification and intraoperative ultrasound for tumour resection.

Conflicts of Interest

None of the authors have any conflicts of interest to declare.

References

1. Foo J-L, Martinez-Escobar M, Juhnke B, Cassidy K, Hisley K, Lobe T, Winer E. Evaluating mental workload of two-dimensional and three-dimensional visualization for anatomical structure localization. J Laparoendosc Adv Surg Tech A. 2013;23(1):65–70.
2. Hughes-Hallett A, Mayer EK, Marcus HJ, Cundy TP, Pratt PJ, Darzi AW, Vale J. Augmented reality partial nephrectomy: examining the current status and future perspectives. Urology. 2014;83(2):266–73.
3. Sridhar AN, Hughes-Hallett A, Mayer EK, Pratt PJ, Edwards PJ, Yang G-Z, Darzi AW, Vale J. Image-guided robotic interventions for prostate cancer. Nat Rev Urol. 2013;10(8):452–62.
4. Cohen D, Mayer E, Chen D, Anstee A, Vale J, Yang G-Z, Darzi A, Edwards P. Augmented reality image guidance in minimally invasive prostatectomy. Lect Notes Comput Sci. 2010;6367:101–10.
5. Simpfendorfer T, Baumhauer M, Muller M, Gutt CN, Meinzer HP, Rassweiler JJ, Guven S, Teber D. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011;25(12):1841–5.
6. Teber D, Simpfendorfer T, Guven S, Baumhauer M, Gozen AS, Rassweiler J. In-vitro evaluation of a soft-tissue navigation system for laparoscopic prostatectomy. J Endourol. 2010;24(9):1487–91.
7. Teber D, Guven S, Simpfendörfer T, Baumhauer M, Güven EO, Yencilek F, Gözen AS, Rassweiler JJ. Augmented reality: a new tool to improve surgical accuracy during laparoscopic partial nephrectomy? Preliminary in vitro and in vivo results. Eur Urol. 2009;56(2):332–8.
8. Pratt P, Mayer E, Vale J, Cohen D, Edwards E, Darzi A, Yang G-Z. An effective visualisation and registration system for image-guided robotic partial nephrectomy. J Robot Surg. 2012;6(1):23–31.
9. Cheung CL, Wedlake C, Moore J, Pautler SE, Peters TM. Fused video and ultrasound images for minimally invasive partial nephrectomy: a phantom study. Med Image Comput Comput Assist Interv. 2010;13(Pt 3):408–15.
10. Hughes-Hallett A, Pratt P, Mayer E, Di Marco A, Yang G-Z, Vale J, Darzi A. Intraoperative ultrasound overlay in robot-assisted partial nephrectomy: first clinical experience. Eur Urol. 2013;
11. Nakamura K, Naya Y, Zenbutsu S, Araki K, Cho S, Ohta S, Nihei N, Suzuki H, Ichikawa T, Igarashi T. Surgical navigation using three-dimensional computed tomography images fused intraoperatively with live video. J Endourol. 2010;24(4):521–4.
12. Teber D, Guven S, Simpfendorfer T, Baumhauer M, Guven EO, Yencilek F, Gozen AS, Rassweiler J. Augmented reality: a new tool to improve surgical accuracy during laparoscopic partial nephrectomy? Preliminary in vitro and in vivo results. Eur Urol. 2009;56(2):332–8.
13. Ukimura O, Gill IS. Imaging-assisted endoscopic surgery: Cleveland Clinic experience. J Endourol. 2008;22(4):803–9.
14. Altamar HO, Ong RE, Glisson CL, Viprakasit DP, Miga MI, Herrell SD, Galloway RL. Kidney deformation and intraprocedural registration: a study of elements of image-guided kidney surgery. J Endourol. 2011;25(3):511–7.
15. Nicolau S, Soler L, Mutter D, Marescaux J. Augmented reality in laparoscopic surgical oncology. Surg Oncol. 2011;20(3):189–201.
16. Ukimura O, Nakamoto M, Gill IS. Three-dimensional reconstruction of renovascular-tumor anatomy to facilitate zero-ischemia partial nephrectomy. Eur Urol. 2012;61(1):211–7.
17. Pratt P, Hughes-Hallett A, Di Marco A, Cundy T, Mayer E, Vale J, Darzi A, Yang G-Z. Multimodal reconstruction for image-guided interventions. Hamlyn Symposium. 2013.
18. Mayer EK, Cohen D, Chen D, Anstee A, Vale JA, Yang GZ, Darzi AW, Edwards E. Augmented reality image guidance in minimally invasive prostatectomy. Eur Urol Suppl. 2011;10(2):300.
19. Thompson S, Penney G, Billia M, Challacombe B, Hawkes D, Dasgupta P. Design and evaluation of an image-guidance system for robot-assisted radical prostatectomy. BJU Int. 2013;111(7):1081–90.
20. Simpfendorfer T, Baumhauer M, Muller M, Gutt CN, Meinzer H-P, Rassweiler JJ, Guven S, Teber D. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011;25(12):1841–5.
21. Panebianco V, Salciccia S, Cattarino S, Minisola F, Gentilucci A, Alfarone A, Ricciuti GP, Marcantonio A, Lisi D, Gentile V, Passariello R, Sciarra A. Use of multiparametric MR with neurovascular bundle evaluation to optimize the oncological and functional management of patients considered for nerve-sparing radical prostatectomy. J Sex Med. 2012;9(8):2157–66.
22. Rai S, Srivastava A, Sooriakumaran P, Tewari A. Advances in imaging the neurovascular bundle. Curr Opin Urol. 2012;22(2):88–96.
23. Dixon BJ, Daly MJ, Chan H, Vescan AD, Witterick IJ, Irish JC. Surgeons blinded by enhanced navigation: the effect of augmented reality on attention. Surg Endosc. 2013;27(2):454–61.

Tables

Table 1 - Which preoperative imaging modalities do you use for diagnosis and surgical planning?

                    CT           MRI          USS          None         Other
RALP (n=106)        36.8% (39)   73.6% (78)   2.8% (3)     15.1% (16)   8.5% (9)
RAPN (n=70)         97.1% (68)   42.9% (30)   17.1% (12)   0% (0)       2.9% (2)
Cystectomy (n=57)   94.7% (54)   26.3% (15)   1.8% (1)     1.8% (1)     5.3% (3)

Table 2 - How do you typically view preoperative imaging in the OR? (3D recons. = three-dimensional reconstructions)

                    Axial slices   Coronal slices   Sagittal slices   3D recons.   Do not view
RALP (n=106)        49.1% (52)     44.3% (47)       31.1% (33)        9.4% (10)    31.1% (33)
RAPN (n=70)         68.6% (48)     74.3% (52)       60.0% (42)        54.3% (38)   0% (0)
Cystectomy (n=57)   70.2% (40)     52.6% (30)       50.9% (29)        21.1% (12)   8.8% (5)

Figure Legends

Figure 1 - A still taken from a video of augmented reality robot-assisted partial nephrectomy. Here the tumour has been painted into the operative view, allowing the surgeon to appreciate the relationship of the tumour to the surface of the kidney.

Figure 2 - Chart demonstrating responses to the question: In robotic prostatectomy, which parts of the operation do you feel augmented reality image overlay would be of assistance?

Figure 3 - Chart demonstrating responses to the question: Do you use intraoperative ultrasound for robotic partial nephrectomy?

Figure 4 - Chart demonstrating responses to the question: In robotic partial nephrectomy, which parts of the operation do you feel augmented reality image overlay would be of assistance?

Figure 5 - Chart demonstrating responses to the question: In robotic cystectomy, which parts of the operation do you feel augmented reality overlay technology would be of assistance?
