
The current and future use of imaging in urological robotic surgery: A survey of the European Association of Robotic Urological Surgeons

Archie Hughes-Hallett, MRCS1, Erik K Mayer, PhD1, Philip Pratt, PhD2, Alex Mottrie, PhD3,4, Ara Darzi, FRS1,2, Justin Vale, MS1

1. Department of Surgery and Cancer, Imperial College London
2. The Hamlyn Centre for Robotic Surgery, Imperial College London
3. Department of Urology, OLV Clinic, Aalst, Belgium
4. O.L.V. Vattikuti Robotic Surgery Institute, Aalst, Belgium

Corresponding Author

Erik Mayer,

Department of Surgery and Cancer, Imperial College London, St Mary’s Hospital Campus, London, W2 1NY

07984195642

[email protected]

No reprints will be available from the authors.

No financial support was received.

Article Category: Original Article

Word count abstract: 244
Word count manuscript text: 2,142
5 figures and 2 tables


Introduction

Since Röntgen first utilised x-rays to image the carpal bones of the human hand in 1895, medical imaging has evolved and is now able to provide a detailed representation of a patient’s intracorporeal anatomy, with recent advances allowing for 3-dimensional (3D) reconstructions. The visualisation of anatomy in 3D has been shown to improve the ability to localise structures when compared to 2D, with no change in the amount of cognitive loading [1]. This has allowed imaging to move from a largely diagnostic tool to one that can be used for both diagnosis and operative planning.

One potential interface for displaying 3D images, maximising their potential as a tool for surgical guidance, is to overlay them onto the endoscopic operative scene (augmented reality). This addresses, in part, a criticism often levelled at robotic surgery: the loss of haptic feedback. Augmented reality has the potential to mitigate this sensory loss by enhancing the surgeon’s visual cues with information regarding subsurface anatomical relationships [2].

Augmented reality surgery is in its infancy for intra-abdominal procedures, due in large part to the difficulties of applying static preoperative imaging to a constantly deforming intraoperative scene [3]. There are case reports and ex-vivo studies in the literature examining the technology in minimal access prostatectomy [3–6] and partial nephrectomy [7–10], but there remains a lack of evidence determining whether surgeons feel there is a role for the technology and, if so, for which procedures they feel it would be efficacious.

This questionnaire-based study was designed to assess: firstly, the pre- and intraoperative imaging modalities utilised by robotic urologists; secondly, the current use of imaging intraoperatively for surgical planning; and finally, whether there is a desire for augmented reality amongst the robotic urological community.

Methods

Recruitment

A web-based survey instrument was designed and sent out, as part of a larger survey, to members of the EAU Robotic Urology Section (ERUS). Only independently practising robotic surgeons performing RALP, RAPN and/or robotic cystectomy were included in the analysis; those surgeons exclusively performing other procedures were excluded. Respondents were offered no incentives to reply. All data collected were anonymous.

Survey design and administration

The questionnaire was created using the LimeSurvey platform (www.limesurvey.com) and hosted on their website. All responses (both complete and incomplete) were included in the analysis. The questionnaire was dynamic, with the questions displayed tailored to the respondents’ previous answers.

When computing fractions or percentages, the denominator was the number of respondents answering that question; this number varies due to the dynamic nature of the questionnaire.
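This denominator convention can be sketched as a minimal illustration in Python (the response data shown are hypothetical and not taken from the survey; None marks a question a respondent was never shown because of the dynamic branching):

```python
def percentage(answers, value=True):
    """Percentage of respondents giving `value`, amongst those shown the question.

    `answers` holds one entry per survey respondent; None marks respondents
    who never saw the question, so they are excluded from the denominator.
    """
    seen = [a for a in answers if a is not None]
    return 100.0 * sum(a == value for a in seen) / len(seen)

# Hypothetical yes/no question shown to only 4 of 5 respondents.
print(percentage([True, False, None, False, True]))  # 2/4 answered yes -> 50.0
```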


Survey Content

Demographics

All respondents to the survey were asked in what country they practised and what robotic urological procedures they performed; in addition, surgeons were asked to specify the number of cases they had undertaken for each procedure.

Current Imaging Practice

Procedure-specific questions in this group were displayed according to the operations the respondent performed. A summary of the questions can be seen in appendix 1. Procedure non-specific questions were also asked: participants were asked whether they routinely used the TilePro™ function of the da Vinci console (Intuitive Surgical, Sunnyvale, USA) and whether they routinely viewed imaging intraoperatively.

Augmented Reality

Prior to answering questions in this section, participants were invited to watch a video demonstrating an augmented reality platform during Robot-Assisted Partial Nephrectomy (RAPN), performed by our group at Imperial College London. A still from this video can be seen in figure 1. Participants were then asked whether they felt augmented reality would be of use as a navigation or training tool in robotic surgery.

Once again, procedure-specific questions in this section were displayed according to the operations the respondent performed. Only those respondents who felt augmented reality would be of use as a navigation tool were asked procedure-specific questions, which sought to establish where in these procedures they felt an augmented reality environment would be of use.

Results

Demographics

Of the 239 respondents completing the survey, 117 were independently practising robotic surgeons and were therefore eligible for analysis. The majority of the surgeons had both trained (210/239, 87.9%) and worked (215/239, 90.0%) in Europe. The median (range) number of cases undertaken by those surgeons reporting their case volume was 120 (6–2,000), 9 (1–120) and 30 (1–270) for RALP, robot-assisted cystectomy and RAPN respectively.

Contemporary use of imaging in robotic surgery

When enquiring about the use of imaging for surgical planning, the majority of surgeons (57%, 65/115) routinely viewed preoperative imaging intraoperatively, with only 9% (13/137) routinely capitalising on the TilePro™ function of the console to display these images. When assessing the use of TilePro™ amongst surgeons who performed RAPN, 13.8% (9/65) reported using the technology routinely.

When assessing the imaging modalities available to a surgeon in theatre, the majority of surgeons performing RALP (74%, 78/106) reported using MRI, with an additional 37% (39/106) reporting the use of CT for preoperative staging and/or planning. For surgeons performing RAPN and robot-assisted cystectomy there was more of a consensus, with 97% (68/70) and 95% (54/57) of surgeons, respectively, using CT for routine preoperative imaging (table 1).

Those surgeons performing RAPN were found to have the most diversity in the way they viewed preoperative images in theatre, routinely viewing images in sagittal, coronal and axial slices (table 2). The majority of these surgeons also viewed the images as 3D reconstructions (54%, 38/70).

The majority of surgeons used ultrasound intraoperatively in RAPN (51%, 35/69), with a further 25% (17/69) reporting that they would use it if they had access to a ‘drop-in’ ultrasound probe (figure 3).

Desire for augmented reality

In all, 87% of respondents envisaged a role for augmented reality as a navigation tool in robotic surgery, and 82% (88/107) felt that there was an additional role for the technology as a training tool.

The greatest desire for augmented reality was amongst those surgeons performing RAPN, with 86% (54/63) feeling the technology would be of use. The largest group of surgeons felt it would be useful in identifying tumour location, with significant numbers also feeling it would be efficacious in tumour resection (figure 4).

When enquiring about the potential for augmented reality in Robot-Assisted Laparoscopic Prostatectomy (RALP), 79% (20/96) of respondents felt it would be of use during the procedure, with the largest group (65%, 62/96) feeling it would be helpful for nerve sparing (figure 2). The picture in cystectomy was similar, with 74% (37/50) of surgeons believing augmented reality would be of use, with both nerve sparing and apical dissection highlighted as specific examples (40%, 20/50) (figure 5). The majority also felt that it would be useful for lymph node dissection in both RALP and robot-assisted cystectomy (55% (52/95) and 64% (32/50) respectively).

Discussion

The results from this study suggest that the contemporary robotic surgeon views imaging as an important adjunct to operative practice. The way these images are being viewed is changing: although the majority of surgeons continue to view images as two-dimensional (2D) slices, a significant minority have started to capitalise on 3D reconstructions to give them an improved appreciation of the patient’s anatomy.

This study has highlighted surgeons’ willingness to take the next step in the utilisation of imaging in operative planning, augmented reality, with 87% feeling it has a role to play in robotic surgery. Although there appears to be a considerable desire for augmented reality, the technology itself is still in its infancy, with the limited evidence demonstrating clinical application reporting only qualitative results [3,11–13].

There are a number of significant issues that need to be overcome before augmented reality can be adopted in routine clinical practice. The first of these is registration (the process by which two images are positioned in the same coordinate system such that the locations of corresponding points align [14]). This process has been performed both manually and using automated algorithms, with varying degrees of accuracy [2,15]. The second issue pertains to the use of static preoperative imaging in a dynamic operative environment: in order for the preoperative imaging to be accurately registered it must be deformable. This problem remains as yet unresolved.
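As an illustration of the simplest case only (this sketch is not drawn from any of the cited systems, and the landmark coordinates are hypothetical), rigid point-based registration of corresponding points can be computed by a standard least-squares alignment such as the Kabsch algorithm:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid registration (Kabsch): find rotation R and
    translation t such that R @ src[i] + t best aligns with dst[i]."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)  # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Hypothetical landmarks: a preoperative model rotated and translated rigidly.
src = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 2, 3]])
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1.]])
t_true = np.array([1., 2., 3.])
dst = src @ R_true.T + t_true
R, t = rigid_register(src, dst)
print(np.allclose(src @ R.T + t, dst))  # True: the points align exactly
```

A rigid alignment of this kind addresses only the first issue; accommodating tissue deformation requires non-rigid (deformable) registration, which this sketch does not attempt.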


Live intraoperative imaging circumvents the problems of tissue deformation, and in RAPN 51% of surgeons reported already using intraoperative ultrasound to aid in tumour resection. Cheung and colleagues [9] have published an ex-vivo study highlighting the potential for intraoperative ultrasound in augmented reality partial nephrectomy. They reported that overlaying ultrasound onto the operative scene improved the surgeon’s appreciation of the subsurface tumour anatomy; this improvement in anatomical appreciation resulted in improved resection quality over conventional ultrasound-guided resection [9]. Building on this work, the first in-vivo use of overlaid ultrasound in RAPN has recently been reported [10]. Although good subjective feedback was received from the operating surgeon, the study was limited to a single case demonstrating feasibility and as such was not able to show an outcome benefit of the technology [10].

RAPN also appears to be the area in which augmented reality would be most readily adopted, with 86% of surgeons claiming to see a use for the technology during the procedure. Within this operation there are two obvious steps for augmentation: anatomical identification (in particular vessel identification, to facilitate both routine ‘full clamping’ and the identification of secondary and tertiary vessels for ‘selective clamping’ [16]) and tumour resection. These two phases have different requirements of an augmented reality platform. The first phase, identification, requires a gross overview of the anatomy without the need for high levels of registration accuracy. Tumour resection, however, necessitates almost sub-millimetre accuracy in registration and needs the system to account for the dynamic intraoperative environment. The step of anatomical identification is amenable to the use of non-deformable 3D reconstructions of preoperative imaging, while that of image-guided tumour resection is perhaps better suited to augmentation with live imaging such as ultrasound [2,9,17].

For RALP and robot-assisted cystectomy, the steps in which surgeons felt augmented reality would be of assistance were those of neurovascular bundle preservation and apical dissection. The relative perceived efficacy of augmented reality in these steps correlates with previous examinations of augmented reality in RALP [18,19]. Although surgeon preference for utilising augmented reality while undertaking robotic prostatectomy has been demonstrated, Thompson et al. failed to demonstrate an improvement in oncological outcomes in those patients undergoing augmented reality-assisted RALP [19].

Both nerve sparing and apical dissection require a high level of registration accuracy, with a necessity for either live imaging or the deformation of preoperative imaging to match the operative scene; achieving this level of registration accuracy is made more difficult by the mobilisation of the prostate gland during the operation [18]. These problems are equally applicable to robot-assisted cystectomy. Although guidance systems have been proposed in the literature for RALP [3,4,13,18,20], none have achieved the level of accuracy required to provide assistance during nerve sparing. Additionally, there are still imaging challenges that need to be overcome. Although multiparametric MRI has been shown to improve decision making in opting for a nerve-sparing approach to RALP [21], the imaging is not yet able to reliably discern the exact location of the neurovascular bundle. This said, significant advances are being made, with novel imaging modalities on the horizon that may allow for imaging of the neurovascular bundle in the near future [22].

Limitations

The number of operations included represents a significant limitation of the study; had different index procedures been chosen, different results may have been seen. This being said, the index procedures selected represent the vast majority of uro-oncological robotic surgical practice, largely mitigating this shortfall.

Although the available ex-vivo evidence suggests that introducing augmented reality operating environments into surgical practice would help to improve outcomes [9,23], the in-vivo experience to date is limited to small-volume case series reporting feasibility [2,3,15]. To date, no study has demonstrated an in-vivo outcome advantage of augmented reality guidance. In addition to this limitation, augmented reality has been demonstrated to increase rates of inattention blindness amongst surgeons, suggesting there is a trade-off between increasing visual information and the surgeon’s ability to appreciate unexpected operative events [23].

Conclusions

This survey depicts the contemporary robotic surgeon as comfortable with the use of imaging to aid in intraoperative planning; furthermore, it highlights a significant interest amongst the urological community in augmented reality operating platforms.

Short- to medium-term development of augmented reality systems in robotic urological surgery would be best performed using RAPN as the index procedure. Not only was this the operation where surgeons saw the greatest potential benefit, but it may also be the operation where augmentation is most easily achievable, capitalising on the respective benefits of technologies surgeons are already using: preoperative CT for anatomical identification and intraoperative ultrasound for tumour resection.

Conflicts of Interest

None of the authors have any conflicts of interest to declare.

References

1. Foo J-L, Martinez-Escobar M, Juhnke B, Cassidy K, Hisley K, Lobe T, Winer E. Evaluating mental workload of two-dimensional and three-dimensional visualization for anatomical structure localization. J Laparoendosc Adv Surg Tech A. 2013;23(1):65–70.

2. Hughes-Hallett A, Mayer EK, Marcus HJ, Cundy TP, Pratt PJ, Darzi AW, Vale J. Augmented reality partial nephrectomy: examining the current status and future perspectives. Urology. 2014;83(2):266–73.

3. Sridhar AN, Hughes-Hallett A, Mayer EK, Pratt PJ, Edwards PJ, Yang G-Z, Darzi AW, Vale J. Image-guided robotic interventions for prostate cancer. Nat Rev Urol. 2013;10(8):452–62.

4. Cohen D, Mayer E, Chen D, Anstee A, Vale J, Yang G-Z, Darzi A, Edwards P. Augmented reality image guidance in minimally invasive prostatectomy. Lect Notes Comput Sci. 2010;6367:101–10.

5. Simpfendörfer T, Baumhauer M, Müller M, Gutt CN, Meinzer HP, Rassweiler JJ, Guven S, Teber D. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011;25(12):1841–5.

6. Teber D, Simpfendörfer T, Guven S, Baumhauer M, Gözen AS, Rassweiler J. In-vitro evaluation of a soft-tissue navigation system for laparoscopic prostatectomy. J Endourol. 2010;24(9):1487–91.

7. Teber D, Guven S, Simpfendörfer T, Baumhauer M, Güven EO, Yencilek F, Gözen AS, Rassweiler JJ. Augmented reality: a new tool to improve surgical accuracy during laparoscopic partial nephrectomy? Preliminary in vitro and in vivo results. Eur Urol. 2009;56(2):332–8.

8. Pratt P, Mayer E, Vale J, Cohen D, Edwards E, Darzi A, Yang G-Z. An effective visualisation and registration system for image-guided robotic partial nephrectomy. J Robot Surg. 2012;6(1):23–31.

9. Cheung CL, Wedlake C, Moore J, Pautler SE, Peters TM. Fused video and ultrasound images for minimally invasive partial nephrectomy: a phantom study. Med Image Comput Comput Assist Interv. 2010;13(Pt 3):408–15.

10. Hughes-Hallett A, Pratt P, Mayer E, Di Marco A, Yang G-Z, Vale J, Darzi A. Intraoperative ultrasound overlay in robot-assisted partial nephrectomy: first clinical experience. Eur Urol. 2013.

11. Nakamura K, Naya Y, Zenbutsu S, Araki K, Cho S, Ohta S, Nihei N, Suzuki H, Ichikawa T, Igarashi T. Surgical navigation using three-dimensional computed tomography images fused intraoperatively with live video. J Endourol. 2010;24(4):521–4.

12. Teber D, Guven S, Simpfendörfer T, Baumhauer M, Güven EO, Yencilek F, Gözen AS, Rassweiler J. Augmented reality: a new tool to improve surgical accuracy during laparoscopic partial nephrectomy? Preliminary in vitro and in vivo results. Eur Urol. 2009;56(2):332–8.

13. Ukimura O, Gill IS. Imaging-assisted endoscopic surgery: Cleveland Clinic experience. J Endourol. 2008;22(4):803–9.

14. Altamar HO, Ong RE, Glisson CL, Viprakasit DP, Miga MI, Herrell SD, Galloway RL. Kidney deformation and intraprocedural registration: a study of elements of image-guided kidney surgery. J Endourol. 2011;25(3):511–7.

15. Nicolau S, Soler L, Mutter D, Marescaux J. Augmented reality in laparoscopic surgical oncology. Surg Oncol. 2011;20(3):189–201.

16. Ukimura O, Nakamoto M, Gill IS. Three-dimensional reconstruction of renovascular-tumor anatomy to facilitate zero-ischemia partial nephrectomy. Eur Urol. 2012;61(1):211–7.

17. Pratt P, Hughes-Hallett A, Di Marco A, Cundy T, Mayer E, Vale J, Darzi A, Yang G-Z. Multimodal reconstruction for image-guided interventions. Hamlyn Symposium. 2013.

18. Mayer EK, Cohen D, Chen D, Anstee A, Vale JA, Yang GZ, Darzi AW, Edwards E. Augmented reality image guidance in minimally invasive prostatectomy. Eur Urol Suppl. 2011;10(2):300.

19. Thompson S, Penney G, Billia M, Challacombe B, Hawkes D, Dasgupta P. Design and evaluation of an image-guidance system for robot-assisted radical prostatectomy. BJU Int. 2013;111(7):1081–90.

20. Simpfendörfer T, Baumhauer M, Müller M, Gutt CN, Meinzer H-P, Rassweiler JJ, Guven S, Teber D. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011;25(12):1841–5.

21. Panebianco V, Salciccia S, Cattarino S, Minisola F, Gentilucci A, Alfarone A, Ricciuti GP, Marcantonio A, Lisi D, Gentile V, Passariello R, Sciarra A. Use of multiparametric MR with neurovascular bundle evaluation to optimize the oncological and functional management of patients considered for nerve-sparing radical prostatectomy. J Sex Med. 2012;9(8):2157–66.

22. Rai S, Srivastava A, Sooriakumaran P, Tewari A. Advances in imaging the neurovascular bundle. Curr Opin Urol. 2012;22(2):88–96.

23. Dixon BJ, Daly MJ, Chan H, Vescan AD, Witterick IJ, Irish JC. Surgeons blinded by enhanced navigation: the effect of augmented reality on attention. Surg Endosc. 2013;27(2):454–61.


Tables

                    CT          MRI         USS         None        Other
RALP (n=106)        39.8% (39)  73.5% (78)  2% (3)      15.1% (16)  8.4% (9)
RAPN (n=70)         97.1% (68)  42.9% (30)  17.1% (12)  0% (0)      2.9% (2)
Cystectomy (n=57)   94.7% (54)  26.3% (15)  1.8% (1)    1.8% (1)    5.3% (3)

Table 1 – Which preoperative imaging modalities do you use for diagnosis and surgical planning?


                    Axial slices  Coronal slices  Sagittal slices  3D recons.  Do not view
RALP (n=106)        49.1% (52)    44.3% (47)      31.1% (33)       9.4% (10)   31.1% (33)
RAPN (n=70)         68.6% (48)    74.3% (52)      60% (42)         54.3% (38)  0% (0)
Cystectomy (n=57)   70.2% (40)    52.6% (30)      50.9% (29)       21.1% (12)  8.8% (5)

Table 2 – How do you typically view preoperative imaging in the OR? (3D recons. = three-dimensional reconstructions)


Figure Legends


Figure 1 – A still taken from a video of augmented reality robot-assisted partial nephrectomy. Here the tumour has been painted into the operative view, allowing the surgeon to appreciate the relationship of the tumour to the surface of the kidney.

Figure 2 – Chart demonstrating responses to the question: In robotic prostatectomy, which parts of the operation do you feel augmented reality image overlay would be of assistance?

Figure 3 – Chart demonstrating responses to the question: Do you use intraoperative ultrasound for robotic partial nephrectomy?

Figure 4 – Chart demonstrating responses to the question: In robotic partial nephrectomy, which parts of the operation do you feel augmented reality image overlay would be of assistance?

Figure 5 – Chart demonstrating responses to the question: In robotic cystectomy, which parts of the operation do you feel augmented reality overlay technology would be of assistance?