3rd SUMMER EUROPEAN UNIVERSITY
Surgical robotics – Montpellier, September 2007
VISUAL SERVOING with applications in medical robotics
Florent Nageotte, Michel de Mathelin Louis Pasteur University – Strasbourg LSIIT – Control, Vision & Robotics Research Group
UMR CNRS 7005

Overview
Part I: Fundamentals of visual servoing
- Background and definitions
- Servoing architectures and classification
- Position-based visual servoing
- Image-based visual servoing
- Visual servoing without feature extraction

Part II: Medical robotics applications
- Laparoscopic surgery
- Internal organ motion tracking
European Summer University - Montpellier 2007
Visual servoing principle
Control the end-effector of a robot using a vision sensor (a 2D sensor), e.g., position the effector of a robot w.r.t. an object.
Similar to registration, but the velocity of the robot is generally updated continuously.
Minimize a task function that depends on the error between the current pose of the robot and the reference pose; this creates a virtual link between the robot and the object.
Image processing, computer vision and control issues
I.1 Background and definitions
A. Coordinates and pose
Coordinates of point P with respect to coordinate frame i: $^iP$
Position and orientation of frame i with respect to frame j:
$$ ^jp_i = \begin{bmatrix} T_x \\ T_y \\ T_z \\ \alpha \\ \beta \\ \gamma \end{bmatrix} $$
where $(T_x, T_y, T_z)^T = {}^jT_i$ is the translation vector (origin of frame i w.r.t. frame j), and $(\alpha, \beta, \gamma)$ are the rotation angles defining the rotation matrix $^jR_i$.
B. Coordinate transformations
Coordinates of point $^iP$ with respect to coordinate frame j:
$$ ^jP = {}^jT_i + {}^jR_i\,{}^iP $$
Coordinates of vector $^iV$ with respect to frame j:
$$ ^jV = {}^jR_i\,{}^iV $$
Homogeneous transformation from frame i to frame j:
$$ ^jH_i = \begin{bmatrix} ^jR_i & ^jT_i \\ 0 & 1 \end{bmatrix} \;\Rightarrow\; \begin{bmatrix} ^jP \\ 1 \end{bmatrix} = {}^jH_i \begin{bmatrix} ^iP \\ 1 \end{bmatrix}, \quad \begin{bmatrix} ^jV \\ 0 \end{bmatrix} = {}^jH_i \begin{bmatrix} ^iV \\ 0 \end{bmatrix} $$
Transformations compose: $^jH_i = {}^jH_k\,{}^kH_i$.
C. Velocity of a rigid object
Velocity screw of frame i with respect to frame j, in frame j coordinates:
$$ (^jr_i) = \begin{bmatrix} v_x \\ v_y \\ v_z \\ \omega_x \\ \omega_y \\ \omega_z \end{bmatrix} $$
where $(v_x, v_y, v_z)^T = {}^jV_i$ is the translational velocity and $(\omega_x, \omega_y, \omega_z)^T = {}^j\Omega_i$ is the rotational velocity.
Velocity of a point P rigidly attached to frame i, with respect to frame j, expressed in frame j coordinates:
$$ ^j\dot{P} = (^j\Omega_i) \times {}^jP + (^jV_i) = (^j\Omega_i) \times ({}^jR_i\,{}^iP + {}^jT_i) + (^jV_i) $$
D. Camera projection model
Perspective projection (figure: image plane, camera frame (X_c, Y_c, Z_c), optical center o_c, focal length λ; $^cP$ = coordinates of point P with respect to the camera frame)
$$ ^cP = \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} \;\Rightarrow\; \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} \lambda\,x_c/z_c \\ \lambda\,y_c/z_c \end{bmatrix} \;\Rightarrow\; \begin{bmatrix} u \\ v \end{bmatrix} = \begin{bmatrix} u_0 + \lambda k_u\,x_c/z_c \\ v_0 + \lambda k_v\,y_c/z_c \end{bmatrix} \text{ (pixels)} $$
$$ z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \lambda k_u & 0 & u_0 \\ 0 & \lambda k_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = K \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} $$
• Intrinsic camera parameters obtained by calibration
• Mathematical model of a perfect optical system; real lenses exhibit physical phenomena (distortions, etc.)
• Numerical sensor: aliasing, dynamical behaviour (limited bandwidth)
• Other models apply for other types of "visual" sensors, e.g., C-arm, CT scan, ultrasound probe, ...
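The projection model above can be sketched as follows (the intrinsic values λ, k_u, k_v, u_0, v_0 below are invented for illustration, not taken from the slides):

```python
import numpy as np

# Illustrative intrinsic parameters: focal length (m), pixels/m, principal point
lam, ku, kv, u0, v0 = 0.008, 1e5, 1e5, 320.0, 240.0
K = np.array([[lam * ku, 0.0,      u0],
              [0.0,      lam * kv, v0],
              [0.0,      0.0,      1.0]])

def project(P_cam):
    """Project cP = (xc, yc, zc) to pixel coordinates (u, v)."""
    uvw = K @ P_cam
    return uvw[:2] / uvw[2]    # divide by zc (perspective division)

uv = project(np.array([0.1, 0.05, 1.0]))
```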
I.2 Classification
I.21 Camera position
A. Eye-in-hand configuration
(figure: transformations $^cH_t$, $^bH_e$, $^eH_c$)
$^eH_c$ must be known (calibration), or the reference position must be learned (by showing).
B. External camera configuration
(figure: transformations $^cH_e$, $^bH_t$, $^bH_c$)
$^cH_e$ must be measured; $^bH_c$ must be known.
I.22 Control level
A. Indirect visual servoing
(block diagram: control law → Δp* or r* → Cartesian controller (trajectory generator, joint control, power amplifiers) → robot + camera; image → image processing with period T; joint angles q fed back)
• Suitable for slow visual servoing (T⁻¹ < 50 Hz)
• Control law is easier to design
B. Direct visual servoing
(block diagram: control law → τ* or q̇* → low-level joint controllers → robot + camera; image → image processing with period T; joint positions q fed back)
• Suitable for fast visual servoing (T⁻¹ ≥ 50 Hz)
• Control law design is more complex (robot dynamics must be taken into account)
I.23 Feedback variables
A. Position-based visual servoing (3D visual servoing)
(block diagram: pose reference p_d compared with estimated pose p; control law → power amplifiers → robot + camera; image → feature extraction (features f, period T) → pose estimation)
• A model of the object must be known, or multiple images should be used
• Calibration errors may induce large pose estimation errors
• Control law design is easier
• Possible loss of target for large errors
B. Image-based visual servoing (2D visual servoing)
(block diagram: feature reference f_d compared with extracted features f; control law → power amplifiers → robot + camera; image → feature extraction, period T)
• Smaller computational burden
• Eliminates errors due to calibration
• More complex control law
• Workspace limits can be hit for large errors
C. Hybrid visual servoing (e.g., 2D1/2 visual servoing)
(block diagram: feature reference f_d and partial pose reference p_p,d compared with f and p_p; hybrid control law → power amplifiers → robot + camera; image → feature extraction → partial pose estimation, period T)
• Smaller errors due to calibration
• Simplified model of the target
• Better properties of motion (in the image and in 3D)
I.24 Bandwidth of the visual servo-loop
A. Slow visual servoing
- Sampling frequency < 50 Hz
- Indirect visual servoing
- Robot transfer function model without dynamics
- Proportional control law (P)
B. Fast visual servoing
- Sampling frequency ≥ 50 Hz
- Direct visual servoing
- Dynamical model of the robot must be taken into account
- More advanced control laws: PID, predictive, robust, non-linear, ...
I.3 Position-based visual servoing
I.31 Indirect visual servoing
A. Control law
(block diagram: pose error e_p → control law → Δp* or r* → Cartesian controller (joint angles q) → robot + camera; image → feature extraction (f̂) → pose estimation, period T; estimated pose p̂ = p + δp compared with reference p_d)
• $e_p$ is generally expressed as a translation vector and a rotation θu
• Look-then-move strategy (T very large, asynchronous): $\Delta p^* = e_p$
• Pseudo-continuous strategy: $\dot{p} = J_p^T\, r$, control law: $r^* = k\,(J_p^T)^{-1} e_p$
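A minimal simulation of the pseudo-continuous law (an illustrative sketch, not from the slides): the error is restricted to translation so that $J_p^T$ reduces to a constant 3×3 rotation, chosen arbitrarily here.

```python
import numpy as np

k, dt = 1.0, 0.01
JpT = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])      # stand-in for Jp^T (a rotation)
ep = np.array([0.2, -0.1, 0.3])         # initial translation error

for _ in range(1000):
    r = k * np.linalg.solve(JpT, ep)    # control law r* = k (Jp^T)^-1 ep
    ep = ep - dt * (JpT @ r)            # error dynamics: dep/dt = -Jp^T r
```

With a perfect Jacobian the closed loop reduces to dep/dt = -k ep, i.e. exponential convergence, as stated in the stability discussion.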
B. Obtaining the reference $p_d$
• The reference position of the end-effector is given w.r.t. an object; computing the corresponding position of the camera w.r.t. the object (eye-in-hand) requires the knowledge of $^eH_c$
• Learning of the image in the final position and pose estimation of the camera
C. Interaction matrix $J_p$
Case of an eye-in-hand system:
$$ e_p = \begin{bmatrix} ^{c*}T_c \\ ^{c*}\theta u \end{bmatrix} $$
$$ \dot{e}_p = \begin{bmatrix} ^{c*}\dot{C} \\ (^{c*}\Omega_c) \end{bmatrix} = \begin{bmatrix} ^{c*}R_c\,{}^cR_e\,(V_e + \Omega_e \times EC) \\ ^{c*}R_c\,{}^cR_e\,\Omega_e \end{bmatrix} = \begin{bmatrix} ^{c*}R_c\,{}^cR_e & -{}^{c*}R_c\,{}^cR_e\,[EC]_\times \\ 0 & ^{c*}R_c\,{}^cR_e \end{bmatrix} \begin{bmatrix} V_e \\ \Omega_e \end{bmatrix} $$
D. Pose estimation
Camera calibration: Tsai (IEEE Trans. Rob. Autom., 1987), Zhang (ICCV 99)
Pose estimation:
- Analytical, 3 or 4 points: Tsai (co-planar target)
- Numerical, iterative: DeMenthon (IEEE Trans. PAMI 1992, Int. J. Comp. Vision 1995); VVS (Marchand, Eurographics 2002); minimization of the reprojection error (gradient, Levenberg-Marquardt, etc.)
E. Stability and robustness
Stability is not an issue: the look-then-move strategy is always stable, and for a low-speed vision loop $r \approx r^*$, so
$$ \dot{p} = k\,J_p^T\,(\hat{J}_p^T)^{-1} e_p \quad\Rightarrow\quad \text{exponential convergence} $$
$J_p$ depends on the camera position w.r.t. the end-effector (eye-in-hand configuration); stability holds if $\hat{J}_P^{-1} J_P > 0$.
Measurement error is an issue: δp can be very large!
Main sources of uncertainty:
- camera intrinsic parameters,
- camera position w.r.t. the end-effector (eye-in-hand configuration),
- camera position w.r.t. the robot base and robot kinematic chain (external camera), except if the end-effector pose is estimated by vision,
- feature detection error: $\delta f = \hat{f} - f$.
⇒ Improvement: use learning of $p_c$ by showing when possible.
I.32 Direct visual servoing
A. Control law
(block diagram: pose error e_p → control law → τ* or q̇* → low-level joint controllers → robot + camera; image → feature extraction → pose estimation, period T; estimated pose p̂ = p + δp compared with reference p_d)
$$ \dot{p} = J_p^T\, r = J_p^T\, J_R(q)\, \dot{q} \qquad \text{Control law: } \dot{q}^* = k\, J_R(q)^{-1} (J_p^T)^{-1} e_p $$
B. Stability and robustness (1)
Stability may be an issue.
Low-speed vision loop: $\dot{q} \approx \dot{q}^*$, so
$$ \dot{p} = k\,J_p^T J_R(q)\, J_R(q)^{-1} (\hat{J}_p^T)^{-1} e_p \quad\Rightarrow\quad \text{exponential convergence} $$
High-speed vision loop, linearized approach: $\dot{q}(s) \approx F(s,q)\,\dot{q}^*(s)$, so
$$ p \approx \frac{1}{s}\, J_p^T\, J_R(q)\, F(s,q)\, \dot{q}^* $$
Joint-level velocity feedback loops have a linearizing and decoupling effect.
Control law: $\dot{q}^* = J_R(q)^{-1} (J_p^T)^{-1} \dot{p}^*$, with $\dot{p}^*$ computed using an LPV discrete-time model of the vision loop. This approach works in practice with 6-DOF vision loops!
B. Stability and robustness (2)
(block diagram, LPV discrete-time model: reference p_d → control algorithm → ṗ* → $J_R(q)^{-1}(J_p^T)^{-1}$ → q̇* → identified robot model F(z,q) → q̇; acquisition and image processing add δp and the delayed integration $J_p^T J_R(q)\,T z^{-d}/(1-z^{-1})$ to produce p̂)
GPC of a 6-DOF robot vision loop: J. Gangloff & M. de Mathelin (Advanced Robotics, vol 17, no 10, Dec. 2003)
B. Stability and robustness (3)
Non-linear approach: rigid-link robot manipulator model
$$ \tau = M(q)\ddot{q} + C(q,\dot{q})\dot{q} + g(q) + f_r(q,\dot{q}) $$
with $M$ the inertia matrix, $C(q,\dot{q})\dot{q}$ the Coriolis/centripetal terms, $g$ the gravity torque and $f_r$ the friction.
- PD control scheme (Arimoto): $\tau^* = g(q) - K_v \dot{q} - K_p e_q$ with $e_q = q - q_d$
- Passivity-based scheme (Slotine & Li): $\zeta = \dot{q}_d - \Lambda e_q$, $\tau^* = M(q)\dot{\zeta} + C(q,\dot{q})\zeta + g(q) - K_v \dot{e}_q - K_p e_q$ (high complexity for 6 DOF!)
Problems:
- asymptotic stability is proven only without friction,
- $q_d$ and $\dot{q}_d$ are generally unknown,
- joint flexibilities and backlash.
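A toy 1-DOF illustration of the PD scheme with gravity compensation (the pendulum model and all gains below are our own illustrative choices, not from the slides):

```python
import numpy as np

M, m, l, grav = 0.5, 1.0, 0.3, 9.81      # inertia, mass, arm length (illustrative)
Kp, Kv, dt = 50.0, 10.0, 0.001
q, dq, q_d = 0.0, 0.0, np.pi / 4         # state and joint set-point

def g(q):
    # gravity torque of a single pendulum link (illustrative model)
    return m * grav * l * np.sin(q)

for _ in range(20000):                       # 20 s of simulated time
    tau = g(q) - Kv * dq - Kp * (q - q_d)    # tau* = g(q) - Kv q' - Kp e_q
    ddq = (tau - g(q)) / M                   # dynamics: M q'' + g(q) = tau
    dq += dt * ddq
    q += dt * dq
```

With exact gravity compensation the closed loop is a linear mass-spring-damper, so the joint converges to the set-point; with these gains it is critically damped.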
I.4 Image-based visual servoing
I.41 Indirect visual servoing
A. Control law
(block diagram: feature error e_f → control law → Δp* or r* → Cartesian controller (joint angles q) → robot + camera; image → feature extraction, period T; measured features f̂ = f + δf (pixel noise) compared with reference f_d)
• Look-then-move strategy: select Δp* or Δq* to decrease a cost function of $e_f$
• Pseudo-continuous strategy: $\dot{f} = J_I\, r$, control law: $r^* = k\,J_I^{+} e_f$
B. Obtaining the reference $f_d$
• Computation of the perspective projection of the object in the reference position: requires a model of the object, the knowledge of $^eH_c$, and an accurate model of the camera
• Learning of the image in the final position
C. Image Jacobian or interaction matrix
The image Jacobian $J_I$ is a p × n matrix (p = dimension of the feature vector, n = number of DOFs of the effector). It depends on the kind of features used.
Example: case of a point P seen by a camera attached to the effector of a robot.
(figure: point P observed by the camera; camera frame (X_C, Y_C, Z_C), object frame (X_O, Y_O, Z_O), image coordinates (u, v))
Velocity of the point in the camera frame, with $(^or_c)$ the velocity screw of the camera w.r.t. the object:
$$ ^c\dot{P} = (^o\Omega_c) \times {}^cP + (^oV_c), \qquad {}^cP = \begin{bmatrix} x \\ y \\ z \end{bmatrix} $$
Differentiating the projection equations $u = u_0 + \lambda k_u\,x/z$ and $v = v_0 + \lambda k_v\,y/z$ (image coordinates in pixels) and substituting $^c\dot{P}$ gives
$$ \begin{bmatrix} \dot{u} \\ \dot{v} \end{bmatrix} = \begin{bmatrix} \dfrac{\lambda k_u}{z} & 0 & -\dfrac{u-u_0}{z} & -\dfrac{(u-u_0)(v-v_0)}{\lambda k_v} & \dfrac{(\lambda k_u)^2 + (u-u_0)^2}{\lambda k_u} & -\dfrac{k_u (v-v_0)}{k_v} \\[2mm] 0 & \dfrac{\lambda k_v}{z} & -\dfrac{v-v_0}{z} & -\dfrac{(\lambda k_v)^2 + (v-v_0)^2}{\lambda k_v} & \dfrac{(u-u_0)(v-v_0)}{\lambda k_u} & \dfrac{k_v (u-u_0)}{k_u} \end{bmatrix} (^or_c) $$
With the camera mounted on the end-effector:
$$ (^or_c) = \begin{bmatrix} ^cR_e & -{}^cR_e [EC]_\times \\ 0 & ^cR_e \end{bmatrix} \begin{bmatrix} V_e \\ \Omega_e \end{bmatrix} \;\Rightarrow\; \dot{f} = \begin{bmatrix} \dot{u} \\ \dot{v} \end{bmatrix} = J_I \begin{bmatrix} V_e \\ \Omega_e \end{bmatrix} $$
$J_I$ depends on the intrinsic parameters and on the pose! It also depends on the pose of the camera w.r.t. the effector.
• $J_I$ must be estimated: $\hat{J}_I$
• $J_I$ must be full rank to control all degrees of freedom (at least 3 points required)
• With an n-DOF end-effector, only n feature components can be controlled in the image
• Expressions of the image Jacobian for classical geometrical objects (straight lines, spheres, cylinders, etc.): F. Chaumette, thesis 90, and Espiau, TRA 92; and for any planar object using moments: Tahri, O. and Chaumette, F., IEEE TR 2005, 21(6)
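As a sketch, the 2×6 point Jacobian above and the resulting velocity command $r^* = k\,\hat{J}_I^{+} e_f$ can be written as follows (all numeric values are illustrative; stacking three points makes the stacked Jacobian square):

```python
import numpy as np

def point_jacobian(u, v, z, lam=0.008, ku=1e5, kv=1e5, u0=320.0, v0=240.0):
    """2x6 image Jacobian of a point, pixel coordinates (see derivation)."""
    du, dv = u - u0, v - v0
    fu, fv = lam * ku, lam * kv
    return np.array([
        [fu / z, 0.0, -du / z, -du * dv / fv, (fu**2 + du**2) / fu, -ku * dv / kv],
        [0.0, fv / z, -dv / z, -(fv**2 + dv**2) / fv, du * dv / fu,  kv * du / ku],
    ])

# Stack the Jacobians of 3 points so that all 6 DOFs are constrained
pts = [(400.0, 280.0, 1.0), (250.0, 300.0, 1.2), (350.0, 150.0, 0.8)]
JI = np.vstack([point_jacobian(u, v, z) for (u, v, z) in pts])

k = 0.5
ef = np.array([5.0, -3.0, 2.0, 1.0, -4.0, 6.0])   # pixel errors f* - f
r = k * np.linalg.pinv(JI) @ ef                    # velocity screw command
```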
D. Stability and robustness (1)
Low speed: $r \approx r^*$, so $\dot{f} = k\,J_I \hat{J}_I^{+} e_f$ ⇒ exponential convergence.
• Behaves as a gradient-descent optimization: may get stuck in local minima! ⇒ 2nd-order optimization: Malis, E., Improving vision-based control using efficient second-order minimization techniques, 2004
• The norm of $e_f$ decreases, but some components may increase during the motion (possible loss of features)
• If there are more features than DOFs, least-squares minimization of the error
• No control in the Cartesian space; better behaviour for small errors
D. Stability and robustness (2)
Main source of uncertainty: $\hat{J}_I$. With $\dot{f} = k\,J_I \hat{J}_I^{+} e_f$, exponential convergence holds if $J_I \hat{J}_I^{+} > 0$.
This is easily guaranteed: coarse calibration and coarse pose estimation are generally sufficient. There is no proof of global stability as a function of k.
Error in the final position:
- independent of errors on the camera/effector position,
- independent of errors on the intrinsic parameters of the camera if the reference image has been learned,
- function of the pixel noise.
Pixel noise attenuation: pick more features ⇒ smaller δf.
Problem of reaching the workspace limits ⇒ hybrid visual servoing: Malis, E., 2-1/2-D visual servoing, TRA 99.
I.42 Direct visual servoing
A. Control law
(block diagram: feature error e_f → control law → τ* or q̇* → low-level joint controllers → robot + camera; image → feature extraction, period T; measured features f̂ = f + δf (pixel noise) compared with reference f_d)
$$ \dot{f} = J_I\, r = J_I\, J_R(q)\, \dot{q} \qquad \text{Control law: } \dot{q}^* = k\, J_R(q)^{-1} \hat{J}_I^{+} e_f $$
B. Stability and robustness (1)
Stability may be an issue.
Low-speed vision loop: $\dot{q} \approx \dot{q}^*$, so $\dot{f} = k\,J_I J_R(q)\,J_R(q)^{-1} \hat{J}_I^{+} e_f$ ⇒ exponential convergence if $J_I \hat{J}_I^{+} > 0$.
High-speed vision loop, linearized approach: $\dot{q}(s) \approx F(s,q)\,\dot{q}^*(s)$, so
$$ f \approx \frac{1}{s}\, J_I\, J_R(q)\, F(s,q)\, \dot{q}^* $$
Control law: $\dot{q}^* = J_R(q)^{-1} r^*$, with $r^*$ computed using an LPV discrete-time model of the vision loop. This approach works in practice with 6-DOF vision loops!
B. Stability and robustness (2)
(block diagram, LPV discrete-time model with image-based identification: e_f → control algorithm → r* → $J_R(q)^{-1}$ → q̇* → identified robot model F(z,q) → q̇; acquisition and image processing add δf and the delayed integration $J_I J_R(q)\,T z^{-d}/(1-z^{-1})$ to produce f̂, compared with f_d)
GPC of a 6-DOF robot vision loop: J. Gangloff & M. de Mathelin (Advanced Robotics, vol 17, no 10, Dec. 2003)
B. Stability and robustness (3)
High-speed vision loop, non-linear approach: rigid-link robot manipulator model
$$ \tau = M(q)\ddot{q} + C(q,\dot{q})\dot{q} + g(q) + f_r(q,\dot{q}) $$
PD control scheme (R. Kelly, IEEE Trans. Rob. Autom., vol 12, 1996):
$$ \tau^* = g(q) - K_v \dot{q} - J_R^T(q)\, K_p\, \hat{J}_I^{+} e_f $$
Stability is proved only for a 2-DOF robot with no friction ⇒ the previous restrictions apply.
I.5 Visual servoing without feature extraction
Case of planar objects: the two images are linked by a homography
$$ p = \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = G\,p^* = G \begin{bmatrix} u^* \\ v^* \\ 1 \end{bmatrix}, \qquad G = \begin{pmatrix} g_{11} & g_{12} & g_{13} \\ g_{21} & g_{22} & g_{23} \\ g_{31} & g_{32} & g_{33} \end{pmatrix}, \quad \det(G) = 1 $$
Computation of G using SSD minimization, e.g., ESM (S. Benhimane and E. Malis, 2004).
Computation of the Euclidean homography: $m = K^{-1} G K m^* = H m^*$.
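A small numpy sketch of these homography relations (the K and G values below are invented for illustration):

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],    # illustrative intrinsic matrix
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])

G = np.array([[ 1.0,  0.02,  5.0],    # illustrative projective homography
              [-0.02, 1.0,  -3.0],
              [ 0.0,  0.0,   1.0]])
G = G / np.cbrt(np.linalg.det(G))     # normalize so that det(G) = 1

p_star = np.array([100.0, 200.0, 1.0])  # reference image point
p = G @ p_star
p = p / p[2]                            # corresponding current image point

H = np.linalg.inv(K) @ G @ K            # Euclidean homography m = H m*
```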
Control law
(block diagram: reference image I* and current image I → SSD minimization (ESM) → Ĥ → control law → r* → Cartesian controller → robot + camera, image period T)
$$ r^* = -\begin{bmatrix} \lambda_\nu I & 0 \\ 0 & \lambda_\omega I \end{bmatrix} \begin{bmatrix} e_\nu \\ e_\omega \end{bmatrix} \quad \text{with} \quad e_\nu = (H - I)\,m^* \quad \text{and} \quad [e_\omega]_\times = H - H^T $$
$m^* = K^{-1} p^*$: camera coordinates of a selected point in the reference image.
Stability and robustness
S. Benhimane and E. Malis, Homography-based 2D visual servoing, IEEE ICRA 2006
- No need to compute an interaction matrix
- Proof of local stability
- In practice: good robustness w.r.t. the intrinsic parameters, large domain of stability
- Recently extended to non-planar objects
Bibliography
Hager, G. and S. Hutchinson. Special issue on vision-based control of robot manipulators. IEEE Trans. Rob. Autom., vol 12, no 5, 1996.
Corke, P. Visual control of robots. Research Studies Press Ltd., Somerset, UK, 1996.
B. Espiau and F. Chaumette. A new approach to visual servoing in robotics. IEEE Transactions on Robotics and Automation, vol 8, no 3, 1992.
S. Benhimane and E. Malis, Homography-based 2D visual servoing, IEEE ICRA 2006
Visual servoing in medical robotics
Main applications:
- In laparoscopic surgery
- In ophthalmology
- For physiological motion compensation
- In US imaging, diagnosis and intervention: M. A. Vitrani and G. Morel, ICRA 2005 (guidance of surgical instruments); A. Krupa and F. Chaumette, IROS 2005 (control of a US probe)
Laparoscopic surgery
Small incisions, long-shaft instruments, endoscopic vision system.
Advantages:
- Shorter recovery time
- Lower infection risk
- Lower costs
Difficulties:
- Tiring gestures
- Indirect visual feedback, lack of depth information
- Inverted motions, limited to 4 DOF
- Poor haptic feedback
Classical robotic control of the endoscope:
- AESOP (Computer Motion/Intuitive Surgical): voice controlled
- EndoAssist (Armstrong-Healthcare): controlled with head motions
Vision-based control of the endoscope
R. Taylor (95), A. Casals (95), G. Wei and G. Hirzinger (97), S. Voros and Ph. Cinquin (2006), etc. These are more vision-guided systems than true visual servoing:
- Centering the image on marked instruments
- Tracking of instruments
- Centering the image on specific physiological markers
Visual-servoing-based control of the instrument (Krupa 2003)
Goal: surgical instrument guidance.
Autonomous recovery of the instruments:
- Bring the instrument into the field of view
- Center the instrument in the image
Automatic positioning:
- Put the instrument at a desired location w.r.t. the tissues
- Provide depth information
Motivations: increase safety, improve ergonomics, speed up surgical procedures.
Experimental setup
(figure: surgical robot holding the instrument through trocar 1, camera and optics in an external-camera configuration, endoscope through trocar 2; a laser pointing device projects laser spots onto the organ to add information to the scene)
Experimental setup: laser pointing device
- 4 laser spots
- To be used with standard instruments (4 mm) and standard trocars (10 mm)
- 3 optical markers (for detection of the instrument)
Kinematics in laparoscopic surgery
- Constraints: 4 DOF, $W_{op} = (\omega_x\ \omega_y\ \omega_z\ v_z)^T$
- AESOP: 6 DOF (2 passive joints)
(figure: joints q1 ... q6, insertion depth d1, incision point)
- Kinematic model: $W_{op} = J_{op}(q, d_1)\,\dot{q}_c$, hence $\dot{q}_c^* = \hat{J}_{op}^{-1}(q, d_1)\,W_{op}^*$
Visual features detection
Goal: obtain the image coordinates of the centers of the laser spots (diffused laser pattern) and of the optical markers (reflecting lights).
Robust detection: the laser spots and the optical markers blink synchronously with the frame acquisition. Even frames: laser on, markers off; odd frames: markers on, laser off. Selection by high-pass temporal filtering discards static blobs ("not a feature") and separates laser spots from markers.
(figure: binary image showing laser and marker blobs on alternating lines, a static blob rejected as "not a feature")
Control of the position of the spot
2D direct visual servoing scheme (image-based)
(block diagram: reference s_p* compared with measured ŝ_p → $k\,\hat{J}_\omega^{-1}$ → ω* (together with v_z*, ω_z*) → $J^{-1}(q, d_1)$ → q̇_c* → robot controller → AESOP; camera image → features detection → ŝ_p)
- Controlled variable: $s_p = (u_p\ v_p)^T$
- Control input: $\omega^* = (\omega_x^*\ \omega_y^*)^T$
- Control law: $\dot{s}_p = J_\omega\,\omega$, $\omega^* = k\,\hat{J}_\omega^{-1}\,(s_p^* - s_p)$
Estimation of the interaction matrix
Open-loop estimation of $J_\omega$, under the hypothesis of a standstill camera:
$$ \hat{J}_\omega = \begin{bmatrix} \hat{J}_{\omega 11} & \hat{J}_{\omega 12} \\ \hat{J}_{\omega 21} & \hat{J}_{\omega 22} \end{bmatrix} $$
With $\omega_x^* = \text{cst}$, $\omega_y^* = 0$: $\hat{J}_{\omega 11} = \dfrac{\Delta u_p}{\omega_x^* \Delta T}$, $\hat{J}_{\omega 21} = \dfrac{\Delta v_p}{\omega_x^* \Delta T}$
With $\omega_y^* = \text{cst}$, $\omega_x^* = 0$: $\hat{J}_{\omega 12} = \dfrac{\Delta u_p}{\omega_y^* \Delta T}$, $\hat{J}_{\omega 22} = \dfrac{\Delta v_p}{\omega_y^* \Delta T}$
Stability if $\hat{J}_\omega^{-1} J_\omega > 0$.
Autonomous complex gestures (F. Nageotte 2005)
- Automatic stitching in laparoscopic surgery
- Trajectory planning in the camera frame
- Trajectory following using the endoscopic camera
- Avoids a registration procedure between sensors
2D visual servoing with a marked instrument:
- Reference images: projection of the needle-holder in the endoscopic image (fine camera calibration required)
- Controlled variables: marker points + cylinder (10 to 16 features)
- Control input: the 4 DOFs of the end-effector
- Complex interaction matrix with varying size
- Hypothesis of a small tracking error: the matrix is computed using the known reference pose, so no pose estimation is required
- Prediction of feature appearance and disappearance for the update
Automatic suturing
Physiological motion compensation
Objectives
(figure: the controller uses the visual feedback to track the moving organ; a filter provides a stabilized image)
Breathing motion compensation (R. Ginhoux 2003)
- Predictive control law to take the periodic disturbance into account: GPC control law with a periodic output-perturbation model
- A dynamic model of the robot is required: linearized model obtained by pseudo-random excitation
- Controlled variable: distance to the organ
- Control input: first axis of the robot (particular configuration)
- Identification of the interaction gain
In vivo experiment
Beating heart motion compensation (R. Ginhoux 2003)
- Fast robot
- Fast camera (500 Hz)
- 2 types of motion: slow (~0.25 Hz) and fast (~1.5 Hz)
- In vivo experiments
Beating heart motion compensation
(plot: measured heart motion, with the slow respiratory component superimposed on the fast cardiac component)
• Adaptive filtering of 12 harmonics of the heart beat rate
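One way to sketch the idea of separating the cardiac harmonics from the respiratory component is a least-squares fit of sinusoids at the first 12 harmonics of the known heart rate (this is only an illustration of harmonic filtering on synthetic data, not the adaptive algorithm of the thesis; all signal values are invented):

```python
import numpy as np

fs, f_heart, f_resp = 500.0, 1.5, 0.25    # sampling rate and motion frequencies (Hz)
t = np.arange(0, 10, 1 / fs)
# Synthetic organ motion: slow respiration + fast heart beat
signal = 2.0 * np.sin(2 * np.pi * f_resp * t) + 0.5 * np.sin(2 * np.pi * f_heart * t)

# Basis of cos/sin at the first 12 harmonics of the heart rate
n_harm = 12
cols = []
for h in range(1, n_harm + 1):
    cols += [np.cos(2 * np.pi * h * f_heart * t),
             np.sin(2 * np.pi * h * f_heart * t)]
A = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(A, signal, rcond=None)  # fit cardiac harmonics
cardiac = A @ coef                                  # estimated cardiac motion
respiratory = signal - cardiac                      # remaining slow component
```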
Beating heart motion compensation
- Prediction of the perturbation, reference correction, GPC control law
- 3 controlled variables: distance to the organ in the image (z), spot position on the organ surface (x, y)
- Control input: 3 axes of the robot (q1, q2, q3)
- Linearized dynamic model of the robot axes, considered as dynamically decoupled
- Hypothesis: partly decoupled interaction matrix, estimated around the working position in open loop:
$$ \begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{z} \end{bmatrix} = \begin{bmatrix} 0 & L_{12} & L_{13} \\ 0 & L_{22} & L_{23} \\ L_{31} & 0 & 0 \end{bmatrix} \begin{bmatrix} \dot{q}_1 \\ \dot{q}_2 \\ \dot{q}_3 \end{bmatrix} $$
In vivo experiment
Breathing motion compensation with a flexible endoscope
- Flexible endoscopes: gastroscopy, colonoscopy, transluminal operations
- Hard to control manually; coordinating orientation and translation is very complex
- Backlash, dead zones
⇒ Autonomous following of the organs
• Controlled variables: position of an area of interest in the image, (u, v)
• Control input: q1 and q2
• Interaction matrix estimated beforehand:
$$ \begin{bmatrix} \dot{u} \\ \dot{v} \end{bmatrix} = \begin{bmatrix} L_{11} & L_{12} \\ L_{21} & L_{22} \end{bmatrix} \begin{bmatrix} \dot{q}_1 \\ \dot{q}_2 \end{bmatrix} $$
• Repetitive control law: rejection of the breathing disturbance and of model errors after a few disturbance periods
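A toy sketch of the repetitive-control idea (the unit-gain plant, period length and gain below are illustrative assumptions, not the actual implementation): the command memorized over one disturbance period, updated as u[k] ← u[k] + kr·e[k], cancels an N-periodic disturbance after a few periods.

```python
import numpy as np

N, periods, kr = 100, 20, 0.5
d = np.sin(2 * np.pi * np.arange(N) / N)   # N-periodic disturbance (e.g. breathing)
u = np.zeros(N)                             # command memory, one sample per bin
err = np.zeros(N)

for _ in range(periods):
    for k in range(N):
        y = u[k] + d[k]          # plant output: unit gain + additive disturbance
        err[k] = -y              # regulate the output to zero
        u[k] += kr * err[k]      # repetitive update, applied one period later

residual = np.max(np.abs(err))   # tracking error during the last period
```

Per bin, u converges geometrically to -d (factor 1 - kr per period), so the periodic disturbance is rejected without an explicit disturbance model.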
Bibliography
A. Krupa. Visual servoing in laparoscopic surgery. Ph.D. thesis, University Louis Pasteur, Strasbourg I, June 2003
R. Ginhoux. Compensation des mouvements physiologiques en chirurgie robotisée par commande prédictive (compensation of physiological motions in robotized surgery by predictive control). Ph.D. thesis, Strasbourg University, December 2003
F. Nageotte. Contributions à la suture assistée par ordinateur en chirurgie mini-invasive (contributions to computer-assisted suturing in minimally invasive surgery). Ph.D. thesis, Strasbourg University, November 2005
A. Krupa, J. Gangloff, C. Doignon, M. de Mathelin, G. Morel, J. Leroy, L. Soler, J. Marescaux. Autonomous 3D positioning of surgical instruments in robotized laparoscopic surgery using visual servoing. IEEE Transactions on Robotics, vol 19, no 5, pages 842-853, 2003
R. Ginhoux, J. Gangloff, M. de Mathelin, L. Soler, M. Arenas Sanchez and J. Marescaux. Active Filtering of Physiological Motion in Robotized Surgery Using Predictive Control. IEEE Transactions on Robotics, vol 21, no 1, pages 67-79, 2005
L. Ott, Ph. Zanne, F. Nageotte and M. De Mathelin, Problematic of transluminal operations, Surgetica 2007
F. Nageotte, Ph. Zanne, M. De Mathelin and Ch. Doignon, Visual servoing based tracking for endoscopic path following, IROS 2006
Perspectives
- New sensors (visual servoing under CT scan?)
- Servoing without feature extraction on deformable objects
- High-speed visual servoing at high resolution