An introduction to Computational Anatomy
Stéphanie Allassonnière
Faculté de médecine, Université Paris Descartes
September 2018
Introduction
Outline
1. Introduction to Computational Anatomy (Goals · Mathematical tools · Databases · Deformable Template framework)
2. Registration techniques
3. Statistical analysis of the deformations
4. Bayesian modelling for template estimation
Introduction to CA
Goals
To design mathematical methods and algorithms to model and analyse the anatomy
Characterise anatomical shapes : estimate representative organs within groups (anatomical invariants)
Analysis of populations : establish “normal” variability
Classification / Discrimination : classify pathologies from structural deviations
Learn the temporal evolution (growth, evolution of a disease) : model organ development across time
Segmentation, prediction, help therapy : build prior knowledge to simulate new anatomies, segment areas or organs in new patients, and predict the evolution of a disease from the shape of an organ (through clinical variables, for example)
Introduction to CA
Mathematical tools
Characterise anatomical shapes : geometry
Analysis of populations : statistical modelling and differential geometry
Classification / Discrimination : statistical analysis
Learn the temporal evolution (growth, evolution of a disease) : statistical learning
Segmentation, prediction, help therapy : functional analysis
Etc.
Introduction to CA
Kinds of data
Images (grey level, tensors, etc)
Introduction to CA
Kinds of data
Anatomical landmarks : anatomical points, fibres, gyri, etc
Introduction to CA
Kinds of data
Meshed surfaces (with point correspondence or not)
Introduction to CA
Kinds of data
Clinical variables related to the data (age, diagnosis, physiological parameters, etc)
Introduction to CA
NEED : compare these elements in a mathematical way
→ Computational anatomy is the correct setting
→ Requires models of the data
Deformable Template Model (Grenander, ’80)
Introduction to CA
The idea of Deformable Template Model
Compare two observations via the quantification of the deformation from one to the other (D’Arcy Thompson, 1917)
Each element of a population is a smooth deformation of a template :
Registration / variance
Template estimation / mean
Registration
Outline
1. Introduction to Computational Anatomy
2. Registration techniques (The Registration issue · First approach : rigid body transformations · Next step : linear models · The diffeomorphic setting)
3. Statistical analysis of the deformations
4. Bayesian modelling for template estimation
Registration
The Registration issue
Deformable template mathematical model : given two data I_0 and I_1,
I_1 ≃ φ · I_0
How to quantify the difference between the two objects ?
What kind of deformations ?
How to apply the deformation to an object ?
Registration
Object distances
Difference between two objects (same for the following solutions) :
Images : L² difference of the two functions,
‖I_1 − φ·I_0‖_2²
Landmarks : sum of squared Euclidean distances between points,
Σ_{1≤i≤N} ‖x_i^1 − φ·x_i^0‖_2²
Unlabeled landmark sets, meshes, fibres : require embedding the objects into a mathematical space where “addition, mean” and other mathematical operations are stable (see Jean !)
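These two discrepancy terms are straightforward to compute. A minimal numpy sketch (the function names are illustrative, not from the course; images and landmark sets are assumed to be numpy arrays):

```python
import numpy as np

def image_distance(I1, phi_I0):
    """Squared L2 difference between a target image and a deformed template."""
    return np.sum((I1 - phi_I0) ** 2)

def landmark_distance(x1, phi_x0):
    """Sum of squared Euclidean distances between paired landmarks."""
    return np.sum(np.linalg.norm(x1 - phi_x0, axis=1) ** 2)

# Toy check: two 2-D landmark sets shifted by (1, 0).
x0 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x1 = x0 + np.array([1.0, 0.0])
d = landmark_distance(x1, x0)  # 3 points, each at distance 1 -> 3.0
```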
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 14 / 82 First improvement : φ = Affine transformation Application of the deformation to : −1 Images : φ. I0 = I0 ◦ φ → Image support is deformed, grey levels are transported
Landmarks : φ. (xi )1≤i≤N = (φ(xi ))1≤i≤N Others : see Jean !
Registration
Rigid body or affine registration
Translation, rotation, scaling. Finite dimensional deformation but very restrictive.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 15 / 82 Application of the deformation to : −1 Images : φ. I0 = I0 ◦ φ → Image support is deformed, grey levels are transported
Landmarks : φ. (xi )1≤i≤N = (φ(xi ))1≤i≤N Others : see Jean !
Registration
Rigid body or affine registration
Translation, rotation, scaling. Finite dimensional deformation but very restrictive. First improvement : φ = Affine transformation
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 15 / 82 Registration
Rigid body or affine registration
Translation, rotation, scaling : finite-dimensional deformation, but very restrictive.
First improvement : φ = affine transformation.
Application of the deformation :
Images : φ · I_0 = I_0 ◦ φ⁻¹ → the image support is deformed, grey levels are transported
Landmarks : φ · (x_i)_{1≤i≤N} = (φ(x_i))_{1≤i≤N}
Others : see Jean !
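A minimal sketch of this affine group action, assuming numpy arrays and nearest-neighbour interpolation for the image pull-back (function names are ours; a practical implementation would rather use a library warp such as scipy.ndimage):

```python
import numpy as np

def affine_act_landmarks(A, b, x):
    """phi . (x_i) = (phi(x_i)): landmarks are pushed forward by phi(x) = A x + b."""
    return x @ A.T + b

def affine_act_image(A, b, I0):
    """phi . I0 = I0 o phi^{-1}: sample the template at phi^{-1}(pixel).
    Nearest-neighbour interpolation, zero padding outside the support."""
    Ainv = np.linalg.inv(A)
    H, W = I0.shape
    out = np.zeros_like(I0)
    for i in range(H):
        for j in range(W):
            y = Ainv @ (np.array([i, j], float) - b)  # phi^{-1} of the pixel
            ii, jj = int(round(y[0])), int(round(y[1]))
            if 0 <= ii < H and 0 <= jj < W:
                out[i, j] = I0[ii, jj]
    return out

# Pure translation by (1, 0): the image content moves by one row.
I0 = np.zeros((4, 4)); I0[1, 1] = 1.0
I1 = affine_act_image(np.eye(2), np.array([1.0, 0.0]), I0)
```

Note the convention: the support is deformed through φ⁻¹, so grey levels are transported without inverting the image function itself.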
Registration
Rigid body registration
Easy parametrisation of φ → Fast computations
Restricted deformations
No affine transformation can match these objects : need for non-linear deformations.
Registration
Linearised deformations
Same difference between two objects.
Idea : each pixel/voxel/point has its own movement :
φ = Id + v
where v ∈ V, a space of smooth vector fields, typically a Reproducing Kernel Hilbert Space (RKHS).
Application of the deformation :
Images : φ · I_0 = I_0 ◦ (Id − v) → the image support is deformed by Id − v ≃ φ⁻¹, grey levels are transported
Landmarks : φ · (x_i)_{1≤i≤N} = (x_i + v(x_i))_{1≤i≤N}
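A small-deformation sketch with a Gaussian RKHS kernel (the kernel choice, parameter values and function names are illustrative assumptions): the vector field is a kernel combination of a few control-point momenta, and landmarks simply move by v(x).

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """K_V(x, y) = exp(-|x - y|^2 / (2 sigma^2)), a common RKHS kernel choice."""
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def small_deformation(x, centers, momenta, sigma=1.0):
    """phi(x) = x + v(x) with v(x) = sum_j K_V(x, c_j) p_j."""
    return x + gaussian_kernel(x, centers, sigma) @ momenta

centers = np.array([[0.0, 0.0]])
momenta = np.array([[0.5, 0.0]])          # one control point pushing right
x = np.array([[0.0, 0.0], [5.0, 0.0]])    # a near point and a far point
phi_x = small_deformation(x, centers, momenta, sigma=1.0)
```

The near point moves by the full momentum; the far point barely moves, since the kernel decays with distance — the field is smooth and localised.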
Registration
Linearised deformations
φ only depends on v → explicit parametrisation.
Relevant for small deformations (where Id − v ≃ φ⁻¹ is valid).
No invertibility guaranteed : overlaps may appear → some tissue may disappear.
Need for a diffeomorphic condition on φ.
Registration LDDMM
LDDMM framework
Let Ω ⊂ ℝ^d be an open set. The framework defines :
a class of objects O (e.g. images I : Ω → ℝ, landmarks (x_i)_{1≤i≤N}, ...)
a class D of diffeomorphic deformations φ : Ω → Ω,
a specific group action
Registration LDDMM
LDDMM framework : The group of diffeomorphisms
At each time step, the deformation is a “small deformation” :
φ_{t_i + Δt_i} = Id + v_{t_i}
Final deformation : φ_1 = φ_{t_T} ◦ … ◦ φ_{t_1} ◦ φ_0  (with Σ_{i=1}^{T} Δt_i = 1 and φ_0 = Id : no deformation to start with)
When Δt → 0, φ_t^v is the solution of the flow equation :
dφ_t^v / dt = v_t ◦ φ_t^v ,   φ_0^v = Id
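The flow equation can be discretised exactly as above, by composing small deformations Id + Δt·v_t. A sketch for landmark trajectories (explicit Euler, a time-independent field for simplicity; names are ours):

```python
import numpy as np

def integrate_flow(x0, velocity, n_steps=100):
    """Euler discretisation of the flow equation d(phi_t)/dt = v_t o phi_t:
    each step applies the small deformation Id + dt * v_t to the current points."""
    dt = 1.0 / n_steps
    x = x0.astype(float).copy()
    for _ in range(n_steps):
        x = x + dt * velocity(x)  # (Id + dt v) composed with the flow so far
    return x

# Constant velocity field v(x) = (1, 0): phi_1 is the translation by (1, 0).
v = lambda x: np.tile([1.0, 0.0], (len(x), 1))
x0 = np.array([[0.0, 0.0], [2.0, 3.0]])
x1 = integrate_flow(x0, v, n_steps=50)
```

For a constant field the Euler scheme is exact; for a genuine time-dependent field, refining Δt recovers the continuous flow, whose invertibility is the point of the diffeomorphic setting.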
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 20 / 82 v 2 The set D = {φ1 , v ∈ L ([0, 1], V )} is a subgroup of diffeomorphisms on Ω
D is equipped with a right-invariant metric : d(φ, ψ) = d(Id, ψ ◦ φ−1)
v 0 v 0 It defines a group action : φ1 . I = I , φ1 ∈ D and I , I ∈ O
and a distance between two objects O0 and O1 is computed via the group action : v d(O0, O1) = inf d(Id, φ ) v 1 vt ∈V ,φ1 (O0)=O1
Registration LDDMM
LDDMM framework : The group of diffeomorphisms
If v ∈ V , with V “admissible space”, then : Existence and uniqueness of the solution of the flow equation
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 21 / 82 D is equipped with a right-invariant metric : d(φ, ψ) = d(Id, ψ ◦ φ−1)
v 0 v 0 It defines a group action : φ1 . I = I , φ1 ∈ D and I , I ∈ O
and a distance between two objects O0 and O1 is computed via the group action : v d(O0, O1) = inf d(Id, φ ) v 1 vt ∈V ,φ1 (O0)=O1
Registration LDDMM
LDDMM framework : The group of diffeomorphisms
If v ∈ V , with V “admissible space”, then : Existence and uniqueness of the solution of the flow equation
v 2 The set D = {φ1 , v ∈ L ([0, 1], V )} is a subgroup of diffeomorphisms on Ω
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 21 / 82 v 0 v 0 It defines a group action : φ1 . I = I , φ1 ∈ D and I , I ∈ O
and a distance between two objects O0 and O1 is computed via the group action : v d(O0, O1) = inf d(Id, φ ) v 1 vt ∈V ,φ1 (O0)=O1
Registration LDDMM
LDDMM framework : The group of diffeomorphisms
If v ∈ V , with V “admissible space”, then : Existence and uniqueness of the solution of the flow equation
v 2 The set D = {φ1 , v ∈ L ([0, 1], V )} is a subgroup of diffeomorphisms on Ω
D is equipped with a right-invariant metric : d(φ, ψ) = d(Id, ψ ◦ φ−1)
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 21 / 82 and a distance between two objects O0 and O1 is computed via the group action : v d(O0, O1) = inf d(Id, φ ) v 1 vt ∈V ,φ1 (O0)=O1
Registration LDDMM
LDDMM framework : The group of diffeomorphisms
If v ∈ V , with V “admissible space”, then : Existence and uniqueness of the solution of the flow equation
v 2 The set D = {φ1 , v ∈ L ([0, 1], V )} is a subgroup of diffeomorphisms on Ω
D is equipped with a right-invariant metric : d(φ, ψ) = d(Id, ψ ◦ φ−1)
v 0 v 0 It defines a group action : φ1 . I = I , φ1 ∈ D and I , I ∈ O
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 21 / 82 Registration LDDMM
LDDMM framework : The group of diffeomorphisms
If v ∈ V, with V an “admissible space”, then :
Existence and uniqueness of the solution of the flow equation.
The set D = {φ_1^v, v ∈ L²([0,1], V)} is a subgroup of the diffeomorphisms on Ω.
D is equipped with a right-invariant metric : d(φ, ψ) = d(Id, ψ ◦ φ⁻¹).
This defines a group action : φ_1^v · I = I′, with φ_1^v ∈ D and I, I′ ∈ O,
and a distance between two objects O_0 and O_1 is computed via the group action :
d(O_0, O_1) = inf_{v_t ∈ V, φ_1^v(O_0) = O_1} d(Id, φ_1^v)
Registration LDDMM
LDDMM framework : The group of diffeomorphisms
Existence, uniqueness and regularity (diffeomorphism) of the flow φ^v are guaranteed for (v_t)_{t∈[0,1]} such that
∫_0^1 ‖v_t‖_V² dt < ∞ .
Then : d(O_0, O_1) = inf_{v_t ∈ V, φ_1^v(O_0) = O_1} ∫_0^1 ‖v_t‖_V² dt
Registration LDDMM
LDDMM framework : The Matching problem
How to match one object onto another ?
Group action ⇒ a way to apply the deformation and stay in O
+ A distance between diffeomorphic objects : deformation cost
+ A similarity term between objects : data attachment term
= An energy to minimise :
E(v) = (1/2) ∫_0^1 ‖v_t‖_V² dt + (1/(2σ²)) ‖I_1 − I_0 ◦ (φ_1^v)⁻¹‖²
(first term : deformation cost ; 1/(2σ²) : tradeoff parameter ; second term : data attachment)
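For landmarks and a single time step (a small-deformation shortcut rather than a full LDDMM solver), this energy can be sketched as follows; the Gaussian kernel, the parameter values and the function names are illustrative assumptions:

```python
import numpy as np

def gauss_K(x, y, s=1.0):
    """Gaussian RKHS kernel matrix between two landmark sets."""
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * s ** 2))

def matching_energy(p, x0, x1, sigma=0.1, s=1.0):
    """E(v) = 1/2 |v|_V^2 + 1/(2 sigma^2) * data attachment,
    for a one-step deformation v(x) = sum_j K(x, x0_j) p_j."""
    K = gauss_K(x0, x0, s)
    reg = 0.5 * np.sum(p * (K @ p))      # RKHS norm of v (deformation cost)
    phi_x0 = x0 + K @ p                  # deformed source landmarks
    data = np.sum((x1 - phi_x0) ** 2) / (2 * sigma ** 2)
    return reg + data

x0 = np.array([[0.0, 0.0]])
x1 = np.array([[1.0, 0.0]])
E0 = matching_energy(np.zeros((1, 2)), x0, x1)        # zero momentum: pure data term
E1 = matching_energy(np.array([[1.0, 0.0]]), x0, x1)  # exact match: pure deformation cost
```

The tradeoff is visible: small σ penalises residual mismatch heavily, pushing the minimiser toward a more expensive deformation.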
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 23 / 82 (pi (t))1≤i≤N = momenta of the landmarks at time t 1 Let H(x, p) = 2 hp, K(x)pi = Hamiltonian Then
dx(t) = K (x(t))p(t) = ∂H (x, p) dt V ∂p
dp(t) 1 ∂H dt = − 2 ∇x(t)KV (p(t), p(t)) = − ∂x (x, p) Hamiltonian constant on the geodesics
Registration LDDMM
LDDMM framework : From Hamiltonian to Shooting
The Hamiltonian formulation :
Let V be a RKHS which kernel KV : (example for N landmarks) N n P ∃ (pi (t))1≤i≤N ∈ R such that vt (x) = KV (xi (t), x)pi (t) i=1
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 24 / 82 1 Let H(x, p) = 2 hp, K(x)pi = Hamiltonian Then
dx(t) = K (x(t))p(t) = ∂H (x, p) dt V ∂p
dp(t) 1 ∂H dt = − 2 ∇x(t)KV (p(t), p(t)) = − ∂x (x, p) Hamiltonian constant on the geodesics
Registration LDDMM
LDDMM framework : From Hamiltonian to Shooting
The Hamiltonian formulation :
Let V be a RKHS which kernel KV : (example for N landmarks) N n P ∃ (pi (t))1≤i≤N ∈ R such that vt (x) = KV (xi (t), x)pi (t) i=1 (pi (t))1≤i≤N = momenta of the landmarks at time t
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 24 / 82 Then
dx(t) = K (x(t))p(t) = ∂H (x, p) dt V ∂p
dp(t) 1 ∂H dt = − 2 ∇x(t)KV (p(t), p(t)) = − ∂x (x, p) Hamiltonian constant on the geodesics
Registration LDDMM
LDDMM framework : From Hamiltonian to Shooting
The Hamiltonian formulation :
Let V be a RKHS which kernel KV : (example for N landmarks) N n P ∃ (pi (t))1≤i≤N ∈ R such that vt (x) = KV (xi (t), x)pi (t) i=1 (pi (t))1≤i≤N = momenta of the landmarks at time t 1 Let H(x, p) = 2 hp, K(x)pi = Hamiltonian
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 24 / 82 dx(t) = K (x(t))p(t) = ∂H (x, p) dt V ∂p
dp(t) 1 ∂H dt = − 2 ∇x(t)KV (p(t), p(t)) = − ∂x (x, p) Hamiltonian constant on the geodesics
Registration LDDMM
LDDMM framework : From Hamiltonian to Shooting
The Hamiltonian formulation :
Let V be a RKHS which kernel KV : (example for N landmarks) N n P ∃ (pi (t))1≤i≤N ∈ R such that vt (x) = KV (xi (t), x)pi (t) i=1 (pi (t))1≤i≤N = momenta of the landmarks at time t 1 Let H(x, p) = 2 hp, K(x)pi = Hamiltonian Then
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 24 / 82 Hamiltonian constant on the geodesics
Registration LDDMM
LDDMM framework : From Hamiltonian to Shooting
The Hamiltonian formulation :
Let V be a RKHS which kernel KV : (example for N landmarks) N n P ∃ (pi (t))1≤i≤N ∈ R such that vt (x) = KV (xi (t), x)pi (t) i=1 (pi (t))1≤i≤N = momenta of the landmarks at time t 1 Let H(x, p) = 2 hp, K(x)pi = Hamiltonian Then
dx(t) = K (x(t))p(t) = ∂H (x, p) dt V ∂p
dp(t) 1 ∂H dt = − 2 ∇x(t)KV (p(t), p(t)) = − ∂x (x, p)
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 24 / 82 Registration LDDMM
LDDMM framework : From Hamiltonian to Shooting
The Hamiltonian formulation :
Let V be an RKHS with kernel K_V (example for N landmarks) :
∃ (p_i(t))_{1≤i≤N} ∈ ℝ^n such that v_t(x) = Σ_{i=1}^{N} K_V(x_i(t), x) p_i(t)
(p_i(t))_{1≤i≤N} = momenta of the landmarks at time t.
Let H(x, p) = (1/2) ⟨p, K_V(x) p⟩ be the Hamiltonian. Then :
dx(t)/dt = K_V(x(t)) p(t) = ∂H/∂p (x, p)
dp(t)/dt = −(1/2) ∇_{x(t)} K_V (p(t), p(t)) = −∂H/∂x (x, p)
The Hamiltonian is constant along the geodesics.
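A sketch of geodesic shooting for N landmarks with a Gaussian kernel (an assumed kernel choice; explicit Euler integration of the Hamiltonian system above; function names are ours):

```python
import numpy as np

def shoot(x0, p0, s=1.0, n_steps=200):
    """Hamiltonian (geodesic) shooting for N landmarks, Gaussian kernel:
    integrates dx/dt = dH/dp, dp/dt = -dH/dx with explicit Euler on [0, 1]."""
    dt = 1.0 / n_steps
    x, p = x0.astype(float).copy(), p0.astype(float).copy()
    for _ in range(n_steps):
        diff = x[:, None, :] - x[None, :, :]                    # x_i - x_j
        K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * s ** 2))  # K_V(x_i, x_j)
        dx = K @ p                                              # dH/dp
        pp = p @ p.T                                            # <p_i, p_j>
        dp = np.sum((K * pp / s ** 2)[:, :, None] * diff, axis=1)  # -dH/dx
        x, p = x + dt * dx, p + dt * dp
    return x, p

def hamiltonian(x, p, s=1.0):
    diff = x[:, None, :] - x[None, :, :]
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * s ** 2))
    return 0.5 * np.sum((p @ p.T) * K)

# One landmark: no interaction gradient, so the geodesic is the straight line
# x(t) = x0 + t p0 and H = |p0|^2 / 2 is conserved exactly.
x0 = np.array([[0.0, 0.0]]); p0 = np.array([[1.0, 0.0]])
x1, p1 = shoot(x0, p0)
```

With several landmarks the momenta interact through the kernel, and the (approximate) conservation of H is a useful sanity check of the integrator.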
Registration LDDMM
LDDMM framework : From Hamiltonian to Shooting
Consequences :
⇒ Parametrisation of the problem by x_0 (known) and p_0 only.
For images : the momentum is supported by the gradient of the image.
Registration LDDMM
LDDMM framework : Example of deformations
Registration LDDMM
So now ?
We can now do statistics on either the objects or the deformations ! For example :
Global shape analysis
Local deformation pattern detection
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 27 / 82 Noisy ICA general model Gaussian Graphical Models
Statistical analysis of the deformations
Outline
1. Introduction to Computational Anatomy 2. Registration technics 3. Statistical analysis of the deformations
4. Bayesian Modelling for template estimation
Statistical analysis of the deformations
Outline
1. Introduction to Computational Anatomy
2. Registration techniques
3. Statistical analysis of the deformations
   Noisy ICA general model
   Gaussian Graphical Models
4. Bayesian Modelling for template estimation
Statistical analysis of the deformations
Noisy ICA: Goal of the decomposition
Data: one point cloud (here, a mixture of two Gaussian distributions).
Goal: explain these data.
How: by extracting the "sources" which generated the data.
Statistical analysis of the deformations
Principal Component Analysis (PCA)
Orthogonal directions of maximum variance. Assumes a Gaussian distribution: $X = \mu + \Sigma^{1/2}\varepsilon$, $\varepsilon \sim \mathcal{N}(0, \mathrm{Id})$.
Figures: result from PCA (black lines); resampling from the model.
Statistical analysis of the deformations
Independent Component Analysis (ICA)
Interpretation of the data (≠ description). Finds sources which may have generated the cloud. Accounts for non-Gaussian distributions (more flexible model).
Figures: first estimated source; second estimated source; resampling from the model.
Statistical analysis of the deformations
Difference between PCA and ICA
PCA: maximum-variance axes; geometrical (orthogonal axes); description of data; Gaussian distribution only.
ICA: source separation; statistical (source points in the plane); explanation and interpretation; many other possible distributions, e.g. mixtures, continuous or discrete (see later).
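The contrast between the two decompositions can be made concrete in a few lines of Python; the Laplace sources, mixing matrix, and noise level below are illustrative assumptions (FastICA, from scikit-learn, also appears later in the slides):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 2000
# two independent non-Gaussian (Laplace) sources
S = rng.laplace(size=(n, 2))
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # illustrative mixing matrix
X = S @ A.T + 0.05 * rng.normal(size=(n, 2))    # noisy mixtures

# PCA: orthogonal directions of maximum variance (SVD of the centred data)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pca_axes = Vt                                    # rows = orthonormal principal axes

# ICA: statistically independent sources (the axes need not be orthogonal)
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                     # estimated sources
A_hat = ica.mixing_                              # estimated mixing matrix
```

PCA returns orthogonal axes describing the cloud; ICA returns a (generally non-orthogonal) mixing matrix explaining how independent sources generated it.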
Statistical analysis of the deformations
ICA model
General hierarchical model: $X_i = A\beta_i + \sigma\varepsilon_i$
* Observations: $X_1^n$ in $(\mathbb{R}^d)^n$
* Source matrix: $A$, called the decomposition matrix
* Independent components: $\beta_i \in \mathbb{R}^p$, $p \ll d$ → random vector with independent coordinates
* Gaussian noise: $\sigma\varepsilon_i$
$\beta_1^n$ are hidden variables.
Model: for all images $X_i$, $1 \le i \le n$:
$$\beta_{i,j} \sim \nu_\eta \mid \eta, \ \forall\, 1 \le j \le p,$$
$$X_i \sim \mathcal{N}(A\beta_i, \sigma^2 \mathrm{Id}) \mid A, \sigma^2, \beta_i.$$
Various choices of the distribution $\nu_\eta$ on the independent components.
Statistical analysis of the deformations
ICA model
Various examples of distributions: Independent Factor Analysis
$$\beta_{i,j} \sim \nu_\eta \mid \eta, \ \forall\, 1 \le j \le p; \quad X_i \sim \mathcal{N}(A\beta_i, \sigma^2 \mathrm{Id}) \mid A, \sigma^2, \beta_i.$$
For identifiability, $\nu_\eta$ cannot be Gaussian.
$\nu_\eta$ is a mixture of $K$ one-dimensional Gaussian distributions $\mathcal{N}(m_k, 1)$, $k = 1, \dots, K$, with weights $(w_k)_{1 \le k \le K}$.
$\eta = (m_k, w_k)_{1 \le k \le K}$
$\theta = (A, \sigma^2, (m_k, w_k)_{1 \le k \le K})$
Statistical analysis of the deformations
ICA model
Various examples of distributions: continuous distributions
$$\beta_{i,j} \sim \nu_\eta \mid \eta, \ \forall\, 1 \le j \le p; \quad X_i \sim \mathcal{N}(A\beta_i, \sigma^2 \mathrm{Id}) \mid A, \sigma^2, \beta_i.$$
$\nu_\eta$ is either:
Logistic $\mathrm{Log}(1/2)$, Laplacian,
Exponentially scaled Gaussian (EG): $\beta_i^j = s_i^j Y_i^j$, where $Y \sim \mathcal{N}(\mu, \mathrm{Id})$ with $\mu = (\mu, \dots, \mu)$, and $s_i^1, \dots, s_i^p$ are independent $\mathrm{Exp}(1)$, also independent from $Y$ (sub-exponential tail).
$\eta = \emptyset$ or $\mu$
$\theta = (A, \sigma^2)$ or $\theta = (A, \sigma^2, \mu)$
Statistical analysis of the deformations
ICA model
Various examples of distributions: discrete distributions
$$\beta_{i,j} \sim \nu_\eta \mid \eta, \ \forall\, 1 \le j \le p; \quad X_i \sim \mathcal{N}(A\beta_i, \sigma^2 \mathrm{Id}) \mid A, \sigma^2, \beta_i.$$
Idea: introduce a switch to cancel some of the decomposition vectors → either binary ("on/off") or ternary (activate, inhibit, remove).
Bernoulli-censored Gaussian (BG): $\beta^j = b^j Y^j$ with $b^j \sim \mathcal{B}(\alpha)$ and $Y$ a Gaussian vector with distribution $\mathcal{N}(\mu, \mathrm{Id})$.
Exponentially scaled Bernoulli-censored Gaussian (EBG): mix of EG and BG.
Exponentially scaled ternary distribution (ET): $\beta^j = s^j Y^j$, where $s^1, \dots, s^p$ are i.i.d. $\mathrm{Exp}(1)$ and $\gamma = P(Y^j = -1) = P(Y^j = 1)$, providing a symmetric distribution for the components of $Y$.
$\theta = (A, \sigma^2, \mu, \alpha, \gamma)$
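These switched components are easy to sample; the sketch below draws BG and ET components and the corresponding noisy observations (the parameter values are illustrative assumptions, not those of the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_bg(n, p, alpha=0.3, mu=1.0):
    """Bernoulli-censored Gaussian components: beta^j = b^j Y^j."""
    b = rng.binomial(1, alpha, size=(n, p))         # binary on/off switch
    Y = rng.normal(mu, 1.0, size=(n, p))
    return b * Y

def sample_et(n, p, gamma=0.2):
    """Exponentially scaled ternary components: beta^j = s^j Y^j."""
    s = rng.exponential(1.0, size=(n, p))           # s^j i.i.d. Exp(1)
    Y = rng.choice([-1, 0, 1], size=(n, p),
                   p=[gamma, 1 - 2 * gamma, gamma]) # symmetric ternary switch
    return s * Y

def sample_images(A, beta, sigma=0.1):
    """Noisy observations X_i = A beta_i + sigma eps_i."""
    n = beta.shape[0]
    return beta @ A.T + sigma * rng.normal(size=(n, A.shape[0]))
```

The switches produce exact zeros in the components, so only a few decomposition vectors are active per observation.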
Statistical analysis of the deformations
Estimation
ML estimator:
Parameters $\theta$ are estimated by maximum likelihood:
$$\hat\theta = \arg\max_\theta P(X; \theta)$$
$\beta_1^n$ unobserved random variables + maximising a likelihood → EM algorithm (Expectation-Maximisation).
Iteration $k$ of the algorithm:
E step: compute the posterior distribution of $\beta_i$ given $X_i$ and $\theta_k$: $\nu_{i,k}(d\beta_i)$.
M step: parameter update:
$$\theta_{k+1} = \arg\max_\theta \mathbb{E}_{\nu_{1,k}^n}\big[\log q(\beta_1^n, X_1^n; \theta)\big].$$
BUT: the E step is not computationally tractable!
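The E/M alternation can be made concrete on a model whose E step is tractable; the sketch below runs EM on a two-component 1D Gaussian mixture with unit variances (an illustrative stand-in, not the ICA model, whose E step is intractable as noted above):

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1D Gaussian mixture with unit variances.
    Illustrates the E/M alternation; here the posterior is in closed form."""
    m = np.array([x.min(), x.max()])                 # crude initial means
    w = np.array([0.5, 0.5])                         # initial weights
    for _ in range(n_iter):
        # E step: posterior responsibilities of the hidden label of each x_i
        dens = w * np.exp(-0.5 * (x[:, None] - m) ** 2)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M step: maximise the expected complete-data log-likelihood
        w = resp.mean(axis=0)
        m = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
    return w, m
```

In the noisy-ICA models above, the E step has no such closed form, which motivates the stochastic variants (SAEM, particle filtering) used in the experiments.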
Statistical analysis of the deformations
Estimation
Others:
FastICA: employs a fixed-point algorithm to minimise the mutual information between the coordinates of $A^{-1}X$.
Particle filtering within EM: approximates the posterior distribution using particle filtering.
Statistical analysis of the deformations
Experiments
Two source images.
Samples of the four training sets with different levels of noise. From left to right and top to bottom: $\sigma = 0.1, 0.5, 0.8, 1.5$.
Statistical analysis of the deformations
Experiments
Results of the PCA decomposition ($\sigma = 0.1, 0.5, 0.8, 1.5$).
Figures: cumulative eigenvalues of the PCA decomposition; two first principal components (orthogonal images).
Statistical analysis of the deformations
Experiments
Comparison: 30 images per training set. Estimated decomposition vectors (images) for each model/algorithm (FAM-EM/Log, SAEM/Log, SAEM/IFA, EM/IFA, SAEM/BG, FastICA) at $\sigma = 0.1, 0.5, 0.8, 1.5$.
Statistical analysis of the deformations
Experiments
Comparison: 50 images per training set. Estimated decomposition vectors (images) for each model/algorithm (FAM-EM/Log, SAEM/Log, SAEM/IFA, EM/IFA, SAEM/BG, FastICA) at $\sigma = 0.1, 0.5, 0.8, 1.5$.
Statistical analysis of the deformations
Experiments
Comparison: 100 images per training set. Estimated decomposition vectors (images) for each model/algorithm (FAM-EM/Log, SAEM/Log, SAEM/IFA, EM/IFA, SAEM/BG, FastICA) at $\sigma = 0.1, 0.5, 0.8, 1.5$.
Statistical analysis of the deformations
Real databases
Patches of faces from the Caltech101 database
100 random images picked from the 10,000 images used as the training set. These images are patches extracted from the face images of the Caltech101 database. Each image is a grey-level image of size 13 × 13.
Statistical analysis of the deformations
Real databases
Patches of faces from the Caltech101 database
100 decomposition vectors from two models. Left: Log-ICA. Right: BG-ICA.
Statistical analysis of the deformations
Real databases
101 hippocampus deformations (3 populations: Ctrl, mild AD, AD)
Mean and five decomposition vectors estimated with L-ICA (left) and ET-ICA (right). Each image has its own colorbar to highlight the major patterns.
Statistical analysis of the deformations
Real databases
101 hippocampus deformations (3 populations: Ctrl, mild AD, AD)

              Ctrl/AD                           Ctrl/mild AD
Model         L-ICA          BG-ICA             L-ICA          BG-ICA
Mean          0.31 × 10⁻³    0.33 × 10⁻³        9.0 × 10⁻³     1.09 × 10⁻²
Std dev.      0.16 × 10⁻³    0.25 × 10⁻³        3.8 × 10⁻³     4.6 7.6 × 10⁻³

Table – Mean and standard deviation of the p-values for the two models with the decomposition vectors. Means and standard deviations are computed over 50 runs, separating the Controls from the AD group (left columns) and the Controls from the mild AD group (right columns). PCA p-values: 0.3 × 10⁻³ and 7.7 × 10⁻³ using 95% of the cumulative variance.
Statistical analysis of the deformations
Real databases
What more can we look for?
Relations between regions of the brain or an anatomical structure, such as correlation patterns.
Example: compute the correlation matrix using a PCA decomposition, then highlight the most important entries by thresholding.
However: correlations describe the global statistical dependencies between variables = both direct and indirect interactions.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 46 / 82 But conditionally on Snow, Number of snowmen is independent of Traffic jams ⇔ No edge between Traffic jam and Snowmen random variables
Statistical analysis of the deformations Real databases
Correlation vs Conditional Correlation
Under Gaussian assumption Traffic jam intensity correlated to Number of snowmen in town due to snowstorm.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 47 / 82 ⇔ No edge between Traffic jam and Snowmen random variables
Statistical analysis of the deformations Real databases
Correlation vs Conditional Correlation
Under Gaussian assumption Traffic jam intensity correlated to Number of snowmen in town due to snowstorm. But conditionally on Snow, Number of snowmen is independent of Traffic jams
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 47 / 82 Statistical analysis of the deformations Real databases
Correlation vs Conditional Correlation
Under Gaussian assumption Traffic jam intensity correlated to Number of snowmen in town due to snowstorm. But conditionally on Snow, Number of snowmen is independent of Traffic jams ⇔ No edge between Traffic jam and Snowmen random variables
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 47 / 82 No post processing Introduce sparsity into the modelling
+ high-dimension-low-sample-size paradigm Although the underlying real graph is not sparse, perform a sparse estimation of its structure
Significant edges only appear and estimation more stable
Statistical analysis of the deformations Real databases
Why do we need sparsity ?
Only few of these direct interaction are important
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 48 / 82 Introduce sparsity into the modelling
+ high-dimension-low-sample-size paradigm Although the underlying real graph is not sparse, perform a sparse estimation of its structure
Significant edges only appear and estimation more stable
Statistical analysis of the deformations Real databases
Why do we need sparsity ?
Only few of these direct interaction are important No post processing
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 48 / 82 + high-dimension-low-sample-size paradigm Although the underlying real graph is not sparse, perform a sparse estimation of its structure
Significant edges only appear and estimation more stable
Statistical analysis of the deformations Real databases
Why do we need sparsity ?
Only few of these direct interaction are important No post processing Introduce sparsity into the modelling
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 48 / 82 Although the underlying real graph is not sparse, perform a sparse estimation of its structure
Significant edges only appear and estimation more stable
Statistical analysis of the deformations Real databases
Why do we need sparsity ?
Only few of these direct interaction are important No post processing Introduce sparsity into the modelling
+ high-dimension-low-sample-size paradigm
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 48 / 82 Significant edges only appear and estimation more stable
Statistical analysis of the deformations Real databases
Why do we need sparsity ?
Only few of these direct interaction are important No post processing Introduce sparsity into the modelling
+ high-dimension-low-sample-size paradigm Although the underlying real graph is not sparse, perform a sparse estimation of its structure
Statistical analysis of the deformations
Real databases
Why do we need sparsity?
Only a few of these direct interactions are important → no post-processing.
Introduce sparsity into the modelling + the high-dimension-low-sample-size paradigm: although the underlying real graph is not sparse, perform a sparse estimation of its structure.
Only significant edges appear, and the estimation is more stable.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 48 / 82 On these points, we observe n random responses The p nodes of the graph are thus identified to p random variables denoted X = (X1, ..., Xp)
X assumed to be distributed as a multivariate Gaussian Np(0, Σ)
There exists an edge between nodes a and b if and only if the variables Xa and Xb are dependent given all the remaining variables Conditional correlations given by non-zero entries of Σ−1
Statistical analysis of the deformations Real databases
The GGM statistical Model
Consider p points on a given shape = nodes of the graph
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 49 / 82 The p nodes of the graph are thus identified to p random variables denoted X = (X1, ..., Xp)
X assumed to be distributed as a multivariate Gaussian Np(0, Σ)
There exists an edge between nodes a and b if and only if the variables Xa and Xb are dependent given all the remaining variables Conditional correlations given by non-zero entries of Σ−1
Statistical analysis of the deformations Real databases
The GGM statistical Model
Consider p points on a given shape = nodes of the graph On these points, we observe n random responses
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 49 / 82 X assumed to be distributed as a multivariate Gaussian Np(0, Σ)
There exists an edge between nodes a and b if and only if the variables Xa and Xb are dependent given all the remaining variables Conditional correlations given by non-zero entries of Σ−1
Statistical analysis of the deformations Real databases
The GGM statistical Model
Consider p points on a given shape = nodes of the graph On these points, we observe n random responses The p nodes of the graph are thus identified to p random variables denoted X = (X1, ..., Xp)
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 49 / 82 There exists an edge between nodes a and b if and only if the variables Xa and Xb are dependent given all the remaining variables Conditional correlations given by non-zero entries of Σ−1
Statistical analysis of the deformations Real databases
The GGM statistical Model
Consider p points on a given shape = nodes of the graph On these points, we observe n random responses The p nodes of the graph are thus identified to p random variables denoted X = (X1, ..., Xp)
X assumed to be distributed as a multivariate Gaussian Np(0, Σ)
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 49 / 82 There exists an edge between nodes a and b if and only if the variables Xa and Xb are dependent given all the remaining variables Conditional correlations given by non-zero entries of Σ−1
Statistical analysis of the deformations Real databases
The GGM statistical Model
Consider p points on a given shape = nodes of the graph On these points, we observe n random responses The p nodes of the graph are thus identified to p random variables denoted X = (X1, ..., Xp)
X assumed to be distributed as a multivariate Gaussian Np(0, Σ)
The graph GΣ of conditional dependencies is defined as follows :
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 49 / 82 Conditional correlations given by non-zero entries of Σ−1
Statistical analysis of the deformations Real databases
The GGM statistical Model
Consider p points on a given shape = nodes of the graph On these points, we observe n random responses The p nodes of the graph are thus identified to p random variables denoted X = (X1, ..., Xp)
X assumed to be distributed as a multivariate Gaussian Np(0, Σ)
The graph GΣ of conditional dependencies is defined as follows : There exists an edge between nodes a and b if and only if the variables Xa and Xb are dependent given all the remaining variables
Statistical analysis of the deformations
Real databases
The GGM statistical model
Consider $p$ points on a given shape = the nodes of the graph.
On these points, we observe $n$ random responses.
The $p$ nodes of the graph are thus identified with $p$ random variables denoted $X = (X_1, \dots, X_p)$.
$X$ is assumed to follow a multivariate Gaussian $\mathcal{N}_p(0, \Sigma)$.
The graph $G_\Sigma$ of conditional dependencies is defined as follows: there is an edge between nodes $a$ and $b$ if and only if the variables $X_a$ and $X_b$ are dependent given all the remaining variables. Conditional correlations are given by the non-zero entries of $\Sigma^{-1}$.
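A sparse estimate of $\Sigma^{-1}$ of this kind can be obtained with the graphical lasso; the sketch below uses scikit-learn's GraphicalLasso on a toy chain graph (the graph, the penalty `alpha`, and the sample size are illustrative assumptions; the Lasso/Enet estimators with Or/And rules used in the experiments are related but distinct):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# ground-truth sparse precision matrix: p = 4 nodes on a chain 0-1-2-3
Theta = np.array([[2.0, 0.6, 0.0, 0.0],
                  [0.6, 2.0, 0.6, 0.0],
                  [0.0, 0.6, 2.0, 0.6],
                  [0.0, 0.0, 0.6, 2.0]])
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(4), Sigma, size=2000)

# sparse estimate of Theta = Sigma^{-1}; its non-zeros are the graph edges
model = GraphicalLasso(alpha=0.05).fit(X)
Theta_hat = model.precision_
edges = np.abs(Theta_hat) > 1e-2      # adjacency pattern (with diagonal)
```

The L1 penalty sets many off-diagonal entries of the estimated precision matrix (near) zero, so only the conditional dependencies survive.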
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 49 / 82 Information known a priori −→ Into the statistical model New data : Neighbouring graph G0 There are correlations between these points But we are not estimating them rather looking for the other ones = Long distance conditional correlations Our idea : Do the estimation in the orthogonal space of G0
T −1 T Xa − Xma (Xma Xma ) Xma Xa , (1) May be too strong constrain (numerically un-invertible) −→ alleviate through :
T −1 T Xa − Xma (Xma Xma + γ0Id) Xma Xa . (2)
Introducing a neighbourhood prior
Introducing a neighbourhood prior
The neighbouring points of the graph are very likely to be conditionally correlated.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 50 / 82 New data : Neighbouring graph G0 There are correlations between these points But we are not estimating them rather looking for the other ones = Long distance conditional correlations Our idea : Do the estimation in the orthogonal space of G0
T −1 T Xa − Xma (Xma Xma ) Xma Xa , (1) May be too strong constrain (numerically un-invertible) −→ alleviate through :
T −1 T Xa − Xma (Xma Xma + γ0Id) Xma Xa . (2)
Introducing a neighbourhood prior
Introducing a neighbourhood prior
The neighbouring points of the graph are very likely to be conditionally correlated. Information known a priori −→ Into the statistical model
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 50 / 82 There are correlations between these points But we are not estimating them rather looking for the other ones = Long distance conditional correlations Our idea : Do the estimation in the orthogonal space of G0
T −1 T Xa − Xma (Xma Xma ) Xma Xa , (1) May be too strong constrain (numerically un-invertible) −→ alleviate through :
T −1 T Xa − Xma (Xma Xma + γ0Id) Xma Xa . (2)
Introducing a neighbourhood prior
Introducing a neighbourhood prior
The neighbouring points of the graph are very likely to be conditionally correlated. Information known a priori −→ Into the statistical model New data : Neighbouring graph G0
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 50 / 82 But we are not estimating them rather looking for the other ones = Long distance conditional correlations Our idea : Do the estimation in the orthogonal space of G0
T −1 T Xa − Xma (Xma Xma ) Xma Xa , (1) May be too strong constrain (numerically un-invertible) −→ alleviate through :
T −1 T Xa − Xma (Xma Xma + γ0Id) Xma Xa . (2)
Introducing a neighbourhood prior
Introducing a neighbourhood prior
The neighbouring points of the graph are very likely to be conditionally correlated. Information known a priori −→ Into the statistical model New data : Neighbouring graph G0 There are correlations between these points
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 50 / 82 Our idea : Do the estimation in the orthogonal space of G0
T −1 T Xa − Xma (Xma Xma ) Xma Xa , (1) May be too strong constrain (numerically un-invertible) −→ alleviate through :
T −1 T Xa − Xma (Xma Xma + γ0Id) Xma Xa . (2)
Introducing a neighbourhood prior
Introducing a neighbourhood prior
The neighbouring points of the graph are very likely to be conditionally correlated. Information known a priori −→ Into the statistical model New data : Neighbouring graph G0 There are correlations between these points But we are not estimating them rather looking for the other ones = Long distance conditional correlations
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 50 / 82 May be too strong constrain (numerically un-invertible) −→ alleviate through :
T −1 T Xa − Xma (Xma Xma + γ0Id) Xma Xa . (2)
Introducing a neighbourhood prior
Introducing a neighbourhood prior
The neighbouring points of the graph are very likely to be conditionally correlated. Information known a priori −→ Into the statistical model New data : Neighbouring graph G0 There are correlations between these points But we are not estimating them rather looking for the other ones = Long distance conditional correlations Our idea : Do the estimation in the orthogonal space of G0
T −1 T Xa − Xma (Xma Xma ) Xma Xa , (1)
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 50 / 82 Introducing a neighbourhood prior
Introducing a neighbourhood prior
The neighbouring points of the graph are very likely to be conditionally correlated. Information known a priori −→ Into the statistical model New data : Neighbouring graph G0 There are correlations between these points But we are not estimating them rather looking for the other ones = Long distance conditional correlations Our idea : Do the estimation in the orthogonal space of G0
T −1 T Xa − Xma (Xma Xma ) Xma Xa , (1) May be too strong constrain (numerically un-invertible) −→ alleviate through :
T −1 T Xa − Xma (Xma Xma + γ0Id) Xma Xa . (2)
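Equations (1) and (2) are the residuals of $X_a$ after (ridge-regularised) regression on its neighbour columns $X_{m_a}$; a direct NumPy transcription (function and variable names are mine):

```python
import numpy as np

def residual_against_neighbours(X, a, neigh, gamma0=0.0):
    """Residual of column X_a after regressing on its neighbour columns X_{m_a}:
    X_a - X_{m_a} (X_{m_a}^T X_{m_a} + gamma0 Id)^{-1} X_{m_a}^T X_a.
    gamma0 = 0 gives the orthogonal projection of eq. (1); gamma0 > 0 the
    ridge-regularised version of eq. (2)."""
    Xa = X[:, a]
    Xm = X[:, neigh]
    G = Xm.T @ Xm + gamma0 * np.eye(len(neigh))
    return Xa - Xm @ np.linalg.solve(G, Xm.T @ Xa)
```

With `gamma0 = 0` the residual is exactly orthogonal to the neighbour columns, so the subsequent graph estimation only sees long-distance dependencies.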
Experiments
No prior
Figure – Examples of the training set. The colour depends on the intensity of the Jacobian of the deformation: blue means a contraction, red a dilation. The intensity itself is not important; rather, its value relative to the others is.
Experiments
No prior
Figure – Estimated graphs: Lasso + Or, Lasso + And, Enet + Or, Enet + And.
Experiments
With prior
Figure – Two examples of neighbourhood graphs we used. Left: 3-nearest-neighbour graph. Right: neighbours have Euclidean distance below a given threshold.
Note (user's choice): you can add connections that you already know.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 53 / 82 Experiments
LASSO LASSO LASSO LASSO Enet Enet Enet Enet Orth Orth Ridge Ridge Orth Orth Ridge Ridge Or And Or And Or And Or And
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 54 / 82 Experiments
Clustering of the shapes: using spectral clustering
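The slides only state that spectral clustering is applied to the shapes. As a hedged illustration, here is a self-contained sketch of the standard algorithm (RBF affinity, symmetric normalised Laplacian, k-means in the spectral embedding) on toy 2-D points; the kernel width and the k-means variant are our own choices:

```python
import numpy as np

def spectral_clustering(X, n_clusters=2, sigma=1.0, n_iter=50):
    """Standard spectral clustering on points X (one point per row)."""
    # RBF affinity and symmetric normalised Laplacian L = I - D^{-1/2} W D^{-1/2}
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(1))
    L = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Embed with the eigenvectors of the smallest eigenvalues, row-normalised
    _, vecs = np.linalg.eigh(L)
    emb = vecs[:, :n_clusters]
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    # k-means in the embedded space, farthest-point initialisation
    centers = [emb[0]]
    for _ in range(1, n_clusters):
        d2 = ((emb[:, None, :] - np.array(centers)[None, :, :]) ** 2).sum(-1).min(1)
        centers.append(emb[np.argmax(d2)])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((emb[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        centers = np.stack([emb[labels == k].mean(0) for k in range(n_clusters)])
    return labels

# Two well-separated blobs are recovered as two clusters
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
labels = spectral_clustering(X, n_clusters=2)
```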
Experiments
Population comparison

[Figure panels — columns: without prior, with prior; rows: controls, AD patients.]
Statistical Questions

Outline

1. Introduction to Computational Anatomy
2. Registration techniques
3. Statistical analysis of the deformations
4. Bayesian Modelling for template estimation
    Mathematical framework for deformable models
    Past approaches to compute a population average
    Generative statistical models
    Statistical estimation of the model parameters
    Experiments
    And next?
Statistical Questions

What are the mean and the variability?

Matching depends on the template I0. What is a good template? One of them? Which one, and why this one?
Statistical Questions

What are the mean and the variability?

Another, related question: let (O_i)_{1≤i≤n} be a homogeneous population (control, AD, autistic, etc.).
Problem: each O_i lives in a manifold.
Question: how to compute a mean? And the population's normal variability?
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 59 / 82 3 3 ∃ z : R → R an unobserved deformation field 3 a continuously defined template It : R → R 2 a Gaussian centred white noise of variance σt such that
Each observation y belongs to an unknown component t Conditional on the image membership to component t,
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
Observation model :
Statistical Questions
BME Template model
n Population of n grey level images y1
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 60 / 82 3 3 ∃ z : R → R an unobserved deformation field 3 a continuously defined template It : R → R 2 a Gaussian centred white noise of variance σt such that
Each observation y belongs to an unknown component t Conditional on the image membership to component t,
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
Statistical Questions
BME Template model
n Population of n grey level images y1 Observation model :
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 60 / 82 3 3 ∃ z : R → R an unobserved deformation field 3 a continuously defined template It : R → R 2 a Gaussian centred white noise of variance σt such that
Conditional on the image membership to component t,
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
Statistical Questions
BME Template model
n Population of n grey level images y1 Observation model : Each observation y belongs to an unknown component t
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 60 / 82 3 3 ∃ z : R → R an unobserved deformation field 3 a continuously defined template It : R → R 2 a Gaussian centred white noise of variance σt such that
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
Statistical Questions
BME Template model
n Population of n grey level images y1 Observation model : Each observation y belongs to an unknown component t Conditional on the image membership to component t,
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 60 / 82 3 a continuously defined template It : R → R 2 a Gaussian centred white noise of variance σt such that
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
Statistical Questions
BME Template model
n Population of n grey level images y1 Observation model : Each observation y belongs to an unknown component t Conditional on the image membership to component t, 3 3 ∃ z : R → R an unobserved deformation field
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 60 / 82 2 a Gaussian centred white noise of variance σt such that
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
Statistical Questions
BME Template model
n Population of n grey level images y1 Observation model : Each observation y belongs to an unknown component t Conditional on the image membership to component t, 3 3 ∃ z : R → R an unobserved deformation field 3 a continuously defined template It : R → R
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 60 / 82 such that
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
Statistical Questions
BME Template model
n Population of n grey level images y1 Observation model : Each observation y belongs to an unknown component t Conditional on the image membership to component t, 3 3 ∃ z : R → R an unobserved deformation field 3 a continuously defined template It : R → R 2 a Gaussian centred white noise of variance σt
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 60 / 82 Statistical Questions
BME Template model
n Population of n grey level images y1 Observation model : Each observation y belongs to an unknown component t Conditional on the image membership to component t, 3 3 ∃ z : R → R an unobserved deformation field 3 a continuously defined template It : R → R 2 a Gaussian centred white noise of variance σt such that
y(s) = It (xs − z(xs )) + t (s) = z . It (s) + (s) ,
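A one-dimensional analogue of this observation model can be simulated directly; the template, deformation field, and noise level below are illustrative choices, not the ones used in the slides:

```python
import numpy as np

# 1-D analogue of y(s) = I_t(x_s - z(x_s)) + eps_t(s): the template is read
# at grid points displaced by an unobserved deformation field, then corrupted
# by centred Gaussian noise of variance sigma_t^2.

def template(x):                       # illustrative bump-shaped template I_t
    return np.exp(-(x - 0.5) ** 2 / 0.02)

def deformation(x):                    # illustrative smooth field z
    return 0.05 * np.sin(2 * np.pi * x)

rng = np.random.default_rng(0)
x_s = np.linspace(0.0, 1.0, 100)       # voxel grid
sigma_t = 0.05
eps = sigma_t * rng.standard_normal(x_s.size)
y = template(x_s - deformation(x_s)) + eps
```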
Statistical Questions

BME Template model

Template and deformation model:

Spline model: I_t ∈ V_p, an RKHS with kernel K_p. Given control points (p_k)_{1≤k≤k_p}, there exists α_t ∈ R^{k_p} such that

I_t(x) = (K_p α_t)(x) = Σ_{k=1}^{k_p} K_p(x, p_k) α_t(k).

Same spline model for the deformation: z ∈ V_g, an RKHS with kernel K_g. Given (g_k)_{1≤k≤k_g}, there exists (β^(1), β^(2)) ∈ R^{k_g} × R^{k_g} such that

z(x) = (K_g β)(x) = Σ_{k=1}^{k_g} K_g(x, g_k) (β^(1)(k), β^(2)(k)).
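This finite-dimensional spline parametrisation can be sketched in 1-D as follows; the Gaussian kernel, control points and coefficients are hypothetical choices (the slides only assume some RKHS kernels K_p and K_g):

```python
import numpy as np

def gaussian_kernel(x, centers, scale=0.15):
    """Illustrative RBF kernel; the slides only assume kernels K_p, K_g."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * scale ** 2))

# Template: I_t(x) = sum_k K_p(x, p_k) alpha_t(k), with hypothetical values
p = np.linspace(0.0, 1.0, 8)           # control points (p_k)
alpha_t = np.sin(2 * np.pi * p)        # coefficients alpha_t
x = np.linspace(0.0, 1.0, 200)
I_t = gaussian_kernel(x, p) @ alpha_t

# Deformation: same finite-dimensional spline form, one coefficient vector
# per spatial coordinate (beta^(1), beta^(2) in 2-D)
g = np.linspace(0.0, 1.0, 5)           # control points (g_k)
beta = np.stack([0.03 * np.ones(5), -0.02 * np.ones(5)], axis=1)  # (k_g, 2)
z = gaussian_kernel(x, g) @ beta       # z(x) in R^2 for each x
```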
Statistical Questions

BME Template model

Hierarchical generative model:

ρ ∼ ν_ρ   (weights prior)
θ_t = (α_t, σ_t^2, Γ_g^t), 1 ≤ t ≤ T, ∼ ⊗_{t=1}^T (ν_p ⊗ ν_g)   (parameters priors)
τ_1^n ∼ ⊗_{i=1}^n Σ_{t=1}^T ρ_t δ_t | ρ   (pick labels for images)
β_1^n ∼ ⊗_{i=1}^n N(0, Γ_g^{τ_i}) | τ_1^n   (draw deformations from τ_1^n)
y_1^n ∼ ⊗_{i=1}^n N(z_{β_i} I_{α_{τ_i}}, σ_{τ_i}^2 Id_Λ) | β_1^n, τ_1^n   (draw images from β_1^n and θ_τ)

+ weakly informative priors
Statistical Questions
BME Template model
Statistical Questions
MCMC-SAEM algorithm
How to learn the parameters? The MAP estimator:

The parameters θ are estimated by maximising the posterior likelihood:

θ̂ = arg max_θ P(θ | y),

where θ ∈ Θ = { (α, σ^2, Γ_g) | α ∈ R^{k_p}, σ^2 > 0, Γ_g ∈ Sym^+_{2k_g,*}(R) }, and Sym^+_{2k_g,*}(R) is the set of positive definite symmetric matrices.

Let Θ* = { θ* ∈ Θ | E_P(log q(y | θ*)) = sup_{θ∈Θ} E_P(log q(y | θ)) }, where P denotes the distribution governing the observations.
Statistical Questions
MCMC-SAEM algorithm
How to do it in practice?

Since β_1^n are unobserved variables, a natural approach to reach the MAP estimator is the EM algorithm. Iteration l of the algorithm:

E step: compute the posterior law of β_i, i = 1, ..., n.
M step: parameter update,

θ_{l+1} = arg max_θ E[ log q(θ, β_1^n, y_1^n) | y_1^n, θ_l ].

BUT: the E step is not tractable!
Statistical Questions
MCMC-SAEM algorithm
E step, first solution proposed: fast approximation with modes.

E step: ν_{i,k}(dβ_i) ≃ δ_{β_i*}, for all i = 1, ..., n, where β_i* maximises the conditional distribution on β with the current parameters:

β_i* = arg max_β log q(β | X_i; θ_k).

M step: the parameter update uses the "completed observations":

θ_{k+1} = arg max_θ log q((β*)_1^n, X_1^n; θ).
Statistical Questions
MCMC-SAEM algorithm
Details of the maximisation step:

Geometry:

θ_{g,l+1} = Γ_{g,l+1} = (n [ββ^t]_l + a_g Σ_g) / (n + a_g),

where

[ββ^t]_l = (1/n) Σ_{i=1}^n ∫ ββ^t ν_{l,i}(β) dβ

is the empirical covariance matrix with respect to the posterior density function. → Importance of the prior!
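The geometry update is a weighted combination of the empirical posterior covariance and the prior matrix Σ_g. A small sketch with placeholder inputs shows why the prior term keeps the update positive definite even with few samples:

```python
import numpy as np

def update_gamma(beta_cov_emp, sigma_g, n, a_g):
    """Gamma_{g,l+1} = (n * [beta beta^t]_l + a_g * Sigma_g) / (n + a_g)."""
    return (n * beta_cov_emp + a_g * sigma_g) / (n + a_g)

rng = np.random.default_rng(0)
betas = rng.standard_normal((10, 4))        # only n = 10 posterior samples
emp = betas.T @ betas / len(betas)          # empirical covariance [beta beta^t]
gamma_new = update_gamma(emp, np.eye(4), n=10, a_g=5)

# Even if `emp` were singular (few samples), the prior term a_g * Sigma_g
# keeps the update positive definite -- hence the importance of the prior.
print(np.linalg.eigvalsh(gamma_new).min() > 0)  # True
```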
Statistical Questions
MCMC-SAEM algorithm
Details of the maximisation step:

Photometry:

α_l = ( n [K_p^{β t} K_p^β] + σ^2 Σ_p^{−1} )^{−1} ( n [K_p^{β t} Y] + σ^2 Σ_p^{−1} μ_p ),

σ_l^2 = ( n ( [Y^t Y] + α_l^t [K_p^{β t} K_p^β] α_l − 2 α_l^t [K_p^{β t} Y] ) + a_p σ_0^2 ) / ( n|Λ| + a_p ).
Statistical Questions
MCMC-SAEM algorithm
E step, first solution proposed: fast approximation with modes.

ν_{i,l}(dβ_i) = δ_{β_i*}, where β_i* maximises the conditional distribution on β for each component:

β_{i,τ}* = arg max_β log q(β | α_{l,τ}, σ_{l,τ}, Γ_{g,l,τ}, y_i, τ)
         = arg min_β { (1/2) β^t (Γ_{g,l,τ})^{−1} β + (1/(2σ_{l,τ}^2)) |y_i − K_p^β α_{l,τ}|^2 }.

Then maximise the conditional distribution on τ given the β_i*, and pick β_{i,τ*}*.
Statistical Questions
MCMC-SAEM algorithm
Advantages and drawbacks:

Computation of β_i*: standard gradient descent.
Reduces the EM algorithm to an iterative maximisation of the joint density.
Highly sensitive to noise (see the experiments).
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 70 / 82 Continuous function : need to be stored
Statistical Questions MCMC-SAEM algorithm
Our solution : MCMC- Stochastic Approximation EM algorithm :
Iteration k → k + 1 of the algorithm : k+1 l Simulation step : β ∼ Πθl (β , ·) k where Πθk (β , ·) is a transition probability of a convergent Markov Chain having the posterior distribution as stationary distribution, Stochastic approximation :
k+1 Qk+1(θ) = Qk (θ) + ∆k [log q(X , β ; θ) − Qk (θ)]
where (∆k ) is a decreasing sequence of positive step-sizes.
Maximisation step : θk+1 = arg max Qk+1(θ) l [∗]Πθk (β , ·) given by different samplers.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 71 / 82 Statistical Questions MCMC-SAEM algorithm
Our solution : MCMC- Stochastic Approximation EM algorithm :
Iteration k → k + 1 of the algorithm : k+1 l Simulation step : β ∼ Πθl (β , ·) k where Πθk (β , ·) is a transition probability of a convergent Markov Chain having the posterior distribution as stationary distribution, Stochastic approximation : Continuous function : need to be stored
k+1 Qk+1(θ) = Qk (θ) + ∆k [log q(X , β ; θ) − Qk (θ)]
where (∆k ) is a decreasing sequence of positive step-sizes.
Maximisation step : θk+1 = arg max Qk+1(θ) l [∗]Πθk (β , ·) given by different samplers.
St´ephanieAllassonni`ere (Descartes) Computational Anatomy September 2018 71 / 82 Statistical Questions MCMC-SAEM algorithm
Our solution : MCMC- Stochastic Approximation EM algorithm :
Iteration k → k + 1 of the algorithm : k+1 l Simulation step : β ∼ Πθl (β , ·) k where Πθk (β , ·) is a transition probability of a convergent Markov Chain having the posterior distribution as stationary distribution, Stochastic approximation : Continuous function : need to be stored
k+1 Qk+1(θ) = Qk (θ) + ∆k [log q(X , β ; θ) − Qk (θ)]
where (∆k ) is a decreasing sequence of positive step-sizes.
Maximisation step : θk+1 = arg max Qk+1(θ) l [∗]Πθk (β , ·) given by different samplers.
Statistical Questions
MCMC-SAEM algorithm

Our solution, the MCMC-SAEM algorithm (2):

But all our models belong to the exponential family,

q(X, β^{k+1}; θ) = exp{ −ψ(θ) + ⟨S(X, β), φ(θ)⟩ },

so the stochastic approximation can be carried on the sufficient statistics,

s_{k+1} = s_k + Δ_k ( S(X, β^{k+1}) − s_k ),

and there exists θ̂ (independent of k) such that

θ_{k+1} = θ̂(s_{k+1}).

A very simple algorithm!
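Because the model is in the exponential family, the stochastic approximation runs on a finite-dimensional sufficient statistic s_k instead of the function Q_k. A toy Gaussian illustration, with an illustrative closed-form map θ̂(s):

```python
import numpy as np

def theta_hat(s):
    """Closed-form M step for a N(mu, v) model, s = (mean of x, mean of x^2)."""
    mu = s[0]
    v = s[1] - s[0] ** 2
    return mu, v

rng = np.random.default_rng(0)
s = np.zeros(2)
for k in range(1000):
    x = rng.normal(3.0, 2.0, size=20)          # simulated "completed" data
    S = np.array([x.mean(), (x ** 2).mean()])  # sufficient statistics S(X, beta)
    s = s + (1.0 / (k + 1)) * (S - s)          # s_{k+1} = s_k + Delta_k (S - s_k)
mu, v = theta_hat(s)                           # close to (3, 4)
```

Only the two numbers in s are carried between iterations, which is what makes the algorithm so simple.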
Statistical Questions
MCMC-SAEM algorithm

Theoretical results:

With these models and algorithms we have proved some important asymptotic results.

Conditions: smoothness of the model (the classic conditions for convergence of stochastic approximation and of EM); in the case of the AMALA sampler, a condition ensuring its geometric ergodicity.

Results:
Convergence of (s_k) a.s. towards a critical point of the mean field of the problem.
Convergence of the estimated parameters (θ_k) a.s. towards a critical point of the observed likelihood.
A central limit theorem for (θ_k).
Statistical Questions
Experiments
Training sets
Figure – Left : Training set (inverse video). Right : Noisy training set (inverse video).
Statistical Questions
Experiments
MCMC-SAEM algorithm:

[Figure grid — rows: no noise; noisy, variance 1. Columns: FAM-EM, H.G.-SAEM, AMALA-SAEM.]

Figure – Estimated templates using the different algorithms and two levels of noise. The training set includes 20 images per digit.
Statistical Questions
Experiments
Estimated geometric variability

Figure – Synthetic samples generated with respect to the BME template model. Left: no noise. Right: noisy data.

Statistical Questions
Experiments
Figure – Evolution of the estimation of the noise variance along the SAEM iterations. Test of convergence towards the Gaussian distribution of the estimated parameters.
Statistical Questions
Experiments
Classification rates:

Error rate        | "EM-Mode" est. | SAEM-MCMC est.
Mode classifier   | 40.71          | 22.52
MCMC classifier   | –              | 17.07

Table – Error rates with respect to the estimation and classification methods.
Statistical Questions
Experiments
3D dendrite spines :
Figure – Estimated template with the one component model : Left : 3D representation of the grey level volume. Right : 3D representation of the thresholded volume.
Statistical Questions
Experiments
3D dendrite spines :
Figure – Estimated templates of the two components with the 30 image training set : 3D representation after thresholding.
Statistical Questions
Experiments
3D dendrite spines :
Figure – 3D view of eight synthetic data samples. The estimated template shown in Figure 7 is randomly deformed according to the estimated covariance matrix; the results are then thresholded in order to get a binary volume.
One step further
One step further ?
Models of longitudinal data !