CPSC 425

Lecture 12: Corner Detection

Image Credit: https://en.wikipedia.org/wiki/Corner_detection

(Unless otherwise stated, slides are taken or adapted from Bob Woodham, Jim Little, and Fred Tung.)

Menu for Today (October 1, 2018)

Topics:
— Corner Detection
— Harris Corner Detector
— Autocorrelation

Readings:

— Today’s Lecture: Forsyth & Ponce (2nd ed.) 5.3.0 - 5.3.1
— Next Lecture: Forsyth & Ponce (2nd ed.) 6.1, 6.3

Reminders:
— Assignment 2: Face Detection in a Scaled Representation is due October 10th

Today’s “fun” Example:
Lecture 11: Re-cap

Physical properties of a 3D scene cause “edges” in an image:
— depth discontinuity
— surface orientation discontinuity
— reflectance discontinuity
— illumination boundaries

Two generic approaches to edge detection:
— local extrema of a first derivative operator → Canny
— zero crossings of a second derivative operator → Marr/Hildreth

Many algorithms consider “boundary detection” as a high-level recognition task and output a probability or confidence that a pixel is on a human-perceived boundary

Motivation: Template Matching

When might template matching fail?
— Different scales
— Partial occlusions
— Different orientation
— Different perspective
— Lighting conditions
— Motion / blur
— Left vs. right hand

Motivation: Template Matching in Scaled Representation

When might template matching in scaled representation fail?
— Different scales
— Partial occlusions
— Different orientation
— Different perspective
— Lighting conditions
— Motion / blur
— Left vs. right hand

Motivation: Edge Matching in Scaled Representation

When might edge matching in scaled representation fail?
— Different scales
— Partial occlusions
— Different orientation
— Different perspective
— Lighting conditions
— Motion / blur
— Left vs. right hand

Planar Object Instance Recognition

(Figure: database of planar objects; instance recognition)

Slide Credit: Ioannis (Yannis) Gkioulekas (CMU)

Recognition under Occlusion

Image Matching
Finding Correspondences

NASA Mars Rover images

What is a Good Feature?

Pick a point in the image. Find it again in the next image.

What is a corner?

Image Credit: John Shakespeare, Sydney Morning Herald

We can think of a corner as any locally distinct 2D image feature that (hopefully) corresponds to a distinct position on a 3D object of interest in the scene.

Why are corners distinct?

A corner can be localized reliably.

Thought experiment:

— “flat” region: Place a small window over a patch of constant image value. If you slide the window in any direction, the image in the window will not change: no change in all directions.

— “edge”: Place a small window over an edge. If you slide the window in the direction of the edge, the image in the window will not change: no change along the edge direction. Hence we cannot estimate location along an edge (a.k.a. the aperture problem).

— “corner”: Place a small window over a corner. If you slide the window in any direction, the image in the window changes: significant change in all directions.

Image Credit: Ioannis (Yannis) Gkioulekas (CMU)

How do you find a corner?

[Moravec 1980]

Easily recognized by looking through a small window

Shifting the window should give large change in intensity

Autocorrelation

Autocorrelation is the correlation of the image with itself.

— Windows centered on an edge point will have autocorrelation that falls off slowly in the direction along the edge and rapidly in the direction across (perpendicular to) the edge.

— Windows centered on a corner point will have autocorrelation that falls off rapidly in all directions.
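This falloff behaviour can be sketched numerically. The following is a minimal, hypothetical demo (not from the slides): it compares the sum of squared differences (SSD) between a window and a shifted copy of it on a synthetic vertical-edge image; the window size and shift amounts are arbitrary choices.

```python
import numpy as np

# Synthetic image with a vertical edge at column 10.
img = np.zeros((20, 20))
img[:, 10:] = 1.0

def ssd(image, cy, cx, dy, dx, half=3):
    """SSD between the window centered at (cy, cx) and the same window
    shifted by (dy, dx)."""
    w0 = image[cy - half:cy + half + 1, cx - half:cx + half + 1]
    w1 = image[cy + dy - half:cy + dy + half + 1,
               cx + dx - half:cx + dx + half + 1]
    return float(np.sum((w0 - w1) ** 2))

# Window centered on the edge: no change along the edge, large change across it.
along = ssd(img, 10, 10, dy=2, dx=0)   # shift along the edge
across = ssd(img, 10, 10, dy=0, dx=2)  # shift across the edge
print(along, across)                   # along is 0; across is large
```

For a window centered on a corner, both shifts would give a large SSD, which is exactly the falloff pattern described above.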

(Figure: Szeliski, Figure 4.5)
Corner Detection

Edge detectors perform poorly at corners.

Observations:
— The gradient is ill-defined exactly at a corner
— Near a corner, the gradient has two (or more) distinct values

Harris Corner Detection

1. Compute image gradients over a small region
2. Compute the covariance matrix (a.k.a. 2nd moment matrix)
3. Compute eigenvectors and eigenvalues
4. Use a threshold on the eigenvalues to detect corners

Slide adapted from Ioannis (Yannis) Gkioulekas (CMU)

1. Compute image gradients over a small region (not just a single pixel)

array of x gradients

array of y gradients

Visualization of Gradients

(Figure: image, X derivative, Y derivative)
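Step 1 can be sketched as follows. This is a minimal illustration on a tiny synthetic ramp patch, assuming `np.gradient` as a stand-in for the smoothed derivative filters usually used in practice:

```python
import numpy as np

# A 5x5 patch whose intensity increases down the rows (a vertical ramp).
patch = np.outer(np.arange(5.0), np.ones(5))

# np.gradient returns derivatives along axis 0 (rows -> y) and axis 1 (cols -> x).
Iy, Ix = np.gradient(patch)

print(Ix)  # array of x gradients: all zeros (no horizontal change)
print(Iy)  # array of y gradients: all ones (constant vertical ramp)
```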

What Does a Distribution Tell You About the Region?

The distribution reveals the orientation and magnitude of the gradients.

How do we quantify the orientation and magnitude?

2. Compute the covariance matrix (a.k.a. 2nd moment matrix)

Sum over a small region P around the corner; each entry is a product of gradients (e.g., the gradient with respect to x times the gradient with respect to y):

C = [ Σ_{p∈P} Ix Ix    Σ_{p∈P} Ix Iy ]
    [ Σ_{p∈P} Iy Ix    Σ_{p∈P} Iy Iy ]

For example, the off-diagonal entry is sum(Ix .* Iy), computed elementwise from the array of x gradients and the array of y gradients.

The matrix is symmetric.

By computing the gradient covariance matrix, we are fitting a quadratic to the gradients over a small image region.
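Step 2 can be sketched as follows; the gradient arrays here are tiny illustrative stand-ins (a patch with purely horizontal change):

```python
import numpy as np

# Illustrative gradient arrays for a small patch:
Ix = np.ones((2, 2))    # strong, uniform horizontal change
Iy = np.zeros((2, 2))   # no vertical change

# Covariance (2nd moment) matrix: sums of gradient products over the patch.
C = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
              [np.sum(Iy * Ix), np.sum(Iy * Iy)]])
print(C)  # one large and one zero eigenvalue: the signature of an edge
```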

Simple Case

Consider a local image patch where Ix has a high value along a vertical strip of pixels and 0 elsewhere, and Iy has a high value along a horizontal strip of pixels and 0 elsewhere. The off-diagonal sums vanish and the covariance matrix is diagonal:

C = [ Σ_P Ix Ix    Σ_P Ix Iy ] = [ λ1   0  ]
    [ Σ_P Iy Ix    Σ_P Iy Iy ]   [ 0    λ2 ]

General Case

It can be shown that, since every C is symmetric:

C = [ Σ_P Ix Ix    Σ_P Ix Iy ] = R⁻¹ [ λ1   0  ] R
    [ Σ_P Iy Ix    Σ_P Iy Iy ]       [ 0    λ2 ]

… so the general case is like a rotated version of the simple one.

3. Computing Eigenvalues and Eigenvectors

Quick Eigenvalue/Eigenvector Review

Given a square matrix A, a scalar λ is called an eigenvalue of A if there exists a nonzero vector v that satisfies Av = λv.

The vector v is called an eigenvector for A corresponding to the eigenvalue λ.

The eigenvalues of A are obtained by solving det(A − λI) = 0.

3. Computing Eigenvalues and Eigenvectors

Ce = λe   ⟹   (C − λI)e = 0,  where λ is an eigenvalue and e an eigenvector of C.

1. Compute the determinant of C − λI (returns a polynomial)
2. Find the roots of the polynomial, det(C − λI) = 0 (returns eigenvalues)
3. For each eigenvalue λ, solve (C − λI)e = 0 (returns eigenvectors)

Example

C = [ 2  1 ]
    [ 1  2 ]

det [ 2−λ   1  ] = 0
    [ 1    2−λ ]

(2 − λ)(2 − λ) − (1)(1) = 0
λ² − 4λ + 3 = 0
(λ − 3)(λ − 1) = 0
λ1 = 1, λ2 = 3

Visualization as Quadratic

The quadratic fit to the gradients can be written in matrix form. Computing eigenvalues and eigenvectors of C (e.g., using SVD) returns the eigenvalues along the diagonal and the eigenvectors: each eigenvector gives an axis of the quadratic, and the corresponding eigenvalue gives the scaling of the quadratic “ellipse slice” along that axis.

Visualization as Ellipse

Since C is symmetric, we have C = R⁻¹ [ λ1  0 ; 0  λ2 ] R.

We can visualize C as an ellipse whose axis lengths are determined by the eigenvalues and whose orientation is determined by R.

Ellipse: the direction of the major axis is given by the eigenvector of the smaller eigenvalue, with length proportional to (λmin)⁻¹ᐟ²; the direction of the minor axis corresponds to the larger eigenvalue, with length proportional to (λmax)⁻¹ᐟ².
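The worked example and the symmetric factorization can be checked numerically; a short sketch using NumPy:

```python
import numpy as np

# The worked example: C = [[2, 1], [1, 2]] has eigenvalues 1 and 3.
C = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, R = np.linalg.eigh(C)   # eigh: for symmetric matrices, ascending order
print(lam)                   # [1. 3.]

# Since C is symmetric, it factors as R diag(lam) R^T:
# the "rotated version of the simple (diagonal) case".
C_rebuilt = R @ np.diag(lam) @ R.T
print(np.allclose(C, C_rebuilt))   # True
```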

Interpreting Eigenvalues

What kind of image patch does each region of the (λ1, λ2) plane represent?

— λ1 ≈ λ2, both large: corner
— λ2 ≫ λ1: ‘horizontal’ edge
— λ1 ≫ λ2: ‘vertical’ edge
— λ1 and λ2 both small: “flat” region
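The four Harris steps can be sketched end to end. This is an illustrative implementation under stated assumptions: a synthetic image with a single corner, a simple box window sum, and the common Harris response R = det(C) − k·trace(C)² (with k = 0.05) in place of explicit eigenvalue thresholding.

```python
import numpy as np

# Synthetic image: a bright square whose top-left corner is at (10, 10).
img = np.zeros((30, 30))
img[10:, 10:] = 1.0

# 1. Image gradients.
Iy, Ix = np.gradient(img)
Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

def box_sum(a, half=2):
    """Sum over a (2*half+1)^2 window around each interior pixel."""
    out = np.zeros_like(a)
    h, w = a.shape
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            out[half:-half, half:-half] += a[half + dy:h - half + dy,
                                             half + dx:w - half + dx]
    return out

# 2. Entries of the covariance matrix, summed over a small window.
Sxx, Syy, Sxy = box_sum(Ixx), box_sum(Iyy), box_sum(Ixy)

# 3./4. The Harris response det(C) - k*trace(C)^2 is large only when
# both eigenvalues are large, avoiding an explicit eigendecomposition.
k = 0.05
R = (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2

y, x = np.unravel_index(np.argmax(R), R.shape)
print(y, x)   # strongest response lands near the corner at (10, 10)
```

Edges give a negative response (one eigenvalue near zero makes det(C) small while the trace penalty dominates) and flat regions give zero, so only the corner survives a positive threshold.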