
Wavefront sensing for adaptive optics
Marcos van Dam & Richard Clare, W.M. Keck Observatory

Acknowledgments

Wilson Mizner: "If you steal from one author, it's plagiarism; if you steal from many, it's research."

Thanks to: Richard Lane, Lisa Poyneer, Gary Chanan, Jerry Nelson

Outline

Wavefront sensing:
- Shack-Hartmann
- Pyramid
- Curvature
- Phase retrieval
- Gerchberg-Saxton algorithm
- Phase diversity

Properties of a wave-front sensor

Localization: the measurements should relate to a region of the aperture.

Linearization: want a linear relationship between the wave-front and the intensity measurements.

Broadband: the sensor should operate over a wide range of wavelengths.

=> Geometric Optics regime

BUT: very suboptimal (see the talk by Guyon on Friday)

Effect of the wave-front slope

A slope in the wave-front causes an incoming photon to be displaced by $\Delta x = z W_x$.
There is a linear relationship between the mean slope of the wave-front and the displacement of the image.
The displacement is wavelength-independent.
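As a worked illustration (the numbers here are chosen for concreteness, not taken from the slides): a mean slope of $W_x = 1\,\mu\mathrm{rad}$ propagated over $z = 1\,\mathrm{m}$ displaces the spot by $\Delta x = z W_x = 1\,\mu\mathrm{m}$, regardless of the wavelength.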

[Diagram: wave-front W(x) propagating over a distance z, producing a displacement x]

Shack-Hartmann

The aperture is subdivided using a lenslet array. Spots are formed underneath each lenslet. The displacement of the spot is proportional to the wave-front slope (a processing sketch follows below).
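A minimal sketch of Shack-Hartmann processing (illustrative code, not the lecture's implementation; the function name and the square-grid layout are assumptions): split the detector frame into subapertures and compute a centre-of-mass position in each one. Converting the pixel displacement to a physical slope would additionally use the plate scale of the lenslets, which is omitted here.

```python
import numpy as np

def subaperture_centroids(frame, n_sub, pix_per_sub):
    """Per-subaperture spot positions (in pixels, relative to each subaperture centre)."""
    spots = np.zeros((n_sub, n_sub, 2))
    centre = (pix_per_sub - 1) / 2
    y, x = np.mgrid[0:pix_per_sub, 0:pix_per_sub]
    for i in range(n_sub):
        for j in range(n_sub):
            sub = frame[i * pix_per_sub:(i + 1) * pix_per_sub,
                        j * pix_per_sub:(j + 1) * pix_per_sub]
            total = sub.sum()
            if total > 0:
                spots[i, j, 0] = (sub * x).sum() / total - centre   # x displacement
                spots[i, j, 1] = (sub * y).sum() / total - centre   # y displacement
    return spots
```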

Shack-Hartmann spots

[Figure: Shack-Hartmann spot pattern for 45-degree astigmatism]

Typical vision science WFS

[Diagram: lenslet array imaged onto a CCD, with many pixels per subaperture]

Typical astronomy WFS

Former Keck AO wave-front sensor

[Diagram: 200 μm lenslets, CCD, 3.15x reduction relay; 2 mm, 21 pixels, 3x3 pixels/subaperture]

Centroiding

The performance of the Shack-Hartmann sensor depends on how well the displacement of the spot is estimated.

The displacement is usually estimated using the centroid (center-of-mass) estimator:

$s_x = \frac{\sum x\, I(x, y)}{\sum I(x, y)}, \qquad s_y = \frac{\sum y\, I(x, y)}{\sum I(x, y)}$

This is the optimal estimator when the spot is Gaussian distributed and the noise is Poisson. A minimal implementation sketch follows below.
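An illustrative sketch of the centroid estimator (not from the lecture; placing the coordinate origin at the array centre is an assumption):

```python
import numpy as np

def centroid(I):
    """Centre-of-mass spot position, in pixels relative to the array centre."""
    ny, nx = I.shape
    x = np.arange(nx) - (nx - 1) / 2        # pixel coordinates, origin at the centre
    y = np.arange(ny) - (ny - 1) / 2
    total = I.sum()
    s_x = (I * x[None, :]).sum() / total
    s_y = (I * y[:, None]).sum() / total
    return s_x, s_y
```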

G-tilt vs Z-tilt

The centroid gives the mean slope of the wavefront (G-tilt).

However, we usually want the least-mean-squares slope (Z-tilt).

Centroiding noise

Due to read noise and dark current, all pixels are noisy.

Pixels far from the center of the subaperture are multiplied by a large number in

$s_x = \frac{\sum x\, I(x, y)}{\sum I(x, y)}, \qquad x \in \{\ldots, -3, -2, -1, 0, 1, 2, 3, \ldots\}$

The more pixels you have, the noisier the centroid estimate!

Weighted centroid

The noise can be reduced by windowing the centroid.

Can use a square window, a circular window:

Or better still, a tapered window, w(x,y)

$s_x = \frac{\sum x\, w(x, y)\, I(x, y)}{\sum w(x, y)\, I(x, y)}, \qquad s_y = \frac{\sum y\, w(x, y)\, I(x, y)}{\sum w(x, y)\, I(x, y)}$
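A sketch of the weighted (windowed) centroid under the same assumptions as the centroid sketch above; the window w(x, y), e.g. a Gaussian taper centred on the spot, is supplied by the caller:

```python
import numpy as np

def weighted_centroid(I, w):
    """Windowed centre of mass; w(x, y) is a taper centred on the expected spot position."""
    ny, nx = I.shape
    x = np.arange(nx) - (nx - 1) / 2
    y = np.arange(ny) - (ny - 1) / 2
    wI = w * I
    total = wI.sum()
    s_x = (wI * x[None, :]).sum() / total
    s_y = (wI * y[:, None]).sum() / total
    return s_x, s_y
```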

Correlation (matched filtering)

Find the displacement of the image that gives the maximum correlation:

$(s_x, s_y) = \arg\max_{(x, y)} \; (w \star I)(x, y)$

Use the FFT or quadratic interpolation to find the sub-pixel maximum of the correlation.

- Noise is independent of the number of pixels
- Much better noise performance when there are many pixels
- Estimate is independent of uniform background errors
- Estimate is relatively insensitive to the assumed image

A correlation-centroiding sketch follows below.
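A minimal sketch of correlation centroiding (illustrative, not the lecture's implementation): cross-correlate the subaperture image with a reference template via FFTs, then refine the peak position with quadratic (parabolic) interpolation.

```python
import numpy as np

def correlation_shift(I, template):
    """Spot displacement (dx, dy) in pixels from the peak of the cross-correlation."""
    # Circular cross-correlation via FFTs
    corr = np.real(np.fft.ifft2(np.fft.fft2(I) * np.conj(np.fft.fft2(template))))
    corr = np.fft.fftshift(corr)
    ny, nx = corr.shape
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)

    def parabolic_offset(c_minus, c_0, c_plus):
        # Sub-pixel peak offset from a parabola fitted through three samples
        return 0.5 * (c_minus - c_plus) / (c_minus - 2 * c_0 + c_plus)

    dx = parabolic_offset(corr[iy, ix - 1], corr[iy, ix], corr[iy, (ix + 1) % nx])
    dy = parabolic_offset(corr[iy - 1, ix], corr[iy, ix], corr[(iy + 1) % ny, ix])
    return ix + dx - nx // 2, iy + dy - ny // 2
```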

Quad cells

In astronomy, wave-front slope measurements are often made using a quad cell (2x2 pixels). Quad cells are faster to read out, faster to compute the centroid from, and less sensitive to noise.

$s_x = \frac{I_1 + I_2 - I_3 - I_4}{I_1 + I_2 + I_3 + I_4}, \qquad s_y = \frac{I_1 - I_2 + I_3 - I_4}{I_1 + I_2 + I_3 + I_4}$
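A small sketch of the quad-cell slope estimate (illustrative; the pairing of the four intensities follows the formula above, and the assignment of pixels to quadrants depends on detector orientation):

```python
def quad_cell_slopes(I1, I2, I3, I4):
    """Slope estimates from the four quad-cell intensities."""
    total = I1 + I2 + I3 + I4
    s_x = (I1 + I2 - I3 - I4) / total
    s_y = (I1 - I2 + I3 - I4) / total
    return s_x, s_y
```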

Quad cells

- The centroid is only linear with displacement over a small region (small dynamic range)
- The centroid is proportional to the spot size

[Plot: centroid vs. displacement for different spot sizes]

Denominator-free centroiding

When the photon flux is very low, noise in the denominator increases the centroid error. The centroid error can be reduced by using the average value of the denominator (a sketch follows after the formula below).

$s_x = \frac{I_1 + I_2 - I_3 - I_4}{E[I_1 + I_2 + I_3 + I_4]}, \qquad s_y = \frac{I_1 - I_2 + I_3 - I_4}{E[I_1 + I_2 + I_3 + I_4]}$
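A corresponding sketch (illustrative): the only change from the quad-cell estimator above is that the noisy instantaneous sum is replaced by the expected total flux.

```python
def quad_cell_slopes_fixed_denominator(I1, I2, I3, I4, expected_flux):
    """Quad-cell slopes normalized by the expected (average) total flux."""
    s_x = (I1 + I2 - I3 - I4) / expected_flux
    s_y = (I1 - I2 + I3 - I4) / expected_flux
    return s_x, s_y
```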

Laser guide star elongation

Shack-Hartmann subapertures see a line, not a spot.

LGS elongation at Keck

[Image: Keck LGS Shack-Hartmann spots, with the laser projected from the right]

A possible solution for LGS elongation

- Radial format CCD: arrange the pixels to be at the same angle as the elongated spots
- Currently testing this design for TMT

Pyramid wave-front sensor

[Diagram: light from the aperture plane comes to a focus on a pyramid (glass prism) at the focal plane; a lens re-images the aperture, forming images of the aperture (conjugate aperture plane)]

Pyramid wave-front sensor

Similar to the Shack-Hartmann using quad cells: it measures the average slope over a subaperture. The subdivision occurs at the image plane, not the pupil plane. The local slope determines which aperture image receives the light.

Pyramid wave-front sensor non-linearity

When the aberrations are large, the pyramid sensor is very non-linear.

[Example: a large focus aberration and the resulting 4 pupil images, from which the x- and y-slope estimates are computed; a sketch of the slope computation follows below]
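A minimal sketch of slope estimation from the four pupil images (illustrative; the pixel-wise pairing mirrors the quad-cell formula above, and the sign convention in practice depends on the pyramid orientation):

```python
import numpy as np

def pyramid_slopes(pupils):
    """Per-pixel slope maps from the four re-imaged pupils of a pyramid sensor.

    pupils: array of shape (4, ny, nx); each slice is one pupil image.
    """
    I1, I2, I3, I4 = pupils
    total = I1 + I2 + I3 + I4
    s_x = (I1 + I2 - I3 - I4) / total
    s_y = (I1 - I2 + I3 - I4) / total
    return s_x, s_y
```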

Modulation of pyramid sensor

- Without modulation: linear over the spot width
- With modulation: linear over the modulation width

Pyramid + lens = 2x2 lenslet array

[Diagram: pyramid + relay lens produces the same four pupil images as a 2x2 lenslet array]

Duality between Shack-Hartmann and pyramid

- Shack-Hartmann: high-resolution subdivision of the aperture; low-resolution images of the object
- Pyramid: high-resolution subdivision of the image of the object; low-resolution images of the aperture

Duality between Shack-Hartmann and pyramid

[Diagrams comparing the aperture and focal planes of the Shack-Hartmann and the pyramid]

- Pixels in the Shack-Hartmann = lenslets in the pyramid
- Lenslets in the Shack-Hartmann = pixels in the pyramid

Multi-sided prisms

- The pyramid uses a 4-sided glass prism at the focal plane to generate 4 aperture images
- Any N-sided prism can be used to produce N aperture images
- The limit as N tends to infinity gives the "cone" sensor

[Diagram: aperture, cone prism, relay lens, aperture image]

Curvature sensing

[Diagram: wave-front at the aperture, with Image 2 recorded at distance -z and Image 1 at distance +z from the aperture]

Curvature sensing

Localization comes from the short effective propagation distance,

$z = \frac{f (f - l)}{l}$

There is a linear relationship between the curvature in the aperture and the normalized intensity difference. Broadband light helps reduce diffraction effects.

[Diagram: aperture, lens of focal length f, and defocused images I1 and I2 a distance l on either side of focus]

Curvature sensing

Using the irradiance transport equation,

$\frac{\partial I}{\partial z} = -\left(I\,\nabla^2 W + \nabla I \cdot \nabla W\right)$

where I is the intensity, W is the wave-front and z is the direction of propagation, we obtain a linear, first-order approximation,

$\frac{I_1 - I_2}{I_1 + I_2} = z\,\nabla^2 W + z\,\nabla W \cdot \frac{\nabla I}{I}$

which is a Poisson equation with Neumann boundary conditions (a reconstruction sketch follows below).
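A minimal sketch (not the lecture's code) of recovering W from the curvature signal by solving the Poisson equation on a regular grid; for simplicity it assumes homogeneous (zero-slope) Neumann boundaries, which a cosine transform handles implicitly:

```python
import numpy as np
from scipy.fft import dctn, idctn

def reconstruct_wavefront(signal, z, dx=1.0):
    """Solve z * laplacian(W) = signal for W, up to an unobservable piston term."""
    curv = signal / z                        # laplacian(W) from the normalized difference
    ny, nx = curv.shape
    c_hat = dctn(curv, type=2, norm="ortho")
    ky = np.arange(ny)[:, None]
    kx = np.arange(nx)[None, :]
    # Eigenvalues of the discrete Laplacian for cosine (Neumann) modes
    eig = (2 * np.cos(np.pi * ky / ny) - 2 + 2 * np.cos(np.pi * kx / nx) - 2) / dx**2
    eig[0, 0] = 1.0                          # avoid dividing by zero for the piston mode
    w_hat = c_hat / eig
    w_hat[0, 0] = 0.0                        # piston is unobservable; set it to zero
    return idctn(w_hat, type=2, norm="ortho")
```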

Solution at the boundary

If the intensity is constant at the aperture,

$\frac{I_1 - I_2}{I_1 + I_2} = \frac{H(x - R - zW_x) - H(x - R + zW_x)}{H(x - R - zW_x) + H(x - R + zW_x)}$

[Images: I1, I2, and the difference I1 - I2]

Solution inside the boundary

$\frac{I_1 - I_2}{I_1 + I_2} = z\,(W_{xx} + W_{yy})$

- There is a linear relationship between the signal and the curvature
- The sensor is more sensitive for large effective propagation distances

Curvature sensing

As the propagation distance, z, increases:

- Sensitivity increases
- Spatial resolution decreases
- Diffraction effects increase
- The relationship between the signal, $(I_1 - I_2)/(I_1 + I_2)$, and the curvature, $W_{xx} + W_{yy}$, becomes non-linear

The Subaru AO system will use two different propagation distances: a large distance for high sensitivity and a short distance for high spatial resolution.

Curvature sensing

The practical implementation uses a variable curvature mirror (to obtain images below and above the aperture) and a single detector.

Curvature sensor subapertures

- Measure the intensity in each subaperture with an avalanche photodiode (APD)
- Detect individual photons (no read noise)

Wavefront sensing from defocused images

There are more accurate, non-linear algorithms to reconstruct the wavefront from defocused images with many pixels.

[Figure: defocused images, and the true and reconstructed wave-fronts]

Phase retrieval

Suppose we have an image and knowledge about the pupil.

Can we find the phase, $\phi(x)$, that resulted in this image?

Phase retrieval

The image is insensitive to:

- Addition of a constant to $\phi(x)$, which does not affect the image
- Addition of a multiple of $2\pi$ to any point of $\phi(x)$ (phase wrapping)
- Replacing $\phi(x)$ by $-\phi(-x)$ if the amplitude is symmetrical (e.g., positive and negative defocused images look identical)

This is called the phase ambiguity problem.

Gerchberg-Saxton algorithm
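A minimal sketch of the Gerchberg-Saxton iteration (illustrative code, not the lecture's implementation): alternate between the pupil and focal planes, each time imposing the known pupil amplitude and the measured image amplitude while keeping the current phase estimate.

```python
import numpy as np

def gerchberg_saxton(image, pupil_amp, n_iter=200):
    """Recover the pupil phase from a focal-plane image and a known pupil amplitude.

    image: measured focal-plane intensity, centred on the array;
    pupil_amp: known aperture amplitude (e.g. a binary mask), same shape.
    """
    focal_amp = np.sqrt(np.fft.ifftshift(image))      # amplitude constraint, FFT ordering
    phase = np.zeros(pupil_amp.shape)                 # initial pupil-phase guess
    for _ in range(n_iter):
        focal = np.fft.fft2(pupil_amp * np.exp(1j * phase))   # pupil -> focal plane
        focal = focal_amp * np.exp(1j * np.angle(focal))      # impose the measured amplitude
        pupil = np.fft.ifft2(focal)                           # focal -> pupil plane
        phase = np.angle(pupil)                               # impose the known pupil amplitude
    return phase
```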

Phase diversity

- Phase retrieval suffers from phase ambiguity, slow convergence, algorithm stagnation and sensitivity to noise
- These problems can be overcome by taking two or more images with a known phase difference between them
- In AO, introduce defocus by moving a calibration source

[Figure: phase diversity images at defocus settings of +2 mm, -2 mm and -4 mm]

[Figure: poked actuators: minus-poke phase, plus-poke phase, and the difference]

[Figure: theoretical diffraction-limited image and measured image]

Mahalo!