Wavefront sensing for adaptive optics


MARCOS VAN DAM & RICHARD CLARE

W.M. Keck Observatory

Acknowledgments

Wilson Mizner: "If you steal from one author it's plagiarism; if you steal from many it's research."

Thanks to: Richard Lane, Lisa Poyneer, Gary Chanan, Jerry Nelson

Outline

Wavefront sensing
Shack-Hartmann
Pyramid
Curvature
Phase retrieval
Gerchberg-Saxton algorithm
Phase diversity

Properties of a wave-front sensor

Localization: the measurements should relate to a region of the aperture.

Linearization: want a linear relationship between the wave-front and the intensity measurements.

Broadband: the sensor should operate over a wide range of wavelengths.

=> Geometric optics regime

BUT: very suboptimal (see talk by Guyon on Friday)

Effect of the wave-front slope

A slope in the wave-front causes an incoming photon to be displaced by

$$\Delta x = z\,W_x$$

There is a linear relationship between the mean slope of the wavefront and the displacement of an image.

Wavelength-independent.

[Figure: ray displaced by a wave-front slope, showing the wave-front W(x), the propagation distance z, and the displacement x]

Shack-Hartmann

The aperture is subdivided using a lenslet array.

Spots are formed underneath each lenslet.

The displacement of the spot is proportional to the wave-front slope.
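As a quick numerical check of the displacement relation (all values below are illustrative assumptions, not parameters of any particular instrument):

```python
# Minimal numerical check of the spot-displacement relation dx = z * W_x.
f_lenslet = 2.0e-3       # lenslet focal length, playing the role of z [m] (assumed)
wavefront_tilt = 1.0e-6  # wave-front height difference across one subaperture [m]
d_subap = 200e-6         # subaperture width [m] (assumed)

slope = wavefront_tilt / d_subap  # mean slope W_x [rad], small-angle regime
dx = f_lenslet * slope            # spot displacement on the detector [m]
print(f"slope = {slope:.2e} rad, spot displacement = {dx * 1e6:.1f} um")
```

Note that nothing in this relation depends on the wavelength, which is the broadband property claimed above.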

Shack-Hartmann spots

[Figure: Shack-Hartmann spot pattern produced by 45-degree astigmatism]

Typical vision science WFS

[Figure: lenslet array imaged directly onto a CCD; many pixels per subaperture]

Typical astronomy WFS

[Figure: former Keck AO WFS sensor: lenslet array, relay lens (3.15 reduction), and CCD; labels: 200 μm, 2 mm, 21 pixels, 3x3 pixels/subap]

Centroiding

The performance of the Shack-Hartmann sensor depends on how well the displacement of the spot is estimated.

The displacement is usually estimated using the centroid (center-of-mass) estimator:

$$s_x = \frac{\sum x\,I(x,y)}{\sum I(x,y)}, \qquad s_y = \frac{\sum y\,I(x,y)}{\sum I(x,y)}$$

This is the optimal estimator for the case where the spot is Gaussian distributed and the noise is Poisson.
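A minimal sketch of this estimator in Python, with pixel coordinates measured from the subaperture center (an assumed implementation, not any observatory's code):

```python
import numpy as np

def centroid(I):
    """Center-of-mass estimate of the spot position in a subaperture:
    s_x = sum(x * I) / sum(I),  s_y = sum(y * I) / sum(I),
    with pixel coordinates measured from the subaperture center."""
    ny, nx = I.shape
    x = np.arange(nx) - (nx - 1) / 2.0
    y = np.arange(ny) - (ny - 1) / 2.0
    X, Y = np.meshgrid(x, y)
    total = I.sum()
    return (X * I).sum() / total, (Y * I).sum() / total
```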

G-tilt vs. Z-tilt

The centroid gives the mean slope of the wavefront (G-tilt).

However, we usually want the least-mean-squares slope (Z-tilt).

Centroiding noise

Due to read noise and dark current, all pixels are noisy.

Pixels far from the center of the subaperture are multiplied by a large number:

$$s_x = \sum x\,I(x,y), \qquad x = \{\ldots, -3, -2, -1, 0, 1, 2, 3, \ldots\}$$

The more pixels you have, the noisier the centroid estimate!
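A small Monte Carlo run illustrates the effect. The sketch below assumes a Gaussian spot of fixed total flux and a read noise of 3 e- rms per pixel (both assumed values), and shows the x-centroid error growing as the subaperture gets more pixels:

```python
import numpy as np

rng = np.random.default_rng(0)
read_noise = 3.0  # e- rms per pixel (assumed)

def centroid_x(I):
    x = np.arange(I.shape[1]) - (I.shape[1] - 1) / 2.0
    return (I * x).sum() / I.sum()

# Gaussian spot with a fixed total flux; only the subaperture size varies.
for npix in (4, 8, 16, 32):
    x = np.arange(npix) - (npix - 1) / 2.0
    X, Y = np.meshgrid(x, x)
    spot = np.exp(-(X**2 + Y**2) / (2 * 1.5**2))
    spot *= 1000.0 / spot.sum()  # 1000 photons in the subaperture (assumed)
    trials = [centroid_x(spot + rng.normal(0.0, read_noise, spot.shape))
              for _ in range(2000)]
    print(f"{npix:2d}x{npix}: centroid rms = {np.std(trials):.3f} pixels")
```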

Weighted centroid

The noise can be reduced by windowing the centroid: use a square window or a circular window, or better still, a tapered window, w(x,y):

$$s_x = \sum x\,w(x,y)\,I(x,y), \qquad s_y = \sum y\,w(x,y)\,I(x,y)$$
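A sketch of the weighted centroid with a Gaussian taper; the window width sigma_w and the normalization by the windowed flux are assumed implementation choices, not prescribed by the slide:

```python
import numpy as np

def weighted_centroid(I, sigma_w=2.0):
    """Centroid with a tapered (Gaussian) window w(x, y) centered on the
    subaperture.  The window width sigma_w and the normalization by the
    windowed flux are assumed implementation choices."""
    ny, nx = I.shape
    x = np.arange(nx) - (nx - 1) / 2.0
    y = np.arange(ny) - (ny - 1) / 2.0
    X, Y = np.meshgrid(x, y)
    w = np.exp(-(X**2 + Y**2) / (2 * sigma_w**2))
    norm = (w * I).sum()
    return (X * w * I).sum() / norm, (Y * w * I).sum() / norm
```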

Correlation (matched filtering)

Find the displacement of the image that gives the maximum correlation:

$$(s_x, s_y) = \arg\max_{x,y}\,\bigl(w(x,y) \star I(x,y)\bigr)$$

Use an FFT or quadratic interpolation to find the subpixel maximum of the correlation.

Noise is independent of the number of pixels.
Much better noise performance for many pixels.
The estimate is independent of uniform background errors.
The estimate is relatively insensitive to the assumed image.
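A minimal sketch of this estimator, assuming a known reference spot `ref`: circular cross-correlation via the FFT, followed by parabolic interpolation around the integer peak:

```python
import numpy as np

def correlation_shift(I, ref):
    """Displacement estimate by matched filtering: circularly cross-correlate
    the subaperture image I with a reference spot `ref` (same shape) via the
    FFT, then refine the integer peak with parabolic interpolation."""
    corr = np.real(np.fft.ifft2(np.fft.fft2(I) * np.conj(np.fft.fft2(ref))))
    corr = np.fft.fftshift(corr)  # zero shift now at the array center
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)

    def parabolic(c_minus, c_0, c_plus):
        # Vertex of the parabola through three neighboring samples.
        denom = c_minus - 2.0 * c_0 + c_plus
        return 0.0 if denom == 0 else 0.5 * (c_minus - c_plus) / denom

    ny, nx = corr.shape
    dx = ix - nx // 2 + parabolic(corr[iy, (ix - 1) % nx], corr[iy, ix],
                                  corr[iy, (ix + 1) % nx])
    dy = iy - ny // 2 + parabolic(corr[(iy - 1) % ny, ix], corr[iy, ix],
                                  corr[(iy + 1) % ny, ix])
    return dx, dy
```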

Quad cells

In astronomy, wavefront slope measurements are often made using a quad cell (2x2 pixels).

Quad cells are faster to read out, faster to compute the centroid from, and less sensitive to noise.

$$s_x = \frac{(I_1 + I_2) - (I_3 + I_4)}{I_1 + I_2 + I_3 + I_4}, \qquad s_y = \frac{(I_1 + I_3) - (I_2 + I_4)}{I_1 + I_2 + I_3 + I_4}$$

where the quadrant intensities are numbered so that I_1 and I_2 lie on the +x side and I_1 and I_3 on the +y side.
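In code the quad-cell estimate is a one-liner per axis, with the same assumed quadrant numbering as above:

```python
def quad_cell(I1, I2, I3, I4):
    """Quad-cell slope estimates, using the same assumed quadrant numbering
    as the equations above (I1, I2 on the +x side; I1, I3 on the +y side)."""
    total = I1 + I2 + I3 + I4
    sx = ((I1 + I2) - (I3 + I4)) / total
    sy = ((I1 + I3) - (I2 + I4)) / total
    return sx, sy
```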

Quad cells

The centroid is only linear with displacement over a small region (small dynamic range).

The centroid is proportional to the spot size.

[Figure: centroid vs. displacement for different spot sizes]

Denominator-free centroiding

When the photon flux is very low, noise in the denominator increases the centroid error.

The centroid error can be reduced by using the average value of the denominator:

$$s_x = \frac{(I_1 + I_2) - (I_3 + I_4)}{E[\,I_1 + I_2 + I_3 + I_4\,]}, \qquad s_y = \frac{(I_1 + I_3) - (I_2 + I_4)}{E[\,I_1 + I_2 + I_3 + I_4\,]}$$
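A sketch of this fixed-denominator variant; how the expectation E[...] is obtained (here an externally supplied mean flux, e.g. a running average over previous frames) is an assumption:

```python
def quad_cell_fixed_denominator(I1, I2, I3, I4, mean_flux):
    """Quad-cell centroid with the noisy per-frame flux replaced by its
    expected value E[I1 + I2 + I3 + I4] (here `mean_flux`, e.g. a running
    average over previous frames -- an assumed way of obtaining it)."""
    sx = ((I1 + I2) - (I3 + I4)) / mean_flux
    sy = ((I1 + I3) - (I2 + I4)) / mean_flux
    return sx, sy
```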

Laser guide star elongation

Shack-Hartmann subapertures see a line, not a spot.

[Figure: LGS elongation at Keck; laser projected from the right]

A possible solution for LGS elongation

Radial format CCD: arrange the pixels to be at the same angle as the spots.

Currently testing this design for TMT.

[Figure: radial-format CCD pixel geometry relative to the projected laser]

Pyramid wave-front sensor

[Figure: aperture plane; pyramid (glass prism) at the focal plane; lens to image the aperture; four images of the aperture (conjugate aperture plane)]

Similar to the Shack-Hartmann using quad cells: it measures the average slope over a subaperture.

The subdivision occurs at the image plane, not the pupil plane.

The local slope determines which image receives the light.
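Because each pupil pixel behaves like one quadrant of a quad cell, slope maps can be formed pixel by pixel from the four aperture images. A sketch under an assumed quadrant pairing:

```python
import numpy as np

def pyramid_slopes(P1, P2, P3, P4):
    """Per-pixel slope maps from the four pupil images of a pyramid sensor,
    in direct analogy with the quad cell.  P1..P4 are 2-D arrays registered
    to the same pupil grid; the quadrant pairing is an assumed convention."""
    total = P1 + P2 + P3 + P4
    total = np.maximum(total, 1e-12)  # avoid dividing by zero outside the pupil
    sx = ((P1 + P2) - (P3 + P4)) / total
    sy = ((P1 + P3) - (P2 + P4)) / total
    return sx, sy
```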

Pyramid wave-front sensor non-linearity

When the aberrations are large, the pyramid sensor is very non-linear.

[Figure: four pupil images and the resulting x- and y-slope estimates for a large focus aberration]

Modulation of the pyramid sensor

Without modulation: linear over the spot width.

With modulation: linear over the modulation width.

Pyramid + lens = 2x2 lenslet array

[Figure: a pyramid followed by a relay lens is equivalent to a 2x2 lenslet array]

Duality between Shack-Hartmann and pyramid

Shack-Hartmann: starts from the aperture and forms low-resolution images of the object.

Pyramid: starts from a high-resolution image of the object at the focal plane and forms low-resolution images of the aperture.

[Figure: corresponding layouts of the two sensors, aperture and focal planes]

Pixels in the Shack-Hartmann = lenslets in the pyramid; lenslets in the Shack-Hartmann = pixels in the pyramid.

Multi-sided prisms

The pyramid uses a 4-sided glass prism at the focal plane to generate 4 aperture images.

Can use any N-sided prism to produce N aperture images.

The limit as N tends to infinity gives the “cone” sensor.

[Figure: cone sensor: aperture, cone, relay lens, and the resulting aperture image]

Curvature sensing

[Figure: wave-front at the aperture; defocused images Image 1 and Image 2 recorded at propagation distances z and -z]

Localization comes from the short effective propagation distance.

Linear relationship between the curvature in the aperture and the normalized intensity difference.

Broadband light helps reduce diffraction effects.

[Figure: aperture, lens of focal length f, and the defocused images I1 and I2 recorded a distance l from the lens]

$$z = \frac{f(f - l)}{l}$$
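A quick worked example of the effective propagation distance (illustrative numbers only, not any particular system):

```python
# Effective propagation distance of the curvature sensor, z = f(f - l) / l.
f = 0.5   # focal length of the lens [m] (assumed)
l = 0.45  # distance from the lens to the defocused image plane [m] (assumed)
z = f * (f - l) / l
print(f"effective propagation distance z = {z:.3f} m")  # 0.056 m
```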

Curvature sensing

Using the irradiance transport equation,

$$\frac{\partial I}{\partial z} = -\left(\nabla I \cdot \nabla W + I\,\nabla^2 W\right),$$

where I is the intensity, W is the wave-front, and z is the direction of propagation, we obtain a linear, first-order approximation,

$$\frac{I_1 - I_2}{I_1 + I_2} = z\left(\frac{\nabla I \cdot \nabla W}{I} + \nabla^2 W\right),$$

which is a Poisson equation with Neumann boundary conditions.

Solution at the boundary

For a uniform, one-dimensional aperture of radius R with edge slope W_x, each defocused image is a displaced top hat, and

$$\frac{I_1 - I_2}{I_1 + I_2} = \frac{H(x+R-zW_x) - H(x+R+zW_x) + H(x-R+zW_x) - H(x-R-zW_x)}{H(x+R-zW_x) + H(x+R+zW_x) - H(x-R+zW_x) - H(x-R-zW_x)}$$

where H is the step function: the signal is non-zero only in narrow strips at the edges of the aperture.

[Figure: profiles of I1, I2, and I1 - I2]

Solution inside the boundary

If the intensity is constant at the aperture,

$$\frac{I_1 - I_2}{I_1 + I_2} = z\left(W_{xx} + W_{yy}\right)$$
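The linear inversion can be sketched with an FFT-based Poisson solver. Note this assumes periodic rather than the Neumann boundary conditions above, so it only illustrates the reconstruction step, not a production reconstructor:

```python
import numpy as np

def reconstruct_from_curvature(signal, z, dx=1.0):
    """Invert  signal = z * (W_xx + W_yy)  for the wave-front W on a square
    grid by solving the Poisson equation with an FFT.  Assumes periodic
    boundary conditions (an illustrative simplification)."""
    ny, nx = signal.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0  # placeholder to avoid dividing by zero
    W_hat = -np.fft.fft2(signal / z) / k2
    W_hat[0, 0] = 0.0  # set the (unobservable) piston term to zero
    return np.real(np.fft.ifft2(W_hat))
```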

There is a linear relationship between the signal and the curvature.

The sensor is more sensitive for large effective propagation distances.

Curvature sensing

As the propagation distance, z, increases:

Sensitivity increases.
Spatial resolution decreases.
Diffraction effects increase.

The relationship between the signal, (I1 - I2)/(I1 + I2), and the curvature, Wxx + Wyy, becomes non-linear:

$$\frac{I_1 - I_2}{I_1 + I_2} = z\left(W_{xx} + W_{yy}\right)$$

The Subaru AO system will use two different propagation distances:
A large distance for high sensitivity.
A short distance for high spatial resolution.

Curvature sensing

The practical implementation uses a variable curvature mirror (to obtain images below and above the aperture) and a single detector.

Curvature sensor subapertures

Measure the intensity in each subaperture with an avalanche photo-diode (APD).

Detect individual photons – no read noise.

Wavefront sensing from defocused images

There are more accurate, non-linear algorithms to reconstruct the wavefront using defocused images with many pixels.

[Figure: defocused images; true and reconstructed wavefronts]

Phase retrieval

Suppose we have an image and knowledge about the pupil.

Can we find the phase, φ(x), that resulted in this image?

The image is insensitive to:

Addition of a constant to φ(x): piston does not affect the image.
Addition of a multiple of 2π to any point of φ(x): phase wrapping.
Replacing φ(x) by -φ(-x) if the amplitude is symmetrical: e.g., positive and negative defocused images look identical.

This is called the phase ambiguity problem.

Gerchberg-Saxton algorithm
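The Gerchberg-Saxton algorithm alternates between the pupil and focal planes, enforcing the known amplitude in each plane while keeping the computed phase. A minimal sketch, assuming the pupil amplitude is known and image_amp is the square root of the measured image (FFT-shift conventions and sampling details are glossed over):

```python
import numpy as np

def gerchberg_saxton(pupil_amp, image_amp, n_iter=100):
    """Gerchberg-Saxton phase retrieval: alternately enforce the known pupil
    amplitude and the measured focal-plane amplitude (the square root of the
    image), keeping the phase from each propagation."""
    phase = np.zeros_like(pupil_amp)
    for _ in range(n_iter):
        field = pupil_amp * np.exp(1j * phase)            # impose pupil amplitude
        focal = np.fft.fft2(field)                        # pupil -> focal plane
        focal = image_amp * np.exp(1j * np.angle(focal))  # impose image amplitude
        phase = np.angle(np.fft.ifft2(focal))             # focal -> pupil plane
    return phase
```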

Phase diversity

Phase retrieval suffers from phase ambiguity, slow convergence, algorithm stagnation, and sensitivity to noise.

These problems can be overcome by taking two or more images with a phase difference between them.

In AO, introduce defocus by moving a calibration source.
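One common way to use the diversity images is to minimize a least-squares misfit between the measured images and the images predicted from a trial phase plus the known diversity phases. The sketch below shows such a misfit function; the optimizer, the parameterization of the phase, and the flux normalization are all assumptions, not the specific method of this talk:

```python
import numpy as np

def phase_diversity_residual(phase, pupil_amp, images, diversities):
    """Least-squares misfit between measured images and the images predicted
    from a trial phase plus the known diversity phases (e.g., defocus maps).
    Minimizing this over `phase` with any optimizer is the estimation step."""
    resid = 0.0
    for img, div in zip(images, diversities):
        field = pupil_amp * np.exp(1j * (phase + div))
        model = np.abs(np.fft.fft2(field))**2  # predicted image (PSF)
        model *= img.sum() / model.sum()       # match the total flux (assumed)
        resid += np.sum((img - model)**2)
    return resid
```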

Phase diversity

[Figure: phase diversity images at +2 mm, -2 mm, and -4 mm of defocus]

[Figure: poked actuators: minus-poke phase, plus-poke phase, and their difference]

[Figure: theoretical diffraction-limited image vs. measured image]

Mahalo!
