Optical test-benches for multiple source wavefront propagation and spatiotemporal point-spread function emulation

Stephen J. Weddell1,* and Andrew J. Lambert2

1Department of Electrical and Computer Engineering, University of Canterbury, Christchurch, New Zealand
2University of New South Wales, Canberra, ACT, Australia
*Corresponding author: [email protected]

Received 20 August 2014; revised 28 October 2014; accepted 29 October 2014; posted 31 October 2014 (Doc. ID 221219); published 1 December 2014

Precise measurement of aberrations within an optical system is essential to mitigate combined effects of user-generated aberrations for the study of anisoplanatic imaging using optical test benches. The optical system point spread function (PSF) is first defined, and methods to minimize the effects of the optical system are discussed. User-derived aberrations, in the form of low-order Zernike ensembles, are introduced using a liquid crystal spatial light modulator (LC-SLM), and dynamic phase maps are used to study the spatiotemporal PSF. A versatile optical test bench is described, where the Shack Hartmann and curvature wavefront sensors are used to emulate the effects of wavefront propagation over time from two independent sources. © 2014 Optical Society of America OCIS codes: (100.5070) Phase retrieval; (220.1080) Active or adaptive optics; (120.5050) Phase measurement. http://dx.doi.org/10.1364/AO.53.008205

1. Introduction

The generation of optical wavefronts, their perturbation, and subsequent acquisition within a controlled environment such as an optical laboratory, is an important requirement for the study of inverse problems. The ability to control the dynamics of a temporally evolving or statically displaced perturbation is essential to understanding spatiotemporal effects on the point spread function (PSF). For example, correcting phase aberrations in real time places significant demands on algorithm efficiency in open-loop configurations used for image correction, such as deconvolution from wavefront sensing (DWFS) [1]. Computationally, closed-loop systems, such as adaptive optics [2], are comparatively less stringent since reliance is on electromechanical controls to correct the optical path in real time.

Development of the optical test-bench described in this article facilitated the study of efficient algorithms used to estimate the spatially variant PSF (SVPSF). A variety of methods has been used for its estimation, including atmospheric tomography [3], maximum a posteriori (MAP) [4], and more recently, discriminative learning of wavefront evolution using reservoir computing [5]. A brief background on optical test-benches, emphasizing the use of electro-optical methods to generate phase aberrations, is given in the next section. This includes an overview of wavefront propagation and phase map generation, and image restoration using the SVPSF. The optical test-bench developed for this study is discussed in Section 3. A discussion on single- and multisource wavefront propagation is given in Section 4. Empirical results are presented in Section 5. Last, this paper is concluded in Section 6 and includes a brief outline of future work.


2. Background

In this section we first define the PSF and apply this to the forward image model. Zernike polynomials are used to describe how phase aberrations are introduced in the optical test-bench using a spatial light modulator.

A. Point Spread Function

The spatiotemporal PSF is used to provide knowledge a posteriori of the angular effect of anisoplanatism. By evaluating the spatial variance of the PSF over time, knowledge a priori can be used from several point sources to minimize decorrelation as the field angle increases. First, the PSF for astronomical imaging problems can be defined as

h(x_2, y_2) = \mathrm{FT}\{ P(\lambda z u, \lambda z v) \},    (1)

where P is the generalized pupil, u and v are coordinates in the pupil plane, z is the coordinate in the direction of wavefront propagation, λ is the wavelength, and FT is the Fourier transform operator. When also considering the area of the exit pupil, A_p, and the distance of propagation from the exit pupil to the image, d, the relationship between the wavefront and the PSF is given by Goodman [6],

h(x_2, y_2) = \frac{A_p}{\lambda^2 d^2} \left| \mathrm{FT}\left\{ P(x_1, y_1) \exp\left[ -j \frac{2\pi}{\lambda} W(x_1, y_1) \right] \right\} \right|^2,    (2)

where P(x_1, y_1) is the exit pupil function, simplified for a diffraction-limited system as

P(x_1, y_1) = \begin{cases} 1 & (x_1, y_1) \in A|_{R(x,y)} \\ 0 & \text{elsewhere}, \end{cases}    (3)

and where W(x_1, y_1) is the wavefront distortion function [see Eq. (10) in Subsection 2.C]. Phase perturbations introduced in the pupil can be analyzed at the image plane through the PSF. The PSF is spatially invariant over isoplanatic regions; however, over a wide field of view, spatial as well as temporal changes to the PSF should be considered. This is described in the following subsection.

B. Spatiotemporal Point Spread Function

Changes to the PSF, and correspondingly the image of the science object, are functions of time. Temporal changes can be expressed as a convolution of a science object f(k, l, t) with a degradation function h(p, q, k, l, t), and with additive noise, η(p, q, t). The result is measured in the spatiotemporal domain, i.e., g_1(p, q, t), as defined by Eq. (4). Models of this kind are known as the forward problem [7]. For completeness, therefore, we should write

g_1(p, q, t) = \sum_{k=1}^{N} \sum_{l=1}^{M} f(k, l, t)\, h(p - k, q - l, t) + \eta(p, q, t),    (4)

for the time-variant, spatially invariant image model, and

g_2(p, q, t) = \sum_{k=1}^{N} \sum_{l=1}^{M} f(k, l, t)\, h(p, q, k, l, t) + \eta(p, q, t),    (5)

for the time-variant and SVPSF image model.
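As a concrete illustration of Eqs. (2)–(4), the following Python sketch (not the authors' code; grid size, aberration strengths, and noise level are illustrative assumptions) forms a PSF from an aberrated pupil and applies the spatially invariant forward model; the SVPSF model of Eq. (5) would instead require a separate kernel for each field position.

```python
# Minimal sketch: PSF from an aberrated pupil (Eq. (2)) and the spatially
# invariant forward model (Eq. (4)). All parameter values are assumptions.
import numpy as np

N = 256                                    # pupil-plane grid size (assumed)
y, x = np.mgrid[-1:1:N*1j, -1:1:N*1j]
rho = np.hypot(x, y)
P = (rho <= 1.0).astype(float)             # binary exit-pupil function, Eq. (3)

# Example wavefront distortion, expressed in waves so 2*pi/lambda * W becomes 2*pi*W
W = 0.5 * x + 0.3 * (2 * rho**2 - 1)       # tilt plus defocus (illustrative)
pupil = P * np.exp(-1j * 2 * np.pi * W)    # generalized pupil of Eq. (2)

# PSF is the squared modulus of the Fourier transform of the generalized pupil
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
psf /= psf.sum()                           # normalize to unit volume

# Spatially invariant forward model, Eq. (4): g1 = f (*) h + noise
f = np.zeros((N, N)); f[N // 2, N // 2] = 1.0        # point-source "science object"
g1 = np.real(np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(np.fft.ifftshift(psf))))
g1 += np.random.normal(scale=1e-6, size=g1.shape)    # additive noise term
```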

C. Zernike Polynomials

Zernike polynomials are 2D orthonormal basis functions commonly used to define optical aberrations over the unit circle. They represent the statistical eigenfunctions of optical distortions that quantitatively classify each aberration using a set of polynomials. Since Zernike polynomials are used extensively in this article, a brief summary is given here. A more extensive discussion is presented by Born and Wolf [8]. Generally defined in polar coordinates as the product of radial and angular terms, Zernike polynomials, using the simplified ordering scheme of Noll [9], can be expressed as

Z_{i,\mathrm{even}}(r, \theta) = \sqrt{n+1}\, R_n^m(r)\, \sqrt{2} \cos(m\theta), \quad m \neq 0,
Z_{i,\mathrm{odd}}(r, \theta) = \sqrt{n+1}\, R_n^m(r)\, \sqrt{2} \sin(m\theta), \quad m \neq 0,
Z_i(r) = \sqrt{n+1}\, R_n^0(r), \quad m = 0,    (6)

where r is the aperture radius, i is the polynomial order, the terms m and n are the azimuthal and radial orders, respectively, and, last, R_n^m(r) is referred to as the radial polynomial function and is defined as

R_n^m(r) = \sum_{s=0}^{(n-m)/2} \frac{(-1)^s (n - s)!}{s!\, [(n+m)/2 - s]!\, [(n-m)/2 - s]!}\, r^{n-2s}.    (7)

Each Zernike mode can be represented by a 2D image, commonly referred to as a phase map. For example, an aberrated PSF is represented by K phase maps on the pupil plane. The linear combination of K aberrations over a unit circle of radius R results in an approximation of the phase perturbation, φ(·), and is defined as

\phi(R\rho, \theta) \approx \sum_{i=2}^{K} a_i Z_i(\rho, \theta),    (8)

where ρ is the normalized aperture given an aperture of radius R, and the Zernike coefficients a_i are defined as

a_i = \int_0^1 \int_0^{2\pi} W(\rho)\, \phi(R\rho, \theta)\, Z_i(\rho, \theta)\, \mathrm{d}\theta\, \mathrm{d}\rho,    (9)

where W(ρ) is the pupil weighting function. When K = ∞, Eq. (8) is an exact representation of the phase. Additionally, the piston term, Z1, has been removed in Eq. (8), as is common for single-aperture instruments. The relationship between phase aberrations and the pupil distortion function can be defined as [10]

\mathcal{W}(x, y) = W(x, y) \exp[\, j \phi(x, y) \,],    (10)

where the pupil weighting function, W(x, y), is more conveniently expressed in rectangular coordinates. We are concerned here with two groups of wavefront distortions that affect the PSF differently. Low-order Zernike modes Z2 and Z3, i.e., tilt, result in displacement of the PSF. Higher-order aberrations, such as defocus, Z4, and spherical aberration, Z11, result in deformation of the PSF. Aberrations such as astigmatism, coma, and trefoil, Zernike modes Z5, Z6, and Z7 to Z10, respectively, incorporate some tilt component and subsequently displace the PSF. In this work, series and sequences of Zernike polynomials were used to generate phase maps for transfer to a spatial light modulator.

D. Phase Screen Generation and Spatial Light Modulators

A phase screen can be generated using a computer program as a random 2D array of phase distortions that has the same statistics as turbulence-induced atmospheric phase [10]. In this study, each (x, y) position of a phase screen can be interpreted as a phase map represented by a Zernike polynomial series. Only relatively low aberrations to the 5th order were used to demonstrate our method of subdividing the aperture into regions, where both modal and zonal wavefront sensing and subsequent analyses were undertaken. To emulate the effects of a turbulent atmosphere, wavefront aberrations were induced in the optical pupil using a Holoeye LC-SLM (LC-R-2500). A thorough description of LC-SLMs is given by Love [11]. The application and calibration of this LC-SLM module is described in the next section.

3. Optical Configuration

The optical test bench configuration used for a single source is shown in Fig. 1. The basic open-loop configuration, comprising a diode laser module supporting a 170° polarized, 632 nm, 10 mW source with a neutral density and spatial filter assembly, is shown in the lower portion of the figure. Lenses L1, L2, and L3 create the optical pupil for propagation of a planar wavefront via mirror M2, or an induced programmable wavefront via the LC-SLM, facilitated by beamsplitter B1. The LC-SLM is used to introduce wavefront phase aberrations using an external computer configured as a second XVGA monitor. When the LC-SLM is covered, mirror M2 provides a planar wavefront for the reference PSF. Distortion effects from the edges of the LC-SLM were reduced by a circular aperture before beamsplitter B2. Lens L4 is configured for a 4f system, where imaging is performed using a curvature wavefront sensor [12]. Lenses L5 and L6 are configured as a focal reducer (0.3×) to accommodate the full width of the CCD, where imaging is performed using either a Hartmann or Shack Hartmann wavefront sensor [13].


Fig. 1. Basic optical test bench configuration for single laser AO experiments, showing a single-source 632 nm semiconductor laser with polarizing filter, mirror M1 with a coarse spatial filter and associated lens assembly L2 and L3, beamsplitter B1 and mirror M2 to generate a planar wavefront for reference and calibration, the LC-R-2500 LC-SLM, aperture stop, and wavefront sensor modules comprising a Shack Hartmann and curvature wavefront sensor, both of which employ the Sony KAI-0340 CCD image sensor in cameras C1, and C2-C3, respectively.


Cameras C1–C3 are the Dragonfly-2, manufactured by Point Grey. These cameras supported a 10-bit grayscale Sony KAI-0340 image sensor and a FireWire 800 Mbps interface, providing good resolution and high-speed performance for capturing video sequences commensurate with the LC-R-2500 operational frequency. Additionally, the application programming library supported user programmability, such that image data could be streamed using a fully customized graphical user interface. Aberrations induced in the pupil allowed analysis and verification of the PSF when lens L4 was adjusted to a focus on the image plane of camera C2. Spatiotemporal effects of the PSF when subjected to an aberrated wavefront were also investigated. Small displacements of propagating wavefronts from multiple sources were introduced. This was achieved using two methods: (1) individual laser sources, and (2) a digital light processor. Both methods are discussed in Section 4.B. The PSFs of a planar wavefront are shown in Fig. 2, where the reference PSF from mirror M2 (with the LC-SLM covered) is shown on the left, and, with mirror M2 covered, the native curvature of the LC-SLM is shown on the right. Power readings were conducted at the conclusion of each session. Essentially, a power meter was used to take readings immediately at the output of a single coherent laser source, and at two locations within the optical path, between the polarizer and L1 and after lens L4, as shown in Fig. 1. These power readings are shown in Table 1.

Table 1. Power Meter Readings

Configuration      Source (μW)   Polarizer   Sensor (nW)
Curvature set      45.3          2.2 μW      156
Holographic set    45.3          1.6 μW      143
Hartmann set       —             217 nW      11

Fig. 2. Point spread functions for the mirror reference (left), and the LC-SLM reference.

The remainder of this section is organized as follows. First, phase map generation and calibration of the optical system is described and, second, the methods used for wavefront sensing are outlined.

A. Wavefront Generation and Calibration

The LC-SLM was calibrated using a phase function that linearized the phase modulation at the input of the LC-R-2500 to slightly more than 2π at a wavelength of 632 nm [14]. Amplitude modulation was minimized by orienting the polarized light from the laser with respect to the LC-SLM, ensuring a 170° polarization angle incident on the LC-SLM, as specified by the manufacturer. Our calibration, however, showed a maximum phase shift of at least π wavelengths over the full eight-bit range supported by the LC-SLM; this was highly dependent on the phase function and the video encoding settings used in transfers to the SLM. The procedure used to determine maximum phase shift first employed the default gamma table of Hu et al. [14], but this was later replaced by the gamma table supplied by Holoeye. The generation of holographic fringes is shown in Fig. 3. These were produced by carefully aligning horizontal interference patterns between the LC-SLM and reference mirror M2, as shown in the top-left of Fig. 1. The background between the lighter bands shows a faint image. We believe this to be the slight amplitude modulation that accompanies the predominantly phase modulation modality used. Hu et al. [14] measured the amplitude modulation of this LC-SLM to be less than 10%. Maximum phase shift was measured by transferring a black (0) and white (255, unsigned integer) image to the LC-SLM, where the transition was partitioned vertically in the center. The resulting image in Fig. 3 highlights the phase grating used to measure the maximum phase response of the LC-SLM. A static phase map was generated to test the optical system for calibration. This comprised two low-order aberrations, tilt and defocus, and was used to generate a corresponding phase aberration in the pupil. However, in order to generate the appropriate wavefront distortion pattern in the pupil, a grating was applied to the phase screen. The result was used to modulate incident monochromatic 632 nm planar wavefronts at the LC-SLM. The reflected wavefront was effectively phase induced through modulation.


Fig. 3. Interferometric holographic fringes formed over a vertical binary image to measure phase diversity.
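As a hedged illustration of how a calibrated phase response can be applied in practice, the sketch below maps a desired wrapped phase onto 8-bit gray levels through a monotonic lookup table; the response table, panel resolution, and ramp period are hypothetical placeholders rather than measured values from the calibration described above.

```python
# Illustrative only: mapping wrapped phase to 8-bit gray levels through a
# phase-response (gamma) table. The 'response' array is a hypothetical stand-in
# for the measured response obtained from the fringe-based calibration.
import numpy as np

# Hypothetical measured phase shift (radians) for gray levels 0..255
response = np.linspace(0.0, 2.1 * np.pi, 256)   # assumed slightly-more-than-2*pi span

def phase_to_gray(phi):
    """Map wrapped phase (radians) to the first gray level whose response reaches it."""
    phi = np.mod(phi, 2 * np.pi)                             # wrap into [0, 2*pi)
    idx = np.clip(np.searchsorted(response, phi), 0, 255)    # monotonic table lookup
    return idx.astype(np.uint8)

# Example: a linear phase ramp (tilt) rendered as an 8-bit image for the SLM
_, xx = np.mgrid[0:768, 0:1024]                  # assumed XGA panel resolution
gray = phase_to_gray(2 * np.pi * xx / 50.0)      # one 2*pi wrap every 50 pixels (assumed)
```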


Fig. 4. Example of a static phase screen used to generate phase distortions in the pupil. The effects of a dominant tilt and secondary defocus aberration are shown incorporating a 50 × 2π grating.

The degree of modulation represents a compromise between the spatial resolution supported by the LC-SLM and the phase diversity of the induced aberration. To calibrate the estimated wavefront in the pupil plane against Matlab-generated simulations, the root-mean-square (RMS) wavefront error of each induced aberration was measured using a 14 × 14 element Shack Hartmann wavefront sensor. Various 2π gratings were used in this test, and the measured results were then compared with simulated phase maps. An example of a 100π tip with added defocus waveform map is shown in Fig. 4. Each grated and modulated phase map was based on a set of low-order Zernike coefficients, defined by Eq. (9). These were derived using a wide-field, computer-generated phase screen, where a narrow-field circular region simulated a telescope aperture. As the phase screen is displaced over time with respect to this smaller region, the spatiotemporal PSF introduced in Subsection 2.B can be imaged at the Fourier plane. For each discrete time step, a grated and modulated phase map is transferred to the LC-SLM, configured as a second XGA LCD monitor. A set of Matlab routines simplifies this process and generates time-series sequences at a theoretical maximum rate of 60 fps for image capture and processing using both Shack Hartmann and curvature wavefront sensors. This rate is within the 75 Hz temporal bandwidth of the LC-SLM and provides a platform to perform "frozen" turbulence emulations based on the Taylor hypothesis [15]. A similar method is discussed by Hu et al. [14]. Video sequences can also be programmed to gradually evolve over a fixed time period by modifying each coefficient representing a low-order Zernike term.
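The following sketch illustrates, under stated assumptions, the kind of grated phase map described above: a tilt-plus-defocus Zernike combination over the unit circle, wrapped modulo 2π and quantized to eight bits for transfer to the SLM. It is not the authors' Matlab code; the panel size and coefficient values are illustrative.

```python
# Minimal sketch of a grated, modulated phase map (tilt + defocus) for an SLM.
# The modulo-2*pi wrap plays the role of the "2*pi grating"; values are assumptions.
import numpy as np

N = 768                                      # assumed square region of the SLM panel
y, x = np.mgrid[-1:1:N*1j, -1:1:N*1j]
rho, theta = np.hypot(x, y), np.arctan2(y, x)
pupil = rho <= 1.0

# Low-order Zernike modes over the unit circle (Noll ordering, Eq. (6))
Z2 = 2.0 * rho * np.cos(theta)               # tilt
Z4 = np.sqrt(3.0) * (2.0 * rho**2 - 1.0)     # defocus

a2, a4 = 25.0 * 2 * np.pi, 3.0 * 2 * np.pi   # illustrative coefficients (radians)
phi = (a2 * Z2 + a4 * Z4) * pupil            # phase perturbation, Eq. (8)

# Wrap into [0, 2*pi) so the SLM reproduces the phase modulo 2*pi
wrapped = np.mod(phi, 2 * np.pi)
gray8 = np.round(255 * wrapped / (2 * np.pi)).astype(np.uint8)  # image sent to the SLM
```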

B. Wavefront Sensing

Optical wavefront sensors are used to measure phase aberrations in the pupil or image planes. Two pupil-plane optical wavefront sensors were used to verify induced phase aberrations. The Shack Hartmann and curvature wavefront sensors provided zonal and modal operations, respectively. This combination was useful in determining individual wavefront aberrations for multisource operation, and in separating induced aberrations from distortions inherent in the optical system. A simulation-based comparison of both wavefront sensors for low-order myopic aberrations is given by Basavaraju et al. [16]. The curvature sensor measures the intensity on either side of the focal plane. The intensity differential between the extrafocal and intrafocal planes contains information about wavefront curvature. If low-noise extrafocal and intrafocal pupil images can be obtained, the high sensitivity of the curvature sensor and its simpler optical setup make it a cost-effective alternative for high dynamic range wavefront sensing of low-order aberrations. The distance between the defocused image planes defines the spatial sampling resolution and the wavefront sensing accuracy; both are critical parameters. Combinations of low-order Zernike terms were used to generate phase maps, and these were sent to the LC-SLM to introduce phase distortions in the pupil. Initially, aberrations comprised tilt, Z2 or Z3, and defocus, Z4. Two images from cameras C2 and C3 in Fig. 1 show intensity differences that are used to measure wavefront curvature from both intra- and extrafocal planes; these images are shown in Fig. 5. Figure 5 highlights the effects of two dominant wavefront aberrations, tilt (Z2) and defocus (Z4).

Fig. 5. Curvature sensor projections showing the effects of both defocus and tilt aberrations: (a) intrafocal image; (b) extrafocal image. The white dashed circle in each subfigure shows the region of interest that is extracted to determine aberrations; the extrafocal image is flipped by virtue of reflection at B3.
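As an illustration of the ROI handling described above and in the caption of Fig. 5, a minimal sketch of the intrafocal/extrafocal differencing step follows; the ROI centers, radius, and flip direction are assumptions, and the optional normalization is the standard Roddier-style form rather than a step stated in the text.

```python
# Sketch (not the authors' code) of curvature-sensor preprocessing: crop and centre
# the intrafocal and extrafocal regions of interest, undo the flip of the extrafocal
# image, and form the intensity difference that carries the curvature signal.
import numpy as np

def crop_roi(img, cy, cx, r):
    """Square (2r x 2r) region of interest centred on (cy, cx)."""
    return img[cy - r:cy + r, cx - r:cx + r].astype(float)

def curvature_difference(intra, extra, centre_in, centre_ex, r=100, normalize=True):
    cy_i, cx_i = centre_in
    cy_e, cx_e = centre_ex
    I1 = crop_roi(intra, cy_i, cx_i, r)
    I2 = np.flipud(crop_roi(extra, cy_e, cx_e, r))   # flip direction is an assumption
    diff = I2 - I1                                   # extrafocal minus intrafocal ROI
    if normalize:                                    # standard Roddier-style normalization
        s = I1 + I2
        s[s == 0] = 1.0
        diff = diff / s
    return diff
```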



Tilt is evident in the displacement of the first diffraction (nondashed circle) order. Intensity increases highlighted to the left of each zeroth diffraction order [circled intrafocal (a) and extrafocal (b) CCD images] are clearly evident. In addition, defocus is clearly evident in the size of each displaced first diffraction order: a larger first-diffraction-order disk is shown in (b), whereas a smaller diffraction disk is shown in (a). In the absence of defocus, both disks would be identical in size. To minimize the effects of scintillation, the regions shown as dashed circles in (a) and (b) of Fig. 5 were cropped and centered. The resulting intrafocal region of interest (ROI) was subtracted from the extrafocal ROI to determine curvature and subsequently estimate individual Zernike modes. Estimates of Zernike coefficients measured using a curvature wavefront sensor are shown in Fig. 6. The dominance of both tip (Z2) and defocus (Z4) aberrations is evident. Each dominant wavefront error was within 10% of our simulated results. The residual wavefront error terms are discussed in Section 5. To verify the results of the curvature sensor, a Shack Hartmann wavefront sensor (SHWS) was used to capture aberrations in the pupil through slope measurements. The SHWS comprised a CCD image sensor, afocal optics, and a lenslet array; Fig. 1 shows these components as C1, L5 and L6, and SH, respectively. The lenslet array subdivided the pupil into 14 × 14 subapertures, and the afocal stage provided full-aperture estimation over an r = 2.25 mm portion of the 4.7 × 5.2 mm Sony KAI-0340 image sensor. The 640 × 480 pixel array supported a lenslet size of 74 ± 1 μm (10 pixels), with 266 μm (36 pixel) spacing between subapertures. To generate a wavefront map, the least-squares algorithm required both aberration and reference images. As with the curvature sensor, the grated phase map shown in Fig. 4 was used to induce phase aberrations in the pupil. Images were captured simultaneously on both curvature and Shack Hartmann wavefront sensor cameras. An additional reference frame was captured for the Shack Hartmann sensor on camera C1. Figure 7 shows both the aberration and reference SHWS images used for wavefront estimation.
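A minimal sketch of the Shack Hartmann measurement step described above, computing per-subaperture centroid shifts between the reference and aberration images, is given below; the subaperture grid geometry is an assumption (the actual sensor used a 14 × 14 lenslet array with 36 pixel spacing), and this is not the authors' code.

```python
# Sketch of Shack-Hartmann slope extraction: intensity-weighted centroids per
# subaperture, differenced between aberration and reference frames. The image is
# assumed large enough to cover the full n x box grid.
import numpy as np

def centroid(patch):
    """Intensity-weighted centroid (y, x) of one subaperture image patch."""
    total = patch.sum()
    if total == 0:
        return np.array([np.nan, np.nan])
    yy, xx = np.indices(patch.shape)
    return np.array([(patch * yy).sum(), (patch * xx).sum()]) / total

def slopes(reference, aberration, box=36, n=14):
    """Centroid shifts (aberration minus reference) for an n x n grid of subapertures."""
    dy = np.zeros((n, n)); dx = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            sl = (slice(i * box, (i + 1) * box), slice(j * box, (j + 1) * box))
            cy_r, cx_r = centroid(reference[sl].astype(float))
            cy_a, cx_a = centroid(aberration[sl].astype(float))
            dy[i, j], dx[i, j] = cy_a - cy_r, cx_a - cx_r
    return dx, dy
```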


Fig. 7. Shack Hartmann wavefront sensor masks: (a) reference mask; (b) aberration mask.

Fig. 8. 3D Wavefront map of the pupil showing the dominant tilt (Z2 ) aberration with secondary defocus (Z4 ).

Singular value decomposition was used for the least-squares fitting of centroid data. The results of fitting these data are shown in the 3D plot of Fig. 8. In summary, a versatile open-loop optical test-bench for generating and estimating wavefront aberrations has been described. A discussion on the experiments conducted using this platform is given in the next section.
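The least-squares fitting step mentioned above can be sketched as follows, fitting Zernike coefficients to the measured slopes through numpy's SVD-based solver; the modal influence matrix is assumed to have been precomputed from the reference geometry, and this is illustrative rather than the authors' implementation.

```python
# Sketch of the modal least-squares step: fit Zernike coefficients to measured
# Shack-Hartmann slopes. D holds the x- and y-slope responses of each mode at each
# subaperture and is assumed to be precomputed.
import numpy as np

def fit_zernike_coefficients(slopes_x, slopes_y, D):
    """
    slopes_x, slopes_y : measured centroid shifts per subaperture (1D arrays, length S)
    D                  : (2S x K) matrix of modal slope responses (assumed precomputed)
    returns            : K estimated Zernike coefficients
    """
    s = np.concatenate([slopes_x, slopes_y])      # stack x and y slopes
    a, *_ = np.linalg.lstsq(D, s, rcond=None)     # SVD-based least-squares solution
    return a
```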

4. Generation of Single and Multiple Wavefronts

Two approaches for the generation of single and multiple optical wavefronts for data acquisition and analysis are discussed in this section.


A. Single Source Experiments


Fig. 6. Curvature sensor results with dominant tip and defocus aberrations.


Combinations of two Zernike polynomials comprising several RMS orders were applied to the LC-SLM using a phase map. Each phase map employed several 2π phase gratings, necessary for phase modulation. Low-order tilt aberrations were used to displace the resulting PSF, thus allowing orientation within the aperture. In addition, both defocus and astigmatism aberrations were applied to deform the PSF, representing a single source object. A Shack Hartmann wavefront sensor was used to measure phase aberrations in the pupil. Each set of aberrations was also verified using a curvature wavefront sensor (CWS). A detailed discussion of the results of this process is given in Section 5.

B. Multiple Source Experiments

As discussed in Subsection 3.A, a single set of Zernike terms is used to generate fringe aberration masks for single source experiments. This was initially used for isoplanatic imaging experiments, where the effects of a single phase map can be applied to multiple sources, thereby emulating the effects of turbulence on close binary or trinary star systems. However, if the LC-SLM is spatially divided, such that alternate Zernike terms are used to generate aberrations within separate zones of the pupil, anisoplanatic imaging emulation can be attempted. Thus, more recent experiments have used this hypothesis to generate phase maps comprising two or more individual Zernike ensembles. Two configurations were employed and evaluated for multiple-object adaptive optics experiments. The first used a single laser source and a digital light processor (DLP) to generate multiple source objects by alternately deflecting the optical path through the use of pulse-width modulation. The second configuration used independent laser sources. In both cases, multiple source objects were generated and used in conjunction with a single LC-SLM for the study of the SVPSF. Each configuration is discussed in this subsection.

1. Multiple Sources from a Single Laser

Multiple source objects were created using a DLP manufactured by Texas Instruments. The DLP was incorporated in the optical system shown in Fig. 1, effectively replacing mirror M1. This was used to modulate a single laser source at two different spatial frequencies, creating two distinct spatial paths. An example of a generated binary source is shown in Fig. 9. Close examination of Fig. 9 shows a dominant zero diffraction order and several subsequent orders diminishing in intensity.

Fig. 9. Zero and subsequent orders of the point spread function created using the binary DLP.

Each order shows two individual point sources, in addition to their corresponding first orders. A series of tilt aberrations was sent to the LC-SLM, resulting in an x–y displacement of the PSF. However, a strong zero-order term remained, which we expect was due to undiffracted light. The application of tilt terms provided a method to separate the modulated PSF from this zeroth order. However, due to the low resolution of each PSF at the first order, this practice was discontinued. Subsequent experimentation was conducted using separate laser sources.

2. Multiple Sources from Individual Laser Modules

As an alternative to the DLP method, this set of experiments used separate laser modules as individual sources. By orienting the planar wavefront of each source such that the wavefronts were separated by a small angle, θ, and by ensuring that each wavefront was incident on a separate aberration pattern generated over a separate region of the LC-SLM, anisoplanatic experiments were conducted. The optical configuration for this experiment was essentially the same as shown in Fig. 1; however, rather than using a single laser assembly, two or more individual laser modules were used. The resulting array of laser sources is shown in Fig. 10. In order for the LC-SLM to be used with multiple source objects, several conditions had to be met to overcome spatial constraints at the image plane and to achieve independent curvature measurements. These constraints can be seen on the schematic of the LC-SLM used in our work, as shown in Fig. 11. Consider three planar wavefronts, each generated by an independent laser source. The wavefront from a natural guide star would be effectively infinite in extent and assumed planar. However, in an optical laboratory, generated wavefronts have finite dimensions. Assuming a C-mount optical system, the diameter of the "cylinder" of laser light would be approximately 25 mm, and therefore sufficient to fully illuminate an LC-SLM such as the LC-R-2500.

Fig. 10. Multiple laser module configuration for anisoplanatic experiments.


Fig. 11. LC-SLM configuration used for multiple-object projection.

If, however, the aperture of the telescope is reduced to, say, 10 mm, independent laser sources could be aligned to explore separate spatial regions over the SLM. An example of this is shown in Fig. 11, where three independent sources, S1, S2, and S3, and an estimated target source, T, are shown. By directing multiple aberrations to a region of the LC-SLM that coincides with the center of each angularly separated source, higher-order aberrations, other than tip, can be modulated independently for each source.


Fig. 12. Uncompensated SHWS results for induced defocus (Z4) with RMS wavefront error of 0.5 μm over six phase gratings: (a) RMS comparison and (b) respective P–V values.

5. Results and Discussion

The results from the SH and curvature wavefront sensors on single source objects were compiled and are presented in this section. The methods outlined in Subsection 4.B.2 were used, and the curvature wavefront sensor allowed the detection of low-order wavefront aberrations from multiple source objects over a single aperture. Once wavefront aberrations were introduced in the pupil of the optical system using the method outlined in Subsection 3.A, each source became time and spatially variant. Given this added flexibility, our first objective was to verify the accuracy of each aberration as it varied over time. Our second objective was to emulate modal tomography in a controlled laboratory environment. Open-loop estimates of the SVPSF were obtained for each source object using the test-bench described in Section 3. This facilitated further experimentation in the restoration of a target wavefront, such as the meta-pupil T in Fig. 11, using an appropriate deconvolution algorithm.

A. Single Source Objects

Both the curvature and SH wavefront sensors were used on single source objects. Results from both wavefront sensors will be presented here.

1. Shack Hartmann Wavefront Sensor

Our initial wavefront analysis used only the defocus (Z4) aberration. A phase map with a defocus RMS wavefront error of 0.5 μm was generated in software. This simplification was used to verify both RMS and peak-to-valley (P–V) metrics over a range of 2π phase gratings, where the modulus was varied from 1 to 100.


The measured wavefronts mapped in the pupil were then compared with the corresponding simulation. The results are shown in Fig. 12. First, the results from Fig. 12 show that for a grating modulus of 5, the induced phase error was reasonably consistent with the simulated result. However, we found that the eight-bit resolution of the LC-SLM severely limited the range of induced wavefront errors. This meant that overcoming the sensor noise floor in an attempt to represent weak aberrations in the pupil was challenging. In addition, due to residual phase error, the number of Zernike terms was limited to fewer than four orders. Second, Fig. 12(a) suggests a linear relationship between phase modulation and RMS wavefront error, but only over grating moduli between 5 and 25. Above a grating modulus of 25 the structure of the original aberration is lost, as is evident from the severe drop in RMS and the increase in P–V values. It is important to note that the RMS values shown in Fig. 12 are uncompensated. A Galilean afocal stage (lenses L5 and L6 in Fig. 1) is used to reduce the pupil by 85%, thereby allowing the entire pupil to be imaged. However, the RMS values shown in Fig. 12 should be further compensated due to the reduction in effective pupil size that results from multiple phase modulation. For example, the effective area of the generated wavefront shown in Fig. 13(c) is 90%, and for Fig. 13(b) this reduces to 40%. For a grating modulus of 100 the integrity of the original wavefront aberration is effectively lost, as is evident in the phase map shown in Fig. 13(a).
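For reference, the two metrics compared above can be computed as in the following sketch, evaluated only over the portion of the pupil that remains valid after grating modulation; the valid-area mask is assumed to be supplied by the caller, and this is not the authors' code.

```python
# Sketch of the RMS and peak-to-valley (P-V) wavefront error metrics over the
# valid pupil area.
import numpy as np

def wavefront_metrics(wavefront, valid_mask):
    """Return (rms, pv) of `wavefront` (e.g., in micrometres) over the valid pupil area."""
    w = wavefront[valid_mask]
    w = w - w.mean()                     # remove piston before computing the error
    rms = np.sqrt(np.mean(w**2))
    pv = w.max() - w.min()
    return rms, pv
```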


Fig. 14. Compensated curvature sensor results for an induced defocus (Z4) term with an RMS wavefront error of 0.5 μm, over six phase gratings: (a) actual RMS wavefront error for a single defocus (Z4) aberration; (b) total RMS wavefront error for a single defocus (Z4) aberration with residual phase.


Fig. 13. Wavefront map of defocus (Z4 ) aberration with modular gratings: (a) modulus of 100 showing no relation to the original wavefront aberration due to excessive 2π modulation; (b) modulus of 25 showing a relatively large RMS wavefront error and subsequent reduction in pupil coverage by approximately 40%; (c) modulus of 10 with moderate RMS wavefront error, valid over approximately 90% of the pupil.

In summary, due to the limited size and spatial resolution of both the SLM and the CCD, in addition to the limited wavefront resolution supported by the SLM, programmable modulus gratings of between 5 and 25 were considered a workable range to induce the lowest-order, i.e., most dominant, wavefront aberration. Subsequent wavefront aberrations, encoded within the modulated phase map, are added in proportion to the dominant Zernike term. Nonlinear distortion is induced as the grating modulus is increased, and thus the valid spatial area within the pupil is reduced.

2. Curvature Wavefront Sensor

The results from the curvature wavefront sensor method described in Section 3.B are shown in Fig. 14.

The SH wavefront sensor was used to verify the results from the curvature sensor. To achieve this, the P–V method was used to measure the RMS of each wavefront aberration, resulting in a total RMS wavefront error comprising the induced term, defocus (Z4), and additional residual wavefront terms. Wavefront data from the curvature sensor shown in Fig. 1 were verified simultaneously using the Shack Hartmann WFS discussed in Section 3. A set of Zernike coefficients to the fourth order was used to compare phase aberrations in the pupil independently. The results in Fig. 14 show that, for a grating modulus of 10, a generated defocus (Z4) term with an RMS error of 0.5 μm, measured over 10 independent experiments, was verified to have a mean coefficient of 0.58 μm and a total RMS wavefront error of 0.71 μm.

B. Multiple Source Objects

The motivation for these experiments was to emulate modal tomography in a controlled laboratory environment. Open-loop estimates of the SVPSF could be obtained for each source object used, as described in the introduction of this section. Two aberration sequences were introduced into the pupil over a reduced aperture. To achieve this, two perturbation sequences were created and mapped independently using separate regions of the LC-SLM. Figure 15 shows how this was achieved.



Fig. 15. LC-SLM phase mask showing two independent aberrations, defocus (Z4 ), and astigmatism (Z6 ), grated with modulus of 50.

As described in Subsection 4.B.2 and shown in Fig. 10, planar wavefronts from individual sources were angularly separated to cover two or more LC-SLM regions, such that each reflected wavefront was individually aberrated, commensurate with wavefronts passing through anisoplanatic patches of a turbulent atmosphere. The angular separation is controlled by θ1 and θ2 and is based on a single-layer tomographic model [17].
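To illustrate how two independent aberration zones can be written to a single LC-SLM mask, as in Fig. 15, a hedged sketch follows; the panel resolution, zone centers and radii, and coefficient values are assumptions, and the modulo-2π wrap stands in for the grating modulation described earlier.

```python
# Illustrative sketch (not the authors' code) of a composite LC-SLM mask with two
# independent aberration zones: defocus (Z4) over one circular region and astigmatism
# (Z6) over another, each wrapped modulo 2*pi.
import numpy as np

H, W = 768, 1024                              # assumed SLM panel resolution
yy, xx = np.mgrid[0:H, 0:W]

def zone_phase(cy, cx, radius, mode, amplitude):
    """Phase of one Zernike mode over a circular zone centred at (cy, cx)."""
    u = (xx - cx) / radius
    v = (yy - cy) / radius
    rho, theta = np.hypot(u, v), np.arctan2(v, u)
    inside = rho <= 1.0
    if mode == "Z4":                          # defocus (Noll)
        z = np.sqrt(3.0) * (2.0 * rho**2 - 1.0)
    elif mode == "Z6":                        # astigmatism (Noll)
        z = np.sqrt(6.0) * rho**2 * np.cos(2.0 * theta)
    else:
        raise ValueError(mode)
    return amplitude * z * inside

# Two zones for two angularly separated sources; ~50 wraps echoes the modulus-50 grating
phi = zone_phase(384, 300, 250, "Z4", 50.0 * 2 * np.pi) \
    + zone_phase(384, 724, 250, "Z6", 50.0 * 2 * np.pi)
mask8 = np.round(255 * np.mod(phi, 2 * np.pi) / (2 * np.pi)).astype(np.uint8)
```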


The results of experiments using this model are shown in Fig. 16, where a curvature sensor was used to record both intrafocal and extrafocal images. The projection of the right extrafocal image in Fig. 16(a) is expanded due to defocus and will need to be combined with the reduced intrafocal image on the left of Fig. 16(a). The astigmatism aberration is also postprocessed, where each ROI is extracted, centered, and smoothed using a low-pass filter (LPF) before curvature sensor processing can be used to recover the wavefront from each source. The use of an LPF is necessary to ensure an even intensity distribution over the region of interest without compromising curvature features. Such image preprocessing is identical to that performed for single source images, where the spatial cut-off frequency is commensurate with the degree of modulus grating. Last, a time-series sequence of repeatable aberrations, such as defocus and astigmatism, was generated, and the corresponding phase maps were produced and sent to the LC-SLM. Recovery of each wavefront using this optical configuration provided a workable platform for a temporal study of the SVPSF.

Fig. 16. Curvature sensor (a) extrafocal and (b) intrafocal images of a binary source object subjected to individual phase aberrations in the pupil plane, i.e., defocus (Z4) and astigmatism (Z6). Note first that the defocus aberration enlarges the overall extrafocal image, where the ROI in (a) is highlighted with a red dotted circle, with respect to (b) the intrafocal image highlighted with a solid red circle; the characteristic astigmatism aberrations are correspondingly shown in aqua dotted and solid circles, respectively. Second, the induced tilt component discussed in this article is used to offset both aberrations, as is highlighted in both intra- and extrafocal images.

6. Conclusion and Future Work

In this article we have shown examples of practical optical test-benches that can be employed for open-loop wavefront propagation and analysis. In addition, we have shown that multiple source objects can be employed in the laboratory for the study of modal tomography, which requires multiple reference beacons for wavefront recovery over anisoplanatic regions. Future work will first include refining the quality of the wavefronts generated by the optical test bench, especially considering the limited resolution and spatial constraints of the system. Second, several applications are intended, including wavefront sensor design, DWFS, and modal tomography. Currently, work on image reconstruction using estimated SVPSFs is being conducted.

The authors thank Drs. Brian Vohnsen and Vyas Akondi from the Advanced Optical Imaging Group at University College Dublin for the use of an advanced version of their Shack Hartmann wavefront sensor code. In addition, we thank the reviewers of this article for their constructive comments and suggestions.

References


1. J. Primot, G. Rousset, and J. C. Fontanella, "Deconvolution from wave-front sensing: a new technique for compensating turbulence-degraded images," J. Opt. Soc. Am. A 7, 1598–1608 (1990).
2. R. K. Tyson, Introduction to Adaptive Optics (IEEE, 2000).
3. R. Ragazzoni, E. Marchetti, and G. Valento, "Adaptive-optics corrections available for the whole sky," Nature 403, 54–56 (2000).
4. T. Fusco, J.-M. Conan, L. M. Mugnier, V. Michau, and G. Rousset, "Characterization of adaptive optics point spread function for anisoplanatic imaging. Application to stellar field deconvolution," Astron. Astrophys. 142, 149–156 (2000).
5. S. J. Weddell and R. Y. Webb, "Reservoir computing for prediction of the spatially-variant point spread function," IEEE J. Sel. Top. Signal Process. 2, 624–634 (2008).
6. J. W. Goodman, Introduction to Fourier Optics (McGraw-Hill, 1968).
7. M. R. Banham and A. K. Katsaggelos, "Digital image restoration," IEEE Signal Process. Mag. 14, 24–41 (1997).
8. M. Born and E. Wolf, "Appendix VII, The circle polynomials of Zernike (9.2.1)," in Principles of Optics, 6th (corrected) ed. (Cambridge University, 1980), pp. 767–772.
9. R. J. Noll, "Zernike polynomials and atmospheric turbulence," J. Opt. Soc. Am. 66, 207–211 (1976).
10. M. C. Roggemann and B. Welsh, Imaging Through Turbulence (CRC Press, 1996).
11. G. D. Love, "Liquid crystal adaptive optics," in Adaptive Optics Engineering Handbook, R. K. Tyson, ed. (M. Dekker, 2000), pp. 273–285.
12. F. Roddier, "Curvature sensing and compensation: a new concept in adaptive optics," Appl. Opt. 27, 1223–1225 (1988).
13. B. C. Platt and R. V. Shack, "Lenticular Hartmann screen," Opt. Sci. Cent. Newsl. 5, 15–16 (1971).
14. L. Hu, L. Xuan, Z. Cao, Q. Mu, D. Li, and Y. Liu, "A liquid crystal atmospheric turbulence simulator," Opt. Express 14, 11911–11918 (2006).
15. G. I. Taylor, "Statistical theory of turbulence," Proc. R. Soc. London Ser. A 151, 421–444 (1935).
16. R. M. Basavaraju, V. Akondi, S. J. Weddell, and R. P. Budihal, "Myopic aberrations: simulation based comparison of curvature and Hartmann Shack wavefront sensors," Opt. Commun. 312, 23–30 (2014).
17. S. J. Weddell, "Optical wavefront prediction with reservoir computing," Ph.D. thesis (Department of Electrical and Computer Engineering, University of Canterbury, 2010).


