Efficient freeform lens optimization for computational caustic displays

Gerwin Damberg1,∗ and Wolfgang Heidrich2,1

1 Department of Computer Science, University of British Columbia, Canada
2 Visual Computing Center, King Abdullah University of Science and Technology, Saudi Arabia
∗ [email protected]

Abstract: Phase-only light modulation shows great promise for many imaging applications, including future projection displays. While images can be formed efficiently by avoiding per-pixel attenuation of light, most projection efforts utilizing phase-only modulators are based on holographic principles, which rely on the interference of coherent laser light and a Fourier lens. Limitations of this type of approach include scaling to higher power as well as visible artifacts such as speckle and image noise. We propose an alternative approach: operating the spatial phase modulator with broadband illumination by treating it as a programmable freeform lens. We describe a simple optimization approach for generating phase modulation patterns, or freeform lenses, that, when illuminated by a collimated, broadband light source, will project a pre-defined caustic image on a designated image plane. The optimization procedure is based on a simple geometric optics image formation model and can be implemented in a computationally efficient manner. We perform simulations and show early experimental results which suggest that the implementation on a phase-only modulator can create structured light fields suitable, for example, for efficient illumination of a spatial light modulator (SLM) within a traditional projector. In an alternative application, the algorithm provides a fast way to compute geometries for static freeform lens manufacturing.

© 2015 Optical Society of America

OCIS codes: (080.4225) Nonspherical lens design; (120.2040) Displays; (100.3190) Inverse problems; (110.1758) Computational imaging.

References and links
1. L. Lesem, P. Hirsch, and J. Jordan, “The kinoform: a new wavefront reconstruction device,” IBM J. Res. Dev. 13, 150–155 (1969).
2. P. R. Haugen, H. Bartelt, and S. K. Case, “Image formation by multifacet holograms,” Appl. Opt. 22, 2822–2829 (1983).
3. G. Damberg, H. Seetzen, G. Ward, W. Heidrich, and L. Whitehead, “3.2: High dynamic range projection systems,” in “SID Symposium Digest of Technical Papers,” (Wiley Online Library, 2007), vol. 38, pp. 4–7.
4. M. Berry, “Oriental magic mirrors and the laplacian image,” Eur. J. Phys. 27, 109 (2006).
5. M. Papas, W. Jarosz, W. Jakob, S. Rusinkiewicz, W. Matusik, and T. Weyrich, “Goal-based caustics,” in “Computer Graphics Forum,” (Wiley Online Library, 2011), vol. 30, pp. 503–511.
6. T. Kiser, M. Eigensatz, M. M. Nguyen, P. Bompas, and M. Pauly, Architectural Caustics – Controlling Light with Geometry (Springer, 2013).
7. Y. Schwartzburg, R. Testuz, A. Tagliasacchi, and M. Pauly, “High-contrast computational caustic design,” ACM T. Graphic. 33, 74 (2014).
8. Y. Yue, K. Iwasaki, B.-Y. Chen, Y. Dobashi, and T. Nishita, “Pixel art with refracted light by rearrangeable sticks,” in “Computer Graphics Forum,” (Wiley Online Library, 2012), vol. 31, pp. 575–582.


9. Y. Ohno, “Color rendering and luminous efficacy of white led spectra,” in “Optical Science and Technology, the SPIE 49th Annual Meeting,” (International Society for Optics and Photonics, 2004), pp. 88–98.

1. Introduction

In this work we propose to use phase-only spatial light modulation combined with broadband illumination for image formation. We achieve this by treating the spatial phase modulator as a programmable freeform lens, and devising a simple and computationally efficient optimization procedure to derive a lens surface or modulation pattern that will form a caustic representing a predefined target image when illuminated by a collimated, broadband light source. Our research draws from a number of different research fields, including holography and goal-based caustics.

1.1. Holographic displays

Early holographic image formation models [1] have been adapted to create digital holograms [2]. Most of the common approaches require coherent light, which has several disadvantages. Coherent light can result in high resolution artifacts, including screen speckle and diffraction on structures such as the discrete pixel grid of an SLM. On the other hand, using broadband light sources for illumination can eliminate screen speckle, but is not feasible for holography. Phase patterns used in holography typically contain high spatial frequencies, while low-frequency phase modulation patterns would help in mitigating diffraction artifacts. Finally, scaling holography-based approaches cost-efficiently to high power is currently not feasible due to their incompatibility with broadband light sources as well as the poor beam quality of high power diode lasers.

1.2. Freeform lenses

Recently, there has been strong interest in freeform lens design, both for general lighting applications and also to generate images from caustics [4]. In the latter application, we can distinguish between discrete optimization methods that work on a pixelated version of the problem (e.g. [5]), and those that optimize for continuous surfaces without obvious pixel structures (e.g. [6, 7, 8]). The current state of the art [8] defines an optimization problem on the gradients of the lens surface, which then have to be integrated up into a height field. In addition to low computational performance, this leads to a tension between satisfying a data term (the target caustic image) and maintaining the integrability of the gradient field.

Our goal is to derive an efficient algorithm to compute freeform lens patterns for dynamic phase modulation on an SLM that produce images when illuminated with non-coherent, broadband light. While diffraction will occur off the SLM (or off any small, pixelated grid), we expect the resulting diffraction artifacts to be averaged out by the broadband nature of the illumination, resulting in a small amount of blur that can be modeled and compensated for [3]. In our work we derive a simple and efficient formulation in which we optimize directly for the phase function (i.e. the shape of the wavefront in the lens plane) without the need for a subsequent integration step. This is made possible by a new parameterization of the problem that allows us to express the optimization directly in the lens plane rather than the image plane.

2. Freeform lensing

2.1. Phase modulation image formation

To derive the image formation model for a phase modulator, we consider the geometry shown in Fig. 1: a lens plane and an image plane (screen) are parallel to each other at focal distance f. Collimated light is incident at the lens plane from the normal direction, but a phase modulator in the lens plane distorts the phase of the light, resulting in a curved phase function p(x) which corresponds to a local deflection of the light rays.


Fig. 1. Geometry for image formation model: Phase modulation in lens plane at focal distance f from image plane resulting in curvature of the wavefront (phase function p(x)).

With the paraxial approximation sin φ ≈ φ we obtain the following equation for the mapping between x on the lens plane and u on the image plane:

$$u(x) \approx x + f \cdot \nabla p(x). \tag{1}$$

Using the above geometric mapping, we derive the intensity change associated with this distortion as follows. Let dx be a differential area on the lens plane, and let du = m(x) · dx be the differential area of the corresponding region in the image plane, where m(·) is a spatially varying magnification factor. The intensity on the image plane is then given as

$$i(u(x)) = \frac{dx}{du}\, i_0 = \frac{1}{m(x)}\, i_0, \tag{2}$$

where i_0 is the intensity of the collimated light incident at the lens plane. In the following we set i_0 = 1 for simplicity of notation.


Fig. 2. Intensity change due to distortion of a differential area dx.

The magnification factor m(·) can be expressed in terms of the derivatives of the mapping between the lens and image planes (also compare Fig. 2):

$$m(x) = \left\| \frac{\partial}{\partial x} u(x) \times \frac{\partial}{\partial y} u(x) \right\| \approx 1 + f \cdot \nabla^2 p(x). \tag{3}$$

This yields the following expression for the intensity distribution in the image plane:

$$i\bigl(x + f \cdot \nabla p(x)\bigr) = \frac{1}{1 + f \cdot \nabla^2 p(x)}. \tag{4}$$

In other words, the magnification m, and therefore the intensity i(u) on the image plane, can be directly computed from the Laplacian of the scalar phase function in the lens plane.
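To make Eq. (4) concrete, the following Python/NumPy sketch (our illustration, not code from the paper) predicts the caustic intensity for a given discrete phase pattern by applying a standard five-point Laplacian; the grid spacing, focal distance and example phase are assumed values chosen only for demonstration.

import numpy as np

def predicted_intensity(p, f, dx=1.0):
    # Geometric-optics forward model of Eq. (4): i ~ 1 / (1 + f * laplacian(p)).
    # p: 2D phase function sampled on the lens plane (units of length),
    # f: focal distance, dx: sample spacing on the lens plane (assumed values).
    pp = np.pad(p, 1, mode="edge")
    lap = (pp[:-2, 1:-1] + pp[2:, 1:-1] +
           pp[1:-1, :-2] + pp[1:-1, 2:] -
           4.0 * pp[1:-1, 1:-1]) / dx**2
    return 1.0 / (1.0 + f * lap)

# Example: a weak parabolic (converging) phase brightens the mapped image center.
y, x = np.mgrid[-32:32, -32:32].astype(float)
i_pred = predicted_intensity(-1e-4 * (x**2 + y**2), f=100.0)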

2.2. Optimization problem

While it is possible to directly turn the image formation model from Eq. (4) into an optimization problem, we found that we can achieve better convergence by first linearizing the equation with a first-order Taylor approximation, which yields

$$i\bigl(x + f \cdot \nabla p(x)\bigr) \approx 1 - f \cdot \nabla^2 p(x), \tag{5}$$

where the left hand side can be interpreted as a warped image i_p(x) = i(x + f · ∇p(x)), for which the target intensity i(u) in the image plane has been warped backwards onto the lens plane using the geometric distortion u(x) produced by a known phase function p(x). With this parameterization, the continuous least-squares optimization problem for determining the desired phase function becomes

$$\hat{p}(x) = \underset{p(x)}{\operatorname{argmin}} \int_x \bigl( i_p(x) - 1 + f \cdot \nabla^2 p(x) \bigr)^2 \, dx. \tag{6}$$

This problem can be solved by iterating between updates to the phase function and updates to the warped image, as shown in Algorithm 1. The algorithm is initialized with the target image intensity. From this, the first phase pattern is computed, which in turn is used to warp the original target image intensity to provide a distorted intensity image for use in the next iteration.

Algorithm 1 Freeform lens optimization
  // Initialization
  i_p^(0)(x) = i(u)
  while not converged do
    // phase update
    p^(k)(x) = argmin_{p(x)} ∫_x ( i_p^(k-1)(x) − 1 + f · ∇² p(x) )² dx
    // image warp
    i_p^(k)(x) = i(x + f · ∇ p^(k)(x))
  end while

After discretization of i(·) and p(·) into pixels, the phase update corresponds to solving a linear least-squares problem with a discrete Laplace operator as the system matrix. We can solve this positive semi-definite system using a number of different algorithms, including Conjugate Gradient, BiCGSTAB and Quasi-Minimal Residual (QMR). The image warp corresponds to a texture mapping operation and can be implemented on a GPU. We implement a non-optimized prototype of the algorithm in the Matlab programming environment using QMR as the least-squares solver.
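The pseudocode above maps closely onto standard numerical tooling. The following Python/NumPy/SciPy sketch is our illustrative re-implementation, not the authors' Matlab/QMR prototype: it discretizes the phase update as a sparse least-squares problem with a five-point Laplacian and performs the backward warp with bilinear interpolation. The function names, the LSQR solver, the boundary handling, the grid spacing and the iteration count are all our assumptions.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla
from scipy.ndimage import map_coordinates

def laplacian_matrix(h, w, dx=1.0):
    # Sparse five-point Laplacian on an h x w grid (Neumann-style borders).
    def d2(n):
        d = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)).tolil()
        d[0, 0] = -1.0
        d[n - 1, n - 1] = -1.0
        return d.tocsr() / dx**2
    return sp.kron(d2(h), sp.eye(w)) + sp.kron(sp.eye(h), d2(w))

def freeform_lens(target, f, iters=5, dx=1.0):
    # Iterative phase optimization following Algorithm 1; returns phase p(x).
    h, w = target.shape
    L = laplacian_matrix(h, w, dx)
    i_p = target.copy()                      # initialization: i_p^(0) = i
    p = np.zeros_like(target)
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    for _ in range(iters):
        # Phase update: minimize || f * L p - (1 - i_p) ||^2 (linearized Eq. 6).
        rhs = (1.0 - i_p).ravel()
        p = spla.lsqr(f * L, rhs, iter_lim=2000)[0].reshape(h, w)
        # Image warp: i_p(x) = i(x + f * grad p(x)), backward warp of the target.
        gy, gx = np.gradient(p, dx)
        i_p = map_coordinates(target, [ys + f * gy / dx, xs + f * gx / dx],
                              order=1, mode="nearest")
    return p

One practical caveat (our assumption, not stated in the paper): with these boundary conditions the discrete Laplacian cannot reproduce a right-hand side with non-zero mean, so the target image is best normalized to unit mean intensity before optimization, consistent with i_0 = 1.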


Fig. 3. Algorithm progression for six iterations: the target i gets progressively distorted by backwards warping onto the lens plane (i_p^(k)) as the phase function p^(k) converges towards a solution. The 3D graphic depicts the final lens height field.

Table 1 shows run times for Algorithm 1 and a selection of artificial and natural test images at different resolutions. It was executed on a single core of a mobile Intel Core i7 clocked at 1.9 GHz with 8 GByte of memory. We note that due to the continuous nature of the resulting lens surfaces, computation of the phase at resolutions as low as 128 × 64 is sufficient for applications such as structured illumination in a projector. We also note that the algorithm could, with slight modifications, be rewritten as a convolution in the Fourier domain, which would result in orders of magnitude shorter computation times for single-threaded CPU implementations and even further speed-ups on parallel hardware such as GPUs (a minimal sketch of such a Fourier-domain update follows Table 1). With these improvements, computations at, for example, 1920 × 1080 resolution will be possible at video frame rates. In addition, both the contrast of the resulting caustic image and its sharpness (effective resolution) benefit from a higher working resolution.

The progression of this algorithm is depicted in Fig. 3. We show the undistorted target image, from which we optimize an initial phase function. Using this phase function, we update the target image in the lens plane by backward warping the image-plane target. This process increasingly distorts the target image for the modulator plane as the phase function converges. The backward warping step implies a non-convex objective function, but we empirically find that we achieve convergence in only a small number of iterations (5–10).

Table 1. Run times of Algorithm 1 using five iterations for a set of different test images and image resolutions.

Image   Resolution   Runtime
Logo    128 × 64     2.62 s
Lena    128 × 64     2.14 s
Wave    128 × 64     1.81 s
Logo    256 × 128    4.03 s
Lena    256 × 128    4.75 s
Wave    256 × 128    3.23 s
Logo    512 × 256    9.37 s
Lena    512 × 256    10.22 s
Wave    512 × 256    5.27 s
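As referenced above, the linearized phase update essentially asks for ∇²p ≈ (1 − i_p)/f, a Poisson-type problem that can be solved by convolution/deconvolution in the Fourier domain. The sketch below is our own illustration of that remark, assuming periodic boundary conditions and a zero-mean right-hand side; it is not the authors' implementation.

import numpy as np

def phase_update_fft(i_p, f, dx=1.0):
    # Solve laplacian(p) = (1 - i_p) / f via FFT (periodic boundaries assumed).
    h, w = i_p.shape
    rhs = (1.0 - i_p) / f
    rhs -= rhs.mean()                      # solvability: zero-mean source term
    ky = 2.0 * np.pi * np.fft.fftfreq(h, d=dx)
    kx = 2.0 * np.pi * np.fft.fftfreq(w, d=dx)
    # Eigenvalues of the periodic five-point Laplacian.
    lam = (2.0 * np.cos(ky[:, None] * dx) - 2.0 +
           2.0 * np.cos(kx[None, :] * dx) - 2.0) / dx**2
    lam[0, 0] = 1.0                        # avoid division by zero at DC
    p_hat = np.fft.fft2(rhs) / lam
    p_hat[0, 0] = 0.0                      # fix the free constant offset
    return np.real(np.fft.ifft2(p_hat))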


3. Simulation results

We evaluate the performance of our algorithm by utilizing two different simulation techniques: a common computer graphics ray tracer, and a wavefront model based on the Huygens principle to simulate diffraction effects at a spectral resolution of 5 nm.

3.1. Ray tracer simulation

For the ray tracer simulation we use the LuxRender framework, an unbiased, physically-based rendering engine for the Blender tool. The setup of the simulation is quite straightforward: the freeform lens is imported as a mesh, and material properties are set to mimic a physical lens manufactured out of acrylic.

Fig. 4. LuxRender simulation results of a caustic image caused by an acrylic freeform lens. The inset shows the absolute intensity difference between the simulated and original image, where the original image is encoded in the interval [0, 1], and 0 in the difference map (green) means no difference. There are three possible sources of error: reflections off the edges of the physically thick lens (vertical and horizontal lines), misalignment and scaling of the output relative to the original (manual alignment), and the nature of the light source (not perfectly collimated).

A distant spot light provides approximately collimated illumination, and a white surface with Lambertian reflectance properties serves as the screen. The linear, high dynamic range data output from the simulation is tone mapped for display. The results (see Fig. 4) visually match the target well.

3.2. Physical optics simulation

To analyze possible diffraction effects that cannot be modeled in a ray tracer based on geometric optics principles, we perform a wave optics simulation based on the Huygens principle. We compute a freeform lens surface for a binary test image (see Fig. 5) and illuminate it in simulation with light from a common 3-LED (RGB) white light source (see Fig. 6, dotted line) in 5 nm steps. We integrate over the spectrum using the luminous efficiency of the LED and the spectral sensitivity curves of the CIE color matching functions (see Fig. 6, solid line), as well as a 3×3 transformation matrix and a 2.2 gamma, to map tristimulus values to display/print RGB primaries for each LED die and for the combined white light source (see Fig. 7).


Fig. 5. Binary test pattern (left) and resulting lens height field (right) used in the wave optics simulation.

Fig. 6. Spectra of standard white 3-LED (RGB) [9] (dotted graph) and the CIE standard observer color matching functions (solid graph) used in the wave optics simulation.

As expected, the wavefront simulation reveals chromatic aberrations within the pattern and diffraction off the edge of the modulator, which can be (partially) mitigated, for example, by computing separate lens surfaces for each of R, G and B.

4. Experimental results

In addition to the simulations, we report on early experimental results using the computed freeform lenses in a static (acrylic, physical lens) and a programmable (dynamically addressable phase modulator) fashion.

4.1. Static lenses

For refractive lens surfaces, the phase function p(x) is converted to a geometric model describing the lens shape. We design a lens that is flat on one side and has a freeform height field h(x) on the other side. In the (x, z) plane, the deflection angle φ is related to the incident (θ_i) and exitant (θ_o) angles at the height field as follows:

$$\frac{\partial p(x)}{\partial x} \approx \phi = \theta_o - \theta_i. \tag{7}$$

The analogous relationship holds in the (y, z) plane. In addition, the lens material has a refractive index of n. Using Snell's law, and again the paraxial approximation, we obtain

$$\frac{1}{n} = \frac{\sin \theta_i}{\sin \theta_o} \approx \frac{\theta_i}{\theta_o}. \tag{8}$$

Using Eqs. (7) and (8), as well as θ_i ≈ ∂h(x)/∂x, we can derive the lens shape as

$$h(x) = h_0 + \frac{1}{n-1}\, p(x), \tag{9}$$


Fig. 7. Wave optics simulation for a test lens using standard white 3-LED (RGB) spectra: (a) red LED, (b) green LED, (c) blue LED, (d) combined (RGB) white LED. The simulation was performed at 5 nm intervals and mapped to an RGB color space for print.

Fig. 8. 3D printed refractive lens (left), broadband LED spotlight and rear-projection screen with image (right). Differences from the simulation results in Fig. 4 partially stem from an increased beam divergence compared to an ideal light source, as well as limitations in the manufacturing process (3D printer).

where h_0 is a base thickness for the lens. Figure 8 shows a prototype of a 3D printed (42 µm resolution) lens. Improved results and longer focal lengths can be achieved using other fabrication methods [7]. A minimal sketch of this phase-to-height conversion follows.
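The following short Python snippet is our illustration of Eq. (9): it converts an optimized phase function into a refractive lens height field for a given refractive index. The acrylic-like index of 1.49 and the base thickness are assumed values for illustration only.

import numpy as np

def phase_to_height(p, n=1.49, h0=2e-3):
    # Eq. (9): convert phase function p(x), expressed as an optical path
    # difference in units of length, to a lens height field h(x) = h0 + p/(n-1).
    return h0 + np.asarray(p) / (n - 1.0)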

4.2. Implementation on spatial light modulators

The phase function p(x) can be directly implemented on a phase-only modulator: in our experiment, an LCoS-based SLM with a pixel pitch of 8.0 µm and a maximum phase retardation of 2π (the PLUTO SLM by HOLOEYE Photonics AG). Since most high-contrast images, for focal lengths reasonably far from the modulator, require lens thicknesses of multiple wavelengths, we wrap the phase from Fig. 3 at multiples of 2π, comparable to the grooves of a Fresnel lens (see Fig. 9, left); a minimal sketch of this wrapping step is given below. A broadband white LED spotlight provides collimated light on the reflective phase modulator, and we observe the resulting image on a small Lambertian screen (Fig. 9, right).
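As a rough illustration of the wrapping step (our sketch, not the vendor's driver code), the snippet below converts the optical path difference into a phase wrapped to [0, 2π) for a chosen design wavelength and quantizes it to the modulator's addressable levels; the 532 nm wavelength and 8-bit quantization are assumptions.

import numpy as np

def wrap_phase_for_slm(p, wavelength=532e-9, levels=256):
    # Convert optical path difference p(x) (meters) to phase, fold it at
    # multiples of 2*pi (Fresnel-like grooves), and quantize for the SLM.
    phase = 2.0 * np.pi * np.asarray(p) / wavelength
    wrapped = np.mod(phase, 2.0 * np.pi)
    return np.round(wrapped / (2.0 * np.pi) * (levels - 1)).astype(np.uint8)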

5. Discussion

We introduce a novel, computationally inexpensive method to compute freeform lenses and propose a new implementation for applications requiring dynamic updates.


Fig. 9. Left: Wrapped phase function p(x) from Fig. 3. Right: Experimental test set-up using a phase modulator, a partially collimated, low power white LED light source and a small front projection screen. Light from an LED reflects off the phase modulator at an angle of 5 degrees and onto a projection screen at the focal distance of our lens computation. The resulting caustic image resembles the target image, but appears slightly blurry, distorted and at lower contrast compared to the simulations. This can be attributed to the broadband nature of the light (the image would be sharper with separate lenses for red, green and blue LED light) and the limited collimation of the incoming beam.

Wavefront and ray-tracer simulations as well as experiments show promising results. However, several improvements in computation time, contrast of the resulting caustic images, and sharpness are possible. An implementation of the algorithm on the GPU will allow for a shorter runtime, making more iterations and higher working resolutions possible. We anticipate this will further improve the contrast and sharpness of the results. Our current implementation of the algorithm results in smooth phase patterns. As part of future work we plan to investigate whether allowing for steep gradients in the phase pattern (e.g. sharp ridges and valleys) would lead to higher-contrast results for certain images. The wavefront simulation results were computed for three separate phase patterns for the spectra of red, green and blue LEDs, then combined into one white light field, an implementation common in projection systems. The experiments, however, due to hardware availability, were performed using a single broadband LED for both the physical lens and the phase modulator implementation. The use of separate color LEDs for red, green and blue illumination in the experiments would produce images with higher contrast and sharper edges. Finally, better light collimation optics will further improve the results.

Acknowledgments

Research reported in this publication was supported by MTT Innovation Inc., NSERC, and the King Abdullah University of Science and Technology (KAUST).

