Wavelet-based depth-of-field extension, accurate autofocusing, and particle pairing for digital inline particle holography

Wu Yingchun,1 Wu Xuecheng,1,* Yang Jing,1 Wang Zhihua,1 Gao Xiang,1 Zhou Binwu,1 Chen Linghong,1 Qiu Kunzan,1 Gérard Gréhan,2 and Cen Kefa1

1State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou 310027, China

2UMR 6614/CORIA, LABEX EMC3, Centre National de la Recherche Scientifique, Université et Institut National des Sciences Appliquées de Rouen, Site du Madrillet, Avenue de l'Université, BP 12, 76801 Saint Etienne du Rouvray, France

*Corresponding author: [email protected]

Received 2 August 2013; revised 19 October 2013; accepted 27 November 2013; posted 11 December 2013 (Doc. ID 194697); published 24 January 2014

Depth-of-field extension and accurate 3D position location are two important issues in digital holography for particle characterization and motion tracking. We propose a method of locating the axial positions of both opaque and transparent objects in the reconstructed 3D field in the wavelet domain. The spatial–frequency property of the reconstructed image is analyzed from the viewpoint of the point spread function of digital inline holography. The reconstructed image is decomposed into high- and low-frequency subimages. By using the variance of the image gradient in the subimages as focus metrics, the depth-of-field of the synthesis image can be extended with all the particles focalized, and the focal plane of the object can be accurately determined. The method is validated by both simulated and experimental holograms of transparent spherical water droplets and opaque nonspherical coal particles. The extended-focus image is applied to particle pairing in digital holographic particle tracking velocimetry to obtain the 3D vector field. © 2014 Optical Society of America

OCIS codes: (090.1995) Digital holography; (090.0090) Holography; (090.2880) Holographic interferometry; (100.6890) Three-dimensional image processing; (180.6900) Three-dimensional microscopy.

http://dx.doi.org/10.1364/AO.53.000556

1. Introduction

Digital holography has been demonstrated to be a promising technique for particle characterization [1–3] and 3D flow diagnostics [4], since it can simultaneously extract the 3D position and size distribution (cross-sectional morphology) [5], the object phase [6], and even the 3D motion field by tracking individual seeded particles [7–14]. For particle measurement with digital inline holography, there are two important issues: depth-of-field extension and accurate autofocusing. First, for a hologram of particle clouds dispersed at different depth positions, each reconstructed plane image in the plane-by-plane refocusing strategy has only a limited depth-of-field, in which only part of the reconstructed particles are focalized. Second, the axial position must be measured in the presence of various noise sources, since accurately determining the focus plane is a prerequisite for extracting particle shape and size and for 3D motion tracking. Thus, digital inline particle holography badly needs depth-of-field extension and accurate focus-detection techniques.

The formation of the reconstructed image in digital inline holography comprises digital recording followed by numerical reconstruction, and can be described as a linear shift-invariant system from a diffraction viewpoint [1]. The process can be characterized by the point spread function (PSF) [15–20], which has been widely used to characterize optical imaging systems. The PSF of digital holography is affected not only by the optical recording but also by the numerical reconstruction algorithm; it therefore differs from that of a traditional optical imaging system, because the in-focus object is reconstructed by numerical diffraction of the hologram rather than by propagation through the pupil of a lens. The reconstructed image can be described by the convolution of the actual object and the PSF [17]. In the in-focus reconstructed plane, the PSF tends to a Dirac function, and the reconstructed image approaches the actual object, with sharp intensity gradients at the border. The transform of the focal image into the frequency domain accordingly has large coefficients in the high-frequency bands, corresponding to a sharply detailed image. In the out-of-focus reconstructed planes, the pseudo-PSF usually has the properties of a low-pass filter [18]. The defocused image in an out-of-focus plane therefore has the characteristics of the convolution of the in-focus image with a low-pass filter: the filter increases the energy in the low-frequency bands and simultaneously suppresses the energy in the high-frequency bands, which blurs the reconstructed image. The intensity declines gradually at the boundary of the object, producing a smoothing effect in the image. Since the bandwidth of the PSF increases with the defocus distance from the in-focus plane, the low-frequency signal is proportional to that distance. The more energy there is in the low-frequency bands, the blurrier the image is, and this causes difficulty in determining the focal plane of the reconstructed object.

Since depth-of-field extension and accurate axial location of particles are of vital importance in digital inline particle holography, numerous algorithms have been proposed. For depth-of-field extension, the reconstruction methods have been modified to produce an extended-focus image directly in the reconstruction process for special conditions [21–24], such as tilted objects. The extended-focus image can also be created from the reconstructed 3D optical field. Antkowiak et al. [25] obtained the extended-focus image of a microparticle field based on the local integrated amplitude modulus using a strategy of small overlapping windows. McElhinney et al. [26] created an extended-focus image of macroscopic objects from the reconstructed 3D field using a depth-from-focus algorithm. Chen et al. [27] employed a pixel-based entropy method to extend the depth-of-focus. Bergoënd and co-workers [28,29] applied depth-of-field extension methods from classical microscopy, formulated in the spatial domain, to digital holographic microscopy.

For the determination of a particle's 3D position, the core idea shared by all methods is to characterize some property of the reconstructed object at the focused plane. The existing methods can be classified into two categories: spatial-domain methods, such as intensity [30], entropy [27,31], intensity gradient or variance [3,13,32,33], image correlation [34,35], and constrained least-squares filtering techniques [36], and frequency-domain methods, such as the sparsity of a Fresnelet decomposition [37] and weighted spectral analysis in the Fourier transform domain [32]. The entropy, variance, and correlation methods measure local statistical properties of the particle, whereas the image gradient method measures the sharpness of the particle's boundaries. Furthermore, inverse-problem approaches have been proposed to yield the optimal depth location [38], including sparsity constraints [39] and compressive holography [40,41], but multiple iterations are usually required to obtain globally optimized results. In addition, the 3D position of a particle can be retrieved by directly analyzing the hologram fringes without reconstruction [42–44].

The 3D velocity field can be obtained from the detected 3D particle field. The approaches can be classified into the following categories: 3D-correlation-based analysis [8], tracking of individual particles [10–14], and hybrid algorithms combining planar PIV and PTV [9]. In these methods, determining the 3D positions of the particles is required before particle tracking and 3D vector evaluation, which is inevitably affected by the error in the depth position.

The previous studies mentioned above have provided valuable progress, but depth-of-field extension and accurate autofocusing remain challenges for particle field measurements with digital inline holography. The in-focus reconstructed image contains more high-frequency content and less low-frequency content than the out-of-focus image in the region near the edge of the object. This property is reflected in both the spatial domain and the spatial–frequency domain. Thus, the axial position of the object can be measured in the spatial–frequency domain as well as in the spatial domain. This property is independent of the object and can be used to locate both absorbing opaque particles and pure phase droplets. The wavelet transform, with wavelet bases of varying frequency and limited duration, provides a local spatial–frequency analysis of the image and has been used in image fusion [45] and in the focus measurement of natural images [46,47].

In this work, a wavelet-based algorithm is proposed to extend the depth-of-field and to locate the 3D position in digital inline particle holography. We first describe the detailed procedures of the depth-of-field extension and the autofocusing method. Then the proposed algorithm is demonstrated with experimental holograms of nonspherical coal particles and water spray droplets. In addition, the application of the extended-focus image to particle pairing in digital holographic particle tracking velocimetry (DHPTV) is presented.


2. Methodology

A. Hologram Recording and Reconstruction

The formation of a digital inline particle hologram is shown in Fig. 1. With a collimated laser beam illuminating the particles, the scattered light, as the object wave, interferes with the undisturbed reference wave, and the interference pattern is recorded by the CCD to form a digital particle hologram. The holograms were reconstructed with the wavelet reconstruction method [17]. Let $I(x, y, z)$ denote the intensity of the reconstructed 3D particle field. Each plane image can be reconstructed through a convolution of the hologram and the wavelet function,

$$I(x, y, z) = [1 - I_{holo}(x, y)] \otimes \psi_z(x, y), \qquad (1)$$

where $I_{holo}(x, y)$ is the particle hologram and $\otimes$ denotes the convolution. The wavelet function is

$$\psi_z(x, y) = \frac{\pi}{\lambda z}\left[\sin\!\left(\frac{\pi (x^2 + y^2)}{\lambda z}\right) - M_\psi\right]\exp\!\left[-\frac{\pi (x^2 + y^2)}{\lambda z \sigma^2}\right],$$

with $M_\psi = \sigma^2/(1 + \sigma^4)$ to ensure a zero mean value of $\psi_z(x, y)$. Here $\sigma$ is the width parameter of the windowing function $\exp[-\pi (x^2 + y^2)/(\lambda z \sigma^2)]$, and it can be determined as

$$\sigma = \min\!\left\{\frac{N \delta_{ccd}}{2}\sqrt{\frac{\pi}{\lambda z \ln \varepsilon^{-1}}},\; \frac{1}{2\delta_{ccd}}\sqrt{\frac{\pi \lambda z}{\ln \varepsilon^{-1}}}\right\},$$

where $N$ and $\delta_{ccd}$ are the resolution and pixel size of the CCD, respectively, and $\varepsilon$ is a small constant with a value of 0.01 in this work.

Fig. 1. Experimental setup of digital inline particle holography.
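Equation (1) lends itself to an FFT-based implementation. The fragment below is only a minimal numerical sketch, not the authors' code: the function and variable names are ours, the hologram is assumed to be a background-normalized square array, and only the CCD-aperture branch of the expression for σ is implemented.

```python
import numpy as np

def wavelet_kernel(n, pix, lam, z, eps=0.01):
    """Reconstruction wavelet psi_z of Eq. (1) on an n x n grid.

    n: pixels per side (N); pix: pixel size (delta_ccd); lam: wavelength;
    z: reconstruction depth; eps: window cut-off level (0.01 in the paper).
    """
    # width parameter of the Gaussian window: the window decays to eps at the
    # edge of the CCD aperture (first branch of the min() expression only)
    sigma = 0.5 * n * pix * np.sqrt(np.pi / (lam * z * np.log(1.0 / eps)))
    m_psi = sigma**2 / (1.0 + sigma**4)        # enforces a zero-mean wavelet
    x = (np.arange(n) - n / 2) * pix
    xx, yy = np.meshgrid(x, x)
    r2 = xx**2 + yy**2
    return (np.pi / (lam * z)) * (np.sin(np.pi * r2 / (lam * z)) - m_psi) \
           * np.exp(-np.pi * r2 / (lam * z * sigma**2))

def reconstruct_slice(hologram, pix, lam, z):
    """One reconstructed plane I(x, y, z) = [1 - I_holo] conv psi_z, Eq. (1)."""
    psi = wavelet_kernel(hologram.shape[0], pix, lam, z)
    # circular convolution via the FFT; a zero-padded linear convolution
    # could be substituted if wrap-around artifacts matter
    return np.real(np.fft.ifft2(np.fft.fft2(1.0 - hologram) *
                                np.fft.fft2(np.fft.ifftshift(psi))))

# illustrative use: a stack of slices every 500 um between 9 and 16 cm
# holo = ...  # normalized hologram, 2D float array
# stack = [reconstruct_slice(holo, 6.45e-6, 632.8e-9, z)
#          for z in np.arange(0.09, 0.16 + 1e-9, 500e-6)]
```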

B. Depth-of-Field Extension

A series of plane images is reconstructed, with the particles focalized in image slices at different depth positions. To extend the depth-of-field and focalize all the particles in a single synthesis image, the region-based depth-of-field extension algorithm, which uses the local variance of the gradient of the subimages in the wavelet domain as a criterion, proceeds as follows:

(1) Decompose each reconstructed image into four subimages using the discrete wavelet transform: HH, HL, LH, and LL. In each level, the coefficients of HH, HL, and LH contain the detailed subimages in the high-frequency bands, and the coefficients of LL form the approximation image in the low-frequency band. In this paper, the image is decomposed to the first level ($l = 1$).

(2) Compute the local variance of the intensity gradient in both the low- and high-frequency bands. The Sobel operator is applied to both the low-frequency subimage LL and the high-frequency subimages HH, HL, and LH to detect the image gradient:

$$G_{x,h} = HL \otimes S_x + HH \otimes S_x, \qquad G_{y,h} = LH \otimes S_x' + HH \otimes S_x',$$
$$G_{x,l} = LL \otimes S_x, \qquad G_{y,l} = LL \otimes S_x', \qquad (2)$$

where

$$S_x = \begin{pmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{pmatrix}$$

is the Sobel operator and $S_x'$ denotes its transpose. The gradient magnitudes $G_h$ and $G_l$ in the high- and low-frequency bands are obtained as

$$G_h = \sqrt{G_{x,h}^2 + G_{y,h}^2}, \qquad G_l = \sqrt{G_{x,l}^2 + G_{y,l}^2}. \qquad (3)$$

The variance of the gradient magnitude is then computed as

$$\varepsilon_{H,z} = \sum_n \sum_m [G_h(n, m) - \bar{G}_h(n, m)]^2, \qquad \varepsilon_{L,z} = \sum_n \sum_m [G_l(n, m) - \bar{G}_l(n, m)]^2, \qquad (4)$$

where $\bar{G}_{H,L}(n, m)$ is the average of $G_{H,L}$ over the local region with a block size of $n \times m$.

(3) Obtain the decision maps in both the high- and low-frequency subimages with the maximum selection scheme: for each local region, pick as the fused wavelet coefficients the coefficients of the high- and low-frequency bands of the slice with the largest $\varepsilon_{H,z}$ and $\varepsilon_{L,z}$, respectively.

(4) Obtain the depth-of-field extended image, with all the reconstructed particles focalized, by applying the inverse discrete wavelet transform to the fused wavelet coefficients.
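As a concrete illustration of steps (1)–(4), the sketch below fuses a stack of reconstructed slices with a one-level discrete wavelet transform, using PyWavelets and SciPy. It is a simplified stand-in rather than the authors' implementation: the wavelet name, the way the three detail bands are combined before the Sobel filtering, and the axis conventions are our assumptions.

```python
import numpy as np
import pywt
from scipy import ndimage

def local_gradient_variance(img, block=15):
    """Local variance of the Sobel gradient magnitude, in the spirit of Eqs. (2)-(4)."""
    g = np.hypot(ndimage.sobel(img, axis=1), ndimage.sobel(img, axis=0))
    mean = ndimage.uniform_filter(g, size=block)
    return ndimage.uniform_filter((g - mean) ** 2, size=block)

def extend_depth_of_field(slices, wavelet='db2', block=15):
    """Fuse reconstructed slices into one extended-focus image.

    Step (1): one-level 2D DWT of every slice.
    Step (2): local gradient variance of the approximation (low-frequency)
              band and of the summed detail (high-frequency) bands.
    Step (3): maximum-selection decision maps for both bands.
    Step (4): inverse DWT of the fused coefficients.
    Returns the fused image and the low-frequency decision map (the index of
    the selected slice per block), reused later for particle detection.
    """
    coeffs = [pywt.dwt2(s, wavelet) for s in slices]      # (LL, (d1, d2, d3))
    eps_l = np.stack([local_gradient_variance(c[0], block) for c in coeffs])
    eps_h = np.stack([local_gradient_variance(c[1][0] + c[1][1] + c[1][2], block)
                      for c in coeffs])
    map_l = np.argmax(eps_l, axis=0)       # decision map, low-frequency band
    map_h = np.argmax(eps_h, axis=0)       # decision map, high-frequency bands
    ll = np.take_along_axis(np.stack([c[0] for c in coeffs]), map_l[None], 0)[0]
    details = tuple(
        np.take_along_axis(np.stack([c[1][k] for c in coeffs]), map_h[None], 0)[0]
        for k in range(3))
    return pywt.idwt2((ll, details), wavelet), map_l
```

With a stack such as the 141 slices of Section 3, `extend_depth_of_field(stack)` would return a synthesis image of roughly the original size together with a decision map at half resolution, the latter being a consequence of the one-level DWT.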

C. Autofocusing

After the depth-of-field extension, a suitable algorithm such as background subtraction can be applied, if necessary, to reduce the noise and to enhance the intensity contrast between the particles and the background in the depth-of-field extended image. The synthesis image can then be binarized with a proper threshold to separate the particles. The connectivity of the binarized image is checked and combined with the decision map from step (3), so that the reconstructed particles can be detected and the region of each particle labeled. A series of windowed regions covering each labeled particle, one per reconstructed slice, is selected as the region of interest (ROI) to determine the particle's depth position. Each ROI image is decomposed into four subimages using the discrete wavelet transform: HH, HL, LH, and LL. The particle can be located in both the low- and high-frequency bands. In this paper, the focus metric $\varepsilon_{H,z}$ is normalized between 0 and 1 for each particle. Since LL is the approximation of the image and contains most of the energy of the interrogation image, the focal plane can also be measured in the low-frequency subimage LL using previously proposed methods, such as the intensity variance and entropy.
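In code, the depth location of a single particle then reduces to evaluating the high-frequency focus metric for its ROI in every slice and taking the maximum. The sketch below is again only illustrative (the names are ours), and it computes the variance over the whole ROI rather than over local blocks, which is a simplification.

```python
import numpy as np
import pywt
from scipy import ndimage

def focus_metric_h(roi, wavelet='db2'):
    """epsilon_{H,z}: variance of the Sobel gradient magnitude of the
    high-frequency (detail) subimages of a ROI, cf. Eqs. (2)-(4)."""
    _, (d1, d2, d3) = pywt.dwt2(roi, wavelet)
    g = np.hypot(ndimage.sobel(d1 + d2 + d3, axis=1),
                 ndimage.sobel(d1 + d2 + d3, axis=0))
    return np.sum((g - g.mean()) ** 2)

def locate_depth(roi_stack, z_positions):
    """In-focus depth of one particle from its ROI crops over all slices."""
    eps = np.array([focus_metric_h(r) for r in roi_stack])
    eps = (eps - eps.min()) / (eps.max() - eps.min())   # normalize to [0, 1]
    return z_positions[int(np.argmax(eps))], eps        # depth and metric curve
```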

3. Results and Discussions

A. Depth-of-Field Extension

The algorithm was validated with experimental holograms of nonspherical, opaque coal particles and transparent, spherical water droplets. The experimental holograms were obtained with the typical digital inline holography setup shown in Fig. 1. A helium–neon laser beam with a wavelength of 632.8 nm passed through a spatial filter and was then expanded and collimated into a plane wave. The particle field was illuminated by the collimated beam, and the particle holograms were recorded by the CCD, which has 1394 × 1028 pixels with a square pixel size of 6.45 μm. Two particle fields, water spray droplets and a coal particle flow, were used in the experiments.

Figure 2(a) shows a recorded hologram of coal particles. A 3D optical field spanning depth positions from 9.0 to 16.0 cm was reconstructed from the hologram using the wavelet reconstruction method described above, and 141 slice images were obtained with a depth interval of 500 μm between two consecutive slices. Figure 2(b) shows the reconstructed image at a depth position of 12.3 cm. In this slice, only a few particles are in focus because of the small depth-of-field, and the images of the out-of-focus particles are blurred.

Fig. 2. (a) Experimental coal particle hologram and (b) the reconstructed plane image at a depth position of 12.3 cm.

The proposed depth-of-field extension algorithm was applied to the reconstructed plane images to combine the series of multifocus slice images into a single synthesis image. Figure 3(a) shows the depth-of-field extended image obtained with a block size of 15 × 15 for the computation of the local variance of the image gradient in both the low- and high-frequency subimages. This image presents a better visual effect: all the particles are focalized, with clear edges, sharp borders, and high contrast against the local background. Figure 3(b) shows the decision map of the fused wavelet coefficients of the low-frequency band, with the color proportional to the depth position. Comparing Fig. 3(a) with Fig. 3(b), for each particle in the depth-of-field extended image there is a corresponding region of uniform color in the decision map. This implies that the coefficients of the in-focus particle image are selected into the fused wavelet coefficients, and explains why all the particles are focalized in the synthesis image. It is also useful for particle detection and identification. The small patches with different colors in Fig. 3(b) are caused by noise in the slices at different depth positions.

To investigate the effect of the block size on the depth-of-field extended image, the entropy of the image, $I_{en} = -\sum_{i=0}^{255} P(i) \log P(i)$, where $P(i)$ denotes the probability of pixel gray value $i$ in the image, is employed as a quantitative criterion. The proposed depth-of-field extension algorithm was tested on the same reconstructed slice images with the block size varying from 5 × 5 to 35 × 35. The results show that, although the particles can be focalized in all the depth-of-field extended images, the entropy of the synthesis image decreases with increasing block size, and so does the visual effect.

The reconstructed particles can be detected and identified directly from the extended-focus image, without searching the whole 3D reconstructed field as in the conventional method. Furthermore, the particles can be sized using 2D image processing algorithms without locating the focus plane of each particle. Note that, in the framework of this algorithm, all the in-focus images of the particles are fused into the extended depth-of-field image. Particles will overlap each other if the particle density is too high, which causes further problems for particle separation and detection.

Fig. 3. (a) The depth-of-field extended image and (b) the decision map of the wavelet coefficient of the low-frequency band.

B. Autofocusing

For each detected and labeled particle, a ROI window was selected to determine the depth position, as shown in Fig. 2(b). Figures 4(a)–4(d) compare the local spatial–frequency properties of the in-focus and out-of-focus images of a ROI particle in Fig. 2(b) at different depth positions, using the discrete wavelet transform. For the in-focus particle image, the LL subimage is uniformly bright in the particle's region with high intensity contrast, and there is a sharp intensity gradient near the particle edge in the subimages of the high-frequency bands, as shown in Fig. 4(c). For the blurred out-of-focus image, the particle fades into the background, which smooths the intensity gradient, as shown in Fig. 4(d).

Figure 5 shows the criterion curves for focal-plane detection versus the depth position of the opaque nonspherical coal particles and the transparent water droplets. There is a global maximum at the position of the focal plane. It is worth mentioning that the curve is not monotonic with respect to the degree of defocus. This might be caused by spatial aliasing between the high-frequency content (the particle edge) and the low-frequency bands (the region inside the particle), as illustrated by the defocused images in Figs. 4(b) and 4(d). However, the local peaks at the out-of-focus planes are much smaller than the global peak at the focused plane, and they do not affect the accurate determination of the particle position. Figure 5 demonstrates that the 3D position of the reconstructed particle can be robustly determined. The lateral position of each labeled particle can be obtained from the intensity-weighted centroid. Figure 6 shows the 3D locations of the particles detected from the hologram in Fig. 2(a), with the color proportional to the equivalent size of each particle.
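The lateral (x, y) position mentioned above can be computed as the intensity-weighted centroid of each labeled region in the extended-focus image; a minimal sketch (with assumed threshold and sign conventions) is given below.

```python
import numpy as np
from scipy import ndimage

def lateral_positions(extended_img, pixel_size, threshold):
    """Intensity-weighted centroids (x, y) of particles in the extended-focus image.

    Assumes particles appear brighter than the background after background
    subtraction; invert the comparison otherwise. The threshold is illustrative.
    """
    labels, num = ndimage.label(extended_img > threshold)
    centroids = ndimage.center_of_mass(extended_img, labels, range(1, num + 1))
    # center_of_mass returns (row, col); convert to (x, y) in metres
    return [(col * pixel_size, row * pixel_size) for row, col in centroids]
```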


Fig. 4. Local spatial–frequency analysis of the in-focus and out-of-focus images. (a) In-focus particle image, (b) out-of-focus particle image, (c) wavelet decomposition of (a), with the upper right, upper left, lower right, and lower left subimages corresponding to the LL, HL, LH, and HH, respectively, and (d) wavelet decomposition of (b).



Fig. 5. Focal plane measurement curve in the high-frequency subimages of (a) an opaque coal particle and (b) the transparent water droplet.


Fig. 6. 3D positions of the reconstructed coal particles.

Figure 7 compares focal-plane detection with the method proposed in this work and with previously proposed algorithms. The pixel-based point-intensity method fails to accurately locate the particle because of the multiple peaks in the intensity of the reconstructed particle image along the depth direction. Compared with the intensity-variance and entropy methods, the method proposed in this paper also shows an advantage in reducing the depth-of-focus: the focus metric $\varepsilon_{H,z}$ decreases rapidly with the degree of defocus, as shown in Figs. 7(a) and 7(b). This is caused by the low-pass feature of the PSF. In an off-focus plane, the high-frequency sharp boundary of the object is not reconstructed; instead, a blurred diffraction image of the in-focus image is formed, which is mainly composed of low-frequency components. In the high-frequency band images of the wavelet decomposition, the variance of the image gradient is then small, which gives rise to the rapid decrease of the focus metric $\varepsilon_{H,z}$.
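For reference, the classical criteria used in this comparison can be written compactly. The sketch below reflects our reading of the point intensity, intensity variance, and entropy criteria as applied to a ROI; it is not taken from the cited implementations.

```python
import numpy as np

def point_intensity(roi):
    """Pixel-based point intensity: intensity at the ROI centre."""
    return roi[roi.shape[0] // 2, roi.shape[1] // 2]

def intensity_variance(roi):
    """Variance of the ROI intensity."""
    return np.var(roi)

def gray_level_entropy(roi, bins=256):
    """Shannon entropy of the ROI gray-level histogram."""
    hist, _ = np.histogram(roi, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log(p))

# each criterion is evaluated slice by slice along z, like epsilon_{H,z},
# and the focal plane is taken at its extremum
```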

Fig. 8. Depth error of simulated particles from near- to far-field. (a) Absolute depth errors of opaque particles and droplets and (b) relative depth errors.

To quantitatively evaluate the accuracy and robustness of the proposed method, simulated holograms of absorbing opaque particles (refractive index $n = 1.5 - 0.5i$) and transparent droplets ($n = 1.5 - 0.0i$) were tested. Each hologram contains 40 particles within a volume of 2 mm × 2 mm × 1 mm, producing a relatively dense particle field. Particles with diameters ranging from 50 to 110 μm were randomly located at different depth positions (from near- to far-field), with Fraunhofer numbers [$Fr = \pi d^2/(4\lambda z)$] from 0.1 to 3.0. The simulated holograms of homogeneous spheroids were computed in the framework of the near-field Lorenz–Mie scattering theory [48,49]. The extracted 3D positions were compared with the exact values used in the simulation. Figure 8 shows the discrepancy of the particle depth position. The opaque particles and the droplets show similar depth errors. The absolute depth error is stable and smaller than 1 mm from near- to far-field. The mean depth errors for opaque particles and droplets are 426 and 436 μm, with standard deviations of 25 and 26 μm, respectively. The relative depth error first increases with the Fraunhofer number in the far-field, from 2% at $Fr = 0.1$ up to 15% at $Fr = 2$, and then remains stable from $Fr = 2$ to $Fr = 3$ in the near-field. These results show that both opaque and transparent particles can be accurately located with the proposed method.
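The Fraunhofer number quoted above follows directly from its definition; a one-line helper (the example values are illustrative) makes the regime of each simulated particle explicit.

```python
import numpy as np

def fraunhofer_number(d, lam, z):
    """Fr = pi * d**2 / (4 * lambda * z) for particle diameter d at depth z."""
    return np.pi * d**2 / (4.0 * lam * z)

# e.g. a 100 um particle at z = 10 cm with a He-Ne laser (632.8 nm):
# fraunhofer_number(100e-6, 632.8e-9, 0.10) -> about 0.12 (far-field regime)
```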

Fig. 7. Comparison of focus plane measurement between the proposed method and the point intensity, the intensity variance, and the entropy methods for (a) nonspherical opaque coal particles and (b) spherical transparent water droplets.

C. Particle Pairing

The usual strategy for retrieving the 3D velocity in DHPTV is to track individual particles after locating their 3D positions in the reconstructed optical fields of the hologram pair. The error in the depth position inevitably affects the particle pairing. Since all the particles are focused in the extended-focus image, the particles can instead be paired by applying existing 2D particle tracking algorithms to the synthesized extended-focus images, such as the cross-correlation method, the matching probability method, the spring model method, the velocity gradient tensor method, and other intelligent methods. By combining the pairs with the 3D positions determined in Section 3.B, the 3D velocity of each paired particle can then be obtained.

This method is validated with a particle hologram pair from the DHPTV. The DHPTV system [33] was operated in the double-exposure/double-frame mode. A double-pulsed laser beam with a wavelength of 532 nm illuminated a falling (in the y direction) coal particle field, and the holograms were recorded by a frame-transfer CCD to form a hologram pair. The CCD had a resolution of 2048 × 2048 (a ROI of 1600 × 2048 was used for processing), with an equivalent pixel size of 14 μm at the ghost recording plane formed by an imaging system. The pulse interval was set to 200 μs. Details of the experiments can be found in [33].

Figure 9 illustrates the extended-focus images of the coal particle hologram pair used for particle pairing. The gray background in Fig. 9 is the extended-focus image of frame 1 (the first frame of the hologram pair). Despite the high coal particle density and the relatively large depth range, the particles are clearly focalized in the synthetic extended-focus image.

Fig. 9. Extended-focus image used for particle pairing in DHPTV. The gray background is the extended-focus image of frame 1. The red plus signs denote the centroids of the detected particles in frame 1, the closed curves are the boundaries of the detected particles in frame 2, and the green arrows denote the motion between the paired particles.


The particles can be detected using a local adaptive threshold algorithm; then the particle transverse centroid position (x, y), as denoted by the blue plus signs in Fig. 9, and the particle equivalent diameter, or even its shape (boundary), can be evaluated. There are 540 and 534 particles detected from frames 1 and 2, respectively. Figure 10 compares the particle diameter distributions of frames 1 and 2. The two distributions agree well with each other, with mean diameters of 102 and 101 μm for frames 1 and 2, respectively. The closed curves in red in Fig. 9 display the boundaries of the particles detected from frame 2. Comparing the positions of the particles at the two different times, there is an apparent bulk movement between the two clouds of particles.

The matching probability method [50] is employed to pair the particles, since it is a two-frame particle pairing algorithm and can work when particles appear or disappear between frames. This algorithm is well developed and its details can be found in [50], so it is not repeated in this work. The green arrows in Fig. 9 show the particle pairing between frames 1 and 2: particles in frames 1 and 2 connected by an arrow denote a successful pair. The falling motion of the particle clouds is clearly observed in Fig. 9. The 2D vectors in Fig. 9 are, in fact, the projections of the 3D falling motion onto the hologram plane. The loss-of-pair particles between frames 1 and 2 are mainly caused by particles moving out of the image boundary or by the particle appearing/disappearing phenomenon. After removing the spurious vectors, 434 valid vectors are obtained, corresponding to a pairing rate of 81%. The depth movement of each paired particle can be computed after the particle has been located in the depth position. Figure 11 shows the 3D particle locations and vectors, with the vector color and length proportional to the vector magnitude and the spheres proportional to the particle size. The average velocity of the particles is 1.24 m/s. The results show that the extended-focus image can be employed for particle pairing in DHPTV without first locating the depth positions.
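The pairing itself uses the match probability algorithm of [50]. As a much simpler stand-in, the sketch below pairs particles by mutual nearest neighbours within a search radius; it conveys the idea of pairing on the extended-focus images but is not the algorithm of [50], and all names are illustrative.

```python
import numpy as np

def pair_particles(centroids_1, centroids_2, search_radius):
    """Greedy mutual-nearest-neighbour pairing of two centroid lists (pixels).

    centroids_1, centroids_2 : arrays of shape (n1, 2) and (n2, 2)
    search_radius            : maximum allowed displacement, in pixels
    Returns a list of (i, j) index pairs.
    """
    c1 = np.asarray(centroids_1, dtype=float)
    c2 = np.asarray(centroids_2, dtype=float)
    # all pairwise distances between frame-1 and frame-2 centroids
    dist = np.linalg.norm(c1[:, None, :] - c2[None, :, :], axis=2)
    pairs = []
    for i in range(len(c1)):
        j = int(np.argmin(dist[i]))
        if dist[i, j] <= search_radius and i == int(np.argmin(dist[:, j])):
            pairs.append((i, j))      # i and j are each other's nearest match
    return pairs

# the in-plane displacement of each pair gives the 2D vector shown in Fig. 9;
# combining it with the depth positions from Section 3.B yields the 3D vector
```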

Fig. 10. Comparison of particle size distribution between frame 1 and frame 2.


Fig. 11. 3D vectors of the hologram pair with the vector color and length proportional to vector magnitude and the spheres proportional to the particle size.

The whole hologram processing procedure comprises hologram reconstruction, extended-focus image synthesis, particle detection and locating, particle pairing, and 3D vector processing. Since the particles are detected from the extended-focus image, only one binarization operation on the extended-focus image is needed, instead of binarizing every slice image; however, every slice still has to be decomposed to obtain the decision map. The computation was performed with a MATLAB program on a Windows 7 (64 bit) computer with an AMD Athlon X4 at 2.8 GHz and 8 GB of RAM, and all of the data were kept in RAM during the computation to speed up the 2D fast Fourier transforms. It takes about 20 min to process the above hologram pair, yielding about 400 vectors from about 1000 particles among 400 slices.

4. Conclusion

A method of extending the depth-of-field and measuring the focal plane in digital particle holography was proposed, based on the local variance of the image gradient in the wavelet domain. The reconstructed images were decomposed by the wavelet transform into detailed subimages in the high-frequency bands and an approximation subimage in the low-frequency band. The depth-of-field extended image can be obtained with all the particles focalized using the maximum selection scheme, and the 3D position of each particle can then be accurately determined in the high-frequency subimages. The proposed method was verified with experimental holograms of both coal particles and sprayed water droplets, and its accuracy was quantitatively tested with simulated holograms. The method is robust and independent of the object. The extended-focus image can be applied to pair the particles in a DHPTV to obtain the 3D velocity field.

The authors thank Prof. Jingang Zhong for useful discussions, and gratefully acknowledge financial support from the National Natural Science Foundation of China (NSFC) projects (Grant No. 51176162), the National Science Fund for Distinguished Young Scholars (Grant No. 51125025), the National Basic Research Program of China (Grant No. 2009CB219802), and the Program of Introducing Talents of Discipline to University (B08026).

References

1. G. Tyler and B. Thompson, "Fraunhofer holography applied to particle size analysis: a reassessment," J. Mod. Opt. 23, 685–700 (1976).
2. M. Adams, T. M. Kreis, and W. P. O. Jueptner, "Particle size and position measurement with digital holography," Proc. SPIE 3098, 234–240 (1997).
3. J. K. Abrantes, M. Stanislas, S. Coudert, and L. F. A. Azevedo, "Digital microscopic holography for micrometer particles in air," Appl. Opt. 52, A397–A409 (2013).
4. J. Katz and J. Sheng, "Applications of holography in fluid mechanics and particle dynamics," Annu. Rev. Fluid Mech. 42, 531–555 (2010).
5. L. Tian, N. Loomis, J. A. Domínguez-Caballero, and G. Barbastathis, "Quantitative measurement of size and three-dimensional position of fast-moving bubbles in air–water mixture flows using digital holography," Appl. Opt. 49, 1549–1554 (2010).
6. J. Weng, J. Zhong, and C. Hu, "Phase reconstruction of digital holography with the peak of the two-dimensional Gabor wavelet transform," Appl. Opt. 48, 3308–3316 (2009).
7. K. D. Hinsch, "Holographic particle image velocimetry," Meas. Sci. Technol. 13, R61–R72 (2002).
8. G. Shen and R. Wei, "Digital holography particle image velocimetry for the measurement of 3Dt-3c flows," Opt. Lasers Eng. 43, 1039–1055 (2005).
9. J. Sheng, E. Malkiel, and J. Katz, "Using digital holographic microscopy for simultaneous measurements of 3D near wall velocity and wall shear stress in a turbulent boundary layer," Exp. Fluids 45, 1023–1035 (2008).
10. L. Cao, G. Pan, J. de Jong, S. Woodward, and H. Meng, "Hybrid digital holographic imaging system for three-dimensional dense particle field measurement," Appl. Opt. 47, 4501–4508 (2008).
11. S.-i. Satake, A. Takafumi, K. Hiroyuki, K. Tomoaki, S. Kazuho, and I. Tomoyoshi, "Measurements of three-dimensional flow in microchannel with complex shape by micro-digital-holographic particle-tracking velocimetry," J. Heat Transfer 130, 042413 (2008).
12. S. Kim and S. J. Lee, "Measurement of Dean flow in a curved micro-tube using micro digital holographic particle tracking velocimetry," Exp. Fluids 46, 255–264 (2009).
13. Y. Yang, G. Y. Li, L. L. Tang, and L. Huang, "Integrated gray-level gradient method applied for the extraction of three-dimensional velocity fields of sprays in in-line digital holography," Appl. Opt. 51, 255–267 (2012).
14. D. Allano, M. Malek, F. Walle, F. Corbin, G. Godard, S. Coëtmellec, B. Lecordier, J.-M. Foucaut, and D. Lebrun, "Three-dimensional velocity near-wall measurements by digital in-line holography: calibration and results," Appl. Opt. 52, A9–A17 (2013).
15. G. Indebetouw, W. Zhong, and D. Chamberlin-Long, "Point-spread function synthesis in scanning holographic microscopy," J. Opt. Soc. Am. A 23, 1708–1717 (2006).
16. P. Picart and J. Leval, "General theoretical formulation of image formation in digital Fresnel holography," J. Opt. Soc. Am. A 25, 1744–1761 (2008).
17. M. Malek, S. Coëtmellec, D. Allano, and D. Lebrun, "Formulation of in-line holography process by a linear shift invariant system: application to the measurement of fiber diameter," Opt. Commun. 223, 263–271 (2003).
18. S. Coëtmellec, N. Verrier, M. Brunel, and D. Lebrun, "General formulation of digital in-line holography from correlation with a chirplet function," J. Eur. Opt. Soc. 5, 10027 (2010).
19. T. M. Kreis, "Frequency analysis of digital holography with reconstruction by convolution," Opt. Eng. 41, 1829–1839 (2002).
20. A. Marian, F. Charriere, T. Colomb, F. Montfort, J. Kuehn, P. Marquet, and C. Depeursinge, "On the complex three-dimensional amplitude point spread function of lenses and microscope objectives: theoretical aspects, simulations and measurements by digital holography," J. Microsc. 225, 156–169 (2007).
21. D. Lebrun, A. M. Benkouider, and S. Coëtmellec, "Particle field digital holographic reconstruction in arbitrary tilted planes," Opt. Express 11, 224–229 (2003).
22. S. De Nicola, A. Finizio, G. Pierattini, P. Ferraro, and D. Alfieri, "Angular spectrum method with correction of anamorphism for numerical reconstruction of digital holograms on tilted planes," Opt. Express 13, 9935–9940 (2005).
23. S. J. Jeong and C. K. Hong, "Pixel-size-maintained image reconstruction of digital holograms on arbitrarily tilted planes by the angular spectrum method," Appl. Opt. 47, 3064–3071 (2008).
24. M. Paturzo and P. Ferraro, "Creating an extended focus image of a tilted object in Fourier digital holography," Opt. Express 17, 20546–20552 (2009).
25. M. Antkowiak, N. Callens, C. Yourassowsky, and F. Dubois, "Extended focused imaging of a microparticle field with digital holographic microscopy," Opt. Lett. 33, 1626–1628 (2008).
26. C. P. McElhinney, B. M. Hennelly, and T. J. Naughton, "Extended focused imaging for digital holograms of macroscopic three-dimensional objects," Appl. Opt. 47, D71–D79 (2008).
27. W. Chen, C. Quan, and C. Tay, "Extended depth of focus in a particle field measurement using a single-shot digital hologram," Appl. Phys. Lett. 95, 201103 (2009).
28. I. Bergoënd, T. Colomb, N. Pavillon, Y. Emery, and C. Depeursinge, "Extended depth-of-field and 3D information extraction in digital holographic microscopy," in Advances in Imaging, OSA Technical Digest (CD) (Optical Society of America, 2009), paper DWB5.
29. I. Bergoënd, T. Colomb, N. Pavillon, Y. Emery, and C. Depeursinge, "Depth-of-field extension and 3D reconstruction in digital holographic microscopy," Proc. SPIE 7390, 73901C (2009).
30. C. Buraga-Lefebvre, S. Coëtmellec, D. Lebrun, and C. Ozkul, "Application of wavelet transform to hologram analysis: three-dimensional location of particles," Opt. Lasers Eng. 33, 409–421 (2000).
31. J. Gillespie and R. A. King, "The use of self-entropy as a focus measure in digital holography," Pattern Recogn. Lett. 9, 19–25 (1989).
32. P. Langehanenberg, B. Kemper, D. Dirksen, and G. von Bally, "Autofocusing in digital holographic phase contrast microscopy on pure phase objects for live cell imaging," Appl. Opt. 47, D176–D182 (2008).
33. Y. Wu, X. Wu, Z. Wang, L. Chen, and K. Cen, "Coal powder measurement by digital holography with expanded measurement area," Appl. Opt. 50, H22–H29 (2011).
34. Y. Yang, B. Kang, and Y. Choo, "Application of the correlation coefficient method for determination of the focal plane to digital particle holography," Appl. Opt. 47, 817–824 (2008).
35. Y. Yang and B.-s. Kang, "Experimental validation for the determination of particle positions by the correlation coefficient method in digital particle holography," Appl. Opt. 47, 5953–5960 (2008).
36. C. Deng, J. Huang, G. Li, and Y. Yang, "Application of constrained least squares filtering technique to focal plane detection in digital holography," Opt. Commun. 291, 52–60 (2013).
37. M. Liebling and M. Unser, "Autofocus for digital Fresnel holograms by use of a Fresnelet-sparsity criterion," J. Opt. Soc. Am. A 21, 2424–2430 (2004).
38. F. Soulez, L. Denis, C. Fournier, É. Thiébaut, and C. Goepfert, "Inverse-problem approach for particle digital holography: accurate location based on local optimization," J. Opt. Soc. Am. A 24, 1164–1171 (2007).
39. L. Denis, D. Lorenz, E. Thiébaut, C. Fournier, and D. Trede, "Inline hologram reconstruction with sparsity constraints," Opt. Lett. 34, 3475–3477 (2009).
40. Y. Liu, L. Tian, J. W. Lee, H. Y. H. Huang, M. S. Triantafyllou, and G. Barbastathis, "Scanning-free compressive holography for object localization with subpixel accuracy," Opt. Lett. 37, 3357–3359 (2012).
41. Y. Rivenson, A. Stern, and B. Javidi, "Improved depth resolution by single-exposure in-line compressive holography," Appl. Opt. 52, A223–A231 (2013).
42. S. Soontaranon, J. Widjaja, and T. Asakura, "Extraction of object position from in-line holograms by using single wavelet coefficient," Opt. Commun. 281, 1461–1467 (2008).
43. D. Moreno-Hernandez, J. Andrés Bueno-García, J. Ascención Guerrero-Viramontes, and F. Mendoza-Santoyo, "3D particle positioning by using the Fraunhofer criterion," Opt. Lasers Eng. 49, 729–735 (2011).
44. J. Widjaja and P. Chuamchaitrakool, "Holographic particle tracking using Wigner–Ville distribution," Opt. Lasers Eng. 51, 311–316 (2013).
45. G. Pajares and J. Manuel de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recogn. 37, 1855–1872 (2004).
46. J. T. Huang, C. H. Shen, S. M. Phoong, and H. Chen, "Robust measure of image focus in the wavelet domain," in Intelligent Signal Processing and Communication Systems (ISPACS) (IEEE, 2005), pp. 157–160.
47. J. Kautsky, J. Flusser, B. Zitová, and S. Šimberová, "A new wavelet-based measure of image focus," Pattern Recogn. Lett. 23, 1785–1794 (2002).
48. X. Wu, S. Meunier-Guttin-Cluzel, Y. Wu, S. Saengkaew, D. Lebrun, M. Brunel, L. Chen, S. Coëtmellec, K. Cen, and G. Gréhan, "Holography and micro-holography of particle fields: a numerical standard," Opt. Commun. 285, 3013–3020 (2012).
49. Y. Wu, X. Wu, S. Saengkaew, S. Meunier-Guttin-Cluzel, L. Chen, K. Qiu, X. Gao, G. Gréhan, and K. Cen, "Digital Gabor and off-axis particle holography by shaped beams: a numerical investigation with GLMT," Opt. Commun. 305, 247–254 (2013).
50. S. Baek and S. Lee, "A new two-frame particle tracking algorithm using match probability," Exp. Fluids 22, 23–32 (1996).
