EP1929237A2 - Stereoscopic imaging through scattering media - Google Patents

Stereoscopic imaging through scattering media

Info

Publication number
EP1929237A2
EP1929237A2 (Application EP06802527A)
Authority
EP
European Patent Office
Prior art keywords
image
hidden
camera
images
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06802527A
Other languages
German (de)
French (fr)
Inventor
David Abookasis
Joseph Rosen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 11/226,572 (granted as US 7,336,372 B2)
Application filed by Individual
Publication of EP1929237A2
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/47Scattering, i.e. diffuse reflection
    • G01N21/4795Scattering, i.e. diffuse reflection spatially resolved investigating of object in scattering medium
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B9/00Measuring instruments characterised by the use of optical techniques
    • G01B9/02Interferometers
    • G01B9/02015Interferometers characterised by the beam path configuration
    • G01B9/02027Two or more interferometric channels or interferometers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B9/00Measuring instruments characterised by the use of optical techniques
    • G01B9/02Interferometers
    • G01B9/02083Interferometers characterised by particular signal processing and presentation
    • G01B9/02087Combining two or more images of the same region
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B9/00Measuring instruments characterised by the use of optical techniques
    • G01B9/02Interferometers
    • G01B9/0209Low-coherence interferometers
    • G01B9/02091Tomographic interferometers, e.g. based on optical coherence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062Arrangements for scanning
    • A61B5/0066Optical coherence imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B2290/00Aspects of interferometers not specifically covered by any group under G01B9/02
    • G01B2290/45Multiple detectors for detecting interferometer signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/232Image signal generators using stereoscopic image cameras using a single 2D image sensor using fly-eye lenses, e.g. arrangements of circular lenses

Definitions

  • each MLA is projected onto the CCD (PCO PixelFly, with 1024x1280 active pixels and a pixel size of 6.8x6.8 μm) plane by a single spherical lens, LL and LR respectively, each with a focal length of 120 mm and a diameter of 150 mm.
  • These lenses, with a magnification of 1/3, match the MLA size to the CCD size and are sufficiently large to cover the MLAs.
  • the distance from the MLA to the imaging lens is 210 mm and the distance from the MLA to the CCD plane is 280 mm.
  • the baseline B is 80 mm.
  • A summary of the results is displayed in Fig. 3.
  • Columns (a) and (d) show typical sub-images obtained from a typical microlens without using the averaging process.
  • The reconstructed images derived from the averaging process on the images of the hidden sticks with different relative displacements between the objects are shown in column (b) for the left channel, and (e) for the right channel. Note that the reconstruction of one of the sticks (the right one in Fig. 1) in the pictures is improved as a consequence of its closeness to the scattering layer T2, while the other stick (the left one in Fig. 1) remains far from T2.
  • Columns (c) and (f) show the same reconstructed images obtained by removing the second tissue T2 from the same setup. The effect of stereoscopic vision is clearly demonstrated in these figures by observing that in the right path the relative distance between the sticks gets smaller, while in the left path the distance grows, as a consequence of moving the right stick longitudinally toward the MLAs.
  • Disclosed is an optical tomography system which enables observation of the relative gap between objects hidden in a scattering medium.
  • The system has been verified experimentally with a small deviation error.
  • The system operates with low computational complexity and is robust to object depth variations.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Biochemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Optics & Photonics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Electromagnetism (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for three-dimensional imaging of hidden objects in a scattering medium. Objects hidden between two biological tissues (T1 and T2) at different depths from the viewing system are recovered, and their three-dimensional locations are computed. Analogous to a fly's two eyes, two microlens arrays (LL and LR) are used to observe the hidden objects from different perspectives. At the output of each lens array, the objects are reconstructed from several sets of many speckled images by a previously proposed technique that uses a reference point. The differences between the reconstructed images on the two arrays with respect to the reference point yield the information regarding the relative depth between the various objects.

Description

STEREOSCOPIC IMAGING THROUGH SCATTERING MEDIA
This application claims priority to U.S. Provisional Patent Application Serial No. 60/711,548, filed August 26, 2005. This Application is a continuation-in-part of Application No. 11/226,572, which claims priority to 60/609,643.
TECHNICAL FIELD
The present invention relates in general to imaging systems, and in particular, to a system for imaging a hidden object.
BACKGROUND INFORMATION
Medical tomography techniques such as X-ray Computed Tomography (CT) (see J. G. Webster, Minimally Invasive Medical Technology, IoP, Bristol and Philadelphia, 2001, Chap. 4, pp. 46-58) offer great advantages and are still widely used despite the fact that they suffer from several drawbacks such as ionizing radiation, a complex structure and high cost. An advantage of optical tomography over other medical tomography techniques is that it provides quantitative information on functional properties of tissues while being non-harmful (the radiation is non-ionizing). Accordingly, in recent years, researchers have invested considerable effort in developing optical tomography systems that use near-infrared (NIR) light. A well-known system is optical coherence tomography (OCT) (see A. F. Fercher, W. Drexler, C. K. Hitzenberger and T. Lasser, "Optical coherence tomography - principles and applications," Rep. Prog. Phys. 66, 239-303 (2003)). OCT synthesizes cross-sectional images from a series of laterally adjacent depth-scans with high depth and transversal resolution. However, OCT involves a cumbersome interferometric process with complicated reconstruction algorithms to generate accurate cross-sectional images of various body parts.
In the NOISE system described in U.S. Patent Application Serial No. 11/226,572, which is hereby incorporated by reference herein, a hidden object is reconstructed from many speckled images formed by a microlens array (MLA). Each microlens of the array projects a small, different speckled image of the hidden object onto a CCD camera. In the reconstruction algorithm, all the noisy images from the array are shifted to a common center and then accumulated into a single average picture, thereby revealing the shape of the hidden object. In NOISE-2, also described in Serial No. 11/226,572, a different algorithm is implemented on the same optical system, alleviating the need to shift the speckled images to a common center. In addition to recording the speckled images of the object, speckled images of a point-like object, generated by illumination through the medium with a point-source, are recorded. Each sub-image of the speckled object is placed side by side with a corresponding sub-image of the speckled point-like object in the computer. Then, by computing the autocorrelation of each joint sub-image, and averaging over all the autocorrelations, a cross-correlation between the object function and a narrow point-like function is revealed.
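The shift-and-average step of the NOISE reconstruction can be sketched on synthetic data as follows. The object, the number of sub-images, the shift range and the exponential speckle model below are illustrative assumptions made only to show the averaging effect, not the recorded data of the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 32x32 "hidden object" (a bright bar) standing in for a sub-image.
obj = np.zeros((32, 32))
obj[12:20, 14:18] = 1.0

n_lenses = 64                                    # number of microlens sub-images (assumed)
shifts = rng.integers(-5, 6, size=(n_lenses, 2))

# Each microlens projects a differently shifted, speckle-noised copy of the object.
sub_images = [np.roll(obj, tuple(s), axis=(0, 1)) + rng.exponential(1.0, obj.shape)
              for s in shifts]

# NOISE reconstruction: shift all noisy images to a common center and
# accumulate them into a single average picture.
aligned = [np.roll(im, tuple(-s), axis=(0, 1)) for im, s in zip(sub_images, shifts)]
recon = np.mean(aligned, axis=0)

# Averaging n_lenses images cuts the speckle fluctuation roughly by
# sqrt(n_lenses), so the bar stands out of the averaged background.
print(np.std(recon[:8, :8]) < np.std(sub_images[0][:8, :8]))   # -> True
```

The same idea scales to real sub-images once the per-lens shifts are known; only the alignment and the mean are essential.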
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of the present invention, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
Fig. 1 illustrates a system configured in accordance with an embodiment of the present invention;
Fig. 2 illustrates a perspective projection geometry of an object and a reference point through each channel;
Fig. 3 illustrates a summary of imaging results;
Fig. 4 illustrates an alternative embodiment of the present invention; and
Fig. 5 illustrates a block diagram of the computer in Fig. 1.
DETAILED DESCRIPTION
The present invention discloses an optical tomography technique that is based on speckled images. Embodiments of the present invention are an extension toward three-dimensional (3D) imaging of the embodiments disclosed in Serial No. 11/226,572, referred to as Noninvasive Optical Imaging by Speckle Ensemble (NOISE) (see J. Rosen and D. Abookasis, "Seeing through biological tissues using the fly eye principle," Opt. Exp. 11, 3605-3611 (2003); J. Rosen and D. Abookasis, "Noninvasive optical imaging by speckle ensemble," Opt. Lett. 29, 253-255 (2004)) and its modified version NOISE-2 (see D. Abookasis and J. Rosen, "NOISE 2 imaging system: seeing through scattering tissue with a reference point," Opt. Lett. 29, 956-958 (2004)). Therefore, embodiments of the present invention may be referred to as NOISE-3D.
In addition to reconstructing an object, its location in 3D space is also extracted. Therefore, embodiments of the present invention reveal and acquire depth information of objects seen through a scattering medium. This technique differs from recently proposed depth extraction systems using lens arrays (see Y. Frauel and B. Javidi, "Digital three-dimensional image correlation by use of computer-reconstructed integral imaging," Appl. Opt. 41, 5488-5496 (2002); J.-H. Park, S. Jung, H. Choi, Y. Kim, and B. Lee, "Depth extraction by use of a rectangular lens array and one-dimensional elemental image modification," Appl. Opt. 43, 4882-4895 (2004)), since it observes scenes hidden beneath a scattering medium and uses a different algorithm for acquiring depth.
Fig. 1 is a schematic diagram of a 3D imaging system in accordance with embodiments of the present invention. The configuration consists of two MLAs accompanied by imaging lenses, a pinhole (implemented by an adjustable iris) placed behind the second scattering layer T2, and conventional CCD cameras. Each path, left and right separately, is similar to that described in D. Abookasis and J. Rosen, "NOISE 2 imaging system: seeing through scattering tissue with a reference point," Opt. Lett. 29, 956-958 (2004). However, it should be noted that unlike NOISE-2, in the present invention the point-source is placed in front of the scattering medium, and thus serves as a reference point instead of as a point-source of illumination. This improves the reconstructed image because light from the pinhole is no longer scattered. Therefore, the pinhole's image is narrower, and so is its cross-correlation with the object's image. The idea behind this point technique is to ascribe the location of an object to the location of some known point in space.
The computational process at each channel is schematically described in the lower portion of Fig. 1. The computational process yields three spatially-separated terms at the output of each path. One term is the zero-order term in the vicinity of the output plane origin. This term is equal to the sum of the pinhole autocorrelation and the object autocorrelation. The other two terms correspond to the cross-correlation between the object and the pinhole; thus, assuming the average pinhole image is close to a point, these terms approximately yield the object reconstruction. The image of the hidden object can therefore be retrieved by reading it from one of these orders. Note that in this scheme, the distance of the reconstructed object from the output plane origin is directly related to the transverse gap between the object and the reference point. To extract the depth information about the object, the principle of stereoscopic vision is initially used, ignoring, for the time being, the effect of the scattering medium T2. In the diagram depicted in Fig. 2, the projection of the object and the reference point on the image planes of both arrays can be presented as follows:
(xR cosθ)/f = BR/zp ,    ((xR - dR) cosθ)/f = (BR - h)/z0        (1a)

for the right channel, and

(xL cosθ)/f = BL/zp ,    ((xL - dL) cosθ)/f = (BL + h)/z0        (1b)

for the left channel.
for the left channel. z0 and zp are the distances of the object and the pinhole from the center of the array, respectively. BR and BL represent the distances of each array from the z axis, d^ and JL are the gaps between the images of the reference point and the observed object, in the right and left channels, respectively; XR and *L are the displacements of the reference point images from the horizontal axis which crosses the center of each microlens, in the right and left channels, respectively, h is the transverse gap between the object and the pinhole, and θ is the tilt angle of each array. Solving the four equations of Eqs. (1) yields the distance between each array plane to the object, given by
z0 = (B f zp) / (B f - zp D cosθ)        (2)
where B = BR + BL is the baseline and D = dR + dL is the sum of the object-reference point gaps at the two channels. It is clear from Eq. (2) that different perspectives lead to a slight displacement of the object in the two views of the scene. Equation (2) has the advantage that one needs only to measure D, which is related to the sum of the distances of the reconstructed object from the output plane origin at the two channels. Accordingly, a simple method for obtaining the object's depth is achieved, since in this configuration the baseline B, the focal length f and the pinhole distance from the MLA, zp, are known parameters. By calculating D, as described below, the depth z0 of an object from the MLAs in an imaged scene is estimated. Note that by using the pinhole there is no need to know the object and the reference point locations, which are difficult to estimate in such a noisy scattering system. This is an additional advantage of using a method with a reference point.
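Eq. (2) can be illustrated numerically. In the sketch below, B, f and zp follow the experimental values quoted later in this text, while the tilt angle θ and the summed disparity D are assumed values chosen only for illustration:

```python
import numpy as np

def object_depth(B, f, z_p, D, theta):
    """Depth from Eq. (2): z0 = B*f*z_p / (B*f - z_p*D*cos(theta))."""
    return B * f * z_p / (B * f - z_p * D * np.cos(theta))

B, f, z_p = 80.0, 6.3, 162.0   # mm; baseline, microlens focal length, pinhole distance
theta = np.deg2rad(14.0)       # tilt angle of each array (assumed value)

# With zero summed disparity the object lies in the pinhole plane:
print(object_depth(B, f, z_p, 0.0, theta))   # -> 162.0

# A positive summed disparity places the estimated depth beyond the pinhole:
print(object_depth(B, f, z_p, 0.2, theta))
```

The D = 0 check is a useful sanity test: the formula then collapses to z0 = zp regardless of θ, as the geometry requires.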
Since the objects are covered by the layer T2, whose index of refraction is higher than one, while the reference point is positioned in front of T2, the gap obtained between each object and the reference point in each channel differs from that in the situation without the layer T2. Considering Snell's law, and the fact that the layer T2 is tilted relative to the optical axis of each array, the corrected displacement D is obtained from the measured displacement D' as given by Eq. (3),
where D' = d'R + d'L is the sum of the object-reference point gaps at the two channels measured after introducing the layer T2; t and n are the thickness and the refractive index of the scattering layer, respectively; and φ is the angle between the z axis and the ray connecting the centers of the object and the MLA. The second term in Eq. (3) describes the size change of the image as a consequence of the layer T2. Since φ and z0 are unknown before calculating Eq. (1), φ and z0 are initially approximated as θ and zp, respectively, in order to yield an initial estimate of D.
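Eq. (3) itself is not reproduced in this text. To illustrate the kind of geometry such a correction involves, the sketch below uses the standard lateral-displacement formula for a ray crossing a plane-parallel slab of thickness t and index n. This is a textbook result offered purely as an illustration, not the patent's actual Eq. (3); t is the quoted thickness of layer T2, while n = 1.4 and the angle φ are assumed tissue-like values:

```python
import numpy as np

def slab_lateral_shift(t, n, phi):
    """Lateral displacement of a ray crossing a plane-parallel slab.

    t: thickness, n: refractive index, phi: incidence angle (rad).
    Textbook result: s = t*sin(phi)*(1 - cos(phi)/sqrt(n**2 - sin(phi)**2)).
    """
    s = np.sin(phi)
    return t * s * (1.0 - np.cos(phi) / np.sqrt(n * n - s * s))

t, n, phi = 4.5, 1.4, np.deg2rad(14.0)   # t from the experiment; n and phi assumed
print(slab_lateral_shift(t, n, phi))     # a fraction of a millimetre; 0 when phi = 0
```

The shift vanishes at normal incidence and grows with both the tilt and the thickness, which is why the measured gaps d'R and d'L need correction before Eq. (2) is applied.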
The next step is to extract the information about the displacement D from the reconstructed image. Referring to the flowchart in Fig. 1, for each path, each sub-image of the speckled point reference together with a corresponding sub-image of the object (recorded in a different exposure, in which the iris is opened) is Fourier transformed, and the square magnitude of each sub-spectrum is accumulated with all the other sub-spectra. This average joint power spectrum is then Fourier transformed, which yields the output correlation plane. This plane in turn yields a symmetric image reconstruction around the plane origin. Taking one of the first orders, the number of pixels from this order to the plane origin is counted. The distance in pixels is converted to a real distance at the arrays' image plane by subtracting the sub-matrix width, multiplying the pixel number by the size of the CCD pixel and dividing it by the magnification of the imaging lenses LR and LL. These operations are implemented on each path in order to estimate D. By calculating D using Eq. (3), the object depth can now be calculated using Eq. (2).

Figure 4 illustrates an alternative embodiment of the present invention, where a set-up of a single laser, a single camera, and a single MLA is utilized to visualize a hidden object between two layers T1 and T2 using the algorithms described above with respect to Figure 1. In this alternative embodiment, either the hidden object and layers T1 and T2 are rotated relative to the camera, MLA, and laser, or the camera, MLA and laser are rotated around the hidden object. A benefit of this arrangement is that only a single laser, a single camera, and a single MLA are required. Thus, with this alternative embodiment, the system can rotate around the sample and take pictures of the sample at different angles θ. In Figure 2, θ is indicated as the angle between a vertical line and the MLA.
Because of the geometry, this angle is equal to the angle between the upper arm and a horizontal arm. The system in Figure 2 takes pictures using two sets of camera, MLA, and laser: one oriented at θ degrees with respect to the optical axis of the sample and one oriented at -θ degrees with respect to the optical axis of the sample. However, Figure 4 shows that it is possible to utilize just one setup that is rotated around the sample, taking pictures and performing reconstruction, using the algorithms described herein, at different angles θ. For example, one embodiment would be to position the arm at θ degrees with respect to the optical axis of the sample, take a picture, and then rotate the arm, i.e., both the laser part of the arm (rail 1) and the pinhole, MLA, lens, and camera part of the arm (rail 2), to an angle of -θ and take another picture. However, real-time imaging is not possible, since the whole arm has to rotate through 2θ before taking the second picture. The information in both pictures can then be used to calculate the longitudinal distance between two imaged objects.
Experiments with two separated cylindrical sticks as observed objects were carried out using the configuration shown in Fig. 1. The sticks each had a length of 20 mm and a diameter of 2.1 mm. During the experiments, the left stick remained attached to tissue T1 while the right stick was moved longitudinally toward the MLAs, to three different positions. Thus, the relative longitudinal displacements between the sticks were 0 mm (both objects in the same plane), 2 mm, and 4 mm. The sticks were embedded between two slabs of chicken breast separated by a distance of 12 mm. This scattering medium is characterized by a scattering coefficient of μs ≈ 125 cm-1 (see J. Rosen and D. Abookasis, "Noninvasive optical imaging by speckle ensemble," Opt. Lett. 29, 253-255 (2004)) and an absorption coefficient of μa ≈ 0.2 cm-1 (see G. Marquez, L. V. Wang, S-P. Lin, J. A. Schwartz, and S. L. Thomsen, "Anisotropy in the absorption and scattering spectra of chicken breast tissue," Appl. Opt. 37, 798-804 (1998)). The thicknesses of the rear tissue T1 and the front tissue T2 were about 3 mm and 4.5 mm, respectively. The reference point was created by placing a pinhole with an adjustable aperture at a short distance (22 mm) behind tissue T2, between T2 and the MLAs. The rear tissue T1 was illuminated by two diagonal collimated plane waves emerging from a He-Ne laser at 632.8 nm with 35 mW. The two MLAs (source: AdaptiveOptics, 0600-6.3-S) were placed at a distance of zp = 162 mm from the pinhole. Each MLA comprises 42x42 micro-lenses, but only 3x8 were used in this experiment. Using more than 3 columns per channel introduces a considerably different perspective of the object into the averaged image, and thus the reconstructed image is degraded. The diameter of each micro-lens is 0.6 mm and its focal length is 6.3 mm.
The image plane of each MLA is projected onto the CCD (PCO PixelFly, with 1024x1280 active pixels and a pixel size of 6.8x6.8 μm) plane by single spherical lenses LL and LR, respectively, each with a focal length of 120 mm and a diameter of 150 mm. These lenses, with a magnification of 1.3, match the MLA size to the CCD size and are sufficiently large to cover the MLAs. In each channel the distance from the MLA to the imaging lens is 210 mm and the distance from the MLA to the CCD plane is 280 mm. The baseline B is 80 mm. After acquiring the sets of observed images in each path, a computer program was employed to reconstruct the sticks and to determine their distance from the MLAs according to Eqs. (2) and (3).
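The pixel-count-to-distance conversion described earlier, with the concrete parameters of this setup (6.8 μm CCD pixel pitch, relay-lens magnification 1.3), might be sketched as follows. The helper name is hypothetical and the order of operations simply follows the description in the text; this is not the patent's actual program.

```python
def pixels_to_object_distance(pixel_count, submatrix_width_px,
                              pixel_pitch_mm=6.8e-3, magnification=1.3):
    """Convert a pixel offset in the correlation plane to a real
    distance at the MLA image plane.

    Subtract the sub-matrix width (in pixels), scale by the CCD
    pixel pitch, and divide out the imaging-lens magnification.
    """
    return (pixel_count - submatrix_width_px) * pixel_pitch_mm / magnification
```

For instance, a first-order peak 200 pixels from the origin with a 100-pixel sub-matrix width (hypothetical numbers) would correspond to 100 x 0.0068 / 1.3 mm at the arrays' image plane.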
A summary of the results is displayed in Fig. 3. Columns (a) and (d) show typical sub-images obtained from a typical microlens without using the averaging process. The reconstructed images derived from the averaging process on the images of the hidden sticks, with different relative displacements between the objects, are shown in column (b) for the left channel and column (e) for the right channel. Note that the reconstruction of one of the sticks (the right one in Fig. 1) in the pictures is improved as a consequence of its closeness to the scattering layer T2, while the other stick (the left one in Fig. 1) remains far from T2. Columns (c) and (f) show the same reconstructed images obtained by removing the second tissue T2 from the same setup. The effect of stereoscopic vision is clearly demonstrated in these figures: in the right path the relative distance between the sticks becomes smaller, while in the left path the distance grows, as a consequence of moving the right stick longitudinally toward the MLAs.
Measuring corresponding distances in different figures can succeed when there are well-seen indicators on the objects that are viewed from the two channels. For the almost vertical sticks, one can take the central point of each stick as the object point for the distance measurements. Using these measurements, and applying Eq. (3) and Eq. (2), the relative distances between the sticks without taking T2 into account are 0.302 mm, 2.414 mm, and 4.397 mm, instead of the real distances of 0 mm, 2 mm, and 4 mm, while when taking T2 into account the relative distances between the sticks are 0.24 mm, 2.358 mm, and 4.548 mm. The average error in measuring z0 is less than 0.1%.
Thus, disclosed is an optical tomography system which enables observation of the relative gap between objects hidden in a scattering medium. The system has been verified experimentally with a small deviation error. The system operates with low computational complexity and is robust to variations in object depth.

Claims

WHAT IS CLAIMED IS:
1. A system for imaging an object hidden by a first scattering layer, comprising: a first coherent light source for emitting a plane wave; a first microlens array positioned to receive scattered light produced from the plane wave as it illuminates the object hidden by the first scattering layer, the first microlens array producing speckled images from the received scattered light; a first camera for collecting and storing the speckled images; and
a positioning system for enabling multiple speckled images to be collected at different angles around the hidden object.
2. The system as recited in claim 1, wherein the object is hidden between the first scattering layer and a second scattering layer, wherein the plane wave is directed through both scattering layers towards the microlens array.
3. The system as recited in claim 1, further comprising a focusing lens positioned between the first microlens array and the first camera.
4. The system as recited in claim 1, wherein the first camera comprises a charge-coupled device.
5. The system as recited in claim 1, further comprising a computer coupled to the first camera for displaying an image of the object recreated from the speckled images collected by the first camera.
6. The system as recited in claim 5, wherein the recreated image of the object is reconstructed by shifting the speckled images to a common center and accumulating the shifted speckled images into an average image.
7. The system as recited in claim 5, wherein the recreated image of the object is reconstructed by an algorithm operated as a computer program in the computer comprising the program steps of:
(a) Fourier transforming each of the speckled images jointly with an image of a speckled pointlike light source;
(b) accumulating a set of squared magnitude images created by step (a) to form an averaged image; and
(c) Fourier transforming the averaged image.
8. The system as recited in claim 1, further comprising a second coherent light source, a second microlens array, and a second camera all positioned at a location relative to the hidden object that is different from the first coherent light source, the first microlens array, and the first camera.
EP06802527A 2005-08-26 2006-08-25 Stereoscopic imaging through scattering media Withdrawn EP1929237A2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US71154805P 2005-08-26 2005-08-26
US11/226,572 US7336372B2 (en) 2004-09-14 2005-09-14 Noninvasive optical imaging by speckle ensemble
PCT/US2006/033638 WO2007025278A2 (en) 2005-08-26 2006-08-25 Stereoscopic imaging through scattering media

Publications (1)

Publication Number Publication Date
EP1929237A2 true EP1929237A2 (en) 2008-06-11

Family

ID=37772523

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06802527A Withdrawn EP1929237A2 (en) 2005-08-26 2006-08-25 Stereoscopic imaging through scattering media

Country Status (2)

Country Link
EP (1) EP1929237A2 (en)
WO (1) WO2007025278A2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5641723B2 (en) 2008-12-25 2014-12-17 キヤノン株式会社 Subject information acquisition device
US9232211B2 (en) * 2009-07-31 2016-01-05 The University Of Connecticut System and methods for three-dimensional imaging of objects in a scattering medium
CN113607086B (en) * 2021-07-01 2024-03-08 太原理工大学 Rapid three-dimensional imaging method through scattering medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000065531A (en) * 1998-08-26 2000-03-03 Minolta Co Ltd Interference image input device using birefringent plate

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2007025278A3 *

Also Published As

Publication number Publication date
WO2007025278A2 (en) 2007-03-01
WO2007025278A3 (en) 2007-08-16


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20080314

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

RIN1 Information on inventor provided before grant (corrected)

Inventor name: ROSEN, JOSEPH

Inventor name: ABOOKASIS, DAVID

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110301