WO2012127246A1 - Apparatus and method for reconstruction of coded aperture images - Google Patents


Publication number
WO2012127246A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
mask
convolved
optical
decoding
Application number
PCT/GB2012/050646
Other languages
French (fr)
Inventor
Graham Patrick Wallis
Nicholas James New
Tega Boro EDO
Original Assignee
Mbda Uk Limited
Application filed by Mbda Uk Limited filed Critical Mbda Uk Limited
Publication of WO2012127246A1 publication Critical patent/WO2012127246A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06E OPTICAL COMPUTING DEVICES; COMPUTING DEVICES USING OTHER RADIATIONS WITH SIMILAR PROPERTIES
    • G06E 3/00 Devices not provided for in group G06E 1/00, e.g. for processing analogue or hybrid data
    • G06T 5/73
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/58 Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/21 Indexing scheme for image data processing or generation, in general, involving computational photography

Definitions

  • the display unit may comprise an array of independently switchable LCD pixels.
  • the display unit may comprise an array of independently switchable mirror elements.
  • the display unit may perform the function of the decoding mask.
  • the image display unit may be arranged to display a combination of the convolved image and a representation of the decoding mask.
  • the representation of the decoding mask may thus act as a filter function in a correlation process.
  • the image display unit and the decoding mask may be provided by a single device.
  • the image display unit and the decoding mask may have components in common.
  • a spatial light modulator (SLM) may act both as the image display unit and provide the function of the decoding mask.
  • the decoding mask may thus form a part only of a larger reconfigurable array of elements.
  • the afore-mentioned SLM may display the decoding mask, in the form of the filter pattern, alongside the convolution of the source image.
  • the first optical apparatus and the second optical apparatus may share at least one optical apparatus in common and may for example be one and the same optical apparatus.
  • the same optical apparatus may act as the first optical apparatus on a first pass, before the afore-mentioned signal processor receives the detected intermediate optical pattern from the detector, and may act as the second optical apparatus on a second pass, after the detected intermediate optical pattern, or the derived image, is supplied to the image-forming device.
  • the apparatus may also include the encoding mask, which produces the convolved image for deconvolution.
  • the encoding mask may share properties with the decoding mask.
  • the encoding mask may be a dynamically reconfigurable encoding mask. The arrangement of elements of the mask may be dynamically adaptable thus allowing the apparatus to extract different information from the scene.
  • an image detector for detecting a convolved image produced by the encoding mask
  • the image processing unit comprises
  • a decoding mask arranged to receive and decode the coherent light representation of the convolved image thus producing a deconvolved image providing information on the original scene
  • the apparatus according to embodiments of the third aspect of the invention may have the advantages of the imaging apparatus as set out in WO 2006/125975, and yet reduce the post-processing time by means of the provision of a decoding mask.
  • the image capture unit may include a collimator arranged to act on the light incident on the image detector (the collimator for example limiting the field of view of the beams reaching the detector) associated with the encoding mask.
  • the image-forming device may include a coherent light source and an image display unit.
  • the image display unit may for example be arranged to display the convolved image and to be illuminated by the coherent light source to produce the coherent light representation of the convolved image.
  • the apparatus is preferably arranged to capture the image of a real-life scene in real-time.
  • the scene may be separated from the apparatus by a distance of more than 2 metres.
  • the radiation that is incident on the encoding mask may thus be substantially collimated.
  • the apparatus may be arranged to receive at the encoding mask electromagnetic radiation reflected from objects in the scene.
  • an encoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images, the elements of the encoding mask being arranged to produce a convolution of the image of the scene and the encoding mask function
  • an image processing unit comprising a decoding mask to produce a first deconvolved image
  • the image processing unit comprising the reconfigured decoding mask to produce a second deconvolved image, whereby by means of configuring the encoding mask with different arrangements of elements different information can be extracted from the scene.
  • the step of using the encoding mask to produce a first convolved image may include receiving at the encoding mask, as electromagnetic radiation, a real-time image of a real-life scene.
  • the real-life scene may be separated from the encoding mask by a distance of more than 2 metres.
  • the image processing unit may comprise one or more optical devices arranged to optically Fourier transform images.
  • the use of the image display unit to project a composite image such that portions of one image (in the aspects described above, the source image) are interleaved with portions of another image (in the aspects described above, the decoding mask) may have application in embodiments outside the scope of the second aspect of the invention described above.
  • the interleaving of such images in an optical processing apparatus may have application and advantage independent of the use in a deconvolution process. There may for example be application in an optical correlator for comparing a source image against a reference image.
  • an apparatus for optically processing a source image comprising:
  • an image display unit for displaying a composite image comprising a source image and a filter function image
  • the image display unit and the coherent light source are together arranged to project a representation of the composite image onto the image detector, the image received at the image detector being in the form of an optically processed image, for example a correlation image,
  • the image display unit is arranged to display the composite image such that portions of the source image are interleaved with portions of the filter function image.
  • the apparatus also includes an optical apparatus, for example comprising at least one lens, arranged to produce an optical Fourier transform of the composite image.
  • the image display unit and the coherent light source may thus together be arranged to project a representation of the composite image via the optical apparatus onto the image detector, the image received at the image detector thus being a Fourier transform of the composite image.
  • the apparatus being so arranged that the lens acts as a Fourier transform lens.
  • the image display unit and the coherent light source may together be arranged to project a representation of the composite image via the lens onto the image detector, the image received at the image detector being in the form of a Fourier transformed correlation image.
  • the filter function image may be any image and need not be in the form of a decoding mask function.
  • the filter function image could be in the form of an amplitude modifying function.
  • the filter function image could be in the form of a phase modifying function.
  • the filter function image could be in the form of a complex function, for example modifying both amplitude and phase.
  • the image display unit may be arranged to display the composite image such that portions of at least one of the source image and the filter function image are interleaved with nulls (for example blanks).
  • the portions may be in the form of strips, for example elongate strips.
  • the image may be formed by (or defined by) an array of pixels. Each portion may be in the form of a collection of interconnected pixels.
  • Each strip may have a constant width. The width of at least one strip may be a single pixel.
  • the null strip may for example be one pixel wide. Substantially all of the null strips may be one pixel wide. Some or all of the null strips may have a width greater than one pixel.
  • One, more or substantially all of the null strips may each be wider than the median width of the source image strip.
  • One, more or substantially all of the null strips may each be wider than the median width of the filter function image strip.
  • substantially all of the strips may have substantially the same width.
  • Elongate strips of the source image may be interleaved with elongate strips of the filter function image. There is preferably a row of a multiplicity of successive spatially separated sets of elongate strips, each set comprising a first elongate strip of the source image and a second elongate strip of the filter function image and a third null strip disposed between the first and second strips.
  • a method of optically processing a source image comprising a step of projecting using coherent light a representation of a composite image (optionally through optical apparatus for example a lens arranged to provide a Fourier transform of the composite image) onto an image detector, the composite image comprising the source image and a filter function image, and wherein
  • the composite image is so arranged such that portions of the source image are interleaved with portions of the filter function image.
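The strip arrangement described above can be sketched digitally. The following is a minimal illustration (the function name, the single-pixel strip width, and the toy array values are choices for the example, not taken from the source) of interleaving single-pixel columns of a source image and a filter function image with a null column between them:

```python
import numpy as np

def interleave(source, filt):
    """Build a composite input-plane image: columns of `source` and `filt`
    interleaved, with a single-pixel null (zero) column between each pair,
    matching the sets of three elongate strips described above."""
    assert source.shape == filt.shape
    rows, cols = source.shape
    composite = np.zeros((rows, 3 * cols), dtype=source.dtype)
    composite[:, 0::3] = source   # first strip of each set: source column
    # composite[:, 1::3] stays zero: the null strip between the two images
    composite[:, 2::3] = filt     # second strip of each set: filter column
    return composite

s = np.arange(12.0).reshape(3, 4)   # toy stand-in for the convolved image
m = np.ones((3, 4))                 # toy stand-in for the decoding pattern
c = interleave(s, m)                # composite, shape (3, 12)
```

The composite is three times wider than either input, since every source column acquires a null column and a filter column.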
  • the method may include a first pass process and a second pass process.
  • the first pass process may include passing a representation of the composite image through optical apparatus, producing a Fourier transform of the composite image at the image detector as an intermediate image.
  • the second pass process may include passing the intermediate image, or an image derived therefrom, through optical apparatus, producing an optically processed image at the image detector.
  • the image detector may comprise an array of sensor pixels.
  • the method may include generating a two-dimensional data representation of an array of image pixels dependent on the intensity of incident radiation detected at the sensor pixels.
  • the method may include assigning a data value to an image pixel by comparing the intensity of radiation detected at the sensor pixel corresponding to the image pixel with the intensity of radiation detected at a plurality of neighbouring pixels.
  • the apparatus (and method) of the third aspect of the invention may incorporate any of the features described with reference to the apparatus (and method) of the second aspect of the invention and vice versa.
  • the image display unit of the fourth aspect of the invention may have any of the features of the image display unit associated with the decoding mask of any of the first to third aspects of the invention.
  • the image detector of the fourth aspect of the invention may be arranged in the same way as, or have any of the features of, the image detector unit associated with the decoding mask of any of the first to third aspects of the invention.
  • Figure 1 shows a coded aperture imaging apparatus of the prior art
  • Figure 2 shows an optical correlation apparatus of the prior art
  • Figure 3 shows an imaging system according to a first embodiment of the invention
  • Figures 4a and 4b show a pair of Modified Uniform Redundant Array (MURA) patterns
  • Figure 5a shows an image of a scene
  • Figure 5b shows a convolved image resulting from a convolution of the image of Figure 5a with the encoding mask
  • Figure 6a shows an imaging system according to a second embodiment of the invention
  • Figure 6b shows a face-on view of a reconfigurable mask forming part of the imaging system shown in Figure 6a;
  • Figure 6c shows an enlarged portion of the reconfigurable mask shown in Figure 6b
  • Figure 7 shows an imaging system according to the second embodiment operating in a second mode
  • Figure 8a shows an example of a joint transform correlator input image
  • Figure 8b shows the convolution obtained from the input image of Figure 8a
  • Figures 9a to 9c show the results of the image processing performed by the apparatus of the second embodiment
  • Figures 10a and 10b illustrate the results of a thresholding algorithm used by the apparatus of the second embodiment
  • the first encoding CAI mask 2 is in the form of a dynamically updatable liquid crystal Spatial Light Modulator (SLM).
  • SLM Spatial Light Modulator
  • the encoding mask comprises tiled repetitions of a mask pattern m to ensure that each pixel of the resulting encoded scene is formed from the full convolution of the external image with the encoding mask.
  • the resulting distribution is a convolution of the scene data with the pattern of the first encoding CAI mask:
  • s is the encoded/convolved image
  • m represents the encoding mask function (encoding MURA pattern)
  • s ⊗ m′ = ∬ s(ξ, η) m′(ω − ξ, ν − η) dξ dη   (2)
  • this deconvolving process is performed by means of an optical processor having an architecture similar to that of a 4-f Matched Filter (MF) architecture, in which the correlation between an input and reference pattern is defined as the Fourier transform (FT) of the product of an input and conjugate reference function, s and r, which have themselves been Fourier transformed (using the functions s and m ' as examples):
  • FT Fourier transform
  • s ⋆ m′ = ∬ s(x, y) m′*(w − x, v − y) dx dy   (5)
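Equation (5) is what the 4-f matched filter evaluates optically; digitally, the same cyclic correlation can be obtained as the inverse FFT of the product of one transform with the conjugate of the other. A small numerical check of this equivalence (array sizes and values are arbitrary choices for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
s = rng.random((8, 8))   # stand-in for the convolved image s(x, y)
r = rng.random((8, 8))   # stand-in for the reference/decoding function

# Matched-filter principle: correlation = inverse FT of FT(s) x conj(FT(r)).
corr_fft = np.fft.ifft2(np.fft.fft2(s) * np.conj(np.fft.fft2(r))).real

# Direct (cyclic) correlation computed term by term, for comparison.
corr_direct = np.zeros_like(s)
for i in range(8):
    for j in range(8):
        corr_direct[i, j] = np.sum(s * np.roll(np.roll(r, i, axis=0), j, axis=1))

assert np.allclose(corr_fft, corr_direct)
```

The optical processor performs the two Fourier transforms and the multiplication in parallel with lenses and an SLM, which is the source of the speed advantage over digital processing.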
  • the two dimensional Optical Fourier Transform (OFT) of a collimated input distribution is formed at the rear focal plane of a positive converging lens.
  • the Fourier transform/inverse Fourier transform pair required are effected by the two lenses 6, 9, respectively.
  • the digitised image from the image processor 14 (the convolved image), is displayed as an encoded image on an SLM (spatial light modulator) 12.
  • the SLM device used in this embodiment has an array of 1024x768 pixels with a 9 micron pitch.
  • collimated, coherent light of wavelength λ from a source 5 is used to illuminate the SLM 12, which modulates the light with the input function defined by the convolved image s(x,y).
  • This input function is thus projected through the first FT lens 6, producing the optical Fourier transform of the convolved image at the pixels of a second SLM 8.
  • the FT of the convolved image is optically multiplied with a second filter pattern (the decoding mask).
  • the pattern of the decoding mask is pre-calculated and based upon the Fast Fourier transform of the decoding pattern M' (which is such that M' ® M is a delta function; M being the mask function effected by the CAI encoding mask 2).
  • the pattern of the decoding mask may be represented as a complex function through the use of either a single SLM capable of providing the phase and amplitude modulation components, or by using two SLMs to provide the amplitude and phase components individually.
  • the multiplied distribution is then (inverse) Fourier transformed by a second lens 9 and the intensity of the resulting deconvolved image is finally captured in the focal plane of lens 9 by an image detector 3 comprising a sensor array.
  • the sensor array is in the form of a CMOS sensor array having a 9.9 micron pitch, an array of 659x494 pixels, with each pixel having 12-bit sensitivity (i.e. ability to distinguish between 4096 different intensities of light).
  • Figure 11a shows an example filter pattern that may be displayed either in phase or amplitude on the decoding SLM 8.
  • the filter was calculated from the imaginary part of the FFT of the M' pattern.
  • Figure 11b shows the resulting reconstructed image.
  • the pattern of the decoding mask may be represented as a complex function. It should be noted however that the full complex representation is not essential, as illustrated by Figures 11a and 11b.
  • the first encoding CAI mask 2 may be used as a fixed physical amplitude array.
  • the SLM that defines the CAI mask 2 is reconfigurable.
  • by arranging the SLM to display an array pattern in either amplitude or phase, the functionality of the system may be extended to include a changeable field of view and the ability to zoom in and out.
  • the decoding pattern of the decoding CAI mask 8 is chosen such that the result of a convolution between (a) the inverse Fourier transform of the decoding mask pattern and (b) the encoding mask pattern is a delta function.
  • the decoding mask 8 shown in Figure 3 (the one-pass 4-f matched filter system) is based upon the FFT of the M' pattern. This provides the most accurate reconstruction of the original image.
  • These aperture patterns are specifically calculated by known techniques, with the best known perhaps being those classed as Modified Uniform Redundant Array (MURA) patterns, symbolised as m (encoding) and m′ (decoding).
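As an illustration, a MURA pair can be generated with the standard quadratic-residue construction (the helper name and the choice p = 5 are assumptions made for this example, not taken from the source) and the delta-function correlation property verified numerically:

```python
import numpy as np

def mura_pair(p):
    """Build a p x p MURA encoding pattern m and decoding pattern m'
    (p prime, p = 4k + 1), following the standard quadratic-residue
    construction: first row zero, first column one, interior cell set
    when row and column residue characters agree."""
    qr = {(i * i) % p for i in range(1, p)}               # quadratic residues mod p
    C = np.array([0] + [1 if i in qr else -1 for i in range(1, p)])
    m = np.zeros((p, p), dtype=int)
    m[1:, 0] = 1                                          # first column (except corner)
    for i in range(1, p):
        for j in range(1, p):
            m[i, j] = 1 if C[i] * C[j] == 1 else 0
    mp = np.where(m == 1, 1, -1)                          # decoding pattern: +1/-1 copy...
    mp[0, 0] = 1                                          # ...with the corner element flipped
    return m, mp

p = 5
m, mp = mura_pair(p)

# Periodic cross-correlation of m with m': a single peak, zero sidelobes.
corr = np.zeros((p, p))
for i in range(p):
    for j in range(p):
        corr[i, j] = np.sum(m * np.roll(np.roll(mp, -i, axis=0), -j, axis=1))
```

The peak value is (p² − 1)/2 (the number of open elements) and every other shift correlates to exactly zero, which is the property that makes the deconvolution of equation (2) exact.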
  • MURA Modified Uniform Redundant Array
  • Figure 5a shows an image of a scene, and Figure 5b shows a convolved image of that scene, of the type that would be used to configure the image projection effected by SLM 12.
  • Collimated coherent light of wavelength λ from a source 5 is arranged (by position and/or one or more optical devices such as mirrors and beam splitters) to be incident on the SLM 2, which displays the input convolved source image s(x,y) and decoding mask function pattern m′(x,y) as spatially separated images (illustrated schematically in Figure 6a, and shown in greater detail in Figures 6b and 6c).
  • the light is thus modulated by the pattern displayed on the SLM 2.
  • the image detector array 3 is positioned at the rear focal plane of the lens, of focal length f, such that it captures the intensity distribution (square of the magnitude) of the Fourier transform of the input scene, known as the Joint Power Spectrum (JPS).
  • JPS Joint Power Spectrum
  • the detector array then passes, via a digital processing means 4, a digital representation of the JPS, which is then displayed on the SLM 2 in a second pass (represented by Figure 7), whereupon it undergoes a further (inverse) Fourier transform by lens 6, allowing the detector array on this second pass to detect and capture the deconvolved image of the original scene.
  • Figure 6b shows the composite image, comprising the convolved source image s(x,y) and decoding mask function pattern m′(x,y), in the form in which they are displayed on the SLM 2 during the first pass.
  • Figure 6c shows an enlarged portion of the composite image of Figure 6b.
  • the input s(x,y) and mask m′(x,y) images are presented as interleaved columns in the form of a row of elongate strips 2c, 2d. The reason for doing this can be explained by considering the conventional use of a Joint Transform Correlator arranged to correlate a source image with a reference image.
  • JPS Joint Power Spectrum
  • the JPS would then be processed and then displayed on the SLM for a second pass through the lens 6 to produce a final correlation scene at the sensor array.
  • the final correlation scene would consist of terms relating to those featured in the right hand side of the above equation: the first term is a "zero order" noise term at the origin, whilst the second and third terms are pairs of 180 degree symmetrical conjugate peaks/spots whose intensity and positions denote the level of graphical similarity and relative alignment, respectively, of the input and reference functions.
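Digitally, the second pass is the inverse FFT of the |FFT|² of the input plane, which is exactly the cyclic autocorrelation of that plane; this is why the output plane consists of a zero-order term at the origin plus the symmetric correlation terms. A short numerical sketch of that identity (array size and values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
input_plane = rng.random((8, 8))          # stand-in for the SLM input plane

# First pass: the detector records the Joint Power Spectrum |FT|^2.
jps = np.abs(np.fft.fft2(input_plane)) ** 2

# Second pass: an (inverse) Fourier transform of the recorded JPS.
output_plane = np.fft.ifft2(jps).real

# The output equals the cyclic autocorrelation of the input plane:
# the zero-order term sits at the origin, correlation terms elsewhere.
autocorr = np.zeros_like(input_plane)
for i in range(8):
    for j in range(8):
        shifted = np.roll(np.roll(input_plane, -i, axis=0), -j, axis=1)
        autocorr[i, j] = np.sum(input_plane * shifted)

assert np.allclose(output_plane, autocorr)
```

When the input plane contains two spatially separated images, the cross terms of this autocorrelation appear at plus and minus their separation, which is the 180 degree symmetric pair of peaks described above.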
  • the convolution is cyclic. This may be accounted for in the Fourier transform- based deconvolution by ensuring the size of the input is equal to the lateral dimensions of the imaging system.
  • the input plane as displayed on the SLM during the first pass is, in accordance with the second embodiment, reordered by interleaving the columns of the convolved object (columns 2c) and the decoding pattern (columns 2d) with at least one line of zeros (columns 2b) between them, as shown in Figure 6b.
  • the explanation for this method is described thus, substituting the function variables used so far to explain the theory in general terms.
  • JPS Joint Power Spectrum
  • the output plane contains the desired correlation terms.
  • the components of the correlation can be recovered with the mapping below:
  • Deconvolution of the CAI-encoded image can thus be implemented by replacing the first input function with the recorded convolved data s(x,y) and the second with the decoding pattern m′(x,y).
  • the interleaved image on the SLM shown in Figures 6b and 6c gives rise to a JPS that is detected by the detector 3 and then electronically processed by a digital processing circuit 4 (which may be in the form of a computer or a bespoke digital processing circuit).
  • the resulting pattern is then displayed on the SLM 2 as shown in Figure 7 during the second pass.
  • a second, inverse, optical Fourier transform process then produces the final deconvolved image at the sensor array 3 as a second-pass process using the same apparatus.
  • the final image actually received at the sensor array 3 is in the form of a striped pattern, with the three component output images being interleaved with each other.
  • the final image is digitally processed by the processor 4 to reconstruct the source image.
  • The results of the JTC-based process, decoupled from the resulting output plane and reconstructed by the processor 4, are shown in Figures 9a to 9c.
  • Figures 9a and 9c show the reconstructed input scene and its rotated self, whilst 9b shows the DC noise term.
  • the second embodiment utilises additional image processing techniques to mitigate the physical limitations of the hardware used in the JTC architecture, in particular the fact that the CMOS/CCD sensor array 3 will not capture phase information from the Fourier-transformed convolved scene incident upon it (detected as the Joint Power Spectrum).
  • the CMOS/CCD sensor array 3 positioned in the Fourier plane will also be limited in its dynamic range sensitivity when capturing the Fourier spectrum data of the Joint Power Spectrum - whose dynamic range will be several orders of magnitude greater than that of the sensor.
  • tests have shown that this is not a significant problem, because the majority of the spectral data is contained within a much smaller range, located away from the central DC term.
  • the use of a physical aperture stop placed at the origin of the plane, plus setting a high camera gain, has been shown to allow the system to operate with a low camera exposure time.
  • a further consideration is the possible disparity between the camera output bit depth and the SLM bit depth. If SLM bit depth is considerably lower than that of the camera, an adaptive thresholding algorithm may be applied electronically (e.g. in processor 4) to the JPS to preserve as much of the information as possible before being displayed on the SLM for the inverse Fourier transform stage (second pass) of the process.
  • One such algorithm that has been shown to be highly effective at extracting the spectral data from the JPS is a 3x3 nearest neighbour kernel, of the type described in PCT publication No. WO 99/31563.
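The algorithm of WO 99/31563 is not reproduced here; as a stand-in, the sketch below binarises each pixel of a JPS-like array against the mean of its eight nearest neighbours, which is one simple way a 3x3 neighbourhood rule can compress a high-dynamic-range spectrum to a low-bit-depth SLM (the function name and the exceed-the-neighbour-mean rule are assumptions for the example):

```python
import numpy as np

def neighbour_threshold(img):
    """Illustrative 3x3 nearest-neighbour binarisation (a stand-in, not the
    WO 99/31563 algorithm): a pixel becomes 1 if it exceeds the mean of its
    eight cyclic neighbours, else 0."""
    neighbour_sum = np.zeros_like(img, dtype=float)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue                      # exclude the centre pixel itself
            neighbour_sum += np.roll(np.roll(img, di, axis=0), dj, axis=1)
    return (img > neighbour_sum / 8.0).astype(np.uint8)

jps = np.zeros((5, 5))
jps[2, 2] = 100.0                  # a single strong spectral peak
binary = neighbour_threshold(jps)  # only the locally dominant pixel survives
```

Because each pixel is judged only against its immediate neighbourhood, weak but locally dominant spectral detail survives the reduction to the SLM's bit depth.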
  • the second SLM 8 shown in Figure 3 may be of lower resolution than the SLM 2 used to display the source image.
  • the decoding mask may require a resolution of only 201x201 elements for example.
  • interleaving when using the JTC architecture may have application in other optical processing applications.
  • the interleaving of images / mask functions may be used in an optical correlator.
  • the interleaving of images / mask functions may be used in an optical processor, having a JTC architecture, to enable spatial separation of the results of a two pass optical process that might otherwise be performed by an optical processor having a 4-f matched filter architecture.
  • the interleaving of images / mask functions may be used in an optical processor of a JTC architecture to calculate or evaluate derivatives or partial derivatives as an optical differentiator, using the optical processes described in WO 2008/110779 (the contents of which are hereby incorporated by reference), but with a JTC architecture instead of a 4-f matched filter architecture.
  • a dynamically reconfigurable decoding mask in conjunction with a dynamically reconfigurable encoding mask may have application independent of the particular optical processing architectures and arrangements shown in the drawings.
  • the parts of the illustrated apparatus that perform the function of an optical processor may have independent application.
  • such parts could be used to optically process data obtained by other means, for example, by means of an image capture apparatus that is physically separate from, and independent of, such parts.

Abstract

A method of and apparatus for deconvolving an image from an encoding mask (2) with which it has been convolved is described. The encoding mask (2) has an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images. A convolved image (12) is received. A decoding mask (8) having an arrangement of elements dependent upon the arrangement of elements in the encoding mask is provided. An optical Fourier transform is used to form an intermediate optical pattern derived from the convolved image (12) and the arrangement of elements in the decoding mask (8). An optical inverse Fourier transform is performed on said intermediate optical pattern. The result may be detected at a detector (3) as a deconvolved image representative of the original scene (1).

Description

APPARATUS AND METHOD FOR RECONSTRUCTION OF CODED APERTURE IMAGES
Field of the Invention The present invention concerns optical processing, coded mask imaging and imaging methods and apparatus relating thereto.
Background of the Invention Coded Aperture Imaging (CAI) is a lens-less imaging method employed predominantly in astronomy to image wavelengths that cannot be manipulated with conventional optics, such as X-rays. The method is based upon the principle of the pinhole camera, with the primary difference being that the single pinhole aperture is replaced by an array of "pinholes". This increases the amount of light accepted into the system, though at the cost of producing a convolved image at the sensor plane based upon the shifted superposition of each of the pinhole apertures. The main limiting factor with CAI-based systems is in the reconstruction of the original scene from the convolved image, since the operation requires a significant degree of computer / digital processing time.
WO 2006/125975 describes a CAI imaging system, a schematic illustration of which is shown in Figure 1 of the accompanying drawings. A scene 101 is received at a coded aperture imaging (CAI) mask 102, resulting in a convolved image being detected by camera 103 (in the form of an array of detector pixels). The pattern of radiation detected by the camera requires decoding by means of a deconvolution algorithm. Such decoding is performed by a digital processing means (i.e. a computer) 104. The coded aperture imaging (CAI) mask 102 is in the form of a dynamically reconfigurable coded aperture imaging (CAI) mask and is thus able to be configured with different
arrangements of apertures so as to extract different information from the scene, such as imaging the scene with a different field of view. The CAI imaging system of WO 2006/125975 requires the provision of a high speed digital processor in order to produce an image of the scene 101. Increasing the resolution of the image produced adds to the requirements of processing power to the extent that even the fastest of modern processors would not be fast enough to produce very high resolution images in real-time.
The present invention seeks to mitigate the above-mentioned problems.
Alternatively or additionally, the present invention seeks to provide an improved imaging apparatus. Alternatively or additionally, the present invention seeks to provide an improved method of imaging. Alternatively or additionally, the present invention seeks to provide an apparatus or method with improved efficiency of processing a convolved or coded image for example of the type extracted by means of a CAI mask. Summary of the Invention
In accordance with a first aspect of the invention there is provided an apparatus for processing of an image of a scene, wherein
the apparatus comprises
an encoding mask,
a detector associated with the encoding mask for sampling an image convolved by means of the encoding mask, the detector for example comprising a sensor array
optionally, a collimator arranged to act on the light incident on the detector
(the collimator for example limiting the field of view of the beams reaching the detector)
associated with the encoding mask,
a decoding mask,
a coherent light source associated with the decoding mask, an image display unit associated with the decoding mask, and
one or more optical apparatuses associated with the decoding mask, each of the encoding mask and the decoding mask comprising an arrangement of elements that filter incident images by altering the transverse spatial distribution of their amplitude and/or phase, the arrangement of the elements of the decoding mask being dependent on the arrangement of elements of the encoding mask,
the apparatus is arranged to receive an image of the scene at the encoding mask, the elements of the encoding mask are arranged to produce a convolution of the image of the scene and the encoding mask function, thus producing a convolved image which is detected by the detector associated with the encoding mask,
the image display unit is arranged to display the convolved image and to be illuminated by the coherent light source to produce a coherent radiation convolved image, at least one of said one or more optical apparatuses is arranged to form, by an optical Fourier transform, an intermediate optical pattern derived from the coherent radiation convolved image and the arrangement of elements in the decoding mask; and at least one of said one or more optical apparatuses is arranged to form optically an inverse Fourier transform of said intermediate optical pattern, thus producing an optically processed deconvolved image.
Such an apparatus can thus perform lens-less image-capture of a scene without the need for time-consuming computer processing of the image once captured. The apparatus of this aspect of the invention can be used to capture an image with the detector associated with the encoding mask without using any lenses. The image so captured may then be optically processed by means of an optical processing means comprising one or more optical apparatuses which may include one or more lenses. Such an apparatus could be used to significantly improve the speed at which imaging systems of the prior art extract usable images from lens-less systems, because the apparatus of the present invention utilises the inherent scalable parallel processing power of an optical system in contrast to the reliance on computer processing speeds of the prior art systems. It may be possible to construct an optical processing device for deconvolving a convolved image in accordance with the present invention that does not use one or more lenses in conjunction with the decoding mask. For example, Fourier transforming, if required, could be performed by other means.
The first aspect of the invention also provides a method of imaging a scene, wherein the method comprises the following steps:
providing an encoding mask and a decoding mask, each mask comprising an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images,
using the encoding mask to produce a convolved image as a convolution of an image of a scene and an encoding mask function, and
decoding the convolved image so produced by optical processing, the optical processing including
reproducing said convolved image with coherent light,
using the decoding mask in conjunction with an optical Fourier transform process to deconvolve the convolved image, the arrangement of elements of the decoding mask being dependent on the arrangement of elements of the encoding mask.
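The encoding convolution and Fourier-transform-based deconvolution steps above can be sketched numerically. The following is a minimal NumPy illustration, not the optical implementation itself: the circular FFT-based convolution stands in for the mask's optical convolution, and a regularised inverse filter is an assumed choice for the decoding mask function (the specification leaves the construction of the decoding function open).

```python
import numpy as np

def encode(scene, mask):
    """Convolve the scene with the encoding mask function (circular
    convolution via the FFT, mirroring the optical convolution)."""
    return np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(mask)))

def decode(convolved, mask, eps=1e-3):
    """Deconvolve by dividing out the mask spectrum (a regularised
    inverse filter), mirroring the optical Fourier-transform process."""
    M = np.fft.fft2(mask)
    G = np.conj(M) / (np.abs(M) ** 2 + eps)   # decoding filter, roughly 1/M
    return np.real(np.fft.ifft2(np.fft.fft2(convolved) * G))

rng = np.random.default_rng(0)
scene = rng.random((32, 32))
mask = (rng.random((32, 32)) > 0.5).astype(float)   # binary aperture mask
recovered = decode(encode(scene, mask), mask)
```

For a random binary mask of this kind the mask spectrum is well conditioned, so the recovered image closely approximates the original scene.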
The method may include a step of subsequently reconfiguring in situ the mask or masks used to perform the encoding convolution with a different arrangement of elements. The encoding mask may then produce a second convolved image as a convolution of an image of a scene and a second mask function, the second mask function being different from the first mask function. The arrangement of elements of the mask or masks used to perform the deconvolution may also be reconfigured in situ in dependence on the arrangement of elements of the mask or masks used to perform the encoding convolution. Thus, by means of (re-)configuring the encoding mask with different arrangements of elements, different information can be extracted from the scene. By means of also being able to (re-)configure a decoding mask with a correspondingly different arrangement of elements, a representation of the original source image (of the scene) may be generated at speed and, if desired, locally to the encoding mask apparatus.
It has been appreciated that the first aspect of the invention may have application and advantage independent of the means by which the convolved image for subsequent processing is produced. For example, parts of the first aspect of the invention provide an optics-based image processing apparatus that could be used to process (for example deconvolve) convolved images that would have hitherto been processed by means of a digital computer or processor.
According to a second aspect of the invention there is provided an apparatus for performing a deconvolution of an image which has been convolved with an encoding mask, said encoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of their amplitude and/or phase, the apparatus comprising:
a decoding mask having an arrangement of elements dependent on the
arrangement of elements of the encoding mask,
first optical apparatus arranged to form, by an optical Fourier transform, an intermediate optical pattern derived from the convolved image and the arrangement of elements in the decoding mask; and
second optical apparatus arranged to form optically an inverse Fourier transform of said intermediate optical pattern.
An embodiment of the present invention thus provides an apparatus for performing a deconvolution of a convolved image, the apparatus including, in series, a decoding mask and one or more lenses (acting as the first and/or second optical Fourier transform apparatuses) to decode the convolved image, thus producing an optically processed deconvolved image.
It will be appreciated that the architecture of optical devices that are required in embodiments of this second aspect of the invention could be similar to those used to perform the function of an optical correlator. It will also be noted that the coded aperture imaging (CAI) mask as disclosed in WO 2006/125975 may be in the form of an array of binary switchable LCD elements, which could be suitable for use as the decoding mask of the present invention. Use of such LCD arrays has been proposed in other documents of the prior art for optical imaging techniques. For example, WO 99/31563, the contents of which are hereby incorporated by reference thereto, discloses the use of such an array as a spatial light modulator (SLM) in an optical correlator. Figure 2 of the accompanying drawings shows schematically a joint transform correlator of WO 99/31563. A reference image 201r and a source image 201s are digitised (in this case by means of a binary digitisation but it will be appreciated that the digitisation could be greyscale, depending on the SLM used) and displayed on the SLM 202. Collimated laser light is projected from a source 205 and is modulated by the liquid crystal pixels of the SLM 202. The two images are thus projected through the lens 206 which forms their Fourier transform in the back focal plane of the lens 206, where there is positioned the sensor array (in this case a
CMOS or CCD camera sensor 203). The intensity of this Fourier distribution is thus captured by the detector 203 as a Fourier transformed joint power spectrum. The image of the joint power spectrum is then processed by a computer 204 to produce a thresholded binary or greyscale image that is itself used to program the SLM to produce a further Fourier transformed correlation image with conjugate pairs of correlation peaks per match. It will however be appreciated that whilst there are similarities in the architecture employed, the correlator of WO 99/31563 is not arranged to perform a deconvolution of a convolved image resulting from an image of a scene being convolved by an encoding mask.
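The two-pass joint transform correlator operation described above can be sketched numerically. In this hedged NumPy illustration the detector's intensity-only capture on the first pass is modelled by taking the squared magnitude of the first Fourier transform; the function name, image sizes and side-by-side layout are illustrative assumptions, not details taken from WO 99/31563.

```python
import numpy as np

def jtc_correlate(reference, source):
    """Numerical sketch of a two-pass joint transform correlation:
    pass 1 forms the joint power spectrum (the camera records only
    intensity); pass 2 Fourier-transforms that spectrum, yielding
    conjugate pairs of correlation peaks for each match."""
    h, w = reference.shape
    # Pass 1: display the two images side by side (with a blank gap)
    # and Fourier-transform; the detector captures |FT|^2.
    joint = np.zeros((h, 3 * w))
    joint[:, :w] = reference
    joint[:, 2 * w:] = source
    jps = np.abs(np.fft.fft2(joint)) ** 2
    # Pass 2: the joint power spectrum is written back to the SLM and
    # Fourier-transformed again, giving the correlation plane.
    return np.abs(np.fft.fft2(jps))
```

When the source matches the reference, strong correlation peaks appear at offsets equal to the spatial separation of the two displayed images.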
Various optional features of the second aspect of the invention will now be described.
The first and/or second optical apparatus may be arranged to perform the optical Fourier transform using a lens.
The apparatus may include an image-forming device arranged to produce a representation of the convolved image, such that the convolved image received at the first optical apparatus results from the representation so produced. The apparatus may include an image display unit. The image-forming device may for example comprise an image display unit. The apparatus may include a light source for illuminating the image display unit. The light source is preferably a coherent light source, such as laser light. The image display unit and the light source may together be arranged to produce and/or project a representation of an image displayed on the display unit, the optically processed deconvolved image produced by the apparatus resulting from the representation so produced. The image display unit may be arranged to display one or more of the convolved image, the decoding mask, the intermediate optical pattern, or an image derived from any of the foregoing.
The apparatus may be arranged to receive the convolved image of the scene electronically. For example, there may be a digital representation of the scene which is passed electronically to the apparatus, for example as digital signals or as a data array. The digital representation of the scene may then be displayed on the display unit.
The apparatus may be arranged to receive the convolved image as electromagnetic radiation, possibly as substantially collimated electromagnetic radiation.
The decoding mask may be in the form of a coded aperture imaging (CAI) mask. The decoding mask may be arranged to function by transmission of an image via apertures defined by the elements. The elements may be in the form of apertures. The apertures may alternatively be defined as the gaps between elements. The decoding mask may be arranged to function by reflection of an image. The elements may each be reflective. The decoding mask may be a fixed mask. The elements may each have a switchable state. The state of each element may be switchable between multiple states, for example providing variable reflectivity/transmissibility. The decoding mask may be a dynamically reconfigurable mask. For example, the arrangement of elements of the decoding mask may be able to be dynamically adapted thus allowing the apparatus to deconvolve images that have been convolved by means of encoding masks having different configurations of elements. The decoding mask may comprise an array of independently switchable LCD pixels. The decoding mask may comprise an array of independently switchable mirror elements.
The decoding mask may be in the form of a spatial light modulating (SLM) unit. The decoding mask may be arranged to modulate the phase of the incident light. The decoding mask may be arranged to modulate the amplitude of the incident light.
The display unit may be arranged to function by defining the image by means of apertures defined by the elements. The elements may be in the form of apertures. The apertures may alternatively be defined as the gaps between elements. The display unit may be arranged to function by defining an image by means of reflection. The elements may each be reflective. The elements may each have a switchable state. The state of each element may be switchable between multiple states, for example providing variable pixel intensity.
The display unit may comprise an array of independently switchable LCD pixels. The display unit may comprise an array of independently switchable mirror elements.
The display unit may be in the form of a spatial light modulating (SLM) unit. The display unit may be arranged to modulate the phase of incident light. The display unit may be arranged to modulate the amplitude of incident light.
The apparatus may be arranged to operate by means of a two-pass process. The following description relates to features which are well suited to such an arrangement. It will be appreciated however that the features described below may have application in other configurations.
The apparatus may be so arranged that the decoding mask is applied to the convolved image and the optical Fourier transform is performed on the resulting masked convolved image. The apparatus may be so arranged that the optical Fourier transform is performed on a combination of the decoding mask and the convolved image. In this case, the decoding mask applied may define a mask function M', such that the encoding mask function M convolved with the mask function M' produces a delta function.
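The delta-function condition on the mask pair can be illustrated in one dimension. The sketch below assumes a regularised spectral inverse as the decoding function M' (one possible construction; the text leaves the choice of M' open) and checks that the circular convolution of M with M' approximates a delta function.

```python
import numpy as np

# Hypothetical 1-D illustration: a decoding function m_dec whose spectrum
# is the regularised inverse of the encoding mask's spectrum, so that the
# circular convolution m_enc * m_dec approximates a delta function.
rng = np.random.default_rng(1)
m_enc = (rng.random(64) > 0.5).astype(float)          # binary encoding mask M
M = np.fft.fft(m_enc)
m_dec = np.real(np.fft.ifft(np.conj(M) / (np.abs(M) ** 2 + 1e-6)))  # M'

# Circular convolution of M with M' via the FFT:
delta = np.real(np.fft.ifft(np.fft.fft(m_enc) * np.fft.fft(m_dec)))
# delta[0] is close to 1 and all other samples are close to 0
```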
The display unit may perform the function of the decoding mask. The image display unit may be arranged to display a combination of the convolved image and a representation of the decoding mask. The representation of the decoding mask may thus act as a filter function in a correlation process. The image display unit and the decoding mask may be provided by a single device. The image display unit and the decoding mask may have components in common. For example, a spatial light modulator (SLM) may act both as the image display unit and provide the function of the decoding mask. The decoding mask may thus form a part only of a larger reconfigurable array of elements. For example, the afore-mentioned SLM may display the decoding mask, in the form of the filter pattern, alongside the convolution of the source image.
The apparatus may be arranged to optically process both the convolved image (from the source image) and the decoding mask filter function in parallel by means of performing a Fourier transform simultaneously with the same lens. The apparatus may comprise a joint transform correlator. The apparatus may have the architecture of a joint transform correlator.
The image display unit may be arranged to display a combination of the convolved image and the representation of the decoding mask such that portions of the convolved image are interleaved with portions of the decoding mask. Portions of at least one of the convolved image and the representation of the decoding mask may be interleaved with blanks. Elongate strips of the convolved image may be interleaved with elongate strips of the representation of the decoding mask.
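The interleaving arrangement just described can be sketched as follows. This NumPy fragment is an assumed layout for illustration only: single-pixel-wide columns of the source image alternate with columns of the decoding-mask filter function and null (blank) columns.

```python
import numpy as np

def interleave(source, filt):
    """Build a composite display in which elongate strips (here,
    single-pixel columns) of the source image alternate with strips of
    the filter function image and null strips, in repeating S, F, N slots."""
    h, w = source.shape
    composite = np.zeros((h, 3 * w))
    composite[:, 0::3] = source      # S: source image strips
    composite[:, 1::3] = filt        # F: filter function strips
    # N: columns 2::3 remain null (blank) strips
    return composite
```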
The apparatus may further comprise an image detector for detecting the convolved image after it has been optically processed. The image detector may for example be arranged to detect the intermediate optical pattern formed by the first optical apparatus. The apparatus may further comprise a signal processor and an image-forming device. The image forming device may be arranged to display an image dependent on an image detected by the image detector. The signal processor may be arranged to receive from the detector the detected intermediate optical pattern and to supply it, or an image derived from it, to the image-forming device. The image forming device may be arranged to provide said intermediate optical pattern, or said derived image, to the second optical apparatus.
The first optical apparatus and the second optical apparatus may share at least one optical apparatus in common and may for example be one and the same optical apparatus. For example, the same optical apparatus may act as the first optical apparatus on a first pass, before the afore-mentioned signal processor receives from the detector the detected intermediate optical pattern, and the same optical apparatus may act as the second optical apparatus on a second pass, after the detected intermediate optical pattern, or the derived image, is supplied to the image-forming device.
One and the same image-forming device may be used to produce a representation of the convolved image at the start of the first pass and to provide said intermediate optical pattern or said derived image, on the second pass. The image forming device may for example be arranged to be switchable between (i) a first mode in which it displays a combination of the convolved image and a representation of the decoding mask and (ii) a second mode in which it displays said intermediate optical pattern or said derived image.
For example, an image dependent on the joint power spectrum of the correlation performed in the first mode may be displayed on the image display unit and processed in the second mode. A digital processor, for example in the form of a computer processor, may process the result of the first mode of operation and in dependence thereon control the configuration of the display in the second mode of operation.
The image detector unit may, as mentioned above, comprise an array of sensor pixels, for example CCD elements, the pixels in the array each being arranged to measure the intensity of incident radiation. The image detector unit may also comprise a processing unit, the processing unit being arranged to generate a two-dimensional data representation of the image so detected. The processing unit may be arranged to generate the two-dimensional data representation by a process in which a data value is assigned to a pixel by comparing the intensity of radiation detected at that pixel with the intensity of radiation detected at a plurality of neighbouring pixels. The data value that is assigned to each such pixel may be one of a discrete number of possible values. The discrete number of possible values may be two (or may be more than two). The data value that is assigned to a pixel may depend on a comparison of the average intensity of radiation detected at that pixel with the average intensity of radiation detected at the neighbouring pixels. The four nearest neighbouring pixels may be used (apart of course from values assigned to pixels at the edge of the array). Alternatively, the eight nearest neighbouring pixels may be used.
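The neighbour-comparison digitisation described above can be sketched as follows. In this assumed NumPy illustration the discrete number of possible values is two, each interior pixel is compared with the mean of its four nearest neighbours, and edge pixels (for which the comparison is undefined) are simply left at zero.

```python
import numpy as np

def binarise(intensity):
    """Assign each interior pixel one of two values by comparing its
    detected intensity with the mean intensity of its four nearest
    neighbours; edge pixels of the array are left at zero."""
    out = np.zeros(intensity.shape, dtype=np.uint8)
    neighbours = (intensity[:-2, 1:-1] + intensity[2:, 1:-1] +
                  intensity[1:-1, :-2] + intensity[1:-1, 2:]) / 4.0
    out[1:-1, 1:-1] = (intensity[1:-1, 1:-1] > neighbours).astype(np.uint8)
    return out
```

A local comparison of this kind yields a thresholded binary image suitable for reprogramming a binary SLM on the second pass.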
The apparatus may be arranged to operate by means of a one-pass process. The following description relates to features which are well suited to such an arrangement. It will be appreciated however that the features described below may have application in other configurations.
The first optical apparatus may be arranged to perform the Fourier transform on the convolved image and then to apply the decoding mask to the resulting Fourier- transformed convolved image. In this case, the decoding mask applied may define the
Fourier transform of a mask function M', such that the encoding mask function M convolved with the mask function M' produces a delta function. It will therefore be appreciated that the decoding mask pattern may not necessarily simply be the inverse of the encoding mask pattern.
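The one-pass architecture — first lens, decoding filter in the Fourier plane, second lens — can be sketched numerically. In this hedged NumPy illustration the decoding SLM is modelled as a pre-calculated regularised inverse filter (an assumed construction for M'); the function names are illustrative.

```python
import numpy as np

def make_filter(mask, eps=1e-3):
    """Pre-calculate the Fourier-plane decoding filter from a known
    encoding mask (a regularised spectral inverse, one possible M')."""
    M = np.fft.fft2(mask)
    return np.conj(M) / (np.abs(M) ** 2 + eps)

def one_pass_decode(convolved, fourier_filter):
    """One-pass sketch: the first lens Fourier-transforms the convolved
    image, the decoding mask applies the pre-calculated filter in the
    Fourier plane, and the second lens transforms back to the image plane."""
    return np.real(np.fft.ifft2(np.fft.fft2(convolved) * fourier_filter))
```

Note that, as stated above, the filter displayed in the Fourier plane is derived from the mask spectrum and is not simply the inverse of the encoding mask pattern itself.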
The image display unit may be spatially separated from the decoding mask in the direction of the optical pathway. For example, a lens (for example the lens for performing the first Fourier transform on the convolved image) may be disposed between the image display unit and the decoding mask.
The apparatus may include a further lens for performing a further Fourier transformation of an image. There may therefore be first and second lenses, one of which is arranged to perform a Fourier transform on the convolved image before it is decoded by use of the decoding mask and the other of which is arranged to perform a Fourier transform on the convolved image after it is decoded by the decoding mask.
The decoding mask may be in the form of an SLM comprising switchable LCD elements. The decoding mask may be in the form of a filter pattern which is pre-calculated and based upon an inverse Fourier transform of the encoding mask pattern.
The apparatus may comprise a matched filter correlator. The apparatus may have the architecture of a matched filter correlator.
The apparatus may also include the encoding mask, which produces the convolved image for deconvolution. The encoding mask may share properties with the decoding mask. The encoding mask may be a dynamically reconfigurable encoding mask. The arrangement of elements of the mask may be dynamically adaptable thus allowing the apparatus to extract different information from the scene.
The apparatus may include an image detector unit associated with the encoding mask, the image detector unit capturing an encoded image. The image detector unit associated with the encoding mask is preferably arranged to digitise the convolved image for deconvolution.
The apparatus may include a collimator positioned in front of the image detector unit associated with the encoding mask, for example to limit crosstalk by restricting the angular field of light incident on the pixels of the camera / detector. The collimator may also be positioned in front of the encoding mask.
According to the second aspect of the invention there is also provided a method of deconvolving an image from an encoding mask with which it has been convolved, said encoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images, wherein the method includes the steps of:
receiving the convolved image;
providing a decoding mask, the decoding mask having an arrangement of elements dependent upon the arrangement of elements in the encoding mask;
using an optical Fourier transform to form an intermediate optical pattern derived from the convolved image and the arrangement of elements in the decoding mask;
and performing an optical inverse Fourier transform on said intermediate optical pattern.
The method may be performed as a one-pass process. The method may be so performed that the optical Fourier transform is performed on the convolved image and the decoding mask is then applied to the resulting Fourier-transformed convolved image.
The method may be performed as a two-pass process. The method may be so performed that the decoding mask and the convolved image are optically combined before being subjected to the optical Fourier transform.
The optical Fourier transform may be performed with the use of one or more lenses. The optical inverse Fourier transform may be performed with the use of one or more lenses.
There is also provided a method of detecting an image, wherein the method includes
producing a convolved image of a scene by means of convolving electromagnetic radiation from the scene by means of an encoding mask,
and deconvolving the convolved image by means of performing a method of deconvolving according to the second aspect of the invention as described herein.
It has been appreciated that the first aspect of the invention may have application and advantage independent of the means by which the convolved and transformed image is subsequently processed. For example, parts of the first aspect of the invention provide an optics-based adaptable imaging apparatus that could be used to obtain a convolved image containing certain information from a scene, irrespective of how the convolved image is subsequently processed (i.e. whether by a digital processing unit, such as a computer, or by an optical processing apparatus as set out above in relation to the second aspect of the invention).
According to a third aspect of the invention, there is provided an apparatus for adaptive imaging of a scene, wherein
the apparatus comprises an image capture unit and an image processing unit, the image capture unit comprises
an encoding mask, the encoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images (such that for example the encoding mask forms, on reflection or transmission of the image, a convolution of the image and a mask function, the mask function depending on the arrangement of elements), the elements of the encoding mask being arranged to produce a convolution of the image of the scene and the encoding mask function, and
an image detector for detecting a convolved image produced by the encoding mask,
the image processing unit comprises
an image-forming device which projects a representation of the convolved image by means of a coherent light source (for example in dependence on signals from the image detector),
and a decoding mask arranged to receive and decode the coherent light representation of the convolved image thus producing a deconvolved image providing information on the original scene,
at least some of the elements of the encoding mask are controllable such that the arrangement of elements of the mask is dynamically adaptable thus allowing the image capture unit of the apparatus to extract different information from the scene,
at least some of the elements of the decoding mask are controllable such that the arrangement of elements of the mask is dynamically adaptable thus allowing the decoding mask of the image processing unit to adapt dynamically in dependence on the particular arrangement of elements of the encoding mask of the image capture unit.
Thus, the apparatus according to embodiments of the third aspect of the invention may have the advantages of the imaging apparatus as set out in WO 2006/125975, and yet reduce the post-processing time by means of the provision of a decoding mask.
The image processing unit may include digital processing means for performing digital processing of a digitised version of the image. For example, the image could be Fourier transformed (or inverse Fourier transformed) by means of a digital processor. Alternatively, or additionally, the apparatus may include at least one lens arranged to produce a Fourier transformation of the convolved image.
The image capture unit may include a collimator arranged to act on the light incident on the image detector (the collimator for example limiting the field of view of the beams reaching the detector) associated with the encoding mask.
The image-forming device may include a coherent light source and an image display unit. The image display unit may for example be arranged to display the convolved image and to be illuminated by the coherent light source to produce the coherent light representation of the convolved image.
The apparatus may be arranged to receive an image of the scene as
electromagnetic radiation. The apparatus is preferably arranged to capture the image of a real-life scene in real-time. The scene may be separated from the apparatus by a distance of more than 2 metres. The radiation that is incident on the encoding mask may thus be substantially collimated. The apparatus may be arranged to receive at the encoding mask electromagnetic radiation reflected from objects in the scene.
The third aspect of the invention also provides a method of imaging a scene, wherein the method comprises the following steps:
providing an encoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images, the elements of the encoding mask being arranged to produce a convolution of the image of the scene and the encoding mask function,
configuring the encoding mask with an arrangement of elements,
using the encoding mask to produce a first convolved image as a convolution of an image of a scene and a first mask function,
using an image processing unit comprising a decoding mask to produce a first deconvolved image,
the decoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images (such that for example the decoding mask forms, on reflection or transmission of the image, a convolution of the image and a decoding mask function, the decoding mask function depending on the arrangement of elements),
subsequently reconfiguring in situ the encoding mask with a different arrangement of elements and reconfiguring in situ the decoding mask with a
corresponding different arrangement of elements,
using the encoding mask to produce a second convolved image as a convolution of an image of a scene and a second mask function, the second mask function being different from the first mask function,
using the image processing unit comprising the reconfigured decoding mask to produce a second deconvolved image, whereby by means of configuring the encoding mask with different arrangements of elements different information can be extracted from the scene.
The step of reconfiguring the encoding mask may be performed within 1/10th of a second of the step of configuring an encoding mask with an arrangement of elements.
The step of using the encoding mask to produce a first convolved image may include receiving at the encoding mask, as electromagnetic radiation, a real-time image of a real-life scene. The real-life scene may be separated from the encoding mask by a distance of more than 2 metres.
The image processing unit may comprise one or more optical devices arranged to optically Fourier transform images.
It has also been appreciated that the use of the image display unit to project a composite image such that portions of one image (in the aspects described above, the source image) are interleaved with portions of another image (in the aspects described above, the decoding mask) may have application in embodiments outside the scope of the second aspect of the invention described above. The interleaving of such images in an optical processing apparatus may have application and advantage independent of the use in a deconvolution process. There may for example be application in an optical correlator for comparing a source image against a reference image.
According to a fourth aspect of the invention, there is provided an apparatus for optically processing a source image, the apparatus comprising:
an image display unit for displaying a composite image comprising a source image and a filter function image,
a coherent light source for illuminating the image display unit,
and
an image detector,
the image display unit and the coherent light source are together arranged to project a representation of the composite image onto the image detector, the image received at the image detector being in the form of an optically processed image, for example a correlation image,
wherein
the image display unit is arranged to display the composite image such that portions of the source image are interleaved with portions of the filter function image.
Preferably, the apparatus also includes an optical apparatus, for example comprising at least one lens, arranged to produce an optical Fourier transform of the composite image. The image display unit and the coherent light source may thus together be arranged to project a representation of the composite image via the optical apparatus onto the image detector, the image received at the image detector thus being a Fourier transform of the composite image. The apparatus may be so arranged that the lens acts as a Fourier transform lens. For example, the image display unit and the coherent light source may together be arranged to project a representation of the composite image via the lens onto the image detector, the image received at the image detector being in the form of a Fourier transformed correlation image.
It will be seen that in this fourth aspect of the invention, the filter function image may be any image and need not be in the form of a decoding mask function. The filter function image could be in the form of an amplitude modifying function. The filter function image could be in the form of a phase modifying function. The filter function image could be in the form of a complex function, for example modifying both amplitude and phase.
The image display unit may be arranged to display the composite image such that portions of at least one of the source image and the filter function image are interleaved with nulls (for example blanks). The portions may be in the form of strips, for example elongate strips. The image may be formed by (or defined by) an array of pixels. Each portion may be in the form of a collection of interconnected pixels. Each strip may have a constant width. The width of at least one strip may be a single pixel. The null strip may for example be one pixel wide. Substantially all of the null strips may be one pixel wide. Some or all of the null strips may have a width greater than one pixel. One, more or substantially all of the null strips may each be wider than the median width of the source image strip. One, more or substantially all of the null strips may each be wider than the median width of the filter function image strip. Alternatively, substantially all of the strips may have substantially the same width. Elongate strips of the source image may be interleaved with elongate strips of the filter function image. There is preferably a row of a multiplicity of successive spatially separated sets of elongate strips, each set comprising a first elongate strip of the source image and a second elongate strip of the filter function image and a third null strip disposed between the first and second strips. There may be a succession of equally spaced apart strips, such that the nth strip is an elongate strip of the source image, the (n+1)th strip is an elongate strip of the filter function image and the (n+2)th strip is an elongate null strip. It will be appreciated that the direction of counting the strips (i.e. from the nth strip to the (n+1)th strip) may be in either direction (and could therefore be from right to left).
Thus, if one considers the strips as being arranged in successive sets of three available slots, the null strip may for example be considered as being placed in either the first, second or third slot. A row of a multiplicity of strips may for example be divided into slots or sets of strips, the division into sets being notional. For example, the series ... SFN SFN SFN SFN ... (having multiple sets of SFN) may be deemed equivalent to ... S FNS FNS FNS FN ... (having multiple sets of FNS), where S, F and N represent a source image strip, a filter function strip and a null strip, respectively. The strips may extend to touch the adjacent strip(s).
According to the fourth aspect of the invention, there is also provided a method of optically processing a source image, wherein the method comprises a step of projecting using coherent light a representation of a composite image (optionally through optical apparatus for example a lens arranged to provide a Fourier transform of the composite image) onto an image detector, the composite image comprising the source image and a filter function image, and wherein
the image received at the image detector is in the form of a (optionally Fourier transformed) correlation image, and
the composite image is so arranged such that portions of the source image are interleaved with portions of the filter function image.
The method may include a first pass process and a second pass process. The first pass process may include passing a representation of the composite image through optical apparatus, producing a Fourier transform of the composite image at the image detector as an intermediate image. The second pass process may include passing the intermediate image, or an image derived therefrom, through optical apparatus, producing an optically processed image at the image detector.
The image detector may comprise an array of sensor pixels. The method may include generating a two-dimensional data representation of an array of image pixels dependent on the intensity of incident radiation detected at the sensor pixels. The method may include assigning a data value to an image pixel by comparing the intensity of radiation detected at the sensor pixel corresponding to the image pixel with the intensity of radiation detected at a plurality of neighbouring pixels.
It will of course be appreciated that features described in relation to one aspect of the present invention may be incorporated into other aspects of the present invention. For example, the apparatus (and method) of the third aspect of the invention may incorporate any of the features described with reference to the apparatus (and method) of the second aspect of the invention and vice versa. The image display unit of the fourth aspect of the invention may have any of the features of the image display unit associated with the decoding mask of any of the first to third aspects of the invention. Similarly, the image detector of the fourth aspect of the invention may be arranged in the same way as, or have any of the features of, the image detector unit associated with the decoding mask of any of the first to third aspects of the invention.
Description of the Drawings
Embodiments of the present invention will now be described by way of example only with reference to the accompanying schematic drawings of which:
Figure 1 shows a coded aperture imaging apparatus of the prior art;
Figure 2 shows an optical correlation apparatus of the prior art;
Figure 3 shows an imaging system according to a first embodiment of the invention;
Figures 4a and 4b show a pair of Modified Uniform Redundant Array (MURA) patterns;
Figure 5a shows an image of a scene;
Figure 5b shows a convolved image resulting from a convolution of the image of Figure 5a and a mask function;
Figure 6a shows an imaging system according to a second embodiment of the invention operating in a first mode;
Figure 6b shows a face-on view of a reconfigurable mask forming part of the imaging system shown in Figure 6a;
Figure 6c shows an enlarged portion of the reconfigurable mask shown in Figure 6b;
Figure 7 shows an imaging system according to the second embodiment operating in a second mode;
Figure 8a shows an example of a joint transform correlator input image;
Figure 8b shows the convolution obtained from the input image of Figure 8a;
Figures 9a to 9c show the results of the image processing performed by the apparatus of the second embodiment;
Figures 10a and 10b illustrate the results of a thresholding algorithm used by the apparatus of the second embodiment;
Figure 11a shows an optical mask used in the first embodiment of the invention; and
Figure 11b shows the extracted and decoded image resulting from the use of the apparatus of the first embodiment.
Detailed Description
Apparatus according to a first embodiment of the invention is shown schematically in Figure 3. Light from an external scene 1, at a distance typically much greater than 2 metres from the apparatus, passes through a first encoding CAI mask 2 in series with a CCD array 13. The first encoding CAI mask 2 is in the form of a dynamically updatable liquid crystal Spatial Light Modulator (SLM). In practice the encoding mask comprises tiled repetitions of a mask pattern m to ensure that each pixel of the resulting encoded scene is formed from the full convolution of the external image with the encoding mask.
The resulting distribution is a convolution of the scene data with the pattern of the first encoding CAI mask:
s = i ⊗ m = ∫∫ i(η, σ) m(x − η, y − σ) dη dσ   (1)
where:
s is the encoded/convolved image;
i is the original image;

m represents the encoding mask function (the encoding MURA pattern);

⊗ is the convolution operator.

This convolution image is captured by the CCD array 13 and then converted into a digitised image by means of an image processor 14. To enable the scene data to be reconstructed from the convolution, the convolved image is decoded by means of a deconvolution process:

s ⊗ m' = ∫∫ s(η, σ) m'(u − η, v − σ) dη dσ   (2)

where:
m' represents the decoding mask function (the decoding MURA pattern).

A perfect reconstruction would yield:

s ⊗ m' = i   (3)
In the deconvolution/reconstruction stage of the process, a spectral operation consisting of the product of the Fourier transforms of the two functions, followed by an inverse Fourier transform, achieves the desired function:

s ⊗ m' = IFT[S(u, v) M'(u, v)]   (4)

where:

S(u, v) = FT[s(x, y)] = ∫∫ s(x, y) exp[−i2π(ux + vy)] dx dy

M'(u, v) = FT[m'(x, y)] = ∫∫ m'(x, y) exp[−i2π(ux + vy)] dx dy
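The spectral deconvolution of equation (4) can be checked numerically. The sketch below is illustrative only: it uses NumPy FFTs in place of the optical transforms, and a toy shifted-delta mask pair (standing in for a MURA pair) whose convolution is a delta function:

```python
import numpy as np

def cyclic_convolve(a, b):
    # a ⊗ b computed spectrally, as in equation (4): IFT[ FT(a) · FT(b) ]
    return np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)).real

# Toy mask pair: m is a delta at (0, 1), m' a delta at (0, n - 1),
# so that m ⊗ m' is a delta at the origin (a stand-in for a MURA pair).
n = 8
m = np.zeros((n, n)); m[0, 1] = 1.0
m_prime = np.zeros((n, n)); m_prime[0, n - 1] = 1.0

rng = np.random.default_rng(0)
image = rng.random((n, n))                 # the original image i
s = cyclic_convolve(image, m)              # encoded image, as in equation (1)
recovered = cyclic_convolve(s, m_prime)    # deconvolution, equation (4)
```

Because m ⊗ m' is a delta function, `recovered` equals `image` exactly (up to floating-point error), which is the ideal case of equation (3).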
In the example embodiment first described herein, this deconvolving process is performed by means of an optical processor having an architecture similar to that of a 4-f Matched Filter (MF) architecture, in which the correlation between an input and a reference pattern is defined as the Fourier transform (FT) of the product of an input function and a conjugate reference function, which have themselves been Fourier transformed (using the functions s and m' as examples):

s ∗ m' = ∫∫ s(x, y) m'*(u − x, v − y) dx dy   (5)

where:
m '* = the conjugate of m '
* is the correlation operator.
The convolution operation differs from the correlation operation by one of the functions being 180 degree symmetrically rotated about its origin. Therefore by careful selection and arrangement of the input and decoding mask functions, the optical correlator architectures may be applied to produce the required convolution (though it should be noted that the m ' mask is inherently 180 degrees rotationally symmetrical).
In coherent optical processing, the two dimensional Optical Fourier Transform (OFT) of a collimated input distribution is formed at the rear focal plane of a positive converging lens. In the first embodiment, the Fourier transform/inverse Fourier transform pair required are effected by the two lenses 6, 9, respectively.
The digitised image from the image processor 14 (the convolved image), is displayed as an encoded image on an SLM (spatial light modulator) 12. The SLM device used in this embodiment has an array of 1024x768 pixels with a 9 micron pitch.
Collimated, coherent light of wavelength λ from a source 5 is used to illuminate the SLM 12, which modulates the light with the input function defined by the convolved image s(x,y). This input function is thus projected through the first FT lens 6, producing the optical Fourier transform of the convolved image at the pixels of a second SLM 8. At the second SLM 8, the FT of the convolved image is optically multiplied with a second filter pattern (the decoding mask). The pattern of the decoding mask is pre-calculated and based upon the Fast Fourier transform of the decoding pattern M' (which is such that M' ® M is a delta function; M being the mask function effected by the CAI encoding mask 2). The pattern of the decoding mask may be represented as a complex function through the use of either a single SLM capable of providing the phase and amplitude modulation components, or by using two SLMs to provide the amplitude and phase components individually.
The multiplied distribution is then (inverse) Fourier transformed by a second lens 9 and the intensity of the resulting deconvolved image is finally captured in the focal plane of lens 9 by an image detector 3 comprising a sensor array. The sensor array is in the form of a CMOS sensor array having a 9.9 micron pitch, an array of 659x494 pixels, with each pixel having 12-bit sensitivity (i.e. ability to distinguish between 4096 different intensities of light).
Figure 11a shows an example filter pattern that may be displayed either in phase or amplitude on the decoding SLM 8. The filter was calculated from the imaginary part of the FFT of the M' pattern. Figure 11b shows the resulting reconstructed image.

Above it is stated that the pattern of the decoding mask may be represented as a complex function. It should be noted, however, that the full complex representation is not essential, as illustrated by Figures 11a and 11b.
The first encoding CAI mask 2 may be used as a fixed physical amplitude array. However, in this example, the SLM that defines the CAI mask 2 is reconfigurable. Thus, by using the SLM to display an array pattern in either amplitude or phase, the
functionality of the system may be extended to include a changeable field of view and the ability to zoom in and out.
Use of optical processing to perform the deconvolution, rather than the electronic processing of the prior art, has the particular advantage that the speed of the processing does not decrease with increased resolution, as the OFT is calculated at the speed of light, regardless of the data resolution, using the inherent parallel nature of the optics. In practice this may be limited by the frame rates of the electro-optic devices used: typically a liquid crystal Spatial Light Modulator (SLM) to insert the pattern data and a fast CMOS or CCD sensor array to capture the resulting intensity distribution. Current megapixel SLM frame rates are in the region of 400 Hz for greyscale nematic types and 3 kHz for binary ferroelectric types. However it should be noted that other SLM types may also be used in place of the liquid crystal devices, such as deformable micro-mirror arrays (MEMS).
Furthermore, the ability to use the SLM in phase mode has the added benefit of minimising the absorption losses associated with amplitude arrays, which increases the amount of light entering the apparatus. The phase representation of the encoding mask is calculated to form the intended aperture as a near field diffraction pattern. The field of view may be dynamically shifted by incorporating a phase ramp into the coded aperture pattern.
As mentioned above, the decoding pattern of the decoding CAI mask 8 is chosen such that the result of a convolution between (a) the inverse Fourier transform of the decoding mask pattern and (b) the encoding mask pattern is a delta function. Thus the decoding mask 8 shown in Figure 3 (the one-pass 4-f matched filter system) is based upon the FFT of the M' pattern. This provides the most accurate reconstruction of the original image. These aperture patterns are specifically calculated by known techniques, with the best known perhaps being those classed as Modified Uniform Redundant Array (MURA) patterns, symbolised as m (encoding) and m ' (decoding). An example of a MURA pair is shown in Figures 4a and 4b, respectively.
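By way of illustration, the quadratic-residue construction commonly used to generate MURA pairs, and the delta-function property relied upon above, can be sketched as follows (an illustrative sketch only, not the exact patterns of Figures 4a and 4b; cyclic correlation stands in for the optical processing):

```python
import numpy as np

def mura(p):
    """p x p MURA encoding pattern m (p prime), from quadratic residues mod p."""
    qr = {(k * k) % p for k in range(1, p)}
    c = [1 if i in qr else -1 for i in range(p)]
    a = np.zeros((p, p), dtype=int)
    for i in range(1, p):
        a[i, 0] = 1                       # first column open (row i = 0 closed)
        for j in range(1, p):
            a[i, j] = 1 if c[i] * c[j] == 1 else 0
    return a

def mura_decoder(a):
    """Decoding pattern m': +1 where m is open, -1 where closed; m'[0,0] = +1."""
    g = 2 * a - 1
    g[0, 0] = 1
    return g

def cyclic_correlate(x, y):
    """Cyclic cross-correlation c[t] = sum_u x[u] y[u - t], via the FFT."""
    return np.fft.ifft2(np.fft.fft2(x) * np.conj(np.fft.fft2(y))).real

p = 5
m = mura(p)
m_dec = mura_decoder(m)
corr = cyclic_correlate(m, m_dec)   # a scaled delta: peak (p*p - 1)//2 at (0,0)
```

With such a pair, decoding reduces to a cyclic correlation of the convolved image with m', which is exactly the operation the optical correlator architectures implement; note also that m' constructed this way is 180 degrees rotationally symmetrical, as stated above.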
An example of a scene and the convolved image of that scene are shown in Figures 5a and 5b: Figure 5a shows an image of a scene, and Figure 5b shows the corresponding convolved image of the type that would be used to configure the image projection effected by the SLM 12.
In a second embodiment, illustrated by Figures 6a to 7, an architecture similar to that of a 1/f Joint Transform Correlator (JTC) is employed. Figure 6a shows
schematically the layout of the reflective SLM-based 1/f Joint Transform Correlator.
Collimated coherent light of wavelength λ from a source 5 is arranged (by position and/or one or more optical devices such as mirrors and beam splitters) to be incident on the SLM 2, which displays the input convolved source image s(x,y) and decoding mask function pattern m'(x,y) as spatially separated images (illustrated schematically in Figure 6a, and shown in greater detail in Figures 6b and 6c). The light is modulated by the
SLM pixels and is focussed by the lens 6. The image detector array 3 is positioned at the rear focal plane of the lens, of focal length f, such that it captures the intensity distribution (square of the magnitude) of the Fourier transform of the input scene, known as the Joint Power Spectrum (JPS). The detector array then passes via a digital processing means 4 a digital representation of the JPS which is then displayed on the SLM 2 in a second pass (represented by Figure 7) whereupon it undergoes a further (inverse) Fourier transform by lens 6 allowing the detector array on this second pass to detect and capture the deconvolved image of the original scene.
Figure 6b shows the composite image, comprising the convolved source image s(x,y) and decoding mask function pattern m'(x,y), in the form that they are displayed on the SLM 2 during the first pass. Figure 6c shows an enlarged portion of the composite image of Figure 6b. In the second embodiment, as can be seen from Figure 6c, the input s(x,y) and mask m'(x,y) images are presented as interleaved columns in the form of a row of elongate strips 2c, 2d. The reason for doing this can be explained by considering the conventional use of a Joint Transform Correlator arranged to correlate a source image with a reference image. Typically, an SLM would display the input s(x,y) and reference pattern data m'(x,y), arranged such that they are spatially separated by distances (−x0/2) and (+x0/2) from the centre of the plane respectively and displayed in phase (or otherwise), to form an input scene:

s(x − x0/2, y) + m'(x + x0/2, y)   (6)
The Joint Power Spectrum (JPS) thus produced would be:

| (1/λf) S(u, v) exp(−jπux0/λf) + (1/λf) M'(u, v) exp(+jπux0/λf) |²

= (1/λf)² { |S(u, v)|² + |M'(u, v)|²

+ S(u, v) M'*(u, v) exp[−2jπux0/λf]   (7)

+ S*(u, v) M'(u, v) exp[+2jπux0/λf] }
The JPS would then be processed and then displayed on the SLM for a second pass through the lens 6 to produce a final correlation scene at the sensor array. The final correlation scene would consist of terms relating to those featured in the right hand side of the above equation: the first term is a "zero order" noise term at the origin, whilst the second and third terms are pairs of 180 degree symmetrical conjugate peaks/spots whose intensity and positions denote the level of graphical similarity and relative alignment, respectively, of the input and reference functions. Thus it follows from the conventional use of a JTC for correlating images, that one possible method for producing a deconvolution with a JTC would be to arrange on the SLM 2 the convolved source image 20 and decoding mask function 8 side by side, as shown in Figure 8a, separated by a displacement vector. As described above, this results in an output consisting of a "zero order" term in the centre of the plane, plus two terms representing the cross-correlation between the input and reference patterns, rotated 180 degrees about the origin of the output plane. If the classical input arrangement (as shown in Figure 8a) is used, the resulting output will contain two overlaid reconstructed images, one rotated 180 degrees with respect to the other, as shown in Figure 8b. The interleaving of the convolved source image 20 and decoding mask function 8 in accordance with the second embodiment however enables the two images to be separated to reconstruct the original image.
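The side-by-side behaviour described above can be reproduced numerically in one dimension (a simplified sketch, with NumPy FFTs standing in for the two optical transforms): two patterns separated by a distance d in the input plane give, after the second transform of the JPS, a zero-order term at the origin plus a pair of symmetric cross-correlation terms displaced to ±d:

```python
import numpy as np

n, d = 64, 20                       # plane size and pattern separation
a = np.array([1.0, 0.5, 0.25])      # a small test pattern

plane = np.zeros(n)
plane[0:3] = a                      # "input" pattern
plane[d:d + 3] = a                  # "reference" pattern, shifted by d

jps = np.abs(np.fft.fft(plane)) ** 2    # Joint Power Spectrum (first pass)
out = np.fft.ifft(jps).real             # second transform: correlation plane
# out[0] is the zero-order term; the two cross-correlation terms peak at
# +d and at n - d (i.e. -d, the 180-degree rotated twin)
```

With identical patterns the two cross terms each peak at the value of the pattern's energy, illustrating why the classical arrangement yields two overlaid, mutually rotated reconstructions.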
It should be noted that in the MURA pattern implementation of the coded aperture system, the convolution is cyclic. This may be accounted for in the Fourier transform- based deconvolution by ensuring the size of the input is equal to the lateral dimensions of the imaging system. In order to separate the reconstructed image components in the output plane, the input plane as displayed on the SLM during the first pass is, in accordance with the second embodiment, reordered by interleaving the columns of the convolved object (columns 2c) and the decoding pattern (columns 2d) with at least one line of zeros (columns 2b) between them, as shown in Figure 6b. The explanation for this method is described thus, substituting the function variables used so far to explain the theory in general terms.
Consider the input to comprise two functions that are sampled at one half of the sampling frequency of the input plane, having successive separations of one sampling interval. The interaction between these two objects can be analysed in Fourier space in the Joint Power Spectrum (JPS), where a general representation of their form is given by |Ψ + β·Φ|², the input argument of the Fourier operator:

ζ = ℑ(|Ψ + β·Φ|²)

= ℑ(|Ψ|²) + ℑ(|Φ|²) + ℑ(Ψ·β*·Φ*) + ℑ(Ψ*·β·Φ)   (8)

= [ψ ⋆ ψ](x) + [φ ⋆ φ](x) + [ψ ⋆ φ](x − εβ) + [φ ⋆ ψ](x + εβ)

where

Ψ = ℑ(ψ), Φ = ℑ(φ), and ⋆ is the correlation operator.

If we set εβ = 1, the three components are separated by a unit shift of the sampling interval at the output plane. For independent retrieval to be possible, the sampling of ψ and φ should be such that the shift does not result in overlap among the three shifted terms in equation (8). This means that ψ and φ should be sampled at 3 times the minimum sampling interval, where the multiplication of Φ by β in the Fourier space is implemented by shifting φ with respect to ψ by at least one sampling interval at the input plane (Fourier shift theorem, i.e. β = e^(−i2πk/3N) over the intermediate (JPS) plane).

The corresponding input object ξ is therefore formed by interleaving ψ and φ with zeros:

ξ[3n] = ψ[n], ξ[3n + 1] = φ[n], ξ[3n + 2] = 0   (9)

With this mapping of ψ and φ to the input object ξ, the output plane contains the desired correlation terms. The components of the correlation can be recovered with the mapping below:

(ψ ⋆ ψ + φ ⋆ φ)[n] ← ζ[3n]

(ψ ⋆ φ)[n] ← ζ[3n − 1]   (10)

(φ ⋆ ψ)[n] ← ζ[3n + 1]
Deconvolution of the CAI encoded image can thus be implemented by replacing ψ with the recorded convolved data s(x,y), and φ by the decoding pattern m'(x,y).
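The interleaving of equation (9) and the recovery mapping of equation (10) can be demonstrated in one dimension (again a simplified numerical sketch, with NumPy FFTs standing in for the two optical passes; `psi` and `phi` stand in for the convolved data and the decoding pattern):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16
psi = rng.random(N)                 # stands in for the convolved data s
phi = rng.random(N)                 # stands in for the decoding pattern m'

# Input object xi, as in equation (9): interleave psi, phi and zeros
xi = np.zeros(3 * N)
xi[0::3] = psi
xi[1::3] = phi                      # slots 3n + 2 stay zero: the null strips

jps = np.abs(np.fft.fft(xi)) ** 2   # Joint Power Spectrum (first pass)
zeta = np.fft.ifft(jps).real        # output plane (second pass)

# Recovery, as in equation (10): the psi/phi cross term and the combined
# autocorrelation (zero-order) term occupy disjoint sets of output slots
cross = zeta[1::3]                  # cross-correlation of psi and phi
zero_order = zeta[0::3]             # (psi autocorr) + (phi autocorr)
```

The test below confirms that the extracted slots reproduce the cyclic correlations computed directly, i.e. the three terms are cleanly separated rather than overlaid.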
Preliminary results show that this technique is robust to noise and background
subtraction. It can also be easily realised in an optical (JTC) setup since the negative value of -1 in the decoding pattern can be replaced by zeros (i.e. a null or blank pattern) without affecting the quality of the recovered image.
Thus, returning to the description of this embodiment with reference to Figures 6a to 7, the interleaved image on the SLM shown in Figures 6b and 6c gives rise to a JPS that is detected by the detector 3 and then electronically processed by a digital processing circuit 4 (which may be in the form of a computer or a bespoke digital processing circuit). The resulting pattern is then displayed on the SLM 2 as shown in Figure 7 during the second pass. A second, inverse, optical Fourier transform process then produces the final deconvolved image at the sensor array 3 as a second pass process using the same apparatus. The final image actually received at the sensor array 3 is in the form of a striped pattern, with the three component output images being interleaved with each other. The final image is digitally processed by the processor 4 to reconstruct the source image.
The results of the JTC-based process are shown in Figures 9a to 9c, decoupled from the resulting output plane and reconstructed by the processor 4. Thus, Figures 9a and 9c show the reconstructed input scene and its rotated self, whilst 9b shows the DC noise term.
The second embodiment utilises additional image processing techniques to mitigate the physical limitations of the hardware used in the JTC architecture, in particular the fact that the CMOS/CCD sensor array 3 will not capture phase information from the Fourier transformed convolved scene incident upon it (detected as the Joint Power Spectrum). The CMOS/CCD sensor array 3 positioned in the Fourier plane will also be limited in its dynamic range sensitivity when capturing the Fourier spectrum data of the Joint Power Spectrum, whose dynamic range will be several orders of magnitude greater than that of the sensor. However, tests have shown that this is not a significant problem, due to the fact that the majority of the spectral data is contained within a much smaller range, located away from the central DC term. The use of a physical aperture stop placed at the origin of the plane, plus the setting of a high camera gain, has been shown to allow the system to operate with a low camera exposure time.
A further consideration is the possible disparity between the camera output bit depth and the SLM bit depth. If the SLM bit depth is considerably lower than that of the camera, an adaptive thresholding algorithm may be applied electronically (e.g. in processor 4) to the JPS to preserve as much of the information as possible before being displayed on the SLM for the inverse Fourier transform stage (second pass) of the process. One such algorithm that has been shown to be highly effective at extracting the spectral data from the JPS is a 3x3 nearest neighbour kernel, of the type described in PCT publication No. WO 99/31563. This produces a binary image which is of particular use if, for example, a high speed ferroelectric binary SLM is being used in the inverse FT part of the deconvolution stage; it also has the added benefit of producing a good balance of black (0, or −1 in phase) and white (+1) pixels, which minimises the noise term present in the Fourier plane. The algorithm is of the form below. Each pixel in turn is set to either logical 1 or 0 (or −1 if displayed in phase) according to whether its original value is greater than the average of its neighbouring pixels:

q(x, y) = 1 if p(x, y) > (1/8) Σ p(x + i, y + j), the sum taken over i, j ∈ {−1, 0, 1} with (i, j) ≠ (0, 0); otherwise q(x, y) = 0   (11)
where
p(x,y) is the original greyscale function
q(x,y) is the resulting binary function
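A direct sketch of this nearest-neighbour rule (illustrative only; wrap-around boundaries are assumed here for brevity, which a hardware implementation need not use):

```python
import numpy as np

def neighbour_threshold(p):
    """Set each pixel to 1 if it exceeds the mean of its 8 neighbours, else 0."""
    # Sum of the 8 neighbours via cyclic shifts of the image
    neigh_sum = sum(np.roll(np.roll(p, dy, axis=0), dx, axis=1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
    return (p > neigh_sum / 8.0).astype(np.uint8)
```

The resulting array is binary, so it may be displayed directly on a ferroelectric binary SLM (with 0 mapped to −1 if the pattern is displayed in phase).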
Figure 10a shows the results of applying the algorithm to the JPS produced from the first pass through the apparatus of Figure 6a, using as a source the image shown in Figure 5a. Figure 10b shows the resulting image reconstruction following decoding and transforming after the second pass.
Whilst the present invention has been described and illustrated with reference to particular embodiments, it will be appreciated by those of ordinary skill in the art that the invention lends itself to many different variations not specifically illustrated herein. By way of example only, certain possible variations will now be described.
The first encoding mask (the CAI mask 2 shown in Figure 3 for example) could be a fixed pattern mask. The masks could be formed as micro-mirror arrays instead of LCD arrays.
The second SLM 8 shown in Figure 3 may be lower resolution than that of the SLM 2 used to display the source image. The decoding mask may require a resolution of only 201x201 elements for example.
The use of interleaving when using the JTC architecture may also be of use in other optical processing applications. For example, the interleaving of images / mask functions may be used in an optical correlator. The interleaving of images / mask functions may be used in an optical processor, having a JTC architecture, to enable spatial separation of the results of a two-pass optical process that might otherwise be performed by an optical processor having a 4-f matched filter architecture. For example, the interleaving of images / mask functions may be used in an optical processor of a JTC architecture used to calculate or evaluate derivatives or partial derivatives as an optical differentiator using the optical processes described in WO 2008/110779, the contents of which are hereby incorporated by reference thereto, but using a JTC architecture, instead of a 4-f matched filter architecture.
The use of a dynamically reconfigurable decoding mask in conjunction with a dynamically reconfigurable encoding mask may have application independent of the particular optical processing architectures and arrangements shown in the drawings. The parts of the illustrated apparatus that perform the function of an optical processor (for example, those parts downstream of, and excluding, the encoding mask and the detector associated with the encoding mask) may have independent application. For example, such parts could be used to optically process data obtained by other means, for example, by means of an image capture apparatus that is physically separate to, and independent of, such parts.
Where in the foregoing description, integers or elements are mentioned which have known, obvious or foreseeable equivalents, then such equivalents are herein incorporated as if individually set forth. Reference should be made to the claims for determining the true scope of the present invention, which should be construed so as to encompass any such equivalents. It will also be appreciated by the reader that integers or features of the invention that are described as preferable, advantageous, convenient or the like are optional and do not limit the scope of the independent claims. Moreover, it is to be understood that such optional integers or features, whilst of possible benefit in some embodiments of the invention, may not be desirable, and may therefore be absent, in other embodiments.

Claims
1. A method of deconvolving an image from an encoding mask with which it has been convolved, said encoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of the amplitude and/or phase of the images, wherein the method includes the steps of:
receiving the convolved image;
providing a decoding mask, the decoding mask having an arrangement of elements dependent upon the arrangement of elements in the encoding mask;
using an optical Fourier transform to form an intermediate optical pattern derived from the convolved image and the arrangement of elements in the decoding mask;
and performing an optical inverse Fourier transform on said intermediate optical pattern.
2. A method of coded aperture imaging according to the method of claim 1, the encoding and decoding masks comprising coded apertures.
3. A method of image reconstruction comprising a method as claimed in either preceding claim.
4. A method as claimed in any one of claims 1 to 3, in which the optical Fourier transform is performed on a combination of the decoding mask and the convolved image.
5. A method as claimed in any one of claims 1 to 3, in which the optical Fourier transform is performed on the convolved image and the decoding mask is then applied to the resulting Fourier-transformed convolved image.
6. A method of detecting an image, wherein the method includes
producing a convolved image of a scene by passing electromagnetic radiation from the scene through an encoding mask,
and deconvolving the convolved image by a method according to any of claims 1 to 5.
7. Apparatus for performing a deconvolution of an image which has been convolved with an encoding mask, said encoding mask having an arrangement of elements that filter incident images by altering the transverse spatial distribution of their amplitude and/or phase, the apparatus comprising:
a decoding mask having an arrangement of elements dependent on the arrangement of elements of the encoding mask, first optical apparatus arranged to form, by an optical Fourier transform, an intermediate optical pattern derived from the convolved image and the arrangement of elements in the decoding mask; and
second optical apparatus arranged to form optically an inverse Fourier transform of said intermediate optical pattern.
8. Apparatus for coded aperture imaging according to Claim 7, the encoding and decoding masks comprising coded apertures.

9. Apparatus for image reconstruction comprising apparatus as claimed in either one of Claims 7 or 8.

10. Apparatus as claimed in any one of claims 7 to 9, in which the first optical apparatus is arranged to perform an optical Fourier transform on a combination of the decoding mask and the convolved image.
11. Apparatus as claimed in any one of claims 7 to 9, in which the first optical apparatus is arranged to perform the Fourier transform on the convolved image and then to apply the decoding mask to the resulting Fourier-transformed convolved image.
12. Apparatus as claimed in any of claims 7 to 11, in which the first optical apparatus is arranged to perform the optical Fourier transform using a lens.
13. Apparatus as claimed in any of claims 7 to 12, in which the second optical apparatus is arranged to perform the optical inverse Fourier transform using a lens.
14. Apparatus according to any of claims 7 to 13, further comprising an image- forming device arranged to produce a representation of the convolved image, such that the convolved image received at the first optical apparatus results from the representation so produced.
15. Apparatus as claimed in any of claims 7 to 14, in which the apparatus further comprises a detector arranged to detect the intermediate optical pattern formed by the first optical apparatus, a signal processor, and an image-forming device, the signal processor being arranged to receive from the detector the detected intermediate optical pattern and to supply it, or an image derived from it, to the image-forming device, the image forming device being arranged to provide said intermediate optical pattern, or said derived image, to the second optical apparatus.
16. Apparatus as claimed in claim 15, in which the first optical apparatus and the second optical apparatus are one and the same optical apparatus, wherein the optical apparatus acts as the first optical apparatus on a first pass, prior to the signal processor receiving from the detector the detected intermediate optical pattern, and the optical apparatus acts as the second optical apparatus on a second pass, after the detected intermediate optical pattern, or the derived image, is supplied to the image-forming device.
17. Apparatus as claimed in claim 16, in which one and the same image-forming device is used to produce a representation of the convolved image at the start of the first pass and to provide said intermediate optical pattern or said derived image, on the second pass, the image forming device being arranged to be switchable between (i) a first mode in which it displays a combination of the convolved image and a representation of the decoding mask and (ii) a second mode in which it displays said intermediate optical pattern or said derived image.
18. Apparatus as claimed in any of claims 14 to 17 wherein the image forming device comprises an image display unit and a coherent light source arranged to illuminate the image display unit to produce an image.
19. Apparatus according to claim 18, wherein the image display unit comprises an array of independently switchable LCD pixels.
20. Apparatus according to claim 18, wherein the image display unit comprises an array of independently switchable mirror elements.
21. Apparatus according to any of claims 18 to 20, wherein the image display unit is a spatial light modulator.
22. Apparatus according to any of claims 18 to 21, wherein the image display unit is arranged to display a combination of the convolved image and a representation of the decoding mask.
23. Apparatus according to claim 22, wherein the image display unit is arranged to display a combination of the convolved image and the representation of the decoding mask such that portions of the convolved image are interleaved with portions of the decoding mask.
24. Apparatus according to claim 22 or claim 23, wherein the image display unit is arranged to display a combination of the convolved image and the representation of the decoding mask such that portions of at least one of the convolved image and the representation of the decoding mask are interleaved with blanks.
25. Apparatus according to any of claims 22 to 24, wherein the image display unit is arranged to display a combination of the convolved image and the representation of the decoding mask such that elongate strips of the convolved image are interleaved with elongate strips of the representation of the decoding mask.
26. Apparatus according to any of claims 7 to 25, wherein the apparatus further comprises an image detector for detecting the convolved image after it has been optically processed.
27. Apparatus according to claim 26, wherein the image detector comprises an array of sensor pixels and a processing unit, the sensor pixels in the array each being arranged to measure the intensity of incident radiation and the processing unit being arranged to generate a two-dimensional data representation of the detected image, the generation by the processing unit of the two-dimensional data representation including assigning a data value to a pixel by comparing the intensity of radiation detected at the sensor pixel with the intensity of radiation detected at a plurality of neighbouring sensor pixels.
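Claim 27's pixel-assignment step can be sketched as a neighbourhood comparison. The following is one plausible reading (comparing each pixel against the mean of its eight neighbours); the specific rule and function name are assumptions for illustration only:

```python
import numpy as np

def binarise_by_neighbours(intensities):
    """Assign each pixel a data value by comparing its measured intensity
    with the mean of its 8 neighbours (one reading of claim 27).
    Edge pixels reuse the nearest in-image values via edge padding."""
    h, w = intensities.shape
    padded = np.pad(intensities.astype(float), 1, mode="edge")
    # sum the nine 3x3-shifted views, then subtract the centre pixel
    # itself, leaving the sum over the 8 neighbours
    neigh_sum = sum(padded[r:r + h, c:c + w]
                    for r in range(3) for c in range(3)) - intensities
    return (intensities > neigh_sum / 8.0).astype(np.uint8)
```

Such a local comparison makes the detected value robust to slowly varying background illumination across the sensor array.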
28. Apparatus according to any of claims 7 to 27, wherein the apparatus also includes the encoding mask, which produces the convolved image for deconvolution.
29. Apparatus according to claim 28, wherein the encoding mask is a dynamically reconfigurable mask and the decoding mask is a dynamically reconfigurable mask.
PCT/GB2012/050646 2011-03-23 2012-03-23 Apparatus and method for reconstruction of coded aperture images WO2012127246A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1104873.3 2011-03-23
GBGB1104873.3A GB201104873D0 (en) 2011-03-23 2011-03-23 Encoded image processing apparatus and method

Publications (1)

Publication Number Publication Date
WO2012127246A1 true WO2012127246A1 (en) 2012-09-27

Family

ID=44013021

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2012/050646 WO2012127246A1 (en) 2011-03-23 2012-03-23 Apparatus and method for reconstruction of coded aperture images

Country Status (2)

Country Link
GB (1) GB201104873D0 (en)
WO (1) WO2012127246A1 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015073079A3 (en) * 2013-08-19 2015-09-03 Massachusetts Institute Of Technology Motion coded imaging
WO2014165294A3 (en) * 2013-03-12 2015-10-29 Rearden, Llc Apparatus and method for capturing still images and video using diffraction coded imaging techniques
US9819403B2 (en) 2004-04-02 2017-11-14 Rearden, Llc System and method for managing handoff of a client between different distributed-input-distributed-output (DIDO) networks based on detected velocity of the client
US9826537B2 (en) 2004-04-02 2017-11-21 Rearden, Llc System and method for managing inter-cluster handoff of clients which traverse multiple DIDO clusters
US9923657B2 (en) 2013-03-12 2018-03-20 Rearden, Llc Systems and methods for exploiting inter-cell multiplexing gain in wireless cellular systems via distributed input distributed output technology
CN108013891A (en) * 2018-01-26 2018-05-11 中国工程物理研究院激光聚变研究中心 A kind of radiographic apparatus
US9973246B2 (en) 2013-03-12 2018-05-15 Rearden, Llc Systems and methods for exploiting inter-cell multiplexing gain in wireless cellular systems via distributed input distributed output technology
US10148897B2 (en) 2005-07-20 2018-12-04 Rearden, Llc Apparatus and method for capturing still images and video using coded lens imaging techniques
US10277290B2 (en) 2004-04-02 2019-04-30 Rearden, Llc Systems and methods to exploit areas of coherence in wireless systems
US10333604B2 (en) 2004-04-02 2019-06-25 Rearden, Llc System and method for distributed antenna wireless communications
US10425134B2 (en) 2004-04-02 2019-09-24 Rearden, Llc System and methods for planned evolution and obsolescence of multiuser spectrum
US10547358B2 (en) 2013-03-15 2020-01-28 Rearden, Llc Systems and methods for radio frequency calibration exploiting channel reciprocity in distributed input distributed output wireless communications
CN111386538A (en) * 2017-09-20 2020-07-07 视觉动力公司 Photonic neural network system
US10727907B2 (en) 2004-07-30 2020-07-28 Rearden, Llc Systems and methods to enhance spatial diversity in distributed input distributed output wireless systems
CN112400175A (en) * 2018-04-27 2021-02-23 奥普特里斯有限公司 Optical processing system
US11189917B2 (en) 2014-04-16 2021-11-30 Rearden, Llc Systems and methods for distributing radioheads
EP3787276B1 (en) * 2019-08-27 2023-08-09 Rosemount Aerospace Inc. Secure image transmission

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999031563A1 (en) 1997-12-12 1999-06-24 Cambridge University Technical Services Ltd. Optical correlator
WO2006125975A1 (en) 2005-05-23 2006-11-30 Qinetiq Limited Coded aperture imaging system
WO2008110779A1 (en) 2007-03-13 2008-09-18 Cambridge Correlators Ltd Optical processing

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DICKE R H: "SCATTER-HOLE CAMERAS FOR X-RAYS AND GAMMA RAYS", THE ASTROPHYSICAL JOURNAL,, vol. 153, 1 August 1968 (1968-08-01), pages L101 - L106, XP001435803 *
FENIMORE E E ET AL: "Coded aperture imaging with uniformly redundant arrays", APPLIED OPTICS, OPTICAL SOCIETY OF AMERICA, WASHINGTON, DC; US, vol. 17, no. 3, 1 February 1978 (1978-02-01), pages 337 - 347, XP002558348, ISSN: 0003-6935, DOI: 10.1364/AO.17.000337 *
LEE S H: "COHERENT OPTICAL PROCESSING", OPTICAL INFORMATION PROCESSING, XX, XX, vol. 48, 1 January 1981 (1981-01-01), pages 43 - 67, XP000648101 *
TAKANORI NOMURA ET AL: "Optical Encryption System with a Binary Key Code", APPLIED OPTICS, vol. 39, no. 26, 10 September 2000 (2000-09-10), pages 4783, XP055034098, ISSN: 0003-6935, DOI: 10.1364/AO.39.004783 *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10425134B2 (en) 2004-04-02 2019-09-24 Rearden, Llc System and methods for planned evolution and obsolescence of multiuser spectrum
US10333604B2 (en) 2004-04-02 2019-06-25 Rearden, Llc System and method for distributed antenna wireless communications
US9819403B2 (en) 2004-04-02 2017-11-14 Rearden, Llc System and method for managing handoff of a client between different distributed-input-distributed-output (DIDO) networks based on detected velocity of the client
US9826537B2 (en) 2004-04-02 2017-11-21 Rearden, Llc System and method for managing inter-cluster handoff of clients which traverse multiple DIDO clusters
US10277290B2 (en) 2004-04-02 2019-04-30 Rearden, Llc Systems and methods to exploit areas of coherence in wireless systems
US10727907B2 (en) 2004-07-30 2020-07-28 Rearden, Llc Systems and methods to enhance spatial diversity in distributed input distributed output wireless systems
US10148897B2 (en) 2005-07-20 2018-12-04 Rearden, Llc Apparatus and method for capturing still images and video using coded lens imaging techniques
US11681061B2 (en) 2013-03-12 2023-06-20 Rearden, Llc Apparatus and method for capturing still images and video using diffraction coded imaging techniques
US9973246B2 (en) 2013-03-12 2018-05-15 Rearden, Llc Systems and methods for exploiting inter-cell multiplexing gain in wireless cellular systems via distributed input distributed output technology
US11150363B2 (en) 2013-03-12 2021-10-19 Rearden, Llc Apparatus and method for capturing still images and video using diffraction coded imaging techniques
US9923657B2 (en) 2013-03-12 2018-03-20 Rearden, Llc Systems and methods for exploiting inter-cell multiplexing gain in wireless cellular systems via distributed input distributed output technology
GB2527969B (en) * 2013-03-12 2020-05-27 Rearden Llc Apparatus and method for capturing still images and video using diffraction coded imaging techniques
GB2527969A (en) * 2013-03-12 2016-01-06 Rearden Llc Apparatus and method for capturing still images and video using diffraction coded imaging techniques
US10488535B2 (en) 2013-03-12 2019-11-26 Rearden, Llc Apparatus and method for capturing still images and video using diffraction coded imaging techniques
WO2014165294A3 (en) * 2013-03-12 2015-10-29 Rearden, Llc Apparatus and method for capturing still images and video using diffraction coded imaging techniques
US11146313B2 (en) 2013-03-15 2021-10-12 Rearden, Llc Systems and methods for radio frequency calibration exploiting channel reciprocity in distributed input distributed output wireless communications
US10547358B2 (en) 2013-03-15 2020-01-28 Rearden, Llc Systems and methods for radio frequency calibration exploiting channel reciprocity in distributed input distributed output wireless communications
WO2015073079A3 (en) * 2013-08-19 2015-09-03 Massachusetts Institute Of Technology Motion coded imaging
US9681051B2 (en) 2013-08-19 2017-06-13 Massachusetts Institute Of Technology Method and apparatus for motion coded imaging
US11189917B2 (en) 2014-04-16 2021-11-30 Rearden, Llc Systems and methods for distributing radioheads
JP7345191B2 (en) 2017-09-20 2023-09-15 ルック ダイナミックス,インコーポレイテツド Photonic neural network system
JP2020534623A (en) * 2017-09-20 2020-11-26 ルック ダイナミックス, インコーポレイテツドLook Dynamics, Inc. Photonic neural network system
CN111386538B (en) * 2017-09-20 2023-11-03 视觉动力公司 Photonic Neural Network System
EP3685320A4 (en) * 2017-09-20 2021-06-16 Look Dynamics, Inc. Photonic neural network system
CN111386538A (en) * 2017-09-20 2020-07-07 视觉动力公司 Photonic neural network system
US11410028B2 (en) 2017-09-20 2022-08-09 Look Dynamics, Inc. Photonic neural network system
CN108013891A (en) * 2018-01-26 2018-05-11 中国工程物理研究院激光聚变研究中心 A kind of radiographic apparatus
CN108013891B (en) * 2018-01-26 2023-08-04 中国工程物理研究院激光聚变研究中心 X-ray diagnostic device
CN112400175A (en) * 2018-04-27 2021-02-23 奥普特里斯有限公司 Optical processing system
EP3787276B1 (en) * 2019-08-27 2023-08-09 Rosemount Aerospace Inc. Secure image transmission

Also Published As

Publication number Publication date
GB201104873D0 (en) 2011-05-04

Similar Documents

Publication Publication Date Title
WO2012127246A1 (en) Apparatus and method for reconstruction of coded aperture images
JP7245835B2 (en) Light field image processing method for depth acquisition
Liang Punching holes in light: recent progress in single-shot coded-aperture optical imaging
EP1982227B1 (en) Imaging system
CN108702440B (en) Image pickup apparatus
Cao et al. A prism-mask system for multispectral video acquisition
US20170045909A1 (en) Reconfigurable optical processing system
CN107896292B (en) Image pickup apparatus and image pickup method
JP6721698B2 (en) Imaging device
US20120044320A1 (en) High resolution 3-D holographic camera
KR20200094062A (en) Lensless Hyperspectral Imaging Method and Apparatus Therefore
US9013590B2 (en) Pixel multiplication using code spread functions
US4105289A (en) Apparatus and method for image sampling
US10012953B2 (en) Method of reconstructing a holographic image and apparatus therefor
CN109900249B (en) Distance measuring device and distance measuring method
US20220113674A1 (en) Differential holography
EP2503379A1 (en) Optical processing method and apparatus
CN117288720A (en) Non-invasive Shan Zhenkuan spectrum scattering imaging system and imaging method based on subarea homogenization
CN110198391B (en) Image pickup apparatus, image pickup method, and image processing apparatus
JP2019092088A (en) Imaging apparatus
JP6984736B2 (en) Imaging device and imaging method
EP3502783B1 (en) Holographic display method and device
US8507836B1 (en) Software defined lensing
CN114208145A (en) Image pickup apparatus and method
US20230280692A1 (en) Totagraphy: Coherent Diffractive/Digital Information Reconstruction by Iterative Phase Recovery Using a Second Camera Imaging the Input Plane

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12717444

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12717444

Country of ref document: EP

Kind code of ref document: A1