WO2010064051A1 - Provision of image data - Google Patents

Provision of image data Download PDF

Info

Publication number
WO2010064051A1
WO2010064051A1 (PCT application no. PCT/GB2009/051652)
Authority
WO
WIPO (PCT)
Prior art keywords
target object
radiation
function
intensity
estimate
Prior art date
Application number
PCT/GB2009/051652
Other languages
French (fr)
Inventor
Andrew Maiden
Original Assignee
Phase Focus Limited
Priority date
Filing date
Publication date
Application filed by Phase Focus Limited filed Critical Phase Focus Limited
Priority to US13/132,501 priority Critical patent/US8908910B2/en
Priority to JP2011539099A priority patent/JP5619767B2/en
Priority to CN200980148909.9A priority patent/CN102239426B/en
Priority to EP09798949.5A priority patent/EP2356487B1/en
Priority to AU2009323838A priority patent/AU2009323838B2/en
Publication of WO2010064051A1 publication Critical patent/WO2010064051A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01T - MEASUREMENT OF NUCLEAR OR X-RADIATION
    • G01T1/00 - Measuring X-radiation, gamma radiation, corpuscular radiation, or cosmic radiation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/003 - Reconstruction from projections, e.g. tomography
    • G06T11/006 - Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2211/00 - Image generation
    • G06T2211/40 - Computed tomography
    • G06T2211/424 - Iterative

Abstract

A method and apparatus are disclosed for providing image data. The method includes the steps of providing incident radiation from a radiation source at a target object and, via at least one detector, detecting an intensity of radiation scattered by the target object. Also via the at least one detector an intensity of radiation provided by the radiation source absent the target object is detected. Image data is provided via an iterative process responsive to the intensity of radiation detected absent the target object and the detected intensity of radiation scattered by the target object.

Description

PROVISION OF IMAGE DATA
The present invention relates to a method and apparatus for providing image data of the type which may be utilised to construct an image of a region of a target object. In particular, but not exclusively, the present invention relates to a method of providing such image data using an iterative process making use of an unknown probe function.
Many types of imaging techniques are known for deriving spatial information about a target object (sometimes referred to as a specimen). For example, in conventional transmission imaging an object is irradiated by plane wave illumination. The waves scattered by the object are re-interfered by a lens to form an image. In the case of very short wavelength imaging (X-rays or electrons) this technique has many known difficulties associated with aberrations and instabilities introduced by the lens, which limit the resolution and interpretability of the resulting image. Typical achievable resolution is many times larger than the theoretical limit. Other types of imaging techniques are known but many of these have problems such as resolution limits, long data gathering times or the need for complex and expensive equipment.
A technique for high resolution imaging has been disclosed in WO 2005/106531. This document, which is herein incorporated by reference for all purposes, discloses a method and apparatus of providing image data for constructing an image of a region of a target object which includes the steps of providing incident radiation from a radiation source at a target object, detecting, via at least one detector, the intensity of radiation scattered by the target object, and providing image data responsive to the detected intensity without high resolution positioning of the incident radiation or a post-target object aperture relative to the target object. Also disclosed is a method for providing such image data via an iterative process using a moveable softly varying probe function such as a transmittance function or illumination function.
Those skilled in the art now refer to the technique disclosed in WO 2005/106531 as the ptychographical iterative engine (or PIE). This is a powerful technique for the recovery of image data relating to an area of an object from a set of diffraction pattern measurements. Each diffraction pattern is formed by illuminating an object with a known wave front of coherent radiation with the requirement that the intensity of the wave front is concentrated within a localised lateral region where it interacts with the object. Examples of such a wave front would be that generated a short distance beyond an aperture when it is illuminated by a plane wave, or the focal spot generated by a convex lens illuminated by a plane wave. The technique is also applicable to scenarios where a target is illuminated by plane wave radiation and a post target object aperture is used to select illumination scattered by a region of the object.
In this sense a diffraction pattern is the distribution of intensity produced by an optical configuration some distance beyond the object and at a plane normal to the direction of propagation of the illumination wave front. This plane is designated as the measurement plane and measurements made at this plane are denoted Φk(u), with u being an appropriate coordinate vector. It is to be noted that when the distance between the measurement plane and a sample plane is small the diffraction pattern is known as a near-field diffraction pattern. When this distance is large the diffraction pattern is known as a far-field diffraction pattern.
Ptychography relies upon the recording of several diffraction patterns at the measurement plane using a suitable recording device such as a CCD camera or the like. The lateral positions of the object and the localised illumination wave front are different for each pattern.
In order to provide useful image data, characteristics of a probe function, which might be a transmittance function associated with a post target object aperture or an illumination function associated with the incident radiation itself, must be known or estimated. This either requires time consuming set up techniques or can lead to inaccuracies if the probe function used is inaccurate. Furthermore the iterative process can be time consuming.
It is an aim of the present invention to at least partly mitigate the above-mentioned problems.
It is an aim of certain embodiments of the present invention to provide a method and apparatus suitable for providing image data which may or may not be used to construct an image of a region of a target object and which can be utilised without careful knowledge of a probe function being required.
It is an aim of certain embodiments of the present invention to provide a method and apparatus for providing image data in which an iterative process is used which produces useful results in an efficient manner.
According to a first aspect of the present invention there is provided a method of providing image data for constructing an image of a region of a target object, comprising the steps of: providing incident radiation from a radiation source at a target object and, via at least one detector, detecting an intensity of radiation scattered by the target object; via the at least one detector, detecting an intensity of radiation provided by the radiation source absent the target object; and providing image data via an iterative process responsive to the intensity of radiation detected absent the target object and the detected intensity of radiation scattered by the target object.
According to a second aspect of the present invention there is provided apparatus for providing image data for generating an image of a region of a target object, comprising: locating means for locating a target object at a predetermined location; a radiation source for providing incident radiation at a target object located by the locating means; at least one detector device for detecting an intensity of radiation scattered by the target object; locating means for locating incident radiation or a post-target aperture at one or more locations with respect to the target object; and processing means for providing image data via an iterative process responsive to an intensity of radiation detected absent the target object and a detected intensity of radiation scattered by the target object.
Certain embodiments of the present invention provide the advantage that during an iterative process a probe function is itself iteratively calculated step by step with a running estimate of the probe function being utilised to determine running estimates of an object function associated with the target object.
Certain embodiments of the present invention provide the advantage that illumination from an optical set up without a target may be efficiently measured before and/or after analysing a target object. This obviates the need for a probe function to be measured at other times. Certain embodiments of the present invention provide a method of providing high resolution images using image data gathered via an iterative process and constructing an image therefrom.
Certain embodiments of the present invention provide the advantage that image data indicating characteristics of a target object may be provided which may then be processed as data so as to determine some other characteristic of a target object. As such an image is not necessarily constructed using the image data.
Embodiments of the present invention will now be described hereinafter, by way of example only, with reference to the accompanying drawings in which:
Figure 1 illustrates incident radiation at a target object;
Figure 2 illustrates a Probe Function and formation of a diffraction pattern with a target object;
Figure 3 illustrates a phase retrieval algorithm;
Figure 4 illustrates a Probe Function and formation of a diffraction pattern without a target object; and
Figure 5 illustrates apparatus for providing image data.
In the drawings like reference numerals refer to like parts.
Figure 1 illustrates how a scattering pattern may be developed and used to determine image data corresponding to information about the structure of a target object. It will be understood that the term target object refers to any specimen or item placed in the path of incident radiation which causes scattering of that radiation. It will be understood that the target object should be at least partially transparent to incident radiation. The target object may or may not have some repetitive structure. Alternatively the target object may be wholly or partially reflective in which case a scattering pattern is measured based on reflected radiation.
Incident radiation 10 is caused to fall upon the target object 11. It is to be understood that the term radiation is to be broadly construed as energy from a radiation source.
This will include electromagnetic radiation including X-rays, emitted particles such as electrons and/or acoustic waves. Such radiation may be represented by a wave function Ψ(r). This wave function includes a real part and an imaginary part as will be understood by those skilled in the art. This may be represented by the wave function's modulus and phase. Ψ(r)* is the complex conjugate of Ψ(r) and Ψ(r)Ψ(r)* = |Ψ(r)|², where |Ψ(r)|² is an intensity which may be measured for the wave function.
The incident radiation 10 is scattered as it passes through and beyond the specimen 11. As such the wave function of the incident radiation as it exits the specimen will be modified in both amplitude and phase with respect to the wave function of the incident radiation at the pre-target side of the specimen. The scattering which occurs may include Fourier diffraction, refraction and/or Fresnel diffraction and any other form of scattering in which characteristics of the incident radiation are modified as a result of propagating after the specimen. If an array of detectors such as a CCD detector 12 is arranged a long distance from the specimen then a diffraction pattern is formed at a diffraction plane 13. A Fourier diffraction pattern will form if the detectors 12 are located a distance D from the specimen where D is sufficiently long for the diffraction pattern to be formed effectively from a point source. If the diffraction plane is formed closer to the specimen, by locating the detectors nearer, then a Fresnel diffraction pattern will be formed.
The incident radiation 10 falls upon a first surface of a target object 11. The incident radiation is scattered in the specimen and transmitted radiation propagates through to a diffraction plane 13 where a diffraction pattern forms.
Figure 2 illustrates the process of Figure 1 in more detail. The radiation 10 is roughly focused, for example by a weak lens, so that a region of a first surface of the target object is illuminated. The weak lens may of course comprise any appropriate focusing apparatus such as a set of plates and a voltage supply for a beam of electrons or a reflective surface for X-rays. The weak focusing is sufficient to substantially confine the probing radiation beam. It is thus not necessary to sharply focus radiation although of course strongly focussed radiation could be used. Here the target object provides an object function O(r) which represents the phase and amplitude alteration introduced into an incident wave as a result of passing through the object of interest. The illuminating radiation incident on the target object represents a probe function P(r) which forms an illumination function such as that generated by a caustic or illumination profile formed by the lens or other optical component. P(r) is the complex stationary value of this wave field calculated at the plane of the object. The exit wave function ψ(r,R) defines the scattered radiation as it exits the downstream surface of the target object. As this exit wave propagates through space it will form a diffraction pattern Ψ(u) at the diffraction plane 13.
It will be understood that rather than weakly (or indeed strongly) focusing illumination on a target, unfocused radiation can be used with a post target aperture. An aperture is located post target object to thereby select a region of the target for investigation. The aperture is formed in a mask so that the aperture defines a "support". A support is an area of a function where that function is not zero. In other words outside the support the function is zero. Outside the support the mask blocks the transmittance of radiation. The term aperture describes a localised transmission function of radiation. This may be represented by a complex variable in two dimensions having a modulus value between 0 and 1. An example is a mask having a physical aperture region of varying transmittance.
Incident radiation would thus fall upon the up-stream side of the specimen and be scattered by the specimen as it is transmitted. A specimen wave O(r) is thus formed as an exit wave function of radiation after interaction with the object. In this way O(r) represents a two-dimensional complex function so that each point in O(r), where r is a two-dimensional coordinate, has associated with it a complex number. O(r) will physically represent an exit wave that would emanate from the object which is illuminated by a plane wave. For example, in the case of electron scattering, O(r) would represent the phase and amplitude alteration introduced into an incident wave as a result of passing through the object of interest. The aperture provides a probe function P(r) (or transmission function) which selects a part of the object exit wave function for analysis. It will be understood that rather than selecting an aperture, a transmission grating or other such filtering function may be located downstream of the object function. The probe function P(r-R) is an aperture transmission function where an aperture is at a position R. The probe function can be represented as a complex function with its complex value given by a modulus and phase which represent the modulus and phase alterations introduced by the probe into a perfect plane wave incident upon it.
The exit wave function ψ(r,R) is an exit wave function of radiation as it exits the aperture. This exit wave ψ(r,R) forms a diffraction pattern Ψ(u) at a diffraction plane. Here r is a vector coordinate in real space and u is a vector coordinate in diffraction space. It will be understood that with both the aperture formed embodiment and the non-aperture embodiment described with respect to Figures 1 and 2, if the diffraction plane at which scattered radiation is detected is moved nearer to the specimen then Fresnel diffraction patterns will be detected rather than Fourier diffraction patterns. In such a case the propagation function from the exit wave ψ(r,R) to the diffraction pattern Ψ(u) will be a Fresnel transform rather than a Fourier transform.
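The two propagation regimes described above translate directly into two numerical models for the operator that maps an exit wave to the measurement plane. The sketch below is illustrative only and is not taken from the patent; the function names, grid parameters and the use of the numpy FFT are assumptions made for this example. The far-field case is modelled with a plain Fourier transform and the near-field case with an angular-spectrum (Fresnel) transfer function.

```python
# Illustrative models of the propagation operator (assumed implementations).
import numpy as np

def propagate_far_field(psi):
    """Far-field (Fourier) propagation of an exit wave, up to scaling and
    quadratic phase factors that do not affect the measured intensity."""
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psi)))

def propagate_fresnel(psi, wavelength, distance, pixel_size):
    """Near-field (Fresnel) propagation via an angular-spectrum transfer
    function; appropriate when the detector is close to the specimen."""
    n = psi.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_size)
    fxx, fyy = np.meshgrid(fx, fx, indexing="ij")
    transfer = np.exp(-1j * np.pi * wavelength * distance * (fxx ** 2 + fyy ** 2))
    return np.fft.ifft2(np.fft.fft2(psi) * transfer)
```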
Figure 3 illustrates an iterative process according to an embodiment of the present invention which can be used to recover image data of the type which can be used to construct an image of an area of an object from a set of diffraction patterns. The iterative process 30 illustrated begins with a guess 31 at the object and a guess 32 at the form of a probe function used. Subsequently these initial guesses are replaced by running guesses in the iterative process. The initial guesses for the image and/or probe function can be random distributions or can themselves be precalculated approximations based on other measurements or prior calculations. The guesses are modelled at a number of sample points and are thus represented by matrices. Such matrices can be stored and manipulated by a computer or other such processing unit. Aptly the sample points are equally spaced and form a rectangular array. The probe function estimation after k iterations is denoted by Pk(r) and the recovered image after k iterations by Ok(r).
The original guesses for the probe function and object function are thus P0(r) and O0(r) respectively, where r is an appropriate coordinate vector.
If the current translation vector relating to the relative positions of the object and the probe function is denoted Rk then the interaction between the guessed-at object distribution and the probe function is modelled by
ψk(r, Rk) = Ok(r) Pk(r - Rk)     (1)
This is the current exit wave front. According to embodiments of the present invention an iterative process is used to update the object guess. This is illustrated by the left hand box 33 in Figure 3. An updated probe function guess is also iteratively calculated which is illustrated by the right hand box 34 in Figure 3.
Referring to the update of the object guess, a first step is to determine the exit wave front ψk(r, Rk) at step 35. This is carried out using equation 1 noted above. A next step is to propagate the exit wave front to the measurement plane which is accomplished using a suitable model of propagation for the coherent wave front. The propagation is represented by the operator T where:
Ψk(u) = T[ψk(r, Rk)]     (2)
The forward transform T shown as step 36 generates a propagated wave front Ψk(u) where u references coordinates in the measurement plane. Since Ψk(u) is complex-valued this can be written as:
Ψk(u) = Ak(u) exp(iθk(u))     (3)
Next this modelled wave front must be compared to a measured diffraction pattern. If the guessed-at object is correct then the following equality holds for every value of k.
Ak(u) = √Φk(u)     (4)
The modulus of the propagated exit wave front equals the square root of the recorded diffraction pattern intensity. Generally this will not be the case as the guessed-at object will not correctly represent the true object at the sample points. To enforce the equality the modulus of the propagated exit wave front is replaced by the square root of the recorded diffraction pattern intensity as:
Ψ'k(u) = √Φk(u) exp(iθk(u))     (5)
At step 37 the modulus of the propagated exit wave front is replaced by the square root of the recorded diffraction pattern intensity.
The corrected wave front is then propagated back to the plane of the object using the inverse propagation operator:
ψ'k(r, Rk) = T⁻¹[Ψ'k(u)]     (6)
This inverse propagation step 39 provides the corrected exit wave form ψ'k(r, Rk). An update step 40 is then calculated to produce an improved object guess Ok+1(r). The update step 40 is carried out according to:
Ok+1(r) = Ok(r) + α (Pk*(r - Rk) / |Pk(r - Rk)|²max) (ψ'k(r, Rk) - ψk(r, Rk))     (7)
This update function is labelled U1 in Figure 3 which generates the update of the object guess Ok+1(r). The parameter α governs the rate of change of the object guess. This value should be adjusted between 0 and 2 as higher values may lead to instability in the updated object guess. According to embodiments of the present invention the probe function is reconstructed in much the same manner as the object function. Aptly the update of the probe function guess is carried out concurrently with the update of the object guess. (It will be appreciated that the Probe Function could optionally be updated more often or less often than the Object Function). In order to achieve this a further diffraction pattern is recorded in the measurement plane with the target object removed from the system. This further diffraction pattern may be recorded prior to the target object being put in place or subsequent to removal of the target object after the previously mentioned diffraction patterns have been used, or may be a combination of diffraction patterns recorded before and after the target object is duly located.
That is to say the diffraction pattern of the probe function itself is recorded. This is denoted as the measurement ΩP(u). The measurement of this diffraction pattern is illustrated in Figure 4.
At step 32 P0(r) is chosen as an initial guess at the probe function which may be random or an approximation based on previous other measurements or calculations. Proceeding in a similar manner to the correction/update steps detailed above the probe function guess is propagated with a transform to the measurement plane so that:
Ωk(u) = T[Pk(r)]     (8)
which can be written as:
Ωk(u) = Bk(u) exp(iφk(u))     (9)
A correction step 43 is then implemented by replacing the modulus of this propagated wave front with that recorded without the target object in the measurement plane 44, giving a corrected wave front:
Ω'k(u) = √ΩP(u) exp(iφk(u))     (10)
The corrected wave front is then inverse propagated back at step 45 to give:
P'k(r) = T⁻¹[Ω'k(u)]     (11)
An update step 46 makes use of an update function U2 which is:
Pk+1(r) = Pk(r) + β (P'k(r) - Pk(r))     (12)
The result of this update function generates the running estimate for the probe function. The parameter β governs the rate of change of the probe guess. This value should be adjusted between 0 and 2 as higher values may lead to instability in the updated probe guess. The running guess for the probe function may be used at step 35 for generating the exit wave front as well as producing the new estimate to be transformed at step 42 to update the running estimate of the probe function itself in the next iteration.
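Pulling the two branches together, a single iteration for one probe position can be sketched numerically as below. This is an illustration only, not a definitive statement of the patented update functions: the propagator is taken to be a plain far-field FFT, and both the conjugate-probe weighting used for U1 and the simple relaxed blend used for U2 are assumptions made for this sketch.

```python
# Illustrative single iteration of the reconstruction (assumed details noted above).
import numpy as np

def iterate_once(obj, probe, sqrt_pattern, sqrt_probe_pattern, shift,
                 alpha=1.0, beta=1.0):
    """One pass of the object update (steps 35-40) and probe update (steps 42-46).

    obj                : complex array, current object guess Ok(r)
    probe              : complex array, current probe guess Pk(r), same shape
    sqrt_pattern       : square root of the intensity recorded with the target
                         in place, in unshifted FFT ordering
    sqrt_probe_pattern : square root of the intensity recorded with the target
                         object removed, in unshifted FFT ordering
    shift              : (dy, dx) probe position Rk in whole pixels
    """
    # Object branch: form the exit wave, apply the modulus constraint, update.
    probe_shifted = np.roll(probe, shift, axis=(0, 1))            # Pk(r - Rk)
    exit_wave = obj * probe_shifted                                # equation (1)
    far = np.fft.fft2(exit_wave)                                   # equation (2)
    far = sqrt_pattern * np.exp(1j * np.angle(far))                # equations (4)-(5)
    exit_corrected = np.fft.ifft2(far)                             # equation (6)
    weight = np.conj(probe_shifted) / (np.abs(probe_shifted) ** 2).max()
    obj = obj + alpha * weight * (exit_corrected - exit_wave)      # assumed form of U1

    # Probe branch: constrain the probe against the target-free recording.
    probe_far = np.fft.fft2(probe)                                 # equation (8)
    probe_far = sqrt_probe_pattern * np.exp(1j * np.angle(probe_far))  # step 43
    probe_corrected = np.fft.ifft2(probe_far)                      # P'k(r), equation (11)
    probe = probe + beta * (probe_corrected - probe)               # assumed form of U2
    return obj, probe
```

In use, this function would be applied in turn to each recorded position, carrying the running object and probe estimates from one position and iteration to the next.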
Figure 5 illustrates apparatus for providing image data which may be used to construct a high-resolution image of a region of a target object according to the above-described embodiment illustrated in Figures 1 and 2. A source of radiation 50 provides illumination onto a lens 51 which weakly focuses the radiation onto a selected region of a target 11. The incident radiation has an incident wave function 52 and an exit wave function 53. This exit wave function is propagated across distance D where a diffraction pattern is formed on an array of detectors 12. The distance D is advantageously sufficiently long so that the propagated exit wave function 53 forms a Fourier diffraction pattern in the far-field. The detector array provides at least one detector which can detect the intensity of radiation scattered by the target object 11. A locating device 54 is provided which may be a micro actuator and this can locate the target object at one or more locations as desired with respect to the incident radiation. In this way radiation from source 50 may be made incident on different locations of the upstream surface of the target 11.
Alternatively, in some applications it may be advantageous for the distance D to be sufficiently small so that the propagated exit wave function 53 forms a Fresnel diffraction pattern on the detector array in the near field.
A control unit 55 provides control signals to the micro actuator and also receives intensity measurement results from each of the pixel detectors in the detector array 12. The control unit 55 includes a microprocessor 56 and a data store 57 together with a user interface 58 which may include a user display and a user input key pad. The control unit may be connected to a further processing device such as a laptop 59 or PC for remote control. Alternatively it will be understood that the control unit 55 could be provided by a laptop or PC. The control unit 55 can automatically control the production of image data in real time. Alternatively a user can use the user interface 58 to select areas of the target object for imaging or provide further user input.
In use the source of radiation 50 illuminates the lens 51 with radiation. The target object 11 is selectively located by the actuator 54 under control of the control unit 55. The radiation forms a diffraction pattern detected at respective locations by each of the detectors in the detector array 12. Results from these detectors are input to the control unit and may be stored in the data store 57. If only one position is being used to derive image data the microprocessor uses this detected information together with program instructions including information about the above-noted algorithm to derive the image data. However if one or more further positions are required prior to finalising the image data the control unit next issues signals to the actuator 54 which locates the specimen at another selected location. The actuator may place the specimen at one of many different positions. After relocation a further diffraction pattern formed on the detector array is measured and the results stored in the control unit. As an example the array 12 may be a CCD array of 1200 x 1200 pixels. If no further intensity measurements are required image data may at this stage be generated by the control unit in accordance with the two newly stored sets of results using the above-noted algorithm. The raw image data may be displayed, or a high-resolution image generated from the image data may be displayed, on the user interface 58 or on a remote display on a PC or other such device. Alternatively or additionally the image data itself may be utilised to determine characteristics associated with the target object (for example by data values being compared with predetermined values).
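The data-collection sequence just described can be summarised in a short sketch. The actuator and camera objects below, and their move(), move_out_of_beam() and grab() methods, are hypothetical placeholders invented for illustration; they do not correspond to any interface named in the patent.

```python
# Hypothetical acquisition loop for the control unit (placeholder hardware API).
def acquire_dataset(actuator, camera, positions):
    """Record one diffraction pattern per specimen position, then record the
    target-free pattern used to recover the probe function."""
    patterns = []
    for position in positions:
        actuator.move(position)          # relocate the specimen relative to the probe
        patterns.append(camera.grab())   # e.g. one 1200 x 1200 intensity frame
    actuator.move_out_of_beam()          # remove the target from the optical path
    probe_pattern = camera.grab()        # diffraction pattern of the probe alone
    return patterns, probe_pattern
```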
The actuator can be used to move the target object out of the optical path to enable the diffraction pattern without the target object to be measured. Alternatively this movement may be effected by another actuator (not shown) or by user intervention.
According to a further embodiment of the invention, a diffuser covers a post-target aperture. The diffuser is arranged to diffuse the wavefront from the target such that the radiation incident on the sample is spread more evenly over all diffraction angles in the measured diffraction pattern. By performing the measurements required to recover the illumination function, or probe function, with the diffuser in place, the effect of the diffuser can be automatically recovered as well. Thus, the diffuser may diffuse the wavefront from the target in an arbitrary way, and it is not necessary to know a priori the nature of the diffuser.
The presence of the diffuser leads to a reduction in the dynamic range of the diffraction pattern. As most detectors have limited dynamic range, reducing the dynamic range of the diffraction pattern may allow a more faithful representation of the diffraction pattern to be determined. Furthermore, as the radiation incident on the sample is spread more evenly over all diffraction angles, the incident flux required to provide the image data may be reduced, thereby reducing the possibility of causing damage to the target object.
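A toy numerical example illustrates the dynamic-range point. In the sketch below, which is not part of the patent, the diffuser is modelled simply as a random phase screen applied to a circular probe, and the peak-to-mean ratio of the far-field intensity is used as a crude proxy for the dynamic range the detector must accommodate; both modelling choices are assumptions made only for this illustration.

```python
# Toy illustration: a random phase screen spreads intensity over diffraction
# angles and lowers the peak-to-mean ratio of the far-field pattern.
import numpy as np

rng = np.random.default_rng(0)
n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
aperture = (x ** 2 + y ** 2 < (n // 8) ** 2).astype(complex)   # simple circular probe

def peak_to_mean(wave):
    """Crude dynamic-range proxy for the far-field intensity of a wave."""
    intensity = np.abs(np.fft.fft2(wave)) ** 2
    return intensity.max() / intensity.mean()

phase_screen = np.exp(2j * np.pi * rng.random((n, n)))          # assumed diffuser model
print("peak-to-mean without diffuser:", round(peak_to_mean(aperture)))
print("peak-to-mean with diffuser:   ", round(peak_to_mean(aperture * phase_screen)))
```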
Any type of diffuser having an arbitrary transfer function may be used. As will be understood by the skilled man, the choice of diffuser will depend on the properties of the radiation used, and the desired diffusion effect. For example, for visible light the diffuser may comprise a ground glass diffuser.
According to a further embodiment of the invention, a diffuser having a known transfer function may be used in conjunction with a known probe function. Such an arrangement allows the diffused probe function to be calculated, allowing the object function to be determined using a precalculated probe function.
Throughout the description and claims of this specification, the words "comprise" and "contain" and variations of the words, for example "comprising" and "comprises", mean "including but not limited to", and are not intended to (and do not) exclude other moieties, additives, components, integers or steps.
Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.
Features, integers, characteristics, compounds, chemical moieties or groups described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith.

Claims

1. A method of providing image data for constructing an image of a region of a target object, comprising the steps of: providing incident radiation from a radiation source at a target object and, via at least one detector, detecting an intensity of radiation scattered by the target object; via the at least one detector, detecting an intensity of radiation provided by the radiation source absent the target object; and providing image data via an iterative process responsive to the intensity of radiation detected absent the target object and the detected intensity of radiation scattered by the target object.
2. The method as claimed in claim 1, further comprising the steps of: detecting the intensity of radiation scattered by the target object with the incident radiation or a post target aperture at a first position with respect to the target object; re-positioning the incident radiation or post-target aperture at at least one further position relative to the target object; subsequently detecting the intensity of radiation scattered by the target object with the incident radiation or post-target aperture at the at least one further position; and providing the image data responsive to the intensity of radiation detected absent the target object and as scattered at the first and at least one further position.
3. The method as claimed in claim 1 or claim 2, further comprising the steps of: detecting the intensity of radiation provided absent the target object before and/or after locating a target object between the radiation source and the detector.
4. The method as claimed in claim 2 or claim 3, further comprising the steps of: estimating an object function indicating at least one characteristic of said region of the target object; estimating a probe function indicating at least one characteristic of incident radiation at the target object or the post-target aperture; and iteratively re-estimating each of the object function and probe function.
5. The method as claimed in claim 4, further comprising the steps of: multiplying the estimated object function by the estimated probe function to thereby provide an exit wave function; propagating the exit wave function to provide an estimate of an expected scattering pattern; and correcting at least one characteristic of said expected scattering pattern according to a detected intensity of radiation scattered by the target object.
6. The method as claimed in claim 5, further comprising the steps of: inverse propagating a corrected expected scattering pattern to thereby provide an updated exit wave function; and updating a running estimate of the object function responsive to the updated exit wave function according to
Ok+1(r) = Ok(r) + α (Pk*(r - R) / |Pk(r - R)|²max) (ψ'k(r, R) - ψk(r, R))
where Ok(r) is the current (kth) estimate of the object function, ψk(r, R) is the current exit wave, ψ'k(r, R) is the updated exit wave whose Fourier Transform or Fresnel Transform has been corrected, Pk(r - R) is the current (kth) estimate of the illumination function, and α is a constant whose value can be adjusted to optimise the performance of the algorithm.
7. The method as claimed in any one of claims 4 to 6, further comprising the steps of: propagating the estimated probe function to provide an estimate of an expected targetless scattering pattern; and correcting at least one characteristic of said expected targetless scattering pattern according to the intensity of radiation detected absent said target object.
8. The method as claimed in claim 7, further comprising the steps of: inverse propagating a corrected expected targetless scattering pattern to thereby provide a running estimate for the probe function; and updating the running estimate of the probe function responsive to the updated probe function according to
P_{k+1}(r) = \Pi'_k(r) + \ell\, O_k^{*}(r)\,\bigl(\psi'_k(r, R) - \psi_k(r, R)\bigr)
where O_k(r) is the current (kth) estimate of the object function, ψ_k(r, R) is the current exit wave and ψ′_k(r, R) the corrected exit wave, Π′_k(r) is the updated illumination function whose Fourier Transform has been corrected using the measurement √I(u), A_k(r) is the current (kth) estimate of the illumination function, and ℓ is a constant whose value can be adjusted to optimise the performance of the algorithm.
9. The method as claimed in claim 7, wherein the estimated probe function is propagated to provide an estimate of the scattering pattern in a measurement plane as Φ_k(u) = T[A_k(r)], where the propagation operator T suitably models the propagation between the plane of the object and the measurement plane, wherein T comprises a Fourier Transform or a Fresnel Transform, Φ_k(u) is the propagated kth illumination function guess, whose modulus must match the recording √I(u), and A_k(r) is the current (kth) estimate of the illumination function.
10. The method as claimed in claim 9, wherein the corrected targetless scattering pattern is inversely propagated back to an object plane as Π′_k(r) = T⁻¹[Φ′_k(u)], where the propagation operator T⁻¹ suitably models the propagation between the measurement plane and the plane of the object, wherein T⁻¹ comprises an Inverse Fourier Transform or an Inverse Fresnel Transform, Π′_k(r) is the corrected illumination function in the plane of the object and Φ′_k(u) is the corrected diffraction pattern whose modulus matches the recorded modulus of the illumination function.
11. The method as claimed in any preceding claim, further comprising the steps of: updating a running estimate of the probe function and a running estimate of an object function simultaneously with each iteration.
12. The method as claimed in claim 4, further comprising the steps of: providing an initial estimate of the probe function as a prior modelled probe function.
13. The method as claimed in claim 4, further comprising the steps of: providing an initial estimate of the probe function by providing a random approximation for the probe function.
14. The method as claimed in any preceding claim, further comprising providing a diffuser arranged to diffuse radiation detected at the detector.
15. The method as claimed in any preceding claim, wherein the target object is at least partially transparent to the incident radiation and detecting an intensity of radiation scattered by the target object comprises detecting an intensity of radiation transmitted by the target object.
16. The method as claimed in any of claims 1 to 14, wherein the target object is at least partially reflective to the incident radiation and detecting an intensity of radiation scattered by the target object comprises detecting an intensity of radiation reflected by the target object.
17. Apparatus for providing image data for generating an image of a region of a target object, comprising: locating means for locating a target object at a predetermined location; a radiation source for providing incident radiation at a target object located by the locating means; at least one detector device for detecting an intensity of radiation scattered by the target object; locating means for locating incident radiation or a post-target aperture at one or more locations with respect to the target object; and processing means for providing image data via an iterative process responsive to an intensity of radiation detected absent the target object and a detected intensity of radiation scattered by the target object.
18. The apparatus as claimed in claim 17, wherein the incident radiation is substantially localised.
19. The apparatus as claimed in claim 17 or 18, further comprising: a diffuser arranged to diffuse radiation detected at the detector.
20. The apparatus as claimed in claim 19, wherein the diffuser is located within a post-target aperture.
21. A computer-readable data storage medium having instructions stored thereon which, when executed by a computer, perform the method as claimed in any one of claims 1 to 16.
22. A method substantially as hereinbefore described with reference to the accompanying drawings.
23. Apparatus constructed and arranged substantially as hereinbefore described with reference to the accompanying drawings.
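The iterative procedure recited in claims 4 to 11 can be illustrated with a short sketch. The Python/NumPy fragment below is not the patented method verbatim: the function and variable names (propagate, apply_modulus, reconstruct, shifts, probe_intensity and so on) are assumptions introduced only for illustration, a plain FFT stands in for the propagation operator T, and the normalisation of the object update is a guess where the claims leave it open. It simply shows how a targetless (probe-only) intensity recording and a set of scattered-intensity recordings might both constrain the running probe and object estimates within one loop.

```python
import numpy as np

def propagate(wave):
    # Stand-in for the propagation operator T (here a plain Fourier transform).
    return np.fft.fft2(wave)

def inverse_propagate(pattern):
    # Stand-in for the inverse operator T^-1.
    return np.fft.ifft2(pattern)

def apply_modulus(pattern, measured_intensity):
    # Replace the modulus of a complex pattern with the square root of a
    # recorded intensity, keeping the current phase estimate.
    eps = 1e-12
    return np.sqrt(measured_intensity) * pattern / (np.abs(pattern) + eps)

def reconstruct(obj, probe, scattered_intensities, probe_intensity, shifts,
                ell=0.1, n_iterations=50):
    # obj, probe: complex 2-D starting guesses (claims 12 and 13 allow a
    #             modelled or a random initial probe estimate).
    # scattered_intensities: recorded patterns, one per probe position.
    # probe_intensity: pattern recorded with no target object present.
    # shifts: (dy, dx) probe positions relative to the object, in pixels.
    for _ in range(n_iterations):
        # Targetless constraint (claims 7 to 10): propagate the probe guess,
        # enforce the recorded modulus, and propagate back.
        probe = inverse_propagate(
            apply_modulus(propagate(probe), probe_intensity))

        # Scattered-intensity constraint (claims 5 and 6): for each position,
        # form the exit wave, correct its modulus, and update the object.
        for intensity, (dy, dx) in zip(scattered_intensities, shifts):
            shifted_probe = np.roll(probe, (dy, dx), axis=(0, 1))
            exit_wave = obj * shifted_probe
            corrected_exit = inverse_propagate(
                apply_modulus(propagate(exit_wave), intensity))
            # Object update in the spirit of claim 6; dividing by the peak
            # probe intensity is an assumed normalisation, not taken from
            # the claims.
            obj = obj + ell * np.conj(shifted_probe) * \
                (corrected_exit - exit_wave) / (np.abs(probe).max() ** 2 + 1e-12)
    return obj, probe
```

In this sketch the probe and the object estimates are refined within the same pass, echoing the simultaneous update of both running estimates in claim 11; a practical implementation would also handle sub-pixel probe shifts, Fresnel rather than Fraunhofer propagation where appropriate, and an empirically tuned constant ℓ, as the claims permit.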
PCT/GB2009/051652 2008-12-04 2009-12-04 Provision of image data WO2010064051A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US13/132,501 US8908910B2 (en) 2008-12-04 2009-12-04 Provision of image data
JP2011539099A JP5619767B2 (en) 2008-12-04 2009-12-04 Supplying image data
CN200980148909.9A CN102239426B (en) 2008-12-04 2009-12-04 Provision of image data
EP09798949.5A EP2356487B1 (en) 2008-12-04 2009-12-04 Provision of image data
AU2009323838A AU2009323838B2 (en) 2008-12-04 2009-12-04 Provision of image data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB0822149.1A GB0822149D0 (en) 2008-12-04 2008-12-04 Provision of image data
GB0822149.1 2008-12-04

Publications (1)

Publication Number Publication Date
WO2010064051A1 true WO2010064051A1 (en) 2010-06-10

Family

ID=40289488

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2009/051652 WO2010064051A1 (en) 2008-12-04 2009-12-04 Provision of image data

Country Status (7)

Country Link
US (1) US8908910B2 (en)
EP (2) EP2356487B1 (en)
JP (1) JP5619767B2 (en)
CN (1) CN102239426B (en)
AU (1) AU2009323838B2 (en)
GB (1) GB0822149D0 (en)
WO (1) WO2010064051A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012001397A2 (en) 2010-06-28 2012-01-05 Phase Focus Limited Calibration of a probe in ptychography
WO2012038749A2 (en) 2010-09-24 2012-03-29 Phase Focus Limited Improvements in three dimensional imaging
WO2012073043A2 (en) 2010-12-03 2012-06-07 Phase Focus Limited Improvements in providing image data
WO2012146929A1 (en) 2011-04-27 2012-11-01 Phase Focus Limited A method and apparatus for providing image data for constructing an image of a region of a target object
WO2013008034A1 (en) 2011-07-14 2013-01-17 Phase Focus Limited Method and apparatus for position determination
WO2013110941A2 (en) 2012-01-24 2013-08-01 Phase Focus Limited Method and apparatus for determining object characteristics
WO2016020671A1 (en) * 2014-08-08 2016-02-11 The University Of Sheffield Methods and apparatus for determining image data
WO2016174472A1 (en) 2015-04-30 2016-11-03 Phase Focus Limited Method and apparatus for determining temporal behaviour of an object
CN106324853A (en) * 2016-10-17 2017-01-11 北京工业大学 Visible light region double object distance overlapped imaging method
WO2018020217A1 (en) 2016-07-25 2018-02-01 Phase Focus Limited Method and apparatus for seed point determination
US10466184B2 (en) 2012-05-03 2019-11-05 Phase Focus Limited Providing image data
WO2020115480A1 (en) 2018-12-07 2020-06-11 Phase Focus Limited Method and apparatus for determining temporal behaviour of an object

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0822149D0 (en) * 2008-12-04 2009-01-14 Univ Sheffield Provision of image data
GB201215558D0 (en) * 2012-08-31 2012-10-17 Phase Focus Ltd Improvements in phase retrieval
US10652444B2 (en) 2012-10-30 2020-05-12 California Institute Of Technology Multiplexed Fourier ptychography imaging systems and methods
AU2013338193A1 (en) 2012-10-30 2015-05-21 California Institute Of Technology Fourier ptychographic imaging systems, devices, and methods
US9864184B2 (en) 2012-10-30 2018-01-09 California Institute Of Technology Embedded pupil function recovery for fourier ptychographic imaging devices
CN103246077B (en) * 2013-05-10 2015-04-15 中国科学院上海光学精密机械研究所 Device utilizing grating to realize object imaging
WO2015017730A1 (en) 2013-07-31 2015-02-05 California Institute Of Technoloby Aperture scanning fourier ptychographic imaging
WO2015027188A1 (en) 2013-08-22 2015-02-26 California Institute Of Technoloby Variable-illumination fourier ptychographic imaging devices, systems, and methods
US11468557B2 (en) 2014-03-13 2022-10-11 California Institute Of Technology Free orientation fourier camera
US10162161B2 (en) 2014-05-13 2018-12-25 California Institute Of Technology Ptychography imaging systems and methods with convex relaxation
CN104155320B (en) * 2014-08-22 2018-08-10 南京大学 A kind of time resolution overlapping associations Imaging
CN104132952B (en) * 2014-08-22 2017-05-17 南京大学 Time resolution ptychography
CN107111118B (en) 2014-12-22 2019-12-10 加州理工学院 EPI illumination Fourier ptychographic imaging for thick samples
JP2018508741A (en) 2015-01-21 2018-03-29 カリフォルニア インスティチュート オブ テクノロジー Fourier typography tomography
AU2016211634A1 (en) 2015-01-26 2017-05-04 California Institute Of Technology Array level fourier ptychographic imaging
US10684458B2 (en) 2015-03-13 2020-06-16 California Institute Of Technology Correcting for aberrations in incoherent imaging systems using fourier ptychographic techniques
US9993149B2 (en) 2015-03-25 2018-06-12 California Institute Of Technology Fourier ptychographic retinal imaging methods and systems
US10228550B2 (en) 2015-05-21 2019-03-12 California Institute Of Technology Laser-based Fourier ptychographic imaging systems and methods
CN107924119B (en) 2015-08-12 2022-08-09 Asml荷兰有限公司 Inspection apparatus, inspection method, and manufacturing method
WO2017157645A1 (en) 2016-03-15 2017-09-21 Stichting Vu Inspection method, inspection apparatus and illumination method and apparatus
US11092795B2 (en) 2016-06-10 2021-08-17 California Institute Of Technology Systems and methods for coded-aperture-based correction of aberration obtained from Fourier ptychography
US10568507B2 (en) 2016-06-10 2020-02-25 California Institute Of Technology Pupil ptychography methods and systems
CN110720207B (en) * 2017-06-01 2021-04-27 富士胶片株式会社 Image processing device, imaging system, image processing method, and recording medium
WO2019090149A1 (en) 2017-11-03 2019-05-09 California Institute Of Technology Parallel digital imaging acquisition and restoration methods and systems

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2897444A1 (en) * 2006-02-10 2007-08-17 Commissariat Energie Atomique METHOD OF ESTIMATING DIFFUSED RADIATION IN A BIDIMENSIONAL DETECTOR
WO2008142360A1 (en) * 2007-05-22 2008-11-27 Phase Focus Limited Three dimensional imaging

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3120567A1 (en) * 1981-05-23 1983-01-20 Philips Patentverwaltung Gmbh, 2000 Hamburg SCREEN BEAM EXAMINATION ARRANGEMENT
US7136452B2 (en) * 1995-05-31 2006-11-14 Goldpower Limited Radiation imaging system, device and method for scan imaging
US6690474B1 (en) * 1996-02-12 2004-02-10 Massachusetts Institute Of Technology Apparatus and methods for surface contour measurement
JP3957803B2 (en) * 1996-02-22 2007-08-15 キヤノン株式会社 Photoelectric conversion device
US6545790B2 (en) * 1999-11-08 2003-04-08 Ralph W. Gerchberg System and method for recovering phase information of a wave front
EP1230576B1 (en) * 1999-11-08 2009-07-22 Wavefront Analysis Inc. System and method for recovering phase information of a wave front
US6875973B2 (en) * 2000-08-25 2005-04-05 Amnis Corporation Auto focus for a flow imaging system
US7085426B2 (en) * 2001-10-15 2006-08-01 Jonas August Volterra filters for enhancement of contours in images
US7038787B2 (en) * 2002-09-03 2006-05-02 Ut-Battelle, Llc Content-based fused off-axis object illumination direct-to-digital holography
GB0409572D0 (en) 2004-04-29 2004-06-02 Univ Sheffield High resolution imaging
US8405890B2 (en) * 2007-01-29 2013-03-26 Celloptic, Inc. System, apparatus and method for extracting image cross-sections of an object from received electromagnetic radiation
GB0817650D0 (en) * 2008-09-26 2008-11-05 Phase Focus Ltd Improvements in the field of imaging
GB0822149D0 (en) * 2008-12-04 2009-01-14 Univ Sheffield Provision of image data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2897444A1 (en) * 2006-02-10 2007-08-17 Commissariat Energie Atomique METHOD OF ESTIMATING DIFFUSED RADIATION IN A BIDIMENSIONAL DETECTOR
WO2008142360A1 (en) * 2007-05-22 2008-11-27 Phase Focus Limited Three dimensional imaging

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BARUCHEL J ET AL: "Advances in synchrotron hard X-ray based imaging", COMPTES RENDUS - PHYSIQUE, ELSEVIER, PARIS, FR LNKD- DOI:10.1016/J.CRHY.2007.08.003, vol. 9, no. 5-6, 1 June 2008 (2008-06-01), pages 624 - 641, XP022701305, ISSN: 1631-0705, [retrieved on 20071031] *
PIERRE THIBAULT ET AL: "High-Resolution Scanning X-Ray Diffraction Microscopy: Supporting Online Material", INTERNET CITATION, 18 July 2008 (2008-07-18), pages 1 - 10, XP007911178, Retrieved from the Internet <URL:www.sciencemag.org/cgi/content/full/321/5887/379/DC1> [retrieved on 20100118] *
RODENBURG J ET AL: "A phase retrieval algorithm for shifting illumination", APPLIED PHYSICS LETTERS, AIP, AMERICAN INSTITUTE OF PHYSICS, MELVILLE, NY, US LNKD- DOI:10.1063/1.1823034, vol. 85, no. 20, 1 January 2004 (2004-01-01), pages 4795 - 4797, XP012063468, ISSN: 0003-6951 *

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8942449B2 (en) 2010-06-28 2015-01-27 Phase Focus Limited Calibration of a probe in ptychography
CN103201648A (en) * 2010-06-28 2013-07-10 相位聚焦有限公司 Calibration of a probe in ptychography
WO2012001397A2 (en) 2010-06-28 2012-01-05 Phase Focus Limited Calibration of a probe in ptychography
KR101810637B1 (en) 2010-06-28 2017-12-19 페이즈 포커스 리미티드 Calibration of a probe in ptychography
EA026014B1 (en) * 2010-06-28 2017-02-28 Фейз Фокус Лимитед Method of providing image data for constructing an image of a region of a target object by an iterative process
WO2012001397A3 (en) * 2010-06-28 2013-01-03 Phase Focus Limited Calibration of a probe in ptychography
JP2013534633A (en) * 2010-06-28 2013-09-05 フェーズ フォーカス リミテッド Calibration method of probe by typography method
CN103200870A (en) * 2010-09-24 2013-07-10 相位聚焦有限公司 Three dimensional imaging
KR101896506B1 (en) * 2010-09-24 2018-09-07 페이즈 포커스 리미티드 Three dimensional imaging
JP2013543582A (en) * 2010-09-24 2013-12-05 フェーズ フォーカス リミテッド Improvement of 3D imaging
KR20140037009A (en) * 2010-09-24 2014-03-26 페이즈 포커스 리미티드 Three dimensional imaging
WO2012038749A2 (en) 2010-09-24 2012-03-29 Phase Focus Limited Improvements in three dimensional imaging
US9029745B2 (en) 2010-12-03 2015-05-12 Phase Focus Limited Method and apparatus for providing image data
WO2012073043A2 (en) 2010-12-03 2012-06-07 Phase Focus Limited Improvements in providing image data
WO2012073043A3 (en) * 2010-12-03 2012-07-26 Phase Focus Limited Improvements in providing image data
US9121764B2 (en) 2010-12-03 2015-09-01 Phase Focus Limited Providing image data
US9448160B2 (en) 2011-04-27 2016-09-20 Phase Focus Limited Method and apparatus for providing image data for constructing an image of a region of a target object
CN103503022A (en) * 2011-04-27 2014-01-08 相位聚焦有限公司 Method and apparatus for providing image data for constructing image of region of target object
WO2012146929A1 (en) 2011-04-27 2012-11-01 Phase Focus Limited A method and apparatus for providing image data for constructing an image of a region of a target object
US9618332B2 (en) 2011-07-14 2017-04-11 Phase Focus Limited Method and apparatus for position determination
WO2013008034A1 (en) 2011-07-14 2013-01-17 Phase Focus Limited Method and apparatus for position determination
US9274024B2 (en) 2012-01-24 2016-03-01 Phase Focus Limited Method and apparatus for determining object characteristics
US9784640B2 (en) 2012-01-24 2017-10-10 Phase Focus Limited Method and apparatus for determining object characteristics
WO2013110941A2 (en) 2012-01-24 2013-08-01 Phase Focus Limited Method and apparatus for determining object characteristics
US10466184B2 (en) 2012-05-03 2019-11-05 Phase Focus Limited Providing image data
WO2016020671A1 (en) * 2014-08-08 2016-02-11 The University Of Sheffield Methods and apparatus for determining image data
WO2016174472A1 (en) 2015-04-30 2016-11-03 Phase Focus Limited Method and apparatus for determining temporal behaviour of an object
US10679038B2 (en) 2015-04-30 2020-06-09 Phase Focus Limited Method and apparatus for determining temporal behaviour of an object in image data
WO2018020217A1 (en) 2016-07-25 2018-02-01 Phase Focus Limited Method and apparatus for seed point determination
CN106324853A (en) * 2016-10-17 2017-01-11 北京工业大学 Visible light region double object distance overlapped imaging method
WO2020115480A1 (en) 2018-12-07 2020-06-11 Phase Focus Limited Method and apparatus for determining temporal behaviour of an object

Also Published As

Publication number Publication date
AU2009323838B2 (en) 2013-12-19
JP5619767B2 (en) 2014-11-05
US20110235863A1 (en) 2011-09-29
EP2410353A2 (en) 2012-01-25
EP2356487B1 (en) 2018-04-04
CN102239426A (en) 2011-11-09
CN102239426B (en) 2014-11-12
AU2009323838A1 (en) 2011-06-23
JP2012511147A (en) 2012-05-17
EP2356487A1 (en) 2011-08-17
GB0822149D0 (en) 2009-01-14
EP2410353A3 (en) 2012-02-22
AU2009323838A2 (en) 2011-11-03
US8908910B2 (en) 2014-12-09

Similar Documents

Publication Publication Date Title
EP2356487B1 (en) Provision of image data
EP2585853B1 (en) Calibration of a probe in ptychography
AU2005238692B2 (en) High resolution imaging
AU2008252706A1 (en) Three dimensional imaging
US20140043616A1 (en) Method and apparatus for providing image data for constructing an image of a region of a target object
US8917393B2 (en) Method and apparatus for providing image data

Legal Events

Date Code Title Description
WWE  WIPO information: entry into national phase  (Ref document number: 200980148909.9; Country of ref document: CN)
121  EP: the EPO has been informed by WIPO that EP was designated in this application  (Ref document number: 09798949; Country of ref document: EP; Kind code of ref document: A1)
WWE  WIPO information: entry into national phase  (Ref document number: 13132501; Country of ref document: US)
WWE  WIPO information: entry into national phase  (Ref document number: 3915/CHENP/2011; Country of ref document: IN)
NENP  Non-entry into the national phase  (Ref country code: DE)
WWE  WIPO information: entry into national phase  (Ref document number: 2011539099; Country of ref document: JP)
WWE  WIPO information: entry into national phase  (Ref document number: 2009798949; Country of ref document: EP)
ENP  Entry into the national phase  (Ref document number: 2009323838; Country of ref document: AU; Date of ref document: 20091204; Kind code of ref document: A)