WO2022195731A1 - Estimation device, estimation system, estimation method, and recording medium

Estimation device, estimation system, estimation method, and recording medium

Info

Publication number
WO2022195731A1
WO2022195731A1 · PCT/JP2021/010679 · JP2021010679W
Authority
WO
WIPO (PCT)
Prior art keywords
voxel
estimation
wavefront
information
estimated
Prior art date
Application number
PCT/JP2021/010679
Other languages
English (en)
Japanese (ja)
Inventor
渡部智史
Original Assignee
株式会社エビデント
Application filed by 株式会社エビデント
Priority to PCT/JP2021/010679
Publication of WO2022195731A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 - Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/41 - Refractivity; Phase-affecting properties, e.g. optical path length
    • G01N 21/59 - Transmissivity
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 - Microscopes
    • G02B 21/36 - Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements

Definitions

  • the present invention relates to an estimation device, an estimation system, an estimation method, and a recording medium.
  • Non-Patent Document 1 discloses an imaging device capable of acquiring spatial frequency information of a sample.
  • This imaging device has a light source, a microscope objective lens, an imaging lens, and a photodetector.
  • the sample is rotated around two orthogonal axes serving as rotation axes.
  • the two axes of rotation lie in a plane perpendicular to the optical axis of the microscope objective.
  • the spatial frequency information of the sample can be obtained isotropically. That is, the acquisition range of the scattering potential can be widened. As a result, the number of scattering potentials that can be obtained can be increased.
  • the three-dimensional optical properties of the sample can be obtained from the scattering potential.
  • a three-dimensional optical property is, for example, a refractive index distribution or an absorptance distribution.
  • In Non-Patent Document 1, the spatial frequency information obtained isotropically is used as it is to calculate the three-dimensional optical characteristics of the sample. Therefore, the amount of calculation for the three-dimensional optical characteristics is large.
  • An estimation device comprises a memory and a processor, and the memory stores a plurality of pieces of composite information;
  • the composite information has wavefront information and rotation angle information; the wavefront information is wavefront information acquired based on illumination light that has passed through an object,
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • the processor performs an estimation process to estimate the three-dimensional optical properties of the object;
  • the three-dimensional optical property is a refractive index distribution or an absorptance distribution
  • the voxel spaces include a voxel space composed of a set of first calculation voxels, a voxel space composed of a set of estimation voxels, and a voxel space composed of a set of update voxels,
  • the reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light,
  • in the first calculation voxel, the voxel width in the reference direction is Δz1,
  • in the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2,
  • Δz1 is wider than Δz2,
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information
  • the estimation process includes: a process A of generating an estimated value of the first calculation voxel; a process B in which estimated wavefront information is generated by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated value of the first calculation voxel generated in the process A, the estimated wavefront information is constrained by the wavefront information, a wavefront propagating in the direction opposite to the reference direction is calculated, and a result of the update voxels is generated; and a process C of updating the estimated values of the estimation voxels based on the result of the update voxels generated in the process B.
  • An estimation system comprises the estimation device described above, a light source that emits illumination light, a photodetector, a stage on which an object is placed, and an angle changing mechanism. The stage is placed on the optical path from the light source to the photodetector, and the angle changing mechanism changes the arrangement angle of the object with respect to the optical axis of the optical path.
  • An estimation method is a method for estimating a three-dimensional optical property of an object, in which the three-dimensional optical property is a refractive index distribution or an absorptance distribution,
  • the composite information has wavefront information and rotation angle information; the wavefront information is wavefront information acquired based on illumination light that has passed through an object,
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • the voxel spaces include a voxel space composed of a set of first calculation voxels, a voxel space composed of a set of estimation voxels, and a voxel space composed of a set of update voxels,
  • the reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light
  • in the first calculation voxel, the voxel width in the reference direction is Δz1,
  • in the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2,
  • Δz1 is wider than Δz2,
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information
  • the result indicates the difference between the object wavefront and the estimated object wavefront; the estimation method executes the estimation process, and in the estimation process, the estimation object is rotated in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels coincides with the predetermined orientation, and a process A of generating the estimated value of the first calculation voxel corresponding to the estimation object is executed;
  • estimated wavefront information is generated by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated value of the first calculation voxel generated in the process A; the estimated wavefront information is constrained by the wavefront information; a wavefront propagating in the direction opposite to the reference direction is calculated, and a process B of generating the result of the update voxels from the obtained results is executed; and a process C of updating the estimated values of the estimation voxels based on the result of the update voxels generated in the process B is executed.
  • A recording medium is a computer-readable recording medium recording a program for causing the processor of a computer having a memory and a processor to execute estimation processing,
  • the composite information has wavefront information and rotation angle information,
  • Wavefront information is wavefront information acquired based on illumination light that has passed through an object,
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • the estimation process estimates the three-dimensional optical properties of the object,
  • the three-dimensional optical property is a refractive index distribution or an absorptance distribution
  • the voxel spaces include a voxel space composed of a set of first calculation voxels, a voxel space composed of a set of estimation voxels, and a voxel space composed of a set of update voxels,
  • the reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light,
  • in the first calculation voxel, the voxel width in the reference direction is Δz1,
  • in the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2,
  • Δz1 is wider than Δz2,
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information, and the result indicates the difference between the object wavefront and the estimated object wavefront,
  • estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated value of the first calculation voxel generated in the process A; the estimated wavefront information is constrained by the wavefront information; the wavefront propagating in the direction opposite to the reference direction is calculated; and the result of the update voxels is generated,
  • the estimated values of the estimation voxels are updated based on the result of the update voxels generated in the process B, and
  • the recording medium is a computer-readable recording medium recording a program for causing the processor to execute the estimation process using the process a and the processes A to C.
  • The present disclosure provides an estimation device, an estimation system, an estimation method, and a recording medium that can acquire the three-dimensional optical properties of an object with a small amount of computation.
  • Among the drawings are a diagram showing an image of an object and a graph showing the relationship between computation time and voxel width.
  • the estimation device of this embodiment includes a memory and a processor, and the memory stores a plurality of composite information.
  • the composite information includes wavefront information and rotation angle information.
  • The wavefront information is wavefront information acquired based on the illumination light that has passed through the object. The rotation angle information indicates the orientation of the object at that time.
  • a processor performs an estimation process to estimate a three-dimensional optical property of the object, where the three-dimensional optical property is a refractive index distribution or an absorptance distribution.
  • an estimation process for estimating the three-dimensional optical properties of the object is executed.
  • a three-dimensional optical property is a refractive index distribution or an absorptance distribution.
  • FIG. 1 is a diagram showing the estimation device of this embodiment.
  • the estimating device 1 comprises a memory 2 and a processor 3 .
  • the memory 2 stores multiple pieces of composite information.
  • Processor 3 performs an estimation process to estimate the three-dimensional optical properties of the object.
  • a three-dimensional optical property is a refractive index distribution or an absorptance distribution. Multiple composite information can be used in the estimation process.
  • the processor may be implemented as an ASIC or FPGA, or may be a CPU.
  • the CPU reads a program from memory and executes processing.
  • the composite information has wavefront information and rotation angle information.
  • Wavefront information is wavefront information acquired based on illumination light that has passed through an object.
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • Wavefront information is used in the estimation process. The more wavefront information there is, the better: with a large amount of wavefront information, the three-dimensional optical properties of the object can be estimated with high accuracy.
  • Wavefront information can be obtained, for example, from interference fringes.
  • Interference fringes are formed by the measurement light and the reference light.
  • the measurement light is the illumination light that has passed through the object.
  • Reference light is illumination light that does not pass through the object.
  • Parallel light is used as illumination light.
  • When light intensity is used as the wavefront information, the image itself can be used as the wavefront information; in that case, it is not necessary to obtain the wavefront information by analyzing the image.
  • Wavefront information can be obtained by analyzing the image of the interference fringes. Therefore, it is necessary to acquire interference fringes. A method of obtaining an image of interference fringes will be described.
  • a plurality of wavefront information and a plurality of rotation angle information are used for the estimation process. Therefore, the memory stores a plurality of wavefront information and a plurality of rotation angle information.
  • the wavefront information is wavefront information acquired based on the illumination light that has passed through the object. With a plurality of pieces of wavefront information, the incident angle of the illumination light to the object differs for each wavefront information.
  • Wavefront information includes any of amplitude, phase, optical intensity, and complex amplitude.
  • the wavefront information is information on the wavefront on the imaging plane.
  • The imaging plane is the plane on which the light is detected by the photodetector, and is also called the image pickup plane.
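  • As a concrete illustration of how the composite information above might be organized, the following is a minimal sketch in Python. The class and field names are illustrative assumptions, not terms from this disclosure.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CompositeInfo:
    """One piece of composite information: wavefront information plus
    the rotation angle at which it was acquired (names are illustrative)."""
    wavefront: np.ndarray      # wavefront information on the imaging plane:
                               # amplitude, phase, light intensity, or complex amplitude
    rotation_angle_deg: float  # orientation of the object when the illumination
                               # light passed through it

# the memory stores a plurality of such pieces, one per relative angle
composite_records: list[CompositeInfo] = []
```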
  • FIG. 2 is a diagram showing how illumination light passes through an object.
  • FIGS. 2(a), 2(b), and 2(c) are diagrams showing how the orientation of the object changes.
  • the orientation of the object when the illumination light passes through it is the orientation of the object relative to the traveling direction of the illumination light.
  • The traveling direction of the illumination light is the optical axis direction of the optical path of the measuring device. If the measuring device includes a detection optical system, it is the optical axis direction of the detection optical system; if it includes an illumination optical system, it is the optical axis direction of the illumination optical system. The traveling direction of the illumination light is also the direction perpendicular to the photodetection surface of the photodetector of the measuring device.
  • the orientation of the object when the illumination light passes through the object can be replaced with the relative orientation of the illumination light and the object (hereinafter referred to as "relative direction").
  • Instead of changing the orientation of the object relative to a fixed measuring device, the orientation of the measuring device may be changed while the object is fixed.
  • wavefront information is wavefront information acquired based on illumination light that has passed through an object.
  • the amount of wavefront information is affected by relative orientation.
  • Each time the relative direction is changed, wavefront information can be acquired; therefore, the amount of wavefront information can be increased. Also, if the relative directions are different, the region of the object through which the illumination light passes is different, so wavefront information at one relative direction contains information that is not present in wavefront information at another relative direction. For this reason as well, the amount of wavefront information can be increased.
  • FIG. 3 is a flowchart of the method for obtaining interference fringes. This acquisition method changes the relative orientation.
  • In step S10, the angle change count Nθ is set.
  • The angle to be changed is the relative direction. For example, when the relative direction is changed five times, 5 is set as the value of the angle change count Nθ.
  • The deviation of the relative direction can be expressed as an angle, and is represented by the relative angle θ(m).
  • When the deviation of the relative direction is 0°, the value of the relative angle θ(m) is set to 0°.
  • In step S20, the relative angles θ(m) are set.
  • For example, the value of θ(1) is set to 0°, the value of θ(2) is set to 4°, and the value of θ(3) is set to 7°.
  • In step S30, the value of the variable m is set to 1.
  • In step S40, positioning is performed based on the relative angle θ(m).
  • The object to be positioned is the illumination light or the object. In the positioning, the object is rotated so that the orientation of the object with respect to the illumination light matches the value of the relative angle θ(m).
  • In step S50, an image of the interference fringes I(m) is acquired.
  • Interference fringes I(m) are formed by irradiating the object with illumination light.
  • An image of the interference fringes I(m) can be obtained by imaging the interference fringes I(m) with a photodetector.
  • the photodetector includes an image sensor (imaging element) such as a CCD.
  • variable m represents the ordinal number for the relative angle.
  • fringes I(m) represent fringes formed at the m-th relative angle.
  • In step S60, it is determined whether or not the value of the variable m matches the value of the angle change count Nθ. If the determination result is NO, step S70 is executed. If the determination result is YES, the process ends.
  • In step S70, 1 is added to the value of the variable m. After step S70 ends, the process returns to step S40.
  • Because the value of the variable m has been incremented by one, steps S40 and S50 are executed at another relative angle. Steps S40 and S50 are repeated until images have been acquired at all relative angles, as sketched below.
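  • The acquisition flow of FIG. 3 can be summarized in code. This is a hedged sketch: set_relative_angle() and capture_fringe_image() are hypothetical stand-ins for the hardware control that the flowchart leaves abstract.

```python
import numpy as np

def set_relative_angle(theta_deg: float) -> None:
    """Placeholder for step S40: position the object (or the illumination)
    so the relative angle matches theta_deg. Hardware specific."""
    ...

def capture_fringe_image() -> np.ndarray:
    """Placeholder for step S50: read one fringe image from the photodetector."""
    ...

def acquire_fringe_images(relative_angles_deg: list[float]) -> list[np.ndarray]:
    """Steps S10-S70: acquire one interference-fringe image I(m) per relative
    angle theta(m); len(relative_angles_deg) plays the role of Ntheta."""
    images = []
    for theta in relative_angles_deg:          # loop over m = 1 .. Ntheta
        set_relative_angle(theta)              # step S40
        images.append(capture_fringe_image())  # step S50
    return images
```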
  • Wavefront information can be obtained from interference fringes.
  • the wavefront information acquired from the interference fringes I(m) is assumed to be wavefront information W(m).
  • a plurality of images of interference fringes include images of interference fringes at different angles of incidence of the illumination light on the object. Therefore, a plurality of pieces of wavefront information can be obtained from a plurality of images of interference fringes.
  • The wavefront information W(m) is stored in the memory 2. At this time, the value of the angle change count Nθ and the value of the relative angle θ(m) are also stored in the memory 2.
  • the processor 3 performs an estimation process of estimating the three-dimensional optical properties of the object.
  • In the estimation process, the three-dimensional optical properties of the object are estimated using both the multiple pieces of wavefront information and the multiple pieces of rotation angle information.
  • Wavefront information is used to estimate the three-dimensional optical properties of an object. In order to obtain wavefront information, it is necessary to obtain the wavefront that has passed through the object.
  • The estimation apparatus of this embodiment obtains the wavefront using the beam propagation method. The wavefront may also be obtained using the FDTD (Finite Difference Time Domain) method.
  • the processing executed by the processor 3 uses a plurality of voxel spaces.
  • a voxel space consists of a collection of voxels.
  • the plurality of voxel spaces has a voxel space configured with a set of first calculation voxels, a voxel space configured with a set of estimation voxels, and a voxel space configured with a set of update voxels.
  • the reference direction of the voxel space is the traveling direction of the pseudo illumination light. Therefore, the reference direction of the first calculation voxel, the estimation voxel, and the update voxel is also the traveling direction of the pseudo illumination light.
  • The reference direction is the optical axis direction of the optical path of the measurement device during measurement, in other words, the direction that coincides with the traveling direction of the illumination light. If the measuring apparatus includes a detection optical system, it is the optical axis direction of the detection optical system; if it includes an illumination optical system, it is the optical axis direction of the illumination optical system. It is also the direction perpendicular to the photodetection plane of the photodetector of the measurement device.
  • a voxel space configured by a set of first calculation voxels will be referred to as a first calculation voxel.
  • a voxel space composed of a set of estimation voxels is called an estimation voxel.
  • a voxel space composed of a set of update voxels is called an update voxel.
  • In the first calculation voxel, the voxel width in the reference direction is Δz1.
  • In the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2.
  • Δz1 is wider than Δz2.
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information.
  • the processing performed by processor 3 produces a result.
  • the result shows the difference between the object wavefront and the estimated object wavefront.
  • FIG. 4 is a flow chart of processing executed by the processor 3 .
  • the process executed by the processor 3 has steps S10, S20, and S30.
  • Step S10 is a process A for generating the estimated value of the first calculation voxel.
  • the estimation object is rotated in the voxel space so that the orientation of the estimation object corresponding to the set of estimated values of the estimation voxels with respect to the reference direction coincides with a predetermined orientation.
  • An estimate of a first computational voxel corresponding to the object is generated.
  • Step S20 is a process B for generating update voxel results.
  • In process B, estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated value of the first calculation voxel generated in process A; the estimated wavefront information is constrained by the wavefront information; the wavefront propagating in the direction opposite to the reference direction is calculated; and the result of the update voxels is generated.
  • Step S30 is processing C for updating the estimated value of the estimation voxel.
  • the estimated values of the estimation voxels are updated based on the result of the update voxels generated in the process B.
  • the processing executed by the processor 3 includes first processing, second processing, and estimation processing.
  • FIG. 5 is a flowchart of processing executed by the processor. This process has steps S100, S200, S300, S400, and S500.
  • step S100 various settings are made.
  • Step S100 includes step S110, step S120, step S130, and step S140.
  • In step S110, the angle change count Nθ is set.
  • The memory 2 stores the value of the angle change count Nθ, so the stored value may be set as the value of Nθ. For example, if 5 is stored in the memory 2, 5 is set as the value of the angle change count Nθ.
  • Step S120 a reference rotation angle is set.
  • Step S120 is the first process.
  • the memory 2 stores rotation angle information.
  • the rotation angle information includes multiple rotation angles.
  • Step S120 does not have to be executed when the rotation angle information of the composite information records a deviation from 0 degrees.
  • one of the multiple rotation angles is set as the reference rotation angle. For example, if the five rotation angles are ⁇ 10°, ⁇ 5°, 0°, +5°, and +10°, 0° is set as the reference rotation angle.
  • If the rotation angle information is recorded as a rotation angle from 0 degrees, there is no need to set the reference rotation angle; the rotation angle indicated by the read rotation angle information may be processed as it is.
  • a rotation angle not included in the rotation angle information may be set as the reference rotation angle.
  • 3D optical characteristics are estimated by simulation.
  • the simulation uses an estimated object.
  • the estimated object can be represented in voxel space.
  • voxel space is a collection of voxels.
  • an estimated object can be defined.
  • Three-dimensional optical properties of the putative object can also be represented by voxel data.
  • a simulation can be performed by defining voxels. An initial value can be set for each voxel.
  • the first calculation voxel, estimation voxel, and update voxel have a plurality of voxels. Assuming that the coordinate axes of the orthogonal coordinate system are the X-axis, Y-axis and Z-axis, each voxel can be represented by X-coordinate, Y-coordinate and Z-coordinate.
  • the wavefront propagating through the estimated object is calculated.
  • the fringe image can be obtained by changing the orientation of the object without changing the orientation of the illumination light.
  • the traveling direction of the illumination light is the Z-axis direction
  • the illumination light travels in the Z-axis direction.
  • the propagation direction of the wavefront is the direction of the Z-axis.
  • the estimation voxel data, the first calculation voxel data, and the update voxel data are used.
  • In step S130, the estimation voxels are defined.
  • Step S130 is the second process.
  • estimation voxels are defined.
  • an estimated value can be set for the estimation voxels.
  • An initial estimated value may be set for the defined estimation voxel. If the initial value of each voxel is zero, it can be considered that the initial value is not set.
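  • For instance, step S130 could define the estimation voxels as a three-dimensional array. A minimal sketch, assuming Python/NumPy; the grid sizes and voxel widths are illustrative, not values fixed by this disclosure.

```python
import numpy as np

NX, NY, NZ = 256, 256, 128   # illustrative grid sizes (X, Y, Z)
DZ2 = 0.1                    # estimation/update voxel width along Z, Δz2 (arbitrary units)
DZ1 = 0.4                    # first calculation voxel width along Z, Δz1 (> Δz2)

# Step S130: estimation voxels, ordered (Z, Y, X). All-zero initial values
# correspond to the "initial value not set" case described above.
estimation_voxels = np.zeros((NZ, NY, NX), dtype=np.float64)
```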
  • In step S140, the number of estimation iterations Ns is set.
  • step S200 various initializations are performed.
  • Step S200 has step S210 and step S220.
  • In step S210, the value of the variable n is set to 1.
  • In step S220, the value of the variable m is set to 1.
  • step S300 estimation processing is performed.
  • the estimation process estimates the three-dimensional optical properties of the object.
  • Step S300 includes steps S310, S320, S400, S330, S340, S350, and S360.
  • an evaluation value is used in the estimation process.
  • the evaluation value is represented by the difference between the wavefront information of the measurement light and the wavefront information obtained by the simulation, or the ratio of the wavefront information of the measurement light and the wavefront information obtained by the simulation.
  • Wavefront information is information including, for example, any of amplitude, phase, light intensity, and complex amplitude.
  • the simulated wavefront information (hereinafter referred to as “estimated wavefront information”) is calculated from the estimated image.
  • the estimated image is an image obtained by light transmitted through the estimated object.
  • the light transmitted through the putative object is the simulated light.
  • Wavefront information of the measurement light (hereinafter referred to as “measurement wavefront information”) is calculated from the measurement image.
  • When the measurement wavefront information is light intensity, the measurement image itself may be used as the measurement wavefront information.
  • a measurement image is an image of an object acquired by an optical device.
  • the estimated image is an image of the estimated object obtained by simulation.
  • FIG. 6 is a diagram showing a measured image and an estimated image.
  • FIG. 6A is a diagram showing how a measurement image is acquired.
  • FIGS. 6(b) and 6(c) are diagrams showing how the estimated image is obtained.
  • an object 20 and a measurement optical system 21 are used to acquire a measurement image.
  • the measurement optical system 21 has a lens 22 .
  • The position Zfo indicates the focal position of the measurement optical system 21, and the position Zs indicates the image-side position of the object 20.
  • An optical image of the object 20 at the position Zfo is formed on the imaging plane IM.
  • The inside of the object 20 at a distance ΔZ from the position Zs coincides with the position Zfo.
  • A CCD 23 is arranged on the imaging plane IM, and an optical image of the object 20 is picked up by the CCD 23.
  • An image of the optical image of the object 20 (hereinafter referred to as "measurement image Imea") can thereby be acquired.
  • Measured wavefront information is calculated from the measured image Imea .
  • a measurement image may be used as measurement wavefront information.
  • The measurement image Imea is an image of light intensity, so the measurement wavefront information calculated from the measurement image Imea is light intensity. When light intensity is used, the measurement image itself can also be used as the wavefront information.
  • The estimated wavefront information is calculated from an optical image of the estimated object 24 (hereinafter referred to as "estimated image Iest").
  • The measurement optical system 21 is illustrated in FIG. 6(c). Since the calculation of the estimated image Iest is performed by simulation, the measurement optical system 21 does not physically exist; the pupil function of the measurement optical system 21 is therefore used in calculating the estimated image Iest.
  • The estimated image Iest is obtained from the image of the estimated object 24 on the imaging plane IM. Since the measurement image Imea is a light-intensity image, the estimated image Iest is preferably also a light-intensity image. Therefore, the light intensity of the estimated object 24 on the imaging plane IM must be calculated.
  • Calculating the light intensity of the estimated object 24 requires calculating the wavefront propagating through the estimated object 24.
  • the direction of the object 20 can be changed without changing the direction of the illumination light.
  • the orientation of the object 20 differs for each wavefront information W(m). Therefore, the simulation also uses the estimated object 24 with a different orientation.
  • The rotation angles of the object 20 are −10°, −5°, 0°, +5°, and +10°. Therefore, the orientation of the estimated object 24 with respect to the Z axis is also −10°, −5°, 0°, +5°, and +10°.
  • FIG. 7 is a diagram showing the orientation and voxels of an estimated object. Ellipses indicate estimated objects. The horizontal direction is the X-axis direction or the Y-axis direction. The vertical direction is the Z-axis direction.
  • FIGS. 7(a) and 7(b) are diagrams showing estimation voxels.
  • FIGS. 7(c), 7(d), and 7(e) are diagrams showing the first calculation voxels.
  • FIG. 7(f) is a diagram showing a second calculation voxel.
  • FIGS. 7(g) and 7(h) are diagrams showing update voxels.
  • In the first calculation voxel, the voxel width in the traveling direction of the illumination light is Δz1.
  • In the estimation voxel and the update voxel, the voxel width in the traveling direction of the illumination light is Δz2. Δz1 is wider than Δz2.
  • step S310 the estimated object is rotated.
  • Step S310 is the third process. In the third process, one rotation angle is read from a plurality of rotation angles. Then, the estimated object is rotated so that the orientation of the estimated object defined in step S130 matches the orientation of the object at the read rotation angle.
  • the estimated object orientation defined in step S130 is the current (estimated) object orientation with respect to the reference direction in the voxel space, and an example is the object orientation at the reference rotation angle.
  • The orientation of the object at the read rotation angle is the orientation of the measurement object specified by the rotation angle information, that is, the orientation of the measurement object with respect to the traveling direction of the illumination light.
  • step S310 can be provided as required.
  • the rotation angle of the object varies depending on the value of variable m.
  • the wavefront propagating through the estimated object is calculated.
  • the orientation of the estimated object is different for each estimated object.
  • the calculation of the wavefront is performed with the direction of the estimated object being the same.
  • the object orientation can be regarded as an estimated object orientation. Therefore, the orientation of the estimated object can be expressed using the rotation angle of the object.
  • In step S200, the value of the variable m was set to 1. Since the rotation angle for m = 1 is read, the read rotation angle is −10°.
  • In step S120, the reference rotation angle was set to 0°.
  • the read rotation angle is different from the reference rotation angle.
  • the orientation of the estimated object is the orientation at the reference rotation angle. Therefore, the angular relationship between the Z axis and the estimated object does not match the angular relationship between the optical axis of the optical path and the object at the read rotation angle. That is, the orientation of the estimated object does not match the orientation of the object with respect to the optical axis at the read rotation angle.
  • FIG. 7(a) shows the state before step S310 is executed.
  • a solid-line ellipse indicates an estimated object at the reference rotation angle.
  • a dashed ellipse indicates an estimated object at the read rotation angle. Since this is before step S310 is executed, the ellipse indicated by the solid line does not overlap the ellipse indicated by the broken line.
  • In step S310, the estimated values of the estimation voxels are rotated so that the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle.
  • the angular relationship between the Z axis and the estimated object matches the angular relationship between the optical axis of the optical path and the object at the read rotation angle. That is, the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle.
  • FIG. 7(b) shows the state after executing step S310.
  • the solid-line ellipse overlaps the dashed-line ellipse.
  • the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle.
  • the wavefront propagating through the estimated object can be calculated with the direction of the estimated object being the same as the measurement condition.
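  • A sketch of the rotation of step S310, assuming the estimated object is a NumPy volume ordered (Z, Y, X) and that scipy.ndimage is available; the interpolation order and boundary mode are illustrative choices of this sketch.

```python
import numpy as np
from scipy.ndimage import rotate

def rotate_estimate(volume: np.ndarray, angle_deg: float) -> np.ndarray:
    """Step S310: rotate the estimated values of the estimation voxels so the
    estimated object's orientation matches the read rotation angle. The
    rotation is in the X-Z plane (axes 0 and 2 for a (Z, Y, X) volume);
    reshape=False keeps the voxel space dimensions unchanged."""
    return rotate(volume, angle_deg, axes=(0, 2), reshape=False,
                  order=1, mode="nearest")
```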
  • step S320 an estimated value of the first calculation voxel is generated.
  • Step S320 is the fourth process.
  • the estimated value of the post-rotation estimation voxel is directly set to the first calculation voxel.
  • an estimated value of the first calculation voxel can be generated. It is also possible to set the estimated value of the post-rotation estimation voxel as the estimation voxel, and use the estimated value of the estimation voxel to generate the estimated value of the first calculation voxel.
  • FIG. 7(c) shows the state after executing step S320. As shown in FIG. 7(c), Δz1 is wider than Δz2.
  • the voxel width differs between the estimation voxel and the first calculation voxel. Therefore, the estimated value in the first calculation voxel is calculated using the estimated value in the estimation voxel.
  • an extrapolation method or an interpolation method can be used.
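  • The resampling from the estimation voxels (width Δz2) onto the first calculation voxels (width Δz1) can be done, for example, by linear interpolation along Z. A minimal sketch assuming Python/SciPy; the text names extrapolation and interpolation but does not prescribe a specific method.

```python
import numpy as np
from scipy.ndimage import zoom

def resample_z(volume: np.ndarray, dz_src: float, dz_dst: float) -> np.ndarray:
    """Resample a (Z, Y, X) volume to a new voxel width along Z by linear
    interpolation. In step S320 (process A), dz_src = Δz2 and dz_dst = Δz1,
    so the number of Z slices decreases; the update-voxel generation of step
    S330 runs the same operation in the opposite direction."""
    return zoom(volume, (dz_src / dz_dst, 1.0, 1.0), order=1)
```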
  • In step S400, the wavefront propagating through the estimated object and the gradient are calculated. If the voxel width is regarded as the wavefront propagation interval, the narrower the voxel width, the greater the number of wavefronts to propagate, and the longer the calculation time for the propagating wavefronts.
  • FIG. 7(d) shows the wavefront propagating in the forward direction. Dotted lines indicate wave fronts.
  • FIG. 7(e) shows the state of the wavefront propagating in the opposite direction.
  • the forward direction is the direction in which illumination light travels.
  • the reverse direction is the direction opposite to the direction in which the illumination light travels.
  • the calculation of the wavefront propagating through the estimated object uses the first calculation voxel data.
  • Δz1 is wider than Δz2. Therefore, the number of propagating wavefronts can be reduced compared with the case of using the estimation voxel data. As a result, the calculation time for the wavefront propagating in the forward direction and the wavefront propagating in the reverse direction can be shortened.
  • the image of the interference fringes can be obtained by changing the direction of the object without changing the direction of the illumination light.
  • The spatial resolution in the traveling direction of the illumination light is not high. In the simulation the traveling direction of the illumination light is the Z-axis direction, so the voxel width in the Z-axis direction can be widened.
  • Step S400 includes steps S410, S420, S430, S440, and S450.
  • step S400 the fifth process is performed.
  • the fifth process has steps S410, S420, and S440.
  • step S410 a forward propagation calculation is executed.
  • A wavefront propagating in the direction in which the illumination light travels is calculated. Since the wavefront is calculated by simulation, the illumination light is pseudo illumination light. As shown in FIG. 7(d), the wavefront propagating in the forward direction can be calculated in this way.
  • step S410 can be regarded as a step of calculating estimated wavefront information.
  • FIG. 8 is a flowchart for calculating estimated wavefront information.
  • Step S410 includes steps S411, S412, S413, S414, and S415.
  • the estimated wavefront information is calculated based on the forward propagation of the wavefront.
  • the wavefront propagates from the estimated object 24 toward the imaging plane IM, as shown in FIGS. 6(b) and 6(c).
  • In step S411, the wavefront incident on the estimated object is calculated.
  • The position Zin is the position of the surface of the estimated object 24 corresponding to the light-source (illumination) side surface of the object 20, that is, the position of the surface on which the pseudo light enters the estimated object 24. Therefore, the wavefront Uin at the position Zin is calculated.
  • The same wavefront as the wavefront of the measurement light with which the object 20 is irradiated can be used as the wavefront Uin.
  • In step S412, the wavefront emitted from the estimated object is calculated.
  • The position Zout is the position of the surface of the estimated object 24 corresponding to the imaging-side (lens-side, CCD-side) surface of the object 20, that is, the position of the surface from which the pseudo light exits the estimated object 24. Therefore, the wavefront Uout at the position Zout is calculated.
  • The wavefront Uout can be calculated from the wavefront Uin using, for example, the beam propagation method, as sketched below.
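  • As an illustration of how the wavefront Uout might be obtained from Uin, the following is a split-step beam-propagation sketch in Python/NumPy. The thin-phase-screen model, the dropped evanescent components, and all parameters are assumptions of this sketch, not a prescribed implementation.

```python
import numpy as np

def free_space_step(u, dz, wavelength, n0, dx):
    """Angular-spectrum propagation of a 2-D complex wavefront u over a
    homogeneous slab of thickness dz in a background of refractive index n0."""
    ny, nx = u.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    k = 2.0 * np.pi * n0 / wavelength
    kz2 = k**2 - KX**2 - KY**2
    kz = np.sqrt(np.maximum(kz2, 0.0))
    h = np.exp(1j * kz * dz) * (kz2 > 0)   # evanescent components are dropped
    return np.fft.ifft2(np.fft.fft2(u) * h)

def bpm_forward(u_in, delta_n, dz, wavelength, n0, dx):
    """Compute Uout from Uin: alternate free-space steps of width dz (the
    first calculation voxel width Δz1) with thin phase screens derived from
    delta_n, a (Z, Y, X) array of the index contrast n - n0."""
    k0 = 2.0 * np.pi / wavelength
    u = u_in
    for dn_slice in delta_n:               # one slice per first calculation voxel
        u = free_space_step(u, dz, wavelength, n0, dx)
        u = u * np.exp(1j * k0 * dn_slice * dz)
    return u
```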
  • In step S413, the wavefront at a predetermined acquisition position is calculated.
  • The predetermined acquisition position is a position on the object side at the time the measurement image was acquired, and is any position between the position Zin and the position Zout. The position Zp is one such acquisition position; it is conjugate with the imaging plane IM.
  • The estimated image Iest is calculated under the same conditions as the measurement image Imea.
  • The measurement image Imea is obtained from an optical image of the inside of the object 20 at a distance ΔZ from the position Zs. Therefore, calculating the estimated image Iest requires the wavefront at a position ΔZ away from the corresponding position. The position Zout corresponds to the position Zs, and the position at a distance ΔZ from the position Zout is the position Zp. It therefore suffices to calculate the wavefront Up at the position Zp.
  • The position Zp is ΔZ away from the position Zout, so the wavefront Uout cannot be used as the wavefront Up. The wavefront Up can be calculated from the wavefront Uout using, for example, the beam propagation method.
  • In step S414, the wavefront on the imaging plane is calculated.
  • The wavefront Up passes through the measurement optical system 21 and reaches the imaging plane IM. The wavefront Uimg on the imaging plane IM can be calculated from the wavefront Up and the pupil function of the measurement optical system 21.
  • In step S415, the estimated wavefront information on the imaging plane is calculated.
  • The wavefront Uimg represents the amplitude of the light, and light intensity is the square of the amplitude. Therefore, the light intensity of the estimated object 24 can be calculated by squaring the wavefront Uimg. As a result, the estimated image Iest can be acquired, and the estimated wavefront information is calculated from the estimated image Iest.
  • Amplitude and phase may be used instead of light intensity. Amplitude and phase are represented using electric fields. Therefore, when amplitude and phase are used, values calculated from the electric field are used for the measured value and the estimated value.
  • The electric field Emes based on the measurement and the electric field Eest based on the estimation are represented by the following equations:
  • Emes = Ames × exp(i × Pmes)
  • Eest = Aest × exp(i × Pest)
  • Here, Pmes is the measured phase, Ames is the measured amplitude, Pest is the estimated phase, and Aest is the estimated amplitude.
  • the measurement light and the reference light are incident on the photodetector in a non-parallel state.
  • the measurement light and the reference light form interference fringes on the imaging surface of the photodetector.
  • the interference fringes are imaged by a photodetector. As a result, an image of interference fringes can be acquired.
  • the interference fringes are obtained with the measurement light and the reference light non-parallel. Therefore, by analyzing the interference fringes, it is possible to obtain the phase based on the measurement and the amplitude based on the measurement. The result is the measured electric field Emes.
  • the estimated electric field Eest can be obtained by simulation.
  • From the electric field, the complex amplitude can be obtained. Therefore, complex amplitude may be used as the wavefront information instead of light intensity.
  • step S420 optimization processing is performed.
  • the wavefront information W(m) constrains the estimated wavefront information.
  • the wavefront information W(m) is obtained from the image of the interference fringes I(m).
  • the interference fringes I(m) are formed by the measuring light. Therefore, the wavefront information W(m) can be regarded as the measured wavefront information.
  • variable m represents the ordinal number for the relative angle.
  • the wavefront information W(m) represents the measured wavefront information when using the m-th relative angle.
  • Measured wavefront information is calculated from the measured image Imea .
  • Estimated wavefront information is calculated from the estimated image I est .
  • An evaluation value can be calculated from the difference between the measured wavefront information and the estimated wavefront information or the ratio between the measured wavefront information and the estimated wavefront information.
  • Constraining the estimated wavefront information by the wavefront information means correcting the estimated wavefront information using the measured wavefront information, or calculating the error between the estimated wavefront information and the measured wavefront information; the latter is almost synonymous with calculating the evaluation value.
  • The difference between the measured image Imea and the estimated image Iest, or the ratio between the measured image Imea and the estimated image Iest, may be used as the evaluation value.
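  • A minimal sketch of the difference-based evaluation value, assuming Python/NumPy; a ratio-based criterion, which the text also allows, would compare the ratio against 1 instead.

```python
import numpy as np

def evaluation_value(measured: np.ndarray, estimated: np.ndarray) -> float:
    """Evaluation value as the mean squared difference between the measured
    wavefront information (or image Imea) and the estimated wavefront
    information (or image Iest)."""
    return float(np.mean((measured - estimated) ** 2))
```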
  • In step S430, the evaluation value is compared with a threshold.
  • If the determination result is NO, step S500 is executed. If the determination result is YES, step S440 is executed.
  • Step S500 is executed.
  • step S500 the three-dimensional optical characteristics of the estimated object are calculated.
  • the obtained three-dimensional optical properties of the estimated object 24 are the same or substantially the same as the three-dimensional optical properties of the object 20 .
  • a reconstructed estimated object can be obtained.
  • the reconstructed estimated object can be output to, for example, a display device.
  • the three-dimensional optical properties obtained in step S500 are the same or substantially the same as the three-dimensional optical properties of the object 20.
  • the reconstructed estimated object can be considered identical or nearly identical to the structure of object 20 .
  • Step S440 is executed.
  • a backpropagation operation is performed.
  • a wavefront propagating in the direction opposite to the direction in which the illumination light travels is calculated.
  • the calculation time in the backpropagation calculation can be shortened.
  • step S440 can be regarded as a step of calculating the slope.
  • Gradient calculation is based on back propagation of the wavefront.
  • In the backpropagation, the wavefront propagates from the position Zout toward the position Zin.
  • The wavefront at the position Zout can be calculated from the wavefront at the position Zp.
  • FIG. 6(c) shows the wavefront U′p, which is the wavefront at the position Zp.
  • An image can be used as wavefront information. Therefore, the measurement image Imea and the estimated image Iest are used in calculating the corrected wavefront U′p.
  • The estimated image Iest is calculated based on the wavefront Uimg, and the wavefront Uimg is calculated based on the wavefront Up.
  • The initial values set in step S130 are used to calculate the wavefront Up. The initial values are values of the three-dimensional optical properties of the estimated object 24.
  • In general, the initial values differ from the values of the three-dimensional optical properties of the object 20 (hereinafter referred to as "object property values").
  • In that case, the difference between the estimated image Iest and the measurement image Imea becomes large. Therefore, the difference between the estimated image Iest and the measurement image Imea can be regarded as reflecting the difference between the initial values and the object property values.
  • Accordingly, the wavefront Up is corrected using the estimated image Iest and the measurement image Imea, and the corrected wavefront, that is, the wavefront U′p, is obtained.
  • The wavefront U′p is represented, for example, by the following equation (1):
  • U′p = Up × √(Imea / Iest)  (1)
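  • In code, equation (1) is a per-pixel amplitude constraint. A sketch assuming Python/NumPy; the square root reflects that the images are light intensities (squared amplitudes), and eps is an illustrative guard against division by zero.

```python
import numpy as np

def constrain_wavefront(u_p: np.ndarray, i_mea: np.ndarray,
                        i_est: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Equation (1): correct the estimated wavefront Up so that its intensity
    matches the measurement image Imea, yielding U'p."""
    return u_p * np.sqrt(i_mea / (i_est + eps))
```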
  • FIG. 6(c) also shows the wavefront U′out, which is likewise a corrected wavefront.
  • The wavefront U′p is the wavefront obtained by correcting the wavefront Up, and is the wavefront at the position Zp.
  • In FIG. 6(c), the wavefront U′p is drawn at a position shifted from the position Zp for ease of viewing, and the wavefront U′out is drawn at a position shifted from the position Zout.
  • The position Zout is separated from the position Zp by ΔZ. Therefore, the wavefront U′p cannot be used as the wavefront U′out; the wavefront U′out can be calculated from the wavefront U′p using, for example, the beam propagation method.
  • The wavefront calculation is performed based on the backpropagation of the wavefront: the wavefront propagating inside the estimated object 24 is calculated.
  • The wavefronts Uout and U′out are used in this wavefront calculation.
  • The wavefront U′p differs from the wavefront Up, so the wavefront U′out also differs from the wavefront Uout.
  • The gradient can be calculated using the wavefront U′out and the wavefront Uout.
  • The gradient is the gradient of the wavefront at an arbitrary position within the object, and contains new information about the values of the three-dimensional optical properties of the estimated object 24.
  • In step S450, the gradient is set in the second calculation voxels.
  • the first calculation voxel is used in the back propagation calculation for calculating the gradient.
  • the gradient is the result obtained from counterpropagating wavefronts. Therefore, the result obtained from the wavefront propagating in the opposite direction is set to each voxel of the second calculation voxels.
  • FIG. 7(f) shows the state after execution of step S450.
  • Depending on the implementation, step S450 may be unnecessary.
  • In FIG. 7, the shape of the estimated object is an ellipse, and the shape of the estimated object after executing step S400 is the same as the shape before executing step S400. In some cases, however, the shape after executing step S400 differs from the shape before executing step S400.
  • step S330 update voxels are generated.
  • Step S330 is the sixth process.
  • the gradients are set directly to the update voxels.
  • update voxel data can be generated.
  • FIG. 7(g) shows the state after execution of step S330.
  • The voxel width used when calculating the gradient with the first calculation voxels is Δz1, while the voxel width of the update voxels is Δz2. That is, the voxel width differs between the first calculation voxel and the update voxel. Therefore, the gradient in the update voxels is calculated using the gradient in the first calculation voxels; for the calculation, for example, an extrapolation method or an interpolation method can be used.
  • When step S450 is executed, the gradient in the update voxels may be calculated using the gradient in the second calculation voxels.
  • step S340 the estimated object is rotated.
  • Step S340 is the seventh process.
  • the estimated object is rotated so that the orientation of the object at the read rotation angle matches the orientation of the object at the reference rotation angle.
  • step S340 may be provided as required.
  • Step S340 is necessary when the estimated values are updated with the object oriented at the reference rotation angle; otherwise it may be omitted. If step S340 is omitted, the reference for the next execution of step S310 is the orientation of the object at the rotation angle used in the immediately preceding execution of step S310. The reference rotation angle need not be fixed during the estimation process and may be changed partway through.
  • One or both of the two estimated objects are rotated so that, in the voxel space, the orientation with respect to the reference direction of the estimation object corresponding to the set of update voxel results matches the orientation of the estimation object corresponding to the set of estimated values of the estimation voxels.
  • S350 updates the estimates of the estimation voxels using the resulting set of update voxels in which the orientation of the estimation object relative to the reference direction in voxel space matches the set of estimates of the estimation voxels.
  • one rotation angle is read from a plurality of rotation angles.
  • The read rotation angle represents the orientation of the estimated object, and processing is executed using the estimated object at the read rotation angle. Therefore, the gradient in the update voxels generated in step S330 is the gradient for the estimated object at the read rotation angle.
  • When updating the estimated values in step S350, it is therefore necessary to match the orientation of the estimated object with the orientation indicated by the reference rotation angle.
  • FIG. 7(h) shows the state after execution of step S340.
  • In step S370, it is determined whether or not the value of the variable m matches the value of the angle change count Nθ. If the determination result is NO, step S371 is executed. If the determination result is YES, step S350 is executed.
  • Step S371 is executed. In step S371, 1 is added to the value of variable m. After step S371 ends, the process returns to step S310.
  • In step S371, the value of the variable m is incremented by one, so the value of m in the wavefront information W(m) changes. Therefore, steps S310 to S340 are executed with the wavefront information of another relative angle, and are repeated until processing has been performed at all relative angles.
  • steps S310 to S340 are executed five times.
  • When wavefront information A and wavefront information B have different relative angles, wavefront information A includes information that wavefront information B does not have, and wavefront information B includes information that wavefront information A does not have. Therefore, the amount of information increases as the number of pieces of wavefront information with different relative angles increases.
  • the wavefront after correction can be calculated more accurately in step S440.
  • the precision of the gradient is also increased.
  • Gradients contain information about the difference between the estimated value and the object property value. By increasing the accuracy of the gradient, it is possible to reduce the difference between the estimated value and the object property value. That is, the estimated value can be brought closer to the object characteristic value.
  • Step S350 is executed.
  • step S350 the estimated values of the estimation voxels are updated.
  • The gradient contains information about the difference between the estimated value and the object property value. Therefore, adding the gradient to the estimated value gives an updated estimated value.
  • the updated estimated value is closer to the object property value than the initial value. Accordingly, the values of the three-dimensional optical properties of the estimated object 24 can be updated using the updated estimated values.
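  • A sketch of the update of process C / step S350, assuming Python/NumPy; the step size is an illustrative parameter, not a value given in this disclosure.

```python
import numpy as np

def update_estimate(estimation_voxels: np.ndarray, update_voxels: np.ndarray,
                    step_size: float = 0.1) -> np.ndarray:
    """Process C / step S350: add the gradient (already resampled onto the
    update voxels and aligned with the reference orientation) to the current
    estimated values of the estimation voxels."""
    return estimation_voxels + step_size * update_voxels
```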
  • step S360 TV regularization is performed.
  • By performing TV (total variation) regularization, it is possible to remove noise and correct blurring. TV regularization may be performed as needed, so step S360 may be omitted.
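  • If step S360 is implemented, any total-variation solver can be used; the text does not prescribe one. For example, a sketch using scikit-image (an assumption of this sketch), with an illustrative weight:

```python
from skimage.restoration import denoise_tv_chambolle

def tv_regularize(estimation_voxels, weight=0.05):
    """Optional step S360: total-variation denoising of the 3-D estimate to
    suppress noise while preserving edges."""
    return denoise_tv_chambolle(estimation_voxels, weight=weight)
```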
  • In step S380, it is determined whether or not the value of the variable n matches the number of estimation iterations Ns. If the determination result is NO, step S381 is executed. If the determination result is YES, the predetermined number of iterations has been reached, so the three-dimensional optical properties of the estimated object are calculated in step S500, and the process ends.
  • In step S381, 1 is added to the value of the variable n. After step S381 ends, the process returns to step S300. Step S300 is repeated until the value of the variable n matches the number of estimation iterations Ns.
  • the number of wavefronts in the forward propagation calculation and the number of wavefronts in the backward propagation calculation can be reduced. Therefore, the three-dimensional optical characteristics of the object can be obtained with a small amount of calculation. As a result, it is possible to shorten the computation time for the forward propagation computation and the backward propagation computation.
  • The third process through the forward propagation calculation of the fifth process can be performed in parallel, in units of the calculation of the forward propagation (for example, wavefront units), by using matrix transformation or the like. The third and fourth processes generate the estimated values at the timing at which they are required by the forward propagation calculation. They may generate only the estimated value of the wavefront to be calculated next in the forward propagation calculation of the fifth process, or they may collectively generate the estimated values of a plurality of wavefronts at the generation timing.
  • The backward propagation calculation of the fifth process through the seventh process can also be performed in parallel. Performing the processing in this way saves memory. The forward propagation step itself is sketched below.
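
The forward propagation calculation over slices of width Δz1 can be sketched with a multi-slice (beam propagation) scheme: diffract the wavefront by Δz1 with an angular-spectrum kernel, then apply the phase delay of each slice. The split-step scheme and all names are assumptions, not the source's fixed algorithm; lam denotes the wavelength in the background medium:

    import numpy as np

    def angular_spectrum_kernel(shape, pitch, lam, dz):
        """Free-space transfer function for one slice of thickness dz."""
        fy = np.fft.fftfreq(shape[0], d=pitch)[:, None]
        fx = np.fft.fftfreq(shape[1], d=pitch)[None, :]
        # Evanescent components (negative radicand) are clamped to zero.
        kz = 2 * np.pi * np.sqrt(np.maximum(1.0 / lam**2 - fx**2 - fy**2, 0.0))
        return np.exp(1j * kz * dz)

    def forward_propagate(wavefront, delta_n, pitch, lam, dz1):
        """Multi-slice forward propagation through the first calculation voxels.

        wavefront : (ny, nx) complex field entering the estimated object
        delta_n   : (nz, ny, nx) refractive-index deviation on the dz1 grid
        """
        H = angular_spectrum_kernel(wavefront.shape, pitch, lam, dz1)
        k0 = 2 * np.pi / lam
        for dn in delta_n:
            wavefront = np.fft.ifft2(np.fft.fft2(wavefront) * H)  # diffract by dz1
            wavefront = wavefront * np.exp(1j * k0 * dn * dz1)    # slice phase delay
        return wavefront
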
  • In the first pattern, the estimated value is updated at the reference rotation angle. In the second pattern, the estimated value is updated at the post-rotation angle. In the third pattern, the orientations of the estimated object and the gradient are adjusted to a rotation angle other than the reference rotation angle and the post-rotation angle, and the estimated value is then updated.
  • The fourth process corresponds to process A, which generates the estimated values of the first calculation voxel. The fifth and sixth processes correspond to process B, which generates the result of the update voxel. The eighth process corresponds to process C, which updates the estimated values of the estimation voxel.
  • The estimation device of this embodiment preferably satisfies the following conditional expression (1):

      Δz2 × 2 ≤ Δz1   (1)
  • By satisfying conditional expression (1), the number of wavefronts in the forward propagation calculation and in the backward propagation calculation can be further reduced. Therefore, the three-dimensional optical characteristics of the object can be acquired with an even smaller amount of calculation, and the computation time for the forward propagation calculation and the backward propagation calculation can be shortened further.
  • As the estimation process is repeated, the three-dimensional optical properties of the estimated object become closer to the three-dimensional optical properties of the object. By then reducing Δz1, the three-dimensional optical characteristics in the Z-axis direction can be estimated with higher accuracy. For example, it is possible to set Δz1 smaller after executing the estimation process a first number of times (e.g., 5 times), and to set Δz1 smaller still after executing the estimation process a second number of times (e.g., 3 times).
  • Δz1 is preferably determined based on the maximum rotation angle difference among the plurality of pieces of rotation angle information, specifically, based on the difference between the maximum rotation angle in the negative direction and the maximum rotation angle in the positive direction with respect to the traveling direction of the illumination light.
  • The estimation system of this embodiment includes the estimation device of this embodiment, a light source that emits illumination light, a photodetector, a stage on which an object is placed, and an angle changing mechanism. The angle changing mechanism is arranged on the optical path leading to the photodetector and changes the placement angle of the object with respect to the optical axis of the optical path, in other words, the orientation of the object with respect to the traveling direction of the illumination light.
  • FIG. 9 is a diagram showing the estimation system of this embodiment. The same numbers are assigned to the same configurations as in FIG. 1, and the description thereof is omitted.
  • The estimation system 30 includes a light source 31, a photodetector 32, a stage 33, and the estimation device 1. The estimation device 1 has a memory 2 and a processor 3.
  • The light source 31 emits illumination light. A beam splitter 34 is arranged in the traveling direction of the illumination light, and the illumination light enters the beam splitter 34. The beam splitter 34 has an optical surface on which an optical film is formed; the optical film splits the incident light into light transmitted in a first direction and light reflected in a second direction.
  • The estimation system 30 thus forms a measurement optical path OPmea in the first direction and a reference optical path OPref in the second direction. The reference optical path OPref may instead be formed in the first direction and the measurement optical path OPmea in the second direction. The illumination light travels through the measurement optical path OPmea and the reference optical path OPref.
  • A mirror 35 is arranged in the measurement optical path OPmea; the measurement optical path OPmea is bent in the second direction by the mirror 35. A mirror 36 is arranged in the reference optical path OPref; the reference optical path OPref is bent in the first direction by the mirror 36, so that it intersects the measurement optical path OPmea. A beam splitter 37 is arranged at the position where the two optical paths intersect.
  • The stage 33 is arranged between the mirror 35 and the beam splitter 37 in the measurement optical path OPmea, and an object S is placed on the stage 33. The object S is irradiated with the illumination light; the measurement light Lmea is the illumination light that has passed through the object S. A reference light Lref travels in the reference optical path OPref; the reference light Lref is illumination light that does not pass through the object S.
  • The measurement light Lmea and the reference light Lref enter the beam splitter 37. The beam splitter 37 has an optical surface on which an optical film is formed; the optical film splits the incident light into light transmitted in the first direction and light reflected in the second direction. The photodetector 32 is arranged in the first direction. When the light source 31 is on, the measurement light Lmea and the reference light Lref are incident on the photodetector 32, and interference fringes are formed by the measurement light Lmea and the reference light Lref.
  • The image of the interference fringes is sent to the estimation device 1. The estimation device 1 acquires wavefront information based on the image of the interference fringes; the wavefront information is stored in the memory 2, and the estimation process is performed using it. One standard way of obtaining a complex wavefront from an off-axis fringe image is sketched below.
  • The estimation system of this embodiment has an angle changing mechanism. The angle changing mechanism changes the relative orientation between the object and the illumination light, so the incident angle of the illumination light on the object can be changed. As a result, a plurality of pieces of wavefront information can be obtained.
  • The angle changing mechanism has a driving device and a rotating member; the rotating member holds the stage, and the rotation axis of the rotating member preferably intersects the object and is perpendicular to the optical axis of the optical path.
  • The optical axis of the optical path is the axis perpendicular to the detection surface of the photodetector and passing through the center of the photodetector. It is the optical axis of the detection optical system if a detection optical system is provided, and the optical axis of the illumination optical system if an illumination optical system is provided.
  • As shown in FIG. 9, the estimation system 30 has an angle changing mechanism 40, arranged on the measurement optical path OPmea side.
  • The angle changing mechanism 40 has a driving device 41 and a rotating member 42. The rotating member 42 holds the stage 33. The axis RX is the rotation axis of the rotating member 42; it intersects the object S and is perpendicular to the optical axis AX. The rotating member 42 is rotated by the driving device 41. Since the rotating member 42 holds the stage 33, the stage 33 rotates, and by rotating the stage 33 the object S can be rotated around the axis RX.
  • The illumination light is reflected by the mirror 35 and enters the object S. Rotating the object S changes its orientation with respect to the illumination light, so the object S is irradiated with illumination light from various directions. The measurement light Lmea emitted from the object S enters the photodetector 32. In this way the orientation of the object S changes without changing the direction of the illumination light, so the incident angle of the illumination light on the object S can be changed.
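
In the reconstruction, the counterpart of this mechanical rotation is rotating the estimated object in the voxel space. A minimal sketch, assuming SciPy is available and a (z, y, x) volume layout; the function name and interpolation settings are illustrative:

    import numpy as np
    from scipy.ndimage import rotate

    def rotate_estimated_object(volume: np.ndarray, angle_deg: float) -> np.ndarray:
        """Rotate the estimated object about the axis RX.

        With the volume indexed (z, y, x) and the z axis along the optical
        axis AX, a rotation about the y axis (axis RX) acts in the (z, x)
        plane, i.e. axes=(0, 2). reshape=False keeps the voxel space
        dimensions fixed.
        """
        return rotate(volume, angle_deg, axes=(0, 2),
                      reshape=False, order=1, mode='nearest')
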
  • FIG. 10 is a diagram showing the estimation system of this embodiment. The same numbers are assigned to the same configurations as in FIG. 9, and the description thereof is omitted.
  • The estimation system 50 has a mirror 51 and a beam splitter 52. The mirror 51 is arranged in the measurement optical path OPmea, and the beam splitter 52 is arranged at the position where the reference optical path OPref and the measurement optical path OPmea intersect.
  • In the estimation system 30, the beam splitter 37 bends the measurement optical path OPmea in the first direction, and the mirror 36 bends the reference optical path OPref in the first direction. In the estimation system 50, by contrast, the mirror 51 bends the measurement optical path OPmea in the direction opposite to the first direction, and the beam splitter 52 bends the reference optical path OPref in the direction opposite to the first direction. Therefore, a difference occurs between the optical path length of the measurement optical path OPmea and that of the reference optical path OPref.
  • If the coherence length of the illumination light is shorter than the difference in optical path length, interference fringes are not formed. In that case, an optical path length adjusting section 53 is arranged between the beam splitter 34 and the beam splitter 52; with such an arrangement, interference fringes can be formed. The optical path length adjusting section 53 has, for example, a piezo stage and four mirrors. Two of the mirrors are placed on the piezo stage, and by moving these two mirrors the optical path length of the reference optical path OPref can be changed.
  • The estimation system of this embodiment preferably satisfies the following conditional expression (1):

      Δz2 × 2 ≤ Δz1   (1)
  • The estimation system of this embodiment preferably has a detection optical system and satisfies the following conditional expression (2):

      Δz1 ≤ (λ/NA²) × 5   (2)

    where λ is the wavelength of the illumination light and NA is the numerical aperture of the detection optical system. A small worked example follows.
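
Taking expression (2) as reconstructed above, the admissible slice width follows directly. The numbers below reuse the simulation wavelength of 1000 nm; the NA value is purely illustrative:

    def max_dz1(wavelength_um: float, na: float) -> float:
        """Upper bound on dz1 from expression (2): dz1 <= (lambda / NA**2) * 5."""
        return (wavelength_um / na**2) * 5.0

    # For lambda = 1.0 um (1000 nm) and an assumed NA of 0.3,
    # dz1 may be at most about 55.6 um.
    print(max_dz1(1.0, 0.3))
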
  • The estimation system of this embodiment preferably reduces Δz1 each time the estimation process is performed. Δz1 is preferably determined based on the maximum rotation angle.
  • FIG. 11 shows images of an object obtained by simulation. NA is the numerical aperture of the detection optical system, and Δz1/Δz2 is the ratio of Δz1 to Δz2. The horizontal direction of each image is the X-axis or Y-axis direction, and the vertical direction is the Z-axis direction.
  • The first image group consists of the images shown in FIGS. 11(a) to 11(d); in this group, the value of Δz1/Δz2 is 10 and the NA value differs from figure to figure. The second image group consists of the images shown in FIGS. 11(e) to 11(h); here the value of Δz1/Δz2 is 20, again with differing NA values. The third image group consists of the images shown in FIGS. 11(i) to 11(l); here the value of Δz1/Δz2 is 43, again with differing NA values.
  • "○" indicates that the image quality is good, "△" that it is slightly poor, and "×" that it is poor.
  • A photonic crystal fiber (hereinafter referred to as "PCF") is used as the object. The PCF has a cylindrical member and through holes; a plurality of through holes are formed inside the cylindrical member. Each through hole is cylindrical and formed along the generatrix of the cylindrical member. The outer diameter of the PCF is 230 μm, and the refractive index of its medium is 1.466. The through holes and the surroundings of the cylindrical member are filled with a liquid having a refractive index of 1.44. Wavefront information obtained with illumination light of wavelength λ = 1000 nm is used to estimate each image.
  • The larger the NA value, the more susceptible the image is to aberrations, and in the Z-axis direction the influence of the medium is large. Therefore, the larger the NA value, the more the image quality is degraded.
  • FIG. 12 is a graph showing the relationship between computation time and voxel width. The computation time on the vertical axis is the time for the forward propagation calculation and the backward propagation calculation. The wavefront propagating through the estimated object is calculated using Δz1; as Δz1 becomes wider, the number of propagating wavefronts decreases, so the time for calculating the propagating wavefronts can be shortened.
  • The value of Δz1/Δz2 in the third image group is greater than that in the first image group. Therefore, the number of images with poor image quality is larger in the third image group than in the first image group.
  • The estimation method of this embodiment is a method for estimating the three-dimensional optical properties of an object. A three-dimensional optical property is a refractive index distribution or an absorptance distribution. The composite information includes wavefront information and rotation angle information.
  • The wavefront information is acquired based on the illumination light that has passed through the object, and the rotation angle information indicates the orientation of the object at the time of acquisition.
  • A computer executes the estimation process: it reads the composite information stored in the memory and performs the estimation. A voxel space consists of a set of voxels. The plurality of voxel spaces includes a voxel space configured with a set of first calculation voxels, a voxel space configured with a set of estimation voxels, and a voxel space configured with a set of update voxels. The reference direction of each voxel space is the traveling direction of the pseudo illumination light; therefore, the reference direction of the first calculation voxel, the estimation voxel, and the update voxel is also the traveling direction of the pseudo illumination light.
  • The reference direction is the direction of the optical axis of the optical path of the measurement device during measurement, in other words, a direction that coincides with the traveling direction of the illumination light. It is the direction of the optical axis of the detection optical system if a detection optical system is provided, and the direction of the optical axis of the illumination optical system if the measurement apparatus is provided with an illumination optical system. It is also the direction perpendicular to the photodetection plane of the photodetector of the measurement device.
  • In the following, a voxel space composed of a set of first calculation voxels is referred to as the first calculation voxel, a voxel space composed of a set of estimation voxels as the estimation voxel, and a voxel space composed of a set of update voxels as the update voxel.
  • In the first calculation voxel, the voxel width in the reference direction is Δz1; in the estimation voxel, the voxel width in the reference direction is Δz2. Δz1 is wider than Δz2. Supplying values on the coarser Δz1 grid from the finer Δz2 grid can be sketched as shown below.
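
A minimal resampling sketch, assuming Δz1 is an integer multiple of Δz2 and that simple block averaging along z is an acceptable stand-in for whatever resampling the source actually uses:

    import numpy as np

    def to_first_calculation_voxels(estimate: np.ndarray, ratio: int) -> np.ndarray:
        """Resample estimation voxels (width dz2) onto the first calculation
        voxels (width dz1), where ratio = dz1 / dz2 is assumed to be an
        integer and the number of z slices is divisible by it.
        """
        nz, ny, nx = estimate.shape
        return estimate.reshape(nz // ratio, ratio, ny, nx).mean(axis=1)
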
  • The predetermined orientation is the orientation of the object, with respect to the traveling direction of the illumination light, specified by the rotation angle information.
  • The processing performed by the processor 3 produces a result, and the result indicates the difference between the object wavefront and the estimated object wavefront.
  • In the estimation process, steps S10, S20, and S30 are executed. Step S10 is process A, which generates the estimated values of the first calculation voxel; step S20 is process B, which generates the result of the update voxel; and step S30 is process C, which updates the estimated values of the estimation voxel. Since process A, process B, and process C have already been described, their descriptions are omitted here; a sketch tying the three together follows.
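
A minimal end-to-end sketch of steps S10 to S30, under many stated assumptions: plane-wave pseudo illumination, a simple difference residual as the measurement constraint, the same angular-spectrum multi-slice model as the earlier sketch, and a first-order gradient. None of these specifics are fixed by the source, and all names are illustrative:

    import numpy as np
    from scipy.ndimage import rotate

    def _kernel(shape, pitch, lam, dz):
        """Angular-spectrum transfer function for one slice of thickness dz."""
        fy = np.fft.fftfreq(shape[0], d=pitch)[:, None]
        fx = np.fft.fftfreq(shape[1], d=pitch)[None, :]
        kz = 2 * np.pi * np.sqrt(np.maximum(1.0 / lam**2 - fx**2 - fy**2, 0.0))
        return np.exp(1j * kz * dz)

    def estimate_3d(wavefronts, angles_deg, estimate, ratio, pitch, lam, dz1,
                    n_iter=5, alpha=0.1):
        """Hedged sketch of steps S10-S30 (processes A, B, and C).

        wavefronts : list of measured complex wavefronts W(m), each (ny, nx)
        angles_deg : rotation angles matching each W(m)
        estimate   : (nz, ny, nx) estimated values on the estimation voxels (dz2)
        ratio      : dz1 / dz2, assumed integer
        """
        k0 = 2 * np.pi / lam
        H = _kernel(wavefronts[0].shape, pitch, lam, dz1)
        for _ in range(n_iter):                                # estimation count Ns
            for w_meas, theta in zip(wavefronts, angles_deg):  # angle count N_theta
                # process A: orient the estimate, resample dz2 -> dz1 along z
                rot = rotate(estimate, theta, axes=(0, 2), reshape=False, order=1)
                coarse = rot.reshape(-1, ratio, *rot.shape[1:]).mean(axis=1)
                # process B (forward): multi-slice propagation of a plane wave
                f = np.ones_like(w_meas)
                fields = []
                for dn in coarse:
                    f = np.fft.ifft2(np.fft.fft2(f) * H) * np.exp(1j * k0 * dn * dz1)
                    fields.append(f)
                # constrain by the measured wavefront: residual at the detector
                b = w_meas - f
                # process B (backward): adjoint propagation, gradient per slice
                grad = np.empty(coarse.shape)
                for i in range(len(fields) - 1, -1, -1):
                    grad[i] = 2 * k0 * dz1 * np.imag(np.conj(fields[i]) * b)
                    b = np.fft.ifft2(
                        np.fft.fft2(b * np.exp(-1j * k0 * coarse[i] * dz1))
                        * np.conj(H))
                # bring the gradient back to the estimation voxels (dz2 grid)
                g = np.repeat(grad, ratio, axis=0)
                g = rotate(g, -theta, axes=(0, 2), reshape=False, order=1)
                # process C: update the estimated values with the gradient
                estimate = estimate + alpha * g
        return estimate
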
  • The estimation method of this embodiment preferably satisfies the following conditional expression (1):

      Δz2 × 2 ≤ Δz1   (1)
  • Δz1 is preferably determined based on the maximum rotation angle.
  • The recording medium of this embodiment is a computer-readable recording medium on which a program is recorded. The program causes the processor of an estimation device having a memory and a processor to execute an estimation process for estimating the three-dimensional optical properties of an object. The memory stores a plurality of pieces of composite information, and the program causes the processor to perform the estimation process using process a, process A, process B, and process C.
  • The composite information includes wavefront information and rotation angle information. The wavefront information is acquired based on the illumination light that has passed through the object, and the rotation angle information indicates the orientation of the object at the time of acquisition.
  • The estimation process estimates the three-dimensional optical properties of the object; the three-dimensional optical properties are refractive index distributions or absorptance distributions.
  • The processing executed by the processor uses a voxel space configured with a set of first calculation voxels, a voxel space configured with a set of estimation voxels, and a voxel space configured with a set of update voxels.
  • As above, a voxel space composed of a set of first calculation voxels is called the first calculation voxel, a voxel space composed of a set of estimation voxels the estimation voxel, and a voxel space composed of a set of update voxels the update voxel.
  • In the first calculation voxel, the voxel width in the reference direction is Δz1; in the estimation voxel, the voxel width in the reference direction is Δz2. Δz1 is wider than Δz2.
  • The predetermined orientation is the orientation of the object, with respect to the traveling direction of the illumination light, specified by the rotation angle information. The result indicates the difference between the object wavefront and the estimated object wavefront.
  • In process a, the estimation object is rotated in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation.
  • In process B, estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels, using the estimated values of the first calculation voxel generated in process A; the estimated wavefront information is constrained based on the wavefront information; a wavefront propagating in the direction opposite to the reference direction in which the pseudo illumination light travels is calculated; and the result of the update voxel is generated.
  • In process C, the estimated values of the estimation voxels are updated based on the result of the update voxel generated in process B.
  • As described above, the present invention is suitable for an estimation device, an estimation system, an estimation method, and a recording medium that can acquire the three-dimensional optical properties of an object with a small amount of computation.

Abstract

The present invention relates to an estimation device for obtaining the 3D optical properties of an object with a small amount of computation. An estimation device (1) comprises a memory (2) and a processor (3). The memory (2) stores a plurality of pieces of composite information. The composite information includes wavefront information and rotation angle information. The processor (3) executes estimation processing for estimating the 3D optical properties of the object. The 3D optical properties are the refractive index distribution and the light absorption distribution. The processing executed by the processor (3) uses a voxel space having a voxel width of Δz1 and a voxel space having a voxel width of Δz2. Δz1 is wider than Δz2. An estimated value is rotated using voxel data that include Δz2. Using the voxel space having the voxel width Δz1, a forward propagation operation and a backward propagation operation are performed.
PCT/JP2021/010679 2021-03-16 2021-03-16 Dispositif d'estimation, système d'estimation, procédé d'estimation et support d'enregistrement WO2022195731A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/010679 WO2022195731A1 (fr) 2021-03-16 2021-03-16 Dispositif d'estimation, système d'estimation, procédé d'estimation et support d'enregistrement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/010679 WO2022195731A1 (fr) 2021-03-16 2021-03-16 Dispositif d'estimation, système d'estimation, procédé d'estimation et support d'enregistrement

Publications (1)

Publication Number Publication Date
WO2022195731A1 true WO2022195731A1 (fr) 2022-09-22

Family

ID=83320178

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010679 WO2022195731A1 (fr) 2021-03-16 2021-03-16 Dispositif d'estimation, système d'estimation, procédé d'estimation et support d'enregistrement

Country Status (1)

Country Link
WO (1) WO2022195731A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006200999A (ja) * 2005-01-19 2006-08-03 Canon Inc 画像処理装置および屈折率分布測定装置
JP2015219502A (ja) * 2014-05-21 2015-12-07 浜松ホトニクス株式会社 光刺激装置及び光刺激方法
US20200182788A1 (en) * 2017-07-06 2020-06-11 Ramot At Tel-Aviv University System and method for three-dimensional label-free optical imaging of a biological cell sample in an environmental chamber
JP2019078635A (ja) * 2017-10-25 2019-05-23 キヤノン株式会社 測定装置、データ処理装置、データ処理方法およびプログラム
WO2020013325A1 (fr) * 2018-07-13 2020-01-16 国立大学法人東京大学 Dispositif de génération d'images et procédé de génération d'images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Habaza, M., Kirschbaum, M., Guernth-Marschner, C., Dardikman, G., Barnea, I., Korenstein, R., Duschl, C., Shaked, N. T.: "Rapid 3D Refractive-Index Imaging of Live Cells in Suspension without Labeling Using Dielectrophoretic Cell Rotation", Advanced Science, vol. 4, no. 2, 1 February 2017, p. 1600205, XP055563982, ISSN 2198-3844, DOI: 10.1002/advs.201600205 *

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21931483; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 21931483; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: JP)