WO2022195731A1 - Estimation device, estimation system, estimation method, and recording medium


Info

Publication number
WO2022195731A1
Authority
WO
WIPO (PCT)
Prior art keywords
voxel
estimation
wavefront
information
estimated
Prior art date
Application number
PCT/JP2021/010679
Other languages
French (fr)
Japanese (ja)
Inventor
渡部智史
Original Assignee
株式会社エビデント
Priority date
Filing date
Publication date
Application filed by 株式会社エビデント
Priority to PCT/JP2021/010679
Publication of WO2022195731A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/41: Refractivity; Phase-affecting properties, e.g. optical path length
    • G01N21/59: Transmissivity
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00: Microscopes
    • G02B21/36: Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements

Definitions

  • the present invention relates to an estimation device, an estimation system, an estimation method, and a recording medium.
  • Non-Patent Document 1 discloses an imaging device capable of acquiring spatial frequency information of a sample.
  • This imaging device has a light source, a microscope objective lens, an imaging lens, and a photodetector.
  • In Non-Patent Document 1, the sample is rotated about two orthogonal rotation axes.
  • The two rotation axes lie in a plane perpendicular to the optical axis of the microscope objective.
  • By rotating the sample, the spatial frequency information of the sample can be obtained isotropically. That is, the acquisition range of the scattering potential can be widened, and the amount of scattering potential data that can be obtained can be increased.
  • the three-dimensional optical properties of the sample can be obtained from the scattering potential.
  • a three-dimensional optical property is, for example, a refractive index distribution or an absorptance distribution.
  • In Non-Patent Document 1, the isotropically obtained spatial frequency information is used as-is to calculate the three-dimensional optical characteristics of the sample. Therefore, the amount of computation required to calculate the three-dimensional optical characteristics is large.
  • An estimation device comprises a memory and a processor, the memory storing a plurality of pieces of composite information;
  • each piece of composite information has wavefront information and rotation angle information, the wavefront information being acquired based on illumination light that has passed through an object,
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • the processor performs an estimation process to estimate the three-dimensional optical properties of the object;
  • the three-dimensional optical property is a refractive index distribution or an absorptance distribution
  • the voxel space is composed of a set of first calculation voxels,
  • the voxel space consists of a set of estimation voxels,
  • the voxel space consists of a set of update voxels,
  • the reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light,
  • in the first calculation voxels, the voxel width in the reference direction is Δz1,
  • in the estimation voxels and the update voxels, the voxel width in the reference direction is Δz2,
  • Δz1 is wider than Δz2,
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information
  • the estimation process includes: a process A of rotating the estimation object in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation, and generating the estimated values of the first calculation voxels; a process B of generating estimated wavefront information by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, constraining the estimated wavefront information by the wavefront information, calculating a wavefront propagating in the direction opposite to the reference direction in which the pseudo illumination light travels, and generating the results of the update voxels; and a process C of updating the estimated values of the estimation voxels based on the results of the update voxels generated in the process B.
  • An estimation system comprises: the estimation device described above; a light source that emits illumination light; a photodetector; a stage on which the object is placed; and an angle changing mechanism. The stage is placed in the optical path from the light source to the photodetector, and the angle changing mechanism changes the arrangement angle of the object with respect to the optical axis of the optical path.
  • An estimation method is a method for estimating a three-dimensional optical property of an object, the three-dimensional optical property being a refractive index distribution or an absorptance distribution,
  • each piece of composite information has wavefront information and rotation angle information, the wavefront information being acquired based on illumination light that has passed through the object,
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • the voxel space is composed of a set of first calculation voxels,
  • the voxel space consists of a set of estimation voxels,
  • the voxel space consists of a set of update voxels,
  • the reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light
  • in the first calculation voxels, the voxel width in the reference direction is Δz1,
  • in the estimation voxels and the update voxels, the voxel width in the reference direction is Δz2,
  • Δz1 is wider than Δz2,
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information
  • the result indicates the difference between the object wavefront and the estimated object wavefront. The estimation process is performed as follows. In the estimation process, a process A is executed in which the estimation object is rotated in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation, and the estimated values of the first calculation voxels are generated;
  • a process B is executed in which estimated wavefront information is generated by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, the estimated wavefront information is constrained by the wavefront information, a wavefront propagating in the direction opposite to the reference direction in which the pseudo illumination light travels is calculated, and the results of the update voxels are generated from the obtained results; and a process C is executed in which the estimated values of the estimation voxels are updated based on the results of the update voxels generated in the process B.
  • A recording medium is a computer-readable recording medium recording a program for causing a processor of a computer having a memory and a processor to execute estimation processing,
  • the composite information has wavefront information and rotation angle information,
  • Wavefront information is wavefront information acquired based on illumination light that has passed through an object,
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • the estimation process estimates the three-dimensional optical properties of the object,
  • the three-dimensional optical property is a refractive index distribution or an absorptance distribution
  • the voxel space is composed of a set of first calculation voxels,
  • the voxel space consists of a set of estimation voxels,
  • the voxel space consists of a set of update voxels,
  • the reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light,
  • in the first calculation voxels, the voxel width in the reference direction is Δz1,
  • in the estimation voxels and the update voxels, the voxel width in the reference direction is Δz2,
  • Δz1 is wider than Δz2,
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information, and the result indicates the difference between the object wavefront and the estimated object wavefront. The program causes the processor to execute: a process a of rotating the estimation object in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation; a process A of generating the estimated values of the first calculation voxels;
  • a process B in which estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, the estimated wavefront information is constrained by the wavefront information, a wavefront propagating in the direction opposite to the reference direction in which the pseudo illumination light travels is calculated, and the results of the update voxels are generated; and
  • a process C in which the estimated values of the estimation voxels are updated based on the results of the update voxels generated in the process B.
  • The recording medium is thus a computer-readable recording medium recording a program for causing the processor to execute the estimation process using the process a and the processes A to C.
  • The present embodiments provide an estimation device, an estimation system, an estimation method, and a recording medium that can acquire the three-dimensional optical properties of an object with a small amount of computation.
  • Among the drawings: a diagram showing an image of an object; a graph showing the relationship between computation time and voxel width.
  • the estimation device of this embodiment includes a memory and a processor, and the memory stores a plurality of composite information.
  • the composite information includes wavefront information and rotation angle information.
  • the wavefront information is wavefront information acquired based on the illumination light that has passed through the object. The rotation angle information indicates the orientation of the object at that time.
  • a processor performs an estimation process to estimate a three-dimensional optical property of the object, where the three-dimensional optical property is a refractive index distribution or an absorptance distribution.
  • FIG. 1 is a diagram showing the estimation device of this embodiment.
  • the estimation device 1 comprises a memory 2 and a processor 3.
  • the memory 2 stores multiple pieces of composite information.
  • Processor 3 performs an estimation process to estimate the three-dimensional optical properties of the object.
  • a three-dimensional optical property is a refractive index distribution or an absorptance distribution. A plurality of pieces of composite information can be used in the estimation process.
  • the processor may be implemented as an ASIC or FPGA, or may be a CPU.
  • the CPU reads a program from memory and executes processing.
  • the composite information has wavefront information and rotation angle information.
  • Wavefront information is wavefront information acquired based on illumination light that has passed through an object.
  • the rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
  • Wavefront information is used in the estimation process. The more wavefront information, the better. With a lot of wavefront information, the three-dimensional optical properties of an object can be estimated with high accuracy.
  • Wavefront information can be obtained, for example, from interference fringes.
  • Interference fringes are formed by the measurement light and the reference light.
  • the measurement light is the illumination light that has passed through the object.
  • Reference light is illumination light that does not pass through the object.
  • Parallel light is used as illumination light.
  • When the wavefront is estimated from light intensity, the image itself can be used as wavefront information. When an image is used as wavefront information, it is not necessary to acquire wavefront information by analyzing the image.
  • Wavefront information can be obtained by analyzing the image of the interference fringes. Therefore, it is necessary to acquire interference fringes. A method of obtaining an image of interference fringes will be described.
  • a plurality of wavefront information and a plurality of rotation angle information are used for the estimation process. Therefore, the memory stores a plurality of wavefront information and a plurality of rotation angle information.
  • the wavefront information is wavefront information acquired based on the illumination light that has passed through the object. With a plurality of pieces of wavefront information, the incident angle of the illumination light to the object differs for each wavefront information.
  • Wavefront information includes any of amplitude, phase, optical intensity, and complex amplitude.
  • the wavefront information is information on the wavefront on the imaging plane.
  • the imaging plane is the plane on which light is detected by the photodetector, and is also called the image pickup surface.
  • FIG. 2 is a diagram showing how illumination light passes through an object.
  • FIGS. 2(a), 2(b), and 2(c) are diagrams showing how the orientation of the object changes.
  • the orientation of the object when the illumination light passes through it is the orientation of the object relative to the traveling direction of the illumination light.
  • the traveling direction of the illumination light is the optical axis direction of the optical path of the measuring device. If the measuring apparatus is equipped with a detection optical system, the traveling direction of the illumination light is the optical axis direction of the detection optical system. If the measurement apparatus is equipped with an illumination optical system, the traveling direction of the illumination light is the optical axis direction of the illumination optical system. The direction of travel of the illumination light is also the direction perpendicular to the photodetection surface of the photodetector of the measuring device.
  • the orientation of the object when the illumination light passes through the object can be replaced with the relative orientation of the illumination light and the object (hereinafter referred to as "relative direction").
  • Instead of changing the orientation of the object relative to a fixed measuring device, the orientation of the measuring device may be changed while the object is fixed.
  • wavefront information is wavefront information acquired based on illumination light that has passed through an object.
  • the amount of wavefront information is affected by relative orientation.
  • Each time the relative direction is changed, wavefront information can be acquired; the amount of wavefront information can therefore be increased. Also, if the relative directions are different, the passage area of the illumination light inside the object is different, so wavefront information in one relative direction contains information that is not present in wavefront information in another relative direction. For this reason as well, the amount of wavefront information can be increased.
  • FIG. 3 is a flowchart of the method for obtaining interference fringes. This acquisition method changes the relative orientation.
  • In step S10, the angle change count Nθ is set.
  • The angle to be changed is the relative direction. For example, when the relative direction is changed five times, 5 is set as the value of the angle change count Nθ.
  • The deviation in the relative direction can be expressed as an angle; this displacement is represented by the relative angle θ(m).
  • When there is no deviation in the relative direction, the deviation is 0°, and the value of the relative angle θ(m) is set to 0°.
  • In step S20, the relative angles θ(m) are set. For example, the value of θ(1) is set to 0°, the value of θ(2) is set to 4°, and the value of θ(3) is set to 7°.
  • In step S30, 1 is set as the value of the variable m.
  • In step S40, positioning is performed based on the relative angle θ(m).
  • The target of the positioning is the illumination light or the object. In the positioning, the object is rotated so that the orientation of the object with respect to the illumination light matches the value of the relative angle θ(m).
  • step S50 an image of interference fringes I(m) is acquired.
  • Interference fringes I(m) are formed by irradiating the object with illumination light.
  • An image of the interference fringes I(m) can be obtained by imaging the interference fringes I(m) with a photodetector.
  • the photodetector includes an image sensor (imaging element) such as a CCD.
  • variable m represents the ordinal number for the relative angle.
  • fringes I(m) represent fringes formed at the m-th relative angle.
  • In step S60, it is determined whether or not the value of the variable m matches the value of the angle change count Nθ. If the determination result is NO, step S70 is executed. If the determination result is YES, the process ends.
  • In step S70, 1 is added to the value of the variable m. After step S70 ends, the process returns to step S40.
  • Since the value of the variable m has been incremented by 1, steps S40 and S50 are executed at another relative angle. Steps S40 and S50 are repeated until positioning has been performed at all of the relative angles.
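  • As a minimal illustration, the acquisition flow of FIG. 3 (steps S10 to S70) can be sketched as the following Python loop. The functions position_at and capture_fringe_image are hypothetical placeholders for the positioning mechanism and the photodetector readout; they are not part of this disclosure.

      def acquire_fringe_images(relative_angles_deg, position_at, capture_fringe_image):
          """Acquire one interference fringe image I(m) per relative angle theta(m).

          relative_angles_deg  -- list of relative angles theta(m) set in step S20
          position_at          -- hypothetical positioning function (step S40)
          capture_fringe_image -- hypothetical photodetector readout (step S50)
          """
          n_theta = len(relative_angles_deg)  # angle change count N_theta (step S10)
          fringe_images = []
          for m in range(n_theta):            # loop over m (steps S30, S60, S70)
              position_at(relative_angles_deg[m])            # positioning (step S40)
              fringe_images.append(capture_fringe_image())   # fringes I(m) (step S50)
          return fringe_images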
  • Wavefront information can be obtained from interference fringes.
  • the wavefront information acquired from the interference fringes I(m) is assumed to be wavefront information W(m).
  • a plurality of images of interference fringes include images of interference fringes at different angles of incidence of the illumination light on the object. Therefore, a plurality of pieces of wavefront information can be obtained from a plurality of images of interference fringes.
  • the wavefront information W(m) is stored in the memory 2. At this time, the value of the angle change count Nθ and the value of the relative angle θ(m) are also stored in the memory 2.
  • the processor 3 performs an estimation process of estimating the three-dimensional optical properties of the object.
  • estimation process estimation of the three-dimensional optical properties of the object is performed using both multiple pieces of wavefront information and multiple pieces of rotation angle information.
  • Wavefront information is used to estimate the three-dimensional optical properties of an object. In order to obtain wavefront information, it is necessary to obtain the wavefront that has passed through the object.
  • the estimation apparatus of this embodiment obtains the wavefront using the beam propagation method.
  • The wavefront may also be obtained by another method, for example, the FDTD (Finite Difference Time Domain) method.
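  • For reference, the free-space step of such a propagation calculation can be sketched with the angular spectrum method, one common way to realize a beam propagation step. This is a generic numpy sketch under simplifying assumptions (scalar field, uniform background medium), not the exact propagator of the embodiment; a negative dz propagates the wavefront in the reverse direction.

      import numpy as np

      def propagate(u, dz, wavelength, dx):
          """Propagate a 2-D complex wavefront u over a distance dz.

          u          -- complex field sampled on an (N, N) grid
          dz         -- propagation distance (negative for backpropagation)
          wavelength -- wavelength in the background medium
          dx         -- lateral sampling pitch of the grid
          """
          n = u.shape[0]
          fx = np.fft.fftfreq(n, d=dx)
          fx, fy = np.meshgrid(fx, fx, indexing="ij")
          k = 2.0 * np.pi / wavelength
          kz_sq = k**2 - (2.0 * np.pi * fx)**2 - (2.0 * np.pi * fy)**2
          kz = np.sqrt(np.maximum(kz_sq, 0.0))   # evanescent components clamped
          return np.fft.ifft2(np.fft.fft2(u) * np.exp(1j * kz * dz))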
  • the processing executed by the processor 3 uses a plurality of voxel spaces.
  • a voxel space consists of a collection of voxels.
  • the plurality of voxel spaces has a voxel space configured with a set of first calculation voxels, a voxel space configured with a set of estimation voxels, and a voxel space configured with a set of update voxels.
  • the reference direction of the voxel space is the traveling direction of the pseudo illumination light. Therefore, the reference direction of the first calculation voxel, the estimation voxel, and the update voxel is also the traveling direction of the pseudo illumination light.
  • the reference direction is the optical axis direction of the optical path of the measurement device during measurement; in other words, it is the direction that coincides with the traveling direction of the illumination light. If the measurement device is provided with a detection optical system, it is the optical axis direction of the detection optical system; if the measurement device is provided with an illumination optical system, it is the optical axis direction of the illumination optical system. It is also the direction perpendicular to the photodetection surface of the photodetector of the measurement device.
  • Hereinafter, the voxel space configured by the set of first calculation voxels is referred to simply as the first calculation voxels; likewise, the voxel space composed of the set of estimation voxels is called the estimation voxels, and the voxel space composed of the set of update voxels is called the update voxels.
  • In the first calculation voxels, the voxel width in the reference direction is Δz1.
  • In the estimation voxels and the update voxels, the voxel width in the reference direction is Δz2.
  • Δz1 is wider than Δz2.
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information.
  • the processing performed by processor 3 produces a result.
  • the result shows the difference between the object wavefront and the estimated object wavefront.
  • FIG. 4 is a flow chart of processing executed by the processor 3 .
  • the process executed by the processor 3 has steps S10, S20, and S30.
  • Step S10 is a process A for generating the estimated value of the first calculation voxel.
  • the estimation object is rotated in the voxel space so that the orientation of the estimation object corresponding to the set of estimated values of the estimation voxels with respect to the reference direction coincides with a predetermined orientation.
  • An estimate of a first computational voxel corresponding to the object is generated.
  • Step S20 is a process B for generating update voxel results.
  • In the process B, estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels, using the estimated values of the first calculation voxels generated in the process A; the estimated wavefront information is constrained by the wavefront information; a wavefront propagating in the direction opposite to the reference direction in which the pseudo illumination light travels is calculated; and the results of the update voxels are generated.
  • Step S30 is processing C for updating the estimated value of the estimation voxel.
  • the estimated values of the estimation voxels are updated based on the result of the update voxels generated in the process B.
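  • Gathering steps S10 to S30 of FIG. 4, one pass of the estimation can be sketched as follows. rotate_volume, resample_z, forward_and_backward, and the step size alpha are hypothetical helpers standing in for the operations described in this section; the sketch only shows how the processes A, B, and C fit together.

      def one_iteration(v_est, composites, dz1, dz2,
                        rotate_volume, resample_z, forward_and_backward, alpha=1.0):
          """v_est: estimation voxels (voxel width dz2 in the reference direction)."""
          for wavefront_info, angle in composites:   # one piece of composite information
              # Process A: rotate the estimated object to the recorded orientation and
              # resample it onto the coarse first calculation voxels (width dz1 > dz2).
              v_calc = resample_z(rotate_volume(v_est, angle), dz2, dz1)

              # Process B: forward propagation gives estimated wavefront information;
              # it is constrained by the measured wavefront information, a wavefront
              # is propagated in the reverse direction, and the result is resampled
              # onto the fine update voxels (width dz2).
              grad_update = resample_z(forward_and_backward(v_calc, wavefront_info),
                                       dz1, dz2)

              # Process C: rotate the result back and update the estimated values.
              v_est = v_est + alpha * rotate_volume(grad_update, -angle)
          return v_est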
  • the processing executed by the processor 3 includes first processing, second processing, and estimation processing.
  • FIG. 5 is a flowchart of processing executed by the processor. This process has steps S100, S200, S300, S400, and S500.
  • step S100 various settings are made.
  • Step S100 includes step S110, step S120, step S130, and step S140.
  • In step S110, the angle change count Nθ is set.
  • The memory 2 stores the value of the angle change count Nθ. Therefore, the value stored in the memory 2 may be set as the value of the angle change count Nθ. For example, if 5 is stored in the memory 2, 5 is set as the value of the angle change count Nθ.
  • Step S120 a reference rotation angle is set.
  • Step S120 is the first process.
  • the memory 2 stores rotation angle information.
  • the rotation angle information includes multiple rotation angles.
  • Step S120 does not have to be executed when the rotation angle information of the composite information is recorded as a deviation from 0 degrees.
  • One of the multiple rotation angles is set as the reference rotation angle. For example, if the five rotation angles are −10°, −5°, 0°, +5°, and +10°, 0° is set as the reference rotation angle.
  • If the rotation angle information is recorded as a rotation angle from 0 degrees, there is no need to set the reference rotation angle.
  • the rotation angle indicated by the read rotation angle information may be processed as it is.
  • a rotation angle not included in the rotation angle information may be set as the reference rotation angle.
  • 3D optical characteristics are estimated by simulation.
  • the simulation uses an estimated object.
  • the estimated object can be represented in voxel space.
  • A voxel space is a collection of voxels. By defining the voxels, an estimated object can be defined.
  • The three-dimensional optical properties of the estimated object can also be represented by the voxel data.
  • A simulation can be performed by defining the voxels, and an initial value can be set for each voxel.
  • the first calculation voxel, estimation voxel, and update voxel have a plurality of voxels. Assuming that the coordinate axes of the orthogonal coordinate system are the X-axis, Y-axis and Z-axis, each voxel can be represented by X-coordinate, Y-coordinate and Z-coordinate.
  • the wavefront propagating through the estimated object is calculated.
  • the fringe image can be obtained by changing the orientation of the object without changing the orientation of the illumination light.
  • In the simulation, the traveling direction of the illumination light is the Z-axis direction; that is, the illumination light travels in the Z-axis direction, and the propagation direction of the wavefront is also the Z-axis direction.
  • the estimation voxel data, the first calculation voxel data, and the update voxel data are used.
  • step S130 an estimation voxel is defined.
  • Step S130 is the second process.
  • estimation voxels are defined.
  • an estimated value can be set for the estimation voxels.
  • An initial estimated value may be set for the defined estimation voxel. If the initial value of each voxel is zero, it can be considered that the initial value is not set.
  • step S140 the estimated number of times Ns is set.
  • step S200 various initializations are performed.
  • Step S200 has step S210 and step S220.
  • step S210 1 is set to the value of variable n.
  • step S220 1 is set to the value of variable m.
  • step S300 estimation processing is performed.
  • the estimation process estimates the three-dimensional optical properties of the object.
  • Step S300 includes steps S310, S320, S400, S330, S340, S350, and S360.
  • an evaluation value is used in the estimation process.
  • the evaluation value is represented by the difference between the wavefront information of the measurement light and the wavefront information obtained by the simulation, or the ratio of the wavefront information of the measurement light and the wavefront information obtained by the simulation.
  • Wavefront information is information including, for example, any of amplitude, phase, light intensity, and complex amplitude.
  • the simulated wavefront information (hereinafter referred to as “estimated wavefront information”) is calculated from the estimated image.
  • the estimated image is an image obtained by light transmitted through the estimated object.
  • the light transmitted through the estimated object is the simulated light.
  • Wavefront information of the measurement light (hereinafter referred to as “measurement wavefront information”) is calculated from the measurement image.
  • In the case of light intensity, the measurement image may be used as the measurement wavefront information.
  • a measurement image is an image of an object acquired by an optical device.
  • the estimated image is an image of the estimated object obtained by simulation.
  • FIG. 6 is a diagram showing a measured image and an estimated image.
  • FIG. 6A is a diagram showing how a measurement image is acquired.
  • 6(b) and 6(c) are diagrams showing how the estimated image is obtained.
  • an object 20 and a measurement optical system 21 are used to acquire a measurement image.
  • the measurement optical system 21 has a lens 22 .
  • the position Zfo indicates the focal position of the measurement optical system 21.
  • The position Zs indicates the position of the image-side surface of the object 20.
  • An optical image of the object 20 at the position Zfo is formed on the imaging plane IM.
  • A position inside the object 20, ΔZ away from the position Zs, coincides with the position Zfo.
  • a CCD 23 is arranged on the imaging plane IM.
  • An optical image of the object 20 is picked up by the CCD 23 .
  • An image of the optical image of the object 20 (hereinafter referred to as the "measurement image Imea") can be obtained.
  • Measured wavefront information is calculated from the measured image Imea .
  • a measurement image may be used as measurement wavefront information.
  • The measurement image Imea is an image of light intensity. Therefore, the measurement wavefront information calculated from the measurement image Imea is light intensity. When light intensity is used, the measurement image itself can also be used as wavefront information.
  • the estimated wavefront information is calculated from an optical image of the estimated object 24 (hereinafter referred to as “estimated image I est ”).
  • the measurement optical system 21 is illustrated in FIG. 6(c). Since the calculation of the estimated image I est is performed by simulation, the measurement optical system 21 does not physically exist. Therefore, the pupil function of the measurement optical system 21 is used in calculating the estimated image I est .
  • the estimated image I est is obtained from the image of the estimated object 24 on the imaging plane IM. Since the measured image I mea is a light intensity image, the estimated image I est is also preferably a light intensity image. Therefore, it is necessary to calculate the light intensity of the estimated object 24 on the imaging plane IM.
  • Calculation of the light intensity of the estimated object 24 requires calculation of the wavefront propagating through the estimated object 24 .
  • the direction of the object 20 can be changed without changing the direction of the illumination light.
  • the orientation of the object 20 differs for each wavefront information W(m). Therefore, the simulation also uses the estimated object 24 with a different orientation.
  • the rotation angles of the object 20 are −10°, −5°, 0°, +5°, and +10°. Therefore, the orientation of the estimated object 24 with respect to the Z-axis is also −10°, −5°, 0°, +5°, and +10°.
  • FIG. 7 is a diagram showing the orientation and voxels of an estimated object. Ellipses indicate estimated objects. The horizontal direction is the X-axis direction or the Y-axis direction. The vertical direction is the Z-axis direction.
  • FIGS. 7(a) and 7(b) are diagrams showing estimation voxels.
  • FIGS. 7(c), 7(d), and 7(e) are diagrams showing the first calculation voxels.
  • FIG. 7(f) is a diagram showing a second calculation voxel.
  • FIGS. 7(g) and 7(h) are diagrams showing update voxels.
  • In the first calculation voxels, the voxel width in the traveling direction of the illumination light is Δz1.
  • The estimation voxels and the update voxels have a voxel width of Δz2 in the traveling direction of the illumination light. Δz1 is wider than Δz2.
  • step S310 the estimated object is rotated.
  • Step S310 is the third process. In the third process, one rotation angle is read from a plurality of rotation angles. Then, the estimated object is rotated so that the orientation of the estimated object defined in step S130 matches the orientation of the object at the read rotation angle.
  • the estimated object orientation defined in step S130 is the current (estimated) object orientation with respect to the reference direction in the voxel space, and an example is the object orientation at the reference rotation angle.
  • the orientation of the object at the read rotation angle is the orientation of the measurement object specified by the rotation angle information, that is, the orientation of the measurement object with respect to the traveling direction of the illumination light.
  • step S310 can be provided as required.
  • the rotation angle of the object varies depending on the value of variable m.
  • In the simulation, the wavefront propagating through the estimated object is calculated.
  • The orientation of the estimated object differs for each piece of wavefront information, so the calculation of the wavefront must be performed with the orientation of the estimated object matching the corresponding measurement condition.
  • The orientation of the object can be regarded as the orientation of the estimated object. Therefore, the orientation of the estimated object can be expressed using the rotation angle of the object.
  • In step S200, the value of the variable m is set to 1. Since the rotation angle corresponding to m = 1 is read, the read rotation angle is −10°.
  • step S120 the reference rotation angle is set to 0°.
  • the read rotation angle is different from the reference rotation angle.
  • the orientation of the estimated object is the orientation at the reference rotation angle. Therefore, the angular relationship between the Z axis and the estimated object does not match the angular relationship between the optical axis of the optical path and the object at the read rotation angle. That is, the orientation of the estimated object does not match the orientation of the object with respect to the optical axis at the read rotation angle.
  • FIG. 7(a) shows the state before step S310 is executed.
  • a solid-line ellipse indicates an estimated object at the reference rotation angle.
  • a dashed ellipse indicates an estimated object at the read rotation angle. Since this is before step S310 is executed, the ellipse indicated by the solid line does not overlap the ellipse indicated by the broken line.
  • In step S310, the estimated values of the estimation voxels are rotated so that the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle.
  • the angular relationship between the Z axis and the estimated object matches the angular relationship between the optical axis of the optical path and the object at the read rotation angle. That is, the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle.
  • FIG. 7(b) shows the state after executing step S310.
  • the solid-line ellipse overlaps the dashed-line ellipse.
  • the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle.
  • the wavefront propagating through the estimated object can be calculated with the direction of the estimated object being the same as the measurement condition.
  • step S320 an estimated value of the first calculation voxel is generated.
  • Step S320 is the fourth process.
  • the estimated value of the post-rotation estimation voxel is directly set to the first calculation voxel.
  • an estimated value of the first calculation voxel can be generated. It is also possible to set the estimated value of the post-rotation estimation voxel as the estimation voxel, and use the estimated value of the estimation voxel to generate the estimated value of the first calculation voxel.
  • FIG. 7(c) shows the state after executing step S320. As shown in FIG. 7(c), Δz1 is wider than Δz2.
  • the voxel width differs between the estimation voxel and the first calculation voxel. Therefore, the estimated value in the first calculation voxel is calculated using the estimated value in the estimation voxel.
  • an extrapolation method or an interpolation method can be used.
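  • A minimal sketch of this resampling along the Z axis using linear interpolation is shown below (numpy only). The (Z, Y, X) layout and the preservation of the overall extent are assumptions made for illustration; the embodiment may equally use an extrapolation method.

      import numpy as np

      def resample_z(volume, dz_src, dz_dst):
          """Resample a (Z, Y, X) volume from slice spacing dz_src to dz_dst
          by linear interpolation along the Z axis."""
          nz = volume.shape[0]
          z_src = np.arange(nz) * dz_src
          z_dst = np.arange(0.0, z_src[-1] + 1e-9, dz_dst)
          out = np.empty((len(z_dst),) + volume.shape[1:], dtype=volume.dtype)
          for j, z in enumerate(z_dst):
              i = min(int(z // dz_src), nz - 2)   # lower bracketing slice
              t = (z - z_src[i]) / dz_src         # interpolation weight in [0, 1]
              out[j] = (1.0 - t) * volume[i] + t * volume[i + 1]
          return out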
  • In step S400, the wavefront propagating through the estimated object and the gradient are calculated. If the voxel width is regarded as the wavefront propagation interval, the narrower the voxel width, the greater the number of wavefronts to propagate. As the number of propagated wavefronts increases, the calculation time for calculating the propagating wavefronts increases.
  • FIG. 7(d) shows the wavefront propagating in the forward direction. Dotted lines indicate wave fronts.
  • FIG. 7(e) shows the state of the wavefront propagating in the opposite direction.
  • the forward direction is the direction in which illumination light travels.
  • the reverse direction is the direction opposite to the direction in which the illumination light travels.
  • the calculation of the wavefront propagating through the estimated object uses the first calculation voxel data.
  • Δz1 is wider than Δz2. Therefore, the number of propagated wavefronts can be reduced compared with the case of using the estimation voxel data. As a result, the calculation time of the calculation of the wavefront propagating in the forward direction and the calculation of the wavefront propagating in the reverse direction can be shortened.
  • the image of the interference fringes can be obtained by changing the direction of the object without changing the direction of the illumination light.
  • The spatial resolution in the traveling direction of the illumination light is not high. In the simulation, the traveling direction of the illumination light is the Z-axis direction. Therefore, the voxel width in the Z-axis direction can be widened.
  • Step S400 includes steps S410, S420, S430, S440, and S450.
  • step S400 the fifth process is performed.
  • the fifth process has steps S410, S420, and S440.
  • step S410 a forward propagation calculation is executed.
  • A wavefront propagating in the direction in which the illumination light travels is calculated. Since the wavefront is calculated by simulation, the illumination light is pseudo illumination light.
  • step S410 can be regarded as a step of calculating estimated wavefront information.
  • FIG. 8 is a flowchart for calculating estimated wavefront information.
  • Step S410 includes steps S411, S412, S413, S414, and S415.
  • the estimated wavefront information is calculated based on the forward propagation of the wavefront.
  • the wavefront propagates from the estimated object 24 toward the imaging plane IM, as shown in FIGS. 6(b) and 6(c).
  • step S411 the wavefront incident on the estimated object is calculated.
  • the position Z in is the position of the surface of the estimated object 24 corresponding to the surface of the object 20 on the light source (illumination) side.
  • the position Z in is the position of the surface on which the simulated light enters the estimated object 24 . Therefore, the wavefront U in at the position Z in is calculated.
  • the same wavefront as the wavefront of the measurement light with which the object 20 is irradiated can be used for the wavefront Uin .
  • step S412 the wavefront emitted from the estimated object is calculated.
  • the position Z out is the position of the surface of the estimated object 24 corresponding to the imaging side (lens side, CCD side) surface of the object 20 .
  • the position Z out is the position of the surface from which the simulated light exits from the estimated object 24 . Therefore, the wavefront U out at the position Z out is calculated.
  • the wavefront Uout can be calculated from the wavefront Uin , for example, using the beam propagation method.
  • step S413 the wavefront at a predetermined acquisition position is calculated.
  • the predetermined acquisition position is the position on the object side when the measurement image was acquired.
  • the predetermined acquisition position is any position between position Zin and position Zout .
  • Position Z p is one of the predetermined acquisition positions.
  • the position Z p is a position conjugate with the imaging plane IM.
  • the estimated image I est is calculated under the same conditions as the measured image I mea .
  • The measurement image Imea is obtained from an optical image of the inside of the object 20, ΔZ away from the position Zs. Therefore, the calculation of the estimated image Iest requires the wavefront at a position ΔZ away from the position Zs.
  • position Z out corresponds to position Z s .
  • The position ΔZ away from the position Zout is the position Zp. Therefore, it suffices if the wavefront Up at the position Zp can be calculated.
  • The position Zp is ΔZ away from the position Zout. Therefore, the wavefront Uout cannot be used as the wavefront Up.
  • the wavefront Up can be calculated from the wavefront Uout using, for example, the beam propagation method.
  • step S414 the wavefront on the imaging plane is calculated.
  • the wavefront Up passes through the measuring optical system 21 and reaches the imaging plane IM.
  • a wavefront U img on the imaging plane IM can be calculated from the wavefront Up and the pupil function of the measurement optical system 21 .
  • step S415 estimated wavefront information on the imaging plane is calculated.
  • a wavefront U img represents the amplitude of the light.
  • Light intensity is expressed as the square of the amplitude. Therefore, the light intensity of the estimated object 24 can be calculated by squaring the wavefront U img . As a result, the estimated image I est can be acquired. Estimated wavefront information is calculated from the estimated image I est .
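  • Steps S413 to S415 can be summarized in a short sketch: the wavefront Up is low-pass filtered by the pupil function of the measurement optical system to give the wavefront U_img, and the estimated image I_est is its squared modulus. The ideal circular pupil and unit magnification below are simplifying assumptions for illustration.

      import numpy as np

      def estimated_image(u_p, wavelength, dx, numerical_aperture):
          """Wavefront U_p at Z_p -> U_img on the imaging plane -> I_est = |U_img|**2."""
          n = u_p.shape[0]
          fx = np.fft.fftfreq(n, d=dx)
          fx, fy = np.meshgrid(fx, fx, indexing="ij")
          pupil = np.hypot(fx, fy) <= numerical_aperture / wavelength  # ideal pupil (S414)
          u_img = np.fft.ifft2(np.fft.fft2(u_p) * pupil)               # wavefront U_img
          return np.abs(u_img) ** 2                                    # estimated image (S415)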
  • Amplitude and phase may be used instead of light intensity. Amplitude and phase are represented using electric fields. Therefore, when amplitude and phase are used, values calculated from the electric field are used for the measured value and the estimated value.
  • The electric field Emes based on the measurement and the electric field Eest based on the estimation are represented by the following equations:
  • Emes = Ames × exp(i × Pmes)
  • Eest = Aest × exp(i × Pest), where Pmes is the measured phase, Ames is the measured amplitude, Pest is the estimated phase, and Aest is the estimated amplitude.
  • the measurement light and the reference light are incident on the photodetector in a non-parallel state.
  • the measurement light and the reference light form interference fringes on the imaging surface of the photodetector.
  • the interference fringes are imaged by a photodetector. As a result, an image of interference fringes can be acquired.
  • the interference fringes are obtained with the measurement light and the reference light non-parallel. Therefore, by analyzing the interference fringes, it is possible to obtain the phase based on the measurement and the amplitude based on the measurement. The result is the measured electric field Emes.
  • the estimated electric field Eest can be obtained by simulation.
  • the complex amplitude can be obtained. Therefore, instead of light intensity, complex amplitude may be used for wavefront information.
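  • One common way to recover such a complex amplitude from an off-axis interference fringe image is Fourier filtering of the fringe spectrum, sketched below. The carrier offset and filter radius are illustrative assumptions, not values prescribed by this disclosure.

      import numpy as np

      def field_from_hologram(fringe_image, carrier_shift, radius):
          """Recover a complex field A*exp(i*P) from an off-axis hologram.

          carrier_shift -- (row, col) offset of the +1 order in the centered spectrum
          radius        -- radius of the sideband filter, in pixels
          """
          spectrum = np.fft.fftshift(np.fft.fft2(fringe_image))
          n_r, n_c = spectrum.shape
          rows = np.arange(n_r)[:, None] - (n_r // 2 + carrier_shift[0])
          cols = np.arange(n_c)[None, :] - (n_c // 2 + carrier_shift[1])
          sideband = np.where(rows**2 + cols**2 <= radius**2, spectrum, 0)  # +1 order
          sideband = np.roll(sideband, (-carrier_shift[0], -carrier_shift[1]), (0, 1))
          return np.fft.ifft2(np.fft.ifftshift(sideband))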
  • step S420 optimization processing is performed.
  • the wavefront information W(m) constrains the estimated wavefront information.
  • the wavefront information W(m) is obtained from the image of the interference fringes I(m).
  • the interference fringes I(m) are formed by the measuring light. Therefore, the wavefront information W(m) can be regarded as the measured wavefront information.
  • variable m represents the ordinal number for the relative angle.
  • the wavefront information W(m) represents the measured wavefront information when using the m-th relative angle.
  • Measured wavefront information is calculated from the measured image Imea .
  • Estimated wavefront information is calculated from the estimated image I est .
  • An evaluation value can be calculated from the difference between the measured wavefront information and the estimated wavefront information or the ratio between the measured wavefront information and the estimated wavefront information.
  • Constraining the estimated wavefront information by the wavefront information means correcting the estimated wavefront information using the measured wavefront information, or calculating the error between the estimated wavefront information and the measured wavefront information; the latter is almost synonymous with calculating the evaluation value.
  • the difference between the measured image I mea and the estimated image I est or the ratio between the measured image I mea and the estimated image I est may be used as the evaluation value.
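  • A minimal sketch of such an evaluation value, using the mean squared difference between the measured image I_mea and the estimated image I_est (the difference form; a ratio-based value could be used instead):

      import numpy as np

      def evaluation_value(i_mea, i_est):
          """Mean squared difference between measured and estimated images."""
          return float(np.mean((i_mea - i_est) ** 2))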
  • In step S430, the evaluation value is compared with a threshold.
  • If the determination result is NO, step S500 is executed. If the determination result is YES, step S440 is executed.
  • In step S500, the three-dimensional optical characteristics of the estimated object are calculated.
  • The obtained three-dimensional optical properties of the estimated object 24 are the same as, or substantially the same as, the three-dimensional optical properties of the object 20.
  • A reconstructed estimated object can thus be obtained, and the reconstructed estimated object can be output to, for example, a display device.
  • The structure of the reconstructed estimated object can be considered identical, or nearly identical, to the structure of the object 20.
  • Step S440 is executed.
  • In step S440, a backpropagation operation is performed: a wavefront propagating in the direction opposite to the direction in which the illumination light travels is calculated.
  • Since the first calculation voxels are used here as well, the calculation time of the backpropagation calculation can be shortened.
  • step S440 can be regarded as a step of calculating the slope.
  • Gradient calculation is based on back propagation of the wavefront.
  • the wavefront propagates from position Z out towards position Z in .
  • the wavefront at position Z out can be calculated from the wavefront at position Z p .
  • FIG. 6(c) shows the wavefront U'p .
  • Wavefront U′ p is the wavefront at position Z p .
  • an image can be used as wavefront information. Therefore, the measured image I mea and the estimated image I est are used in calculating the wavefront U′ p after correction.
  • the estimated image I est is calculated based on the wavefront U img . Also, the wavefront U img is calculated based on the wavefront Up .
  • the initial value set in step S130 is used to calculate the wavefront Up.
  • the initial values are values of the three-dimensional optical properties of the estimated object 24 .
  • the initial values are different from the values of the three-dimensional optical properties of the object 20 (hereinafter referred to as "object property values").
  • When the initial values differ from the object property values, the difference between the estimated image Iest and the measured image Imea increases. Therefore, the difference between the estimated image Iest and the measured image Imea can be regarded as reflecting the difference between the initial values and the object property values.
  • the wavefront Up is corrected using the estimated image I est and the measured image I mea .
  • the wavefront after correction that is, the wavefront U'p is obtained.
  • the wavefront U'p is represented, for example, by the following equation (1).
  • U'p = Up × √(Imea / Iest) (1)
  • The wavefront U'p is the wavefront obtained by correcting the wavefront Up, and is the wavefront at the position Zp.
  • The wavefront U'out is the corrected wavefront at the position Zout.
  • the wavefront U'p is shown at a position shifted from the position Zp for ease of viewing.
  • the wavefront U' out is shown at a position shifted from the position Z out .
  • The position Zout is separated from the position Zp by ΔZ. Therefore, the wavefront U'p cannot be used as the wavefront U'out.
  • the wavefront U'out can be calculated from the wavefront U'p , for example using the beam propagation method.
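  • Combining equation (1) with a propagation step, the correction and the return to the position Zout can be sketched as follows. propagate is the angular spectrum sketch shown earlier, and the small eps guard against division by zero is an added assumption.

      import numpy as np

      def corrected_exit_wavefront(u_p, i_mea, i_est, dz, wavelength, dx, eps=1e-12):
          """Equation (1): U'_p = U_p * sqrt(I_mea / I_est), then propagate by -dz
          (from Z_p back to Z_out) to obtain the wavefront U'_out."""
          u_p_corrected = u_p * np.sqrt(i_mea / (i_est + eps))  # amplitude constraint
          return propagate(u_p_corrected, -dz, wavelength, dx)  # wavefront U'_out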
  • The wavefront calculation is performed based on the backpropagation of the wavefront.
  • In the backpropagation of the wavefront, the wavefront propagating inside the estimated object 24 is calculated.
  • The wavefronts Uout and U'out are used in this wavefront calculation.
  • Wavefront U'p is different from wavefront Up . Therefore, the wavefront U'out is also different from the wavefront Uout .
  • the gradient can be calculated using the wavefront U' out and the wavefront U out .
  • The gradient is the gradient of the wavefront at each position within the object. The gradient contains new information about the values of the three-dimensional optical properties of the estimated object 24.
  • step S450 the gradient is set to the second calculation voxel.
  • the first calculation voxel is used in the back propagation calculation for calculating the gradient.
  • the gradient is the result obtained from counterpropagating wavefronts. Therefore, the result obtained from the wavefront propagating in the opposite direction is set to each voxel of the second calculation voxels.
  • FIG. 7(f) shows the state after execution of step S450.
  • When the gradients are set directly to the update voxels in step S330, step S450 is unnecessary.
  • the shape of the estimated object is an ellipse. Therefore, the shape of the estimated object after executing step S400 is the same as the shape of the estimated object before executing step S400. However, in some cases, the shape of the estimated object after performing step S400 differs from the shape of the estimated object before performing step S400.
  • step S330 update voxels are generated.
  • Step S330 is the sixth process.
  • the gradients are set directly to the update voxels.
  • update voxel data can be generated.
  • FIG. 7(g) shows the state after execution of step S330.
  • The voxel width when the gradient is calculated with the first calculation voxels is Δz1, whereas the voxel width of the update voxels is Δz2. That is, the voxel width differs between the first calculation voxels and the update voxels. Therefore, the gradient in the update voxels is calculated using the gradient in the first calculation voxels. For this calculation, for example, an extrapolation method or an interpolation method can be used.
  • When step S450 is executed, the gradient in the update voxels may be calculated using the gradient in the second calculation voxels.
  • step S340 the estimated object is rotated.
  • Step S340 is the seventh process.
  • the estimated object is rotated so that the orientation of the object at the read rotation angle matches the orientation of the object at the reference rotation angle.
  • step S340 may be provided as required.
  • S340 is necessary when updating the estimated value with the orientation of the object at the reference rotation angle. However, if not, S340 may not be provided. If S340 is not provided, the reference rotation angle of S310 performed next is the orientation of the object at the rotation angle of S310 performed immediately before. Also, the reference rotation angle need not be fixed during the estimation process, and may be changed during the process.
  • In the voxel space, one or both of the two estimated objects are rotated so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of update voxel results matches the orientation of the estimation object corresponding to the set of estimated values of the estimation voxels.
  • S350 updates the estimates of the estimation voxels using the resulting set of update voxels in which the orientation of the estimation object relative to the reference direction in voxel space matches the set of estimates of the estimation voxels.
  • one rotation angle is read from a plurality of rotation angles.
  • The read rotation angle represents the orientation of the estimated object. Processing is executed using the estimated object at the read rotation angle. Therefore, the gradient in the update voxels generated in step S330 is the gradient for the estimated object at the read rotation angle.
  • When updating the estimated values in step S350, it is necessary to match the orientation of the estimated object with the orientation indicated by the reference rotation angle.
  • FIG. 7(h) shows the state after execution of step S340.
  • In step S370, it is determined whether or not the value of the variable m matches the value of the angle change count Nθ. If the determination result is NO, step S371 is executed. If the determination result is YES, step S350 is executed.
  • In step S371, 1 is added to the value of the variable m. After step S371 ends, the process returns to step S310.
  • Since the value of m in the wavefront information W(m) changes, steps S310 to S340 are executed with the wavefront information of another relative angle. Steps S310 to S340 are repeated until the processing has been performed for all of the relative angles.
  • steps S310 to S340 are executed five times.
  • If wavefront information A and wavefront information B have different relative angles, wavefront information A includes information that wavefront information B does not have, and wavefront information B includes information that wavefront information A does not have. Therefore, the amount of information increases as the amount of wavefront information with different relative angles increases.
  • the wavefront after correction can be calculated more accurately in step S440.
  • the precision of the gradient is also increased.
  • Gradients contain information about the difference between the estimated value and the object property value. By increasing the accuracy of the gradient, it is possible to reduce the difference between the estimated value and the object property value. That is, the estimated value can be brought closer to the object characteristic value.
  • In step S350, the estimated values of the estimation voxels are updated.
  • The gradient contains information about the difference between the estimated value and the object property value, so adding the gradient to the estimated value gives an updated estimated value.
  • the updated estimated value is closer to the object property value than the initial value. Accordingly, the values of the three-dimensional optical properties of the estimated object 24 can be updated using the updated estimated values.
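  • The update itself then reduces to adding the gradient held in the update voxels to the current estimated values; alpha is a hypothetical step size controlling how far the estimate moves per update.

      def update_estimate(v_est, grad_update, alpha=1.0):
          """Step S350: add the gradient to the estimated values of the estimation voxels."""
          return v_est + alpha * grad_update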
  • In step S360, TV (total variation) regularization is performed.
  • By performing TV regularization, it is possible to remove noise and to correct blurred images. TV regularization may be performed as needed; therefore, step S360 may be omitted.
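  • Step S360 can be sketched with an off-the-shelf total variation denoiser, for example scikit-image's denoise_tv_chambolle; the weight below is an illustrative assumption.

      from skimage.restoration import denoise_tv_chambolle

      def tv_regularize(v_est, weight=0.05):
          """Step S360: TV regularization of the (real-valued) estimation voxels."""
          return denoise_tv_chambolle(v_est, weight=weight)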
  • step S380 it is determined whether or not the value of the variable n matches the value of the estimated number of times Ns. If the determination result is NO, step S381 is executed. If the determination result is YES, the process ends.
  • In step S381, 1 is added to the value of the variable n. After step S381 ends, the process returns to step S300.
  • Step S300 is repeated until the value of the variable n matches the value of the estimated number of times Ns.
  • When the predetermined number of iterations has been reached, the three-dimensional optical characteristics of the estimated object are calculated in step S500, and the process ends.
  • the number of wavefronts in the forward propagation calculation and the number of wavefronts in the backward propagation calculation can be reduced. Therefore, the three-dimensional optical characteristics of the object can be obtained with a small amount of calculation. As a result, it is possible to shorten the computation time for the forward propagation computation and the backward propagation computation.
  • The third process through the forward propagation calculation of the fifth process can be performed in parallel, in units of the forward propagation calculation (e.g., wavefront units), by using matrix transformation or the like.
  • the third and fourth processes generate the estimated values required at the timing required for the forward propagation calculation.
  • the third process and the fourth process may generate only the estimated value of the wavefront to be calculated next in the forward propagation calculation of the fifth process, or collectively generate the estimated values of a plurality of wavefronts at the timing of generation.
  • the back propagation calculation of the fifth process to the seventh process can also be performed in parallel. Doing things in parallel saves memory.
  • In the first pattern, the estimated value is updated at the reference rotation angle.
  • In the second pattern, the estimated value is updated at the post-rotation angle.
  • In the third pattern, the orientations of the estimated object and the gradient are adjusted to a rotation angle other than the reference rotation angle and the post-rotation angle, and the estimated value is then updated.
  • the fourth process corresponds to process A for generating the estimated value of the first calculation voxel.
  • the fifth and sixth processes correspond to the process B that generates the update voxel results.
  • the eighth process corresponds to the process C of updating the estimated value of the estimation voxel.
  • The estimation device of this embodiment preferably satisfies the following conditional expression (1): Δz2 × 2 ≤ Δz1 … (1)
  • When conditional expression (1) is satisfied, the number of wavefronts in the forward propagation calculation and in the backward propagation calculation can be further reduced. Therefore, the three-dimensional optical characteristics of the object can be acquired with a smaller amount of calculation, further shortening the computation time of the forward and backward propagation computations.
  • As the estimation process is repeated, the three-dimensional optical properties of the estimated object come closer to those of the object.
  • By reducing Δz1 as the estimation progresses, the three-dimensional optical characteristics in the Z-axis direction can be estimated with higher accuracy.
  • For example, Δz1 can be set smaller after executing the estimation process a first number of times (e.g., 5 times), and smaller still after executing it a further second number of times (e.g., 3 times).
  • ⁇ z1 is preferably determined based on the maximum rotation angle difference between multiple pieces of rotation angle information.
  • ⁇ Z1 is the maximum rotation angle difference based on a plurality of pieces of rotation angle information, specifically, the difference between the maximum rotation angle in the negative direction and the maximum rotation angle in the positive direction with respect to the traveling direction of the illumination light. should be determined based on the difference between
  • The estimation system of this embodiment includes the estimation device of this embodiment, a light source that emits illumination light, a photodetector, a stage on which an object is placed, and an angle changing mechanism.
  • The stage is arranged on the optical path from the light source to the photodetector, and the angle changing mechanism changes the placement angle of the object with respect to the optical axis of the optical path.
  • In other words, the angle changing mechanism changes the orientation of the object with respect to the traveling direction of the illumination light.
  • FIG. 9 is a diagram showing the estimation system of this embodiment. The same numbers are assigned to the same configurations as in FIG. 1, and the description thereof is omitted.
  • The estimation system 30 includes a light source 31, a photodetector 32, a stage 33, and the estimation device 1. The estimation device 1 has a memory 2 and a processor 3.
  • The light source 31 emits illumination light. A beam splitter 34 is arranged in the traveling direction of the illumination light, and the illumination light enters the beam splitter 34.
  • The beam splitter 34 has an optical surface on which an optical film is formed. The optical film splits the incident light into light transmitted in a first direction and light reflected in a second direction.
  • In the estimation system 30, a measurement optical path OPmea is formed in the first direction and a reference optical path OPref in the second direction. Alternatively, the reference optical path OPref may be formed in the first direction and the measurement optical path OPmea in the second direction.
  • The illumination light travels through the measurement optical path OPmea and the reference optical path OPref, respectively.
  • A mirror 35 is arranged in the measurement optical path OPmea; the measurement optical path OPmea is bent in the second direction by the mirror 35.
  • A mirror 36 is arranged in the reference optical path OPref; the reference optical path OPref is bent in the first direction by the mirror 36.
  • The reference optical path OPref intersects the measurement optical path OPmea, and a beam splitter 37 is arranged at the position where the two optical paths intersect.
  • The stage 33 is arranged between the mirror 35 and the beam splitter 37 in the measurement optical path OPmea, and an object S is placed on the stage 33.
  • The object S is irradiated with the illumination light. The measurement light Lmea is illumination light that has passed through the object S.
  • The reference light Lref travels in the reference optical path OPref; it is illumination light that does not pass through the object S.
  • The measurement light Lmea and the reference light Lref enter the beam splitter 37. The beam splitter 37 has an optical surface on which an optical film is formed; the optical film splits the incident light into light transmitted in the first direction and light reflected in the second direction.
  • The photodetector 32 is arranged in the first direction. When the light source 31 is on, the measurement light Lmea and the reference light Lref are incident on the photodetector 32, and interference fringes are formed by the measurement light Lmea and the reference light Lref.
  • The image of the interference fringes is sent to the estimation device 1. The estimation device 1 acquires wavefront information based on the image of the interference fringes. The wavefront information is stored in the memory 2, and the estimation process is performed using it.
  • the estimation system of this embodiment has an angle changing mechanism.
  • the angle changing mechanism changes the relative orientation. Therefore, the incident angle of the illumination light to the object can be changed. As a result, a plurality of pieces of wavefront information can be obtained.
  • The angle changing mechanism preferably has a driving device and a rotating member, the rotating member preferably holds the stage, and the rotation axis of the rotating member preferably intersects the object and is perpendicular to the optical axis of the optical path.
  • The optical axis of the optical path is the axis perpendicular to the detection surface of the photodetector and passing through the center of the photodetector; if a detection optical system is provided, it is the optical axis of the detection optical system, and if an illumination optical system is provided, it is the optical axis of the illumination optical system.
  • As shown in FIG. 9, the estimation system 30 has an angle changing mechanism 40. The angle changing mechanism 40 is arranged on the measurement optical path OPmea side.
  • The angle changing mechanism 40 has a driving device 41 and a rotating member 42. The rotating member 42 holds the stage 33.
  • The axis RX is the rotation axis of the rotating member 42. The axis RX intersects the object S and is perpendicular to the optical axis AX.
  • The rotating member 42 is rotated by the driving device 41. Since the rotating member 42 holds the stage 33, the stage 33 rotates with it. By rotating the stage 33, the object S can be rotated around the axis RX.
  • The illumination light is reflected by the mirror 35 and enters the object S.
  • Rotating the object S changes the orientation of the object S with respect to the illumination light. Therefore, the object S is irradiated with illumination light from various directions.
  • The measurement light Lmea emitted from the object S enters the photodetector 32.
  • In this configuration, the orientation of the object S changes while the direction of the illumination light remains fixed. Therefore, the incident angle of the illumination light on the object S can be changed.
  • FIG. 10 is a diagram showing the estimation system of this embodiment. The same numbers are assigned to the same configurations as in FIG. 9, and the description thereof is omitted.
  • The estimation system 50 has a mirror 51 and a beam splitter 52.
  • The mirror 51 is arranged in the measurement optical path OPmea, and the beam splitter 52 is arranged at the position where the reference optical path OPref and the measurement optical path OPmea intersect.
  • The beam splitter 37 bends the measurement optical path OPmea in the first direction, and the mirror 36 bends the reference optical path OPref in the first direction.
  • The mirror 51 bends the measurement optical path OPmea in the direction opposite to the first direction, and the beam splitter 52 bends the reference optical path OPref in the direction opposite to the first direction. Therefore, a difference occurs between the optical path length of the measurement optical path OPmea and that of the reference optical path OPref.
  • If the coherence length of the illumination light is shorter than this optical path length difference, interference fringes are not formed. In that case, an optical path length adjuster 53 is arranged between the beam splitter 34 and the beam splitter 52. With such an arrangement, interference fringes can be formed.
  • The optical path length adjuster 53 has, for example, a piezo stage and four mirrors. Two of the mirrors are placed on the piezo stage. By moving these two mirrors, the optical path length of the reference optical path OPref can be changed.
  • The estimation system of this embodiment preferably satisfies the following conditional expression (1): Δz2 × 2 ≤ Δz1 … (1)
  • The estimation system of this embodiment preferably has a detection optical system and satisfies the following conditional expression (2): Δz1 ≤ (λ/NA²) × 5 … (2), where λ is the wavelength of the illumination light and NA is the numerical aperture of the detection optical system.
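As a quick numeric illustration of conditional expression (2): with the λ = 1000 nm illumination used in the FIG. 11 simulations, the upper bound 5λ/NA² evaluates as follows (the NA values are chosen for illustration):

```python
wavelength_um = 1.0  # lambda = 1000 nm

# Conditional expression (2): dz1 <= (lambda / NA**2) * 5
for na in (0.1, 0.3, 0.5):
    dz1_max = 5.0 * wavelength_um / na**2
    print(f"NA = {na}: dz1 should be at most {dz1_max:.1f} um")
# NA = 0.1 -> 500.0 um, NA = 0.3 -> 55.6 um, NA = 0.5 -> 20.0 um
```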
  • the estimation system of the present embodiment preferably reduces ⁇ z1 each time the estimation process is performed.
  • the estimation system of this embodiment preferably determines ⁇ z1 based on the maximum rotation angle.
  • FIG. 11 is a diagram showing images of an object. The images were obtained by simulation.
  • NA is the numerical aperture of the detection optical system, and Δz1/Δz2 is the ratio of Δz1 to Δz2.
  • The horizontal direction of each image is the X-axis or Y-axis direction; the vertical direction is the Z-axis direction.
  • The first image group consists of the images shown in FIGS. 11(a) to 11(d). In this group, the value of Δz1/Δz2 is 10, and the NA value differs from figure to figure.
  • The second image group consists of the images shown in FIGS. 11(e) to 11(h). In this group, the value of Δz1/Δz2 is 20, and the NA value differs from figure to figure.
  • The third image group consists of the images shown in FIGS. 11(i) to 11(l). In this group, the value of Δz1/Δz2 is 43, and the NA value differs from figure to figure.
  • "○" indicates that the image quality is good, "△" indicates that the image quality is slightly poor, and "×" indicates that the image quality is poor.
  • A photonic crystal fiber (hereinafter referred to as "PCF") is used as the object. The PCF has a cylindrical member and through holes.
  • A plurality of through holes are formed inside the cylindrical member. Each through hole is cylindrical and formed along the generatrix of the cylindrical member.
  • The outer diameter of the PCF is 230 μm, and the refractive index of the fiber medium is 1.466.
  • The through holes and the surroundings of the cylindrical member are filled with a liquid having a refractive index of 1.44.
  • Each image is estimated using wavefront information obtained with illumination light having a wavelength λ of 1000 nm.
  • The larger the NA value, the more susceptible the image is to aberrations, and the Z-axis direction is greatly affected by the medium. Therefore, the larger the NA value, the more the image quality is degraded.
  • FIG. 12 is a graph showing the relationship between computation time and voxel width.
  • the calculation time on the vertical axis is the calculation time for the forward propagation calculation and the backward propagation calculation.
  • the wavefront propagating through the estimated object is calculated using ⁇ z1.
  • ⁇ z1 becomes wider, the number of propagating wavefronts can be reduced. As a result, it is possible to shorten the computation time by calculating the propagating wavefront.
  • ⁇ z1/ ⁇ z2 in the third image group is greater than the value of ⁇ z1/ ⁇ z2 in the first image group. Therefore, the number of images with poor image quality is larger in the third image group than in the first image group.
  • the estimation method of this embodiment is an estimation method for estimating the three-dimensional optical properties of an object.
  • a three-dimensional optical property is a refractive index distribution or an absorptance distribution.
  • the composite information includes wavefront information and rotation angle information.
  • The wavefront information is wavefront information acquired based on the illumination light that has passed through the object. The rotation angle information indicates the orientation of the object when the illumination light passed through it.
  • The estimation process is executed by a computer: the computer reads the composite information stored in the memory and performs the estimation process using it.
  • a voxel space consists of a set of voxels.
  • the plurality of voxel spaces has a voxel space configured with a set of first calculation voxels, a voxel space configured with a set of estimation voxels, and a voxel space configured with a set of update voxels.
  • the reference direction of the voxel space is the traveling direction of the pseudo illumination light. Therefore, the reference direction of the first calculation voxel, the estimation voxel, and the update voxel is also the traveling direction of the pseudo illumination light.
  • The reference direction is the direction of the optical axis of the optical path of the measurement device during measurement, in other words, a direction that coincides with the traveling direction of the illumination light: if the measurement apparatus has a detection optical system, it is the direction of the optical axis of the detection optical system, and if it has an illumination optical system, the direction of the optical axis of the illumination optical system. It is also the direction perpendicular to the photodetection plane of the photodetector of the measurement device.
  • a voxel space composed of a set of first calculation voxels is referred to as a first calculation voxel.
  • a voxel space composed of a set of estimation voxels is called an estimation voxel.
  • a voxel space composed of a set of update voxels is called an update voxel.
  • the voxel width in the reference direction is ⁇ z1
  • the voxel width in the reference direction is ⁇ z2
  • ⁇ z1 is wider than ⁇ z2.
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information.
  • the processing performed by processor 3 produces a result.
  • the result shows the difference between the object wavefront and the estimated object wavefront.
  • In the estimation process, steps S10, S20, and S30 are executed.
  • Step S10 is the process A for generating the estimated values of the first calculation voxels.
  • Step S20 is the process B for generating the update voxel results.
  • Step S30 is the process C for updating the estimated values of the estimation voxels. Since the processes A, B, and C have already been described, their descriptions are omitted here; a simplified sketch of how they fit together follows.
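The following heavily simplified sketch shows how the processes A to C fit together in one pass over the composite information. The rotation uses scipy's ndimage.rotate; the forward and backward propagation calculations are passed in as callables because their internals (a beam propagation method on the coarse grid) are sketched separately later in this document. All names are illustrative assumptions:

```python
from scipy.ndimage import rotate

def block_average_z(v, factor):
    """Resample fine dz2 slices onto the coarse dz1 grid (dz1 = factor * dz2)."""
    nz = v.shape[0] // factor * factor
    return v[:nz].reshape(-1, factor, *v.shape[1:]).mean(axis=1)

def estimation_pass(n_est, composite_info, factor, forward, backward,
                    step_size=0.1):
    """One pass of the processes A-C over all composite information.

    n_est          : fine-grid estimate (estimation voxels, width dz2 in Z)
    composite_info : list of dicts with keys 'angle_deg' and 'wavefront'
    forward        : callable(n_calc) -> estimated wavefront (process B, part 1)
    backward       : callable(n_calc, residual) -> fine-grid gradient
                     (process B, part 2; the update voxel results)
    """
    for info in composite_info:
        angle = info["angle_deg"]
        # Process A: rotate the estimated object to the recorded orientation
        # and generate the coarse first-calculation-voxel estimate.
        rotated = rotate(n_est, angle, axes=(0, 2), reshape=False, order=1)
        n_calc = block_average_z(rotated, factor)

        # Process B: forward propagation, constraint by the measured
        # wavefront, then backward propagation yielding the gradient.
        residual = info["wavefront"] - forward(n_calc)
        gradient = backward(n_calc, residual)

        # Process C: rotate the gradient back and update the fine estimate.
        gradient = rotate(gradient, -angle, axes=(0, 2), reshape=False, order=1)
        n_est = n_est + step_size * gradient
    return n_est
```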
  • The estimation method of this embodiment preferably satisfies the following conditional expression (1): Δz2 × 2 ≤ Δz1 … (1)
  • ⁇ z1 is preferably determined based on the maximum rotation angle.
  • the recording medium of this embodiment is a computer-readable recording medium on which a program is recorded.
  • the recording medium stores a program for causing the processor to execute an estimation process for estimating the three-dimensional optical properties of an object in an estimation device having a memory and a processor.
  • The memory stores a plurality of pieces of composite information, and the program causes the processor to perform the estimation process using the process a and the processes A, B, and C.
  • the composite information includes wavefront information and rotation angle information.
  • The wavefront information is wavefront information acquired based on the illumination light that has passed through the object. The rotation angle information indicates the orientation of the object when the illumination light passed through it.
  • the estimation process estimates the three-dimensional optical properties of the object, where the three-dimensional optical properties are refractive index distributions or absorptance distributions.
  • The processing executed by the processor uses a voxel space configured with a set of first calculation voxels, a voxel space configured with a set of estimation voxels, and a voxel space configured with a set of update voxels.
  • a voxel space composed of a set of first calculation voxels is called a first calculation voxel.
  • a voxel space composed of a set of estimation voxels is called an estimation voxel.
  • a voxel space composed of a set of update voxels is called an update voxel.
  • the voxel width in the reference direction is ⁇ z1
  • the voxel width in the reference direction is ⁇ z2
  • ⁇ z1 is wider than ⁇ z2.
  • the predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information.
  • the result shows the difference between the object wavefront and the estimated object wavefront.
  • In the process A, the estimation object is rotated in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation, and the estimated values of the first calculation voxels are generated.
  • In the process B, estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels, using the estimated values of the first calculation voxels generated in the process A; the estimated wavefront information is constrained with the wavefront information; the wavefront propagating in the direction opposite to the reference direction is calculated; and the update voxel results are generated.
  • In the process C, the estimated values of the estimation voxels are updated based on the update voxel results generated in the process B.
  • The present invention is thus suitable for an estimating device, an estimating system, an estimating method, and a recording medium that can acquire the three-dimensional optical properties of an object with a small amount of computation.

Abstract

Provided is an estimation device with which the 3D optical characteristics of an object can be obtained with a small amount of computation. An estimation device 1 comprises a memory 2 and a processor 3. The memory 2 stores a plurality of pieces of composite information. The composite information includes wavefront information and rotation angle information. The processor 3 executes estimation processing for estimating the 3D optical characteristics of the object. The 3D optical characteristics are a refractive index distribution or an absorption distribution. The processing executed by the processor 3 uses a voxel space having a voxel width of Δz1 and a voxel space having a voxel width of Δz2, where Δz1 is wider than Δz2. The estimated values are rotated using the voxel data with width Δz2, and the forward propagation operation and the backward propagation operation are carried out using the voxel space having the voxel width Δz1.

Description

Estimation device, estimation system, estimation method, and recording medium
The present invention relates to an estimation device, an estimation system, an estimation method, and a recording medium.
Non-Patent Document 1 discloses an imaging device capable of acquiring spatial frequency information of a sample.
This imaging device has a light source, a microscope objective lens, an imaging lens, and a photodetector. In this device, the subject is rotated around two orthogonal axes as rotation axes. The two axes of rotation lie in a plane perpendicular to the optical axis of the microscope objective.
By rotating the sample on two axes, the spatial frequency information of the sample can be obtained isotropically. That is, the acquisition range of the scattering potential can be widened. As a result, the number of scattering potentials that can be obtained can be increased.
The three-dimensional optical properties of the sample can be obtained from the scattering potential. A three-dimensional optical property is, for example, a refractive index distribution or an absorptance distribution. By increasing the number of scattering potentials, a more accurate image of the sample can be generated. The scattering potential can be obtained, for example, by analyzing interference fringes.
In Non-Patent Document 1, the spatial frequency information obtained isotropically is used as it is to calculate the three-dimensional optical characteristics of the sample. Therefore, the amount of calculation in calculating the three-dimensional optical characteristics is large.
SUMMARY OF THE INVENTION: The present invention has been made in view of such problems, and it is an object of the present invention to provide an estimating device, an estimating system, an estimating method, and a recording medium that can acquire the three-dimensional optical characteristics of an object with a small amount of computation.
In order to solve the above-described problems and achieve the object, an estimating device according to at least some embodiments of the present invention includes:
a memory and a processor, wherein
the memory stores a plurality of pieces of composite information;
The composite information has wavefront information and rotation angle information,
Wavefront information is wavefront information acquired based on illumination light that has passed through an object,
The rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
the processor performs an estimation process to estimate the three-dimensional optical properties of the object;
The three-dimensional optical property is a refractive index distribution or an absorptance distribution,
The voxel spaces used include a voxel space composed of a set of first calculation voxels,
a voxel space composed of a set of estimation voxels,
and a voxel space composed of a set of update voxels,
The reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light,
In the first calculation voxel, the voxel width in the reference direction is Δz1,
In the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2,
Δz1 is wider than Δz2,
The predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information,
The result indicates the difference between the object wavefront and the estimated object wavefront,
The estimation process is
a process A of rotating the estimation object in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation, and generating the estimated values of the first calculation voxels corresponding to the rotated estimation object;
a process B of generating estimated wavefront information by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, constraining the estimated wavefront information with the wavefront information, calculating a wavefront propagating in the direction opposite to the reference direction in which the pseudo illumination light travels, and generating the results of the update voxels;
and a process C of updating the estimated values of the estimation voxels based on the result of the update voxels generated in the process B.
An estimation system according to at least some embodiments of the present invention comprises:
the estimating device described above;
a light source that emits illumination light;
a photodetector;
a stage on which an object is placed;
and an angle changing mechanism,
The stage is placed on the optical path from the light source to the photodetector,
The angle changing mechanism is characterized by changing the arrangement angle of the object with respect to the optical axis of the optical path.
An estimation method according to at least some embodiments of the present invention comprises:
An estimation method for estimating a three-dimensional optical property of an object, comprising:
The three-dimensional optical property is a refractive index distribution or an absorptance distribution,
The composite information has wavefront information and rotation angle information,
Wavefront information is wavefront information acquired based on illumination light that has passed through an object,
The rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
The voxel space is composed of a set of first operation voxels,
The voxel space consists of a set of estimation voxels,
The voxel space consists of a set of update voxels,
The reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light,
In the first calculation voxel, the voxel width in the reference direction is Δz1,
In the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2,
Δz1 is wider than Δz2,
The predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information,
The result indicates the difference between the object wavefront and the estimated object wavefront,
perform the estimation process,
In the estimation process,
a process A is executed that rotates the estimation object in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation, and generates the estimated values of the first calculation voxels corresponding to the rotated estimation object;
a process B is executed that generates estimated wavefront information by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, constrains the estimated wavefront information with the wavefront information, calculates a wavefront propagating in the direction opposite to the reference direction, and generates the results of the update voxels; and
a process C is executed that updates the estimated values of the estimation voxels based on the results of the update voxels generated in the process B.
A recording medium according to at least some embodiments of the present invention comprises:
A computer-readable recording medium recording a program for causing a processor of a computer having a memory and a processor to perform estimation processing,
The composite information has wavefront information and rotation angle information,
Wavefront information is wavefront information acquired based on illumination light that has passed through an object,
The rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
The estimation process estimates the three-dimensional optical properties of the object,
The three-dimensional optical property is a refractive index distribution or an absorptance distribution,
The voxel space is composed of a set of first operation voxels,
The voxel space consists of a set of estimation voxels,
The voxel space consists of a set of update voxels,
The reference direction of the voxel space, the first calculation voxel, the estimation voxel, and the update voxel is the traveling direction of the pseudo illumination light,
In the first calculation voxel, the voxel width in the reference direction is Δz1,
In the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2,
Δz1 is wider than Δz2,
The predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information,
The result indicates the difference between the object wavefront and the estimated object wavefront,
In the process a, the plurality of pieces of composite information are read out from the memory;
in the process A, the estimation object is rotated in the voxel space so that the orientation, with respect to the reference direction, of the estimation object corresponding to the set of estimated values of the estimation voxels matches the predetermined orientation, and the estimated values of the first calculation voxels corresponding to the rotated estimation object are generated;
in the process B, estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, the estimated wavefront information is constrained with the wavefront information, a wavefront propagating in the direction opposite to the reference direction is calculated, and the results of the update voxels are generated;
in the process C, the estimated values of the estimation voxels are updated based on the results of the update voxels generated in the process B,
A computer-readable recording medium recording a program for causing a processor to execute an estimation process using a process a and processes A to C.
According to the present invention, it is possible to provide an estimating device, an estimating system, an estimating method, and a recording medium that can acquire the three-dimensional optical properties of an object with a small amount of computation.
FIG. 1 is a diagram showing the estimation device of this embodiment.
FIG. 2 is a diagram showing how illumination light passes through an object.
FIG. 3 is a flowchart of a method for acquiring interference fringes.
FIG. 4 is a flowchart of processing executed by the processor.
FIG. 5 is a flowchart of processing executed by the processor.
FIG. 6 is a diagram showing a measured image and an estimated image.
FIG. 7 is a diagram showing the orientation of the estimated object and voxel data.
FIG. 8 is a flowchart for calculating estimated wavefront information.
FIG. 9 is a diagram showing the estimation system of this embodiment.
FIG. 10 is a diagram showing the estimation system of this embodiment.
FIG. 11 is a diagram showing images of an object.
FIG. 12 is a graph showing the relationship between computation time and voxel width.
Before describing the examples, the effects of embodiments according to certain aspects of the present invention will be described. When specifically explaining the effects of the present embodiment, specific examples will be shown. However, as with the examples described later, these illustrated aspects are only a part of the aspects included in the present invention, and there are many variations. Accordingly, the invention is not limited to the illustrated aspects.
The estimation device of this embodiment includes a memory and a processor, and the memory stores a plurality of pieces of composite information. The composite information includes wavefront information and rotation angle information. The wavefront information is acquired based on the illumination light that has passed through the object, and the rotation angle information indicates the orientation of the object when the illumination light passed through it. The processor performs an estimation process to estimate the three-dimensional optical properties of the object; the three-dimensional optical properties are a refractive index distribution or an absorptance distribution.
In the process executed by the processor, an estimation process for estimating the three-dimensional optical properties of the object is executed. A three-dimensional optical property is a refractive index distribution or an absorptance distribution.
FIG. 1 is a diagram showing the estimation device of this embodiment. The estimation device 1 comprises a memory 2 and a processor 3. The memory 2 stores multiple pieces of composite information. The processor 3 performs the estimation process to estimate the three-dimensional optical properties of the object. A three-dimensional optical property is a refractive index distribution or an absorptance distribution. Multiple pieces of composite information can be used in the estimation process.
The processor may be implemented as an ASIC or an FPGA, or may be a CPU. When the processor is implemented by a CPU, the CPU reads a program from the memory and executes the processing.
The composite information has wavefront information and rotation angle information. The wavefront information is wavefront information acquired based on illumination light that has passed through an object. The rotation angle information is information indicating the orientation of the object when the illumination light passes through the object.
Wavefront information is used in the estimation process. The more wavefront information there is, the better: with a large amount of wavefront information, the three-dimensional optical properties of an object can be estimated with high accuracy.
Wavefront information can be obtained, for example, from interference fringes. Interference fringes are formed by the measurement light and the reference light. The measurement light is the illumination light that has passed through the object. The reference light is illumination light that does not pass through the object. Parallel light is used as the illumination light.
When estimating the wavefront from light intensity, the image itself can be used as wavefront information. In that case, it is not necessary to analyze the image to obtain the wavefront information.
Wavefront information can also be obtained by analyzing an image of the interference fringes. In that case, it is necessary to acquire the interference fringes. A method of obtaining an image of the interference fringes is described below.
A plurality of pieces of wavefront information and a plurality of pieces of rotation angle information are used for the estimation process. Therefore, the memory stores a plurality of pieces of wavefront information and a plurality of pieces of rotation angle information.
The wavefront information is wavefront information acquired based on the illumination light that has passed through the object. Among the plurality of pieces of wavefront information, the incident angle of the illumination light on the object differs for each piece.
Wavefront information includes any of amplitude, phase, light intensity, and complex amplitude. The wavefront information is information on the wavefront at the imaging plane. The imaging plane is the plane on which the photodetector detects light, and is also called the image pickup plane.
FIG. 2 is a diagram showing how illumination light passes through an object. FIGS. 2(a), 2(b), and 2(c) are diagrams showing how the orientation of the object changes.
In FIGS. 2(a), 2(b), and 2(c), the direction of the object 10 is changed without changing the direction of the illumination light Lλ. Therefore, the orientation of the object 10 with respect to the illumination light Lλ changes. θS is the tilt angle of the object: θS = −20° in FIG. 2(a), θS = 0° in FIG. 2(b), and θS = +20° in FIG. 2(c).
In FIGS. 2(a), 2(b), and 2(c), the direction of the illumination light Lλ does not change. Apparently, therefore, the incident angle of the illumination light Lλ on the object 10 does not change. However, since the orientation of the object 10 changes, the incident angle of the illumination light Lλ on the object 10 substantially changes.
The orientation of the object when the illumination light passes through it is the orientation of the object relative to the traveling direction of the illumination light. The traveling direction of the illumination light is the optical axis direction of the optical path of the measuring device. If the measuring apparatus is equipped with a detection optical system, the traveling direction of the illumination light is the optical axis direction of the detection optical system; if it is equipped with an illumination optical system, it is the optical axis direction of the illumination optical system. The traveling direction of the illumination light is also the direction perpendicular to the photodetection surface of the photodetector of the measuring device.
The orientation of the object when the illumination light passes through it can be replaced with the relative orientation of the illumination light and the object (hereinafter, the "relative direction").
The relative direction may also be changed by keeping the object fixed and changing the orientation of the measuring device, instead of changing the orientation of the object with respect to a fixed measuring device.
As described above, wavefront information is acquired based on illumination light that has passed through the object. Hence, the amount of wavefront information is affected by the relative direction.
Each time the relative direction is changed, new wavefront information can be acquired, so the amount of wavefront information can be increased. Moreover, when the relative directions differ, the illumination light passes through different regions inside the object. Wavefront information in one relative direction therefore contains information that is not present in wavefront information in another relative direction. For this reason as well, the amount of wavefront information can be increased.
FIG. 3 is a flowchart of the method for obtaining interference fringes. In this acquisition method, the relative direction is changed.
In step S10, the angle change count Nθ is set. The angle to be changed is the relative direction. For example, when the relative direction is to be changed five times, the value of the angle change count Nθ is set to 5.
The shift in the relative direction can be expressed as an angle. In the following, the shift in the relative direction is represented by the relative angle θ(m). When the relative orientations of the illumination light and the object coincide, the shift in the relative direction is 0°. In this case, the value of the relative angle θ(m) is set to 0°.
In step S20, the relative angles θ(m) are set. For example, θ(1) is set to 0°, θ(2) to 4°, θ(3) to 7°, θ(4) to 10°, and θ(5) to 15°.
Alternatively, an initial value and an increment may be set. In this case, θ(1) is set to the initial value, and θ(2), θ(3), θ(4), and θ(5) are set to the initial value plus the accumulated increments.
Here, the angles are described with the degree symbol attached to the numerical values; in practice, only the numerical values are set.
In step S30, the value of the variable m is set to 1.
In step S40, positioning is performed based on the relative angle θ(m). The target of the positioning is the illumination light or the object. In the positioning, the object is rotated so that the orientation of the object with respect to the illumination light matches the value of the relative angle θ(m).
In step S50, an image of the interference fringes I(m) is acquired. The interference fringes I(m) are formed by irradiating the object with the illumination light. An image of the interference fringes I(m) can be acquired by imaging them with a photodetector. The photodetector includes an image sensor (imaging element) such as a CCD.
The value of the variable m represents the ordinal number of the relative angle. Thus, the interference fringes I(m) are the fringes formed at the m-th relative angle.
In step S60, it is determined whether or not the value of the variable m matches the value of the angle change count Nθ. If the determination result is NO, step S70 is executed. If the determination result is YES, the process ends.
(When the determination result is NO: m ≠ Nθ)
Step S70 is executed. In step S70, 1 is added to the value of the variable m, and the process returns to step S40. Since the value of m has increased by one, steps S40 and S50 are executed at another relative angle. Steps S40 and S50 are repeated until positioning has been performed at all the relative angles.
(When the determination result is YES: m = Nθ)
The acquisition of the interference fringe images ends. The acquisition is performed Nθ times, so Nθ interference fringe images are obtained. In the example above, the angle change count Nθ is set to 5, so five interference fringe images are obtained.
Wavefront information can be obtained from the interference fringes. The wavefront information acquired from the interference fringes I(m) is denoted wavefront information W(m).
The plurality of interference fringe images include images in which the incident angle of the illumination light on the object differs. Therefore, a plurality of pieces of wavefront information can be obtained from the plurality of interference fringe images.
The wavefront information W(m) is stored in the memory 2. At this time, the value of the angle change count Nθ and the values of the relative angles θ(m) are also stored in the memory 2. A sketch of this acquisition-and-storage flow is given below.
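A minimal sketch of this acquisition flow, with placeholder functions standing in for the stage control, the camera capture, and the fringe analysis (all names are illustrative assumptions):

```python
import numpy as np

def position_object(theta_deg):
    """Placeholder for step S40: set the relative angle theta(m)."""

def capture_interference_fringe():
    """Placeholder for step S50: capture the fringe image I(m).
    Returns a dummy image here."""
    return np.zeros((512, 512))

def fringe_to_wavefront(fringe):
    """Placeholder for the fringe analysis that yields W(m)."""
    return fringe

def acquire_composite_info(angles_deg):
    """Run the FIG. 3 loop and store each (angle, wavefront) pair as one
    piece of composite information, playing the role of memory 2."""
    composite_info = []
    for theta in angles_deg:                        # N_theta iterations
        position_object(theta)                      # step S40
        fringe = capture_interference_fringe()      # step S50
        composite_info.append({"angle_deg": theta,
                               "wavefront": fringe_to_wavefront(fringe)})
    return composite_info

# The example schedule from the text: theta(1) = 0 deg ... theta(5) = 15 deg
composite_info = acquire_composite_info([0.0, 4.0, 7.0, 10.0, 15.0])
```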
As described above, the processor 3 executes the estimation process for estimating the three-dimensional optical properties of the object. In the estimation process, the three-dimensional optical properties of the object are estimated using both the plurality of pieces of wavefront information and the plurality of pieces of rotation angle information.
Wavefront information is used to estimate the three-dimensional optical properties of the object. To obtain the wavefront information, the wavefront that has passed through the object must be computed. The estimation device of this embodiment obtains the wavefront using the beam propagation method. FDTD (Finite Difference Time Domain) may be used instead of the beam propagation method.
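For illustration, the free-space half of one beam-propagation-method step can be written with the standard angular spectrum method; this is a textbook formulation, not the embodiment's specific implementation, and the parameters are assumptions:

```python
import numpy as np

def angular_spectrum_step(field, dz, wavelength, pitch, n0=1.0):
    """Propagate a 2-D complex field by a distance dz in a homogeneous
    background of refractive index n0, using the angular spectrum method."""
    ny, nx = field.shape
    k = 2.0 * np.pi * n0 / wavelength
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=pitch)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=pitch)
    kxx, kyy = np.meshgrid(kx, ky)
    kz_sq = k**2 - kxx**2 - kyy**2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    transfer = np.exp(1j * kz * dz)
    transfer[kz_sq < 0.0] = 0.0  # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * transfer)
```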
The processing executed by the processor 3 uses a plurality of voxel spaces. A voxel space is composed of a set of voxels. The plurality of voxel spaces includes a voxel space composed of a set of first calculation voxels, a voxel space composed of a set of estimation voxels, and a voxel space composed of a set of update voxels.
The reference direction of the voxel spaces is the traveling direction of the pseudo illumination light. Accordingly, the reference direction of the first calculation voxels, the estimation voxels, and the update voxels is also the traveling direction of the pseudo illumination light.
The reference direction is the optical axis direction of the optical path of the measurement device during measurement, in other words, a direction that coincides with the traveling direction of the illumination light: if the measurement device has a detection optical system, it is the optical axis direction of the detection optical system, and if it has an illumination optical system, the optical axis direction of the illumination optical system. It is also the direction perpendicular to the photodetection surface of the photodetector of the measurement device.
Hereinafter, for ease of explanation, the voxel space composed of the set of first calculation voxels is referred to as the first calculation voxel; the voxel space composed of the set of estimation voxels, as the estimation voxel; and the voxel space composed of the set of update voxels, as the update voxel.
In the first calculation voxel, the voxel width in the reference direction is Δz1; in the estimation voxel and the update voxel, the voxel width in the reference direction is Δz2. Δz1 is wider than Δz2.
The predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information. The processing executed by the processor 3 produces a result, which indicates the difference between the wavefront of the object and the wavefront of the estimated object.
The processing executed by the processor 3 will now be described. FIG. 4 is a flowchart of the processing executed by the processor 3. The processing has steps S10, S20, and S30.
Step S10 is the process A for generating the estimated values of the first calculation voxels. In step S10, the estimation object is rotated in the voxel space so that its orientation with respect to the reference direction, corresponding to the set of estimated values of the estimation voxels, matches the predetermined orientation, and the estimated values of the first calculation voxels corresponding to the rotated estimation object are generated.
Step S20 is the process B for generating the update voxel results. In step S20, estimated wavefront information is generated by calculating the wavefront propagating in the reference direction in which the pseudo illumination light travels, using the estimated values of the first calculation voxels generated in the process A; the estimated wavefront information is constrained with the wavefront information; the wavefront propagating in the direction opposite to the reference direction is calculated; and the update voxel results are generated.
Step S30 is the process C for updating the estimated values of the estimation voxels. In step S30, the estimated values of the estimation voxels are updated based on the update voxel results generated in the process B.
The processing executed by the processor 3 includes a first process, a second process, and the estimation process.
FIG. 5 is a flowchart of the processing executed by the processor. This processing has steps S100, S200, S300, S400, and S500.
 ステップS100では、各種の設定を行う。 In step S100, various settings are made.
 ステップS100は、ステップS110と、ステップS120と、ステップS130と、ステップS140と、を有する。 Step S100 includes step S110, step S120, step S130, and step S140.
 ステップS110では、角度変更回数Nθを設定する。メモリ2には、角度変更回数Nθの値が記憶されている。よって、角度変更回数Nθの値に、メモリ2に記憶された値を設定すれば良い。例えば、メモリ2に5が記憶されている場合、角度変更回数Nθの値に5を設定する。 In step S110, the angle change count Nθ is set. The memory 2 stores the value of the angle change count Nθ. Therefore, the value stored in the memory 2 may be set as the value of the angle change count Nθ. For example, if 5 is stored in the memory 2, 5 is set as the value of the angle change count Nθ.
 ステップS120では、基準回転角度を設定する。ステップS120は、第1処理である。メモリ2には、回転角度情報が記憶されている。回転角度情報は、複数の回転角度を含む。 In step S120, a reference rotation angle is set. Step S120 is the first process. The memory 2 stores rotation angle information. The rotation angle information includes multiple rotation angles.
 ステップS120は、複合情報の回転角度情報が0度からのズレを記録している場合、ステップS120を実行しなくてもよい。 Step S120 does not have to be executed when the rotation angle information of the composite information records a deviation from 0 degrees.
 第1処理では、複数の回転角度のうちの1つを、基準回転角度に設定する。例えば、5つの回転角度が、-10°、-5°、0°、+5°、+10°の場合、0°を基準回転角度に設定する。 In the first process, one of the multiple rotation angles is set as the reference rotation angle. For example, if the five rotation angles are −10°, −5°, 0°, +5°, and +10°, 0° is set as the reference rotation angle.
 第1処理では、回転角度情報を0度からの回転角度で記録している場合、基準回転角度を設定する必要はない。この場合、読み取った回転角度情報が示す回転角度をそのまま処理すれば良い。また、回転角度情報に含まれない回転角度を基準回転角度に設定しても良い。 In the first process, if the rotation angle information is recorded as a rotation angle from 0 degrees, there is no need to set the reference rotation angle. In this case, the rotation angle indicated by the read rotation angle information may be processed as it is. Also, a rotation angle not included in the rotation angle information may be set as the reference rotation angle.
 3次元光学特性の推定は、シミュレーションで行う。シミュレーションでは、推定物体を用いる。推定物体は、ボクセル空間で表すことができる。上述のように、ボクセル空間は、ボクセルの集合体である。各ボクセルにデータを付与することで、推定物体を規定することができる。推定物体の3次元光学特性も、ボクセルデータで表わすことができる。  3D optical characteristics are estimated by simulation. The simulation uses an estimated object. The estimated object can be represented in voxel space. As mentioned above, voxel space is a collection of voxels. By assigning data to each voxel, an estimated object can be defined. Three-dimensional optical properties of the putative object can also be represented by voxel data.
 ボクセルを定義することでシミュレーションを行うことができる。各ボクセルに初期値を設定することができる。 A simulation can be performed by defining voxels. An initial value can be set for each voxel.
 第1演算用ボクセル、推定用ボクセル、及び更新用ボクセルは、複数のボクセルを有する。直交座標系の座標軸を、X軸、Y軸、及びZ軸とすると、各ボクセルは、X座標、Y座標、及びZ座標で表わすことができる。 The first calculation voxel, estimation voxel, and update voxel have a plurality of voxels. Assuming that the coordinate axes of the orthogonal coordinate system are the X-axis, Y-axis and Z-axis, each voxel can be represented by X-coordinate, Y-coordinate and Z-coordinate.
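As a minimal sketch of how the three voxel spaces might be laid out in code (the names, grid sizes, and widths below are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

# Illustrative grid parameters (assumptions, not from the disclosure).
NX, NY = 128, 128          # lateral voxel counts (X, Y)
DZ2 = 0.1                  # voxel width along Z for estimation/update voxels (um)
DZ1 = 0.4                  # wider voxel width along Z for calculation voxels (um)
Z_EXTENT = 12.8            # physical extent of the volume along Z (um)

NZ_FINE = int(round(Z_EXTENT / DZ2))    # number of estimation/update voxels along Z
NZ_COARSE = int(round(Z_EXTENT / DZ1))  # number of first-calculation voxels along Z

# Estimation voxels hold the current estimate of the 3D optical
# characteristic (e.g., refractive-index deviation) on the fine Z grid.
estimation_voxels = np.zeros((NZ_FINE, NY, NX), dtype=np.float64)

# First-calculation voxels hold the resampled estimate on the coarse
# Z grid actually used for wavefront propagation.
calc_voxels = np.zeros((NZ_COARSE, NY, NX), dtype=np.float64)

# Update voxels receive the gradient on the fine Z grid.
update_voxels = np.zeros_like(estimation_voxels)
```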
The simulation calculates the wavefront propagating through the estimated object. As described above, the images of the interference fringes can be acquired by changing the orientation of the object without changing the orientation of the illumination light. In this case, taking the traveling direction of the illumination light as the Z-axis direction, the illumination light travels along the Z axis. In the simulation as well, the propagation direction of the wavefront is the Z-axis direction.
The processing executed by the processor 3 uses the estimation voxel data, the first calculation voxel data, and the update voxel data.
In step S130, the estimation voxels are defined. Step S130 is the second process. Defining the estimation voxels makes it possible to set estimated values in them. Initial estimated values may be set in the defined estimation voxels. If the initial value of every voxel is zero, the initial values can be regarded as not set.
In step S140, the number of estimations Ns is set.
In step S200, various initializations are performed.
Step S200 comprises steps S210 and S220.
In step S210, the variable n is set to 1. In step S220, the variable m is set to 1.
In step S300, the estimation process is carried out. The estimation process estimates the three-dimensional optical characteristics of the object.
Step S300 comprises steps S310, S320, S400, S330, S340, S350, and S360.
In the estimation process, an evaluation value is used, for example. The evaluation value is expressed as the difference between the wavefront information of the measurement light and the wavefront information obtained by simulation, or as the ratio between the two. Wavefront information is information that includes, for example, any of amplitude, phase, light intensity, and complex amplitude.
The wavefront information obtained by simulation (hereinafter, "estimated wavefront information") is calculated from the estimated image. The estimated image is the image formed by light transmitted through the estimated object; this light is simulated light. The wavefront information of the measurement light (hereinafter, "measured wavefront information") is calculated from the measured image. In the case of light intensity, the measured image may be used directly as the measured wavefront information.
The measured image is the image of the object acquired by the optical apparatus. The estimated image is the image of the estimated object obtained by simulation.
FIG. 6 shows the measured image and the estimated image. FIG. 6(a) shows how the measured image is acquired. FIGS. 6(b) and 6(c) show how the estimated image is obtained.
As shown in FIG. 6(a), an object 20 and a measurement optical system 21 are used to acquire the measured image. The measurement optical system 21 has a lens 22.
In FIG. 6(a), the position Zfo indicates the focal position of the measurement optical system 21, and the position Zs indicates the position of the image-side surface of the object 20.
In the measurement optical system 21, an optical image of the object 20 at the position Zfo is formed on the imaging plane IM. In FIG. 6(a), a point inside the object 20 at a distance ΔZ from the position Zs coincides with the position Zfo.
A CCD 23 is arranged on the imaging plane IM, and the optical image of the object 20 is captured by the CCD 23. As a result, an image of the optical image of the object 20 (hereinafter, "measured image Imea") is obtained, and the measured wavefront information is calculated from the measured image Imea.
Since the image of an optical image is a light-intensity image, the measured image Imea is also a light-intensity image, and the measured wavefront information calculated from it is light intensity. When light intensity is used, the measured image itself can also serve as the wavefront information.
The estimated wavefront information is calculated from the image of the optical image of the estimated object 24 (hereinafter, "estimated image Iest").
The measurement optical system 21 is depicted in FIG. 6(c); however, since the estimated image Iest is calculated by simulation, the measurement optical system 21 does not physically exist there. Therefore, the pupil function of the measurement optical system 21 is used in calculating the estimated image Iest.
The estimated image Iest is obtained from the image of the estimated object 24 on the imaging plane IM. Since the measured image Imea is a light-intensity image, the estimated image Iest should also be a light-intensity image. It is therefore necessary to calculate the light intensity of the estimated object 24 on the imaging plane IM.
Calculating the light intensity of the estimated object 24 requires calculating the wavefront propagating through it. As described above, in acquiring the wavefront information W(m), the orientation of the object 20 can be changed without changing the orientation of the illumination light. In that case, the orientation of the object 20 differs for each item of wavefront information W(m), so the simulation likewise uses estimated objects 24 with different orientations.
In the example above, the rotation angles of the object 20 are -10°, -5°, 0°, +5°, and +10°. The orientations of the estimated object 24 with respect to the Z axis are therefore also -10°, -5°, 0°, +5°, and +10°.
FIG. 7 shows the orientations of the estimated object and the voxels. The ellipse represents the estimated object. The horizontal direction is the X-axis or Y-axis direction, and the vertical direction is the Z-axis direction.
FIGS. 7(a) and 7(b) show the estimation voxels. FIGS. 7(c), 7(d), and 7(e) show the first calculation voxels. FIG. 7(f) shows the second calculation voxels. FIGS. 7(g) and 7(h) show the update voxels.
As shown in FIGS. 7(c), 7(d), 7(e), and 7(f), in the first calculation voxels and the second calculation voxels the voxel width in the traveling direction of the illumination light is Δz1. As shown in FIGS. 7(a), 7(b), 7(g), and 7(h), in the estimation voxels and the update voxels the voxel width in the traveling direction of the illumination light is Δz2. Δz1 is wider than Δz2.
In step S310, the estimated object is rotated. Step S310 is the third process. In the third process, one rotation angle is read from the plurality of rotation angles, and the estimated object is rotated so that the orientation of the estimated object defined in step S130 matches the orientation of the object at the read rotation angle.
The orientation of the estimated object defined in step S130 is the current orientation of the estimated object with respect to the reference direction in the voxel space; one example is the orientation of the object at the reference rotation angle. The orientation of the object at the read rotation angle is the orientation of the measured object specified by the rotation angle information, that is, its orientation with respect to the traveling direction of the illumination light.
If no initial values are set for the estimated object defined in step S130, the orientation of the estimated object is undetermined. Step S310 can therefore be provided as needed.
The rotation angle of the object differs depending on the value of the variable m. For example, the value of m and the rotation angle of the object can be associated as follows (see the sketch after this list):
m=1: -10°
m=2: -5°
m=3: 0°
m=4: +5°
m=5: +10°
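A minimal sketch of this 1-indexed mapping (the names are illustrative assumptions):

```python
# Rotation angles in degrees, indexed by the 1-based loop variable m.
ROTATION_ANGLES = [-10.0, -5.0, 0.0, +5.0, +10.0]

def angle_for(m: int) -> float:
    """Return the object's rotation angle for the loop variable m (1-based)."""
    return ROTATION_ANGLES[m - 1]

assert angle_for(1) == -10.0 and angle_for(5) == +10.0
```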
The estimation process calculates the wavefront propagating through the estimated object. The orientation differs from one estimated object to another, and the wavefront calculation is performed with the estimated object in a given orientation. Since the orientation of the object can be regarded as the orientation of the estimated object, the orientation of the estimated object can be expressed using the rotation angle of the object.
In step S200, the variable m is set to 1. In this case, the rotation angle for m = 1 is read, so the read rotation angle is -10°. In step S120, the reference rotation angle is set to 0°.
The read rotation angle differs from the reference rotation angle. Before step S310 is executed, the orientation of the estimated object is the orientation at the reference rotation angle. Consequently, the angular relationship between the Z axis and the estimated object does not match the angular relationship between the optical axis of the optical path and the object at the read rotation angle. That is, the orientation of the estimated object does not match the orientation of the object with respect to the optical axis at the read rotation angle.
FIG. 7(a) shows the state before step S310 is executed. The solid ellipse represents the estimated object at the reference rotation angle, and the dashed ellipse represents the estimated object at the read rotation angle. Since step S310 has not yet been executed, the solid ellipse does not overlap the dashed ellipse.
When step S310 is executed, the estimated values of the estimation voxels are rotated so that the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle. As a result, the angular relationship between the Z axis and the estimated object matches the angular relationship between the optical axis of the optical path and the object at the read rotation angle.
FIG. 7(b) shows the state after step S310 is executed. The solid ellipse overlaps the dashed ellipse. By executing step S310, the orientation of the estimated object matches the orientation of the object with respect to the optical axis at the read rotation angle. As a result, the wavefront propagating through the estimated object can be calculated with the estimated object oriented exactly as in the measurement.
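As a sketch of one way such a voxel rotation might be implemented (scipy's ndimage.rotate is an illustrative choice; the disclosure does not prescribe a particular rotation routine):

```python
import numpy as np
from scipy import ndimage

def rotate_voxels(voxels: np.ndarray, angle_deg: float) -> np.ndarray:
    """Rotate a (Z, Y, X) voxel volume in the Z-Y plane, i.e. about the X axis.

    The rotation axis is perpendicular to the Z axis (the illumination
    direction), matching the single-axis rotation described in the text.
    reshape=False keeps the voxel-space dimensions fixed.
    """
    return ndimage.rotate(voxels, angle_deg, axes=(0, 1),
                          reshape=False, order=1, mode="constant", cval=0.0)

# Example: bring the estimate from the reference angle (0 deg) to the
# read rotation angle (-10 deg):
# rotated = rotate_voxels(estimation_voxels, -10.0)
```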
In step S320, the estimated values of the first calculation voxels are generated. Step S320 is the fourth process. In the fourth process, the estimated values of the rotated estimation voxels are set directly in the first calculation voxels, thereby generating the estimated values of the first calculation voxels. Alternatively, the estimated values of the rotated estimation voxels may be set back in the estimation voxels, and the estimated values of the first calculation voxels generated from them.
FIG. 7(c) shows the state after step S320 is executed. As shown in FIG. 7(c), Δz1 is wider than Δz2.
The estimation voxels and the first calculation voxels have different voxel widths. The estimated values in the first calculation voxels are therefore calculated from the estimated values in the estimation voxels, for example by extrapolation or interpolation (see the sketch below).
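A minimal sketch of resampling the fine-grid estimate (width Δz2) onto the coarse propagation grid (width Δz1) by linear interpolation along Z; the grid names follow the earlier sketch and are assumptions:

```python
import numpy as np

def resample_z(volume: np.ndarray, dz_src: float, dz_dst: float,
               nz_dst: int) -> np.ndarray:
    """Linearly interpolate a (Z, Y, X) volume from one Z spacing to another."""
    nz_src = volume.shape[0]
    z_src = (np.arange(nz_src) + 0.5) * dz_src   # voxel-center positions
    z_dst = (np.arange(nz_dst) + 0.5) * dz_dst
    out = np.empty((nz_dst,) + volume.shape[1:], dtype=volume.dtype)
    for iy in range(volume.shape[1]):             # simple but clear column loop
        for ix in range(volume.shape[2]):
            out[:, iy, ix] = np.interp(z_dst, z_src, volume[:, iy, ix])
    return out

# Fine estimation grid (DZ2) -> coarse first-calculation grid (DZ1):
# calc_voxels = resample_z(rotated, DZ2, DZ1, NZ_COARSE)
```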
In step S400, the wavefront propagating through the estimated object and the gradient are calculated. Regarding the voxel width as the propagation interval of the wavefront, the narrower the voxel width, the larger the number of wavefronts to propagate, and the larger the number of propagated wavefronts, the longer the computation time for calculating them.
FIG. 7(d) shows the wavefront propagating in the forward direction; the dotted lines indicate wavefronts. FIG. 7(e) shows the wavefront propagating in the reverse direction. The forward direction is the direction in which the illumination light travels, and the reverse direction is the opposite direction.
As shown in FIGS. 7(d) and 7(e), the calculation of the wavefront propagating through the estimated object uses the first calculation voxel data. Since Δz1 is wider than Δz2, the number of propagated wavefronts can be made smaller than when the estimation voxel data is used. As a result, the computation time can be shortened both for the wavefront propagating in the forward direction and for the wavefront propagating in the reverse direction.
As described above, the images of the interference fringes can be acquired by changing the orientation of the object without changing the orientation of the illumination light. In this case, the spatial resolution in the traveling direction of the illumination light is not high. In the simulation, the traveling direction of the illumination light is the Z-axis direction, so the voxel width in the Z-axis direction can be widened.
Step S400 comprises steps S410, S420, S430, S440, and S450.
In step S400, the fifth process is carried out. The fifth process comprises steps S410, S420, and S440.
In step S410, the forward propagation calculation is executed. The forward propagation calculation computes the wavefront propagating in the direction in which the illumination light travels. Since the wavefront is calculated by simulation, the illumination light is simulated illumination light. As shown in FIG. 7(d), the first calculation voxels are used, so the computation time of the forward propagation calculation can be shortened.
The estimated wavefront information can be calculated from the result of the forward propagation calculation. Step S410 can therefore be regarded as a step of calculating the estimated wavefront information.
FIG. 8 is a flowchart of the calculation of the estimated wavefront information. Step S410 comprises steps S411, S412, S413, S414, and S415.
The estimated wavefront information is calculated based on forward propagation of the wavefront. In forward propagation, the wavefront propagates from the estimated object 24 toward the imaging plane IM, as shown in FIGS. 6(b) and 6(c).
In step S411, the wavefront incident on the estimated object is calculated.
The position Zin is the position of the surface of the estimated object 24 corresponding to the light-source (illumination) side surface of the object 20; it is the surface on which the simulated light is incident on the estimated object 24. The wavefront Uin at the position Zin is therefore calculated. The same wavefront as that of the measurement light illuminating the object 20 can be used as the wavefront Uin.
In step S412, the wavefront emerging from the estimated object is calculated.
The position Zout is the position of the surface of the estimated object 24 corresponding to the imaging-side (lens-side, CCD-side) surface of the object 20; it is the surface from which the simulated light emerges from the estimated object 24. The wavefront Uout at the position Zout is therefore calculated. The wavefront Uout can be calculated from the wavefront Uin using, for example, the beam propagation method.
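As a sketch of a split-step beam propagation method over the coarse grid (one angular-spectrum diffraction step plus one phase screen per slab of width Δz1); the wavelength, pixel pitch, and background index below are illustrative assumptions:

```python
import numpy as np

WAVELENGTH = 0.55   # illumination wavelength in um (assumed)
DX = 0.1            # lateral pixel pitch in um (assumed)
N0 = 1.33           # background refractive index (assumed)

def angular_spectrum_step(u: np.ndarray, dz: float) -> np.ndarray:
    """Propagate a 2D complex field u by dz through a homogeneous medium."""
    ny, nx = u.shape
    fx = np.fft.fftfreq(nx, d=DX)
    fy = np.fft.fftfreq(ny, d=DX)
    fx2, fy2 = np.meshgrid(fx**2, fy**2, indexing="xy")
    k = 2 * np.pi * N0 / WAVELENGTH
    kz2 = k**2 - (2 * np.pi)**2 * (fx2 + fy2)
    prop = np.exp(1j * np.sqrt(np.maximum(kz2, 0.0)) * dz)
    prop[kz2 < 0] = 0.0                       # discard evanescent components
    return np.fft.ifft2(np.fft.fft2(u) * prop)

def forward_propagate(u_in: np.ndarray, delta_n: np.ndarray,
                      dz: float) -> np.ndarray:
    """Split-step BPM: one diffraction step, then one phase screen per slab.

    delta_n is the (Z, Y, X) refractive-index deviation held in the
    first calculation voxels; dz is the coarse slab width (Δz1).
    """
    u = u_in
    k0 = 2 * np.pi / WAVELENGTH
    for z in range(delta_n.shape[0]):
        u = angular_spectrum_step(u, dz)
        u = u * np.exp(1j * k0 * delta_n[z] * dz)   # phase accumulated in slab
    return u

# u_out = forward_propagate(u_in, calc_voxels, DZ1)
```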
In step S413, the wavefront at a predetermined acquisition position is calculated.
The predetermined acquisition position is the position on the object side at the time the measured image was acquired; it can be any position between the positions Zin and Zout. The position Zp is one such acquisition position, and it is conjugate with the imaging plane IM.
The estimated image Iest is calculated under the same conditions as the measured image Imea. The measured image Imea was obtained from the optical image of the interior of the object 20 at a distance ΔZ from the position Zs. Calculating the estimated image Iest therefore requires the wavefront at a position ΔZ away from the position Zs.
In FIG. 6(b), the position Zout corresponds to the position Zs, and the position ΔZ away from the position Zout is the position Zp. It therefore suffices to calculate the wavefront Up at the position Zp.
Since the position Zp is ΔZ away from the position Zout, the wavefront Uout cannot be used as the wavefront Up. The wavefront Up can be calculated from the wavefront Uout using, for example, the beam propagation method.
In step S414, the wavefront at the imaging plane is calculated.
The wavefront Up passes through the measurement optical system 21 and reaches the imaging plane IM. The wavefront Uimg at the imaging plane IM can be calculated from the wavefront Up and the pupil function of the measurement optical system 21.
In step S415, the estimated wavefront information at the imaging plane is calculated.
The wavefront Uimg represents the amplitude of the light, and light intensity is the square of the amplitude. The light intensity of the estimated object 24 can therefore be calculated by squaring the wavefront Uimg, yielding the estimated image Iest, from which the estimated wavefront information is calculated.
Amplitude and phase may be used instead of light intensity. Since amplitude and phase are expressed in terms of the electric field, in that case the measured and estimated values are values calculated from the electric field. The electric field Emes based on measurement and the electric field Eest based on estimation are expressed by the following equations.
Emes = Ames × exp(i × Pmes)
Eest = Aest × exp(i × Pest)
where Pmes is the phase based on measurement, Ames is the amplitude based on measurement, Pest is the phase based on estimation, and Aest is the amplitude based on estimation.
In acquiring the electric field Emes based on measurement, the measurement light and the reference light are incident on the photodetector in a non-parallel state.
At the photodetector, the measurement light and the reference light form interference fringes on its imaging surface. The interference fringes are captured by the photodetector, so an image of the interference fringes can be acquired.
Since the interference fringes are acquired with the measurement light and the reference light non-parallel, analyzing them yields the phase based on measurement and the amplitude based on measurement. The electric field Emes based on measurement is thereby obtained. The electric field Eest based on estimation can be obtained by simulation.
Analyzing the interference fringes can also yield the complex amplitude, so the complex amplitude may be used as the wavefront information instead of light intensity.
Returning to FIG. 5, in step S420 the optimization process is carried out. In the optimization process, the estimated wavefront information is constrained by the wavefront information W(m).
The wavefront information W(m) is obtained from the image of the interference fringes I(m), which are formed by the measurement light. The wavefront information W(m) can therefore be regarded as measured wavefront information.
The value of the variable m is an ordinal number for the relative angle. The wavefront information W(m) is the measured wavefront information obtained at the m-th relative angle.
The measured wavefront information is calculated from the measured image Imea, and the estimated wavefront information from the estimated image Iest. The evaluation value can be calculated from the difference between the measured wavefront information and the estimated wavefront information, or from their ratio. Constraining the estimated wavefront information by the wavefront information means correcting the estimated wavefront information using the measured wavefront information, or calculating the error between the estimated and measured wavefront information; it is nearly synonymous with calculating the evaluation value.
The difference between the measured image Imea and the estimated image Iest, or the ratio between them, may also be used as the evaluation value.
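A minimal sketch of one possible intensity-based evaluation value (the mean squared difference between the two images); the choice of norm is an assumption, since the disclosure only requires a difference or a ratio:

```python
import numpy as np

def evaluation_value(i_mea: np.ndarray, i_est: np.ndarray) -> float:
    """Mean squared difference between measured and estimated intensity images."""
    return float(np.mean((i_mea - i_est) ** 2))

# The iteration branches on whether the evaluation value is below a threshold:
# done = evaluation_value(I_mea, I_est) < THRESHOLD
```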
In step S430, the evaluation value is compared with a threshold.
When the evaluation value is expressed as the difference between the measured wavefront information and the estimated wavefront information, that difference is calculated as the evaluation value and compared with the threshold. If the judgment result is NO, step S500 is executed; if the judgment result is YES, step S440 is executed.
(When the judgment result is NO: threshold ≧ evaluation value)
Step S500 is executed.
In step S500, the three-dimensional optical characteristics of the estimated object are calculated.
The obtained three-dimensional optical characteristics of the estimated object 24 are identical or substantially identical to those of the object 20. Using the three-dimensional optical characteristics obtained in step S500, a reconstructed estimated object can be obtained.
The reconstructed estimated object can be output to, for example, a display device.
As described above, the three-dimensional optical characteristics obtained in step S500 are identical or substantially identical to those of the object 20. The reconstructed estimated object can therefore be regarded as identical or substantially identical to the structure of the object 20.
(When the judgment result is YES: threshold ≦ evaluation value)
Step S440 is executed. In step S440, the backpropagation calculation is executed. The backpropagation calculation computes the wavefront propagating in the direction opposite to the direction in which the illumination light travels. As shown in FIG. 7(e), the first calculation voxel data is used, so the computation time of the backpropagation calculation can be shortened.
The gradient can be calculated from the result of the backpropagation calculation. Step S440 can therefore be regarded as a step of calculating the gradient.
The gradient is calculated based on backpropagation of the wavefront. In backpropagation, the wavefront propagates from the position Zout toward the position Zin. The wavefront at the position Zout can be calculated from the wavefront at the position Zp.
FIG. 6(c) shows the wavefront U'p, which is the wavefront at the position Zp. When the calculation is based on light intensity, images can be used as the wavefront information, so the measured image Imea and the estimated image Iest are used in calculating the corrected wavefront U'p.
The estimated image Iest is calculated based on the wavefront Uimg, and the wavefront Uimg is calculated based on the wavefront Up.
The initial values set in step S130 are used in calculating the wavefront Up. The initial values are values of the three-dimensional optical characteristics of the estimated object 24. At the first execution of step S440, the initial values differ from the values of the three-dimensional optical characteristics of the object 20 (hereinafter, "object characteristic values").
The larger the difference between the initial values and the object characteristic values, the larger the difference between the estimated image Iest and the measured image Imea. The difference between the estimated image Iest and the measured image Imea can therefore be regarded as reflecting the difference between the initial values and the object characteristic values.
When the calculation is based on light intensity, images can be used as the wavefront information. The wavefront Up is therefore corrected using the estimated image Iest and the measured image Imea, yielding the corrected wavefront U'p.
The wavefront U'p is given, for example, by the following equation (1):
U'p = Up × √(Imea / Iest)   (1)
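A minimal sketch of equation (1), which replaces the estimated amplitude with the measured amplitude while keeping the estimated phase (the small epsilon guarding against division by zero is an implementation assumption):

```python
import numpy as np

def constrain_wavefront(u_p: np.ndarray, i_mea: np.ndarray,
                        i_est: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Equation (1): U'_p = U_p * sqrt(I_mea / I_est).

    Scales the complex field so that its intensity matches the measurement
    while its phase is left unchanged.
    """
    return u_p * np.sqrt(i_mea / (i_est + eps))
```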
Backpropagation of the wavefront calculates the wavefront traveling from the position Zout toward the position Zin. To calculate the gradient, the corrected wavefront at the position Zout (hereinafter, "wavefront U'out") is therefore required.
Since the wavefront U'p is the corrected version of the wavefront Up, the wavefront U'p is the wavefront at the position Zp. In FIG. 6(c), for ease of viewing, the wavefront U'p is drawn at a position offset from the position Zp; likewise, in FIG. 6(b), the wavefront U'out is drawn at a position offset from the position Zout.
As shown in FIGS. 6(b) and 6(c), the position Zout is ΔZ away from the position Zp, so the wavefront U'p cannot be used as the wavefront U'out. The wavefront U'out can be calculated from the wavefront U'p using, for example, the beam propagation method.
Once the wavefront U'out has been calculated, the wavefront calculation proceeds based on backpropagation of the wavefront, in which the wavefront propagating inside the estimated object 24 is calculated. The wavefronts Uout and U'out are used in this calculation.
Since the wavefront U'p differs from the wavefront Up, the wavefront U'out also differs from the wavefront Uout. Using the wavefronts U'out and Uout, the gradient can be calculated. The gradient is the gradient of the wavefront at an arbitrary position inside the object, and it contains new information about the values of the three-dimensional optical characteristics of the estimated object 24.
In step S450, the gradient is set in the second calculation voxels. The backpropagation calculation that produced the gradient uses the first calculation voxels, and the gradient is the result obtained from the wavefront propagating in the reverse direction. That result is therefore set in each voxel of the second calculation voxels. FIG. 7(f) shows the state after step S450 is executed.
As described later, the gradient can also be set directly in the update voxels, in which case step S450 is unnecessary.
As shown in FIGS. 7(c) and 7(f), the shape of the estimated object is an ellipse; the shape of the estimated object after step S400 is the same as before step S400. In some cases, however, the shape of the estimated object after step S400 differs from the shape before step S400.
In step S330, the update voxels are generated. Step S330 is the sixth process. In the sixth process, the gradient is set directly in the update voxels, thereby generating the update voxel data. FIG. 7(g) shows the state after step S330 is executed.
As a comparison of FIGS. 7(e) and 7(g) shows, the voxel width used when the gradient was calculated in the first calculation voxels is Δz1, whereas the voxel width of the update voxels is Δz2; that is, the first calculation voxels and the update voxels have different voxel widths. The gradient in the update voxels is therefore calculated from the gradient in the first calculation voxels, for example by extrapolation or interpolation.
When step S450 is provided, the gradient in the update voxels may be calculated using the gradient in the second calculation voxels.
In step S340, the estimated object is rotated. Step S340 is the seventh process. In the seventh process, the estimated object is rotated so that the orientation of the object at the read rotation angle matches the orientation of the object at the reference rotation angle. Step S340 may be provided as needed.
When the estimated values are updated with the object oriented at the reference rotation angle, step S340 is necessary; otherwise it need not be provided. When step S340 is not provided, the reference for the next execution of step S310 is the orientation of the object at the rotation angle of the immediately preceding execution of step S310. The reference rotation angle also need not be fixed during the estimation process; it may be changed partway through.
In that case, in the voxel space, one or both of the estimated object corresponding to the set of results of the update voxels and the estimated object corresponding to the set of estimated values of the estimation voxels are rotated so that their orientations with respect to the reference direction coincide. Step S350 then updates the estimated values of the estimation voxels using the set of results of the update voxels whose orientation of the estimated object with respect to the reference direction in the voxel space matches that of the set of estimated values of the estimation voxels.
In step S310, one rotation angle is read from the plurality of rotation angles. The read rotation angle represents the orientation of the estimated object, and the processing is executed using the estimated object at that rotation angle. Consequently, the gradient in the update voxels generated in step S330 is the gradient for the estimated object at the read rotation angle.
When the estimated values are updated in step S350, however, the orientation of the estimated object must be made to match the orientation indicated by the reference rotation angle.
The estimated object is therefore rotated so that the orientation of the object at the read rotation angle matches the orientation of the object at the reference rotation angle. As a result, the orientation of the estimated object represented by the gradient in the update voxels can be made to match the orientation indicated by the reference rotation angle. FIG. 7(h) shows the state after step S340 is executed.
In step S370, it is judged whether the value of the variable m matches the value of the number of angle changes Nθ. If the judgment result is NO, step S371 is executed; if the judgment result is YES, step S350 is executed.
(When the judgment result is NO: m ≠ Nθ)
Step S371 is executed. In step S371, 1 is added to the value of the variable m. When step S371 ends, the processing returns to step S310.
In step S371, the value of the variable m is incremented by one. The value of m in the wavefront information W(m) then changes, so steps S310 through S340 are executed with the wavefront information of another relative angle. Steps S310 through S340 are repeated until they have been performed for all the relative angles.
In the example above, the number of angle changes Nθ is set to 5, so steps S310 through S340 are executed five times.
For example, when the relative angle differs between wavefront information A and wavefront information B, the wavefront information A contains information absent from the wavefront information B, and vice versa. The more items of wavefront information with different relative angles, the greater the amount of information.
With a greater amount of information, the corrected wavefront can be calculated more accurately in step S440, so the accuracy of the gradient also increases. The gradient contains information about the difference between the estimated values and the object characteristic values, so increasing the accuracy of the gradient makes it possible to reduce that difference, that is, to bring the estimated values closer to the object characteristic values.
(When the judgment result is YES: m = Nθ)
Step S350 is executed. In step S350, the estimated values of the estimation voxels are updated.
The gradient contains information about the difference between the estimated values and the object characteristic values. Adding the gradient to the estimated values therefore yields the updated estimated values.
The updated estimated values are closer to the object characteristic values than the initial values. The values of the three-dimensional optical characteristics of the estimated object 24 can therefore be updated using the updated estimated values.
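A minimal sketch of this update step; the step size is an illustrative assumption (the disclosure simply adds the gradient to the estimate):

```python
import numpy as np

def update_estimate(estimation_voxels: np.ndarray,
                    update_voxels: np.ndarray,
                    step: float = 1.0) -> np.ndarray:
    """Add the (resampled, re-oriented) gradient to the current estimate."""
    return estimation_voxels + step * update_voxels
```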
In step S360, TV regularization is performed.
Performing TV (total variation) regularization makes it possible to remove noise and correct blurred images. TV regularization may be performed as needed, so step S360 may be omitted.
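As a sketch of such a regularization step, using scikit-image's Chambolle TV denoiser as one readily available implementation (this specific library choice and weight are assumptions):

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def tv_regularize(estimation_voxels: np.ndarray,
                  weight: float = 0.05) -> np.ndarray:
    """Apply total-variation denoising to the 3D estimate.

    A larger weight removes more noise at the cost of smoothing detail.
    """
    return denoise_tv_chambolle(estimation_voxels, weight=weight)
```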
In step S380, it is judged whether the value of the variable n matches the value of the number of estimations Ns. If the judgment result is NO, step S381 is executed; if the judgment result is YES, the processing ends.
(When the judgment result is NO: n ≠ Ns)
Step S381 is executed. In step S381, 1 is added to the value of the variable n. When step S381 ends, the processing returns to step S300. Step S300 is repeated until the value of the variable n matches the value of the number of estimations Ns.
(When the judgment result is YES: n = Ns)
Since the predetermined number of iterations has been reached, the three-dimensional optical characteristics of the estimated object are calculated in step S500, and the processing ends.
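Tying the steps above together, a minimal sketch of the outer iteration; the helper functions refer to the earlier sketches, the loop structure mirrors FIG. 5, and all constants and the elided gradient computation are illustrative assumptions:

```python
import numpy as np

NS = 5        # number of estimation passes Ns (assumed)
N_THETA = 5   # number of angle changes Nθ (assumed)
u_in = np.ones((NY, NX), dtype=complex)   # plane-wave illumination (assumed)

estimate = estimation_voxels.copy()
for n in range(NS):
    grad_total = np.zeros_like(estimate)
    for m in range(1, N_THETA + 1):
        angle = angle_for(m)
        rotated = rotate_voxels(estimate, angle)               # S310
        coarse = resample_z(rotated, DZ2, DZ1, NZ_COARSE)      # S320
        u_out = forward_propagate(u_in, coarse, DZ1)           # S410
        # S420-S450: form I_est at the imaging plane from u_out, constrain
        # it with I_mea via equation (1), backpropagate, and obtain the
        # gradient on the coarse grid (details elided in this sketch).
        grad_coarse = np.zeros_like(coarse)                    # placeholder
        grad_fine = resample_z(grad_coarse, DZ1, DZ2, NZ_FINE) # S330
        grad_total += rotate_voxels(grad_fine, -angle)         # S340
    estimate = update_estimate(estimate, grad_total)           # S350
    estimate = tv_regularize(estimate)                         # S360
```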
As described above, the number of wavefronts in the forward propagation calculation and in the backpropagation calculation can be reduced. The three-dimensional optical characteristics of the object can therefore be acquired with a small amount of computation, and the computation time of the forward propagation and backpropagation calculations can be shortened.
The third process through the forward propagation calculation of the fifth process can be performed in parallel, in units of the forward propagation calculation of the fifth process (for example, wavefront units), by using matrix transformations or the like. The third and fourth processes generate the estimated values at the timing they are needed by the forward propagation calculation. For example, the third and fourth processes may generate only the estimated values for the wavefront to be calculated next in the forward propagation calculation of the fifth process, or may generate the estimated values for a plurality of wavefronts together at the timing of generation. Similarly, the backpropagation calculation of the fifth process through the seventh process can be performed in parallel. Performing them in parallel saves memory.
Also, once the sixth process has finished, the orientation of the estimated object matches the orientation of the object at the read rotation angle, so the seventh process is unnecessary in that case. Updating the estimated values falls broadly into three patterns.
In the first pattern, the estimated values are updated at the reference rotation angle. In the second pattern, the estimated values are updated at the post-rotation angle. In the third pattern, the orientations of the estimated object and the gradient are aligned to a rotation angle other than the reference rotation angle and the post-rotation angle, and the estimated values are then updated.
The fourth process corresponds to process A, which generates the estimated values of the first calculation voxels. The fifth and sixth processes correspond to process B, which generates the results of the update voxels. The eighth process corresponds to process C, which updates the estimated values of the estimation voxels.
The estimation device of this embodiment preferably satisfies the following conditional expression (1):
Δz2 × 2 < Δz1   (1)
 条件式(1)を満足することで、順伝搬演算における波面の数と逆伝搬演算における波面の数を、より少なくすることができる。そのため、物体の3次元光学特性を、より少ない演算量で取得できる。その結果、順伝搬演算と逆伝搬演算で、演算時間をより短縮することができる。 By satisfying conditional expression (1), the number of wavefronts in the forward propagation calculation and the number of wavefronts in the backward propagation calculation can be further reduced. Therefore, the three-dimensional optical characteristics of the object can be acquired with a smaller amount of calculation. As a result, it is possible to further shorten the computation time for the forward propagation computation and the backward propagation computation.
 本実施形態の推定装置では、推定処理を所定回数実行した後、Δz1を小さくすることが好ましい。 In the estimation device of this embodiment, it is preferable to reduce Δz1 after executing the estimation process a predetermined number of times.
 推定処理の実行回数が多くなると、推定物体の3次元光学特性は、物体の3次元光学特性に近くなる。推定処理を実行するたびにΔz1を小さくすることで、Z軸方向における3次元光学特性を、より高い精度で推定することができる。 As the number of executions of the estimation process increases, the 3D optical properties of the estimated object become closer to the 3D optical properties of the object. By decreasing Δz1 each time the estimation process is performed, the three-dimensional optical characteristics in the Z-axis direction can be estimated with higher accuracy.
 推定処理を第1の回数(例えば5回)実行した後にΔz1を小さく設定し、さらに、第2の回数(例えば3回)実行した後に、Δz1を更に小さく設定しても良い。 It is also possible to set Δz1 to be small after executing the estimation process a first number of times (eg, 5 times), and further to set Δz1 to be even smaller after executing the estimation process a second number of times (eg, 3 times).
 本実施形態の推定装置では、Δz1は、複数の回転角度情報の間の最大の回転角度差に基づいて決定することが好ましい。 In the estimation device of this embodiment, Δz1 is preferably determined based on the maximum rotation angle difference between multiple pieces of rotation angle information.
 Δz1が広くなり過ぎると、伝搬する波面の数を少なくなり過ぎる。そのため、Z軸方向における3次元光学特性の推定精度が低下する。Δz1を最大の回転角度に基づいて決定することで、Δz1が広くなり過ぎない。そのため、Z軸方向における推定精度の低下を防止することができる。 When Δz1 becomes too wide, the number of propagating wavefronts becomes too small. Therefore, the estimation accuracy of the three-dimensional optical characteristics in the Z-axis direction is lowered. By determining Δz1 based on the maximum rotation angle, Δz1 does not become too wide. Therefore, it is possible to prevent deterioration of estimation accuracy in the Z-axis direction.
 ΔZ1は、複数の回転角度情報に基づく最大の回転角度の差、具体的には、照明光の進行方向に対して物体の負の向きの最大回転角度と正の向きの最大回転角度との間の差に基づいて決定することが望ましい。 ΔZ1 is the maximum rotation angle difference based on a plurality of pieces of rotation angle information, specifically, the difference between the maximum rotation angle in the negative direction and the maximum rotation angle in the positive direction with respect to the traveling direction of the illumination light. should be determined based on the difference between
 本実施形態の推定システムは、本実施形態の推定装置と、照明光を射出する光源と、光検出器と、物体を載置するステージと、角度変更機構と、を備え、ステージは、光源から光検出器まで間の光路上に配置され、角度変更機構は、光路の光軸に対する物体の配置角度を変化させる。 The estimation system of this embodiment includes the estimation device of this embodiment, a light source that emits illumination light, a photodetector, a stage on which an object is placed, and an angle changing mechanism. An angle changing mechanism is arranged on the optical path to the photodetector, and changes the placement angle of the object with respect to the optical axis of the optical path.
 角度変更機構は、ステージを回転させることで、光路の光軸に対する物体の配置角度、言い換えれば、照明光の進行方向に対する物体の向きを変化させる。 By rotating the stage, the angle changing mechanism changes the arrangement angle of the object with respect to the optical axis of the optical path, in other words, the orientation of the object with respect to the traveling direction of the illumination light.
 図9は、本実施形態の推定システムを示す図である。図1と同じ構成については同じ番号を付し、説明を省略する。 FIG. 9 is a diagram showing the estimation system of this embodiment. The same numbers are assigned to the same configurations as in FIG. 1, and the description thereof is omitted.
 推定システム30は、光源31と、光検出器32と、ステージ33と、推定装置1と、を備える。推定装置1は、メモリ2と、プロセッサ3と、を有する。 The estimation system 30 includes a light source 31, a photodetector 32, a stage 33, and an estimation device 1. The estimating device 1 has a memory 2 and a processor 3 .
 光源31は、照明光を射出する。照明光の進行方向には、ビームスプリッタ34が配置されている。照明光は、ビームスプリッタ34に入射する。ビームスプリッタ34は、光学膜が形成された光学面を有する。ビームスプリッタ34では、光学膜によって、入射した光から、第1の方向に透過する光と、第2の方向に反射する光と、が生成される。 The light source 31 emits illumination light. A beam splitter 34 is arranged in the traveling direction of the illumination light. The illumination light enters beam splitter 34 . The beam splitter 34 has an optical surface on which an optical film is formed. In the beam splitter 34 , light that is transmitted in the first direction and light that is reflected in the second direction are generated from the incident light by the optical film.
 The estimation system 30 forms a measurement optical path OPmea in the first direction and a reference optical path OPref in the second direction. However, the reference optical path OPref may instead be formed in the first direction and the measurement optical path OPmea in the second direction. The illumination light travels along the measurement optical path OPmea and the reference optical path OPref, respectively.
 A mirror 35 is arranged in the measurement optical path OPmea and bends it in the second direction. A mirror 36 is arranged in the reference optical path OPref and bends it in the first direction. As a result, the reference optical path OPref intersects the measurement optical path OPmea, and a beam splitter 37 is arranged at the position where the two optical paths intersect.
 In the measurement optical path OPmea, the stage 33 is arranged between the mirror 35 and the beam splitter 37. An object S is placed on the stage 33 and is irradiated with the illumination light.
 When the object S is irradiated with the illumination light, measurement light Lmea exits from the object S. The measurement light Lmea is illumination light that has passed through the object S.
 Reference light Lref travels along the reference optical path OPref. The reference light Lref is illumination light that does not pass through the object S.
 The measurement light Lmea and the reference light Lref enter the beam splitter 37. The beam splitter 37 has an optical surface on which an optical film is formed; the optical film splits the incident light into light transmitted in the first direction and light reflected in the second direction.
 The photodetector 32 is arranged in the first direction. While the light source 31 is on, the measurement light Lmea and the reference light Lref are incident on the photodetector 32.
 The measurement light Lmea and the reference light Lref form interference fringes. By imaging the interference fringes with the photodetector 32, an image of the interference fringes can be acquired.
 The image of the interference fringes is sent to the estimation device 1. The estimation device 1 acquires wavefront information from the image of the interference fringes, stores the wavefront information in the memory 2, and executes the estimation process using the wavefront information.
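 The document does not spell out how the wavefront information is computed from the fringe image. For a single-shot off-axis geometry like the one in FIG. 9, a common choice, shown here only as an illustration and not as the disclosed method, is Fourier-domain demodulation: isolate the +1 sideband, shift it to the origin, and invert. The carrier offset and crop radius below are assumed to be known from the setup.

    import numpy as np

    def wavefront_from_fringes(hologram, carrier=(40, 0), radius=20):
        # carrier: (fx, fy) pixel offset of the +1 order in the FFT (assumed known)
        # radius: crop radius around the sideband, in frequency pixels
        F = np.fft.fftshift(np.fft.fft2(hologram))
        ny, nx = hologram.shape
        cy, cx = ny // 2 + carrier[1], nx // 2 + carrier[0]
        # Keep only the +1 diffraction order.
        yy, xx = np.ogrid[:ny, :nx]
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        side = np.where(mask, F, 0)
        # Re-center the sideband to remove the carrier frequency, then invert.
        side = np.roll(side, (-carrier[1], -carrier[0]), axis=(0, 1))
        return np.fft.ifft2(np.fft.ifftshift(side))  # complex field: amplitude and phase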
 The estimation process uses a plurality of pieces of wavefront information. Across these pieces of wavefront information, the incident angle of the illumination light on the object differs from one piece to the next.
 The estimation system of this embodiment has an angle changing mechanism. The angle changing mechanism changes the relative orientation between the illumination light and the object, so the incident angle of the illumination light on the object can be changed. As a result, a plurality of pieces of wavefront information can be acquired.
 In the estimation system of this embodiment, the angle changing mechanism preferably has a driving device and a rotating member; the rotating member holds the stage, and the rotation axis of the rotating member intersects the object and is orthogonal to the optical axis of the optical path.
 The optical axis of the optical path is the axis orthogonal to the detection surface of the photodetector and passing through the center of the photodetector. If a detection optical system is provided, it is the optical axis of the detection optical system; if an illumination optical system is provided, it is the optical axis of the illumination optical system.
 As shown in FIG. 9, the estimation system 30 has an angle changing mechanism 40. The angle changing mechanism 40 is arranged on the measurement optical path OPmea side.
 The angle changing mechanism 40 has a driving device 41 and a rotating member 42. The rotating member 42 holds the stage 33. The axis RX is the rotation axis of the rotating member 42; it intersects the object S and is orthogonal to the optical axis AX.
 In the angle changing mechanism 40, the driving device 41 rotates the rotating member 42. Since the rotating member 42 holds the stage 33, the stage 33 rotates with it. By rotating the stage 33, the object S can be rotated about the axis RX.
 The illumination light is reflected by the mirror 35 and enters the object S. Rotating the object S changes its orientation with respect to the illumination light, so the illumination light strikes the object S from various directions.
 Measurement light Lmea exits from the object S and enters the photodetector 32.
 In the estimation system 30, the orientation of the object S changes while the direction of the illumination light stays fixed. The incident angle of the illumination light on the object S can therefore be changed.
 FIG. 10 is a diagram showing an estimation system of this embodiment. Components identical to those in FIG. 9 are given the same reference numerals, and their description is omitted.
 The estimation system 50 has a mirror 51 and a beam splitter 52. The mirror 51 is arranged in the measurement optical path OPmea. The beam splitter 52 is arranged at the position where the reference optical path OPref and the measurement optical path OPmea intersect.
 In the estimation system 30 shown in FIG. 9, the beam splitter 37 bends the measurement optical path OPmea in the first direction, and the mirror 36 bends the reference optical path OPref in the first direction.
 In the estimation system 50, by contrast, the mirror 51 bends the measurement optical path OPmea in the direction opposite to the first direction, and the beam splitter 52 bends the reference optical path OPref in the direction opposite to the first direction. A difference therefore arises between the optical path length of the measurement optical path OPmea and that of the reference optical path OPref.
 If the coherence length of the illumination light is longer than this optical path length difference, interference fringes are formed. If the coherence length is shorter than the difference, an optical path length adjuster 53 is arranged between the beam splitter 34 and the beam splitter 52; with this arrangement, interference fringes can be formed.
 The optical path length adjuster 53 has, for example, a piezo stage and four mirrors. Two of the mirrors are mounted on the piezo stage; moving these two mirrors changes the optical path length of the reference optical path OPref.
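 As a back-of-the-envelope aid, the adjustment target follows from two numbers: fringes require the path difference to stay below the coherence length, and in a folded retro-path each unit of mirror travel changes the path length by about two units. The factor of two is an assumed fold geometry; the document does not give the layout of the four mirrors.

    def path_difference_ok(delta_L, coherence_length):
        # Fringes form only while the path difference stays below the coherence length.
        return abs(delta_L) < coherence_length

    def mirror_shift_for(delta_L):
        # Assumed geometry: shifting the two piezo-mounted mirrors by d changes
        # the optical path length by about 2*d.
        return delta_L / 2.0

    # Example: a 0.3 mm path difference with a 0.1 mm coherence length.
    dL, Lc = 0.3e-3, 0.1e-3
    print(path_difference_ok(dL, Lc))  # False -> adjuster 53 is needed
    print(mirror_shift_for(dL))        # ~0.15 mm of mirror travel to cancel it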
 The estimation system of this embodiment preferably satisfies the following conditional expression (1).
 Δz2 × 2 < Δz1   (1)
 The estimation system of this embodiment preferably has a detection optical system and satisfies the following conditional expression (2).
 Δz1 < λ/NA² × 5   (2)
 where
 λ is the wavelength of the illumination light, and
 NA is the numerical aperture of the detection optical system.
 The estimation system of this embodiment preferably reduces Δz1 each time the estimation process is executed.
 In the estimation system of this embodiment, Δz1 is preferably determined based on the maximum rotation angle difference.
 FIG. 11 shows images of the object obtained by simulation. NA is the numerical aperture of the detection optical system, and Δz1/Δz2 is the ratio of Δz1 to Δz2. The horizontal direction of each image is the X-axis or Y-axis direction; the vertical direction is the Z-axis direction.
 The first image group consists of the images shown in FIGS. 11(a) to 11(d). In the first image group, Δz1/Δz2 is 10, and the NA value differs from figure to figure.
 The second image group consists of the images shown in FIGS. 11(e) to 11(h). In the second image group, Δz1/Δz2 is 20, and the NA value differs from figure to figure.
 The third image group consists of the images shown in FIGS. 11(i) to 11(l). In the third image group, Δz1/Δz2 is 43, and the NA value differs from figure to figure.
 "○" indicates good image quality, "△" slightly poor image quality, and "×" poor image quality.
 In the simulation, a photonic crystal fiber (hereinafter "PCF") is used as the object. The PCF has a cylindrical member and through holes.
 In the PCF, a plurality of through holes are formed inside the cylindrical member. Each through hole is cylindrical and runs along the generatrix of the cylindrical member. The outer diameter of the PCF is 230 μm, and the refractive index of the medium is 1.466. The through holes and the surroundings of the cylindrical member are filled with a liquid having a refractive index of 1.44.
 Each image is estimated using wavefront information acquired with illumination light having a wavelength λ of 1000 nm.
 A larger NA value makes the result more susceptible to aberrations; the Z-axis direction in particular is strongly affected by the medium. In each of the first, second, and third image groups, the image quality degrades as the NA value increases.
 FIG. 12 is a graph showing the relationship between computation time and voxel width. The computation time on the vertical axis is the time for the forward propagation and back propagation operations, normalized by the computation time when Δz1 = Δz2.
 As shown in FIG. 12, the larger the value of Δz1/Δz2, the shorter the computation time. As described above, the wavefront propagating through the estimated object is calculated using Δz1. The wider Δz1 is, the fewer wavefronts need to be propagated, which shortens the time spent calculating the propagated wavefronts.
 However, if Δz1 becomes too wide, the number of propagated wavefronts becomes too small, and the estimation accuracy of the three-dimensional optical characteristics in the Z-axis direction deteriorates. The value of Δz1/Δz2 in the third image group is larger than in the first image group; accordingly, the third image group contains more images with poor image quality than the first image group.
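 The trend in FIG. 12 follows from a simple count: one forward or backward pass over an object of thickness T touches roughly T/Δz1 planes. A toy calculation with a PCF-like thickness follows; the Δz2 value is picked only for illustration.

    def propagation_steps(thickness, dz1):
        # Number of wavefront planes touched by one forward (or backward) pass.
        return max(1, round(thickness / dz1))

    T, dz2 = 230e-6, 0.5e-6   # 230 um object; dz2 chosen arbitrarily
    for ratio in (1, 10, 20, 43):
        dz1 = ratio * dz2
        print(f"dz1/dz2 = {ratio:2d}: {propagation_steps(T, dz1)} propagation steps")
    # 460 steps at ratio 1 shrink to about 11 at ratio 43, matching the
    # roughly inverse relation between dz1 and computation time.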
 The estimation method of this embodiment is a method for estimating the three-dimensional optical characteristics of an object. The three-dimensional optical characteristics are a refractive index distribution or an absorptance distribution. Composite information has wavefront information and rotation angle information: the wavefront information is information on a wavefront acquired based on illumination light that has passed through the object, and the rotation angle information is information indicating the orientation of the object when the illumination light passed through it.
 The estimation process is executed by a computer, which reads the composite information stored in the memory and executes the estimation process.
 The estimation process uses a plurality of voxel spaces, each composed of a set of voxels: a voxel space composed of a set of first calculation voxels, a voxel space composed of a set of estimation voxels, and a voxel space composed of a set of update voxels.
 The reference direction of each voxel space is the traveling direction of the pseudo illumination light. Accordingly, the reference direction of the first calculation voxels, the estimation voxels, and the update voxels is also the traveling direction of the pseudo illumination light.
 The reference direction coincides with the optical-axis direction of the optical path of the measurement device at the time of measurement, in other words, with the traveling direction of the illumination light. If the measurement device has a detection optical system, this is the optical-axis direction of the detection optical system; if it has an illumination optical system, it is the optical-axis direction of the illumination optical system. It is also the direction normal to the photodetection surface of the photodetector of the measurement device.
 As in the above description, the voxel space composed of the set of first calculation voxels is called the first calculation voxels, the voxel space composed of the set of estimation voxels is called the estimation voxels, and the voxel space composed of the set of update voxels is called the update voxels.
 In the first calculation voxels, the voxel width in the reference direction is Δz1; in the estimation voxels and the update voxels, the voxel width in the reference direction is Δz2; and Δz1 is wider than Δz2.
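 Because the estimation and update voxels use Δz2 while the first calculation voxels use Δz1, generating the first calculation voxels implies a resampling step along the reference direction. One plausible realization, not specified in this document, is block-averaging; the sketch below assumes Δz1 is an integer multiple of Δz2.

    import numpy as np

    def to_coarse(fine, factor):
        # Average groups of `factor` fine slices (width dz2) into coarse
        # slices (width dz1 = factor * dz2) along the reference (Z) axis.
        nz = (fine.shape[0] // factor) * factor   # drop any ragged tail
        return fine[:nz].reshape(-1, factor, *fine.shape[1:]).mean(axis=1)

    fine = np.random.rand(40, 8, 8)   # 40 fine slices of an 8x8 field
    coarse = to_coarse(fine, factor=10)
    print(coarse.shape)               # (4, 8, 8): four coarse slices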
 The predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information. The processing executed by the processor 3 produces a result, and the result indicates the difference between the wavefront of the object and the wavefront of the estimated object.
 As shown in FIG. 4, the estimation process executes step S10, step S20, and step S30.
 Step S10 is process A, which generates the estimated values of the first calculation voxels. Step S20 is process B, which generates the result of the update voxels. Step S30 is process C, which updates the estimated values of the estimation voxels. Processes A, B, and C have already been described, so their description is omitted here.
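 The document describes processes A, B, and C only at the level of data flow; everything below is a minimal sketch of one way that flow could be realized. The angular-spectrum propagator, the multiplicative phase-screen model, the gradient-style residual back-projection, and the 0.01 step size are all assumptions, not the disclosed algorithm, and scipy.ndimage.rotate stands in for the object rotation of process A.

    import numpy as np
    from scipy.ndimage import rotate

    def angular_spectrum(field, dz, wavelength, dx):
        # Free-space angular-spectrum propagation of a 2-D complex field by dz.
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, dx)
        fy = np.fft.fftfreq(ny, dx)
        f2 = fx[None, :] ** 2 + fy[:, None] ** 2
        kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1.0 / wavelength ** 2 - f2))
        return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * dz))

    def estimate(measurements, n_est, dz1, dz2, wavelength, dx, n0=1.44, iters=5):
        # measurements: list of (rotation angle [deg], measured complex wavefront)
        # n_est: fine-grid refractive-index estimate, shape (nz_fine, ny, nx)
        factor = int(round(dz1 / dz2))
        k0 = 2 * np.pi / wavelength
        for _ in range(iters):
            for angle, w_meas in measurements:
                # Process A: rotate the estimate to the recorded orientation and
                # block-average fine slices (dz2) into coarse slices (dz1).
                rot = rotate(n_est, angle, axes=(0, 1), reshape=False, order=1)
                nz = (rot.shape[0] // factor) * factor
                coarse = rot[:nz].reshape(-1, factor, *rot.shape[1:]).mean(axis=1)
                # Process B: forward pass slice by slice (phase-screen model).
                u = np.ones(n_est.shape[1:], dtype=complex)   # plane pseudo light
                fwd = []
                for sl in coarse:
                    u = angular_spectrum(u, dz1, wavelength, dx)
                    u = u * np.exp(1j * k0 * (sl - n0) * dz1)
                    fwd.append(u)
                # Constrain with the measured wavefront, then run backward.
                r = w_meas - u
                grad = np.zeros(coarse.shape)
                for i in range(len(coarse) - 1, -1, -1):
                    grad[i] = np.imag(np.conj(fwd[i]) * r) * k0 * dz1
                    r = angular_spectrum(r, -dz1, wavelength, dx)
                # Process C: spread the coarse result over fine slices, rotate
                # back, and update the fine-grid estimate with a small step.
                upd = np.repeat(grad, factor, axis=0)
                upd = rotate(upd, -angle, axes=(0, 1), reshape=False, order=1)
                n_est[:upd.shape[0]] += 0.01 * upd
        return n_est

 The split between the two grids is visible in the inner loop: propagation, the expensive part, walks over coarse slices of width Δz1, while the stored estimate keeps the fine width Δz2.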
 The estimation method of this embodiment preferably satisfies the following conditional expression (1).
 Δz2 × 2 < Δz1   (1)
 In the estimation method of this embodiment, it is preferable to reduce Δz1 each time the estimation process is executed.
 In the estimation method of this embodiment, Δz1 is preferably determined based on the maximum rotation angle difference.
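 Combining the two preferences gives a coarse-to-fine schedule: start with a wide Δz1 for cheap early passes, then shrink it to recover Z-axis detail. A minimal sketch follows, assuming a halving schedule and the `estimate` sketch shown earlier; both are illustrative choices, not the disclosed procedure.

    def coarse_to_fine(measurements, n_est, dz1_start, dz2, wavelength, dx, runs=3):
        # Run the estimation repeatedly, halving dz1 between runs.
        dz1 = dz1_start
        for _ in range(runs):
            n_est = estimate(measurements, n_est, dz1, dz2, wavelength, dx)
            dz1 = max(2 * dz2, dz1 / 2)   # keep condition (1) approximately: dz1 > 2*dz2
        return n_est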
 The recording medium of this embodiment is a computer-readable recording medium on which a program is recorded. The program causes the processor of an estimation device having a memory and a processor to execute an estimation process for estimating the three-dimensional optical characteristics of an object. The memory stores a plurality of pieces of composite information, and the program causes the processor to execute the estimation process using process a, process A, process B, and process C.
 The composite information has wavefront information and rotation angle information: the wavefront information is information on a wavefront acquired based on illumination light that has passed through the object, and the rotation angle information is information indicating the orientation of the object when the illumination light passed through it. The estimation process estimates the three-dimensional optical characteristics of the object, which are a refractive index distribution or an absorptance distribution.
 The processing executed by the processor uses a voxel space composed of a set of first calculation voxels, a voxel space composed of a set of estimation voxels, and a voxel space composed of a set of update voxels.
 The voxel space composed of the set of first calculation voxels is called the first calculation voxels, the voxel space composed of the set of estimation voxels is called the estimation voxels, and the voxel space composed of the set of update voxels is called the update voxels.
 In the first calculation voxels, the voxel width in the reference direction is Δz1; in the estimation voxels and the update voxels, the voxel width in the reference direction is Δz2; and Δz1 is wider than Δz2.
 The predetermined orientation is the orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information. The result indicates the difference between the wavefront of the object and the wavefront of the estimated object.
 In process a, the plurality of pieces of composite information are read from the memory.
 In process A, the estimated object corresponding to the set of estimated values of the estimation voxels is rotated in the voxel space so that its orientation with respect to the reference direction matches the predetermined orientation, and the estimated values of the first calculation voxels are generated.
 In process B, the estimated values of the first calculation voxels generated in process A are used to calculate the wavefront propagating in the reference direction in which the pseudo illumination light travels, thereby generating estimated wavefront information; the estimated wavefront information is constrained by the wavefront information, the wavefront propagating in the direction opposite to the reference direction is calculated, and the result of the update voxels is generated.
 In process C, the estimated values of the estimation voxels are updated based on the result of the update voxels generated in process B.
 As described above, the present invention is suitable for an estimation device, an estimation system, an estimation method, and a recording medium that can acquire the three-dimensional optical characteristics of an object with a small amount of computation.
Reference Signs List
 1 estimation device
 2 memory
 3 processor
 10, 20 object
 21 measurement optical system
 22 lens
 23 CCD
 24 estimated object
 30, 50, 60 estimation system
 31 light source
 32 photodetector
 33 stage
 34, 37, 52 beam splitter
 35, 36, 51 mirror
 40 angle changing mechanism
 41 driving device
 42 rotating member
 53 optical path length adjuster
 61 light source
 62, 67 lens
 63 illumination optical system
 64 stage
 65 detection optical system
 66 photodetector
 AX optical axis
 Lλ illumination light
 Lmea measurement light
 Lref reference light
 OPmea measurement optical path
 OPref reference optical path
 RX axis
 S object

Claims (9)

  1.  An estimation device comprising a memory and a processor, wherein
     the memory stores a plurality of pieces of composite information,
     the composite information has wavefront information and rotation angle information,
     the wavefront information is information on a wavefront acquired based on illumination light that has passed through an object,
     the rotation angle information is information indicating an orientation of the object when the illumination light passed through the object,
     the processor executes an estimation process of estimating a three-dimensional optical characteristic of the object,
     the three-dimensional optical characteristic is a refractive index distribution or an absorptance distribution,
     a voxel space is composed of a set of first calculation voxels,
     the voxel space is composed of a set of estimation voxels,
     the voxel space is composed of a set of update voxels,
     a reference direction of the voxel space, the first calculation voxels, the estimation voxels, and the update voxels is a traveling direction of pseudo illumination light,
     in the first calculation voxels, a voxel width in the reference direction is Δz1,
     in the estimation voxels and the update voxels, a voxel width in the reference direction is Δz2,
     Δz1 is wider than Δz2,
     a predetermined orientation is an orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information,
     a result indicates a difference between the wavefront of the object and a wavefront of an estimated object, and
     the estimation process includes:
     a process A of rotating the estimated object, which corresponds to a set of estimated values of the estimation voxels, in the voxel space so that the orientation of the estimated object with respect to the reference direction coincides with the predetermined orientation, and generating estimated values of the first calculation voxels corresponding to the rotated estimated object;
     a process B of generating estimated wavefront information by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, constraining the estimated wavefront information with the wavefront information, calculating a wavefront propagating in a direction opposite to the reference direction, and generating the result of the update voxels; and
     a process C of updating the estimated values of the estimation voxels based on the result of the update voxels generated in the process B.
  2.  The estimation device according to claim 1, wherein the following conditional expression (1) is satisfied.
     Δz2 × 2 < Δz1   (1)
  3.  The estimation device according to claim 1, wherein Δz1 is reduced after the estimation process is executed a predetermined number of times.
  4.  The estimation device according to claim 1, wherein Δz1 is determined based on a maximum rotation angle difference among the plurality of pieces of rotation angle information.
  5.  An estimation system comprising:
     the estimation device according to any one of claims 1 to 4;
     a light source that emits the illumination light;
     a photodetector;
     a stage on which the object is placed; and
     an angle changing mechanism, wherein
     the stage is arranged on an optical path between the light source and the photodetector, and
     the angle changing mechanism changes a placement angle of the object with respect to an optical axis of the optical path.
  6.  The estimation system according to claim 5, wherein
     the angle changing mechanism has a driving device and a rotating member,
     the rotating member holds the stage, and
     a rotation axis of the rotating member intersects the object and is orthogonal to the optical axis of the optical path.
  7.  The estimation system according to claim 5, comprising a detection optical system, wherein the following conditional expression (2) is satisfied.
     Δz1 < λ/NA² × 5   (2)
     where
     λ is a wavelength of the illumination light, and
     NA is a numerical aperture of the detection optical system.
  8.  An estimation method for estimating a three-dimensional optical characteristic of an object, wherein
     the three-dimensional optical characteristic is a refractive index distribution or an absorptance distribution,
     composite information has wavefront information and rotation angle information,
     the wavefront information is information on a wavefront acquired based on illumination light that has passed through the object,
     the rotation angle information is information indicating an orientation of the object when the illumination light passed through the object,
     a voxel space is composed of a set of first calculation voxels,
     the voxel space is composed of a set of estimation voxels,
     the voxel space is composed of a set of update voxels,
     a reference direction of the voxel space, the first calculation voxels, the estimation voxels, and the update voxels is a traveling direction of pseudo illumination light,
     in the first calculation voxels, a voxel width in the reference direction is Δz1,
     in the estimation voxels and the update voxels, a voxel width in the reference direction is Δz2,
     Δz1 is wider than Δz2,
     a predetermined orientation is an orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information,
     a result indicates a difference between the wavefront of the object and a wavefront of an estimated object, and
     an estimation process is executed, the estimation process executing:
     a process A of rotating the estimated object, which corresponds to a set of estimated values of the estimation voxels, in the voxel space so that the orientation of the estimated object with respect to the reference direction coincides with the predetermined orientation, and generating estimated values of the first calculation voxels corresponding to the rotated estimated object;
     a process B of generating estimated wavefront information by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, constraining the estimated wavefront information with the wavefront information, calculating a wavefront propagating in a direction opposite to the reference direction, and generating the result of the update voxels; and
     a process C of updating the estimated values of the estimation voxels based on the result of the update voxels generated in the process B.
  9.  A computer-readable recording medium recording a program for causing the processor of a computer comprising a memory and a processor to execute an estimation process, wherein
     composite information has wavefront information and rotation angle information,
     the wavefront information is information on a wavefront acquired based on illumination light that has passed through an object,
     the rotation angle information is information indicating an orientation of the object when the illumination light passed through the object,
     the estimation process estimates a three-dimensional optical characteristic of the object,
     the three-dimensional optical characteristic is a refractive index distribution or an absorptance distribution,
     a voxel space is composed of a set of first calculation voxels,
     the voxel space is composed of a set of estimation voxels,
     the voxel space is composed of a set of update voxels,
     a reference direction of the voxel space, the first calculation voxels, the estimation voxels, and the update voxels is a traveling direction of pseudo illumination light,
     in the first calculation voxels, a voxel width in the reference direction is Δz1,
     in the estimation voxels and the update voxels, a voxel width in the reference direction is Δz2,
     Δz1 is wider than Δz2,
     a predetermined orientation is an orientation of the object with respect to the traveling direction of the illumination light specified by the rotation angle information,
     a result indicates a difference between the wavefront of the object and a wavefront of an estimated object,
     in a process a, the plurality of pieces of composite information are read from the memory,
     in a process A, the estimated object corresponding to a set of estimated values of the estimation voxels is rotated in the voxel space so that the orientation of the estimated object with respect to the reference direction coincides with the predetermined orientation, and estimated values of the first calculation voxels are generated,
     in a process B, estimated wavefront information is generated by calculating a wavefront propagating in the reference direction in which the pseudo illumination light travels using the estimated values of the first calculation voxels generated in the process A, the estimated wavefront information is constrained with the wavefront information, a wavefront propagating in a direction opposite to the reference direction is calculated, and the result of the update voxels is generated,
     in a process C, the estimated values of the estimation voxels are updated based on the result of the update voxels generated in the process B, and
     the program causes the processor to execute the estimation process using the process a and the processes A to C.