CN114972614A - Reverse mapping cross-interface tomography method - Google Patents


Info

Publication number
CN114972614A
CN114972614A (application CN202210497153.6A)
Authority
CN
China
Prior art keywords
voxel
point
reverse
ray
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210497153.6A
Other languages
Chinese (zh)
Inventor
伍岳
龚步高
凌晨
Current Assignee
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202210497153.6A
Publication of CN114972614A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/06: Ray-tracing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00: Geometric image transformations in the plane of the image
    • G06T 3/08: Projecting images onto non-planar surfaces, e.g. geodetic screens


Abstract

The invention discloses a reverse mapping cross-interface tomography method. A reverse ray tracing relation is first established: starting from a point on the projected pixel coordinate system, a ray passes through the lens center to its unique corresponding point on a given voxel layer in the target to be measured. A reverse mapping relation is then established, the projection weight coefficient is calculated directly from the volume fraction of the intersection between the back-traced light path and each spherical voxel, and the point spread function is finally obtained. The method not only solves the problem of projection distortion in tomographic imaging within a confined space, but also requires only the reverse mapping relation from the two-dimensional pixels to the voxels in the three-dimensional target field, without the forward projection process from voxels to pixels. This avoids the time-consuming mapping-relation conversion step, saves a large amount of computing resources, and improves computational efficiency.

Description

Reverse mapping cross-interface tomography method
Technical Field
The invention belongs to the technical field of optical imaging, and particularly relates to a reverse mapping cross-interface tomography method.
Background
Tomography reconstructs the three-dimensional spatial distribution of a selected target from projections acquired simultaneously from different viewing angles, and is widely applied to flow and combustion problems. Tomography problems can be divided into confined-space and open-space cases according to the measurement environment. Signals emitted by targets in open space are received directly by image sensors, and these targets generally have a predetermined characteristic spatial distribution (a dye solution with uniform dye concentration, a premixed conical flame, etc.). Tomographic reconstruction is commonly used to study three-dimensional distributions of combustion and flow characteristics such as the velocity field, concentration, flame surface characteristics, and temperature. In practice, however, combustion and flow often occur in confined spaces, where the ambient conditions and the patterns of flow and flame are more complex than in open space, which poses a significant challenge to analyzing flow or combustion mechanisms.
For confined-space problems, however, many previous studies neglect the projection distortion caused by refraction of light through a transparent medium. A light path in a confined space is refracted twice by an optical wall. If ray tracing is performed directly from the voxels in the region to be measured to the projection pixels, a large number of rays must be emitted from every position in the measurement domain to find the unique light path passing through the optical center of the lens, which requires many iteration steps and is therefore very time-consuming. Although in some two-dimensional optical diagnostic techniques the refraction effect can be reduced by making the optical axis of the image sensor perpendicular to the surface of the transparent medium, three-dimensional tomographic measurement receives signals with image sensors at several different angles, and not every sensor can be perpendicular to the medium surface; the projections are therefore inevitably affected by refraction, introducing large calculation errors into the reconstruction. In addition, a projection measured through a planar medium exhibits only a whole-pixel shift compared with one measured in open space, but a projection measured through a transparent medium with a curved surface (such as an internal-combustion-engine cylinder) suffers not only a pixel shift but also deformation of the signal distribution, reducing the accuracy of the reconstruction result. Cross-interface three-dimensional tomography algorithms have been developed for this problem, for example the Cross-Interface Computed Tomography (CICT) imaging method.
In these previous algorithms, however, calculating the mapping between projection pixels and voxels always begins by establishing a reverse ray tracing relation from the two-dimensional projection to the three-dimensional region to be measured, then converting it into a forward relation (from the three-dimensional target to the two-dimensional projection) by means of a mathematical model, and finally computing the point spread function. This computation is very time-consuming, and when high measurement accuracy requires a very small pixel and voxel discretization size, the time consumed rises exponentially.
Disclosure of Invention
In view of this, the present invention provides a reverse mapping cross-interface tomography method, namely the RMCICT (Reverse-Mapping Cross-Interface Computed Tomography) imaging method, which adopts a simpler imaging model to avoid the time-consuming mapping-relation conversion step and effectively improves computation speed while preserving reconstruction accuracy.
The invention is realized by the following technical scheme:
a reverse mapping cross-interface tomography method is based on imaging of a target to be detected in a transparent hollow cylindrical cylinder sleeve on a camera outside the cylinder sleeve, and comprises the following steps:
step S1: establishing a pixel array coordinate system of camera imaging and a voxel space coordinate system in a cylinder sleeve;
step S2: carrying out reverse ray tracing, and establishing a reverse mapping relation from a pixel array coordinate system to a voxel space coordinate system;
step S3, obtaining the projection weight coefficient of each voxel according to the inverse mapping relation of step S2;
step S4: and based on the projection weight coefficient of each voxel in the step S3, establishing a point spread function matrix from the voxels on all the voxel layers to the CCD plane according to the reverse mapping relation, and realizing the imaging of the target to be measured placed in the cylinder sleeve on the CCD plane.
Further, the specific manner of step S1 is as follows:
the method for establishing the pixel array coordinate system of the camera imaging comprises the following steps: the camera is simplified into an imaging system consisting of a CCD plane and a convex lens, wherein the CCD plane comprises a plurality of pixel arrays with the same size, a two-dimensional Cartesian coordinate system o-xz is defined by taking the central point of the CCD plane as an original point, the x direction is the transverse axis of the CCD plane, and the z direction is the longitudinal axis of the CCD plane;
the mode for establishing the voxel space coordinate system in the cylinder liner is as follows: dispersing a cylindrical space surrounded by the cylinder sleeve into more than two voxel layers, wherein each voxel layer is parallel to a meridian plane in the cylindrical space surrounded by the cylinder sleeve, and each voxel layer consists of a plurality of voxel blocks with the same size; a three-dimensional Cartesian coordinate system O-XYZ is defined by setting the center of a circle of the bottom surface of the cylinder liner as an origin O, wherein the direction perpendicular to the voxel layer is the Y direction, the central axis of the cylinder liner is the Z direction, and the X direction is perpendicular to the Y, Z direction.
Further, the voxels are spherical.
Further, the specific manner of step S2 is as follows:
step S21: performing reverse ray tracing: select any voxel layer as the current voxel layer; emit a ray from a point on the CCD plane through the lens center; after refraction at the outer and inner surfaces of the cylinder sleeve, the ray intersects the current voxel layer; calculate the coordinates of the intersection point according to the law of rectilinear propagation of light and Snell's law, obtaining the coordinates of the point on the current voxel layer corresponding to the point on the CCD plane;
step S22, establishing a reverse mapping relation: based on the reverse ray tracing of step S21, for the i-th point A_i' on the CCD plane, trace its corresponding point A_ji on the j-th voxel layer in the voxel space coordinate system, thereby establishing the reverse mapping relation from the pixel array coordinate system to the voxel space coordinate system.
Further, step S21 is detailed as follows:
s211: a calibration plate is taken as a target to be measured and placed on any voxel layer in the cylinder sleeve, any point A 'on a CCD plane is selected, a ray which passes through the center C of a lens and is emitted from the point A' is made to intersect with the outer surface of the cylinder sleeve at a point E, and the incident angle alpha of the ray at the point E is calculated Ei The concrete formula is as follows:
Figure BDA0003633166180000031
wherein the content of the first and second substances,
Figure BDA0003633166180000032
is the vector of the incident ray at point E,
Figure BDA0003633166180000033
is the normal vector at point E;
s212, calculating the emergence angle alpha of the light ray at the point E according to the snell' S law Ee The concrete formula is as follows:
Figure BDA0003633166180000034
wherein n is air Is the refractive index of light in air, n quar t z The refractive index of light in the cylinder sleeve material;
s213, according to the light ray emergence angle alpha Ee Determining the intersection point F of the light ray and the inner surface of the cylinder sleeve, and calculating the incident angle alpha of the incident light ray at the point F Fi The concrete formula is as follows:
Figure BDA0003633166180000035
wherein the content of the first and second substances,
Figure BDA0003633166180000036
is the vector of the incident ray at point F,
Figure BDA0003633166180000037
is the normal vector at point F;
s214, according to the incident angle alpha of the incident light at the point F Fi And the snell's law calculates the exit angle alpha of the exiting ray at the point F Fe The concrete formula is as follows:
Figure BDA0003633166180000038
s215, according to the emergent angle of the emergent ray at the point Fα Fe And determining the intersection point A of the light ray and the calibration plate to obtain the coordinate of the corresponding point A of any point A' on the CCD plane on the calibration plate.
Further, the specific manner of step S3 is as follows: select three or more feature points on a pixel on the CCD plane, form a back-traced light path based on the reverse mapping relation, and obtain the projection weight coefficient of each voxel from the volume fraction of the intersection between the back-traced light path and the voxel.
Further, the back-traced light path is formed as follows: the four vertices of any pixel on the CCD plane are selected as feature points, denoted H, I, J and K; four reverse rays are obtained according to the reverse mapping relation, and the back-traced light path is bounded by these four rays.
Further, the projection weight coefficient of a voxel is obtained from the intersection volume fraction as follows: find all voxels, on all voxel layers, intersected by the light path, and for each calculate the percentage of its volume intersected by the light path relative to the volume of a single voxel; this percentage is the voxel's projection weight coefficient.
Further, the back-traced light path, which is actually frustum-shaped, is converted into a cylindrical light path whose end-face area equals that of the frustum's end face.
Advantageous effects:
(1) The invention provides a reverse mapping cross-interface tomography method: a reverse ray tracing relation is first established, starting from a point on the projected pixel coordinate system, with the ray passing through the lens center to its unique corresponding point on a given voxel layer in the target to be measured; a reverse mapping relation is then established, the projection weight coefficient is calculated directly from the volume fraction of the intersection between the back-traced light path and each spherical voxel, and the point spread function is finally obtained.
(2) The invention establishes a two-dimensional pixel array coordinate system on the CCD plane and a three-dimensional voxel space coordinate system, and, for any point on the CCD plane, back-traces its unique corresponding point on a voxel layer in the three-dimensional voxel space coordinate system according to the principle of light propagation, realizing a one-to-one correspondence between points of the two-dimensional pixel array coordinate system and points of the three-dimensional voxel space coordinate system.
(3) Spherical voxel blocks are adopted; being isotropic, they match the characteristics of actually emitted light signals better than cubic voxel blocks, eliminating projection errors between cameras at different angles and improving measurement accuracy.
(4) The invention calculates the light propagation trajectory using the principle of light propagation and Snell's law and back-traces, for any point on the CCD plane, its unique corresponding point on a voxel layer in the three-dimensional voxel space coordinate system, solving the tomographic projection distortion caused by refraction of light at the inner and outer walls within the confined space of the cylinder sleeve.
(5) The invention selects three or more feature points on a pixel on the CCD plane and performs reverse ray tracing to form a back-traced light path. This light path contains all light signals the pixel can receive, and the voxels on all voxel layers intersecting the path are exactly those contributing signal projection to the pixel, so the projection weight coefficient can be calculated from the volume fraction. Computing the weight from the volume fraction requires no forward projection, i.e. no linear fitting by projection similarity to obtain the projected coordinates of each voxel-layer point on the CCD plane; the time-consuming mapping-relation conversion step is thus avoided, the calculation is simplified, and computing resources are saved.
(6) Using the four vertices of a pixel on the CCD plane as feature points reflects the true projection relation of the pixel's reverse mapping better than other choices of feature points.
(7) Approximating the back-traced light path as a cylinder makes the intersection volume of a cylinder and a sphere mathematically convenient to calculate, with little influence on the result.
Drawings
FIG. 1 is a schematic block diagram of a RMCICT imaging method flow;
FIG. 2 is a schematic diagram of a reverse ray tracing relationship on a calibration plate;
FIG. 3 is a schematic view of an optical path intersection model;
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
The embodiment provides a reverse mapping cross-interface tomography method, which is based on imaging of a target to be detected in a transparent hollow cylindrical cylinder sleeve on a camera outside the cylinder sleeve, wherein the cylinder sleeve is made of quartz.
Referring to fig. 1, the method comprises the steps of:
step S1: and establishing a pixel array coordinate system of camera imaging and a voxel space coordinate system in the cylinder sleeve. The concrete mode is as follows:
referring to fig. 2, the specific way to establish the coordinate system of the pixel array imaged by the camera is as follows: the camera is simplified into an imaging system consisting of a CCD plane (namely the camera plane in figure 2) comprising a plurality of pixel arrays with the same size and a convex lens (namely the lens in figure 2), a two-dimensional Cartesian coordinate system o-xz is defined by taking the central point of the CCD plane as an original point, the x direction is the transverse axis of the CCD plane, and the z direction is the longitudinal axis of the CCD plane;
referring to fig. 2 and 3, the method for establishing the voxel space coordinate system in the cylinder liner is as follows: dispersing a cylindrical space surrounded by the cylinder sleeve into more than two voxel layers, wherein each voxel layer is parallel to a meridian plane (namely a tangent plane passing through the axis of the cylindrical cylinder sleeve simultaneously) in the cylindrical space surrounded by the cylinder sleeve, and each voxel layer consists of a plurality of spherical voxel blocks (namely voxels) with the same size; a three-dimensional Cartesian coordinate system O-XYZ is defined by setting the center of a circle of the bottom surface of the cylinder liner as an origin O, wherein the direction perpendicular to the voxel layer is the Y direction, the central axis of the cylinder liner is the Z direction, and the X direction is perpendicular to the Y, Z direction.
It should be noted that each voxel layer is composed of several spherical voxel blocks (i.e. voxels) of identical size; spherical voxel blocks are isotropic and match the characteristics of light signals actually emitted by voxels better than cubic voxel blocks, eliminating projection errors between cameras at different angles and improving measurement accuracy.
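As an illustration of the voxel discretization in step S1, the following minimal Python sketch builds layers of spherical-voxel centers on planes of constant Y (parallel to a meridian plane), kept inside the cylinder. All function and variable names, and the exact layer spacing, are our own assumptions for illustration, not prescriptions from the patent.

```python
import math

def voxel_layers(radius, height, n_layers, voxel_d):
    """Discretize the cylinder interior into n_layers planes of constant Y,
    each holding a grid of spherical-voxel centers spaced one voxel
    diameter apart and kept inside the cylinder cross-section.
    Illustrative sketch; names and spacing are assumptions."""
    layers = []
    ys = [-radius + (k + 0.5) * 2 * radius / n_layers for k in range(n_layers)]
    for y in ys:
        # Half-width of the circular cross-section at this Y offset.
        half_w = math.sqrt(max(radius ** 2 - y ** 2, 0.0))
        centers = []
        x = -half_w + voxel_d / 2
        while x <= half_w - voxel_d / 2 + 1e-12:
            z = voxel_d / 2
            while z <= height - voxel_d / 2 + 1e-12:
                centers.append((x, y, z))
                z += voxel_d
            x += voxel_d
        layers.append(centers)
    return layers
```

Each returned layer is a list of (X, Y, Z) sphere centers; a real implementation would also store the voxel radius alongside each center for the later intersection test.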
Step S2: performing backward ray tracing, and establishing a backward mapping relation from a pixel array coordinate system to a voxel space coordinate system, which is specifically introduced as follows:
step S21: referring to fig. 2, the specific way to perform the backward ray tracing is as follows: and selecting any one voxel layer as a current voxel layer, transmitting a light ray which passes through the center of the lens to one point on the CCD plane, intersecting the point with the current voxel layer after passing through the outer surface and the inner surface of the cylinder sleeve, and calculating the coordinates of the point on the current voxel layer according to the theorem of light propagation and the snell's law to obtain the coordinates of the corresponding point of the point on the current voxel layer on the CCD plane. Specific examples are presented below:
s211: a calibration plate is taken as a target to be measured and placed on any voxel layer in the cylinder sleeve, any point A 'on a CCD plane is selected, a ray which passes through the center C of a lens and is emitted from the point A' is made to intersect with the outer surface of the cylinder sleeve at a point E, and the incident angle alpha of the ray at the point E is calculated Ei The concrete formula is as follows:
Figure BDA0003633166180000061
wherein the content of the first and second substances,
Figure BDA0003633166180000062
is the vector of the incident ray at point E,
Figure BDA0003633166180000063
is the normal vector at point E (in the direction of the cylinder axis of the adaptive cylinder liner);
s212, calculating the light at the point E according to the snell' S lawAngle of departure of the ray alpha Ee The concrete formula is as follows:
Figure BDA0003633166180000064
wherein n is air Is the refractive index of light in air, n quartz The refractive index of light in the cylinder sleeve material;
s213, according to the light ray emergence angle alpha Ee Determining the intersection point F of the light ray and the inner surface of the cylinder sleeve, and calculating the incident angle alpha of the incident light ray at the point F Fi The concrete formula is as follows:
Figure BDA0003633166180000065
wherein the content of the first and second substances,
Figure BDA0003633166180000066
is the vector of the incident ray at point F,
Figure BDA0003633166180000067
is the normal vector at point F;
s214, according to the incident angle alpha of the incident light at the point F Fi And the snell's law calculates the exit angle alpha of the exiting ray at the point F Fe The concrete formula is as follows:
Figure BDA0003633166180000068
s215, according to the emergent angle alpha of the emergent ray at the point F Fe And determining the intersection point A of the light ray and the calibration plate to obtain the coordinate of the corresponding point A of any point A' on the CCD plane on the calibration plate.
It should be noted that, referring to fig. 2, if the cylinder sleeve did not refract light, point A on the calibration plate would, according to the pinhole model, project to point A'' on the CCD plane; the distance between A' and A'' shows the projection distortion caused by refraction in the cylinder sleeve. Step S21 calculates the propagation trajectory of light using the principle of light propagation and Snell's law and, for any point on the CCD plane, back-traces its unique corresponding point on a voxel layer in the three-dimensional voxel space coordinate system, solving the tomographic projection distortion caused by refraction at the inner and outer walls within the confined space of the cylinder sleeve.
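The two refractions in steps S211 to S214 can be computed with the vector form of Snell's law, applied once at the outer wall and once at the inner wall. The sketch below is a generic refraction helper written for illustration (it is not code from the patent); the refractive index 1.458 used in the usage note is an assumed typical value for fused quartz.

```python
import math

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n pointing
    toward the incident side, per Snell's law (n1 sin i = n2 sin t) in
    vector form. Returns the refracted unit direction, or None when total
    internal reflection occurs. Generic sketch, not patent code."""
    eta = n1 / n2
    cos_i = -(d[0] * n[0] + d[1] * n[1] + d[2] * n[2])  # cos of incidence angle
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no transmitted ray
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * d[i] + (eta * cos_i - cos_t) * n[i] for i in range(3))
```

For a ray hitting the outer wall at 30 degrees incidence, `refract(d, n_E, 1.0, 1.458)` would give the direction inside the quartz; calling it again at point F with the indices swapped gives the direction inside the sleeve volume, matching the arcsin relations in S212 and S214.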
Step S22, the reverse mapping relation is established as follows: based on the reverse ray tracing of step S21, for the i-th point A_i' on the CCD plane, trace its corresponding point A_ji on the j-th voxel layer in the voxel space coordinate system, thereby establishing the reverse mapping relation from the pixel array coordinate system to the voxel space coordinate system.
Step S3, the projection weight coefficient of each voxel is obtained according to the reverse mapping relation of step S2, as follows: select three or more feature points on a pixel on the CCD plane, form a back-traced light path based on the reverse mapping relation, and obtain each voxel's projection weight coefficient from the volume fraction of the intersection between the back-traced light path and the voxel. A specific example follows:
s31: referring to fig. 3, four vertexes of any one pixel on the CCD plane are selected as feature points, which are points H, I, J and K, respectively, and according to the inverse mapping relationship, four inverse rays are obtained, and a Line-of-sight (LOS), that is, an inverse tracking light path, is formed with the four rays as boundaries.
It should be noted that the back-traced light path contains all optical signals that pixel HIJK can receive; the voxels on all voxel layers intersecting this light path are exactly the voxels contributing signal projection to pixel HIJK.
S32: find all voxels, on all voxel layers, intersected by the light path, and for each calculate the percentage of its volume intersected by the light path relative to the volume of a single voxel; this percentage is the voxel's projection weight coefficient;
further, when calculating the projection weight coefficient, the light path, actually frustum-shaped, is approximated by a cylindrical light path whose end-face area equals that of the frustum's end face.
It should be noted that, assuming a voxel emits its light signal uniformly, the larger the intersection volume of the voxel with the light path, i.e. the closer the sphere center lies to the back-traced light path, the greater the voxel's light-signal contribution to the pixel; the projection weight coefficient can therefore be calculated from the volume fraction. Approximating the light path as a cylinder makes the intersection volume of a cylinder and a sphere mathematically convenient to calculate.
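The cylinder-sphere volume fraction used as the projection weight in step S3 can be estimated, for illustration, by Monte Carlo sampling: draw points uniformly inside the spherical voxel and count those lying within the cylindrical light path. The names and the sampling approach below are our own; the patent itself does not prescribe a numerical method, and an analytic cylinder-sphere intersection could replace the sampling.

```python
import math
import random

def intersection_fraction(sphere_c, r_v, axis_p, axis_d, r_c, n=20000, seed=1):
    """Estimate the fraction of a spherical voxel's volume lying inside a
    cylindrical line of sight (axis through axis_p with unit direction
    axis_d, radius r_c). Monte Carlo sketch with assumed names."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        # Rejection-sample a point uniformly inside the sphere.
        while True:
            p = [rng.uniform(-r_v, r_v) for _ in range(3)]
            if p[0] ** 2 + p[1] ** 2 + p[2] ** 2 <= r_v ** 2:
                break
        q = [sphere_c[i] + p[i] for i in range(3)]
        # Squared distance from q to the cylinder axis.
        w = [q[i] - axis_p[i] for i in range(3)]
        t = sum(w[i] * axis_d[i] for i in range(3))
        d2 = sum((w[i] - t * axis_d[i]) ** 2 for i in range(3))
        inside += d2 <= r_c ** 2
    return inside / n
```

A voxel fully inside the light path yields a weight of 1, a disjoint voxel 0, and a voxel straddling the cylinder wall a value in between, which is exactly the percentage described in S32.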
Step S4: and based on the projection weight coefficient of each voxel in the step S3, establishing a point spread function matrix from the voxels on all the voxel layers to the CCD plane according to the reverse mapping relation, and realizing the imaging of the target to be measured placed in the cylinder sleeve on the CCD plane.
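The assembly of the point spread function matrix in step S4 can be sketched as a sparse table of per-pixel, per-voxel weights. In this illustration, `ray_for_pixel` and `frac` are hypothetical helpers standing in for the reverse mapping of step S2 and the volume-fraction computation of step S3; they are our assumptions, not interfaces defined by the patent.

```python
def build_psf(pixels, voxels, ray_for_pixel, frac):
    """Assemble the point-spread-function matrix W, where W[(i, j)] is the
    projection weight of voxel j on pixel i, computed directly from each
    pixel's reverse-traced line of sight. Sketch with assumed helpers:
    ray_for_pixel(pixel) -> cylindrical LOS, frac(los, voxel) -> fraction."""
    W = {}
    for i, px in enumerate(pixels):
        los = ray_for_pixel(px)      # back-traced cylindrical light path
        for j, vox in enumerate(voxels):
            w = frac(los, vox)       # intersected volume / voxel volume
            if w > 0.0:
                W[(i, j)] = w        # store sparsely: most weights are zero
    return W
```

Storing only nonzero entries matters in practice, since each pixel's line of sight intersects only a small band of voxels on each layer.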
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. A reverse mapping cross-interface tomography method, characterized in that, based on imaging, on a camera outside a transparent hollow cylindrical cylinder sleeve, of a target to be measured inside the sleeve, it comprises the following steps:
step S1: establishing a pixel array coordinate system of camera imaging and a voxel space coordinate system in a cylinder sleeve;
step S2: carrying out reverse ray tracing, and establishing a reverse mapping relation from a pixel array coordinate system to a voxel space coordinate system;
step S3, obtaining the projection weight coefficient of each voxel according to the inverse mapping relation of step S2;
step S4: and based on the projection weight coefficient of each voxel in the step S3, establishing a point spread function matrix from the voxels on all the voxel layers to the CCD plane according to the reverse mapping relation, and realizing the imaging of the target to be measured placed in the cylinder sleeve on the CCD plane.
2. The method of claim 1, wherein the step S1 is performed by:
the method for establishing the pixel array coordinate system of the camera imaging comprises the following steps: the camera is simplified into an imaging system consisting of a CCD plane and a convex lens, wherein the CCD plane comprises a plurality of pixel arrays with the same size, a two-dimensional Cartesian coordinate system o-xz is defined by taking the central point of the CCD plane as an original point, the x direction is the transverse axis of the CCD plane, and the z direction is the longitudinal axis of the CCD plane;
the mode for establishing the voxel space coordinate system in the cylinder liner is as follows: dispersing a cylindrical space surrounded by the cylinder sleeve into more than two voxel layers, wherein each voxel layer is parallel to a meridian plane in the cylindrical space surrounded by the cylinder sleeve, and each voxel layer consists of a plurality of voxel blocks with the same size; a three-dimensional Cartesian coordinate system O-XYZ is defined by setting the center of a circle of the bottom surface of the cylinder liner as an origin O, wherein the direction perpendicular to the voxel layer is the Y direction, the central axis of the cylinder liner is the Z direction, and the X direction is perpendicular to the Y, Z direction.
3. The method of reverse mapping cross-interface tomography of claim 2, wherein the voxels are spherical.
4. The method of any one of claims 1-3, wherein the step S2 is performed in a specific manner:
step S21: performing reverse ray tracing: select any voxel layer as the current voxel layer; emit a ray from a point on the CCD plane through the lens center; after refraction at the outer and inner surfaces of the cylinder sleeve, the ray intersects the current voxel layer; calculate the coordinates of the intersection point according to the law of rectilinear propagation of light and Snell's law, obtaining the coordinates of the point on the current voxel layer corresponding to the point on the CCD plane;
step S22, establishing a reverse mapping relation: based on the reverse ray tracing of step S21, for the i-th point A_i' on the CCD plane, trace its corresponding point A_ji on the j-th voxel layer in the voxel space coordinate system, thereby establishing the reverse mapping relation from the pixel array coordinate system to the voxel space coordinate system.
5. The reverse mapping cross-interface tomography method of claim 4, wherein step S21 further comprises:
s211: a calibration plate is taken as a target to be measured and placed on any voxel layer in the cylinder sleeve, any point A 'on a CCD plane is selected, a ray which passes through the center C of a lens and is emitted from the point A' is made to intersect with the outer surface of the cylinder sleeve at a point E, and the incident angle alpha of the ray at the point E is calculated Ei The concrete formula is as follows:
Figure FDA0003633166170000021
wherein the content of the first and second substances,
Figure FDA0003633166170000022
is the vector of the incident ray at point E,
Figure FDA0003633166170000023
is the normal vector at point E;
S212: the exit angle α_Ee of the ray at point E is calculated according to Snell's law:

sin α_Ee = (n_air / n_quartz) sin α_Ei

wherein n_air is the refractive index of light in air and n_quartz is the refractive index of light in the cylinder liner material;
S213: from the exit angle α_Ee, the intersection point F of the ray with the inner surface of the cylinder liner is determined, and the incident angle α_Fi of the ray at F is calculated by:

cos α_Fi = (l_F · n_F) / (|l_F| |n_F|)

wherein l_F is the vector of the incident ray at point F and n_F is the normal vector at point F;
S214: from the incident angle α_Fi at point F and Snell's law, the exit angle α_Fe of the outgoing ray at F is calculated:

sin α_Fe = (n_quartz / n_air) sin α_Fi;
S215: from the exit angle α_Fe of the outgoing ray at point F, the intersection point A of the ray with the calibration plate is determined, thereby obtaining the coordinates of the point A on the calibration plate corresponding to any point A′ on the CCD plane.
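A numerical sketch of steps S211 to S215 in vector form, refracting the reverse ray at the outer surface (point E) and the inner surface (point F) of the liner. Both surfaces are modeled as coaxial cylinders about the Z axis; the radii, camera position, and the quartz index 1.46 are illustrative assumptions.

```python
import numpy as np

N_AIR, N_QUARTZ = 1.0, 1.46   # assumed indices: air and a fused-quartz liner

def refract(d, n, n1, n2):
    """Vector form of Snell's law: refract unit direction d at a surface
    with unit normal n (oriented so that d . n < 0), from medium n1 into
    n2.  Total internal reflection is assumed not to occur."""
    r = n1 / n2
    cos_i = -d @ n
    cos_t = np.sqrt(1.0 - r * r * (1.0 - cos_i ** 2))
    return r * d + (r * cos_i - cos_t) * n

def hit_cylinder(o, d, R):
    """Nearest forward intersection of the ray o + t*d with the coaxial
    cylinder x^2 + y^2 = R^2 (the liner axis is Z); the ray is assumed
    to hit the surface."""
    a = d[0] ** 2 + d[1] ** 2
    b = 2.0 * (o[0] * d[0] + o[1] * d[1])
    c = o[0] ** 2 + o[1] ** 2 - R * R
    disc = np.sqrt(b * b - 4.0 * a * c)
    t = min(t for t in ((-b - disc) / (2 * a), (-b + disc) / (2 * a)) if t > 1e-9)
    return o + t * d

def back_trace(a_prime, lens_c, r_out, r_in, y_layer):
    """S211-S215: reverse ray from CCD point A' through lens center C,
    refracted at E (outer surface) and F (inner surface), continued to
    the voxel layer in the plane y = y_layer."""
    d = lens_c - a_prime
    d = d / np.linalg.norm(d)
    E = hit_cylinder(lens_c, d, r_out)             # S211: entry point E
    n_E = np.array([E[0], E[1], 0.0]) / r_out      # outward radial normal
    d = refract(d, n_E, N_AIR, N_QUARTZ)           # S212: air -> quartz
    F = hit_cylinder(E, d, r_in)                   # S213: inner-surface point F
    n_F = np.array([F[0], F[1], 0.0]) / r_in
    d = refract(d, n_F, N_QUARTZ, N_AIR)           # S214: quartz -> air
    t = (y_layer - F[1]) / d[1]                    # S215: reach the voxel layer
    return F + t * d
```

A quick sanity check: a ray aimed straight at the liner axis meets both surfaces at normal incidence, so refraction leaves it unchanged and the traced point is the straight-line continuation.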
6. The method of any one of claims 1 to 3, wherein step S3 is implemented as follows: three or more feature points are selected from a pixel on the CCD plane; a back-traced optical path is formed on the basis of the reverse mapping relationship; and the projection weight coefficient of each voxel is obtained from the volume fraction of the voxel intersected by the back-traced optical path.
7. The method of claim 6, wherein the four vertices of any pixel on the CCD plane, denoted H, I, J and K, are selected as the feature points; four reverse rays are obtained according to the reverse mapping relationship, and these four rays serve as the boundaries of the back-traced optical path.
8. The method of claim 6, wherein the projection weight coefficient of a voxel is obtained from the intersected volume fraction as follows: all voxels on all voxel layers intersected by the optical path are found, and for each such voxel the percentage of its volume intersected by the optical path, relative to the volume of a single voxel, is calculated; this percentage is the projection weight coefficient of that voxel.
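The intersected-volume-fraction weight of claim 8 can be estimated by random sampling instead of exact polyhedron-voxel clipping. As a simplification (in the spirit of claim 9), the back-traced path is modeled here as a cylinder of radius `r` about a central ray; the sampling approach and all numbers are illustrative assumptions.

```python
import numpy as np

def voxel_weight(center, size, ray_o, ray_d, r, n=20_000, seed=0):
    """Monte-Carlo estimate of the claim-8 weight: the fraction of a cubic
    voxel's volume that lies inside a cylindrical back-traced path of
    radius r about the central ray ray_o + t*ray_d (ray_d a unit vector)."""
    rng = np.random.default_rng(seed)
    pts = center + (rng.random((n, 3)) - 0.5) * size    # uniform samples in the voxel
    rel = pts - ray_o
    axial = rel @ ray_d                                 # component along the ray
    dist2 = np.sum(rel * rel, axis=1) - axial ** 2      # squared distance to the ray
    return float(np.mean(dist2 <= r * r))
```

For a unit voxel centered on the ray, a path radius of 0.5 should give a weight near pi/4 ≈ 0.785, the analytic volume fraction of the inscribed cylinder, which provides a convenient accuracy check for the sample count.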
9. The reverse-mapping cross-interface tomography method of claim 7, wherein the back-traced optical path, which is approximately prismoid in shape, is converted into a cylindrical optical path whose end-face area is equal to the cross-sectional area of the prismoid.
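For the claim-9 conversion, the equal-area condition fixes the cylinder radius as r = sqrt(A/pi), with A the cross-sectional area of the prismoid path. A sketch using the shoelace formula for a quadrilateral section (the unit-square section below is purely illustrative):

```python
import numpy as np

def equivalent_radius(quad):
    """Claim-9 conversion: replace the approximately prismoid back-traced
    path by a cylinder whose cross-sectional area equals that of the
    quadrilateral section (vertices ordered around the polygon)."""
    x, y = quad[:, 0], quad[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))  # shoelace
    return np.sqrt(area / np.pi)

# Example: a unit-square section maps to a cylinder of radius sqrt(1/pi).
r = equivalent_radius(np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float))
```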
CN202210497153.6A 2022-05-09 2022-05-09 Reverse mapping cross-interface tomography method Pending CN114972614A (en)

Publications (1)

Publication Number Publication Date
CN114972614A 2022-08-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination