CN116539158A - Image plane matching method and device for prism light-splitting multispectral camera sensor - Google Patents
- Publication number: CN116539158A
- Application number: CN202310566986.8A
- Authority: CN (China)
- Legal status: Pending
Classifications
- G01J3/2823—Investigating the spectrum; Imaging spectrometer
- G01J3/12—Generating the spectrum; Monochromators
- G06T17/00—Three-dimensional [3D] modelling, e.g. data description of 3D objects
- G01J2003/1208—Prism and grating
- G01J2003/2826—Multispectral imaging, e.g. filter imaging
- Y02A40/10—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
Abstract
The invention discloses an image plane matching method and device for the sensors of a prism light-splitting multispectral camera. A projection device illuminates each sensor through the beam-splitting prism, and the light reflected back to a camera forms a phase deflectometry setup from which the surface of each sensor is three-dimensionally reconstructed. The projection device contains light sources of several wavebands, so three-dimensional information can be obtained separately for the sensors at the different exit faces of the prism. The prism bonding support is adjusted automatically by a motorized translation stage and a motorized angular stage, so the spatial position of each sensor can be corrected in real time during prism bonding, realizing image plane matching calibration of the multiple image sensors. Because phase deflectometry is used to reconstruct the sensor surface in three dimensions, the influence of lateral chromatic aberration and distortion is avoided, ensuring accurate matching of the sensors during bonding.
Description
Technical Field
The invention relates to the technical field of prism light-splitting multispectral camera imaging, and in particular to an image plane matching method and device for the sensors of such a camera.
Background
A prism light-splitting multispectral camera uses dichroic film layers inside a multispectral prism to spatially separate light of different wavebands onto several independent monochrome sensors, which image simultaneously with precise multispectral spatial alignment. Position deviations introduced while bonding the sensors to the prism cause aberrations and spatial misalignment, degrading the image matching between sensors. The invention provides a method that three-dimensionally reconstructs the sensor surface by phase deflectometry to assist the image plane matching calibration of multiple image sensors, achieving accurate matching of their spatial position and angular deviations and ensuring that every pixel of each spectral image is consistent in spatial position and imaging sharpness.
A prism light-splitting multispectral camera must synthesize images shot under different spectra, which requires those images to be spatially aligned. The spatial position of each sensor on the transmission and reflection faces of the multispectral prism must therefore be adjusted during the sensor-to-prism bonding stage, ensuring consistency of spatial position and imaging sharpness across the images of different spectra and reducing the algorithmic load of back-end image fusion. Existing multi-sensor registration methods introduce a lens and a target: each sensor photographs the pattern on the target through the lens, and the mounting position of the sensor is adjusted according to the sharpness differences at different positions.
In summary, existing image plane matching calibration methods for multiple image sensors have the following disadvantages:
(1) The rotations about the X and Y axes must be adjusted manually; the degree of automation is insufficient and the manual adjustment precision is low.
(2) An ordinary objective lens suffers from lateral chromatic aberration and distortion, which disturb the adjustment of the X- and Y-axis offsets and of the three angles; the X, Y and angular deviations judged from images shot through such a lens are unreliable, easily causing rework and low efficiency.
Disclosure of Invention
The invention provides an image plane matching method and device for the sensors of a prism light-splitting multispectral camera, which can solve at least one of the above technical problems.
In order to achieve the above purpose, the present invention proposes the following technical solutions:
an image plane matching method for the sensors of a prism light-splitting multispectral camera, the camera using a multi-beam-splitter prism to split light of different wavebands onto a plurality of corresponding image sensors, the method matching the positional relationship of these image sensors so that the spatial position and sharpness of the object information they acquire are consistent, the method comprising:
fixing a preset reference image sensor at its corresponding position on the multi-beam-splitter prism, and pre-fixing each image sensor to be matched at its corresponding position on the prism;
projecting grating fringes with light of each waveband in turn through the multi-beam-splitter prism onto the corresponding image sensor, collecting with a camera the grating fringe patterns reflected by each image sensor, and three-dimensionally reconstructing the surface of each image sensor by phase deflectometry from the collected fringe patterns;
obtaining the position deviation between each image sensor to be matched and the reference image sensor from the three-dimensional reconstruction results, and initially adjusting the position of each image sensor to be matched according to this deviation;
mounting a telecentric lens on the multispectral camera to collect resolution test target images, obtaining the position deviation between each image sensor to be matched and the reference image sensor from the target images collected by each sensor, and finely adjusting the position of each image sensor to be matched according to this deviation.
Further, projecting the grating fringes with light of different wavebands through the multi-beam-splitter prism onto the corresponding image sensors, and collecting with the camera the grating fringe patterns reflected by each image sensor, comprises:
projecting crossed (criss-cross) grating fringes into the multi-beam-splitter prism with light of the waveband corresponding to the reference image sensor, so that after splitting the fringes reach the reference image sensor; the fringes reflected by the reference image sensor pass back through the prism and are collected by the camera, yielding a grating fringe pattern;
repeating the above steps with light of the wavebands corresponding to each of the other image sensors, so as to obtain a grating fringe pattern reflected by every image sensor.
Further, the three-dimensional reconstruction of each image sensor surface by phase deflectometry from the camera-acquired grating fringe patterns comprises:
performing a Fourier transform on the grating fringe pattern in the transverse and longitudinal directions respectively;
filtering in each direction to extract the fundamental-frequency signal, then performing an inverse Fourier transform to obtain the phase maps in the transverse and longitudinal directions;
unwrapping the phase of each pixel of the three-dimensional phase field along the time axis, and, using the known camera intrinsics and the position of the fringe projection source, obtaining the three-dimensional surface shape of the sensor through a height reconstruction algorithm based on radial-basis-function integration, completing the three-dimensional reconstruction.
Further, obtaining the position deviation between each image sensor to be matched and the reference image sensor from the three-dimensional reconstruction results, and initially adjusting the position of each image sensor to be matched according to this deviation, comprises:
matching feature points between the three-dimensional surface shape of each image sensor to be matched and that of the reference image sensor to obtain corresponding matched feature point sets;
calculating the rotation vector and translation vector between the feature point set of each image sensor to be matched and that of the reference image sensor to obtain the position deviation, which comprises the offsets X1, Y1, Z1 along the X, Y, Z axes and the angular deviations θ(X1), θ(Y1), θ(Z1) about the X, Y, Z axes between each image sensor to be matched and the reference image sensor;
initially adjusting the position of each image sensor according to this position deviation.
Further, mounting a telecentric lens on the multispectral camera to collect resolution test target images, obtaining the position deviation between each image sensor to be matched and the reference image sensor from the target images collected by each sensor, and finely adjusting the position of each image sensor to be matched according to this deviation, comprises:
collecting the same resolution test target image a plurality of times with the telecentric lens mounted on the multispectral camera;
for each resolution test target image acquired by each image sensor, performing the following steps:
matching feature points between the resolution test target image acquired by each image sensor to be matched and the one acquired by the reference image sensor;
calculating the rotation vector and translation vector between the feature point set of the target image acquired by each image sensor to be matched and that of the image acquired by the reference image sensor, to obtain the position deviation, which comprises the offsets X2, Y2 along the X, Y axes and the rotation angle θ(Z2) about the Z axis;
averaging the offsets X2, Y2 and the rotation angle θ(Z2) obtained from the repeated acquisitions by each image sensor to obtain the average position deviation;
finely adjusting the position of each image sensor to be matched according to the average position deviation.
Further, finely adjusting the position of each image sensor to be matched according to the average position deviation comprises:
judging whether the average of each component among the offsets X2, Y2 and the rotation angle θ(Z2) exceeds its set compensation threshold; if the average of a component is smaller than the corresponding threshold, that component is not finely adjusted;
if the average of a component is not smaller than the corresponding threshold, that component is finely adjusted; after the adjustment, the steps of repeatedly collecting the same resolution test target image through the telecentric lens and computing the average position deviation are repeated, until the averages of the offsets X2, Y2 and of the rotation angle θ(Z2) are all smaller than their set compensation thresholds.
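The measure-compare-correct loop described above can be sketched as follows. Here `measure` and `move` are hypothetical stand-ins for the target-image deviation estimate and the motorized stages, and the threshold and noise values are illustrative only:

```python
import numpy as np

def fine_adjust(measure, move, thresholds, max_iter=20):
    """Repeat measure -> compare -> correct until every averaged
    deviation (X2, Y2, theta_Z2) falls below its threshold."""
    for i in range(max_iter):
        dev = np.asarray(measure())           # averaged (X2, Y2, thZ2)
        if np.all(np.abs(dev) < thresholds):  # all within tolerance
            return i, dev
        # Correct only the components that exceed their threshold.
        correction = np.where(np.abs(dev) >= thresholds, -dev, 0.0)
        move(correction)
    return max_iter, dev

# Simulated stage: a latent deviation that move() reduces; measure()
# averages several noisy readings, like the repeated target captures.
state = np.array([3.0, -2.0, 0.4])            # um, um, deg (made up)
rng = np.random.default_rng(1)

def measure(n=5):
    return np.mean([state + rng.normal(0, 0.005, 3) for _ in range(n)],
                   axis=0)

def move(delta):
    global state
    state += delta

iters, final = fine_adjust(measure, move,
                           thresholds=np.array([0.05, 0.05, 0.05]))
print(iters, final)
```

Averaging several captures before deciding, as the patent does, keeps a single noisy reading from triggering a needless correction.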
On the other hand, the invention also provides an image plane matching device for the sensors of a prism light-splitting multispectral camera. The multispectral camera uses a multi-beam-splitter prism to split light of different wavebands onto a plurality of corresponding image sensors, and the device matches the positional relationship of these image sensors so that the spatial position and sharpness of the object information they acquire are consistent. The device comprises:
a six-degree-of-freedom adjustment stage for adjusting the position of an image sensor to be matched, the image sensor to be matched being pre-fixed at its corresponding position on the multi-beam-splitter prism and the reference image sensor being fixed on the prism;
an XYZ scanning detection stage for adjusting the positions of the projection device, the camera, the telecentric lens and the resolution test target;
the XYZ scanning detection stage positions the projection device and the camera so that the projection device projects grating fringes of each waveband in turn through the multi-beam-splitter prism onto the corresponding image sensor; the camera collects the grating fringe patterns reflected by each image sensor; the surface of each image sensor is three-dimensionally reconstructed by phase deflectometry from the collected fringe patterns, and the position deviation between each image sensor to be matched and the reference image sensor is obtained from the reconstruction results; the six-degree-of-freedom adjustment stage initially adjusts the position of each image sensor to be matched according to this deviation;
the XYZ scanning detection stage then positions the telecentric lens and the resolution test target so that the target image projected by the resolution test target passes through the telecentric lens to the multi-beam-splitter prism and is collected by each image sensor; the position deviation between each image sensor to be matched and the reference image sensor is obtained from the plurality of target images collected by each sensor; and the six-degree-of-freedom adjustment stage finely adjusts the position of each image sensor to be matched according to this deviation.
Further, the six-degree-of-freedom adjustment stage comprises:
an XYZ-axis translation stage for adjusting the X-, Y- and Z-axis offsets between the image sensor to be matched and the reference image sensor;
an XYZ-axis angular stage for adjusting the angular deviations θ(X1), θ(Y1), θ(Z1) about the X, Y, Z axes between the image sensor to be matched and the reference image sensor.
Further, the XYZ scanning detection stage comprises:
an XYZ-axis displacement stage connected to an adapter plate;
the projection device, the camera, the telecentric lens and the resolution test target mounted on the adapter plate;
the XYZ-axis displacement stage adjusting the positions of the projection device, camera, telecentric lens and resolution test target along the X, Y, Z axes.
Further, the device further comprises:
a prism bonding structure comprising a prism fixing frame and sensor bonding plates, the prism fixing frame fixing the multi-beam-splitter prism, each bonding plate carrying an image sensor on one side and connecting to the prism fixing frame on the other;
the bonding plate carrying the reference image sensor is fixed directly at its corresponding position on the prism fixing frame, so that after splitting the received light the prism directs the appropriate waveband onto the reference image sensor;
the bonding plate carrying an image sensor to be matched is pre-fixed at its corresponding position on the prism fixing frame, so that after splitting the received light the prism directs the appropriate waveband onto the image sensor to be matched;
the other side of the bonding plate carrying the image sensor to be matched is connected to the six-degree-of-freedom adjustment stage, which drives the adjustment of that sensor's position;
after the position adjustment of the image sensor to be matched is completed, its bonding plate is fixed to the prism fixing frame.
The beneficial effects of the invention are as follows:
(1) Because phase deflectometry is used to reconstruct the sensor surface in three dimensions, the influence of lateral chromatic aberration and distortion is avoided, ensuring accurate matching of the sensors during bonding.
(2) Through the two stages of initial adjustment and fine adjustment of the position of each image sensor to be matched, the matching precision reaches sub-pixel level.
Drawings
FIG. 1 is a schematic diagram of a sensor image plane matching device of a prism-splitting multispectral camera of the present invention;
FIG. 2 is a schematic diagram of a six degree-of-freedom adjustment stage in an embodiment of the invention;
FIG. 3 is a schematic diagram of an XYZ scanning inspection station in an embodiment of the invention;
FIG. 4 is a schematic view of a prism bonding structure in an embodiment of the present invention;
FIG. 5 is a schematic view of a prism-securing frame according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a process of collecting grating fringe patterns by a projection device and a camera in an embodiment of the invention;
FIG. 7 is a flow chart of three-dimensional reconstruction of image sensor surface information in an embodiment of the present invention;
FIG. 8 is a schematic diagram illustrating the slope calculation of a grating fringe pattern in an embodiment of the invention;
FIG. 9 is a schematic diagram of the position of an ultraviolet lamp illuminating a reference image sensor in an embodiment of the invention;
FIG. 10 is a schematic diagram of an ultraviolet lamp illuminating the location of an image sensor to be matched in an embodiment of the present invention;
FIG. 11 is a flowchart of the operation of the image plane matching device of the prism-splitting multispectral camera sensor according to the embodiment of the invention.
In the figures: 1-air-floating foot pad; 2-marble base; 3-mounting platform; 4-XYZ-axis translation stage; 5-XYZ-axis angular stage; 6-XYZ scanning detection stage; 7-projection device; 8-camera; 9-telecentric lens; 10-resolution test target; 11-prism bonding structure; 11a-prism fixing frame; 11b-sensor bonding plate; 12-six-degree-of-freedom adjustment stage.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention.
This embodiment first provides an image plane matching device for the sensors of a prism light-splitting multispectral camera, applied here to the image sensors of a two-channel (dichroic) spectral camera. It will be understood that when the two-way splitting prism of such a camera is extended to a multispectral prism, the device can realize image plane matching of the multiple image sensors of a multispectral camera with only straightforward modifications obvious to a person skilled in the art.
The image plane matching device of this embodiment is shown in fig. 1. It comprises a base assembly composed of a mounting platform 3, a marble base 2 and air-floating foot pads 1, which carries the other components. Laboratory vibration is unavoidable and easily causes image shift during capture; the high stability of the marble base 2 and the good vibration isolation of the air-floating foot pads 1 together greatly reduce this environmental influence.
The six-degree-of-freedom adjustment stage 12 and the XYZ scanning detection stage 6 are each mounted on the mounting platform 3. The six-degree-of-freedom adjustment stage 12, shown in fig. 2, comprises an XYZ-axis translation stage 4 providing translation along the X, Y, Z axes and an XYZ-axis angular stage 5 providing rotation about the X, Y, Z axes. The XYZ scanning detection stage 6 carries a projection device 7, a camera 8, a telecentric lens 9 and a resolution test target 10, and adjusts their positions along the X, Y, Z axes, as shown in fig. 3. The XYZ scanning detection stage 6 positions the projection device 7 and the camera 8 so that the projection device 7 projects grating fringes through the splitting prism onto the corresponding image sensors and the camera 8 collects the fringe patterns they reflect; it can also retract the projection device 7 and camera 8 and instead position the telecentric lens 9 and the resolution test target 10, so that the target image projected by the resolution test target 10 passes through the telecentric lens 9 to the multi-beam-splitter prism and is collected by each image sensor. The resolution test target carries several groups of line pairs of equal line spacing and line width; this embodiment preferably uses the USAF 1951 resolution test target.
A prism bonding structure 11 is also mounted on the mounting platform 3. As shown in fig. 4, it comprises a prism fixing frame 11a, which fixes the multi-beam-splitter prism, and sensor bonding plates 11b; as shown in fig. 5, each bonding plate 11b carries an image sensor on one side and connects to the prism fixing frame 11a on the other. In this embodiment there are two image sensors: one reference image sensor and one image sensor to be matched. The two sensor bonding plates 11b mount the two image sensors at the positions corresponding to the multi-beam-splitter prism, i.e. the positions at which the prism, after splitting the received light, directs light of different wavebands onto the reference image sensor and the image sensor to be matched respectively. During matching, the bonding plate carrying the reference image sensor is fixed directly at its position on the prism fixing frame 11a, while the bonding plate carrying the image sensor to be matched is pre-fixed at its position; pre-fixing means the sensor is held temporarily on the frame 11a but its position can still be changed by an external force. The other side of this bonding plate is connected to the six-degree-of-freedom adjustment stage 12, which drives the position adjustment of the image sensor to be matched; once the adjustment is complete, the bonding plate carrying the image sensor to be matched is fixed to the prism fixing frame 11a.
The image plane matching method of the prism splitting multispectral camera sensor of the embodiment comprises the following steps:
step 100, mounting a bipartite prism and an image sensor, including:
step 110, the bipartite prism is mounted in the prism mount 11a, and the prism mount 11a is mounted on the mounting platform 3.
Step 120, the reference image sensor and the image sensor to be matched are connected with an adhesive plate, the adhesive plate with the reference image sensor is directly fixed on the prism fixing frame 11a, the adhesive plate with the image sensor to be matched is pre-fixed on the prism frame, and one end of the adhesive plate is connected with the six-degree-of-freedom adjusting table 12.
130, moving an XY axis of an XYZ scanning detection platform to ensure that a projection device 7 can project onto a sensor through a bipartite prism, and a camera 8 can acquire an image reflected by the image sensor; then moving the Z axis of the XYZ scanning detection platform to ensure that the image acquired by the reference image sensor is clear, and then moving the Z axis of the six-degree-of-freedom adjustment table 12 by a program to ensure that the image acquired by the image sensor to be matched is clear; the positions of XY and two Z axes are manually determined in the first test, the positions at the moment are recorded, and the subsequent test can be directly moved to the corresponding positions by a program without manual re-determination.
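The patent does not specify the focus criterion behind "until the image is sharp"; a common, hypothetical choice is a variance-of-Laplacian focus metric, sketched here:

```python
import numpy as np

def sharpness(img):
    """Variance-of-Laplacian focus metric: larger means sharper.
    A hypothetical criterion for the 'move Z until the image is
    sharp' step; the patent does not name the metric it uses."""
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return float(lap.var())

# A focused (high-contrast) pattern scores higher than a blurred one.
y, x = np.mgrid[0:64, 0:64]
focused = np.sin(x / 2.0) * np.sin(y / 2.0)
blurred = focused.copy()
for _ in range(5):   # crude blur by repeated neighbour averaging
    blurred = 0.25 * (np.roll(blurred, 1, 0) + np.roll(blurred, -1, 0)
                      + np.roll(blurred, 1, 1) + np.roll(blurred, -1, 1))
print(sharpness(focused), sharpness(blurred))
```

A Z-axis sweep would then simply pick the position that maximizes this score.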
Step 200, performing three-dimensional reconstruction on the sensor surface by using a phase deflection technique, and initially adjusting the position of the image sensor to be matched, wherein the method specifically comprises the following steps:
step 210, system calibration: the system calibration is carried out on the camera 8 and the projection device 7, including the calibration of the internal parameters of the camera 8 and the calibration of the spatial position of the projection device 7, the calibration operation is carried out in the first operation, and the repeated operation is not needed after the calibration result is saved.
Step 220, a projection device 7 comprises two light sources of a spectrum band A and a spectrum band B; the spectrum band A of the light source of the projector 7 is firstly turned on, the projector 7 projects the grating fringes onto the bipartite prism, the dichromatic film layer can divide the spectrum band A onto the reference image sensor, and the camera 8 collects the grating fringe pattern reflected by the reference image sensor through the bipartite prism, as shown in fig. 6.
Step 230: phase extraction is performed on the acquired cross-grating fringe pattern. The method adopted in this embodiment is Fourier transform profilometry: the projection device 7 projects cross fringes, the camera 8 collects the fringes reflected by the surface of the reference image sensor, a Fourier transform is applied along each of the X and Y directions of the fringe pattern, the fundamental-frequency signal is isolated by filtering, an inverse Fourier transform is applied, and the phase map is computed. The flow of these steps is shown in fig. 7.
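The Fourier-transform-profilometry step can be sketched for one fringe direction as follows; for cross fringes the same routine runs once per axis (transpose the image for the orthogonal direction). The parameter names (`carrier`, `half_width`) and the simple rectangular band-pass are assumptions, not details from the patent:

```python
import numpy as np

def ftp_phase(fringe, carrier, half_width):
    """Fourier transform profilometry along axis 1 (sketch).

    `fringe`    : 2-D fringe image, carrier running along axis 1
    `carrier`   : expected fringe frequency in cycles per image width
    `half_width`: half-width of the band-pass around the carrier
    Returns the wrapped phase in (-pi, pi].
    """
    F = np.fft.fft(fringe, axis=1)
    freqs = np.fft.fftfreq(fringe.shape[1]) * fringe.shape[1]
    # keep only the +carrier lobe (fundamental); suppress DC and -carrier
    mask = (np.abs(freqs - carrier) <= half_width).astype(float)
    fundamental = np.fft.ifft(F * mask[None, :], axis=1)
    return np.angle(fundamental)
```

A deformed fringe pattern fed through this routine yields the wrapped phase that step 240 subsequently unwraps.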
In step 240, the phase extracted from the deformed grating fringe pattern by the phase-extraction algorithm is wrapped (the so-called wrapped phase) and must be unwrapped into a continuous phase by a phase-unwrapping algorithm. The invention adopts a temporal phase-unwrapping method: the three-dimensional phase fields obtained from several groups of grating fringe patterns of different frequencies are unwrapped pixel by pixel along the time-axis direction, and the slope is then calculated using the calibration results for the spatial positions of the camera 8 and the projection device 7, where the slope is the gradient of the sensor surface, as shown in fig. 8.
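A common multi-frequency temporal unwrapping scheme consistent with step 240 (assumed here, since the patent does not spell out the recursion) scales each coarser continuous phase up to predict the fringe order of the next wrapped phase:

```python
import numpy as np

def temporal_unwrap(wrapped, freqs):
    """Hierarchical temporal phase unwrapping (sketch).

    `wrapped`: list of wrapped phase maps, ordered low -> high frequency
    `freqs`  : matching fringe frequencies; the lowest must span less
               than one full period over the field of view.
    Returns the continuous phase at the highest frequency.
    """
    # lowest-frequency phase covers under one period, so mapping it
    # to [0, 2*pi) already makes it continuous
    unwrapped = np.mod(wrapped[0], 2 * np.pi)
    for prev_f, f, phi in zip(freqs, freqs[1:], wrapped[1:]):
        predicted = unwrapped * (f / prev_f)     # scale up coarse phase
        k = np.round((predicted - phi) / (2 * np.pi))
        unwrapped = phi + 2 * np.pi * k          # snap to nearest order
    return unwrapped
```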
Step 250: from the obtained slope information, the three-dimensional surface shape of the reference image sensor is obtained through a height-reconstruction algorithm.
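The height reconstruction of step 250 integrates the two slope maps into a surface. Claim 3 names a radial-basis-function integrator; purely as a compact stand-in, a Fourier-domain (Frankot-Chellappa) least-squares integration on a periodic grid is sketched below:

```python
import numpy as np

def integrate_slopes(p, q):
    """Frankot-Chellappa least-squares slope integration (sketch).

    `p`, `q`: gradient maps dz/dx and dz/dy on a periodic grid.
    Returns a height map up to an additive constant. Substituted
    here for the patent's radial-basis-function integrator purely
    as an illustration.
    """
    ny, nx = p.shape
    wx = np.fft.fftfreq(nx) * 2 * np.pi          # angular freqs, x
    wy = np.fft.fftfreq(ny) * 2 * np.pi          # angular freqs, y
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                            # avoid divide-by-zero at DC
    Z = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                                # mean height is arbitrary
    return np.real(np.fft.ifft2(Z))
```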
Step 260: after the three-dimensional surface shape of the reference image sensor is obtained, the light source of the projection device 7 is switched to spectral band B, and steps 230, 240 and 250 are repeated to obtain the three-dimensional surface shape of the image sensor to be matched.
Step 270: the spatial position of the image sensor to be matched is calibrated against the reference image sensor as the reference target, specifically including:
step 271: feature-point matching is performed between the surface shapes of the reference image sensor and the image sensor to be matched, forming matched feature-point sets {Pr1} and {Pt1}; the rotation and translation vectors between the two point sets are calculated to obtain the deviations X1, Y1 and Z1 along the XYZ axes and the angular offsets theta(X1), theta(Y1) and theta(Z1) about the XYZ directions.
Step 272: the six-degree-of-freedom adjustment stage 12 automatically adjusts the spatial position of the image sensor to be matched according to the obtained rotation and translation parameters.
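Steps 271 and 272 estimate a rigid transform between matched point sets and feed it to the stage. A standard Kabsch/SVD solution is sketched below; the patent does not specify the solver, so this implementation is an assumption:

```python
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t with R @ src_i + t ~ dst_i.

    Kabsch/SVD estimate of the rotation and translation between the
    matched feature point sets {Pt1} and {Pr1}; the stage commands of
    step 272 would be derived from R and t.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    # force a proper rotation (det = +1), guarding against reflections
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

The angular offsets theta(X1), theta(Y1), theta(Z1) would then be read out of `R` by whatever Euler-angle convention the six-degree-of-freedom stage expects.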
Because the camera 8 samples the measured object discretely, the surface-gradient information between two adjacent pixels cannot be fully reconstructed, so the original surface shape cannot be recovered with complete accuracy. The longitudinal (Z) matching precision of the three-dimensional surface-shape reconstruction can be held within the focal depth of the lens, but the lateral (XY) matching precision only reaches pixel level, whereas full matching of the images acquired by the several image sensors requires sub-pixel lateral precision. The adjustment of the above steps therefore cannot meet the final requirement. A telecentric lens 9 used together with a resolution test target 10 is added for further adjustment: the chromatic aberration and distortion of an ordinary lens inevitably affect calibration in the XY directions, while the telecentric lens 9 has low distortion and constant magnification, eliminating the influence of distortion and lateral chromatic aberration.
Therefore, the present embodiment further includes the following steps:
step 300: the position of the image sensor to be matched is further adjusted using the telecentric lens 9 together with the resolution test target 10, specifically comprising the following steps:
and 310, moving the XY axes of the XYZ scanning detection platform, and adjusting the positions of the telecentric lens 9 and the resolution test target 10 to enable the resolution test target image projected by the resolution test target 10 to reach the multi-beam splitter prism through the telecentric lens 9 and be acquired by each image sensor. And the Z axis of the XYZ scanning detection platform is moved to clear the image acquired by the reference image sensor, wherein the position of the telecentric lens 9 and the position of the Z axis when the image of the reference image sensor is clear are recorded in the first test, and the image is directly moved to the corresponding position according to the stored position information in the subsequent test.
In step 320, the reference image sensor and the image sensor to be matched acquire the resolution-test-target image. To improve accuracy, in this embodiment several images are acquired continuously and simultaneously; a preferred embodiment acquires 50 images. An area of alternating bright and dark stripes at the center of the picture is selected, feature-point matching is performed between the two pictures of each acquisition to form matched feature-point sets {Pr2} and {Pt2}, the rotation and translation vectors between the two point sets are calculated, and the rotation and translation values from the multiple acquisitions are averaged to obtain the offsets X2 and Y2 along the XY axes and the rotation angle theta(Z2) about the Z direction. X2, Y2 and theta(Z2) are then compared with their compensation thresholds: if any of the three parameters is not smaller than its threshold, the X-axis and Y-axis translation stages and the Z-axis angular stage of the six-degree-of-freedom adjustment stage 12 apply the corresponding offset or rotation correction, and this step is repeated until all three parameters are smaller than their thresholds.
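The measure-compare-correct loop of step 320 reduces to the following control sketch; `measure_offsets`, `apply_correction`, the threshold triple and `max_iters` are all illustrative stand-ins rather than names from the patent:

```python
def fine_tune(measure_offsets, apply_correction, thresholds, max_iters=20):
    """Closed-loop fine adjustment of step 320 (sketch).

    `measure_offsets()` returns (X2, Y2, thetaZ2) averaged over a
    burst of target images; `apply_correction(x, y, tz)` drives the
    six-degree-of-freedom stage. Loops until all three parameters
    fall below their compensation thresholds, or gives up.
    """
    for _ in range(max_iters):
        x2, y2, tz2 = measure_offsets()
        if (abs(x2) < thresholds[0] and abs(y2) < thresholds[1]
                and abs(tz2) < thresholds[2]):
            return True                  # converged: all below threshold
        apply_correction(x2, y2, tz2)    # command the stage, then re-measure
    return False
```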
Step 400: the image sensor to be matched is fixed on the prism frame, completing the image-plane matching of the prism beam-splitting multispectral camera sensors.
In one embodiment, the image sensors and the prism frame are fixed with ultraviolet-curing glue; fig. 9 and fig. 10 show the irradiation positions of the ultraviolet lamps when fixing the reference image sensor and the image sensor to be matched, respectively. In addition, each component in this embodiment is automatically controlled by a program; the specific control flow, shown in fig. 11, is as follows:
(1) The reference image sensor and the image sensor to be matched are each fixed to their respective sensor bonding plates 11b. The bonding plate of the reference image sensor is bonded and fixed to the prism frame, while ultraviolet-curing glue is dotted between the bonding plate of the image sensor to be matched and the prism frame, without ultraviolet irradiation for the time being.
(2) The program moves the XY axes of the XYZ scanning detection platform so that the projection device 7 can project onto the image sensors through the beam-splitting prism and the camera 8 can acquire the images reflected by the sensors. The program then moves the Z axis of the XYZ scanning detection platform until the image acquired by the reference image sensor is sharp, and moves the Z axis of the six-degree-of-freedom adjustment stage 12 until the image acquired by the image sensor to be matched is sharp. The XY positions and the two Z positions are determined manually in the first test and recorded; in subsequent tests the program moves directly to the recorded positions without manual re-determination.
(3) Three-dimensional reconstruction of the surface information of the reference image sensor and the image sensor to be matched is performed and the adjustment parameters are calculated; the program feeds the parameters to the six-degree-of-freedom adjustment stage 12, which performs the initial adjustment of the spatial position of the image sensor to be matched.
(4) The XY axes of the XYZ scanning detection platform are moved to switch in the telecentric lens 9 and the resolution test target 10, and the Z axis of the platform is moved until the image acquired by the reference image sensor is sharp. A second spatial-position adjustment is then performed on the image sensor to be matched: the XY offsets and the Z-direction rotation angle are corrected through the six-degree-of-freedom adjustment stage 12, setting the spatial position of the image sensor to be matched before curing. The position of the telecentric lens 9 and the Z-axis position at which the image of the image sensor to be matched is sharp are recorded in the first test; in subsequent tests the program moves directly to the corresponding positions.
(5) Ultraviolet curing: once the spatial position of the image sensor to be matched has been adjusted, the program turns on the ultraviolet lamp and turns it off automatically after 30 minutes, after which the image sensor to be matched is bonded to the prism.
After the first run is completed, the relevant position information has been recorded throughout the flow, and subsequent runs can proceed fully automatically without manual operation.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The above embodiments merely illustrate the technical solution of the present invention and are not limiting. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments may still be modified, or some of their technical features replaced by equivalents, and that such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. A method for image-plane matching of the sensors of a prism beam-splitting multispectral camera, wherein the multispectral camera uses a multi-beam splitter prism to split light of different wavebands onto a plurality of corresponding image sensors, and the method matches the positional relations of the plurality of image sensors so that the spatial positions and sharpness of the object information acquired by the plurality of image sensors are consistent, characterized by comprising the following steps:
fixing a preset reference image sensor at a corresponding position of the multi-beam splitter prism; pre-fixing an image sensor to be matched at a corresponding position of the multi-beam splitter prism;
respectively projecting grating fringes through the multi-beam splitter prism onto the corresponding image sensors using light of different wavebands, collecting with a camera the grating fringe patterns reflected by each image sensor, and performing three-dimensional reconstruction of each image-sensor surface by phase-measuring deflectometry from the grating fringe patterns collected by the camera;
obtaining the position deviation between each image sensor to be matched and the reference image sensor according to the three-dimensional reconstruction result, and initially adjusting the position of each image sensor to be matched according to the deviation;
fitting the multispectral camera with a telecentric lens to collect a resolution-test-target image, obtaining the positional deviation between each image sensor to be matched and the reference image sensor from the test-target image collected by each image sensor, and finely adjusting the position of each image sensor to be matched according to the deviation.
2. The method for image-plane matching of prism beam-splitting multispectral camera sensors according to claim 1, wherein projecting the grating fringes onto the corresponding image sensors through the multi-beam splitter prism using light of different wavebands respectively, and collecting with the camera the grating fringe patterns reflected by each image sensor, comprises:
projecting crisscross grating fringes into the multi-beam splitter prism using light of the waveband corresponding to the reference image sensor, the fringes reaching the reference image sensor after being split by the prism; the grating fringes reflected by the reference image sensor pass back through the prism and are collected by the camera to obtain a grating fringe pattern;
performing the above steps with light of the wavebands corresponding to each of the other image sensors, respectively, to obtain the grating fringe pattern reflected by each image sensor.
3. The method for image-plane matching of prism beam-splitting multispectral camera sensors according to claim 2, wherein performing three-dimensional reconstruction of each image-sensor surface by phase-measuring deflectometry from the grating fringe patterns collected by the camera comprises:
performing a Fourier transform along the transverse and longitudinal directions of the grating fringe pattern, respectively;
filtering in the transverse and longitudinal directions respectively, extracting the fundamental-frequency signal, and performing an inverse Fourier transform to obtain the transverse and longitudinal phase maps;
and unwrapping the phase of each pixel of the three-dimensional phase field in the phase map along the time-axis direction, and obtaining the three-dimensional surface shape of the sensor surface through a height-reconstruction algorithm based on radial-basis-function integration, according to the known camera intrinsic parameters and the position information of the projection light source of the grating fringes, to complete the three-dimensional reconstruction.
4. The method for matching image surfaces of prism-splitting multispectral camera sensors according to claim 3, wherein the step of obtaining a positional deviation between each image sensor to be matched and a reference image sensor according to the three-dimensional reconstruction result, and the step of initially adjusting the position of each image sensor to be matched according to the deviation comprises the steps of:
performing feature point matching on the three-dimensional surface shape of each image sensor to be matched and the three-dimensional surface shape of the reference image sensor to obtain a corresponding matched feature point set;
calculating the rotation vector and translation vector between the feature-point set of each image sensor to be matched and the feature-point set of the corresponding reference image sensor to obtain the positional deviation, wherein the positional deviation comprises the deviations X1, Y1 and Z1 along the X, Y, Z axes between each image sensor to be matched and the reference image sensor, and the angular offsets theta(X1), theta(Y1) and theta(Z1) about the X, Y, Z directions;
and initially adjusting the position of each image sensor according to the position deviation.
5. The method for image-plane matching of prism beam-splitting multispectral camera sensors according to claim 1, wherein fitting the multispectral camera with a telecentric lens to collect a resolution-test-target image, obtaining the positional deviation between each image sensor to be matched and the reference image sensor from the test-target image collected by each image sensor, and finely adjusting the position of each image sensor to be matched according to the deviation, comprises:
fitting the multispectral camera with a telecentric lens to collect the same resolution-test-target image a plurality of times;
for each resolution test target map acquired by each image sensor, the following steps are performed:
performing feature point matching between the resolution test target images acquired by each image sensor to be matched and the resolution test target images acquired by the reference image sensor;
calculating the rotation vector and translation vector between the feature-point set of the resolution-test-target image acquired by each image sensor to be matched and the feature-point set of the image acquired by the corresponding reference image sensor, to obtain the positional deviation, which comprises the offsets X2 and Y2 along the X, Y axes and the rotation angle theta(Z2) about the Z direction;
obtaining the average value of the offset X2 and Y2 of the X, Y axis and the rotation angle theta (Z2) of the Z direction, which are obtained after each image sensor collects the resolution test target image, so as to obtain the average position deviation;
and fine-adjusting the position of each image sensor to be matched according to the average position deviation.
6. The method for image-plane matching of prism beam-splitting multispectral camera sensors according to claim 5, wherein finely adjusting the position of each image sensor to be matched according to the average positional deviation comprises:
judging whether the average value of each of the offsets X2 and Y2 along the X, Y axes and the rotation angle theta(Z2) about the Z direction exceeds its set compensation threshold; if the average value of an offset is smaller than the corresponding compensation threshold, that offset is not fine-adjusted;
if the average value of any offset is not smaller than the corresponding compensation threshold, fine-adjusting that offset, and after the fine adjustment repeating the steps of collecting the same resolution-test-target image a plurality of times with the telecentric lens fitted to the multispectral camera and calculating the average positional deviation, until the average value of each of the offsets X2 and Y2 along the X, Y axes and the rotation angle theta(Z2) about the Z direction is smaller than its set compensation threshold.
7. A prism beam-splitting multispectral camera sensor image-plane matching device, which uses a multispectral prism to split light of different wavebands onto a plurality of corresponding image sensors, the device being used for matching the positional relations of the plurality of image sensors so that the spatial positions and sharpness of the object information acquired by the plurality of image sensors are consistent, characterized by comprising:
the six-degree-of-freedom adjusting table is used for adjusting the position of an image sensor to be matched, the image sensor to be matched is pre-fixed on the corresponding position of the multi-beam splitter prism, and the reference image sensor is fixed on the multi-beam splitter prism;
the XYZ scanning detection table is used for adjusting the positions of the projection device, the camera, the telecentric lens and the resolution test target;
the XYZ scanning detection table adjusts the positions of a projection device and a camera, so that the projection device projects light with different wave bands respectively to project grating fringes onto corresponding image sensors through a polygon prism, the camera collects grating fringe patterns reflected by each image sensor, three-dimensional reconstruction is carried out on the surface of each image sensor by using a phase deflection technique according to the grating fringe patterns collected by the camera, and the position deviation between each image sensor to be matched and a reference image sensor is obtained according to the three-dimensional reconstruction result; the six-degree-of-freedom adjusting table initially adjusts the position of each image sensor to be matched according to the deviation;
the XYZ scanning detection platform adjusts the positions of the telecentric lens and the resolution test target, so that a resolution test target image projected by the resolution test target passes through the telecentric lens to reach the multi-beam splitter prism and is collected by each image sensor; obtaining the position deviation between each image sensor to be matched and the reference image sensor according to a plurality of resolution test target images acquired by each image sensor; and the six-degree-of-freedom adjusting table finely adjusts the position of each image sensor to be matched according to the deviation.
8. The prism-splitting multispectral camera sensor image plane matching device of claim 7, wherein the six-degree-of-freedom adjustment stage comprises:
an XYZ axis translation stage for adjusting a X, Y, Z axis deviation between the image sensor to be matched and the reference image sensor;
the XYZ axis angular stage is used for adjusting angle offsets theta (X1), theta (Y1) and theta (Z1) in the X, Y, Z direction between the image sensor to be matched and the reference image sensor.
9. The prism-spectroscopic multispectral camera sensor image plane matching device of claim 7, wherein the XYZ scan detection stage comprises:
the XYZ-axis displacement platform is connected with the adapter plate;
the projection device, the camera, the telecentric lens and the resolution test target are arranged on the adapter plate;
the XYZ axis displacement stage is used to adjust the position of the projection device and camera, telecentric lens and resolution test target on the X, Y, Z axis.
10. The prism-splitting multispectral camera sensor image plane matching device of claim 7, further comprising:
the prism bonding structure comprises a prism fixing frame and a sensor bonding plate, wherein the prism fixing frame is used for fixing the multi-beam splitter prism, one side of the sensor bonding plate is fixed with an image sensor, and the other side of the sensor bonding plate is connected with the prism fixing frame;
the bonding plate with one side fixed with the reference image sensor is directly fixed at the corresponding position of the prism fixing frame, so that the multi-beam prism can irradiate the received light source to the reference image sensor after splitting the light source;
the adhesive plate with one side fixed to the image sensor to be matched is pre-fixed at the corresponding position of the prism fixing frame, so that the multi-beam prism, after splitting the received light, can irradiate it onto the image sensor to be matched;
the other side of the adhesive plate, on which the image sensor to be matched is fixed, is connected with the six-degree-of-freedom adjusting table, and the position of the image sensor to be matched is adjusted under the drive of the six-degree-of-freedom adjusting table;
after the position adjustment of the image sensor to be matched is completed, the adhesive plate with the image sensor to be matched fixed on one side is fixed on the prism fixing frame.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310566986.8A CN116539158A (en) | 2023-05-19 | 2023-05-19 | Image plane matching method and device for prism light-splitting multispectral camera sensor |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116539158A true CN116539158A (en) | 2023-08-04 |
Family
ID=87454052
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310566986.8A Pending CN116539158A (en) | 2023-05-19 | 2023-05-19 | Image plane matching method and device for prism light-splitting multispectral camera sensor |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116539158A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115079505A (en) * | 2022-05-31 | 2022-09-20 | 合肥埃科光电科技股份有限公司 | Prism light splitting multispectral camera matching calibration device and method based on Talbot effect |
CN115079505B (en) * | 2022-05-31 | 2024-05-10 | 合肥埃科光电科技股份有限公司 | Prism beam splitting multispectral camera matching calibration device and method based on Talbot effect |
CN117970593A (en) * | 2024-04-02 | 2024-05-03 | 江苏永鼎光电子技术有限公司 | Automatic prism sticking machine |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |