CN114897947A - Thermal infrared and visible light image synchronous registration method based on time-space unification - Google Patents
- Publication number
- CN114897947A CN114897947A CN202210366866.9A CN202210366866A CN114897947A CN 114897947 A CN114897947 A CN 114897947A CN 202210366866 A CN202210366866 A CN 202210366866A CN 114897947 A CN114897947 A CN 114897947A
- Authority
- CN
- China
- Prior art keywords
- visible light
- thermal infrared
- thread
- image
- light image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
Abstract
The invention discloses a thermal infrared and visible light image synchronous registration method based on time-space unification. The method comprises the steps of: building a chessboard calibration plate and collecting thermal infrared images and visible light images containing the chessboard calibration plate, wherein the thermal infrared images and the visible light images share a unified reference time, and a plurality of groups of thermal infrared and visible light image pairs are obtained based on the collected images and the reference time; processing the thermal infrared and visible light image pairs through sub-pixel edge corner detection to obtain the position coordinates of the calibration plate corner points; and solving an optimal homography transformation matrix based on the position coordinates of the calibration plate corner points, wherein the optimal homography transformation matrix is used for realizing synchronous registration of the thermal infrared image and the visible light image. The invention can provide a good prior basis for the fusion processing of thermal infrared and visible light images and ensure the scene consistency of multi-modal data.
Description
Technical Field
The invention relates to the technical field of thermal infrared and visible light image processing, in particular to a thermal infrared and visible light image synchronous registration method based on time-space unification.
Background
Image analysis and processing covers many tasks, such as image segmentation, stitching, reconstruction and detection. At present, comparing or fusing the effective information of multiple images is becoming the mainstream in the image processing field. Whether the image information is single-modal or multi-modal, misalignment of the observed region or object inevitably occurs. Image registration is an important link in the image processing field: its goal is to find the optimal spatial alignment between two or more images of the same scene, and the quality of the registration result directly influences the effect of subsequent work.
Existing image registration algorithms mainly fall into two categories: methods based on gray-scale statistics and methods based on feature matching. Owing to its higher speed and better robustness, feature-matching-based registration has become the most widely applied and most frequently used approach. Algorithms such as SIFT and SURF perform well on single-modal images and have been widely studied and applied. However, in practical applications, especially when the sensor is in motion, single-modal image data are often susceptible to special environments, leading to inaccurate and incomplete information. On the one hand, when lighting conditions are poor, the feature information conveyed by the visible light image is greatly reduced, whereas the thermal infrared image has the advantage of resisting illumination interference; on the other hand, existing thermal infrared images generally have low resolution and blurred object edges, whereas visible light images have high resolution and abundant feature information. Forming a unified representation of multi-modal data that exploits the respective imaging characteristics of visible light and thermal infrared sensing therefore greatly benefits subsequent image analysis and processing tasks. However, for the multi-modal data of thermal infrared and visible light images, it remains difficult for current advanced algorithms to actively find feature relationships that can be matched correctly. Moreover, because of differences in imaging resolution, focal length and the like between the sensors, a rigid-body or affine transformation cannot simply serve as the transformation model, and high-precision registration of the two types of images cannot be achieved.
Meanwhile, the generalized image registration should also include temporal unification. Due to the different signal sampling frequencies and data storage times of different sensors, the data collected at the same time are not from the same real scene. This will result in a mismatch in the content of the multi-modal data fusion, creating more noise and extraneous terms than single-modal data, which is detrimental to highlighting the advantages of each modal instance.
Disclosure of Invention
In order to solve the technical problems in the prior art, the invention provides a thermal infrared and visible light image synchronous registration method based on time and space unification, which can provide good prior work for thermal infrared and visible light image fusion processing and ensure scene consistency of multi-mode data.
In order to achieve the technical purpose, the invention provides the following technical scheme:
a thermal infrared and visible light image synchronous registration method based on time-space unification comprises the following steps:
the method comprises the steps of building a chessboard calibration plate and collecting thermal infrared images and visible light images containing the chessboard calibration plate, wherein the thermal infrared images and the visible light images have a unified reference time, and acquiring a plurality of groups of thermal infrared and visible light image pairs based on the collected images and the reference time;
processing the thermal infrared and visible light image pair through sub-pixel edge corner detection to obtain the position coordinates of the calibration plate corner points;
and solving an optimal homography transformation matrix based on the position coordinates of the calibration plate corner points, wherein the optimal homography transformation matrix is used for realizing synchronous registration of the thermal infrared image and the visible light image.
Optionally, the chessboard calibration plate includes a white lattice background and black lattices, wherein the white lattice background uses an aluminum plate, and the black lattices use a black rubber cloth.
Optionally, when the thermal infrared image and the visible light image are collected, the reference time of the thermal infrared image and the reference time of the visible light image are unified through a multithreading cross processing method.
Optionally, the process of unifying the reference time by the multithread cross processing method includes:
respectively acquiring data streams through an infrared thermal imager and a visible light camera;
allocating threads to the data streams collected by the infrared thermal imager and the visible light camera to obtain a first thread and a second thread, wherein the first thread is the processing thread for the data stream collected by the infrared thermal imager, and the second thread is the processing thread for the data stream collected by the visible light camera;
when the first thread or the second thread reaches the data storage stage, the thread that has reached the data storage stage starts to wait until the other thread also reaches the data storage stage;
when the first thread and the second thread have both reached the data storage stage, caching the data streams in the first thread and the second thread;
extracting the cached data streams through a third thread, respectively converting them into image data, and simultaneously releasing the cache;
after releasing the cache, caching the data streams collected in real time in the first thread and the second thread again, and obtaining the thermal infrared images and the visible light images through the cycle of caching, extracting, converting and releasing the cache until the threads finish.
Optionally, the process of acquiring a plurality of sets of pairs of thermal infrared and visible light images includes:
and correspondingly integrating the thermal infrared image and the visible light image at the same reference time to obtain a plurality of groups of thermal infrared and visible light image pairs.
Optionally, the process of obtaining the position coordinates of the calibration plate corner point includes:
processing the thermal infrared and visible light image pair through an edge detection operator to obtain the integer-pixel edge positions of the checkerboard;
based on the edge positions, selecting an edge pixel point as the center for sub-pixel points; obtaining a normal line based on this center; obtaining the sub-pixel points based on the normal line and a single-pixel distance; and constructing a fitting function based on the gray values of the sub-pixel points, wherein the fitting function is an arctangent function;
solving the fitting function by the least square method to obtain an edge function model;
differentiating the edge function model twice to obtain the sub-pixel edge points;
and fitting edge lines through the sub-pixel edge points in the four directions around the coarse corner point to obtain an intersection point, which serves as the position coordinate of the calibration plate corner point.
Optionally, the gray value of the sub-pixel point is calculated by linear interpolation in the horizontal direction and the vertical direction.
Optionally, the process of solving the optimal homography transformation matrix includes:
matching the position coordinates of the calibration plate corner points in the thermal infrared and visible light image pairs to obtain a coordinate pair set, and randomly selecting several coordinate pairs as an interior point set;
constructing an initial homography matrix by the least square method based on the interior point set;
calculating, through the initial homography matrix, the projection errors of the coordinate pairs in the coordinate pair set other than the interior point set, and comparing each projection error with a threshold value: when the projection error is smaller than the threshold value, the coordinate pair is added to the interior point set; when the projection error is larger than the threshold value, it is not added;
counting the number of coordinate pairs in the updated interior point set;
and repeating the process from constructing the homography matrix to counting the number of coordinate pairs until the number of iterations is reached, thereby obtaining the optimal homography matrix, namely the homography matrix whose corresponding interior point set contains the largest number of coordinate pairs.
The invention has the following technical effects:
(1) the invention adopts multi-thread cross control, introduces identifiers and a global monitoring mechanism, effectively reduces the acquisition time difference of each sensor and ensures the synchronization of the imaging timestamp in an ultra-low time delay range. Meanwhile, the method is not limited to early-stage registration work and can also be used in real-time application;
(2) the invention utilizes the checkerboard edge gray distribution to fit the corner position obtained by the edge line, replaces the characteristic points to realize the unification of the space scene, effectively reduces the errors caused by the extraction and the matching of the characteristic points, can improve the registration precision and the registration speed, and has simple method and easy implementation.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart of the thermal infrared and visible light image synchronous registration method based on time-space unification according to the present invention;
FIG. 2 is a flowchart of the image acquisition process, taking the infrared thread as an example;
FIG. 3 is a schematic view of the mounting platform for the infrared thermal imager and the visible light camera of the present invention;
FIG. 4 is a schematic diagram of a 5 × 5 neighborhood of edge pixels according to the present invention;
FIG. 5 is a schematic diagram of bi-directional linear gray scale interpolation for sub-pixel points according to the present invention;
fig. 6 is a visible image and infrared images before and after registration according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to solve the problems existing in the prior art, the invention provides the following scheme:
As shown in fig. 1, the present invention provides a time-space unified thermal infrared and visible light image synchronous registration method, which comprises the following steps: adopting multithread cross control to make the infrared thermal imager and the visible light camera store data stream information synchronously; preprocessing the formed images; accurately positioning the checkerboard corner points in the two images by a sub-pixel edge corner detection method; and finally solving the optimal homography transformation matrix with a random sample consensus (RANSAC) algorithm to complete the registration.
The technical route is as follows: first, the reference times of the data collected by the infrared thermal imager and the visible light camera are unified based on a multithread cross processing method; then, a chessboard calibration plate whose black and white lattices have different emissivities is built, and the mounting positions of the infrared thermal imager and the visible light camera are adjusted and fixed, ensuring that the sensor with the larger field angle completely contains the scene information collected by the other sensor; further, the two devices simultaneously collect images of the calibration plate at different positions, angles and postures, and noise-filtering preprocessing is performed; next, sub-pixel edge corner detection is used to obtain the position coordinates of the calibration plate corner points in each group of images, which are stored in sequence; finally, the optimal homography transformation matrix is solved by a robust random sample consensus method combined with the least square method to complete the registration.
Example 1
In this embodiment, a time-space unified thermal infrared and visible light image synchronous registration method includes the following steps:
step 1: fig. 2 shows the image acquisition flow, taking the infrared thread as an example; the reference times of the data acquired by the infrared thermal imager and the visible light camera are unified by the multithread cross processing method, with the following specific steps:
step 1.1: starting the multi-thread module; the infrared thermal imager and the visible light camera respectively start to acquire data streams;
step 1.2: allocating a first identifier and a second identifier to the thermal infrared and visible light processing threads respectively, with their initial values set to False;
step 1.3: establishing a global monitoring mechanism: when one thread reaches the data stream information storage stage, its identifier is converted to True; if the identifier of the other thread is still False, the arriving thread starts to wait until the other thread also reaches the storage stage;
step 1.4: when the first and second identifiers are both True, the two threads start to store the frame information of the current data streams into a cache;
step 1.5: after the frame information is stored, the corresponding identifiers are converted back to False;
step 1.6: without affecting the synchronization flow, a third thread extracts the frame information from the cache, stores it as image data, and releases the cache;
step 1.7: returning to step 1.3 to store the information of the next data stream, running in a loop until the threads finish.
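The cross-control scheme of steps 1.1 to 1.7 can be sketched with a barrier that forces the two capture threads to meet at the storage stage, plus a third thread that drains the shared cache. This is a minimal illustrative sketch, not the patent's code: `read_ir_frame` and `read_vis_frame` are hypothetical stand-ins for the sensor SDK calls, and "frames" are plain strings rather than image data.

```python
import queue
import threading

NUM_FRAMES = 5

def read_ir_frame(i):      # placeholder for grabbing a frame from the thermal imager
    return f"ir_{i}"

def read_vis_frame(i):     # placeholder for grabbing a frame from the visible camera
    return f"vis_{i}"

store_barrier = threading.Barrier(2)   # both capture threads must reach the storage stage
buffer = queue.Queue()                 # shared cache drained by the third thread
pairs = []

def capture(name, reader):
    for i in range(NUM_FRAMES):
        frame = reader(i)
        store_barrier.wait()           # wait until the other capture thread also arrives
        buffer.put((name, i, frame))   # cache the frame of the current data stream

def converter():
    # third thread: extract cached frames and "convert" them to image data
    for _ in range(NUM_FRAMES * 2):
        name, i, frame = buffer.get()
        pairs.append((i, name, frame))
        buffer.task_done()

threads = [threading.Thread(target=capture, args=("ir", read_ir_frame)),
           threading.Thread(target=capture, args=("vis", read_vis_frame)),
           threading.Thread(target=converter)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# every frame index should now have one thermal and one visible frame
by_index = {}
for i, name, frame in pairs:
    by_index.setdefault(i, set()).add(name)
print(all(v == {"ir", "vis"} for v in by_index.values()))
```

A `threading.Barrier` plays the role of the True/False identifiers plus global monitor: whichever thread arrives first blocks until its partner arrives, so frames are always cached pairwise.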
step 2: building a checkerboard calibration plate whose black and white lattices have different emissivities: a smooth aluminum plate with lower emissivity is selected as the white lattice background, and rough black rubber cloth with higher emissivity as the black lattices. The visible light camera can still identify the corner points from the color information; meanwhile, the infrared thermal imager can passively detect a checkerboard image with distinct corners at a uniform temperature, which eliminates the corner-blurring errors caused by active heating and realizes effective synchronous acquisition by the two sensors;
step 3: connecting the two sensor devices to the same industrial personal computer loaded with the multithread cross-control acquisition method of step 1; adjusting and fixing the mounting positions of the two sensors using the infrared thermal imager and visible light camera mounting platform shown in fig. 3, ensuring that the sensor with the larger field angle completely contains the scene information acquired by the other sensor; meanwhile, using the two devices to collect images of the calibration plate at different positions, angles and postures, recording a plurality of groups of thermal infrared and visible light image pairs, and performing noise-filtering preprocessing;
step 4: sub-pixel edge corner detection is used to obtain the position coordinates of the calibration plate corner points in each group of images and store them in sequence, with the following specific steps:
step 4.1: preliminarily acquiring the integer-pixel edge positions of the checkerboard with the Canny operator;
step 4.2: performing corner detection at the integer-pixel edge positions: any pixel point A in one of the four edge directions of a coarse corner O is selected as the center for sub-pixel points; as shown in FIG. 4, the line connecting the two outermost edge pixel points in its 5 × 5 neighborhood is used as the tangent line within the neighborhood, giving the included angle θ with the horizontal direction and the corresponding normal F;
step 4.3: along the normal direction, two sub-pixel points a1, a2 and a3, a4 are respectively placed on either side of the central pixel point A at single-pixel spacing, yielding 5 points along the normal direction;
step 4.4: as shown in FIG. 5, taking a1 as an example, let the coordinates of the sub-pixel point lying close to the center, within the 4-pixel neighborhood, be (x, y); its distance to the central pixel (0, 0) is 1, so x = cosθ and y = sinθ. First, the linear interpolation Ih1 of the gray levels at pixels (0, 0) and (1, 0) in the horizontal direction and the linear interpolation Ih2 of the gray levels at pixels (0, 1) and (1, 1) in the horizontal direction are obtained respectively:

Ih1 = (1 - x)·I00 + x·I10
Ih2 = (1 - x)·I01 + x·I11

where I00, I01, I10 and I11 denote the gray values at pixel coordinates (0, 0), (0, 1), (1, 0) and (1, 1) respectively. Linear interpolation in the vertical direction then gives the gray value Ixy of the sub-pixel point:

Ixy = (1 - y)·Ih1 + y·Ih2
step 4.5: as shown in FIG. 5, taking a2 as an example and proceeding as in step 4.4, let the coordinates of the sub-pixel point lying far from the center, within the 9-pixel neighborhood, be (m, n); its distance to the central pixel (0, 0) is 2, so m = 2cosθ and n = 2sinθ. First, the linear interpolation Ih1' of the gray levels at pixels (0, 0) and (2, 0) in the horizontal direction and the linear interpolation Ih2' of the gray levels at pixels (0, 2) and (2, 2) in the horizontal direction are obtained respectively:

Ih1' = (1 - m/2)·I00 + (m/2)·I20
Ih2' = (1 - m/2)·I02 + (m/2)·I22

where I02, I20 and I22 denote the gray values at pixel coordinates (0, 2), (2, 0) and (2, 2) respectively. Linear interpolation in the vertical direction then gives the gray value Imn of the sub-pixel point:

Imn = (1 - n/2)·Ih1' + (n/2)·Ih2'
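The two-stage interpolation of steps 4.4 and 4.5 is ordinary bilinear interpolation over a cell of side d (d = 1 for the near point, d = 2 for the far point). A minimal sketch; the function and variable names are chosen here for illustration and are not from the patent:

```python
import math

def bilinear_gray(i00, i10, i01, i11, x, y, d=1):
    """Horizontal-then-vertical linear interpolation of the gray value at
    offset (x, y) inside a cell of side d whose corner gray values are
    i00 at (0,0), i10 at (d,0), i01 at (0,d), i11 at (d,d)."""
    h1 = (1 - x / d) * i00 + (x / d) * i10   # horizontal interpolation, bottom edge
    h2 = (1 - x / d) * i01 + (x / d) * i11   # horizontal interpolation, top edge
    return (1 - y / d) * h1 + (y / d) * h2   # then interpolate vertically

# sub-pixel points along a normal at theta = 45 degrees from the central pixel
theta = math.radians(45.0)
x, y = math.cos(theta), math.sin(theta)
g = bilinear_gray(10.0, 20.0, 30.0, 40.0, x, y)                  # near point, d = 1
g_far = bilinear_gray(10.0, 20.0, 30.0, 40.0, 2 * x, 2 * y, d=2) # far point, d = 2
print(round(g, 3))  # 31.213
```

With the far point at exactly twice the near point's offset and twice the cell size, the normalized offsets coincide, so the same corner gray values give the same interpolated value in this toy case.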
step 4.6: using the gray values of the 5 points along the normal direction, the actual gray distribution across the edge is fitted with an arctangent function:

y = a1·arctan(a2·x + a3) + a4

where y denotes the gray value of a sub-pixel point, x denotes the signed distance from the sub-pixel point to point A along the normal, a1 scales the amplitude of the arctangent function, a2 and a3 describe the degree of bending of the arctangent curve, and a4 denotes the offset of the arctangent function in the vertical direction. Solving for a1, a2, a3 and a4 by least-squares fitting yields the edge function model;
step 4.7: the sub-pixel edge point E is defined at the coordinate where the slope of the function is maximal. Setting the second derivative of the edge function to zero,

y'' = -2·a1·a2^2·(a2·x + a3) / (1 + (a2·x + a3)^2)^2 = 0,

yields xE = -a3/a2, the position of the sub-pixel edge point E along the normal direction. Combining this position with θ, the complete offset of the sub-pixel edge point relative to point A is obtained: the horizontal offset is Δx = xE·cosθ and the vertical offset is Δy = xE·sinθ. Adding these offsets to the coordinates of point A gives the final coordinates of the sub-pixel edge point E;
step 4.8: taking further pixel points in the other three edge directions of the coarse corner point O and obtaining the corresponding sub-pixel edge points according to steps 4.2 to 4.7; fitting the sub-pixel edge points lying in opposite directions yields two crossing edge lines, whose intersection point is taken as the corner point coordinate and stored in sequence.
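Step 4.8 reduces to fitting a straight line through each set of sub-pixel edge points and intersecting the two lines. A minimal sketch with made-up sub-pixel edge points (all names and values here are illustrative):

```python
import numpy as np

def fit_line(pts):
    """Least-squares line through points, returned as (a, b, c) with a*x + b*y = c.
    Fits y = m*x + k, which is fine for non-vertical edges; a near-vertical
    edge would need the roles of x and y swapped."""
    xs, ys = pts[:, 0], pts[:, 1]
    m, k = np.polyfit(xs, ys, 1)
    return np.array([-m, 1.0, k])

def intersect(l1, l2):
    # solve the 2x2 linear system formed by the two line equations
    A = np.array([l1[:2], l2[:2]])
    b = np.array([l1[2], l2[2]])
    return np.linalg.solve(A, b)

# hypothetical sub-pixel edge points on two roughly perpendicular edges
edge1 = np.array([[0.0, 0.1], [1.0, 1.1], [2.0, 2.1]])   # slope  1, intercept 0.1
edge2 = np.array([[0.0, 2.1], [1.0, 1.1], [2.0, 0.1]])   # slope -1, intercept 2.1
corner = intersect(fit_line(edge1), fit_line(edge2))
print(corner)  # about (1.0, 1.1)
```

Because each line is estimated from several sub-pixel edge points, the intersection averages out individual detection errors, which is the point of replacing a directly detected corner with a fitted-line crossing.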
step 5: based on the set S of matched sub-pixel corner coordinate pairs, a robust random sample consensus method combined with the least square method is used to solve the optimal homography transformation matrix. As shown in fig. 6, (a) is the visible light image, and (b) and (c) are the infrared images before and after registration, respectively. The specific steps are as follows:
step 5.1: randomly selecting n = 4 pairs of matched feature points from the initial matching pair set S as the interior point set Si, converting the over-determined equation system into a nonlinear optimization problem, and estimating the initial homography matrix Hi by least-squares fitting;
step 5.2: using the Hi determined in step 5.1 to evaluate the remaining matched point pairs in the set S: if the projection error of a feature point is smaller than the threshold t = 5, the pair is added to Si; otherwise it is regarded as an outlier;
step 5.3: recording the number of matched point pairs in the set Si;
step 5.4: repeating the above steps until the number of iterations exceeds K = 200;
step 5.5: selecting the model with the largest number of interior points as the homography matrix H to be solved; synchronous registration of the thermal infrared and visible light images is then realized through this optimal homography transformation matrix.
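Steps 5.1 to 5.5 describe a standard RANSAC loop over least-squares (DLT) homography estimates. The sketch below is an illustrative NumPy re-implementation on synthetic correspondences, not the patent's code; the threshold t = 5 and iteration count K = 200 follow the values given above, while the test data and function names are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_homography(src, dst):
    """Least-squares (DLT) homography from >= 4 correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)      # right singular vector of smallest singular value
    return H / H[2, 2]

def project(H, pts):
    p = np.c_[pts, np.ones(len(pts))] @ H.T
    return p[:, :2] / p[:, 2:3]

def ransac_homography(src, dst, t=5.0, K=200):
    best_H, best_count = None, -1
    for _ in range(K):
        idx = rng.choice(len(src), 4, replace=False)   # step 5.1: sample n = 4 pairs
        H = estimate_homography(src[idx], dst[idx])
        err = np.linalg.norm(project(H, src) - dst, axis=1)
        count = int((err < t).sum())                   # steps 5.2-5.3: count inliers
        if count > best_count:                         # step 5.5: keep the best model
            best_H, best_count = H, count
    return best_H, best_count

# synthetic corner pairs under a known homography, plus two gross outliers
H_true = np.array([[1.1, 0.02, 5.0], [-0.01, 0.95, -3.0], [1e-4, 0.0, 1.0]])
src = rng.uniform(0, 300, (40, 2))
dst = project(H_true, src)
dst[:2] += 80.0                       # corrupt the first two matches
H, inliers = ransac_homography(src, dst)
print(inliers)                        # 38 of the 40 synthetic matches survive
```

With 38 of 40 correspondences consistent, an all-inlier 4-point sample is drawn with overwhelming probability within 200 iterations, so the loop recovers a model agreeing with all uncorrupted pairs while the two corrupted ones fail the 5-pixel test.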
The foregoing illustrates and describes the principles, main features and advantages of the present invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above; the embodiments and the description merely illustrate the principle of the present invention, and various changes and modifications may be made without departing from the spirit and scope of the present invention, all of which fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.
Claims (8)
1. A thermal infrared and visible light image synchronous registration method based on time-space unification is characterized by comprising the following steps:
the method comprises the steps of building a chessboard calibration plate and collecting thermal infrared images and visible light images containing the chessboard calibration plate, wherein the thermal infrared images and the visible light images have a unified reference time, and acquiring a plurality of groups of thermal infrared and visible light image pairs based on the collected images and the reference time;
processing the thermal infrared and visible light image pair through sub-pixel edge corner detection to obtain the position coordinates of the calibration plate corner points;
and solving an optimal homography transformation matrix based on the position coordinates of the calibration plate corner points, wherein the optimal homography transformation matrix is used for realizing synchronous registration of the thermal infrared image and the visible light image.
2. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the chessboard calibration plate comprises a white lattice background and black lattices, wherein the white lattice background uses an aluminum plate, and the black lattices use black rubber cloth.
3. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
when the thermal infrared image and the visible light image are collected, the reference time of the thermal infrared image and the reference time of the visible light image are unified through a multithreading cross processing method.
4. The method for synchronously registering the thermal infrared image and the visible light image based on the time-space unification as claimed in claim 1, wherein:
the process of unifying the reference time by the multithread cross processing method comprises the following steps:
respectively acquiring data streams through a thermal infrared camera and a visible light camera;
allocating threads to the data streams collected by the thermal infrared camera and the visible light camera to obtain a first thread and a second thread, wherein the first thread is the processing thread for the data stream collected by the thermal infrared camera, and the second thread is the processing thread for the data stream collected by the visible light camera;
when the first thread or the second thread reaches the data storage stage, the thread that has reached the data storage stage waits until the other thread also reaches the data storage stage;
when both the first thread and the second thread have reached the data storage stage, caching the data streams in the first thread and the second thread;
extracting the cached data streams through a third thread, converting each data stream into image data, and releasing the cache;
after the cache is released, caching the data streams collected in real time in the first thread and the second thread again, and obtaining the thermal infrared images and visible light images through this cycle of caching, extracting, converting and releasing the cache until the threads terminate.
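The claimed coordination can be illustrated with a minimal sketch (assuming Python's standard threading primitives; the stand-in frame grabbers, queue size and timeouts are illustrative assumptions, not part of the patent): the first and second threads meet at a barrier before the storage stage, and a third thread extracts each cached pair and releases the cache.

```python
import queue
import threading
import time

# Sketch of the multithread cross processing scheme of claim 4.
barrier = threading.Barrier(2)   # both capture threads meet at the storage stage
cache = queue.Queue(maxsize=2)   # shared cache holding one frame pair per round
stop = threading.Event()

def capture(name, grab_frame):
    """First/second thread: grab a frame, wait for the other stream, cache it."""
    while not stop.is_set():
        frame = grab_frame()
        try:
            barrier.wait(timeout=1.0)              # other stream arrives too
            cache.put((name, frame), timeout=1.0)  # cache this round's frame
        except (threading.BrokenBarrierError, queue.Full):
            break

def consume(n_pairs, out):
    """Third thread: extract cached streams, convert them, release the cache."""
    for _ in range(n_pairs):
        pair = {}
        for _ in range(2):            # one frame from each stream per round
            name, frame = cache.get()
            pair[name] = frame        # image-data conversion would happen here
        out.append(pair)
    stop.set()
    barrier.abort()                   # release any capture thread still waiting

pairs = []
threads = [
    threading.Thread(target=capture, args=("thermal", time.monotonic)),
    threading.Thread(target=capture, args=("visible", time.monotonic)),
    threading.Thread(target=consume, args=(3, pairs)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
# pairs now holds 3 dicts, each with one 'thermal' and one 'visible' frame
```

Because the barrier forces the two capture threads into lockstep and the queue is FIFO, each extracted pair is guaranteed to contain one frame from each stream of the same round.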
5. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the process of acquiring sets of pairs of thermal infrared and visible light images includes:
and pairing the thermal infrared image and the visible light image that share the same reference time to obtain a plurality of groups of thermal infrared and visible light image pairs.
6. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the process of obtaining the position coordinates of the calibration plate corner points comprises the following steps:
processing the thermal infrared and visible light image pair through an edge detection operator to obtain the integer-pixel edge positions of the checkerboard;
based on the edge positions, selecting an edge pixel point as the center of the sub-pixel points, obtaining the normal direction at this center, obtaining the sub-pixel points along the normal at single-pixel spacing, and constructing a fitting function based on the gray values of the sub-pixel points, wherein the fitting function is an arc tangent function;
solving the fitting function by a least squares method to obtain an edge function model;
taking the second derivative of the edge function model to obtain the sub-pixel edge points;
and fitting edge lines through the sub-pixel edge points in four directions around the coarse corner point to obtain an intersection point, wherein the intersection point serves as the position coordinate of the calibration plate corner point.
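The final line-fitting step can be illustrated numerically (assuming NumPy; the edge-point coordinates below are made-up values, with one edge line per direction for brevity): each edge line is fitted to its sub-pixel edge points by least squares, and the refined corner is the intersection of two such lines.

```python
import numpy as np

def fit_line(points):
    """Least-squares line a*x + b*y = c through points, via SVD.

    The line normal (a, b) is the direction of smallest variance of the
    centered points; c follows from the centroid lying on the line.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]                      # last right-singular vector = normal
    return a, b, a * centroid[0] + b * centroid[1]

def intersect(l1, l2):
    """Intersection of two lines given as (a, b, c) with a*x + b*y = c."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    return np.linalg.solve(np.array([[a1, b1], [a2, b2]]),
                           np.array([c1, c2]))

# synthetic sub-pixel edge points around a corner assumed at (10.25, 7.5)
horiz = [(10.25 + t, 7.5) for t in (1, 2, 3)]   # points along one edge
vert = [(10.25, 7.5 + t) for t in (1, 2, 3)]    # points along the other edge
corner = intersect(fit_line(horiz), fit_line(vert))
print(corner)  # ≈ (10.25, 7.5)
```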
7. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the gray values of the sub-pixel points are obtained by linear interpolation in the horizontal direction and the vertical direction.
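A minimal sketch of that interpolation (assuming NumPy and `img[row, col]` indexing; the sample image is a made-up 2x2 array): the gray value at a sub-pixel location combines the four surrounding integer pixels, weighted linearly in each direction.

```python
import numpy as np

def bilinear(img, x, y):
    """Gray value at sub-pixel (x, y): linear interpolation horizontally
    and vertically between the four surrounding integer pixels."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0]
            + dx * (1 - dy) * img[y0, x0 + 1]
            + (1 - dx) * dy * img[y0 + 1, x0]
            + dx * dy * img[y0 + 1, x0 + 1])

img = np.array([[0.0, 10.0],
                [20.0, 30.0]])
print(bilinear(img, 0.5, 0.5))  # 15.0, the average of the four pixels
```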
8. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the process of solving the optimal homography transformation matrix comprises the following steps:
matching the position coordinates of the calibration plate corner points in the thermal infrared and visible light image pairs to obtain a coordinate pair set, and randomly selecting a plurality of coordinate pairs as an interior point set;
constructing an initial homography matrix by a least squares method based on the interior point set;
calculating, through the initial homography matrix, the projection errors of the coordinate pairs in the coordinate pair set other than the interior point set, and comparing each projection error with a threshold value: when the projection error is smaller than the threshold value, adding the coordinate pair to the interior point set, and when the projection error is larger than the threshold value, not adding it;
counting the number of coordinate pairs in the updated interior point set;
and repeating the process from constructing the homography matrix to counting the number of coordinate pairs until the number of iterations is reached, to obtain the optimal homography matrix, wherein the optimal homography matrix is the homography matrix whose corresponding interior point set contains the largest number of coordinate pairs.
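The loop of claim 8 follows the classic RANSAC scheme over corner correspondences. A compact sketch (assuming NumPy; the direct linear transform for the least-squares fit, the threshold, iteration count and synthetic data are illustrative choices, not taken from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)

def homography_dlt(src, dst):
    """Least-squares homography from >= 4 point pairs (direct linear transform)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    H = vt[-1].reshape(3, 3)          # null vector minimizes the residual
    return H / H[2, 2]

def project(H, pts):
    """Apply homography H to N x 2 points (homogeneous divide)."""
    p = np.c_[pts, np.ones(len(pts))] @ H.T
    return p[:, :2] / p[:, 2:3]

def ransac_homography(src, dst, iters=200, thresh=1.0):
    best = np.zeros(len(src), bool)
    for _ in range(iters):
        sample = rng.choice(len(src), 4, replace=False)  # random seed set
        H = homography_dlt(src[sample], dst[sample])
        err = np.linalg.norm(project(H, src) - dst, axis=1)  # projection error
        inliers = err < thresh
        if inliers.sum() > best.sum():                   # keep largest set
            best = inliers
    return homography_dlt(src[best], dst[best]), best    # refit on best set

# synthetic demo: 50 corner pairs, the first 10 corrupted by large offsets
H_true = np.array([[1.02, 0.01, 5.0],
                   [-0.01, 0.98, -3.0],
                   [1e-4, 0.0, 1.0]])
src = rng.uniform(0, 100, (50, 2))
dst = project(H_true, src)
dst[:10] += rng.uniform(20, 50, (10, 2))                 # outliers
H, inliers = ransac_homography(src, dst)
```

With the outlier offsets far above the 1-pixel threshold, the recovered interior point set contains exactly the 40 uncorrupted pairs, and the refitted matrix matches the true homography closely.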
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210366866.9A CN114897947A (en) | 2022-04-08 | 2022-04-08 | Thermal infrared and visible light image synchronous registration method based on time-space unification |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114897947A true CN114897947A (en) | 2022-08-12 |
Family
ID=82715216
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210366866.9A Pending CN114897947A (en) | 2022-04-08 | 2022-04-08 | Thermal infrared and visible light image synchronous registration method based on time-space unification |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114897947A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017029784A1 (en) * | 2015-08-19 | 2017-02-23 | NEC Corporation | Image position matching system, method and recording medium
CN107170001A (en) * | 2017-04-25 | 2017-09-15 | 北京海致网聚信息技术有限公司 | Method and apparatus for carrying out registration to image |
CN111260731A (en) * | 2020-01-10 | 2020-06-09 | 大连理工大学 | Checkerboard sub-pixel level corner point self-adaptive detection method |
Non-Patent Citations (2)
Title |
---|
YIHAO C. et al.: "MRSI: A multimodal proximity remote sensing data set for environment perception in rail transit", pages 5530-5556 *
LIANG Jin et al.: "3D Reverse Engineering Technology", Wuhan: Huazhong University of Science and Technology Press, pages 78-79 *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116718165A (en) * | 2023-06-08 | 2023-09-08 | 中国矿业大学 | Combined imaging system based on unmanned aerial vehicle platform and image enhancement fusion method |
CN116718165B (en) * | 2023-06-08 | 2024-05-14 | 中国矿业大学 | Combined imaging system based on unmanned aerial vehicle platform and image enhancement fusion method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108510530B (en) | Three-dimensional point cloud matching method and system | |
CN109903227B (en) | Panoramic image splicing method based on camera geometric position relation | |
WO2019105044A1 (en) | Method and system for lens distortion correction and feature extraction | |
CN111339951A (en) | Body temperature measuring method, device and system | |
CN109903341A (en) | Join dynamic self-calibration method outside a kind of vehicle-mounted vidicon | |
CN106595528A (en) | Digital speckle-based telecentric microscopic binocular stereoscopic vision measurement method | |
CN105608671A (en) | Image connection method based on SURF algorithm | |
CN106657789A (en) | Thread panoramic image synthesis method | |
CN107392849B (en) | Target identification and positioning method based on image subdivision | |
CN109584238A (en) | A kind of bow net operation conditions on-line detecting system and method based on stereoscopic vision | |
CN109544628A (en) | A kind of the accurate reading identifying system and method for pointer instrument | |
CN107358628B (en) | Linear array image processing method based on target | |
CN106504290A (en) | A kind of high-precision video camera dynamic calibrating method | |
CN106447601A (en) | Unmanned aerial vehicle remote image mosaicing method based on projection-similarity transformation | |
CN110763204A (en) | Planar coding target and pose measurement method thereof | |
CN112541932A (en) | Multi-source image registration method based on different focal length transformation parameters of dual-optical camera | |
JP4694624B2 (en) | Image correction apparatus and method, and computer program | |
JPWO2010095460A1 (en) | Image processing system, image processing method, and image processing program | |
CN107730558A (en) | 360 ° of vehicle-running recording systems and method based on two-way fish eye camera | |
CN106056121A (en) | Satellite assembly workpiece fast-identification method based on SIFT image feature matching | |
CN116205993A (en) | Double-telecentric lens high-precision calibration method for 3D AOI | |
CN114998447A (en) | Multi-view vision calibration method and system | |
CN116129037A (en) | Visual touch sensor, three-dimensional reconstruction method, system, equipment and storage medium thereof | |
CN114897947A (en) | Thermal infrared and visible light image synchronous registration method based on time-space unification | |
CN117115272A (en) | Telecentric camera calibration and three-dimensional reconstruction method for precipitation particle multi-angle imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20220812 |