CN114897947A - A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space - Google Patents

A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space

Info

Publication number
CN114897947A
Authority
CN
China
Prior art keywords
visible light
thermal infrared
thread
image
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210366866.9A
Other languages
Chinese (zh)
Inventor
吴澄
汪一鸣
盛洁
张瑾
牛伟龙
陈一豪
王庆亮
梅震琨
何印
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Rail Transit Group Co ltd
Suzhou University
Original Assignee
Suzhou Rail Transit Group Co ltd
Suzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Rail Transit Group Co ltd, Suzhou University filed Critical Suzhou Rail Transit Group Co ltd
Priority to CN202210366866.9A priority Critical patent/CN114897947A/en
Publication of CN114897947A publication Critical patent/CN114897947A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10048 - Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for the synchronous registration of thermal infrared and visible light images based on time-space unification. The method comprises: building a chessboard calibration plate and collecting thermal infrared and visible light images containing the calibration plate, where the thermal infrared and visible light images share a unified reference time, and obtaining several thermal infrared / visible light image pairs from the collected images and the reference time; processing the thermal infrared and visible light image pairs by sub-pixel edge corner detection to obtain the position coordinates of the calibration plate corners; and solving an optimal homography transformation matrix from the corner position coordinates, the optimal homography transformation matrix being used to achieve synchronous registration of the thermal infrared and visible light images. The invention provides good prior work for the fusion of thermal infrared and visible light images and guarantees the scene consistency of multi-modal data.

Description

A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space

Technical Field

The present invention relates to the technical field of thermal infrared and visible light image processing, and in particular to a synchronous registration method for thermal infrared and visible light images based on time-space unification.

Background Art

Image analysis and processing covers tasks such as image segmentation, stitching, reconstruction and detection. Methods that compare or fuse the useful information of multiple images are gradually becoming mainstream in the image processing field. Whether the image information is single-modal or multi-modal, misalignment of the observed region or object is unavoidable. Image registration, an essential step in image processing, aims to find the best spatial alignment between two or more images of the same scene, and the quality of the registration result directly affects all subsequent work.

Existing image registration algorithms fall into two broad categories: methods based on grayscale statistics and methods based on feature matching. Feature-matching registration, thanks to its high speed and good robustness, has become the most widely applied and most frequently used approach. Algorithms such as SIFT and SURF perform well on single-modal images and have therefore been studied and applied extensively. In practice, however, single-modal image data are easily affected by adverse environments, especially when the sensor is moving, which leads to inaccurate and incomplete information. On the one hand, when the illumination is poor, the feature information conveyed by a visible light image is greatly reduced, whereas a thermal infrared image is resistant to illumination interference; on the other hand, current thermal infrared images generally have low resolution and blurred object edges, whereas visible light images have high resolution and rich feature information. Exploiting the respective imaging characteristics of visible light and thermal infrared sensors to form a unified representation of multi-modal data therefore greatly benefits subsequent image analysis tasks. However, for multi-modal thermal infrared and visible light data, even the more advanced current algorithms can hardly find feature correspondences that match correctly. Moreover, because the sensors differ in imaging resolution and focal length, a rigid or affine transformation cannot simply be used as the transformation model, so high-precision registration of the two types of images cannot be achieved.

At the same time, image registration in the broad sense should also include the unification of time. Because different sensors have different signal sampling frequencies and data storage times, data collected at the same nominal moment may not originate from the same real scene. The content of the fused multi-modal data is then mismatched, producing more noise and irrelevant items than single-modal data and making it harder to highlight the advantages of each modality.

Summary of the Invention

To solve the above technical problems in the prior art, the present invention provides a synchronous registration method for thermal infrared and visible light images based on time-space unification. The invention provides good prior work for the fusion of thermal infrared and visible light images and guarantees the scene consistency of multi-modal data.

To achieve the above technical purpose, the present invention provides the following technical solution:

A synchronous registration method for thermal infrared and visible light images based on time-space unification, comprising:

building a chessboard calibration plate and collecting thermal infrared images and visible light images containing the chessboard calibration plate, wherein the thermal infrared images and the visible light images have a unified reference time, and obtaining several thermal infrared / visible light image pairs from the collected images and the reference time;

processing the thermal infrared and visible light image pairs by sub-pixel edge corner detection to obtain the position coordinates of the calibration plate corners;

solving an optimal homography transformation matrix from the position coordinates of the calibration plate corners, the optimal homography transformation matrix being used to achieve synchronous registration of the thermal infrared and visible light images.

Optionally, the chessboard calibration plate comprises a white-square background and black squares, where the white-square background is an aluminum plate and the black squares are black rubber cloth.

Optionally, when the thermal infrared and visible light images are collected, the reference time of the thermal infrared image and of the visible light image is unified by a multi-thread cross-processing method.

Optionally, the process of unifying the reference time by the multi-thread cross-processing method comprises:

collecting data streams with an infrared thermal imager and a visible light camera, respectively;

assigning a thread to each collected data stream to obtain a first thread and a second thread, where the first thread processes the data stream of the infrared thermal imager and the second thread processes the data stream of the visible light camera;

when the first thread or the second thread reaches the data storage step, the thread that has reached the data storage step waits until the other thread also reaches the data storage step;

when both the first thread and the second thread have reached the data storage step, caching the data streams of the first thread and the second thread;

extracting the cached data streams with a third thread, converting them into image data, and releasing the cache;

after the cache is released, caching the data streams collected in real time by the first and second threads again, and repeating the cache, extract, convert and release cycle until the threads terminate, yielding the thermal infrared images and the visible light images.

Optionally, the process of obtaining several thermal infrared / visible light image pairs comprises:

integrating the thermal infrared image and the visible light image that share the same reference time into a pair, yielding several thermal infrared / visible light image pairs.
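
As a small illustration of this pairing step, the sketch below groups frames by their shared reference time. It is an assumption of ours that each stored frame carries its reference timestamp; the function name and data layout are illustrative, not taken from the patent.

```python
def pair_by_reference_time(ir_frames, vis_frames):
    """Group thermal and visible frames that share the same reference timestamp.

    ir_frames / vis_frames: lists of (reference_time, image) tuples, as stored by
    the synchronized acquisition threads.
    """
    vis_by_time = {t: img for t, img in vis_frames}
    # Keep only timestamps for which both modalities stored a frame.
    return [(ir_img, vis_by_time[t]) for t, ir_img in ir_frames if t in vis_by_time]
```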

Optionally, the process of obtaining the position coordinates of the calibration plate corners comprises:

processing the thermal infrared and visible light image pair with an edge detection operator to obtain the integer-pixel edge positions of the chessboard;

based on the edge positions, selecting an edge pixel as the center of the sub-pixel samples; obtaining the normal through this center; obtaining the sub-pixel sample points along the normal at single-pixel spacing; and building a fitting function from the gray values of the sub-pixel points, where the fitting function is an arctangent function;

solving the fitting function by the least squares method to obtain an edge function model;

computing the second derivative of the edge function model to obtain the sub-pixel edge points;

fitting edge lines through the sub-pixel edge points in the four directions around a coarse corner point and taking their intersection, the intersection being the position coordinate of the calibration plate corner.

Optionally, the gray value of a sub-pixel point is computed by linear interpolation in the horizontal and vertical directions.

Optionally, the process of solving the optimal homography transformation matrix comprises:

matching the position coordinates of the calibration plate corners in the thermal infrared and visible light image pairs to obtain a set of coordinate pairs, and randomly selecting several coordinate pairs as the inlier set;

constructing an initial homography matrix from the inlier set by the least squares method;

computing, with the initial homography matrix, the projection error of every coordinate pair outside the inlier set and judging the error: if the projection error is smaller than a threshold, the coordinate pair is added to the inlier set; if the projection error is larger than the threshold, it is not added;

counting the number of coordinate pairs in the updated inlier set;

repeating the process from constructing the homography matrix to counting the coordinate pairs until the number of iterations is reached, and obtaining the optimal homography matrix, namely the homography matrix whose inlier set contains the largest number of coordinate pairs.

The present invention has the following technical effects:

(1) The invention uses multi-thread cross control and introduces identifiers and a global monitoring mechanism, which effectively reduces the acquisition time difference between the sensors and keeps the imaging timestamps synchronized within an ultra-low latency range. The method is not limited to the offline registration stage and can also be used in real-time applications;

(2) The invention fits edge lines to the gray distribution of the chessboard edges and uses the resulting corner positions, instead of feature points, to unify the spatial scene. This effectively reduces the errors introduced by feature point extraction and matching and improves registration accuracy and speed; the method is simple and easy to implement.

Brief Description of the Drawings

To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required by the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.

Fig. 1 is a schematic flowchart of the synchronous registration method for thermal infrared and visible light images based on time-space unification according to the present invention;

Fig. 2 is the image acquisition flowchart of the present invention, taking the infrared thread as an example;

Fig. 3 is a schematic diagram of the installation platform for the infrared thermal imager and the visible light camera of the present invention;

Fig. 4 is a schematic diagram of the 5×5 neighborhood of an edge pixel of the present invention;

Fig. 5 is a schematic diagram of the bidirectional linear gray interpolation of sub-pixel points of the present invention;

Fig. 6 shows the visible light image and the infrared images before and after registration according to the present invention.

Detailed Description of the Embodiments

The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

To solve the problems existing in the prior art, the present invention provides the following solution:

As shown in Fig. 1, the present invention provides a synchronous registration method for thermal infrared and visible light images based on time-space unification. Its flow is as follows: multi-thread cross control of the infrared thermal imager and the visible light camera completes the synchronized storage of the data stream information; after preprocessing the resulting images, a sub-pixel edge corner detection method precisely locates the chessboard corners in both types of images; finally, a random sample consensus algorithm solves the optimal homography transformation matrix and completes the registration.

Technical route: first, the reference time of the data collected by the infrared thermal imager and the visible light camera is unified by a multi-thread cross-processing method; next, a chessboard calibration plate whose black and white squares have different emissivities is built, and the installation positions of the infrared thermal imager and the visible light camera are adjusted and fixed so that the sensor with the larger field of view fully contains the scene captured by the other sensor; further, both devices simultaneously capture images of the calibration plate at different positions, angles and attitudes, and noise filtering preprocessing is applied; then sub-pixel edge corner detection obtains the position coordinates of the calibration plate corners in each image group, which are saved in order; finally, a robust method based on random sample consensus, combined with the least squares method, solves the optimal homography transformation matrix and completes the registration.

Embodiment 1

A synchronous registration method for thermal infrared and visible light images based on time-space unification in this embodiment comprises the following steps:

Step 1: Following the image acquisition flowchart in Fig. 2 (taking the infrared thread as an example), unify the reference time of the data collected by the infrared thermal imager and the visible light camera with the multi-thread cross-processing method. The specific steps are:

Step 1.1: Start the multi-thread module; the infrared thermal imager and the visible light camera each begin to collect a data stream;

Step 1.2: Assign a first identifier 1 and a second identifier 2 to the thermal infrared and visible light processing threads, with the initial value set to False;

Step 1.3: Establish a global monitoring mechanism. When one thread reaches the data stream storage step, its identifier is changed to True. If the identifier of the other thread is still False, the first thread waits until the other thread also reaches the data stream storage step and its identifier is changed to True;

Step 1.4: When identifiers 1 and 2 are both True, the two threads store the frame information of the current data streams into the high-speed cache;

Step 1.5: After the frame information has been stored, the corresponding identifiers are changed back to False;

Step 1.6: Without affecting the synchronization flow, another thread extracts the frame information from the cache, saves it as image data, and releases the cache;

Step 1.7: Return to Step 1.3 to store the next piece of data stream information, and loop until the threads terminate.
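
A minimal Python sketch of this flag-and-wait scheme is given below. It is an illustration only: a threading.Barrier stands in for the two identifiers plus the global monitor of Steps 1.2 to 1.4, OpenCV is assumed as the frame source, and names such as capture_loop and saver_loop are ours.

```python
import queue
import threading

import cv2  # assumed frame source; any object with read() -> (ok, frame) works

# The barrier plays the role of the two True/False identifiers plus the global
# monitor: neither thread may store a frame until both have reached the storage step.
store_barrier = threading.Barrier(2)
frame_cache = queue.Queue()   # high-speed cache shared by the two capture threads
stop_event = threading.Event()

def capture_loop(name, source):
    """Capture thread for one sensor (thermal infrared or visible light)."""
    cap = cv2.VideoCapture(source)
    index = 0
    while not stop_event.is_set():
        ok, frame = cap.read()                  # keep pulling the live data stream
        if not ok:
            break
        store_barrier.wait()                    # wait for the other sensor (Steps 1.3-1.4)
        frame_cache.put((name, index, frame))   # store frame info into the cache
        index += 1                              # same index on both sides = one pair
    cap.release()

def saver_loop(out_dir):
    """Third thread (Step 1.6): drain the cache and persist frames as image files."""
    while not stop_event.is_set() or not frame_cache.empty():
        try:
            name, index, frame = frame_cache.get(timeout=0.5)
        except queue.Empty:
            continue
        cv2.imwrite(f"{out_dir}/{name}_{index:06d}.png", frame)

threads = [
    threading.Thread(target=capture_loop, args=("ir", 0)),    # infrared thermal imager
    threading.Thread(target=capture_loop, args=("vis", 1)),   # visible light camera
    threading.Thread(target=saver_loop, args=(".",)),
]
for t in threads:
    t.start()
```

Because both capture threads must pass the barrier before either stores a frame, paired files share the same index and their acquisition times differ only by the barrier latency.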

Step 2: Build a chessboard calibration plate whose black and white squares have different emissivities: a smooth aluminum plate with low emissivity serves as the white-square background, and rough black rubber cloth with high emissivity serves as the black squares. The visible light camera can still identify the corners from the color information; at the same time, at the same temperature the infrared thermal imager passively captures a chessboard image with sharply defined corners, which eliminates the corner blurring error caused by active heating and enables effective synchronous acquisition by the two sensors;

Step 3: Connect the two sensor devices to the same industrial computer running the multi-thread cross-control acquisition method of Step 1. Using the infrared thermal imager and visible light camera installation platform shown in Fig. 3, adjust and fix the installation positions of the two sensors so that the sensor with the larger field of view fully contains the scene captured by the other sensor. Then use both devices to capture images of the calibration plate at different positions, angles and attitudes, record several thermal infrared / visible light image pairs, and apply noise filtering preprocessing;

Step 4: Use sub-pixel edge corner detection to obtain the position coordinates of the calibration plate corners in each image group and save them in order. The specific steps are:

Step 4.1: Use the Canny operator to obtain a first estimate of the integer-pixel edge positions of the chessboard;
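
A rough sketch of this step follows; the blur kernel and the Canny thresholds are assumptions of ours, since the text does not fix them.

```python
import cv2
import numpy as np

def whole_pixel_edges(gray):
    """Return the Canny edge map and the integer-pixel coordinates of the edge points."""
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # noise filtering before edge detection
    edges = cv2.Canny(blurred, 50, 150)           # thresholds are illustrative only
    ys, xs = np.nonzero(edges)
    return edges, np.stack([xs, ys], axis=1)      # (x, y) integer edge positions
```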

Step 4.2: Perform corner detection on the integer-pixel edge positions and select any pixel A on one of the four edge directions of a coarse corner O as the center of the sub-pixel samples. In the 5×5 neighborhood shown in Fig. 4, the line connecting the two outermost edge pixels is taken as the tangent within that neighborhood, giving the angle θ with the horizontal direction and the normal F perpendicular to it;

Step 4.3: Along the normal direction, place two sub-pixel points a1, a2 on one side of the center pixel A and two sub-pixel points a3, a4 on the other side, at single-pixel spacing, giving 5 sub-pixel points along the normal;
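
The positions of these five normal-direction samples can be generated, for instance, as follows (a helper of our own that simply applies the x = cosθ, y = sinθ convention used in the next step):

```python
import numpy as np

def normal_samples(ax, ay, theta):
    """Sub-pixel sample positions a2, a1, A, a3, a4 along the normal of edge pixel A."""
    k = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])     # signed distances in single-pixel steps
    return ax + k * np.cos(theta), ay + k * np.sin(theta)
```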

Step 4.4: As shown in Fig. 5, take a1 as an example. Let the coordinates of this sub-pixel point, which lies close to the center, be (x, y) within its 4-pixel integer neighborhood, with distance 1 to the center pixel (0, 0); then x = cosθ and y = sinθ. First compute the linear interpolation, in the horizontal direction, of the gray values at pixels (0, 0) and (1, 0), and of the gray values at pixels (0, 1) and (1, 1):

Ix0 = (1 - x)·I00 + x·I10

Ix1 = (1 - x)·I01 + x·I11

where I00, I01, I10, I11 denote the gray values at pixel coordinates (0, 0), (0, 1), (1, 0), (1, 1), respectively. Then apply linear interpolation in the vertical direction to obtain the gray value Ixy of the sub-pixel point:

Ixy = (1 - y)·Ix0 + y·Ix1
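
A small helper consistent with these interpolation formulas is sketched below; the function name and the row/column indexing convention are ours, and image bounds are not checked.

```python
import numpy as np

def subpixel_gray(img, ax, ay, dx, dy):
    """Bilinearly interpolated gray value at (ax + dx, ay + dy).

    (ax, ay) is the integer center pixel A; (dx, dy) is the offset along the
    normal, e.g. dx = cos(theta), dy = sin(theta) for the nearer samples.
    """
    x0, y0 = int(np.floor(ax + dx)), int(np.floor(ay + dy))
    fx, fy = (ax + dx) - x0, (ay + dy) - y0
    I00 = float(img[y0,     x0    ])    # img is indexed as img[row, column] = img[y, x]
    I10 = float(img[y0,     x0 + 1])
    I01 = float(img[y0 + 1, x0    ])
    I11 = float(img[y0 + 1, x0 + 1])
    row0 = (1 - fx) * I00 + fx * I10    # horizontal interpolation, lower row
    row1 = (1 - fx) * I01 + fx * I11    # horizontal interpolation, upper row
    return (1 - fy) * row0 + fy * row1  # vertical interpolation
```

Note that this helper interpolates within the enclosing unit pixel cell, which matches Step 4.4; for the farther samples of Step 4.5 the same idea applies, but with the interpolation taken between the corner pixels of the 3×3 neighborhood (weights m/2 and n/2).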

Step 4.5: As shown in Fig. 5, take a2 as an example. Similarly to Step 4.4, let the coordinates of this sub-pixel point, which lies farther from the center, be (m, n) within its 9-pixel integer neighborhood, with distance 2 to the center pixel (0, 0); then m = 2cosθ and n = 2sinθ. First compute the linear interpolation, in the horizontal direction, of the gray values at pixels (0, 0) and (2, 0), and of the gray values at pixels (0, 2) and (2, 2):

Im0 = (1 - m/2)·I00 + (m/2)·I20

Im2 = (1 - m/2)·I02 + (m/2)·I22

where I02, I20, I22 denote the gray values at pixel coordinates (0, 2), (2, 0), (2, 2), respectively. Then apply linear interpolation in the vertical direction to obtain the gray value Imn of the sub-pixel point:

Imn = (1 - n/2)·Im0 + (n/2)·Im2

Step 4.6: Use the gray values of the 5 sub-pixel points along the normal direction and fit the actual gray distribution of the edge with an arctangent function. The fitting function is:

y = a1·arctan(a2·x + a3) + a4

where y is the gray value of a sub-pixel point, x is the signed distance from the sub-pixel point to point A, a1 scales the amplitude of the arctangent function by a factor of a1, a2 and a3 control how sharply the arctangent curve bends, and a4 is the vertical offset of the arctangent function. The values of a1, a2, a3, a4 are solved by least squares fitting, giving the edge function model;

Step 4.7: Define the sub-pixel edge point E as the coordinate where the slope of the fitted function is largest, obtained by taking the second derivative of the edge function:

y′ = a1·a2 / (1 + (a2·x + a3)²)

y″ = -2·a1·a2²·(a2·x + a3) / (1 + (a2·x + a3)²)²

Setting y″ = 0 gives x = -a3/a2, which is the position of the sub-pixel edge point E along the normal direction. Combining this with θ gives the complete offset of the sub-pixel edge point relative to point A: the horizontal offset is (-a3/a2)·cosθ and the vertical offset is (-a3/a2)·sinθ. Since the coordinates of point A are known, the final coordinates of the sub-pixel edge point E are obtained;
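
Steps 4.6 and 4.7 can be sketched as below; scipy.optimize.curve_fit is one possible least-squares solver, and the initial guess and sample ordering are assumptions of ours.

```python
import numpy as np
from scipy.optimize import curve_fit

def edge_model(x, a1, a2, a3, a4):
    """Arctangent edge profile y = a1*arctan(a2*x + a3) + a4 (Step 4.6)."""
    return a1 * np.arctan(a2 * x + a3) + a4

def subpixel_edge_point(ax, ay, theta, grays):
    """Sub-pixel edge point E near edge pixel A = (ax, ay).

    `grays` holds the gray values of the five normal-direction samples at signed
    distances -2, -1, 0, 1, 2 from A (a2, a1, A, a3, a4 in the text).
    """
    dists = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    p0 = [max(np.ptp(grays), 1.0) / 2.0, 1.0, 0.0, float(np.mean(grays))]  # rough start
    (a1, a2, a3, a4), _ = curve_fit(edge_model, dists, np.asarray(grays, float),
                                    p0=p0, maxfev=5000)
    d = -a3 / a2                      # zero of y'': position of maximum slope on the normal
    return ax + d * np.cos(theta), ay + d * np.sin(theta)  # offsets follow Step 4.4's convention
```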

Step 4.8: Take the pixels on the other three edge directions of the coarse corner O and obtain the corresponding sub-pixel edge points following Steps 4.2 to 4.7. Fit the sub-pixel edge points of opposite directions to obtain two intersecting edge lines, and take their intersection as the corner coordinate, which is saved in order.

Step 5: From the set S of sub-pixel corner coordinate pairs obtained above, use the robust method based on random sample consensus, combined with the least squares method, to solve the optimal homography transformation matrix, as shown in Fig. 6, where (a) is the visible light image and (b) and (c) are the infrared images before and after registration, respectively. The specific steps are:

Step 5.1: Randomly select n = 4 pairs of matched feature points from the initial matching set S as the inlier set Si, convert the overdetermined equations into a nonlinear optimization problem, and estimate the initial homography matrix Hi by least squares fitting;

Step 5.2: Use the Hi obtained in Step 5.1 to evaluate the remaining matching point pairs in S. If the projection error of a feature point is smaller than the threshold t = 5, add it to Si; otherwise treat it as an outlier;

Step 5.3: Record the number of matching point pairs in Si;

Step 5.4: Repeat the above steps until the number of iterations exceeds K = 200;

Step 5.5: Select the model with the largest number of inliers as the required homography matrix H. The optimal homography transformation matrix is then used to achieve synchronous registration of the thermal infrared and visible light images.
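
OpenCV's findHomography with the RANSAC flag implements the same random-sampling scheme; a sketch follows, with the threshold t = 5 and the iteration count K = 200 taken from the steps above. Passing maxIters as a keyword assumes a reasonably recent OpenCV build.

```python
import cv2
import numpy as np

def estimate_homography(ir_corners, vis_corners):
    """Optimal homography H mapping thermal-infrared corners onto the visible image.

    ir_corners, vis_corners: N x 2 arrays of matched calibration-plate corner
    coordinates accumulated over all image pairs (the set S of Step 5).
    """
    src = np.asarray(ir_corners, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(vis_corners, dtype=np.float32).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC,
                                        ransacReprojThreshold=5.0, maxIters=200)
    return H, inlier_mask

# Registration itself: warp the thermal image into the visible camera's pixel grid.
# H, _ = estimate_homography(ir_pts, vis_pts)
# registered_ir = cv2.warpPerspective(ir_image, H, (vis_image.shape[1], vis_image.shape[0]))
```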

The above shows and describes the basic principles, main features and advantages of the present invention. Those skilled in the art should understand that the present invention is not limited to the above embodiments; the above embodiments and the description only illustrate the principles of the present invention. Without departing from the spirit and scope of the present invention, the invention admits various changes and improvements, all of which fall within the scope of the claimed invention. The protection scope of the present invention is defined by the appended claims and their equivalents.

Claims (8)

1. A thermal infrared and visible light image synchronous registration method based on time-space unification is characterized by comprising the following steps:
the method comprises the steps of building a chessboard calibration plate, collecting thermal infrared images and visible light images comprising the chessboard calibration plate, wherein the thermal infrared images and the visible light images have uniform reference time, and acquiring a plurality of groups of thermal infrared and visible light image pairs based on the collected images and the reference time;
processing the thermal infrared and visible light image pair through sub-pixel edge corner detection to obtain the position coordinates of the calibration plate corner;
and solving an optimal homography transformation matrix based on the position coordinates of the calibration plate corner points, wherein the optimal homography transformation matrix is used for realizing synchronous registration of the thermal infrared image and the visible light image.
2. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the chessboard calibration plate comprises a white lattice background and black lattices, wherein the white lattice background uses an aluminum plate, and the black lattices use black rubber cloth.
3. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
when the thermal infrared image and the visible light image are collected, the reference time of the thermal infrared image and the reference time of the visible light image are unified through a multithreading cross processing method.
4. The method for synchronously registering the thermal infrared image and the visible light image based on the time-space unification as claimed in claim 1, wherein:
the process of unifying the reference time by the multithread cross processing method comprises the following steps:
respectively acquiring data streams through an infrared thermal imager and a visible light camera;
assigning threads to the data streams acquired by the infrared thermal imager and the visible light camera to obtain a first thread and a second thread, wherein the first thread is the processing thread for the data stream acquired by the infrared thermal imager, and the second thread is the processing thread for the data stream acquired by the visible light camera;
when the first thread or the second thread reaches the data storage link, the thread reaching the data storage link starts to wait until the other thread also reaches the data storage link;
when the first thread and the second thread both reach a data storage link, caching data streams in the first thread and the second thread;
extracting the cached data streams through a third thread, respectively converting the data streams into image data, and simultaneously releasing the cache;
after releasing the cache, caching the data streams collected in real time in the first thread and the second thread again, and obtaining the thermal infrared image and the visible light image through the processes of circularly caching, extracting, converting and releasing the cache until the threads are finished.
5. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the process of acquiring sets of pairs of thermal infrared and visible light images includes:
and correspondingly integrating the thermal infrared image and the visible light image at the same reference time to obtain a plurality of groups of thermal infrared and visible light image pairs.
6. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the process of obtaining the position coordinates of the calibration plate corner points comprises the following steps:
processing the thermal infrared and visible light image pair through an edge detection operator to obtain the integer-pixel edge positions of the checkerboard;
based on the edge position, selecting an edge pixel point as a center of a sub-pixel point, based on the center of the sub-pixel point, obtaining a normal line, based on the normal line and a single-pixel distance, obtaining the sub-pixel point, and based on a gray value of the sub-pixel point, constructing a fitting function, wherein the fitting function is an arc tangent function;
solving the fitting function by a least square method to obtain an edge function model;
calculating the edge function model through secondary differentiation to obtain sub-pixel edge points;
and fitting edge lines through the sub-pixel edge points in four directions around the coarse corner point to obtain an intersection point, wherein the intersection point is used as the position coordinate of the calibration plate corner point.
7. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the gray values of the sub-pixel points are obtained by calculation through linear interpolation in the horizontal direction and the vertical direction.
8. The synchronous registration method based on temporal-spatial unification of thermal infrared and visible light images according to claim 1, wherein:
the process of solving the optimal homography transformation matrix comprises the following steps:
matching position coordinates of calibration board corner points in the thermal infrared image pair and the visible light image pair to obtain a coordinate pair set, and randomly selecting a plurality of coordinate pairs as an inner point set;
constructing an initial homography matrix by a least square method based on the interior point set;
calculating projection errors of coordinate pairs in the coordinate pair set except for the interior point set through the initial homography matrix, judging errors of projection error calculation results, adding the coordinate pairs into the interior point set when the projection errors are smaller than a threshold value, and not adding the coordinate pairs into the interior point set when the projection errors are larger than the threshold value;
counting the number of coordinate pairs in the updated inner point set;
and repeating the process from constructing the homography matrix to counting the number of coordinate pairs until the number of iterations is reached, thereby obtaining the optimal homography matrix, wherein the optimal homography matrix is the homography matrix whose corresponding interior point set contains the largest number of coordinate pairs.
CN202210366866.9A 2022-04-08 2022-04-08 A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space Pending CN114897947A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210366866.9A CN114897947A (en) 2022-04-08 2022-04-08 A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210366866.9A CN114897947A (en) 2022-04-08 2022-04-08 A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space

Publications (1)

Publication Number Publication Date
CN114897947A true CN114897947A (en) 2022-08-12

Family

ID=82715216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210366866.9A Pending CN114897947A (en) 2022-04-08 2022-04-08 A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space

Country Status (1)

Country Link
CN (1) CN114897947A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116718165A (en) * 2023-06-08 2023-09-08 中国矿业大学 Combined imaging system based on unmanned aerial vehicle platform and image enhancement fusion method
CN116958218A (en) * 2023-08-14 2023-10-27 苏州大学 A point cloud and image registration method and equipment based on calibration plate corner point alignment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017029784A1 (en) * 2015-08-19 2017-02-23 日本電気株式会社 Image position matching system, method and recording medium
CN107170001A (en) * 2017-04-25 2017-09-15 北京海致网聚信息技术有限公司 Method and apparatus for carrying out registration to image
CN111260731A (en) * 2020-01-10 2020-06-09 大连理工大学 Checkerboard sub-pixel level corner point self-adaptive detection method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017029784A1 (en) * 2015-08-19 2017-02-23 日本電気株式会社 Image position matching system, method and recording medium
CN107170001A (en) * 2017-04-25 2017-09-15 北京海致网聚信息技术有限公司 Method and apparatus for carrying out registration to image
CN111260731A (en) * 2020-01-10 2020-06-09 大连理工大学 Checkerboard sub-pixel level corner point self-adaptive detection method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YIHAO C. et al.: "MRSI: A multimodal proximity remote sensing data set for environment perception in rail transit", pages 5530 - 5556 *
梁晋 et al.: "3D反求技术" (3D Reverse Engineering), Wuhan: Huazhong University of Science and Technology Press, pages 78 - 79 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116718165A (en) * 2023-06-08 2023-09-08 中国矿业大学 Combined imaging system based on unmanned aerial vehicle platform and image enhancement fusion method
CN116718165B (en) * 2023-06-08 2024-05-14 中国矿业大学 Combined imaging system based on unmanned aerial vehicle platform and image enhancement fusion method
US12254600B2 (en) 2023-06-08 2025-03-18 China University Of Mining And Technology Joint imaging system based on unmanned aerial vehicle platform and image enhancement fusion method
CN116958218A (en) * 2023-08-14 2023-10-27 苏州大学 A point cloud and image registration method and equipment based on calibration plate corner point alignment

Similar Documents

Publication Publication Date Title
WO2019105044A1 (en) Method and system for lens distortion correction and feature extraction
CN106504290B (en) A high-precision camera dynamic calibration method
WO2021004416A1 (en) Method and apparatus for establishing beacon map on basis of visual beacons
CN112132874B (en) Calibration plate-free heterosource image registration method, device, electronic equipment and storage medium
CN114897947A (en) A Synchronous Registration Method for Thermal Infrared and Visible Light Images Based on Unity of Time and Space
CN111047649A (en) A high-precision camera calibration method based on optimal polarization angle
CN106897995B (en) A Method for Automatic Parts Recognition Oriented to Mechanical Assembly Process
JP2011506914A (en) System and method for multi-frame surface measurement of object shape
CN103530880A (en) Camera calibration method based on projected Gaussian grid pattern
CN112541932A (en) Multi-source image registration method based on different focal length transformation parameters of dual-optical camera
CN105809640A (en) Multi-sensor fusion low-illumination video image enhancement method
CN108269228B (en) The automatic detection method in unmanned plane image garland region based on GPU parallel computation
CN109656033A (en) A kind of method and device for distinguishing liquid crystal display dust and defect
CN105469389A (en) Grid ball target for visual sensor calibration and corresponding calibration method
Yang et al. A novel camera calibration method based on circle projection model
CN106971408A (en) A kind of camera marking method based on space-time conversion thought
CN116205993A (en) A high-precision calibration method for bi-telecentric lens for 3D AOI
CN111724446A (en) A zoom camera extrinsic parameter calibration method for 3D reconstruction of buildings
CN117115272A (en) Telecentric camera calibration and three-dimensional reconstruction method for precipitation particle multi-angle imaging
CN114998447A (en) Multi-objective vision calibration method and system
CN1987893A (en) Method for identifying fabric grain image facing camara weft straightener
CN115100126B (en) An intelligent perception method for plane displacement field of bridge structure
CN112508885B (en) Method and system for detecting three-dimensional central axis of bent pipe
CN116758063B (en) A workpiece size detection method based on image semantic segmentation
Dong et al. An automatic calibration method for PTZ camera in expressway monitoring system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220812

RJ01 Rejection of invention patent application after publication