CN101567051B - Image matching method based on characteristic points - Google Patents


Info

Publication number: CN101567051B
Application number: CN2009100524538A
Authority: CN (China)
Prior art keywords: similarity, corner, points
Inventors: 魏二岭, 杨夙
Assignee (original and current): Fudan University
Other versions: CN101567051A (Chinese)
Legal status: Expired - Fee Related

Landscapes

  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention belongs to the technical field of pattern recognition and image processing and relates to a feature-point-based image registration method. Based on the K-nearest-neighbor structure of feature points, the invention defines a ring-type edge-angle code model and applies it to point pattern matching. Similar ring-type edge-angle codes not only describe the similarity of the spatial structures of two feature points but can also be used to estimate local mappings. The similarity of two feature points is determined by the similar length of the maximal similar ring-type edge-angle code associated with them. The invention performs structural matching according to the similarity of the local spatial structures of feature points, refines the matching by clustering the local mappings, and uses the refined matching result to estimate the optimal mapping. The method can be applied to feature-point-based image registration, image stitching and mosaicking, motion tracking, fusion of single-modal and multi-modal medical images, and content-based image retrieval.

Description

A Method of Image Registration Based on Feature Points

Technical Field

The invention belongs to the technical field of pattern recognition and image processing, and specifically relates to a feature-point-based image registration method. The method can be applied to feature-point-based image registration, image stitching and mosaicking, motion tracking, fusion of single-modal and multi-modal medical images, and content-based image retrieval.

Background

Point pattern matching can be understood as feature-based image matching: correspondences are searched for between two point sets to be matched, and the mapping between the point sets is estimated. Point pattern matching is widely used in remote sensing (image mosaicking, image stitching), medical imaging (diagnosis and disease tracking), and computer vision (object or scene recognition, motion tracking).

Although many excellent point pattern matching algorithms have been proposed in recent years, almost none of them can simultaneously satisfy application requirements in both computation time and accuracy. The published algorithms can be roughly divided into five categories: clustering algorithms (see references [1][2]), shape context algorithms (see reference [3]), relaxation labeling algorithms (see references [4][5]), graduated optimization algorithms (see reference [6]), and graph matching algorithms (see reference [7]).

Specifically:

Clustering algorithms assume that the transformation model is a similarity transformation. Because only two pairs of matching points are needed to estimate a similarity transformation, such an algorithm first selects two pairs of points from the two point sets to be matched, estimates the transformation coefficients, and then uses the estimated transformation to verify other possible matching pairs. A transformation estimated from correctly matched pairs is itself correct, so clustering with it gathers all potentially correct matches into one class; if there are many correct matches, the corresponding class is large. Conversely, a transformation estimated from incorrectly matched pairs is certainly wrong, and wrong transformations are characterized by randomness: using a wrong transformation as a classifier gathers only a few points. The algorithm clusters the two feature point sets according to this property, and the largest class in the clustering result is output as the correspondence for subsequent processing. The drawbacks of this class of algorithms are high computational complexity and long running time; performance also degrades badly when the two feature point sets differ greatly in size. The Hough transform was later used as a preprocessing step to speed up the computation, but matching remains very time-consuming, and when there are many noise points, or the two point sets differ greatly in size, the algorithm is not guaranteed to find the best correspondence.

The shape context algorithm is another point pattern matching technique that has attracted much research in recent years. A shape context describes the spatial distribution of the other feature points in a point set relative to a given feature point. Matching feature points on the same shape in two images have similar shape contexts; each feature point corresponds to an orientation histogram expressing the spatial constraints of the other feature points relative to it. This kind of algorithm can handle point pattern matching under non-rigid transformations, and most variants use spline curves to estimate the transformation. However, these algorithms depend on the sampling of the feature points, require image backgrounds that are not too complex, and have mostly been tested on synthetic data. For simple shape matching applications their performance is fairly good, but their computational complexity is relatively high, since every feature point needs all the other points to compute its shape context. In complex scenes, where the objects to be recognized are hard to segment, their performance is poor.

Relaxation labeling algorithms define a probability distribution over the transformation relations and optimize it with a discrete algorithm. The algorithm is in fact an iterative process: it starts from a rather coarse match and then gradually updates the matching probability matrix. Such algorithms easily converge to local extrema.

Graduated optimization algorithms likewise use the global feature points to construct a compatibility function and resolve the correspondence by updating it; they share the same defects as relaxation algorithms. Both kinds of algorithm perform poorly when the feature point sets are large and noisy, and neither is guaranteed to converge in all situations.

Graph matching algorithms are a further actively researched class of point pattern matching algorithms. The two point sets to be matched are built into weighted graphs, attributed relational graphs, and so on, and compatible parts (parts that are likely to match) are then searched for in the two graphs, either by subgraph matching or by complete mapping matching, so as to minimize an optimization function. This reduces to a graph search problem, which is NP-hard, so a solution cannot be guaranteed for every test case. Many such algorithms also use an optimization function to obtain a good rather than optimal solution. A further defect of graph matching is that it cannot handle two point sets that both contain many noise points; besides the long computation time, it is then difficult to obtain a reasonable correspondence.

References relevant to the present invention:

[1] A. Goshtasby, "Description and discrimination of planar shapes using shape matrices," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 7, pp. 738-743, 1985.

[2] S. H. Chang, F. H. Cheng, W. H. Hsu, and G. Z. Wu, "Fast algorithm for point pattern matching: Invariant to translations, rotations and scale changes," Pattern Recognition, vol. 30, pp. 311-320, 1997.

[3] S. Belongie, J. Malik, and J. Puzicha, "Shape Matching and Object Recognition Using Shape Contexts," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 4, pp. 509-522, Apr. 2002.

[4] W. J. Christmas, J. Kittler, and M. Petrou, "Structural Matching in Computer Vision Using Probabilistic Relaxation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 17, no. 8, pp. 749-764, Aug. 1995.

[5] R. C. Wilson and E. R. Hancock, "Structural Matching by Discrete Relaxation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 6, pp. 634-648, June 1997.

[6] S. Gold and A. Rangarajan, "A Graduated Assignment Algorithm for Graph Matching," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 18, no. 4, pp. 377-388, Apr. 1996.

[7] T. S. Caetano, T. Caelli, D. Schuurmans, and D. A. C. Barone, "Graphical Models and Point Pattern Matching," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, pp. 1646-1663, 2006.

Summary of the Invention

The present invention aims to overcome the deficiencies of the prior art by proposing a feature-point-based image registration method.

The invention first proposes a ring-type edge-angle code model based on the K-nearest-neighbor spatial structure of feature points, explores the similarity between ring-type edge-angle codes, and uses it to measure the similarity of the local spatial structures of two feature points. Similar ring-type edge-angle codes not only describe the similarity of the local spatial structure of two feature points but can also be used to estimate the local mapping. The ring-type edge-angle code model is built on the similarity transformation; the invention also relaxes the similarity conditions of the ring-type edge-angle code so that the matching algorithm is robust to a certain degree of affine and perspective transformation. The similarity between feature points is determined by the similar length of the maximal similar ring-type edge-angle code associated with them. The invention first performs structural matching according to the similarity of the local spatial structures of the feature points, and then refines the matching by clustering the local mappings associated with the matched points. The refined matching result is used to estimate the optimal mapping. To accelerate the matching process, the invention builds an index over the search space, avoiding a large number of unnecessary computations, so that the matching algorithm approaches the requirements of real-time applications while preserving accuracy.

The invention discloses a fast and effective point pattern matching algorithm; its flow chart is shown in Figure 1. The algorithm comprises the following steps:

Step 1): Feature extraction. Features are extracted from the two images to be matched. The matching algorithm operates on image feature points: feature point sets invariant to scaling, rotation, and translation are extracted beforehand and used as input; a certain degree of affine and perspective transformation between the point sets is also tolerated.

Step 2): Similarity computation. The K-nearest-neighbor structure of each feature point is extracted and its ring-type edge-angle code is constructed; an index is built over the search space, and the maximal similar ring-type edge-angle code between feature points is computed using binary search and an incremental matching algorithm. The similar length of the maximal similar ring-type edge-angle code is taken as the similarity between the two feature points, and the possibly matching feature points are determined during the similarity computation.

Step 3): Structural matching. The similarity between feature points physically describes the similarity of their local spatial structures. According to the magnitude of the similarity, the invention divides the candidate matches into three classes: badClass, feature points that cannot match; unknownClass, feature points that may match but not uniquely; and goodClass, feature points that likely match. Local mappings are estimated for the matched points in goodClass.

Step 4): Match refinement. The local mappings associated with the matched points in goodClass are clustered. In the clustering result, the class with the most elements is defined as the largest class and the class with the second-most elements as the second-largest class. If the size of the largest class exceeds a threshold while also exceeding the size of the second-largest class, or the ratio of the sizes of the largest and second-largest classes exceeds a threshold, the largest class is taken as the matching result and the optimal transformation is estimated.

Step 5): Optimal transformation estimation. The invention assumes that the transformation between images includes, besides the similarity transformation, a certain degree of affine and perspective transformation; the estimation model is an affine transformation.

Specifically:

The two point sets in step 1 are defined as S and T, of sizes $n_1$ and $n_2$ respectively; points are denoted $P_i \in S$ and $Q_j \in T$.

The ring-type edge-angle code in step 2 is defined as follows:

As shown in Figure 2, the K-nearest-neighbor spatial structure of a feature point can be represented by a ring-type edge-angle code. To define it, the invention defines the combination of each edge in Figure 2(a) and the angle adjacent to it in the counterclockwise direction as an Edge-Angle Code $\mathrm{EAC} = (E, \theta)$. Figure 2(a) contains K edge-angle codes $(E_i, \theta_i)$, $i = 0, 1, \ldots, K-1$. Encoding these K edge-angle codes in counterclockwise order and connecting them in turn into a ring structure yields the ring-type edge-angle code of the feature point $P_c$, as shown in Figure 2(b). The invention defines K as the length of the ring-type edge-angle code.
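
As an illustration, constructing such a code from a point and its K nearest neighbors can be sketched as follows. This is a minimal sketch in Python, not the patented implementation; representing each code as a tuple (E_i, theta_i) and ordering neighbors counterclockwise by polar angle are assumptions consistent with Figure 2:

```python
import math

def ring_edge_angle_code(center, neighbors):
    """Build the ring-type edge-angle code of `center` from its K
    nearest neighbors: order the neighbors counterclockwise by polar
    angle, and pair each edge length |E_i| with the counterclockwise
    angle theta_i to the next edge."""
    cx, cy = center
    # (polar angle, edge length) per neighbor, sorted counterclockwise
    edges = sorted(
        (math.atan2(y - cy, x - cx), math.hypot(x - cx, y - cy))
        for (x, y) in neighbors
    )
    K = len(edges)
    ring = []
    for i in range(K):
        angle_i, e_i = edges[i]
        angle_next = edges[(i + 1) % K][0]   # m_t = t mod K wraps the ring
        theta = (angle_next - angle_i) % (2 * math.pi)
        ring.append((e_i, theta))
    return ring  # K codes (E_i, theta_i), in counterclockwise order
```

By construction the K angles sum to one full turn, and rotating the neighbor list yields the K equivalent codes described below.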

The ring-type edge-angle code defined by the invention has the following properties:

1) Connectivity: starting from any edge-angle code and traversing forward, all edge-angle codes can be visited in order. For convenience, the invention defines the traversal subscript as $m_t$, where t is a non-negative integer and $m_t$ denotes t modulo K; this guarantees that the subscript never goes out of range while traversing the edge-angle codes.

2) Spatial constraint: the order of the edge-angle codes describes the constraint relations among the neighbors of the feature point. The code sequence $(E_{m_i}, \theta_{m_i}), (E_{m_{i+1}}, \theta_{m_{i+1}}), (E_{m_{i+2}}, \theta_{m_{i+2}})$ expresses that, in the spatial structure, the neighbor associated with $E_{m_i}$ is adjacent in the counterclockwise direction to the neighbor associated with $E_{m_{i+1}}$, and is separated by one neighbor from the neighbor associated with $E_{m_{i+2}}$. Analyzing the order of the edge-angle codes is therefore equivalent to analyzing the spatial constraint relations among the neighbors.

3) Equivalence: the same feature point can have K different ring-type edge-angle codes, but they all describe the same local spatial structure of the feature point equivalently. As can be seen from Figure 2(a), taking a different neighbor as the first edge-angle code yields a different ring-type edge-angle code, but all of them are physically equivalent, i.e., they describe the same spatial structure. The invention defines this property of the ring-type edge-angle code as equivalence.

The similar ring-type edge-angle codes in step 2 are defined as follows:

As shown in Figure 3, if, starting from some pair of corresponding edges, two ring-type edge-angle codes have corresponding edges in proportion and corresponding angles in a ratio equal to 1, the invention calls the two ring-type edge-angle codes similar. The length of either ring-type edge-angle code is defined as the similar length of the similar ring-type edge-angle codes, and the spatial structure corresponding to them is defined as a similar spatial structure. The similar ring-type edge-angle codes shown in Figure 3 satisfy, starting from $(E_0, E'_0)$:

$$\frac{|E'_0|}{|E_0|} = \frac{|E'_1|}{|E_1|} = \cdots = \frac{|E'_{K-1}|}{|E_{K-1}|}$$

$$\frac{\theta'_0}{\theta_0} = \frac{\theta'_1}{\theta_1} = \cdots = \frac{\theta'_{K-1}}{\theta_{K-1}} = 1$$

To compute the similar ring-type edge-angle codes of step 2, the invention also defines the addition of two adjacent edge-angle codes:

$$(E_{m_i}, \theta_{m_i}) + (E_{m_{i+1}}, \theta_{m_{i+1}}) = (E_{m_i},\; \theta_{m_i} + \theta_{m_{i+1}})$$

The result of adding two adjacent edge-angle codes is again an edge-angle code: the edge of the new code is the edge of the first code, and its angle is the sum of the two added angles. Physically, the addition merges two counterclockwise-adjacent neighbor points: the second neighbor is deleted, and the angle counterclockwise-adjacent to the first edge is updated.

To compute the similar ring-type edge-angle codes of step 2, the invention also defines a comparison operation on two edge-angle codes. If two edge-angle codes $(E_1, \theta_1)$ and $(E'_1, \theta'_1)$ satisfy

$$\operatorname{abs}\left(\frac{|E'_1| / |E_1|}{|E'_2| / |E_2|} - 1.0\right) < \varepsilon, \qquad \operatorname{abs}\left(\frac{\theta'_1}{\theta_1} - 1.0\right) < \varepsilon$$

where $\varepsilon$ is the error-control factor, a positive number close to 0, and $(E_2, \theta_2)$, $(E'_2, \theta'_2)$ are the edge-angle codes adjacent to $(E_1, \theta_1)$, $(E'_1, \theta'_1)$ respectively, the invention stipulates that the comparison of $(E_1, \theta_1)$ and $(E'_1, \theta'_1)$ needs no add operation. Physically, needing no add operation means that the triangle formed by $E_1, \theta_1, E_2$ is similar to the triangle formed by $E'_1, \theta'_1, E'_2$.
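
Sketched in Python, the add and comparison operations might look as follows (hypothetical helper names; the concrete value of the error-control factor ε is an assumption, since the text only requires a small positive number):

```python
EPS = 0.05  # error-control factor epsilon; 0.05 is an assumed setting

def add_codes(a, b):
    """(E_i, th_i) + (E_{i+1}, th_{i+1}) = (E_i, th_i + th_{i+1}):
    merge two counterclockwise-adjacent codes by keeping the first
    edge and summing the angles (the second neighbor is dropped)."""
    return (a[0], a[1] + b[1])

def compatible(a, a_next, b, b_next, eps=EPS):
    """True when codes a and b match without an add operation, i.e.
    the triangle (E1, th1, E2) is similar to (E'1, th'1, E'2):
    corresponding edge ratios agree and the angles agree, within eps."""
    ratio1 = b[0] / a[0]            # |E'1| / |E1|
    ratio2 = b_next[0] / a_next[0]  # |E'2| / |E2|
    return (abs(ratio1 / ratio2 - 1.0) < eps
            and abs(b[1] / a[1] - 1.0) < eps)
```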

The similarity computation of step 2 proceeds as follows:

(1) Build the search space: extract the K nearest neighbors of each feature point and construct its ring-type edge-angle code. The search space consists of the $n_2 K$ edge-angle codes on the ring-type edge-angle codes of all feature points in the target point set; these codes are indexed by angle to accelerate the search. A schematic is shown in the middle and right columns of Figure 4.
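
A minimal sketch of such an angle index, using sorting plus binary search (Python's bisect); the flat (angle, point id, position) layout is an assumed representation, not the patent's data structure:

```python
import bisect

def build_angle_index(rings):
    """Flatten the n2*K edge-angle codes of all target rings into one
    list sorted by angle, so codes with a matching angle can be found
    by binary search."""
    entries = []  # (angle, point_id, position_in_ring)
    for pid, ring in enumerate(rings):
        for pos, (_, theta) in enumerate(ring):
            entries.append((theta, pid, pos))
    entries.sort()
    return entries

def find_angle(index, theta, eps=0.05):
    """Binary search for all codes whose angle matches `theta` within
    eps; the slice covers the linear scan on both sides of the hit."""
    angles = [e[0] for e in index]
    lo = bisect.bisect_left(angles, theta - eps)
    hi = bisect.bisect_right(angles, theta + eps)
    return index[lo:hi]
```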

(2) Determine the search objects. The purpose of matching is to find matching points, so a single search object is each edge-angle code associated with the ring-type edge-angle code of $P_i$; for $P_i$ there are K search objects $(E_k, \theta_k)$, as shown in the left table of Figure 4.

(3) Determine the search strategy. The invention first uses binary search on $(E_k, \theta_k)$ to find, in the search space, an edge-angle code $(E_n, \theta_n)$ with the same angle within the allowed error; let the corresponding feature point be $Q_j$. If one is found, the invention also searches linearly before and after the found code for other $(E_n, \theta_n)$ with the same angle within the allowed error. If the comparison of $(E_k, \theta_k)$ and $(E_n, \theta_n)$ needs no add operation, two equivalent ring-type edge-angle codes $L_P = \{A_0, A_1, \ldots, A_{K-1}\}$ and $L_Q = \{B_0, B_1, \ldots, B_{K-1}\}$ are generated, starting from $(E_k, \theta_k)$ and $(E_n, \theta_n)$ respectively, where $A_k$, $B_n$ denote edge-angle codes and $E_k^A, \theta_k^A, E_n^B, \theta_n^B$ denote the edges and angles of $A_k$, $B_n$. Their maximal similar ring-type edge-angle code is then computed by the incremental matching algorithm, which runs as follows: first compare $A_0$ and $B_0$. If $\theta_0^A$ and $\theta_0^B$ are unequal, the code with the smaller angle is added to its next adjacent code to generate a new code. If $\theta_0^A$ and $\theta_0^B$ are equal, verify whether the corresponding edges $(E_0^A, E_0^B)$ and $(E_1^A, E_1^B)$ adjacent to $\theta_0^A$, $\theta_0^B$ are in proportion; if so, $A_0$ and $B_0$ are kept in the similar ring-type edge-angle codes obtained so far (initially empty); if not, $A_0$ and $B_0$ are each added to their next adjacent codes to generate new codes. The comparison is then repeated in a loop until one of the ring-type edge-angle codes has been fully traversed. The pseudocode is expressed as follows:

Assume $t_1$ and $t_2$ are two loop variables with $t_1 = 0$ and $t_2 = 0$. Let $temp_A = A_k$, $temp_B = B_n$, and $rate = |E_n^B| / |E_k^A|$; the incremental matching algorithm is then:

    while (t1 < K and t2 < K) {
        if (θ(temp_A) < θ(temp_B)) { t1 = t1 + 1; temp_A = temp_A + A_m(k+t1); }
        else if (θ(temp_A) > θ(temp_B)) { t2 = t2 + 1; temp_B = temp_B + B_m(n+t2); }
        else {
            t1 = t1 + 1; t2 = t2 + 1;
            if (rate == |E^B_m(n+t2)| / |E^A_m(k+t1)|) {
                add temp_A into L_k; add temp_B into L_n;
                temp_A = A_m(k+t1); temp_B = B_m(n+t2);
            }
            else { temp_A = temp_A + A_m(k+t1); temp_B = temp_B + B_m(n+t2); }
        }
    }

During the search only the similar ring-type edge-angle codes with the greatest similar length are kept; when several ring-type edge-angle codes share the greatest similar length, all of them are kept. The similar length of the maximal similar ring-type edge-angle code is taken as the similarity between the two feature points; note that any two feature points whose similarity is never computed during the search have similarity 0.
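
The incremental matching loop above can be sketched in Python as follows. Rings are lists of (edge length, angle) tuples already rotated to the chosen starting codes; angle equality and edge proportionality are tested with a tolerance eps (the pseudocode's exact equality relaxed by the error-control factor), and eps = 0.05 is an assumed setting:

```python
def max_similar_ring(ring_a, ring_b, eps=0.05):
    """Incremental matching sketch: walk two rotated rings of
    (edge, angle) codes, merging (adding) the code whose accumulated
    angle lags, and retaining code pairs whose angles agree and whose
    edge ratio stays in proportion with the starting ratio.  Returns
    the two similar ring-type edge-angle codes and the similar length."""
    K1, K2 = len(ring_a), len(ring_b)
    merge = lambda c, d: (c[0], c[1] + d[1])  # the add operation
    rate = ring_b[0][0] / ring_a[0][0]        # |E_n^B| / |E_k^A|
    t1 = t2 = 0
    temp_a, temp_b = ring_a[0], ring_b[0]
    sim_a, sim_b = [], []
    while t1 < K1 and t2 < K2:
        if abs(temp_a[1] / temp_b[1] - 1.0) < eps:      # angles agree
            t1 += 1
            t2 += 1
            ratio = ring_b[t2 % K2][0] / ring_a[t1 % K1][0]
            if abs(ratio / rate - 1.0) < eps:           # edges in proportion
                sim_a.append(temp_a)
                sim_b.append(temp_b)
                temp_a, temp_b = ring_a[t1 % K1], ring_b[t2 % K2]
            else:                                       # merge on both sides
                temp_a = merge(temp_a, ring_a[t1 % K1])
                temp_b = merge(temp_b, ring_b[t2 % K2])
        elif temp_a[1] < temp_b[1]:                     # A's angle lags
            t1 += 1
            temp_a = merge(temp_a, ring_a[t1 % K1])
        else:                                           # B's angle lags
            t2 += 1
            temp_b = merge(temp_b, ring_b[t2 % K2])
    return sim_a, sim_b, len(sim_a)
```

A full matcher would run this for every rotation pair (k, n) surviving the comparison test and keep only the rings with the greatest similar length.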

(4) Determine the possibly matching feature points. The invention defines $S(A_k, B_n) = \{L_k, L_n\}$ as the similar ring-type edge-angle codes whose starting corresponding edges are $E_k^A$ and $E_n^B$, where $L_k$, $L_n$ denote the similar ring-type edge-angle codes. Let the similar length of $S(A_k, B_n)$ be $\operatorname{len}_{ij}(k, n)$; then the similarity of feature points $P_i$, $Q_j$ is:

$$\operatorname{similarity}(P_i, Q_j) = \max_{k,\,n \in [0,\,K-1]} \operatorname{len}_{ij}(k, n)$$

If $Q_j^*$ satisfies

$$Q_j^* = \arg\max_{Q_j \in T} \operatorname{similarity}(P_i, Q_j)$$

the invention regards $Q_j^*$ as a possible match of $P_i$. During the computation, $P_i$ may have several possible matches $Q_j^*$; all of them are kept.
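
The similarity definition and the selection of possible matches can be sketched as follows; a hypothetical similar_length routine (such as the incremental matcher) is passed in so the sketch stays self-contained:

```python
def similarity(rings_p, rings_q, similar_length):
    """similarity(P_i, Q_j) = max over starting codes k, n of
    len_ij(k, n); rings_p and rings_q hold the K equivalent rotations
    of the two feature points' ring-type edge-angle codes."""
    return max(similar_length(a, b) for a in rings_p for b in rings_q)

def possible_matches(sim_row):
    """Keep every Q_j* attaining the maximal similarity for P_i
    (ties are all kept, as the text specifies)."""
    best = max(sim_row)
    return [j for j, s in enumerate(sim_row) if s == best]
```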

Step 3, structural matching, is carried out as follows:

Suppose $P_i$ has $NO$ possible matches $Q_j^*$; the candidate pairs $(P_i, Q_j^*)$ are classified as follows:

$$(P_i, Q_j^*) \in \begin{cases} \text{badClass} & \text{if } \operatorname{similarity}(P_i, Q_j^*) < 3 \\ \text{unknownClass} & \text{if } \operatorname{similarity}(P_i, Q_j^*) \ge 3 \text{ and } NO > 1 \\ \text{goodClass} & \text{if } \operatorname{similarity}(P_i, Q_j^*) \ge 3 \text{ and } NO = 1 \end{cases}$$

The classification conditions are interpreted as follows. The pairs $(P_i, Q_j^*)$ in badClass all have similarity less than 3; when K is fairly large, e.g. K = 15, this indicates that the local spatial structures of $(P_i, Q_j^*)$ differ greatly, $P_i$ or $Q_j^*$ may be noise points, and a match is very unlikely. The pairs in unknownClass all have similarity at least 3; when K is fairly large, e.g. K = 15, this indicates that the local spatial structures of $(P_i, Q_j^*)$ are quite similar, but $P_i$ has more than one possible match $Q_j^*$, so local structural similarity alone cannot determine which $Q_j^*$ matches $P_i$ and subsequent verification is required. The pairs in goodClass all have similarity at least 3; when K is fairly large, e.g. K = 15, their local spatial structures are quite similar and $P_i$ has exactly one possible match $Q_j^*$, so $(P_i, Q_j^*)$ is very likely a correct match; the greater the similarity, the more similar the local spatial structures and the more likely the match.
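
The three-way classification follows directly from the conditions above; in this sketch the input data layout (a map from each P_i to its list of best-scoring candidates with their similarities) is an assumption:

```python
def classify(candidates):
    """Three-way split of step 3.  `candidates` maps each point P_i to
    its list of (Q_j*, similarity) best-scoring candidates (ties all
    kept); NO is the number of candidates for that P_i."""
    bad_class, unknown_class, good_class = [], [], []
    for p, cand in candidates.items():
        no = len(cand)
        for q, sim in cand:
            if sim < 3:
                bad_class.append((p, q))      # structures differ too much
            elif no > 1:
                unknown_class.append((p, q))  # similar, but not unique
            else:
                good_class.append((p, q))     # similar and unique
    return bad_class, unknown_class, good_class
```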

Based on this analysis, the pairs (Pi, Q*j) in goodClass are taken as the structure-matching result, and a local similarity transformation is also estimated for each (Pi, Q*j) in goodClass. Note that during local estimation, (Pi, Q*j) is combined with each of its associated corresponding neighbor pairs to estimate one set of local similarity transformation parameters; if the similarity of (Pi, Q*j) is ls, then ls sets of local similarity transformations can be estimated, and the matching algorithm takes their mean as the finally estimated local similarity transformation.

Step 4, optimized matching, proceeds as follows:

(1) Sort the pairs (Pi, Q*j) in goodClass by similarity in descending order;

(2) Scan the sorted pairs (Pi, Q*j). If (Pi, Q*j) has not yet been clustered, take it as a cluster center Ci and use its associated local mapping Ti as a classifier to verify the unclustered candidate pairs and their corresponding neighbor pairs in goodClass and unknownClass. The clustering proceeds as follows: first, the corresponding neighbor pairs associated with (Pi, Q*j) are assigned to the cluster, since the classifier Ti was estimated from exactly these pairs; then the classifier verifies the remaining unclustered candidate pairs and their corresponding neighbor pairs in goodClass and unknownClass. If a verified pair satisfies the classifier, it is marked, assigned to the cluster, and excluded from further processing; otherwise it is ignored. Let (P, Q) be a pair to be verified and Ti(P) the image of P under Ti. If the Euclidean distance between Ti(P) and Q is within the acceptable error range, (P, Q) is considered consistent with Ti and is assigned to Ci. Formally:

|Ti(P) − Q| < εT

where εT is the error-control factor. Note that each pair (Pi, Q*j) is clustered only once, i.e. (Pi, Q*j) belongs to exactly one cluster. Each clustering pass must verify all remaining unclustered candidate pairs and their corresponding neighbor pairs. A new pass starts as long as goodClass still contains unclustered pairs (Pi, Q*j).
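The scan-and-verify loop of steps (1)–(2) can be sketched as follows. The dictionary field names (`pair`, `neighbors`, `map`, `sim`) and the tuple form of the local map are assumptions for illustration, not the patent's data structures:

```python
import math

def greedy_cluster(candidates, eps_t=10.0):
    """Greedy clustering by local similarity maps (illustrative sketch).
    Each candidate dict holds:
      'pair'      : ((px, py), (qx, qy)) -- the candidate match,
      'neighbors' : list of corresponding-neighbor point pairs,
      'map'       : (s, theta, tx, ty) local similarity transform, or None
                    for unknownClass pairs (they cannot seed a cluster),
      'sim'       : similarity; seeds are tried in decreasing order."""
    def apply_map(T, p):
        s, th, tx, ty = T
        return (s * (p[0] * math.cos(th) - p[1] * math.sin(th)) + tx,
                s * (p[0] * math.sin(th) + p[1] * math.cos(th)) + ty)

    def fits(T, pq):
        p, q = pq
        return math.dist(apply_map(T, p), q) < eps_t  # |Ti(P) - Q| < eps_T

    used = set()
    clusters = []
    order = sorted(range(len(candidates)),
                   key=lambda i: -candidates[i]['sim'])
    for i in order:
        seed = candidates[i]
        if i in used or seed['map'] is None:
            continue
        T = seed['map']
        # the neighbor pairs defined T, so they join the cluster directly
        members = [seed['pair']] + list(seed['neighbors'])
        used.add(i)
        for j in order:
            c = candidates[j]
            if j in used:
                continue
            if fits(T, c['pair']) and all(fits(T, nb) for nb in c['neighbors']):
                members.append(c['pair'])
                used.add(j)
        clusters.append(members)
    return clusters
```

Each candidate is absorbed into at most one cluster, and a new pass starts only from an unclustered seed, matching the one-cluster-per-pair rule above.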

(3) Analyze the clustering result. The structure-matching result usually contains false matches, which must be removed. In the clustering result, the cluster with the most elements is defined as the largest cluster and the one with the second most elements as the second-largest cluster; each element of a cluster represents a pair of matched points. Normally, the local mapping associated with correctly matched feature points is also correct, so within the error tolerance all correct matches gather into one cluster, which therefore contains many point pairs; conversely, the local mappings associated with false matches are largely random, so their clusters contain very few elements. Optimized matching therefore proceeds as follows: if the ratio of the sizes of the largest and second-largest clusters exceeds a threshold, or if the largest cluster is larger than the second-largest and its size also exceeds a threshold, the largest cluster is output as the point correspondence and used for optimal transformation estimation; otherwise the two point sets are considered too weakly correlated to be matched.
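The cluster-analysis decision of step (3) can be sketched as follows (illustrative; the default thresholds are the values t1 = 2 and t2 = 23 used later in the embodiment for scene images, and are application-dependent):

```python
def accept_largest_cluster(clusters, t1=2.0, t2=23):
    """Return the largest cluster as the point correspondences if the
    selection rule holds, else None (match deemed unsuccessful)."""
    if not clusters:
        return None
    sizes = sorted((len(c) for c in clusters), reverse=True)
    s1 = sizes[0]
    s2 = sizes[1] if len(sizes) > 1 else 0
    # accept if s1/s2 exceeds t1, or s1 > s2 while s1 also exceeds t2
    if s1 >= t1 * s2 or (s1 > s2 and s1 > t2):
        return max(clusters, key=len)
    return None  # point sets too weakly correlated to match
```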

Description of drawings

Figure 1: Flowchart of the present invention.

Figure 2: KNN spatial structure of feature point Pc and the corresponding ring-type corner code.

Figure 3: Similar ring-type corner codes and the corresponding similar spatial structures.

Figure 4: Schematic diagram of the similarity calculation.

Figure 5: The two images to be matched.

Figure 6: Matching result for the two images shown in Figure 5.

Detailed description of the embodiments

Specific examples are provided below to further illustrate the application of the present invention.

Example 1

The matching objects are the two images shown in Figure 5. The two images were taken at different times and from different angles and share only part of their content; both are 440×330 pixels.

The method was run on a Gateway T6307c notebook (Intel dual-core 1.6 GHz, 1 GB RAM). The implementation is portable across platforms; the specific environments were GCC under CentOS and Visual C++ 2005 under Windows XP/Server 2003. The GCC build runs faster than the Visual C++ 2005 build.

The method fixes the tolerance for two scalars to be considered equal at ε = 0.1 and the Euclidean-distance tolerance between two pixels at εT = 10; these two tolerances are used when computing similar ring-type corner codes and when clustering. The number of neighbors of each feature point is set to K = 15.

The specific implementation is as follows:

Step 1: Take two feature point sets, extracted from the two images to be matched, as input; each feature point only needs its two-dimensional coordinates. A single-scale Harris corner detector is used, with response coefficient α = 0.06, differentiation scale σD = 3, integration (smoothing) scale σI = 7, a 3×3 local non-maximum suppression window, and a suppression threshold of 0.01 times the global maximum corner response. The two images yield 189 and 208 feature points respectively; the corresponding point sets are denoted S and T, with n1 = 189 and n2 = 208.
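Single-scale Harris detection with these parameters can be sketched as follows. This is a minimal NumPy implementation assuming grayscale input; the function names are illustrative and the derivative/smoothing details are one common variant, not necessarily the exact operator used here:

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel, truncated at 3 sigma
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def smooth(img, sigma):
    # Separable Gaussian smoothing with edge padding
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    p = np.pad(img, r, mode='edge')
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 0, p)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 1, tmp)

def harris_corners(img, alpha=0.06, sigma_d=3, sigma_i=7, win=3, rel_thresh=0.01):
    """Single-scale Harris: differentiation scale sigma_d, integration
    scale sigma_i, win x win non-maximum suppression, threshold
    rel_thresh times the global maximum response. Returns (row, col)."""
    img = smooth(np.asarray(img, dtype=float), sigma_d)
    Iy, Ix = np.gradient(img)
    # entries of the second-moment matrix, integrated at sigma_i
    Sxx = smooth(Ix * Ix, sigma_i)
    Syy = smooth(Iy * Iy, sigma_i)
    Sxy = smooth(Ix * Iy, sigma_i)
    R = (Sxx * Syy - Sxy**2) - alpha * (Sxx + Syy)**2
    thr = rel_thresh * R.max()
    r = win // 2
    corners = []
    for y in range(r, R.shape[0] - r):
        for x in range(r, R.shape[1] - r):
            patch = R[y - r:y + r + 1, x - r:x + r + 1]
            if R[y, x] >= thr and R[y, x] == patch.max():
                corners.append((y, x))
    return corners
```

On a synthetic image containing a bright square, the detector reports responses near the four square corners.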

Step 2: Similarity computation. Extract the 15 nearest neighbors of each feature point and compute its ring-type corner code according to the definition. Division by zero must be avoided during the computation: when dividing scalars, add or subtract a very small positive number as needed so that the program does not crash.

According to the definition of similar ring-type corner codes, compute the similarity of each pair of feature points to be matched; determine the elements of goodClass and unknownClass according to the classification rules, and compute the local mapping associated with each element of goodClass. The specific steps are as follows:

(1) Build the search space. Sort the 208×15 corner codes (En, θn) of the ring-type corner codes of the 208 feature points in T by angle to build an index.

(2) Determine the search objects. To find the candidate points Q*j that Pi may match, each of the 15 corner codes (Ek, θk), k = 0, 1, ..., 14, on the ring-type corner code of Pi is searched separately.

(3) Determine the search strategy. Select each (Ek, θk) in turn and search the space by angle. On a hit at corner code (En, θn), continue linearly forward and backward from it to collect all other corner codes (En, θn) whose angles are equal within the error tolerance. If the comparison of (Ek, θk) with (En, θn) requires no merge (addition) operation, generate the two equivalent ring-type corner codes starting at (Ek, θk) and (En, θn) respectively, and compute their maximum similar ring-type corner code with the incremental matching algorithm.
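Steps (1) and (3) — the angle-sorted index, the binary search, and the linear scan of the equal-angle run — can be sketched with Python's `bisect` (the tuple layout is illustrative, not the patent's record format):

```python
from bisect import bisect_left, bisect_right

def build_angle_index(corner_codes):
    """Sort all corner codes of point set T by angle (step (1)).
    corner_codes: list of (angle, edge_length, point_id) tuples."""
    return sorted(corner_codes, key=lambda c: c[0])

def query_equal_angles(index, angles, theta, eps=0.1):
    """Binary search for theta, then take the whole run of angles equal
    within tolerance eps (the linear search before and after the hit).
    'angles' is the precomputed sorted list of index angles."""
    lo = bisect_left(angles, theta - eps)
    hi = bisect_right(angles, theta + eps)
    return index[lo:hi]
```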

(4) Determine the candidate matching points. After all 15 corner codes (Ek, θk) of Pi have been searched, keep the similar ring-type corner code with the greatest similar length found during the computation; the other feature point Qj associated with it is taken as Pi's candidate match Q*j. If several candidates attain the maximum, all of them are kept.

Step 3: Structure matching. Classify the 189 feature points of S according to their similarity and the number of candidate matches, keeping goodClass and unknownClass, and compute the local mapping associated with each element of goodClass under the similarity transformation model. The similarity transformation model is:

X = s·x·cosθ − s·y·sinθ + tx

Y = s·x·sinθ + s·y·cosθ + ty

where s and θ are the scale factor and rotation angle, and tx, ty are the translations in the x and y directions; (x, y) and (X, Y) are the feature-point coordinates before and after the transformation. Note that during local estimation, a candidate pair together with one of its associated corresponding neighbor pairs determines one set of transformation coefficients; therefore, if the similarity of (Pi, Q*j) is ls, then ls sets of local similarity transformations can be estimated, and their mean is taken as the finally estimated local similarity transformation.
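Under this model, one candidate pair plus one corresponding neighbor pair exactly determine the four parameters. A compact sketch uses the complex form q = a·p + b with a = s·e^{iθ} (the function name is illustrative):

```python
import cmath

def estimate_local_similarity(p1, q1, p2, q2):
    """Estimate (s, theta, tx, ty) of the similarity model above from the
    candidate pair (p1 -> q1) and one neighbor pair (p2 -> q2). Two
    correspondences exactly determine the four parameters."""
    zp1, zp2 = complex(*p1), complex(*p2)
    zq1, zq2 = complex(*q1), complex(*q2)
    a = (zq2 - zq1) / (zp2 - zp1)   # a = s * e^{i*theta}
    b = zq1 - a * zp1               # b = tx + i*ty
    return abs(a), cmath.phase(a), b.real, b.imag
```

Averaging the ls parameter sets obtained from the ls neighbor pairs then gives the final local transformation.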

Step 4: Optimized matching. Sort the elements of goodClass from Step 3 by similar length in descending order. The greater the similar length, the more similar the local spatial structures of the two feature points and the more likely the match; starting the clustering from the pair (Pi, Q*j) with the greatest similar length therefore yields the best clustering, and hence matching, result. Clustering always starts from the unclustered pair with the greatest similarity and then proceeds by the clustering rules. For stitching and recognition of large-scale scene images, let s1 and s2 be the sizes of the largest and second-largest clusters; if s1 ≥ 2·s2, or s1 > s2 and s1 > 23, the algorithm outputs the elements of the largest cluster as the optimized matching result, used for optimal transformation estimation; otherwise the match is deemed unsuccessful. For the example of Figure 5, clustering produced 88 clusters in total; the largest contains 33 elements and the second-largest 6. Inspection shows that, within the error tolerance, all elements of the largest cluster are correct matches while all elements of the second-largest are false matches. The ratio of the two sizes is 5.5, the condition is satisfied, and matching succeeds. The elements of the largest cluster are then used as correspondences for the optimal transformation estimation.

Step 5: Transformation estimation. Because the transformation between scene images usually involves some degree of affine or viewpoint change, an affine transformation model is used to estimate the transformation between the images. The affine transformation model is:

X = a·x + b·y + tx

Y = c·x + d·y + ty

where a, b, c, d are the rotation, scaling and stretching factors, and tx, ty are the translations in the x and y directions; (x, y) and (X, Y) are the feature-point coordinates before and after the transformation. For all matched points, a system of linear equations is formed from the affine model and solved by least-squares fitting to obtain the optimal transformation. For the example of Figure 5, the optimal transformation is

  [ 1.04237  0.00366  −333.03  ]
  [ 0.01088  1.02362  −10.1731 ]

Applying the optimal transformation to the image to be matched gives the result shown in Figure 6: within the error tolerance, the two images, which share only part of their content, are stitched together well.
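The least-squares step can be sketched with NumPy (illustrative; `fit_affine` is not a name from the patent):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of the affine model above (X = a*x + b*y + tx,
    Y = c*x + d*y + ty) to matched point pairs, as in Step 5."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    A = np.column_stack([src, np.ones(len(src))])   # rows [x, y, 1]
    # solve A @ [a b tx]^T ~= X and A @ [c d ty]^T ~= Y jointly
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params.T   # 2x3 matrix [[a, b, tx], [c, d, ty]]
```

With exact correspondences the fit reproduces the generating matrix; with noisy matches it returns the least-squares optimum.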

Claims (2)

1. An image registration method based on feature points, characterized by comprising the following steps:
step 1): feature extraction,
extracting feature points of the two images to be registered with a feature-point extraction operator, and defining the two resulting feature point sets as the input data of the method;
step 2): similarity calculation,
establishing a search space: defining the two feature point sets as S and T with n1 and n2 points respectively; extracting the K-neighbor structure of each feature point and constructing its ring-type corner code; the search space is defined as the n2·K corner codes on the n2 ring-type corner codes corresponding to point set T, and the n2·K corner codes are sorted by angle to build an index; the K-neighbor structure of a feature point is represented by its ring-type corner code; the maximum similar ring-type corner code between feature points is computed with a binary-search and incremental-matching algorithm; the similar length of the maximum similar ring-type corner code is defined as the similarity of the two feature points, and the candidate matching pairs are determined while the similarity is computed;
the ring-type corner code is characterized in that: in the K-neighbor structure of a feature point, the feature point is connected to each of its K neighbors by an edge; the combination of each edge and its counterclockwise adjoining angle is defined as an edge-angle code EAC = (E, θ), giving K edge-angle codes (Ei, θi), i = 0, 1, ..., K−1; the K edge-angle codes are encoded in counterclockwise order and connected in sequence into a ring structure to obtain the ring-type corner code of the feature point, and K is defined as the length of the ring-type corner code; if, starting from some pair of corresponding edges, two ring-type corner codes satisfy that the corresponding edges are in proportion and the ratios of the corresponding angles all equal 1, the two ring-type corner codes are defined as similar, and the length of either one is defined as the similar length of the similar ring-type corner codes;
step 3): structure matching,
classifying the candidate pairs according to the similarity and the number of candidate matching points, and performing local mapping estimation; letting (Pi, Q*j) denote a candidate pair, the classification rule is defined as:
(Pi, Q*j) ∈
  badClass      if similarity(Pi, Q*j) < 3
  unknownClass  if similarity(Pi, Q*j) ≥ 3 and NO > 1
  goodClass     if similarity(Pi, Q*j) ≥ 3 and NO = 1
the pairs (Pi, Q*j) in goodClass are defined as the structure-matching result; for each (Pi, Q*j) in goodClass, local mapping estimation uses a similarity transformation model; defining the similarity of (Pi, Q*j) as ls, (Pi, Q*j) is combined with each of its pairs of corresponding neighbor points to estimate ls sets of local similarity transformations, and the mean of the ls local similarity transformations is defined as the finally estimated local mapping; wherein Q*j is a candidate matching point of Pi; NO is the number of Q*j; and similarity(Pi, Q*j) denotes the similarity between Q*j and Pi;
step 4): optimized matching,
clustering with the structure-matching result and determining the feature-point correspondences by the following steps:
(a) sorting the pairs (Pi, Q*j) in goodClass by similarity in descending order;
(b) scanning the sorted pairs (Pi, Q*j) in descending order of similarity; if (Pi, Q*j) has not been clustered, taking (Pi, Q*j) as a cluster center Ci and its associated local mapping Ti as a classifier, and verifying the unclustered candidate pairs and their corresponding neighbor points in goodClass and unknownClass;
first, the ls pairs associated with (Pi, Q*j) are assigned directly to the cluster; the classifier then verifies the unclustered candidate pairs and their matched neighbor pairs remaining in goodClass and unknownClass; when the spatial coordinate relation of a verified pair satisfies the classifier Ti within the error tolerance, the pair is marked, assigned to the cluster, and excluded from further processing; otherwise it is ignored;
(c) analyzing the clustering result,
in the clustering result, the cluster with the most elements is defined as the largest cluster and the cluster with the second most elements as the second-largest cluster; the size of the largest cluster is defined as S1 and that of the second-largest as S2; when one of the following conditions is satisfied, the largest cluster is taken as the feature-point correspondence and the optimal transformation is estimated:
① S1 > t1·S2; ② S1 > S2 and S1 > t2;
wherein t1 and t2 are specific thresholds greater than 1, determined by the kind of images to be matched;
step 5): optimal transformation estimation,
performing least-squares estimation of the mapping between the two images using an affine transformation model.
2. The feature-point-based image registration method according to claim 1, wherein the similarity calculation comprises:
(1) establishing a search space: defining the two feature point sets as S and T with n1 and n2 points respectively, and Pi ∈ S, Qj ∈ T; extracting the K neighbors of each feature point and constructing its ring-type corner code by definition; the search space is defined as the n2·K corner codes on the n2 ring-type corner codes corresponding to point set T, and the n2·K corner codes are sorted by angle to build an index;
(2) determining the search objects: a single search object is defined as one edge-angle code on the ring-type corner code of Pi, so K search objects are searched in total;
(3) determining the search strategy: defining (Ek, θk) as an edge-angle code of Pi, binary search followed by linear search finds in the search space all corner codes (En, θn) whose angles are equal within the error tolerance; the feature point to which (En, θn) belongs is defined as Qj;
(4) computing similar ring-type corner codes with the incremental matching algorithm: comparing (Ek, θk) with each (En, θn); when the comparison of (Ek, θk) and (En, θn) holds, generating, with (Ek, θk) and (En, θn) as starting points, the two ring-type corner codes of Pi and Qj, LP = {A0, A1, ..., A(K−1)} and LQ = {B0, B1, ..., B(K−1)}, where Ak ∈ LP and Bn ∈ LQ denote edge-angle codes, and EkA, θkA, EnB, θnB denote the edges and angles of Ak and Bn respectively;
first comparing A0 and B0: if θ0A and θ0B are not equal, the code with the smaller angle is merged (added) with its next adjacent code to produce a new edge-angle code; if θ0A and θ0B are equal, verifying whether the corresponding edges (E0A, E0B) and (E1A, E1B) adjacent to θ0A and θ0B are in proportion; if so, A0 and B0 are kept in the resulting similar ring-type corner code; if not, A0 and B0 are each merged with their adjacent next codes to produce new codes; the next comparison is then performed in the same cyclic manner until one of the ring-type corner codes has been fully traversed;
(5) determining candidate matching points: defining S(Ak, Bn) = {Lk, Ln} as the pair of similar ring-type corner codes whose starting corresponding edges are EkA and EnB, where Lk and Ln denote similar ring-type corner codes; letting the similar length of S(Ak, Bn) be len^ij(k, n), the similarity of feature points Pi and Qj is defined as:
similarity(Pi, Qj) = max over k, n ∈ [0, K−1] of len^ij(k, n)
when Qj* satisfies Qj* = argmax over Qj ∈ T of similarity(Pi, Qj), Qj* is defined as a candidate matching point of Pi, and the number of Qj* is defined as NO.
CN2009100524538A 2009-06-03 2009-06-03 Image matching method based on characteristic points Expired - Fee Related CN101567051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100524538A CN101567051B (en) 2009-06-03 2009-06-03 Image matching method based on characteristic points

Publications (2)

Publication Number Publication Date
CN101567051A CN101567051A (en) 2009-10-28
CN101567051B true CN101567051B (en) 2012-08-22

Family

ID=41283196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100524538A Expired - Fee Related CN101567051B (en) 2009-06-03 2009-06-03 Image matching method based on characteristic points

Country Status (1)

Country Link
CN (1) CN101567051B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8965134B2 (en) 2011-04-05 2015-02-24 Hewlett-Packard Development Company, L.P. Document registration

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916445A (en) * 2010-08-25 2010-12-15 天津大学 An Image Registration Method Based on Affine Parameter Estimation
CN102088569B (en) * 2010-10-13 2013-06-19 首都师范大学 Sequence image splicing method and system of low-altitude unmanned vehicle
CN102004786B (en) * 2010-12-02 2012-11-28 上海交通大学 Acceleration method in image retrieval system
CN102567724B (en) * 2010-12-11 2016-06-22 罗普特(厦门)科技集团有限公司 Image correction system and method
GB2487377B (en) * 2011-01-18 2018-02-14 Aptina Imaging Corp Matching interest points
CN102184560B (en) * 2011-03-25 2013-01-30 南昌航空大学 A Template-Based CCD-DR Image Stitching Method
CN102129477B (en) * 2011-04-23 2013-01-09 山东大学 Multimode-combined image reordering method
US9390514B2 (en) 2011-06-09 2016-07-12 The Hong Kong University Of Science And Technology Image based tracking
CN102231191B (en) * 2011-07-17 2012-12-26 西安电子科技大学 Multimodal image feature extraction and matching method based on ASIFT (affine scale invariant feature transform)
KR20130015146A (en) * 2011-08-02 2013-02-13 삼성전자주식회사 Method and apparatus for processing medical image, robotic surgery system using image guidance
CN102857704B (en) * 2012-09-12 2015-08-19 天津大学 With the multisource video joining method of time-domain synchronous calibration technology
CN104268140B (en) * 2014-07-31 2017-06-23 浙江大学 Image search method based on weight self study hypergraph and multivariate information fusion
CN104616300B (en) * 2015-02-03 2017-07-28 清华大学 The image matching method and device separated based on sampling configuration
CN105472272A (en) * 2015-11-25 2016-04-06 浙江工业大学 Multi-channel video splicing method based on FPGA and apparatus thereof
CN105869145B (en) * 2016-03-22 2018-12-14 武汉工程大学 A kind of nuclear magnetic resonance image multistep method for registering accelerated based on k-t
CN106446923B (en) * 2016-05-25 2019-08-06 哈尔滨工程大学 Medical Image Classification Method Based on Corner Matching
CN106530341B (en) * 2016-11-01 2019-12-31 成都理工大学 A Point Registration Algorithm Preserving Local Topological Invariance
CN106548493A (en) * 2016-11-03 2017-03-29 亮风台(上海)信息科技有限公司 A kind of method and system of figure matching
CN108458655A (en) * 2017-02-22 2018-08-28 上海理工大学 Support the data configurableization monitoring system and method for vision measurement
CN107240127A (en) * 2017-04-19 2017-10-10 中国航空无线电电子研究所 The image registration appraisal procedure of distinguished point based mapping
CN108427927B (en) * 2018-03-16 2020-11-27 深圳市商汤科技有限公司 Object re-recognition method and apparatus, electronic device, program, and storage medium
CN109726718B (en) * 2019-01-03 2022-09-16 电子科技大学 A system and method for visual scene graph generation based on relational regularization
CN110162656B (en) * 2019-05-05 2021-04-06 南京师范大学 A method and system for enhancing image feature point information
CN110288516A (en) * 2019-06-27 2019-09-27 北京迈格威科技有限公司 Method, apparatus, equipment and the computer readable storage medium of image procossing
CN112651408B (en) * 2021-01-07 2022-05-20 华中科技大学 Point-to-point transformation characteristic-based three-dimensional local surface description method and system
CN113160284B (en) * 2021-03-09 2024-04-30 大连海事大学 Guidance space-consistent photovoltaic image registration method based on local similar structure constraint
CN113479105A (en) * 2021-07-20 2021-10-08 钟求明 Intelligent charging method and intelligent charging station based on automatic driving vehicle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101140624A (en) * 2007-10-18 2008-03-12 清华大学 image matching method
CN101350101A (en) * 2008-09-09 2009-01-21 北京航空航天大学 Automatic Registration Method of Multiple Depth Images

Also Published As

Publication number Publication date
CN101567051A (en) 2009-10-28

Similar Documents

Publication Publication Date Title
CN101567051B (en) Image matching method based on characteristic points
Zhang et al. A review of deep learning-based semantic segmentation for point cloud
US11416710B2 (en) Feature representation device, feature representation method, and program
CN104090972B (en) The image characteristics extraction retrieved for D Urban model and method for measuring similarity
CN113160287B (en) Complex component point cloud splicing method and system based on feature fusion
Hao et al. Efficient 2D-to-3D correspondence filtering for scalable 3D object recognition
Zhang et al. Convmatch: Rethinking network design for two-view correspondence learning
CN103345744B (en) A kind of human body target part automatic analytic method based on many images
CN110188763B (en) Image significance detection method based on improved graph model
Xu et al. GLORN: Strong generalization fully convolutional network for low-overlap point cloud registration
Li et al. Hierarchical semantic parsing for object pose estimation in densely cluttered scenes
CN115661509A (en) Surgical instrument identification and classification method based on three-dimensional point cloud ICP (iterative closest point) registration algorithm
Radkowski et al. Natural feature tracking augmented reality for on-site assembly assistance systems
Shao et al. A deep learning-based semantic filter for RANSAC-based fundamental matrix calculation and the ORB-SLAM system
Li et al. 4FP-structure: A robust local region feature descriptor
CN104978582A (en) Contour chord angle feature based identification method for blocked target
CN115049833A (en) Point cloud component segmentation method based on local feature enhancement and similarity measurement
Han et al. Grid graph-based large-scale point clouds registration
Nurhaida et al. Determining the Number of Batik Motif Object based on Hierarchical Symmetry Detection Approach
Wang et al. Image matching via the local neighborhood for low inlier ratio
Tang et al. A GMS-guided approach for 2D feature correspondence selection
CN112861714B (en) Remote sensing image matching method based on deep learning and multi-sub image matching
Wu et al. Image Matching Algorithm Based on Topology Consistency of Bidirectional Optimal Matching Point Pairs.
CN114707174A (en) Data processing method and device, electronic equipment and storage medium
Shen et al. Extended Neighborhood Consensus With Affine Correspondence for Outlier Filtering in Feature Matching

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120822
Termination date: 20200603