CN104700400A - High-precision image matching method based on feature point training


Info

Publication number
CN104700400A
CN104700400A (application CN201510046829.XA)
Authority
CN
China
Prior art keywords
point
image
feature point
registration
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510046829.XA
Other languages
Chinese (zh)
Inventor
胡晓彤
陈蕴智
田仁赞
郭少英
王旭迎
程雪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University of Science and Technology
Original Assignee
Tianjin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University of Science and Technology filed Critical Tianjin University of Science and Technology
Priority to CN201510046829.XA priority Critical patent/CN104700400A/en
Publication of CN104700400A publication Critical patent/CN104700400A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a high-precision image matching method based on feature point training. Before an image is registered, feature point matching is performed between the standard image and images to be registered at different angles; the feature points that match accurately at all tested angles are kept as the reference feature points for actual detection. Because the selected feature points are those located with high precision, using them as the basis for computing the transformation model improves localization precision; at the same time, the computational load of the feature point matching operation is further reduced and the algorithm runs faster.

Description

A high-precision image matching method based on feature point training
Technical field
The present invention relates to feature point matching methods, in particular to a high-precision image matching method based on feature point training.
Background technology
Image registration is a fundamental problem in the field of machine vision and has long been both a focus and a difficulty of research. Image registration means finding the correspondence between two or more images of the same scene taken at different times, from different viewpoints or by different sensors. Image registration methods fall mainly into two categories: gray-level based registration and feature-based registration.
Based on the method for gray scale registration, also referred to as correlation registration method, carry out image registration with space two-dimensional sleiding form, the difference of algorithms of different is mainly reflected in the selection of correlation criterion.Distinguished point based method for registering first in original image, extracts feature, then sets up the registration relation of unique point between two width images, have higher robustness, such as SURF algorithm.At present, be used widely based on the image registration of SURF algorithm, be such as applied to medical figure registration, remote sensing image registration etc.The unique point that first SURF algorithm extracts in image carries out Feature Points Matching, and the feature point pairs of coupling is used for computed image registration transformation model parameter, makes image subject to registration transform to standard picture position exactly, to reach registration object.The coupling of unique point is the key realizing image registration, and the precision of coupling directly affects the precision of subsequent registration.According to the unique point neighborhood information comprised in SURF unique point descriptor in algorithm, adopt K nearest neighbor method can find out two potential optimal match points of each unique point, and optimize optimal matching points by the distance optimum value of match point with time good value ratio.
In real registration tasks, however, the image to be registered may be rotated by various angles relative to the standard image. Although the feature points detected by the SURF algorithm are rotation invariant, the localization precision of the extracted feature points still varies with the rotation of the image. To improve registration precision, the key problem is how to extract feature points that are unaffected by changes in the image angle.
Summary of the invention
The object of the present invention is to provide a high-precision image matching method based on feature point training that selects feature points unaffected by image rotation, thereby improving the precision of image registration.
The present invention solves its technical problem with the following technical scheme:
A high-precision image matching method based on feature point training, characterized in that: before image registration, feature point matching is performed between the standard image and the images to be registered at different angles, and the feature points with higher localization precision are extracted as the reference feature points for the actual detection process.
Moreover, the specific selection steps of the reference feature points are: the SURF feature point extraction algorithm is used to extract the feature points of two images at different rotation angles; each set is matched against the feature points of the standard image, giving the sets of matched feature points in the standard image, whose X and Y coordinates are recorded; the two sets of successfully matched feature points are then analyzed, and the feature points whose X and Y coordinates are identical in both sets are chosen as the reference feature points.
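A minimal sketch of this selection step, assuming OpenCV's contrib build provides the SURF implementation (cv2.xfeatures2d.SURF_create); the function names, the ratio value, and the three-decimal coincidence test are illustrative readings of the text rather than code from the patent:

```python
import cv2

def matched_standard_points(standard, rotated, ratio=0.65):
    """Match SURF feature points of one rotated image against the
    standard image; return the matched coordinates in the standard image."""
    surf = cv2.xfeatures2d.SURF_create()              # needs opencv-contrib-python
    kp_s, des_s = surf.detectAndCompute(standard, None)
    kp_r, des_r = surf.detectAndCompute(rotated, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pts = set()
    for pair in matcher.knnMatch(des_s, des_r, k=2):  # two nearest neighbours
        if len(pair) == 2 and pair[0].distance <= ratio * pair[1].distance:
            x, y = kp_s[pair[0].queryIdx].pt
            pts.add((round(x, 3), round(y, 3)))       # three-decimal precision
    return pts

def reference_feature_points(standard, rot_ccw_2deg, rot_cw_2deg):
    """Reference points: standard-image points whose X and Y coordinates
    coincide exactly across both rotation tests."""
    return matched_standard_points(standard, rot_ccw_2deg) & \
           matched_standard_points(standard, rot_cw_2deg)
```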
Moreover, the rotation angle is 2° to 10°.
Moreover, the image registration uses the SURF feature point detection algorithm: the reference feature points are used to compute the parameters of the registration transformation model of the SURF-based registration, so that the image to be registered is transformed exactly onto the standard image.
The advantages and positive effects of the present invention are:
1. By verifying the matching performance of the feature points, this method keeps only the feature points that match accurately under every rotation tested, whose coordinates agree to three decimal places. These feature points describe the image features strongly, are located with high precision, and adapt well to feature point matching at different rotation angles.
2. The feature points chosen by the present invention make it possible to use precisely located feature points as the basis for computing the transformation model, improving localization precision, while also further reducing the computational load of the feature point matching operation and speeding up the algorithm.
Brief description of the drawings
Fig. 1 shows the matched feature points in the standard image for a rotation of 2 degrees counterclockwise;
Fig. 2 shows the matched feature points in the standard image for a rotation of 2 degrees clockwise;
Fig. 3 shows the distribution of the feature points in the standard image that match at both rotation angles;
Fig. 4 shows the distribution of the coordinate differences computed for the coincident and non-coincident feature points;
Fig. 5 shows the 9 × 9 box filter templates;
Fig. 6 shows the selection of the principal direction of a feature point;
Fig. 7 shows the formation of the feature descriptor.
Embodiment
The invention is further described below with reference to the accompanying drawings and by means of specific embodiments. The following embodiments are descriptive, not restrictive, and do not limit the scope of protection of the present invention.
A high-precision image matching method based on feature point training. To study the degree to which the localization precision of feature points is affected by different transformations, this embodiment uses a test image rotated 2 degrees counterclockwise and a test image rotated 2 degrees clockwise.
First, the SURF feature point extraction algorithm is used to extract the feature points of the two images at different rotation angles, and each set is matched against the feature points of the standard image to obtain the sets of matched feature points in the standard image. As shown in Fig. 1 and Fig. 2, the feature points in the standard image that match at each rotation angle are marked with differently shaped markers.
Next, further analysis of the two sets of successfully matched feature points shows that some feature points have exactly identical X and Y coordinates in both sets (the points in Fig. 3 where the two marker shapes overlap completely); these are the feature points that match accurately at both rotation angles.
Table 1 lists the coordinates of the coincident points.
Table 1 Coordinates of the coincident points
No. X coordinate Y coordinate
1 143.113 119.743
2 216.9 155.781
3 225.787 327.051
4 243.284 410.629
5 258.155 55.093
6 344.04 297.874
7 381.819 415.828
8 509.629 262.798
9 513.646 205.083
10 518.92 262.092
The experimental results show that the sets of feature points matched successfully under the different rotation conditions differ considerably. At the same time, the sets of matched feature point pairs have a common intersection: some feature points match accurately under both rotation conditions, with coordinates that agree to three decimal places. This shows that these feature points describe the image features strongly, are located with high precision, and adapt well to feature point matching at different rotation angles.
To further verify the image feature description capability of this kind of feature point, a rotation adaptability test was carried out. The coincident feature points and the non-coincident feature points extracted in Fig. 3 (all of them feature points of the standard image) were each used as matching reference points and matched against the feature points extracted from a test image rotated 10 degrees clockwise, and the successfully matched feature points were processed as follows:
(1) For the matched feature points in each of the two groups of reference points, the theoretical coordinates after a 10-degree image rotation were computed from image rotation theory;
(2) The differences in the X and Y directions between the coordinates computed in (1) and the coordinates of the matched feature points detected in the 10-degree rotated test image were calculated, giving the localization deviations of these feature points; the results are shown in Fig. 4.
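A small sketch of steps (1) and (2); rotation about the image center and the sign convention for a clockwise rotation in image coordinates are assumptions, since the text appeals only to "image rotation theory":

```python
import numpy as np

def theoretical_coordinate(pt, center, angle_deg):
    """Coordinate of a feature point after rotating the image by
    angle_deg (positive taken as clockwise in image coordinates, y down)."""
    t = np.deg2rad(angle_deg)
    dx, dy = pt[0] - center[0], pt[1] - center[1]
    return (center[0] + np.cos(t) * dx - np.sin(t) * dy,
            center[1] + np.sin(t) * dx + np.cos(t) * dy)

# Step (2): deviation between detection and theory for one matched point.
# theo = theoretical_coordinate(p_std, (w / 2, h / 2), 10)
# err_x, err_y = p_detected[0] - theo[0], p_detected[1] - theo[1]
```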
As Fig. 4 shows, for the coincident feature points the differences between the coordinates extracted in the standard image and the theoretically computed values vary little and are concentrated near the origin, indicating high localization precision. Correspondingly, the deviations of the non-coincident feature points are larger. This result shows that feature points differ in localization precision.
For this reason, before image registration, feature point matching can be performed between the standard image and the images to be registered at different angles, and the feature points with higher localization precision extracted as the reference feature points for the actual detection process. This method makes it possible to use precisely located feature points as the basis of the transformation model, improving localization precision, while also further reducing the computational load of the feature point matching operation and speeding up the algorithm.
To show that the feature points of the standard image that match the images to be registered at all the different angles can be extracted before image registration and used as the reference feature points in the actual detection process, a further analysis was carried out. Two training images to be registered, at different combinations of angles, were matched against the standard image; the feature points of the standard image that match both training images were extracted; and test images at other angles were used to verify the validity of the extracted feature points. The results are shown in Table 2.
Table 2 Precision analysis of feature point matching
As Table 2 shows, the mean distance between the coincident points and their theoretical coordinates is consistently smaller than that of the non-coincident points, only about half as large. This further demonstrates the validity of selecting the feature points of the standard image that match the images at all the different angles as the reference points for the subsequent detection process.
The reference feature points finally obtained are used to compute the parameters of the registration transformation model of the SURF-based image registration, so that the image to be registered is transformed exactly onto the standard image, achieving registration.
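The patent does not fix the form of the transformation model; the sketch below assumes a similarity transform (rotation, translation and scale) estimated with OpenCV from the reference point pairs, as one plausible realization:

```python
import cv2
import numpy as np

def register_to_standard(img_to_register, standard, ref_reg, ref_std):
    """Estimate the registration transform from corresponding reference
    points (N x 2 arrays, names are ours) and warp the image onto the
    standard image's frame."""
    src = np.asarray(ref_reg, dtype=np.float32)
    dst = np.asarray(ref_std, dtype=np.float32)
    M, inliers = cv2.estimateAffinePartial2D(src, dst)  # rotation+translation+scale
    h, w = standard.shape[:2]
    return cv2.warpAffine(img_to_register, M, (w, h))
```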
The SURF feature point detection algorithm is a fast, highly robust feature extraction algorithm proposed on the basis of the SIFT algorithm. It is robust not only to image rotation, translation, scaling and noise, but also handles illumination changes, viewpoint changes and image blur well. Meanwhile, by introducing the integral image and box filters, its computation speed is improved by roughly a factor of three, giving superior overall performance.
The SURF feature point detection algorithm determines the positions of the feature points from the local extrema of the determinant of the Hessian matrix. For a point $\mathbf{x} = (x, y)$ of the image $I$ at scale $\sigma$, the Hessian matrix $H(\mathbf{x}, \sigma)$ is defined as:

$$H(\mathbf{x}, \sigma) = \begin{bmatrix} L_{xx}(\mathbf{x}, \sigma) & L_{xy}(\mathbf{x}, \sigma) \\ L_{xy}(\mathbf{x}, \sigma) & L_{yy}(\mathbf{x}, \sigma) \end{bmatrix}$$

where $L_{xx}(\mathbf{x}, \sigma)$ is the convolution of the second-order Gaussian derivative with the image $I$ at the point $\mathbf{x} = (x, y)$, and $L_{xy}(\mathbf{x}, \sigma)$ and $L_{yy}(\mathbf{x}, \sigma)$ have analogous meanings.
The SURF algorithm uses box filters to approximate the second-order Gaussian filters, constructing a fast Hessian matrix, and uses the integral image to accelerate the convolutions and improve computation speed.
The 9 × 9 box filter templates are shown in Fig. 5. To give the SURF features scale invariance, a scale space must be constructed, and the local extrema are obtained from the fast Hessian determinant at each scale. For this, the SURF algorithm convolves the source image with box filters of different sizes to obtain the feature description at different scales. After box filtering, the response in the x direction is denoted $D_{xx}$, the response in the xy direction $D_{xy}$, and the response in the y direction $D_{yy}$. Because box filtering is an approximate estimate of second-order Gaussian filtering, a scale factor $\omega$ (approximately 0.9) is introduced, and the determinant of the fast Hessian matrix becomes:

$$\Delta H = D_{xx}(\mathbf{x})\, D_{yy}(\mathbf{x}) - \left(\omega\, D_{xy}(\mathbf{x})\right)^2$$

where $\Delta H$ is the box filter response of the region around the image point $I(x, y)$, and the extremum detection is carried out on $\Delta H$.
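The two ingredients named here can be sketched as follows for a grayscale image: the integral image, which makes any box filter response a four-lookup sum, and the weighted determinant ΔH:

```python
import numpy as np

def integral_image(img):
    """Integral image with a zero first row/column so box sums need
    no boundary checks."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img.astype(np.float64), axis=0), axis=1)
    return ii

def box_sum(ii, x0, y0, x1, y1):
    """Sum of img[y0:y1, x0:x1] (exclusive upper bounds) from four lookups."""
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

def fast_hessian_det(Dxx, Dyy, Dxy, w=0.9):
    """Approximate Hessian determinant: Delta_H = Dxx*Dyy - (w*Dxy)^2."""
    return Dxx * Dyy - (w * Dxy) ** 2
```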
After the local extrema of the scale images have been obtained from the fast Hessian matrix at each scale, each local extremum is placed at the center of a 3 × 3 × 3 three-dimensional neighborhood formed by its 8 neighbors at the same scale and the 9 points at each of the two adjacent scales. The extremum is compared with the 26 neighboring points of this neighborhood, and only when its value is greater than (or less than) all 26 neighbors is it taken as a candidate feature point. To locate the candidate feature points with sub-pixel accuracy, interpolation is performed in scale space and image space, giving stable feature point positions and their scales.
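The 26-neighbor comparison reduces to the following sketch, where `stack` is an assumed layout holding the ΔH responses at three adjacent scales:

```python
import numpy as np

def is_local_extremum(stack, y, x):
    """stack: responses at three adjacent scales, shape (3, H, W), with the
    candidate at stack[1, y, x]. True if the candidate is strictly greater
    (or strictly smaller) than all 26 neighbours of its 3x3x3 neighbourhood."""
    cube = stack[:, y - 1:y + 2, x - 1:x + 2].ravel()
    v, others = cube[13], np.delete(cube, 13)   # index 13 is the centre
    return bool((v > others).all() or (v < others).all())
```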
The quadratic fitting function used for the interpolation is:

$$D(X) = D + \frac{\partial D^{T}}{\partial X} X + \frac{1}{2} X^{T} \frac{\partial^{2} D}{\partial X^{2}} X$$

Differentiating and setting the derivative to zero gives the extremum:

$$\hat{X} = -\left(\frac{\partial^{2} D}{\partial X^{2}}\right)^{-1} \frac{\partial D}{\partial X}$$

and the value of the function at the extremum is:

$$D(\hat{X}) = D + \frac{1}{2} \frac{\partial D^{T}}{\partial X} \hat{X}$$
Candidate feature points with $D(\hat{X}) < 0.03$ are rejected in the experiment.
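A sketch of this refinement, assuming the gradient and Hessian of D over (x, y, σ) have already been estimated by finite differences:

```python
import numpy as np

def refine_candidate(grad, hess, D0, thresh=0.03):
    """grad: dD/dX (3-vector over x, y, scale); hess: d2D/dX2 (3x3 matrix);
    D0: response at the candidate. Returns (offset, interpolated response)
    or None if the point is rejected as low contrast."""
    x_hat = -np.linalg.solve(hess, grad)    # X^ = -(d2D/dX2)^-1 dD/dX
    D_hat = D0 + 0.5 * grad.dot(x_hat)      # D(X^) = D + (1/2) (dD/dX)^T X^
    if abs(D_hat) < thresh:                 # the 0.03 rejection of the text
        return None
    return x_hat, D_hat
```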
Description of SURF feature points
To keep the feature points rotation invariant, a principal direction is determined for each feature point once its position has been fixed. Within a circular region of radius 6σ centered on the feature point (σ being the scale of the feature point), the Haar wavelet responses of the image in the x and y directions are computed (with a Haar wavelet side length of 4σ), and the responses are weighted with a Gaussian weighting function of scale 2σ, so that responses closer to the feature point contribute more. A sector region of angle π/3 is then swept over the whole circular region, and the direction of the sector in which the vector formed by the summed horizontal and vertical responses has the largest modulus is taken as the principal direction of the feature point (as shown in Fig. 6).
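The sector sweep can be sketched as below; dx and dy are the Gaussian-weighted Haar responses of the samples in the 6σ circle, and the sector step size is an assumption (the text fixes only the π/3 sector width):

```python
import numpy as np

def principal_direction(dx, dy, step=0.1):
    """Slide a pi/3 sector around the circle; the sector whose summed
    response vector (sum dx, sum dy) has the largest modulus gives the
    principal direction of the feature point."""
    ang = np.arctan2(dy, dx) % (2 * np.pi)
    best_dir, best_norm = 0.0, -1.0
    for start in np.arange(0.0, 2 * np.pi, step):
        m = (ang - start) % (2 * np.pi) < np.pi / 3   # samples inside the sector
        sx, sy = dx[m].sum(), dy[m].sum()
        if sx * sx + sy * sy > best_norm:
            best_norm, best_dir = sx * sx + sy * sy, np.arctan2(sy, sx)
    return best_dir
```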
Then a square window with side length 20σ, centered on the feature point and oriented along the principal direction, is constructed and divided into 4 × 4 subregions. In each subregion, the horizontal and vertical Haar wavelet responses of 5σ × 5σ sample points are computed, denoted $d_x$ and $d_y$ respectively; the responses are again weighted with a Gaussian weighting function of scale 2σ to increase robustness to geometric transformations. The responses of each subregion and their absolute values are then summed to form $\sum d_x$, $\sum d_y$, $\sum |d_x|$ and $\sum |d_y|$. Each subregion thus yields a four-dimensional description vector $V$:

$$V = \left(\sum d_x,\ \sum d_y,\ \sum |d_x|,\ \sum |d_y|\right)$$
Each feature point thus yields a feature vector of 4 × 4 × 4 = 64 dimensions. To ensure invariance to illumination, the feature vector is normalized, giving the final SURF descriptor, as shown in Fig. 7.
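The descriptor assembly described above reduces to the following sketch; each of the 16 subregions contributes its four sums, and the concatenation is normalized:

```python
import numpy as np

def subregion_vector(dx, dy):
    """V = (sum dx, sum dy, sum |dx|, sum |dy|) for one subregion's samples."""
    return np.array([dx.sum(), dy.sum(), np.abs(dx).sum(), np.abs(dy).sum()])

def surf_descriptor(subregions):
    """Concatenate the 4x4 subregion vectors into 64 dimensions and
    normalize for illumination invariance."""
    v = np.concatenate([subregion_vector(dx, dy) for dx, dy in subregions])
    return v / np.linalg.norm(v)
```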
Feature point matching
Feature point matching is the key to image registration, and the matching precision directly affects the precision of the subsequent registration. Using the feature point neighborhood information contained in the SURF descriptor, the K-nearest-neighbor method finds the two potential best matches of each feature point, and the best match is validated by the ratio of the best matching distance to the second-best.
Let $M_1$ and $M_2$ be the sets of feature points to be matched, extracted by SURF from the two images $I_1$ and $I_2$ respectively. For any feature point $m_{1i}$ in $M_1$, let $m_{2j}$ and $m'_{2j}$ be the two feature points of $M_2$ with the smallest Euclidean distances to $m_{1i}$, the corresponding distances being $d_{ij}$ and $d'_{ij}$. If $d_{ij} \le \alpha \cdot d'_{ij}$ ($\alpha = 0.65$ in the experiment), then $m_{1i}$ and $m_{2j}$ are taken as a preferred matching point pair.
The above method yields all the matching point pairs of the standard image and the image to be registered, where $S$ and $D$ are the sets of matched feature points of the standard image and the image to be registered respectively. Because these matching pairs still contain some false matches, they are purified here by a triangle-similarity method. Any three matching point pairs $P_1$–$Q_1$, $P_2$–$Q_2$ and $P_3$–$Q_3$ are taken from the feature point sets $S$ and $D$, forming the virtual triangles $\Delta P_1 P_2 P_3$ and $\Delta Q_1 Q_2 Q_3$, i.e. one triangle pair, whose side lengths are $lp_1$, $lp_2$, $lp_3$ and $lq_1$, $lq_2$, $lq_3$. If the relation below is satisfied, the chosen matching point pairs are valid; otherwise a false matching pair is present and is rejected, achieving the purification of the feature point pairs:

$$\frac{lp_1}{lq_1} \approx \frac{lp_2}{lq_2} \approx \frac{lp_3}{lq_3}$$

In the experiment of this embodiment, the error tolerance between the side ratios is 0.02.
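A sketch of the purification test for one triangle pair; how the 0.02 tolerance is applied to compare the three side ratios (absolute versus relative) is an assumption:

```python
import numpy as np

def _sides(a, b, c):
    """Side lengths of the triangle abc."""
    return [float(np.hypot(p[0] - q[0], p[1] - q[1]))
            for p, q in ((a, b), (b, c), (c, a))]

def triangle_pair_valid(S, D, i, j, k, tol=0.02):
    """Check one triangle pair built from matches i, j, k of the matched
    sets S (standard image) and D (image to be registered): the three
    side ratios lp/lq must agree within tol."""
    lp = _sides(S[i], S[j], S[k])
    lq = _sides(D[i], D[j], D[k])
    r = [p / q for p, q in zip(lp, lq)]
    return max(r) - min(r) <= tol
```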
Although the embodiments and drawings of the present invention are disclosed for the purpose of illustration, those skilled in the art will appreciate that various substitutions, changes and modifications are possible without departing from the spirit and scope of the present invention and the appended claims; therefore, the scope of the present invention is not limited to the contents disclosed in the embodiments and drawings.

Claims (4)

1. A high-precision image matching method based on feature point training, characterized in that: before image registration, feature point matching is performed between the standard image and the images to be registered at different angles, and the feature points with higher localization precision are extracted as the reference feature points for the actual detection process.
2. The high-precision image matching method based on feature point training according to claim 1, characterized in that the reference feature points are selected as follows: the SURF feature point extraction algorithm is used to extract the feature points of two images at different rotation angles; each set is matched against the feature points of the standard image, giving the sets of matched feature points in the standard image, whose X and Y coordinates are recorded; the two sets of successfully matched feature points are then analyzed, and the feature points whose X and Y coordinates are identical in both sets are chosen as the reference feature points.
3. The high-precision image matching method based on feature point training according to claim 1, characterized in that the rotation angle is 2° to 10°.
4. The high-precision image matching method based on feature point training according to claim 1, characterized in that the image registration uses the SURF feature point detection algorithm, and the reference feature points are used to compute the parameters of the registration transformation model of the SURF-based registration, so that the image to be registered is transformed exactly onto the standard image.
CN201510046829.XA 2015-01-30 2015-01-30 High-precision image matching method based on feature point training Pending CN104700400A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510046829.XA CN104700400A (en) 2015-01-30 2015-01-30 High-precision image matching method based on feature point training

Publications (1)

Publication Number Publication Date
CN104700400A true CN104700400A (en) 2015-06-10

Family

ID=53347482

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510046829.XA Pending CN104700400A (en) 2015-01-30 2015-01-30 High-precision image matching method based on feature point training

Country Status (1)

Country Link
CN (1) CN104700400A (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101276465A (en) * 2008-04-17 2008-10-01 上海交通大学 Method for automatically split-jointing wide-angle image
US20120281146A1 (en) * 2010-11-11 2012-11-08 Hitoshi Yamada Image processing device, image processing method, and program for image processing
CN103489195A (en) * 2013-10-11 2014-01-01 胡晓彤 Method for detecting can body pattern direction of metal can
CN103979118A (en) * 2014-04-01 2014-08-13 浙江大学 Airfoil wall plate digital positioning method and positioning device
CN103955931A (en) * 2014-04-29 2014-07-30 江苏物联网研究发展中心 Image matching method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhu Shiyuan: "Research on robust digital image watermarking algorithms against geometric attacks", China Master's Theses Full-text Database, Information Science and Technology *
Hu Xiaotong, Yang Wei: "A sub-region image registration method based on fast Fourier transform", Journal of Tianjin University of Science and Technology *
Yuan Weiqi et al.: "Iris recognition algorithm based on stable feature points extracted from image sequences", Chinese Journal of Scientific Instrument *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105701766A (en) * 2016-02-24 2016-06-22 网易(杭州)网络有限公司 Image matching method and device
CN105701766B (en) * 2016-02-24 2019-03-19 网易(杭州)网络有限公司 Image matching method and device
CN110443796A (en) * 2019-08-15 2019-11-12 厦门盈趣科技股份有限公司 Graph testing method and relevant apparatus
CN114565781A (en) * 2022-02-25 2022-05-31 中国人民解放军战略支援部队信息工程大学 Image matching method based on rotation invariance

Similar Documents

Publication Publication Date Title
CN101950419B (en) Quick image rectification method in presence of translation and rotation at same time
CN101980250B (en) Method for identifying target based on dimension reduction local feature descriptor and hidden conditional random field
CN103824080B (en) Robot SLAM object state detection method in dynamic sparse environment
CN105913489A (en) Indoor three-dimensional scene reconstruction method employing plane characteristics
CN103136525B (en) High-precision positioning method for special-shaped extended target by utilizing generalized Hough transformation
CN105740899A (en) Machine vision image characteristic point detection and matching combination optimization method
CN102661708B (en) High-density packaged element positioning method based on speeded up robust features (SURFs)
CN102324045B (en) Invariant-moment target recognition method based on Radon transformation and polar harmonic transformation
CN104318548A (en) Rapid image registration implementation method based on space sparsity and SIFT feature extraction
CN103400388A (en) Method for eliminating Brisk key point error matching point pair by using RANSAC
CN103295239A (en) Laser-point cloud data automatic registration method based on plane base images
CN103679702A (en) Matching method based on image edge vectors
CN103727930A (en) Edge-matching-based relative pose calibration method of laser range finder and camera
CN105160686B (en) A kind of low latitude various visual angles Remote Sensing Images Matching Method based on improvement SIFT operators
CN102865859A (en) Aviation sequence image position estimating method based on SURF (Speeded Up Robust Features)
CN103632142A (en) Local coordinate system feature description based image matching method
CN112163588A (en) Intelligent evolution-based heterogeneous image target detection method, storage medium and equipment
CN105787451A (en) Fingerprint matching method based on multi-judgment point mode
CN104851095A (en) Workpiece image sparse stereo matching method based on improved-type shape context
CN116452644A (en) Three-dimensional point cloud registration method and device based on feature descriptors and storage medium
CN103593838A (en) Rapid cross-correlation grey-scale image coupling method and rapid cross-correlation grey-scale image coupling device
CN108182705A (en) A kind of three-dimensional coordinate localization method based on machine vision
CN101833763A (en) Method for detecting reflection image on water surface
CN104700400A (en) High-precision image matching method based on feature point training
CN103778619A (en) Image matching method based on Zernike matrix

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150610