CN104766323B - A point matching method for remote sensing images - Google Patents
A point matching method for remote sensing images
- Publication number: CN104766323B (application number CN201510160933.1A)
- Authority
- CN
- China
- Prior art keywords
- point
- matching
- remote sensing
- pairs
- sensing images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention discloses a point matching method for remote sensing images, which screens a correct point-match set out of the initial matching point set of remote sensing images whose background has changed greatly. The method exploits the similarity of the global structure formed by the correct match points in the two images. First, arbitrarily chosen triples of point pairs form many corresponding triangle pairs; the differences between the corresponding angles and the corresponding side-length ratios of each pair of triangles are calculated, the rotation order of the three vertices about a reference point is compared, and the triangle pairs that meet the similarity condition are screened out. Then the frequency with which each point pair appears in these similar triangles is counted, a histogram is established, a threshold is set, and point pairs whose frequency falls below the threshold are removed. Finally, a spatial-consistency measure removes mismatched points whose coordinates are very similar. The invention improves the point matching precision of remote sensing images and thereby improves their registration accuracy.
Description
Technical Field
The invention relates to the technical field of remote sensing image processing, in particular to a point matching method of a remote sensing image.
Background
Image registration is one of the most basic techniques in image processing and has been widely used in medicine, pattern recognition, remote sensing, and computer vision. Registration is the alignment of different images so that all features in them can be represented in the same coordinate system. Feature matching is the most important step in image registration: if there are too many mismatches in the feature matching set, erroneous transformation parameters will be generated, so a robust and accurate feature matching method is necessary. Most widely used feature matching methods are point matching methods, which divide mainly into two types: 1) matching using local gray-level information; 2) matching using a local neighborhood structure or a global structure. SIFT is a point feature extraction and matching method based on local descriptors, which searches for matching point pairs according to the similarity of the point descriptors. However, when this method is applied to remote sensing images, especially images taken before and after a disaster, the local gray values change greatly; relying solely on local gray information is unreliable and easily produces many mismatched points. RANSAC is a classical point matching method that uses the initial matching point set to estimate conversion parameters while removing mismatches from the matching point set; however, it is ineffective when the proportion of mismatched point pairs is large. GTM searches for correct matches in the initial matching point set using spatial relations in the K-nearest neighborhood of each feature point, but it cannot eliminate mismatches that share the same neighborhood structure, and correct matching points surrounded by a large number of mismatched points may be erroneously removed.
The WGTM algorithm is a modified version of GTM that uses the angular distance between line segments connecting feature points as a weight, but it can only remove part of the mismatches with similar neighborhood structures and cannot distinguish correct matching points with different neighborhood structures. The Bi-SOGC algorithm is a GTM-based method mainly applied to remote sensing images. It uses ordered line segments instead of unordered ones, exploits the spatial order of the feature points to remove mismatches with similar neighborhood structures, and applies a recovery strategy to retrieve correct point pairs that were incorrectly removed. This method is effective for some remote sensing images, but it cannot handle isolated correct matching points that have no feature points in their neighborhood.
Existing point matching methods thus match either on local gray information or on local neighborhood structure. However, when applied to remote sensing image registration, especially with images taken before and after a disaster, the background changes greatly: the initial matching set contains many mismatches and few correct matches, the correct matches are sparsely distributed, and little correct structural information is available in any neighborhood, so neither local gray information nor neighborhood structural information achieves a satisfactory result. The invention (the triangle transformation matching method, TTM) utilizes the similarity of the global structure formed by the correct matching points to remove mismatches and retain correct matches. Although the gray-level information of remote sensing images changes greatly before and after a disaster, the global structure formed by the correct matching points remains similar. The invention adopts triangles as the elements of the global structure, compares the similarity of the triangles, and finally extracts the correct matching point set.
Disclosure of Invention
The invention aims to provide a point matching method for remote sensing images that improves the point matching precision of remote sensing images and thereby improves their registration precision.
The technical scheme adopted by the invention is as follows: a point matching method for remote sensing images comprises the following steps:
step (1): the initial matching point set consists of two corresponding point sets extracted from the reference image and the sensed image, P = {p_i} and P' = {p_i'}, where i = 1, 2, 3, ..., N and N is the number of points in each set; any three selected point pairs form a corresponding triangle pair, the three selected point pairs being denoted (p_i, p_j, p_k) and (p_i', p_j', p_k');
step (2): calculate the corresponding angle differences Δθ₁, Δθ₂, Δθ₃ and the corresponding side-length-ratio differences Δl₁, Δl₂, Δl₃ between the two triangles;
step (3): using the longitudinal and transverse coordinates of the points p_i and p_i' (and likewise of p_j, p_k and p_j', p_k'), set reference points p_d and p_d' in the two triangles, whose coordinates are calculated from the coordinates of the three vertices;
step (4): set the triangle similarity thresholds θ_b = 5° and l_b = 0.1; then, according to the consistency of the rotation order of the three vertices about the reference point and the conditions Δθ₁, Δθ₂, Δθ₃ ∈ [0, θ_b] and Δl₁, Δl₂, Δl₃ ∈ [0, l_b], screen out the triangle pairs that meet the conditions;
step (5): count the number of times each point pair appears in the selected triangle pairs and establish a frequency histogram;
step (6): set a threshold k_p according to experimental analysis, and remove point pairs whose number of occurrences is less than k_p;
step (7): calculate the model conversion parameters from all remaining point pairs, the point sets being denoted P_o and P_o' and their size n;
T = argmin E(T(θ))   (10)
wherein E(T(θ)) = sqrt((1/n) Σ_{i=1}^{n} d(i)²) denotes the root mean square error, T(θ) denotes the conversion parameters of the point sets P_o and P_o' obtained using the least squares method, and d(i) = ||T(θ)(p_i') − p_i||, i = 1, 2, 3, ..., n, denotes the conversion error of each point pair;
step (8): set a threshold E_t; while E(T(θ)) > E_t, remove the point pair with the largest d(i) value and update the point sets; when the condition E(T(θ)) ≤ E_t is satisfied, the iteration ends, and two correct matching point sets are obtained.
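The triangle-similarity test of steps (2)-(4) can be sketched in code. This is a minimal illustration, not the patented implementation: the patent's equations (1)-(4) are not reproduced in this text, so the interior angles are assumed to come from the law of cosines and the side-length ratio of each side is assumed to be its length divided by the triangle's perimeter.

```python
import numpy as np

def triangle_features(tri):
    """Interior angles (radians) and perimeter-normalized side lengths.

    tri: 3x2 array-like of vertex coordinates (p_i, p_j, p_k).
    """
    a, b, c = np.asarray(tri, dtype=float)
    # Side lengths, each opposite the vertex with the same letter
    la, lb, lc = (np.linalg.norm(b - c),
                  np.linalg.norm(a - c),
                  np.linalg.norm(a - b))

    def angle(opp, s1, s2):
        # Law of cosines; clip guards against rounding just outside [-1, 1]
        return np.arccos(np.clip((s1**2 + s2**2 - opp**2) / (2 * s1 * s2), -1, 1))

    angles = np.array([angle(la, lb, lc), angle(lb, la, lc), angle(lc, la, lb)])
    ratios = np.array([la, lb, lc]) / (la + lb + lc)  # assumed side-length ratios
    return angles, ratios

def similar(tri1, tri2, theta_b=np.deg2rad(5.0), l_b=0.1):
    """Angle-difference and side-ratio-difference screening of step (4)."""
    ang1, rat1 = triangle_features(tri1)
    ang2, rat2 = triangle_features(tri2)
    return bool(np.all(np.abs(ang1 - ang2) <= theta_b)
                and np.all(np.abs(rat1 - rat2) <= l_b))
```

A scaled copy of a triangle passes both tests, while a triangle with clearly different angles fails, matching the intent of the thresholds θ_b = 5° and l_b = 0.1.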
The method is used for screening out a correct point matching set from the initial matching point set of the remote sensing image with large background change.
Compared with the prior art, the invention has the advantages that:
(1) The invention uses the global structure as a comparison element and uses the triangle as a primitive, thereby being easy to calculate and ensuring the stability of the matching result.
(2) The invention ensures the robustness of the result by establishing the histogram and screening according to the statistical result.
(3) The invention uses the global consistency criterion to remove the mismatching with similar coordinates, thereby improving the correct matching rate.
Drawings
Fig. 1 is a global structure formed by correctly matching feature points, wherein (a) is a global structure diagram formed by an image before a disaster occurs and feature points therein, and (b) is a global structure diagram formed by an image after the disaster occurs and corresponding correctly matching feature points therein;
FIG. 2 is an overall flowchart of a point matching method of a remote sensing image according to the present invention;
FIG. 3 is an example of a mismatch point;
FIG. 4 is a graph of the matching results as the threshold k_p varies;
FIG. 5 (a) shows the initial matching result and (b) shows the TTM matching result;
FIG. 6 is a histogram of frequency statistics of feature points (i.e., a frequency histogram or a frequency distribution histogram);
FIG. 7 shows (a) initial matching results, (b) RANSAC matching results, (c) GTM matching results, (d) Bi-SOGC matching results, and (e) TTM matching results;
FIG. 8 is a graph of the results of different spectral image matching accuracies;
FIG. 9 (a) shows the initial matching result, (b) shows the RANSAC matching result, (c) shows the GTM matching result, (d) shows the Bi-SOGC matching result, and (e) shows the TTM matching result;
fig. 10 is a diagram showing the result of image matching accuracy before and after a disaster.
Detailed Description
The invention is further described with reference to the following figures and specific examples.
The method input is two corresponding point sets extracted from a reference image and a sensed image, represented as P = {p_i} and P' = {p_i'} (i = 1, 2, 3, ..., N), where p_i corresponds to p_i'. The initial set of matching points contains many mismatched points, especially for remote sensing images with background variations. The TTM algorithm is based on the similarity of the overall structure formed by the correct points, and is used to remove most of the mismatched points while retaining all of the correct points. Since affine transformations often exist between different remote sensing images taken of the same area, a triangle is chosen as the basic element of the overall structure rather than a quadrilateral or other polygon, because of its better shape retention. Fig. 1 shows two remote sensing images before and after a disaster; it can be seen that the overall structures consisting of correct matching points are similar. As shown in Fig. 2, the method comprises three parts: first, extracting the features of corresponding triangles; second, establishing a frequency histogram; and third, removing residual mismatched points. The invention is described in detail below with reference to the figures:
1. feature extraction for corresponding triangles
Randomly select three matching point pairs in the initial matching point set, described as (p_i, p_j, p_k) and (p_i', p_j', p_k'). The following steps are then performed:
(1) Compose two triangles in the two images from (p_i, p_j, p_k) and (p_i', p_j', p_k'), then calculate the differences between the corresponding angles and the corresponding side-length ratios.
If all three point pairs are correct matches, then the conditions Δθ₁, Δθ₂, Δθ₃ ∈ [0, θ_b] and Δl₁, Δl₂, Δl₃ ∈ [0, l_b] should be satisfied. Three-point-pair combinations that satisfy these conditions are recorded.
(2) As shown in Fig. 3, (p_a, p_a') and (p_b, p_b') are two correct matches, while (p_c, p_c') is a mismatch. This case nevertheless satisfies condition (1), so (p_c, p_c') would be incorrectly recorded as a correct match. Reference points (p_d, p_d') are therefore defined, with coordinates calculated from the three vertices.
In Fig. 3, let a vector rotate clockwise around the points p_d and p_d'. If the three feature point pairs are all correct matches, the order in which the vector passes through the three vertices should be the same in both images. As shown in Fig. 3, the left image passes through the points in the order (p_a, p_c, p_b) while the right image passes through them in the order (p_a', p_b', p_c'), so (p_c, p_c') is not considered a correct match. Three-point-pair combinations satisfying both condition 1 and condition 2 are selected into the subset S; likewise, (p_a, p_a') and (p_b, p_b') will be selected when combined with another correct match. Since the correct matching points form similar structures, every correct match will be chosen into S (as long as there are at least three pairs of correct matches), but some mismatches will also be incorrectly chosen; a histogram of the number of occurrences of each point in the subset S is therefore established in part 2.
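The rotation-order test above can be illustrated with a short sketch. It assumes the reference point p_d is the centroid of the three vertices (the patent's coordinate formula is not reproduced in this text, so the centroid is an assumption) and sweeps a ray clockwise from it:

```python
import numpy as np

def rotation_order(tri, start=0):
    """Order in which a clockwise-sweeping ray from the centroid meets the vertices.

    Returns a tuple of vertex indices, rotated to begin at `start` so that
    orders from two triangles are directly comparable.
    """
    tri = np.asarray(tri, dtype=float)
    ref = tri.mean(axis=0)                 # reference point p_d (centroid assumed)
    ang = np.arctan2(tri[:, 1] - ref[1], tri[:, 0] - ref[0])
    order = list(np.argsort(-ang))         # decreasing angle = clockwise sweep
    i = order.index(start)
    return tuple(order[i:] + order[:i])

def same_order(tri1, tri2):
    """True if the two triangles' vertices appear in the same clockwise order."""
    return rotation_order(tri1) == rotation_order(tri2)
```

A uniformly scaled triangle keeps its rotation order, while a mirrored one (the situation of the mismatch (p_c, p_c') in Fig. 3) reverses it and is rejected.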
2. Creating a frequency histogram
After steps 1 and 2 of part 1 have been performed for every three-point-pair combination, the number of times each point appears in S is counted. In theory, as long as a combination contains even one mismatched point, it will not be selected into S. Therefore, in S, the count of correct matches must be much larger than the count of mismatches. The frequency histogram is built as follows:
hist_P = {k(p_i), i = 1, 2, ..., M}   (5)
k(p_i) represents the number of combinations in the subset that contain the point pair (p_i, p_i'), and M represents the number of distinct points in S. From this, the maximum count is obtained as:
k_max = max(k(p_1), k(p_2), ..., k(p_M))   (6)
To determine a suitable threshold k_p, recall and precision were evaluated on 20 image pairs under different threshold values, as shown in Fig. 4.
The calculation formulas for recall and precision are as follows:
recall = (number of correct matches extracted) / (number of all correct matches)   (7)
precision = (number of correct matches extracted) / (number of all extracted matches)   (8)
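Equations (7) and (8) amount to simple set arithmetic; a direct transcription (the function name and the set representation of match indices are illustrative):

```python
def recall_precision(extracted, ground_truth):
    """Recall and precision of an extracted match set, per equations (7)-(8).

    extracted, ground_truth: collections of point-pair identifiers.
    """
    extracted, ground_truth = set(extracted), set(ground_truth)
    correct = len(extracted & ground_truth)   # correct matches extracted
    recall = correct / len(ground_truth) if ground_truth else 0.0
    precision = correct / len(extracted) if extracted else 0.0
    return recall, precision
```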
If the number of existing correct point pairs is m, the occurrence counts of the correct point pairs in S can be much larger than those of the mismatches. In view of Fig. 4, the threshold k_p is therefore set (equation (9)), and point pairs whose count is less than k_p are removed. Fig. 6 is a statistical graph of the occurrence counts (i.e., frequencies) of all the feature points in Fig. 5; the points whose counts exceed the horizontal line are all correct points, while the incorrect points all fall below it.
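Part 2's counting and thresholding can be sketched as follows, with `is_similar` standing in for the triangle tests of part 1 (the callback name and its index-based signature are hypothetical):

```python
from collections import Counter
from itertools import combinations

def pair_frequencies(n_points, is_similar):
    """Count how often each match index appears in a similar triangle pair.

    is_similar(i, j, k) -> bool decides whether the triangle pair formed by
    matches i, j, k passes the similarity and rotation-order tests.
    """
    hist = Counter()
    for i, j, k in combinations(range(n_points), 3):
        if is_similar(i, j, k):
            hist.update((i, j, k))   # each vertex of a kept triangle scores once
    return hist

def filter_by_frequency(hist, k_p):
    """Keep the match indices whose count reaches the threshold k_p (step 6)."""
    return sorted(idx for idx, cnt in hist.items() if cnt >= k_p)
```

Note that enumerating all C(N, 3) triples is cubic in N; this sketch ignores any pruning the real implementation might use.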
3. Removing residual mismatches
Most of the mismatches are removed by the first two parts, but points such as (p_e, p_e') in Fig. 3 remain. (p_e, p_e') is a mismatch, yet mismatches like this cannot be removed by the previous steps because their coordinates are very close. A spatial consistency measure is therefore used to determine whether such a point pair is mismatched. Parts 1 and 2 yield two matching point sets P_o and P_o' with n points each; the conversion parameters T between the images can then be calculated by the following formula:
T = argmin E(T(θ))   (10)
where E(T(θ)) = sqrt((1/n) Σ_{i=1}^{n} d(i)²) denotes the root mean square error (RMSE), T(θ) represents the conversion parameters of the point sets P_o and P_o' obtained using the least squares method, and d(i) = ||T(θ)(p_i') − p_i||, i = 1, 2, 3, ..., n.
To remove mismatches with similar coordinates and obtain the optimal conversion parameters, a threshold E_t is set. In each iteration, while E(T(θ)) > E_t, the point pair with the largest d(i) value is removed and the point sets are updated. When the condition E(T(θ)) ≤ E_t is satisfied, the iteration ends, and two sets of correct match points are obtained.
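Part 3's iterative refinement can be sketched with a least-squares transform model. The patent does not spell out the transform family in this text, so a 6-parameter affine model is assumed here, and the mapping direction (src → dst) is illustrative:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform M with [src | 1] @ M ≈ dst (assumed model)."""
    A = np.hstack([src, np.ones((len(src), 1))])       # n x 3 design matrix
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)        # 3 x 2 parameter matrix
    return M

def ttm_refine(src, dst, e_t=1.0):
    """Iteratively drop the worst pair until the RMSE E(T) falls below E_t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    while len(src) > 3:                                # affine needs >= 3 pairs
        M = fit_affine(src, dst)
        resid = np.hstack([src, np.ones((len(src), 1))]) @ M - dst
        d = np.linalg.norm(resid, axis=1)              # per-pair error d(i)
        if np.sqrt(np.mean(d**2)) <= e_t:              # E(T(θ)) <= E_t: done
            break
        worst = np.argmax(d)                           # remove largest d(i)
        src, dst = np.delete(src, worst, 0), np.delete(dst, worst, 0)
    return src, dst
```

With four pairs related by a pure translation plus one low-leverage outlier, the first iteration removes the outlier and the second converges exactly.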
Fig. 5 shows two remote sensing images taken before and after the 2010 Haiti earthquake: (a) is the initial matching result, and (b) is the matching result after applying the TTM algorithm.
Applying the above algorithm to remote sensing images, the experimental images are divided into two groups: 1) remote sensing images of different spectra; 2) remote sensing images before and after a disaster. Precision, recall and RMSE indices are used to evaluate the results, which are compared against the classical RANSAC, GTM and Bi-SOGC methods.
Example 1: remote sensing images of different spectra
15 pairs of remote sensing images of different spectra are adopted, with the mismatching proportion set between 5% and 95%. The matching examples and precision results are shown in Fig. 7 and Fig. 8. In Fig. 7 (b), (c), (d) and (e), a solid-line connection indicates a correct match and a dotted-line connection indicates a mismatch. Compared with the other methods, the present method achieves the best results in precision, recall and RMSE.
Example 2: remote sensing image before and after disaster
15 pairs of remote sensing images before and after the disaster are adopted, and the mismatching proportion is set to be 5% -95%. The matching examples and the precision results are shown in fig. 9 and fig. 10. In FIG. 9 (b) (c) (d) (e), the solid line connection indicates a correct match and the dotted line connection indicates a mismatch.
Comparative experiments prove that the method performs best on the precision index, particularly when the proportion of correct matches is small.
Claims (2)
1. A point matching method of remote sensing images is characterized in that: the method comprises the following steps:
step (1): the initial matching point set consists of two corresponding point sets extracted from the reference image and the sensed image, P = {p_i} and P' = {p_i'}, where i = 1, 2, 3, ..., N and N is the number of points in each set; any three selected point pairs form a corresponding triangle pair, the three selected point pairs being denoted (p_i, p_j, p_k) and (p_i', p_j', p_k');
step (2): calculating the corresponding angle differences Δθ₁, Δθ₂, Δθ₃ and the corresponding side-length-ratio differences Δl₁, Δl₂, Δl₃ between the two triangles;
step (3): using the longitudinal and transverse coordinates of the points p_i and p_i' (and likewise of p_j, p_k and p_j', p_k'), setting reference points p_d and p_d' in the two triangles, whose coordinates are calculated from the coordinates of the three vertices;
step (4): setting the triangle similarity thresholds θ_b = 5° and l_b = 0.1, then screening out the triangle pairs that satisfy the consistency of the rotation order of the three vertices about the reference point and the conditions Δθ₁, Δθ₂, Δθ₃ ∈ [0, θ_b] and Δl₁, Δl₂, Δl₃ ∈ [0, l_b];
step (5): counting the number of occurrences of each point pair in the selected triangle pairs and establishing a frequency histogram;
step (6): setting a threshold k_p according to experimental analysis, and removing point pairs whose number of occurrences is less than k_p;
step (7): calculating the model conversion parameters from all remaining point pairs, the point sets being denoted P_o and P_o' and their size n;
T = argmin E(T(θ))   (10)
wherein E(T(θ)) = sqrt((1/n) Σ_{i=1}^{n} d(i)²) denotes the root mean square error, T(θ) denotes the conversion parameters of the point sets P_o and P_o' obtained using the least squares method, and d(i) = ||T(θ)(p_i') − p_i||, i = 1, 2, 3, ..., n, denotes the conversion error of each point pair;
step (8): setting a threshold E_t; while E(T(θ)) > E_t, the point pair with the largest d(i) value is removed and the point sets are updated; when the condition E(T(θ)) ≤ E_t is satisfied, the iteration ends, and two correct matching point sets are obtained.
2. The method for point matching of remote sensing images according to claim 1, characterized in that: the method is used for screening out a correct point matching set from the initial matching point set of the remote sensing image with large background change.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510160933.1A CN104766323B (en) | 2015-04-07 | 2015-04-07 | A kind of Point matching method of remote sensing images |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104766323A CN104766323A (en) | 2015-07-08 |
CN104766323B true CN104766323B (en) | 2018-03-06 |
Family
ID=53648132
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510160933.1A Active CN104766323B (en) | 2015-04-07 | 2015-04-07 | A kind of Point matching method of remote sensing images |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104766323B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108446725A (en) * | 2018-03-12 | 2018-08-24 | 杭州师范大学 | A kind of Image Feature Matching method of feature based triangle |
CN108596962A (en) * | 2018-04-23 | 2018-09-28 | 武汉大学 | A kind of heterologous remote sensing image reliable matching method under iteration triangular network constraint |
CA3154216A1 (en) * | 2019-10-11 | 2021-04-15 | Beyeonics Surgical Ltd. | System and method for improved electronic assisted medical procedures |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102184418A (en) * | 2011-06-10 | 2011-09-14 | 上海应用技术学院 | Triangle-area-representation-histogram-based image registration method |
CN103778626A (en) * | 2013-12-31 | 2014-05-07 | 北京理工大学 | Quick image registration method based on visual remarkable area |
CN103903249A (en) * | 2012-12-27 | 2014-07-02 | 纽海信息技术(上海)有限公司 | Image matching system and method |
- 2015-04-07: application CN201510160933.1A filed in China; resulting patent CN104766323B is active
Non-Patent Citations (3)
Title |
---|
Point-matching method for remote sensing images with background variation; Xiaolong Shi et al.; Journal of Applied Remote Sensing; 31 Jan. 2015; vol. 9, no. 1; see section 2 *
Image feature point matching method under triangle constraints; Wu Fei et al.; Journal of Computer-Aided Design & Computer Graphics; 31 Mar. 2010; vol. 22, no. 3; see pages 503-510 *
An effective corner matching algorithm based on Delaunay triangulation; Li Ganhua et al.; Signal Processing; 31 Oct. 2007; vol. 23, no. 5; see pages 695-698 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
EXSB | Decision made by sipo to initiate substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |