CN107862319A - Neighborhood-voting-based method for eliminating matching errors in heterogeneous high-resolution optical images - Google Patents

Neighborhood-voting-based method for eliminating matching errors in heterogeneous high-resolution optical images

Info

Publication number
CN107862319A
CN107862319A (application CN201711152569.XA)
Authority
CN
China
Prior art keywords
matching
inner product
points
image
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711152569.XA
Other languages
Chinese (zh)
Other versions
CN107862319B (en)
Inventor
李百寿
沈宇臻
谢跃辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guilin University of Technology
Original Assignee
Guilin University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guilin University of Technology filed Critical Guilin University of Technology
Priority to CN201711152569.XA priority Critical patent/CN107862319B/en
Publication of CN107862319A publication Critical patent/CN107862319A/en
Application granted granted Critical
Publication of CN107862319B publication Critical patent/CN107862319B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/259Fusion by voting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Other Investigation Or Analysis Of Materials By Electrical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a neighborhood-voting-based method for eliminating matching errors in heterogeneous high-resolution optical images. Heterogeneous high-resolution images serve as the study object for matching gross-error elimination, and the distance moduli and principal-direction angles between coarse match points form the basis of the neighborhood vote. The distance inner product value and direction inner product value of each coarse matching point pair are computed, matching pairs with large errors are rejected by a neighborhood voting threshold, accurate matching pairs are retained, and correct matching pairs are reselected, thereby achieving gross-error elimination for high-resolution image matching. The method fully accounts for the influence that the other matching points around a match point exert on it, using the relations to all other matching points in its neighborhood as a second-level constraint on the point. Matching precision is improved, and the problems of low accuracy and poor robustness in existing heterogeneous high-resolution optical image matching methods are resolved.

Description

Neighborhood-voting-based matching error elimination method for heterogeneous high-resolution optical images
Technical Field
The invention relates to the field of high-resolution remote sensing image processing, in particular to a heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting.
Background
High-resolution remote sensing image matching is an important foundation of high-resolution image processing, and matching quality has a key influence on subsequent processing and applications such as image fusion, mosaicking, segmentation, object identification, and change detection.
In recent years, the comprehensive use of heterogeneous images (remote sensing images from different sources) has become an effective route to deeper application of remote sensing data, and heterogeneous image matching has become one of the key links and technologies in high-resolution remote sensing image processing. With the diversification of sensors and platforms, high-precision image matching is increasingly difficult to achieve. Heterogeneous optical image matching algorithms lose accuracy when the images differ greatly in resolution, time phase, and spatial rotation; large numbers of erroneous match points are produced, and these mismatches seriously affect positioning, mosaicking, and image feature library construction for high-resolution imagery from UAV, airborne, and spaceborne platforms. Mismatch elimination during image matching is therefore a non-negligible problem in photogrammetry and high-resolution remote sensing, with broad market demand in heterogeneous image remote sensing and computer vision.
At present, statistical models, spatial information, and function fitting are the commonly adopted approaches. The typical statistical-model method is the random sample consensus (RANSAC) algorithm, a robust parameter estimation method; however, it iterates repeatedly with continual testing and is therefore time-consuming. It can remove all mismatched points when they are few, but cannot remove them completely once their number grows to approach or exceed the number of correct points. Elimination methods based on spatial distribution descriptors can reject mismatches with simple characteristics, but the algorithms are complex to implement and are not rotation-invariant. Function-fitting methods assume, from prior knowledge of the images to be matched, that all correct match points satisfy a certain function model; they are computationally simple and fast, but when many outliers with large errors are present, the fitted model coefficients deviate considerably from the true ones and matching accuracy is low.
The key to solving these problems is to establish, before adjustment processing, the conditions and similarity criteria for screening correct point pairs, fully exploiting the clustering behavior of correct pairs. According to the correlation of image features, any pixel on an image is influenced by the other pixels in its neighborhood, and the influence depends on the distance from the point. Therefore, all other matching points in the image neighborhood impose a second-level constraint on a point: they attach a unique label to every coarse match point on the image, and the labels are compared against set thresholds to decide whether a match holds, yielding simple and rapid gross-error elimination.
Disclosure of Invention
The invention aims to provide a neighborhood-voting-based method for eliminating matching errors in heterogeneous high-resolution optical images. By fully exploiting the neighborhood information of matching points, the method decides, according to a set distance threshold and direction threshold, whether each coarse matching point pair produced by a matching operator is rejected or retained. This addresses the poor matching precision and low robustness of existing schemes, in which feature points are described and matched solely by the Euclidean distance between feature-point descriptors.
The method comprises the following specific steps:
(1) Back up two heterogeneous high-resolution optical images that cover the same-name (corresponding) area at different resolutions, and crop out different types of ground-object targets within the same-name area; the two cropped ground-object images must cover the same-name area, and the most representative ground objects should be selected.
(2) Apply the SIFT matching operator, or another feature detection and description operator with scale and rotation invariance, to detect and match feature points in the data, obtaining the initial coarse matching point pairs. The image-plane coordinates (x, y) and main direction θ of each matching point are thereby available; on the reference image these are denoted (x, y) and θ, and on the image to be matched (x′, y′) and θ′, a convention used without further comment below.
(3) For any matching point on the reference image and on the image to be matched, compute the distance modulus d_{n,m} and the main-direction angle difference θ_{n,m} to every other matching point in its neighborhood, obtaining the matrices D, D′, Θ, and Θ′.
(4) Normalize each of the four matrices to [0,1] row vector by row vector, obtaining four normalized matrices.
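Steps (3) and (4) can be sketched as follows (an illustrative NumPy implementation; function names are hypothetical, and the row-wise [0,1] normalization mirrors the operation the description attributes to Matlab's mapminmax):

```python
import numpy as np

def neighborhood_matrices(xy, theta):
    """Distance moduli d_{n,m} and main-direction angle differences
    theta_{n,m} between every pair of coarse match points on one image.
    xy: (n, 2) image-plane coordinates; theta: (n,) main directions."""
    diff = xy[:, None, :] - xy[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=-1))         # matrix D (or D')
    Th = np.abs(theta[:, None] - theta[None, :])  # matrix Theta (or Theta')
    return D, Th

def row_normalize(M):
    """[0,1] normalization of each row vector of a matrix."""
    lo = M.min(axis=1, keepdims=True)
    rng = M.max(axis=1, keepdims=True) - lo
    rng[rng == 0] = 1.0                           # guard constant rows
    return (M - lo) / rng
```

The same two calls are applied to the reference image and the image to be matched, giving the four normalized matrices used in the following steps.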
(5) For each matching point pair, compute the distance inner product value and the direction inner product value of its two matching points, obtaining an n × 2 matrix, where n is the number of matching point pairs; one column holds the distance inner product values and the other the direction inner product values.
(6) Choose a distance threshold T_d and a direction threshold T_θ. The thresholds are chosen experimentally: several groups of candidate distance and direction thresholds are tried, and the optimal pair is judged from multiple experiments under the actual conditions, so that good matching precision is obtained while enough matching point pairs are preserved; the precision must not be pushed so far that only a few matching pairs remain. Once the two thresholds are determined, traverse the distance and direction inner product values of every matching pair, comparing the distance inner product value with the distance threshold and the direction inner product value with the direction threshold. If both values are within the set thresholds, the pair is considered accurate; otherwise the pair is considered to have a large error and is rejected.
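Steps (5) and (6) can be sketched as follows (illustrative NumPy code under the assumption that row i of the normalized reference-image matrices corresponds to row i of the normalized to-be-matched matrices; function names are hypothetical):

```python
import numpy as np

def inner_product_matrix(Dn, Dpn, Thn, Thpn):
    """n x 2 matrix of [distance inner product, direction inner product].
    Dn, Thn: row-normalized matrices of the reference image;
    Dpn, Thpn: row-normalized matrices of the image to be matched."""
    dot1 = (Dn * Dpn).sum(axis=1)    # distance inner product values
    dot2 = (Thn * Thpn).sum(axis=1)  # direction inner product values
    return np.column_stack([dot1, dot2])

def neighborhood_vote(P, T_d, T_theta):
    """Keep a pair only when both inner products are within thresholds."""
    return (P[:, 0] <= T_d) & (P[:, 1] <= T_theta)
```

The boolean mask returned by `neighborhood_vote` marks the matching pairs retained after the vote; rejected pairs are candidates for the re-selection of step (7).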
(7) After rejecting the pairs that fail the set thresholds, take each rejected coarse matching point and form inner products with the other coarse matching points, using the four normalized matrices of step (4), to obtain new distance and direction inner product values. Compare these with the set thresholds; if both fall within the corresponding thresholds, save the points as a new matching pair, otherwise discard them.
(8) Combine the matching pairs retained in step (6) with the pairs newly saved in step (7) to form the matching point pairs on the two images; these are the fine matching pairs after neighborhood voting has eliminated the gross errors.
(9) Establish a coordinate conversion model between the images by an empirical judgment method, and determine the mapping relation parameters between the two images through the model.
Let the reference image have a matching point A(x_A, y_A) with a corresponding matching point a(x_a, y_a) on the image to be matched. If A and a are regarded as a correct matching pair, the mapping relation A(x_A, y_A) = R·a(x_a, y_a) holds. After the empirical coordinate conversion model is determined, substitute the to-be-matched points of the coarse matching pairs into the model to obtain new coordinate values, and substitute these together with the corresponding reference-image points into the medium (root-mean-square) error formula:

m = sqrt( Σ_{i=1}^{n} [ (x_i − x_i′)² + (y_i − y_i′)² ] / n )

where n is the number of coarse matching point pairs, (x_i, y_i) are the coordinates of matching point i on the reference image, and (x_i′, y_i′) are the image-plane coordinates of its counterpart mapped from the image to be matched onto the reference image; the medium error is in pixels. This gives the matching precision of the coarse matching point pairs.
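A minimal sketch of the medium-error computation (the standard root-mean-square point error; the coordinate model R is assumed to have been applied already, so the function receives already-mapped points):

```python
import math

def medium_error(ref_pts, mapped_pts):
    """Root-mean-square (medium) error, in pixels, between the
    reference-image match points and the to-be-matched points
    mapped into the reference frame by the coordinate model."""
    n = len(ref_pts)
    s = sum((x - xm) ** 2 + (y - ym) ** 2
            for (x, y), (xm, ym) in zip(ref_pts, mapped_pts))
    return math.sqrt(s / n)
```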
Similarly, the operation is repeated on the accurate matching pairs obtained after neighborhood-voting gross-error elimination, giving the precision of the match points after elimination.
(10) The entire neighborhood-voting gross-error elimination process is carried out on the image pairs of the different ground objects in the data, and a group of fine matching pairs is obtained for each image pair.
(11) Precision analysis is performed on the accurate matching pairs of the data, the degree of improvement of each image pair's matching precision is judged, and, taking factors such as accidental errors into account, the robustness of the neighborhood-voting gross-error elimination method is analyzed.
The invention has the beneficial effects that:
the invention fully considers the clustering property of the correct matching points, namely in the image matching, the probability of the correct matching points existing around the correct matching points is far greater than the probability of the correct matching points existing around the wrong matching points, and according to the correlation of the matching information in the neighborhood of the image, any pixel point on the image can be influenced by other pixel points in the neighborhood, and the influence is determined according to the distance from the point. Therefore, the point is subjected to two-degree constraint by using other matching points in the neighborhood of the image, a 'unique' label is attached to all rough matching points on the image by the 'other' matching points, then the 'labels' are used for calculating whether the matching points are matched with a set threshold value, matlab programming is used for realizing the heterogeneous high-resolution optical image matching error eliminating method based on neighborhood voting, and the implementation result proves that the matching precision can be improved.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention.
Fig. 2 is a schematic diagram of coarse matching points in the embodiment of the present invention.
FIG. 3 is a schematic diagram of the distance moduli and direction angles of the coarse match points in the embodiment of the present invention.
FIG. 4 shows the cropped building area in an embodiment of the present invention; a: QuickBird building, b: GF-2 building.
FIG. 5 shows the matching results for the cropped building area in an embodiment of the present invention; a: SIFT building image-pair matching result, b: building image-pair matching result of the method of the invention.
FIG. 6 shows the cropped water-body area in an embodiment of the present invention; a: QuickBird water body, b: GF-2 water body.
FIG. 7 shows the matching results for the cropped water-body area; a: SIFT water-body image-pair matching result, b: water-body image-pair matching result of the method of the invention.
FIG. 8 shows the cropped road area in an embodiment of the present invention; a: QuickBird road, b: GF-2 road.
FIG. 9 shows the matching results for the cropped road area in an embodiment of the present invention; a: SIFT road image-pair matching result, b: road image-pair matching result of the method of the invention.
Detailed Description
To help those of ordinary skill in the art understand and implement the present invention, the invention is described in further detail below with reference to the accompanying drawings and examples. It should be understood that the embodiments described here merely illustrate and explain the invention and do not restrict it.
Example (b):
fig. 1 is a technical flowchart of a method for rejecting an error in a heterogeneous high-resolution optical image based on neighborhood voting, which includes the following steps:
(1) Two heterogeneous high-resolution images of the same-name area are used as input data; the two images differ in imaging time, resolution, satellite platform, and so on. One image, A, is taken as the reference image and is regarded as essentially processed; the other, B, is the image to be matched, which is matched against image A.
(2) Feature points of the two images are extracted by feature-point extraction operators. Because the images differ in resolution, imaging mode, rotation, and other respects, scale-invariant feature transform can be adopted for better results: the extracted feature points are robust to rotation and affine deformation, are well suited to matching heterogeneous image data, possess unique feature descriptors, and carry a main gradient direction.
(3) Feature matching is performed on the extracted feature points: the feature points of the image to be matched are traversed, and for each one the reference-image feature point with the most similar features is sought; if the similarity of the two points' features satisfies the threshold set by the matching operator, they are considered a matching pair. The pairs obtained this way are coarse matching pairs without gross-error removal, and traversal of the algorithm yields a group of coarse matching pairs between the two images. As shown in fig. 2, A and a, B and b, C and c are corresponding coarse matching pairs, θ is the main-direction angle of a coarse match point, and (x, y) are its image-plane coordinates.
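The coarse matching of step (3) can be sketched as follows (an illustrative nearest-neighbour search on descriptor Euclidean distance; real SIFT matchers typically also apply a distance-ratio test, and the function name and threshold are hypothetical):

```python
import numpy as np

def coarse_match(desc_ref, desc_tgt, t):
    """For each descriptor of the image to be matched, find the
    nearest reference-image descriptor; accept the pair when the
    Euclidean distance satisfies the operator's threshold t.
    Returns (target index, reference index) pairs."""
    pairs = []
    for i, d in enumerate(desc_tgt):
        dist = np.linalg.norm(desc_ref - d, axis=1)
        j = int(dist.argmin())
        if dist[j] <= t:
            pairs.append((i, j))
    return pairs
```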
(4) The distance moduli and main-direction angle differences between each coarse match point and its neighborhood match points are computed, on the reference image and on the image to be matched. Taking the reference image as an example, suppose it carries the coarse match points A, B, and C, with image-plane coordinates (x_A, y_A), (x_B, y_B), (x_C, y_C) and main directions θ_A, θ_B, θ_C. The distance moduli and main-direction angle differences of the neighborhood match points of A, B, and C are then computed in turn; for point A they are d_AB, d_AC and Δθ_AB, Δθ_AC, as shown in fig. 3, where:

d_AB = sqrt((x_B − x_A)² + (y_B − y_A)²), d_AC = sqrt((x_C − x_A)² + (y_C − y_A)²)

Δθ_AB = |θ_B − θ_A|, Δθ_AC = |θ_C − θ_A|
and calculating distance module values and main direction included angles of all rough matching points and other matching points on the two images to obtain four matrixes, wherein D is the distance module value between each rough matching point on the reference image, and D 'is the distance module value theta and theta' between each rough matching point on the image to be matched, and the corresponding main direction included angles are obtained.
(5) The four matrices obtained above are normalized to [0,1], each matrix row vector by row vector, here using Matlab's built-in function mapminmax. This step yields four row-normalized matrices.
(6) Using the four row-normalized matrices of step (5), if reference-image match point 1 and to-be-matched-image point 2 are a pair of coarse matching points, their distance inner product value dot1 and direction inner product value dot2 are computed: dot1 is the inner product of point 1's row of the normalized D with point 2's row of the normalized D′, and dot2 is the inner product of the corresponding rows of the normalized Θ and Θ′.
as shown above, each pair of coarse matching point pairs is traversed to calculate the distance inner product value and the direction inner product value, and an n × 2 matrix is obtained, where n is the number of pairs of coarse matching points, one column is the corresponding distance inner product value dot1, and the other column is the corresponding direction inner product value dot2.
(7) According to the actual high-resolution image conditions, the number of coarse matching pairs extracted by the matching operator, and other factors, the distance and direction thresholds are determined through multiple groups of experiments. In principle the matching precision should be raised as far as possible while matching pairs are preserved: the number of pairs must not be reduced drastically for the sake of precision, nor the precision sacrificed to keep more pairs. The optimal distance threshold T_d and direction threshold T_θ are therefore obtained from multiple groups of experiments guided by the actual requirements.
(8) After the two thresholds are obtained, each pair of matching points in the n × 2 matrix is compared with them; when both the distance inner product value and the direction inner product value of a matching pair are less than or equal to the thresholds, the pair is considered an accurate matching pair, and otherwise it is rejected. As follows:
when dot1 is less than or equal to T d ,dot2≤T θ The matching point pair is retained.
When dot1<T d ,dot2>T θ 、dot1>T d ,dot2<T θ 、dot1>T d ,dot2>T θ And rejecting the matching point pair.
(9) Traversing the removed rough matching points and the unmatched characteristic points on the reference image and the image to be matched, performing inner product operation in the same way, searching whether matched point pairs meeting two thresholds exist, and if so, storing the matched point pairs as accurate matched point pairs.
(10) Through the above steps, accurate matching pairs are obtained from the coarse matching pairs, and further accurate pairs are reselected from the wrongly matched points and unmatched feature points; gross-error elimination of the high-resolution image match points is thereby completed, yielding the accurate matching pairs screened by the second-level constraint of neighborhood voting.
(11) And after the steps are completed, analyzing whether the obtained accurate matching point logarithm is proper or not, and calculating the image matching precision before and after difference elimination.
Taking several different image pairs with different ground objects in the same name area as an example, specific embodiments (1-3) implement high-resolution optical image matching error elimination by using the method.
Specific example 1:
A QuickBird image of an area in Guangdong, with 0.61 m spatial resolution and imaged in 2016, and a GF-2 image with 0.8 m spatial resolution and imaged in 2015, are selected; the QuickBird image serves as the reference image and the GF-2 image as the image to be matched. The two heterogeneous high-resolution optical images covering the same-name area are cropped, and different types of ground-object targets are cut out within the same-name area, for example a building target, as shown in fig. 4.
Feature points of the reference image and the image to be matched are then extracted with a feature-point detection operator, the two images are matched by the matching operator according to the corresponding matching principle, and the coarse matching pairs are selected; each coarse match point has unique image-plane coordinates and a unique main direction. The direction and distance differences between each coarse match point and the other coarse match points on the same image are computed, giving the distance modulus and main-direction angle difference of any two coarse match points on one image. These values are then [0,1]-normalized by row vector, and the distance inner product value and direction inner product value of the two points in each coarse matching pair are computed. Finally, multiple groups of gross-error-elimination experiments show that with T_d = T_θ = 0.5 many mismatched pairs remain after elimination, while with T_d = T_θ = 0.3 very few matching pairs survive; in this embodiment T_d = T_θ = 0.4 is therefore taken. The distance and direction inner product values are compared with the corresponding thresholds: if both are within the set thresholds the matching pair is saved, otherwise it is rejected, and coarse match points satisfying the thresholds are reselected and recombined into matching pairs. The matching result of the method is shown in fig. 5(b), and fig. 5(a) is the building matching result of the SIFT method.
In embodiments 2 and 3, the same matching precision analysis is carried out for the water body, road, and other targets in the images, and the precision and performance of the method are evaluated. The cropped areas and matching results are shown in figs. 6, 7, 8, and 9, respectively.
The results are analyzed in Table 1. The conventional SIFT method, using scale-invariant feature transform, extracts 338 and 326 feature points from the building QuickBird and GF-2 images respectively, 100 and 485 from the water-body QuickBird and GF-2 images, and 226 and 662 from the road QuickBird and GF-2 images. Matching the feature points by Euclidean distance gives 85, 14, and 5 matching pairs for the building, water body, and road respectively. Precision analysis after matching shows medium errors of 20.0340, 61.8433, and 67.6633 pixels for the building, water body, and road; the matching precision is low and the mismatch rate is high.
The matching results of the present method show 42, 11, and 5 matching pairs for the building, water body, and road after gross-error elimination and re-matching, with corresponding medium errors of 4.6417, 39.4758, and 10.2053 pixels, and elimination times of 0.012 s, 0.0086 s, and 0.0014 s respectively. The matching precision is improved while the matching time remains within an acceptable range.
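The relative improvements implied by these medium errors can be checked in a few lines (values taken from the two paragraphs above):

```python
def improvement(before, after):
    """Relative reduction of the medium error, as a whole percentage."""
    return round(100 * (before - after) / before)

building = improvement(20.0340, 4.6417)         # 77
water = improvement(61.8433, 39.4758)           # 36
road = improvement(67.6633, 10.2053)            # 85
average = round((building + water + road) / 3)  # 66
```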
Table 1. Matching precision of different ground-object images: SIFT versus the method of the invention
Compared with the traditional scale-invariant feature transform (SIFT) algorithm, the embodiments show that the building matching precision is improved by 77%, the road by 85%, and the water body by 36%, for an average improvement in matching precision of 66%. The new method applies a finer gross-error elimination to the SIFT matching result, improving the matching precision of the images while preserving the scale invariance and other characteristics of the original algorithm.
Therefore, the invention fully considers that any pixel in an image is strongly influenced by its surrounding pixels, the influence decreasing with the radius from the pixel at the center. On the basis of the influence that surrounding match points exert on each coarse match point, coarse matching pairs that violate the thresholds are rejected according to the threshold decision rule, and the rejected points are then reconsidered against the set thresholds. The method addresses the insufficient precision and low robustness of heterogeneous high-resolution optical images in image matching, and compared with other gross-error elimination methods it offers low complexity and high precision.
While the invention has been described in connection with specific embodiments and with specific principles employed, it is to be understood that the invention is capable of modification in various other respects, all without departing from the spirit and scope of the present invention as set forth in the specification and the accompanying drawings.

Claims (8)

1. A heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting, characterized by comprising the following steps:
step 1: backing up two heterogeneous high-resolution optical images that cover the same-name area at different resolutions, cropping one of the heterogeneous high-resolution optical images, and cropping out different types of ground-object targets within the same-name area;
step 2: performing feature point detection and feature point matching on the data using a SIFT matching operator, or another feature detection and description operator with scale invariance and rotation invariance, to obtain initial coarse matching point pairs;
step 3: respectively calculating, for any matching point on the reference image and on the image to be matched, the distance moduli d_nm and principal-direction included angles θ_nm to the other matching points in its neighborhood, obtaining matrices D, D′, θ and θ′; normalizing each of the four obtained matrices to [0,1] by row vector to obtain four normalized matrices;
step 4: respectively calculating the distance inner product value and the direction inner product value of the two matching points in each matching point pair, obtaining an n × 2 matrix, where n is the number of matching point pairs; one of the two columns holds the distance inner product values and the other the direction inner product values;
and 5: obtaining an optimal distance threshold T d Sum direction threshold T θ
step 6: performing voting-based error elimination.
2. The heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting according to claim 1, wherein:
in step 3, the distance moduli and principal-direction included angles of the neighborhood matching points of the coarse matching points are calculated on the reference image and on the image to be matched; taking the reference image as an example, assume the reference image has coarse matching points A, B and C, whose image-plane coordinates and principal directions are (x_A, y_A), (x_B, y_B), (x_C, y_C) and θ_A, θ_B, θ_C; the distance moduli and principal-direction included angles of the neighborhood matching points are obtained for each of the points A, B and C; taking point A as an example, its neighborhood distance moduli and principal-direction included angles are d_AB, d_AC and Δθ_AB, Δθ_AC, where:
d_AB = √((x_B − x_A)² + (y_B − y_A)²), d_AC = √((x_C − x_A)² + (y_C − y_A)²)
Δθ_AB = |θ_B − θ_A|, Δθ_AC = |θ_C − θ_A|
four matrices are thus obtained, where D holds the distance moduli between the coarse matching points on the reference image, D′ holds the distance moduli between the coarse matching points on the image to be matched, and θ and θ′ hold the corresponding principal-direction included angles;
the four matrices obtained above are subjected to [0,1] normalization, each matrix normalized by row vector, using the built-in Matlab function mapminmax; this step yields the four row-vector-normalized matrices.
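The neighborhood-matrix construction and row-wise [0,1] normalization of step 3 can be sketched as follows. This is an illustrative Python/NumPy sketch standing in for the Matlab mapminmax step; the function names are my own, not the patent's:

```python
import numpy as np

def neighborhood_matrices(xy, theta):
    """Pairwise distance moduli d_nm and principal-direction included
    angles between the coarse matching points of one image.
    xy: (n, 2) image-plane coordinates; theta: (n,) principal directions."""
    diff = xy[:, None, :] - xy[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=2))         # distance modulus matrix
    T = np.abs(theta[:, None] - theta[None, :])  # included-angle matrix
    return D, T

def rowwise_01(M):
    """[0,1] normalization of each row vector, analogous to applying
    Matlab's mapminmax row by row with output range [0,1]."""
    lo = M.min(axis=1, keepdims=True)
    rng = M.max(axis=1, keepdims=True) - lo
    rng[rng == 0] = 1.0  # guard rows that are constant
    return (M - lo) / rng
```

Running neighborhood_matrices on the reference image gives D and θ, and on the image to be matched D′ and θ′; rowwise_01 then produces the four normalized matrices.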
3. The heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting according to claim 1, wherein:
in step 4, the distance inner product value and the direction inner product value of the two matching points in each matching point pair are calculated by the following formulas:
in the formulas, reference-image matching point 1 and to-be-matched-image matching point 2 form one pair of coarse matching points, dot1 is the distance inner product value of the two points, dot2 is the direction inner product value of the two points, and there are n pairs of coarse matching points.
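The inner-product formulas themselves are not reproduced in this text. A plausible reading, given the four row-normalized matrices of step 3, is a row-wise inner product between corresponding rows of D and D′ (distance) and of θ and θ′ (direction); the sketch below encodes that assumption:

```python
import numpy as np

def pair_inner_products(Dn, Dpn, Tn, Tpn):
    """For each coarse matching pair i: dot1 = <row i of normalized D,
    row i of normalized D'> and dot2 = <row i of normalized theta,
    row i of normalized theta'>. Returns the n x 2 matrix [dot1, dot2]
    of step 4. The exact formula is not given in the text, so this
    row-wise inner product is an assumption."""
    dot1 = np.einsum('ij,ij->i', Dn, Dpn)  # distance inner product values
    dot2 = np.einsum('ij,ij->i', Tn, Tpn)  # direction inner product values
    return np.column_stack([dot1, dot2])
```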
4. The heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting according to claim 1, wherein:
the distance threshold T_d and direction threshold T_θ in step 5 are selected experimentally: several groups of candidate distance thresholds and direction thresholds are tested, and the optimal thresholds are judged from the multiple experiments according to the actual experimental conditions, such that good matching precision and an adequate number of matching point pairs are obtained, while avoiding the situation in which only a few matching point pairs are retained for the sake of raising the matching precision.
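The experimental threshold search can be sketched as a simple grid scan. The specific selection rule below (tightest thresholds that still retain a minimum number of pairs) is an illustrative assumption, since the patent leaves the choice to the experimenter:

```python
def pick_thresholds(dots, cand_d, cand_t, min_pairs):
    """Scan candidate (T_d, T_theta) combinations from tightest to
    loosest and return the first combination that still retains at
    least min_pairs matching pairs, together with the retained count.
    dots: list of (dot1, dot2) values per coarse matching pair."""
    for td in sorted(cand_d):
        for tt in sorted(cand_t):
            kept = sum(1 for d1, d2 in dots if d1 <= td and d2 <= tt)
            if kept >= min_pairs:
                return td, tt, kept
    return None  # no candidate combination keeps enough pairs
```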
5. The heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting according to claim 1, wherein:
in step 6, the voting-based error elimination traverses the distance inner product value and the direction inner product value of every matching point pair, comparing the distance inner product value against the distance threshold and the direction inner product value against the direction threshold; if both values lie within the set distance threshold and direction threshold, the matching point pair is considered to have good precision, otherwise the matching point pair is considered to have a large error and is rejected; the judgment criteria adopted are as follows:
(1) when dot1 ≤ T_d and dot2 ≤ T_θ, the matching point pair is retained;
(2) when dot1 ≤ T_d and dot2 > T_θ, or dot1 > T_d and dot2 ≤ T_θ, or dot1 > T_d and dot2 > T_θ, the matching point pair is rejected.
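The judgment criteria above reduce to a single conjunctive test per pair, as in this sketch:

```python
def vote(dots, T_d, T_theta):
    """Apply the voting criterion: a matching pair is retained only
    when both its distance inner product value dot1 <= T_d and its
    direction inner product value dot2 <= T_theta; every other case
    is rejected. Returns the indices of retained and rejected pairs."""
    keep, reject = [], []
    for i, (d1, d2) in enumerate(dots):
        (keep if d1 <= T_d and d2 <= T_theta else reject).append(i)
    return keep, reject
```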
6. The heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting according to claim 1 or 5, wherein:
in step 6, for each coarse matching point pair rejected according to claim 5, the inner products of the coarse matching point with the other coarse matching points are computed from the four normalized matrices of step 3, giving the corresponding distance inner product value and direction inner product value; these are again compared against the set thresholds, and if both lie within the ranges of the corresponding set thresholds, the pair is saved as a new matching point pair, otherwise it is rejected.
7. The heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting according to claim 1, 5 or 6, wherein:
in step 6, the matching point pairs retained under claim 5 and the matching point pairs newly saved under claim 6 are merged into the matching point pairs on the two images; these are the fine matching point pairs obtained after neighborhood voting.
8. The heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting according to claim 1, wherein:
in step 6, an empirical judgment method is adopted to establish a coordinate transformation model between the images, through which the mapping-relation parameters between the two images are determined; let the reference image have a matching point A(x_A, y_A) with a corresponding matching point a(x_a, y_a) on the image to be matched; if the two points A and a are considered correct matching points, the mapping relation A(x_A, y_A) = R·a(x_a, y_a) holds; after the empirical coordinate transformation model is determined, the reference-image points of the coarse matching point pairs are substituted into the model to obtain the corresponding new coordinate values, and the new coordinate values, together with the coarse matching points on the image to be matched from the original coarse matching point pairs, are substituted into the medium (root-mean-square) error formula:
m = √( (1/n) Σ_{i=1}^{n} [ (x_i − x_i′)² + (y_i − y_i′)² ] )
where n is the number of coarse matching point pairs, and (x_i, y_i) and (x_i′, y_i′) are respectively the image-plane coordinates of matching point i on the reference image and of its corresponding matching point on the image to be matched after mapping into the reference image; the error unit is the pixel; this yields the matching precision of the fine matching point pairs.
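Taking the "medium error" of this claim to be the usual root-mean-square point error of photogrammetry (an assumption consistent with the per-pixel error values reported in the embodiment), the accuracy computation can be sketched as:

```python
import math

def medium_error(ref_pts, mapped_pts):
    """Root-mean-square (medium) error in pixels between matching
    points on the reference image and the corresponding points of the
    image to be matched after mapping into the reference frame."""
    n = len(ref_pts)
    s = sum((x - xm) ** 2 + (y - ym) ** 2
            for (x, y), (xm, ym) in zip(ref_pts, mapped_pts))
    return math.sqrt(s / n)
```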
CN201711152569.XA 2017-11-19 2017-11-19 Heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting Active CN107862319B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711152569.XA CN107862319B (en) 2017-11-19 2017-11-19 Heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting


Publications (2)

Publication Number Publication Date
CN107862319A true CN107862319A (en) 2018-03-30
CN107862319B CN107862319B (en) 2021-11-16

Family

ID=61703148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711152569.XA Active CN107862319B (en) 2017-11-19 2017-11-19 Heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting

Country Status (1)

Country Link
CN (1) CN107862319B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109117854A (en) * 2018-07-25 2019-01-01 北京达佳互联信息技术有限公司 Key point matching process, device, electronic equipment and storage medium
CN109948624A (en) * 2019-02-18 2019-06-28 北京旷视科技有限公司 Method, apparatus, electronic equipment and the computer storage medium of feature extraction
CN110634149A (en) * 2018-06-22 2019-12-31 湖南大学 Non-rigid target characteristic point matching method for optical motion capture system
CN110992263A (en) * 2019-11-27 2020-04-10 国网山东省电力公司电力科学研究院 Image splicing method and system
CN113496183A (en) * 2020-04-08 2021-10-12 上海耕岩智能科技有限公司 Fingerprint matching method and device, electronic equipment and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102693542A (en) * 2012-05-18 2012-09-26 中国人民解放军信息工程大学 Image characteristic matching method
CN103389074A (en) * 2013-07-18 2013-11-13 河南科技大学 Multi-scale scene matching area selecting method
CN104200461A (en) * 2014-08-04 2014-12-10 西安电子科技大学 Mutual information image selected block and sift (scale-invariant feature transform) characteristic based remote sensing image registration method
US20160168638A1 (en) * 2005-04-15 2016-06-16 Becton, Dickinson And Company Diagnosis of sepsis

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160168638A1 (en) * 2005-04-15 2016-06-16 Becton, Dickinson And Company Diagnosis of sepsis
CN102693542A (en) * 2012-05-18 2012-09-26 中国人民解放军信息工程大学 Image characteristic matching method
CN103389074A (en) * 2013-07-18 2013-11-13 河南科技大学 Multi-scale scene matching area selecting method
CN104200461A (en) * 2014-08-04 2014-12-10 西安电子科技大学 Mutual information image selected block and sift (scale-invariant feature transform) characteristic based remote sensing image registration method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Yang Yongtao, Huang Guoyan, Wu Peiliang: "An improved ICP registration method for 3D color point cloud scenes using key point extraction", Journal of Chinese Computer Systems *
Wang Zhihua: "Research on laser pointer interaction technology", China Masters' Theses Full-text Database, Information Science and Technology *
Shi Qiao et al.: "Influence of long-term standing on the closing-time characteristics of high-voltage circuit breakers", High Voltage Apparatus *
Hu Xiaoqing et al.: "Low-altitude remote sensing image matching based on neighborhood voting and Harris-SIFT features", Engineering of Surveying and Mapping *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110634149A (en) * 2018-06-22 2019-12-31 湖南大学 Non-rigid target characteristic point matching method for optical motion capture system
CN110634149B (en) * 2018-06-22 2022-03-18 湖南大学 Non-rigid target characteristic point matching method for optical motion capture system
CN109117854A (en) * 2018-07-25 2019-01-01 北京达佳互联信息技术有限公司 Key point matching process, device, electronic equipment and storage medium
CN109117854B (en) * 2018-07-25 2021-01-29 北京达佳互联信息技术有限公司 Key point matching method and device, electronic equipment and storage medium
CN109948624A (en) * 2019-02-18 2019-06-28 北京旷视科技有限公司 Method, apparatus, electronic equipment and the computer storage medium of feature extraction
CN110992263A (en) * 2019-11-27 2020-04-10 国网山东省电力公司电力科学研究院 Image splicing method and system
CN110992263B (en) * 2019-11-27 2023-07-11 国网智能科技股份有限公司 Image stitching method and system
CN113496183A (en) * 2020-04-08 2021-10-12 上海耕岩智能科技有限公司 Fingerprint matching method and device, electronic equipment and readable storage medium
CN113496183B (en) * 2020-04-08 2023-02-03 上海耕岩智能科技有限公司 Fingerprint matching method and device, electronic equipment and readable storage medium
US11710342B2 (en) 2020-04-08 2023-07-25 Shanghai Harvest Intelligence Technology Co., Ltd. Fingerprint matching method and apparatus, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN107862319B (en) 2021-11-16

Similar Documents

Publication Publication Date Title
CN107862319B (en) Heterogeneous high-resolution optical image matching error elimination method based on neighborhood voting
CN111028277B (en) SAR and optical remote sensing image registration method based on pseudo-twin convolution neural network
CN110097093B (en) Method for accurately matching heterogeneous images
CN109064502B (en) Multi-source image registration method based on combination of deep learning and artificial design features
CN104574421B (en) Large-breadth small-overlapping-area high-precision multispectral image registration method and device
WO2019042232A1 (en) Fast and robust multimodal remote sensing image matching method and system
CN109409208A (en) A kind of vehicle characteristics extraction and matching process based on video
CN110569861B (en) Image matching positioning method based on point feature and contour feature fusion
CN104751465A (en) ORB (oriented brief) image feature registration method based on LK (Lucas-Kanade) optical flow constraint
CN109118528A (en) Singular value decomposition image matching algorithm based on area dividing
CN107610166B (en) Planar map image registration method based on linear feature region segmentation
CN110866430A (en) License plate recognition method and device
CN107862708A (en) A kind of SAR and visible light image registration method
CN107240130B (en) Remote sensing image registration method, device and system
CN105787943B (en) SAR image registration method based on multi-scale image block feature and rarefaction representation
CN113095385B (en) Multimode image matching method based on global and local feature description
CN105869168A (en) Multi-source remote sensing image shape registering method based on polynomial fitting
CN110675442A (en) Local stereo matching method and system combined with target identification technology
CN106886988A (en) A kind of linear goal detection method and system based on unmanned aerial vehicle remote sensing
CN103353941A (en) Natural marker registration method based on viewpoint classification
CN109509216A (en) A kind of Landsat and SPOT satellite image automatic registration method
CN114897705A (en) Unmanned aerial vehicle remote sensing image splicing method based on feature optimization
CN104484647B (en) A kind of high-resolution remote sensing image cloud height detection method
CN113963174A (en) Bogie identification image feature extraction method based on fusion of multi-view intensity domain and frequency domain
CN116883446B (en) Real-time monitoring system for grinding degree of vehicle-mounted camera lens

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant