CN111242139A - Point-line-feature-based maximum histogram heterogeneous image matching method - Google Patents

Point-line-feature-based maximum histogram heterogeneous image matching method

Info

Publication number
CN111242139A
Authority
CN
China
Prior art keywords
feature
image
point
maximum
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010034764.8A
Other languages
Chinese (zh)
Other versions
CN111242139B (en)
Inventor
吴全
徐贵力
王正盛
程月华
董文德
李振华
于启达
闫富菊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics
Priority to CN202010034764.8A
Publication of CN111242139A
Application granted
Publication of CN111242139B
Legal status: Active
Anticipated expiration

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V 10/507 Summing image-intensity values; Histogram projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a point-line-feature-based maximum histogram heterogeneous image matching method. A position-based boosting detector is adopted to extract a large number of feature points distributed on thick edges, and the local maximum edge direction and local maximum gradient direction of each feature point are selected within a combined feature to construct histogram features, which alleviates the time consumption of dense matching. Initially matched feature points are extracted by bilateral matching, and the common problem of mismatches is addressed with a difference-variance approach, improving matching accuracy. The invention enables the matching system to work in an uncertain environment, objectively evaluates the proposed heterogeneous image matching algorithm, and stably outputs accurate, dense matching point pairs.

Description

Point-line-feature-based maximum histogram heterogeneous image matching method
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a maximum histogram heterogeneous image matching method based on point-line features.
Background
In recent years, heterogeneous image analysis has been widely applied in fields such as tracking, video surveillance, guidance, and remote sensing monitoring. These applications often rely on images acquired by sensors employing different imaging mechanisms. So-called heterogeneous image matching is the process of locating features and regions having the same identity in images from different sensors. The information provided by a heterogeneous image pair is complementary, which helps solve various detection problems when effective information is scarce. Among visual tasks requiring image analysis, image matching is one of the most important processing steps. However, because different sensors acquire images by different principles, the acquired images exhibit different gray-scale intensity and texture characteristics in the same area. On the one hand, a single feature present in the reference image may not appear in the corresponding region of the target image; on the other hand, multiple features in the reference image may map to a single feature in the target image, and vice versa. In addition, the computational efficiency, matching accuracy, and stability of heterogeneous image matching largely determine the quality of subsequent image registration, target detection, positioning, and navigation. Therefore, heterogeneous image matching remains a challenging subject, and efficient, stable heterogeneous image matching algorithms are still needed.
Generally speaking, the success of a match depends largely on the descriptive power of the descriptor and the repeatability of the keypoints between corresponding regions. In heterogeneous image matching, infrared images tend to be of poor quality and contain fewer detectable feature points than visible-light images. It is therefore difficult to achieve satisfactory matching performance in heterogeneous images using a single point-feature descriptor. Edges are an important image feature, and in heterogeneous image matching stable local features are easier to acquire from edges than from points. However, edge features are less applicable than feature-point-based matching, because descriptors constructed from edge features must encode local information, and the uncertainty of edge end points has a large impact on matching performance. Consequently, it is still difficult to obtain satisfactory results in heterogeneous image matching using point features or edge features alone. Nevertheless, at the current stage, single features are often adopted in order to meet matching-efficiency requirements; a single feature does not fully exploit the useful information in the image, so matching precision is low and few matching point pairs are obtained.
Disclosure of Invention
The invention aims to provide a point-line-feature-based maximum histogram heterogeneous image matching method. By matching the different imaging characteristics of different sensors and fusing the useful information of all wave bands, all-weather monitoring and accurate positioning of remote sensing images are realized. The influence of image characteristics on a heterogeneous image matching system is fully analyzed, the feature elements affecting the performance of the matching system are determined, and a fused local feature descriptor is designed, solving the problem that description features are unstable when a single-feature matching method is applied to heterogeneous images with large regional feature differences.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a maximum histogram heterogeneous image matching method based on dotted line features comprises the following steps:
respectively graying the reference image and the heterogeneous image to be matched and extracting feature points distributed in the thick edge area;
constructing the feature-point maximum gradient direction feature description and the feature-point maximum edge direction feature description for the feature points extracted from the reference image and the heterogeneous image to be matched;
combining the feature-point maximum gradient direction feature description and the feature-point maximum edge direction feature description to construct a new feature point descriptor;
and matching the feature points using a bilateral matching rule.
Further, the extracting of feature points distributed in the thick edge region after respectively graying the reference image and the heterogeneous image to be matched comprises:
when a potential target detector attaches a confidence to each predicted position, considering different confidence thresholds yields a complete family of target detectors; the confidence target detector is applied to an input image or set of images and generates a set of confidence-scored positions:
h = {((x_1, y_1), c_1), ((x_2, y_2), c_2), ..., ((x_i, y_i), c_i)}
wherein (x_i, y_i) is the ith predicted position in the image and c_i is the prediction confidence;
denote the filtered list h at confidence threshold θ by h(θ):
h(θ) = {(x, y) : ((x, y), c) ∈ h and c ≥ θ}
the filtered list h(θ) under the confidence threshold θ outputs, in coordinate form, the positions (x_i, y_i) of the feature points distributed in the thick edge region.
Further, the constructing of the feature-point maximum gradient direction feature description for the feature points extracted from the reference image and the heterogeneous image to be matched comprises:
graying the image to be matched and solving its gradient:
G_x = ∂I/∂x, G_y = ∂I/∂y
wherein I is the gray-scale image, and G_x and G_y are the gradients in the horizontal x-direction and the vertical y-direction, respectively;
for the solved gradient, gradients in opposite directions are normalized to the same direction by squaring the gradient:
[equation image: the squared-gradient components G_s,x and G_s,y]
wherein G_s,x and G_s,y represent the square of the gradient in the horizontal x-direction and the vertical y-direction, respectively;
solving the local average gradient values:
Ḡ_s,x = h_σ * G_s,x, Ḡ_s,y = h_σ * G_s,y
wherein Ḡ_s,x and Ḡ_s,y represent the average gradient values in the horizontal x-direction and the vertical y-direction respectively, h_σ is a Gaussian window, and * denotes windowed summation (convolution);
the principal direction of the feature points of each image is defined as:
[equation image: the principal direction φ obtained from the angle relationship between Ḡ_s,x and Ḡ_s,y]
wherein φ represents the principal direction of the feature point;
dividing the 128×128 pixel area around the feature point into a 4×4 grid of contiguous subregions, establishing the direction distribution histogram of each subregion based on the feature-point principal directions within it, selecting the maximum of each subregion's histogram as that subregion's direction, and constructing the feature-point maximum gradient direction feature description:
HPO = [θ_1, V_1, ..., θ_16, V_16]
wherein HPO represents the feature-point maximum gradient direction feature descriptor, and θ_i and V_i are respectively the maximum gradient direction and its histogram value for the ith subregion of the feature point's local image, i = 1, 2, ..., 16.
Further, the constructing of the feature-point maximum edge direction feature description for the feature points extracted from the reference image and the heterogeneous image to be matched comprises:
dividing the 128×128 pixel area around the feature point into a 4×4 grid of contiguous subregions, and constructing the feature-point maximum edge direction histogram from the edge directions of the edge pixels in each subregion;
using the Sobel operation, the edge direction histogram is calculated:
bin_HEO(x, y) = f_Sobel(x, y) · I_k(x, y)
wherein bin_HEO(x, y) is the histogram distribution, f_Sobel(x, y) is the Sobel operation, I_k(x, y) is the image block of the kth subregion, k = 1, 2, ..., 16, and "·" denotes the dot product;
constructing the feature-point local maximum edge direction feature descriptor HEO:
HEO = [θ′_1, V′_1, ..., θ′_16, V′_16]
wherein θ′_i and V′_i are respectively the maximum edge direction and its histogram value for the ith local subregion of the feature point, i = 1, 2, ..., 16.
Further, the constructing of a new feature point descriptor by combining the feature-point maximum gradient direction feature description and the feature-point maximum edge direction feature description comprises:
HPEO = [θ_1, V_1, ..., θ_16, V_16, λθ′_1, λV′_1, ..., λθ′_16, λV′_16]
wherein HPEO is the feature point descriptor, λ is the ratio of the number of edge pixels to the number of feature points, θ_i and V_i are respectively the maximum gradient direction and its histogram value for the ith subregion of the feature point's local image, and θ′_i and V′_i are respectively the maximum edge direction and its histogram value for the ith local subregion, i = 1, 2, ..., 16.
Further, the matching of the feature points using the bilateral matching rule comprises:
the distances from a feature point f_1i of the reference image to all points of the feature point set F_2 of the image to be matched are expressed as:
d(f_1i, F_2) = {‖f_1i − f_2j‖ : f_2j ∈ F_2}
if |f′_2j − f″_2j| < t, the corresponding closest point is taken as the matching point; if, at the same time, the closest point on the reference image to the feature point f_2i of the image to be matched is f_1i, then the feature points f_1i and f_2i satisfy the bilateral matching rule;
wherein f′_2j and f″_2j are the nearest and second-nearest of all the distances, respectively, and t is the nearest-neighbor threshold.
Further, the method also comprises the step of eliminating mismatched point pairs after heterogeneous image matching:
the preliminarily matched feature point pairs are:
M(P_11(x_11, y_11), P_21(x_21, y_21)), ..., M(P_1k(x_1k, y_1k), P_2k(x_2k, y_2k))
wherein M denotes the matching process, and P_1k(x_1k, y_1k) and P_2k(x_2k, y_2k) represent a matching point pair;
constructing the angles θ_1,1, θ_1,2, ..., θ_1,k on the reference image from the angle between the horizontal line and the line passing through the center point and each other feature point;
constructing the angles θ_2,1, θ_2,2, ..., θ_2,k between the corresponding straight lines on the image to be matched in the same way;
calculating the difference between the angles θ_1,j and θ_2,j constructed on the reference image and the image to be matched:
Δθ_j = θ_1,j − θ_2,j, j = 1, 2, ..., k
and eliminating mismatched points by iteratively calculating the variance of the set of angle differences.
Further, the eliminating of mismatched points by iteratively calculating the variance of the set of angle differences comprises:
(6d1) inputting the angle-difference data Δθ_1, Δθ_2, ..., Δθ_k;
(6d2) calculating the initial value of the data variance;
(6d3) removing one angle in sequence;
(6d4) calculating the variance and mean of the remaining angles;
(6d5) if the variance of the current cycle is not less than the variance of the previous cycle, proceeding to step (6d6); otherwise, storing the minimum variance, eliminating the current Δθ_j, and putting the remaining data back as the data set;
(6d6) judging whether the loop is finished; if not, returning to step (6d3); otherwise, proceeding to the next step;
(6d7) if Min/Var < r and M > c are simultaneously satisfied, the point pair is a mismatched point; otherwise it is a correct matching point, wherein Var is the initial value of the variance, Min is the minimum variance, M is the mean of all retained data, and r and c are selected thresholds.
Further, r and c are set to 0.4 and 2, respectively.
Further, the reference image is an infrared image, and the image to be matched is a visible light image.
Compared with the prior art, the technical scheme of the invention has the following beneficial technical effects:
according to the method, a large number of characteristic points distributed in the significant edge area are extracted by using a position boosting algorithm, on one hand, sufficient characteristic points are provided for subsequent matching, on the other hand, the edge area has clear texture and is easy to generate consistent characteristics, and the method is favorable for subsequent matching.
When the local features are expressed jointly, the gradient information provides a reference for the edge features and reduces the dependence on the edge distribution; in turn, the edge features provide reliable edge direction information for the feature points, effectively alleviating the performance degradation caused by inaccurate gradient direction calculation.
According to the invention, by removing the part which contributes less to the histogram, the construction process of the descriptor is optimized, and the calculation efficiency is improved.
The invention designs the rotation invariant mismatch elimination method based on the angle difference, and further improves the matching performance.
Drawings
FIG. 1 is a flowchart of the point-line-feature-based maximum histogram heterogeneous image matching method of the present invention;
FIG. 2 shows feature points extracted in the thick edge region according to the present invention;
FIG. 3 is a schematic diagram of the descriptor construction of the present invention;
FIG. 4 is a schematic view of the angle construction of the present invention;
FIG. 5 is a flow chart of a rotation invariant mismatch culling method based on angle difference in the present invention;
FIG. 6 shows the matching result of the present invention when the heterogeneous images have opposite gray values;
FIG. 7 shows the matching result of the present invention when the heterogeneous images contain noise.
Detailed Description
The invention is further described below. The following examples are only for illustrating the technical solutions of the present invention more clearly, and should not be taken as limiting the scope of the present invention.
The invention provides a point-line-feature-based maximum histogram heterogeneous image matching method. First, the infrared and visible-light heterogeneous source images to be matched are each grayed, and a large number of feature points concentrated in the thick edge areas are extracted using position-based boosting. Second, for the feature points detected on the images, an improved histogram is designed to extract the local maximum gradient direction of each feature point, and a new local feature vector is constructed; a windowed squared-gradient summation represents the local gradient, ensuring that the principal directions of feature points in corresponding areas with different intensities remain consistent. Third, the directions of the edge pixels in each feature point's local area are calculated, and the maximum edge direction histogram is selected, so as to construct the local edge direction features of the feature points. Fourth, the feature-point maximum edge direction histogram and the feature-point gradient direction histogram are combined to construct a local feature point descriptor HPEO with illumination invariance. Fifth, matching is completed using an effective bilateral matching rule based on Euclidean distance. Finally, a rotation-invariant mismatch rejection method based on angle differences is designed for the constructed matching point pairs, so as to achieve higher matching accuracy.
With reference to FIG. 1, the present invention provides a maximum histogram heterogeneous image matching method based on point-line features, which specifically comprises:
(1) To address the problems of traditional heterogeneous image matching, where features are not salient and few matching point pairs lead to low matching precision, the infrared and visible-light heterogeneous source images to be matched are each grayed, and a position-based boosting detector then extracts a large number of feature points concentrated in the thick edge areas, providing a basis for extracting many repeatable descriptors and improving matching precision. In the invention, the reference image is an infrared image, and the image to be matched is a visible-light image.
Further, the extracting of the feature points of the thick edge region comprises:
(1a) When a (potential) target detector attaches a confidence to each predicted position, considering different confidence thresholds yields a complete family of target detectors. The confidence target detector is applied to an input image (or set of images) and generates a set of confidence-scored positions:
h = {((x_1, y_1), c_1), ((x_2, y_2), c_2), ..., ((x_i, y_i), c_i)}
wherein (x_i, y_i) is the ith predicted position in the image and c_i is the prediction confidence. The confidences in this list define the ordering of the detections.
(1b) Denote the filtered list h at confidence threshold θ by h(θ):
h(θ) = {(x, y) : ((x, y), c) ∈ h and c ≥ θ}
The filtered list h(θ) under the confidence threshold θ outputs, in coordinate form, the positions (x_i, y_i) of the feature points distributed on the thick edges. FIG. 2 is an example of the large number of feature points extracted in the edge region. It should be noted that the detected target is output as point coordinates along its dense edges.
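As an illustration of step (1b), the following is a minimal Python sketch of the confidence filtering that produces h(θ); the detector itself (position-based boosting) is not reproduced, and the function and variable names are hypothetical:

    # Minimal sketch of h(theta): keep only detections whose confidence c >= theta.
    def filter_by_confidence(h, theta):
        """h is a list of ((x, y), c) tuples; returns the coordinates in h(theta)."""
        return [(x, y) for (x, y), c in h if c >= theta]

    # Example: three candidate positions, keep those with confidence >= 0.6.
    h = [((12, 34), 0.91), ((56, 78), 0.42), ((90, 21), 0.77)]
    print(filter_by_confidence(h, 0.6))  # -> [(12, 34), (90, 21)]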
(2) To address the problem, common in heterogeneous image matching, that inconsistent image intensities and even opposite contrasts of corresponding areas make gradient directions inconsistent, an improved histogram is designed to extract the local maximum gradient direction of the feature points on the infrared and visible-light heterogeneous images to be matched, and a new feature-point maximum gradient direction feature description is constructed. Specifically, a windowed squared-gradient summation represents the local gradient, guaranteeing that the principal directions of feature points in corresponding areas with different intensities are consistent. The implementation is as follows:
(2a) Graying the image to be matched and solving its gradient:
G_x = ∂I/∂x, G_y = ∂I/∂y
wherein I is the gray-scale image, and G_x and G_y are the gradients in the horizontal (x) and vertical (y) directions, respectively.
(2b) For the gradient in step (2a), gradients in opposite directions are changed to the same direction using the square of the gradient; the squared gradients in the horizontal direction (G_s,x) and the vertical direction (G_s,y) are expressed as:
[equation image: the squared-gradient components G_s,x and G_s,y]
(2c) The gradients in step (2b) are summed by Gaussian windowing to obtain the local average gradient values:
Ḡ_s,x = h_σ * G_s,x, Ḡ_s,y = h_σ * G_s,y
wherein Ḡ_s,x and Ḡ_s,y represent the average gradient values in the horizontal and vertical directions respectively, h_σ is a Gaussian window, and * denotes windowed summation (convolution).
(2d) According to the angle relationship, the principal direction of the feature points of each image in the image group to be matched is defined as:
[equation image: the principal direction φ obtained from the angle relationship between Ḡ_s,x and Ḡ_s,y]
wherein φ denotes the principal direction of the image feature points.
(2e) Dividing the 128×128 pixel area around the feature point into a 4×4 grid of contiguous subregions, establishing the gradient direction distribution histogram of each subregion based on the feature-point directions within it, and selecting the maximum of each subregion's histogram as that subregion's direction, the feature-point maximum gradient direction description is constructed:
HPO = [θ_1, V_1, ..., θ_16, V_16] (5)
wherein HPO represents the feature-point maximum gradient direction feature descriptor, and θ_i and V_i are respectively the maximum gradient direction and its histogram value for the ith subregion of the feature point's local image, i = 1, 2, ..., 16.
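The following Python sketch illustrates steps (2a)-(2e) under stated assumptions: the squared-gradient form G_s,x = G_x² − G_y², G_s,y = 2·G_x·G_y and the half-angle principal direction stand in for the equation images, which are not reproduced in this text, and the parameter values (bin count, σ) are illustrative:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hpo_descriptor(gray, px, py, patch=128, grid=4, bins=36, sigma=2.0):
        # Assumes (px, py) lies at least patch//2 pixels from the image border.
        # (2a) gradients of the gray-scale image (np.gradient: rows -> y, cols -> x).
        gy, gx = np.gradient(gray.astype(np.float64))
        # (2b) squared gradient (assumed standard form) maps opposite gradients
        # to the same direction.
        gsx, gsy = gx**2 - gy**2, 2.0 * gx * gy
        # (2c) Gaussian-windowed summation gives the local average gradient.
        gsx_bar = gaussian_filter(gsx, sigma)
        gsy_bar = gaussian_filter(gsy, sigma)
        # (2d) principal direction (assumed half-angle form undoing the squaring).
        phi = 0.5 * np.arctan2(gsy_bar, gsx_bar)
        # (2e) 4x4 grid of subregions; keep each subregion's histogram peak.
        half, cell = patch // 2, patch // grid
        block = phi[py - half:py + half, px - half:px + half]
        desc = []
        for r in range(grid):
            for c in range(grid):
                sub = block[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell]
                hist, edges = np.histogram(sub, bins=bins,
                                           range=(-np.pi / 2, np.pi / 2))
                k = int(np.argmax(hist))
                desc.extend([0.5 * (edges[k] + edges[k + 1]), float(hist[k])])
        return np.asarray(desc)  # 32 values: [theta_1, V_1, ..., theta_16, V_16]

The peak of each subregion histogram plays the role of the pair (θ_i, V_i) in equation (5).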
(3) For the feature points on the infrared and visible-light heterogeneous images to be matched, the directions of the edge pixels in each feature point's local area are calculated, and the maximum edge direction is selected as the principal direction of the feature point, so as to construct the local edge features of the feature point.
The method specifically comprises the following steps:
(3a) Dividing the 128×128 pixel area around the feature point into a 4×4 grid of contiguous subregions, and constructing the feature-point maximum edge direction histogram from the edge directions of the edge pixels in each subregion.
(3b) Using the Sobel operation, the edge direction histogram is calculated as follows:
bin_HEO(x, y) = f_Sobel(x, y) · I_k(x, y) (6)
wherein bin_HEO(x, y) is the histogram distribution, f_Sobel(x, y) is the Sobel operation, I_k(x, y) is the image block of the kth subregion, k = 1, 2, ..., 16, and "·" denotes the dot product.
(3c) Selecting the maximum value in the edge direction histogram created for each subregion, the feature description, i.e., the feature-point maximum edge direction histogram feature, is constructed:
HEO = [θ′_1, V′_1, ..., θ′_16, V′_16] (7)
wherein θ′_i and V′_i are respectively the maximum edge direction and its histogram value for the ith subregion of the feature point's local image, i = 1, 2, ..., 16.
(4) To address the problem that single-feature matching of heterogeneous images is unstable when the regional feature differences are large, a local feature point descriptor HPEO with illumination invariance is constructed by combining the maximum edge direction histogram and the feature-point gradient direction histogram; the construction process of the feature point descriptor is shown in FIG. 3.
The new feature descriptor consists of a 64-dimensional feature vector, as follows:
HPEO = [θ_1, V_1, ..., θ_16, V_16, λθ′_1, λV′_1, ..., λθ′_16, λV′_16] (8)
wherein λ is the ratio of the number of extracted edge pixels to the number of feature points.
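Combining the two descriptions into HPEO then reduces to a weighted concatenation; a minimal sketch, where the λ computation follows the text (ratio of edge-pixel count to feature-point count):

    import numpy as np

    def hpeo_descriptor(hpo, heo, n_edge_pixels, n_feature_points):
        # lambda = ratio of extracted edge pixels to feature points (per the text).
        lam = n_edge_pixels / float(n_feature_points)
        return np.concatenate([hpo, lam * heo])  # 64-dimensional HPEO, eq. (8)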
(5) On the basis of solving the Euclidean distances among all the feature points, the feature points are matched using a bilateral matching rule.
The distances from a feature point f_1i of the reference image to all points of the feature point set F_2 of the image to be matched are expressed as:
d(f_1i, F_2) = {‖f_1i − f_2j‖ : f_2j ∈ F_2}
Define f′_2j and f″_2j as the nearest and second-nearest of all the distances, respectively. If |f′_2j − f″_2j| < t, the corresponding closest point is taken as the matching point; if, at the same time, the closest point on the reference image to the feature point f_2i of the image to be matched is f_1i, then the feature points f_1i and f_2i satisfy the bilateral matching rule. The nearest-neighbor threshold t is set to 0.9.
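A sketch of the bilateral matching in step (5); the nearest/second-nearest test is implemented here as Lowe's ratio test d′/d″ < t, an assumption consistent with t = 0.9, since the exact form of the comparison between the two smallest distances is not fully specified in this text:

    import numpy as np

    def bilateral_match(desc1, desc2, t=0.9):
        """desc1, desc2: (n, d) descriptor arrays (desc2 needs >= 2 rows);
        returns index pairs (i, j) satisfying the bilateral rule."""
        # Euclidean distance matrix between every reference and target descriptor.
        d = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=2)
        matches = []
        for i in range(d.shape[0]):
            order = np.argsort(d[i])
            j, j2 = order[0], order[1]
            if d[i, j] / max(d[i, j2], 1e-12) >= t:  # ambiguous match: reject
                continue
            if np.argmin(d[:, j]) == i:              # bilateral check: i is also
                matches.append((i, j))               # the nearest point to j
        return matches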
(6) To address the problem that some mismatched point pairs remain after feature matching of the heterogeneous images, a rotation-invariant mismatch rejection method based on angle differences is designed for the matching point pairs constructed from the infrared and visible-light heterogeneous source images, so as to achieve higher matching accuracy.
The steps are as follows:
(6a) The preliminarily matched feature point pairs obtained in step (5) are:
M(P_11(x_11, y_11), P_21(x_21, y_21)), ..., M(P_1k(x_1k, y_1k), P_2k(x_2k, y_2k)) (9)
wherein M denotes the matching process, and P_1k(x_1k, y_1k) and P_2k(x_2k, y_2k) represent a matching point pair.
(6b) Referring to FIG. 4, the angles θ_1,1, θ_1,2, ..., θ_1,k are constructed on the infrared image from the angle between the horizontal line and the line passing through the center point and each other feature point; similarly, the angles θ_2,1, θ_2,2, ..., θ_2,k between the straight lines on the visible-light image are constructed.
(6c) The difference between the angles θ_1,j and θ_2,j constructed in step (6b) on the reference image and the target image is calculated:
Δθ_j = θ_1,j − θ_2,j, j = 1, 2, ..., k
(6d) Mismatched points are eliminated by iteratively calculating the variance of the set of angle differences. FIG. 5 is a flowchart for judging whether a given feature point pair is a correct matching point, which comprises the following steps:
(6d1) inputting the angle-difference data Δθ_1, Δθ_2, ..., Δθ_k;
(6d2) calculating the initial value of the data variance;
(6d3) removing one angle in sequence;
(6d4) calculating the variance and mean of the remaining angles;
(6d5) if the variance of the current cycle is not less than the variance of the previous cycle, proceeding to step (6d6); otherwise, storing the minimum variance, eliminating the current Δθ_j, and putting the remaining data back as the data set;
(6d6) judging whether the loop is finished; if not, returning to step (6d3); otherwise, proceeding to the next step;
(6d7) if Min/Var < r and M > c are simultaneously satisfied, the point pair is a mismatched point; otherwise it is a correct matching point, wherein Var is the initial value of the variance, Min is the minimum variance, and M is the mean of all retained data.
The thresholds r and c control the looseness of the matching standard: the larger r is, the fewer feature points are retained, and the larger c is, the more feature points are retained. In the present invention, r and c are set to 0.4 and 2, respectively. A sketch of this procedure is given below.
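A sketch of steps (6d1)-(6d7); the greedy interpretation of the loop (repeatedly dropping the single angle whose removal lowers the variance, until no removal helps) is an assumption based on the flowchart description:

    import numpy as np

    def is_mismatch(delta_thetas, r=0.4, c=2.0):
        """delta_thetas: angle differences for one candidate pair's neighbourhood."""
        data = list(delta_thetas)
        var0 = float(np.var(data))                   # (6d2) initial variance Var
        min_var, mean_kept = var0, float(np.mean(data))
        improved = True
        while improved and len(data) > 2:
            improved = False
            for k in range(len(data)):               # (6d3) remove one angle in turn
                rest = data[:k] + data[k + 1:]
                v = float(np.var(rest))              # (6d4) variance of the rest
                if v < min_var:                      # (6d5) keep the improving removal
                    min_var, mean_kept = v, float(np.mean(rest))
                    data, improved = rest, True
                    break                            # (6d6) loop again over the rest
        # (6d7) Min/Var < r and M > c together flag a mismatch.
        return (min_var / max(var0, 1e-12) < r) and (mean_kept > c)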
FIG. 6 shows the matching result of the present invention when the heterogeneous images have opposite gray values. The matched images are remote sensing images with completely opposite gray intensities; with the proposed algorithm, a large number of feature point pairs are matched, and all matching point pairs are correct.
FIG. 7 shows the matching result of the present invention on a remote sensing image with strong noise. Although the noise is strong, the proposed algorithm still achieves a large number of matches while maintaining high accuracy.
The above description is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, several modifications and variations can be made without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as the protection scope of the present invention.

Claims (10)

1. A maximum histogram heterogeneous image matching method based on point-line features, characterized by comprising the following steps:
respectively graying the reference image and the heterogeneous image to be matched and extracting feature points distributed in the thick edge area;
constructing the feature-point maximum gradient direction feature description and the feature-point maximum edge direction feature description for the feature points extracted from the reference image and the heterogeneous image to be matched;
combining the feature-point maximum gradient direction feature description and the feature-point maximum edge direction feature description to construct a new feature point descriptor;
and matching the feature points using a bilateral matching rule.
2. The point-line-feature-based maximum histogram heterogeneous image matching method according to claim 1, wherein the extracting of feature points of the thick edge region after graying the reference image and the heterogeneous image to be matched respectively comprises:
when a potential target detector attaches a confidence to each predicted position, considering different confidence thresholds yields a complete family of target detectors; the confidence target detector is applied to an input image or set of images and generates a set of confidence-scored positions:
h = {((x_1, y_1), c_1), ((x_2, y_2), c_2), ..., ((x_i, y_i), c_i)}
wherein (x_i, y_i) is the ith predicted position in the image and c_i is the prediction confidence;
denote the filtered list h at confidence threshold θ by h(θ):
h(θ) = {(x, y) : ((x, y), c) ∈ h and c ≥ θ}
the filtered list h(θ) under the confidence threshold θ outputs, in coordinate form, the positions (x_i, y_i) of the feature points distributed in the thick edge region.
3. The point-line-feature-based maximum histogram heterogeneous image matching method according to claim 1, wherein the constructing of the feature-point maximum gradient direction feature description for the feature points extracted from the reference image and the heterogeneous image to be matched comprises:
graying the image to be matched and solving its gradient:
G_x = ∂I/∂x, G_y = ∂I/∂y
wherein I is the gray-scale image, and G_x and G_y are the gradients in the horizontal x-direction and the vertical y-direction, respectively;
for the solved gradient, gradients in opposite directions are normalized to the same direction by squaring the gradient:
[equation image: the squared-gradient components G_s,x and G_s,y]
wherein G_s,x and G_s,y represent the square of the gradient in the horizontal x-direction and the vertical y-direction, respectively;
solving the local average gradient values:
Ḡ_s,x = h_σ * G_s,x, Ḡ_s,y = h_σ * G_s,y
wherein Ḡ_s,x and Ḡ_s,y represent the average gradient values in the horizontal x-direction and the vertical y-direction respectively, h_σ is a Gaussian window, and * denotes windowed summation (convolution);
the principal direction of the feature points of each image is defined as:
[equation image: the principal direction φ obtained from the angle relationship between Ḡ_s,x and Ḡ_s,y]
wherein φ represents the principal direction of the feature point;
dividing the 128×128 pixel area around the feature point into a 4×4 grid of contiguous subregions, establishing the direction distribution histogram of each subregion based on the feature-point principal directions within it, selecting the maximum of each subregion's histogram as that subregion's direction, and constructing the feature-point maximum gradient direction feature description:
HPO = [θ_1, V_1, ..., θ_16, V_16]
wherein HPO represents the feature-point maximum gradient direction feature descriptor, and θ_i and V_i are respectively the maximum gradient direction and its histogram value for the ith subregion of the feature point's local image, i = 1, 2, ..., 16.
4. The point-line-feature-based maximum histogram heterogeneous image matching method according to claim 1, wherein the constructing of the feature-point maximum edge direction feature description for the feature points extracted from the reference image and the heterogeneous image to be matched comprises:
dividing the 128×128 pixel area around the feature point into a 4×4 grid of contiguous subregions, and constructing the feature-point maximum edge direction histogram from the edge directions of the edge pixels in each subregion;
using the Sobel operation, the edge direction histogram is calculated:
bin_HEO(x, y) = f_Sobel(x, y) · I_k(x, y)
wherein bin_HEO(x, y) is the histogram distribution, f_Sobel(x, y) is the Sobel operation, I_k(x, y) is the image block of the kth subregion, k = 1, 2, ..., 16, and "·" denotes the dot product;
constructing the feature-point local maximum edge direction feature descriptor HEO:
HEO = [θ′_1, V′_1, ..., θ′_16, V′_16]
wherein θ′_i and V′_i are respectively the maximum edge direction and its histogram value for the ith local subregion of the feature point, i = 1, 2, ..., 16.
5. The point-line-feature-based maximum histogram heterogeneous image matching method according to claim 1, wherein the constructing of a new feature point descriptor by combining the feature-point maximum gradient direction feature description and the feature-point maximum edge direction feature description comprises:
HPEO = [θ_1, V_1, ..., θ_16, V_16, λθ′_1, λV′_1, ..., λθ′_16, λV′_16]
wherein HPEO is the feature point descriptor, λ is the ratio of the number of edge pixels to the number of feature points, θ_i and V_i are respectively the maximum gradient direction and its histogram value for the ith subregion of the feature point's local image, and θ′_i and V′_i are respectively the maximum edge direction and its histogram value for the ith local subregion, i = 1, 2, ..., 16.
6. The point-line-feature-based maximum histogram heterogeneous image matching method according to claim 1, wherein the matching of the feature points using the bilateral matching rule comprises:
the distances from a feature point f_1i of the reference image to all points of the feature point set F_2 of the image to be matched are expressed as:
d(f_1i, F_2) = {‖f_1i − f_2j‖ : f_2j ∈ F_2}
if |f′_2j − f″_2j| < t, the corresponding closest point is taken as the matching point; if, at the same time, the closest point on the reference image to the feature point f_2i of the image to be matched is f_1i, then the feature points f_1i and f_2i satisfy the bilateral matching rule;
wherein f′_2j and f″_2j are the nearest and second-nearest of all the distances, respectively, and t is the nearest-neighbor threshold.
7. The point-line-feature-based maximum histogram heterogeneous image matching method according to claim 1, further comprising the step of eliminating mismatched point pairs after heterogeneous image matching:
the preliminarily matched feature point pairs are:
M(P_11(x_11, y_11), P_21(x_21, y_21)), ..., M(P_1k(x_1k, y_1k), P_2k(x_2k, y_2k))
wherein M denotes the matching process, and P_1k(x_1k, y_1k) and P_2k(x_2k, y_2k) represent a matching point pair;
constructing the angles θ_1,1, θ_1,2, ..., θ_1,k on the reference image from the angle between the horizontal line and the line passing through the center point and each other feature point;
constructing the angles θ_2,1, θ_2,2, ..., θ_2,k between the corresponding straight lines on the image to be matched in the same way;
calculating the difference between the angles θ_1,j and θ_2,j constructed on the reference image and the image to be matched:
Δθ_j = θ_1,j − θ_2,j, j = 1, 2, ..., k
and eliminating mismatched points by iteratively calculating the variance of the set of angle differences.
8. The method of claim 7, wherein the eliminating of mismatched points by iteratively calculating the variance of the set of angle differences comprises:
(6d1) inputting the angle-difference data Δθ_1, Δθ_2, ..., Δθ_k;
(6d2) calculating the initial value of the data variance;
(6d3) removing one angle in sequence;
(6d4) calculating the variance and mean of the remaining angles;
(6d5) if the variance of the current cycle is not less than the variance of the previous cycle, proceeding to step (6d6); otherwise, storing the minimum variance, eliminating the current Δθ_j, and putting the remaining data back as the data set;
(6d6) judging whether the loop is finished; if not, returning to step (6d3); otherwise, proceeding to the next step;
(6d7) if Min/Var < r and M > c are simultaneously satisfied, the point pair is a mismatched point; otherwise it is a correct matching point, wherein Var is the initial value of the variance, Min is the minimum variance, M is the mean of all retained data, and r and c are selected thresholds.
9. The point-line-feature-based maximum histogram heterogeneous image matching method according to claim 8, wherein r and c are set to 0.4 and 2, respectively.
10. The point-line-feature-based maximum histogram heterogeneous image matching method according to any one of claims 1 to 7, wherein the reference image is an infrared image and the image to be matched is a visible-light image.
CN202010034764.8A 2020-01-14 2020-01-14 Point-line-feature-based maximum histogram heterogeneous image matching method Active CN111242139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010034764.8A CN111242139B (en) 2020-01-14 2020-01-14 Point-line-feature-based maximum histogram heterogeneous image matching method

Publications (2)

Publication Number Publication Date
CN111242139A 2020-06-05
CN111242139B CN111242139B (en) 2023-04-18

Family

ID=70874502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010034764.8A Active CN111242139B (en) 2020-01-14 2020-01-14 Point-line-feature-based maximum histogram heterogeneous image matching method

Country Status (1)

Country Link
CN (1) CN111242139B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114694040A (en) * 2022-05-31 2022-07-01 潍坊绘圆地理信息有限公司 Data identification method for optical remote sensing data block registration based on dynamic threshold

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110148162A (en) * 2019-04-29 2019-08-20 河海大学 A kind of heterologous image matching method based on composition operators
CN110232387A (en) * 2019-05-24 2019-09-13 河海大学 A kind of heterologous image matching method based on KAZE-HOG algorithm

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YUN-HUI JIANG: "Optical/SAR image registration based on cross-correlation with multi-scale and multi-direction Gabor characteristic matrixes", IEEE Xplore *
ZHANG Jiao, et al.: "Heterogeneous source image matching algorithm combining SIFT and shape context", Laser & Infrared *

Also Published As

Publication number Publication date
CN111242139B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
US11176706B2 (en) Systems and methods for automated camera calibration
CN110097093B (en) Method for accurately matching heterogeneous images
US20120082385A1 (en) Edge based template matching
CN112016401A (en) Cross-modal-based pedestrian re-identification method and device
Bak et al. Improving person re-identification by viewpoint cues
US11030436B2 (en) Object recognition
CN111667506B (en) Motion estimation method based on ORB feature points
CN110569861B (en) Image matching positioning method based on point feature and contour feature fusion
Schmiedel et al. IRON: A fast interest point descriptor for robust NDT-map matching and its application to robot localization
CN110009680B (en) Monocular image position and posture measuring method based on circle feature and different-surface feature points
Fan et al. SAR image registration using multiscale image patch features with sparse representation
CN111680699B (en) Air-ground infrared time-sensitive weak small target detection method based on background suppression
CN110120013B (en) Point cloud splicing method and device
Liu et al. Regularization based iterative point match weighting for accurate rigid transformation estimation
Lu et al. Robustness to lighting variations: An RGB-D indoor visual odometry using line segments
Du et al. Infrared and visible image registration based on scale-invariant piifd feature and locality preserving matching
CN111932582A (en) Target tracking method and device in video image
Haner et al. Combining foreground/background feature points and anisotropic mean shift for enhanced visual object tracking
CN111932579A (en) Method and device for adjusting equipment angle based on motion trail of tracked target
Bernreiter et al. Spherical multi-modal place recognition for heterogeneous sensor systems
CN114066954B (en) Feature extraction and registration method for multi-modal image
CN111242139B (en) Point-line-feature-based maximum histogram heterogeneous image matching method
CN111311657B (en) Infrared image homologous registration method based on improved corner principal direction distribution
CN111882594A (en) ORB feature point-based polarization image rapid registration method and device
WO2017068331A1 (en) Improvements in and relating to missile targeting

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant