CN112634335A - Method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion - Google Patents

Method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion

Info

Publication number
CN112634335A
CN112634335A (application CN202011564394.5A)
Authority
CN
China
Prior art keywords
image
points
feature
point
remote sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011564394.5A
Other languages
Chinese (zh)
Inventor
张海涛
赵薇薇
丁一帆
李泽一
王艳
周颖
陈雪华
李谦
王刚
曾小莉
黄翊航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Beijing Institute of Remote Sensing Information
Original Assignee
Tsinghua University
Beijing Institute of Remote Sensing Information
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University, Beijing Institute of Remote Sensing Information filed Critical Tsinghua University
Priority to CN202011564394.5A
Publication of CN112634335A
Pending legal-status Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/337 — Determination of transform parameters for the alignment of images (image registration) using feature-based methods involving reference images or patches
    • G06T7/13 — Edge detection
    • G06T7/168 — Segmentation; edge detection involving transform domain methods
    • G06T7/35 — Determination of transform parameters for the alignment of images (image registration) using statistical methods
    • G06T2207/10032 — Satellite or aerial image; remote sensing

Abstract

The invention discloses a method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion. The specific steps are: preprocess the two obtained multi-modal remote sensing images; extract the accurate corner points and edge feature points of the master and slave images respectively, and obtain the feature points of each image; generate the feature vector of each feature point; register the feature vectors of the multi-modal image feature points; and further screen all candidate matching point pairs with the RANSAC algorithm to finally obtain accurate, uniform feature matching point pairs that are robust to nonlinear radiation distortion. The invention uses the phase consistency measure moment maps, which are robust to nonlinear radiation distortion, as the object of feature point extraction, and can therefore obtain a large amount of accurate and robust feature point information; extracting Harris corner points block by block effectively avoids the over-concentrated distribution of image feature points and the low feature extraction efficiency caused by uneven illumination radiation.

Description

Method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion
Technical Field
The invention belongs to the technical field of remote sensing image processing, and in particular relates to a method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion.
Background
At present, multi-modal remote sensing image feature matching has become an important research direction in the field of remote sensing image processing. The matching information extracted from multi-modal remote sensing image features is widely applicable to many fields of production, daily life, and economic construction, such as natural disaster and environmental monitoring, multi-source information joint positioning, urban planning, and major engineering construction. However, multi-modal remote sensing images come from different sources and differ in resolution, spectrum, viewing angle, time phase, and other details, which makes it challenging to overcome the matching differences between heterogeneous images and to improve matching accuracy and stability.
Among multi-modal remote sensing image feature matching methods, mutual-information-based matching and SIFT-based template feature extraction with its improved variants are currently the most widely applied. Both perform well in certain tasks, but they still have limitations and struggle with strong nonlinear radiation distortion and noise.
Le Xin, Yang Yuhui et al. disclose a multi-source remote sensing image matching method based on directional phase features in the paper "Multi-source remote sensing image matching using directional phase features" (Geomatics and Information Science of Wuhan University, 2020, 45(04): 488–494). The method first performs a preliminary calibration of the image using existing imaging parameters and geographic reference information; based on the calibrated image, it extracts the feature points of the reference image with a block Harris algorithm; it then constructs dense feature descriptors from the phase feature images of the image in multiple orientations and determines matching point pairs with a sliding window; finally, it fits the matching result to sub-pixel level with a Taylor series of a given order, achieving stable and reliable matching of homonymous points. Although the method effectively exploits the phase consistency feature and can resist nonlinear radiation distortion to some extent, it depends strongly on the preliminary geographic calibration: accuracy is high only when the pixel deviation is small, and when the deviation of a remote sensing image is large the computational load grows and performance degrades.
The patent application "A fast and robust multi-modal remote sensing image matching method and system" filed by Southwest Jiaotong University (application number CN201710773285.6, publication number CN107563438A) discloses an automatic multi-modal remote sensing image matching method that integrates multiple local feature descriptors. The method first extracts a local feature descriptor at each pixel of the image to form a pixel-wise feature representation map, then builds a fast matching similarity measure in the frequency domain by applying a three-dimensional Fourier transform to that map, and finally identifies homonymous points by template matching. By combining several traditional local feature descriptors such as HOG, LSS, and SURF, the method improves the accuracy of describing common structure, shape, and texture attributes of images to some extent; however, the traditional descriptors are still not robust to nonlinear radiation distortion, and the pixel-by-pixel matching also reduces computational efficiency, hindering effective matching of multi-modal images.
Disclosure of Invention
Aiming at the problems that accurate and uniform robust feature information is difficult to extract from existing multi-modal remote sensing images, which makes heterogeneous information fusion difficult and satellite uncontrolled positioning accuracy low, the invention discloses a method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion.
The method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion disclosed by the invention comprises the following specific steps:
s1, image preprocessing is carried out on the two obtained multi-modal remote sensing images, and the image preprocessing specifically comprises the following steps:
S11, determine whether each obtained multi-modal remote sensing image is a panchromatic (black-and-white) image; if so, go to S12; otherwise, convert it to a black-and-white image by weighted band combination;
s12, setting one image in the multi-modal remote sensing images as a main image and the other image as a slave image;
S13, resample the master and slave images respectively so that their resolutions are the same;
s14, extracting the maximum moment graph and the minimum moment graph of phase consistency of the master image and the slave image respectively;
Set the number of filter scales N_s = 4 and the number of filter orientations N_o = 6;
For each pixel point (x, y) of the master image, calculate the phase consistency measure PC(x, y) over all scales s and orientations o:

$$PC(x,y)=\frac{\sum_{o}\sum_{s} w_{o}(x,y)\left\lfloor A_{so}(x,y)\,\Delta\Phi_{so}(x,y)-T\right\rfloor}{\sum_{o}\sum_{s} A_{so}(x,y)+\xi}$$

where w_o(x, y) is a weight function based on the frequency response range at pixel (x, y), T is a noise threshold, ξ is a small offset, and the operator ⌊·⌋ returns its argument if and only if the argument is positive, and 0 otherwise. The amplitude component A_so(x, y) and the phase deviation function ΔΦ_so(x, y) at pixel (x, y) are calculated as:

$$A_{so}(x,y)=\sqrt{E_{so}(x,y)^{2}+O_{so}(x,y)^{2}}$$

$$A_{so}(x,y)\,\Delta\Phi_{so}(x,y)=\left(E_{so}(x,y)\,\bar{\phi}_{E}(x,y)+O_{so}(x,y)\,\bar{\phi}_{O}(x,y)\right)-\left|E_{so}(x,y)\,\bar{\phi}_{O}(x,y)-O_{so}(x,y)\,\bar{\phi}_{E}(x,y)\right|$$

where the mean phase components φ̄_E(x, y) and φ̄_O(x, y) are calculated as:

$$\bar{\phi}_{E}(x,y)=\frac{F(x,y)}{E(x,y)},\qquad \bar{\phi}_{O}(x,y)=\frac{H(x,y)}{E(x,y)},\qquad E(x,y)=\sqrt{F(x,y)^{2}+H(x,y)^{2}}$$

with F(x, y) = Σ_s E_so(x, y) and H(x, y) = Σ_s O_so(x, y). The filtered response components [E_so(x, y), O_so(x, y)] at pixel (x, y) are:

$$[E_{so}(x,y),\,O_{so}(x,y)]=[I(x,y)*L_{even}(x,y,s,o),\,I(x,y)*L_{odd}(x,y,s,o)]$$

where L_even(x, y, s, o) and L_odd(x, y, s, o) are respectively the real and imaginary parts of the two-dimensional log-Gabor filter function at pixel (x, y) in the spatial domain, * denotes convolution, and I(x, y) is the gray value at pixel (x, y). The two-dimensional log-Gabor filter function is:

$$L(\rho,\theta,s,o)=\exp\!\left(-\frac{(\rho-\rho_{s})^{2}}{2\sigma_{\rho}^{2}}\right)\exp\!\left(-\frac{(\theta-\theta_{so})^{2}}{2\sigma_{\theta}^{2}}\right)$$

where (ρ, θ) are log-polar coordinates, (s, o) are the scale and orientation of the filter, ρ_s and θ_so are the corresponding center frequency parameters, and σ_ρ and σ_θ are the bandwidths along ρ and θ, respectively;
Take the maximum value of PC(x, y) over the different scales in orientation o as the phase consistency measure of that orientation, denoted PC(θ_o), where θ_o is the angle value of orientation o when PC(x, y) takes its maximum;
For the master and slave images respectively, calculate for each pixel point (x, y) the principal axis ψ, the maximum moment M_ψ, and the minimum moment m_ψ. The specific process is as follows.

First, calculate the intermediate variables a, b, and c:

$$a=\sum_{o}\left(PC(\theta_{o})\cos\theta_{o}\right)^{2}$$

$$b=2\sum_{o}\left(PC(\theta_{o})\cos\theta_{o}\right)\left(PC(\theta_{o})\sin\theta_{o}\right)$$

$$c=\sum_{o}\left(PC(\theta_{o})\sin\theta_{o}\right)^{2}$$

Then calculate the principal axis ψ, maximum moment M_ψ, and minimum moment m_ψ of pixel (x, y) from the intermediate variables:

$$\psi=\frac{1}{2}\arctan\!\left(\frac{b}{a-c}\right)$$

$$M_{\psi}=\frac{1}{2}\left(c+a+\sqrt{b^{2}+(a-c)^{2}}\right)$$

$$m_{\psi}=\frac{1}{2}\left(c+a-\sqrt{b^{2}+(a-c)^{2}}\right)$$

The maximum moments M_ψ and minimum moments m_ψ of all pixels in an image form its phase consistency maximum moment map and minimum moment map, respectively.
S2, extract the accurate corner points and edge feature points of the master and slave images respectively, and obtain the feature points of each image:
S21, uniformly divide the minimum moment map into N × N non-overlapping rectangular image blocks and accurately extract the corner points of each block with a multi-scale Harris algorithm;
Set the Gaussian kernel scales σ_i, i = 1, 2, 3, where σ_1 = 0.5, σ_2 = 1, σ_3 = 2;
Filter each image block with the Gaussian kernel at each scale, producing for every block a multi-scale image set of three images; extract candidate corner points from this set with the Harris algorithm, iteratively filter out pseudo corner points from the smallest scale to the largest to obtain accurate corner points, and store the Harris operator value of each corner point;
S22, sort the Harris operator values of the accurate corner points extracted in each image block and keep the K points with the largest values as the feature corner points of that block, so that the final number of feature corner points is N × N × K;
S23, extract edge feature points from the maximum moment map with the FAST algorithm, screen out the M edge feature points with the highest repetition rate to form an edge feature point set, and merge this set with the feature corner points to obtain the final feature points of the image;
and S3, generating a feature vector of the feature point:
s31, generating a maximum index map of the master image and the slave image by using a log-Gabor filter;
S311, for each pixel point (x, y) of the master and slave images, calculate the log-Gabor convolution layer amplitude A_o(x, y) in the 6 filter orientations 0°, 30°, 60°, 90°, 120°, and 150°:

$$A_{o}(x,y)=\sum_{s=1}^{N_{s}} A_{so}(x,y)$$

where N_s is the total number of filter scales;
s312, sequentially numbering the log-Gabor convolution layer amplitudes corresponding to the 6 filtering directions to be 1-6;
s313, comparing the 6 amplitude values, and taking the maximum value corresponding number as the index value of the pixel point;
and S314, forming a maximum index map by the set of all pixel point index values.
S32, for each feature point of the master and slave images, select the J × J-pixel local image block centered on the feature point in the maximum index map and construct the feature vector from distribution histograms:
S321, select the J × J-pixel local image block centered on the feature point in the maximum index map;
S322, apply a Gaussian function centered on the feature point with standard deviation J/2, taking the Gaussian value at each pixel position as that pixel's weight;
S323, uniformly divide the local image block into 6 × 6 sub-blocks;
S324, establish a 6-bin distribution histogram for each sub-block: each sub-block is part of the maximum index map, so its pixel values lie between 1 and 6; count the number of points with pixel value i in the sub-block, denote it v_i, and form the histogram vector V = (v_1, v_2, v_3, v_4, v_5, v_6), where i ∈ {1, 2, 3, 4, 5, 6};
S325, concatenate the histogram vectors of all sub-blocks into the feature vector and normalize it, yielding a vector of dimension 6 × 6 × 6 = 216;
s4, registering the feature vectors of the multi-modal image feature points:
S41, calculate the normalized cross-correlation coefficient between the feature vector of each feature point of the master image and the feature vectors of all feature points of the slave image;
S42, sort the normalized cross-correlation coefficients from high to low, and pair each feature point of the master image with the slave-image feature point having the highest coefficient to form a candidate matching point pair;
and S5, further screening all candidate matching point pairs by using a RANSAC algorithm to finally obtain accurate and uniform characteristic matching point pairs which are robust to nonlinear radiation distortion.
The invention has the beneficial effects that:
First, the maximum and minimum moment maps of the master and slave images are constructed from phase consistency information; a block-based multi-scale Harris algorithm is applied to the minimum moment map to accurately extract corner features; the FAST algorithm is applied to the maximum moment map to extract edge features with a high repetition rate; and the union of the accurate corner points and edge feature points of each image is taken as its feature points. In this process, the phase consistency measure moment maps, which are robust to nonlinear radiation distortion, serve as the object of feature point extraction, so a large amount of accurate and robust feature point information can be obtained. The multi-scale iterative Harris algorithm and the FAST algorithm further improve the accuracy of feature point extraction. In addition, extracting Harris corner points per image block effectively avoids the over-concentrated distribution of image feature points and the low feature extraction efficiency caused by uneven illumination radiation.
Secondly, constructing a maximum index graph based on a log-Gabor filter; describing the feature point vector through a maximum value index map; calculating a normalized cross-correlation coefficient of each characteristic point vector of the master image and all characteristic point vectors of the slave images; sorting the coefficients, and taking the point pair with the highest similarity as a candidate matching point pair; and further eliminating the mismatching point pairs by using an RANSAC algorithm and screening the fine matching point pairs. The above process achieves robustness of the feature descriptor against nonlinear radiation distortion through the maximum value index map. The RANSAC algorithm further ensures the accuracy of the matching point pairs, and finally solves the problem of accurate, uniform and robust feature point pair extraction.
Drawings
FIG. 1 is a flow chart of the steps of the method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion;
FIG. 2 is a diagram illustrating the result of multiscale Harris corner extraction of a block of a partial image;
FIG. 3 is a schematic diagram of an alternative maximum index in the present invention;
fig. 4 is a graph of simulation results of the present invention.
Detailed Description
For a better understanding of the present disclosure, an example is given here.
Fig. 1 is a flow chart of steps of a method for extracting a robust feature point of a multimodal remote sensing image according to an embodiment of the present invention, and as shown in fig. 1, the embodiment discloses a method for extracting a robust remote sensing image feature point pair oriented to nonlinear radiation distortion, which includes the following specific steps:
s1, image preprocessing is carried out on the two obtained multi-modal remote sensing images, and the image preprocessing specifically comprises the following steps:
S11, determine whether each obtained multi-modal remote sensing image is a panchromatic (black-and-white) image; if so, go to S12; otherwise, convert it to a black-and-white image by weighted band combination;
s12, setting one image in the multi-modal remote sensing images as a main image and the other image as a slave image;
S13, resample the master and slave images respectively so that their resolutions are the same;
s14, extracting the maximum moment graph and the minimum moment graph of phase consistency of the master image and the slave image respectively;
Set the number of filter scales N_s = 4 and the number of filter orientations N_o = 6;
For each pixel point (x, y) of the master image, calculate the phase consistency measure PC(x, y) over all scales s and orientations o:

$$PC(x,y)=\frac{\sum_{o}\sum_{s} w_{o}(x,y)\left\lfloor A_{so}(x,y)\,\Delta\Phi_{so}(x,y)-T\right\rfloor}{\sum_{o}\sum_{s} A_{so}(x,y)+\xi}$$

where w_o(x, y) is a weight function based on the frequency response range at pixel (x, y), T is a noise threshold, ξ is a small offset, and the operator ⌊·⌋ returns its argument if and only if the argument is positive, and 0 otherwise. The amplitude component A_so(x, y) and the phase deviation function ΔΦ_so(x, y) at pixel (x, y) are calculated as:

$$A_{so}(x,y)=\sqrt{E_{so}(x,y)^{2}+O_{so}(x,y)^{2}}$$

$$A_{so}(x,y)\,\Delta\Phi_{so}(x,y)=\left(E_{so}(x,y)\,\bar{\phi}_{E}(x,y)+O_{so}(x,y)\,\bar{\phi}_{O}(x,y)\right)-\left|E_{so}(x,y)\,\bar{\phi}_{O}(x,y)-O_{so}(x,y)\,\bar{\phi}_{E}(x,y)\right|$$

where the mean phase components φ̄_E(x, y) and φ̄_O(x, y) are calculated as:

$$\bar{\phi}_{E}(x,y)=\frac{F(x,y)}{E(x,y)},\qquad \bar{\phi}_{O}(x,y)=\frac{H(x,y)}{E(x,y)},\qquad E(x,y)=\sqrt{F(x,y)^{2}+H(x,y)^{2}}$$

with F(x, y) = Σ_s E_so(x, y) and H(x, y) = Σ_s O_so(x, y). The filtered response components [E_so(x, y), O_so(x, y)] at pixel (x, y) are:

$$[E_{so}(x,y),\,O_{so}(x,y)]=[I(x,y)*L_{even}(x,y,s,o),\,I(x,y)*L_{odd}(x,y,s,o)]$$

where L_even(x, y, s, o) and L_odd(x, y, s, o) are respectively the real and imaginary parts of the two-dimensional log-Gabor filter function at pixel (x, y) in the spatial domain, * denotes convolution, and I(x, y) is the gray value at pixel (x, y). The two-dimensional log-Gabor filter function is:

$$L(\rho,\theta,s,o)=\exp\!\left(-\frac{(\rho-\rho_{s})^{2}}{2\sigma_{\rho}^{2}}\right)\exp\!\left(-\frac{(\theta-\theta_{so})^{2}}{2\sigma_{\theta}^{2}}\right)$$

where (ρ, θ) are log-polar coordinates, (s, o) are the scale and orientation of the filter, ρ_s and θ_so are the corresponding center frequency parameters, and σ_ρ and σ_θ are the bandwidths along ρ and θ, respectively;
Take the maximum value of PC(x, y) over the different scales in orientation o as the phase consistency measure of that orientation, denoted PC(θ_o), where θ_o is the angle value of orientation o when PC(x, y) takes its maximum;
For the master and slave images respectively, calculate for each pixel point (x, y) the principal axis ψ, the maximum moment M_ψ, and the minimum moment m_ψ. The specific process is as follows.

First, calculate the intermediate variables a, b, and c:

$$a=\sum_{o}\left(PC(\theta_{o})\cos\theta_{o}\right)^{2}$$

$$b=2\sum_{o}\left(PC(\theta_{o})\cos\theta_{o}\right)\left(PC(\theta_{o})\sin\theta_{o}\right)$$

$$c=\sum_{o}\left(PC(\theta_{o})\sin\theta_{o}\right)^{2}$$

Then calculate the principal axis ψ, maximum moment M_ψ, and minimum moment m_ψ of pixel (x, y) from the intermediate variables:

$$\psi=\frac{1}{2}\arctan\!\left(\frac{b}{a-c}\right)$$

$$M_{\psi}=\frac{1}{2}\left(c+a+\sqrt{b^{2}+(a-c)^{2}}\right)$$

$$m_{\psi}=\frac{1}{2}\left(c+a-\sqrt{b^{2}+(a-c)^{2}}\right)$$

The maximum moments M_ψ and minimum moments m_ψ of all pixels in an image form its phase consistency maximum moment map and minimum moment map, respectively.
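The moment-map construction of S14 above can be sketched in Python with NumPy. This is a minimal illustration, not the patented implementation: the frequency-domain filter construction, the parameter values `min_wavelength`, `mult`, `sigma_rho`, and `sigma_theta`, and the omission of the per-orientation weight w_o(x, y) are all simplifying assumptions.

```python
import numpy as np

def log_gabor_bank(rows, cols, n_scales=4, n_orients=6,
                   min_wavelength=3.0, mult=2.1,
                   sigma_rho=0.55, sigma_theta=0.4):
    """Frequency-domain log-Gabor filters for N_s scales x N_o orientations.
    Parameter values here are illustrative assumptions, not the patent's."""
    y, x = np.mgrid[-(rows // 2):rows - rows // 2,
                    -(cols // 2):cols - cols // 2]
    radius = np.sqrt((x / cols) ** 2 + (y / rows) ** 2)
    radius[rows // 2, cols // 2] = 1.0           # avoid log(0) at the DC term
    theta = np.arctan2(-y, x)
    bank = {}
    for o in range(n_orients):
        angle = o * np.pi / n_orients            # 0, 30, 60, 90, 120, 150 deg
        d_theta = np.arctan2(np.sin(theta - angle), np.cos(theta - angle))
        angular = np.exp(-d_theta ** 2 / (2 * sigma_theta ** 2))
        for s in range(n_scales):
            f0 = 1.0 / (min_wavelength * mult ** s)   # centre frequency rho_s
            radial = np.exp(-np.log(radius / f0) ** 2 /
                            (2 * np.log(sigma_rho) ** 2))
            radial[rows // 2, cols // 2] = 0.0
            bank[(s, o)] = np.fft.ifftshift(radial * angular)
    return bank

def pc_moment_maps(img, n_scales=4, n_orients=6, T=0.1, xi=1e-4):
    """Return the phase consistency maximum and minimum moment maps."""
    rows, cols = img.shape
    spectrum = np.fft.fft2(img)
    bank = log_gabor_bank(rows, cols, n_scales, n_orients)
    a = b = c = 0.0
    for o in range(n_orients):
        sum_e = np.zeros((rows, cols)); sum_o = np.zeros((rows, cols))
        sum_a = np.zeros((rows, cols)); responses = []
        for s in range(n_scales):
            eo = np.fft.ifft2(spectrum * bank[(s, o)])
            e, od = eo.real, eo.imag             # E_so and O_so components
            responses.append((e, od))
            sum_e += e; sum_o += od
            sum_a += np.sqrt(e ** 2 + od ** 2)   # running sum of A_so
        energy = np.sqrt(sum_e ** 2 + sum_o ** 2) + xi
        me, mo = sum_e / energy, sum_o / energy  # mean phase components
        num = np.zeros((rows, cols))
        for e, od in responses:                  # A_so * delta-phi_so terms
            num += (e * me + od * mo) - np.abs(e * mo - od * me)
        pc = np.maximum(num - T, 0.0) / (sum_a + xi)   # PC(theta_o)
        ang = o * np.pi / n_orients
        a = a + (pc * np.cos(ang)) ** 2          # intermediate variables a, b, c
        b = b + 2 * (pc * np.cos(ang)) * (pc * np.sin(ang))
        c = c + (pc * np.sin(ang)) ** 2
    disc = np.sqrt(b ** 2 + (a - c) ** 2)
    return 0.5 * (c + a + disc), 0.5 * (c + a - disc)  # M_psi, m_psi maps
```

If the principal axis ψ is also needed, it follows from the same intermediates as ½·arctan2(b, a − c).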
S2, extract the accurate corner points and edge feature points of the master and slave images respectively, and obtain the feature points of each image:
S21, uniformly divide the minimum moment map into N × N non-overlapping rectangular image blocks and accurately extract the corner points of each block with a multi-scale Harris algorithm;
Set the Gaussian kernel scales σ_i, i = 1, 2, 3, where σ_1 = 0.5, σ_2 = 1, σ_3 = 2;
Filter each image block with the Gaussian kernel at each scale, producing for every block a multi-scale image set of three images; extract candidate corner points from this set with the Harris algorithm, iteratively filter out pseudo corner points from the smallest scale to the largest to obtain accurate corner points, and store the Harris operator value of each corner point;
S22, sort the Harris operator values of the accurate corner points extracted in each image block and keep the K points with the largest values as the feature corner points of that block, so that the final number of feature corner points is N × N × K;
S23, extract edge feature points from the maximum moment map with the FAST algorithm, screen out the M edge feature points with the highest repetition rate to form an edge feature point set, and merge this set with the feature corner points to obtain the final feature points of the image;
FIG. 2 is a diagram illustrating the result of multi-scale Harris corner extraction of a block image according to an embodiment of the present invention;
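The block-wise multi-scale Harris extraction of S21–S22 can be sketched as follows. This is a hedged illustration: the hand-rolled Gaussian smoothing, the Harris constant k = 0.04, and the use of a per-pixel minimum across scales as a stand-in for the small-to-large iterative pseudo-corner filtering are all assumptions, not the patent's exact procedure.

```python
import numpy as np

def _gauss_blur(a, sigma):
    """Separable Gaussian smoothing built from scratch (no SciPy assumed)."""
    r = int(3 * sigma) + 1
    t = np.arange(-r, r + 1)
    k = np.exp(-t ** 2 / (2 * sigma ** 2)); k /= k.sum()
    a = np.apply_along_axis(lambda row: np.convolve(row, k, 'same'), 1, a)
    return np.apply_along_axis(lambda col: np.convolve(col, k, 'same'), 0, a)

def harris_response(img, sigma, k=0.04):
    """Harris operator values after smoothing at the given Gaussian scale."""
    smoothed = _gauss_blur(img.astype(float), sigma)
    gy, gx = np.gradient(smoothed)
    ixx = _gauss_blur(gx * gx, 1.0)
    iyy = _gauss_blur(gy * gy, 1.0)
    ixy = _gauss_blur(gx * gy, 1.0)
    return ixx * iyy - ixy ** 2 - k * (ixx + iyy) ** 2

def blockwise_corners(min_moment_map, n=4, top_k=5, sigmas=(0.5, 1.0, 2.0)):
    """Split the minimum moment map into n x n blocks and keep the top_k
    strongest multi-scale Harris corners per block (n*n*top_k in total)."""
    h, w = min_moment_map.shape
    corners = []
    for bi in range(n):
        for bj in range(n):
            blk = min_moment_map[bi * h // n:(bi + 1) * h // n,
                                 bj * w // n:(bj + 1) * w // n]
            # a stable corner must respond at every scale (small -> large),
            # approximating the iterative pseudo-corner filtering of S21
            resp = np.min([harris_response(blk, s) for s in sigmas], axis=0)
            order = np.argsort(resp, axis=None)[::-1][:top_k]
            for idx in order:
                y, x = np.unravel_index(idx, blk.shape)
                corners.append((bi * h // n + y, bj * w // n + x, resp[y, x]))
    return corners
```

Keeping exactly `top_k` corners per block is what guarantees the uniform N × N × K spatial distribution described above.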
and S3, generating a feature vector of the feature point:
s31, generating a maximum index map of the master image and the slave image by using a log-Gabor filter;
S311, for each pixel point (x, y) of the master and slave images, calculate the log-Gabor convolution layer amplitude A_o(x, y) in the 6 filter orientations 0°, 30°, 60°, 90°, 120°, and 150°:

$$A_{o}(x,y)=\sum_{s=1}^{N_{s}} A_{so}(x,y)$$

where N_s is the total number of filter scales;
s312, sequentially numbering the log-Gabor convolution layer amplitudes corresponding to the 6 filtering directions to be 1-6;
s313, comparing the 6 amplitude values, and taking the maximum value corresponding number as the index value of the pixel point;
s314, the set of index values of all the pixels forms a maximum index map, as shown in fig. 3.
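Steps S311–S314 reduce to a per-pixel argmax over the six orientation amplitude sums. A minimal sketch (the stacked-array input layout is an assumption):

```python
import numpy as np

def max_index_map(orientation_amplitudes):
    """orientation_amplitudes: array of shape (6, H, W) holding the log-Gabor
    convolution layer amplitudes A_o(x, y) for the six filter orientations,
    numbered 1..6 in order (S312).  Returns the per-pixel number (1..6) of
    the orientation with the largest amplitude (S313-S314)."""
    return np.argmax(orientation_amplitudes, axis=0) + 1
```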
S32, for each feature point of the master and slave images, select the J × J-pixel local image block centered on the feature point in the maximum index map and construct the feature vector from distribution histograms:
S321, select the J × J-pixel local image block centered on the feature point in the maximum index map;
S322, apply a Gaussian function centered on the feature point with standard deviation J/2, taking the Gaussian value at each pixel position as that pixel's weight;
S323, uniformly divide the local image block into 6 × 6 sub-blocks;
S324, establish a 6-bin distribution histogram for each sub-block: each sub-block is part of the maximum index map, so its pixel values lie between 1 and 6; count the number of points with pixel value i in the sub-block, denote it v_i, and form the histogram vector V = (v_1, v_2, v_3, v_4, v_5, v_6), where i ∈ {1, 2, 3, 4, 5, 6};
S325, concatenate the histogram vectors of all sub-blocks into the feature vector and normalize it, yielding a vector of dimension 6 × 6 × 6 = 216;
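The descriptor of S321–S325 can be sketched as below. The patch size J = 72 (so that 6 × 6 sub-blocks divide it evenly), the Gaussian-weighted rather than plain histogram counts, and the assumption that the patch fits inside the map are all illustrative choices.

```python
import numpy as np

def describe_feature(index_map, pt, J=72):
    """216-dimensional descriptor: a J x J patch of the maximum index map
    around feature point pt, Gaussian-weighted (sigma = J/2), split into
    6 x 6 sub-blocks, each summarised by a 6-bin value histogram."""
    y, x = pt                                    # assumes the patch fits inside
    patch = index_map[y - J // 2:y + J // 2, x - J // 2:x + J // 2]
    grid = np.mgrid[0:J, 0:J] - (J / 2 - 0.5)
    weight = np.exp(-(grid[0] ** 2 + grid[1] ** 2) / (2 * (J / 2) ** 2))
    step = J // 6
    vec = []
    for bi in range(6):
        for bj in range(6):
            sub = patch[bi * step:(bi + 1) * step, bj * step:(bj + 1) * step]
            swt = weight[bi * step:(bi + 1) * step, bj * step:(bj + 1) * step]
            vec.extend(swt[sub == i].sum() for i in range(1, 7))  # v_1..v_6
    v = np.asarray(vec)                          # 6*6 sub-blocks x 6 bins = 216
    return v / (np.linalg.norm(v) + 1e-12)       # normalisation of S325
```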
s4, registering the feature vectors of the multi-modal image feature points:
S41, calculate the normalized cross-correlation coefficient between the feature vector of each feature point of the master image and the feature vectors of all feature points of the slave image;
S42, sort the normalized cross-correlation coefficients from high to low, and pair each feature point of the master image with the slave-image feature point having the highest coefficient to form a candidate matching point pair;
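S41–S42 can be written as one matrix product over centred, normalized descriptors. A sketch (batching all descriptors into row matrices is an assumption):

```python
import numpy as np

def ncc_candidates(master_desc, slave_desc):
    """For each master feature vector (row of master_desc), pick the slave
    feature vector with the highest normalized cross-correlation coefficient.
    Returns (master_index, slave_index) candidate pairs."""
    def centred(d):
        d = d - d.mean(axis=1, keepdims=True)
        return d / (np.linalg.norm(d, axis=1, keepdims=True) + 1e-12)
    ncc = centred(master_desc) @ centred(slave_desc).T   # (N_m, N_s) matrix
    return [(i, int(np.argmax(row))) for i, row in enumerate(ncc)]
```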
and S5, further screening all candidate matching point pairs by using a RANSAC algorithm to finally obtain accurate and uniform characteristic matching point pairs which are robust to nonlinear radiation distortion.
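Step S5 can be sketched with a basic affine-model RANSAC. The patent only names the RANSAC algorithm; the affine motion model, the 3-pixel threshold, and the iteration count are assumptions for illustration.

```python
import numpy as np

def ransac_screen(src_pts, dst_pts, n_iter=500, tol=3.0, seed=0):
    """Screen candidate matching point pairs with RANSAC: repeatedly fit an
    affine transform to 3 random pairs and keep the largest consensus set."""
    rng = np.random.default_rng(seed)
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    hom = np.hstack([src, np.ones((len(src), 1))])   # homogeneous (N, 3)
    best = np.zeros(len(src), bool)
    for _ in range(n_iter):
        idx = rng.choice(len(src), size=3, replace=False)
        try:
            model = np.linalg.solve(hom[idx], dst[idx])  # 3x2 affine params
        except np.linalg.LinAlgError:
            continue                                 # degenerate (collinear) sample
        err = np.linalg.norm(hom @ model - dst, axis=1)
        inliers = err < tol
        if inliers.sum() > best.sum():
            best = inliers
    return best                                      # boolean inlier mask
```

Pairs flagged False are the mismatches eliminated by the screening; the True pairs are the final accurate, uniform feature matching point pairs.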
FIG. 4 shows a simulation of the final result of the method. In fig. 4, the left image is a panchromatic optical image from the ZY-1 (Ziyuan-1) satellite and the right image is an SAR image from the GF-3 (Gaofen-3) satellite. The optical image is set as the master image and the SAR image as the slave image, after which feature points are extracted and matched with the disclosed method. The positioning accuracy of the extracted feature points reaches 1 pixel.
The above description is only an example of the present application and is not intended to limit it. Various modifications and changes may occur to those skilled in the art; any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should fall within the scope of its claims.

Claims (5)

1. A method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion, characterized by comprising the following specific steps:
s1, preprocessing the two obtained multi-modal remote sensing images;
S2, extract the accurate corner points and edge feature points of the master and slave images respectively, and obtain the feature points of each image:
s3, generating a feature vector of the feature point;
s4, registering the feature vectors of the multi-modal image feature points;
and S5, further screening all candidate matching point pairs by using a RANSAC algorithm to finally obtain accurate and uniform characteristic matching point pairs which are robust to nonlinear radiation distortion.
2. The method for extracting robust remote sensing image feature point pairs under nonlinear radiation distortion according to claim 1, wherein step S1 comprises the following specific steps:
S11, determine whether each obtained multi-modal remote sensing image is a panchromatic (black-and-white) image; if so, go to S12; otherwise, convert it to a black-and-white image by weighted band combination;
s12, setting one image in the multi-modal remote sensing images as a main image and the other image as a slave image;
S13, resample the master and slave images respectively so that their resolutions are the same;
s14, extracting the maximum moment graph and the minimum moment graph of phase consistency of the master image and the slave image respectively;
Set the number of filter scales N_s = 4 and the number of filter orientations N_o = 6;
For each pixel (x, y) of the master image, the phase-congruency measure PC(x, y) over all scales s and directions o is computed by the formula:
PC(x, y) = Σ_o Σ_s w_o(x, y) · ⌊A_so(x, y) · ΔΦ_so(x, y) − T⌋ / (Σ_o Σ_s A_so(x, y) + ξ),
where w_o(x, y) is a weight function based on the frequency spread at pixel (x, y), T is a noise threshold, ξ is a small positive offset, and the operator ⌊·⌋ takes the value of its argument if and only if the argument is positive and takes 0 otherwise; the amplitude component A_so(x, y) and the phase-deviation function ΔΦ_so(x, y) at pixel (x, y) are computed as follows:
A_so(x, y) = √( E_so(x, y)² + O_so(x, y)² ),
ΔΦ_so(x, y) = ( E_so(x, y) · φ̄_E(x, y) + O_so(x, y) · φ̄_O(x, y) ) − | E_so(x, y) · φ̄_O(x, y) − O_so(x, y) · φ̄_E(x, y) |,
wherein the mean phase components φ̄_E(x, y) and φ̄_O(x, y) are computed as:
φ̄_E(x, y) = Σ_s E_so(x, y) / C_o(x, y),
φ̄_O(x, y) = Σ_s O_so(x, y) / C_o(x, y),
C_o(x, y) = √( (Σ_s E_so(x, y))² + (Σ_s O_so(x, y))² ) + ε,
where ε is a small constant that prevents division by zero;
wherein [E_so(x, y), O_so(x, y)] is the filter response at pixel (x, y), expressed as:
[E_so(x, y), O_so(x, y)] = [ I(x, y) * L_even(x, y, s, o), I(x, y) * L_odd(x, y, s, o) ],
where L_even(x, y, s, o) and L_odd(x, y, s, o) are respectively the even-symmetric (real) and odd-symmetric (imaginary) parts of the two-dimensional log-Gabor filter at scale s and direction o in the spatial domain, * denotes convolution, and I(x, y) is the gray value at pixel (x, y); the two-dimensional log-Gabor filter function is expressed in the log-polar frequency domain as:
L(ρ, θ, s, o) = exp( −(ρ − ρ_s)² / (2σ_ρ²) ) · exp( −(θ − θ_so)² / (2σ_θ²) ),
where (ρ, θ) are the log-polar coordinates, (s, o) are the scale and direction of the filter, ρ_s and θ_so are the corresponding center-frequency parameters, and σ_ρ and σ_θ are the bandwidths in ρ and θ respectively;
For each direction o, the maximum of PC(x, y) over the different scales of that direction is taken as the phase-congruency measure of direction o, denoted PC(θ_o), where θ_o is the angle value of direction o at which PC(x, y) attains its maximum;
For the master image and the slave image, the principal axis ψ, maximum moment M_ψ and minimum moment m_ψ of each contained pixel (x, y) are computed as follows.
First, the intermediate variables a, b and c are computed by the formulas:
a = Σ_o ( PC(θ_o) · cos θ_o )²,
b = 2 · Σ_o ( PC(θ_o) · cos θ_o ) · ( PC(θ_o) · sin θ_o ),
c = Σ_o ( PC(θ_o) · sin θ_o )²;
then the principal axis ψ, maximum moment M_ψ and minimum moment m_ψ of pixel (x, y) are computed from the intermediate variables as:
ψ = (1/2) · arctan( b / (a − c) ),
M_ψ = (1/2) · ( c + a + √( b² + (a − c)² ) ),
m_ψ = (1/2) · ( c + a − √( b² + (a − c)² ) );
the maximum moments M_ψ and minimum moments m_ψ of all pixels in an image respectively form the phase-congruency maximum-moment map and minimum-moment map of that image.
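The per-pixel moment analysis above can be sketched in a few lines. This is a minimal illustration of Kovesi's classical moment formulas, which the claim follows; the function name `pc_moments` and its input format (a list of (angle, PC) pairs, one per filter direction) are assumptions for illustration.

```python
import math

def pc_moments(pc_by_angle):
    """Given PC(theta_o) for each filter direction at one pixel, compute
    the intermediate terms a, b, c, then the principal axis psi, the
    maximum moment M and the minimum moment m."""
    a = sum((pc * math.cos(t)) ** 2 for t, pc in pc_by_angle)
    b = 2 * sum((pc * math.cos(t)) * (pc * math.sin(t)) for t, pc in pc_by_angle)
    c = sum((pc * math.sin(t)) ** 2 for t, pc in pc_by_angle)
    psi = 0.5 * math.atan2(b, a - c)        # principal axis of the moment ellipse
    root = math.sqrt(b * b + (a - c) ** 2)
    M = 0.5 * (c + a + root)                # maximum moment (edge strength)
    m = 0.5 * (c + a - root)                # minimum moment (corner strength)
    return psi, M, m
```

Applying this function at every pixel of an image, with the six directions 0° to 150°, yields exactly the maximum-moment and minimum-moment maps used in the following claims.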
3. The method for extracting robust remote-sensing image feature point pairs under nonlinear radiation distortion according to claim 1, wherein step S2 comprises the following specific steps:
S21, uniformly dividing the minimum-moment map into N × N non-overlapping rectangular image blocks, and extracting accurate corner points from each block using a multi-scale Harris algorithm:
setting the Gaussian kernel scales σ_i, i = 1, 2, 3, where σ_1 = 0.5, σ_2 = 1, σ_3 = 2;
filtering each image block with the Gaussian kernel at each scale, so that each block yields a multi-scale image set of three images; extracting candidate corner points from this set with the Harris algorithm, iteratively filtering out pseudo corner points from the smallest scale to the largest to obtain accurate corner points, and storing the Harris operator value of each corner point;
S22, sorting the Harris operator values of the accurate corner points extracted in each image block, and taking the K points with the largest Harris operator values as the feature corner points of that block, so that the final number of feature corner points is N × N × K;
S23, extracting a number of edge feature points from the maximum-moment map using the FAST algorithm, further screening M edge feature points with a high repetition rate to form an edge feature point set, and merging this set with the feature corner points to obtain the final feature points of the image.
4. The method for extracting robust remote-sensing image feature point pairs under nonlinear radiation distortion according to claim 1, wherein step S3 comprises the following specific steps:
S31, generating the maximum index maps of the master image and the slave image using log-Gabor filters;
S311, for each pixel (x, y) of the master and slave images, computing the log-Gabor convolution-layer amplitude A_o(x, y) in each of the 6 filter directions 0°, 30°, 60°, 90°, 120° and 150°, by the formula:
A_o(x, y) = Σ_{s=1}^{N_s} A_so(x, y),
where N_s is the number of filter scales and A_so(x, y) is the amplitude component at pixel (x, y) for scale s and direction o;
s312, sequentially numbering the log-Gabor convolution layer amplitudes corresponding to the 6 filtering directions to be 1-6;
s313, comparing the 6 amplitude values, and taking the maximum value corresponding number as the index value of the pixel point;
s314, the set of all pixel point index values forms a maximum index map;
s32, based on all feature points of the master-slave image, selecting a local image block of J multiplied by J pixels taking the feature point as the center in the maximum value index map, and constructing a feature vector by using a distribution histogram;
s321, selecting a local image block of J multiplied by J pixels taking a feature point as a center in the maximum index map;
s322, using a Gaussian function with the characteristic point as the center and the standard deviation equal to J/2, and taking the Gaussian function value at the corresponding position of each pixel point as the weight of the pixel point;
s323, uniformly dividing the local image block into 6 multiplied by 6 sub-blocks;
s324, a distribution histogram including 6 groups is established for each sub-block: each subblock is a part of a maximum value index map, the pixel values of the subblocks are distributed between 1 and 6, the number of points with the pixel value of i in each subblock is counted and is recorded as viAnd in viAs histogram vector V ═ V1,v2,v3,v4,v5,v6) Wherein i ∈ {1,2,3,4,5,6 };
and S325, sequentially connecting the histogram vectors of all the sub-blocks to obtain a feature vector, and performing normalization processing on the feature vector to finally obtain a vector with the dimension of 6 multiplied by 6 being 216.
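The index-map and histogram construction of S31 and S324 can be sketched as follows. A minimal illustration assuming the per-direction amplitudes A_o have already been computed and are supplied as six equally sized 2-D arrays; the function names are illustrative.

```python
def max_index_map(amps):
    """S31 sketch: `amps` is a list of 6 equally sized 2-D amplitude arrays
    (one per log-Gabor direction, summed over scales); each pixel of the
    index map stores the 1-based number of the direction whose amplitude
    is largest at that pixel."""
    h, w = len(amps[0]), len(amps[0][0])
    return [[max(range(6), key=lambda o: amps[o][y][x]) + 1
             for x in range(w)] for y in range(h)]

def subblock_histogram(index_block):
    """S324 sketch: 6-bin histogram v_i = number of pixels in the sub-block
    whose index value equals i (values lie between 1 and 6)."""
    hist = [0] * 6
    for row in index_block:
        for v in row:
            hist[v - 1] += 1
    return hist
```

Concatenating the 36 sub-block histograms of a J × J patch and normalizing yields the 216-dimensional descriptor described in S325.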
5. The method for extracting robust remote-sensing image feature point pairs under nonlinear radiation distortion according to claim 1, wherein step S4 comprises the following specific steps:
S41, computing the normalized cross-correlation coefficient between the feature vector of each feature point of the master image and the feature vectors of all feature points of the slave image;
S42, sorting the normalized cross-correlation coefficients from high to low, and taking the slave-image point with the highest coefficient together with the master-image feature point to form a candidate matching point pair.
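The matching of S41-S42 can be sketched as follows; a minimal illustration in which `ncc` and `match` are illustrative names, and the descriptors are plain Python lists standing in for the 216-dimensional vectors of claim 4.

```python
import math

def ncc(u, v):
    """Normalized cross-correlation between two descriptor vectors (S41)."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    du = [a - mu for a in u]
    dv = [b - mv for b in v]
    denom = math.sqrt(sum(a * a for a in du)) * math.sqrt(sum(b * b for b in dv))
    return sum(a * b for a, b in zip(du, dv)) / denom if denom else 0.0

def match(master_descs, slave_descs):
    """S42 sketch: pair each master descriptor with the slave descriptor
    of highest NCC, producing one candidate pair per master point."""
    return [(i, max(range(len(slave_descs)),
                    key=lambda j: ncc(d, slave_descs[j])))
            for i, d in enumerate(master_descs)]
```

These candidate pairs are exactly the input that the RANSAC screening of step S5 then prunes to the final, geometrically consistent matches.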
CN202011564394.5A 2020-12-25 2020-12-25 Method for extracting characteristic point pairs of robust remote sensing image facing to nonlinear radiation distortion Pending CN112634335A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011564394.5A CN112634335A (en) 2020-12-25 2020-12-25 Method for extracting characteristic point pairs of robust remote sensing image facing to nonlinear radiation distortion

Publications (1)

Publication Number Publication Date
CN112634335A true CN112634335A (en) 2021-04-09

Family

ID=75324939


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113066015A (en) * 2021-05-14 2021-07-02 清华大学 Multi-mode remote sensing image rotation difference correction method based on neural network
CN113256653A (en) * 2021-05-25 2021-08-13 南京信息工程大学 High-rise ground object-oriented heterogeneous high-resolution remote sensing image registration method
CN113409369A (en) * 2021-05-25 2021-09-17 西安电子科技大学 Multi-mode remote sensing image registration method based on improved RIFT
CN113643334A (en) * 2021-07-09 2021-11-12 西安电子科技大学 Different-source remote sensing image registration method based on structural similarity
CN114216454A (en) * 2021-10-27 2022-03-22 湖北航天飞行器研究所 Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS rejection environment
CN115205558A (en) * 2022-08-16 2022-10-18 中国测绘科学研究院 Multi-mode image matching method and device with rotation and scale invariance

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105261014A (en) * 2015-09-30 2016-01-20 西南交通大学 Multi-sensor remote sensing image matching method
US9245201B1 (en) * 2013-03-15 2016-01-26 Excelis Inc. Method and system for automatic registration of images
CN107563438A (en) * 2017-08-31 2018-01-09 西南交通大学 The multi-modal Remote Sensing Images Matching Method and system of a kind of fast robust
CN108198157A (en) * 2017-12-22 2018-06-22 湖南源信光电科技股份有限公司 Heterologous image interfusion method based on well-marked target extracted region and NSST
CN111985502A (en) * 2020-08-03 2020-11-24 武汉大学 Multi-mode image feature matching method with scale invariance and rotation invariance


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
PETER KOVESI: "Image Features from Phase Congruency", Videre: A Journal of Computer Vision Research *
PETER KOVESI: "Phase Congruency Detects Corners and Edges", Proc. VIIth Digital Image Computing: Techniques and Applications *
LI XIN et al.: "Multi-source Remote Sensing Image Matching Using Directional Phase Features", Geomatics and Information Science of Wuhan University *
ZHONG PEIKE et al.: "Inter-band Feature Matching of Infrared Multispectral Images Based on Phase Congruency", Geography and Geo-Information Science *


Similar Documents

Publication Publication Date Title
CN112634335A (en) Method for extracting characteristic point pairs of robust remote sensing image facing to nonlinear radiation distortion
CN111028277B (en) SAR and optical remote sensing image registration method based on pseudo-twin convolution neural network
WO2019042232A1 (en) Fast and robust multimodal remote sensing image matching method and system
CN109509164B (en) Multi-sensor image fusion method and system based on GDGF
CN106447601B (en) Unmanned aerial vehicle remote sensing image splicing method based on projection-similarity transformation
CN109919960B (en) Image continuous edge detection method based on multi-scale Gabor filter
CN103839265A (en) SAR image registration method based on SIFT and normalized mutual information
CN112254656B (en) Stereoscopic vision three-dimensional displacement measurement method based on structural surface point characteristics
CN110969669B (en) Visible light and infrared camera combined calibration method based on mutual information registration
CN107909018B (en) Stable multi-mode remote sensing image matching method and system
Zhang et al. Application of migration image registration algorithm based on improved SURF in remote sensing image mosaic
CN107481274A (en) A kind of three-dimensional makees the robustness reconstructing method of object point cloud
CN108765476A (en) A kind of polarization image method for registering
CN110222661B (en) Feature extraction method for moving target identification and tracking
CN110458876B (en) Multi-temporal POLSAR image registration method based on SAR-SIFT features
CN110009745B (en) Method for extracting plane from point cloud according to plane element and model drive
CN112308873A (en) Edge detection method for multi-scale Gabor wavelet PCA fusion image
CN116664892A (en) Multi-temporal remote sensing image registration method based on cross attention and deformable convolution
Huang et al. SAR and optical images registration using shape context
Kang et al. Image registration based on harris corner and mutual information
Paffenholz et al. Geo-referencing point clouds with transformational and positional uncertainties
Zhang et al. LPPCO: A novel multimodal medical image registration using new feature descriptor based on the local phase and phase congruency of different orientations
CN106355576A (en) SAR image registration method based on MRF image segmentation algorithm
CN113066015B (en) Multi-mode remote sensing image rotation difference correction method based on neural network
CN113674407B (en) Three-dimensional terrain reconstruction method, device and storage medium based on binocular vision image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210409)