CN115775324B - Phase correlation image matching method under guidance of cross scale filtering - Google Patents

Phase correlation image matching method under guidance of cross scale filtering

Info

Publication number
CN115775324B
Authority
CN
China
Prior art keywords
image
scale
images
matching
phase
Prior art date
Legal status
Active
Application number
CN202211603863.9A
Other languages
Chinese (zh)
Other versions
CN115775324A (en)
Inventor
程翔
周伟
张永军
Current Assignee
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU
Priority to CN202211603863.9A
Publication of CN115775324A
Application granted
Publication of CN115775324B

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a phase correlation image matching method under the guidance of cross-scale filtering, which comprises the following steps: Log-Gabor filtering is used to simulate the receptive-field responses of the visual cortex and extract the stable structural information in the images, and image matching is then achieved by computing the change relation between the phases of the structural images through extended phase correlation. By extracting the stable structural information of the images with Log-Gabor filtering, the invention weakens the influence of radiometric changes on matching; by taking extended phase correlation as the basic means of computing the changes between images, it effectively handles the radiometric differences, scale differences and rotation changes between the images to be matched, realizes overall matching between images with rotation, scale difference and relative translation, and improves image matching accuracy. The method can be used in conventional scene-matching aided navigation based on matching of downward-looking images or downward-looking corrected images.

Description

Phase correlation image matching method under guidance of cross scale filtering
Technical Field
The invention relates to the technical field of image processing, in particular to a phase correlation image matching method under the guidance of cross scale filtering.
Background
Unmanned aerial vehicles (UAVs) have the advantages of small size, light weight, high flexibility, strong concealment, low cost and no risk to operating personnel, and are widely used in civil and military fields such as disaster monitoring, geological exploration, surveying and mapping, military reconnaissance, target strike and battlefield situation monitoring. Reliable navigation and positioning is of great significance for the safe application of UAVs. In particular, when GNSS fails during long flights, an auxiliary means of weakening the influence of accumulated inertial navigation system errors on positioning accuracy is needed, especially in working environments such as long-duration operations that are difficult to reach by manual or remote control; such a means is a key guarantee for a UAV to improve its survivability and complete its tasks. With the rapid development of vision sensor and computer technology, scene-matching aided navigation has attracted increasing attention in the navigation field.
At present, scene-matching aided navigation inspired by visual perception has achieved certain research and application results, but, affected by factors such as complex natural environments and flight state, its positioning performance still lags far behind that of the human visual perception system. Image matching is the core technology of scene-matching aided navigation: images captured in real time during UAV flight are matched against geo-referenced reference images prepared in advance, and the spatial plane position of the UAV at the imaging moment is then computed from the matching and positioning result of the real-time image, thereby achieving UAV positioning. Considering the real-time requirement of navigation and positioning, scene matching places high demands on algorithm efficiency. In addition, because the real-time image is affected by factors such as acquisition time and the UAV's flight state during imaging, there are radiometric, rotation and scale differences of varying degrees between the real-time image and the pre-prepared reference image, and even a certain degree of local ground-object change; these easily lead to mismatches and inaccurate matching results, which is detrimental to improving navigation accuracy and robustness.
Disclosure of Invention
In view of the above problems, the invention aims to provide a phase correlation image matching method under the guidance of cross-scale filtering, which solves the problem that existing image matching methods cannot effectively handle the radiometric differences, scale differences and rotation changes between the images to be matched and are prone to mismatches.
In order to achieve the above purpose, the invention is realized by the following technical scheme: a phase correlation image matching method under the guidance of cross-scale filtering comprises the following steps:
step one: geometrically correct the real-time image into a temporary image according to the attitude data available in the conventional navigation environment, take the real-time image as the image to be registered and the waypoint image as the reference image, and combine the reference image and the image to be registered as the image input data;
step two: first perform Fourier transform processing on the reference image and the temporary image and calculate the change relation between the images using Log-Gabor filtering; then, taking phase correlation as the basic matching model, convert the scale and rotation changes between the images into translation parameters that can be solved by phase correlation in a log-polar coordinate system;
step three: while obtaining the translation parameters, perform Fourier transform on the image input data to convert it from the spatial domain to the frequency domain, simulate the receptive-field responses of the visual cortex with Log-Gabor filtering, and extract the stable structural information in the image input data;
step four: based on the extracted structural information, accurately calculate the relative translation between the reference image and the image to be registered by combining phase correlation and extended phase correlation; then perform two-step phase correlation successively in the log-polar coordinate system and the Cartesian coordinate system, and respectively solve the rotation parameter and the scale parameter between the image to be registered and the reference image;
step five: calculate the geometric transformation relation between the image to be registered and the reference image from the solved rotation, scale and translation parameters, construct multi-scale image pyramids of the two images based on this geometric transformation relation, and finally achieve image matching by solving the global transformation parameters between the two images.
A further improvement is that: in the first step, the attitude data in the conventional navigation environment is provided by downward-looking camera imaging or by an inertial navigation system, and the temporary image is a downward-looking image.
A further improvement is that: in the first step, when the waypoint image is used as the reference, control points are extracted through feature matching between the real-time image and the waypoint image; when there is no valid waypoint information within the coverage of the real-time image, the previous frame image is used as the reference for feature matching, and the tie points obtained by matching are converted into control points using the positioning parameters of the previous frame image.
A further improvement is that: in the second step, the specific steps of calculating the change relation between the images are: simulate different hypercolumn responses by means of Log-Gabor filtering, construct hypercolumn vectors, and convert the local images into a bionic visual-cell coordinate system; then, taking the consistency of the hypercolumn vectors of the same target in different images as the basic criterion, calculate the change relation between the images according to an affine transformation model.
A further improvement is that: in the second step, in the Fourier transform processing, the image signal S(x, y) is Fourier transformed to obtain a complex spectrum:
F(ω_x, ω_y) = R(ω_x, ω_y) + iI(ω_x, ω_y)
where (ω_x, ω_y) are the frequency-domain coordinates, R is the real part and I is the imaginary part; expressed in exponential form as a combination of amplitude and phase, this becomes
F(ω_x, ω_y) = |F(ω_x, ω_y)| e^{iφ(ω_x, ω_y)}
where |F(ω_x, ω_y)| denotes the amplitude at frequency (ω_x, ω_y) and φ(ω_x, ω_y) denotes the phase at (ω_x, ω_y).
A further improvement is that: in the third step, when the structural information is extracted, the multi-scale structural features of the two images are extracted separately using cross-scale Log-Gabor filtering, enlarging the scale overlap rate between the images and thereby enhancing the robustness of the extended phase correlation to scale differences.
A further improvement is that: in the fourth step, before the two-step phase correlation is performed successively in the log-polar coordinate system and the Cartesian coordinate system, the relative rotation angle and the scale difference between the images are effectively calculated through coordinate-system conversion, realizing overall matching between images with rotation, scale difference and relative translation.
A further improvement is that: in the fifth step, when the image pyramids are constructed, the images are resampled in both directions simultaneously, enlarging the scale overlap range between the two image pyramids.
The beneficial effects of the invention are as follows: the invention fully mines the stable structural information in the images through Log-Gabor filtering, which has good bionic properties, and thus weakens the influence of radiometric changes on matching. Inspired by the basic principle that a biological visual system achieves target matching by comparing the geometric changes of the same target in different scenes, the relative offset between translated images can be accurately calculated through joint phase correlation, and the relative rotation angle and scale difference between images can be effectively calculated through extended phase correlation with coordinate-system conversion. The method therefore effectively meets the challenges of radiometric differences, scale differences and rotation changes between the images to be matched, realizes overall matching between images with rotation, scale difference and relative translation, improves image matching accuracy, and can be used in conventional scene-matching aided navigation based on matching of downward-looking images or downward-looking corrected images.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the invention, and other drawings can be obtained from them by a person skilled in the art without creative effort.
Fig. 1 is a flow chart of an image matching method of the present invention.
Detailed Description
The embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of the present invention.
Referring to fig. 1, the present embodiment provides a phase correlation image matching method under the guidance of cross-scale filtering, comprising the following steps:
step one: in the case where attitude data is provided by downward-looking camera imaging or by an inertial navigation system in the conventional navigation environment, the real-time image is geometrically corrected into a downward-looking image, which is taken as the temporary image; the real-time image is taken as the image to be registered, the waypoint image as the reference image, and the reference image and the image to be registered are combined as the image input data. When the waypoint image is used as the reference, control points are extracted through feature matching between the real-time image and the waypoint image; when there is no valid waypoint information within the coverage of the real-time image, the previous frame image is used as the reference for feature matching, and the tie points obtained by matching are converted into control points using the positioning parameters of the previous frame image.
step two: first, Fourier transform processing is performed on the reference image and the temporary image, and the change relation between the images is calculated using Log-Gabor filtering; then, taking phase correlation as the basic matching model, the scale and rotation changes between the images are converted into translation parameters that can be solved by phase correlation in a log-polar coordinate system.
the specific steps for calculating the change relation between the images are as follows: simulating different super-column responses by means of Log-Gabor filtering, constructing super-column vectors, converting the local images into a bionic visual cell coordinate system, further taking the consistency of the super-column vectors of the same target on different images as a basic criterion, and calculating the change relation between the images according to an affine transformation model;
In the Fourier transform processing, the image signal S(x, y) is Fourier transformed to obtain a complex spectrum:
F(ω_x, ω_y) = R(ω_x, ω_y) + iI(ω_x, ω_y)
where (ω_x, ω_y) are the frequency-domain coordinates, R is the real part and I is the imaginary part; expressed in exponential form as a combination of amplitude and phase, this becomes
F(ω_x, ω_y) = |F(ω_x, ω_y)| e^{iφ(ω_x, ω_y)}
where |F(ω_x, ω_y)| denotes the amplitude at frequency (ω_x, ω_y) and φ(ω_x, ω_y) denotes the phase at (ω_x, ω_y).
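For reference, the amplitude/phase decomposition above can be computed directly with NumPy; this is only a sketch of the standard operation, not code from the patent.

```python
import numpy as np

def amplitude_and_phase(image: np.ndarray):
    """Decompose the 2-D spectrum F = R + iI into amplitude |F| and phase phi."""
    spectrum = np.fft.fft2(image)      # complex spectrum R + iI
    amplitude = np.abs(spectrum)       # |F(w_x, w_y)|
    phase = np.angle(spectrum)         # phi(w_x, w_y)
    return amplitude, phase
```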
step three: while the translation parameters are being obtained, Fourier transform is performed on the image input data to convert it from the spatial domain to the frequency domain, the receptive-field responses of the visual cortex are simulated with Log-Gabor filtering, and stable structural information is extracted from the image input data. When extracting the structural information, the multi-scale structural features of the two images are extracted separately using cross-scale Log-Gabor filtering, which increases the scale overlap rate between the images and enhances the robustness of the extended phase correlation to scale differences.
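A minimal sketch of such a cross-scale Log-Gabor filter bank is given below, assuming a purely radial (isotropic) filter applied in the frequency domain; the centre-frequency set, scale multiplier and bandwidth ratio are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def log_gabor_bank(image: np.ndarray, f0: float = 0.1, mult: float = 2.0,
                   scales=(-2, -1, 0, 1, 2), sigma_ratio: float = 0.55) -> np.ndarray:
    """Sum of cross-scale Log-Gabor responses, used here as a structural image."""
    rows, cols = image.shape
    u = np.fft.fftfreq(cols)[None, :]
    v = np.fft.fftfreq(rows)[:, None]
    radius = np.sqrt(u ** 2 + v ** 2)
    radius[0, 0] = 1.0                                # avoid log(0) at DC
    spectrum = np.fft.fft2(image)
    structural = np.zeros_like(image, dtype=float)
    for k in scales:
        fc = f0 * (mult ** k)                         # centre frequency of scale k
        lg = np.exp(-(np.log(radius / fc) ** 2) /
                    (2 * np.log(sigma_ratio) ** 2))
        lg[0, 0] = 0.0                                # suppress the DC component
        structural += np.real(np.fft.ifft2(spectrum * lg))
    return structural
```

Filtering both images with the same centre-frequency set spreads their responses over overlapping scales, which is what the cross-scale design relies on.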
step four: based on the extracted structural information, the relative translation between the reference image and the image to be registered is accurately calculated by means of phase correlation, while the relative rotation angle and scale difference between the images are effectively calculated through extended phase correlation with coordinate-system conversion, realizing overall matching between images with rotation, scale difference and relative translation; two-step phase correlation is then performed successively in the log-polar coordinate system and the Cartesian coordinate system, and the rotation parameter and the scale parameter between the image to be registered and the reference image are solved respectively.
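The two-step phase correlation can be sketched with the generic Fourier–Mellin pattern shown below: rotation and scale appear as a shift of the log-polar magnitude spectra, and translation is then recovered by ordinary phase correlation. The helper functions and the use of OpenCV's logPolar are implementation assumptions, not the patent's own routines.

```python
import numpy as np
import cv2

def phase_correlation_peak(a: np.ndarray, b: np.ndarray):
    """Return the (dy, dx) shift of b relative to a and the correlation peak."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12                    # keep the phase term only
    surface = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(surface), surface.shape)
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, surface.shape)]
    return shift, float(surface.max())

def rotation_and_scale(struct1: np.ndarray, struct2: np.ndarray):
    """Estimate the rotation angle (degrees) and scale factor between two images."""
    rows, cols = struct1.shape
    mag1 = np.abs(np.fft.fftshift(np.fft.fft2(struct1)))
    mag2 = np.abs(np.fft.fftshift(np.fft.fft2(struct2)))
    m = cols / np.log(cols / 2.0)                     # log-polar radial scaling
    lp1 = cv2.logPolar(mag1.astype(np.float32), (cols / 2, rows / 2), m, cv2.INTER_LINEAR)
    lp2 = cv2.logPolar(mag2.astype(np.float32), (cols / 2, rows / 2), m, cv2.INTER_LINEAR)
    (d_theta, d_logr), _ = phase_correlation_peak(lp1, lp2)
    angle = 360.0 * d_theta / rows                    # the row axis spans 0..360 degrees
    scale = np.exp(d_logr / m)
    return angle, scale
```

Once the angle and scale are known, one image is geometrically corrected and a second, Cartesian-domain phase correlation yields the translation, matching the two-step scheme described in this step.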
step five: the geometric transformation relation between the image to be registered and the reference image is calculated from the solved rotation, scale and translation parameters; multi-scale image pyramids of the two images are constructed based on this geometric transformation relation, with the images resampled in both directions to enlarge the scale overlap range between the two pyramids; finally, image matching is achieved by solving the global transformation parameters between the two images.
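A two-direction pyramid as described in step five might look like the sketch below, which resamples the image both above and below its original resolution; the level range and the scale step are illustrative assumptions.

```python
import cv2

def two_direction_pyramid(image, levels=(-2, -1, 0, 1, 2), step=1.25):
    """Return {level: resampled image}; level 0 is the original resolution."""
    pyramid = {}
    for k in levels:
        s = step ** k                                 # s > 1 upsamples, s < 1 downsamples
        pyramid[k] = cv2.resize(image, None, fx=s, fy=s,
                                interpolation=cv2.INTER_LINEAR)
    return pyramid
```

Because both pyramids contain levels finer and coarser than their originals, their scale ranges overlap even when the two images differ noticeably in ground resolution.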
The processing procedure of the phase correlation image matching algorithm based on cross-scale Log-Gabor filtering (CSLGPC) is as follows:
Input: reference image S_1, real-time image S_2
Output: geographic coordinates X, Y of the center point of the real-time image S_2
1. Set a group of center frequencies for the different scales, {..., f_0(-2), f_0(-1), f_0(0), f_0(1), f_0(2), ...}, and apply Log-Gabor filtering to S_1 and S_2 respectively to obtain the overall filtered images LG_1 and LG_2.
2. Perform the extended phase correlation calculation on LG_1 and LG_2 and determine whether the phase correlation succeeds.
3. No: { matching and positioning fail; end. }
4. Yes:
{
5. From the correlation peak, calculate the rotation angle θ and the scale factor f between LG_1 and LG_2.
6. Geometrically correct LG_2 according to θ and f to generate LG_2'.
7. Perform phase correlation on LG_1 and LG_2'; if the correlation succeeds, obtain the translation parameters (d_x, d_y); if it fails, matching and positioning fail and the procedure ends.
8. From the geometric transformation parameters θ, f and (d_x, d_y), calculate the geographic coordinates X, Y of the center point of the real-time image S_2; matching and positioning succeed; end.
}
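Tying the steps of the pseudocode together, a top-level driver might look like the following sketch, which reuses the helper functions sketched earlier (log_gabor_bank, rotation_and_scale, phase_correlation_peak); the success test on the correlation peak and the mapping from pixel offset to geographic coordinates are assumptions, since the patent does not specify them here.

```python
import numpy as np
import cv2

def cslgpc_match(reference: np.ndarray, realtime: np.ndarray, peak_threshold: float = 0.05):
    """Sketch of the CSLGPC flow: cross-scale filtering, extended phase correlation,
    geometric correction, and final translation estimation."""
    lg1 = log_gabor_bank(reference)                   # step 1: filter both images
    lg2 = log_gabor_bank(realtime)
    theta, f = rotation_and_scale(lg1, lg2)           # steps 2-5: extended phase correlation
    rows, cols = lg2.shape
    M = cv2.getRotationMatrix2D((cols / 2, rows / 2), theta, f)
    lg2_corrected = cv2.warpAffine(lg2.astype(np.float32), M, (cols, rows))  # step 6
    (dy, dx), peak = phase_correlation_peak(lg1, lg2_corrected)              # step 7
    if peak < peak_threshold:                         # assumed success criterion
        return None                                   # matching and positioning failed
    # Step 8 would convert (dx, dy) at the matched position into geographic
    # coordinates X, Y using the reference image's georeference (not shown).
    return {"rotation_deg": theta, "scale": f, "shift_px": (dx, dy)}
```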
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (8)

1. A phase correlation image matching method under the guidance of cross-scale filtering, characterized by comprising the following steps:
step one: geometrically correcting the real-time image into a temporary image according to the attitude data in the conventional navigation environment, taking the real-time image as the image to be registered and the waypoint image as the reference image, and combining the reference image and the image to be registered as the image input data;
step two: firstly, carrying out Fourier transform processing on the reference image and the temporary image and calculating the change relation between the images using Log-Gabor filtering; then, taking phase correlation as the basic matching model, converting the scale and rotation changes between the images into translation parameters that can be solved by phase correlation in a log-polar coordinate system;
step three: performing Fourier transform on the image input data while acquiring the translation parameters, converting the image input data from the spatial domain to the frequency domain, simulating the receptive-field responses of the visual cortex with Log-Gabor filtering, and extracting stable structural information from the image input data;
step four: based on the extracted structural information, accurately calculating the relative translation between the reference image and the image to be registered by combining phase correlation and extended phase correlation, sequentially performing two-step phase correlation in the log-polar coordinate system and the Cartesian coordinate system, and respectively solving the rotation parameter and the scale parameter between the image to be registered and the reference image;
step five: calculating the geometric transformation relation between the image to be registered and the reference image from the solved rotation, scale and translation parameters, constructing multi-scale image pyramids of the two images based on the geometric transformation relation, and finally achieving image matching by solving the global transformation parameters between the two images.
2. The method for phase-correlated image matching under cross-scale filtering guidance of claim 1, wherein: in the first step, the attitude data in the conventional navigation environment is provided by downward-looking camera imaging or by an inertial navigation system, and the temporary image is a downward-looking image.
3. The method for phase-correlated image matching under cross-scale filtering guidance of claim 1, wherein: in the first step, when the waypoint image is used as the reference, control points are extracted through feature matching between the real-time image and the waypoint image; when there is no valid waypoint information within the coverage of the real-time image, the previous frame image is used as the reference for feature matching, and the tie points obtained by matching are converted into control points using the positioning parameters of the previous frame image.
4. The method for phase-correlated image matching under cross-scale filtering guidance of claim 1, wherein: in the second step, the specific steps of calculating the change relation between the images are: simulating different hypercolumn responses by means of Log-Gabor filtering, constructing hypercolumn vectors, and converting the local images into a bionic visual-cell coordinate system; then, taking the consistency of the hypercolumn vectors of the same target in different images as the basic criterion, calculating the change relation between the images according to an affine transformation model.
5. The method for phase-correlated image matching under cross-scale filtering guidance of claim 1, wherein: in the second step, in the Fourier transform processing, the image signal S(x, y) is Fourier transformed to obtain a complex spectrum:
F(ω_x, ω_y) = R(ω_x, ω_y) + iI(ω_x, ω_y)
where (ω_x, ω_y) are the frequency-domain coordinates, R is the real part and I is the imaginary part; expressed in exponential form as a combination of amplitude and phase, this becomes
F(ω_x, ω_y) = |F(ω_x, ω_y)| e^{iφ(ω_x, ω_y)}
where |F(ω_x, ω_y)| denotes the amplitude at frequency (ω_x, ω_y) and φ(ω_x, ω_y) denotes the phase at (ω_x, ω_y).
6. The method for phase-correlated image matching under cross-scale filtering guidance of claim 1, wherein: in the third step, when the structural information is extracted, the multi-scale structural features of the two images are extracted separately using cross-scale Log-Gabor filtering, enlarging the scale overlap rate between the images and thereby enhancing the robustness of the extended phase correlation to scale differences.
7. The method for phase-correlated image matching under cross-scale filtering guidance of claim 1, wherein: in the fourth step, before the two-step phase correlation is performed successively in the log-polar coordinate system and the Cartesian coordinate system, the relative rotation angle and the scale difference between the images are effectively calculated through coordinate-system conversion, realizing overall matching between images with rotation, scale difference and relative translation.
8. The method for phase-correlated image matching under cross-scale filtering guidance of claim 1, wherein: in the fifth step, when the image pyramids are constructed, the images are resampled in both directions simultaneously, enlarging the scale overlap range between the two image pyramids.
CN202211603863.9A 2022-12-13 2022-12-13 Phase correlation image matching method under guidance of cross scale filtering Active CN115775324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211603863.9A CN115775324B (en) 2022-12-13 2022-12-13 Phase correlation image matching method under guidance of cross scale filtering


Publications (2)

Publication Number Publication Date
CN115775324A CN115775324A (en) 2023-03-10
CN115775324B 2024-01-02

Family

ID=85392197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211603863.9A Active CN115775324B (en) 2022-12-13 2022-12-13 Phase correlation image matching method under guidance of cross scale filtering

Country Status (1)

Country Link
CN (1) CN115775324B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10445616B2 (en) * 2015-01-22 2019-10-15 Bae Systems Information And Electronic Systems Integration Inc. Enhanced phase correlation for image registration
CN113168712A (en) * 2018-09-18 2021-07-23 近图澳大利亚股份有限公司 System and method for selecting complementary images from multiple images for 3D geometry extraction

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111462198A (en) * 2020-03-10 2020-07-28 西南交通大学 Multi-mode image registration method with scale, rotation and radiation invariance
CN112233225A (en) * 2020-10-14 2021-01-15 中国科学技术大学 Three-dimensional reconstruction method and system for translational motion object based on phase correlation matching
CN113223066A (en) * 2021-04-13 2021-08-06 浙江大学 Multi-source remote sensing image matching method and device based on characteristic point fine tuning
CN113552585A (en) * 2021-07-14 2021-10-26 浙江大学 Mobile robot positioning method based on satellite map and laser radar information
CN113763274A (en) * 2021-09-08 2021-12-07 湖北工业大学 Multi-source image matching method combining local phase sharpness orientation description

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Automatic Example-Based Image Colorization Using Location-Aware Cross-Scale Matching; Bo Li et al.; IEEE Transactions on Image Processing; vol. 28, no. 9; pp. 4606-4619 *
Image Matching Using Phase Congruency and Log-Gabor Filters in the SAR Images and Visible Images; Xiaomin Liu et al.; ICGEC 2019: Genetic and Evolutionary Computing; pp. 270-278 *
Automatic registration of airborne LiDAR data and aerial images in urban areas; Zhang Yongjun et al.; Journal of Remote Sensing; vol. 16, no. 3; pp. 579-595 *
Research on automatic matching of multi-source optical satellite images based on multiple constraints; Ling Xiao; China Doctoral Dissertations Full-text Database, Basic Sciences; A008-36 *
Research on key issues of robust feature matching for remote sensing images; Li Jiayuan; China Doctoral Dissertations Full-text Database, Engineering Science and Technology II; C028-15 *

Also Published As

Publication number Publication date
CN115775324A (en) 2023-03-10

Similar Documents

Publication Publication Date Title
Peng et al. Pose measurement and motion estimation of space non-cooperative targets based on laser radar and stereo-vision fusion
CN102435188B (en) Monocular vision/inertia autonomous navigation method for indoor environment
CN102654576B (en) Image registration method based on synthetic aperture radar (SAR) image and digital elevation model (DEM) data
CN104833354A (en) Multibasic multi-module network integration indoor personnel navigation positioning system and implementation method thereof
CN106052674A (en) Indoor robot SLAM method and system
CN110686677A (en) Global positioning method based on geometric information
CN103093459B (en) Utilize the method that airborne LiDAR point cloud data assisted image mates
CN109724586B (en) Spacecraft relative pose measurement method integrating depth map and point cloud
CN105242252A (en) Downward trendline bunching SAR radar positioning method based on imaging matching
CN110160503B (en) Unmanned aerial vehicle landscape matching positioning method considering elevation
CN117541655B (en) Method for eliminating radar map building z-axis accumulated error by fusion of visual semantics
CN105389819A (en) Robust semi-calibrating down-looking image epipolar rectification method and system
CN106353756A (en) Descending track spotlight SAR (synthetic aperture radar) positioning method based on image matching
CN115775324B (en) Phase correlation image matching method under guidance of cross scale filtering
CN108921896A (en) A kind of lower view vision compass merging dotted line feature
CN112580683A (en) Multi-sensor data time alignment system and method based on cross correlation
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
CN105550667A (en) Stereo camera based framework information action feature extraction method
CN113900517B (en) Route navigation method and device, electronic equipment and computer readable medium
CN115574816A (en) Bionic vision multi-source information intelligent perception unmanned platform
Kupervasser et al. Robust positioning of drones for land use monitoring in strong terrain relief using vision-based navigation
Madison et al. Target geolocation from a small unmanned aircraft system
Liu et al. Research on NDT-based positioning for autonomous driving
Kong et al. Feature based navigation for UAVs
Fuse et al. Development of a self-localization method using sensors on mobile devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant