CN109493374B - Sequential matching method based on multi-frame radar images - Google Patents
Sequential matching method based on multi-frame radar images
- Publication number
- CN109493374B (application CN201811351734.9A)
- Authority
- CN
- China
- Prior art keywords
- matching
- sequential
- radar
- time
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/337—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
Abstract
The invention discloses a sequential matching method based on multi-frame radar images, which comprises the following steps: building a sequential matching system based on multi-frame radar images; generating a radar simulated real-time image with the radar real-time image simulation module of the system; matching the simulated real-time image, completing sub-area image matching, and selecting the N largest local correlation peaks, where N ≥ 3; and finding the effectively matched images and their maximum-correlation positions with the sequential decision module of the system. The method solves the problem that matching errors caused by inertial navigation velocity error cannot be corrected during radar image matching, and realizes both estimation of the inertial navigation velocity error and correction of its influence.
Description
Technical Field
The invention relates to a radar image matching method, in particular to a sequential matching method based on multi-frame radar images.
Background
Radar scene matching technology is widely applied in aircraft navigation and missile guidance because it is little affected by weather and offers high accuracy. Owing to coherent speckle, however, radar images have a low signal-to-noise ratio and the matching probability is not high. For optical images, the probability that the correct matching position lies among the three largest local correlation peaks is generally taken to be 93.7%; radar images fall far below this figure, so the correct matching position is difficult to obtain from correlation-peak magnitude alone. For multi-frame radar images, inertial navigation information has been introduced as an independent source of information, but the influence of the correlation peaks has not been considered in depth.
Disclosure of Invention
In view of these technical problems, the invention aims to provide a sequential matching method based on multi-frame radar images that solves the problems of the low matching probability of a single radar image and the inability to identify images carrying no effective matching information.
A sequential matching method based on multi-frame radar images comprises the following steps:
S1, building a sequential matching system based on the multi-frame radar images;
S2, generating a radar simulated real-time image;
S3, matching the radar simulated real-time image, completing sub-area image matching and selecting the N largest local correlation peaks, where N ≥ 3;
S4, finding the effectively matched images and their maximum-correlation positions.
Further, the sequential matching system comprises a radar real-time image simulation module, an image matching module and a sequential decision module.
Further, in step S2, the radar real-time image simulation module generates echoes from the input radar satellite image, elevation data and track data, and generates the radar simulated real-time image by a synthetic aperture imaging method.
Further, step S3 is executed by the image matching module. A correlation surface is obtained after matching the radar simulated real-time image; if the correlation coefficient of a point is the maximum within a 5 pixel × 5 pixel window centred on that point, the point is taken as a local correlation peak.
Further, in step S3, N is set to 6.
Further, step S4 is executed by the sequential decision module. Sequential decision starts after 10 radar simulated real-time image matching results have been obtained in succession; after the 11th matching result is obtained, the decision uses the 10 results of the 2nd to 11th matches, and so on.
Further, the imaging and matching of 10 consecutive radar simulated real-time images must be completed within 1 minute.
Further, the sequential decision is based on the inertial navigation displacement and the magnitudes of the correlation peaks.
Further, the specific process of the sequential decision is as follows: a local correlation peak i is selected from one matching result and compared with all local correlation peaks of the other 9 matches;
for each of the other 9 matches, its local correlation peaks are compared with peak i in descending order of peak value;
if the difference between the positions of peak i and a local correlation peak of a given match agrees with the inertial navigation displacement within a preset threshold, the distance weight of peak i is increased by 1, that local peak value is multiplied into the correlation-coefficient weight of peak i, and the comparison with that match is finished.
Further, after all N local correlation peaks of the 10 matches have been compared, the sequential decision is made:
for the N local correlation peaks of the 10 matches, giving 10 × N comparison results, the maximum distance weight is selected as the distance threshold;
among the N comparison results of each match, the correlation-coefficient weights of the local correlation peaks meeting the distance threshold are compared, and the position with the larger correlation-coefficient weight is the correct matching position of that match;
if none of the N comparison results of a match contains a local correlation peak meeting the distance threshold, the scene is not effectively matched in the corresponding area of the radar image;
all effective matching results are propagated to the position of the last imaging moment according to the inertial navigation displacement, and their average is taken as the current matching result.
The method solves the problem that matching errors caused by inertial navigation velocity error cannot be corrected during radar image matching, and realizes estimation of the inertial navigation velocity error and correction of its influence.
Drawings
Fig. 1 is a flow chart of a sequential matching method based on multi-frame radar images according to the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides a sequential matching method based on multi-frame radar images, which comprises the following steps of:
and S1, constructing a sequential matching system based on the multi-frame radar images.
A sequential matching system based on multi-frame radar images comprises: the system comprises a radar real-time graph simulation module, an image matching module and a sequential judgment module;
radar real-time graph simulation module: the system is used for generating a radar real-time graph by using radar satellite images, elevation data and track data in a simulation way;
an image matching module: the method is used for matching the radar real-time image and extracting the maximum 6 local correlation peak positions in the correlation surface;
a sequential judgment module: the method is used for finding out the image which can be effectively matched and the correct matching position thereof according to the inertial navigation position change and the size of each local correlation peak in each image.
And S2, the radar real-time graph simulation module in the sequential matching system generates a radar simulation real-time graph.
And the radar real-time graph simulation module generates an echo according to the input radar satellite image, the elevation data and the track data and generates a radar simulation real-time graph by adopting a synthetic aperture imaging method.
And S3, matching the radar real-time images, completing sub-area image matching and selecting the largest N local correlation peaks, wherein N is more than or equal to 3.
A correlation surface is obtained after the radar simulated real-time image is matched; if the correlation coefficient of a point is the maximum within a 5 pixel × 5 pixel window centred on that point, the point is taken as a local correlation peak. The following table gives an example of local correlation peak selection:

| ρ₁₁ | ρ₁₂ | ρ₁₃ | ρ₁₄ | ρ₁₅ |
|-----|-----|-----|-----|-----|
| ρ₂₁ | ρ₂₂ | ρ₂₃ | ρ₂₄ | ρ₂₅ |
| ρ₃₁ | ρ₃₂ | ρ₃₃ | ρ₃₄ | ρ₃₅ |
| ρ₄₁ | ρ₄₂ | ρ₄₃ | ρ₄₄ | ρ₄₅ |
| ρ₅₁ | ρ₅₂ | ρ₅₃ | ρ₅₄ | ρ₅₅ |

In this region, if ρ₃₃ is the maximum value in the table, the position corresponding to ρ₃₃ is a local correlation peak; otherwise it is not.
Since the probability of the correct matching position lying among the three largest correlation peaks is only 93.7% even for optical images, N is set to 6 in this embodiment; taking the 6 local correlation peaks with the largest correlation coefficients balances the real-time performance and the reliability of the algorithm.
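As an illustrative sketch (not code from the patent), the 5 × 5 local-maximum test and the selection of the N largest peaks could be implemented as follows; the function name, array layout, and tie handling are assumptions:

```python
import numpy as np

def local_correlation_peaks(corr, n_peaks=6, win=5):
    """Return the n_peaks largest local correlation peaks of a correlation
    surface. A point counts as a local peak when its coefficient is the
    maximum of the win x win window centred on it (5 x 5 in the method)."""
    h = win // 2
    rows, cols = corr.shape
    peaks = []  # (value, row, col)
    for r in range(h, rows - h):
        for c in range(h, cols - h):
            window = corr[r - h:r + h + 1, c - h:c + h + 1]
            if corr[r, c] == window.max():
                peaks.append((float(corr[r, c]), r, c))
    peaks.sort(reverse=True)  # largest correlation coefficient first
    return peaks[:n_peaks]
```

Note that in a perfectly flat region every point ties with its window maximum, so a practical implementation would likely also apply a minimum-coefficient threshold before ranking the peaks.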
S4, the sequential decision module in the sequential matching system finds the effectively matched images and their maximum-correlation positions.
Sequential decision starts after 10 radar image matching results have been obtained in succession. After the 11th matching result is obtained, the decision is made using the 10 results of the 2nd to 11th matches, and so on. In addition, the imaging and matching of the 10 consecutive images must be completed within 1 minute.
The sequential decision rests on two bases: the inertial navigation displacement information and the magnitudes of the correlation peaks.
Because the inertial navigation displacement agrees with the true carrier displacement over a short time, e.g. within 1 min, the differences between the local correlation peak positions at the correct matching positions of the 10 consecutive real-time images are equivalent to the corresponding inertial navigation displacements (the distance difference is within 20 m).
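A minimal sketch of this consistency test, under the assumption that positions are in pixels and with a hypothetical ground resolution `pixel_m` (the patent does not state one):

```python
import math

def displacement_consistent(peak_a, peak_b, ins_disp, tol_m=20.0, pixel_m=1.0):
    """Check that the offset between a candidate peak position in frame A and
    one in frame B matches the inertial navigation displacement between the
    two frames to within tol_m metres (20 m in the text).

    peak_a, peak_b: (row, col) peak positions; ins_disp: (d_row, d_col)
    INS-derived displacement in pixels."""
    dr = (peak_b[0] - peak_a[0]) - ins_disp[0]
    dc = (peak_b[1] - peak_a[1]) - ins_disp[1]
    return math.hypot(dr, dc) * pixel_m <= tol_m
```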
A local correlation peak i is selected from one matching result and compared with all local correlation peaks of the other 9 matches;
for each of the other 9 matches, its local correlation peaks are compared with peak i in descending order of peak value;
if the difference between the positions of peak i and a local correlation peak of a given match agrees with the inertial navigation displacement within the preset threshold, the distance weight of peak i is increased by 1, that local peak value is multiplied into the correlation-coefficient weight of peak i, and the comparison with that match is finished.
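The weight accumulation described above can be sketched as follows; `consistent` stands for the displacement test against the preset threshold, and all names and data layouts are illustrative assumptions:

```python
def score_peak(peak_i, other_matches, ins_offsets, consistent):
    """Accumulate the distance weight and correlation-coefficient weight of
    one local correlation peak against the peaks of the other 9 matches.

    peak_i        -- (value, row, col)
    other_matches -- one peak list per other match, each sorted by
                     descending correlation coefficient
    ins_offsets   -- INS displacement from peak_i's frame to each other frame
    consistent    -- predicate(pos_i, pos_j, ins_offset) -> bool
    """
    dist_weight = 0
    coef_weight = peak_i[0]
    pos_i = (peak_i[1], peak_i[2])
    for peaks, off in zip(other_matches, ins_offsets):
        for value, r, c in peaks:          # largest peak value first
            if consistent(pos_i, (r, c), off):
                dist_weight += 1           # position agrees with INS motion
                coef_weight *= value       # multiply in the matched peak value
                break                      # finish comparison with this match
    return dist_weight, coef_weight
```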
After the 6 local correlation peaks of all 10 matches have been compared, the sequential decision is made:
for the 6 local correlation peaks of the 10 matches, the maximum distance weight among the 54 comparison results is taken as the distance threshold;
among the 6 comparison results of each match, the correlation-coefficient weights of the local correlation peaks meeting the distance threshold are compared, and the position with the larger correlation-coefficient weight is the correct matching position of that match;
if none of the 6 comparison results of a match contains a local correlation peak meeting the distance threshold, the scene is not effectively matched in the corresponding area of the radar image;
all effective matching results are propagated, according to the inertial navigation displacement information, to the position at the imaging moment of the 10th image participating in the sequential decision, and their average is taken as the current matching result.
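A sketch of this final fusion step, with illustrative names; positions and INS displacements are assumed to be (row, col) pairs in pixels:

```python
def fuse_effective_matches(positions, ins_to_last):
    """Propagate each effective matching position to the last imaging moment
    using its INS displacement, then average the propagated positions to
    obtain the current matching result."""
    moved = [(r + dr, c + dc) for (r, c), (dr, dc) in zip(positions, ins_to_last)]
    n = len(moved)
    return (sum(r for r, _ in moved) / n,
            sum(c for _, c in moved) / n)
```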
This completes the sequential matching based on multi-frame radar images.
It should be understood that the above embodiments are only examples given for clarity of description and are not limiting. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaust all embodiments here, and obvious variations or modifications derived therefrom are intended to fall within the protection scope of the invention.
Claims (5)
1. A sequential matching method based on multi-frame radar images is characterized by comprising the following steps:
S1, building a sequential matching system based on the multi-frame radar images, the sequential matching system comprising a radar real-time image simulation module, an image matching module and a sequential decision module;
S2, generating a radar simulated real-time image;
S3, matching the radar simulated real-time image, completing sub-area image matching and selecting the N largest local correlation peaks, where N ≥ 3;
S4, finding the effectively matched images and their maximum-correlation positions: sequential decision starts after 10 radar simulated real-time image matching results have been obtained in succession; after the 11th matching result is obtained, the decision uses the 10 results of the 2nd to 11th matches, and so on;
wherein the sequential decision is based on the inertial navigation displacement and the magnitudes of the correlation peaks;
the specific process of the sequential decision is as follows: a local correlation peak i is selected from one matching result and compared with all local correlation peaks of the other 9 matches;
for each of the other 9 matches, its local correlation peaks are compared with peak i in descending order of peak value;
if the difference between the positions of peak i and a local correlation peak of a given match agrees with the inertial navigation displacement within a preset threshold, the distance weight of peak i is increased by 1, that local peak value is multiplied into the correlation-coefficient weight of peak i, and the comparison with that match is finished;
wherein, after all N local correlation peaks of the 10 matches have been compared, the decision is made:
for the N local correlation peaks of the 10 matches, giving 9 × N comparison results, the maximum distance weight is selected as the distance threshold;
among the N comparison results of each match, the correlation-coefficient weights of the local correlation peaks meeting the distance threshold are compared, and the position with the larger correlation-coefficient weight is the correct matching position of that match;
if none of the N comparison results of a match contains a local correlation peak meeting the distance threshold, the scene is not effectively matched in the corresponding area of the radar simulated real-time image;
all effective matching results are propagated to the position of the last imaging moment according to the inertial navigation displacement, and their average is taken as the current matching result.
2. The sequential matching method according to claim 1, wherein in step S2 the radar real-time image simulation module generates echoes from the input radar satellite images, elevation data and track data, and produces the radar simulated real-time image by a synthetic aperture imaging method.
3. The sequential matching method according to claim 1, wherein step S3 is executed by the image matching module, a correlation surface being obtained after matching the radar simulated real-time image; if the correlation coefficient of a point is the maximum within a 5 pixel × 5 pixel window centred on that point, the point is taken as a local correlation peak.
4. The sequential matching method according to claim 1, wherein in step S3, N is 6.
5. The sequential matching method according to claim 1, wherein the imaging and matching of 10 consecutive radar simulated real-time images is completed within 1 minute.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811351734.9A CN109493374B (en) | 2018-11-14 | 2018-11-14 | Sequential matching method based on multi-frame radar images |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109493374A CN109493374A (en) | 2019-03-19 |
CN109493374B true CN109493374B (en) | 2021-10-26 |
Family
ID=65695756
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811351734.9A Active CN109493374B (en) | 2018-11-14 | 2018-11-14 | Sequential matching method based on multi-frame radar images |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109493374B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103389074A (en) * | 2013-07-18 | 2013-11-13 | 河南科技大学 | Multi-scale scene matching area selecting method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2341965A (en) * | 1998-09-24 | 2000-03-29 | Secr Defence | Pattern recognition |
PL2776786T3 (en) * | 2011-11-08 | 2020-01-31 | Saab Ab | Method and system for determining a relation between a first scene and a second scene |
Non-Patent Citations (1)
Title |
---|
"Consistency Decision Algorithm for N-Frame Sequential Scene Matching"; Wang Yongming; Chinese Journal of Computers; 2005-06-12; pp. 1032-1035 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107491742B (en) | Long-term stable target tracking method for unmanned aerial vehicle | |
EP1505543A2 (en) | Video object tracking | |
CN114782691A (en) | Robot target identification and motion detection method based on deep learning, storage medium and equipment | |
CN114186632B (en) | Method, device, equipment and storage medium for training key point detection model | |
CN106952304B (en) | A kind of depth image calculation method using video sequence interframe correlation | |
CN107728124B (en) | Multi-radar dynamic adjustment method and device based on information entropy | |
CN111462174B (en) | Multi-target tracking method and device and electronic equipment | |
CN111354022B (en) | Target Tracking Method and System Based on Kernel Correlation Filtering | |
WO2021084530A1 (en) | Method and system for generating a depth map | |
CN107392898B (en) | Method and device for calculating pixel point parallax value applied to binocular stereo vision | |
CN113096181B (en) | Method and device for determining equipment pose, storage medium and electronic device | |
CN109493374B (en) | Sequential matching method based on multi-frame radar images | |
CN112270748B (en) | Three-dimensional reconstruction method and device based on image | |
CN116740488B (en) | Training method and device for feature extraction model for visual positioning | |
RU2461019C1 (en) | Method of coordinate-connected identification using statistical evaluation of difference of spatial coordinates | |
CN111046960B (en) | Method for matching different source images in partition mode | |
CN115082519A (en) | Airplane tracking method based on background perception correlation filtering, storage medium and electronic equipment | |
CN113379787B (en) | Target tracking method based on 3D convolution twin neural network and template updating | |
CN111709990B (en) | Camera repositioning method and system | |
CN110060343B (en) | Map construction method and system, server and computer readable medium | |
CN111639691B (en) | Image data sampling method based on feature matching and greedy search | |
CN110954927B (en) | Dynamic weighting method, device and readable storage medium | |
CN114022567A (en) | Pose tracking method and device, electronic equipment and storage medium | |
CN113486907A (en) | Unmanned equipment obstacle avoidance method and device and unmanned equipment | |
CN109163727B (en) | Electronic reconnaissance satellite target track dynamic estimation method and implementation device thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||