CN116091782A - Three-dimensional feature extraction method for space target component - Google Patents
- Publication number: CN116091782A (application CN202211571024.3A)
- Authority: CN (China)
- Prior art keywords: image, dimensional, distance, target, radar
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/006—Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G06T5/30—Erosion or dilatation, e.g. thinning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20061—Hough transform
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
A three-dimensional feature extraction method for a space target component comprises: acquiring radar echo signals from three base stations and performing range-Doppler processing to obtain ISAR two-dimensional images; registering the ISAR images and converting them to gray-scale images; computing the optimal threshold of each gray-scale image and converting it to a binary image; dilating the image and filling holes; extracting the boundary of the binary image and detecting straight line segments on the boundary image to obtain their start and end coordinates and directions; searching the binary image for the component's rectangular feature parameters; interfering the three ISAR images to obtain the projection matrix from the radar line of sight onto the imaging plane; and searching for the projection information of the target component on the radar imaging plane and, combined with the projection matrix, obtaining the real three-dimensional size of the target component. The invention solves problems such as scattering-point information loss and target occlusion, and improves the ability to extract three-dimensional attitude information of a space target.
Description
Technical Field
The invention relates to a three-dimensional feature extraction method for a space target component, in particular to one based on interferometric ISAR images.
Background
As countries increase their emphasis on and investment in space utilization, the scale of space activity keeps growing. Accurately measuring the attitude of a space target and its important payload components allows its working state to be estimated effectively, enabling real-time monitoring and attitude estimation of non-cooperative space targets. Traditional target attitude measurement divides into active and passive measurement. Active measurement requires the spacecraft to carry a dedicated measurement system, so system complexity is high, while attitude measurement with photoelectric sensors is easily affected by weather and time of day. Current approaches acquire space-target attitude information by estimating a projection angle from a two-dimensional image sequence and inverting it to three-dimensional space, but the unknown projection angle makes the inversion uncertain, introduces large errors, and greatly limits target recognition capability.
Disclosure of Invention
The invention aims to provide a three-dimensional feature extraction method for a space target component that solves problems such as scattering-point information loss and target occlusion and improves the ability to extract three-dimensional attitude information of a space target.
In order to achieve the above object, the present invention provides a method for extracting three-dimensional features of a space target component, comprising:
step S1, acquiring radar echo signals from three base stations, performing range-Doppler processing to obtain ISAR two-dimensional images, registering the ISAR images, and converting them to gray-scale images;
step S2, calculating the optimal threshold of each gray-scale image with the OTSU (maximum between-class variance) method, converting the gray-scale image to a binary image according to the optimal threshold, and dilating the image and filling holes;
step S3, extracting the boundary of the binary image with the Sobel edge detection algorithm, detecting straight line segments on the boundary image with the Hough transform, and obtaining the start and end coordinates and directions of the segments;
step S4, searching the binary image for the component's rectangular feature parameters with an optimization algorithm, based on the Hough-transform detection result;
step S5, interfering the three ISAR images to obtain the radar pitch and azimuth angles, and obtaining the projection matrix from the radar line of sight onto the imaging plane;
and step S6, searching for the projection information of the target component on the radar imaging plane with a particle swarm optimization algorithm and, combined with the radar imaging-plane projection matrix, obtaining the real three-dimensional size of the target component.
The step S1 includes:
step S1.1, a radar is set to emit electromagnetic wave signals, the radar position is set as the origin O of the spatial coordinates, and three radar base stations A, B, C receive the signals;
the signals received by each base station are acquired and signal processing in the range-Doppler dimension is carried out; assuming A is the main channel and taking it as an example, the ISAR two-dimensional image-domain expression is as follows:
where λ is the electromagnetic wavelength, γ is the chirp rate, f_r and f_a are respectively the range and azimuth frequencies, followed by the position of the p-th scattering point, the reference distance R_ref, and the initial-time distances to antennas A and O;
taking A, B channel interference as an example, for the same scattering point, the differences between the three channels are:
where the leading terms are the positions of the same scattering point in the A-channel and B-channel images, Δf_r and Δf_a are the frequency shifts of the scattering point in range and azimuth on the two channels A, B, Δ_n and Δ_m are the corresponding position shifts in range and azimuth, R_A0 and R_B0 are respectively the distances from antennas A and B to O at the initial time, and ω_z is the target rotation speed;
step S1.2, solving for the rotation offset ω_z by optimization with the mean-range minimum-entropy method: when the image envelope does not walk across range cells, the entropy of the image is minimal;
ω_z = argmin( −Σ g(ω_z_est) · ln g(ω_z_est) )
where ω_z_est is the estimate of ω_z, and g(ω_z_est) is the envelope amplitude of the range profile integrated along azimuth after envelope correction with ω_z_est;
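By way of an illustrative sketch (not the claimed implementation), the minimum-entropy search for the rotation offset can be mimicked numerically: each candidate rate is applied as an envelope correction, and the candidate whose azimuth-integrated range profile has the least entropy is kept. The linear range-walk model, the Gaussian toy scatterer, and the search grid below are assumptions for demonstration only.

```python
import numpy as np

def image_entropy(g):
    """Shannon entropy of a normalized intensity profile."""
    p = g / g.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def estimate_rotation_rate(profiles, candidates, dt):
    """Grid-search a rotation-induced range-walk rate (a stand-in for omega_z):
    for each candidate, undo the walk by shifting each slow-time range profile,
    integrate along azimuth, and keep the rate giving minimum entropy."""
    n_pulse, n_range = profiles.shape
    best_rate, best_ent = None, np.inf
    freqs = np.fft.fftfreq(n_range)
    for w in candidates:
        corrected = np.empty_like(profiles)
        for k in range(n_pulse):
            shift = -w * k * dt  # range walk to undo, in range bins
            # subpixel circular shift via the Fourier shift theorem
            spec = np.fft.fft(profiles[k])
            corrected[k] = np.abs(np.fft.ifft(spec * np.exp(-2j * np.pi * freqs * shift)))
        g = corrected.sum(axis=0)  # integrate the envelope along azimuth
        ent = image_entropy(g)
        if ent < best_ent:
            best_rate, best_ent = w, ent
    return best_rate

# toy data: one point scatterer walking through range bins at 0.8 bins/pulse
n_pulse, n_range, true_rate = 64, 128, 0.8
profiles = np.zeros((n_pulse, n_range))
x = np.arange(n_range)
for k in range(n_pulse):
    profiles[k] = np.exp(-0.5 * (x - 40 - true_rate * k) ** 2)
est = estimate_rotation_rate(profiles, np.linspace(0, 1.5, 31), dt=1.0)
print(est)  # close to 0.8
```

When the candidate matches the true walk rate, the corrected profiles align and their azimuth sum collapses to a sharp peak, which is exactly the minimum-entropy condition stated above.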
mismatch compensation is applied to channel B with channel A as reference; the phase form of the compensation offset is:
Δ_comp(n, m) = exp(j2πΔ_n·n/N) · exp(j2πΔ_m·m/M)
where n and m are respectively the discrete sample indices in the image range and azimuth directions and N, M are the total sample counts; the two compensated channels are combined into a matrix in the time domain, and when the image scattering points of the two channels are perfectly aligned, the scattering points obtained from it by two-dimensional Fourier transform coincide and the image energy is then maximal;
the channel A, C registration method is the same as for channels A, B, yielding the registered three-channel gray-scale images.
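As a hedged illustration of the registration idea, aligning channel B to channel A so that their scattering points coincide, the integer range/azimuth mismatch can be estimated by FFT cross-correlation of the image magnitudes. This is a common stand-in for the time-domain matrix and energy-maximization search described above, and the toy images are assumptions:

```python
import numpy as np

def register_offset(img_ref, img_mov):
    """Estimate the integer (range, azimuth) offset of img_mov relative to
    img_ref by FFT cross-correlation; the correlation peak location is the
    shift that best aligns the two channel images."""
    xc = np.fft.ifft2(np.fft.fft2(img_mov) * np.conj(np.fft.fft2(img_ref)))
    idx = np.unravel_index(np.argmax(np.abs(xc)), xc.shape)
    # wrap circular peak positions into signed offsets
    dn = idx[0] if idx[0] <= img_ref.shape[0] // 2 else idx[0] - img_ref.shape[0]
    dm = idx[1] if idx[1] <= img_ref.shape[1] // 2 else idx[1] - img_ref.shape[1]
    return dn, dm

# toy check: "channel B" is "channel A" shifted by (3, -5) pixels
rng = np.random.default_rng(1)
A = rng.random((64, 64))
B = np.roll(A, (3, -5), axis=(0, 1))
print(register_offset(A, B))  # → (3, -5)
```

The recovered (Δn, Δm) would then feed the phase-form compensation Δ_comp(n, m) given above.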
The step S2 includes:
step S2.1, taking gray-scale image BMP_A as an example, calculating the optimal threshold TH of the BMP_A image with the maximum between-class variance method;
the image is divided into a background part C_1 (below TH) and a target part C_2 (above TH); with p_1 and p_2 the probabilities of a pixel in the image falling into the background and target classes, the between-class variance is:
where THG is the global mean, the accompanying term is the cumulative mean at gray level m, and the m that maximizes the variance is the OTSU threshold TH;
step S2.2, converting the gray-scale image to a binary image according to the OTSU threshold, then dilating the image and filling holes to obtain binary images BI_1, BI_2, BI_3.
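A minimal sketch of step S2.1's maximum between-class variance computation (exhaustive OTSU over gray levels); the histogram binning and the toy image below are assumptions for demonstration:

```python
import numpy as np

def otsu_threshold(gray, n_bins=256):
    """Exhaustive OTSU: choose the cut maximizing the between-class variance
    (mu_T*omega - mu)^2 / (omega*(1 - omega)) of the background/target split;
    the bin's upper edge is returned so that `gray > TH` selects the target."""
    hist, edges = np.histogram(gray, bins=n_bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    omega = np.cumsum(p)            # class-C1 probability for each cut
    mu = np.cumsum(p * edges[:-1])  # cumulative mean up to each cut
    mu_t = mu[-1]                   # global mean
    denom = omega * (1.0 - omega)
    num = (mu_t * omega - mu) ** 2
    sigma_b = np.where(denom > 1e-12, num / np.maximum(denom, 1e-12), 0.0)
    return edges[np.argmax(sigma_b) + 1]

# toy gray image: dim background with one bright rectangular "component"
img = np.full((64, 64), 0.1)
img[20:40, 10:50] = 0.9
th = otsu_threshold(img)
binary = img > th
print(th, binary.sum())  # threshold between 0.1 and 0.9; 800 target pixels
```

Dilation and hole filling would then be applied to `binary` to produce BI_1, BI_2, BI_3 as in step S2.2.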
The step S3 includes:
s3.1, extracting the target binary boundary image processed in step S2 with the Sobel edge detection algorithm;
s3.2, obtaining the straight line segments in the boundary image with the Hough transform, including the start and end coordinates and direction of each segment;
converting the Cartesian coordinate system into a polar coordinate system:
ρ=xcosθ+ysinθ
each edge pixel (x, y) in Cartesian coordinates in the boundary image is mapped to a corresponding sinusoid in the polar (ρ, θ) space; after all edge points have been Hough-transformed, the point (ρ_0, θ_0) with the most intersections in polar space is extracted, the longest edge segment l_L is found, and its start and end coordinates (x_1, y_1), (x_2, y_2) and direction are obtained.
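The ρ = x cosθ + y sinθ voting of step S3.2 can be sketched with an explicit accumulator; the accumulator resolution and the toy edge image below are assumptions:

```python
import numpy as np

def hough_peak(edge_img, n_theta=180):
    """Vote every edge pixel (x, y) into a (rho, theta) accumulator via
    rho = x*cos(theta) + y*sin(theta) and return the strongest line."""
    h, w = edge_img.shape
    thetas = np.deg2rad(np.arange(n_theta))      # theta = 0..179 degrees
    diag = int(np.ceil(np.hypot(h, w)))          # rho range is +/- diagonal
    rhos = np.arange(-diag, diag + 1)
    acc = np.zeros((len(rhos), n_theta), dtype=int)
    ys, xs = np.nonzero(edge_img)
    for x, y in zip(xs, ys):
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[r + diag, np.arange(n_theta)] += 1   # one vote per theta bin
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return rhos[i], np.rad2deg(thetas[j])

# toy boundary image containing the vertical edge x = 30
img = np.zeros((64, 64), dtype=bool)
img[10:50, 30] = True
rho, theta_deg = hough_peak(img)
print(rho, theta_deg)  # → 30 0.0  (x*cos(0) + y*sin(0) = 30)
```

The accumulator maximum plays the role of (ρ_0, θ_0) above; walking the edge pixels that voted for it yields the segment endpoints (x_1, y_1), (x_2, y_2).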
The step S4 includes:
step S4.1, searching rectangular features of the component by using rectangular descriptors according to Hough transformation results;
the short and long sides of the rectangle are mutually perpendicular; with the initial coordinate (x_1, y_1) as the short-side start coordinate, the short-side end coordinate (x_i, y_i) is found by:
min | l_D − l(x_i, y_i) |
where F_pi is the set of pixels of the candidate rectangular region for the current short side, l_D is the length of the component rectangle's short side, l(x_i, y_i) is the search length of the rectangle's short side, and n_i and n_{i−1} are respectively the numbers of pixels covered by the candidate rectangular component at the current and previous search values;
step S4.2, after the short-side length is determined, searching the start and end points of the long side, thereby determining the four corner points of the target rectangular component and obtaining the target component's two-dimensional size and orientation.
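A simplified stand-in for the rectangle-descriptor search of step S4: starting from a detected corner, each side is grown pixel by pixel while the binary mask remains foreground. The real method scores candidate rectangles by covered-pixel counts; this axis-aligned toy version only illustrates the corner-to-side search idea:

```python
import numpy as np

def grow_side(mask, start, direction, max_len=100):
    """From a corner pixel, step along `direction` while the mask stays
    foreground; returns the side length in pixels."""
    y, x = start
    dy, dx = direction
    length = 0
    while (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
           and mask[y, x] and length < max_len):
        length += 1
        y, x = y + dy, x + dx
    return length

# toy binary component: a 20 x 40 rectangle with corner at (row 20, col 10)
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 10:50] = True
short = grow_side(mask, (20, 10), (1, 0))  # walk down the short side
long_ = grow_side(mask, (20, 10), (0, 1))  # walk along the long side
print(short, long_)  # → 20 40
```

With both side lengths and the shared corner, the four corner points and the two-dimensional size/orientation of step S4.2 follow directly.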
The step S5 includes:
step S5.1, applying phase correction to the three-channel radar echo signals obtained in step S1; the three-channel phase form is expressed as:
where R_ΔAPO, R_ΔBPO and R_ΔCPO are respectively the signal path histories received by the three receiving radars and R_ref is the set reference distance;
the original interference phase differences of the three channels are:
the theoretical phase differences between the three channels are:
Δφ_AB = 2πL_1·sinα/λ
Δφ_AC = 2πL_2·sinβ/λ
where L_1 and L_2 are the baseline lengths of the AB and AC channels, α is the azimuth angle, and β is the pitch angle;
the interference phase is unwrapped with the de-reference-plane method: the plane tangent to the target's motion direction and perpendicular to the line of sight of antenna A is chosen as the reference plane, the interference phase on the reference plane corresponding to the slant range is subtracted from the original interference phase, and the principal value is taken, giving the actual interference phases φ′_AB, φ′_AC;
Taking A, B as an example, the interference phase after the reference plane phase is removed is:
φ′_AB(R_ΔAP0) = φ(R_ΔAP0) − φ_g(R_ΔAP0) + 2kπ
where k is an integer whose value satisfies −π < φ′_AB ≤ π, φ(R_ΔAP0) is the true phase corresponding to the range difference R_ΔAP0, φ_g(R_ΔAP0) is the phase on the reference plane corresponding to R_ΔAP0, and R_ΔAP0 is the difference between the pixel's antenna-A slant range and the reference distance;
solving for the X, Y, Z positions yields:
and simultaneously the radar line-of-sight angle parameters (α, β);
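Assuming the unwrapped phases follow the relations Δφ_AB = 2πL_1 sinα/λ and Δφ_AC = 2πL_2 sinβ/λ given above, the line-of-sight angles can be inverted directly; the baseline lengths and wavelength below are illustrative values, not values from the patent:

```python
import numpy as np

def los_angles(dphi_ab, dphi_ac, L1, L2, lam):
    """Invert the unwrapped cross-channel phases
    dphi_AB = 2*pi*L1*sin(alpha)/lam and dphi_AC = 2*pi*L2*sin(beta)/lam
    for the line-of-sight azimuth alpha and pitch beta (radians)."""
    alpha = np.arcsin(dphi_ab * lam / (2 * np.pi * L1))
    beta = np.arcsin(dphi_ac * lam / (2 * np.pi * L2))
    return alpha, beta

# round-trip check with assumed baselines and an X-band-like wavelength
lam, L1, L2 = 0.03, 2.0, 2.0                      # metres, illustrative
a_true, b_true = np.deg2rad(0.01), np.deg2rad(-0.02)
dab = 2 * np.pi * L1 * np.sin(a_true) / lam
dac = 2 * np.pi * L2 * np.sin(b_true) / lam
a, b = los_angles(dab, dac, L1, L2, lam)
print(np.allclose([a, b], [a_true, b_true]))  # → True
```

The recovered (α, β) are the angle parameters used in step S5.2 to build the imaging-plane projection matrix.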
step S5.2, constructing a radar imaging plane projection matrix according to the angle obtained in the step S5.1:
where ρ_r is the range resolution and ρ_a is the Doppler resolution.
The step S6 includes:
step S6.1, after the two-dimensional features of the target component are obtained, the three-dimensional projection attitude parameters are obtained by solving a minimization problem based on the component's rectangular prior, and the cost function is as follows:
where 0° ≤ ψ ≤ 180°, the accompanying symbol is the attitude direction of the target's linear structure, and a classical particle swarm optimization algorithm is adopted to solve for the optimal projection angle of the target component on the imaging plane;
step S6.2, determining the real three-dimensional size of the target component according to the generated projection information of the target component in the image and the extracted two-dimensional information:
where l_L and l_D are the real size parameters of the target rectangular component, and l_L-RD and l_D-RD are respectively the size parameters of the extracted target rectangular component projected onto the ISAR imaging plane.
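A sketch of the classical particle swarm optimization used in step S6.1, applied to a one-dimensional projection-angle search. The cost below is a simplified stand-in built from the embodiment's example numbers (long side 6.35 m, projected length 5.5038 m); the PSO hyperparameters are assumptions:

```python
import numpy as np

def pso_minimize(cost, lo, hi, n_particles=30, n_iter=100, seed=0):
    """Classic 1-D particle swarm: each particle tracks its personal best,
    the swarm tracks a global best, and velocities blend the two."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n_particles)
    v = np.zeros(n_particles)
    pbest, pbest_f = x.copy(), cost(x)
    g = pbest[np.argmin(pbest_f)]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and attraction weights (assumed)
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = cost(x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)]
    return g

# true long side 6.35 m projects to 5.5038 m (embodiment numbers);
# search the projection angle psi (degrees) on [0, 90]
cost = lambda psi: (5.5038 - 6.35 * np.cos(np.deg2rad(psi))) ** 2
psi = pso_minimize(cost, 0.0, 90.0)
print(round(psi, 1))  # close to 29.9
```

Once ψ is found, the real side length follows from dividing the projected length by cos ψ, in the spirit of step S6.2.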
The invention has the following beneficial effects:
1. Based on ISAR images acquired at different angles by a multichannel radar, the interference phase information is extracted, the radar pitch and azimuth angles are derived, and the projection matrix onto the radar imaging plane is computed. Imaging with the interferometric radar needs no sequential imaging while resolution is preserved; the target projection parameters are searched using only the interferometric image at a single instant, from which the target's three-dimensional attitude parameters are obtained.
2. Performing three-dimensional attitude inversion with ISAR images gives lower system complexity than traditional active attitude measurement, and unlike photoelectric measurement it works all day and in all weather.
3. The multi-view information of the interferometric radar resolves the inaccurate size estimation caused by traditional two-dimensional imaging projection, and attitude estimation is achieved with a single high-resolution three-dimensional imaging pass.
4. Based on the component information extracted from the two-dimensional image, a particle swarm optimization algorithm searches the target body-coordinate projection angle and solves the projection matrix; this non-parameterized search helps acquire the attitude parameters of non-cooperative targets with limited prior information.
Drawings
Fig. 1 is a flowchart of a three-dimensional feature extraction method of a space target component provided by the invention.
Fig. 2 is the InISAR radar layout geometry and the three-dimensional angle solution relationship.
Fig. 3a is the three-channel ISAR image before registration.
Fig. 3b is the three-channel ISAR image after registration.
FIG. 4a is an ISAR image target component feature extraction diagram.
Fig. 4b is an extracted target part interference phase diagram.
FIG. 5a is an overall three-dimensional inversion of the target.
FIG. 5b is an extracted three-dimensional inversion of the target part.
Detailed Description
The following describes a preferred embodiment of the present invention with reference to fig. 1 to 5.
The space-target component three-dimensional feature extraction method based on interferometric ISAR images can improve space-target estimation accuracy and has practical value. Attitude measurement with a radar sensor can operate around the clock.
As shown in fig. 1, the present invention provides a method for extracting three-dimensional features of a space target component, comprising the following steps:
step S1, acquiring radar echo signals from three base stations, performing range-Doppler processing to obtain ISAR two-dimensional images, registering the ISAR images, and converting them to gray-scale images;
step S1.1, a radar is set to emit electromagnetic wave signals, the radar position is set as the origin O of the spatial coordinates, and three radar base stations A, B, C receive the signals;
the signals received by each base station are acquired and signal processing in the range-Doppler dimension is carried out; assuming A is the main channel and taking it as an example, the ISAR two-dimensional image-domain expression is as follows:
where λ is the electromagnetic wavelength, γ is the chirp rate, f_r and f_a are respectively the range and azimuth frequencies, followed by the position of the p-th scattering point, the reference distance R_ref, and the initial-time distances to antennas A and O;
it can be seen that, for the same scattering point, the differences between the three channels are (taking A, B channel interference as an example):
where the leading terms are the positions of the same scattering point in the A-channel and B-channel images, Δf_r and Δf_a are the frequency shifts of the scattering point in range and azimuth on the two channels A, B, Δ_n and Δ_m are the corresponding position shifts in range and azimuth, R_A0 and R_B0 are respectively the distances from antennas A and B to O at the initial time, and ω_z is the target rotation speed;
step S1.2, solving for the rotation offset ω_z by optimization with the mean-range minimum-entropy method: when the image envelope does not walk across range cells, the entropy of the image is minimal;
ω_z = argmin( −Σ g(ω_z_est) · ln g(ω_z_est) )
where ω_z_est is the estimate of ω_z, and g(ω_z_est) is the envelope amplitude of the range profile integrated along azimuth after envelope correction with ω_z_est;
mismatch compensation is applied to channel B with channel A as reference; the phase form of the compensation offset is:
Δ_comp(n, m) = exp(j2πΔ_n·n/N) · exp(j2πΔ_m·m/M)
where n and m are respectively the discrete sample indices in the image range and azimuth directions and N, M are the total sample counts; the two compensated channels are combined into a matrix in the time domain, and when the image scattering points of the two channels are perfectly aligned, the scattering points obtained from it by two-dimensional Fourier transform coincide and the image energy is then maximal;
the channel A, C registration method is the same as the channel A, B, so that three-channel gray scale images after registration are obtained;
step S2, calculating the optimal threshold of the gray-scale image with the maximum between-class variance (OTSU) method, converting the gray-scale image to a binary image according to the optimal threshold, and dilating the image and filling holes;
step S2.1, taking gray-scale image BMP_A as an example, calculating the optimal threshold TH of the BMP_A image with the maximum between-class variance method;
the image is divided into a background part C_1 (below TH) and a target part C_2 (above TH); with p_1 and p_2 the probabilities of a pixel in the image falling into the background and target classes, the between-class variance is:
where THG is the global mean, the accompanying term is the cumulative mean at gray level m, and the m that maximizes the variance is the OTSU threshold TH;
step S2.2, converting the gray-scale image to a binary image according to the OTSU threshold, then dilating the image and filling holes to obtain binary images BI_1, BI_2, BI_3;
S3, acquiring a boundary of the binary image by using a sobel edge detection algorithm, detecting a linear line segment on the boundary image by using Hough transformation, and acquiring start and end coordinates and directions of the line segment;
s3.1, extracting the target binary boundary image processed in step S2 with the Sobel edge detection algorithm;
s3.2, obtaining the straight line segments in the boundary image with the Hough transform, including the start and end coordinates and direction of each segment;
converting the Cartesian coordinate system into a polar coordinate system:
ρ=xcosθ+ysinθ
each edge pixel (x, y) in Cartesian coordinates in the boundary image is mapped to a corresponding sinusoid in the polar (ρ, θ) space; after all edge points have been Hough-transformed, the point (ρ_0, θ_0) with the most intersections in polar space is extracted, the longest edge segment l_L is found, and its start and end coordinates (x_1, y_1), (x_2, y_2) and direction are obtained.
Step S4, searching part rectangular characteristic parameters in the binary image by adopting an optimization algorithm according to the Hough transformation detection result;
step S4.1, searching rectangular features of the component by using rectangular descriptors according to Hough transformation results;
the short and long sides of the rectangle are mutually perpendicular; with the initial coordinate (x_1, y_1) as the short-side start coordinate, the short-side end coordinate (x_i, y_i) is found by:
min | l_D − l(x_i, y_i) |
where F_pi is the set of pixels of the candidate rectangular region for the current short side, l_D is the length of the component rectangle's short side, l(x_i, y_i) is the search length of the rectangle's short side, and n_i and n_{i−1} are respectively the numbers of pixels covered by the candidate rectangular component at the current and previous search values;
step S4.2, after the short-side length is determined, the start and end points of the long side are searched (the principle is the same as in step S4.1) to determine the four corner points of the target rectangular component and obtain the target component's two-dimensional size and orientation;
S5, interfering the three ISAR images to obtain radar pitch angles and azimuth angles, and obtaining a projection matrix of radar sight to an imaging plane;
step S5.1, applying phase correction to the three-channel radar echo signals obtained in step S1; the three-channel phase form is expressed as:
where R_ΔAPO, R_ΔBPO and R_ΔCPO are respectively the signal path histories received by the three receiving radars and R_ref is the set reference distance;
the original interference phase differences of the three channels are:
as shown in the geometrical relationship of Fig. 2, the theoretical phase differences between the three channels are:
Δφ_AB = 2πL_1·sinα/λ
Δφ_AC = 2πL_2·sinβ/λ
where L_1 and L_2 are the baseline lengths of the AB and AC channels, α is the azimuth angle, and β is the pitch angle;
as can be seen from the above, the wave-path difference of the two antennas is much larger than the wavelength, i.e. the true inter-channel phase difference is much larger than 2π, which produces phase wrapping;
the interference phase is unwrapped with the de-reference-plane method: the plane tangent to the target's motion direction and perpendicular to the line of sight of antenna A is chosen as the reference plane, the interference phase on the reference plane corresponding to the antenna-A slant range is subtracted from the original interference phase, and the principal value is taken, giving the actual interference phases φ′_AB, φ′_AC;
Taking A, B as an example, the interference phase after the reference plane phase is removed is phi A ′ B (R ΔAP0 )=φ(R ΔAP0 )-φ g (R ΔAP0 )+2kπ;
Wherein k is an integer, and the value of k satisfies-pi < phi A ′ B ≤π,φ(R ΔAP0 ) Is the distance difference R ΔAP0 Corresponding true phase, phi g (R ΔAP0 ) Is the distance difference R ΔAP0 Phase on corresponding reference plane, R ΔAP0 The pixel corresponds to the difference between the antenna A slant distance and the reference distance;
from the angular relationship shown in fig. 2, the X, Y, Z position is solved:
simultaneously obtaining radar sight angle parameters (alpha, beta);
step S5.2, constructing a radar imaging plane projection matrix according to the angle obtained in the step S5.1:
where ρ_r is the range resolution and ρ_a is the Doppler resolution;
s6, searching projection information of the target component on a radar imaging plane by using a particle swarm optimization algorithm, and acquiring the three-dimensional real size of the target component space by combining a radar imaging plane projection matrix;
step S6.1, after the two-dimensional features of the target component are obtained, the three-dimensional projection attitude parameters can be obtained by solving a minimization problem based on the component's rectangular prior; the cost function is as follows:
where 0° ≤ ψ ≤ 180°, the accompanying symbol is the attitude direction of the target's linear structure, and a classical particle swarm optimization algorithm is adopted to solve for the optimal projection angle of the target component on the imaging plane;
step S6.2, determining the real three-dimensional size of the target component according to the generated projection information of the target component in the image and the extracted two-dimensional information:
where l_L and l_D are the real size parameters of the target rectangular component, and l_L-RD and l_D-RD are respectively the size parameters of the extracted target rectangular component projected onto the ISAR imaging plane.
Fig. 3a shows the three-channel ISAR images before registration, Fig. 3b the registered images, Fig. 4 the component feature-extraction information and component interference phase, and Fig. 5 the three-dimensional inversion of the target and its components. From Figs. 3a and 3b it can be seen that, with channel A as reference, the three channels are essentially coincident after registration. In Fig. 4a the projected length of the long side in the imaging plane is 5.5449 m and the angle between the target attitude and the imaging plane is 29.1659°; in Fig. 5a the estimated long side is 6.35 m and the estimated attitude angle is 29.9172°, so the projected long-side length computed from the estimates is 5.5038 m, an error of 0.0411 m.
Aiming at the requirement of acquiring the three-dimensional attitude of a space target, the invention provides a three-dimensional feature extraction method for space target components based on interferometric ISAR images. The three-dimensional attitude of the target component is inverted using the angle information extracted from the interferometric images, which overcomes problems such as missing scattering-point information and target occlusion, and improves the capability of extracting three-dimensional attitude information of space targets.
The innovation points and advantages of the invention are that:
1. Based on ISAR images acquired from different angles by a multichannel radar, the interference phase information is extracted, the pitch and azimuth angles of the radar are derived, and the projection matrix onto the radar imaging plane is computed. With interferometric radar imaging, no sequential imaging is required while resolution is preserved: the target projection parameters are searched using only the interferometric image at a single instant, from which the three-dimensional attitude parameters of the target are acquired.
2. Performing three-dimensional attitude inversion from ISAR images gives lower system complexity than traditional active attitude measurement methods, and offers all-day, all-weather operation compared with photoelectric measurement methods.
3. The multi-view information of the interferometric radar resolves the inaccurate size estimation caused by traditional two-dimensional imaging projection, while attitude estimation is achieved with a single high-resolution three-dimensional imaging pass.
4. Based on the component information extracted from the two-dimensional image, a particle swarm optimization algorithm searches the projection angles of the target body coordinates and solves the projection matrix; this non-parameterized search facilitates acquiring the attitude parameters of non-cooperative targets with limited prior information.
It should be noted that, in the embodiments of the present invention, the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the embodiments, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
While the present invention has been described in detail through the foregoing description of the preferred embodiment, it should be understood that the foregoing description is not to be considered as limiting the invention. Many modifications and substitutions of the present invention will become apparent to those of ordinary skill in the art upon reading the foregoing. Accordingly, the scope of the invention should be limited only by the attached claims.
Claims (7)
1. A method for extracting three-dimensional features of a space target component, comprising:
s1, acquiring radar echo signals of three base stations, performing distance-Doppler processing, acquiring an ISAR two-dimensional image, performing image registration on the ISAR two-dimensional image, and converting the image into a gray level image;
S2, calculating an optimal threshold value of the gray-level map by the OTSU (maximum inter-class variance) method, converting the gray-level map into a binary image according to the optimal threshold, and dilating the image and filling holes;
s3, acquiring a boundary of the binary image by using a sobel edge detection algorithm, detecting a linear line segment on the boundary image by using Hough transformation, and acquiring start and end coordinates and directions of the line segment;
step S4, searching part rectangular characteristic parameters in the binary image by adopting an optimization algorithm according to the Hough transformation detection result;
s5, interfering the three ISAR images to obtain radar pitch angles and azimuth angles, and obtaining a projection matrix of radar sight to an imaging plane;
and S6, searching projection information of the target component on the radar imaging plane by using a particle swarm optimization algorithm, and combining the projection matrix of the radar imaging plane to obtain the three-dimensional real size of the target component space.
2. The method for extracting three-dimensional features of a space target component according to claim 1, wherein said step S1 comprises:
step S1.1, setting a radar to emit electromagnetic wave signals, setting the radar position as an origin O of space coordinates, and setting A, B, C three radar base stations to receive signals;
the signals received by each base station are acquired and signal processing in the distance-Doppler dimension is performed; assuming A is the main channel and taking it as an example, the ISAR two-dimensional image-domain expression is as follows:
where λ is the electromagnetic wavelength, γ is the frequency modulation rate, f_r and f_a denote the distance and azimuth frequencies respectively, R_ref is the reference distance, and the remaining terms are the position of the p-th scattering point and its distances to antennas A and O at the initial time;
taking the A, B channel interference as an example, for the same scattering point the differences between the channels are:
where the first pair of terms are the positions of the same scattering point in the A-channel and B-channel images, Δf_r and Δf_a are the frequency shifts of the scattering point in distance and azimuth between the A and B channels, Δ_n and Δ_m are the corresponding position shifts in distance and azimuth, R_A0 and R_B0 are the distances from antennas A and B to O at the initial time, and ω_z is the target rotational speed;
Step S1.2, solving for the rotational offset ω_z by optimization using the average range-profile minimum-entropy method: when the image envelope does not migrate across distance cells, the entropy of the image is minimal;
ω z =arg min(-∑g(ω z_est )·lng(ω z_est ))
where ω_z_est is the estimate of ω_z, and g(ω_z_est) is the envelope amplitude of the range profile integrated along the azimuth direction after envelope correction using ω_z_est;
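The minimum-entropy criterion above can be sketched on a toy model. The sketch below replaces the patent's arg-min optimization with an exhaustive grid search, assumes a single scatterer with a linear range walk, and uses exact binary candidate fractions so the integer shifts cancel exactly; all of these are illustrative simplifications.

```python
import numpy as np

def envelope_entropy(g):
    """Image-entropy criterion -sum g*ln(g) on the normalized envelope."""
    g = g / g.sum()
    g = g[g > 0]
    return float(-(g * np.log(g)).sum())

def correct_migration(profiles, w):
    """Undo a linear range walk of w cells per azimuth line (circular shift)."""
    out = np.empty_like(profiles)
    for m in range(profiles.shape[0]):
        out[m] = np.roll(profiles[m], -int(w * m))
    return out

# Toy data: one scatterer walking w_true range cells per azimuth line.
M, N, w_true = 32, 64, 0.25
profiles = np.zeros((M, N))
for m in range(M):
    profiles[m, 10 + int(w_true * m)] = 1.0

# The candidate minimizing the azimuth-integrated envelope entropy is selected:
# at the true rate the envelope collapses into one range cell (entropy -> 0).
candidates = [0.0, 0.125, 0.25, 0.375, 0.5]
w_hat = min(candidates, key=lambda w: envelope_entropy(
    np.abs(correct_migration(profiles, w)).sum(axis=0)))
```

A real implementation would optimize over a continuous ω_z with sub-cell envelope correction rather than integer circular shifts.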
and carrying out mismatch compensation on the channel B by taking the channel A as a reference, wherein the phase form of the compensation offset is as follows:
Δ comp (n,m)=exp(j2π/NΔ n ·n)exp(j2π/MΔ m ·m)
where n and m are the discrete sampling indices in the image distance and azimuth directions respectively, and N, M are the total numbers of samples. The two compensated channels are combined into a matrix in the time domain; when the image scattering points of the two channels are perfectly aligned, the scattering points obtained by applying a two-dimensional Fourier transform to this matrix coincide, and the image energy is maximal;
the channel A, C registration method is the same as the channel A, B, thereby obtaining a three-channel gray scale image after registration.
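The compensation phase Δ_comp(n, m) of step S1.2 is a linear phase whose effect is an image-domain shift. A minimal numpy sketch of this equivalence, restricted to integer shifts with circular wrap-around (the patent applies the phase to align channel B to channel A; here we only demonstrate the shift property):

```python
import numpy as np

def compensate_shift(img, dn, dm):
    """Apply the linear compensation phase
    exp(j*2*pi*dn*n/N) * exp(j*2*pi*dm*m/M)
    in the 2-D spectral domain; this shifts the image by (dn, dm) cells."""
    N, M = img.shape
    S = np.fft.fft2(img)
    n = np.arange(N)[:, None]
    m = np.arange(M)[None, :]
    S *= np.exp(2j * np.pi * dn * n / N) * np.exp(2j * np.pi * dm * m / M)
    return np.fft.ifft2(S)

# Toy check: a single scatterer at (5, 7) moves to (3, 4) under a (2, 3) shift.
img = np.zeros((8, 8), dtype=complex)
img[5, 7] = 1.0
aligned = compensate_shift(img, 2, 3)
```

For sub-cell misalignments Δ_n, Δ_m the same phase ramp performs a fractional shift, which is what makes this compensation useful for registration.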
3. The method for extracting three-dimensional features of a space target component according to claim 2, wherein said step S2 comprises:
step S2.1, taking the gray-level image BMP_A as an example, calculating the optimal threshold TH of the BMP_A image by the maximum inter-class variance method;
dividing the image into a background part C_1 (below TH) and a target part C_2 (above TH), with probabilities p_1 and p_2 of each pixel in the image being assigned to the background and target classes, the maximum inter-class variance is:
where THG is the global mean, the accumulated mean is taken at gray level m, and the m that maximizes the variance is the OTSU threshold TH;
step S2.2, converting the gray-level image into a binary image according to the OTSU threshold, dilating the image and filling holes to obtain binary images BI_1, BI_2, BI_3.
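Step S2.1's maximum inter-class variance threshold can be written in a few lines of numpy. In the sketch below the dilation uses a plain 3×3 structuring element and hole filling is omitted for brevity; the test image values are illustrative.

```python
import numpy as np

def otsu_threshold(gray):
    """Maximum inter-class variance (Otsu) threshold for an 8-bit gray image."""
    p = np.bincount(gray.ravel(), minlength=256).astype(float)
    p /= p.sum()
    w1 = np.cumsum(p)                    # background class probability
    mu = np.cumsum(p * np.arange(256))   # accumulated mean up to each level
    mu_g = mu[-1]                        # global mean (THG in the claim)
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_g * w1 - mu) ** 2 / (w1 * (1.0 - w1))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def dilate3x3(binary, iterations=1):
    """3x3 binary dilation via shifted ORs (no SciPy dependency)."""
    out = binary.copy()
    for _ in range(iterations):
        padded = np.pad(out, 1)
        acc = np.zeros_like(out)
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                acc |= padded[1 + di:1 + di + out.shape[0],
                              1 + dj:1 + dj + out.shape[1]]
        out = acc
    return out

# Bimodal toy image: dark background (20) with a bright 4x4 target (200).
gray = np.full((16, 16), 20, dtype=np.uint8)
gray[4:8, 4:8] = 200
th = otsu_threshold(gray)
binary = gray > th
dilated = dilate3x3(binary)   # grows the 4x4 target into a 6x6 region
```

Hole filling would typically follow via a flood fill from the border and complementing, as in standard binary morphology pipelines.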
4. A method for three-dimensional feature extraction of a space-target component according to claim 3, wherein said step S3 comprises:
s3.1, acquiring the target binary boundary image processed in the step S2 by using a sobel edge detection algorithm;
s3.2, acquiring a linear line segment in the boundary image by adopting Hough transformation, wherein the linear line segment comprises start and end coordinates and a pointing direction of the line segment;
converting the Cartesian coordinate system into a polar coordinate system:
ρ=xcosθ+ysinθ
mapping each edge pixel (x, y) in the Cartesian coordinate system of the boundary image to its corresponding sinusoid in the polar coordinate system (ρ, θ); after all edge points undergo the Hough transformation, extracting the point (ρ_0, θ_0) with the most intersections in polar coordinates, finding the longest edge segment l_L, and obtaining its start and end coordinates (x_1, y_1), (x_2, y_2) and the line-segment direction
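The ρ = x cos θ + y sin θ voting of step S3.2 can be sketched as a minimal accumulator. The sketch only returns the peak (ρ_0, θ_0); extracting segment endpoints (start/end coordinates and direction) would follow by collecting the inlier pixels of the winning line, which is omitted here.

```python
import numpy as np

def hough_peak(edge):
    """Vote each edge pixel (x, y) onto its sinusoid rho = x*cos(t) + y*sin(t)
    and return the most-voted (rho_0 in cells, theta_0 in degrees)."""
    ys, xs = np.nonzero(edge)                   # row index = y, column index = x
    thetas = np.deg2rad(np.arange(-90, 90))     # 1-degree angular resolution
    diag = int(np.ceil(np.hypot(*edge.shape)))  # bound on |rho|
    acc = np.zeros((2 * diag + 1, thetas.size), dtype=int)
    cols = np.arange(thetas.size)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, cols] += 1
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r - diag, float(np.rad2deg(thetas[t]))

# Toy edge map: a vertical segment at x = 5 should vote to rho ~ 5, theta ~ 0 deg.
edge = np.zeros((20, 20), dtype=bool)
edge[2:15, 5] = True
rho0, theta0 = hough_peak(edge)
```

With 1° bins and integer ρ rounding, neighboring θ bins can tie for short segments, so the recovered angle is only accurate to a couple of degrees in this toy.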
5. The method for extracting three-dimensional features of a space target component according to claim 4, wherein said step S4 comprises:
step S4.1, searching rectangular features of the component by using rectangular descriptors according to Hough transformation results;
the short side and the long side of the rectangle are mutually perpendicular; taking the start coordinate (x_1, y_1) as the start of the short side, search for the short-side end coordinate (x_i, y_i):
min l_D − l(x_i, y_i)
where F_pi is the set of pixel points of the corresponding rectangular region during the short-side search, l_D is the length of the short side of the component rectangle, l(x_i, y_i) is the search length of the rectangular short side, and n_i, n_{i−1} are the numbers of pixel points covered by the rectangular component for the current and previous search values respectively;
6. The method for extracting three-dimensional features of a space target component according to claim 5, wherein said step S5 comprises:
step S5.1, carrying out phase correction on the three-channel radar echo signals obtained in the step S1, and obtaining a three-channel phase form expressed as:
where R_ΔAPO, R_ΔBPO, R_ΔCPO are the signal path histories received by the three receiving radars respectively, and R_ref is the set reference distance;
the original interference phase differences of the three channels are:
the theoretical interference phases of the three channels are:
Δφ_AB = 2πL_1 sin α/λ
Δφ_AC = 2πL_2 sin β/λ
where L_1 and L_2 are the baseline lengths between the A–B and A–C channels respectively, α is the azimuth angle, and β is the pitch angle;
the interference phase is unwrapped using a de-reference-plane method: the direction tangent to the target's motion and perpendicular to the line of sight of antenna A is chosen as the reference plane, the interference phase on the reference plane corresponding to the slant range of antenna A is subtracted from the original interference phase, and the principal value is taken, giving the actual interference phases φ′_AB and φ′_AC;
Taking A, B as an example, the interference phase after the reference plane phase is removed is:
φ′_AB(R_ΔAP0) = φ(R_ΔAP0) − φ_g(R_ΔAP0) + 2kπ;
where k is an integer whose value satisfies −π < φ′_AB ≤ π, φ(R_ΔAP0) is the true phase corresponding to the distance difference R_ΔAP0, φ_g(R_ΔAP0) is the phase on the reference plane corresponding to R_ΔAP0, and R_ΔAP0 is the difference between the antenna-A slant range of the pixel and the reference distance;
solving for the X, Y, Z positions, the radar line-of-sight angle parameters (α, β) are simultaneously obtained;
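Given the unwrapped phases, the theoretical relations Δφ_AB = 2πL_1 sin α/λ and Δφ_AC = 2πL_2 sin β/λ invert directly. The wavelength and baseline lengths in the sketch below are illustrative values, not from the patent:

```python
import math

def los_angles(dphi_ab, dphi_ac, L1, L2, lam):
    """Invert the theoretical interference phases
    dphi_ab = 2*pi*L1*sin(alpha)/lam and dphi_ac = 2*pi*L2*sin(beta)/lam
    for the radar line-of-sight azimuth alpha and pitch beta (radians)."""
    alpha = math.asin(dphi_ab * lam / (2.0 * math.pi * L1))
    beta = math.asin(dphi_ac * lam / (2.0 * math.pi * L2))
    return alpha, beta

# Round trip with assumed parameters (X-band-like wavelength, 100/120 m baselines).
lam, L1, L2 = 0.03, 100.0, 120.0
a_true, b_true = 0.10, 0.05
alpha, beta = los_angles(2 * math.pi * L1 * math.sin(a_true) / lam,
                         2 * math.pi * L2 * math.sin(b_true) / lam,
                         L1, L2, lam)
```

In practice the phases must first be unwrapped (as in the de-reference-plane step above), since the raw interferometric phase is only known modulo 2π.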
step S5.2, constructing a radar imaging plane projection matrix according to the angle obtained in the step S5.1:
where ρ_r is the distance-dimension resolution and ρ_a is the Doppler resolution.
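The projection matrix of step S5.2 is not reproduced in this text. The sketch below shows one common construction under assumed conventions — range axis along the line of sight for azimuth α and pitch β, cross-range along the horizontal perpendicular, each row scaled by the corresponding resolution — and is not necessarily the patent's definition:

```python
import numpy as np

def imaging_projection(alpha, beta, rho_r=1.0, rho_a=1.0):
    """One possible 2x3 radar imaging-plane projection (assumed conventions):
    row 0 = LOS unit vector (range axis, scaled by 1/rho_r),
    row 1 = horizontal perpendicular (cross-range axis, scaled by 1/rho_a)."""
    los = np.array([np.cos(beta) * np.cos(alpha),
                    np.cos(beta) * np.sin(alpha),
                    np.sin(beta)])
    cross = np.array([-np.sin(alpha), np.cos(alpha), 0.0])
    return np.vstack([los / rho_r, cross / rho_a])

# A point at (3, 4, 5) m seen broadside (alpha = beta = 0) with 0.5 m cells
# maps to 6 range cells and 8 cross-range cells under these conventions.
P = imaging_projection(0.0, 0.0, rho_r=0.5, rho_a=0.5)
cells = P @ np.array([3.0, 4.0, 5.0])
```

Dividing by ρ_r and ρ_a converts metric coordinates into image cell coordinates, which is why the resolutions appear in the projection matrix.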
7. The method for extracting three-dimensional features of a space target component according to claim 6, wherein said step S6 comprises:
step S6.1, after the two-dimensional characteristics of the target component are obtained, the three-dimensional projection attitude parameters are obtained through a minimum optimization problem based on the component rectangle priori, and the cost function is as follows:
where −90° ≤ θ ≤ 90° and 0° ≤ ψ ≤ 180°, and (θ, ψ) is the attitude pointing of the target linear structure; a classical particle swarm optimization algorithm is adopted to solve for the optimal projection angles (θ_L, ψ_L; θ_D, ψ_D) of the target component on the imaging plane;
step S6.2, determining the real size of the target component in three dimensions according to the generated projection information of the target component in the image and the extracted two-dimensional extraction information:
where l_L and l_D are the real size parameters of the target rectangular component, and l_L-RD and l_D-RD are the size parameters of the extracted target rectangular component projected onto the ISAR imaging plane, respectively.
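The size recovery of step S6.2 is consistent with the worked numbers in the description (a 6.35 m long side at 29.9172° projects to about 5.5038 m), which suggests a cosine foreshortening model. The sketch below assumes that model; the exact recovery formula in the patent is given by the (unreproduced) equation above:

```python
import math

def true_length(l_rd, angle_deg):
    """Recover a 3-D side length from its ISAR-plane projection, assuming the
    projection shortens the side by cos(angle between side and imaging plane)."""
    return l_rd / math.cos(math.radians(angle_deg))

# Consistency check against the figures quoted in the description.
l_est = true_length(5.5038, 29.9172)
```

This reproduces the 6.35 m long-side estimate from the projected 5.5038 m length to within the rounding of the quoted values.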
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211571024.3A CN116091782A (en) | 2022-12-08 | 2022-12-08 | Three-dimensional feature extraction method for space target component |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116091782A true CN116091782A (en) | 2023-05-09 |
Family
ID=86200127
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211571024.3A Pending CN116091782A (en) | 2022-12-08 | 2022-12-08 | Three-dimensional feature extraction method for space target component |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116091782A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN116608922A (en) * | 2023-05-17 | 2023-08-18 | 小儒技术(深圳)有限公司 | Radar-based water level and flow velocity measurement method and system
CN116608922B (en) * | 2023-05-17 | 2024-04-05 | 小儒技术(深圳)有限公司 | Radar-based water level and flow velocity measurement method and system
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||