CN116383717A - Intelligent comprehensive unmanned aerial vehicle recognition system and method - Google Patents

Intelligent comprehensive unmanned aerial vehicle recognition system and method

Info

Publication number
CN116383717A
CN116383717A (Application CN202310326862.2A)
Authority
CN
China
Prior art keywords
target
radar
module
sub
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310326862.2A
Other languages
Chinese (zh)
Other versions
CN116383717B (en)
Inventor
申娟
宋正鑫
刘传保
张亚洲
董光波
周海涟
李静
刘燕
汪小平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
93209 Troops Of Chinese Pla
Original Assignee
93209 Troops Of Chinese Pla
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 93209 Troops Of Chinese Pla filed Critical 93209 Troops Of Chinese Pla
Priority to CN202310326862.2A priority Critical patent/CN116383717B/en
Publication of CN116383717A publication Critical patent/CN116383717A/en
Application granted granted Critical
Publication of CN116383717B publication Critical patent/CN116383717B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50Systems of measurement based on relative movement of target
    • G01S13/58Velocity or trajectory determination systems; Sense-of-movement determination systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/10Pre-processing; Data cleansing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data
    • G06F18/256Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02Preprocessing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides an intelligent comprehensive unmanned aerial vehicle identification system and method comprising the following steps: receiving and accumulating detection signals of targets through a radar front end; extracting multiple features of the target and performing identification of the target's point-track data using the acquired features; determining whether the target is a real target according to the probability that the point-track data are identified; performing pixelation processing on the point-track data of the target and layering the pixelated image point-track data according to the original point tracks; minimizing the loss function of the network model of a deep-learning-based target tracking algorithm by gradient descent; measuring and calculating the RCS and establishing a UAV RCS feature library; calculating one-dimensional range profiles and building a library; calculating JEM features and establishing a library; and synthesizing track-feature and motion-feature forms and carrying out comprehensive sequential judgment and identification to determine the properties of the target.

Description

Intelligent comprehensive unmanned aerial vehicle recognition system and method
Technical Field
The invention relates to the technical field of unmanned aerial vehicle recognition, in particular to an intelligent comprehensive unmanned aerial vehicle recognition system and method.
Background
With the development and maturation of unmanned aerial vehicle (UAV) technology, UAVs are used ever more widely, bringing efficiency and convenience to daily life, entertainment and work, but at the same time posing a great challenge to air-defense safety. This is especially true of "low, slow, small" UAVs, which fly at low or ultra-low altitude, move slowly and present inconspicuous target signatures. Such aircraft are widely used, and the daily air-defense threat to key cities keeps growing. In recent years, monitoring the "black flights" of various small civil UAVs has become one of the important working directions of urban security in all countries of the world, and intelligent UAV recognition is its greatest difficulty. At present the main equipment detects by optical principles, which is strongly affected by weather and has a very limited working range, while existing equipment that detects UAVs from electromagnetic echoes is heavily affected by clutter, has poor detection capability and cannot identify UAVs intelligently.
Disclosure of Invention
In view of the above, the present invention aims to overcome the above drawbacks of the prior art, and a first aspect of the present invention provides an intelligent comprehensive unmanned aerial vehicle recognition method, which includes the following steps:
step 1, receiving and accumulating detection signals of targets through a radar front end;
step 2, extracting multiple characteristics of a target from the detection signal received by the front end of the radar;
step 3, performing identification of the point-track data of the target using the acquired multiple features; determining whether the target is a real target according to the point-track data features of the target, and if a real target is determined, turning to step 4; if a false target is determined, returning to step 2;
step 4, pixelation processing is carried out on the point track data of the target, and layering processing is carried out on the image point track data after the pixelation processing according to the original point track;
step 5, calculating the radar cross section RCS of the real target of step 3, wherein the RCS value σ of the target is expressed as:
σ = K · (S/N) · R⁴
wherein the performance constant K corresponds to:
K = σ₀ / ((S/N)₀ · R₀⁴)
in the formula, R₀ is the calibrated radar standard distance, σ₀ is the radar cross section RCS of the calibration target, (S/N)₀ is the detectable signal-to-noise ratio, S/N is the measured signal-to-noise ratio, and R is the distance from the real target to the radar;
step 6, carrying out modeling simulation of the unmanned aerial vehicle, and establishing a UAV RCS feature library according to the radar cross section RCS simulation results; calculating the one-dimensional range profile p(t) of the target using wideband data:
p(t) = √(i²(t) + q²(t))
wherein t is time, i(t) is the in-phase component of the radar signal echo and q(t) is the quadrature component of the radar signal echo;
step 7, determining a real target to be identified according to the result of the step 4-6; and acquiring a plurality of characteristics of the real target to be identified, and determining the type of the target according to the identification of the plurality of characteristics.
The method according to the first aspect of the present invention, wherein the object is a moving object, step 3 comprises: the probability that the point track data of the target is identified is expressed as:
P((X(t+Δr) − X(t)) / Δr^H < x) = (1/√(2π)) ∫_{−∞}^{x} e^{−u²/2} du
wherein P(·) denotes probability, t is time, X(t) represents a feature and is a continuous function of t, Δr is the time increment with Δr > 0, and H is the slope estimated by least squares; x ranges over a numerical interval around the target track.
The method according to the first aspect of the invention, step 4 comprises the sub-steps of:
step 4.1, the layering processing of the original point tracks comprises: establishing a two-dimensional matrix from the position information (range and azimuth) of the target point-track data, and dividing different layers on the basis of the matrix representation of the azimuth information to represent different attribute features of the target point track;
step 4.2, based on a deep-learning target tracking algorithm, adopting a gradient descent method to minimize the loss function of the network model, and adjusting the weight parameters of each layer of the layered processing in reverse, layer by layer; the precision of the network model is improved through repeated iterative training.
The method according to the first aspect of the present invention, step 5 further comprises: measuring a plurality of calibration targets at predetermined distances with the radar to obtain the fixed parameters of the calibration targets, including the signal-to-noise ratio, the distance and the radar cross section RCS; the performance constant is calculated from these calibrated parameters and then applied to the RCS calculation of a measured real target, yielding the RCS result of the target to be identified.
The method according to the first aspect of the invention, step 7 comprises:
step 7.1, judging the type of a real target according to an original point track layering processing result of the real target, a radar scattering cross section area RCS of the real target and a one-dimensional range profile of the real target, and determining a low-altitude slow-speed aircraft target in the real target;
step 7.2, extracting a plurality of characteristics of the low-altitude slow-speed aircraft target, respectively identifying the target according to the plurality of characteristics, and carrying out information fusion on the identification result in a sequential manner; and determining whether the target belongs to an unmanned aerial vehicle target according to the fusion result.
The types of the real targets include: high altitude high speed aircraft, medium low altitude high speed aircraft and low altitude low speed aircraft.
A second aspect of the present invention proposes an intelligent comprehensive unmanned aerial vehicle recognition system for executing the foregoing intelligent comprehensive unmanned aerial vehicle recognition method, the system comprising: a radar front end and a radar rear end; the front end of the radar is an antenna array surface, and the rear end of the radar is all arranged in the comprehensive treatment box;
the radar front end includes: an antenna array surface formed by a plurality of active subarrays; each active subarray is composed of a chip T/R component; the active subarray receives a frequency signal of a frequency source and a control signal of a radar control system, and provides a signal detected by the active subarray to a signal processing module in the form of an optical signal through a downlink data bus;
the radar back end includes: the system comprises a radar control system module, a frequency source module, a time service equipment module, an optical power division module, a signal processing module and a data processing module;
the optical power-division module divides the uplink frequency-source signal and the radar control signal, after electro-optical and opto-electrical conversion, into multiple channels of clock power signals, local-oscillator signals and control signals, which are provided respectively to the multiple active subarrays.
The system according to the second aspect of the present invention, wherein the active subarray of the radar front end comprises a chip synthesis layer and a digital transceiver module; each active subarray is integrally constructed by a chip integrated layer, and radio frequency channels of the T/R assembly are connected through printed board wiring; the integrated layer printed board is directly connected with the antenna unit, the T/R assembly and the power supply;
the digital transceiver module includes: the device comprises a multichannel digital board, a frequency mixing module, an active subarray front-end drive amplifier, a radio frequency switching cable and an accessory structure.
The system according to the second aspect of the present invention, the signal processing module of the radar back-end includes the following sub-modules: the system comprises a narrow pulse eliminating sub-module, a pulse compressing sub-module, an anti-synchronous sub-module, a moving target detecting sub-module, a self-adaptive digital beam forming sub-module, a clutter map sub-module, a constant false alarm processing sub-module and a target identifying sub-module;
the data processing module of the radar back end comprises: the system comprises a self-adaptive coherence estimation sub-module, an iterative angle measurement sub-module, a point track processing sub-module and an energy scheduling sub-module.
The system according to the second aspect of the present invention, the point track processing submodule includes: correlation processing unit, smoothing processing unit and filtering processing unit.
According to the system of the second aspect of the invention, the display control terminal at the rear end of the radar adopts a portable remote terminal, and the original video and the point track are simultaneously transmitted to the display control terminal for display; the portable remote terminal is capable of performing a remote control operation on the radar.
The beneficial effects of the invention are as follows: the intelligent comprehensive unmanned aerial vehicle recognition system and method provide a complete solution for UAV detection and recognition. The radar front end is extensible: the antenna array face is expandable, multifunctional and multi-beam, and the power aperture is changed according to the requirements of the UAV-detection scene. Comprehensive feature recognition over track features, motion features, modulation features, radar cross section RCS features and the like improves the accuracy of single-class recognition, and information fusion through sequential recognition improves the reliability of the classification results. The invention is technically advanced, and the proposed system and method can guide engineering implementation and have high application value.
Drawings
FIG. 1 is a front end component structure of an intelligent comprehensive unmanned aerial vehicle recognition system of the invention;
FIG. 2 is a rear end component structure of the intelligent comprehensive unmanned aerial vehicle recognition system of the invention;
FIG. 3 is a functional block diagram of an active sub-array module of the present invention;
FIG. 4 is a block diagram of an optical fiber transmission link of the present invention;
FIG. 5 is a block diagram of a clock, local oscillator link of the present invention;
FIG. 6 is a diagram of a point track data pixelation process of the present invention;
FIG. 7 is a flow chart of a point track data layering process of the present invention;
FIG. 8 is a schematic diagram of the overall architecture of a convolutional neural network of the present invention;
FIG. 9 is a schematic diagram of the overall sequential recognition accuracy of the present invention;
fig. 10 is a sequential identification flow chart of the present invention.
Detailed Description
The invention aims to provide an intelligent comprehensive unmanned aerial vehicle identification system and method. In engineering practice, the detection capability of the system can be adjusted to the scene to achieve optimal matching between detection and recognition of the UAV flight scene; comprehensive properties of detected targets, such as the radar cross section RCS, jet engine modulation JEM and motion characteristics, are exploited to identify targets comprehensively, distinguish UAVs from other targets, and provide accurate and reliable technical conditions for counter-UAV air-defense support.
The invention is further described below with reference to the drawings and the detailed description.
The first aspect of the invention provides an intelligent comprehensive unmanned aerial vehicle identification method, which comprises the following steps:
step 1, receiving and accumulating detection signals of targets through a radar front end;
step 2, extracting multiple characteristics of a target from the detection signal received by the front end of the radar;
step 3, performing identification of the point-track data of the target using the acquired multiple features; determining whether the target is a real target according to the point-track data features of the target, and if a real target is determined, turning to step 4; if a false target is determined, returning to step 2;
step 4, pixelation processing is carried out on the point track data of the target, and layering processing is carried out on the image point track data after the pixelation processing according to the original point track;
step 5, calculating the radar cross section RCS of the real target of step 3, wherein the RCS value σ of the target is expressed as:
σ = K · (S/N) · R⁴
wherein the performance constant K corresponds to:
K = σ₀ / ((S/N)₀ · R₀⁴)
in the formula, R₀ is the calibrated radar standard distance, σ₀ is the radar cross section RCS of the calibration target, (S/N)₀ is the detectable signal-to-noise ratio, S/N is the measured signal-to-noise ratio, and R is the distance from the real target to the radar;
step 6, carrying out modeling simulation of the unmanned aerial vehicle, and establishing a UAV RCS feature library according to the radar cross section RCS simulation results; calculating the one-dimensional range profile p(t) of the target using wideband data:
p(t) = √(i²(t) + q²(t))
wherein t is time, i(t) is the in-phase component of the radar signal echo and q(t) is the quadrature component of the radar signal echo;
step 7, determining a real target to be identified according to the result of the step 4-6; and acquiring a plurality of characteristics of the real target to be identified, and determining the type of the target according to the identification of the plurality of characteristics.
In step 1, the multiple active subarrays at the radar front end transmit radar waves and receive the echo signals of targets; the T/R components of the active subarrays convert the echo signals into optical signals as detection signals, which are transmitted directly through the downlink data bus to the signal processing module at the radar back end, where the detection signals are processed and accumulated. In the signal processing module the detection signals undergo signal-processing flows such as narrow-pulse rejection, pulse compression, anti-asynchronous-interference processing and moving target detection (MTD).
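As an illustration of the pulse compression stage in this chain, the following minimal sketch correlates a received echo with an assumed linear-FM transmit waveform; the chirp parameters, target position and noise level are illustrative assumptions, not values from the patent.

```python
import numpy as np

def lfm_pulse(bandwidth_hz, width_s, fs_hz):
    """Baseband linear-FM (chirp) pulse; an assumed transmit waveform."""
    t = np.arange(0.0, width_s, 1.0 / fs_hz)
    k = bandwidth_hz / width_s                       # chirp rate
    return np.exp(1j * np.pi * k * t ** 2)

def pulse_compress(echo, ref):
    """Matched filter: convolve echo with the conjugated, time-reversed reference."""
    return np.convolve(echo, np.conj(ref[::-1]), mode="same")

fs = 10e6
ref = lfm_pulse(bandwidth_hz=5e6, width_s=20e-6, fs_hz=fs)   # 200 samples
rng = np.random.default_rng(0)
echo = 0.05 * (rng.standard_normal(2048) + 1j * rng.standard_normal(2048))
echo[600:600 + ref.size] += 0.5 * ref                # target pulse starting at bin 600

compressed = pulse_compress(echo, ref)
# with mode="same", the compressed peak appears near the pulse centre (~bin 700)
print("peak bin:", int(np.argmax(np.abs(compressed))))
```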
In the step 2, the plurality of features includes: target track, target motion parameters, jet engine modulation JEM, radar cross-sectional area RCS, and one-dimensional range profile parameters.
The method according to the first aspect of the present invention, wherein the object is a moving object, step 3 comprises: the probability that the point track data of the target is identified is expressed as:
P((X(t+Δr) − X(t)) / Δr^H < x) = (1/√(2π)) ∫_{−∞}^{x} e^{−u²/2} du
wherein P(·) denotes probability, t is time, X(t) represents a feature and is a continuous function of t, Δr is the time increment with Δr > 0, and H is the slope estimated by least squares; x ranges over a numerical interval around the target track.
Firstly, a point track layer is identified and distinguished by adopting a fractional Brownian motion model, and the method is defined as follows: on a certain probability space, the random process X with an index H (0 < H < 1) satisfies:
a) With probability 1 there is X (0) =0, and X (t) is a continuous function of t;
b) for any t ≥ 0 and Δr ≥ 0, the increment X(t+Δr) − X(t) obeys a Gaussian distribution with mean 0 and variance Δr^(2H), so there is
P((X(t+Δr) − X(t)) / Δr^H < x) = (1/√(2π)) ∫_{−∞}^{x} e^{−u²/2} du
where P(·) denotes probability. From the above formula, the increment X(t+Δr) − X(t) is stationary and its variance is proportional to Δr^(2H), i.e.
var{X(t+Δr) − X(t)} ∝ Δr^(2H)
The above formula is changed into the following form:
var{X(t+Δr) − X(t)} = E[X(t+Δr) − X(t)]² = K·Δr^(2H)
Taking logarithms of both sides gives:
log var{X(t+Δr) − X(t)} = 2H·log Δr + C
wherein C is a constant. From this equation, only the data pairs (Δr, var{X(t+Δr) − X(t)}) need be calculated; the slope H is then estimated by least squares to obtain the fractal dimension D, together with the intercept C, which is related to the surface variation-rate coefficient K of the object.
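A minimal numpy sketch of this estimation, assuming a uniformly sampled feature sequence X(t): compute var{X(t+Δr) − X(t)} over several lags and least-squares-fit the log–log line, whose slope is 2H. The lag range and the Brownian test signal are illustrative assumptions.

```python
import numpy as np

def estimate_hurst(x, lags=range(2, 20)):
    """Fit log var{X(t+dr) - X(t)} = 2H log dr + C by least squares."""
    drs = np.array(list(lags))
    variances = np.array([np.var(x[dr:] - x[:-dr]) for dr in drs])
    slope, intercept = np.polyfit(np.log(drs), np.log(variances), 1)
    H = slope / 2.0
    D = 2.0 - H               # fractal dimension of the 1-D profile
    return H, D, intercept

# synthetic check: ordinary Brownian motion should give H close to 0.5
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4096))
H, D, C = estimate_hurst(x)
print(f"H = {H:.2f}, fractal dimension D = {D:.2f}")
```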
The spatial correlation features reflect the correlation between target units or clutter units, and in fact reflect the size of the target (relative to the radar resolution cell) and its internal relative motion. The spatial distribution is mainly represented in three dimensions: range, elevation and azimuth.
The distance correlation coefficient c_Δr is the correlation coefficient between the pulse sequence of the range cell under test and the pulse sequences of the surrounding range cells, defined as:
c_Δr = |Σ_p x(r_m, p) · x*(r_m + Δr, p)| / √(Σ_p |x(r_m, p)|² · Σ_p |x(r_m + Δr, p)|²)
wherein x(r_m, p) is the echo signal of the p-th pulse on the r_m-th range cell.
Distance-profile features are further extracted from the pulse-compression data. Let the amplitude of an echo point be a_(Δr,p), where Δr is the range cell and p is the pulse sequence number; assuming the number of distance-profile points is L, the distance-profile sequence of the echo is defined as:
y = (a_(Δr−L,1), …, a_(Δr,1), …, a_(Δr+L,1), …, a_(Δr−L,p), …, a_(Δr,p), …, a_(Δr+L,p), …, a_(Δr−L,P), …, a_(Δr,P), …, a_(Δr+L,P))
The N total samples can be divided into two classes N1 and N2, corresponding respectively to target and clutter. Let y_ij denote the j-th distance-profile training sample of the i-th class of echoes; y_ij is an n-dimensional vector, where i = 1, 2; j = 1, 2, …, h; and n = (2L+1)·P. Assuming the mean and covariance matrix of class N_i are μ_i and C_i respectively, then:
μ_i = (1/h) Σ_{j=1…h} y_ij
C_i = (1/h) Σ_{j=1…h} (y_ij − μ_i)(y_ij − μ_i)^T
Then, for a sample y to be judged, the Mahalanobis distance from y to class N_i is defined as:
d_i² = d²(y, N_i) = (y − μ_i)^T C_i⁻¹ (y − μ_i)
The distance-profile coefficient d_m of the echo is defined as:
d_m = d_1² − d_2²
representing the difference between the Mahalanobis distances of the echo sample to the target class and to the clutter class.
The above is the process of obtaining the point track characteristics.
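As an illustration of the distance-profile discrimination just described, the sketch below estimates each class's mean and covariance from training profiles and evaluates d_m = d_1² − d_2² for a test sample; the class statistics, sample counts and covariance regularization are illustrative assumptions.

```python
import numpy as np

def fit_class(samples, eps=1e-6):
    """Sample mean and (regularized) inverse covariance of one echo class."""
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False) + eps * np.eye(samples.shape[1])
    return mu, np.linalg.inv(cov)

def mahalanobis_sq(y, mu, cov_inv):
    d = y - mu
    return float(d @ cov_inv @ d)

rng = np.random.default_rng(1)
n = 12                                     # profile dimension n = (2L+1)P (toy)
targets = rng.normal(1.0, 0.3, (100, n))   # class N1: target echoes
clutter = rng.normal(0.0, 0.3, (100, n))   # class N2: clutter echoes

mu1, ci1 = fit_class(targets)
mu2, ci2 = fit_class(clutter)

y = rng.normal(1.0, 0.3, n)                # sample to be judged
d_m = mahalanobis_sq(y, mu1, ci1) - mahalanobis_sq(y, mu2, ci2)
print("d_m =", round(d_m, 2), "->", "target" if d_m < 0 else "clutter")
```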
In step 4, the pixelation of the point-track data is a preprocessing stage for image-like target tracking. Typically, point-track data include the range and azimuth of the target, together with the data-acquisition time and the signal-to-noise ratio, amplitude, range span and azimuth span of each plot. During pixelation, the position information of the target is converted into image pixels, i.e. RGB imaging is performed according to the range and angle information of the target plots. First, revolutions (scans) are marked according to the time and azimuth information of all plots, and plots from different revolutions are imaged with different RGB values. Fig. 6 shows the process of pixelating the point-track data.
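A sketch of that pixelation, assuming plot records of (scan index, range, azimuth): each radar revolution is assigned a distinct RGB value and the plots are rasterized into a range–azimuth image. The grid size, maximum range and color ramp are assumptions chosen for illustration.

```python
import numpy as np

def pixelate_plots(plots, n_range=128, n_az=128, r_max=10000.0):
    """Rasterize point tracks (scan, range_m, azimuth_deg) into an RGB image,
    coloring plots from each scan (radar revolution) differently."""
    img = np.zeros((n_range, n_az, 3), dtype=np.uint8)
    scans = sorted({p[0] for p in plots})
    for scan, r, az in plots:
        i = min(int(r / r_max * n_range), n_range - 1)
        j = min(int(az / 360.0 * n_az), n_az - 1)
        shade = int(255 * (scans.index(scan) + 1) / len(scans))
        img[i, j] = (shade, 255 - shade, 128)     # per-scan RGB coding
    return img

plots = [(0, 2500.0, 45.0), (1, 2550.0, 46.0), (2, 2600.0, 47.2)]
image = pixelate_plots(plots)
print("non-empty pixels:", int((image.sum(axis=2) > 0).sum()))
```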
According to the scene requirements of the detection tasks, the UAV-detection tasks are modeled; the tasks to be executed by the system can be represented by the following mathematical model:
R_i = {t_ai, l_i, L_i, η_i, p_i}, i = 1, 2, 3, …
wherein l_i denotes the time window of each task request event; t_ai denotes the arrival time of each task request event; L_i denotes the length of execution time required by each task; η_i denotes the percentage of antenna-array resources each specific task must occupy; and p_i denotes the working-mode priority of each task. The time window is the valid range within which the actual execution time of a beam dwell may move around its expected execution time; if the actual execution time moves beyond this range while the beam dwell has not yet been executed, the system beam request is considered to have failed. The size of the time window may be determined and calculated from a reference.
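The task-request model maps naturally onto a small record type; a minimal sketch follows, checking whether a beam dwell's actual execution time falls inside the request's time window (the field names transliterate the symbols above, and the window test is an assumption about how l_i is applied).

```python
from dataclasses import dataclass

@dataclass
class TaskRequest:
    t_arrive: float   # t_ai : arrival time of the request event
    window: float     # l_i  : time window around the expected execution time
    duration: float   # L_i  : required execution-time length
    resources: float  # eta_i: fraction of antenna-array resources occupied
    priority: int     # p_i  : working-mode priority

def dwell_ok(task: TaskRequest, expected_t: float, actual_t: float) -> bool:
    """The beam request succeeds only if the dwell starts within the window."""
    return abs(actual_t - expected_t) <= task.window

task = TaskRequest(t_arrive=0.0, window=0.05, duration=0.01, resources=0.25, priority=2)
print(dwell_ok(task, expected_t=1.00, actual_t=1.03))   # True: inside the window
```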
The method according to the first aspect of the invention, step 4 comprises the sub-steps of:
step 4.1, the layering processing of the original point tracks comprises: establishing a two-dimensional matrix from the position information (range and azimuth) of the target point-track data, and dividing different layers on the basis of the matrix representation of the azimuth information to represent different attribute features of the target point track;
step 4.2, based on a deep-learning target tracking algorithm, adopting a gradient descent method to minimize the loss function of the network model, and adjusting the weight parameters of each layer of the layered processing in reverse, layer by layer; the precision of the network model is improved through repeated iterative training.
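Step 4.2's training loop can be illustrated with a minimal two-layer network trained by gradient descent, where the backward pass adjusts the weights of each layer in reverse order (backpropagation); the toy architecture, data, labels and learning rate below are stand-ins rather than the patent's network.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((256, 16))               # flattened point-track patches (toy)
y = (X.sum(axis=1) > 0).astype(float)[:, None]   # toy labels: real vs false plot

W1 = rng.standard_normal((16, 8)) * 0.1; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.1;  b2 = np.zeros(1)
lr = 0.1

for epoch in range(200):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    # backward pass: adjust weights layer by layer, output layer first
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dh = dz2 @ W2.T * (1 - h ** 2)
    dW1, db1 = X.T @ dh, dh.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"final cross-entropy loss: {loss:.3f}")
```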
Step 5 will be specifically described below. Calculating radar cross-sectional area RCS requires a calibration procedure according to the radar equation:
S/N = (P_t · G_t · G_r · λ² · σ) / ((4π)³ · k · T_e · B_n · L · R⁴)
wherein P_t is the radar transmitter power, G_t is the antenna transmit gain, G_r is the antenna receive gain, λ is the radar wavelength, k is Boltzmann's constant, T_e is the noise temperature, B_n is the receiver bandwidth, L is the system loss, σ is the target RCS and R is the range.
A standard radar cross section RCS is calibrated at the standard distance R₀:
(S/N)₀ = (P_t · G_t · G_r · λ² · σ₀) / ((4π)³ · k · T_e · B_n · L · R₀⁴)
Multiple tests are carried out, and the performance constant is obtained from the signal-to-noise ratio, the distance and the standard RCS of the calibration target at the corresponding distance:
K = σ₀ / ((S/N)₀ · R₀⁴)
Then the radar cross section RCS of the object under measurement is calculated as:
σ = K · (S/N) · R⁴
The error-estimation formula for the RCS value of a target measured by the radar is:
[equation image not reproduced in the source]
wherein L_a is the atmospheric attenuation, T_e is the input noise temperature, B_n is the receiver bandwidth, L_f is the feeder loss, A_M is the receiver gain, N_F is the noise figure, R is the distance, and G is the antenna gain.
Fig. 7 shows a layering process flow of the point track data. Fig. 8 shows the overall architecture of the convolutional neural network.
The method according to the first aspect of the present invention, step 5 further comprises: measuring a plurality of calibration targets at predetermined distances with the radar to obtain the fixed parameters of the calibration targets, including the signal-to-noise ratio, the distance and the radar cross section RCS; the performance constant is calculated from these calibrated parameters and then applied to the RCS calculation of a measured real target, yielding the RCS result of the target to be identified.
The types of the targets include: high altitude high speed aircraft, medium low altitude high speed aircraft and low altitude low speed aircraft.
In step 6, performing modeling simulation on the unmanned aerial vehicle comprises the following steps:
calculating a one-dimensional range profile, the echo signal may be expressed as:
g(t) = i(t)·cos(2πf₀t) + q(t)·sin(2πf₀t)
wherein i (t) and q (t) are orthogonal components of g (t), and are real functions. Then:
g(t) = p(t)·cos(2πf₀t + φ(t)), φ(t) = −arctan(q(t)/i(t))
wherein p(t) is the envelope of g(t), i.e. the one-dimensional range profile of the target, whose calculation formula is:
p(t) = √(i²(t) + q²(t))
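Forming p(t) from wideband I/Q samples is then a one-line computation; the sketch below builds the range profile of a toy two-scatterer echo (scatterer positions, amplitudes and noise level are assumptions).

```python
import numpy as np

def range_profile(i_samples, q_samples):
    """One-dimensional range profile: p(t) = sqrt(i(t)^2 + q(t)^2)."""
    return np.sqrt(np.asarray(i_samples) ** 2 + np.asarray(q_samples) ** 2)

rng = np.random.default_rng(2)
n = 512
echo = 0.02 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
echo[100] += 1.0 + 0.2j       # nose scatterer (toy)
echo[130] += 0.6 - 0.1j       # tail scatterer (toy) -> radial extent ~30 bins

p = range_profile(echo.real, echo.imag)
peaks = np.where(p > 0.5)[0]
print("scatterer bins:", peaks, "-> extracted length:", peaks.max() - peaks.min(), "bins")
```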
for spectral modulation, considering only the unmanned aerial vehicle propeller blade scatter component, the echo signal complex envelope can be expressed as:
[equation image not reproduced in the source]
wherein N is the number of blades and a(t) is the amplitude-modulation signal:
[equation image not reproduced in the source]
and φ(t) is the phase-modulation signal:
[equation image not reproduced in the source]
wherein L is the effective length of the blade, λ is the working wavelength of the system, β is the target elevation angle, f_j is the rotation frequency of the blades, and θ₀ is the initial rotation phase angle of the reference propeller.
[equation image not reproduced in the source]
wherein f_d is the Doppler shift frequency. After Fourier transformation one obtains:
S(f) = Σ_p C_p · δ(f − f_d − p·N·f_j)
It can be seen that the spectrum is a sum of Dirac delta functions centered at the frequency f_d, with adjacent spectral lines spaced N·f_j apart.
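To see the line structure this feature keys on, one can simulate a rotor-modulated echo and inspect its spectrum; the blade count, rotation rate, Doppler shift and the simple flash-modulation model below are illustrative assumptions rather than the patent's exact expressions.

```python
import numpy as np

fs, n = 8192.0, 8192                      # sample rate (Hz) and dwell length: assumed
t = np.arange(n) / fs
f_d, n_blades, f_rot = 500.0, 3, 40.0     # body Doppler, blade count, rev rate: assumed

# toy JEM model: body return at f_d, amplitude-modulated by blade flashes
# occurring N * f_rot times per second
flash = (np.cos(2 * np.pi * n_blades * f_rot * t) > 0.99).astype(float)
echo = np.exp(2j * np.pi * f_d * t) * (1.0 + 5.0 * flash)

spec = np.abs(np.fft.fftshift(np.fft.fft(echo))) / n
freqs = np.fft.fftshift(np.fft.fftfreq(n, 1.0 / fs))
lines = np.sort(freqs[spec > 0.1])
print("strongest line (Hz):", freqs[np.argmax(spec)])           # the centre f_d
print("typical line spacing (Hz):", np.median(np.diff(lines)))  # ~ N * f_rot = 120
```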
The method according to the first aspect of the invention, step 7 comprises:
step 7.1, judging the type of a real target according to an original point track layering processing result of the real target, a radar scattering cross section area RCS of the real target and a one-dimensional range profile of the real target, and determining a low-altitude slow-speed aircraft target in the real target;
step 7.2, extracting a plurality of characteristics of the low-altitude slow-speed aircraft target, respectively identifying the target according to the plurality of characteristics, and carrying out information fusion on the identification result in a sequential manner; and determining whether the target belongs to an unmanned aerial vehicle target according to the fusion result.
The plurality of features includes: target track, target motion parameters, jet engine modulation JEM, radar cross-sectional area RCS and one-dimensional range profile parameters;
FIG. 9 shows the overall recognition accuracy; fig. 10 shows a sequential identification flow chart.
A second aspect of the present invention proposes an intelligent comprehensive unmanned aerial vehicle recognition system for executing the foregoing intelligent comprehensive unmanned aerial vehicle recognition method, the system comprising: a radar front end and a radar rear end; the front end of the radar is an antenna array surface, and the rear end of the radar is all arranged in the comprehensive treatment box;
the radar front end includes: an antenna array surface formed by a plurality of active subarrays; each active subarray is composed of a chip T/R component; the active subarray receives a frequency signal of a frequency source and a control signal of a radar control system, and provides a signal detected by the active subarray to a signal processing module in the form of an optical signal through a downlink data bus;
the radar back end includes: the system comprises a radar control system module, a frequency source module, a time service equipment module, an optical power division module, a signal processing module and a data processing module;
the optical power-division module divides the uplink frequency-source signal and the radar control signal, after electro-optical and opto-electrical conversion, into multiple channels of clock power signals, local-oscillator signals and control signals, which are provided respectively to the multiple active subarrays.
The overall architecture of the system is designed according to the scene and the actual requirements of UAV detection. Intelligent change of the detection aperture is realized: the power aperture can be switched according to the detection requirements for UAVs and similar targets in different range segments, making the system energy-saving, environmentally friendly, efficient and intelligent. The system consists of the front end and the back end of the intelligent comprehensive unmanned aerial vehicle recognition system, referred to simply as the radar front end and the radar back end.
As shown in fig. 1, the radar front end mainly consists of the antenna array face (comprising 4 active subarrays, and expandable), and the radar back end is a comprehensive processing box. Signal transmission between the radar front end and back end is executed over optical fiber: downlink data are transmitted directly to the signal processing unit through optical fiber, and uplink signals are transmitted through the optical power-division module. Fig. 3 shows the functional block diagram of the active subarray.
The comprehensive processing box mainly consists of the cabinet body, one frequency source, one integrated processing module (covering signal processing and data processing), one time-service device, one optical power-division unit, one communication component and other equipment. The optical power division comprises one clock power divider, two local-oscillator power dividers and one monitoring power divider. The box adopts a lightweight aluminum-skin case structure. The back-end composition of the intelligent comprehensive unmanned aerial vehicle recognition system inside the comprehensive processing box is shown in fig. 2.
In the design of the front end of the radar, the development trend of the modular expandable digital array surface is combined, comprehensive consideration is carried out from the aspects of electric performance index realization, structural complexity, mechanical property, thermal control, novel material selection and application, antenna light weight degree, realization technical difficulty and the like, the system architecture of the phased array antenna array surface is determined, basic conceptual scheme research is completed, technical index distribution is reasonably carried out, and a basic framework is provided for the modular expandable antenna array surface layering mode.
The antenna array surface adopts a modular architecture and has the expandable capacity. The active subarray functional modules are basic expansion modules of an array surface, and the array surface is formed by horizontally splicing 4 active subarray functional modules. The active subarray functional module consists of a chip integrated layer and a digital transceiver module.
The active subarrays at the left and right ends are positioned and fixed to the bracket with screws and locating pins. The mounting structure is integral to the active subarrays and designed conformally with them, so no additional structure is needed. Two adjacent active subarrays are connected with 4 screws: locating bolts at the top and the bottom, and floating bolts in the middle.
One integrated layer and one digital transceiver module together form a functional module. The antenna units are arranged in 8 rows and 8 columns, horizontally staggered. The integrated layer comprises 64 antenna units, a high-frequency network printed board, a low-frequency wiring printed board, 32 dual-channel T/R components and a group of secondary power-supply chips. The digital transceiver module comprises one 8-channel digital board, a frequency-mixing module, an active-subarray pre-driver amplifier, radio-frequency transfer cables and other auxiliary structures.
The antenna array surface has the scalability. The active subarray functional modules are basic expansion modules of an array surface, and the array surface is formed by horizontally splicing 4 active subarray functional modules. The active subarray functional module consists of a chip integrated layer and a digital transceiver module. The active subarray functional module is an expandable construction unit, has complete and independent array surface receiving and transmitting functions and interfaces such as optical fibers, radio frequency, power supplies and the like, is convenient for quick expansion of array surfaces in different scales, and reduces the change of other equipment after the caliber is expanded.
When the radar is powered on, each module of the subsystem first completes a self-check of its working state, then carries out interconnection checks among the subsystem's internal modules, and after these pass, tests the subsystem's key parameters and whole-machine parameters. The radar starts working after the power-on self-test is normal: first a direct digital frequency synthesizer (DDS) is controlled to generate a radio-frequency signal, which, after transmit pre-stage amplification and final-stage power amplification, is output and radiated through the antenna. On receive, echo signals pass from the receiving antenna units to the receiving components, where low-noise amplification, filtering, network synthesis, mixing and sampling yield baseband I/Q signals that are sent to the signal-processing subsystem. The signal-processing subsystem implements narrow-pulse rejection, pulse compression, anti-asynchronous-interference processing, moving target detection (MTD), adaptive digital beam forming (ADBF), clutter mapping and constant-false-alarm-rate processing (CFAR). The data-processing subsystem completes adaptive coherent estimation (ACE), iterative angle measurement, point-track processing (including correlation, smoothing and filtering) and energy scheduling, and the raw video and point tracks are simultaneously sent to a terminal for display.
When the active subarray transmits, the direct digital frequency synthesizer point frequency signal of the digital receiving and transmitting channel mixes with the local oscillation signal to generate 8 paths of transmitting excitation signals, each path of power is divided into 8 paths of transmitting excitation signals which are input to 8 transmitting channels of 8 TR components in the active subarray, the excitation signals form radio frequency signals after being amplified by phase shifting and power amplifying chips in the TR components, and the radio frequency signals are fed to an antenna unit through a winding layer and radiated to the space.
When the active subarray is received, the antenna unit receives a free space radio frequency echo signal, the free space radio frequency echo signal is sent to a receiving channel of a T/R assembly, then low-noise amplification and phase shifting are carried out, 8 receiving channels of 8 TR assemblies in each row in the active subarray are subjected to power synthesis by a synthesis layer 1:8 synthesizer, 1 path of radio frequency signals are formed and output to a digital receiving channel, the synthesis layer has 8 rows of 1:8 synthesizers, and 8 paths of synthesized radio frequency signals can be formed and sent to the first 8 channels of digital receiving. After entering a digital channel, the digital signal is transmitted to a Digital Beam Forming (DBF) system in the form of an optical signal after analog-to-digital conversion (ADC), digital Down Conversion (DDC), digital compensation, data packaging and electro-optic conversion.
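The digital beam forming that these synthesized channels feed reduces to a weighted sum across channels; the sketch below steers an assumed 8-channel uniform column toward a chosen angle (half-wavelength spacing, signal and noise levels are illustrative assumptions).

```python
import numpy as np

def dbf_weights(n_chan, steer_deg, d_over_lambda=0.5):
    """Uniform-array weights steering the beam to steer_deg (broadside = 0 deg)."""
    k = np.arange(n_chan)
    return np.exp(2j * np.pi * d_over_lambda * k * np.sin(np.radians(steer_deg))) / n_chan

def beamform(channels, weights):
    """channels: (n_chan, n_samples) baseband data -> single beam output."""
    return weights.conj() @ channels

rng = np.random.default_rng(3)
n_chan, n_samp, target_deg = 8, 1000, 20.0
steer_vec = np.exp(2j * np.pi * 0.5 * np.arange(n_chan) * np.sin(np.radians(target_deg)))
channels = steer_vec[:, None] * np.ones(n_samp)          # plane wave from 20 degrees
channels += 0.1 * (rng.standard_normal((n_chan, n_samp))
                   + 1j * rng.standard_normal((n_chan, n_samp)))

on_target = np.abs(beamform(channels, dbf_weights(n_chan, 20.0))).mean()
off_target = np.abs(beamform(channels, dbf_weights(n_chan, -40.0))).mean()
print(f"mean |output| on target: {on_target:.2f}, off target: {off_target:.2f}")
```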
Fig. 4 is a block diagram of a fiber optic transmission link.
The array surface adopts a highly light-weighted chip T/R assembly, and compared with the traditional brick-type assembly, the resin packaging technology is adopted, so that the traditional metal shell is saved. The number of the array surface T/R components is large, and the weight of the array surface structure shell is greatly reduced by using the chip-type T/R components.
The antenna unit and the comprehensive layer are integrally designed. The T/R assembly radio frequency channel is directly connected with the antenna unit through a printed wiring. The design saves the self structure, the reflecting plate structure, the feeder line, the winding layer, the connector and the like of the traditional antenna unit, greatly reduces the structural weight of the antenna unit, and is beneficial to the design of array face light weight.
The integrated layer printed board is communicated with the antenna unit, the T/R assembly, the power supply and other modules, so that the network inside the integrated layer can be directly realized through the printed board strip line, all networks are highly integrated in the integrated layer, compared with the traditional active subarray integrated layer module, the weight of a blind-mate connector and a structural shell is omitted, and the structural weight is greatly reduced at the level of an active subarray.
In addition, the array surface adopts a simplified design on the structure, and only comprises necessary equipment such as an antenna unit, a T/R component and the like, and a frequency source, a DBF, a primary power supply rectifying part and the like are placed at other positions (such as a comprehensive processing box).
The array face mainly comprises the optical-fiber transmission links and the radio-frequency links (clock, first local oscillator, second local oscillator and monitoring).
The optical fiber transmission link is divided into an uplink control transmission link and a downlink data transmission link. The uplink control transmission link adopts an amplifying and power dividing transmission scheme, mainly comprises an optical amplifying module, a 1:8 optical power dividing module, an uplink and downlink transmission optical cable and the like, and realizes the transmission function of the array surface control signal. The downlink data transmission link realizes the real-time transmission function of mass data of the array surface component. The downlink data transmission adopts multimode optical fibers. The multimode optical fiber is subjected to line concentration processing and then sent to a Digital Beam Forming (DBF) subsystem. The transmission block diagram of the optical link is shown in fig. 5.
The system according to the second aspect of the present invention, wherein the active subarray of the radar front end comprises a chip synthesis layer and a digital transceiver module; each active subarray is integrally constructed by a chip integrated layer, and radio frequency channels of the T/R assembly are connected through printed board wiring; the integrated layer printed board is directly connected with the antenna unit, the T/R assembly and the power supply;
the digital transceiver module includes: the device comprises a multichannel digital board, a frequency mixing module, an active subarray front-end drive amplifier, a radio frequency switching cable and an accessory structure.
The system according to the second aspect of the present invention, the signal processing module of the radar back-end includes the following sub-modules: the system comprises a narrow pulse eliminating sub-module, a pulse compressing sub-module, an anti-synchronous sub-module, a moving target detecting sub-module, a self-adaptive digital beam forming sub-module, a clutter map sub-module, a constant false alarm processing sub-module and a target identifying sub-module;
the data processing module of the radar back end comprises: the system comprises a self-adaptive coherence estimation sub-module, an iterative angle measurement sub-module, a point track processing sub-module and an energy scheduling sub-module.
The system according to the second aspect of the present invention, the point track processing submodule includes: correlation processing unit, smoothing processing unit and filtering processing unit.
According to the system of the second aspect of the invention, the display control terminal at the rear end of the radar adopts a portable remote terminal, and the original video and the point track are simultaneously transmitted to the display control terminal for display; the portable remote terminal is capable of performing a remote control operation on the radar.
Example 1
Taking a small-sized radar as an example, the comprehensive unmanned aerial vehicle identification steps are as follows:
1. The point-track distribution is modeled (with the fractional Brownian motion model described herein). The feature differences between clutter and targets are analyzed by extracting spatial correlation, so as to remove false plots and distinguish real plots from clutter points (using the distance correlation coefficient and distance-profile features described herein).
2. Image-like processing is performed on the point tracks; the preprocessing methods adopted include pixelation, multi-layering, and sliding coverage of image blocks and image regions over different sectors. After preprocessing, a deep-learning-based target tracking algorithm is adopted, and targets are recognized and distinguished according to the track features and track motion features.
TABLE 1 track characteristics for multiple classes of targets
[table image not reproduced in the source]
Table 2 common aerodynamic object motion feature prior statistics
[table image not reproduced in the source]
3. The radar cross section RCS measurement value of the target is obtained by RCS measurement, and target types such as UAVs are distinguished according to the RCS values of typical targets.
4. One-dimensional range-profile measurement is carried out on the targets, and UAVs are discriminated from other targets according to the extracted length results.
5. Jet, helicopter and propeller targets are distinguished according to the spectral modulation of the target. For slow UAVs and ground-vehicle targets, discrimination is carried further by extracting two-dimensional features: the environment entropy and the signal-to-noise ratio after power transformation. Because speed alone gives little discrimination between UAVs and ground vehicles, and the radar cannot observe a jet engine modulation JEM feature for them, classification extracts these two-dimensional features; 300 frames of training samples were accumulated and 300 frames of test samples (150 frames each of vehicles and UAVs) were tested.
TABLE 3 Classification recognition results
[table image not reproduced in the source]
6. Single-feature, single-pass recognition accuracy is low, so sequential recognition is adopted to improve it. The matching degree is calculated from the track features, motion features, radar cross section RCS features, one-dimensional range-profile features and jet engine modulation JEM features combined into a feature vector; the features are processed sequentially and a recognition result is given according to the set recognition criterion. The typical feature vectors can be corrected according to the actual situation and fed back, further refining the model and improving accuracy.
Sequential identification includes: the point-track feature, motion feature, radar cross section RCS feature, one-dimensional range-profile feature and JEM feature of the target are represented by a feature vector Z = {z_1, z_2, …, z_5}, where the value of z_j depends on the numerical characteristics of that feature component. The UAV is the key object of interest among the system's recognition results, so the results are generalized into UAV targets and other targets, represented by O = [o_1, o_2], where o_1 is the UAV target and o_2 is the other target.
Based on prior information such as the feature library, the initial feature-value vectors for the different recognition results are Z_i* = {z*_i1, z*_i2, …, z*_i5}, i = 1, 2.
A similarity-calculation function is established for each feature component: for the j-th feature component, s_j = d_j(z_j, z*_j), j = 1, 2, …, 5 is the matching-degree calculation function, with s_j ∈ [−1, 1].
A larger s_j indicates a more similar feature value; in particular, when z = z*, s = d(z, z*) = 1, i.e. the feature fully matches the typical feature corresponding to the current recognition result, and the recognition result is determined with typical confidence.
A comprehensive correction model is established, and weights are set for each component according to the accumulated samples. Define ω_ij as the weight of feature component z_j for recognition result o_i; the weight set W_i is then expressed as:
W_i = {ω_i1, ω_i2, …, ω_i5}, i = 1, 2, satisfying
Σ_{j=1…5} ω_ij = 1
The comprehensive feature similarity is calculated; the similarity of the feature vector Z to the recognition result o_i (i = 1, 2) is:
S_i = Σ_{j=1…5} ω_ij · s_j
and obtaining the feature matching similarity degree of the target type, and obtaining a target recognition result on the basis of the set probability confidence coefficient.
The foregoing examples are provided merely to illustrate the technical solution of the present invention and are not to be construed as limiting it. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art will understand that the technical solution of the invention may be modified or equivalently substituted without departing from the spirit and scope of the technical solution of the invention.

Claims (10)

1. An intelligent comprehensive unmanned aerial vehicle identification method is characterized by comprising the following steps:
step 1, receiving and accumulating detection signals of targets through a radar front end;
step 2, extracting multiple characteristics of a target from the detection signal received by the front end of the radar;
step 3, performing identification of the point-track data of the target using the acquired multiple features; determining whether the target is a real target according to the point-track data features of the target, and if a real target is determined, turning to step 4; if a false target is determined, returning to step 2;
step 4, pixelation processing is carried out on the point track data of the target, and layering processing is carried out on the image point track data after the pixelation processing according to the original point track;
step 5, calculating the radar cross section RCS of the real target of step 3, wherein the RCS value σ of the target is expressed as:
σ = K · (S/N) · R⁴
wherein the performance constant K corresponds to:
K = σ₀ / ((S/N)₀ · R₀⁴)
in the formula, R₀ is the calibrated radar standard distance, σ₀ is the radar cross section RCS of the calibration target, (S/N)₀ is the detectable signal-to-noise ratio, S/N is the measured signal-to-noise ratio, and R is the distance from the real target to the radar;
step 6, carrying out modeling simulation of the unmanned aerial vehicle, and establishing a UAV RCS feature library according to the radar cross section RCS simulation results; calculating the one-dimensional range profile p(t) of the target using wideband data:
p(t) = √(i²(t) + q²(t))
wherein t is time, i(t) is the in-phase component of the radar signal echo and q(t) is the quadrature component of the radar signal echo;
step 7, determining the real target to be identified according to the results of steps 4 to 6; acquiring the multiple characteristics of the real target to be identified, and determining the type of the target from the identification of these characteristics.
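As a reading aid for steps 5 and 6 (and the calibration procedure of claim 4 below), the following is a minimal Python sketch of the calibrated RCS estimate and the envelope range profile. The calibration numbers and the synthetic I/Q samples are invented for illustration; only the two formulas follow the claim text.

```python
import numpy as np

def performance_constant(sigma0, snr0, r0):
    """K from a calibration target of known RCS sigma0 observed at range r0
    with detectable signal-to-noise ratio snr0 (cf. claim 4)."""
    return sigma0 / (snr0 * r0**4)

def target_rcs(k, snr, r):
    """RCS of a measured target: sigma = K * (S/N) * R^4 (claim 1, step 5)."""
    return k * snr * r**4

def range_profile(i_t, q_t):
    """One-dimensional range profile p(t) = sqrt(i(t)^2 + q(t)^2) from the
    in-phase and quadrature components of the wideband echo (step 6)."""
    return np.hypot(i_t, q_t)

# Illustrative calibration: a 1 m^2 sphere detectable at 10 km with S/N = 13 dB.
k = performance_constant(sigma0=1.0, snr0=10**(13 / 10), r0=10_000.0)

# Illustrative measurement: an echo with S/N = 15 dB at 3 km.
sigma = target_rcs(k, snr=10**(15 / 10), r=3_000.0)
print(f"estimated RCS: {sigma:.4f} m^2")

# Illustrative pulse-compressed I/Q samples for one dwell.
rng = np.random.default_rng(0)
i_t = rng.normal(0.0, 0.1, 256)
q_t = rng.normal(0.0, 0.1, 256)
i_t[100] += 3.0                      # a strong scatterer at range bin 100
p_t = range_profile(i_t, q_t)
print("peak range bin:", int(np.argmax(p_t)))
```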
2. The method of claim 1, wherein the target is a moving target, step 3 comprising: the probability that the point track data of the target is identified is expressed as:
Figure FDA0004153544130000021
wherein P () is probability density, t is time, X (t) represents a feature, X (t) is a continuous function of t, deltar is time increment, deltar > 0, H is least squares estimated slope; x takes a digital interval around the target track.
3. The method of claim 1, wherein step 4 comprises the sub-steps of:
step 4.1, layering processing of the original point tracks: establishing a two-dimensional matrix in range and azimuth referenced to the position information of the target point-track data, and dividing different layers on the basis of the matrix representation of azimuth information, so as to represent different attribute characteristics of the target point tracks;
step 4.2, in a target tracking algorithm based on deep learning, minimizing the loss function of the network model by the gradient descent method and reversely adjusting the weight parameters of each processing layer, layer by layer; the accuracy of the network model is improved through repeated iterative training.
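Step 4.2 describes a standard supervised training loop: gradient descent on a loss function with layer-by-layer backward adjustment of the weights. Purely as an illustration, the sketch below trains a tiny two-layer network on synthetic pixelated point-track data; the architecture, learning rate and data shapes are assumptions, not specified in the claim.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data: 200 pixelated point-track images, 3 layers of 16x16
# range-azimuth cells, flattened; label 1 = real target, 0 = false alarm.
X = rng.normal(size=(200, 3 * 16 * 16))
y = (X[:, :10].sum(axis=1) > 0).astype(float).reshape(-1, 1)  # synthetic labels

d_in, n_hidden = X.shape[1], 32
W1 = rng.normal(0, 0.1, (d_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, 1));    b2 = np.zeros(1)
lr = 0.05

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    # Forward pass through both layers.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: adjust the weights of each layer, layer by layer,
    # following the gradient of the cross-entropy loss.
    d_out = (p - y) / len(y)                   # output-layer error
    gW2, gb2 = h.T @ d_out, d_out.sum(0)
    d_hidden = (d_out @ W2.T) * (1 - h**2)     # error propagated to hidden layer
    gW1, gb1 = X.T @ d_hidden, d_hidden.sum(0)

    W2 -= lr * gW2; b2 -= lr * gb2             # gradient-descent updates
    W1 -= lr * gW1; b1 -= lr * gb1

acc = float(np.mean((p > 0.5) == (y > 0.5)))
print(f"training accuracy: {acc:.2f}")
```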
4. The method of claim 1, wherein step 5 further comprises: measuring a plurality of calibration targets at predetermined distances with the radar to obtain the fixed parameters of the calibration targets, including the signal-to-noise ratio, the distance and the radar cross-sectional area RCS; and calculating the performance constant from these calibrated parameters for application to the RCS calculation of a measured real target, so as to obtain the RCS result of the target to be identified.
5. The method of claim 4, wherein step 7 comprises:
step 7.1, judging the type of a real target according to the original point-track layering processing result of the real target, the radar cross-sectional area RCS of the real target and the one-dimensional range profile of the real target, and determining the low-altitude slow-speed aircraft targets among the real targets;
step 7.2, extracting a plurality of characteristics of the low-altitude slow-speed aircraft target, identifying the target according to each characteristic respectively, and fusing the identification results in a sequential manner; and determining whether the target is an unmanned aerial vehicle target according to the fusion result.
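One common reading of the sequential fusion in step 7.2 is a recursive Bayesian update of the UAV hypothesis as each feature's recognizer reports in turn. The sketch below takes that interpretation; the per-feature likelihood pairs and the prior are illustrative assumptions, and the patent does not commit to this particular fusion rule.

```python
def sequential_fusion(prior_uav, likelihood_pairs):
    """Recursive Bayesian update of P(UAV) as each feature recognizer reports.

    likelihood_pairs: iterable of (P(evidence | UAV), P(evidence | not UAV)),
    one pair per feature, fused in sequence (features assumed independent).
    """
    p = prior_uav
    for l_uav, l_other in likelihood_pairs:
        p = (l_uav * p) / (l_uav * p + l_other * (1.0 - p))
    return p

# Illustrative reports from five feature recognizers (RCS, range profile,
# track layering, speed, altitude): how likely each observation is for a
# UAV versus another low-altitude slow-speed target such as a bird.
reports = [(0.8, 0.3), (0.7, 0.4), (0.9, 0.5), (0.6, 0.4), (0.7, 0.3)]
p = sequential_fusion(prior_uav=0.5, likelihood_pairs=reports)
print(f"P(UAV | all features) = {p:.3f}")  # declare UAV if above a set threshold
```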
6. An intelligent comprehensive unmanned aerial vehicle identification system for performing the intelligent comprehensive unmanned aerial vehicle identification method of any of claims 1-5, the system comprising a radar front end and a radar back end, wherein the radar front end is an antenna array surface and the radar back end is housed entirely in an integrated processing cabinet;
the radar front end includes: an antenna array surface formed by a plurality of active subarrays; each active subarray is composed of a chip T/R component; the active subarray receives a frequency signal of a frequency source and a control signal of a radar control system, and provides a signal detected by the active subarray to a signal processing module in the form of an optical signal through a downlink data bus;
the radar back end includes: the system comprises a radar control system module, a frequency source module, a time service equipment module, an optical power division module, a signal processing module and a data processing module;
the optical power dividing module comprises: the optical power dividing module divides an uplink frequency source signal and a radar control signal into a plurality of paths of clock power signals, a plurality of paths of local oscillation signals and a plurality of paths of control signals after electro-optical conversion and photoelectric conversion, and the plurality of paths of clock power signals, the plurality of paths of local oscillation signals and the plurality of paths of control signals are respectively provided for a plurality of active subarrays.
7. The system of claim 6, wherein each active subarray of the radar front end comprises a chip-based integrated layer and a digital transceiver module; each active subarray is integrally constructed on the chip-based integrated layer, with the radio-frequency channels of the T/R assemblies connected by printed-board wiring; the integrated-layer printed board is directly connected to the antenna units, the T/R assemblies and the power supply;
the digital transceiver module includes: the device comprises a multichannel digital board, a frequency mixing module, an active subarray front-end drive amplifier, a radio frequency switching cable and an accessory structure.
8. The system of claim 6, wherein the signal processing module of the radar back end comprises the following sub-modules: the system comprises a narrow pulse eliminating sub-module, a pulse compressing sub-module, an anti-synchronous sub-module, a moving target detecting sub-module, a self-adaptive digital beam forming sub-module, a clutter map sub-module, a constant false alarm processing sub-module and a target identifying sub-module;
the data processing module of the radar back end comprises: the system comprises a self-adaptive coherence estimation sub-module, an iterative angle measurement sub-module, a point track processing sub-module and an energy scheduling sub-module.
9. The system of claim 8, wherein the point track processing submodule includes: correlation processing unit, smoothing processing unit and filtering processing unit.
10. The system of claim 6, wherein the display and control terminal of the radar back end is a portable remote terminal, to which the original video and the point tracks are transmitted simultaneously for display; the portable remote terminal is capable of remotely controlling the radar.
CN202310326862.2A 2023-03-30 2023-03-30 Intelligent comprehensive unmanned aerial vehicle recognition system and method Active CN116383717B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310326862.2A CN116383717B (en) 2023-03-30 2023-03-30 Intelligent comprehensive unmanned aerial vehicle recognition system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310326862.2A CN116383717B (en) 2023-03-30 2023-03-30 Intelligent comprehensive unmanned aerial vehicle recognition system and method

Publications (2)

Publication Number Publication Date
CN116383717A true CN116383717A (en) 2023-07-04
CN116383717B CN116383717B (en) 2024-04-30

Family

ID=86980084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310326862.2A Active CN116383717B (en) 2023-03-30 2023-03-30 Intelligent comprehensive unmanned aerial vehicle recognition system and method

Country Status (1)

Country Link
CN (1) CN116383717B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030164792A1 (en) * 2000-07-24 2003-09-04 Mohammed Jahangir Method and apparatus for recognising a radar target
CN106597411A (en) * 2016-12-30 2017-04-26 无锡市雷华科技有限公司 Radar signal processing method
KR101872017B1 (en) * 2017-11-07 2018-06-27 엘아이지넥스원 주식회사 Target Identifying Method Based on Extracted Scattering Point in Millimeter Wave Seeker and Recording Medium Storing Computer Program thereof
RU2020121449A (en) * 2020-06-26 2021-12-27 Российская Федерация, от имени которой выступает Федеральное государственное казенное учреждение "Войсковая часть 68240" Multifunctional complex of means of detection, tracking and radio countermeasures to the use of small unmanned aerial vehicles
CN115291207A (en) * 2022-08-17 2022-11-04 西南石油大学 Multi-target detection method for small rotor unmanned aerial vehicle based on MIMO radar

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JUAN SHEN et al.: "Site optimization for multifunction radar based on big data", IEEE, 31 December 2019 (2019-12-31), pages 1-5, XP033813457, DOI: 10.1109/ICSIDP47821.2019.9173336 *
WANG Jiuyou et al.: "A new method for removing range-angle coupling in MIMO radar", Modern Electronics Technique, vol. 46, no. 1, 1 January 2023 (2023-01-01), pages 6-11 *

Also Published As

Publication number Publication date
CN116383717B (en) 2024-04-30

Similar Documents

Publication Publication Date Title
CN111458711B (en) Satellite-borne dual-band SAR system and detection method of ship target
US9562968B2 (en) Sensor system and method for determining target location using sparsity-based processing
Barton Radar system analysis and modeling
CN108761419B (en) Low-altitude wind shear wind speed estimation method based on self-adaptive processing of combined space-time main channel
CN108549059B (en) Low-altitude target elevation angle estimation method under complex terrain condition
CN104749555B (en) Phase difference direction finding and spatial spectrum direction finding combined direction-finding positioning system
CN104991249A (en) Landslide MIMO radar monitoring system and monitoring method
CN109239657A (en) Load the radiation source high-precision locating method under nested battle array unmanned aerial vehicle platform
CN109581352A (en) A kind of super-resolution angle measuring system based on millimetre-wave radar
CN111257655B (en) Intercepted distance testing device for radio frequency sensor
CN110297213A (en) Radiation source positioning device and method based on the unmanned aerial vehicle platform for loading relatively prime linear array
CN104280566A (en) Low altitude wind shear wind speed estimation method based on space-time amplitude and phase estimation
CN114488034A (en) Passive detection and interference reconnaissance integrated device and method
CN113805169B (en) Space target low-power consumption small satellite radar searching and tracking method
CN116383717B (en) Intelligent comprehensive unmanned aerial vehicle recognition system and method
CN108254763A (en) A kind of business small unmanned plane remote probe and method of disposal
CN112835034B (en) Dual-channel radar ground height measurement system and method
CN111398960B (en) GEO satellite-borne SAR bistatic configuration design method based on moving target detection
CN115469286A (en) Super-resolution angle measurement method based on millimeter wave automobile radar minimum redundancy MIMO array
CN111239682B (en) Electromagnetic emission source positioning system and method
Zheng et al. Uav direction estimation in high-speed railway environment
CN113392522A (en) Electromagnetic compatibility evaluation method for multi-antenna system of aerial remote sensing platform
CN112835006A (en) Method and system for tracking radar small-target detection on sea based on interframe accumulation
Zheng et al. UAV Direction Estimation Based on Spatial Smoothing Technology
CN117034836B (en) Satellite multi-load data processing system evaluation method based on software defined chip

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant