CN115508821A - Multi-source fusion unmanned aerial vehicle intelligent detection system - Google Patents

Multi-source fusion unmanned aerial vehicle intelligent detection system

Info

Publication number
CN115508821A
CN115508821A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
radio
radio frequency
module
Prior art date
Legal status
Pending
Application number
CN202210728659.3A
Other languages
Chinese (zh)
Inventor
多滨
王豪
罗俊松
Current Assignee
Chengdu Tianzong Yuanhang Intelligent Technology Co ltd
Original Assignee
Chengdu Tianzong Yuanhang Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Tianzong Yuanhang Intelligent Technology Co ltd
Priority to CN202210728659.3A
Publication of CN115508821A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/426 Scanning radar, e.g. 3D radar
    • G01S 13/505 Systems of measurement based on relative movement of target using Doppler effect for determining closest range to a target or corresponding time, e.g. miss-distance indicator
    • G01S 13/723 Radar-tracking systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar, by using numerical data
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 Combination of radar systems with cameras
    • G01S 13/882 Radar or analogous systems specially adapted for altimeters
    • G01S 13/886 Radar or analogous systems specially adapted for alarm systems
    • G01S 7/36 Means for anti-jamming, e.g. ECCM, i.e. electronic counter-counter measures
    • G01S 7/41 Details of systems according to group G01S 13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/12 Fingerprints or palmprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention claims a multi-source fusion unmanned aerial vehicle intelligent detection system. A radio frequency fingerprint identification module collects radio frequency signals of unmanned aerial vehicle targets in the environment; a radio detection module detects radio signals of unmanned aerial vehicles in the environment; a radar detection module scans for unmanned aerial vehicle signals in the environment with electromagnetic waves; a photoelectric image detection module captures images of unmanned aerial vehicles in the environment. A signal processor performs signal processing on the radio frequency signal of the unmanned aerial vehicle target, the radio signal of the unmanned aerial vehicle, the electromagnetic-wave scanning signal of the unmanned aerial vehicle in the environment and the target image of the unmanned aerial vehicle. A feature fusion module performs decision-level feature fusion of the recognition results, and a photoelectric device keeps the target under continuous track. The unmanned aerial vehicle intelligent detection system module displays the signal data, processing results, feature fusion results and real-time unmanned aerial vehicle tracking and positioning data. The invention achieves accurate, effective and reliable detection of unmanned aerial vehicles.

Description

Multi-source fusion unmanned aerial vehicle intelligent detection system
Technical Field
The invention relates to the field of unmanned aerial vehicle detection, and in particular to a multi-source fusion unmanned aerial vehicle intelligent detection system.
Background
To meet detection requirements under a variety of complex conditions, unmanned aerial vehicle detection techniques fall into categories such as radio frequency fingerprinting, radio, radar and photoelectric detection. Radio frequency fingerprint identification passively detects unmanned aerial vehicle signals through a radio frequency antenna, processes the radio frequency signals with correlation algorithms and extracts the radio frequency fingerprint to identify the specific model; its effective range exceeds 3 km and it can be applied in most scenarios, but its identification accuracy is low when multiple unmanned aerial vehicles are present. Radio detection equipment detects unmanned aerial vehicle signals and can perform direction finding and positioning on them, but is easily interfered with by other radio signals. Radar detection measures the direction of an unmanned aerial vehicle by transmitting electromagnetic waves and detecting the echoes it reflects; its effective range is wide, but its cost is high and small unmanned aerial vehicles are difficult to detect. Photoelectric detection collects unmanned aerial vehicle images and identifies targets through trajectory discrimination and optical flow features, but its effective working distance is short, it is easily affected by weather, and its unmanned aerial vehicle recognition rate is lower. Each of these detection and identification technologies has its limitations, and a single technique can hardly achieve accurate, effective and reliable detection of unmanned aerial vehicles.
Existing systems aim to improve detection precision and anti-interference capability, but their efficiency and recognition rate against multiple unmanned aerial vehicles remain insufficient, and no related technology can identify the specific model and manufacturer of an unmanned aerial vehicle. Radio frequency fingerprint technology excels in this respect: differences introduced into the electronic hardware during production and manufacturing are ultimately reflected in the radio frequency signal, i.e., the radio frequency fingerprint, and even electronic hardware produced in the same batch differs, so radio frequency fingerprint technology can identify the specific model and manufacturer of an unmanned aerial vehicle. The invention adopts multi-source fusion for the identification of multiple unmanned aerial vehicles, combining radio frequency fingerprint identification, radio detection, radar detection and photoelectric identification, so that the low feature recognition rate and other problems of any single technology are overcome, and the various feature data sets are combined to achieve efficient and reliable identification of unmanned aerial vehicles.
Disclosure of Invention
The present invention is directed to solving the above problems of the prior art by providing a multi-source fusion unmanned aerial vehicle intelligent detection system. The technical scheme of the invention is as follows:
A multi-source fusion unmanned aerial vehicle intelligent detection system comprises: a radio frequency fingerprint identification module, a radio detection module, a radar detection module, a photoelectric image detection module, a signal processor, a photoelectric image information processing module, a feature fusion module, an unmanned aerial vehicle tracking and positioning module and an unmanned aerial vehicle intelligent detection system module, wherein
The radio frequency fingerprint identification module is used for collecting radio frequency signals of an unmanned aerial vehicle target in the environment;
the radio detection module is used for detecting radio signals of the unmanned aerial vehicle in the environment;
the radar detection module is used for scanning signals of the unmanned aerial vehicle in the environment through electromagnetic waves;
the photoelectric image detection module is used for capturing an image of the unmanned aerial vehicle in the environment;
the signal processor is used for performing signal processing on the radio frequency signal of the unmanned aerial vehicle target, the radio signal of the unmanned aerial vehicle, the electromagnetic-wave scanning signal of the unmanned aerial vehicle in the environment and the target image of the unmanned aerial vehicle, to obtain an identification result of the radio frequency fingerprint of the unmanned aerial vehicle, an identification and positioning result of the radio signal of the unmanned aerial vehicle, an identification and positioning result of radar detection of the unmanned aerial vehicle and an identification result of the image of the unmanned aerial vehicle;
the feature fusion module is used for performing decision-making layer feature fusion on an identification result of the radio frequency fingerprint of the unmanned aerial vehicle, an identification and positioning result of a radio signal of the unmanned aerial vehicle, an identification and positioning result of radar detection of the unmanned aerial vehicle and an identification result of an image of the unmanned aerial vehicle;
the unmanned aerial vehicle tracking and positioning module comprises a tracking module and a positioning module, a photoelectric device tracks a target all the time and rotates along with the unmanned aerial vehicle at the position of a previous frame of image acquired by the photoelectric image detection module in the unmanned aerial vehicle tracking stage, radar detection is used as a main part and radio detection is used as an auxiliary part in the unmanned aerial vehicle positioning stage, and the radar detection module feeds back the direction, the distance and the height of the target and radio detection feeds back the direction of the target;
the unmanned aerial vehicle intelligent detection system module is used for displaying unmanned aerial vehicle radio frequency signal data and processing results, radio detection information data and processing results, radar detection information data and processing results, photoelectric image data and processing results, feature fusion results and real-time data of unmanned aerial vehicle tracking and positioning.
Further, the signal processor comprises a radio frequency signal processing module, a radio detection information processing module and a radar detection information processing module, wherein,
the radio frequency signal processing module is used for carrying out corresponding preprocessing, denoising, feature extraction and identification algorithms on the acquired radio frequency signals to identify the unmanned aerial vehicle and obtain an identification result of the radio frequency fingerprint of the unmanned aerial vehicle;
the radio detection information processing module is used for carrying out graying processing on the intensity spatial distribution of the radio signals formed by detection, obtaining the characteristics of a gradient histogram, identifying a target unmanned aerial vehicle and obtaining the identification and positioning results of the radio signals of the unmanned aerial vehicle;
the radar detection information processing module is used for processing the result of electromagnetic wave scanning by using a wavelet threshold denoising method, extracting target micro Doppler characteristics and obtaining the identification and positioning results of the radar detection of the unmanned aerial vehicle;
the photoelectric image information processing module is used for carrying out target detection on the collected unmanned aerial vehicle image, screening out an unmanned aerial vehicle target by utilizing track discrimination and optical flow characteristic discrimination after detecting a moving target, and carrying out feature recognition on the unmanned aerial vehicle target to obtain a recognition result of the unmanned aerial vehicle image;
further, the radio frequency fingerprint identification module specifically includes the following steps:
s1, passively detecting a 2.4 GHz frequency band radio frequency signal by a radio frequency signal acquisition device to generate a signal oscillogram;
s2, preprocessing the signal waveform, and continuously down-sampling the signal by using wavelet transform;
s3, the processed signals enter a two-stage detection system, the first stage distinguishes radio frequency signals or noise, the separated radio frequency signals enter a second stage, the radio frequency signals are distinguished as unmanned aerial vehicle signals and other Internet of things radio frequency signals in the second stage, and the separated unmanned aerial vehicle radio frequency signals are subjected to subsequent processing;
s4, when the radio frequency signal of the unmanned aerial vehicle is detected, short-time Fourier transform is adopted, and then characteristics are extracted;
s5, using an identification algorithm, including KNN, SVM or NN, on the extracted features and comparing them with the existing radio frequency fingerprint data set; radio frequency signals not found in the data set are judged to belong to an illegal unmanned aerial vehicle, and the system gives a corresponding prompt at the radio frequency fingerprint identification module;
s6, when data collection is carried out, storing the collected radio frequency fingerprint characteristics into a radio frequency fingerprint characteristic library for subsequent radio frequency fingerprint identification;
and S7, transmitting the detection result into a multi-source feature fusion module for further identification.
Further, the detection step of the two-stage detection system in step S3 specifically includes:
A1. inputting the detected signal into a two-stage detection system;
A2. in the first stage, a Bayesian hypothesis test is adopted by a detector to determine whether a captured signal is a radio frequency signal or noise, and if the radio frequency signal is detected, a second-stage detector is activated;
A3. in the second stage, the detector decides whether the captured radio frequency signal comes from an interferer or from a drone, the interferer including WiFi or Bluetooth devices; the detector performs interference detection using bandwidth analysis and modulation-based characteristics. First, the bandwidth of WiFi is significantly higher than that of drone and Bluetooth signals, so WiFi signals can be separated by bandwidth identification; most mobile Bluetooth devices use Gaussian frequency shift keying (GFSK/FSK) modulation, and in the present invention two GFSK/FSK modulation characteristics, namely frequency offset and symbol duration, are used to distinguish Bluetooth signals. Thus, if the detected radio frequency signal is not from a WiFi or Bluetooth interferer, it is assumed to be a signal transmitted by the drone, and the detected signal is passed to the next-stage feature extraction and classification system to identify the drone (a minimal sketch of this two-stage screen follows).
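By way of illustration only, the following Python sketch shows one way such a two-stage screen could be organized; the noise threshold, the WiFi bandwidth cutoff and the GFSK frequency-offset and symbol-duration ranges are assumed placeholder values rather than parameters taken from this disclosure, and the stage-one test is a plain threshold standing in for the Bayesian hypothesis test.

```python
# Illustrative sketch only (not the patented detector): a simplified
# two-stage screen in the spirit of steps A1-A3.

NOISE_DBM = -75.0                      # assumed stage-1 decision threshold
WIFI_BW_HZ = 10e6                      # assumed: wider occupied bandwidth ~ WiFi
BT_FREQ_OFFSET_HZ = (130e3, 190e3)     # assumed GFSK frequency-deviation range
BT_SYMBOL_S = (0.9e-6, 1.1e-6)         # assumed ~1 us Bluetooth symbol duration


def stage1_is_rf_signal(power_dbm: float) -> bool:
    """Stage 1: decide signal vs. noise (threshold stand-in for the Bayes test)."""
    return power_dbm > NOISE_DBM


def stage2_is_drone(bandwidth_hz: float, freq_offset_hz: float,
                    symbol_s: float) -> bool:
    """Stage 2: reject WiFi by bandwidth and Bluetooth by GFSK parameters."""
    if bandwidth_hz > WIFI_BW_HZ:
        return False                                   # WiFi-like interferer
    bt_offset = BT_FREQ_OFFSET_HZ[0] <= freq_offset_hz <= BT_FREQ_OFFSET_HZ[1]
    bt_symbol = BT_SYMBOL_S[0] <= symbol_s <= BT_SYMBOL_S[1]
    if bt_offset and bt_symbol:
        return False                                   # Bluetooth-like interferer
    return True                                        # assumed drone signal


def two_stage_detect(power_dbm, bandwidth_hz, freq_offset_hz, symbol_s):
    if not stage1_is_rf_signal(power_dbm):
        return "noise"
    if stage2_is_drone(bandwidth_hz, freq_offset_hz, symbol_s):
        return "drone"                 # handed to feature extraction next
    return "interferer"
```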
Further, the detected signal is transmitted to a feature extraction and classification system at the next stage to identify the unmanned aerial vehicle, and the method specifically includes:
B1. the detected unmanned aerial vehicle signal is transmitted into a feature extraction module to be extracted;
B2. for the representation of the radio frequency signal in the energy time-frequency domain, a spectrogram of the signal is obtained by short-time Fourier transform (STFT), and the formula for calculating the spectrogram is as follows
$$\mathrm{Spectrogram}(k,\omega)=\left|\sum_{m} y_T[m]\, w[k-m]\, e^{-j\omega m}\right|^2$$

wherein $y_T[n]$ is the preprocessed signal captured by the detection system, $m$ is discrete time, $\omega$ is frequency, $k$ is a discrete variable, and $w[n]$ is the sliding window function acting as a filter;
B3. in the obtained spectrogram, the maximum energy value is taken along a time axis, an energy track is calculated, an energy transient state is estimated by searching the maximum mutation of the mean value or the variance of the normalized energy track, and the transient characteristic of the signal in an energy domain is defined by the energy transient characteristic;
B4. when an energy transient is detected, feature values with higher weights are selected to extract the radio frequency fingerprint, the radio frequency fingerprint being the statistical time features that characterize the energy transient (a sketch of steps B2-B4 follows).
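A minimal sketch of steps B2-B4 under stated assumptions: SciPy's spectrogram stands in for the STFT, the energy trajectory is the per-frame maximum, the transient is located by the largest jump of a sliding mean, and the returned statistics are an illustrative stand-in for the fingerprint feature set; the window lengths and the chosen statistics are not taken from this disclosure.

```python
# Illustrative sketch of steps B2-B4 (placeholder parameters throughout).
import numpy as np
from scipy.signal import spectrogram
from scipy.stats import skew, kurtosis


def rf_fingerprint(y_t: np.ndarray, fs: float, nperseg: int = 256) -> np.ndarray:
    # B2: energy time-frequency representation via the STFT spectrogram
    _, _, sxx = spectrogram(y_t, fs=fs, nperseg=nperseg)
    # B3: energy trajectory = maximum energy along the frequency axis per frame
    traj = sxx.max(axis=0)
    traj = traj / (traj.max() + 1e-12)               # normalize
    # locate the transient: frame where a sliding mean changes the most
    win = 8
    means = np.convolve(traj, np.ones(win) / win, mode="valid")
    onset = int(np.argmax(np.abs(np.diff(means))))
    transient = traj[onset:onset + 4 * win]
    # B4: statistical time features of the transient (illustrative choice)
    return np.array([transient.mean(), transient.var(),
                     skew(transient), kurtosis(transient)])
```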
Further, the radio detection module specifically includes the following steps:
s21, after the radio scanning device carries out omnibearing scanning, data drawing is carried out on the collected signals to obtain a radio signal intensity distribution diagram;
s22, performing data correction on the drawn signal intensity distribution diagram, and performing gray processing;
s23, acquiring gradients and orientations from the image, partitioning the image into blocks, normalizing the sub-blocks, and combining the sub-block histograms into a histogram-of-oriented-gradients feature vector for the image, from which the direction of the unmanned aerial vehicle is obtained (see the sketch after this list);
s24, identifying the extracted features and matching them against the unmanned aerial vehicle radio feature data set; an unmatched unmanned aerial vehicle triggers an alarm at the radio detection module and is judged to be a suspected illegal unmanned aerial vehicle;
s25, when data collection is carried out, storing the collected radio signal characteristics of the unmanned aerial vehicle into a radio characteristic library of the unmanned aerial vehicle so as to facilitate radio identification of the unmanned aerial vehicle in the following process;
and S26, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
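A minimal sketch of steps S22-S23 under stated assumptions: the radio signal-strength distribution is assumed to be available as a 2-D array whose columns correspond to azimuth bins; the scikit-image HOG parameters and the simple column-sum bearing estimate are illustrative choices, not values from this disclosure.

```python
# Illustrative sketch of steps S22-S23 (placeholder parameters).
import numpy as np
from skimage.feature import hog


def radio_hog_features(intensity_map: np.ndarray) -> np.ndarray:
    # S22: grayscale normalization of the plotted signal-strength distribution
    img = intensity_map.astype(np.float64)
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)
    # S23: block-wise histogram of oriented gradients over the map
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")


def coarse_bearing_deg(intensity_map: np.ndarray, az_bins_deg: np.ndarray) -> float:
    """Assumed bearing estimate: the azimuth column with the strongest total energy."""
    return float(az_bins_deg[np.argmax(intensity_map.sum(axis=0))])
```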
Further, the radar detection module specifically includes the following steps:
s31, the radar detection device emits electromagnetic waves to carry out all-dimensional scanning on the surrounding air environment, and the target unmanned aerial vehicle generates an echo signal when being scanned by the radar;
s32, after the radar receives the returned echo signal, denoising the noisy unmanned aerial vehicle radar echo signal by using a wavelet threshold denoising method (a sketch follows this list);
s33, amplifying the echo signals, extracting micro Doppler characteristics, measuring the distance of the unmanned aerial vehicle by using the time when electromagnetic waves reach the target unmanned aerial vehicle and return to a radar antenna from the target unmanned aerial vehicle, and measuring the height and direction information of the unmanned aerial vehicle by using the directional characteristic of a radar;
s34, identifying the extracted features, matching the extracted features with signals in a radar micro Doppler feature database, judging the unmatched unmanned aerial vehicle as a suspected illegal unmanned aerial vehicle, and sending an alarm in a radar detection module;
s35, during data collection, storing the collected characteristics into a radar micro Doppler characteristic data set library for subsequent identification;
and S36, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
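A minimal sketch of the wavelet threshold denoising of S32 and the time-of-flight range measurement of S33 using PyWavelets; the wavelet family, decomposition level and universal-threshold rule are placeholder choices, not values specified in this disclosure.

```python
# Illustrative sketch of steps S32-S33 (placeholder parameters).
import numpy as np
import pywt


def wavelet_denoise(echo: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    coeffs = pywt.wavedec(echo, wavelet, level=level)
    # noise scale estimated from the finest detail coefficients (MAD rule)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(echo)))       # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(echo)]


def radar_range_m(round_trip_s: float, c: float = 3.0e8) -> float:
    """S33 (assumed form): range from round-trip time of flight."""
    return 0.5 * c * round_trip_s
```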
Further, the feature fusion module specifically includes the following steps:
s51, transmitting the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric image identification decision result into a feature fusion module;
s52, fusing the radio frequency fingerprint data, radio detection data, radar detection data and photoelectric image detection data in a hierarchical decision based on a weighting method principle;
s53, the fused data are transmitted into a fusion recognition module, when an illegal unmanned aerial vehicle is detected, the system generates an alarm and tracks and positions the target unmanned aerial vehicle in real time;
and S54, finally determining the specific type of the unmanned aerial vehicle target according to the similarity of the fused target and each characteristic template.
Further, the S52 weighting method specifically includes the following steps:
C1. calculating, with the Hamming distance, the similarities $HD_a$, $HD_b$, $HD_c$ and $HD_d$ between the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric identification decision result, respectively, and the unmanned aerial vehicle feature library; the Hamming distance algorithm calculates the similarity between each decision result and the different feature values in the feature library.
C2. The weighted fusion unit fuses the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric identification decision result, and the fusion formula is as follows:
$$FS = w_a\times(1-HD_a) + w_b\times(1-HD_b) + w_c\times(1-HD_c) + w_d\times(1-HD_d)$$

wherein $FS$ represents the similarity between the fused recognition result and the unmanned aerial vehicle feature library, $w_a$, $w_b$, $w_c$ and $w_d$ are the weight coefficients of the radio frequency fingerprint, radio, radar and photoelectric recognition results, and $w_a+w_b+w_c+w_d=1$; the weight coefficients can be set according to actual conditions, and the type of the unmanned aerial vehicle target is finally determined (a sketch of this weighted fusion follows).
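A minimal sketch of the C1-C2 computation under the assumption that each module's decision is available as a fixed-length binary feature code; the example weights are placeholders to be set according to actual conditions, as stated above.

```python
# Illustrative sketch of steps C1-C2 (placeholder weights and codes).
import numpy as np


def hamming_distance(code: np.ndarray, template: np.ndarray) -> float:
    """C1: normalized Hamming distance between two equal-length binary codes."""
    return float(np.mean(code != template))


def fused_score(codes: dict, templates: dict, weights: dict) -> float:
    """C2: FS = sum_i w_i * (1 - HD_i) over i in {rf, radio, radar, eo}."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * (1.0 - hamming_distance(codes[k], templates[k]))
               for k in weights)


# Example usage with assumed weights (not values from this disclosure):
weights = {"rf": 0.4, "radio": 0.2, "radar": 0.25, "eo": 0.15}
```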
Further, the unmanned aerial vehicle tracking and positioning module is used for processing the obtained unmanned aerial vehicle position information; when the system raises an alarm, the illegal unmanned aerial vehicle identified by the feature fusion identification module is tracked and positioned in real time, with the following specific steps:
s61, the system generates an alarm, and a radar positioning device, a radio detection device and a photoelectric image tracking device of the system start to operate;
s62, positioning the target unmanned aerial vehicle in real time according to the direction, the distance and the height of the target obtained from the radar detection module and the positioning information obtained from the radio detection module;
s63, after the target is determined to be the unmanned aerial vehicle, acquiring the position of the previous frame of image detected by the photoelectric image detection module, and then tracking the target by the photoelectric device all the time;
s64, performing combined network pre-training based on the established unmanned aerial vehicle image data set, performing Kalman filtering prediction from the previous frame, updating with the detection result of the current frame, performing multiple matching by combining the motion and appearance features retained at initialization, outputting the track in a confirmed state, and adjusting the photoelectric lens to track the target unmanned aerial vehicle in real time (a sketch of the Kalman predict/update cycle follows this list);
and S65, when the target unmanned aerial vehicle flies away from the no-fly zone, finishing tracking, and finishing operation of the unmanned aerial vehicle tracking and positioning module.
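A minimal sketch of the Kalman predict/update cycle referred to in S64, assuming a constant-velocity model on image-plane coordinates; the state layout and noise covariances are placeholder choices, and the appearance-based matching and track management of S64 are omitted.

```python
# Illustrative sketch of the S64 Kalman cycle (placeholder matrices).
import numpy as np


class ConstantVelocityKF:
    def __init__(self, dt: float = 1.0):
        self.x = np.zeros(4)                       # state [u, v, du, dv]
        self.P = np.eye(4) * 10.0                  # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01                  # process noise (assumed)
        self.R = np.eye(2) * 1.0                   # measurement noise (assumed)

    def predict(self) -> np.ndarray:
        """Predict the next image-plane position from the previous frame."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.H @ self.x

    def update(self, z: np.ndarray) -> None:
        """Correct with the detection of the current frame."""
        y = z - self.H @ self.x                    # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```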
The invention has the following advantages and beneficial effects:
Compared with other detection systems, the invention adopts radio frequency fingerprint identification, radio identification, radar identification, photoelectric identification and multi-source fusion technology, can identify the specific model of an unmanned aerial vehicle, and maintains a higher recognition rate when facing multiple unmanned aerial vehicles. It overcomes the limitations of single-technology unmanned aerial vehicle detection and provides all-weather working capability. Against other wireless signals, the radio frequency fingerprint identification module distinguishes WiFi, Bluetooth and unmanned aerial vehicle signals by their bandwidth and modulation characteristics, giving the system stronger anti-interference capability and wider application scenarios. By establishing radio frequency fingerprint, radio signal, radar micro-Doppler and photoelectric image data sets of unmanned aerial vehicles and performing decision fusion of the module recognition results with a weighting method, the unmanned aerial vehicle model can be identified accurately, effectively and reliably. The result can be applied to prisons, airports and venues for important activities, and has rich market prospects and value.
Drawings
FIG. 1 is a block diagram of a radio frequency fingerprinting module according to a preferred embodiment of the present invention;
FIG. 2 is a block diagram of a radio detection identification module according to the present invention;
FIG. 3 is a block diagram of a radar detection and identification module of the present invention;
FIG. 4 is a block diagram of a photoelectric identification module according to the present invention;
FIG. 5 is a block diagram of a multi-source feature fusion module of the present invention;
fig. 6 is a block diagram of an intelligent detection system of a multi-source fusion unmanned aerial vehicle according to the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in detail and clearly in the following with reference to the accompanying drawings. The described embodiments are only some of the embodiments of the present invention.
The technical scheme for solving the technical problems is as follows:
as shown in fig. 6, the invention is an unmanned aerial vehicle intelligent detection system with multi-source fusion, comprising a radio frequency fingerprint identification module, a radio detection module, a radar detection module, a photoelectric detection module, a feature fusion module, an unmanned aerial vehicle tracking and positioning module and an unmanned aerial vehicle intelligent detection platform.
The radio frequency fingerprint identification module shown in fig. 1 specifically includes the following steps:
s1, passively detecting a 2.4 GHz frequency band radio frequency signal by a radio frequency signal acquisition device to generate a signal oscillogram;
s2, preprocessing the signal waveform, and continuously down-sampling the signal by using wavelet transform;
s3, the processed signals enter a two-stage detection system, the first stage distinguishes radio frequency signals or noise, the separated radio frequency signals enter a second stage, the radio frequency signals are distinguished as unmanned aerial vehicle signals and other Internet of things radio frequency signals in the second stage, and the separated unmanned aerial vehicle radio frequency signals are subjected to subsequent processing;
s4, when the radio frequency signal of the unmanned aerial vehicle is detected, short-time Fourier transform is adopted, and then characteristics are extracted;
s5, using an identification algorithm, including but not limited to KNN, SVM or NN, on the extracted features and comparing them with the existing radio frequency fingerprint data set; radio frequency signals not found in the database are judged to belong to illegal unmanned aerial vehicles, and the system gives a corresponding prompt at the radio frequency fingerprint identification module;
s6, when data collection is carried out, storing the collected radio frequency fingerprint characteristics into a radio frequency fingerprint characteristic library for subsequent radio frequency fingerprint identification;
and S7, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
Further, the two-stage detection system of step S3 specifically includes:
1. inputting the detected signal into a two-stage detection system;
2. in the first stage, the detector employs bayesian hypothesis testing to determine whether the captured signal is a radio frequency signal or noise. If a radio frequency signal is detected, activating a second stage detector;
3. in the second phase, the detector decides whether the captured radio frequency signal is from an interferer (WiFi or bluetooth device) or a drone. The detector uses bandwidth analysis and modulation-based characteristics for interference detection, and thus if the detected radio frequency signal is not from a WiFi or bluetooth interference source, it is assumed to be the signal transmitted by the drone. Thus, the detected signals pass to the next stage feature extraction classification system to identify the drone.
The transfer of the detected signal to the next-stage feature extraction and classification system to identify the unmanned aerial vehicle specifically comprises the following steps:
1. the detected unmanned aerial vehicle signal is transmitted into the feature extraction module to be extracted.
2. For a representation of the radio frequency signal in the energy time-frequency domain, a spectrogram of the signal is obtained by short-time fourier transform (STFT). The formula for calculating the spectrogram is as follows
$$\mathrm{Spectrogram}(k,\omega)=\left|\sum_{m} y_T[m]\, w[k-m]\, e^{-j\omega m}\right|^2$$

wherein $y_T[n]$ is the preprocessed signal captured by the detection system, $m$ is discrete time, $k$ is a discrete variable, $\omega$ is frequency, and $w[n]$ is the sliding window function of the filter.
3. The energy trajectory is calculated by taking the maximum energy value along the time axis in the obtained spectrogram. The energy transient is estimated by finding the largest abrupt change in the mean or variance of the normalized energy trajectory. The energy transient characteristics define the transient characteristics of the signal in the energy domain.
4. When an energy transient is detected, the radio frequency fingerprint, a group of 15 statistical features from which the feature values with higher weights can be selected, is extracted, and the four feature values with the highest weights are chosen; the radio frequency fingerprint is the set of statistical time features characterizing the energy transient.
The radio detection module shown in fig. 2 specifically includes the following steps:
s21, after the radio scanning device carries out omnibearing scanning, data drawing is carried out on the collected signals to obtain a radio signal intensity distribution diagram;
s22, performing data correction on the drawn signal intensity distribution diagram, and performing gray processing;
s23, acquiring gradient and direction from the image, partitioning the image, performing normalization processing on the partitioned sub-blocks, combining histograms of the sub-blocks to obtain a directional gradient histogram feature vector on the image, and acquiring the direction of the unmanned aerial vehicle;
s24, identifying the extracted features, matching the extracted features with the radio feature data set of the unmanned aerial vehicle, sending an alarm by the unmatched unmanned aerial vehicle at the radio detection module, and judging that the unmanned aerial vehicle is a suspected illegal unmanned aerial vehicle;
s25, when data collection is carried out, storing the collected radio signal characteristics of the unmanned aerial vehicle into an unmanned aerial vehicle radio characteristic library for subsequent unmanned aerial vehicle radio identification;
and S26, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
The radar detection module specifically includes the following steps as shown in fig. 3:
s31, the radar detection device emits electromagnetic waves to carry out all-around scanning on the surrounding air environment, and the target unmanned aerial vehicle generates an echo signal when being scanned by the radar;
s32, after the radar receives the echo signal sent back, denoising the noise-containing radar echo signal of the unmanned aerial vehicle by using a wavelet threshold denoising method;
s33, amplifying the echo signals, extracting micro Doppler characteristics, measuring the distance of the unmanned aerial vehicle by using the time when electromagnetic waves reach the target unmanned aerial vehicle and return to a radar antenna from the target unmanned aerial vehicle, and measuring the information such as the height, the direction and the like of the unmanned aerial vehicle by using the directional characteristic of a radar;
s34, identifying the extracted features, matching the extracted features with signals in a radar micro Doppler feature database, judging the unmanned aerial vehicle which is not matched as a suspected illegal unmanned aerial vehicle, and giving an alarm in a radar detection module;
s35, during data collection, storing the collected characteristics into a radar micro Doppler characteristic data set library for subsequent identification;
and S36, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
The photodetection module specifically includes the following steps as shown in fig. 4:
s41, detecting surrounding aerial targets by using a photoelectric detection device, and collecting images in real time;
s42, the target detection module receives video data shot by the photoelectric camera, detects whether a moving target exists in the monitoring range, and sends the moving track of the target and the information of the area where the target is located to the target identification module when the moving target is detected;
s43, after the target detection module detects the moving target, preliminarily screening the target through trajectory discrimination and optical flow characteristic discrimination to eliminate part of non-unmanned aerial vehicle targets;
s44, extracting the characteristics of the screened unmanned aerial vehicle image, and storing the acquired characteristics into an unmanned aerial vehicle photoelectric image characteristic library for subsequent identification when data collection is carried out;
s45, the extracted photoelectric image features of the unmanned aerial vehicle are transmitted into a target recognition system and are matched with a photoelectric image feature data set of the unmanned aerial vehicle, the unmatched unmanned aerial vehicle is identified as an illegal unmanned aerial vehicle, and an unmanned aerial vehicle photoelectric detection module gives an alarm;
and S46, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
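A minimal sketch of the moving-target detection and optical-flow screening of S42-S43 using OpenCV (4.x assumed); the difference threshold, minimum contour area and Farneback parameters are illustrative placeholders, not values from this disclosure.

```python
# Illustrative sketch of steps S42-S43 (placeholder thresholds).
import cv2
import numpy as np


def moving_regions(prev_gray: np.ndarray, gray: np.ndarray, thr: int = 25):
    """S42: frame differencing to find candidate moving regions (bounding boxes)."""
    diff = cv2.absdiff(gray, prev_gray)
    _, mask = cv2.threshold(diff, thr, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 50]


def flow_consistency(prev_gray: np.ndarray, gray: np.ndarray, box) -> float:
    """S43: mean dense optical-flow magnitude inside a candidate box (screening cue)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    x, y, w, h = box
    mag = np.linalg.norm(flow[y:y + h, x:x + w], axis=2)
    return float(mag.mean())
```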
The feature fusion module is shown in fig. 5, and specifically includes the following steps:
s51, transmitting the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric image identification decision result into a feature fusion module;
s52, fusing the radio frequency fingerprint data, radio detection data, radar detection data and photoelectric image detection data in a hierarchical decision based on a weighting method principle, reducing the demand of multi-source fusion on computing resources and improving the environmental applicability and identification accuracy of the unmanned aerial vehicle identification and positioning system;
s53, the fused data are transmitted into a fusion recognition module, when an illegal unmanned aerial vehicle is detected, the system generates an alarm and tracks and positions the target unmanned aerial vehicle in real time;
and S54, finally determining the specific type of the unmanned aerial vehicle target according to the similarity of the fused target and each characteristic template.
The S52 weighting method comprises the following specific steps:
1. Respectively calculating, with the Hamming distance, the similarities $HD_a$, $HD_b$, $HD_c$ and $HD_d$ between the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric identification decision result and the unmanned aerial vehicle feature library; the Hamming distance algorithm calculates the similarity between each decision result and the different feature values in the feature library.
2. And the weighting fusion unit fuses the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric identification decision result. The following is the fusion formula
$$FS = w_a\times(1-HD_a) + w_b\times(1-HD_b) + w_c\times(1-HD_c) + w_d\times(1-HD_d)$$

wherein $FS$ represents the similarity between the fused recognition result and the unmanned aerial vehicle feature library, $w_a$, $w_b$, $w_c$ and $w_d$ are the weight coefficients of the radio frequency fingerprint, radio, radar and photoelectric recognition results, and $w_a+w_b+w_c+w_d=1$. The weight coefficients can be set according to actual conditions, and the type of the unmanned aerial vehicle target can finally be determined.
The unmanned aerial vehicle tracking and positioning module is used for processing the obtained unmanned aerial vehicle position information, and when the system generates an alarm, the illegal unmanned aerial vehicle identified by the feature fusion identification module is tracked and positioned in real time, and the method comprises the following specific steps:
s61, the system generates an alarm, and a radar positioning device, a radio detection device and a photoelectric image tracking device of the system start to operate;
s62, positioning the target unmanned aerial vehicle in real time according to the direction, the distance and the height of the target obtained from the radar detection module and the positioning information obtained from the radio detection module;
and S63, after the target is confirmed to be the unmanned aerial vehicle, acquiring the position of the previous frame of image detected by the photoelectric image detection module, and then, always tracking the target by the photoelectric device.
And S64, performing combined network pre-training based on the established unmanned aerial vehicle image data set, performing Kalman filtering prediction on the previous frame, updating the current frame according to the detection result, performing multiple matching by combining the motion and appearance characteristics reserved during initialization, outputting the track of the confirmed state, and adjusting the photoelectric lens to track the target unmanned aerial vehicle in real time.
And S65, when the target unmanned aerial vehicle flies away from the no-fly zone, finishing tracking, and finishing the operation of the unmanned aerial vehicle tracking and positioning module.
The unmanned aerial vehicle intelligent detection system operates as follows:
s71, the system controls the modules to cooperate with each other, and when the system is started, the radio frequency signal detection module, the radio signal detection module, the radar detection module and the photoelectric detection module start to operate;
s72, when the radio frequency signal detection module starts to operate, the unmanned aerial vehicle radio frequency fingerprint display module displays the waveform of the unmanned aerial vehicle radio frequency signal passed in by the detection module, the waveform after wavelet-transform down-sampling of the signal, the spectrogram after short-time Fourier transform and the radio frequency fingerprint feature data;
s73, when the radio detection module starts to operate, the detection data are transmitted into the system, and the radio detection display module displays a radio signal intensity distribution diagram of the unmanned aerial vehicle and radio characteristic data of the unmanned aerial vehicle;
s74, after the radar detection module starts to operate, the unmanned aerial vehicle radar detection display module displays unmanned aerial vehicle echo data scanned by the radar and micro Doppler characteristic data;
s75, after the photoelectric detection module starts to operate, the photoelectric detection display module displays an unmanned aerial vehicle target detection image and unmanned aerial vehicle photoelectric image characteristic data in real time;
s76, displaying a multi-source feature fusion recognition result, determining the specific type of the unmanned aerial vehicle target, giving an alarm and performing positioning tracking;
and S77, positioning the illegal unmanned aerial vehicle, mainly by radar detection and positioning, which displays information such as the direction, distance and height of the unmanned aerial vehicle; radio detection assists by displaying the signal intensity distribution diagram of the unmanned aerial vehicle, i.e. its direction; and the target is tracked in real time from the position in the first image frame of the photoelectric image detection module.
And S78, when the illegal unmanned aerial vehicle leaves the target area, finishing tracking, finishing the unmanned aerial vehicle tracking system, detecting surrounding aerial targets by the detection module, and when the system finishes running, stopping running of all modules successively.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in a process, method, article, or apparatus that comprises that element.
The above examples are to be construed as merely illustrative and not limitative of the remainder of the disclosure. After reading the description of the invention, the skilled person can make various changes or modifications to the invention, and these equivalent changes and modifications also fall into the scope of the invention defined by the claims.

Claims (10)

1. A multi-source fusion unmanned aerial vehicle intelligent detection system, characterized in that it comprises: a radio frequency fingerprint identification module, a radio detection module, a radar detection module, a photoelectric image detection module, a signal processor, a photoelectric image information processing module, a feature fusion module, an unmanned aerial vehicle tracking and positioning module and an unmanned aerial vehicle intelligent detection system module, wherein
The radio frequency fingerprint identification module is used for collecting radio frequency signals of the unmanned aerial vehicle target in the environment;
the radio detection module is used for detecting radio signals of the unmanned aerial vehicle in the environment;
the radar detection module is used for scanning signals of the unmanned aerial vehicle in the environment through electromagnetic waves;
the photoelectric image detection module is used for capturing an image of the unmanned aerial vehicle in the environment;
the signal processor is used for carrying out signal processing on a radio frequency signal of an unmanned aerial vehicle target, a radio signal of the unmanned aerial vehicle, a scanning signal of the unmanned aerial vehicle in an environment and a target image of the unmanned aerial vehicle by using electromagnetic waves to obtain an identification result of a radio frequency fingerprint of the unmanned aerial vehicle, an identification and positioning result of the radio signal of the unmanned aerial vehicle, an identification and positioning result of radar detection of the unmanned aerial vehicle and an identification result of the image of the unmanned aerial vehicle;
the feature fusion module is used for performing decision-making layer feature fusion on an identification result of a radio frequency fingerprint of the unmanned aerial vehicle, an identification and positioning result of a radio signal of the unmanned aerial vehicle, an identification and positioning result of radar detection of the unmanned aerial vehicle and an identification result of an image of the unmanned aerial vehicle;
the unmanned aerial vehicle tracking and positioning module comprises a tracking module and a positioning module, wherein in the unmanned aerial vehicle tracking stage, the photoelectric device tracks a target all the time and rotates along with the unmanned aerial vehicle at the position of a previous frame of image acquired by the photoelectric image detection module, in the unmanned aerial vehicle positioning stage, radar detection is used as a main part, radio detection is used as an auxiliary part, the radar detection module feeds back the direction, the distance and the height of the target, and radio detection feeds back the direction of the target;
the unmanned aerial vehicle intelligent detection system module is used for displaying unmanned aerial vehicle radio frequency signal data and processing results, radio detection information data and processing results, radar detection information data and processing results, photoelectric image data and processing results, feature fusion results and real-time data of unmanned aerial vehicle tracking and positioning.
2. The intelligent detection system of multi-source converged unmanned aerial vehicle according to claim 1, wherein the signal processor comprises a radio frequency signal processing module, a radio detection information processing module, and a radar detection information processing module,
the radio frequency signal processing module is used for carrying out corresponding preprocessing, denoising, feature extraction and identification algorithms on the acquired radio frequency signals to identify the unmanned aerial vehicle and obtain an identification result of the radio frequency fingerprint of the unmanned aerial vehicle;
the radio detection information processing module is used for carrying out graying processing on the intensity spatial distribution of the radio signals formed by detection, obtaining the characteristics of a gradient histogram, identifying a target unmanned aerial vehicle and obtaining the identification and positioning results of the radio signals of the unmanned aerial vehicle;
the radar detection information processing module is used for processing the result of electromagnetic wave scanning by using a wavelet threshold denoising method, extracting target micro Doppler characteristics and obtaining the identification and positioning results of the radar detection of the unmanned aerial vehicle;
the photoelectric image information processing module is used for carrying out target detection on the collected unmanned aerial vehicle image, screening out the unmanned aerial vehicle target by utilizing track discrimination and light stream characteristic discrimination after detecting the moving target, carrying out feature recognition on the unmanned aerial vehicle target and obtaining the recognition result of the unmanned aerial vehicle image.
3. The intelligent detection system of a multi-source fusion unmanned aerial vehicle according to claim 1, wherein the radio frequency fingerprint identification module specifically comprises the following steps:
s1, passively detecting a 2.4 GHz frequency band radio frequency signal by a radio frequency signal acquisition device to generate a signal oscillogram;
s2, preprocessing the signal waveform, and continuously down-sampling the signal by using wavelet transform;
s3, the processed signals enter a two-stage detection system, the first stage distinguishes radio frequency signals or noise, the separated radio frequency signals enter a second stage, the radio frequency signals are distinguished as unmanned aerial vehicle signals and other Internet of things radio frequency signals in the second stage, and the separated unmanned aerial vehicle radio frequency signals are subjected to subsequent processing;
s4, when the radio frequency signal of the unmanned aerial vehicle is detected, short-time Fourier transform is adopted, and then characteristics are extracted;
s5, using an identification algorithm including KNN, SVM or NN for the extracted features, comparing the extracted features with the existing radio frequency fingerprint data set, judging that the radio frequency signal which is not in the data set base is an illegal unmanned aerial vehicle, and giving a corresponding prompt by the system at a radio frequency fingerprint identification module;
s6, when data collection is carried out, storing the collected radio frequency fingerprint characteristics into a radio frequency fingerprint characteristic library for subsequent radio frequency fingerprint identification;
and S7, transmitting the detection result into a multi-source feature fusion module for further identification.
4. The intelligent detection system for the multi-source fusion unmanned aerial vehicle of claim 3, wherein the detection step of the two-stage detection system of the step S3 specifically comprises:
A1. inputting the detected signal into a two-stage detection system;
A2. in the first stage, the detector uses a Bayesian hypothesis test to determine whether the captured signal is a radio frequency signal or noise; if a radio frequency signal is detected, the second-stage detector is activated;
A3. in the second stage, the detector determines whether the captured radio frequency signal comes from an interferer, including a WiFi or Bluetooth device, or from an unmanned aerial vehicle; the detector performs interference detection using bandwidth analysis and modulation-based characteristics: first, the bandwidth of WiFi is significantly larger than that of unmanned aerial vehicle and Bluetooth signals, so WiFi signals can be separated by bandwidth identification; most mobile Bluetooth devices use Gaussian frequency shift keying (GFSK/FSK) modulation, so Bluetooth signals are distinguished by two GFSK/FSK modulation characteristics, namely frequency offset and symbol duration; therefore, if the detected radio frequency signal does not come from a WiFi or Bluetooth interferer, it is assumed to be a signal transmitted by an unmanned aerial vehicle, and the detected signal is passed to the next-stage feature extraction and classification system to identify the unmanned aerial vehicle (a minimal sketch of this two-stage logic follows the claim).
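A minimal sketch of the two-stage decision logic of A1-A3, with plain energy detection standing in for the Bayesian hypothesis test of the first stage; the noise threshold factor, the WiFi bandwidth limit and the GFSK deviation band are placeholder values, not figures given in the claims.

```python
import numpy as np

def stage1_is_rf_signal(x, noise_power, threshold_factor=3.0):
    """Stage 1 (proxy for the Bayesian test): declare a signal when the
    measured power clearly exceeds the estimated noise floor."""
    return np.mean(np.abs(x) ** 2) > threshold_factor * noise_power

def occupied_bandwidth(x, fs):
    """Estimate the 99% occupied bandwidth from the power spectrum."""
    spec = np.abs(np.fft.fftshift(np.fft.fft(x))) ** 2
    freqs = np.fft.fftshift(np.fft.fftfreq(len(x), 1 / fs))
    cdf = np.cumsum(spec) / spec.sum()
    lo = freqs[np.searchsorted(cdf, 0.005)]
    hi = freqs[np.searchsorted(cdf, 0.995)]
    return hi - lo

def stage2_source(x, fs):
    """Stage 2: separate WiFi by bandwidth, Bluetooth by a GFSK-like
    frequency deviation, otherwise assume an unmanned aerial vehicle link."""
    if occupied_bandwidth(x, fs) > 15e6:      # WiFi channels are much wider
        return "wifi"
    inst_freq = np.diff(np.unwrap(np.angle(x))) * fs / (2 * np.pi)
    deviation = np.percentile(np.abs(inst_freq), 90)
    if 100e3 < deviation < 300e3:             # rough GFSK deviation band
        return "bluetooth"
    return "uav"
```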
5. The intelligent detection system of multi-source fusion unmanned aerial vehicle of claim 4,
the detected signals are transmitted to a next-stage feature extraction and classification system to identify the unmanned aerial vehicle, and the method specifically comprises the following steps:
B1. the detected unmanned aerial vehicle signal is transmitted into the feature extraction module for feature extraction;
B2. for the representation of the radio frequency signal in the energy time-frequency domain, a spectrogram of the signal is obtained by the short-time Fourier transform (STFT); the formula for calculating the spectrogram is as follows
S(m, ω) = |∑_k y_T[k]·ω[k−m]·e^(−jωk)|²
where y_T[n] is the preprocessed signal captured by the detection system, m is the discrete time, ω is the frequency, k is the discrete summation variable, and ω[n] is the sliding window function acting as a filter;
B3. in the obtained spectrogram, the maximum energy value is taken along the time axis to compute an energy trajectory; the energy transient is estimated by searching for the largest change in the mean or variance of the normalized energy trajectory, and this energy transient defines the transient characteristics of the signal in the energy domain;
B4. after the energy transient is detected, feature values with higher weights are selected to extract the radio frequency fingerprint, the radio frequency fingerprint being the statistical time-domain characteristics that represent the energy transient (a minimal sketch of this procedure follows the claim).
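A minimal sketch of B2-B4: a spectrogram via SciPy's STFT, the maximum-energy trajectory along the time axis, and a crude mean/variance change search for the energy transient; the window lengths and the transient descriptors are assumptions, and the trajectory is assumed to be longer than twice the search window.

```python
import numpy as np
from scipy.signal import stft

def energy_trajectory(y, fs, nperseg=256):
    """Normalized maximum-energy trajectory of the spectrogram (B2-B3)."""
    _, _, Z = stft(y, fs=fs, nperseg=nperseg)
    power = np.abs(Z) ** 2
    traj = power.max(axis=0)                  # max energy per time bin
    return traj / (traj.max() + 1e-12)

def transient_index(traj, win=8):
    """Locate the largest jump in windowed mean/variance of the
    normalized trajectory; samples around it form the transient (B3)."""
    scores = []
    for i in range(win, len(traj) - win):
        left, right = traj[i - win:i], traj[i:i + win]
        scores.append(abs(right.var() - left.var()) +
                      abs(right.mean() - left.mean()))
    return int(np.argmax(scores)) + win

def transient_features(traj, idx, win=8):
    """Statistical time-domain descriptors of the transient segment (B4)."""
    seg = traj[max(0, idx - win): idx + win]
    return np.array([seg.mean(), seg.std(), seg.max(), idx / len(traj)])
```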
6. The intelligent detection system of a multi-source fusion unmanned aerial vehicle of claim 1, wherein the radio detection module specifically comprises the following steps:
s21, after the radio scanning device performs omnidirectional scanning, the collected signals are plotted to obtain a radio signal intensity distribution map;
s22, performing data correction on the drawn signal intensity distribution diagram, and performing gray processing;
s23, acquiring gradients and directions from the image, partitioning the image into blocks, normalizing the sub-blocks, and combining the histograms of the sub-blocks to obtain the histogram-of-oriented-gradients feature vector of the image and the direction of the unmanned aerial vehicle (a minimal sketch of this step follows the claim);
s24, identifying the extracted features and matching them with the unmanned aerial vehicle radio feature data set; an unmatched unmanned aerial vehicle is judged to be a suspected illegal unmanned aerial vehicle and the radio detection module issues an alarm;
s25, when data collection is carried out, storing the collected radio signal characteristics of the unmanned aerial vehicle into an unmanned aerial vehicle radio characteristic library for subsequent unmanned aerial vehicle radio identification;
and S26, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
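A minimal sketch of s22-s23: grey-scaling a received-signal-strength map and extracting a histogram-of-oriented-gradients descriptor with scikit-image; the map shape and the HOG parameters are illustrative choices, not values from the claims.

```python
import numpy as np
from skimage.feature import hog

def strength_map_to_gray(strength_map):
    """Normalize a 2-D received-signal-strength map to 8-bit grey levels (s22)."""
    m = strength_map - strength_map.min()
    m = m / (m.max() + 1e-12)
    return (m * 255).astype(np.uint8)

def hog_descriptor(gray_map):
    """Directional-gradient histogram feature vector over the whole map (s23)."""
    return hog(gray_map,
               orientations=9,
               pixels_per_cell=(8, 8),
               cells_per_block=(2, 2),
               block_norm='L2-Hys',
               feature_vector=True)
```

Matching against the stored radio feature set in s24 could then be a nearest-neighbour comparison on these descriptors.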
7. The intelligent detection system of a multi-source fusion unmanned aerial vehicle according to claim 1, wherein the radar detection module specifically comprises the following steps:
s31, the radar detection device emits electromagnetic waves to carry out all-around scanning on the surrounding air environment, and the target unmanned aerial vehicle generates an echo signal when being scanned by the radar;
s32, after the radar receives the echo signal sent back, denoising the noise-containing radar echo signal of the unmanned aerial vehicle by using a wavelet threshold denoising method;
s33, amplifying the echo signal and extracting micro-Doppler features; the distance to the unmanned aerial vehicle is measured from the round-trip time of the electromagnetic wave between the radar antenna and the target unmanned aerial vehicle, and the height and direction of the unmanned aerial vehicle are measured by using the directivity of the radar (a minimal sketch follows the claim);
s34, identifying the extracted features, matching the extracted features with signals in a radar micro Doppler feature database, judging the unmatched unmanned aerial vehicle as a suspected illegal unmanned aerial vehicle, and sending an alarm in a radar detection module;
s35, during data collection, storing the collected characteristics into a radar micro Doppler characteristic data set library for subsequent identification;
and S36, transmitting the detection result into a multi-source feature fusion module for further accurate identification.
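A minimal sketch of s32-s33: wavelet-threshold denoising of the echo with PyWavelets and range from the round-trip delay; the wavelet family, decomposition level and universal threshold are common defaults assumed here, not parameters disclosed in the claims.

```python
import numpy as np
import pywt

C = 3e8  # speed of light, m/s

def wavelet_denoise(echo, wavelet='db4', level=4):
    """Soft-threshold the detail coefficients of the noisy echo (s32)."""
    coeffs = pywt.wavedec(echo, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(echo)))         # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft')
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(echo)]

def target_range(round_trip_time_s):
    """Range from the time for the wave to reach the UAV and return (s33)."""
    return C * round_trip_time_s / 2.0
```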
8. The intelligent detection system of a multi-source fusion unmanned aerial vehicle of claim 1, wherein the feature fusion module specifically comprises the following steps:
s51, transmitting the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric image identification decision result into a feature fusion module;
s52, fusing the radio frequency fingerprint data, radio detection data, radar detection data and photoelectric image detection data based on a weighting method in a hierarchical decision;
s53, the fused data are transmitted into a fusion recognition module, when an illegal unmanned aerial vehicle is detected, the system generates an alarm and tracks and positions the target unmanned aerial vehicle in real time;
and S54, finally determining the specific type of the unmanned aerial vehicle target according to the similarity of the fused target and each characteristic template.
9. The multi-source fusion unmanned aerial vehicle intelligent detection system of claim 8, wherein the S52 weighting method comprises the following specific steps:
C1. calculating, by using the Hamming distance, the similarities HD_a, HD_b, HD_c and HD_d between the unmanned aerial vehicle feature library and, respectively, the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric identification decision result; the Hamming distance algorithm computes the similarity between a decision result and the different feature values in the feature library.
C2. The weighted fusion unit fuses the radio frequency fingerprint identification decision result, the radio identification decision result, the radar identification decision result and the photoelectric identification decision result, and the fusion formula is as follows:
FS = w_a×(1−HD_a) + w_b×(1−HD_b) + w_c×(1−HD_c) + w_d×(1−HD_d)
where FS represents the similarity between the fused recognition result and the unmanned aerial vehicle feature library, and w_a, w_b, w_c and w_d represent the weight coefficients of the radio frequency fingerprint, radio, radar and photoelectric recognition results, with w_a + w_b + w_c + w_d = 1; the weight coefficients can be set according to actual conditions, and the type of the unmanned aerial vehicle target is finally determined (a minimal fusion sketch follows the claim).
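A minimal sketch of C1-C2, treating each decision result as a binary feature code so that the normalized Hamming distance and the weighted score FS can be computed directly; the example weights are placeholders that merely sum to 1.

```python
import numpy as np

def hamming_distance(a, b):
    """Normalized Hamming distance between two equal-length binary codes (C1)."""
    a, b = np.asarray(a), np.asarray(b)
    return float(np.mean(a != b))

def fused_similarity(hd, weights):
    """FS = sum_i w_i * (1 - HD_i), with the weights summing to 1 (C2)."""
    hd = np.asarray(hd, dtype=float)
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0)
    return float(np.sum(w * (1.0 - hd)))

# Example: RF-fingerprint, radio, radar and photoelectric decisions
hd = [0.10, 0.25, 0.20, 0.15]     # HD_a .. HD_d against one template
w = [0.30, 0.20, 0.25, 0.25]      # w_a .. w_d (assumed weighting)
print(fused_similarity(hd, w))     # similarity of the fused result to the template
```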
10. The intelligent detection system of multi-source fusion unmanned aerial vehicle of claim 1,
wherein the unmanned aerial vehicle tracking and positioning module is used for processing the obtained unmanned aerial vehicle position information and, when the system generates an alarm, tracking and positioning in real time the illegal unmanned aerial vehicle identified by the feature fusion identification module, with the following specific steps:
s61, the system generates an alarm, and a radar positioning device, a radio detection device and a photoelectric image tracking device of the system start to operate;
s62, positioning the target unmanned aerial vehicle in real time according to the direction, the distance and the height of the target obtained from the radar detection module and the positioning information obtained from the radio detection module;
s63, after the target is determined to be the unmanned aerial vehicle, acquiring the position of the previous frame of image detected by the photoelectric image detection module, and then tracking the target by the photoelectric device all the time;
s64, performing combined network pre-training on the established unmanned aerial vehicle image dataset; a Kalman filter prediction is made from the previous frame and updated with the detection result of the current frame, matching is performed by combining the motion and appearance features retained at initialization, the track in a confirmed state is output, and the photoelectric lens is adjusted to track the target unmanned aerial vehicle in real time (a minimal Kalman filter sketch follows the claim);
and S65, when the target unmanned aerial vehicle flies away from the no-fly zone, finishing tracking, and finishing the operation of the unmanned aerial vehicle tracking and positioning module.
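A minimal sketch of the per-frame Kalman prediction and update of s64, using a generic constant-velocity model on the image-plane position of the target; the motion model, noise covariances and state layout are assumptions, since the claims do not specify them.

```python
import numpy as np

class ConstantVelocityKF:
    """Predict/update cycle run once per video frame for one UAV track."""

    def __init__(self, dt=1.0):
        # state: [x, y, vx, vy]; measurement: [x, y] (detected image centre)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 10.0
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01        # process noise (assumed)
        self.R = np.eye(2) * 1.0         # measurement noise (assumed)

    def predict(self):
        """Propagate the state to the current frame; the predicted position
        is what gets matched against new detections."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Correct the state with the matched detection of the current frame."""
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```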
CN202210728659.3A 2022-06-24 2022-06-24 Multisource fuses unmanned aerial vehicle intelligent detection system Pending CN115508821A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210728659.3A CN115508821A (en) 2022-06-24 2022-06-24 Multisource fuses unmanned aerial vehicle intelligent detection system

Publications (1)

Publication Number Publication Date
CN115508821A true CN115508821A (en) 2022-12-23

Family

ID=84502233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210728659.3A Pending CN115508821A (en) 2022-06-24 2022-06-24 Multisource fuses unmanned aerial vehicle intelligent detection system

Country Status (1)

Country Link
CN (1) CN115508821A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080088508A1 (en) * 1999-03-05 2008-04-17 Smith Alexander E Enhanced Passive Coherent Location Techniques to Track and Identify UAVs, UCAVs, MAVs, and Other Objects
US20170192089A1 (en) * 2014-12-19 2017-07-06 Xidrone Systems, Inc. Deterent for unmanned aerial systems
CN208298351U (en) * 2018-05-30 2018-12-28 安徽中天保安服务集团有限公司 A kind of radar monitoring early warning system based on unmanned plane
CN108957445A (en) * 2018-07-30 2018-12-07 四川九洲空管科技有限责任公司 A kind of low-altitude low-velocity small targets detection system and its detection method
KR102298950B1 (en) * 2020-07-23 2021-09-08 한국항공우주산업 주식회사 synchronic positional tracking method using radar of multi unmanned aerial vehicles
CN112068111A (en) * 2020-08-13 2020-12-11 中国人民解放军海军工程大学 Unmanned aerial vehicle target detection method based on multi-sensor information fusion
CN113538585A (en) * 2021-09-17 2021-10-22 深圳火眼智能有限公司 High-precision multi-target intelligent identification, positioning and tracking method and system based on unmanned aerial vehicle

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XIE, Y等: "Dual-source detection and identification system based on UAV radio frequency signal", 《IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT》, vol. 70, 9 August 2021 (2021-08-09), pages 1 - 15, XP011872233, DOI: 10.1109/TIM.2021.3103565 *
TIAN, Yuan et al.: "Research on identification of UAV data transmission radio stations based on transient signal characteristics", Modern Electronics Technique, vol. 45, no. 12, 15 June 2022 (2022-06-15), pages 105 - 109 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115840223A (en) * 2023-02-15 2023-03-24 成都熵泱科技有限公司 Unmanned aerial vehicle detection system and method capable of identifying target attributes
CN116359836A (en) * 2023-05-31 2023-06-30 成都金支点科技有限公司 Unmanned aerial vehicle target tracking method and system based on super-resolution direction finding
CN116359836B (en) * 2023-05-31 2023-08-15 成都金支点科技有限公司 Unmanned aerial vehicle target tracking method and system based on super-resolution direction finding
CN116678421A (en) * 2023-06-12 2023-09-01 深圳沧穹科技有限公司 Multisource fusion positioning method and system based on multi-module BLE transmitting device
CN116678421B (en) * 2023-06-12 2024-01-23 深圳沧穹科技有限公司 Multisource fusion positioning method and system based on multi-module BLE transmitting device

Similar Documents

Publication Publication Date Title
CN107862705B (en) Unmanned aerial vehicle small target detection method based on motion characteristics and deep learning characteristics
CN115508821A (en) Multisource fuses unmanned aerial vehicle intelligent detection system
CN107590439B (en) Target person identification and tracking method and device based on monitoring video
CN112016445B (en) Monitoring video-based remnant detection method
CN103048654B (en) Radar sensor and method for detecting objects using the same
KR100754385B1 (en) Apparatus and method for object localization, tracking, and separation using audio and video sensors
CN111505632B (en) Ultra-wideband radar action attitude identification method based on power spectrum and Doppler characteristics
CN110456320B (en) Ultra-wideband radar identity recognition method based on free space gait time sequence characteristics
Mahlisch et al. A multiple detector approach to low-resolution FIR pedestrian recognition
Ni et al. Gait-based person identification and intruder detection using mm-wave sensing in multi-person scenario
CN108304857A (en) A kind of personal identification method based on multimodel perceptions
CN113640768B (en) Low-resolution radar target identification method based on wavelet transformation
CN103295221A (en) Water surface target motion detecting method simulating compound eye visual mechanism and polarization imaging
Qiao et al. Human activity classification based on micro-Doppler signatures separation
Skaria et al. Deep-learning for hand-gesture recognition with simultaneous thermal and radar sensors
Bulatović et al. Mel-spectrogram features for acoustic vehicle detection and speed estimation
CN114814832A (en) Millimeter wave radar-based real-time monitoring system and method for human body falling behavior
Wang et al. A novel underground pipeline surveillance system based on hybrid acoustic features
CN107390164B (en) A kind of continuous tracking method of underwater distributed multi-source target
CN114417908A (en) Multi-mode fusion-based unmanned aerial vehicle detection system and method
CN117665807A (en) Face recognition method based on millimeter wave multi-person zero sample
CN101093540A (en) Method for recognizing human ear by detecting human ear and syncretizing information under complex background
CN117452496A (en) Target detection and identification method for seismoacoustic sensor
CN113449711A (en) Micro Doppler image sign language perception identification method based on direction density characteristics
CN107578036A (en) A kind of depth image tumble recognizer based on wavelet moment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination