CN111817794B - Multi-domain cooperative unmanned aerial vehicle detection method and system based on deep learning - Google Patents


Publication number
CN111817794B
Authority
CN
China
Prior art keywords
time domain
domain
data
unmanned aerial
aerial vehicle
Prior art date
Legal status
Active
Application number
CN202010478187.1A
Other languages
Chinese (zh)
Other versions
CN111817794A (en)
Inventor
白迪
崔勇强
Current Assignee
South Central Minzu University
Original Assignee
South Central University for Nationalities
Priority date
Filing date
Publication date
Application filed by South Central University for Nationalities
Priority to CN202010478187.1A
Publication of CN111817794A
Application granted
Publication of CN111817794B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04B: TRANSMISSION
    • H04B 17/00: Monitoring; Testing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods


Abstract

The invention provides a multi-domain collaborative unmanned aerial vehicle detection method and system based on deep learning, implemented with a monitoring antenna array, a radio frequency front end and a signal processing board. The monitoring antenna array receives a communication waveform signal of an unmanned aerial vehicle; the radio frequency front end applies low-noise amplification, filtering and down-conversion and passes the signal to the signal processing board. The signal processing board extracts space domain, time domain and frequency domain features from the signal, splices the three extracted feature vectors into a multi-domain collaborative vector group, extracts comprehensive features of the vector group through a neural network model, and outputs the existence information of the unmanned aerial vehicle. The invention has the beneficial effects that the unmanned aerial vehicle detection accuracy is improved and the false alarm rate is reduced.

Description

Multi-domain cooperative unmanned aerial vehicle detection method and system based on deep learning
Technical Field
The invention relates to the technical field of unmanned aerial vehicle countermeasures, in particular to a multi-domain collaborative unmanned aerial vehicle detection method and system based on deep learning.
Background
In recent years the unmanned aerial vehicle industry has grown rapidly: between 2014 and 2018 the global rotor unmanned aerial vehicle market grew by about 20% per year, and on major e-commerce platforms and in commercial stores anyone can buy, for as little as about two thousand yuan, a consumer unmanned aerial vehicle that fits in one hand and offers aerial photography and other functions. However, as the entry threshold keeps falling, unauthorized flights occur more and more frequently. When an unmanned aerial vehicle enters airport airspace, public places or sensitive areas without permission, it poses a risk to public safety and national security.
At present, unmanned aerial vehicle detection technology is developing rapidly at home and abroad, and mainly comprises active radar detection, external radiation source radar detection, infrared imaging monitoring, passive spectrum monitoring, acoustic monitoring and the like. Because unmanned aerial vehicles are low-altitude, slow and small flight targets, active radar detection is of limited effect; external radiation source radar is constrained by the airborne electromagnetic environment; the detection distance of acoustic monitoring is limited; and infrared detectors are costly and strongly affected by weather. Passive spectrum monitoring is therefore the mainstream unmanned aerial vehicle detection means at present, but its detection precision still needs to be improved.
Disclosure of Invention
In view of the above, the invention provides a method and a system for multi-domain cooperative unmanned aerial vehicle detection based on deep learning. Building on passive monitoring technology, it combines time domain, space domain and frequency domain information to construct a multi-domain joint detection method based on deep learning, which improves the detection accuracy of "black-flying" unmanned aerial vehicles and reduces the false alarm rate of unmanned aerial vehicle detection.
The invention provides a multi-domain collaborative unmanned aerial vehicle detection method based on deep learning, which is realized by adopting a monitoring antenna array, a radio frequency front end and a signal processing board, and comprises the following steps:
s1, the upper detection antenna and the lower detection antenna of the monitoring antenna array respectively receive communication waveform signals of the unmanned aerial vehicle; after low-noise amplification and filtering by the radio frequency front end, IQ down-conversion is performed to obtain first IQ time domain data and second IQ time domain data, which are input to the signal processing board;
s2, respectively carrying out data preprocessing of a space domain, a time domain and a frequency domain on the first IQ time domain data and the second IQ time domain data, extracting three eigenvectors of the data, and splicing the three eigenvectors together to form a multi-domain cooperative vector group;
s3, inputting the multi-domain cooperation vector group obtained in the step S2 into a neural network model trained in advance, and outputting the existence information of the unmanned aerial vehicle;
wherein, the steps S2, S3 are located in the signal processing board.
Further, the specific process of step S2 is as follows:
s201, intercepting, from the first IQ time domain data and the second IQ time domain data respectively, time-aligned data segments I1 and I2 of length k, and obtaining complete IQ time domain data through weighting:
I = αI1 + (1 − α)I2
wherein α represents a weighting coefficient with value range 0 ≤ α ≤ 1;
s202, calculating the amplitudes of the data I1 and I2 and taking their ratio to obtain a space domain feature vector V1 = [n1, n2, …, nk], where n1, n2, …, nk represent the amplitude ratios of the communication waveform signals received by the upper detection antenna and the lower detection antenna at the corresponding moments;
s203, normalizing the IQ time domain data I and removing singular values to obtain a time domain feature vector V2 = [m1, m2, …, mk], where m1, m2, …, mk represent the corresponding time domain eigenvalues;
s204, performing a k-point FFT (fast Fourier transform) on the IQ time domain data I to obtain a frequency domain feature vector V3 = [j1, j2, …, jk], where j1, j2, …, jk represent the corresponding FFT result values;
s205, splicing the space domain feature vector V1, the time domain feature vector V2 and the frequency domain feature vector V3 to obtain a multi-domain collaborative vector group S = [V1, V2, V3].
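The preprocessing chain of steps s201 to s205 can be sketched in Python as follows. This is a minimal NumPy illustration: the function name, the 3-sigma clipping used to stand in for "singular value removal", and the min-max normalization are our assumptions, since the text does not fix these details.

```python
import numpy as np

def build_multidomain_vector(iq_upper, iq_lower, alpha=0.5):
    """Sketch of steps s201-s205: fuse the two IQ captures and build the
    multi-domain collaborative vector group S = [V1, V2, V3].
    Clipping rule and normalization are illustrative guesses."""
    k = len(iq_upper)
    eps = 1e-12                         # numeric guard (our addition)
    # s201: weighted combination of the two time-aligned captures
    iq = alpha * iq_upper + (1 - alpha) * iq_lower
    # s202: space domain vector V1 -- per-sample amplitude ratio
    v1 = np.abs(iq_upper) / (np.abs(iq_lower) + eps)
    # s203: time domain vector V2 -- clip outliers, then normalize to [0, 1]
    mag = np.abs(iq)
    mag = np.clip(mag, None, mag.mean() + 3 * mag.std())
    v2 = (mag - mag.min()) / (mag.max() - mag.min() + eps)
    # s204: frequency domain vector V3 -- magnitude of a k-point FFT
    v3 = np.abs(np.fft.fft(iq, n=k))
    # s205: splice the three length-k vectors into one group
    return np.concatenate([v1, v2, v3])

rng = np.random.default_rng(0)
k = 256
upper = rng.standard_normal(k) + 1j * rng.standard_normal(k)
lower = rng.standard_normal(k) + 1j * rng.standard_normal(k)
s_vec = build_multidomain_vector(upper, lower)
```

On synthetic noise the result is a single 3k-sample vector ready to feed to the detector network.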
Furthermore, the input of the neural network model is the multi-domain cooperative vector group; a multilayer one-dimensional CNN performs feature extraction, fully connected network layers fuse the features, and the activation functions of the layers output the existence information of the unmanned aerial vehicle. A training set and label data are constructed from simulated or measured data, the label data being whether an unmanned aerial vehicle is present, and the neural network model is trained with the Adam optimizer until the loss function converges, yielding the trained neural network model.
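As a rough illustration of the model shape described above, the NumPy sketch below runs a forward pass of a small stack of one-dimensional convolutions followed by a fully connected layer and a sigmoid. All layer widths, kernel sizes and weights here are illustrative placeholders, not values from the text; a real implementation would use a deep-learning framework.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv1d(x, w, b):
    """Valid 1-D convolution. x: (C_in, L), w: (C_out, C_in, K), b: (C_out,)."""
    c_out, c_in, kk = w.shape
    length = x.shape[1] - kk + 1
    out = np.empty((c_out, length))
    for t in range(length):
        out[:, t] = np.tensordot(w, x[:, t:t + kk], axes=([1, 2], [0, 1])) + b
    return out

def detector_forward(s_vec, params):
    """Two 1-D conv layers extract features from the multi-domain vector
    group, a fully connected layer fuses them, and a sigmoid outputs a
    drone-presence probability."""
    h = relu(conv1d(s_vec[None, :], params["w1"], params["b1"]))
    h = relu(conv1d(h, params["w2"], params["b2"]))
    z = h.reshape(-1) @ params["wf"] + params["bf"]
    return 1.0 / (1.0 + np.exp(-z))        # sigmoid activation

rng = np.random.default_rng(1)
k = 64                                     # toy length; real groups are 3k long
s_vec = rng.standard_normal(3 * k)
params = {
    "w1": 0.1 * rng.standard_normal((4, 1, 5)), "b1": np.zeros(4),
    "w2": 0.1 * rng.standard_normal((8, 4, 5)), "b2": np.zeros(8),
}
feat_len = 8 * (3 * k - 2 * (5 - 1))       # length after two valid K=5 convs
params["wf"] = 0.01 * rng.standard_normal(feat_len)
params["bf"] = 0.0
p = detector_forward(s_vec, params)
```

With random weights the output is simply a probability near 0.5; training (described above) is what makes it meaningful.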
Furthermore, the monitoring antenna array adopts an upper detection antenna and a lower detection antenna to form a horizontal 120° sector in the space domain; the vertical main lobe angles of the upper and lower detection antennas are both 60°, and the antenna mounting point is 30 meters above the ground.
Furthermore, two channels of the radio frequency front end are respectively synchronous with the upper detection antenna and the lower detection antenna, and the communication waveform signal of the unmanned aerial vehicle enters the radio frequency front end through the upper detection antenna and the lower detection antenna respectively and then is subjected to low-noise amplification and filtering, and then IQ down-conversion is carried out to respectively obtain first IQ time domain data and second IQ time domain data.
According to another aspect of the present invention, in order to solve the technical problem, the present invention further provides a depth learning-based multi-domain cooperative unmanned aerial vehicle detection system, which is implemented by using a monitoring antenna array, a radio frequency front end and a signal processing board, and includes the following modules:
the data acquisition module is used for the upper detection antenna and the lower detection antenna of the monitoring antenna array to respectively receive communication waveform signals of the unmanned aerial vehicle, perform low-noise amplification and filtering through the radio frequency front end, perform IQ down-conversion to obtain first IQ time domain data and second IQ time domain data, and input them to the signal processing board;
the characteristic vector construction module is used for respectively carrying out data preprocessing of a space domain, a time domain and a frequency domain on the first IQ time domain data and the second IQ time domain data, extracting three characteristic vectors of the data, and splicing the three characteristic vectors together to form a multi-domain collaborative vector group;
the detection result output module is used for inputting the multi-domain cooperation vector group into a neural network model which is trained in advance and outputting the existence information of the unmanned aerial vehicle;
the characteristic vector construction module and the detection result output module are located in the signal processing board.
Further, the feature vector construction module further includes:
a time domain data construction submodule for intercepting, from the first IQ time domain data and the second IQ time domain data respectively, time-aligned data segments I1 and I2 of length k, and obtaining complete IQ time domain data through weighting:
I = αI1 + (1 − α)I2
wherein α represents a weighting coefficient with value range 0 ≤ α ≤ 1;
a space domain feature vector construction submodule for calculating the amplitudes of the data I1 and I2 and taking their ratio to obtain a space domain feature vector V1 = [n1, n2, …, nk], where n1, n2, …, nk represent the amplitude ratios of the communication waveform signals received by the upper detection antenna and the lower detection antenna at the corresponding moments;
a time domain feature vector construction submodule for normalizing the IQ time domain data I and removing singular values to obtain a time domain feature vector V2 = [m1, m2, …, mk], where m1, m2, …, mk represent the corresponding time domain eigenvalues;
a frequency domain feature vector construction submodule for performing a k-point FFT on the IQ time domain data I to obtain a frequency domain feature vector V3 = [j1, j2, …, jk], where j1, j2, …, jk represent the corresponding FFT result values;
a multi-domain collaborative vector construction submodule for splicing the space domain feature vector V1, the time domain feature vector V2 and the frequency domain feature vector V3 to obtain a multi-domain collaborative vector group S = [V1, V2, V3].
Furthermore, the input of the neural network model is the multi-domain cooperative vector group; a multilayer one-dimensional CNN performs feature extraction, fully connected network layers fuse the features, and the activation functions of the layers output the existence information of the unmanned aerial vehicle. A training set and label data are constructed from simulated or measured data, the label data being whether an unmanned aerial vehicle is present, and the neural network model is trained with the Adam optimizer until the loss function converges, yielding the trained neural network model.
Furthermore, the monitoring antenna array adopts an upper detection antenna and a lower detection antenna to form a horizontal 120° sector in the space domain; the vertical main lobe angles of the upper and lower detection antennas are both 60°, and the antenna mounting point is 30 meters above the ground.
Furthermore, two channels of the radio frequency front end are respectively synchronous with the upper detection antenna and the lower detection antenna, and the communication waveform signal of the unmanned aerial vehicle enters the radio frequency front end through the upper detection antenna and the lower detection antenna respectively and then is subjected to low-noise amplification and filtering, and then IQ down-conversion is carried out to respectively obtain first IQ time domain data and second IQ time domain data.
The technical scheme provided by the invention has the following beneficial effects. By combining the characteristics of the unmanned aerial vehicle in the time domain, space domain and frequency domain, a multi-dimensional, integrated, multi-domain cooperative detection method for unmanned aerial vehicles is constructed: it eliminates false unmanned aerial vehicle targets caused by the electromagnetic environment, mines the target information of real unmanned aerial vehicles, improves detection accuracy and reduces the false alarm rate. The invention can be used for the security protection of government organs, military facilities, energy storage depots and stations, large commercial venues, private premises and sensitive areas.
Drawings
Fig. 1 is a structural flow chart of a deep learning-based multi-domain cooperative unmanned aerial vehicle detection method according to an embodiment of the present invention;
fig. 2 is a schematic layout of a monitoring antenna array according to an embodiment of the present invention;
fig. 3 is a data processing flow chart of a signal processing board provided by an embodiment of the present invention;
fig. 4 is a time domain waveform diagram of 6 types of unmanned aerial vehicles according to an embodiment of the present invention;
fig. 5 is a structural diagram of a neural network model provided in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be further described with reference to the accompanying drawings.
Referring to fig. 1, an embodiment of the present invention provides a method for detecting a multi-domain cooperative drone based on deep learning, implemented with a monitoring antenna array 1, a radio frequency front end 2 and a signal processing board 3. The monitoring antenna array 1 receives a communication waveform signal between a drone 4 and a remote control 5; the radio frequency front end 2 applies low-noise amplification, filtering and down-conversion and passes the signal to the signal processing board 3. The signal processing board 3 extracts space domain, time domain and frequency domain features from the signal, splices the three extracted feature vectors into a multi-domain cooperation vector group, and extracts the comprehensive features of that group through a neural network model, thereby mining the existence information of the unmanned aerial vehicle, improving the detection rate and reducing the false alarm rate.
The monitoring antenna array 1 includes an upper detection antenna and a lower detection antenna. Referring to fig. 2, in this embodiment the spatial position of the radio emitter is identified by cross-locating the antenna beams: the upper and lower detection antennas form a horizontal 120° sector in the space domain, the vertical main lobe angle of both antennas is about 60°, and the antenna mounting point is about 30 meters above the ground. According to different detection range requirements, the installation angle θ can be adjusted so that the beams of the upper and lower detection antennas overlap in different regions of the air.
Two channels of the radio frequency front end 2 are respectively synchronous with the upper detection antenna and the lower detection antenna, unmanned aerial vehicle communication waveform signals enter the radio frequency front end 2 through the upper detection antenna and the lower detection antenna respectively and then are subjected to low-noise amplification and filtering, and then IQ down-conversion is carried out to respectively obtain first IQ time domain data and second IQ time domain data.
The multi-domain collaborative unmanned aerial vehicle detection method based on deep learning comprises the following steps:
s1, the upper detection antenna and the lower detection antenna of the monitoring antenna array 1 respectively receive the communication waveform signals of the unmanned aerial vehicle; after low-noise amplification and filtering by the radio frequency front end 2, IQ down-conversion is performed to obtain first IQ time domain data I1 and second IQ time domain data I2, which are input to the signal processing board 3.
Referring to fig. 2, when a "black-flying" unmanned aerial vehicle flies toward the restricted area covered by the monitoring antenna array 1 from a distance, it gradually leaves the main lobe range of the lower detection antenna, so the power spectrum amplitude of the "black-flying" unmanned aerial vehicle signal received by the lower detection antenna gradually decreases; as the unmanned aerial vehicle gradually flies into the main lobe range of the upper antenna, the power spectrum amplitude received by the upper detection antenna gradually increases. The specific change curves are shown in the lower right corner of fig. 2.
S2, please refer to fig. 3, the signal processing board 3 performs data preprocessing of a spatial domain, a time domain and a frequency domain on the first IQ time domain data and the second IQ time domain data, respectively, extracts three eigenvectors of the data, and splices the three eigenvectors together to form a multi-domain cooperative vector group.
Specifically, the process of step S2 is:
S201, intercepting, from the first IQ time domain data and the second IQ time domain data respectively, time-aligned data segments I1 and I2 of length k, and obtaining complete IQ time domain data through weighting:
I = αI1 + (1 − α)I2
wherein α represents a weighting coefficient with value range 0 ≤ α ≤ 1.
S202, calculating the amplitudes of the data I1 and I2 and taking their ratio to obtain a space domain feature vector V1 = [n1, n2, …, nk], where n1, n2, …, nk represent the amplitude ratios of the communication waveform signals received by the upper detection antenna and the lower detection antenna at the corresponding moments. As shown in the lower right corner of fig. 2, when the unmanned aerial vehicle gradually approaches, the amplitude ratio grows from small to large, forming a rising curve; by recognizing this trend, stationary ground-based radio interference equipment can be excluded.
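The "rising curve" test on the amplitude-ratio vector V1 could be implemented, for example, as a least-squares slope check; the function name, slope threshold and toy traces below are illustrative assumptions, not values from the text.

```python
import numpy as np

def is_rising(v1, slope_threshold=0.01):
    """Fit a line to the amplitude-ratio sequence V1; a clearly positive
    slope suggests an approaching airborne emitter, while a flat sequence
    suggests a stationary ground interferer.  Threshold is illustrative."""
    t = np.arange(len(v1))
    slope = np.polyfit(t, v1, 1)[0]        # least-squares slope
    return slope > slope_threshold

rng = np.random.default_rng(0)
# toy amplitude-ratio traces: an approaching drone vs. a fixed interferer
approaching = np.linspace(0.2, 3.0, 100) + 0.05 * rng.standard_normal(100)
stationary = 1.0 + 0.05 * rng.standard_normal(100)
```

The approaching trace triggers the check while the flat one does not, which is exactly the discrimination the paragraph above describes.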
S203, normalizing the IQ time domain data I and removing singular values to obtain a time domain feature vector V2 = [m1, m2, …, mk], where m1, m2, …, mk represent the corresponding time domain eigenvalues.
It should be noted that a main component of the complex urban electromagnetic environment is home Wi-Fi. The data interaction frequency between a Wi-Fi access point (AP) and devices such as mobile phones and laptops in a home environment is about 10 Hz, whereas the communication frequency between an unmanned aerial vehicle and its remote control is higher, usually about 30 transmissions per second; the communication frequency is therefore an important time domain feature for distinguishing ordinary Wi-Fi from unmanned aerial vehicles. Meanwhile, the communication links of each manufacturer's product series differ, meaning the coding and modulation modes differ, so each series of unmanned aerial vehicles has its own communication waveform. Fig. 4 shows time domain waveform diagrams of 6 models of unmanned aerial vehicles; from the upper left corner to the lower right corner the models are: DJI Matrice 100, DJI Phantom 3, Hobby King T6A V2, Spektrum DX6e, JR X9303 and Jeti Duplex DC-16. The time domain waveforms of the remote control signals of all brands differ from one another; even the DJI Matrice 100 and DJI Phantom 3, which belong to the same brand (DJI), have very different time domain waveforms. This indicates that the waveform contour is another significant feature of unmanned aerial vehicles, and step S203 extracts this communication waveform contour.
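The 10 Hz versus roughly 30 Hz burst-rate contrast described above can be turned into a simple time domain feature. The sketch below estimates bursts per second from a signal envelope via threshold crossings; the sample rate, threshold and synthetic envelope model are all illustrative assumptions, not part of the patent.

```python
import numpy as np

def burst_rate(envelope, fs, threshold):
    """Count rising threshold crossings of the received-signal envelope and
    divide by the capture duration: a crude bursts-per-second estimate.
    The threshold-crossing scheme is our simplification."""
    above = envelope > threshold
    rises = np.count_nonzero(above[1:] & ~above[:-1])
    return rises / (len(envelope) / fs)

fs = 1000                          # 1 kHz envelope sample rate (toy value)
t = np.arange(fs) / fs             # one second of data
# synthetic envelopes: ~10 bursts/s (Wi-Fi-like) vs ~30 bursts/s (drone-like)
wifi_env = (np.sin(2 * np.pi * 10 * t) > 0.9).astype(float)
drone_env = (np.sin(2 * np.pi * 30 * t) > 0.9).astype(float)
```

On these toy envelopes the estimator recovers roughly 10 and 30 bursts per second, matching the figures quoted in the text.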
S204, performing a k-point FFT (fast Fourier transform) on the IQ time domain data I to obtain a frequency domain feature vector V3 = [j1, j2, …, jk], where j1, j2, …, jk represent the corresponding FFT result values.
S205, splicing the space domain feature vector V1, the time domain feature vector V2 and the frequency domain feature vector V3 to obtain a multi-domain collaborative vector group S = [V1, V2, V3].
And S3, inputting the multi-domain cooperation vector group obtained in the step S2 into a pre-trained neural network model, automatically extracting the comprehensive features of the multi-domain cooperation vector group by means of the feature mining capability of the deep neural network model, and outputting the existence information of the unmanned aerial vehicle.
Referring to fig. 5, the neural network model adopts a multi-layer one-dimensional CNN network to perform feature extraction, utilizes a fully-connected network layer to fuse features, and outputs a detection result by selecting a suitable activation function.
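The training recipe stated earlier (binary labels for drone present/absent, optimization with Adam until the loss converges) can be illustrated with a hand-rolled Adam loop on a toy linear detector. The model is deliberately far smaller than the CNN of fig. 5, and all data and hyperparameters are synthetic stand-ins.

```python
import numpy as np

def adam_train(X, y, steps=500, lr=0.05):
    """Fit a single-layer logistic detector with binary cross-entropy loss
    and a hand-rolled Adam optimizer (beta1=0.9, beta2=0.999): labels say
    drone present / absent, training runs until the loss has converged."""
    w = np.zeros(X.shape[1]); b = 0.0
    m = np.zeros_like(w); v = np.zeros_like(w); mb = 0.0; vb = 0.0
    beta1, beta2, eps = 0.9, 0.999, 1e-8
    for t in range(1, steps + 1):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # sigmoid output
        gw = X.T @ (p - y) / len(y)                # BCE gradient w.r.t. w
        gb = np.mean(p - y)                        # BCE gradient w.r.t. b
        m = beta1 * m + (1 - beta1) * gw
        v = beta2 * v + (1 - beta2) * gw ** 2
        mb = beta1 * mb + (1 - beta1) * gb
        vb = beta2 * vb + (1 - beta2) * gb ** 2
        w -= lr * (m / (1 - beta1 ** t)) / (np.sqrt(v / (1 - beta2 ** t)) + eps)
        b -= lr * (mb / (1 - beta1 ** t)) / (np.sqrt(vb / (1 - beta2 ** t)) + eps)
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return w, b, loss

# separable toy features standing in for simulated multi-domain vector groups
rng = np.random.default_rng(4)
X = np.vstack([rng.standard_normal((100, 8)) - 1.0,   # label 0: no drone
               rng.standard_normal((100, 8)) + 1.0])  # label 1: drone present
y = np.r_[np.zeros(100), np.ones(100)]
w, b, loss = adam_train(X, y)
```

After a few hundred Adam steps the binary cross-entropy loss is small and the toy detector separates the two classes, which is the convergence criterion the text describes.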
In this document, the terms front, back, upper and lower are used to define the components in the drawings and the positions of the components relative to each other, and are used for clarity and convenience of the technical solution. It is to be understood that the use of the directional terms should not be taken to limit the scope of the claims.
The features of the embodiments and embodiments described herein above may be combined with each other without conflict.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. A multi-domain collaborative unmanned aerial vehicle detection method based on deep learning is characterized by being achieved by adopting a monitoring antenna array, a radio frequency front end and a signal processing board, and the multi-domain collaborative unmanned aerial vehicle detection method comprises the following steps:
s1, the upper detection antenna and the lower detection antenna of the monitoring antenna array respectively receive communication waveform signals of the unmanned aerial vehicle; after low-noise amplification and filtering by the radio frequency front end, IQ down-conversion is performed to obtain first IQ time domain data and second IQ time domain data, which are input to the signal processing board;
s2, respectively carrying out data preprocessing of a space domain, a time domain and a frequency domain on the first IQ time domain data and the second IQ time domain data, extracting three eigenvectors of the data, and splicing the three eigenvectors together to form a multi-domain cooperative vector group;
s3, inputting the multi-domain cooperation vector group obtained in the step S2 into a neural network model trained in advance, and outputting the existence information of the unmanned aerial vehicle;
wherein, the steps S2 and S3 are positioned in the signal processing board;
the specific process of step S2 is as follows:
S201, intercepting, from the first IQ time domain data and the second IQ time domain data respectively, time-aligned data segments I1 and I2 of length k, and obtaining complete IQ time domain data through weighting:
I = αI1 + (1 − α)I2
wherein α represents a weighting coefficient with value range 0 ≤ α ≤ 1;
S202, calculating the amplitudes of the data I1 and I2 and taking their ratio to obtain a space domain feature vector V1 = [n1, n2, …, nk], wherein n1, n2, …, nk represent the amplitude ratios of the communication waveform signals received by the upper detection antenna and the lower detection antenna at the corresponding moments;
S203, normalizing the IQ time domain data I and removing singular values to obtain a time domain feature vector V2 = [m1, m2, …, mk], wherein m1, m2, …, mk represent the corresponding time domain eigenvalues;
S204, performing a k-point FFT (fast Fourier transform) on the IQ time domain data I to obtain a frequency domain feature vector V3 = [j1, j2, …, jk], wherein j1, j2, …, jk represent the corresponding FFT result values;
S205, splicing the space domain feature vector V1, the time domain feature vector V2 and the frequency domain feature vector V3 to obtain a multi-domain collaborative vector group S = [V1, V2, V3].
2. The deep learning-based multi-domain cooperative unmanned aerial vehicle detection method according to claim 1, wherein the input of the neural network model is the multi-domain cooperative vector group; a multilayer one-dimensional CNN performs feature extraction, fully connected network layers fuse the features, and the activation functions of the layers output the existence information of the unmanned aerial vehicle; a training set and label data are constructed from simulated or measured data, the label data being whether an unmanned aerial vehicle is present, and the neural network model is trained with the Adam optimizer until the loss function converges, yielding the trained neural network model.
3. The deep learning-based multi-domain cooperative unmanned aerial vehicle detection method according to claim 1, wherein the monitoring antenna array comprises an upper detection antenna and a lower detection antenna and forms a horizontal 120° sector in the space domain, the vertical main lobe angles of the upper detection antenna and the lower detection antenna are both 60°, and the antenna mounting point is 30 meters above the ground.
4. The deep learning-based multi-domain cooperative unmanned aerial vehicle detection method according to claim 1, wherein two channels of the radio frequency front end are synchronized with an upper detection antenna and a lower detection antenna respectively, and an unmanned aerial vehicle communication waveform signal enters the radio frequency front end through the upper detection antenna and the lower detection antenna respectively and then is subjected to low noise amplification and filtering, and then is subjected to IQ down-conversion to obtain first IQ time domain data and second IQ time domain data respectively.
5. A multi-domain collaborative unmanned aerial vehicle detection system based on deep learning, characterized in that it is implemented with a monitoring antenna array, a radio frequency front end and a signal processing board, the multi-domain collaborative unmanned aerial vehicle detection system comprising the following modules:
the data acquisition module is used for the upper detection antenna and the lower detection antenna of the monitoring antenna array to respectively receive communication waveform signals of the unmanned aerial vehicle, perform low-noise amplification and filtering through the radio frequency front end, perform IQ down-conversion to obtain first IQ time domain data and second IQ time domain data, and input them to the signal processing board;
the characteristic vector construction module is used for respectively carrying out data preprocessing of a space domain, a time domain and a frequency domain on the first IQ time domain data and the second IQ time domain data, extracting three characteristic vectors of the data, and splicing the three characteristic vectors together to form a multi-domain collaborative vector group;
the detection result output module is used for inputting the multi-domain cooperation vector group into a neural network model which is trained in advance and outputting the existence information of the unmanned aerial vehicle;
the characteristic vector construction module and the detection result output module are positioned in the signal processing board;
a time domain data construction submodule for intercepting, from the first IQ time domain data and the second IQ time domain data respectively, data segments I1 and I2 covering the same time span with length k, and obtaining the complete IQ time domain data by weighting:
I = αI1 + (1 − α)I2
where α denotes a weighting coefficient whose value range is 0 ≤ α ≤ 1;
a space domain feature vector construction submodule for calculating the amplitudes of the data I1 and I2 and taking their ratio to obtain a space domain feature vector V1 = [n1, n2, …, nk], where n1, n2, …, nk denote the amplitude ratios of the communication waveform signals received by the upper detection antenna and the lower detection antenna at the corresponding moments;
a time domain feature vector construction submodule for performing normalization and singular value removal on the IQ time domain data I to obtain a time domain feature vector V2 = [m1, m2, …, mk], where m1, m2, …, mk denote the corresponding time domain eigenvalues;
a frequency domain feature vector construction submodule for performing a k-point FFT on the IQ time domain data I to obtain a frequency domain feature vector V3 = [j1, j2, …, jk], where j1, j2, …, jk denote the corresponding FFT transform result values;
a multi-domain collaborative vector construction submodule for splicing the space domain feature vector V1, the time domain feature vector V2 and the frequency domain feature vector V3 to obtain the multi-domain collaborative vector group S = [V1, V2, V3].
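The submodule chain of claim 5 can be illustrated end to end in a few lines of NumPy. This is a minimal sketch, not the patented implementation: the helper name `build_multidomain_vector` is hypothetical, and because the claim does not specify the exact normalization and singular value removal procedure, a simple 3-sigma amplitude clip is used as a stand-in.

```python
import numpy as np

def build_multidomain_vector(iq1, iq2, alpha=0.5):
    """Sketch of claim 5's feature construction (hypothetical helper).

    iq1, iq2: complex arrays of length k (first/second IQ time domain data
    from the upper/lower detection antennas).
    alpha: weighting coefficient, 0 <= alpha <= 1.
    Returns S = [V1, V2, V3] concatenated into one vector of length 3k.
    """
    assert iq1.shape == iq2.shape and 0.0 <= alpha <= 1.0
    k = iq1.size
    eps = 1e-12  # guard against division by zero

    # Complete IQ time domain data: I = alpha*I1 + (1 - alpha)*I2
    iq = alpha * iq1 + (1.0 - alpha) * iq2

    # Space domain V1: per-sample amplitude ratio of upper/lower antennas
    v1 = np.abs(iq1) / (np.abs(iq2) + eps)

    # Time domain V2: clip amplitude outliers (stand-in for "singular value
    # removal"), then normalize to zero mean and unit variance
    amp = np.abs(iq)
    amp = np.clip(amp, None, np.median(amp) + 3.0 * amp.std())
    v2 = (amp - amp.mean()) / (amp.std() + eps)

    # Frequency domain V3: magnitude of a k-point FFT
    v3 = np.abs(np.fft.fft(iq, n=k))

    # Multi-domain collaborative vector group S = [V1, V2, V3]
    return np.concatenate([v1, v2, v3])
```

For k = 64 input samples the result is a 192-element vector, which is what the claim 6 model would consume.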
6. The deep learning-based multi-domain cooperative unmanned aerial vehicle detection system according to claim 5, wherein the input of the neural network model is the multi-domain collaborative vector group; a multilayer one-dimensional CNN network is adopted for feature extraction, fully connected network layers are used for feature fusion, and the existence information of the unmanned aerial vehicle is output through the layer activation functions; a training set and label data are constructed from simulated or measured data, the label data indicating whether an unmanned aerial vehicle is present, and the neural network model is optimized with the Adam optimization function until the loss function converges, yielding the trained neural network model.
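The model of claim 6 (one-dimensional convolutional feature extraction, fully connected fusion, activation output) can be sketched as a bare NumPy forward pass. All function names and weights below are hypothetical placeholders; a real system would learn the weights with Adam as the claim describes, typically in a framework such as PyTorch or TensorFlow.

```python
import numpy as np

def sigmoid(x):
    """Logistic activation mapping a real value to a (0, 1) probability."""
    return 1.0 / (1.0 + np.exp(-x))

def conv1d(x, kernels):
    """Valid 1-D convolution of a single-channel signal with each kernel."""
    return np.stack([np.convolve(x, k, mode="valid") for k in kernels])

def detect_presence(s, kernels, w_fc, b_fc):
    """Sketch of claim 6's detector on a multi-domain vector group s.

    kernels: list of 1-D conv filters (the "multilayer 1-D CNN" reduced to
    one layer for brevity); w_fc, b_fc: fully connected fusion weights.
    Returns the probability that a UAV is present.
    """
    feats = np.maximum(conv1d(s, kernels), 0.0)   # conv + ReLU
    pooled = feats.mean(axis=1)                   # global average pooling
    return sigmoid(pooled @ w_fc + b_fc)          # presence probability
```

Thresholding the returned probability (e.g. at 0.5) yields the binary "UAV present / absent" existence information that the detection result output module reports.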
7. The deep learning-based multi-domain cooperative unmanned aerial vehicle detection system according to claim 5, wherein the monitoring antenna array comprises an upper detection antenna and a lower detection antenna and forms a horizontal 120° sector in the spatial domain, the main lobe angles of the upper detection antenna and the lower detection antenna in the vertical direction are both 60°, and the antenna mounting point is 30 meters above the ground.
8. The deep learning-based multi-domain cooperative unmanned aerial vehicle detection system according to claim 5, wherein the two channels of the radio frequency front end are synchronized and correspond to the upper detection antenna and the lower detection antenna respectively; the unmanned aerial vehicle communication waveform signals enter the radio frequency front end through the upper detection antenna and the lower detection antenna respectively, undergo low-noise amplification and filtering, and are then IQ down-converted to obtain the first IQ time domain data and the second IQ time domain data respectively.
CN202010478187.1A 2020-05-29 2020-05-29 Multi-domain cooperative unmanned aerial vehicle detection method and system based on deep learning Active CN111817794B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010478187.1A CN111817794B (en) 2020-05-29 2020-05-29 Multi-domain cooperative unmanned aerial vehicle detection method and system based on deep learning

Publications (2)

Publication Number Publication Date
CN111817794A CN111817794A (en) 2020-10-23
CN111817794B true CN111817794B (en) 2021-04-13

Family

ID=72848410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010478187.1A Active CN111817794B (en) 2020-05-29 2020-05-29 Multi-domain cooperative unmanned aerial vehicle detection method and system based on deep learning

Country Status (1)

Country Link
CN (1) CN111817794B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112835110A (en) * 2021-01-07 2021-05-25 湖北甄业科技有限公司 Passive detection method for civil unmanned aerial vehicle system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108280395A (en) * 2017-12-22 2018-07-13 中国电子科技集团公司第三十研究所 A kind of efficient identification method flying control signal to low small slow unmanned plane
CN110084094A (en) * 2019-03-06 2019-08-02 中国电子科技集团公司第三十八研究所 A kind of unmanned plane target identification classification method based on deep learning
CN110516683A (en) * 2018-05-21 2019-11-29 朱姝 A kind of unmanned plane detection image recognition methods
CN111062310A (en) * 2019-12-13 2020-04-24 哈尔滨工程大学 Few-sample unmanned aerial vehicle image identification method based on virtual sample generation

Also Published As

Publication number Publication date
CN111817794A (en) 2020-10-23

Similar Documents

Publication Publication Date Title
Azari et al. Key technologies and system trade-offs for detection and localization of amateur drones
Sedunov et al. Stevens drone detection acoustic system and experiments in acoustics UAV tracking
CN110297213A (en) Radiation source positioning device and method based on the unmanned aerial vehicle platform for loading relatively prime linear array
CN105717486A (en) Positioning method and system for radio interference source
CN108089205A (en) A kind of unmanned plane flies to control personnel location system
CN111817794B (en) Multi-domain cooperative unmanned aerial vehicle detection method and system based on deep learning
Wang et al. Multi-classification of UWB signal propagation channels based on one-dimensional wavelet packet analysis and CNN
Oliveira et al. Low cost antenna array based drone tracking device for outdoor environments
CN114417908A (en) Multi-mode fusion-based unmanned aerial vehicle detection system and method
Wakabayashi et al. Drone audition listening from the sky estimates multiple sound source positions by integrating sound source localization and data association
CN110390273A (en) A kind of indoor occupant intrusion detection method based on multicore transfer learning
CN107884749B (en) Low-altitude unmanned-machine passive acoustic detection positioning device
CN109547936A (en) Indoor orientation method based on Wi-Fi signal and environmental background sound
CN116669035A (en) Airborne intelligent reflecting surface-assisted general sense integrated safe transmission design method
CN115586487A (en) Passive detection positioning system for low-altitude unmanned aerial vehicle
CN112418178B (en) Unmanned aerial vehicle intelligent detection method and system
Banerjee et al. A novel sound source localization method using a global-best guided cuckoo search algorithm for drone-based search and rescue operations
CN115087341B (en) Electromagnetic signal scrambling method and system based on waveguide window
CN117135639B (en) Wireless communication coverage prediction method, system and equipment based on neural network
Ezuma UAV detection and classification using radar, radio frequency and machine learning techniques
Silic et al. An experimental evaluation of radio models for localizing fixed-wing UAVs in rural environments
CN215728771U (en) Unmanned aerial vehicle detection device based on passive audio frequency
Egi et al. An efficient architecture for modeling path loss on forest canopy using LiDAR and wireless sensor networks fusion
Mandal et al. Intruder Drone Detection using Unmanned Aerial Vehicle Borne Radar (UAVBR) via Reconfigurable Intelligent Reflective Surface (IRS)
WO2019023829A1 (en) Unmanned aerial vehicle interference method and interference device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant