CN115032627A - Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method - Google Patents


Info

Publication number
CN115032627A
CN115032627A (application CN202210447621.9A)
Authority
CN
China
Prior art keywords
information
target
aerial vehicle
unmanned aerial
radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210447621.9A
Other languages
Chinese (zh)
Inventor
郭强
刘策越
李岩
常亮
周启晨
张先国
任传伦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cetc Cyberspace Security Research Institute Co ltd
CETC 15 Research Institute
CETC 30 Research Institute
Original Assignee
Cetc Cyberspace Security Research Institute Co ltd
CETC 15 Research Institute
CETC 30 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cetc Cyberspace Security Research Institute Co ltd, CETC 15 Research Institute, CETC 30 Research Institute filed Critical Cetc Cyberspace Security Research Institute Co ltd
Priority to CN202210447621.9A
Publication of CN115032627A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 — Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 — Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 — Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/66 — Radar-tracking systems; Analogous systems
    • G01S 13/72 — Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 — Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a distributed multi-sensor multi-modal unmanned cluster target fusion tracking method and system. The system comprises: a radar device, which transmits electromagnetic waves, analyses the echo signals to obtain pose information of the target unmanned aerial vehicle, and sends the pose information to the control device; a radio device, which acquires the remote-control and image-transmission signals of the target unmanned aerial vehicle and extracts features from the acquired signals; a photoelectric device, which is aimed at the target unmanned aerial vehicle and acquires its image information; and a control device, which aims the photoelectric device at the target unmanned aerial vehicle according to the pose information. The control device is further configured to determine tracking information of the target unmanned aerial vehicle according to the pose information, the signal feature information, and the image information. The scheme provided by the application can reduce the false alarm rate of detection and thereby improve the accuracy of unmanned aerial vehicle detection.

Description

Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method
Technical Field
The invention relates to the technical field of detection, in particular to a distributed multi-sensor multi-mode unmanned cluster target fusion tracking method.
Background
Unmanned aerial vehicles have become a focus of advanced research in many countries owing to their high operational efficiency, strong flexibility, controllable cost, and high cost-effectiveness ratio, so the detection of and defence against unmanned aerial vehicles are becoming increasingly important. Moreover, with the development of UAV autonomy and networked communication technology, the mode of operation is shifting from single-vehicle missions to cooperative UAV-cluster missions. Detecting, tracking, and defending against a UAV cluster is far more complex than defending against a single UAV, and accurately detecting a UAV cluster is a central difficulty in UAV prevention and control.
At present, radar, radio, and photoelectric detection are the main detection means for UAV defence. Radar equipment can obtain the distance, speed, bearing, and other information of a UAV by analysing the round-trip delay and phase of electromagnetic signals. However, radar detection is easily disturbed by continuously moving objects such as meteorological clutter and birds, so the false alarm rate of the data is high, which affects the accuracy of UAV detection.
Radio detection equipment can detect and search for the radio communication signals between a UAV and its remote controller in flight, obtaining information such as signal frequency and bearing, but its direction-finding precision is poor.
Photoelectric equipment combines infrared and visible light and can provide clear, intuitive video of a target, but its target-tracking range is limited.
To improve the probability of detecting and identifying unmanned cluster targets in various application environments and to reduce the false-alarm and missed-alarm rates, distributed multi-modal sensors are used together for detection and tracking, and the multi-sensor multi-modal detection data are fused so that the sensors complement one another in function and performance, improving the reliability and credibility of the combined detection result.
Disclosure of Invention
The application aims to provide a distributed multi-sensor multi-modal unmanned cluster target fusion tracking method and system that reduce the false alarm rate of detection and thereby improve the accuracy of unmanned aerial vehicle cluster detection. The application adopts the following technical scheme:
in a first aspect, an embodiment of the present application provides a distributed multi-sensor multi-modal unmanned cluster target fusion tracking system, where the system includes:
the radar equipment is used for transmitting electromagnetic waves, analyzing the echo signals to obtain pose information of the target unmanned aerial vehicle, and sending the pose information to the control equipment;
the photoelectric equipment is used for aligning the target unmanned aerial vehicle and acquiring image information of the target unmanned aerial vehicle;
the control device is used for controlling the photoelectric device to be aligned with the target unmanned aerial vehicle according to the pose information;
the control equipment is further used for determining tracking information of the target unmanned aerial vehicle according to the pose information and the image information.
Optionally, the system further comprises:
the radio equipment is used for acquiring remote control and image transmission signals of the target unmanned aerial vehicle, performing characteristic extraction and direction finding on a signal source of the acquired signals to obtain signal direction information of the signal source and signal characteristic information of the signal source, and sending the signal characteristic information and the direction information to the control equipment;
the control device is specifically configured to determine tracking information of the target unmanned aerial vehicle according to the pose information, the image information, the signal feature information, and the signal orientation information.
Optionally, the control device is further configured to:
determining signal characteristics such as frequency spectrum and power of the target unmanned aerial vehicle from the intercepted signals, screening target signal characteristics matched with the signal characteristic information from a preset characteristic library, and determining the type of the unmanned aerial vehicle corresponding to the target signal characteristics as the type of the target unmanned aerial vehicle;
and signal characteristics corresponding to the types of the unmanned aerial vehicles are stored in the preset characteristic library.
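The feature-library lookup described above can be sketched as follows. This is a hedged illustration: the feature fields, tolerance values, and library entries are hypothetical examples, not values from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignalFeature:
    center_freq_mhz: float  # carrier centre frequency of the intercepted link
    bandwidth_mhz: float    # occupied bandwidth
    power_dbm: float        # received power (kept for display, not matching)

# Preset feature library: drone type -> nominal signal feature (hypothetical)
FEATURE_LIBRARY = {
    "type_A": SignalFeature(2440.0, 10.0, -60.0),
    "type_B": SignalFeature(5800.0, 20.0, -55.0),
}

def classify_drone(measured: SignalFeature,
                   freq_tol_mhz: float = 5.0,
                   bw_tol_mhz: float = 3.0) -> Optional[str]:
    """Return the library type whose spectrum best matches, or None."""
    best, best_err = None, float("inf")
    for drone_type, ref in FEATURE_LIBRARY.items():
        d_freq = abs(measured.center_freq_mhz - ref.center_freq_mhz)
        d_bw = abs(measured.bandwidth_mhz - ref.bandwidth_mhz)
        if d_freq <= freq_tol_mhz and d_bw <= bw_tol_mhz:
            err = d_freq + d_bw
            if err < best_err:
                best, best_err = drone_type, err
    return best
```

A measurement that falls within tolerance of a library entry is assigned that entry's drone type; anything else is left unclassified.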
Optionally, the tracking information includes track information and video tracking information of the target drone.
Optionally, the system further comprises:
the display device is used for acquiring and displaying the tracking information from the control device;
the control equipment is also used for acquiring the running state information of the radar equipment, the photoelectric equipment and the radio equipment and sending the running state information to the display equipment;
the display device is further configured to display the operating state information.
Optionally, before determining the tracking information of the target drone according to the pose information and the image information, the control device is further configured to:
determining location information of a protection destination;
determining a position relation between the target unmanned aerial vehicle and the protection destination according to the pose information and a relative position relation, wherein the relative position relation is the position relation between the protection destination and the radar equipment.
Optionally, the pose information is coordinate information of the target unmanned aerial vehicle in a spherical coordinate system, and the spherical coordinate system takes the radar device as an origin of coordinates;
before determining the position relationship between the target unmanned aerial vehicle and the protection destination according to the pose information and the relative position relationship, the control device is further configured to:
converting the coordinate information of the target unmanned aerial vehicle under a spherical coordinate system into the coordinate information of the target unmanned aerial vehicle under a space rectangular coordinate system, wherein the space rectangular coordinate system takes the radar equipment as a coordinate origin;
the control device is specifically configured to determine, according to the coordinate information of the target unmanned aerial vehicle in the space rectangular coordinate system and the relative position relationship, the position information of the target unmanned aerial vehicle in the protection rectangular coordinate system, where the protection rectangular coordinate system is the space rectangular coordinate system whose coordinate origin is the location of the protection destination.
Optionally, the control device is further configured to convert the position information of the target unmanned aerial vehicle in the protection rectangular coordinate system into position information in a protection spherical coordinate system, where the protection spherical coordinate system is the spherical coordinate system whose coordinate origin is the location of the protection destination.
Optionally, the system includes multiple radar devices; the radar devices are asynchronous, their sampling bands differ, and together they form a radar networking device;
the control device is further configured to, before controlling the optoelectronic device to aim at the target drone according to the pose information:
determining, from among the multiple radar devices, a reference radar device with the shortest sampling interval;
for the pose information detected by the reference radar device at a first moment, determining the target pose information corresponding to each target radar device (every radar device other than the reference radar device), where the target pose information is the pose information detected by that target radar device at the moment closest to the first moment;
and determining the pose information detected by the radar networking device at the first moment by combining the pose information detected by the reference radar device at the first moment with the target pose information of each target radar device.
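The time-alignment steps above can be sketched as follows — an illustrative sketch in which the radar names, sampling intervals, and the `(timestamp, pose)` track format are assumptions, not details from the patent:

```python
def pick_reference(sampling_intervals):
    """sampling_intervals: dict radar_name -> sampling interval (seconds).
    The reference radar is the one with the shortest sampling interval."""
    return min(sampling_intervals, key=sampling_intervals.get)

def nearest_measurement(track, t):
    """track: list of (timestamp, pose) pairs from one target radar.
    Return the pose detected at the moment closest to reference time t."""
    return min(track, key=lambda tp: abs(tp[0] - t))[1]
```

For each reference-radar detection at time `t`, `nearest_measurement` selects the target pose information from every other radar; the selected measurements are then fused as described below.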
Optionally, the radio device is specifically configured to acquire a remote control signal and/or a map-borne signal of the target drone.
Optionally, the control device is specifically configured to determine the pose information detected by the radar networking device at the first moment according to the following formula:

X̂ = (R_J^{-1} + Σ_{i=1}^{m} R_i^{-1})^{-1} · (R_J^{-1} X_J + Σ_{i=1}^{m} R_i^{-1} X_i)

where X̂ represents the pose information detected by the radar networking device at the first moment, R_J^{-1} represents the inverse of the measurement error of the reference radar device, R_i^{-1} represents the inverse of the measurement error of the i-th target radar device, m is the number of target radar devices, X_J represents the pose information detected by the reference radar device at the first moment, and X_i represents the target pose information corresponding to the i-th target radar device.
Optionally, the control device is further configured to determine the measurement error of the radar networking device according to the following formula:

R^{-1} = R_J^{-1} + Σ_{i=1}^{m} R_i^{-1}

where R^{-1} represents the inverse of the measurement error of the radar networking device.
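A scalar sketch of the inverse-error-weighted fusion above: each measurement is weighted by the inverse of its measurement error, and the fused error is the inverted sum of the inverse errors. The matrix case has the same structure with covariance matrices in place of scalar variances; the function name and argument layout are illustrative.

```python
def fuse(x_ref, err_ref, targets):
    """x_ref, err_ref: reference radar measurement and its measurement error.
    targets: list of (x_i, err_i) pairs from the target radar devices.
    Returns (fused pose estimate, fused measurement error)."""
    inv_sum = 1.0 / err_ref + sum(1.0 / e for _, e in targets)   # R^{-1}
    weighted = x_ref / err_ref + sum(x / e for x, e in targets)  # weighted sum
    return weighted / inv_sum, 1.0 / inv_sum
```

With equal errors the fusion reduces to an average; a more accurate radar (smaller error) pulls the fused estimate toward its own measurement.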
In a second aspect, an embodiment of the present application provides a distributed multi-sensor multi-modal unmanned cluster target fusion tracking method, which is applied to a distributed multi-sensor multi-modal unmanned cluster target fusion tracking system, where the system includes: the radar device and the photoelectric device are in communication connection with the control device; the method comprises the following steps:
the radar equipment transmits electromagnetic waves, obtains pose information of the target unmanned aerial vehicle by analyzing echo signals, and sends the pose information to the control equipment;
the control device controls the photoelectric device to be aligned with the target unmanned aerial vehicle according to the pose information;
after the photoelectric device is aligned with the target unmanned aerial vehicle, the photoelectric device acquires image information of the target unmanned aerial vehicle;
and the control equipment determines the tracking information of the target unmanned aerial vehicle according to the pose information and the image information.
Optionally, the system further includes a radio device;
the method further comprises the following steps:
the radio equipment acquires signals sent or received by the target unmanned aerial vehicle, performs characteristic extraction and direction finding on a signal source of the acquired signals to obtain signal direction information of the signal source and signal characteristic information of the signal source, and sends the signal characteristic information and the direction information to the control equipment;
the control equipment determines tracking information of the target unmanned aerial vehicle according to the pose information and the image information, and the tracking information comprises the following steps:
and the control equipment determines the tracking information of the target unmanned aerial vehicle according to the pose information, the image information, the signal characteristic information and the signal azimuth information.
The radar device of the unmanned aerial vehicle detection system provided by the application can obtain the pose information of the target unmanned aerial vehicle by analysing the echo signals and send that pose information to the control device, so the control device can acquire the pose information of the target unmanned aerial vehicle. Through cooperative detection and cross-fusion of the radar device and the radio device, the probability of discovering the UAV target is improved and false alarms caused by low-speed targets such as meteorological clutter and birds are reduced. The control device can aim the photoelectric device at the target unmanned aerial vehicle according to the pose information, and the photoelectric device can then accurately acquire image information of the target, so the control device can accurately determine the tracking information of the target unmanned aerial vehicle from the pose information and the image information.
Thus, in the scheme provided by the application, the radar device, the radio device, and the photoelectric device are fused. After the radar device detects the pose information of the target unmanned aerial vehicle, the photoelectric device can aim at the target based on that pose information and capture images of the target detected by the radar. The control device can then judge from the images acquired by the photoelectric device whether the pose information detected by the radar is accurate; and when the radar is disturbed by continuously moving objects such as meteorological clutter and birds and raises a false alarm, the control device can further verify the radar's information against the images acquired by the photoelectric device and the signal features acquired by the radio device. This reduces the false alarm rate of the data and improves the detection accuracy for unmanned aerial vehicles.
Drawings
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle detection system provided in the present application;
FIG. 2 is a schematic diagram showing the relationship between spherical coordinates and rectangular coordinates;
FIG. 3 is a schematic diagram showing comparison of sampling periods of different-band radars;
fig. 4 is a schematic flow chart of the unmanned aerial vehicle detection method provided in the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
In the following, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or as implying any indication of the number of technical features indicated. Thus, a feature defined as "first," "second," etc. may explicitly or implicitly include one or more of that feature.
To reduce the false alarm rate of detection and improve the accuracy of unmanned aerial vehicle detection, the first embodiment of the application provides a distributed multi-sensor multi-modal unmanned cluster target fusion tracking system. The scheme provided by the application can be used to detect unmanned aerial vehicles and is particularly suitable for accurately detecting clusters of low-altitude, small unmanned aerial vehicles.
As shown in fig. 1, the system includes: the device comprises a control device 1, a radar device 2, a radio device 3 and a photoelectric device 4, wherein the control device is respectively connected with the radar device 2, the radio device 3 and the photoelectric device 4 in a communication mode. Specifically, the control device 1 may be communicatively connected to the radar device 2, the radio device 3, and the photoelectric device 4 through a wired communication cable or through wireless communication.
In the embodiment of the application, before detection, the system can register each detection device so as to give the radar device 2, the radio device 3, and the photoelectric device 4 standardized protocol access. The control device 1 may issue instruction control to a radar device 2 that accesses it and configure working parameters of the radar device 2 such as the scanning range, working frequency points, pan/tilt rotational speed, sweep frequency, and due-north direction. Specifically, timing may be provided by the GPS module of the radar device and calibrated against the time of the control device 1 and the display device 5 described below; north correction is performed based on the detection data of the GPS module, and the radar target-detection function is then started.
The control device 1 may be a computer, a desktop computer, a notebook computer, a server, a mobile terminal, or other devices having data processing and signal control functions, and the application does not limit the specific form of the control device 1.
The radar device 2 is used for emitting electromagnetic waves, analyzing the echo signals to obtain pose information of the target unmanned aerial vehicle, and sending the obtained pose information to the control device 1.
The radio device 3 is used for searching for the remote control and image transmission signals of the unmanned aerial vehicle, reading the characteristics of the signals such as frequency spectrum and power, and sending the obtained signal information to the control device 1.
The optoelectronic device 4 is an image capturing device, and the optoelectronic device 4 may be a camera, a video camera, or the like, but is not limited thereto. The optoelectronic device 4 may capture a picture or video of the target drone. The optoelectronic device 4 may have a visible light and infrared light collection function to collect an infrared image and a visible light image of the target drone.
The pose information may include the distance of the target drone from the radar device 2, the speed, height, elevation, azimuth, and other motion information of the target drone, so as to achieve continuous stable tracking of the target drone. The radar device 2 can search and emit electromagnetic waves to a full airspace through the scanning antenna, and extract pose information of the target unmanned aerial vehicle by analyzing Doppler characteristics of echo signals.
The control device 1 is configured to aim the photoelectric device 4 at the target unmanned aerial vehicle according to the pose information. Specifically, the control device 1 may control the pan/tilt head of the photoelectric device 4 to rotate according to the pose information so that the photoelectric device 4 turns until it is aligned with the target unmanned aerial vehicle. The control device 1 can control parameters such as the rotational speed, azimuth angle, and pitch angle of the pan/tilt head, and can command the photoelectric device 4 to adjust its focal length, zoom, and lock focus on the target unmanned aerial vehicle, so as to accurately acquire visible-light and infrared detection video of the target.
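The pan/tilt alignment described above can be sketched as a rate-limited pointing step. This is an illustrative sketch: the coordinate convention (x north, y east, z up, target position expressed in the turret frame) and the per-cycle slew limit are assumptions, not parameters from the patent.

```python
import math

def pointing_command(target_xyz, current_az, current_el, max_step=0.05):
    """Return the next (azimuth, elevation) command in radians, moving at
    most max_step radians per control cycle toward the target direction."""
    x, y, z = target_xyz
    desired_az = math.atan2(y, x)
    desired_el = math.atan2(z, math.hypot(x, y))

    def step(current, desired):
        err = desired - current
        err = (err + math.pi) % (2 * math.pi) - math.pi  # wrap error to [-pi, pi]
        return current + max(-max_step, min(max_step, err))

    return step(current_az, desired_az), step(current_el, desired_el)
```

Called once per control cycle with the latest radar pose, this converges the turret onto the target while respecting the pan/tilt head's slew-rate limit.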
Under the control of the control device 1, the optoelectronic device 4 is aimed at the target unmanned aerial vehicle; once aligned, it acquires image information of the target unmanned aerial vehicle.
The control device 1 is further configured to determine tracking information of the target unmanned aerial vehicle according to the pose information obtained by the radar device 2 and the image information obtained by the photoelectric device 4. Specifically, the control device 1 may integrate the pose information and the image information into a comprehensive message including, but not limited to, the position, the speed, the video, the point track information, and the like of the target drone.
The radar device 2 of the unmanned aerial vehicle detection system provided by the application can obtain the pose information of the target unmanned aerial vehicle by analysing the echo signals and send that pose information to the control device 1, so the control device 1 can acquire the pose information of the target unmanned aerial vehicle. Because the control device 1 can aim the photoelectric device 4 at the target according to this pose information, the photoelectric device 4 can acquire image information of the target very accurately, and the control device 1 can in turn accurately determine the tracking information of the target unmanned aerial vehicle from the pose information and the image information.
Thus, the scheme provided by the application integrates the radar device and the photoelectric device. After the radar device detects the pose information of the target unmanned aerial vehicle, the photoelectric device can aim at the target unmanned aerial vehicle based on that pose information and capture images of the target detected by the radar, so the control device can judge from those images whether the radar's pose information is accurate; and when the radar is disturbed by continuously moving objects such as meteorological clutter and birds and raises a false alarm, the images acquired by the photoelectric device can further confirm whether the radar's information is accurate. This reduces the false alarm rate of the data and improves the accuracy of UAV detection. The radar detection information and the photoelectric detection information are complementary and cross-validate each other, reducing the false alarm rate of the system.
In one embodiment, as shown in fig. 1, the system may further include a radio device communicatively coupled to the control device. The radio device is configured to obtain signals sent or received by the target unmanned aerial vehicle, perform feature extraction and direction finding on the signal source of the obtained signals to obtain the signal direction information and the signal feature information of the signal source, and send the signal feature information and the direction information to the control device.
The control device is specifically used for determining tracking information of the target unmanned aerial vehicle according to pose information obtained by the radar device, image information and signal characteristic information obtained by the photoelectric device and the signal azimuth information. Specifically, the above-mentioned comprehensive packet may further include information such as a signal spectrum and a category of the target drone.
The radio device can use radio-frequency scanning technology to reconnoitre and analyse the data-transmission and image-transmission signals of the target unmanned aerial vehicle. Specifically, the radio device may collect signals of the target unmanned aerial vehicle using a spectrum-sensing antenna and perform target direction finding using radio direction-finding technology. The radio device can detect information such as the frequency and direction of the link between the target unmanned aerial vehicle and its remote controller during flight.
In this embodiment, the radio device can intercept signals sent or received by the target unmanned aerial vehicle and perform direction finding on the intercepted signals to obtain the direction information of the target. This further improves detection accuracy and yields multi-modal pose information of the target unmanned aerial vehicle, which facilitates accurate analysis.
In a particular embodiment, the control device is further configured to: and screening out target signal characteristics matched with the signal characteristic information from a preset characteristic library, and determining the type of the unmanned aerial vehicle corresponding to the target signal characteristics as the type of the target unmanned aerial vehicle.
And signal characteristics corresponding to the types of the unmanned aerial vehicles are stored in a preset characteristic library.
The signal characteristics may include electromagnetic characteristics of the signal, and specifically may include, but are not limited to, a spectrum, a wavelength, and the like of the electromagnetic signal.
This embodiment can determine the type of the target unmanned aerial vehicle, which facilitates accurate analysis of the target according to its type.
In a specific embodiment, the tracking information may include track information and video tracking information of the target drone. In this embodiment, the control device may determine the track information and the video tracking information of the target unmanned aerial vehicle according to the pose information and the image information.
This embodiment can determine the flight track and tracking video of the target unmanned aerial vehicle, making it easier for defence personnel to view and monitor the target.
In one embodiment, the system may further include a display device 5, and the display device 5 is configured to acquire and display the tracking information from the control device. Like this, be convenient for the protection personnel from the actual equipment operating condition of looking over target unmanned aerial vehicle to in discovery target unmanned aerial vehicle and protection target unmanned aerial vehicle.
The control device may be further configured to acquire operation state information of the radar device, the photoelectric device, and the radio device and send it to the display device 5, and the display device 5 is further configured to display the operation state information. The operation state information may indicate whether the operating state of a device is normal. In this way, working personnel can discover in time whether each detection device is operating normally.
In a specific embodiment, before determining the tracking information of the target drone according to the pose information and the image information, the control device is further configured to:
determining location information of a protection destination;
and determining the position relationship between the target unmanned aerial vehicle and the protection destination according to the pose information and a relative positional relationship.
The relative positional relationship is the positional relationship between the protection destination and the radar device.
The location information of the protection destination may be the location where the protection destination is situated; for example, it may be the longitude and latitude of the protection destination, or the location information may be represented in other forms, which this application does not specifically limit.
Since the position information of the radar device is preset and known, once the position information of the protection destination is determined, the positional relationship between the protection destination and the radar device (i.e., the relative positional relationship) can easily be determined. The relative positional relationship may include the distance between the protection destination and the radar device, and the orientation of the protection destination relative to the radar device.
This embodiment determines the positional relationship between the target unmanned aerial vehicle and the protection destination, so that protection personnel can conveniently learn the position of the target unmanned aerial vehicle relative to the protection destination.
In a specific embodiment, the pose information is coordinate information of the target drone in a spherical coordinate system, where the spherical coordinate system uses the radar device as the origin of coordinates. Because the pose information detected by most radars is expressed in spherical coordinates, using spherical coordinate information in this embodiment makes the system more convenient to set up.
Before determining the position relationship between the target unmanned aerial vehicle and the protection destination according to the pose information and the relative position relationship, the control device is further configured to:
and converting the information of the spherical coordinate system into the information of the spatial rectangular coordinate system of the target unmanned aerial vehicle under the spatial rectangular coordinate system.
The space rectangular coordinate system takes the radar equipment as the origin of coordinates.
In the embodiment of the application, the protection ground coordinates can be set through the control equipment, and coordinate transformation and spatial registration are carried out on the detection data of the asynchronous radar equipment.
For example, the target drone detected by the radar device has spherical coordinate information (R, α, β) in a spherical coordinate system, which is to be converted into spatial rectangular coordinate information (x, y, z). The corresponding geometrical relationship between the spherical coordinate system and the spatial rectangular coordinate system is shown in fig. 2. The position of the radar device is taken as the origin O; the line R connecting the origin O with the target unmanned aerial vehicle M is the polar radius; the geomagnetic-east direction is the x-axis direction and the geomagnetic-north direction is the y-axis direction; the angle α between the polar radius R and its horizontal projection r onto the XOY plane is the elevation angle, and the angle β between the horizontal projection r and the positive y-axis direction is the azimuth angle. The coordinate transformation between the two coordinate systems is shown in formula (1):

$$x = R\cos\alpha\sin\beta,\qquad y = R\cos\alpha\cos\beta,\qquad z = R\sin\alpha \tag{1}$$

Therefore, in this embodiment, the control device can convert the spherical coordinate information of the target drone into spatial rectangular coordinate information according to formula (1).
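A minimal sketch of the formula (1) conversion, assuming angles in radians, x pointing geomagnetic east, y pointing geomagnetic north, and the azimuth measured from north:

```python
import math

def spherical_to_rect(R, alpha, beta):
    """Convert radar spherical coordinates (range R, elevation alpha,
    azimuth beta; angles in radians) to rectangular coordinates
    (x east, y north, z up) per formula (1). The horizontal
    projection of the polar radius is r = R * cos(alpha)."""
    r = R * math.cos(alpha)   # projection onto the XOY plane
    x = r * math.sin(beta)    # east component
    y = r * math.cos(beta)    # north component
    z = R * math.sin(alpha)   # vertical component
    return x, y, z
```

For instance, a target at range 100 m, zero elevation, and azimuth π/2 (due east) maps to a point on the positive x-axis.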
Optionally, the control device is specifically configured to translate the spatial rectangular coordinate system information to obtain position information of the target unmanned aerial vehicle in the protection rectangular coordinate system. Translation information for performing translation is determined according to the relative positional relationship. For example, performing the translation may be performed as in equation (2).
$$P_{dest} = P_{radar} + \Delta P \tag{2}$$

where ΔP represents the positional relationship between the protection destination and the radar device, $P_{radar}$ represents the spatial rectangular coordinate information (with the radar device as origin), and $P_{dest}$ represents the position information of the target unmanned aerial vehicle in the protection rectangular coordinate system.
In the destination coordinate system, the spatial rectangular coordinates can then be restored to spherical coordinates.
The control equipment is specifically used for determining the position information of the target unmanned aerial vehicle under the protection rectangular coordinate system according to the space rectangular coordinate system information and the relative position relation.
The protection rectangular coordinate system is as follows: and the position of the protective place is taken as a space rectangular coordinate system of the coordinate origin.
This embodiment can determine the coordinates of the target unmanned aerial vehicle in the spatial rectangular coordinate system whose origin is the position of the protection destination. Because coordinates in a rectangular coordinate system are easier to identify and confirm, this embodiment makes it more convenient for protection personnel to determine the position of the target unmanned aerial vehicle.
In a particular embodiment, the control device is further configured to: convert the position information of the target unmanned aerial vehicle in the protection rectangular coordinate system into position information in a protection spherical coordinate system, where the protection spherical coordinate system is: a spherical coordinate system taking the position of the protection destination as the origin of coordinates. To meet the need to express coordinates in different forms, this embodiment enables protection personnel to learn the coordinates of the target unmanned aerial vehicle in the spherical coordinate system centered on the protection destination, which facilitates tracking from multiple perspectives.
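The translation of equation (2) into the protection rectangular frame, followed by restoration to a protection-centered spherical coordinate system, might be sketched as follows. The function names are illustrative, and ΔP is assumed here to be the radar position expressed in the protection frame:

```python
import math

def radar_to_protection(p_radar, delta_p):
    """Equation (2): translate target coordinates from the
    radar-origin rectangular frame into the protection-destination
    rectangular frame. delta_p encodes the relative positional
    relationship between the radar device and the protection
    destination (an assumption on its exact definition)."""
    return tuple(pr + dp for pr, dp in zip(p_radar, delta_p))

def rect_to_spherical(x, y, z):
    """Restore rectangular coordinates (x east, y north, z up) to
    spherical coordinates (range R, elevation, azimuth-from-north);
    the inverse of formula (1)."""
    R = math.sqrt(x * x + y * y + z * z)
    alpha = math.atan2(z, math.hypot(x, y))  # elevation angle
    beta = math.atan2(x, y)                  # azimuth from north
    return R, alpha, beta
```

Chaining the two functions gives the target's range and bearing as seen from the protection destination, which is the quantity most useful to protection personnel on site.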
In an embodiment, the radar device may include a plurality of radar devices, the plurality of radar devices are asynchronous radar devices, the plurality of radar devices form a radar networking device, and sampling bands of the plurality of radar devices are different.
That the sampling wave bands of the plurality of radar devices are different can be understood as meaning that the wave band and the sampling interval of every two radar devices differ. For example, the sampling band of one radar device is the L band while that of another is the Ku band. The radar devices may also operate in other bands; this application is not particularly limited in this respect.
This embodiment can detect different types of target unmanned aerial vehicles through multiple radar devices with different wave bands, so the detectable band range is wider and the detection accuracy is higher.
The control device is further configured to, before controlling the optoelectronic device to aim at the target drone according to the pose information:
determining a reference radar device with the shortest sampling interval from a plurality of radar devices;
determining, for the pose information detected by the reference radar device at a first moment, target pose information respectively corresponding to each target radar device other than the reference radar device, wherein the target pose information is the pose information detected by that target radar device at the moment closest to the first moment;
and determining the position and attitude information detected by the radar networking equipment at the first moment by combining the position and attitude information detected by the reference radar equipment at the first moment and the position and attitude information of each target.
The first time is each time when the reference radar device performs sampling.
Correspondingly, the control device is specifically configured to control the photoelectric device to aim at the target unmanned aerial vehicle according to pose information obtained by the radar networking device at the first moment. The control device is specifically used for determining tracking information of the target unmanned aerial vehicle according to the pose information and the image information obtained by the radar networking device at the first moment.
In this embodiment, the radar device with the shortest sampling interval is used as the reference radar device, and the pose information of the target unmanned aerial vehicle obtained by the radar networking device is formed in combination with each target radar device, so that the obtained pose information carries more information and is more accurate.
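The time-registration step described above — taking the radar with the shortest sampling interval as reference and matching each of its samples with the nearest-in-time sample from every other radar — can be sketched as follows (the track data layout is an assumption):

```python
def time_register(reference_track, target_tracks):
    """For each sample (t, pose) of the reference radar (the one with
    the shortest sampling interval), pick from every other radar the
    sample whose timestamp is closest to t. Tracks are lists of
    (timestamp, pose) tuples. Returns a list of
    (t, reference_pose, [nearest pose per target radar])."""
    registered = []
    for t, pose in reference_track:
        nearest = [min(track, key=lambda s: abs(s[0] - t))[1]
                   for track in target_tracks]
        registered.append((t, pose, nearest))
    return registered
```

Each registered tuple then feeds the weighted fusion described below; a production implementation would likely use a binary search over sorted timestamps rather than a linear `min` scan.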
In a specific embodiment, the radio device is specifically configured to acquire a remote-control signal and/or an image-transmission signal of the target drone.
In a specific embodiment, the control device is specifically configured to determine pose information detected by the radar networking device at the first time according to the following formula:
$$\hat{X} = \frac{\eta_J X_J + \sum_{i=1}^{m} \eta_i X_i}{\eta_J + \sum_{i=1}^{m} \eta_i}$$

wherein $\hat{X}$ represents the pose information detected by the radar networking equipment at the first moment, $\eta_J$ represents the reciprocal of the measurement error of the reference radar device, $\eta_i$ represents the reciprocal of the measurement error of the i-th target radar device, m is the number of target radar devices, $X_J$ represents the pose information detected by the reference radar device at the first moment, and $X_i$ represents the target pose information corresponding to the i-th target radar device.
The pose information detected by the radar networking equipment at the first moment can be conveniently determined according to the above formula. Because the measurement errors are taken into account in the calculation, the calculation accuracy is higher.
In a specific embodiment, the control device is further configured to determine a measurement error of the radar networking device according to the following formula:
$$\hat{\eta} = \eta_J + \sum_{i=1}^{m} \eta_i$$

wherein $\hat{\eta}$ represents the reciprocal of the measurement error of the radar networking equipment.
This embodiment can conveniently determine the measurement error of the radar networking equipment, so that protection personnel can conveniently obtain an accurate pose of the target unmanned aerial vehicle in light of the determined measurement error.
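The two fusion formulas above — the weighted pose estimate and the fused error reciprocal — can be sketched for scalar pose components as follows. The reciprocal-of-error weighting follows the description above; treating each pose component independently is an assumption of this sketch:

```python
def fuse_poses(eta_ref, x_ref, etas, xs):
    """Weighted fusion of asynchronous radar measurements.
    eta_ref / etas are the reciprocals of the measurement errors of
    the reference and target radars (larger = more trusted);
    x_ref / xs are the corresponding scalar pose components.
    Returns the fused value and the fused error reciprocal."""
    eta_total = eta_ref + sum(etas)
    x_fused = (eta_ref * x_ref
               + sum(e * x for e, x in zip(etas, xs))) / eta_total
    return x_fused, eta_total
```

With equal error reciprocals the fusion reduces to a plain average; a more trusted radar (larger reciprocal) pulls the fused value toward its own measurement.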
For example, take two radars, one Ku-band and one L-band. Their detection time intervals differ greatly, so the conventional multi-point association judgment method is difficult to apply to asynchronous-sensor track data association, which complicates the fusion of detection data. As shown in fig. 3, assume that the sampling periods of the L-band radar and the Ku-band radar are T and t respectively. The L-band radar updates the target state at times T-1 and T; between two consecutive target state updates, the Ku-band radar performs N updates, namely t-N, t-N+1, …, t-2, t-1; between times T and T+N of the L-band radar's target state updates, the Ku-band radar again updates N times, namely t+1, t+2, …, t+N-1, t+N. In the embodiment of the application, time registration can take the Ku-band radar device, which has the shorter sampling interval, as the reference. For the Ku-band sampling data $X_K$ at each moment, whose measurement-error reciprocal is $\eta_K$, the L-band sampling data $X_L$ closest in time is searched for, with measurement-error reciprocal $\eta_L$, thereby realizing time registration of the Ku-band and L-band radars. Further, the measured values of the L-band and Ku-band radars are weight-fitted, and the calculated value is taken as the asynchronous-radar detection-data fusion value. The fused measurement value and its error reciprocal are shown in the following formula:

$$\hat{X} = \frac{\eta_K X_K + \eta_L X_L}{\eta_K + \eta_L},\qquad \hat{\eta} = \eta_K + \eta_L$$
according to the method, the multi-mode sensor detection information is fused and displayed, radar, radio, photoelectric multi-mode and multi-sensor detection data are subjected to correlation fusion after information processing, an unmanned aerial vehicle cluster target comprehensive detection message and a real-time situation including target position, speed, video, category, identity and point track information are formed, and the information is displayed on the display device. The multi-mode multi-sensor detection information fusion can supplement a single-source detection result, and the detection and identification capability of an oncoming unmanned cluster target is improved.
The second embodiment of the present application further provides a distributed multi-sensor multi-modal unmanned cluster target fusion tracking method, which is applied to a distributed multi-sensor multi-modal unmanned cluster target fusion tracking system, where the system includes: the radar device and the photoelectric device are in communication connection with the control device; the method comprises the following steps:
the radar equipment transmits electromagnetic waves, obtains pose information of the target unmanned aerial vehicle by analyzing echo signals, and sends the pose information to the control equipment;
the control device controls the photoelectric device to be aligned with the target unmanned aerial vehicle according to the pose information;
the photoelectric equipment is aligned to the target unmanned aerial vehicle, and the photoelectric equipment acquires image information of the target unmanned aerial vehicle;
and the control equipment determines the tracking information of the target unmanned aerial vehicle according to the pose information and the image information.
In one embodiment, the system may further include a radio.
The unmanned aerial vehicle detection method further comprises the following steps:
the method comprises the steps that a radio device obtains signals sent or received by a target unmanned aerial vehicle, carries out direction finding on a signal source of the obtained signals to obtain direction information of the signal source, and sends the direction information and the sent or received signals to a control device;
and the control equipment determines the tracking information of the target unmanned aerial vehicle according to the pose information, the image information and the azimuth information.
Corresponding to the above unmanned aerial vehicle detection method, an embodiment of the present invention further provides a computer program product containing instructions which, when run on a computer, causes the computer to execute the unmanned aerial vehicle detection method according to any one of the above embodiments.
It should be understood that the above description is only for the purpose of helping those skilled in the art better understand the embodiments of the present application, and is not intended to limit the scope of the embodiments of the present application. Various equivalent modifications or changes, or combinations of any two or more of the above, may be apparent to those skilled in the art in light of the above examples given. Such modifications, variations, or combinations are also within the scope of the embodiments of the present application.
It should also be understood that the foregoing descriptions of the embodiments of the present application focus on highlighting differences between the various embodiments, and that the same or similar elements that are not mentioned may be referred to one another and, for brevity, are not repeated herein.
It should also be understood that the manner, the case, the category, and the division of the embodiments in the present application are only for convenience of description, and should not constitute a particular limitation, and features in various manners, categories, cases, and embodiments may be combined without contradiction.
It should also be understood that, unless otherwise stated or logically conflicting, the terminology and descriptions of the various embodiments herein are consistent with one another, and the technical features of different embodiments may be combined, based on their inherent logical relationships, to form new embodiments.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and all the changes or substitutions should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A distributed multi-sensor multi-modal unmanned cluster target fusion tracking system, the system comprising:
the radar equipment is used for transmitting electromagnetic waves, analyzing the echo signals to obtain pose information of the target unmanned aerial vehicle, and sending the pose information to the control equipment;
the photoelectric equipment is used for aligning the target unmanned aerial vehicle and acquiring image information of the target unmanned aerial vehicle;
the control device is used for controlling the photoelectric device to be aligned with the target unmanned aerial vehicle according to the pose information;
the control equipment is further used for determining tracking information of the target unmanned aerial vehicle according to the pose information and the image information.
2. The system of claim 1, further comprising:
the radio equipment is used for acquiring signals sent or received by the target unmanned aerial vehicle, performing characteristic extraction and direction finding on a signal source of the acquired signals to obtain signal direction information and signal characteristic information of the signal source, and sending the signal direction information and the characteristic information to the control equipment;
the control device is specifically configured to determine tracking information of the target unmanned aerial vehicle according to the pose information, the image information, the signal feature information and the signal orientation information.
3. The system of claim 2, wherein the control device is further configured to:
screening out target signal characteristics matched with the signal characteristic information from a preset characteristic library, and determining the type of the unmanned aerial vehicle corresponding to the target signal characteristics as the type of the target unmanned aerial vehicle;
and the preset feature library stores signal features corresponding to all unmanned aerial vehicle types.
4. The system of claim 2, wherein the tracking information includes track information and video tracking information of the target drone.
5. The system of claim 2, further comprising:
the display device is used for acquiring and displaying the tracking information from the control device;
the control equipment is also used for acquiring the running state information of the radar equipment, the photoelectric equipment and the radio equipment and sending the running state information to the display equipment;
the display device is further configured to display the operating state information.
6. The system of claim 2, wherein the control device, prior to determining tracking information for the target drone from the pose information, the image information, is further configured to:
determining position information of a protection destination;
determining a position relation between the target unmanned aerial vehicle and the protection destination according to the pose information and a relative position relation, wherein the relative position relation is the position relation between the protection destination and the radar equipment.
7. The system according to claim 6, wherein the pose information is coordinate information of the target drone in a spherical coordinate system, the spherical coordinate system having the radar device as an origin of coordinates;
before determining the position relationship between the target unmanned aerial vehicle and the protection destination according to the pose information and the relative position relationship, the control device is further configured to:
converting the coordinate information of the target unmanned aerial vehicle under a spherical coordinate system into the coordinate information of the target unmanned aerial vehicle under a space rectangular coordinate system, wherein the space rectangular coordinate system takes the radar equipment as a coordinate origin;
the control device is specifically configured to determine, according to the coordinate information of the target unmanned aerial vehicle in the spatial rectangular coordinate system and the relative positional relationship, the position information of the target unmanned aerial vehicle in a protection rectangular coordinate system, where the protection rectangular coordinate system is: a spatial rectangular coordinate system taking the position of the protection destination as the origin of coordinates.
8. The system of claim 7, wherein the control device is further configured to: convert the position information of the target unmanned aerial vehicle in the protection rectangular coordinate system into position information in a protection spherical coordinate system, where the protection spherical coordinate system is: a spherical coordinate system taking the position of the protection destination as the origin of coordinates.
9. The system according to claim 1, wherein the radar device comprises a plurality of radar devices, a plurality of the radar devices are asynchronous radar devices, sampling wave bands of the plurality of the radar devices are different, and the plurality of the radar devices form a radar networking device;
the control device is further configured to, before controlling the optoelectronic device to aim at the target drone according to the pose information:
determining a reference radar device having the shortest sampling interval from among the plurality of radar devices;
determining target pose information corresponding to target radar equipment except the reference radar equipment according to pose information detected by the reference radar equipment at a first moment, wherein the target pose information is the pose information detected by the target radar equipment at a moment closest to the first moment;
and determining the pose information detected by the radar networking equipment at the first moment by combining the pose information detected by the reference radar equipment at the first moment and the pose information of each target.
10. The system of claim 2, wherein the radio device is specifically configured to obtain a remote-control signal and/or an image-transmission signal of the target drone.
11. The system according to claim 9, wherein the control device is specifically configured to determine pose information detected by the radar networking device at the first time according to the following formula:
$$\hat{X} = \frac{\eta_J X_J + \sum_{i=1}^{m} \eta_i X_i}{\eta_J + \sum_{i=1}^{m} \eta_i}$$

wherein $\hat{X}$ represents the pose information detected by the radar networking equipment at the first moment, $\eta_J$ represents the reciprocal of the measurement error of the reference radar device, $\eta_i$ represents the reciprocal of the measurement error of the i-th target radar device, m is the number of target radar devices, $X_J$ represents the pose information detected by the reference radar device at the first moment, and $X_i$ represents the target pose information corresponding to the i-th target radar device.
12. The system of claim 11, wherein the control device is further configured to determine a measurement error of the radar networking device according to the following equation:
$$\hat{\eta} = \eta_J + \sum_{i=1}^{m} \eta_i$$

wherein $\hat{\eta}$ represents the reciprocal of the measurement error of the radar networking equipment.
13. A distributed multi-sensor multi-mode unmanned cluster target fusion tracking method is characterized by being applied to a distributed multi-sensor multi-mode unmanned cluster target fusion tracking system, and the system comprises: the radar device and the photoelectric device are in communication connection with the control device; the method comprises the following steps:
the radar equipment transmits electromagnetic waves, obtains pose information of the target unmanned aerial vehicle by analyzing echo signals, and sends the pose information to the control equipment;
the control device controls the photoelectric device to be aligned with the target unmanned aerial vehicle according to the pose information;
the photoelectric equipment is aligned to the target unmanned aerial vehicle, and the photoelectric equipment acquires image information of the target unmanned aerial vehicle;
and the control equipment determines the tracking information of the target unmanned aerial vehicle according to the pose information and the image information.
14. The method of claim 13, wherein the drone detecting system further comprises: a radio device;
the method further comprises the following steps:
the radio equipment acquires signals sent or received by the target unmanned aerial vehicle, performs characteristic extraction and direction finding on a signal source of the acquired signals to obtain signal direction information of the signal source and signal characteristic information of the signal source, and sends the signal characteristic information and the direction information to the control equipment;
the control equipment determines tracking information of the target unmanned aerial vehicle according to the pose information and the image information, and the tracking information comprises the following steps:
and the control equipment determines the tracking information of the target unmanned aerial vehicle according to the pose information, the image information, the signal characteristic information and the signal azimuth information.
CN202210447621.9A 2022-04-26 2022-04-26 Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method Pending CN115032627A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210447621.9A CN115032627A (en) 2022-04-26 2022-04-26 Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210447621.9A CN115032627A (en) 2022-04-26 2022-04-26 Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method

Publications (1)

Publication Number Publication Date
CN115032627A true CN115032627A (en) 2022-09-09

Family

ID=83119702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210447621.9A Pending CN115032627A (en) 2022-04-26 2022-04-26 Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method

Country Status (1)

Country Link
CN (1) CN115032627A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115630514A (en) * 2022-10-29 2023-01-20 中国电子科技集团公司第十五研究所 Unmanned aerial vehicle cluster cooperative task allocation method and device
CN116359836A (en) * 2023-05-31 2023-06-30 成都金支点科技有限公司 Unmanned aerial vehicle target tracking method and system based on super-resolution direction finding
CN116359836B (en) * 2023-05-31 2023-08-15 成都金支点科技有限公司 Unmanned aerial vehicle target tracking method and system based on super-resolution direction finding
CN117784120A (en) * 2024-02-23 2024-03-29 南京新航线无人机科技有限公司 Unmanned aerial vehicle flight state monitoring method and system
CN117784120B (en) * 2024-02-23 2024-05-28 南京新航线无人机科技有限公司 Unmanned aerial vehicle flight state monitoring method and system

Similar Documents

Publication Publication Date Title
CN115032627A (en) Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method
US9784836B2 (en) System for monitoring power lines
CA2767312C (en) Automatic video surveillance system and method
CN109873669A (en) A kind of unmanned plane detection method and unmanned plane detection system
CN112525162A (en) System and method for measuring image distance of power transmission line by unmanned aerial vehicle
CN110081982B (en) Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search
CN104297739B (en) Method for guiding photoelectric tracking equipment in navigation monitoring
CN107783545A (en) Post disaster relief rotor wing unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusions
KR101886932B1 (en) Positioning system for gpr data using geographic information system and road surface image
CN113156417B (en) Anti-unmanned aerial vehicle detection system, method and radar equipment
CN110297234B (en) Networked large-area passive air target intersection determination method and system
CN104535996A (en) Image/laser ranging/ low-altitude frequency-modulated continuous wave radar integrated system
CN112348882A (en) Low-altitude target tracking information fusion method and system based on multi-source detector
CN116520275A (en) Radar photoelectric integrated method and system for detecting and tracking low-speed small target
CN115932834A (en) Anti-unmanned aerial vehicle system target detection method based on multi-source heterogeneous data fusion
RU113046U1 (en) COMPREHENSIVE SYSTEM FOR EARLY DETECTION OF FOREST FIRES, BUILT ON THE PRINCIPLE OF A VARIETY SENSOR PANORAMIC SURVEY OF THE AREA WITH THE FUNCTION OF HIGH-PRECISION DETERMINATION OF THE FIRE OF THE FIRE
EP3385747A1 (en) Method, device and system for mapping position detections to a graphical representation
Yuan et al. MMAUD: A Comprehensive Multi-Modal Anti-UAV Dataset for Modern Miniature Drone Threats
CN112083420B (en) Unmanned aerial vehicle collision avoidance method and device and unmanned aerial vehicle
Stacy et al. Ingara: an integrated airborne imaging radar system
CN114047501B (en) Indoor positioning system based on millimeter wave radar
CN116125488A (en) Target tracking method, signal fusion method, device, terminal and storage medium
CN113917875A (en) Open universal intelligent controller, method and storage medium for autonomous unmanned system
RU2770827C1 (en) Multi-position radar method
JP4292796B2 (en) Monitoring device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination