CN114047505A - Target detection display control method and device supporting photoelectric cross positioning in radar failure scene - Google Patents
- Publication number
- CN114047505A (application CN202111232889.2A)
- Authority
- CN
- China
- Prior art keywords
- radar
- photoelectric
- module
- data
- source
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G01S13/867 — Combination of radar systems with cameras
- G01S13/06 — Systems determining position data of a target
- G01S13/50 — Systems of measurement based on relative movement of target
- G01S7/415 — Identification of targets based on measurements of movement associated with the target
- G06F18/241 — Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/25 — Fusion techniques
- G06N3/045 — Combinations of networks
Abstract
The invention provides a target detection display control method and device that support photoelectric cross positioning in a radar failure scene. The method comprises protocol conversion, target data processing, equipment health detection, multi-source radar fusion, multi-source photoelectric cross positioning, target classification and identification, and display control management. Built on a streaming data processing framework, the invention realizes fused display of multi-source signals while radar equipment and photoelectric equipment operate simultaneously; by exploiting the advantages of photoelectric detection and combining cross positioning across multiple photoelectric devices, it achieves target classification, identification and tracking together with more accurate positioning of aerial targets, and when the radar fails it can continue operations using the target position data generated by photoelectric cross positioning.
Description
Technical Field
The invention relates to the technical field of target detection, in particular to a target detection display control method and device supporting photoelectric cross positioning in a radar failure scene.
Background
Unmanned aerial vehicle (UAV) targets are characterized by low flying height, slow flight speed, small radar cross section (RCS), strong maneuverability and strong anti-interference capability. Traditional detection means are therefore severely limited: detection is difficult, and stable detection, tracking and identification of such targets are hard to achieve.
At present, detection of low-altitude UAVs at home and abroad relies mainly on radar, assisted by photoelectric detection, spectrum detection and similar means. Radar can detect low-altitude UAV targets around the clock with excellent detection range and target tracking performance, but it cannot effectively identify targets — for instance, it cannot distinguish bird targets from UAV targets. Moreover, in certain special scenes or terrain conditions radar cannot be used, or the radar equipment is prone to failure.
What is needed at such times is a multi-source photoelectric detection means: by relying on the advantages of photoelectric detection and combining cross positioning across multiple photoelectric detection devices, higher-level functions such as video tracking and target classification and identification can be realized while accurately positioning aerial targets.
Photoelectric detection equipment can output not only video signals, optical information and visual characteristics of a detected object, but also some relative position information of an aerial target, such as pitch and azimuth, by means of an integrated turntable, direction controller, field-of-view manager and the like. However, existing UAV target detection devices are either equipped with only a single photoelectric detection device, which cannot form a networking advantage and serves merely as supplementary input to radar detection; or, although multiple photoelectric detection devices are networked, the relative position information they provide is not fully exploited, the networking being used mostly for gap-filling video surveillance.
In other words, in a radar failure scene an existing UAV target detection device that relies purely on photoelectric detection equipment cannot provide reasonably accurate positioning information for a UAV target; it can only provide some visual characteristics and video information of the UAV, which contribute little to subsequent UAV countermeasure and management operations. The prior art therefore cannot satisfy the service characteristics and requirements of low-altitude UAV detection and countermeasures.
Disclosure of Invention
The invention aims to provide a target detection display control method and device supporting photoelectric cross positioning in a radar failure scene, so as to solve the problem that the prior art cannot satisfy the service characteristics and requirements of low-altitude UAV detection and countermeasures.
The invention provides a target detection and display control method supporting photoelectric cross positioning in a radar failure scene, which comprises the following steps:
S1, the protocol conversion module receives the photoelectric message and the photoelectric video stream from the photoelectric equipment and the radar message from the radar equipment, forwards the received photoelectric video stream directly to the target classification and identification module, and converts the received photoelectric message and radar message into a unified data format before sending them to the target data processing module; wherein:
the radar message includes: heartbeat data and radar target data of the radar device;
the photoelectric message comprises: heartbeat data and photoelectric detection data of the photoelectric equipment;
s2, the target data processing module processes the photoelectric message and the radar message after the protocol conversion:
aggregating radar target data into a local radar track and outputting the local radar track to a multi-source radar fusion module;
the photoelectric detection data are directly forwarded to a multi-source photoelectric cross positioning module;
directly forwarding heartbeat data of the radar equipment and heartbeat data of the photoelectric equipment to an equipment health detection module;
s3, the equipment health detection module detects the health states of the radar equipment and the photoelectric equipment in real time according to heartbeat data of the radar equipment and the heartbeat data of the photoelectric equipment, and outputs a data source failure instruction, a data source removing instruction or a data source recovering instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module according to the detected health states, so that the data from the corresponding radar equipment and the photoelectric equipment are subjected to failure, removal or recovery operation when the data are processed in the multi-source radar fusion module and the multi-source photoelectric cross positioning module;
s4, the multi-source radar fusion module adopts SF distributed fusion algorithm to perform associated fusion on a plurality of local radar tracks from the target data processing module to form a system radar track of the current fusion period;
S5, the multi-source photoelectric cross positioning module computes target positioning data from the photoelectric detection data received from the target data processing module through a cross positioning algorithm based on spatial solid geometry;
S6, the target classification and identification module performs target classification and identification on the photoelectric video stream by adopting a deep learning convolutional neural network with YoloV3 as the framework, obtaining a target classification and identification result;
and S7, the display control management module displays the system radar track, the target positioning data and the target classification and identification results, and plays the photoelectric video stream.
Optionally, the unified data format in step S1 adopts a JSON format.
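As a concrete illustration of the unified format, the sketch below converts a parsed radar target message into unified JSON. The field names are assumptions for illustration; the patent states only that radar target data carries longitude, latitude, pitch, azimuth and speed, and that heartbeat data carries a device ID:

```python
import json

def radar_to_unified(raw):
    """Convert a parsed radar target message into the unified JSON format.

    The key names below are illustrative assumptions; the patent does not
    fix the message layout, only the quantities it carries.
    """
    unified = {
        "source_type": "radar",
        "device_id": raw["device_id"],
        "timestamp": raw["timestamp"],
        "target": {
            "longitude": raw["lon"],
            "latitude": raw["lat"],
            "pitch": raw["pitch"],
            "azimuth": raw["azimuth"],
            "speed": raw["speed"],
        },
    }
    return json.dumps(unified)
```

A photoelectric message would be converted analogously, with a `target` block carrying only pitch and azimuth.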
Further, the method for aggregating the radar target data into the local radar track and outputting the local radar track to the multi-source radar fusion module in the step S2 includes:
calculating the Mahalanobis distance between radar target data by using a tracking algorithm;
comparing the calculated Mahalanobis distance with a preset Mahalanobis distance threshold;
and aggregating the radar target data whose Mahalanobis distances are smaller than the preset threshold into a local radar track.
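The distance-gating step above can be sketched as follows. The covariance matrix and threshold are placeholders, since the patent leaves the tracking algorithm to the prior art and the threshold to be preset as required:

```python
import numpy as np

def mahalanobis(x, y, cov):
    """Mahalanobis distance between two measurement vectors under covariance cov."""
    d = x - y
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

def gate_measurements(track_state, measurements, cov, threshold):
    """Keep the radar measurements whose Mahalanobis distance to the track state
    is below the preset threshold; these are aggregated into the local track."""
    return [m for m in measurements if mahalanobis(track_state, m, cov) < threshold]
```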
Further, step S3 includes the following sub-steps:
s31, the equipment health detection module starts a plurality of state detection threads to detect the health states of the radar equipment and the photoelectric equipment in real time according to the heartbeat data of the radar equipment and the heartbeat data of the photoelectric equipment;
s32, each state detection thread extracts the device ID number currently still operating healthily from the heartbeat data, and caches the device ID number, the number of times of continuously receiving heartbeat data, and the number of times of continuously not receiving heartbeat data for real-time detection of the health state of the device:
(1) if the number of consecutive periods in which heartbeat data from a device is not received exceeds the failure threshold, the health state of the device is set to failed, and an instruction sending thread is woken up to send a data source failure instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the device is treated as failed during processing, i.e. it no longer participates in the processing of the multi-source radar fusion module and the multi-source photoelectric cross positioning module;
(2) if the number of consecutive periods in which heartbeat data from a device is not received exceeds the rejection threshold, the health state of the device is set to rejected, its related data is deleted from the cache, and an instruction sending thread is woken up to send a data source rejection instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the device is no longer read during their processing;
(3) if, while a device is in the failed or rejected state, the number of consecutive periods in which its heartbeat data is received exceeds the recovery threshold, the device's data is cached again and its health state is set to healthy, and an instruction sending thread is woken up to send a data source recovery instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the device is restored during processing, i.e. it once again participates in the processing of the multi-source radar fusion module and the multi-source photoelectric cross positioning module.
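The heartbeat bookkeeping of steps S31–S32 amounts to a small per-device state machine. The sketch below is one possible realization; the threshold values are illustrative assumptions (the patent leaves them unspecified), and the returned strings stand in for the data source failure/rejection/recovery instructions:

```python
from dataclasses import dataclass
from typing import Optional

# Threshold values are illustrative assumptions; the patent does not fix them.
FAIL_THRESHOLD = 3      # consecutive missed heartbeats before a source is marked failed
REJECT_THRESHOLD = 10   # consecutive missed heartbeats before a source is rejected
RECOVER_THRESHOLD = 3   # consecutive received heartbeats before a source is restored

@dataclass
class DeviceHealth:
    state: str = "healthy"   # one of: healthy, failed, rejected
    missed: int = 0          # consecutive detection periods without heartbeat data
    received: int = 0        # consecutive detection periods with heartbeat data

    def on_heartbeat_period(self, heartbeat_seen: bool) -> Optional[str]:
        """Advance one detection period; return a data-source instruction on a state change."""
        if heartbeat_seen:
            self.received += 1
            self.missed = 0
            if self.state in ("failed", "rejected") and self.received > RECOVER_THRESHOLD:
                self.state = "healthy"
                return "data_source_recover"   # wakes the instruction sending thread
        else:
            self.missed += 1
            self.received = 0
            if self.state == "healthy" and self.missed > FAIL_THRESHOLD:
                self.state = "failed"
                return "data_source_fail"
            if self.state == "failed" and self.missed > REJECT_THRESHOLD:
                self.state = "rejected"
                return "data_source_reject"    # device data is also dropped from the cache
        return None
```

In the patent's design each state detection thread would keep one such record per cached device ID and forward the returned instructions to the fusion and cross positioning modules.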
Further, assuming that there are N local radar tracks from the target data processing module in step S4, the SF distributed fusion algorithm computes the system radar track as:

$P(k|k) = \Big[\sum_{i=1}^{N} P_i(k|k)^{-1}\Big]^{-1}$

$\hat{x}(k|k) = P(k|k)\sum_{i=1}^{N} P_i(k|k)^{-1}\,\hat{x}_i(k|k)$

wherein $\hat{x}_i(k|k)$ represents the optimal estimate of the i-th local radar track at time k, $P_i(k|k)$ represents the mean square error of the i-th local radar track at time k, $\hat{x}(k|k)$ represents the optimal estimate of the system radar track at time k, and $P(k|k)$ represents the mean square error of the system radar track at time k.
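A minimal sketch of one fusion period, assuming SF denotes the standard simple (convex-combination) fusion that weights each local track by the inverse of its mean square error:

```python
import numpy as np

def sf_fuse(estimates, covariances):
    """Simple-fusion (SF) combination of N local track estimates for one period.

    estimates   : list of local state vectors x_i(k|k)
    covariances : list of local covariance matrices P_i(k|k)
    Returns the fused system-track estimate x(k|k) and covariance P(k|k).
    """
    inv_sum = sum(np.linalg.inv(P) for P in covariances)
    P_fused = np.linalg.inv(inv_sum)
    x_fused = P_fused @ sum(np.linalg.inv(P) @ x for x, P in zip(estimates, covariances))
    return x_fused, P_fused
```

With two equally uncertain one-dimensional tracks, the fused estimate is their midpoint and the fused variance is halved, as expected of a convex combination.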
Further, step S5 includes the following sub-steps:
S51, acquiring the longitude and latitude of the photoelectric equipment through the positioning modules carried by the two photoelectric devices; assuming that all calculations are carried out in an xyz coordinate system, let the position of photoelectric device 1 be $(x_1, y_1, z_1)$ and the position of photoelectric device 2 be $(x_2, y_2, z_2)$, and assume that the target azimuth angles detected by photoelectric device 1 and photoelectric device 2 at time k are $\alpha_1$ and $\alpha_2$ respectively and the target pitch angles are $\beta_1$ and $\beta_2$ respectively; the coordinates $(x(k), y(k), z(k))$ of the target position at time k are then calculated by intersecting the two lines of sight:

$y(k) = \dfrac{x_2 - x_1 + y_1\tan\alpha_1 - y_2\tan\alpha_2}{\tan\alpha_1 - \tan\alpha_2}$

$x(k) = x_1 + \big(y(k) - y_1\big)\tan\alpha_1$

$z(k) = z_1 + \sqrt{\big(x(k)-x_1\big)^2 + \big(y(k)-y_1\big)^2}\,\tan\beta_1$
S52, in the multi-source photoelectric cross positioning module, selecting the photoelectric detection data provided by two photoelectric devices whose lines of sight can form an intersection point, and starting a calculation thread that performs the calculation of step S51 once per fixed interval; if the photoelectric devices remain active and their pointing directions continuously maintain an intersection point during this process, the subsequent position coordinates of the target are continuously generated, thereby producing the target positioning data.
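The intersection computation of step S51 can be sketched as below, under the assumption that azimuth is measured from the +y axis of the local Cartesian frame (the patent does not fix the angle convention):

```python
import math

def cross_locate(p1, p2, az1, az2, el1):
    """Triangulate a target from two electro-optical stations.

    p1, p2   : (x, y, z) station positions in a local Cartesian frame
    az1, az2 : target azimuth from each station, radians measured from the +y axis
    el1      : target pitch (elevation) from station 1, radians
    Returns the target position (x, y, z); raises if the bearings are parallel.
    """
    x1, y1, z1 = p1
    x2, y2, z2 = p2
    t1, t2 = math.tan(az1), math.tan(az2)
    if math.isclose(t1, t2):
        raise ValueError("lines of sight are parallel; no intersection point")
    # Horizontal intersection of the two bearing lines x - xi = (y - yi) * tan(az_i)
    y = (x2 - x1 + y1 * t1 - y2 * t2) / (t1 - t2)
    x = x1 + (y - y1) * t1
    # Height from station 1's pitch angle and horizontal range
    z = z1 + math.hypot(x - x1, y - y1) * math.tan(el1)
    return (x, y, z)
```

In the module this would run once per fixed interval on the latest pitch/azimuth pair from each device, emitting the stream of target position points.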
Further, step S6 includes the following sub-steps:
S61, sampling and segmenting the photoelectric video stream by adopting an image extraction technique to obtain a plurality of video frame images;
s62, performing image quality improvement processing on the video frame image by adopting an interframe difference processing technology;
s63, performing image processing on the video frame image processed by the inter-frame difference processing technology by adopting a deep learning convolutional neural network with YoloV3 as a framework and extracting image features;
and S64, inputting the extracted image features into the target classification recognition model for target classification recognition to obtain a target classification recognition result.
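Steps S61–S62 can be sketched as follows; the sampling step and difference threshold are illustrative assumptions, and the inter-frame difference here is the plain per-pixel form (the patent does not detail its quality-improvement variant):

```python
import numpy as np

def sample_frames(stream, step=5):
    """S61: keep every step-th decoded frame of the video stream (step is assumed)."""
    return [f for i, f in enumerate(stream) if i % step == 0]

def frame_difference(prev_frame, frame, threshold=25):
    """S62: inter-frame difference marking pixels that changed between consecutive frames.

    prev_frame, frame : grayscale frames as uint8 numpy arrays of equal shape
    Returns a binary mask (uint8, 0 or 255) of moving regions.
    """
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return np.where(diff > threshold, 255, 0).astype(np.uint8)
```

The masked frames would then be fed to the YoloV3-based network of steps S63–S64 for feature extraction and classification.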
Further, in step S7, the display control management module displays the system radar track, the target location data and the target classification and identification result in a human-computer interaction interface manner, and plays the photoelectric video stream.
The invention also provides a device for realizing the target detection display control method supporting photoelectric cross positioning in the radar failure scene, which comprises a display control management module, a protocol conversion module, a target data processing module, an equipment health detection module, a multi-source radar fusion module, a multi-source photoelectric cross positioning module and a target classification identification module, wherein the protocol conversion module, the target data processing module, the equipment health detection module, the multi-source radar fusion module, the multi-source photoelectric cross positioning module and the target classification identification module are controlled by the display control management module;
the input end of the protocol conversion module is used for connecting radar equipment and photoelectric equipment; the protocol conversion module is connected with the input end of the target classification identification module on one hand and connected with the target data processing module on the other hand;
the target data processing module is respectively connected with the equipment health detection module, the multi-source radar fusion module and the multi-source photoelectric cross positioning module, and the equipment health detection module, the multi-source radar fusion module, the multi-source photoelectric cross positioning module and the target classification identification module are all connected with the display control management module.
In summary, due to the adoption of the technical scheme, the invention has the beneficial effects that:
1. Based on a streaming data processing framework, the invention realizes fused display of multi-source signals while radar equipment and photoelectric equipment operate simultaneously; by exploiting the advantages of photoelectric detection and combining cross positioning across multiple photoelectric devices, it achieves target classification, identification and tracking together with more accurate positioning of aerial targets, and when the radar fails it can still conduct operations using the target position data generated by photoelectric cross positioning.
2. The invention can automatically analyze the relative position of targets detected by photoelectric equipment, perform cross positioning, initiate tracks and manage their display;
3. The invention supports fusion of the radar track and the photoelectric cross-positioning track, as well as independent display of, and switching between, the two;
4. The invention supports low-altitude target monitoring functions such as display of the system radar track, target positioning data and target classification and identification results, and playback of the photoelectric video stream;
5. By means of unified protocol conversion, the invention can accommodate the introduction of different monitoring data sources on demand with few changes, ensuring consistent data transmission across the messages of different monitoring sources.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention, and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a schematic architecture diagram of a target detection display control method and device supporting photoelectric cross positioning in a radar failure scene according to an embodiment of the present invention.
FIG. 2 is a flow chart of the device health detection module processing data according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a multi-source photoelectric cross location module processing data according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
The embodiment provides a target detection display control method supporting photoelectric cross positioning in a radar failure scene, which mainly adopts a technical framework of stream processing to complete the functions of receiving, processing, fusing, detecting, cross positioning, displaying, controlling, managing and the like of radar messages, photoelectric messages and photoelectric video streams.
As shown in fig. 1, the display control method includes the following steps:
S1, the protocol conversion module receives the photoelectric message and the photoelectric video stream from the photoelectric equipment and the radar message from the radar equipment, forwards the received photoelectric video stream directly to the target classification and identification module, and converts the received photoelectric message and radar message into a unified data format before sending them to the target data processing module; in this embodiment, the JSON format is adopted as the unified data format. Wherein:
the radar message includes: heartbeat data and radar target data of the radar device; the radar target data mainly comprises longitude, latitude, pitch, azimuth and speed;
the photoelectric message comprises: heartbeat data and photoelectric detection data of the photoelectric equipment; the photoelectric detection data mainly comprises pitching and orientation;
s2, the target data processing module processes the photoelectric message and the radar message after the protocol conversion:
(1) aggregating radar target data into a local radar track and outputting the local radar track to a multi-source radar fusion module; specifically, the method comprises the following steps:
calculating the Mahalanobis distance between each radar target data based on the longitude, the latitude, the pitch, the azimuth and the speed contained in the radar target data by using a tracking algorithm; wherein, the tracking algorithm is the prior art and is not described herein again;
comparing the calculated Mahalanobis distance with a preset Mahalanobis distance threshold; the Mahalanobis distance threshold is preset according to requirements;
aggregating the radar target data whose Mahalanobis distances are smaller than the preset threshold into local radar tracks;
(2) The photoelectric detection data are directly forwarded to a multi-source photoelectric cross positioning module;
(3) directly forwarding heartbeat data of the radar equipment and heartbeat data of the photoelectric equipment to an equipment health detection module;
s3, the equipment health detection module detects the health states of the radar equipment and the photoelectric equipment in real time according to heartbeat data of the radar equipment and the heartbeat data of the photoelectric equipment, and outputs a data source failure instruction, a data source removing instruction or a data source recovering instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module according to the detected health states, so that the data from the corresponding radar equipment and the photoelectric equipment are subjected to failure, removal or recovery operation when the data are processed in the multi-source radar fusion module and the multi-source photoelectric cross positioning module; as shown in fig. 2, specifically:
s31, the equipment health detection module starts a plurality of state detection threads to detect the health states of the radar equipment and the photoelectric equipment in real time according to the heartbeat data of the radar equipment and the heartbeat data of the photoelectric equipment;
s32, each state detection thread extracts the device ID number currently still operating healthily from the heartbeat data, and caches the device ID number, the number of times of continuously receiving heartbeat data, and the number of times of continuously not receiving heartbeat data for real-time detection of the health state of the device:
(1) if the frequency of continuously not receiving heartbeat data of a certain device is greater than a failure threshold value, setting the health state of the device to be failed, simultaneously waking up an instruction sending thread to send a data source failure instruction to a multi-source radar fusion module and a multi-source photoelectric cross positioning module, and performing failure operation on the data from the device when the data are processed in the multi-source radar fusion module and the multi-source photoelectric cross positioning module, namely the data from the device do not participate in the processing of the multi-source radar fusion module and the multi-source photoelectric cross positioning module;
(2) if a device's count of consecutively missed heartbeats exceeds the rejection threshold, its health state is set to rejected and its related data are deleted from the cache; the instruction sending thread is woken to send a data source rejection instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the device are no longer read during the processing of either module;
(3) if, while in the failed or rejected state, a device's count of consecutively received heartbeats exceeds the recovery threshold, its data are cached again and its health state is set to healthy; the instruction sending thread is woken to send a data source recovery instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the device are restored during processing, i.e. they participate in the processing of both modules again.
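The three-branch logic above can be sketched as a small per-device state machine. The threshold values, class names, and status labels below are illustrative assumptions, not values given in this embodiment:

```python
# Hedged sketch of the heartbeat-based health detection of step S3.
from dataclasses import dataclass

FAIL_THRESHOLD = 3      # consecutive missed heartbeats before "failed" (assumed)
REJECT_THRESHOLD = 10   # consecutive missed heartbeats before "rejected" (assumed)
RECOVER_THRESHOLD = 3   # consecutive received heartbeats before recovery (assumed)

@dataclass
class DeviceState:
    received_streak: int = 0
    missed_streak: int = 0
    status: str = "healthy"   # "healthy" | "failed" | "rejected"

class HealthDetector:
    def __init__(self):
        self.devices = {}

    def on_tick(self, device_id, heartbeat_seen):
        """Process one detection interval; return an instruction tuple or None."""
        dev = self.devices.setdefault(device_id, DeviceState())
        if heartbeat_seen:
            dev.received_streak += 1
            dev.missed_streak = 0
            # (3) recovery: enough consecutive heartbeats while in a bad state
            if dev.status != "healthy" and dev.received_streak >= RECOVER_THRESHOLD:
                dev.status = "healthy"
                return ("recover", device_id)
        else:
            dev.missed_streak += 1
            dev.received_streak = 0
            # (2) rejection: long outage, cached data would be dropped entirely
            if dev.status != "rejected" and dev.missed_streak >= REJECT_THRESHOLD:
                dev.status = "rejected"
                return ("reject", device_id)
            # (1) failure: short outage, device excluded but cache kept
            if dev.status == "healthy" and dev.missed_streak >= FAIL_THRESHOLD:
                dev.status = "failed"
                return ("fail", device_id)
        return None
```

In a fuller implementation, the returned instruction tuples would be handed to the instruction sending thread rather than returned to the caller.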
S4, the multi-source radar fusion module uses the SF distributed fusion algorithm to associate and fuse the local radar tracks from the target data processing module into the system radar track of the current fusion period; assuming there are N local radar tracks from the target data processing module, the calculation formula of the SF distributed fusion algorithm is:

$$P(k|k) = \Big[\sum_{i=1}^{N} P_i^{-1}(k|k)\Big]^{-1}, \qquad \hat{x}(k|k) = P(k|k)\sum_{i=1}^{N} P_i^{-1}(k|k)\,\hat{x}_i(k|k)$$

wherein $\hat{x}_i(k|k)$ represents the optimal estimate of the i-th local radar track at time k, $P_i(k|k)$ represents the mean square error of the i-th local radar track at time k, $\hat{x}(k|k)$ represents the optimal estimate of the system radar track at time k, and $P(k|k)$ represents the mean square error of the system radar track at time k.
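Assuming the SF fusion step takes the usual convex-combination form consistent with the symbols defined above (cross-covariances between local tracks ignored), a minimal sketch is:

```python
# Hedged sketch of the SF distributed fusion step of S4, using numpy only.
import numpy as np

def sf_fuse(estimates, covariances):
    """Fuse N local track estimates x_i with mean-square-error matrices P_i.

    Returns (x_fused, P_fused) where
        P = (sum_i P_i^-1)^-1
        x = P @ sum_i P_i^-1 @ x_i
    """
    info = [np.linalg.inv(P) for P in covariances]   # information matrices
    P_fused = np.linalg.inv(sum(info))               # fused mean square error
    x_fused = P_fused @ sum(I_i @ x_i for I_i, x_i in zip(info, estimates))
    return x_fused, P_fused
```

For two equally reliable one-dimensional tracks at 1.0 and 3.0, this yields the midpoint 2.0 with halved variance, which matches the intuition that each healthy data source tightens the fused estimate.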
S5, the multi-source photoelectric cross positioning module computes target positioning data from the photoelectric detection data supplied by the target data processing module via a cross-location algorithm based on three-dimensional analytic geometry; as shown in fig. 3, specifically:
S51, the longitude and latitude of each photoelectric device are obtained from the positioning modules carried by the two photoelectric devices, and all calculations are performed in an xyz coordinate system; let the position of photoelectric device 1 be (x1, y1, z1) and the position of photoelectric device 2 be (x2, y2, z2), and assume that the target direction angles detected by photoelectric devices 1 and 2 at time k are α1 and α2 and the target pitch angles are β1 and β2, respectively; the coordinates (x(k), y(k), z(k)) of the target position at time k are then calculated as:

$$x(k) = \frac{x_1\tan\alpha_1 - x_2\tan\alpha_2 + y_2 - y_1}{\tan\alpha_1 - \tan\alpha_2}, \quad y(k) = y_1 + (x(k)-x_1)\tan\alpha_1, \quad z(k) = z_1 + \sqrt{(x(k)-x_1)^2 + (y(k)-y_1)^2}\,\tan\beta_1$$
S52, the multi-source photoelectric cross positioning module selects the photoelectric detection data provided by two photoelectric devices whose lines of sight can form an intersection point, and starts a calculation thread that performs the calculation of step S51 once per fixed interval; as long as both photoelectric devices remain active and their pointing directions continue to form an intersection point, subsequent position coordinates of the target are generated continuously, yielding the target positioning data.
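A hedged sketch of the per-interval calculation in step S51 follows, assuming the direction angle is measured in the horizontal x-y plane from the +x axis and the pitch angle from the horizontal; the embodiment's exact angle convention is not stated, so treat both as assumptions:

```python
# Illustrative two-sensor cross-location: intersect the horizontal bearings,
# then lift the intersection to 3-D using sensor 1's pitch angle.
import math

def cross_locate(p1, alpha1, beta1, p2, alpha2):
    """p1, p2: sensor positions (x, y, z); angles in radians."""
    x1, y1, z1 = p1
    x2, y2, _ = p2
    t1, t2 = math.tan(alpha1), math.tan(alpha2)
    if abs(t1 - t2) < 1e-12:
        return None  # parallel bearings: no intersection point forms
    # Horizontal intersection of the two lines of sight
    x = (x1 * t1 - x2 * t2 + y2 - y1) / (t1 - t2)
    y = y1 + (x - x1) * t1
    # Height from sensor 1's pitch angle over the ground range
    ground = math.hypot(x - x1, y - y1)
    z = z1 + ground * math.tan(beta1)
    return (x, y, z)
```

A refinement consistent with step S51 would average the heights implied by both sensors' pitch angles instead of using only β1.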
S6, the target classification and identification module performs target classification and identification on the photoelectric video stream using a deep learning convolutional neural network based on the YoloV3 framework to obtain a target classification and identification result; specifically:
S61, the photoelectric video stream is sampled and segmented using an image extraction technique to obtain a plurality of video frame images;
S62, image quality enhancement is applied to the video frame images using an inter-frame difference processing technique;
S63, image processing and image feature extraction are performed on the inter-frame-differenced video frame images using a deep learning convolutional neural network based on the YoloV3 framework; this network is prior art and is not described again here;
S64, the extracted image features are input into the target classification and identification model to obtain the target classification and identification result. The target classification and identification model is a classifier trained in advance on training samples; any common classifier for target classification and identification may be selected as needed.
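Steps S61 and S62 can be illustrated with a minimal frame-sampling and inter-frame-difference sketch. The sampling stride and difference threshold are illustrative assumptions, and the YoloV3-based detector of step S63 is out of scope here:

```python
# Hedged sketch of S61 (frame extraction) and S62 (inter-frame difference),
# operating on grayscale frames represented as numpy uint8 arrays.
import numpy as np

def sample_frames(stream, every_n=5):
    """S61: keep every n-th frame of an iterable of HxW grayscale arrays."""
    return [frame for i, frame in enumerate(stream) if i % every_n == 0]

def frame_difference(prev, curr, threshold=25):
    """S62: binary motion mask marking pixels whose change exceeds threshold."""
    # Promote to int16 so the subtraction cannot wrap around uint8 range
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > threshold).astype(np.uint8)
```

The motion mask suppresses static background before detection, which is one plausible reading of the "image quality improvement" the embodiment attributes to inter-frame differencing.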
S7, the display control management module displays the system radar track, the target positioning data, and the target classification and identification results, and plays the photoelectric video stream. In this embodiment, the display control management module builds on the geographic data, spatial-information calculation, and display capabilities provided by a GIS engine, and applies technologies from computer graphics, UI interaction, computer vision, network communication, and related fields. It reads the system radar track, target positioning data, target classification and identification results, and photoelectric video stream; after combination, packaging, and rendering, it presents them through a human-computer interaction interface and plays the photoelectric video stream. While the interface is running, it also monitors the user's interface and instruction operations and feeds them back to each functional module via events, network streams, and the like.
As shown in fig. 1, in order to implement the above-mentioned target detection display control method supporting photoelectric cross location in a radar failure scene, the display control device provided in this embodiment includes a display control management module, and a protocol conversion module, a target data processing module, an equipment health detection module, a multi-source radar fusion module, a multi-source photoelectric cross location module, and a target classification identification module that are controlled by the display control management module;
the input end of the protocol conversion module is used for connecting radar equipment and photoelectric equipment; the protocol conversion module is connected with the input end of the target classification identification module on one hand and connected with the target data processing module on the other hand;
the target data processing module is respectively connected with the equipment health detection module, the multi-source radar fusion module and the multi-source photoelectric cross positioning module, and the equipment health detection module, the multi-source radar fusion module, the multi-source photoelectric cross positioning module and the target classification identification module are all connected with the display control management module.
The protocol conversion module, target data processing module, equipment health detection module, multi-source radar fusion module, multi-source photoelectric cross positioning module, target classification identification module, and display control management module of the display control device operate as described in the display control method above and are not described again.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (9)
1. A target detection display control method supporting photoelectric cross positioning in a radar failure scene is characterized by comprising the following steps:
S1, the protocol conversion module receives the photoelectric message and the photoelectric video stream from the photoelectric equipment and the radar message from the radar equipment, forwards the received photoelectric video stream directly to the target classification and identification module, and converts the received photoelectric message and radar message into a uniform data format that it sends to the target data processing module; wherein:
the radar message includes: heartbeat data and radar target data of the radar device;
the photoelectric message comprises: heartbeat data and photoelectric detection data of the photoelectric equipment;
s2, the target data processing module processes the photoelectric message and the radar message after the protocol conversion:
aggregating radar target data into a local radar track and outputting the local radar track to a multi-source radar fusion module;
the photoelectric detection data are directly forwarded to a multi-source photoelectric cross positioning module;
directly forwarding heartbeat data of the radar equipment and heartbeat data of the photoelectric equipment to an equipment health detection module;
S3, the equipment health detection module detects the health states of the radar equipment and the photoelectric equipment in real time from their heartbeat data and, according to the detected states, outputs a data source failure instruction, data source rejection instruction, or data source recovery instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the corresponding radar equipment and photoelectric equipment are failed, rejected, or recovered when processed in those two modules;
s4, the multi-source radar fusion module adopts SF distributed fusion algorithm to perform associated fusion on a plurality of local radar tracks from the target data processing module to form a system radar track of the current fusion period;
s5, the multi-source photoelectric cross positioning module gives out target positioning data to photoelectric detection data from the target data processing module through a cross positioning algorithm based on space stereo calculation geometry;
S6, the target classification and identification module performs target classification and identification on the photoelectric video stream using a deep learning convolutional neural network based on the YoloV3 framework to obtain a target classification and identification result;
and S7, the display control management module displays the system radar track, the target positioning data and the target classification and identification results, and plays the photoelectric video stream.
2. The method for detecting and displaying objects supporting photoelectric cross-positioning in radar failure scenarios as claimed in claim 1, wherein the unified data format in step S1 is JSON format.
3. The method for detecting and displaying and controlling the target supporting the photoelectric cross positioning in the radar failure scene according to claim 1, wherein the method for aggregating the radar target data into the local radar track and outputting the local radar track to the multi-source radar fusion module in the step S2 comprises:
calculating the Mahalanobis distance between radar target data points using a tracking algorithm;
comparing the calculated Mahalanobis distance with a preset Mahalanobis distance threshold;
and aggregating the radar target data whose Mahalanobis distance is smaller than the preset threshold into a local radar track.
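The gating step described in claim 3 can be sketched as follows; the covariance matrix used for the Mahalanobis distance is assumed to be supplied by the tracking algorithm, and the function and variable names are illustrative:

```python
# Hedged sketch of Mahalanobis-distance gating for plot-to-track association.
import numpy as np

def mahalanobis(z, x, S):
    """d = sqrt((z - x)^T S^-1 (z - x)) for measurement z, track state x."""
    d = z - x
    return float(np.sqrt(d @ np.linalg.inv(S) @ d))

def gate(plots, track, S, threshold):
    """Return the radar plots inside the track's Mahalanobis gate; these are
    the data points that would be aggregated into the local radar track."""
    return [z for z in plots if mahalanobis(z, track, S) < threshold]
```

With an identity covariance the Mahalanobis distance reduces to the Euclidean distance, so the gate behaves as a simple radius check in that degenerate case.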
4. The method for detecting and controlling the target supporting the photoelectric cross positioning in the radar failure scenario according to claim 1, wherein step S3 includes the following sub-steps:
S31, the equipment health detection module starts a plurality of state detection threads to detect the health states of the radar equipment and the photoelectric equipment in real time from the heartbeat data of the radar equipment and the heartbeat data of the photoelectric equipment;
S32, each state detection thread extracts from the heartbeat data the ID of each device still operating, and caches, per device, the device ID, the count of consecutively received heartbeats, and the count of consecutively missed heartbeats for real-time detection of the device's health state:
(1) if a device's count of consecutively missed heartbeats exceeds the failure threshold, its health state is set to failed, and the instruction sending thread is woken to send a data source failure instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module; data from the device are then failed during processing, i.e. they no longer participate in the processing of either module;
(2) if a device's count of consecutively missed heartbeats exceeds the rejection threshold, its health state is set to rejected and its related data are deleted from the cache; the instruction sending thread is woken to send a data source rejection instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the device are no longer read during the processing of either module;
(3) if, while in the failed or rejected state, a device's count of consecutively received heartbeats exceeds the recovery threshold, its data are cached again and its health state is set to healthy; the instruction sending thread is woken to send a data source recovery instruction to the multi-source radar fusion module and the multi-source photoelectric cross positioning module, so that data from the device are restored during processing, i.e. they participate in the processing of both modules again.
5. The method for detecting and controlling objects supporting photoelectric cross-positioning in radar failure scenarios as claimed in claim 1, wherein in step S4, assuming that there are N local radar tracks from the target data processing module, the SF distributed fusion algorithm has the following calculation formula:

$$P(k|k) = \Big[\sum_{i=1}^{N} P_i^{-1}(k|k)\Big]^{-1}, \qquad \hat{x}(k|k) = P(k|k)\sum_{i=1}^{N} P_i^{-1}(k|k)\,\hat{x}_i(k|k)$$

wherein $\hat{x}_i(k|k)$ represents the optimal estimate of the i-th local radar track at time k, $P_i(k|k)$ represents the mean square error of the i-th local radar track at time k, $\hat{x}(k|k)$ represents the optimal estimate of the system radar track at time k, and $P(k|k)$ represents the mean square error of the system radar track at time k.
6. The method for detecting and controlling the target supporting the photoelectric cross positioning in the radar failure scenario according to claim 1, wherein step S5 includes the following sub-steps:
S51, the longitude and latitude of each photoelectric device are obtained from the positioning modules carried by the two photoelectric devices, and all calculations are performed in an xyz coordinate system; let the position of photoelectric device 1 be (x1, y1, z1) and the position of photoelectric device 2 be (x2, y2, z2), and assume that the target direction angles detected by photoelectric devices 1 and 2 at time k are α1 and α2 and the target pitch angles are β1 and β2, respectively; the coordinates (x(k), y(k), z(k)) of the target position at time k are then calculated as:

$$x(k) = \frac{x_1\tan\alpha_1 - x_2\tan\alpha_2 + y_2 - y_1}{\tan\alpha_1 - \tan\alpha_2}, \quad y(k) = y_1 + (x(k)-x_1)\tan\alpha_1, \quad z(k) = z_1 + \sqrt{(x(k)-x_1)^2 + (y(k)-y_1)^2}\,\tan\beta_1$$
S52, the multi-source photoelectric cross positioning module selects the photoelectric detection data provided by two photoelectric devices whose lines of sight can form an intersection point, and starts a calculation thread that performs the calculation of step S51 once per fixed interval; as long as both photoelectric devices remain active and their pointing directions continue to form an intersection point, subsequent position coordinates of the target are generated continuously, yielding the target positioning data.
7. The method for detecting and controlling the target supporting the photoelectric cross positioning in the radar failure scenario according to claim 1, wherein step S6 includes the following sub-steps:
S61, the photoelectric video stream is sampled and segmented using an image extraction technique to obtain a plurality of video frame images;
s62, performing image quality improvement processing on the video frame image by adopting an interframe difference processing technology;
s63, performing image processing on the video frame image processed by the inter-frame difference processing technology by adopting a deep learning convolutional neural network with YoloV3 as a framework and extracting image features;
and S64, inputting the extracted image features into the target classification recognition model for target classification recognition to obtain a target classification recognition result.
8. The method for detecting and displaying targets supporting photoelectric cross location under radar failure scenes as claimed in claim 1, wherein in step S7 the display control management module displays the system radar track, the target positioning data, and the target classification and identification results, and plays the photoelectric video stream, in a human-computer interaction interface manner.
9. An apparatus for implementing the target detection display control method supporting photoelectric cross positioning in a radar failure scene according to any one of claims 1 to 8, wherein the apparatus comprises a display control management module, and a protocol conversion module, a target data processing module, an equipment health detection module, a multi-source radar fusion module, a multi-source photoelectric cross positioning module and a target classification identification module which are controlled by the display control management module;
the input end of the protocol conversion module is used for connecting radar equipment and photoelectric equipment; the protocol conversion module is connected with the input end of the target classification identification module on one hand and connected with the target data processing module on the other hand;
the target data processing module is respectively connected with the equipment health detection module, the multi-source radar fusion module and the multi-source photoelectric cross positioning module, and the equipment health detection module, the multi-source radar fusion module, the multi-source photoelectric cross positioning module and the target classification identification module are all connected with the display control management module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111232889.2A CN114047505A (en) | 2021-10-22 | 2021-10-22 | Target detection display control method and device supporting photoelectric cross positioning in radar failure scene |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114047505A true CN114047505A (en) | 2022-02-15 |
Family
ID=80205938
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111232889.2A Pending CN114047505A (en) | 2021-10-22 | 2021-10-22 | Target detection display control method and device supporting photoelectric cross positioning in radar failure scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114047505A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114660588A (en) * | 2022-02-25 | 2022-06-24 | 中国电子科技集团公司第二十八研究所 | Distributed photoelectric target tracking system for anti-unmanned aerial vehicle |
CN114660588B (en) * | 2022-02-25 | 2024-10-22 | 中国电子科技集团公司第二十八研究所 | Distributed photoelectric target tracking system for anti-unmanned aerial vehicle |
CN115932765A (en) * | 2022-12-13 | 2023-04-07 | 扬州宇安电子科技有限公司 | Radar failure automatic detection system and method based on multi-source data analysis |
CN115932765B (en) * | 2022-12-13 | 2023-10-13 | 扬州宇安电子科技有限公司 | Radar failure automatic detection system and method based on multi-source data analysis |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103941746B (en) | Image processing system and method is patrolled and examined without man-machine | |
CN103280059B (en) | A kind of intelligent monitor system for subsea cable operation maintenance | |
CN105513087A (en) | Laser aiming and tracking equipment and method for controlling same | |
CN105407278A (en) | Panoramic video traffic situation monitoring system and method | |
CN101883261A (en) | Method and system for abnormal target detection and relay tracking under large-range monitoring scene | |
CN111767798B (en) | Intelligent broadcasting guide method and system for indoor networking video monitoring | |
CN103581614A (en) | Method and system for tracking targets in video based on PTZ | |
CN110058597A (en) | A kind of automatic Pilot heterogeneous system and implementation method | |
CN114114314A (en) | Power transmission line inspection detection system and detection method based on laser point cloud | |
CN109840454B (en) | Target positioning method, device, storage medium and equipment | |
CN101909206A (en) | Video-based intelligent flight vehicle tracking system | |
CN104185078A (en) | Video monitoring processing method, device and system thereof | |
CN103581627A (en) | Image and information fusion display method for high-definition video | |
CN105730705A (en) | Aircraft camera shooting positioning system | |
CN113589837A (en) | Electric power real-time inspection method based on edge cloud | |
CN110047092A (en) | Multiple target method for real time tracking under a kind of complex environment | |
CN109708659B (en) | Distributed intelligent photoelectric low-altitude protection system | |
EP4398183A1 (en) | Image stitching method and system based on multiple unmanned aerial vehicles | |
CN109948474A (en) | AI thermal imaging all-weather intelligent monitoring method | |
CN113014872A (en) | Automatic panorama operation and maintenance system based on 5G private network, Beidou positioning and image recognition | |
JP2021157853A (en) | Method and device for differentiating color of signal light and road-side apparatus | |
CN114047505A (en) | Target detection display control method and device supporting photoelectric cross positioning in radar failure scene | |
CN108521605A (en) | The playback method and play system of remote sensing video | |
CN111027195A (en) | Simulation scene generation method, device and equipment | |
CN114202819A (en) | Robot-based substation inspection method and system and computer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||