CN115115992B - Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision - Google Patents

Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision

Info

Publication number
CN115115992B
Authority
CN
China
Prior art keywords
target
tracking
photoelectric
processing unit
interference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210880757.9A
Other languages
Chinese (zh)
Other versions
CN115115992A (en)
Inventor
李焱
李珍
周俊鹏
董宇星
刘海波
张海波
郭永强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN202210880757.9A priority Critical patent/CN115115992B/en
Publication of CN115115992A publication Critical patent/CN115115992A/en
Application granted granted Critical
Publication of CN115115992B publication Critical patent/CN115115992B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/277Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/62Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A multi-platform photoelectric active disturbance rejection tracking system and method based on brain map control right decision relates to the field of photoelectric countermeasure detection. The method comprises the following steps: while the photoelectric tracking equipment stably tracks the target, the pre-interference target characteristic information is updated in real time in the information comprehensive processing unit; the information comprehensive processing unit identifies interference according to a set of criteria and outputs the interference identification result; based on the brain map control right decision principle, the control right decision maker allocates control of the servo control unit according to the interference identification result and the target feature matching result; during memory tracking, target characteristic information is extracted for all captured targets in the optical television field of view, the features are matched against the pre-interference features, the correct target is found and re-tracked, and the matching result is collected before switching back to the control right decision. The interference recognition and correct-target capture accuracy of the invention exceeds 90%, and the memory tracking track extrapolation time is not less than 10 seconds with a track prediction error of not more than 0.3°.

Description

Multi-platform photoelectric active disturbance rejection tracking system and method based on brain map control right decision
Technical Field
The invention relates to the technical field of photoelectric countermeasure detection, in particular to a multi-platform photoelectric active disturbance rejection tracking system and method based on brain map control right decision.
Background
During target tracking by photoelectric tracking equipment, the complex environmental background means that video-image tracking is often disturbed by laser-induced glare, smoke, decoy flares, birds, sea-surface bright bands, sea clutter, ships, buildings, cloud layers, sunlight and the like, as shown in fig. 1. These interference sources disturb the video image randomly and continuously, causing the photoelectric tracking equipment to capture the wrong target or lose the target entirely.
At present most coping strategies rely on manual participation: after an interference source is spotted, the equipment operator immediately intervenes, usually through the joystick and keys of the photoelectric tracking equipment. While the target is occluded or being interfered with, the operator keeps control of the photoelectric tracking equipment with the joystick, extrapolates the track from experience, and keeps the video image stable; after the interference source dissipates, the operator again interprets the scene from experience, re-extracts the correct target, and resumes target tracking.
In actual use, because most interference sources appear randomly and irregularly, it is difficult for equipment operators to counter the interference with correct and timely operation. The following phenomena often occur:
(a) The operator cannot perform manual intervention on the interference source in time;
(b) During continuous interference the target is occluded or the track extrapolation error grows, and the target is lost;
(c) After the interferer dissipates, the correct target cannot be recaptured.
An anti-interference approach that relies on manual intervention can guarantee neither accuracy nor timeliness.
Existing technical documents, such as Chinese patent publication No. CN114281110A, "Servo memory tracking implementation method based on path prediction", and Chinese patent publication No. CN109357675A, "Memory tracking algorithm method based on Kalman filter", only describe how to perform track prediction; they offer no solution for how to design an automatic anti-interference system, how to automatically identify interference, or how to automatically reacquire the correct target after the interference dissipates.
Disclosure of Invention
The invention provides a multi-platform photoelectric active disturbance rejection tracking system and method based on brain map control right decision, aiming to solve the problems of existing photoelectric tracking equipment: difficult interference identification, large track prediction error, inaccurate matching after the interference dissipates, and inability to automatically recapture the correct target.
The technical scheme adopted by the invention for solving the technical problem is as follows:
The invention discloses a multi-platform photoelectric active disturbance rejection tracking system based on brain map control right decision, which comprises:
the video image processing unit acquires a video image in real time, and inputs the video image to the information comprehensive processing unit and the control right decision maker after processing;
the information comprehensive processing unit, which receives the video image, updates the target characteristic information in real time, identifies interference and outputs the interference identification result; when the photoelectric tracking equipment is in the memory tracking state, it extracts target characteristic information, performs target feature matching and outputs the target feature matching result;
the control right decision maker, which, based on the brain map control right decision principle, allocates control of the servo control unit according to the interference identification result and the target feature matching result;
the servo control unit, which is controlled by the video image processing unit to complete stable target tracking of the photoelectric tracking equipment when the equipment is not interfered with or the target feature matching succeeds, and is controlled by the information comprehensive processing unit to complete memory tracking of the photoelectric tracking equipment when the equipment is interfered with or the target feature matching fails.
The invention discloses a multi-platform photoelectric active disturbance rejection tracking method based on brain map control right decision, which comprises the following steps:
S1: while the photoelectric tracking equipment stably tracks the target, the pre-interference target characteristic information is updated in real time in the information comprehensive processing unit;
S2: the information comprehensive processing unit identifies interference according to the criteria and outputs an interference identification result;
S3: based on the brain map control right decision principle, the control right decision maker allocates control of the servo control unit according to the interference identification result and the target feature matching result;
S4: memory tracking;
S5: during the memory tracking period, target characteristic information is extracted for all captured targets in the optical television field of view;
S6: during the memory tracking, the target characteristic information of step S5 is matched one by one against the target characteristic information of step S1, the correct target is found out and re-tracked, and after the target characteristic matching result is collected the process switches to step S3 for the control right decision.
Further, in step S1, the target feature information includes: target characteristics, a target pixel number dynamic threshold, a target gray level dynamic threshold, a target average speed, a target average acceleration, a target track motion direction, a target motion speed dynamic threshold and a predicted target track path.
Further, in step S2, the photoelectric tracking equipment is considered to be interfered with when at least 1 of the following criteria is satisfied:
(1) the target is blanked;
(2) the target pixel number changes abruptly, breaking through the target pixel number dynamic threshold;
(3) the target gray level changes abruptly, breaking through the target gray level dynamic threshold;
(4) the target miss distance changes abruptly, so that the target speed changes abruptly, breaking through the target motion speed dynamic threshold.
Further, the specific operation flow of step S3 is as follows:
S3.1 the photoelectric tracking equipment is in the stable target tracking state;
S3.1.1 if the interference identification result of step S2 shows that the photoelectric tracking equipment is not interfered with, the servo control unit is controlled by the video image processing unit, and the servo control unit completes closed-loop control with the target miss distance acquired by the video image processing unit as input information, so that the photoelectric tracking equipment keeps the stable target tracking state;
S3.1.2 if the interference identification result of step S2 shows that the photoelectric tracking equipment is interfered with, the servo control unit is controlled by the information comprehensive processing unit, the servo control unit completes closed-loop control with the angle information of the predicted target track path output by the information comprehensive processing unit as input information, and the photoelectric tracking equipment switches to the memory tracking state of step S4;
S3.2 the photoelectric tracking equipment is in the memory tracking state;
S3.2.1 when the target feature matching succeeds, the servo control unit is controlled by the video image processing unit, the servo control unit completes closed-loop control with the target miss distance acquired by the video image processing unit as input information, and the photoelectric tracking equipment switches to the stable target tracking state;
S3.2.2 when the target feature matching fails, the servo control unit is controlled by the information comprehensive processing unit, the servo control unit completes closed-loop control with the angle information of the predicted target track path output by the information comprehensive processing unit as input information, and at the same time abandons all targets extracted in the current optical television field and searches for the target again, so that the photoelectric tracking equipment keeps the memory tracking state of step S4.
Further, the specific operation flow of step S4 is as follows:
during the period that the target is blanked or continuously interfered with, the servo control unit carries out closed-loop tracking according to the predicted target track path of step S1, so that the photoelectric tracking equipment keeps the memory tracking state.
Further, in step S5, the target feature information includes: the target miss distance, the target characteristics, the target average pixel number, the target average gray scale, the target average speed and the target track motion direction.
Further, the specific operation flow of step S6 is as follows:
S6.1 the target miss distance of step S5 is continuously valid for at least 50 fields;
S6.2 the target characteristics of step S5 are consistent with those of step S1;
S6.3 the target average pixel number of step S5 lies between the target pixel number dynamic thresholds of step S1;
S6.4 the target average gray scale of step S5 lies between the target gray scale dynamic thresholds of step S1;
S6.5 the target track motion direction of step S5 is consistent with that of step S1;
S6.6 the target average speed of step S5 lies between the target movement speed dynamic thresholds of step S1;
S6.7 the matching results of steps S6.1 to S6.6 are transmitted to the control right decision maker of step S3 for the decision;
when S6.1 to S6.6 are all satisfied, the target feature matching is considered successful; otherwise the target feature matching is unsuccessful.
The beneficial effects of the invention are:
the invention has been widely applied to various photoelectric tracking devices such as multi-platform (ship-borne, airborne, vehicle-borne, land-based, etc.) and the like. The method can automatically identify the interference source, the identification of the interference source is quicker and more accurate, the accuracy of the interference identification of the photoelectric tracking equipment is improved from 10-30% to more than 90%, the extrapolation time of the memory tracking track is not less than 10 seconds under the condition that the track prediction error is not more than 0.3 degrees, and the memory tracking track prediction track is more accurate; after the interference source is dissipated, the correct target is automatically captured, the accuracy rate of capturing the correct target is improved to more than 90% from 5% -8%, the anti-interference capability of the photoelectric tracking equipment is greatly improved, unattended operation of the photoelectric tracking equipment can be realized, a human-in-loop is eliminated, and the method is an effective method for realizing automatic anti-interference of the photoelectric tracking equipment. The invention has low cost, no need of professional operation and no unattended operation, and has high application value.
Drawings
Fig. 1 is a diagram of complex-environment interference situations of a conventional photoelectric tracking device. In fig. 1, A is the laser-induced glare interference situation, B is the sunlight interference situation, C is the building occlusion interference situation, and D is the smoke interference situation.
Fig. 2 is a block diagram of the structure of the multi-platform photoelectric active disturbance rejection tracking system based on brain map control right decision according to the present invention.
Fig. 3 is a flowchart of the multi-platform photoelectric active disturbance rejection tracking method based on brain map control right decision according to the present invention.
Fig. 4 is a diagram of interference identification data analysis.
FIG. 5 is a graph of track prediction error.
Fig. 6 is an analysis diagram of average velocity matching success data.
Fig. 7 is a graph of the active disturbance rejection situation based on the control right decision maker.
Fig. 8 is a diagram of the active disturbance rejection situation based on the control right decision maker.
Fig. 9 is a diagram of the active disturbance rejection situation based on the control right decision maker.
Fig. 10 is a graph of the active disturbance rejection situation based on the control right decision maker.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
The invention discloses a multi-platform photoelectric auto-disturbance rejection tracking system based on brain map control right decision, which mainly comprises: a video image processing unit, an information comprehensive processing unit, a control right decision maker and a servo control unit.
There are two pathways in the brain map control right decision:
a) Human eye → trunk: the fast path (for rapid response);
b) Human eye → brain → trunk: the slow path (for complex logical thinking).
The corresponding two paths of the multi-platform photoelectric active disturbance rejection tracking system based on brain map control right decision are shown in fig. 2:
a) Video image processing unit → servo control unit: the fast path (rapid response to real-time tracking requirements);
b) Video image processing unit → information comprehensive processing unit → servo control unit: the slow path (complex logical judgment and computation).
The video image processing unit acquires a video image in real time, and inputs the video image to the information comprehensive processing unit and the control right decision maker after processing;
the information comprehensive processing unit, which receives the video image, updates the target characteristic information in real time, identifies interference, outputs the interference identification result and predicts the target track path; when the photoelectric tracking equipment is in the memory tracking state, it extracts target characteristic information, performs target feature matching and outputs the target feature matching result;
the control right decision maker, which, based on the brain map control right decision principle, allocates control of the servo control unit according to the interference identification result and the target feature matching result;
the servo control unit, which is controlled by the video image processing unit to complete stable target tracking of the photoelectric tracking equipment when the equipment is not interfered with or the target feature matching succeeds, and is controlled by the information comprehensive processing unit to complete memory tracking of the photoelectric tracking equipment when the equipment is interfered with or the target feature matching fails.
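As a concrete illustration of this dual-path wiring, the sketch below routes exactly one of the two signals to the servo loop for each frame. It is a minimal sketch of the data flow of fig. 2; the class and function names (e.g. `route_one_frame`, `ServoControlUnit.close_loop_on_miss_distance`) are invented for illustration and are not part of the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MissDistance:          # fast-path signal from the video image processing unit
    x: float                 # pixels off boresight, X axis
    y: float                 # pixels off boresight, Y axis

@dataclass
class PredictedAngles:       # slow-path signal from the information comprehensive processing unit
    azimuth: float           # deg
    pitch: float             # deg

class ServoControlUnit:
    """Closed-loop drive of the gimbal; it simply follows whichever input it is handed."""
    def close_loop_on_miss_distance(self, m: MissDistance) -> None:
        print(f"fast path: correcting miss distance ({m.x:+.1f}, {m.y:+.1f}) px")

    def close_loop_on_predicted_path(self, p: PredictedAngles) -> None:
        print(f"slow path: steering to predicted az {p.azimuth:.3f} deg, el {p.pitch:.3f} deg")

def route_one_frame(servo: ServoControlUnit,
                    miss: Optional[MissDistance],
                    predicted: PredictedAngles,
                    use_fast_path: bool) -> None:
    """The control right decision maker hands exactly one of the two paths to the servo."""
    if use_fast_path and miss is not None:
        servo.close_loop_on_miss_distance(miss)        # video image processing unit -> servo
    else:
        servo.close_loop_on_predicted_path(predicted)  # information comprehensive unit -> servo

# usage: an undisturbed frame goes through the fast path
route_one_frame(ServoControlUnit(), MissDistance(3.0, -1.5),
                PredictedAngles(56.75, 3.10), use_fast_path=True)
```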
As shown in fig. 3, the multi-platform photoelectric auto-disturbance rejection tracking method based on brain map control right decision of the present invention mainly includes the following steps: update target characteristic information → identify interference → control right decision → memory tracking → extract target characteristic information → target feature matching.
The multi-platform photoelectric active disturbance rejection tracking method based on brain map control right decision of the invention is further explained in detail below.
The invention discloses a multi-platform photoelectric auto-disturbance rejection tracking method based on brain map control right decision; the specific operation flow is as follows.
s1: and updating target characteristic information, namely updating the target characteristic information before interference in the information comprehensive processing unit in real time while the photoelectric tracking equipment stably tracks the target, wherein the target characteristic information is data before the photoelectric tracking equipment is interfered. The characteristic information comprises target characteristics (moving targets or static targets), dynamic thresholds of target sizes (target pixel number dynamic thresholds), target gray scale (radiation characteristic) dynamic thresholds, target average speed, target average acceleration, target track motion direction, target motion speed dynamic thresholds, predicted target track paths and the like.
In particular:
s1.1, calculating the average speed (azimuth and pitch) and the average acceleration (azimuth and pitch) of the target by using a Kalman filtering algorithm, wherein the input quantity is angle information (azimuth and pitch) and miss distance information (X axis and Y axis) of the target;
s1.2, judging whether the target is a moving target or a static target, if the average speed in the azimuth or the pitch direction is lower than 0.05 degrees/S, considering that the target is static in the direction and is the static target, and if the average speed in the azimuth or the pitch direction is higher than 0.05 degrees/S, considering that the target moves in the direction and is the moving target;
s1.3, calculating the dynamic threshold of the pixel number occupied by the target in the photoelectric view field, and taking the maximum value (Max) of the pixel data of the N fields 1 ) Minimum value (Min) 1 ) And Mean value (Mean) 1 ) Max, upper limit formula of pixel number dynamic threshold 1 +2*(Max 1 -Mean 1 ) Lower limit formula Min of pixel number dynamic threshold 1 -2*(Mean 1 -Min 1 ) Forming a target pixel number dynamic threshold;
s1.4 calculating dynamic threshold of target gray (radiation characteristic), and taking maximum value (Max) of N field gray data 2 ) Minimum value (Min) 2 ) And Mean value (Mean) 2 ) Upper limit formula Max of target gray (radiation characteristic) dynamic threshold 2 +2*(Max 2 -Mean 2 ) Lower limit formula Min of dynamic threshold of target gray level (radiation characteristic) 2 -2*(Mean 2 -Min 2 ) Forming a target gray (radiation characteristic) dynamic threshold;
s1.5, calculating the motion direction of the target track. In the X-axis direction, the right movement of the target is positive and the left movement of the target is negative relative to the center of the photoelectric field; in the Y-axis direction, the upward movement of the target is positive, and the downward movement of the target is negative;
s1.6 calculating the dynamic threshold of the moving speed of the target, and taking the maximum value (Max) of the speed data of N fields 3 ) Min (Min) 3 ) And Mean value (Mean) 3 ) Max formula for upper limit of dynamic threshold of target moving speed 3 +2*(Max 3 -Mean 3 ),Lower limit formula Min of dynamic threshold of target movement speed 3 -2*(Mean 3 -Min 3 ) Forming a dynamic threshold of the target movement speed;
s1.7, calculating and predicting a target track path through a Kalman filtering algorithm, namely predicting angle information (azimuth and pitch) of the photoelectric tracking equipment.
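As an illustration of the bookkeeping in steps S1.2-S1.6, the sketch below computes the max/min/mean-based dynamic thresholds over the last N fields and classifies the target as moving or stationary. It is a minimal sketch that directly applies the formulas above; the function names and the choice of N = 50 are assumptions made for illustration, not values from the patent.

```python
from collections import deque
from typing import Deque, Tuple

def dynamic_threshold(samples) -> Tuple[float, float]:
    """Lower/upper dynamic threshold as in steps S1.3, S1.4 and S1.6:
    lower = Min - 2*(Mean - Min), upper = Max + 2*(Max - Mean)."""
    mx, mn = max(samples), min(samples)
    mean = sum(samples) / len(samples)
    return mn - 2.0 * (mean - mn), mx + 2.0 * (mx - mean)

def is_moving(avg_speed_az: float, avg_speed_el: float, limit: float = 0.05) -> bool:
    """Step S1.2: a target is 'moving' if either axis exceeds 0.05 deg/s on average."""
    return abs(avg_speed_az) > limit or abs(avg_speed_el) > limit

# rolling buffers of the last N fields (N = 50 is an assumed value)
N = 50
pixel_counts: Deque[int] = deque(maxlen=N)
gray_levels: Deque[float] = deque(maxlen=N)
speeds: Deque[float] = deque(maxlen=N)

# usage: after each tracked field, push the new measurements and refresh the thresholds
pixel_counts.extend([118, 120, 131, 125, 122])
gray_levels.extend([186.0, 190.5, 188.2, 191.0, 187.4])
speeds.extend([0.41, 0.43, 0.40, 0.44, 0.42])

pix_lo, pix_hi = dynamic_threshold(pixel_counts)
gray_lo, gray_hi = dynamic_threshold(gray_levels)
spd_lo, spd_hi = dynamic_threshold(speeds)
print(f"pixel-number threshold [{pix_lo:.1f}, {pix_hi:.1f}]")
print(f"gray-level threshold   [{gray_lo:.1f}, {gray_hi:.1f}]")
print(f"speed threshold        [{spd_lo:.3f}, {spd_hi:.3f}] deg/s,"
      f" moving={is_moving(0.42, 0.01)}")
```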
S2: Identify interference. Any interference source that causes the target to be concealed within the photoelectric field of view or that affects tracking stability is regarded as interference. The specific criteria are:
(1) Target blanking (the target miss distance is invalid);
(2) The target size (pixel number) changes abruptly, breaking through the target pixel number dynamic threshold of step S1.3;
(3) The target gray level (radiation characteristic) changes abruptly, breaking through the target gray level dynamic threshold of step S1.4;
(4) The target miss distance changes abruptly, so that the target speed changes abruptly, breaking through the target motion speed dynamic threshold of step S1.6.
When at least one of these criteria is met, the photoelectric tracking equipment is considered to be interfered with; the information comprehensive processing unit performs the interference identification and outputs the interference identification result (either the photoelectric tracking equipment is not interfered with, or it is interfered with). Taking an abrupt change of target speed as an example: when the upper limit of the target motion speed dynamic threshold is broken through, the photoelectric tracking equipment is considered to be interfered with, as shown in fig. 4, where a is the upper limit of the target motion speed dynamic threshold, b is the target motion speed, and c is the lower limit of the target motion speed dynamic threshold.
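A minimal sketch of the four criteria of step S2, assuming the dynamic thresholds of step S1 have already been computed; the `TrackState` container and the field names are invented for illustration and are not part of the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TrackState:                             # pre-interference bookkeeping from step S1
    pixel_threshold: Tuple[float, float]      # (lower, upper) from S1.3
    gray_threshold: Tuple[float, float]       # (lower, upper) from S1.4
    speed_threshold: Tuple[float, float]      # (lower, upper) from S1.6

def is_interfered(state: TrackState,
                  miss_distance: Optional[Tuple[float, float]],
                  pixel_count: float,
                  gray_level: float,
                  speed: float) -> bool:
    """Step S2: any single criterion flags interference."""
    if miss_distance is None:                          # (1) target blanked
        return True
    lo, hi = state.pixel_threshold
    if not (lo <= pixel_count <= hi):                  # (2) pixel-number jump
        return True
    lo, hi = state.gray_threshold
    if not (lo <= gray_level <= hi):                   # (3) gray-level jump
        return True
    lo, hi = state.speed_threshold
    if not (lo <= speed <= hi):                        # (4) speed jump via miss distance
        return True
    return False

# usage: a sudden speed above the dynamic upper limit is reported as interference (cf. fig. 4)
state = TrackState(pixel_threshold=(90, 160), gray_threshold=(170, 210),
                   speed_threshold=(0.30, 0.55))
print(is_interfered(state, miss_distance=(2.0, -1.0),
                    pixel_count=120, gray_level=188, speed=1.7))   # True
```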
S3: Control right decision. As shown in fig. 2, the control right decision maker decides, based on the brain map control right decision principle, whether the servo control unit is controlled by the video image processing unit (fast path) or by the information comprehensive processing unit (slow path). The control right decision maker receives the interference identification result of step S2 and the target feature matching result of step S6, and then allocates control of the servo control unit. Specifically:
S3.1 The photoelectric tracking equipment is in the stable target tracking state:
S3.1.1 If the interference identification result of step S2 shows that the photoelectric tracking equipment is not interfered with, the servo control unit is controlled by the video image processing unit; the servo control unit completes closed-loop control with the target miss distance acquired by the video image processing unit as its input, so that the photoelectric tracking equipment keeps the stable target tracking state;
S3.1.2 If the interference identification result of step S2 shows that the photoelectric tracking equipment is interfered with, the servo control unit is controlled by the information comprehensive processing unit; the servo control unit completes closed-loop control with the angle information (azimuth, pitch) of the predicted target track path output by the information comprehensive processing unit as its input, and the photoelectric tracking equipment switches to the memory tracking state of step S4.
S3.2 The photoelectric tracking equipment is in the memory tracking state:
S3.2.1 When the target feature matching succeeds, the servo control unit is controlled by the video image processing unit; the servo control unit completes closed-loop control with the target miss distance acquired by the video image processing unit as its input, and the photoelectric tracking equipment switches back to the stable target tracking state;
S3.2.2 When the target feature matching fails, the servo control unit is controlled by the information comprehensive processing unit; the servo control unit completes closed-loop control with the angle information (azimuth, pitch) of the predicted target track path output by the information comprehensive processing unit as its input, abandons all targets extracted in the current optical television field, and searches for the target again, so that the photoelectric tracking equipment keeps the memory tracking state of step S4.
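A sketch of the hand-over rules of steps S3.1-S3.2, written as a small state machine. It only encodes the logic described above; the class and method names are assumptions made for illustration, not part of the patent.

```python
from enum import Enum, auto

class Mode(Enum):
    STABLE_TRACKING = auto()   # fast path: servo driven by the miss distance
    MEMORY_TRACKING = auto()   # slow path: servo driven by the predicted track path

class ControlRightDecider:
    """Steps S3.1 / S3.2: allocate control of the servo control unit."""
    def __init__(self) -> None:
        self.mode = Mode.STABLE_TRACKING

    def step(self, interfered: bool, match_ok: bool) -> str:
        if self.mode is Mode.STABLE_TRACKING:
            if interfered:                        # S3.1.2: switch to memory tracking
                self.mode = Mode.MEMORY_TRACKING
                return "info unit -> servo (predicted track path)"
            return "video unit -> servo (miss distance)"              # S3.1.1
        # memory tracking state
        if match_ok:                              # S3.2.1: correct target re-acquired
            self.mode = Mode.STABLE_TRACKING
            return "video unit -> servo (miss distance)"
        return "info unit -> servo (predicted path, re-search targets)"  # S3.2.2

# usage: interference appears, persists for two fields, then the feature match succeeds
decider = ControlRightDecider()
for interfered, match_ok in [(False, False), (True, False), (True, False), (False, True)]:
    print(decider.step(interfered, match_ok), decider.mode.name)
```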
S4: Memory tracking. Specifically: while the target is occluded or continuously interfered with, the servo control unit performs closed-loop tracking along the predicted target track path of step S1.7, so that the photoelectric tracking equipment keeps the memory tracking state (the photoelectric tracking equipment extrapolates the track from the target track path, speed, acceleration and other information recorded before the target was interfered with, i.e. from step S1).
Example: as shown in fig. 5, the data recording frequency is 16 Hz, i.e. one set of data is recorded every 62.5 ms, for a total duration of more than 30 seconds. In fig. 5, a is the true azimuth angle of the target motion and b is the predicted (extrapolated) target azimuth angle; it can be seen that the two curves fit well during the first 10 seconds (160 samples / 16 Hz). Only part of the data is intercepted for analysis, as shown in table 1: at serial number 235 (235/16 = 14.6875 seconds), the error between the true azimuth angle of the target motion and the extrapolated target azimuth angle just exceeds 0.3°; that is, if the interference source dissipated at that moment, the target would reappear about 0.3° from the center of the photoelectric field of view, which still allows the photoelectric tracking equipment to complete target re-tracking.
Table 1. Memory tracking angle error analysis.

Serial number | Extrapolated target azimuth (°) | True target azimuth (°) | Error (°)
221 | 56.636197 | 56.885231 | -0.24903
222 | 56.644634 | 56.901481 | -0.25685
223 | 56.653072 | 56.912367 | -0.2593
224 | 56.66151  | 56.917761 | -0.25625
225 | 56.669948 | 56.939665 | -0.26972
226 | 56.678386 | 56.94499  | -0.2666
227 | 56.686823 | 56.95578  | -0.26896
228 | 56.695261 | 56.966537 | -0.27128
229 | 56.703699 | 56.982849 | -0.27915
230 | 56.712137 | 56.993606 | -0.28147
231 | 56.720574 | 57.004142 | -0.28357
232 | 56.729012 | 57.025789 | -0.29678
233 | 56.73745  | 57.030934 | -0.29348
234 | 56.745888 | 57.0416   | -0.29571
235 | 56.754325 | 57.057784 | -0.30346
236 | 56.762763 | 57.068307 | -0.30554
237 | 56.771201 | 57.078858 | -0.30766
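The extrapolation of step S4 can be illustrated with a simple constant-velocity/constant-acceleration propagation of the azimuth and pitch angles estimated in step S1. The patent itself obtains the predicted track path from a Kalman filtering algorithm, so the sketch below is only a simplified stand-in with assumed names, assumed pre-interference estimates, and the 16 Hz field rate of the example above.

```python
from dataclasses import dataclass

@dataclass
class AxisState:
    angle: float   # deg, last filtered angle before interference
    speed: float   # deg/s, average speed from step S1.1
    accel: float   # deg/s^2, average acceleration from step S1.1

def extrapolate(state: AxisState, t: float) -> float:
    """Angle predicted t seconds after the last valid measurement."""
    return state.angle + state.speed * t + 0.5 * state.accel * t * t

# usage: 16 Hz fields, extrapolate the azimuth over the memory tracking interval
FIELD_RATE = 16.0
azimuth = AxisState(angle=56.45, speed=0.135, accel=0.0003)   # assumed pre-interference estimates
for k in (160, 235):                                          # 10 s and ~14.7 s, cf. table 1
    t = k / FIELD_RATE
    print(f"field {k}: t = {t:.3f} s, predicted azimuth = {extrapolate(azimuth, t):.4f} deg")
```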
S5: Extract target characteristic information. During memory tracking, target characteristic information is extracted for all targets captured in the optical television field of view. The characteristic information comprises the target miss distance, the target characteristics (moving target or stationary target), the target average size (average pixel number), the target average gray level (average radiation characteristic), the target average speed, the target track motion direction, and the like.
Specifically:
S5.1 Acquire the target miss distance continuously and validly;
S5.2 With the target angle information (azimuth, pitch) and miss distance information (X axis, Y axis) as inputs, calculate the target average speed (azimuth, pitch) with the Kalman filtering algorithm;
S5.3 Judge whether the target is moving or stationary: if the average speed in the azimuth or pitch direction is below 0.05°/s, the target is considered stationary in that direction (a stationary target); if it is above 0.05°/s, the target is considered to be moving in that direction (a moving target);
S5.4 Calculate the average number of pixels occupied by the target in the photoelectric field of view;
S5.5 Calculate the average target gray level (radiation characteristic);
S5.6 Calculate the target track motion direction. In the X-axis direction, target movement to the right of the photoelectric field-of-view center is positive and movement to the left is negative; in the Y-axis direction, upward movement is positive and downward movement is negative.
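A sketch of the per-candidate bookkeeping in steps S5.1-S5.6: each target captured in the optical television field accumulates its own feature record over successive fields. The `CandidateFeatures` container, its field names, and the use of a plain mean instead of the Kalman estimate are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CandidateFeatures:
    miss_distances: List[Tuple[float, float]] = field(default_factory=list)  # (x, y) per field
    pixel_counts: List[int] = field(default_factory=list)
    gray_levels: List[float] = field(default_factory=list)
    speeds: List[float] = field(default_factory=list)                        # deg/s per field

    def update(self, miss, pixels, gray, speed) -> None:
        self.miss_distances.append(miss)
        self.pixel_counts.append(pixels)
        self.gray_levels.append(gray)
        self.speeds.append(speed)

    @property
    def valid_fields(self) -> int:            # S5.1: consecutive valid miss-distance count
        return len(self.miss_distances)

    @property
    def avg_pixels(self) -> float:            # S5.4
        return sum(self.pixel_counts) / len(self.pixel_counts)

    @property
    def avg_gray(self) -> float:              # S5.5
        return sum(self.gray_levels) / len(self.gray_levels)

    @property
    def avg_speed(self) -> float:             # S5.2 (plain mean instead of a Kalman estimate)
        return sum(self.speeds) / len(self.speeds)

    @property
    def direction(self) -> Tuple[int, int]:   # S5.6: sign of net motion in X and Y
        dx = self.miss_distances[-1][0] - self.miss_distances[0][0]
        dy = self.miss_distances[-1][1] - self.miss_distances[0][1]
        return (1 if dx > 0 else -1), (1 if dy > 0 else -1)

# usage: feed a few fields for one captured candidate
cand = CandidateFeatures()
for i in range(5):
    cand.update(miss=(2.0 + i, -1.0 - 0.5 * i), pixels=120 + i, gray=188.0, speed=0.42)
print(cand.valid_fields, round(cand.avg_pixels, 1), cand.direction)
```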
S6: Target feature matching. During memory tracking, the target characteristic information captured in step S5 is matched one by one against the pre-interference target characteristic information of step S1, so that the correct target is found and re-tracked; the matching result is collected and then step S3 performs the control right decision.
Specifically:
S6.1 The target miss distance of step S5.1 is continuously valid for at least 50 fields;
S6.2 The target characteristics (moving target or stationary target) of step S5.3 are consistent with those of step S1.2;
S6.3 The target average pixel number of step S5.4 lies between the target pixel number dynamic thresholds of step S1.3;
S6.4 The target average gray level of step S5.5 lies between the target gray level dynamic thresholds of step S1.4;
S6.5 The target track motion direction of step S5.6 is consistent with that of step S1.5;
S6.6 The target average speed of step S5.2 lies between the target motion speed dynamic thresholds of step S1.6;
S6.7 The matching results of steps S6.1 to S6.6 are transmitted to the control right decision maker of step S3 for the decision. Only when all six criteria of steps S6.1 to S6.6 are satisfied is the target feature matching considered successful; otherwise the matching is unsuccessful. Taking a successful match of the average speed of a photoelectrically captured target as an example, as shown in fig. 6, a is the upper limit of the pre-interference target motion speed dynamic threshold, b is the target average speed, and c is the lower limit of the pre-interference target motion speed dynamic threshold.
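The six criteria of step S6 can be combined into a single boolean check. The sketch below assumes a pre-interference record shaped like step S1 and a candidate record shaped like step S5 (all container and field names are illustrative, not from the patent).

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PreInterferenceRecord:              # from step S1
    moving: bool
    pixel_threshold: Tuple[float, float]  # (lower, upper)
    gray_threshold: Tuple[float, float]
    speed_threshold: Tuple[float, float]
    direction: Tuple[int, int]            # sign of motion in X and Y

@dataclass
class CandidateRecord:                    # from step S5
    valid_fields: int
    moving: bool
    avg_pixels: float
    avg_gray: float
    avg_speed: float
    direction: Tuple[int, int]

def features_match(pre: PreInterferenceRecord, cand: CandidateRecord) -> bool:
    """Step S6: all six criteria S6.1-S6.6 must hold for a successful match."""
    def within(value: float, bounds: Tuple[float, float]) -> bool:
        lo, hi = bounds
        return lo <= value <= hi

    return (cand.valid_fields >= 50                           # S6.1: at least 50 valid fields
            and cand.moving == pre.moving                     # S6.2: same moving/stationary character
            and within(cand.avg_pixels, pre.pixel_threshold)  # S6.3
            and within(cand.avg_gray, pre.gray_threshold)     # S6.4
            and cand.direction == pre.direction               # S6.5: same track motion direction
            and within(cand.avg_speed, pre.speed_threshold))  # S6.6

# usage: the control right decision maker of step S3 consumes this boolean result
pre = PreInterferenceRecord(True, (90, 160), (170, 210), (0.30, 0.55), (1, -1))
cand = CandidateRecord(63, True, 121.4, 188.0, 0.42, (1, -1))
print(features_match(pre, cand))   # True -> hand control back to the video image processing unit
```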
The multi-platform photoelectric active disturbance rejection tracking system and method based on brain map control right decision were applied to an ocean-situation exercise on a certain shipborne platform. The target was a Z-9S helicopter and the interference source was a large area of thick cloud. The photoelectric tracking equipment identified the cloud interference, and the control right decision maker transferred control of the servo control unit from the video image processing unit to the information comprehensive processing unit. While the target remained occluded in the cloud, the memory tracking extrapolation angle stayed accurate (the track extrapolation error was less than 0.1°), as shown in fig. 8 and fig. 9. When the cloud occlusion dissipated (memory tracking ended), the target feature matching succeeded (the matching took 519 ms), the control right decision maker transferred control of the servo control unit from the information comprehensive processing unit back to the video image processing unit, and the correct target was recaptured and re-tracked, as shown in fig. 10. The whole process took 40 seconds in total.
The above description is only a preferred embodiment of the present invention, and does not limit the present invention in any way. It will be understood by those skilled in the art that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (2)

1. A multi-platform photoelectric active disturbance rejection tracking system based on a brain graph control right decision is applied to a multi-platform photoelectric active disturbance rejection tracking method based on the brain graph control right decision, and the method comprises the following steps:
S1: while the photoelectric tracking equipment stably tracks the target, the pre-interference target characteristic information is updated in real time in the information comprehensive processing unit; the target characteristic information includes: target characteristics, a target pixel number dynamic threshold, a target gray level dynamic threshold, a target average speed, a target average acceleration, a target track motion direction, a target motion speed dynamic threshold and a predicted target track path;
S2: the information comprehensive processing unit identifies interference according to the criteria and outputs an interference identification result;
the photoelectric tracking equipment is considered to be interfered with when at least 1 criterion is satisfied:
(1) the target is blanked;
(2) the target pixel number changes abruptly, breaking through the target pixel number dynamic threshold;
(3) the target gray level changes abruptly, breaking through the target gray level dynamic threshold;
(4) the target miss distance changes abruptly, so that the target speed changes abruptly, breaking through the target motion speed dynamic threshold;
S3: based on the brain graph control right decision principle, the control right decision maker allocates control of the servo control unit according to the interference identification result and the target feature matching result;
S3.1 the photoelectric tracking equipment is in the stable target tracking state;
S3.1.1 if the interference identification result of step S2 shows that the photoelectric tracking equipment is not interfered with, the servo control unit is controlled by the video image processing unit, and the servo control unit completes closed-loop control with the target miss distance acquired by the video image processing unit as input information, so that the photoelectric tracking equipment keeps the stable target tracking state;
S3.1.2 if the interference identification result of step S2 shows that the photoelectric tracking equipment is interfered with, the servo control unit is controlled by the information comprehensive processing unit, the servo control unit completes closed-loop control with the angle information of the predicted target track path output by the information comprehensive processing unit as input information, and the photoelectric tracking equipment switches to the memory tracking state of step S4;
S3.2 the photoelectric tracking equipment is in the memory tracking state;
S3.2.1 when the target feature matching succeeds, the servo control unit is controlled by the video image processing unit, the servo control unit completes closed-loop control with the target miss distance acquired by the video image processing unit as input information, and the photoelectric tracking equipment switches to the stable target tracking state;
S3.2.2 when the target feature matching fails, the servo control unit is controlled by the information comprehensive processing unit, the servo control unit completes closed-loop control with the angle information of the predicted target track path output by the information comprehensive processing unit as input information, and at the same time abandons all targets extracted in the current optical television field and searches for the target again, so that the photoelectric tracking equipment keeps the memory tracking state of step S4;
S4: memory tracking;
S5: during the memory tracking period, extracting target characteristic information of all captured targets in the optical television field; the target characteristic information includes: the target miss distance, the target characteristics, the target average pixel number, the target average gray scale, the target average speed and the target track motion direction;
S6: during the memory tracking, matching the target characteristic information of step S5 with the target characteristic information of step S1, finding out the correct target for re-tracking, and switching to step S3 for the control right decision after the target characteristic matching result is collected;
S6.1 the target miss distance of step S5 is continuously valid for at least 50 fields;
S6.2 the target characteristics of step S5 are consistent with those of step S1;
S6.3 the target average pixel number of step S5 lies between the target pixel number dynamic thresholds of step S1;
S6.4 the target average gray scale of step S5 lies between the target gray scale dynamic thresholds of step S1;
S6.5 the target track motion direction of step S5 is consistent with that of step S1;
S6.6 the target average speed of step S5 lies between the target motion speed dynamic thresholds of step S1;
S6.7 the matching results of steps S6.1 to S6.6 are transmitted to the control right decision maker of step S3 for the decision;
when S6.1 to S6.6 are all satisfied, the target feature matching is considered successful; otherwise the target feature matching is unsuccessful;
the system comprises:
the video image processing unit acquires a video image in real time, and inputs the video image to the information comprehensive processing unit and the control right decision maker after processing;
the information comprehensive processing unit is used for receiving the video image, updating target characteristic information in real time, identifying interference and outputting an identification interference result; when the photoelectric tracking equipment is in a memory tracking state, extracting target characteristic information, matching target characteristics and outputting a target characteristic matching result;
the control right decision device is used for carrying out decision distribution on the control right of the servo control unit according to the interference recognition result and the target feature matching result based on a brain graph control right decision principle;
the servo control unit, which is controlled by the video image processing unit to complete stable target tracking of the photoelectric tracking equipment when the photoelectric tracking equipment is not interfered with or the target feature matching succeeds, and is controlled by the information comprehensive processing unit to complete memory tracking of the photoelectric tracking equipment when the photoelectric tracking equipment is interfered with or the target feature matching fails.
2. The multi-platform photoelectric auto-disturbance-rejection tracking system based on electroencephalogram control right decision as claimed in claim 1, wherein the specific operation flow of step S4 is as follows:
during the period that the target is blanked or continuously interfered with, the servo control unit carries out closed-loop tracking according to the predicted target track path of step S1, so that the photoelectric tracking equipment keeps the memory tracking state.
CN202210880757.9A 2022-07-26 2022-07-26 Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision Active CN115115992B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210880757.9A CN115115992B (en) 2022-07-26 2022-07-26 Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210880757.9A CN115115992B (en) 2022-07-26 2022-07-26 Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision

Publications (2)

Publication Number Publication Date
CN115115992A CN115115992A (en) 2022-09-27
CN115115992B true CN115115992B (en) 2022-11-15

Family

ID=83333877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210880757.9A Active CN115115992B (en) 2022-07-26 2022-07-26 Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision

Country Status (1)

Country Link
CN (1) CN115115992B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110276383A (en) * 2019-05-31 2019-09-24 北京理工大学 Kernel correlation filtering target localization method based on multi-channel memory model
CN113885312A (en) * 2021-11-06 2022-01-04 深圳市明日系统集成有限公司 Photoelectric tracking system and method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021572B (en) * 2014-07-01 2017-10-10 金陵科技学院 A kind of method of sensitive angle target following
CN110276784B (en) * 2019-06-03 2021-04-06 北京理工大学 Correlation filtering moving target tracking method based on memory mechanism and convolution characteristics
CN114092522B (en) * 2021-11-30 2024-06-07 中国科学院长春光学精密机械与物理研究所 Airport plane take-off and landing intelligent capturing and tracking method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110276383A (en) * 2019-05-31 2019-09-24 北京理工大学 Kernel correlation filtering target localization method based on multi-channel memory model
CN113885312A (en) * 2021-11-06 2022-01-04 深圳市明日系统集成有限公司 Photoelectric tracking system and method

Also Published As

Publication number Publication date
CN115115992A (en) 2022-09-27

Similar Documents

Publication Publication Date Title
Kartashov et al. Optical detection of unmanned air vehicles on a video stream in a real-time
CN110597264B (en) Unmanned aerial vehicle counter-braking system
CN109407697A (en) A kind of unmanned plane pursuit movement goal systems and method based on binocular distance measurement
CN115661204B (en) Collaborative searching and tracking positioning method for moving target by unmanned aerial vehicle cluster
CN102096925A (en) Real-time closed loop predictive tracking method of maneuvering target
CN102937438B (en) Infrared dim target distance detection method based on optimization method
CN109859202B (en) Deep learning detection method based on USV water surface optical target tracking
CN106447699B (en) High iron catenary object detecting and tracking method based on Kalman filtering
CN105810023B (en) Airport undercarriage control automatic monitoring method
CN114740497B (en) UKF multisource fusion detection-based unmanned aerial vehicle deception method
CN115147790B (en) Future track prediction method of vehicle based on graph neural network
CN115712306A (en) Unmanned aerial vehicle navigation method for multi-machine cooperation target tracking
CN114047785A (en) Method and system for cooperatively searching multiple moving targets by unmanned aerial vehicle cluster
CN115932834A (en) Anti-unmanned aerial vehicle system target detection method based on multi-source heterogeneous data fusion
CN110889353B (en) Space target identification method based on primary focus large-visual-field photoelectric telescope
CN115115992B (en) Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision
CN114139373B (en) Multi-sensor automatic collaborative management method for unmanned aerial vehicle reverse vehicle
CN107390164A (en) A kind of continuous tracking method of underwater distributed multi-source target
CN116977902B (en) Target tracking method and system for on-board photoelectric stabilized platform of coastal defense
CN116753778A (en) Unmanned aerial vehicle countering system and method based on information fusion and task distribution
CN111947647A (en) Robot accurate positioning method integrating vision and laser radar
CN113391642B (en) Unmanned aerial vehicle autonomous obstacle avoidance method and system based on monocular vision
CN114326795B (en) Active avoidance method for aircraft based on star network information
Martínez-de Dios et al. Towards UAS surveillance using event cameras
Huang et al. Visual tracking in cluttered environments using the visual probabilistic data association filter

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant